Automated Society

How Else Are People Using FindClone's $5/Month Face Recognition Service? 

“These services help identify who killed Nemtsov and to investigate war crimes by the Russian military in Ukraine,” an administrator of the anti-corruption project Skaner (or “Scanner”) told me a few weeks ago, when I asked for their thoughts on FindClone, a 5-dollar-per-month Russian face recognition site (described by Bellingcat’s Aric Toler as “the most powerful open source tool I’ve used”). In July 2019, Skaner posted the social media accounts of police officers who had been filmed brutally beating Muscovites demonstrating for democratic elections. “It was the peak [of using face recognition to expose violent police],” said another activist I spoke to earlier this week. But afterwards, police started hiding their faces behind tinted helmets and masks. And officers began using face recognition to identify protesters: “This way they now manage to avoid the ugly image of beatings in public squares and simply come to activists' private homes to detain them a few hours after the event.”

Meanwhile, journalists in Russia have used FindClone to help identify an online casino owner, local athletes paid to intimidate election observers in St. Petersburg and, um, the developers of FindClone themselves. Men filmed torturing and killing a man in Syria were identified as mercenaries of the Russian private military company “Wagner Group”. The Novaya Gazeta journalist who broke that story once had a sheep’s head and a funeral wreath sent to his office. Another journalist told me he uses FindClone for his investigative work, to help him identify “just people, or their friends, or their mothers and fathers”.

“Gentlemen, anons, I have 40 FindClone searches left, and the term will expire soon, it’s a pity if they disappear. Therefore, I offer my search services.” For years, posts like this have appeared again and again on the Russian anonymous forum 2ch.hk (otherwise known as “Dvach”). Replies stream in: Users post pictures of teens, stills from porn videos or homemade sex tapes, a lady who “refused to hire me”, “him please”, “her please.” FindClone matches photos of strangers to profiles on VKontakte, Russia’s Facebook equivalent (“Dvach” is hosted by the state-affiliated company VK, which also operates VKontakte). On Dvach, users have used FindClone to look for the VKontakte pages of women in porn videos or escort dating services. They share tips on how to “expose” them: who to message (the women’s friends, family, their children’s school friends) and what to say (introduce yourself as a journalist, “blackmail and harassment”).

“I don’t sit on Dvach or any other anonymous image boards,” one woman, who has criticized the forum’s attacks on sex workers, tells me. “But I’ve heard that it’s a very toxic environment and face recognition tools are actively used to deanonymize.” Since 2010, Dvach has been run by a user called Abu, who said he bought the site for 10,000 dollars. In 2011, Abu’s real-life identity was revealed as Nariman Namazov, a 27-year-old Azerbaijani SEO specialist from Moscow. In 2020, Dvach users doxxed the Russian actresses who starred in a porn horror film by Rammstein singer Till Lindemann. “If someone needed to poison women, they would have recognized and poisoned them 10 years ago without these tools,” one Russian feminist, who received death threats after defending the women in the Lindemann video, told me, arguing that face recognition tools “can also serve for good, for example to recognize criminals and crooks”.

We will publish a story about FindClone on the AlgorithmWatch website later this month – if you’re interested, stay tuned.

NEWS  in automated decision-making

The controversial face recognition firm Clearview AI says it plans to pivot to the private sector, for example by selling "visitor management systems" to schools [Reuters, 24 May].

EU agencies Frontex and Europol are proposing a "European System for Traveller Screening" that "could include AI technology." [Statewatch, 19 May].

At five supermarkets in São Paulo, Brazil, customers can now pay for their groceries by face recognition or vein biometrics (waving their hand over a camera) [Biometric Update, 17 May].

This newsletter will remain free of charge, but please support our work on automated decision-making with your donation:

WE READ IT  so you don't have to

Report  "How Dare They Peep into my Private Life?" Children's Rights Violations by Governments that Endorsed Online Learning During the Covid-19 Pandemic, Human Rights Watch, 25 May.

A new report from Human Rights Watch analyses how 164 education apps and websites, endorsed by governments across 49 countries during the Covid-19 pandemic, collected and shared students’ information. Many of these products shared children's data with advertising technology companies, whose algorithms analyzed the data to guess a child’s characteristics and predict their future behavior. Some sold this information on to advertisers for the purpose of ad targeting.

The report found that, amongst other products, Ruangguru, an app used by students in Indonesia, collects, analyses and shares students' personal information with Facebook and AppsFlyer. Meanwhile MUSE, a primary school app used by children in Pakistan, was found to have collected and shared students’ personal data with Google and Facebook, improving those companies' ability to target students with behavioral advertising. Various products did not disclose in their privacy policies that their users' data was being used for behavioral advertising.

MORE FROM ALGORITHMWATCH

  • Professor of digital and data journalism Christina Elmer is AlgorithmWatch's new shareholder. 
  • Read AlgorithmWatch's latest summary on how the Council of Europe is working on legal frameworks to regulate AI systems.
  • The ZDF documentary "Programmed Injustice" features AlgorithmWatch's Anna Lena Schiller and Matthias Spielkamp [DE].

If you want to get in touch, give feedback on the newsletter, share a personal experience with an automated system or a new development you've heard of, just reply to this email. I'd love to hear from you.


AlgorithmWatch 2022, CC-BY
AW AlgorithmWatch gGmbH, Linienstraße 13, 10178 Berlin, Germany
algorithmwatch.org/en