This can be very useful intelligence for the Ukrainian military - and for war crimes investigators.
Russians are likely conducting similar analyses, but the more troops know they are being watched and can be identified, the more likely they are to restrain their behavior. JL
Tom Simonite reports in Wired:
A conflict between two internet-savvy nations in a region with good cellular coverage offers rich pickings for intelligence. Cross-referencing social media posts and other sources can reveal the locations or losses of military units. Abundant online photos, the legacy of years of social networking, plus easy access to face recognition algorithms, make it possible to use a screenshot of a person's face to track down their name and family. Russian analysts are likely tracking Twitter and TikTok the same way, but "The more individuals are publicly identified and know their movements are followed, the less chance they will commit war crimes."

ON MARCH 1, Chechnya's leader, Ramzan Kadyrov, posted a short video on Telegram in which a cheery bearded soldier stood before a line of tanks clanking down a road under an overcast sky. In an accompanying post, Kadyrov assured Ukrainians that the Russian army doesn't hurt civilians and that Vladimir Putin wants their country to determine its own fate.
In France, the CEO of a law enforcement and military training company called Tactical Systems took a screenshot of the soldier's face and got to work. Within about an hour, using face recognition services available to anyone online, he identified the soldier as likely Hussein Mezhidov, a Chechen commander close to Kadyrov involved in Russia's assault on Ukraine, and found his Instagram account.
“Just having access to a computer and internet, you can basically be like an intelligence agency from a film,” says the CEO, who asked to be identified as YC to avoid potential repercussions for his sleuthing. Tactical Systems’ client list includes the French armed forces, and it offers training in open source intelligence gathering.
Russia’s assault on Ukraine, a conflict between two internet-savvy nations in a region with good cellular coverage, offers rich pickings for open source intelligence, or OSINT. Compiling and cross-referencing social media posts and other public sources can reveal information such as the locations or losses of military units. The abundant online photos that are the legacy of years of social networking and a handful of services that provide easy access to face recognition algorithms allow some startling feats of armchair analysis.
Not long ago, a commander or prisoner of war pictured in a news report might be recognizable only to military and intelligence analysts or the individual's own colleagues, friends, and family. Today a stranger on the other side of the globe can use a screenshot of a person’s face to track down their name and family photos—or those of a look-alike.
WIRED used a free trial of a Russian service called FindClone to trace a photo of a man that a Ukrainian government adviser claimed was a captured Russian soldier. It took less than five minutes to find a matching social media profile. The profile, on Russian social network VKontakte, included the teenager’s birthdate and photos of his family. It listed his place of work as “polite people/war.” The Russian phrase “polite people” is used to refer to soldiers from Russia active in Ukraine during the 2014 annexation of Crimea. Ukrainian open source intelligence group InformNapalm independently made the same connection in an earlier post claiming to identify two of the claimed captives and confirmed in a message to WIRED that it had relied in part on face recognition.
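None of the services named here disclose how they work internally, but the underlying technique they illustrate, searching one face against a large index of scraped photos, can be sketched with open-source tools. The snippet below is a minimal illustration using the open-source face_recognition library, with a small local folder of photos standing in for a scraped index; it is not how FindClone or any commercial service actually operates, and the file paths are hypothetical.

```python
# Minimal sketch of one-to-many face search, assuming the open-source
# face_recognition library (https://github.com/ageitgey/face_recognition).
# The "index" here is just a local folder of photos standing in for the
# millions of scraped images a commercial service would hold.
import os
import face_recognition

INDEX_DIR = "scraped_photos/"   # hypothetical folder of candidate images
QUERY_IMAGE = "screenshot.jpg"  # hypothetical screenshot of the unknown face

# Encode the query face as a 128-dimensional vector.
query = face_recognition.load_image_file(QUERY_IMAGE)
query_encodings = face_recognition.face_encodings(query)
if not query_encodings:
    raise SystemExit("No face found in the query image.")
query_encoding = query_encodings[0]

# Encode every face found in the "index" and measure its distance to the query.
matches = []
for name in os.listdir(INDEX_DIR):
    image = face_recognition.load_image_file(os.path.join(INDEX_DIR, name))
    for encoding in face_recognition.face_encodings(image):
        distance = face_recognition.face_distance([encoding], query_encoding)[0]
        matches.append((distance, name))

# Smaller distance means more similar; 0.6 is the library's conventional cutoff.
for distance, name in sorted(matches)[:5]:
    print(f"{name}: distance {distance:.3f}")
```

At the scale of a real service, the per-image loop would be replaced by a precomputed database of embeddings and an approximate nearest-neighbor index, but the principle is the same: one query vector ranked against many stored ones.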
That power to identify people from afar could bring new accountability to armed conflict but also open new avenues for digital attack. Identifying—or misidentifying—people in videos or photos said to be from the front lines could expose them or their families to online harassment or worse. Face algorithms can be wrong, and errors are more common on photos without a clear view of a person’s face, as is often the case for wartime images. Nonetheless, Ukraine has a volunteer “IT Army” of computer experts hacking Russian targets on the country’s behalf.
Even amateur investigators can access multiple face recognition services. Some can search across millions of faces found online in a way similar to controversial US startup Clearview, which markets primarily to law enforcement. To identify the bearded Chechen soldier, YC of Tactical Systems first used FindClone, which searches across photos sourced from VKontakte. The results led to a photo of the soldier clasping hands with Kadyrov. An openly accessible demo of a Microsoft service that compares faces in two photos, marketed for uses like checking IDs, also judged that the photos showed the same person.
A face search engine called PimEyes, which was founded in Poland and once claimed to have compiled 900 million faces, turned up more photos. One pointed to an Instagram account with a photo that revealed Hussein Mezhidov’s name. Searches using that name returned articles describing him as a commander and special forces trainer, as well as a YouTube video apparently shot in Ukraine in which he pulled the national flag down from a government building.
Tactical Systems’ Twitter thread recounting that investigation spread quickly. Its CEO says he hopes to inspire others to develop open source intelligence skills that can help hold combatants in Ukraine or other conflicts to account. “The more these individuals are publicly identified and know that the OSINT community is following their movements, the less chance they will commit war crimes,” he says. Microsoft, PimEyes, and FindClone did not reply to requests for comment.

Face recognition can also be used to debunk identification claims. Last weekend, Tactical Systems and high-profile open source intelligence group Bellingcat both turned to Microsoft’s face verification service after reports, including from Ukrainian newspaper Ukrayinska Pravda, that the bandaged face of a man said to be a Russian pilot shot down in Ukraine matched that of a pilot pictured alongside Vladimir Putin in a 2017 news photo from Syria. Microsoft’s algorithms spat out a low score and said the faces did not match.
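The article does not describe Microsoft's verification service beyond the score it returned, but the general pattern of such a check, comparing exactly two photos and thresholding a similarity measure, can be sketched with the same open-source library used above. The file names and the 0.6 threshold below are assumptions for illustration, not parameters of any service mentioned here.

```python
# Minimal sketch of one-to-one face verification, assuming the open-source
# face_recognition library; file names and the 0.6 threshold are illustrative.
import face_recognition

def same_person(path_a: str, path_b: str, threshold: float = 0.6) -> bool:
    """Return True if the first face detected in each photo appears to match."""
    enc_a = face_recognition.face_encodings(face_recognition.load_image_file(path_a))
    enc_b = face_recognition.face_encodings(face_recognition.load_image_file(path_b))
    if not enc_a or not enc_b:
        raise ValueError("Could not find a face in one of the images.")
    distance = face_recognition.face_distance([enc_a[0]], enc_b[0])[0]
    print(f"distance = {distance:.3f} (lower means more similar)")
    return distance < threshold

# A large distance (low similarity) is what "debunks" a claimed identification.
print(same_person("claimed_pilot.jpg", "archive_photo_2017.jpg"))
```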
Bellingcat includes advice on the use of face recognition tools in its guides to open source intelligence. The group credited FindClone in a 2019 report that identified several people alleged to have been involved in shooting down a Malaysia Airlines flight over eastern Ukraine in 2014. Dutch investigators concluded that the flight was downed by a Russian missile, but Russia's government denied involvement.
Posts that cite face recognition to back up claims about people on the frontlines in Ukraine have for the most part generated a positive reaction on social media—in contrast to the typical response to revelations about police or government use of face recognition.
Jameson Spivack, an associate at Georgetown’s Center on Privacy & Technology, says some of the same concerns about government uses of the technology also apply when it's being used for identifications in war-torn Ukraine.
One is that face recognition performs unreliably on images that don’t capture people head-on, a limitation for both police detectives and those sourcing images from war zones. Another is the potential unintended consequences of correct or incorrect identifications. “Individuals using the technology don’t have the power of the state behind them like law enforcement, but the internet can put the collective power of the mob behind them,” Spivack says.
YC of Tactical Systems agrees. He says that he always takes care to back up algorithms’ assessments with other visual clues or contextual information. In the case of the bearded Chechen, a distinctive notch in the man’s beard helped confirm some matches. “Humans are needed, too,” he says.
If distant volunteers can identify combatants using face recognition, government agencies can do the same or much more. “I’m sure there are Russian analysts tracking Twitter and TikTok with access to similar if not more powerful technology who are not sharing what or who they find so openly,” says Ryan Fedasiuk, an adjunct fellow at the Center for a New American Security.