Intro

An end to privacy

On March 13, 2022, 34-year-old English teacher Yulia Zhivtsova left her Moscow apartment to meet her friends at the mall. Bundled up against the freezing cold, she entered the metro at the CSKA station on the Bolshaya Koltsevaya line, passing through station barriers that let travelers pay by scanning their faces.

But when she went down to the platform, two police officers plucked her out of the crowd. 

“Hey!” said one, and then addressed her by her full name, including the Russian patronymic. “Yulia Maksimovna. Come with us.”

The officers looked back and forth between Zhivtsova and an image on their smartphones. They seemed unsure if they had the right person. Catching a glimpse of the screen, Zhivtsova recognized a photo of herself taken the month before, when she was detained for protesting Russia’s war in Ukraine. Her hair looked different: In the photo it was faded blue, but that day it was back to a gleaming teal. “I do tend to change my hair color a lot,” Zhivtsova told Rest of World.

After a while, the officers decided to trust the image on their smartphones. Another anti-war demonstration was taking place in Moscow that day, and even though Zhivtsova didn’t plan to attend, they detained her preventively, holding her for a few hours.


Over the past decade, there has been a steep global rise in law enforcement’s use of facial recognition technology. Data gathered by Steven Feldstein, a researcher with the Carnegie Endowment for International Peace, shows that government agencies in 78 countries now use public facial recognition systems.

The public is often supportive of the use of such tech: 59% of U.K. adults told a survey they “somewhat” or “strongly” support police use of facial recognition technology in public spaces, and a Pew Research study found 46% of U.S. adults said they thought it was a good idea for society. In China, one study found that 51% of respondents approved of facial recognition tech in the public sphere, while in India, 69% of people said in a 2023 report that they supported its use by the police.

But while authorities generally pitch facial recognition as a tool to capture terrorists or wanted murderers, the technology has also emerged as a critical instrument in a very particular context: punishing protesters. 

The last 20 years have shown that mass demonstrations can have real impacts. Starting in 2010, a wave of protests across the Middle East and North Africa, known as the Arab Spring, toppled regimes in Tunisia, Libya, Egypt, and Yemen, and spurred revolts in many other countries. In 2014, protesters in Hong Kong took to the streets to demand universal suffrage in what became known as the Umbrella Revolution, owing to demonstrators’ use of umbrellas to shield against pepper spray. While authorities did not make any concessions at the time, the protests drew global attention, and when Hong Kongers took to the streets with new demands in 2019, the government withdrew a controversial bill that would have allowed suspects to be extradited to mainland China. In 2020, the #EndSARS movement against police brutality in Nigeria resulted in the disbanding of the Special Anti-Robbery Squad, the police force at the center of the controversy, while mass protests in Chile led to a cabinet reshuffle and a referendum to rewrite the country’s constitution. So far, 2024 has been marked by several large-scale protest events, including farmers’ protests in India and Europe, and protests in many countries against the war in Gaza.

In countries where demonstrating can come with physical or political risk, large-scale protests have historically offered a degree of anonymity, and, with it, a level of protection. Mass protests are a way for citizens to express dissent as a collective — often under the assumption that “they can’t arrest us all.” 

But in the last decade, the spread of facial recognition technology has changed that equation: A lone face in a crowd is no longer anonymous; facial recognition allows authorities to capture people’s identities en masse. 

It’s no coincidence that the widespread adoption of the technology has evolved in parallel with increasingly draconian laws against protest. As part of its “Protect the Protest” project, Amnesty International tracks repressive legislation that imposes illegitimate restrictions on protests, with examples across five regions. Facial recognition tech helps enable this repression by offering a way to enforce such regulation on a sweeping scale.

A map of the world showing all the countries known to use facial recognition technology.

In the U.S., law enforcement used facial recognition at Black Lives Matter protests in 2020, resulting in at least one activist being targeted at their home. In the U.K., the London Metropolitan Police admitted to using facial recognition technology on tens of thousands of people attending King Charles III’s coronation in May 2023. 

Often, facial recognition is used to disproportionately target people belonging to a racial, ethnic, or religious minority. “Again and again we see that it’s people who are already targeted by police or subject to severe movement restrictions, or have already been subject within their communities to police brutality, that are most targeted by these tools,” Matt Mahmoudi, a researcher at Amnesty International who specializes in facial recognition, told Rest of World.

Mass demonstrations have become opportunities for authorities to net thousands of faces via CCTV, van-mounted cameras, and police mobile devices, which can then be added to facial recognition databases. In the past month, Indian authorities used the technology to identify people who participated in farmers’ protests, threatening to cancel their passports. A Russian civil society group believes Moscow police used the technology to track down people who attended opposition leader Alexei Navalny’s funeral.

A black and white photograph showing a number of security officers wearing camouflage and body armor, walking along a street past a large crowd.
Alexander Nemenov/AFP/Getty Images

The result is a fundamental shift in the power balance between authorities and the general public that is changing the nature of protest. The most obvious outcome is a chilling effect: Facial recognition technology puts demonstrators at greater risk of persecution, often stymieing efforts to protest before they even occur. 

Meanwhile, those still brave enough to take to the streets are finding ways to try to mitigate the threat. Protests in Chile and Argentina saw people donning face masks and balaclavas. Activists in London have painted their faces with so-called dazzle makeup, developed by artist Adam Harvey and designed to confuse algorithms (although experts warn that this may not be effective against newer facial recognition systems). In Hong Kong, during the pro-democracy protests that started in 2019, demonstrators again employed umbrellas, which helped conceal their faces from police surveillance, fired lasers to blind cameras, and felled surveillance towers. In mainland China, protesters demonstrated against the government in November 2022 by holding blank pieces of paper in the air and in front of their masked faces.

Authorities are often secretive about their use of facial recognition at protests. Often, the people arrested are not told whether the technology has played a role in their detention, even if they suspect it. Over six months, Rest of World spoke to researchers, activists, and people targeted by facial recognition systems around the world to track how this technology is upending protest as we know it. We found evidence of facial recognition tools being used at major protests worldwide, often in a way that clashes with civil liberties. The context may vary by location, but the overall outcome is shared: Facial recognition technology is making the act of protest riskier than ever, putting demonstrators at greater risk of persecution, exacerbating the targeting of minority groups, and changing the way people express dissent. 

Combined with a rise in authoritarianism in many countries, some activists and civil groups even fear that the increased use of facial recognition could mean an end to protest as we know it. “I don’t see [almost] any protest anywhere,” Shivangi Narayan, a sociologist in India who studies digital policing, told Rest of World. “Even a person like me who’s working on government surveillance and policing — I’m wary of who I’m talking to.”

Now, if she knows there will be CCTV surveillance at a particular location, Narayan avoids the area, or covers her face.


Russia

The rise of pre-crime

A black and white illustration with aqua highlights showing security officers monitoring people waiting to enter a subway entrance.

Six months after being detained at CSKA station, Yulia Zhivtsova was back on the Moscow Metro. It was September 30, 2022, and the government was holding a concert rally next to the red-brick walls surrounding the Kremlin to celebrate the “new territories” annexed from Ukraine. Zhivtsova hadn’t attended protests since her last detention; she was heading home from a dance class. 

But when she entered the metro at Tverskaya station, two officers stopped her. Again, they took a while to compare the image on their screens with the woman in front of them — who now had shorter hair — before taking her in. 

Zhivtsova expected to be detained for a couple of hours and let go like last time. But at the police station, officers noticed a tattoo on her hand. After her previous detention, she’d gotten the words “No to war” inked. The officers charged Zhivtsova with “discrediting the military” under draconian new legislation that carries a possible five-year prison sentence.

A black and white photograph showing a woman seated by herself with a backpack on a subway car.
Yulia Zhivtsova

In August 2023, she was convicted and fined 50,000 rubles ($540). Shortly after, she left Russia, fearful that she would be sent to prison if she were caught with the tattoo again. Zhivtsova now lives in Montenegro. “I’m not going back anytime soon,” she said.

Other anti-war protesters in Russia have similar stories. After the invasion of Ukraine, Andrei Markov put up anti-war posters in public spaces. He also wrote snippets of anti-war graffiti on the Moscow Metro. One Thursday afternoon in July, he boarded a quiet carriage and scrawled the words “no war” on the corner of a subway map in permanent marker. 

The following Sunday, police knocked at his apartment. “That was a bad surprise,” he told Rest of World. “I didn’t open the door.” But when he entered the metro for his commute on Monday morning, police stopped him. 

Markov asked to be identified by a pseudonym because he currently lives in Russia and fears authorities could target him further for speaking out. He said he spent 40 hours in a 3-square-meter cell between interviews. “It was a very bad experience. I could not call relatives, lawyers or anybody else, and it made me really nervous. They even did not give any drink or food,” Markov said. Finally, officers told him he was accused of discrediting the military. They showed him a video of him writing the graffiti. They also told him that he had appeared in Sfera, a facial recognition system used in Moscow’s metro. He was released with a fine.

But the next time Markov went to take the metro, a police officer immediately stopped him. Markov explained he’d already been arrested and paid the fine. The officer shrugged. “This is not my problem,” he said. 

Markov doesn’t take the metro any more, and has changed his apartment and phone number. “But I’m sure they can still find me very easily,” he said. “Not just me, but everybody. They’re using modern systems that are really good to find any person.” 

Rest of World spoke to five Russians who had attended protests in the past and were later tracked down by law enforcement using facial recognition technology. OVD-Info, a Russian human rights and media group dedicated to fighting political persecution, has documented 595 cases in which facial recognition technology was used against dissenters since 2021. Of those, 141 were cases of activists being detained preventively on the metro — all since the invasion of Ukraine. OVD-Info defines preventive detentions and arrests as stopping people who have not been charged with anything, as a way of preventing them from attending political protests.

A black and white photograph showing a large group of people, the person in the center is holding up a picture of Alexei Navalny with Russian text under it.
Kirill Kudryavtsev/AFP/Getty Images

But while many detainees report police officers verbally stating that they have been caught by facial recognition technology, or even showing them the Sfera system on their devices, the tech rarely appears in court documents. 

“The authorities are still hiding the facts,” Stanislav Seleznev, a lawyer working for Net Freedoms Project, which defends Russian digital rights, told Rest of World.

Russia’s use of facial recognition has coincided with a general clampdown on protest following mass demonstrations in December 2011 around elections to the State Duma, the lower house of Russia’s parliament. The next year, authorities installed 60,000 new cameras in Moscow alone. In subsequent years, several Russian companies emerged offering facial recognition tech to the authorities. 

In 2017, the Moscow Department of Information Technologies announced that 3,000 of the city’s cameras — a network that now numbers 160,000 — were being connected to a facial recognition system built by the software company NtechLab. Authorities also started to regularly deploy the tech at protests, rallies, and private events. In 2018, it was connected to the Moscow Metro; in 2019, the Ministry of Internal Affairs claimed it had identified 90 wanted persons using the tech. NtechLab did not respond to a request for comment from Rest of World.

The full potential of facial recognition technology as a tool against protest became clear in April 2021 at a march in support of Navalny. The opposition leader had just returned to Russia after being treated in Germany for poisoning, and had been imprisoned. Few people were detained at the protest, but OVD-Info’s lawyers soon realized that facial recognition had been deployed at the event when police started showing up at the homes and workplaces of many attendees.

A chart and map showing protests across the country of Russia since 2020, highlighting the month of March 2024.

A. A., who asked to be identified only by his initials out of fear of receiving more unwanted attention from the authorities, attended a protest in support of Navalny in January 2021. A week after the march, he opened his door to the police, who showed surveillance photos of him at the protest and warned him against attending “unauthorized events.” He was arrested and spent the better part of a weekend in detention. Eventually, he went to court and received a fine. 

Thanks in part to its existing database of dissenters, the Kremlin was in an even stronger position to clamp down on protests following Russia’s invasion of Ukraine. According to OVD-Info’s data, protests against the invasion peaked in 2022, when there were more than 20,000 political arrests; in 2023, protests dwindled, though close to 800 people across the country faced criminal charges for “anti-war” views. While Moscow is the clear leader in the use of facial recognition technology, it’s spreading fast across Russia: Some 62 regions use the tech today, up from just five in 2021.

“After spring 2022, mass protests practically disappeared.”

Seleznev suggests facial recognition technology, alongside increasingly repressive laws, has had a chilling effect on protest movements. “After spring 2022, mass protests practically disappeared,” he said. According to him, anti-war protests are not mass events any longer, but “individual in nature”  — like Zhivtsova’s tattoo and Markov’s graffiti. 

“I think many may be frightened by facial recognition, as punishment becomes more and more inevitable,” said A. A. “I think this is one of the main reasons why protests do not occur in Moscow and, to a lesser extent, in other cities.”

Even those who still wish to protest may struggle to do so because of preventive detentions. Maria Nemova, a lawyer at OVD-Info, has encountered many such cases where people were stopped in Moscow’s metro. “People were detained not for a specific action, but simply because they were considered dangerous,” she told Rest of World.

Nemova said she knows of several people who decided to leave Russia after being preventively detained via facial recognition technology on the metro. “These individuals explained their decision by a sense of vulnerability and lack of protection: The state no longer needs a reason, such as participation in a protest, to detain you, and it can track you almost anywhere.”


India

Targeting minorities

A black and white illustration with aqua highlights showing a large protest being surveilled by CCTV cameras and a drone.

When India’s government passed the Citizenship Amendment Act (CAA) in December 2019, it triggered some of the biggest protests the country had seen in years. The law, which was proposed by the ruling Bharatiya Janata Party (BJP) and its Hindu nationalist prime minister, Narendra Modi, streamlined the path to citizenship for migrants from neighboring countries, while excluding Muslims. Critics argued that, combined with the National Register of Citizens and its strict requirements for birth and identity documents, the new law could make many Muslims in India effectively stateless. 

Shivangi Narayan, a 39-year-old sociologist who lives in a New Delhi suburb, recalled noticing many people she wasn’t used to seeing at demonstrations, including more women, ethnic minorities, and students. The issue had clearly struck a nerve. “The protests became national. And there were women at the forefront,” Narayan told Rest of World. “Anybody who wanted democracy and freedom of speech and expression — and who wanted a lively diverse culture in India — was loving it.”

Narayan, whose research focuses on the use of high-tech tools in policing, was quick to notice that law enforcement was surveilling the protests: videoing the crowds, approaching demonstrators to ask their names and affiliations, and deploying drones. “I think nobody really thought much about it,” she said. “I know a couple of people who were active in protests who were like, so what? Take my picture, it doesn’t really matter.”

“A lot of students we did not know by name or face before that,” Radhika Ganesh, a prominent political youth organizer, told Rest of World. “They’re not activists. They’re not people who have an agenda. They’re not people who have a network or the support systems that some of us do. They didn’t think of hiring lawyers, or what their paper trail was going to be. These were really unassuming young people who found their agency.”

Like Narayan, Ganesh was used to being surveilled. Soon, however, she and other veteran activists noticed a shift in who was being targeted and how. “You could be an absolute no-one, just a random student in the middle of the crowd surrounded by a sea of people,” she said. Participants like these were being picked up before and after protests. Given the omnipresence of surveillance cameras at the protests, and the fact that it would be almost impossible to manually identify so many random individuals lacking any public profile, activists believed facial recognition was in use.

A black and white photograph showing a massive crowd of people protesting on the street of an Indian city, many of them waving white flags and holding signs.
Sanchit Khanna/Hindustan Times/Getty Images

As the protests rolled on, police began to aggressively target student activists from Jamia Millia Islamia university, which has a strong Muslim history. “Young hijabi women were being arrested and picked up,” Ganesh said. “All of a sudden you hear of parents running into the protest sites and saying, ‘Oh my daughter has been taken away — I can’t figure out which police station they’ve been taken to.’”

Things got worse when rioting related to the citizenship law broke out in northeast Delhi in February 2020. More than 50 people died in the riots — the majority of them Muslim. Evidence collected by the Delhi Minorities Commission suggested that the police deliberately stoked violence against Muslims — attacking them in at least one case. After the riots, video footage was used to arrest hundreds of people. The Delhi police commissioner told local press a year later that 755 people were arrested in connection with the riots across Delhi, 231 of whom were traced using CCTV or other video footage. Of these, 137 were identified using facial recognition. 

Surveillance in India began to increase after the BJP came to power in 2014, with cameras increasingly common in cities. But the CAA protests were a watershed. Anushka Jain, formerly of India’s Internet Freedom Foundation, documented the use of facial recognition across the country through a website called Project Panoptic. Jain told Rest of World the CAA protests were the first time her organization had heard of facial recognition being used against protesters. Since then, the use of the technology has quickly ramped up.

A key concern is the disproportionate use of facial recognition against minorities, and particularly Muslims. A study by the Vidhi Centre for Legal Policy, an Indian think tank, found that the tech would almost inevitably disproportionately impact Delhi’s Muslim community — a consequence of both police prejudice and the over-surveillance of Muslim-populated areas.

There has been a rise in anti-Muslim sentiment in India, which has been reflected in policing. Data from the National Crime Records Bureau shows that more than 30% of those detained in Indian prisons in 2021 were Muslim (Muslims form around 14% of the country’s population). A study in 2019 found anti-Muslim prejudice was rife among India’s police. In this context, facial recognition can be employed to specifically target Muslims for arrest.

“A lot of the use of this technology is under wraps.”

This prejudice can also take a more passive form. In May 2021, during the second wave of Covid-19 in India, social activist S.Q. Masood was heading to his home in Hyderabad. With 83 cameras per 1,000 people, the southern Indian city has the densest CCTV network outside of China, according to the consumer research site Comparitech. Masood was stopped by a dozen police officers who had set up a picket near a Muslim-dominant area. Two officers, who were carrying tablet-sized phones, asked him to remove his face mask. Given that not wearing a face mask carried a penalty, Masood refused. The police photographed him anyway.

Afterwards, Masood heard that the state police were using facial recognition technology to compare photographs of people to images in India’s nationwide Crime and Criminal Tracking Network and Systems (CCTNS) database.

“I was worried, being a Muslim in the current political scenario where every day the minority community is targeted,” Masood told Rest of World. Concerned about his photo potentially being in the authorities’ database, he contacted lawyers and activists. With the support of the Internet Freedom Foundation, Masood took Telangana state — of which Hyderabad is the capital — to court, claiming that the use of facial recognition technology was illegal and unconstitutional. “There is no law in the state and central government that empowers law enforcement agencies to use facial recognition,” Masood said. 

A chart and map showing protests across the country of India since 2020, highlighting the month of March 2024.

The matter is still pending. Masood does not expect the authorities to stop using facial recognition, but hopes to force greater transparency around how they use it.

The potential for the technology to be weaponized as a tool for targeting minorities was again seen in India in 2021 with the widespread farmers’ demonstrations, which resulted in a police clampdown that disproportionately targeted the Sikh community — in part through the use of facial recognition. 

As a result, while facial recognition technology may have a chilling effect on protest generally, this effect is felt particularly keenly by communities that are already marginalized.

Srinivas Kodali, an independent digital researcher based in Hyderabad, told Rest of World that facial recognition is proving a powerful deterrent against public demonstrations. “It stops you from coming to a protest, because you know the police are going to recognize you,” he said. Protests in Hyderabad have become very rare, Kodali said, in part due to the use of facial recognition alongside other police tactics. “Police arrest people before they even come to the protest site,” he said.

Iran

Phantom technology

A black and white illustration with aqua highlights showing a hand taking a photo on a cell phone of a woman seated at a table drinking with other women in a cafe.

On a mild spring evening in 2023, Maryam, who is in her late 30s, was sitting in the courtyard of a Tehran coffee shop. Maryam — who asked to use a pseudonym to protect her safety while living in Iran — was not wearing a hijab, although it is required by the country’s law. Since the protests sparked by the death of 22-year-old Mahsa Amini, who died in police custody in September 2022 after being arrested for not wearing a “proper” hijab, Maryam has sometimes kept her hair uncovered in public — an act of personal defiance. “I prefer to show in public that we want to have the right to choose,” she told Rest of World in an interview in November 2023.

Several months after her coffee outing, Maryam was summoned to court for failing to wear a hijab. She planned to say that they had the wrong person, but then she was shown a photo of herself at the cafe. “I was shocked and couldn’t say that [it wasn’t me],” Maryam said. 

There were several other women in court with Maryam that day for clothing-related charges. Maryam watched them being questioned by the judge and signing documents about how to behave. One woman complained the photographer had acted “unjustly” because her headscarf had only fallen for a moment. “The photographers are paid for this job and they are just doing their job,” Maryam recalled the judge responding. “It is not unjust.”

“There was no other way to recognize me.”

Maryam stayed quiet. The judge gave her a document that stated she, too, had unintentionally let her headscarf fall — even though she knew full well that she had not been wearing the hijab at all. She signed the confession anyway, and received a fine. 

Maryam doesn’t know how her picture was taken — she didn’t see anyone taking photographs at the cafe — but she firmly believes she was identified through facial recognition. “There was no other way to recognize me,” she said.

A few weeks before the anti-hijab protests that followed Amini’s death, the Iranian government had announced that it was planning to use facial recognition to identify women not wearing the hijab on public transport. The announcement came shortly after the hard-line president, Ebrahim Raisi, signed a decree restricting women’s clothing, which followed women posting videos of themselves on social media with their heads uncovered. The secretary of Iran’s Headquarters for Promoting Virtue and Preventing Vice, Mohammad Saleh Hashemi Golpayegani, said that images would be compared with the national database for citizens’ ID cards.

In the wake of the protests, authorities reiterated that facial recognition would be used to identify offenders. “People who remove their hijab in public places will be warned first and presented to the courts as a next step,” said Iran’s police chief in an interview on state television last April. A police statement said the authorities would “take action to identify norm-breaking people by using tools and smart cameras in public places and thoroughfares.” In June 2023, a video released by a news agency linked with the Islamic Revolutionary Guard Corps showed supposed facial recognition technology being used.

A black and white photograph showing a person standing atop a large trash bin, holding up two fingers in a 'v' alongside a number of other people during a protest.
MEI/Redux

It remains unclear to what extent the Iranian authorities are actually using facial recognition technology. The government could be exaggerating its capabilities in order to induce fear, Azin Mohajerin, deputy director at human rights organization Miaan Group, told Rest of World. Mohajerin has separately worked with a group of volunteers to document detainees from last year’s protests.

Mohajerin and her fellow volunteers found cases where people were arrested at home after a protest. But there are several ways to identify individuals in a crowd: Authorities often post pictures of suspects on websites and Telegram channels, and invite members of the public to report their neighbors. In recent years, the authorities have also launched an app called Nazer that encourages the public to report individuals who are not properly clothed. “They were using various techniques to identify the protestors,” said Mohajerin. Facial recognition could be among these methods, “but we couldn’t have any proof of that,” she said.

A research project by the Miaan Group, called Filterwatch, analyzed a trove of hacked prosecutorial emails. It found that two Iranian companies working on facial recognition for commercial purposes had been collaborating with the authorities since 2015. Iran is also known to have acquired smart cameras from Chinese firm Tiandy Technologies.

A chart and map showing protests across the country of Iran since early 2023, highlighting the month of March 2024.

There were also recent reports from the city of Mashhad about commuters entering the metro and passing in front of CCTV cameras, after which their photos, names, and genders — seemingly taken from the national ID database — appeared on a screen, suggesting facial recognition was being implemented. After backlash over the violation of citizens’ right to privacy, the authorities confirmed the tech’s use. A spokesperson for the city council said it would only be used to catch “enemies of the regime.”

For now, activists cannot tell the extent to which Iranian law enforcement is actually using facial recognition, and how much it is invoking the idea of the technology to scare people into not attending protests or participating in other forms of dissent. 

“Currently it seems the authorities are leveraging the use of these technologies more as a tool for intimidation,” Azam Jangravi, a researcher at Citizen Lab, told Rest of World. “Based on my analysis, they have not achieved the capability to conduct facial recognition on a widespread scale.”

To some degree, the result is the same. In the context of the strict hijab law, just the threat of an all-seeing eye could be all the government needs, even if the system isn’t very advanced or widespread. “It doesn’t necessarily need to be in place in order to disincentivize women to come out and protest,” said Amnesty International’s Mahmoudi. “Ultimately, the fear that it generates is the same.”

Maryam has continued with her personal protest of not wearing the hijab, though she is on high alert for anyone around her who might be trying to photograph her. She does not feel aggrieved about being summoned to court. “It is somehow like we pay for our freedom,” she said. “I was satisfied with a fine.”

She is concerned, however, about what might result from a repeat offense if she is caught again. “What is the fine for the next time? Is it the same or will something worse happen?” she said.


End

Policing emotion

Illustration of a protest sign lying face down on the ground, blanketed with the shadows of surveillance cameras.

As the adoption of facial recognition at protests spreads, digital rights groups are mobilizing to push for legislation that protects civil liberties from the technology. But Daniel Leufer of the advocacy group Access Now worries that the EU, the world’s de facto leading regulator on tech, has set a very low bar. Its recent legislation on the use of artificial intelligence and facial recognition has “problematic loopholes and exceptions,” Leufer told Rest of World.

Facial recognition systems are also outpacing any legislative efforts. Leufer points to emerging technology that claims to offer emotion detection, which he said could be adapted for use against protesters. EU funds have supported research into surveillance tech intended to predict the level of potential violence at large public events. NtechLab is developing features including aggression and violence detection, according to reports.

“If a crowd goes above an aggressivity threshold, then the system could alert the authorities,” Leufer said. Experts question the technology’s accuracy in predicting human emotion based on facial expressions. Baked-in biases play a part here, too: One study, using 400 images of NBA players, showed how two facial recognition systems consistently rated Black players as more “angry” than white players.

“But also, more fundamentally — is anger at protests not acceptable?” Leufer said. “The criminalization of emotions is going down a very dark road.”