Artists’ Bid to ID Police Officers – A Tool Against Unlawful Surveillance

Around the world, police forces use facial recognition systems to find crime suspects. But artists and social activists have now decided to turn the tables on them, creating AI-based tools that amount to a Police Hall of Shame.

Their tools aim to send a message to police officers who overstep and act violently towards citizens by making them easily identifiable.

One of these tools’ creators is Christopher Howell, a lifelong US activist and self-taught coder. But he is not alone in the fight against police violence. Italian artist Paolo Cirio and other social advocates have created projects built on the same principle.

Let’s find out more about what drove them to these initiatives and their implications.

Fighting against facial recognition with art

2020 has definitely been a year of protests.

People from different corners of the world came together and took a stand. They protested against abusive regimes, disputed election results, judicial reforms, and police brutality.

More than once, law enforcement officers responded to those protests with cruelty. Those who were supposed to protect civilians did the opposite. So, people decided to take matters into their own hands.

In creating his tech, Howell was motivated by recurring cases of US police officers covering their name tags or refusing to show any ID during violent incidents.

The simple act of concealing a name tag may embolden officers to act more aggressively. Protected by anonymity, they may feel less accountable for their actions.

Across the world, in Italy, Cirio transformed a thorny social issue into art. His piece is a colossal image composed of the faces of over 4,000 French police officers. He claimed the project could be the first step towards a facial recognition app for identifying individual officers.

However, Cirio’s project was shut down after Gérald Darmanin, France’s interior minister, threatened to take legal action against the artist. Yes, that’s the same France whose own surveillance of its population has been criticized for overreach.

The problem with facial recognition tools

Howell’s and Cirio’s projects read as a counterattack against Clearview AI, the American technology company that provides facial recognition software to businesses, law enforcement agencies, universities, and individuals.

Clearview AI has a database of more than 3 billion images gathered from all corners of the internet, including platforms like Facebook, Instagram, and YouTube.

From a single snapshot, Clearview AI can identify a face and then surface publicly available details about the person. More than 2,400 law enforcement agencies reportedly use it.
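
Clearview AI’s system is proprietary, but the general pipeline behind this kind of matching is well understood: detect a face, convert it into a numerical embedding, and compare that embedding against a database built from scraped photos. Below is a minimal sketch of that generic pipeline using the open-source face_recognition library; it is not Clearview’s technology, and the image file names are hypothetical.

```python
# Toy illustration of a generic face-matching pipeline:
# detect a face, embed it as a 128-dimensional vector, and compare it
# against a small "database" of known embeddings.
# Uses the open-source face_recognition library, NOT Clearview AI's system;
# the image files named here are hypothetical placeholders.
import face_recognition

# Build a tiny reference database from labelled photos (hypothetical files).
known_people = {
    "person_a": face_recognition.load_image_file("person_a.jpg"),
    "person_b": face_recognition.load_image_file("person_b.jpg"),
}
known_encodings = {
    name: face_recognition.face_encodings(image)[0]
    for name, image in known_people.items()
}

# Encode the face found in a new snapshot.
snapshot = face_recognition.load_image_file("snapshot.jpg")
snapshot_encodings = face_recognition.face_encodings(snapshot)

if snapshot_encodings:
    probe = snapshot_encodings[0]
    names = list(known_encodings)
    # Lower distance means more similar; 0.6 is the library's default threshold.
    distances = face_recognition.face_distance(
        [known_encodings[n] for n in names], probe
    )
    for name, distance in zip(names, distances):
        verdict = "match" if distance < 0.6 else "no match"
        print(f"{name}: distance={distance:.2f} ({verdict})")
else:
    print("No face found in the snapshot.")
```

A real-world system would follow the same basic steps, only with a database of billions of images and names attached to each embedding, which is exactly what makes the privacy questions below so pressing.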

Several of the tech companies whose content was scraped have sent Clearview AI cease-and-desist letters. Meanwhile, IBM, Amazon, and Microsoft have stopped selling their facial recognition technology to police departments over concerns that the systems end up being used to violate basic human rights and freedoms.

But it’s not just that. Facial recognition algorithms are controversial on several fronts:

      • Privacy rights: Facial recognition can become a tool for governments to track people’s every move, whether or not they are suspected of a crime.
      • Freedom of expression: It can deter people from protesting in public places, exercising their rights, and voicing their opinions.
      • Erroneous results: Many face recognition systems have been shown to have high error rates, which can put innocent people on the police’s radar as potential suspects (see the sketch after this list).
      • Discriminatory bias: False-positive rates are significantly higher for women, children, and people of color, creating an unfair, discriminatory impact that makes these groups more likely to be wrongly flagged as suspects.
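
To make the accuracy point more concrete, here is a small, self-contained sketch using synthetic random “embeddings” rather than real biometric data or any vendor’s actual model. It searches one probe vector against a large database and shows how loosening the match threshold quickly inflates the number of spurious matches, i.e. people wrongly flagged.

```python
# Synthetic illustration of the accuracy concern above: with random vectors
# standing in for face embeddings, a looser distance threshold produces far
# more false positives when searching a large database.
import numpy as np

rng = np.random.default_rng(0)

# 10,000 enrolled "faces", each a random unit vector of dimension 128.
database = rng.normal(size=(10_000, 128))
database /= np.linalg.norm(database, axis=1, keepdims=True)

# A probe "face" that is NOT in the database, so every match is spurious.
probe = rng.normal(size=128)
probe /= np.linalg.norm(probe)

distances = np.linalg.norm(database - probe, axis=1)

for threshold in (0.6, 1.0, 1.2, 1.3):
    false_positives = int((distances < threshold).sum())
    print(f"threshold={threshold}: {false_positives} spurious matches")
```

The exact numbers depend on the random seed, but the pattern is the point: strict thresholds return few or no matches, while looser ones flood investigators with lookalikes, and real systems show this effect unevenly across demographic groups.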

There are also other questions that have yet to be sufficiently explored: Who owns all the facial image data? How can it be stored securely? Is it vulnerable to malicious actors?

What lies ahead

Facial recognition apps are powerful surveillance tools, and they can endanger people’s privacy rights. They need to be better regulated, or else the clash between governments and societies will only grow.

Take it from this letter put forward by members of the US Congress in October 2020 in response to the police’s brutal actions during Black Lives Matter protests:

‘No one should fear that they’ll be tracked by the FBI or Homeland Security for simply exercising their freedom of speech.’

The letter calls for an investigation into the federal agencies that surveilled the protests, but only time will tell whether that will actually happen.

How about you? Do you believe police should use facial recognition algorithms?

Let me know in the comments below.
