Your identity is unique. But it doesn't seem to be much of a secret, since your digital whereabouts can easily be tracked. And that includes the photos you posted on online platforms.
But how would you feel knowing that those very same pictures may be part of research projects meant to build facial recognition technology? Because that's exactly what happens nowadays.
Let’s find out how you can uncover the secret life of your images as algorithm fodder.
How Exposing.ai works
Artist Adam Harvey and programmer Jules LaPlace founded the MegaPixels website in 2019. They aimed to trace the journey of photos people posted online and show that those photos are reused without their owners' consent. The two used public Flickr images that were uploaded under copyright licenses that allow liberal reuse.
Flickr is an image and video hosting service.
Harvey and LaPlace initially designed a facial recognition tool to find people’s photos in datasets. But they soon discovered this didn’t work as planned, so they built a scalable search tool.
The duo renamed their project Exposing.ai. You can use it to match information from Flickr to image datasets. All you have to do is fill in a username, photo URL, or hashtag, and you'll learn whether your pictures are among those that developers used to train their facial recognition algorithms.
According to the tool’s developers, examining how yesterday’s photographs became today’s training data is one of their goals. They also hope to include other photo search options beyond Flickr in the future.
Harvey and LaPlace received support from the Surveillance Technology Oversight Project (S.T.O.P.):
S.T.O.P. was proud to support Adam Harvey and Jules LaPlace in launching Exposing.AI because we believe that no one should have their face used for AI training without consent. It is wrong that our biometric information is powering the tools that fuel mass incarceration and undermine human rights around the world.
Albert Fox Cahn, Executive Director, S.T.O.P.
Congratulations! If you use Flickr you’ve probably contributed your intimate moments to facial recognition #intermountainpi #intermountainpination #privacy #flickr https://t.co/3yLnwP8rCy
— Scott Fulmer (@intermountainpi) February 8, 2021
Sadly, there isn't much you can do once your photos have been scraped. You can't remove them from image datasets that have already been distributed.
But you can prevent your photos from being displayed on Exposing.ai, as well as anywhere else online, going forward.
Just log into your Flickr account and edit your photos: change their status to private or hidden, or delete them. Plus, you always have the option of deleting your account. This way, your photos can no longer be loaded from Flickr.com.
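If you have a large photostream, flipping each photo to private by hand gets tedious. Below is a minimal sketch of how you might do it in bulk through Flickr's public API, using the third-party flickrapi Python package. The API key and secret are placeholders you'd obtain from Flickr's App Garden, and the exact permission flags are worth verifying on a single test photo first.

```python
import flickrapi

# Placeholders -- request your own key and secret from Flickr's App Garden.
API_KEY = "your-api-key"
API_SECRET = "your-api-secret"

flickr = flickrapi.FlickrAPI(API_KEY, API_SECRET, format="parsed-json")
flickr.authenticate_via_browser(perms="write")  # one-time OAuth consent in your browser

while True:
    # privacy_filter=1 returns only photos that are still public.
    resp = flickr.people.getPhotos(user_id="me", privacy_filter=1,
                                   page=1, per_page=500)
    photos = resp["photos"]["photo"]
    if not photos:
        break  # nothing public is left
    for photo in photos:
        # Make the photo private (this sketch also restricts comments
        # and metadata edits to nobody).
        flickr.photos.setPerms(photo_id=photo["id"],
                               is_public=0, is_friend=0, is_family=0,
                               perm_comment=0, perm_addmeta=0)
```

Keep in mind this only stops future scraping; as noted above, copies that already ended up in distributed datasets are beyond your reach.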
Your photos fuel facial recognition tech
Flickr photos can be shared under a Creative Commons license.
Creative Commons is a public copyright license that allows content to be copied, distributed, edited, and built upon, all within copyright law boundaries.
CC licenses also allow materials to be used for educational purposes. So teachers and students can freely copy, share, and sometimes modify a CC work without seeking the creator's permission. Many universities used this loophole to experiment with publicly available photos and facial recognition technology, as the sketch after the examples below shows.
Here are just a few examples:
- Researchers at Stanford University collected and used 10,000 images. They later shared this database with China's National University of Defense Technology and an artificial intelligence (AI) company that provided China with surveillance technology.
- Duke University has also collected thousands of images used to train AI tech in the US, China, Japan, and the UK.
- The University of Washington in Seattle posted a database called MegaFace with over 3 million photos from Flickr. While most images were included without explicit consent, the collection practices were legal at the time. The MegaFace library is now offline, but various organizations and businesses have used it.
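To see how little effort that harvesting takes, here's a rough sketch of the kind of query a researcher could run against Flickr's public search API to pull in permissively licensed photos (again via the flickrapi package; the API credentials, search term, and license IDs are illustrative assumptions):

```python
import flickrapi

API_KEY = "your-api-key"        # placeholder
API_SECRET = "your-api-secret"  # placeholder

flickr = flickrapi.FlickrAPI(API_KEY, API_SECRET, format="parsed-json")

# On Flickr, license IDs 4 and 5 map to CC BY and CC BY-SA --
# both permit reuse, which is exactly what dataset builders look for.
resp = flickr.photos.search(text="portrait", license="4,5",
                            extras="license,owner_name,url_m",
                            per_page=25)

for photo in resp["photos"]["photo"]:
    # Each hit comes with a direct image URL, ready to download.
    print(photo["id"], photo["ownername"], photo.get("url_m"))
```

Repeat that across millions of results and you end up with collections on the scale of MegaFace.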
But it's not just Flickr. Social media platforms and other tech companies let developers scrape photos one way or another. The dating website OkCupid, for instance, uses images posted on its platform to develop machine learning for facial recognition.
And while IBM, Amazon, and Microsoft stepped back from facial recognition in 2020, the practice is still going strong. Companies like Amazon's Ring intend to push forward with it.
Facial recognition technology is also used as part of US Customs and Border Protection’s biometric tech to ID passengers. Travelers who board flights may encounter this type of technology at many airports.
Wow! China Airport face recognition systems to help you check your flight status and find the way to your gate. Note I did not input anything, it accurately identified my full flight information from my face! pic.twitter.com/5ASdrwA7wj
— 𝙈𝘼𝙏𝙏𝙃𝙀𝙒 𝘽𝙍𝙀𝙉𝙉𝘼𝙉 (@mbrennanchina) March 24, 2019
We need more facial recognition regulations
Ethical concerns aside, using publicly available images for facial recognition is also a matter of copyright and fair use.
For example, within the European Union, the General Data Protection Regulation states that data processing, biometric information included, can only happen after users give their consent.
In the United States, things are vaguer and less consistent from state to state.
Illinois was the first state to pass a law (The Illinois Biometric Information Privacy Act – 2008) requiring companies to inform users when collecting certain biometric information, including details used in facial recognition technology.
Later, Texas and Washington followed suit, while California enacted a comprehensive privacy law in 2020. Some other states provide legal protection for biometric information, including facial templates.
Ideally, laws worldwide would empower users to hold tech companies and governments accountable for the fair use of facial recognition. Regulations should also determine how long a company can retain biometric information.
US ruling Jan 21, condemns Everalbum use of facial recognition technology to group users’ photos by the faces of the people who appear in them and allowed users to ‘tag’ people by name. Clarity required on how #data collected, by whom & how processed.#Privacy #photos https://t.co/qZGnzKyliW
— Jenny-Lee Moore (@moore_jennylee) February 24, 2021
Keep your digital footprint to a minimum
In 2020, the misuse and abuse of facial recognition technology was at the forefront of news coverage. There were stories of governments, law enforcement agencies, and private companies using the tech to track people. As you can imagine, this happened without their knowledge or consent.
To stay private online, here are a few things you can do.
- Limit your posts, don’t disclose personal info, and don’t upload photos of yourself. Whenever possible, make your profile private or delete social media accounts altogether.
- If you’re on iOS, switch to Secret Photo Vault to keep your treasured memories to yourself.
- Protect your digital identity with a reliable VPN. With your IP address hidden and your connection encrypted, you'll be much harder to trace online.
What’s your opinion on reusing web images for machine learning research? Do you believe there should be a law that strictly forbids it?
Let me know in the comments section below.