Use this free tool to check if your photos were used in AI surveillance research
If you’re on social media, you’ve probably uploaded at least a few photos to your account over the years. Most Facebook users have photo albums filled with selfies, vacation photos, candids and other images they want to share.
Your photo uploads probably aren’t limited to social media channels, either. Many people have at least one or two photo accounts they use to create, share and store digital images, and web-based photo hosting sites are a common way to organize a messy photo collection.
What you may not realize is that if you’ve uploaded any photos to the web, chances are they’ve been used to help build facial recognition systems. It’s not uncommon for developers to train facial recognition algorithms on images pulled from various websites without warning the photos’ owners. Now there’s a new tool that can tell you whether it’s happened to you.
Here’s the backstory
Exposing.ai can help you find out if your photos are among the images developers have used for these purposes. The tool focuses on the photo hosting site Flickr in particular: Flickr’s huge cache of images makes it prime pickings for facial recognition training, and it’s regularly used in AI research by companies like IBM.
It’s not unusual for developers to scrape photos from online collections like these to help train facial recognition AI. Flickr has allowed developers to scrape its site for images for exactly these purposes, and social media platforms and other image databases are known to allow it, too.
Neither Flickr nor the developers have to get user permission to do it, either. Many of these photos are shared under permissive licenses, so it’s completely legal to scrape sites like these for training images. And without this tool, it’s nearly impossible to know whether your images have been included in past training datasets.
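For a sense of how this kind of image collection works in practice, here is a rough Python sketch that queries Flickr’s public REST API (the flickr.photos.search method) for permissively licensed photos. The API key and search term are placeholders, and the license ID values are an assumption; check Flickr’s API documentation (the flickr.photos.licenses.getInfo method) for the current license mapping.

```python
import requests

API_KEY = "YOUR_FLICKR_API_KEY"  # placeholder; requires a free Flickr API key
ENDPOINT = "https://api.flickr.com/services/rest/"

# Ask Flickr's public search API for Creative Commons-licensed photos.
# The 'license' values are Flickr's numeric license IDs; verify them via
# the flickr.photos.licenses.getInfo method before relying on them.
params = {
    "method": "flickr.photos.search",
    "api_key": API_KEY,
    "text": "portrait",           # illustrative search term
    "license": "4,5,6",           # assumed CC license IDs
    "extras": "url_m,license,owner_name",
    "per_page": 10,
    "format": "json",
    "nojsoncallback": 1,
}

response = requests.get(ENDPOINT, params=params, timeout=30)
response.raise_for_status()

for photo in response.json()["photos"]["photo"]:
    # Each record includes the photo ID and, via 'extras', a direct image URL.
    print(photo["id"], photo.get("url_m", "no direct URL"))
```

A researcher running queries like this at scale, then downloading the returned image URLs, ends up with exactly the kind of photo collection the datasets described above are built from.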
How this tool works
Exposing.ai was created as part of the MegaPixels art and research project. According to the tool’s creators, the project stems from years of research into how these facial recognition and biometric image training datasets are built.
“After tracking down and analyzing hundreds of these datasets a pattern emerged: millions of images were being downloaded from Flickr.com where permissive content licenses are encouraged and biometric data is abundant. Telling the complex story of how yesterday’s photographs became today’s training data is part of the goal of this project,” the Exposing.ai website states about the project.
To use this tool:
- The only information you need to use the Exposing.ai tool is a Flickr username, photo URL or hashtag, which you enter into the website’s search bar.
- The tool then scans more than 3.5 million photos, matching unique identifiers such as your username and photo IDs to find your photos (see the sketch after this list for how this kind of identifier matching works).
- If it finds your photos, the search results are displayed on screen, along with thumbnails of the matching images pulled directly from Flickr.com.
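Searches like this boil down to identifier matching: the tool knows which Flickr photo IDs each dataset contains and checks whether any of them belong to you. Here is a minimal Python sketch of that idea. The dataset_index.csv file, its photo_id column and the example URL are hypothetical stand-ins, not Exposing.ai’s actual data or code.

```python
import csv
import re

def flickr_photo_id(url: str) -> str | None:
    """Extract the numeric photo ID from a Flickr photo page URL,
    e.g. https://www.flickr.com/photos/someuser/1234567890/."""
    match = re.search(r"flickr\.com/photos/[^/]+/(\d+)", url)
    return match.group(1) if match else None

def load_dataset_ids(path: str) -> set[str]:
    """Load the Flickr photo IDs listed in a hypothetical dataset
    index CSV with a 'photo_id' column."""
    with open(path, newline="") as f:
        return {row["photo_id"] for row in csv.DictReader(f)}

if __name__ == "__main__":
    # 'dataset_index.csv' is an assumed file name, standing in for the
    # metadata that face datasets typically publish about their sources.
    dataset_ids = load_dataset_ids("dataset_index.csv")
    url = "https://www.flickr.com/photos/someuser/1234567890/"
    photo_id = flickr_photo_id(url)
    if photo_id and photo_id in dataset_ids:
        print(f"Photo {photo_id} appears in the dataset index.")
    else:
        print("No match found.")
```

The real service indexes millions of records and also matches on usernames and hashtags, but at its core, any such lookup is a membership test like the one above.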
While you can’t do much to remove your images from datasets that have already been released, you may be able to use the tool to request that they be excluded from future dataset releases. The Exposing.ai team said information on that process would soon be included with your search results, and future versions of the tool could include more search options as well.
Why should I use this tool?
In general, there’s nothing nefarious about training AI with user photos from a database like Flickr. Researchers regularly use these collections to train facial recognition systems on a wider variety of faces, with the aim of reducing the software’s well-documented biases.
However, there is a chance this type of software could be used for less benign purposes, like training surveillance technology. Facial recognition is already part of U.S. Customs and Border Protection’s biometric screening, which uses physiological details to identify you, and travelers who board flights in 2021 will likely encounter it at the airport.
It’s impossible to know which of these purposes any given image has been used for, but the information is useful nonetheless. And as Exposing.ai adds more tools to the mix, it could become a valuable resource for limiting how third parties like these use your photos.
The bottom line
If you’re a Flickr user, there’s a good chance your images have been used to help train facial recognition AI. That’s completely legal for developers to do, and while you can’t do much to limit the practice right now, this tool lets you arm yourself with information about how your photos are being used.
Keep reading
- Take this quiz to see which government agencies have your photo on file
- Tap or click for details on a facial recognition app with connections to law enforcement
- Stop Facebook from looking for you with face recognition