The Limited Times


Exposing.ai: Did your Flickr photos train facial recognition?

2021-02-01T17:34:57.914Z


A new reverse search enables explosive discoveries: it lets you check whether your Flickr photos have ended up in databases used by arms manufacturers or Chinese surveillance companies.



"Not as advanced as it is made out to be": facial recognition technology

Photo: David Malan / Getty Images

Adam Harvey has been fighting facial recognition technology for years.

In 2014, for example, the Berlin-based US artist presented a collection of hairstyle and make-up tips intended to confuse facial recognition software.

In 2019, he revealed that a database created and provided by Microsoft with portraits of around 100,000 people was also used by Chinese companies that supply authorities in Xinjiang Province, where the Uyghur Muslim minority is closely monitored.

That project was called MegaPixels; it now has a successor, Exposing.ai.

On the associated website, you can check whether you have unwittingly become part of a database that may be used to train facial recognition software.

The search runs against six such databases, which are used not only for research but also for commercial or military purposes.

They have names like DiveFace, IJB-C or VGG Face.

Harvey, his colleague Jules LaPlace, and the New York-based Surveillance Technology Oversight Project determined what the databases have already been used for by tracing studies from around the world in which their names appear.

According to research by the New York Times, one of the six databases, called MegaFace, was used more than 6,000 times, including by the arms company Northrop Grumman; by In-Q-Tel, the CIA's venture capital arm; by TikTok parent ByteDance from China; and by Megvii, also from China and specialized in surveillance technology.

However, Exposing.ai has one important limitation: the search is currently limited to photos that were published on Flickr and can still be found there.

Photos from YouTube videos are to follow

The photo service, whose owner has changed several times in recent years, allows users to release their photos for further use with a Creative Commons license.

Researchers have eagerly taken advantage of this to download facial photos without having to ask the photographer's permission (even though the licenses often come with restrictions that some researchers may have ignored).

Flickr has become an important resource for training databases.

Such databases, in turn, are used to train machine-learning models to classify faces.

To use Exposing.ai, you enter a link to a Flickr photo, its photo ID, a username, or a hashtag.

"This is the metadata associated with every photo," Harvey wrote to SPIEGEL in a Signal chat.

"These are either partially or fully contained in each of the six datasets.

Where data was missing, I supplemented it using detailed information from other databases, until most of the photos could be indexed by their metadata."

If the photos matching a text query appear in one of the six indexed databases, they are displayed on Exposing.ai.

They are loaded directly from Flickr and not stored on Exposing.ai, because the project itself is not meant to become a face database.
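The lookup Harvey describes can be sketched in a few lines: index records by their Flickr metadata (photo ID, username, tags) and match a query against those keys. This is a minimal illustration under assumed data structures — the dataset names are real, but the records, IDs, and helper functions here are hypothetical, not Exposing.ai's actual implementation.

```python
# Minimal sketch of a metadata-based lookup like the one the article describes.
# Dataset names are real; the record structures and entries are made up.

DATASETS = {
    "MegaFace": [
        {"photo_id": "8437291045", "username": "alice_example", "tags": {"portrait"}},
    ],
    "VGG Face": [
        {"photo_id": "5120984412", "username": "bob_example", "tags": {"face", "street"}},
    ],
}

def normalize(query: str) -> str:
    """Reduce a Flickr photo URL to its last path segment (the photo ID);
    otherwise strip whitespace and a leading '#' from hashtags."""
    q = query.strip().rstrip("/")
    if q.startswith("http"):
        q = q.rsplit("/", 1)[-1]
    return q.lstrip("#").lower()

def find_matches(query: str) -> list[tuple[str, str]]:
    """Return (dataset name, photo ID) pairs whose photo ID, username,
    or tag matches the query -- the search keys the article mentions."""
    q = normalize(query)
    hits = []
    for name, records in DATASETS.items():
        for rec in records:
            if q in (rec["photo_id"], rec["username"].lower()) or q in rec["tags"]:
                hits.append((name, rec["photo_id"]))
    return hits

print(find_matches("alice_example"))
print(find_matches("https://www.flickr.com/photos/bob_example/5120984412/"))
```

Because only metadata is indexed, no face images need to be stored by the search service itself — which is why, as the article notes, matching photos can be loaded directly from Flickr.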

Next, Harvey also wants to index faces from YouTube videos that have ended up in training databases.

The irony of the story: Harvey initially wanted to use facial recognition himself.

Users would have been able to upload a photo of themselves, and the software would then have searched for that face in the indexed databases.

But for one thing, he "didn't want to create another problematic application," he says.

Among other things, it could have been abused by stalkers.

The other reason he abandoned the plan: "It didn't work as well as you might think.

Facial recognition is not as advanced as it is made out to be."


Source: spiegel

