She thought a dark moment from her past was behind her, until she did an online search for her face.

2022-05-24T18:24:20.068Z


A software engineer tested a facial recognition website that returned photos from a painful past she had left behind.


(Video: Facial recognition: do they use the photos we publish?)

(CNN Business) --

Cher Scarlett, a software engineer, has been misidentified by face-scanning technology before, including one case in which she was tagged in a photo of what may have been a distant ancestor.

So when she came across an online facial recognition tool she hadn't heard of, she wanted to check whether it would mistake photos of her mother or her daughter for photos of her.


On February 1, Scarlett uploaded some images of her teenage daughter and of her mother to PimEyes, a facial recognition website aimed at finding photos of oneself on the web, ostensibly to help stamp out problems like revenge porn, bullying, and identity theft.

The search didn't return any pictures of Scarlett herself: photos of her daughter turned up other children, she said, while one of her mother turned up some photos of her mother, along with images of other similar-looking women.

She decided to try something else.

Next, Scarlett uploaded a couple of photos of herself, curious whether they would lead to photos of her relatives.

They didn't, but the results surprised her nonetheless: beneath some recent images of her, and mismatches featuring photos of Britney Spears and the pop star's sister, Jamie Lynn, were photos of a younger version of Scarlett.

They were photos from a dark time she didn't fully remember, a time at age 19 when she says she traveled to New York and was coerced into performing humiliating and sometimes violent sexual acts on camera.

"I'm looking at these photos and all I can think is that someone photoshopped my face in porn," Scarlett told CNN Business in an interview.

Scarlett, known as a former Apple employee who founded the worker organizing movement known as #AppleToo, has been outspoken online and in the media about her life and struggles, including being sexually abused as a child, dropping out of high school, struggling with addiction, and having nude photos of her shared online without her consent.

What happened to her in New York in 2005 was so traumatic that she tried to take her own life in the following weeks, she has said, and in 2018 she started using the last name Scarlett (she officially changed her name in December 2021).



Scarlett has worked hard to overcome the trauma of the past.

She now lives in Kirkland, Washington, and has worked as a software engineer for years.

She is raising her daughter and is a recovering drug addict.

Since leaving Apple in late 2021, she has had pending complaints against the company that are being investigated by the National Labor Relations Board (Apple did not respond to a request for comment), and in March she started a job as a senior software engineer at the video game developer ControlZee.

But with a few clicks, PimEyes dredged back up for her a real-life nightmare from nearly two decades ago.

She has since tried, unsuccessfully, to have all the explicit photos removed from PimEyes search results, despite the site saying it would remove images of Scarlett from its results.

This week, sexually explicit images of Scarlett can still be found via PimEyes.

Giorgi Gobronidze, who identified himself to CNN Business as the current owner and director of PimEyes (he said he bought the company from its previous owners in December), said he doesn't want anyone to experience what Scarlett went through, which he acknowledged was "very, very painful."

"However, simply saying 'I don't want to see the images' or 'I don't want to see the problem' does not make the problem go away," he said.

"The problem is not that there is a search engine that can find those photos, the problem is that the photos are there and there are people who have uploaded them and they have done it on purpose."

It is true that discovering previously unknown images can be useful for people who are trying to get those photos of themselves taken off the internet.

But Scarlett's saga starkly shows how easily facial recognition technology, now available to anyone with Internet access, can cause unexpected damage that may be impossible to undo.

This technology has become increasingly common in the United States in recent years, and there are currently no federal regulations governing its use.

However, it has come under fire from digital rights and privacy advocacy groups for issues of privacy and racial bias and other real and potential dangers.

More people will "no doubt" have experiences like Scarlett's, said Woodrow Hartzog, a professor of law and computer science at Northeastern University.

"And we know from experience that the people who will suffer first and hardest are women and people of color and other marginalized communities for whom facial recognition technology serves as a tool of control."

As Scarlett put it, "I can't imagine the horrible pain of having that part of my life exposed not by me, but by someone else."

(Video: A facial recognition company is under investigation in the US)

"This may interest you"

Scarlett's discovery of the photos on PimEyes was my fault.

I have known about her work as a labor activist for a long time and I follow her on Twitter.

Since I often write about facial recognition software, I contacted her after she posted a confusing tweet in late January about an experience she'd had on Facebook in October 2021. Scarlett had been tagged in a vintage-looking black-and-white photo of a woman and a man, a photo that had been posted on Facebook by a friend of a friend, to whom she says she is distantly related.

At the time, she said she had been "self-tagged" by Facebook's facial recognition software, which was disabled after the photo was posted; she now believes the tag was a suggestion generated by the software.

And what's even stranger: some research on Ancestry.com led her to believe that the woman in the photo was her great-great-grandmother.

(Facebook said it never automatically tagged users in pictures; however, before turning off facial recognition, it could suggest that a user be tagged in a picture if they had facial recognition turned on, and notify users when they appeared in a Facebook image in which they had not been tagged.)


Scarlett and I discussed, via private Twitter messages, how strange this experience was and the repercussions of facial recognition software.

That's when I sent her a link to an article I had written in May 2021 about a website called PimEyes.

Although the website prompts users to search for themselves, it does not prevent them from uploading photos of anyone.

And while it doesn't explicitly identify anyone by name, as CNN Business discovered by using the site, that information may be just a few clicks away from the images PimEyes pulls.

Its images come from a number of websites, including business, media, and porn sites; PimEyes told CNN Business in 2021 that it indexes the porn sites so people can search for any revenge porn in which they may appear unknowingly.

PimEyes says that it does not take images from social media.

"This might interest you," I wrote, introducing my article.

Minutes later, Scarlett told me that she had paid $30 for the cheapest PimEyes monthly service.

(PimEyes shows users a free, slightly blurry preview of each image that its facial recognition software determines is likely to include the same person as the photo the user initially uploaded; users must pay a fee to click through to the websites where the images appear.)

Shortly after, she sent me a message: "oh no."

Processing the results

Scarlett took time to process what she was seeing in the results, which included images related to forced sexual acts that were posted on numerous websites.

At first, she thought it was her face plastered on someone else's body; then she wondered: why did she look so young?

She saw an image of her face, from a moment she remembers sitting through; she recognized the shirt she was wearing in the photo, and her hair.

She sent me this photo, which seems benign without Scarlett's context: it shows a younger version of her, with dark brown hair parted in the center, a silver necklace around her neck, and a turquoise tank top.

She saved a copy of this image and used it to run another search, which she says turned up dozens more explicit images, many of them aggregated across various websites.

Some images were posted on websites dedicated to torture porn, with words like "abuse," "asphyxiation" and "torture" in the URLs.

“And it was like,” Scarlett said, pausing and making a sort of brain-exploding sound as she described what it was like to look at the images.

In an instant, she realized that the memories she had of her brief stay in New York did not match what appeared in the photos.

"It's like there's a part of my brain that's hiding something, and another part of my brain that's looking at something, and this other part of my brain that knows this to be true, and they all bump into each other," she said.

"As if this thing isn't hidden from you anymore."

Adam Massey, a partner at CA Goldberg Law who specializes in issues such as non-consensual pornography and technology-enabled abuse, said that for many people he has worked with it can feel like "a whole new violation" every time a victim comes across these types of images.

"It's incredibly painful for people and every time you're in a new place it's a new shock," he said.


Scarlett not only saw more clearly what had happened to her, but she also knew that anyone searching through PimEyes could find them.

Whereas in decades past those images might be on DVDs, photos or VHS tapes, "they're forever on the Internet and now anyone can use facial recognition software and find them," she said.


Opt out

Scarlett quickly upgraded her PimEyes subscription to the $80-a-month service, which helps people "manage" their search results, for example by excluding their images from public PimEyes searches.

Scarlett received help submitting DMCA takedown requests to websites hosting images she wanted removed, she said.

However, she does not own the copyright to the images, and the requests were ignored.

Scarlett is upset that people have no say in whether they are included in PimEyes at all.

The website does not require users to prove who they are before they can run a search, a requirement that could prevent some forms of misuse of the service (for example, an employer screening prospective employees, or a stalker searching for victims).

Gobronidze says that PimEyes works this way because it doesn't want to accumulate a huge database of user information, such as photos and personal data.

It currently stores the facial geometry associated with photos, but not the photos themselves, he said.

"We don't want to become a monster that has this huge amount of photographs of people," he said.


Users can ask to be excluded from PimEyes search results for free, but Scarlett's story shows that this detail can be easy to miss.

Users have to find the link first (it's in small gray text on a black background at the bottom right of the PimEyes website);

it requires filling out a form, uploading a clear image of the person's face, and verifying their identity with an image of an identity document or passport.

"It's definitely not very accessible," said Lucie Audibert, chief legal officer at Privacy International, a London-based human rights group.

Gobronidze said the opt-out option will be easier to find with a website update that is in the works.

He also shared a link that anyone can use to ask PimEyes to remove data relating to specific photos of their face from its index, which he said will also be easier to find in the future.

He also wants users to know they don't need to pay to opt out of their results, and said the company plans to publish a blog post about the opt-out process this week.

Scarlett opted out, saying she asked PimEyes in mid-March to remove her images from its search results.

She had not heard back from PimEyes by April 2, when she recounted her experience on Medium, a decision she made in part because she hoped it would prompt PimEyes to respond to her request.

But it was about more than that, she said.

"We have to look at facial recognition software and how it's being used, in terms of how we're losing our anonymity, but also the far-reaching consequences of losing that anonymity and letting anybody put up a picture of our face and find all the places where we've been on the internet or in videos," she said.

Also in early April, Scarlett upgraded to the $300 "advanced" tier of PimEyes, which includes the ability to perform a deeper search for images of one's face online.

This resulted in more explicit photos of her.

On April 5, three days after posting her story on Medium and tweeting about her experience, PimEyes approved Scarlett's request to opt out of their service, according to an email from PimEyes that Scarlett shared with CNN Business.

"Potential results containing your face have been removed from our system," the email said.

Gobronidze told CNN Business that PimEyes typically takes no more than 24 hours to approve a user's opt-out request.

"Images Will Resurface"

But as of May 19, plenty of images of Scarlett, including sexually explicit ones, were still searchable through PimEyes.

I know because I paid $30 for a month of access to PimEyes and searched for images of Scarlett with her permission.

First, I tried the recent image of Scarlett that appears in this article, a photo that was taken in May.

PimEyes reported 73 hits, but only showed me two of them: one of Scarlett with bleached hair, which led to a dead link, and one of her smiling slightly, which led to a podcast episode in which she was interviewed.

Below the results, the PimEyes website encouraged me to pay more: "If you want to see the results that can be found through a more extensive search called Deep Search, please purchase the Advanced plan," it said, with the last four words underlined and linked to PimEyes pricing plans.

Next, I tried an image of Scarlett from 2005 that she instructed me to use: the one of her in a turquoise tank top and necklace, which she said was the same image she had submitted to PimEyes to exclude from its search results.

The results were much more disturbing.

Alongside a handful of recent photos of Scarlett from news articles were numerous sexually explicit images that appeared to be from the same era as the image she'd used to search.

This shows that the opt-out process "prepares people to fight a losing battle," said Hartzog, the law professor, "because this is essentially like playing whack-a-mole, or Sisyphus rolling the rock up the hill forever."

"It will never stop," he said.

"The images will resurface."


Gobronidze acknowledged that the PimEyes opt-out process doesn't work the way people expect.

"They just imagine that they will upload a photo and it will disappear from the search results," he said.

The reality is more complicated: even after PimEyes approves an opt-out request and blocks similar-looking photo URLs, it can't always remove all of a person's images that have been indexed by the company.

And it is always possible that the same or similar photos of a person will appear again, since the company continually crawls the Internet.

Gobronidze said users can include multiple photos of themselves in an opt-out request.

Scarlett continues to have questions, like what PimEyes plans to do to prevent what happened to her from happening to someone else.

Gobronidze said part of this will come from making it clearer how to use PimEyes, and from improving its facial recognition software so it can better remove images that users don't want appearing in the site's search results.

"We want to make sure these results are removed once and for all," he said.

Scarlett, for her part, remains concerned about the potential of facial recognition technology in the future.

"We have to stop and look at technology, especially this kind of technology, and say, 'What are we doing? Are we regulating this enough?'" she said.


Source: cnnespanol
