
“Google closed my account for 'sexual content'. But they still won't tell me what it was, and I've lost everything.”

2022-09-17T10:45:14.844Z


David Barberá, a teacher from Valencia, has been left without access to thousands of private files in the cloud over allegedly illegal images. His case is not the only one among users of the big technology companies.


Five years ago, after the death of a friend and bandmate, David Barberá decided to pay for a Google Drive cloud account.

He wanted to keep the music files so that his friend's children could hear how their father played.

“So I signed up for the Google Drive service,” he says.

"It was the safest thing that occurred to me so that Javi's music would not be lost, the children were very young then," he adds.

Barberá, 46, a high school teacher, had not, however, foreseen a key detail: Google's terms of service hide a guillotine that disables accounts when the company detects prohibited content, such as child sexual material or terrorism.

“The only thing I can think of is that I uploaded something I shouldn't have uploaded, like downloaded movies back in the days of eMule.

Could there be child pornography or terrorism in there?

There could," Barberá explains to EL PAÍS from Valencia in a long telephone conversation.

Barberá did not know what had happened.

He could only piece things together by reading forums and press articles.

The teacher describes a desperate, helpless struggle to understand what had happened: in July, he needed some music files that he kept on old hard drives.

First, to organize them, he started uploading everything to his Drive, for which he still pays every month for 2 terabytes of space.

Within minutes of starting the process, Google disabled his account with a message saying they had found "harmful content."

He then started several claim processes, answered emails from apparent Google employees who asked for new details (Nahuel, Rocío, Laura), called every company phone number he could find without ever managing to speak to a human, asked a relative who is a journalist for help, and finally even managed to chat with an apparent Google employee, who asked him for "patience."

Sexual content

From this whole process, he got only one concrete answer.

It was a message sent to his wife's email address (which he had previously added as a secondary account), with this confusing text: "We believe that your account contained sexual content that may violate Google's terms of service and may also be prohibited by law," it begins, but then continues: "We have removed this content" and "if you continue to violate our policies, we may terminate your Google account."

This happened on August 26 and, although it seems like a warning, the account is still disabled.

"I have everything there from the last 14 years, and for five years, I only have it there," he says, referring to the fact that he does not keep it on external drives.

The loss of the Google account does not only mean the disappearance of photos and videos.

Barberá has also lost his classes, a blog he ran and his YouTube account.

He has also lost the services he had signed up for with that email address, from Amazon to Netflix to a German music application: "Now I have to renew it, but how do I explain that yes, it's me, but it's not me because of pederasty or terrorism... They are going to love it," he says ironically.

The New York Times published two similar cases in the US in August. Google told the journalist that the "forbidden" images were photos of children's genitalia that two parents took to send to the pediatrician for a skin problem.

When EL PAÍS asked the same question, Google replied that it could not provide that information because the user is European, and that it would share it only with him.

But Barberá has still not received any details.

Google has offered this newspaper conversations with employees "on background," which in journalistic jargon means that the reporter cannot identify the interlocutors or quote their verbatim words.

According to the company, which insisted it was not referring to this particular case, the "sexual content" email is sent only in cases of child abuse, never for adult pornography.

Why, then, a phrase that implies "don't do it again"?

Google did not elaborate, beyond saying that it all depends on what was in that account.

A Google employee asked whether this newspaper was going to name the affected user, but did not clarify why he wanted to know.

EL PAÍS has found three other cases similar to Barberá: two more with Google accounts and one with Microsoft.

All the cases are from 2022, and in only one of them has the account been returned to its owner so far.

But that one was not over alleged sexual images of children; it was a password problem that was never clarified either.

The other three users interviewed by EL PAÍS remain in the limbo of large corporations that are, in practice, too small to manage more than a billion accounts.

A friend at Google

Another victim, who asked not to be named because his company may have Google as a client, turned to "a close friend" who works inside the company in Spain.

The friend does not work in a department linked to content moderation, but he asked around internally about what usually happens in these cases.

His response was less than optimistic: these cases are handled overseas, and there was no telling whether anyone would actually read the complaint.

He gave him little hope.

As in the case of Barberá, this user had seen his account disabled after uploading 40 gigabytes of photos, videos and WhatsApp conversations that he had on his hard drive.

The upload was so large that his company's cybersecurity managers called him to ask what was going on.

Google does not clarify when or how it analyzes the accounts of its users.

But both in the New York Times cases in the US and in these two, it occurred when file movements were detected.

In the Spanish cases, when there were massive data uploads.

The third victim has put his case against Microsoft in the hands of lawyer Marta Pascual, who is preparing the lawsuit.

Her client is desperate because he has lost data from his private life but also from his work: "His master's degree from IESE, taxes, photos of the birth of his children and work databases.

He is having a hard time," says Pascual.

She sees no way out other than to sue.

"The judge can find that his right to privacy has been violated, although I have not found case law on this," she adds.

Pascual's client believes that the suspicious files come from WhatsApp groups, whose content was kept and uploaded automatically.

The three affected users have children and, although they do not recall taking photos for the pediatrician, they did have the typical images of children in the bathtub, in bed or at the swimming pool.

Microsoft does not give details either

Microsoft gives even less information than Google.

It only sends a few statements about how it fights pederasty in its systems: “First, we fund research to better understand how criminals abuse technology.

Second, we develop technology like PhotoDNA to detect cases of child sexual exploitation.

Third, our team of agents quickly investigates reports of violating content and removes it.

And fourth, we work with other technology companies and law enforcement to refer crimes.”

As with Microsoft, in a conversation this newspaper had with Google, the company's confidence in its detection systems was striking.

In fact, its software is either becoming more refined or generating more and more false positives: between July and December 2021, it suspended 140,868 accounts, almost double the figure for the first half of 2020.

Google scans accounts for child sexual material with two technologies: images that are already known carry a numerical code that identifies them.

If its systems find images that match those codes, the account is disabled.

It is the PhotoDNA system cited by Microsoft.
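PhotoDNA itself is proprietary and works on perceptual image hashes, but the matching step the article describes (comparing each uploaded file's code against a database of known codes) can be sketched roughly as follows. This is a minimal illustration in Python; the hash function, the file paths and the set of known codes are placeholders, not the actual systems Google or Microsoft use.

```python
import hashlib
from pathlib import Path

# Placeholder set of "numerical codes" for already-known illegal images.
# Real systems such as PhotoDNA use perceptual hashes that survive resizing
# and re-encoding; a cryptographic hash is used here only to keep the sketch simple.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_code(path: Path) -> str:
    """Compute an identifying code for a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_upload(paths: list[Path]) -> list[Path]:
    """Return the uploaded files whose codes match known material."""
    return [p for p in paths if file_code(p) in KNOWN_HASHES]

if __name__ == "__main__":
    # "uploads" is a hypothetical folder standing in for a user's Drive upload.
    flagged = scan_upload(list(Path("uploads").glob("*.jpg")))
    if flagged:
        print(f"{len(flagged)} file(s) matched known codes; the account would be disabled.")
```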

The problem is new images.

For those, Google has created a second computer-vision system that interprets the images and assigns each one a probability of being child sexual abuse material.

Then, in theory, they go to human reviewers who decide if a photo crosses the sexual threshold.
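The routing logic the article describes (a classifier score, a threshold, and human reviewers for borderline cases) could look, in very simplified form, like the sketch below. The threshold values and the decision labels are hypothetical; Google has not published how its pipeline actually triages images.

```python
from dataclasses import dataclass

# Hypothetical thresholds: below REVIEW_THRESHOLD the image is left alone,
# above BLOCK_THRESHOLD the account is disabled automatically, and anything
# in between is queued for a human reviewer. The real values are not public.
REVIEW_THRESHOLD = 0.70
BLOCK_THRESHOLD = 0.98

@dataclass
class Decision:
    action: str   # "allow", "human_review" or "disable_account"
    score: float

def triage(score: float) -> Decision:
    """Route an image based on the classifier's estimated abuse probability."""
    if score >= BLOCK_THRESHOLD:
        return Decision("disable_account", score)
    if score >= REVIEW_THRESHOLD:
        return Decision("human_review", score)
    return Decision("allow", score)

# Example: an innocuous family photo scored at 0.74 would land in the human
# review queue, which is how innocent pictures can enter the moderation pipeline.
print(triage(0.74))
```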

The company is now also concerned about material created by young people in the course of ordinary sexual exploration, which can be taken out of that context.

Google has also consulted pediatricians, for example, so that its software can distinguish when an adolescent's body is already adult.

This pretense of objectivity in the service of a laudable purpose can also claim many innocent victims.

The thin red line

The same thin red line applies to typical photos of children in swimming pools or other innocent settings.

Google is concerned that, taken out of context or photoshopped, these photos may end up in files shared between pedophiles.

Google focuses on supposedly flagrant cases, but this newspaper published a story about teenage athletes and girls whose bodies were used on YouTube for dubious purposes, and those videos remain online.

Users affected by these suspensions may one day receive another kind of call: from the police.

"I have a friend who is a National Police and I called him to tell him about the case and he told me that he would look at it with computer crimes," says Barberá, the Valencian professor.

"They told him they didn't know of any case like mine."

However, it is likely that cases like his have reached Spain through the reporting of the big corporations.

Google and Microsoft must report their suspicious findings to the National Center for Missing and Exploited Children (NCMEC) in the US. The center is the one that notifies national police forces.

The NCMEC sent 33,136 reports to Spain in 2021. Police sources confirm that this is the usual process and that they may receive reports with one or a few images.

These are usually cases that are not investigated.

In any case, the police do not report back to Google or Microsoft that a given person is not a suspect.

The companies make their own decisions, and those depend on the affected user being able to justify the presence of the detected material.

For that, however, the companies must say what that material is, which does not always happen.

It is likely that if, in the companies' judgment, the files found are extremely severe or extremely numerous, there is no longer any avenue of appeal.

Rubén Losada is a journalist who was left without an account due to a problem entering the password.

For some reason, he explains, Google thought it wasn't him.

Losada was in Tenerife on a trip and urgently needed to catch a bus.

For some reason, Google asked for his password.

He got it wrong several times and then tried to set a new one.

At that point he was locked out.

Losada believes that the change of location and his failed attempts did him in.

Although his case began differently from the others, the helplessness and the wall of silence were similar.

Like the others, Losada paid for his account and still never found anyone to talk to.

He considered going to court or whatever it took.

There was no way he was going to lose his account, he says.

Every two or three weeks he filed a new claim.

After six months he was able to log back in, without knowing why: "An acquaintance who is a security analyst told me that sometimes these systems are programmed this way and reset after six months," he says.


Source: EL PAÍS
