The Limited Times


What Facebook knew about how it radicalized its users

2021-10-23T03:53:10.020Z

Internal documents suggest that Facebook has long known that its algorithms and recommendation systems drive some users to extremes. This is what the company is doing to combat it.



By Brandy Zadrozny, NBC News

In the summer of 2019, a new Facebook user named Carol Smith signed up to the platform, describing herself as a politically conservative mother from Wilmington, North Carolina.

Smith's account indicated an interest in politics, motherhood and Christianity, and followed some of her favorite brands, including Fox News and then-President Donald Trump. 

Although Smith had never expressed an interest in conspiracy theories, in just two days Facebook recommended that she join groups dedicated to QAnon, a movement built on an extensive and unsubstantiated conspiracy theory claiming that Trump was secretly saving the world from a cabal of pedophiles and satanists.

Smith didn't follow the recommended QAnon groups, but the algorithm Facebook uses to determine how users engage with the platform kept working just the same.

Within a week, Smith's feed was littered with groups and pages that had violated Facebook's rules, including those prohibiting hate speech and misinformation.


But Smith was not a real person. The account was invented by a Facebook researcher, along with those of other fictitious "test users," in 2019 and 2020 as part of an experiment to study the platform's role in disinformation and in the polarization of users through its recommendation systems.

That researcher said that Smith's experience on Facebook was "a barrage of extreme, conspiratorial and graphic content." 

That body of research has consistently shown that Facebook pushes some users into dark places, where their prejudices and conspiracy theories are reinforced.

People radicalized through these rabbit holes make up a small portion of total users, but at Facebook's scale that can amount to millions of individuals.


Illustration about a trap that Facebook users can fall into

The findings, reported in a document titled Carol's Journey to QAnon, are among thousands of pages of disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by legal counsel to Frances Haugen, who worked as a product manager at Facebook until May.

Haugen asserted her whistleblower status and filed several specific complaints alleging that Facebook puts profit over public safety.

Earlier this month, she testified about her allegations before a Senate subcommittee.

Versions of the disclosures, with the names of the researchers redacted, including the author of Carol's Journey to QAnon, were shared digitally and reviewed by a consortium of news organizations, including our sister network NBC News.

A month ago, The Wall Street Journal published a series of reports based on many of the documents. 


"Although this was a study of a hypothetical user, it is a perfect example of the research the company conducts to improve our systems and helped support our decision to remove QAnon from the platform," a Facebook spokesperson said in response to questions sent by email.

Facebook CEO Mark Zuckerberg has broadly denied Haugen's claims, defending his company's "industry-leading research program" and its commitment "to identifying the important issues and working on them."

Documents released by Haugen partially support those claims, but also reveal the frustrations of some of the employees involved in that investigation.

Haugen's revelations include research, reports and internal posts suggesting that Facebook has long known that its algorithms and recommendation systems push some users to extremes.

And while some managers and executives ignored the internal warnings, anti-vaccine groups, conspiracy theorists and disinformation agents took advantage of that permissiveness, threatening public health, personal safety and democracy in general.

"These documents effectively confirm what outside researchers have been saying for years, and that was often dismissed by Facebook," said Renée DiResta, director of technical research at the Stanford Internet Observatory and among the first to report the risks of Facebook's recommendation algorithms.

Facebook's own research shows how easily a relatively small group of users can hijack the platform and, for DiResta, resolves any remaining questions about the social network's role in the growth of conspiracy movements.

"Facebook literally helped create a cult," she declared.

A pattern on Facebook

For years, company researchers had conducted experiments like Carol Smith's to measure the platform's influence on user radicalization, according to documents viewed by NBC News.


This internal work repeatedly found that recommendation tools pushed users toward extremist groups, a series of revelations that helped inform policy changes, adjustments to recommendations and changes to news feed rankings.

Those rankings are part of a sprawling, ever-evolving system widely known as "the algorithm" that pushes content to users.

But the research at the time failed to inspire any changes to how groups and pages were handled.

That resistance was an indication of "a pattern on Facebook," Haugen told reporters this month.

"They want the shortest path between their current policies and any action," she said.

Haugen added: "There is a great reluctance to solve problems proactively." 

A Facebook spokesperson disputed the claim that the research had not prompted the company to act, and pointed to the changes to groups announced in March.

As followers of QAnon committed acts of violence in the real world in 2019 and 2020, groups and pages related to the conspiracy theory skyrocketed, according to internal documents.

The documents also show how teams within Facebook took concrete steps to understand and address those issues, measures some employees deemed too little, too late.


In the summer of 2020, Facebook was home to thousands of private QAnon groups and pages, with millions of members and followers, according to an unpublished internal investigation.

A year after the FBI designated QAnon a potential domestic terrorist threat in the wake of gun battles, kidnappings, stalking campaigns and shootings, Facebook labeled QAnon a "violence-inciting conspiracy network" and banned it from the platform, along with militias and other violent social movements.

A small team working across departments at Facebook found that the platform had hosted hundreds of ads on Facebook and Instagram, worth thousands of dollars and millions of views, "praising, supporting or representing" the conspiracy theory.

Facebook's spokesperson said in an email that the company has "taken a more aggressive approach in reducing content that is likely to violate our policies, as well as not recommending groups, pages, or individuals who regularly post content that is likely to violate our policies."

For many employees within Facebook, enforcement came too late, according to messages left on Workplace, the company's internal message board.

"We have known for more than a year that our recommendation systems can very quickly lead users down the path of conspiracy theories and groups," wrote an integrity researcher, whose name had been redacted, in a post announcing their departure from the company.

"This fringe group has grown to national prominence, with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had gotten out of control," the researcher lamented.

"We should be worried"

Although the ban appeared effective at first, one problem persisted: removing groups and pages did not remove QAnon's most extreme followers, who continued to organize through the platform.


"There was enough evidence to raise red flags in the expert community that Facebook and other platforms had failed to address QAnon's violent, extremist dimension," said Marc-André Argentino, a researcher at the International Centre for the Study of Radicalisation at King's College London, who has studied QAnon extensively.

Followers of the conspiracy theory simply changed their names, rebranding as anti-child-trafficking groups, or migrated to other communities, including those around the anti-vaccine movement.

Others turned to groups promoting Donald Trump's lies about alleged electoral fraud in the 2020 presidential election. Many of the people who ended up storming the Capitol were active in those Facebook groups, according to a report included in the documents, first reported by BuzzFeed News in April.

The researchers cautioned that one of the reasons Facebook has been unable to stop these groups is because it has targeted them individually and has not treated them as a cohesive movement.

"A headache"

The attack on the Capitol led to a moment of self-criticism for employees.

One team noted that the lessons learned from QAnon warned against being overly lenient with content from the anti-vaccine movement, which researchers said came to account for half of all vaccine-related impressions on the platform.


Facebook's "Dangerous Content" team formed a task force in early 2021 to find ways to deal with the kinds of users who had been a challenge for the social network: communities like QAnon, COVID-19 deniers and the misogynist incel movement, which are not obvious terrorist or hate groups but which, by their nature, pose a risk to the safety of individuals and societies.

The goal was not to eradicate them, but to slow the growth of these "harmful topic communities" with the same algorithmic tools that had allowed them to grow out of control.

"We know how to detect and remove harmful content, adversarial actors, and malicious coordinated networks, but we have yet to understand and deal with the additional harms associated with forming harmful communities," the team wrote in a 2021 report.

A Facebook group dubbed Drebbel, after the 17th-century Dutch engineer Cornelis Drebbel, was tasked with discovering how users become radicalized, in order to block those paths on Facebook and Instagram.

The group works on a program to "de-amplify" extremist groups and false information.

In March, researchers found that many of the users participating in the QAnon community arrived through groups that act as gateways.

They believe they can place obstacles to keep users from reaching those groups, the same way the algorithms previously steered users toward them.

"We found, like many problems on FB, that this is a headache, with a relatively small number of actors creating a large percentage of the content and growth," the group wrote.

The Facebook spokesperson said the company had "focused on results" in relation to COVID-19 and had seen a 50% decrease in content that misrepresents the efficacy and safety of vaccines, according to a survey it conducted in conjunction with Carnegie Mellon University and the University of Maryland.

It remains to be seen whether Facebook's most recent integrity initiatives will be able to stop the next dangerous conspiracy theory movement or the violent organization of existing movements.

But its policy recommendations may carry more weight now that the violence of Jan. 6 has exposed the outsized influence and dangers posed by even the smallest extremist communities and the misinformation they fuel.

"The power of community, when based on harmful topics or ideologies, represents a greater threat to our users than any content, adversarial actor or malicious network," a 2021 Facebook report concluded.

The Facebook spokesperson added that the recommendations in the "Deamplification Roadmap" are already being implemented: "This is important work and we have a long history of using our research to make informed changes to our applications," the spokesperson wrote.

"Drebbel is consistent with this approach, and its research helped inform our decision this year to permanently stop recommending civic, political or news groups on our platforms. We are proud of this work and hope it continues to inform product and policy decisions in the future," the spokesperson concluded.




Source: Telemundo
