
Social networks are on alert for the anniversary of the assault on the Capitol

2022-01-06T20:43:53.910Z


One year after the assault on the Capitol, the main social media platforms are closely monitoring content about the attack.



New York (CNN Business) -

January 6 was a clear turning point for the major social media platforms: it demonstrated that, under certain circumstances, they were willing to deplatform a sitting United States president.

But some experts worry that they have not yet done enough to address the underlying issues that allowed Trump supporters and others on the far right to be duped, radicalized and organized through their platforms.


Heading into the anniversary, Meta (Facebook's parent company), Twitter and YouTube say they have been monitoring their platforms for harmful content related to the Capitol riot.


"We have strong policies that we continue to enforce, including banning hate organizations and removing content that praises or supports them," a Meta spokesperson told CNN, adding that the company has been in contact with law enforcement agencies. , including the FBI and the Capitol Police, around the anniversary.

As part of its efforts, Facebook is proactively monitoring content praising the assault on the Capitol, as well as content calling on people to carry or use weapons in Washington, according to the company.

"We continue to actively monitor threats on our platform and will respond accordingly," the Meta spokesperson said.

Twitter convened an internal working group with members from various parts of the company to make sure the platform could enforce its rules and protect users around the one-year mark of January 6, a Twitter spokesperson told CNN.

"Our approach, both before and after January 6 [2020], has been to crack down on accounts and tweets that incite violence or have the potential to lead to offline harm," the spokesperson said. , adding that Twitter also has open lines of communication with federal officials and law enforcement.


YouTube's Intelligence Desk, a group tasked with proactively finding and moderating problematic content, has been monitoring trends in content and behavior related to the Capitol riot and its anniversary.

As of this Wednesday, the company had not detected an increase in new conspiracy content related to January 6 or the 2020 elections that violates its policies, according to spokeswoman Ivy Choi.

"Our systems are actively targeting high-authority channels and limiting the spread of misinformation harmful to election-related issues," Choi said in a statement.

These efforts come after Facebook, Twitter, YouTube and other platforms faced intense criticism over the past year for social media's role in the crisis.

The companies, for their part, have largely argued that they had strong policies in place even before the Capitol riot, and that they have only tightened protections and enforcement since.

When the rioters escalated their attack on the Capitol on January 6, breaking into the building, ransacking congressional offices and overpowering officers, social media platforms scrambled to stem the fallout, first by labeling President Trump's posts, then deleting them, and finally suspending his account entirely.

But some experts wonder if the approach to moderation has changed substantially in the last year.


"While I certainly hope they have learned from what happened, if they have, they haven't really reported it publicly," said Laura Edelson, a New York University researcher who studies political communication online.

This is especially concerning, Edelson says, because misinformation about the assault, and the conspiracy theory that the election was stolen, could resurge around the anniversary.

"Much of the narrative within the far-right movement is that, firstly, [the assault] wasn't that bad, and secondly, that it was actually the others who did it," he said.

In interviews leading up to the January 6 anniversary, some Trump supporters in Washington told CNN they believe Democrats or the FBI were responsible for the attack.


Facebook's response to January 6

Facebook, now a division of Meta, was the social media platform hit hardest by January 6, due in part to internal documents leaked by whistleblower Frances Haugen, which showed that the company had rolled back protections it had implemented for the 2020 elections before January 6 of last year.

Haugen told the SEC in a filing that the company only re-implemented some of those protections after the assault began.

Days after the Capitol riot, Facebook banned "stop the steal" content. And internally, researchers looked at why the company was unable to stop the movement from growing, documents since released by Haugen (and obtained by CNN from a congressional source) revealed. Meta has also taken steps to "disrupt militarized social movements" and prevent QAnon and the militias from organizing on Facebook, Meta Vice President of Integrity Guy Rosen said in an October blog post about the company's efforts around the 2020 elections.

Meta has disputed Haugen's claims and sought to distance itself from the attack.

Nick Clegg, the company's vice president of Global Affairs, told CNN in October that it is "ridiculous" to blame the unrest on social media.

"Responsibility for the violence on January 6 and the insurrection on that day rests squarely with the people who inflicted the violence and those who encouraged it," Clegg said.

However, researchers say the company continues to struggle to combat disinformation and extremist content.

"We haven't really seen any substantial change in moderation for Facebook content that they have talked about publicly or that has been externally detectable," Edelson said.

"It appears that externally they continue to use rather rudimentary keyword matching tools to identify problematic content, be it hate speech or misinformation."

Meta noted in a September blog post that its artificial intelligence systems have improved at proactively removing problematic content, such as hate speech.

And in its November community standards enforcement report, the company said the prevalence of hate speech, measured as views of hateful content relative to all content views, declined for the fourth consecutive quarter.

A new report released Tuesday by the technology advocacy and research group Tech Transparency Project (TTP) found that content related to the "Three Percenters," an extremist, anti-government group whose supporters were charged in connection with the January 6 attack, remains widely available on Facebook, with some groups using "militia" in their names or displaying well-known symbols associated with the movement. According to the report, when TTP researchers examined this content, Facebook's "suggested friends" and "related pages" features recommended accounts or pages with similar imagery. (TTP is funded in part by an organization founded by Pierre Omidyar.)


"As Americans approach the first anniversary of the assault, the TTP has encountered many of the same troubling patterns on Facebook as the company continues to overlook militant groups that pose a threat to democracy and the state of Right, "the report states, adding that" Facebook's algorithms and advertising tools often promote this type of content to users. "

"We have removed several of these groups for violating our policies," Meta spokesman Kevin McAlister said in a statement to CNN about the TTP report.

Facebook says it has removed thousands of groups, pages, profiles and other content related to militarized social movements and has banned such organizations, including the Three Percenters, noting that the pages and groups cited in the TTP report had relatively small numbers of followers.

Other actors

It is clear that the disinformation landscape extends well beyond Facebook, including to more marginal platforms such as Gab, which gained popularity after January 6 on promises not to moderate content, while the largest companies faced calls to take action against hate speech, misinformation and violent groups.

In August, the House Select Committee investigating the deadly January 6 riot on Capitol Hill sent letters to 15 social media companies, including Facebook, YouTube and Twitter, seeking to understand how misinformation and efforts to overturn the election, by both foreign and domestic actors, existed on their platforms.

Six days after the attack, Twitter said it had removed 70,000 accounts spreading conspiracy theories and QAnon content.

Since then, the company says it has removed thousands more accounts for violating its policy against "coordinated harmful activity" and also says it bans violent extremist groups.

"Engagement and focus across government, civil society and the private sector are also critical," the Twitter spokesperson said.

"We recognize that Twitter has an important role to play, and we are committed to doing our part."


YouTube said that in the months leading up to the Capitol riot, it removed the channels of various groups that later became associated with the attack, including those tied to the Proud Boys and QAnon, for violating its existing policies on hate, harassment and election integrity. During the assault and in the days after, the company removed live streams of the riot and other related content that violated its policies, and YouTube says its systems are now more likely to direct users to authoritative sources of election information.

"Over the past year, we have removed tens of thousands of videos for violating our policies related to US elections, most before reaching 100 views," said Choi of YouTube.

"We remain vigilant in the face of the 2022 elections and our teams continue to monitor closely and act quickly in the face of electoral misinformation."

- CNN's Oliver Darcy contributed to this report.


Source: cnnespanol
