
Anti-vaccine movements: how do they thrive on social media?



For years they flourished unchecked on social networks, and now they are hard to fight. Worse, efforts to stamp out the misinformation have been painfully slow.


03/15/2021 6:00 AM


Updated 03/15/2021 7:09 AM

As vaccination against COVID-19 gets into full swing, social platforms such as Facebook, Instagram and Twitter say they have stepped up their fight against misinformation that aims to undermine confidence in vaccines.

But problems abound.

For years, those same platforms allowed anti-vaccine propaganda to flourish, and those views are now hard to stamp out. And their efforts to eliminate other kinds of misinformation about COVID-19, often through fact-checking, informational labels and other containment measures, have been painfully slow.

Twitter, for example, announced this month that it would remove dangerous falsehoods about vaccines, much as it has done with conspiracy theories and COVID-related misinformation.

But since April 2020 it has deleted a total of just 8,400 tweets spreading COVID disinformation, a tiny fraction of the flood of falsehoods about the pandemic that popular users with millions of followers tweet daily, critics say.

"As long as they don't take action, lives are lost," said Imran Ahmed, CEO of the Center for Countering Digital Hate, a watchdog group.

In December, the nonprofit found that 59 million accounts across social platforms follow spreaders of anti-vaccine propaganda, many of them hugely popular disinformation super-spreaders.

An anti-vaccine march in Colorado, USA. AP Photo

However, efforts to crack down on vaccine misinformation have led to accusations of censorship and prompted some spreaders to adopt stealth tactics to avoid being removed.

"It's a tough situation because we've let this go on for a long time," says Jeanine Guidry, an assistant professor at Virginia Commonwealth University who studies social media and health information. "People who use social media have been able to share what they wanted for almost a decade."

Active accounts


The Associated Press identified more than a dozen Facebook pages and Instagram accounts that collectively boast millions of followers and have made false claims about the COVID-19 vaccine or discouraged people from getting it.

Some of those pages have been around for years.

Of the more than 15 pages identified by NewsGuard, a technology company that analyzes the credibility of websites, about half remain active on Facebook, according to the AP.

One such page, The Truth About Cancer, has more than a million followers on Facebook after years of posting unsubstantiated suggestions that vaccines could cause autism or damage children's brains.

The page was identified in November by NewsGuard as a "super-spreader of misinformation about COVID-19 vaccines."

It recently stopped posting about vaccines and the coronavirus, and now urges people to subscribe to its newsletter and visit its website as a way to avoid so-called "censorship."

Facebook said it was taking "aggressive steps to fight misinformation in all of our apps by removing millions of pieces of content about COVID-19 and vaccines on Facebook and Instagram during the pandemic."

"Research shows that one of the best ways to promote vaccine acceptance is to show people accurate and reliable information, which is why we have connected 2 billion people with resources from health authorities and launched a global information campaign," the company said in a statement.

Facebook has also banned ads that advise against vaccines and said it has added warning labels to more than 167 million pieces of COVID-19 content thanks to its network of fact-checking partners.

(The Associated Press is one of Facebook's fact-checking partners.)

YouTube, which generally avoids the scrutiny faced by its social media peers despite being a source of misinformation, said it has removed more than 30,000 videos since October, when it began banning false claims about COVID-19 vaccines.

Since February 2020, it has removed more than 800,000 videos related to dangerous or misleading information about the coronavirus, YouTube spokeswoman Elena Hernández said.

However, before the pandemic, social media platforms had done little to stamp out misinformation, said Andy Pattison, manager of digital solutions at the World Health Organization.

In 2019, as a measles outbreak hit the Pacific Northwest and left dozens dead in American Samoa, Pattison pleaded with the big tech companies to tighten their rules on vaccine misinformation, fearing it could aggravate the outbreak, but without success.

The campaigns

It was only when COVID-19 hit hard that many of those tech companies began to listen.

Now Pattison meets weekly with Facebook, Twitter, and YouTube to discuss trends seen on their platforms and policies to watch out for.

"When it comes to vaccine misinformation, the really frustrating thing is that this has been around for years," says Pattison.

The targets of these crackdowns tend to adapt quickly.

Some accounts intentionally misspell words, such as "vackseen" or "v@x", to circumvent the bans.

Other pages use subtler messages, images, or memes to hint that vaccines are unsafe or even deadly.

"When you die after the vaccine, you die of everything except the vaccine," read a meme on an Instagram account with more than 65,000 followers.

The post hinted that the government is hiding the deaths caused by the COVID-19 vaccine.

"There is a very fine line between free speech and the undermining of scientific data," said Pattison.

Disinformation broadcasters, he said, "learn the rules and dance right on the edge, all the time."

Twitter said it continually reviews its rules in the context of COVID-19 and changes them based on expert advice.

Earlier this month, it added a repeat-offender policy that threatens bans for those who repeatedly spread misinformation about the coronavirus and vaccines.

However, manifestly false information about COVID-19 continues to appear.

Earlier this month, several articles circulating on the Internet claimed that Pfizer's vaccine had "killed" more elderly Israelis who had received it than had died from COVID-19 itself.

One such article, posted on an anti-vaccine website, was shared nearly 12,000 times on Facebook, driving a spike this month of nearly 40,000 mentions of "vaccine deaths" across social platforms and the Internet, according to an analysis by the media intelligence company Zignal Labs.


Medical experts point to a real-world study showing a strong correlation between vaccination and a reduction in cases of severe COVID-19 illness in Israel.

The country's Health Ministry said in a statement Thursday that the COVID-19 vaccine has "sharply" reduced rates of death and hospitalization.

As the United States' vaccine supply increases, immunization efforts will soon shift from rationing a limited stock for the most vulnerable populations to delivering as many shots as possible to as many people as possible.

That means dealing with the third of the nation's population who say they will not or probably will not get vaccinated, according to a February AP-NORC poll.

"Vaccine doubts and misinformation could be a major obstacle to getting enough people vaccinated to end the crisis," said Lisa Fazio, a psychology professor at Vanderbilt University.

Some health officials and academics believe the platforms' efforts help, at least at the margins.

What is unclear is how much of a dent they can make in the problem.

"If someone believes that the COVID vaccine is harmful and feels responsible for communicating it to their friends and family ... they will find a way to do it," Guidry said.

And some still blame business models that, they say, drive the platforms to serve up coronavirus misinformation that is engaging, however false, for advertising profit.

When the Center for Countering Digital Hate recently studied the crossover between different types of disinformation and hate speech, it found that Instagram's algorithm tended to favor the cross-pollination of disinformation.

Instagram could feed an account that followed a conspiratorial QAnon page with posts from, say, white nationalists or anti-vaccine activists.

"Things are allowed to keep deteriorating because of an undifferentiated mix of misinformation and information on their platforms," said Ahmed, the center's CEO.

Barbara Ortutay and Amanda Seitz, The Associated Press

See also

Anti-vaccine nurse and Bolsonaro supporter dies of COVID-19 after refusing the Chinese-made dose

Chile consolidates its position as the country with the highest vaccination rate in Latin America

Source: Clarín

