
Facebook: impact on sensitive content control with Covid

2020-08-12T08:25:10.443Z


"With more reviewers at home, we have relied on our technology" (ANSA)The coronavirus impacted the control of sensitive content on Facebook, which had to prioritize technology rather than real-life reviewers to review it. This is what emerges from the sixth edition of the Report on the application of Community Standards which quarterly provides data on how the social media has applied its policies in areas such as hate speech or cyberbullying, in the specific case f...


The coronavirus affected Facebook's moderation of sensitive content, forcing the company to prioritize technology over human reviewers. This emerges from the sixth edition of the Community Standards Enforcement Report, which provides quarterly data on how the social network applies its policies in areas such as hate speech and cyberbullying, in this case covering April to June 2020 on both Facebook and Instagram.

"Due to Covid-19 in March we sent our content reviewers home to protect their health and ensure their safety and we had to rely more on our technology to be able to review the content," explains Mark Zuckerberg's company. that just a few days ago extended smart working for employees who want it until July 2021.

Today's report shows, for example, that the amount of hate speech content acted on increased considerably: on Facebook from 9.6 million pieces in Q1 2020 to 22.5 million in Q2, and on Instagram from 808,900 to 3.3 million over the same period. Thanks to proactive detection, that is, artificial intelligence technology, 94.5% of this content on Facebook was acted upon before it was even reported, compared with 84.2% on Instagram. "Despite the impact of Covid," the company adds, "the progress made by our technology has allowed us to act on a greater amount of content in some areas and to increase our proactive detection rate in others."

One area where the platform saw improvements from greater reliance on technology was terrorist content: the amount of content acted on increased from 6.3 million pieces in the first quarter to 8.7 million in the second. The smaller number of reviewers, however, reduced enforcement against content such as suicide and self-harm, child exploitation imagery, and violent and graphic content on Facebook, categories the social network prefers to entrust to human reviewers.

Source: ANSA

