A Wall Street Journal investigation, conducted with outside researchers at Stanford University and the University of Massachusetts Amherst, found that Instagram hosted a "wide network" of accounts dedicated to pedophilic content. Unlike forums and file-sharing services, where distributors of illegal content tend to hide, on Instagram the popular social network's own recommendation algorithm actively promoted this content. The company acknowledged that it has an enforcement problem and said it has taken steps to prevent Instagram's systems from recommending searches related to sexual exploitation.
Meta was quick to respond to the investigation: "Child exploitation is a horrific crime," the company said. "We are continuously investigating ways to actively defend against such conduct." Meta said it would set up a dedicated task force, telling reporters it was working to dismantle the network and make changes to its systems. The company says it has removed 27 such networks in recent years and is working to remove others; it is also trying to prevent pedophiles from contacting one another.
However, the investigation and its findings serve as a warning sign for a company that has ignored similar warnings in the past – such as the recruitment of hitmen on Facebook, and findings by internal investigators that other horrific content on its networks often goes unblocked and is even promoted.
Another indication that Meta knew about the nature of the content but did not block it until the media report was a pop-up screen warning users that "search results may contain child sexual exploitation content" that could cause "extreme harm to children." Even so, the company did not prevent access to the content: the on-screen options were "View sources on the topic" or "View results anyway." The latter option was blocked only after the newspaper's exposé. Meta did not respond to reporters' questions about why that option was not blocked in the first place.
Alex Stamos, Facebook's former chief security officer, told The Wall Street Journal that "if a team of three academics with limited access can find such a large network, it should raise alarms at Meta." Stamos added that Meta needs better tools to deal with the phenomenon, and that he hopes the company will reinvest in hiring human investigators.
The Stanford researchers found that after viewing even a single account linked to pedophilic content, they received recommendations for additional accounts tied to people selling such illegal material. "Instagram is a gateway to places on the web where explicit child sexual exploitation content is available," says lab director Brian Levine. His research group found that the distribution of such content is particularly severe on Instagram, which is considered a key network for buyers and sellers of pedophilic content.