Child predators are using Discord, an app popular with teens, for kidnapping and sexual extortion

June 21, 2023

Highlights: "What we see is just the tip of the iceberg," says one expert. A journalistic analysis detects the strategies used by criminals to capture and abuse children. NBC News identified 35 cases over the past six years in which adults accused of kidnapping, child molestation or sexual assault allegedly involved communications on Discord. At least 91 of the prosecutions have resulted in convictions or verdicts, while many other cases are ongoing. The National Center for Missing and Exploited Children said it has seen an "explosive growth" of child sexual abuse and exploitation material on the app.


"What we see is just the tip of the iceberg," says one expert. A journalistic analysis detects the strategies used by criminals to capture and abuse children.


By Ben Goggin — NBC News

After its launch in 2015, Discord quickly became a popular online platform for video game fans; during the coronavirus pandemic it became a favorite destination for cryptocurrency investors and a forum for YouTube gossip and Korean pop.

It is now used by 150 million people worldwide. But the app has a dark side: In hidden communities and chat rooms, adults use it to groom children before abducting them, to trade child sexual abuse material (CSAM) and to extort minors whom they trick into sending nude images.

In a review of international, national and local criminal complaints, news articles and law enforcement communications published since Discord was founded, NBC News identified 35 cases over the past six years in which adults were prosecuted on charges of kidnapping, child molestation or sexual assault that allegedly involved communications on Discord.

Of those cases, 22 occurred during or after the pandemic. At least 15 have resulted in guilty pleas or verdicts, and many of the other cases are still pending.


These figures represent only cases that were reported, investigated and prosecuted, steps that each pose major obstacles for victims and their advocates. "What we see is just the tip of the iceberg," says Stephen Sauer, hotline director for the Canadian Centre for Child Protection (C3P).

The cases are varied. In March, a teenage girl was taken to another state, raped and found locked in a backyard shed, according to police, after being groomed on Discord for months. In another case, a 22-year-old man kidnapped a 12-year-old girl after meeting her in a video game and grooming her on Discord, according to prosecutors.

NBC News identified 165 other cases, including four criminal rings, in which adults were prosecuted for sharing or receiving CSAM through Discord or for allegedly using the platform to coerce children into sending graphic sexual images of themselves, a practice known as sextortion.

Possessing or creating CSAM is illegal in almost every jurisdiction in the world and violates Discord's rules. At least 91 of those prosecutions have resulted in guilty pleas or convictions, while many other cases are ongoing.

"He sexualized us from a very young age": protest against Naasón Joaquín García in Los Angeles

June 5, 202300:56

Discord isn't the only tech platform facing the persistent problem of online child exploitation, according to numerous reports over the past year. But experts have suggested that its young user base, decentralized structure and multimedia communication tools, along with its recent growth in popularity, have made it an especially attractive location for those looking to exploit minors.

According to an analysis of reports to the National Center for Missing and Exploited Children (NCMEC), CSAM reports on Discord increased 474% from 2021 to 2022.

When Discord responds and cooperates with hotlines and authorities, the groups say the information is often of high quality, including messages, account names, and IP addresses.

But NCMEC says the platform's average response time to reports has grown from about three days in 2021 to nearly five days in 2022. And other hotlines have complained that Discord's responsiveness can be unreliable.

John Shehan, senior vice president of NCMEC, said his organization has seen "explosive growth" of child sexual abuse and exploitation material on Discord. NCMEC operates the U.S. government-backed tip line that receives complaints and reports about child sexual abuse and related online activity.

"There is a problem of child exploitation on the platform. That's undeniable," Shehan said.

Discord has taken some steps to address child abuse and CSAM on its platform. The company said in a transparency report that it disabled 37,102 accounts for child safety violations in the last quarter of 2022.


In an interview, Discord's vice president of trust and safety, John Redgrave, said he believes the platform's focus on child safety has improved dramatically since 2021, when Discord acquired the AI moderation company Sentropy.

The company said in a transparency report that the acquisition "will allow us to expand our ability to detect and remove harmful content."

Moderation on Discord is largely left to volunteers in each Discord community.

Redgrave said that when the company does deploy moderation efforts against CSAM, however, it tries to cast a wide net and search the platform as broadly as possible.

Discord wasn't "proactive at all when I started," Redgrave said. But since then, he said, Discord has implemented several systems to proactively detect known child sexual abuse material and analyze users' behavior. Redgrave believes that the company now proactively detects most of the CSAM that has been previously identified, verified and indexed. Currently, Discord is not able to automatically detect newly created CSAM that has not been indexed or messages with signs of grooming.

In a review of publicly listed Discord servers created in the past month, NBC News identified 242 that appeared to market sexually explicit content involving minors, using thinly veiled terms like "CP" that refer to child sexual abuse material. At least 15 of those communities appealed directly to teens themselves, presenting themselves as sexual communities for minors. Some had more than 1,500 members.

Discord allows casual text, audio and video chat in invite-only communities called servers (some servers are set up to offer open invitations to anyone who wants to join). Unlike some other platforms, Discord does not require users' real identities, and it can host large group video and audio chats. That infrastructure has proven incredibly popular, and over the past seven years Discord has been woven into almost every corner of online life.

As the platform has grown, so have the child exploitation issues it faces.

While it is difficult to assess the full extent of Discord's child exploitation problem, organizations that track allegations of abuse on tech platforms have identified themes they have been able to distill from the thousands of Discord-related complaints they process each year: grooming, the production of child exploitation material, and the encouragement of self-harm.

According to NCMEC and C3P, reports of enticement and grooming, in which adults communicate directly with children, are increasing online. Shehan said enticement reports submitted to NCMEC nearly doubled from 2021 to 2022.


Sauer, director of C3P's hotline, said the group has seen an increase in reports of child grooming involving Discord, which can be attractive to people seeking to exploit children given its high concentration of young people and closed environment.

Sauer said predators sometimes connect with children on other platforms, such as Roblox or Minecraft, and move them to Discord so they can have direct, private communication.

"They often create an individual server without any moderation to connect with a young person," Sauer explains.

For many families, the grooming of minors through Discord has become a real-life nightmare, one that has led to the kidnapping of dozens of children.

In April, NBC News reported on a Utah family whose 13-year-old son was kidnapped, taken across state lines and, according to prosecutors, sexually assaulted after being groomed on Discord and Twitter.

In 2020, a 29-year-old man counseled a 12-year-old girl from Ohio whom he met on Discord on how to kill her parents, according to charging documents, which say the man told the girl in a Discord chat that he would pick her up after she killed them, at which point she could be his "slave." Prosecutors say she tried to burn down her home. The same man encouraged another girl, 17, to cut herself and send him sexually explicit photos and videos, according to prosecutors, who said the man admitted to sexually exploiting the girls. He pleaded guilty and was sentenced to 27 years in prison.


These cases illustrate another disturbing issue that watchdogs say is emerging on Discord: threats of violence against minors and incitement to self-harm.

An NCMEC report shared with NBC News found that Discord "has had a major problem over the past two years with an apparently organized group of offenders who have extorted numerous child victims into producing increasingly egregious CSAM, self-harm imagery, and torture of animals/pets to which they have access."

On dark web forums used by child predators, users share tips on how to effectively trick children on Discord. "Try Discord, impersonate a nervous 15-year-old and join the servers," one person wrote in a chat seen by NBC News. "I got 400 videos and over 1,000 photos," he added.


The tactics described in the post match those of child sexual abuse networks on Discord that have been dismantled by U.S. federal authorities in recent years. Prosecutors have described networks with organized roles, including "hunters" who located girls and invited them to a Discord server, "talkers" who chatted with the girls and groomed them, and "loopers" who streamed pre-recorded sexual content while posing as minors to encourage children to engage in sexual activity.

Shehan said his organization frequently receives reports from other tech platforms mentioning Discord users and traffic, which he says is a sign that the platform has become a hub for illicit activity.

Redgrave said, "My heart goes out to the families who have been affected by these abductions," adding that child recruitment often occurs through multiple platforms.

Redgrave said Discord is working with Thorn, a well-known developer of anti-child-exploitation technology, on models that could potentially detect grooming behavior. A representative for Thorn described the project as a potential aid to "any platform with an upload button to detect, review and report CSAM and other forms of child sexual abuse on the Internet." Currently, platforms are able to detect images and videos already identified as CSAM but have difficulty detecting new content or long-term grooming behavior.

Although the problem has existed for years, NBC News, working with researcher Matt Richardson of the U.S. nonprofit Anti-Human Trafficking Intelligence Initiative, was able to easily locate active servers that showed clear signs of being used for child exploitation.

On websites dedicated to listing Discord servers, people promoted their servers using variations of words with the initials "CP," short for child pornography, such as "cheese-pizza" and "casual politics." Group descriptions were often more explicit, advertising the sale of "t33n" or teen content. Discord does not operate these websites.


In response to NBC News' questions about the servers, Redgrave said the company was working to educate the third-party websites that promote them about measures they could take to protect children.

On several servers, groups explicitly invited minors to join "not safe for work" (NSFW) communities. One server promoted itself in a public database of servers, saying, "This is a community for 13- to 17-year-olds. We exchange nudes, we do events. Join us for the ultimate Teen-NSFW <3 [teens in explicit images] experience."

The groups appear to fit the description of child sexual abuse material production rings described by prosecutors, in which adults pose as teenagers to entice real children to share nude images.

Within one of the groups, users directly requested nude images from minors to gain access. "YOU HAVE TO SEND A VERIFICATION PHOTO TO THE OWNER! SHE HAS TO BE NAKED," one user wrote in the group. "WE ONLY ACCEPT BETWEEN 12 AND 17 YEARS."

In another group, which specifically claimed to accept "only girls between the ages of 5 and 17," chat channels included the titles "begging-to-have-sex-chat," "sexting-chat-with-the-server-owner," "nude-videos," and "nudes."


Richardson said he is working with law enforcement to investigate a group on Discord with hundreds of members who openly celebrate the extortion of children they say they have tricked into sharing sexual images, including images of self-harm and videos they say they have recorded of children.

NBC News did not enter the channels, and no child sexual abuse material was seen in the preparation of this article.

Discord markets itself to kids and teens, advertising its functionality for school clubs on its homepage. But many of the platform's other users are adults, and the two age groups can mix freely. Other platforms, such as Instagram, have established restrictions on interaction between users under 18 and those over that age.

The platform allows anyone to create an account and, like other platforms, asks only for an email address and a birthday. Discord's policies state that U.S. users cannot join unless they are at least 13 years old, but it has no system to verify the age a user declares. Age verification has become a hot topic in social media policy, as legislation on the subject is being considered in the U.S. Some major platforms, such as Meta, have tried to establish their own age verification methods, and Discord is not the only platform that has yet to set up an age verification system.

Redgrave said the company is "very actively researching, and will invest in, age-assurance technologies."

Children under the age of 13 can often create accounts on the app, according to court records reviewed by NBC News.

Despite the platform's closed nature and the mix of age groups within it, Discord has been transparent about its lack of oversight of the activity that occurs on it.

"We don't control all servers or all conversations," the platform says on a page in its security center. Instead, the platform says it mostly waits for community members to point out problems. "When we are alerted to a problem, we investigate the behavior in question and take action."

Redgrave said the company is testing models that analyze server and user metadata in an effort to detect threats to child safety.

Redgrave added that the company plans to debut new models that will try to locate undiscovered trends in child exploitation content later this year.

Despite Discord's efforts to address child exploitation and CSAM on its platform, numerous watchdogs and officials said more could be done.

Denton Howard, executive director of Inhope, a global network of hotlines for missing and exploited children, said Discord's problems stem from a lack of foresight similar to that of other companies.

"Security by design should be included from day one, not day 100," he said.

Howard said Discord had contacted Inhope and proposed becoming a donor partner of the organization. Following a risk assessment and consultation with member groups, Inhope rejected the offer and pointed to areas where Discord needed to improve, including slow response times to reports, communication problems when hotlines tried to make contact, hotlines receiving account warnings when attempting to report CSAM, the continued hosting of communities that trade in and create CSAM, and evidence that disappeared before Discord responded.

In an email from Discord to Inhope seen by NBC News, the company said it was updating its child safety policies and working to implement Thorn's grooming classifier.

Howard said Discord had made progress since that March exchange, including reinstating a program that prioritizes hotline reports, but that Inhope had still not accepted the company's donation.

Redgrave said he believes the improvements recommended by Inhope match the changes the company hoped to implement in the future.

Shehan noted that NCMEC had also struggled to work with Discord, saying the organization rescinded an invitation for Discord to join its roundtable on cyber tip lines, which brings together law enforcement and key reporters to those tip lines, after the company failed to "identify a senior child safety representative to attend."

"It was really questionable their commitment to what they're doing on the child exploitation front," Shehan said.

Sgt. Tim Brown of the Ontario Provincial Police said he had a similar experience with the platform that made him question the company's child safety practices. In April, Brown said, Discord asked his department for payment after he had asked the company to preserve records related to an alleged network of child sexual abuse content.

Brown said his department had identified a Discord server with 120 participants that was sharing child sexual abuse material. When police asked Discord to preserve the records for investigation, Discord suggested it would cost an unspecified amount of money.


In an email from Discord to Ontario police seen by NBC News, Discord's legal team wrote, "The number of accounts you have requested be preserved is too voluminous to be easily accessible to Discord. It would be excessively burdensome to search for or retrieve data that may exist. If you would like Discord to search for and produce this information, we will first need to discuss reimbursement of the costs associated with searching for and producing more than 20 identifiers."

Brown said he had never before seen a social media company ask to be reimbursed for preserving potential evidence.

Brown said his department ultimately submitted individual requests for each identifier, which were accepted by Discord without payment, but that the process took time away from other parts of the investigation.

"That certainly put up an obstacle," he said. Redgrave said the company routinely asks law enforcement to reduce the scope of requests for information, but that the request for payment was a mistake: "It's not part of our response process."

Source: Telemundo
