Tuesday, October 8

The dark side of Discord: the popular app among teens is at the center of sextortion and kidnapping cases

The platform allows anyone to create an account, without identity verification.

Photo: DAMIEN MEYER/AFP/Getty Images

Discord is an application released in 2015. It started as a hub for online gamers and, during the pandemic, became a destination for everything from cryptocurrency trading to YouTube gossip to K-pop. However, the platform, which now has more than 150 million subscribers around the world, also has a dark side.

In hidden communities and chat rooms, adults have used the platform to groom children before abducting them, to exchange child sexual abuse material (CSAM), and to extort minors who are tricked into sending nude images.

An investigation by NBC News identified at least 35 cases over the past six years in which adults were prosecuted on charges of kidnapping, grooming, or sexual assault that allegedly involved communications on Discord.

“What we see is just the tip of the iceberg,” Stephen Sauer, director of the Canadian Center for Child Protection (C3P) helpline, told the news outlet.

In one case, which occurred in March, a teenage girl was taken across state lines, raped, and found locked in a shed after being groomed on Discord for months. In another, a 22-year-old man kidnapped a 12-year-old girl after meeting her in a video game and grooming her on Discord, according to prosecutors.

NBC News identified an additional 165 cases, including four involving criminal gangs, in which adults were prosecuted for allegedly using the platform to blackmail children into sending sexually graphic images of themselves, a crime known as sextortion. At least 91 of the prosecutions have resulted in guilty pleas or verdicts, while many other cases are ongoing.

Discord is not the only technology platform used by child predators, but its young user base, decentralized structure, and multimedia communication tools, along with its recent growth in popularity, have made it a particularly attractive venue.

The National Center for Missing & Exploited Children (NCMEC) noted that when Discord collaborates with law enforcement, the response is typically much more efficient. However, the platform’s average response time to complaints has slowed from about three days in 2021 to almost five days in 2022, and other hotlines have complained that Discord’s responsiveness can be unreliable.

John Shehan, senior vice president of NCMEC, said that his organization has seen an “explosive growth” of child sexual abuse and exploitation material on Discord.

Discord has taken some steps to address child abuse and CSAM on its platform. The company said in a transparency report that it disabled 37,102 accounts for child safety violations in the last quarter of 2022.

In an interview, Discord’s vice president of trust and safety, John Redgrave, said he believes the platform’s approach to child safety has improved dramatically since 2021, when Discord acquired his AI moderation company, Sentropy.

In a review of publicly listed Discord servers created in the previous month, NBC News identified 242 that appeared to market sexually explicit content involving minors, using thinly disguised terms like “CP,” which refers to child sexual abuse material. At least 15 communities appealed directly to teenagers themselves, describing themselves as sexual communities for minors. Some of these communities had more than 1,500 members.

While it’s difficult to assess the full scope of child exploitation on Discord, organizations that track abuse reports on technology platforms have identified recurring themes in the thousands of Discord-related reports they process each year: grooming, the creation of child exploitation material, and the promotion of self-harm.

According to NCMEC and C3P, reports of enticement and grooming, in which adults communicate directly with children, are increasing online. Shehan said enticement reports made to NCMEC nearly doubled from 2021 to 2022.

Redgrave said Discord is working with Thorn, a well-known developer of anti-child exploitation technology, on models that can potentially detect grooming behavior.

The platform allows anyone to create an account, and like other platforms, it only asks for an email address and date of birth. Discord’s policies say US users can’t join unless they’re at least 13, but it doesn’t have a system for checking a user’s self-reported age.

Redgrave said the company is “very actively doing research and will invest in age-assurance technologies.”

Denton Howard, CEO of INHOPE, a network of hotlines for missing and exploited children around the world, said that Discord’s problems, like those of other companies, stem from a lack of foresight.

“Security by design should be built in from day one, not day 100,” he said.
