Popular app Discord is being exploited by predators to GROOM and EXTORT children
By zoeysky // 2023-06-29
 
Social media has helped people stay in touch regardless of distance, but some apps can also be used for evil purposes. According to a shocking report, the social platform Discord is home to many hidden communities and chat rooms full of adults who exploit the platform to groom children. Data from the review also found evidence that the app is used to trade child sexual exploitation material (CSAM) and to extort minors whom adults have fooled into sending nude images.

Discord launched in 2015 and rose to fame as a hub for online gamers. During the Wuhan coronavirus (COVID-19) pandemic, users with different interests spent a lot of time on the app talking to other people about topics like crypto trading and K-pop. Today, Discord is used by 150 million people worldwide.

In a review of international, national and local criminal complaints, news articles and law enforcement communications published since 2015, NBC News identified at least 35 cases over the past six years in which adults were prosecuted on charges of grooming, kidnapping or sexual assault allegedly linked to communications on Discord, with 22 occurring during or after the pandemic. At least 15 of the prosecutions ended in guilty pleas or verdicts, and many more cases are still pending. (Related: Majority of parents think big tech, social media corrupting kids.)

Stephen Sauer, the director of the tipline at the Canadian Center for Child Protection (C3P), warned that this is "only the tip of the iceberg."

The cases are varied. In March of this year, a 13-year-old girl from Texas was taken across state lines, raped and found locked in a backyard shed. According to police, the teenager had been groomed on Discord for months.

The review also identified 165 more cases, including four crime rings, in which adults were prosecuted for sending or receiving CSAM through Discord.
Others were prosecuted for allegedly using the platform to extort children into sending sexually graphic images of themselves, a form of sexual exploitation also called sextortion. Creating or consuming CSAM is illegal in nearly all jurisdictions worldwide, and it also violates the app's rules. At least 91 of the prosecutions have resulted in guilty pleas or verdicts, with many more cases still ongoing as of this writing.

Discord's young userbase makes it a perfect hunting ground for child predators

Several reports from 2022 revealed that Discord isn't the only platform dealing with distressing issues like online child exploitation. However, experts believe that Discord's young, steadily growing user base, decentralized structure and multimedia communication tools have turned the app into a hotbed for people who intend to exploit children.

Discord can be used for casual text, audio and video chat in invite-only communities called servers, some of which are set to provide open invitations to anyone who wants to join. Unlike other platforms, Discord doesn't require users to disclose their real identities. The app can also be used to host large group video and audio chats.

According to the National Center for Missing and Exploited Children (NCMEC), reports of CSAM on the social platform skyrocketed by an alarming 474 percent from 2021 to 2022. NCMEC operates the U.S. government-supported tipline that receives complaints and reports about child sex abuse and associated activity online.

When Discord responds and cooperates with tiplines and law enforcement, it provides detailed information like account names, messages and IP addresses. But NCMEC said the platform's responsiveness to complaints has significantly slowed down. In 2021, complaints were addressed in an average of three days; in 2022, Discord's response time stretched to at least five days. Other tiplines also complained that Discord's responsiveness isn't always reliable.

John Shehan, NCMEC's senior vice president, lamented that the organization has observed an alarming increase in child sexual abuse material and exploitation on Discord. Discord is currently unable to automatically detect newly created CSAM that hasn't been indexed, or messages that could contain signs of grooming.

In a review of publicly listed Discord servers created in May, 242 appeared to market sexually explicit content involving minors, using thinly veiled terms like "CP" to refer to child sexual abuse material.
At least 15 communities appealed directly to teenagers by claiming to be sexual communities for minors, and some of these communities have more than 1,500 members.

Experts are still struggling to assess the full scope of child exploitation on Discord, but organizations that track reports of abuse on tech platforms have identified recurring themes in the thousands of Discord-related reports they process annually: grooming, creation of child exploitation material and encouragement of self-harm.

Both NCMEC and C3P warned that reports of enticement, luring and grooming, in which adults communicate directly with children, are increasing throughout the internet. Shehan said enticement reports made to NCMEC nearly doubled from 2021 to 2022.

Sauer explained that the group has observed an increase in reports of child luring involving Discord, which may be attractive to those who want to exploit children because of the app's many young users and closed-off environment. He cautioned that many predators will talk to children on other platforms, such as Roblox and Minecraft, before asking them to move to Discord so they can talk to the children directly and privately. In most cases, these adults set up an individual server that isn't moderated.

Grooming on app often linked to abductions

Grooming that allegedly took place on Discord has resulted in the abduction of dozens of children.

In 2020, a 29-year-old man taught a 12-year-old girl from Ohio whom he met on Discord how to kill her parents. Charging documents revealed that the man told the girl in a Discord chat that he would come and get her after she killed them, after which she could be his "slave." Prosecutors said the girl then tried to set the family's house on fire.

The same man encouraged another minor, a 17-year-old girl, to cut herself and send him sexually explicit photos and videos. Prosecutors reported that he admitted to sexually exploiting both minors. He pleaded guilty and was sentenced to 27 years in prison.

An NCMEC report revealed that for the past two years, Discord has struggled to deal with an allegedly organized group of offenders who have sextorted many child victims into producing CSAM, engaging in self-harm or even torturing animals.

On dark web forums frequented by child predators, users share tips on how to fool children on Discord. These "tips" include pretending to be teenagers in order to receive videos and pictures from victims. The tactics described in the posts line up with child sex abuse rings on Discord that have been busted by U.S. federal authorities within the past few years. Prosecutors have described rings with organized roles: "hunters" searched for young girls to invite into a Discord server, "talkers" were tasked with chatting with the girls and enticing them, and "loopers" streamed previously recorded sexual content and pretended to be minors to encourage actual children to engage in sexual activity.

Shehan said NCMEC often receives reports from other tech platforms mentioning users and traffic from Discord, which suggests that the platform has indeed turned into a center for illicit activity targeting children.

Visit Trafficking.news for more stories about child exploitation.

More related stories:

Tucker Carlson went there: says "It's time we talked about the elite pedophilia problem."

Wall Street Journal confirms existence of "Pizzagate" pedophilia network.

PEDO PLATFORM: Instagram's algorithms promote a "vast pedophile network," report reveals.

Sources include:

NBCNews.com 1

NBCNews.com 2

Brighteon.com