Bradley Cadenhead isn’t a household name, but if you have kids on Discord, you should know his story. At 15, he founded the group known as 764, a name now synonymous with exploitation on a platform designed to connect people. His abuses were truly horrific. I won’t dwell on the particulars, but suffice it to say that they included sexual exploitation, coercion and child pornography. Cadenhead received an 80-year sentence, a punishment that feels monumental for someone so young, but one that reflects the gravity of his crimes. While Cadenhead is behind bars, the real questions are: how did he manage to operate for so long on Discord, a platform that’s home to millions of kids? And why does Discord still struggle to keep predators like him off?
Discord, with nearly 200 million monthly users, is designed to be open—a space where communities can form around anything from gaming to niche hobbies. But that openness is exactly what makes it dangerous for younger users. Kids who join these communities may be looking for people who understand them—and they might find amazing people who share their interests. But all too often, they find people ready to exploit them as well.
Open platforms like Discord, TikTok and Instagram are all about growth. The more people you connect with, the better—for the platform. But for kids, it’s a different story. Even if they’re in private groups, these platforms are designed to introduce them to more people, which increases the risk of coming into contact with bad actors. Closed platforms, on the other hand, limit who you can interact with, creating a safer environment for kids. As a dad, I don’t want my kids to connect with anyone and everyone on the internet. I want them to connect with family and friends. That’s why, when we first designed Kinzoo Messenger, we focused on making invitations safe. Parents can easily keep track of who their children are connecting with—and they can opt to approve all their invitations if they so choose. Designing the platform this way is a challenge for us as a company because we can’t use the viral growth hacks that open platforms can. But, we strongly believe that it’s the right thing to do for kids.
For parents, navigating Discord can be tricky. It’s like trying to decode Slack but without the slick interface. Discord’s parental controls are basic at best, and keeping tabs on what’s happening in private servers or DMs is practically impossible, since parents have no visibility into those spaces. The platform’s complexity makes it incredibly hard for parents to stay on top of what their kids are doing, and the lack of robust safety features means that predators like Cadenhead can slip through the cracks. So what’s the solution to keep kids safer online?
Some experts, like Jonathan Haidt, argue that kids shouldn’t have phones at all, but that’s not realistic. It’s too black-and-white. What we need are better digital onramps, places where kids can safely navigate online spaces without being thrown into the deep end. And right now, that’s not what platforms like Discord are offering.
But there are solutions: platforms can design safer, closed environments for kids. Companies can stop pushing open models where children can easily find themselves interacting with strangers. And they can work harder to ensure that bad actors can’t simply reappear under a new name or email address.
Parents are doing their best, but until these platforms start prioritizing safety over growth, kids will continue to be at risk. It’s time for companies to step up and create real solutions, instead of waiting for the next predator to get caught.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
At its best, the internet can offer a lifeline for bullied teens, giving them a place to connect with like-minded allies and find support from friends. At its worst, it can offer the opposite. That’s sadly what happened with Cadenhead. According to the Washington Post, “Discord, which is popular with gamers, allowed a child shunned by peers in the real world to easily develop a following online by creating private spaces dedicated to his perverse interests and to attract both victims and like-minded predators from around the globe.” All the more reason to be mindful of the kinds of platforms children spend their time on, because some tend to be more toxic than others.
Discord can also be a place where troubled kids dip their toes into violent rhetoric. In the wake of another heartbreaking school shooting, the perpetrator’s online history showed he had been making threats on Discord well before his attack. Police had even questioned him after an anonymous tip tied him to an account threatening to commit violence at a school. What’s worse: records show that other users in the channel encouraged him. Nothing came of the police visit because law enforcement couldn’t substantiate the claim that the account belonged to the boy.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
The short seller Hindenburg Research has released a report on Roblox claiming that the popular gaming platform has inflated its user numbers and failed to protect children. Hindenburg alleges that, after creating a fake account as an under-13 child, its researchers were able to access groups where users were exchanging child pornography, as well as games with inappropriate themes. For its part, Roblox has refuted the claims, but its stock fell over 9% after the report was released.
Snap has just announced that it’ll be bringing ads to users’ Chat tabs. These “sponsored snaps” will appear alongside messages from friends. If a user doesn’t open a sponsored snap, it eventually disappears; if they do open it, they can click a link to learn more, or even reply to the ad. A majority of kids use Snapchat as a messaging platform with their friends, so it’ll be interesting to see how users respond to ads served directly in their inbox.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you’re feeling especially nervous about Discord now, you’re not alone. Here’s a piece from my team about why you might want to search for an alternative for your kid.
And, if your kids are interested in TikTok, you might have some questions about how to keep them safe. Here’s a parent’s guide to help you out.