Snap loves to tell us that they’re not a social media company, that they’re something else entirely: a platform designed to foster authentic connections and keep us in the moment. But Snap’s design choices have inadvertently created a space where predators can operate with alarming ease. And the platform’s push to grow network connections at all costs has created an environment where teens and tweens are even encouraged to connect with harmful actors who can exploit them without leaving a trace.
Today, I’d like to talk about Snap and the lawsuit from New Mexico, but also zoom out a bit to the bigger issue: how platforms like Snap, TikTok and Instagram operate, and why that creates such a risky environment for kids.
New Mexico’s lawsuit is calling out Snap for putting growth ahead of safety. The lawsuit alleges that Snap didn’t disclose or address design features that make the platform attractive to bad actors. One of the most eye-opening facts from Snap’s own data is that they received 10,000 reports of sextortion each month. It’s not an isolated issue—it’s systemic.
And, according to a separate internal analysis of known instances of sextortion, 70% of victims never reported the abuse. The 30% who did? Nothing happened. Internal chats cited in the lawsuit indicated that Snap’s safety staff knew that “by design, over 90% of account-level reports are ignored today and instead we just prompt the person to block the other person.” This paints a picture of a platform whose sextortion problem is badly under-reported, and when reports do come in, Snap fails to address them in a meaningful way.
Snap’s platform is often used as a closed messaging app between friends, which gives parents a sense of security. But here’s the catch: Snap and other social platforms are built on the concept of a social graph. That’s the map of your connections, and these companies are incentivized to get you to connect with more and more people. So while kids might use Snap to message their friends, the platform is constantly asking, “Do you know this person? What about that person?” They’re always nudging users to grow their networks, which is where the danger comes in.
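If the “social graph” idea feels abstract, here’s a tiny sketch in Python of how a friend-of-friend recommender keeps nudging a network to grow: anyone two hops away from you becomes a suggestion. This is purely illustrative; the account names and data are made up, and it is not Snap’s actual code.

```python
from collections import defaultdict

# A toy social graph: each account maps to the set of accounts it's connected to.
# All names and connections here are invented for illustration.
graph = defaultdict(set)

def connect(a, b):
    """Record a mutual connection between two accounts."""
    graph[a].add(b)
    graph[b].add(a)

def suggest_friends(user):
    """Return 'people you may know': friends of friends not already connected."""
    friends = graph[user]
    suggestions = set()
    for friend in friends:
        suggestions |= graph[friend]       # everyone two hops away
    return suggestions - friends - {user}  # drop existing friends and the user

# Build a tiny network.
connect("teen_a", "teen_b")
connect("teen_b", "teen_c")
connect("teen_b", "stranger_x")  # one friend accepted an unknown adult account

print(suggest_friends("teen_a"))  # {'teen_c', 'stranger_x'}
```

Notice what happens when one teen accepts a single unknown account: that stranger immediately starts showing up in their friends’ suggestions. That’s the growth loop at work.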
The bad actors know exactly where young people hang out, on Snap, TikTok, and Instagram, and they take advantage of these platforms’ design flaws. While Snap claims it "fixed" the Quick Add feature in 2022 so that it would stop recommending that teens connect with adults they don’t know, a 2023 internal test found that minors were still being introduced to adult strangers. It’s a flaw that should’ve been addressed more seriously, but Snap didn’t make it a priority, allegedly because addressing child grooming on a larger scale would create “disproportionate admin costs.” That’s a hard pill to swallow, especially when kids are at risk.
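For contrast, here’s an equally hypothetical sketch of what an age-aware guardrail on those same suggestions could look like. To be clear, the ages, the mutual-friend threshold, and the helper names below are my own assumptions for illustration; the lawsuit describes how Quick Add behaved, not how it was coded.

```python
# Hypothetical guardrail on the sketch above (reuses graph and suggest_friends).
# The age data and the mutual-friend threshold are assumptions, not Snap's logic.

MIN_MUTUALS_FOR_ADULT_SUGGESTION = 3  # invented threshold for illustration

ages = {"teen_a": 14, "teen_b": 15, "teen_c": 14, "stranger_x": 34}

def mutual_count(a, b):
    """Number of connections two accounts have in common."""
    return len(graph[a] & graph[b])

def filtered_suggestions(user):
    """Friend-of-friend suggestions, minus adult strangers for minor users."""
    candidates = suggest_friends(user)
    if ages[user] >= 18:
        return candidates
    return {
        c for c in candidates
        if ages[c] < 18 or mutual_count(user, c) >= MIN_MUTUALS_FOR_ADULT_SUGGESTION
    }

print(filtered_suggestions("teen_a"))  # {'teen_c'}: the adult stranger is filtered out
```

The point isn’t that this is how the real fix works; it’s that the recommendation engine and the guardrail live in the same place, which is why design choices matter so much.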
What’s even more frustrating is how Snap handled the issue of kids sending inappropriate photos. They didn’t want to tell kids outright to stop because they thought it wouldn’t work, so instead they offered harm-reduction tips, essentially a guide on how to “safely” engage in risky behavior. On one level, I get it: telling kids “just don’t do it” rarely works. But offering instructions on how to “safely” sext when you know sextortion is such a huge issue? That’s a head-scratcher. It’s like knowing there’s a fire and handing out fireproof gloves instead of just putting out the flames.
The truth is, these platforms face a tough balancing act: they want to grow, which means encouraging more connections, but they also need to protect their users, especially their youngest ones. Until that balance shifts, protection starts with parents educating kids on how to handle these situations, because the platforms aren’t going to do it for us. Worse, the platforms promote the idea that kids will magically know how to handle the digital world the moment they turn 13. The reality is that kids need digital on-ramps: a gradual easing into these environments, along with the tools they need to stay safe.
Snap is just the latest in a long line of platforms that have been caught putting profit over people. And while kids use it as a closed messaging app, Snap is still designed to expand your social network, sometimes at the expense of safety. That risk is baked into the platform itself, and unless something changes, both in how these companies operate and in how we prepare our kids, this cycle will just keep repeating.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
New Mexico’s lawsuit alleges that Snap ran afoul of the state’s laws against deceptive trade practices. According to the suit, the company made misleading public claims about the prevalence of toxic content on Snapchat while its own internal research showed that younger users frequently encountered harmful content. Much like Meta’s research on Instagram’s effects on youth, Snap apparently had information that contradicted the image it was presenting to the public.
The state also accuses Snap of developing features that posed obvious risks to young users. According to the Wall Street Journal, “Some features—such as allowing users to publicly share the number of days that they had exchanged messages with a particular friend or see when friends were gathered without them—fed insecurities. Others, such as QR codes that made it easy to spread users’ contacts or store pictures in a “My Eyes Only” folder inaccessible to a parent, were bound to facilitate both underage sexting and sextortion.”
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
You can expect more vigilance from the FTC in the future. According to an announcement from the agency, they’re going to get even more serious about targeting education technology companies that violate children’s privacy laws. The use of this technology increased dramatically during the pandemic, and now, the gravy train is coming to an end for companies “bending current children’s privacy law to have parents opt into sharing a buffet of data, including to advertisers.”
Lots of people have theories about the evolution of the internet, and one of them is called the “dead internet theory.” It predicts that a significant portion of the internet will be dominated by automated bots, fake content, and AI-generated responses rather than real human interaction. And guess what? You can now experience the dead internet with a new social network called SocialAI. “At a glance, SocialAI — which is billed as a pure “AI Social Network” — looks like Twitter, but there’s one very big twist on traditional microblogging: There are no other human users here. Just you. In a nutshell, SocialAI lets you share your thoughts with an infinite supply of ever-available AI-powered bots that can endlessly chat back.” If you ask me, this is a dystopian scenario that completely sacrifices connection for the sake of engagement and validation.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
For a closer look at Snapchat’s safety for children, check out our parent’s guide here.
And if you’re curious about the AI integration in Snapchat, check out this article from my team.