As a dad and the founder of a kid-tech company, I have a vested interest in seeing platforms like Instagram create safer spaces for teens online. So, while it might sound weird, I felt a rising frustration when I saw Meta rolling out new safety tools for teens last week. On the surface, it sounds like a win for families. But if you dig a little deeper, it's hard not to ask: why now?
Meta said the changes would limit how much time teenagers spend online, what content they see, and who can find and contact them. They’re also giving parents better visibility into the people their kids are talking to and the posts they’re scrolling past, all while letting them set limits. And importantly, new profiles for children ages 16–18 will default to private, and any existing public ones will be switched over in the next few months. Oh, and Instagram plans to stop sending notifications to minors between 10 pm and 7 am.
All of this raises the question: why was Instagram ever sending middle-of-the-night pings to children? Why did their accounts default to public in the first place? And why the renewed focus on youth wellbeing at this particular moment?
The truth is, Meta—like most Big Tech companies—only makes changes when the pressure mounts high enough. Whether it’s public outcry, whistleblower revelations or a new study exposing the dangers their platforms pose, these companies don’t protect kids unless they’re forced to.
And the pressure is mounting again right now. Remember when the Surgeon General called for social media warning labels? Well, the attorneys general of a majority of states have since backed the idea, calling on Congress to pass legislation requiring warning labels on social media platforms. When that many states’ top legal officers speak up, it’s just a matter of time until Meta releases some updates intended to better protect teens. Then add in the pending lawsuit against Meta by 43 states, and you have intensifying regulatory pressure on the company to better protect youth. The platform has a history of reactively making changes to combat bad press, and this is no exception.
My frustration stems from the fact that these new tools feel like an acknowledgment of the problem: Instagram knows it hasn't been keeping kids safe. As a dad, I feel these safety features should have been there from day one. The need for "new" protections is really an indication of an ongoing issue, one that we, as parents, have known about for a while. Social media platforms are built on engagement, and keeping teens scrolling is a key part of their business model. Unfortunately, more engagement means more risk, and in the past, those risks haven’t been mitigated until the company’s hand is forced.
I’m not saying that Instagram’s changes aren’t a step in the right direction. But as someone who spends a lot of time thinking about how to make tech safer for kids, I can’t help but wonder why it took them this long. When we set out to design our kids’ messenger app at Kinzoo, we prioritized safety and wellbeing from the outset. It’s absolutely possible to design tech that’s healthy for kids—but you need to make it a priority.
Maybe the pressure is finally catching up with Meta, but this reactive pattern speaks volumes about the priorities of these platforms. At the end of the day, these companies know how to make their platforms safer; they just don’t always have the right incentives to do so. And while it’s great that they’re adding new features to protect our children, we have to stay vigilant and keep holding them accountable. Because if we don’t, they’ll just keep waiting until the next crisis to act.
Our kids deserve more than reactive measures. They deserve platforms that prioritize their safety from the start. I believe it’s up to us to demand that from the companies shaping their digital lives.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
Meta doesn’t have a great track record, and most experts don’t trust their motives when it comes to keeping kids safe. Zvika Krieger, a former director of Meta’s responsible innovation team, says, “I don’t want to say that it’s worthless or cosmetic, but I do think that it doesn’t solve all the problems.” I agree, especially since kids can circumvent these settings simply by lying about their age.
Attorneys general from many states, including New York, California, Florida, Oregon, and Michigan, have signed a letter calling for a warning label on social media. While I understand the urge to do something, anything, to help protect children online, I still wonder what a warning label would look like, given that the research on the negative effects of social media is not conclusive.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Discord just announced that audio and video chats within the platform will now be end-to-end encrypted. While this kind of technology can be a nice privacy feature for adults, it poses real risks when it comes to child safety. That’s because no one other than the people involved, not even Discord itself, can read the content of the chats, which also means end-to-end encrypted channels can’t be moderated. This is especially troubling given that bad actors have been drawn to Discord to connect with each other and exploit victims, and the platform has already struggled to combat this.
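If you’re curious why moderation becomes impossible, here’s a minimal sketch of how end-to-end encryption works in general, written in Python with the PyNaCl library. The names and message here are purely illustrative, and this is a simplified stand-in, not Discord’s actual protocol; the takeaway is that the platform’s servers only ever handle ciphertext they cannot read.

```python
# A minimal sketch of end-to-end encryption (pip install pynacl).
# Illustrative only -- not Discord's actual protocol.
from nacl.public import PrivateKey, Box

# Each participant generates a keypair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet in the voice channel at 8")

# This is all the platform ever sees: random-looking bytes it can
# neither read nor moderate.
print(ciphertext.hex()[:32], "...")

# Only Bob, holding his private key, can decrypt the message.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))
```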
Jonathan Haidt is back with some new survey numbers. A team went out and asked Gen Z respondents whether they wished certain platforms and technologies hadn’t been invented. For some, like YouTube, Netflix, smartphones, and the internet itself, most respondents didn’t wish them away. But the numbers rise sharply for social media platforms: Instagram (34 percent), Facebook (37 percent), Snapchat (43 percent), and the most regretted of all: TikTok (47 percent) and X/Twitter (50 percent). What this tells me is that people don’t see technology itself as the problem; rather, it’s specific platforms.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you’re feeling especially nervous about Discord now, you’re not alone. Here’s a piece from my team about why you might want to search for an alternative for your kid.
And, if your kids are interested in TikTok, you might have some questions about how to keep them safe. Here’s a parent’s guide to help you out.