Back in 2024, we learned something surprising: nearly half of Gen Z wished social media platforms like X and TikTok didn’t exist. And in January 2025, they got a taste of that world. For a few hours one weekend, TikTok disappeared. Parents, teens and influencers alike woke up to a world without the app. And while the ban was quickly reversed, it gave us a glimpse into something deeper: the complicated relationship young people have with social media.
The ban that came and went
So, how did we get here? Lawmakers had been scrutinizing TikTok for years, citing concerns over national security and data privacy. That’s because it’s owned by ByteDance, a Chinese company. The worry was that TikTok could be forced to share user data with the Chinese government, posing a risk to millions of American users. This concern led to legislation that required ByteDance to sell TikTok to an American-owned company or face an outright ban. When no sale was finalized, access to the platform was temporarily blocked in the US, triggering a brief TikTok blackout. It came back online because a new executive order from Trump extended the deadline for a forced sale, giving ByteDance more time to negotiate a deal rather than face an immediate shutdown.
For most teens, losing TikTok isn’t about geopolitics. It isn’t about data privacy. It’s about suddenly losing the platform they spend hours on every day—sometimes against their better judgment.
Teens are stuck in a love-hate relationship with social media
That 2024 Harris Poll deserves a closer look. Nearly half of Gen Z wished social media platforms like X and TikTok didn’t exist—yet the same survey found that more than half of respondents spent four or more hours a day on those very platforms. This contradiction speaks volumes. Young people recognize the toll social media takes on their well-being—but walking away from it doesn’t seem possible. Why? Because these platforms aren’t just entertainment. They’re engineered to be addictive—and for most teens, their entire social network is partaking.
But TikTok is especially dangerous for the young users it hooks, for a few reasons—none of them related to international espionage. The platform thrives on social validation, where likes, views and follows create an endless cycle of self-worth tied to engagement. Its algorithm is designed for endless scrolling, ensuring there’s always another video waiting, making it incredibly difficult to put the phone down. TikTok’s ability to serve content that keeps users hooked is remarkably effective. That’s good for the company’s engagement metrics. But it’s not great for youth mental health. And while TikTok claims to care about child safety online, its actions don’t always live up to that claim.
A history of failing kids
TikTok has repeatedly been in hot water over child safety. Like many social media companies, it has violated children’s online rights—and over the years, it has racked up significant penalties worldwide for failing to protect young users.
In 2019, TikTok was fined $5.7 million in the U.S. for illegally collecting children’s data. In 2023, the UK penalized the company £12.7 million for failing to safeguard children’s privacy, while European regulators hit it with another €345 million fine for weak child data protection. Then in 2024, TikTok was fined £1.875 million for misleading Ofcom about its parental controls. TikTok isn’t designed with children’s well-being in mind—it’s designed for engagement at any cost. And all too often, the cost is our kids’ mental health.
The bigger problem: kids feel trapped
Kids and teens aren’t just mindlessly scrolling—they’re feeling the weight of social media. They have a sense that these platforms aren’t good for them, but FOMO makes quitting unrealistic. When everyone is on TikTok, not being on TikTok feels isolating.
So when TikTok vanished for that one weekend, any teen secretly wishing for a permanent escape was left short-changed. It gave them a quick taste of what a world without TikTok could feel like, and then suddenly, the app was back again.
I don’t think a full ban is the answer (and clearly, it didn’t stick), so what can we do? Social media platforms shouldn’t be engineered to exploit kids’ time and attention, but that’s exactly what they do. More regulation, stricter age verification and real consequences for violating child safety laws are long overdue. And at the same time, we can’t just tell our kids to “use social media less.” It doesn’t work like that. They need the skills to recognize manipulative design and set their own limits. And where possible, we should seek out alternatives that prioritize kids’ well-being, rather than engagement at all costs.
The TikTok ban may have been short-lived, but it exposed something bigger. Our kids don’t just want better—they need it. It’s on us to demand change, guide them through the mess and build a future where tech is designed with their well-being in mind, not just their screen time.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
According to some users, the vibe on TikTok felt “off” when the platform returned after its brief hiatus. Some have even suspected the company of manipulating the algorithm to prioritize posts that are critical of Meta. But other users had noticed a shift in the platform even before it went offline. One creator told the Washington Post that it had become harder to reach a larger audience and that TikTok Shop was making the app feel more like an e-commerce machine than an entertainment engine.
Just after the shutdown and subsequent return, The New York Times ran an op-ed written by a high school student. Not surprisingly, the teen is a heavy TikTok user—but they actually welcomed the ban. They explained that they don’t want to scroll the app all the time, but feel powerless to stop using it on their own. And this isn’t an uncommon sentiment among teens.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Zigazoo bills itself as a safe social media app for children, but its latest feature seems to be a recipe for anxiety. The app recently launched leaderboards—or public rankings for users with the most likes on their posts. Here’s how they’re selling it to kids: “Compete for the top spot by posting videos and bringing in the likes!! Click the "Leaderboard" tab on the control panel to check it out now!!” This kind of public comparison puts pressure on kids to perform and teaches them to value social validation metrics over actual connection. Personally, I don’t think a public display of likes belongs in a kids’ app.
At this point, we all know that AI is far from perfect. One of the big problems? It tends to make things up. That’s why it’s not entirely surprising that Apple is pulling its AI-generated news summary feature. Vice reported on some of the made-up summaries: one fake Apple News alert featured an AI-generated headline claiming that Luigi Mangione, who’s been accused of killing UnitedHealthcare CEO Brian Thompson, had shot himself (he hasn’t). Another fake alert, falsely attributed to the New York Times, claimed Israel’s prime minister, Benjamin Netanyahu, had been arrested (he also hasn’t been). Of course, AI is a powerful tool, but it should only be deployed in ways that make sense. This clearly wasn’t one of them.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
Block Blast is a popular game among kids—but what’s with the 17+ age rating? My team put together a parent’s guide with all the ins and outs.
Thinking of getting a Meta Quest for your kid? Here’s a parent’s guide with everything you need to know about setting it up safely.