Earlier in March, we learned some disappointing—but not surprising—details about the cavalier attitude of Big Tech CEOs toward youth mental health. Newly unredacted court documents gave us more insight into what executives at Meta, TikTok and other tech companies knew about the harmful effects of their platforms. We got a peek at the way these teams prioritized growth over wellbeing—and it’s not a good look.
The revelations came as part of a lawsuit in Oakland about social media addiction. The case includes several complaints from across the country that have been filed on behalf of youth and young adults who allege they suffered depression, anxiety, sleeplessness and disordered eating because of Facebook, Instagram, TikTok, Snapchat and YouTube.
According to the allegations, these companies made products with algorithms designed to addict, and the plaintiffs claim that employees and leaders understood the risks—and deployed these products anyway.
According to the filings, a Meta employee said in 2021 that “no one wakes up thinking they want to maximize the number of times they open Instagram that day. But that’s exactly what our product teams are trying to do.” The documents also reveal that executives at TikTok knew young people are more susceptible to dangerous viral challenges on the platform because teens’ ability to weigh risk isn’t fully formed.
I’m not surprised by any of this. I bet most people aren’t. These companies design their platforms for the highest possible engagement, and they exploit every psychological trick in the book to achieve it—no matter who gets hurt. When platforms make their money from advertisers, they design products in a very particular way. They create features that keep you scrolling, and they do everything they can to literally maximize the number of times you pick up your phone and open the app. We don’t need a court filing to confirm any of this. We can just look at the apps themselves.
If I were conducting my own trial of these social media companies, my first piece of evidence wouldn’t be a document. It would be the humble Like button—a feature that’s now synonymous with social media. That little thumbs-up, which debuted on Facebook in 2009, was one of the first features to really up the ante. I believe this was the moment product design started exploiting human psychology in earnest. Much like a slot machine pays out a prize, a Like on social media gives users a quick dopamine hit, as well as a public scorecard for popularity.
Not only did the Like button shift the social experience, it also drove significant growth for Facebook. From 2009 to 2011, monthly active users on the platform increased by double digits. In short, Facebook unlocked the kind of hockey-stick growth that makes tech investors drool. Other factors likely contributed to the explosive increase, but I believe the Like button was a major driver. It also helped supercharge Facebook’s algorithm by rewarding users for sharing their own content, which ultimately enticed them to stay on the platform longer.
The Like button also gave us a new way to judge our own popularity and compare ourselves to others. Users’ brains were conditioned to anticipate, and even crave, the positive affirmation that comes from posting content and collecting Likes. With this one feature, Facebook began solidifying the feedback loop that makes the platform so compelling. And the Like button helped create the pressurized atmosphere of comparison that’s so damaging to kids’ wellbeing.
We can point the finger at damaging algorithms and divisive content, but we can’t talk about the way social media harms children without looking at design features like the Like button. I worry about the social validation that comes from likes (and the feelings of rejection that come from silence). I worried so much, in fact, that I founded my own tech company to build better platforms for kids. I didn’t want my own children subjected to that kind of online atmosphere. I wanted them to have access to the best of technology, not platforms that manipulate them to increase daily active user counts.
The more we learn about social media and its impact on kids, the worse it looks for the people who designed these platforms. My guess is that, years from now, we’ll have the data to confirm beyond a reasonable doubt that the social validation features inherent in social media were a detriment to youth mental health. And my hope is that we have better, healthier tools to offer children—ones that are designed with their wellbeing in mind.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
More details about the unredacted court filings can be found here. This article touches on the different arguments plaintiffs are making about the social platforms and explores the evidence about what exactly the executives knew.
A lot of focus has been placed on teen girls and their relationship to social media because their mental health seems to be faring worse than teen boys’. Common Sense Media just released a new report on teen girls with stats on how they really feel about specific aspects of social media, like public accounts, endless scrolling and appearance-altering filters. These insights could be valuable to the platforms and help guide them toward healthier designs—if only the companies were willing.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Did you ever watch Jim Carrey in The Truman Show? For those who aren’t familiar, it’s a movie about a man whose entire life has been secretly filmed and broadcast as a TV show to millions of viewers around the world. I couldn’t help but think about Carrey’s Truman Burbank when I read this article on kid influencers—and how they feel when they realize that their entire lives have been posted online as content.
A new law is on the books in Utah, and it promises to open a big can of worms. According to the New York Times, the law “could dramatically limit youth access to apps like TikTok and Instagram, potentially upending how many minors in the state use the internet.” Crucially, it requires some serious identity management on the part of social platforms to confirm the ages of users—and that could be a major detriment to privacy. Stay tuned to see how this plays out.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
TikTok has been under a lot of scrutiny lately, and as with other social platforms, there’s a lot of concern over the way it affects younger users. In an attempt to get ahead of the bad press, the company released a new feature imposing screen-time limits on users under 18. But, as my team points out, the new limits are more like gentle suggestions. Learn more here.
BeReal is a social platform that’s meant to offer a de-pressurized way to connect and share online. My team took a look at the app to see what it’s all about. (Spoiler: it’s not totally de-pressurized.) Check out our parents’ guide here.