Four years ago, when I published my book Screen Captured, I argued that Meta’s products were dangerous for kids. My young daughter had just asked to join Facebook Messenger Kids, and when I checked out the platform, I didn’t like what I saw. And the closer I looked, the more alarmed I became. Next thing I knew, I’d drafted an entire manuscript on the subject. I described in detail how Meta’s business model and product design put younger users at risk. It appears that the 41 US states suing Meta, alleging that Instagram and Facebook are addictive and detrimental to children, are in agreement. In fact, as I read through all 233 pages of the latest lawsuit against Meta (truly thrilling material), I had a serious sense of deja vu. All the things I’d written about—the questionable design, the relentless pursuit of profit—were captured in the pages.
Since publishing my book, I’ve continued to write about kid-tech in this newsletter. After focusing on the subject for so long, I’ve noticed a theme: the business models that companies choose influence the products they build and the features they design. The high-level decisions dictate the granular ones. And what I found most striking in the lawsuit is how the plaintiffs connected these dots, too. They homed in on Meta’s business model and pursuit of profit—and made the case that this informed granular things like individual features. They also included compelling evidence that Meta knew kids under the age of 13 were using its platforms—and was even trying to court them—while publicly claiming that those platforms weren’t designed for kids. I’ll spare you too much of the legalese, but I’d like to look at a couple of the points from the lawsuit—which happen to be arguments I’ve been making for years.
On the high-level side, the lawsuit says of Meta: “Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its Social Media Platforms. It has concealed the ways in which these Platforms exploit and manipulate its most vulnerable consumers: teenagers and children.” I’m glad to see the lawsuit draw a direct line between profit and product design. I’ve been writing for a while about the ways that a company’s business model impacts just about everything a tech company does.
I’ve devoted a newsletter to the way money shapes the tech we use every day, where I described how Meta’s business model led them to design products that keep us scrolling for hours. I argued that “platforms designed for compulsive engagement aren’t great for adults, and they definitely aren’t appropriate for children.”
I also wrote about the dollars and cents of app design, where I posed this question: "Why do you think your news feed on Facebook never ends? It’s not because there is so much high-quality content that you absolutely need in your life; it’s because reaching the end of your newsfeed would be a natural stopping cue, and a reason for you to close the app. This seemingly harmless feature isn’t there because it serves you, the user. It’s there because it keeps you scrolling ad infinitum.” It seems like the plaintiffs also agree that Meta has certain motivations when it designs its products the way it does. And those motives all come back to money.
I was also struck by the way specific features were called out. For example, the plaintiffs allege that “Meta designed ephemeral content features in its Social Media Platforms to induce this sense of FOMO in young users.” I’m pretty much always suspicious of Meta’s motives when it releases new features, especially when those features appear to be targeting kids. I addressed Instagram Stories in my book and argued that Meta’s real intentions were different than the ones it shared in blog posts and press releases.
While Meta claimed these ephemeral features were intended to protect its users’ mental health, I suspected something different. I said, “Companies have leveraged the dopamine reward cycle as well as FOMO (Fear of Missing Out, which is driven by our desire for belonging and acceptance) to keep us highly engaged with their platforms. The same will hold true for our children if they are exposed to the same features.”
I’m glad to see the lawsuit address specific features as well, and make arguments about how those features are designed for Meta’s benefit. By tying the lawsuit to business model and product design, the plaintiffs make it harder for Meta to evade accountability. In the past, Big Tech companies have relied on the old argument that they’re not responsible for the toxic content on their platforms because they’re not technically publishers. But while Meta may not be technically responsible for the things its users post, it most certainly is responsible for its chosen business model. And it’s responsible for the way it has designed products with the express purpose of drawing in younger users.
Meta also repeatedly fell back on its Terms of Service, which state that users must be 13 years of age or older. (Whether 13 itself is too young to be using social media is a legitimate question, but one I’ll leave aside for now.) The company would often throw up its hands and claim that it didn’t design its platforms for kids, and that therefore the legislation meant to protect children online didn’t apply to it. The lawsuit mentions this as well: “Meta publicly denies what is privately discussed as an open secret within the company: that very young children are a known component of Meta’s user base and business model.” And there’s mounting evidence that Meta was intentional in courting younger users. The lawsuit also cites an internal email from a Meta product designer that says, “[s]hort summary is the ‘the [sic] young ones are the best ones.’ You want to bring people to your service young and early.”
I unpacked this idea in my book when I wrote, “Because of their Terms of Use, social media, gaming, and content sites can operate outside of COPPA, which doesn’t require nearly as much parental control, verification, monitoring and reporting. The platforms get to capitalize on the fact that children under 13 are included in their monthly user counts—and they benefit from the improved metrics, higher valuations and increased ad dollars that come with it.”
While the contents of the lawsuit are alarming, it’s also encouraging to see all these points laid out so methodically. I found it heartening to see this constellation of arguments I’ve been making for years in this official, on-the-record format. As the lawsuit progresses, it’ll be interesting to see what other evidence comes to light—and what creative methods Meta employs to counter the claims. I have several predictions for the outcome, which I’ll share in my next newsletter.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
If you’d like to take the deepest dive of all, here is a link to the argument submitted by the Attorneys General. This lays out what I’ve outlined above in much greater detail.
I wrote my book, Screen Captured, to help empower families to get the most out of technology while avoiding the worst of it. If you’re interested in learning more about the methods Big Tech uses to draw users in, but you don’t want to delve into the lawsuit, this is a parent-friendly piece of reading.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Another lawsuit against Meta filed in New Mexico state court alleges that Facebook and Instagram function as a marketplace for child predators, and that the company has failed to stop children younger than 13 from joining the platform. One of the most damning allegations is that Mark Zuckerberg is “personally responsible for product decisions that aggravated risks to children on Meta’s platforms.”
Earlier this year, the Wall Street Journal reported that Instagram had a child predator problem—and that its own features were connecting them to a marketplace for buying, selling and trading child sexual abuse material. At the time, regulators in the EU warned Meta of “heavy sanctions” if they couldn’t clean the problem up. Fast forward, and now the regulators are requesting information from Meta on their response. The question at stake is whether Meta is complying with EU regulations to protect children. It’ll be interesting to see what kind of grade they receive.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
StarCraft is a hugely popular real-time strategy game and it’s popular among kids as well. My team wrote an in-depth parent’s guide to answer your most burning questions. Check it out here.
And, if your kids are more interested in massively multiplayer online role-playing games, my team also has a parent’s guide for World of Warcraft. Find tips and advice here.