If you’ve been keeping an eye on the technology headlines, you probably already know that Meta, the parent company behind Facebook, Instagram and WhatsApp, gets sued regularly. Over the years, they’ve faced legal challenges over privacy violations, mishandling of children’s data and plenty more. States and school boards in the US have sued, and countries in Europe have sued. Let’s just say that the in-house legal counsel at Meta stays busy.
So, if you’ve been keeping an eye on the technology headlines, you likely aren’t surprised to see another story about another lawsuit against Meta. But, I think this latest one is notable for a few reasons. First, the scale: there are currently 41 states plus the District of Columbia suing Meta. This includes one lawsuit filed in federal court in California on behalf of 33 states and nine additional suits filed in individual states. Second, this suit summarizes a lot of the issues that kid-tech critics have been talking about for some time.
Notably, the complaint accuses Meta of prioritizing its bottom line over children’s safety. It states: “Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its social media platforms.”
Personally, I’m glad to see Meta’s profits and business model explicitly called out. I’ve written before about the way money directly shapes the tech we use every day. The way a tech company generates revenue influences pretty much every aspect of its business, including product design. The lawsuit aims to draw a direct line between the company’s hunt for profit and the manipulative features it designs, stating “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
The language puts a spotlight on persuasive design. In a previous newsletter, I took a deep dive into this topic, and there’s one quote from child psychologist Richard Freed that’s worth revisiting. As he explains, “The formula is that in order to have behavioral change, you need motivation, ability, and triggers. In the case of social media, the motivation is people’s cravings for social connection; it can also be the fear of social rejection. For video games, it’s the desire to gain skills and accomplishments. Ability basically means making sure that the product is remarkably easy to use. Finally, you add triggers, which keep people coming back. So those videos you can’t look away from, the rewards you get inside an app when you use it longer, or the hidden treasure boxes in games once you reach a certain level—these are all triggers, put there as part of the persuasive design.”
It’s easy to recognize these mechanisms in Meta’s platforms. Triggers come in the form of algorithms that serve up toxic content you can’t look away from. Rewards come in the form of likes, followers and comments. And there are literal triggers, like constant notifications at all hours. Of course, all these things are design decisions that Meta made intentionally. They’ve deployed problematic features and then thrown up their hands and cried “unintended consequences” when users are harmed.
The recent lawsuit aims to hold Meta accountable by seeking financial damages, restitution and changes to the practices that violate the law. I’m pretty dubious about fines for Big Tech, which have historically been too small to be a real deterrent. But it would certainly be noteworthy if Meta stopped violating laws meant to protect children. (Yes, these laws do exist, and they’re actually pretty good.)
But history also has something to say about these laws and their enforcement. Remember when Google paid an eye-watering $170 million fine for violating children’s privacy on YouTube? And agreed to all sorts of changes to keep kids safe? And then just sort of kept on violating children’s privacy?
So, while it’s nice to see this shift in momentum, I don’t think this lawsuit will be any kind of silver bullet that fixes the internet for our children. More than likely, big fines will be levied and promises will be made. Meta might add to its list of “30 tools” intended to protect kids and teens, but I don’t expect them to change in a meaningful way just because 41 states sued them.
But maybe, this lawsuit can add to the subtle but palpable shift in public sentiment. Maybe, if enough of us keep making noise, demanding change and holding Big Tech to account, we’ll finally start to get somewhere. In the meantime, there’s hope in the form of apps and platforms that respect children and bring families together—without compromising their safety, privacy or wellbeing. I started Kinzoo because I believed that children deserved better from technology, and I know there are other companies out there that believe the same.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
You can find more on the particulars of the lawsuit in this article here. Interestingly, a lot of the analysis notes a particular moment that seemed to galvanize critics: when Meta announced plans to launch Instagram for Kids, it woke a lot of people up to the dangers these platforms pose for younger users.
The lawsuit hinges on the claim that Meta designed social platforms that are addictive. But are they really? So far, experts haven’t been able to answer that question conclusively, so it’ll be interesting to see how this shakes out. In this article, David Greenfield, a psychologist and founder of the Center for Internet and Technology Addiction, points out that the platforms use powerful tactics, much like a slot machine: “As with a slot machine, users are beckoned with lights and sounds but, even more powerful, information and reward tailored to a user’s interests and tastes.”
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Roblox is a popular platform for young kids with many user-generated experiences for users to explore. In the wake of the war in Israel, a virtual pro-Palestinian rally has been attended 275,000 times on the platform. The company has said that, “while our Community Standards allow for expressions of solidarity, we do not allow for content that endorses or condones violence, promotes terrorism or hatred against individuals or groups, or calls for supporting a specific political party. We have an expert team of thousands of moderators along with automated detection tools in place to monitor our platform and will take swift action against any content or individuals found to be in violation of our standards.”
In my last newsletter, I wrote about how misinformation and disinformation are spreading in the wake of the war in Israel. As it turns out, a handful of “misinformation superspreaders” are taking advantage of the recent changes at X, the platform formerly known as Twitter, to push mass amounts of false or misleading information into the digital world.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you’re interested in learning more about social validation and the way that likes affect kids, my team dives deep into the topic here.
Way back when, I wrote about Instagram for Kids and how its failed launch might inadvertently turn out to be a good thing for children’s tech. Here are my thoughts on that topic.
If you want to know how I really feel, TechCrunch published an opinion piece of mine back in October 2021 on Instagram Kids, after Meta announced its plans to develop a pint-size version of the app.