Meta has run afoul of the Federal Trade Commission. Again. You're not alone if you’re experiencing a sense of déjà vu. This isn’t the first time the FTC has set its sights on this particular tech company. But the regulator’s latest proposal is an escalation beyond the usual multimillion- and billion-dollar fines. This time, the FTC is proposing to bar Meta from monetizing kids’ data altogether.
Here’s the background: back in 2020, Meta agreed to a $5 billion settlement for privacy violations—and they also agreed to undergo an independent privacy assessment. Well, that assessment didn’t go too well. It found “several gaps and weaknesses in Facebook’s privacy program” that posed “substantial risks to the public.”
The FTC also alleges that Meta violated the Children’s Online Privacy Protection Rule by misrepresenting the parental controls on its Messenger Kids app. Facebook promised the app would only connect kids with approved contacts, but that wasn’t always the case: some kids were able to connect with strangers without their parents’ approval.
So now, the FTC is sufficiently fed up. They’re proposing stronger restrictions that would apply to all of Meta’s platforms, including Facebook, Instagram, WhatsApp and Oculus. The new terms would include a blanket ban on monetizing data from any user under 18. It’s worth noting that Meta disputes these allegations. A company spokesperson has called the new FTC proposal a “political stunt.” From where I’m sitting, it looks less like a political stunt and more like a good idea.
I believe that respect for children’s privacy should be a given. When we first started Kinzoo, our business plan revolved around this exact issue. We thought it was important to give parents an alternative to Facebook Messenger Kids because Meta has never (not once) given us a reason to believe that they’ll respect anyone’s privacy. I wasn’t comfortable signing my kids up for that. Meta is ultimately not motivated to protect its younger users. When it designed an app for kids, it included a major loophole that let them connect with strangers.
When we first designed our messaging app, we recognized early on that kids might be exposed to strangers through that very same loophole. I remember working through the problem on a whiteboard in our first office. We caught the issue and closed it before launch—all when we had a team of just six people. Creating safer tech for children is a matter of motivation.
It seems like Meta is motivated by profit to the exclusion of everything else. I expect they will push back on this latest FTC proposal and try to maintain the status quo. They will dig their heels in and argue that they’ve closed all the loopholes and don’t monetize kids’ data anyhow.
To be fair, I don’t think their Messenger Kids platform was ever really about monetizing children’s data. I think it’s about training kids while they’re young—and waiting patiently until those kids are old enough to generate revenue. And as a nice bonus for Meta, it keeps parents tethered to the platform as well. After all, kids can’t be on Messenger Kids unless their parents are also on Facebook.
But if the FTC does manage to impose some new rules on Meta, that would be a move in the right direction. In the meantime, we’ll keep working to give parents an alternative. We want to give families tools that make life easier and safer. We want to help them learn and unlock the potential of technology, because we believe that kids deserve better.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
You can find more details on the latest FTC proposal in this article. It gives a good rundown of the different times Meta has run afoul of regulators. And while it might seem like a lot of infractions, keep in mind that the list isn’t even exhaustive.
For a refresher on the infamous Facebook Messenger Kids design flaw, here’s an article about how thousands of kids were able to connect with strangers via group chat. Facebook notified parents via Facebook Message and let them know that all the affected group chats had been turned off.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Speaking of Meta’s motivations, here is a disturbing article about how Facebook and Instagram have become go-to platforms for child sex trafficking. One story in the article stood out in particular for me: apparently, Facebook refused to shell out a $3,000 fee for an expert presentation on how to prevent child sex trafficking. For a multi-billion-dollar company, that certainly speaks volumes.
According to the Surgeon General, children’s declining mental health is the crisis of our time. And he thinks that social media is a major contributing factor. He believes that kids’ use of social platforms has led to more feelings of isolation, stress and inadequacy, as children end up constantly comparing themselves to others. Oh, and it keeps them awake late into the night when they should be sleeping, which doesn’t help anything.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
I recently wrote a newsletter about Snapchat’s questionable new AI feature, My AI. If you’re looking for a more in-depth parent’s guide on the feature, my team put this handy article together.
In a past newsletter, I addressed other design flaws inherent in Facebook Messenger Kids—because the flaw that connected kids to strangers isn’t the only danger.