Please excuse me if the title of this article seems a little clickbait-y. It’s designed to attract attention and entice you to click. But it’s also a bit of an over-promise, something too good to be true.
There’s actually nothing simple about making technology safer for kids. But it really does boil down to two things: systemic changes to the industry and better enforcement of existing laws. Neither is easy, but together they’re our best hope for a safer future. I’ve written about both topics before, and neither strategy will work well in isolation. Today, I want to outline what each of these strategies entails, why they matter, and why we need to do both in order to see any measurable change.
I want to spell out what it’ll take to tackle child safety online because I think it’s important to grasp the scale and complexity of the problem before jumping to solutions. To date, attempts to fix the issue have been piecemeal: band-aid strategies that don’t address the root causes of the challenges we’re facing. And at the end of the day, nothing short of systemic change and proper enforcement will make a real difference for our kids.
Shifting Big Tech’s priorities
Under the status quo, user wellbeing is at odds with Big Tech’s business model. The way tech companies behave—harvesting our data, designing manipulative features, putting children’s safety and privacy at risk—all serves their bottom line. Making their products safer and more private is actually bad for their business. That’s why systemic change is necessary to realign their priorities and reshape how they build their products. These companies need entirely new ways of doing business if they’re going to protect kids’ privacy and safety in a meaningful way.
Most tech companies make their money by harvesting your data and showing you targeted ads. I’ve written before about the effects of this business model. It isn’t just that you see lots of ads; it runs deeper than that, because for the model to work, the platform needs to collect lots of data about you. Reams of it. Serving advertisers keeps the lights on, and platforms can do that job better when they can target you very specifically. So they’re incentivized to keep their daily active user numbers high, and they want you to spend as much time on their platforms as possible, because that’s what generates the data that keeps the profits coming.
That incentive trickles down to product design. Their platforms end up with features that draw you in and cause you to lose track of time. Arguably, platforms designed for compulsive engagement aren’t great for adults—and they definitely aren’t appropriate for children.
So, we need a systemic overhaul. We need to reward businesses that protect children’s safety, not businesses that exploit them.
Enforcement with teeth
Simply put, better enforcement of existing laws would punish companies that put our children in danger. It would give real teeth to the rules protecting kids’ data and privacy. And, it would deter companies from bad behavior.
As it stands, tech companies face fines when they violate existing child safety legislation, but as I’ve written before, enforcement lags far behind. As a refresher: according to a study by the fraud and compliance software company Pixalate, more than two-thirds of the 1,000 apps in the App Store most likely to be used by children collect and share their data with the advertising industry. In the Play Store, the figure is 79 percent.
The Children’s Online Privacy Protection Act (COPPA) has been used successfully to hold big tech companies like Google and TikTok accountable. But it isn’t effectively protecting our children, for two reasons. First, it’s rarely enforced: the majority of the time, when companies violate our kids’ privacy, absolutely nothing happens. And second, even when companies are held responsible under COPPA, the fines tend to be minuscule compared to their bottom lines. When YouTube was hit with a $170M fine for violating children’s privacy, experts estimated that sum amounted to about half a day’s cash flow for parent company Google. And when TikTok was fined $5.7M for violating children’s privacy, parent company ByteDance was worth $75B.
We’ve seen a wide range of penalties in the last few years, including some precedent-setting fines, especially in the EU. Yet many experts argue that the fines are still far too small to make a difference to these companies. Case in point: Facebook’s stock actually went up after its $5 billion fine was announced.
So, we need fines that cut a bit deeper. Companies today treat penalties as “the cost of doing business.” It’s often the better business decision to violate regulations and pony up a fine later on. But if we can shift that status quo and make those fines actually mean something, make them leave a real dent in the spreadsheet, these companies will think twice.
So yes, these are big and complex goals, but they’re the only real path toward safer technology for our children. Even though it’ll be a long, difficult project to make the digital world safer for our kids, it’s a cause we have to undertake.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
Even the FTC agrees that fines on their own aren’t enough to deter Big Tech companies. At a conference earlier this year, FTC Chair Lina Khan had this to say: “We've seen historically firms do treat fines, even fines that sound really large—millions of dollars, even billions of dollars—they can sometimes treat those fines as a cost of doing business, if the underlying illegal tactics that they're engaging in are valuable enough to them.”
At that same conference, Khan also mentioned how Big Tech companies are incentivized by their business models. They make money from our data and attention, so they design their platforms accordingly. Security.org did a deep dive into the data that the big companies collect, and this article spells out all the bits and pieces that each one harvests.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Mr. Beast is one of YouTube’s biggest stars, and he’s embroiled in a scandal around one of his co-hosts. According to Variety, the allegations involve co-host Ava Kris Tyson, who allegedly engaged in “grooming” of a minor. Mr. Beast, for his part, has focused on hiring an independent investigator to look into the issue, but this is the latest in an alarming trend of allegations against notable YouTubers and Twitch streamers, including Dream, CookieLoL, Yandere Dev and Dr Disrespect.
If your kids are into gaming, chances are they’re playing Roblox. Roughly 40 percent of the platform’s users are children, and the company is allegedly fighting a losing battle to keep predators off the platform. According to Bloomberg, “Since 2018, police in the US have arrested at least two dozen people accused of abducting or abusing victims they’d met or groomed using Roblox.”
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
We already know that kids are on Instagram. As a parent, it can feel like a losing battle to try to keep them from signing up, but if your children are already on the platform, here are a few ways to make it as safe as possible.
When families are spread far and wide, technology is a great way for us to stay in touch. But, not all platforms are created equal when it comes to keeping kids safe. Here’s a piece from my team about why you might want to consider an alternative to WhatsApp.