You can’t turn on the news these days without seeing a story about child safety online. Many experts have ideas for how social media platforms could change to better protect children, but they hinge on one important factor: age verification. If platforms make changes to keep kids safe, they need to know who’s a kid and who isn’t.
The assumption is that certain features in adult apps might be harmful to children’s wellbeing—and therefore, we should keep children away from those particular features. This is not a new concept. Most adult apps and platforms already have an age cut-off in their terms of service, usually 13.
And they already employ some kind of “age verification,” which usually involves a user putting in their birthday. Obviously, this method won’t stop a kid under 13 from signing up for an adult platform if they’re determined. All they need to do is a bit of basic math to adjust their birthday.
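Just to spell out how flimsy that is, here's a rough sketch of what a birthday-based age gate boils down to. (This is purely illustrative Python, not any platform's actual code.)

```python
from datetime import date

def is_old_enough(birthday: date, minimum_age: int = 13) -> bool:
    """Check whether a self-reported birthday clears the age gate."""
    today = date.today()
    age = today.year - birthday.year - (
        (today.month, today.day) < (birthday.month, birthday.day)
    )
    return age >= minimum_age

# An honest kid who is roughly 11 fails the check...
honest = date(date.today().year - 11, 6, 1)
print(is_old_enough(honest))    # False

# ...but shifting the birth year back a decade sails right through.
adjusted = date(honest.year - 10, 6, 1)
print(is_old_enough(adjusted))  # True
```

The gate is only as honest as the birthday typed into it, which is the whole problem.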
But, as the youth mental health crisis intensifies, lawmakers are motivated to do something, anything, to try to support children. One of those strategies is clamping down on social media platforms. That’s because, according to some studies, social media use is correlated with declines in youth mental health. Internal data from Meta suggests that Instagram can be a toxic place for youth, especially teen girls.
And, many of us anecdotally feel like social media might not be the healthiest atmosphere for adults, let alone children. So, lots of lawmakers have set their sights on laws banning children from social media—or requiring parental consent when kids sign up. That’s where age verification comes in.
This is a tricky topic, because verifying someone’s age requires a lot of personal information. On the one hand, everyone likes the idea of the internet being safer for kids, but there’s a rub that I’ve written about before. In a previous newsletter, I argued that some of the child safety bills would force platforms to collect even more sensitive, personally identifiable information from users. This kind of age verification isn’t ideal for most users. I don’t know about you, but I don’t trust Big Tech platforms with my sensitive data.
For their part, Big Tech platforms aren’t thrilled about the prospect either, because asking for more information at sign-up introduces a lot of friction into the process. And if they require a government-issued ID from all users, they have to store all of those IDs securely as well, which isn’t cheap.
There are other ways to go about age verification, like AI facial age estimation, but you’d still need to upload or take a picture—and so far, the FTC has not been keen to allow this as a method of age verification.
All this has caused Big Tech platforms to throw their hands up in defeat and say age verification is not a problem they can solve. This is a convenient stance for them because they don’t actually want to be responsible for age verification. It isn’t good for their business. But as much as I hate agreeing with Big Tech companies, I’m not convinced they should be responsible for age verification either.
If each platform is required to individually verify users’ ages, that means you need to upload your ID every time you create a new account for a new platform. That means several different companies are storing your sensitive information—which is a bit of a nightmare for your security and privacy.
I argue that this would further skew the competitive landscape in favor of Big Tech incumbents. Requiring small startup platforms to verify users' ages through ID checks or AI-driven tools would force those startups to invest in verification tools and infrastructure, diverting resources from the core products that actually solve customer problems. This would present a significant hurdle for new companies. While individuals might be more willing to share their private information with a large, established company—or allow such a company to use personal data it has already collected—many would likely hesitate to share this information with new startups.
Big Tech companies often argue that age verification should be done at the app store level. Apple and Google could verify us once, and then we go on our merry way. This sounds like an elegant solution, especially since Apple and Google are already pretty sure how old we are. Still, it would require offering up even more personal information to a large tech company.
All of this raises the question: what are we supposed to do to make the internet safer for our kids? How are we supposed to enforce rules to keep kids safer online if we can’t determine who they apply to?
Other than leaning into age verification at the app store level (or some yet-to-be-discovered technical solution), there is one other way to address this issue. Some digital safety and privacy experts have the wild idea to just… make apps safer and better for everyone. They suggest that we can bypass age verification by extending the highest safety settings to all users of the internet.
While I think it’s a nice idea in theory, I don’t think we’ll be able to retrofit every app and platform to meet the unique safety needs of kids. I’m not sure how to solve this, but I do know we need to think about the big picture. If we’re dreaming up new laws, we need to be able to enforce them. If we’re setting age restrictions, we need to be able to verify ages. And whenever we’re designing technology for kids, we need to think about safety and privacy from the outset.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
In a first-of-its-kind decision, federal regulators have banned a digital platform from serving users under 18. NGL is a popular app among teens and children—and it was aggressively marketed to younger users despite risks of bullying and harassment. According to the Washington Post, NGL agreed to pay $5 million and stop marketing to kids and teens. The suit also argued that the company violated children’s privacy laws by collecting data from youths under 13 without parental consent. While the company is now required to keep users under 18 from accessing the app, that all hinges on (you guessed it) age verification. It’ll be interesting to see how the company goes about that.
While lots of states have tried to implement age restriction laws for social platforms, they tend to run afoul of individual rights. A judge recently blocked a Mississippi law that would require age verification because he said the Attorney General failed to show that alternatives to the law, “like giving parents more information about how to supervise their kids online,” wouldn’t be effective. Personally, I don’t think a parent tutorial on internet safety is going to solve the larger issues here.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
There’s a new social app on the scene—and it might remind you of a really old platform. Noplace is sort of like Twitter-meets-MySpace with colorful, highly customizable profiles. The company asks, “Remember how fun the internet was before all the algos and ads? we do too… so we’re bringing it back.” The new app is geared towards Gen Z and is already gaining traction in the App Store.
There’s a new law in Illinois that’s meant to protect child influencers. According to ABC News, the new law “requires that children age 16 and under be compensated if, within a 30-day period, they are in at least 30% of a video or online content for which the adult, whether a parent or caregiver, is being paid. The person making the videos in which the child appears is responsible for setting aside gross earnings in a trust account for the child to receive at age 18.” This helps extend the same protections enjoyed by child actors to child influencers.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
It’s summertime, which means kids have a lot of free time on their hands. We all want our children to have an idyllic summer outside, but most of them also want to spend a lot of time online. Here’s a guide to some apps to spur creativity and learning—because it’s all about finding a balance.
While most of us aren’t professional influencers, many of us do enjoy posting about our families online. My team put together this guide to smart sharenting with all the best tips and tricks for keeping kids safe.