If you’ve ever watched a kid navigate technology, you know they have an almost supernatural ability to figure things out. They bypass parental controls like they were born with an internal guidebook on how to game the system. Ask any parent, and they’ll have a story about their child circumventing screen time limits, disabling Wi-Fi restrictions, or finding a workaround for an app’s parental locks. So it’s no surprise that kids can easily sign up for adult platforms. All they need to do is enter a fake birthdate: no hacking required, just basic math.
The truth is, current age-gating systems are laughable. They exist not because they work but because tech companies need plausible deniability. If something goes wrong, they just point to their “age verification” process and claim they did their part. But with growing concerns about child safety online, the search for a real solution has never been more urgent.
There’s no shortage of ideas, but most run into one critical challenge: how to reliably determine who is a child and who isn’t. Some proposed solutions include requiring government-issued ID or using facial recognition technology, both of which raise serious privacy concerns. Understandably, most people don’t want to hand over more personal data to Big Tech.
For their part, most platforms don’t want to take on age verification themselves. Instead, they argue that the App Store should handle it. And honestly? That might not be the worst idea. If Apple or Google verified ages at the store level, users would only have to do it once, rather than for every app they download. It would also remove a major burden from developers, allowing them to focus on content moderation and other safety measures instead.
Meta, of all companies, has been pushing for this solution. And while I rarely find myself agreeing with them, in this case, they might have a point. If every app is forced to implement its own system, it will create a fragmented, inconsistent process. Worse, it will put small developers at a huge disadvantage—imagine trying to convince users to trust a brand-new app with their government ID.
But Apple and Google aren’t jumping on board. With regulators demanding action, Apple recently released a white paper outlining its new approach to age verification. Starting in 2025, parents will be able to share their child’s age with app developers through an API, ensuring kids only access age-appropriate content. The App Store will also introduce more granular age ratings (13+, 16+, and 18+), and children won’t see apps that exceed their age category.
Sounds promising, right? But there are some big question marks.
For example, if an app needs to verify a child’s age, it will request permission via a pop-up—just like apps currently do for microphone, camera, or location access. Can kids just… decline to share their age? And if so, what happens next? What’s stopping a tech-savvy kid from finding a workaround?
To be fair, Apple’s approach is a step in the right direction. It enhances privacy by preventing users from handing over personal data to every app they use. But whether it actually works remains to be seen.
At Kinzoo, we’ve been thinking about age verification from day one. We’ve invested significant time, energy, and resources into making our onboarding process as secure as possible—because protecting kids isn’t an afterthought for us, it’s the foundation of what we do. And while we welcome new solutions, they must be both effective and privacy-conscious.
What we need is for the smartest minds in tech to step up and create a system that actually works—one that protects kids from inappropriate content without compromising their privacy and security. Because if today’s kids can outsmart age gates with nothing more than a simple math equation, we’re still a long way from a real solution.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
Right now, lawmakers in nine states are pushing for app stores to take responsibility for age verification and require parental consent before kids can download apps. It makes sense—after all, the app stores control the front door. Apple insists that individual apps should handle it (despite years of evidence that this approach doesn’t work). Google, for its part, admits the issue is complex and is experimenting with a machine-learning model. Meanwhile, child-safety advocates argue that Apple and Google need to step up—and they’re working to expand these laws nationwide.
I find it interesting that, amid all this pressure, Apple is revamping App Store age ratings to 4+, 13+, 16+, and 18+. When parents share their child’s age information with Apple, the company will stop showing them apps that exceed that range. This raises the question: why hasn’t this been the case all along? We see this time and again: Big Tech companies roll out slick safety features only after they face immense public pressure. Whatever happened to being proactive?
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Big Tech platforms often make a raft of child safety updates when they’re facing pressure, and Google is responding to the renewed conversation about age verification with some of its own. New features include the ability for parents to limit their teens’ cell phone use at school and to add contacts to their teens’ phones, restricting interactions to those contacts only. These are certainly helpful features, so it’s curious that Google waited this long to develop them.
Researchers have trained a special version of ChatGPT to challenge conspiracy theories. And apparently, it’s effective: study participants showed a 20% reduction in conspiracy theory beliefs after interacting with it. It’s interesting to see this kind of novel application for AI, especially when AI is so often the tool people leverage to spread fake news in the first place.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If your kids are gamers, they’ve likely wanted to play Subway Surfers. My team put together a parent’s guide here with everything you need to know.
And, if you’ve found yourself confused by the 17+ age limit on Block Blast, my team put together an article here to break everything down for you.