Over the last couple of weeks, I’ve been spending time talking about the ins and outs of technology design, and the little things that can make a big difference when it comes to keeping kids safe online. I wanted to dive deep into these topics, like persuasive design, default settings and stopping cues, because they’re terms that child safety experts often use but seldom explain. And when parents have a grasp on these topics, they’re better equipped to make informed decisions about technology and keep their kids safe. So today, I’m forging ahead with a closer look at a concept called data minimization.
It’s actually a simple mandate for technology companies: collect as little data as possible. But it also happens to be the exact opposite of how Big Tech companies operate. By now, we’re all aware of (or resigned to) the fact that platforms are hoovering up our data at a startling clip. But when we hand over data to a platform, we have to ask the question: who benefits?
There are a few different reasons a company might collect a data point. They might collect some information because they need it to operate the app. They might collect information to understand which features or content are resonating with users. Or, they might collect data points to create an in-depth profile of you and your preferences, so that they can sell ad space to advertisers based on your interests.
Because some major platforms make their money from showing you ads, they are incentivized to collect a whole lot of data. Like, everything they can get their hands on. They want to know as much about you as possible because that helps them optimize their business with advertisers. Much of the data they collect benefits them and their bottom line—not you.
Regulators know this. They understand that when some companies collect reams of data, it doesn’t benefit you. And in fact, it can put users at risk. Especially younger users. That’s where the idea of data minimization comes in. This concept is part of age-appropriate design codes, because our data isn’t just neutral points of information. When companies collect this stuff, especially sensitive stuff, there can be consequences. It can get hacked and put our privacy at risk. Companies can use it against us to sell us things we probably don’t really need. And companies can create such deep personas on us that they are able to deploy even more persuasive design to keep us glued to their platforms. So, regulators have thrown down the gauntlet. Age-appropriate design codes have mandated that anyone creating technology for kids strive to minimize the amount of data they collect in order to protect the children who use the platforms.
Of course, there is certain information that’s vital to a platform’s function. For example, when you sign up for Kinzoo Messenger, we collect your email address so that we can send you a consent email. We collect your name and your PIN so that your account is personalized and safe. This is basic stuff that we need to make sure we can actually provide services for you.
But our strategy has always been to only collect the data we need in order to make our apps and company function. We’ve been operating this way since before age-appropriate design codes existed, because we believe it’s the right thing to do. We challenged ourselves early on to consider every bit of data we collect. If we can’t think of a good enough reason that benefits children and families, we don’t collect it.
That’s why we don’t ask you for your child’s exact birth date. We do need to know the ages of the kids who use our platform in order to design age-appropriate experiences for them, but we decided we could do that with just a birth year and month. We couldn’t come up with a good enough reason to collect the day as well. Sure, we could probably boost our user metrics if we sent everyone a happy birthday message—but that would really only benefit us, not your family. Similarly, we opted not to collect your child’s gender either. There just wasn’t a good enough argument for how that would benefit you and your family.
And, much of the information we collect about the features in our apps is aggregate, meaning that we know high-level stuff like how many users download a new sticker pack, but we don’t know which individual user has downloaded what.
Many other platforms monitor your activities closely. They want to know if you downloaded a panda sticker pack or watched a panda video—or even if you just paused longer than normal over a headline about pandas at the zoo. They want to know this so that they can make sure you’re hit with as much panda content as possible whenever you use their apps, which keeps you highly engaged and coming back for more.
At Kinzoo, we do things differently. We want you to have control over your own experience. We want you to choose the things you follow and engage with. We believe that parents and kids should have a say in what they see when they use Kinzoo platforms, so we don’t keep tabs on the things you download and purchase. And we don’t presume to choose for you.
At the end of the day, data minimization is about choice as much as it’s about safety. When you minimize the amount of data you collect from children, you not only protect their privacy, but their agency as well. When families have the chance to explore technology in a safe, private and empowering way, that’s when kids get to experience the best of the digital world.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
When a company collects data from children purely for profit, that tends to raise a few eyebrows among regulators. And back in May, Meta drew the ire of the FTC for repeatedly violating child online privacy laws. As a result, the FTC proposed a sweeping ban that would prohibit Meta from profiting off any data from users under 18.
Experts say that a fundamental shift could be underway in how we’re treating digital privacy in general. Rather than a hodgepodge of “harm-reduction” approaches to privacy, new regulations have the underlying assumption that users own their data—and that they should have a say in what happens to it. It’ll be interesting to see if and how this shift changes the way Big Tech companies interact with their users.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Speaking of children’s privacy online, new research indicates that YouTube and parent company Google may have tracked children across the web. According to the report, about 300 brands’ ads were served on channels designated “for kids.” These were ads for adult products like cars and bank accounts—and if a child clicked on one, they were sent to the brand’s website and subsequently tracked using “persistent cookies.” Google, for its part, claims that it hasn’t violated any laws, and that all of this is permissible under current legislation.
Child influencers exist in a bit of a perilous state. They have their lives made into content, but they enjoy none of the financial protections that traditional child actors have. Recently though, Illinois has become the first state to pass a law protecting them. The new legislation will “entitle influencers under the age of 16 to a percentage of earnings based on how often they appear on video blogs or online content.” That money goes into a trust for them, ensuring they get at least some of the money they play a part in generating.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you want to go a little deeper on data privacy for your kids, here’s advice from four industry experts on how to protect children while they’re online.
And, if privacy policies make your head spin, you’re definitely not alone. My team put together this guide to help explain what they mean—and help you spot any red flags for your family.