As parents, we spend a lot of time keeping our children safe: we test the temperature of food on our wrists, we baby-proof our houses, we buy training wheels and helmets. We teach our kids to look both ways before they cross the street, and we teach them about stranger danger. In short, we spend an enormous amount of time thinking about their safety and working to keep them happy and healthy. The same principles apply when it comes to technology, but because of the nature of technology, it's not always clear to parents where the dangers lie or what's safe and what isn't.
Compounding the problem, we're constantly told that screen time is bad for our kids, but we're rarely told exactly which aspects of technology are detrimental. That leads to a lot of hand-wringing and anxiety for parents. I know it did for me before I began researching and working in kid-tech. My sincere hope is that I can share what I've learned in order to help other parents navigate technology with more confidence. I want to help them understand how apps and platforms work so they can feel at ease when they hand devices over to their children. And while there's no quick, definitive way to tell if a tech product is kid-safe, one big clue that parents can look for is how a tech company makes its money.
This is a massive hint, and I'll tell you why: the way a company chooses to generate revenue shapes everything. It dictates how the company sets its business goals, designs its products and treats its users. And not all business models are created equal. Some lead companies to prioritize user wellbeing. Others do not. The predominant business model in consumer tech is the advertising business model. (Think: Facebook, where the product is "free" but the company uses your personal data to sell advertising space to businesses that want to target you.) This kind of business model inevitably leads companies to prioritize engagement and time spent in-app. That, in turn, leads to manipulative designs that are meant to keep you glued to a screen by any means necessary.
This kind of business model is not appropriate or even ethical for children. Kids need tech that’s designed to keep them safe and protect their best interests. They don’t need to be manipulated into staying online for hours on end.
If my kids are asking to use a tech product that's free, my spidey senses start tingling. That's an indication that I need to look more closely at how that app works (what kind of data it collects, what kind of safety features it has, how it makes its money) before I agree. Even products that are "designed for kids" can just be retrofitted versions of adult apps built to prioritize engagement at all costs. This was certainly the case with Facebook Messenger Kids and with Instagram for Kids, which is currently on hold. The 13+ versions of these social apps are designed to be sticky, and for that reason, they just don't retrofit well for kids.
Parents and children need tech that's built from the ground up for safety and wellbeing. We need tech that brings us together, encourages creativity and empowers us to discover new things. We also need technology that protects our children from unwanted interactions and respects their right to privacy. Technology that makes money by harvesting our data, selling it or using it to show us targeted ads is designed for advertisers' best interests, not families'.
Of course, not all free apps are bad for kids—and not all paid apps are safe. But the business model is a major clue to where a company's priorities lie. When I'm trying to determine whether to download an app that my kid is desperate to try, the business model is a bit of a litmus test for me. I'm lucky to be in a position where I can afford to pay a bit of money for my kids' technology, whether it's a one-time payment or a subscription. They say that when a tech platform is free, you are the product. But when you pay for a tech platform, you are a customer. And I'm okay with paying a premium for my kids' safety.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
When you try to retrofit an adult product for kids, things can go wrong. Many adult platforms are optimized for engagement, with features designed to maximize the number of people using the app—and the amount of time they spend using it. And it's notoriously difficult to lock down an open platform in a way that makes it safe for kids. Case in point: Facebook Messenger Kids. A few years ago, a security flaw allowed kids to connect with strangers.
“How do you take on a market where the user isn’t your customer? A market that overlaps and competes with four major industries, but doesn’t belong to any of them?” These are the questions posed in Björn Jeffery’s overview of the kids’ app market. This is a bit of a longer read, but it offers excellent insight into the overall state of kid-tech—with an entire section dedicated to business models.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up with industry news, so here is a mercifully quick summary of some other notable developments:
Elon Musk is buying Twitter, and he released a statement saying that "Twitter is the digital town square where matters vital to the future of humanity are debated." I was immediately struck by this analogy because it's one that's often used to describe social media platforms. In fact, it's something that tech ethicist David Ryan Polgar touched upon when my company interviewed him for a blog post. As he explained, "the public square analogy doesn't work perfectly because it's not like everybody on a platform is in a square speaking to one another. Yes, everybody is in this square, but everybody has a microphone that has dramatically different volumes and the platform has the ability to raise, lower or turn off that microphone. That is the tremendous level of power and a power that social media platforms are starting to realize they need to take extremely seriously, but they also owe the general public a greater level of transparency on how they go about their decisions." It'll be extremely interesting to see how Elon Musk plans to navigate these thorny issues.
Pew Research recently released the results of a study investigating parents’ attitudes towards their kids’ screen time before and after the pandemic. Not surprisingly, a lot has changed. According to Pew, “the unique approach of this study—surveying parents about a specific child and looking at how individual parents’ responses changed over time—provides a window into children’s pandemic experiences with technology. Still, parents may not always know what devices their children use or exactly how much time they spend on them.”
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
On the topic of business models, here is a post from Kinzoo about ours. We want to be transparent about how we make money—and we want to help parents understand why it matters.
We do offer in-app purchases, and we put a lot of thought into this particular decision. If you’d like to learn more about it, here is everything you need to know about in-app purchases in Kinzoo.
Okay, that's it from me until next time. If you enjoyed this newsletter, and know of another parent who would as well, please feel free to forward it along.