Last week, I sat down with James Titcomb, a reporter from The Telegraph, to give him my take on Instagram for Kids for an article he was working on. We had a great conversation, and I shared many of the arguments covered in last week's newsletter. Some highlights, in case you missed it: social validation can be harmful to kids' self-esteem, the risk of online predation is high, Facebook can't be trusted with our data—and Instagram for Kids is a ploy to get adult users to stick around and engage.
We also discussed the argument that Facebook often falls back on when it develops products for children. According to the company, kids are flocking to the adult platform anyway, so it's logical to develop a parallel platform with similar functionality and augmented parental controls. It's a cleverly updated version of the old adage, "if you can't beat 'em, join 'em," and on the surface, it even seems like a compelling case. But at its core, it's a flawed argument.
Any parent knows that kids often want to try things that might not be age-appropriate for them. They push boundaries and test limits, and it's our job to make sure they do so safely. Every kid is different, so each family needs to decide what's right for them—but certain things are broadly age-restricted for a reason. Lots of kids are attracted to vaping, smoking, drugs and drinking, and want to experiment with those things before they're supposed to. But you don't see tobacco and alcohol companies developing youth products. Some things just aren't healthy for kids. I think that applies to manipulative algorithms, persuasive design and social validation.
Kids are intrigued by Instagram, sure, but there are better, healthier ways to introduce them to technology. The solution isn't to retrofit a platform that many adults find anxiety-inducing by adding a few new parental controls. These social media platforms have a time and a place—but at the end of the day, they're designed to keep us scrolling, extract our data and show us targeted ads. We might be getting something out of the bargain, but as adults, we can weigh the pros and cons and decide for ourselves. Kids, however, deserve to be protected, not exploited.
Instead of a remixed Instagram, children need platforms that are designed specifically for them, with age-appropriate features, safety mechanisms and opportunities for parents to participate, not just control. At Kinzoo, we believe kids deserve the best of technology, without exposure to the worst of it—and in order to give that to them, we started from scratch. Our product design and features—even our business model—are all guided by our values. And if you ask me, that's the kind of care and attention that products for kids need.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
When the news about Instagram for Kids broke, Congress just so happened to be questioning Mark Zuckerberg about the issue of misinformation on social media platforms. They took the opportunity to ask him and other Big Tech CEOs direct questions about the effect of social media on children's mental health and requested to see any internal research that the companies had undertaken. It'll certainly be interesting to see what comes to light.
Other critics also aren't buying the idea that Facebook's motivations here are altruistic. This article from The Wall Street Journal makes a good point with the analogy of Facebook Messenger Kids: "So while Messenger Kids is managed by parents, much as Facebook says a kid-focused Instagram app would be, it still collects useful information such as children’s images, voices, time zones, birthdays and genders. It also knows with whom children speak most frequently and has access to their cameras, photos and microphones, if enabled."
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up with industry news, so here is a mercifully quick summary of some other notable developments:
Do you remember when like counts were still public on Instagram? The platform made them private a few years ago, allegedly to help protect our mental health. Maybe I'm cynical, but I think the company's motivations were less altruistic: it wanted to remove a pain point that caused people to delete content. This article from a Harvard Medical School affiliate offers a great breakdown of how likes affect your brain. Spoiler: they aren't great for us, which means they're also not great for kids.
It's been a minute, so here's your friendly reminder: don't feel guilty about giving your kids access to screen time. As long as balanced articles like this keep getting published, I'll keep sharing them because I think parents need to hear it—loudly and often. I suspect it'll take time to undo the damage created by anti-screen time rhetoric, and I'm happy to amplify articles that advocate for a level-headed approach to technology.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you're curious about what I really think about Instagram's private like count, I wrote a piece about it back when the update first hit Canada. Check out my thoughts here!
When the documentary The Social Dilemma debuted last year, there was renewed interest in the way that social media affects our wellbeing. Our team wrote about some of the reasons that kids (and parents) should consider alternative ways to connect.
Okay, that's it from me until next week. If you enjoyed this newsletter, and know of another parent who would as well, please feel free to forward it along.