Think back to the not-so-distant past when Facebook (a.k.a. Meta) announced it was developing a version of Instagram specifically for children. If you recall, the backlash was swift, and many experts implored the company to abandon the plan. Ultimately, Facebook announced in late September 2021 that it would pause (but not cancel) the idea. That turned out to be prudent timing, since a whistleblower soon leaked damaging documents showing Facebook knew that Instagram could be harmful to youth mental health.
Even before those internal documents became public, a lot of people had a nagging feeling that Instagram wasn’t appropriate for children. Then again, the company always maintained that the platform wasn’t designed for them anyway. So when it directly targeted children with a mini-Gram, many people felt it was a step too far. In a weird way, I think Instagram for Kids was actually what spurred people to think a bit more critically about Big Tech and small children.
Frances Haugen's whistleblower testimony helped confirm what many of us already felt in our guts: these platforms can be detrimental to users’ mental health. And, Facebook’s own internal research indicated a decline in the mental health of its users, especially teen girls. That’s why it’s critical that we pay attention to the design patterns in apps—because the way they’re built has real, far-reaching consequences.
I’ve long argued that Big Tech platforms should face more scrutiny, especially when children are involved, so I’m glad to see this sentiment is gaining traction. I think that more people are paying attention to the dubious way these platforms are designed, the ways they violate our privacy and the methods they use to capture our attention. And, regular people, industry experts and even high-profile officials are pushing back.
Recently, Surgeon General Vivek Murthy said that he thinks 13 is too young for children to use social media. In an interview with CNN, he stated “I, personally, based on the data I’ve seen, believe that 13 is too early … It’s a time where it’s really important for us to be thoughtful about what’s going into how they think about their own self-worth and their relationships and the skewed and often distorted environment of social media often does a disservice to many of those children.” While his comments acknowledged that studies haven’t yet produced definitive evidence, he has clearly seen enough to conclude personally that these platforms aren’t healthy for the youngest users.
Likes and followers are design features known to “hack” the brain, triggering dopamine releases and ultimately keeping users glued to the platform. The “unintended” consequences on mental health are significant. The high-pressure, curated and competitive environment of social media is not appropriate for kids who, as the Surgeon General pointed out, are still developing their identities.
And, it doesn’t stop with the Surgeon General; President Biden has also publicly called out Big Tech for collecting data from and profiting off of our kids. This is a crucially important issue because these problematic design patterns continue to show up in apps designed specifically for kids.
While there are lots of questionable platforms out there (yes, even ones made explicitly “for kids”), that doesn’t mean that we should keep technology away from children altogether. That's akin to throwing the baby out with the bathwater, and this issue calls for nuance. We don’t need to ban technology for younger users—we just need to prioritize different things. We need to design tech that’s safe and private. That means not only protecting children from data collection but also providing safe spaces where they can be kids without putting themselves on the permanent digital record.
This is one of our deeply held beliefs at Kinzoo. We’re striving to design platforms that give kids the best of technology—without exposure to the worst of it. We include features in all our products that promote connection, creativity, cultivation and fun while avoiding things like public feeds, likes and followers. While it’s encouraging that there is growing bipartisan support for stronger regulation, we still have a long way to go to build a safer digital world for our children. Government regulations only work when they’re enforced, and we’ve already seen with the Children’s Online Privacy Protection Act (COPPA) that enforceability is a challenge. For the foreseeable future, parents will still need to work hard to give their kids a safe introduction to the digital world—but hopefully, more and more companies will take up the cause as well.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
In a piece published in TechCrunch, I offered a plea to Facebook/Meta to eliminate Instagram for Kids entirely rather than just pause the project.
It’s the Surgeon General’s job to oversee public health in the US, so his comments about tech safety in his interview with CNN were significant. It’ll be especially interesting to see whether his opinions about social media lead to any policy decisions in the future.
President Biden had a strong message for Big Tech during his State of the Union speech: it’s time to stop collecting data from kids and teenagers. For me, the best part is the reaction of other lawmakers, who clearly agree with what he’s saying.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
In a recent newsletter, I wrote about the futility of trying to ban new technologies like ChatGPT. It’s a far better strategy to prepare kids to navigate these platforms safely and responsibly, so I’m excited to see this article from the New York Times. Some schools are using the AI chatbot in creative ways—and asking students to think critically about how it works.
Meta, the company behind Facebook and Instagram, is at it again: according to internal company memos, it’s actively trying to target younger users for its Horizon Worlds metaverse app. The company apparently plans to open up the app to users as young as 13 and aims to make the design more appealing to them. It seems Meta views children and teens as key to its future success—and I worry about how far it will go to “retain” them.
And Lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you want to dive deeper into my take on ChatGPT, you can check out my full essay on the topic here.
And, you probably won’t be surprised to learn that I’ve already written about Meta’s strategy for courting a younger and younger audience. I believe this is how they intend to build value in the company, which is something we should be concerned about.