"Social media consumes kids today as well, as on average they score their first social media accounts sometime between the ages of 11.4 and 12.6... The largest percentage of kids, at 39 percent, get their first social media account between ages 10 and 12, but another 11 percent signed on when they were younger than 10.
As of August 2017, Statista shows 23.5 million kids age 11 and under on Snapchat, 14.5 million on Instagram, and 3.1 million on Facebook. I can also only imagine how many of YouTube’s 1.9 billion monthly active users are under the age of 13—looking at all the kids on the platform around my household or any public area, I’m guessing it is extremely significant."
The paragraph above is from my book Screen Captured, which I published in 2019. Because of my work in kid-tech, I think about those statistics a lot. I write a great deal about the state of the internet and how it wasn't designed with children in mind. And I tell anyone who will listen about the dangers that kids are exposed to on adult platforms. I share the excerpt above to make a point: while safety concerns for kids online might not be talked about as much as they should be, they aren't unfamiliar to those of us working in this field.
That's why I was not surprised by the statistics in this sensational article from The Verge. To summarize: millions of young children are using adult platforms years before they turn 13, a significant proportion receive abuse, harassment or sexual solicitation from adults, and there is an unacceptable failure on the part of tech platforms to protect their wellbeing and safety. I think the article does a fantastic job of shining a light on this problem—but it isn't new and it isn't worse than we suspected. On the contrary, the safety concerns for kids on adult platforms have been an open secret for some time.
The fact that "25 percent of kids 9-17 reported having had a sexually explicit interaction with someone they thought was 18 or older" is disturbing—and I'm grateful to see it getting some serious attention. I suspect that the increased interest in child safety online is due at least in part to Facebook's plan to launch Instagram for Kids. According to the company, plans are underway to pursue a junior version of the photo-sharing app, and pushback has been fierce. The idea of Instagram for Kids makes a lot of people uneasy—and with good reason. The threat of predation for children on open social networks, especially those where they share photos of themselves, is real. But again, it's not new and it's not worse than we thought. Many of us have been ringing alarm bells for years, but it took an obviously disastrous idea like Instagram for Kids for the safety issues to get mainstream coverage.
Kids have different needs from technology. That's why Kinzoo is building new products from the ground up with a young audience in mind. Children are a fast-growing user segment, and companies must do better to give them access to the best of technology without exposure to the worst of it. And we can't do that by retrofitting adult platforms that already pose such significant risks to our kids.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
If you're interested in seeing the full report cited in the Verge article, you can find it here. According to Thorn's research, "[t]he likelihood of these online sexual interactions occurring with someone the participant believed to be over the age of 18 appears to increase with age: 29% of teens reported having had an online sexual interaction with someone they believed to be an adult compared to 19% of 9-12-year-olds."
Instagram for Kids is so obviously a terrible idea that forty-four attorneys general recently signed an open letter asking Facebook not to pursue it. If we look at the rates that children are targeted on social media platforms already—and the abject failure of Facebook to protect the safety of its users—it's a recipe for disaster.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
We're used to sharing our lives online—and for many parents, that means posting about their children. While "sharenting" can be done in a safe way that protects your kids' wellbeing, a new Washington Post article identifies a startling trend: parents posting openly about their kids' mental health struggles on public platforms and compromising their privacy.
There is a fundamental difference between the ads we grew up watching during Saturday morning cartoons and the ads we see today online. Because advertisers can track our behavior and serve us targeted advertisements based on our interests, Dr. Jenny Radesky argues that we need to take special care to help kids understand modern marketing in order to protect their safety. And I couldn't agree more.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
Diana Graber is the founder of Cyberwise and Cyber Civics, and the author of Raising Humans in a Digital World. She is a tireless advocate for digital literacy education, and our team had the chance to sit down and chat with her for our Interview Series. Check out her insights here!
Video calling has been a lifesaver for lots of families during social distancing, helping us stay in touch with far-away loved ones. But keeping kids engaged can admittedly be a challenge. Our team put together 10 fun activities to try on your next call to keep things interesting.
Okay, that's it from me until next time. If you enjoyed this newsletter and know another parent who would too, please feel free to forward it along.