There’s nothing quite like exploring a new digital platform or device for the first time. Figuring out the interface, experimenting with the features, connecting with other users—poking around and seeing what’s possible is a lot of fun. But how many times do you delve deep into the settings when you’re exploring something new? How often do you tap that gear icon and look at the little toggles and fine print under the various tabs?
Checking in on your settings can feel like an afterthought. But back in 2015, Laurence Scott wrote in his book The Four-Dimensional Human: Ways of Being in the Digital World that “the devil is in the defaults.” He was referring to how a product is set up when you first download, install or unbox it. And while they might seem like innocuous toggles, the truth is that default settings are a really important part of the product and the experience.
The default settings affect your privacy and safety. And they can also tell you a lot about the motivations of the people who designed the app. That’s why I want to spend some time talking about default settings, and how they affect kids in particular. They have a big impact on users of all ages, but they can be particularly consequential for children who are just learning the digital ropes.
The settings can significantly change how private and secure a platform is—which has huge implications for our kids’ safety. Just think of the difference between a private Instagram account and a public one. Unfortunately, it’s not uncommon for kids with public profiles to experience unwanted contact from strangers. One of the most egregious examples of this happened when a 37-year-old mom posed as an 11-year-old on Instagram with a public profile—the predators came out of the metaphorical woodwork in force. And while Instagram has made various updates over the years to try to make its default settings safer for younger users, these changes are usually reactive, not proactive.
Your privacy and location-sharing options aren’t just settings, but fundamental safety considerations. And many major social platforms have a history of setting the default wide open. Anyone can search for you, send you contact requests, see and comment on your posts, or send you DMs—by default. That makes it as easy as possible to find other users and add friends and ultimately helps the platform grow faster.
And, it’s not uncommon for default settings to include notifications for every little thing. The more they ping you about likes, comments, retweets, views and even other users’ posting habits, the more likely you are to engage with the app and boost their metrics in the process. For younger users, this can easily lead to overuse and endless scrolling.
The big social companies used to have an easier time pretending that their platforms were only designed for users 13 and older, so they could claim that the wide-open default settings were just fine for all these users. But we all know that many younger children flock to these apps, and they’ve been under increasing scrutiny as a result. And regardless, we’re starting to realize that a public profile for a 13-year-old user is problematic in its own right.
New regulations like the Age Appropriate Design Code in the UK and California are mandating that companies get a bit more thoughtful with their default settings when a product is designed specifically for children. And, there has been a wave of new bills trying to force tech companies to do better by their younger users, even if they’re over the age of 13.
When you design apps for kids, you need to ensure that your default settings protect their safety, privacy and wellbeing. At Kinzoo, we put a lot of thought into the default settings in our apps, because we know they’re a big part of keeping kids safe. We think about every design decision as parents, and we set things up the way we would want for our own kids. When we introduce a new feature, I often think about whether it’s something I would want control over as a parent. If so, we make every effort to give our community a choice in the matter. But we also know that a lot of users might not ever dive into the settings to make changes, so we make the defaults as safe and transparent as possible. At the end of the day, we believe that kids and families deserve the best of the digital world, so that’s how our apps are set up.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
It wasn’t until July 2021 that Instagram updated its default settings so that users under the age of 16 would have a private profile when they joined, though they can still change this setting if they want. Before that, users would simply be asked whether they wanted a public or private profile when they signed up. Then, in August 2022, Instagram updated its defaults so that younger users would not be served the most sensitive content. While these updates might seem like a step in the right direction, I tend to see them as too little, too late.
And, in March 2023, TikTok rolled out a suite of new parental controls and default settings for teens—but this move came after a lot of scrutiny over the ways the app affects teen wellbeing. And, many experts agreed that the new settings wouldn’t be a significant enough change to really move the needle for struggling users.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
Amazon has agreed to pay a $25 million fine to settle alleged violations of children's privacy laws connected to the Alexa voice assistant service. You often hear jokes about voice assistants and how they’re always listening, but apparently, Amazon failed to delete children’s voice recordings when parents requested it. According to the complaint, the company "failed for a significant period of time to honor parents' requests that it delete their children's voice recordings by continuing to retain the transcripts of those recordings and failing to disclose that it was doing so, also in violation of COPPA.”
Facebook, TikTok, Snapchat and YouTube are fighting federal lawsuits that claim the social-media companies have created an addictive product that pushes destructive content to youth. While the platforms have historically used Section 230 as a way to say, “Hey, we’re not responsible for the things people post on our platform,” the lawsuit claims that it’s the product itself that’s the issue, not the content users have posted. It’ll be interesting to see where this one goes.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
If you’re curious to know more about the TikTok updates, my team wrote this article breaking everything down.
One of the best parts about the work we do? Getting the chance to connect with a diverse range of experts who are making the internet safer for kids. Here’s a roundup of some of their advice from the interviews we’ve done.