The internet was not designed with kids in mind. You can tell this just by looking at the sprawling digital world. Along with all its amazing aspects, it’s also full of dark corners—and because of the way it works, it’s relatively easy for bad actors to act badly and remain hidden.
The same thing can definitely be said of a lot of Big Tech platforms. They’re ultimately designed to capture our attention and hoover up as much of our data as possible. And they use just about every trick in the book to keep us coming back for more. That’s because the more we engage with the app, the more they can use our data points to show us ads. The more time we spend on a platform, the more money they make.
This simple calculus incentivizes app companies to come up with creative ways to keep us scrolling, chatting, posting and otherwise engaging. They design features that exploit our desire to belong. They pump up the FOMO (fear of missing out). They create little rewards that cause our brains to release a tiny bit of dopamine. They take inspiration from slot machines.
They certainly don’t want their users to take breaks, close the app and come up for air. That’s why your feed is endless. It’s not in chronological order, because if it was, you’d catch up eventually and stop scrolling.
You can find similarities in the way casinos are designed. There’s no natural light or clocks, so you can’t easily keep track of how long you’ve been gambling. If you started in on a slot machine at 2 pm and then noticed the sun going down, that might be a sign that it’s time to get up and stretch your legs. It would be a natural stopping cue. Casinos want your money—so they want you to totally lose track of time.
A lot of Big Tech platforms try to design away any natural stopping cues, like your feed ending. They set videos to auto-play one after another by default. This makes it difficult for most adults to regulate their screen time, let alone children.
I’d guess that most parents know how… compelling screens can be for children. We’ve probably all experienced our share of meltdowns after enforcing a tech boundary. But, I also know that there are things tech platforms can do to help families out. We can make design decisions that encourage kids to develop healthier relationships with screens. One obvious, easy way to do this is a stopping cue.
Stopping cues can be subtle, like when a show you're watching on cable TV ends. You get a natural break, and you might move on to something else. Or back in the good old days when your feed was chronological and told you when you were all caught up. Most people will naturally close an app when there's nothing new to look at.
Today though, Netflix auto-plays the next episode right away. And Instagram scrolls forever. There are fewer moments where you would naturally stop. Tech companies design their platforms this way even though it’s not in your best interest to continually engage forever and ever. We all need balance. We all benefit from stopping cues—but especially kids. They need a variety of activities in their lives to be healthy, and if they get too much screen time, that can be detrimental to their wellbeing.
That’s why we build big, overt, un-missable stopping cues into Kinzoo Messenger. We want our platform to be fun—but most importantly, we want it to be a net benefit to kids and families. So, when you’re enjoying a mini game, you will get an overt message telling you to take a break.
Obviously, stopping cues affect engagement metrics. That’s why a lot of other platforms try to eliminate them altogether. But we don't want to boost our metrics at the expense of your wellbeing. We believe that you and your family deserve better than that.
A stopping cue is just one of the ways that we strive to build balance into our kids’ platforms. Over the next few newsletters, I’ll be taking a look at some of the other common design patterns that tech companies use, how to spot them—and how they affect you and your family.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
The UK has mandated that technology designed for kids follow certain rules. The Age Appropriate Design Code outlines how organizations can and can’t design their tech, but the basic idea is that you need to design it with kids’ wellbeing in mind. This section on “nudge techniques” suggests that companies include cues that encourage kids to take breaks.
California has also voted to adopt similar rules for tech companies. These codes are important because they shift some of the onus onto tech companies—and away from parents. If we have access to better-designed platforms, it makes digital parenting a little bit easier.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
The Federal Trade Commission has accused Amazon of basically tricking customers into subscribing to Prime—and then making it very difficult to cancel. This kind of tactic is often called a “dark pattern.” It’s basically a manipulative way of designing a product to get you to do something you might not otherwise do.
I wrote recently about my fears around deepfake technology—AI-generated content that looks extremely convincing. I’m most nervous about the potential for bullying, especially as my daughter enters high school, but of course, there are plenty of other ways that deepfake technology can be chilling. The latest example? A new type of true-crime content on TikTok featuring AI-generated child victims: “They’re quite strange and creepy,” says Paul Bleakley, assistant professor in criminal justice at the University of New Haven. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.”
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
We recently marked a big milestone at Kinzoo—we launched our Marketplace and subscription service in our Messenger app. We believe in giving kids and families the best of technology, and that means building a sustainable business that respects your safety and privacy. Here’s how we plan on doing that.
If your kids are on Snapchat, they have a brand new AI-powered contact. This chatbot, powered by ChatGPT, comes with a few safety considerations. Here's what parents need to know.