These days, testifying in front of Congress seems like a rite of passage for tech executives—and last week, Adam Mosseri had his turn. Mosseri has been at Facebook (now known as Meta) for over a decade, working his way up the ranks to eventually assume the role of Head of Instagram after the original founders of the photo-sharing app abruptly departed in 2018, citing concerns with the direction the app was taking. Before assuming the top position there, his most notable achievement was creating the ranked news feed—the algorithms that order content to maximize user engagement.
Why question Mosseri now?
He was called to testify in front of a Senate Subcommittee on Consumer Protection after whistleblower Frances Haugen surfaced internal Meta research revealing that the company knew Instagram could be toxic for youth, especially young girls. These revelations came around the same time that Meta confirmed it was working on an Instagram platform for kids. Lawmakers and parents alike were alarmed that the company was developing a potentially dangerous product for children, and many called for the plan to be abandoned. After a great deal of public pressure, Meta agreed to pause development—but the company is still very much committed to creating a junior version of the app. On Wednesday last week, Congress had its chance to ask Mosseri about Instagram's plans for kids' products—and how the company intends to protect young users.
What did we learn?
All along, Mosseri has maintained that kids are using the adult version of Instagram anyway. The way he sees it, developing a special platform just for them "solves a problem." Not surprisingly, he doubled down on this position multiple times during the hearing. All indications are that the company will continue to build the much-maligned Instagram for Kids, forging ahead with a platform that no one asked for. I suspect the company will keep falling back on the justification that "kids are already online," and it will be interesting to see whether the public backlash stays as strong once the fallout from the whistleblower documents subsides a little.
Instagram maintains that it is consulting with experts and parents to develop a safer platform for kids, but I am skeptical for a few reasons. We've seen again and again that Big Tech is reactive—not proactive—when it comes to safety. In fact, we saw some reactive changes on Tuesday last week, when Instagram conveniently announced new updates to the platform the night before Mosseri testified. These updates include new parental controls; alerts encouraging users to switch to another topic when they're too far down a rabbit hole; and a "take a break" feature, which prompts people to close the app if they've been scrolling too long. These are not groundbreaking features. If Instagram really cared about user wellbeing, it would have made these changes eons ago. Or, better yet, built them in from the outset.
In a previous "safety-focused" update to the platform, Instagram announced that all profiles created by teens would default to private. But Senator Marsha Blackburn embarrassed Mosseri by creating a fictitious account for a 15-year-old whose profile was public by default. Mosseri chalked the error up to a bug that occurs when accounts are created through the web and promised to have it corrected immediately—but I think the slip shows how serious the company really is about safety.
Mosseri did make one new announcement during his testimony: he said Instagram will bring back the option of a chronological feed early next year. This is notable, since engagement skyrocketed on Facebook-owned platforms when they replaced chronological feeds with ranked, engagement-maximizing ones. Maybe I'm just cynical at this point, but I wouldn't be surprised if the toggle for this option ends up buried deep within the user settings—another reactive update that ticks the box.
The Instagram head remained steadfast in recommending that an industry body, rather than government, regulate tech companies. The subcommittee chair, Senator Richard Blumenthal, retorted that "[t]he time for self-policing and self-regulation is over. Self-regulating relies on trust. And that trust is gone." This was surely the most satisfying moment of the hearing.
What conclusions can we draw?
Most of these hearings amount to a bit of finger-wagging—and this one was no exception. It seems, however, that Congress is developing a deeper understanding of Big Tech; questions continue to get more pointed and better-researched. (At least they now understand that Facebook sells ads.) Reform to Section 230 of the Communications Decency Act (which protects platforms from liability for what users post) and the Children's Online Privacy Protection Act (COPPA) will continue to be topics of discussion, but meaningful regulatory changes are still unlikely.
Despite that, I am optimistic that kids' online safety is finally a topic of discussion. At the very least, we can all acknowledge the need for a solution, and that's something. At one point, when referencing children ages 10–12, Mosseri admitted that "Instagram, quite frankly, wasn't designed for them." And even though that age bracket is already using technology, a Facebook-owned platform with a few extra parental controls is not the right solution.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
If you're interested in learning more about the new changes to the Instagram platform, check out this article here. It seems like, at this point, safety updates from Meta are met with a collective raised eyebrow. Case in point: this quote from Sen. Marsha Blackburn's spokesperson: "We’d like to be hopeful that this shows Instagram’s commitment to making its products safer for kids and teens. But with their track record, it seems like their ‘big announcement’ in the dead of night is more likely to be a smokescreen to try to draw attention away from things they don’t want highlighted at Wednesday’s hearing.”
There are so many reasons that Instagram for Kids is a terrible idea. I recently wrote an article for TechCrunch about why the company should kill their plans for a children's platform. You can check it out here.
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
About 33 million people use the Life360 app. It's marketed as a solution for keeping your family safe—but it recently came to light that the company is selling kids’ and families’ precise location data to about a dozen data brokers, who, in turn, sell the data to nearly anyone who wants to buy it. Not a good look—and more evidence that tech business models built on capturing and selling data are not appropriate for kids. Justin Sherman, a fellow at the Duke Tech Policy Lab, quipped that "[f]amilies probably would not like the slogan, ‘You can watch where your kids are, and so can anyone who buys this information.’"
On a lighter note, the tears of joy emoji is still the most popular emoji of them all. According to data from the Unicode Consortium, the little cry-laughing face and the red heart were the first- and second-most popular emojis in 2021. Gen Z thinks the tears of joy emoji is painfully uncool—but apparently that's not stopping the rest of the world from continuing to embrace it.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
Joel Silk is a dad and Senior Director of Moderation at Roblox. As a former Air Force Intelligence Officer, he knows a thing or two about online safety. We sat down with him to learn how parents can help keep kids safe during screen time. Check out the full interview here.
Likes and comments are an inherent part of Instagram. They are central to the platform—and kids and adults alike feel their impact. My team wrote an article about how social validation affects children in particular, and you can find it here.
Okay, that's it from me until next time. If you enjoyed this newsletter and know another parent who would too, please feel free to forward it along.