Note: this is not a commentary on the US election nor the candidates involved. My discussion instead focuses on the impacts of social platforms in the context of the US election.
It's difficult to find the words to describe the US election. I'm not sure I can truly capture all the tension and emotion involved, but suffice it to say that last week was one of the most politically fraught in recent memory. We witnessed polarization play out in the aftermath of the election, but that was not exactly unexpected. In fact, leading up to November 3rd, social media companies were under increasing scrutiny for their role in perpetuating extreme content. Governments, organizations and individual citizens are waking up to the fact that these platforms have (inadvertently or otherwise) played a big role in promoting divisive content. Controversy has a way of hooking us in—and extreme user-generated content often goes viral. This puts platforms in a tricky spot, because they measure their success by engagement, and some of the most engaging content contains vitriol.
To help stave off some of this extreme content, YouTube updated its policy, saying it would remove content that is “encouraging others to interfere with democratic processes, such as obstructing or interrupting voting procedures.” Well, that policy was put to the test not long after election night, when One America News Network posted a video titled "Trump Won." It shows a news anchor saying, “President Trump won four more years in the office last night.” As a reminder, the election had decidedly not been decided when this video was posted. The video went on to make unfounded claims of rampant voter fraud against Republican ballots and even urged viewers to take action against Democrats. I don't know about you, but that sure sounds like it violates YouTube's policy against content that encourages interference with democratic processes. And yet, YouTube left the video up.
I wish I could say I was surprised.
It's something I've written about in my book Screen Captured. These platforms are hungry for engagement, and in their endless pursuit of daily active users, they let toxic content run wild. Back in May of 2019, a doctored video of House Speaker Nancy Pelosi started making the rounds on Facebook. The video had been slowed down, making it look like Pelosi was inebriated or unwell. But even after Facebook acknowledged the video was a fake, the company left it up.
Ultimately, social media's hesitance to remove false and polarizing content isn't just about engagement metrics; there's also a particular can of worms these companies don't want to open, and that's content moderation. The platforms are clinging tightly to Section 230 of the Communications Decency Act, which protects them from liability for content posted to their sites. But if they were forced to more actively curate their platforms, they would open themselves up to a mountain of new rules and regulations. And we all know how much big tech platforms hate those.
A deeper dive
Here are a few helpful resources in case you want to really dig into today's topic:
You may be wondering about YouTube's explanation for why it refuses to take down the misleading video from One America News Network. Let me warn you: it's a little arbitrary. Check out the full story here, but in a nutshell, the policy apparently applies only to "videos discouraging voting but not to videos that advocate interference after votes have been cast."
In June 2018, Facebook hired Yaël Eisenstat, a former intelligence officer, White House adviser and democracy activist, as its head of global elections integrity ops. Her task was to help Facebook navigate its crucial role in politics—but she ultimately concluded that there wasn't much appetite on the part of the company to enact real changes. She shared her experience in the Independent, concluding that "[t]he real problem is that Facebook profits partly by amplifying lies and selling dangerous targeting tools that allow political operatives to engage in a new level of information warfare."
TL;DR
Too long; didn't read. It shouldn't be a full-time job to keep up on industry news, so here is a mercifully quick summary of some other notable developments:
A family in Canada was recently slapped with a bill for almost $12,000—from social platform TikTok via Apple's App Store. That's because the couple's tween daughter had purchased TikTok coins, which can then be exchanged for likes and follows. The family (who remained anonymous to protect their kid's privacy) normally spent $30–$40 a month on Apple purchases, but received no notification or warning before their $12,000 bill came due. They finally managed to get the bill waived, but TikTok pointed fingers at Apple, which in turn pointed fingers at Mastercard. The family is clear about one thing, though: they're not blaming their young child for the charges. "In defense of our daughter, she’s a great kid… and this is where the whole thing became very disturbing to us. Why was she having to pay money to people to get them to essentially like her? Our society is so influenced by social media, especially young girls."
California voters passed Proposition 24, strengthening the state's privacy laws and limiting the amount of data that big tech companies are allowed to collect. This is going to make it more difficult for Google and Facebook to track people's activity via third parties, which could seriously threaten their advertising business models. Even though the change only applies to California, it will likely affect all of the US because of the state's outsized influence on the technology industry.
And lastly
Here are a few more pieces of original writing from me and my team—just in case you're keen for more:
I was never a fan of screen time limits to begin with, but in today's reality, they're even more irrelevant. Here are my thoughts on how the pandemic shifted the narrative.
We recently had the chance to chat with Stephanie Humphrey, a former engineer and the Resident Tech Expert for Strahan, Sara & Keke, a Good Morning America show. She shared her advice on how parents can help kids put their best digital foot forward.