What would a healthy social media platform even look like?

Facebook is under fire, again. But perhaps its problems are more fundamental.

As I read the findings from the Wall Street Journal’s massive, three-part investigation into Facebook — that the platform makes people angry and depressed, and that the company exempts a huge class of VIP users from its rules regarding harassment and incitement of violence — my reaction was: Well, obviously.

Anyone who spends a decent amount of time online knows what happens when you shove a bunch of strangers into the same place. We replicate existing power dynamics, we form groups, we troll, we project our biases, we yell until only the most extreme voices get heard. We expect the companies that own these platforms to fix it, but nobody can agree on how.

That’s not to downplay the breadth of the investigation or the bravery of the sources, particularly the teenage girls who shared how Instagram made them feel significantly worse about themselves. A slide from one Facebook presentation in 2019, obtained by WSJ reporters, read, “We make body image issues worse for one in three teen girls.” Another read, “Teens blame Instagram for increases in the rate of anxiety and depression.” Internal researchers found that these effects were specific to Instagram, not social media as a whole, due to its emphasis on images and social comparison.

The report also dug into Facebook’s 2018 algorithm change, which prioritized “meaningful social interactions,” meaning comments and shares. What the company failed to take into account is that the stories and links people are most likely to comment on and share are the ones that incite an emotional response, like, say, anger. It wasn’t just political or politically charged content that people were consuming; it was, as BuzzFeed CEO Jonah Peretti wrote to Facebook at the time, “fad/junky science, extremely disturbing news, and gross images.” But the strategy, which was put in place as a salve for declining user engagement, worked: More people were logging onto Facebook.
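To make that dynamic concrete, here is a minimal, purely illustrative sketch of how an engagement-weighted feed ranker behaves. The weights, field names, and sample posts are hypothetical; this is not Facebook’s actual “meaningful social interactions” formula, just a toy version of the general technique.

# Purely illustrative sketch of an engagement-weighted feed ranker.
# The weights and fields below are hypothetical, not Facebook's system.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    comments: int
    shares: int
    likes: int

# If comments and shares dominate the score, whatever provokes the
# strongest reactions floats to the top of the feed.
WEIGHTS = {"comments": 15.0, "shares": 30.0, "likes": 1.0}

def engagement_score(post: Post) -> float:
    return (WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares
            + WEIGHTS["likes"] * post.likes)

def rank_feed(posts):
    # Order the feed purely by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("Local library opens a new wing", comments=4, shares=2, likes=120),
    Post("This junk-science claim will make you furious", comments=90, shares=60, likes=40),
]
for post in rank_feed(feed):
    print(f"{engagement_score(post):8.1f}  {post.title}")

Nothing in a score like this distinguishes why people reacted, which is the gap the WSJ reporting describes: outrage and junk science earn comments and shares just as reliably as anything worthwhile.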

In response to the story, a bipartisan Senate committee is already investigating Instagram’s impact on teens, and Sen. Ed Markey (D-MA) is calling on Facebook to stop work on its reported forthcoming platform, Instagram for Kids. Last year, a report by Democratic members of Congress determined that the Big Tech companies — Amazon, Apple, Facebook, and Google — are essentially monopolies, though an FTC lawsuit making a similar case against Facebook was dismissed in June. But many in Congress have maintained that fixing this will take more than simply breaking up Facebook — it’ll take updating antitrust laws that were put in place more than a century ago.

Predictably, Facebook has been in PR mode, though not all of its attempts to court the public’s favor have landed. On Thursday, Instagram head Adam Mosseri spoke to Recode’s Peter Kafka and framed the issues laid out in the WSJ investigation as a necessary evil. “Cars have positive or negative outcomes. We understand that. We know that more people die than would otherwise because of car accidents,” he said. “But by and large, cars create way more value in the world than they destroy. And I think social media is similar.” The difference, as many have argued, is that the auto industry (which is not a monopoly) is subject to heavy regulation while social media is not — not even a little bit.

Of course, social media has had both beneficial and destructive effects. It’s at its best when it fosters safe communities and platforms historically marginalized voices, and at its worst when it becomes too enormous for a single company to manage. The problem is that if social media companies are meant to operate the way most companies in America are encouraged to — by growing as large as they can at the expense of everything else — then the shitty parts of the internet aren’t a bug; they’re a feature. If Facebook’s goal is to attract as many users as possible, and that number can theoretically grow to the size of one giant supernation, then the role of a social media company is comparable to that of a government, but with only murky responsibilities and little accountability to its citizens.

No amount of human moderation or algorithm tweaking is ever going to be able to handle this kind of scale. At any moment, bad actors are flooding Facebook, YouTube, Amazon, Google, and TikTok with harmful garbage, and algorithms designed to prioritize engagement are blasting it out to their users. A growth-at-all-costs system is bad for basically everyone aside from people like Mark Zuckerberg and his shareholders, and yet we permit it because Americans seem to be perfectly fine with policing every human behavior except greed.

Where does that leave us? Free-to-use internet platforms that facilitate interactions between strangers are almost always going to end up in the same place; there’s a limit to the number of ways people can communicate with one another (see: all the debates that recirculate throughout Tumblr, Twitter, and TikTok every few months). While Congress is largely still stuck on fixing problems like misinformation and moderation, which themselves are woefully misunderstood, the real problem is more fundamental: The only way to make the internet better is to prevent companies like Facebook from getting quite so big in the first place.

This column first published in The Goods newsletter.

Author: Rebecca Jennings
