Facebook will push you to read articles before you share them

Facebook CEO Mark Zuckerberg. | Daniel Acker/Bloomberg via Getty Images

The company says it wants to promote “more informed sharing.”

The next time you try to share an article without actually reading it first, Facebook might warn you to think again.

The social media company announced on Monday morning that, starting today, it will test a new feature prompting users to actually open and read articles before sharing them on the platform. Facebook will start testing the feature with around 6 percent of its global users on Android, a company spokesperson told Recode. Twitter started testing a similar feature in June of last year and rolled it out more broadly to all its users in September.

Facebook’s move is the latest example of social media companies trying to slow the rampant spread of misinformation and harmful content on their platforms by nudging users to slow down before sharing content. Some social media researchers have long advocated for this kind of prompting, which they hope will minimize people reacting to a provocative headline without actually getting the fuller context of the story.

But since these features are relatively new, it’s unclear how much these interventions will actually work, or if people will just skip through prompts and share news without reading it anyway. And even if someone clicks on an article after Facebook asks them to, there’s no guarantee they will actually read the whole story — so this isn’t a complete fix.

Facebook announced the news on a company Twitter account on Monday, including an image of what the prompt will look like.

If you try to share an article you haven't opened, Facebook will show you the following message:

“You’re about to share this article without opening it. Sharing articles without reading them may mean missing key facts.” The company will then prompt users to either open the article first, or continue sharing without reading.

Facebook did not immediately respond to a request for further comment, beyond clarifying the percentage of users who will test the feature.

Screenshot of the new prompt Facebook will show users. | Facebook

There are some early signs that even if features like this won’t entirely stop the spread of false information or polarizing content, they may help people at least read more context about the news of the day.

Back in September, Twitter shared early insights after it started testing a similar feature on its Android app. The data showed the prompts led people to open articles 40 percent more often.

Last week, Twitter also rolled out a feature to prompt people to reconsider tweeting “offensive or hurtful language.” And ahead of the 2020 US presidential election, both Twitter and Facebook started combating misleading information on their platforms by labeling politically misleading tweets and barring users from “Liking” or replying to those posts.

Social media companies have many levers they can pull to slow or stop the spread of harmful information and divisive rhetoric. Banning people outright, as Facebook and Twitter did with Donald Trump, is one of them, but it's a controversial option and, in many cases, far too blunt a tool. Features like the one Facebook started testing Monday, which "nudge" users to read before they share, can potentially accomplish more by gradually shifting how people post on the platform, stepping in before divisive or misleading content spreads.

Author: Shirin Ghaffary
