Facebook is doing its best to counter anti-vaccination damage done by Facebook

A Colorado man takes a photo of himself after getting a Covid-19 vaccine. | Michael Ciaglo/Getty Images

To get 50 million people vaccinated, Facebook will have to do more than just PR.

On Monday, Facebook revealed a plan aimed at getting 50 million people vaccinated, the latest in a string of efforts by the social media company to combat the Covid-19 pandemic and the misinformation that has thrived on its platform. The campaign follows years of criticism directed at Facebook for not doing enough to fight the dangers of the anti-vaccination movement.

First announced in a post by CEO Mark Zuckerberg, Facebook’s plans include launching a tool to help people find and make appointments with local vaccination sites, amplifying credible vaccination information from health officials, and adding labels to posts about the coronavirus that point people to information from the World Health Organization. The company is also expanding official WhatsApp chatbots to help people register for vaccines, and offering new stickers on Instagram “so people can inspire others to get vaccinated.” (WhatsApp and Instagram are owned by Facebook.)

On top of all this, and perhaps more critically, Facebook is doing something it hates: limiting the spread of information. The company announced it would temporarily reduce the distribution of content from users who have violated its Covid-19 and vaccine misinformation policies, or who continue to share content that its fact-checking partners have debunked. Figuring out what is and isn’t misinformation is tricky business, and it’s tough to tell the difference between people who are purposefully misleading others and people who simply have legitimate questions.

These efforts build upon existing promises Facebook has made. In February, Facebook announced it was going to take down anti-vaccination misinformation and use its platform for what it called the world’s largest Covid-19 inoculation information campaign, the beginnings of which it announced this week. The social media company has also partnered with public health researchers to find out the reasons for vaccine hesitancy — and how to combat it — through surveys on the platform.

Critics say Facebook’s efforts aren’t enough to counter the scale of the problem the platform itself has helped create.

Anti-vaccination rhetoric has flourished for years on the platform, which provided a safe space for vaccine-misinformation groups and even recommended such groups to users. And a lot of the content that pushes vaccine hesitancy wouldn’t be considered misinformation, but rather opinion, so Facebook’s guidelines wouldn’t ban it, according to David Broniatowski, a George Washington University professor who researches anti-vaccination communities.

“People who oppose vaccinations aren’t primarily making arguments based on science or facts, but on values like freedom of choice or civil liberties,” Broniatowski told Recode. “They’re opinions, but very corrosive opinions.”

For example, a post saying “I don’t think vaccines are safe, do you?” probably wouldn’t be flagged as misinformation, but its tone can still be insidious.

Facebook is aware that posts like these, which don’t violate its rules, are driving vaccine hesitancy, according to a new report from the Washington Post. “While research is very early, we’re concerned that harm from non-violating content may be substantial,” the story quotes from an internal Facebook document.

While Broniatowski lauded Facebook’s moves to partner with health organizations and promote facts about vaccines, he thinks it could do something more effective: allow public health officials to target vaccine-hesitant groups with arguments as compelling as those pushed by vaccine detractors. He noted that vaccine hesitancy was being promoted by a relatively small slice of Facebook users with outsized influence, and that similarly, a small group of public health experts could be used to combat it.

“You have some very sophisticated actors making any number of arguments, whatever will stick, to prevent people from getting vaccinated,” he said. “We need a more nuanced response that’s more responsive to people’s actual concerns.”

Facebook did not immediately respond to a request for comment.

People who refuse to get vaccinated have a wide array of reasons, according to data released today by Delphi Group at Carnegie Mellon University in partnership with Facebook. Of those surveyed, 45 percent said they would avoid getting vaccinated due to fear of side effects, and 40 percent cited concerns about the vaccine’s safety. Smaller percentages of respondents pointed to distrust in vaccines and the government. Addressing those concerns directly could have a meaningful impact on people’s willingness to get vaccines.

Facebook could also make sure its efforts to limit Covid-19 misinformation amount to more than just its latest public relations campaign, Imran Ahmed, CEO of the Center for Countering Digital Hate, told Recode in a statement.

“Since Facebook’s last announcement of their intention to ‘crack down’ on anti-vaccine misinformation over a month ago, almost no progress has been made,” Ahmed said.

“Facebook and Instagram still do not remove the vast majority of posts reported to them for containing dangerous misinformation about vaccines,” he said. “The main superspreaders of anti-vaccine lies all still have a presence on Instagram or Facebook, despite promises to remove them.”

Since its announcement banning vaccine misinformation in February, the company has said it’s taken down an additional 2 million pieces of content from Facebook and Instagram. Whether that and the new measures will help get 50 million people vaccinated remains to be seen.

Author: Rani Molla
