Is social media ready for a Covid-19 vaccine?

With the news that a Covid-19 vaccine could be arriving soon, misinformation about vaccines is likely to surge. | Matteo Rossetti/Archivio Matteo Rossetti/Mondadori Portfolio via Getty Images

Facebook, Twitter, and YouTube are bracing for more misinformation.

On Monday, Pfizer and BioNTech announced in a press release that their vaccine candidate was more than 90 percent effective at preventing Covid-19 infection, based on initial results from their ongoing phase 3 clinical trial. The companies expect to apply for emergency use authorization from the Food and Drug Administration (FDA) by the end of November and could have as many as 50 million doses produced by the end of 2020.

This is tremendous news — and misinformation about it is already circulating on social media. According to research from VineSight, a slew of prominent Twitter accounts, including those of Donald Trump Jr. and Sen. Ted Cruz, are already questioning the timing of the results’ release, which came just days after the presidential election. By midday, tweets pushing that narrative had racked up more than 20,000 shares. The researchers estimate that Donald Trump Jr.’s tweet alone could have been seen by nearly 7 million people.

The dream of bringing a speedy end to the pandemic is a complicated one. Even when a vaccine does win initial FDA authorization in the United States, we should expect a lengthy period of “chaos and confusion,” one expert recently told the New York Times. Much of that disarray could play out on social media.

From the possibility of multiple vaccines to regionally distinct distribution plans to still-evolving research, the process of vaccine implementation is already stoking anxiety and misinformation. Since the pandemic began, Facebook, Twitter, and YouTube have faced pressure to combat conspiracy theories about Covid-19 vaccines. When one or more vaccines are ultimately offered to the public, the companies will also need to keep promoting accurate information about ever-evolving public health precautions. And they must act sooner rather than later to grapple with the task of communicating about and moderating this next period of the pandemic, according to Jennifer Reich, who has studied vaccine hesitancy at the University of Colorado Denver.

“This is not going to be magic,” Reich told Recode. “I think that the way the vaccine has been messaged has been like, ‘Just wait till we have a vaccine and then we can all go back to life as normal.’ That’s probably not a realistic expectation.”

Public health and social media experts told Recode that social media companies should expect anti-vaccination communities to capitalize on people’s understandable concerns about a potential Covid-19 vaccine. At the same time, many people will be confused and frustrated by how the vaccine is distributed, and some may be angry when they see others getting a vaccine before they do. All of that will come amid conspiracy theories and other misinformation that has already spread about potential Covid-19 vaccines.

Basically, it could be a very, very complicated mess.

In preparation for a vaccine, social media platforms are fine-tuning their rules

With news of a viable vaccine drawing closer, social media companies are fine-tuning their policies in anticipation of misinformation and changes in public health guidance. Notably, their approach seems similar to how they’ve moderated content around the 2020 presidential election as well as the pandemic in general. Facebook, Twitter, and YouTube have looked to elevate reliable sources, like local officials and organizations such as the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO), and they say they will continue turning to official sources of health information if and when a vaccine is announced.

Facebook has changed its approach to clamping down on misinformation and elevating accurate public health messaging in recent months. For much of the pandemic, the company had a policy of taking down misinformation that could cause imminent physical harm, like the false claim that face masks cause Covid-19, and reducing the prominence of vaccine hoaxes flagged by global health authorities in places like its News Feed and search results. But starting this fall, Facebook became more proactive, launching a campaign that urged people to get the flu vaccine and banning advertisements that discourage people from getting vaccinated. Ads related to legislation about vaccines are still permitted, as are false claims about specific vaccines still undergoing trials, which Facebook has argued don’t meet its threshold of potential physical harm.

“While public health experts agree that we won’t have an approved and widely available Covid-19 vaccine for some time, we understand it will pose new challenges and we are actively consulting with health experts on our approach,” Facebook’s vaccine policy lead Jason Hirsch told Recode in an email. “In the meantime, we continue to work with them to remove verifiably false claims about the virus that could lead to imminent harm.”

Hirsch is the same Facebook executive who told Reuters in August, “There’s a ceiling to how much we can do until the facts on the ground become more concrete.” At the time, the company also expressed concern that more forceful takedowns of vaccine-critical views could actually drive more people away from getting vaccinated.

Meanwhile, a Twitter spokesperson told Recode that the company recognizes its role in spreading credible public health information and is still working out how its policies and products might change once a medically authorized vaccine is announced. Currently, Twitter says it blocks misleading advertising about vaccines and directs people to public health authorities like the Department of Health and Human Services.

YouTube, for its part, is currently building the infrastructure to ensure that content about Covid-19 vaccines from public health authorities is elevated. The company, which already bans claims that there is a proven vaccine, recently announced that it would remove videos with information about vaccines that goes against the guidance of local public health officials or organizations like the WHO. It is also limiting the distribution of videos about vaccines that are “borderline,” the same approach it has taken to other on-the-edge content.

But while these three platforms will focus on misinformation about a Covid-19 vaccine, they’re still allowing for criticism of such a vaccine and expressions of reluctance toward getting it. Public health experts told Recode that room for criticism and questioning is vital, and that people’s concerns about a vaccine should not be unilaterally removed from platforms.

Still, some potentially worrisome content is already on social media. On YouTube, there are some videos of people proclaiming why they will not take a Covid-19 vaccine. On Twitter, Recode found an account purporting to sell a Covid-19 vaccine made in China. And on Facebook, there are specific groups focused on organizing against taking a Covid-19 vaccine if and when one arrives.

Anti-vaccine content has swirled online for years. It could get a lot worse.

Many Americans have grown more hesitant about taking a Covid-19 vaccine in recent months. Pew found in September that just over half of Americans now say they would definitely or probably get the vaccine, down from 72 percent in May.

It’s not a mystery why there are varying levels of hesitation about a Covid-19 vaccine. Some are skeptical of the comparatively short time it will have taken to produce a vaccine — it would likely be one of the fastest vaccines produced in human history — and many are worried that the vaccine development process has been politicized.

At the same time, the amount of vaccine misinformation has varied — and at times surged — amid the pandemic, according to research from VineSight. Some conspiracy theories have focused on false claims that billionaires like Bill Gates were behind a made-up effort to secretly implant microchip-sized tracking devices. More recently, theories have emerged that Democrats are somehow behind efforts to stall a vaccine.

Adding to confusion and tension will be those who politicize the vaccine’s research and distribution.

And when a medically approved Covid-19 vaccine is announced, we should expect that anti-vaxxers, political activists, and conspiracy theory groups like QAnon will go after the pharmaceutical companies involved. Jonathan Morgan, the CEO of the social intelligence firm Yonder, is working with some of these firms and says some groups will target researchers in order to use them “as a platform to get more attention for whatever they’re pursuing.”

At the same time, expected delays and pauses in trials (normal occurrences that should actually make the process more trustworthy) could be weaponized to exacerbate the growing feeling among the public that a Covid-19 vaccine is too new and too untested, even if public health officials give it the all-clear.

“We have a new virus coupled with a new vaccine coupled with a new way of life — it’s too much newness to people,” Ysabel Gerrard, a digital sociologist at the University of Sheffield, told Recode. “I think the pushback against a Covid-19 vaccine is going to be on a scale we’ve never seen before.”

On a brighter note, it’s also likely that people will spread a significant amount of positive content about a Covid-19 vaccine. We should expect family and friends, as well as high-profile figures like celebrities and politicians, to post supportive messages about the importance of getting vaccinated.

Platforms still have time to prepare for nightmare scenarios

Whether they like it or not, social media platforms will be a primary place for many people to learn about a Covid-19 vaccine, and the stakes will be incredibly high. One major challenge will be that, even once we have an approved vaccine, there could be scenarios that nobody prepared for. These situations won’t be as simple as directing people to accurate information when they share false claims about a Covid-19 vaccine.

“We can certainly expect that, since it’s a novel vaccine, there are going to be things that we didn’t anticipate,” notes David Broniatowski, a George Washington University engineering professor who has studied social media and vaccines.

But there are things social media companies can do now to prepare for such situations. In addition to lowering expectations for the first vaccine, platforms will need to anticipate the kinds of questions people will have about it and ensure that people have room to openly discuss and question aspects of the vaccine. Ultimately, one of the most powerful things these companies could do is help make the case that vaccination itself is a public health imperative.

We don’t have an approved vaccine yet, but news of Pfizer’s initial results hints that one could be coming, and more vaccine candidates may follow. That means that as the status of a vaccine shifts, so will the conversations about it on social media. Misinformation won’t go away, even with companies banning it. But from preparing people for a complicated and lengthy distribution process to elevating accurate public health updates, these companies could still make a big difference.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.

Author: Rebecca Heilweil
