Democratic Party leaders are “banging their head against the wall” after private meetings with Facebook on election misinformation

Facebook CEO Mark Zuckerberg testifying before the House Energy and Commerce Committee in April 2018. | Yasin Ozturk/Anadolu Agency/Getty Images

After months of talks, Democrats say Facebook isn’t ready for the election.

After months of working with Facebook to safeguard the 2020 election, several national Democratic Party leaders say the company has failed to meet promises to stem the tide of misinformation spreading on its platform. If the election’s results are contested after November 3 — which is an acute concern since a surge in mail-in ballots is expected to delay the count and President Trump has refused to commit to a peaceful transition of power — they worry that Facebook is utterly unprepared to prevent people from using its platform to spread chaos.

Recode spoke with four sources with direct knowledge of ongoing monthly private conversations about election misinformation between several senior Democratic party committee leaders and senior members of Facebook’s policy team, including VP and director-level staff. These sources spoke on the condition of anonymity for fear of repercussions for discussing private talks.

Democratic sources told Recode that monthly calls with Facebook that started in May have been “maddening” and have left party members “banging their head against the wall.” They say Facebook employees in these meetings — while seemingly well-intentioned — have failed to stop the spread of misinformation attacks against Democratic candidates, been reluctant to share information about extremist groups encouraging voter suppression, and appeared “flat-footed” in their plan for dealing with conflicting information about the election results.

Facebook spokesperson Tom Reynolds said that Facebook has been proactive in working with party leaders on both sides of the aisle about stopping abuse of its platform ahead of the election and that this is a top priority for the company.

“Elections have changed and so have we. Since 2016, we’re the only tech platform to build a global fact-checking network of over 70 fact-checkers. We have disrupted more than 100 coordinated inauthentic behavior networks and have hired 35,000 people to work on enforcing our policies and keeping the people on Facebook safe,” said Reynolds.

In recent months, Facebook has taken several steps to try to limit election chaos on its platform. It’s banned political ads in the week leading up to the election and promised to take down specific misrepresentations about voting or about when to vote. It’s launched a hub, its Voter Information Center, which the company says has so far helped 2.5 million Americans register to vote. And the company has tightened its rules around who can purchase political ads, though Facebook still doesn’t offer voters much insight into why they’re targeted with political ads on its platform.

But some Democrats believe these moves aren’t enough to stop the barrage of half-truths, lies, and violent rhetoric about the election process that continue to spread on the platform.

“They’re essentially slapping Band-Aids on wounds that require emergency room stitches,” said one Democratic source familiar with ongoing discussions with Facebook.

While Democrats want to see Facebook more aggressively remove misinformation relating to the election, Facebook faces opposing pressure from Trump and leading Republican lawmakers, who claim (without evidence) that Big Tech companies are biased against conservatives when they moderate their platforms. Until now, it has mostly been Republicans accusing Facebook and other tech companies of political bias; Democrats, including Joe Biden and top Democratic lawmakers, are now getting louder in their criticism, too.

Four years after 2016, Facebook is grappling with similar problems

In 2016, Russian state operatives exploited Facebook by using fake accounts to sow political division in the US, posting ads and other content that reached some 146 million Americans on Facebook and Instagram. (Twitter also reported that tens of thousands of Russian-government-linked accounts reached around 670,000 Americans leading up to the race.) After initially calling it “crazy” to think that Facebook influenced the outcome of the election, Facebook CEO Mark Zuckerberg later acknowledged Facebook’s role in spreading disinformation, apologized, and promised to do better.

As part of its promise to improve, the company has worked more closely with both Democratic and Republican Party leaders, along with outside researchers and US intelligence agencies, to police misinformation on the platform.

Even if Democrats are motivated at least in part by their party’s own interests, their complaints about Facebook raise concerns for voters on both sides of the political aisle. What’s at stake is ensuring Americans have accurate information about how to vote.

In early September, for example, Trump encouraged his followers in North Carolina to vote twice (which is a felony) because of baseless allegations that their mail-in ballots wouldn’t count. And this summer, Trump falsely asserted that every Californian would automatically receive an absentee ballot, which would lead to mass voter fraud (in fact, only registered voters will receive ballots, and there is no evidence of mass voter fraud).

In both cases, Facebook did not take down Trump’s posts, nor did it explicitly call them out as misleading. It did place a general label under Trump’s North Carolina post asserting that voting by mail is trustworthy and linking to its Voter Information Center, but only hours after Trump had already shared the misleading post with his 30 million followers on the platform.

Looking ahead, Democratic Party leaders worry that Trump — or his supporters — could prematurely declare a false victory via social media on election night and that Facebook is ill-equipped to handle the situation.

“Facebook constantly says they’re working on it,” said another source. “We understand these are developing threats, but at the same time, elections have been on the calendar for a while, and it’s unsettling that we have less than 50 days to go.”

Facebook has made progress, but it’s not enough

According to sources, Democrats hoped that monthly meetings between campaign committee leaders and Facebook’s policy team would be an opportunity to talk through substantive solutions to problems around misinformation on Facebook’s platform. Instead, several sources said that Facebook largely used the meetings to run a PR charm offensive, trying to convince Democrats that Facebook is working hard to fix its problems and that Facebook CEO Mark Zuckerberg is a “principled person.”

Democratic leaders told Recode they’ve seen some progress overall from Facebook since 2016: The company has a more direct line of communication with Democrats than four years ago, when these monthly meetings weren’t even happening. They acknowledge Facebook has gotten better at identifying malicious foreign actors.

But a lot has changed in the US political landscape since 2016, and domestic actors, including some Republican Party leaders and extremist groups, have adopted disinformation tactics once associated with foreign interference. Facebook hasn’t adapted, Democratic leaders say.

The company has hesitated and at times outright refused to take down politicians’ false claims about voting, particularly when President Trump makes them, and its handling of extremist groups such as QAnon has been slow and insufficient.

“Across the board, progressives, Democratic operatives, and civil rights leaders have been increasingly frustrated, more exasperated, and less hopeful that Facebook has any desire to do the right thing,” said Jesse Lehrich, a foreign policy spokesperson for Hillary Clinton’s 2016 campaign who now leads the social media policy nonprofit Accountable Tech. Lehrich was not present in Democratic Party leaders’ recent meetings with Facebook.

“People have spent a lot of time in good faith to improve some of these things, and there was just a tipping point where everyone thinks it’s a waste of your time to work with these people [Facebook] because they just string you along,” Lehrich told Recode.

Now, Democratic sources say time is running out — it’s 33 days until the election.

“It feels like we’re just continuing to just throw things into the void and yell off of a rooftop, and it’s falling on deaf ears,” said one Democratic operative with direct knowledge of the meetings between Facebook’s policy team and Democratic campaign leaders.

Several other top Democratic operatives who haven’t been present in the monthly meetings called out two Facebook leaders, besides Zuckerberg, as sources of frustration: Facebook COO Sheryl Sandberg and VP of Public Policy Joel Kaplan. Neither Sandberg nor Kaplan is present in the monthly meetings. Sandberg has long been the public face and behind-the-scenes reputation manager for Facebook when dealing with Democrats, civil rights leaders, and other liberal groups, which hasn’t always gone well. Earlier this summer, Sandberg and Zuckerberg had a disastrous call with leaders of civil rights groups such as Color of Change and the NAACP Legal Defense and Educational Fund over concerns about hate speech on Facebook.

Kaplan has long been scrutinized both inside and outside the company because of his conservative ties, including his public support of Brett Kavanaugh during Kavanaugh’s contentious Supreme Court confirmation hearings. In August, BuzzFeed News reported that Kaplan had intervened to question a fact-check on a Facebook page belonging to right-wing figure Charlie Kirk.

Democrats feel “in the dark” when it comes to how Facebook is handling misinformation

One key complaint Democrats have is that Facebook leaves it up to them to flag examples of political misinformation on its platform, according to several sources. Party leaders expected to see the social media giant take more responsibility for cleaning up its own messes. Facebook disputed this characterization and said the company regularly shares its strategy for combating misinformation with both Democratic and Republican Party leaders.

“They [Facebook] say ‘let us know if you see anything, you should be monitoring for this,’” said one Democratic source with direct knowledge of the conversations. “Our approach is that we are a resource-stretched political committee; they are a multibillion-dollar company. They should be able to get this right and put in the necessary resources.”

Facebook employees have also been tight-lipped about the extent of the company’s monitoring of extremist groups in monthly meetings, according to three sources familiar with the misinformation discussions.

In one recent meeting, Democratic campaign committee leaders asked Facebook to share which extremist groups on the platform the company was monitoring that could cause physical harm during the election cycle, sources said. Facebook representatives declined to name any, according to three sources with direct knowledge of the meeting.

“I’m not asking for Facebook to share specific data, but they wouldn’t even name five or 10 of the extremist groups that they were most closely watching,” said one source.

“It didn’t seem like something they were very much on top of,” said another source.

Facebook’s Reynolds said the company takes extremist threats seriously and prioritizes this work. In recent months, the platform has tightened its policies against groups like QAnon, the boogaloo movement, and others — but it has also received criticism for not being quick or comprehensive enough.

The dreaded “post-election, pre-results” period

Democrats, civil rights leaders, bipartisan policy leaders, and Facebook itself have warned about the dangerous so-called “post-election, pre-results” period when election polls have closed but the vote totals aren’t finalized yet. In 2020, that’s expected to be a bigger issue than usual because many more people are expected to vote by mail due to Covid-19. It’s a time when political candidates and bad actors could try to undermine Americans’ faith in the electoral process, causing civil unrest.

Trump has been declaring for months that he expects the election to be “rigged” because of the expected increase in mail-in ballots — setting the stage for a contested result.

Sources familiar with Facebook’s discussions with Democrats say Facebook doesn’t seem prepared enough for this scenario.

“It was brought up on a recent call, about what happens post-election if there is not a result, what happens between Election Day and results day,” said one Democratic source familiar with the discussions. “And they essentially had nothing.”

Another Democratic Party source familiar with the discussions said that Facebook assured Democrats that the company was working on “tabletop exercises” and “strategizing behind the scenes” to prepare for election night. When pressed on specific scenarios, however, such as Trump contesting the election results, the company wouldn’t go into details.

“With the hypotheticals we would bring up, they [Facebook] would never have any interest in discussing them,” said another Democratic Party source familiar with the discussions. “They would say, ‘We can’t really deal with hypotheticals, we don’t know necessarily how President Trump would respond.’”

Since Facebook first started its monthly talks with Democrats, it has announced it will work with respected outside sources like Reuters to call the elections, and it will mark up any premature posts about election results with a link to those trusted sources. It also announced it will ban political ads that falsely declare a premature victory. Sources said these are welcome announcements, but not enough.

To truly safeguard the conversation around the election, party leaders say, Facebook needs to go further and actually plan to mark up posts that dispute election results with a clear fact-check, similar to what Twitter has done in the past for some of Trump’s posts, or delete the posts altogether.

Facebook declined to comment on further plans for how it will handle disputed election results.

“I really don’t think that any of these asks have been partisan,” said one Democratic Party source. “We’re not saying you should do this in order to help Democrats win. That is not fair. It’s to protect voters.”

All of this leaves Democratic leaders deeply concerned about what might unfold on Facebook in the days leading up to the election, and during the still-undetermined time period before votes are counted and the election is called.

Experts estimate it might take weeks until there is a final result, and in the presidential debate earlier this week, Trump suggested it could take months before he accepts the result of the election.

“There is just a lack of urgency when it comes to the election part of this. We’re talking about just a handful of weeks at that point out from Election Day, and they’re still unable to give us timelines,” said one source with knowledge of the meetings. “There’s a lot of talk, but not really much transparency.”



Author: Shirin Ghaffary
