Democrats are testing the limits of Facebook’s refusal to take down false ads from politicians, and it isn’t pretty.
Under pressure from politicians and political groups, Facebook is starting to make exceptions to its policy of not fact-checking advertisements published by politicians, a position CEO Mark Zuckerberg in particular had defended forcefully.
To back up, this all began this fall when Facebook announced it wouldn’t fact-check political speech, including ads, and campaigns started to test the implications of this policy. In September, Facebook refused to take down an ad run by Donald Trump’s reelection campaign that made false claims about former Vice President Joe Biden, his son Hunter Biden, and their activities in Ukraine. Facebook wasn’t the only platform to refuse to pull the ad — YouTube, Twitter, MSNBC, and Fox made the same call — but Facebook caught the most flak for it.
Then, Democrats decided to challenge the policy allowing fake ads … by running fake ads of their own on Facebook. Sen. Elizabeth Warren (D-MA), who has emerged as a fierce Facebook critic in the 2020 primary, ran a fake ad claiming Facebook CEO Mark Zuckerberg had endorsed Trump’s reelection. Warren also, without evidence, suggested the social network had adopted the policy as part of a backroom deal with Trump. And last week, high-profile freshman Rep. Alexandria Ocasio-Cortez got Zuckerberg to admit in a House hearing he would “probably” let her run ads against Republicans saying they supported the Green New Deal. Along the way, Zuckerberg continued defending the policy, even as his own employees, in a rare move, wrote a letter expressing concern with the stance and pushed him to rethink his decision.
But in recent days, Facebook has wavered as progressives have tested the limits of its policy. Over the weekend, the company took down an ad that falsely claimed Sen. Lindsey Graham (R-SC) supports the Green New Deal. A left-leaning political action committee, the Really Online Lefty League, had posted the ad, and Facebook said it took the action because the ad came from a political action group, not a politician, and therefore different rules applied.
So the group found a workaround: One of the PAC members, Adriel Hampton, filed with the Federal Election Commission to run for California governor. Now a politician, as the logic of Facebook’s policies would go, he can run as many political ads as he wants.
Except apparently not. Facebook on Tuesday evening said it was nixing Hampton’s workaround. “This person has made clear he registered as a candidate to get around our policies, so his content, including ads, will continue to be eligible for third-party fact-checking,” a Facebook spokesman said in an email to Recode.
Hampton told CNN he is considering legal action against Facebook. In an interview with Recode earlier in the day on Tuesday, he said that Facebook is “basically selling you to the lying politicians.”
“I feel that I’m one of the few people who’s qualified in both that I’m an expert marketing strategist and a politician, and I think that’s what it’s going to take to either get this policy cleaned up and get Trump back on equal footing with other political committees — or to defeat the GOP, defeat Trump, and defeat the Senate GOP with fake ads,” he said.
Hampton suggested he might actually run for office — he is, after all, a longtime political consultant who most recently worked on Mike Gravel’s ill-fated presidential campaign; Hampton also made an unsuccessful bid for Congress in 2009. After Facebook announced it wouldn’t let him run fake ads, he told Recode he will now “lead a movement.”
Hampton’s fight with the platform is highlighting the real issue here: that the company’s decision-making and policy defenses when it comes to free speech on its platform can often seem arbitrary. Its policy says a politician is exempt from third-party fact-checking, and you’re technically a political candidate if you’re registered as one with the FEC. But in this case, Facebook is making an exception and a judgment about intentions.
Facebook’s hard-and-fast rule on political speech doesn’t seem so hard-and-fast, considering it’s already making exceptions to it.
The dustup also highlights just how enormous Facebook has become and, in turn, how unprepared it seems to be to moderate political speech on its platform, even after the hard lessons it learned in the wake of the 2016 election.
“The big story is that Facebook is too big to govern, and its ads system is too easy to hijack,” Siva Vaidhyanathan, a media studies professor at the University of Virginia, told Recode.
Facebook knows policing speech is a political hot potato
On its face, the decision on the Biden ad should have been an easy one for Facebook: It was the president of the United States making an obviously false claim about the former vice president of the United States.
But taking down the ad would have created two problems for Facebook. First, it would set a precedent that Facebook is responsible for policing every false political ad on its platform. That would be a challenging but not impossible task. The company has effectively addressed terrorist content and gotten better at combating election interference. It could undertake similar efforts on fake political ads.
The second and bigger complication: taking down the ad could also have caused just as much controversy as leaving it up. Trump and his supporters would likely have cried foul. Facebook and other social media companies are already dogged by unfounded accusations by Republicans that their algorithms contain anti-conservative bias, and they have done a lot of legwork to try to prove they’re not.
In other words, when Zuckerberg says, in defense of the ad policy, “most people don’t want to live in a world where you can only post things that tech companies judged to be 100 percent true,” and, “in a democracy people should be able to see for themselves what politicians are saying,” what he’s not saying is that policing political ads would be politically fraught and difficult.
“Facebook is basically saying we’re going to pretend this is a high-minded decision and we’re going to stick by it because we’d rather take the hit for the next few weeks or months on this policy until everyone burns out on it than take the hit for years every time an ad with clear falsehoods makes it through the filter,” Vaidhyanathan said.
The company doesn’t want to deal with the backlash it would face if it were to deem an ad from one political party or the other to be false. Facebook is already hypersensitive to largely unfounded claims of political bias. “I worry much more about Facebook telling me what fake news is than fake news itself,” Rory McShane, a Republican political consultant, told Recode.
Facebook has been pressured to stop dealing in political advertising altogether, with critics noting it’s only a small part of its revenue. But then that would require the company to define what a political ad is. Sure, it could ban ads from the Trump campaign, but what about the NRA? Or the American Federation of Teachers?
Still, in making one decision on the Trump ad and another decision on the ads Hampton was trying to run, Facebook showed it will in fact make political judgments about the ads on its platform. It gets to decide what ads do and don’t run, and it doesn’t have to stick to its policies.
People were bound to test this — which isn’t necessarily great, either
Renée DiResta, a 2019 Mozilla fellow in Media, Misinformation, and Trust and an expert in social media manipulation, told Recode that people like Hampton and Warren are “testing the boundaries of a bad policy by creating examples to illustrate exactly why it’s inadequate.”
“I don’t think most honest, legitimate politicians want to be written about as people who deliberately ran blatantly fake content,” she said.
It’s a complex situation, explained Syracuse University professor and communications expert Whitney Phillips: On the one hand, progressives are right to call attention to what Facebook is doing with its political ads policies and raise questions about the implications. On the other hand, it could also encourage Republicans to do the same.
But of course, the president lies a lot, and if conservatives were going to do this anyway, why not highlight the dangers ahead? It could pressure Facebook to change its policies — or the company may just dig in its heels.
Fake ads and content aren’t new on Facebook. But as we get closer to the 2020 elections, confusion about what’s real and what isn’t — especially when it comes to political discourse online — seems to only be increasing. “Just the possibility that political ads could be lying is going to have a significant impact on how people interact with the ads to begin with,” Phillips said. “I worry it’s going to seed systemic, endemic paranoia in response to political ads.”
In other words, the problem isn’t going away.
Author: Emily Stewart