Mark Zuckerberg on leaked audio: Trump’s looting and shooting reference “has no history of being read as a dog whistle”
Facebook co-founder and CEO Mark Zuckerberg testified before the House Financial Services Committee in October 2019. | Chip Somodevilla/Getty Images

On a tense call with employees, the Facebook CEO defended his decision not to moderate Trump’s posts.

In an internal video call with Facebook employees on Tuesday that Recode obtained, CEO Mark Zuckerberg doubled down on his controversial decision to take no action on President Donald Trump’s post in which Trump referred to the ongoing protests in the US against racism and police brutality and said, “when the looting starts, the shooting starts.”

Facebook’s handling of Trump’s post — which included language similar to what segregationists used when referring to black protesters in the Civil Rights era — has divided employees at Facebook and prompted them to openly criticize Zuckerberg in a way they never have before. Around 400 employees staged a virtual walkout of work on Monday, at least two employees have resigned in protest, others have threatened to resign, and several senior-level managers have publicly disagreed with Zuckerberg’s stance — calling for him to take down or otherwise moderate Trump’s post, as Facebook’s competitor Twitter already has.

This tension spilled over into the Tuesday Q&A meeting, which around 25,000 employees tuned into — with several employees posing questions that were highly critical of the company’s actions and policies and scrutinizing whether the company is listening to racially diverse voices in its upper ranks.

“I knew that the stakes were very high on this, and knew a lot of people would be upset if we made the decision to leave it up,” Zuckerberg said on the call. He went on to say that after reviewing the implications of Trump’s statement, he decided that “the right action for where we are right now is to leave this up.”

Zuckerberg said that he did a thorough analysis of the history around the apparent reference in Trump’s post, which he called “troubling,” but ultimately did not find it to be an incitement of violence under Facebook’s policies.

“We basically concluded after the research and after everything I’ve read and all the different folks that I’ve talked to that the reference is clearly to aggressive policing — maybe excessive policing — but it has no history of being read as a dog whistle for vigilante supporters to take justice into their own hands,” Zuckerberg said on the call. He also said that, overall, Facebook still reserves the right to moderate Trump.

“This isn’t a case where [Trump] is allowed to say anything he wants, or that we let government officials or policy makers say anything they want.”

Facebook has largely avoided moderating Trump’s posts on its platform. In March, however, after Recode and other outlets reported on deceptive advertisements that made a Trump campaign questionnaire appear to be the official 2020 census, Facebook removed these ads from its platform.

After opening the call, Zuckerberg went on to take questions from a preselected list, with employees asking questions via videoconference. In one of several tense exchanges, an employee asked Zuckerberg to confirm how many black people were involved in Zuckerberg’s final decision not to take down Trump’s post. Zuckerberg’s answer: just one person (Facebook’s global diversity officer, Maxine Williams). Zuckerberg said only about six people were involved in the decision-making process, including Facebook COO Sheryl Sandberg and policy VP Joel Kaplan, who has come under scrutiny for reportedly stymieing efforts to reduce polarization on the platform and openly supporting Supreme Court Justice Brett Kavanaugh during his controversial Senate hearings.

The employee pressed why Facebook’s head of integrity, Guy Rosen, who is tasked with overseeing efforts around general user safety on the platform, wasn’t in the final group of decision-makers.

In response, Zuckerberg appeared to stumble in his reasoning, first saying that he wasn’t sure whether Rosen was part of the final decision, then saying that he wasn’t — but that Rosen is ultimately responsible for building and enforcing policies overall, not for this particular decision.

“I don’t think it’s great that we’re not super clear on whether the VP of integrity was included on a matter of voter suppression and societal violence,” the employee said on the videoconference.

“How can we trust Facebook leadership if you show us a lack of transparency?” asked another employee.

When asked about the criticism Facebook faced in the meeting, a spokesperson for the company sent Recode the following statement:

“Open and honest discussion has always been a part of Facebook’s culture. Mark had an open discussion with employees today, as he has regularly over the years. He’s grateful for their feedback.”

In an acknowledgement of employees’ anger over the situation, Zuckerberg outlined several areas of self-designated improvement for Facebook on the call, including being more transparent around the decision-making process for moderating contentious posts.

Most notably, Zuckerberg said the company is considering adding labels to posts by world leaders that incite violence, instead of simply leaving them up or taking them down. He also said that since the US may be entering a “prolonged period of civil unrest,” the company may change its policy on what kinds of announcements government leaders can make about state violence, such as excessive use of police force.

Ultimately, though, while Zuckerberg was at times conciliatory, his tone was staunchly defensive of Facebook’s stance not to make what he views as knee-jerk decisions against content that people could find personally offensive. He said that even if the company does change its policies around moderating potentially violent political speech like Trump’s post, it would not happen overnight.

“These policies have to be developed,” said Zuckerberg. There’s “no way we can do something like that on the fly.”

That raises the question of why Facebook isn’t better prepared to moderate political speech that pushes the boundaries of its platform’s rules on inciting and glorifying violence. Since the 2016 US presidential election, how Facebook moderates content has come under fire, and the company has promised to do better. Its long-awaited independent oversight board, meant to review controversial cases like Trump’s post, still hasn’t officially launched; meanwhile, the next US presidential election is less than six months away.

On the call, Zuckerberg acknowledged that this is only the beginning of employee discussion on the company’s handling of the very real controversies coming its way around race, politics, and police violence.

“I know we’re going to keep talking about this — some of the issues, they’re deep, and they’re not going to go away any time soon,” Zuckerberg said on the call.


Author: Shirin Ghaffary
