“People do not trust that Facebook is a healthy ecosystem”

Facebook CEO Mark Zuckerberg testifying virtually to Congress in July 2020. | Mandel Ngan/Getty Images

Leading social media researcher Laura Edelson explains her misinformation fight with Facebook.

New York University researcher Laura Edelson is at the center of the latest major Facebook controversy over the misinformation that’s eroding our democracy and encouraging Covid-19 vaccine hesitancy.

Earlier this week, Facebook abruptly shut down the personal Facebook accounts and research tools of Edelson and two of her colleagues at the NYU Ad Observatory, which studies political advertisements and misinformation on the platform.

Facebook says the Ad Observatory was violating people’s privacy by tracking some users’ data without their permission through its Ad Observer browser extension tool. Edelson denies this, saying her team only collected data from people who volunteered to share their information. Facebook’s move drew condemnation from free speech advocates and lawmakers, who accused Facebook of squelching independent research. The FTC criticized Facebook’s decision, saying the company’s initial rationale was “inaccurate.”

And Edelson says Facebook is trying to stifle her work, which has shown that ​Facebook has failed to disclose who pays for some political ads and that Facebook users engage with misinformation more than other kinds of information on the platform. “It doesn’t like what we’re finding, and I think it is taking measures to silence us,” Edelson told Recode in her first in-depth interview since the accounts were suspended.

In response to Edelson’s claims that Facebook is silencing her research, Joe Osborne, a spokesperson for Facebook, sent the following statement, in part:

“This doesn’t comport with the facts. We work with researchers around the world, and value work led by NYU’s team. That’s why we went above and beyond to explain these violations to them and offered them an additional privacy-safe dataset containing targeting information for 1.65 million political ads.”

But Facebook’s effective shutdown of the Ad Observatory raises larger questions about whether the company is trying to limit outside interrogation of its business practices in the name of protecting its users’ privacy. At the same time, the social media network has good reason to be worried about privacy as it faces intense regulatory scrutiny for past missteps that resulted in the largest penalty the Federal Trade Commission has ever imposed.

Edelson is one of several researchers who have complained that Facebook doesn’t share enough data with outside researchers to effectively study the scale and impact of misinformation.

Recode’s interview with Edelson, below, has been edited for clarity and length.

Shirin Ghaffary

I want to ask about Facebook’s rationale for banning you. [Facebook] said the project was tracking users’ information without their consent. Can you explain what your understanding is? Is it true that you were tracking any users’ information without their consent?

Laura Edelson

We collect ads, and we collect ad targeting-associated information. What Facebook is saying is that those advertiser names — which we do collect, to be really clear — are private user information. And I think, honestly, this is just a point where Facebook and we disagree. We do not think that advertiser names and ads are private information.

Shirin Ghaffary

So Facebook disagrees with you in that it considers advertisers to be users. But putting that aside, Facebook says Ad Observer was also collecting some user data, not just advertiser data — like comments. What do you say to that?

Laura Edelson

That’s not true. We do not collect anything other than ads. We do not collect any private information. We do not collect user comments. We actually take great pains to be very careful about ad targeting information [so] that we only collect targeting fields that we know do not contain private information.

If there’s a field we don’t recognize, we don’t collect it. And we take all of those steps because we take user privacy extremely seriously. User privacy is our North Star. And that’s actually why, in addition to everything I’ve just said, Mozilla has done a security and privacy review of Ad Observer. And they agree with us that Ad Observer is safe, and it protects user privacy.
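
(The collection approach Edelson describes works like an allowlist: only targeting fields known not to contain private information are kept, and any field the tool doesn’t recognize is discarded rather than collected. The short Python sketch below illustrates that idea; it is not Ad Observer’s actual code, and the field names in it are hypothetical.)

# Illustrative sketch only; not Ad Observer's code. Field names are hypothetical.
KNOWN_SAFE_TARGETING_FIELDS = {"age_range", "location_region", "interests"}

def filter_targeting_data(raw_targeting: dict) -> dict:
    """Keep only targeting fields on the allowlist; drop anything unrecognized."""
    return {
        field: value
        for field, value in raw_targeting.items()
        if field in KNOWN_SAFE_TARGETING_FIELDS
    }

# An unrecognized field ("custom_audience_id") is simply never collected.
print(filter_targeting_data({
    "age_range": "25-34",
    "location_region": "New York",
    "custom_audience_id": "12345",
}))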

Shirin Ghaffary

It really gets down to this issue of trust, right? Who do we trust to study Facebook? Do we trust groups like yours? Or do we trust Facebook on how to do this the right way while preserving people’s privacy?

Laura Edelson

I think this is where I try not to ask people to just trust me. I don’t think that’s a fair thing to ask. I show my work. I make my data public; I make my code public. I try to have other people review my work. Facebook is the one saying, “Trust us.” Facebook is the one saying, “Don’t look behind this curtain.”

Facebook has disputed my research on engagement and other folks’ research on engagement with this information by saying that we don’t have all the data. … But they don’t actually make that data available publicly. So I don’t think that it’s fair for either me or Facebook to just say, “Oh, you should trust us.” But I feel like I have laid my cards on the table. I have been as transparent as I know how to be with the public. And Facebook hasn’t.

Shirin Ghaffary

Facebook has public data it releases to everyone about its ads through the ad library program. And they have other special programs for researchers as well. Why is that not sufficient for you? Why did you start this project to have users opt in and let you in under the hood to see more information about the ads they’re seeing?

Laura Edelson

So there are two big questions that we think Ad Observer is the best way to answer. First, I really do want to give Facebook some credit here. Facebook honestly makes a ton of information about political ads available. And we applaud them for that. But what they don’t do is make information about non-political ads available to researchers.

The other big thing that we get from Ad Observer is [ad] targeting data. I think one thing that we realized early on is that ad targeting is really important for understanding how advertisers are trying to get to particularly vulnerable populations. And so in terms of identifying misinformation that is aimed at those vulnerable populations, ad targeting is a really important part of that overall picture. And Facebook does not make ad targeting data available through the ad library API.

Shirin Ghaffary

Would it be easier for you if Facebook just published [ad targeting data] on its own and you didn’t have to build this browser extension?

Laura Edelson

Absolutely. You know, I’ve said this before, and I mean it: If Facebook made information about all ads available through their API, and if they made targeting information available for all political ads, we wouldn’t need to do this project. I would love to close up shop and go home, to be honest.

(API stands for application programming interface, a way for two applications to communicate with each other and exchange data. Some researchers have called on Facebook to open up the APIs it offers advertisers so that researchers can collect more information about how companies target and display ads to certain people.)
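
(For a concrete sense of what that kind of access looks like in practice, the Python sketch below queries Facebook’s Ad Library API, the public interface discussed above. The “ads_archive” endpoint is real, but the API version, parameter names, and fields shown here are approximate and may differ from Facebook’s current documentation; a valid access token is required.)

import requests

# Facebook Graph API endpoint for the Ad Library ("ads_archive").
# The version number and field names are approximate; check Facebook's
# Ad Library API documentation for the current schema.
AD_LIBRARY_URL = "https://graph.facebook.com/v18.0/ads_archive"

params = {
    "access_token": "YOUR_ACCESS_TOKEN",  # placeholder; a real token is required
    "search_terms": "vaccine",            # free-text search over ad content
    "ad_reached_countries": "US",         # limit results to ads shown in the US
    "fields": "page_name,ad_creative_bodies,ad_delivery_start_time",
}

response = requests.get(AD_LIBRARY_URL, params=params)
response.raise_for_status()

for ad in response.json().get("data", []):
    print(ad.get("page_name"), ad.get("ad_delivery_start_time"))

(Note that ad targeting information is not among the fields the Ad Library API exposes, which is the gap Edelson describes above.)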

Shirin Ghaffary

Do you think that Facebook is penalizing you more harshly than other groups for allegedly violating its Terms of Service or privacy parameters?

Laura Edelson

I don’t want to get into reading Facebook’s mind here. But I will say that we are not the only browser extension that allows users to crowdsource ad observations. There are several others, most notably probably Who Targets Me, which is based out of the UK. The one thing I know of that we do differently is [that] we do publish our data as well.

(Facebook spokesperson Joe Osborne sent the following statement in response to concerns that it is enforcing its rules on some data collection tools but not others:

“We enforce neutrally across the board, regardless of the publicly-expressed intentions of those in violation. The enforcement actions we took against these researchers were consistent with our normal enforcement practices in these kinds of circumstances.”)

Shirin Ghaffary

On Tuesday night, after news broke that Facebook had revoked your and your colleagues’ access, you wrote that Facebook was silencing your research because it calls attention to problems on its platform and that Facebook “should not have veto power over who is allowed to study them.” What do you mean by that? And can you explain this idea that the company should not have veto power?

Laura Edelson

Facebook is saying that their hands are tied, that they have to do this in the name of user privacy. It just seems to me that if they actually believed that, they would have taken some action against Ad Observer, our browser extension. But they didn’t do that. They didn’t sue us. They didn’t try to block our extension technologically. They didn’t petition the browser extension stores to have our extension removed. Instead, they took away our ability to research their platform in other ways. So to me, their words just don’t match their actions.

Shirin Ghaffary

You’re not the first person who has questioned if Facebook is trying to silence research that it disagrees with. Do you think this is a bigger issue? Have you seen other examples of this?

Laura Edelson

Frankly, yes. I think that the public hand-wringing over CrowdTangle a few weeks ago was just another instance of this. [For] researchers who have been looking into how [Facebook] magnifies certain forms of content, it doesn’t like what we’re finding, and I think it is taking measures to silence us.

(CrowdTangle is a data analytics tool owned by Facebook that has been used to show how right-wing media pages gain high levels of shares and “Likes” on Facebook. Some Facebook executives were reportedly considering limiting outside access to CrowdTangle due to concerns that its data was not portraying the company in a good light, according to recent reporting in the New York Times. Facebook disputes this.)

Shirin Ghaffary

Why is it important for this type of research to continue?

Laura Edelson

I think we have reached a point where most people do not trust that Facebook is a healthy ecosystem. I think there’s pretty substantial poll data to show that. And I think we’ve reached a point where disinformation online is having really serious impacts in the world at large. Look at the problems with vaccine disinformation, look at the fact that there are still millions of Americans who think that the election was stolen. We just are not operating with a healthy information ecosystem right now.

And [while] Facebook is not the only reason that this is the case, they’re certainly a part of it. Right now, I really believe that we are racing against the clock to better understand how this is happening, to understand why this is going wrong so badly, to figure out what we can do to combat it. This is a right now problem. And when Facebook stops researchers like me from doing our jobs, they’re taking people out of a fight that we just can’t afford to lose.

Shirin Ghaffary

There are projects that Facebook does with outside researchers, and many of them have produced findings critical of the impact of some of the information on the platform. So how do we make sense of those two realities? Can Facebook be both enabling critical research and stifling it at the same time?

Laura Edelson

Absolutely. To be really clear about something else: Facebook is a big company with a lot of people. There are many people working inside Facebook; there are many researchers who work collaboratively with Facebook who are doing excellent work. And I think it’s important that those folks continue to do their work. I think that what we are seeing is, you know, almost a little bit of corporate schizophrenia. You have to understand, my project is aimed squarely at ads, and ads are Facebook’s business — advertisers are its customers.

And they are somewhat understandably very sensitive about protecting what they see as the interests of their customers. So I certainly understand why Facebook might have a rational economic interest in making sure that information about ads that they do not control isn’t public. I just happen to think that the public has a right to know. And that trumps any economic interest that Facebook might have.

Shirin Ghaffary

Sen. Mark Warner made a statement criticizing Facebook for what it did to your research group, calling it an attempt to cut off an outside group’s transparency efforts. He called for legislation on this. What do you think about that?

Laura Edelson

I’m really sad that maybe it has come to this. Maybe it is time for legislative change. I think that means that this voluntary transparency regime is just not working.

Shirin Ghaffary

I know that you’re not a policymaker, but you’re in the middle of this debate. What do you think potential policy that would help researchers get more access to Facebook could look like?

Laura Edelson

One thing that I’ve put forward in partnership with many other researchers is that, frankly, I think it is time for universal ad transparency.

I think that Facebook and other large platforms that use algorithmic targeting for ads or run self-service ad platforms should make all ad data available to researchers and the public. That includes non-political ads, and it includes targeting information. I think that’s the next step we need for the public to have more trust in how they’re being exposed to ads on these platforms. In addition to that, other forms of transparency around public content on social media platforms will probably also be necessary.

I think we’ve all just seen too many instances where things as serious as terrorist attacks are being planned in public on social media. I think we have probably reached a point where if platforms want to be the public square, they have to be a lot more open to journalists and researchers.

Author: Shirin Ghaffary
