Why we don’t know as much as we should about police surveillance technology

A police officer standing watch over a protest in Brooklyn. | Getty Images

Despite a growing number of high-tech tools, law enforcement agencies don’t seem to want to disclose what they’re using.


There are a host of artificial intelligence and algorithm-based technologies that the New York Police Department could be using, but few know exactly what’s in the NYPD’s arsenal. As multiple sources told Recode, trying to get the department, the largest in the United States, to reveal what surveillance tools it uses is a process full of opacity and shifting legal arguments. Basically, if you want to know how the NYPD might be using tech to police your neighborhood, tough luck.

I learned this lesson firsthand this past December when looking into various artificial intelligence-based technologies I thought the police department could be using. Like many journalists, I filed a request for public records, which is a wonky-but-nonetheless-simple process for requesting information from the government. My request covered documents like contracts and correspondence related to 21 security companies. My hope was that those documents, if they existed, could give me a better sense of what type of artificial intelligence the NYPD might be using. One technology that particularly interested me was gun detection software, which uses artificial intelligence to automatically notify you when a brandished firearm appears in a video feed.
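To give a rough sense of how that kind of tool works in general, here is a minimal sketch of a video-analytics loop that flags frames a detector thinks contain a firearm. The `detect_firearms` function and the alerting logic are hypothetical stand-ins for a vendor’s model, not any specific company’s product or anything the NYPD is confirmed to use.

```python
# Illustrative sketch only: a generic video-analytics loop for flagging
# frames that appear to contain a brandished firearm. detect_firearms is a
# hypothetical placeholder for a vendor-specific model.
import cv2  # OpenCV, used here only to read the video feed


def detect_firearms(frame):
    """Hypothetical placeholder: return (confidence, bounding_box) detections
    for firearms in a single video frame."""
    raise NotImplementedError("vendor-specific model goes here")


def monitor_feed(source=0, threshold=0.8):
    """Scan a video source frame by frame and alert on confident detections."""
    capture = cv2.VideoCapture(source)
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        for confidence, box in detect_firearms(frame):
            if confidence >= threshold:
                # A deployed system might page an officer or push an alert
                # to a dashboard; this sketch just prints.
                print(f"Possible firearm detected ({confidence:.2f}) at {box}")
    capture.release()
```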

But just over 48 hours after filing it, my entire request was rejected because, the NYPD said, those documents would reveal “non-routine techniques and procedures” — with no elaboration. So that same day, I appealed, and about a week later I heard back from the department. Now I had a new explanation: Answering my requests could reveal these companies’ trade secrets, harm their “competitive position,” or impact “imminent contract awards or collective bargaining negotiations.” If I wanted to fight further, I’d need to go to court.

It turns out I, a beginner in requesting public records, wasn’t the only one to get such an opaque and frustrating response. Several government transparency advocates interested in law enforcement tech told me they, too, have faced issues when trying to get records from the NYPD. Some have had to sue for these records. That matters because police departments are increasingly making use of new and untested types of surveillance technology. These tools aren’t only expensive for taxpayers; they also raise concerns about algorithmic bias and threats to civil liberties.

Consider predictive policing tools, which NYPD representatives described in an article in 2017. This software uses historical data, such as police reports, to predict future crime.

“[That data] could be arrest records. It could be conviction records. It could — depending on the kind of predictive policing system [because] different ones operate in different ways — it could be the anticipated weather for that day. It could be school closures,” Rachel Levinson-Waldman, a senior counsel at the Brennan Center for Justice, said.

But knowing exactly what input data a predictive policing system is using is critical. One of the major concerns about these tools is that they’ll exacerbate racial bias, and researchers have found that predictive policing tools can end up factoring in “dirty data,” including data that reflects conscious and implicit bias, as well as manipulated and even unlawful police practices.

A wireless NYPD surveillance camera system in Brooklyn. | Getty Images

The NYPD has still not provided the Brennan Center with documents related to the input data these algorithms use. The department did tell the Daily Beast last year that it uses “complaints for seven major crime categories, shooting incidents, and 911 calls for shots fired.” (The department outlined just one example in its 2017 article: 911 calls for shots fired and a database of previous shootings can be used by an algorithm to predict future gun violence). But the Brennan Center filed its public records request for that information back in 2018, and the organization is still waiting on an appeal, filed in November, for those documents. That follows an earlier public records request the nonprofit filed in 2016 about predictive policing, which took two years — and a lawsuit — to ultimately get fulfilled.
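As a rough illustration of why the inputs matter, here is a toy version of a grid-based “hot spot” score built only from the kinds of inputs the department has described: shooting incidents and 911 calls for shots fired. The weights, decay, and grid cells are invented for illustration; this is not the NYPD’s actual model, and the point of the sketch is that the output simply mirrors whatever historical records are fed in.

```python
# Toy illustration only: a grid-based "hot spot" score built from the kinds
# of inputs the NYPD has described (shooting incidents and 911 calls for
# shots fired). Weights and decay are invented; not the department's model.
from collections import defaultdict


def hotspot_scores(shooting_incidents, shots_fired_calls,
                   w_shooting=3.0, w_call=1.0):
    """Return a score per grid cell from historical point data.

    Each input is a list of (cell_id, days_ago) tuples; more recent events
    count for more via a simple decay. Note the transparency concern: the
    output only reflects whatever historical records go in, so biased or
    "dirty" data produces biased predictions.
    """
    scores = defaultdict(float)
    for cell, days_ago in shooting_incidents:
        scores[cell] += w_shooting / (1 + days_ago)
    for cell, days_ago in shots_fired_calls:
        scores[cell] += w_call / (1 + days_ago)
    return dict(scores)


# Example: two recorded shootings and one 911 call concentrate the score in
# cell "B2", which would then be flagged for extra patrols.
print(hotspot_scores([("B2", 2), ("B2", 10)], [("A1", 5)]))
```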

Of course, the agency could just give advocates, journalists, and the public more information about its practices to begin with.

“There is absolutely no law today that stops the NYPD from voluntarily putting every one of these systems on their website,” said Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project (STOP). “The problem is, we have a police department with a decades-long history of trying to do everything within its power to hide its operations from public scrutiny.”

The NYPD did not respond to my repeated requests for comment on this article.

STOP and the DC-based digital rights group Upturn are now suing to force the NYPD to turn over records related to so-called “mobile device forensic tools,” which could allow an officer to access data and meta-data from your phone, including your device’s digital history, location data, and types of encrypted communications. Cahn said his group is also preparing another legal proceeding to force the NYPD to hand over documents related to other potential biometric surveillance.

Meanwhile, the Georgetown Center for Privacy and Technology is still wrapped up in litigation stemming from a public records request, filed in 2016, related to the NYPD’s purchase of facial recognition technology. And there are certainly more examples of the NYPD being taken to court for records, while still others are waiting to hear back about their initial inquiries. For instance, Jerome Greco, an attorney who focuses on digital forensics at Legal Aid, filed a request for documents related to the NYPD’s use of drones on Thanksgiving Day last year. He was told to expect a response in April.

“I do think it’s safe to say that the NYPD stands out, based on our experience and the experience of other folks that we talk to,” Levinson-Waldman said. “It does seem like the fairly uniform experience is that it is very hard to get documents out of the NYPD through a just regular FOIL (Freedom of Information Law) process.”

There is some data to back up these claims. A search of MuckRock, an online database that allows people to publicly post and track their public records requests, shows that the New York Police Department has completed fewer than half as many public records requests as the Chicago Police Department, despite receiving more requests. It’s worth noting that this isn’t a scientific comparison; MuckRock has tracked only a fraction of overall requests to the NYPD (just over 500), while NYC’s Open Records data says the department has received more than 40,000 requests overall since 2017.

“Most agencies are not that open to sharing that information, but the NYPD is more egregious. They’ve been the ones to make legal claims that no other agency really makes,” Rashida Richardson, the policy director of the AI Now Institute, explained.

In fact, the NYPD has even tried to use a so-called “Glomar” response to public records requests about its potential surveillance of the phones and social media of protesters (earlier last year, a New York court said “not this time”). The “Glomar” response is an argument typically used by federal agencies like the Central Intelligence Agency and the Federal Bureau of Investigation to reject a request for records by neither confirming nor denying that they actually exist.

“This is a police department that we know has a history of acting in ways that have a disproportionate impact on communities of color and individuals of color in New York,” Levinson-Waldman said. “It’s really akin to a counterterrorism department, and so I think it’s especially critical to understand how the police department is functioning.”

Consider the NYPD’s “Domain Awareness System,” a surveillance platform it developed alongside Microsoft in 2012, which the department itself describes as utilizing “the largest networks of cameras, license plate readers, and radiological sensors in the world.”

The NYPD is known to use other AI- and algorithm-based surveillance tools. For instance, the department also uses an AI-based software called Patternizr that mines police case files to find crime patterns. The department introduced the tool in 2016 but only publicly revealed it last year.
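Based on public descriptions of Patternizr-style tools, the underlying idea is to score how similar a new complaint is to historical case files and surface the closest matches for a human analyst. The features and weights in the sketch below are illustrative assumptions, not the NYPD’s actual model.

```python
# Simplified sketch of pattern-mining by similarity scoring, loosely modeled
# on public descriptions of tools like Patternizr: compare a new complaint
# to historical case files and rank the closest matches for an analyst.
# Features and weights here are illustrative, not the NYPD's.
from math import hypot


def similarity(new_case, old_case):
    """Crude similarity between two complaints (higher = more alike)."""
    score = 0.0
    # Spatial proximity: nearby incidents are more likely to be related.
    distance = hypot(new_case["x"] - old_case["x"], new_case["y"] - old_case["y"])
    score += 1.0 / (1.0 + distance)
    # Shared modus-operandi keywords (e.g. "forced_door", "rear_window").
    score += len(set(new_case["mo"]) & set(old_case["mo"]))
    # Same crime category.
    score += 1.0 if new_case["category"] == old_case["category"] else 0.0
    return score


def rank_candidates(new_case, history, top_k=3):
    """Return the top_k historical cases most similar to the new complaint."""
    return sorted(history, key=lambda old: similarity(new_case, old),
                  reverse=True)[:top_k]
```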

The extent to which the NYPD fails to fulfill public records requests limits how much we really know about what the department is doing. After all, much of the information we have about law enforcement’s existing surveillance tools has been obtained through public records requests. We know how the NYPD uses facial recognition databases in large part thanks to a public records request that lawyers at the Georgetown Center for Privacy and Technology had to fight for in court.

But the legal mechanism seems increasingly insufficient and slow when transparency-wary departments like the NYPD are buying ever-more-powerful surveillance tech. That’s left some looking for better ways to get information about these tools. For instance, some transparency advocates tried to push New York City to reveal all the automated decision systems used by the government as part of a novel task force on the city government’s use of algorithms. (The city offered up a measly five examples.)

Others point to laws that would force these departments to be more upfront. Levinson-Waldman and Cahn both support the Public Oversight of Surveillance Technology Act, a proposed New York City bill that would force the department to disclose more information about the types of surveillance technology it uses.

“To some extent, it would relieve the pressure on having to rely on public records requests and lawsuits,” Levinson-Waldman told Recode.

Cahn said the bill is part of a broader effort, initiated in 2016 and organized by the American Civil Liberties Union, called Community Control Over Police Surveillance, which urges local governments to take greater control over the surveillance technologies acquired by their law enforcement agencies.

“The reason public oversight is so important is because agencies can get these decisions deeply wrong when left to their own devices,” said Cahn. “The NYPD could waste millions of dollars on tools which simply automate discrimination against communities and lead to false arrests of black and brown New Yorkers without any accountability or oversight.”

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.

Author: Rebecca Heilweil
