One City Rejected a Policing Algorithm. Could It Be Used For Other Purposes?

Pittsburgh announced that it would stop using a "hot spot" algorithm to deploy police to places suspected of being future crime sites.


In Pittsburgh, an algorithm that deployed law enforcement officers to predicted crime “hot spots” might be repurposed to send social services to areas in need instead.

A controversial policing tactic in Pittsburgh has been discontinued following concerns that it might help perpetuate systemic inequalities by increasing police presence in neighborhoods that are largely Black and Latino.

The “hot spot” prediction program was an algorithm-informed system that alerted law enforcement to certain areas identified as places where crimes are likely to be committed, prompting proactive law enforcement deployments. In a statement confirming that police would no longer use the tool, Pittsburgh Mayor Bill Peduto said that he shared concerns “about the potential for predictive policing programs to exacerbate implicit bias and racial inequities in our communities.”

The project had been on pause since December 2019, and Peduto confirmed recently that there are no plans to restart it. Instead, the mayor has suggested that the city might rethink how it responds to places identified as “hot spots,” perhaps using the information to guide how it delivers services instead.

Pittsburgh is far from the only city to use “hot spot” algorithms to proactively deploy law enforcement to areas with potential criminal activity. Often called “predictive policing,” the programs sometimes use a mixture of gunshot detection technology, data about the recent locations and times of day of property crimes, and even information from the Facebook profiles of people with convictions. From Chicago to Los Angeles, technologies that give law enforcement potential insight into future crimes are being tested.
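As a rough illustration only (the internals of the Pittsburgh tool and similar commercial systems are not public), a minimal hot-spot scorer might simply bin recent incident reports into geographic grid cells and rank the busiest ones. Everything below, including the grid size and the sample coordinates, is an assumption for the sketch:

```python
from collections import Counter

# Illustrative toy "hot spot" ranker -- NOT the actual Pittsburgh system.
# Each incident is a (latitude, longitude) pair; cells are ~0.01-degree squares.
def rank_hot_spots(incidents, cell_size=0.01, top_n=3):
    # Bin each incident into a grid cell by rounding its coordinates.
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in incidents
    )
    # The cells with the most recent incidents are flagged as "hot spots."
    return [cell for cell, _ in counts.most_common(top_n)]

# Two incidents fall in the same cell; one falls elsewhere.
incidents = [(40.4412, -79.9961), (40.4418, -79.9957), (40.4470, -79.9320)]
print(rank_hot_spots(incidents, top_n=1))  # → [(4044, -8000)]
```

Even this toy version shows the critics' core objection: the output is only as representative as the incident data fed in, so neighborhoods that are already policed more heavily generate more reports and get flagged again.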

But where they’ve been deployed, they’ve also raised serious concerns from civil rights groups, privacy advocates, and Black and Latino advocates, who say their communities are disproportionately represented in crime data. Many fear that the technologies are in fact akin to a crystal ball—murky and sometimes wrong—and that without any oversight or community input, the algorithms could end up perpetuating racial biases in policing under the guise of data-driven policy making or “smart city” innovations.

Metro21: Smart Cities Institute at Carnegie Mellon University, which developed the Pittsburgh program, noted in a June statement that its project had concluded in December 2019 and it was no longer sharing information with the police. The institute emphasized that the tool targeted locations, not people, saying an evaluation found a 34% drop in serious violent crime in areas identified as “hot spots,” while only four arrests were made during patrols sent out because of the tool.

In Pittsburgh, advocates raised concerns about both the oversight of the program and the lack of community engagement prior to the tool’s deployment. The Pittsburgh Task Force on Public Algorithms, which is independent from the city and run out of the Institute for Cyber Law, Policy, & Security at the University of Pittsburgh, is now looking for ways the city could further engage residents around the use of algorithms. Task force members are looking beyond what the city is using in policing to also evaluate algorithms like those that determine bail conditions for pretrial release and others that use data to trigger child welfare interventions in some scenarios.

“We convened the task force with an eye towards scrutinizing algorithmic systems partly because our county has been a leader in doing things with algorithms,” said Christopher Deluzio, a task force member and the policy director at the Institute for Cyber Law, Policy, & Security. “A lot of these systems are deployed without public input or oversight—but with these types of systems, we really need to make sure the public knows what’s going on, is a partner in developing it, and has the means to scrutinize it.”

The group plans to issue a report next year, providing the city government with suggested frameworks for oversight, ways to engage the community, and ways to fix any situations where an algorithm may have perpetuated systemic inequalities. There are other cities Pittsburgh can look to for models in all these areas, Deluzio said. Seattle, for instance, requires the city council to approve all new uses of surveillance technology, a process that provides members of the public ample time to voice their thoughts and ask questions about how a new technology will be used.

Providing the public with a clear picture of how algorithms are overseen may prove to be a bit trickier. Many algorithms in use in the criminal justice system are black boxes even to the policymakers who implement them, because they are bought from third-party vendors that are allowed to hide facets of how they work to protect intellectual property. Predictive policing algorithms and other machine learning tools that aren’t transparent about their source code have been challenged in court in recent years, and some city councils have tried to ban them.

Not exposing that data to the public is “inconsistent with meaningful oversight,” said Deluzio. “I think it ought to be more difficult for a police department to just procure something off the shelf and present it to the public as a black box,” he said. “If you can’t open the system to auditors … that does not engender public trust.”

Efforts to dig into algorithms can get contentious. In 2017, the New York City Council approved a bill that created a task force to study the use of algorithms in the city, which released a report on their use two years later. But critics derided the process, saying dissenting voices were sidelined.

At a task force community meeting in March, several Pittsburgh residents said that they would like to learn more about how the local government is using algorithms to see if they could possibly be repurposed. The “hot spot” prediction program, some suggested, could be refashioned to deploy resources to address root causes of crime like housing instability, poverty, and joblessness, instead of branding certain neighborhoods and the residents in them as potential sources of criminal activity. 

The mayor seems open to such an idea, saying “hot spots” could be managed by the newly created Office of Community Health and Safety, an agency that “will allow public safety to step back and determine what kind of support an individual or family needs.”

“‘Hot spots’ may benefit from the aid of a social worker, service provider or outreach team, not traditional policing,” Peduto said in a letter to the task force on June 16.

The Mayor’s Office did not respond to a request for comment as to whether progress has been made in converting the hot spot program to one focused on providing social services. But elsewhere in Allegheny County, home to Pittsburgh, a county department is working on new algorithms that officials say will allow them to better allocate resources to communities most in need.

Erin Dalton, the deputy director for the Office of Analytics, Technology and Planning at the Allegheny County Department of Health and Human Services, said that her agency is debuting a new algorithmic tool next month to help determine who among the county’s homeless population should be prioritized for rapid rehousing or permanent supportive housing. Those at a high risk of four or more emergency room visits in the next year, an in-patient mental health stay, or a jail booking will be top of the list for services.
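The county has not published its scoring method, so the following is only a hedged sketch of a prioritization step along the lines Dalton describes. The field names (`predicted_er_visits`, `inpatient_mental_health_risk`, `jail_booking_risk`) and the sample records are assumptions for illustration:

```python
# Illustrative sketch only -- the county's actual model and thresholds
# are not public. Assumes each person record carries predicted-risk
# fields like those Dalton describes.
def prioritize_for_housing(people):
    """Return people flagged high-risk first, preserving order otherwise."""
    def high_risk(p):
        return (
            p.get("predicted_er_visits", 0) >= 4
            or p.get("inpatient_mental_health_risk", False)
            or p.get("jail_booking_risk", False)
        )
    # Python's sort is stable: high-risk records move to the front,
    # and ties keep their original order.
    return sorted(people, key=lambda p: not high_risk(p))

clients = [
    {"name": "A", "predicted_er_visits": 1},
    {"name": "B", "predicted_er_visits": 5},
    {"name": "C", "jail_booking_risk": True},
]
print([p["name"] for p in prioritize_for_housing(clients)])  # → ['B', 'C', 'A']
```

The key design difference from the policing use case is in what follows the score: here a high-risk flag routes someone toward housing and supportive services rather than toward enforcement.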

“We’re trying to reduce the harms of being left unhoused,” Dalton said. “Our job is to use scarce resources for the most vulnerable … [with this] we’ll be able to better prioritize.” 

Dalton said she hopes the new process will be “faster, better, and less traumatic” than the current process. Though the department has faced heavy criticism for its use of algorithms in the past (particularly one used to help decide whether or not to start a child welfare investigation), Dalton said that this algorithm is different because it will “help us decide who gets a set of supportive services.”

Using algorithms to deploy support instead of trigger investigations or law enforcement involvement could give communities more faith in their use, Deluzio said. “If there are tools that can tell us where things are happening that require interventions, why not send services?” he said. “That’s a shrewd way to rethink these tools as they expand. How can we use them to help people?”

Emma Coleman is the assistant editor for Route Fifty.
