CRIME

This Teen’s Story Is Your Worst ‘Predictive Policing’ Nightmare

Crime-prediction algorithms remain unproven and problematic — but that hasn't stopped police departments across the country from using them

Illustration: Tara Jacoby
Apr 12, 2017 at 10:33 AM ET

Connor Deleire was sitting alone in a friend’s car in Manchester, New Hampshire, one late October afternoon in 2015 when the police zeroed in on him. The 18-year-old waited near the intersection of Union and Merrimack Streets as the friend left to pick up his niece a block away. What Deleire didn’t know was that the vehicle was parked in a “predictive hot spot,” an area determined by a computer algorithm used by the Manchester Police Department to be the likely scene of a crime.

Just being in the area was enough to arouse suspicion, officials said at the time. Then, according to police, Deleire failed to give a “legitimate” reason for being there and grew increasingly agitated with officers. The result of that confrontation: Deleire had his head bashed into a cop cruiser, his body zapped with a stun gun, and his face blasted with pepper spray. The beatdown landed him in the hospital with a concussion, his father, John Deleire, told Vocativ in a recent interview.

In the end, the only crime his son was charged with was resisting arrest.

“If you’re in a predictive hot spot, you’re going to be accused of committing a crime even if you’re in broad daylight and have a reason to be there,” said John Deleire, an attorney who said his family had a long history of law enforcement service in New England. “I think it was the predictive policing part of this that caused this whole thing to happen.”

According to the National Institute of Justice, predictive policing refers to “any policing strategy or tactic that develops and uses information and advanced analysis to inform forward-thinking crime prevention.” These policing strategies usually involve mathematical or algorithmic techniques designed to predict crimes as well as who will commit or be victimized by them.

The Connor Deleire incident underscores the swelling fear among civil liberties groups, who see the rise of predictive policing as further enabling the worst tactics and practices of law enforcement. But even as the crime-fighting technology stirs controversy, police forces are flocking to it. Once the stuff of sci-fi books and Hollywood blockbusters, predictive analytics continues to entice law enforcement agencies, big and small, throughout the United States.

“The ‘Minority Report’ of 2002 is the reality of today,” declared former New York City Police Commissioner William Bratton in 2015, referring to Steven Spielberg’s film in which cops round up killers before they can carry out their heinous crimes.

For decades cops have turned to technology to target would-be criminals and zero in on where they strike. Computers began crunching stats and churning out crime maps while Ronald Reagan was still in the White House. Bratton himself revolutionized American policing in the early 1990s through CompStat, a data-driven tool credited with driving down crime.

The latest breakthrough came around 2010, when police departments — with the help of U.S. Justice Department funding — began experimenting with the same type of software that Google uses to rank websites in its search engine. Using historical crime data and other variables, they created algorithms geared to forecast where and when crime will occur, and who its perpetrators and victims might be.
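To make that concrete, here is a minimal sketch, in Python, of the kind of grid-based hot-spot forecasting these systems perform. The commercial tools are proprietary, so everything below (the cell size, the 30-day window, the sample coordinates) is an invented assumption meant only to show the general idea: bucket past incidents into map cells, then flag the cells with the most recent activity.

```python
# Illustrative sketch only: commercial predictive-policing tools are
# proprietary, and this toy model is not any department's actual
# algorithm. It shows the general idea behind grid-based hot-spot
# forecasting: bucket historical incidents into map cells, then rank
# the cells by how many incidents fell in each over a recent window.
from collections import Counter
from datetime import datetime, timedelta

CELL = 0.005  # assumed grid cell size in degrees, roughly 500 meters


def cell_of(lat, lon):
    """Snap a coordinate to the grid cell that contains it."""
    return (round(lat / CELL), round(lon / CELL))


def predict_hot_spots(incidents, now, window_days=30, top_k=5):
    """Rank grid cells by incident counts inside a trailing time window.

    incidents: iterable of (timestamp, lat, lon) tuples of past crimes.
    Returns up to top_k (cell, count) pairs: the "predicted hot spots".
    """
    cutoff = now - timedelta(days=window_days)
    counts = Counter(
        cell_of(lat, lon) for ts, lat, lon in incidents if ts >= cutoff
    )
    return counts.most_common(top_k)


# Hypothetical history: three break-ins clustered near one intersection
# outrank a single incident elsewhere, so that block gets flagged.
history = [
    (datetime(2015, 10, 1), 42.9950, -71.4631),
    (datetime(2015, 10, 8), 42.9950, -71.4632),
    (datetime(2015, 10, 15), 42.9950, -71.4633),
    (datetime(2015, 9, 25), 43.0100, -71.4500),
]
print(predict_hot_spots(history, now=datetime(2015, 10, 20)))
```

What the sketch makes plain is that the “prediction” is essentially a ranking of where reported crime already clustered, which is exactly why critics argue such systems recycle whatever biases are baked into the historical data they are fed.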

The prospect of stopping crime before it happens is all too alluring. At least 20 of the nation’s 50 largest police forces — including Los Angeles, New York, and Philadelphia — now use predictive analytics, one study found last year. Smaller departments, from Fort Myers, Florida, to Santa Cruz, California, also rely on these crime-fighting algorithms to deploy officers and resources.

“The seduction of technology, of doing more with less and with fewer resources, is just too good to pass up,” said Andrew Ferguson, a law professor at the University of the District of Columbia and author of the forthcoming book, “The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement.”

With few exceptions, departments that have integrated predictive analytics into their police work claim they’ve seen striking results. Manchester police credited the city’s 25 percent drop in property crimes in 2016 to the very software that helped turn Connor Deleire into a target the year before. Nick Willard, Manchester’s police chief, has been so gung-ho about predictive policing’s potential that he has appeared in at least two promotional videos for the company that created his department’s technology. He did not reply to Vocativ’s request for comment.

Cops in Chicago, meanwhile, believe that predictive policing could be used to combat the city’s sweeping homicide epidemic. Four years ago the Chicago Police Department designed an algorithm using variables such as arrest records and gang affiliations to forecast who is most likely to be involved in gun violence. The department then began to place these potential perpetrators and victims on a secret “heat list” used by its personnel. Last May, Chicago police said that three out of four shooting victims in 2016 had been on the list, as were four out of five people arrested for shootings.
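Chicago has never published the internals of its model, so the following Python sketch is purely illustrative: the feature names, weights, and threshold are assumptions, not the department’s actual variables. It shows the basic shape of person-based risk scoring, a weighted combination of factors squashed into a probability-like score, with everyone above a cutoff landing on the list.

```python
# Illustrative sketch only: Chicago has not published its "heat list"
# model, so the feature names, weights, and threshold below are invented
# assumptions. The point is the shape of the technique: combine factors
# like arrest history into a single score and flag everyone above a cutoff.
import math

# Assumed weights; a real system would fit these to historical outcome data.
WEIGHTS = {
    "prior_gun_arrests": 1.2,
    "gang_affiliation": 0.9,
    "prior_shooting_victim": 1.5,
    "recent_arrests": 0.4,
}
BIAS = -4.0  # assumed intercept that keeps the baseline score low


def risk_score(features):
    """Map a dict of numeric features to a logistic score between 0 and 1."""
    z = BIAS + sum(w * features.get(name, 0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))


def heat_list(people, threshold=0.5):
    """Return the identifiers of everyone whose score crosses the threshold."""
    return [who for who, feats in people.items()
            if risk_score(feats) >= threshold]


# Hypothetical subjects: one with gun arrests, gang ties, and a prior
# shooting lands on the list; one with a single recent arrest does not.
people = {
    "subject_a": {"prior_gun_arrests": 2, "gang_affiliation": 1,
                  "prior_shooting_victim": 1},
    "subject_b": {"recent_arrests": 1},
}
print(heat_list(people))  # ['subject_a']
```

In a real deployment the weights would be fit to historical arrest and victimization records, which is precisely where researchers say past enforcement patterns can harden into future scores.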

Yet the actual efficacy of these crime-fighting algorithms remains largely unproven. The small body of independent research on predictive policing, in fact, appears to challenge some of the claims made by law enforcement. A RAND Corporation study from 2014 tested a predictive policing model aimed at reducing property crime in Shreveport, Louisiana, and found no evidence that it did so. In a different report released last year, RAND found that people on a 2013 version of Chicago’s heat list were “not more or less likely to become a victim of a homicide or shooting than” members of a control group.

“It’s definitely not a magic bullet,” said David Robinson, whose technology-focused think tank Upturn published a wide-ranging survey on predictive policing in the U.S. last year. “The actual independent evidence suggests these systems might not make people any safer.”

Just as troubling for an array of civil liberties and civil rights advocates is the belief that predictive analytics — which relies heavily on crime data and other variables traditionally used by law enforcement — may simply reinforce police bias against poor and minority communities while undermining the constitutional rights of individuals. The ACLU and 16 other groups recently outlined these concerns in a searing rebuke of predictive policing.

“The data driving predictive enforcement activities — such as the location and timing of previously reported crimes, or patterns of community- and officer-initiated 911 calls — is profoundly limited and biased,” wrote the coalition, which also included the NAACP, The Center for Democracy and Technology, and the Brennan Center for Justice. “Systems that are engineered to support the status quo have no place in American policing.”

The use of predictive analytics by law enforcement, however, is not expected to slow any time soon. Police in Mesa, Arizona, just started a three-year, $170,000 pilot program using predictive policing software. Cops from East Texas to Florida’s Panhandle are now gearing up to implement crime-fighting algorithms of their own.

Just what will happen in these communities remains to be seen. For Connor Deleire’s father, his son’s story remains a painful reminder of what can go wrong. “[Connor] grew up in a police family and what happened to him pretty much shattered all of his ideals,” John Deleire said. “It left him with a bad taste. It left all of us with a bad taste.” 