The National Institute of Justice (NIJ), the agency of the U.S. Department of Justice responsible for research and development of law enforcement techniques, recently named the winners of its “Real-Time Crime Forecasting Challenge,” a data science contest for predictive policing with over a million dollars in prize money available. More than the skills and talents of the contestants, however, the contest showcased the pitfalls and dangers of using data science, algorithms, and predictive policing as law enforcement tools. Instead of offering insight into the power of predictive policing, the Real-Time Crime Forecasting Challenge was a disturbing example of how algorithmic law enforcement solutions are fraught with crippling biases and state-sponsored discrimination before they ever get off the ground.
The Real-Time Crime Forecasting Challenge
The NIJ's challenge sought submissions from students and businesses, and had three aims:
- Harness data science advances in other fields to crime forecasting,
- Encourage scientists from all fields to consider the challenges of crime and justice, and
- Conduct the most comprehensive comparative analysis of crime forecasting software and algorithms to date.
Contestants were presented with a problem: Submit a prediction of crimes that would happen in certain areas of Portland, Oregon, for the period of March through May 2017, based on five years of past crimes, as reported by the city's police department.
Winners would bring home a share of the prize money. The loser, however, was already determined: It would be all of American society.
Predictive Policing Uses and Solidifies Biases
The problem with the NIJ's Real-Time Crime Forecasting Challenge was clear even before the submissions started coming in: Like all predictive policing strategies, the contest used past data, exclusively, to predict future crimes. The past data in this contest was especially problematic, as it consisted solely of calls for service: 911 calls reporting not a confirmed crime, but an alleged one.
Therefore, the best a contestant could do was create an algorithm that sent police to places where crimes might have happened in the past. Of course, once police were there, they were far more likely to find a crime than in a place they were not sent. As the predictive policing that this contest espoused continued to be used, more and more crimes would be found in the places targeted by the algorithms, sending more and more police on rounds to those locations and creating a snowball effect that would lead, eventually, to bubbles of a police state within the city of Portland, Oregon. Meanwhile, areas not targeted by the original algorithm would be free of a police presence.
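The snowball effect described above can be sketched with a toy simulation (all numbers here are hypothetical, not drawn from the Portland data): two neighborhoods with identical true crime rates, where slightly biased historical call data feeds a greedy "hotspot" patrol policy, and only patrolled areas generate new records.

```python
import random

random.seed(0)

TRUE_RATE = 0.1  # the same true crime rate in both neighborhoods
PATROLS_PER_DAY = 10
# Hypothetical historical call-for-service counts, slightly biased toward A
recorded = {"A": 55, "B": 45}

for day in range(365):
    # Greedy "hotspot" policy: send every patrol to the area with the
    # most recorded crime so far
    hotspot = max(recorded, key=recorded.get)
    for _ in range(PATROLS_PER_DAY):
        # Police can only record crimes where they actually patrol
        if random.random() < TRUE_RATE:
            recorded[hotspot] += 1

print(recorded)
```

Because A starts with more recorded incidents, it receives every patrol, so only A accumulates new records while B's count never changes. After a year the data "proves" A is the high-crime area, even though both neighborhoods were identical by construction.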
Maine Criminal Defense Attorney William T. Bly
Predictive policing might sound like a great idea until you actually think about it. While it seems like it could be used to find or even deter crime as it happens, in the long run it is a terrible law enforcement technique with devastating pitfalls that are readily apparent.