Between 2018 and 2021, more than one in 33 U.S. residents was potentially subject to police patrol decisions directed by crime prediction software called PredPol.
In our world, science fiction and reality are blending more and more. Films like "Minority Report" and "Total Recall," both based on the work of author Philip K. Dick, imagine futures in which foresight is central to law and order, and they are natural reference points for predictive policing.
"Minority Report" depicts a society that arrests people for crimes they have not yet committed, foreseen by 'Precogs.' "Total Recall," by contrast, explores memory and identity and how we perceive and interpret facts, a question that maps onto the practice of analyzing data to predict crime.
Today, crime prediction is a reality in law enforcement, flaws and all. The article examines Geolitica (PredPol), software sold to police departments to forecast crime, and asks whether the technology works, whether it is fair, and what its use means.
The Markup and Gizmodo analyzed what relying on software to prevent crime actually produced, looking at how Geolitica (PredPol) performed and what impact it had.
Like the technology imagined in "Minority Report" and "Total Recall," predictive policing raises worries about bias, discrimination, and loss of trust, issues that deeply affect our communities.
As we learn more about predictive policing, the warnings of Dick's imagination become real challenges that we must understand, critique, and oversee.
"Crime is very specific. A serial murderer is not going to wake up one day and start robbing people…." - Christopher Herrmann, John Jay College of Criminal Justice
What Happened
Ineffective Crime Prediction: Geolitica (PredPol), the software police departments purchase, is not effective: it correctly predicted crimes less than 1% of the time. (One way such a hit rate might be computed is sketched after this section's chart.)
Bias in Predictions: The Markup and Gizmodo analyzed millions of crime predictions and found that the software disproportionately targeted Black and Latino neighborhoods while largely avoiding Whiter, wealthier areas.
"It’s like trying to diagnose a patient without anyone fully telling you the symptoms." -Jumana Musa, National Association of Criminal Defense Lawyers
Disproportionate Targeting: The software repeatedly recommended patrols in lower-income areas and around public housing, generating thousands of predictions in those neighborhoods over the years.
Data Security Concerns: The predictive data was found on an unsecured server, raising serious questions about data security and privacy.
Chart: Black and Latino populations were higher in the most-targeted neighborhoods
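To make the "less than 1% of the time" figure concrete, here is a minimal sketch of how such a hit rate could be computed: count a prediction as a hit only if a crime of the predicted type is reported in the predicted block during the predicted patrol window. The data, column names, and matching rule below are illustrative assumptions, not The Markup's published methodology.

```python
# Minimal sketch: a prediction "hits" only if a matching crime is reported
# in the same block, of the same type, inside the predicted patrol window.
# All data, column names, and the 12-hour window are invented for illustration.
import pandas as pd

predictions = pd.DataFrame({
    "block_id":    ["A1", "A1", "B7", "C3"],
    "crime_type":  ["burglary", "burglary", "vehicle theft", "burglary"],
    "shift_start": pd.to_datetime(["2019-01-01 08:00", "2019-01-02 08:00",
                                   "2019-01-01 20:00", "2019-01-03 08:00"]),
})

reports = pd.DataFrame({
    "block_id":    ["A1", "D9"],
    "crime_type":  ["burglary", "assault"],
    "reported_at": pd.to_datetime(["2019-01-01 10:30", "2019-01-01 11:00"]),
})

SHIFT_HOURS = 12  # assumed length of a predicted patrol window

def prediction_hit(pred) -> bool:
    """True if a crime of the predicted type was reported in the predicted
    block within the predicted window."""
    window_end = pred.shift_start + pd.Timedelta(hours=SHIFT_HOURS)
    matches = reports[
        (reports.block_id == pred.block_id)
        & (reports.crime_type == pred.crime_type)
        & (reports.reported_at >= pred.shift_start)
        & (reports.reported_at < window_end)
    ]
    return not matches.empty

hit_rate = predictions.apply(prediction_hit, axis=1).mean()
print(f"hit rate: {hit_rate:.1%}")  # 1 hit out of 4 toy predictions -> 25.0%
```

In this toy data, one of four hypothetical predictions hits; the reported finding is that the real software's predictions were borne out far less often than that.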
Why It Matters
Waste of Resources: Because the software does little to prevent crime, the money and patrol hours spent on it could be put to better use.
Racial and Socio-Economic Discrimination: The software's biased predictions can deepen existing racial and socio-economic inequalities.
Trust and Legitimacy Issues: Biased predictive policing software can erode trust between law enforcement and marginalized communities.
"I think that what this shows is just how unreliable so many of the tools sold to police departments are." Dillon Reisman, founder of the ACLU of New Jersey’s Automated Injustice Project
Chart: Poorer neighborhoods had the most predictions
Reading Between the Lines
"The problem with predictive policing is the policing part." - Andrew Ferguson, American University law professor
Reliance on Flawed Data: Both articles point to a basic problem with the crime statistics that feed the software: different neighborhoods report crime at different rates, so the input data is skewed from the start. (A toy simulation after this section's chart illustrates the effect.)
Technology vs. Human Judgment: The findings show that crime prevention cannot be handed off to technology alone; it still requires careful human judgment and local knowledge.
Broader Implications for Policing Practices: The results raise larger questions about how police adopt technology, including its impact on civil rights and racial equity and whether it reflects the best ways to prevent crime.
Need for Transparency and Accountability: The predictive data was not kept secure, and little clear information was available about how the software works. Both point to a need for police departments to be more transparent and accountable.
Chart: Neighborhoods with the most predictions had the fewest White residents
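One way to see why reporting differences matter: the toy simulation below (not a model of Geolitica itself, with all names and numbers invented) gives two neighborhoods identical true crime rates but different reporting rates, then allocates "predicted patrols" in proportion to reported crime, as a purely reports-driven system would.

```python
# Toy illustration: identical true crime rates, different reporting rates.
# A system trained only on reports would still patrol the two areas unequally.
# All names and numbers are invented for the example.
import random

random.seed(0)

TRUE_CRIMES_PER_WEEK = 10                 # same in both neighborhoods by construction
REPORTING_RATE = {"Northside": 0.9,       # hypothetical: most crimes get reported
                  "Southside": 0.4}       # hypothetical: many crimes go unreported
WEEKS = 52

reported = {name: 0 for name in REPORTING_RATE}
for _ in range(WEEKS):
    for name, rate in REPORTING_RATE.items():
        reported[name] += sum(random.random() < rate
                              for _ in range(TRUE_CRIMES_PER_WEEK))

# Allocate "predicted patrols" in proportion to reported crime counts.
total = sum(reported.values())
for name, count in reported.items():
    print(f"{name}: {count} reported crimes -> {count / total:.0%} of predicted patrols")
```

The skew in the output comes entirely from the reporting gap, not from any difference in underlying crime, which is the core worry behind feeding crime reports into prediction software.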
The big takeaway: relying too heavily on technology to predict crime is not a sound strategy on its own. It is worth looking at the evidence and considering other ways to make communities safer.