Peoria Defender

Predictive Policing: A Futuristic Idea with Present-Day Flaws

Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime prediction software called PredPol.

Image: A futuristic cityscape inspired by "Minority Report," filled with holographic displays, neon lights, and high-tech surveillance.

In our world, science fiction and reality are blending more and more. Films like "Minority Report" and "Total Recall," based on the works of author Philip K. Dick, imagine a future where foresight is crucial to law and order, and they come to mind whenever predictive policing is discussed.

In "Minority Report," society arrests people for crimes they have not yet committed, relying on psychic 'Precogs' to foresee them. "Total Recall," by contrast, explores memory and identity, and how we perceive and interpret facts, an idea that echoes in the way data is analyzed to predict crime.

In the present day, crime prediction is a real, if flawed, part of law enforcement. This article looks at Geolitica (formerly PredPol), software sold to police departments to forecast where crime will occur, and asks whether the technology works, whether it is fair, and what it means for the communities it targets.

The Markup and Gizmodo analyzed what happens when police rely on software to prevent crime, examining how Geolitica (PredPol) worked in practice and what impact it had.

The kind of technology imagined in "Minority Report" and "Total Recall" raises real worries about bias, discrimination, and the loss of public trust, issues that deeply affect our communities.

As we learn more about predictive policing, the warnings from Dick's imagination become real challenges that we must understand, critique, and oversee.

"Crime is very specific. A serial murderer is not going to wake up one day and start robbing people…." - Christopher Herrmann, John Jay College of Criminal Justice

Image: An abstract digital crime-prediction map with binary-code overlays, cooler tones over affluent areas and warmer tones over less affluent ones, suggesting bias in the software's algorithms.

What Happened

  • Ineffective Crime Prediction: Geolitica (PredPol), software that police departments pay for, is not effective: it correctly predicted crimes less than 1 percent of the time. (A sketch of how such a hit rate can be computed follows this list.)

  • Bias in Predictions: The Markup and Gizmodo analyzed millions of crime predictions and found that the software disproportionately targeted Black and Latino neighborhoods while largely steering clear of Whiter, wealthier areas.

"It’s like trying to diagnose a patient without anyone fully telling you the symptoms." -Jumana Musa, National Association of Criminal Defense Lawyers
  • Disproportionate Targeting: The software often recommended patrols in lower-income areas and around public housing. These recommendations resulted in thousands of predictions over the years in those neighborhoods.

  • Data Security Concerns: The predictive data was discovered on an unsecured server, raising concerns about data security and privacy.
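
How would a hit rate like the one above be measured? Here is a minimal sketch in Python, assuming simplified inputs: the file names and columns (date, cell_id, crime_type) are hypothetical stand-ins, not The Markup's actual data schema or methodology. A prediction counts as a "hit" if a crime of the predicted type was reported in the same grid cell on the same day.

    # Minimal sketch: estimate how often a prediction coincided with a
    # reported crime of the same type, in the same cell, on the same day.
    # File names and column names are hypothetical.
    import pandas as pd

    predictions = pd.read_csv("predictions.csv")   # date, cell_id, crime_type
    crimes = pd.read_csv("reported_crimes.csv")    # date, cell_id, crime_type

    # Left-join each prediction against de-duplicated crime reports;
    # the merge indicator marks predictions that found a match.
    matched = predictions.merge(
        crimes[["date", "cell_id", "crime_type"]].drop_duplicates(),
        on=["date", "cell_id", "crime_type"],
        how="left",
        indicator=True,
    )

    hit_rate = (matched["_merge"] == "both").mean()
    print(f"{len(predictions):,} predictions, hit rate: {hit_rate:.2%}")

Under a matching rule like this, a rate below 1 percent means more than 99 of every 100 patrol recommendations had no corresponding reported crime.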

Chart: Black and Latino populations were higher in the most-targeted neighborhoods

Increase or decrease of populations compared to the overall jurisdiction, averaged across all 38 jurisdictions. Sources: The Markup, PredPol, U.S. Census Bureau
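
The comparison behind this chart can be made concrete with a short sketch. The table layout and column names below are hypothetical; the idea is simply to compare the demographic makeup of the most-targeted areas against the jurisdiction-wide average, in the spirit of The Markup's census-based analysis.

    # Minimal sketch: compare demographics of the most-targeted block
    # groups with the jurisdiction overall. All column names are hypothetical.
    import pandas as pd

    areas = pd.read_csv("block_groups.csv")
    # columns: cell_id, population, pct_black, pct_latino, pct_white, n_predictions

    def weighted_pct(df, col):
        """Population-weighted average of a percentage column."""
        return (df[col] * df["population"]).sum() / df["population"].sum()

    # Call the top 10% of areas by prediction count "most targeted."
    cutoff = areas["n_predictions"].quantile(0.9)
    most_targeted = areas[areas["n_predictions"] >= cutoff]

    for col in ["pct_black", "pct_latino", "pct_white"]:
        print(f"{col}: jurisdiction {weighted_pct(areas, col):.1f}% "
              f"vs. most targeted {weighted_pct(most_targeted, col):.1f}%")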

Why It Matters

Image: A diverse group of community members reacts, with mixed confusion and concern, to a large map of predictive policing data.

  • Waste of Resources: Because the software does little to stop crime, the resources spent on it could be put to better use.

  • Racial and Socio-Economic Discrimination: The software's biased predictions can deepen racial and socio-economic inequalities.

  • Trust and Legitimacy Issues: Biased predictive policing software can erode trust between law enforcement and marginalized communities.

"I think that what this shows is just how unreliable so many of the tools sold to police departments are." Dillon Reisman, founder of the ACLU of New Jersey’s Automated Injustice Project

Chart: Poorer neighborhoods had the most predictions

As the percentage of households making less than $45,000 a year went up, so did the number of predictions. Sources: The Markup, PredPol, U.S. Census Bureau

Reading Between the Lines

"The problem with predictive policing is the policing part." - Andrew Ferguson, American University law professor

Image: A balance scale weighing an affluent cityscape against a denser, lower-income one, symbolizing the disparity in policing practices.

  • Reliance on Flawed Data: Both articles point to a deeper problem with the crime statistics that feed these systems. Reporting rates differ from place to place, so the numbers themselves can be skewed, and patrols sent by the software generate still more records in the same places (see the toy simulation after this list).

  • Technology vs. Human Judgment: These failures show that technology alone cannot stop crime; careful human judgment and local knowledge are still essential.

  • Broader Implications for Policing Practices: The findings raise larger questions about how police adopt technology, touching on civil rights, racial equity, and what actually keeps communities safe.

  • Need for Transparency and Accountability: Crime-prediction data sat on an unsecured server, and little clear information was available about how the software actually works. Police departments that buy such tools need to be far more transparent and accountable.
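
Why does flawed data matter so much? Critics describe a feedback loop: patrols go where past records cluster, patrols generate more records, and those records drive the next round of predictions. The toy simulation below illustrates that dynamic with made-up numbers; it is not Geolitica's algorithm. Two neighborhoods have the same true crime rate, but "A" starts with slightly more recorded incidents.

    # Toy simulation of a predictive-policing feedback loop (illustrative only).
    # Both neighborhoods have the SAME true crime rate; "A" merely starts
    # with two extra incidents in the historical record.
    import random

    random.seed(42)

    TRUE_RATE = 10                       # actual crimes per period, same in both areas
    DISCOVERY = {True: 0.9, False: 0.3}  # share of crimes recorded with vs. without patrols

    recorded = {"A": 12, "B": 10}        # small initial gap in the record

    for period in range(20):
        # Patrols go wherever the record says crime is highest...
        focus = max(recorded, key=recorded.get)
        for hood in recorded:
            # ...and patrols record more of the crime that happens around them.
            seen = sum(random.random() < DISCOVERY[hood == focus]
                       for _ in range(TRUE_RATE))
            recorded[hood] += seen

    print(recorded)

In runs like this, "A"'s two-incident head start compounds into a large recorded gap between two identical neighborhoods: the data ends up reflecting where police looked, not just where crime occurred.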

Chart: Neighborhoods with the most predictions had the fewest White residents

Proportion of neighborhoods' race and ethnicity, averaged across 38 jurisdictions. Sources: The Markup, PredPol, U.S. Census Bureau

So, the big takeaway is that leaning too heavily on technology to predict crime may do more harm than good. It is important to look at the evidence and to consider other ways of making communities safer.

Predictive Policing Study Sources:

The Markup and Gizmodo, "Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them" (December 2021)
The Markup, "Predictive Policing Software Terrible at Predicting Crimes" (October 2023)

