U.K.’s “pre-crime” algorithm sparks ethical outcry amid privacy fears
04/19/2025 // Belle Carter

  • The U.K. Ministry of Justice is developing an algorithm to predict which individuals – some with no criminal records – might commit murder, drawing on data from more than 500,000 people, including mental health histories, age at first police contact and domestic abuse experiences. The project, originally called the "Homicide Prediction Project" before being rebranded, has not been deployed but has raised alarm over its parallels to dystopian "pre-crime" systems.
  • The initiative builds on tools like the Offender Assessment System (OASys) but adds new sensitive data categories such as self-harm records, disabilities and victim trauma experiences, prompting fears of bias and misuse of personal information.
  • Critics, including Statewatch, argue the algorithm risks amplifying racial disparities, given the U.K.’s history of biased policing and its reliance on data from racially skewed systems, causing disproportionate harm to minorities and marginalized groups.
  • The MoJ insists the project is experimental, aiming to improve public safety through better risk assessments for probationers. However, activists counter that framing it as "research" does not justify deploying biased tools that undermine due process and dignity, likening it to the sci-fi Minority Report premise of punishing potential future crimes.
  • As some U.S. jurisdictions have banned racially biased predictive policing tools, the U.K.'s project highlights tensions between crime prevention and civil liberties. The outcome may set a global precedent on whether such algorithms enforce accountability or deepen systemic discrimination.

The British government's push to implement an algorithm predicting who might commit murder mirrors dystopian sci-fi tropes while clashing with civil liberties concerns. As predictive policing gains traction globally, this project highlights urgent debates over technological overreach, racial bias and the ethics of policing communities that already face systemic discrimination.

The U.K. Ministry of Justice (MoJ) is developing an algorithm to forecast which individuals convicted of crimes may escalate to homicide, according to documents uncovered by civil liberties group Statewatch. Dubbed the Homicide Prediction Project – now rebranded as "Sharing Data to Improve Risk Assessment" – the initiative aggregates data on over 500,000 individuals, some without criminal records, to identify "future criminals." The research began in 2023 and was completed in 2024; the resulting tool has not been deployed, but it has already sparked warnings from activists and legal experts.

The system builds on tools like the Offender Assessment System (OASys), used since 2001 to assess recidivism risks for probation decisions. However, the new algorithm expands that scope to fresh data categories, including mental health histories, age at first police contact and domestic abuse experiences.

Civil liberties concerns: Biased systems, structural discrimination

Statewatch, which exposed the project via Freedom of Information Act (FOIA) requests, condemns the initiative as inherently flawed and biased. Sofia Lyall, a Statewatch researcher, warns that algorithms trained on data from an institutionally racist police system will disproportionately harm minorities and marginalized communities.

"Time and again, research shows these systems are inherently flawed," Lyall said, citing studies in the U.S. and U.K. showing predictive policing often exacerbates racial disparities. "Coding bias into automated profiling of potential criminals is deeply wrong, particularly when it uses sensitive data on mental illness and addiction."

The U.K.'s legacy of biased policing looms large: Black Britons are seven times more likely to be stopped and searched than white individuals, per 2023 Home Office data. Critics argue the algorithm risks amplifying these inequities, funneling more surveillance and penalties toward communities already overrepresented in the criminal justice system.

At the center of the debate is the inclusion of personal data from people without criminal records. Statewatch claims victims of crime – including domestic abuse survivors – are being analyzed. MoJ spokespersons deny this, stating only "convicted offenders" are included.

But FOIA documents reveal that shared data includes "special categories" such as mental health markers, self-harm records and disabilities. Age at first police contact as a victim of crime is also listed as a metric, prompting fears of weaponizing trauma.

Government defense: Public safety first

The MoJ insists the project is solely experimental and aims to improve risk assessments for probationers. "This research is about enhancing public safety," a spokesperson said, emphasizing collaboration with police forces like Greater Manchester Police.

The tool's supporters, including some criminologists, argue that predictive analytics could stop violent crimes by identifying high-risk cases early. Yet, Lyall counters, "Public safety is a worthy goal, but not at the cost of due process and dignity."

The U.K.'s initiative harks back to Philip K. Dick's "The Minority Report," in which psychic "pre-cogs" flag future criminals who are punished before they act.

The U.S. has seen similar pushback against predictive policing. In 2020, Santa Cruz, California, became one of the first U.S. cities to ban predictive policing tools over racial bias concerns. Yet the U.K.'s project moves forward, backed by data-sharing agreements stretching back to 2015. (Related: Investigation reveals Instagram's algorithm regularly suggests explicit content to users as young as 13 years old.)

Watch the video below about "predictive policing."

This video is from the J. D. Gallé | neoremonstrance channel on Brighteon.com.

More related stories:

Russian researchers unveil AI model that adapts to new tasks without human input.

Researchers develop algorithm that will allow robots to work together with humans … or hunt us like prey.

Bill Gates wants AI algorithms to censor vaccine "misinformation" in real time.

Sources include:

TheNationalPulse.com

TheGuardian.com

Brighteon.com


