Why is this important?
The police say AI will make us safer. The truth? It automates injustice and racism.
Predictive policing tools don’t predict crime — they predict policing.
Built on flawed police data, they target the same communities that have always been over-policed: Black and racialised people, low-income neighbourhoods, and migrants. This leads to:
• Over-policing
• Unjust stops and searches
• Harassment, handcuffing, and use of force against targeted people
We all deserve safety, not surveillance.
Sign up for Safety Not Surveillance campaign updates today — and help stop police use of crime-predicting tech before more communities are punished.
