Join the fight against crime-predicting tech

Almost three-quarters of UK police forces are using AI and automated technology to ‘predict’ where crimes will take place and who will commit them.

They think they can target people and neighbourhoods BEFORE crimes take place. It sounds like science fiction — but it’s happening right now in the UK. Entire neighbourhoods are being classified as high risk.

No one should be treated as a suspect just because of where they live or who they are.

We should be presumed innocent, not predicted guilty.

Sign up to the Safety Not Surveillance campaign and help us fight police use of 'crime-predicting' tech.  

We need 500 people to join the campaign.

2 of 500 have joined the campaign. Will you help us get 498 more people?

Any personal data you provide will be held in line with our privacy policy (https://www.openrightsgroup.org/privacy-policy/).

By signing up, you agree to join our email list and receive updates about this campaign.

Why is this important?

The police say AI will make us safer. The truth? It automates injustice and racism.

Predictive policing tools don’t predict crime — they predict policing.

Built on flawed police data, they target the same communities that have always been over-policed: Black and racialised people, low-income neighbourhoods, and migrants. This leads to:

• Over-policing
• Unjust stops and searches
• Harassment, handcuffing, and use of force against targeted people

We all deserve safety, not surveillance.

Sign up for Safety Not Surveillance campaign updates today, and help stop police use of crime-predicting tech before more communities are punished.