Vikram Dodd Police and crime correspondent 

UK use of predictive policing is racist and should be banned, says Amnesty

Exclusive: rights group says use of algorithms and data reinforces discrimination in UK policing

Predictive policing involves computer programs that use data and algorithmic models to estimate where crimes are most likely to happen. Photograph: georgeclerk/Getty Images/iStockphoto

British policing’s use of algorithms and data to predict where crime will happen is racist and picks on the poor, a report from Amnesty International says.

The human rights group says predictive policing tools, used by most police forces in the UK, are so unfair, dangerous and discriminatory that they should be banned.

Amnesty says the data driving the predictive systems and the assumptions they rely on come from established “racist” police practices such as stop and search, which disproportionately targets Black people and in which most stops find no wrongdoing. That bias, in turn, is corrupting cutting-edge predictive crime systems billed as part of the future of battling crime.

Police say predictive policing helps cut crime, allowing officers and resources to be deployed where they are most needed.

Predictive policing involves computer programs that use data and algorithmic models to estimate where crimes are most likely to happen. Once the stuff of dystopian fiction, as in the Steven Spielberg film Minority Report, it is an increasingly popular tool for law enforcement.

Amnesty says that of the 45 local forces across the UK, “32 have used geographic crime prediction, profiling or risk-prediction tools, and 11 forces have used individual prediction, profiling or risk-prediction tools”.

In its report, Automated Racism, to be released on Thursday, Amnesty says: “These systems are, in effect, a modern method of racial profiling, reinforcing racism and discrimination in policing.

“These systems are developed and operated using data from policing and the criminal legal system. That data reflects the structural and institutional racism and discrimination in policing and the criminal legal system, such as in police intelligence reports, suspect data, stop-and-search or arrest data. There is inherent bias in that data.”

The report says use of predictive policing led to a spike in stop and search in Basildon, Essex, from September 2020 to March 2021. “The force stopped and searched more people in Basildon than the rest of the entire police force area. They stopped and searched Black people in Basildon almost 3.6 times more than white people … [and] used force against Black people almost four times as much as white people.”

In London, after the Metropolitan police introduced predictive policing in Lambeth in 2020-21, the area had “the second highest volume of stop and search of all London boroughs”, the report says.

A system used by Avon and Somerset police gives a risk score to individuals. One person referred to in the report as David said he had been targeted by police and left with post-traumatic stress disorder.

He claims to have been stopped 50 times, including after putting a sticker on a lamp-post. He said: “I have therapy every week about some of the stuff that I’ve been through because of the police and how they’ve treated me over the past, say, three or four years. It’s scandalous, to be honest. They made me feel like I don’t have any rights at all.”

One resident of Grahame Park in north London, an area deemed high crime, said: “It’s labelled a crime hotspot. So when the police enter the area, they’re in the mindset of ‘we’re in a dangerous community – the people here are dangerous’. It doesn’t matter if they’re young people, they’re still ‘dangerous’ and therefore ‘we can police them violently’ and they do police them violently.”

Sacha Deshmukh, the chief executive of Amnesty International UK, said predictive policing had minimal or no effect on cutting crime. “The evidence that this technology keeps us safe just isn’t there; the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores,” he said.

“These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socioeconomic background.”

A spokesperson for the National Police Chiefs’ Council said: “Policing uses a wide range of data to help inform its response to tackling and preventing crime, maximising the use of finite resources. As the public would expect, this can include concentrating resources in areas with the most reported crime.

“Hotspot policing and visible targeted patrols are the bedrock of community policing, and effective deterrents in detecting and preventing antisocial behaviour and serious violent crime, as well as improving feelings of safety.”

They added: “It is our responsibility as leaders to ensure that we balance tackling crime with building trust and confidence in our communities whilst recognising the detrimental impact that tools such as stop and search can have, particularly on Black people.”
