Cops need to come clean on predictive policing of 250,000 people


Illustration: Sophia Checkley
Across the country, artificial intelligence systems are being rolled out in public services. Policing is no exception: Avon and Somerset Police (ASP) are using software to score one in every six people in the region on how likely they are to commit a crime.
The predictive modelling software – developed by Qlik – estimates the likelihood of a person committing burglary, stalking and harassment, or serious domestic or sexual violence.
Risk scores are generated for every known offender – around 250,000 people – in the Avon and Somerset force area. The score factors in a person’s criminal history, weighted more heavily for violent crime, in combination with a range of other data. A person’s score, which runs from 1 to 100, estimates how likely they are to re-offend, what type of crime they might commit, and when.
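ASP has not published the internals of its model, so it cannot be reproduced here. Purely as an illustration of the general shape of such a score – a weighted sum over a person’s criminal history, with violent offences weighted more heavily, clipped to a 1–100 range – the sketch below uses entirely hypothetical fields and weights.

```python
# Purely illustrative: a toy score of the general kind described above.
# ASP has not published Qlik Sense's model; every field and weight here
# is a hypothetical stand-in, not the force's actual method.
from dataclasses import dataclass

@dataclass
class OffenderRecord:
    violent_offences: int           # hypothetical criminal-history counts
    non_violent_offences: int
    months_since_last_offence: int

def toy_risk_score(record: OffenderRecord) -> int:
    """Weighted sum over criminal history, clipped to the 1-100 range."""
    raw = (
        10 * record.violent_offences                      # violent crime weighted higher
        + 3 * record.non_violent_offences
        + max(0, 24 - record.months_since_last_offence)   # recency of offending
    )
    return max(1, min(100, raw))

print(toy_risk_score(OffenderRecord(2, 5, 6)))  # prints 53
```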
Predictive policing has made its way across the pond from the US, where technology originally developed for earthquake prediction was adapted for use by law enforcement. It has been lambasted by civil liberties groups as discriminatory, but heralded as effective modern policing by policymakers and police forces alike.
Undoubtedly, technology can be transformative, and techniques such as machine learning make it possible to use data to inform decision-making in ways that were not possible before. But in an age of automation, where new technology is seen as a way to plug the gap left by a decade of austerity, we need an open debate about who benefits from predictive policing.
Part of this debate centres on how effective these systems are and how they affect different communities. We need greater openness from ASP and, in the spirit of policing by consent, the ability to publicly scrutinise the effectiveness of risk scores, especially given troubling cases of predictive policing internationally.
One area that needs particular scrutiny is the over-policing of communities of colour. We live in an area of the country where black people are four times more likely to be arrested than white people. The question we face is: do risk scores threaten to encode this existing bias?
Discrimination in the data
In a high-profile case in the US, a similar piece of software, which scored parole candidates on their likelihood of reoffending, was condemned as discriminatory. It scored black candidates significantly higher than white candidates with similar criminal records. The ability of scoring systems to discriminate even when protected characteristics are excluded comes down to what’s called proxy data – variables such as postcode, income and education, which can correlate with things like ethnicity.
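To see how proxy discrimination can work in principle, consider a toy example. The data, model and numbers below are entirely synthetic and say nothing about ASP’s system; the point is only that a model trained on skewed arrest records, with postcode as a feature, can score one group higher even though ethnicity is never an input.

```python
# Synthetic illustration of proxy discrimination: ethnicity is never a
# feature, but postcode correlates with it and the training labels carry
# historic policing bias. None of this reflects ASP's data or model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)                        # protected attribute, hidden from the model
postcode_area = (group + (rng.random(n) < 0.2)) % 2  # postcode tracks group ~80% of the time

# Arrest records are skewed: group 1 is policed more heavily, so both the
# prior-arrest feature and the "re-arrested" label inherit that skew
prior_arrest = rng.random(n) < np.where(group == 1, 0.40, 0.10)
rearrested = rng.random(n) < np.where(group == 1, 0.35, 0.12)

X = np.column_stack([postcode_area, prior_arrest]).astype(float)
model = LogisticRegression().fit(X, rearrested)
scores = model.predict_proba(X)[:, 1]

print("mean predicted risk, group 0:", round(scores[group == 0].mean(), 3))
print("mean predicted risk, group 1:", round(scores[group == 1].mean(), 3))
# Group 1 ends up with markedly higher scores, despite ethnicity being "excluded".
```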
Jonathan Dowey, Business Intelligence Manager at ASP, tells me about the force’s software, called Qlik Sense: “The data used as part of the modelling process primarily focuses on crime incidents, demographic data (excluding ethnicity) and command and control incident data.”
Dowey tells me the police have not noticed any ethnicity bias, and that they review the accuracy of risk models quarterly. However, these reviews are not publicly available, so they cannot be independently verified. Researchers at the University of the West of England (UWE) are conducting a review of the use of predictive analytics by ASP. It is yet to be confirmed whether the findings will be published.
Hannah Couchman from civil rights advocacy group Liberty challenges the idea that police data can be neutral and free from institutional bias. “Police data simply reflects existing and historic discrimination in policing, so algorithms that rely on this data will replicate this discrimination.”
Police cuts
Over at Cardiff University’s Data Justice Lab, Lina Dencik interviewed members of staff within ASP and examined why the software was introduced and how it affects the work of frontline staff.
“The austerity context is really important here,” says Dencik. In her interviews with police, it was noted that Qlik was introduced in the context of £80m worth of funding cuts to the constabulary. Dencik found that increased pressure brought about by cuts has underpinned the rollout of predictive analytics.
This is not to argue for the abandonment of predictive policing, nor to claim that artificial intelligence has no place in 21st-century law enforcement. I’m arguing for an open debate, as a city, about the role technology plays in policing.
Using data to mass profile large sections of the population – 250,000 people in Bristol and surrounding areas – as a means to plaster over gaps left by funding cuts is no replacement for community policing. It’s high time we scrutinise ASP and make sure these questions don’t go unanswered. Accountability, after all, should be a two-way street.