
Cops need to come clean on predictive policing of 250,000 people

Avon & Somerset Police are using algorithms to predict how likely someone is to commit a crime. A logical next step for policing or an invitation for discrimination?

Opinion

Illustration: Sophia Checkley

Across the country, artificial intelligence systems are being rolled out in public services. Policing is no exception: Avon and Somerset Police (ASP) are using software to score one in every six people in the region on how likely they are to commit a crime.

The predictive modelling software – developed by the software company Qlik – analyses the likelihood of a person committing a burglary, stalking and harassment, or serious domestic or sexual violence.

Risk scores are generated for every known offender – around 250,000 people – in the Avon and Somerset force area. The score factors in a person’s criminal history, weighted more heavily for violent crime, alongside a range of other data. The resulting score, on a scale of 1–100, is an estimate of how likely someone is to reoffend, what type of crime they might commit, and when.
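To make that concrete, here is a minimal sketch of what a weighted risk score of this kind could look like. Everything in it – the inputs, the weights, the scaling – is hypothetical, since ASP’s actual Qlik model has not been published; the point is only the shape of the calculation: a handful of weighted inputs mapped onto a 1–100 scale.

```python
# Illustrative only: a toy version of a weighted risk score.
# The inputs, weights and scaling are hypothetical, not ASP's model.

def risk_score(prior_offences: int, violent_offences: int,
               other_signals: float) -> float:
    """Combine weighted history into a score on a 1-100 scale."""
    # Violent offences are weighted more heavily than other priors
    raw = 1.0 * prior_offences + 3.0 * violent_offences + other_signals
    # Clamp to the 1-100 range the force reportedly uses
    return max(1.0, min(100.0, raw))

print(risk_score(prior_offences=4, violent_offences=1, other_signals=2.5))
# 9.5
```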


Predictive policing has made its way across the pond from the US, where technology originally developed for earthquake prediction was adapted for use by law enforcement. It has been lambasted by civil liberties groups as discriminatory, but heralded as effective modern policing by policymakers and police forces alike.

Undoubtedly, technology can be transformative, and techniques such as machine learning make it possible to use data to inform decision-making in ways that were not possible before. But in an age of automation, where new technology is seen as a way to plug the gap left by a decade of austerity, we need an open debate about who benefits from predictive policing.

Part of this debate centres on how effective these systems are and how they affect communities differently. We need greater openness from ASP and, in the spirit of policing by consent, the ability to publicly scrutinise the effectiveness of risk scores – especially given troubling cases of predictive policing internationally.

One area that needs particular scrutiny is the over-policing of communities of colour. We live in a part of the country where black people are four times more likely to be arrested than white people. The question we face is: do risk scores threaten to encode this existing bias?

Discrimination in the data

In a high-profile case in the US, similar software that scored parole candidates on their likelihood of reoffending was condemned as discriminatory. The software scored black candidates significantly higher than white candidates with similar criminal records. The ability of scoring systems to discriminate even when protected characteristics are excluded comes down to what’s called proxy data – variables such as postcode, income and education, which can correlate with characteristics like ethnicity.
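A short synthetic example shows how proxy data does this. In the sketch below, ethnicity is never given to the model – it predicts risk purely from the historical arrest rate of a person’s postcode – yet because postcode correlates with ethnicity in the made-up data, and the historical arrests are biased, the predictions end up differing by ethnicity anyway.

```python
# Illustrative only: how a proxy variable can smuggle a protected
# characteristic back into a model. All data here is synthetic.
import random

random.seed(0)

# A synthetic population where postcode correlates with ethnicity
# and historical arrest data is biased against group B.
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    if group == "A":
        postcode = "BS1" if random.random() < 0.8 else "BS2"
        arrested = random.random() < 0.10
    else:
        postcode = "BS2" if random.random() < 0.8 else "BS1"
        arrested = random.random() < 0.30  # biased historical policing
    people.append((group, postcode, arrested))

# "Model": predicted risk is simply the historical arrest rate of
# a person's postcode. Ethnicity is never an input.
rate = {}
for pc in ("BS1", "BS2"):
    subset = [p for p in people if p[1] == pc]
    rate[pc] = sum(p[2] for p in subset) / len(subset)

# ...yet predictions still differ by ethnicity, via the proxy.
for g in ("A", "B"):
    subset = [p for p in people if p[0] == g]
    mean_risk = sum(rate[p[1]] for p in subset) / len(subset)
    print(f"group {g}: mean predicted risk {mean_risk:.2f}")
```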

Jonathan Dowey, Business Intelligence Manager at ASP, tells me about the force’s software, called Qlik Sense: “The data used as part of the modelling process primarily focuses on crime incidents, demographic data (excluding ethnicity) and command and control incident data.”

Dowey tells me the police have not noticed any ethnicity bias, and that they review the accuracy of the risk models quarterly. However, these reviews are not publicly available, so they cannot be independently verified. Researchers at the University of the West of England (UWE) are conducting a review of ASP’s use of predictive analytics; it is yet to be confirmed whether the findings will be published.
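Independent verification would not be complicated in principle. Here is a sketch of the kind of disparity check an outside auditor might run, assuming access to scores, outcomes and self-identified ethnicity – the field names, the 50-point threshold and the placeholder records below are all hypothetical.

```python
# Illustrative only: a disparity check an independent audit might run.
# The records are placeholders; real data would come from the force.
records = [
    # (ethnicity, risk_score, reoffended)
    ("white", 35, False), ("white", 60, True), ("white", 20, False),
    ("black", 55, False), ("black", 70, True), ("black", 45, False),
]

HIGH_RISK = 50  # hypothetical cut-off for a "high risk" flag

def false_positive_rate(group: str) -> float:
    """Share of a group's non-reoffenders flagged as high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(score > HIGH_RISK for _, score, _ in non_reoffenders)
    return flagged / len(non_reoffenders)

for group in ("white", "black"):
    print(f"{group}: false positive rate {false_positive_rate(group):.2f}")
```

A mismatch in rates like this – people who never went on to reoffend being flagged as high risk far more often in one group than another – is exactly what was found in the US parole case above.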

Hannah Couchman from civil rights advocacy group Liberty challenges the idea that police data can be neutral and free from institutional bias. “Police data simply reflects existing and historic discrimination in policing, so algorithms that rely on this data will replicate this discrimination.”

Police cuts

Over at Cardiff University’s Data Justice Lab, Lina Dencik interviewed staff at ASP and examined why the software was introduced and how it affects frontline work.

“The austerity context is really important here,” says Dencik. Police staff she interviewed noted that Qlik was introduced against the backdrop of £80m of funding cuts to the constabulary, and she found that the increased pressure brought about by those cuts has underpinned the rollout of predictive analytics.

This is not to argue for the abandonment of predictive policing, nor to claim that artificial intelligence has no place in 21st-century law enforcement. I’m arguing for an open debate, as a city, about the role technology plays in policing.

Using data to mass profile large sections of the population – 250,000 people in Bristol and the surrounding area – as a means to plaster over the gaps left by funding cuts is no replacement for community policing. It’s high time we scrutinised ASP and made sure these questions don’t go unanswered. Accountability, after all, should be a two-way street.


