
How a police and council database is predicting if your child is at risk of harm

Approximately 90 children may have been wrongly flagged as being at risk of sexual harm by a database used by the council and police, according to Jake Hurfurt, the Head of Research and Investigations at Big Brother Watch.

Opinion

Over the last decade, Avon and Somerset Police and Bristol City Council have developed one of the most sophisticated predictive tools in the public sector, the Think Family database. Run by Insight Bristol, a collaboration between the police and the council, the programme is an exercise in mass data gathering that affects many people in the city.

Protecting children is an important responsibility and digital tools are now commonly used by social services. However, this should not give councils a free pass to collect more data than they need or to handle anyone’s data in an opaque or unfair way. That is why we at Big Brother Watch believe it is vital that public authorities are held to account for what they do with our personal information.

At least 170,000 Bristolians from 54,000 families have their data held on the Think Family database. We estimate this adds up to around three in every five families with children in the city having something about them on the database, underlining the scale of the data collection.

With so many people on the database, questions must be asked about whether this is a fair way to handle personal data, especially when so many local authorities have shown they do not understand that discrimination can be unintentionally built into their algorithms.

Our new findings about Bristol’s Think Family database are published today in Big Brother Watch’s report, Poverty Panopticon: The Algorithms Shaping Britain’s Welfare State. The nine-month-long investigation into the use of algorithms and automation by councils covers everything from the mass profiling of social housing tenants to private companies selling tools that predict people’s vulnerability to a range of harms, from homelessness to debt.

Insight Bristol is unique in developing and running such a complex predictive tool internally, rather than purchasing one from a private company. Bristol is using perhaps the most advanced tool, with one of the largest datasets, of any we have investigated. The Think Family programme uses three main models, which assess the risk of children suffering sexual abuse, becoming involved in criminality, or ending up not in education, employment or training (NEET).


All three generate risk scores for each type of harm based on risk factor data held on the system. The more complex sexual abuse and NEET models go further and use predictive analytics when modelling risk. This means the system does not just look for risk factors directly linked to harm; it also looks for patterns in the wider data, to try to identify the kinds of people who are vulnerable to harm but who would not be flagged by risk factors alone.
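To make that distinction concrete, here is a minimal Python sketch using entirely synthetic data and hypothetical feature names, not anything drawn from Think Family: a simple points-based checklist alongside a model that learns patterns from past cases.

```python
# Purely illustrative sketch: synthetic data and hypothetical feature names,
# not the actual Think Family models or data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Approach 1: a direct risk-factor checklist, adding points per known factor.
def checklist_score(record: dict) -> int:
    factors = ["school_exclusion", "domestic_abuse_report", "housing_arrears"]
    return sum(25 for factor in factors if record.get(factor))

# Approach 2: predictive analytics. A model fitted to historical outcomes can
# combine many weak indicators, so a child with no single decisive risk
# factor can still receive an elevated score.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(5000, 10)).astype(float)   # 10 binary indicators
y = rng.random(5000) < (0.05 + 0.04 * X[:, :4].sum(axis=1))  # hidden pattern
model = LogisticRegression().fit(X, y)

child = np.zeros((1, 10))
child[0, :4] = 1.0  # several weak signals, no single decisive factor
print(f"Checklist: {checklist_score({'housing_arrears': True})} points")
print(f"Learned model risk: {model.predict_proba(child)[0, 1]:.0%}")
```

The point is not the specific numbers but the mechanism: the second approach scores people on learned correlations, which is what makes questions of accuracy and bias so important.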

So how does it work?

Risk scores, ranging from 1 to 100, are presented to staff working on the scheme alongside information extracted from the child or family’s case files. The approximately 450 staff with access to the database exercise discretion and do not make decisions on the risk scores alone, but the scores would not exist if they played no part at all.

Around 450 children were flagged as being at risk of sexual exploitation by the model as of June, according to Bristol City Council. In response to an FOI request, the council also told us that the sexual abuse model has a precision rate (the percentage of those flagged as being at risk who are correctly identified) of around 80%. The remaining 20% means that around 90 of the 450 children may have been wrongly flagged as being at risk of harm, a worrying prospect.
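The arithmetic behind that estimate follows directly from the two figures in the council’s FOI response:

```python
# Arithmetic behind the estimate above, using the council's FOI figures.
flagged = 450       # children flagged by the sexual abuse model as of June
precision = 0.80    # share of flagged children who are correctly identified

wrongly_flagged = flagged * (1 - precision)
print(f"Estimated wrongly flagged children: {wrongly_flagged:.0f}")  # 90
```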

Dozens of public sector databases, including school, housing and even NHS files, are trawled for information covering 40 different social issues, from poverty to health and school exclusion. Alarmingly, some data is even purchased from the credit reference agency Experian, which was investigated by the Information Commissioner last year for trading people’s data without their consent.

Anything from teenage pregnancies to free school meals and mental health concerns can be used by the tool. We found that internal council databases from education to housing and social care are combined with information taken from central government, police, prisons and the NHS. Many people who share sensitive information with public services would never imagine it would be shared so widely. It is vital that everyone knows exactly how their information is used and why, so we can all make a choice about what to share and what to keep private. 

Image: City Hall, Bristol City Council

Another major issue is that some of the data risks introducing inadvertent bias into the system. For example, free school meal data skews towards the poorest in society, while mental health interventions often overrepresent some ethnic minorities. This is a different problem from non-algorithmic bias, such as a case worker’s prejudices affecting their work, because a biased algorithm can entrench inequality in every decision it influences, and at mass scale.

Despite this, Bristol City Council said in response to a Big Brother Watch FOI request that, as personal characteristics such as race are not put into the model, they cannot feature in the output, and that the model’s results reflect the city’s diversity.

But this shows a worrying misunderstanding of indirect bias, which has been a common theme across the dozens of local authorities I have investigated. Few seem to realise that even if a factor like ethnicity is excluded from a model, the outcomes can still be disproportionate if a detail that over-represents a certain ethnic group, such as free school meals, is used in the algorithm.
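A minimal synthetic sketch shows why excluding a protected characteristic is not enough. The data below is made up, not Bristol’s; the proxy feature stands in for something like free school meal take-up that is correlated with group membership.

```python
# Synthetic sketch of indirect (proxy) bias; not Bristol's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)   # protected attribute, never given to the model

# The proxy feature is far more common in group 1 (mirroring, say, free
# school meal take-up skewing towards particular communities).
proxy = rng.random(n) < np.where(group == 1, 0.6, 0.2)

# The real outcome depends only weakly on the proxy.
at_risk = rng.random(n) < (0.10 + 0.05 * proxy)

X = proxy.reshape(-1, 1).astype(float)  # ethnicity excluded, proxy included
model = LogisticRegression().fit(X, at_risk)
flagged = model.predict_proba(X)[:, 1] > 0.12  # arbitrary flagging threshold

for g in (0, 1):
    print(f"group {g}: {flagged[group == g].mean():.0%} flagged")
# Group 1 ends up flagged roughly three times as often, even though the
# model was never given the 'group' attribute.
```

Even in this toy setting, the disproportionate outcome appears without the protected attribute ever entering the model, which is exactly the indirect bias the council’s response overlooks.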


I was also alarmed at the extent of data sharing involved in the Think Family programme. The police appear to have access to some information and there are plans to share information with other public bodies, such as schools and GPs.

This could have a chilling effect on vulnerable people’s willingness to be frank with public services when necessary. If someone is aware that data from health or school files could be used to profile them and intervene in their life, they may not fully cooperate with public services they may desperately need.

Although documents disclosed to Big Brother Watch and published online suggest that Think Family flags only end up on police files after a family has accepted an offer of help, an awards application for Insight Bristol that we uncovered suggests that police staff have more general access to the database.

Insight Bristol has plans to share insights from the model with GPs and even school staff. This is worrying and could lead to a comprehensive surveillance state for vulnerable people in Bristol, where their interactions with a range of public services may be fed into, or pulled from, a centralised system in which data about them has been collated without their knowledge or consent.

Big Brother Watch was able to find out more about the Think Family programme’s predictive models because Bristol City Council was much more open than councils that use privately developed systems.

However, limited transparency is not enough and it does not excuse data gathering on this scale. Protecting children is important but it is not an automatic justification for putting the data rights of so many people at risk, especially when questions remain over the accuracy of the model and its potential for bias.

The breadth of the planned data sharing should alarm everyone in Bristol. The more public services integrate their databases and the information people have shared with health workers or schools for very limited purposes, the closer we get to a centralised database where privacy is sacrificed for a supposed greater good. Unless we make governments, local or national, justify why they hold our private data, there is little stopping them gathering more and more until they know every intimate detail of our lives – what may be trivial for one person may be intensely sensitive for someone else.

This is why Big Brother Watch is calling on Bristol City Council to reassess its data collection and only hold information that is absolutely necessary to protect children, and to halt data sharing across public bodies to make sure individuals can be open and frank with vital public services without fearing sensitive information will be shared more widely.

Jake Hurfurt is the Head of Research and Investigations at Big Brother Watch. He is currently digging deep into the use of AI, algorithms and predictive analytics in welfare and social care.
