The Bristol Cable

How a police and council database is predicting if your child is at risk of harm

Approximately 90 children may have been wrongly flagged as being at risk of sexual harm by a database used by the council and police, according to Jake Hurfurt, the Head of Research and Investigations at Big Brother Watch.

Opinion

Over the last decade, Avon and Somerset Police and Bristol City Council have developed one of the most sophisticated predictive tools in the public sector, the Think Family database. Run by Insight Bristol, a collaboration between the police and the council, the programme is an exercise in mass data gathering that affects many people in the city.

Protecting children is an important responsibility and digital tools are now commonly used by social services. However, this should not give councils a free pass to collect more data than they need or to handle anyone’s data in an opaque or unfair way. As such, at Big Brother Watch we believe that it is vital that public authorities are held to account for what they do with our personal information.

At least 170,000 Bristolians from 54,000 families have their data held on the Think Family database. We estimate this adds up to around three in every five families with children in the city having something about them on the database, underlining the scale of the data collection.

With so many people on the database, questions must be asked about whether this is a fair way to handle personal data, especially when so many local authorities have shown they do not understand that discrimination can be unintentionally built into their algorithms.

Our new findings about Bristol’s Think Family database are published today in Big Brother Watch’s report, Poverty Panopticon: The Algorithms Shaping Britain’s Welfare State. The nine-month-long investigation into the use of algorithms and automation by councils covers everything from the mass profiling of social housing tenants to private companies selling tools that predict people’s vulnerability to a range of harms, from homelessness to debt.

Insight Bristol is unique in developing and running such a complex predictive tool in-house, rather than purchasing one from a private company. Bristol is using perhaps the most advanced tool, with one of the largest datasets, of any we have investigated. The Think Family programme uses three main models, which look at the risk of children suffering sexual abuse, becoming involved in criminality, or not being in education, employment or training (NEET).

Some data is even purchased from the credit reference agency Experian

All three generate risk scores for each type of harm based on risk factor data held on the system. The more complex sexual abuse and NEET models go further and use predictive analytics when modelling risk. This means the system does not just look for risk factors directly linked to harm; it also looks for patterns in the wider data, trying to identify people who are vulnerable to harm but who might not be flagged by risk factors alone.
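The distinction between a direct risk-factor checklist and a pattern-based predictive score can be sketched in a few lines of Python. Everything here, the feature names, weights and flagging rule, is hypothetical and purely illustrative; the actual Think Family models are more complex and their features are not public.

```python
# Hypothetical sketch: checklist vs pattern-based scoring.
# All feature names and weights below are invented for illustration.

def checklist_flag(child):
    # Direct risk factors only: flag if any single factor is present
    return child["known_offender_contact"] or child["prior_referral"]

def predictive_score(child, weights):
    # Pattern-based: weighted sum over wider data, clamped to a 1-100 score
    raw = sum(weights[k] for k, v in child.items() if v and k in weights)
    return max(1, min(100, round(raw)))

weights = {"school_absence": 30, "housing_instability": 25, "prior_referral": 40}

child = {"known_offender_contact": False, "prior_referral": False,
         "school_absence": True, "housing_instability": True}

print(checklist_flag(child))             # False: no direct risk factor present
print(predictive_score(child, weights))  # 55: weaker signals combine into a score
```

The point of the sketch is that a child with no single direct risk factor can still receive a substantial score once weaker signals are combined, which is what makes these systems both powerful and harder to scrutinise.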

So how does it work?

Risk scores, ranging from 1 to 100, are presented to staff working on the scheme alongside information extracted from the child or family’s case files. The approximately 450 staff with access to the database exercise discretion and do not make decisions on the risk scores alone, but the scores would not exist if they were not used at all.

Around 450 children were flagged as being at risk of sexual exploitation by the model as of June, according to Bristol City Council. In response to an FOI request, the council also told us that the sexual abuse model has a precision rate (the percentage of those flagged as at risk who are identified correctly) of around 80%. This means that around 90 of the 450 children may have been wrongly flagged as being at risk of harm, a worrying prospect.
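The arithmetic behind that figure is straightforward: precision is the share of flagged cases that are correct, so a 20% shortfall on 450 flags implies roughly 90 false positives. A minimal sketch using the council’s published figures:

```python
# The council's own figures: ~450 children flagged, ~80% precision.
flagged = 450
precision = 0.80  # share of flags that are correct, per the FOI response

# Precision = true positives / all flagged, so the false positives are:
false_positives = round(flagged * (1 - precision))
print(false_positives)  # → 90 children potentially wrongly flagged
```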

Dozens of public sector databases, including schools, housing and even NHS files, are trawled for information, and this data covers 40 different social issues from poverty to health and school exclusion. Alarmingly, some data is even purchased from the credit reference agency Experian, which was investigated by the Information Commissioner last year for trading people’s data without their consent.

Anything from teenage pregnancies to free school meals and mental health concerns can be used by the tool. We found that internal council databases from education to housing and social care are combined with information taken from central government, police, prisons and the NHS. Many people who share sensitive information with public services would never imagine it would be shared so widely. It is vital that everyone knows exactly how their information is used and why, so we can all make a choice about what to share and what to keep private. 

City Hall, Bristol City Council

Another major issue is that some of the data risks introducing inadvertent bias into the system. For example, free school meals data skews towards the poorest in society, while mental health interventions often over-represent some ethnic minorities. This is a different problem from non-algorithmic bias, which could take the form of, say, a case worker’s prejudices affecting their work: biased algorithms can entrench inequality in every decision they influence, and on a mass scale.

Despite this, Bristol City Council said in response to a Big Brother Watch FOI request that as personal characteristics such as race are not put into the model they cannot feature in the output and that the model’s results reflect the city’s diversity. 

But this shows a worrying misunderstanding of indirect bias, which has been a common theme across the dozens of local authorities I have investigated. Few seem to realise that even if a factor like ethnicity is excluded from a model, the outcomes can still be disproportionate if a detail that over-represents a certain ethnic group, such as free school meals, is used in the algorithm.
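A toy example shows how this indirect bias works; all numbers here are invented for illustration. Even a model that never sees ethnicity will flag groups at different rates if it relies on a proxy feature, such as free school meals (FSM), whose take-up differs between groups.

```python
# Illustrative only: two hypothetical groups of 1,000 children each,
# with different (invented) rates of free school meal take-up.
group_a = {"size": 1000, "fsm_rate": 0.15}
group_b = {"size": 1000, "fsm_rate": 0.40}  # over-represented among FSM recipients

# Suppose the model flags every FSM child. It has no ethnicity input at all,
# yet the outcomes diverge sharply by group:
flags_a = round(group_a["size"] * group_a["fsm_rate"])  # 150 flagged
flags_b = round(group_b["size"] * group_b["fsm_rate"])  # 400 flagged

# Group B is flagged at more than 2.5x the rate of group A, purely
# through the proxy feature: this is indirect (disparate-impact) bias.
print(flags_a, flags_b)
```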

Anything from teenage pregnancies to free school meals and mental health concerns can be used by the tool

I was also alarmed at the extent of data sharing involved in the Think Family programme. The police appear to have access to some information and there are plans to share information with other public bodies, such as schools and GPs.

This could have a chilling effect on vulnerable people’s willingness to be frank with public services when necessary. If someone is aware that data from health or school files could be used to profile them and intervene in their life, they may not fully cooperate with public services they desperately need.

Although documents disclosed to Big Brother Watch and published online suggest that Think Family flags only end up on police files after a family has accepted an offer of help, an awards application for Insight Bristol that we uncovered suggests that police staff have more general access to the database.

Insight Bristol has plans to share insights from the model with GPs and even school staff. This is worrying and could lead to comprehensive surveillance of vulnerable people in Bristol, with their interactions across a range of public services fed into, or pulled from, a centralised system holding data collated without their knowledge or consent.

Big Brother Watch was able to find out more about Think Family’s predictive models because Bristol City Council was much more open than councils that use privately developed systems.

However, limited transparency is not enough and it does not excuse data gathering on this scale. Protecting children is important but it is not an automatic justification for putting the data rights of so many people at risk, especially when questions remain over the accuracy of the model and its potential for bias.

The breadth of the planned data sharing should alarm everyone in Bristol. The more public services integrate their databases and the information people have shared with health workers or schools for very limited purposes, the closer we get to a centralised database where privacy is sacrificed for a supposed greater good. Unless we make governments, local or national, justify why they hold our private data, there is little stopping them gathering more and more until they know every intimate detail of our lives – what may be trivial for one person may be intensely sensitive for someone else.

This is why Big Brother Watch is calling on Bristol City Council to reassess its data collection and only hold information that is absolutely necessary to protect children, and to halt data sharing across public bodies to make sure individuals can be open and frank with vital public services without fearing sensitive information will be shared more widely.

Jake Hurfurt is the Head of Research and Investigations at Big Brother Watch. He is currently digging deep into the use of AI, algorithms and predictive analytics in welfare and social care.
