Police and council defend safeguarding app after calls to stop collecting info on vulnerable young people
Avon and Somerset Police and Bristol City Council have defended a safeguarding app which they say has “helped protect hundreds” of children. The app and its associated database collate information on risks to vulnerable children and young people to help teachers spot early warning signs.
As reported by the Guardian last week, campaigners have criticised the Think Family Education app and associated database which they say could lead to young people from working class or minority ethnic backgrounds being discriminated against. They claimed its use was “secretive” and called for it to be shut down.
The database, which launched in 2015, combines information from the police, council, government and social care services, and helps professionals spot children at risk of criminal or sexual exploitation.
Assistant Chief Constable Will White, who leads on Avon and Somerset Police’s race matters work, said: “I recognise the concerns being raised by community and campaign groups about the use of the Think Family Education app and more broadly around our use of data analytics to prevent crime and safeguard the vulnerable.
“Our motivation for using this app, in partnership with other agencies, is to protect and safeguard the most vulnerable from harm, support them and provide better services. But I understand there are concerns about disproportionality and the impact this might have on people from racially or ethnically minoritised or more disadvantaged backgrounds.”
The Think Family app is available to more than 100 schools in the Bristol area. Its aim is to help staff identify risks more quickly, avoiding the lengthy delays involved in requesting information from other agencies, so they can provide support or intervene where needed.
White added: “Neither the database nor app replace professional judgement or decision-making, and they do not assess the likelihood of an individual to commit a crime. They provide a vulnerability-based risk score based on a number of factors including whether the young person has previously been a victim of crime and whether they have previously been reported missing.
“This score is designed to help guide and supplement the work of professionals and provide them with information about children at risk that they may not easily see. We do not use ethnicity to assess risk.”
Bristol City Council said a “generational squeeze on public finances” has led to less safeguarding support across the country. Shrinking budgets have meant a growing trend of public agencies working in silos and not communicating well with each other, which the database and app aim to address.
A Bristol City Council spokesperson said: “The establishment of the Think Family Database was one of the ways in which the team sought to bring agencies closer and ensure safeguarding professionals were working together to put in place support for those children and families at risk of breakdown or crisis.
“By ensuring that appropriate and relevant information held by individual agencies is collated in a single place means professionals can act quickly without risk of duplication, to put in place the help children and families need, often before they reach a point of crisis. This approach is preventing incidents that could lead to life-long trauma.”
If teachers have safeguarding concerns about a specific pupil, they can use the app to see if there are any other relevant risks to the pupil, such as a report to social services. This helps build up a more complete picture of when support or intervention is needed.
The council spokesperson added: “The app gives schools access to information from agencies they would receive in any case where there is a safeguarding concern or a child in need. The ability to immediately access relevant information when safeguarding concerns arise is helping schools make informed and effective decisions, in consultation with parents, carers and guardians, about the help they can provide to support vulnerable children.”
But the app and database have caused concern among campaigners who claim their use is secretive and covert. Details of the “secret” Think Family app are published on Bristol City Council’s website. Fair Trials, a criminal justice campaign group, said profiles of pupils are created using data from Avon and Somerset Police which “reflect the structural biases in society”, and called for the safeguarding app to be shut down.
Fair Trials said the council has a user charter for Think Family Education, which states that schools must keep use of the system “on a limited need-to-know basis”. Safeguarding leads at a Bristol secondary school also reportedly told campaigners that they keep the system secret from parents and carers.
Griff Ferris, senior legal and policy officer at Fair Trials, said: “It is unbelievable that this needs to be said, but school children should not be monitored, profiled and criminalised by secretive police databases in schools. Surveillance is not safeguarding. Profiling children as criminals is not safeguarding.
“There is no safety in a system that covertly facilitates digital police in schools. Systems like this merely uphold already existing discrimination against children and families from minoritised ethnic and more deprived backgrounds, and reinforce the school to prison pipeline.
“This system is expanding the net of surveillance and criminalisation into our schools, and will lead to Bristol school children being targeted by police and drawn into the criminal justice system, while also facing suspicion, punishment and exclusion at school. It should be shut down immediately.”
This is far from the first time the use of the database has come under fire. In 2021, Big Brother Watch published a report into the use of algorithms and automation by councils. The research group called on Bristol City Council to limit its data collection to information that is absolutely necessary to protect children, and to halt data sharing across public bodies.
A spokesperson from No More Exclusions Bristol added: “As we have seen with predictive policing, technologies that gather and use information in the name of ‘public safety’ overwhelmingly reproduce racialised ideas of problematic behaviour. Too often solutions to these ‘problems’ are punitive and therefore the covert use of intrusive monitoring is yet another indicator that, in this city, control is valued above education.”