The news this week that councils are set to use people’s data to create predictive models to detect child abuse has caused a stir, but councils in London have been piloting predictive analysis models across their child protection services for the last three years. And it’s not just councils – the NHS and the Department for Work and Pensions have been using them for some time.
A September 2017 article in Apolitical describes how councils in London have been using data analytics to try to identify children at risk of harm since 2015, a detail The Guardian missed this week when it covered the issue. It also suggests that the predictive model used in one council had an 80% success rate, though the article does not explain how that success rate was measured. What is interesting about the piece is that it highlights the financial advantages to councils of using data in this way. The article tells us:
“Councils are expected to save over $910,000 for early targeted interventions, $160,000 by replacing human-conducted screenings with an automated system, and $193,000 for improving access to multi-agency data.”
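The "80% success rate" mentioned above is hard to interpret on its own, because two standard "success" measures for a screening model answer very different questions. As a purely illustrative sketch, with every figure invented and nothing here reflecting Xantura's actual results:

```python
# Two common "success rate" measures for a screening model.
# All figures below are invented for illustration only.

def precision(tp: int, fp: int) -> float:
    """Of the families the model flagged, what fraction were truly at risk?"""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Of the families truly at risk, what fraction did the model flag?"""
    return tp / (tp + fn)

# Hypothetical model: it flags 100 families, 80 of them correctly,
# but misses 120 other genuinely at-risk families.
tp, fp, fn = 80, 20, 120

print(precision(tp, fp))  # 0.8 -> could be reported as "80% success"
print(recall(tp, fn))     # 0.4 -> yet most at-risk families were missed
```

Without knowing which of these (or some other measure entirely) lies behind the 80% figure, the claim tells us very little.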
The article says that the company responsible for creating this predictive model is called Xantura, whose pilot inside the child protection sector has been running since its launch in January 2015. The software comes at an eye-watering cost of $1.25 million.
According to the article, councils using Xantura’s software have been slow to trust the model, which is not completely effective or accurate, though social work teams may be more concerned about the threat the software poses to jobs inside the sector. The fact that councils have to pay for the software out of their own budgets has also made uptake of the model slow.
Xantura is adamant that its software will save councils money in the long run, and some local authorities are getting on board as a result. Our favourite quote from the article comes from Steve Liddicott, Interim Assistant Director of Children and Young People’s Services at the London Borough of Hackney, who says:
“You actually don’t have to prevent that many children from going into care to make quite a significant saving.”
According to Apolitical, Xantura’s “Early Help Profiling System” (EHPS) uses statistics from multiple agencies, including information about school attendance and attainment, families’ housing situations and economic indicators. The model then turns those statistics into ‘risk profiles’ for each family.
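Apolitical does not describe how EHPS actually computes its risk profiles, so the following is only a guess at the general shape of such a system; every field name, weight and threshold here is invented and does not reflect Xantura's real model:

```python
# Purely illustrative sketch of indicator-based risk scoring.
# All field names, weights and thresholds are invented; they do
# NOT describe Xantura's EHPS or any real council system.

WEIGHTS = {
    "school_absence_rate": 0.35,        # fraction of school sessions missed
    "attainment_below_expected": 0.25,  # 1 if attainment is below expected level
    "housing_instability": 0.25,        # 1 if e.g. in temporary accommodation
    "economic_stress": 0.15,            # 1 if e.g. recent benefit sanctions
}

def risk_score(family: dict) -> float:
    """Weighted sum of normalised indicators, giving a score in [0, 1]."""
    return sum(WEIGHTS[k] * float(family.get(k, 0)) for k in WEIGHTS)

def risk_band(score: float) -> str:
    """Map a numeric score onto the kind of band a 'risk profile' reports."""
    if score >= 0.6:
        return "high"
    if score >= 0.3:
        return "medium"
    return "low"

family = {
    "school_absence_rate": 0.4,
    "attainment_below_expected": 1,
    "housing_instability": 0,
    "economic_stress": 1,
}
score = risk_score(family)
print(round(score, 2), risk_band(score))  # 0.54 medium
```

Even in this toy version, the problems the article goes on to raise are visible: the weights and thresholds are choices someone has to make, and a family's "risk" is whatever those choices say it is.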
But here’s the thing. We know that doesn’t work. Remember the Troubled Families Programme? The one where a whistleblower blew the lid on the project, exposing its fraudulent activity, which included using stale data to assess families and massaging the figures to engineer outcomes, so that the team involved could cover up the programme’s failure and make it look like a success? They used big data, too.
The software the team used was Clear Core, and it was developed by a company called Infoshare. And that was as long ago as 2013.
While we are not against the use of technology when it is accurate and effective, the government’s drive to use predictive analytics inside the child protection sector, knowing these models do not deliver robust results, makes the software’s predictions highly dangerous and leaves the government vulnerable to costly litigation.
Is the answer better technology, or can big data never capture the human condition fully enough to make accurate predictions? We don’t know, but for those of you interested in this area, we’ve added some more information below:
Children At Risk: How different are children on Child Abuse Registers?
This is a piece of research from 1991, produced by Mark Campbell. It looks at whether a checklist of 118 items could identify children at risk of abuse and neglect. The checklist was applied to 25 families who were attending local authority centres at the time; nine of those families had children on the local child abuse register. The checklist scores of the families on the register were then compared with the scores of the remaining families, who acted as a control group.
The research discovered something fascinating. There was little difference in the factors studied between the two groups. Mark concluded that this could have been down to one of two reasons:
- Either there was little real difference between the characteristics of abusing and non-abusing families, or;
- The process of registration was controlled by a series of events which were not solely related to the characteristics of the families themselves.
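The comparison Campbell made can be illustrated with a standard two-sample test. The checklist scores below are entirely invented, not his data; the point is simply what "little difference between the two groups" looks like statistically:

```python
import math
from statistics import mean, variance

# Invented checklist scores for illustration; NOT Campbell's 1991 data.
registered = [42, 38, 45, 40, 39, 44, 41, 37, 43]   # 9 families on the register
control = [40, 36, 44, 41, 38, 42, 39, 43, 37, 40,
           41, 38, 42, 39, 40, 41]                  # 16 families not on it

def welch_t(a: list, b: list) -> float:
    """Welch's t statistic for two samples with unequal variances."""
    return (mean(a) - mean(b)) / math.sqrt(
        variance(a) / len(a) + variance(b) / len(b)
    )

t = welch_t(registered, control)
# |t| well below ~2 means the group means are statistically
# indistinguishable at conventional thresholds, mirroring
# Campbell's finding that the checklist did not separate the groups.
print(round(t, 2))  # 0.88
```

With scores this similar, a checklist score alone cannot tell a registered family from a control family, which is exactly the problem Campbell identified.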
This research deserves to be included in the discussion, as it represents the beginnings of data collection for predictive purposes in this area.
We have written before about the risks involved in using big data and technology as it stands today. In April 2015, we shared our concerns over New Zealand’s plans to use data to create predictive models for child abuse. The lack of sophistication in these processes at the moment means that families could be exposed to predictive models that stereotype individuals and create unhelpful biases, which could lead to large-scale errors. We also mentioned another article, published by WIRED in January 2018, called “A Child Abuse Prediction Model Fails Poor Families”, which is noteworthy for the way it talks about how this kind of software can automate inequality.
As always, these fights are never fair, or clean. In an ideal world, debate around the rights and wrongs of predictive machinery inside the child protection sector would be done only by those who are truly independent, but it’s easy to spot the conflicts of interest if you look hard enough. We’ll let you decide about this lot.
Ian Josephs said:
Police arrest those who they believe have committed a crime. No punishment without crime. The system we used to have.
Social workers aided by complacent judges take children from parents when they think (or pretend to think) that one day in the future a parent might harm their child.
The system we have now!
Which system do you prefer?
maureenjenner said:
Computers and their software are only ever as good as those operating them. We are still many years from having ‘thinking’ robots.
Although those that we do have are pretty impressive, they can only operate within the limits of their programme – and the person operating that programme. God forbid that children and childcare should be consigned to robotic machines. It all sounds too much like the stuff of science fiction and not what childcare systems should be about.
The initial cost is enormous, but will become factored into the calculations of civil servants, who will doubtless churn them out to politicians, so they can gleefully spin them to the press and media on their walkabouts as money-saving soundbites.
It’s all about justifying the spending of great sums of money that sound impressive when addressing great gatherings at conferences, etc., but amount to very little in the hard light of day and the constant struggle of making ends meet for Mr & Mrs Everyman. They know from experience that it will mean more taxes on top of the constantly increasing burden that the daily cost of living has become for so many of us.
Dr. Manhattan. said:
“but it’s easy to spot the conflicts of interest if you look hard enough.”
Same story with stage 2 investigators, who are supposed to be independent of the council but have been proven not to be.
Conflict of interest seems to be widespread.
Ian Josephs said:
Steve Liddicott, Interim Assistant Director of Children and Young People’s Services at the London Borough of Hackney, says: (in the para above)
“You actually don’t have to prevent that many children from going into care to make quite a significant saving.”
Yes, for people like Steve it appears that “cash is king”! No mention of the possibility that more children should be left with their own parents! Scrap “risk assessments” and the like by giving every sane, law-abiding mum a chance to keep her baby under a supervision order.
Give that baby a chance before you deprive it of the love of its mother!
Natasha said:
I thought the quote was disgusting. It seems to imply that adoption agencies shouldn’t worry about the software taking away its lucrative placements as councils will be quite happy to use the software just enough to keep every group with a vested financial interest inside the sector happy. I want to toss my digital cookies every time I read it.