‘No evidence’ machine learning works well in children’s social care, study finds

What Works-developed models fail to identify most children at risk and, where they do identify risk, get it wrong in most cases

Digital image of brain and binary data flow signifying machine learning (credit: Elnur / Adobe Stock)

There is no evidence that using machine learning to predict outcomes for families involved with children’s social care services is effective, research has found.

Models built by What Works for Children’s Social Care and trialled over 18 months in four local authority areas failed to identify, on average, four out of every five children at risk.

Where the models flagged a child as being at risk, meanwhile, they were wrong six out of 10 times.
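In standard machine-learning terms, those two headline figures correspond to a recall of roughly 20% (the share of genuinely at-risk children the models flagged) and a precision of roughly 40% (the share of flags that were correct). The short Python sketch below uses invented counts, chosen only to reproduce that arithmetic, to show how the two metrics are computed.

```python
# Invented counts chosen purely to reproduce the article's headline figures.
from sklearn.metrics import precision_score, recall_score

# 100 children truly at risk, of whom the model flags only 20 (and misses 80);
# it also wrongly flags 30 of 200 children not at risk, giving 50 flags in total.
y_true = [1] * 100 + [0] * 200
y_pred = [1] * 20 + [0] * 80 + [1] * 30 + [0] * 170

print(recall_score(y_true, y_pred))     # 0.2 -> four in five at-risk children missed
print(precision_score(y_true, y_pred))  # 0.4 -> six in ten flags are wrong
```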

The research found introducing text information extracted from social work reports did not reliably improve models’ performance, despite this offering a more nuanced picture of families than can be gleaned from demographic information and data tracking interactions with practitioners.

What is machine learning?

Machine learning (ML) seeks to find patterns in data. What Works examined a type of ML called predictive analytics, in which a model uses patterns from historic data to learn the extent to which certain inputs or decisions are associated with particular outcomes. The model then uses these patterns to predict the likelihood of the specified outcome in future cases, given the relevant input data.
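As a rough, hedged illustration of that process, the sketch below fits a model to a handful of invented historic cases and then estimates the probability of escalation for a new case. The feature names, data and model choice are assumptions made for illustration; they are not the variables or code used in the What Works study.

```python
# Illustrative sketch only: invented features and data, not the study's models.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical historic cases: demographic and interaction features plus the
# recorded outcome (1 = the case later escalated, 0 = it did not).
historic = pd.DataFrame({
    "age_at_referral": [3, 7, 12, 5, 9, 14, 2, 8],
    "prior_referrals": [0, 2, 1, 3, 0, 4, 1, 2],
    "days_since_last_contact": [400, 30, 120, 15, 365, 10, 200, 45],
    "escalated": [0, 1, 0, 1, 0, 1, 0, 1],
})

X = historic.drop(columns="escalated")
y = historic["escalated"]

# The model learns which input patterns were associated with escalation.
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# For a new case it outputs an estimated probability of the outcome, which a
# decision threshold then turns into a "flag" or "no flag" prediction.
new_case = pd.DataFrame([{"age_at_referral": 6,
                          "prior_referrals": 2,
                          "days_since_last_contact": 20}])
print(model.predict_proba(new_case)[0, 1])
```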

The study report called on councils already trialling predictive technology in children’s social work to be transparent about its limitations. One such council, Hackney, axed its Early Help Profiling System (EHPS), commissioned from the private provider Xantura, late in 2019 after it did not “realise the expected benefits”.

“Given the extent of the real-world impact a recommendation from a predictive model could have on a family’s life, it is of utmost importance we work together as a sector to ensure these techniques are used responsibly if at all,” the report concluded.

‘Time to stop and reflect’

The new research follows on from a separate What Works review, published in January 2020, which questioned how ethically compatible machine learning was with children’s social work.

Michael Sanders, the What Works executive director and co-author of the study report, said the findings indicated that it was time for the children’s social care sector “to stop and reflect”.

“The onus is now on anyone who is trying to say [that predictive analytics] does work, to come out and transparently publish how effective their models are,” Sanders told Community Care.

“What we have shown in our research is that with a lot of the best techniques available to us, the data across these four local authorities says it’s not working,” he added.

Sanders, who has also researched machine learning in children’s social care as part of the Behavioural Insights Team (BIT), formerly part of government and known as the ‘nudge unit’, said his views on the technology’s potential benefits had changed in line with the available evidence.

“We don’t think we are infallible – if someone can find a mistake we’ve made, or can take our code [which will be publicly available] and do something good with it, then I am happy for that to happen,” he said. “But it needs to be in an open and transparent way, not behind closed doors.”

Sanders added that central government, or bodies such as the Local Government Association (LGA) or Association of Directors of Children’s Services (ADCS), could now take a lead in policing the use of machine learning until such a time as its worth could be demonstrated.

‘Surprisingly bad performance’

The What Works study’s models were developed to predict eight separate outcomes (see box), using three to seven years of data provided by the four councils, from the North West, South West, West Midlands and South East regions.

The eight predictions

The What Works study looked at eight different scenarios, each based on a decision-making point for a social worker in a case, and each asking whether the case would escalate at a later point in time. They were:

  • Is a child re-referred within 12 months of a ‘no further action’ decision, and does the case then escalate to statutory intervention?
  • Does a child’s case progress to a child protection plan or beyond within six months of a contact?
  • Is a child’s case open to children’s social care, but below statutory intervention, within 12 months of a ‘no further action’?
  • Is a child’s case escalated to a child protection plan or beyond between three months and two years of a referral?
  • Is a child’s case escalated to a child protection plan or beyond within six to 12 months of a contact?
  • After successfully engaging with early help, is a child referred to statutory services within 12 months?
  • Does a child’s case escalate to a child protection plan between one and 12 months of an assessment authorisation date?
  • Does a child’s case escalate to them becoming looked-after between one and 12 months of an assessment authorisation date?

Each was tested in four different builds, so as to gauge whether including pseudonymised text data from social work records improved performance, and what impact using only historical data (thereby simulating real-world usage) had.
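To illustrate what a ‘with text’ build might look like in practice, the hedged sketch below combines one structured feature with TF-IDF features extracted from free-text case notes in a single scikit-learn pipeline. The column names, notes and model choice are invented for illustration and are not the study’s actual configuration.

```python
# Illustrative only: combining structured data with pseudonymised free text.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

data = pd.DataFrame({
    "prior_referrals": [0, 3, 1, 2],
    "case_notes": [
        "family engaged well with early help",
        "repeated missed appointments and escalating concerns",
        "school reports improved attendance",
        "further incident reported by neighbour",
    ],
    "escalated": [0, 1, 0, 1],
})

features = ColumnTransformer([
    ("structured", "passthrough", ["prior_referrals"]),   # tabular features
    ("text", TfidfVectorizer(), "case_notes"),            # free-text features
])

pipeline = Pipeline([("features", features),
                     ("model", LogisticRegression())])
pipeline.fit(data[["prior_referrals", "case_notes"]], data["escalated"])

# Estimated escalation probabilities for the same (toy) cases.
print(pipeline.predict_proba(data[["prior_referrals", "case_notes"]])[:, 1])
```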

In each instance, the models failed to reach a pre-specified ‘success’ threshold of 65% precision. “This is lower than the threshold we would recommend for putting a model into practice but provides a useful low benchmark,” the report said.

In particular, the study found, the models tended to miss the majority of children at risk of a given outcome, which could potentially produce results that discourage social workers from intervening.

In models where text had been introduced, performance improved in some scenarios. But it worsened in others, giving an overall picture of no consistent benefit – a result Sanders said was unexpected.

“I was surprised by just how bad the models performed overall,” he said. “From my previous research [with BIT, in a single borough], we found quite a big benefit to using text data as well, but that picture is much cloudier coming out of this piece of research.”

Sanders said that it was likely the evolution of systems and practice models, and turnover of staff, meant text data “is particularly vulnerable to changing over time”, making it less reliable as a basis for predictions.

A poll of 129 social workers conducted as part of the study uncovered no clear support for the use of predictive analytics across a range of scenarios. The most popular option, a tool to support practitioners in identifying early help for families, was backed by only 26% of respondents, while just over a third (34%) said predictive analytics should not be used at all.

‘We are far from achieving minimum standards’

Responding to the new findings, Claudia Megele, the chair of the Principal Children and Families Social Worker (PCFSW) Network and co-chair of the Social Work Practice and Digital Network, said: “There is a fascination with the use of machine learning and predictive technologies in children’s social care and as demonstrated by previous research and experiences of several local authorities as well as this report, there are significant risks, biases and ethical challenges associated with the application of such models in practice.

“In fact, local authority data often does not have the depth, breadth, structure and detailed information required by such systems,” Megele added. “Therefore, both ethically and practically we are far from achieving the minimum standards for use of machine learning for predictive purposes in practice.”

But, Megele noted, algorithms and machine learning can be used in other areas to intelligently support practitioners.

“It would be helpful if local authority resources were focused on the aspects of technology that are proven to be effective and that can support practitioners in time-taking tasks, ranging from gathering historical data to information sharing and partnership working with other agencies,” she said. These could include automating chronologies, or automating information-sharing tasks by intelligently routing required information, such as court orders or child protection plans, to the relevant agencies and professionals.

“Such automations will offer immediate practical support for practitioners while reducing costs and increasing the accuracy, timeliness and availability of information,” Megele said.
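As a loose, hypothetical sketch of the kind of rule-based information routing Megele describes above, the snippet below maps a document type to the agencies and professionals who should receive it. The document types, recipients and fallback are invented examples, not an existing council system.

```python
# Hypothetical rule-based routing: invented document types and recipients.
ROUTING_RULES = {
    "court_order": ["legal_team", "allocated_social_worker"],
    "child_protection_plan": ["school", "health_visitor", "police_liaison"],
}

def route_document(doc_type: str) -> list[str]:
    """Return the recipients a document of this type should be shared with."""
    return ROUTING_RULES.get(doc_type, ["duty_manager"])  # fallback recipient

print(route_document("court_order"))   # ['legal_team', 'allocated_social_worker']
```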

‘Human connection is the heartbeat of social work’

Meanwhile Rebekah Pierre, a professional officer at the British Association of Social Workers (BASW), said the What Works report reinforced “that human connection is the heartbeat of social work, [which] is not an ‘exact science’ that can be replicated by automated processes”.

Pierre added that social work was “founded on relationships, person-centred practice, and meaningful interactions – none of which can be achieved by data sets”, saying that BASW’s 80:20 campaign was continuing to champion this at a time when the coronavirus pandemic had diminished face-to-face contact.

“The margin of error [identified in the study] is deeply concerning – if applied to practice, most children would be left without support,” Pierre said. “It is unsurprising that there is a low level of acceptance of the use of these techniques in children’s social care among social workers. Being experts in the field, it is imperative that frontline practitioners are listened to.

“The recent A-level fiasco, which saw the opportunities of millions of children diminished at the expense of an algorithm, highlights the devastating consequences of predictive technology,” Pierre added. “The safety and wellbeing of society’s most vulnerable children must not be gambled with in the same way within social work.”

Jenny Coles, the president of the Association of Directors of Children’s Services, said: “This report highlights the challenges of trying to predict human behaviour and gives policy makers, local authorities and others a lot to consider. Children’s social care is complex, no two families or situations are the same and building relationships are central to the work of social workers and other professionals supporting families in times of need.”

‘Could be worthwhile exploring further’

But Coles also flagged up the fact that the study did not seek to answer whether or not machine learning could ever work in this context. “We know some local authorities are developing or exploring the use of machine learning models in children’s social care as an additional tool to support professional decision making,” she said. “It could be worthwhile exploring further, particularly if it could help us to be effective in identifying opportunities to support children and families earlier before they reach crisis point.”

A Department for Education spokesperson said: “This report was testing a children’s social care model that used predictive analysis, and its findings will help refine and improve the evidence available. In the coming years we expect to see more local authorities using new technology such as machine learning or artificial intelligence, so it is right that we improve our understanding of how it can improve practice.”


