‘Data sharing, supported by machine learning, can deliver better outcomes for children and families’

After a What Works report found no evidence that machine learning worked well in children's social care, Wajid Shafiq of data analytics firm Xantura sets out how he believes the approach can work in practice

Image of data (credit: Pablo Lagarto / Adobe Stock)

by Wajid Shafiq, CEO, Xantura Limited

At Xantura, we welcomed observations by Anne Longfield, the children’s commissioner for England, in her foreword to What Works for Children’s Social Care’s recent research report on machine learning in children’s services.

While the study found that models developed by What Works to test predictive analytics were not effective, Longfield said she “firmly believe[s] that innovative uses of data – be they better analysis, sharing or recording – can unlock considerable benefits, helping local agencies make better and more effective decisions”.

We were also struck by a statistic Longfield quoted, that “there are 2.3 million children in England growing up with a vulnerable family background – far bigger than the number of children being supported by children’s social care at any time”.

Having initially been surprised by the report’s findings, we were heartened by the caveat that, while What Works’ models found “no evidence that machine learning works well in children’s social care”, the study did not conclude definitively that machine learning doesn’t work.

Our experience across several real-life implementations is that the range of data used in predictive models is key to their performance, as is how well that data has been pre-processed before the application of machine learning techniques.
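
By way of illustration, the sketch below (hypothetical systems, fields and records, not Xantura’s actual pipeline) shows the kind of pre-processing this involves: joining extracts from separate systems on a shared identifier, and handling gaps explicitly so that a child’s absence from one system is not silently treated as an absence of risk.

```python
# A minimal pre-processing sketch. All systems, fields and values here are
# hypothetical illustrations, not Xantura's actual pipeline or real data.
import pandas as pd

# Extracts from separate systems, keyed on a shared (pseudonymised) child ID.
social_care = pd.DataFrame({
    "child_id": [1, 2, 3],
    "open_referrals": [2, 0, 1],
})
education = pd.DataFrame({
    "child_id": [1, 2, 3],
    "attendance_pct": [81.0, 96.5, None],  # missing values are common
    "exclusions_12m": [1, 0, 2],
})
youth_offending = pd.DataFrame({
    "child_id": [1, 3],  # not every child appears in every system
    "offending_contacts": [0, 1],
})

# Join the sources, then deal with gaps explicitly rather than letting a
# model treat "absent from a system" as zero risk by default.
features = (
    social_care
    .merge(education, on="child_id", how="left")
    .merge(youth_offending, on="child_id", how="left")
)
features["offending_contacts"] = features["offending_contacts"].fillna(0)
features["attendance_missing"] = features["attendance_pct"].isna().astype(int)
features["attendance_pct"] = features["attendance_pct"].fillna(
    features["attendance_pct"].median()
)

print(features)  # one row per child, ready for model training
```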

Important sources of information

The What Works study only had access to data from children’s social care systems. It did not draw on other important sources of information, such as school attendance, exclusion or youth offending data, and neither did it consider data about the wider family situation.

Professionals wouldn’t make an assessment solely on this basis; rather, they would consider children and their families holistically to build an accurate picture of individual circumstances.

The What Works study developed several algorithms, and we cannot make a direct comparison because our algorithms do not look at the same outcomes. However, none of the What Works algorithms could get their predictions right more than 65% of the time or identify more than 65% of cases, the thresholds the study had set as its measures of success.

In contrast, one of our algorithms, which predicts which child protection cases will become looked-after child cases within the next six to 12 months, gets its predictions right more than 80% of the time and identifies 77% of such cases.
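
In machine learning terms, “getting the prediction right” is usually measured as precision, and “identifying cases” as recall. The short worked example below, using hypothetical counts chosen only to land near the figures above, shows how the two differ:

```python
# A worked illustration of precision versus recall. The counts are
# hypothetical, chosen only to land near the percentages quoted above.

true_positives = 80   # flagged by the model and did become looked-after
false_positives = 20  # flagged but did not become looked-after
false_negatives = 23  # became looked-after but were never flagged

# Precision: of the children the model flagged, how many was it right about?
precision = true_positives / (true_positives + false_positives)

# Recall: of the children who did become looked-after, how many were flagged?
recall = true_positives / (true_positives + false_negatives)

print(f"precision = {precision:.0%}")  # 80%
print(f"recall    = {recall:.1%}")     # 77.7%
```

A model can score well on one of these metrics while scoring poorly on the other, which is why both thresholds matter.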

One of the key differences is that our algorithm draws on a range of data sources, underpinned by robust information governance agreements. This difference in performance reinforces our argument that the systems delivering these algorithms need to enable the integration of multiple data sources, in a way that is controlled and ethical.

Supporting social workers

Another significant factor when considering the value of predictive models in children’s services is how they are used in practice. Our approach completely aligns with the statement in Longfield’s foreword to the What Works report that a “statistical model is no match for a human”. But the main report goes on to suggest that, if models could be made to work, they would be likely to disempower, rather than to support, social workers.

We are unaware of any implementation of predictive models in local government that involves automated decision-making (and I would argue that existing legislation and the Information Commissioner’s Office are already effective guardians). In our own implementations, we work with councils to enable the controlled, proportionate sharing of data for use by professionals to support their decision making. Frontline practitioners are presented with textual case summaries and trends of factual information, drawn from multiple sources.
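
To make that distinction concrete, here is a toy sketch (invented fields and thresholds, nothing from Xantura’s product) of rendering joined data as a readable factual summary rather than a bare risk score:

```python
# A toy illustration of surfacing factual context to a practitioner.
# The record, fields and the 90% attendance threshold are all invented.
record = {
    "case_ref": "Child A",  # identity would be handled under governance rules
    "attendance_pct": 81.0,
    "exclusions_12m": 1,
    "open_referrals": 2,
}

summary = (
    f"{record['case_ref']}: school attendance {record['attendance_pct']:.0f}%"
    f" (below the 90% threshold), {record['exclusions_12m']} exclusion(s)"
    f" in the last 12 months, {record['open_referrals']} open referral(s)."
)
print(summary)
```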

Of course, no algorithms are perfect, and we need to be careful to consider bias and to mitigate unintended consequences. But this is nothing new – the public sector has done exactly that when deploying ‘algorithms’ in the past, for example DASH models to predict domestic violence risk or the Youth Justice Board model to predict recidivism risk.

We also acknowledge that we need to do more to share what we have learned while respecting client confidentiality. This is a sensitive area, and it was understandable that the councils which participated in the What Works research did so on the basis of anonymity.

Had we had the opportunity to contribute to the study, we would have shared our insight that the successful use of predictive modelling also relies on wider cultural and transformational change. For example, in Hackney, where our work was discontinued in 2017, it wasn’t that effective models couldn’t be produced (their performance was very similar to that of the models we currently support) but that several wider issues affected their ability to be deployed effectively and sustainably.

We look forward to continuing our work in this area, which is already producing highly effective algorithms, at a time when huge numbers of children are missing from the social care system and in need of support. We, and the forward-thinking councils we work with, believe this is an important area of work with real potential to deliver benefits for professionals, families and children.

Wajid Shafiq is the chief executive of data analytics firm Xantura.


4 Responses to ‘Data sharing, supported by machine learning, can deliver better outcomes for children and families’

  1. LB September 23, 2020 at 10:35 pm #

    Statistics are not accurate recording, and data input is only as reliable as the individual inputting accurate and truthful data!

  2. Tony Stanley September 24, 2020 at 5:27 am #

    Hi Wajid, this is Tony (I was PSW in Tower Hamlets 2012-15). When you were promoting/selling this in 2014 (I recall a big UK Govt grant for 4 LAs), I raised with you and senior leadership considerable ethical issues and concerns about data being collected for one purpose and used for another, which I am not satisfied were addressed then or now; but I welcome the ongoing debate.

    This sentence in your piece stood out to me:
    “We, and the forward-thinking councils we work with” …
    Curious choice of phrase, given other LAs have stopped this work (Tower Hamlets etc).
    What, indeed, in your view is forward-thinking?

    A small contribution from us in Aotearoa NZ to the debate, a few years old now, but I think worth a revisit –
    https://www.communitycare.co.uk/2018/03/29/artificial-intelligence-childrens-services-ethical-practical-issues/

  3. Wajid Shafiq September 28, 2020 at 5:25 pm #

    Hi Tony, thank you for your post and the link to your article, which poses very valid concerns that need to be addressed if these techniques are to gain traction in children’s social care.

    All of our work since we met in 2014 (and in the prior six years) has been to build in safeguards to mitigate the risks outlined in your article. I would be more than happy to have a more in-depth discussion if you have the time for a call?

    My reference to ‘forward-thinking councils’ would probably have been better framed as ‘councils that are willing to explore the potential of these approaches’.

  4. Tony Stanley October 10, 2020 at 8:30 pm #

    Sure, I can be reached at tony.stanley@ot.govt.nz