Artificial intelligence: how a council seeks to predict support needs for children and families

New technology gives London council the opportunity to identify early help support needs

Photo: MH/Fotolia

As the number of children on child protection plans and entering care has continued to rise, early help and intervention services, which have faced drastic funding cuts in recent years, are often seen as a path to preventing future pressures on these high-cost, high-risk services.

If a family can be engaged with services early, then work at that stage can prevent escalation that leads to family breakdowns and children entering care, so the theory goes.

In that spirit, Hackney council has partnered with tech company Xantura to develop an artificial intelligence (AI) computer programme they believe could help them support families with multiple needs earlier, in some cases even before they meet any statutory agency.

Steve Liddicott, head of service for children and young people at Hackney council, said the ‘Early Help Predictive System’ uses data from multiple sources to help identify families where extra support might be needed.

The overall purpose of the technology and analysis is to give the authority as much information as possible to help it decide whether a family needs support, and to begin that support as early as possible, whether through the Troubled Families programme, schools, education or children’s services.

After two years of testing the software, “we’re now getting to the stage where we are using it monthly to generate a list of between 10 and 20 families where we think there is evidence of future concern,” Liddicott says.

Data sources

The AI takes data at an individual level, which is anonymised before it is processed, from the youth offending system, children’s social care, education and various other systems within the council, including housing.

It uses this data to look at areas such as debt, worklessness, benefits, housing, domestic violence, youth offending, anti-social behaviour and school attendance to create a profile of need for families. The anonymised data sent to Xantura comes back to children’s services with different levels of priority attached according to the criteria; the service then decides whether it should act and how.

“It is not necessarily sending a social worker around to see the family, it is about looking at how they could best utilise the support they are already getting,” Liddicott explains.

He says, for families already in contact with council agencies, the process could be about the children’s service getting in touch with the agency to let them know other areas where a family may need support, or how their situation might develop.

“In that testing period we’ve had to look at what the data sources are, whether they are giving us an accurate picture, and how the different risk factors are weighted in the system so it doesn’t give undue prominence to the wrong ones,” Liddicott adds.

The algorithm assigns different levels of priority to risks according to the presence or absence of other criteria it is analysing, Liddicott explains.
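The article does not disclose how Xantura’s model actually works, but the weighting-and-priority process described above can be illustrated with a minimal sketch. Everything here is hypothetical: the factor names, the weights, the thresholds and the `priority_level` function are invented for illustration and are not the council’s or Xantura’s real model.

```python
# Hypothetical weights for the kinds of risk factors the article mentions.
# Real systems would calibrate these from data, and (as Liddicott notes)
# adjust a factor's prominence depending on which other criteria are present.
RISK_WEIGHTS = {
    "school_absence": 2.0,
    "rent_arrears": 1.5,
    "domestic_violence_report": 3.0,
    "youth_offending_contact": 2.5,
    "antisocial_behaviour_report": 1.0,
}


def priority_level(family_record: dict) -> str:
    """Score an anonymised family record and map it to a priority band.

    `family_record` maps factor names to True/False flags; the score is
    the sum of the weights of the factors that are present.
    """
    score = sum(
        weight
        for factor, weight in RISK_WEIGHTS.items()
        if family_record.get(factor, False)
    )
    # Hypothetical thresholds: which band triggers which response is a
    # policy decision for the council, not a property of the algorithm.
    if score >= 5.0:
        return "high"
    if score >= 2.0:
        return "medium"
    return "low"
```

On this toy model, a record flagging both school absence and a domestic violence report would score 5.0 and land in the “high” band, while a single anti-social behaviour report alone would score 1.0 and stay “low”. The decision about what, if anything, follows from each band sits with the service, not the scorer.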


So far, the AI system has only been used to help the service identify needs of people currently involved with statutory agencies, but Liddicott explains there will be a point, once it becomes more sophisticated, when it could identify families with no history of interacting with the council.

It will then be up to the council and partner agencies to identify the most appropriate agency to contact the family.

The council is also looking to use the data sources to generate ‘snapshots’ of what is going on in certain families, which it hopes will inform how the service screens referrals in future.

“That’s also going to be the basis of a pilot project on sharing information with general practitioners to assist them with making referrals to children’s social care, which they will be able to make through this system directly to our front door.”

The system is now up and running, and the council reports there have been early interventions triggered in response to alerts the new artificial intelligence has generated. Now it’s a waiting game to see whether this is the early help solution councils, and families, need.

For two days of free essential learning and to boost your CPD profile, register now for Community Care Live Manchester 2018, taking place on 24-25 April at Manchester Central Convention Complex.


26 Responses to Artificial intelligence: how a council seeks to predict support needs for children and families

  1. TINK March 1, 2018 at 1:18 pm #

    OMG!!! How totally irresponsible… people are people with emotions/feelings/hearts/souls and minds of their own, ie free will!!! Stop trying to put them in tick boxes and destroying the family/homes and children’s lives!!!!! As well as the profession! Totally Agenda 21!!! Total wickedness and evil in the making!!!

    • Jay March 4, 2018 at 1:30 am #

      Totally agree with you TINK. This is Skynet run by the SS… which is a terrifying prospect. And of course, there will be no arguing with the minion that turns up at your doorstep, because of course… if the “computer says NO”… well… you’re screwed.

  2. Andrew March 1, 2018 at 1:44 pm #

    I agree completely with the post about agenda 21…this is totally disgraceful

  3. Tanya March 1, 2018 at 2:13 pm #

    Is this the government’s way of getting rid of social workers altogether to save money?

    • Paul March 2, 2018 at 5:12 am #

      Hackney council is actually run by a Labour administration.

    • Padraig March 2, 2018 at 8:46 am #

      Up next: print your own 3D social worker, no need for expensive training. Just run your algorithm and print what you need.

      Jesus wept, what have we become?

  4. Dave March 1, 2018 at 3:36 pm #

    So, if you are not involved with Social care in some form or another, the computer guesses that you might be in the future? I can do that. And I claim my £1m.

  5. Jane March 1, 2018 at 3:52 pm #

    Blood chilling. And disgusting……but as ever human misery is generating profit for a company ….as if the IT industry didn’t make enough dosh out of the rubbish integrated children’s system……..more of the same.

  6. Paul March 1, 2018 at 6:26 pm #

    I have to say it is no surprise that some social workers find advances in using technology difficult. As far back as the early 20th century the social work reformer Mary Richmond complained about the reluctance of social workers to use the telephone. The problem may be that social work perceives technology as unemotional and uncaring, which may be true in some ways, and perhaps this is also about concerns over labelling people (even though social workers can, in my view, be very capable of that themselves). On balance I would far rather have technology helping to identify vulnerable families sooner than rely on human efforts alone.

    I think this is a brave and brilliant initiative by Hackney. It is rare to see truly creative ideas in social work (rather than re-hashed ideas from the past). In 10 years’ time this will be the norm.

    • shaun March 2, 2018 at 12:18 pm #

      I think this is a balanced response. Research is starting to suggest that AI, and other types of technology, won’t replace professionals but enhance their ability to provide a service. Would be great if Social Work embraced this and were leaders in how technology can facilitate and amplify their impact.

      AI will be involved in all that we do in the not too distant future, I see social workers using social networking etc in positive ways, which was resisted only a few years back. Why not lead rather than adopt by default?

    • Jane Anstis March 11, 2018 at 8:05 pm #

      What a well informed viewpoint, and therefore of course I couldn’t agree more! 🙂 These emerging decision-making support tools are used across many agencies and fields, and have been for many years. They make correct predictions far more often, FAR more often, than the significantly biased and error-prone analysis of overstretched social workers. This is a win for rational and fair access for families.

  7. peter March 1, 2018 at 7:13 pm #

    At a time when we don’t have enough money, and my caseload is going through the roof – I would be grateful of any help I can get. There is no way AI is going to replace my job – I think the snapshots are a brilliant idea – I spend most of my time digging around to find out what has happened in the past – not wicked – think it could be brilliant!

  8. Karen March 1, 2018 at 7:49 pm #

    The sad thing is that the support wouldn’t be available anyway. This used to be done by social workers doing the job they are trained for, being part of the community they work in and using their skills and knowledge to build trusting relationships. This idea smacks of discrimination and interference on an unhealthy level. There wouldn’t be many from the middle classes identified here but then we all know they don’t have issues and problems that directly affect their children’s wellbeing. Glad I will be retiring shortly.

    • Peter March 2, 2018 at 6:15 pm #

      I think the point is if the system can reduce the administrative effort then social workers might get to spend more time with families – maybe that is a good idea?

    • Clare Chalmers March 3, 2018 at 12:18 am #

      Wellbeing is not defined in law and can equate to nothing more than happiness.
      Surely you mean Welfare?

      • Clare Chalmers March 3, 2018 at 12:19 am #

        Sad that good people are eyeing the door, hope you enjoy retirement though.

  9. Selina March 2, 2018 at 10:29 am #

    I am a social worker and I don’t see how this will replace us but instead help us to do our job better. If a ROBOT will do a lot of the admin tasks that I’m doing at the moment, then I will take the support.

  10. Vivien March 2, 2018 at 4:52 pm #

    But what if, for example, a foster/adoption family is not getting any support? Then there is no support to utilise.

  11. North West March 2, 2018 at 9:10 pm #

    As noted above, once a family has been identified, will the resources be there?
    I have wondered about this coming for a number of years, since big data became so powerful.
    It raises massive issues about data protection and sharing that I imagine have only been poorly addressed.
    In terms of the day-to-day reality, I think most workers are going out and meeting families whom they know need extra support but are unable to progress this. The irony might be that the computer and the information it gives won’t save the council; instead it will flood it with information. Every time the government gets involved in IT there is a problem.

    Thinking of the unfortunate situation where a child dies while waiting to be processed by the data, who will be to blame then?

  12. Vee Craw March 2, 2018 at 10:01 pm #

    I am so pleased I do not have to chase paper across the country, and I share the positive views about how we should embrace technology so we have more time to work with families and make a difference. Well done Hackney. I will be following this development to see how it can improve our work-life balance by saving valuable time.

  13. Clare Chalmers March 3, 2018 at 12:16 am #

    This appears to me to tread dangerously on human rights, the right to a private life. I hope full consent for all this data sharing complies with the Data Protection Act 1998, plus the GDPR.
    English parents might wish to google and read about the horrific battle that Scottish families have endured to regain control of family life after the Scottish Government promoted illegal data sharing at low thresholds. So many families have been abused as a result of GIRFEC and the Named Person scheme.
    Some families need support not blame, and the vast majority need and deserve to be left in peace.

  14. londonboy March 3, 2018 at 10:48 am #

    My big issues are around consent and ownership of data. Should private companies have access to this kind of information, and what level of consent is required to share our data? TFP already crossed some of the boundaries around consent to data sharing with little comment, discussion or scrutiny.

    The other issue is around health data: ownership of much of this has already been passed to private companies (many will not hand over data to SCRs, for example, because their private interests come before public interests). Many ‘red flags’ around CSE should have been raised in health settings, so how do we feel about adding health information to the mix?

    I do not have answers – there are potentially benefits and dangers as I see it but discussion and scrutiny is essential.

  15. dk March 5, 2018 at 8:19 pm #

    Not following the outrage here; based on the article it seems to me that this system is little more than the same kind of statistical analysis that literally every local authority has done for the last 30 or so years for basic forecasting and planning. If it is part of some nefarious neoliberal ploy, it’s not a new one.

  16. Blair McPherson March 7, 2018 at 2:19 pm #


  17. Blair McPherson March 8, 2018 at 11:30 am #

    In the future AI will be able to predict with 100% accuracy who will be an abuser. We can therefore intervene before it happens. In the future there will be no child abuse. The abuser will be identified and removed before they abuse.

  18. londonboy March 14, 2018 at 8:58 am #

    Wonder what age the ‘future abusers’ should be removed ( to where?) – 2, 3, 4, 5?