AI could be time-saving for social workers but needs regulation, say sector bodies

As more councils employ AI tools to help save time on administrative tasks, BASW and the Social Workers Union call for government action to address the ethical implications

Photo by Sutthiphong/AdobeStock

Do you use AI tools, such as Copilot or Magic Notes, for daily social work tasks?

  • No (79%, 562 Votes)
  • Yes (21%, 151 Votes)

Total Voters: 713

Social work bodies have called for the regulation of artificial intelligence (AI) to address the ethical implications, as more councils employ AI tools to save time on administration.

Currently, 28 councils in England are using or testing the AI tool Magic Notes in children’s and adults’ services, to produce case notes from visits and assessments.

Developed by AI company Beam alongside social workers, Magic Notes records meetings and emails the practitioner a transcript, summary and suggested actions for inclusion in case notes based on council-agreed prompts.

According to Beam, the technology complies with social care statutory requirements and, in all cases, practitioners must review the documents before adding them to their case management systems.
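For illustration, the sketch below outlines a minimal "record, transcribe, summarise, review" pipeline of the general shape described above. It is not Beam's or Microsoft's code: the function names, the DraftCaseNote structure and the council_prompts parameter are hypothetical placeholders, and the only behaviour taken from the article is the requirement that a practitioner reviews and approves the draft before it enters a case management system.

```python
# Illustrative sketch only; all names are hypothetical placeholders, not a real product API.
from dataclasses import dataclass, field


@dataclass
class DraftCaseNote:
    transcript: str
    summary: str
    suggested_actions: list[str] = field(default_factory=list)
    reviewed_by_practitioner: bool = False  # human review gate before filing


def transcribe_audio(recording_path: str) -> str:
    """Stub for a speech-to-text step; a real tool would call a transcription service."""
    return f"[transcript of {recording_path}]"


def summarise_with_prompts(transcript: str, council_prompts: list[str]) -> tuple[str, list[str]]:
    """Stub for a summarisation step shaped by council-agreed prompts."""
    summary = f"Summary of {len(transcript)} characters, shaped by {len(council_prompts)} agreed prompts."
    suggested_actions = ["Draft action for the practitioner to accept, amend or discard."]
    return summary, suggested_actions


def draft_note(recording_path: str, council_prompts: list[str]) -> DraftCaseNote:
    """Produce a draft to email to the practitioner; nothing is filed automatically."""
    transcript = transcribe_audio(recording_path)
    summary, actions = summarise_with_prompts(transcript, council_prompts)
    return DraftCaseNote(transcript, summary, actions)


def file_note(note: DraftCaseNote, case_management_system: list[DraftCaseNote]) -> None:
    """Refuse to file anything the practitioner has not reviewed and approved."""
    if not note.reviewed_by_practitioner:
        raise ValueError("Practitioner must review and approve the draft before filing.")
    case_management_system.append(note)
```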

Swindon Council, which piloted Magic Notes with 19 adult social workers between April and July 2024, found it reduced the average time taken to conduct a Care Act assessment conversation from 90 minutes to 35, and the time spent on follow-up case notes from four hours to one and a half.

The authority said the tool particularly benefited practitioners with learning difficulties and visual impairments, along with those who were not native English speakers.

Meanwhile, other local authorities, such as Barnsley, are using another AI tool, Microsoft’s Copilot, which performs similar functions, transcribing meetings and generating notes and actions based on prompts.

Calls for regulation

However, the rise of AI in social work has also sparked concerns about data privacy for families, bias, and whether AI-generated actions will be adequately reviewed before being carried out.

Social Workers Union general secretary John McGowan said AI could be a “helpful time-saving tool”, but should not be used as a “quick fix” for the lack of funding and staff in the sector.

“Right now the onus should be on the social work regulators to produce guidance for using AI and on the government for centrally regulating AI,” he said.

“This would put protections in place for social workers and the people and families they support, as this technology has known issues, including biases, presenting false or misleading information as fact, data governance and growing concerns about environmental impact.”

The British Association of Social Workers (BASW) also called for the regulation of AI, along with a national framework of ethical principles for its use, to ensure accountability to citizens and to uphold human rights.

‘We need to apply the brakes on AI’

Christian Kerr, senior social work lecturer at Leeds Beckett University, questioned whether local authorities had carefully considered AI’s implications for privacy and human rights.

“We need to apply the brakes on AI, or at least slow down considerably, to allow the social work regulator, our professional association, education providers and practising social workers to come to grips with the myriad ethical implications and challenges,” said Kerr.

“From my interactions with social workers across the country, it is clear to me that it is practitioners who are leading the ethical debate in local authorities and they need the support of social work leaders, organisations and educators to do that to best effect.”

Swindon’s adult social care privacy notice states that individuals are informed when Magic Notes is used for the recording of calls, with an option to opt out, and that personal data is automatically deleted after a maximum of one month.

Beam has also confirmed that no data is used to train AI systems and that the tool underwent data protection assessments prior to testing.

‘Social work must not be left behind’

However, while BASW chair Julia Ross acknowledged that AI had “bad bits”, she stressed that social workers must engage with the technology to avoid being left behind.

“We can promote what we do and give our input as social workers, but we’ve got to be there,” she said. “We can’t afford to be left behind. We’ll lose huge opportunities for ourselves, our practice and the people we work with.”

Welcoming the debate around AI in social work, she also urged the sector to take the time to understand its varying applications and potential uses before rejecting it.

“You don’t just get into it hook, line, and sinker,” Ross added. “What you do is adapt that tool and merge it with the emotional intelligence social workers are so good at. If we just stand back and say we don’t like it, then we won’t do ourselves, the profession or the people we work with any advantage.”

“We need to remember that social work operates in the real world and the real world now is an AI world.”

What do you think about the use of AI tools in social work?


18 Responses to AI could be time-saving for social workers but needs regulation, say sector bodies

  1. Victoria October 6, 2024 at 8:01 am #

    This worries me and there seems to be a lot of pressure to push the ‘Beam’ model. Saving time isn’t conducive to promoting social work skills and knowledge. It’s not just about being able to take information in; it’s a skill to sit in a meeting and determine, with one’s professional viewpoint and knowledge, which pieces of information are useful and which are not. There is no reference to permission, with capacitous consent, of all service users whose information is discussed. For me, there are these two aspects which haven’t been considered.
    1. Absolute clarity about consent and the knowledge of people whose information is used this way, so that they know case notes, meetings discussing them etc are going to be interpreted or summarised using AI.

    2. That we risk dismissing the professional skills and learning gained through summarising notes and interactions and making choices, and the need for writing good notes to be understood as a key social work skill which aids reflection and decision making.

  2. Jill Honeybun October 7, 2024 at 2:51 pm #

    I’ve just received what I believe is an AI assessment for my son, brain damaged at birth. Really not worth the 75, yes 75, pages it’s written on. I’m a former social worker with a Business Studies degree, so I ran a niche business and club for 20 years, and I’m skilled in desktop publishing as I wrote the club magazine. I don’t know who designed this plan but, to put it kindly, it’s a heap of rubbish. There are even empty pages. No one is ever going to read to the end. Plans should be clear and as short as possible, so that anyone involved with my son can read them reasonably quickly and understand the key issues.

  3. Sally Pepper October 8, 2024 at 7:02 am #

    I’m a mental health social worker not currently using AI. I cautiously welcome AI as a time saving tool, but with reservations.

    I’d love to have more conversations with different people and process good actions more quickly, if that’s what it delivers. I like the idea of editing a transcript which has gathered some main points for me, especially if a day or two has to pass between talking and writing. I think it will inevitably make face to face meetings shorter – we’re not going to want to generate too many thousands of words. Maybe that’s a good discipline, some of the time, where good relationships are still built.

    I’d like to know that the unedited AI recommendations are definitely not going to remain on the system as a standard to measure my work by. I hold a lot of risk, for example. If AI is recommending a risk averse stance with someone who I want to encourage to be independent, I want conversations with my manager to continue to arbitrate that. I want to be accountable for my actions, not my actions compared to the AI.

    I am concerned that AI may lead to greater risk aversion, counter to a strengths-based approach and human rights. What’s AI going to recommend when a person frequently self harms and has suicidal intent? A worker would have to regard the AI confidently and not fearfully.

    I’d like to think social workers could be part of the design process locally. I don’t think it is enough involvement for tech specialists to build it and invite social workers to feed back afterwards.

    I’m interested to know how it would work in MDT meetings. Where professionals present are working to the medical or criminal justice model, how well can AI separate these views from the values we work to?

    Fascinating. If it can help me to focus my mind on the most difficult dilemmas by spending less time on things that are more straightforward, I’m all for it. It’s a complex system though, becoming more complex, with all the hazards and ethical problems that entails.

    • Mrs H October 11, 2024 at 4:29 pm #

      AI won’t recommend anything, or do your job for you. It only records conversations & meetings, providing a summary or completed assessment, ready for you to check its accuracy and then choose to upload into your case recording system. It records and transcribes. Nothing more, but it should save the time taken on case recording and writing assessments etc.

      • Tahin October 12, 2024 at 9:18 am #

        Actually AI has been used to predict and highlight behaviours supposedly ‘missed’ by social workers. Even Community Care had a feature on it. AI isn’t just transcribing software. Never be seduced by supposed time-saving technology. Social work is in the mess it is now because leaders are beguiled by ‘innovation’ and fads, never thinking beyond the first flushes of playing with their new toy. Electronic notes were meant to free up social workers to spend more time out of the office. Result? More time spent on admin in front of a screen. Pro formas were meant to make identifying care needs more efficient and equitable. Result? Bureaucracy deskilling social workers and rationing of care. Trying to predict our human reactions in what is a human relationships profession is not social work. Championing it might get you promotion and nomination for undeserved Social Worker of the Year baubles or, swoon, an MBE, but that doesn’t make you a social worker either.

      • Sabine October 12, 2024 at 9:38 am #

        So it might not save any time, as it could get its assessment/recommendations wrong. Where would that leave you as a practitioner?
        I just hope that no one ends up in a situation where managers say ‘well, do what the AI says’, with the practitioner thus retreating into a place of avoidance of inter-human interaction and exchange. And who reads a 75-page transcript? Unless the allocated social worker does a cut and paste on the document to summarise it himself.

  4. Lee October 8, 2024 at 10:33 am #

    Like all technology it can be used for good and bad reasons. With assessment writing, this is an interesting development that could reduce time spent on such activity. However, AI-like software can also be used unethically, as with Microsoft Teams transcribing meetings without the individuals involved being informed, and not accurately documenting such important meetings. Transcription cannot be picked up by one’s own organisation’s IT department, whilst recording can.

    This is only the start, as the new Microsoft laptops include AI voice-replication software, which is concerning for us as employees: voice messages from years previously could be made to appear and be used as evidence to embed victimisation in the workplace.

    Pandora’s box scenario

  5. June Ross October 8, 2024 at 11:15 am #

    I am retired but have 25 years’ experience as a social worker, employed in two hospitals (for the MOD!) and then in the community, always working generically. My reviews were praised by the Inspectorate for their clarity. I often felt that writing up my notes helped me clarify my thinking, particularly about complex situations. My concern about AI is that we as a species will lose key elements of brain function. I am not alone in thinking that empathy is dying out…

  6. dk October 8, 2024 at 2:54 pm #

    I’m not opposed to AI being used to assist with recording (and not on principle in assisting with decision making, although that is a much thornier problem), but I think there is a fundamental thinking error in constructing note taking or case recording as purely an after-the-fact administrative task. It is often the act of writing notes that prompts the thinking and reflection that prompts action, that moves practitioners from simply recording “what” to thinking about “so what” and “now what”.

  7. David Gaylard October 8, 2024 at 10:01 pm #

    Despite still being a registered practitioner, I’ve worked in HE for 18+ years teaching practitioners across numerous u/g and p/g programmes, so I cautiously see some limited potential benefits of AI usage within SW practice, but with clear guardrails.
    Such well-meaning innovations still require careful regulation. Such technical advances should not replace crucial professional reflection, judgment and decision-making. Otherwise, what’s the point of becoming a registered professional if reductive, time-saving prefixed words or prompts alongside set algorithms determine complex SW decisions and recording? Historically, such progressive innovations create ‘unintended consequences’, with the potential danger of creating unthinking bureaucratic practitioners who then won’t require a professional salary. Similar AI pilot schemes are also operational across our legal system, partly due to the enormous backlog of cases…

  8. Clare Stone October 9, 2024 at 8:28 am #

    I have just finished analysing data from my empirical research into social workers’ experiences of using AI. I found a general lack of understanding and discussion across my sample, with a very small number who are engaging with AI, but it is almost underground because they don’t know how social work employers feel about AI.
    There are many ways we can harness AI to enable social workers to spend more time on the relational aspects of social work (but all of the cautions already identified in this thread are valid).
    I am advocating for AI strategy groups, so I am really pleased to read this article, which promotes the same. We have lots in the university environment about using AI, but this has not previously been considered for the workplace. I am therefore developing guidance for Practice Educators who support our students.
    It is great to be part of this discussion.

  9. Jimmy Choo October 9, 2024 at 1:36 pm #

    Love being able to use Copilot. As someone who struggles to type ‘professionally’ and struggles with spelling, Copilot helps my sentences remain concise without jargon. I save loads of time as well. You can’t ask Copilot to write the body of text without putting the information in first, so I fail to see what is unethical. Unfortunately, after telling my employer the benefits, they deleted it! More for them in overtime payments, courtesy of the taxpayer, unfortunately.

  10. Ndiho October 9, 2024 at 11:10 pm #

    There are a range of AI tools that could be very useful. I was averse until my 22-year-old advised me to ask ChatGPT about something I was struggling with. So I drafted something and asked it to improve on it. Bingo: I got an improved version, the same information but written more succinctly. On another occasion I asked it to provide some context for a presentation I was going to make. This time I got too much information, and some of it was complicated, so I requested the same information but pitched at the understanding of a teenager. Sure enough, in seconds I had that information and understood clearly what I needed to make reference to. I made a further request to cut the information down to five key points and again I got that in seconds. In short, let’s engage with it and, yes, consider issues of data protection etc, but let us not throw the baby out with the bathwater, else we will get left behind.

  11. Tahin October 10, 2024 at 6:55 pm #

    Will there be an AI of the year added to the Social Worker of the Year shindig?

  12. Kudakwashe Kurashwa October 11, 2024 at 2:42 pm #

    I 100% agree with Julia Ross: social workers must not be left behind. We need to engage with AI and work on addressing the bad bits of the technology. All other sectors of society, eg finance, defence and engineering, are using AI. Why not social workers? If we do not embrace change, change will change us. I am ready for the future of social work, with AI and other disruptive technologies, such as fintech, as part of it!

  13. Tahin October 13, 2024 at 9:58 am #

    Social workers would do well to study history once in a while and remember its lessons. The Enclosure Acts were justified as necessary for more ‘efficient’ agriculture, but they drove peasants off the land and destroyed rural communities when displaced peasants flooded cities in search of work. Child labour and the exploitation of women in cities, with suppressed male wages to drive the Industrial Revolution, followed. Overcrowding, disease, death and alcoholism were not a price worth paying for not being “left behind”. Industrialisation led to laws prohibiting property ownership to most citizens too. Read your Dickens if this is too boring for you. AI created the drone technology that kills and maims thousands in Ukraine, Gaza, Sudan, Yemen and now Lebanon. Arms manufacturers and mass data harvesters like Google, Facebook, TikTok, supermarkets, ‘loyalty’ card providers and the like are the main drivers of and investors in AI. They are not being benign, enabling progress and efficiency; they are exploiting us to maximise profit while avoiding paying tax. Think about what information you are manipulated into giving away every time you order on Amazon, go online or do physical shopping, if all of this seems far-fetched. Just like subsistence peasants, mill workers, shipyard labourers, coal miners, doorstep milk deliverers and so on, social workers, if AI does not replace them all, will be further de-skilled by it. As a consequence, employers will justify reducing our numbers. Workers seduced by promises of technology making work less burdensome and ‘efficient’ always realise too late that, just like millions before them, it actually makes them redundant in all senses of that word. Treating AI in the workplace as if it’s an ‘exciting’ games console is a price most might think worth paying, but not me. Don’t be fad driven, be values driven. That upsets your employer but not necessarily the people who use services, who hope to find some human warmth in them, even if the relationships we develop come with some frailty. As for ‘sector bodies’, just because they might have social work stitched on to their bureaucracies doesn’t make them advocates for social work. Functionaries rarely live up to their own hype.

  14. Fab October 13, 2024 at 3:06 pm #

    It is striking how we often assert that AI assists social workers who are not native English speakers. This perspective can be inherently discriminatory. It seems as though there is a pervasive belief that non-native English speakers cannot achieve a high level of proficiency in the language. However, being a non-native speaker does not preclude one from writing English at an exceptional standard. In fact, many native English speakers exhibit poor linguistic skills.

    The implementation of artificial intelligence across all councils and sectors of social work is essential; it has the potential to save time and significantly enhance productivity.

Trackbacks/Pingbacks

  1. Social workers split over impact of AI on professional skills though usage remains low – Recruitology Careers Blog - October 25, 2024

    […] from the British Association of Social Workers (BASW) and the Social Workers Union for government to regulate AI and address ethical concerns, such as around privacy, bias and quality of […]