The What Works Centre for Children’s Social Care has developed a prototype diagnostic tool for organisations to understand how well they are using research and evidence in practice and in the management of their operations.
Community Care Inform has been working in this area over the past year with local authority partners, helping them gain a better understanding of how social workers use the research, case law, and legal and practice guidance on Inform in their decision making.
Findings from our research with two good-rated local authorities, which included surveys and interviews with social workers and their line managers, included:
- While social workers and managers were generally comfortable with how risk was being managed within cases and with the quality of practice, this was not because social workers were consistently using research and evidence of best practice. Instead, they were relying mostly on experience and on managers and colleagues to help with their decision making.
- The impact of this was that social workers felt less confident in their decision making, while frontline managers, on whom practitioners relied for support, carried higher workloads.
- The most common barrier to using research and evidence in decision making was competing priorities: social workers felt the time spent finding research and evidence could not be justified when caseloads were high.
Anecdotally, managers will often claim their best social workers are also those who spend time researching and updating themselves on best practice. However, credible, independent and validated evidence of this is hard to find.
While senior leaders would usually agree it is essential their practitioners are up to date and research-informed, the reality is that when departments are under intense pressure, spending time researching a decision is seen as a luxury, not a priority – speed and compliance become the ‘must-dos’ instead.
As a team manager involved in the research pointed out: “People just go into ‘tick box/get it done’ frame of mind – do visits, quickly visit children, quickly get it done…because there’s so much pressure of keeping in timescales.”
The longer a social worker has been in practice, the more able they feel to cut corners and rely on instinct and experience. This approach is then passed on to newer members of staff when they ask for advice.
While this is a high-risk strategy, we could find no data on its general impact on the quality of practice and outcomes for service users, or on whether time invested in research was time well spent. Assessing this would require exploring the links between research-mindedness and the quality of social work decision making, and between those decisions and outcomes – for example, whether or not a family was subsequently re-referred into social care.
It feels like there is a distinct research gap here which could be very usefully filled by the What Works Centre.
Quality assurance systems in councils could be gathering this information, but doing so would require a far more systematic approach than many can afford to take. Our conversations with local authorities suggest most operate random or themed case audits and do not routinely look at how social workers source their decisions, unless very poor practice is picked up.
However, leaving aside the scale of risk in this approach, what our research did seem to show was that it left social workers less autonomous in their decision making and more reliant on frontline managers, increasing the latter’s workloads.
Not a priority
In our research, many social workers said they would like to improve their knowledge and understanding of research and case law but they did not feel they could justify the time spent on it.
This means that while most councils would argue that they are learning organisations, this is often not being felt at the front line.
“… I think they pay lip service, I don’t think the systems allow it. There isn’t so many hours a month for training or learning and if there was, if something came in, that goes out the window. I think that’s been the culture of social work for a very long time. Longer than I’ve been around. I think it’s teetering a bit, not a lot.” Social worker
Some local authorities spend money on ensuring resources such as Community Care Inform are at hand for social workers, but when these go unused it is seen as a failure of the tool rather than a systemic red flag that the workforce is not evidencing its decisions.
Interestingly, one of the local authorities in our study includes a requirement for social worker continuing professional development (CPD) portfolios to be submitted before they can apply for a salary increase or promotion. Those who can show they are researching and evidencing their practice consistently, either through usage statistics or case file examples, are more likely to be promoted or rise up the pay scale, creating an incentive for practitioners to invest in their knowledge.
The other council we worked with was surprised to find the research showed a distinct gap between manager and social worker views on which learning tools were the most effective at improving practice. For example, 95% of managers in this authority said organisational policies and procedures had a positive impact on practice, while only 68% of social workers agreed.
It feels like another research gap in the sector is understanding the effectiveness of particular learning strategies and tools and their impact on practice.
Based on this small-scale study and our experience of how social workers use Community Care Inform, these are our tips for senior leaders to embed research and evidence into social worker decision making:
- Don’t look at the issue in isolation or as one only related to learning and development budgets. It must be part of supervision, audit, organisational structures, retention strategies and social work methodologies.
- Ask yourself how you will know if social workers are using research and evidence in their decision making. Think of your existing data collection systems – can they be adapted to include indicators that give you information on this?
- What are your metrics of success? Incorporate questions around research and evidence into audit systems, analysis of court cases and feedback from social workers and service users. Ask managers to classify the proportion of their direct reports who are regularly using research and evidence in their day-to-day decision making.
- Consider the variety of learning options you provide – if time is short and a department is under-resourced, full training days may be unfeasible and costly. Can micro-learning options be used instead, such as quick quizzes, podcasts to listen to on drives between visits, videos to watch on commutes, printouts to read over a coffee, or 10 minutes devoted to learning in group supervision discussions?
- If it is a departmental priority, how will this be demonstrated to your staff, and how can you incentivise them to fit it into their day? Consider options such as protected time for research, team learning days out, monthly learning topics, incorporating learning into requirements for promotion, or reward systems for those who are researching and evidencing their practice consistently.
- Find ways for social workers to understand their own learning needs and knowledge gaps – online quizzes and role plays can often be effective.
We will be continuing to support our local authority partners in this area and look forward to seeing how the What Works Centre’s diagnostic tool contributes to helping local authorities in this regard.
Judy Cooper was editor of Community Care until September 2018 and now works as a consultant with the Community Care Group, developing products and content for social work leaders