by Diane Galpin, Annastasia Maksymluk and Andy Whiteford
Regulatory requirements are clear: social workers need to understand and use research in practice if they are to provide effective help – in short, to ensure their practice is evidence-based and evidence-informed.
The knowledge and skills statement for child and family practitioners says they should “make use of the best evidence from research to… support families and protect children”. The Health and Care Professions Council (HCPC) similarly requires practitioners to ‘be able to engage in evidence-informed practice’.
Meanwhile the latest guidance on the refreshed professional capabilities framework (PCF) articulates the expectation that social workers also generate ‘evidence’ to inform practice.
Summarising the main changes made to the PCF from the previous version, it says it contains “more reference throughout to importance of evidence and evidence-informed practice and the inclusion of more reference to ‘evaluation’ alongside ‘research’ as key source of evidence and engagement of practitioners in evidence/knowledge generation”.
While practitioners may strive to adhere to these requirements, busy professionals can struggle to generate research, leading to a ‘research gap’, where large caseloads and restrictive notions of what counts as ‘evidence’ act as barriers to practitioners using their day-to-day experiences to inform research.
How then can the notion of ‘evidence’ be challenged to develop new approaches to knowledge creation – ones that are inclusive of the emotional and subjective elements of professionals’ practice – to provide alternative modes of evidence generation?
What is ‘evidence’?
Bruce Lindsay’s 2007 book, Understanding Research and Evidence-based Practice, discusses evidence-based practice (EBP) in simple terms. Lindsay describes “using the best evidence you have about the most effective care of individuals, using it with the person’s best interests in mind, to the best of your ability and in such a way that it is clear to others that you are doing it”.
This provides a useful and broad perspective. Yet our experience as educators of post-qualified professionals suggests practitioners’ experience, and understanding, of what counts as ‘evidence’ is somewhat narrower.
Anecdotal ‘evidence’ suggests to us that educators and regulators are much more focused on objectivity and measurable outcomes as an acceptable evidence base.
Current frameworks seem to exclude professionals’ emotional and subjective practice experiences as a basis for developing evidence based/informed practice. But shouldn’t such experiences also be considered as contributing to evidence based/informed practice – and if so how might this be achieved?
Emotion and evidence-based practice
Social work practice and research are not neutral activities – they are subject to internal and external realities – yet there is a pervading belief that emotion and subjectivity have no place in supporting EBP.
This seems nonsensical given that social work comprises so much emotional labour. Indeed, to work with compassion surely requires a degree of emotion – without emotion, can compassionate practice even exist?
The authors are not advocates of social work ‘misery memoirs’ or practitioners and educators swimming in a ‘sea of me’ in their research. Instead, we consider the potential of using subjective experience to enhance the evidence base of practice.
We recently had the privilege of working with a group of postgraduate students discussing their day-to-day practice. In this discussion, we would suggest, we were creating ‘evidence’ – despite recognising that it might not fit prescribed frameworks associated with EBP development.
The students’ ‘evidence’ related to how they experience pain and vulnerability while working within harsh systems – and to how they witness pain and vulnerability from service users experiencing those harsh systems.
Our teaching was about how we can equip ourselves to categorise all of this taken-for-granted stuff, which we see every day in practice, as ‘evidence’ rather than just ‘how it is’. Once we recognised it as ‘evidence’, we could give it a name – ‘data’ – and could look at it in a multitude of ways, with fresh eyes.
We could see how the students’ data went far beyond their individual practices, into the organisational and political realms. It was not just about what they did, but also about the context in which they work.
Subjectivity in research and knowledge creation
Unquestioning acceptance of the nebulous notions of evidence-based/evidence-informed practice presented within professional regulatory requirements and capabilities may actually undermine the aim of enabling practitioners to be involved in research, because those notions are overly simplistic and one-dimensional. We therefore suggest additional, and alternative, research methodologies – such as autoethnography – are required to enhance practitioner participation in research.
Whether practitioners or lecturers, we are immersed in a world where an unremitting belief in objective rationality and measurable outcomes is taken for granted as delivering the gold standard for EBP.
We would suggest that what counts as evidence might be more usefully appraised using a suggestion made by David Silverman in 1998.
“If there is a ‘gold standard’ for social and cultural research, it should be: have the researchers demonstrated successfully why we should believe them?” he wrote. “And does the research problem tackled have a theoretical and/or practical significance?”
Such a definition provides more space for practitioners to contribute to research and may support flexibility in current frameworks, which seem to prioritise objectivity and exclude professionals’ emotional and subjective practice experiences as a basis for developing evidence based/informed practice.
Autoethnography: creating new spaces for research generation
An additional, and alternative, research methodology can be found in an autoethnographical approach.
Jadwiga Leigh’s 2013 paper on the development of her professional identity in child protection describes autoethnography as a method that combines characteristics of autobiography and ethnography.
Leigh suggests the author retroactively, and selectively, writes about experiences with the benefit of hindsight. They critically reflect back in time and selectively write about epiphanies that emerge from, or are made possible by, being part of an organisational culture and/or by possessing a particular cultural identity.
Karen Staller, of the University of Michigan, eloquently demonstrated this approach in her 2007 exploration of the interaction between a social worker and a sexually abused child, which resonates with many practitioners’ experiences of the polarity created by presenting objectivity as synonymous with professionalism.
“He speaks about his responsibility to retrieve objective stories from sexually abused children, knowing He holds their heart in His hands,” Staller wrote.
“His need to get an ‘objective’ story is because the alternative is subjective or fictitious,” her piece went on.
But is a subjective story fictitious, and hence invalid? How can he make it an objective story – can he, should he?
Staller’s experience of encountering this exchange provided a ‘trigger’ moment for her and this became her ‘data’. Stop a moment and reflect on the questions above in terms of your practice – what is framing your response to them? The answers form your potential data – so what do you do next?
Staller’s work certainly provided a ‘trigger’ moment for us. It provided the stimulus to think about how we frame research, who and what influences decisions about what ‘counts’ as evidence, and what impact this might have on knowledge creation to inform practice.
By thinking critically about dominant discourses and ideologies that directly influence current practice within higher education – such as the use of metrics to ‘rate’ the quality of our research and our institution – we were able to tease out, and adopt, a different approach to our research and practice, one that is inclusive of the whole of our professional experience, not just the objective fragments that meet a scientific measure of what counts as evidence.
Practitioners cannot unsee what they see and feel on a day-to-day basis – these micro-observations and practices exist within a macro context. Separating the two only serves to decontextualise people’s experiences, leaving their subjective and emotional experiences in a void labelled as ‘invalid’.
Alternative, inclusive and accessible approaches, drawing on practitioners’ emotional and subjective experiences, can provide a rich additional seam of knowledge to inform practice. First, though, we need to figure out what we actually mean by EBP.