Show us the proof

The 40-year history of Turning Point is widely regarded as a
success story. It has developed its activities and extended its
reach, so that what it does today bears little resemblance to what
it did in 1964. Only the basic philosophy – dealing with the person
not the problem – has remained constant as it has grown from
supporting a handful of street drinkers in Camberwell, south
London, to being (in its own words) the leading UK social care
organisation, dealing with up to 100,000 people a year who have
problems with drugs, alcohol, mental illness or learning
difficulties.

It is worth pausing to consider why Turning Point is believed to be
successful. If judged by its growth – with turnover doubling in the
past five years – it scores highly. But how far is growth a measure
of success? What about the effectiveness of its services? Should
that be taken into account and, if so, how?

Turning Point can cite research showing that some of the people it
has sought to help have responded positively when asked about their
contact with the organisation. It monitors its services closely and
reports to several regulatory bodies. But it seldom commissions
formal evaluations and I could only find one on its website – a
small qualitative study of mental health outreach services in two
English counties.

How far is it possible to trace cause and effect, isolating the
impact on an individual of Turning Point’s services from the impact
of other things happening at the same time? Can we tell whether it
is more or less effective than other service providers? As Turning
Point has grown, so have the numbers of people with problems
related to drugs, alcohol, mental health and learning difficulties:
does this indicate success or failure?

I ask these questions not to put Turning Point on the spot, but to
show that “finding out what works” – which is also the title of a
new King’s Fund report (1) – is a tricky and contested business.
The report challenges assumptions about how evidence is gathered
and used, and how it influences policy and practice.

In one of its recent publications, Turning Point says that, just as
private companies are judged by their financial results, so
“charities must show clear evidence of the benefits they provide to
the individuals and communities they work with”. But has Turning
Point followed its own advice? An army of admirers would say yes.
But many researchers, civil servants, policymakers and potential
funders could, if they chose, argue that there is little robust or
verifiable evidence of benefit to individuals and communities.
Where are the findings from randomised controlled trials? From
multi-method evaluations, systematic reviews, longitudinal studies,
time-series analyses and so on?

Like many other charities, Turning Point has won an enviable
reputation as a successful organisation without gathering much
evidence that would stand up to uncharitable scrutiny. It has
instead built knowledge over time, learning from the experience of
its own workforce and service users, and guided by inspired
guesswork, expert hunches, political intuition and pragmatism.

Meanwhile there has been a growing tendency in government, notably
since 1997, to favour an evidence-based approach to policy and
practice, using formal and rigorous methodologies, led by qualified
researchers, who may or may not involve front-line workers and
service users in their assessments.

So here are two approaches to determining success. And the
difference may be troublesome for organisations such as Turning
Point, especially if they espouse the idea that charities must
provide “clear evidence of benefit”, without establishing a
consensus about what constitutes evidence and benefit. For, as the
King’s Fund report says, evidence is widely used as a political
tool: “Critics of a particular intervention will demand to know
what evidence it is based on or seek to show that evidence is
absent, weak, inconclusive or inappropriate… proponents…
are far less likely to scrutinise evidence that is telling them
what they want to hear.”

Experts who carry out formal evaluations are often poor at
integrating the experience of front-line workers and users into
their research. Those who work in community-based projects often
have little expertise in gathering and using formal evidence to help
improve practice. What is needed now is a meeting of minds and a
sharing of methods. And less loose talk about evidence.

(1) A Coote, J Allen and D Woodhead, Finding Out What Works:
Building Knowledge about Complex Community-based Initiatives,
King’s Fund, 2004

Anna Coote is director of public health at the King’s Fund
