One heavy load

There were slices of birthday cake all round
when a report on the first 100 joint reviews of English and Welsh
social services departments was published last month.

While Audit Commission and Social Services
Inspectorate inspectors celebrated producing their fifth overview
report, some councils could be forgiven for not wanting to join in
the party.

This is because Delivering Results
revealed that almost one-third of local authorities visited by the
joint review team since reviews began in 1996 were found to have
“uncertain or poor prospects”.1 Just 8 per cent of
councils were considered to have excellent prospects, with the rest
rated as promising (News, page 14, 18 October).

The last chapter of the report highlights the
challenges facing joint reviews and contains the admission that the
joint review team itself is “keen to reappraise its approach”.

It says: “While feedback has recognised that
joint reviews have been a significant influence on the wider
inspection framework, it also urged the team to adapt and shift its
own horizons in response to the changing environment in which it is
now operating.”

After the report’s publication the Department
of Health and the Welsh assembly commissioned a review of joint
reviews and their role in the future arrangements for monitoring
the delivery of social care. The review has now been completed by
an independent consultancy, although no date has been set for the
publication of its findings and recommendations.

The joint review team is undoubtedly eager to
improve the way it monitors and assesses the work of social
services departments. But what do local authorities think of the
joint review process? Do joint reviews measure what they set out
to?

Kate Page, strategic director of neighbourhood
services at Milton Keynes Council, is sure they do. Her local
authority has just been joint reviewed and is awaiting publication
of the inspectors’ final report and she describes the experience as
“thought-provoking”.

She says: “The reviewers came with a clear
script of what they had to do and their draft report shows that
they have kept to that. The outcome is a logical progression of
what they set out to do.”

A joint review team must get through a great
deal of work in a short period, according to Mike Morris,
Shropshire Council’s head of service for business support. Two reviewers
inspected the authority during November and December last year and
published their final report this June.

He says: “In terms of the time and resources
available to each reviewer they do achieve a lot in a relatively
short timescale.”

Peter Gilroy, director of social services at
Kent Council, says the way reviews measure councils has become more
sophisticated. “It is always difficult when people mirror back to
you what they think your service is like, whether the feedback be
good, bad or indifferent,” he says.

But do joint reviews offer value for money?
The joint review team spends an average of £55,000 when
reviewing an English council. This covers all the costs from the
receipt of the council’s position statement on how it thinks it is
performing to when the team publishes its final report
approximately 10 months later. In Wales this increases slightly
because joint reviews are published in English and Welsh.

For local authorities the cost of a joint
review can be much higher. Anthony Douglas, executive director of
community services at the London Borough of Havering, says the east
London council spent £100,000 on its joint review. This includes
£70,000 on three staff seconded to work on it full-time and
£10,000 on producing the position statement before the review
started.

The review’s final report was published last
month and stated that staff have “an obsession with the front
line”. Douglas says the review “was a catalyst for change” and
resulted in the council backing a £750,000 action plan setting out
what the social services department needs to do next.

Gilroy admits Kent spent “hundreds of
thousands” on working on its joint review, which was conducted
between December 2000 and February 2001, including the time staff
spent on it. He describes it as a “productive but painful exercise”
and says repeating the process across the country is not good value
for money.

He says: “Reviews need to be shorter and
focused on a number of smaller but important issues.”

The impact of a review on front-line staff –
particularly a negative one – can be enormous, says one adult
services manager who wishes to remain anonymous and whose authority
received a damning report in 1997. She says: “We knew we weren’t
doing brilliantly but we didn’t think we were that bad. It was
demoralising and unhelpful for staff.”

And the reaction of management was not much
better: “For management it was an obsession, as they felt it was
them personally being inspected.”

Being inspected was difficult for front-line
staff, says a social worker working with vulnerable adults whose
council was reviewed this year. He explains: “As a team we felt the
reviewers already had preconceived ideas about the borough because
of some of the negative things they said.”

Despite this, he believes joint reviews are
still a valid method of measuring a council’s progress. “In some
ways a review helps front-line staff see the bigger picture because
you can just get caught up with what you do. You are a tiny cog in
a big council wheel and the review lets you see you are more than
that.”

The reviewers who visited Shropshire Council
were “extremely professional and capable people”, Morris says. “We
were able to have a very open and frank dialogue with them
throughout the process. They listened to what people said and when
they had questions or needed further explanation they quickly made
us aware of this.”

One area where the joint review team could
improve is its engagement with service users, according to the
social worker with vulnerable clients.

Reviewers at his council had said they might
want to meet his clients but did not confirm this, so clients were
left waiting for appointments that never materialised.

The adult services manager thinks that not
enough users are engaged by joint reviews, and those who are
happy with the service are unlikely to go out of their way to say so.
She adds: “Such a sample size is a dangerous thing for a solid
base.”

No council wants to be named and shamed by a
joint review team, and some councils may feel pressure to make
everything appear hunky-dory.

There is no more vivid example of this than
when Haringey Council’s social services team was visited by the
joint review team in March and April 1999. When its findings were
published in November 1999, it reported that the local authority’s
child protection service seemed to be “safe” and “sound”, despite
low spending and high caseloads. When Victoria Climbié died three
months later, it was obvious that either things had rapidly
deteriorated over the previous year or the joint review had
made a mistake.

It is not the joint review team’s intention to
make councils pretend all services are doing well, says head of
joint reviews John Bolton. He says such a reaction depends on how
the council responds to being inspected: “Many local authorities
approach joint reviews in an open and honest manner, as a way of
getting an audit. Others want to put on a good show.”

He adds that sometimes social services
directors feel vulnerable about their own positions during a review
because they think that they are being judged: “We try and be
sensitive to this and recognise it.”

Bolton says the rethink of joint reviews was
required because of the need to reflect changes in the provision of
health and social care services in the review process.

He says: “It is really important that we leave
authorities with a change agenda they can recognise and it is one
that can help drive them forward.”

1 SSI, Audit Commission, National Assembly for Wales, Delivering
Results – Joint Review Team Fifth Annual Report 2000-1. The report
is available to download in pdf format from
www.joint-reviews.gov.uk/annrep01.html