Do new star ratings reflect system faults rather than service changes?

As news of the refreshed ratings for England’s social services
departments sinks in, the star system is shining a little less brightly

Of the country’s 150 social services departments, 18 were awarded a
new rating in last week’s refreshing exercise. Of these, 12 ended
up one star better off than they had been when they received their
original rating in May, while six were obliged to return one of
their precious stars to the Social Services Inspectorate.

Although on the face of it this might appear to represent progress
– with twice as many councils apparently improving their
performance as becoming worse – a glance at the reasons behind
some of the upward and downward shifts immediately casts doubt
on such simplistic conclusions.

Questions are being asked about whether some of those departments
which are now the proud owners of an additional star would have
been in that position from May but for a misinterpretation of data
or a data collection formula.

If so, these extra stars do not represent improved performance over
the past few months but a better understanding of the statistics
required. Any councils in this position have, as a result, suffered
six months’ worth of unjustified negative publicity, with all the
knock-on effects that brings for staff morale and recruitment.
Kingston upon Thames director Roy Taylor, although delighted at his
council’s promotion to the three-star category, admits he was
“taken aback” in May when his department was awarded only two stars
just a few months after receiving a glowing joint review.

Problems with data would appear to be at least partly to blame for
Kingston’s lower rating in May and higher rating last week.

“We had to tidy up the data we gave them,” Taylor explains. “There
were aspects of data that we submitted [for the May rating] that
didn’t come out as strongly to them as it should have. We also had
a slight dissonance because we felt we had given the information
but they hadn’t received it.”

Taylor believes that part of the difficulty is the absence of a
reliable national computer system that would allow all social services
departments to collect the same data in the same way, and therefore
become more comparable.

He also believes smaller councils such as Kingston are
disadvantaged by their size because one slight change can have a
big impact on a performance indicator, whereas a small change in a
larger council will always be more easily absorbed.

Keith Murray, director of two-star Leeds social services, is
disappointed that his department was not awarded two stars from the
start, given that it matches or outshines the majority of the
three-star councils on the 11 key performance indicators that are
supposed to act as consistency checks.

But Murray echoes Taylor’s fears about comparing councils’
performances when everyone collates and interprets their data
differently. He argues for data collection to be standardised if
the outcomes are not to undermine the wider ratings system, adding
that inspections do not take place often enough to be able to
verify data submitted.

Ian Wilson, director of social services for London’s Tower Hamlets,
now also a two-star council, believes that his department is one of
a number that were held down a grade in May because a small amount
of data was incomplete or inaccurate.

“We were always a two-star performer on all the performance
indicators and inspection,” Wilson says. The problem was that his
department had under-reported on key indicator C20, which measures
the percentage of children on the child protection register who had
been reviewed.

Unfortunately for Tower Hamlets, Wilson was on holiday when the
call came from the SSI asking for revised data. Instead, the link
inspector spoke to a junior statistician who gave a new number that
was still below the threshold required.

When Wilson returned from leave he was told he was too late to
amend the information, but that Tower Hamlets would receive a
two-star rating in November if performance remained the same.

Camden was another authority held back from its new two-star rating
by submitting “duff data”. Director Jane Held says it is
“regrettable” that her department’s statistics appeared to show
that two of its older people’s homes had not been inspected, when
in fact they had closed mid-year.

Held also raises concerns about the danger of prioritising the
government’s selected key indicators over local needs.

“What I would regret is if we become dominated by performance
indicators, or that in the pursuit of stars we stop listening to
local people,” Held says.

Peter Gilroy, director of the newly promoted three-star Kent social
services department, adds his concerns about the impact of funding
on the ratings process.

“If you are in financial difficulty it’s easy to go up and down
these ratings,” he warns. “The more financial pressure that local
government is under, the more vulnerable we are as directors in
trying to sustain your rating.”

Meanwhile, questions are also being asked about some of those whose
performance has supposedly declined, when their 2001-2 indicators
suggest otherwise.

Somerset Council has just lost one of its two stars despite rising
from 16th to 7th place in the 2001-2 performance assessment
framework indicators published last week.

Social services director Chris Davies blames an oversight by an
inspector who failed to carry out nine unannounced inspections of
children’s homes.

The mistake resulted in the council scoring 97 per cent on that
particular performance indicator. Because it is one of the 11 key
indicators for which a council must meet a specified minimum level
of performance – in this case, 98 per cent – the error ended up
costing the whole department a star.

Davies says the system, which was set up to show people how their
social services department is performing and boost staff morale, is
“failing on both counts” and is misleading and inflexible.

He believes that, being new, the system ought to be open to change
– an opinion the SSI does not appear to share. His attempts to
persuade the SSI to reconsider its decision were met with a
response of “that’s the system and we can’t change it for
individual councils”.

The Department of Health has attempted to justify Somerset’s
downward move by pointing out that there were problems with
inspections of homes for adults and older people, as well as for
children. However, according to the judgements underlying the
council’s star rating, there has been no change in adult services
between May and November, reinforcing Davies’ claim that Somerset’s
star was indeed lost on the basis of a single error.

Meanwhile, Wiltshire social services director Ray Jones says his
department was downgraded from two stars to one partly as a result
of information-gathering problems stemming from considerable
organisational change, including the establishment of the area’s
three new care trusts.

“It is particularly disappointing that our adult care services have
been assessed as having uncertain prospects for improvement when we
are moving ahead with the development of integrated health and
social care services,” Jones says.

In anyone’s estimation, the SSI and DoH have a long way to go if
they are to ensure that the 2003 ratings are not as controversial
as this year’s. The government and the inspectors need to convince
all those in the sector that the system will be robust and
reliable, and that stars will only be lost or gained for credible
reasons – not because a director picks the wrong week to go on
holiday or agrees to pilot the government’s latest joint working
initiative.

On their way up:

Zero to one star: Merton.

One to two stars: Bath and North East Somerset, Camden,
Herefordshire, Leeds, Medway Towns, Rochdale, Stockton-on-Tees,
Tower Hamlets.

Two to three stars: Kent, Kingston upon Thames, North

On their way down:

Two to one star: Bradford, Somerset, Wiltshire.

One to zero stars: Bedfordshire, Waltham Forest, Windsor and
