A target often proves a mixed blessing. The row about the effect
of hospital accident and emergency waiting time targets on planned
operations has been followed closely in the press, and the annual
publication of school league tables is always the subject of public
criticism. Partly as a result of this criticism, fewer targets are
now promised for hospitals and schools.
Despite this negative publicity, the Audit Commission report,
Targets in the Public Sector, strikes a confident note about the
future of performance management. But not much is heard about
social services performance targets. Few deal with life and death,
or concern matters the general population knows much about – hence
the lower profile. Within most social services departments,
however, debate is sharp.
The Department of Health (DoH) performance assessment framework
requires each social services department to report annually on its
performance against 51 performance indicators (PIs) and set targets
for the following year. These PIs are taken into account in the
annual star-rating exercise.
The framework, which has been in operation since 1998, has played a
huge part in changing front-line work conditions and the way
priorities are set in social services departments. Most PIs are
agreed to be reasonable measures of success, although in the main
they measure organisational inputs rather than outcomes for service
users. Expecting social services departments to report on the
number of service users being sent a care plan, the number of
carers’ assessments done or the number of children looked after
having a dental check is clearly reasonable. Many departments have
invested heavily in systems and processes to collect and manage
these data, and are the better for it.
But, as with other professional public sector activity, criticisms
remain. Those from operational staff centre on three themes.
First, some PIs are perceived as perverse. For instance,
improvements sometimes have poor consequences for other operational
imperatives. One is the PI for the number of looked-after children
who have three or more placement changes a year – the national
target being 16 per cent at the highest. But social services
departments are exhorted to set the number of looked-after children
on their books at a particular level. Departments starting from a
high base are moving their settled, stable children off statutory
orders. But the result is that the proportion of children who are
less settled, and have more than two placement changes a year, is
bound to rise. This PI is a “key threshold” – against which
performance must reach a certain standard before a higher star
rating can be allocated. But achieving a reasonable number of
looked-after children is a strategic priority. What is a director to do?
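The arithmetic behind this perverse incentive is worth spelling out: moving settled children off statutory orders shrinks the denominator while leaving the unstable placements in place, so the percentage rises even though nothing has got worse. A minimal sketch, using invented figures purely for illustration:

```python
# Hypothetical illustration of the denominator effect on the
# placement-stability PI. All numbers are invented for this example.

def placement_pi(unstable: int, total_looked_after: int) -> float:
    """Percentage of looked-after children with three or more
    placement changes in a year."""
    return 100 * unstable / total_looked_after

# Before: 300 looked-after children, 45 of them with 3+ moves.
before = placement_pi(45, 300)   # 15.0% - inside a 16% threshold

# The department moves 60 settled, stable children off statutory
# orders. The unstable group is untouched; only the denominator falls.
after = placement_pi(45, 240)    # 18.75% - now breaches the threshold

print(f"before: {before:.2f}%, after: {after:.2f}%")
```

The same reduction in looked-after numbers that one target rewards pushes the stability PI past its "key threshold" with no change in any child's circumstances.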
Second, the profile of PIs is sometimes said to skew managerial
effort away from planned developments into activity to improve
performance against isolated PIs. Further, the efforts can be out
of all proportion to the real importance of the activity measured.
In my authority, we faced particular issues last year with two PIs,
one about care leavers and one about care plans for adult service
users. Pushing both of these up the scale was regarded as an equal
priority, and work was done on both. One affected 112 service users;
the other, 37,500.
Both these criticisms have some force, but they are merely background
noise compared with the key issue, for most PIs relate unequivocally
to important measures and most are not perverse. The
examples of dodgy detail above are used to attack performance
management because of the primary criticism that underlies all
others – that front-line workers commonly experience performance
management as an imposition from above, as government bureaucracy
that has nothing to do with real improvements for real people.
Social services departments also face more subtle challenges.
A key issue is the timing of the framework’s introduction. This
coincided with the raised expectations of recording laid down by
the DoH guidance, Recording with Care. It is compounded in social
services departments that have introduced service user databases
where social workers enter data directly into assessment and care plans.
Each local authority is charged with developing an electronic
social care record within the year. An integrated children’s system
is being piloted now and the green paper, Every Child Matters,
envisages sophisticated data exchange systems, implying further
direct use of IT. So front-line staff and first-line managers have
responsibility for data entry and the quality of data on a scale
that was unimagined even five years ago. “I didn’t come into this
job to be a typist” is the response of many. Indeed for many their
job has changed beyond recognition.
A further subtle problem is that, because of the speed of these
changes, senior and middle managers no longer have any personal
experience of the day-to-day working pattern of their front-line staff.
All this adds up to a degree of front-line scepticism, which is the
key challenge to the incorporation of performance management into
everyday practice.
Then there are the unhelpful definitions – most of which
operational staff will be unaware of. The most spectacular is the
residential care loophole. Helping frail, elderly people to stay in
their own homes is better for them than placing them in residential
care, so the DoH expects admissions to be low and falling. But
people admitted to long-term residential care immediately after a
short-term stay are not counted as long-term admissions for the
relevant PI. The serpent’s apple in the garden of Eden can hardly
have been more tempting than this piece of bureaucratic magic to
social services departments with high residential placement rates.
We must be thankful this loophole has now been closed.
Finally, the framework is heavily weighted towards the quality of
life of looked-after children, and assessment and care planning for
older people. So staff in these teams can hardly move for PIs,
whereas staff in family centres, residential units and so on are
not directly affected.
Despite these problems, the framework represents a chance for
social services departments to identify what’s important, plan for
improvement, compare themselves with others and pull together
high-level aspirations with the day-to-day work of staff. As the
first five years of the framework come to an end, the DoH has
started a fundamental review of the system. The key to this
review’s success is to engage front-line staff in managing
performance far more comprehensively than has so far been attempted.
Tips for the DoH
- Don’t tear up the script and start again. People are getting
used to this set of PIs, and social services departments have made
huge efforts to collect this information. Most PIs make sense and
fit easily within business plans.
- Using the current output measures as a basis, build upon them
with PIs that measure outcomes for service users. This process
requires face-to-face feedback or surveys, so might be expensive.
This is the challenge. Confront it.
- PIs should be meaningfully clustered. Front-line staff do not
need to know all the PIs, but they do need intimate knowledge of
the half-dozen on which their activity and recording have an impact.
- PIs should be understandable. To achieve general understanding,
the definition of each PI should be subject to the one-minute rule
– can a front-line worker understand what a PI means and how it is
to be measured after hearing it explained in less than a minute?
- Count positives, so staff are conscious of measuring successes
rather than avoiding failures. So, instead of re-registrations on
the child protection register, count successful discharges. And
make sure “very good” performance equates to 100 per cent of
whatever activity is being counted, rather than zero – or worse,
somewhere in between.