Balancing act

Part of the unstinting drive towards “modernisation” of public services has been to require better information about the way they “perform” their legal duties and “deliver” services. The performance assessment framework (PAF), introduced in April 2000, was designed to find these things out.

As PAF counts towards the calculation of star ratings, it’s little wonder that social services departments, among others, are seeking ways to measure and “manage” performance more effectively and systematically. Some eyes have thus glanced anxiously across to the private sector for inspiration. One approach tried and tested in industry is the balanced scorecard (see panel below).

It is perhaps as a collection of statistical information that most people understand performance management – a phrase first coined in the 1970s but not a recognised process until the late 1980s. Yet staff need convincing that the system’s true value lies beyond ticking boxes. And, in part, that also means demystifying a jargon that demands “indicators” apparently “populate” some “domain” or other.

Gaining staff confidence and adapting an essentially manufacturing model for public services have been two of the challenges faced by two social services departments at geographically opposite ends of the country: Durham, which adopted the balanced scorecard in 2002, and Portsmouth, which did so in 2000.

“It is true that it was designed for the private sector,” says Keith Newby, quality and performance monitoring manager, Durham social care and health. “For example, ‘the percentage of retained customers’ may not be an indicator of any value to us, but why should we too not aim for improvements across the identified range of areas?”

At first, Durham used the Kaplan & Norton headings (see panel below) but found it difficult to switch between their headings and the six “domains” that the Department of Health (DoH) use in the PAF: national priorities and strategic objectives, cost and efficiency, effectiveness of service delivery and outcomes, quality of services for users and carers, fair access, and capacity for improvement. “There is absolutely no use in using an inflexible system or a flexible system rigidly. We have kept the principles but changed the domain headings to the DoH ones. In other words we have made the system meet our needs,” says Newby.

These needs mean that staff now look at a whole area of performance and don’t just review themselves against individual indicators. “Looking at children’s services we realised that, while there was much good work going on, we hadn’t addressed the area of ‘quality of services for users and carers’. Making this a focus within the scorecard gave this an added urgency,” says Newby. Seeking user views is now custom and practice across the county: “Surveys are now sent out automatically – everyone who has an assessment will be sent a questionnaire; once a year everyone who uses home care will be asked what they think of the service,” he adds.

Newby believes that top management commitment is essential: “It just doesn’t work if you don’t start with that. They have to be utterly on board.”

This sentiment is echoed by Mike Staniforth, planning, performance & quality manager, Portsmouth social services. “We aimed it first at senior management – this was the best place to start because if they adopt it you’re on to a winner in terms of taking it further.”

As with Durham, Portsmouth started off with the original Kaplan & Norton model and adapted it as the work developed. However, Portsmouth preferred to introduce the model incrementally. “The adult disability sector piloted scorecards first. Teams were encouraged to develop their own set of indicators, based on local priorities, as well as purely national ones, that we thought we could easily measure,” says Staniforth.

However, there was some staff unease, as Staniforth explains: “Some people felt that the approach implied they were being centrally monitored. So there were worries about how this information would be used, who would use it and to what end. But we said to people: ‘This is your scorecard, it’s for your use to track your business.’ We needed to say it was safe and the information wouldn’t imply criticism of them.”

And this reassurance helps defeat the cynicism. Put simply, if anything is to be accepted by staff they need to see that it works for them and is not just something that benefits management. On balance, this is some performance.

Rubbish tips

  • Stick to a performance framework – make your services fit in with what the experts have given you.
  • It’s enough if someone on the top management team is on board.
  • It’s another fad – treat it as such.

Top tips

  • Go with what you’ve got. Don’t try and be too clever about it.
  • Use champions (preferably service areas rather than individuals) to carry the message.
  • Have a good IT network: store scorecards on your intranet.

What is the balanced scorecard?

The balanced scorecard is an aid to organisational performance management. It helps to focus not only on financial targets but also on internal processes.

The scorecard provides a “dashboard” view of overall performance through monitoring key indicators.

It was developed in the early 1990s by Harvard Business School’s Robert Kaplan and David Norton. It suggests that the organisation is viewed from four perspectives: learning and growth, business process, customer, and financial.

Kaplan and Norton describe it as follows: “The balanced scorecard retains traditional financial measures. But financial measures tell the story of past events, an adequate story for industrial-age companies for which investments in long-term capabilities and customer relationships were not critical for success.

“These financial measures are inadequate, however, for guiding and evaluating the journey that information-age companies must make to create future value through investment in customers, suppliers, employees, processes, technology, and innovation.”
