Councils must be more open about weaknesses in their adult social care performance and reductions in service levels in their annual reports to the public, a review has urged.
Many councils were using local accounts as an “annual report on their achievements rather than a balanced assessment of progress”, it found. Accounts often did not compare budget levels or the numbers of people receiving services year-on-year, despite national evidence that the numbers of people supported by councils and real-terms spending are both going down.
Local accounts are designed to provide residents and service users with information on their council’s adult social care performance, activity and objectives. They are a central plank of “sector-led improvement”, the system under which councils support each other to improve, which replaced annual performance assessments (APAs) by the Care Quality Commission in 2010.
The review was commissioned by Towards Excellence in Adult Social Care (TEASC), the coalition of sector bodies that oversees sector-led improvement, and carried out by former Manchester adults’ services director Caroline Marsh and social care consultant Rachel Ayling. It was based on a stock-take of all accounts published so far (since 2011), analysis of 25 of these and interviews with 11 councils.
Local accounts are not compulsory, but 93% of councils had published one since 2011 and 89% are due to publish one this year, reflecting on their performance in 2012-13. However, the review raised concerns that 19 councils had not updated their initial 2010-11 local account, a problem linked to cuts in the number of staff in performance monitoring roles.
Lack of transparency
The review found that local accounts were not uncommonly difficult to find on council websites, despite being explicitly for public consumption. It also found that reporting on weaknesses as well as strengths was a “cultural challenge” for councils, “with only a minority of councils doing this well”.
It said other authorities needed to learn from this group, and that this should be a priority for TEASC and the Association of Directors of Adult Social Services’ (Adass) regional networks, which lead on sector-led improvement in their areas.
Information on budgets or numbers of service users was often published without context, making it impossible to tell whether spending or case numbers were increasing or decreasing.
It said for the sake of local accountability councils needed to be open about budget reductions and their impact on service users.
Just 11 of the 25 local accounts studied in detail contained hard data comparing performance with the previous year, and there was rarely a clear read across from one year’s local account to the next.
The review said this reflected the fact that councils were still experimenting with local accounts and coming to terms with them becoming a regular part of the performance cycle.
‘Selective use of data’
While most councils have made efforts to make the statistics in their local accounts more meaningful, some included “almost no comparative data” to enable residents to compare performance with other authorities. Others used this data selectively to highlight good performance only. It was still “common” for statistics to be used randomly, without context – “for example, we offered services to 329 carers in 2011-12” – and for positive case studies to be used as a substitute for hard performance data.
It said councils had to negotiate the tension between making their local accounts accessible and including hard performance data. But it said this should be managed by making accounts a “public-facing summary” that had accessible links to hard data on performance.
Positively, it said councils were increasing the volume and quality of their engagement with local residents on adult social care and this was reflected in local accounts.
However, only a minority of councils had involved their local involvement network (Link), the statutory service user representative organisation, in the development of their local account. Links were replaced by local Healthwatch branches in April 2013 and the review said councils were at an early stage of engaging them in local accounts.
Good practice must be shared
The review concludes with a list of good practice features and things to avoid in relation to local accounts. It said that councils should:
- Be transparent and honest rather than edit out bad news;
- Invite independent challenge of the local account, for example from the local Healthwatch or through peer review;
- Include or attach hard data to enable comparison with previous years’ performance, rather than use statistics randomly without making clear whether they are indicative of good or bad performance;
- Ensure that the local account process is widely publicised and documents are accessible, rather than buried within the council’s website;
- Address the need to prioritise the use of resources in their local accounts rather than duck this and other difficult issues.
TEASC published guidance on local accounts in May, containing similar recommendations. The review authors said it would be counter-productive to produce fresh guidance in the light of the review, but they called on TEASC to be clear about where improvements needed to be made and also to disseminate best practice.
The review said these messages should be reinforced through regional Adass networks, and through peer reviews of councils carried out in the coming year. These involve councils being scrutinised on their performance by managers and practitioners from other authorities, and are another key plank of sector-led improvement.
‘We need to be saying what we are not doing so well’
Responding to the report, TEASC chair and Adass president Sandie Keene said the high proportion of councils producing local accounts was good news, but admitted the review findings were a “mixed bag”.
“We need to be saying what we are not doing so well as well as what we are doing well,” she said. “There are issues of transparency and openness.”
She said TEASC was seeking to share good practice on local accounts among councils by identifying those doing well at presenting findings, engaging with local communities, being transparent about performance and linking local accounts to hard data, so that others could learn from them.