An assertive outreach team from the voluntary sector evaluates itself

A trailblazing voluntary sector assertive outreach team has passed its 10th birthday – a good time to take stock and involve service users in evaluating its effectiveness. Graham Hopkins reports

Assertive outreach – providing specialised, co-ordinated and flexible support and treatment in the community – seems such a staple part of mental health work that it is surprising to remember that most teams have only been operational for five years or so.

However, in Norfolk, the Julian Housing active outreach team has already sailed past its 10th birthday. “The team was set up in August 1995 in response to the emerging need to provide support to people with whom other teams had difficulties engaging,” says team manager Ben Curran.

Assertive outreach began life as assertive community treatment in the US, and service models from there and Australia provided the inspiration. “There was an emerging recognition that engagement was dependent on a more needs-led approach requiring increased flexibility from staff,” adds Curran. “The model called for smaller caseloads and a team approach as opposed to individual caseloads. Staff used social activities as a vehicle for engaging and building trust with service users.”

With such a milestone reached it was opportune to commission an independent service user evaluation. “We wanted to learn from the people we worked with, their experiences of having contact with a service for up to 10 years,” says Curran. “As many other assertive outreach services were approaching their fifth birthdays, what lessons could be learned about providing a service to people beyond the five-year mark?”

Recognising the danger in trying to encourage service users to conform to a set of outcome measurements that might limit the information, the team chose a more qualitative approach. “Many of the people we worked with had fallen between services as a result of an inflexible approach that had failed to respond to their needs. It seemed in keeping with the culture of the service to put it back to the people we worked with to define what they saw as the issues,” he says.

The evaluation, conducted by a student social worker on placement with the team, focused on the improvements the service had made to people’s lives. “We also wanted to acknowledge that there may be areas where the team had prevented people changing, such as by creating over-dependency,” says Curran. “We wanted to learn why people still maintained contact after many years and if service users felt that there was a future when they might not wish to have contact with us.”

The outcomes were largely positive, with nearly all service users agreeing that the team had improved their lives. As one said: “I was a drug addict, using needles, acid. I am now in a band, a guitarist.”

The diversity of responses highlighted that people change and grow at different paces, so there can be no uniformity in how support is offered or for how long. Indeed, service users tended to take a short-term view of their need for support, while support workers often held the long-term view of building on an individual’s strengths and aspirations.

For Curran, the process of enabling longer term service users to broaden their network of support into the community, away from mental health services, remains crucial. “This must be balanced with supporting them to have access to the team at times of need,” he says. “The team views engagement as a long-term relationship, and the process of supporting people to move on from the service should be approached with equal planning and flexibility.”

But it was the process itself that might well spur the team’s direction. Curran says: “The experience has stimulated a wider discussion among the team as to how we evolve an outcomes measurement tool that matches people’s needs and experiences.

“This is a challenge for all assertive outreach teams as outcome measures often focus on ‘measures of success’ that are subjective and difficult to interpret in terms of hard data, such as improved functioning and improved quality of life, but also have a tendency to be overly service-focused with less attention paid to service users’ views and satisfaction with the service.”

For a copy of the evaluation contact Ben Curran at

Lessons learned

  • In carrying out the survey it was necessary to be flexible with service user appointments: the student could meet people in their own homes, at the office, at another community resource, or at a venue of the service user’s choice.
  • Equally, the survey needed to be user-friendly: not too many questions; questions that would be easily understood (not in service lingo); questions that were “open” rather than “closed” to generate reflection and conversation.
  • Thinking about engaging service users led to recognition that they could take more ownership of the service and its development. This might include setting up a service user panel.
