Background

The four mental health services (First Steps, CAMHS Disability, Specialist CAMHS, and Off Centre) within the City & Hackney CAMHS Alliance each report on outcomes to the commissioning body quarterly. However, each service has been interpreting the outcome questions in the quarterly report template differently, making it difficult for the commissioning body to compare outcomes across the Alliance.

For example, one of the outcomes questions is ‘Number of initial questionnaires completed’; this question has been interpreted by one service to mean ‘Number of initial outcome questionnaires completed’ while another service interprets it as ‘Number of clients who completed at least one initial outcome’. Both interpretations provide useful information, but the different services are measuring and reporting on different things.

In addition, the information produced for the quarterly report is not clinically useful for the services to reflect on. For example, the report requires each service to report the number of initial forms and the number of follow-up forms completed in a quarter, but these two questions do not capture the same population, as many service-users do not complete a time 1 and a time 2 outcome measure in the same quarter. Clinically, it would be more useful to know the number of clients who completed both a time 1 and a time 2 measure over the course of their treatment, i.e. the numbers and percentages of paired measures being captured by the services.
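The difference between the two counting approaches can be sketched in code. This is an illustrative example with made-up client records (the record structure and quarter labels are assumptions, not the services' actual data model): quarterly form counts can split a client's time 1 and time 2 measures across two reports, whereas paired-measure counting follows the client over the whole treatment.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical client records for illustration only.
@dataclass
class Client:
    name: str
    time1_quarter: Optional[str]  # quarter the initial (time 1) measure was completed
    time2_quarter: Optional[str]  # quarter the follow-up (time 2) measure was completed

clients = [
    Client("A", "2017-Q3", "2017-Q4"),  # paired, but split across two quarters
    Client("B", "2017-Q4", "2017-Q4"),  # paired within a single quarter
    Client("C", "2017-Q4", None),       # initial measure only so far
]

# Current quarterly report: counts forms completed within one quarter, so a
# client's time 1 and time 2 measures may appear in different reports.
quarter = "2017-Q4"
initials = sum(c.time1_quarter == quarter for c in clients)
follow_ups = sum(c.time2_quarter == quarter for c in clients)

# Paired-measure view: clients with both a time 1 and a time 2 measure at
# any point in their treatment, regardless of quarter boundaries.
paired = sum(1 for c in clients if c.time1_quarter and c.time2_quarter)
paired_pct = 100 * paired / len(clients)
```

In this toy data, the 2017-Q4 report would show two initial and two follow-up forms, but those counts describe overlapping-yet-different groups of clients; the paired count (two of three clients) is the figure that speaks to treatment outcomes.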

The commissioning body was keen that an operational definition for each question be agreed to address the problem of interpretation, and that the questions be revised so that reporting was no longer a tick-box exercise but also produced useful information for the services.

What did we do?

The process is ongoing. It began with discussions with the commissioner, who asked the Assistant Psychologist to draft more clinically useful outcomes questions and an operational definition for each question. That stage involved researching CORC and CYP-IAPT reporting guidelines to ensure that the new questions comply with CORC and CYP-IAPT data collection requirements. The CORC Regional Officer was very helpful in suggesting best practices for reporting on outcomes, so that the questions are in line with what CORC has found to be clinically useful information to report to commissioners.

The drafted questions have now been taken to the Clinical Leads for discussion. The Leads will consider whether the questions are appropriate to each of their services and, if so, whether the services have the time and resources to report on them. For example, one suggestion is that each service report on reliable improvement across paired measures for recently discharged patients; however, some of the services lack the informatics structures and staffing to gather this information efficiently.
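One common way to operationalise "reliable improvement" is the Jacobson and Truax reliable change index (RCI), under which a score change only counts if it exceeds the measurement error of the instrument. The sketch below is a generic illustration, not the Alliance's agreed definition; the baseline standard deviation and reliability values would come from the published norms of whichever outcome measure a service uses, and the figures here are invented.

```python
import math

def reliable_improvement(time1, time2, sd_baseline, reliability, higher_is_worse=True):
    """Jacobson & Truax reliable change index: the pre-post change divided by
    the standard error of the difference. |RCI| > 1.96 indicates change
    unlikely to be due to measurement error alone (p < .05).
    sd_baseline and reliability (e.g. test-retest) are properties of the
    outcome measure, taken from its published norms."""
    s_diff = sd_baseline * math.sqrt(2) * math.sqrt(1 - reliability)
    rci = (time2 - time1) / s_diff
    # On most symptom measures a lower score is better, so reliable
    # improvement is a large *negative* RCI.
    improved = rci < -1.96 if higher_is_worse else rci > 1.96
    return rci, improved

# Illustrative figures only, not real service data:
rci, improved = reliable_improvement(time1=24, time2=12, sd_baseline=6.0, reliability=0.85)
```

This is why reliable improvement is a stricter bar than raw improvement: a client whose score drops by only a point or two "improves", but the change may sit within measurement error and so would not count as reliable.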

Who was involved?

The project involves engagement from the commissioner, the Clinical Leads in each of the CAMHS Alliance services, and the CAMHS Alliance Outcomes Assistant Psychologist, and is aimed at helping management within the services as well as the commissioning body.

Key learning

  • There are many ways to collect and report on routine outcome measures
  • It can be very useful to have an open conversation with commissioners around outcomes and discuss with them the clinical usefulness, and the limitations, of collecting outcomes data

Next steps

  • The Clinical Leads will finalise the new quarterly report outcome measures
  • A meeting will be set up with the commissioner to discuss the questions. In particular, services currently report 'improvement' on outcome measures, but CORC has suggested that they should report 'reliable improvement'; this change is likely to make services look worse, so it must be properly explained and discussed with the commissioning body before any changes are implemented.
  • Hopefully the new questions can be decided on prior to the next quarterly report (end of March 2018).
  • After implementation, the hope is that commissioners will have clearer data to interpret and that the data can be useful for services to reflect on their strengths and areas for improvement.

Many thanks to Phoebe Neville for providing us with this case study.