We’ve asked our CORC Practice Lead Kate Dalzell to talk more about how our Best Practice Framework can help services with their outcome measurement, and one of our CORC members with recent experience of using the Best Practice Framework - Kim Vassallo, Assistant Psychologist at Hackney Children and Families Service (CFS) - was kind enough to talk to us about the Best Practice Framework staff survey her service recently completed.
CORC: What made you decide to use the CORC Best Practice Framework staff survey?
Kim: When I got to the clinical service, I think what we started to notice immediately was that use of the outcome and feedback measures had generally decreased over the last couple of years. And it used to be quite good. Over time other things came into place and staff members were quite unmotivated to complete outcome measures and do this as a routine thing. Another thing we noticed was that many of the systems we use to capture our data have not really adapted to capture the measures in a way that reflects the type of work we do. So, we were losing a lot of information, and the few times we did collect measures, that information often got lost, which is a shame. So, I felt quite stuck and didn’t know where to even begin, especially since our team changes quite quickly and I was new as well. So, I thought the idea of just hearing what my colleagues thought about outcome and feedback measures generally - their attitudes, experiences and suggestions - would be really helpful. So, I was very keen to give this a try. And I also wanted to highlight areas that needed to be improved that I would otherwise not be aware of.
CORC: How did you find the process of arranging the survey?
Kim: Surprisingly easy, actually. I think often when you’re setting these kinds of things up online, it can turn a bit complicated, but what I found really good was that you already had a really well-thought-out template, which addressed many of the areas that I was interested in, and we were able to tweak it as we needed for our service, deciding what we thought was appropriate and what was not. So that made it a lot easier than having to come up with a range of questions on my own, and I’m sure we would have missed things by doing that. And the fact that you had a system that enabled us to send the survey out via email was also easier on my end. I also think that having an external person set up the survey rather than myself helped in terms of the clinicians’ attitude towards the survey, because they know it comes from a body that is involved with this work rather than a single individual who just made something up on SurveyMonkey.
CORC: How did staff respond to being asked to complete the survey?
Kim: I was surprised, because attitudes towards outcome measures aren’t great, so asking people to spend time on a survey about them felt like quite a hard ask. But the fact that people did fill it out was really good. I didn’t have anyone come to me to complain that it was a waste of their time. So generally, I think the response was reasonably positive. We obviously didn’t get 100% completion, but I think the fact that a lot of my colleagues did do it was a good sign.
CORC: Did you get any feedback from staff about the survey? What did they say?
Kim: Yeah, I actually had a few colleagues come up to me and say that this was a really good idea, because it is not something that they would generally think about - it usually sits at the end of the list of things to do and they don’t have the head space for it. So, having questions that directly prompt you to give very quick answers was really good. And a few people also said that this is a great starting platform. So, I did receive a few positive comments. And like I said earlier, no one came up to tell me that this was a waste of time or "I can’t do it".
CORC: Was the report easy to understand?
Kim: I honestly didn’t know what to expect from the report, because there are so many different formats. So, what I really liked was that it was very straightforward and broken down into very clear sections, which were all understandable to me, but also understandable to other people in the team, which I thought was great. It didn’t use over-complicated language, which I thought was really important, because a lot of the colleagues I work with are clinicians first and foremost, not people working in data research or management. Also, each of the core areas discussed not only gave you the statistics, but also suggestions, areas of consideration and improvement, and additional things to think about. So, they gave the data meaning.
CORC: Did it reveal any ‘surprises’?
Kim: A few things did stand out, for example the fact that not many people had had training in the last couple of years, which was a bit surprising. And I think that was a bit of a reality check and something we need to work on. Then something that was not a surprise, but something we hadn’t thought about as much, was involving service users - that was a theme that came across quite a few times and really stood out. Also interesting was that several people didn’t think outcome measures were evidence-based, and a higher percentage thought outcome measures are not particularly useful. But again, I think that comes from the fact that many people aren’t trained in this field, and perhaps this is where that attitude comes from.
CORC: What do you plan to do with the report?
Kim: One of the conclusions of the report was that a lot of the staff don’t get feedback on the outcomes data produced from the measures they collect, and I really wanted to let them know early on that they are being heard. We did a little session where we discussed this with the team, and I was actually surprised by the amount of insight people brought with them; they were willing to think about alternative ways of approaching things, which was really helpful. We also thought about how we can involve service users more going forward - whether by having a little group, or whether clinicians can involve them a little bit more in their own sessions. So, this is something we would like to work on. Something we also noticed from the report, and from the discussion that followed, was that our service is really quite niche in comparison to more typical CAMHS services. We don’t only address mental health, we address quite a few other areas, and clinicians don’t like many of the outcome measures we work with, because they are really limited and don’t capture the whole of the work they do. So, we want to look at which outcome measures might be more appropriate and give clinicians and clients a bit more flexibility depending on their needs and the areas they want to work on. And ultimately, there is a project underway to change the data system and the way we capture outcome measures, and the report will be very helpful in feeding into this project in the long term.
CORC: Have you shared with colleagues? What reaction did they have?
Kim: I think they were surprised about some of the findings. For example, I said to colleagues: “I can see that your motivation is not very high, and this is reflected in the outcomes we are collecting.” And I think they were surprised about that. But I also think it brought the team together and we began shifting our attitudes. Although a lot of the statistics were not great at all, instead of walking away in a negative state of mind, a lot of us walked away thinking we need to change this and something needs to be done. So, that is quite optimistic, which was a pleasant surprise for me. I have good colleagues who were really thoughtful when we shared the findings.
CORC: Would you recommend the staff survey to other services?
Kim: Yes, definitely - even for services that might be doing better in terms of collecting outcome measures and getting feedback from service users. In data management roles, we often make many assumptions about the attitudes of our colleagues and the work they are doing. So, hearing directly from them in this anonymous bubble allows them to be very honest and forthright. Therefore, I would definitely recommend it.
CORC: What advice might you give to a service that was considering carrying out the survey?
Kim: Obviously, the more people respond to the survey, the better. I tried this via email, then I called people or talked to them when I saw them. Team meetings are also an opportunity to ask about the survey and encourage people to do it. So, finding different ways to promote the staff survey is really important for other services to think about early on. I think lunchtime sessions can be helpful too, providing tablet devices for staff members…anything that could increase the number of responses. Also, feed back early on and be transparent: staff want to know about the results, to know what comes afterwards. Pitching it at a suitable time is also important - not around the Christmas or Easter holidays, since many people will be on annual leave. So, pick a time of year when many people are around and give them enough time to do it; even though it might only take 10 – 15 minutes, people have many other urgent things to do, so allow plenty of time.
CORC: Is there anything that could be improved about the survey or report and associated process?
Kim: The report that was generated was very useful, but I had to work on it to make it presentable to the team. So, for me, having more and simpler graphics that show the statistics at a glance would be very helpful, so the report can be shared more widely.
Many thanks to Kim Vassallo for answering our interview questions.