About HeadStart
HeadStart is a five-year, £56 million National Lottery funded programme set up by the Big Lottery Fund, the largest funder of community activity in the UK. It aims to explore and test new ways to improve the mental health and wellbeing of young people aged 10 to 16 and prevent serious mental health issues from developing.

The Evidence Based Practice Unit at the Anna Freud National Centre for Children and Families and UCL is working with Big Lottery Fund and the HeadStart partnerships to collect and evaluate evidence about what does and doesn’t work locally to benefit young people now and in the future. Partners working with the Evidence Based Practice Unit on this evaluation include the Child Outcomes Research Consortium (CORC), Common Room, London School of Economics and the University of Manchester.


Introduction

In 2018, as part of the HeadStart programme, over 30,000 young people in 116 schools were surveyed using the Wellbeing Measurement Framework - an online set of questionnaires which measure mental health, wellbeing, and resilience in children and young people. Schools were very successful in ensuring the vast majority of students took part. Achieving a response rate of over 90% in this kind of survey can be a real challenge, and gathering this level of information about the experiences of children and young people was a significant achievement. We investigated how the schools involved achieved this, and hope that sharing this learning will support other schools surveying the wellbeing of their students.

What did we learn about best practice?
1. Survey introduction and accessibility
Schools saw good engagement where the survey was introduced verbally by a member of staff who was actively involved in HeadStart and really understood the survey’s purpose. This ensured that a consistent message about the importance and potential impact of the survey was delivered to students and minimised the risk of students not taking it seriously or seeing it as a waste of time.

"We focused on the idea that they were helping to shape the way funding would be spent in our area to build their resilience and tried to give them a feeling of importance, rather than just ticking a box for us."
Secondary School Senior Leader

Students and parents could choose whether to take part, so clearly communicating the importance of the survey and the potential positive impact of the results was essential to enable an informed decision. While explaining why the request was being made, schools also had an important ethical responsibility to respect the right not to participate.

It was also essential that students knew how to access support if completing the survey raised any particular issues or caused them any distress. Making the survey accessible to those with English as an additional language was a challenge in many areas. In a school with high numbers of children sharing the same first language, the survey was introduced by a bilingual member of staff and the questions were translated during completion. Other schools used bilingual staff to translate the whole survey in advance.

A common challenge was ensuring the survey was accessible to students with lower reading ability. It was observed that these students required significant support from an adult to understand the questions and took longer to complete the survey.

2. Timing and communication
Timing was clearly key: schools with the highest response rates carefully chose the best completion time for each year group. The common factor was that time was planned when the whole class or year group was available. For some schools this was during form time, for others during curriculum time, and for some during IT lessons when equipment was most readily available.

Ensuring most questionnaires were completed at the first sitting limited the ‘mop ups’ of absentees. There were always some children who didn’t complete the survey at the same time as their peers; where response rates were high, these students were followed up by the school lead or form tutor. Good planning and communication also helped. It helped when the lead for the survey sent reminders to the staff involved in advance of the sessions, and when survey completion was timetabled over a relatively short period. Most schools timetabled the sessions over one or two weeks.

3. Use of IT equipment
Schools used a combination of fixed IT equipment and mobile technology. Many schools worked with IT support to minimise the risk of technical issues. To make the best use of session time, a common practice was to set up a guest login with a shortcut to the survey on the desktop.

4. Keeping track of completion
Schools that achieved very high response rates had strong record-keeping systems that were held, checked and systematically followed up by the school leads. Cross-referencing with school records, including attendance registers and records of consent, ensured accurate survey completion records.

Good ideas included giving students slips of paper or stickers with their login details immediately before they completed the survey. This made it easy for schools to see who hadn’t yet been given their login and so had not had the opportunity to complete the survey.

This ensured all students had the opportunity to complete the survey, but avoided pressurising those students and families who opted out.

Recommendations

  • Plan for adjustments to meet the needs of specific groups of students
  • Ensure students understand the purpose of the survey and how important it is
  • Make sure the survey is introduced by a trusted member of staff who really understands and believes in it
  • Timetable the survey for when the maximum number of students can complete it
  • Engage IT support
  • Keep accurate records of completion
  • Prepare login details in advance, e.g. use individual slips or named stickers
  • Task an individual member of staff with following up on absentees who consented to participate but missed the chance to complete the survey


Many thanks to all HeadStart Partnerships; Southern Road Primary School, Newham; Oasis Academy Isle of Sheppey, Kent; St. George’s Blackpool, Blackpool; Aspire Academy, Blackpool; Colton Hills Community School, Wolverhampton.

 
