Outcome Research Lowdown, July 2020: Dr Jenna Jacob reflects on the latest news on the SDQ

This is the first in a series of blogs considering how the most recent outcomes research may be useful to inform your everyday practice, and how to think about outcome information generally. This time, we focus on emerging research exploring the psychometric properties of the Strengths and Difficulties Questionnaire (SDQ; Goodman, 1997). This type of research aims to evaluate the reliability and validity of a measure: to assess whether it measures what it intends to measure, and behaves in a similar way over time and between participants. For some measures, the evidence shows very good psychometric properties, whilst for others there is little evidence or mixed findings.
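For readers who like to see what a reliability check actually computes, internal consistency is often summarised with Cronbach's alpha, which compares the variance of the individual items to the variance of the total score. Here is a minimal, illustrative sketch (the function name and data layout are our own, not taken from any of the papers discussed):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)   # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```

Values closer to 1 indicate that the items move together and so plausibly tap a common construct; very low values suggest the items do not form a coherent scale.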

The SDQ is a widely used and recognised measure, which many of you will be familiar with, not least because it is included as a child- and parent-reported measure in the Mental Health Services Minimum Data Set. It has been shown to be useful in many ways, e.g. for clinical assessment, screening, outcome monitoring and research. There have been a substantial number of validation studies of the SDQ, and recent research both supports the use of the measure and suggests caution in some respects. This quick review of emerging research demonstrates how one tool can have both strengths and limitations.

First, the self-reported measure.

Black and colleagues (2020) made use of the Headstart (Community Fund, 2020) dataset, which included self-reported SDQ data from 30,290 school pupils in year 7 (aged 11-12 years) and year 9 (aged 13-15 years). The authors found that the reading age required by the questions varied between five and 18 years, a wider range than previous research had found (Patalay et al., 2018). The argument here is that a lack of understanding of some words may make the measure less accurate, which led the authors to conclude that caution is needed when the SDQ is given to younger respondents and those with learning disabilities.

The authors also examined the factor structure of the measure: whether the questionnaire items are related and can be grouped into common factors or constructs. A five-factor structure has previously been identified for the SDQ (i.e. the subscales: emotional problems, hyperactivity, peer problems, prosocial and conduct problems; Goodman, 2001). However, Black and colleagues (2020) found that the items did not load neatly onto these five subscales. This may suggest that the measure works better at an overall level than as a measure of these specific sub-issues.
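To make the idea of subscales concrete, the sketch below shows how SDQ-style subscale scores and a total difficulties score (the sum of all subscales except prosocial) are combined from 25 items each scored 0-2. Note that the item-to-subscale mapping here is a placeholder for illustration only; the official SDQ scoring instructions define the actual mapping:

```python
# Placeholder item indices: the real SDQ scoring sheet assigns specific
# items to each subscale.
SUBSCALES = {
    "emotional":     [0, 1, 2, 3, 4],
    "conduct":       [5, 6, 7, 8, 9],
    "hyperactivity": [10, 11, 12, 13, 14],
    "peer":          [15, 16, 17, 18, 19],
    "prosocial":     [20, 21, 22, 23, 24],
}

def score_sdq(responses):
    """responses: 25 item scores, each 0, 1 or 2. Returns subscale scores
    plus total difficulties (all subscales except prosocial, range 0-40)."""
    if len(responses) != 25 or any(r not in (0, 1, 2) for r in responses):
        raise ValueError("expected 25 item scores, each 0, 1 or 2")
    scores = {name: sum(responses[i] for i in items)
              for name, items in SUBSCALES.items()}
    scores["total_difficulties"] = sum(
        scores[s] for s in ("emotional", "conduct", "hyperactivity", "peer"))
    return scores
```

A factor analysis like Black and colleagues' asks, in effect, whether the response data actually justify adding the items up in these particular groups.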

The paper can be found here: https://journals.sagepub.com/doi/full/10.1177/1073191120903382 (not free access).

Second, the parent-reported measure.

Murray and colleagues (2019) explored whether the parent-reported version of the SDQ works consistently as the child ages. The authors made use of the Millennium Cohort Study dataset (MCS; Connelly & Platt, 2014), which included data from 11,315 parent-reported SDQs collected at ages 3, 5, 7, 11 and 14. The findings suggest that the parent-reported subscales behave consistently at different age points, and that the five-subscale structure held over time without being affected by age point or gender.

These findings are quite different from those of Black et al. (2020), but it is important to note that self-reported data could well generate different results: the issues related to readability may be diluted in parent-reported data. The findings support the use of the SDQ for longitudinal research, and they are also useful for our shorter-term purposes in CORC. The paper is freely available online: https://psyarxiv.com/zs6q5/.

So, what is the bottom line?

I have heavily summarised the papers for this blog, so if you are interested in the detail, I encourage you to read them in full. The findings on the readability of the SDQ (Black et al., 2020) exemplify the need to know the population you are working with and to consider how best to support them in completing the measures. This emerging research suggests that the self-reported SDQ may need further development to review and refine the wording of the items; the measure has been around for more than twenty years, so this suggestion may be unsurprising. The authors make the very important point that any development should be done in collaboration with young people. In the interim, we may need to provide more support to young people completing the measure, especially younger children and those with learning disabilities. The advice so far has been that the SDQ may be used with young people with mild learning difficulties, but not with more severe learning difficulties (Law & Wolpert, 2014). Black and colleagues (2020) set out in their paper the specific questions that include challenging wording, so practitioners may find it helpful to look at the full content.

The consistency of the parent-reported SDQ over time and for different age groups is promising, particularly where it might be used to consider trajectories of change throughout childhood. For routine outcome measurement, this is evidence of the stability of the parent-reported measure, demonstrating that it holds up over time and is likely to be useful for longer-term as well as shorter-term work.

You can see that these two papers provide different viewpoints on the SDQ. There are many reasons to continue using it, including its widely reported norms and its general acceptability to those who complete it. The SDQ is one of our most widely used outcome measures. Our advice is always to triangulate information from measures: wherever possible, not to rely solely on one measure or one perspective. No measure is perfect, and we encourage a curious approach to outcome measurement and an appreciation of the strengths and limitations of different measurement approaches (Wolpert et al., 2014).


Black, L., Mansfield, R., & Panayiotou, M. (2020). Age Appropriateness of the Self-Report Strengths and Difficulties Questionnaire. Assessment, 1073191120903382.

Community Fund (2020) Homepage. https://www.tnlcommunityfund.org.uk/funding/strategic-investments/headstart

Goodman, R. (1997). The Strengths and Difficulties Questionnaire: A research note. Journal of Child Psychology and Psychiatry, 38(5), 581-586.

Goodman, R. (2001). Psychometric properties of the Strengths and Difficulties Questionnaire. Journal of the American Academy of Child and Adolescent Psychiatry, 40(11), 1337-1345.

Law, D., & Wolpert, M. (2014). Guide to using outcomes and feedback tools with children, young people and families. UK: CAMHS Press.

Murray, A. L., Speyer, L. G., Hall, H. A., Valdebenito, S., & Hughes, C. (2019). A longitudinal invariance analysis of the Strengths and Difficulties Questionnaire across ages 3, 5, 7, 11 and 14 in a large UK-representative sample. PsyArXiv. https://psyarxiv.com/zs6q5/

Patalay, P., Hayes, D., & Wolpert, M. (2018). Assessing the readability of the self-reported Strengths and Difficulties Questionnaire. BJPsych Open, 4(2), 55-57.

Wolpert, M., Deighton, J., De Francesco, D., Martin, P., Fonagy, P., & Ford, T. (2014). From ‘reckless’ to ‘mindful’ in the use of outcome data to inform service-level performance management: Perspectives from child mental health. BMJ Quality & Safety, 23(4), 272-276.
