One of my earliest encounters with social and emotional learning as a teacher came in the early 2010s when I removed a faded poster from the mouldy corner of my new classroom.
I was reminded of this experience when Stuart Locke, chief executive of a trust, tweeted his shock that the Education Endowment Foundation advocated social and emotional learning (EEF, 2019b). Stuart based his argument on his own experiences as a school leader during the 2000s and a critical review of some underlying theories (Craig, 2007).
Given this, I decided to look at the evidence for SEL, unsure of what I would find.
When thinking about how strong the evidence is for a given issue, I find it helpful first to imagine what evidence would answer our questions. I have two broad questions about SEL:
- Is investing in SEL cost-effective compared to alternatives?
- What are the best ways of improving SEL?
We would ideally have multiple recent studies comparing different SEL programmes to answer these questions. These studies would be conducted to the highest standards, like the EEF’s evaluation standards (EEF, 2017, 2018). Ideally, the array of programmes compared would include currently popular programmes and those with a promising theoretical basis. These programmes would also vary in intensity to inform decisions about dosage.
Crucially, the research would look at a broad array of outcomes, including potential negative side-effects (Zhao, 2017). Such effects matter because there is an opportunity cost to any programme. These evaluations would not only look at the immediate impact but would track important outcomes through school and even better into later life. This is important given the bold claims made for SEL programmes and the plausible argument that it takes some time for the impact to feed through into academic outcomes.
The evaluations would not be limited to comparing different SEL programmes. We would even have studies comparing the most promising SEL programmes to other promising programmes such as one-to-one tuition to understand the relative cost-effectiveness of the programmes. Finally, the evaluations would provide insights into the factors influencing programme implementation (Humphrey et al., 2016b, 2016a).
Any researcher reading this may smile at my naïve optimism. Spoiler: the available evidence does not come close to this. No area of education has evidence like this. Therefore, we must make sense of incomplete evidence.
A history lesson
Before we look at the available evidence for SEL, I want to briefly trace its history based on my somewhat rapid reading of various research and policy documents.
A widely used definition of SEL is that it refers to the process through which children learn to understand and manage emotions, set and achieve positive goals, feel and show empathy for others, establish and maintain positive relationships, and make responsible decisions (EEF, 2019b).
CASEL, a US-based SEL advocacy organisation, identifies five core competencies: self-awareness, self-management, social awareness, relationship skills, and responsible decision-making (CASEL, 2022). A challenge with the definition of SEL is that it is slippery, which can lead to what psychologists call the jingle-jangle fallacy. The jingle fallacy occurs when we assume that two things are the same because they share a name; the jangle fallacy occurs when two almost identical things are taken to be different because they have different names.
Interest in social and emotional learning has a long history, both in academic research and in the working lives of teachers who recognise that their responsibilities extend beyond ensuring that every pupil learns to read and write. In England, the last significant investment in social and emotional learning happened in the late 2000s and was led by Jean Gross CBE (DfE, 2007). By 2010, around 90% of primary schools and 70% of secondary schools used the approach (Humphrey et al., 2010). The programme was called the social and emotional aspects of learning (SEAL) and focused on five dimensions different from those identified by CASEL but with significant overlap.
In 2010, the DfE published an evaluation of the SEAL programme (Humphrey et al., 2010). Unfortunately, the evaluation design was not suitable to make strong claims about the programme’s impact. Before this evaluation, there were five other evaluations of the SEAL programme, including one by Ofsted (2007), which helped to pilot the approach.
In 2010, the coalition government came to power and the national strategies ended. Nonetheless, interest in social and emotional learning arguably remains: a 2019 survey of primary school leaders found that it was still a very high priority for them, although there were reasonable concerns about the representativeness of the respondents (Wigelsworth, Eccles, et al., 2020).
In the past decade, organisations interested in evidence-based policy have published reports concerning social and emotional learning. Here are twelve.
- In 2011, an overarching review of the national strategies was published (DfE, 2011).
- In 2012, NICE published guidance on social and emotional wellbeing in the early years (NICE, 2012).
- In 2013, the EEF and Cabinet Office published a report on the impact of non-cognitive skills on the outcomes for young people (Gutman & Schoon, 2013).
- In 2015, the Social Mobility Commission, Cabinet Office, and Early Intervention Foundation published a series of reports concerning the long-term effects of SEL on adult life, evidence about programmes, and policymakers’ perspectives (EIF, 2015).
- In 2015, the OECD published a report on the power of social and emotional skills (OECD, 2015).
- In 2017, the US-based Aspen Institute published a scientific consensus statement concerning SEL (Jones & Kahn, 2017).
- In 2018, the DfE began publishing findings from the international early learning and child wellbeing (IELS) study in England, including SEL measures (DfE, 2018).
- In 2019, the EEF published a guidance report setting out key recommendations for improving social and emotional learning (EEF, 2019b).
- In 2020, the EEF published the results of a school survey and an evidence review that supported the 2019 guidance report (Wigelsworth, Eccles, et al., 2020; Wigelsworth, Verity, et al., 2020).
- In 2021, the Early Intervention Foundation published a systematic review concerning adolescent mental health, including sections on SEL (Clarke et al., 2021).
- In 2021, the EEF updated its Teaching and Learning Toolkit, which includes a strand on social and emotional learning (EEF, 2021).
- In 2021, the Education Policy Institute published an evidence review of SEL and recommended more investment, particularly given the pandemic (Gedikoglu, 2021).
To make sense of this array of evidence, we need to group it. There are many ways to do this, but I want to focus on three: theory, associations, and experiments.
Theory is perhaps the most complicated. To save my own embarrassment, I will simply point out that social and emotional learning programmes have diverse theoretical underpinnings, and these have varying levels of evidential support. Some are – to use a technical term – a bit whacky, while others are more compelling. A helpful review of some of the theory, particularly comparing different programmes, comes from an EEF commissioned review (Wigelsworth, Verity, et al., 2020). I also recommend this more polemical piece (Craig, 2007).
The next group of studies are those that look for associations or correlations. These studies come in many different flavours, including cohort studies that follow a group of people throughout their lives like the Millennium Cohort Study (EIF, 2015). The studies are united in that they look for patterns between SEL and other outcomes. Still, they share a common limitation: it is hard to identify what causes what. These studies can highlight areas for further investigation, but we should not attach too much weight to them. Obligatory XKCD reference.
Experiments can test causal claims by estimating what would have happened without the intervention and comparing this to what we observe. Experiments are fundamental to science, as many things seem promising when we look at just theory and associations, but when investigated through rigorous experiments are found not to work (Goldacre, 2015).
There are four recent meta-analyses that include experiments (Mahoney et al., 2018), and they have informed the findings of most of the reports listed above. The strength of a meta-analysis, when based on a systematic review, is that it reduces the risk of bias from cherry-picking the evidence (Torgerson et al., 2017). It also allows us to combine lots of small studies, which may individually be too small to detect important effects. Plus, a high-quality meta-analysis can help make sense of the variation between studies by identifying factors associated with these differences. To be clear, these are just associations, so they need to be interpreted very cautiously, but they can provide important insights for future research and for practitioners interested in best bets.
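At its core, a meta-analysis is a weighted average of study effect sizes. The sketch below shows minimal inverse-variance (fixed-effect) pooling in Python; the effect sizes and standard errors are invented purely for illustration, not taken from any SEL study.

```python
import math

# Hypothetical effect sizes (standardised mean differences) and their
# standard errors from five imaginary studies -- not real SEL data.
effects = [0.20, 0.05, 0.35, -0.10, 0.15]
ses = [0.10, 0.08, 0.20, 0.12, 0.15]

# Inverse-variance weighting: more precise (usually larger) studies
# count for more in the pooled estimate.
weights = [1 / se ** 2 for se in ses]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

These weights are exactly what a forest plot displays: one box per study, sized by its weight, with the pooled estimate at the bottom. Random-effects models add a between-study variance term, but the weighting logic is the same – and, crucially, the pooled number is only as good as the studies that feed into it.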
Unfortunately, the meta-analyses include some pretty rubbish studies. This is a problem because the claims from some of these studies may be wrong. False. Incorrect. Mistaken. Researchers disagree on the best way of dealing with studies of varying quality. At the risk of gross oversimplification, some let almost anything in (Hattie, 2008), others apply stringent criteria and end up with few studies to review (Slavin, 1986), while others set minimum standards, but then try to take account of research quality within the analysis (Higgins, 2016).
If you looked at the twelve reports highlighted above and the rosy picture they paint, you would be forgiven for thinking that there must be a lot of evidence concerning SEL. Indeed, there is quite a lot of evidence, but the problem is that it is not all very good. Take one of the most widely cited programmes, PATHS, for which a recent focused review by the What Works Clearinghouse (think US-based EEF) found 35 studies of which:
- 22 were ineligible for review
- 11 did not meet their quality standards
- 2 met the standards without reservations
Using the two studies that did meet the standards, the reviewers concluded that PATHS had no discernible effects on academic achievement, student social interaction, observed individual behaviour, or student emotional status (WWC, 2021).
Unpacking the Toolkit
To get into the detail, I have looked closely at just the nine studies included in the EEF’s Toolkit strand on SEL conducted with primary-aged children since 2010 (EEF, 2021). The date range is arbitrary, but I have picked the most recent studies because they are likely the best and most relevant – the Toolkit also contains studies from before 2010 and studies with older pupils. I chose primary because the EEF’s guidance report focuses on primary too. Note that sampling studies from the Toolkit like this avoids bias, since the Toolkit itself is based on systematic searches. The forest plot below summarises the effects from the included studies. The evidence looks broadly positive because most of the boxes are to the right of the red line. Note that two studies reported multiple effects, hence 11 effects from nine studies.
It is always tempting to begin making sense of studies by looking at the impact, as we just did. But I hope to convince you that we should start by looking at the methods. The EEF communicates the security of a finding through padlocks on a scale from 0 to 5, with five padlocks being the most secure (EEF, 2019a). Of the nine studies, two are EEF-funded; for the remaining seven, I have estimated the padlocks using the EEF’s criteria.
Apart from the two EEF-funded studies, every study got either zero or one padlock. The Manchester (2015) study received the highest security rating and is a very good study: we can have high confidence in its conclusions. The Sloan (2018) study got just two padlocks but is quite compelling, all things considered: despite being a fairly weak study by the EEF’s standards, it is still far better than the remaining studies.
The limitations of the remaining studies are diverse, but recurring themes include:
- High attrition – when lots of participants are randomised but then not included in the final analysis, this effectively ruins the point of randomisation (IES, 2017a).
- Few cases randomised – multiple studies only randomised a few classrooms, and the number of cases randomised has a big impact on the security of a finding (Gorard, 2013).
- Poor randomisation – the protocols for randomisation are often not specified, and it is not always possible to assess the integrity of the randomisation process (IES, 2017b).
- Self-reported outcomes – most studies used self-reported outcomes from pupils or teachers, which are associated with inflated effect sizes (Cheung & Slavin, 2016). The EEF’s studies have also consistently shown that teacher perceptions of impact are poor predictors of the findings from evaluations (Stringer, 2019).
- Unusual or complex analysis choices – many studies include unusual analysis choices that are not well justified, like dichotomising outcome variables (Altman & Royston, 2006). Further, the analyses are often complex, and without pre-specification, this gives lots of ‘researcher degrees of freedom’ (Simmons et al., 2011).
- Incomplete reporting – the quality of reporting is often vague about essential details. It is difficult to properly assess the findings’ security or get a clear understanding of the exact nature of the intervention (Hoffmann et al., 2014; Montgomery et al., 2018).
- Social threats to validity – where classes within a school are allocated to different conditions, there is a risk of social threats to validity, like resentful demoralisation, which were not guarded against or monitored (Shadish et al., 2002).
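To illustrate why dichotomising outcome variables is costly (Altman & Royston, 2006), here is a toy Monte Carlo simulation with entirely invented numbers: 50 pupils per arm and a true effect of 0.4 standard deviations on a continuous measure. Comparing means on the continuous outcome detects the effect more often than comparing the proportions of pupils above an arbitrary cut-point.

```python
import math
import random
import statistics

random.seed(42)
N, DELTA, SIMS = 50, 0.4, 2000  # pupils per arm, true effect (SD units), runs

def mean_test(a, b):
    """Two-sample z-test on means; True if significant at p < 0.05."""
    se = math.sqrt(statistics.variance(a) / N + statistics.variance(b) / N)
    return abs(statistics.mean(b) - statistics.mean(a)) / se > 1.96

def split_test(a, b):
    """Dichotomise at the scale midpoint, then run a two-proportion z-test."""
    pa = sum(x > 0 for x in a) / N
    pb = sum(x > 0 for x in b) / N
    p = (pa + pb) / 2
    se = math.sqrt(p * (1 - p) * 2 / N)
    return se > 0 and abs(pb - pa) / se > 1.96

cont = dich = 0
for _ in range(SIMS):
    a = [random.gauss(0, 1) for _ in range(N)]      # control group
    b = [random.gauss(DELTA, 1) for _ in range(N)]  # intervention group
    cont += mean_test(a, b)
    dich += split_test(a, b)

power_cont, power_dich = cont / SIMS, dich / SIMS
print(f"power, continuous outcome:   {power_cont:.0%}")
print(f"power, dichotomised outcome: {power_dich:.0%}")
```

Same data, same true effect, but throwing away the variation on either side of the cut-point costs a substantial chunk of statistical power – which, in a small trial, can be the difference between detecting an effect and missing it.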
The SEL guidance report
Stuart’s focus was originally drawn to the Improving Social and Emotional Learning in Primary Schools guidance report (EEF, 2019b). A plank of the evidence base for this guidance report was the EEF’s Teaching and Learning Toolkit. At the time, the Toolkit rated the strand as having moderate impact for moderate cost, based on extensive evidence (EEF, 2019b). Since the major relaunch of the Toolkit in 2021, the estimated cost and impact for the SEL strand have remained the same, but the security rating was reduced to ‘very limited evidence’ (EEF, 2021). The relaunch involved looking inside the separate meta-analyses that made up the earlier Toolkit and getting a better handle on the individual studies (TES, 2021). In the case of the SEL strand, this appears to have highlighted the relative weakness of the underlying studies.
Being evidence-informed is not about always being right. It is about making the best possible decisions with the available evidence. And as the evidence changes, we change our minds. For what it is worth, my view is that given the strong interest among teachers in social and emotional learning, it is right for organisations like the EEF to help schools make sense of the evidence – even when that evidence is relatively thin.
This rapid deep dive into the research about SEL has also been a timely reminder that, from time to time, it is necessary to go back to original sources rather than relying only on summaries. For instance, the EEF’s recent cognitive science review found just four studies focusing on retrieval practice that received an overall rating of high – which, I know, surprises many people given the current interest in using it (Perry et al., 2021).
I’ll give the final word to medical statistician Professor Doug Altman: we need less research, better research, and research done for the right reasons (Altman, 1994).
References
Altman, D. G. (1994). The scandal of poor medical research. BMJ, 308(6924), 283–284. https://doi.org/10.1136/BMJ.308.6924.283
Altman, D. G., & Royston, P. (2006). Statistics Notes: The cost of dichotomising continuous variables. BMJ, 332(7549), 1080. https://doi.org/10.1136/BMJ.332.7549.1080
Ashdown, D. M., & Bernard, M. E. (2012). Can Explicit Instruction in Social and Emotional Learning Skills Benefit the Social-Emotional Development, Well-being, and Academic Achievement of Young Children? Early Childhood Education Journal, 39(6), 397–405. https://doi.org/10.1007/S10643-011-0481-X/TABLES/2
Bavarian, N., Lewis, K. M., Dubois, D. L., Acock, A., Vuchinich, S., Silverthorn, N., Snyder, F. J., Day, J., Ji, P., & Flay, B. R. (2013). Using social-emotional and character development to improve academic outcomes: a matched-pair, cluster-randomized controlled trial in low-income, urban schools. The Journal of School Health, 83(11), 771–779. https://doi.org/10.1111/JOSH.12093
Brackett, M. A., Rivers, S. E., Reyes, M. R., & Salovey, P. (2012). Enhancing academic performance and social and emotional competence with the RULER feeling words curriculum. Learning and Individual Differences, 22(2), 218–224. https://doi.org/10.1016/J.LINDIF.2010.10.002
CASEL. (2022). Advancing Social and Emotional Learning. https://casel.org/
Cheung, A. C. K., & Slavin, R. E. (2016). How methodological features affect effect sizes in education. Educational Researcher, 45(5), 283–292. https://doi.org/10.3102/0013189X16656615
Clarke, A., Sorgenfrei, M., Mulcahy, J., Davie, P., Friedrick, C., & McBride, T. (2021). Adolescent mental health: A systematic review on the effectiveness of school-based interventions | Early Intervention Foundation. https://www.eif.org.uk/report/adolescent-mental-health-a-systematic-review-on-the-effectiveness-of-school-based-interventions
Craig, C. (2007). The potential dangers of a systematic, explicit approach to teaching social and emotional skills (SEAL) An overview and summary of the arguments. https://img1.wsimg.com/blobby/go/9c7fd4e5-3c36-4965-b6d8-71c83102ff94/downloads/SEALsummary.pdf?ver=1620125160707
DfE. (2007). Social and emotional aspects of learning for secondary schools (SEAL). https://dera.ioe.ac.uk/6663/7/f988e14130f80a7ad23f337aa5160669_Redacted.pdf
DfE. (2011). The national strategies 1997 to 2011. https://www.gov.uk/government/publications/the-national-strategies-1997-to-2011
DfE. (2018). International early learning and child wellbeing: findings from the international early learning and child wellbeing study (IELS) in England. https://www.gov.uk/government/publications/international-early-learning-and-child-wellbeing
EEF. (2017). EEF standards for independent evaluation panel members. https://educationendowmentfoundation.org.uk/public/files/Evaluation/Setting_up_an_Evaluation/Evaluation_panel_standards.pdf
EEF. (2018). Statistical analysis guidance for EEF evaluations. https://d2tic4wvo1iusb.cloudfront.net/documents/evaluation/evaluation-design/EEF_statistical_analysis_guidance_2018.pdf
EEF. (2019a). Classification of the security of findings from EEF evaluations. http://educationendowmentfoundation.org.uk/uploads/pdf/EEF_evaluation_approach_for_website.pdf
EEF. (2019b). Improving Social and Emotional Learning in Primary Schools. https://educationendowmentfoundation.org.uk/education-evidence/guidance-reports/primary-sel
EEF. (2021). Teaching and learning toolkit: social and emotional learning. https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit/social-and-emotional-learning
EIF. (2015). Social and emotional learning: skills for life and work. https://www.gov.uk/government/publications/social-and-emotional-learning-skills-for-life-and-work
Gedikoglu, M. (2021). Social and emotional learning: An evidence review and synthesis of key issues – Education Policy Institute. https://epi.org.uk/publications-and-research/social-and-emotional-learning/
Goldacre, B. (2015). Commentary: randomized trials of controversial social interventions: slow progress in 50 years. International Journal of Epidemiology, 44(1), 19–22. https://doi.org/10.1093/ije/dyv005
Gorard, S. (2013). Research design: creating robust approaches for the social sciences (1st ed.). SAGE.
Gutman, L. M., & Schoon, I. (2013). The impact of non-cognitive skills on outcomes for young people Literature review. https://educationendowmentfoundation.org.uk/education-evidence/evidence-reviews/essential-life-skills
Hattie, J. (2008). Visible learning: a synthesis of over 800 meta-analyses relating to achievement. Routledge.
Higgins, S. (2016). Meta-synthesis and comparative meta-analysis of education research findings: some risks and benefits. Review of Education, 4(1), 31–53. https://doi.org/10.1002/rev3.3067
Hoffmann, T. C., Glasziou, P. P., Boutron, I., Milne, R., Perera, R., Moher, D., Altman, D. G., Barbour, V., Macdonald, H., Johnston, M., Lamb, S. E., Dixon-Woods, M., McCulloch, P., Wyatt, J. C., Chan, A.-W., & Michie, S. (2014). Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ (Clinical Research Ed.), 348, g1687. https://doi.org/10.1136/BMJ.G1687
Humphrey, N., Lendrum, A., Ashworth, E., Frearson, K., Buck, R., & Kerr, K. (2016a). Implementation and process evaluation (IPE) for interventions in education settings: A synthesis of the literature. https://educationendowmentfoundation.org.uk/public/files/Evaluation/Setting_up_an_Evaluation/IPE_Review_Final.pdf
Humphrey, N., Lendrum, A., Ashworth, E., Frearson, K., Buck, R., & Kerr, K. (2016b). Implementation and process evaluation (IPE) for interventions in education settings: An introductory handbook. https://educationendowmentfoundation.org.uk/public/files/Evaluation/Setting_up_an_Evaluation/IPE_Guidance_Final.pdf
Humphrey, N., Lendrum, A., & Wigelsworth, M. (2010). Social and emotional aspects of learning (SEAL) programme in secondary schools: national evaluation. https://www.gov.uk/government/publications/social-and-emotional-aspects-of-learning-seal-programme-in-secondary-schools-national-evaluation
IES. (2017a). Attrition standard. https://eric.ed.gov/?q=attrition+randomized&id=ED579501
IES. (2017b). What Works Clearinghouse Standards Handbook (Version 4.0). https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_standards_handbook_v4.pdf
Jones, S. M., Brown, J. L., Hoglund, W. L. G., & Aber, J. L. (2010). A School-Randomized Clinical Trial of an Integrated Social-Emotional Learning and Literacy Intervention: Impacts After 1 School Year. Journal of Consulting and Clinical Psychology, 78(6), 829–842. https://doi.org/10.1037/a0021383
Jones, S. M., & Kahn, J. (2017). The Evidence Base for How We Learn Supporting Students’ Social, Emotional, and Academic Development Consensus Statements of Evidence From the Council of Distinguished Scientists National Commission on Social, Emotional, and Academic Development. https://www.aspeninstitute.org/wp-content/uploads/2017/09/SEAD-Research-Brief-9.12_updated-web.pdf
Mahoney, J. L., Durlak, J. A., & Weissberg, R. P. (2018). An update on social and emotional learning outcome research – kappanonline.org. https://kappanonline.org/social-emotional-learning-outcome-research-mahoney-durlak-weissberg/
Manchester. (2015). Promoting Alternative Thinking Strategies | EEF. https://educationendowmentfoundation.org.uk/projects-and-evaluation/projects/promoting-alternative-thinking-strategies
Montgomery, P., Grant, S., Mayo-Wilson, E., Macdonald, G., Michie, S., Hopewell, S., Moher, D., & CONSORT-SPI Group. (2018). Reporting randomised trials of social and psychological interventions: the CONSORT-SPI 2018 Extension. Trials, 19(1), 407. https://doi.org/10.1186/s13063-018-2733-1
Morris, P., Millenky, M., Raver, C. C., & Jones, S. M. (2013). Does a Preschool Social and Emotional Learning Intervention Pay Off for Classroom Instruction and Children’s Behavior and Academic Skills? Evidence From the Foundations of Learning Project. Early Education and Development, 24(7), 1020. https://doi.org/10.1080/10409289.2013.825187
NICE. (2012). Overview | Social and emotional wellbeing: early years | Guidance | NICE. NICE. https://www.nice.org.uk/Guidance/PH40
OECD. (2015). Skills Studies Skills for Social Progress: the power of social and emotional skills. https://nicspaull.files.wordpress.com/2017/03/oecd-2015-skills-for-social-progress-social-emotional-skills.pdf
Ofsted. (2007). Developing social, emotional and behavioural skills in secondary schools. https://core.ac.uk/download/pdf/4156662.pdf
Perry, T., Lea, R., Jorgenson, C. R., Cordingley, P., Shapiro, K., & Youdell, D. (2021). Cognitive science approaches in the classroom: evidence and practice review. https://educationendowmentfoundation.org.uk/education-evidence/evidence-reviews/cognitive-science-approaches-in-the-classroom
Schonfeld, D. J., Adams, R. E., Fredstrom, B. K., Weissberg, R. P., Gilman, R., Voyce, C., Tomlin, R., & Speese-Linehan, D. (2014). Cluster-randomized trial demonstrating impact on academic achievement of elementary social-emotional learning. School Psychology Quarterly : The Official Journal of the Division of School Psychology, American Psychological Association, 30(3), 406–420. https://doi.org/10.1037/SPQ0000099
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Houghton Mifflin.
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632
Slavin, R. E. (1986). Best-Evidence Synthesis: An Alternative to Meta-Analytic and Traditional Reviews. Educational Researcher, 15(9), 5–11. https://doi.org/10.3102/0013189X015009005
Sloan, S., Gildea, A., Miller, S., & Thurston, A. (2018). Zippy’s Friends. https://educationendowmentfoundation.org.uk/projects-and-evaluation/projects/zippys-friends
Snyder, F., Flay, B., Vuchinich, S., Acock, A., Washburn, I., Beets, M., & Li, K. K. (2010). Impact of a social-emotional and character development program on school-level indicators of academic achievement, absenteeism, and disciplinary outcomes: A matched-pair, cluster randomized, controlled trial. Journal of Research on Educational Effectiveness, 3(1), 26. https://doi.org/10.1080/19345740903353436
Stringer, E. (2019). Teacher training – the challenge of change. https://educationendowmentfoundation.org.uk/news/eef-blog-teacher-training-the-challenge-of-change/
TES. (2021). Toolkit puts “best bets” at teachers’ fingertips. TES. https://www.tes.com/magazine/archived/toolkit-puts-best-bets-teachers-fingertips
Torgerson, C., Hall, J., & Lewis-Light, K. (2017). Systematic reviews. In R. Coe, M. Waring, L. Hedges, & J. Arthur (Eds.), Research methods and methodologies in education (2nd ed., pp. 166–179). SAGE.
Wigelsworth, M., Eccles, A., Mason, C., Verity, L., Troncoso, P., Qualter, P., & Humphrey, N. (2020). Programmes to Practices: Results from a Social & Emotional School Survey. https://d2tic4wvo1iusb.cloudfront.net/documents/guidance/Social_and_Emotional_School_Survey.pdf
Wigelsworth, M., Verity, L., Mason, C., Humphrey, N., Qualter, P., & Troncoso, P. (2020). Programmes to practices: identifying effective, evidence-based social and emotional learning strategies for teachers and schools: evidence review. https://educationendowmentfoundation.org.uk/education-evidence/evidence-reviews/social-and-emotional-learning
WWC. (2021). Promoting Alternative Thinking Strategies (PATHS). https://ies.ed.gov/ncee/wwc/InterventionReport/712
Zhao, Y. (2017). What works may hurt: Side effects in education. Journal of Educational Change, 18(1), 1–19. https://doi.org/10.1007/s10833-016-9294-4