Last week educational globetrotter and purveyor of all things ‘evidence-based’, Visible Learning, tweeted ‘What is John Hattie working on at the moment?’ Before they informed us it was Visible Learning for Parents, my first thought was $kerching$. While some may welcome the umpteenth variation of Visible something-or-other, I am sceptical of this addition to the arsenal of products and wanted to share a few thoughts.
The greasy pig that is the secure relationship between intervention and outcome has been a focus of Hattie and Visible Learning’s output for nearly ten years. It seems that across the globe individuals, schools and professional organizations have hailed Hattie’s meta-analysis as an important step forward in making educational decision-making more evidence-based. It also satisfies those who love a list and rank order (updated 2016). It is understandable that in the quest for certainty in an inherently complex and uncertain place like a school, this work would be welcome.
Hattie is no doubt aware that much has been written about how family involvement with a child’s schooling can affect achievement. That said, a myriad of complexities and nuances prevent the evidence from being reliable enough to anchor down definitive interventions that lead to improvements in achievement. He should ‘know the impact’, I hear you say! Well, is this Hattie’s angle? To fill this lacuna?
The influence of Hattie’s meta-analysis and encompassing rhetoric can be seen in many places, from bookshelves to unit plans, classrooms to conferences, national toolkits to policy. Products and strap-lines abound to reinforce the brand. We see him commentate on television about school improvement trials which give him access to families’ and communities’ hearts and minds. He can be seen spanning organizations with significant professional clout to manage up to policy and down to the standards that drive teacher practice. Writing for Pearson, he has reminded us of the Politics of Distraction, those things which ‘don’t work’, and suggests where our efforts and thinking should be channeled. He has also proclaimed in evangelical form that he has a dream for educators to be, wait for it, ‘change agents’. So that’s d = 1.57, right, the ‘collective teacher efficacy’ super factor? He and his work also benefit from an extended partnership between ACEL/Corwin/Visible Learning.
What interests me is the work that has been done to shine a spotlight on the shortcomings of using meta-analysis and effect sizes to validate all manner of commercial and educational activity and supposed policy legitimacy. For example, back in 2011 Snook et al. wrote a critique of Visible Learning. Of particular note were their concluding concerns. After picking apart the methodological inconsistencies, the authors noted that “politicians may use his work to justify policies which he (Hattie) does not endorse and his research does not sanction”. They go on to state that “the quantitative research on ‘school effects’ might be presented in isolation from their historical, cultural and social contexts, and their interaction with home and community backgrounds”.
This final point is of interest when we consider the forthcoming publication of Visible Learning for Parents. What might the book be geared towards? Parents’ understanding of, and endorsement of, school efforts to execute and inculcate strategically selected interventions to improve achievement? Perhaps we may see further brand-strengthening through the introduction of an armada of products and services. There will be a book of course, but what about ($kerching$) an online portal, app or school-home support software to connect schools with parents and families? Perhaps ($kerching$) PD will follow, of course through accredited providers. I dare say the initiative will do the rounds at ($kerching$) conferences the world over, exhorting the vital role parents and families play in the educative process, nailing the critical support of the home situation.
Beyond a school’s choice to adopt strategies which anchor themselves in meta-analysis, there is the bigger question of how far up the system chain the acceptance of intervention effectiveness goes, and how wide the sphere of influence extends. Simpson (2017) has noted that our preoccupation with “‘what works’ in education has led to an emphasis on developing policy from evidence based on comparing and combining a particular statistical summary of intervention studies: the standardised effect size.” The paper suggests that research areas which lead to the array of effective interventions are susceptible to research design manipulation – they stand out because of methodological choices. It also asserts that policy has fallen victim to metricophilia: “the unjustified faith in numerical quantities as having particularly special status as ‘evidence’ (Smith 2011)”. Dr Gary Jones does a great job of highlighting this and other worries in his blog post about how this paper puts another ‘nail in the coffin’ of Hattie’s Visible Learning. Similarly, Ollie Orange ably dismantles the statistical underpinnings of Hattie’s meta-analysis.
The seductive rhetoric of Hattie’s work can be found almost everywhere and certainly seems compelling. However, if education is solely about impact and effect size, i.e. one year’s growth for one year’s input, will this book actually add anything of value to the combined community’s pursuit of improvement in young people’s achievement? With questions being asked of the methodological credibility upon which all else gushes forth, shouldn’t we be questioning how much we buy into it?