Update: Opportunity Knocks Again, And Again, And Again …

A while back educational globetrotter and purveyor of all things ‘evidence-based’, Visible Learning, tweeted ‘What is John Hattie working on at the moment?’ They informed us it was Visible Learning for Parents. While some may welcome the umpteenth variation of Visible something-or-other, I was skeptical of this addition to the arsenal of products and wanted to share a few thoughts. Hattie himself kindly left a comment on my blog, clarified the purpose of the book and even offered me a free copy!

It still leaves me pondering … the greasy pig that is the secure relationship between intervention and outcome has been the focus of Hattie and Visible Learning’s output for nearly ten years. Across the globe, individuals, schools and professional organizations have hailed Hattie’s meta-analysis as an important step forward in making educational decision-making more evidence-based. It also satisfies those who love a list and a rank order (updated 2016). It is understandable that, in the quest for certainty in an inherently complex and uncertain place like a school, this work would be welcomed.

The influence of Hattie’s meta-analysis and its encompassing rhetoric can be seen in many places, from bookshelves to unit plans, classrooms to conferences, national toolkits to policy. Products and strap-lines abound to reinforce the brand. We see him commentate on television about school improvement trials which give him access to families’ and communities’ hearts and minds. He can be seen spanning organizations with significant professional clout to manage up to policy, and down to the standards that drive teacher practice. Writing for Pearson, he has reminded us of the Politics of Distraction, those things which ‘don’t work’, and suggested where our efforts and thinking should be channelled. He has also proclaimed in evangelical form that he has a dream for educators to be, wait for it, ‘change agents’. So that’s d = 1.57, right, the ‘collective teacher efficacy’ super factor? He and his work also benefit from an extended partnership between ACEL, Corwin and Visible Learning.

It was timely, then, that Scott Eacott released School leadership and the cult of the guru: the neo-Taylorism of Hattie this week to remind us of the worrying alliances forming in Australia and other settings which are elevating the cult of the guru to worrying heights. Eacott notes in his critique the business practices (inspired by Taylor, 1911) which infiltrated American public education and shifted schooling to “business imperatives and in particular the pursuit of efficiency”. The depth and extent of the brand infiltration are further explored, with Eacott explaining:

“Visible learning, as a label is now used in a variety of areas … further building the brand and evidence of the brand of Hattie exploiting an opportunity for maximum advantage. Courtesy of sheer presence, Hattie has become canonised in initial teacher education, graduate programmes, and professional dialogue and debate. His work is now ubiquitous with education in Australia.”

What interests me is the work that has been done to shine a spotlight on the shortcomings of using meta-analysis and effect sizes to validate all manner of commercial and educational activity and supposed policy legitimacy. For example, back in 2011 Snook et al. wrote a critique of Visible Learning. Of particular note were their concluding concerns. After picking apart the methodological inconsistencies, the authors noted that “politicians may use his work to justify policies which he (Hattie) does not endorse and his research does not sanction”. They go on to state that “the quantitative research on ‘school effects’ might be presented in isolation from their historical, cultural and social contexts, and their interaction with home and community backgrounds”.

Beyond a school’s choice to adopt strategies which anchor themselves in meta-analysis, there is the bigger question of how far up the system chain the acceptance of intervention effectiveness goes, and how wide the sphere of influence extends. Simpson (2017) has noted that our preoccupation with “‘what works’ in education has led to an emphasis on developing policy from evidence based on comparing and combining a particular statistical summary of intervention studies: the standardised effect size.” The paper suggests that the research areas which lead to the array of ‘effective’ interventions are susceptible to research design manipulation – they stand out because of methodological choices. It also asserts that policy has fallen victim to metricophilia: “the unjustified faith in numerical quantities as having particularly special status as ‘evidence’ (Smith 2011)”. Dr Gary Jones does a great job of highlighting this and other worries in his blog post about how this paper puts another ‘nail in the coffin’ of Hattie’s Visible Learning. Similarly, Ollie Orange ably dismantles the statistics behind Hattie’s meta-analysis.
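Simpson’s point about methodological choices can be made concrete. The standardised effect size (Cohen’s d) divides a raw mean difference by the pooled standard deviation, so a study run on a narrow, homogeneous sample will report a far larger d than a study on a heterogeneous one, even when the raw gain is identical. A minimal sketch, using purely illustrative numbers of my own invention:

```python
import statistics

def cohens_d(treatment, control):
    """Standardised effect size: raw mean difference divided by pooled SD."""
    n1, n2 = len(treatment), len(control)
    m1, m2 = statistics.mean(treatment), statistics.mean(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (m1 - m2) / pooled_sd

# Both hypothetical studies show the same raw gain of 5 points ...
broad_treat  = [55, 65, 75, 85, 95]   # heterogeneous sample, large spread
broad_ctrl   = [50, 60, 70, 80, 90]
narrow_treat = [73, 74, 75, 76, 77]   # restricted-range sample, tiny spread
narrow_ctrl  = [68, 69, 70, 71, 72]

print(round(cohens_d(broad_treat, broad_ctrl), 2))    # ~0.32, a "small" effect
print(round(cohens_d(narrow_treat, narrow_ctrl), 2))  # ~3.16, a "huge" effect
```

The same intervention, the same five-point gain, yet the standardised summary differs by a factor of ten purely because of how the sample was constituted. This is the kind of artefact a league table of effect sizes silently absorbs.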

The seductive rhetoric of Hattie’s work can be found almost everywhere and certainly seems compelling. With questions being asked of the methodological credibility upon which all else gushes forth, shouldn’t we be questioning how much we buy in to it? Surely we cannot ignore the noise, not necessarily because of its message, but because the noise is becoming a cacophony. As Eacott (2017) concludes,

“Hattie’s work is everywhere in contemporary Australian school leadership. This is not to say that educators have no opportunity for resistance, but the presence and influence of brand Hattie cannot be ignored. The multiple partnerships and roles held by Hattie the man and the uptake of his work by systems and professional associations have canonised the work in contemporary dialogue and debate to the extent that it is now put forth as the solution to many of the woes of education.”


2 thoughts on “Update: Opportunity Knocks Again, And Again, And Again …”

  1. A really timely post, Jon. The canonisation of Hattie’s (methodologically flawed) recommendations as absolute truths has been made possible only because of politicians’ obsession with silver bullets.

    Part of the issue, I’d argue, is that the tunnel vision that is the ‘effect size’ argument has a number of very damaging ‘effects’ itself:
    1. It treats children as units of production, and schools merely as efficiency machines;
    2. It seeks the authority of the medical clinical trials system, without acknowledging the potentially catastrophic ‘side effects’ that the medical profession gives priority to;
    3. It de-professionalises teacher expertise and teacher autonomy;
    4. It offers itself as hostage to reductionist interventions and sloganising (the latest in the UK being “Just tell them!”). The OECD may argue that PISA assesses many more things than performance in Maths, Literacy and Science, but it’s politically naive to believe that media outlets and governments will ever go beyond the headlines;
    5. It’s a cherry-pickers’ charter – encouraging policy makers to plump for the interventions that accord with their own ideologies, whilst conveniently ignoring those that don’t, to the detriment of seeing child development as a complex set of relationships, interactions, contexts and, yes, emotions;
    6. It strangles the real, urgent debate that society needs to have about how we judge an effective education in preparing students for the future, and whether we’re even measuring the right things.

    John Hattie and the Visible Learning bandwagon can’t be blamed for all the above, but I’d have more sympathy with John Hattie personally had his company not financially exploited his findings quite so blatantly.

  2. An excellent, timely post, thanks! On the evidence based teaching and learning, Biesta offers a critique on a philosophical level in his ‘why what works won’t work’ article.
    Part of the problem with EBT practices is that, while the question of ‘how best to do something’ is examined using flawed methodology (based on analogous healthcare models), it never questions what that something is. In short, EBT never stops to examine what it wants to achieve and whether its aims and methods are relevant.
