White Paper: Why the evidence doesn’t stack up

The Schools White Paper reads like a less effective school improvement plan, says Megan Dixon: lots of actions, but no clear evaluation strategy
29th March 2022, 11:17am

The Schools White Paper has been published: it's 68 pages long and, while it is big on ideas, it is largely silent on how we will measure the difference those ideas will make. 

For me, it brought to mind some of the less effective school improvement plans I’ve seen: lots of actions, but no clear evaluation strategy. 

Of course, there are targets, such as the expectation that 90 per cent of children will reach the expected standard in reading, writing and maths in key stage 2 by 2030. 

But we know what happens when numbers become targets. Targets are what lie at the end of a road; they don’t show the journey we need to take, or how we know we are on the right path in the first place. 

Interestingly, the Department for Education talks about evidence a lot throughout the document and does acknowledge that the evidence bases it draws upon would benefit from regular review and revision. 

For example, it pledges to assess the effect recent reforms of the early years foundation stage have had on teaching practice, and, “where necessary”, identify ways to go further in ensuring children are prepared for key stage 1.

It also stresses that the second part of the reading framework and, indeed, the English and maths hubs will be overseen by the Education Endowment Foundation (EEF) and the repurposed Oak Academy.

It adds that the National Tutoring Programme will continue to put tutoring at the heart of schools, drawing on evidence from before the pandemic. In addition to this, there will be new leadership qualifications, new multi-academy trusts (MATs), new attendance reporting, new behaviour approaches and a new “Parent Pledge”. 

Schools White Paper: how will we know if a school is doing better?

All of these proposals reference evidence from a time that no longer exists. But what about the evidence emerging in front of our eyes? How, exactly, will we know if a school is doing better? 

When it comes to the early years, for example, how do we know if teaching has improved? Perhaps it will be determined by whether or not a school can get all children through the phonics screening check. 

There is an issue with this, though: the data in the study cited in the White Paper suggests that passing the check is correlated with a child's age and gender. Where, then, do we turn instead?

We face similar issues around measuring progress in primary English and maths. Should we look at the percentage of children who pass their Sats? After all, if a primary school's pass rate goes up, it must be improving, right? 

Maybe not. Since revisions to the Sats moved the tests from criterion-referenced to norm-referenced assessments, there will always be children who don't achieve the national standard. With that in mind, how can we really tell if children are improving in literacy and numeracy?  

The White Paper makes it clear that every school should join a strong MAT. But what does one look like? Can we rely on Ofsted to tell us? Its well-evidenced bias towards schools in more affluent circumstances suggests not. 

Meanwhile, the new Parent Pledge says that schools will support students who are falling behind. But how will we know if, or when, this is happening? And more importantly, how will we know that the solutions touted by the DfE are really the best way forward? Can we guarantee they will result in the best outcomes for the children and young people in our schools? 

Case studies, of course, can be really helpful in describing what can be achieved, and the White Paper does include a number of them to help us understand what a strong school or MAT with great teachers could look like. And yet all five case studies look at secondary schools. Where should primary schools turn for best practice?

Evidence can be nuanced and slippery; it can be manipulated and gamed. 

We need a systematic, independent evaluation strategy, which draws on the views of all stakeholders, to identify the positive and negative outcomes, and recognise the unintended consequences. 

For now, my final questions are these: who will oversee the robust collection and analysis of all this evidence, so that we can learn from both the process of implementation and its impact? And, again: how will we know we are on the right path?

Megan Dixon is director of research at Holy Catholic Family Multi-Academy Trust
