Data on baseball players’ performance was plentiful long before the 1970s, but it wasn’t until a man called Bill James started compiling the numbers that baseball statistics could truly be said to identify effective players. And it wasn’t until the general manager of a team called the Oakland A’s started using statistics about players in the 1990s that they really began to be put to winning effect.
That general manager was Billy Beane, who pioneered a more intelligent way of selecting baseball players. His budget was half that of his major competitors, but those competitors assembled teams based on inaccurate statistics and scouts’ instincts about who had potential, character or the right “look”.
Beane beat the system by using data that reflected players’ achievements and contributions; he identified individuals who were going cheap because their skills had gone unrecognised by richer teams. By focusing on the information that mattered and using it wisely, the A’s achieved impressive winning streaks against far richer sides, repeatedly finishing near the top of their league. Their exploits were so remarkable that they were recorded in Michael Lewis’s book Moneyball in 2003, which became an Oscar-nominated film starring Brad Pitt in 2011.
What the book and the film show is how carefully selected assessments transform our understanding and pave the way for success. Which brings us to education: where do we sit on the spectrum ranging from inaccurate pre-1970s baseball stats to the Billy Beanes of this world?
Schools have come to track students’ progress around the baseball diamond more diligently using data. But alongside the benefits, perverse incentives and inaccurate judgements have thrived. Teachers resemble coaching staff on the sidelines: they meticulously count fielders’ errors, spending hours on assessments and data entry, yet learn little and are distracted from students’ needs.
School leaders resemble general managers: they are fixated on big hits and flashy “steals”, roaring for temporary progress while overlooking deeper knowledge and understanding.
Parents, meanwhile, resemble fans: they agonise over meaningless statistics, trying vainly to compare a 4c in maths with a 3b in French, or decrying the loss of “certainty” in schools without levels.
Most assessments used in schools are disconnected from students’ learning: they tell teachers little and parents even less. The belief in linear progress has led us to march our students neatly from one sub-level to the next (whether a school still calls it a sub-level or not, that is what most are aiming to do). The knowledge that this is meaningless has tempted us to report inaccurately.
Over the years I have wasted a disproportionate amount of my students’ time teaching them flashy turns of phrase to show that they could “begin to make links between causes” (and so reach Level 5). Looking back, it was time I should have spent helping students to understand the links between the events themselves.
Meanwhile, a colleague’s assessment of a Year 7 student in history was once overruled as incorrect; the head of year argued that since the pupil had made progress in every other subject during that term, she must have improved in history.
With the drumbeat of “progress” in our ears, we invariably round students’ attainment up, ignoring the gaps in their knowledge.
Like the world of baseball, we have been measuring the wrong things in the wrong ways. The current systems aim for a “home run” of progress – without taking any time to analyse whether the hit was skilled, lucky, or a mishit and so not replicable.
So how can we get a little Billy Beane-like mentality into our schools?
How we test is the problem… not the testing itself
Firstly, we must all accept the value of assessment, and recognise that bemoaning a “testing culture” is tempting but unhelpful. The claim that “Weighing the pig doesn’t fatten it” might seem appropriate to some, but it is both an unfortunate metaphor and an appeal to ignorance.
To help our students we must know how well they are doing. This demands frequent, accurate assessment. For this to work, we need three things: simple, useful assessment for teachers; accurate long-term measures for schools; and clear reporting for parents. Data denial and gut instinct provide no substitutes.
Effective assessment for teachers
What matters most for teachers is knowing how much students have learned in each lesson. Assessment systems are dictated by school policies, but two key techniques will work whatever system a school uses: exit tickets and hinge questions.
An exit ticket – an end-of-lesson question or two encapsulating the lesson objective, taking just three minutes – tells us whether our students have learned what we set out to teach them. Five minutes assessing a class set of exit tickets tells us what students need to do in the next lesson.
Hinge questions offer similar information during the lesson itself, allowing us to adapt our teaching immediately.
As an exit ticket, I might ask students, “How did the failings of the provisional government contribute to the October Revolution?” to check if they can remember and explain the lesson’s key points clearly.
As a mid-lesson hinge question, I could ask my class, “Which aspects of the provisional government contributed to the October Revolution?” This would allow me to check whether the students are keeping track by offering choices that include other causes of the revolution and successes of the provisional government, as well as its failings.
Exit tickets and hinge questions can offer teachers a clear understanding of student learning and a clear conscience about assessment. We have to look at the full picture, not just the swing.
Effective, long-term measures for schools
At a whole-school level, school leaders and heads of department can create better testing systems. Michael Fordham, assistant headteacher at West London Free School, has a good model to obtain a rounded picture of pupils’ progress. He suggests three “modes”: Mode 1 – frequent, low-stakes knowledge tests (times tables in maths); Mode 2 – milestone work at the end of a sequence of lessons (a science report or music performance); and Mode 3 – end-of-year exams.
Mode 1 tests offer teachers a sense of week-by-week progress; they also allow useful comparisons between classes: for instance, Mr Andrews would benefit from asking Ms Kelly how she taught the Blitz as her students did far better on the test.
Mode 2 assessments provide accomplishments to work towards and “real world” tasks to complement Mode 1 tests of knowledge.
Mode 3 offers a culmination of the year and a chance to check how much students have retained from the year and how much they can now do compared to last year.
This combination of modes covers the different needs of the school in a practical and helpful way.
Two supports would make the three modes possible. Firstly, we should move to a system of comparative judgement: examining essays A and B and deciding which is better. Then examining essays B and C. And so on.
This would allow us to focus on what makes a piece of work good – rather than agonising over whether a student’s choice of word helps them to fit the vague criteria in a mark scheme.
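For readers curious about the mechanics, the pairwise process can be sketched in a few lines of Python. This is purely an illustrative toy, and the essay names, quality scores and judge function are all invented for the example; real comparative-judgement systems pool many judges’ decisions across many pairings and fit a statistical model, rather than simply judging each pair once and counting wins.

```python
from itertools import combinations
from collections import Counter

def comparative_judgement(essays, judge):
    """Rank essays by pairwise comparison: every pair is judged once,
    and essays are ordered by how often they were preferred."""
    wins = Counter({e: 0 for e in essays})
    for a, b in combinations(essays, 2):
        wins[judge(a, b)] += 1  # the judge returns the better of the two
    return sorted(essays, key=lambda e: wins[e], reverse=True)

# Toy judge for illustration: assume a known quality score decides each pair
quality = {"essay_A": 3, "essay_B": 1, "essay_C": 2}
judge = lambda a, b: a if quality[a] >= quality[b] else b

print(comparative_judgement(list(quality), judge))
# prints ['essay_A', 'essay_C', 'essay_B']
```

The appeal of the approach is that each decision is a simple, reliable one ("which of these two is better?"), and a defensible rank order emerges from many such decisions.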
Secondly, the Commission on Assessment Without Levels’ suggestion of a national bank of questions would save time spent on creating questions afresh for every lesson in every classroom.
Bringing all these things together, I could identify whether the work my class has completed is truly good – against national benchmarks. This would provide genuinely useful information and a sense of how well students are doing and how they can improve. At the very least, it would be more helpful than judging whether my class is doing well by comparing the percentage who have reached level 4 in history with the percentage at level 4 in French.
Effective reporting for parents
Lucy Powell, the shadow education secretary, called for the wrong cure in suggesting the return of levels, but she diagnosed an important problem: “as a parent”, she noted, the removal of levels has “left me with little to go on”.
The system of hinge questions and exit tickets, or of three modes of assessment, could be used for this purpose, but I’m not sure how clear either would be to parents.
End-of-year exam marks are worth sharing, but regularly sending marks home is not particularly helpful: progress being slow and uncertain, those marks will, by their nature, show little change.
I find Chris Waugh, head of English at the London National School, and his “Unlock Achievement” approach inspiring. Students unlock “achievement badges” on the courses they study when they complete a particular task to a certain standard.
One such achievement is performing a soliloquy. Students take the test when they are ready and can repeat it if they struggle the first time. Teachers have exemplars of the desired standard to compare against (Chris has adopted comparative judgement). If students are successful, parents receive an email explaining their child’s achievement, with the teacher’s comments and the work attached, too: they have a genuine, tangible sense of their child’s successes.
This is not the whole answer to communicating effectively with parents, but it provides a key part of the puzzle.
Like all teachers and school leaders, Billy Beane found himself in an uncertain environment facing significant financial pressure and demands for strong performance.
The baseball genius didn’t get the answer completely right first time, but his approach certainly points us in the right direction. I don’t pretend that these answers are a complete solution either, but they represent the first steps to the excellent assessment that our students deserve.
Harry Fletcher-Wood is associate director – knowledge development at Teach First @HFletcherWood
The London National School’s Chris Waugh discusses the “Unlock Achievement” approach of his You Choose English curriculum: bit.ly/UnlockAchievement
West London Free School assistant headteacher Michael Fordham outlines his three modes of assessment: bit.ly/BeyondLevels
Harry Fletcher-Wood considers how hinge questions work: bit.ly/HingeQuestions