Should Year 7s sit baseline tests this year?
Ashanti is ushered into a large hall where rows of desks face the front, a test paper on each one. The room is unfamiliar: she’s been to this school only once before, for a brief open evening nearly 12 months ago.
She takes a seat, as do the 200 or so other Year 7s who have started secondary school this week. She tries to get her bearings, but even holding the pen seems odd to her - when was the last time she actually held a pen rather than typed away at a keyboard? Having not spent time in school since March, it’s all a bit hard to adjust to and, as a result, she misses the start of the speech from the head of year, who has suddenly appeared at the front of the hall.
“…And we hope these little assessments will help us pinpoint where we can best support you this year, so we can begin to make up for all that lost time…”
Tests for Year 7s are not unusual. Many secondary schools deploy them, labelling them with words like “benchmark”, “baseline” or “diagnostic”, and they often buy them from external assessment companies. One of the big providers, Renaissance Learning, says that nearly 1 million of its Star tests were used in Year 7 across the UK last year. Another, GL Assessment, says that around half of UK schools, state and private, use its tests.
But will those tests still be used this year? Most schools are talking about recovery curricula that focus on pastoral support and reconnecting with the classroom, so making children sit a test might not fit the narrative.
Yet, students will be entering secondary school without Sats results and following a lot of disruption to their education, so aren’t such tests needed more than ever so that teachers get a good idea of their prior learning?
This tension between diagnostics and transition issues has long existed in schools, but this year’s unique circumstances mean that the problems are much more acute.
So, as a secondary teacher, what might this mean for your September plans for your youngest students?
The reason a school may choose to put Year 7s through a benchmarking assessment is clear: the tests answer a diagnostic need to find out “what it is that pupils do and don’t know”, says Stephen Tierney, who is chair of the Headteachers’ Roundtable and who retired as chief executive of Blessed Edward Bamber Catholic Multi-Academy Trust in Blackpool in December last year, after almost 20 years in school leadership in the town.
And the reason why an external assessment can be preferable to an in-school written assessment is also clear, he says: there are no national benchmarks between key stage 2 and GCSE, so these tests offer schools a chance to “look at how your pupils are doing in a large sample” and measure their progress against a national cohort.
Indeed, GL Assessment chief executive Greg Watson explains that schools can benchmark their students against populations of “tens of thousands” taking the test nationwide, and beyond that against “hundreds of thousands of students of the same age and with similar characteristics” who have taken the tests in earlier years.
All about that baseline
But in more normal times, would the data for Year 7 students not come from the Sats tests taken a few months earlier? Why put students through another test so soon?
Well, the level of trust in KS2 Sats results among many secondary teachers is, to put it euphemistically, limited.
Keziah Featherstone is headteacher at Q3 Academy Tipton in the West Midlands, which will welcome around 300 Year 7 students in September. She says that sometimes KS2 results and assessments from primary teachers “are very accurate”, but on other occasions “you think they look dodgy”. And she adds that by the time the Year 7s arrive, Sats “took place such a long time ago; anything could have happened to a child since then”.
She says that primary teachers “are under an awful lot of stress”, too: “The new-style Sats [use] quite a narrow curriculum and [primary schools] are absolutely hammering what is needed [for the tests].”
So while new Year 7 students “may be very articulate in writing recount texts about the Battle of Hastings, that doesn’t mean they understand source material, causality and the history knowledge you need” in secondary school, says Featherstone. “Therefore, that’s where you need to understand if they’ve got any kind of background in that.”
To do that, her school assesses new students via a mix of methods including cognitive tests, informal in-class tests, KS2 results and teacher assessments from primary school. She has set aside £6,000 for externally marked tests in her school’s budget for the coming year (which, she says, would be the case regardless of the pandemic and absence of Sats).
John Moore, UK director at Renaissance Learning, echoes the idea that there “isn’t a huge amount of trust in Sats”. He adds that secondary teachers “really want to start again to figure out: what have you got on your hands, what is the range of ability within a cohort?”
Watson agrees: “Coming back at the start of Year 7, what [GL’s tests] can provide, on day one of the new term, is a measure of where those students are right then”.
So, what do the tests look like? There is quite a lot of variation. GL Assessment provides three different kinds of tests, explains Watson: there are “measures of cognitive ability”; “measures of progress in the core subject areas of English, maths and science”; and assessments of “barriers to learning”, focused on “attitude, engagement and self-regard as a learner”, where there is a “sub-category” for special educational needs, such as dyslexia, dyscalculia or working-memory deficit.
Meanwhile, the Star assessments in reading and maths offered by Renaissance Learning are “adaptive” tests, which students can take several times without getting the same questions, explains Moore. Plus, he adds, the tests are designed to feel “low stakes” - and if candidates get a few questions wrong, the questions will become easier, with the idea being that all children feel positive after they have finished the test.
Despite the talk of recovery curricula, neither firm is expecting a drop in use of the tests for Year 7s - quite the opposite.
GL “virtually turned off the marketing” during lockdown to leave schools space, says Watson. But, he continues, “it was really noticeable that, beginning June onwards, we started getting schools ringing us instead”. Schools’ reasoning was consistent, he says. He paraphrases: “We’ve realised that we’re going to come back in September and we’re going to have a gap, particularly for new Year 7s who are going to be arriving, because we won’t have Sats results. What can we do?”
Renaissance Learning’s Moore similarly believes that demand from schools is “going to go up this year”.
His colleague James Bell, Renaissance Learning’s director of professional services, explains that in the wake of disruption to education, teachers are “going to spend weeks and weeks working out where those kids are; our testing can do it in 20 minutes”. He adds that the tests can “pinpoint which skills children need to learn next”.
And Watson says the tests can be part of the recovery curricula, particularly the Pupil Attitudes to Self and School (PASS) psychometric assessment. When the start of term comes, “I suspect…we’re not going to find that every child has fallen hugely behind and has lost the will to learn”, he says. “[But] the challenge for schools is to work out which students have fallen back further, which students have lost their motivation and engagement…I’ve definitely seen more interest [from schools] in those measures of engagement and wellbeing that we’ve got.”
Not everyone is convinced by these arguments. Some critics see the tests as a symptom of a broken system - a tool for the academic sorting of students to achieve the most efficient production of GCSE grades at the expense of a holistic education.
For example, David Putwain, a professor at Liverpool John Moores University’s School of Education, points to a “results-driven” culture in which secondary schools focus on GCSE results for fear of being “hammered by Ofsted” and possibly being “hammered internally if they are part of an academy chain that has those kinds of expectations”.
Stemming from a “narrative set by government”, the dominant model of quality in the English education system is “meritocracy based on differentiating those that are academically capable” from those perceived as not being so, he argues.
“I don’t think that, until that aspect of [policy] changes, we’re going to see much happening in terms of those kinds of benchmarks in Year 7, because that’s the system they fit into,” Putwain says.
Let’s talk about sets
There’s no sign of any of those accountability measures being lifted in response to the coronavirus disruption, so are schools being forced into diagnostic testing in Year 7 by systemic pressures rather than for the benefit of the child?
“Secondary schools are going to be judged on GCSE performance, so really from day one, everything is judged on that,” says Putwain. “So they are going to be looking to set [students] in ways that facilitate GCSE performance.”
Watson does not agree with this portrayal of the tests. He responds that setting would be an “ingredient” in the factors that might make schools use GL tests, but he doesn’t think “it’s the primary reason for doing it”. On this he has some support from Tierney, who says there are lots of ways schools set “without necessarily using externally assessed work”.
But Putwain also takes issue with the nature of some of the testing, highlighting GL’s CAT4 cognitive ability tests. These are “not called IQ tests…but essentially, they are the same thing”, he argues - and the problem with IQ tests is that “nobody can agree on what they do and what they measure”.
Some research, Putwain says, suggests that “people with [good] IQ test [scores] are just good at thinking fast … You could argue that is a really quite narrow approach to skill development”.
Watson rejects the suggestion that CAT4 tests are IQ tests, saying they look more broadly at cognitive ability by assessing verbal, non-verbal, quantitative and spatial skills. The tests aim to get at the core of “underlying ability” and to cut out the clouding effect of the school that a student went to or the level of parental support they have had, he says.
For some critics, though, it’s not necessarily the nature of the tests, or what they may be symptomatic of, that is the problem - it’s the timing. Peter Mattock, director of maths and numeracy at Brockington College in Leicestershire, criticised the use of baseline tests for Year 7 students in their first few days of school in a Tes article last year.
He has previously worked in schools that use “CATs testing as a matter of course” in Year 7, but Brockington doesn’t do that. Instead it relies on a mixture of internal and teacher assessment that “only works when you’ve got to know [students] a bit”.
One of the “major possible dangers” he identifies with blanket use of external assessment early in Year 7 “is around the anxiety that students have around transition”, meaning that such tests are “unlikely to get their best performance”.
Those transition issues are almost certainly going to be more pronounced this year, with transition events on both the primary and secondary sides severely limited and the school environment still being affected by coronavirus restrictions.
Across the border
But, not surprisingly, test providers think their assessments remain worthwhile, despite the disruption to education for incoming Year 7s. Renaissance Learning’s Moore argues that Star tests can help schools whether they want to “baseline where students are now” or “assess any learning losses”. He says that each test “will be personalised for each student and [an] extensive item bank means that it is impossible to teach to test”, even if anyone wanted to.
Similarly, Watson says GL’s tests “do not rely on previous academic attainment to be effective or useful”. And PASS will be relevant to the recovery agenda by providing “insight into how a student feels being back in the academic environment, for example, how confident they are commencing learning again and how mentally ready they are to embrace teaching and learning in a school environment”, he adds.
In avoiding setting and shunning Year 7 benchmarking tests, Mattock’s school is probably in the minority of secondaries in England. But in Scotland, whole-cohort diagnostic testing for first-year secondary students is not that prevalent, according to Scottish heads.
“We wouldn’t really, at the start of first year, focus on a test as such,” says Billy Burke, headteacher at Renfrew High School, located just west of Glasgow. “It would more be a combination of different types of formative assessment, just to confirm the learning profile we have from the primary experience.”
Why might Scotland differ in this respect? One key factor here might be the level of trust in the assessment results with which Scottish students arrive at secondary school. Scottish National Standardised Assessments (SNSAs) in literacy and numeracy, introduced in 2017, are taken in the final year of primary school, as well as at other stages of schooling.
These are formative assessments whose results are released only to teachers, and they provide “fairly comprehensive data back about what gaps there are” in knowledge, says John Rutter, headteacher of Inverness High School.
When it comes to setting, Burke says that, in general in Scotland in recent years, particularly in his own subject of maths, there has been “much more willingness to embrace the idea that we should try to prioritise young people’s attitude and mindset towards their own learning” and an awareness of how setting can lead students to a negative perception of their own ability as “fixed”.
The argument is that if you are not setting on academic lines, then a diagnostic test right at the start of the first year in high school is not as much of a requirement.
And another factor in a lack of external testing for Year 7s, says Rutter, is that there are “quite good transitional processes” in Scotland in terms of the wider information provided by primary schools to secondary schools on their new students.
In Renfrew, the high school and the town’s three primaries have “worked together on moderation activity to make sure there’s a consistency on the judgements coming through” and “consistency of learning approaches”, says Burke, a past president of School Leaders Scotland.
Scottish high schools may have different reasons from English secondaries for using external tests. Burke says his school has used CAT4 “where there’s rationale for it”, including in its approach to closing the “poverty attainment gap” among students “to drill a bit deeper into specific gaps or strengths for young people” - but not as a “blanket approach”.
So, where does that leave English secondary schools in September? With no Sats tests, and with the logistics of primary-secondary transition more difficult than ever, it’s unlikely we will see a significant shift away from benchmarking children in Year 7.
If anything, some suggest that we may see the beginnings of a push to have more national testing for this age group: there are those who believe a national benchmarking test at Year 7 would be preferable to one at Year 6. The idea has long been voiced in some quarters of the schools sector - and it is occasionally discussed in government, according to some who have been privy to those conversations. They add that the coronavirus-enforced break in Sats could potentially give the idea another push.
Such a shift might not be popular with many secondary heads; some feel that the current system in which secondaries can carry out testing for their own ends without the insertion of government’s sizeable oar is working quite nicely, thanks all the same.
But what may happen owing to the coronavirus disruption is that schools begin to plan the timing and influence of those tests more carefully. Do they need to be done right at the start of the year? What framework of other assessments do they need to be interpreted within? And how much repeat testing may need to be done to get an accurate picture?
Teenagers are complex creatures, after all. As Burke says of the impact of the pandemic, the “teenage years can be challenging enough in terms of mental health and wellbeing…Schools will absolutely need to focus on people’s readiness to engage and learn, and just reforming those good routines”.
Featherstone agrees, stating that students “can have done lots of work [during lockdown], but if no one has looked after their emotional wellbeing, they will be in bits, really…We will recover [students] educationally. We just need to make sure we’re also looking after them and we’re not putting too much pressure on them to be academically brilliant by October, as if they haven’t missed anything at all”.
So, Ashanti will likely still walk into that school hall for a diagnostic test, but it may be that in her school, they opt to run the tests a little later in the year than with previous cohorts and analyse those assessments within a little more context. Clearly, schools will still feel the drivers to use these external tests for Year 7s while the English system - and the politics around it - remains the same. But they may choose to use them in a slightly different way this year.
John Morgan is a freelance journalist
This article originally appeared in the 21 August 2020 issue under the headline “Are traditional benchmark tests Covid-proof?”