Weaker pupils fall into the dip
What is really going on in Year 3? When Year 2 teachers wave their seven-year-old national test veterans off to the key stage 2 corridor, or to the juniors' building, what do they think awaits them?
Quite possibly, academic stagnation, says Steve Anwyll, director of the national literacy strategy, which next April will introduce new booster literacy training for Year 3 teachers: "Talking to teachers in Year 2, I think they are often disappointed that children do not make more progress in Year 3."
Tim Coulson, director of the national numeracy strategy, which this term is piloting new-style Year 3 lessons, agrees: "There does not seem to be quite the same urgency in Year 3, quite the same feeling that teachers need to make use of every lesson, as there was in Year 2. So we are piloting these plans which up the pace a bit, which show what we think it is possible to get done by the end of the week."
Both these primary strategists are trying to address what has become almost legendarily known as the "Year 3 dip". The dip appeared in national studies in the 1970s and 1980s. It was charted by the Office for Standards in Education in annual reports from 1994-98, when Year 3 teachers were consistently revealed to gain fewer top inspection grades, and more unsatisfactory ones, than any other primary teachers.
The traditional explanation for all this is simple. Heads have put their newest and weakest teachers into Year 3 classrooms, where they can do least harm. But is that true today? Since Ofsted stopped assessing children's progress in inspected lessons in 1998, is there hard evidence that the dip still even exists? And if it does, can it still be blamed, when teachers are using nationally-approved teaching methods, on weak Year 3 teaching?
Two new sets of figures produced by The TES shed fresh light on what is happening in Year 3. Combined with a research report into children's and teachers' experiences in Year 3, they suggest that the dip does indeed exist, but its causes will take more than extra literacy and maths training to overcome.
Since the Qualifications and Curriculum Authority introduced its "optional" tests for Years 3 to 5, it has been possible in theory to chart children's annual progress. Each year the QCA puts on its website the results of the performance of a sizeable sample of children (about 10,000) in optional tests, mapped against their performance in national tests. It is a progress chart.
The figures for 2001 have been analysed by The TES. They show that a large group of children - often a third, sometimes nearly half - appear to make no progress, or even go backwards, during Year 3 (see table, left). The rate of progress is better in reading than in writing or maths, and able children appear to make more progress.
Some children make big improvements: in maths, for example, 44 per cent of 2b scorers in national tests at seven went up one sub-level to 2a (the annual progress they should make), while another 23 per cent leapt to level 3. Yet more than 20 per cent of 2b scorers in national tests stuck at 2b throughout Year 3, and another 12 per cent of the children went down after a year to 2c or even level 1.
This may be because these children scraped, or were crammed into, a 2b at the end of Year 2. It may also be because optional tests are not precisely the same as national tests, as QCA assessment manager Jackie Bawden points out: "The way the optional tests are taken and used and marked is different from national tests."
But while these figures suggest that many children make no progress in Year 3, they do not point the finger at the quality of teaching. And nor, ironically, does evidence from Ofsted itself. In fact, annual HM chief inspectors' reports over the past five years, collated by The TES, show an apparent huge improvement in the quality of Year 3 teaching (see graph, right).
Although still fractionally weaker than in other primary years, the absolute grades scored by Year 3 teachers are far better now than those scored by teachers in the "best" primary classes a few years ago. If, as the QCA figures suggest, the dip still exists, then it is hard to see how it can be blamed on this cohort of massively improved teachers.
So what might be its cause instead? An investigation led by Professor Jean Rudduck of Cambridge University's school of education suggests that the key-stage changeover itself may be to blame.
Interviewing heads, teachers and Year 3 pupils themselves, she and her colleagues found the children both excited and anxious at the prospect of moving up a stage, while teachers stressed that they would be making many more demands on the children.
They would be expected to work more independently and collaboratively, to cover more curriculum content, and to work for longer and produce larger quantities of writing than in Year 2, teachers said.
The study found that where children's academic foundations were shakier, and/or where teachers did not explicitly teach pupils how to meet these new demands, disillusionment and disappointment then set in.
These problems could be compounded by new surroundings, new adults and new rules; disrupted friendships and less parental help, either because schools had over-stressed the importance of children working independently, or because parents themselves were less academically confident with Year 3 work.
"Teachers were saying that the children had been mollycoddled in KS1 and now they had to be more independent and take more responsibility," Professor Rudduck explained. "But academically that is quite difficult unless they are supported.
"It is very patchy: it varies a lot from school to school. Where there are good patterns of induction from Year 2 to Year 3, then it is more likely pupils and parents will understand the demands and cope with them.
"Where it is haphazard, children suffer a fall-off of motivation and confidence."
The report "Sustaining Pupils' Progress at Year 3" can be obtained by sending an A4 SAE and £1 (cheques to University of Cambridge) to: Nicola Daily, Faculty of Education, Homerton College, Cambridge, CB2 2PH

YEAR 3: STANDING STILL OR FALLING BEHIND
[Table not available]
NOTE: This table, created by The TES, reveals the number of children who appear to make no progress in Year 3.
The figures are taken from QCA statistics relating a large sample of children's Year 2 national test scores in 2000 to their subsequent performance in Year 3 optional tests in 2001.
Those children who either achieved the same level at the end of Years 2 and 3, or who achieved a lower level at the end of Year 3 than they did a year earlier, are counted as having made no progress.
*More children probably made progress in this group than these figures suggest. This is because level 1 is not subdivided into 1a, 1b, 1c. Since children are only expected to move one subdivision in a year, some children who scored a low level 1 in their KS1 national tests and a high level 1 in their Year 3 tests will not see their achievement reflected here.
**More children probably stood still in this group than these figures suggest. This is because the Year 3 tests divide children into 3c, 3b, and 3a, but the KS1 national tests do not. We have only counted those children who achieved level 3c at the end of Year 3, and who got level 3 in their national tests, as having stood still. This is an underestimate, as some children will already have achieved the equivalent of 3b or 3a in their KS1 national tests, and remained there or gone back into the 3c group a year later.