... while Mick McManus concludes that the Office for Standards in Education did not do justice to it. School-centred initial teacher training produces new teachers who are slightly better overall than experienced teachers; heads consider more than 90 per cent of them to be at least adequately prepared for the profession; and in not one case does Her Majesty's Inspectorate find them to be ill-suited to teaching (although 10 per cent of conventionally trained teachers are found to be so).
These findings contrast with the less positive picture that has been widely reported as the view of the Office for Standards in Education. Where have I got them from? The recent OFSTED report (paragraphs 32, 33, 45). It is not a very good report by OFSTED standards: turgid prose, misplaced commas, "practise" used as a noun and "dependant" as an adjective. Heads should roll, or maybe role.
The report's finding that Scitt students' lessons are not up to the standard of those of ordinary PGCE students is simply not tenable on the evidence given. The report acknowledges that its sample was skewed towards technology (weak nationally) and that the courses were new, but tells us that 77 per cent of students were observed teaching. When universities were visited by HMI, nothing like this figure was achieved.
We were asked to supply a balanced sample and of course we all did. We all know, deep in our hearts, that other universities identified a slightly more balanced sample than we did. With schools spread over two or three counties, as many universities have, HMI had no hope of checking the balance by random visits. The slight differences between Scitt schemes and conventional ones can all be attributed to this sampling error.
The report itself urges caution, and phrases like "in common with established PGCE courses... like their counterparts... as is often the case..." occur on every other page. The only safe conclusion is that, despite their inexperience, Scitt schemes produce teachers who are as good as any others. The report gives no clue as to how this is accomplished.
I have had limited contact with only two Scitt schemes, but as far as I can tell their effectiveness is largely due to a factor which is pretty much out of university teacher-trainers' control: Scitt schemes apply all their funds to the training of the students, without any top-slicing for buildings or central services. Universities cannot disregard their central costs in the way that schools can: unlike schools, they have no other business that is their main source of funds.
This financial freedom has two important consequences. One is that Scitt schemes can afford rigorous and time-consuming selection procedures. Students typically have several interviews in more than one school and ample opportunity to show their competence and commitment (or lack of it). They are not actually required to take part in a teach-off against the clock but they might as well be.
Second, resources are applied to the vital point of delivery: the link between the student, the subject mentor and other members of the department. Scitt students I have spoken to regard their subject mentor as their main point of contact.
Schools of education, starved of resources and unable to transfer enough funds to their partner-schools, know how little can reach subject teachers in the shape of guaranteed time.
The OFSTED report does not do justice to the Scitt schemes and offers no lessons for the rest of us. It has nothing to say about finance and nothing about selection procedures. Two key factors are what you start with and what you spend on it.
Mick McManus is principal lecturer at the School of Education, Leeds Metropolitan University, and external examiner for a Scitt scheme