They are the teachers who can decide the fate of millions of students every year - not to mention that of their own colleagues - with the stroke of a pen or the click of a mouse.
But now the tables are being turned on hundreds of examiners who are about to face their own high stakes tests. Global education company Pearson has begun to put 650 examiners, moderators and staff from its UK Edexcel exam board through their paces in a bid to improve the quality of assessment.
Appropriately enough there will be an exam at the end of it, with the Durham University academics running the programme warning that some will not make the grade.
"'We want it to be tough,' is what (Pearson) said," according to Professor Robert Coe. "And we said: 'Well, you know people will fail?' And they said 'Yes'."
Many of the examiners taking the course have been setting, moderating or marking papers for years, if not decades. But Martin Walker, another of the joint Durham University and Chartered Institute of Educational Assessors (CIEA) team running the course, said they may not be up to date with the latest assessment techniques.
"You are now able to generate an awful lot more interesting data than we were able to do 10 or 20 years ago, because of on-screen marking and all sorts of things," the academic said.
This new abundance of data can be used to analyse, with far greater precision and ease, whether exam papers and questions are doing the job they are supposed to do.
The course has come out of a very difficult period for England's school exam industry.
First, a newspaper exposé in December 2011 revealed how boards competing for schools' business used teacher seminars to give secret advice on how to do well in their qualifications. This was followed in the summer of 2012 by the GCSE English grading controversy.
Those problems led to government plans, now abandoned, to have a franchising system with a single exam board for each subject.
"I guess people (in exam boards) were thinking 'we need to be showing that we have got our act together here'," Professor Coe said. "Quite rightly, I think."
Julius Lang from the CIEA felt that the problems had been deep-rooted, pointing to the description of the exam system as "a cottage industry" in 2002 by Ken Boston, then head of the Qualifications and Curriculum Authority.
"I think that was true," Mr Lang said. "People became quite senior in organisations like this (exam boards) just because if you stuck around for long enough you would get the job."
Since then, new technology has revolutionised the way the system is run, but Professor Coe said the potential benefits had yet to be fully realised.
"Researchers working in the exam boards have been using these kinds of techniques for a quite a long time and part of the issue is about spreading that capacity across a much wider base," he said.
Expertise among examiners had traditionally been subject- rather than assessment-based, Professor Coe added.
Or, as Mr Walker said: "If you are in charge of an art exam, you are going to be perfectly good at art... but if we gave you some statistics on the (exam) paper that might be something that wasn't terribly meaningful.
"If we are all speaking the same language we can have a very different quality debate about test performance."
The first batch of examiners to complete the training will take the end-of-course test in the spring. When TES attended a course session in London, the initial feedback was positive.
"I have found this fantastic, absolutely brilliant," said Philip Holmes, a chief examiner for applied A level in media. "This will really help me to understand the statistical side of exams and to choose the right questions in the right language."
But John Dunford, CIEA chair, said that to deliver lasting improvements to the exam system, all boards now needed to ensure that their examiners were "fully trained in the science of assessment".
"This is the start of professionalising the examination workforce which hitherto has been the territory of the gifted amateur," he said.