Why you're wrong about grammar
In a 12th-century stone relief at Chartres Cathedral, a po-faced personification of grammar holds a scourge threateningly, while two children cower at their books beneath her.
For too many, this resonates with their own experience of grammatical study. A penchant for grammar smacks of dusty pedantry, squeaky chalk and mortar boards. It conjures up black-and-white mental images of canes and gruel.
Unsurprising, then, that the government’s attempts to re-emphasise the importance of grammar through systematic teaching and testing have been much criticised, not least in the pages of this magazine, where critics have painted pictures of bewildered children robbed of their childhood by fronted adverbials, driven to distraction by antonyms and under ruthless assault from ellipses.
Geoffrey Willans’ literary creation, the grammar-lite Molesworth, would no doubt see such a prescriptive regime as an utter “swiz”, dreamed up by sadistic “skool masters”, and beloved only of “swots, wets and weeds”.
Is he right? Or can our collective perception of grammar be rehabilitated? I believe so, but there are minefields to negotiate surrounding questions of individuality, hierarchical control, creativity and freedom of expression.
A ‘trivium’ matter
In the classical world, grammar’s pre-eminent status was unchallengeable. It was one of the three foundational arts – the “trivium” – alongside dialectic (logical reasoning), and rhetoric (the art of communication and persuasion). These three paths put a young Greek firmly en route to becoming an educated, fulfilled human being capable of successful contribution to his community: healthy, wealthy and wise. So how did grammar end up with the bad reputation that it has today?
Perhaps the problem has lain in a tendency to think of (or indeed teach) grammar as the dissection of living language into dry, atomised chunks, which are labelled and filed in the appropriate place by the grammarian, robbing our dynamic language of its natural spontaneity and creative possibilities. No doubt plenty of students have had good reason to think of grammar this way, having associated it with rote-learning endings and principal parts of decontextualised Latin verbs. (Fero, ferre, tuli and latum are all from the same verb? Give us a chance!)
The mistake, of course, has been to define grammar as though it were only the drudgerous element of language learning – hard, colourless, but necessary work in laying the foundations for distant understanding – rather than expanding our definition to include a more generalised sense of discovering meaning in our communications with one another. Philosopher Anthony O’Hear defines grammar in promisingly broad terms, as “the ability to read and express subtleties of meaning in words”.
Language generates meaning in a way that ought to be a source of constant wonder and fascination. We are all guilty of taking it for granted every day. By means of varying our bodily grunts, hisses, tuts, ticks and exhalations, human beings can express feelings, give instructions, joke, tell stories, chide, offend, flatter, and so on and so on. Incredibly, we can achieve exactly the same effects with squiggles and scratches on a page.
It is surely no exaggeration to say that language, spoken and written, is the evolutionary development on which the whole cooperative edifice of civilisation rests, and grammar is the gateway to our understanding of this extraordinary capacity.
We marvel at and delight in the effects of the whole when we read Shakespeare – or Molesworth – but would it not be contrary to our natural inquisitiveness if we weren’t also eager to look under the bonnet and peer at the inner workings? This is what true grammarians seek to do, in a way that is perhaps analogous to science. Scientists endeavour to examine the constituent parts of natural things, and to discover how they interrelate to produce their effects. Children frequently manifest a natural fascination with all this. Why should language be different?
Grammar shows us how the positional relationships and inner logic of our words contribute to producing this almost-miraculous phenomenon of meaning-making. Perhaps there has been a failing of schools and society to adequately draw attention (oo-er, a split infinitive – can I get away with that, editor?) to what a truly remarkable development this has been in the history of life on earth.
The failure is not that of grammar, but of how we have sold it. Natural science has its David Attenborough. Who flies the flag for grammar?
Battle of Wittgensteins
Attempts have been made, but they have not necessarily been populist ones. The history of 20th-century philosophy shows a serious preoccupation with understanding the nature and inner workings of language and how this relates to the production of meaning.
For Wittgenstein (almost universally acknowledged as the greatest philosopher of this period), to understand the inner workings of language is to understand everything that can be understood. No small claim: “The limits of my language mean the limits of my world.”
When he completed his first theory of language (still in his twenties), he was convinced that he had solved all the problems of philosophy, and gave up the subject to become a primary schoolteacher. (He had earlier patented a design for a jet-tipped aircraft propeller. Of course.)
As detailed in the Tractatus Logico-Philosophicus, for him the great majority of philosophical and human misunderstanding came from a failure to understand and use language in a properly orderly manner. His model for comprehending the inner workings of language was inspired by a court case in which model cars were used to reconstruct a road accident. The way each element was positioned in relation to the others was the means by which the model as a whole was able to represent reality.
This was Wittgenstein’s lightbulb moment: language was the same, he thought. Words represent objects in the real world, and the way we place the words in relation to one another is what enables us to mirror the structure of the thing we are describing.
In another metaphor, he compares words to atoms, which can be arranged in different patterns to reflect different states of reality. He also draws a powerful analogy between words and musical notes on a page, which stand logically for particular relations of sounds.
Wittgenstein sets out his theory in a dense series of propositions, which become increasingly mathematical and, to the layman, impenetrable in their use of logic.
But his fellow philosophers, who had also been wrestling with how to bring language on to a more scientific footing, fell for it hook, line and sinker. He persuaded intellectual giants of his age – Bertrand Russell and A J Ayer, as well as the Vienna Circle – of the correctness of his “atomistic” model of language. Or, as some have called it, “picture theory”, whereby words depict reality by direct correspondence.
Wittgenstein’s proposed grammar of logic sparked a revolution in thinking about language. So-called logical positivists urged us to use only propositions that could be verified through our senses or logic. Everything else, we were assured, was meaningless and liable to lead us into misunderstanding and conflict.
Expressions of emotion, for example, could not be checked for factual accuracy, and so could not be thought to belong to the realm of meaningful language.
It is no coincidence that George Orwell’s satirical, dystopian concept of “Newspeak”, in which language is ruthlessly pared back to its bare essentials in order to exert greater control over its users and prevent thoughtcrime, appears in Nineteen Eighty-Four, published in 1949, while the logical positivists were still just about taking themselves seriously.
The idea that we might reduce language to simplified, unambiguous expressions of fact and logic was alive and well until the middle of the century, fuelled in part by a desire for greater cross-cultural understanding after the First World War. Charles Kay Ogden’s Basic English, published in 1930, all but did away with verbs, reducing them to 18 “operators”. He argued: “What the world needs is about 1,000 more dead languages, and one more alive.” To be fair to Ogden, his approach was primarily aimed at second-language learners, but his distaste for variety, ambiguity and irregularity in language was by no means uncommon in his day.
Orwell appears to have been initially attracted by Ogden’s project, with its emphasis on simplicity and mutual understanding. But he rejects it and similar approaches decisively in his essay Politics and the English Language, in which he shows how an appearance of scientific precision is often a thin mask for indefensible political ideology, and all too often robs language of its potent imagery and originality.
He gives the example of Ecclesiastes 9:11: “I returned and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.” He translates this as: “Objective consideration of contemporary phenomena compels the conclusion that success or failure in competitive activities exhibits no tendency to be commensurate with innate capacity, but that a considerable element of the unpredictable must invariably be taken into account.”
The artlessness of the latter text shows how very dehumanising the reductive approach was in practice – however appealing the initial theory. It is as though Google Translate were having a particularly bad day. And the notion did little to persuade people of how important grammar was to their lives.
Intransigent on intransitive verbs
Those opposed to the current government’s drive for greater awareness of the rules and labels of grammar might point out that the far-fetched, controlling and mechanistic theories of language in the early 20th century were developing against a backdrop of totalitarian politics, and that the two were not unconnected. Criticisms of the key stage 2 spelling, punctuation and grammar test in English primary schools certainly included an insinuation – if not a direct accusation – that the government was trying to control and restrict how children approached their writing, as well as the interpretation of texts.
Yet, while language can certainly be an instrument of control, this would be a harsh assessment of the government’s intentions. The guidance on teaching English at KS1 and KS2, in which (many) prescribed grammar terms are listed, says: “A high-quality education in English will teach pupils to speak and write fluently so that they can communicate their ideas and emotions to others and through their reading and listening, others can communicate with them. Through reading in particular, pupils have a chance to develop culturally, emotionally, intellectually, socially and spiritually. Literature, especially, plays a key role in such development.”
This all sounds eminently sensible, and it would be hard to argue with a jot or tittle of it. Of course we want our young people to have the tools to discuss the literature that they encounter and how and why it achieves the effects that it does.
But where the government has erred is in the age-appropriateness (or lack thereof) of the grammatical terminology chosen to support the learning of 10- and 11-year-olds. Without wishing to lower expectations, it is hard to imagine a discussion of grapheme-phoneme correspondences or the morphology of intransitive verbs in the playground, unless that playground had been infiltrated by linguistics graduates. It is also hard to imagine how such advanced technical apparatus would add to the appreciation of literature aimed at people of this age.
If there are good reasons for introducing grammatical concepts to young people, and there are, then they ought to be introduced at a sensibly graduated pace, rather than front-loading them and pinning them to a high-stakes exam.
A gentler approach – one that might support the government’s perfectly legitimate aims – also lies, interestingly, in Wittgenstein. In the greatest volte-face in the history of philosophy, he realised that he had got it all wrong, came out of philosophical retirement and had another go at explaining how we should think about the dynamics of language. His concept of “language games” was as influential as his first model, and casts some light on the relevance of grammar to our young people.
Rather than insisting that people converge on a single, agreed grammatical approach, Wittgenstein embraced and celebrated the sheer diversity of uses we discover in everyday language. Language as a whole is like a complex of games, each with different aims and rules, he noted. Sometimes the games overlap because of a family resemblance between them, but it is at these points of overlap that we risk falling into misunderstanding – by using the rules of one game in pursuit of another.
For example, when someone fails to realise that their conversational partner is playing the language game of “joking”, communication breaks down, and offence is very possibly taken. Likewise, a religious person speaking of their mystical experience will find little understanding in conversation with someone who only plays the language game of scientific reductionism.
To communicate effectively, we have to understand the background assumptions of the other person, and how those assumptions shape the language game each party is playing. It would be inappropriate, for example, to ask young children to play the language game of advanced linguistics.
A laudable intention would be to ensure that children of any and all backgrounds are able to engage in such games. This is to sell grammar as social and cultural glue – a community cohesion strategy. It promises to assist social mobility. While critics cry pedantry, and point to the reality of grammatical flexibility in most forms of communication – emails, texts, informal conversations – there is an argument to say that young people need to understand why and when they are abandoning formal conventions so that they can switch between different kinds of appropriateness, depending on their context.
Few are bothered by a missed apostrophe in a text message, but there are plenty of oversubscribed employers who would discard an application on the basis of an inability to use full stops properly. It is empowering to be able to alter one’s language depending on the audience, and disempowering not to know how to do so.
A focus on grammar would enable students to overcome social and educational disadvantages by gaining a facility with the language of the advantaged. It would also help every student to access the richness of our literary heritage and the full breadth of the society in which we live.
In the end, Wittgenstein and (eventually) his early followers realised that their first approach – ruthlessly reductive and tending towards a rule-driven, authoritarian attitude towards its users, usually in pursuit of a single, ideological goal – squeezed out far too much of the sort of language that people need in order to do the sheer variety of things they want to do.
His later approach was far more relativistic, recognising that it is necessary and liberating to know the different rules of engagement for different kinds of discourse, not least so that people might better understand those who are different from them, without having to conclude that they are wrong. It speaks more hopefully to the diverse social and linguistic make-up of schools, where our aim must surely be both to show acceptance of difference, and yet enable commonality in order to equalise opportunity. In this guise, Wittgenstein can be an Attenborough figure.
But he is not alone. MIT linguist and philosopher Noam Chomsky famously developed the theory of “universal grammar” in the second half of the 20th century. Chomsky argues that there is an evolved genetic capacity in all humans to structure the sounds of language according to very basic underlying rules, and that all languages can in the end be understood in these terms. This is a relatively recent genetic endowment: language has been around for only tens of thousands of years – a drop in the evolutionary ocean.
If Chomsky is right, what difference does it make? It implies that the essentials of communication are assured, whether or not we teach complex grammar. A loosening of the grammatical reins under certain conditions will not lead to communication meltdown. It also implies that the basic rules of grammar are indeed just that – basic – and thus comprehensible to all, even across languages, which should hearten our young linguists.
As well as this, it draws attention to our shared humanity, underscoring the fact that we have far more in common than the things that divide us. It reflects the deeply corporate nature of our species and should militate against the more individualistic interpretations of human nature. And it should fire our inquisitiveness that such an extraordinary, world-changing mutation happened at all.
Children’s relationship with grammar should not be that of slavish subservience to the academic establishment. An emerging awareness of the endless possibilities of language should fire their imaginations. But, as in so much of their lives, they should also understand the need to make nuanced, informed decisions about when to conform to conventions so that they might effectively play their part in a corporate body, and when to bend them creatively to express their individuality. Otherwise, we rob them of choices about how they can use language to their own ends.
Alistair McConville is deputy head (academic) at Bedales School in Hampshire
Does spelling matter?
English spelling started out as a relatively straightforward means of mapping spoken sounds onto written symbols, although – since the alphabet was borrowed from Latin with a different set of sounds – it presented problems even for the Anglo-Saxons.
The relationship between speech and writing has been further disrupted by subsequent changes in pronunciation – we no longer sound the “k” and “gh” in knight but continue to write it that way. The flood of words borrowed from other languages, with different uses for the same letters (compare cat, centre and ciabatta), has complicated the system still further.
Attempts to reform spelling to make it easier to learn go back to the 16th century, but have failed to find a workable solution. Restoring the link between spelling and speech fails because of the variety of English accents (should card be spelled with an “r” or without?), while the desire to make English spelling resemble Latin by introducing silent letters into words like doubt and debt drove speech and writing further apart.
One question that is seldom raised is whether we all need to spell the same way.
In the Middle Ages, there were hundreds of spellings of common words like “through”, including drowgh, yhurght, trghug, trowffe.
A fixed system is the legacy of the printing press; even in the 19th century, authors like Austen and Dickens left spelling and punctuation to their printers.
Today, we are witnessing a destandardising of spelling in electronic discourse, where variant spelling is tolerated, punctuation marks are repurposed and traditional processes of editing are increasingly bypassed.
Given these changes, perhaps it is time we questioned our emphasis on correct spelling, and the hours spent teaching and testing a child’s grasp of a system that is riddled with inconsistencies. Although we can’t fix the spelling system in English, we can keep its significance in perspective.
There never has been a golden age when everyone had the system mastered; most adults have blind-spots that have the potential to cause them embarrassment (how many “r”s and “s”s?). Should a misplaced apostrophe be sufficient reason to dismiss someone as illiterate? Does an ability to spell necessarily signal a more accomplished writer? Roald Dahl’s letters reveal that his spelling was little better than that of the BFG, and yet he was one of the most successful writers of the 20th century.
After all, as Rabbit says of Owl’s mixed success in spelling “Tuesday” in The House at Pooh Corner: “Spelling isn’t everything. There are days when spelling Tuesday simply doesn’t count”.
Simon Horobin is professor of English language and literature at the University of Oxford. He is the author of Does Spelling Matter? (OUP, 2013) and How English Became English (OUP, 2016)