Why you should try multisensory teaching

Illusions like ventriloquism show how interlinked our senses are. So, while the ‘learning styles’ model may be widely discredited, using different sensory inputs can help students to retain knowledge, neuroscientist Uta Noppeney tells Zofia Niemtus
29th March 2019, 12:05am

If you’ve ever watched a ventriloquist perform and thought for even a moment that the words you were hearing were really coming from the mouth of the puppet, then you have experienced a multisensory illusion. The ventriloquist’s speech and lack of facial movement, coupled with the puppet’s moving mouth, fool us into thinking that we are hearing the puppet talk. It’s an old trick, but that doesn’t make our brains any less susceptible to it.

There are videos of many similar multisensory illusions available online, such as the “rubber hand illusion”, in which the participant’s brain “disowns” their real hand and they feel things in a rubber one instead. And the “McGurk effect”, in which the audio of one syllable is dubbed over video of a mouth forming a different one, leading the viewer to perceive a sound shaped by the visual cues rather than the one actually played.

All these tricks work by playing with the brain’s multisensory integration, a function that is vital not just in daily life but in the classroom, too. It is also an area that Professor Uta Noppeney, chair in computational neuroscience in the University of Birmingham’s School of Psychology, knows a lot about. We experience and rely on multisensory integration all the time, she says, albeit usually unconsciously.

“We are constantly bombarded with many different signals. We hear things, we see things,” Noppeney explains. “Integrating information across all our senses is particularly beneficial in noisy environments, where individual signals are unreliable. For instance, in a noisy pub, you can understand the person sitting next to you much better when you watch their lips. The audio-visual integration here helps you to segregate the speech signal you are interested in from the background noise, leading to much better speech comprehension.”

Eye-opening science

Scientists have long understood that our brains evolved to operate in environments where there were numerous simultaneous sensory inputs. Auditory and visual information enter the brain through different sensory organs (the ears and the eyes). So, Noppeney says, under the traditional model, they were believed to be “initially processed independently along the auditory and visual processing streams and only converged later in the association cortices” - these are areas of the brain outside those associated with one of the primary senses.

However, over the past decade, this understanding has changed, she adds: “There has been a progressive realisation that multisensory integration is far more pervasive than previously assumed, happening even in primary sensory cortices. The current models assume that signals from different senses interact across all processing stages.”

This change in understanding might seem relatively minor, but it holds significant implications for teachers and their lessons, according to Noppeney. Take, for instance, the example of learning to identify different species of birds. Research by Ladan Shams and Aaron R Seitz (2008) describes a traditional approach: students study the physical characteristics of each bird without listening to bird calls, on the rationale that they will be tested on identifying those characteristics without any accompanying sound. The assumption is that hearing the bird calls would be distracting or unnecessary.

Yet the research suggests that hearing the sounds alongside the images supports later identification, even when the sounds are not present during the testing phase.

Recipe for better recall

Does this mean that teaching in a way that relies on a combination of the senses could lead to better recall? Noppeney is enthusiastic about the possibilities. “Teaching students in school concurrently in multiple sensory modalities can be helpful for several reasons,” she says. “First, it’s more engaging; the sound increases your alertness. It may help students to stay engaged and focus on the material and information discussed rather than getting distracted by background noise.

“Second, because different senses can provide complementary and redundant (ie, the same) information in different representational formats, students will obtain a more complete picture of the learning.”

But what does this look like in practice? It’s something that many teachers already do without realising, Noppeney says. Teaching geometry and arithmetic by counting on your fingers, for example, uses vision, touch and proprioception (the awareness of the position and movement of the body). And, as any four-year-old will tell you, this is more memorable than simply trying to learn the numbers verbally.

However, research has also shown that it’s crucial for these multiple modalities to be “congruent” in order to strengthen their effect on memory.

“If you present a picture of a dog and a related sound, like the barking of the dog, then you will be better at recognising the dog as part of the training material later on,” Noppeney says. “And you are better at recognising the picture in the later recognition phase even when presented without the additional barking sound.

“By contrast, if you present the picture of a dog with a semantically incongruent stimulus, such as the miaowing of a cat, participants don’t remember the stimuli so well.”

This is linked to a psychological concept known as “redintegration”, she continues, in which a whole memory is restored from one element of it.

And there is another benefit to multisensory learning, according to Noppeney: it caters to the different sensory abilities in the room.

That’s not an attempt to revive the now widely discredited “learning styles” model, but it is a nod to the fact that students will have variation in sensory reliability, particularly those who may have sensory impairments. “Multisensory integration skills vary across individuals, as has been shown for speech comprehension,” Noppeney explains. “Moreover, individuals with autism spectrum disorders have been shown to bind sensory signals - particularly voices and faces, and other socially relevant signals - differently during neurodevelopment.”

Making sense of the senses

The question of whether additional audiovisual training could be helpful for these young people is yet to be answered by research, Noppeney says.

However, there are studies suggesting that, in general, working memory capacity can be boosted when stimuli are presented using two sensory modalities. You may be better able to store a telephone number in working memory, for example, if this number is presented visually and also sounded out.

Research into multisensory processing in working memory is still in its infancy, Noppeney points out, but it’s an area that she is hoping to explore further in the near future.

In the meantime, she and her colleagues are looking into the metacognitive aspects of multisensory learning - and conducting experiments into the ventriloquism effect mentioned earlier. Researchers present a burst of noise originating from one place at the same time as a flash of light in a different place, which usually leads observers to perceive the sound as coming from near the flash (in the same way as they would perceive the ventriloquist’s voice coming from the puppet’s mouth). The researchers then assess how well participants can metacognitively monitor whether the auditory and visual signals were in spatial conflict, and their uncertainty about where the sound came from.

“Initial results suggest that observers can indeed monitor their uncertainty about causal structure and where the sound comes from,” Noppeney says. “Moreover, we can also see whether they can assess their uncertainties about information provided in different sensory channels.”

Such metacognitive abilities are important in a school environment, where students need to be able to monitor whether or not they have mastered a particular skill or should revisit it, she adds.

And while, once again, there are many outstanding questions around how far these abilities can be trained, teachers can already begin to explore the proven benefits of multisensory instruction to support metacognition. The evidence shows that when information is encoded in multiple sensory systems at the same time, it forms a more stable memory.

“The important part about multisensory integration for perception is that even though the brain does this automatically, and everyone can do it, it’s quite a tricky thing for the brain to do.

“Students may differ in their abilities to process information in different sensory modalities, where some may rely more on visual and others more on auditory aids,” Noppeney says. “Therefore, providing information concurrently in multiple sensory channels may allow for this diversity in the student population.”

Zofia Niemtus is a freelance writer

This article originally appeared in the 29 March 2019 issue under the headline “Tes focus on... Multisensory integration”
