How do researchers analyse data?

The way that researchers choose to analyse data can lead to very different findings

When researchers analyse evidence, they look at what are known as “outcome measures”.

For primary research, this will be raw data collected from a new trial, whereas in a systematic review, this will be outcome data from pre-existing studies.

In both cases, the researchers will work to translate these outcome measures into conclusions: for example, the average difference in results between treatment and control groups, or an effect size.
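As a minimal sketch of what an "effect size" calculation looks like, the snippet below computes Cohen's d, one common measure: the difference between the group means divided by the pooled standard deviation. The scores are invented for illustration and do not come from any study mentioned here.

```python
# Cohen's d: standardised difference between two group means.
# All data below are made-up illustrative scores, not real trial data.
import statistics

treatment = [72, 75, 78, 74, 80, 77]
control = [70, 71, 73, 69, 72, 74]

mean_t = statistics.mean(treatment)
mean_c = statistics.mean(control)

# Pooled standard deviation across the two groups,
# weighting each group's sample variance by its degrees of freedom.
n_t, n_c = len(treatment), len(control)
var_t = statistics.variance(treatment)  # sample variance (n - 1 denominator)
var_c = statistics.variance(control)
pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5

cohens_d = (mean_t - mean_c) / pooled_sd
print(round(cohens_d, 2))
```

By convention, values around 0.2 are often described as small effects, 0.5 as medium and 0.8 as large, though such thresholds are rough rules of thumb rather than fixed standards.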

They then interpret the data, making caveats where there are important limitations or discussing whether the results corroborate previous research.

The way that researchers analyse data depends on the methodology used in a study and the research questions they are trying to answer.

Where can I see data analysis in action?

In 2021, Harry Fletcher-Wood, associate dean at Ambition Institute, and Sam Sims, lecturer at the Centre for Education Policy and Equalising Opportunities at University College London, set out to conduct a new systematic review of the evidence for what makes effective teacher professional development.

To do this, they looked at outcome data from 104 relevant studies.

In order to translate this evidence into practical advice, they developed a theoretical model based around two building blocks: purposes and mechanisms.

They identified that a programme was most likely to help teachers change if its design gave balanced attention to four purposes:

  • Developing teachers’ knowledge around teaching and learning.
  • Motivating goal-directed behaviour: encouraging teachers to act on this knowledge.
  • Teaching techniques to put this evidence-based knowledge to use.
  • Embedding these changes in teachers’ practice.

They then looked at the mechanisms needed to cause change, producing a list of 14 based on work by the psychologist Susan Michie and colleagues (Michie et al, 2013).

Fletcher-Wood and Sims then identified which of the 104 studies addressed the four purposes, and which had many of the mechanisms. 

They found that, on average, a programme with no mechanisms would have no impact; a programme with 13 mechanisms would have an impact equivalent to two months' additional pupil progress; and balanced designs had, on average, a three-times higher impact.
