Harry Madgwick, the EEF’s Senior Content and Engagement Manager, introduces a new resource to support understanding of some common types of research and technical terminology.
The ‘correlation doesn’t equal causation’ adage warns us that, just because two things tend to move in a similar direction, one isn’t necessarily driving the other.
Pithy, memorable proverbs may help an idea stick, but they rarely get to the root of complex ideas. For example, to fully grasp causal claims – and to identify which are well supported by evidence and which aren’t – we need an awareness of different types of research methods and their strengths and limitations.
To support those working in education to question and critique such claims, and to build a wider understanding of what research is and how it functions, we are sharing two new glossaries: ‘Types of Research’ and ‘Evaluating Research Evidence’.
‘Types of Research’ Glossary
Understanding how research is conducted helps us know what the evidence it provides can and cannot tell us. Our ‘Types of Research’ glossary defines a range of research methods, from experimental studies that test cause and effect (such as randomised controlled trials) to case studies that describe the ‘experience(s) of researchers, teachers, classes, settings, or ‘cases’’. Whilst non-exhaustive, this list provides a helpful starting point for comparing the differences between, and purposes of, research methods and designs.
‘Evaluating Research Evidence’ Glossary
Research can be complex, and it is often discussed using potentially inaccessible language. The ‘Evaluating Research Evidence’ glossary defines terms that might be perceived as technical (such as ‘statistical significance’ or ‘generalisability’), and those that are frequently used but perhaps in different ways (such as ‘reliability’ and ‘validity’).
No, correlation and causation are not the same – although sometimes a correlation does reflect a causal relationship. It’s therefore essential that researchers select research methods appropriate to the questions they’re looking to answer – and don’t overclaim based on what they have found. Similarly, users of evidence need to balance keen interest in research findings with healthy scepticism, considering the strengths and limitations of methods whilst gathering practical insights and implications.
To help those involved in school improvement and professional development access and critique research evidence, and break down what its application can look like in their contexts, the Education Endowment Foundation has produced ‘Using Research Evidence: A Concise Guide’.