The causal influences on educational outcomes are complex and uncertain, leading some commentators to exaggerate or dismiss research findings in accordance with their ideological positions. However, education isn’t the only field of science that must identify the causal mechanisms underpinning a highly complex system while facing significant controversy – and maybe we should take a leaf from another field’s book in the way we report and discuss summaries of education research.
Climate change is an enormously complex field. As in education, the causal mechanisms are almost impossible to identify unequivocally from individual studies, and ideological views can easily distort the interpretation of the available evidence.
The IPCC addresses this issue by publishing guidelines on the treatment of these uncertainties. For example:
The AR5 will rely on two metrics for communicating the degree of certainty in key findings:
• Confidence in the validity of a finding, based on the type, amount, quality, and consistency of evidence (e.g., mechanistic understanding, theory, data, models, expert judgment) and the degree of agreement. Confidence is expressed qualitatively.
• Quantified measures of uncertainty in a finding expressed probabilistically (based on statistical analysis of observations or model results, or expert judgment).
Areas of the IPCC guidelines of particular interest to education research might include the treatment of validity:
Use the following dimensions to evaluate the validity of a finding: the type, amount, quality, and consistency of evidence (summary terms: “limited,” “medium,” or “robust”), and the degree of agreement (summary terms: “low,” “medium,” or “high”). Generally, evidence is most robust when there are multiple, consistent independent lines of high-quality evidence.
Another convention they use is qualitative expressions of likelihood, again dependent on the strength, range and validity of the evidence supporting a claim.
There is already a move towards this style of reporting: for example, the IES practice guide rates the strength of evidence as ‘Low’, ‘Moderate’ or ‘High’. However, other researchers report mean effect sizes, months of progress, and so on, to compare results across meta-analyses.
The complexity of climate change perhaps isn’t a bad analogy for the difficulty evidence-based practice faces in picking out the causal mechanisms underlying effective teaching and student outcomes. Perhaps the EdResearch community would benefit from developing a consistent way to report and discuss the genuine uncertainties of ‘what works’.