The failure of ‘pure discovery’ learning:
The case against ‘pure’ discovery learning is pretty damning. A number of fairly recent papers have consistently reported that minimally guided instruction simply doesn’t work.
Mayer – American Psychologist, 2004
Klahr and Nigam – Psychological Science, 2004
Kirschner, Sweller and Clark – Educational Psychologist, 2006
It’s interesting to note that doubts about the efficacy of the approach are not particularly new. Even Bruner (quoted by Tuovinen) appeared to hold the view that discovery learning would only play a small role in education.
Although Bruner championed the discovery learning cause, he argued that existing knowledge and culture were not generally passed on by discovery. He wrote (1966, p. 101):
“You cannot consider education without taking into account how culture gets passed on. It seems to me highly unlikely that given the centrality of culture in man’s adaptation to his environment – the fact that culture serves him in the same way as changes in morphology served earlier in the evolutionary scale – that, biologically speaking, one would expect each organism to rediscover the totality of its culture – this would seem most unlikely. Moreover, it seems equally unlikely, given the nature of man’s dependency as a creature, that this long period of dependency characteristic of our species was designed entirely for the most inefficient technique possible for regaining what has been gathered over a long period of time, i.e. discovery.”
Keeping an open mind on ‘enhanced discovery’ learning:
On the other hand there’s some evidence to suggest that when discovery learning is highly scaffolded, then the negative effects are mitigated …
Cobern et al – Research in Science & Technological Education (2010)
… and even some evidence that it can outperform direct instruction in some circumstances.
Alfieri et al – Educational Psychology, 2011
Robert Marzano suggests that:
“When faced with the decision whether to use direct instruction or unassisted discovery learning, a teacher should opt for the former. However, if a teacher is willing to put time and energy into designing lessons that ensure that students have the knowledge needed to understand the content and that provide guidance and interaction along the way, then discovery learning can be a powerful learning experience for students.”
Tuovinen makes the point that some versions of discovery learning involve a significant amount of direct instruction and incremental structure.
“One conclusion that can be drawn is that instead of a clear dichotomy between, say, deductive discovery and deductive reception learning, we have a continuum where the methods of teaching and learning differ by gradually varying amounts of guidance, direction, structure, help, learner control, and other dimensions, rather than being neat separate categories.”
Looking through Tuovinen’s taxonomy there are hints as to the processes which determine the success or failure of an instructional technique. I won’t list them here, instead I want to explore some of the possible reasons why ‘pure’ discovery learning fails and explore why direct instruction and some of the more structured forms of discovery learning tend to succeed.
Why does pure discovery learning fail?
Induction is harder than deduction – one possibility is that discovery learning simply overloads our students’ ability to successfully process and encode the material to be learnt. One reason for this might be that inductive reasoning (synthesising a general rule from a number of specific instances) is enormously complex – requiring the student to correctly attend to the pertinent information in each specific instance (as opposed to all the extraneous information) and hold them all ‘in mind’ long enough to discern the general pattern of behaviour. Deductive reasoning (using specific instances to test a general rule) requires less mental juggling. Testing a hypothesis involves holding a rule in mind and comparing it to a specific case to see if the rule holds.
On the other hand, we remember what we think about, so as long as the inductive jumps are fairly small the additional processing required may assist the learning process. It might be difficult to judge the appropriate amount of element interactivity for a group of students, but the additional challenge it provides might be advantageous in some circumstances.
Subject knowledge – a mediating factor appears to be the extent to which students possess background knowledge which can assist them with processing the new information. We know that our prior knowledge about a topic plays an active role in helping us process new information. Where prior knowledge is poor, working memory is swamped and very little appears to be successfully learnt.
However, there’s some evidence from Tuovinen and Sweller – Journal of Educational Psychology (1999) – that where domain knowledge is pretty good, the choice of instructional technique has little influence on outcomes.
“However, if students had previous familiarity with the database domain, the type of practice made no significant difference to their learning because the exploration students were able to draw on existing, well-developed domain schemas to guide their exploration.”
Kirschner et al also note that:
“The advantage of guidance begins to recede only when learners have sufficiently high prior knowledge to provide ‘internal’ guidance.”
… which implies that where students have a pretty firm foundation in a topic (e.g. following direct instruction) there may be advantages to using a structured discovery technique as a way of getting students to apply that knowledge.
Along with subject knowledge, the conceptual difficulty of the material is also likely to be a factor. An abstract concept may be better suited to direct instruction because it may not be easily observable (and therefore discoverable) – but there are exceptions (e.g. density is a tricky concept for many students, but easy to investigate using displacement).
Group size – another possibility for the failure of pure discovery is the phenomenon of social loafing. This is the finding that as the number of people in a group increases, their individual efforts tend to decrease. Most discovery learning methods involve high levels of student collaboration and a mediating factor might simply be that the average effort expended on the task is lower than when working individually.
On the other hand, there’s some evidence from the EEF that collaborative activities can produce significant learning gains – but the devil is in the detail of how the group activity is structured.
“The impact of collaborative approaches on learning is consistently positive, but it does vary so it is important to get the detail right. Effective collaborative learning requires much more than just sitting pupils together and asking them to work together; structured approaches, with well-designed tasks lead to the greatest learning gains.”
Unnecessary focus on biologically primary skills – Geary describes biologically primary learning as abilities and knowledge which arise innately, are learnt rapidly and don’t need to be taught. Much of the modern emphasis on skill-over-knowledge tends to highlight the importance of students practising these sorts of skills. For example, Guy Claxton’s BLP framework encourages teachers to highlight the development of various learning dispositions, for example, imagining, imitation and noticing. It seems likely that these traits neither need nor improve with practice, and may even provide a distraction from the biologically secondary knowledge and skills that would.
However, there are some learning skills which may benefit from explicit focus. Again, the EEF reports that meta-cognitive and self-regulation strategies can produce strong gains for some students (mainly lower achieving and older students).
“The potential impact of approaches which encourage learners to plan, monitor and evaluate their learning is very high. However it can be difficult to achieve these gains as this involves pupils in taking greater responsibility for their learning and in developing their understanding of what is involved in being successful. There is no simple strategy or trick for this. It is possible to support pupils’ work too much, so that they do not learn to monitor and manage their own learning but come to rely on the prompts and support from the teacher. A useful metaphor is scaffolding in terms of removing the support and dismantling the scaffolding to check that learners are taking responsibility to manage their own learning.”
Learner control – a further issue for discovery learning is that students are directing their own studies rather than following the guidance of a subject expert. Though the teacher may carefully select materials to support the formation of an accurate understanding of a topic, students may simply confirm their misconceptions and ignore the disconfirming evidence available to them. This factor is likely to correlate with the prior knowledge of the students, and Tuovinen notes that even ‘progressive’ teachers tend to avoid a genuinely ‘pure’ discovery approach:
“Wittrock (1966) found that many of the discovery learning approaches discussed in the literature actually consisted of a conventional ‘lesson plus practice’ (expository) teaching approach, which approximated to the deductive discovery learning approach, with varying amounts of direction provided during the practice (discovery) stage.”
On the other hand, along with a sense of mastery and purpose, having a sense of personal autonomy is an important element of our intrinsic motivation. Self-determination theory suggests that being able to provide students with elements of choice – obviously within a structured learning environment – may have a positive influence on student performance through improving their self-efficacy.
Behaviour management – there’s certainly evidence that poor behaviour affects student academic performance, and the ten-year study by Haydn shows that deficits in classroom climate are not uncommon in English schools. Another mediating factor behind the failure of discovery learning might be the opportunities that more flexible learning environments present for students simply not to focus on what they are learning. Discovery learning activities typically involve a great deal of student movement and interaction, and this tends to raise noise levels and create opportunities for misbehaviour.
However, many teachers use discovery learning techniques precisely because they believe the elements of choice, peer-interaction, etc. improve student motivation and thereby reduce behaviour problems. Whilst there’s plenty of qualitative commentary suggesting this correlation, I’m not aware of any quantitative evidence to support or refute the claim. It would be interesting to see an RCT – perhaps using Haydn’s 1-10 rating scale of classroom climate – comparing enhanced discovery learning and direct instruction. My intuition is that the outcomes would be broadly comparable (i.e. instructional style will not account for much variance in student behaviour) – but I don’t know.
There’s good evidence to suggest that ‘pure’ discovery learning is an ineffective way for students to learn, and direct instruction appears consistently to produce better outcomes than unguided approaches. However, there are interesting lessons to be learnt by examining why minimally guided instruction fails.
There’s also evidence on more structured forms of discovery learning which is far from damning and in some cases suggests it can even outperform direct instruction. It is likely that a variety of complex and interacting factors (of which I’ve only listed a few) influence the relative success or failure of an instructional technique, but there’s some interesting psychology behind why both direct instruction and guided discovery methods might be appropriate depending on the circumstances.
That is a masterful summary of the issues; thank you. I just think that reducing the discussion to direct instruction v constructivist approaches; knowledge v skills; or traditional v progressive ends up producing an overly narrow field of view that doesn’t take account of the enormous variety of both teachers and classes. This is why educational research tends not to produce definitive ‘this works / that doesn’t’ answers and why we all need to be aware of the thinking but also respond to what seems to work best for us in our own classrooms. It’s also why neither Ofsted, SLT, nor anyone else, should be telling us to use a particular approach. We should be held to account for the outcomes, but not for the methods we use if the outcomes are good.
Thanks for your kind comments. Yes, you’re right – I think the debate around instructional methods risks ignoring the evidence if it simply polarises along ideological lines. It certainly shouldn’t be the job of the regulator to impose orthodoxy upon teaching practice (SLT are highly influenced by whatever the regulator says in my experience). ‘Research-informed’ professional judgement is the only way to tackle the enormous complexity of teaching and that requires teachers to be both knowledgeable and skeptical when it comes to ‘what works’.
As dodiscimus commented above: “That is a masterful summary of the issues; thank you.”
I couldn’t have said it better myself so I won’t try. A big thank you.
I have read it several times and each time the issue is further illuminated.
Thanks for the comment – glad you found it insightful!
You might be interested in reading some of the rebuttals to the Kirschner ‘minimally guided’ article. They were published in the same journal: http://edtechdev.wordpress.com/2007/07/25/problem-based-learning-videogames-inquiry-learning-constructivism-pedagogical-agents-all-bad/
‘Minimally guided’ is a straw man argument. Problem-based learning and inquiry-guided learning, which Kirschner et al. lump together with discovery learning, are anything but minimally guided.
There are literally thousands of studies now supporting active learning approaches as more effective than the direct instruction approaches Kirschner advocated. Yet another meta-analysis just came out today. This news story, titled “Lectures Aren’t Just Boring, They’re Ineffective, Too, Study Finds”, summarizes it:
And here are some more references:
Inquiry-Based Learning in College Mathematics (PDF: Keynote-Monday-Laursen.pdf)
“Comparing students taught with the inquiry-based learning approach and those who weren’t, the study found the former reported better learning gains. An analysis of grades found that students in inquiry-based learning (IBL) classes did as well or better than students who did not complete any IBL classes.
But more importantly, the outcomes for different groups of students were dramatic in IBL classes compared to non-IBL classes. Implementing inquiry-based learning approaches in mathematics improved outcomes not only of high achieving students, but also females, future mathematics teachers and low achieving students.”
When is PBL More Effective? A Meta-synthesis of Meta-analyses Comparing PBL to Conventional Classrooms
“Findings indicated that PBL was superior when it comes to long-term retention, skill development and satisfaction of students and teachers, while traditional approaches were more effective for short-term retention as measured by standardized board exams.”
“Across 82 studies and 201 outcomes the findings favor PBL (d = 0.13 ± 0.025)”
Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses (PDF: Hake.pdf)
Does Active Learning Work? A Review of the Research (PDF: Prince_AL.pdf)
Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics (PDF: pcast_engage_to_excel_release_powerpoint__final.pdf)
“One study found that students in traditional lecture courses were twice as likely to leave engineering and three times as likely to drop out of college entirely compared with students taught using active learning techniques.
Students in a physics class that used active learning methods learned twice as much as those taught in a traditional class, as measured by test results.”
MAA Calculus Study: Seven Characteristics of Successful Calculus Programs
“Use of student-centered pedagogies and active-learning strategies”
Problem-based learning even affects (or doesn’t hurt) things like empathy. This article showed that students in engineering & physics had less empathy than students in caring professions. The exception was the computer engineering students, who were taught using problem-based learning:
“For computer engineering students, the differences were largely eliminated. The researchers have a theory about why: the computer engineering students are taught with PBL, problem-based learning, which is not the case for the applied physics students. Chato Rasoal believes this can influence the degree of empathy.”
“In problem-based learning you work in groups a lot. You have to be able to listen to others and accept other people’s thoughts and expressions of emotions. Otherwise it won’t work.”
Hi – thanks for those links.
Yes, it’s easy for both direct instruction and discovery learning to be unfairly stereotyped, and it doesn’t help the debate about the efficacy of instructional techniques. My view is that teaching style probably isn’t the principal variable in student outcomes – so long as the structure is there. However, I’ll take a look through the links you’ve suggested and leave a more developed comment when I’ve read them.
Hi again. Had a look through a number of the links and I think my discussion above doesn’t misrepresent the argument (though I accept it simplifies it). It’s difficult to tell what constituted ‘traditional methods’ (e.g. in Meta-analyses Comparing PBL to Conventional Classrooms). If “Minimally-guided instruction” is a straw man, then “Traditional methods” cast as lecturing or meaningless rote learning is surely a ‘straw man’ also?
My point wasn’t to condemn either direct instruction or inquiry learning – but to explore why one or another approach might be more suitable in different circumstances. You may disagree in favour of more inductive approaches in your teaching – and that’s fine. You’ll see from my posts that I think we should trust professionals to select the instructional techniques which work for their teaching style / subject / students rather than impose pedagogical orthodoxy upon them.
Hi Doug. I have read your, and several other, critiques of Kirschner et al. They all have the same shortcoming (but vary in other ways): the starting point that the phrase “minimally guided instruction” is, as you put it here, “a strawman argument”.
Well. That would come as some surprise to Kirschner et al, who I’m sure didn’t realise that their title phrase would be deemed to be an “argument” of any type. It is not, and the many words written trying to rebut the PHRASE in all of these pieces simply miss the point.
“Minimal guidance” is simply a catch-phrase for a slippery, complex assortment of educational approaches that tend toward the same end: reducing the amount of teacher guidance in the way a student learns. They are not all literally “minimal” and they do not all focus on the “guidance” aspect, though they influence it. It is a five-syllable rubric whose absence would necessitate such complex verbiage that it would make discussion of these things, collectively, quite impossible.
As an advocate for better math instruction, I find it necessary to use such compact terms. If I have a five-minute radio interview explaining the issues, I cannot spend four and a half of them explaining the fine points of what the term “constructivist” (e.g.) means, then discussing how I’m talking about a general category that includes this but is not limited to it, and focuses only on certain aspects of the theory versus others, etc.
The general public is not interested in that level of information, and the academic community’s discussion of education is seriously hampered by digressions into semantic debates or arguments about precision of terms, when one is talking in GENERALITIES about common threads that require unwinding this precision to converse about principles, not the finely-parsed pet theories of this or that academic wonk.
I had never encountered that phrase. But when I saw it I understood immediately that the authors were using it to bundle together this common thread appearing in the education literature under many different guises: Constructivist pedagogy, student-centered learning, discovery-based and inquiry-based learning, “progressive education”, and so on. As a catch-all I have always used “fuzzy math”, but etymologically this word is more general than “minimal guidance” so both are useful.
Your insistence that this term is, itself, an argument, and your devotion of mounds of writing to refute a TERM simply tells me that you’re in the same camp as numerous folks in the educational establishment whose rebuttal to my substantial arguments boils down to accusing me of being poorly informed. It is not a rebuttal, despite the surface appearance. It is a tactic to avoid engagement with substantive discussion of the matter at hand. In the case of Kirschner et al, that is a broad swath of empirical evidence in cognitive science that leaving novice learners to “construct their own knowledge” in a vacuum (or thinner atmosphere) of teacher-led instruction leads to poorer educational outcomes.
An appropriate reply to Kirschner et al by an advocate of one of these approaches might be to first distance that approach from what is shown to be of low educational value in these studies, explain how it is avoided in the approach, and then (this is the critical part) produce valid empirical studies demonstrating that the approach actually does what is claimed and results in superior learning outcomes.
Hi again Doug. Having followed a bunch of links now I will add this: I don’t see any way in which EIP’s post here is at all lacking. It is broad, covers several different aspects of the question, and takes into account several different perspectives. All the writing I have seen on this subject simplifies it. I don’t see that this piece OVERsimplifies it by any reasonable standard. Nor does it misrepresent the area.
I am struck, when I read critiques of the KSC meta-analysis, by how often constructivism activists cite studies of college and high school students as if these negate their argument. What is missing is their parsing of WHEN minimal guidance works (when those being taught are subject experts – i.e. they already have a solid foundation of domain-specific knowledge from which to draw) and when it does NOT work (when they are novices in the subject). In light of what the empirical data say on this matter, it is tomfoolery to infer generalized educational prescriptions from studies of those who already have expertise conferred by 12 years of exposure to the content of a discipline and apply them to the elementary school classroom, where students are novices. This whole discussion, after all, centers around matters of policy and teacher training in the public schools, and concerns most deeply how to instruct small children. Studies of university students simply are not relevant, and KSC’s analysis actually makes this distinction quite obvious.
Does this debate apply to learning physical activities as well?
I’m a coach in sports. I’m curious if these teaching philosophies apply to sports activities…
I’m not sure how ‘discovery learning’ would lend itself to sports coaching – perhaps akin to pupils co-creating a new game? I suspect you use pretty specific and explicit feedback on performance towards developing clearly identified skills.