There’s some interesting evidence to suggest that well-applied study skills can have an important influence on student outcomes. Indeed, perhaps the key reason that girls tend to outperform boys academically is related to the effective use of study strategies. For example, Griffin et al. (2012) conclude:
“The results of this research suggest that it is incorrect to suppose that females necessarily outperform males in intellectual tasks. In pedagogical settings it also does not make sense to perpetuate this misconception. For teaching effectiveness, academia should focus on developing and enhancing the various learning skills and strategies of students regardless of gender.”
Early in my career, a school I was working in invested a fair sum of money taking all year 11 students off-site for a day to work on revision skills. The sessions were presented by a fairly young team, led by a roguish, out-of-work actor, who guided students through a workbook mainly focused on basic mnemonic strategies. Students enjoyed the event, though I suspect mainly because they were off-site for the day. The presenters cracked jokes and indulged in “Don’t call me ‘sir’, I’m not a teacher” informality, whilst accompanying teachers handled crowd control during the ‘dull bits’ where students actually did some writing in the workbooks. Students and teachers evaluated the whole thing very positively at the end of the day.
Well, almost all teachers. Eternal sceptic that I am, I wasn’t particularly convinced the costly exercise improved the quality or effectiveness of the revision our students undertook after the event.
Generic study skills: Mnemonics
Mnemonic strategies will be familiar to most teachers. In the study skills day I attended with Y11s, students learnt how to count to ten in Japanese using the keyword method and memorise a ‘shopping list’ of random nouns using the loci method (the basis of Sherlock Holmes’ ‘mind palace’). They learnt the difference between an acronym and an acrostic and used them to memorise the order of rainbow colours and the (then nine) planets. They created a storyboard to help them recall the key events in a rather dull short story. They learnt how to create a visual map of the revision techniques they’d been taught.
There are two issues with the sorts of activities which typically make up ‘study skills’ sessions for students. The first, and biggest, problem is lack of transfer. The efficacy of teaching ‘learning skills’ independently of domain knowledge relies upon students transferring (often fairly abstract) strategies to their own studies. We know that transfer is problematic and that the intended transfer of ‘skills’ to other contexts typically doesn’t occur. For example, Perkins and Salomon (1992) discuss the problems of transferring ‘generic’ principles:
“In areas as diverse as chess play, physics problem solving, and medical diagnosis, expert performance has been shown to depend on a large knowledge base of rather specialized knowledge (see Ericsson and Smith 1991). General cross-domain principles, it has been argued, play a rather weak role. In the same spirit, some investigators have urged that learning is highly situated, that is, finely adapted to its context (Brown et al. 1989, Lave 1988).”
Secondly, many mnemonic strategies only really help where the material students have to learn is mnemonic-‘friendly’; for instance, ordered lists of words (preferably nouns). This lack of wide applicability is one of the reasons the keyword mnemonic rates so poorly in the Dunlosky et al. (2013) review of study techniques.
“On the basis of the literature reviewed above, we rate the keyword mnemonic as low utility. We cannot recommend that the keyword mnemonic be widely adopted. It does show promise for keyword-friendly materials, but it is not highly efficient (in terms of time needed for training and keyword generation), and it may not produce durable learning”
Incidentally, there’s a nice summary and discussion of how to apply the key findings of this review available from the AFT: Dunlosky, J. (2013) Strengthening the student toolbox
In summary, I’m not saying that mnemonic strategies don’t have a role to play in helping students learn content and revise. I’m simply suggesting that the sorts of ‘study skills’ events which schools often outsource to external providers are unlikely to have any positive impact on student outcomes. A better plan might be to teach teachers the various mnemonic techniques and encourage them to find examples of where the ideas might be profitably applied within their own subject domain.
For example, as a science teacher, I can think of instances where verbal and visual mnemonics might be helpful – but it would be better for me to plan these into my teaching and illustrate their use within science lessons as we progress through sequences of learning, rather than have students learn them in the abstract and hope that they spontaneously identify opportunities and successfully apply them to material across the science curriculum.
Generic study skills: Summarising
Empirical research on what makes study skills effective has been slowly gaining momentum within education, though I suspect it’s not yet had much real impact on how revision skills are taught in schools.
Back in 1999, Purdie and Hattie undertook a meta-analysis of 52 studies to examine the link between study skills and learning outcomes. They found that simply increasing study time was not correlated with outcomes. Where they found positive outcomes, they also noted that these were not due to the inherent quality of any particular study skill, but more likely a function of meta-cognition on the students’ part: the decisions students made about when and how to employ a particular strategy for a specific learning goal. In essence, students who were able to apply a wide range of available strategies appeared to have better outcomes than students with only a narrow range. However, the case for teaching students study strategies as discrete ‘skills’ appeared to be problematic.
“Of course, it is desirable that students possess a repertoire of desirable study skills, but they also must know when to use study skill x, and when to use study skill Y. … Repeatedly, research points to the importance of selecting the right set of study skills to use for a particular purpose in a clearly defined context…. Effective strategies in one domain may be weak strategies in another ….”
This is an important point; one which undermines a purely ‘skills-based’ approach to teaching revision strategies. In the absence of subject content, study skills may focus on fairly surface-level general strategies rather than deeper strategies related to specific content. Thus, it would seem advisable to embed ‘how to revise’ as a regular feature of subject teaching, rather than rely on one-off revision skill sessions.
The Purdie and Hattie analysis didn’t find particular study skills which were ‘good’ or ‘bad’, though they found the highest correlations for note-taking. However, gaining benefit from writing summary notes was not a simple exercise:
“Although notetaking was categorised as an achieving strategy … , closer inspection of those studies in which notetaking was correlated with a measure of student achievement showed that higher correlations were obtained when notetaking involved the identification and manipulation of the most important ideas … rather than the mere recording of information read in texts or heard in lectures.”
The complexity of identifying and organising relevant domain knowledge is likely also a reason why some SEND students underperform. For example, Ghani and Gathercole (2013) reported that for students with dyslexia:
“Results indicated that the dyslexic students were more worried of their school and academic performance, have weakness in managing their time and concentration to meet the learning demands for class or assignments, were less able to select important information from less important information, and using test preparation and test taking strategies less effectively.”
These difficulties are likely the reason why other reviews (e.g. Dunlosky et al, 2013) find that creating summaries has only limited utility as a study strategy.
“On the basis of the available evidence, we rate summarization as low utility. It can be an effective learning strategy for learners who are already skilled at summarizing; however, many learners (including children, high school students, and even some undergraduates) will require extensive training, which makes this strategy less feasible.”
These problems also extend to other common study skills which involve creating simplified content, for example creating visual maps and flash cards. I’ve written before about the effectiveness of training students to create visual maps: Does visual mapping help revision?
“Visual mapping of one sort or another is a commonly suggested revision technique, based on the assumption that the process of organising material in linked, hierarchical and graphical ways is superior to note-writing or simply answering practice questions. However, the evidence for its effectiveness as a process of elaboration is currently poor.”
There are few studies which specifically look at using flash cards as a revision strategy. The limited evidence appears to suggest that they aren’t always used in a very effective way. For example, Hartwig and Dunlosky (2012) found that only about 30% of students in their survey used them for self-testing, and their use was restricted by the type of domain knowledge the students had to learn.
“Flashcards may often be used nonoptimally in vivo, such as when students mindlessly read flashcards without generating responses. Even when they are used appropriately, flashcards may be best suited to committing factual information to memory and not equally effective for studying all types of materials.”
Summarising and note-taking are difficult skills for students because they require a firm understanding of which ideas and keywords are most important within a specific subject. Even less formal summarising techniques like visual maps and flash cards cannot be relied upon as generic revision strategies, as their application involves the same problems of identifying and organising the relevant domain knowledge.
That’s not to say that these summarising techniques aren’t worth developing – they are – but that teaching ‘how to summarise’ or ‘how to draw visual maps’ as a generic skill will likely have little impact. The most effective way for students to learn these skills is likely to be within the context of a subject-based lesson – where the teacher can help students to focus on the most important domain-specific information.
Generic study skills: Self-testing
In short, there appear to be few independent revision strategies which reliably improve student outcomes. However, one strategy that does seem pretty convincing is practice testing. The ‘testing effect’ has been consistently found across a range of laboratory- and classroom-based studies. The act of trying to retrieve information appears to enhance the future recall of that knowledge fairly reliably.
The Dunlosky et al. (2013) review rated practice testing as a high-utility strategy.
“Testing effects have been demonstrated across an impressive range of practice-test formats, kinds of material, learner ages, outcome measures, and retention intervals. Thus, practice testing has broad applicability. Practice testing is not particularly time intensive relative to other techniques, and it can be implemented with minimal training. Finally, several studies have provided evidence for the efficacy of practice testing in representative educational contexts.”
There’s also evidence that engaging in retrieval practice not only helps recall but also helps to reduce student exam anxiety. This may seem paradoxical to some teachers – after all, classroom tests often appear to be a source of anxiety for students. However, it does appear that practising the retrieval of information can reduce students’ feelings of anxiety. For example, a recent study by Agarwal et al. (2014) found that middle-school and high-school students reported lower test anxiety when they had engaged in low-stakes ‘clicker quizzes’ prior to final (grade-relevant) testing.
Teachers can exploit this in lessons by using quizzes and low-stakes tests as a normal part of subject teaching. Indeed, I suspect it would have far more impact on student self-efficacy than attempts to tweak their personalities through social-psychological interventions related to ‘growth mindset’ or ‘resilience’. However, research has also found that self-testing can have a positive influence on outcomes – making it a potentially effective ‘study skill’.
For example, Hartwig and Dunlosky (2012) identified that students who frequently quizzed themselves on material they were learning had higher GPAs. However, whilst the positive benefits of testing oneself appear quite robust, there are questions about how effective this strategy is across the full range of assessment contexts. As Hartwig and Dunlosky note:
“A major issue is the degree to which these benefits of self-testing will generalize to different kinds of tests (e.g., multiple choice, free recall, or essay), different course contents (e.g., biology, psychology, or philosophy), students with differing abilities, and so forth.”
Incidentally, that same study found that the scheduling of study also appeared to have an impact on student outcomes. Low achievers tended to opt for late-night ‘cramming’ sessions, usually close to the deadline of an assessment, rather than study planned in advance.
Another issue is that students may not engage in self-testing in a way that exploits its effectiveness. For example, Einstein et al (2012) report:
“Existing research suggests that students will sometimes engage in testing during their studying but mainly for diagnosing whether or not they know certain material and not as means of improving their learning and memory… . Students seem to be unaware that retrieval itself enhances memory …”
One reason students may fail to see benefit from independent self-testing is that they mistake recognition for the ability to accurately recall information. Effective self-testing likely relies upon having an accurate ‘judgement of learning’. After all, if you believe you know the material well, you are likely to practise it less than material you believe you know less securely or don’t know at all.
There’s evidence that ‘cue recognition’ (i.e. a feeling of familiarity with the material) may cause us to overestimate how well we know the material. This feeling of recognition may ‘trick’ students into ceasing self-testing before they genuinely have secure recall of the material.
For example, Reder and Ritter (1992) found that participants in their study tended to make a quick ‘feeling of knowing’ judgement about material based on familiarity with the question stem rather than an accurate assessment of their memory for the material.
Whilst we can likely teach students to recognise the risk of overestimating the security of their recall, knowing that a cognitive bias exists doesn’t make us immune to it. Thus, it is likely that even self-testing will require support and feedback within a domain-specific context to ensure that students have the genuine depth of subject knowledge required for assessments.
Implications for helping students with revision
At the end of the day, one-off sessions on generic revision skills may seem like a worthwhile intervention for students struggling with exams, but such activities may lend themselves to the appearance of ‘doing something’ to help students rather than actually improving outcomes.
A focus on generic ‘skills’ over-simplifies and makes abstract the strategies which can help students in their learning. Whilst useful strategies exist, a purely ‘skills-based’ approach overlooks important subject-specific differences in content and assessment, and the requirement of specialised domain knowledge to apply strategies effectively.
Generic ‘revision skills’ sessions involving mnemonics training and summarisation techniques are unlikely to be doing any harm, but there’s reason to believe that they also aren’t doing much good. Instead, there’s a case for domain-specific study techniques becoming a feature of regular classroom teaching. Indeed, there’s arguably a case for teaching generic study skills to teachers rather than their students, so teachers can adapt and apply these to their subject with the benefit of their greater domain knowledge. Through this, teachers could make students aware of a wider range of subject-appropriate strategies and properly reinforce the use of these techniques on a regular basis within their teaching.
Post script: Courtesy of Andy Lewis (@iTeachRE) a summary of improving students’ judgements of learning and some of the more robust psychological findings on effective study.