In October I blogged on how student perception surveys might be used to provide a fairly reliable measure of teaching effectiveness. Since then, I have been piloting a version of the MET survey to investigate my own teaching (along with a number of other teachers here at Turnford).
I randomly sequenced the questions listed in the MET survey and drew up a simple set of protocols about how the survey would be used. This included strict confidentiality regarding the data – they would be shared only with that teacher (though they could share those data with whomever they wished, of course) and that any feedback on the pilot study would ensure teacher anonymity. It also included the standard instructions given to students about completing the survey (again, ensuring their contributions were anonymous).
A list of the question items can be found here: Student perception survey
I employed a 5-point Likert scale to record student responses to each item.
Teachers could then select one or more areas to focus on over the following term. I chose 7 areas (because I’m greedy like that).
06 My teacher in this class makes me feel s/he really cares about me
21 My teacher seems to know if something is bothering me
20* Student behaviour in this class makes the teacher angry
30* When s/he is teaching us, my teacher thinks we understand when we don’t
23 This class does not keep my attention – I get bored
03 Students get to decide how activities are done in this class
16 My teacher takes the time to summarise what we learn each day
* indicates a reversed scoring item
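For anyone scoring a similar survey, the reversed items need flipping before responses are aggregated, so that a higher score always means a more favourable perception. A minimal sketch in Python (the item numbers match the list above, but the responses shown are illustrative, not my actual data):

```python
# Scoring a 5-point Likert survey with reverse-scored items.
# Responses are coded 1 (strongly disagree) to 5 (strongly agree).

REVERSED = {20, 23, 30}  # the items marked * above

def score_response(item: int, raw: int) -> int:
    """Flip the scale for reversed items so higher always means 'better'."""
    if not 1 <= raw <= 5:
        raise ValueError(f"Likert response must be 1-5, got {raw}")
    return 6 - raw if item in REVERSED else raw

# A raw 'agree' (4) on reversed item 20 ("behaviour makes the teacher
# angry") becomes a 2 after flipping; a non-reversed item is unchanged.
print(score_response(20, 4))  # 2
print(score_response(16, 4))  # 4
```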
Earlier this month I ran a follow-up survey with the same group – to see whether the areas I had targeted showed any improvement:
61% 06 My teacher in this class makes me feel s/he really cares about me [+0%]
33% 21 My teacher seems to know if something is bothering me [-6%]
Clearly my Y11s weren’t very impressed with my efforts to be more empathetic and express professional warmth! However, other areas showed some improvement …
78% 20* Student behaviour in this class makes the teacher angry [+39%]
I’m happy about this improvement – and I think it reflects the genuine improvements in the atmosphere and learning culture of the classroom.
67% 30* When s/he is teaching us, my teacher thinks we understand when we don’t [+7%]
This worried me in the original survey – I’m glad to see my spot questions and encouragement to seek appropriate help are moving this in the right direction, but I still see work to do here.
72% 23* This class does not keep my attention – I get bored [+11%]
A lot of this was monitoring and being ready to shift my pace (up or down) appropriately to the learning. Glad to see a positive shift here.
50% 03 Students get to decide how activities are done in this class [+28%]
A big increase – probably because I give more flexibility with revision and review tasks when I’m able. I’ve also tried to offer some choice about how specific studies are consolidated (e.g. using mnemonics, writing frames or visual maps) and I was able to make some of the practical experiments open to student choice (within a limited range).
72% 16 My teacher takes the time to summarise what we learn each day [+22%]
I was doing this before; I just needed to be more consistent with it.
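The figures above are simply the percentage of students giving a favourable response, compared across the two survey rounds. A small sketch of how such a comparison might be computed (the response lists here are made up for illustration, not my class’s data, and assume reversed items have already been flipped):

```python
# Percentage of students giving a favourable response (4 or 5 on the
# reverse-corrected 1-5 scale), and the change between two survey rounds.

def percent_favourable(responses):
    """Share of responses at 'agree' (4) or 'strongly agree' (5), as a whole %."""
    favourable = sum(1 for r in responses if r >= 4)
    return round(100 * favourable / len(responses))

autumn = [3, 4, 5, 2, 4, 4, 3, 5, 4, 2]   # hypothetical first survey, one item
spring = [4, 5, 5, 3, 4, 4, 4, 5, 4, 3]   # hypothetical follow-up survey

before = percent_favourable(autumn)
after = percent_favourable(spring)
print(f"{after}% [{after - before:+d}%]")  # prints: 80% [+20%]
```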
Students aren’t fooled by a ‘show-off’ lesson that might be put on for an observation, and by asking the whole class you get a range of student experience you wouldn’t get by simply asking one or two students. These are probably some of the reasons why a recent Sutton Trust report rated the reliability of student surveys higher than that of observation. At the end of the day, students see all of my teaching – warts and all, good days and bad – so they can provide some useful feedback on what I’m doing well and where I might look to improve.
As an accountability measure, I think student surveys would be stressful and unfair – but as a coaching tool for getting inside your own teaching, I think they have merit.