MicroCPD: What can we do about unconscious bias in students' module evaluations?

This week Dr Clare Saunders addresses some inconvenient truths in the evaluation of teaching (and how to tackle them)

Research summary

“This work addresses potential gender bias in how students perceive their lecturers and highlights an important aspect of our equality and diversity work at the University. Successful delivery of a quality education experience is built upon active contributions from both staff and students. As we champion success in all of our students, so also we encourage awareness of reciprocal virtues as citizens of a diverse community. Our students have a crucial role in promoting gender equality as our partners.”
Professor Una Martin, Deputy Pro-Vice Chancellor for Equalities

Mengel, F., Sauermann, J., & Zoelitz, U. (2017). Gender bias in teaching evaluations. Journal of the European Economic Association. http://ulfzoelitz.com/wp-content/uploads/JEEA-gender-bias.pdf [For an overview, see https://www.economist.com/news/science-and-technology/21729426-how-long-does-prejudice-last-research-suggests-students-are-biased-against]

Mengel et al. analysed four years’ worth of data from the School of Business and Economics at Maastricht University, comprising almost 20,000 student evaluations and associated data on those students’ course grades and self-reported study hours. Students were randomly allocated to male or female ‘section instructors’, mitigating any selection problems and allowing the researchers to explore the influence of gender on students’ evaluation of their teacher. Course evaluations were completed via a ‘double blind’ process, whereby students did not know their course results prior to evaluation, nor instructors their evaluation results prior to grading students’ work. The authors also compared their results with the findings of other research, including a similarly large-scale study involving both French and US universities and a range of disciplines (Boring et al. 2016, below).

The results of the Maastricht study are neatly summarised in the researchers’ own words: “female faculty receive systematically lower teaching evaluations than their male colleagues despite the fact that neither students’ current or future grades nor their study hours are affected by the gender of the instructor… [i.e.] differences in teaching evaluations do not stem from objective differences in instructor performance”. This finding of gender bias holds across a number of robustness checks (detailed in section 4.2 of the paper). The bias is strongest from male students (although statistically significant regardless of student gender) and towards junior instructors. It is also larger in quantitative courses, and is independent of the male/female staff balance, i.e. not simply a product of female staff being in a minority in a particular subject area. Strikingly, the bias appears even in students’ evaluations of their learning materials, despite these being held in common across the entire course cohort rather than specific to the instructor; this led the researchers to hypothesise that students ‘anchor’ such responses to their evaluations of the teacher.

Mengel et al. note that course evaluations are increasingly used to inform judgements about staff teaching performance, which in turn may be used not only for teaching awards, but also in recruitment, retention and promotion decisions. They argue that gender bias in course evaluations could therefore contribute to the continuation of male dominance in many academic fields, and conclude that such evidence “should be used with caution”. They also note that their findings show no evidence of significant reduction in students’ bias over the course of their university experience, and thus express concern that these students will continue to exhibit bias “as graduates from one of the leading business schools in Europe… occupying key positions in the public and private sector across Europe for years to come”.

Additional resources and practical tools

Course evaluation

University of Birmingham module evaluation guidance: https://intranet.birmingham.ac.uk/as/registry/studentdata/modeval/index.aspx [accessible only to University of Birmingham staff]

  • A comprehensive guide to module evaluation at the University of Birmingham, encompassing the principles and objectives underpinning the University’s approach; practical guidance on using the system; and policy information on how survey results should – and should not – be interpreted and used.

Implicit bias

Project Implicit: https://implicit.harvard.edu/implicit/uk/

  • “Project Implicit is… [an] international… network of researchers investigating implicit social cognition [which] translates that academic research into practical applications…” – includes a series of self-assessment tools for unconscious bias.

University of Birmingham ‘Pearls of Wisdom’: https://intranet.birmingham.ac.uk/staff/development/pearls/biases.aspx [accessible only to University of Birmingham staff]

  • This short animated video (with transcript) introduces the concept of implicit bias and gives some practical advice on tackling bias in your own thinking.

Further reading

Boring, A., Ottoboni, K., & Stark, P. B. (2016). Student evaluations of teaching (mostly) do not measure teaching effectiveness. ScienceOpen Research. https://www.scienceopen.com/document?vid=818d8ec0-5908-47d8-86b4-5dc38f04b23e [For an overview, see https://www.insidehighered.com/news/2016/01/11/new-analysis-offers-more-evidence-against-student-evaluations-teaching]

  • An influential large-scale study of student evaluations in both Europe and the US, which found not only evidence of significant gender bias, but also a lack of correlation between evaluation scores and measures of teaching effectiveness such as student performance.