Student evaluations suck, mostly.
Imagine that you’re up for a promotion at your job, but before your superior decides whether you deserve it, you have to submit, for assessment, the comments section of an internet article written about you.
Sound a little absurd?
That’s in essence what we ask professors in higher education to do when they submit their teaching evaluations in their tenure and promotion portfolios. At the end of each semester, students are asked to fill out an evaluation of their professor. Typically, they are asked both to rate their professors on an ordinal scale (think 1–5, 5 being highest) and to provide written comments about their experience in the course.
We’ve repeatedly seen studies showing that student evaluations are skewed to favor the popularity and attractiveness of the professor (damn, I lose), and this article points out that there is a gender bias as well: male professors tend to get higher ratings than female professors (so that’s how I’ve managed to get along). That means these evaluations are discriminatory. And therefore illegal. Cool.
Now, I said that student evals suck, mostly. They’re sometimes useful: not the goofy numerical scores (and I ignore comments that whine about how hard the class is), but the productive, thoughtful comments can be genuinely helpful. If a student says “X worked for me, Y didn’t,” I’ll seriously reconsider X and Y.
With the last batch of evaluations I got back, I ignored the numerical scores and just browsed the comments for practical concerns. I found one: there’s a lot of grade anxiety out there, and students really wanted the gradebook available online so they could see exactly where they stand, point by point. OK, I can do that. Not with our existing software, Moodle, in which the gradebook is a confusing nightmare, but we’re switching to new courseware next year, so I’ll look into it.
I guess it’s good that this was the biggest problem they had with the course. But the stupid numbers are what the administration will care about.


