After the brief introduction from last week, I've decided to go ahead and dive into one of the issues that affect almost all students at the University of Cincinnati: professor and course evaluations.
If you're like me, you probably get a little annoyed when all those emails start flooding in at the end of the semester asking you to take the surveys and reminding you once, twice, maybe even three times. And if you're like me, you might fill out one or two, or you might let them sit and then forget about them.
Professor and course evaluations are supposed to be a way for students to give feedback that, in theory, paves the way to a better experience for future students. Unfortunately, right now, they don't really work that way. Let's take a look at what's holding them back, and what we can do about it.
Why Professor/Course Evaluations Are Ineffective
There's no question that our professor and course feedback system isn't as good as it could be. Participation rates are low, and it's questionable whether student feedback ever actually leads to changes in course design or professors' teaching methods. Here are the main problems I see with how they're currently done:
1. Students don't like filling out lots of long surveys
This one is pretty self-explanatory: as I mentioned above, when the emails start flooding in at the end of the semester, most of us fill out one or two surveys and let the rest slide.
2. Application is inconsistent between colleges
Let me explain this one a bit. I take classes in both CEAS and LCB (that's the engineering and business colleges), and just like every other student, towards the end of last semester I received bunches of emails asking me to fill out the surveys. I held off on responding until after exams finished. That's when I discovered that the deadline for the LCB surveys was before exam week, and the deadline for CEAS surveys was after exam week. Like...what?
This difference between colleges is silly for a couple reasons. First of all, two separate deadlines? Is this really necessary? Second, what is the point of a deadline before exam week? How are students supposed to:
- Find time right before exams to fill out a bunch of surveys?
- Evaluate a class that's not actually finished yet?
- Answer a question like "are exams relevant to the material taught in class" before taking the exam?
Fixing this problem would be relatively simple: standardize deadlines across colleges and make sure the deadline is after exam week. But there is still one big issue with our current course evaluation system...
3. Students don't have confidence that feedback leads to change
It's hard to expect students to fill out a bunch of surveys when they really don't know if anyone even looks at them and if anything is ever done about them. Clearly there's room for improvement here. How can we truly convince students (including myself) that their feedback is valuable?
Fixing the Problem
First, let's consider a proposal made last year during the student government elections. What about setting up a RateMyProfessor website specifically for UC students? Unlike the current course evaluations, RateMyProfessor is actually used commonly and enthusiastically by students. It's an interesting idea, but I'm not sure it's the best solution. Why?
First, it's a little redundant. UC students already use RateMyProfessor, so I'm not sure how much benefit it would be to have a separate one just for UC students. I understand there would be some differences, but it's the same basic concept.
Second, on a platform like RateMyProfessor, we'd have to be wary of anonymity enabling baseless accusations against professors if the platform is actually going to be used to bring meaningful change.
Third, I think the current course evaluations do a better job of collecting real feedback (beyond just ratings). The problem is that right now the feedback is limited in quantity, and who knows where it goes after you submit it.
So while I think RateMyProfessor is a great resource for students looking for good professors, I don't think it's the best way to bring about change in current courses and teaching.
So what's the magical solution?
Here's an idea that the Engineering & Applied Science Tribunal has been considering. Our college already has committees with student representation on curriculum, academic standards (disciplining students), and co-op. How about:
A committee composed of deans, faculty, and students, designed to review the feedback surveys and issue recommendations to faculty based on the results.
Sure, it's not a home run, but I think this would go a long way towards making students feel like it's worth their time to fill out the surveys, because something might actually be done with them. Student participation would ensure transparency and accountability, and a collaborative approach with deans and faculty would help the committee weed out silly complaints and focus on fixing real issues with courses and professors. The surveys would have to remain confidential, but I don't think that would be a problem.
What do you guys think? Is there a better way to make the course evaluations more effective? Is this already being done in another college, and I'm just not aware? Is there a bigger problem with course evaluations that I'm missing?
Leave a comment below!