How to get the worst from your student course evaluations
Eric Bohms | Managing Director, Electric Paper Ltd.
Module evaluation is, at its heart, an instrument designed to provide the academic with useful feedback.
In this article, rather than discuss how to implement best practice around course evaluation, I have decided to write about how to get the absolute worst from your investment in technology. So, here are my seven recommendations to ensure useless results, leading to meaningless metrics, disengaged students, angry academics and management teams with no idea of what is happening in their teaching rooms!
1. Ensure you invoke survey fatigue
In order to increase course evaluation survey fatigue amongst your students, don’t try to identify what surveys are being run across your institution. I have met several HE providers that have had very successful survey fatigue programmes, with students being asked to complete 60 different surveys throughout the year – well done! The more the better, in fact, to ensure the highest level of survey abuse. Just keep the surveys rolling – students love them, as do the vendors supplying commercial systems to deliver them!
2. Be very sneaky
Don’t worry about data protection when running online surveys. Don’t inform students that there is a team of learning technologists with a live feed, monitoring and collecting the time, location, gender, pass rate and attendance of survey participants. Students don’t mind at all! It’s the age of social media, they don’t care about their personal data.
3. Make sure your academics collect the surveys personally
When using paper surveys, nothing makes a student feel better about giving honest feedback than the personal touch of an academic watching over them while they complete it, then saying thank you as they hand in an evaluation of that academic’s performance.
4. Make sure you never actually tell the students what measures or actions will be put in place as a result of their feedback
When running module evaluation, just keep sending out surveys – students will never tire of completing surveys with no discernible outcome or change to teaching practices or resource provision.
5. Academics are not too busy
Yes, I know many of them have to produce research, are now being pushed to improve their teaching, and there’s that pesky feedback and contact hours topic always popping up in the NSS. Why not get them to run their own surveys as well? Just give them a log-in and let them go at it; they will appreciate it, and I’m sure they won’t mind a bit of clandestine snooping by the analytics team at all. In fact, while you are at it, why not drive all your enhancement efforts off the comments of students? I can think of no better way to create happy academics than to escalate the vindictive comments of a few disaffected students to the management team!
6. For online surveys, don’t use precious in-class university time
Just send the surveys out and hope the students complete them. Whilst they are tech-savvy and live on their smartphones, the distractions of chatting with their friends and checking social media will ensure the survey is always put off until later…and later…and later…result!
7. Lock students out of their Virtual Learning Environment (VLE) until they complete the survey
This is one of the best techniques if you want a 100% response rate with questionable integrity! There is no better way to ensure rubbish results than to blackmail students into providing thoughtful feedback.
In all seriousness, all the examples I’ve mentioned are real and well-meaning attempts at enhancing the learner experience at university. You can imagine how successful they’ve been.
So let’s address these challenges, focusing on three key areas:
1. Students
When they arrive at university, students have very little experience of participating in market research, so a trusted advisory relationship has to be cultivated to convey the importance and responsibility they have as stakeholders in their education. Further, efforts to gain trust and buy-in must be sustained (after all, students and student reps are only there for three years), and the reasons for participating in course evaluation must be clear and mutually beneficial. This is why it is so important that students see tangible, timely outcomes from their feedback. Seen in this light, it makes no sense to use negative incentives to encourage participation, to disregard the protection of their personal data, or to be vague about anonymity.
2. Academic Staff
Due to the increasing diversity of cultures and identities in UK HE provision, and the variety of expertise that needs to be drawn on across disciplines, consultation is vital in winning the hearts and minds of academics. Equally important is to choose the right survey questions, ensure sensitivity around the use and visibility of textual comments, and be transparent about the implications of course evaluations for performance monitoring. Tempting as it might be for university managers to indulge in ‘sneakiness’, module evaluation is, at its heart, an instrument designed to provide the academic with useful feedback to improve the delivery and design of their courses. Academics need to believe and trust in course/module evaluation policy, or they may actively work against its success.
3. Management
Management teams have a responsibility to support the academics as well as to ensure the overall quality of teaching and learning for their students. Therefore, it is right that they have access to timely and meaningful metrics which enable them to identify excellence as well as areas that need additional resources. The proliferation of surveys needs to be regulated to avoid ‘survey abuse’, and policies must be clear as to how, or whether, the results will be used in performance monitoring of academic staff and in resource allocation, and how the results are communicated to students.
To conclude, what these best and worst perspectives demonstrate (and it pains me as a vendor to say it!) is that acquiring the right software is simply not enough. However, if you disagree, my software is available in six easy interest-free instalments…
Eric has nearly 20 years’ experience of working in the software sector. Before setting up Electric Paper Ltd in the UK in 2009, he held a number of senior national and international roles across sales, marketing, operations and projects, as well as setting up Cardiff Software Ltd in 1996. Eric holds a Master of Science in Technology and Innovation Management from the John H. Sykes School of Business, University of Tampa, and a Bachelor of Arts degree in History from the University of San Diego. He is passionate about the HE student experience and has specialist knowledge of HE policy and processes in this area. He also specialises in disruptive technologies, new product development, knowledge management, survey design and deployment, and testing and assessment.