A pilot project on online course evaluations, conducted this semester in the College of Engineering (CoE), has been deemed an overall success. But the University will delay a campuswide launch of the new system until further improvements can be made, say those who led the trial.
More than 74 percent of students enrolled in CoE courses used CTools to fill out at least one online Teaching Questionnaire (TQ) during the Oct. 10-16 evaluation period. Of the total number of questionnaires possible (one per student per class, or 26,148), about 43 percent were completed online, according to James Kulik, director and research scientist at the Office of Evaluations and Examinations (E&E), which administers the TQ system.
“We were pleased with the participation, which was twice as high as in past years when midterm evaluations were collected online,” says Noel Perkins, professor of mechanical engineering and a member of the college’s Engineering Teaching Academy (ETA). The online midterm evaluations were initiated as a joint project between the college’s Undergraduate Student Advisory Board and the ETA. Engineering has been collecting midterm feedback online since 2005, but the rate of response to these evaluations in past semesters has been 20-24 percent.
Kulik attributes the greater participation this year to the e-mail reminders sent to students throughout the evaluation period. A first reminder went out the day the questionnaires became available online. Students who didn't respond received a second reminder three days later, and a third after another three days if they still had not participated.
At the end of the evaluation period, teachers were able to access summary reports of their rating results on Wolverine Access, and individual ratings and comments were also sent to them via e-mail. The summary reports, posted for teachers on 850 course sites, were accessed more than 1,300 times, says Lisa Emery, a senior business systems analyst with Michigan Administrative Information Systems, who helped design the software for the project. “That suggests that almost all teachers viewed their rating results once or twice in the days following the release of the results.”
Still, project leaders found areas for improvement in the online evaluation system, and the University now plans to postpone the rollout of the new TQ system from Winter 2007 to Fall 2008 so that changes can be made. Planned improvements include consolidating e-mail messages to students and eliminating redundancies among them, improving the reporting of results to faculty, and consolidating evaluation forms in team-taught courses.
Redundancy in e-mails seems an especially important problem, Kulik says. “Separate e-mails went out to students for each of their classes,” he notes, “and as a result, some students received a dozen e-mails, where one or two comprehensive e-mails would do. E-mail reminders are valuable tools, and we have to use them more carefully and in a more user-friendly way in the future.”
The change to online evaluations is in keeping with a University move toward more electronic processes. The system has many of the same features as the current TQ but, once tweaked, will give users quicker access to course evaluation data while eliminating the costly and time-consuming paper process of creating forms, questionnaires and reports.
Like paper-based evaluations, online evaluations protect anonymity, stripping the student’s identity as he or she logs off. Although the program can keep track of who has filled out the form, allowing reminders to be sent, specific responses are free of identification.
Other long-term advantages of the Web-based system include more freedom in the design of questionnaires and a greater level of customization for individual courses or instructors.
