
ICE surveys miss their mark

By Eyeopener Staff

There are the students who use theirs to draw pretty designs.

There are the ones who are rarely in class, so they don’t even know when the surveys are handed out.

Then there are the ones who use them to tell professors what they thought of them and their courses.

But after students put down their HB pencils and the envelope is sealed on Ryerson’s twice-yearly Instructor Course Evaluations (ICE), the results disappear as far as students are concerned.

The administration doesn’t publish the findings of students’ surveys, as per the faculty’s collective agreement. It’s no wonder that students have a hard time taking Ryerson’s method of evaluating courses and professors seriously.

Those violet sheets with bubbles waiting to be pencilled in were vaunted as a way to bring accountability to Ryerson programs.

Give students input, the theory goes, and professors will pay attention.

But no one who witnessed the spotty history of the ICE surveys can take that sentiment to heart when the sheets are slipped into an official brown, sealed envelope and whisked away somewhere far from students’ eyes.

So why do we bother filling them out?

Perhaps, after six years of the current system, it’s time for change to bring more accountability to the whole process.

A Ryerson engineering Web site run by students publishes evaluations of professors, along with old lab reports, tests and assignments for some courses. When some professors realized what was being said about them, the site was almost muzzled last March. But its frank, useful input is exactly what’s sorely lacking in Ryerson’s outmoded stabs at gauging students’ opinions.

Other schools’ student councils have been far more organized in representing students who want to know more about their programs and how they rank.

This is far from where Ryerson stands. Our academic council, the body charged with conducting the twice-yearly surveys, is unwilling to release results any more detailed than the ones that rank Ryerson’s top five faculties.

But those rankings are so vague they don’t show what students really think of their programs. Is the school afraid of a revolt if every graphic communications management and interior design student realizes the programs they’ve spent so much to attend rank at the bottom of the list?

Professors have a privacy clause in their collective agreements, most recently negotiated in 1998, that states their own evaluations will be kept out of the public domain. But program rankings are far more generalized, so releasing them wouldn’t target any one instructor.

Then students would get the chance to question what administrators and faculties have done to earn consistently low or high rankings, and how they’re trying to improve on poor scores.

As it stands, getting ICE surveys filled out and returned appears to be more a twice-a-year chore than an effective tool for change.

So where do the surveys bring accountability?

Students, in their eagerness to get the surveys approved by academic council after years of wrangling, may have given up too many of the rights taken for granted at other schools such as the University of Toronto, where an anti-calendar gives the dirt on profs and courses.

It’s time to re-evaluate the surveys Ryerson students fill out, or simply put them on ice.
