By Chieu Luu Luong
David Steele sits in his RyeSAC office trying to calm his nerves with a bottle of scotch. It’s election night and Steele is up against four other candidates for the student council presidency. He’s been campaigning for three weeks, but his run for the presidency began in 1996 when he ran for and won a seat on RyeSAC’s board of directors. His main campaign issues include pushing for a student centre and revamping the Instructor Course Evaluations (ICE). “It’s going to be more than a three-minute shade-in-the-dot-of-your-choice process,” he says. His goal is to make professors more accountable by publishing a calendar reviewing them.
Minutes before the final numbers are posted, Steele makes his way to the Ram in the Rye. All the other candidates are there, along with their supporters and members of the press. A runner brings in the results of the last polling station. Steele inches his way to the white election board. He beats his closest competitor by 109 votes.
Nine months have passed since Steele took office. He’s worked hard to remain true to his student centre promise — co-chairing the student campus centre committee and sitting on its three sub-committees. But with so many hours tied into this project, Steele’s election promise to improve ICE — to hold professors accountable and publish a calendar reviewing them — has fallen by the wayside.
But instead of pressing the issue like student presidents before him, who lobbied for a transparent faculty evaluation process including a published calendar, Steele has taken a different direction, choosing to work with one department at a time. He says he no longer wants to push for the publication of survey results because some faculty members may see it as a “witch hunt” and, as they did in the past, refuse to participate in ICE.
Yet it seems odd that ICE, which started off as a student council venture in 1972 and included a published calendar of results for seven years, hasn’t been a significant part of RyeSAC’s agenda since the Ryerson Faculty Association (RFA) agreed to include the evaluations in its 1994 collective agreement on the condition results wouldn’t be published.
This was the best student council and administration could come up with after more than 15 years of negotiations, research, reviews and test runs — pink, computerized sheets consisting of 11 general statements, which are handed out to students at the end of each semester asking them to evaluate instructors and the courses they teach.
Perhaps it was this lengthy battle to get faculty to agree to evaluations that could not be published or used for punitive measures that has scared RyeSAC presidents off from challenging the ICE surveys. But in a year when the RFA and administration have been unable to agree on a new collective agreement, this is an opportune time for RyeSAC to once again champion changes to the ICE process — a process that students, faculty and administration all say needs fixing. Making amendments to the contract would be much easier now than after a new contract is in place. The ICE issue has been raised at the bargaining table and contract talks are expected to continue into the spring.
However, as Steele continues to focus on ironing out student centre details, the opportunity to create an improved ICE that satisfies students slips away.
“They’re [ICE] useless,” says Erin George, RyeSAC’s v.p. education. “No one sees the results [except instructors and administration], nothing is being done about the results, and they don’t address any of the problems with the overall program.”
Administration says the surveys are inadequate for evaluating different teaching methods and are designed for lecture classes, even though many courses at Ryerson are taught in laboratory situations.
And the RFA says the surveys don’t address the right issues because the statements are too general, and instructors are not given enough useful feedback to make any changes to their teaching techniques.
These problems exist because of stipulations entrenched in the RFA’s collective agreement, which remains in effect until a new one is hammered out. In the contract, the RFA agreed to allow administration to use ICE results as part of instructors’ annual reviews and when promotions, raises and tenure are being considered. But a clause in the contract exempts faculty hired before 1992 from having their ICE results affect their annual reviews. This means these professors are the only ones who will see the results of the ICE surveys filled out by their students, and they can do whatever they want with them. The contract doesn’t allow students access to the results. And the clause barring publication of ICE responses is a position the RFA refuses to budge from in current negotiations with administration.
“It’s a strange relationship because the instructors interact with the students, but they’re not employed by the students,” says John Morgan, the RFA’s chief negotiator. “They work for the university.”
But George says making ICE surveys transparent to students is the only way to ensure there is a measure of accountability.
“I think it’s really important for people to be able to see the results,” she says. “If results are not published, how can they be used as a tool for change?”
George says she’d support seeing student council bring back its own version of the “anti-calendar.” Between 1972 and 1979, student council used to distribute the faculty evaluation surveys to students, tabulate the results and then publish them in a book located at the student council office for all students to see upon request. It was patterned after the University of Toronto’s Arts and Science Students’ Union anti-calendar which has been in place since the late 1960s. This year, the U of T union produced 10,000 copies evaluating more than 1,600 courses. And for the sixth year in a row, it was published with approval from the dean’s office.
Jane Seto, the union’s executive assistant, says the evaluations are used by the administration when considering promotion and tenure. She says because the evaluations are part of a professor’s personnel file, the union must first get the professor’s permission before publishing the results in their anti-calendar. But she says over the years, allowing publication of the results has become “accepted reality.”
Last February, during his campaign for president, Steele said he wanted to make professors more accountable by publishing a calendar reviewing them, but he has since settled on a less controversial position: helping the School of Interior Design develop its own program-specific ICE surveys. It is something the school, not RyeSAC, has been pushing for for almost 10 years. Steele says it is safer to start out small and use interior design as a test group.
“We’d like to see how much time, how many resources, how many questions can be asked, and give it a trial run.”
The revised interior design ICE surveys may be ready as soon as April, and Steele hopes that if the pilot proves successful, other departments will follow suit and begin working on their own program-specific ICE surveys.
But Steele’s term ends in April and there’s no guarantee RyeSAC’s next president will continue working with individual departments to develop new ICE surveys. However, there is still time to take a more aggressive approach.
To date, the RFA and the administration remain embroiled in a lengthy and, at times, heated negotiation process. The talks are expected to continue until at least the end of Steele’s term. It took more than 15 years for faculty, students and administration to accept the current ICE process, yet in the past five, RyeSAC has put the ICE debate on the backburner. Without a strong student council voice to push the issue, students will never get the faculty evaluation process they have been asking for for more than two decades: one that is transparent and holds all professors accountable to both administration and students.