McMaster gets its annual report card

Julia Redmond
November 1, 2012

Remember the feeling of getting your report card? McMaster was faced with that feeling this October, as the Globe and Mail published its annual Canadian University Report.

The assessment, released on Oct. 25, tried to get away from the largely data-based rankings of other organizations, instead assigning letter grades to different aspects of the university’s performance based on student surveys.

And McMaster’s administration was certainly pleased with the report card results.

“It’s extremely gratifying to be ranked by students as providing the highest quality teaching and learning experience in Canada,” President Patrick Deane told the Daily News, referring to McMaster’s first-place finish in its division for quality of teaching and learning.

Most notably, Mac ranked first in campus atmosphere, research opportunities and quality of teaching and learning, as well as second in student satisfaction, where it placed behind Western.

It also made an impression at the lower end of the large school division, placing second to last in city satisfaction and information technology. And naturally, McMaster’s infamous SOLAR system earned the university last place in course registration.

“If you take all the rankings, they add up to an interesting perspective that we’re strong, but there are some areas which need our attention,” said Deane.

The Globe’s rankings are based on a survey of current undergraduate students. For the 2013 rankings, 33,000 undergrads responded, and their answers, given on a scale of 1 to 9, were converted into corresponding letter grades. But the entire premise of this style of ranking is problematic, said Lonnie Magee, an economics professor at Mac.

“How would a university student be able to know about another school?” he asked. “It’s so driven by how you compare it with what you’re expecting.” He explained that since students attend only one university, such a comparison is not particularly useful.

The Globe and Mail addressed this criticism in its 2012 Canadian University Report, released last October. Alex Usher, president of Higher Education Strategy Associates, an education consulting firm that advises the Globe on the annual report, argued that student surveys are a reliable method of devising rankings.

“Another criticism [of the report] was that student[s] … had no idea what was available at any school other than their own. That’s true to some extent – but if year after year a particular institution gets results which are particularly good or particularly bad compared to other institutions of its type, then the results start to gain in validity,” Usher argued.

Magee noted that such results come from the “temptation to make the results more objective, to accumulate statistics and present them to show that your rankings are based on these ‘hard facts’ that have been collected.” He cautioned that qualitative factors like student satisfaction are tough to compare.

The Canadian University Report is one of two major Canadian university ranking publications. The other, administered by Maclean’s, is the better established of the two. It will release its 22nd annual rankings issue this year, while the Globe has just published its 11th.

Rather than following the Globe and Mail’s approach of a heavily student-based survey, Maclean’s compiles a number of factors to generate its rankings. To make comparisons fairer, schools are divided into three categories: medical-doctoral, comprehensive and primarily undergraduate.

But the factors it uses for this comparison, drawn largely from Statistics Canada and federal funding agency data, are sometimes criticized as not being entirely relevant to students or administrators.

Mike Veall, an economics professor at McMaster, has published work on the effectiveness of the Maclean’s rankings. He described their methods as being a “little bit suspect in terms of gaining indicators.”

“It’s not quite clear that the indicators match quite well with what students or administrators should care about,” he said.

Among the many factors, the rankings consider data such as the number of library holdings and the amount of money available for current expenses per weighted full-time-equivalent student.

McMaster has also been rated by broader, global organizations. But these, too, have their limitations.

The Times Higher Education (THE), for example, produces a rankings issue considered to be one of the best in the world.

This year, McMaster placed 88th overall in their report. But the THE also ranks by faculty, and in the “clinical, pre-clinical, and health” category, McMaster earned 14th place in the world, making it the top school in the category in Canada.

Meanwhile, QS, a British firm, ranked McMaster 152nd, while the Shanghai-based Academic Ranking of World Universities (ARWU) placed the university at 92nd this year.

International rankings methods provide a different set of criteria. While Maclean’s and the Globe consider student satisfaction, such firms as QS and THE factor in a school’s industry influence and international impact – an area where McMaster can’t compete as well, especially when factors like number of Nobel Prize winners are considered.

But in the end, a bad report card doesn’t have a huge effect on a school, Veall said. In a 2005 study co-authored with Qi Kong, a Mac undergraduate at the time, he concluded that a change in ranking has little effect on a school’s enrolment share or the entrance average of its students. A shift of one place in the rankings can, at best, change the mean entrance average by 0.3 percent, although Veall emphasized that this conclusion was “not particularly robust.”

Still, even if the rankings don’t matter much in the end, that doesn’t mean McMaster can’t be happy with a good report card.
