Editorial

Maclean’s magazine released its annual university rankings last week. Once again, Concordia has placed at the bottom of the pack, ranked 11th – last – in the category of “comprehensive” universities, schools that have extensive undergraduate and graduate programs but no medical school. This has put Concordia in strange company, pitting some of Canada’s largest universities (us, York, Simon Fraser) against schools that are half their size (Memorial, Guelph, Victoria). While some of these schools offer programs of study comparable to Concordia’s, others do not.

As with all university rankings, we must question how valid any comparison of this nature can be. How can one actually compare the quality of an education? How do we measure success? How do we even define success?
All university rankings that come out seem to give different answers.

Université de Montréal was ranked fifth in Canada by the Times Higher Education-QS World University Rankings, based out of England. Meanwhile, Maclean’s had UdeM tied at 13th with Sherbrooke in its category of 15 medical-doctoral universities.
Times Higher Education had Concordia at 20th in the country, again the lowest ranking of the group. But of the 11 comprehensive universities listed by Maclean’s, only five made it onto the Times Higher Education ranking.

Concordia’s John Molson School of Business, which has been rated far more frequently than the entire university, has also had a wide variety of rankings: 34th in the world, and third in Canada, according to the Aspen Institute (incidentally York, which ranked ninth in Maclean’s, was the “best” in the world); 80th in the world, and fifth in Canada, according to Financial Times (York was 23rd); 18th outside of the United States, and fifth in Canada, according to Forbes magazine.

Clearly, there is little rhyme or reason to many of these rankings, in part because different rankings use different criteria. For Maclean’s, reputation is an important factor. But since we don’t actually know how our reputation is being judged, or who is doing the judging, it’s hard to say why these opinions matter.

Many Canadian universities took issue with the methodology used by Maclean’s, and for years refused to provide information. This led the magazine to rely solely on publicly available studies, like those released by Statistics Canada and the Association of Universities and Colleges of Canada.

But the problem is, there isn’t actually a study for every grade that Maclean’s gives. Maclean’s judges the student-teacher ratio – which plays a big role in the rankings – based on the ratio of full-time teachers to students. There are no publicly available figures on part-time teachers in Canada. Institutions like Concordia, where part-time faculty teach 40 per cent of the classes, look like they have much larger class sizes than they actually do.

It is no secret that this issue is one of the best-selling – if not the highest-selling – issues of Maclean’s each year, picked up by anxious parents who want their children to have the “best.” But best what? Best experience? Best chance of getting a job? Best chance to get a higher salary? Best chance to explore something to its full potential?
While there are certain factors that can be compared and contrasted, it is impossible to give definitive rankings to something as complex and diverse as universities and the university experience.

A university education is what you make of it. If a student wants to make an informed decision, they need to rely on a lot more than one issue of a magazine.
