Rankings raise many questions
As students gear up for finals, the universities themselves have been graded by the Globe and Mail, the National Post and, of course, the contentious Maclean’s rankings. All of these surveys and report cards claim to help students and their parents determine where to send applications for 2007.
That each survey uses different indicators, weighted and presented differently, is the first sign that this exercise is more complex than it would appear. The Globe and Mail’s insert acknowledges that “there simply is no ‘best university’ across all academic program offerings.” The paper offers a tool, the Navigator, at globeandmail.com/reportcard, that allows students to customize variables and generate an individualized ranking.
As Brad Tucker, Director of Institutional Planning, said, “The Navigator recognizes that universities have different strengths and weaknesses, and that what one student is looking for in a university experience may be very different from what another is looking for.” Students studying close to home are no more concerned about residence quality than science students are about state-of-the-art performance facilities.
In the Globe and Mail’s University Report Card, published Oct. 31, Concordia scored high in teaching, student/teacher interaction, class size, and overall student satisfaction. Students rated 100 variables on a scale from 1 to 5. A score of 3 was graded as a D, which was perhaps a bit harsh.
Tucker is satisfied that the Globe and Mail’s methodology is in line with a focus on outcome rather than input measures. Outcome measures capture the added value that the university itself provides. In other words, the average grade of students entering the institution is not nearly as important as their knowledge and ability when they leave.
The Globe’s data was collected through a sample of students who accessed a financial aid database.
Any snapshot will have some gaps. Concordia received low marks for recreation and residence facilities. Both areas are currently undergoing a major overhaul, with the new fitness centre under construction in the EV Building and hundreds more beds to be available in the Grey Nuns Motherhouse next September.
It is the university’s ongoing internal assessment, through tools like the National Survey of Student Engagement (NSSE) and the Canadian Undergraduate Survey Consortium (CUSC), that identifies these weaknesses and determines how to address them.
The Maclean’s report relies on universities’ participation in its surveys, in addition to results from independent surveys, to produce the bulk of its university issue. With fewer than half of the 47 institutions cooperating with the magazine’s own survey, it made do with what information it could access, publishing survey results that institutions provide for free on their own web sites.
Concordia did not participate in NSSE for the year evaluated (2005); this year’s data will be up on our web site soon. CUSC data, never intended for use in ranking exercises, is nevertheless being used that way by Maclean’s. Rather than explaining why Concordia did not supply the information, the issue listed it as non-compliant.
The NSSE National Advisory Board does not support the use of this data for the purpose of rankings: “Reducing student engagement to a single indicator obscures complex dimensions of student behaviour and institutional performance.”
Maclean’s overall ranking calculation does not include NSSE or CUSC; it is based on indicators ranging from “class sizes and average entering grades to spending, library volumes and faculty success in obtaining national research grants.” Concordia stayed eighth in the ranking for universities of its size and mission, but Tucker points out that “there is no evidence that their indicators and weightings have a proven relation to institutional quality.”
The National Post’s Innovation Leaders is based on research funding figures available through Statistics Canada. There is no indication whether multi-year funding is prorated or counted entirely in the year it is awarded.
Concordia shows a 7.1-per-cent drop in research funding from the previous year; Quebec as a whole dropped 13 per cent over the same period.
The tool the National Post used to determine faculty productivity “cannot really substitute for an indicator that takes into account other types of productivity, such as books, media, etc., especially for us, with our Fine Arts Faculty,” Tucker said.
Ultimately, he said, the universities themselves need to determine where they are and where they want to be, and it is the needs and interests of each prospective student that drive his or her choice of university.