Opinion | A breakdown of U.S. News & World Report’s ranking method: possibly bogus
Published: Thursday, September 12, 2013
Updated: Thursday, September 12, 2013 23:09
U.S. News & World Report’s ranking of national universities has become the accepted measure of quality for universities in the United States, and institutions and their communities seem to take considerable pride in the results. The annual rankings were published this week, with Miami University tying for 75th this year, a step up from 90th last year. Good news, right? Well, looking closer at the factors that determine the rankings, it seems that they are, simply put, bogus.
Let’s break down how the ratings are calculated, all according to U.S. News & World Report’s “How U.S. News Calculated the 2014 Best Colleges Rankings,” published on its website. First off, there is “undergraduate academic reputation,” which accounts for 22.5 percent of the formula and is based entirely on a peer ranking survey taken by “top academics – presidents, provosts and deans of admission,” as well as college counselors at nationally ranked high schools. In other words, almost one fourth of the ranking formula rests on the opinions of people who probably, at best, have a superficial understanding of the academic programs at the schools they are asked to rank. Clearly not ideal.
Secondly, there is “faculty resources,” which accounts for 20 percent of the formula. This takes into account faculty salary, average class size, faculty-student ratio and the proportion of faculty holding terminal degrees. How does faculty salary have anything to do with the quality of a school? Generally speaking, yes, smaller classes and more one-on-one interaction make sense, but many large lecture courses include break-off sections that provide one-on-one interaction as well. And does holding a terminal degree, by default, allow a professor to teach more effectively?
Thirdly, there is “student selectivity,” accounting for 12.5 percent of the formula. This mostly considers the average SAT (math and reading only) and ACT scores of those enrolled, as well as high school class rank. Only 10 percent of the “student selectivity” score factors in acceptance rate. How does the SAT or ACT, which students take before they ever enter an institution, reflect anything about the quality of a university? Shouldn’t the focus be on how an institution changes its students after they enroll?
Three more criteria are “financial resources, graduation rate performance and alumni giving rate,” accounting for 10 percent, 7.5 percent and 5 percent of the formula, respectively. Providing financial resources to students and faculty is highly important, but won’t the wealthier schools receive the higher rating no matter what? And how do “graduation rate performance” (which compares a school’s actual graduation rate with U.S. News’ predicted graduation rate) and annual alumni donations have anything to do with the quality of an institution?
Finally, there is “retention,” which accounts for the last 22.5 percent of the ranking formula. This score is based on the six-year graduation rate (the average proportion of students in a class earning a degree in six years or less) and the first-year retention rate. These figures should be factored into the picture, but they are seemingly the only logical aspects of the entire rating formula.
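Taken together, those seven category weights add up to 100 percent of the formula. A quick sketch makes the arithmetic concrete; note that only the weights below come from U.S. News’ published 2014 methodology, while the category scores in the example are invented for illustration:

```python
# Weights from U.S. News' published 2014 methodology.
# The example scores further down are purely hypothetical.
WEIGHTS = {
    "undergraduate academic reputation": 0.225,
    "retention": 0.225,
    "faculty resources": 0.20,
    "student selectivity": 0.125,
    "financial resources": 0.10,
    "graduation rate performance": 0.075,
    "alumni giving rate": 0.05,
}

# Sanity check: the seven categories cover the whole formula.
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9

def overall_score(category_scores):
    """Weighted average of per-category scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[name] * score for name, score in category_scores.items())

# Invented scores for a hypothetical school, just to show the mechanics:
example = {
    "undergraduate academic reputation": 80,
    "retention": 90,
    "faculty resources": 70,
    "student selectivity": 85,
    "financial resources": 60,
    "graduation rate performance": 75,
    "alumni giving rate": 40,
}
print(round(overall_score(example), 2))  # weighted composite, e.g. 76.5 here
```

The sketch also makes the article’s complaint visible: 45 percent of the composite (reputation plus retention) is locked up in survey opinion and graduation statistics before any measure of what students actually learn enters the picture.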
U.S. News & World Report needs to redevelop its rating system to focus more on student success after college, the quality and variety of experiences provided to students (research, studying abroad, student organizations, etc.), the rate of acceptance into graduate programs, and job placement directly following graduation. It could also weigh performance on examinations such as the CPA, GRE, MCAT and LSAT taken after university preparation, as well as how enrolled students and faculty view their own university’s programs and preparation.
Luckily, some are beginning to understand the absurdity of the current ranking system. The Washington Post even published an article on Sept. 10 titled “Why U.S. News college rankings shouldn’t matter to anyone” that attempts to expose the rankings’ invalidity. Hopefully, more readers will soon be aware of the uselessness of U.S. News & World Report’s rankings. If a new ranking system is not developed soon, next year’s list might as well be determined by lottery.