We live in a competitive society. We’re constantly comparing cars, electronics, grades and even people. But is what we’re comparing really the same, or are we trying to find a link between apples and oranges? As incoming freshmen, we’re extremely familiar with the college application process and how competitive it really is. I’d be willing to bet that many of us used the U.S. News and World Report college rankings not only as a starting point for the college search, but also as a way to decide whether one school was better than another. As many of you know, the 2008 college rankings were released last week. Wash. U. held steady at number 12, a spot we share with Cornell, one of the prestigious Ivy League schools. But what does being number 12 really mean? It may give us bragging rights, but what are we really bragging about?
Most people probably don’t know how U.S. News derives its annual rankings. Twenty-five percent of a school’s overall rating comes from a peer assessment: a survey sent to colleges asking what they think of other colleges. This survey counts for more than any of the other statistics used to come up with the rankings. The U.S. News Web site describes the survey as allowing “the top academics we consult - presidents, provosts, and deans of admissions - to account for intangibles such as faculty dedication to teaching.” These professionals are asked to rate each of these “intangibles” for a given academic program on a scale of one to five. Coupled with the hard statistics used in the ratings, the survey may well be helpful, but should it really count for more than anything else? For undergraduate business and engineering programs, the criteria are even less scientific: one hundred percent of a school’s rating is based on the peer assessment. It seems crazy to think a school should be ranked exclusively on the opinions of a few people.
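To make that 25 percent concrete, here is a rough sketch of how a weighted composite score behaves when one survey category carries that much weight. The category names, the remaining weights and all of the sample scores below are hypothetical; only the 25 percent peer-assessment figure comes from the methodology described above, and U.S. News does not publish its formula in this form.

# Hypothetical sketch of a weighted composite score (Python).
# Only the 0.25 peer-assessment weight is taken from the column above;
# every other weight, category and score here is made up for illustration.

weights = {
    "peer_assessment": 0.25,      # the survey, 1-5 scale rescaled to 0-100 (assumption)
    "retention": 0.20,            # hypothetical
    "faculty_resources": 0.20,    # hypothetical
    "selectivity": 0.15,          # hypothetical
    "financial_resources": 0.10,  # hypothetical
    "alumni_giving": 0.10,        # hypothetical
}

# Two imaginary schools with identical "hard" statistics; only the survey score differs.
school_a = {"peer_assessment": 90, "retention": 85, "faculty_resources": 80,
            "selectivity": 75, "financial_resources": 70, "alumni_giving": 60}
school_b = dict(school_a, peer_assessment=70)

def composite(scores):
    """Weighted average of category scores on a 0-100 scale."""
    return sum(weights[k] * scores[k] for k in weights)

print(composite(school_a))  # 79.75
print(composite(school_b))  # 74.75 - a 20-point swing in the survey alone moves the total by 5

Under these made-up numbers, the opinion survey alone can shift a school's composite more than any single hard statistic, which is the point the column is making.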
There certainly can’t be a perfect way to rank schools, because who’s to say what makes one school better than another? I know it may sound crazy, but it’s possible that there’s more to a college than how many students it admits and the percentage of alumni who donate money (both of which are criteria U.S. News uses in its rankings). U.S. News neglects so many important factors when judging a school that the rankings seem inherently flawed.
Factors like campus life and student opinion matter far more in choosing a school than an average SAT score does.
You also may have heard that many schools now refuse to submit their statistics or fill out the peer assessments. That seems like a step in the right direction: schools are finally recognizing the problems with the ranking system. However, if U.S. News keeps releasing its rankings using old data and fewer peer assessments, those rankings will only become less reliable. When a school refuses to release its updated statistics and U.S. News cannot find them through other sources, the magazine uses the most recent year of data it has on record, which means schools that withhold data could be ranked on outdated numbers. As for the peer assessments, only fifty-one percent of the people asked to fill them out responded this year.
I don’t believe U.S. News will ever stop ranking colleges, considering the rankings are one of the magazine’s best-selling issues each year. But I do think people are becoming more conscious of what these rankings really mean, and that this will lead them to treat the rankings as more of a guideline than a college bible. I’m not saying we should stop comparing schools; rankings and comparisons can be a helpful way to begin a college search. I am merely suggesting that they be read with a more critical eye, and perhaps taken slightly less seriously.
Andrea is a freshman in the Olin School of Business. She can be reached via e-mail at [email protected].