College season is upon us again, and just in time US News and World Report has released its annual list of the country's top colleges. The rankings have not changed much, nor would it make much sense if they had (it is not as if anything has actually changed at any college over the past 12 months). Now, it is not my purpose to decry the use of rankings altogether. In fact, rankings have a legitimate place in the college process. First, they give students a quick and easy way to survey their options. Students can quickly associate certain colleges with others that they might not have learned about otherwise, and with admissions statistics they can quickly figure out the range, or tier, of schools to which they are best suited. Second, rankings push colleges to raise money and recruit faculty; they create a competition to be the best and to gain the recognition (or money) that comes with being the best.
But the unfortunate side of the college rankings is that there is really only one widely acknowledged ranking of colleges. Although other outfits have ranked world universities and business schools, US News and World Report has a stranglehold on our perception of what the best colleges are.
But does that also mean US News controls what we value in our colleges? Often lost in the headlines proclaiming which college is now the best is the actual justification for what makes one college better than another. Let's review US News' actual metrics:
Peer assessment (weighted by 25 percent): The U.S. News ranking formula gives greatest weight to the opinions of those in a position to judge a school's academic excellence. The peer assessment survey allows the top academics we contact--presidents, provosts, and deans of admission--to account for intangibles such as faculty dedication to teaching. Each individual is asked to rate peer schools' academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who don't know enough about a school to evaluate it fairly are asked to mark "don't know." Synovate, an opinion-research firm based near Chicago, collected the data; 57 percent of the 4,098 people who were sent questionnaires responded.
Retention (20 percent in national universities and liberal arts colleges and 25 percent in master's and comprehensive colleges): The higher the proportion of freshmen who return to campus the following year and eventually graduate, the better a school is apt to be at offering the classes and services students need to succeed. This measure has two components: six-year graduation rate (80 percent of the retention score) and freshman retention rate (20 percent). The graduation rate indicates the average proportion of a graduating class who earn a degree in six years or less; we consider freshman classes that started from 1995 through 1998. Freshman retention indicates the average proportion of freshmen entering from 2000 through 2003 who returned the following fall.
Faculty resources (20 percent): Research shows that the more satisfied students are about their contact with professors, the more they will learn and the more likely it is they will graduate. We use six factors from the 2004-05 academic year to assess a school's commitment to instruction. Class size has two components: the proportion of classes with fewer than 20 students (30 percent of the faculty resources score) and the proportion with 50 or more students (10 percent of the score). Faculty salary (35 percent) is the average faculty pay, plus benefits, during the 2003-04 and 2004-05 academic years, adjusted for regional differences in the cost of living (using indexes from the consulting firm Runzheimer International). We also weigh the proportion of professors with the highest degree in their fields (15 percent), the student-faculty ratio (5 percent), and the proportion of faculty who are full time (5 percent).
Student selectivity (15 percent): A school's academic atmosphere is determined in part by the abilities and ambitions of the student body. We therefore factor in test scores of enrollees on the SAT or ACT tests (50 percent of the selectivity score); the proportion of enrolled freshmen who graduated in the top 10 percent of their high school classes for all national universities and liberal arts colleges, and the top 25 percent for institutions in the master's and comprehensive colleges categories (40 percent); and the acceptance rate, or the ratio of students admitted to applicants (10 percent). The data are for the fall 2004 entering class.
Financial resources (10 percent): Generous per-student spending indicates that a college can offer a wide variety of programs and services. U.S. News measures the average spending per student on instruction, research, student services, and related educational expenditures in the 2003 and 2004 fiscal years.
Graduation rate performance (5 percent; only in national universities and liberal arts colleges): This indicator of "added value" shows the effect of the college's programs and policies on the graduation rate of students after controlling for spending and student aptitude. We measure the difference between a school's six-year graduation rate for the class that entered in 1998 and the predicted rate for the class.
Alumni giving rate (5 percent): The average percentage of alumni who gave to their school during 2002-03 and 2003-04.
To arrive at a school's rank, we first calculated the weighted sum of its scores. The final scores were rescaled: The top school in each category was assigned a value of 100, and the other schools' weighted scores were calculated as a proportion of that top score. Final scores for each ranked school were rounded to the nearest whole number and ranked in descending order.
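That scoring procedure is simple enough to sketch in a few lines of code. The sketch below follows the steps US News describes for national universities: a weighted sum of each school's indicator scores, rescaled so the top school scores 100, rounded to the nearest whole number, and sorted in descending order. The weights mirror the weighting listed above, but the school names and indicator scores are invented for illustration, not actual US News data.

```python
# Hedged sketch of the US News scoring procedure: weighted sum of
# normalized indicator scores -> rescale top school to 100 -> round ->
# rank in descending order. School data below are hypothetical.

WEIGHTS = {
    "peer_assessment":       0.25,
    "retention":             0.20,
    "faculty_resources":     0.20,
    "selectivity":           0.15,
    "financial_resources":   0.10,
    "grad_rate_performance": 0.05,
    "alumni_giving":         0.05,
}

# Hypothetical indicator scores, each already normalized to the 0-1 range.
SCHOOLS = {
    "Alpha University": {
        "peer_assessment": 0.98, "retention": 0.96, "faculty_resources": 0.90,
        "selectivity": 0.95, "financial_resources": 0.93,
        "grad_rate_performance": 0.70, "alumni_giving": 0.65,
    },
    "Beta College": {
        "peer_assessment": 0.84, "retention": 0.91, "faculty_resources": 0.88,
        "selectivity": 0.80, "financial_resources": 0.75,
        "grad_rate_performance": 0.85, "alumni_giving": 0.90,
    },
    "Gamma Institute": {
        "peer_assessment": 0.90, "retention": 0.85, "faculty_resources": 0.95,
        "selectivity": 0.88, "financial_resources": 0.99,
        "grad_rate_performance": 0.60, "alumni_giving": 0.40,
    },
}

def rank_schools(schools, weights):
    """Weighted sum, rescale top score to 100, round, sort descending."""
    raw = {name: sum(weights[k] * scores[k] for k in weights)
           for name, scores in schools.items()}
    top = max(raw.values())
    return sorted(((name, round(100 * score / top)) for name, score in raw.items()),
                  key=lambda pair: pair[1], reverse=True)

for name, score in rank_schools(SCHOOLS, WEIGHTS):
    print(name, score)
```

Note what the rescaling step implies: a school's published score depends not only on its own indicators but on whoever happens to sit at the top, which is one reason the final ranked numbers are hard to compare across years.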
Interestingly enough, the only components that even attempt to measure the quality of education one receives are the peer assessment and faculty resources. Peer assessment is an interesting statistic and probably one of the most fruitful that US News uses: it simultaneously measures rigor, prestige, selectivity, and the level of faculty achievement. Let's take a look at what the top group of schools would be based purely on US News' peer assessment score:
1. (4.9) Harvard, Princeton, Yale, Stanford, MIT
2. (4.8) University of California- Berkeley
3. (4.7) Cal Tech, Columbia
4. (4.6) Duke, Cornell, Johns Hopkins, University of Chicago
5. (4.5) UPenn, University of Michigan- Ann Arbor
6. (4.4) Dartmouth, Northwestern, Brown
The most significant change between US News' final rankings and the list above is UC Berkeley, which jumps from twentieth to fifth. Conversely, Washington University in St. Louis and Rice University, which share a peer assessment score of 4.1, would plummet from their respective rankings of eleventh and seventeenth. To a much smaller degree, the University of Chicago rises while UPenn drops.
The reason peer assessment is a fruitful measure is that it ranks the real worth of a college degree. At the end of four years, all a student will have is a sheet of paper and debt. The main purpose of that sheet of paper and that debt is to signal to prospective employers that you received an excellent education; it differentiates a graduate of Yale from a graduate of a community college. Of course, there is often a major disconnect between prospective employers and the academic world. Whereas Cal Tech and the University of Chicago are well-respected universities with some of the best faculties in the country, many employers would assign a higher value to a Brown or UPenn graduate. But herein lies the power of US News: it has the power to change that perception. And who better to rank colleges than those who run them, those whose everyday existence is the management and education of some of the country's brightest students and leaders?
Of course, peer assessment is not without its warts. But the next question to be discussed is what business anyone has ranking UC Berkeley against Cal Tech, and ranking those two against Dartmouth. The differences are so substantial that it is truly like comparing apples to zucchinis. To a certain extent US News accepts this notion by ranking liberal arts colleges on a different scale than national universities. But US News would be better off creating more groupings, perhaps putting public universities on a different scale than private ones, or separating tech schools from schools with a strong emphasis on the liberal arts.
US News might be best off publishing a whole set of rankings. If it published separate rankings for each of its various categories, without attempting to assign any particular weight to them, it would give prospective students the ultimate say. The notion that everyone is looking for the same thing in a college is ridiculous. That is the beauty of a range of rankings: if students want a list of the most selective schools, they can find that out; if the percentage of classes under twenty is their thing, then voilà. This is the beauty of Princeton Review's set of rankings. But those rankings lack any consistency; for example, the University of Chicago was rated the number one college for Undergraduate Academic Experience, and this year it is not in the top twenty (the school year having not started yet, they may know something I don't).
But this discussion presupposes the eminence of US News' rankings. The best college is something different to different people. It is unfortunate that we have not seen a similar set of rankings from a host of other sources. Obviously the money is there; the rankings have made US News a household name in the United States. Law and business schools benefit from a variety of rankings (business schools compete in numerous rankings; here is a side-by-side comparison. Law schools also have a couple of different rankings: Richard Posner analyzes them here, and Brian Leiter ranks them here). There is an important void with plenty of money to be made, and other sources ought to be filling it.
Get on it world.
-Mr. Alec