Playing With Numbers

By Nicholas Thompson

How U.S. News mismeasures higher education and what we can do about it

MY FRIEND ROB TOOK AS FEW CLASSES as possible at Stanford. He had top-notch SAT scores and high-school grades, and he was smart enough to graduate even if he was less at ease in the library than party-hopping in a caveman suit. We took one class together and were assigned Gulliver's Travels--a book I had read before and loved. Our prestigious professor, however, drained the life out of it by lecturing in a monotone about his pet theory that Gulliver's voyage was a metaphor for birth. Good students sat with mouths agape; bad students slept. When we broke into small discussion sections, everyone was so stultified that the understudies running the classes simply rehashed the original monologue. To Rob, it was all "cool"; he was on his way to a degree.

Rob may not have been a stereotypically great student. But he was outstanding from the viewpoint of the U.S. News and World Report college rankings, the most important arbiter of status in higher education. He went to a top-rated school, and he didn't hurt its score because the U.S. News rankings don't measure how much students learn; they don't measure whether students spend their evenings talking about Jonathan Swift or playing beer pong; and they don't measure whether students, like Rob, are just there to get through.

A single magazine's idiosyncratic ranking system may seem peripheral to the larger issues of higher education, but this particular one matters a lot. The U.S. News rankings are read by alumni, administrators, trustees, applicants, and almost everyone interested in higher education. The New York Times aptly described them as "a huge annual event," and they dominate what is far and away the best-selling college guide available. Consequently, the rankings have a kind of Heisenberg effect, changing the very things they measure and, in certain ways, changing the entire shape of higher education.

The problem isn't that the rankings put schools in the wrong order: a better ranking system might put Stanford 1st; it might put it 35th. I can't presume to know where it, or any other school, would rank. What I do know, however, is that a better ranking system, combined with more substantive reporting, would push Stanford to become an even better school--a place where students like Rob would have to focus more on learning than sliding by, and a place with fewer teachers putting their students to sleep. Unfortunately, the U.S. News rankings instead push schools to improve in tangential ways and fuel the increasingly prominent view that colleges are merely places in which to earn credentials.

Rank Behavior

The first U.S. News rankings appeared in 1983. The magazine grouped colleges into categories like "national universities" and "regional liberal arts colleges" and sent a survey asking for the opinions of university presidents on the five best schools in their category. There was nothing scientific or subtle about the survey, and most people just shrugged it off. Donald Kennedy, president of then-first-ranked Stanford, said, "It's a beauty contest, not a serious analysis of quality."

That issue still sold remarkably well, and in 1985 and 1987 U.S. News, under new owner Mortimer Zuckerman, again published rankings based solely on university presidents' perceptions. Then in 1988, U.S. News decided to take the rankings more seriously and to try to develop a franchise much like People's "50 Most Beautiful People" or the "Forbes 400." So Zuckerman placed Mel Elfin, an influential Washington journalist recently lured away from Newsweek, in charge of developing a more respectable system. Elfin found his sidekick a year later in Robert Morse, an intelligent, soft-spoken man who, if he were an actor cast as an introverted accountant, would be criticized for overplaying his role. The team got to work: Morse crunched the numbers while Elfin packaged the rankings with stories on higher education, creating the institution christened "America's Best Colleges."

Morse, Elfin, and, later, Managing Editor Alvin Sanoff rapidly created a franchise: Every September since 1988, the magazine has produced an eagerly anticipated list that precisely orders every college in the country. According to last September's rankings, for example, St. Mary's College of California is the eighth best western regional college, just slightly better than Mt. St. Mary's College, California, but well ahead of Our Lady of the Lake University in Texas--a school ranked down in the second of the three tiers into which the magazine groups institutions once the top 50 schools in a category have been nailed down. "America's Best Colleges" sells about 40 percent more copies than U.S. News' standard weekly issues, and the magazine also produces a hot-selling accompanying book. Last year, eight million people visited U.S. News' Website when it posted the rankings.

The rankings are opaque enough that no one outside the magazine can figure out exactly how they work, yet clear enough to imply legitimacy. For the past 12 years the main ranking categories have remained fairly constant: student selectivity, academic reputation in the eyes of other university presidents and admissions deans, student retention and graduation rates, faculty quality rated by pay and PhDs, financial resources, and alumni giving. A category introduced in 1996 measures a university's "value added," assessed by the difference between actual and expected graduation rates (if you let in highly qualified students, your expected graduation rate is high). In short, the perfect school is rich, hard to get into, harder to flunk out of, and has an impressive name.

Beyond the rough guidelines, each category is then broken down further. Under the rules of the 2000 survey, "student selectivity" is based on some unexplained combination of the SAT scores of the 25th and 75th percentiles of the entering freshman class, their class ranks, the percentage of applicants accepted, and "yield," the percentage of admitted applicants who enroll. These numbers are hard to parse, but it's difficult to accuse U.S. News' ranking system of being a simple beauty contest; it's now a complicated beauty contest, and its scientific air contributes greatly to the attention people pay it. As Groucho Marx said: "Integrity is everything. If you can fake integrity, you've got it made."

Morse Code

Of course, there is no one definitive way to judge colleges, and U.S. News does consistently encourage students to take the rankings with a grain of salt. Caltech, for example, was a surprise 1999 choice as the top-rated national university, but even its name suggests that it caters to technophiles, not poets, marines, or aspiring history professors. It also didn't seem like a great place to the seven African-Americans accepted last year: none of them chose to attend. U.S. News understands this...
