How to Make Sense of College Rankings

Willard Dix is one of the crankiest observers of the college admissions process I know; he’s also one of the smartest. He worked at Amherst, his alma mater, then advised college-bound students at a private secondary school in Chicago. He now blogs about higher education.

I asked him on the phone the other day about the dizzying proliferation of college rankings beyond those by U.S. News & World Report, each using its own methodology and emphasizing different metrics. If a tone of voice can approximate an eye roll, his did.

“You can slice and dice it any way you like, but this isn’t like Consumer Reports, which tests something to see if it does or doesn’t work,” he said. “The interaction between a student and an institution is not the same as the interaction between a student and a refrigerator.”

I can’t improve on that quip. But I can explain it in terms of what rankings do and don’t reveal and how high school seniors, who are right now in the thick of figuring out where they want to apply, should approach them.

There are now dozens of rankings, reflecting both the way we’ve come to fetishize data and the anxiety that so many Americans rightly feel about wringing the most from an increasingly costly investment. Just last month came a new one from The Wall Street Journal and Times Higher Education (which is unrelated to The New York Times).

It joined a jammed field of players, including The Economist, Forbes, and, yes, this newspaper, whose College Access Index looks narrowly at which of the country’s top schools seem to be the most socioeconomically diverse.

Inasmuch as all of these rankings rely on, and compile, objective information about the schools they examine, they’re useful. But all of them also make subjective value judgments about what’s most important in higher education, and those judgments may or may not dovetail with a student’s interests. It’s crucial to look at precisely what’s being measured — which is easy to do, if you read the fine print.

Some rankings assign more weight than others do to the selectiveness of a school and the academic background of its incoming students, on the theory that a high-achieving peer group matters.

Some don’t really try that hard to get at the question of how satisfied a school’s students are. Others do, but take varying routes to the answer. Some look in meaningful ways at diversity, which can greatly influence campus life and classroom discussions and says something about administrators’ priorities. Others don’t.
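To make that concrete, here is a minimal sketch in Python of how the same three institutions can trade places depending on the weights a ranker chooses. Every school name, score and weight below is invented for illustration; nothing reflects any real ranking's formula.

```python
# Toy illustration: the same schools under two different (hypothetical) weightings.
# All names and numbers are invented for demonstration.

schools = {
    # metric scores on a 0-100 scale
    "College A": {"selectivity": 95, "satisfaction": 70, "diversity": 55},
    "College B": {"selectivity": 75, "satisfaction": 90, "diversity": 85},
    "College C": {"selectivity": 85, "satisfaction": 80, "diversity": 70},
}

def rank(weights):
    """Return schools sorted by a weighted composite score, best first."""
    scored = {
        name: sum(weights[metric] * value for metric, value in metrics.items())
        for name, metrics in schools.items()
    }
    return sorted(scored, key=scored.get, reverse=True)

# A ranker that prizes selectivity...
print(rank({"selectivity": 0.6, "satisfaction": 0.3, "diversity": 0.1}))
# ...versus one that prizes satisfaction and diversity.
print(rank({"selectivity": 0.2, "satisfaction": 0.4, "diversity": 0.4}))
```

Under the first weighting, the hypothetical "College A" comes out on top; under the second, it finishes last. The data never changed — only the value judgments did.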

Over the last few years, there has been a movement toward ranking colleges in terms of how much money their graduates go on to make — something that U.S. News has never directly factored in but that The Wall Street Journal, The Economist, Forbes and Money Magazine, among others, do. My Times colleague James Stewart recently examined this development.

But here, too, there are necessary caveats. Graduates’ incomes probably have more to do with dynamics that precede college — their parents’ wealth, their childhood opportunities, their innate gifts — than with the particular seasoning of a given institution, and not all salary-oriented rankings pay careful attention to this.

The economist Jonathan Rothwell found a way to reward colleges whose graduates achieved more than their backgrounds might have predicted, with a set of “value-added” rankings that he produced for the Brookings Institution early last year. His inaugural list differed markedly from U.S. News’s, with Colgate University, Washington and Lee University, Clarkson University and Manhattan College appearing in the Top 10, above any Ivy League school. He later tweaked and adapted this list for a column in The Times by Stewart last October.
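Rothwell's actual model controls for far more, but the core move — predict graduates' earnings from their backgrounds, then credit schools for the gap between actual and predicted — can be sketched in a few lines. Every name and number below is hypothetical, and the single-variable regression is a deliberate simplification of his approach.

```python
import numpy as np

# Hypothetical data: each school's median parent income and median graduate
# earnings, in thousands of dollars. Invented numbers, for illustration only.
parent_income = np.array([60, 90, 120, 150, 200])
grad_earnings = np.array([55, 62, 64, 78, 88])
names = ["School V", "School W", "School X", "School Y", "School Z"]

# Fit a simple line: expected earnings as a function of background.
slope, intercept = np.polyfit(parent_income, grad_earnings, 1)
predicted = slope * parent_income + intercept

# "Value added" = how far each school's graduates outperform that prediction.
value_added = grad_earnings - predicted
for name, va in sorted(zip(names, value_added), key=lambda pair: -pair[1]):
    print(f"{name}: {va:+.1f}")
```

On this toy scale, the school whose graduates earn the most ("School Z") is not the one adding the most value ("School Y") — which is precisely how a value-added list can diverge so sharply from a raw-outcomes one.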

But there are also problems with these income-oriented approaches (beyond their implication that money equals contentment and success). One of the two principal sources for income figures is PayScale, a company that collects salary information. It relies on self-reported numbers from people who use its database, and is by no means a comprehensive, definitive survey.

The other source is the federal government’s College Scorecard, but its figures are only for people who received federal aid and reflect what they’re earning in the earliest years of their careers. Schools whose students move quickly into professions with high starting salaries fare better by this yardstick than do schools whose students choose careers that tend to develop slowly.
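A toy calculation shows why that matters. Assume, hypothetically, that one school's graduates start at high salaries that grow slowly, while another's start lower but climb fast. Measured a couple of years out, the first school looks better; by year ten, the order has reversed. Both trajectories below are invented.

```python
# Hypothetical salary trajectories: (starting salary, annual growth rate).
# Numbers are invented to illustrate the early-career measurement bias.
schools = {
    "Quick-Start U": (70_000, 0.02),      # high starting pay, slow growth
    "Slow-Burn College": (48_000, 0.09),  # lower start, faster growth
}

def salary(start, growth, year):
    """Salary after `year` years of compound growth."""
    return start * (1 + growth) ** year

for year in (2, 10):
    snapshot = {name: salary(s, g, year) for name, (s, g) in schools.items()}
    leader = max(snapshot, key=snapshot.get)
    details = ", ".join(f"{n} ${v:,.0f}" for n, v in snapshot.items())
    print(f"Year {year:2d}: {details}  -> leader: {leader}")
```

A ranking built on the early snapshot rewards the quick start and never sees the crossover.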

My larger point is this: For almost every well-intentioned measurement, there's either a fundamental shortcoming or a possible glitch.

Source: How to Make Sense of College Rankings - The New York Times
