Peter Nelson knew he wanted to attend college somewhere near the West Coast, so he wouldn’t be too far from his hometown in Hawaii.
Scouring the internet for information, Nelson referenced popular ranking systems to create a short list of universities to apply to. But he was wary of a fact that many students overlook: Rankings don’t tell the whole story about any school, and they’re often based on arbitrary, biased or unreliable factors.
“If people rely on the numbers, there’s more of a chance to get misled,” said Nelson, who’s now a senior at Gonzaga University, a school he chose for its Jesuit roots.
Research suggests these numbers have a significant impact on where students apply to college, regardless of whether they fully understand what each ranking system measures. Money magazine, for example, considers how much students make during their first years after graduation. U.S. News factors in a school’s reputation.
“I think we live in a culture that is very rankings-based,” said Julie McCulloh, Gonzaga’s dean of admission. “We use Consumer Reports when we want to buy cars or major appliances or whatever.”
But a college education is a bigger and likely more expensive investment than a car or a vacuum cleaner.
Educators have mixed feelings about college rankings. While some schools – Gonzaga included – are quick to include high scores in brochures and news releases, administrators admit the systems don’t account for many unique and less tangible factors.
“Each family, each student has a unique set of needs, and the needs of the student might be better met by a school that’s not in a top ranking,” McCulloh said.
McCulloh said Gonzaga is similar in many ways to Loyola Marymount University in Los Angeles, but often falls behind Loyola in rankings that rely heavily on acceptance rates. Loyola is about the same size as Gonzaga but located in a more densely populated city, so naturally it turns away more applicants, she said.
“You can’t really pull the story of the school from their acceptance rate,” she said.
On a similar note, Dan Bernardo, the provost at Washington State University, said ranking systems tend to disadvantage large public institutions that aim for access and affordability.
“There’s always a lot of debate about their value and their credibility, because they do try to synthesize a lot of information into essentially a single number,” Bernardo said. “They tend to tilt toward the private schools because essentially they’re using private school criteria. I often tell people around here: We’re not Harvard, and we’re not trying to measure ourselves using the same criteria as Harvard.”
WSU is, however, in the midst of its “Drive to 25” campaign, which aims to secure a spot among the nation’s top 25 research universities. Bernardo said the school is using more “objective” criteria than the ones often used by magazines, such as research expenditures and the number of doctoral degrees granted.
U.S. News, the most prominent ranking system, bases 22.5 percent of its formula on colleges’ reputations, in part by asking administrators at other schools for their opinions.
In a report last year, NPR explained a glaring flaw in that methodology: “How good would you say that Princeton’s undergraduate business program is? That’s a trick question: There is no such program. Yet when other college presidents were asked this question, they gave the nonexistent program top marks. That’s known as the ‘halo effect.’ ”
McCulloh said she’s asked to rate other schools in an annual survey and routinely ticks the box labeled “Do not know.”
“It’s a heavy weight to say this school is excellent or mediocre or whatever,” she said. “How can I know? I don’t have the ability to tell you how good these schools are.”
Bernardo also took issue with rankings that rely on some measures of student performance, such as test scores.
“We don’t really subscribe to the theory that the SAT is a good measure of a student’s quality,” he said. “Our analysis shows that high school grade-point average is a much better predictor of success.”
In a study that’s slated for publication in the Journal of Economic Behavior and Organization, Daniel Hickman, an economics professor at the University of Idaho, scoured 10 years’ worth of data on college applications and compared it to the U.S. News rankings.
Hickman and his collaborators at Marquette University found that applications at any given school dropped by up to 6 percent after the school fell out of U.S. News’ top 50 rankings. Fifty happens to be the number of schools listed on the first page of the magazine’s print editions.
“It makes a big difference when it crosses that threshold,” Hickman said. “People perceive it differently.”
Consumers might respond similarly, he said, when deciding whether to buy a used car that’s just over the 100,000-mile mark. It’s the same reason a grocery store item might sell for $11.99 rather than $12.
“We always talk about people being well-informed and making rational decisions,” Hickman said. But, “people have to find ways to make decisions with limited information.”
Greg Orwig, Whitworth University’s vice president for admission and financial aid, said he urges students to use college rankings to narrow their options, not as a tool to make a final decision. He said students should instead turn to more customizable resources, such as college search tools that let them weight the factors that matter to them.
One such option is the U.S. Department of Education’s College Scorecard website.
Neil Woolf, associate vice president for enrollment management at Eastern Washington University, agreed that rankings should be used only to “start a discussion.”
“As long as you know what the ranking is measuring, it’s a great thing,” he said. “More information is good.”