The 5 most questionable college and university rankings of 2015

By David Lantz | SmithGroup | March 1, 2016

Harvard University in Cambridge, Mass., claimed the number one spot in The World University Rankings. Photo: Todd Van Hoosear/Creative Commons

Since 1983, there have been two major Rites of September for U.S. higher education: the new academic year begins, and U.S. News and World Report releases its much-abused college and university rankings. These rankings are perennially welcomed back with a hazing of well-founded criticism. And yet they return year after year, like a super-duper senior, forever three credits shy of a Stats degree but totally pumped to celebrate homecoming.

So much sharp critical commentary has been lobbed at U.S. News that it’s become fashionable to criticize the criticism. The Atlantic’s John Tierney referred to this annual outpouring of anti-U.S. News sentiment as a “bray-a-thon” just before hopping on his own dismissive donkey. Meanwhile, there’s been a growing competitive pool of new college ranking systems, all vying for equal scrutiny. How can students, parents, politicians and academic leaders possibly weigh all the relative flaws of these rival scoring systems?

The only answer is to provide them with a definitively numbered list. Here are the five most questionable college and university rankings of 2015 that were not compiled by U.S. News and World Report. As my contribution to this year’s rankings bray-a-thon, think of it as five irritable burros trailing after an ornery mule.


Coming in at five is Niche, formerly College Prowler, with its student-based ranking and review system. Part of the fast-growing Rate My Professors genre of college review websites, Niche clearly understands that today’s students value peer reviews far more than institutional surveys. Yet while student reviews are heavily weighted in Niche’s final rankings, many of the ranked institutions have too few student respondents for the results to be statistically meaningful. If you think this fact would prevent Niche from joining the college and university rankings game, you would be wrong.

An entertainingly dismissive College Times piece from last year evaluated 18 college review websites for their value to prospective students, and offered this assessment of Niche/College Prowler’s methodology:

“Niche.com arbitrarily mixes data from government databases, school administrators, and students themselves, without communicating to visitors which data is which. Yes, they literally allow campus marketing directors to login and update school profiles however they see fit.”

Niche doesn’t hesitate to apply its free-ranging methodology to such highly subjective subcategories as the Friendliest College (it’s Brigham Young!). It has also started ranking K-12 schools, which it suggests could one day replace SAT/ACT scores in helping admissions staff rank prospective college students.


Brigham Young University in Provo, Utah, is the nation's friendliest college, according to Niche. Photo: Jaren Wilkey/Wikimedia Commons.



You have to hand it to Forbes for advancing the critical arguments against U.S. News while simultaneously establishing their own annual college and university rankings. It’s like GQ taking on People’s “Sexiest Man Alive” with its more sophisticated “Man of the Year Award.”

A recent post on Forbes’ education blog by Jessica Brondo Davidoff compared the U.S. News rankings to such quaint cultural relics as cassette tapes and Casey Kasem’s Top 40: things that used to be wildly popular but are now nostalgic rather than relevant. According to Forbes, its methodology trumps U.S. News’ because it incorporates contemporary return-on-investment (ROI) statistics:

“This is a new age of return-on-investment education, the very heart of our definitive ranking. Our focus is on just one measurement: outcomes. From low student debt and high graduation rates to student satisfaction and career success, these outstanding institutions are worth it.”

Aside from the many issues with ROI statistics, one of the biggest problems with Forbes’ system is something that typically isn’t viewed as a problem at all: the way it ties college rankings to positional wealth and status. If someone makes more money selling junk bonds than as a commercial loan officer, it isn’t because they got their MBA from UPenn’s Wharton School rather than from a lower-ranked institution. However, as Susan Engel points out in a recent Salon article, “The underlying expectation is that academic performance is always and only a matter of comparison.”


Forbes named Pomona College, in Claremont, Calif., America's top college. Photo: Dave & Margie Hill/Kleerup/Creative Commons.



U.S. News entered the field of international university rankings in 2014, going up against established giants Times Higher Education (THE) and ShanghaiRanking Consultancy’s Academic Ranking of World Universities. Perhaps this is what inspired THE to introduce a spin-off rankings system based solely on the most controversial and discredited part of U.S. News’ rankings: the reputation score. As THE describes it:

“The Times Higher Education World Reputation Rankings 2015 employ the world’s largest invitation-only academic opinion survey to provide the definitive list of the top 100 most powerful global university brands. . . (The) reputation league table is based on nothing more than subjective judgement—but it is the considered expert judgement of senior, published academics—the people best placed to know the most about excellence in our universities.”

While THE’s survey is more extensive than U.S. News’—over 10,500 responses from 142 countries—it still falls prey to the inherent statistical flaws and response biases of reputation scoring. THE’s World Reputation Rankings of the top 100 universities could only get to #50 before the scores became identical clusters—forcing them to resort to alphabetized lists.


The U.S. Department of Education’s proposed college ratings system earned the #2 spot on this list before it was officially abandoned on September 12 after a two-year development process. It remains on the list even though it is being replaced by a new college scorecard system: as a reminder of how close the USDE came to adopting the highly criticized ratings, and of the many pertinent questions still to be asked about the scorecard system.

The idea of tying public funding to a federally administered college ratings system was controversial from the beginning, with the major organizations representing U.S. colleges and universities voicing concern and opposition. While opponents of the ratings generally agreed that holding poorly performing institutions more accountable was an important goal, they thought available federal data for many of the proposed metrics were too incomplete to derive meaningful ratings. Other critics were concerned that the system’s performance metrics would end up punishing schools that serve predominantly low-income and minority students.

The USDE’s newly unveiled scorecard system no longer sets out to rate institutional performance. Instead, it aims to provide students and families with a research tool to sort and compare data for institutions based on considerations such as annual cost, graduation rates, and average salaries after graduation. Many of the scorecard’s new data sets are more comprehensive than what was previously available, especially the income data provided by the IRS. At the same time, critics have been quick to point out that this flood of new data is hardly value-neutral, and could further obscure the extent to which equity issues have a greater impact on personal earnings than where students got their degrees.

It also leaves people with the classic scoring system dilemma. If you’re sifting through the data and two students have identical 4.0 GPAs, how can you possibly know which one is the better student unless you can confidently say “The one who went to Yale”?


Though it costs $237,700 for four years to attend Harvey Mudd College in Claremont, Calif., the school offers a $985,300 ROI, the best figure in the nation, according to PayScale. Photo: Imagine/Wikimedia Commons.



PayScale has boiled the complexity of ranking colleges down to a single measurable: how much money will students earn relative to what they paid for their degrees? In PayScale’s rating system, college is a transactional investment, same as a stock or mutual fund purchase, and the way you measure success is return on investment after graduation.

If the idea of absolute ROI for an institutional degree independent of student aptitude, performance or aspiration sounds too simplistic, it’s because it is. In addition to all the ROI calculation errors pointed out by economists from P3 and Arizona State University, PayScale lets graduates self-report their income. The rankings also make students who got an education degree from a regional comprehensive and went on to become high school teachers appear as if they got shafted because they didn’t pay more to go to MIT.

PayScale’s College ROI rankings will eventually supplant U.S. News and World Report in the critical scrutiny they receive—especially since commentators like William Bennett use them to argue that only 150 universities in the country are worth their price tag. In the meantime, major media outlets will trumpet their release, analysts will continue to say that it’s better than no ROI standard at all, and the science and engineering-oriented West Coast Ivy Harvey Mudd will continue to be ranked at or near the top.

This year marks the 20th anniversary of Reed College’s decision to stop submitting survey information for the U.S. News rankings. The thoughtful explanation on Reed’s website and the articulate editorial that former President Colin Diver penned for The Atlantic in 2005 are outstanding critiques of the college rankings mindset and methodology. They also reaffirm the core values underlying higher education, and the true value that individual schools can provide to their students. It is an understanding of value that allows us to view questionable college ranking systems as unnecessary rather than inevitable. As Diver concluded:

“Before I came to Reed, I thought I understood two things about college rankings: that they were terrible, and that they were irresistible. I have since learned that I was wrong about one of them.”

