Remarks delivered to the Annapolis Group June 20, 2007
Douglas C. Bennett, President, Earlham College
Questions about the U.S. News and World Report rankings of colleges and universities and about our relationship to those rankings are a sideshow to the serious issues that should steadily concern us about higher education today. The most serious issues are access and quality and the relationship between those two: how do we provide access to post-secondary education for all Americans, and how do we assure that this education is of high quality? The rankings are worth our attention only insofar as they bear on these questions, and they do bear on these questions in many ways, distracting us from the real, hard work needed to improve access and distorting understandings of quality. Let us all keep access and quality in our minds as we discuss the rankings.
|The Perspective of Institutional Researchers|
A good place to begin is with how the rankings and our active participation in them appear to those on our campuses who perform institutional research for our colleges. We all have very competent, very professional institutional researchers, but I’m not sure we’ve listened as attentively as we should to what they have to say on the matter. Most of us are members of HEDS, the Higher Education Data Sharing Project, and the HEDS listserv is one of the most important sites for conversations among institutional researchers. (I’m a participant on that listserv.) Periodically they discuss the U.S. News and World Report rankings, and they have nothing but anger at and professional scorn for these rankings.
Their postings catalogue, in extraordinary detail, problems with the U.S. News rankings that undermine nearly every aspect of the magazine’s methodology.
Rather than detail their methodological concerns (because that would take hours), let me just quote some of their summary judgments. I won’t quote anyone by name since I didn’t ask them for permission, but by and large these quotations are from institutional researchers whose Presidents are members of the Annapolis Group, some of whom are sitting in this room.
If you ask your institutional research person to tell you candidly whether the rankings have any professional integrity about them, they will tell you no. I doubt you can find even one institutional researcher across all of our colleges and universities who believes, as a professional judgment, that the rankings satisfy a minimum threshold of acceptable research practice.
If you ask your institutional research person to tell you candidly whether your college should actively participate in the rankings, almost all of them would tell you no. They hate the rankings—not because their college doesn’t rank well but because the rankings enterprise is beneath a level of minimum professional integrity.
And yet, U.S. News and World Report continues to claim that they are producing the best judgments about college quality available today. They claim there is professional support for what they do. How can they? In the main, because we Presidents and we Deans fill out the annual reputation survey that asks us to rate the ‘quality’ of 220 baccalaureate institutions. Our participation is a telling token of our professional support.
|Higher Education Research|
Look at this in another light: what has academic research on higher education had to say about the U.S. News and World Report rankings, or about the general approach to assessing ‘quality’ that the magazine takes? These researchers refer to it as a "resources/reputation" approach, since it focuses on the resources a college has at its disposal and the reputation it enjoys.
Since the U.S. News and World Report rankings began, there have been a number of studies addressing whether the rankings have any validity as a measure of educational quality. Each and every one of the academic studies shows not even the palest correlation between various measures of educational quality or student learning on the one hand, and the U.S. News and World Report rankings on the other.
I doubt I am telling you anything new, but we are not yet acting as if we have fully digested these judgments of institutional researchers and higher education scholars. What we have to talk about is what we do about the failure of the U.S. News and World Report rankings to meet minimum standards of acceptable research.
|The Canadian Initiative|
Let me add a few more voices to the mix. Recently, 25 of Canada’s leading higher education institutions decided to opt out of participation in annual rankings produced by Maclean’s, a Canadian equivalent of U.S. News and World Report. We should listen to them, too.
A year ago, David Naylor, the President of the University of Toronto, wrote: "As academics, we devote our careers to ensuring people make important decisions on the basis of good data, analyzed with discipline. But Canadian universities have been complicit, en masse, in supporting a ranking system that has little scientific merit because it reduces everything to a meaningless, average score" (Naylor, 2006).
Last month, Indira Samarasekera, the President of the University of Alberta, wrote that "Canadian universities are listening with great interest as the call to boycott U.S. News and World Report rankings continues to increase in volume among our colleagues to the south," and noted that "it is time to question these third party rankings that are actually market driven, designed to sell particular issues of a publication" (Samarasekera, 2007). These Canadian university Presidents have said no to further active participation in the creation of rankings. And they’ve said no on the basis of affirming their professional responsibilities as educators and scholars.
|What Should We Do?|
This isn’t about what U.S. News and World Report does. This is about what we do. In the United States (and increasingly around the world) the media rank everything: golf courses, retirement communities, beaches, blue jeans, beer. I don’t doubt that, whatever we do, U.S. News and World Report will go on ranking colleges and universities. U.S. News is in the magazine business, and rankings of all kinds sell magazines. We’re in the business of education and research, and our behavior should comport with recognized professional obligations of educators and researchers.
That is, the question is whether we, as education and research professionals, will actively cooperate with U.S. News and World Report (and other publications that rank colleges and universities), thus lending our professional weight and credibility to the exercise.
The process of admissions to college should itself be an educational process in the best and widest sense. Our behavior in this process, as professionals, should be accountable to the professional norms of educators and scholars: at a minimum, integrity and transparency in the use of data, conscientious attention to issues of validity and reliability, refusal to simplify if that simplification distorts in important ways.
These are the values of the Education Conservancy, an organization created recently and committed to improving college admission processes for students, colleges and high schools. A dozen of us, now grown to about three dozen, have signed a letter making two simple commitments:
The media have described this initiative as a boycott, but that’s not really accurate. In addition to asking us to participate in the reputational survey, U.S. News and World Report asks each college to answer 600 questions. Of those, 424 are data elements that are in IPEDS or the Common Data Set. All of us already make all of these data elements publicly available. Many of us make available to the public high quality data that answer many of the remaining 176 questions. We are providing a great deal of high quality, useful data. U.S. News is free to make use of these data. So is Princeton Review, or Washington Monthly, or the U.S. Secretary of Education, or (most importantly) prospective students and their parents, teachers and counselors.
The problem we need to recognize is that we (colleges and universities) are not making these data available in an easy-to-find, user-friendly manner. Once we solve that problem, U.S. News and World Report will no longer be the best answer. Broadly speaking, there is a further commitment we should make.
We should feel an obligation to make public a variety of kinds of high quality data about the characteristics and functioning of our institutions. We should commit ourselves to making these easily available: easy to find and easy to use.
Several higher education organizations are already developing templates for making such data available: NASULGC and AASCU among the public universities, NAICU among the independent colleges and universities, and others, too. We should cooperate with these efforts, but we should work together as liberal arts colleges to find a format that is especially well-suited to our missions. We will likely want to include data elements that other kinds of colleges and universities will not.
Let me close by briefly (too briefly) enumerating some of the principles that our approach should follow.
Refusing to assist in the production of rankings serves the cause of access. We need every person in this country to pursue post-secondary education. The more we focus on trying to identify the "few, best" institutions, the more we miscommunicate about college admissions. Institutions have different missions. Different students will thrive at different colleges.
The idea of "one, best" college is nonsense, and that’s the notion that rankings promote. And insisting on including information about student learning serves the cause of quality. As education professionals, we need to focus prospective students on finding an institution where they are likely to learn. We need to provide them information that helps them make that choice, not focus their attention on who is most prestigious.
Several years ago, Reed College took a courageous, solitary stand on the U.S. News rankings (for its reasons, see Diver, 2005). Since then, many of us have ceased filling out the reputational survey. Now it is time for all of us to cease providing active assistance to the U.S. News rankings, recognizing their weaknesses, and instead focus our energies on the provision of relevant information about our colleges and universities in an easy-to-find, easy-to-use manner.
|Diver, Colin (2005). "Is There Life After Rankings?" The Atlantic Monthly, November 2005, pp 136-39.|
|HEDS Listserv, various postings, 2005-07.|
|Kuh, George D. and Ernest T. Pascarella (2004). "What Does Institutional Selectivity Tell Us About Educational Quality?" Change, 36(5), pp 52-5.|
|Naylor, David (2006). "Measuring Up: What University Rankings Do and Don’t Tell Us." Opinion piece, Ottawa Citizen, April 23, 2006.|
|NORC (1997). "A Review of the Methodology for the U.S. News & World Report's Rankings of Undergraduate Colleges and Universities." Posted on the Washington Monthly website in 2000.|
|Pascarella, Ernest T. (2001). "Identifying Excellence in Undergraduate Education: Are We Even Close?" Change, May/June 2001, pp 19-23.|
|Pike, Gary R. (2003). "Measuring Quality: A Comparison of U.S. News Rankings and NSSE Benchmarks." Paper presented at the annual meeting of the Association for Institutional Research, May 2003.|
|Samarasekera, Indira (2007). "Rising Up Against Rankings." Inside Higher Ed, April 2, 2007.|