Who Responds to U.S. News & World Report's Law School Rankings?

Journal of Empirical Legal Studies, Volume 12, Issue 3, 421–480, September 2015
DOI: http://doi.org/10.1111/jels.12078
Published: 01 September 2015
Authors: Jeffrey Evans Stake, Michael Alexeev

Jeffrey Evans Stake and Michael Alexeev*
U.S. News & World Report (USN&WR) publishes annual rankings of ABA-approved law
schools. The popularity of these rankings raises the question of whether they influence the
behavior of law teachers, lawyers and judges, law school applicants, employers, or law
school administrators. This study explores some indicia of USN&WR influence. Using data
purchased from USN&WR, we attempt to determine whether USN&WR might have
influenced (1) law faculty members who respond to the USN&WR survey of law school
quality, (2) lawyers who respond to USN&WR surveys, (3) law school applicants choosing a
school, (4) employers who hire law school graduates, and (5) administrators who set
tuition. We find significant effects on the first three groups, particularly with respect to lower-ranked schools. That is, there may be “echo effects” of USN&WR rankings that are
folded back into subsequent rankings and tend to stabilize them. We also find that rankings
may exert some influence on tuition at law schools outside the top 40. We do not find
evidence that employers hiring law graduates respond to changes in USN&WR rankings,
either in median salaries paid or in employment percentages reported by law schools.
I. Introduction
Every spring since 1990, U.S. News and World Report (USN&WR) has published a ranking
of law schools in the United States. In the early years, the magazine listed the ranks of
only its top 25, but in 1994 that list of ranked schools grew to 50, in 2003 to 100,¹ and in 2011 to 143. Although there are a number of other law school rankings, including
those created by Brian Leiter (2014) and those custom made by visitors to the Ranking
Game website (Stake 1998), the USN&WR rankings seem to attract far more attention
than all of the others combined. Copies of USN&WR are often seen in the hands of
applicants at on-campus law school recruiting fairs, suggesting that rankings may be as important to them as rankings are to parents of applicants to undergraduate institutions (see Machung 1998). Seeing some of the misleading impressions and counterproductive incentives created by the USN&WR rankings (Stake 2006a, 2006b; Ehrenberg 2000; Whitman 2002), and perhaps concerned about the potentially powerful effects of quantification (Porter 1995), law school administrators have criticized USN&WR for publishing the rankings and have attempted to reduce their impact (LSAC 2005). Given the pressure to stop publishing the rankings (Bahls 2003:20), USN&WR’s continuation might reasonably be attributed to the revenue from those rankings. What was once the “swimsuit issue” for a news magazine has become the public face of USN&WR.

*Address correspondence to Jeffrey Evans Stake, Robert A. Lucas Chair in Law, Indiana University Maurer School of Law, 211 S. Indiana Ave., Bloomington, IN 47405; email: Stake@Indiana.edu. Alexeev is Professor of Economics at Indiana University.

Professor Stake thanks the Maurer School of Law for research support, and especially former Acting Dean Buxbaum for her willingness to purchase the data upon which we rely. We also thank the anonymous reviewers for their helpful comments, which have improved the article.

¹In addition to the specific ranks, in 1992 USN&WR started dividing the law schools into groups of nearly equal size, once erroneously called “quartiles” and now called “tiers.” (Generally speaking, quartiles are defined as the values that divide an ordered list of scores into four equal quarters, in other words, the 25th, 50th, and 75th percentiles. The tiers of schools clearly cannot be “quartiles” if they are not equal in number.)

It could be argued that, like the original “swimsuit issue,” the USN&WR rankings
are merely a form of entertainment, and hence there is not much point in understanding them. That position raises the threshold question of whether there are any actors
who make important decisions based on USN&WR rankings. Accordingly, one goal of
this study is to look at the impact of the rankings on decisions of students choosing
where to attend law school, administrators in determining how much to charge for a
law school education, and members of the bar who might be in the position of selecting
law graduates to hire. We show that at least some of these actors, especially the applicants, change their behavior in accordance with USN&WR rankings and, therefore, the
rankings cannot be dismissed as mere entertainment. That answer, that some actors do
respond, raises the question whether the quality of information incorporated into
USN&WR rankings justifies the reliance by those actors, again especially the applicants.
Accordingly, we examine the data to see whether the lawyer and academic reputations and other information gathered by USN&WR are themselves polluted by its rankings. We find that the USN&WR rankings have what Stake (2006a) called an “echo effect”: USN&WR shouts its ranking every spring and then hears echoes of that ranking when it gathers data for the following year’s publication.
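To make that feedback loop concrete, the following stylized simulation (ours; it is not part of the USN&WR methodology or of our empirical analysis, and every parameter value is arbitrary) illustrates how an echo can stabilize published scores: when survey respondents blend their own noisy impressions of school quality with last year’s published score, consecutive years’ scores become more highly correlated than the noisy impressions alone would make them.

    # Stylized echo-effect sketch: respondents mix a noisy quality signal with
    # last year's published score; the mix feeds the next published score.
    import random
    from statistics import correlation  # Python 3.10+

    random.seed(1)
    N_SCHOOLS, N_YEARS = 50, 10

    def year_to_year_corr(echo_weight):
        """Average correlation between consecutive years' published scores."""
        quality = [random.gauss(0, 1) for _ in range(N_SCHOOLS)]
        published = quality[:]          # first-year scores reflect quality only
        corrs = []
        for _ in range(N_YEARS - 1):
            signal = [q + random.gauss(0, 0.5) for q in quality]  # noisy survey
            new = [(1 - echo_weight) * s + echo_weight * p
                   for s, p in zip(signal, published)]
            corrs.append(correlation(new, published))
            published = new
        return sum(corrs) / len(corrs)

    print("no echo:    ", round(year_to_year_corr(0.0), 3))
    print("strong echo:", round(year_to_year_corr(0.6), 3))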
A. Who Is Listening?
The first goal of this article is to identify some of the actors that appear to take
USN&WR rankings into account in making decisions. To do so, we study a number of
numbers. What we call a “factor” is any number that goes into the USN&WR formula for ranking law schools.² We attempt to determine whether some of these factors include echoes of earlier rankings. In other words, a factor that echoes is one that is influenced by the USN&WR ranking. If the undergraduate GPA (UGPA) median for a law school is a factor that is affected by the ranking of a school, the UGPA is echoic. A factor that does not echo is one that is not influenced by the USN&WR ranking; it is independent. Since employment rate is not influenced by the USN&WR ranking,³ employment is independent. Whether a factor echoes or not depends only on whether the actors determining that factor respond in some way to the USN&WR rankings and, as a consequence, change these rankings.

²The factors in the USN&WR rankings are reputation among law faculty members (25 percent), reputation among judges and lawyers (15 percent), median LSAT of entering class (12.5 percent), median undergraduate GPA of entering class (10 percent), acceptance ratio (2.5 percent), employment percentage at graduation (4 percent), employment percentage about nine or ten months after graduation (14 percent), ratio of bar passage percentage in primary state to bar passage percentage for all test takers in that state (2 percent), student/faculty ratio (3 percent), educational expenditures per student (9.75 percent), noneducational expenditures per student (1.5 percent), and volumes and titles in the library (0.75 percent). The data set purchased from USN&WR did not include the last three factors. The “ABA Take-Offs” sent to law schools include data similar to those used by USN&WR on those three factors, but those data are confidential. For analysis based on those data and a more detailed explanation of the method used by USN&WR, see Seto (2007). Note, however, that the method has changed somewhat since then.
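To show the arithmetic implied by the weights listed in note 2, here is a minimal sketch (ours, with hypothetical variable names; USN&WR’s standardization and rescaling steps are not reproduced) of how standardized factor values could be combined into a single weighted score:

    # Illustrative weighted composite using the published USN&WR weights from
    # note 2. Inputs are assumed to be standardized (z-scored) factor values,
    # with the sign already flipped for factors where lower is better (e.g.,
    # acceptance ratio, student/faculty ratio). Not USN&WR's actual code.
    WEIGHTS = {
        "peer_reputation": 0.25,
        "lawyer_judge_reputation": 0.15,
        "median_lsat": 0.125,
        "median_ugpa": 0.10,
        "acceptance_ratio": 0.025,
        "employment_at_graduation": 0.04,
        "employment_nine_months": 0.14,
        "bar_passage_ratio": 0.02,
        "student_faculty_ratio": 0.03,
        "educational_expenditures_per_student": 0.0975,
        "noneducational_expenditures_per_student": 0.015,
        "library_volumes_and_titles": 0.0075,
    }
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 100 percent

    def weighted_score(z_scores):
        """Weighted sum of standardized factor values for one school."""
        return sum(WEIGHTS[name] * z_scores[name] for name in WEIGHTS)

Ordering schools by such a composite would then yield the published ranks.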
Not all the actors we care about produce numbers that are included by USN&WR
in the ranking formula. Numbers that do not appear directly in the USN&WR ranking
formula are nonfactors. For example, tuition, starting salaries, and male-female ratio are
numbers that do not play a direct role in the annual USN&WR rankings. Some of these
nonfactors, such as tuition, are influenced by the USN&WR rankings. Other nonfactors,
such as male-female ratio of the student body,⁴ are not influenced by USN&WR rankings. By our definition, the nonfactors cannot echo, at least not directly. But echoing factors are not our only concern. We want to know whether USN&WR influences a number of decisionmakers, regardless of whether their decisions directly influence USN&WR.
Another goal of this article is to shed light on one consideration bearing on whether the rankings published by USN&WR deserve the attention of the actors that appear to respond to them. It is in this connection that we discuss the “echo effect.”
B. Reasons Echoes Matter
As do many summary indicators, USN&WR rankings make it convenient for economic
agents to access information about law schools. The largest set of consumers for this
product is made up of prospective students, the applicants to law school. They usually
anticipate investing a large sum of money along with years of effort in a legal education.
Very few of them have a personal source of information about product quality for this
investment. And the very fact that USN&WR combines many input factors in its ranking shows that USN&WR also lacks access to any direct indicator of the quality its buyers would like to learn about; it combines a number of factors in an attempt to reach a summary number that correlates with the quality applicants would like to know.
The price paid for this convenience is that the summary numbers might mislead
these agents. For example, students following a USN&WR ranking might choose to
attend school X when school Y would have been a better choice for them. These
³When we regress employment at graduation on lagged USN&WR score and LSAT lagged four years, the coefficient on lagged overall score is not statistically significant (but the coefficient on LSAT is).

⁴When we regress male-female ratio on USN&WR score lagged one year, the coefficient of the USN&WR score is not statistically significant.
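A minimal sketch of the kind of regressions described in notes 3 and 4 follows; the data file and column names (school, year, emp_at_grad, usnwr_score, median_lsat, mf_ratio) are hypothetical, and this is our illustration rather than the estimation code used for the article:

    # Hypothetical sketch of the regressions in notes 3 and 4 on a school-year panel.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("law_school_panel.csv")                  # hypothetical file
    df = df.sort_values(["school", "year"])
    df["score_lag1"] = df.groupby("school")["usnwr_score"].shift(1)
    df["lsat_lag4"] = df.groupby("school")["median_lsat"].shift(4)

    # Note 3: employment at graduation on lagged overall score and LSAT lagged four years.
    note3 = smf.ols("emp_at_grad ~ score_lag1 + lsat_lag4", data=df).fit()
    print(note3.summary())

    # Note 4: male-female ratio on the overall score lagged one year.
    note4 = smf.ols("mf_ratio ~ score_lag1", data=df).fit()
    print(note4.summary())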