Ranking the Law School Rankings, 2015

On the heels of the first-ever ranking of law school rankings, and last year's second edition, here's the third edition.

The rankings tend to measure one of, or some combination of, three things: law school inputs (e.g., applicant quality, LSAT scores); law school outputs (e.g., employment outcomes, bar passage rates); and law school quality (e.g., faculty scholarly impact, teaching quality). Some rankings prefer short-term measures; others prefer long-term measures.

Lest anyone take these rankings too seriously, there is no inherently rigorous methodology I use. It's largely my idiosyncratic preference about what rankings I think are "better" or "worse."

And, as always, I'll decide what rankings to rank. I've removed a couple and added a couple. The year listed is the year the ranking was last updated (not the self-described year of the ranking).

1. NLJ 250 Go-To Law Schools (2014): It's a clear, straightforward ranking of the percentage of graduates from each school who landed a position at an NLJ 250 law firm last year. It does not include judicial clerkships, or elite public interest or government positions, but it is perhaps the most useful metric for elite employment outcomes. As a methodological point, only 178 firms answered the survey, and NLJ relied on its database and independent reporting to supplement. To its great advantage, it includes many interactive charts of the data it has.

2. Sisk-Leiter Scholarly Impact Study (2012): The study has not been updated in a few years, but it's still useful for what it does. Drawing upon the methodology from Professor Brian Leiter, it evaluates the scholarly impact of tenured faculty in the last five years. It's a measure of the law school's inherent quality based on faculty output. In part because peer assessment is one of the most significant categories for the U.S. News & World Report rankings, it provides an objective quantification of academic quality. Admittedly, it is not perfect, particularly as it is not related to law student outcomes (of high importance to prospective law students), but, nevertheless, I think it's a valuable ranking.

3. Princeton Review Rankings (2014): Despite a black box methodology that heavily relies on student surveys, the series of rankings gives direct and useful insight into the immediate law school situation. It is admittedly not comprehensive, which I think is a virtue.

4. Above the Law Rankings (2014): The methodology is heavily outcome-driven (and perhaps driven by an outcome in mind). It relies on a very narrow "employment score" (full-time, long-term, bar passage required, excluding solo practitioners and school-funded positions). It conflates "tuition" with "cost," and it relies heavily on a couple of narrow categories (e.g., Supreme Court clerks). But it's a serious and useful ranking.

5. Enduring Hierarchies in American Legal Education (2013): Using many metrics, this study evaluates the persistence of the hierarchies among law schools. Little has changed over the last several decades in which law schools are regarded as high quality. This study tries to identify the traits that define the hierarchies, and it categorizes the schools into various tiers.

6. Law School Transparency Score Reports (2013): It's less a "ranking" and more a "report," which means it aggregates the data and allows prospective students to sort and compare. The data is only as useful as what's disclosed--and so while it provides some utility, it's limited by the limited disclosures.

7. Witnesseth Boardroom Rankings (2014): Professor Rob Anderson's analysis is extremely limited: it evaluates which law school graduates end up as directors or executive officers at publicly held companies. But I think it gives a nice data point in an area that's under-discussed: law school graduates, after all, may find success in business and not simply in the practice of law.

8. Roger Williams Publication Study (2013): It selects a smaller set of "elite" journals and ranks schools outside the U.S. News & World Report "top 50." There are a few issues with this, as it relies on a fixed data set of "top 50" journals established years ago, and as it hasn't been updated in a couple of years, but, given its narrow focus, I think it does a nice job filling in some gaps left by the Sisk-Leiter study.

9. AmLaw BigLaw Associates' Satisfaction (2014): It surveys associates for how well their law schools prepared them for firm life. It highly correlates with job satisfaction. It's a nice, small post-graduate measure of law schools.

10. PayScale Rankings by Mid-Career Salaries (2014): While this survey mixes all graduate schools together, and while it has some obvious selection bias in the reported salary data, it's another rare ranking that attempts to evaluate mid-career employment outcomes, which, as an under-evaluated area, makes this study something worth considering.

11. QS World University Rankings (2014): I think this ranking tends toward comparing apples, oranges, kumquats, rhododendrons, and lichen: all living things, but extremely hard to compare. But its use of h-index and citations per paper increases the objectivity of this academic-driven ranking.

12. SSRN Top 350 U.S. Law Schools (2015): The total new downloads give you an idea of the recent scholarship of a faculty--with an obvious bias toward heavy-hitters and larger faculties.

13. U.S. News & World Report (2014): I used to say that it isn't because this ranking is so bad that I rank it so low. Over time, I've concluded that, no, it is because this ranking is bad. It relies heavily on a few metrics that measure little of anything meaningful. It distorts student quality by incentivizing pursuit of the median LSAT and UGPA at the expense of all other quality factors, especially the bottom quartile of the class; it rewards silly categories like high spending and library resources; it prints metrics unrelated to its ranking formula; its "lawyer/judge assessment score" has a notoriously low response rate; peer academic assessment scores have deflated over time as schools sandbag one another; and so on. Yes, these rankings are exceedingly influential. But they are pretty poor. They may mostly get the "right" results, but for all the wrong reasons.

14. Tipping the Scales (2015): The metrics are simply a bit too ad hoc--and that's saying something, coming in behind U.S. News & World Report. The factors are idiosyncratic and, while they reflect a superficial appreciation of things like student quality and outputs, the measures used (salary data, which is inherently bimodal and notoriously underreported; acceptance rates, which are not uniform indicators of quality; etc.) do not reflect a serious appreciation of those things.

15. PreLaw Magazine Best Law School Facilities (2014).

16. GraduatePrograms.com Top Law Schools for Social Life (2014).

Everything you need to know about Hickenlooper v. Kerr, the Guarantee Clause case before the Supreme Court

Tomorrow, the Supreme Court will consider a petition for a writ of certiorari in Hickenlooper v. Kerr. Colorado legislators challenged an enacted ballot initiative that prohibited legislative tax increases from taking effect without a popular vote, arguing that it violated the Guarantee Clause. A federal district court, and the Tenth Circuit, agreed that the legislators had standing and that the Guarantee Clause claim was justiciable.

I started tracking this matter over a year ago. I provide the background in these links; below that, I'll discuss the briefs in the case that the Court will consider.

Several amicus briefs were filed in the case, available at SCOTUSBlog. Of note (and these are very brief summaries of the major arguments):

  • The Colorado Union of Taxpayers Foundation, the Mountain States Legal Foundation, and 22 Colorado state legislators filed a brief in support of the petitioner. They focused primarily on the fact that respondents' injury was abstract, because legislators never enacted a tax increase for the people to vote upon--instead, they simply alleged a dilution of legislative power. That cannot comport with existing standing doctrine. Only if the Colorado legislature enacted a tax increase, then saw the people reject it, would standing exist.
  • The National Federation of Independent Business, along with several policy institutes, filed a brief in support of the petitioner. They emphasized the breadth of the impact of a finding that such a case is justiciable, because the decision invites judicial invalidation of direct democracy in a number of states on matters ranging from marijuana legalization to charter schools. They also noted that in the partisan gerrymandering context (Vieth v. Jubelirer), the Supreme Court has essentially required an articulation of judicially manageable standards before the case could proceed. Here, the district court insisted (in a rather bizarre fashion) on holding a trial to determine what the Guarantee Clause demands.
  • The Center for Constitutional Jurisprudence (with John Eastman) filed a brief in support of the petitioner. It focused upon the inability of the Tenth Circuit to distinguish existing precedent finding the Guarantee Clause usually non-justiciable. Regardless, the case presents a good vehicle for clarifying the language in cases like New York v. United States (1992) suggesting that the Guarantee Clause may be justiciable, and articulating that the standards for justiciability are not met in this case.
  • Texas, joined by five other states, filed a brief in support of the petitioner. They argue that the text of the Guarantee Clause protects not the state legislature, but the people, and the States. They also cite other provisions, like line-item vetoes and supermajority voting requirements, that may be called into question if this case is found justiciable.
  • The Cato Institute (with Ilya Shapiro) and several other policy institutes filed a brief in support of the petitioner. It asked the Court to avoid addressing the issue of whether the Guarantee Clause is per se non-justiciable and instead emphasized that Colorado's Taxpayer Bill of Rights met the standard of a "Republican Form of Government," drawing heavily from source material at the founding.

The Supreme Court will consider the case tomorrow--and we'll eagerly await its decision whether to hear the case.

Annual Statement, 2014

Site disclosures

Total operating cost: $186.47

Total content acquisition costs: $348.70

Total site visits: 83,706

Total unique visitors: 71,902

Total pageviews: 102,923

Top referrers:
Above the Law (12,676)
Facebook (8,194)
Pajamas Media (4,196)
Twitter (4,091)
ABA Journal (3,738)
Brian Leiter's Law School Reports (3,438)
TaxProf (3,252)
Election Law Blog (1,802)
Reddit (1,096)
The Faculty Lounge (782)
law.uci.edu (573)

Most popular content (by pageviews):
The best prospective law students read Homer (24,095)
Ranking the most liberal and conservative law firms (5,516)
Bar exam scores dip to their lowest level in 10 years (5,241)
Law school microranking: federal judicial clerkship placement, 2011-2013 (4,803)
Ranking the law school rankings, 2014 (4,780)
Increasingly appears NCBE may have had a role in declining MBE scores and bar pass rates (4,201)
Where are they now? Supreme Court clerks, OT2004 (4,152)

Most popular search results (when disclosed by search engine):
law school rankings (75)
excess of democracy (45)
law school rankings 2014 (40)
conservative law firms (16)
affordable law schools (13)
jd advantage jobs (13)
law schools worst yet to come excessofdemocracy (13)

Sponsored content: none

Revenue generated: none

Platform: Squarespace hosted by Peer1

Privacy disclosures

External trackers: one (Google Analytics)

Individuals with internal access to site at any time in 2014: one (Derek Muller)

Fictional Attorney of the Month: Jeff Winger

Community is not a typical comedy on television. That's probably why its quirky humor attracted a rabid but small audience, why NBC fired its creator three seasons in only to rehire him for the fifth, and why it's now canceled only to survive with some version of a sixth season on Yahoo! Screen as cast members peel off one by one. Joel McHale's disbarred attorney Jeff Winger is just one of its many delightful roles.

Winger was a successful attorney at Hamish, Hamish & Hamlin, but for all the wrong reasons. He successfully persuades a jury to let off his client facing a DUI charge by tying it to 9/11. He helps a stripper escape tax evasion charges by arguing that her profession is actually not-for-profit performance art.

Unfortunately, a fellow attorney reports him to the bar for failing to obtain an undergraduate degree, and he's stuck in community college to get that degree with the least effort possible. Greendale Community College's eclectic mix of students generally rubs the sophisticated (former) attorney Winger the wrong way--much to the delight of all of us.

2014 Fictional Attorneys of the Month

January: Harvey Dent

February: Philip Banks

March: Willie Stark

April: Charles Kingsfield

May: Bob Loblaw

June: The Man of Law

July: John Shepherd

August: Lionel Hutz

September: Amanda Bonner

October: Sydney Carton

November: Barry Zuckerkorn

2013 Fictional Attorneys of the Month

NCBE has data to prove Class of 2014 was worst in a decade, and it's likely going to get worse

I have blogged extensively about the decline in bar pass rates around the country after the July 2014 test. My original take was more inquisitive, and I later discounted the impact that ExamSoft may have had. After examining the incoming LSAT scores for the Class of 2014, I concluded that it was increasingly likely that the NCBE had some role, positing elsewhere that perhaps there was a flaw in equating the test with previous administrations.

The NCBE has come back with rather forceful data to show that it wasn't the MBE (and that my most recent speculation was, probably, incorrect)--it was, in all likelihood, the graduates who took the test.

In a December publication (PDF), the NCBE described several quality-control measures that confirmed it was the test-takers, and not the test. First, on re-takers v. first-time test-takers:

Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years.

I had suggested, based on earlier data from a few states, that re-takers and first-time test-takers performed similarly; but this much broader dataset, using the more precise measure of MBE performance, shows that first-time test-taker performance was much worse.

Second, on equating the test:

Also telling is the fact that performance by all July 2014 takers on the equating items drawn from previous July test administrations was 1.63 percentage points lower than performance associated with the previous use of those items, as against a 0.57 percentage point increase in July 2013.

As equating the test is probably the biggest possible flaw on the NCBE's end, it's extremely telling that the equating of specific items on previous administrations yielded such a significant decline, and such a sharp contrast with the July 2013 test.

Third, and, in my view, one of the most telling elements, the MPRE presaged this outcome:

The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50–150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

A steady decline in MPRE scores, then, foretold this problem. This further undermines any notion that ExamSoft or other test-specific factors impacted the outcome; the writing was on the wall years ago. But as few schools carefully track MPRE performance, it might not have been an obvious sign until after the fact.

The NCBE bulletin then points out additional factors that distort student quality: a decrease in quality at the 25th percentile of admitted students at many institutions (i.e., those at the highest risk of failing the bar), the impact of highest-LSAT score reporting rather than average-LSAT score reporting for matriculants (a change embraced by both the ABA and LSAC despite evidence that taking the highest score overstates student quality), and an increase in transfer students to higher-ranked institutions (which distorts the incoming student quality metrics at many institutions). Earlier, I blogged that a decline in LSAT scores likely could not explain all of the decline--it could explain part, but there are, perhaps, other factors at play.

The NCBE goes on to identify other possible factors, ones that may merit further investigation in the legal academy:

  • An increase in "experiential learning," including an increase in pass-fail course offerings, which often means students take fewer graded, more rigorous, "black-letter" courses;
  • A decline in the credit hours required for graduation and a decline in required (i.e., often more rigorous) courses;
  • An increased reliance on bar-prep companies over semester-long coursework to prepare for the bar;
  • A lack of academic support for at-risk students as the 25th percentile LSAT scores of matriculants worsen at many institutions.

So, after I waffled, and blamed some decrease in student quality, and then started to increasingly consider the NCBE as a culprit, this data moves me back to putting essentially all of the focus on student quality and law school decisionmaking. Law schools--through admissions decisions, curriculum decisions, academic support decisions, transfer decisions, as a reaction to non-empirical calls from the ABA or other advocacy groups, or some combination of these factors--are primarily in control of the students' bar pass rates, not some remarkable decision of the NCBE. How schools respond will be another matter.

Further, the NCBE report goes on to chart the decline in the 25th percentile LSAT scores at many institutions. The declines in many places are steep. They portend some dramatic results--the decline in bar pass rates this year is only the beginning of probably still-steep declines in the next couple of years, absent aggressive decisions within the present control of law schools. (The admissions decisions, after all, are baked in for the current three classes.)

Coupled with the decline of prospective law students, law schools are now getting squeezed at both ends--their prospective student quality is increasingly limited, and their graduates are going to find it still harder to pass the bar. And we'll see how they respond to this piece of news from the NCBE--I, for one, find the data quite persuasive.

Visualizing the continuing decline of the law school student body, 2014

One of the posts that has had the most staying power on this site was a post and a chart last year, "For legal education, the worst may be yet to come." We can now confirm that the decline continues, and the Class of 2017 is much smaller than previous classes--and that the bottom still has not been reached, given LSAT and applicant trends.

An ABA Journal piece discloses that the total incoming 1L class in 2014 was 37,675, the smallest since 1974, and down from the peak of 52,488 in 2010. Coupled with the declining LSAT data from LSAC, it paints a grim picture for legal education through at least 2017, and likely through 2018.

Top 25 law schools ranked by law student transfer preferences

How about another law school ranking--this time, one that measures tangible law student preference for one school over another?

The ABA has released the Standard 509 disclosures from law schools for 2014. The Standard 509 includes new data this year. Schools formerly listed only the number of transfers in and out. Now, if a school accepts more than five transfer students, it must list the schools from which the transfers came. Additionally, with enough transfer students, a school must disclose the median GPA of its transfers, and with an even larger number, the 75th and 25th percentile GPAs as well.

The data is all concealed in those PDFs. But with a little magic (text recognition, data scraping, and a little time-consuming manual cleanup), we can aggregate the transfer data. Schools logged 2221 transfers in for the Fall of 2014. Because of disclosure requirements, we know the migration patterns of 1968 of them. So we know 1968 decisions of law students to leave one institution and instead attend another.
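Once the PDFs have been reduced to rows of (origin, destination, count), the aggregation step itself is simple. Here's a toy sketch with entirely hypothetical school names and counts, not the actual 509 data:

```python
# Toy aggregation of transfer rows scraped from 509-style disclosures.
# School names and counts below are hypothetical, for illustration only.
from collections import Counter

rows = [
    ("School A", "School B", 4),  # 4 students left School A for School B
    ("School A", "School C", 2),
    ("School B", "School C", 1),
]

transfers_in = Counter()
transfers_out = Counter()
for origin, destination, count in rows:
    transfers_out[origin] += count
    transfers_in[destination] += count

total = sum(count for _, _, count in rows)
print(total)                      # 7 total disclosed transfer decisions
print(transfers_in["School C"])   # 3 students transferred in to School C
```

From tallies like these, the pairwise "migration" counts that feed a ranking model fall out directly.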

Students applying to law schools often don't have a good idea about what schools have to offer. But once they are in law school, they have some additional information about their own institution and have a better perspective about law schools themselves.

My colleague Rob Anderson (WITNESSETH) thought that using the Bradley-Terry model would be the best way of comparing schools. (This method is used in, among other things, Peter Wolfe's college football rankings, once a component of the BCS formula.)
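In rough terms, the model treats each transfer as a head-to-head result: a student leaving school B for school A counts as a "win" for A. Here's a minimal sketch of a Bradley-Terry fit using the standard iterative (minorization-maximization) update, with entirely hypothetical schools and counts--this is not the model run that produced the rankings below:

```python
# Minimal Bradley-Terry sketch. A transfer from school B to school A is
# recorded as wins[("A", "B")] -- a "win" for A over B. Hypothetical data.
from collections import defaultdict

def bradley_terry(wins, n_iter=200):
    """Fit Bradley-Terry strengths via the standard MM update.

    wins[(a, b)] = number of times a "beat" b.
    Returns strengths normalized to sum to 1.
    """
    teams = set()
    for a, b in wins:
        teams.update((a, b))
    p = {t: 1.0 for t in teams}

    n = defaultdict(float)  # n[{a, b}] = total comparisons between a and b
    w = defaultdict(float)  # w[a] = total wins for a
    for (a, b), k in wins.items():
        n[frozenset((a, b))] += k
        w[a] += k

    for _ in range(n_iter):
        new_p = {}
        for i in teams:
            denom = sum(
                n[frozenset((i, j))] / (p[i] + p[j])
                for j in teams
                if j != i and n[frozenset((i, j))] > 0
            )
            new_p[i] = w[i] / denom if denom > 0 else p[i]
        total = sum(new_p.values())
        p = {t: v / total for t, v in new_p.items()}
    return p

# Hypothetical: "A" gains transfers and never loses any; "B" and "C" trade.
wins = {("A", "B"): 3, ("B", "C"): 2, ("A", "C"): 1, ("C", "B"): 1}
strengths = bradley_terry(wins)
ranking = sorted(strengths, key=strengths.get, reverse=True)
print(ranking)  # "A" ranks first: it only gains transfers, never loses them
```

The fitted strengths are identified only up to scale, which is why the sketch normalizes them to sum to one; handling schools that never lose a transfer (like Yale, whose maximum-likelihood strength is unbounded) requires more care in practice.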

Using that method, here are the top 25 schools. (The full list will be revealed at WITNESSETH later this week.) Comments below.

1. Yale University

2. Stanford University

3. Harvard University

4. New York University

5. University of California-Berkeley

6. Columbia University

7. University of Chicago

8. University of Pennsylvania

9. Northwestern University

10. University of Texas at Austin

11. Duke University

12. University of Washington

13. University of California-Los Angeles

14. Vanderbilt University

15. University of Michigan

16. University of Virginia

17. Cornell University

18. George Washington University

19. Brigham Young University

20. Georgetown University

21. University of Minnesota

22. University of Southern California

23. Southern Methodist University

24. Washington University

25. University of Notre Dame

As with any ranking system, there are obviously imperfections. Zero students transferred out of Yale, Harvard, or Stanford, so they are compared only indirectly; so too with others that had fewer transfers. Many schools had five or fewer transfers, which they did not disclose. Many students transfer for personal reasons, which may not reflect an evaluation of law school quality. Schools generally benefited if they had no (or few) transfers out; schools generally suffered if they were not required to disclose their transfer data, or if they accepted few transfers (when, of course, the most stable schools may accept the fewest transfers!).

Glancing at the top 25, one may wonder about some of the rankings. But consider SMU: it accepted transfers from Baylor, Hastings, and Fordham (among other institutions), but sent students only to Texas. A Bradley-Terry model ranks SMU quite high precisely because of the results of those head-to-head matchups.

The data set includes all schools with at least one disclosed transfer, including the three schools in Puerto Rico, recently-accredited schools like the University of California-Irvine, and schools seeking accreditation like Concordia University.

Stay tuned: the full list of schools is forthcoming.

(If you are at a law school that did not disclose the schools from which students transferred, email us the information and we'll post an update. It will usually, but not always, help your school's ranking.)

UPDATE: This post has been modified in light of a correction in data.

Ninth Circuit affirms Measure B condom requirement

More than nine months after oral argument, the Ninth Circuit has issued its decision in Vivid Entertainment v. Fielding, a challenge to Los Angeles's Measure B (PDF), which, among other things, requires performers in pornographic films to wear condoms.

I wondered earlier if Prop 8 litigation may have undermined governmental defense of certain ballot initiatives, given that Los Angeles County urged intervenors to participate in the litigation so that it would not have to defend the measure. After challengers to the measure largely lost, they appealed, arguing, among other things, that the defendant-intervenors could not serve as appellees, because defendant Los Angeles County refused to continue to defend the measure.

Here's what the Ninth Circuit held regarding jurisdiction:

Citing Perry, Plaintiffs argue that we lack jurisdiction over this appeal, because Intervenors lack Article III standing. We disagree with their reading of Perry and with their contention that Intervenors must have standing for this appeal to proceed.
The Supreme Court has held that a party must have Article III standing both to initiate an action and to seek review on appeal. Arizonans for Official English v. Arizona, 520 U.S. 43, 64 (1997). But an intervenor who performs neither of those functions and no other function that invokes the power of the federal courts need not meet Article III standing requirements. Yniguez v. Arizona, 939 F.2d 727, 731 (9th Cir. 1991), vacated by Arizonans for Official English, 520 U.S. at 80, as recognized in League of United Latin Am. Citizens v. Wilson, 131 F.3d 1297, 1305 n.5 (9th Cir. 1997); see also Perry, 133 S. Ct. at 2661 (citing Art. III, § 2) (holding that “any person invoking the power of a federal court must demonstrate standing to do so” (emphasis added)). Nothing in Perry, which concerned the question whether an intervenor who sought to appeal had Article III standing, affects that conclusion. Plaintiffs have standing, and it is they alone who have invoked the federal courts’ jurisdiction. For that reason, we need not and do not decide whether Intervenors satisfy the requirements of Article III standing.

Jurisdiction, then, exists--but no obligation, apparently, exists for the government to defend a publicly-enacted ballot measure that government officials may not agree with, or may simply deem too sticky to defend publicly. The Ninth Circuit went on to largely affirm the district court and, accordingly, upheld enforcement of much of Measure B.

New MP3: Eminem - '97 Bonnie and Clyde ft. Chief Justice John Roberts

I grew up on 12 Mile Road in the 313.* It's with some affection, then, that I hold the music of Detroit, from Motown to techno, in high esteem.

Eminem supporters storm the State of the Union address and beg onlooking Supreme Court justices to "Protect the 1st Amendment." (Screenshots from Eminem, "The Mosh Continues" (2004).)

Eminem is probably one of the most gifted lyricists in rap--he moves far beyond rhyme to embrace assonance, consonance, alliteration, and many more poetic forms in his lyrics. (This 90-second video clip of his interview with Anderson Cooper on the word "orange" teaches more about poetry than most students learn in all of high school.) His music, however, is not for the faint of heart, given the obscenity and... shall we say, adult content of those lyrics.

An early album in 1997 featured "Just the Two of Us," a song about domestic violence in which Eminem and his daughter conspire to kill the girl's mother. (The song samples heavily from the 1981 Grover Washington Jr. and Bill Withers song.) The song was re-recorded, extended, and released under the title "'97 Bonnie and Clyde" on another album in 1999.

Last week, the United States Supreme Court heard oral argument in Elonis v. United States, in which the defendant was convicted of threatening another person in a Facebook post. Asking about the scope of the First Amendment when it comes to threatening speech, Chief Justice John Roberts quoted from "'97 Bonnie and Clyde." From the transcript (PDF):

CHIEF JUSTICE ROBERTS: What about the language [in] the Petitioner's brief? You know, "Da-da make a nice bed for mommy at the bottom of the lake," "tie a rope around a rock," this is during the context of a domestic dispute between a husband and wife. "There goes mama splashing in the water, no more fighting with dad," you know, all that stuff.
Now, under your test, could that be prosecuted.
MR. DREEBEN: No. Because if you look at the context of these statements--
CHIEF JUSTICE ROBERTS: Because Eminem said it instead of somebody else?
MR. DREEBEN: Because Eminem said it at a concert where people are going to be entertained.

With that, I thought a mashup was in order. With the enormous help and creative talent of an old friend and talented musician, Nate Wazoo, here's "Roberts and Clyde," an MP3 of a 52-second sample of the song.

You can purchase a copy of the original "'97 Bonnie and Clyde" at sites like Amazon.

*Strictly speaking, that was Royal Oak, not Detroit--the 313 area code was subdivided in 1993.

Copyright notice: the use of any copyrighted material is fair use for non-commercial, satirical, and educational purposes.