Last year, I introduced the first-ever ranking of law school rankings at PrawfsBlawg. I thought I would reprise the task.
As Elie Mystal at Above the Law noted at a recent conference, law school rankings tend to encourage more law school rankings. So it may be useful to put them in a single place and analyze them.
The rankings tend to measure one of, or some combination of, three things: law school inputs (e.g., applicant quality, LSAT scores); law school outputs (e.g., employment outcomes, bar passage rates); and law school quality (e.g., faculty scholarly impact, teaching quality). Some rankings prefer short-term measures; others prefer long-term measures.
Last year, I ranked 15 rankings. I'm adding four others: Enduring Hierarchies, Witnesseth Boardroom Rankings, Above the Law Rankings, and Tipping the Scales Rankings.
1. Sisk-Leiter Scholarly Impact Study (2012): Drawing upon the methodology from Professor Brian Leiter, it evaluates the scholarly impact of tenured faculty in the last five years. It's a measure of the law school's inherent quality based on faculty output. In part because peer assessment is one of the most significant categories for the U.S. News & World Report rankings, it provides an objective quantification of academic quality. Admittedly, it is not perfect, particularly as it is not related to law student outcomes (of high importance to prospective law students), but, nevertheless, I think it's the best ranking we have.
2. NLJ 250 Go-To Law Schools (2013): It's a clear, straightforward ranking of the percentage of graduates from each school who landed a position at an NLJ 250 law firm last year. It does not include judicial clerkships, or elite public interest or government positions, but it is perhaps the most useful metric for elite employment outcomes.
3. Princeton Review Rankings (2013): Despite a black box methodology that heavily relies on student surveys, the series of rankings gives direct and useful insight into the immediate law school situation. It is admittedly not comprehensive, which I think is a virtue.
4. Above the Law Rankings (2013): The methodology is heavily outcome-driven. Unfortunately, it conflates "tuition" with "cost" (conceding as much when evaluating its own metrics), and it relies heavily on a couple of narrow categories (e.g., Supreme Court clerks). But it's a serious and useful ranking.
5. Enduring Hierarchies in American Legal Education (2013): Using a wealth of metrics, this study evaluates the persistence of hierarchies among law schools. Little has changed over the last several decades in which law schools are considered high quality. This study tries to identify the traits of those hierarchies, and it categorizes the schools into various tiers.
6. Witnesseth Boardroom Rankings (2013): Professor Rob Anderson's analysis is extremely limited: it evaluates which law school graduates end up as directors or executive officers at publicly held companies. I think it gives a nice data point in an area that's under-discussed: law school graduates, after all, may find success in business and not simply in the practice of law.
7. Law School Transparency Score Reports (2013): It's less a "ranking" and more a "report," which means it aggregates the data and allows prospective students to sort and compare. The data is only as useful as what's disclosed--and so while it provides some utility, it's constrained by what schools choose to disclose.
8. The Black Student's Guide to Law Schools (2013): Despite its obviously narrow audience, I think it offers some unique, and serious, elements, such as cost and cost of living, and "distinguished alumni" as a measure of school quality and student outcomes.
9. Roger Williams Publication Study (2013): It selects a smaller set of "elite" journals and ranks schools outside the U.S. News & World Report "top 50." There are a few issues with this, especially given its narrow focus, but I think it does a nice job filling in some gaps left by the Sisk-Leiter study.
10. SSRN Top 350 U.S. Law Schools (2014): The total new downloads give you an idea of the recent scholarship of a faculty--with an obvious bias toward heavy-hitters and larger faculties.
11. Wall Street Journal Law Blog's Best Big Law Feeder Schools (2012): It's somewhat less useful than the NLJ 250, but it is what it is.
12. U.S. News & World Report (2013): It isn't that this ranking is so bad that it deserves 12th place on my list. It's not ideal. It has its problems. I've noted that it distorts student quality. But, mostly, its position reflects that there are quite a few rankings that, I think, are much better.
13. Tipping the Scales Rankings (2013): The metrics are simply a bit too ad hoc--and that's saying something, coming in behind U.S. News & World Report. The factors are idiosyncratic and, while they reflect a superficial appreciation of things like student quality and outputs, the measures used (salary data, which is inherently bimodal; acceptance rates, which are not uniform indicators of quality; etc.) do not amount to a serious appreciation of those things.
14. QS World Law School Rankings (2013): I think this ranking tends toward comparing apples, oranges, kumquats, rhododendrons, and lichen: all living things, but extremely hard to compare.
15. Business Insider 50 Best Law Schools in America (2013): A survey asking 400 legal professionals to name their top 10 schools, this one... isn't much use as a ranking. It reflects the impressions of a set of practitioners using a specific methodology. That's about it.
18. Top Law Schools Rankings (2013): Last year, I indicated a myriad of reasons why these rankings were a hot mess. This year, Top Law Schools has "fixed" them... simply by repeating the overall U.S. News & World Report rankings and the Above the Law rankings. Because an ordinal ranking is less useful than the underlying data, this is far less useful than the rankings it copies.
19. Cooley Rankings (2010).