Ranking the Law School Rankings, 2015
On the heels of the first-ever ranking of law school rankings, and last year's second edition, here's the third edition.
The rankings tend to measure one of, or some combination of, three things: law school inputs (e.g., applicant quality, LSAT scores); law school outputs (e.g., employment outcomes, bar passage rates); and law school quality (e.g., faculty scholarly impact, teaching quality). Some rankings prefer short-term measures; others prefer long-term measures.
Lest anyone take these rankings too seriously: I use no rigorous methodology. The ordering largely reflects my idiosyncratic preferences about which rankings are "better" or "worse."
And, as always, I'll decide what rankings to rank. I've removed a couple and added a couple. The year listed is the year the ranking was last updated (not the self-described year of the ranking).
1. NLJ 250 Go-To Law Schools (2014): It's a clear, straightforward ranking of the percentage of each school's graduates who landed a position at an NLJ 250 law firm last year. It does not include judicial clerkships or elite public interest or government positions, but it is perhaps the most useful metric for elite employment outcomes. As a methodological caveat, only 178 firms answered the survey, and the NLJ supplemented the responses with its own database and independent reporting. To its great advantage, it includes many interactive charts of its data.
2. Sisk-Leiter Scholarly Impact Study (2012): The study has not been updated in a few years, but it's still useful for what it does. Drawing upon Professor Brian Leiter's methodology, it evaluates the scholarly impact of tenured faculty over the last five years; it's a measure of a law school's inherent quality based on faculty output. And because peer assessment is one of the most heavily weighted categories in the U.S. News & World Report rankings, an objective quantification of academic quality is particularly valuable. Admittedly, it is not perfect, particularly because it is unrelated to student outcomes (of high importance to prospective law students), but I nevertheless think it's a valuable ranking.
3. Princeton Review Rankings (2014): Despite a black-box methodology that relies heavily on student surveys, this series of rankings gives direct and useful insight into the current, on-the-ground situation at each school. It is admittedly not comprehensive, which I think is a virtue.
4. Above the Law Rankings (2014): The methodology is heavily outcome-driven (and perhaps designed with a particular outcome in mind). It relies on a very narrow "employment score" (full-time, long-term, bar-passage-required positions, excluding solo practitioners and school-funded jobs). It conflates "tuition" with "cost," and it leans heavily on a couple of narrow categories (e.g., Supreme Court clerkships). But it's a serious and useful ranking.
5. Enduring Hierarchies in American Legal Education (2013): Using many metrics, this study evaluates the persistence of hierarchies among law schools; little about which law schools are deemed high quality has changed over the last several decades. The study tries to identify the traits that define these hierarchies, and it categorizes the schools into various tiers.
6. Law School Transparency Score Reports (2013): It's less a "ranking" and more a "report": it aggregates the data and allows prospective students to sort and compare. But the data is only as useful as what schools disclose, so its utility is constrained by incomplete disclosures.
7. Witnesseth Boardroom Rankings (2014): Professor Rob Anderson's analysis is extremely narrow: it evaluates which law schools' graduates end up as directors or executive officers at publicly held companies. But I think it offers a nice data point in an under-discussed area: law school graduates, after all, may find success in business and not simply in the practice of law.
8. Roger Williams Publication Study (2013): It counts publications in a smaller set of "elite" journals and ranks the schools outside the U.S. News & World Report "top 50." There are a few issues with it: it relies on a fixed set of "top 50" journals established years ago, and it hasn't been updated in a couple of years. But given its narrow focus, I think it does a nice job filling in gaps left by the Sisk-Leiter study.
9. AmLaw BigLaw Associates' Satisfaction (2014): It surveys associates on how well their law schools prepared them for firm life, and the results correlate strongly with job satisfaction. It's a nice, small-scale post-graduate measure of law schools.
10. PayScale Rankings by Mid-Career Salary (2014): This survey mixes all graduate schools together, and it has some obvious selection bias in the self-reported salary data. But it's another rare ranking that attempts to evaluate mid-career employment outcomes, an under-evaluated area, which makes it worth considering.
11. QS World University Rankings (2014): I think this ranking tends toward comparing apples, oranges, kumquats, rhododendrons, and lichen: all living things, but extremely hard to compare. But its use of the h-index and citations per paper (see the short sketch after this list) increases the objectivity of this academic-driven ranking.
12. SSRN Top 350 U.S. Law Schools (2015): The total new downloads give you an idea of a faculty's recent scholarship--with an obvious bias toward heavy-hitters and larger faculties.
13. U.S. News & World Report (2014): In past editions, I've said that this ranking wasn't placed so low because it was so bad. Over time, I've concluded that, no, it's placed this low because it is bad. It relies heavily on a few metrics that measure nothing meaningful. It distorts student quality by incentivizing pursuit of the median LSAT and UGPA at the expense of all other quality factors, especially the bottom quartile of the class; it rewards silly categories like high spending and library resources; it prints metrics unrelated to its ranking formula; its "lawyers/judges assessment score" has a notoriously low response rate; peer assessment scores have deflated over time as schools sandbag one another; and so on. Yes, the rankings are exceedingly influential. But they are pretty poor. They may mostly get the "right" results, but for all the wrong reasons.
14. Tipping the Scales (2015): The metrics are simply a bit too ad hoc, and that's saying something for a ranking that comes in behind U.S. News & World Report. The factors are idiosyncratic, and while they reflect a superficial appreciation of things like student quality and outputs, the measures used (salary data, which is inherently bimodal and notoriously underreported; acceptance rates, which are not uniform indicators of quality; etc.) do not reflect a serious appreciation of those things.
15. PreLaw Magazine Best Law School Facilities (2014).
16. GraduatePrograms.com Top Law Schools for Social Life (2014).
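As promised in the QS entry above, here is a minimal sketch of how an h-index, the citation metric QS uses, is computed. The citation counts are hypothetical, purely for illustration: a scholar (or faculty) has an h-index of h when h of their papers have at least h citations each.

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least
    h of the papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # this paper still supports an h of `rank`
        else:
            break
    return h

# Hypothetical citation counts for one scholar's papers:
print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3
```

Because the metric counts citations rather than reputational survey responses, it is more objective than peer assessment, though, as with SSRN downloads, it tends to favor large and heavily cited faculties.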