The USNWR law school rankings are deeply wounded--will law schools have the coordination to finish them off?

While law schools love to hate the USNWR law school rankings, they have mostly settled for complaining loudly and publicly while internally (and sometimes externally) promoting those same rankings, or working like mad to use them as a basis for recruitment. Collective action is a real problem. And an effective tool for diminishing the value of the USNWR law school rankings remains elusive.

But this is, perhaps, the moment for law schools seeking to finish off the USNWR rankings. In the last month, USNWR has made four separate methodological alterations between the preliminary release of the rankings and the final release:

  • It created a new “diversity” ranking of schools that did not include Asian-American law students as a component of a “diverse” law school. After protests from law schools in the wake of recent events, USNWR agreed to include them. This decision alone moved some schools as many as 100 spots in the rankings (among nearly 200 law schools).

  • Its new “diversity” ranking also does not include multiracial students (those who identify with more than one racial group). USNWR is reconsidering that choice and has decided to delay the release of these new rankings.

  • A new component of the rankings on library resources added the “number of hours” the law library was available to students, worth 0.25% of the ranking. Methodological errors forced USNWR to recalculate the figures. This component—a 1-in-400th share of the ranking, mind you—altered the rankings of more than 30 schools, some by as much as six spots.

  • Another new library-resources component, also worth 0.25%, added the “ratio of credit-bearing hours of instruction provided by law librarians to full-time equivalent law students.” Errors there resulted in USNWR pulling the metric entirely and increasing the weight of bar passage rates from 2% to 2.25% of the ranking. This decision—again, over just a 1-in-400th part of the rankings—shifted another 35 schools.

These last two components strike me as strange new metrics. Is a law school better off if its librarians teach more student electives rather than provide research support and assistance to students and faculty? Is a law school better if its students can access the library (not just the law school, the law school library) between 2 and 5 am? That’s what the new metrics reward. UPDATE: For more specific critiques of the library metrics, see here.

This potpourri of new metrics is made even worse by the fact that USNWR cannot even assess its own rankings correctly. It has issued multiple retractions. So what might law schools do? A few possibilities:

  • Congressional hearing. Congress assuredly has an interest in near-monopolistic behavior from an entity that increases the price of legal education and that serves as a major signal to students choosing to enter the legal profession. The ranking systematically undervalues low-cost public schools by overweighting expenditures per student, and it routinely undervalues particular institutions like Howard University, which consistently places in the upper tier of elite law firm placement and remains deeply esteemed by hiring attorneys. These strike me as ripe matters of public concern for investigation. If Congress can haul tech companies before it, why not the rankings entity?

  • Pay-for-access boycott. USNWR charges law schools $15,000 to see the data they themselves already provide. Given the low value and quality of the data, it strikes me that schools should simply stop paying for it. Even 10 schools opting out would deprive USNWR of $150,000 in quasi-extortion cash. Sure, some schools would lose opportunities to “game” the rankings by digging into and comparing the figures. But maybe even every-other-year access—halving this revenue stream—would stifle it.

  • Survey participation boycott. This is two-fold. The first is a refusal to fill out the data survey requests each fall. Of course, USNWR can independently collect some figures if it wants to, like LSAT scores and 10-month employment outcomes. But it cannot replicate everything. This is, again, a collective action problem. The second is a refusal to fill out the peer assessment surveys. That’s a separate problem, but I think there’s a decent solution: spike the survey. That is, rate your own school a 5 and every other school a 1. That maximizes the value to your own school while incentivizing others to render the survey meaningless. If USNWR wants to start discounting surveys it views as “gaming,” let it articulate that standard.

  • Alternative rankings development. Law schools, of course, hate to be compared with one another in a single ranking. But schools and students are going to use rankings regardless. Why not develop metrics that law schools deem “appropriate”—such as a principal component analysis of employment outcomes—with a separately administered peer review score, among other things? That strikes me as a better way forward: breaking the monopoly by developing approved alternative metrics.
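As a rough illustration of what a PCA-based employment metric might look like, here is a minimal sketch. The schools and all outcome figures below are invented for illustration; this is not USNWR's methodology, nor a proposed one, just the basic mechanics of scoring schools by their projection onto the first principal component of several employment measures.

```python
import numpy as np

# Hypothetical employment outcomes for five invented schools (rows):
# columns = share in full-time bar-passage-required jobs, share in
# federal clerkships, share unemployed. All figures are made up.
outcomes = np.array([
    [0.92, 0.10, 0.02],
    [0.85, 0.04, 0.05],
    [0.78, 0.02, 0.08],
    [0.70, 0.01, 0.12],
    [0.60, 0.00, 0.20],
])

# Standardize each column, then take the first principal component
# via SVD; each school is scored by its projection onto it.
X = (outcomes - outcomes.mean(axis=0)) / outcomes.std(axis=0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
pc1 = vt[0]
# Orient the component so higher employment yields a higher score.
if pc1[0] < 0:
    pc1 = -pc1
scores = X @ pc1
ranking = np.argsort(-scores)  # indices of schools, best first
print(ranking)
```

The appeal of this kind of approach is that the weights on each outcome fall out of the data's own covariance structure rather than being hand-picked (no one decides library hours are worth 0.25%).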

Of course, I imagine these proposals, like most such projects, would fall victim to infighting. It’s one thing for law schools to write a strongly-worded letter decrying what USNWR is doing. It’s another thing to, well, do something about it. I confess my solutions here are half-baked and incomplete.

But if there’s a moment to topple USNWR law school rankings, it is now. We’ll see if law schools do so.