Excess of Democracy

Some law schools fundamentally misunderstand the USNWR formula, in part because of USNWR's opaque methodology

Earlier this week, USNWR announced it was indefinitely postponing the release of its law school rankings, after delaying their release one week. It isn’t the first data fiasco that’s hit USNWR in the law rankings. In 2021, it had four independent problems, two involving disputed methodology and two involving disputed data, that forced retraction and recalculation.

There are likely some obvious problems with the data that USNWR collected. For instance, Paul Caron earlier noted the discrepancies in bar passage data as released by the ABA. I noticed similar problems back in January, but (1) remedied some of them and (2) left the rest as is, assuming that, for my purposes, close was good enough. (It was.) The ABA has a spreadsheet of data that it does not update, and individual PDFs for each law school that it does update—that means any discrepancies that are corrected must later be manually supplemented to the spreadsheet. It is a terrible system, exacerbated by the confusing columns the ABA uses to disclose data. But it only affected a small handful of schools. It is possible USNWR has observed this issue and is correcting it; and it is possible it affects only a small number of schools.

A greater mistake, however, one advanced by law school deans, relates to employment data. Administrators and deans at Yale, Harvard, and Berkeley, at the very least, have complained very publicly to Reuters and the New York Times that their employment figures are not accurate.

They are incorrect. The complaints reflect a basic misunderstanding of the USNWR data, though the misunderstanding is admittedly exacerbated by how opaque USNWR is in disclosing its metrics.

In 2014, I highlighted how USNWR publicly shares certain data with prospective law students, but then conceals other data that it actually uses in reaching its overall ranking. This is a curious choice: it shares data it does not deem relevant to the rankings, while concealing other data that is relevant to the rankings.

The obvious one is LSAT score. USNWR will display the 25th-75th percentile range of LSAT scores. But it uses the 50th percentile in its ranking. That could be found elsewhere in its publicly-facing data if one looks carefully. And it is certainly available in the ABA disclosures.

Another, less obvious one is bar passage data. USNWR will display the school’s first-time pass rate in the modal jurisdiction, and that jurisdiction’s overall pass rate. But it uses the ratio of the first-time rate over the overall pass rate, a number it does not show (though simple arithmetic recovers it). And in recent years, it has used the overall rate from all test-takers across all jurisdictions, which it also does not show. Again, this is certainly available in the ABA disclosures.
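To make that arithmetic concrete, here is a minimal sketch in Python. The figures are invented for illustration; the ratio is the quantity USNWR appears to use but does not display:

```python
# Hypothetical, illustrative figures -- not any school's actual data.
school_first_time_rate = 0.92     # school's first-time pass rate (displayed)
jurisdiction_overall_rate = 0.80  # modal jurisdiction's overall rate (displayed)

# The ranking input appears to be the ratio of the two displayed numbers,
# which USNWR itself does not show.
ratio = school_first_time_rate / jurisdiction_overall_rate
print(f"bar passage ratio: {ratio:.3f}")  # 1.150
```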

Now, on to employment data. As my 2014 post shows, USNWR displays an “employed” statistic, both at graduation and 9 or 10 months after graduation. But it has never used that statistic in its rankings formula (EDIT: in recent years—in the pre-recession days, it weighed employment outcomes differently). It has, instead, weighted various categories to create its own “employment rank.” That scaled score is what enters the formula. And it has never disclosed how it weighs those categories.
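As a rough illustration of what such a weighted score might look like, here is a sketch in Python. The category names and weights are my own placeholders, since USNWR has never disclosed which categories it weighs or how:

```python
# Placeholder ABA categories and weights -- USNWR has never disclosed its
# actual weights, so every number here is purely illustrative.
WEIGHTS = {
    "ft_lt_bar_passage_required": 1.00,
    "ft_lt_jd_advantage": 1.00,
    "school_funded": 1.00,     # "full credit" per the 2023-2024 guidance
    "graduate_studies": 1.00,  # "maximum credit" per the same guidance
    "part_time_or_short_term": 0.50,
    "unemployed_seeking": 0.00,
}

def employment_score(counts: dict[str, int], class_size: int) -> float:
    """Weighted share of the class, given graduate counts per category."""
    return sum(WEIGHTS[cat] * n for cat, n in counts.items()) / class_size
```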

Let’s go back to what USNWR publicly assured law schools earlier this month (before withdrawing this guidance):

The 2023-2024 Best Law Schools methodology includes:

. . .

Full credit for all full-time, long-term fellowships -- includes those that are school funded -- where bar passage is required or where the JD degree is an advantage

Maximum credit for those enrolled in graduate studies in the ABA employment outcomes grid

Note that the methodology will give “full credit” or “maximum credit” for these positions. That is, its rankings formula will give these positions, as promised to law schools based on their complaints, full weight in its methodology.

I had, and have, no expectation that this would change what it publicly shares with prospective law students about who is “employed.” Again, that’s a different category, not used in the rankings. I assume, for instance, USNWR believes its consumers do not consider enrollment in a graduate program as being “employed,” so it does not include them in this publicly-facing metric.

Now, how can law schools know that this publicly-facing metric is not the one used in the rankings methodology, despite what USNWR has said? A few ways.

First, as I pointed out back in January, “I assume before they made a decision to boycott, law schools modeled some potential results from the boycott to determine what effect it may have on the rankings.” So law schools can use their modeling, based on USNWR’s own public statements, to determine where they would fall. My modeling very closely matches the now-withdrawn rankings. Indeed, Yale was the single greatest beneficiary of the employment methodology change, as I pointed out back in January. It is very easy to run the modeling with school-funded and graduate positions given “full weight,” or given some discounted weight, and see the difference in results. It is impossible for Yale to be ranked #1 under the old formula—that is, in a world where its many graduates in school-funded or graduate positions did not receive “full weight” in the methodology. Again, very simple, publicly-available information (plus a little effort of reverse-engineering the employment metrics from years past) demonstrates the outcomes.
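Here is a minimal sketch of that comparison, with invented counts for a hypothetical school; the structure of the exercise, not the numbers, tracks the modeling described above:

```python
# Invented counts for a hypothetical school -- none of these are real figures.
counts = {
    "ft_lt_bar_passage_required": 180,
    "school_funded": 30,
    "graduate_studies": 7,
    "unemployed_seeking": 13,
}
class_size = sum(counts.values())  # 230

def score(weights: dict[str, float]) -> float:
    """Weighted share of the class under a given set of category weights."""
    return sum(weights.get(cat, 0.0) * n for cat, n in counts.items()) / class_size

full_credit = {"ft_lt_bar_passage_required": 1.0,
               "school_funded": 1.0, "graduate_studies": 1.0}
discounted = {"ft_lt_bar_passage_required": 1.0,
              "school_funded": 0.5, "graduate_studies": 0.5}

# The gap between the two is the boost from giving school-funded and
# graduate positions "full weight," as the new methodology promises.
print(f"full credit: {score(full_credit):.3f}")  # 0.943
print(f"discounted:  {score(discounted):.3f}")   # 0.863
```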

Second, USNWR will privately share with schools subscribing to its service an “employment rank.” This raw “rank” figure is the output of the various weights it gives to employment metrics. It does not reveal how USNWR gets there; but it does reveal where law schools stand.

It takes essentially no effort to see whether the relationship between the “employment” percentage and the “employment rank” is pretty different or whether the two look largely the same. And that’s even accounting for the fact that the “rank” can include subtle weights for many different positions. At schools like Yale, there are very few variables. In 2021, its students fell into just 10 categories. And given that a whopping 30 of them were in full-time, long-term, law-school-funded, bar-passage-required positions, and another 7 were in graduate programs, either the mismatch between the “employment” percentage and the “employment rank” should be obvious, or the two should match pretty cleanly.
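A back-of-the-envelope sketch shows why. The 30 school-funded and 7 graduate-program figures come from the 2021 data above; the class size and the other count are placeholders, as is the assumption about which categories the consumer-facing “employed” figure includes:

```python
# The 30 school-funded and 7 graduate-program figures are from the 2021 data
# discussed above; the class size and the other count are placeholders.
class_size = 230
school_funded = 30
graduate_programs = 7
other_full_weight = 180  # all other full-weight positions (illustrative)

# Consumer-facing "employed" figure: assume (as the post does) that
# graduate-program enrollees are excluded from this displayed metric.
displayed_employed = (other_full_weight + school_funded) / class_size

# Ranking input under the new methodology: full credit for both categories.
rank_input = (other_full_weight + school_funded + graduate_programs) / class_size

print(f"displayed:  {displayed_employed:.3f}")  # 0.913
print(f"rank input: {rank_input:.3f}")          # 0.943
```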

Third, one can also reverse engineer the “employment rank” to see what weight USNWR gives to the various ABA categories. This takes some effort, but, again, it is entirely feasible to find weights that yield a “rank” matching what USNWR privately shares; one way to do so is sketched below. And for schools that run these figures themselves, they can see whether USNWR is actually giving full “weight” to certain positions or not.
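One way to do that reverse-engineering, sketched with numpy: stack each school’s ABA category shares into a matrix and solve for the weights that best reproduce the privately shared scores. The setup and all values below are my own illustration, not USNWR’s actual data or method:

```python
import numpy as np

# Each row is a school; each column is an ABA employment category, expressed
# as a share of the class. In practice X would be built from public ABA
# disclosures; the values here are illustrative, not real data.
X = np.array([
    [0.78, 0.13, 0.03, 0.06],
    [0.85, 0.02, 0.01, 0.12],
    [0.70, 0.10, 0.08, 0.12],
    [0.90, 0.01, 0.02, 0.07],
    [0.65, 0.05, 0.10, 0.20],
])

# y holds the scaled "employment rank" scores USNWR privately shares with
# subscribing schools (placeholder values here).
y = np.array([0.93, 0.88, 0.84, 0.92, 0.74])

# A least-squares fit recovers the implied per-category weights; with enough
# schools, a coefficient near 1 on the school-funded or graduate-studies
# column confirms (or refutes) that those positions get "full weight."
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(weights, 2))
```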

USNWR’s opaque approach to the “employment rank” certainly contributes to law schools misunderstanding the formula. But law schools—particularly the elite ones that initiated the boycott and insisted they did not care about the rankings, only to now care very much about them—should spend more effort understanding the methodology before perpetuating these erroneous claims.