Which law schools are affected the most by the USNWR dropping at-graduation employment rates?

Following up on my post, “Who's likely to benefit from the new USNWR law school rankings formula?,” I’ve continued to look at other changes to the rankings. USNWR is now only using publicly-available data. I went through some of those changes, but I wanted to look at another one: the absence of at-graduation employment outcomes.

USNWR does not publicly disclose precisely how it weighs the various employment metrics, but it does offer some relative cues, and it also says it gives “full weight” to full-time, long-term, bar passage-required or J.D. advantage jobs. How a school places in that category is highly correlated with how it places in the overall employment category. Now, changes to this category are coming, as I noted (i.e., USNWR will give “full weight” to school-funded positions and to students pursuing graduate degrees).

Setting aside that change for the moment, USNWR also bases 4% of its ranking on a school’s at-graduation employment rate. This metric tends to favor the more “elite” law schools that place a significant number of graduates into judicial clerkships or large law firms, because those employers tend to hire before a 3L has graduated. That, however, is not data collected by the ABA, which collects only 10-month employment statistics.

I looked at “elite” employment outcomes for students 10 months after graduation, and compared them to the overall at-graduation employment rate reported to USNWR. (“Elite” being placement in private practice law firms with 101 or more attorneys, and in federal judicial clerkships.) In this first chart, you can see a pretty good relationship between the at-graduation rates and the elite law firm placement rates. (You can also see a number of schools that don’t report their at-graduation rate at all.)

Now here’s the chart for the relationship between those same jobs and the 10-month employment rate. As you see, overall employment rates rise significantly among the schools with the least “elite” employment outcomes. It means that the shift from at-graduation to 10-month may well favor placement into public interest, government, and smaller law firm jobs compared to how those positions have been handled in the past.

On to the change in methodology. I ran the numbers from last year’s figures to see what would happen if the 10-month employment rate were weighted at 18% instead of its present 14% and the at-graduation employment rate were abolished. I only used the top-line full-weight “employment” figures, so those are less precise than the proprietary blend USNWR uses for its actual ranking; but I did standardize each score and looked at where it fell. While imprecise, this should give a “band” of the schools most likely to over-perform and under-perform based on this change alone. It should be noted that many schools do not presently share at-graduation employment statistics with USNWR, and probably all of them would be better off under this change, to some degree or another.
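The reweighting exercise can be sketched roughly as follows. This is a toy illustration with made-up school names and rates, not the actual USNWR inputs or its proprietary blend; the 14%/4% and 18% weights are the ones discussed above:

```python
# Sketch: standardize each employment metric across schools, then compare
# the old blend (14% ten-month + 4% at-graduation) against a new blend
# (18% ten-month, at-graduation dropped). Schools and rates are invented.
from statistics import mean, pstdev

schools = {
    "School A": {"at_grad": 0.30, "ten_month": 0.87},
    "School B": {"at_grad": 0.89, "ten_month": 0.92},
    "School C": {"at_grad": 0.10, "ten_month": 0.79},
}

def zscores(values):
    """Standardize values to mean 0, population stdev 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

names = list(schools)
z_at_grad = dict(zip(names, zscores([schools[n]["at_grad"] for n in names])))
z_ten_month = dict(zip(names, zscores([schools[n]["ten_month"] for n in names])))

# Old formula: 14% ten-month + 4% at-graduation; new formula: 18% ten-month only.
old = {n: 0.14 * z_ten_month[n] + 0.04 * z_at_grad[n] for n in names}
new = {n: 0.18 * z_ten_month[n] for n in names}
delta = {n: new[n] - old[n] for n in names}

for n in sorted(delta, key=delta.get, reverse=True):
    print(f"{n}: change of {delta[n]:+.3f} standardized points")
```

Note that the change for each school reduces to 0.04 × (z_ten_month − z_at_grad): a school gains precisely when its standardized 10-month rate exceeds its standardized at-graduation rate, which is the intuition behind the winners and losers lists below.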

SCHOOLS LIKELY TO BENEFIT

At grad v. 10 month

Elon 10.2%, 78.7%

Dayton 30.0%, 87.1%

Willamette 30.6%, 85.2%

Texas A&M 46.9%, 93.8%

Gonzaga 31.7%, 83.7%

Regent 31.2%, 83.1%

Houston 31.9%, 81.9%

Arkansas 39.3%, 86.6%

Northern Illinois 26.3%, 77.5%

Samford 31.4%, 80.7%

Arkansas-Little Rock 7.1%, 64.6%

DePaul 32.3%, 79.3%

Campbell 33.6%, 78.6%

North Dakota 29.9%, 76.1%

Idaho 30.6%, 76.5%

Seattle 32.4%, 76.9%

Liberty 24.6%, 71.9%

LSU 41.9%, 82.0%

Oklahoma 30.2%, 74.5%

Belmont 36.0%, 78.0%

These tend to be schools that do not place an overwhelming number of students into large law firms or judicial clerkships, but that do have a fairly strong 10-month employment rate relative to their peers. Interestingly, there are no California law schools on the list, a cohort I had assumed might benefit most from the state’s difficult bar examination and perhaps a greater “wait and see” approach from prospective employers.

Now, to schools more likely to be adversely affected.

SCHOOLS LIKELY TO BE ADVERSELY AFFECTED

At grad v. 10 month

Massachusetts-Dartmouth 33.9%, 47.5%

Yale 89.2%, 89.2%

Stanford 88.5%, 89.0%

BYU 82.8%, 85.9%

Northwestern 87.9%, 89.5%

CUNY 36.1%, 56.5%

Loyola-New Orleans 52.1%, 66.9%

Vanderbilt 82.2%, 86.1%

Georgetown 83.1%, 86.8%

NYU 86.6%, 89.5%

Berkeley 86.7%, 90.0%

Chicago 94.6%, 95.1%

Columbia 95.3%, 95.6%

USC 76.2%, 83.6%

Virginia 92.7%, 94.3%

Cornell 90.3%, 92.8%

Montana 81.2%, 87.0%

Irvine 58.7%, 72.7%

Connecticut 58.6%, 72.9%

Harvard 88.1%, 91.8%

Recall, of course, that I am looking at the change on this one metric alone. And recall that because the schools’ data are standardized in each category, those likely to gain or lose may look a little different than one might expect from the raw numbers alone. But it’s a mix of schools that have a very high at-graduation employment rate and receive a significant boost relative to their peers, and schools that are fairly low in both categories but were farther outliers in the at-graduation rates.

There are many other changes that could help or adversely affect other schools. Note, for instance, that I suggested in an earlier post that BYU could gain significantly in some other categories; here, it appears it could be adversely affected. Texas A&M, to name another, performs well here, as it did in other places. How much weight USNWR gives to any change matters greatly.

But I think this highlights just how uncertain many changes are in the upcoming rankings. As I pick off different categories, there are schools likely to change their performance in each category. How those shake out in the end—whether they tend to be beneficial or not—remains to be seen.

By knocking off expenditure metrics and devaluing peer reputation scores in the new USNWR formula, did law schools just kill the faculty's golden goose?

As Aesop tells it, there was a goose that laid golden eggs. The greedy farmer saw the goose and thought there must be more gold inside it. The farmer killed the goose and found nothing special inside—but he had now lost the ability to gather any more golden eggs.

It may not be the same story with the USNWR boycott and subsequent rankings changes. Law schools may well have attacked the goose thinking it was a wolf. But upon its demise, it may well be that law schools have permanently lost one of their most significant bargaining chips with central universities in trying to secure more funding for the law school.

Let me at the outset point out that I’ve long been critical of many aspects of the USNWR rankings, including the expenditure data. It has been opaque, and it has fueled a kind of arms race for schools to figure out which accounting tricks they can use to raise their expenditure figures. And let me add that eliminating it is in many respects a good thing, because the costs often fell on student loan borrowers through tuition hikes. So the analysis below is a small violin for many, indeed!

But a sober look at the change is in order. I posited yesterday about a potential effect of eliminating the expenditures-per-student metric:

By the way, it’s worth considering a new and different incentive for law schools situated within universities right now. Law schools could presently make the case to central administration that high spending on resources, including on law professor salaries, was essential to keeping one’s place in the rankings. No longer. It’s worth considering what financial incentive this may have on university budgets in the years ahead, and the allocation of resources.

From some offline and private conversations, this factor has been one of the most eye-opening to the law professoriate.

In the past, law schools could advocate for more money by pointing to this metric. “Spend more money on us, and we rise in the rankings.” Direct expenditures per student—including law professor salaries—were 9% of the overall rankings in the most recent formula. They were also one of the biggest sources of disparities among schools, which also meant that increases in spending could have higher benefits than increases in other categories. It was a source for naming gifts, for endowment outlays, for capital campaigns. It was a way of securing more spending than other units at the university.

And indirectly, the 40% of the formula for reputation surveys, including 25% for peer surveys and 15% for the lawyer/judge survey, was a tremendous part of the formula, too. Schools could point to this factor to say, “We need a great faculty with a public and national reputation; let us hire more people or pay more to retain them.” Yes, the “value” proposition here was more indirect, but law faculty rating other law faculty may well have been most inclined to vote for, well, the faculty they thought were best.

Now, the expenditure data is gone, completely. And peer surveys will be diminished to some degree, a degree only known in March.

Some increase in the measurement of outputs, including bar passage data and employment outcomes, will replace it.

For law faculty specifically, and for law schools generally, this is a fairly dramatic turn of events.

To go to a central university administration now and say, “We need more money,” the answer to the “why” just became much more complicated. The easy answer was, “Well, we need it for the rankings, because you want us to be a school rated in the top X of the USNWR rankings.” That’s gone now. Or, at the very least, it is diminished significantly, and the case can only be made, at best, indirectly.

The conversation will look more like, “Well, if you’re valued on bar passage and employment, what are you doing about those?”

A couple of years ago, I had these long thoughts on the hollowness of law school rankings. For schools that lack the confidence in their institution and lack the vision to be able to articulate the value of the institution without reference to rankings, rankings provided easy external validation. They have also provided easy justification for these kinds of asks over the years.

Those easy days are over. Funding requests will need to look very different in a very short period of time.

Are there other things that law schools can point to for a specific investment in law faculty in the USNWR rankings? Well, one such measure may have been citation metrics, about which I had some tentative but potentially positive things to say when USNWR considered them. But law schools mounted a pressure campaign to nix that idea, too.

At the end of the day, then, the rankings formula will have very little to say about the quality of a law school’s faculty or the school’s financial investment in its faculty. An indirect case remains, of course, including a diminished peer reputation score. And faculty do contribute to bar passage and to employment outcomes. There will still be a faculty-student ratio.

But I think the financial case for law schools may well look very different in the very near future. This will be almost impossible to measure, and the anecdotes coming from these changes may well be wild and unpredictable. It’s also contingent on other USNWR changes, of course. But it’s a question I’ll be trying to watch closely over the next decade.

Who's likely to benefit from the new USNWR law school rankings formula?

Melissa Korn at the Wall Street Journal dropped the news today that USNWR plans on changing its formula for the law school rankings:

In a letter sent Monday to deans of the 188 law schools it currently ranks, U.S. News said it would give less weight in its next release to reputational surveys completed by deans, faculty, lawyers and judges and won’t take into account per-student expenditures that favor the wealthiest schools. The new ranking also will count graduates with school-funded public-interest legal fellowships or who go on to additional graduate programs the same as they would other employed graduates.

This is a remarkable sea change. As I recently pointed out, I did not anticipate much of a match between the boycotting law schools’ tactics and complaints and the ultimate USNWR response. I pointed out earlier that many of the complaints were about information that was already publicly available. And sure enough, the per-student expenditures, which were not the subject of complaints but were the one thing USNWR could not independently collect without voluntary participation from schools, are the first on the chopping block.

So which schools might it benefit or adversely affect the most? Let’s take a look at these three categories, and one other variable I’ll mention at the end.

1. Reduced weight to reputational survey data. Reputational surveys get 40% of the overall weight in the rankings—a lot. That’s 25% for the “peer” survey (i.e., other law school survey respondents), and 15% from the “lawyer/judge” survey.

Let’s start with the “peer” survey. Among the current overall-rated “top 50” schools, here are the schools that would benefit the most from this change (i.e., those whose peer score has lagged their overall score). (And this is just a raw comparison of rank v. rank; there are more nuanced issues dealing with the weighted Z-scores, scaling, and the like for another day…. And, of course, differences can be exaggerated the farther down the rankings one goes, a reason I also confined this to the “top 50” for now.)

POTENTIAL WINNERS, PEER CHANGE

George Mason (65 v. 30)

BYU (52 v. 23)

Alabama (36 v. 25)

Florida (31 v. 21)

Utah (47 v. 37)

Wake Forest (47 v. 37)

Texas A&M (56 v. 46)

Georgia (36 v. 29)

How about the other side? That is, the schools that would be adversely affected the most from the change? Again, focusing just on the existing USNWR “top 50” for now:

POTENTIAL LOSERS, PEER CHANGE:

UC-Irvine (20 v. 37)

Washington (36 v. 49)

Colorado (36 v. 49)

UC-Davis (24 v. 37)

Wisconsin (31 v. 43)

Boston College (27 v. 37)

Emory (20 v. 30)

There are a few others in the middle of some interest, because near the top smaller variations matter more. NYU (3 v. 7) and Georgetown (11 v. 14) are harmed the most.

Now, the degree to which this benefits or harms a school entirely depends, of course, on how much USNWR chooses to reduce the weight of the category.

For schools presently outside the “top 50,” schools that stand to gain the most include Wayne State, Baylor, Penn State-Dickinson, Tennessee, and Penn State-University Park. Schools that stand to be harmed the most include Santa Clara, Howard, Brooklyn, Rutgers, Denver, Georgia State, American, and Hastings.
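The rank-v.-rank comparison above can be sketched simply. The ranks below are taken from this post’s own lists; the gap convention (peer rank minus overall rank, where a larger number is a worse rank) is my own shorthand for illustration, not USNWR’s methodology:

```python
# A school whose peer-survey rank lags its overall rank stands to gain if the
# peer survey is down-weighted; the reverse holds for schools whose peer rank
# outpaces their overall rank. Ranks are (peer_rank, overall_rank).
ranks = {
    "George Mason": (65, 30),
    "BYU": (52, 23),
    "UC-Irvine": (20, 37),
    "Emory": (20, 30),
}

# Positive gap: peer rank worse than overall rank -> likely winner from the change.
gaps = {school: peer - overall for school, (peer, overall) in ranks.items()}

winners = [s for s in sorted(gaps, key=gaps.get, reverse=True) if gaps[s] > 0]
losers = [s for s in sorted(gaps, key=gaps.get) if gaps[s] < 0]
print("winners:", winners)
print("losers:", losers)
```

As the post notes, this raw rank comparison exaggerates differences farther down the rankings, which is one reason the lists above were confined to the “top 50.”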

Now, over to the “lawyer/judge” survey. It’s a smaller percentage, and, again, the effect depends on how far the change in weight goes. For those who stand to gain the most:

POTENTIAL WINNERS, LAWYER/JUDGE CHANGE:

Texas A&M (90 v. 46)

Arizona (77 v. 45)

George Mason (54 v. 30)

BYU (43 v. 23)

Arizona State (49 v. 30)

Alabama (43 v. 25)

Utah (54 v. 37)

Maryland (64 v. 47)

Boston University (28 v. 17)

And who are likely to be adversely affected the most:

POTENTIAL LOSERS, LAWYER/JUDGE CHANGE:

Boston College (24 v. 37)

Washington & Lee (24 v. 35)

Washington (39 v. 49)

Wisconsin (33 v. 43)

William & Mary (20 v. 30)

Emory (20 v. 30)

It should be noted that schools appearing on both best/worst lists may have much more to be happy or unhappy about.

Outside the top 50, schools most likely to benefit include UNLV, Wayne State, Florida International, Hawaii, Georgia State, Penn State-University Park, and Arkansas. Schools most likely to be harmed include Howard, Oklahoma, Miami, Michigan State, Hastings, Lewis & Clark, Pittsburgh, and Case Western.

2. Eliminating per-student expenditures. Again, this is not publicly-available data, but it doesn’t take much effort to realize that the wealthiest schools, typically (but not always!) private schools, tend to be harmed the most by this change. Public schools (but not all!) are likely to benefit most. It was by far the biggest differentiator among many schools, giving schools like Yale and Harvard nearly insurmountable leads. Of course, it was also notoriously opaque and subject to manipulation.

If you want to consider the schools most adversely affected, it would be also useful to look at private law schools that have risen sharply in the rankings in recent years, or law schools that have had recent naming gifts or building/capital expenditures that allow an influx of spending.

By the way, it’s worth considering a new and different incentive for law schools situated within universities right now. Law schools could presently make the case to central administration that high spending on resources, including on law professor salaries, was essential to keeping one’s place in the rankings. No longer. It’s worth considering what financial incentive this may have on university budgets in the years ahead, and the allocation of resources.

3. Counting graduates with school-funded public-interest legal fellowships or who go on to additional graduate programs the same as other employed graduates. A couple of caveats. First, schools do not have to report whether the full-time, long-term, bar passage-required or JD-advantage jobs they subsidize are “public interest” or not, although one could suspect they mostly are. These categories are also very small overall. Among the 35,713 graduates in 2022, 427 were pursuing graduate degrees (1.1%), 276 had school-funded bar passage-required jobs (0.7%), and 56 (0.2%) had school-funded JD-advantage jobs. The change will take pursuing a graduate degree from lesser weight to full weight, and school-funded positions from lesser weight to full weight. (This is the only change directly responsive to some of the elite schools’ earliest complaints.)

But those positions are not all equitably distributed. Who has the most in these three categories, which I’m lumping together for present purposes:

Yale 17%

Stanford 8.7%

District of Columbia 8.3%

San Diego 8.3%

South Dakota 7.7%

BYU 7.6%

Berkeley 7.0%

UC-Irvine 6.6%

Harvard 6.2%

Penn 6.2%

It’s worth considering whether the totals in these positions will climb in future years as the incentives have changed.

4. What fills the gap. USNWR has just announced it would eliminate expenditure data (per-student expenditures are 9%; “indirect” expenditures are another 1%, if those are also on the chopping block; as to library resources, it remains unclear for now). It also announced it would give reduced weight to the reputational surveys, which total 40%.

How USNWR backfills its formula to get to 100% will matter tremendously to schools. If you are doing well in all other areas of the rankings, maybe not so much. But if you are a school with relatively poor admissions medians and strong employment outcomes (a great value for students, to be frank!), you benefit much more if the new formula gives more weight to the employment outcomes. And if you are a school with really strong admissions medians but relatively poor employment outcomes, you benefit much more if the new formula gives more weight to the admissions statistics. Time will tell. UPDATE: USNWR has disclosed that it intends to give “increased weight on outcome measures.”

In short, there’s much uncertainty about how it will affect many law schools. I feel fairly confident that a handful of the schools identified above as winners in several categories, including Alabama, BYU, Georgia, and Texas A&M, will benefit significantly in the end, but one never knows for sure. It also has the potential to disrupt some of the more “entrenched” schools from their positions, as the more “legacy”-oriented factors, including spending and the echo chamber of reputational surveys, will receive less value. Law schools must increasingly face the value proposition for students (e.g., lower debt, better employment outcomes), with some other potential factors in the mix, in the years ahead.

UPDATE: It’s worth adding that USNWR has indicated, “We will rank law schools in the upcoming rankings using publicly available data that law schools annually make available as required by the American Bar Association whether or not schools respond to our annual survey.” That suggests other data, including the “employed at graduation” statistics and the “indebtedness at graduation” statistics, at the very least, would also disappear.

Annual Statement, 2022

Site disclosures

Total operating cost: $192

Total content acquisition costs: $0

Total site visits: 58,935 (-10% over 2021)

Total unique visitors: 51,745 (-7% over 2021)

Total pageviews: 70,822 (-12% over 2021)

Top referrers:
Twitter (4396)
Revolver.news (4240)
Leiter’s Law School Reports (1494)
Instapundit (760)
Reddit (518)
TaxProf Blog (469)
How Appealing (386)
David Lat’s Substack (195)

Most popular content (by pageviews):
What does it mean to “render unto Caesar”? (May 3, 2020) (11,606)
Ranking the most liberal and conservative law firms among the top 140, 2021 edition (November 8, 2021) (9426)
Federal judges have already begun to drift away from hiring Yale Law clerks (March 19, 2022) (8911)
Ranking the most liberal and conservative law firms (July 16, 2013) (3143)
Some dramatic swings as USNWR introduces new bar exam metric (March 28, 2022) (3107)
California’s “baby bar” is not harder than the main bar exam (May 28, 2021) (1308)

I have omitted "most popular search results" (99% of search results not disclosed by search engine, very few common searches in 2022).

Sponsored content: none

Revenue generated: none

Disclosure statement

Platform: Squarespace

Privacy disclosures

External trackers: one (Google Analytics)

Individuals with internal access to site at any time in 2022: one (Derek Muller)

*Over the course of a year, various spam bots may begin to visit the site at a high rate. As they did so, I added them to a referral exclusion list, but their initial visits are not disaggregated from the overall totals. These sites are also excluded from the top referrers list. Additionally, all visits from my own computers are excluded.

What does the endgame look like for law schools refusing to participate in USNWR rankings?

Whether it’s Ralph Waldo Emerson or The Wire, the sentiment in the expression, “If you come at the king, you best not miss” is a memorable one. That is, if you seek to overthrow the one in charge of something, you must truly overthrow it. If you don’t, and merely wound or annoy, well, perhaps all you’ve done is make that person angry at you, and perhaps you’re in a worse place than when you began.

I’ve been puzzling over this sentiment over the last week as Yale and Harvard (and now a handful of other elite schools in tow) announce they will not “participate” in the USNWR rankings in the future.

The puzzle is this: so what? Or, what’s the endgame here?


To start, USNWR has announced it will continue to rank law schools. Not much of a surprise here. And, as I mentioned, much data USNWR uses is available from the American Bar Association or from its own internal collection.

Law schools refusing to participate, then, may do one of two things. First, they may “delegitimize” the rankings by refusing to participate and hope that prospective law students, employers, and others take note. A related form of “delegitimization” is asterisks beside “non-participating” schools where USNWR imputes data that it cannot otherwise obtain publicly, suggesting that the rankings are “tainted,” at least as far as these non-participating law schools are concerned.

I think there’s little likelihood of this happening because of the second reason, which I’ll get to in a moment. As more and more schools refuse to participate, the sense of the rankings being “tainted” is, well, less and less. Everyone’s doing the same thing (well, not everyone, and more on this a little later). The incentive, then, is for USNWR to treat everyone the same.

So, second, persuade USNWR to change its formula. As I mentioned in the original Yale and Harvard post, their three concerns were employment (publicly available), admissions (publicly available), and debt data. So the only one with any real leverage is debt data. But the Department of Education does disclose a version of debt metrics of recent graduates, albeit a little messier to compile and calculate. It’s possible, then, that none of these three demanded areas would be subject to any material change if law schools simply stopped reporting it.

Instead, it’s the expenditure data. That is, as my original post noted, the most opaque measure is the one that may ultimately get dropped: if USNWR chooses to go forward with its rankings, it would need to use only publicly-available data. That may mean expenditure data drops out.

Ironically, that’s precisely where Yale and Harvard (and many other early boycotters) excel the most. They have the costliest law schools and are buoyed in the rankings by those high costs.

So, will the “boycott” redound to the boycotters’ benefit? Perhaps not, if the direction is toward more transparent data.

Paul Caron has blogged about another interesting feature. Many of the schools that have joined the early boycott have fallen in USNWR in recent years, perhaps a suggestion, again, of a desire to “lock in” the present ranking. And many other schools that have publicly come out against the boycott (or who have been noticeably silent) have had advantageous boosts in recent years. So there are different incentives for those who want to change the rankings (by boycotting, forcing the hand) and those who do not (by maintaining the status quo).

Another brief point. While it’s mostly elite law schools that have “boycotted” so far, a few others have joined, including those whose “ranking” is more marginal and those who are the sole flagship law school in a state. In another sense, these schools also have little to “lose,” if you will, like the most elite schools whose reputations are firmly cemented in the existing hierarchical structure. A few schools (including a couple of HBCUs) have not given USNWR data for many years (to no particular public praise, for what it’s worth). Schools with a more marginal ranking have little value in trying to stay at, say, #110 or #124. And for sole flagship law schools, they also often cater to a different market of prospective students and employers.

Instead, we see a number of schools not boycotting who are generally, perhaps, “indistinguishable” to the prospective student or the employer, and who need an understanding of what the school’s “peers” may be as a reference point. USNWR offers this crudely and imperfectly, but it certainly adds this value in some ways.

Finally, what “boycotting” looks like. I openly asked whether schools “boycotting” intended to “boycott” the surveys sent around to law schools. It’s apparent that answer is no. “Boycotting” law schools submitted promotional material to prospective USNWR reputation survey voters (including me) even after announcing the boycott. So it’s a partial boycott, one in which schools intend to maintain as high a ranking as possible while also being recalcitrant in handing over data.

Mixed motives, many questions as Yale, Harvard "drop out" of the USNWR law rankings

November 16, 2022 may be a date marking a sea change in legal education, or a blip that will offer a few concessions and tradeoffs in the near future. But the announcement by Yale Law School, and the swift ensuing announcement (an act of conscious parallelism) from Harvard Law School, that they would, if you will, “drop out” of participating in the USNWR law rankings, is extraordinary. I wondered last year whether the weakened position of USNWR may result in a fatal blow to them. Perhaps that blow is here, now.

I’ve written a lot about USNWR law rankings, some of the good and some of the bad. So let’s walk through Yale’s and Harvard’s expressed justifications for dropping out.

First, employment metrics. From Yale:

One of the most troubling aspects of the U.S. News rankings is that it discourages law schools from providing critical support for students seeking public interest careers and devalues graduates pursuing advanced degrees. Because service is a touchstone of our profession, Yale Law School is proud to award many more public interest fellowships per student than any of our peers. These fellowships have enabled some of our finest students to serve their communities and the nation on our dime. Even though our fellowships are highly selective and pay comparable salaries to outside fellowships, U.S. News appears to discount these invaluable opportunities to such an extent that these graduates are effectively classified as unemployed. When it comes to brilliant students training themselves for a scholarly life or a wide-ranging career by pursuing coveted Ph.D. and master’s degrees, U.S. News does the same. Both of these tracks are a venerable tradition at Yale Law School, and these career choices should be valued and encouraged throughout legal education.

And from Harvard:

[T]he U.S. News methodology undermines the efforts of many law schools to support public interest careers for their graduates. We share, and have expressed to U.S. News, the concern that their debt metric ignores school-funded loan forgiveness programs in calculating student debt. Such loan forgiveness programs assist students who pursue lower paying jobs, typically in the public interest sector. We have joined other schools in also sharing with U.S. News our concern about the magazine’s decision to discount, in the employment ranking, professional positions held by those who receive public interest fellowships funded by their home schools. These jobs not only provide lawyers to organizations for critical needs, they also often launch a graduate’s career in the public sector.

The salient critique is a fair one. In the 2008-2010 era, schools often “concealed” the employment status of graduates, if you will (or so it was alleged at the time by a number of those “shedding light” on employment practices), by hiring their own graduates. That boosted a school’s overall employment rate, even if those positions were not meaningful careers at all—indeed, even if they were only short-term and part-time.

Once upon a time, USNWR treated all employment outcomes equally. And the ABA didn’t have granular employment data. No more. The ABA collects this granular data. USNWR gives “full weight” to full-time, long-term, bar passage-required and JD-advantage positions, but it discounts those positions if they are school-funded.

In a different era, this made more sense, but it is harder to justify discounting those positions if they are full-time, long-term. It’s a substantial investment (Yale indicates its public interest fellows received a $50,000 annual salary plus benefits) in a way that the short-term, part-time positions didn’t. In fact, those positions have all but dried up now that USNWR double-discounts them (they’re discounted for not being full-time, long-term positions, and they’re further discounted as school-funded positions). Total law school-funded, short-term, part-time positions, regardless of type of employment: Class of 2011, 964; Class of 2016, 165; Class of 2021, 59. From nearly 1000 such jobs to about 50 such jobs in a decade.

My recent look at the employment data suggests that a handful of elite schools (including Yale and Harvard) place a disproportionate number into these full-time, long-term, school-funded positions. At Yale, it’s at times over 10% of the class. At Harvard, it can exceed 20 students, a substantial number but smaller as a percentage basis at a school of Harvard’s size.

I think Yale and Harvard are right to critique this point of USNWR, but it’s fallen on deaf ears for many years.

Briefly, Yale mentions graduate programs, which, it might be argued, could be a refuge for some schools seeking to “conceal” a set of their graduates in, perhaps, that school’s own, say, master’s programs. The gist of this metric is to say that JD graduates should be pursuing careers in law, but Yale candidly sends more graduates to such programs (7, 2, and 10 in the last three graduating classes) than a typical cohort, and perhaps justifiably so. But USNWR certainly does discount those placements. It’s a problem that may be unique to Yale (albeit modest in scope).

Second, debt metrics. From Yale:

In addition, the rankings exclude a crucial form of support for public interest careers — loan forgiveness programs — when calculating student debt loads. Loan forgiveness programs matter enormously to students interested in service, as they partially or entirely forgive the debts of students taking low-paying public interest jobs. But the rankings exclude them when calculating debt even though they can entirely erase a student’s loans. In short, when law schools devote resources to encouraging students to pursue public interest careers, U.S. News mischaracterizes them as low-employment schools with high debt loads. That backward approach discourages law schools throughout the country from supporting students who dream of a service career.

. . .

[T]he way U.S. News accounts for student debt further undercuts the efforts of law schools to recruit the most capable students into the profession. To its credit, U.S. News has recognized that debt can deter excellent students from becoming lawyers and has tried to help by giving weight to a metric that rests on the average debt of graduating students and the percentage of students who graduate with debt. Yet a metric based on debt alone can backfire, incentivizing schools to admit students with the means to pay tuition over students with substantial financial need. A far better measure is how much financial aid a law school provides to its students, rewarding schools that admit students from low-income backgrounds and support them along the way. That crucial measure receives inadequate weight in the rankings.

And from Harvard:

[T]he debt metric adopted by U.S. News two years ago risks confusing more than it informs because a school may lower debt at graduation through generous financial aid, but it may also achieve the same effect by admitting more students who have the resources to avoid borrowing. The debt metric gives prospective students no way to tell which is which. And to the extent the debt metric creates an incentive for schools to admit better resourced students who don’t need to borrow, it risks harming those it is trying to help.

The indebtedness metric was introduced in 2021. I certainly had questions about its methodology. I also suggested there might be value in considering alternative metrics, including ones that considered earnings potential. Undoubtedly, however, these metrics sink schools like Yale and Harvard. They rightly point out that the metric could distort admissions toward well-resourced students over those who would have to take on debt. (I’ve chronicled the “first generation” issue here, too.)

That said, most students take on debt, and indebtedness is a real concern for graduates. First, I’m not sure the metric confuses students; I don’t think many schools principally fund their institutions through wealthy admittees in a way that distorts the average debt loads. Indeed, USNWR separates those who incur no debt from those who do, so the average debt load offers a more accurate picture for students who will end up incurring debt. Second, it does reflect the need-based or other scholarship-based opportunities for students at an output level: what will their finances look like after graduation?

But Yale identifies a related concern, one with some truth to it and a new question. If a school wants to repay the debts of graduates who perform public interest work (those going into private practice will have the ability to repay their loans, and the loans are a “good” investment in their future earnings), this indebtedness metric can’t track that. If Yale offers its own loan forgiveness, more generous than federal loan forgiveness, for public interest or government work, it solves an ability-to-pay problem, and a raw “indebtedness” metric doesn’t account for the fact that many of these graduates will never need to repay these loans.

And that raises a new question. How many graduates feel compelled to pursue such work because of their loans, as opposed to entering knowing they intend to have their loans later forgiven? That is, do high debt loads constrain the choices of students? Undoubtedly, lower loan totals free students up for more choices. There’s a benefit, then, in disclosing raw debt loads, even with a robust forgiveness program. But this is something of an unanswerable question.

Furthermore, the flip side of indebtedness is expenditures per student. This has been, perhaps, the bane of the USNWR formula for many years. Yale and Harvard have benefited tremendously from this metric for decades. They report exceedingly high costs per student, aided by their generous endowments, which has helped keep them atop the rankings. But expenditures are opaque data, not readily auditable. Worse still, they incentivize simply spending more money, not spending it more effectively or to any particular benefit of the students. For these schools to be so concerned about the indebtedness metric but not to speak a word about the expenditures metric (which has lasted much longer, is a much larger component of the rankings, and exacerbates much greater inequality between schools, particularly between private and public) rings a bit hollow.

Third, admissions. From Yale:

The U.S. News rankings also discourage law schools from admitting and providing aid to students with enormous promise who may come from modest means. Today, 20% of a law school’s overall ranking is median LSAT/GRE scores and GPAs. While academic scores are an important tool, they don’t always capture the full measure of an applicant. This heavily weighted metric imposes tremendous pressure on schools to overlook promising students, especially those who cannot afford expensive test preparation courses. It also pushes schools to use financial aid to recruit high-scoring students. As a result, millions of dollars of scholarship money now go to students with the highest scores, not the greatest need. At a moment when concerns about economic equity stand at the center of our national dialogue, only two law schools in the country continue to give aid based entirely on need — Harvard and Yale. Just this year, Yale Law School doubled down on that commitment, launching a tuition-free scholarship for students who come from families below the poverty line. These students overcame nearly insurmountable odds to get to Yale, and their stories are nothing short of inspiring. Regrettably, U.S. News has made it difficult for other law schools to eliminate the financial barriers that deter talented minds from joining our profession.

And from Harvard:

[B]y heavily weighting students’ test scores and college grades, the U.S. News rankings have over the years created incentives for law schools to direct more financial aid toward applicants based on their LSAT scores and college GPAs without regard to their financial need. Though HLS and YLS have each resisted the pull toward so-called merit aid, it has become increasingly prevalent, absorbing scarce resources that could be allocated more directly on the basis of need.

I want to set aside “expensive” test preparation for a moment, as there is lower-cost test prep than ever thanks to free services provided by LSAC and others. But I do want to discuss the LSAT/UGPA topic.

It’s quite obviously correct that schools have obsessively focused on the median LSAT and undergraduate GPA of incoming students and have directed scholarships toward those medians. That’s diverted resources from need-based aid. That said, it’s hard to know how some schools’ turning to need-focused aid would affect students who may have options elsewhere, at schools that are not need-focused.

But Yale’s note emphasizes a different issue: “the full measure of an applicant.” The ABA is on the verge of ending the requirement of an admissions test. It is entirely possible that schools that want to move toward a more “holistic” admissions process will need to find alternative pathways to measure applicants and do not want to rely on the LSAT. But as long as USNWR measures the LSAT, it will remain important for them to consider. Jettisoning the LSAT, then, requires a consideration of how to jettison USNWR.

Finally, the unstated reasons. As I’ve indicated, some of these reasons are more persuasive than others, but, of course, the timing matters too. Yale and Harvard each experienced an unprecedented drop in their peer scores this past year. Harvard dropped to a tie for 4th last year, down from its typical “top three” status. Some of that drop may be attributable to these rankings decisions; some may not.

So I’m watching closely to see how other schools respond. Frankly, there are benefits to USNWR rankings for law students. They roughly approximate school quality, albeit extremely imperfectly, and can give prospective students, especially those new to the legal profession, some rough guide to quality. Obviously, they also create bad incentives for law schools, and law students can too easily conflate rank with quality or over-rely on the rankings.

But given how difficult it is for the ABA to revoke accreditation, USNWR offers at least some quality control for law schools, or at least for some law schools. The bar exam is another form of quality control. We’ll see if other schools jump on board, and how it may affect school choices moving forward.

Furthermore, it’s worth considering how, if at all, USNWR responds. Much of the rankings is based on its self-collected data (peer scores) or data it could independently collect from the ABA. Other inputs, like debt and expenditures, are not so readily available. Some schools currently do not report to USNWR, but they are unranked. Will USNWR attempt to impute or estimate data for Yale and Harvard, then rank them anyway? Will it just drop them to the “unranked” category? Will it change its criteria to use only information it can collect independently? Time will tell.

Elite federal clerkships don't reflect the whole universe of student clerkship opportunities

Much has been written about disputes or “boycotts” of federal judges hiring clerks from particular law schools. But reviewing clerkships generally reveals that the debate right now is a niche subset of student employment opportunities.

It’s a very small subset of elite, “credentialed” clerkships in dispute at the moment. There are about 800 active federal judges, not counting the dozens, probably hundreds, of senior federal judges who remain active. They hire around 1,200 recent law school graduates into term (one- or two-year) positions each year, not counting the many post-graduate hires or career clerks.

Schools like Montana, Alabama, Kentucky, Memphis, and West Virginia routinely outplace NYU, Georgetown, and Columbia as a percentage of their graduates going on to federal clerkships right after graduation.

But the dialogue obsesses over Yale (which places around 50 graduates a year into federal clerkships, and many more after graduation) and a sliver of judges like Judge James Ho on the 5th Circuit. Why?

The subset of elite, “credentialed” clerkships.

One peril of the set of highly credentialed, very young former Supreme Court clerks nominated to the federal judiciary of late is increasingly sharp elbows among judges vying to be on the next Supreme Court “short list” (which, most recently for a Republican administration, appeared to include around 40 names, hardly short). That means increasing competition for elite, pedigreed clerks.

The discussions of these clerks, of hiring with an ideological valence, of reverence for judges’ former Supreme Court bosses, of being “feeders” of clerks to the Court, and so on, all run in this channel. But in truth, it’s a tiny fraction of the clerkship opportunities for law school graduates.

So many clerkships are about geographic fit, about helping launch the careers of new graduates in a federal territory where they’ll ultimately practice and be ambassadors for the court (not judge the judge). The vast majority of the federal docket, too, is not about abortion or other hot-button topics, but about grinding through 922(g) sentencings, suppression hearings, Social Security appeals, and immigration disputes. It’s a judge working very closely with a very small team to draft work product.

In the elite subset of credentialed clerkships, there’s a lot to say about judicial hiring practices, complaints about it, ideological screens and preferences, and so on. But the vast majority of the federal clerkship experience, and the work, is nothing like the debate over the narrow subset that’s attracting significant attention. To the extent there are calls for “reform,” I hope those in positions of authority are mindful of this disparity as the conversation continues to play out.

Multistate Bar Exam scores hold steady, remain consistent with recent low scores

It has been difficult to project much about the bar exam given changes in administration and the pandemic. The July 2022 exam would reflect three potentially significant things: law schools’ decision to move to pass-fail grading (particularly affecting 1L courses) in spring 2020; their decision to significantly reduce academic attrition for 1Ls in the summer of 2020; and their decision to offer remote learning options to the bulk of students taking the bar in July 2022.

Now the MBE scores have been released, and they show a slight drop from July 2021, but they remain consistent with scores between 2014 and 2019, and are certainly not an all-time low.

The score is comparable to last summer’s, but it remains near recent lows. It appears these disruptions did not materially affect bar passage rates (of course, it’s impossible to know how rates may have differed without these variables; perhaps they would have improved markedly, or stayed just the same). Of some interest: the number of test-takers declined somewhat notably, from 45,872 to 44,705.

Puerto Rico lowers its bar exam cut score in response to threats that its law schools may lose accreditation

Back in 2019, I assessed the potential effect of the American Bar Association’s revised Standard 316, which requires an “ultimate” bar passage rate of 75% within two years for a graduating class. There, I noted:

Let’s start with the schools likely in the most dire shape: 7 of them. While the proposal undoubtedly may impact far more, I decided to look at schools that failed to meet the standard in both 2015 and 2016; and I pulled out schools that were already closing, schools in Puerto Rico (we could see Puerto Rico move from 3 schools to 1 school, or perhaps 0 schools, in short order), and schools that appeared on a list due to data reporting errors.

Will state bars lower their cut scores in response?

It’s possible. Several state bars (like South Dakota as mentioned above) have lowered their cut scores in recent years when bar passage rates dropped. If states like California and Florida look at the risk of losing accredited law schools under the new proposal, they may lower their cut scores, as I suggested back in 2016. If the state bar views it as important to protect their in-state law schools, they may choose the tradeoff of lowering cut scores (or they may add it to their calculus about what the score should be).

The ABA Journal recently reported on the plight of two of Puerto Rico’s law schools that have failed to meet that standard for several years. Indeed, Pontifical’s pass rates have worsened fairly dramatically in recent years: 71% for 2017, 52% for 2018, and 46% for 2019.

That article tipped me off to changes in Puerto Rico’s bar exam cut score. Puerto Rico does not use the UBE or a standardized bar exam score, so their passing score of “596 out of 1000 points” doesn’t offer a whole lot of information. But the Supreme Court of Puerto Rico did choose to lower the cut score to 569.

A 2021 report offers some reasons to be skeptical of this change, after studying predictors and exam performance:

For both set of analyses completed, the results did support the hypothesis that the applicants in the more recent years were not as well prepared than the applicants in previous years. Average P-values for a common set of items declined over time, and when comparing specific test administration pairs, the pattern consistently saw applicants from earlier test administrations performing better.

. . .

The hypothesis that the steady decline in overall pass rate on the Puerto Rico Bar Examination is a result of applicants being less prepared for the examination is supported by the decline in performance on the 14 anchor items administered on every test administration.

The Supreme Court of Puerto Rico expressly considered the effect of the new ABA Standard 316 on Puerto Rico’s law schools as an impetus for change.

Ante la necesidad de determinar si, además de las medidas ya concretadas por el Poder Judicial para atender los efectos de la aplicación del Estándar de Acreditación 316 de la ABA en nuestra jurisdicción, era necesario disminuir o modificar la nota de pase de los exámenes de admisión al ejercicio de la profesión legal, en el 2020 la Oficina de Administración de los Tribunales (OAT) comisionó a la compañía ACS Ventures un análisis sobre este particular.

The standard-setting study for the cut score involved two rounds. One recommended a score of 584 (with a range of 574 to 594), and the other 575 (with a range of 569 to 581). The Supreme Court took the lowest point of these ranges, 569. Even with that score, the pass rate would still be just 46.4%, though better than the rate of closer to 33% under the present standard:

We recommend that the program consider a final passing score for the Bar Examination somewhere in the range of the recommended passing score (575) and a score that is two standard errors of the mean below this score (569). The rationale for this recommendation is that the reference point for the panelists during the study was the Minimally Competent Candidate and panelists made judgments to predict how these candidates would perform on the multiple-choice questions and essay questions for the examination. This means that the distribution of reference candidates was all intended to be minimally competent. In creating that distribution, the lower bound would likely best represent the threshold of minimum competency suggested by the panelists. Setting the passing score at 569 would mean that approximately 46.4% of candidates would pass the examination while setting the passing score at 575 would mean that approximately 41.5% of candidates would pass. This range is consistent with the recommendations of the panelists as characterizing the performance of the minimally competent candidate.

The ABA has given Puerto Rican law schools an extra three years to comply. The lower cut score will make that easier, although it remains unclear whether, even with this cut score, all schools will be able to meet the standard.

But it also shows how rarely the ABA actually enforces this standard, beyond continuing to give schools more time to demonstrate compliance. We’ll see what happens in the next three years.