Analysis of first-time bar passage data for Class of 2023 and ultimate bar passage data for Class of 2021

The ABA has released its new batch of data on bar passage. The data includes the first-time passage data for the Class of 2023 and the “ultimate” passage data for the Class of 2021. As I noted earlier, USNWR has increased the weight on bar passage as a metric (18% of the methodology is for first-time passage, 7% for ultimate), making it one of the biggest, and one of the most volatile, metrics.

To offer a snapshot of what the data means, I looked at both the first-time and ultimate passage data. I compared schools’ performance against their Class of 2022 and 2020 metrics. And I weighted the data the way USNWR does for a point of comparison.
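For readers who want to see the arithmetic, here is a minimal sketch of the approach, assuming USNWR standardizes each school’s rate into a z-score and applies the 18%/7% weights. The function names and the rates are my own hypotheticals, and the real USNWR rescaling is more involved.

```python
import statistics

def z_scores(values):
    """Standardize school-level rates into z-scores."""
    mean, sd = statistics.mean(values), statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def weighted_bar_score(first_time, ultimate, w_ft=0.18, w_ult=0.07):
    """Combine the two bar metrics at (roughly) USNWR's weights."""
    return [w_ft * f + w_ult * u
            for f, u in zip(z_scores(first_time), z_scores(ultimate))]

# Hypothetical rates for three schools, prior cohorts vs. new cohorts:
old = weighted_bar_score([0.80, 0.70, 0.60], [0.95, 0.90, 0.85])
new = weighted_bar_score([0.85, 0.68, 0.58], [0.97, 0.88, 0.84])
print([n - o for n, o in zip(new, old)])  # projected change in score
```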

Note that USNWR has not yet released its latest rankings for Spring 2024. That will include the Class of 2022 and 2020 metrics. This new batch of data will appear on rankings released in 2025.

Here are the schools projected to improve in this metric (which, again, under the current methodology, is 25% of the rankings) over the Classes of 2022 and 2020. The numbers below show the change in score; that is, they show how much a school is projected to improve or decline in the scoring. These are not the raw bar passage figures, which, as comparative metrics, are harder to interpret on their own. That said, these numbers are, in their own way, meaningless, as they are just one factor among several.

Pontifical Catholic University of P.R. 0.316441

Appalachian School of Law 0.2564

Texas Southern University Thurgood Marshall School of Law 0.230382

Widener University-Delaware 0.225445

Northern Kentucky University 0.201138

Stetson University College of Law 0.188107

Villanova University 0.179214

Miami, University of 0.158228

Kansas, University of 0.152897

Albany Law School 0.141105

Baltimore, University of 0.129179

Texas Tech University 0.127884

Southern Illinois University 0.127633

Cincinnati, University of 0.127493

Saint Louis University 0.122641

North Carolina Central University 0.118442

Pittsburgh, University of 0.116755

Memphis, University of 0.106309

Vanderbilt University 0.104692

Boston College 0.102051

Here are the schools projected to decline in this metric over the Classes of 2022 and 2020.

Willamette University -0.41276

New Hampshire, University of -0.3903

Illinois, University of -0.32793

Case Western Reserve University -0.32763

Florida A&M University -0.31228

Ohio Northern University -0.30423

City University of New York -0.25318

Kentucky, University of -0.20322

Southern University -0.18699

Missouri, University of -0.17881

Puerto Rico, University of -0.166

Seattle University -0.16593

Pennsylvania State-Dickinson Law -0.15274

Regent University Law School -0.14854

Tulsa, University of -0.14119

Colorado, University of -0.1361

Gonzaga University -0.1291

Cleveland State University College of Law -0.12842

California Western School of Law -0.12533

St. Thomas University (Florida) -0.12298

Does a school's "ultimate bar passage" rate relate to that school's quality?

After losing access to data it once used to assess the quality of law schools, USNWR had to rely on ABA data. And it was already assessing one kind of outcome there: the first-time bar passage rate.

It introduced the “ultimate bar passage” rate as a factor in this year’s methodology, with a whopping 7% of the total score. That’s a higher weight than the median LSAT score now carries. It’s also much higher than the weight given to the at-graduation employment rate in previous methodologies (4%).

Here’s what USNWR had to say about this metric:

While passing the bar on the first try is optimal, passing eventually is critical. Underscoring this, the ABA has an accreditation standard that at least 75% of a law school’s test-taking graduates must pass a bar exam within two years of earning a diploma.

With that in mind, the ultimate bar passage ranking factor measures the percentage of each law school's 2019 graduates who sat for a bar exam and passed it within two years of graduation, including diploma privilege graduates.

Both the first-time bar passage and ultimate bar passage indicators were used to determine if a particular law school is offering a rigorous program of legal education to students. The first-time bar passage indicator was assigned greater weight because of the greater granularity of its data and its wider variance of outcomes.

There are some significant problems with this explanation.

Let’s start at the bottom. Why did first-time bar passage get greater weight? (1) “greater granularity of its data” and (2) “its wider variance of outcomes.”

Those are bizarre reasons to give first-time bar passage greater weight. One might have expected an explanation (the right one, I think) that first-time bar passage is more “critical” (not merely “optimal”) for employment success, career earnings, efficiency, and a host of other reasons beneficial to students.

But, it gets greater weight because there’s more information about it?

Even worse, because of wider variance in outcomes? The fact that there’s a bigger spread in the Z-score is a reason to give it more weight?

Frankly, these reasons are baffling. But maybe no more baffling than the opening justification. “Passing eventually is critical.” True. But following that, “Underscoring this, the ABA has an accreditation standard that at least 75% of a law school’s test-taking graduates must pass a bar exam within two years of earning a diploma.”

That doesn’t underscore it. If eventually passing is “critical,” then one would expect the ABA to require a 100% pass rate. Otherwise, schools seem to slide by with 25% flunking a “critical” outcome.

The ABA’s “ultimate” standard is simply a floor for accreditation purposes. Very few schools fail this standard. The statistic, and the cutoff, are designed for a minimal test of whether the law school is functioning appropriately, at a very basic level. (It’s also a bit circular, as I’ve written about—why does the ABA need to accredit schools separate and apart from the bar exam if it’s referring to accreditation standards as passing the bar exam?)

And why is it “critical”?

USNWR gives “full credit” to J.D.-advantage jobs, not simply bar passage-required jobs. That is, its own methodology internally contradicts this conclusion. If ultimately passing the bar is “critical,” then one would expect USNWR to diminish the value of employment outcomes that do not require passing the bar.

Let’s look at some figures, starting with an anecdotal example.

The Class of 2020 at Columbia had a 96.2% ultimate bar passage rate. Pretty good—but good for 53d nationwide. The gap between 100% and 96.2% is roughly the gap between a 172 median LSAT and a 163 median LSAT. You are reading that correctly—this 4-point gap in ultimate bar passage is the same as a 9-point gap at the upper end of the LSAT score range. Or, the 4-point gap is the equivalent of the difference between a peer score of 3.3 and a peer score of 3.0. In other words, it’s a lot.
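To see why such a small raw gap can loom so large, here’s a back-of-the-envelope sketch. The standard deviations are illustrative assumptions I’ve chosen to reproduce the rough equivalence described above; they are not USNWR’s actual figures, and USNWR’s real rescaling involves more steps.

```python
def z_gap(a, b, sd):
    """Size of the gap between two values, in standard deviations."""
    return (a - b) / sd

# Ultimate bar passage clusters tightly near the top, so assume a small
# standard deviation (2.5 points); median LSAT is far more spread out.
print(z_gap(100.0, 96.2, 2.5))  # ~1.5 SDs for the 3.8-point passage gap
print(z_gap(172, 163, 6.0))     # ~1.5 SDs for the 9-point LSAT gap
```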

Now, the 16 students at Columbia (among 423!) who attempted the bar exam at least once but did not pass it within two years may say something. It may be that they failed four times, but that seems unlikely. It may be that they gave up—possible, but why give up? It could be that they found success in careers that did not require bar passage (such as business or finance) and, having failed the bar exam once, chose not to retake it.

It’s hard to say what happened, and, admittedly, we don’t have the data. Students who never take the bar are not included in this count. And so maybe there’s some consistency in treating the “J.D. advantage” category (i.e., jobs where passing the bar exam is not required) as a “full credit” position. But those who opt for such a job, half-heartedly try the bar, fail, and give up—well, they count against the “ultimate bar passage” rate.

Another oddity is the correlation between the first-time passage rate (that is, over- and under-performance relative to the jurisdiction) and the ultimate bar passage rate. It is good, but at 0.68, one might expect two different bar passage measures to be more closely correlated. And maybe it’s good not to have measures so closely bound with one another. But these are literally both bar passage categories, and they seem to be measuring quite different things.
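For the curious, that figure is a straightforward Pearson correlation. A sketch with hypothetical inputs (the 0.68 comes from the real school-level data), where first-time performance is measured as the school’s rate minus its jurisdiction’s rate:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

over_under = [0.10, -0.05, 0.02, -0.12, 0.07]  # hypothetical: school - state
ultimate   = [0.97, 0.88, 0.93, 0.85, 0.92]    # hypothetical ultimate rates
print(pearson(over_under, ultimate))
```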

(Note that including the three schools from Puerto Rico, which USNWR did for the first time this year, distorts this chart.)

You’ll see there’s some correlation, and it maybe tells some stories about some outliers. (There’s a caveat in comparing cohorts, of course—this is the ultimate pass rate for the Class of 2020, but the first-time rate for the Class of 2022.) Take NCCU. It is in a state with many law schools whose students have high incoming predictors and whose graduates pass the bar at high rates. NCCU appears to underperform relative to them on the first-time metric. But its graduates have a high degree of success on the ultimate pass rate.

So maybe there’s some value in offsetting some of the distortions for some schools that have good bar passage metrics but are in more competitive states. If that’s the case, however, I’d think that absolute first-time passage, rather than cumulative passage, would be the better metric.

Regardless, I think there’s another unstated reason for using this metric: it’s publicly available. Now that a number of law schools have “boycotted” the rankings, USNWR has had to rely on publicly available data. It took out some factors and devalued others. But here’s some publicly available data from the ABA. It’s an “output,” something USNWR values more now. It’s about bar passage, which USNWR is already looking at. It’s there. So it’s being used. That makes more sense than the purported justifications USNWR gives.

And it’s given 7% in the new rankings. That’s a shocking amount of weight to give this metric for another reason: do students actually rely on this figure?

When I speak to prospective law students (whether or not they’re planning to attend a school I’m teaching at), I have conversations about employment outcomes, yes. About prestige and reputation. About cost and about debt. About alumni networks. About geography. About faculty and class size.

In thirteen years of legal education, I’m not sure I’ve ever thought to mention to a student, “And by the way, check out their ultimate bar passage rate.” First time? Sure, it’s happened. Ultimate? Can’t say I’ve ever done it. Maybe that’s just reflecting my own bias. But I certainly don’t intend to start now. If I were making a list of factors I’d want prospective students to consider, I’m not sure “ultimate bar passage rate” would be anywhere on the list.

In any event, this is one of the more bizarre additions to the rankings, and I’m still wrapping my head around it.

Multistate Bar Exam scores hold steady, remain consistent with recent low scores

It has been difficult to project much about the bar exam given changes in administration and the pandemic. The July 2022 bar exam would reflect three potentially significant things: the decision of law schools to move to pass-fail grading in their courses (particularly affecting 1L courses) in Spring 2020; the decision of law schools to significantly reduce academic attrition for 1Ls in the summer of 2020; and the decision of law schools to offer a number of remote learning options for the bulk of law students taking the bar in July 2022.

Now the MBE scores have been released, and the scores are a slight drop from July 2021—but still consistent with scores between 2014 and 2019, and certainly not an all-time low.

The score is comparable to last summer’s, but it remains near recent lows. It appears that these disruptions did not materially affect bar passage rates (of course, it’s impossible to know how rates may have differed without these variables—perhaps they would have improved markedly, or remained just the same!). Of some interest: test-takers declined somewhat notably, from 45,872 to 44,705.

Puerto Rico lowers its bar exam cut score in response to threats that its law schools may lose accreditation

Back in 2019, I assessed the potential effect of the American Bar Association’s revised Standard 316, which requires an “ultimate” bar passage rate of 75% within two years for a graduating class. There, I noted:

Let’s start with the schools likely in the most dire shape: 7 of them. While the proposal undoubtedly may impact far more, I decided to look at schools that failed to meet the standard in both 2015 and 2016; and I pulled out schools that were already closing, schools in Puerto Rico (we could see Puerto Rico move from 3 schools to 1 school, or perhaps 0 schools, in short order), and schools that appeared on a list due to data reporting errors.

Will state bars lower their cut scores in response?

It’s possible. Several state bars (like South Dakota as mentioned above) have lowered their cut scores in recent years when bar passage rates dropped. If states like California and Florida look at the risk of losing accredited law schools under the new proposal, they may lower their cut scores, as I suggested back in 2016. If the state bar views it as important to protect their in-state law schools, they may choose the tradeoff of lowering cut scores (or they may add it to their calculus about what the score should be).

The ABA Journal recently reported on the plight of two of Puerto Rico’s law schools that have failed to meet that standard for several years. Indeed, Pontifical’s pass rates have worsened fairly dramatically in recent years: 71% for 2017, 52% for 2018, and 46% for 2019.

That article tipped me off to changes in Puerto Rico’s bar exam cut score. Puerto Rico does not use the UBE or a standardized bar exam score, so its passing score of “596 out of 1000 points” doesn’t offer a whole lot of information. But the Supreme Court of Puerto Rico did choose to lower the cut score to 569.

A 2021 report offers some reasons to be skeptical of this change, after studying predictors and exam performance:

For both set of analyses completed, the results did support the hypothesis that the applicants in the more recent years were not as well prepared than the applicants in previous years. Average P-values for a common set of items declined over time, and when comparing specific test administration pairs, the pattern consistently saw applicants from earlier test administrations performing better.

. . .

The hypothesis that the steady decline in overall pass rate on the Puerto Rico Bar Examination is a result of applicants being less prepared for the examination is supported by the decline in performance on the 14 anchor items administered on every test administration.

The Supreme Court of Puerto Rico expressly considered the effect of the new ABA Standard 316 on Puerto Rico’s law schools as an impetus for change.

Faced with the need to determine whether, in addition to the measures already taken by the Judicial Branch to address the effects of the application of ABA Accreditation Standard 316 in our jurisdiction, it was necessary to lower or modify the passing score of the examinations for admission to the practice of the legal profession, in 2020 the Office of Court Administration (OAT) commissioned the firm ACS Ventures to conduct an analysis of this issue.

A standard-setting study for the cut score had two rounds of standard-setting. One round recommended a score of 584 (with a range of 574 to 594), and the other 575 (with a range of 569 to 581). The Supreme Court took the bottom of the lower range, 569. Even with that score, the pass rate would still be only 46.4%, though that is better than the rate of closer to 33% under the existing cut score:

We recommend that the program consider a final passing score for the Bar Examination somewhere in the range of the recommended passing score (575) and a score that is two standard errors of the mean below this score (569). The rationale for this recommendation is that the reference point for the panelists during the study was the Minimally Competent Candidate and panelists made judgments to predict how these candidates would perform on the multiple-choice questions and essay questions for the examination. This means that the distribution of reference candidates was all intended to be minimally competent. In creating that distribution, the lower bound would likely best represent the threshold of minimum competency suggested by the panelists. Setting the passing score at 569 would mean that approximately 46.4% of candidates would pass the examination while setting the passing score at 575 would mean that approximately 41.5% of candidates would pass. This range is consistent with the recommendations of the panelists as characterizing the performance of the minimally competent candidate.
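The mechanics of the trade-off described in the study are simple: the pass rate is just the share of candidates scoring at or above the cut. A minimal sketch with hypothetical scores (the study’s 46.4% and 41.5% figures come from the actual score distribution):

```python
def pass_rate(scores, cut):
    """Share of candidates scoring at or above the cut score."""
    return sum(s >= cut for s in scores) / len(scores)

scores = [612, 540, 575, 569, 601, 555, 583, 520, 590, 566]  # hypothetical
for cut in (596, 575, 569):  # old cut score, then the two recommendations
    print(cut, f"{pass_rate(scores, cut):.0%}")
```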

The ABA has given Puerto Rican law schools an extra three years to try to comply. The lower cut score will make it easier to do so, although it remains unclear whether, even with this cut score, all schools will be able to meet the standard.

But it also shows how rarely the ABA actually enforces this standard, opting instead to keep giving schools more time to demonstrate compliance. We’ll see what happens in the next three years.

Comment on the ABA's proposal to end admissions tests as a requirement for law school admission

Earlier, I blogged about the ABA’s proposal to end the admissions test (typically, the LSAT) as a requirement for law school admissions. I’ve submitted a comment on the proposal, which you can read in its entirety here. The comment recommends disclosure of four pieces of information if the ABA accepts the proposal: the number of matriculants who do not have a standardized test score; the percentage of students receiving—and the 75th, 50th, and 25th percentile amounts of—grants among students admitted without a standardized test score; total academic attrition among students who lack a standardized test score; and the first-time and ultimate bar exam passage rates for students without a standardized test score. The comment explains why each item would be a useful disclosure.

You can view other comments here.

California audit reveals significant underreporting and underenforcement of attorney discipline

The full report is here. The National Law Journal highlights a few things:

In a review of the agency’s disciplinary files, acting state auditor Michael Tilden’s office found one lawyer who was the subject of 165 complaints over seven years.

“Although the volume of complaints against the attorney has increased over time, the State Bar has imposed no discipline, and the attorney maintains an active license,” the report said.

In another instance, the bar closed 87 complaints against a lawyer over 20 years before finally recommending disbarment after the attorney was convicted of money laundering.

It’s a pretty remarkable story that highlights two things worth considering for future investigation.

First, when Professor Rob Anderson and I highlighted the relationship between bar exam scores and ultimate attorney discipline rates, we could only draw on publicly available discipline records. In a sense, what we observed was a “tip of the iceberg.” Now, this could come out in a couple of different ways. On the one hand, it might be that the relationship is even stronger, and that attorney misconduct manifests earlier, if we had complete access to the kind of complaints that the California bar has. On the other hand, it might also be the case (as we point out in the paper) that some attorneys are better at concealing (or defending) their misconduct than others, and that might be hidden in the data we have. It would be a separate, interesting question to investigate.

Second, it highlights the inherent error in comparing attorney discipline rates across states. California’s process is susceptible to unique pressures and complications, as all states’ systems are. You cannot infer much from one state to another (unless you are looking at relative changes in states over time as a comparative benchmark), an inference some have (wrongly) attempted.

It will be interesting to see what comes out of the reforms proposed in California and if the effort improves public protection.

What happens if the ABA ends the requirement that law schools have an admissions test? Maybe less than you think

In 2018, the American Bar Association’s Council of the Section of Legal Education and Admissions to the Bar considered a proposal to drop the requirement of an admissions test for law schools. I wrote about it at the time over at PrawfsBlawg (worth a read!). The proposal did not advance. Many of those points still hold true, but I’ll look at how the new proposal differs and what might come of it. The proposal is still in its early stages. It’s possible, of course, that the proposal changes, or that it is never adopted (as the 2018 proposal wasn’t).

To start, many law schools currently admit a non-trivial number of students without the LSAT. Some of those admissions are with the GRE. A few are with the GMAT. Several schools admit students directly from undergraduate programs with a requisite ACT or SAT score. The GRE has gained more acceptance as a valid and reliable test for law school admissions, although how USNWR uses it in calculating its rankings is not how ETS recommends using the GRE.

The 2018 proposal concluded, “Failure to include a valid and reliable admission test as a part of the admissions process creates a rebuttable presumption that a law school is not in compliance with Standard 501.” The 2022 proposal is even more generous: “A law school may use admission tests as part of sound admission practices and policies.” No rebuttable presumption against.

There are varying levels of concern that might arise, so I’ll start with this point: I think inertia will keep many law schools using not just standardized tests but the LSAT in particular.

First, the most significant barrier preventing a “race to the bottom” in law school admissions: the bar exam. As it is, schools must demonstrate an ultimate bar passage rate of 75% within two years of graduation. That itself is a major barrier to dropping too low. Even there, many schools do not like an overly low first-time passage rate, and students take note of first-time bar passage rates, which have increased importance in the USNWR rankings.

Now, some states have been actively considering alternative paths to attorney licensing. My hunch—and it’s only a hunch—is that this move by the ABA may actually reduce the likelihood that state bars will consider alternative pathways to attorney licensing beyond the bar exam, such as versions of “diploma privilege.” If state bars are concerned that law schools are increasingly likely to admit students without regard to ability, state bars may decide that the bar exam becomes more important as a point of entry into the profession.

Of course, this isn’t necessarily true. If schools can demonstrate to the ABA, and perhaps to the state bars, that they are admitting (and graduating) students with the ability to practice law, then that could elevate trust. But state bar licensing authorities appear to have long distrusted law schools. We’ll see whether these efforts complicate proposals for bar exam reform, or instead foster closer working relationships between (in-state) law schools and bar licensing authorities.

In short, unless schools come up with adequate alternatives on the admissions front to address bar passage at the back end, it’s unlikely to be a drastic change. And it might be that efforts in places like Oregon, which focus on both the law school side and the consumer-facing, public-protection side, will assuage any such concerns.

Second, a less obvious barrier is legal employment. That’s a tail-end problem of the inability to pass the bar exam. But it’s also an independent concern among, say, large law firms or federal judges seeking to choose from among graduates with the highest legal ability. There are proxies for that, law school GPA or journal service among them. But the “prestige” of an institution also turns in part on its selectivity, measured in part by credentials like high LSAT scores. If firms or judges are less confident that schools are admitting the highest caliber of law students, they may begin to look elsewhere. This is a complicated and messy question (alumni loyalty, for instance, runs deep, and memories of institutional quality run long), but it may exert some pressure on law schools to preserve something mostly like the status quo.

Third, there’s a risk in how schools evaluate prospective students’ GPAs. For instance, it’s well known that many humanities majors applying to law school have disproportionately higher GPAs than their LSAT scores suggest, and that hard sciences majors have disproportionately lower GPAs than their LSAT scores suggest. The LSAT helps ferret out grade inflation and avoids grading biases across collegiate majors. It is not immediately clear that all admissions offices will grasp this point if the focus shifts more substantially to UGPA as the metric for admissions (a less accurate predictor of law school success than the LSAT, and less accurate still than the LSAT and UGPA combined).

Fourth, who benefits? At the outset, it’s worth noting that all schools will still indicate a willingness to accept the LSAT, and prospective students interested in the broadest swath of schools are still going to take the LSAT. Additionally, it’s likely that schools will continue to seek to attract high-quality applicants with merit-based scholarships, and LSAT (or GRE) scores can demonstrate that quality.

One group of beneficiaries is, for lack of a better word, “special admittees.” Many law schools admit a select handful of students for, shall we say, political or donor reasons. These students likely do not come close to the schools’ LSAT standards and would have the benefit of avoiding the test altogether. (Think of the Varsity Blues scandal.)

A second group of beneficiaries is law schools with a large cohort of undergraduates at a parent university that can channel students into the law school. Right now, schools are capped in how many students they can admit under such programs without an LSAT score, relying instead on a UGPA and some ACT or SAT requirement. The proposal would lift that cap.

Relatedly, pipeline programs become all the more significant. If law schools can develop relationships with undergraduate institutions or programs that can identify students who will be successful in law school upon completion of the program, it might be that the law school will seek to “lock” these students into the law school admissions pool.

In other words, it could most redound to the benefit of law schools with good relationships with undergraduate institutions, both as a channeling mechanism and as a way of preventing those students from applying to other schools (through a standardized test). We may see a significant shift in programming efforts.

There are some who may contend that racial minorities and those from socio-economically disadvantaged backgrounds will benefit, as they tend to score lower on standardized tests and bear the brunt of the cost of law schools adhering to standardized testing. That may happen, but I’m somewhat skeptical, with a caveat of some optimism. The LSAT is a good predictor of bar exam success (and of course, a great predictor of law school grades, which are a great predictor of bar exam success), so absent significant bar exam changes, there will remain problems if schools drop standardized testing in favor of metrics less likely to predict success. That said, if schools look for better measures in pipeline programs, things that prospective students from underrepresented communities can do that will improve their law school success, then it very well could redound to the benefit of these applicant pools and potentially improve diversification of the legal profession. But that will occur through alternative efforts that are more likely to predict success, efforts which we’re beginning to see but are hardly widespread.

Finally, what about USNWR? Unless many schools change, it seems unlikely that USNWR would drop the LSAT and GRE as a metric. Many schools, as noted, already admit a cohort that enters without any of the standardized test scores measured in the rankings.

But we can see how the rankings have been adjusted for undergraduate schools:

A change for the 2022 edition -- if the combined percentage of the fall 2020 entering class submitting test scores was less than 50 percent of all new entrants, its combined SAT/ACT percentile distribution value used in the rankings was discounted by 15 percent. In previous editions, the threshold was 75 percent of new entrants. The change was made to reflect the growth of test-optional policies through the 2019 calendar year and the fact that the coronavirus impacted the fall 2020 admission process at many schools.

. . .

. . . U.S. News again ranks 'test blind' schools, for which data on SAT and ACT scores were not available, by assigning them a rankings value equal to the lowest test score in their rankings. These schools differ from ones with test-optional or test-flexible admissions for which SAT and ACT scores were available and were always rank eligible.

It’s possible, then, that alternative rankings weights would be added to account for schools with increasing cohorts lacking standardized test scores. But as long as it remains a factor and the incentives remain, I imagine most law schools will continue to do everything in their power to maximize the medians for USNWR purposes.

*

In short, it’s quite possible that we’ll see a number of innovative developments from law schools on the horizon if the proposal goes through. That said, I think there are major barriers to dramatic change in the short term, with a concession that changes in other circumstances (including the bar exam, improved undergraduate or pipeline programs, and USNWR) could make this more significant in the future.

But I’d like to suggest two points of data collection that may be useful to examine the change. First, it would be useful if law schools, perhaps only those with more than 10% of their incoming class entering without standardized test scores, disclosed the attrition rates of those who had a standardized test score and those who did not. Second, it would be useful if they disclosed the first-time and ultimate bar passage rates of each cohort. I think this information would help demonstrate whether schools are maintaining high standards, both in admission and in graduation, regardless of the source of admission. But law schools already disclose an extraordinary amount of information, and perhaps these figures will just be quietly disclosed to the ABA during reaccreditation rather than in some public-facing capacity.

USNWR has erratically chosen whether "statewide bar passage" rate includes only ABA-approved law schools over the years

I was directed to the fact that the new USNWR bar exam metric includes “the weighted state average among ABA accredited schools' first-time test takers in the corresponding jurisdictions in 2020.” “ABA accredited” was added. Didn’t the first-time bar exam passage rate include only ABA accredited schools in the past?

Previous methodology looked at the modal state where a law school’s graduates took the bar exam, and the “jurisdiction's overall state bar passage rate for first-time test-takers in winter and summer” of that year.

I looked at the 2022 rankings (released in 2021, using the 2019 bar exam data). I picked California, known for its significant cohort of non-ABA test-takers. The overall first-time pass rate was 59%, but the first-time pass rate among ABA accredited schools was 69%. (Historical stats are here.) USNWR used the 59% rate.

That first surprised me. I had assumed USNWR used only ABA accredited data. It also made me think that California schools would be harmed the most by this shift in metrics (even if I think it’s more accurate). That’s because California schools are less likely to “overperform” if the comparison pass rate is higher (e.g., using only ABA accredited test-takers instead of all test-takers).

But then I dug further.

The 2021 rankings (released in 2020, using 2018 bar exam data) reported California’s first-time bar pass rate as 60%. The ABA first-time rate was 60%. But the overall rate was 52%. So in this year, USNWR used only ABA accredited schools.

The 2020 rankings (released in 2019, using 2017 bar exam data) reported a first-time pass rate of 58%. That’s the same as the overall first-time pass rate of 58%, not the 66% from ABA accredited law schools. So in this year, USNWR used overall first-time pass rates. And it appears USNWR did the same in 2019 (released in 2018, using 2016 bar exam data).

In short, there does not appear to be any reason why USNWR has used one method or another over the years. Certainly, this year it is expressly using only ABA data, and maybe it intends to stick with that going forward. But it’s another, subtle change that could adversely affect those schools (e.g., California) with a significant cohort of non-ABA test-takers. It’s probably the right call. But it also highlights the inconsistency of USNWR in its methodology over the years.

Some dramatic swings as USNWR introduces new bar exam metric

The latest USNWR law school ranking has some significant swings in the bar exam component. It made three significant changes: increasing the weight from 2.25% to 3%; measuring “all graduates who took the bar for the first time”; and including graduates who were admitted via diploma privilege in both a school’s passers and the overall passers. From the methodology:

Specifically, the bar passage rate indicator scored schools on their 2020 first-time test takers' weighted bar passage rates among all jurisdictions (states), then added or subtracted the percentage point difference between those rates and the weighted state average among ABA accredited schools' first-time test takers in the corresponding jurisdictions in 2020. This meant schools that performed best on this ranking factor graduated students whose bar passage rates were both higher than most schools overall, and higher compared with what was typical among graduates who took the bar in corresponding jurisdictions.

For example, if a law school graduated 100 students who first took the bar exam – and 88 took the Florida exam, 10 the Georgia exam and two the South Carolina exam – the school's weighted average rate would use pass rate results that were weighted 88% Florida, 10% Georgia and 2% South Carolina. This computation would then be compared with an index of these jurisdictions' average pass rates – also weighted 88-10-2. (For privacy, school profiles on usnews.com only display bar passage data for jurisdictions with at least five test-takers.) Both weighted averages included any graduates who passed the bar with diploma privilege. Diploma privilege is a method for J.D. graduates to be admitted to a state bar and allowed to practice law in that state without taking that state's actual bar examination. Diploma privilege is generally based on attending and graduating from a law school in that state with the diploma privilege.

In previous editions, U.S. News divided each school's first-time bar passage rate in its single jurisdiction with the most test-takers by the average for that lone jurisdiction. This approach effectively excluded many law schools' graduates who took the bar. Dividing by the state average also meant the location of a law school impacted its quotient as much as its graduates' bar passage rate itself. The new arithmetic accounts for average passage rates across all applicable jurisdictions as proxy for each exam's difficulty and reflects that passing the bar is a critical outcome measure in itself.
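USNWR’s own 88/10/2 example is easy to reproduce. Here is a minimal sketch, with hypothetical pass rates (only the taker split comes from the methodology’s example), computing a school’s weighted rate, the weighted jurisdiction index, and the percentage-point difference between them:

```python
def weighted_rate(rates, takers):
    """Weight jurisdiction-level pass rates by where graduates tested."""
    total = sum(takers.values())
    return sum(rates[j] * takers[j] / total for j in takers)

takers = {"FL": 88, "GA": 10, "SC": 2}               # from USNWR's example
school_rates = {"FL": 0.80, "GA": 0.90, "SC": 0.75}  # hypothetical
state_avgs   = {"FL": 0.72, "GA": 0.78, "SC": 0.70}  # hypothetical

school = weighted_rate(school_rates, takers)
index = weighted_rate(state_avgs, takers)
print(school, index, school - index)  # over/under-performance vs. the index
```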

The new methodology really changes the results for two kinds of schools. (The increase in the weight from 2.25% to 3% obviously also benefits schools that do well and further harms schools that do poorly.)

First, it benefits good schools in jurisdictions with tougher bars and strong out-of-state placement.

Second, it harms Wisconsin’s two law schools.

Let’s start with the first. Which schools benefited most from 2022 (measuring the 2019 bar) to 2023 (measuring the 2020 bar)? (These charts exclude a handful of schools that did not include their bar passage statistics this time around.)

| School | Pass rate 2019 | Jurisdiction | Jurisdiction rate | Cumulative pass rate 2020 | Cumulative jurisdiction rate | USNWR score delta |
| --- | --- | --- | --- | --- | --- | --- |
| San Francisco | 38.7% | CA | 59% | 78.4% | 78% | 0.0497 |
| William & Mary | 86.7% | VA | 78% | 96.9% | 81% | 0.0323 |
| Washington & Lee | 80.0% | VA | 78% | 92.6% | 81% | 0.0317 |
| Emory | 84.5% | GA | 77% | 91.7% | 78% | 0.0295 |
| Minnesota | 94.0% | MN | 81% | 98.9% | 82% | 0.0279 |
| Georgia | 94.5% | GA | 77% | 94.4% | 76% | 0.0272 |
| Kentucky | 78.4% | KY | 75% | 90.6% | 80% | 0.0266 |
| Montana | 88.9% | MT | 85% | 92.5% | 82% | 0.0255 |
| Penn State-Dickinson | 88.5% | PA | 80% | 91.7% | 79% | 0.0249 |
| Drexel | 77.1% | PA | 80% | 84.0% | 78% | 0.0249 |

On the left are each school’s pass rate in 2019 in its modal jurisdiction and that jurisdiction’s pass rate. Next are the cumulative pass rate in 2020 and the cumulative jurisdiction rate. Finally is the delta in the USNWR score—how much better the school did this year compared to last year in the weighted Z-score.

(I noted last year that we saw major swings at some schools in 2020. We see how those are playing out here.)

The University of San Francisco saw a tremendous improvement in California of almost 40 points (aided in part by a lower cut score in California in 2020). But the next three schools are telling. William & Mary and Washington & Lee are strong schools in a very tough bar exam market (Virginia is one of the toughest bars in the country), and Emory is in Georgia, an above-average-difficulty bar. Each did reasonably well in 2019. But when adding in performances in other jurisdictions, their scores climbed. ABA data shows W&M went 15-for-15 in DC, 15-for-15 in Maryland, and 10-for-10 in New York. All were excluded in the old metrics; all are easier bars than Virginia. W&L grads went 13-for-14 in DC, 13-for-13 in North Carolina, and 9-for-10 in New York. Emory went 21-for-21 in New York and 11-for-11 in Florida.

In other words, a diffuse and successful bar exam test-taking base redounds to the benefit of these schools.

Let me add one more detail. The new methodology puts law schools closer to parity with one another when comparing bar passage rates, especially those outside the “outliers.” The more graduates a school has taking the bar, across jurisdictions, the less the difficulty of any one bar matters in the end; and the inclusion of “diploma privilege” (or adjacent) admissions lifts the results. The 2019 “denominator” of the bar exam ranged from 55% at the low end of law schools (i.e., Maine) to 87% at the top end (i.e., Kansas), a gap of 32 points. That shrank a bit in 2020 with the new methodology, to a range from 70% to 99% (29 points). But the difference between the 10th and 90th percentiles shrank significantly, from 2019 (61% and 81%, 20 points) to 2020 (75% and 86%, 11 points). In other words, the difference between the 19th and 168th law schools in terms of their “jurisdiction pass rate” was about half as large under this year’s “overall pass rate” as it was last year.
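A quick sketch of that spread calculation, assuming a list of each school’s “jurisdiction pass rate.” The rates below are hypothetical; the real 2019 figures put the 10th and 90th percentiles at 61% and 81%.

```python
import statistics

def decile_spread(rates):
    """Return the 10th percentile, 90th percentile, and the gap between them."""
    qs = statistics.quantiles(rates, n=10)  # cut points at 10%, 20%, ..., 90%
    return qs[0], qs[-1], qs[-1] - qs[0]

rates_2019 = [0.55, 0.61, 0.65, 0.70, 0.74, 0.78, 0.81, 0.83, 0.87]  # hypothetical
print(decile_spread(rates_2019))
```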

Let’s look at the worst-performing schools.

| School | Pass rate 2019 | Jurisdiction | Jurisdiction rate | Cumulative pass rate 2020 | Cumulative jurisdiction rate | USNWR score delta |
| --- | --- | --- | --- | --- | --- | --- |
| Western State | 56.7% | CA | 59% | 51.7% | 78% | -0.0688 |
| Ohio Northern | 95.7% | OH | 79% | 66.7% | 81% | -0.0658 |
| Golden Gate | 43.9% | CA | 59% | 44.1% | 78% | -0.0620 |
| Faulkner | 81.8% | AL | 77% | 60.7% | 79% | -0.0584 |
| Marquette | 100.0% | WI | 71% | 98.2% | 99% | -0.0538 |
| Southern Illinois | 59.4% | IL | 79% | 50.6% | 82% | -0.0513 |
| Wisconsin | 100.0% | WI | 71% | 100.0% | 99% | -0.0497 |
| CUNY | 74.5% | NY | 74% | 66.7% | 86% | -0.0493 |
| Pepperdine | 81.0% | CA | 59% | 78.6% | 78% | -0.0455 |
| Pace | 76.0% | NY | 74% | 69.6% | 85% | -0.0422 |

You can see that several schools performed worse, or relatively worse, compared to their 2019 figures (again, consistent with what I noted earlier about major swings at some schools in 2020). But note the outliers. Marquette (98.2%) and Wisconsin (100%) both have extraordinarily high bar passage rates, due principally to in-state diploma privilege.

In the past, this redounded to their benefit, as ordinary test-takers who actually sat for the bar exam passed at rates substantially lower than 100% (see the 71% jurisdiction rate in 2019), giving these two schools a huge advantage. The new USNWR methodology, however, includes all of those diploma privilege admittees as “passers” in the cumulative jurisdiction pass rate, too. Wisconsin and Marquette used to perform 30 points above the average; they’re now basically at the average.
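A sketch of that shift, using the rates from the table above. As I understand it, the old approach compared a school to the examinee-only state average, while the new approach counts diploma-privilege admittees as passers on both sides:

```python
def relative_performance(school_rate, jurisdiction_rate):
    """How far a school sits above or below its jurisdiction's rate."""
    return school_rate - jurisdiction_rate

# Old metric: Wisconsin's ~100% (diploma privilege) against the 71% rate
# among graduates who actually sat for the exam.
print(relative_performance(1.00, 0.71))  # ~+29 points above "average"

# New metric: privilege admittees count in the state average too (~99%).
print(relative_performance(1.00, 0.99))  # ~+1 point: basically at average
```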

In one sense, there’s a greater honesty to the metric in comparing similarly-situated graduates to one another. But it comes at the cost of punishing two schools whose graduates are all (or nearly all) immediately able to practice law. That’s a tremendously valuable outcome for law students.

It might be beneficial for USNWR to instead include two factors, absolute passers and relative passers (like this one). Some (especially California deans!) critique an “absolute” passer rate that fails to account for the difficulty of the bar. But if we care about law students’ ability to practice law, it seems to me that it’s important to capture whether schools are successfully getting their graduates there, regardless of how hard or easy the bar exam is. (Of course, relative performance also should matter, I think, at least to some degree, as it suggests that some schools are improving opportunities for their graduates.) I confess, others would disagree.

How did other schools, like those in Utah, Washington, or Oregon, not perform much better or worse despite emergency “diploma privilege” being introduced? Recall it’s a mixed bag, depending on the school and the state, on history and out-of-state test-takers. February 2020 did not have such exemptions, and those results are partially included in the figures above. Utah and Oregon still had a decent set of in-state test-takers, as diploma privilege did not extend to everyone—but schools in those states didn’t see such dramatic changes in overall passing rates, as in both states privilege was keyed to pre-set levels of test-taker success (86%, with an exception in Oregon for in-state schools), meaning most people taking the test would have passed anyway. Washington, in contrast, opened up diploma privilege to essentially all test-takers, and the corresponding increase in passers put the University of Washington near the bottom of changes from 2019 to 2020 (suffering what Wisconsin and Marquette experienced this year).

It’s a seemingly small change in methodology, and it’s hard to know what a number like “0.0497” means to an overall score. But it’s worth identifying that the changes are not value-neutral and can affect similarly-situated schools quite differently.