In 2025, law school admissions practices continue to look at the LSAT like it's 2005

The LSAT is an important predictor of law school success: the higher your LSAT score, the higher your law school grades are likely to be. The correlation is not perfect, but it is strong. When combined with your undergraduate grade point average (UGPA)—yes, regardless of your major, grade inflation, school disparities, and all that—it predicts law school success even better.

But the LSAT has changed over the years. As has its weight in the USNWR rankings. Many law school admissions practices, however, look at the LSAT like it’s 2005—as if the scores mean what they did back then, and as if the USNWR rankings weight them as they did back then. A lot has changed in a generation.

Changes to reporting of LSAT scores and repeaters

The Law School Admission Council, which administers the LSAT, has long maintained that the most accurate predictor of law school grade point average (LGPA) is the average of all LSAT scores. If a prospective student takes the LSAT twice, and scores a 164 and a 168 on those two tests, the most accurate way of evaluating that student is to conclude that the student received a 166. And schools reported the average score to the American Bar Association.

But in 2006, the ABA changed what schools need to report. It allowed schools to report the highest LSAT score rather than the average of LSAT scores. For this student, then, the reported figure is no longer a 166 but a 168—even though the higher score is less accurate in terms of the predictive value of the LSAT.

This change incentivized repeat test-taking. Back in 2010, about two-thirds of test-takers took the exam only once. More would now be incentivized to repeat the exam. But there was an upper limit: LSAC administered the exam only four times a year, and it permitted test-takers to repeat only three times in a two-year period.

But in 2017, LSAC lifted that cap and allowed unlimited retakes—it has since brought the number down to five in a “reportable score period” (around six years) and seven overall, but that is still much more than three. It now offers around eight administrations of the LSAT each year, up from four.

Granted, the total number of people inclined to take the exam five times is quite small. But repeaters continue to climb. In 2023-2024, repeat test-takers accounted for a majority of the LSATs administered.

Additionally, about 20% of test-takers do not have a “reportable score.” That is, under an option LSAC developed over the last two decades, you can “peek” at your score and decide to cancel after seeing it, preventing schools from seeing the score. (This is mostly irrational behavior from prospective students, because schools will still report the highest score received, but it further muddies the waters in identifying the difference between the highest score and the average score.)

So, if you are a law school inclined to use the LSAT as a predictive tool of student success, you would want to use the average score. But now that the ABA permits reporting the highest score—and because the USNWR rankings likewise use that score—all of the incentives are to rely on the highest score in admissions decisions, even if it is less accurate at predicting student success. True, gains among repeaters tend to be modest, about 2 points for a typical test-taker. But, as I’ll show, these changes have cumulative effects.

Importantly, when LSAC measures the validity of the LSAT, it still uses the average score. Law schools, however, typically use the highest score—and therefore opt for a less valid measure of the LSAT.

(Likewise, the LSAT is more valid at predicting success when combined with UGPA in an “index score,” but most law schools do not use it this way either, again choosing a less valid way of relying on it—more on that later.)
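To make the averaging and index ideas concrete, here is a minimal sketch in Python. The index coefficients are hypothetical placeholders—LSAC derives school-specific weights by regressing first-year grades on LSAT and UGPA—so treat this as an illustration of the mechanics, not LSAC’s actual formula.

def average_lsat(scores):
    # LSAC's validity research treats the average of all scores as the
    # best single-number summary of a repeater's LSAT performance.
    return sum(scores) / len(scores)

def index_score(lsat, ugpa, w_lsat=0.022, w_ugpa=0.64, intercept=-1.75):
    # Hypothetical coefficients for illustration only; real coefficients
    # are school-specific and estimated by LSAC from each school's grades.
    return w_lsat * lsat + w_ugpa * ugpa + intercept

scores = [164, 168]
print(average_lsat(scores))                    # 166.0 -- not the 168 schools report
print(index_score(average_lsat(scores), 3.7))  # the combined LSAT/UGPA predictor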

Extra time to take the test

Any mention of accommodations in test-taking is a fraught topic. But set aside whatever preferences you may have about the relationship between accommodations and test-taking. I want to point out what accommodations—specifically, extra time on the exam—mean for using the LSAT as a predictive tool.

Data from LSAC show that accommodated test-takers receive higher scores than non-accommodated test-takers, by around four to five points. Most accommodations translate into LSAT scores that still predict law school success—for instance, a visually-impaired person receiving large-print materials will receive a score that fairly accurately predicts law school success. Time-related accommodations are the exception: LSAT scores tend to overpredict law school success when there are time accommodations. Requests for additional time have increased dramatically over the years, from around 6,000 granted requests in 2018-2019 to around 15,000 in 2022-2023.

My point here is certainly not to debate accommodations in standardized testing; it is to point out that additional time on the LSAT makes it less predictive, and that there has been a dramatic increase in such test-takers. In 2014, the Department of Justice entered into a consent decree with LSAC to stop “flagging” such LSAT scores. So there remains a cohort of LSAT scores, increasing by the year, that are less predictive of law school success.

A change in test composition

In 1998, a technical study from LSAC looked at each of the three components of the LSAT—analytical reasoning (sometimes called “logic games”), logical reasoning, and reading comprehension. The LSAT overall predicted first-year LGPA, and each individual component contributed to that overall predictive power. But in 2019, LSAC entered into a consent decree over a challenge that the analytical reasoning section ran afoul of federal and state accommodations laws. And in 2023 it announced the end of that section.

I have not yet seen any subsequent technical report from LSAC (perhaps it’s out there) explaining how it concluded that a test without logic games could be just as valid a predictive measure, particularly in light of its 1998 report. But certainly anecdotes, like this one in the Wall Street Journal, suggest some material changes:

Tyla Evans had almost abandoned her law-school ambitions after struggling with the logic games section. “When I found out they were changing the test, I was ecstatic,” said Evans, a 2023 George Washington University graduate. Her LSAT score jumped 15 to 20 points on the revised test, enabling a second round of applications. So far, she has received two sizable financial-aid offers and is waiting to hear from a few more schools. 

The LSAT has fundamentally changed in its content, which suggests that scores today are not truly comparable to scores from previous eras, and that such scores will be less predictive of success.

Opting out of the test

One more slight confounding variable, although its effect on LSAT scores is more indirect, so I’ll mention it only briefly. More students are attending law school without having taken the LSAT. This comes from a variety of sources—increasing cohorts of students who come directly from the law school’s parent university without an LSAT score, alternative admissions tests like the GRE, and so on. Publicly disclosed LSAT quartiles, then, conceal a cohort of students who have selected out of taking the LSAT. It is hard to know precisely how this affects the overall composition of LSAT test-takers, but it is one more small detail to note.

What the rankings use

A run-of-the-mill admissions office should care about LSAT scores as a predictor of law school success. Several developments in the last generation have diluted the power of the LSAT as a predictor—reliance on the highest score for repeat test-takers, effectively unlimited retakes, additional time for some test-takers, a change in content, and a change in the cohort taking the exam. Nevertheless, despite all those changes, it is still a good predictor, or at least better than the alternatives.

But this admissions office probably cares a great deal about something else too—law school rankings. Rightly or wrongly—again, not a debate for this post—law school rankings, particularly the USNWR rankings, exert a great deal of influence on how prospective law students perceive schools. Even marginal changes can significantly influence admissions decisions and financial aid costs, not to mention the effect on students, alumni, faculty, and the university, which treat the rankings (again, rightly or wrongly) as a barometer of the law school’s trajectory and overall health.

But USNWR does two important things with LSAT scores, one very public and recent, one more subtle and longstanding but often misunderstood.

First, USNWR has long used the median LSAT of an incoming class as a benchmark of the overall quality of the class (a decision long known to distort how law school admissions offices conduct their admissions practices, as the “bottom” credentials of the incoming class look very different from the “top”—more on that in a bit). But it recently changed its formula to focus more on outputs instead of inputs. That meant the weight it gave to the median LSAT score dropped from 11.25% of the rankings to just 5%.

A related, and subtle, change is the significant weight now given to employment outcomes—33% of the rankings. It is not only a large category, but one with a huge spread from top to bottom. That has the effect of diminishing the value of the other categories. Let me offer one comparison I raised earlier, in very simple terms: moving from a school ranked around 100th in the employment rankings to around 50th, which typically means getting full-time legal jobs for an additional 3-4% of the class, is actually a bigger deal than moving from a 153 median LSAT score to a 175 (which is a massive shift).

In short, the LSAT scores matter far less in the new rankings than they did before.

Second, USNWR has long used the percentile equivalents of LSAT scores, not the raw scores themselves. This is deceptive to the outsider, because there is such an emphasis on the raw LSAT score. But as scores get higher, each additional point reflects a narrower and narrower improvement in how the rankings perceive the incoming class.

LSAT scores look like a bell curve. Most scores fall in the middle of the distribution, around 150 to 152. At each end of the curve are increasingly small numbers of students at each score. So the move from 160 to 161 reflects a more significant improvement, relative to others, than the move from 170 to 171.

A quick chart, using a recent LSAC percentile equivalent table, will help illustrate the point. The gap from 173 to 172 is small, 0.65 percentage points. From 168 to 167, larger: 1.55 percentage points. From 164 to 163, still larger: 2.5 percentage points. And from 158 to 157, 3.42 percentage points.
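Here is the same arithmetic as a minimal Python sketch. The percentile values are hypothetical, chosen only to be consistent with the gaps quoted above; a real analysis would substitute LSAC’s published percentile equivalent table.

# Hypothetical score-to-percentile values, consistent with the gaps quoted
# above; swap in LSAC's published table for real work.
pct = {157: 70.58, 158: 74.00, 163: 87.20, 164: 89.70,
       167: 94.20, 168: 95.75, 172: 98.30, 173: 98.95}

def one_point_gap(score):
    # Percentage points of test-takers separating a score from one point below.
    return pct[score] - pct[score - 1]

for s in (173, 168, 164, 158):
    print(f"{s} vs {s - 1}: {one_point_gap(s):.2f} percentage points")
# Gaps grow as you move down the scale: 0.65, 1.55, 2.50, 3.42.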

This plays out the same way in the USNWR rankings (roughly, and subject to, among other things, the fact that USNWR folds GRE and other scores into some schools’ percentile equivalencies). Using my modeled rankings measure, where scores are scaled against other schools, the unweighted gap between a 173 and a 172 is around 0.032 raw points; 168 to 167, 0.077; 164 to 163, 0.123; and 158 to 157, 0.142. (The contrast between 164-163 and 158-157 is not so dramatic here, because law schools cluster higher on the LSAT curve than test-takers overall.)

These unweighted numbers are meant to give you some raw-point comparisons against each other. But recall that these figures receive only 5% weight in the overall rankings. So that 0.032 actually converts to 0.0016—a fraction of a hundredth of a point, a rounding error in many circumstances. Without getting into all of the details of how USNWR otherwise creates a rankings formula, it is, quite clearly, a very, very small number.

What this means, then, is that law schools perform “better” in the rankings each time the median LSAT of their incoming class increases, but the marginal value of each additional point increase diminishes.

How often can changes to LSAT median scores alter a school’s ranking?

So, the big question is this. If many schools (and most schools pursuing a more “elite” ranking) care about LSAT medians, what tangible difference would a change in a median LSAT score make to a school’s ranking?

This should be a very straightforward question for most law schools: build a very basic model, then run a cost-benefit analysis. It takes a lot of effort to pursue a median LSAT score of X (including skewed admissions decisions, financial aid costs, etc.). Is it worth it?
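Here is a stylized sketch of that basic model. Every number in it is a placeholder—the aid cost, the value a school puts on a one-spot move, and the probability that a median point actually moves the rank (the kind of probability my simulations below try to estimate).

def expected_net_benefit(extra_aid_cost, p_rank_change, value_of_rank_move):
    # Expected value of "buying" one more LSAT median point: the chance the
    # point actually moves the school's rank, times the value the school
    # places on that move, minus what the point costs in added merit aid.
    return p_rank_change * value_of_rank_move - extra_aid_cost

# Placeholder inputs: $400k in added merit aid for one median point, a 12%
# chance the point moves the rank, and $1m of subjective value per rank spot.
print(expected_net_benefit(400_000, 0.12, 1_000_000))  # -280000.0: not worth it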

For years, the answer has been unhesitatingly and unflinchingly “yes” at many law schools. Or, perhaps, a begrudging and inevitable “yes.” When the LSAT was a significant part of the rankings, it could make a big difference, and schools saw that difference in the numbers.

But today, the value of the LSAT is dramatically lower than it was in the past. Schools should be reassessing whether this unflinching commitment to LSAT scores is worth it. But it turns out, schools are not reassessing—they are sticking to their old practices.

Median chasing in 2025

Below are a few charts for some top-30-ish schools from LSD.law, which continues a longstanding practice of letting law school applicants submit their individual admissions decisions and compare them in the aggregate. I slightly cropped the charts to focus on the heartland of reported acceptances as of March 21, 2025.

The charts below show the UGPA and LSAT of accepted students. Left to right shows an increasing LSAT score; bottom to top shows an increasing UGPA.

The schools don’t particularly matter here. The point is that the admissions practice—or “strategy”—is essentially identical at every school, with some twists at various institutions (e.g., how they handle UGPA). There is a sharp drop-off to the left of some line that represents a school’s targeted “median” LSAT. Most accepted students to the left of that line are above the median UGPA line.

In short, just like schools back in 2013 and earlier, schools are “chasing” the median. They are ignoring high-caliber students with numbers that sit just below their medians in exchange for students who can help boost one side or the other of their medians (often, most starkly, as can be seen, the LSAT medians).

Do LSAT median changes affect the rankings?

This behavior might be rational if the LSAT mattered for rankings purposes as it did in the past. But it doesn’t. Its value has been significantly reduced. Additionally, schools ought to understand that marginal increases in LSAT score (which at the top end are extremely costly—there are fewer of those scores and more competition for them, so financial aid packages can become quite expensive) are worth less the higher the score. That said, a school might simply have the desire to get better, and the better the LSAT score, the more likely it helps increase the ranking.

That’s true, but how likely is it to happen?

I modeled the USNWR rankings and ran some counterfactuals to assess whether schools in a particular LSAT band would drop in the rankings. The results are quite telling.

I ran estimates for the 43 schools with median LSAT scores of 165 or higher. I estimated a one-point drop in their LSAT medians and assumed all other schools remained the same. Impressively, for 38 of the schools, the ranking did not change. It changed for only 5 schools. And 2 of those 5 schools had a non-trivial cohort of GRE scores (which tend to be lower, often substantially lower, than LSAT scores), so it’s not as much a “true” comparison, because their median LSAT was actually at a lower percentile than peer schools’.
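For the curious, here is a toy version of that counterfactual in Python. My actual model rebuilds the full USNWR formula; this sketch collapses everything other than the LSAT median into a single placeholder score, and the percentile table and school data are hypothetical.

# Toy counterfactual: drop one school's LSAT median, recompute the ordering,
# and see whether the school's ordinal rank moves. `other` stands in for the
# 95% of the formula that is not the LSAT median.

def order(schools, pct):
    # schools: name -> (lsat_median, other); higher total = better rank.
    total = {n: 0.05 * pct[l] / 100 + 0.95 * o for n, (l, o) in schools.items()}
    return sorted(total, key=total.get, reverse=True)

def rank_change(schools, pct, target, drop=1):
    before = order(schools, pct).index(target)
    lsat, other = schools[target]
    perturbed = {**schools, target: (lsat - drop, other)}
    return order(perturbed, pct).index(target) - before  # 0 = no rankings harm

pct = {169: 97.0, 170: 97.5, 171: 98.0}                           # hypothetical
schools = {"A": (171, 0.80), "B": (170, 0.78), "C": (170, 0.79)}  # hypothetical
print(rank_change(schools, pct, "A", drop=1))  # 0: the one-point drop is free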

I then took a quick look at the 17 schools with a 156 LSAT median, and a whopping 10 of the 17 would see a drop—an outright majority. A jump of one point in the 156-158 range is worth about six times as much as a jump of one point in the 172 range, and about twice as much as one in the 167 range.

Again, knowing what we know about percentile distributions, this should not be so shocking. The marginal value of climbing (or dropping) a point diminishes the higher you go.

Perhaps even more surprising, however, is modeling a two-point drop for a given school. Even a two-point drop adversely affects only 15 of the 43 schools—that is, there’s a 2-in-3 chance that even a two-point drop does not adversely affect a school with a 165-or-higher LSAT median. And 4 of those 15 schools had a non-trivial GRE cohort (again, which tends to drop the value of the LSAT median in the first place). Unsurprisingly, on the whole, we see greater effects the farther down the LSAT scale we go.

In short, if a school’s motivation in maximizing its LSAT median is to improve its position in the rankings (and the revealed practices of admissions committees from LSD reflect that this is the motivation), it needs to assess the cost of this decision. It is quite apparent that if a school were to give up a bit of ground on LSAT medians, it would, in most circumstances, not be adversely affected in the rankings. That should yield a profound change in admissions strategy—if the admissions strategy is designed for 2025 instead of 2005.

Of course, if you are standing still while everyone else climbs a point, it could adversely affect you in the long run, and there are some drops in score that are too detrimental. Likewise, the LSAT does predict law school success, so people with higher scores are going to be better students on the whole, and the score can be valuable (although, as noted, a given score probably means less than it did before). And schools could be considering alternative admission strategies (e.g., alternative tests like the GRE, test-optional policies, etc.) that complicate this narrative. But the point with respect to LSAT medians still stands.

The irrational pursuit of round numbers

One more data point on the potential irrational behavior with respect to admissions practices and LSAT scores. Here are the median LSAT figures for most schools and their distribution (how many schools had that median) for the incoming class in 2024.

Now, I am hardly an expert—or even a novice—in regression discontinuity analysis, but something strange happens around the numbers 150 and 160. Each reflects a significant spike over the number of schools with a 149 or a 159. That number jumps from 1 to 11 between 149 and 150; and from 4 to 16 between 159 and 160. True, the numbers are not a smooth bell curve overall and do not reflect a gentle increase and decrease. Nevertheless, there is no jump from one band to another like these two. It is as if admissions offices—or perhaps their deans—are fixated on securing a round number. If that’s the case, it’s an even more absurd effort at median chasing (and frankly, many schools in the 150-ish range are not “chasing” the USNWR rankings). (Perhaps this is just a one-off, and we’ll see if such jumps occur in the future.)
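No regression discontinuity machinery is needed to flag these spikes; a crude check of adjacent counts does the job. The four counts below are the ones quoted above, and a full version would run across every median in the ABA data.

# Spot jumps in how many schools land on each median score. Only the four
# counts quoted above are included here; a full check would use all medians.
counts = {149: 1, 150: 11, 159: 4, 160: 16}

def jump(score):
    # How many more schools sit at `score` than at one point below it.
    return counts[score] - counts[score - 1]

for s in (150, 160):
    print(s, jump(s))  # 150 -> +10, 160 -> +12: both spikes at round numbers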

Coda

Admissions strategies at a given school may differ for any number of reasons. At some institutions, leadership demands certain kinds of output and results, including maximizing any opportunity for rankings advancement. At other institutions, the status quo has been a successful formula in the past, and it’s a huge risk to suggest changing that. At still others, there isn’t anyone available to do a cost-benefit analysis or evaluate the tradeoffs and risk tolerance. And at still other institutions, well, I’m sure they think I’m wrong.

But let me recap. The LSAT, as a raw score, is less predictive of ability than it was 20 years ago. That is, a 170 or a 160 means less than it did 20 years ago. It may still be predictive in the aggregate. That is, a 170 means a higher likelihood of success than a 160. But there are error rates in that 170 that were unknown 20 years ago—the 170 likely overstates the “true” value compared to 20 years ago. Relatively speaking, and in terms of its validity as a statistical matter, it’s still valuable—it just has a different value than before.

Likewise, schools have continued to rely on the LSAT but use it in a way that makes it less predictive than it is designed to be—by relying on the highest score, for instance, or by declining to use the index score. This is exacerbated by the fact that LSAC allows more retakes than it did a generation ago, and it allows cancellation of scores through mechanisms unknown a generation ago.

More recent developments, including the acceleration of extra time test-takers and the dropping of logic games from the LSAT, promise to further dilute the predictive validity of the LSAT in yet-unknown ways.

Schools have dabbled in GRE admissions or other alternative tests, or in requiring no score for sufficiently high UGPAs from the law school’s home undergraduate institution. But back in 2023, I posited that it was an “extraordinary moment” for law schools to rethink admissions. Some ideas I floated included:

  • Law schools could rely more heavily on the LSAC index, which is more predictive of student success, even if it means sacrificing a little of the LSAT and UGPA.

  • Law schools could seek out students in hard sciences, who traditionally have weaker UGPAs than other applicants.

  • Law schools can consider “strengthening” the “bottom” of a prospective class if they know they do not need to “target” a median—a school can pursue a class that is not “top heavy” and does not have a significant spread in applicant credentials from “top” to “bottom.”

  • Law schools can lean into need-based financial aid packages. If pursuit of the medians is not as important, a school can afford to lose a little on the medians in merit-based financial aid and instead use some of that money for need-based aid.

  • Law schools could rely more heavily on alternative tests, including the GRE or other pre-law pipeline programs, to ascertain likely success if it proves more predictive of longer term employment or bar passage outcomes.

(There are other suggestions there, too, including interviewing prospective candidates or evaluating CVs for past employment experience, but these may well be more of a mixed bag.)

It seems, however, that schools are not changing their admissions practices. They are largely living in a 2005 universe of admissions.

Of course, any change carries risk. It is a time-consuming endeavor to pursue alternative admissions strategies, and reactive law schools likely do not have the infrastructure to do so as universities face hiring freezes. Even a marginal risk of declining in rank is too great for some law schools—even if a change might redound to a school’s benefit in the long run with higher quality students, however the school chooses to measure them, rather than splitting LSAT and UGPA into quadrants and admitting on that basis. And alternative strategies could benefit schools in the long run on employment measures, but schools may be too risk averse to wait five years for that to play out.

But this extremely lengthy post is meant to point out that law school admissions practices are stuck in 2005. There is a significant uptick in prospective students, which ought to give schools the flexibility to think more creatively about law school admissions, as they have more similarly-situated students to choose from. And it’s an open question which schools, if any, will step forward as the ones that don’t look at the LSAT like they did a generation ago.

"Florida Supreme Court Appoints Workgroup To Consider Bar Exam Requirements"

That’s the headline over at the Florida Supreme Court’s website. From the press release:

To be admitted to the Florida Bar and gain the privilege of practicing law in this state, applicants must pass the Florida Bar Examination. In turn, with limited exceptions, the Florida Supreme Court’s rules require graduation from an ABA accredited law school as a prerequisite to taking the Bar exam. The ABA has been the sole accrediting body recognized in the Court’s rules since 1992, though the rules have relied on ABA accreditation since 1955.

The Court has appointed a Workgroup to study the current ABA accreditation requirement in the Bar admission rules and to propose possible alternatives. Former Justice Ricky Polston will chair the Workgroup, which will submit its report to the Court by September 30. The Court has asked the Workgroup to be “guided in its study and deliberations by the goals of promoting excellence in Florida’s legal profession; not hindering law schools from providing high-quality, cost-effective, and innovative legal education, in a nondiscriminatory setting; and protecting the public and meeting Floridians’ need for legal services.”

The Court believes a study of this kind is warranted due to increasing public interest in governments’ reliance on ABA accreditation in regulations dealing with lawyer licensing and access to financial aid. Reasonable questions have arisen about the ABA’s accreditation standards on racial and ethnic diversity in law schools and about the ABA’s active political engagement. Scholars have also questioned how ABA accreditation requirements affect costs and innovation in legal education.

I blogged earlier that the ABA’s role as law school accreditor is “fairly secure.” It is possible, of course, for states to permit the practice of law for graduates of non-ABA accredited schools (California among them). And it’s possible that more deregulation from states (or from a substantial cohort of states) weakens the ABA’s influence, or creates additional competition in the marketplace from non-ABA accredited schools. But I’ll be watching this development in Florida closely to see how they examine the legal profession, legal education, and attorney licensing.

USNWR promises three new employment rankings for law schools in its 2025 rankings release

Ahead of the new USNWR rankings release April 8, USNWR just released this detail:

Outcomes are critical when considering an advanced degree. That is why three brand new law school rankings that compare schools on different career outcomes for graduates will be unveiled with the overall rankings this year.

Because the rankings should be a starting point and not an end point in research, U.S. News offers the MyLaw Rankings quiz that enables users to create custom rankings based on preferences. The law school search function also enables users to find schools based on information that is not collected by the American Bar Association, such as average starting salary and graduate employment. This was further facilitated this year by 10 additional law schools compared to the prior year having reported data to U.S. News, mirrored by improved response rates in the qualitative assessments U.S. News sends to top law school officials and legal professionals.

It’s worth noting a few things. The first is that “boycotting” law schools are down (perhaps as some schools realized it was not in their self-interest to do so). But the second is that USNWR plans new rankings on three employment-related metrics.

Of course, as I’ve written about extensively, there are challenges in how USNWR assesses employment outcomes. For the rankings, a job is a job. But there are qualitative differences between many types of employment. That said, even articulating what those are (e.g., is a “Big Law Firm” job higher quality than other jobs, is a “federal clerkship” higher quality than other jobs, etc.) is fraught. It appears USNWR will include three flavors of rankings that allow students the opportunity to evaluate school employment outcomes in the categories they view as most valuable. And it appears USNWR will run these parallel to its overall rankings, which is helpful for students—students trying to compare two apparently similarly-situated schools can use differentiators like these employment rankings to help make comparisons. Upon reflection, it is possible (and this is entirely speculative) that one ranking might focus on “elite” employment outcomes (e.g., large law firms and federal clerkships); another might focus on “public service” (e.g., public interest and government jobs, perhaps state and federal clerkships). But this is speculation, as there are many ways to slice and dice the sub-categories of employment.

Now, naturally, it calls into question the whole enterprise of attempting a “one size fits all” ranking in the first place, but that ship has long since sailed….

In a trying time for legal education, the proactive will feel little, but the reactive will feel a lot

There is no question that there are several pressures facing legal education at the moment, and I’ll walk through some of them below. But, as I’ve reflected on them over the last few weeks, it’s been quite clear that most of these developments are unremarkable. Yes, some of the specific means may be a surprise, but events like these are actually long predictable (again, as I’ll highlight below). In the next couple of years, we’ll see which schools have been proactive about managing these pressures and will experience little disruption. And we’ll see a contrast in the reactive schools, the ones that will face some of the more significant turbulence.

Law school admissions

Back in 2023, I noted it was a time for law schools to rethink law school admissions. The pressures of USNWR had changed. Outputs mattered more than inputs. There, I highlighted some of the proactive steps schools should consider:

  • Law schools could rely more heavily on the LSAC index, which is more predictive of student success, even if it means sacrificing a little of the LSAT and UGPA.

  • Law schools could seek out students in hard sciences, who traditionally have weaker UGPAs than other applicants.

  • Law schools can consider “strengthening” the “bottom” of a prospective class if they know they do not need to “target” a median—a school can pursue a class that is not “top heavy” and does not have a significant spread in applicant credentials from “top” to “bottom.”

  • Law schools can lean into need-based financial aid packages. If pursuit of the medians is not as important, a school can afford to lose a little on the medians in merit-based financial aid and instead use some of that money for need-based aid.

  • Law schools could rely more heavily on alternative tests, including the GRE or other pre-law pipeline programs, to ascertain likely success if it proves more predictive of longer term employment or bar passage outcomes.

In one respect, it is a strong time for admissions. Applicants are up, and high quality applicants (i.e., those with high LSAT scores) are up even more. (As an aside, some suggest the increase in LSAT scores is attributable to the change in the LSAT format. I am entirely sympathetic to the view that LSAC has inadequately explained how the validity of the test will remain as strong after the dropping of the analytical reasoning section. At the same time, there was a significant cohort of LSAT test-takers, and high-end scores were up very early in the cycle (as measured before November 1, 2024), well before the bulk of those new-format LSAT scores were introduced.)

The “demographic cliff,” a decline in births that is making its way through the higher education space more generally, will affect admissions, but not this year. And perhaps not next year. It’s entirely possible that, as the economy softens (more on that below), and as former federal employees who have lost their jobs—service-oriented professionals—consider alternative employment trajectories, we’ll see an increase in admissions interest next year, too.

But there will be new and additional strains on legal education. For non-JD legal education, programs that relied on foreign-trained lawyers may face new restrictions on who may study in them, and those that relied on third-party for-profit service providers to administer their non-JD programs may find challenges as that market has changed significantly. It is also likely that the value of non-JD credentials, such as legal master’s degrees that allow federal employees a bump in pay status, will diminish if there are fewer federal workers or higher uncertainty about the employment status of those workers; likewise, in a softer economic market, the value proposition of those non-JD degrees may decline. These alternative revenue streams will make JD admissions all the more significant.

That said, if there are more prospective JD students with higher credentials, there will be a temptation to over-fill some classes to make up for lost revenue elsewhere, or to stock up on revenue in the event of an admissions decline in the near future. But for those schools that have failed to be proactive in thinking about the ultimate employment outcomes of these prospective students, it could create significant divergence between law schools—those focused more on the “old” metrics like medians, and those focused on the newer and more valuable metrics they should have been considering two years ago. And by 2028 or 2029, the employment landscape may well have shifted enough to be notable.

One more detail. Proactive law schools anticipated changes to the racial and ethnic makeups of their classes ahead of the Supreme Court’s decision in SFFA v. Harvard. Some schools pivoted toward considering socioeconomic status more robustly, or to providing greater financial need-based aid to attract students, or to reassessing their student recruitment pool for talent on a more aggressive and robust level. Others, more reactive, have seen significant changes to their incoming classes or continue to use explicit racial decisionmaking mechanisms that mark them as targets for federal oversight and investigation.

Law school accreditation

The American Bar Association is under attack by the present presidential administration, and it is, of course, entirely possible that its accreditation power changes in the years ahead. If schools have been proactive about managing what the ABA requires—including, frankly, doing what is necessary to meet the minimum standards—then there should be little for law schools to worry about. But note that the ABA is very reactive. As I blogged earlier, the ABA buckled under pressure from the Department of Justice in 1995 and the Department of Education in 2016. It is buckling again in 2025.

Reactive schools—reacting to a reactive institution like the ABA—will have a much harder time calibrating. If schools over-corrected on matters like racial diversity policies in response to the ABA, they may have to over-correct back to come into compliance with the ABA and new Department of Education policies. Proactive schools will have planned and prepared for changes to these policies—as the ABA has certainly changed over the years—and will experience less disruption.

Funding and staffing

Many law schools are not overly reliant on federal funding. Some are, however; some have more substantial research grant money from federal sources. And some are parts of universities that are quite reliant on federal funding. Some schools have more independence over their funding; others are more in a collective model with their universities.

In light of developments over recent years—including an endowment tax instituted a few years ago that may well get much larger, and significant inflation pressures—proactive schools began to develop long-term strategies to anticipate the challenges. Universities negotiated with donors over terms of endowed and restricted funds, creating more opportunities and more flexibility. Law schools negotiated with their universities about how much to draw down the endowment. Development offices clearly communicated these concerns to donors in advance to ensure broader use of funds. The proactive have been prepared. But the reactive will face significant challenges of where to get the money and how.

Schools that staffed up to an appropriate level in recent years—including, as mentioned above, to handle admissions, and, more importantly, as I highlighted back in 2023, with larger and more aggressive career development offices—will also be in a strong position. A hiring “freeze” at many institutions means keeping the status quo, absent exceptional circumstances. If the economy is softening, or if more labor-intensive admissions processes will be needed, then more staff will be essential at the very time staff might be harder to come by.

Legal employment

Earlier this year, I highlighted a small piece of the employment picture to watch: recent law school graduates in government jobs. But there are bigger changes to the legal markets coming. One is the presidential administration’s targeting of some specific law firms. Another is its examination of summer associate programs that may have been focused on underrepresented minority groups. Layoffs in federal government lawyer positions will create new pressure in the private sector as those attorneys look for jobs elsewhere. Some practice areas, like Foreign Corrupt Practices Act work, may dry up if the administration enforces the law much less; the same goes for tax audits of sophisticated clients if the Internal Revenue Service loses personnel, or if the Consumer Financial Protection Bureau offers fewer regulations of major segments of the economy. If Public Service Loan Forgiveness drops, schools will likewise need to ensure in-house funding mechanisms to assist those in public interest work, or find more robust financial aid packages. And there’s the whole AI thing.

In short, we may see significant transition in the employment markets for law students more generally. Schools that have been proactive about building up career development offices, broadening interest in legal markets, counseling students holistically for professional formation, and the like will be more ready to handle these challenges. The reactive, however, could very quickly see a rise in unemployment among their graduating classes (and a corresponding drop in the USNWR rankings, which are heavily focused on outputs like these from year to year). Alternatively, reactive schools could more aggressively backfill classes with school-funded jobs or by placing students into other degree programs.

Bar exam

Set aside the entirely predictable snafu in California. The bar exam is changing (more posts on that to come). Academic support and bar preparation support at law schools have become all the more critical to ensure students are shepherded through their legal education and make it out with success on the bar exam. Passing the bar becomes all the more critical for employment if the market is tightening. Changes to the bar exam require staffing and personnel ready to make a swift transition in methods of teaching. Schools that have adequately staffed and prepared for this should experience few problems. Those that are unprepared for a new bar exam and expect students—especially those falling in the gap in some jurisdictions, particularly if they’ve previously failed a bar—to handle it on their own, or those that failed to staff up ahead of a hiring freeze, will face challenges.

*

In one respect, I suppose, this long post is nothing new. It’s better to be prepared than to react to changes. But given the volume of changes and uncertainty facing legal education at the present moment, it’s worth pausing to think about what’s to come in the months ahead. Many of the things discussed here I was writing about two years ago. They are hardly surprises. Yes, of course, this iteration of changes may be, in many respects, surprising as to its cause or its specific effects. But the law schools that have the leadership, faculty engagement, foresight, and will to execute these things in advance will ultimately serve their students and the profession quite well.

Law school academic dismissal and conditional scholarship eliminations, 2024

I have previously highlighted the fact that law schools have wide variance in how they handle academic dismissals of first-year law students and how they handle reducing or eliminating scholarships. Both categories, I have argued, are negatives for law schools and the kind of information that USNWR could (and perhaps should) incorporate into its rankings. I offered a few ways of comparing schools to one another.

Here’s a visualization of the percentage of first-year law students who were academically dismissed in 2024. These percentages are slightly different from the opaque percentages reported to the ABA. These figures look at enrollment as of October 5, 2023, and the ensuing total number of first-year law students who were academically dismissed the following year. The figures exclude transfers and those who withdrew for other reasons. I organize the chart roughly by USNWR ranking and only look at the top 100-ish schools. Last year’s charts are here.

You can see that most schools have zero or negligible academic attrition, and that it picks up slightly as the chart moves down. But a few schools have somewhat higher academic attrition, 5% or higher. (For what it’s worth, in my judgment, negligible attrition, 1% or so, is entirely appropriate for a school as it makes risk tradeoffs in admissions—it may desire to broaden the applicant pool and give a larger cohort of students a chance than a more stringent policy would.)

Now over to scholarship reductions or eliminations. The ABA does not distinguish between the two, or by amount; any reduction or elimination is included. The percentage here is also slightly different from the ABA data—it is the percentage of the overall class in this chart, not the percentage among scholarship recipients. That is, if you did not receive a scholarship, you are still included in the denominator, so this chart covers all 1Ls at each school.
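To be precise about the two denominators, here is a minimal sketch with made-up counts for a hypothetical school:

# Both percentages use 1L enrollment as of October 5 as the denominator.
# Dismissals exclude transfers and other withdrawals; the scholarship figure
# counts reductions or eliminations against the whole class, not just
# scholarship recipients. All counts below are made up.

enrolled_oct5 = 200     # 1Ls enrolled as of October 5
dismissed = 4           # academically dismissed during the following year
scholarship_cut = 10    # scholarships reduced or eliminated

print(f"academic attrition: {dismissed / enrolled_oct5:.1%}")        # 2.0%
print(f"scholarship cuts:   {scholarship_cut / enrolled_oct5:.1%}")  # 5.0%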

There are far fewer schools that reduce or eliminate scholarships, because the vast majority of schools simply do not have “conditional” scholarship policies. But, again, as one moves down the chart, one can see more reductions or eliminations, with a handful of schools eliminating or reducing scholarships for 10% or more of the class. And there is some overlap between academic attrition rates and conditional scholarship data.

What is the future of ABA accreditation of law schools? (In short, fairly secure)

A recent announcement by the chair of the Federal Trade Commission with respect to the American Bar Association drew some attention. As announced:

Today, Federal Trade Commission Chairman Andrew N. Ferguson announced a new policy that prohibits FTC political appointees from holding leadership roles in the American Bar Association (ABA), participating in ABA events, or renewing their ABA memberships. Additionally, the FTC will no longer use its resources to support any employee's ABA membership or participation in ABA activities.

Although this missive is not related to the ABA’s role in accrediting law schools, the question nevertheless arises: is there potential peril for the ABA’s role in accrediting law schools?

While the ABA has insisted for a long time that there is a significant wall of separation between its accrediting authority and its other activities, there is no question that its active political engagement as an organization, including, inter alia, claiming that the Equal Rights Amendment is currently the Twenty-Eighth Amendment to the United States Constitution, has drawn significant criticism. Formal separation or not, the appearance of political influence is enough to draw challenges.

But the ABA is no stranger to controversy with the federal government with respect to its role as accreditor. In the 1990s, it entered into a consent decree with the Department of Justice because its accreditation standards were, at the time, deemed too stringent. In 2016, the Department of Education came after the ABA for being too lax in its standards. It has hardly been a model accrediting body.

That said, to be an accrediting body of law schools can mean two things.

The first, and what the ABA’s troubles have reflected, is to be an accrediting body of an institution of higher education so that the institution is eligible for, among other things, federal loan disbursements. You can read the fairly generic statements in the CFR to see what an accreditor must do. There are many accreditors of higher education—the ABA is only one. Indeed, most law schools are twice accredited, because they are a part of a university, and that university is accredited by some other body (e.g., the Middle States Commission on Higher Education, the Higher Learning Commission, WASC, or some other entity).

Anyone can do this—anyone can form an organization that meets the CFR rules and start accrediting schools. Understandably, there haven’t been many making the effort to jump into this domain.

And so long as the ABA continues to meet the CFR rules—subject, of course, to oversight, as the DOJ in 1995 and Department of Education in 2016 made perfectly clear they are willing to do—they will continue to accredit schools.

The second, and what many separately discuss, is the role of ABA accreditation as a condition of admission to the bar. The vast majority of states require that bar admittees have attended an ABA-accredited law school and obtained a JD. There are a few notable exceptions, like California, which permits admission to the bar for those who attend a California-accredited school, or even an unaccredited school, if certain conditions are met.

For many years, I’ve wondered (somewhat rhetorically) why the ABA continues to accredit law schools. It used to believe accreditation would improve the quality of lawyers admitted to the bar by requiring a certain floor of education. It’s obvious that over the years the ABA, through various pressures (e.g., the 1995 consent decree), does not measure much in this domain. The ABA also focuses increasingly on the outputs of law schools—e.g., how many of your students have passed the bar exam. If students pass the bar exam, which is the barrier to practice, then accreditation seems a superfluous barrier to practice. Of course, there are consumer protection rationales for accreditation, but those are different measures that might be implemented more inexpensively than full-scale ABA accreditation.

Likewise, the ABA continues to require more and more specific demands of law schools (in contravention of its own advice a decade ago) and fails to measure whether it is achieving its goals when it implements new standards that purport to achieve some desired end.

But a state bar or state supreme court considering whether ABA accreditation is a worthy precursor to admission to the bar has precious little influence.

A state could, like California, open up admission to the bar to a larger group of individuals. That would diminish the ABA’s influence. But the experiment in California has not played out particularly well, in my view. The vast majority of non-ABA accredited schools (but not all!) are materially worse than the bulk of ABA schools. There is tremendous market pressure for prospective students to attend an ABA accredited school. These non-ABA accredited schools are typically cheaper, but the student quality tends to be much lower, and many students will fail the bar exam and never ultimately practice law. It is a tradeoff to consider these risks.

So there is a way to dilute the influence of the ABA, but it does not seem particularly viable (at least, as seen in California so far).

States also face great challenges in thinking about in-state and out-of-state influence. A single state—say, New York or Texas—could insist that ABA accreditation is not a prerequisite to take the bar exam. The state might require, say, attendance at any school that issues a JD. Of course, that mostly means ABA accredited schools or California schools—and there might be much less oversight if some unaccredited school opens and purports to issue a “JD.”

Likewise, a state could require a graduate attend any accredited school that issues a JD. That way, if the school is accredited by, say, WASC and issues a JD, it would count just as much as an ABA accredited JD. But few law schools would change their behavior. You’d need an ABA-accredited JD to get admitted to the other 49 states, so the incentive to skip ABA accreditation is quite low.

States, you see, have little leverage as individual states, at least in one direction, that of loosening standards. If one state relaxes its standards, schools won’t change. That’s because they really need to ensure they meet the most stringent jurisdiction’s requirements so their graduates can be admitted into any state. In California—a very large market—some schools have taken the leap to focus exclusively on one state at the expense of all others. But for many schools, it is not feasible to think they would change. It would take a critical mass of states to loosen standards—and even then, it’s not clear many schools would change.

A state could also tighten its standards with idiosyncratic additional rules (e.g., a requirement that all law school graduates take 15 units of “experiential learning,” or that all law school graduates perform 50 hours of pro bono work before admission to the bar). California has attempted versions of these. In-state schools tend to react with robust programs. Mostly, however, such rules place significant costs on (often first-generation) law students at out-of-state schools who learn belatedly about these conditions on legal practice, because most schools simply can’t provide bespoke accommodations for each state’s bar.

So a state bar could change who accredits or loosen the standards for who accredits, but it likely doesn’t affect many schools’ behavior (except, perhaps, attracting a few more marginal players into the market, who might launch schools). A state bar could tighten the standards (e.g., require more than the ABA or something the ABA doesn’t require), but that tends to fall on individual out-of-state law students.

In short, the two functions of ABA accreditation—complying with the Department of Education regulations to unlock federal loans for students at those schools, and ensuring some minimum level of education as a condition for admission to the bar—appear to be fairly secure. True, the Department of Education could revoke the ABA’s accrediting authority, but it would need to explain how the ABA has failed to meet existing CFR rules, and that seems unlikely at the moment. Likewise, it could suspend the ABA from accrediting new schools (as it has temporarily done in the past), but that would really just preserve the status quo. Alternatively, it might develop new CFR rules for all accrediting bodies, not just the ABA, about what needs to go into accreditation (or what should not be a part of accreditation), but that seems a heavy future lift. And finally, it seems unlikely state bars are in a position to do much, unless a critical mass of them manage to free schools from ABA accreditation specifically and permit alternative accrediting bodies (and even then, the vast majority of law schools would still seek ABA accreditation so long as even one state requires ABA accreditation as a condition for bar admission).

But, of course, politics and administrative rules are a dynamic field, and things could change quickly. That said, I tend to anticipate the status quo will remain, and that’s my anticipation here, too.

Federal government hiring freeze could dramatically affect some law schools' employment outcomes

The reports are coming in from law schools around the country about law students, either about to graduate or set for summer employment, losing positions in the federal government as the result of a recently-initiated hiring freeze. Some of those positions may be made available again in the near future, as departments are staffed with political appointments and begin to make decisions about hiring. And it is not all federal positions: “This order does not apply to military personnel of the armed forces or to positions related to immigration enforcement, national security, or public safety.”

Law school employment metrics often look to “full weight” jobs—full-time, long-term, bar passage-required or JD-advantage jobs. And the share of recent graduates placed into government jobs varies dramatically from school to school. Some of the top schools for the Class of 2023:

Albany 27.9%

Dayton 25.4%

South Dakota 24.1%

Regent 22.6%

Northern Kentucky 22.4%

Florida A&M 22.3%

Syracuse 21.3%

Southern Illinois 20.5%

George Mason 20.2%

Catholic 20.2%

Pace 20.2%

Widener-Commonwealth 20.0%

Liberty 19.8%

McGeorge 19.8%

Florida State 19.7%

The ABA data do not separate state or local government jobs from federal government jobs. But it is probably fair to assume that for students at schools in the state capital (Albany, Florida State) or near the state capital (Dayton, Syracuse), many of these are state jobs. For schools in and around Washington, DC (e.g., Regent, George Mason, Catholic), it is likely there are more federal jobs. These schools could be most affected by a hiring freeze, both in this metric and in any rankings that rely on this metric.

But a lot of schools—including the vast majority of elite schools—send very few graduates into government jobs (the vast majority end up in large law firms or judicial clerkships). Again for the Class of 2023:

Chicago 0.5%

Cornell 1.7%

Penn 2.0%

USC 2.2%

UCLA 2.2%

Duke 2.5%

NYU 3.2%

Virginia 3.2%

Northwestern 3.2%

Loyola Los Angeles 3.5%

Columbia 3.5%

Harvard 3.7%

Western State 3.7%

Stanford 3.8%

Michigan 3.9%

I’m interested to see if the aggregate jobs in government change when the Class of 2024 data is reported this spring. And I’m also interested to see if it disproportionately affects a subset of schools—or if those schools manage to find other outlets for their graduates. That said, most people who graduated months ago may be secure in their positions, and the freeze will affect relatively few. Perhaps more from the Class of 2025 will be affected—but they also have a longer window to secure positions.

Annual Statement, 2024

Site disclosures

Total operating cost: $192

Total content acquisition costs: $0

Total site visits: 90,667 (+11% over 2023*)

Top referrers:
Reddit (5890)
Twitter (5420)
TaxProf Blog (3827)
Leiter’s Law School Reports (2795)
Buzzfeed (1426)
LinkedIn (846)
Instapundit (531)
ABA Journal (248)

(Other referrers of note)
Facebook (102)
Bluesky (27)
ChatGPT (16)

Most popular content (by pageviews):
Ranking the most liberal and conservative law firms among the top 140, 2021 edition (November 8, 2021) (11,116)
Projecting the 2025-2026 USNWR law school rankings (to be released March 2025 or so) (May 21, 2024) (8849)
Updated projected 2025-2026 USNWR law school rankings (to be released March 2025 or so) (December 17, 2024) (6078)
Updating and projecting the 2024-2025 USNWR law school rankings (to be released March 2024 or so) (December 18, 2023) (3985)
The 2024-2025 USNWR law school rankings: methodology tweaks may help entrench elite schools, but elite schools see reputation decline among lawyers and judges (April 8, 2024) (3732)
Projecting the 2024-2025 USNWR law school rankings (to be released March 2024 or so) (May 15, 2023) (3076)
Law school faculty monetary contributions to political candidates, 2017 to early 2023 (March 11, 2024) (2965)
California’s “baby bar” is not harder than the main bar exam (May 28, 2021) (2604)

Sponsored content: none

Revenue generated: none

Disclosure statement

Platform: Squarespace

Privacy disclosures

External trackers: none

Individuals with internal access to site at any time in 2024: one (Derek Muller)

*Because Google changed its analytics platform in a way that made it less user-friendly and more complicated, I ended access to Google analytics and instead switched analytics to Squarespace’s internal analytics. It may make comparisons more challenging. Unfortunately, Squarespace’s “unique visitors” and “pageviews” are less valuable metrics, so those categories have been removed.

Latest ABA data shows a continuing decrease of Black men enrolled in law schools

Any discussion of race, sex, and legal education has its own challenges about how to approach or how to interpret the topic, but some recent trends are noteworthy enough to raise, which I do here, and leave some of the discussion of implications for others.

Total 1L enrollment of Black law students has been fairly steady at law schools over the last decade. Overall 1L enrollment has been fairly stable, mostly between 37,000 and 38,500, with occasional forays above 40,000; Black 1L enrollment has mostly been between 2900 and 3500. (For figures, see the statistics here.)

For 2024, Black 1L enrollment is 3066, up a tick from 2023’s 2969. Stories like this one at the New York Times focus on a few elite law schools, particularly Harvard, and its declining Black 1L enrollment after the Supreme Court’s decision in Students for Fair Admissions. It’s a fairly anecdotal story, though—the picture for legal education as a whole looks a little different than a deep dive on a handful of schools.

One fairly significant and underreported story for that bigger picture of legal education is the wide, and now widening, gender gap among Black law students.

Let’s start with the overall portrait of sex and legal education. In 2016, women outnumbered men in the incoming 1L class for the first time. The total number of women has continued to climb, and the total number of men has continued to fall.

While men totaled around 18,000 to 19,000 of 1L enrollees about a decade ago, and even up until a few years ago, the figure is down to around 16,000 now (16,679 for the incoming class of 2024). Women’s enrollment rose both in absolute and relative terms over the years, from 19,032 in 2016 to 22,276 in 2024. Among men and women, women were 51% of the incoming class in 2016; they are more than 57% of the incoming class of 2024.

This is consistent with trends in higher education more generally, as men are increasingly eschewing college and women enroll at much higher rates. As the gap in undergraduate education widens, one would expect the gap in legal education to widen, too.

So, there is a story to tell in recent years about sex and legal education. But the interaction with race is a separate noteworthy development.

There are anecdotes about the gender gap between Black men and women in the legal profession in recent media. For instance, President Joe Biden recently appointed Embry Kidd to the federal court of appeals, and news outlets noted that it was the second Black man he appointed to the bench, compared to thirteen Black women. (At the district court level at the time of that story, it was 20 Black men and 25 Black women.)

But 1L enrollment tells a story about the current and future state of the legal profession, too. As the gender gap has widened more generally in legal education, it has widened particularly acutely for Black men.

Back in 2016, there were 1198 Black men enrolled in the 1L class, and 2076 Black women. Women were 63% of Black men and women 1L enrollees combined. In the last decade, the total number of Black men has fallen to 918 in 2024, about a 25% decline since 2016. (And in 2017, 1L Black men totaled 1281, so the fall to 918 is even more precipitous against that benchmark.) For Black women, 1L enrollment is up slightly since 2016, to 2103 1Ls in 2024 (some years climbing into the 2200 range). That moves the share of Black women up to almost 70%—more than a 2-to-1 ratio.

There are challenges with this, like any, data. This data set includes only those who identify as a man or a woman, not those with another gender identity or who declined to reveal their sex (the ABA aggregates those, but they are a much smaller number, around 32 Black 1Ls total in 2023 and 45 in 2024, to give two examples). It also includes only those who identify as “Black or African-American” (the category the ABA uses), not those who declined to disclose or who identify as two or more races (although the ABA aggregates those, too). But for consistency, we can take the same comparative approach over the last decade, with the same kinds of data limitations.

In many respects, Black men have all but disappeared from legal education in many places. Consider the following statistics about the 918 Black men who make up the incoming 2024 1L class.

  • About one-sixth of Black men enrolled as 1Ls (154) are concentrated at the five HBCU law schools.

  • Another 13% (120) are concentrated at what might have historically been labeled “top 14” law schools.

  • 27 law schools (about 14% of law schools) report zero Black men who are 1Ls.

  • 51 law schools (about 26% of law schools) report one or two Black men who are 1Ls.

There are, of course, reasons for some schools to have such figures (e.g., lower racial diversity in some great plains or mountain west states and their affiliated law schools). But the figures are fairly noteworthy all the same.

In short, in the aggregate, Black 1Ls have been a fairly stable cohort in the entering 1L class over recent years, and the incoming 2024 admissions class is no exception. But there’s a deeper story to examine about race and sex when it comes to legal education.

As I opened this post, any discussion of race, sex, and legal education can be challenging. And presenting data always presents its own challenges, including what conclusions to draw or what implications the data offer. One could compare the current legal education picture to the legal profession as a whole. One could look at current trends and ask why certain trends are happening or what can or ought to be done about them. One could drill down to individual schools or cohorts of schools, by “prestige” or by geography. But there’s no question that the ABA data shows a continuing decrease of Black men enrolled in law schools, and the numbers are more stark than they’ve been in recent history.