Comment on the ABA's proposal to end admissions tests as a requirement for law school admission

Earlier, I blogged about the ABA’s proposal to end the admissions test (typically, the LSAT) as a requirement for law school admissions. I’ve submitted a comment on the proposal, which you can read in its entirety here. The comment recommends disclosure of four pieces of information if the ABA accepts the proposal: the number of matriculants who do not have a standardized test score; the percentage of students receiving—and the 75th, 50th, and 25th percentile amounts of—grants among students admitted without a standardized test score; total academic attrition among students who lack a standardized test score; and the first-time and ultimate bar exam passage rates for students without a standardized test score. The comment explains why each item would be a useful disclosure.

You can view other comments here.

Biden experiences unprecedented hot streak with ABA judicial nominee ratings

I’ve blogged about the ABA’s judicial nominee ratings, wondering whether the ABA was any good at evaluating nominees. You can take a look at its historical ratings.

But President Joe Biden is experiencing an unprecedented hot streak. He’s had 100 ABA judicial nominee evaluations returned, and not one of them received a single “not qualified” vote.

Mr. Biden is the third president, joining Presidents George W. Bush and Donald Trump, to reject the ABA’s “pre-screening” power in evaluating judicial nominees. In the past, a president would submit potential nominees to the ABA and receive a rating back. Most of the time, a majority “not qualified” vote would sink the potential nominee, and the person would never face a formal nomination. Mr. Bush first broke the tradition on grounds that the ABA tended to give more conservative nominees lower ratings than more progressive nominees.

President Barack Obama resumed the tradition. In his first three years, the ABA, apparently, gave outright “not qualified” ratings (a majority vote of “not qualified”) to 14 potential nominees. For another 7 nominees, the ABA gave a minority vote of “not qualified.”

As a point of comparison (Democratic administration to Democratic administration), Mr. Biden has zero “not qualified” votes, majority or minority. That’s a remarkable achievement. Given how many candidates Mr. Obama named who received a “not qualified,” it suggests that some combination of White House vetting and ABA reviewing has changed, although it’s entirely unclear how to measure this. But it does show that Mr. Biden is on an unprecedented hot streak.

California audit reveals significant underreporting and underenforcement of attorney discipline

The full report is here. The National Law Journal highlights a few things:

In a review of the agency’s disciplinary files, acting state auditor Michael Tilden’s office found one lawyer who was the subject of 165 complaints over seven years.

“Although the volume of complaints against the attorney has increased over time, the State Bar has imposed no discipline, and the attorney maintains an active license,” the report said.

In another instance, the bar closed 87 complaints against a lawyer over 20 years before finally recommending disbarment after the attorney was convicted of money laundering.

It’s a pretty remarkable story that highlights two things worth considering for future investigation.

First, when Professor Rob Anderson and I highlighted the relationship between bar exam scores and ultimate attorney discipline rates, we could only draw on publicly-available discipline records. In a sense, what we observed was a “tip of the iceberg.” Now, this could come out in a couple of different ways. On the one hand, it might be that the relationship is even stronger, and that attorney misconduct manifests earlier, if we had complete access to the kind of complaints that the California bar has. On the other hand, it might also be the case (as we point out in the paper) that some attorneys are better at concealing (or defending) their misconduct than others, and that might be hidden in the data we have. It would be a separate, interesting question to investigate.

Second, it highlights the inherent error in comparing attorney discipline rates across states. California’s process is susceptible to unique pressures and complications, as all states’ systems are. You cannot infer much from one state to another (unless you are looking at relative changes within states over time as a comparative benchmark), an inference some have (wrongly) attempted.

It will be interesting to see what comes of the reforms proposed in California and whether the effort improves public protection.

Where the major political parties spent their legal dollars between 1Q2021 and 1Q2022

I pulled the FEC data for the DCCC, DNC, DSCC, NRCC, NRSC, and RNC from January 1, 2021 to March 31, 2022 to see where the major political party arms spent their money. I looked at any expenditure labeled legal, law, or attorney, and I deduped and merged entries for this 15-month period. To start, here’s where Democratic-affiliated outlets spent money, limited to any recipient receiving at least $25,000 in this period. (It excludes internal spending and transfers.)

PERKINS COIE WA (DC) $27,815,540
ELIAS LAW GROUP LLP DC $2,424,174
WILMER CUTLER PICKERING HALE AND DORR LLP DC $2,149,733
BROOKS PIERCE MCLENDON HUMPHREY & LEONARD LLP NC $1,043,825
KAPLAN HECKER FINK LLP NY $1,020,537
LATHAM & WATKINS LLP PA $645,175
DECHERT LLP PA $622,246
HEMENWAY & BARNES LLP MA $470,814
KREVOLIN HORST GA $385,765
DENTONS COHEN & GRIGSBY PC PA $382,242
BONDURANT MIXSON & ELMORE LLP GA $209,606
COVINGTON & BURLINGTON LLP DC $162,481
BALLARD SPAHR LLP PA $160,930
BALLARD SPAHR LLP AZ $160,930
CHERRY, BEKAERT & HOLLAND VA $150,333
MUNGER TOLLES OLSON LLP CA $143,955
MILLER CANFIELD PADDOCK AND STONE PLC MI $137,672
HIATT, JONATHAN MD $120,300
GREENBERG TRAURIG LLP PA (NY) $117,587
LAW OFFICE OF EVELYN GONG PLLC NY $116,910
LOCKRIDGE GRINDAL NAUEN PLLP MN $90,000
THE LAW OFFICE OF ADAM C BONIN PA $58,020
MELOY LAW FIRM MT $49,728
GUREWITZ, MARY ELLEN MI $40,800
HERRON, MICHAEL NH $39,025
SONNENFELDT, MICHAEL NY $36,500
FOX ONEILL SHANNON SC WI $35,656
STAFFORD ROSENBAUM LLP WI $34,822
CIVITECH TX $31,800
JACKSON LEWIS PC KS $29,500
JAMS INC CA $27,002
BRYAN D HOBEN ESQ NY $26,500
WOLF RIFKIN SHAPIRO SCHULMAN RABKIN LLP NV $26,410
JAMES & HOFFMAN DC $25,980

Firms with a second state in parentheses indicate that spending was labeled as being sent to another branch of that firm in that second state. Unfortunately, I cannot explain why identical amounts went to Ballard Spahr in two different states (these were not perfectly symmetrical transactions); I did not merge them in case they are duplicates, but I left both as a point of comparison.
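The filtering and merging described above can be sketched in pandas. The column names and sample rows here are illustrative assumptions, not the actual FEC export schema.

```python
import pandas as pd

# Made-up sample rows; real FEC disbursement exports use similar
# (but not identical) column names.
rows = [
    {"recipient_name": "PERKINS COIE", "recipient_state": "WA",
     "disbursement_description": "LEGAL SERVICES", "disbursement_amount": 30000.0},
    {"recipient_name": "PERKINS COIE", "recipient_state": "WA",
     "disbursement_description": "LEGAL SERVICES", "disbursement_amount": 30000.0},
    {"recipient_name": "ACME MEDIA", "recipient_state": "DC",
     "disbursement_description": "AD BUY", "disbursement_amount": 99999.0},
    {"recipient_name": "SMALL FIRM LLC", "recipient_state": "NY",
     "disbursement_description": "ATTORNEY FEES", "disbursement_amount": 1000.0},
]
df = pd.DataFrame(rows)

# 1. Keep only expenditures whose purpose mentions legal, law, or attorney.
mask = df["disbursement_description"].str.contains(
    r"legal|law|attorney", case=False, na=False)
legal = df[mask]

# 2. Drop exact duplicate entries, then merge (sum) by recipient and state.
legal = legal.drop_duplicates()
totals = (legal.groupby(["recipient_name", "recipient_state"], as_index=False)
               ["disbursement_amount"].sum())

# 3. Keep recipients at or above the $25,000 threshold, largest first.
report = totals[totals["disbursement_amount"] >= 25_000]
report = report.sort_values("disbursement_amount", ascending=False)
print(report)
```

Note that dropping exact duplicates is a judgment call: as with the Ballard Spahr entries, an identical amount may be a genuine second payment rather than a reporting duplicate.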

Here are the firms on the Republican side. (Note there are more firms but smaller totals, and more spending that may not be strictly “legal” expenditures but rather media- or press-related costs.)

Unfortunately, I had the same duplication problem with Wiley Rein and with King & Spalding.

WILEY REIN LLP MD $2,681,866
WILEY REIN LLP NJ $2,681,866
JONES DAY DC $2,146,453
CONSOVOY MCCARTHY VA $1,858,158
HOLTZMAN VOGEL JOSEFIAK PLLC VA $1,744,962
SHUTTS & BOWEN LLP FL $1,550,270
KASOWITZ BENSON TORRES LLP NY $1,262,500
ON MESSAGE INC VA $1,251,610
MCGUIRE WOODS LLP VA $799,938
DHILLON LAW GROUP INC CA $699,618
NECHELESLAW LLP NY $676,039
BUTZEL LONG ATTORNEY'S AND COUNSELORS MI $506,095
BELL MCANDREWS & HILTACHK LLP CA $433,952
VAN DER VEEN HARTSHORN AND LEVIN PA $349,864
FISCHETTI & MALGIERI LLP NY $333,945
CONSTITUTIONAL LITIGATION & ADVOCACY GROUP PC DC $300,000
NEWMEYER & DILLION LLP CA $225,286
BLANK ROME PA $209,030
HALL BOOK SMITH GA $190,249
MICHAEL BEST & FRIEDRICH LLP WI $179,573
BAKER & HOSTETLER LLP OH $150,000
RASKIN & RASKIN PA FL $150,000
DIGENOVA & TOENSING LLP DC $141,146
GOLDSTEIN LAW PARTNERS LLC PA $140,823
DEROHANNESIAN & DEROHANNESIAN NY $137,526
DILLON MCCANDLESS KING COULTER & GRAHAM LLP PA $136,749
KLEINBARD LLC PA $126,129
CROSBY OTTENHOFF GROUP DC $110,000
SNELL & WILMER LLP AZ $107,411
SKADDEN ARPS SLATE MEAGHER & FLOM NY $106,233
JOHN CIAMPOLI ESQ. NY $95,388
LAW OFFICE OF LINDA A. KERNSLLC PA $88,647
ROBINSON GRAY STEPP & LAFFITTE LLC SC $83,805
KING & SPALDING LLP DC $81,649
KING & SPALDING LLP GA $81,649
TAYLOR ENGLISH DUMA LLP GA $81,421
IMPERIUM PUBLIC STRATEGIES TN $80,000
SPARTAN PUBLIC AFFAIRS LLC VA $80,000
LODGE, JOHN III TX $79,831
CLARK HILL PLC PA $75,000
KINCAID, ADAM VA $75,000
STATECRAFT PLLC AZ $71,785
BRICKER & ECKLER LLP OH $70,770
THE NATIONAL REPUBLICAN REDISTRICTING TRUST VA $70,000
ALAN R OSTERGREN PC IA $68,025
LANDSLIDE STRATEGIES VA $65,000
BULEY, JEFFREY NY $63,136
BRADLEY ARANT BOULT CUMMINGS LLP AL $60,614
BELIN MCCORMICK IA $57,988
VAN DE BOGART LAW P.A. FL $51,481
MARQUIS AURBACH ATTORNEYS AT LAW NV $50,357
PHELPS TX $48,297
SHANAHAN LAW GROUP PLLC NC $47,927
PORTER WRIGHT MORRIS & ARTHUR LLP OH $38,871
LITTEN & SIPE LLP VA $36,554
CROSS XAMINE INVESTIGATION INC MI $36,195
DAVIDSON, DONNA GARCIA TX $35,000
KEVIN CLINE LAW PLLC NC $34,158
2652 GROUP LLC VA $34,082
OGLETREE DEAKINS NASH SMOAK & STEWART P.C. SC $31,593
HUCKABY DAVIS LISKER VA $31,500
DANIEL K HAGOOD PC TX $30,098
BROWN, MICHAEL DC $30,000
CUTOLO BARROS LLC NJ $30,000
MR&A LLC PA $29,193
AMERICA RISING LLC VA $27,709

What happens if the ABA ends the requirement that law schools have an admissions test? Maybe less than you think

In 2018, the American Bar Association’s Council of the Section of Legal Education and Admissions to the Bar considered a proposal to drop the requirement of an admissions test for law schools. I wrote about it at the time over at PrawfsBlawg (worth a read!). The proposal did not advance. Many of those points hold true, but I’ll look at how the new proposal differs and what might come of it. The proposal is still in its early stages. It’s possible, of course, that the proposal changes, or that it is never adopted (as the 2018 proposal wasn’t).

To start, many law schools currently admit a non-trivial number of students without the LSAT. Some of those admissions are with the GRE. A few are with the GMAT. Several schools admit students directly from undergraduate programs with a requisite ACT or SAT score. The GRE has gained more acceptance as a valid and reliable predictor of law school success, although how USNWR uses it in calculating its rankings is not how ETS recommends using the GRE.

The 2018 proposal concluded, “Failure to include a valid and reliable admission test as a part of the admissions process creates a rebuttable presumption that a law school is not in compliance with Standard 501.” The 2022 proposal is even more generous: “A law school may use admission tests as part of sound admission practices and policies.” No rebuttable presumption against.

There are varying levels of concern that might arise, so I’ll start with the point that I think inertia will keep many law schools using not just standardized tests but the LSAT.

First, the most significant barrier preventing a “race to the bottom” in law school admissions is the bar exam. As it is, schools must demonstrate an ultimate bar passage rate of 75% within two years of graduation. That itself is a major check against dropping admissions standards too low. Even then, many schools do not like an overly low first-time passage rate, and students take note of first-time bar passage rates, which have increased importance in the USNWR rankings.

Now, some states have been actively considering alternative paths to attorney licensing. My hunch—and it’s only a hunch—is that this move by the ABA may actually reduce the likelihood that state bars will consider alternative pathways to attorney licensing beyond the bar exam, such as a version of “diploma privilege.” If state bars are concerned that law schools are increasingly likely to admit students without regard to ability, state bars may decide that the bar exam becomes more important as a point of entry into the profession.

Of course, this isn’t necessarily true. If schools can demonstrate that they are admitting (and graduating) students with the ability to practice law to the ABA, and perhaps to the state bars, then that could elevate trust. But state bar licensing authorities appear to have long distrusted law schools. We’ll see if these efforts complicate proposals for bar exam reform, or simply highlight closer working relationships with (in-state) law schools and bar licensing authorities.

In short, unless schools come up with adequate alternatives on the admissions front to address bar passage at the back end, it’s unlikely to be a drastic change. And it might be that efforts in places like Oregon, which are focused both on the law school side and on consumer protection for the public, will assuage any such concerns.

Second, a less obvious barrier is legal employment. That’s a tail-end problem for those unable to pass the bar exam. But it’s also an independent concern: large law firms or federal judges, say, want to choose graduates with the highest legal ability. There are proxies for that, law school GPA or journal service among them. But the “prestige” of an institution also turns in part on its selectivity, measured in part by the credential of high LSAT scores. If firms or judges are less confident that schools are admitting the highest caliber of law students, they may begin to look elsewhere. This is a complicated and messy question (alumni loyalty, for instance, runs deep, and memories of institutional quality run long), but it may exert some pressure on law schools to preserve something mostly like the status quo.

Third, for admissions decisions about prospective students, there’s a risk in how to evaluate GPAs. For instance, it’s well known that many humanities majors applying to law school have disproportionately higher GPAs than their LSAT scores would suggest, and that hard sciences majors have disproportionately lower GPAs than their LSAT scores would suggest. The LSAT helps ferret out grade inflation and counteracts grading biases across collegiate majors. It is not immediately clear that all admissions offices will grasp this point if the focus shifts more substantially to UGPA as the metric for admissions (which is a less accurate predictor of law school success than the LSAT, and less accurate still than the LSAT and UGPA combined).

Fourth, who benefits? At the outset, it’s worth noting that all schools will still indicate a willingness to accept the LSAT, and applicants interested in the broadest swath of schools are still going to take the LSAT. Additionally, it’s likely that schools will continue to seek to attract high-quality applicants with merit-based scholarships, and LSAT (or GRE) scores can demonstrate that quality.

One group of beneficiaries is, for lack of a better word, “special admittees.” Many law schools admit a select handful of students for, shall we say, political or donor reasons. These students likely do not come close to the LSAT standards and may have the benefit of avoiding the test altogether. (Think of the Varsity Blues scandal.)

A second group of beneficiaries is law schools with a large cohort of undergraduates at a parent university that allows for channeling students into the law school. Right now, schools are capped at how many students can be admitted under such programs without an LSAT, relying only on a UGPA and some ACT or SAT requirement. That cap would now be lifted.

Relatedly, pipeline programs become all the more significant. If law schools can develop relationships with undergraduate institutions or programs that can identify students who will be successful in law school upon completion of the program, it might be that the law school will seek to “lock” these students into the law school admissions pool.

In other words, it could most redound to the benefit of law schools with good relationships with undergraduate institutions, both as a channeling mechanism and as a way of preventing those students from applying to other schools (through a standardized test). We may see a significant shift in programming efforts.

There are some who may contend that racial minorities and those from socio-economically disadvantaged backgrounds will benefit, as they tend to score lower on standardized tests and bear the brunt of the cost of law schools adhering to standardized testing. That may happen, but I’m somewhat skeptical, with a caveat of some optimism. The LSAT is a good predictor of bar exam success (and of course, a great predictor of law school grades, which are a great predictor of bar exam success), so absent significant bar exam changes, there will remain problems if schools drop standardized testing in favor of metrics less likely to predict success. That said, if schools look for better measures in pipeline programs, things that prospective students from underrepresented communities can do that will improve their law school success, then it very well could redound to the benefit of these applicant pools and potentially improve diversification of the legal profession. But that will occur through alternative efforts that are more likely to predict success, efforts which we’re beginning to see but are hardly widespread.

Finally, what about USNWR? Unless many schools change, it seems unlikely that USNWR would drop the LSAT and GRE as a metric. Many schools, as noted, already have a cohort that enters without any of the standardized test scores measured in the rankings.

But we can see how the rankings have been adjusted for undergraduate schools:

A change for the 2022 edition -- if the combined percentage of the fall 2020 entering class submitting test scores was less than 50 percent of all new entrants, its combined SAT/ACT percentile distribution value used in the rankings was discounted by 15 percent. In previous editions, the threshold was 75 percent of new entrants. The change was made to reflect the growth of test-optional policies through the 2019 calendar year and the fact that the coronavirus impacted the fall 2020 admission process at many schools.

. . .

. . . U.S. News again ranks 'test blind' schools, for which data on SAT and ACT scores were not available, by assigning them a rankings value equal to the lowest test score in their rankings. These schools differ from ones with test-optional or test-flexible admissions for which SAT and ACT scores were available and were always rank eligible.

It’s possible, then, that alternative rankings weights would be added to account for schools with increasing cohorts lacking standardized test scores. But I imagine most law schools will continue to do everything in their power to maximize the medians for USNWR purposes, as long as the incentives to do so remain.

*

In short, it’s quite possible that we’ll see a number of innovative developments from law schools on the horizon if the proposal goes through. That said, I think there are major barriers to dramatic change in the short term, with a concession that changes in other circumstances (including the bar exam, improved undergraduate or pipeline programs, and USNWR) could make this more significant in the future.

But I’d like to suggest two points of data collection that may be useful in examining the change. First, it would be useful if law schools, perhaps only those with more than 10% of their incoming class entering without standardized test scores, disclosed the attrition rates of those who had a standardized test score and those who did not. Second, it would be useful if they disclosed the cumulative and ultimate bar passage rates of each cohort. I think this information would help demonstrate whether schools are maintaining high standards, both in admission and in graduation, regardless of the source of admission. But law schools already disclose an extraordinary amount of information, and perhaps these figures will just be quietly disclosed to the ABA during reaccreditation rather than in some public-facing capacity.

Visualizing the 2023 U.S. News law school rankings--the way they should be presented

Five years ago, I pointed out that the ordinal ranking at the heart of the USNWR rankings is perhaps one of its greatest deceptions. It crunches its formula and spits out a score. That score is normalized to give the top-scoring school (Yale) a score of 100, and it scales the rest of the scores off that.

But the magazine then chooses to display rank order of each school--even if there are significant gaps between the scores. To highlight one such example this year, Vanderbilt has a score of 80, USC has a score of 79, and Florida has a score of 73, which suggests that Vanderbilt and USC are quite close and that Florida is somewhat farther behind those two (even if in overall elite company!). But the magazine displays this as Vanderbilt tied for 17, USC 20, and Florida 21--obscuring the narrow gap between Vanderbilt and USC and the much wider gap between them and Florida. And even though the magazine displays the overall score, the ordinal ranking drowns out these scores. Indeed, as the rankings are ordinal, there is no space from one school to the next, suggesting that they are placed along an equal line.

This plays out elsewhere in the rankings, as law students agonize over small differences in ordinal ranking that belie fairly distinct clumpings of schools that suggest little difference--indeed, in many cases, differences likely only the result of rounding the raw score up or down to the next whole number.

Assuming one takes the USNWR formula seriously--which it doesn't even appear USNWR does, given its choice to rank--a better way would be to visualize the relative performance of each school based on the score, not assigning each school an ordinal rank. That provides better context about the relative position of schools to one another. And that can help illustrate sharp differences in the overall score, or groupings that illustrate a high degree of similarity between a number of schools.
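The normalization described above can be sketched in a few lines. The raw scores here are made-up illustrations (USNWR does not publish its internal values); the point is how ordinal ranks hide the size of the gaps that the scaled scores preserve.

```python
# Hypothetical raw model scores for three schools (illustrative only).
raw = {"School A": 8.0, "School B": 7.9, "School C": 7.3}

# Scale so the top-scoring school is 100, as USNWR describes.
top = max(raw.values())
scaled = {s: round(100 * v / top) for s, v in raw.items()}

# Ordinal ranks (1, 2, 3) make all gaps look equal, even though
# A and B are 1 point apart while B and C are 8 points apart.
ordered = sorted(scaled.items(), key=lambda kv: -kv[1])
for rank, (school, score) in enumerate(ordered, start=1):
    print(rank, school, score)
```

The ordinal display compresses the A–B and B–C gaps to the same visual distance; plotting schools by scaled score, as below, restores that information.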

Below is my attempt to visualize the rankings in that fashion. (Please note that this may look best on a desktop browser due to the size of the chart.)

USNWR 2023 Rankings, Visualized by Overall Score
Score  Schools
100 Yale
99  
98 Stanford
97  
96 Chicago
95 Columbia | Harvard
94  
93 Penn
92 NYU
91 Virginia
90  
89 Berkeley
88 Michigan
87 Duke
86 Cornell
85 Northwestern
84 Georgetown
83 UCLA
82 Washington Univ.
81  
80 Boston Univ. | Texas | Vanderbilt
79 USC
78  
77  
76  
75  
74  
73 Florida | Minnesota
72 BYU | North Carolina
71 George Washington | Alabama | Notre Dame
70 Iowa
69 Georgia
68 Arizona State | Emory | George Mason | Ohio State | William & Mary
67 Illinois | Washington & Lee
66 Fordham | Davis | Irvine | Utah | Wake Forest
65  
64 Indiana | Wisconsin
63 Arizona
62 Texas A&M
61 Florida State | Maryland
60 Colorado | Washington
59 Hastings
58 Pepperdine | Richmond | Cardozo
57 Tulane
56 Tennessee | Villanova
55 Baylor | Penn State Dickinson | SMU | Houston | Wayne State
54 Temple
53 Penn State University Park | Connecticut | San Diego
52 Loyola (CA) | Kansas | Kentucky | Missouri | UNLV | Oregon
51 American | Loyola Chicago | Northeastern | Seton Hall | Miami
50 Case Western | Drexel | Georgia State | Denver | Nebraska | Pitt
49 St. John's | South Carolina
48 Rutgers | Arkansas
47  
46 Lewis & Clark | Cincinnati | Oklahoma
45 Michigan State | Hawaii | New Mexico
44 Chicago-Kent | Catholic | Buffalo | Louisville
43 Brooklyn | Florida International | Howard | Indianapolis | St. Louis
42 Syracuse | Montana
41 DePaul | Louisiana State | Marquette | Texas Tech | New Hampshire | Washburn
40 Drake | Stetson | Mississippi
39 Maine | Missouri-Kansas City
38 Gonzaga | Seattle
37 Chapman | Hofstra | Tulsa | West Virginia
36 Albany | Mercer | Suffolk | Baltimore | Dayton
35 Cleveland State | St. Thomas (MN)
34 Duquesne | NYLS | Wyoming | Willamette
33 Belmont | CUNY | Loyola New Orleans | Santa Clara | South Dakota | McGeorge
32  
31 Creighton | Samford | Detroit Mercy
30 Pace | Regent | Idaho | Memphis | Vermont

Breaking down my many posts on USNWR metrics

I’ve blogged plenty about various facets of the USNWR rankings over the years. Here’s an aggregation of the most significant posts.

Peer score

Do law professors generally think most other law schools are pretty awful? (2017)

Will Goodhart's Law come to USNWR's Hein-based citation metrics? (2019)

Gaming out Hein citation metrics in a USNWR rankings system (2019)

Significant one-year peer USNWR survey score drops, their apparent causes, and their longevity (2019)

Congrats to the University of Illinois-Chicago John Marshall Law School on an unprecedented USNWR peer score improvement (2020)

USNWR law school voters sank Yale Law and Harvard Law for the first time in rankings history. (2022)

Admissions (LSAT, GPA, acceptance rate)

Solving law school admissions; or, how U.S. News distorts student quality (2013)

Some more evidence of the scope of GRE admissions in legal education (2020)

For the second year in a row, Alabama's admissions standards (partially) trump Yale's (2021)

Non-LSAT standardized test scores in admissions remain concentrated at a handful of schools (2022)

Indebtedness

Indebtedness metrics and USNWR rankings (2021)

New USNWR metric favors $0 loans over $1 loans for graduating law students (2021)

Rethinking the best debt metrics for evaluating law schools (2021)

Employment

How state court clerkship opportunities affect legal employment (2014)

Law school-funded positions dry up with U.S. News methodology change (2016)

How should we think about law school-funded jobs? (2017)

At-graduation employment figures for law school graduates in 2018 (2021)

Bar exam

USNWR has erratically chosen whether "statewide bar passage" rate includes only ABA-approved law schools over the years (2022)

Some dramatic swings as USNWR introduces new bar exam metric (2022)

Expenditures

Trying (unsuccessfully) to account for law school expenditures under the USNWR rankings formula (2022)

Overall

If U.S. News rankings were a cake, you wouldn't want to follow the recipe (2014)

When U.S. News rankings aren't news, but just 15 months late (2014)

Visualizing the 2018 U.S. News law school rankings--the way they should be presented (2017)

The new arms race for USNWR law specialty rankings (2019)

The absurd volatility of USNWR specialty law rankings (2020)

The tension in measuring law school quality and graduating first generation law students (2020)

Law school inputs, outputs, and rankings: a guide for prospective law students (2021)

The hollowness of law school rankings (2021)

The USNWR law school rankings are deeply wounded--will law schools have the coordination to finish them off? (2021)

Overall legal employment for the Class of 2021 improves significantly, with large law firm and public interest placement growing

Despite an ongoing pandemic, disrupted legal education, challenging bar exams, remote interviews, and the like, the red hot legal market benefited the Class of 2021. The trends were quite positive. Below are figures for the ABA-disclosed data (excluding Puerto Rico’s three law schools). These are ten-month figures from March 15, 2022 for the Class of 2021.

  Graduates FTLT BPR Placement FTLT JDA
Class of 2012 45,751 25,503 55.7% 4,218
Class of 2013 46,112 25,787 55.9% 4,550
Class of 2014 43,195 25,348 58.7% 4,774
Class of 2015 40,205 23,895 59.4% 4,416
Class of 2016 36,654 22,874 62.4% 3,948
Class of 2017 34,428 23,078 67.0% 3,121
Class of 2018 33,633 23,314 69.3% 3,123
Class of 2019 33,462 24,409 72.9% 2,799
Class of 2020 33,926 24,006 70.8% 2,514
Class of 2021 35,310 26,423 74.8% 3,056

The placement is still quite good. There was an increase of nearly 2,500 full-time, long-term bar passage-required jobs year-over-year, and the graduating class size was the largest since 2016. It yielded a placement rate of 74.8%. J.D. advantage jobs increased somewhat, too, perhaps consistent with an overall hot market.

It’s astonishing to compare the placement rates from the Class of 2012 to the present: from 56% to 75%. And the improvement is almost entirely attributable to the decline in class size.
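The placement-rate column in the table above is simply full-time, long-term bar passage-required (BPR) jobs divided by total graduates, which is easy to verify:

```python
# (graduates, FTLT BPR jobs) for the endpoint classes in the table above.
cohorts = {
    "Class of 2012": (45_751, 25_503),
    "Class of 2021": (35_310, 26_423),
}

# Placement rate = FTLT BPR jobs / graduates, as a percentage.
rates = {label: round(100 * bpr / grads, 1)
         for label, (grads, bpr) in cohorts.items()}
print(rates)  # {'Class of 2012': 55.7, 'Class of 2021': 74.8}
```

Note that the raw job counts barely moved (25,503 vs. 26,423); the rate improvement comes almost entirely from the denominator shrinking.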

We can see some of the year-over-year categories, too.

FTLT Class of 2020 Class of 2021 Net Delta
Solo 260 234 -26 -10.0%
2-10 4,948 5,205 257 5.2%
11-25 1,755 2,004 249 14.2%
26-50 1,010 1,218 208 20.6%
51-100 856 1,003 147 17.2%
101-250 1,001 1,143 142 14.2%
251-500 1,030 1,108 78 7.6%
501+ 5,073 5,740 667 13.1%
Business/Industry 2,546 3,070 524 20.6%
Government 3,189 3,492 303 9.5%
Public Interest 2,284 2,573 289 12.7%
Federal Clerk 1,226 1,189 -37 -3.0%
State Clerk 1,938 2,094 156 8.0%
Academia/Education 269 328 59 21.9%
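The Net and Delta columns above follow from simple arithmetic on the two class-year counts; for example, the 501+ row:

```python
# FTLT placement at 501+ lawyer firms, from the table above.
c2020, c2021 = 5_073, 5_740

# Net is the raw year-over-year change; Delta is that change as a
# percentage of the Class of 2020 count.
net = c2021 - c2020
delta = round(100 * net / c2020, 1)
print(net, delta)  # 667 13.1
```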

The trend continues last year’s uptick in public interest placement, which is not an outlier. Public interest job placement is up over 80% since the Class of 2017. These eye-popping numbers continue to rise. It is likely no overstatement to say that law students are increasingly oriented toward public interest, and that there are ample funding opportunities in public interest work to sustain these graduates. Sole practitioners continue to slide (not long ago, their raw placement was in the low 300s).

Additionally, extremely large law firm placement continues to boom. Placement is up more than 2,000 graduates over the last several years, approaching 6,000.

I wondered if government and clerkship declines last year may have been attributable to the pandemic, and it appears that employment has rebounded in these categories.

Some figures have been updated to correct errors.

February 2022 MBE bar scores fall, match record lows

After a record low in February 2017, another in February 2018, and another in February 2020, the February 2022 score has now matched that low. The mean score was 132.6, down from 134.0 last year and matching the February 2020 low. (That’s off from the recent 2011 high of 138.6.) We would expect bar exam passing rates to drop in most jurisdictions (although results have been decidedly mixed, with wild swings depending on the jurisdiction—big improvements in Iowa and North Dakota, for instance, and a big decline in Oklahoma).

Given how small the February pool is in relation to the July pool, it's hard to draw too many conclusions from the February test-taker pool. The February cohort is historically much weaker than the July cohort, in part because it includes so many who failed in July and retook in February. The NCBE reports that more than two-thirds of test-takers were repeaters. There were significantly more repeaters than in February 2021 (unsurprising, given that 2020 pushed many first-timers to February). Comparisons to February 2020 seem better, then.

As interest in law schools appears to be waning, law schools will need to ensure that class quality remains strong and that they find adequate interventions to assist at-risk student populations.