A trickle of law school closures

Will any law schools close as a result of the decline in enrollment and the lingering effects of the 2008 recession?

This is how I began a draft of a blog post five years ago (a draft I'm finally revising). It's a little strange, perhaps, to link the present state of legal education to an economic event ten years past, or to realize that we've been discussing the same enrollment-decline problems for many years.

In part, I confess, I had deep skepticism that any law schools would close--at least, any accredited law schools. From my research, I couldn't find a single ABA-accredited law school that had ever closed. The ABA has been accrediting law schools for about a hundred years, and there have been closures of non-accredited or state-accredited schools. I wondered, then, what closures might look like.

A few years ago, I compiled some of the most notable predictions concerning law school closures.

June 17, 2010, Professor Bainbridge:

If admission applicants drop enough, maybe some of the bottom tier of schools will have to close for lack of qualified applicants. (Or maybe they'll just admit unqualified applicants.) 

October 19, 2010, ABA Journal:

As large law firms continue to hire fewer highly paid associates, law school applications will eventually drop and the number of law schools will likely contract, two professors predict in a recent article.

April 25, 2011, The New Republic:

I'm not saying law schools will go away -- the prestigious ones, especially, will probably come out just fine -- but vast swaths of it [sic] will probably disappear. 

October 3, 2012, Brian Leiter's Law School Reports:

My own opinion was that we'll see several law schools close during the next decade, but probably not more than ten--and that was the majority view among readers by a wide margin.  Most vulnerable are going to be free-standing law schools that are relatively young.  Relatively young law schools part of universities that are in vulnerable financial shape are also likely candidates.  

Here we are, ten years after the recession, eight years after the peak law school applicant and enrollment cycle, and we've seen some fairly significant movement. (I won't include the list of prospective new law schools that were abandoned before they began in the last decade.)

Cooley closed its Ann Arbor branch campus in 2014, but that was a branch campus, not a full closure.

William Mitchell and Hamline announced a merger in 2015, but I wasn't inclined to call that a "closure," despite two schools becoming one. That said, the merged school's enrollment looks more like the size of one school than of two put together.

Indiana Tech announced its closure in 2016--but, in part, I discounted that closure because the school was not fully accredited by the ABA--only provisionally.

Whittier announced its closure in 2017, the first fully ABA-accredited institution to do so.

Charlotte announced its closure in 2017, a second fully ABA-accredited institution, and the first for-profit law school (six for-profit law schools have arisen since a consent decree in 1995).

Valparaiso announced in 2017 that it would not enroll an incoming class and was exploring future options for the school--whether closure or a sale of the institution.

And just yesterday, Savannah announced it would be closing--a branch of Atlanta's John Marshall.

It's been a trickle of closures--a merger, a couple of branch closures, a suspension of admissions, a couple of outright closures. It's hard to predict what else might come. Arizona Summit and Thomas Jefferson are on probation status with the ABA, but that might change. Concordia, Lincoln Memorial, and North Texas are provisionally accredited, and we might see an update on their statuses soon. And there might be fully-accredited schools out there making difficult decisions.

Professor Jerry Organ in 2014 presciently compared a historic decline in dental school enrollment and the ultimate closure of dental schools with what he saw as possible future closures in legal education--the tail was long, he emphasized, before the decline in student quality and long-term financial pressures actually closed dental schools. Here we are, a decade after the recession, and we are seeing some of the consequences. Yes, next year appears to be a relatively good year for law schools compared to the last several. But that is probably not enough to change what might be still-future difficulties at many institutions. Only sustained long-term growth in the quantity and quality of law school applicants, driven by a market desire for more such law graduates, will bring much-needed stability.

A few thoughts on improving law school test and applicant figures for 2018

Recent data from the Law School Admission Council shows that Law School Admission Tests administered in December 2017 were up an eye-popping 27.9% year-over-year. It's worth digging a bit into the figures to see what that really means.

First, they're up slightly more in the United States than in Canada--recall that the headline figure includes all LSATs administered. Tests administered in the United States increased 29.1% year-over-year.

But, second, the increase is slightly less impressive among first-time test-takers. Recall that the LSAC, as of September 2017, allows test-takers to retake the exam an unlimited number of times. Because LSAC reports the highest score to schools (a less reliable measure than the average of scores), there is increased incentive to retake the test. First-time test-takers increased 24.0% year-over-year, but repeaters increased a whopping 35.8%. That said, a 24% year-over-year increase in first-time United States test-takers is nothing to scoff at.

Third, the quality of applicants is up year-over-year. Those with an LSAT score of 160-164 are up 10.2% year-over-year as of February 21, and those with a score of 165-169 are up 22.8%. The lowest scores have seen a slight decline in applicants.

This is very good news for the best law schools. Of course, the open question is what happens now: do the very good law schools that have shrunk in recent years maintain their size and improve quality, which trickles down to the benefit of many other schools? Or do those schools increase their size and seize the greatest advantage from the improved quality? Time will tell.

Applicants are up 8.8% year-over-year. This is somewhat lower than one would expect given the significant year-over-year increase in first-time United States test-takers, but it might be that the December bump will be reflected much later in the cycle. (Indeed, as schools have quietly pushed back or dropped their application deadlines, coupled with high incentives to retake tests, we may expect applications to lag slightly later in each subsequent cycle.)

Of course, these projections may change dramatically. We may see more applicants (though not as many as the increase in LSAT test-takers would suggest, for the reasons noted above about repeaters outpacing first-time test-takers). And the advent of the GRE in law school admissions may mean that these LSAT figures are less predictive than they once were; we may see more GRE-only applicants.

Time will tell. In short, the figures offer, with some nuance, an overall good picture for legal education generally for the incoming Class of 2018 (including the cohort taking the bar exam in July 2021). How that translates into individual schools, and how precise these figures look in the months ahead, remains to be seen.

A secret small world of "other" law school admissions

Okay, perhaps the title's a bit sensational. But American Bar Association ("ABA") data this year, for the first time, breaks out a couple of categories of 1L law school enrollment. One category is "enrollment from law school applications." The other is "other enrollment."

Typical "application" admissions occurs from the process you might expect: in a very traditional timeline, submit an application in November or December, wait for that envelope (or email?) in March or April, then enroll for a term beginning in August. Of the ABA's 37,400 first-year enrollees reported this year, 36,321 come from this category.

But another 1079 enrollees come from an "other" category. (Admittedly, this is a sliver of the overall admissions picture.) That opaque category includes four groups of enrollees:

  • Students admitted in a prior year who deferred enrollment until the current year
  • Students admitted in a prior year who took a leave of absence
  • Readmits with fewer than 15 credits
  • Students admitted with fewer than 15 credits of prior law study

This is a brand new category of ABA disclosures, designed, apparently, to capture "odd" admissions.

Of those 1079 enrollees, 419 come from just 20 schools (the 20 with the highest percentage of "other" enrollees that make up the first-year class). And these schools are hardly what one might consider peer schools.

USNWR Rank   School                               App. Enrollees   "Other" Enrollees   Pct. "Other"
1            Yale University                      163              42                  20.5%
2            Harvard University                   477              83                  14.8%
Tier 2       District of Columbia                 82               11                  11.8%
145          Ohio Northern University             46               6                   11.5%
Tier 2       Thomas Jefferson School of Law       215              26                  10.8%
Tier 2       Charleston School of Law             225              26                  10.4%
Tier 2       Atlanta's John Marshall Law School   194              22                  10.2%
20           University of Southern California    169              18                  9.6%
18           Washington University                204              21                  9.3%
2            Stanford University                  164              16                  8.9%
Tier 2       California Western School of Law     240              23                  8.7%
Tier 2       Florida Coastal School of Law        97               9                   8.5%
n/r          Concordia Law School                 44               4                   8.3%
Tier 2       Widener-Commonwealth                 118              10                  7.8%
59           University of Missouri               85               7                   7.6%
Tier 2       Western Michigan University          424              34                  7.4%
8            University of Virginia               296              23                  7.2%
Tier 2       Appalachian School of Law            68               5                   6.8%
11           University of Michigan               299              21                  6.6%
Tier 2       St. Thomas University (Florida)      173              12                  6.5%

Of these 20 schools, 7 are among the top 20 in the USNWR rankings; 10 are among the lowest-ranked schools in USNWR's "Tier 2" designation; and the remaining three are unranked Concordia, 145th-ranked Ohio Northern, and 59th-ranked Missouri. It is almost an entirely binary set of schools--the very elite and the marginal.
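(The last column above is simple arithmetic; here's a minimal sketch of how I read the "Pct. 'Other'" figures--"other" enrollees as a share of the total first-year class. The function name is mine, for illustration only.)

```python
# A quick sketch of the "Pct. 'Other'" column above: "other" enrollees
# as a share of the total 1L class (application enrollees plus "other"
# enrollees).

def pct_other(app_enrollees: int, other_enrollees: int) -> float:
    """Return "other" enrollees as a percentage of the total 1L class."""
    return 100 * other_enrollees / (app_enrollees + other_enrollees)

print(f"Yale:    {pct_other(163, 42):.1f}%")  # 20.5%
print(f"Harvard: {pct_other(477, 83):.1f}%")  # 14.8%
```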

So, here comes some speculation.

The Yale 1L class, for instance, includes 20% of a student body that did not apply in the last year--they deferred, took a leave, started a handful of credits at another institution (not likely), or were readmitted with a handful of credits from Yale (again, not likely). Yale is very generous in its deferral program. Harvard's "Junior Deferral Program" likely also accounts for a significant chunk.

Admitting students as "deferrals" makes sense. Students get into their dream school, like Yale or Harvard, but want to postpone law school to do Teach for America, save a little more money, or travel the world; rather than reapply in a second round of admissions, they defer--and they don't need to apply anywhere else. At many other schools, however, students would probably not defer but reapply in a subsequent admissions cycle, hoping, perhaps, that admissions standards drop (even slightly!), that an improved personal statement or senior-year grades put them over the top, or that an LSAT retake makes them shine.

At the other end of the spectrum, it appears that many of the more marginal schools admit a number of students who have some at-risk flags--for instance, those who were academically dismissed with a very small number of credits.

But, you'll note, I have to speculate here. The ABA decided to lump all four of these categories into one heap, and failed to disclose on the public-facing website what these "other" categories even were in the first place. Perhaps in the future we'll see more granular data. Until then, we have only an opaque picture of this secret (small) world of law school admissions.

LSAT trends show increase in test-takers and project modest 2018 JD enrollment increase

In my last post, I looked at the law school enrollment figures for 2017. What might happen in 2018?

While LSAT test-takers are up, it's worth emphasizing that an increasing percentage of test-takers are repeaters, not first-time test-takers. On the flip side, with a number of schools now accepting the GRE as an alternative to the LSAT, the LSAT figures may understate the number of law school applicants next year.

More important than the increase in LSAT test-takers, however, is their quality. I emphasized this years ago: the quality of the applicant pool matters in much the way that the quantity does. Professor Jerry Organ has helpfully examined the increase in quality.

(It's worth noting that LSAC changed its data for law school applicants in 2016; it explains, "Archived data for 2015 and prior years include applicants for the fall term only and also include deferrals; therefore, archived data are not comparable to current data." They are, however, close enough for our present comparative purposes; and 2016-2017 are comparable, albeit I only have an estimate for 2017 right now.)

Let's also provide some comparisons in recent LSAT & enrollment data. We saw 1L JD enrollment largely flat for the fourth straight year, and the overall law school enrollment figure may well have bottomed out.

But LSAT test-takers have increased each year since 2015: from 101,600, to 105,900, to 109,400, with a projected 125,000 test-takers this cycle. LSAT test-takers are not proportionately translating into applicants; indeed, despite a 3.3% increase in LSATs administered last year, applicants actually declined slightly, and matriculants increased only 0.8%. Part of this, as I've identified, is attributable to increased numbers of repeaters taking the LSAT. But there are other reasons why LSATs administered are not translating into applicants--reasons I can only speculate about at this time. In part, lower-quality test-takers may have inflated earlier LSAT statistics without ever applying, but we may be seeing a reversal.
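For reference, a quick check of those year-over-year growth rates (the 125,000 figure is the projection noted above):

```python
# Year-over-year growth in LSATs administered, using the figures quoted
# above; the final number is the projected total for this cycle.

lsats = [101_600, 105_900, 109_400, 125_000]
for prev, cur in zip(lsats, lsats[1:]):
    print(f"{prev:,} -> {cur:,}: {100 * (cur - prev) / prev:+.1f}%")
# 101,600 -> 105,900: +4.2%
# 105,900 -> 109,400: +3.3%
# 109,400 -> 125,000: +14.3%
```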

That said, surely such a significant increase in the number of LSAT test-takers would yield at least some increase in applicants and matriculants--particularly given the quality of those test-takers. Only time will tell. For now, stagnant JD enrollment is the status quo, and law schools can look forward to a glimmer of hope for some improvement in 2018.

2017 law school enrollment: JD enrollment flat, nearly 1 in 7 are not in the JD program

The 2017 law school enrollment figures have been released, and they reveal flat JD enrollment and a sharp uptick in non-JD enrollment.

Total JD enrollment is at its lowest point since 1974, when 105,708 students were enrolled in just 157 ABA-accredited law schools. Enrollment dropped slightly from last year, down to 110,156.

1L enrollment is actually slightly up, from 37,107 last fall to 37,398 this year. It's the fourth straight year of enrollment in the 37,000-range.

Earlier I predicted that non-JD legal enrollment would decline this year due to uncertainty in immigration and travel rules from the new presidential administration. That is emphatically not the case. Instead, there's a whopping 20% increase in non-JD enrollment, from 13,677 in 2016 to 16,428 this fall. Perhaps some of this arises from the jump in non-JD online degrees, particularly "masters of legal studies"-type degrees.

The growth has been explosive in recent years. When coupled with the decline and flattening of JD enrollment, the relative figures are, in my view, staggering. 13% of all students enrolled in law schools are not a part of a JD program--nearly 1 in 7 students. That's up from 11% last year, 10.3% in 2015, and 9.1% in 2014.
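For those curious, here's the back-of-the-envelope arithmetic behind that 13% figure, assuming the share is simply non-JD enrollment over total (JD plus non-JD) enrollment:

```python
# Back-of-the-envelope check of the Fall 2017 non-JD share, using the JD
# and non-JD enrollment figures reported above.

jd_enrollment = 110_156
non_jd_enrollment = 16_428

share = non_jd_enrollment / (jd_enrollment + non_jd_enrollment)
print(f"non-JD share: {share:.1%}")  # non-JD share: 13.0%
```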

I've earlier wondered about a coming reckoning for non-JD legal education, a market largely unregulated by the American Bar Association and with essentially no disclosure of student inputs or outcomes. And I wonder how long this trajectory might continue.

In light of this enrollment data, I'll shortly project some things about the Class of 2018.

Recent trends in non-JD legal education

I've blogged before about the rise of non-JD legal education. Law schools increasingly rely on non-JD sources of revenue (now, 1 in 9 students enrolled in a law school are not a part of the JD program, up sharply over the last few years). I've also expressed some concern about the value proposition of some of those degrees, particularly given the high failure rate of LLM graduates on the bar exam.

I thought I'd share a prediction, an update, and a new observation.

First, I predict that non-JD enrollment will drop this year, the first such decline in some time. I suggested last year that the new presidential administration might lead to declines in foreign visitors to American educational institutions. I anticipate that will be true when it comes to non-JD education (and foreign students are a significant portion of such degree offerings). Even though the "Travel Ban 1.0/2.0/3.0" has been ostensibly limited in scope and had significant legal challenges (in addition to naturally-expiring deadlines), I think these formal legal postures are quite distinct from the pragmatic effect that even the rhetoric about such immigration restrictions would have on prospective foreign students. We should know more next month.

Second, the New York bar is by far the most popular bar exam for foreign attorneys. This year, first-time test-takers from foreign countries had a whopping 57% pass rate, dramatically up from the historic 42%-46% pass rates of recent years. I don't know what would cause such an increase--more students from English-speaking countries, better bar prep, or any of a number of factors. But it's worth noting in light of my earlier concerns about the low bar pass rates. (The same kind of improvement took place in Texas: first-time pass rates among July test-takers rose from 20% in 2015 and 25% in 2016 to 44% in 2017.) Not all foreign test-takers have secured a US non-JD degree, but many obtain one as a prerequisite to taking a state bar exam.

[Chart: online versus traditional non-JD enrollment]

Third, law schools have discovered online non-JD legal education. It's not clear how such degrees fit into the overall marketplace (any more so than non-JD degrees more generally), and it might be that such opportunities will offset at least some of the loss of other non-JD enrollment.

Indeed, breaking down traditional versus online non-JD enrollment over the last few years, online non-JD enrollment is up significantly while traditional non-JD enrollment has flattened. Much of the most recent growth, then, has come from online non-JD degrees. While online non-JD programs enrolled just 1590 students in 2014, that figure nearly doubled to 2971 in 2016--and I expect it is still larger for Fall 2017.

Only 38 schools had online non-JD programs in Fall 2016, but even that figure is deceiving: an eclectic crop of eight schools accounted for about half of all online non-JD enrollment in 2016.

[Chart: online non-JD enrollment]

Again, the Fall 2017 figures will be released soon, and we'll see what changes to these trends have taken place. I remain interested to know the place of non-JD degrees and the future trends of enrollment, and I'll always happily report more updates here.

How a change in the bar exam cut score could alter California legal education

Virtually all the deans of law schools in California--ABA-accredited and California-accredited alike--have come out in favor, at multiple stages, of lowering the cut score for the California bar exam. The score, 144, is the second-highest in the country and has long been this high. Given the size of California and the number of test-takers each year, even modest changes could result in hundreds of new first-time passers each test administration.

The State Bar, in a narrowly-divided 6-5 vote, recommended three options to the California Supreme Court: keep the score at 144; lower it to 141.4; or lower it to 139. As I watched the hearing, the dissenters seemed more in favor of keeping it at 144. At least some of the supporters seemed inclined to support the 139 score, or something even lower, but recognized the difficulty of securing a majority vote on the issue. Essentially, however, the State Bar adopted the staff recommendation and offered these options to the California Supreme Court.

The Court could adopt none of these options, but I imagine it would be inclined to adopt a recommended standard, and probably the lowest standard at that, 139. (The link above includes the call from the Supreme Court to evaluate the appropriateness of the cut score, a hint, but hardly definitive, that it believes something ought to be done.)

What surprised me, however, is that there would be such unanimity among law deans, because the impact on legal education could be quite significant--and not benefit all institutions equally. Put another way, I understand the obvious short-term benefit for all institutions--some number of law school graduates who previously might have failed the exam would pass, redounding to the benefit of the institution and those graduating classes.

But that, in part, assumes that present circumstances remain the same. Invariably, they will not. Let me set up a few things that are likely to occur, and then game out some of the possible impacts these changes might have on legal education--all on the assumption that the cut score drops from 144 to 139.

First, the number of passers will increase fairly significantly. About 3480 people passed among the 8150 who took the July 2016 bar exam. That included about 3000 first-time passers among 5400 first-time test-takers. Bar test-takers are also up significantly this administration (in part likely because of the reduction from three days to two). We should expect the number passing in that cohort to rise to about 4100--and probably more this administration, given that there were more test-takers. We may expect more out-of-state attorneys, or people who'd failed and given up, to start attempting the test again. Statistics also indicate that the greatest increase in new attorneys will tend to come among racial minorities, who have historically passed the bar exam at lower rates.
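A quick sketch of that arithmetic, using the July 2016 figures above and my rough projection of 4100 passers at a 139 cut score:

```python
# July 2016 California bar exam figures quoted above, plus my rough
# projection of passers had the cut score been 139 instead of 144.

passers_2016 = 3_480
takers_2016 = 8_150
projected_passers = 4_100  # rough projection at a 139 cut score

print(f"actual pass rate:    {passers_2016 / takers_2016:.1%}")             # 42.7%
print(f"projected pass rate: {projected_passers / takers_2016:.1%}")        # 50.3%
print(f"increase in passers: {projected_passers / passers_2016 - 1:.1%}")   # 17.8%
```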

The change will also disproportionately benefit California-accredited schools: while ABA-accredited schools on the whole would see a 17% increase in pass rates, California-accredited schools would see about a 70% increase in pass rates. (Granted, far fewer graduates from these schools take the bar--only about 100 passed the July 2016 bar exam.)

Additionally, we know that this year's test-takers scored better nationwide. If that trend translates to California, too, we would expect a few hundred more passers on top of that figure. And the increase in test-takers may linger for a long period if more people are attracted to California because it has a modestly easier test to pass.

This obviously, in the very short term, primarily benefits those students who scored between a 139 and a 144 and would otherwise have failed the bar exam--and the schools with those student populations. In the slightly longer term, it will benefit students who scored less than a 139 and, on a retake, have a much higher chance of securing a 139 than a 144.

About 700 to 800 extra attorneys entering the system each July (or potentially even more, depending on the volume of test-takers), and some smaller number each February, should slowly exert pressure on attorney prices in California, as Professor Michael Simkovic has reasoned. More lawyers means more competition, which means that prices should drop, particularly among attorneys catering to more price-sensitive clients (no one thinks Vault 100 law firms will start slashing California salaries!). It's worth noting, too, that this change may be more gradual at first--there has been a drop in test-takers overall, so the increase in new attorneys may not be as dramatic unless (or until) test-taking volume rebounds to previous highs. (For instance, in the July 2013 exam, nearly 5000 passed the exam among 8900 test-takers.)

Professor Robert Anderson and I also indicated that we would expect more attorneys to face discipline. Currently, we estimate that those who score a 144 on the bar exam ultimately face a career likelihood of discipline of around 9%. (This compares to the overall likelihood of about 5% at 35 years since admission to the bar.) Those with a 139, we project, would face a career likelihood of discipline of around 12%. The entering cohort would thus have a somewhat higher likelihood of facing discipline at some point in a 35-year career.

Finally, some law schools will disproportionately benefit--typically those at the lower end of the performance curve, but not those whose student bodies perform at the very bottom among law schools. If the cut score is lowered from 144 to 139, schools with a significant "middle" of the curve--the bulk of their graduates scoring in a range around 135 to 145--should see the bulk of the improvement.

The chart below illustrates a very rough projection of the improvement in performance of each school from the July 2016 bar exam if the score had been lowered to 139. This is very rough because it depends on many factors, particularly the distribution of scores at each school, and should be taken only as a rough estimate--any figure could easily be a few percentage points higher or lower. Complicating the estimate, the July 2017 results would, of course, look different; I'm simply fitting the projection to last year for some reference.

As you can see, in that middle band of 12 schools, those between Cal Western and Whittier, we would expect to see gains ranging from 14 to 21 points. The 11 schools at the top of the chart would generally see more modest gains of around 8 to 12 points. The 10 schools at the bottom of the chart would also see more modest improvement, typically 6 to 11 points. (The asterisks on the chart are notations for California schools that are not accredited by the American Bar Association.) There are over 50 law schools in California, but not all had sufficient test-takers to be reported in the California data.

What might these factors do to legal education in California? Potentially, quite a bit. I sketch out some possible outcomes—with an emphasis on their potentiality. A change from 144 to 139 is somewhat modest but, in a state as large as California with as many law schools and lawyers, could have significant effects. Here are a few possible things that could occur:

* * *

At least some law schools will admit larger classes. To the extent law schools were reluctant to admit larger classes because of concerns about bar passage rates, those schools will be more inclined to admit larger student bodies. Of course, there are still other reasons that schools may not increase their class sizes, or at least not substantially—they are concerned about their LSAT and UGPA medians for USNWR rankings purposes, they may be worried about finding meaningful legal employment for a larger number of graduates, and so on. But, at least one barrier in the admissions calculus has been partially removed.

Higher-ranked law schools may begin admitting students who historically matriculated to lower-ranked law schools. That is, a new kind of competition may begin. In light of the thought mentioned above, it may not simply be that schools admit larger classes; they may be grabbing applicants who would have attended lower-ranked schools. This would exert downward pressure on lower-ranked schools as competition for their prospective students increases.

Higher-ranked law schools may see improved racial diversity profiles among incoming classes, potentially at the expense of lower-ranked schools. This is good news for highly-ranked schools and students from racially diverse backgrounds. The lower score will tend to benefit racial minorities, as the data has shown that minorities fail the bar at higher rates. So highly-ranked schools can admit more diverse student bodies with greater confidence of their success. Of course, this will exert downward pressure on lower-ranked schools, who may see their diversity applicant pools dwindle or face pools of applicants with worse predictors than in past years.

Law schools will experience more price sensitivity from prospective law students. That is, the value of the law degree should decline in California, as the volume of attorneys increases and the price for lawyers drops. That should, in turn, make law students more skeptical of the existing value proposition of a law degree. Law schools that have relied on high tuition prices have benefited from the high bar exam cut score, because opportunities for attorneys have been relatively scarce; the drop in cut score will dilute the value of the degree and perhaps require some cost-cutting at law schools. This is not to say that an artificial constriction on the supply of lawyers is a good thing because it props up costs (in my personal view, I think it's quite a bad thing); but, it is to say that lowering the score will make price sensitivity an increasingly likely reality for law schools.

California-accredited law schools will have opportunities to thrive. Look again at the chart above. San Joaquin (which had 45 first-time test-takers in July 2017) would have a projected bar pass rate of 50%. Lincoln Sacramento (which had 42 first-time test-takers) would have a projected bar pass rate of 47%. These exceed some ABA-accredited schools and start to look quite attractive to prospective law students. That’s particularly true given the tuition at these institutions. The figure below displays the full-time academic year tuition in 2016 for each of these institutions. (For institutions on the credit-hour payment model, I used 28 academic units; for Lincoln Sacramento, a four-year program, I took the total price and divided by three.) I put the schools in rank order of their (projected) bar exam performance. (As a caveat, the actual price at many institutions is much lower because many students receive scholarships that discount tuition; but, for present comparative purposes, I'm using sticker price.)
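For those who want the normalization spelled out, here's a minimal sketch; the prices are hypothetical placeholders, since the actual figures appear in the chart:

```python
# A minimal sketch of the tuition normalization described above. The
# prices below are hypothetical placeholders, not the actual figures.

def annual_tuition_per_credit(price_per_unit: float, units: int = 28) -> float:
    """Annualize per-credit tuition at 28 academic units per year."""
    return price_per_unit * units

def annual_tuition_four_year(total_program_price: float) -> float:
    """Normalize a four-year program's total price to a three-year-equivalent
    annual figure (total divided by three)."""
    return total_program_price / 3

print(annual_tuition_per_credit(1_000.0))  # hypothetical $1,000/unit -> 28000.0
print(annual_tuition_four_year(60_000.0))  # hypothetical $60,000 total -> 20000.0
```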

(It's worth noting in the chart above that an institution like La Verne, which charges much lower tuition than peer institutions, may see a similar benefit.) For those who oppose the regulatory burden of ABA-accreditation and wish that non-accredited institutions have an opportunity to thrive, California (with more than 30 non-ABA-accredited schools) may offer a more meaningful experiment in that effort if the cut score is lowered.

Negative impact in USNWR for elite schools, and positive impact in USNWR for more marginal schools. This category may not be immediately obvious to observers considering bar exam pass rates. That is, some might ask, wouldn't higher bar exam passing rates improve a school's USNWR profile? Not necessarily--particularly not if the overall passing rate increases.

USNWR measures bar pass rate not in absolute terms but in relative terms--the margin between a school's first-time passing rate in a jurisdiction and that jurisdiction's overall pass rate. If School A has a passing rate of 90% and School B 75%, that gap is only part of the story: School A had its 90% rate in a jurisdiction with an overall rate of 60%, which means it actually did quite well; School B had its 75% rate in a jurisdiction with an overall rate of 80%, which means it actually did poorly. USNWR measures that relative performance. UPDATE: I edited this for some clarity in the hypothetical.

So if School A sees its passing rate increase to 93%, but the jurisdiction's overall passing rate increases to 85%, that's bad for School A in USNWR terms--its ability to outshine others in the jurisdiction has dwindled. In a state as large as California and with such a relatively low first-time overall passing rate, this gives elite schools an opportunity to shine.

Stanford, for instance, boasted a 91% first-time bar passage rate in a jurisdiction with a 56.3% first-time pass rate, a 1.62 ratio. If the bar pass cut score is dropped to 139, the bar projects a first-time pass rate of 64.5%. Even if its pass rate increases to a projected 96%, its ratio drops to 1.49, a 0.12-point drop. The same holds true for institutions like USC (-0.08), UCLA (-0.03), and Berkeley (-0.06). These are just one factor in the USNWR ratings, and these figures are ultimately normalized and compared with other institutions nationally, but it will marginally hurt each of these schools as an institution in the rankings--even though it might benefit a small cohort of graduates each year taking the bar exam.
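For concreteness, here's a minimal sketch of that relative-performance calculation, using the Stanford figures above (with these rounded inputs, the drop computes to about 0.13 rather than the 0.12 reported from unrounded data):

```python
# USNWR-style relative bar passage: a school's first-time pass rate divided
# by the jurisdiction's overall first-time pass rate. Inputs are the rounded
# Stanford figures quoted above, so the computed change comes out near -0.13.

def relative_ratio(school_rate: float, overall_rate: float) -> float:
    return school_rate / overall_rate

before = relative_ratio(0.91, 0.563)  # current 144 cut score -> 1.62
after = relative_ratio(0.96, 0.645)   # projected 139 cut score -> 1.49

print(f"before: {before:.2f}, after: {after:.2f}, change: {after - before:.2f}")
# before: 1.62, after: 1.49, change: -0.13
```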

In contrast, schools that have had below-average bar exam performance would see a significant increase—some of them in my projections moving up 0.2 points in their ratios or even more. If the school is in the unranked tier, it might help get the school into the rankings; if they are ranked lower, it might help them move up the rankings, an added benefit to their graduates passing the bar at higher rates.

* * *

I’ll emphasize what I’ve mentioned repeatedly before but is too often lost when blog posts like this are shared. I have no particularly strong views about what the bar exam cut score ought to be—where it is, a little lower, much lower, or anything else. There are costs and benefits that go along with that, and they are judgments I confess I find myself unable to adequately assess.

But these are my preliminary thoughts on things that might happen if the cut score were dropped to 139. Granted, they are contingent on many other things, and it is quite possible that many of them do not happen. But they are a somewhat-evidence-based look at the future. And they show that the change in cut score may disproportionately affect some institutions in ways beyond the short-term bar exam results of cohorts of graduating law students. Time will tell how wrong I am!

An odd and flawed last-minute twist in the California bar exam discussion

My colleague Rob Anderson last night blogged about the strange turn in a recent report from the staff at the California State Bar. Hours ahead of today's meeting of the Board of Trustees, which will make a recommendation to the California Supreme Court about the appropriate "cut score" on the bar exam, new information was shared with the Board, which can be found in the report here. (For background on some of my longer thoughts on this debate, see here.)

The report doubles down on its previous claim that there is "no empirical evidence available that indicates California lawyers are more competent than those in other states." (As our draft study of the relationship between attorney discipline and bar scores in California discusses, we concluded that we lacked the ability to compare discipline rates across states because of significant variances in how state bars may handle attorney misconduct.)

But it's now added a new claim: "Nor is there any data that suggests that a higher cut score reduces attorney misconduct." Our paper is one empirical study that expressly undermines this claim. Rob digs into some of the major problems with this assertion and the "study" that comes from it; his post is worth reading. I'd like to add a couple more.

First, the report makes an illogical jump, from "Nor is there any data that suggests a higher cut score reduces attorney misconduct" to "But based on the available data, it appears unlikely that changing the cut score would have any impact on the incidence of attorney misconduct." These are two different claims. One is an absence of evidence; the other is an affirmative finding based on the evidence. Additionally, the adjective "unlikely" asserts a level of certainty--how low is the probability? And on what is this judgment based? Furthermore, the paragraph is self-refuting: "Given the vast differences in the operation of different states' attorney discipline systems, these discipline numbers should be read with caution." Caution indeed--perhaps not read at all! That is, there's no effort to track differences among the states and control for those differences. (This is a reason we couldn't do that in our study.)

Apart from, as Rob points out, other hasty flaws--like misspelling "Deleware" and concluding that California's discipline rate of 2.6 per thousand is "less than a third" of "Deleware"'s 4.7 per thousand (2.6 is in fact more than half of 4.7)--it's worth considering some other problems in this form of analysis.

At a basic level, in order to compare states based on discipline rates alone, it must be the case that other factors do not differ dramatically among states. But if other factors do not differ dramatically among states, and the cut score also does not matter, then the states should have roughly equal discipline rates--which they don't.

The figure itself demonstrates a number of significant problems.

First, Figure 7 compares cut score with attorney discipline. But it uses a single year's worth of data, 2015. The sample size is absurdly small--it projects, for instance, the State of Vermont's discipline rate based on a sample of 1 (the total attorneys disciplined in 2015). The ABA has such data for several years, but this report doesn't collect that. In contrast, ours uses over 40 years of California discipline data from over 100,000 attorney records.

Second, the figure doesn't control for years of practice, which can affect discipline rates. That is particularly the case if the cohort of licensed attorneys in the state skews younger or older. We find that attorneys are more likely to face discipline later in their careers, and our study accounts for years of practice.

Third, the figure doesn't recognize variances in the quality of test-takers in each state. In July 2016, for instance, California's mean MBE score was a 142.4, but Tennessee's was a 139.8. Many states don't disclose state-specific MBE data. But two states with similar cut scores may have dramatically different abilities among their test-takers, some with disproportionately higher scores. Our study accounts for differences in individual test-taker scores by examining the typical scores of graduates of particular law schools, and of the differences in typical scores between first-time test-takers and repeaters.

Fourth, the figure treats the "cut score" as static in all jurisdictions, when it has changed fairly significantly in some. This is in stark contrast to the long history of California's cut score. California has tethered its 1440 to earlier standards, dating to when it expected applicants to score about 70% correct on the test, so even when it has changed scoring systems (as it did more than 30 years ago), it has tried to hold that score as constant as it can. Other states lack that continuity: they adopted the MBE or other NCBE-related testing materials at different times, have changed their cut scores, or have altered their scoring methods. Tennessee, for instance, only five years ago adopted scaling essay scores to the MBE, and its earlier failure to do so assuredly resulted in inconsistent administration of standards; further, Tennessee once permitted those with a 125 MBE to pass with sufficient "passing" scores on the unscaled essays. South Carolina at one time required a 125 MBE score, and didn't scale its essays. Evaluating discipline rates of attorneys admitted to the bar over several decades against a cut score from the July 2016 test cannot adequately measure the effect of the cut score.

Let me emphasize a couple of points. I do wish that we had the ability to compare attorney discipline rates across states. I wish we could dive into state-specific data in jurisdictions where they changed the cut score, and evaluate whether discipline rates changed among the cohorts of attorneys under different standards.

But one of the things our study called for was for the State Bar to use its own internally-available data on the performance of its attorneys on the bar exam, and evaluate that when assessing discipline. The State Bar instead chose this crude and flawed process to demonstrate something else.

Finally, let me emphasize one last point, which I continue to raise in this discussion. Our study demonstrates that lower California bar scores correlate with higher attorney discipline rates, and lowering the bar score will result in more attorneys subject to discipline. But, of course, one can still conclude in a cost-benefit analysis that this trade-off is worth it--that the discipline rates are not sufficient for necessary concern, that they often take years to manifest, that access to justice or other real benefits are worth the trade-off, and so on.

But it is disappointing to ignore the existing data, or to rely on deeply flawed data, about the relationship between attorney discipline and the bar exam cut score in this process, particularly when it is dumped the night before the Trustees meet to evaluate the issue.

A poor attorney survey from the California State Bar on proposals to change the bar exam cut score

I'm not a member of the California State Bar (although I've been an active member of the Illinois State Bar for nearly 10 years), so I did not receive the survey that the state bar circulated late last week. Northwestern Dean Dan Rodriguez tweeted about it, and after we had an exchange he kindly shared the survey with me.

I've defended some of the work the Bar has done, such as its recent standard-setting study, which examined bar test-taker essays to determine "minimum competence." (I mentioned that the study is understandably limited in scope, particularly given the compressed time frame. The Bar has shared a couple of critiques of the study here, which are generally favorable but identify some of its weaknesses.) And, of course, no single study should determine what the cut score ought to be, but it's one data point among many studies coming along.

Indeed, the studies, so far, have been done with some care and thoughtfulness despite the compressed time frame. Ron Pi, Chad Buckendahl, and Roger Bolus have long been involved in such projects, and their involvement here has been welcome.

Unfortunately, despite my praise with some caveats about understandable limitations, the State Bar has circulated a poor survey to members of the State Bar about the proposed potential changes to the cut score. Below are screenshots of the email circulated and most of the salient portions of the survey.

It is very hard to understand what this survey can accomplish except to get a general sense of the bar's feelings about what the cut score ought to be. And feelings are not terribly helpful in answering that question.

For instance, there's little likelihood that attorneys understand what a score of 1440, 1414, or "lower" means. There's also a primed negativity in the question "Lower the cut score further below the recommended option of 1414"--of course, there were two recommended options (hold in place, or lower to 1414), with not just "below" but "further below." Additionally, what do these scores mean to attorneys? The Standard-Setting Study was designed to determine what essays met the reviewing panel's definition of "minimum competence"; how would most lawyers out there know what these numbers mean in terms of defining minimum competence?

The survey, instead, is more likely a barometer of how protectionist members of the State Bar currently are. If lawyers don't want more lawyers competing with them, they'll likely prefer the cut score to remain in place. (A more innocent reason is possible, too, a kind of hazing: "kids these days" need to meet the same standards they needed to meet when getting admitted to the bar.) To the extent the survey is about whether to turn the spigot that controls the flow of lawyers--to open it further or to hold it in place--it represents the worst that a state bar has to offer.

The survey also asks attorneys to rate, on a scale of 1 to 10, the "importance" they assign to "statements often considered relevant factors in determining an appropriate bar exam cut score." These statements vary from the generic that most lawyers would find very important, like "maintaining the integrity of the profession," to ones that weigh almost exclusively in favor of lowering the cut score, like "declining bar exam pass rates in California."

One problem, of course, is that these rather generic statements have been tossed about in debates, but how is one supposed to decide which measures are appropriate costs and benefits? Perhaps this survey is one way of testing the profession's interests, but it's not entirely clear why two issues are being conflated: what the cut score ought to be to establish "minimum competence," and the potential tradeoffs at stake in decisions to raise or lower the cut score.

In a draft study with Rob Anderson, we identified that lower bar scores are correlated with higher discipline rates and that lowering the cut score would likely result in higher attorney discipline. But we also identified a lot of potential benefits from lowering the score, which many have raised--greater access to attorneys, lower costs for legal services for the public, and so on. How should one weigh those costs and benefits? That's the sticky question.

I'm still not sure what the "right" cut score is. But I do feel fairly certain that this survey to California attorneys is not terribly helpful in moving us toward answering that question.