The best ways to visualize the impact of the decline in bar passage scores

I've posted a number of visualizations of the decline in bar exam scores and bar passage rates over the last few years, including a post on the February 2017 decline here. For some reason, that post drew particular criticism as deceptive. It caused me to think a little more about how best to visualize--and explain--what the decline in multistate bar exam ("MBE") scores might mean. (I'll channel my inner Tufte and see what I can do....)

In the February 2017 chart, I didn't start the Y-axis at zero. And why should I? No one scores a zero. The very lowest scores are something in the 50s to 90s. And the score is on a 200-point scale, but no one gets a 200. So I suppose I could visualize it on the low to high ends--say, 90 to 190.

When the scores are charted that way, the decline looks completely unremarkable. MBE scores have dipped a bit, but they've hardly moved at all. And it makes my last post look like simple clickbait. (It's worth noting I generate no revenue from this site!)

But that surely can't be right, either. After all, bar passage rates have been declining fairly sharply in the last few years even if this mean score has only moved relatively nominally. (For extensive discussion, see the "Bar exam" category on this blog.)

That's because what really matters is the passing score or the "cut score" in each jurisdiction.

Suppose the cut score in a jurisdiction is 100. A decline from a mean score of 135 to 134 should have essentially no effect if the results are distributed along a typical bell curve (and they usually are). That's because virtually everyone would still pass even if scores dropped a bit. In contrast, if the cut score were 180, a decline from a mean score of 135 to 134 should also have essentially no effect--virtually everyone would still fail.
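To put rough numbers on that intuition, here's a minimal sketch--my own illustration assuming a normal distribution of scaled scores with a standard deviation of 15, not actual NCBE data--of how the share of examinees clearing various hypothetical cut scores changes when the mean slips from 135 to 134:

```python
# Rough sketch, not NCBE data: assume scaled MBE scores are normally
# distributed with an illustrative standard deviation of 15, and compare
# the share of examinees clearing several cut scores when the mean
# slips from 135 to 134.
from scipy.stats import norm

SD = 15  # assumed spread of scaled scores; the real figure may differ

for cut in (100, 135, 180):
    pass_before = 1 - norm.cdf(cut, loc=135, scale=SD)
    pass_after = 1 - norm.cdf(cut, loc=134, scale=SD)
    print(f"cut {cut}: {pass_before:.1%} -> {pass_after:.1%} "
          f"(change of {pass_after - pass_before:+.1%})")
```

Under those assumptions, a one-point drop in the mean barely budges the pass rate when the cut score sits at 100 or 180, but it costs a couple of percentage points when the cut score sits right around the mean--which is the dynamic described above.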

But the reason for the perilous drop in bar pass rates is that this is exactly the spot where the mean scores have begun to hit the cut scores in many jurisdictions. Here's a visualization of what that looks like, with a couple of changes--a wider y-axis, historical data for the February bar back to 1976, and gridlines identifying the cut scores in several jurisdictions. (It's worth noting that this is the national MBE mean, not individualized state means; July scores are somewhat higher; and it is a mean, not a median.)

You can see that the drop in the means plunges scores past what have been cut scores in many jurisdictions.

Here's one more way of explaining why a drop at this point of the bell curve is particularly significant. The NCBE has not yet released the distributions of scores, but the bell curve linked above should be instructive, and the change from 2011 to 2016 is useful to consider.

In February 2011, just 39.6% of all test-takers had a score of 135.4 or lower. 13.7% had a score in the range of 135.5 to 140.4, and 46.6% had a score of 140.5 or higher. (Consider the chart above for reference as to what those scores might mean.) In February 2016, however, 51.1% of all test-takers had a score of 135.4 or lower, an 11.5-percentage-point jump. 13.7% had a score in the range of 135.5 to 140.4, and just 35.1% had a score of 140.5 or higher.

That's because this particular drop in the score is at a very perilous spot on the curve. Bar takers are performing just a little worse in a relative sense. But when the distribution of performance is put up against the cut score, this is precisely the point that would have the most dramatic national impact.

I hope these explanations help illustrate what's happening on the bar exam front--and, of course, I welcome corrections or feedback to improve these visualizations in the future!

February 2017 MBE bar scores collapse to all-time record low in test history

UPDATE: Some wondered about the scale used for the visualization below, and I respond with some thoughts in a subsequent blog post.

On the heels of the February 2016 multistate bar exam (MBE) scores reaching a 33-year low, including a sharp drop in recent years, and a small improvement in the July 2016 test while scores remained near all-time lows, we now have the February 2017 statistics, courtesy of Pennsylvania (PDF). After a drop from 136.2 to 135 last year, scores dropped another full point to 134. It likely portends a drop in overall pass rates in most jurisdictions.

This is the lowest February score in the history of aggregated MBE results. (The test was first introduced in 1972 but, as far as I know, national aggregate statistics begin in 1976, as the available data demonstrates.) The previous record low was 134.3 in 1980.

It's worth noting that the February 2017 test had a small change in its administration: rather than 190 questions that were scaled into the score and 10 experimental questions, the split on this exam was 175/25. It's unlikely (PDF) this caused much of a change, but it's worth noting as a factor to think about. Nor is it because the MBE was "harder" than usual. Instead, it primarily reflects continued fall-out from law schools accepting more students of lower ability, then graduating those students who go on to take the bar exam. Given the relatively small cohort that takes the February test, it's anyone's guess what this portends for the July 2017 test.

Visualization note: the non-zero Y axis is designed to demonstrate recent relative performance of bar scores, not absolute scores.

California's move to a two-day bar exam might affect some schools more than others

I was among the first to discuss California's planned move from a three-day bar exam to a two-day bar exam. The first two-day exam will occur in the July 2017 administration.

The old three-day model weighted the Multistate Bar Exam component (the 6-hour multiple choice test) at about 1/3 of the overall score, and the other two days of essays at about 2/3 of the overall score. When the bar studied the issue, it found little difference in assessing aptitude or in scoring between a 1/3-2/3 model and a two-day bar where both sections would be weighted roughly equally (as most states do).
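For a sense of the arithmetic, here's a minimal sketch under a couple of simplifying assumptions of my own--that both components are reported on California's 2000-point scale and that the weights are exactly 1/3-2/3 and 1/2-1/2--using purely hypothetical component scores:

```python
# Hedged sketch of the re-weighting arithmetic, not the bar's published
# formula: assume the MBE and written components are each scaled to the
# 2000-point scale before being combined.
def combined_score(mbe_scaled, written_scaled, mbe_weight):
    """Weighted total on the 2000-point scale."""
    return mbe_weight * mbe_scaled + (1 - mbe_weight) * written_scaled

mbe, written = 1500, 1400  # hypothetical component scores
old = combined_score(mbe, written, mbe_weight=1/3)  # three-day model
new = combined_score(mbe, written, mbe_weight=1/2)  # two-day model
print(f"old weighting: {old:.0f}; new weighting: {new:.0f} (passing score: 1440)")
```

An examinee (or school) that performs relatively better on the MBE than on the written portion gains ground under the equal-weighting model, and vice versa--the hypothetical examinee above moves from just below 1440 to just above it.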

That's true at the macro level. For individual test-takers, of course, that can vary wildly. And even at the school level, we may see somewhat noticeable differences between the MBE scores and the essay scores.

Thanks to a pretty sizeable disclosure from the California bar, we can assess how individual schools fared on the bar, and what their scores would look like if scored under the July 2017 1/2-1/2 model.

This, of course, has many limitations, which I'll start listing here. First, these are the mean scores; they correlate highly with pass rates, but not perfectly. Note that Stanford's mean score blows all other schools out of the water, but its first-time pass rate is only a few percentage points better than others. That means movement up or down in the mean scores would likely improve or worsen the pass rate, but in measures not immediately ascertainable. Second, just because the bar was scored this way in July 2016 does not mean we would expect graduates of these schools to perform similarly in 2017. Indeed, evidence like this would probably drive a change in bar study habits! Graduates would be inclined to focus more attention on the MBE and less attention on the essays, which would change the scores in unknown ways.

The chart at the right shows in red circles what schools' mean scores were this July under the 1/3-2/3 scoring model. The blue circles are what the scores would have been under the 1/2-1/2 model. (Recall that a passing score in California is a 1440.) As you can see, there is almost no difference for most schools. I flagged four schools that might see the biggest changes--San Diego's for the better; and Irvine, San Francisco, and Thomas Jefferson for the worse.

And recall the caveats above--this does not mean it will translate into demonstrable differences in the pass rate, and past performance is not an indicator of future success. This is particularly true for the three schools I identified that might expect lower means--Irvine is well above the passing score, and San Francisco and Thomas Jefferson are well below it, meaning marginal differences in the mean score would probably affect very few. (For schools closer to the 1440 score, we might expect slightly larger differences, again with the significant caveats listed above about the limited value of using the means.) But it should certainly shift attention in graduate preparation next summer--and whether that changes scores remains to be seen.

The collapse of bar passage rates in California

My colleague Paul Caron has helpfully displayed data on the performance of California law schools on the July 2016 California bar exam. It's worth noting that the results aren't simply bad for many law schools; they represent a complete collapse of scores in the last three years.

The chart here shows the performance of first-time California bar test-takers who graduated from California's 22 ABA-accredited law schools in the July 2013, 2014, 2015, and 2016 administrations of the exam. The blue line in the middle is the statewide average among California's ABA-accredited law schools. (The overall passage rate among all ABA-accredited law schools is usually a point or two lower than this average.)

The top performers are mostly unchanged from their position a few years ago. The middle performers decline at roughly the rate of the statewide average. But the bottom performers show dramatic declines: from 65% to 22%, and from 75% to 36%, to identify two of the most dramatic declines.

It's true that changes to the applicant pool have dramatically impacted law schools, as I identified three years ago, and that continues to hold true. There have been fewer applicants to law schools; those applicants are often less qualified--with lower LSAT scores and UGPAs than previous classes; and schools are not shrinking their class sizes quickly enough to respond to the decline in quality. Some of the more at-risk schools also face significant attrition each year as their very best students transfer to higher-ranked institutions, further diluting the quality of the graduating classes. (I've also occasionally read critiques that law schools are not "doing enough to prepare" students to take the bar exam, but I highly doubt law schools have dramatically changed their pedagogy over the last few years in a way that would cause such a decline.)

And the decline in bar pass rates in 2014 was only the first stage of a longer decline in scores, as I explained back then. It's not even clear that pass rates have reached bottom.

I noted earlier this year that the new mandate from the ABA that 75% of a law school's graduates must pass the bar exam within two years of graduation will uniquely impact California--despite bar test-takers being far more able in California, they fail at much higher rates. Whether bar pass rates will improve for some of these schools in the future, or whether the state bar intervenes to ease its scoring practices, remains to be seen.

Note: I did not start my Y-axis at 0% to avoid unnecessary white space at the bottom of the graph, and it is designed to show relative performance rather than absolute performance.

Why is the ABA still accrediting law schools?

Bear with some meandering musings below....

To grossly oversimplify the history of the American Bar Association accrediting law schools, it looks something like this.

About a hundred years ago, just about anyone could take a bar exam, as long as they studied with a lawyer or "read law" for a time. Some attended law school, but it was not required. By the 1930s, states began making it more difficult to pass the bar exam--presumably in part to reduce competition for existing lawyers and make the profession more difficult for individuals to enter. (Presumably, of course, with another, at least salutary, benefit of increasing the "quality" of those practicing law.)

The ABA today describes its accreditation standards as "minimal educational requirements to qualify a person for eligibility to sit for the bar examination." As state bars became more exclusionary, they began to adopt minimum standards--driven, perhaps, by the ABA itself, which has become the sole accrediting body in most jurisdictions. The bars required attendance at an accredited law school; the accreditation process was designed to ensure that legal education met standards that the ABA believed constituted a "sound program" of legal education.

All this is actually quite descriptive and lacks any normative explanation. Why should there be certain standards for legal education that must be met before someone takes the bar exam?

What is the difference between legal education and the bar exam?

It might be that we have legal education because we believe that attorneys should be, somehow, perhaps, well-rounded and well-educated individuals, apart from their ability to pass the bar exam. That would seem to be the driving concern--we think (perhaps you don't, but work with the assumption) lawyers shouldn't just be able to pass the bar and practice law; they should have some kind of training and background before they practice law and something that qualifies them apart from the bar exam's test of "minimum competence."

The ABA has a near obsessive focus on the picayune details of how a law school functions, including the types of books the law library maintains. Many of the ABA standards are fairly generic, requiring things like "suitable" classrooms, "sufficient" space for staff, and "sound" admissions policies. Ensuing interpretations often add specific guidance to these generic standards, which drive a great deal of law school decision-making. But these are all designed to elevate the educational experience, quite apart from the ability to pass the bar exam.

Many of these standards, of course, suffer from serious deficiencies. For matters like the books in a library, the standards reflect antiquated notions about access to print materials from an era when books were scarce. Today, not only are books plentiful, but the resources attorneys principally use are electronic. Some standards are the result of bargains with entrenched interests within the ABA rather than of any empirical or quantifiable pedagogical benefit.

But that can be set aside for the moment--the ABA may have a goal of providing some kind of quality education for all prospective attorneys before they take the bar, even if it pursues that goal ham-handedly.

But there is a different, perhaps reverse, form of the question: if legal education provides students with three years of sound education and a degree at the end, why is the bar exam even needed? Isn't graduating from a law school after three years of thickly-regulated education sufficient to make one eligible to practice law? Indeed, it's a reason why the state of Wisconsin offers "diploma privilege" to graduates of its two law schools.

The opening question, then, is really to determine the purpose of the bar exam and the purpose of accreditation of law schools.

To recap, the bar exam has perhaps less-than-noble purposes (such as limiting the pool of attorneys), and some perhaps good purposes (such as establishing minimum competence to practice law, however imperfectly the bar exam may establish that).

Legal education, in contrast, is, I think, designed to offer something beyond simply establishing "minimum competence." It, perhaps, and perhaps ideally, offers students an opportunity to learn about the law in a more systematic way than reading law might have permitted. That, of course, comes at a high cost for prospective attorneys who must invest (typically) three years of education and a substantial sum of money to achieve the diploma required to take the bar.

Therefore, I think it would be fair to say that legal education is providing something distinct from the bar exam. (Whether the accreditation process is a proper assessment, and whether accreditation should be required, are separate concerns.)

If legal education is providing something distinct from the bar exam, then why is the new accreditation standard focusing on the bar exam?

So, why is the ABA still accrediting law schools given its new obsession with the ability of graduates to pass the bar exam?

Most of the rest of the ABA's accreditation practices focus upon the terms of education. Legal education is, in theory, providing something apart from the bar exam. Now comes the new ABA standard, on track for approval, which provides, quite flatly, "At least 75 percent of a law school’s graduates in a calendar year who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation." An earlier, more malleable standard has become a clean rule. It also threatens a number of schools currently in non-compliance with this standard--a problem likely to get worse as bar pass rates continue to decline.

What value is the ABA adding if its newest, most stringent control is simply redundant of, well, the bar exam itself? Why have accreditation at all?

It seems a bit self-referential to say that a law school cannot send its graduates to take the bar exam until enough of its graduates pass the bar exam--particularly as the entire point of legal education, as I've suggested, is to provide something apart from "minimum competence."

After all, it would be pretty simple stuff for state bars to simply disclose the pass rates of all institutions whose graduates take the bar exam. Then consumers could know if their school met the standards that they expected or desired when attending law school.

But perhaps it is the consumer-protection focus of recent years that has driven this result. Legal education is not really seen as offering something "apart from" the bar exam anymore. It is instead deemed more a necessary and costly hurdle before taking the bar exam. And if law school graduates are going through this costly and time-consuming process but are unable to pass the bar exam, then law schools’ function is greatly diminished.

There are two principal, and opposing, kinds of responses one could make to my query.

First, I suppose one could claim that if a law school is not providing "minimum competence" to its graduates, then it is hardly providing the kinds of aspirational traits legal education purports to provide and should not be accredited. That's, I think, somewhat misguided. The bar exam is not really very well designed to test "minimum competence." Indeed, it's not really very well designed to test the abilities of lawyers. Timed, closed-book essays that principally rely on regurgitating black letter law (indeed, often greatly simplified, even fictitious versions of law), alongside a series of multiple choice questions, covering selected areas of practice designed for general practitioners of a common-law, 1970s-era form of the law--these are not really things that should be taught, or at least emphasized, in law school.

In reality, the problem is not that law schools are failing to train their graduates with the "minimum competence" needed to practice law. Or, even the ability to pass the bar. It is that many are accepting, and then graduating, students who are unable to acquire the skills needed to pass the bar--because they are incapable of doing so, or because they are unable to transfer the three years of legal education into the licensing regime of bar test-taking, or because they have been prioritizing other things, or whatever it may be.

This is, I think, a subtle point. It is tied, I think, more closely to admissions and graduation practices, and to post-graduation study habits, than to legal education itself. That is, of course, because legal education is supposed to be providing something other than bar prep, and has been doing so for decades. So, the decline in bar pass rates is not really a problem with "education" in the sense of the time in the classroom for three years. It is about other things.

Second, one could say that the ABA needs to have some standards for accrediting law schools, and this is as good a standard as any to help control the admissions and graduation problems that may be contributing to bar pass rate declines. But this, again, gets back to my opening question--why have accreditation at all?

If law schools aren't in the business of bar exam training (and I don't think they should be), we should still expect that law schools are graduating students who are able to use their professional degrees in the practice of law. If schools are failing in that endeavor, stripping accreditation is certainly a way of penalizing them.

But it all seems quite circuitous, given that we could just permit students to take the bar regardless of their legal education history--as long as they establish that they have the "minimum competence" to take a licensing exam, they could practice law. And if some want to attend law school to secure a credential for future employers that says, "I've attended this law school and have some additional training that establishes something beyond minimum competence," they could do that, too.

And this points back to the purposes of requiring attendance at an accredited law school in the first place. You see, my problem isn't necessarily that the ABA wants to ensure that law schools are graduating students who are able to pass the bar exam and become licensed practicing attorneys. It is, instead, that if the bar exam is our principal concern, and that principal concern is wholly independent of legal education, and now the accrediting body is judging legal education based on performance on this principal concern... doesn't that instead suggest that the accreditation process of legal education is, perhaps, its own problem now?

Concluding thoughts

If you've survived through this meandering, it's worth considering what legal education should be. Perhaps it should still try to provide something different from the "minimum competence" required to pass the bar exam.

But as some law schools have departed from practices that may best benefit their graduates--particularly in high tuition costs; entrenched and inflexible standards; and declining control over the quality of admissions, retention, and graduation practices--it may be the case that we have forgotten what law school ought to be. Its purposes have been lost as we consider it a kind of necessary rite of passage before one takes the bar exam. In this instrumental vein, distrustful of the operation of law schools, the accreditation process should look mostly at the outputs of graduates.

I don't think that's a welcome development, either on the accreditation end or for the telos of legal education. But it's perhaps the necessary evil that has come upon us, until either schools change their practices or the market improves dramatically. Even then, it will be hard to separate legal education from the bar exam, and that loss says much about why the ABA is still accrediting schools in the first place--or why state bars require legal education before taking the bar exam.

Despite improvement in MBE scores, bar exam pass rates appear to be falling yet again

I blogged earlier about the slight improvement in MBE scores, which, I thought, might lead to an increase in the overall bar pass rates. But I was in a wait-and-see mode, because, while MBE scores typically track bar pass rates, we needed to see results from jurisdictions to see if that would be the case this year.

It appears, however, that even though MBE scores rose, bar exam pass rates are declining again.

I continue to track overall pass rates (rather than first-time or first-time ABA rates) because that's often the only data many jurisdictions disclose--but first-time pass rates are often much better. These are most of the jurisdictions that have publicly disclosed overall pass rates.

Bar pass rates have improved slightly in a couple of jurisdictions--Kansas rose from 76% to about 79%, and West Virginia from 69% to about 71%. But there are some declines--and some fairly significant ones--elsewhere. Missouri's fell from 84% to 79%--it was 91% in 2012. Indiana's dropped 13 points, from 74% to 61%. Iowa's dropped 15 points, from 86% to 71%--it was 90% in 2012.

Assuming all else is equal, an increase in the mean MBE score would have meant an increase in bar pass rates. But why are we seeing declines in so many jurisdictions?

My initial theory--unsupported by any evidence!--would be that the best students were sufficiently worried about the bar exam and studied more than ever. That would mean that the scores of people already inclined to pass the bar exam improved--and that wouldn't have any impact on the pass rates. It would shift up the mean score of the MBE without affecting the overall pass rates. And, if the quality of students law schools have been graduating has continued to decline, then we might expect to see overall pass rates decline.
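A toy simulation--my own illustration under an assumed normal distribution of scores, not actual data--shows the mechanics: boost only the scores of examinees already above the cut score, and the mean rises while the pass rate stays flat.

```python
# Toy simulation: if only examinees already above the cut score improve,
# the mean MBE score rises while the pass rate stays flat.
import numpy as np

rng = np.random.default_rng(0)
cut = 135
scores = rng.normal(loc=140, scale=15, size=100_000)  # assumed baseline cohort

# Suppose the stronger test-takers study harder and gain two points each.
boosted = np.where(scores >= cut, scores + 2, scores)

print(f"mean score: {scores.mean():.1f} -> {boosted.mean():.1f}")
print(f"pass rate:  {(scores >= cut).mean():.1%} -> {(boosted >= cut).mean():.1%}")
```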

It's also possible that these jurisdictions are outliers and we'll see improvement in pass rates in places like New York and California. (The small decline in the pass rate in Florida, however, is not a good sign on this front.)

In short, there was some excitement and enthusiasm about the uptick in the MBE scores. But if that uptick isn't translating into improved bar pass rates, law schools need to be seriously considering responses to continued declines in the months ahead.

(It's worth noting that I chose a non-zero y-axis to demonstrate the relative changes in performance; overall, a majority of test-takers continue to pass the bar in each jurisdiction.)

Why weren't bar exam pass rates an existential crisis in the 1980s?

I blogged about the small improvement in Multistate Bar Exam ("MBE") scores in the July 2016 administration of the test. We won't know what first-time pass rates from ABA-accredited law schools are for some time, but it's fair to assume we should see a small improvement nationally.

The drop in test scores--likely caused in part by a decline in the quality of the applicant pool over the last several years--has caused quite an uproar, particularly as the ABA considers clamping down on schools with relatively low pass rates.

But if you look at MBE scores from the 70s and 80s, the peak time for Baby Boomers to be completing legal education, you'll notice that their scores are fairly comparable to the scores in the last two years.

So if MBE scores look a lot like they did back then, why is there such a commotion about them? Perhaps a few reasons.

First, expectations have changed. Gone are the days with the mythic "look to your left, look to your right" fears of dismissal. There is an expectation that virtually all law school enrollees complete their JD, and another expectation that those who secure the JD and take the bar will pass the bar. The challenges are no longer deemed to be the failure rates in law school or the bar exam, but in the process that one must "survive." A dip in bar pass rates upsets existing expectations.

Second, the fiscal consequences have changed. Indebtedness of students at graduation is quite high (for many), especially when law school loans are coupled with undergraduate loans (and sometimes credit card debt). Indebtedness has outpaced inflation over the decades. For students pressed with this debt, the prospect of failing the bar exam--and likely delaying or losing job opportunities--is more significant.

Third, the bar looks different today, and pass rates may differ despite similar MBE scores. Perhaps the sample size was just a little smaller in the 1980s--and perhaps the jurisdictions then administering the MBE had disproportionately lower pass rates. There's little question that states that have administered their own bar exams (like Louisiana) can have more inconsistent results. Consider a recent example from Oklahoma, which adopted the Uniform Bar Exam and saw pass rates plunge so significantly that it modified the passing score. Perhaps, then, the MBE score in the 1980s was not as indicative of overall pass rates--but that's more a gap in the data available to me.

It's good to see the MBE score improve slightly. In an absolute sense, unfortunately, the pass rates will not approach what they were a few years ago. But while bar pass rates are historically low, the history is worth reflecting upon.

July 2016 bar exam scores improve slightly but remain near all-time lows

The good news for recent law school graduates? The July 2016 pass rates for test-takers will likely increase slightly nationally. As Deborah Merritt recently shared, the mean Multistate Bar Exam score rose from 139.9 in July 2015 to 140.3 in July 2016. Professor Merritt offers a few reasons why scores improved slightly, and I won't add to her good thoughts (despite other possible reasons that may come to light later!).

Given that bar exam scores hit a 27-year low last July, this is surely good news--particularly as the incoming predictors of law students across the nation continued to decline between the entering classes in 2012 and 2013. But there is an important matter of perspective: a 140.3 is still near all-time lows.

The July 2016 score is the second-lowest since 1988 (the lowest being July 2015), and still well off the mark of even the July 2014 score, much less the July 2013 score. In an absolute sense, the score is not good. Indeed, while modest improvements in the bar passage rates in most jurisdictions will be good news for those passing students and for law schools looking for any positive signs, they will not approach the pass rates of three or four years ago.

We should see pass rates from states like North Carolina and Oklahoma soon. As the fall wanes, we'll see more individual jurisdictions and more results from specific schools. And perhaps we'll see if dramatic changes occur in a few places or at a few schools--or whether the change is small and relatively uniform everywhere.