Why is the ABA still accrediting law schools?

Bear with some meandering musings below....

To grossly oversimplify the history of the American Bar Association accrediting law schools, it looks something like this.

About a hundred years ago, just about anyone could take a bar exam, as long as you studied with a lawyer or "read law" for a time. Some attended law school, but it was not required. By the 1930s, states began making it more difficult to pass the bar exam--presumably in part to reduce competition for existing lawyers and make the profession more difficult to enter. (Presumably, of course, with another, more salutary, benefit of increasing the "quality" of those practicing law.)

The ABA today describes its accreditation standards as "minimal educational requirements to qualify a person for eligibility to sit for the bar examination." As state bars became more exclusionary, they began to adopt minimum standards--driven, perhaps, by the ABA itself, which has become the sole accrediting body in most jurisdictions. The bar required attendance at an accredited law school; the accreditation process was designed to ensure that legal education met standards that the ABA believed to offer a "sound program" of legal education.

All this is actually quite descriptive and lacks any normative explanation. Why should there be certain standards for legal education that must be met before someone takes the bar exam?

What is the difference between legal education and the bar exam?

It might be that we have legal education because we believe that attorneys should be, somehow, perhaps, well-rounded and well-educated individuals, apart from their ability to pass the bar exam. That would seem to be the driving concern--we think (perhaps you don't, but work with the assumption) lawyers shouldn't just be able to pass the bar and practice law; they should have some kind of training and background before they practice law and something that qualifies them apart from the bar exam's test of "minimum competence."

The ABA has a near-obsessive focus on the picayune details of how a law school functions, including the types of books the law library maintains. Many of the ABA standards are fairly generic, requiring things like "suitable" classrooms, "sufficient" space for staff, and "sound" admissions policies. The ensuing interpretations often add specific guidance to these generic standards, which drives a great deal of law school decision-making. But these are all designed to elevate the educational experience, quite apart from the ability to pass the bar exam.

Many of these standards, of course, suffer from serious deficiencies. For matters like the books in a library, they rest on antiquated notions about access to print materials from an era when books were scarce. Today, not only are books plentiful, but the resources attorneys principally use are electronic. Some standards are the result of bargains with entrenched interests within the ABA rather than of any empirical or quantifiable pedagogical benefit.

But that can be set aside for the moment--the ABA may have a goal of ensuring some kind of quality education for all prospective attorneys before they take the bar; it may simply pursue that goal ham-handedly.

But there is a different, perhaps reverse, form of the question: if legal education provides students with three years of sound education and a degree at the end, why is the bar exam even needed? Isn't graduating from a law school after three years of thickly-regulated education sufficient to make one eligible to practice law? Indeed, it's a reason why the state of Wisconsin offers "diploma privilege" to graduates of its two law schools.

The opening question, then, is really to determine the purpose of the bar exam and the purpose of accreditation of law schools.

To recap, the bar exam has perhaps less-than-noble purposes (such as limiting the pool of attorneys), and some perhaps good purposes (such as establishing minimum competence to practice law, however imperfectly the bar exam may establish that).

Legal education, in contrast, is, I think, designed to offer something beyond simply establishing "minimum competence." It offers students, perhaps ideally, an opportunity to learn about the law in a more systematic way than reading law might have permitted. That, of course, comes at a high cost for prospective attorneys, who must invest (typically) three years of education and a substantial sum of money to achieve the diploma required to take the bar.

Therefore, I think it would be fair to say that legal education is providing something distinct from the bar exam. (Whether the accreditation process is a proper assessment, and whether accreditation should be required, are separate concerns.)

If legal education is providing something distinct from the bar exam, then why is the new accreditation standard focusing on the bar exam?

So, why is the ABA still accrediting law schools given its new obsession with the ability of graduates to pass the bar exam?

Most of the rest of the ABA's accreditation practices focus upon the terms of education. It is, in theory, providing something apart from the bar exam. Now comes the new ABA standard, on track for approval, which provides, quite flatly, "At least 75 percent of a law school’s graduates in a calendar year who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation." An earlier, malleable standard has become a clean rule. It also threatens a number of schools currently out of compliance with the standard--a number likely to grow as bar pass rates continue to decline.
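The proposed rule reduces to a simple arithmetic check. A minimal sketch in Python, with made-up cohort numbers (the `meets_standard` helper and its figures are purely illustrative, not anything the ABA publishes):

```python
# Hypothetical illustration of the proposed 75% standard: for a graduating
# cohort, count everyone who sat for a bar exam within two years of
# graduation, and ask whether at least 75% of them passed.

def meets_standard(passed_within_two_years: int, sat_for_exam: int,
                   threshold: float = 0.75) -> bool:
    """Return True if the cohort's ultimate pass rate meets the threshold."""
    if sat_for_exam == 0:
        return False  # no takers; the rule's treatment of this case is unclear
    return passed_within_two_years / sat_for_exam >= threshold

# Made-up example: 140 of 200 takers pass on the first attempt, and 22 more
# pass on a retake within the two-year window, for 162/200 = 81%.
print(meets_standard(140 + 22, 200))  # True: 81% clears the 75% threshold
```

Note that the rule credits retakes within the window, so a school with a weak first-time rate can still comply if enough graduates eventually pass.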

What value is the ABA adding if its newest, most stringent control is simply redundant of, well, the bar exam itself? Why have accreditation at all?

It seems a bit self-referential to say that a law school cannot send its graduates to take the bar exam until enough of its graduates pass the bar exam--particularly as the entire point of legal education, as I've suggested, is to provide something apart from "minimum competence."

After all, it would be pretty simple for state bars to disclose the pass rates of all institutions whose graduates take the bar exam. Then consumers could know whether their school met the standards they expected or desired when attending law school.

But perhaps the consumer-protection focus of recent years has driven this result. Legal education is not really seen as offering something "apart from" anymore. It is instead deemed more a necessary and costly hurdle before taking the bar exam. And if law school graduates are going through this costly and time-consuming process but are unable to pass the bar exam, then law schools’ function is greatly diminished.

There are two principal, and opposing, kinds of responses one could make to my query.

First, I suppose one could make the claim that if a law school is not providing "minimum competence" to its graduates, then it is hardly providing the kinds of aspirational traits legal education purports to provide and should not be accredited. That's, I think, somewhat misguided. The bar exam is not really very well designed to test "minimum competence." Indeed, it's not really very well designed to test the abilities of lawyers. Timed, closed-book essays that principally rely on regurgitating black letter law (indeed, often greatly simplified, even fictitious versions of law), alongside a series of multiple choice questions, in selected areas of practice designed for the general practitioner of a common-law, 1970s-era form of the law, are not really something that should be taught in law school--at least, not emphasized.

In reality, the problem is not that law schools are failing to train their graduates with the "minimum competence" needed to practice law. Or, even the ability to pass the bar. It is that many are accepting, and then graduating, students who are unable to acquire the skills needed to pass the bar--because they are incapable of doing so, or because they are unable to transfer the three years of legal education into the licensing regime of bar test-taking, or because they have been prioritizing other things, or whatever it may be.

This is, I think, a subtle point. It is tied, I think, more closely to admissions and graduation practices, and to post-graduation study habits, than to legal education. That is, of course, because legal education is supposed to be providing something other than bar prep, and has been doing so for decades. So the decline in bar pass rates is not really a problem with "education" in the sense of the time in the classroom for three years. It is about other things.

Second, one could say that the ABA needs to have some standards for accrediting law schools, and this is as good a standard as any to help control the admissions and graduation problems that may be contributing to bar pass rate declines. But this, again, gets back to my opening question--why have accreditation at all?

If law schools aren't in the business of bar exam training (and I don't think they should be), we should still expect that law schools are graduating students who are able to use their professional degrees in the practice of law. If schools are failing in that endeavor, stripping accreditation is certainly a way of penalizing them.

But it all seems quite circuitous, given that we could just permit students to take the bar regardless of their legal education history--as long as they establish that they have the "minimum competence" to take a licensing exam, they could practice law. And if some want to attend law school to secure a credential for future employers that says, "I've attended this law school and have some additional training that establishes something beyond minimum competence," they could do that, too.

And this points back to the purposes of requiring attendance at an accredited law school in the first place. You see, my problem isn't necessarily that the ABA wants to ensure that law schools are graduating students who are able to pass the bar exam and become licensed practicing attorneys. It is, instead, that if the bar exam is our principal concern, and the principal concern is wholly independent of legal education, and now legal education is being accredited based on performance on this principal concern... doesn't that instead suggest that the accreditation process of legal education is, perhaps, its own problem now?

Concluding thoughts

If you've survived through this meandering, it's worth considering what legal education should be. Perhaps it should still try to provide something different from the "minimum competence" required to pass the bar exam.

But as some law schools have departed from practices that may best benefit their graduates--particularly in high tuition costs; entrenched and inflexible standards; and declining control over the quality of admissions, retention, and graduation practices--it may be the case that we have forgotten what law school ought to be. Its purposes have been lost as we consider it a kind of necessary rite of passage before one takes the bar exam. In this instrumental vein, distrustful of the operation of law schools, the accreditation process should look mostly at the outputs of graduates.

I don't think that's a welcome development, either on the accreditation end or on the telos of legal education. But it's perhaps the necessary evil that has come upon us, until either schools change their practices or the market improves dramatically. Even then, it will be hard to separate legal education from the bar exam, and that loss speaks more about why the ABA is still accrediting schools in the first place--or why state bars require legal education before taking the bar exam.

Despite improvement in MBE scores, bar exam pass rates appear to be falling yet again

I blogged earlier about the slight improvement in MBE scores, which, I thought, might lead to an increase in the overall bar pass rates. But I was in a wait-and-see mode, because, while MBE scores typically track bar pass rates, we needed to see results from jurisdictions to see if that would be the case this year.

It appears, however, that even though MBE scores rose, bar exam pass rates are declining again.

I continue to track overall pass rates (rather than first-time or first-time ABA rates) because that's often the only data many jurisdictions disclose--but first-time pass rates are often much better. These are most of the jurisdictions that have publicly disclosed overall pass rates.

Bar pass rates have improved slightly in a couple of jurisdictions--Kansas rose from 76% to about 79%, and West Virginia from 69% to about 71%. But there are some declines--and some fairly significant ones--elsewhere. Missouri's fell from 84% to 79%--it was 91% in 2012. Indiana's dropped 13 points, from 74% to 61%. Iowa's dropped 15 points, from 86% to 71%--it was 90% in 2012.

Assuming all else equal, an increase in the mean MBE score would have meant an increase in bar pass rates. But why are we seeing declines in so many jurisdictions?

My initial theory--unsupported by any evidence!--would be that the best students were sufficiently worried about the bar exam and studied more than ever. That would mean that the scores of people already inclined to pass the bar exam improved--and that wouldn't have any impact on the pass rates. It would shift up the mean score of the MBE without affecting the overall pass rates. And, if the quality of students law schools have been graduating has continued to decline, then we might expect to see overall pass rates decline.

It's also possible that these jurisdictions are outliers and we'll see improvement in pass rates in places like New York and California. (The small decline in the pass rate in Florida, however, is not a good sign on this front.)

In short, there was some excitement and enthusiasm about the uptick in the MBE scores. But if that uptick isn't translating into improved bar pass rates, law schools need to be seriously considering responses to continued declines in the months ahead.

(It's worth noting that I chose a non-zero y-axis to demonstrate the relative changes in performance; overall, a majority of test-takers continue to pass the bar in each jurisdiction.)

Why weren't bar exam pass rates an existential crisis in the 1980s?

I blogged about the small improvement in Multistate Bar Exam ("MBE") scores in the July 2016 administration of the test. We won't know what first-time pass rates from ABA-accredited law schools are for some time, but it's fair to assume we should see a small improvement nationally.

The drop in test scores--likely caused in part by a decline in the quality of the applicant pool over the last several years--has caused quite an uproar, particularly as the ABA considers clamping down on schools with relatively low pass rates.

But if you look at MBE scores from the 70s and 80s, the peak time for Baby Boomers to be completing legal education, you'll notice that their scores are fairly comparable to the scores in the last two years.

So if MBE scores look a lot like they did back then, why is there such a commotion about them? Perhaps a few reasons.

First, expectations have changed. Gone are the days with the mythic "look to your left, look to your right" fears of dismissal. There is an expectation that virtually all law school enrollees complete their JD, and another expectation that those who secure the JD and take the bar will pass the bar. The challenges are no longer deemed to be the failure rates in law school or the bar exam, but in the process that one must "survive." A dip in bar pass rates upsets existing expectations.

Second, the fiscal consequences have changed. Indebtedness of students at graduation is quite high (for many), especially when law school loans are coupled with undergraduate loans (and sometimes credit card debt). Indebtedness has outpaced inflation over the decades. For students pressed with this debt, the prospect of failing the bar exam--and likely delaying or losing job opportunities--is more significant.

Third, the bar looks different today, and pass rates may differ despite similar MBE scores. Perhaps the sample of jurisdictions was just a little smaller in the 1980s--and perhaps the jurisdictions using the MBE then had disproportionately lower pass rates. There's little question that states that have administered their own bar exams (like Louisiana) can have more inconsistent results. Consider a recent example from Oklahoma, which adopted the Uniform Bar Exam and saw pass rates plunge so significantly that it modified the passing score. Perhaps, then, the MBE score in the 1980s was not as indicative of overall pass rates--but that's more a lack of data on my end.

It's good to see the MBE score improve slightly. In an absolute sense, unfortunately, the pass rates will not approach what they were a few years ago. But while bar pass rates are historically low, the history is worth reflecting upon.

July 2016 bar exam scores improve slightly but remain near all-time lows

The good news for recent law school graduates? The July 2016 pass rates for test-takers will likely increase slightly nationally. As Deborah Merritt recently shared, the mean Multistate Bar Exam score rose from 139.9 in July 2015 to 140.3 in July 2016. Professor Merritt offers a few reasons why scores improved slightly, and I won't add to her good thoughts (despite other possible reasons that may come to light later!).

Given that bar exam scores hit a 27-year low last July, this is surely good news--particularly as the incoming predictors of law students across the nation continued to decline between the entering classes in 2012 and 2013. But there is an important matter of perspective: a 140.3 is still near all-time lows.

The July 2016 score is the second-lowest since 1988 (the lowest being July 2015), and still well off the mark of even the July 2014 score, much less the July 2013 score. In an absolute sense, the score is not good. Indeed, while modest improvements in the bar passage rates in most jurisdictions will be good news for those passing students and for law schools looking for any positive signs, they will not approach the pass rates of three or four years ago.

We should see pass rates from states like North Carolina and Oklahoma soon. As the fall wanes, we'll see more individual jurisdictions and more results from specific schools. And perhaps we'll see if dramatic changes occur in a few places or at a few schools--or whether the change is small and relatively uniform everywhere.

Some mixed results for February 2016 first-time bar exam test-takers

I depicted earlier the 33-year low in February Multistate Bar Exam scores from the February 2016 test. But, as one commenter noted, that doesn't necessarily tell us a whole lot about the first-time pass rate, particularly if there is a glut of repeaters taking the bar and inflating the failure rate. Now that results are trickling in, we are seeing declining pass rates among first-time test-takers--but, in at least a piece of good news for bar test-takers, the declines are not as sweeping and across-the-board as we've seen before.

The results are decidedly more mixed, albeit with more in the negative direction than the positive. Admittedly, these are only a handful of jurisdictions. Additionally, the February pool of test-takers is much smaller than the July pool, which may skew results. The following results are the year-over-year changes in pass rates, listed only among first-time test-takers where those statistics were available. (More states disclosed overall pass rates, but first-time rates tend to be more valuable for identifying trends among recent law school graduates.)

Florida, -6 points (February 2015: 64%; February 2016: 58%)

Indiana, -12 points (February 2015: 74%; February 2016: 62%)

New Mexico, -9 points (February 2015: 90%; February 2016: 81%)

Oregon, unchanged (February 2015: 69%; February 2016: 69%)

Pennsylvania, +5 points (February 2015: 69%; February 2016: 74%)

Tennessee, +4 points (February 2015: 64%; February 2016: 68%)

Washington, -4 points (February 2015: 75%; February 2016: 71%)

February 2016 MBE bar exam scores drop to lowest point since 1983

I've written extensively about the bar exam, including the significant decline in bar exam scores, specifically the Multistate Bar Exam, and the corresponding decline in pass rates in most jurisdictions. The February 2016 results are the fourth consecutive exam to display a significant decline in MBE scores. In fact, it's the lowest score on the February test since 1983--even worse than the July 2015 results, which were the lowest since 1988. Below is a visualization of February test scores since 2005--note the precipitous drop in the last two tests. (I may visualize results since 1983 in the future.)

This will likely mean a decline in pass rates in most jurisdictions, news of which will trickle out over the next several weeks. The decline in scores continues to correlate with declines in student quality, as law schools admitted classes with an increased number of students at risk of failing the bar exam. Whether other factors contribute to the decline remains an open question. But this helps illustrate that the problems are not one-time issues as the result of ExamSoft--they are structural and long-term issues with significant consequences. I'll blog more about this in the near future.

Will states like California lower their bar standards to help schools comply with new ABA mandate?

Unintended consequences are common. One develops a great idea; it takes form; it is discussed and debated; and, finally, it takes effect. But it may result in unintended consequences--and it's always fascinating to think about what those might be. I've extensively discussed unintended consequences of matters such as LSAT administration, accommodated LSAT test-taking, and distortions in law school admissions.

The American Bar Association has moved closer to approving a new accreditation standard: at least 75 percent of a school's law graduates who take the bar exam must pass it within two years. It is a much simpler rule than the previous standard, and it holds schools to a higher standard.

Might there be unintended consequences? Many schools currently fail that standard. Professor Brian Leiter rightly wonders whether schools will focus more on bar prep than other aspects of legal education. It is also likely that many schools will seriously reconsider their class sizes, admissions standards, academic dismissal rates, and transfer students.

But it's also worth noting that not all state bar exams are created equal. Perhaps nothing makes that point so clearly as looking at the passing scores required for the Uniform Bar Exam, a standardized bar exam with a single score, and varying scores required for admission in different states. A 260 will pass in Minnesota or Alabama, while a 280 is required to pass in Alaska or Idaho. My colleague Rob Anderson has identified the varying degrees of difficulty of many states' bar exams. And California is at the top--I've identified how California bar test-takers are more capable than test-takers in other states, but they fail at higher rates because of the difficulty of the bar.

So take a state like California. It is very likely that a number of schools will face serious difficulty meeting this standard--first-time rates for many schools are well below 50%, much less 75%, and even counting students who retake the test, reaching the 75% threshold may be a challenge.

Some schools may begin to "export" students to jurisdictions with easier exams and higher pass rates--perhaps incentivizing them with stipends on the condition they take the exam in an easier jurisdiction.

But that's a potential unintended consequence that is school-centered. Might there be bar-centered consequences?

Suppose the state bar of California suddenly finds that four or five of its law schools are at risk of losing ABA accreditation. While some may praise that outcome, it's not clear that the state bar would do so. It might be inclined to lower its standards to increase pass rates (more in line with other states) and keep its schools in the ABA's good graces. Other states with particularly difficult bar exams, or with law schools that have significant political clout, may do the same.

Of course, this is speculative. And I make no claim as to whether such decisions would be good or bad--one could think some state bars are too difficult and that the pass rates should be increased, or one could think that the bar should not lower its standards. Instead, it's simply to identify some of the potential consequences that may come about from proposals like this. Only time will tell whether such consequences actually arise.

Heat and light, LSAT scores and bar passage data

If you at all frequently read this blog, you're undoubtedly aware that what largely began as my idiosyncratic thoughts about election law have given way to a significant amount of content on legal education and the bar exam.

Recently, many pixels have been used to discuss the utility of the LSAT, and the relationship between LSAT scores and bar pass rates, which has spurred many larger discussions about the nature of legal education. They are easily discoverable.

Bernie Burk several months ago used the metaphor of heat and light in the midst of some such discussions, which I found quite useful. And I commend to all readers Jerry Organ's comments at the Legal Whiteboard, The Opaqueness of Bar Passage Data and the Need for Greater Transparency. Measured, careful, thoughtful analysis is the analysis I find most useful in such discussions--ones that not only concede limitations, but do not minimize such concessions. I remain deeply grateful for the thoughtful contributors in this space who have spurred me to think carefully and critically on all fronts. And I hope my posts remain useful.

Visualizing the overall bar pass rate declines in 2015 across jurisdictions

In early September, I highlighted the warning signs of bar pass rate declines in several jurisdictions. Shortly thereafter, the NCBE disclosed that MBE scores had hit a 27-year low. Last year, I offered a rough compilation of the decline in overall pass rates, suggesting that ExamSoft was not to blame but that the MBE itself may have contributed to the decline. (Later evidence convinced me that the MBE was likely not responsible for the decline.) By March, we had more granular data for jurisdiction-by-jurisdiction results.

Here are the changes in overall bar pass rates between July 2014 and July 2015, for a handful of jurisdictions with readily disclosed top-line data.

The overall declines are far from universal in these jurisdictions, and the median decline is about 2 points. But several jurisdictions did experience overall declines of at least 5 points.

Cobbling together the overall results from July 2013 to July 2015, a two-year change, the trends are fairly stark in most jurisdictions, often joining together significant declines in 2014 with modest declines in 2015.

The entering class profiles for the next few law school classes suggest that these trends will continue, at least to a small degree--the total degree remains something of an open question. How schools react, or how prospective bar examinees react, may further change these projections. And next spring, we'll have the data for the first-time bar-pass rates in these jurisdictions, which will provide a slightly more useful comparison of the overall trends.