Bar exam scores hit 27-year low

After my earlier post detailing the drop in bar pass rates across most jurisdictions, Bloomberg reports that the mean scaled MBE score has dropped yet again, this time by 1.6 points, to its lowest level since 1988.

Last year, I blogged about the drop in scores on the MBE, the 200-question multiple-choice test that serves as the primary objective scoring component for bar passage, noting that it was the single largest drop in MBE history. This year's decline is by no means as historic, but the score is among the lowest in the history of the test. I've updated last year's charts here.

July 2015 bar exam results again show declining pass rates almost everywhere: outliers, or a sign of more carnage?

This post has been updated.

Many speculated that the July 2014 bar passage results were anomalously low on account of some failure in the exam, whether software glitches or some yet-undescribed problem with the National Conference of Bar Examiners and its scoring of the Multistate Bar Exam. Last October, I was among the first to identify the decline in scores, and my initial instinct caused me to consider that a problem may have occurred in the bar exam itself. Contrary evidence, however, led me to change my mind: the final scores showed rather significant declines in all jurisdictions, based, in all likelihood, I concluded, on a decline in law school graduate quality.*

It's quite early for the July 2015 bar exam results, but they are trickling in. In most of these jurisdictions, only the overall pass rate is available, even though it's usually better to separate first-time test-takers from repeaters (and, even better, first-time test-takers who graduated from ABA-accredited law schools). In other jurisdictions, I use the best available data, which is sometimes second-hand (and I link all sources when available). Worse, many of these jurisdictions only list pass and fail identities, so I have to do the math myself, which increases the likelihood of error. (The arithmetic itself is simple, as sketched below; the error risk is in the counting.)
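For those following along, the calculation is nothing more than counting. Here is a minimal Python sketch with hypothetical counts (not actual figures from any jurisdiction):

```python
# Hypothetical counts for one jurisdiction; not actual figures.
results = {
    "July 2014": {"passers": 472, "takers": 598},
    "July 2015": {"passers": 401, "takers": 589},
}

# Overall pass rate = passers / total takers, rounded for reporting.
rates = {admin: n["passers"] / n["takers"] for admin, n in results.items()}
for admin, rate in rates.items():
    print(f"{admin}: {rate:.0%}")

# Year-over-year change, in percentage points.
change = (rates["July 2015"] - rates["July 2014"]) * 100
print(f"Change: {change:+.1f} points")
```

Note that rounding to whole percentage points, as most jurisdictions do, can itself hide year-over-year movement of nearly a full point.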

But looking at the NCBE statistics from last year, we can see another overall decline in scores almost across the board. And even in the places where there was an uptick in pass rates--which, perhaps, suggests that things are not as dire as they appeared last year--the rates remain low compared to recent history. Assuming last year's exam was not an anomaly but the beginning of a trend, which I eventually came to agree was the best explanation given the evidence, these results are consistent with that assumption--with no ExamSoft fiasco to blame. The lowering of standards at many law schools that began about four years ago appears to be coinciding with the decline in bar pass rates, with many jurisdictions at recent-past lows and several experiencing double-digit drops in the pass rate.

As with last year, of course, we're looking at only a handful of early-reporting jurisdictions. The final scaled MBE score, when disclosed, should reveal a great deal more, so projections from the trends of a few states should be treated with appropriate caution as speculation.

UPDATE: The MBE scores have been released, and they are the lowest since 1988. You can see details here.

Change in overall bar pass rate, July 2015 v. July 2014

Iowa, +5 points (July 2014: 81%; July 2015: 86%)

Kansas, -3 points (July 2014: 79%; July 2015: 76%)

New Mexico, -12 points (July 2014: 84%; July 2015: 72%)

North Carolina, -4 points** (July 2014: 71%; July 2015: 67%)

North Dakota, +6 points (July 2014: 63%; July 2015: 69%)

Oklahoma, -11 points (July 2014: 79%; July 2015: 68%)

Washington, -1 point (July 2014: 77%; July 2015: 76%)

West Virginia, -5 points (July 2014: 74%; July 2015: 69%)

Wisconsin, -10 points*** (July 2014: 75%; July 2015: 65%)

**denotes first-time test-takers, not overall rate. UPDATE: I relied on erroneous data from 2014; I've since updated the data.

***source via comments

It's worth noting that North Carolina's bar appears to have an unusually volatile pass rate. The first-time pass rate in July 2013 was 71%; it skyrocketed to 85% last year; and it plummeted back to 67% this year. UPDATE: This data was in error; see the note above.

Jurisdictions like North Dakota are incredibly small--just 62 people took the bar, which likely explains some of the great volatility in scores, as each test-taker represents about 1.6 points (1/62) of the overall pass rate. July 2013 had a 76% overall pass rate, which plunged to 63% last year and bobbed back up to 69% this year. But more importantly, its first-time pass rate increased 15 points, from 64% to 79%, which resembles the 81% first-time pass rate from July 2013.

I've also added a little historical perspective for these bar exams: charts beside the table showing the overall July pass rate (in North Carolina's case, the first-time pass rate) since 2010. In most jurisdictions, this year's rate is the lowest or second-lowest in that six-year window of data. (The charts are slightly deceptive because the axes all end near the bottom of the pass-rate range and don't go all the way down to 0%; though perhaps not obvious to all, most graduates still pass the bar in these jurisdictions, and the charts reflect relative changes within a small band in recent years. A sketch of that axis choice follows.)
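For anyone reproducing this style of chart, a minimal matplotlib sketch using the North Dakota figures discussed above illustrates the truncated-axis trade-off (this is an illustration of the design choice, not a reproduction of the original charts):

```python
import matplotlib.pyplot as plt

# North Dakota overall July pass rates cited in this post.
years = [2013, 2014, 2015]
rates = [76, 63, 69]

fig, ax = plt.subplots(figsize=(4, 3))
ax.plot(years, rates, marker="o")
ax.set_xticks(years)
ax.set_ylabel("Overall pass rate (%)")
ax.set_title("North Dakota, July administrations")

# A truncated axis emphasizes the swing; a 0-100% axis would
# make clear that most test-takers still pass.
ax.set_ylim(55, 85)
plt.tight_layout()
plt.show()
```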

*(As an important caveat, I recognize that there are many measures of "student quality" or "law school graduate quality," and that the bar exam is but one measure. But assuming--which may be too big an assumption for many--that the bar exam presents, very roughly, a proxy for the minimum capability to practice law, and that pass rates continue to decline, then we can, very roughly, say that there has been a "decline" in "law school graduate quality," at least as evaluated by this one metric. Perhaps there are other metrics, or perhaps there are better metrics, but this is how I use the term here.)


Additional updates to this post will occasionally occur here.

Alabama, -5 points (July 2014: 65%; July 2015: 60%)

Arizona, -11 points (July 2014: 68%; July 2015: 57%)

California, -2 points (July 2014: 49%; July 2015: 47%)

Colorado, -2 points (July 2014: 74%; July 2015: 72%)

Connecticut, -2 points (July 2014: 77%; July 2015: 75%)

Florida, +3 points (July 2014: 66%; July 2015: 69%)

Georgia, -6 points (July 2014: 80%; July 2015: 74%)

Idaho, +4 points (July 2014: 65%; July 2015: 69%)

Indiana, unchanged (July 2014: 72%; July 2015: 72%)

Louisiana, -8 points (July 2014: 70%; July 2015: 62%)

Mississippi, -27 points (July 2014: 78%; July 2015: 51%)

Missouri, -1 point (July 2014: 85%; July 2015: 84%)

Montana, -2 points (July 2014: 64%; July 2015: 62%)

Nevada, +2 points (July 2014: 58%; July 2015: 60%)

New York, -4 points (July 2014: 65%; July 2015: 61%)

Oregon, -5 points (July 2014: 65%; July 2015: 60%)

Pennsylvania, -5 points (July 2014: 76%; July 2015: 71%)

Tennessee, -2 points (July 2014: 66%; July 2015: 64%)

Vermont, -14 points (July 2014: 66%; July 2015: 52%)

California bar votes to cut exam from three days to two

In March, I covered the news that the California bar was considering cutting the length of the bar exam from three days to two. Today, Above the Law reports that the bar's board of trustees has unanimously approved the change, which should take effect July 2017.

The proposal (PDF) called for five one-hour essay questions and a 90-minute performance test on one day, and the 200-question Multistate Bar Exam (MBE) on another day. The essays and the multiple-choice component would each receive half the weight in the final score.

This post has been updated.

Here we go again: February 2015 bar pass rates down over last year

For February 2016 information, please click here.

This post has been updated with a visual representation of the decline in the mean MBE score.

In the seemingly endless quest to determine what caused the July 2014 decline in bar pass rates, there's a simple solution: wait and see. Subsequent administrations of the test would reveal whether the July 2014 test was a one-time aberration or reflected an actual decline in student quality.

As the February 2015 bar exam results start to trickle in, the answer, as I've been inclined to suggest of late, is increasingly likely to be the latter.

It should be noted that some state bars, like Illinois, have begun to increase the score required to pass. That will likely independently increase the failure rate in many jurisdictions in the years to come.

Additionally, the February bar exam is different in kind. It usually includes fewer first-time test-takers, which means that the overall pass rates are usually lower. (People who fail the bar once are much more likely than others to fail it again.) February administrations also tend to have much smaller pools of test-takers, making a single jurisdiction's pass rate subject to apparently significant fluctuations. (The composition arithmetic is sketched below.)
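The composition effect is just a weighted average: the overall rate blends the first-time and repeater rates, so a pool tilted toward repeaters drags the headline number down even if neither group's performance changes. A minimal sketch with hypothetical rates (not actual NCBE figures):

```python
# Hypothetical group pass rates, chosen only to illustrate the mix effect.
FIRST_TIME_RATE = 0.70
REPEATER_RATE = 0.35

def overall_rate(first_time_share: float) -> float:
    """Overall pass rate as a weighted average of the two groups."""
    return (first_time_share * FIRST_TIME_RATE
            + (1 - first_time_share) * REPEATER_RATE)

# A July-style pool (mostly first-timers) vs. a February-style pool.
print(f"80% first-timers: {overall_rate(0.80):.0%}")  # 63%
print(f"50% first-timers: {overall_rate(0.50):.0%}")  # ~52%
```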

At this stage, too, like last year, most jurisdictions only disclose the overall pass rate, lumping together first-time test-takers and repeaters, and ABA and non-ABA law school graduates--the least meaningful metric for evaluating performance across administrations.

Then again, if the theory is that the July 2014 exam was a one-time aberration, we might see an increase in highly qualified repeaters--those who "ought" to have passed the first time around and are much more likely to pass on a retake. That means, all things being equal, we might see pass rates increase in the February 2015 administration over the February 2014 test, if the July 2014 results were attributable to non-test-taker-related factors.

The preliminary data, however, reflects a decline in pass rates largely across the board (with no ExamSoft debacle to complicate our analysis).

Granted, not only are we dealing with the caveats above, but these jurisdictions are (mostly) smaller than the typical jurisdiction, which makes potential distortions even more likely. Further, the declines are (somewhat) smaller (and, perhaps, closer to what one would expect with the decline of predictors) than the ones initially observed last July. And until a jurisdiction discloses the national mean scaled MBE score, we don't have the cleanest comparison. But given that early signs last year pointed toward the ultimate trend--despite most of the same caveats--these might serve as a warning.

Overall bar pass rates, February 2015 v. February 2014

Florida, -8 points* (February 2014: 72%; February 2015: 64%)

Kansas, -4 points (February 2014: 86%; February 2015: 82%)

Kentucky, -7 points (February 2014: 77%; February 2015: 70%)

Illinois, about -5 points (February 2014: 75%**)

Iowa, -14 points (February 2014: 86%; February 2015: 72%)

Missouri, -3 points (February 2014: 81%; February 2015: 78%)

New Mexico, -1 point (February 2014: 81%; February 2015: 80%)

New York, -4 points (February 2014: 47%; February 2015: 43%)

North Carolina, -13 points (February 2014: 56%; February 2015: 43%)

North Dakota, -7 points (February 2014: 62%; February 2015: 55%)

Ohio, unchanged (February 2014: 64%; February 2015: 64%)

Oklahoma, -3 points (February 2014: 70%; February 2015: 67%)

Oregon, -2 points (February 2014: 66%; February 2015: 64%)

Pennsylvania, -4 points (February 2014: 57%; February 2015: 53%)

Tennessee, -10 points (February 2014: 64%; February 2015: 54%)

Vermont, -20 points (February 2014: 68%; February 2015: 48%)

Virginia, unchanged (February 2014: 59%; February 2015: 59%)

Washington, -5 points (February 2014: 71%; February 2015: 66%)

West Virginia, -2 points (February 2014: 70%; February 2015: 68%)

We have a few additional data points suggesting that perhaps it's not quite so bad. North Dakota disclosed its first-time pass rate, which increased 7 points--of course, only 31 test-takers were first-timers last year, which, again, reflects some of the caveats listed above. (UPDATE: Pennsylvania's first-time pass rate was 69%, a 3-point drop. Oregon's first-time pass rate was 69%, an 11-point drop.)

I hope to occasionally update this post in the weeks to come, and we'll see if these jurisdictions are an aberration or a sign of things to come.

*Florida's statistics include only first-time exam takers.

**While Illinois has not disclosed its pass rate, its percentile equivalent chart suggests a drop of about 5 points. A scaled score of 264 is required to pass. A scaled score of 270 was the equivalent of the 40th percentile in February 2014; it's the equivalent of the 46th percentile in 2015. A scaled score of 260 was the equivalent of the 27th percentile in February 2014; it's the equivalent of the 31st percentile in 2015. (Although I confess I don't understand how Illinois disclosed an overall 75% pass rate when it conceded that 27% of test-takers scored at least 4 points below the passing score in February 2014, unless they have extremely generous re-scoring and re-evaluation.)
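For those checking the arithmetic behind that estimate, here is a rough sketch of the interpolation, assuming the percentile scale is approximately linear between the two disclosed scores (an assumption the chart itself doesn't confirm):

```python
def percentile_at(score, lo_score, lo_pct, hi_score, hi_pct):
    """Linearly interpolate a percentile between two disclosed points."""
    frac = (score - lo_score) / (hi_score - lo_score)
    return lo_pct + frac * (hi_pct - lo_pct)

CUT = 264  # Illinois passing score

# Disclosed percentile equivalents at scaled scores 260 and 270.
pct_2014 = percentile_at(CUT, 260, 27, 270, 40)  # ~32.2
pct_2015 = percentile_at(CUT, 260, 31, 270, 46)  # ~37.0

# The percentile at the cut score approximates the share failing.
pass_2014 = 100 - pct_2014  # ~67.8%
pass_2015 = 100 - pct_2015  # ~63.0%
print(f"Implied drop: {pass_2014 - pass_2015:.1f} points")  # ~4.8
```

That is where the "about 5 points" figure comes from--and why it sits in tension with the disclosed 75% overall pass rate for February 2014.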

UPDATE: The Pennsylvania bar results reveal that the national mean scaled MBE score for February 2015 was 136.2. That's a 1.8-point drop from February 2014, and, while not the steepest decline or the lowest score in the last decade, it is certainly close to both.


Visualizing the grim final numbers from the July 2014 bar exam

Most by now are undoubtedly aware of the significant decline in MBE scores and bar pass rates on the July 2014 bar exam. I've recently been persuaded (though not wholly) by the NCBE's explanations, which suggest that the July 2014 cohort had generally worse predictors and performed worse as a result. If true, that suggests a grim reality as predictors worsen over the next several administrations.

I had some data earlier, cobbled together from state-by-state data sets using overall pass rates, suggesting, among other things, that the ExamSoft fiasco was not (primarily) responsible for the decline.

The NCBE has released its statistics for the 2014 administrations of bar exams. That means we have access to complete data sets, and to more precise data (e.g., first-time pass rates instead of overall pass rates). Below is a chart of changes in first-time bar pass rates among all 50 states and the District of Columbia between July 2013 and July 2014, with some color coding relating to the MBE and ExamSoft. Thoughts below.

As noted previously, the only non-MBE jurisdiction, Louisiana, saw a significant improvement in bar pass rates among first-time test-takers. So, too, did North Carolina--an MBE and ExamSoft jurisdiction with its essays on Tuesday. Congrats to the lucky test-takers in the Tar Heel State. Elsewhere, however, you see declines among first-time test-takers nearly across the board, with modest improvements in only a few jurisdictions.

Now we wait and see whether the July 2015 administration shows that this decline is the start of a trend or, perhaps, a one-off aberration.

California poised to cut bar exam from three days to two

UPDATE: The bar voted in July 2015 in favor of the proposal, to take effect July 2017. See the update here.

Tomorrow, the Committee of Bar Examiners for the State of California meets to consider whether to cut the bar exam from three days to two days.

The proposal would result in one day of essays and one day of the MBE. The essay day would include a morning of three one-hour essays, and an afternoon of two one-hour essays plus a 90-minute performance test. As a practical matter, its most significant impact would be on the performance test, which has been a three-hour element of the exam. Each day would be weighted equally.

It would not make the exam any easier--that's a question left for the score cutline, which presumably would be recalibrated to reflect a comparable difficulty. Instead, it would make the exam less grueling for test-takers and less expensive for all parties--one fewer day staying in a hotel, and one fewer day of material to develop and score. Further, it might speed grading, which, given California's glacial pace of scoring that postpones bar admission ceremonies into December after a student graduates in May, would benefit everyone.

The most intriguing component of the agenda item, in my view, describes the mismatch between critiques of proposed changes and the point of the exam itself:

There continues to be some confusion with regard to what the bar examination is intended to do. The examination is not designed to predict success as a lawyer or even that a lawyer is ready for the practice of law. In fact, one of the best predictors of bar examination scores is the grades an applicant received during law school. So, in one sense, the examination is confirmation that the necessary skills and knowledge were learned during the three or four years of law study, through whatever means, which are needed to show minimum competence as a lawyer. The bar examination is an examination to test minimum competence in the law.

The format of the exam, then, whether through essays or multiple choice, whether three days or two days, is not the point.

An implementation plan would be submitted for review in April 2015 to determine when the two-day bar, if approved, would first take place.

Correcting the National Jurist piece on bar pass rates

In its February 2015 issue, National Jurist published a story about bar pass rates. It quoted my earlier work on the subject. But, apparently, the editing process at the magazine is relatively slow and does not respond to new information well. For some time, I suspected the NCBE had some role in the decline in the bar pass rates (which National Jurist notes). But after the NCBE provided additional data in December 2014, I was convinced that much of the decline could be attributed to a decline in student quality. For more on this area, consider my previous posts about the bar exam.

NCBE has data to prove Class of 2014 was worst in a decade, and it's likely going to get worse

I have blogged extensively about the decline in bar pass rates around the country after the July 2014 test. My original take was more inquisitive, and I later discounted the impact that ExamSoft may have had. After examining the incoming LSAT scores for the Class of 2014, I concluded that it was increasingly likely that the NCBE had some role, positing elsewhere that perhaps there was a flaw in equating the test with previous administrations.

The NCBE has come back with rather forceful data to show that it wasn't the MBE (and that my most recent speculation was, probably, incorrect)--it was, in all likelihood, the graduates who took the test.

In a December publication (PDF), the NCBE described several quality-control measures that confirmed it was the test-takers, and not the test. First, on re-takers v. first-time test-takers:

Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years.

I had suggested, based on earlier data from a few states, that re-takers and first-time test-takers performed similarly; but with this much broader dataset, and with the more precise measure of MBE performance, first-time test-taker performance was revealed to be much worse.

Second, on equating the test:

Also telling is the fact that performance by all July 2014 takers on the equating items drawn from previous July test administrations was 1.63 percentage points lower than performance associated with the previous use of those items, as against a 0.57 percentage point increase in July 2013.

Equating uses a set of repeated items to place each administration's scores on a common scale, so performance on those repeated items isolates the test-takers from the test. As equating is probably the biggest possible flaw on the NCBE's end, it's extremely telling that performance on items repeated from previous administrations declined so significantly, in such sharp contrast with the July 2013 test.

Third, and, in my view, one of the most telling elements, the MPRE presaged this outcome:

The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50–150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

A steady decline in MPRE scores, then, foretold this problem. This further undermines any notion that ExamSoft or other test-specific factors impacted the outcome; the writing was on the wall years ago. But as few schools carefully track MPRE performance, it might not have been an obvious sign until after the fact.

The NCBE bulletin then points out additional factors that distort measures of student quality: a decrease in quality at the 25th percentile of admitted students at many institutions (i.e., those at the highest risk of failing the bar); the impact of reporting matriculants' highest LSAT scores rather than their average scores (a change embraced by both the ABA and LSAC despite evidence that taking the highest score overstates student quality; a quick simulation of that effect appears below); and an increase in transfer students to higher-ranked institutions (which distorts the incoming student quality metrics at many institutions). Earlier, I blogged that a decline in LSAT scores likely could not explain all of the decline--it could explain part, but there are, perhaps, other factors at play.
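On the highest-score point: if each sitting is a noisy measure of the same underlying ability, reporting the maximum of several sittings is systematically biased upward, while reporting the average is not. A quick simulation sketch with hypothetical parameters (not LSAC data):

```python
import random

random.seed(0)
TRUE_ABILITY = 152.0  # hypothetical underlying LSAT ability
NOISE_SD = 2.6        # hypothetical per-sitting measurement noise
N = 100_000

def sitting() -> float:
    """One noisy observation of the same underlying ability."""
    return random.gauss(TRUE_ABILITY, NOISE_SD)

# Highest-of-two reporting vs. average-of-two reporting.
max_scores = [max(sitting(), sitting()) for _ in range(N)]
avg_scores = [(sitting() + sitting()) / 2 for _ in range(N)]

print(f"mean of highest-of-two: {sum(max_scores) / N:.2f}")  # ~153.5
print(f"mean of average-of-two: {sum(avg_scores) / N:.2f}")  # ~152.0
```

The size of the gap here is an artifact of the hypothetical noise level, but the direction of the bias is the point: highest-score reporting overstates the credentials of anyone who tests more than once.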

The NCBE goes on to identify other possible factors, ones that may merit further investigation in the legal academy:

  • An increase in "experiential learning," including an increase in pass-fail course offerings, which often means students take fewer graded, more rigorous, "black-letter" courses;
  • A decline in the credit hours required for graduation, and a decline in required (i.e., often more rigorous) courses;
  • An increased reliance on bar-prep companies over semester-long coursework to prepare for the bar;
  • A lack of academic support for at-risk students as the 25th-percentile LSAT scores of matriculants worsen at many institutions.

So, after I waffled--first blaming some decrease in student quality, then increasingly considering the NCBE as a culprit--this data moves me back to putting essentially all of the focus on student quality and law school decisionmaking. Law schools--through admissions decisions, curriculum decisions, academic support decisions, transfer decisions, reactions to non-empirical calls from the ABA or other advocacy groups, or some combination of these factors--are primarily in control of their students' bar pass rates, not some remarkable decision of the NCBE. How schools respond will be another matter.

Further, the NCBE report goes on to chart the decline in the 25th-percentile LSAT scores at many institutions. The declines in many places are steep. They portend dramatic results--the decline in bar pass rates this year is likely only the beginning of still-steep declines over the next couple of years, absent aggressive decisions within the present control of law schools. (The admissions decisions, after all, are already baked in for the current three classes.)

Coupled with the decline in the number of prospective law students, law schools are now getting squeezed at both ends--their prospective student quality is increasingly limited, and their graduates are going to find it still harder to pass the bar. We'll see how they respond to this piece of news from the NCBE--I, for one, find the data quite persuasive.