National Conference of Bar Examiners: Class of 2014 "was less able" than Class of 2013
Continuing a series about the decline in bar passage rates, the National Conference of Bar Examiners recently wrote a letter to law school deans that explained its theory behind a 10-year low in Multistate Bar Exam scores and the single biggest drop in MBE history. I've excerpted the relevant paragraphs below.
In the wake of the release of MBE scores from the July 2014 test administration, I also want to take this opportunity to let you know that the drop in scores that we saw this past July has been a matter of concern to us, as no doubt it has been to many of you. While we always take quality control of MBE scoring very seriously, we redoubled our efforts to satisfy ourselves that no error occurred in scoring the examination or in equating the test with its predecessors. The results are correct.
Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013. In July 2013 we marked the highest number of MBE test-takers. This year the number of MBE test-takers fell by five percent. This was not unanticipated: figures from the American Bar Association indicate that first-year enrollment fell 7% between Fall 2010 (the 2013 graduating class) and Fall 2011 (the 2014 class). We have been expecting a dip in bar examination numbers as declining law school applications and enrollments worked their way to the law school graduation stage, but the question of performance of the 2014 graduates was of course unknown.
Some have questioned whether adoption of the Uniform Bar Examination has been a factor in slumping pass rates. It has not. In most UBE jurisdictions (there are currently 14), the same test components are being used and the components are being combined as they were before the UBE was adopted. As noted above, it is the MBE, with scores equated across time, that reveals a decline in performance of the cohort that took July 2014 bar examinations.
In closing, I can assure you that had we discovered an error in MBE scoring, we would have acknowledged it and corrected it.
Well, that doesn't make any sense.
First, whether a class is "less able" is a matter of its LSAT scores and undergraduate GPAs, not the size of the class.
Second, to the extent that a brief window into the LSAT scores of the entering classes in Fall 2010 and Fall 2011 is a useful metric, Jerry Organ has noted elsewhere that the dip in scores was fairly modest in that first year after the peak application cycle. The decline certainly gets dramatically worse later, but nothing suggests that admissions in that one-year window fell off a cliff. (More data on class quality is, I hope, forthcoming.)
Third, to the extent that the size of the class matters, it does not adequately explain the drop-off. Below is a chart of the mean scaled MBE scores, and an overlay of the entering 1L class size (shifted three years, so that the 1L entering class corresponds with the expected year of graduation).
If a drop-off in enrollment is supposed to produce a drop-off in scores (and there is, indeed, a meaningful correlation between the two), it doesn't explain the severity of the drop in this case.
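For readers who want to see the kind of comparison the chart reflects, here is a minimal sketch of lining up entering class size with MBE performance three years later and checking the correlation. The enrollment and score figures below are hypothetical placeholders, not the actual ABA or NCBE data, and the three-year shift and Pearson correlation are simply one way to align the series.

```python
# Hypothetical sketch: align entering 1L class size with the mean scaled MBE
# score three years later, then compare the size of the enrollment decline
# with the size of the score drop. All numbers are placeholders, NOT the
# actual ABA enrollment or NCBE score figures.
from statistics import correlation  # Python 3.10+

# Entering 1L class size by fall year (hypothetical values)
enrollment = {2007: 49_000, 2008: 49_400, 2009: 51_600, 2010: 52_500, 2011: 48_700}

# Mean scaled July MBE score by exam year (hypothetical values)
mbe_mean = {2010: 143.6, 2011: 143.8, 2012: 143.4, 2013: 144.3, 2014: 141.5}

# Shift enrollment forward three years so each entering class lines up with
# its expected July bar exam (Fall 2010 entrants -> July 2013 exam, etc.).
paired = [(enrollment[year - 3], mbe_mean[year]) for year in mbe_mean]
sizes, scores = zip(*paired)

r = correlation(sizes, scores)
print(f"Correlation between class size and MBE mean (3-year lag): {r:.2f}")

# Even with a positive correlation, a single-digit percentage enrollment
# decline is a modest move; compare it to the one-year score drop.
enrollment_change = (enrollment[2011] - enrollment[2010]) / enrollment[2010]
score_change = mbe_mean[2014] - mbe_mean[2013]
print(f"Enrollment change: {enrollment_change:.1%}, score change: {score_change:+.1f} points")
```

Even under generous assumptions, a correlation between cohort size and scores would have to be implausibly strong for a roughly 7% enrollment decline to account for the largest single-year score drop in the exam's history.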
This explanation, then, isn't really an explanation, save an ipse dixit statement that NCBE has "redoubled" its efforts and an assurance that "[t]he results are correct." An explanation has yet to be found.