NCBE has data to prove Class of 2014 was worst in a decade, and it's likely going to get worse

I have blogged extensively about the decline in bar pass rates around the country after the July 2014 test. My original take was more inquisitive, and I later discounted the impact that ExamSoft may have had. After examining the incoming LSAT scores for the Class of 2014, I concluded that it was increasingly likely that the NCBE had some role, positing elsewhere that perhaps there was a flaw in equating the test with previous administrations.

The NCBE has come back with rather forceful data to show that it wasn't the MBE (and that my most recent speculation was, probably, incorrect)--it was, in all likelihood, the graduates who took the test.

In a December publication (PDF), the NCBE described several quality-control measures that confirmed it was the test-takers, and not the test. First, on re-takers v. first-time test-takers:

Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years.

I had suggested, based on earlier data from a few states, that re-takers and first-time test-takers performed similarly; but the NCBE's much broader dataset, using the more precise measure of MBE performance, shows that first-time test-takers fared much worse.

Second, on equating the test:

Also telling is the fact that performance by all July 2014 takers on the equating items drawn from previous July test administrations was 1.63 percentage points lower than performance associated with the previous use of those items, as against a 0.57 percentage point increase in July 2013.

Because a flaw in equating is probably the likeliest kind of error on the NCBE's end, it is telling that performance on equating items drawn from previous administrations declined so significantly, and in such sharp contrast with the July 2013 test.
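
To make the equating check concrete, here is a minimal sketch of the kind of anchor-item comparison described above; the per-item percent-correct values are hypothetical placeholders rather than actual NCBE item statistics.

```python
# Minimal sketch of an anchor-item (equating) check. The per-item
# percent-correct values below are hypothetical, not actual NCBE statistics.

# Each tuple: (percent correct when the item was previously administered,
#              percent correct on the July 2014 administration)
anchor_items = [
    (68.2, 66.1),
    (54.0, 52.6),
    (71.5, 69.8),
    (62.3, 60.9),
]

diffs = [current - previous for previous, current in anchor_items]
mean_shift = sum(diffs) / len(diffs)

# Because the items themselves are identical across administrations, a drop
# in percent correct points to the cohort rather than to the new test form.
print(f"Mean change on anchor items: {mean_shift:+.2f} percentage points")
```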

Third, and, in my view, one of the most telling elements, the MPRE presaged this outcome:

The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50–150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

A steady decline in MPRE scores, then, foretold this problem. This further undermines any notion that ExamSoft or other test-specific factors impacted the outcome; the writing was on the wall years ago. But as few schools carefully track MPRE performance, it might not have been an obvious sign until after the fact.

The NCBE bulletin then points out additional factors that distort student quality: a decrease in quality at the 25th percentile of admitted students at many institutions (i.e., those at the highest risk of failing the bar), the impact of highest-LSAT score reporting rather than average-LSAT score reporting for matriculants (a change embraced by both the ABA and LSAC despite evidence that taking the highest score overstates student quality), and an increase in transfer students to higher-ranked institutions (which distorts the incoming student quality metrics at many institutions). Earlier, I blogged that a decline in LSAT scores likely could not explain all of the decline--it could explain part, but there are, perhaps, other factors at play.

The NCBE goes on to identify other possible factors, ones that may merit further investigation in the legal academy:

  • An increase in "experiential learning," including an increase in pass-fail course offerings, which often means students take fewer graded, more rigorous, "black-letter" courses;
  • A decline in the credit hours required for graduation and a decline in required (i.e., often more rigorous) courses;
  • An increased reliance on bar-prep companies rather than semester-long coursework to prepare for the bar;
  • A lack of academic support for at-risk students as the 25th percentile LSAT scores of matriculants worsen at many institutions.

So, after waffling--first blaming some decrease in student quality, then increasingly considering the NCBE a culprit--this data moves me back to placing essentially all of the focus on student quality and law school decisionmaking. Law schools--through admissions decisions, curriculum decisions, academic support decisions, transfer decisions, reactions to non-empirical calls from the ABA or other advocacy groups, or some combination of these factors--are primarily in control of their students' bar pass rates, not some remarkable decision by the NCBE. How schools respond is another matter.

Further, the NCBE report goes on to chart the decline in the 25th percentile LSAT scores at many institutions. The declines in many places are steep. They portend dramatic results--the decline in bar pass rates this year is only the beginning of probably still-steeper declines over the next couple of years, absent aggressive decisions within the present control of law schools. (The admissions decisions, after all, are baked in for the current three classes.)

Coupled with the decline of prospective law students, law schools are now getting squeezed at both ends--their prospective student quality is increasingly limited, and their graduates are going to find it still harder to pass the bar. And we'll see how they respond to this piece of news from the NCBE--I, for one, find the data quite persuasive.

Visualizing the continuing decline of the law school student body, 2014

One of the posts with the most staying power on this site was last year's post and chart, "For legal education, the worst may be yet to come." We can now confirm that the decline continues: the Class of 2017 is much smaller than previous classes, and the bottom still has not been reached, given LSAT and applicant trends.

An ABA Journal piece discloses that the total incoming 1L class in 2014 was 37,675, the smallest since 1974, and down from the peak of 52,488 in 2010. Coupled with the declining LSAT data from LSAC, it paints a grim picture for legal education through at least 2017, and likely through 2018.

Top 25 law schools ranked by law student transfer preferences

How about another law school ranking--this time, one that measures tangible law student preference for one school over another?

The ABA has released the Standard 509 disclosures from law schools for 2014. The Standard 509 includes new data this year. Schools formerly listed only the number of transfers in and out. Now, if a school accepts more than five transfer students, it must list the schools from which the transfers came. With enough transfer students, the school must also disclose the median GPA of its transfers, and with an even larger number of transfers, the 75th and 25th percentile GPAs as well.

The data is buried in those PDFs. But with a little magic (text recognition, data scraping, and a little time-consuming manual cleanup), we can aggregate the transfer data. Schools logged 2,221 transfers in for the Fall of 2014. Because of the disclosure requirements, we know the migration patterns of 1,968 of them--that is, 1,968 decisions by law students to leave one institution and attend another.

Students applying to law schools often don't have a good idea about what schools have to offer. But once they are in law school, they have some additional information about their own institution and have a better perspective about law schools themselves.

My colleague Rob Anderson (WITNESSETH) thought that using the Bradley-Terry model would be the best way of comparing schools. (This method is used in, among other things, Peter Wolfe's college football rankings, once a component of the BCS formula.)
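
For the curious, here is a minimal sketch of how a Bradley-Terry fit on transfer data can work, using the standard iterative MM updates; the transfer pairs and the small regularizing pseudo-count are illustrative assumptions, not the actual dataset or Rob Anderson's implementation.

```python
# Minimal Bradley-Terry sketch: each transfer is treated as a "win" for the
# receiving school over the sending school. The transfer pairs below are
# hypothetical, and the small pseudo-count is a practical regularization so
# schools with no losses do not push strengths to the boundary.
from collections import defaultdict

transfers = [  # (receiving_school, sending_school) -- illustrative only
    ("SMU", "Baylor"), ("SMU", "Fordham"), ("Texas", "SMU"),
    ("Yale", "Georgetown"), ("NYU", "Fordham"), ("Georgetown", "Baylor"),
]

wins = defaultdict(float)
for winner, loser in transfers:
    wins[(winner, loser)] += 1.0

schools = sorted({s for pair in wins for s in pair})
EPS = 0.01  # tiny pseudo-win in both directions between every pair of schools
for a in schools:
    for b in schools:
        if a != b:
            wins[(a, b)] += EPS

strength = {s: 1.0 for s in schools}
for _ in range(500):  # iterative MM updates for Bradley-Terry strengths
    new = {}
    for i in schools:
        total_wins = sum(wins[(i, j)] for j in schools if j != i)
        denom = sum(
            (wins[(i, j)] + wins[(j, i)]) / (strength[i] + strength[j])
            for j in schools if j != i
        )
        new[i] = total_wins / denom
    norm = sum(new.values())
    strength = {s: v / norm for s, v in new.items()}  # identified up to scale

for rank, s in enumerate(sorted(schools, key=strength.get, reverse=True), 1):
    print(f"{rank}. {s} ({strength[s]:.3f})")
```

Because the strengths are identified only up to a common scale, they are normalized each iteration; and schools with no transfers out are compared only indirectly, a caveat noted below the rankings.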

Using that method, here are the top 25 schools. (The full list will be revealed at WITNESSETH later this week.) Comments below.

1. Yale University
2. Stanford University
3. Harvard University
4. New York University
5. University of California-Berkeley
6. Columbia University
7. University of Chicago
8. University of Pennsylvania
9. Northwestern University
10. University of Texas at Austin
11. Duke University
12. University of Washington
13. University of California-Los Angeles
14. Vanderbilt University
15. University of Michigan
16. University of Virginia
17. Cornell University
18. George Washington University
19. Brigham Young University
20. Georgetown University
21. University of Minnesota
22. University of Southern California
23. Southern Methodist University
24. Washington University
25. University of Notre Dame

As with any ranking system, there are obvious imperfections. Zero students transferred out of Yale, Harvard, or Stanford, so they are compared only indirectly; so too with other schools that had few transfers. Many schools had five or fewer transfers in, which they were not required to disclose. Many students transfer for personal reasons that may not reflect an evaluation of law school quality. Schools generally benefited if they had no (or few) transfers out; schools generally suffered if they were not required to disclose their transfer data, or if they accepted few transfers (when, of course, the most stable schools may accept the fewest transfers!).

Glancing at the top 25, one may wonder about some of the rankings. But consider SMU: it accepted transfers from Baylor, Hastings, and Fordham (among other institutions), but sent students only to Texas. A Bradley-Terry model ranks SMU quite high precisely because of these head-to-head results.

The data set includes all schools with at least one disclosed transfer, including the three schools in Puerto Rico, recently-accredited schools like the University of California-Irvine, and schools seeking accreditation like Concordia University.

Stay tuned: the full list of schools is forthcoming.

(If you are at a law school that did not disclose the schools from which students transferred, email us the information and we'll post an update. It will usually, but not always, help your school's ranking.)

UPDATE: This post has been modified in light of a correction in data.

The law student applicant pool still hasn't bottomed out

On the heels of 8% to 9% year-over-year drops in LSAT test-takers, LSAC is now reporting an 8.5% year-over-year decline in applicants. I explained last year that the worst may be yet to come for law schools, and noted that the shrinking 2013-2014 applicant pool would linger with law schools until 2017. This cycle promises still fewer applicants, which means even deeper impacts on law school budgets through at least 2018. Combined with declining bar passage rates, law schools are facing their most dire circumstances yet. Some may not truly feel the financial effects for another year or two, but even more difficult decisions will be made this applicant cycle--decisions that may affect which schools survive.

(While I usually accompany a post like this with a chart displaying a precarious decline, I'm holding off for more concrete numbers from the ABA for the Class of 2017....)

Increasingly appears NCBE may have had role in declining MBE scores and bar pass rates

Despite protests from the National Conference of Bar Examiners to the contrary (PDF), it increasingly appears that the NCBE had some role in the decline of Multistate Bar Exam scores and, accordingly, the decline in bar passage rates around the country.

Causation is hard to establish from my end--I only see the data that is out there and can make guesses. But with California's bar results released, we now have 34 of 51 jurisdictions (excluding American territories but including the District of Columbia) that have released their overall bar pass rates. Comparing them to 2013, we see that 20 of them experienced at least a 5-point drop in pass rates. Louisiana is the only state that does not use the MBE, and it's quite the outlier this time around.

A single state, of course, cannot establish that the MBE is to blame. But it's a data point of note.

Some have blamed ExamSoft. On that, I remain skeptical. First, it would assume that the exam-takers on Tuesday were "stressed out" and sleepless as a result of the upload fiasco, causing them to perform poorly on Wednesday's MBE. Perhaps I'm too callous to think it's much of an excuse--it might be for some, but I doubt it would have a dramatic effect on so many. One problem is that reporting on the actual problems with ExamSoft has been spotty--no journalists have done the legwork of investigating which states had the problems, or to what extent.

But we have a couple of data points we can now use. First, jurisdictions that do not use ExamSoft, but use some other exam software like Exam4 or require handwriting. Second, the jurisdictions whose essay components occurred on Thursday, not Tuesday--meaning there was no ExamSoft debacle the night before the MBE.

Again, there does not appear to be a significant trend in any of these jurisdictions--they appear to be randomly distributed among the varying scores. While it might be a cause for some, I am not convinced it's a meaningful cause.

Finally, the NCBE has alleged that the Class of 2014 was "less able." That's true, as I've pointed out, but only to a point--the decline in scores should not have been as sharp as it was. One small way to test this point is to examine repeat test-taker data.

One problem with measuring repeater data right now is that few jurisdictions have disclosed it. Further, most bar exams are quite easy, and repeaters are few. Finally, repeaters tend to fail the bar at extremely high rates (as we would expect if the test is valid), which skews the overall figures. But it might be useful to extract the data and compare first-time and repeater pass rates this cycle, at least in jurisdictions with a significant number of repeaters. If the Class of 2014 was "less able," then we would expect first-time takers' pass rates to decline more sharply than repeat takers' pass rates.
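
As a rough illustration of that comparison, here is a small sketch with hypothetical jurisdiction-level pass rates (not the actual reported figures).

```python
# Hypothetical jurisdiction-level pass rates (percent), for illustration only.
rates = {
    "State A": {("first-time", 2013): 78.0, ("first-time", 2014): 71.0,
                ("repeat", 2013): 32.0, ("repeat", 2014): 25.0},
    "State B": {("first-time", 2013): 82.0, ("first-time", 2014): 76.0,
                ("repeat", 2013): 28.0, ("repeat", 2014): 29.0},
}

for state, r in rates.items():
    first_change = r[("first-time", 2014)] - r[("first-time", 2013)]
    repeat_change = r[("repeat", 2014)] - r[("repeat", 2013)]
    # If the class were simply "less able," first-time declines should
    # generally be steeper than repeater declines.
    print(f"{state}: first-time {first_change:+.1f} pts, "
          f"repeaters {repeat_change:+.1f} pts, "
          f"gap {first_change - repeat_change:+.1f} pts")
```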

Places like California saw identical declines for first-time and repeat takers. Others, like Texas and Pennsylvania, actually saw failure rates rise slightly more among repeaters than among first-time takers. Ohio is on the other side, with a declining pass rate for first-time takers but a decent increase in the rate for repeat takers.

In short, I haven't been able to find an explanation that would identify the cause of the sharp decline in rates. Some, I think, is explained by a slightly lower-quality incoming class--one I've noted will lead to still sharper declines in the years ahead.

But after looking at all this information, I'm increasingly convinced that some decision in the NCBE's scoring of the MBE had some role in the decline of the scores, and of the pass rates around the country. That's speculation at this point--but it's a point, I think, worth investigating further, assuming additional data would be made available.

Previous posts on this subject

A more difficult bar exam, or a sign of declining student quality? (October 2, 2014)

Bar exam scores dip to their lowest level in 10 years (October 14, 2014)

Bar exam posts single-largest drop in scores in history (October 27, 2014)

Did ExamSoft cause the bar passage rate decline? (October 27, 2014)

National Conference of Bar Examiners: Class of 2014 "was less able" than Class of 2013 (October 28, 2014)

Class of 2014 LSAT scores did not portend sharp drop in MBE scores (November 11, 2014)

The bleak short-term future for law school bar passage rates (November 17, 2014)

The bleak short-term future for law school bar passage rates

This is the last post (for now!) about the bar exam. And it's not about what caused the MBE and bar passage rate declines--it's about what they mean for law schools going forward. The news is grim.

There's no question there was a decline in the law school applicant profile from the Class of 2013 to the Class of 2014. The dispute that Jerry Organ and I (and others) have had is whether the decline in bar passage rates should have been as stark. But going forward, the Classes of 2015 and 2016, and likely 2017, and probably 2018, will each have incrementally worse profiles still.

And it's not simply the median LSAT and GPA. It's the below-median profiles, particularly the LSAT, that should concern schools.

Those with LSAT scores below 150, and even 155, are at a substantially higher risk of failing the bar in most jurisdictions. For the Class of 2016, about 2/3 of schools have a 25th percentile LSAT of 155 or lower--that is, 25% of their incoming classes have an LSAT at or below 155. And over 80 schools have a 25th percentile LSAT at 150 or lower.

Furthermore, about half of schools have a 50th percentile LSAT of 155 or lower, and a full 30 schools have a 50th percentile of 150 or lower.

The increasing willingness of schools to accept these low-LSAT performers is a function of a combination of decisions made years ago. U.S. News & World Report evaluates only LSAT medians, a choice that distorts the evaluation of law student quality. To ensure that their medians remained as strong as possible, schools increasingly admitted more "imbalanced" students--students with a median or better LSAT and a substantially below-median GPA, or a median or better GPA and a substantially below-median LSAT. That meant the 25th percentile LSAT began to sag at more schools--the bottom of the class became worse at a faster rate than the middle of the class. (There are other, less-measurable decisions as well: reporting the highest LSAT score rather than the average LSAT score of applicants, which probably overstates student quality; possible decisions to academically dismiss fewer students; curricular decisions that may channel more students toward courses that do not sharpen the legal analysis tested on the bar exam; transfer-related decisions at schools; and much more.)

Bar passage rates should continue to fall--perhaps sharply--particularly at institutions that made the admissions decision years ago to prop up the median while sacrificing the quality of the bottom of the class.

For schools that made this decision years ago, the results will become increasingly sharp in the years ahead. If a school did not sufficiently reduce its class size, or worried about LSAT medians, it favored short-term interests; those short-term interests become long-term problems as those classes graduate. The below-median students are likely to carry more significant debt (as they are less likely to have obtained merit-based aid), to pass the bar at lower rates, to find employment at lower rates (if they are unable to pass the bar), and to trickle back out to an already-reluctant applicant pool.

I've said before that I'm not a "doomsday" predictor. But these bar results portend a significantly worsening portrait for law school bar passage rates in the years ahead, if schools made short-term decisions years ago and are now facing the long-term results. For schools with visionary deans and faculties anticipating the long-term future of the institution, the results may not be quite so grim. (But we shall see how many of those there are.)

Class of 2014 LSAT scores did not portend sharp drop in MBE scores

UPDATE: Jerry Organ (University of St. Thomas) has posted an even more thorough and thoughtful analysis of the LSAT scores and projected bar passage rates at the Legal Whiteboard. He, too, finds the NCBE's conclusion difficult to accept.

I've blogged (here and here and here and here and here) about the sharp drop in bar passage rates around the country from the July 2014 administration of the bar exam, largely due to the unprecedented drop in MBE scores. A recent Wall Street Journal blog post about the reaction of the dean of Brooklyn Law School shows the sides in the fight. Did the NCBE screw up its exam, yielding a sharp drop in scores? Or did law schools admit a disproportionately unqualified class?

Here's an attempt to measure the quality of the class and correlate it with MBE scores. (Maybe it's just awful math.)

The LSAT correlates fairly strongly with MBE scores. Consider this NCBE report (PDF). I extrapolated from those figures an expected MBE score for each LSAT range. I then weighted those against the number of matriculants in law school: LSAC reports the number of matriculants with scores of 175+, 170-174, and so on. Taking a rough estimate of the expected MBE score for each range, I averaged it out across the entire class.

When I first charted it, the projected MBE scores were much higher than the actual MBE scores that arose three years later. (I used the 2009-2010 LSAT matriculant data, for instance, and mapped it onto the MBE results three years later, in 2013.) I attributed this to several possibilities, the most significant of which is that repeaters probably drag down the MBE score significantly. But subtracting five points from the projected MBE score led to an almost perfect match with the actual MBE score, with one exception.*
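
Here is a minimal sketch of that projection; the band counts and expected MBE scores are hypothetical placeholders rather than the actual LSAC and NCBE figures, and the five-point offset is the crude repeater adjustment described above.

```python
# Hypothetical matriculant counts by LSAT band and rough expected mean MBE
# scores per band (placeholders, not the actual LSAC/NCBE figures).
bands = [
    # (lsat_band, matriculants, expected_mbe)
    ("175+",    900,  155.0),
    ("170-174", 3200, 151.0),
    ("165-169", 6800, 147.0),
    ("160-164", 9500, 143.0),
    ("155-159", 9000, 139.0),
    ("150-154", 7500, 135.0),
    ("145-149", 5000, 131.0),
    ("<145",    3100, 126.0),
]

total_matriculants = sum(n for _, n, _ in bands)
projected_mbe = sum(n * mbe for _, n, mbe in bands) / total_matriculants

# Crude five-point adjustment for repeaters dragging down the July mean,
# compared against the administration three years after matriculation.
projected_mbe_adjusted = projected_mbe - 5

print(f"Projected mean MBE: {projected_mbe:.1f} "
      f"(repeater-adjusted: {projected_mbe_adjusted:.1f})")
```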

Note that the LSAT score reporting changed beginning in the 2009-2010 cycle (i.e., the Class of 2013): schools could report the highest LSAT scores, rather than the average LSAT scores, of matriculants. That meant that the LSAT scores were probably overstated in the last two graduating classes.

But in the chart, we see a fairly significant correlation between my extremely rough approximation of a projected MBE score based on the LSAT scores of the matriculating classes and the actual MBE scores, with one exception: this cycle.

My math is rough--and maybe it's just bad. But as this comports with every other analysis I've done, and as I've not been able to find any other factors that would contribute to an across-the-board decline in scores, I'm increasingly convinced that a problem occurred on the NCBE's end--and not that the Class of 2014 was somehow disproportionately and dramatically worse than other classes.

That said, we should expect to see declining MBE scores (and bar passage rates) of some kind in the next few years, as academic quality of entering classes continues to decline; and, we should expect bar passage-required employment outcomes to see some (likely negative) effect due to a sharp drop-off in bar passage rates.

*I should add that I could have simply plotted the projected results so that you could observe the similarity (or differences) in the rise and fall; or, in the alternative, I could have plotted them on two different Y axes. Subtracting five points, however, seemed like the easiest way to make the visualization more obvious.

National Conference of Bar Examiners: Class of 2014 "was less able" than Class of 2013

Continuing a series about the decline in bar passage rates: the National Conference of Bar Examiners recently wrote a letter to law school deans explaining its theory behind the 10-year low in Multistate Bar Exam scores and the single-biggest drop in MBE history. I've excerpted the relevant paragraphs below.

In the wake of the release of MBE scores from the July 2014 test administration, I also want to take this opportunity to let you know that the drop in scores that we saw this past July has been a matter of concern to us, as no doubt it has been to many of you. While we always take quality control of MBE scoring very seriously, we redoubled our efforts to satisfy ourselves that no error occurred in scoring the examination or in equating the test with its predecessors. The results are correct.
Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013. In July 2013 we marked the highest number of MBE test-takers. This year the number of MBE test-takers fell by five percent. This was not unanticipated: figures from the American Bar Association indicate that first-year enrollment fell 7% between Fall 2010 (the 2013 graduating class) and Fall 2011 (the 2014 class). We have been expecting a dip in bar examination numbers as declining law school applications and enrollments worked their way to the law school graduation stage, but the question of performance of the 2014 graduates was of course unknown.
Some have questioned whether adoption of the Uniform Bar Examination has been a factor in slumping pass rates. It has not. In most UBE jurisdictions (there are currently 14), the same test components are being used and the components are being combined as they were before the UBE was adopted. As noted above, it is the MBE, with scores equated across time, that reveals a decline in performance of the cohort that took July 2014 bar examinations.
In closing, I can assure you that had we discovered an error in MBE scoring, we would have acknowledged it and corrected it.

Well, that doesn't make any sense.

First, whether a class is "less able" is a matter of LSAT and UGPA scores. It is not a matter of the size of the class.

Second, to the extent that a brief window into the LSAT scores for the entering classes in the Fall of 2010 and 2011 is a useful metric, Jerry Organ has noted elsewhere that the dip in scores was fairly modest in that first year after the peak application cycle. It certainly gets dramatically worse later, but nothing suggests that admissions in that one-year window fell off a cliff. (More data on class quality is, I hope, forthcoming.)

Third, to the extent that the size of the class matters, it does not adequately explain the drop-off. Below is a chart of the mean scaled MBE scores, and an overlay of the entering 1L class size (shifted three years, so that the 1L entering class corresponds with the expected year of graduation).
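
For readers who want to reproduce that kind of overlay, here is a sketch; the series values are illustrative placeholders, not the NCBE or ABA figures behind the actual chart.

```python
# Sketch of the overlay chart: mean scaled MBE score vs. entering 1L class
# size shifted three years. All values below are illustrative placeholders.
import matplotlib.pyplot as plt

years = [2010, 2011, 2012, 2013, 2014]
mean_mbe = [143.6, 143.8, 143.4, 144.3, 141.5]                   # placeholders
entering_1l_shifted = [49_000, 52_500, 48_700, 44_500, 39_700]   # placeholders

fig, ax1 = plt.subplots()
ax1.plot(years, mean_mbe, marker="o", color="tab:blue", label="Mean scaled MBE")
ax1.set_xlabel("July bar administration")
ax1.set_ylabel("Mean scaled MBE score", color="tab:blue")
ax1.legend(loc="lower left")

ax2 = ax1.twinx()  # second y-axis for the enrollment overlay
ax2.plot(years, entering_1l_shifted, marker="s", color="tab:red",
         label="Entering 1L class (shifted three years)")
ax2.set_ylabel("Entering 1L class size", color="tab:red")
ax2.legend(loc="upper right")

fig.tight_layout()
plt.show()
```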

If there's supposed to be a drop-off in scores because of a drop-off in enrollment (and there is, indeed, some meaningful correlation between the two), it doesn't explain the severity of the drop in this case.

This explanation, then, isn't really an explanation, save an ipse dixit statement that the NCBE has "redoubled" its efforts and an assurance that "[t]he results are correct." An explanation has yet to be found.

Did ExamSoft cause the bar passage rate decline?

I’ve blogged about the sharp decline in the MBE scores and the corresponding drop in bar passage rates in a number of jurisdictions around the United States. I’m still struggling to find an explanation.

One theory is that the ExamSoft fiasco affected the MBE scores. Most states have two days of exams: a day of essays followed by a day of multiple choice on the MBE. The software most states use for the essay portion had problems in July 2014--test-takers were unable to upload their exam answers in a timely fashion. As a result, students slept less and stressed more the night before the MBE, which may have yielded lower scores on the MBE.

We can test this in one small way: several states do not use ExamSoft. Arizona, Kentucky, Maine, Nebraska, Virginia, and Wisconsin all use Exam4 software; the District of Columbia does not permit the use of computers. If ExamSoft yielded lower scores, then we might expect bar passage rates to remain unaffected in places that didn’t use it.

But it doesn’t appear that the non-ExamSoft jurisdictions did any better. Here are the disclosed changes in bar passage rates from July 2013 in jurisdictions that did not use ExamSoft:

Arizona (-7 points)

District of Columbia (-8 points)

Kentucky (unchanged)

Virginia (-7 points)

These states have already disclosed their statewide passage rates, and their changes do not appear to be materially better than those in other jurisdictions around the country.
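
As a quick sanity check on that comparison, here is a toy sketch; the non-ExamSoft changes come from the list above, while the ExamSoft-jurisdiction average is a hypothetical placeholder rather than a reported figure.

```python
# Point changes in pass rates for the non-ExamSoft jurisdictions listed above.
non_examsoft_changes = {
    "Arizona": -7,
    "District of Columbia": -8,
    "Kentucky": 0,
    "Virginia": -7,
}

avg_non_examsoft = sum(non_examsoft_changes.values()) / len(non_examsoft_changes)
avg_examsoft = -6.0  # hypothetical placeholder for the ExamSoft jurisdictions

print(f"Non-ExamSoft average change: {avg_non_examsoft:+.1f} points")
print(f"ExamSoft average change (placeholder): {avg_examsoft:+.1f} points")
# If the upload fiasco drove the decline, the non-ExamSoft average should be
# materially better; on these figures it is not.
```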

It might still be a factor in the jurisdictions that use ExamSoft in conjunction with other variables. But it doesn’t appear to be the single, magic explanation for the decline. There are likely other, yet-unexplained variables out there.

 (I’m grateful to Jerry Organ for his comments on this theory.)