Law school ruin porn hits USA Today

I actually laughed out loud when I started reading this “yearlong investigation” by four USA Today journalists on the state of legal education. I call the genre “law school ruin porn.”

“Ruin porn” has long been a genre of photojournalism used to display the decay of urban centers or abandoned Olympic sites. And I think the genre translates to “law school ruin porn”: exploiting details about the most marginal law schools and the most at-risk students, then treating them as typical of the profession.

Here’s how the piece opens:

Sam Goldstein graduated from law school in 2013, eager to embark on a legal career.

Five years later, he is still waiting. After eight attempts, Goldstein has not passed the bar exam, a requirement to become a practicing attorney in most states.

"I did not feel I was really prepared at all" to pass the bar, Goldstein said of his three years in law school. "Even the best of test preps can't really help you unless you've had that solid foundation in law school."

In the meantime, many take lower-paying jobs, as Goldstein did, working as a law clerk. What he earned didn't put a dent in his $285,000 in student-loan debt, most of which was accrued in law school.

The piece is reminiscent of a genre of journalism that peaked in 2011 in a series of pieces by David Segal in the New York Times. Here’s how one of them opened:

If there is ever a class in how to remain calm while trapped beneath $250,000 in loans, Michael Wallerstein ought to teach it.

Here he is, sitting one afternoon at a restaurant on the Upper East Side of Manhattan, a tall, sandy-haired, 27-year-old radiating a kind of surfer-dude serenity. His secret, if that’s the right word, is to pretty much ignore all the calls and letters that he receives every day from the dozen or so creditors now hounding him for cash.

“And I don’t open the e-mail alerts with my credit score,” he adds. “I can’t look at my credit score any more.”

Mr. Wallerstein, who can’t afford to pay down interest and thus watches the outstanding loan balance grow, is in roughly the same financial hell as people who bought more home than they could afford during the real estate boom. But creditors can’t foreclose on him because he didn’t spend the money on a house.

He spent it on a law degree. And from every angle, this now looks like a catastrophic investment.

Well, every angle except one: the view from law schools.

The fundamental problem with a piece like this one in USA Today is how it treats the outlier as the norm. The vast majority of law students do pass the bar exam on the first attempt. The vast majority of law schools are at no risk of failing to meet the ABA’s standards. But the piece is framed in quite a different fashion.

A student like the one USA Today found is nearly impossible to find. For instance, I blogged earlier about how 2,293 first-time test-takers fared on the Texas bar exam. Only 10 failed the bar exam even four times. Granted, another 150 or so failed one, two, or three times and then stopped attempting (at least, stopped attempting in Texas). But it’s nearly impossible to find graduates who have had such poor performance, bad luck, or some combination of the two for such an extended period of time.

USA Today also profiled a graduate of Arizona Summit Law School, the outlier for-profit law school—I’ve blogged about how, before 1995, the ABA would not accredit for-profit law schools, until the Department of Justice compelled it to do so. (More on Arizona Summit in a bit.)

The ostensible focus of the piece is the ABA’s renewed proposal to require law schools to demonstrate an “ultimate” bar passage rate of 75% within two years of graduation. The result appears dire: “At 18 U.S. law schools, more than a quarter of students did not pass the bar exam within two years,” according to Class of 2015 data.

Of course, George W. Bush would have lost the 2000 presidential election if the National Popular Vote plan had been in place. Or, less snarkily: if the rules change, we should expect schools—and perhaps state bars—to change how they behave. If 75% were the cutoff, we would expect not just changes in admissions standards, but changes in bar exam cut scores, changes in where students are encouraged to take the bar exam, increased academic dismissal rates, and so on. In short, the 18 from the Class of 2015 doesn’t tell us much.

That said, there are two other reasons the 18 figure doesn’t tell us much. First, and this makes me more “doom and gloom,” it’s too conservative a figure to show the schools that may face a problem in the near future. Any school near an 80% ultimate pass rate, I think, would feel the heat of this proposal—a bad year, a few frustrated students who stop repeating, a weak incoming class, and so on could move a school’s figures a few percentage points and put them in danger. Another 12-15 law schools are within a zone of danger of the new ABA proposal.

Second, the 18 is not nearly as dire as the USA Today piece makes it seem. Two of them are schools in Puerto Rico, which are so different in kind from the rest of the ABA-accredited law schools in the United States that they are essentially two entirely different markets.

At the very end of the piece, it finally concedes something about Arizona Summit: “Arizona Summit Law School in Phoenix, Whittier Law School in Southern California and Valparaiso Law School in northern Indiana are not accepting new students and will shut once students finish their degrees.” Even without the ABA proposal, 3 of the 18 schools are shutting down—including Arizona Summit, the foil of the opening of the piece. So now the student is not simply an outlier, an 8-time bar test-taker from a for-profit school, but from a for-profit school that is no longer in operation. An outlier of an outlier of an outlier—given treatment as something typical. Talk about burying the lede.

And while the data come from the ABA, I have to wonder whether, because this is the first such disclosure from law schools, some of it is not entirely accurate. (Again, one would think a yearlong investigation would clear up these points.) Take Syracuse, listed with an ultimate pass rate of 71%. Its first-time July 2015 bar pass rate was 79%. (Its July 2016 rate rose to 89%.) Its combined February & July 2015 pass rate was 86%, along with 75% in New Jersey. (Its California rate for those two tests was 1-for-13.) Now, perhaps it has an unusually high number of graduates failing out of state, or graduates who didn’t take the July 2015 bar the first time and ultimately failed—I have no idea. But it’s the kind of outlier statistic that, to me, merits an inquiry rather than a simple report of figures. (UPDATE: Syracuse has indicated that the figures were, in fact, inaccurate, and that its ultimate bar passage rate was 82.6%.)

The piece also unhelpfully quotes, without critique, some conclusions from “Law School Transparency.” (You may recall that several years ago LST tried to shake down law schools by charging them at least $2750 a year to “certify” that those schools met LST’s disclosure standards.) For instance: “The number of law schools admitting at least 25% of students considered ‘at risk’ of failing the bar jumped from 30 schools to 74 schools from 2010 to 2014, according to a report in 2015 by Law School Transparency.” But if one cares about ultimate pass rates, as this article purports to, how is it that just 18 schools missed the “ultimate” pass rate benchmark compared to LST’s projected 74 (for 2014, though things weren’t much better by 2015)? In part because LST’s “at risk” definition is overly broad: it doesn’t account for academic dismissals (despite mentioning them in the report); it doesn’t account for variances among state bars (also mentioned in the report, but not used in identifying “at risk” students); it’s not clear whether LST is primarily concerned with first-time or ultimate passage (the report jumps around); and LST adds a level of risk (which USA Today misreports) for being “at risk” of not graduating in addition to being “at risk” of not passing the bar (which, I think, is an entirely valid thing to include); and so on.

A lengthy investigative piece should, in theory, provide greater opportunity for subtlety and fine-tuning points, rather than list a bunch of at-risk schools and serially identify problems with as many of them as possible. That isn’t to say that there aren’t some existential problems at a handful of law schools in the United States, or that the ABA’s proposal isn’t worthy of some serious consideration. It’s simply that this form of journalism is a relic of 2011, and I hope we see the return of more nuanced and complicated analyses to come.

Do specific substantive courses prepare students for those topics on the bar exam? Probably not

Earlier, I blogged about the disconcerting conclusion, from recent bar performance and the results of a California State Bar study, that law school “bar prep programs” appear to have no impact on students’ ability to pass the bar exam.

But what about specific substantive course areas? Does a student’s performance in, say, Torts translate into a stronger bar exam score?

The answer? Probably not.

First, let me clear a little underbrush about the claim I’d like to examine. We all know that students take some subjects that appear on the bar, but most don’t take all of them. And virtually all law school graduates take a bar preparation course offered by a for-profit company to help train them for the bar exam.

But law schools might think that they could improve bar passage rates by focusing not simply on “bar prep,” but on the substantive courses that will be tested on the bar exam. If bar passage rates are dropping, then curricular reform that requires students to take more Evidence, Torts, or Property might be perceived as a solution.

So what exactly is the relationship between substantive course performance and performance on that subject on the bar exam? Not much of one.

Back in the 1970s, LSAC commissioned a study looking at law schools in several states and their performance on the bar exam. The then-new Multistate Bar Exam had five subjects. Researchers looked at how law students performed in each of those substantive subject areas in law school: Contracts, Criminal Law, Evidence, Property, Torts. (The results of the study are found at Alfred B. Carlson & Charles E. Werts, Relationships Among Law School Predictors, Law School Performance, and Bar Examination Results, Sep. 1976, LSAC-76-1.)

The LSAC study examined first-year subject-area grades; first-, second-, and third-year grades; and overall law school GPA, and their correlations with each MBE subject area. The higher the number, the stronger the relationship.

Torts is an illustrative example. The correlation between TORT/L (grades in Torts) and students’ performance on the Torts portion of the MBE is 0.19, relatively weak. But grades in Torts were more predictive of performance in Real Property, Evidence, Criminal Law, and Contracts—perhaps a counterintuitive finding. That is, your Torts grade told you more about your performance on the Property portion of the bar exam than on the Torts portion.

Again, these correlations are all relatively weak, sitting in a narrow band from 0.19 to 0.26, so one shouldn’t draw much from differences within that noise.

In contrast, LGPA/L (overall law school GPA) was more highly correlated with each MBE subject area than any individual course grade was, and highly correlated (0.55) with total MBE performance. Recall that overall law school GPA includes a number of courses—bar-related and not—and it is still more predictive than any particular substantive course grade.

The LSAC study dug into further findings to conclude that the bar exam is testing “general legal knowledge,” and that performance in any particular subject area is not particularly indicative of strength of performance on that subject area on the bar exam.

The short of it is, this is good evidence that the important thing coming out of three years of law school is not the substantive transmission of knowledge, but the, for lack of a better phrase, ability to “think like a lawyer” (or simply engage in critical legal analysis). Bar prep courses the summer before the bar exam are likely the better place to cram the substantive knowledge for the bar; but the broad base of legal education is what’s being tested (perhaps imperfectly!) on the bar exam.

We also have the results of a recent study by the California State Bar. The study looked at student performance in particular course areas and the relationship with bar exam scores. After examining the results of thousands of students and bar results from 2013, 2016, and 2017, the findings are almost identical.

The correlations between grades in any one subject and that subject on the bar exam are modest, and sometimes grades are (slightly) more highly correlated with different subject areas—the same finding as LSAC’s 1976 study. But none of these correlations is nearly as strong as overall law school GPA’s, which the study finds is between .6 and .7 with the overall MBE and written components. (Unfortunately, this study didn’t break out the relationship between law school GPA and particular MBE topic areas.)

The study did, however, make an interesting finding and reached what I think is an incorrect possible conclusion.

The study discovered that cumulative GPA in California bar exam-related subject areas was highly correlated with cumulative GPA in non-bar-related subject areas.

It went on to find no relationship (in some smaller sets of data) between bar passage rates and participation in clinical programs; externships; internships; bar preparation courses; and “Non-Bar Related Specialty Course Units” (e.g., Intellectual Property).

Here’s the finding I’d take issue with: “However, overall CBX [California bar exam] performance correlated more strongly statistically with aggregate performance in all of the bar-related courses than with aggregate performance in all non-bar-related courses, suggesting that there may be some type of cumulative effect operating.”

I’m not sure that’s the right conclusion to draw. I think the report understates the likelihood that grade inflation in seminar courses, higher inconsistency in grading in courses taught by adjuncts, or grades in courses that don’t measure the kinds of skills evaluated on the bar exam (e.g., oral advocacy in graded trial advocacy courses) all affect non-bar-related course GPA. That is, my suspicion is that if one were to measure GPA in substantively similar non-bar-related courses (e.g., Federal Courts, Antitrust, Secured Transactions, Administrative Law, Mergers & Acquisitions, Intellectual Property), one would likely find a relationship similar to that of bar-related course GPA. That’s just a hunch. It’s what I’d love to see future reports examine.

That said, both in 1976 and in 2017, the evidence suggests that performance in a specific substantive course has little to say about how the student will do on the bar—at least, little unique to that course. Students who do well in law school as a whole do well on each particular subject of the bar exam.

When law schools consider how best to prepare their students for the bar, then, simply channeling students into bar-related subjects is likely ineffective. (And that’s not to say that law schools shouldn’t offer these courses!) Alternative measures should be considered. And I look forward to more substantive course studies like the California one in the future.

Why are law school graduates still failing the bar exam at a high rate?

The first decline took place in the July 2014 bar exam, which some believed might be blamed on an ExamSoft software glitch. Then came continued declines in the July 2015 exam, which some blamed on the addition of Civil Procedure to the Multistate Bar Exam. The declines persisted and even worsened.

Five straight July bar exam cycles with persistent low pass rates across the country. But the bar exam has not become more difficult. Why?

One reason rates remain low is that predictors for incoming classes remain low. LSAT scores actually declined among the most at-risk students between the incoming classes admitted in the 2011-2012 cycle (graduating in 2015) and the 2014-2015 cycle (graduating in 2018). The 25th percentile LSAT among full-time entrants dropped 2 points between the Class of 2015 and the Class of 2018. Indeed, 11 schools saw a drop of at least 5 LSAT points at the 25th percentile of their incoming classes—almost as many as the schools that saw any improvement whatsoever (just 12, including Yale and Stanford).

Not all LSAT declines are created equal: a drop from 170 to 168 is much more marginal than a drop from 152 to 150; and a drop can have a bigger impact depending on the cut score of the bar exam in each jurisdiction. But it’s no surprise, then, to see the persistently low, and even declining, bar passage rates around the country with this quick aggregate analysis.

Nevertheless, since around September 2014, law schools have been acutely aware of the problem of declining bar passage rates. Perhaps it was too late to course-correct on admissions cycles through at least the Class of 2017.

But what about academic advising? What about providing bar preparation services for at-risk students? Given that law schools have been on notice for nearly five years, why haven’t bar passage rates improved?

I confess, I don’t know what’s happened. But I have a few ideas that I think are worth exploring.

First, it seems increasingly likely that academic dismissal rates, while rising slightly over several years, have not kept pace to account for the significant decline in quality of entering students. Of course, academic dismissals are only one part of the picture, and a controversial topic at that, particularly if tethered to projections about future likelihood to pass the bar exam on the first attempt. I won’t delve into those challenging discussions; I simply note them here.

Another possibility is that law schools haven’t provided academic advising or bar preparation services to at-risk students—but that seems unlikely.

Still another, and perhaps much more alarming, concern is that those bar services have been ineffective (or not as effective as one might hope). And this is a moment of reckoning for law schools.

Assuredly, when the first downturns in scores came, law schools felt they had to do something, anything, to right the ship. That meant taking steps that would calm the fears of law students and appease universities. Creating or expanding bar preparation courses, or hiring individuals dedicated to bar preparation, were easy solutions—law students could participate in direct and tangible courses specifically designed to help them achieve bar exam success; law faculty could feel relieved that steps were being taken to help students; university administrators could feel confident that something was being done. Whether these steps bolstered existing offerings or added new ones, schools assuredly provided more opportunities to their students.

But… to what end? Something was done at many institutions. Has it been effective?

Apparently not. The lagging (and falling) bar passage rates are a sign of that. Granted, perhaps the slide would be worse without such courses, but that seems like cold comfort to schools that have been trying to affirmatively improve rates.

We now have the first evidence to that effect. A report commissioned by the California State Bar recently studied several California law schools that disclosed student-specific data on a wide range of fronts—not just LSAT and UGPA in relation to bar exam scores, but law school GPA, courses taken, even participation in externships and clinics.

One variable to consider was involvement in a bar preparation course. Did participation in a bar preparation course help students pass the bar? I excerpt the unsettling finding here:

Five law schools provided data for this variable. Students averaged about 1.5 units (range 0 to 6). For all those students, there was a -.20 (p<.0001) correlation between the number of units taken and CBX TOTSCL [California Bar Exam Total Scale Scores]. The source of this negative relationship appears to be the fact that in five out of six [sic] of the schools, it was students with lower GPAs who took these classes. After controlling for GPA, the number of bar preparation course units a student takes had no relationship to their performance on the CBX. A follow up analysis, examining just the students in the lower half of GPA distribution, showed that there was no statistically significant difference in CBX TOTSCL for those who took a bar preparation course versus those who did not (p=.24). Analyses conducted within each of the five schools yielded similar findings.

This should be a red flag for law schools seeking to provide bar preparation services to their students. In this study, whatever law schools are doing to help their students pass the bar has no discernible impact on students’ actual bar exam scores.
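The “controlling for GPA” step in the excerpt is, in effect, a partial correlation: residualize both variables on GPA, then correlate the residuals. Here is a minimal sketch with entirely invented numbers (a hypothetical cohort, not the study’s data) showing how a raw negative correlation between bar-prep units and bar scores can vanish once GPA is held constant:

```python
import numpy as np

# Hypothetical cohort: lower-GPA students take more bar-prep units,
# and bar scores are driven by GPA plus noise. All figures invented.
rng = np.random.default_rng(0)
n = 200
gpa = rng.uniform(2.0, 4.0, n)
units = np.clip(np.round(8 - 2 * gpa + rng.normal(0, 1, n)), 0, 6)
score = 60 * gpa + rng.normal(0, 5, n)

# Raw correlation is negative: a selection effect, not a treatment effect
raw = np.corrcoef(units, score)[0, 1]

def residualize(y, x):
    """Remove the linear effect of x from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partial correlation: correlate what's left after regressing out GPA
partial = np.corrcoef(residualize(units, gpa),
                      residualize(score, gpa))[0, 1]

print(f"raw r = {raw:.2f}, partial r (controlling GPA) = {partial:.2f}")
```

The raw coefficient comes out strongly negative while the partial one sits near zero, mirroring the study’s finding that the -.20 correlation reflects who takes the courses, not what the courses do.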

Granted, these are just five California law schools and the California bar. And there have been other school-specific programs at some institutions that may provide a better model.

But it’s worth law schools considering whether students are on a path toward improved bar passage or simply on a hamster wheel, doing more work without any discernible positive impact. More studies and evidence are of course in order. But the results from the last several years, confirmed by the study of five California law schools, suggest that revisiting the existing model is a matter of some urgency.

MBE scores drop to 34-year low as bar pass rates decline again

On the heels of some good news in recent administrations of the July bar exam comes tough news from the National Conference of Bar Examiners: Multistate Bar Exam (MBE) scores have dropped to a 34-year low, their lowest point since 1984.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133. A mean MBE score of 139.5 is comparable to 2015's (139.9) among recent years. One would have to go back to the 1980s to see similar scores: 1982 (139.7), 1984 (139.2), and 1988 (139.8).

I’d hoped that the qualifications of entering students had rebounded a bit as schools improved their incoming classes a few years ago; that students were putting more effort into the bar than in previous years; or that other factors were at work. That appears not to be the case this year.

That said, MBE scores may be slightly less predictive of actual bar pass rates than they once were. The NCBE has pointed out that the rise of the Uniform Bar Exam has led a number of test-takers to transfer scores to new jurisdictions rather than take a second jurisdiction’s bar—and, presumably, those who pass in one jurisdiction are much more likely to pass in another (accepting that cut scores vary across jurisdictions). UBE jurisdictions saw a few thousand such transfers last year, at least some from test-takers who would otherwise have taken a second bar exam. But set against more than 40,000 MBE test-takers, the effect, while real, is likely small.

Instead, we’re left to watch as results come in state by state. Tracking first-time pass rates (from jurisdictions that have shared them so far—ideally, rates for ABA graduates would be a better measure, but this works reasonably well for now), declines have been fairly consistent: New Mexico (-14 points), Indiana (-3), North Carolina (+1), Oklahoma (-8), Missouri (-7), Iowa (-3), Washington (-3), and Florida (-4). But in many of these jurisdictions, pass rates were worse in, say, 2015 or 2016.

We’ll know more in the months to come, but it looks like another year of decline will cause some continued anguish in legal education. The increased quality of law school applicants this year will help the July 2021 bar exam look much better.

Note: I chose a non-zero Y-axis to show relative performance.

February 2018 MBE bar scores collapse to all-time record low in test history

If that headline seems like déjà vu, it's because I wrote the same headline after the February 2017 MBE bar scores were released. There were some interesting comments last year about the best way to visualize the decline, so here are a couple of attempts below. (You can see more about the methodology choices in last year's post, including the reasons for a non-zero Y-axis; a zero-based axis would be absurd here.)

We now know the mean scaled national February MBE score was 132.8, down 1.8 points from last year's 134.0, which was already an all-time record low. We would expect bar exam passing rates to drop in most jurisdictions.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133. The trend is more pronounced when looking at a more recent window of scores.

On the heels of an uptick in MBE scores last July, the results are particularly troubling. Given how small the February pool is in relation to the July pool, though, it's hard to draw too many conclusions from it.

That said, the February cohort is historically much weaker than the July cohort, in part because it includes so many who failed in July and retook in February. Without knowing the percentage of repeaters, that would be the first place to look.

Another reason might relate to the increase in the July scores. Based on some informed speculation, some schools may have been advising some more at-risk students to delay taking the July exam and instead prepare more for the February exam in hopes of increasing first-time pass rates. If that happened, we may see a skewing in the quality of first-time test-takers in the February cohort, which would result in a decline in scores. That might explain some of the small improvement in July and decline in February.

At some point soon, however, we should see a more regular rebound in bar pass rates. The first major drop in bar exam scores was revealed to law schools in late fall 2014. That means the 2014-2015 applicant cycle, to the extent schools took heed of the warning, was a time for them to improve the quality of their incoming classes, leading to some improvement for the class graduating this May of 2018.

Of course, these are high-level projections and guesses. School-specific data would be useful. But it surely will not end the debates raging right now about the bar exam, and it will only serve to put more pressure on law schools looking at this July's bar exam.

UPDATE: NCBEX has revealed that first-time test-takers were 30% of the pool and saw a smaller decline than repeaters, but the number of repeaters was mostly unchanged. Karen Sloan has more.

A change in calculating pass rates for the California bar exam

Good news from the California bar: the overall bar pass rate rose year-over-year from 43% to 49.6%. Or... did it?

The State Bar of California made a small change to how it calculates the passing rate of bar exam test-takers. In April 2017, it adopted the following change:

It was moved, seconded and duly carried that beginning with the February 2017 administration of the California Bar Examination applicants who did not complete all portions of the examination not be included in the pass/fail statistics published at the time results from the examination are published; and that for an examination to be considered complete, applicant must have achieved a grade of at least 40 on their answers to each question on the examination.

The change is a sensible one: if a test-taker walks out in the middle of the exam, it doesn't seem terribly sensible to include that test-taker as a failure. That's not usually what we'd think about in terms of failure rates; instead, those who sat through the whole exam, answered all the questions, and tried to pass the bar would be the ones whose success rates we'd like to evaluate. A quotation from Karen Goodman on the Committee of Bar Examiners in the Daily Journal was consistent with this: "It seemed like if people did not finish the test, they should not count against the pass rate." (Of course, I suppose, the person did fail!)

At the same time, instituting this new change could make it appear that bar pass rates were higher than they actually were, because the new pass rates are going to be higher than old pass rates due to the change in methodology.

The February 2017 overall pass rate was reported at 34.5%, when under the old methodology it would have been 33.9% (a 0.6-point difference); 78 test-takers did not complete the exam.

For July 2017, 66 did not complete the exam. That lifted the overall pass rate from 49.19% to 49.57%. A California bar representative also informed me that the July 2016 exam had 89 test-takers who did not complete the exam, for a pass rate of 43.57% v. 43.07%.
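The arithmetic of the change is simple: the old methodology divides passers by everyone who sat, while the new one divides by completers only. A quick sketch (the passer and sitter headcounts below are my back-of-the-envelope figures, back-solved from the reported July 2017 percentages, not published counts):

```python
# Hypothetical headcounts back-solved from the reported July 2017 rates
# (the bar publishes percentages, not these raw counts): illustrative only.
passers = 4236
total_sitters = 8611        # everyone who sat for any portion of the exam
did_not_complete = 66       # skipped or walked out of some portion

old_rate = passers / total_sitters                       # old methodology
new_rate = passers / (total_sitters - did_not_complete)  # new methodology

print(f"old: {old_rate:.2%}, new: {new_rate:.2%}")  # old: 49.19%, new: 49.57%
```

Shrinking the denominator by a few dozen test-takers moves the headline rate by only a few tenths of a point, which is why the distortion is modest but real.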

(It's worth emphasizing this difference is probably even smaller today: the bar has been shortened from three days to two as of July 2017, making it more likely that individuals will finish the exam.)

This is a very modest advantage to all schools in reporting their overall pass rates--odds are that one dropout in 200 test-takers can bump a school's overall rate by a point (when rounded). And it offers a very modest (if slightly deceptive) improvement to the current state of affairs when considering bar passage rates in California. It also makes comparisons across years slightly inconsistent.

But, in an era nearly obsessed with almost any numerical change in bar exam statistics, this one is worth highlighting for future consideration. The true year-over-year comparison is 43.6% to 49.6% (+6 points), or 43.1% [sic; that's the percentage shared with me!] to 49.2% (+6.1 points), not 43.0% to 49.6% (+6.6 points). In future years, the comparison will be easier to make.

Recent trends in non-JD legal education

I've blogged before about the rise of non-JD legal education. Law schools increasingly rely on non-JD sources of revenue (now 1 in 9 students enrolled at a law school is not part of the JD program, up sharply over the last few years). I've also expressed some concern about the value proposition of some of those degrees, particularly given the high failure rate of LLM graduates on the bar exam.

I thought I'd share a prediction, an update, and a new observation.

First, I predict that non-JD enrollment will drop this year, the first such decline in some time. I suggested last year that the new presidential administration might lead to declines in foreign visitors to American educational institutions. I anticipate that will be true when it comes to non-JD education (and foreign students are a significant portion of such degree offerings). Even though the "Travel Ban 1.0/2.0/3.0" has been ostensibly limited in scope and had significant legal challenges (in addition to naturally-expiring deadlines), I think these formal legal postures are quite distinct from the pragmatic effect that even the rhetoric about such immigration restrictions would have on prospective foreign students. We should know more next month.

Second, the New York bar is by far the most popular bar exam for foreign attorneys. This year, first-time test-takers from foreign countries had a whopping 57% pass rate, dramatically up from the historic 42%-46% pass rates of recent years. I don't know what would cause such an increase--more students from English-speaking countries, better bar prep, or any of a number of factors. But it's worth noting in light of my earlier concerns about low bar pass rates. (The same kind of improvement took place in Texas: first-time pass rates among foreign-educated July test-takers rose from 20% in 2015 and 25% in 2016 to 44% in 2017.) Not all of these test-takers have secured a US non-JD degree, but many do as a prerequisite to taking a state bar exam.


Third, law schools have discovered online non-JD legal education. It's not clear how such degrees fit into the overall marketplace (any more than non-JD degrees generally do), but such offerings might offset at least some of the loss in other non-JD enrollment.

Indeed, breaking down traditional versus online non-JD enrollment over the last few years, online non-JD enrollment is up significantly while traditional non-JD enrollment has flattened. Much of the most recent growth, then, has come from online non-JD degrees. While online programs enrolled just 1590 non-JD students in 2014, that figure nearly doubled to 2971 in 2016--and I expect it is larger still for Fall 2017.

Only 38 schools had online non-JD programs in Fall 2016, and even that figure is deceiving: an eclectic crop of eight schools accounted for about half of all online non-JD enrollment in 2016.

Again, the Fall 2017 figures will be released soon, and we'll see what changes these trends have undergone. I remain interested in the place of non-JD degrees and in future enrollment trends, and I'll happily report more updates here.

Why are bar exam scores improving?

The news that the mean scaled MBE score has risen for the second year in a row and is now at its highest since 2013 is good news for law schools and law students. I've been tweeting the comparative overall pass rates in some jurisdictions as they roll in. They show that, as expected given the rise in MBE scores, passing rates are up in most jurisdictions. That's helped by jurisdictions that have lowered their cut scores: Oregon, for instance, reduced its passing score from 142 to 137, and its passing rate rose from 58% in July 2016 to 79% in July 2017. (The low point of MBE scores came in July 2015.)

But why? In 2014, I noted that bar pass rates looked to have a bleak (at least short-term) future. In 2016, scores slightly improved; and here in 2017, they've improved quite a bit (though they remain well behind 2013 and the preceding decade of relatively high scores).

Schools that saw declines in their bar pass rates between September and November 2014 would not have been able to act on the admissions front until the class that began in August 2015. (Indeed, some might have hoped it was a one-time blip and not reacted even then.) But we can look at a couple of things to see whether their practices changed.

First, it turns out that the bottom end of the incoming class of August 2014 had worse predictors than that of August 2012--yet the July 2017 test-takers scored much better than the July 2015 test-takers. A whopping 146 law schools saw a decline in the 25th-percentile LSAT of their incoming classes (i.e., the cohort most likely to fail the bar--relative, of course, to each school's LSAT profile and each jurisdiction's cut score) over that two-year period; 29 held steady; and just 14 saw an improvement.

If anything, then, we should have expected bar pass scores to be much worse this past July! But there is another factor: academic dismissals. Note that the incoming class of August 2014 may have had worse credentials, but it completed its first year in May 2015, shortly after schools would have become aware of the significant drop in bar pass rates.

Professor Jerry Organ tracked attrition and noted an uptick in academic dismissals in that August 2014 incoming class by 2015--before it took the July 2017 bar. Overall first-year attrition rose slightly, from 6.25% for the Class of 2015 to 7.04% for the Class of 2017. But attrition rose most at schools with the lowest LSAT profiles: among schools with a median LSAT below 150, attrition rose from 12.1% to 17.1% over that two-year stretch, while declining slightly at all other institutions.

Surely that offsets some of the worsening LSAT profiles. But it can hardly explain all of it. I wonder whether institutions have found better strategies for intervening with at-risk students, or for providing more robust bar exam support. Perhaps in the last couple of years, students have been sufficiently scared of failing the bar to study harder or earlier (we know that, over time, a bar exam test-taker's score will improve). These are matters institutions may have the data to examine (or may be in the process of collecting). Regardless, it remains good, albeit slightly mysterious, news--and those in legal education hope it is the beginning of a continued trend of good news.