Why are law school graduates still failing the bar exam at a high rate?

The first decline took place in the July 2014 bar exam, which some believed might be blamed on an ExamSoft software glitch. Then came continued declines in the July 2015 exam, which some blamed on the addition of Civil Procedure to the Multistate Bar Exam. The declines persisted and even worsened.

Five straight July bar exam cycles with persistently low pass rates across the country. But the bar exam has not become more difficult. Why?

One reason rates remain low is that predictors for incoming classes remain low. LSAT scores actually declined among the most at-risk students between the incoming classes admitted in the 2011-2012 cycle (graduating in 2015) and those admitted in the 2014-2015 cycle (graduating in 2018). The median 25th-percentile LSAT among full-time entrants dropped 2 points between the Class of 2015 and the Class of 2018. Indeed, 11 schools saw a drop of at least 5 points in their 25th-percentile LSAT, almost as many as the schools that saw any improvement whatsoever (just 12, including Yale and Stanford).

Not all LSAT declines are created equal: a drop from 170 to 168 is much more marginal than a drop from 152 to 150, and a drop can have a bigger impact depending on the cut score of the bar exam in each jurisdiction. It's no surprise, then, given even this quick aggregate analysis, to see persistently low, and even declining, bar passage rates around the country.

Nevertheless, since around September 2014, law schools have been acutely aware of the problem of declining bar passage rates. Perhaps it was too late to course-correct on admissions cycles through at least the Class of 2017.

But what about academic advising? What about providing bar preparation services for at-risk students? Given that law schools have been on notice for nearly five years, why haven’t bar passage rates improved?

I confess, I don’t know what’s happened. But I have a few ideas that I think are worth exploring.

First, it seems increasingly likely that academic dismissal rates, while rising slightly over several years, have not kept pace with the significant decline in the quality of entering students. Of course, academic dismissals are only one part of the picture, and a controversial topic at that, particularly if tethered to projections about students' future likelihood of passing the bar exam on the first attempt. I won't delve into those challenging discussions; I simply note them here.

Another possibility is that law schools haven't provided those academic advising or bar preparation services to students, but that seems unlikely.

Still another, and perhaps much more alarming, concern is that those bar services have been ineffective (or not as effective as one might hope). And this is a moment of reckoning for law schools.

Assuredly, when the first downturns in scores came, law schools felt they had to do something, anything, to right the ship. That meant taking steps that would calm the fears of law students and appease universities. Creating or expanding bar preparation courses, or hiring individuals dedicated to bar preparation, were easy solutions: law students could participate in direct and tangible courses specifically designed to help them achieve bar exam success; law faculty could feel relieved that steps were being taken to help students; university administrators could feel confident that something was being done. Whether these efforts bolstered existing courses or added new ones, schools assuredly provided more opportunities to their students.

But… to what end? Something was done at many institutions. Has it been effective?

Apparently not. The lagging (and falling) bar passage rates are a sign of that. Granted, perhaps the slide would be worse without such courses, but that seems like cold comfort to schools that have been trying to affirmatively improve rates.

We now have the first evidence to that effect. A report commissioned by the State Bar of California recently studied several California law schools that disclosed student-specific data on a wide range of fronts: not just LSAT and UGPA in relation to bar exam scores, but law school GPA, courses taken, even participation in externships and clinics.

One variable to consider was involvement in a bar preparation course. Did participation in a bar preparation course help students pass the bar? I excerpt the unsettling finding here:

Five law schools provided data for this variable. Students averaged about 1.5 units (range 0 to 6). For all those students, there was a -.20 (p<.0001) correlation between the number of units taken and CBX TOTSCL [California Bar Exam Total Scale Scores]. The source of this negative relationship appears to be the fact that in five out of six [sic] of the schools, it was students with lower GPAs who took these classes. After controlling for GPA, the number of bar preparation course units a student takes had no relationship to their performance on the CBX. A follow up analysis, examining just the students in the lower half of GPA distribution, showed that there was no statistically significant difference in CBX TOTSCL for those who took a bar preparation course versus those who did not (p=.24). Analyses conducted within each of the five schools yielded similar findings.
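
For readers curious about the mechanics, here is a minimal sketch of the kind of GPA-controlled analysis the report describes. The file and column names are hypothetical stand-ins for the schools' student-level data; the report does not say which software or exact model it used.

```python
# A minimal sketch of the report's analysis, assuming hypothetical
# column names: bar_prep_units, lsgpa (law school GPA), and cbx_totscl
# (California Bar Exam total scale score).

import pandas as pd
from scipy.stats import pearsonr
import statsmodels.formula.api as smf

df = pd.read_csv("bar_outcomes.csv")  # hypothetical student-level data

# The raw correlation the report found was r = -.20, p < .0001.
r, p = pearsonr(df["bar_prep_units"], df["cbx_totscl"])
print(f"raw correlation: r = {r:.2f}, p = {p:.4f}")

# Controlling for GPA: the coefficient on bar_prep_units is the
# relationship that remains once GPA is held constant. Per the report,
# it is statistically indistinguishable from zero.
model = smf.ols("cbx_totscl ~ bar_prep_units + lsgpa", data=df).fit()
print(model.summary())
```

The intuition: the negative raw correlation arises because lower-GPA students disproportionately take the courses; the regression separates the course effect from the GPA effect.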

This should be a red flag for law schools seeking to provide bar preparation services to their students. In this study, whatever law schools are doing to help their students pass the bar has no discernible impact on students' actual bar exam scores.

Granted, these are just five California law schools and the California bar. And there have been other school-specific programs at some institutions that may provide a better model.

But law schools should consider whether students are on a path toward improved bar passage or simply on a hamster wheel, doing more work without any discernible positive impact. More studies and evidence are of course in order. But the results from the last several years, confirmed by the study of five California law schools, suggest that revisiting the existing model is a matter of some urgency.

The tiny impact (so far) of GRE law school admissions

The University of Arizona announced in early 2016 that it would consider GRE scores as a valid and reliable measure for prospective law students, accepting a test other than the LSAT. Dozens of schools have since followed suit. But the impact on the admissions front has been decidedly muted.

Just 168 law students entered without an LSAT score, out of around 38,000 matriculants at ABA-accredited law schools (excluding the three law schools in Puerto Rico). That's up from 81 last year. (Earlier data is hard to compare because some schools reported negative numbers of students entering without LSAT scores.) That's a big relative increase but a small absolute figure.

Arizona, the leader in this field, had 18 students enter in Fall 2018 without LSAT scores. Georgetown and Harvard also had 18 each.

But ABA data makes this figure hard to evaluate. It includes students who in previous years might also have been admitted without an LSAT score, such as students admitted from the school's own undergraduate program or another graduate program, as long as the student scored in the 85th percentile of the ACT/SAT/GRE/GMAT, or was in the top 10% of the class, or had a 3.5 undergraduate GPA. Some schools assuredly took advantage of this admissions option in the past and continue to do so today. Harvard went from 2 non-LSAT admissions in 2017 to 18 in 2018, after announcing in March 2017 that it would accept the GRE; Georgetown went from 0 in 2017 to 18 in 2018 after an August 2017 announcement. That suggests the GRE has had a more significant recent impact at those schools.

Without more granular data from the ABA, it's hard to know how much the GRE trend is affecting law school admissions. At a high level so far, however, the impact is tiny. While many schools have now announced they'll accept the GRE, that's translated into extraordinarily few matriculants: less than one half of one percent, even assuming every single non-LSAT admission is a GRE admission (which they aren't). At Arizona, such admissions are a good chunk of the incoming class, around 10% to 15%. At Harvard and Georgetown, 2% to 3%.
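
A quick back-of-the-envelope check of those shares. The per-school class sizes below are hypothetical round numbers; only the 168, 38,000, and 18-student counts come from the ABA data discussed above.

```python
# Rough arithmetic behind the shares above. Class sizes are hypothetical
# round numbers; the non-LSAT counts are from the ABA data.

non_lsat = 168
matriculants = 38_000
print(f"nationally: {non_lsat / matriculants:.2%}")  # ~0.44%, under half a percent

for school, class_size in [("Arizona", 130), ("Harvard", 600), ("Georgetown", 700)]:
    print(f"{school}: {18 / class_size:.1%}")  # ~13.8%, 3.0%, 2.6%
```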

But as more schools announce, and as more students perhaps opt in, we'll see whether these trends change in the years to come. And the impact of graduates who used the GRE on the bar exam is surely a matter to consider in the future.

Law school JD enrollment inches upward as non-JD enrollment continues to soar

The 2018 law school enrollment figures have been released. They show a slightly improved picture in JD enrollment and continued soaring growth in non-JD enrollment. About 14% of law school enrollees, 1 in 7, are not enrolled in a JD program.

This is the fourth consecutive year of growth in the incoming 1L class, and a bit larger growth than in the last few years: there were 38,390 new 1L enrollees, about 900 more than the incoming class in 2017. It's the largest 1L class since 2013's 39,675. (The 2010 peak was 52,488 1Ls.) And this despite the fact that two schools, Valparaiso and Whittier, admitted zero first-year students.

Despite much-heralded promises of a "Trump bump" in law school enrollment, my earlier posts on the subject proved true: a modest increase in 1L enrollment of 2.6%, as I projected in December 2017, despite a 19% increase in LSAT test-takers and an 8% spike in applicants. (Many test-takers were repeaters, not all translate into applicants let alone enrollees, and the quality of scores matters.) Part of this reflects some of the concerns I raised earlier, but it appears most schools were more cautious this cycle, choosing instead to improve class quality (likely a good move given persistently low bar exam scores).

Total JD law school enrollment also ticked up slightly to 111,561 (still well off the 2010-2011 peak of 147,525).

Non-JD enrollment continues to climb. The ABA changed its definitions a couple of years ago, which resulted in a spike in reported non-JD enrollment, but the steady climb continues: 18,523 students were enrolled in non-JD programs, a 1,400-student jump over last year.

Now over 14% of all students enrolled in law school, about 1 in 7, are not part of a JD program.

Growth in non-JD online enrollment as a part of overall non-JD enrollment continues, too, with much faster growth than when I first looked at trends two years ago.

This is overall modestly good news for law schools. I continue to wonder about the sustainability and value proposition of non-JD legal education, but perhaps my concerns are overblown.

That said, more information about the kinds of degrees, and the outcomes of those who secure them, would be welcome, even if it is the kind of information that is unavailable at this time.

Finally, we see some continued growth in LSAT test-takers again this cycle. We may see 1L enrollment creep up again, perhaps surpassing 40,000 students next year. For law schools, a robust and valuable JD program is essential, and that would be a good step toward restoring some of the losses suffered after the recession. Below I highlight a handful of schools with the highest non-JD enrollment as a percentage of total law school enrollment.

Comparing Google Scholar's H5 index to Sisk-Leiter citations

After the latest release of Professor Greg Sisk's scholarly impact measure for law school faculties, Professor Brian Leiter blogged a series of smaller rankings of individual faculty members in different scholarly areas. I thought I'd use the data for a quick look at the differences between measures of scholarly activity: the longstanding Sisk-Leiter method on the one hand, and Google Scholar's H5 index on the other.

One major barrier to using Google Scholar is that it only works for those who create a public profile (absent a time-consuming back channel like Publish or Perish). Beyond that, the two measures do different things.

Google Scholar's index covers more works than the Sisk-Leiter methodology does, including far more non-law and interdisciplinary works. It's a value judgment as to which metric ought to matter; or, perhaps, it's a reason to consider both and acknowledge they measure different things!

Google Scholar gives "credit" each time an author is cited within a single piece; Sisk-Leiter gives "credit" for only one mention per piece. The downside for Sisk-Leiter is that an author who has 12 of her articles cited in a single piece would receive credit in Google Scholar for 12 citations, but only 1 in Sisk-Leiter. On the flip side, an author who cites 12 of his own pieces in a single article would receive credit in Google Scholar for 12 citations, but only 1 in Sisk-Leiter; and, I think, self-citations are, on the whole, less valuable when measuring "impact."

Google Scholar credits all authors of a piece; Sisk-Leiter misses names omitted by "et al." citations. There is a method to help mitigate this concern, but, again, this tends to benefit interdisciplinary scholars in Google Scholar, and tends to benefit (through omission) the more typical sole-authored law pieces in Sisk-Leiter. That said, Professor Leiter updated his blog's rankings with some corrections from Professor Ted Sichelman.

Google Scholar counts references to indexed, recognized scholarship; Sisk-Leiter extends to all mentions, including mentions of blog posts or opinion pieces typically not indexed in Google Scholar. It's another value judgment as to which metric ought to matter. On this dimension, Sisk-Leiter can be broader than Google Scholar.

Sisk-Leiter offers a greater reward for a few highly-cited works; H5 offers a greater reward for breadth and depth of citations. (This is specific to the H5 measure as opposed to Google Scholar more generally: I chose to compare Sisk-Leiter to the H5 index rather than to C5, total citations in the last five years.) H5 measures the largest number X such that X pieces have received at least X citations in the last five years. So if you have 10 articles that have each received at least 10 citations since 2013, your H5 index is 10. It doesn't matter if your 11th piece has 9 citations; it doesn't matter if one of your 10 pieces has 10,000 citations. It's a measure of depth and breadth, different in kind from total citations.
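
To make the definition concrete, here is a minimal sketch of the H5 computation. The citation counts are made up for illustration; Google Scholar computes this over a scholar's publications using citations from the last five calendar years.

```python
# A minimal sketch of the H5 index described above: the largest X such
# that X pieces have at least X citations in the last five years.
# The citation counts below are made up for illustration.

def h5_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this piece still "supports" an index of `rank`
        else:
            break
    return h

# Ten pieces with at least 10 citations each yields an H5 of 10, whether
# the top piece has 10,000 citations or the 11th piece has 9.
print(h5_index([10_000, 50, 30, 20, 15, 12, 11, 10, 10, 10, 9]))  # 10
```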

In the chart below, I plotted the logged Sisk-Leiter citation counts against the Google H5 index. I drew from about 85 scholars who both appeared in one of the Leiter rankings and had a public Google Scholar profile, and I pulled their Google Scholar profiles this fall (figures may therefore be slightly off from today's). Google Scholar is also only as good as its profiles: if scholars have failed to maintain their profiles with recent publications, it may understate their citations. I highlighted in blue circles those identified in the Leiter rankings as age 50 and under.

I included a trendline to show the relationship between the two measures. Those "above" the line have higher Sisk-Leiter scores than their Google H5 scores would predict and, in a sense, "benefit" from the use of that metric over Google H5. Those "below" the line, in contrast, would "benefit" more from the use of Google H5. At a glance, it's worth considering that perhaps more "pure law" scholars sit above the line and more interdisciplinary scholars below it. That's not a judgment about one or the other, and only a broad generalization, but it is one way of thinking about how we measure scholarly impact, and perhaps a reason to think more broadly about faculty impact. Recall, too, that this chart selectively includes faculty, and that some citation totals vary wildly with the particular fields scholars write in. The usual caveats about the data apply: there are weaknesses to every citation metric, and this is just a way of comparing a couple of them.
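
For the curious, the chart amounts to something like the following sketch. The file and column names are hypothetical; the real data came from the Leiter rankings and the scholars' public Google Scholar profiles.

```python
# A minimal sketch of the chart above, assuming a hypothetical CSV with
# columns sisk_leiter, h5, and under_50. Points above the fitted line
# fare relatively better under Sisk-Leiter; points below, under H5.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("scholar_citations.csv")  # hypothetical file
x = np.log(df["sisk_leiter"])              # logged Sisk-Leiter citations
y = df["h5"]

slope, intercept = np.polyfit(x, y, 1)     # simple linear trendline

plt.scatter(x, y, c=np.where(df["under_50"], "tab:blue", "tab:gray"), alpha=0.7)
xs = np.linspace(x.min(), x.max(), 100)
plt.plot(xs, slope * xs + intercept, color="black")
plt.xlabel("log(Sisk-Leiter citations)")
plt.ylabel("Google Scholar H5 index")
plt.show()
```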

MBE scores drop to 34-year low as bar pass rates decline again

On the heels of some good news in recent administrations of the July bar exam comes tough news from the National Conference of Bar Examiners: the Multistate Bar Exam (MBE) scores have dropped to a 34-year low, their lowest point since 1984.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, New York's 133. This year's mean MBE score of 139.5 is comparable, among recent years, only to 2015 (139.9). One would have to go back to the 1980s to see similar scores: 1982 (139.7), 1984 (139.2), and 1988 (139.8).

I'd hoped that the qualifications of entering students had rebounded a bit as schools improved their incoming classes a few years ago; that students were putting more effort into the bar than in previous years; or that other factors would help. That appears not to be the case this year.

That said, MBE scores may be slightly less predictive of actual bar pass rates than they once were. The NCBE has pointed out that the rise of the Uniform Bar Exam has led a number of test-takers to transfer scores to new jurisdictions rather than take a second jurisdiction's bar; presumably, those who pass in one jurisdiction are much more likely to pass in another (accepting that cut scores vary across jurisdictions). NCBE data points to a few thousand such transfers last year, at least some of whom might otherwise have taken a second bar exam. But set against more than 40,000 MBE test-takers, the effect, while real, may be small.

Instead, we're left to watch as results come in state by state. Tracking first-time pass rates (from the jurisdictions that have shared them so far; ideally, rates for ABA graduates would be a better measure, but this works reasonably well for now), declines have been pretty consistent: New Mexico (-14 points), Indiana (-3), North Carolina (+1), Oklahoma (-8), Missouri (-7), Iowa (-3), Washington (-3), and Florida (-4). But in many of these jurisdictions, pass rates were worse in, say, 2015 or 2016.

We'll know more in the months to come, but it looks like another year of decline will cause some continued anguish in legal education. The increased quality of law school applicants this year should help the July 2021 bar exam look much better.

Note: I chose a non-zero Y-axis to show relative performance.

The demise of the stand-alone law school

Most law schools accredited by the American Bar Association are affiliated with a university. In the last quarter century, we've seen the slow demise of the stand-alone law school. Few are left.

In 1995, Michigan State acquired the Detroit College of Law.

Penn State in 1997 announced a similar plan to create a law school by acquiring the Dickinson School of Law. (Penn State would eventually have law schools at two sites, then split them into two separate law schools under the Pennsylvania State University system.)

In 2010, the Franklin Pierce Law Center affiliated with the University of New Hampshire.

Western Michigan University affiliated with Thomas M. Cooley Law School in 2014.

In 2015, William Mitchell College of Law merged with Hamline University School of Law to become Mitchell Hamline School of Law, formally affiliated with Hamline University.

The recent announcement that the University of Illinois at Chicago would merge with John Marshall Law School is the latest.

(I exclude for-profit schools from this analysis, which come with their own complications. But in 2012, Western State became a part of Argosy University. Savannah Law School recently announced its upcoming closure.)

I also anticipate that someone will point out omissions or errata in my assuredly-incomplete list....

At this point, then, there are only a handful of stand-alone law schools left. The recent news over tenure at Vermont Law School shows that without a university affiliation, weathering tough times can be a significant challenge for stand-alone law schools. How many might remain after the next quarter-century?

Apart from Vermont, there are California Western (in San Diego), Thomas Jefferson School of Law (in San Diego), South Texas College of Law Houston, New England Law | Boston, Brooklyn Law School, New York Law School, and Appalachian School of Law. (Again, please correct any omissions or errata!) (I originally included the University of California Hastings, but given its affiliation with the UC system, perhaps it's simply different in kind and should not be included....)

At this pace, we might expect another couple of closures or mergers in the next few years. And it's simply a demonstration that legal education is changing, and old stand-alone law schools are slowly becoming a thing of the past.

Note: this post has been updated thanks to helpful Twitter feedback and helpful comments!

Federal Judicial Clerkship Report of Recent Law School Graduates, 2018 Edition

I've regularly posted judicial clerkship statistics on this blog. This year, I offer something slightly different: "Federal Judicial Clerkship Report of Recent Law School Graduates, 2018 Edition," a report I've posted on SSRN.

This Report offers an analysis of the overall hiring of recent law school graduates into federal judicial clerkships from 2015 through 2017 for each law school. It includes an overall hiring report, regional reports, overall hiring trends, an elite hiring report, and trends concerning judicial vacancies.

A preview of overall placement:

There's also been a decline in total law school federal clerkship placement, likely attributable in part to the rise in federal judicial vacancies:

For these and more, check out the Report!

Continued hope for modest law school applicant increase in 2018

After a sharp spike in LSAT test-takers in July 2017, I noted that it was good news, with some caution, because first-time test-takers were becoming a smaller and smaller share of the LSAT test-taker pool. In December 2017, I noted the same cautious optimism for improved applicant quality and quantity this admissions cycle.

You can sort through up-to-date figures at LSAC here to see the pace of applicants, including higher quality and quantity. But, again, cautious optimism is in order.

While LSAT test-takers are up 19% year-over-year, applicants look to rise just 8%: better than a decline or a nominal increase, to be sure, but far short of the surge one might project from LSAT test-takers. Then again, given unlimited retakes, this is hardly a surprise. The real surprise is that, with a number of schools now accepting the GRE, one might expect LSAT test-takers to understate applicants. That's apparently not the case (at the moment, on a very superficial level).

Schools should hope that applicants exceed 60,000, which would be the first time since the 2009-2010 cycle. (I should emphasize here that LSAC has changed some of its counting in the last few cycles, so it's a rough approximation to go across years like this.) Additionally, if schools modestly increase their matriculants as the quality and quantity increases, we may see more than 40,000 enrolled for the first time since Fall 2012.

But visualized this way, the sharp increase of LSATs administered is in some contrast to the modest increase in applicants. Time will tell what this cycle holds--and by next fall, we'll know how schools handled this applicant pool in terms of overall matriculants.

Small law firm jobs shrink dramatically and big law hiring picks up for the Class of 2017

After sharing some big-picture good news about the legal job market for the Class of 2017, I thought I'd share a few details on the market, similar to my report last year. Indeed, the report is very similar to last year's because the trends have accelerated. And outcomes appear to be qualitatively and quantitatively better.

I drew comparisons to the Class of 2013 (which, it should be noted, were nine-month figures). Declines in overall jobs, overall graduates, and bar passage rates assuredly affect some of the industry-specific figures. Last year, I noted that jobs in smaller firms and business and industry were disappearing for entry-level hires. That continues to be the case.

FTLT                 Class of 2013   Class of 2017      Net    Delta
Solo                           926             392     -534   -57.7%
2-10                         6,947           5,145   -1,802   -25.9%
11-25                        1,842           1,628     -214   -11.6%
26-50                        1,045             953      -92    -8.8%
51-100                         846             779      -67    -7.9%
101-250                      1,027             956      -71    -6.9%
251-500                      1,041             983      -58    -5.6%
501+                         3,978           4,569     +591   +14.9%
Business/Industry            5,494           3,241   -2,253   -41.0%
Government                   4,360           3,812     -548   -12.6%
Public Interest              1,665           1,419     -246   -14.8%
Federal Clerk                1,259           1,151     -108    -8.6%
State Clerk                  2,043           1,984      -59    -2.9%
Academia/Education             490             303     -187   -38.2%
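
(FTLT: full-time, long-term positions.) The Net and Delta columns are simple differences and percent changes from the 2013 baseline; a minimal sketch of the arithmetic, using two rows from the table:

```python
# How the Net and Delta columns are derived: net change and percent
# change from the Class of 2013 baseline. Two rows shown for brevity.

jobs = {
    "Solo": (926, 392),
    "501+": (3978, 4569),
    # ... the remaining categories follow the same pattern
}

for category, (c2013, c2017) in jobs.items():
    net = c2017 - c2013
    delta = 100 * net / c2013
    print(f"{category}: net {net:+d}, delta {delta:+.1f}%")
```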

I think the decline is likely attributable to two factors. First, as bar passage rates decrease, the most marginal graduates (who were already the ones most likely to enter solo practice) are the ones most likely to be squeezed out. The same holds true at very small firms of 2-10 attorneys. If the graduates who'd typically fill those spots are now failing the bar exam, we'd expect the positions to decline. A nearly 60% decline in entry-level sole practitioners and a more than 25% decline in 2-10-attorney firm hiring are pretty sharp drops in just four years.

Additionally, business & industry jobs are the ones most likely to be categorized as J.D. advantage positions, and we've seen a decline in those positions generally.

On top of that, big law hiring (at firms with more than 500 attorneys) has increased 15% in four years. Given the dramatic decline in the number of graduates (12,000 fewer between 2013 and 2017), things look even better: for the Class of 2013, 8.6% of graduates ended up in the biggest law firm jobs; that figure climbed to 13.3% for the Class of 2017. Of course, big law jobs aren't everything, and there were slight declines at 101-500-attorney firms and in federal clerkships. But the trend is a good one.

All in all, these are good signs for the market. The employment figures are not just quantitatively better; they are also qualitatively better, as more graduates are in the most coveted jobs (again, conceding that big law jobs aren't everything), and fewer are in the more marginal or least desired positions.