Bar exam posts single-largest drop in scores in history

The Multistate Bar Exam, a series of multiple-choice questions administered across jurisdictions, has existed since 1972. The NCBE discloses statistics on mean scaled MBE scores going back to 1976.

After tracking the decline in bar scores across jurisdictions this year, I noted that the MBE had reached a 10-year low in scores. It turns out that's only part of the story.

The 2.8-point drop in scores is the single largest drop in the history of the MBE.

The next-largest drop came in 1984, which saw a 2.3-point decline from the July 1983 test. The biggest increase came in 1994, a 2.4-point gain over the July 1993 test. And the only other fluctuation exceeding two points was 1989's 2.2-point increase over the July 1988 test.
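To see how the 2014 figure fits in, a minimal sketch of the year-over-year comparison follows. It assumes a dictionary of national mean scaled July MBE scores keyed by year; only the July 2014 figure (141.47) and the July 2013 figure it implies (roughly 144.3) come from these posts, and the remaining years would be filled in from the NCBE's published statistics.

```python
# A minimal sketch (not the NCBE's methodology) for finding the largest
# year-over-year change in mean scaled July MBE scores.
# Only July 2014 (141.47) and the implied July 2013 figure (~144.3) come
# from these posts; the rest would come from NCBE statistics.
mean_scaled_mbe = {
    2013: 144.3,   # implied by the reported 2.8-point drop
    2014: 141.47,  # reported national mean, July 2014
    # ... fill in the remaining July administrations from NCBE statistics
}

changes = {
    year: round(mean_scaled_mbe[year] - mean_scaled_mbe[year - 1], 1)
    for year in sorted(mean_scaled_mbe)
    if year - 1 in mean_scaled_mbe
}
print(changes)                                 # {2014: -2.8}
largest_drop = min(changes, key=changes.get)
print(largest_drop, changes[largest_drop])     # 2014 -2.8
```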

What might be behind this change? I've speculated about a few things earlier; I'll address some theories in later posts this week.

Total LSAT takers in steady decline

Last year I blogged about the fact that for legal education, the worst is yet to come--there continued to be fewer LSAT takers and fewer law school applicants. I charted the decline in cumulative LSATs administered last October. But I noted that there seemed to be an evening out by the end of the cycle and updated the chart to reflect that.

No longer. LSAC has now reported a 9.1% decrease year-over-year in LSAT test-takers in the June 2014 test, and an 8.1% decline in the October 2014 test. That's a cumulative total of 52,745 LSATs administered, down from 57,670 at this point last year, and down from 93,341 in 2009-2010--that's more than a 40% decline in LSATs administered.
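The percentage figures are straightforward year-over-year comparisons; here is a quick sketch of the arithmetic using the cumulative totals quoted above.

```python
# Year-over-year percentage change in cumulative LSATs administered
# (June + fall tests), using the totals cited above.
def pct_change(new, old):
    return (new - old) / old * 100

print(round(pct_change(52_745, 57_670), 1))   # -8.5  (vs. the same point last year)
print(round(pct_change(52_745, 93_341), 1))   # -43.5 (vs. the same point in 2009-2010)
```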

Here's the updated chart showing cumulative LSATs administered.

It appears that for legal education, the worst still may be yet to come.

Bar exam scores dip to their lowest level in 10 years

Earlier, I noted that there had been a drop in bar passage rates in a handful of jurisdictions. (Follow that post to track state-by-state changes in the pass rates as the statistics come in.) A commenter theorized:

It's quite simple actually: the NCBE did a poor job of normalizing the MBE this year. The median MBE score is down a couple of points, and because states scale their essays to match the MBE results in their state, it also means median essay scores have decreased a small amount. Combine the two scores and you are seeing (in states using a 50/50 system), a 4-5 point drop in scores.

It's actually quite damning to the NCBE, because bar passage rates should be up and median MBEs also up if the historical correlation between LSAT and bar passage is taken into account.
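For readers unfamiliar with the mechanics the commenter describes, jurisdictions commonly put raw essay scores on the MBE's scale through a linear (mean and standard deviation) transformation, then weight the two components. The sketch below illustrates that general technique with made-up numbers; it is not any particular state's formula, and actual scaling procedures vary.

```python
import statistics

def scale_essays_to_mbe(raw_essays, mbe_scores):
    """Linearly rescale raw essay scores to the MBE's mean and standard
    deviation (a common way essay scores are put on the MBE's scale).
    Illustrative only; actual jurisdictions vary."""
    e_mean, e_sd = statistics.mean(raw_essays), statistics.pstdev(raw_essays)
    m_mean, m_sd = statistics.mean(mbe_scores), statistics.pstdev(mbe_scores)
    return [m_mean + m_sd * (e - e_mean) / e_sd for e in raw_essays]

# Hypothetical data: raw essay totals and scaled MBE scores for one state.
raw_essays = [58, 62, 70, 75, 80]
mbe_scores = [132.0, 138.5, 141.5, 147.0, 153.0]

scaled_essays = scale_essays_to_mbe(raw_essays, mbe_scores)

# A 50/50 combination, as the commenter describes: because essays are scaled
# to the MBE, a lower MBE mean pulls the scaled essay mean down with it,
# and combined scores drop too.
combined = [0.5 * m + 0.5 * e for m, e in zip(mbe_scores, scaled_essays)]
print([round(c, 1) for c in combined])
```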

Tennessee recently disclosed that the national mean scaled MBE score for July 2014 was 141.47. That's the lowest mean scaled MBE score for a July administration since 2004, when the mean scaled MBE score was 141.2 (PDF). It's also almost three points lower than the July 2013 score.

There are innocuous reasons why the score dropped. It might be that there were a disproportionately high number of repeated test-takers. It might be that an increase in non-American law degree test-takers yielded a drop. Or there might be other reasons, too.

But for whatever reasons, the decline in MBE scores is almost assuredly the reason that bar passage rates have dropped in a number of jurisdictions. Whether similar declines are going to arise in places like New York and California in the weeks ahead is simply a matter of waiting.

A more difficult bar exam, or a sign of declining student quality?

I saw this thread at Top-Law-Schools about bar passage rates apparently somewhat lower than in previous years. Thanks to the link aggregation of bar statistics at Deceptively Blonde, I could start comparing results to the NCBEX annual statistics (PDF). Unfortunately, because only selective statistics have been released so far, it's not possible to get a sufficiently granular analysis of bar passage. (For instance, most bars only report total pass rates, which include all takers, including repeaters and those from non-ABA-accredited schools.) But we can start with a little anecdata until the full NCBEX data is released next spring.

These figures compare overall July pass rates for all takers. Numbers are rounded to maintain consistency with NCBEX data.

Alabama, -6 points (July 2013: 71%; July 2014: 65%)

Florida, +1 point (July 2013: 71%; July 2014: 72%)

Idaho, -15 points (July 2013: 80%; July 2014: 65%)

Indiana, -8 points (July 2013: 76%; July 2014: 68%)

Iowa, -11 points (July 2013: 92%; July 2014: 81%)

New Mexico, +3 points (July 2013: 81%; July 2014: 84%)

North Carolina, -1 point (July 2013: 63%; July 2014: 62%)

Oklahoma, -3 points (July 2013: 82%; July 2014: 79%)

Oregon, -10 points (July 2013: 75%; July 2014: 65%)

Vermont, -6 points (July 2013: 72%; July 2014: 66%)

Washington, -8 points (July 2013: 85%; July 2014: 77%)

Of the eleven states that have disclosed overall bar passage rates, seven have passage rates that dropped at least five points, and three have passage rates that dropped at least ten points.

Why?

Have state bars begun increasing the difficulty of their exams? That seems unlikely: it's usually a big deal, and a public one, for a state to adjust its exam. The fact that this is happening in several places at once also makes that explanation unlikely.

Has student quality declined? The graduating class of 2014 was admitted in 2011, when the applicant pool was still very large and admissions standards at most schools were near their highest--so while we might see a decline in passage rates in the next couple of years as schools sacrifice LSAT medians, GPA medians, and, perhaps most importantly, index scores (as I blogged about here), that doesn't explain why there's a drop for this graduating class. That said, applications in 2011 were down slightly from the 2010 peak. (If anything, it may portend an even more dire situation as declining student quality at these institutions makes its way to graduation.)

Is it simply a brief anomaly from a few states? It might be. Looking at 2012 results (PDF), North Carolina had a 72% passage rate in July 2012; Washington had a 64% passage rate. So perhaps some significant oscillation in a few jurisdictions is not unprecedented.

At this stage, it's a small data point to keep an eye on as the bar results come in. Additionally, if bar passage rates decline overall, we might see another wave of consequences: fewer students passing state bars in July means lower employment outcomes for students in bar passage-required positions that must be reported the following February. Schools that slashed admissions standards three years ago might be seeing the consequences if higher numbers of their graduates fail the bar.


Update: Here are a few additional results. This will occasionally be updated. For a chart identifying a sharp decline in MBE scores, please see this post.

Alaska, -3 points (July 2013: 68%; July 2014: 65%)

Arizona, -8 points (July 2013: 76%; July 2014: 68%)

California, -7 points (July 2013: 56%; July 2014: 49%)

Colorado, -4 points (July 2013: 79%; July 2014: 75%)

Connecticut, +3 points (July 2013: 77%; July 2014: 77%)

Delaware, -9 points (July 2013: 72%; July 2014: 63%)

District of Columbia, -8 points (July 2013: 47%; July 2014: 39%)

Georgia, -6 points (July 2013: 80%; July 2014: 74%)

Kentucky, unchanged (July 2013: 76%; July 2014: 76%)

Louisiana, +17 points (July 2013: 53%; July 2014: 70%)*

Massachusetts, -6 points (July 2013: 82%; July 2014: 76%)

Michigan, +1 point (July 2013: 62%; July 2014: 63%)

Minnesota, -9 points (July 2013: 88%; July 2014: 79%)

Missouri, -4 points (July 2013: 89%; July 2014: 85%)

Nevada, -9 points (July 2013: 66%; July 2014: 57%)

New Jersey, -4 points (July 2013: 79%; July 2014: 75%)

New York, -4 points (July 2013: 69%; July 2014: 65%)

Ohio, -5 points (July 2013: 82%; July 2014: 77%)

Pennsylvania, -1 point (July 2013: 77%; July 2014: 76%)

South Carolina, -6 points (July 2013: 77%; July 2014: 71%)

Tennessee, -12 points (July 2013: 78%; July 2014: 66%)

Texas, -11 points (July 2013: 82%; July 2014: 71%)

Virginia, -7 points (July 2013: 75%; July 2014: 68%)

Running totals for change in passage rate (for 34 jurisdictions)

Drop of at least ten points: 5

Drop of five to nine points: 15

Essentially unchanged (drop of four points to increase of four points): 13

Increase of five or more points: 1*

*Louisiana is the only state that does not use the MBE.
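These running totals are simply a bucketed tally of the point changes listed above; a minimal sketch of that tally, using the changes exactly as listed in this post, follows.

```python
# Tally of July 2013 -> July 2014 changes in overall pass rates (in points),
# as listed in this post (34 jurisdictions).
changes = {
    "Alabama": -6, "Florida": 1, "Idaho": -15, "Indiana": -8, "Iowa": -11,
    "New Mexico": 3, "North Carolina": -1, "Oklahoma": -3, "Oregon": -10,
    "Vermont": -6, "Washington": -8, "Alaska": -3, "Arizona": -8,
    "California": -7, "Colorado": -4, "Connecticut": 3, "Delaware": -9,
    "District of Columbia": -8, "Georgia": -6, "Kentucky": 0,
    "Louisiana": 17, "Massachusetts": -6, "Michigan": 1, "Minnesota": -9,
    "Missouri": -4, "Nevada": -9, "New Jersey": -4, "New York": -4,
    "Ohio": -5, "Pennsylvania": -1, "South Carolina": -6, "Tennessee": -12,
    "Texas": -11, "Virginia": -7,
}

buckets = {"drop of 10+": 0, "drop of 5-9": 0, "within +/-4": 0, "gain of 5+": 0}
for delta in changes.values():
    if delta <= -10:
        buckets["drop of 10+"] += 1
    elif delta <= -5:
        buckets["drop of 5-9"] += 1
    elif delta < 5:
        buckets["within +/-4"] += 1
    else:
        buckets["gain of 5+"] += 1

print(buckets)
# {'drop of 10+': 5, 'drop of 5-9': 15, 'within +/-4': 13, 'gain of 5+': 1}
```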

How state court clerkship opportunities affect legal employment

California state courts do not offer clerkships to new law school graduates. And that decision affects the employment outcomes of graduates of California law schools.

Federal clerkships have been examined at great length (here and elsewhere). State court clerkships, however, remain relatively underexamined. And they are a source of significant variation when comparing the employment outcomes of graduates.

As a crude generalization, law students tend to practice in the state where their law school is located. I looked at how many law school graduates came from each state's law schools in 2013. (Alaska has no law school.) I then looked at how many of those graduates obtained state court clerkships in the reported ABA employment statistics. Lacking more granular data, this is a rough proxy--graduates, after all, may clerk in a state other than the one where their law school sits. (For more details, see the bottom of this post.)
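The calculation behind the proxy is simple: state court clerkship placements reported by a state's law schools, divided by total graduates of those schools. A sketch follows, using two rows from the table below; a full run would aggregate the school-level ABA employment reports by state.

```python
# Proxy used here: state court clerkship placements reported by a state's
# law schools, divided by total graduates of those schools.
# The two example rows are taken from the table below.
state_clerks = {"New Jersey": 273, "California": 46}
state_grads = {"New Jersey": 855, "California": 5185}

for state in state_clerks:
    pct = state_clerks[state] / state_grads[state] * 100
    print(f"{state}: {pct:.1f}%")   # New Jersey: 31.9%, California: 0.9%
```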

Here's a map (courtesy of Choropleth.us) of how many law school graduates from each state's law schools obtained state court clerkships, with figures in a table below:

State St. Clerks St. Grads Pct.
New Jersey 273 855 31.9%
South Dakota 14 71 19.7%
Hawaii 20 104 19.2%
Montana 15 81 18.5%
Nevada 23 132 17.4%
North Dakota 12 74 16.2%
Maryland 95 602 15.8%
Delaware 44 279 15.8%
Minnesota 121 942 12.8%
Idaho 13 117 11.1%
South Carolina 49 442 11.1%
New Mexico 12 114 10.5%
Vermont 21 200 10.5%
Colorado 46 444 10.4%
Utah 26 292 8.9%
Oregon 45 524 8.6%
Pennsylvania 140 1700 8.2%
Rhode Island 14 175 8.0%
Iowa 25 328 7.6%
Maine 6 96 6.3%
Kentucky 25 421 5.9%
Virginia 85 1440 5.9%
Washington 38 655 5.8%
Arizona 36 630 5.7%
Louisiana 52 924 5.6%
Mississippi 20 377 5.3%
Wyoming 4 76 5.3%
District of Columbia 113 2211 5.1%
West Virginia 6 130 4.6%
Connecticut 24 538 4.5%
Nationwide 2044 46116 4.5%
Massachusetts 100 2384 4.2%
Indiana 31 831 3.7%
Alabama 15 421 3.6%
North Carolina 46 1424 3.2%
Wisconsin 15 487 3.1%
Georgia 34 1112 3.1%
Missouri 27 885 3.1%
Tennessee 15 497 3.0%
Kansas 9 324 2.8%
Michigan 54 2228 2.4%
New York 113 5009 2.3%
New Hampshire 2 107 1.9%
Nebraska 4 249 1.6%
Texas 30 2323 1.3%
Ohio 19 1476 1.3%
Illinois 29 2274 1.3%
Arkansas 3 275 1.1%
Florida 34 3185 1.1%
California 46 5185 0.9%
Oklahoma 1 466 0.2%
Alaska 0 0 0.0%

Most of the top few states (e.g., Hawaii, Montana, Nevada, North Dakota, and South Dakota) share similar characteristics: one in-state school, a relatively insular market, and small law schools. Those schools each send a handful of their graduates to clerk in their states' courts--at least, it's probably a good guess, despite the lack of more granular data, that they're clerking in their home state.

A state like New Jersey is an anomaly. It has a robust state court clerkship system designed specifically for recent law graduates. Its website boasts 480 one-year positions. So it's probably no surprise that New Jersey-based law schools channel an extremely high number of graduates into state court clerkships.

Other states are not so fortunate--California among them, as it sits near the bottom of the list.

Here are the numbers as a percentage of full-time, long-term, bar passage-required jobs. (As a note, even though these positions are often only one year, they are still considered "long-term.")

State St. Clerks FTLT BPR Pct.
New Jersey 273 539 50.6%
Hawaii 20 56 35.7%
Delaware 44 132 33.3%
Maryland 95 290 32.8%
South Dakota 14 44 31.8%
Nevada 23 84 27.4%
North Dakota 12 44 27.3%
Montana 15 56 26.8%
Minnesota 121 538 22.5%
Vermont 21 109 19.3%
Rhode Island 14 73 19.2%
South Carolina 49 269 18.2%
Idaho 13 73 17.8%
Colorado 46 273 16.8%
Oregon 45 296 15.2%
Maine 6 41 14.6%
New Mexico 12 84 14.3%
Utah 26 187 13.9%
Pennsylvania 140 1012 13.8%
Louisiana 52 464 11.2%
Iowa 25 229 10.9%
Arizona 36 338 10.7%
Washington 38 364 10.4%
Kentucky 25 244 10.2%
Virginia 85 965 8.8%
Mississippi 20 230 8.7%
Connecticut 24 290 8.3%
West Virginia 6 75 8.0%
Wyoming 4 51 7.8%
District of Columbia 113 1441 7.8%
Nationwide 2044 26539 7.7%
Massachusetts 100 1345 7.4%
Indiana 31 477 6.5%
North Carolina 46 754 6.1%
Michigan 54 915 5.9%
Alabama 15 272 5.5%
Wisconsin 15 279 5.4%
Missouri 27 523 5.2%
Georgia 34 724 4.7%
Kansas 9 206 4.4%
Tennessee 15 363 4.1%
New York 113 3153 3.6%
New Hampshire 2 74 2.7%
Nebraska 4 151 2.6%
Ohio 19 818 2.3%
Florida 34 1653 2.1%
Illinois 29 1413 2.1%
Texas 30 1506 2.0%
Arkansas 3 163 1.8%
California 46 2557 1.8%
Oklahoma 1 302 0.3%
Alaska 0 0 0.0%

Now, of course, if California courts began offering robust clerkship opportunities for graduates, it might simply be that graduates who otherwise would have pursued other job opportunities would instead take a state court clerkship first. But this data, I think, does show that regional employment opportunities greatly affect the short-term legal employment outcomes of graduates. (And I imagine many will draw a variety of conclusions from this data--but the primary purpose of this post is to provide the data.)

Methodology note: A few schools do distort the picture for a few states (Yale, for instance, sends relatively few of its graduates into Connecticut state court clerkships). So I considered defining each school's "home market" as the state to which the school sent the largest percentage of its graduates. The only states whose figures differed by at least one-half of one percentage point under that approach were Pennsylvania, Kentucky, North Carolina, and Connecticut--and the largest difference was Connecticut's, at 2.6 points. Those differences were too small to justify using that metric--particularly because the state to which a school sends the largest percentage of its students may change from year to year, which would make comparisons across years more difficult. Because the methodology only examines graduates of each state's schools, Alaska lists zero, but it does offer state court clerkships.

Law school applicants, matriculants, and employment outcomes - in one chart

Occasionally, to me at least, the facts and figures about law school enrollment and employment outcomes tend to blur together. So I created a visualization of the current situation.

This chart combines LSAC data and ABA employment data for the law school classes of 2011 to 2017. (The ABA employment data for the Class of 2010 is not comparable to its later data sets.)

The top light blue slashed bars represent the total applicants to law school in that applicant cycle. For example, for the Class of 2011, there were around 82,000 applicants in the 2007-2008 cycle.

The dark blue slashed bars represent the total matriculants to law schools each year. For example, for the Class of 2011, there were around 49,400 matriculants beginning law school in the fall of 2008.

The five solid bars underneath represent the employment statistics of that year's graduating class, as reported in the employment data nine months after graduation. For the Class of 2011, there were 43,735 graduates whose employment was reported as of February 15, 2012.

(The margin between matriculants and graduates reflects a few losses. First, I removed graduate statistics for the three schools in Puerto Rico, which amount to a few hundred graduates each year. Second, those who dropped out, or who were dismissed, are not included among graduates. Third, students in joint-degree or part-time programs introduce some lag, because they do not graduate three years after matriculating.)

The red solid bars represent unemployed (whether seeking or not seeking employment) and those whose employment status is unknown.

The orange bars represent the part-time employed, the short-term employed, anyone employed in professional or nonprofessional positions, those whose employer type is unknown, and those pursuing an additional degree.

The light green bars represent those in full-time, long-term, JD advantage positions, whether funded by the school or not.

The medium green bars represent those in full-time, long-term, bar passage required positions funded by the school. (The ABA data for the Class of 2011 does not separately break out this data.)

The dark green bars represent those in full-time, long-term, bar passage required positions not funded by the school.

Applicants for the Class of 2017 are estimated using the most recent LSAC data.
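For anyone who wants to rebuild a chart with this layout, here is a minimal matplotlib sketch. The applicant, matriculant, and graduate totals for the Class of 2011 are the ones described above; the five outcome-category splits in the sketch are placeholders for illustration, not the reported ABA figures, and the layering only approximates the original chart.

```python
import matplotlib.pyplot as plt

# One class per column: an applicant bar, a matriculant bar layered over it,
# and a stack of employment-outcome bars layered over that. Class of 2011
# totals are from this post; the outcome splits are placeholders that sum
# to the 43,735 reported graduates.
classes = ["2011"]
applicants = [82_000]
matriculants = [49_400]
outcomes = {
    "FTLT bar passage required (not school-funded)": [25_000],
    "FTLT bar passage required (school-funded)":     [1_000],
    "FTLT JD advantage":                             [4_735],
    "Other employed / pursuing degree":              [7_000],
    "Unemployed / unknown":                          [6_000],
}
colors = ["darkgreen", "mediumseagreen", "lightgreen", "orange", "red"]

fig, ax = plt.subplots()
ax.bar(classes, applicants, color="lightblue", hatch="//", label="Applicants")
ax.bar(classes, matriculants, color="steelblue", hatch="//", label="Matriculants")

bottom = [0] * len(classes)
for (label, values), color in zip(outcomes.items(), colors):
    ax.bar(classes, values, bottom=bottom, color=color, label=label)
    bottom = [b + v for b, v in zip(bottom, values)]

ax.set_ylabel("People")
ax.legend(fontsize="small")
plt.show()
```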

When U.S. News rankings aren't news, but just 15 months late

On June 12, 2014, U.S. News & World Report released a "news" story. It boasts, "U.S. News has just published two exclusive clerkship lists of law schools using data from our 2015 Best Law Schools rankings for the 2012 J.D. graduating class."

You may have to read that sentence a few times to realize the problems.

First, the clerkship data for the Class of 2012 has been publicly available since March 29, 2013: almost 15 months ago, when the American Bar Association released its school-by-school data. (Many schools had already individually posted their own results by then.)

Second, that's data for the Class of 2012, which graduated 25 months ago. Data for the Class of 2013 is available from the ABA here. (I have a "microranking" that averages three years' worth of federal clerkship data for each school, from 2011 to 2013, available here.)

Third, the USNWR data may be "exclusive" because it's from their data... but, as the data is also publicly available from the ABA, it's hard to determine what value USNWR adds.

It's unfortunate that outlets like Above the Law pick up 15-month-old stories like they are "news." But maybe the fact that it's called U.S. News & World Report does, in fact, prove that the power of suggestion is quite powerful.

More thoughts on the accommodated LSAT settlement

My analysis last week of the accommodated LSAT settlement between DOJ and LSAC has prompted some further reflection.

I raised the possibility that LSAC might coordinate with the ABA to disclose accommodated test-takers after the fact so that the ABA might exclude those LSAT scores from its reported medians. But it appears that the consent decree might not permit even that disclosure. That would mean that accommodated LSAT scores would be included in ABA medians. And that also means that law schools would not face the kind of uncertainty I flagged as a potential issue--the reported scores would remain the reported scores, with no post hoc adjustments to the medians.

In terms of scholarship retention, the settlement likely has little practical effect on students with high LSAT scores, because higher-ranked schools generally place very few conditions on scholarship retention. In contrast, lower-LSAT accommodated students, who are admitted to lower-ranked schools with more stringent scholarship retention conditions, are exposed to a relatively higher risk.

And finally, this settlement, like many broad proposed legal remedies, means that the defendant cannot provide all the services it purports to provide. LSAC wants to provide scores highly predictive of first-year law school grades. On that, it does a very good job--the LSAT is the best single predictor of first-year grades, and it is an even better predictor when combined (with an appropriate formula) with undergraduate GPA. But the settlement means that LSAC must now report both these scores and scores that are less predictive (i.e., accommodated scores, which are not as predictive of first-year law school grades), without any indication to law schools of which category a given score falls into.
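(The "appropriate formula" is typically a linear admissions index of LSAT and undergraduate GPA. The sketch below is purely illustrative; the weights are hypothetical, and each school's actual index weights come from LSAC's correlation studies.)

```python
# A hypothetical admissions index of the form index = a * LSAT + b * UGPA + c.
# The weights here are illustrative only; each school's actual weights are
# supplied by LSAC's correlation studies and vary by school.
def index_score(lsat, ugpa, a=0.018, b=0.45, c=-1.5):
    return a * lsat + b * ugpa + c

print(round(index_score(160, 3.5), 2))   # 2.96
print(round(index_score(150, 3.0), 2))   # 2.55
```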

In short, this settlement will be good for some prospective students, and it will be bad for others. But in several years, particularly with the increase in the number of accommodated LSAT takers likely to arise as a result of the DOJ consent decree, we should see the predictive value of the LSAT diminish materially. And it will be incumbent upon schools to find other ways of identifying factors that are more predictive of first-year success.

Cy pres awards funding legal education

As a putative member of the class in the TicketMaster litigation, I read with interest the latest iteration of the proposed settlement that arrived in my inbox today--in part because I knew this wasn't the first time a settlement had been proposed. But atop the proposed maximum of $386 million in coupons for future purchases at TicketMaster (with a likelihood that perhaps one-tenth of them would ever be used), one item caught my attention (PDF):

Ticketmaster will pay $3 million to the University of California, Irvine School of Law to be used for the benefit of consumers like yourself. In addition to the benefits set forth above, Ticketmaster will also make a $3 million cy pres cash payment to the University of California, Irvine School of Law’s Consumer Law Clinic. The money will establish the Consumer Law Clinic as a permanent clinic, and it will be used to: (i) provide direct legal representations for clients with consumer law claims, (ii) advocate for consumers through policy work, and (iii) provide free educational tools (including online tutorials) to help consumers understand their rights, responsibilities, and remedies for online purchases.

Cy pres awards to law schools are certainly nothing new. Consider the following (proposed or actual) cy pres award recipients:

Stanford Law School's Center on Internet and Society

University of Washington School of Law's Shidler Center for Law, Commerce & Technology; University of California, Berkeley School of Law's Samuelson Law, Technology & Public Policy Clinic; and UW School of Law's Technology Law and Public Policy Clinic

Temple Law School

Harvard Law School’s Berkman Center for Internet and Society

Loyola University Chicago’s Institute for Consumer Antitrust Studies

University of San Diego Legal Clinics; California Western School of Law; & Thomas Jefferson School of Law

Berkeley Center for Law & Technology; The Berkman Center for Internet and Society at Harvard University; Center for Law + Innovation, University of Maine School of Law; High Tech Law Institute of Santa Clara University School of Law; New York University’s Information Law Institute; Privacy & Technology Project, University of California Hastings College of the Law; Samuelson Law, Technology & Public Policy Clinic, University of California, Berkeley School of Law; Stanford Law School Center for Internet and Society; University of Southern California Gould School of Law

Colorado Law’s Clinical Education Program

University of Maryland School of Law's Consumer Protection Clinic

California Western School of Law Interdisciplinary Studies, Health Law

George Washington University Law School

Branstetter Litigation & Dispute Resolution Program, Vanderbilt Law School

University of Memphis’ Cecil C. Humphreys School of Law

Sometimes, alumni of the law school involved in the settlement are responsible for channeling the money toward their alma mater. Sometimes, the law school thanks the law firm or the attorneys involved, occasionally naming the program after the settling attorneys. Some law schools even have dedicated development web sites that encourage cy pres awards to be earmarked for the law school.

If law schools are suffering financially and seeking alternative sources of revenue, there's still one place they can look for income without resorting to tuition increases--class action settlements.