Visualizing legal employment outcomes in California in 2016

This is the fourth in a series of visualizations on legal employment outcomes for the Class of 2016. Following posts on outcomes in Texas, New York, and Illinois, here is a visualization for legal employment outcomes of graduates of California law schools for the Class of 2016. (More about the methodology is available at the Texas post.)

Outcomes generally improved, with some important caveats. Total graduates declined just over 7% year over year, from 4403 in the Class of 2015 to 4081 in the Class of 2016. That resulted in marginal improvements in employment outcomes: 64.3% of graduates secured unfunded full-time, long-term, bar passage-required and J.D.-advantage positions, up from 63.8%. But total jobs in these positions declined, from 2807 to 2624, likely attributable in part to challenging bar passage rates (and perhaps to conditions in California's job market).
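For readers who want to check the arithmetic, here is a minimal sketch in Python using only the totals quoted above (a back-of-the-envelope verification, not part of the underlying data set):

```python
# Back-of-the-envelope check of the figures quoted above, using only the
# totals in this post (graduates and unfunded FTLT BPR/JDA jobs).
grads_2015, grads_2016 = 4403, 4081
jobs_2015, jobs_2016 = 2807, 2624

decline = (grads_2015 - grads_2016) / grads_2015
print(f"Graduates declined {decline:.1%} year over year")  # ~7.3%

print(f"2015 placement: {jobs_2015 / grads_2015:.1%}")     # ~63.8%
print(f"2016 placement: {jobs_2016 / grads_2016:.1%}")     # ~64.3%
```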

Law school-funded positions experienced a small resurgence, from 107 positions last year (2.4% of graduates) to 118 positions (2.9% of graduates). (Please recall from the methodology that the bar chart is sorted by full-weight positions, which excludes school-funded positions, while the table below that is sorted by total employment as USNWR prints, which includes school-funded positions.)

As always, please notify me of any corrections or errata.

| Peer Score | School | 2016 | YoY (pts) | BPR | JDA | LSF | 2015 | BPR | JDA | LSF |
|---|---|---|---|---|---|---|---|---|---|---|
| 4.4 | University of California-Berkeley | 94.5% | 3.5 | 278 | 11 | 23 | 91.0% | 237 | 5 | 11 |
| 4.8 | Stanford University | 94.0% | 1.7 | 164 | 4 | 4 | 92.3% | 166 | 8 | 6 |
| 3.9 | University of California-Los Angeles | 90.8% | -0.5 | 239 | 18 | 30 | 91.3% | 247 | 25 | 34 |
| 3.3 | University of California-Irvine | 85.6% | 1.0 | 84 | 3 | 14 | 84.5% | 71 | 2 | 20 |
| 3.5 | University of Southern California | 85.5% | 5.2 | 140 | 9 | 22 | 80.3% | 155 | 9 | 7 |
| 3.4 | University of California-Davis | 81.2% | 3.9 | 87 | 11 | 14 | 77.3% | 125 | 9 | 9 |
| 2.6 | Loyola Law School-Los Angeles | 73.6% | -0.7 | 221 | 36 | 5 | 74.3% | 227 | 42 | 3 |
| 3.1 | University of California-Hastings | 67.0% | 0.1 | 154 | 46 | 1 | 66.9% | 174 | 28 | 4 |
| 2.6 | Pepperdine University | 65.7% | 1.8 | 98 | 19 | 2 | 64.0% | 104 | 21 | 1 |
| 1.6 | California Western School of Law | 63.1% | 4.6 | 82 | 29 | 0 | 58.5% | 90 | 41 | 0 |
| 2.4 | Santa Clara University | 61.4% | 7.5 | 102 | 30 | 0 | 53.9% | 86 | 32 | 0 |
| 1.9 | Chapman University | 60.8% | -3.6 | 78 | 18 | 0 | 64.4% | 62 | 23 | 0 |
| 2.7 | University of San Diego | 57.8% | -4.6 | 102 | 24 | 0 | 62.3% | 138 | 16 | 0 |
| 1.9 | McGeorge School of Law | 56.8% | -2.8 | 56 | 23 | 0 | 59.6% | 105 | 31 | 0 |
| 1.9 | Southwestern Law School | 54.5% | 6.3 | 125 | 48 | 2 | 48.2% | 115 | 35 | 0 |
| 2.0 | University of San Francisco | 47.1% | -4.1 | 46 | 20 | 0 | 51.2% | 60 | 22 | 3 |
| 1.1 | Western State College of Law | 45.1% | -5.9 | 29 | 12 | 0 | 50.9% | 46 | 10 | 0 |
| 1.6 | Golden Gate University | 41.1% | -0.1 | 30 | 15 | 1 | 41.1% | 58 | 7 | 0 |
| 1.4 | Whittier Law School | 39.1% | -9.9 | 38 | 12 | 0 | 48.9% | 30 | 30 | 9 |
| 1.3 | Thomas Jefferson School of Law | 31.9% | -7.5 | 46 | 21 | 0 | 39.4% | 59 | 36 | 0 |
| 1.2 | University of La Verne | 31.4% | -19.9 | 7 | 9 | 0 | 51.3% | 16 | 4 | 0 |

Visualizing legal employment outcomes in Illinois in 2016

This is the third in a series of visualizations on legal employment outcomes for the Class of 2016. Following posts on outcomes in Texas and New York, here is a visualization for legal employment outcomes of graduates of Illinois law schools for the Class of 2016. (More about the methodology is available at the Texas post.)

Outcomes improved everywhere, highlighted by Chicago's 100% placement rate and Illinois's nearly double-digit rise even as its law school-funded jobs dropped. Total graduates declined from 2041 in 2015 to 1816 in 2016. That accompanied a rise in placement, from 73.8% to 78.1%, even as total jobs fell slightly.

As always, please notify me of any corrections or errata.

| Peer Score | School | 2016 | YoY (pts) | BPR | JDA | LSF | 2015 | BPR | JDA | LSF |
|---|---|---|---|---|---|---|---|---|---|---|
| 4.6 | University of Chicago | 100.0% | 6.1 | 201 | 4 | 10 | 93.9% | 178 | 0 | 6 |
| 4.2 | Northwestern University (Pritzker) | 92.4% | 1.1 | 203 | 19 | 8 | 91.3% | 234 | 21 | 8 |
| 3.3 | University of Illinois-Urbana-Champaign | 87.3% | 8.9 | 131 | 13 | 1 | 78.5% | 118 | 14 | 10 |
| 1.7 | Northern Illinois University | 76.1% | 0.9 | 52 | 14 | 1 | 75.2% | 60 | 15 | 1 |
| 2.5 | Loyola University Chicago | 72.7% | 7.6 | 119 | 32 | 1 | 65.1% | 133 | 36 | 1 |
| 2.4 | Illinois Institute of Technology (Chicago-Kent) | 70.7% | 4.2 | 136 | 35 | 0 | 66.4% | 144 | 40 | 0 |
| 2.3 | DePaul University | 70.1% | 1.2 | 126 | 38 | 0 | 68.9% | 142 | 35 | 0 |
| 1.7 | Southern Illinois University-Carbondale | 69.2% | 4.6 | 67 | 14 | 0 | 64.6% | 56 | 8 | 0 |
| 1.7 | The John Marshall Law School | 65.5% | 1.0 | 153 | 41 | 0 | 64.6% | 194 | 52 | 0 |

UPDATE: This post erroneously included data from Atlanta's John Marshall. It has been corrected.

Examining Whittier

Last week, Whittier College officials announced that the college planned to close Whittier Law School. Some recent events suggested instability, such as the school's first-time California bar pass rate of around 22% and the decision to sell the land the school sits on for $35 million (and lease it back). And while emotions are still raw and much remains uncertain, I thought I'd dig into a few publicly accessible details to examine what's happened at Whittier in the last decade or so. I'll try to be mostly descriptive. Whether one believes the best decision is to close the school, or to address the school's challenges in a different way, is far beyond the scope of this blog (and beyond the available information we have).

This story starts in 2005, when the American Bar Association ("ABA") placed Whittier on probation. The incoming class in 2005 (which would graduate in 2008) would set the tone for decisions that could improve the school's standing. Indeed, by 2008, things looked much better: the graduating class had a bar passage rate of 83% (in the July 2008 bar, it was 84.3% for first-time test-takers), the incoming class in 2007 had a dip in credentials remedied in 2008, and the school was taken off probation.

I thought I'd evaluate a few of the things that have happened over the years and attempt to visualize them. I'll start with the LSAT quartiles of the classes over the years (excluding the three most recent admissions cycles for now).

LSAT scores, of course, don't tell the whole story; UGPAs coupled with LSAT scores in an index score do a better job, but this is a first rough take. It's also worth noting a few things about these LSAT scores: a 155 is roughly the 64th percentile; a 150 is the 44th percentile; and a 145 is the 24th percentile. Though the changes appear nominally small, small shifts in these score ranges can reflect a big change in student quality. Additionally, a student with an LSAT score of 150 typically averages a 143 on the MBE; the passing score in California is a 144, so we should expect pretty serious degradation in bar passage rates among those with LSAT scores below 150.
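To make that intuition concrete, here is a minimal sketch of the rule of thumb above. The 150-LSAT-to-143-MBE mapping and the 144 California cut score come from the paragraph above; the 13-point standard deviation for an individual test-taker's MBE score is an assumption for illustration only.

```python
from statistics import NormalDist

# Rough LSAT percentiles quoted above (approximate, not official tables).
lsat_percentile = {155: 64, 150: 44, 145: 24}
for score, pct in lsat_percentile.items():
    print(f"LSAT {score} ~ {pct}th percentile")

CA_CUT = 144    # California's passing MBE score (from the text)
MEAN_MBE = 143  # typical MBE score for a 150-LSAT student (from the text)
SD_MBE = 13     # assumed individual-level spread, hypothetical

# If a 150-LSAT student's MBE score is roughly bell-curved around 143,
# the chance of clearing a 144 cut is just under a coin flip.
p_pass = 1 - NormalDist(MEAN_MBE, SD_MBE).cdf(CA_CUT)
print(f"Estimated pass probability at LSAT 150: {p_pass:.0%}")  # ~47%
```

On these assumptions, a student right at a 150 LSAT is already below a coin flip to clear California's cut score, which is why we should expect rates to degrade quickly as incoming scores drop below 150.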

Adding a second axis of percentages, I've included the first-time California bar passage rates; the results are from the two administrations of the exam after the class graduated. (It's worth noting that Whittier has a robust part-time program, so students I identified as part of the Class of 2008 may well have graduated in 2009. Additionally, some of the data, over February administrations in particular, is inconsistent across LSAC/ABA reporting forms, so I apologize for some small errors in this portion.) The peaks and valleys roughly correspond with the overall LSAT--but not perfectly. And we would expect pass rates to decline more rapidly as the scores of the incoming classes dip well below 150.

But a couple of major factors can change a class's profile from its incoming admissions statistics. The first is academic dismissal rates.

(Because of some inconsistencies in how schools report academic dismissal rates, I attributed all academic dismissals in a subsequent year to the previous year's entering class. This should roughly even out across classes. Additionally, the annual data should align, but the ABA/LSAC data is not reported in the most accessible fashion.) It's worth noting that in 2006, the year after Whittier was placed on probation, it academically dismissed a whopping 37% of its first-year class. That undoubtedly had a major impact on improving its bar passage rates in 2008. Only once since then have academic dismissal rates even come close. As expected, we see bar passage rates drop as LSAT predictors drop and as academic dismissals remain relatively low.
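Before moving on, here is a minimal sketch of the cohort alignment described in the parenthetical above; all counts and class sizes are invented placeholders, not Whittier's actual figures.

```python
# Dismissals as reported, keyed by the academic year of the report
# (invented numbers for illustration).
reported_dismissals = {2006: 60, 2007: 28, 2008: 22}

# Attribute each year's dismissals to the prior year's entering class.
dismissals_by_cohort = {year - 1: n for year, n in reported_dismissals.items()}

# Hypothetical entering 1L class sizes, keyed by entering year.
entering_class_size = {2005: 200, 2006: 210, 2007: 205}

dismissal_rate = {
    cohort: dismissals_by_cohort[cohort] / entering_class_size[cohort]
    for cohort in dismissals_by_cohort
}
print(dismissal_rate)  # {2005: 0.30, 2006: ~0.13, 2007: ~0.11}
```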

But what explains some of the other fluctuation? Why did bar pass rates improve between the Class of 2011 and the Class of 2012, despite a decline in predictors and an essentially identical academic dismissal rate? One more variable to consider: net transfers. Presumably, schools take the very best students in transfer--specifically, from Whittier, the students most likely to pass the bar. High net transfer losses should result in a bigger decline in bar passage rates.

This visualization has almost too much information to be helpful, but consider the loss for the Class of 2011 (in 2009) compared to the loss for the Class of 2012 (in 2010). Whittier lost far fewer transfers out of its Class of 2012; and in the subsequent year, it had a much better bar passage rate. Higher academic dismissal rates and lower transfer rates should generally yield better bar passage rates. (It also isn't a perfect science; there's some randomness built into bar pass rates that can make them fluctuate by a few percentage points each year.)

It's worth noting that even as its incoming predictors worsened for the Classes of 2015 and 2016 (both in LSAT scores and confirmed by the bar passage rates), net transfers out actually increased. Many law schools in 2013 and 2014, feeling the crunch of the downturn in applicants, turned to smaller incoming classes to hold their median LSAT for USNWR purposes, then backfilled with transfers. Even though Whittier's incoming classes deteriorated, it became a more popular place to take transfers--further complicating efforts to improve bar passage rates.

Recent incoming class sizes and class LSAT profiles also display this balance of trying to improve class quality without reducing the class size so much that the school was unable to operate at a financially viable level. (I'll add in the three most recent years now; please note the change in Y-axis.)

A decision in the 2007-2008 academic cycle to shrink the incoming class size dramatically yielded its intended benefit: the median LSAT rose 3 points. But it wasn't designed to be a structural change; the incoming class size more than doubled the following year and returned to levels more in line with recent history.

This chart also includes the last three years of incoming classes. The LSAT profile for the Class of 2017 was worse than the Class of 2016 (which had a bar pass rate last July of 22%), with a 22% academic dismissal rate and -10% net transfers. For the Class of 2018, however, the class size shrank dramatically, by around 40%, to improve the median and bottom-quartile LSAT. That, like the decision in 2007-2008 for the Class of 2011, comes at a significant financial cost. Last year, the total enrollment at the school was 456, down from 700 in 2011. That decline was exacerbated by a 23% academic dismissal rate and a -19% net transfer rate.

The incoming class last fall was smaller still, just 132 students, but the LSAT profile dropped again. These declines came on the heels of not simply significant attrition in academic dismissals and transfers (and "others," such as drop-outs), but also a trying job market.

It turns out job placement has been difficult even for a fairly small class size. The figures in the chart above are the 9- and 10-month employment rates in full-time, long-term bar passage required (FTLT BPR) and J.D. advantage (FTLT JDA) positions, along with all "other grad outcomes"; the gray bar on top reflects the size of the matriculating class three years earlier.

To restate how I began this post, the optimal solution for a school in Whittier's position is hardly obvious to me, absent far more data and evidence--in particular, absent far more specific evidence about the overall financial picture. That said, the decline in incoming class size and total enrollment suggests a fairly challenging financial picture.

But I hope a few data points present some of the complicated and longer history of decisionmaking, along with some perspective on what's happened at Whittier in the last few years. For schools facing their own challenges, data points like these, and the far more granular internal data schools have, are useful starting points for this discussion.

Please note any errors or corrections in the data, particularly if I mistakenly assigned outcomes from a particular year to the wrong year from the ABA/LSAC data. Some Y-axes or multiple axes do not start at zero to provide relative values. Multiple Y-axes are aligned primarily to provide ease of reading a chart.

UPDATE: This post has been updated slightly for clarity.

Visualizing legal employment outcomes in New York in 2016

This is the second in a series of visualizations on legal employment outcomes for the Class of 2016. Following up on a post on outcomes in Texas, here is a visualization for legal employment outcomes of graduates of New York law schools for the Class of 2016. (More about the methodology is available at the Texas post.)

Total graduates among the New York law schools dropped from 4083 to 3811. (There were about 4500 in the Class of 2014.) That helped overall placement rise from 79.3% to 83.4% in full-time, long-term, bar passage-required and J.D.-advantage jobs. That's despite the fact that Columbia cut its law school-funded placements in such positions from 28 down to 12. Overall jobs declined slightly.

As always, please notify me of any corrections or errata.

| Peer Score | School | 2016 | YoY (pts) | BPR | JDA | LSF | 2015 | BPR | JDA | LSF |
|---|---|---|---|---|---|---|---|---|---|---|
| 4.5 | New York University | 97.9% | 1.2 | 430 | 15 | 30 | 96.7% | 424 | 14 | 31 |
| 4.6 | Columbia University | 96.7% | -2.1 | 356 | 8 | 12 | 98.8% | 360 | 10 | 28 |
| 4.2 | Cornell University | 92.4% | -3.1 | 166 | 2 | 2 | 95.5% | 164 | 3 | 3 |
| 2.2 | St. John's University | 84.0% | 2.2 | 176 | 29 | 0 | 81.9% | 173 | 29 | 1 |
| 1.9 | Pace University | 82.5% | 6.9 | 123 | 18 | 0 | 75.5% | 93 | 17 | 1 |
| 3.3 | Fordham University | 82.3% | 6.2 | 286 | 29 | 1 | 76.1% | 274 | 37 | 1 |
| 2.7 | Cardozo School of Law | 82.2% | 5.5 | 251 | 26 | 1 | 76.8% | 246 | 32 | 0 |
| 2.0 | Albany Law School | 82.1% | 1.8 | 106 | 17 | 1 | 80.3% | 119 | 25 | 3 |
| 2.3 | Hofstra University | 80.3% | 6.5 | 145 | 13 | 1 | 73.8% | 201 | 17 | 4 |
| 2.5 | Brooklyn Law School | 77.0% | 3.8 | 244 | 40 | 0 | 73.2% | 215 | 31 | 0 |
| 1.9 | New York Law School | 76.3% | 8.7 | 162 | 69 | 1 | 67.6% | 171 | 66 | 1 |
| 2.3 | Syracuse University | 74.1% | 8.5 | 100 | 23 | 0 | 65.6% | 104 | 20 | 0 |
| 1.5 | Touro College | 69.8% | 9.4 | 87 | 10 | 0 | 60.4% | 105 | 8 | 0 |
| 2.2 | City University of New York | 69.2% | 5.3 | 69 | 3 | 0 | 64.0% | 66 | 5 | 0 |
| 2.2 | University at Buffalo-SUNY | 67.6% | -3.1 | 117 | 8 | 0 | 70.7% | 115 | 20 | 0 |

Visualizing legal employment outcomes in Texas in 2016

Following up on a series of posts last year, this is the first in a series visualizing employment outcomes of law school graduates from the Class of 2016. The recently released U.S. News & World Report ("USNWR") rankings, which include data for the Class of 2015, are already obsolete. The ABA will release the information soon, but individualized employment reports are already available on schools' websites.

USNWR prints the "employed" rate as "the percentage of all graduates who had a full-time job lasting at least a year for which bar passage was required or a J.D. degree was an advantage." But it does not give "full weight" in its internal ranking metric to jobs that were funded by the law school. USNWR gives those positions lower weight, but the lower-weighted positions are not broken out in the ranking tables. And while it includes J.D.-advantage positions, there remain disputes about whether those positions are actually as valuable as bar passage-required ones. (Some have further critiqued the inclusion of solo practitioners in the bar passage-required statistics.)

The top chart is sorted by non-school-funded jobs (or "full weight" positions). The visualization breaks out full-time, long-term, bar passage-required positions (not funded by the school); full-time, long-term, J.D.-advantage positions (not funded by the school); school-funded positions (full-time, long-term, bar passage-required or J.D.-advantage positions); and all other outcomes. I included a breakdown in the visualization slightly distinguishing bar passage-required positions from J.D.-advantage positions, even though both are included in "full weight" for USNWR purposes (and I still sort the chart by "full weight" positions).

The table below the chart breaks down the raw data values for the Classes of 2015 and 2016, with relative overall changes year-over-year, and is sorted by total placement (as USNWR prints). The columns beside each year break out the three categories in the total placement: FTLT unfunded bar passage required ("BPR"), FTLT unfunded J.D. advantage ("JDA"), and FTLT law school funded BPR & JDA positions ("LSF").
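To pin down the two sort orders, here is a minimal sketch of how they can be computed from the table's columns; the counts in the example are hypothetical.

```python
from typing import NamedTuple

class SchoolOutcomes(NamedTuple):
    graduates: int
    bpr: int  # FTLT bar passage-required, not school-funded
    jda: int  # FTLT J.D. advantage, not school-funded
    lsf: int  # FTLT school-funded (BPR or JDA)

def full_weight_rate(s: SchoolOutcomes) -> float:
    """Sorts the chart: unfunded positions only."""
    return (s.bpr + s.jda) / s.graduates

def total_placement_rate(s: SchoolOutcomes) -> float:
    """Sorts the table, as USNWR prints: includes school-funded positions."""
    return (s.bpr + s.jda + s.lsf) / s.graduates

# Hypothetical school with 300 graduates:
example = SchoolOutcomes(graduates=300, bpr=200, jda=25, lsf=10)
print(f"full weight:     {full_weight_rate(example):.1%}")      # 75.0%
print(f"total placement: {total_placement_rate(example):.1%}")  # 78.3%
```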

The first state is Texas (last year's visualization here). Total jobs in these unfunded bar passage-required and J.D.-advantage positions improved, from 1445 in 2015 to 1551 in 2016, even as the total graduates actually increased slightly in the state. The overall employment rate was 74.1% (including a few funded positions), up from 70.5% last year. (More granular data is available at each school's website.) Some of the improvement may be attributable to improved bar passage rates last July.

As always, if I made a mistake, please feel free to email me or comment; I confess there are always risks in data translation, and I am happy to make corrections.

| Peer Score | School | 2016 | YoY (pts) | BPR | JDA | LSF | 2015 | BPR | JDA | LSF |
|---|---|---|---|---|---|---|---|---|---|---|
| 4.1 | University of Texas-Austin | 85.6% | 1.2 | 289 | 18 | 3 | 84.5% | 268 | 20 | 11 |
| 2.6 | Southern Methodist University | 81.6% | -2.1 | 176 | 15 | 0 | 83.7% | 183 | 17 | 0 |
| 2.4 | Baylor University | 80.4% | -7.6 | 122 | 4 | 1 | 88.0% | 88 | 5 | 2 |
| 2.7 | University of Houston | 79.6% | 1.4 | 162 | 29 | 0 | 78.2% | 129 | 42 | 1 |
| 1.9 | Texas Tech University | 76.4% | 2.9 | 125 | 14 | 0 | 73.5% | 138 | 17 | 0 |
| 1.6 | St. Mary's University | 68.9% | 7.3 | 144 | 18 | 2 | 61.6% | 113 | 19 | 1 |
| 2.2 | Texas A&M University | 68.3% | 0.5 | 121 | 19 | 0 | 67.8% | 137 | 17 | 0 |
| 1.6 | South Texas College of Law Houston | 62.2% | 8.0 | 175 | 31 | 0 | 54.2% | 164 | 25 | 1 |
| 1.5 | Texas Southern University | 58.9% | 16.0 | 79 | 10 | 0 | 42.9% | 52 | 11 | 0 |

The best ways to visualize the impact of the decline in bar passage scores

I've visualized a lot about the decline in bar pass scores and bar passage rates in the last few years, including a post on the February 2017 decline here. For some reason, that post drew criticism as particularly deceptive. It caused me to think a little more about how best to visualize--and explain--what the decline in multistate bar exam ("MBE") scores might mean. (I'll channel my inner Tufte and see what I can do....)

In the February 2017 chart, I didn't start the Y-axis at zero. And why should I? No one scores a zero. The very lowest scores are something in the 50s to 90s. And the score is on a 200-point scale, but no one gets a 200. So I suppose I could visualize it on the low to high ends--say, 90 to 190.
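Here is a minimal matplotlib sketch of that framing choice. The last three points are the February means discussed in these posts (136.2, 135.0, 134.0); the earlier points are invented placeholders standing in for the historical series.

```python
import matplotlib.pyplot as plt

years = list(range(2010, 2018))
# Placeholders for 2010-2014; the 2015-2017 February means
# (136.2, 135.0, 134.0) are the figures discussed in these posts.
means = [136.9, 138.6, 136.4, 135.8, 136.5, 136.2, 135.0, 134.0]

fig, (ax_wide, ax_tight) = plt.subplots(1, 2, figsize=(10, 4), sharex=True)
for ax in (ax_wide, ax_tight):
    ax.plot(years, means, marker="o")
    ax.set_xlabel("February administration")
    ax.set_ylabel("Mean scaled MBE score")

ax_wide.set_ylim(90, 190)    # roughly the realistic low-to-high range
ax_wide.set_title("Y-axis from 90 to 190")
ax_tight.set_ylim(133, 139)  # tight axis emphasizing the recent decline
ax_tight.set_title("Tight Y-axis")
plt.tight_layout()
plt.show()
```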

When you put it that way, it looks completely unremarkable. MBE scores have dipped a bit, but they've hardly moved at all. And it looks like my last post was simply clickbait. (It's worth noting I generate no revenue from this site!)

But that surely can't be right, either. After all, bar passage rates have been declining fairly sharply in the last few years even if this mean score has only moved relatively nominally. (For extensive discussion, see the "Bar exam" category on this blog.)

That's because what really matters is the passing score or the "cut score" in each jurisdiction.

Suppose the cut score in a jurisdiction is 100. A decline from a mean score of 135 to 134 should have essentially no effect if the results are distributed along a typical bell curve (and they usually are). That's because virtually everyone would still pass even if scores dropped a bit. In contrast, if the cut score were 180, a decline from a mean score of 135 to 134 should also have essentially no effect--virtually everyone would still fail.
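A minimal sketch makes the point concrete. The standard deviation is an assumption (roughly 15 points, in the neighborhood of historical MBE spreads); the cut scores bracket the range discussed above.

```python
from statistics import NormalDist

SD = 15  # assumed spread of MBE scores; illustrative, not an official figure

def pass_rate(mean: float, cut: float) -> float:
    """Share of a bell-curved score distribution at or above the cut score."""
    return 1 - NormalDist(mean, SD).cdf(cut)

# Effect of a one-point drop in the mean (135 -> 134) at different cut scores:
for cut in (100, 135, 144, 180):
    before, after = pass_rate(135, cut), pass_rate(134, cut)
    print(f"cut {cut}: {before:5.1%} -> {after:5.1%} ({after - before:+.1%})")
```

On these assumptions, the one-point drop costs almost nothing at a cut of 100 or 180, but two to three percentage points of pass rate where the cut score sits near the middle of the distribution--which is exactly where many jurisdictions' cut scores fall.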

But the reason for the perilous drop in bar pass rates is that this is exactly the spot where the mean scores have begun to hit the cut scores in many jurisdictions. Here's a visualization of what that looks like, with a couple of changes--a larger y-axis, historical data for the February bar back to 1976, and gridlines identifying the cut scores in several jurisdictions. (It's worth noting that this is the national MBE mean, not individualized state means; July scores are somewhat higher; and it is a mean, not a median.)

You can see that the drop in the means plunges scores past what have been cut scores in many jurisdictions.

Here's one more way of explaining why a drop at this point of the bell curve is particularly significant. The NCBE has not yet released the distributions of scores, but the bell curve linked above should be instructive, and the change from 2011 to 2016 is useful to consider.

In February 2011, just 39.6% of all test-takers had a score of 135.4 or lower. 13.7% had a score in the range of 135.5 to 140.4, and 46.6% had a score of 140.5 or higher. (Consider the chart above for reference as to what those scores might mean.) In February 2016, however, 51.1% of all test-takers had a score of 135.4 or lower, an 11.5-point jump. 13.7% had a score in the range of 135.5 to 140.4, and just 35.1% had a score of 140.5 or higher.

That's because this particular drop in the score is at a very perilous spot on the curve. Bar takers are performing just a little worse in a relative sense. But when the distribution of performance is put up against the cut score, this is precisely the point that would have the most dramatic national impact.

I hope these explanations help illustrate what's happening on the bar exam front--and, of course, I welcome corrections or feedback to improve these visualizations in the future!

February 2017 MBE bar scores collapse to all-time record low in test history

UPDATE: Some wondered about the scale used for the visualization below, and I respond with some thoughts in a subsequent blog post.

On the heels of the February 2016 multistate bar exam (MBE) scores reaching a 33-year low, including a sharp drop in recent years, and a small improvement in the July 2016 test while scores remained near all-time lows, we now have the February 2017 statistics, courtesy of Pennsylvania (PDF). After a drop from 136.2 to 135 last year, scores dropped another full point to 134. It likely portends a drop in overall pass rates in most jurisdictions.

This is the lowest February score in the history of aggregated MBE results. (The test was first introduced in 1972, but, as far as I know, national aggregate statistics begin in 1976.) The previous record low was 134.3 in 1980.

It's worth noting that the February 2017 test had a small change in its administration: rather than 190 scored questions and 10 experimental questions, the split in this exam was 175/25. It's unlikely (PDF) this caused much of a change, but it's worth noting as a factor to think about. And it's not because the MBE was "harder" than usual. Instead, the decline primarily reflects continued fall-out from law schools accepting more students of lower ability, then graduating those students who go on to take the bar exam. Given the relatively small cohort that takes the February test, it's anyone's guess what this portends for the July 2017 test.

Visualization note: the non-zero Y axis is designed to demonstrate recent relative performance of bar scores, not absolute scores.

In today's New York Times: "Don't Use the Ballot to Get Trump's Tax Returns"

In today's New York Times, I have an opinion piece entitled, "Don't Use the Ballot to Get Trump's Tax Returns." It begins:

Opponents of Donald Trump were outraged when, flouting recent tradition, he refused to disclose his tax returns during the 2016 presidential campaign. They remain outraged that he continues to decline to do so as president.

Now that political outrage is being channeled into legislation. Lawmakers in at least two dozen states have introduced bills that would compel presidential candidates to disclose their tax returns or be left off the ballot in 2020. The New Jersey Legislature recently passed such a bill, which sits on Gov. Chris Christie’s desk.

Mr. Christie should veto the bill, and other states should abandon their efforts. Making the disclosure of tax returns mandatory is bad policy and, in this form, probably unconstitutional.

Other recent pieces on this subject include those by Vik Amar and Rick Hasen. I approach this a bit differently--proponents, including Laurence Tribe, have styled this as a "ballot access" issue rather than the imposition of additional qualifications (which, I think, would be even more likely to be found unconstitutional), and I've addressed it from that perspective.