Puerto Rican statehood and the effect on Congress and the Electoral College

After last weekend's low-turnout but overwhelmingly pro-statehood referendum in Puerto Rico, and despite the low likelihood that Puerto Rico actually becomes a state, it's worth considering the impact statehood might have on representation and elections.

Puerto Rico would receive two Senators, increasing the size of the Senate to 102.

Census estimates project that Puerto Rico would send five members to the House. Because the House has not expanded in size since 1929, Puerto Rico's delegation would come at the expense of other states' delegations. In 1959, however, with the admission of Alaska and Hawaii, the House temporarily increased from 435 members to 437, then dropped back to 435 after the 1960 Census and reapportionment. Congress might do something similar upon Puerto Rico's admission. (For some thoughts about doubling the size of the House, see my post on the Electoral College.)

Based on projections for 2020, Puerto Rico's five seats would likely come at the expense of one seat each from California, Montana, New York, Pennsylvania, and Texas. (These are only projections; Montana, for instance, is likely to receive a second representative after the 2020 reapportionment, and that is the seat it would stand to lose.)
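These seat projections rest on the "method of equal proportions" (Huntington-Hill), which the Census Bureau has used since 1941. A minimal sketch of the method follows; the populations are purely hypothetical, for illustration only:

```python
import heapq
from math import sqrt

def apportion(populations, seats):
    """Huntington-Hill (method of equal proportions).
    Each state starts with 1 seat; each remaining seat goes to the
    state with the highest priority value pop / sqrt(n * (n + 1)),
    where n is the state's current number of seats."""
    alloc = {s: 1 for s in populations}
    # Max-heap via negated priorities.
    heap = [(-p / sqrt(1 * 2), s) for s, p in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats - len(populations)):
        _, s = heapq.heappop(heap)
        alloc[s] += 1
        n = alloc[s]
        heapq.heappush(heap, (-populations[s] / sqrt(n * (n + 1)), s))
    return alloc

# Hypothetical state populations (in millions), NOT census data:
pops = {"A": 39.5, "B": 29.0, "C": 20.2, "D": 12.8, "E": 1.1}
print(apportion(pops, 20))
```

Adding a new "state" to the dictionary while holding `seats` fixed illustrates the zero-sum dynamic in the text: the newcomer's seats must come out of the other states' allocations.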

This would also mean that in presidential elections, Puerto Rico would have 7 electoral votes, and these five states would each lose an electoral vote. The electoral vote total would be 540, and it would take 271 votes to win.
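The electoral-vote arithmetic above can be checked in a few lines. This assumes, as the text does, that the House stays capped at 435 seats, so Puerto Rico's five representatives displace seats elsewhere rather than expanding the chamber:

```python
# Current total: 435 House + 100 Senate + 3 for D.C. (23rd Amendment).
CURRENT_EV_TOTAL = 538

pr_house_seats = 5   # projected delegation
pr_senators = 2
pr_electoral_votes = pr_house_seats + pr_senators  # 7

# One seat each shifted away from CA, MT, NY, PA, and TX.
seats_lost_elsewhere = 5

new_total = CURRENT_EV_TOTAL + pr_electoral_votes - seats_lost_elsewhere
majority = new_total // 2 + 1  # smallest strict majority

print(new_total, majority)  # 540 271
```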

Virgin Islands Supreme Court embroiled in another candidate qualifications dispute

A few years ago, I blogged about a rather extraordinary series of cases from the Virgin Islands concerning candidate qualifications. A candidate previously convicted of tax evasion was kept off the ballot by the Virgin Islands Supreme Court. A federal court ordered otherwise. The Virgin Islands Supreme Court ignored it.

A similar dispute has arisen recently. Kevin Rodriquez was elected to the legislature, but some claimed he was not a three-year resident of the Virgin Islands, which meant that he could not serve in the legislature. After litigation, the Virgin Islands Supreme Court ordered that he not be seated. The Third Circuit last week reversed. In doing so, it approvingly cited the power of the Board of Elections to judge qualifications before the election (a dubious proposition, as my earlier posts have noted).

The continued steady decline of the LSAT

In 2015, I wrote a post called "The slow, steady decline of the LSAT." I described a number of problems that have arisen with the LSAT--problems partially of the making of LSAC, which administers the test. LSAC (and the ABA, and USNWR) count a prospective law student's highest LSAT score--even though the average of scores is a more accurate predictor of success. LSAC entered a consent decree agreeing not to flag the scores of accommodated test-takers, even though it conceded its test was reliable only under ordinary test-taking conditions. Schools began to avoid using the LSAT in admitting some students in order to improve their USNWR medians. Schools also obsessed over the LSAT median, even though index scores were a more reliable predictor of success, and even as the scores of admitted students at the 25th percentile--and lower--dropped at a faster rate, imperiling future success on the bar exam.

In the last two years, the LSAT has continued to decline.

First, schools have started to turn to the GRE in lieu of the LSAT. It's not for USNWR purposes, because USNWR factors GRE scores into its LSAT equivalent. Instead, it's because the GRE is a general exam and the LSAT is a specific one. And if there's little difference between what the tests are measuring, why not permit people who took the more general exam while considering a broader array of graduate programs to apply to law school? Admittedly, the reliability of the GRE is more of an open question left for another day--but I suspect that if law schools needed to rely on SAT scores, the results wouldn't be dramatically worse than relying on LSAT scores; and I imagine we'll see studies in the near future on the reliability of using GRE scores.

Second, LSAC has become bizarrely defensive of its test. To the extent it intends to go to war with law schools over its own test--and go to war in ways that are not terribly logical--it does so at its own peril.

Third, prospective law students may now retake the LSAT an unlimited number of times. Previously, test-takers were limited to 3 attempts in any 2-year period (that is, within any 8 administrations of the test) and needed special permission to retake beyond that. Given that schools need only report the highest score--and that the highest score is less reliable than the average of scores--we can expect the value of the LSAT to decline still further.
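The reliability point--that the highest of several noisy scores is a worse estimate of ability than their average--can be shown with a quick simulation. The ability and noise figures here are purely hypothetical, chosen only to illustrate the mechanism:

```python
import random
import statistics

random.seed(0)
ability = 155.0   # a test-taker's hypothetical "true" score
noise_sd = 3.0    # hypothetical test-retest noise, in LSAT points

for attempts in (1, 3, 6):
    maxes, means = [], []
    for _ in range(10_000):
        # Each sitting is true ability plus random noise.
        scores = [random.gauss(ability, noise_sd) for _ in range(attempts)]
        maxes.append(max(scores))
        means.append(statistics.mean(scores))
    # The average stays centered on true ability; the reported
    # "highest score" drifts upward as attempts accumulate.
    print(attempts,
          round(statistics.mean(maxes), 1),
          round(statistics.mean(means), 1))
```

With six attempts, the expected maximum sits several points above true ability, while the average of attempts remains an unbiased estimate--which is why highest-score reporting plus unlimited retakes erodes the score's meaning.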

Fourth, LSAC will now administer the LSAT 6 times a year instead of 4. The linked article offers understandable justifications--greater flexibility to match the GRE's, more opportunities given the less-rigid law school admissions cycle, and so on. But given the unlimited opportunities to retake, plus the highest-score standard, we can expect, again, a still-greater decline in the value of an LSAT score.

Many of the problems I've identified here are principally driven by one concern: the USNWR rankings. Without them, enterprising (and risk-taking) law schools might consider only the average, or only the first two or three attempts, or consider the index score to a greater degree, or weight the quality of the undergraduate institution and difficulty of the undergraduate major to a greater degree.

But the USNWR rankings--which weight the median LSAT score at a whopping one-eighth of the total rankings formula--continue to drive admissions decisions. As the LSAT declines in value, many schools are placed in an increasingly untenable position: rely upon the increasingly flawed metric of the LSAT, or succumb to a USNWR rankings decline.

Draft work in progress: "The High Cost of Lowering the Bar"

My colleague Rob Anderson and I have posted a draft article, "The High Cost of Lowering the Bar," on SSRN. From the abstract:

In this Essay, we present data suggesting that lowering the bar examination passing score will likely increase the amount of malpractice, misconduct, and discipline among California lawyers. Our analysis shows that bar exam score is significantly related to likelihood of State Bar discipline throughout a lawyer’s career. We investigate these claims by collecting data on disciplinary actions and disbarments among California-licensed attorneys. We find support for the assertion that attorneys with lower bar examination performance are more likely to be disciplined and disbarred than those with higher performance.

Although our measures of bar performance only have modest predictive power of subsequent discipline, we project that lowering the cut score would result in the admission of attorneys with a substantially higher probability of State Bar discipline over the course of their careers. But we admit that our analysis is limited due to the imperfect data available to the public. For a precise calculation, we call on the California State Bar to use its internal records on bar scores and discipline outcomes to determine the likely impact of changes to the passing score.

We were inspired by the lack of evidence surrounding costs that may be associated with lowering the "cut score" required to pass the California bar, and we offered this small study as one data point toward that end. The Wall Street Journal cited the draft this week, and we've received valuable feedback from a number of people. We welcome more feedback! (We also welcome publication offers!)

The paper really does two things--identifies the likelihood of discipline associated with the bar exam score, and calls on the State Bar to engage in more precise data collection and analysis when evaluating the costs and benefits of changing the cut score.

It emphatically does not do several things. For instance, it does not identify causation, and it identifies a number of possible reasons for the disparity (at pp. 12-13 of the draft). Additionally, it simply identifies a cost--lowering the cut score will likely increase the number of attorneys subject to discipline. It makes no effort to weigh that cost--it may well be that the State Bar views the cost as acceptable given the trade-off of benefits (e.g., more attorneys, more access to justice) (see pp. 11-12 of the draft). Or it might be that the occupational licensing of the state bar and the risk of attorney discipline should not hinge on correlational measures like bar exam score.

There are many, for instance, who have been thoughtfully critical of the bar exam and would likely agree that our findings are accurate but reject the notion that these costs should be insurmountable. Consider the thoughtful commentary from Professor Deborah Jones Merritt at the Law School Cafe, who has long offered careful and substantive critiques of the use of the bar exam generally.

It has been our hope that these costs are addressed in a meaningful, substantial, and productive way. We include many caveats in our findings for that reason.

Unfortunately, not everyone has reacted to this draft that way.

The Daily Journal (print only) solicited feedback on the work with a couple of salient quotations. First:

Bar Trustee Joanna Mendoza said she agreed the study should not be relied on for policy decisions.

“I am not persuaded by the study since the professors did not have the data available to prove their hypothesis,” she said.

We feel confident in our modest hypothesis--that attorneys with lower bar exam scores are subject to higher rates of discipline--and we use two methods to support it. We do not have the individualized data that would allow us to measure the precise effect, but we are confident in this central hypothesis.

Worse, however, is the disappointing nature of this answer. Our draft expressly calls on the State Bar to study the data! While we can only roughly address the impact at the macro level, we call on the bar to use its data for more precise information. We hope the California State Bar will do so. But it appears it will not--at least, not unless it has already planned on doing so:

Bar spokeswoman Laura Ernde did not directly address questions about the Pepperdine professors’ study or their call for the bar to review its internal data, including non-public discipline. Ernde wrote in an email that the agency would use its ongoing studies to make recommendations to the Supreme Court about the bar exam.

Second are the remarks from David L. Faigman, dean of the University of California Hastings College of Law. Dean Faigman has been one of the most vocal advocates for lowering the cut score (consider this Los Angeles Times opinion piece.) His response:

Among his many critiques, Faigman said the professors failed to factor in a number of variables that impact whether an attorney is disciplined. 

“If they were to publish it in its current form, it would be about as irresponsible a product of empirical scholarship I could imagine putting out for public consumption,” Faigman said. “God forbid anybody of policy authority should rely on that manuscript.”

It's hard to know how to address a critique when the epithet "irresponsible" is its only substance.

We concede that many variables may contribute to attorney discipline (pp. 12-13), and the paper makes no attempt to disentangle them. Instead, we point out that lower bar scores correlate with higher discipline rates, and that lowering the cut score further would likely result in still higher discipline rates. Yes, many factors go into discipline--but the consequence of lowering the cut score remains: more attorneys subject to discipline.

And our call for policy authorities to "rely" on the manuscript is twofold--to consider that there are actual costs to lowering the cut score, and to use more data to more carefully evaluate those costs. Both, I think, are valuable things for a policy authority to "rely" upon.

We hope that the paper sparks a more nuanced and thoughtful discussion than the one that has been waged in lobbying the State Bar and state legislature so far. We hardly know what the "right" cut score is, or the full range of costs and benefits that arise at varying changes to the cut score of the bar exam. But we hope decisionmakers patiently and seriously engage with these costs and benefits in the months--and, perhaps ideally, years--ahead.

Does the bar exam adequately test prospective lawyers' minimum competence?

The critiques of the bar exam have grown louder over the last few years on the heels of declining bar pass rates. But the most popular critiques have changed somewhat. It used to be that external factors--such as the ExamSoft debacle--were a target. Then came charges that the bar exam was harder than usual. But the most recent charges are actually quite a longstanding critique of the bar exam--it simply isn't a good measure of prospective lawyers' "minimum competence."

The bar has attempted to adjust in the last fifty years. Many states now have a "performance test," a component designed to simulate what lawyers do--test-takers are given some law and some facts and asked to address the problem with a legal task. That said, performance tests moderately correlate with other elements of the bar exam and perhaps are not performing the function some hoped they would serve.

Regardless, critiques of the bar exam are longstanding, and some of the most popular look something like this: why did a state like California pick this score as the passing score for "minimum competence"? And why think the bar exam is any good at testing the kinds of things that lawyers actually do? The bar exam is a three-day (in California, beginning this July, two-day), closed-book test with multiple-choice and timed essay questions that in no way resembles the real world of law practice. Why should we trust this test?

It's a fair point, and it's one best met with a question: what ought the bar test? And, perhaps a more subtle question: what if it turns out that the answer to what the bar ought to test actually aligns quite closely with the results from the existing bar exam?

A study in 1980 in California is one of the most impressive I've seen on this subject. And while it's a little old, it's the kind of thing that ought to be replicated before state bars go about making dramatic changes to their exams or scoring methods. I'll narrate what happened there. (For details, consider two reports on the study and the testimony presented to California lawmakers asking the exact same questions in 1984, after the particularly poor performance of applicants to the state bar on the July 1983 bar exam--a historically low score essentially matched in the July 2016 administration.)

After the July 1980 bar exam in California, the National Conference of Bar Examiners teamed up with the California Committee of Bar Examiners to run a study. They selected 485 applicants to the bar who had taken the July 1980 exam. Each of these applicants took an additional two-day test in August 1980.

The two-day test required participants to "function as counsel for the plaintiff in a simulated case" on one day, and "counsel for the defendant in a different simulated case" on the other. Actors played clients and witnesses. The participants were given oral and written tasks--client interviews, discovery plans, briefs, memoranda, opening statements, cross-examination, and the like. They were then evaluated along a number of dimensions and scored.

In the end, the scores were correlated with the applicants' bar exam scores. The relationship between the study scores and the general bar exam scores was fairly strong--"about as strong as the underlying relationship between the Essay and MBE section of the [General Bar Exam]." "In short," the study concluded, the simulation and the bar exam "appear to be measuring similar but not identical abilities."
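The study's headline statistic is a correlation between the two sets of scores. A minimal sketch of that computation follows; the score pairs are purely hypothetical and are not the 1980 study's data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired scores, for illustration only:
bar  = [135, 142, 150, 128, 160, 145, 138, 155]  # bar exam scores
perf = [ 61,  70,  74,  58,  80,  71,  66,  77]  # simulation ratings
print(round(pearson(bar, perf), 2))
```

A value near 1 indicates the two assessments rank applicants similarly; the study's point was that its simulation and the bar exam were "measuring similar but not identical abilities."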

Additionally, a panel of 25 lawyers spent more than two days in extended, in-depth evaluation of 18 of these participants. The panelists were clinical professors, law professors, attorneys, judges, and others with a variety of experience. They were asked to evaluate these 18 participants' performance along the various dimensions on a scale from "very unsatisfactory" (i.e., fail) to "borderline" to "very satisfactory" (i.e., pass). The panel's judgments about the pass/fail line were consistent with where the line was drawn on the California bar exam (with the caveat that this was a sample of just 18 applicants).

It might be that there are different things we ought to be testing, or that this experiment has its own limitations (again, I encourage you to read it if you're interested in the details). But before anything is done about the bar exam, it might be worth spending some time thinking about how we can evaluate what we think ought to be evaluated--and recognize that there are decades of studies addressing very similar things that we may ignore to our peril.

Whittier's challenges may have been unique to California

On the heels of my analysis of the challenges facing Whittier, I started thinking about how Whittier compared with the great many other law schools in the country facing the same challenges--a shrinking applicant pool, declining quality of applicants, and continued challenges in bar pass rates and graduate employment statistics. Whittier's incoming class profile isn't unique. What makes its situation different from other schools'?

The answer, I think, lies in California, along three dimensions--state bar cut scores, transfers, and employment.

I recently read a law professor's suggestion that Whittier made a significant mistake in closing because it is located in Orange County, California, a place likely to experience great demand for legal services in the near future. I tend to think just the opposite--if Whittier were dropped into almost any of the other 49 states, it likely would not be facing the same pressures it faces in its current location. (This is, of course, not to say that it wouldn't face the same general pressures in legal education, but that its problems are exacerbated in California.)

I looked at the incoming class profiles from 2013 and picked 11 other schools that closely matched Whittier's overall incoming LSAT profile.

School Name Matriculants 75th LSAT 50th LSAT 25th LSAT
Atlanta's John Marshall Law School 235 152 149 146
John Marshall Law School 404 152 149 146
Mississippi College 159 153 149 145
New England Law | Boston 238 153 149 145
Nova Southeastern University 305 152 149 146
Oklahoma City University 162 153 149 145
Suffolk University 450 153 149 145
University of Massachusetts Dartmouth 78 151 148 145
University of North Dakota 83 153 148 145
Western New England University 120 152 149 145
Whittier Law School 221 152 149 145
Widener-Commonwealth 74 151 148 145

First, I looked at the first-time bar pass rate for the July 2016 bar, with each state's cut score and the state's overall first-time pass rate among graduates of ABA-accredited schools. (As of this writing, neither the Mississippi state bar nor Mississippi College has disclosed school-specific bar pass rates.)

School Name State Cut score July 2016 Pass Rate Statewide Pass Rate
Mississippi College MS 132 * 75%
Widener-Commonwealth PA 136 79% 75%
Western New England University MA 135 74% 81%
New England Law | Boston MA 135 73% 81%
University of North Dakota ND 130 73% 73%
Suffolk University MA 135 70% 81%
University of Massachusetts Dartmouth MA 135 69% 81%
Oklahoma City University OK 132 67% 75%
John Marshall Law School IL 133 65% 77%
Nova Southeastern University FL 136 63% 68%
Atlanta's John Marshall Law School GA 135 43% 73%
Whittier Law School CA 144 22% 62%

A Pepperdine colleague blogged last year that if Whittier were in New York, it would likely have had a 51% first-time pass rate instead of a 22% pass rate. New York's cut score is relatively low--a 133.  (Whittier's average combined California bar score for first-time test-takers in July 2016 was a 135.5, above 133 and well below California's 144.) If Whittier were in Massachusetts or Georgia, it might have had something near 51%. If it were in Mississippi or North Dakota, its pass rate may have approached 60%. A first-time pass rate of 3 in 5 is still not something to be happy about, but it's a far cry from a first-time rate around just 1 in 5.

It isn't that some of these schools figured out how to help their students to pass the bar and that Whittier lagged; it's that California's high cut score makes it more difficult to pass the bar than if Whittier grads had taken the bar in almost any other state. (This isn't to say that a higher or a lower cut score is better or worse; it's simply to describe the situation that California schools face compared to others.)

Second, one factor in bar pass rates is the loss of high-performing students as transfers to other schools. I looked at transfer rates among these schools in 2014--the loss of students who had matriculated in 2013.

School Name Pct Transfers Out Transfers Out
Atlanta's John Marshall Law School 19% 45
Nova Southeastern University 13% 41
Whittier Law School 13% 28
John Marshall Law School 12% 50
New England Law | Boston 11% 26
University of Massachusetts Dartmouth 10% 8
Western New England University 8% 10
Suffolk University 8% 37
University of North Dakota 5% 4
Oklahoma City University 3% 5
Widener-Commonwealth 3% 2
Mississippi College 3% 4

It may come as little surprise that in larger states with many competitive schools--schools that shrank their incoming classes to preserve their LSAT and UGPA medians--transfers helped backfill those classes. Atlanta's John Marshall lost 20 students to Emory, 10 to Georgia State, and 5 to Mercer; Nova lost 16 to Miami and 5 to Florida State; and Whittier lost 15 to Loyola-Los Angeles. These schools lost a number of their best students and, unsurprisingly, had some of the worst bar outcomes in this cohort. (Four of these schools are in Massachusetts, where perhaps no single school attracts the bulk of transfer attention; and schools in less competitive states like Mississippi, North Dakota, and Oklahoma experienced insignificant attrition.)

Third, Whittier had low job placement in full-time, long-term, bar passage-required and J.D.-advantage positions, but that partly reflects the fact that the placement rates of California schools lag most of the rest of the country. (It's also exacerbated by the low bar pass rate.) Consider each school's placement in FTLT BPR & JDA positions alongside the statewide placement rate into such (unfunded) jobs.

School Name FTLT BPR+JDA Statewide emp Delta
Mississippi College 76% 72% 4
Oklahoma City University 73% 77% -4
Widener-Commonwealth 68% 83% -15
Suffolk University 66% 79% -13
John Marshall Law School 66% 77% -11
New England Law | Boston 60% 79% -19
Atlanta's John Marshall Law School 58% 77% -19
Nova Southeastern University 58% 65% -7
Western New England University 57% 79% -22
University of North Dakota 57% 57% 0
University of Massachusetts Dartmouth 55% 79% -24
Whittier Law School 39% 64% -25

Whittier lags in placement here, too, but in part because California has unusually low placement. (North Dakota, a state with just one flagship law school, serves as an outlier.) This is not a total defense of any particular school's outcomes, either--Florida also appears to have a relatively high number of law school graduates, and its employment rates show similar challenges. But coupled with Whittier's low bar passage rate, one can see why placing students in positions, particularly "bar passage required" positions, would be even more difficult. Several other schools show employment rates 19 to 24 points behind the state average. (Schools like Oklahoma City, one of three law schools in its state; Mississippi College, one of two; and North Dakota, the only one in its state, may distort these comparisons somewhat.)

I then read another piece, from a graduate from the 1970s, lamenting that Whittier had "lost its way" in training graduates ready to take the bar. I pointed out that Whittier's challenges were hardly recent: the ABA had placed Whittier on probation in 2005, which led to efforts that bolstered Whittier's first-time bar pass rate in California past 84%.

But it's worth looking back to the 1970s, when Whittier first sought accreditation, to consider its situation and aspirations. Here's an excerpt from the Los Angeles Times in 1978 when Whittier received ABA accreditation:

In 1978, a 550 was around the 51st percentile of LSAT scores--something like a 151 today. Its tuition in 1974 was $1200 per year, or around $6000 per year in 2017 dollars. It was on pace to increase to $2900 per year in just four years, or about $11,000 in 2017 dollars. There are obviously significant benefits that arise from becoming an ABA-accredited law school. But there are also costs to accreditation--and I'm not sure that a law school charging $11,000 a year in tuition would face the same kinds of pressures in selecting prospective students.

I don't pretend to understand the dynamics of legal education in California in the last 40 years, with more than 20 ABA-accredited law schools and a number of California accredited and unaccredited schools. But I do think some context about the California market suggests that some of the problems Whittier faced were exacerbated by the California market in particular.

Visualizing law school federal judicial clerkship placement, 2014-2016

The release of the latest ABA employment data offers an opportunity to update the three-year federal judicial clerkship placement rates. Here is the clerkship placement rate for the Classes of 2014, 2015, and 2016. Methodology and observations below the interactive visualization. The "placement" is the three-year total placement; the "percentage" is the three-year placement divided by the three-year graduating class total.

The placement is based on graduates reported as having a full-time, long-term federal clerkship. (A one-year term clerkship counts for this category.) I thought a three-year average for clerkships (over 3600 clerks from the graduating classes of 2014, 2015, and 2016) would be a useful metric to smooth out any one-year outliers. It does not include clerkships obtained by students after graduation; it only includes clerkships obtained by each year's graduating class.
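The methodology above reduces to a simple calculation: sum each school's clerks over the three classes, sum its graduates over the same three classes, and divide. A sketch with hypothetical figures (not any school's actual numbers):

```python
# Hypothetical school, for illustration only.
clerks_by_year = {2014: 12, 2015: 9, 2016: 11}   # FTLT federal clerkships
grads_by_year = {2014: 210, 2015: 195, 2016: 180}  # graduating class sizes

# Three-year placement rate: pooled clerks over pooled graduates,
# which smooths out any single-year outlier.
pct = sum(clerks_by_year.values()) / sum(grads_by_year.values())
print(f"{pct:.1%}")
```

Pooling before dividing (rather than averaging three annual rates) weights each class by its size, which matters when class sizes shrink year over year.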

I included some schools that had only one or two years' worth of data, like the separate Penn State schools. Additionally, I merged the entries for William Mitchell and Hamline into Mitchell|Hamline. The three schools in Puerto Rico are excluded.

I should add that we've actually seen a slight decline in graduates placed into federal clerkships--just under 1200 for the second year in a row. Given last year's figures, some might think this reflects a trend toward judges hiring more clerks with work experience. I'm not sure that's the case. Instead, I would venture to guess that because the Senate last confirmed a federal judge in November 2015, we may be experiencing an unusual number of vacancies--and, therefore, a lack of slots for clerkship hires. In the event the President nominates, and the Senate confirms, these judges, we could see a few hundred more clerkship openings in the near future. And if Congress chooses to create more judgeships consistent with the recommendations of the Federal Judicial Center, we'd see even more.

I'll highlight two smaller charts first. The first is New York law school placement.

School Pct Total Clerks
Cornell University 6.5% 36
New York University 5.8% 84
Columbia University 5.0% 64
Brooklyn Law School 2.4% 26
Fordham University 2.0% 25
Syracuse University 1.8% 10
University of Buffalo-SUNY 1.2% 7
St. John's University 1.2% 9
Cardozo School of Law 1.2% 13
Albany Law School 1.1% 6
City University of New York 1.1% 4
Pace University 0.7% 4
New York Law School 0.7% 8
Hofstra University 0.7% 6
Touro College 0.0% 0

The second is California law school placement.

School Pct Total Clerks
Stanford University 27.1% 153
University of California-Irvine 12.5% 40
University of California-Berkeley 12.3% 110
University of California-Los Angeles 4.0% 39
Pepperdine University 3.5% 20
University of Southern California 2.9% 18
University of California-Davis 2.8% 14
Loyola Law School-Los Angeles 2.3% 26
University of San Diego 2.0% 15
University of California-Hastings 1.7% 17
Thomas Jefferson School of Law 0.7% 5
California Western School of Law 0.6% 4
McGeorge School of Law 0.4% 2
Chapman University 0.2% 1
University of San Francisco 0.2% 1
Southwestern Law School 0.1% 1
University of La Verne 0.0% 0
Western State College of Law 0.0% 0
Golden Gate University 0.0% 0
Whittier Law School 0.0% 0
Santa Clara University 0.0% 0

An overall raw chart is below.

St School Pct Total Clerks
CT Yale University 31.0% 200
CA Stanford University 27.1% 153
MA Harvard University 17.6% 312
IL University of Chicago 15.8% 98
VA University of Virginia 15.2% 159
NC Duke University 12.7% 82
CA University of California-Irvine 12.5% 40
CA University of California-Berkeley 12.3% 110
MI University of Michigan 11.1% 119
TN Vanderbilt University 10.3% 58
PA University of Pennsylvania 9.8% 77
TX University of Texas at Austin 9.4% 100
IL Northwestern University 8.0% 66
AL University of Alabama 7.6% 35
MT University of Montana 7.5% 18
IN University of Notre Dame 7.0% 37
LA Tulane University 6.6% 45
KY University of Kentucky 6.5% 26
NY Cornell University 6.5% 36
VA Washington and Lee University 6.1% 24
IA University of Iowa 5.9% 25
VA William and Mary Law School 5.8% 36
NY New York University 5.8% 84
GA University of Georgia 5.8% 36
NC University of North Carolina 5.7% 40
VA University of Richmond 5.5% 25
NY Columbia University 5.0% 64
TX Baylor University 5.0% 20
MN University of Minnesota 4.9% 37
PA Temple University 4.8% 34
MO Washington University 4.5% 32
MS University of Mississippi 4.3% 19
DC Georgetown University 4.1% 81
AR University of Arkansas, Fayetteville 4.1% 15
UT Brigham Young University 4.1% 17
WA University of Washington 4.0% 22
CA University of California-Los Angeles 4.0% 39
WV West Virginia University 3.8% 14
UT University of Utah 3.8% 14
GA Mercer University 3.8% 16
DC George Washington University 3.7% 59
DC American University 3.7% 49
GA Emory University 3.6% 31
KS University of Kansas 3.6% 13
IL University of Illinois 3.6% 19
CA Pepperdine University 3.5% 20
MO University of Missouri 3.4% 13
MA Boston College 3.3% 25
WY University of Wyoming 3.3% 7
VA Regent University 3.0% 10
SD University of South Dakota 3.0% 6
TX Texas Tech University 3.0% 18
TN University of Memphis 2.9% 10
NC Wake Forest University 2.9% 15
CA University of Southern California 2.9% 18
CA University of California-Davis 2.8% 14
PA Pennsylvania State University 2.8% 5
GA Atlanta John Marshall Savannah 2.8% 1
MS Mississippi College 2.7% 12
MD University of Maryland 2.7% 21
GA Georgia State University 2.7% 16
IN Indiana University - Bloomington 2.6% 16
TX Southern Methodist University 2.6% 19
NV University of Nevada - Las Vegas 2.6% 10
VA George Mason University 2.6% 12
LA Louisiana State University 2.6% 15
SC University of South Carolina 2.5% 15
KY University of Louisville 2.5% 9
OH Ohio State University 2.5% 14
AZ University of Arizona 2.4% 10
FL Florida State University 2.4% 17
NY Brooklyn Law School 2.4% 26
LA Loyola University-New Orleans 2.4% 15
NE Creighton University 2.4% 9
ME University of Maine 2.4% 6
CA Loyola Law School-Los Angeles 2.3% 26
TN University of Tennessee 2.3% 10
CT University of Connecticut 2.2% 11
OH University of Toledo 2.2% 7
DC Howard University 2.2% 8
CO University of Colorado 2.2% 11
FL University of Florida 2.1% 20
CA University of San Diego 2.0% 15
PA Widener-Commonwealth 2.0% 5
NY Fordham University 2.0% 25
WI University of Wisconsin 1.9% 12
AZ Arizona State University 1.8% 11
NY Syracuse University 1.8% 10
NJ Rutgers Law School 1.8% 22
OH Case Western Reserve University 1.7% 7
CA University of California-Hastings 1.7% 17
NE University of Nebraska 1.7% 6
OR Lewis and Clark College 1.6% 10
WI Marquette University 1.6% 10
NM University of New Mexico 1.5% 5
NC Elon University 1.5% 4
OH University of Cincinnati 1.5% 5
TX University of Houston 1.4% 10
MO University of Missouri-Kansas City 1.3% 6
AR University of Arkansas, Little Rock 1.3% 5
OH Ohio Northern University 1.3% 3
ND University of North Dakota 1.3% 3
NJ Seton Hall University 1.3% 8
AL Samford University 1.2% 5
IL Southern Illinois University-Carbondale 1.2% 4
NC Campbell University 1.2% 5
NY University at Buffalo-SUNY 1.2% 7
NY St. John's University 1.2% 9
KY Northern Kentucky University 1.2% 5
NY Cardozo School of Law 1.2% 13
MA Boston University 1.2% 8
PA University of Pittsburgh 1.2% 7
PA Villanova University 1.2% 7
TX Texas Southern University 1.1% 5
OK University of Oklahoma 1.1% 5
NY Albany Law School 1.1% 6
PA Penn State - Dickinson Law 1.1% 1
NY City University of New York 1.1% 4
OK University of Tulsa 1.1% 3
MA Northeastern University 1.1% 6
SC Charleston School of Law 1.1% 5
PA Penn State Law 1.0% 2
FL Stetson University 1.0% 9
VA Liberty University 1.0% 2
MI Michigan State University 1.0% 9
WA Gonzaga University 1.0% 4
PA Drexel University 1.0% 4
MI Wayne State University 0.9% 4
OR University of Oregon 0.9% 4
ID University of Idaho 0.9% 3
FL University of Miami 0.9% 10
NY Pace University 0.7% 4
NY New York Law School 0.7% 8
NH University of New Hampshire 0.7% 2
NY Hofstra University 0.7% 6
VT Vermont Law School 0.7% 3
PA Duquesne University 0.7% 3
IL Loyola University-Chicago 0.7% 5
FL Florida A&M University 0.7% 3
CA Thomas Jefferson School of Law 0.7% 5
MO Saint Louis University 0.7% 4
IN Valparaiso University 0.6% 3
CA California Western School of Law 0.6% 4
IL John Marshall Law School 0.6% 7
OH University of Dayton 0.6% 2
TN Belmont University 0.6% 1
WA Seattle University 0.6% 5
CO University of Denver 0.6% 5
TX St. Mary's University 0.6% 4
IA Drake University 0.6% 2
OH Cleveland State University 0.5% 2
DE Widener University-Delaware 0.5% 3
MN University of St. Thomas (Minnesota) 0.5% 2
OH University of Akron 0.5% 2
IL Chicago-Kent College of Law-IIT 0.5% 4
TX South Texas College of Law 0.5% 5
AZ Arizona Summit Law School 0.5% 4
DC Catholic University of America 0.4% 2
AL Faulkner University 0.4% 1
IL DePaul University 0.4% 3
FL Ave Maria School of Law 0.4% 1
LA Southern University 0.4% 2
CA McGeorge School of Law 0.4% 2
MD University of Baltimore 0.3% 3
IL Northern Illinois University 0.3% 1
FL St. Thomas University (Florida) 0.3% 2
TX Texas A&M University 0.3% 2
KS Washburn University 0.3% 1
IN Indiana University - Indianapolis 0.3% 2
CA Chapman University 0.2% 1
OK Oklahoma City University 0.2% 1
MA Suffolk University 0.2% 3
MI University of Detroit Mercy 0.2% 1
NC North Carolina Central University 0.2% 1
CA University of San Francisco 0.2% 1
MN Mitchell|Hamline 0.2% 2
FL Barry University 0.1% 1
FL Nova Southeastern University 0.1% 1
MA New England Law | Boston 0.1% 1
CA Southwestern Law School 0.1% 1
NC Charlotte School of Law 0.1% 1
TN Lincoln Memorial 0.0% 0
ID Concordia Law School 0.0% 0
VA Appalachian School of Law 0.0% 0
CA University of La Verne 0.0% 0
MA University of Massachusetts Dartmouth 0.0% 0
CT Quinnipiac University 0.0% 0
HI University of Hawaii 0.0% 0
RI Roger Williams University 0.0% 0
CA Western State College of Law 0.0% 0
DC District of Columbia 0.0% 0
MA Western New England University 0.0% 0
CA Golden Gate University 0.0% 0
OR Willamette University 0.0% 0
OH Capital University 0.0% 0
CA Whittier Law School 0.0% 0
NY Touro College 0.0% 0
GA Atlanta's John Marshall Law School 0.0% 0
FL Florida International University 0.0% 0
CA Santa Clara University 0.0% 0
FL Florida Coastal School of Law 0.0% 0
MI Thomas M. Cooley Law School 0.0% 0

More details on the legal job market: small law firm, business jobs disappearing

Last week, I posted a perspective on the changing legal market and the outcomes for the Class of 2016. Troublingly, job placement in full-time, long-term, bar passage-required positions declined from 25,787 for the Class of 2013 to 22,874 for the Class of 2016; that said, the placement rate improved from 55.9% to 62.4% because of the shrinking size of graduating classes.
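As a quick sanity check, the reported counts and rates imply the size of each graduating class. A back-of-the-envelope sketch (the implied class sizes below are derived, not reported figures):

```python
# Placement counts and rates quoted above; class sizes are implied, not reported.
placed = {"Class of 2013": 25787, "Class of 2016": 22874}
rate = {"Class of 2013": 0.559, "Class of 2016": 0.624}

for year in placed:
    implied_class = placed[year] / rate[year]
    print(f"{year}: roughly {implied_class:,.0f} graduates")
```

That works out to roughly 46,100 graduates for the Class of 2013 against roughly 36,700 for the Class of 2016, consistent with the shrinking-class explanation for the improved rate.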

I thought I'd dig into job-specific data to see what may be driving the decline. The problem with the industry-specific data is that we don't know whether the jobs are bar passage-required, J.D.-advantage, professional, or non-professional. That said, we can make some guesses; most entry-level hiring in law firms with 501 or more attorneys, for instance, is probably for bar passage-required positions. Regardless, I looked at the full-time, long-term job categories for each position and catalogued notable areas.

I had two instincts. First, perhaps big law hiring has declined and law firms are relying on greater productivity, greater outsourcing, increased reliance on technology, higher retention of junior associates, and delayed retirements. Second, perhaps government hiring has declined in eras of partisanship and budget stalemates.

Both instincts were wrong.

FTLT Class of 2013 Class of 2016 Net Delta
Solo 926 444 -482 -52.1%
2-10 6,947 5,490 -1,457 -21.0%
11-25 1,842 1,640 -202 -11.0%
26-50 1,045 906 -139 -13.3%
51-100 846 768 -78 -9.2%
101-250 1,027 940 -87 -8.5%
251-500 1,041 993 -48 -4.6%
501+ 3,978 4,204 226 5.7%
Business/Industry 5,494 3,796 -1,698 -30.9%
Government 4,360 4,034 -326 -7.5%
Public Interest 1,665 1,398 -267 -16.0%
Federal Clerk 1,259 1,184 -75 -6.0%
State Clerk 2,043 2,021 -22 -1.1%
Academia 490 352 -138 -28.2%
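The Net and Delta columns follow directly from the two class counts. A minimal sketch reproducing a few rows (category labels and counts are taken from the table above):

```python
# Recompute the Net and Delta columns from the raw 2013/2016 counts.
jobs = {
    "Solo": (926, 444),
    "2-10": (6947, 5490),
    "501+": (3978, 4204),
    "Business/Industry": (5494, 3796),
    "Government": (4360, 4034),
}

for category, (c2013, c2016) in jobs.items():
    net = c2016 - c2013
    delta = net / c2013 * 100  # percent change relative to the Class of 2013
    print(f"{category}: net {net:+,}, {delta:+.1f}%")
```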

Law firms with 501 or more attorneys were the only category to see an increase over the last three years: a modest 226-person gain, but a welcome one.

Yes, Government saw a decline, but not as significant a decline as other categories. Some relatively small areas, however, saw big drops. Public Interest jobs declined, perhaps because of some combination of student debt levels, public interest organization funding, and law school-funded positions drying up (although the decline remains somewhat counterintuitive at a time when law schools have been expanding their hiring and development of clinical education). Solo practice fell by half; perhaps its uncertainty has made it less attractive to new graduates.

Instead, two areas saw declines of more than a thousand positions each in three years.

First, hiring in small law firms, those with two to ten attorneys, declined 21% in three years. This has been the single biggest employer of new graduates in recent years, and it has seen a significant decline. These are not firms that typically participate in on-campus interviews. But they are also firms, I think, that probably need new graduates to pass the bar on the first attempt: they likely can't absorb someone who's unable to practice until the next administration of the exam. And they are probably the employers most willing to dip into lower-ranked schools and students with lower grades. The decline in bar passage rates may be affecting this area the most--but that's just speculation on my part.

Second, hiring in business & industry positions has declined nearly 31% in three years. Because business & industry jobs are a major source of J.D.-advantage positions, this would help explain the decline in J.D.-advantage placement, too. But while bar passage-required jobs in business might suffer as bar pass rates decline, why would J.D.-advantage positions decline as well? Perhaps--again, some speculation--in a (relatively) robust economy with many strong college graduates, businesses no longer value the J.D., or they find greater value in M.B.A. graduates.

For schools looking to solve their employment challenges, understanding why demand from these two important employers has dropped so steeply is crucial. The few reasons speculated here are hardly more than a starting point for exploring the changed climate.