How much post-JD non-clerkship work experience do entry-level law professors have?

Professor Sarah Lawsky offers her tireless annual service compiling entry-level law professor hires. One chart of interest to me is the year of the JD: in recent years, about 10-20% of entering law professors obtained their JD within the last four years; 45-60% within the last five to nine years; and 25-30% within the last 10 to 19 years, with a negligible number earning it at least 20 years ago.

But there's a different question I've had, one that's been floating out there as a rule of thumb: how much practice experience should an entering law professor have? Of course, "should" is a matter of preference. Aspiring law professors usually mean it as, "What would make me most attractive as a candidate?" or, "What are schools looking for?"

There are widely varied schools of thought, I imagine, but a common rule of thumb I'd heard was three to five years of post-clerkship experience, and probably no more. (Now that I'm trying to search where I might have first read that, I can't find it.) In my own experience, I worked for two years in practice after clerking. Some think more experience is a good thing to give law professors a solid grounding in the actual practice of law they're about to teach, but some worry too much time in practice can inhibit academic scholarship (speaking very generally); some think less experience is a good or a bad thing for mostly the opposite reasons. (Of course, what experience law professors ought to have, regardless of school hiring preferences, is a matter for a much deeper normative debate!)

I thought I'd do a quick analysis of post-JD work experience among entry-level law professors. I looked at the 82 United States tenure-track law professors listed in the 2016 entry-level report. I did a quick search of their law school biographies, CVs, or LinkedIn accounts for their work experience and put it into one of several categories: 0, 1, 2, 3, 4, 5+, "some," or "unknown." 5+ because I thought (perhaps wrongly!) that such experience would be relatively rare and start to run together over longer periods of time; "some" meaning the professor listed post-JD work experience but the dates were not immediately discernible; or "unknown" if I couldn't tell.

I also chose to categorize "post-clerkship" experience. Clerkship experience is different in kind--it still rightly counts as work experience, but I was interested specifically in the non-clerkship variety. I excluded independent consultant work and judicial staff attorney/clerk positions, but I included non-law-school fellowships. Academic positions were also excluded from post-JD non-clerkship work experience. I excluded pre-JD work experience, of course, but included all post-JD work experience whether law-related or not (e.g., business consulting). All figures are probably +/-2.
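For the curious, the bucketing logic is simple enough to sketch in a few lines of code. This is purely illustrative--the helper and the toy profiles below are hypothetical, not my actual dataset:

```python
from collections import Counter

def bucket(years):
    """Map years of post-JD, non-clerkship experience to a category."""
    if years is None:
        return "unknown"   # couldn't tell from the profile
    if years == "some":
        return "some"      # experience listed, but dates not discernible
    return "5+" if years >= 5 else str(years)

# Toy profiles: (professor, years of post-JD non-clerkship experience)
profiles = [("A", 0), ("B", 2), ("C", 7), ("D", "some"), ("E", None)]

counts = Counter(bucket(y) for _, y in profiles)
print(counts)  # e.g., Counter({'0': 1, '2': 1, '5+': 1, 'some': 1, 'unknown': 1})
```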

There are going to be lots of ways to slice and dice the information, so I'll offer three different visualizations. First, 23 of the 82 entering law professors (28%) had no post-JD non-clerkship work experience. 56 had at least some, and 3 had unknown experience. That struck me as a fairly large number of "no work experience." (If you included clerkships, 13 of those "nones" had clerkships, and 10 had no clerkship experience.) I thought most of the "nones" might be attributable to increases in PhD/SJD/DPhil hires, and that accounts for about two-thirds of that category.

I then broke it down by years of experience.

24 had one to four years' experience; 21 had five or more years' experience; and 11 had "some" experience, to an extent I was unable to quickly determine. (Be careful with this kind of visualization; the "some" makes the 1-4 & 5+ categories appear smaller than they actually are!) I was surprised that 21 (about 26%) had at least five years' post-JD non-clerkship work experience, and many had substantially more than that. Perhaps I shouldn't have been surprised, as about 30% earned their JD at least 10 years ago; but I thought a good amount of that might have been attributable to PhD programs, multiple clerkships, or multiple VAPs. It turns out 5+ years' experience isn't "too much" based on recent school tenure-track hiring.

For the individual total breakdown, here's what I found:

This visualization overstates the "nones" because, unlike the first chart, it breaks out each individual category I collected. Note the big drop-off from "0" to "1"!

Again, all figures likely +/-2 and based on my quickest examination of profiles. If you can think of a better way of slicing the data or collecting it in the future, please let me know!

How should we think about law school-funded jobs?

One of the most contentious elements of the proposed changes to the way law schools report jobs to the American Bar Association is how to handle law school-funded jobs. In my letter, I noted that more information and more careful examination of costs and benefits would be needed before reaching a decision about how best to treat them. The letters to the ABA mostly take a more black-and-white position: school-funded positions should be treated like any other job, or they should remain a separate category as they have been for the last two years.

Briefly, school-funded positions may offer opportunities for students to practice, particularly in public interest fields, in a transition period toward opportunities where funding may be lacking. At their best, they provide students with much-needed legal experience in these fields and can help them continue into long careers there. At their worst, however, they are opportunities for schools to inflate their employment placement by spending money on student placement with no assurance about what that placement looks like after the year is complete.

In the last couple of years, the number of positions that would even qualify as "law school-funded" has been severely limited. I noted in 2016 that these positions had dropped by half, to fewer than 400 positions nationwide, accompanying the change in the USNWR reporting system that gave these positions "less weight" than non-funded positions. Jerry Organ rightly noted that much of the decline was probably attributable to the definitional change: only jobs lasting at least a year and with an annual salary of at least $40,000 would count.

This methodological change likely weeded out many of the lower-quality positions from the school-funded totals.
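To put the definitional screen in one place, here's a hedged sketch of the qualifying rule described above (the job-record fields are hypothetical, for illustration only):

```python
def counts_as_school_funded(job):
    """Post-2015 screen: a funded job counts only if it lasts at least
    a year and pays at least $40,000 annually."""
    return (job["school_funded"]
            and job["duration_months"] >= 12
            and job["annual_salary"] >= 40_000)

# A school-funded job below the salary floor no longer counts.
job = {"school_funded": True, "duration_months": 12, "annual_salary": 36_000}
print(counts_as_school_funded(job))  # False
```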

So, are law school-funded positions good outcomes, or not? It seems impossible to tell from the evidence, because we have essentially no data about what happens to students in these positions after the funding ceases. We have generic assurances from some schools that they successfully place students into good jobs; we have others who express deep skepticism about that likelihood. One major reason I endorse the proposal to postpone the change to employment reporting data is to find out more about what these positions do! (Alas, that seems unlikely.)

But we do have one piece of data from Tom Miles (Chicago), who wrote in his letter to the ABA: "97% of new graduates who have received one of our school-funded Postgraduate Public Interest Law Fellowships remained in public interest or government immediately after their fellowships; 45% of them with the organization who hosted their fellowship."

That is impressive placement. If such statistics are similar across institutions, it would be a very strong reason, in my view, to move such positions back into the "above the line" totals with other job placement.

Finally, my colleague Rob Anderson did a principal components analysis of job placement and found that law school-funded positions were a relatively good, if minor, job outcome among institutions.

It may be that the worst excesses of law schools' recession-era practices are behind us, and that these school-funded positions are providing the kinds of opportunities that are laudable. More investigation from the ABA would be most beneficial. But it's also likely that the change will be quite modest in the event the ABA chooses to adopt it this year.

My letter to the ABA Section on Legal Education re proposed changes to law school employment reporting

On the heels of commentary from individuals like Professor Jerry Organ and Dean Vik Amar, I share the letter I sent to the ABA's Section on Legal Education regarding changes to the Employment Summary Report and the classification of law school-funded positions. (Underlying documents are available at the Section's website here.) Below is the text of the letter:

---

Dear Mr. Currier,

I:

1) petition the Council to suspend implementation of the proposal until at least the Class of 2018, and direct the Section to implement for the Class of 2017, the Employment Questionnaire as approved at the June meeting, together with the Employment Summary Report used for the Class of 2016, and

2) petition the Council to direct the Standards Review Committee to

a. delineate all of the changes in the Employment Questionnaire that would be necessary to implement the proposal, and

b. provide notice of all proposed changes to the Employment Questionnaire and Employment Summary Report and an opportunity to comment on the proposed changes before the Council takes any further action to implement the proposal.

The unusual and truncated process to adopt these proposals is reason enough to oppose the change. But the substance merits additional discussion.

In particular, I do not believe the statements made in the Mahoney Memorandum sufficiently address the costs of returning to the pre-2015 system of reporting school-funded employment figures as "above the line" totals. The Memorandum contains speculative language in justification of the position advanced ("The NLJ assumed, as would any casual reader," "Many readers may never have learned of the error," "we must assume"), language which should be the basis for further investigation and a weighing of costs and benefits, not of reaching a definitive outcome.

Additionally, the Memorandum uses incomplete statistics to advance its proposal--in particular, that "School-funded positions accounted for 2% of reported employment outcomes for the class of 2016" is more relevant if such positions are distributed roughly equally across institutions. But these positions are not distributed roughly equally, and the fact that a few institutions bear a disproportionate number of such positions should merit deeper investigation before examining the impact of such a change.

Furthermore, the Memorandum's proposal, adopted in the Revised Employment Outcomes Schools Report, includes material errors (including overlapping categories of employment by firm size, where an individual in a firm with 10 people would be both in a firm with "2-10" and "10-100"; the same for 100 and the categories "10-100" and "100-500") that never should have made it to the Section.

I find much to be lauded in the objectives of the Section in the area of disclosure. Improving disclosures to minimize reporting errors, streamline unnecessary categories, and provide meaningful transparency in ways that consumers find beneficial is a good and important goal. The Section should do so with the care and diligence it has shown in its past revisions, which is why it ought to suspend implementation of this proposal.

Best,

/s/ Prof. Derek T. Muller

Some good news, and some perspective, on the June 2017 LSAT

The Law School Admission Council recently shared that the number of LSATs administered increased significantly year-over-year: a 19.8% increase. In historical terms, the June 2017 test had more test-takers than any June since 2010, which had a whopping 32,973 (the last year of the "boom" cycle for law school admissions). That's good news for law schools looking to fill their classes with more students, and, hopefully, more qualified students. I've visualized the last decade of June LSAT administrations below.

Of course, there are many more steps along the way to a better Class of 2021: applicant quality among those test-takers, whether they turn into actual applicants, etc. And given the potential for schools to accept the GRE instead of the LSAT, LSAT administrations may slightly understate expectations for future applicants.

But one data point is worth considering, and that's repeat test-takers. LSAC discloses that data somewhat opaquely, but it's worth considering how many first-time test-takers were among the June 2017 test-takers.

First-time test-takers offer a better picture of the likely changes to the quality and quantity of the applicant pool. Repeaters are permitted to use their highest score, which is a worse indicator of their quality. (They may now retake the test an unlimited number of times.) Additionally, first-time test-takers represent truly new potential applicants, as opposed to repeaters who were probably already inclined to apply (or perhaps have already applied and are seeking better scholarship offers).

Repeat test-takers have been slowly on the rise, as the graphic above (barely!) demonstrates. First-time test-takers made up 84.9% of the June 2007 LSAT administration. That number has eroded every June since, and this June saw first-time test-takers make up 74% of the administration. About 27,600 took the test, and 20,430 for the first time; compare that to June 2011, when there were fewer test-takers (about 26,800), but more who took it for the first time (21,610).
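A quick back-of-the-envelope check of those shares, using the rounded totals above:

```python
# First-time shares from the (rounded) totals cited above.
junes = {"June 2017": (20_430, 27_600), "June 2011": (21_610, 26_800)}

for label, (first_time, total) in junes.items():
    print(f"{label}: {first_time / total:.1%} first-time")
# June 2017: 74.0% first-time
# June 2011: 80.6% first-time
```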

There is some good optimism for law schools looking for a boost in their admissions figures. But there's also a little perspective to consider about what these figures actually represent.

The continued steady decline of the LSAT

In 2015, I wrote a post called "The slow, steady decline of the LSAT." I described a number of problems that have arisen with the LSAT--problems partly of the making of LSAC, which administers the test. LSAC (and the ABA, and USNWR) count a prospective law student's highest LSAT score--even though the average of scores is a more accurate predictor of success. LSAC entered a consent decree agreeing to stop flagging accommodated test-takers, even though it conceded its test was only reliable under ordinary test-taking conditions. Schools began to avoid using the LSAT in admitting some students for USNWR purposes to improve their medians. Schools also obsessed over the LSAT median, even though index scores were a more reliable predictor of success, and even as the scores of 25th percentile--and lower--admitted students dropped at a faster rate, imperiling future success on the bar exam.

In the last two years, the LSAT has continued to decline.

First, schools have started to turn to the GRE in lieu of the LSAT. It's not for USNWR purposes, because USNWR factors GRE scores into its LSAT equivalent. Instead, it's because the GRE is a general exam, and the LSAT is a specific exam. And if there's little difference between what the tests are measuring, why not permit people taking the more general exam, who are considering a broader array of graduate programs, to apply to law school? Admittedly, perhaps the reliability of the GRE is more of an open question left for another day--but I would suspect that if law schools needed to rely on SAT scores, it wouldn't be dramatically worse than relying on LSAT scores; and I imagine we'll see some studies in the near future regarding the reliability of using GRE scores.

Second, LSAC has become bizarrely defensive of its test. To the extent it intends to go to war with law schools over its own test--and go to war in ways that are not terribly logical--it does so at its own peril.

Third, prospective law students may now retake the LSAT an unlimited number of times. Previously, test-takers were limited to 3 attempts in 2 years (that is, within a window of 8 administrations of the test); they would need special permission to retake more than that. Given that schools need only report the highest score--and that the highest score is less reliable than the average of scores--we can expect the value of the LSAT to decline to a still-greater degree. (A quick simulation below illustrates why.)

Fourth, LSAC will now administer the LSAT 6 times a year instead of 4 times a year. The linked article offers understandable justifications--greater flexibility given the GRE's flexibility, more opportunities given the less-rigid law school admissions cycle, and so on. But given the unlimited number of opportunities to retake, plus the highest-score standard, we can expect, again, a still-greater decline in value of an LSAT score.
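It's easy to see with a small simulation why the highest of several scores is a worse measure than the average. The parameters below are assumptions for illustration (a fixed true ability plus normally distributed test noise), not LSAC's actual reliability figures:

```python
import random
random.seed(0)

TRUE_ABILITY = 155.0   # assumed fixed ability
NOISE_SD = 2.6         # assumed per-administration noise
ATTEMPTS = 4
TRIALS = 100_000

max_total = avg_total = 0.0
for _ in range(TRIALS):
    scores = [random.gauss(TRUE_ABILITY, NOISE_SD) for _ in range(ATTEMPTS)]
    max_total += max(scores)
    avg_total += sum(scores) / ATTEMPTS

print(f"mean of highest-of-{ATTEMPTS}: {max_total / TRIALS:.1f}")  # ~157.7, inflated
print(f"mean of average-of-{ATTEMPTS}: {avg_total / TRIALS:.1f}")  # ~155.0, unbiased
```

The more retakes allowed, the further the reported "highest score" drifts above true ability--which is why unlimited retakes plus highest-score reporting erode the signal.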

Many of the problems I've identified here are principally driven by one concern: the USNWR rankings. Without them, enterprising (and risk-taking) law schools might consider only the average, or only the first two or three attempts, or consider the index score to a greater degree, or weight the quality of the undergraduate institution and difficulty of the undergraduate major to a greater degree.

But USNWR rankings--which weight the median LSAT score at a whopping one-eighth of the total rankings formula--continue to drive admissions decisions. As the LSAT declines in value, it places many schools in an increasingly untenable position--rely upon the increasingly flawed metric of the LSAT, or succumb to a USNWR ratings decline.

Draft work in progress: "The High Cost of Lowering the Bar"

My colleague Rob Anderson and I have posted a draft article, The High Cost of Lowering the Bar, on SSRN. From the abstract:

In this Essay, we present data suggesting that lowering the bar examination passing score will likely increase the amount of malpractice, misconduct, and discipline among California lawyers. Our analysis shows that bar exam score is significantly related to likelihood of State Bar discipline throughout a lawyer’s career. We investigate these claims by collecting data on disciplinary actions and disbarments among California-licensed attorneys. We find support for the assertion that attorneys with lower bar examination performance are more likely to be disciplined and disbarred than those with higher performance.

Although our measures of bar performance only have modest predictive power of subsequent discipline, we project that lowering the cut score would result in the admission of attorneys with a substantially higher probability of State Bar discipline over the course of their careers. But we admit that our analysis is limited due to the imperfect data available to the public. For a precise calculation, we call on the California State Bar to use its internal records on bar scores and discipline outcomes to determine the likely impact of changes to the passing score.

We were inspired by the lack of evidence surrounding costs that may be associated with lowering the "cut score" required to pass the California bar, and we offered this small study as one data point toward that end. The Wall Street Journal cited the draft this week, and we've received valuable feedback from a number of people. We welcome more feedback! (We also welcome publication offers!)

The paper really does two things--identifies the likelihood of discipline associated with the bar exam score, and calls on the State Bar to engage in more precise data collection and analysis when evaluating the costs and benefits of changing the cut score.

It emphatically does not do several things. For instance, it does not identify causation, and it identifies a number of possible reasons for the disparity (at pp. 12-13 of the draft). Additionally, it simply identifies a cost--lowering the cut score will likely increase the number of attorneys subject to discipline. It does not make any effort to weigh that cost--it may well be the case that the State Bar views the cost as acceptable given the trade-off of benefits (e.g., more attorneys, more access to justice, etc.) (see pp. 11-12 of the draft). Or it might be the case that the occupational licensing of the state bar and the risk of attorney discipline should not hinge on correlation measures like bar exam score.
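To make the cost point concrete, here's a deliberately toy illustration of the arithmetic--hypothetical numbers, emphatically not the paper's model: if the probability of discipline falls as bar score rises, then admitting a marginal cohort of lower-scoring takers mechanically adds expected disciplinary cases.

```python
def p_discipline(score):
    """Hypothetical decreasing relationship between bar score and
    career discipline probability (illustrative numbers only)."""
    return max(0.0, 0.10 - 0.002 * (score - 140))

# Hypothetical marginal admittees: takers scoring 139-143 who would be
# admitted if the cut score dropped from 144 to 139.
marginal = [139, 140, 141, 142, 143]
expected = sum(p_discipline(s) for s in marginal)
print(f"expected added disciplinary cases per {len(marginal)} marginal admittees: {expected:.2f}")
```

Whether that added cost is worth the benefits is exactly the policy question we leave open.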

There are many, for instance, who have been thoughtfully critical of the bar exam and would likely agree that our findings are accurate but reject the notion that these costs should be insurmountable. Consider thoughtful commentary from Professor Deborah Jones Merritt at the Law School Cafe, who has long had careful and substantive critiques about the use of the bar exam generally.

It has been our hope that these costs are addressed in a meaningful, substantial, and productive way. We include many caveats in our findings for that reason.

Unfortunately, not everyone has reacted to this draft that way.

The Daily Journal (print only) solicited feedback on the work with a couple of salient quotations. First:

Bar Trustee Joanna Mendoza said she agreed the study should not be relied on for policy decisions.

“I am not persuaded by the study since the professors did not have the data available to prove their hypothesis,” she said.

We feel confident in our modest hypothesis--that attorneys with lower bar exam scores are subject to higher rates of discipline. We use two methods to support it. We do not have the individualized data that would allow us to measure the precise effect, but we are confident in this major hypothesis.

Worse, however, is how disappointing the answer is. Our draft expressly calls on the State Bar to study the data! While we can only roughly address the impact at the macro level, we call on the bar to use its internal data for more precise answers! We do hope the California State Bar will do so. But it appears it will not--at least, not unless it has already planned on doing so:

Bar spokeswoman Laura Ernde did not directly address questions about the Pepperdine professors’ study or their call for the bar to review its internal data, including non-public discipline. Ernde wrote in an email that the agency would use its ongoing studies to make recommendations to the Supreme Court about the bar exam.

Second are the remarks from David L. Faigman, dean of the University of California Hastings College of Law. Dean Faigman has been one of the most vocal advocates for lowering the cut score (consider this Los Angeles Times opinion piece.) His response:

Among his many critiques, Faigman said the professors failed to factor in a number of variables that impact whether an attorney is disciplined. 

“If they were to publish it in its current form, it would be about as irresponsible a product of empirical scholarship I could imagine putting out for public consumption,” Faigman said. “God forbid anybody of policy authority should rely on that manuscript.”

It's hard to know how to address a critique when the epithet "irresponsible" is the substance of the critique.

We concede that many variables may contribute to attorney discipline (pp. 12-13), and the paper makes no attempt to disentangle them. Instead, we're pointing out that lower bar scores correlate with higher discipline rates, and that lowering the cut score further would likely result in still higher discipline rates. Yes, many factors go into discipline--but the consequence of lowering the cut score remains: higher rates of discipline.

And our call for policy authorities to "rely" on the manuscript is twofold--to consider that there are actual costs to lowering the cut score, and to use more data to more carefully evaluate those costs. Both, I think, are valuable things for a policy authority to "rely" upon.

We hope that the paper sparks a more nuanced and thoughtful discussion than the one that has been waged in lobbying the State Bar and state legislature so far. We hardly know what the "right" cut score is, or the full range of costs and benefits that arise at varying changes to the cut score of the bar exam. But we hope decisionmakers patiently and seriously engage with these costs and benefits in the months--and, perhaps ideally, years--ahead.

Whittier's challenges may have been unique to California

On the heels of my analysis of the challenges facing Whittier, I started thinking about how Whittier compared with a great many other law schools in the country facing the same challenges--a shrinking law school applicant pool, declining quality of applicants, continued challenges in bar exam pass rates and graduate employment statistics. Whittier's incoming class profile isn't unique. What makes its situation different from other schools'?

The answer, I think, lies in California, along three dimensions--state bar cut scores, transfers, and employment.

I recently read a law professor's suggestion that Whittier was making a significant mistake in closing because it was located in Orange County, California, a place that would experience great demand for legal services in the near future. I tend to think just the opposite--if Whittier were dropped into just about any of the other 49 states, it likely would not be facing the same pressures it faces in its current location. (This is, of course, not to say that it wouldn't be facing the same kinds of pressures in legal education generally, but that its problems are exacerbated in California.)

I looked at the incoming class profiles from 2013 and picked 11 other schools that closely matched Whittier's overall incoming LSAT profile.

School Name Matriculants 75th LSAT 50th LSAT 25th LSAT
Atlanta's John Marshall Law School 235 152 149 146
John Marshall Law School 404 152 149 146
Mississippi College 159 153 149 145
New England Law | Boston 238 153 149 145
Nova Southeastern University 305 152 149 146
Oklahoma City University 162 153 149 145
Suffolk University 450 153 149 145
University of Massachusetts Dartmouth 78 151 148 145
University of North Dakota 83 153 148 145
Western New England University 120 152 149 145
Whittier Law School 221 152 149 145
Widener-Commonwealth 74 151 148 145

First, I looked at the first-time pass rate for the July 2016 bar, alongside each state's cut score and the state's overall first-time pass rate among graduates of ABA-accredited schools. (As of this writing, neither the Mississippi state bar nor Mississippi College has disclosed school-specific bar pass rates.)

School Name State Cut score July 2016 Pass Rate Statewide Pass Rate
Mississippi College MS 132 * 75%
Widener-Commonwealth PA 136 79% 75%
Western New England University MA 135 74% 81%
New England Law | Boston MA 135 73% 81%
University of North Dakota ND 130 73% 73%
Suffolk University MA 135 70% 81%
University of Massachusetts Dartmouth MA 135 69% 81%
Oklahoma City University OK 132 67% 75%
John Marshall Law School IL 133 65% 77%
Nova Southeastern University FL 136 63% 68%
Atlanta's John Marshall Law School GA 135 43% 73%
Whittier Law School CA 144 22% 62%

A Pepperdine colleague blogged last year that if Whittier were in New York, it would likely have had a 51% first-time pass rate instead of a 22% pass rate. New York's cut score is relatively low--a 133.  (Whittier's average combined California bar score for first-time test-takers in July 2016 was a 135.5, above 133 and well below California's 144.) If Whittier were in Massachusetts or Georgia, it might have had something near 51%. If it were in Mississippi or North Dakota, its pass rate may have approached 60%. A first-time pass rate of 3 in 5 is still not something to be happy about, but it's a far cry from a first-time rate around just 1 in 5.
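One crude way to see the mechanics (emphatically not the method behind the estimates above) is to approximate a school's score distribution as normal: from the reported 135.5 mean and 22% pass rate at California's 144 cut, back out an implied spread, then ask what share would clear a lower cut. The normality assumption does real work here, so treat the outputs as illustrative only:

```python
from statistics import NormalDist

mean, ca_cut, ca_pass = 135.5, 144, 0.22
# P(score > 144) = 0.22 implies (144 - mean) / sd = inv_cdf(0.78)
sd = (ca_cut - mean) / NormalDist().inv_cdf(1 - ca_pass)  # ~11.0

for state, cut in [("CA", 144), ("MA/GA", 135), ("NY", 133), ("ND", 130)]:
    pass_rate = 1 - NormalDist(mean, sd).cdf(cut)
    print(f"cut {cut} ({state}): ~{pass_rate:.0%} would pass")
# This crude normal assumption overshoots the actual-data estimates
# above at the lowest cuts; the qualitative point stands: the lower the
# cut, the sharply higher the pass rate.
```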

It isn't that some of these schools figured out how to help their students to pass the bar and that Whittier lagged; it's that California's high cut score makes it more difficult to pass the bar than if Whittier grads had taken the bar in almost any other state. (This isn't to say that a higher or a lower cut score is better or worse; it's simply to describe the situation that California schools face compared to others.)

Second, one factor in bar pass rates is the loss of high-performing students as transfers elsewhere. I looked at transfer rates among these schools in 2014--the loss of students who had matriculated in 2013.

School Name Transfers Out Pct Transfers Out
Atlanta's John Marshall Law School 19% 45
Nova Southeastern University 13% 41
Whittier Law School 13% 28
John Marshall Law School 12% 50
New England Law | Boston 11% 26
University of Massachusetts Dartmouth 10% 8
Western New England University 8% 10
Suffolk University 8% 37
University of North Dakota 5% 4
Oklahoma City University 3% 5
Widener-Commonwealth 3% 2
Mississippi College 3% 4

It may come as little surprise that, in larger states with many competitive schools, schools that shrank their incoming class sizes to preserve their LSAT and UGPA medians tended to rely on transfers to help backfill their classes. Atlanta's John Marshall lost 20 students to Emory, 10 to Georgia State, and 5 to Mercer; Nova lost 16 to Miami and 5 to Florida State; and Whittier lost 15 to Loyola-Los Angeles. These schools lost a number of their best students and, unsurprisingly, had some of the worst bar outcomes among this cohort. (Four of these schools are in Massachusetts, where perhaps no single school attracts the bulk of transfer attention; and schools in less competitive states like Mississippi, North Dakota, and Oklahoma experienced insignificant attrition.)

Third, Whittier had low job placement in full-time, long-term, bar passage-required and J.D.-advantage positions, but it's a reflection of the fact that the placement rate of California schools lags most of the rest of the country. (It's also exacerbated by the low bar pass rate.) Consider each school's placement in FTLT BPR & JDA positions, and the statewide placement rate into such (unfunded) jobs.

School Name FTLT BPR+JDA Statewide emp Delta
Mississippi College 76% 72% 4
Oklahoma City University 73% 77% -4
Widener-Commonwealth 68% 83% -15
Suffolk University 66% 79% -13
John Marshall Law School 66% 77% -11
New England Law | Boston 60% 79% -19
Atlanta's John Marshall Law School 58% 77% -19
Nova Southeastern University 58% 65% -7
Western New England University 57% 79% -22
University of North Dakota 57% 57% 0
University of Massachusetts Dartmouth 55% 79% -24
Whittier Law School 39% 64% -25

Whittier lags in placement here, too, but in part because California has unusually low placement. (North Dakota, a state with just one flagship law school, serves as an outlier.) This is not a total defense of any particular school's outcomes, either--Florida also appears to have a relatively high number of law school graduates, and its employment rate shows similar challenges. But coupled with Whittier's low bar passage rate, one can see why securing students in positions, particularly "bar passage required" positions, would be even more difficult. Several other schools show employment rates that are 19 to 24 points behind the state average. (Schools like Oklahoma City, one of three in its state; Mississippi College, one of two; and North Dakota, the only school in the state, may distort these comparisons somewhat.)

I then read another piece from a 1970s graduate lamenting that Whittier had "lost its way" in training graduates ready to take the bar. I pointed out that Whittier's challenges were hardly recent, as the ABA had placed Whittier on probation in 2005, which led to efforts that bolstered Whittier's first-time California bar pass rate past 84%.

But it's worth looking back to the 1970s, when Whittier first sought accreditation, to consider its situation and aspirations. Here's an excerpt from the Los Angeles Times in 1978 when Whittier received ABA accreditation:

In 1978, a 550 was around the 51st percentile of LSAT scores--something like a 151 today. Its tuition in 1974 was $1200 per year, or around $6000 per year in 2017 dollars. It was on pace to increase to $2900 per year in just four years, or about $11,000 in 2017 dollars. There are obviously significant benefits that arise from becoming an ABA-accredited law school. But there are also costs to accreditation--and I'm not sure that a law school with tuition at $11,000 a year would be facing the same kinds of pressures in selecting prospective students.
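The 2017-dollar figures above are straightforward CPI scaling. A quick sketch, with approximate CPI-U annual averages (my assumptions; check BLS for exact values):

```python
# Approximate CPI-U annual averages (assumed): 1974 ~49.3, 1978 ~65.2,
# 2017 ~245.
CPI = {1974: 49.3, 1978: 65.2, 2017: 245.0}

def to_2017_dollars(amount, year):
    return amount * CPI[2017] / CPI[year]

print(round(to_2017_dollars(1200, 1974)))  # ~5963, i.e., about $6,000
print(round(to_2017_dollars(2900, 1978)))  # ~10897, i.e., about $11,000
```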

I don't pretend to understand the dynamics of legal education in California in the last 40 years, with more than 20 ABA-accredited law schools and a number of California accredited and unaccredited schools. But I do think some context about the California market suggests that some of the problems Whittier faced were exacerbated by the California market in particular.

Visualizing law school federal judicial clerkship placement, 2014-2016

The release of the latest ABA employment data offers an opportunity to update the three-year federal judicial clerkship placement rates. Here is the clerkship placement rate for the Classes of 2014, 2015, and 2016. Methodology and observations below the interactive visualization. The "placement" is the three-year total placement; the "percentage" is the three-year placement divided by the three-year graduating class total.

The placement figure is based on graduates reported as having a full-time, long-term federal clerkship. (A one-year term clerkship counts for this category.) I thought a three-year average for clerkships (over 3600 clerks from the graduating classes of 2014, 2015, and 2016) would be a useful metric to smooth out any one-year outliers. It does not include clerkships obtained by graduates in later years; it only includes clerkships reported for each year's graduating class.
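In other words, the rate is simply pooled clerks over pooled graduates. A minimal sketch with toy numbers (not any school's actual figures):

```python
# Three-year placement: pooled clerks over pooled graduates, 2014-2016.
clerks = {2014: 12, 2015: 9, 2016: 10}
grads = {2014: 210, 2015: 195, 2016: 188}

total_clerks = sum(clerks.values())
rate = total_clerks / sum(grads.values())
print(f"three-year placement: {total_clerks} clerks, {rate:.1%}")
# three-year placement: 31 clerks, 5.2%
```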

I included some schools that had only one or two years' worth of data, like the separate Penn State schools. Additionally, I merged the entries for William Mitchell and Hamline into Mitchell|Hamline. The three schools in Puerto Rico are excluded.

I should add that we've actually seen a slight decline in graduates placed into federal clerkships, just under 1200 for the second year in a row. Given last year's figures, some might think this is a trend toward judges hiring more clerks with work experience. I'm not sure that's the case. Instead, I would venture to guess that because the Senate last confirmed a federal judge in November 2015, we may be experiencing an unusual number of vacancies--and, therefore, a lack of slots for clerkship hires. In the event the President nominates, and the Senate confirms, judges to fill these vacancies, we could see a few hundred more clerkship openings in the near future. And if Congress chooses to create more judgeships consistent with the recommendations of the Federal Judicial Center, we'd see even more.

I'll highlight two smaller charts first. The first is New York law school placement.

School Pct Total Clerks
Cornell University 6.5% 36
New York University 5.8% 84
Columbia University 5.0% 64
Brooklyn Law School 2.4% 26
Fordham University 2.0% 25
Syracuse University 1.8% 10
University of Buffalo-SUNY 1.2% 7
St. John's University 1.2% 9
Cardozo School of Law 1.2% 13
Albany Law School 1.1% 6
City University of New York 1.1% 4
Pace University 0.7% 4
New York Law School 0.7% 8
Hofstra University 0.7% 6
Touro College 0.0% 0

The second is California law school placement.

School Pct Total Clerks
Stanford University 27.1% 153
University of California-Irvine 12.5% 40
University of California-Berkeley 12.3% 110
University of California-Los Angeles 4.0% 39
Pepperdine University 3.5% 20
University of Southern California 2.9% 18
University of California-Davis 2.8% 14
Loyola Law School-Los Angeles 2.3% 26
University of San Diego 2.0% 15
University of California-Hastings 1.7% 17
Thomas Jefferson School of Law 0.7% 5
California Western School of Law 0.6% 4
McGeorge School of Law 0.4% 2
Chapman University 0.2% 1
University of San Francisco 0.2% 1
Southwestern Law School 0.1% 1
University of La Verne 0.0% 0
Western State College of Law 0.0% 0
Golden Gate University 0.0% 0
Whittier Law School 0.0% 0
Santa Clara University 0.0% 0

An overall raw chart is below.

St School Pct Total Clerks
CT Yale University 31.0% 200
CA Stanford University 27.1% 153
MA Harvard University 17.6% 312
IL University of Chicago 15.8% 98
VA University of Virginia 15.2% 159
NC Duke University 12.7% 82
CA University of California-Irvine 12.5% 40
CA University of California-Berkeley 12.3% 110
MI University of Michigan 11.1% 119
TN Vanderbilt University 10.3% 58
PA University of Pennsylvania 9.8% 77
TX University of Texas at Austin 9.4% 100
IL Northwestern University 8.0% 66
AL University of Alabama 7.6% 35
MT University of Montana 7.5% 18
IN University of Notre Dame 7.0% 37
LA Tulane University 6.6% 45
KY University of Kentucky 6.5% 26
NY Cornell University 6.5% 36
VA Washington and Lee University 6.1% 24
IA University of Iowa 5.9% 25
VA William and Mary Law School 5.8% 36
NY New York University 5.8% 84
GA University of Georgia 5.8% 36
NC University of North Carolina 5.7% 40
VA University of Richmond 5.5% 25
NY Columbia University 5.0% 64
TX Baylor University 5.0% 20
MN University of Minnesota 4.9% 37
PA Temple University 4.8% 34
MO Washington University 4.5% 32
MS University of Mississippi 4.3% 19
DC Georgetown University 4.1% 81
AR University of Arkansas, Fayetteville 4.1% 15
UT Brigham Young University 4.1% 17
WA University of Washington 4.0% 22
CA University of California-Los Angeles 4.0% 39
WV West Virginia University 3.8% 14
UT University of Utah 3.8% 14
GA Mercer University 3.8% 16
DC George Washington University 3.7% 59
DC American University 3.7% 49
GA Emory University 3.6% 31
KS University of Kansas 3.6% 13
IL University of Illinois 3.6% 19
CA Pepperdine University 3.5% 20
MO University of Missouri 3.4% 13
MA Boston College 3.3% 25
WY University of Wyoming 3.3% 7
VA Regent University 3.0% 10
SD University of South Dakota 3.0% 6
TX Texas Tech University 3.0% 18
TN University of Memphis 2.9% 10
NC Wake Forest University 2.9% 15
CA University of Southern California 2.9% 18
CA University of California-Davis 2.8% 14
PA Pennsylvania State University 2.8% 5
GA Atlanta John Marshall Savannah 2.8% 1
MS Mississippi College 2.7% 12
MD University of Maryland 2.7% 21
GA Georgia State University 2.7% 16
IN Indiana University - Bloomington 2.6% 16
TX Southern Methodist University 2.6% 19
NV University of Nevada - Las Vegas 2.6% 10
VA George Mason University 2.6% 12
LA Louisiana State University 2.6% 15
SC University of South Carolina 2.5% 15
KY University of Louisville 2.5% 9
OH Ohio State University 2.5% 14
AZ University of Arizona 2.4% 10
FL Florida State University 2.4% 17
NY Brooklyn Law School 2.4% 26
LA Loyola University-New Orleans 2.4% 15
NE Creighton University 2.4% 9
ME University of Maine 2.4% 6
CA Loyola Law School-Los Angeles 2.3% 26
TN University of Tennessee 2.3% 10
CT University of Connecticut 2.2% 11
OH University of Toledo 2.2% 7
DC Howard University 2.2% 8
CO University of Colorado 2.2% 11
FL University of Florida 2.1% 20
CA University of San Diego 2.0% 15
PA Widener-Commonwealth 2.0% 5
NY Fordham University 2.0% 25
WI University of Wisconsin 1.9% 12
AZ Arizona State University 1.8% 11
NY Syracuse University 1.8% 10
NJ Rutgers Law School 1.8% 22
OH Case Western Reserve University 1.7% 7
CA University of California-Hastings 1.7% 17
NE University of Nebraska 1.7% 6
OR Lewis and Clark College 1.6% 10
WI Marquette University 1.6% 10
NM University of New Mexico 1.5% 5
NC Elon University 1.5% 4
OH University of Cincinnati 1.5% 5
TX University of Houston 1.4% 10
MO University of Missouri-Kansas City 1.3% 6
AR University of Arkansas, Little Rock 1.3% 5
OH Ohio Northern University 1.3% 3
ND University of North Dakota 1.3% 3
NJ Seton Hall University 1.3% 8
AL Samford University 1.2% 5
IL Southern Illinois University-Carbondale 1.2% 4
NC Campbell University 1.2% 5
NY University of Buffalo-SUNY 1.2% 7
NY St. John's University 1.2% 9
KY Northern Kentucky University 1.2% 5
NY Cardozo School of Law 1.2% 13
MA Boston University 1.2% 8
PA University of Pittsburgh 1.2% 7
PA Villanova University 1.2% 7
TX Texas Southern University 1.1% 5
OK University of Oklahoma 1.1% 5
NY Albany Law School 1.1% 6
PA Penn State - Dickinson Law 1.1% 1
NY City University of New York 1.1% 4
OK University of Tulsa 1.1% 3
MA Northeastern University 1.1% 6
SC Charleston School of Law 1.1% 5
PA Penn State Law 1.0% 2
FL Stetson University 1.0% 9
VA Liberty University 1.0% 2
MI Michigan State University 1.0% 9
WA Gonzaga University 1.0% 4
PA Drexel University 1.0% 4
MI Wayne State University 0.9% 4
OR University of Oregon 0.9% 4
ID University of Idaho 0.9% 3
FL University of Miami 0.9% 10
NY Pace University 0.7% 4
NY New York Law School 0.7% 8
NH University of New Hampshire 0.7% 2
NY Hofstra University 0.7% 6
VT Vermont Law School 0.7% 3
PA Duquesne University 0.7% 3
IL Loyola University-Chicago 0.7% 5
FL Florida A&M University 0.7% 3
CA Thomas Jefferson School of Law 0.7% 5
MO Saint Louis University 0.7% 4
IN Valparaiso University 0.6% 3
CA California Western School of Law 0.6% 4
IL John Marshall Law School 0.6% 7
OH University of Dayton 0.6% 2
TN Belmont University 0.6% 1
WA Seattle University 0.6% 5
CO University of Denver 0.6% 5
TX St. Mary's University 0.6% 4
IA Drake University 0.6% 2
OH Cleveland State University 0.5% 2
DE Widener University-Delaware 0.5% 3
MN University of St. Thomas (Minnesota) 0.5% 2
OH University of Akron 0.5% 2
IL Chicago-Kent College of Law-IIT 0.5% 4
TX South Texas College of Law 0.5% 5
AZ Arizona Summit Law School 0.5% 4
DC Catholic University of America 0.4% 2
AL Faulkner University 0.4% 1
IL DePaul University 0.4% 3
FL Ave Maria School of Law 0.4% 1
LA Southern University 0.4% 2
CA McGeorge School of Law 0.4% 2
MD University of Baltimore 0.3% 3
IL Northern Illinois University 0.3% 1
FL St. Thomas University (Florida) 0.3% 2
TX Texas A&M University 0.3% 2
KS Washburn University 0.3% 1
IN Indiana University - Indianapolis 0.3% 2
CA Chapman University 0.2% 1
OK Oklahoma City University 0.2% 1
MA Suffolk University 0.2% 3
MI University of Detroit Mercy 0.2% 1
NC North Carolina Central University 0.2% 1
CA University of San Francisco 0.2% 1
MN Mitchell|Hamline 0.2% 2
FL Barry University 0.1% 1
FL Nova Southeastern University 0.1% 1
MA New England Law | Boston 0.1% 1
CA Southwestern Law School 0.1% 1
NC Charlotte School of Law 0.1% 1
TN Lincoln Memorial 0.0% 0
ID Concordia Law School 0.0% 0
VA Appalachian School of Law 0.0% 0
CA University of La Verne 0.0% 0
MA University of Massachusetts Dartmouth 0.0% 0
CT Quinnipiac University 0.0% 0
HI University of Hawaii 0.0% 0
RI Roger Williams University 0.0% 0
CA Western State College of Law 0.0% 0
DC District of Columbia 0.0% 0
MA Western New England University 0.0% 0
CA Golden Gate University 0.0% 0
OR Willamette University 0.0% 0
OH Capital University 0.0% 0
CA Whittier Law School 0.0% 0
NY Touro College 0.0% 0
GA Atlanta's John Marshall Law School 0.0% 0
FL Florida International University 0.0% 0
CA Santa Clara University 0.0% 0
FL Florida Coastal School of Law 0.0% 0
MI Thomas M. Cooley Law School 0.0% 0

More details on the legal job market: small law firm, business jobs disappearing

Last week, I posted a perspective on the changing legal market and the outcomes for the Class of 2016. Troublingly, job placement in full-time, long-term, bar passage-required positions declined from 25,787 for the Class of 2013 to 22,874 for the Class of 2016; that said, the placement rate improved from 55.9% to 62.4% because of the shrinking graduating class size.
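The two figures move in opposite directions because the denominator shrank faster than the numerator; backing out the implied graduating class sizes from the reported counts and rates makes that concrete:

```python
# Implied graduating class sizes from the counts and rates above.
placed = {"2013": 25_787, "2016": 22_874}
rate = {"2013": 0.559, "2016": 0.624}

for cls in ("2013", "2016"):
    print(f"Class of {cls}: ~{placed[cls] / rate[cls]:,.0f} graduates")
# Class of 2013: ~46,131 graduates
# Class of 2016: ~36,657 graduates
# Fewer jobs, but a much smaller class--so the rate still rose.
```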

I thought I'd dig into job-specific data to see what may be driving the decline. The problem with the industry-specific data is that we don't know whether the jobs are bar passage-required, J.D.-advantage, professional, or non-professional. That said, we can make some guesses; most entry-level hiring in law firms with 501 or more attorneys is probably bar passage-required, for instance. Regardless, I looked at the full-time, long-term job categories for each position and catalogued notable areas.

I had two instincts. First, perhaps big law hiring has declined as firms rely on greater productivity, greater outsourcing, increased reliance on technology, higher retention of junior associates, and delayed retirements. Second, perhaps government hiring has declined in an era of partisanship and budget stalemates.

Both instincts were wrong.

FTLT Class of 2013 Class of 2016 Net Delta
Solo 926 444 -482 -52.1%
2-10 6,947 5,490 -1,457 -21.0%
11-25 1,842 1,640 -202 -11.0%
26-50 1,045 906 -139 -13.3%
51-100 846 768 -78 -9.2%
101-250 1,027 940 -87 -8.5%
251-500 1,041 993 -48 -4.6%
501+ 3,978 4,204 226 5.7%
Business/Industry 5,494 3,796 -1,698 -30.9%
Government 4,360 4,034 -326 -7.5%
Public Interest 1,665 1,398 -267 -16.0%
Federal Clerk 1,259 1,184 -75 -6.0%
State Clerk 2,043 2,021 -22 -1.1%
Academia 490 352 -138 -28.2%
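The Net and Delta columns are simple differences and percent changes against the 2013 base; recomputing a few rows as a check:

```python
# Recompute Net and Delta for a few rows of the table above.
rows = {"Solo": (926, 444), "2-10": (6_947, 5_490), "501+": (3_978, 4_204)}

for name, (c2013, c2016) in rows.items():
    net = c2016 - c2013
    print(f"{name}: net {net:+,}, delta {net / c2013:+.1%}")
# Solo: net -482, delta -52.1%
# 2-10: net -1,457, delta -21.0%
# 501+: net +226, delta +5.7%
```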

Law firms with 501 or more attorneys were the only category that saw an increase in the last three years, a modest 226-person increase but a pleasant increase at that.

Yes, government jobs saw a decline, but not as significant a decline as other categories. Other, relatively small areas had some big declines. Public interest jobs declined, perhaps because of some combination of student debt levels, public interest organization funding, and law school-funded positions drying up (although the decline remains somewhat counterintuitive at a time when law schools have been increasing their hiring and development of clinical education). Solo practice placements fell by half--perhaps solo practice has become less attractive to new graduates given its uncertainty.

Instead, two areas saw declines of more than a thousand graduates each in three years.

First, hiring in small law firms, those with two to 10 attorneys, declined 21% in three years. This has been the single biggest employer of new graduates in recent years, and it has seen a significant decline. These are not firms that typically appear at on-campus interviews. But they are also firms, I think, that probably need new graduates to pass the bar on the first attempt. Failure to do so is a real challenge--they likely can't absorb someone who's unable to practice until the next bar go-around. And these are probably the places most willing to dip into lower-ranked schools and students with lower grades. The decline in bar passage rates may be impacting this area the most--but that's just speculation on my part.

Second, hiring in business & industry positions has declined nearly 31% in three years. As business & industry jobs are a major source of J.D.-advantage positions, it would explain the decline in J.D.-advantage positions, too. But while bar passage-required jobs in business might also suffer a decline in placement as bar pass rates decline, why J.D.-advantage positions, too? Perhaps--again, some speculation--it's that, in a (relatively) robust economy with many strong college graduates, businesses may no longer be valuing the J.D., or they may be finding greater value in M.B.A. students.

For schools looking to solve their employment challenges, addressing the reasons for the steep drop in demand from these two important groups of employers is crucial. The few reasons speculated about here are hardly even a beginning to exploring the changes in this climate.