Congrats to the University of Illinois-Chicago John Marshall Law School on an unprecedented USNWR peer score improvement

USNWR peer scores are notoriously sticky. That said, one of the surest ways to improve a peer score is to change names. Michigan State, Penn State, New Hampshire, and Texas A&M, among others, have all benefited from acquiring a law school and changing its name to that of the more prominent research institution. Granted, schools improve in many respects with such a change—more financial backing, more resources poured into the institution, often new leadership and new faculty.

But one thing that we can observe is the year-to-year change in peer reputation scores. USNWR surveys about 800 law faculty around the country with a slightly better than 70% response rate. They rate law schools on a scale of 1 to 5. Even after these schools' names changed, however, their peer scores did not immediately and rapidly change—the peer scores improved, at best, 0.2 points per year before leveling off.

In the entire history of USNWR peer surveys, just once have I observed an increase of 0.3 peer score in a year: the year after USNWR misreported the name of Loyola Law School in Los Angeles and the school dropped 0.3 points, it rose 0.3 points in the following year.

But the University of Illinois-Chicago (John Marshall), as printed by USNWR, has set a new record: it improved from 1.7 last year (when it was labeled “The John Marshall Law School”), to 2.1 this year. That’s also good enough (at least in part) to bring it from the “rank not published” tier to 140th overall.

The lawyers and judges survey is a three-year average of responses. That score rose from 2.6 to 2.7. We’ll see in two more years if the three-year average improves dramatically.

To be sure, more resources and new programs will be features as UIC brings John Marshall in. But peers reacted very positively in just one year to this new name.

Some thoughts about (and mostly against) pass-fail law school grading during Covid-19

Cornell Law School is reportedly among the first law schools to move to pass-fail grading for the semester. It’s very likely we’ll see many more schools doing so (particularly elite schools, or at least they’ll start the trend). This, like many decisions made in the coronavirus outbreak and the spread of the Covid-19 illness, will simply be a cascade. I’m mostly against such proposals (but have no influence over them!), and I thought I’d explain why.

I have a little experience thinking about this, having taught during a nearly three-week evacuation from Malibu during the Woolsey fires. No classes were held remotely, class periods were compressed, and the exam period was shortened.

Grades are an imperfect measure of student performance, but they do pretty well at identifying students who will perform well on the bar exam and students who are most at risk. They are also deeply valued by employers, judges looking for clerks, and so on. Grades are not the only thing, of course—good grades do not guarantee good jobs. But they are usually necessary, if not sufficient. (For those privileged to enjoy special family connections or status for legal employment, grades are usually not necessary.)

Assuming a law school should grade (and some might challenge this, as a few particularly elite schools offer little more than premium “pass-fail” grading—and as many law professors came from those schools), the question is what these circumstances do to change the presumption of grading into pass-fail.

Obviously, the disruption is significant. Many students are displaced physically from dorms or homes. Most classes will be taught in a new online format, either synchronous or asynchronous. Some students will not have access to their physical textbooks and need to rely on other materials. Still others are dealing with travel restrictions, or, worse, illnesses of themselves or loved ones.

All real concerns. At the same time, there are a couple of ways to look at disruption. One is disruption that affects everyone—and if we’re all in it together, we can all suffer along together, expecting, yes, some individual resiliency that may vary from person to person, but, on the whole, grading (which is typically distributed across pre-set grading criteria) will remain largely unchanged.

Another is disruption that uniquely affects a subset of the student population. Of course, these things happen every semester—the student with a death in the family, with a significant illness, with some personal problem that disrupts the term. Schools typically accommodate the student through leave or other policies, but rarely (I think) by letting the student take a course pass-fail. It’s simply that this is happening on a much larger scale, so many more people are collectively identifying issues that may affect them.

If pass-fail becomes optional, in some ways it ends up worse. Students who opt for a pass-fail grade immediately put less effort into the course. That artificially inflates the grades of the remaining students who are taking the exam on a graded basis. Depending on which cohorts opt for pass-fail, it can skew classes in bizarre ways.

Now, maybe these reasons aren’t persuasive, and you think that shifting to pass-fail, given the seriousness of the disruption and some varying levels of uncertainty, is the right call. I want to call attention to a few sub-populations that likely will be disadvantaged by this decision.

First, students in classes where they have done well on already-graded midterms or interim assessments. Yes, that’s a rarity in law school. But those students have put in work and received feedback that should, I think, be seriously considered as a component of their final performance for the sake of employers and judges looking at transcripts.

Second, academically at-risk first-year law students. Schools will place at-risk law students on probation after their first semester. A few “figure out” law school in the second semester and do well. That takes them off probation and puts them on track to graduate. Students without that opportunity are facing academic dismissal.

Third, students who “figure out” law school after one semester—often, in my anecdotal experience, those who come from professional careers with a gap between undergrad and law school. While there’s a high correlation between first semester and second semester grades, a small set of students will do exceedingly well.

Fourth, students fighting resume bias. Pass-fail makes it very easy for employers to rely on other measures like undergraduate and law school reputational quality. If they can’t rely on grades, they’ll rely on other proxies.

Back to the opening point, pass-fail is a luxurious advantage for the highest-ranking law schools. They can easily move to pass-fail and know that the vast majority of their students will experience little difference in likelihood of employment outcomes.

For many other students at the vast majority of law schools, however, I do think there will be disadvantages to moving to pass-fail.

Maybe I’m overstating it, and maybe there won’t be a significant change in judges’ or employers’ experience. Maybe the concerns of the students who I identified as potentially disadvantaged should be outweighed by the concerns of others. But I offer my own thoughts here and look forward to reading more of the robust debates in the days ahead—and to seeing how law schools react.

UPDATE: Three reflections on this. First, Prof. Maggie Wittlin writes, “The strongest argument I've heard in favor of pass/fail is that certain groups of students will be disadvantaged by maintaining grades, specifically, students with fewer resources, who can't access reliably fast internet or quiet spaces for studying.” I think this is right. It’s about certain kinds of disruption that uniquely affect subsets of the student population, and how to balance their concerns with the other concerns I laid out. No easy answers, but I don’t want to minimize the case for pass-fail. In some circumstances or at certain schools, these kinds of factors can cut in favor of moving there. But, in my view, we’d want more than general statements before doing so.

Second, this position shouldn’t be confused with a lack of empathy for students! I don’t defend grading as some kind of “tough life lesson,” as learning to be “resilient” in the face of challenges, and so on. That feels more like a rite of passage or a kind of hazing justification, which I think must fail. Instead, it’s to look at the value of grades—i.e., the value we assign to them before a disruption like this arises—and to weigh that value with the costs and benefits of switching in the midst of a challenging time like this. By all means, I emphatically defend accommodations for students, and consideration of whether, on the whole, alterations to grading should be made. It’s simply that, I think, the “solution” of pass-fail grading comes with problems that are often beneath the surface.

Third, there are more creative “optional” pass-fail structures out there, like allowing students to opt in after looking at their grades, or setting a “cutoff” and taking a pass-fail only if they perform below the cutoff. Solutions like these can help mitigate the distorting effects of an optional system by preserving the incentive for all students to work hard and keeping exam performance competitive. That said, any “optional” system invites student second-guessing, agonizing, gamesmanship, and curiosity about how decisions might affect the behavior of others—things I’ve experienced (none very good) in “optional” pass-fail settings, and things probably worthy of a separate and more extensive blog post.

Some more evidence of the scope of GRE admissions in legal education

I’ve written that GRE-related law school admissions remain a small, but rising, cohort of all admissions. But tracking GRE admissions is notoriously hard, as I lay out in these posts—the ABA allows students to be admitted without LSAT scores under a number of scenarios, and (so far) it isn’t disclosing GRE-specific data.

USNWR, however, does collect that data for its rankings. It uses GRE analytical writing, quantitative, and verbal scores in its rankings. It appears it converts those into a composite score, then converts it to the percentile equivalent of the LSAT.

This doesn’t really make any sense for a host of reasons. The GRE doesn’t have a single composite score like the LSAT. While many law schools indicate that GRE scores are predictive of law school performance, it’s not clear all schools treat all GRE scores similarly. And while USNWR does this for, say, the ACT & SAT for colleges, there’s a “concordance table” specifically designed to explore the relationship between the two.
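USNWR hasn’t published the mechanics of its conversion, but a percentile-equivalence mapping of the kind described can be sketched. Everything below is an assumption for illustration: the LSAT percentile table is invented (real LSAC percentile tables differ), and the nearest-percentile rule is a guess at how such a conversion might work, not USNWR’s actual method.

```python
# Hypothetical sketch of a percentile-equivalence conversion of the kind
# USNWR appears to use. All numbers here are invented for illustration;
# real LSAC/ETS percentile tables differ.

# Hypothetical LSAT score -> percentile table.
LSAT_PERCENTILES = {150: 44, 155: 64, 160: 81, 165: 92, 170: 98}

def lsat_equivalent(gre_percentile, lsat_table=LSAT_PERCENTILES):
    """Return the LSAT score whose percentile is closest to the GRE
    composite's percentile (a nearest-neighbor rule, assumed here)."""
    return min(lsat_table, key=lambda score: abs(lsat_table[score] - gre_percentile))

# A GRE composite at a (hypothetical) 92nd percentile maps to 165.
print(lsat_equivalent(92))
```

The sketch also illustrates the underlying objection: it requires collapsing three GRE section scores into one composite percentile before any mapping can happen, a step with no published concordance behind it.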

Its methodology, however, reveals that 32 law schools in 2019 reported GRE admissions, up from 16 in 2018. In part-time programs, the number rose from 4 to 10.

Again, it remains a small number of schools (about 15% of schools), and a small number of admissions even at those schools. A few schools have the bulk of GRE admissions. We continue to wait and see whether it will have any impact on law schools’ ultimate outcomes, like employment and bar passage. Until then, it remains a factor (albeit probably more flawed than other factors) in USNWR.

Thinking about higher education, legal education, and Covid-19

As the novel coronavirus (or the illness it causes, “Covid-19”) makes its way across the globe, we’ve seen significant containment and mitigation measures, largely due to the high degree of uncertainty over the virus—its morbidity and mortality rate, its relative risk, and so on. It’s certainly larger than other recent outbreaks like SARS 2002 or MERS 2012. And it appears to be somewhat more dangerous than the typical seasonal flu, although the degree of danger and the effect on certain subpopulations remains deeply uncertain.

I’m no epidemiologist, so I’ll leave it at that. I did want to focus, however, on the containment and mitigation measures as they affect higher education more generally and legal education more specifically.

Universities are, of course, shutting down international study abroad programs in places with travel advisories, returning students home, and having those students self-quarantine for fourteen days. Universities are also regularly sharing the same kinds of information: the risk to the university population remains low; faculty and student travel, particularly internationally, should be flexible and responsive to ongoing changes in circumstances; preparations for distance-learning or remote learning are underway in the event of shutdowns.

These are preparations for the start of worst-case scenarios, so it may never come to this at most places. At the same time, what can higher education do? Others have shared thoughts elsewhere, and, to be sure, many good ideas should be heeded. There are things the university health services should be doing, but I set those aside for the health professionals and university administrators to consider. On the pedagogical side, however, here are a few thoughts, inspired by Prof. Karen Grepin’s Twitter thread.

Anticipate the local disruption of canceled study abroad programs. If your university or your law school sends, say, 10% of its student body abroad each year, and those study abroad programs are canceled, it will be highly disruptive to the education community—the equivalent of admitting a much larger class than usual. Classrooms will be tighter, more classes or sections might need to be offered—that is, faculty might need to teach more and grade more. Student housing will also grow tighter for the term or the year. On the flip side, students from abroad may not be able to participate in programs here or may need to defer enrollment in degree programs. That loss of revenue should be anticipated now.

Begin recording or live-streaming lectures now. I have mixed feelings on recorded lectures. But schools should start doing this now, not when the university decides to shut down for a period of time. Only a few students are absent each class at the moment, so this allows faculty to start learning how to use the technology and to make errors in front of a much smaller audience. It then becomes routine in the event the entire class must hear some lectures remotely. It also gives faculty some time to think about how pedagogy may differ online. (This is particularly true, I think, as I consider the Socratic method or the seminar discussion remotely.) If you are concerned about recorded lectures, live-streaming from Zoom or another remote presentation software in lieu of recording may be preferable.

Accommodate sick leave for students and staff. Faculty have the greatest flexibility concerning sick leave, apart from days they teach. But ensuring that staff have paid sick leave, and are encouraged to take it, is essential to the health of the community. But related to the point above and concerning pedagogy, live-streaming lectures gives sick students—or self-quarantined students—the opportunity to stay home and not feel like they’re missing anything.

Increase exam flexibility. Whether that means take-home exams, self-scheduled exams, or flexible deadlines for exams, reducing student anxiety about remaining away from campus can be significantly helpful. It also means thinking about preparing exams that could accommodate such methods of test-taking.

I don’t have great answers. I only offer a few thoughts and hope universities and law schools are anticipating these big-picture changes that could happen in a very short time period and that could have lingering effects through the 2020-2021 academic year.

Are most law schools losing a million dollars a year?

Okay, that’s a provocative title. But here’s what I started to do. I tracked down 990 data for 14 law schools. Nine of these law schools reported seven-figure losses (“revenue less expenses”) for at least one of the last two fiscal years. The median loss in 2017 was $1.5 million; the median loss in 2018 was $720,000. My goal isn’t to call out particular schools in this post, so I anonymized the schools.

And yes, accounting practices may not tell us much about how “real” these deficits may be. But with that caveat, let’s take a look.

  990 Revenue Less Expenses
  School 2017 2018
School A -$1,740,899 -$331,894
School B -$2,361,093 -$2,719,319
School C $2,099,592 -$264,383
School D $1,179,037 $4,668,804
School E -$1,340,958 -$3,453,295
School F $945,640 $29,612,793
School G -$3,270,424 -$1,111,501
School H -$2,510,861 -$8,155,296
School I $59,515,319 -$3,918,838
School J -$20,456,417 $7,985,200
School K -$1,649,617 -$1,661,162
School L $3,658,968 $1,211,574
School M $592,788 $2,202,950
School N -$1,983,864 -$6,016,395

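The medians reported above can be reproduced from the table. A quick check (the values below are the fourteen “revenue less expenses” figures from the table, and the median is taken across all fourteen schools, gains and losses alike):

```python
from statistics import median

# "Revenue less expenses" for Schools A-N, from the 990 table above.
net_2017 = [-1740899, -2361093, 2099592, 1179037, -1340958, 945640,
            -3270424, -2510861, 59515319, -20456417, -1649617, 3658968,
            592788, -1983864]
net_2018 = [-331894, -2719319, -264383, 4668804, -3453295, 29612793,
            -1111501, -8155296, -3918838, 7985200, -1661162, 1211574,
            2202950, -6016395]

print(median(net_2017))  # about -$1.5 million
print(median(net_2018))  # about -$720,000
```

The medians match the figures in the text: roughly a $1.5 million median loss in 2017 and $720,000 in 2018.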
You’ll notice that some schools experienced dramatic swings from one year to another. A sizeable gift, a sale of property, or a restructuring of debt are all among the possible reasons why one year’s net revenue may have increased significantly; a tranche of funds for faculty buyouts or scholarships may have decreased it; a sharp swing in student enrollment may shift it in one direction or another. That might limit the value of making any overall comparisons.

Now, of course, the caveats. 990 data comes mostly from stand-alone law schools, which are not representative of all law schools. (University 990 data typically doesn’t break out sub-units like law schools.) Stand-alone law schools may not be able to scale like universities and may bear disproportionately more costs internally. They are often “lower ranked” than the typical law school, to the extent law school hierarchies bear some relation to financial stability or success. (UPDATE: As one commenter suggested to me, maybe all I’ve revealed is that 990 data is idiosyncratic!)

It’s also not to say that these schools are “at risk.” Their reported net assets range from $5 million to $274 million (five schools over or near $100 million). Schools with very high net assets can absorb such losses much longer. And losses as a percentage of total revenue matter, too—while some schools reported revenue as low as $3 million and others as high as $85 million, most had annual revenue of $30 million to $50 million.

That said, we rarely see much transparency about specific law schools and their budget deficits, with occasional stories of some institutions making their way to the media. I thought it might be worth looking at some of the information we do have and see how widespread the trend is.

Given that legal education has reached a kind of “new normal” of fairly stable and predictable applications and enrollment over several admissions cycles, one hopes that budget deficits will continue to shrink and law schools will find stability soon. The “crunch” in legal education is now about a decade old, so lingering financial challenges are not a welcome sight.

Law schools and faculty roles mentoring students in judicial clerkships

There’s a lot to say about this testimony of Olivia Warren and her experience of sexual harassment during her clerkship with Judge Stephen Reinhardt. It is a difficult thing to write and to say, and I’m glad she did. Jacqueline Thomsen at the National Law Journal is reporting about a congressional hearing on judicial misconduct, and Ms. Warren’s testimony is a part of that.

But I wanted to focus on one part where she describes some of the ways she tried to report the harassment. Here’s an excerpt when she went to her alma mater:

My first attempt to formally report the harassment occurred on August 1, 2018, when I met with several members of the Harvard Law School administration, including the Dean. A friend and mentor who is a tenured Harvard Law faculty member helped arrange the meeting and encouraged me to communicate my concerns so that more accurate information and better support could be provided to current and future students. During the meeting, I described my experience clerking for Judge Reinhardt and the harassment to which I was subjected. I also shared my view that I thought there was a risk this was happening with other clerkships. I emphasized that students rarely hear about negative clerkship experiences for many of the systemic reasons that I have explained, and described how misled I felt by the institutional push to clerk. Nobody has communicated to me since that meeting what, if any, steps Harvard has taken to address the issues I raised.

Law schools and law faculty have critical roles in mentoring and advising students. This includes learning about students’ interests and preferences, providing them clear-eyed and realistic advice about costs and benefits of career choices, and ensuring that the best interests of the students are pursued.

These are, I think, all distinct concepts. One is to recognize that there’s no one-size-fits-all solution for student employment. Optimal outcomes vary among law students, and not all opportunities, even elite opportunities, are equally suitable for all students.

Another is to provide clear-eyed and realistic advice. Clear in that some language can mask negatives—Professor Ed Swaine noted that “intense” is often coded for unarticulated negatives in employment. Not unduly negative, because many jobs are hard work. But not rosy and unusually optimistic, either. Every job has costs and benefits. Providing students with those tradeoffs and helping them make a decision is an important role for faculty. And in some cases, it’s not just providing tradeoffs, but making a recommendation about a preferable path to take—or to avoid—for specific reasons.

Finally, it’s about pursuing the best interests of the student. It’s the result of the first two—thinking of students first and providing them with the best advice to ensure that their interests are sought. Singular focuses or “institutional push[es]” can cloud that. That’s undoubtedly true as we think of “ranking” law schools by the number of federal clerkships or “big law” placement. Determining whether students are satisfied with their employment outcomes may not help schools in the “rankings,” but it’s the moral imperative of legal educators. Clerkships, after all, may just be another job.

In another sense, these are “first world” problems for some law schools. Many law students at many law schools would appreciate any opportunity to get gainful legal employment, much less to choose among options, let alone elite options! But the imperative remains in advising students at all levels and in all capacities.

I’ve had the pleasure of serving as a clerkship advisor (and sometimes externship advisor) to many students over many years, and I hope to continue to do so in the future. I’ve tried to heed these principles, and I’ll offer a couple of practical examples.

One student was weighing two options—he’d received an offer from a “less” prestigious judge but hoped to have an opportunity from a “more” prestigious judge within the next few days, and he asked me what I thought. Of course, turning down an offer from a judge is deemed taboo among law school career development offices, and I noted that to him—then I followed up with, “But what’s important is your career, and we’re going to talk about that.” The “more” prestigious judge, in my view, would not provide the mentoring role that I thought would be better for his career for a host of reasons, knowing the reputation of both judges and the student’s career interests. Ultimately, he took my advice to heart and accepted the offer from the “less” prestigious judge. And at the end of the work experience? He gushed about his experience, told me the mentoring he’d received was invaluable, and deeply appreciated my frank advice.

Another student had accepted an offer for a judicial clerkship when a problem arose that would counsel toward withdrawing from the clerkship. She called me to figure out what to do. Once again, I noted that yes, it might be “bad form” to withdraw from a clerkship, but, again, I emphasized we ought to talk through her career options—the problem, whether she ought to withdraw or take other steps, if she withdrew how to approach it, and so on. We had some lengthy conversations to figure out the best solutions, and all was resolved amicably.

Of course, I select a couple of good anecdotes with students where I’d built up good relationships! In others I could give more limited advice, or I referred them to other faculty who might give them better advice. And I haven’t had to experience horrible allegations of sexual assault from a clerk.

For students out there, seek out mentors who will provide you with this kind of advice. And importantly, seek them out early to help avoid making mistakes or running into problems. I’ve always been happy to help in whatever small way I can.

For law professors out there, encourage one another to think this way, and to advise students with this approach.

For me, I’m thinking about how to institutionalize some of this—training students before they head to clerkships, encouraging them to report concerns to us, and so on. It needs to be more than just faculty individually doing it. I only hope I can take my own advice and do a better job in the future.

Some questions about mashups of law school rankings

Each year, the Princeton Review surveys law students around the country and uses those surveys to create eleven rankings lists. (Three other lists are based on school-reported data.) The data includes factors like “Best Classroom Experience” and “Most Competitive Students.” I’ve previously noted that I think these surveys are pretty good because they are designed not to be “comprehensive” but to rate schools in different categories, even if the methodology is a fairly black box.

One oft-shared survey among law professors is “Best Professors.” That ranking, according to the Princeton Review, is “Based on student answers to survey questions concerning how good their professors are as teachers and how accessible they are outside the classroom.” Here’s the “top 10” as reported by the Princeton Review for 2020:

  1. University of Virginia

  2. Duke University

  3. University of Chicago

  4. Washington & Lee University

  5. Stanford University

  6. University of Notre Dame

  7. Boston College

  8. Boston University

  9. University of Michigan

  10. Northwestern University

Dean Paul Caron over at TaxProf approaches it somewhat differently. He takes two of the Princeton Review categories, “Professors: Interesting” (based on a student survey of “the quality of teaching”), and “Professors: Accessible” (based on a student survey of “the accessibility of law faculty members”), which Princeton Review rates on a scale of 60 to 99. It yields a different top 10:

1. University of Virginia

2. University of Alabama (tie)

2. University of Chicago (tie)

2. Duke University (tie)

2. Pepperdine University (tie)

2. Stanford University (tie)

2. Washington & Lee University (tie)

8. Charleston (tie)

8. Notre Dame (tie)

8. Regent (tie)

8. Vanderbilt (tie)
It’s an interesting comparison—ostensibly, the “Best Professors” rating is based on how good the professors are as teachers (i.e., quality), and accessibility. But mashing up two sets of categories from Princeton Review on similar topics yields different results: Alabama, Pepperdine, Charleston, Regent, and Vanderbilt go from “unranked” to the “top 10” of Dean Caron’s mashup rankings; Boston College, Boston University, the University of Michigan, and Northwestern University drop out of the “top 10.” What makes the Princeton Review’s judgment of “Best Professors” different from simply a mashup of two categories?

Dean Caron goes on to create a new “overall law school ranking,” with “equal weight” to five categories: (1) selectivity in admissions, (2) academic experience, (3) professor teaching quality, (4) professor accessibility, and (5) career rating. His rankings give 20% weight to each of the five categories noted above.
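Mechanically, an equal-weight composite of this kind is straightforward. The sketch below shows the arithmetic; the category scores for the hypothetical school are invented for illustration, not taken from the Princeton Review data.

```python
# Equal-weight composite of five Princeton Review-style categories.
# The category scores below are invented for illustration.
WEIGHTS = {
    "admissions_selectivity": 0.20,
    "academic_experience": 0.20,
    "teaching_quality": 0.20,
    "professor_accessibility": 0.20,
    "career_rating": 0.20,
}

def composite(scores):
    """Weighted sum of category scores (each on the 60-99 scale)."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

hypothetical_school = {
    "admissions_selectivity": 95, "academic_experience": 90,
    "teaching_quality": 92, "professor_accessibility": 88,
    "career_rating": 97,
}
print(round(composite(hypothetical_school), 1))  # 92.4
```

With equal weights, the composite is just the mean of the five category scores; the weights dictionary matters only once you start asking whether the categories deserve unequal weight.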

Princeton Review, in fact, disclaims doing this with its information:

Note: we don't have a "Best Overall Academics" ranking list nor do we rank the law schools 1 to 167 on a single list because we believe each of the schools offers outstanding academics. We believe that hierarchical ranking lists that focus solely on academics offer very little value to students and only add to the stress of applying to law school.

So this raises a new question: are these five categories of equal value to prospective students? Specifically, is "faculty accessibility” worth equal parts with “career rating”—that is, “the confidence students have in their school's ability to lead them to fruitful employment opportunities, as well as the school's own record of having done so”? I’m not terribly convinced these are of equal value.

For the Princeton Review, “career rating” actually includes several judgments compiled into a single index—practical experience, externship opportunities, student perceptions of preparedness, graduate salaries, bar passage-required jobs, and bar passage rates, all lumped into one 60-99 score.

Any ranking that combines multiple components into a single result requires value judgments about how to weight those components. The USNWR ranking, for instance, places about 22% of the overall score on career and bar outcomes. Above the Law bases all of its metrics on outcomes, mostly employment-related outcomes.

I bracket all this to say that I do not, of course, suggest that professor accessibility is unimportant. (If you’ve ever seen me at one of the four law schools I’ve taught at, you’ll see I’m physically in the building a lot—I value accessibility and making myself accessible tremendously!) But whether it’s of equal importance is something more puzzling for me. It’s valuable to students—but how valuable? How valuable compared to other things?

Professor Paul Gowder suggested that perhaps it’s valuable to have measures that are not simply highly correlated with one another, as so many rankings metrics tend to be, such that the rankings can truly measure lots of different things. It’s a fair point, and one to consider: faculty accessibility is negatively correlated (-0.12 by this measure) with career outcomes. Of course, as I ran these figures, this also seems a bit strange (in my view)—I might think that better faculty access (including mentoring!) would lead to better career counseling outcomes and better bar passage rates. Additionally, Professor Rob Anderson noted that there may be questions about how we define “faculty accessibility” and whether some faculty (like professors with childcare responsibilities) are disproportionately affected.

Other Princeton Review measures are more highly correlated with career outcomes, like “teaching quality” (0.36) and “academic experience” (0.56). Of course, least surprising is that “admissions selectivity” (0.71) is the most highly correlated with career outcomes.
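The figures above appear to be correlation coefficients across schools’ category scores. A minimal sketch of the computation, using a standard Pearson correlation and invented 60-99 scores for five hypothetical schools (not the real Princeton Review data):

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Invented 60-99 category scores for five hypothetical schools.
accessibility = [95, 88, 72, 80, 91]
career = [70, 85, 90, 88, 75]

# Negative here because these toy scores move in opposite directions.
print(round(pearson(accessibility, career), 2))
```

A negative coefficient, as with the reported -0.12 for accessibility, means schools scoring higher on one measure tend to score lower on the other.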

All this is to say, I appreciate when folks press forward with alternative measures of evaluating law schools and comparing law schools to one another. Personally, I think we don’t spend enough time evaluating a lot of important things about law schools, and I’ve tried to put some of them here on this blog: debt-to-income ratios, median debt loads, optimal employment outcomes, the role of school-funded jobs, and so on.

Each measure, though, like the Princeton Review rankings, is hard to compare to one another without making judgments about how to weigh them. Maybe we should weigh a series of factors equally; maybe there are reasons not to. And I confess I have far more questions than answers! But it’s also a reason to wonder whether “comprehensive” or mashup rankings are as valuable as a series of discrete evaluation categories that offers opportunities for students to assess how valuable individual components are.

Despite stable enrollment, law schools continue to shed full-time faculty

Overall law school enrollment has improved slightly over the last few years, and a huge influx of non-JD enrollment continues.

Nevertheless, ABA data reveals that law schools continue to shrink—at least, when it comes to full-time faculty.

Law schools dropped from 10,226 full-time faculty (a figure that includes all full-time positions, regardless of faculty status) in 2017 to 9,470 in 2019, a roughly 7% decline in two years. Law schools are doing more with less. And full-time faculty are not simply being replaced with adjuncts or temporary faculty—non-full-time faculty also declined over this period, albeit at a smaller rate (from about 17,000 to about 16,500).
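The headline decline is straightforward arithmetic on the ABA figures cited above:

```python
# Full-time faculty counts from the ABA data cited above.
full_time_2017 = 10226
full_time_2019 = 9470

decline = full_time_2017 - full_time_2019
pct = decline / full_time_2017 * 100
print(f"{decline} fewer full-time faculty, a {pct:.1f}% decline")
```

That works out to 756 fewer full-time faculty, or about 7.4%—the roughly 7% decline noted above.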

It might be, of course, that some of this attrition is simply phased retirements finally panning out, or ordinary departures that aren’t being refilled. But it’s also a sign that law schools are being cautious—and that despite enrollment improvements, revenue hasn’t improved in step (e.g., because of increased scholarship spending to attract a similar student profile).

Three schools (Arizona Summit, Valparaiso, and Whittier) shut down in this period, accounting for about 83 faculty (some, however, did lateral to other law schools). Beyond those closures, 44 law schools saw faculty declines of at least 15% in that time period.

Name 2017 2019 Change
Florida Coastal 39 13 -66.7%
Vermont 59 37 -37.3%
Thomas Jefferson 41 27 -34.1%
Liberty 27 19 -29.6%
Touro 44 31 -29.5%
William & Mary 64 46 -28.1%
Buffalo 61 45 -26.2%
Regent 27 20 -25.9%
Louisville 36 27 -25.0%
Arkansas 49 37 -24.5%
Oklahoma City 29 22 -24.1%
Western New England 29 22 -24.1%
Samford 25 19 -24.0%
Berkeley 103 79 -23.3%
Denver 85 66 -22.4%
American 94 73 -22.3%
Catholic 37 29 -21.6%
Widener-Delaware 33 26 -21.2%
Detroit Mercy 30 24 -20.0%
Nova Southeastern 56 45 -19.6%
Faulkner 26 21 -19.2%
DePaul 47 38 -19.1%
Akron 32 26 -18.8%
Concordia 16 13 -18.8%
New Mexico 48 39 -18.8%
Northern Kentucky 32 26 -18.8%
West Virginia 43 35 -18.6%
Creighton 33 27 -18.2%
Davis 51 42 -17.6%
North Carolina 68 56 -17.6%
Case Western Reserve 46 38 -17.4%
Seattle 59 49 -16.9%
North Carolina Central 36 30 -16.7%
Chicago-Kent 67 56 -16.4%
Texas Tech 43 36 -16.3%
Chapman 50 42 -16.0%
Ohio State 64 54 -15.6%
Southern Illinois 32 27 -15.6%
University of Washington 64 54 -15.6%
Charleston 26 22 -15.4%
Pepperdine 52 44 -15.4%
St. Louis 52 44 -15.4%
Mitchell|Hamline 46 39 -15.2%
San Diego 60 51 -15.0%

It’s not all bad news, however. 14 law schools (including a few recently founded schools) saw hiring upticks of at least 10%.

Name 2017 2019 Change
Lincoln Memorial 14 19 35.7%
UNT Dallas 16 21 31.3%
UNLV 42 55 31.0%
CUNY 51 64 25.5%
Appalachian 10 12 20.0%
La Verne 21 25 19.0%
George Mason 44 52 18.2%
Campbell 26 30 15.4%
Irvine 50 57 14.0%
Oklahoma 37 42 13.5%
Roger Williams 23 26 13.0%
Arkansas-Little Rock 25 28 12.0%
Penn State Law 42 47 11.9%
Howard 35 39 11.4%

A small but rising cohort of GRE law school admissions

In 2018, I looked at the “tiny impact (so far)” of GRE law school admissions. Law students admitted without an LSAT score rose from 81 in 2017 to 168 in 2018 (among ABA-accredited law schools, excluding those in Puerto Rico). But some law students could always be admitted without an LSAT score under limited circumstances. Still, it appeared that the bulk of these admissions were those with GRE scores. But given about 37,000 matriculants to law school, it was a very small percentage.

USNWR has begun collecting data about GRE admissions at law schools. That information (not publicly available, sadly!) confirms that the bulk of these no-LSAT admissions at law schools are those with GRE scores.

In 2019, no-LSAT admissions rose from 168 to 384—more than doubling from the previous year, which itself more than doubled the year before. Undoubtedly, they are on the rise.

Whoa, cowboy! That’s a data spike! But… not really. In fact, my first chart is probably pretty deceptive.

You see, 384 admissions among 37,873 matriculants represent just about 1% of all law school admissions. Still a small number—but rising. Let’s situate that figure among all admitted students, with a better sense of perspective, in a new chart.
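To put the spike in perspective, here is the share calculation using the figures above:

```python
# No-LSAT admissions and total matriculants for 2019, as cited above.
no_lsat_2019 = 384
matriculants_2019 = 37873

share = no_lsat_2019 / matriculants_2019 * 100
print(f"{share:.2f}% of matriculants were admitted without an LSAT score")
```

The result is just over 1%—a genuine increase, but a far less dramatic picture than a chart of raw counts suggests.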

Even though GRE admissions still represent a very small percentage of overall admissions, only a few dozen law schools accept the GRE. That means GRE admissions are disproportionately concentrated at a few law schools. Last year, I noted that Arizona had about 15% of its class as GRE admissions, and Harvard and Georgetown around 2% or 3% of the class. Several classes are above 5% this year: Alabama (8 non-LSAT admissions), Arizona (17), BYU (16), Chicago-Kent (12), Georgetown (48!), Georgia (19), Harvard (43!), Hawaii (12), Northwestern (27), St. John’s (13), and Buffalo (8). Indeed, these 11 schools account for more than half of all non-LSAT admissions.

So while the number rises and remains very small overall, a few schools have admitted substantial cohorts of GRE students. We’ll see what happens to these cohorts in the years ahead—if bar passage rates or employment rates materially differ, for instance. And we should expect this trend to continue next year.