Janet Reno, for-profit law schools, and a reversal at the New York Times

Recent stories, from commentary in the Atlantic to the opinion pages of the New York Times, lament the current state of legal education. The latest target of their ire? For-profit law schools operating in the United States. Consider one takeaway from the Times:

If this sounds like a scam, that’s because it is. Florida Coastal, in Jacksonville, is one of six for-profit law schools in the country that have been vacuuming up hordes of young people, charging them outrageously high tuition and, after many of the students fail to become lawyers, sticking taxpayers with the tab for their loan defaults.

There is a great irony, of course. These institutions exist solely because of the decision of the Department of Justice in the mid-1990s to compel the American Bar Association to accredit them--a move that the Times once praised.

(Let me confess at the outset that I know almost nothing about antitrust law.) In 1995, the Department of Justice and the American Bar Association entered a consent decree on the heels of DOJ antitrust allegations.

The Janet Reno-led DOJ believed that law school tuition was too high, and that it was too high largely because of cartel-like practices by the ABA that drove up the cost of faculty salaries. The ABA's exclusion of for-profit law schools from receiving accreditation was also deemed a culprit. Breathless reports from DOJ officials responsible for brokering a settlement anticipated an end to practices leading to "artificially inflated levels" of faculty salaries and "other costly accreditation requirements that had little to do with the quality of the legal education they provided."

The 10-year consent decree was recently renewed, and the ABA was found on one previous occasion to have violated its provisions. But these are small details arising out of the settlement; what of the two larger points, cost and for-profit institutions?

First, there is no evidence that this consent decree did anything to improve the affordability of legal education, despite DOJ assurances to the contrary. Indeed, student indebtedness rose dramatically after 1995. One ABA report in the early 2000s noted that 74.8% of students borrowed money to attend law school in 1992-1993, and the average amount borrowed was $37,637. In 1995-1996, the percentage rose to 84.5%, and the average amount borrowed rose to $49,415. And just a few years later, by 1999-2000, the percentage borrowing stood at 86.4%, and the average amount borrowed stood at $77,300. Slightly different statistics from the ABA reveal that in 2011-2012, the average amount borrowed was $84,600 at public schools and $122,158 at private schools.
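For perspective, the nominal growth implied by those figures is easy to compute. A quick sketch in Python, using only the ABA numbers cited above (no inflation adjustment):

    # Average amount borrowed for law school, per the ABA figures above.
    borrowed = {"1992-93": 37637, "1995-96": 49415, "1999-2000": 77300}

    base = borrowed["1992-93"]
    for year, amount in borrowed.items():
        print(year, f"{(amount - base) / base:+.0%} vs. 1992-93")
    # 1992-93 +0% vs. 1992-93
    # 1995-96 +31% vs. 1992-93
    # 1999-2000 +105% vs. 1992-93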

My point is not to say whether these figures are too high, too low, or just right, or even to offer an inflation-adjusted evaluation of the change in cost over time. (While I imagine most find these debt loads high, claims have been made that the whole-career value of a law degree justifies such debt loads, at least generally and historically.) Rather, it is to suggest that the DOJ's projection that its consent decree would help alleviate the costliness of legal education has never come to fruition. (Although, I suppose, one could claim that the indebtedness would have been far worse but for the consent decree--which is, perhaps, to take the rather bold promise of the DOJ and turn it into a quite modest claim.)

Furthermore, before the consent decree, the ABA had categorically refused to accredit for-profit law schools. Today, there are six such schools--precisely the schools that the New York Times et al. critique so harshly. My point is not to target any particular institution or type of institution, but simply to note that for-profit law schools were once deemed something good that should be pursued! Indeed, it was in the very same opinion pages of the New York Times, without a hint of irony, that the editors penned the following words twenty years ago:

The A.B.A. refused, for example, to accredit law schools that earned profit, paid low salaries or failed to provide paid leaves for faculty.

These rules have nothing to do with guaranteeing students that they are gaining professional training, and everything to do with guaranteeing faculty that they will be paid top dollar. The losers are students, who are forced to pay higher tuition than would otherwise be charged.

Yes, it was the Times twenty years ago praising the potential influx of for-profit law schools as a means of providing education at a better cost to law students. History, I suppose, can be lost in a given moment. But a bit of perspective helps us recognize that the very problems identified in legal education today are the direct result of DOJ decisions that were, at one time, lauded as the solution.

Sifting through the data for some clarifications about law student debt figures

Law school debt figures are an important component of any examination of the legal education market. But they are often misunderstood in ways that exaggerate their scope and burden--and they are often reported using figures that may, in other respects, conceal the true magnitude of their effect.

To understand why, consider statistics on alcohol consumption. The average American consumes 14 alcoholic drinks per week.*

Well, no, the average American doesn't. Among drinkers, the average American consumes 14 alcoholic drinks per week; but 30 percent of Americans don't drink. If you include those Americans, the number drops to 9.8 drinks per week. That's a pretty significant difference--but, of course, either figure is useful, as long as one includes the relevant caveat when disclosing the data.
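The arithmetic behind that adjustment is worth making explicit. A minimal sketch in Python, using the figures above:

    # Mean weekly drinks among drinkers, and the share of Americans who don't drink.
    drinks_among_drinkers = 14.0
    share_nondrinkers = 0.30

    # The unconditional mean counts non-drinkers as zeros.
    overall_mean = (1 - share_nondrinkers) * drinks_among_drinkers
    print(round(overall_mean, 1))  # 9.8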

The same holds true for debt figures. Using one measure, Law School Transparency reports an average debt of $118,670 for law school loans. Well, not exactly--$118,670 among law students who took out law school loans.

Consider, too, the recent New York Times article on the subject: "In 2012, the average law graduate’s debt was $140,000, 59 percent higher than eight years earlier."

That statement is false at one level and deceptive at another.

False because, as the Wall Street Journal reported in the very article the Times linked, that was the average debt among law graduates who borrowed. The Times piece includes no such caveat. The underlying data show that 87% of graduates borrowed--meaning nearly 1 in 7 has $0 in debt upon graduation.

And deceptive, because that $140,000 figure includes undergraduate debt. Certainly, it matters little to the law school graduate where the debt comes from, and law schools should be attuned to the total debt burden of their graduates. But it's also worth noting that not all law graduate loan debt is attributable to the law school. And because the target of the opinion piece is the price of law schools, fine-tuning its claims with greater precision would be in order.

Back to alcohol. Consider, too, the difference between mean (or average) consumption and median consumption. The median American consumes about 3 alcoholic drinks per week. That's not very much, particularly compared to the average figure I cited above of 9.8 drinks per week.

That's because a significant number of people consume zero alcoholic drinks, or hardly any at all, and a very small number of consumers imbibe an extraordinary number of drinks. The average, then, suggests that a "typical" American consumes a lot of alcohol; but the median--the 50th-percentile American, right in the middle of the general population--consumes only about 3 drinks per week. The average is not a very useful description of the typical person.
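To see the mechanics, consider a toy distribution (these ten values are invented for illustration, not drawn from any survey):

    import statistics

    # Ten hypothetical people: several abstainers, a few heavy drinkers.
    weekly_drinks = [0, 0, 0, 1, 2, 3, 4, 10, 30, 48]

    print(statistics.mean(weekly_drinks))    # 9.8 -- pulled up by the two heavy drinkers
    print(statistics.median(weekly_drinks))  # 2.5 -- the "typical" person drinks far less

Two heavy drinkers are enough to push the mean to nearly four times the median.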

Sadly, we lack more granular data on median debt loads at schools--despite the fact that we deeply value medians in law school admissions, including LSAT scores and UGPAs. And the debt picture may be the reverse of the alcohol picture. Few students may take out "low" debt loads--there may well be a chasm between the no-debt students and the indebted students. If most borrowers sit well above the reported "average" and the no-debt students drag that average down, then the "average" may suggest that the debt situation among graduates is better than it actually is. Of course, the reverse might be true--a few extreme outliers may distort the average upward, leaving the median lower. Given the number of students who enter with zero scholarship-based financial assistance and no personal family support, typical debt loads may be quite different than the "average" figures let on.

In short, the average indebtedness figures are a useful starting point. (Bill Henderson, for instance, notes some of the limitations and complications of such data, and suggests that the problems may be particularly acute at the high end.) But they are hardly a complete picture, and they must be considered in context--on a per-school and a national basis; in conjunction with and distinct from undergraduate debt; taking into account employment outcomes and default rates; and, of course, including careful clarification of what such figures represent, including which graduates they include or exclude.

*Note: this is an extremely quick and dirty analysis using the deciles of data disclosed; I'm sure the actual number is something different.

Year-over-year LSAT test-takers up a little, or down a little, or up somewhat

The Law School Admission Council ("LSAC") is in the business of, among other things, administering the Law School Admission Test ("LSAT"). Shortly after each administration--the test is offered four times a year--LSAC releases the total number of LSAT test-takers for that administration. This is a pretty easy number to understand. But it conceals a lot of data.

Baseball stat geeks are acutely aware of this phenomenon in other contexts. "Batting average," for instance, is a popular way of measuring a batter's productivity: hits divided by at-bats. But the measure conceals a lot of information about the batter's productivity. It equates singles and home runs, the latter being far more valuable. It ignores walks, a positive outcome for a batter. There are, perhaps, more valuable or useful metrics to evaluate a batter's quality, if only one looks inside the data.
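The same gap shows up in a quick computation. A toy example (the stat line is invented) of how batting average ignores walks while on-base percentage credits them:

    # A hypothetical batter's line.
    hits, at_bats, walks = 30, 100, 20

    batting_average = hits / at_bats  # 0.300 -- the 20 walks simply vanish
    # Simplified on-base percentage (ignoring hit-by-pitch and sacrifice flies).
    on_base_pct = (hits + walks) / (at_bats + walks)  # ~0.417

    print(round(batting_average, 3), round(on_base_pct, 3))  # 0.3 0.417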

Reading that top line, you see that overall LSAT takers are up 6.6% over last year. But there are other numbers to look at, too, which LSAC distributes via PDF to law schools but does not include in the top-line data on its website.

For instance, LSAT administrations at U.S. regional test centers are up 7.9% June-over-June, but down 7.7% at Canadian test centers. (A handful of other international test centers exist, too.) That's probably slightly better news for law schools--the overwhelming majority of matriculants to United States law schools take the LSAT in the United States.

Even within that 7.9% figure, salient distinctions exist. For instance, first-time test-takers at U.S. regional test centers are up 8.0% June-over-June, whereas repeaters are up 7.7%. Perhaps not an overwhelming difference, but it's worth emphasizing that first-time test-takers--probably the more important measure of new interest in law school--are up slightly more than repeaters.

How these statistics correlate to the more meaningful measure, law school applicants, is another matter entirely. Last year, overall LSAT test-takers declined 9.1% June-over-June, including a 9.9% decline at U.S. regional centers and an 11.7% decline among first-time takers at U.S. regional centers. By year's end, LSAT test-takers had increased year-over-year, and applicants declined just 1.8% year-over-year.

More granular data might indicate a more meaningful narrative about applicants this admissions cycle for the incoming Class of 2019. But, as is often the case, it's wait and see until the next data point arrives.

A tale of two law school applicant cycles

It may not be the best of times, but there is no question that today's prospective law school applicant is in a dramatically different position than the law school applicant of just four years ago. Gleaning data from LawSchoolNumbers (of course, with all the usual caveats that come with such data), I looked at the profiles of similarly situated law school applicants applying to a similar set of law schools in the 2010-2011 and 2014-2015 application cycles. I included their self-reported (all the usual caveats) outcomes. (I found applicants with identical LSAT scores and similar UGPAs, but I ensured that if the UGPAs were different, the 2014-2015 applicant always had the slightly worse UGPA.) I anonymized the schools, even though they're easily discoverable, simply because the precise identities of the schools don't matter terribly much; instead, the illustration of the dramatically different outcomes for similarly situated applicants four years apart stands on its own. The dollar figure listed is the three-year scholarship offer.

YEAR 2010-2011 2014-2015
APPLICANT LSAT 160 160
APPLICANT UGPA 3.53 3.46
School W Rejected Waitlisted
School X Rejected Accepted, $30,000
School Y Rejected Accepted, $102,000
School Z Accepted Accepted, $102,000
   
YEAR 2010-2011 2014-2015
APPLICANT LSAT 162 162
APPLICANT UGPA 3.42 3.4
School J Rejected Accepted, $120,000
School K Waitlisted Accepted, $159,000
   
YEAR 2010-2011 2014-2015
APPLICANT LSAT 166 166
APPLICANT UGPA 3.91 3.91
School C Waitlisted Accepted
School D Rejected Accepted, $127,500
School E Accepted Accepted, $105,000
     
YEAR 2010-2011 2014-2015
APPLICANT LSAT 162 162
APPLICANT UGPA 3.9 3.72
School P Waitlisted Accepted
School Q Waitlisted Accepted, $48,000
School R Accepted, $25,000 Accepted, $132,000

UPDATE: For the methodology, yes, I simply matched pairs of similarly situated applicants as best I could. I excluded anyone with a self-identified distinctive applicant profile, such as under-represented minority or early action applicants, to minimize any distinctions between applicants.

The twenty-two law reviews you should follow on Twitter

While you could follow a pretty sizeable list of law reviews I've maintained on Twitter, there are a handful of law reviews that rise above the rest.

Last year, I listed the sixteen law reviews to follow on Twitter. I've modified the criteria slightly and updated the list. I've mentioned that I find Twitter one of the best places to stumble upon scholarship and engage in a first layer of discussion about new ideas.

In my view, it's actually shocking how challenging it is to find recent journal content. Many journals don't maintain a Twitter feed, much less a decent web site--most lack an RSS feed, are updated infrequently at best, and often include stock art (because, apparently, law reviews are into stock art?). Given the scarce resources law schools have today, one might expect schools to find ways of maximizing the value of their investments in their journals.

(Also, some advice you're welcome to ignore if you're developing a Twitter handle. Avoid symbols like underscores in your name. Eschew the "U" for your university if possible. Abbreviate LRev and LJ if you can. You want as concise a handle as possible, so that people mentioning your username can still fit a message within 140 characters. And you want it to be clear who you are: too brief a nickname for your law school may not communicate much about your brand to the casual Twitter follower.)

Alas, I'll settle for the occasional tweet on the subject. I looked at the flagship law reviews at the 105 law schools with a U.S. News & World Report peer score of 2.2 or higher. If I found their Twitter accounts, I included them. I then examined how many tweets each account had, how many followers it had, and when its last tweet (not a retweet) took place. I then created a benchmark, a slightly stricter standard than last year's (as another year has passed!): the law reviews "worth following" are those with at least 150 tweets, at least 150 followers, and at least one tweet (not a retweet or direct reply) in the last 30 days. I thought that would be a pretty minimal standard for the level and recency of engagement. (A code sketch of this filter appears after the list.) This 150/150/30 standard reduces the list to 22 accounts worth following:

Harvard Law Review

Stanford Law Review

Yale Law Journal

Columbia Law Review

Chicago Law Review

NYU Law Review

California Law Review

Michigan Law Review

Penn Law Review

Texas Law Review

UCLA Law Review

George Washington Law Review

Boston University Law Review

Iowa Law Review

Ohio State Law Journal

Florida Law Review

Illinois Law Review

Washington Law Review

Connecticut Law Review

Case Western Reserve Law Review

St. Louis University Law Journal

Syracuse Law Review
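For the curious, the 150/150/30 filter itself is trivial to express in code. A minimal sketch in Python; the account tuples here are just two hypothetical sample rows, not my full data set:

    from datetime import date, timedelta

    # (handle, tweets, followers, date of last original tweet); None if no tweets.
    accounts = [
        ("@HarvLRev", 650, 15800, date(2015, 5, 15)),
        ("@UtahLawReview", 0, 3, None),
    ]

    today = date(2015, 6, 1)
    cutoff = today - timedelta(days=30)

    worth_following = [
        handle
        for handle, tweets, followers, last_tweet in accounts
        if tweets >= 150
        and followers >= 150
        and last_tweet is not None
        and last_tweet >= cutoff
    ]
    print(worth_following)  # ['@HarvLRev']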

It's fairly notable, I think, that half of the schools on this list have a top-20 peer reputation score and that every single one of the top 9 schools in peer reputation makes the list. Indeed, follower count is meaningfully correlated with peer score (r = 0.57)!
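That correlation is straightforward to reproduce. A sketch using a handful of rows from the table below (the full computation uses every journal with an account, so this subset will yield a somewhat different coefficient):

    from statistics import correlation  # Python 3.10+

    # Peer score and follower count for a few accounts from the table below.
    peer_scores = [4.8, 4.8, 4.8, 4.6, 4.6, 4.5, 4.4, 3.2, 2.2]
    followers = [15800, 4202, 6335, 2901, 3409, 4946, 2164, 939, 295]

    print(round(correlation(peer_scores, followers), 2))  # Pearson's r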

Below is the complete list of these journals, with 150/150/30 law reviews highlighted. If you see a journal not listed, tweet me about it @derektmuller.

Peer score Journal Tweets Followers Last tweet (not RT)
4.8 @HarvLRev 650 15.8K May 15, 2015
4.8 @StanLRev 414 4202 May 29, 2015
4.8 @YaleLJournal 657 6335 May 15, 2015
4.6 @ColumLRev 288 2901 May 12, 2015
4.6 @UChiLRev 265 3409 May 15, 2015
4.5 @nyulawreview 1281 4946 May 2, 2015
4.4 @CalifLRev 342 2164 May 21, 2015
4.4 @michlawreview 161 1465 May 31, 2015
4.4 @PennLawReview 331 1945 May 25, 2015
4.3 @VirginiaLawRev 16 328 May 18, 2015
4.2 @Cornell_Law_Rev 1 613 July 21, 2010
4.2 @DukeLawJournal 30 899 April 15, 2015
4.1 @GeorgetownLJ 71 744 May 13, 2015
4.1 @NULRev 136 615 May 30, 2015
4.0 @TexasLRev 410 1609 May 19, 2015
3.9 @UCLALawReview 150 1864 May 7, 2015
3.8 Vanderbilt  
3.5 @emorylawjournal 54 110 May 20, 2015
3.5 @SCalLRev 5 69 May 9, 2013
3.5 Washington (St. Louis)  
3.4 @GWLawReview 405 379 May 30, 2015
3.4 @MinnesotaLawRev 37 394 March 31, 2015
3.4 @NotreDameLawRev 20 303 May 12, 2015
3.4 North Carolina  
3.3 @BULawReview 527 1003 May 11, 2015
3.3 @UCDavisLawRev 92 334 May 29, 2015
3.3 Wisconsin  
3.2 @AlaLawReview 31 477 March 1, 2015
3.2 @BCLawReview 343 1165 April 27, 2015
3.2 @fordhamlrev 355 1638 March 6, 2015
3.2 @IowaLawReview 232 939 May 14, 2015
3.2 @OhioStateLJ 392 1130 May 25, 2015
3.2 Indiana (Bloomington)  
3.2 William & Mary  
3.1 @GaLRev 9 171 May 12, 2015
3.1 @HastingsLJ 117 358 May 3, 2015
3.1 @UFLawReview 203 558 May 22, 2015
3.1 @UIllLRev 204 879 May 5, 2015
3.1 @WashLawReview 127 1065 May 21, 2015
3.1 @WLU_LawReview 247 137 May 14, 2015
3.1 Colorado  
3.0 @arizlrev 31 197 April 2, 2013
3.0 @TulaneLawReview 40 546 March 6, 2015
3.0 @WFULawReview 764 542 April 24, 2015
3.0 Arizona State  
3.0 Irvine  
2.9 BYU  
2.9 Florida State  
2.9 Maryland  
2.8 @AmULRev 335 821 March 26, 2015
2.8 @ConnLRev 668 816 May 31, 2015
2.8 @UtahLawReview 0 3 n/a
2.7 @CardozoLRev 53 765 May 14, 2015
2.7 @denverlawreview 101 506 April 28, 2015
2.7 @geomasonlrev 82 143 May 17, 2015
2.7 @UMLawReview 97 671 April 28, 2015
2.6 @OregonLawReview 7 340 April 7, 2015
2.6 @PeppLawReview 595 671 April 17, 2015
2.6 @PittLawReview 0 10 n/a
2.6 @ukanlrev 98 467 September 25, 2014
2.6 Missouri (Columbia)  
2.6 San Diego  
2.6 SMU  
2.6 Temple  
2.6 Tennessee  
2.5 @Brook_L_Rev 0 26 n/a
2.5 @CaseWResLRev 813 731 May 27, 2015
2.5 @GSULawReview 0 14 n/a
2.5 @KYLawJournal 17 131 March 20, 2012
2.5 @LLSlawreview 0 1 n/a
2.5 Chicago-Kent  
2.5 Houston  
2.5 Richmond  
2.4 @LUCLawJournal 169 120 May 21, 2014
2.4 @NebLRev 161 119 May 31, 2015
2.4 @RutgersLJ 12 457 May 2, 2014
2.4 @RutgersLRev 63 580 April 3, 2015
2.4 Baylor  
2.4 Hawaii  
2.4 Indiana (Indianapolis)  
2.4 Lewis & Clark  
2.4 New Mexico  
2.4 Oklahoma  
2.4 Santa Clara  
2.3 @arklawrev 156 1726 February 17, 2014
2.3 @HULawJournal 430 567 January 9, 2015
2.3 @MichStLRev 318 584 April 23, 2015
2.3 @NevLawJournal 54 82 November 17, 2014
2.3 @nuljournal 40 286 May 25, 2015
2.3 @SCLawReview 317 738 February 20, 2015
2.3 @SHULawReview 22 163 January 28, 2014
2.3 @VillanovaLawRev 40 112 March 19, 2015
2.3 Cincinnati  
2.3 Marquette  
2.3 Mississippi  
2.2 @lalawreview 74 664 April 13, 2015
2.2 @MaineLawReview 92 463 May 26, 2015
2.2 @pennstatim 27 132 September 18, 2013
2.2 @SLULawJournal 560 455 May 14, 2015
2.2 @SULawRev 23 34 March 6, 2015
2.2 @SyracuseLRev 330 295 May 8, 2015
2.2 @UMKCLawReview 2 60 April 20, 2015
2.2 DePaul  
2.2 St. John's  
2.2 SUNY (Buffalo)      

LSAT scores and GPAs of law school matriculants, sorted by undergraduate major, 2013-2014

Following up on yesterday's post, here are the LSAT scores and GPAs of law school matriculants (as opposed to applicants), for majors with at least 80 matriculants reported. All the caveats and qualifications from the previous data set apply.

You can see classics, math, linguistics, art history, and physics all near the top. Philosophy and economics are two of the larger disciplines that perform quite well. Hover over the data for your own observations!

Which undergraduate majors are the best law students? Featuring interactive visualizations

Last year, I posted data about which undergraduate majors had the best LSAT scores and undergraduate GPAs among law school applicants and matriculants. I have a new data set for the 2013-2014 admissions cycle. (I also have been working with D3, a JavaScript library, to make my graphics more interesting. I'm new to this, but I hope I've avoided any technological glitches!)

One cannot identify causation based upon these scores. Students self-identify majors, sometimes more than one, and sometimes none at all; others self-select out of taking the LSAT altogether (opting for medical school, business school, or a lucrative career instead of law school). Therefore, one emphatically cannot conclude from this data that these majors cause students to perform better or worse on the LSAT. The data simply describe them.

The chart above identifies the mean highest LSAT score and mean undergraduate GPA based on self-identified major, for majors with at least 80 students taking the exam, among all law school applicants.
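Mechanically, that aggregation is a simple group-and-filter. A hypothetical sketch in Python (the three records are invented, and far too few to clear the 80-student cutoff; the real figures come from the data set described above):

    from collections import defaultdict

    # (self-identified major, highest LSAT, UGPA) -- invented records.
    applicants = [("Philosophy", 162, 3.61), ("Economics", 161, 3.48), ("Philosophy", 158, 3.40)]

    by_major = defaultdict(list)
    for major, lsat, ugpa in applicants:
        by_major[major].append((lsat, ugpa))

    for major, rows in by_major.items():
        if len(rows) >= 80:  # keep only majors with at least 80 students
            mean_lsat = sum(l for l, _ in rows) / len(rows)
            mean_ugpa = sum(g for _, g in rows) / len(rows)
            print(major, round(mean_lsat, 1), round(mean_ugpa, 2))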

A forthcoming chart will have the same information, but solely for law school matriculants; that is, for people who actually matriculated to law school this year. Stay tuned!

UPDATE: I had some glitches in the first post, so I removed the matriculant data and will save it for another post. The matriculant data is now available here.

Visualizing law school federal judicial clerkship placement, 2012-2014

The release of the latest ABA employment data offers an opportunity to update the three-year federal judicial clerkship placement rates. Here is the clerkship placement rate for the Classes of 2012, 2013, and 2014. Methodology and observations below the interactive visualization. (By the way, this is my first effort to code a visualization using D3, so please bear with me for any technical glitches!) The "placement" is the three-year total placement; the "percentage" is the three-year placement divided by the three-year graduating class total.

The placement is based on graduates reported as having a full-time, long-term federal clerkship. (A one-year term clerkship counts for this category.) I thought a three-year average for clerkships (over 3500 clerks from the graduating classes of 2012, 2013, and 2014) would be a useful metric to smooth out any one-year outliers. It does not include clerkships obtained by students after graduation; it only includes clerkships obtained by each year's graduating class.
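Put differently, the percentage is a pooled ratio--total clerks over total graduates--rather than an average of three annual rates. A minimal sketch with hypothetical numbers:

    # Three-year federal clerkship placement: pooled clerks over pooled graduates.
    clerks_by_year = {2012: 12, 2013: 9, 2014: 15}     # hypothetical clerkship counts
    grads_by_year = {2012: 180, 2013: 175, 2014: 190}  # hypothetical class sizes

    rate = sum(clerks_by_year.values()) / sum(grads_by_year.values())
    print(f"{rate:.1%}")  # 6.6%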

You can see that smaller schools and strong regional schools perform quite well. You can see that the University of California-Irvine is still performing quite well, but that's largely because a third of its figure includes the 16-for-56 placement from its inaugural class. This year's placement was slightly over 10%, and I anticipate that its larger class sizes in the coming years will settle it somewhat lower, but still near the top.

By the way, I'd previously called this a "microranking," but I've abandoned that title for a couple of reasons. First, "rankings" are, in my view, increasingly problematic, particularly given how law school marketing departments gush over every "ranking," from whatever source, that places them anywhere near a respectable position in an effort to attract prospective students. Second, I tried using a 20-80 scale to rate schools, but, with a strong visualization, I feel more confident letting the figures speak for themselves without attaching a scaled numerical value.