Two charts on non-JD law school enrollment figures for 2014

Following up on last week's post on non-JD enrollment, I thought I would update last year's charts on the ratio of JD and non-JD enrollment in law schools.

The first chart reflects the last 26 years of non-JD enrollment as a percentage of total law school enrollment, now updated to include the 2013-2014 figures.

[Chart: non-JD enrollment as a percentage of total law school enrollment, through 2013-2014]

Non-JD enrollment was in the 4-5% range until around 2000, when it increased to around 5-5.5% for the next few years, and has risen each year since 2007. In the 2008-2009 and 2009-2010 academic years, non-JD students were 6% of all law school enrollment. In 2010-2011, it was 6.2%; in 2011-2012, it was 6.5%; and in 2012-2013, it was 7.4%. This year, it increased yet again to 8.0%.

Part of that recent increase is, of course, relative. The non-JD percentage appears higher in the last three academic years in particular simply because of the significant drop-off in JD enrollment. But non-JD enrollment is higher in absolute terms, too. Consider the last 11 years.

[Chart: total JD enrollment (left axis) and non-JD enrollment (right axis), last 11 years]

When I drafted this chart last year, I tried to make the scale the same on both sides, with a common starting point for the JD enrollment (blue, on the left axis) and the non-JD enrollment (red, on the right axis), to reflect changes in absolute numbers. That, however, yielded an uncomfortable chart when I updated it. The sharp drop in JD enrollment means that the non-JD axis of the chart technically dips below zero (denoted by a small line below the last numbered axis point). But I wanted to preserve the same chart I used last year for continuity.
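For readers curious about the mechanics, here is a minimal matplotlib sketch of a dual-axis chart in this style. The enrollment figures are invented for illustration, and the equal-span alignment is my reconstruction of the design described above, not the actual chart code.

```python
import matplotlib.pyplot as plt

# Invented figures for illustration only; not the actual enrollment data.
years = list(range(2004, 2015))
jd = [127000, 130000, 132000, 133000, 135000, 138000,
      141000, 139000, 133000, 125000, 117000]
non_jd = [5800, 6000, 6200, 6300, 6700, 7200,
          7600, 8000, 8600, 9400, 10400]

fig, ax_jd = plt.subplots()
ax_non_jd = ax_jd.twinx()  # second y-axis sharing the same x-axis

ax_jd.plot(years, jd, color="blue")
ax_non_jd.plot(years, non_jd, color="red")

# Give both axes an identical span and align their starting values, so a
# one-student change covers the same vertical distance on either side. A
# sharp enough drop in JD enrollment pushes the non-JD axis below zero.
span = 30000
ax_jd.set_ylim(jd[0] - span * 0.9, jd[0] + span * 0.1)
ax_non_jd.set_ylim(non_jd[0] - span * 0.9, non_jd[0] + span * 0.1)

ax_jd.set_ylabel("JD enrollment", color="blue")
ax_non_jd.set_ylabel("Non-JD enrollment", color="red")
plt.show()
```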

Which law schools have the highest non-JD enrollment?

UPDATE: The table below has been corrected due to an error on my part--it double-counted the non-JD online degrees.

I've discussed the trend of increased non-JD enrollment in law schools. Thanks to new ABA data, we now have the JD and non-JD enrollment data for each school in 2013.

It turns out that the original figures I used were underinclusive in one respect: the ABA reports "non-JD enrollment" as the sum of post-JD enrollment and post-baccalaureate enrollment (including "non law," usually "master level programs aimed at non-lawyer professionals"). But it excludes the 1677 "non-JD online" enrollment. [UPDATE: This is incorrect: it was pointed out to me that the totals DO include the non-JD online. A corrected table is below.]

I sorted the schools by the total non-JD enrollment--including post-JD, post-baccalaureate, and non-JD online--as a percentage of total enrollment (the denominator being those categories, plus full-time and part-time JD enrollment). These schools had the highest percentage of non-JD enrollment.

Vermont: 38.5%

NYU: 33.2%

Loyola Chicago: 32.2%

Boston University: 30.7%

Temple: 26.6%

Georgetown: 25.4%

Alabama: 24.5%

Washington: 24.4%

Berkeley: 23.2%

USC: 22.0%

Tulsa: 21.8%

Golden Gate: 21.6%

Northwestern: 19.8%

Columbia: 19.0%

Washington University in St. Louis: 18.9%

Illinois: 17.6%

Penn State: 17.1%

Case Western: 16.8%

George Washington: 16.6%

Denver: 16.1%
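For concreteness, the sort described above amounts to something like the following minimal Python sketch. The field names are my own shorthand, not the ABA's actual column labels.

```python
# Minimal sketch of the non-JD percentage sort; field names are hypothetical.
schools = [
    {"name": "Example Law", "post_jd": 120, "post_bacc": 40,
     "non_jd_online": 25, "jd_full_time": 500, "jd_part_time": 60},
    # ... one entry per school from the ABA data ...
]

def non_jd_share(s):
    # Numerator: all non-JD categories; denominator adds JD enrollment.
    non_jd = s["post_jd"] + s["post_bacc"] + s["non_jd_online"]
    total = non_jd + s["jd_full_time"] + s["jd_part_time"]
    return non_jd / total

for s in sorted(schools, key=non_jd_share, reverse=True):
    print(f"{s['name']}: {non_jd_share(s):.1%}")
```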

The ABA Task Force on the Future of Legal Education wants to change everything

A peril with the ABA Task Force on the Future of Legal Education is that it recommends changes; it does not implement them. So it is now incumbent upon a series of other actors (including other sections of the ABA itself) to act upon its recently-released report and recommendations.

The report is concise, punchy, and thorough. Its specific recommendations are clear and numerous.

It first recommends another task force to evaluate the pricing and financing of legal education, this one to address the concerns identified by the Task Force, such as the rapid increase in tuition, discounting based on LSAT scores, the lack of need-based discounting, and loans, among other things.

It then asks the Section of Legal Education and Admissions to the Bar to eliminate or substantially moderate a number of standards. These standards, the Task Force concludes, "directly or indirectly raise the cost of delivering a J.D. education without commensurately contributing to the goal of ensuring that law schools deliver a quality education," and "directly or indirectly impede law school innovation in delivering a J.D. education without commensurately contributing to the goal of ensuring that law schools deliver a quality education." Those standards and interpretations recommended for elimination or substantial moderation include:

  • Standard 206(c) (requiring that, except in extraordinary circumstances, a dean be a faculty member with tenure)
  • Standard 304, and specifically Standards 304(b) & (c) (a 130-day academic year; 45,000 minutes of attendance in regularly scheduled class sessions; requiring that a J.D. degree be completed no earlier than 24 months after commencement of law study)
  • Interpretation 304-5 (requiring that credit only be given for course work taken after the student has matriculated to a law school)
  • Interpretation 305-3 (prohibiting credit earned for field placement in which student receives compensation)
  • Standard 306 (limiting and conditioning distance education)
  • Interpretations 402-1 & 402-2 (restricting faculty-student ratio and discounting non-tenured and non-tenure track faculty)
  • Standard 403 (limiting instructional roles of non-tenured and non-tenure track faculty)
  • Standard 405 (regarding tenure)
  • Standard 603 (requiring certain standards for the director of the law library)
  • Interpretation 701-2 (requiring a certain amount of "adequate physical facilities")
  • Rule 25 (requiring confidentiality over all matters relating to accreditation of a law school, including the site evaluation report)
  • Rule 27 (prohibiting disclosure of statistical data reported to the ABA)

It's ironic, I think, that at the very time the ABA Task Force on the Future of Legal Education has identified a "mismatch" between curriculum and goals; "diverse views" on the purpose of law schools; a need for "greater heterogeneity" in law schools and legal education; and a need for innovation to reduce the cost of legal education, the ABA Section of Legal Education and Admissions to the Bar has proposed a new, likely more expensive, homogeneous approach to experiential education.

This is a dramatic series of specific recommendations, some targeting quite long-standing standards. But here's where I think one may see resistance. The existing 201 or so schools are already built on this model. Even if these standards and interpretations are abolished, it would be very difficult for an existing school, barring a complete overhaul, to take advantage of the new flexibility.

Instead, the biggest opportunities exist in new law schools (ironic, I know, given that there are so many existing schools). It's through pressure from new, smaller, sleeker, more nimble schools structured around a lower cost model that the existing law school model would be truly forced to change (apart, of course, from its present state of triage as it relates to a declining applicant pool). And, of course, it's a reason why law schools would be unlikely to want to adopt these changes.

The report goes on to recommend that law schools--and not just the institutions, but also law faculty members--should undertake certain steps. Here they are:

Each law school should undertake the following:

1. Develop and Implement a Plan for Reducing the Cost and Limiting Increases in the Cost of Delivering the J.D. Education, and Continually Assess and Improve the Plan.

2. Develop and Implement a Plan to Manage the Extent of Law School Investment in Faculty Scholarly Activity, and Continually Assess Success in Accomplishing the Goals in the Plan.

3. Develop a Clear Statement of the Value the Law School's Program of Education and other Services Will Provide, Including Relation to Employment Opportunities, and Communicate that Statement to Students and Prospective Students.

4. Adopt, as an Institution-Wide Responsibility, Promoting Career Success of Graduates and Develop Plans for Meeting that Responsibility.

5. Develop Comprehensive Programs of Financial Counseling for Law Students, and Continually Assess the Effectiveness of Such Programs.

Law school faculty members should undertake the following:

1. Become Informed About the Subjects Addressed in This Report and Recommendations, in Order to Play an Effective Role in the Improvement of Legal Education at the Faculty Member's School.

2. Recognize the Role of Status as a Motivator but Reduce its Role as a Measure of Personal and Institutional Success.

3. Support the Law School in Implementing the Recommendations in [the section above].

How will law faculty, and law schools, respond? Time will tell.

Ranking the Law School Rankings, 2014

Last year, I introduced the first-ever ranking of law school rankings at PrawfsBlawg. I thought I would reprise the task again.

As Elie Mystal at Above the Law noted at a recent conference, law school rankings tend to encourage more law school rankings. So it may be useful to put them in a single place and analyze them.

The rankings tend to measure one of, or some combination of, three things: law school inputs (e.g., applicant quality, LSAT scores); law school outputs (e.g., employment outcomes, bar passage rates); and law school quality (e.g., faculty scholarly impact, teaching quality). Some rankings prefer short-term measures; others prefer long-term measures.

Last year, I ranked 15 rankings. I'm adding four other rankings: Enduring Hierarchies, Witnesseth Boardroom Rankings, Above the Law Rankings, and Tipping the Scales Rankings.

1. Sisk-Leiter Scholarly Impact Study (2012): Drawing upon the methodology from Professor Brian Leiter, it evaluates the scholarly impact of tenured faculty in the last five years. It's a measure of the law school's inherent quality based on faculty output. In part because peer assessment is one of the most significant categories for the U.S. News & World Report rankings, it provides an objective quantification of academic quality. Admittedly, it is not perfect, particularly as it is not related to law student outcomes (of high importance to prospective law students), but, nevertheless, I think it's the best ranking we have.

2. NLJ 250 Go-To Law Schools (2013): It's a clear, straightforward ranking of the percentage of graduates from each school who landed a position at an NLJ 250 law firm last year. It does not include judicial clerkships, or elite public interest or government positions, but it is perhaps the most useful metric for elite employment outcomes.

3. Princeton Review Rankings (2013): Despite a black box methodology that heavily relies on student surveys, the series of rankings gives direct and useful insight into the immediate law school situation. It is admittedly not comprehensive, which I think is a virtue.

4. Above the Law Rankings (2013): The methodology is heavily outcome-driven. Unfortunately, it conflates "tuition" with "cost" (conceding as much when evaluating its own metrics), and it relies heavily on a couple of narrow categories (e.g., Supreme Court clerks). But it's a serious and useful ranking.

5. Enduring Hierarchies in American Legal Education (2013): Using a wealth of metrics, this study evaluates the persistence of hierarchies among law schools. Little has changed over the last several decades in which law schools are considered high quality. This study tries to identify the traits of those hierarchies, and it categorizes the schools into various tiers.

6. Witnesseth Boardroom Rankings (2013): Professor Rob Anderson's analysis is extremely limited: it evaluates which law school graduates end up as directors or executive officers at publicly held companies. I think it gives a nice data point in an area that's under-discussed: law school graduates, after all, may find success in business and not simply in the practice of law.

7. Law School Transparency Score Reports (2013): It's less a "ranking" and more a "report," which means it aggregates the data and allows prospective students to sort and compare. The data is only as useful as what's disclosed--and so while it provides some utility, it's limited by the limited disclosures.

8. The Black Student's Guide to Law Schools (2013): Despite its obviously narrow audience, I think it offers some unique, and serious, elements, such as cost and cost of living, and "distinguished alumni" as a measure of school quality and student outcomes.

9. Roger Williams Publication Study (2013): It selects a smaller set of "elite" journals and ranks schools outside the U.S. News & World Report "top 50." There are a few issues with this, especially given its narrow focus, but I think it does a nice job filling in some gaps left by the Sisk-Leiter study.

10. SSRN Top 350 U.S. Law Schools (2014): The total new downloads give you an idea of the recent scholarship of a faculty--with an obvious bias toward heavy-hitters and larger faculties.

11. Wall Street Journal Law Blog's Best Big Law Feeder Schools (2012): It's somewhat less useful than the NLJ 250, but it is what it is.

12. U.S. News & World Report (2013): It isn't that this ranking is so bad that it deserves to be 12th on my list. It's not ideal. It has its problems. I've noted that it distorts student quality. But, mostly, its placement reflects that there are quite a few rankings that, I think, are much better.

13. Tipping the Scales Rankings (2013): The metrics are simply a bit too ad hoc--and that's saying something coming behind U.S. News & World Report. The factors are idiosyncratic, and while they superficially acknowledge things like student quality and outputs, the measures used (salary data, which is inherently bimodal; acceptance rates, which are not uniform indicators of quality; etc.) do not take those things seriously.

14. QS World Law School Rankings (2013): I think this ranking tends toward comparing apples, oranges, kumquats, rhododendrons, and lichen: all living things, but extremely hard to compare.

15. Business Insider 50 Best Law Schools in America (2013): A survey of 400 legal professionals asked to name their top 10 schools, this... isn't much use as a ranking. It reflects the impressions of a set of practitioners using a specific methodology. That's about it.

16. Seto Rankings (2012): As these rankings have been thoroughly debunked, there isn't much for me to add. Here, bigger schools do better.

17. National Jurist Best Value Law School Rankings (2013): These rankings had so many flaws that subsequent remedial measures are inadequate to fix them. They should not be used.

18. Top Law Schools Rankings (2013): Last year, I indicated a myriad of reasons why these rankings were a hot mess. This year, Top Law Schools has "fixed" them... simply by repeating the overall U.S. News & World Report rankings and the Above the Law rankings. Because an ordinal ranking is less useful than the underlying data, this repackaging is far less useful than the rankings it copies.

19. Cooley Rankings (2010).

A few things I learned from the Wisconsin new lawyers task force report

The State Bar of Wisconsin recently released (PDF via TaxProf) a task force report on "Challenges Facing New Lawyers." They received feedback from 599 "new lawyers" with a margin of error of 3.8%. Most graduated in 2011; over 70% had graduated since 2008. Here are some things I gleaned.

1. Contracts, criminal law, family law, estate planning, municipal/government law. These were the top five practice areas in an open-ended, unlimited response survey of the graduates when asked to identify their "primary" practice area. (These may be useful courses for a student to consider.)

2. Undergrad debt is a small portion of law graduate debt. Among those who took out loans (over 90% of respondents), the median amount of law school debt was $95,000; the mean was $94,822; the 25th and 75th percentiles were $68,000 and $120,000. But the median undergraduate debt was just $20,000, and only half of respondents had undergraduate debt.

3. Graduates expected lower debt loads. The survey found that about 80% found their debt "more than they expected it to be." One factor is probably the ballooning cost of attendance: debt loads increased by 36.8% from 2005 to 2008. Another is probably a lack of information for prospective students. And matters have worsened lately, after federal politicians agreed to eliminate subsidized loans for law students. Most anticipate long-term (i.e., at least 6 years) consequences on account of their debt.

4. Graduates expected higher salaries. 78.9% thought their earnings were less than they expected; their average income was $41,591. 6.6% had incomes higher than they expected; their average income was $147,250. And for the 14.6% whose income met expectations, average compensation was $79,865.

5. Graduates expected better benefits. 71.4% received worse benefits than they expected, such as health or dental insurance.

6. Jobs come from who you know. More respondents found jobs through "networking" (39.2%) than through "job boards" (30.9%). The survey cautions that the results reflect which methods respondents used, so efficacy should not be inferred from them. (And it roughly correlates with NALP figures I've seen.) But it's a notable result.

7. Students sometimes don't know what classes they ought to take until it's too late. Even though law schools often offer practice management, business training, entrepreneurship, and similar classes, many students do not take them because they "do not realize the importance of the classes until it's too late."

8. Most students who knew about the difficult legal job market still went to law school. The survey reports that about 64% of students went to law school "despite knowing the difficulties lying ahead because they wanted to help others or serve justice." They thought the risk was acceptable: "More than half of respondents thought the risk was worth the reward or they trusted that they would find a way to succeed."

Ranking law prof blogs by digital privacy

I recently posted an "annual disclosure," which describes things like hits, costs, and privacy. Of note, this blog is not monetized in any way. At the moment, I have a fairly strong hostility to the notion that legal academics should monetize their blogs--not that I begrudge or hold it against any who do, but that I am not convinced it would do anything but pressure my blog toward a reduction in quality or an increase in real or apparent corruption. (For more on that, see my site disclosure.)

I thought it might be useful to rank the blogs of law professors based upon digital privacy. I used 66 blogs for this sample: the top 50 law professor blogs from a recent TaxProf ranking based on site traffic, and 16 other blogs of note (including my own), some of which are not law professor blogs but law-related blogs.

Every time you visit a site, you transmit information to that site. Presumably, the owner of the site (and anyone responsible for hosting the site) has access to that information. But sites may contain a number of items that disclose your information to third parties--frequently without notice, and without your knowledge.

So, I used Ghostery to determine the "cookies, tags, web bugs, pixels and beacons" that are invisible to a site visitor, but that may (and frequently do) disclose information to third parties. It treats each tracking item as equal; it counts multiple tracking items from the same site (e.g., Twitter Badge and Twitter Button) as separate items.
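To make the counting rule concrete, here is a minimal Python sketch of the tally, assuming a hand-collected mapping from each site to the Ghostery items observed there; the data structure and site names are mine, not Ghostery's.

```python
# Minimal sketch of the tracker tally; the mapping is assumed to be
# hand-collected from Ghostery's per-page report, with placeholder names.
from collections import Counter

site_trackers = {
    "example-blog-one.com": ["Google Analytics", "SiteMeter",
                             "Twitter Badge", "Twitter Button"],
    "example-blog-two.com": ["Google Analytics"],
}

# Each tracking item counts equally; multiple items from the same provider
# (e.g., Twitter Badge and Twitter Button) count separately.
for site, trackers in sorted(site_trackers.items(), key=lambda kv: len(kv[1])):
    print(f"{site}: {len(trackers)}")

# Tally the most popular trackers across all sampled sites.
popularity = Counter(t for items in site_trackers.values() for t in items)
print(popularity.most_common(5))
```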

In a sense, Ghostery is underinclusive. A site may collect data internally without any tracker, because the information is routed directly through the site itself; there is no need to install a third-party tracking device. So, of course, you should assume that all sites contain at least one tracking device: the site itself. What each site collects, and how it uses it, varies from site to site.

Further, Ghostery is, alas, but one imperfect measure. When I recently did this test, I saw that some sites had different third-party trackers depending on when I visited, or from where. These rankings, accordingly, should be taken with a grain of salt.

One curious item is the utter lack of transparency on most sites--some, ironically, operated by law professors who specifically write in the area of digital and online privacy. I'm hard-pressed to find on most of these sites a privacy policy or a disclosure that your information is being shared with third parties, or that the site operators may profit from your activity on the site. Perhaps that disclosure is not the kind of thing one should expect--but it's one I would hope to see more of in the future.

Among these 66 blogs, 87 different kinds of devices were used. The most popular include the following (with links to the Ghostery identification of the service and the information collected by the service, such as whether the information is anonymous, pseudonymous, or personally-identifiable information, and data retention policies, if available). Where available, I've included a link to the site's opt-out options.

Google Analytics (used by 56 sites) (opt-out options)

SiteMeter (48) (opt-out cookie)

Specific Media (46) (opt-out cookie)

Vindico Group (46) (opt-out cookie)

Quantcast (40) (opt-out cookie)

ScoreCard Research Beacon (40) (opt-out cookie)

Typepad Stats (31)

The sites below are ranked by the number of trackers identified; fewer trackers rank higher. The top three: Non Curat Lex, operated by Professor Kyle Graham, who bravely has not a single tracking device (and who, alas, has just retired from blogging); my own blog, which uses Google Analytics (as disclosed); and Professor Lawrence Lessig's blog, which uses just Google Analytics and Typekit by Adobe (which collects only anonymous information about the serving domain).

The numbers below are the total Ghostery items identified on each site. There are, I'm sure, other ways of calculating "digital privacy"; please let me know if you have thoughts in the comments.

Non Curat Lex 0

Excess of Democracy 1

Lessig.org 2

ACS Blog 3

Feminist Law Professors 3

How Appealing 3

Word on the Streeterville 3

Balkinization 4

California Appellate Report 4

Freakonomics 4

Point of Law 4

Discourse.net 5

Dorf on Law 5

Election Law Blog 5

Harvard Law Corp Gov 5

Lawfare 5

The Incidental Economist 5

Constitutional Law Prof Blog 6

Federalist Society Blog 6

IntLawGrrls 6

Legal History Blog 6

Turtle Talk 6

Sentencing Law & Policy 7

Sports Law Blog 7

Althouse 8

Antitrust & Comp. Policy Blog 8

ContractsProf 8

CrimProf Blog 8

EvidenceProf Blog 8

Instapundit 8

Legal Profession Blog 8

Legal Skills 8

Legal Theory Blog 8

Legal Whiteboard 8

Legal Writing Prof Blog 8

M&A Law Prof Blog 8

PrawfsBlawg 8

PropertyProf Blog 8

SCOTUS Blog 8

White Collar Crime Prof Blog 8

Workplace Prof Blog 8

Credit Slips 9

ImmigrationProf Blog 9

Leiter's Law School Reports 9

Wills, Trusts & Estates Prof Blog 9

Josh Blackman's Blog 10

Legal Ethics Forum 10

Leiter Reports: Philosophy 10

Religion Clause 10

Conglomerate 11

Hugh Hewitt 11

Patently-O 11

TaxProf Blog 11

Witnesseth 11

Concurring Opinions 12

Nonprofit Law Prof Blog 12

Volokh Conspiracy 12

College Insurrection 13

Faculty Lounge 13

The Right Coast 14

Jack Bog's Blog 15

Opinio Juris 15

Legal Insurrection 16

Professor Bainbridge 20

Above the Law 22

Mirror of Justice 29

New ABA data shows JD enrollment down, non-JD enrollment up

Following up on my earlier discussions projecting the near future of JD enrollment and the legal innovation of increased non-JD enrollment, the ABA has released a portion of the Fall 2013 data. The results are actually slightly worse for JD enrollment than my original projections: I estimated 40,200 enrolled, and the final number from the ABA is 39,675, around an 11% decline. I've updated the chart to show the decline in LSATs administered, JD applicants, and JD matriculants, along with slightly modified (i.e., slightly lower) projections for Fall 2014.

Additionally, non-JD enrollment increased nominally, by an additional 70 students, to an all-time high of 11,139. That almost assuredly means non-JD enrollment now exceeds 7.5% of total law school enrollment. I won't have that figure until later, but consider the earlier charts about non-JD enrollment. The ABA has also included a new breakdown of non-JD enrollment: "post-JD" (this year, 9401) and "non-JD for non-lawyers" (1738).

These 30 law schools became more affordable in the last 3 years

There's one area of legal education that, in my view, has been under-examined: student debt. Some look at the non-discounted cost of legal education, but that's not really helpful, because there are plenty of reasons that this cost is not the true cost (e.g., scholarships and tuition assistance). Others look at tuition increases or decreases, but that approach suffers from the same problem: we can't tell whether the changes are falling on students or not.

Student debt is only a part of the story, of course, but it's one of the most significant barriers to legal education. Lower debt tends to give students more flexibility in pursuing careers, or less urgency in finding desirable employment outcomes.

For several years, U.S. News & World Report has ranked schools by average debt load among students graduating with debt. I thought I would try to examine the trend in that data over the last three years. That is, in this economic environment, which schools have responded in a way that minimizes, or even decreases, the typical debt a student graduates with? In a time of reduced donations and a prolonged economic downturn, where did the brunt of the cost fall?

I decided to examine the difference in the average student debt loads from 2009 to 2012. There are a number of complications and assumptions I had to make.

First, a substantial number of students, usually 10% to 20% at most schools, graduate with no law school debt. That could be because they are independently wealthy or come from a wealthy family willing to finance the education; they could have substantial scholarship assistance; they could earn income during school or during the summers; they could live in a low cost-of-living area, or live frugally; or some combination of these and other factors. It's worth noting that several thousand students graduate each year without any debt.

I started with the 2009 data and adjusted for inflation by increasing the debt load by 7%.

I then took the percentage change in the 2012 average debt load from the 2009 inflation-adjusted average debt load. Schools with a negative percentage change saw a decrease in the average debt load over the last three years. The percentage may be somewhat deceptive, because at a very low-cost school, a modest increase in debt load may appear, on a percentage basis, much higher than comparable increase at a high-cost school.  A $10,000 increase in debt at a school that previously had just $20,000 in debt looks like 50%; at a school with $100,000 in debt, just 10%. But I thought percentage would still be the most useful.

I then decided to discount the average debt load at each institution each year by the percentage of students who incurred no debt. A school may simply be admitting more independently-wealthy students, or students with substantial work savings, so that fewer graduate without debt; or, it may be handing out more scholarships to help more students graduate debt-free. On the whole, I thought factoring them into the total would be a better way of evaluating the affordability. (This is, of course, not to say that law school is "costless" for people who graduate debt free!)
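Putting those steps together, here is a minimal Python sketch of the calculation, with made-up figures; the helper names and the exact form of the discount are my reconstruction of the method described above.

```python
# Minimal sketch of the affordability comparison; figures are made up.
INFLATION = 0.07  # 2009 debt loads are adjusted upward by 7%

def discounted_debt(avg_debt, share_with_debt):
    # Discount the average debt among borrowers by the share of the class
    # that borrowed at all (spreading it across debt-free graduates too).
    return avg_debt * share_with_debt

def pct_change(debt_09, share_09, debt_12, share_12):
    base = discounted_debt(debt_09, share_09) * (1 + INFLATION)
    new = discounted_debt(debt_12, share_12)
    return (new - base) / base

# Example: $100,000 average debt with 85% borrowing in 2009, versus
# $105,000 average debt with 80% borrowing in 2012.
print(f"{pct_change(100_000, 0.85, 105_000, 0.80):+.1%}")  # -7.6%: "more affordable"
```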

The averages are not precise, either, for individuals. The average may be artificially high if a few students took out extremely high debt loads that distorted the average, or artificially low if a few students took out nominal debt loads that distorted the average.

These figures cannot be read in isolation. For instance, in the spreadsheet, one public school saw one of the highest percentage increases in student debt at 238%. But the 2009 total was just under an inflation-adjusted $24,000, and its new total of $68,000 is still one of the lowest in the country, particularly when one factors in its relatively good employment outcomes. Or, there are several schools that had astronomical student debt loads in 2009 and saw modest increases through 2012, which may not be indicative of some praiseworthy trait in 2012.

For some schools, there is so substantial a difference between the 2009 numbers and the 2012 numbers that I wonder if there is some one-off from either year that may be responsible, or a reporting error to USNWR.

When I ran the calculations, 30 schools came out as "more affordable." That is, there were 30 schools whose 2012 average student debt load, discounted by the percentage of students who incurred no debt, was lower than the similarly discounted, inflation-adjusted 2009 student debt load. I also decided to include below 10 schools whose 2012 average student debt load was no more than 2.5% higher. (Given the closeness of the figures, I thought being within 2.5% of the 2009 debt load was good enough, even if the school was not, strictly speaking, "more affordable.")

Change in law student debt loads, 2009 to 2012

Villanova -19.6%

Akron -17.5%

Elon -14.8%

Penn -14.0%

Western State -13.7%

Emory -13.1%

Minnesota -11.7%

North Dakota -10.9%

Barry -10.6%

Tulane -10.3%

St. Thomas MN -9.7%

Southern Illinois -9.7%

Baylor -9.6%

Vanderbilt -8.8%

South Dakota -8.5%

Montana -8.4%

UConn -8.3%

Pepperdine -8.0%

Illinois -7.9%

Gonzaga -7.3%

Albany -5.6%

Arkansas -5.2%

Mississippi -4.6%

Wisconsin -4.5%

Penn State -4.3%

Chicago -3.7%

SMU -3.3%

GWU -1.9%

BU -0.5%

Catholic -0.3%

Roger Williams 0.3%

St. John's 0.3%

W&L 0.4%

San Diego 1.0%

DePaul 1.1%

Harvard 1.2%

Loyola Marymount 1.8%

Texas 1.8%

American 2.0%

Notre Dame 2.1%

Observations from these figures

These observations are fairly general in nature, and I don't know that too much should be read into any of them--if they are even noteworthy.

First, two schools, Villanova and Illinois, had relatively public law school-related scandals in the last few years. Neither happened before the Class of 2012 was admitted to law school, but it may be that tuition or scholarships were adjusted in a particular way to entice students to stay.

Second, a few "elite" schools (e.g., Harvard, Chicago, Penn, Texas, Vanderbilt) helped keep debt loads stable, or eased them, suggesting that not all elite schools are, shall we say, inelastic in their tuition demands on students.

Third, there appears to be some notable regionalism. The Midwest (Akron, Chicago, DePaul, Illinois, Minnesota, Notre Dame, St. Thomas, Southern Illinois, Wisconsin), private schools in southern California (Loyola Marymount, Pepperdine, San Diego, Western State), New England (BU, Harvard, Roger Williams, UConn), the DC-area (American, Catholic, GWU, W&L), and most of the strongest Texas schools (Baylor, SMU, Texas) saw increasing affordability. In contrast, northern California (zero), New York City (St. John's), and Florida (Barry) saw virtually no schools listed.

Fourth, despite the fact that many states have cut funding for flagship schools, a number of these institutions are public schools that still saw declines in student debt loads.

I recognize that "more affordable" is a relative term. It may be that some of these schools are still "overpriced" or "underpriced." And, as mentioned earlier (and it bears repeating), this is just one factor to take into consideration among many others, including faculty quality, employment outcomes, and regional preferences.

If you'd like to look at the data, see this Google Doc. There are a few useful ways you may want to sort it. I arranged by "Dsct diff," which includes the discount rate for the percentage of students who incurred no debt. If you want the absolute rate, sort by "Diff." If you want the raw money figures rather than the percentages, sort by "$ diff" or "$ dsct diff."

Ranking law schools by elite employment outcomes

Thanks to the new granular employment data reported by law schools to the American Bar Association, we can try to evaluate student outcomes by a variety of metrics. I thought I'd try to rank schools by "elite employment outcomes" from the Class of 2012 data.

This ranking looks at two employment figures: full-time, long-term bar passage-required employment in firms of 101 or more attorneys; and federal clerkships.

No ranking is perfect, and this one is no exception. There are plenty of "elite" jobs that are not at law firms of more than 100 attorneys, particularly elite public interest positions, financial sector or other JD-preferred positions, or academic positions for those with a joint JD-PhD. There are "elite" boutiques with 100 or fewer attorneys. Not all federal clerkships are created equal. But this is at least a rough metric of two objective elements.

One additional complicating factor is the existence of school-funded positions. I opted to discount the employment rate by the school-funded rate for all full-time, long-term, bar passage-required employment. Unfortunately, schools do not report the size of the firms for students receiving school-funded positions, and it may be that there are fewer school-funded positions at larger firms than smaller firms. But, with the metrics I have, I opted to include the rough discount across the board and accept the lack of precision.

Finally, I decided to visualize the rankings, because sometimes numerical rankings don't indicate the gaps in performance. There are fourteen schools that had over 50% of their students obtain elite employment outcomes; the fifteenth-ranked school was below 40%, which is demonstrated in the gap in the visualization below. I included in the graphic all schools with at least 20%; in the ranking, I included the forty-two schools with at least 15%. (Also of note: fifteen schools had less than 1% of their graduates obtain elite employment outcomes.)

Class of 2012 data: School / Graduates / 101+ firms / Less school-funded / Federal clerks / Total
PENNSYLVANIA, UNIVERSITY OF 270 180 175.1 28 75.2%
STANFORD UNIVERSITY 181 85 82.9 51 74.0%
HARVARD UNIVERSITY 590 316 306.2 105 69.7%
COLUMBIA UNIVERSITY 469 301 274.9 37 66.5%
CHICAGO, UNIVERSITY OF 215 121 110.9 31 66.0%
YALE UNIVERSITY 222 71 66.7 77 64.7%
CORNELL UNIVERSITY 190 110 109.3 12 63.9%
DUKE UNIVERSITY 225 115 114.4 29 63.7%
CALIFORNIA-BERKELEY, UNIVERSITY OF 312 167 167.0 21 60.3%
NEW YORK UNIVERSITY 482 287 249.1 27 57.3%
NORTHWESTERN UNIVERSITY 295 145 144.4 19 55.4%
VIRGINIA, UNIVERSITY OF 364 174 146.7 45 52.7%
CALIFORNIA-IRVINE, UNIVERSITY OF 56 13 13.0 16 51.8%
MICHIGAN, UNIVERSITY OF 388 168 166.4 33 51.4%
GEORGETOWN UNIVERSITY 626 245 223.1 23 39.3%
VANDERBILT UNIVERSITY 196 56 56.0 20 38.8%
CALIFORNIA-LOS ANGELES, UNIVERSITY OF 333 114 110.7 16 38.0%
SOUTHERN CALIFORNIA, UNIVERSITY OF 221 68 68.0 15 37.6%
TEXAS AT AUSTIN, UNIVERSITY OF 373 100 99.3 31 34.9%
FORDHAM UNIVERSITY 486 148 147.5 14 33.2%
BOSTON UNIVERSITY 273 74 74.0 13 31.9%
NOTRE DAME, UNIVERSITY OF 196 44 43.0 18 31.1%
BOSTON COLLEGE 260 66 66.0 6 27.7%
EMORY UNIVERSITY 266 60 54.8 17 27.0%
GEORGIA, UNIVERSITY OF 229 40 40.0 20 26.2%
WASHINGTON UNIVERSITY 300 69 67.7 10 25.9%
GEORGE WASHINGTON UNIVERSITY 575 157 116.9 22 24.2%
ILLINOIS, UNIVERSITY OF 213 47 44.5 4 22.8%
NORTH CAROLINA, UNIVERSITY OF 256 42 41.1 12 20.7%
WEST VIRGINIA UNIVERSITY 142 22 22.0 7 20.4%
WAKE FOREST UNIVERSITY 156 23 23.0 7 19.2%
HOUSTON, UNIVERSITY OF 262 47 47.0 3 19.1%
ALABAMA, UNIVERSITY OF 172 14 14.0 16 17.4%
SOUTHERN METHODIST UNIVERSITY 293 42 42.0 9 17.4%
MINNESOTA, UNIVERSITY OF 230 31 29.6 10 17.2%
HOWARD UNIVERSITY 151 23 23.0 2 16.6%
WILLIAM AND MARY LAW SCHOOL 204 26 19.9 12 15.7%
WASHINGTON AND LEE UNIVERSITY 130 14 14.0 6 15.4%
VILLANOVA UNIVERSITY 256 34 34.0 5 15.2%
TULANE UNIVERSITY 269 27 26.8 14 15.2%
CALIFORNIA-HASTINGS, UNIVERSITY OF 443 59 59.0 8 15.1%
KENTUCKY, UNIVERSITY OF 147 17 17.0 5 15.0%

UPDATE: Brian Leiter has included his thoughts here.

I should add that the table includes both the total number of positions in "large" law firms and a figure discounted by the school-funded share of full-time, long-term, bar passage-required positions (the discounted figure is what I use in the underlying percentages). I concede that at many schools, school funding is only available to individuals in government or non-profit work. But I do not have that data. A sound case could be made for an alternative ranking that includes all big law firm employment without any discount.
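To restate the arithmetic behind each row, here is a minimal Python sketch; the helper names are mine, and the uniform discount is the rough across-the-board adjustment described above.

```python
# Minimal sketch of the elite-outcomes metric; helper names are hypothetical.
def discount_big_firm(big_firm_jobs, school_funded_share):
    # Apply the school-funded share of all full-time, long-term,
    # bar-passage-required jobs as a uniform discount, since firm sizes
    # for school-funded positions are not reported.
    return big_firm_jobs * (1 - school_funded_share)

def elite_rate(graduates, big_firm_discounted, federal_clerks):
    return (big_firm_discounted + federal_clerks) / graduates

# Pennsylvania's row: 180 jobs at 101+ firms discounted to 175.1 (implying a
# school-funded share of roughly 2.7%), plus 28 federal clerkships, among
# 270 graduates.
print(f"{elite_rate(270, 175.1, 28):.1%}")  # 75.2%
```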