Fictional Attorney of the Month: Evangeline Whedon

Evangeline "Vange" Whedon is an attorney in the Marvel Comics Universe. She was a successful prosecutor until she was discovered to be a mutant--when she comes into contact with blood, she has the power to shapeshift into a red dragon.

After the discovery, she lost her job as a prosecutor and was estranged from her family. She now represents the X-Men in a variety of legal battles (no doubt illustrating the versatility of an ideal legal professional).

In one case, for instance, she helps an anti-mutant terrorist secure a reduced sentence of probation as the X-Men give the terrorist a second chance and hire her to work for them. Vange has also served as in-house counsel for the X-Corporation, a support agency for mutant populations around the world. She helps with a number of family law disputes to ensure that child mutants whose parents disown them have new legal guardians. In one critical legal dispute, she ensures that a child mutant in the custody of the X-Men remains in their custody after the child's parents, who had previously given him away, engage in a custody dispute.

Not all attorneys can shapeshift into dragons, but it's this versatility that sets her apart. Vange Whedon is this month's Fictional Attorney of the Month.

Three charts to illustrate the present market for law schools

We often read about the "crisis" in legal education and the "drastic" steps that law schools are taking. All that being said, I'm actually surprised that law schools are taking such modest steps in the face of fairly long-term declines. Given current trends, the Class of 2018 is likely to be an even smaller group of students than the 40-year low of the Class of 2017. And even if this is the bottoming out, a rebound would likely not return law schools to any sense of "normalcy" until 2020. Yet we see very few schools reacting with the serious, long-term focus one might expect.

Below are three charts illustrating the total LSAT takers, total JD applicants, and total JD matriculants from 2004-2014, with a projection for 2015 (i.e., the Class of 2018) based on presently-available data. (Data derived from LSAC and ABA resources.)

Ranking the Law School Rankings, 2015

On the heels of the first-ever ranking of law school rankings, and last year's second edition, here's the third edition.

The rankings tend to measure one of, or some combination of, three things: law school inputs (e.g., applicant quality, LSAT scores); law school outputs (e.g., employment outcomes, bar passage rates); and law school quality (e.g., faculty scholarly impact, teaching quality). Some rankings prefer short-term measures; others prefer long-term measures.

Lest anyone take these rankings too seriously, there is no inherently rigorous methodology I use. It's largely my idiosyncratic preference about what rankings I think are "better" or "worse."

And, as always, I'll decide what rankings to rank. I've removed a couple and added a couple. The year listed is the year the ranking was last updated (not the self-described year of the ranking).

1. NLJ 250 Go-To Law Schools (2014): It's a clear, straightforward ranking of the percentage of graduates from each school who landed a position at an NLJ 250 law firm last year. It does not include judicial clerkships, or elite public interest or government positions, but it is perhaps the most useful metric for elite employment outcomes. As a methodological point, only 178 firms answered the survey, and NLJ relied on its database and independent reporting to supplement. To its great advantage, it includes many interactive charts of the data it has.

2. Sisk-Leiter Scholarly Impact Study (2012): The study has not been updated in a few years, but it's still useful for what it does. Drawing upon the methodology from Professor Brian Leiter, it evaluates the scholarly impact of tenured faculty in the last five years. It's a measure of the law school's inherent quality based on faculty output. In part because peer assessment is one of the most significant categories for the U.S. News & World Report rankings, it provides an objective quantification of academic quality. Admittedly, it is not perfect, particularly as it is not related to law student outcomes (of high importance to prospective law students), but, nevertheless, I think it's a valuable ranking.

3. Princeton Review Rankings (2014): Despite a black box methodology that heavily relies on student surveys, the series of rankings gives direct and useful insight into the immediate law school situation. It is admittedly not comprehensive, which I think is a virtue.

4. Above the Law Rankings (2014): The methodology is heavily outcome-driven (and perhaps driven by an outcome in mind). It relies on a very narrow "employment score" (full-time, long-term, bar passage required, excluding solo practitioners and school-funded positions). It conflates "tuition" with "cost," and it relies heavily on a couple of narrow categories (e.g., Supreme Court clerks). But it's a serious and useful ranking.

5. Enduring Hierarchies in American Legal Education (2013): Using many metrics, this study evaluates the persistence of hierarchies among law schools. Little has changed over the last several decades in which law schools are deemed high quality. This study tries to identify the traits that sustain the hierarchies, and it categorizes the schools into various tiers.

6. Law School Transparency Score Reports (2013): It's less a "ranking" and more a "report," which means it aggregates the data and allows prospective students to sort and compare. The data is only as useful as what's disclosed--and so while it provides some utility, it's limited by the limited disclosures.

7. Witnesseth Boardroom Rankings (2014): Professor Rob Anderson's analysis is extremely limited: it evaluates which law school graduates end up as directors or executive officers at publicly held companies. But I think it gives a nice data point in an area that's under-discussed: law school graduates, after all, may find success in business and not simply in the practice of law.

8. Roger Williams Publication Study (2013): It selects a smaller set of "elite" journals and ranks schools outside the U.S. News & World Report "top 50." There are a few issues with this, as it relies on a fixed data set of "top 50" journals established years ago, and as it hasn't been updated in a couple of years, but, given its narrow focus, I think it does a nice job filling in some gaps left by the Sisk-Leiter study.

9. AmLaw BigLaw Associates' Satisfaction (2014): It surveys associates for how well their law schools prepared them for firm life. It highly correlates with job satisfaction. It's a nice, small post-graduate measure of law schools.

10. PayScale Rankings by Mid-Career Salary (2014): While this survey mixes all graduate schools together, and while it has some obvious selection bias in the reported salary data, it's another rare ranking that attempts to evaluate mid-career employment outcomes--an under-evaluated area, which makes this study worth considering.

11. QS World University Rankings (2014): I think this ranking tends toward comparing apples, oranges, kumquats, rhododendrons, and lichen: all living things, but extremely hard to compare. But its use of h-index and citations per paper increases the objectivity of this academic-driven ranking.

12. SSRN Top 350 U.S. Law Schools (2015): The total new downloads give you an idea of the recent scholarship of a faculty--with an obvious bias toward heavy-hitters and larger faculties.

13. U.S. News & World Report (2014): I've said before that this ranking isn't ranked so low here because it's so bad. Over time, I've concluded that, no, it is ranked this low because it is bad. It relies heavily on a few metrics that do little to measure anything meaningful. It distorts student quality by incentivizing pursuit of the median LSAT and UGPA at the expense of all other quality factors, especially the bottom quartile of the class; it rewards silly categories like high spending and library resources; it prints metrics unrelated to its ranking formula; its "lawyer/judge assessment score" has a notoriously low response rate; peer academic assessment scores have deflated over time as schools sandbag one another; and so on. Yes, these rankings are exceedingly influential. But they are pretty poor. They may mostly get the "right" results, but for all the wrong reasons.

14. Tipping the Scales (2015): The metrics are simply a bit too ad hoc--and that's saying something, coming in behind U.S. News & World Report. The factors are idiosyncratic and, while they reflect a superficial appreciation of things like student quality and outputs, the measures used (salary data, which is inherently bimodal and notoriously underreported; acceptance rates, which are not uniform indicators of quality; etc.) do not seriously capture those things.

15. PreLaw Magazine Best Law School Facilities (2014).

16. GraduatePrograms.com Top Law Schools for Social Life (2014).

Everything you need to know about Hickenlooper v. Kerr, the Guarantee Clause case before the Supreme Court

Tomorrow, the Supreme Court will consider a petition for a writ of certiorari in Hickenlooper v. Kerr. Colorado legislators challenged an enacted ballot initiative that prohibited legislative tax increases from taking effect without a popular vote, arguing that it violated the Guarantee Clause. A federal district court, and the Tenth Circuit, agreed that the legislators had standing and that the Guarantee Clause claim was justiciable.

I started tracking this matter over a year ago. I provide the background in these links; below that, I'll discuss the briefs in the case that the Court will consider.

Several amicus briefs were filed in the case, available at SCOTUSBlog. Of note (and these are very brief summaries of the major arguments):

  • The Colorado Union of Taxpayers Foundation, the Mountain States Legal Foundation, and 22 Colorado state legislators filed a brief in support of the petitioner. They focused primarily on the fact that respondents' injury was abstract, because legislators never enacted a tax increase for the people to vote upon--instead, they simply alleged a dilution of legislative power. That cannot comport with existing standing doctrine. Only if the Colorado legislature enacted a tax increase, then saw the people reject it, would standing exist.
  • The National Federation of Independent Business, along with several policy institutes, filed a brief in support of the petitioner. They emphasized the breadth of the impact of a finding that such a case is justiciable, because the decision invites judicial invalidation of direct democracy in a number of states on matters ranging from marijuana legalization to charter schools. They also noted that in the partisan gerrymandering context (Vieth v. Jubelirer), the Supreme Court has essentially required an articulation of judicially-manageable standards before the case could proceed. Here, the district court insisted (in a rather bizarre fashion) on holding a trial to determine what the Guarantee Clause demands.
  • The Center for Constitutional Jurisprudence (with John Eastman) filed a brief in support of the petitioner. It focused upon the inability of the Tenth Circuit to distinguish existing precedent finding the Guarantee Clause usually non-justiciable. Regardless, the case presents a good vehicle for clarifying the language in cases like New York v. United States (1992) suggesting that the Guarantee Clause may be justiciable, and articulating that the standards for justiciability are not met in this case.
  • Texas, joined by five other states, filed a brief in support of the petitioner. They argue that the text of the Guarantee Clause protects not the state legislature, but the people and the States. They also cite other provisions, like line-item vetoes and supermajority voting requirements, that may be called into question if this case is found justiciable.
  • The Cato Institute (with Ilya Shapiro) and several other policy institutes filed a brief in support of the petitioner. It asked the Court to avoid addressing the issue of whether the Guarantee Clause is per se non-justiciable and instead emphasized that Colorado's Taxpayer Bill of Rights met the standard of a "Republican Form of Government," drawing heavily from source material at the founding.

The Supreme Court will consider the case tomorrow--and we'll eagerly await its decision as to whether to hear this case.

Annual Statement, 2014

Site disclosures

Total operating cost: $186.47

Total content acquisition costs: $348.70

Total site visits: 83,706

Total unique visitors: 71,902

Total pageviews: 102,923

Top referrers:
Above the Law (12,676)
Facebook (8,194)
Pajamas Media (4,196)
Twitter (4,091)
ABA Journal (3,738)
Brian Leiter's Law School Reports (3,438)
TaxProf (3,252)
Election Law Blog (1,802)
Reddit (1,096)
The Faculty Lounge (782)
law.uci.edu (573)

Most popular content (by pageviews):
The best prospective law students read Homer (24,095)
Ranking the most liberal and conservative law firms (5,516)
Bar exam scores dip to their lowest level in 10 years (5,241)
Law school microranking: federal judicial clerkship placement, 2011-2013 (4,803)
Ranking the law school rankings, 2014 (4,780)
Increasingly appears NCBE may have had a role in declining MBE scores and bar pass rates (4,201)
Where are they now? Supreme Court clerks, OT2004 (4,152)

Most popular search results (when disclosed by search engine):
law school rankings (75)
excess of democracy (45)
law school rankings 2014 (40)
conservative law firms (16)
affordable law schools (13)
jd advantage jobs (13)
law schools worst yet to come excessofdemocracy (13)

Sponsored content: none

Revenue generated: none

Platform: Squarespace hosted by Peer1

Privacy disclosures

External trackers: one (Google Analytics)

Individuals with internal access to site at any time in 2014: one (Derek Muller)

Fictional Attorney of the Month: Jeff Winger

Community is not a typical comedy on television. It's probably why its quirky humor attracted a rabid but small audience, why NBC fired its creator three seasons in only to rehire him for the fifth, and why it's now canceled only to survive with some version of a sixth season on Yahoo! Screen in the near future as cast members peel off one by one. But Joel McHale's role as disbarred attorney Jeff Winger is just one of the many delightful roles.

Winger was a successful attorney at Hamish, Hamish & Hamlin, but for all the wrong reasons. He successfully persuades a jury to let off his client facing a DUI charge by tying it to 9/11. He helps a stripper escape tax evasion charges by arguing that her profession is actually not-for-profit performance art.

Unfortunately, a fellow attorney reports him to the bar for failing to obtain an undergraduate degree, and he's stuck in community college to get that degree with the least effort possible. Greendale Community College's eclectic mix of students generally rubs the sophisticated (former) attorney Winger the wrong way--much to the delight of all of us.

2014 Fictional Attorneys of the Month

January: Harvey Dent

February: Philip Banks

March: Willie Stark

April: Charles Kingsfield

May: Bob Loblaw

June: The Man of Law

July: John Shepherd

August: Lionel Hutz

September: Amanda Bonner

October: Sydney Carton

November: Barry Zuckerkorn

2013 Fictional Attorneys of the Month

NCBE has data to prove Class of 2014 was worst in a decade, and it's likely going to get worse

I have blogged extensively about the decline in bar pass rates around the country after the July 2014 test. My original take was more inquisitive, and I later discounted the impact that ExamSoft may have had. After examining the incoming LSAT scores for the Class of 2014, I concluded that it was increasingly likely that the NCBE had some role, positing elsewhere that perhaps there was a flaw in equating the test with previous administrations.

The NCBE has come back with rather forceful data to show that it wasn't the MBE (and that my most recent speculation was, probably, incorrect)--it was, in all likelihood, the graduates who took the test.

In a December publication (PDF), the NCBE described several quality-control measures that confirmed it was the test-takers, and not the test. First, on re-takers v. first-time test-takers:

Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years.

I had suggested, based on earlier data from a few states, that re-takers and first-time test-takers performed similarly; but with data disclosed from a much broader dataset, and using the more precise measure of MBE performance, first-time test-taker performance was much worse.
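As a rough sanity check, the two tracked subgroups imply an overall score drop sitting between the two figures the NCBE quotes. A minimal arithmetic sketch, using only the percentages and point drops quoted above, and assuming (purely for illustration) that the untracked 16% behaved like the tracked mix:

```python
# Weighted average of the July 2014 MBE score drops for the two tracked
# groups, as quoted in the NCBE's December publication.
share_retakers, drop_retakers = 0.19, 1.7      # re-takers: -1.7 points
share_first_time, drop_first_time = 0.65, 2.7  # first-timers: -2.7 points

# The remaining 16% were untracked; assume (illustration only) they mirror
# the tracked mix, so renormalize over the tracked 84%.
tracked_share = share_retakers + share_first_time
implied_overall_drop = (share_retakers * drop_retakers
                        + share_first_time * drop_first_time) / tracked_share

print(round(implied_overall_drop, 2))  # → 2.47
```

In other words, on these assumptions the overall decline is dominated by the unprecedented first-time-taker drop, which is the NCBE's point.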

Second, on equating the test:

Also telling is the fact that performance by all July 2014 takers on the equating items drawn from previous July test administrations was 1.63 percentage points lower than performance associated with the previous use of those items, as against a 0.57 percentage point increase in July 2013.

As equating the test is probably the biggest possible flaw on the NCBE's end, it's extremely telling that the equating of specific items on previous administrations yielded such a significant decline, and such a sharp contrast with the July 2013 test.

Third, and, in my view, one of the most telling elements, the MPRE presaged this outcome:

The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50–150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

A steady decline in MPRE scores, then, foretold this problem. This further undermines any notion that ExamSoft or other test-specific factors impacted the outcome; the writing was on the wall years ago. But as few schools carefully track MPRE performance, it might not have been an obvious sign until after the fact.
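The year-over-year MPRE declines can be read directly off the means the NCBE reports; a minimal sketch, using only the figures quoted above:

```python
# Mean MPRE scaled scores reported by the NCBE (50-150 scale).
mpre_means = {2012: 97.57, 2013: 95.65, 2014: 93.57}

# Year-over-year declines in the mean score.
drops = {year: round(mpre_means[year - 1] - mpre_means[year], 2)
         for year in (2013, 2014)}
print(drops)  # → {2013: 1.92, 2014: 2.08}
```

The decline not only continued but accelerated slightly, which is why these scores can be read as presaging the 2014 and 2015 MBE results.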

The NCBE bulletin then points out additional factors that distort student quality: a decrease in quality at the 25th percentile of admitted students at many institutions (i.e., those at the highest risk of failing the bar), the impact of highest-LSAT score reporting rather than average-LSAT score reporting for matriculants (a change embraced by both the ABA and LSAC despite evidence that taking the highest score overstates student quality), and an increase in transfer students to higher-ranked institutions (which distorts the incoming student quality metrics at many institutions). Earlier, I blogged that a decline in LSAT scores likely could not explain all of the decline--it could explain part, but there are, perhaps, other factors at play.

The NCBE goes on to identify other possible factors, ones that may merit further investigation in the legal academy:

  • An increase in "experiential learning," including an increase in pass-fail course offerings, which often means students take fewer graded, more rigorous, "black-letter" courses;
  • A decline in credit hours required for graduation and a decline in required (i.e., often more rigorous) courses;
  • An increased reliance on bar-prep companies over semester-long coursework to prepare for the bar;
  • A lack of academic support for at-risk students as the 25th percentile LSAT scores of matriculants worsen at many institutions.

So, after I waffled, blamed some decrease in student quality, and then began increasingly to consider the NCBE a culprit, this data moves me back to placing essentially all of the focus on student quality and law school decisionmaking. Law schools--through admissions decisions, curriculum decisions, academic support decisions, transfer decisions, reactions to non-empirical calls from the ABA or other advocacy groups, or some combination of these factors--are primarily in control of their students' bar pass rates, not some remarkable decision of the NCBE. How schools respond will be another matter.

Further, the NCBE report goes on to chart the decline in the 25th percentile LSAT scores at many institutions. The declines in many places are steep. They portend some dramatic results--the decline in bar pass rates this year is only the beginning of probably still-steeper declines in the next couple of years, absent aggressive decisions within the present control of law schools. (The admissions decisions, after all, are baked in for the current three classes.)

Coupled with the decline of prospective law students, law schools are now getting squeezed at both ends--their prospective student quality is increasingly limited, and their graduates are going to find it still harder to pass the bar. And we'll see how they respond to this piece of news from the NCBE--I, for one, find the data quite persuasive.

Visualizing the continuing decline of the law school student body, 2014

One of the posts that has had the most staying power on this site was a post and a chart last year, "For legal education, the worst may be yet to come." We can now confirm that the decline continues, and the Class of 2017 is much smaller than previous classes--and that the bottom still has not been reached, given LSAT and applicant trends.

An ABA Journal piece discloses that the total incoming 1L class in 2014 was 37,675, the smallest since 1974, and down from the peak of 52,488 in 2010. Coupled with the declining LSAT data from LSAC, it paints a grim picture for legal education through at least 2017, and likely through 2018.
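To put those enrollment figures in perspective, the drop from the 2010 peak can be computed directly from the two numbers in the ABA Journal piece:

```python
# Incoming 1L enrollment, per the ABA Journal figures.
peak_2010 = 52_488   # peak incoming class
class_2014 = 37_675  # smallest incoming class since 1974

decline = peak_2010 - class_2014
pct_decline = decline / peak_2010 * 100
print(decline, round(pct_decline, 1))  # → 14813 28.2
```

A decline of more than a quarter in four years, with LSAT and applicant trends suggesting the bottom still has not arrived.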