The coming reckoning for non-JD legal education

As JD enrollment falls and non-JD enrollment rises at law schools, a dramatically larger share of legal education is focused on a non-JD student body. It's worth considering what non-JD legal education looks like, where it's going, and what the future may hold. It's a story of some unusual and under-discussed factors that portend a coming reckoning. (And this assumes demand remains fairly strong--recent reports suggest other countries may begin to cut back on sending students to the United States for education if a trade war begins, or if immigration and international travel priorities change.)

As I've noted before, one in ten students enrolled in law schools in the United States is not part of a JD program, a share likely to continue to rise:

The American Bar Association defines three categories of non-JD degrees: "academic masters degrees for non-lawyers," "post-JD law degrees for practicing lawyers and/or foreign lawyers seeking to practice in the United States," and "research and academic-based doctorate level degrees." The second category, usually LLM degrees, has historically been the largest contingent (at least according to conventional wisdom), and the first category is among the fastest growing (again, at least according to conventional wisdom).

The ABA does not accredit non-JD programs. Instead, the ABA's task is limited to "acquiescence" in a new program. The ABA imposes some rather onerous regulations that schools must meet for the JD program, but it offers no guarantee or review of the non-JD programs, except for very limited purposes:

ABA accreditation does not extend to any program supporting any other degree granted by the law school. Rather the content and requirements of those degrees, such as an LL.M., are created by the law school itself and do not reflect any judgment by the ABA accrediting bodies regarding the quality of the program. Moreover, admission requirements for such programs, particularly with regard to foreign students, vary from school to school, and are not evaluated through the ABA accreditation process. The ABA reviews these degree programs only to determine whether their offering would have an adverse impact on the law school's ability to maintain its accreditation for the JD program. If no adverse impact is indicated, the ABA "acquiesces" in the law school's decision to offer the non-JD program and degree.

I sadly must qualify the statements above as "according to conventional wisdom" because, as noted, the ABA does not collect data on or evaluate matters like incoming student metrics or the outcome performance of non-JD graduates. To do so might be a challenge, of course, given the variety of programs that offer quite different things.

But I'll focus on one particular kind of degree to start: the "post-JD law degree" for "foreign lawyers seeking to practice in the United States." In 2015, there were 6,529 bar exam test-takers (including repeaters) who attended law school outside the United States. Nearly three-quarters of them (4,754, or 73%) took the New York bar. Combined with the 1,142 who took the California bar, over 90% took one of these two states' bar exams.
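For the numerically inclined, here's a quick sketch checking those shares (the counts are the NCBE's; the snippet is just arithmetic):

```python
# Shares of foreign-educated bar exam test-takers in 2015, from the counts above.
total_foreign = 6529   # all test-takers educated outside the United States
new_york = 4754        # took the New York bar
california = 1142      # took the California bar

print(f"New York share: {new_york / total_foreign:.1%}")                  # 72.8%
print(f"NY + CA share:  {(new_york + california) / total_foreign:.1%}")   # 90.3%
```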

In both of these states, and in most others, bar exam test-takers must have additional education at an ABA-approved law school. New York sketches some basic requirements for LLM programs to qualify an individual for the bar, as does California.

But note the gap between the ABA and the state bars: the ABA does not actually accredit these programs or "reflect any judgment" "regarding the quality of the program." Yet the state bars condition bar eligibility for foreign graduates on securing a degree at an ABA-accredited school--even if the degree itself is not approved by the ABA.

This, of course, means that any of the typical factors one would consider in an accreditation process--including admissions standards, or quality control measures for graduation, like bar pass rates or employment outcomes--do not exist for such programs. Of course, the Department of Education, or other accrediting bodies, may have other things to say about such programs. But it means that there are two sets of programs operating out of ABA-approved law schools: ABA-approved JD programs, and ABA-"acquiesced" non-JD programs.

If one examines the cumulative bar pass rates of non-US law graduates--most of whom have been required to complete a program at an ABA-accredited law school--and compares them to the pass rates of ABA graduates, the results are quite striking. The overall bar pass rate for ABA graduates has been in decline for several years, drifting from 74.3% in the February & July administrations in 2011 down to 64.4% for the administrations in 2015. (These test results include all test-takers, including repeaters, those who took multiple states' bar exams, those who were not recent law school graduates, and test-takers in United States territories.) But those educated outside the United States--almost all of whom secured a degree from a program at an ABA-accredited school--now sit at a meager 28% overall pass rate, a slight decline in recent years. (UPDATE November 19: see below.)

(It might be worth noting that New York's 68% pass rate is fairly close to the 64% overall pass rate, and New York's 31% pass rate for non-US law graduates is likewise fairly close to the 28% overall rate for non-US-educated test-takers. That's despite California, with its lower-than-average pass rates, making up an unusually large share of non-US law graduate test-takers.)

Of course, non-US attorneys are still just a sliver of overall bar exam test-takers, particularly because they are concentrated in just two jurisdictions. The chart to the left shows the total number of test-takers in these categories.

Perhaps, of course, bar pass rates should not be the touchstone for accrediting bodies. And perhaps the incentives are quite different in reviewing such programs.

But it is hard to believe that attention won't shift toward the non-JD market, particularly as it grows in a semi-unregulated fashion. Perhaps the consumer-advocacy interests are different for those who are already attorneys in another country seeking to study in the United States, or for non-JD degree-seekers who do not intend to take the bar exam. Only time will tell whether a reckoning is coming.

Display note: I did start the y-axis for non-JD percentage at a non-zero number to avoid excessive white space, but as it displays relative changes in value as a percentage, I think it is not terribly deceptive.

UPDATE November 19: A careful reader wondered about the evidence behind this claim. It's worth referring to a 2014 NCBE "Bar Examiner" report on foreign lawyers who took the New York bar. 75% of them had completed an LLM, and 25% had completed programs abroad that met the New York requirements (e.g., of similar duration and based on English common law). It includes some other breakdowns about the countries of origin of these students and their pass rates based on that country.

Total LSAT test-takers remain steady for 2016-2017, a fourth consecutive admissions cycle

Last year, I noted that we have seen fairly steady numbers of LSAT test-takers, at least at the top-line level. That looks to hold true this cycle. In soon-to-be-disclosed statistics, October (well, September/October) 2016 test-takers were up 1.0% over October 2015, as 33,563 took the test last month. First-time test-takers were down slightly, off 0.4%.

Below is a visualization of cumulative test-takers who took the June and October administrations of the LSAT. These have consistently been about 55% of the total LSAT test-takers in an admissions cycle (the remaining 45% take the test in December or February).

For the fourth straight year, cumulative LSATs have hovered between 53,000 and 57,000. (This includes American and Canadian test-takers, first-timers and repeaters.) It's hard to resist the temptation to call this level the "new normal."

Why is the ABA still accrediting law schools?

Bear with some meandering musings below....

To grossly oversimplify the history of the American Bar Association accrediting law schools, it looks something like this.

About a hundred years ago, just about anyone could take a bar exam, as long as they had studied with a lawyer or "read law" for a time. Some attended law school, but it was not required. By the 1930s, states began making it more difficult to pass the bar exam--presumably in part to reduce competition for existing lawyers and to make the profession more difficult to enter. (Presumably, too, with another, at least salutary, benefit of increasing the "quality" of those practicing law.)

The ABA today describes its accreditation standards as "minimal educational requirements to qualify a person for eligibility to sit for the bar examination." As state bars became more exclusionary, they began to adopt minimum standards--driven, perhaps, by the ABA itself, which became the sole accrediting body in most jurisdictions. The bar required attendance at an accredited law school; the accreditation process was designed to ensure that legal education met standards the ABA believed would constitute a "sound program" of legal education.

All this is actually quite descriptive and lacks any normative explanation. Why should there be certain standards for legal education that must be met before someone takes the bar exam?

What is the difference between legal education and the bar exam?

It might be that we have legal education because we believe that attorneys should be, somehow, perhaps, well-rounded and well-educated individuals, apart from their ability to pass the bar exam. That would seem to be the driving concern--we think (perhaps you don't, but work with the assumption) lawyers shouldn't just be able to pass the bar and practice law; they should have some kind of training and background before they practice law and something that qualifies them apart from the bar exam's test of "minimum competence."

The ABA has a near obsessive focus on the picayune details of how a law school functions, including the types of books the law library maintains. Many of the ABA standards are fairly generic, requiring things like "suitable" classrooms, "sufficient" space for staff, and "sound" admissions policies. Ensuing interpretations often add specific guidance to these generic standards, which drive a great deal of law school decision-making. But these are all designed to elevate the educational experience, quite apart from the ability to pass the bar exam.

Many of these standards, of course, suffer from serious deficiencies. For matters like the books in a library, they reflect antiquated notions about access to print materials from an era when books were scarce. Today, not only are books plentiful, but the resources attorneys principally use are electronic. Some standards are the result of bargains with entrenched interests within the ABA rather than of any empirical or quantifiable pedagogical benefit.

But that can be set aside for the moment--the ABA may have a goal of providing some kind of quality education for all prospective attorneys before they take the bar, even if it pursues that goal ham-handedly.

But there is a different, perhaps reverse, form of the question: if legal education provides students with three years of sound education and a degree at the end, why is the bar exam even needed? Isn't graduating from a law school after three years of thickly-regulated education sufficient to make one eligible to practice law? Indeed, it's a reason why the state of Wisconsin offers "diploma privilege" to graduates of its two law schools.

The opening question, then, is really to determine the purpose of the bar exam and the purpose of accreditation of law schools.

To recap, the bar exam has perhaps less-than-noble purposes (such as limiting the pool of attorneys) and some perhaps good purposes (such as establishing minimum competence to practice law, however imperfectly the bar exam may establish it).

Legal education, in contrast, is, I think, designed to offer something beyond simply establishing "minimum competence." It, perhaps, and perhaps ideally, offers students an opportunity to learn about the law in a more systematic way than reading law might have permitted. That, of course, comes at a high cost for prospective attorneys who must invest (typically) three years of education and a substantial sum of money to achieve the diploma required to take the bar.

Therefore, I think it would be fair to say that legal education is providing something distinct from the bar exam. (Whether the accreditation process is a proper assessment, and whether accreditation should be required, are separate concerns.)

If legal education is providing something distinct from the bar exam, then why are new accreditation standards focusing on the bar exam?

So, why is the ABA still accrediting law schools given its new obsession with the ability of graduates to pass the bar exam?

Most of the rest of the ABA's accreditation practices focus upon the terms of education. It is, in theory, providing something apart from the bar exam. Now comes the new ABA standard, on track for approval, which provides, quite flatly: "At least 75 percent of a law school's graduates in a calendar year who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation." An earlier malleable standard has become a clean rule. It also threatens a number of schools currently in non-compliance, a problem likely to get worse as bar pass rates continue to decline.

What value is the ABA adding if its newest, most stringent control is simply redundant of, well, the bar exam itself? Why have accreditation at all?

It seems a bit self-referential to say that a law school cannot send its graduates to take the bar exam until enough of its graduates pass the bar exam--particularly as the entire point of legal education, as I've suggested, is to provide something apart from "minimum competence."

After all, it would be pretty simple stuff for state bars to simply disclose the pass rates of all institutions whose graduates take the bar exam. Then consumers could know if their school met the standards that they expected or desired when attending law school.

But perhaps the consumer-protection focus of recent years has driven this result. Legal education is not really seen as offering something "apart from" anymore. It is instead deemed more a necessary and costly hurdle before taking the bar exam. And if law school graduates are going through this costly and time-consuming process but are unable to pass the bar exam, then law schools' function is greatly diminished.

There are two principal, and opposing, kinds of responses one could make to my query.

First, I suppose one could make the claim that if a law school is not providing "minimum competence" to its graduates, then it is hardly providing the kinds of aspirational traits legal education purports to provide and should not be accredited. That's, I think, somewhat misguided. The bar exam is not really very well designed to test "minimum competence." Indeed, it's not really very well designed to test the abilities of lawyers. Timed, closed-book essays that principally rely on regurgitating black-letter law (indeed, often greatly simplified, even fictitious versions of the law), alongside a series of multiple-choice questions, in selected areas of practice designed for general practitioners arising from a common-law, 1970s-era form of the law, are not really something that should be taught in law school--at least, not emphasized.

In reality, the problem is not that law schools are failing to train their graduates with the "minimum competence" needed to practice law. Or, even, the ability to pass the bar. It is that many are accepting, and then graduating, students who are unable to acquire the skills needed to pass the bar--because they are incapable of doing so, or because they are unable to translate three years of legal education into the licensing regime of bar test-taking, or because they have been prioritizing other things, or whatever it may be.

This is, I think, a subtle point. It is tied more closely to admissions and graduation practices, and to post-graduation study habits, than to legal education itself. That is, of course, because legal education is supposed to be providing something other than bar prep, and has been doing so for decades. So the decline in bar pass rates is not really a problem with "education" in the sense of the time in the classroom for three years. It is about other things.

Second, one could say that the ABA needs to have some standards for accrediting law schools, and this is as good a standard as any to help control the admissions and graduation problems that may be contributing to bar pass rate declines. But this, again, gets back to my opening question--why have accreditation at all?

If law schools aren't in the business of bar exam training (and I don't think they should be), we should still expect that law schools are graduating students who are able to use their professional degrees in the practice of law. If schools are failing in that endeavor, stripping accreditation is certainly a way of penalizing them.

But it all seems quite circuitous, given that we could just permit students to take the bar regardless of their legal education history--as long as they establish that they have the "minimum competence" to take a licensing exam, they could practice law. And if some want to attend law school to secure a credential for future employers that says, "I've attended this law school and have some additional training that establishes something beyond minimum competence," they could do that, too.

And this points back to the purposes of requiring attendance at an accredited law school in the first place. You see, my problem isn't necessarily that the ABA wants to ensure that law schools are graduating students who are able to pass the bar exam and become licensed practicing attorneys. It is, instead, that if the bar exam is our principal concern, and that principal concern is wholly independent of legal education, and accrediting bodies are now evaluating legal education based on performance on this principal concern... doesn't that suggest that the accreditation process of legal education is, perhaps, its own problem now?

Concluding thoughts

If you've survived through this meandering, it's worth considering what legal education should be. Perhaps it should still try to provide something different from the "minimum competence" required to pass the bar exam.

But as some law schools have departed from practices that may best benefit their graduates--particularly through high tuition costs, entrenched and inflexible standards, and declining control over the quality of admissions, retention, and graduation practices--it may be the case that we have forgotten what law school ought to be. Its purposes have been lost as we consider it a kind of necessary rite of passage before one takes the bar exam. In this instrumental vein, distrustful of the operation of law schools, the accreditation process should look mostly at the outputs of graduates.

I don't think that's a welcome development, either for the accreditation process or for the telos of legal education. But it's perhaps the necessary evil that has come upon us, until either schools change their practices or the market improves dramatically. Even then, it will be hard to separate legal education from the bar exam, and that difficulty speaks more to why the ABA is still accrediting schools in the first place--and to why state bars require legal education before the bar exam.

Despite improvement in MBE scores, bar exam pass rates appear to be falling yet again

I blogged earlier about the slight improvement in MBE scores, which, I thought, might lead to an increase in overall bar pass rates. But I was in wait-and-see mode: while MBE scores typically track bar pass rates, we needed results from individual jurisdictions to know whether that would hold this year.

It appears, however, that even though MBE scores rose, bar exam pass rates are declining again.

I continue to track overall pass rates (rather than first-time or first-time ABA rates) because that's often the only data many jurisdictions disclose--but first-time pass rates are often much better. These are most of the jurisdictions that have publicly disclosed overall pass rates.

Bar pass rates have improved slightly in a couple of jurisdictions--Kansas rose from 76% to about 79%, and West Virginia from 69% to about 71%. But there are some declines--and some fairly significant ones--elsewhere. Missouri's fell from 84% to 79%--it was 91% in 2012. Indiana's dropped 13 points, from 74% to 61%. Iowa's dropped 15 points, from 86% to 71%--it was 90% in 2012.

Assuming all things are equal, an increase in the mean MBE score should have meant an increase in bar pass rates. So why are we seeing declines in so many jurisdictions?

My initial theory--unsupported by any evidence!--is that the best students were sufficiently worried about the bar exam and studied more than ever. That would mean the scores of people already inclined to pass the bar improved--which wouldn't have any impact on pass rates. It would shift up the mean MBE score without affecting the overall pass rate. And if the quality of the students law schools have been graduating has continued to decline, then we might expect overall pass rates to decline.
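A toy simulation illustrates the mechanism (a sketch with an assumed score distribution and a hypothetical cut score, not actual MBE data): if only test-takers already above the passing threshold improve, the mean rises while the pass rate does not budge.

```python
import numpy as np

# Toy illustration, not actual MBE data: scores drawn from an assumed
# normal distribution loosely resembling the MBE scale.
rng = np.random.default_rng(0)
scores = rng.normal(loc=140, scale=15, size=100_000)
cut = 135  # hypothetical passing threshold

# Suppose only those already above the cut score study harder and gain points.
boosted = np.where(scores > cut, scores + 2, scores)

print(f"mean score: {scores.mean():.1f} -> {boosted.mean():.1f}")
print(f"pass rate:  {(scores > cut).mean():.1%} -> {(boosted > cut).mean():.1%}")
# The mean rises, but the share above the cut score is unchanged.
```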

It's also possible that these jurisdictions are outliers and we'll see improvement in pass rates in places like New York and California. (The small decline in the pass rate in Florida, however, is not a good sign on this front.)

In short, there was some excitement and enthusiasm about the uptick in the MBE scores. But if that uptick isn't translating into improved bar pass rates, law schools need to be seriously considering responses to continued declines in the months ahead.

(It's worth noting that I chose a non-zero y-axis to demonstrate the relative changes in performance; overall, a majority of test-takers continue to pass the bar in each jurisdiction.)

Law school graduates are clerking for federal judges at a (mostly) steady rate

Today is the day after Labor Day, which, in an earlier era, was a kind of holiday for third-year law students, who would send materials to federal judges hiring under "The Plan." After that cartel failed, federal judges began hiring at their own pace--often, ever earlier.

An ongoing question about this practice lingers: do judges prefer recent graduates, or do they prefer clerks with some work experience? Because clerks with work experience were not subject to The Plan, were they a more popular choice until The Plan died? Or has there been a trend toward hiring clerks with some work experience?

When I looked at the data two years ago, it looked like there was no significant trend over four years. With another two years' worth of data, I thought I'd check again. The following totals are from the ABA employment summary reports and include all full-time, long-term (which includes one-year positions) positions as federal judicial clerks. (Please note that "federal" is undefined in the ABA guidelines--it might include magistrate judges, Article I courts, and other miscellaneous positions).

There's been a small downward trend in the total number of graduates placed into federal clerkships, but only time will tell if it's a real trend or just a little noise. One reason may be credentials--as there are fewer incoming law students, there are fewer graduates of elite schools, or fewer graduates who possess the credentials a judge may want. If judges choose not to dip lower into a graduating class, they may be inclined to move toward clerks with work experience. On a percentage basis, the 1,188 clerks from the Class of 2015 represent a much higher rate than the 1,259 from the Class of 2013, simply because the total number of graduates has shrunk significantly--from 46,116 in the Class of 2013 (2.73% employed as federal clerks) to 39,418 in the Class of 2015 (3.01% employed as federal clerks).
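The percentage arithmetic, for the skeptical (totals are from the ABA employment summary reports cited above):

```python
# Clerkship placement rates from the ABA employment-summary totals above.
classes = {"Class of 2013": (1259, 46116), "Class of 2015": (1188, 39418)}

for name, (clerks, grads) in classes.items():
    print(f"{name}: {clerks} of {grads} graduates = {clerks / grads:.2%}")
# Class of 2013: 2.73%; Class of 2015: 3.01%
```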

Finally, the data has far more potential noise than just the concerns listed above. If there is an increase in new judges, or judicial vacancies, that changes the demand for all clerks, across both new graduates and clerks with work experience.

Why weren't bar exam pass rates an existential crisis in the 1980s?

I blogged about the small improvement in Multistate Bar Exam ("MBE") scores in the July 2016 administration of the test. We won't know first-time pass rates for graduates of ABA-accredited law schools for some time, but it's fair to assume we should see a small improvement nationally.

The drop in test scores--likely caused in part by a decline in the quality of the applicant pool over the last several years--has caused quite an uproar, particularly as the ABA considers clamping down on schools with relatively low pass rates.

But if you look at MBE scores from the 70s and 80s, the peak time for Baby Boomers to be completing legal education, you'll notice that the scores are fairly comparable to those of the last two years.

So if MBE scores look a lot like they did back then, why is there such a commotion about them? Perhaps a few reasons.

First, expectations have changed. Gone are the days with the mythic "look to your left, look to your right" fears of dismissal. There is an expectation that virtually all law school enrollees complete their JD, and another expectation that those who secure the JD and take the bar will pass the bar. The challenges are no longer deemed to be the failure rates in law school or the bar exam, but in the process that one must "survive." A dip in bar pass rates upsets existing expectations.

Second, the fiscal consequences have changed. Indebtedness of students at graduation is quite high (for many), especially when law school loans are coupled with undergraduate loans (and sometimes credit card debt). Indebtedness has outpaced inflation over the decades. For students pressed with this debt, the prospect of failing the bar exam--and likely delaying or losing job opportunities--is more significant.

Third, the bar looks different today, and pass rates may differ despite similar MBE scores. Perhaps the sample of MBE jurisdictions was just a little smaller in the 1980s--and perhaps those jurisdictions had disproportionately lower pass rates. There's little question that states administering their own bar exams (like Louisiana) can have more inconsistent results. Consider a recent example from Oklahoma, which adopted the Uniform Bar Exam and saw pass rates plunge so significantly that it modified the passing score. Perhaps, then, the MBE score in the 1980s was not as indicative of overall pass rates--but that's more a gap in my data than anything else.

It's good to see the MBE score improve slightly. In an absolute sense, unfortunately, the pass rates will not approach what they were a few years ago. But while bar pass rates are historically low, the history is worth reflecting upon.

July 2016 bar exam scores improve slightly but remain near all-time lows

The good news for recent law school graduates? The July 2016 pass rates for test-takers will likely increase slightly nationally. As Deborah Merritt recently shared, the mean Multistate Bar Exam score rose from 139.9 in July 2015 to 140.3 in July 2016. Professor Merritt offers a few reasons why scores improved slightly, and I won't add to her good thoughts (though other possible reasons may come to light later!).

Given that bar exam scores hit a 27-year low last July, this is surely good news--particularly as the incoming predictors of law students across the nation continued to decline between the entering classes in 2012 and 2013. But there is an important matter of perspective: a 140.3 is still near all-time lows.

The July 2016 score is the second-lowest since 1988 (the lowest being July 2015), and still well off the mark of even the July 2014 score, much less the July 2013 score. In an absolute sense, the score is not good. Indeed, while modest improvements in the bar passage rates in most jurisdictions will be good news for those passing students and for law schools looking for any positive signs, they will not approach the pass rates of three or four years ago.

We should see pass rates from states like North Carolina and Oklahoma soon. As the fall wanes, we'll see more individual jurisdictions and more results from specific schools. And perhaps we'll see if dramatic changes occur in a few places or at a few schools--or whether the change is small and relatively uniform everywhere.

Year-over-year LSAT test-takers flat as the new normal continues

Last year, I noted that the reports about the number of LSATs administered are only a partial picture of the present state of the law school admissions cycle. The latest numbers on the June 2016 LSAT are no different.

About 23,000 tests were administered, down 0.8% from last June. But the LSAC reports, circulated via PDF and not available on its website, tell a little more.

First-time test-takers in the United States declined slightly less than that, down 0.6% from last June, close to the 0.8% overall decline reported in the top-line results. Repeaters were up slightly, 1.4% over last June. The real decline occurred in Canada, which saw a 15.7% drop in first-time test-takers.

For those who anticipated that law school applicants had "bottomed out," it appears that it's more a "new normal," as I've suggested before. The bottoming out does not appear to mean that law schools will experience a new upswing in applicants, a rebound to previous levels. Instead, it reflects another year of a rather flat market. And it's a sign that temporary structural changes instituted at many law schools will need to become more permanent to reflect this reality.

LSAC strikes back with bizarre charge in latest war over LSAT

The Law School Admission Council ("LSAC") administers the Law School Admission Test ("LSAT"). It also administers the admissions service that all law schools use to admit applicants. Recently, it was not very happy that Arizona Law planned on admitting some students using the Graduate Record Examination ("GRE") in lieu of the LSAT. It threatened to expel Arizona from LSAC. A number of law school deans pushed back. And LSAC backed down. But it chose another way of threatening law schools--one with a bizarre line of attack.

A few preliminary matters. It's not obvious to me that the GRE is comparable to the LSAT in predicting law school performance. But the LSAT is an imperfect tool at best, and even if the GRE is slightly more imperfect, it may well be better at some schools for a handful of applicants. Indeed, I imagine the correlation between GRE and LSAT scores is quite high. And there are obvious advantages to using the GRE--it expands the pool of prospective law students to a more generalized pool of students interested in a degree after the bachelor's degree. And, perhaps more cynically, incoming students with a GRE score do not have an LSAT score, which is a significant category in the U.S. News & World Report ("USNWR") rankings. All that aside, I think there's no question schools should be permitted to--and should--innovate with admissions criteria.

Further, LSAC has been diluting the value of the LSAT for some time. It reports the highest LSAT score of an applicant rather than the mean of all scores, which is the more accurate measure of success. It has agreed to stop "flagging" accommodated test-takers, whose scores are not as reliable a measure of law school success.

Additionally, schools have already been looking toward other measures besides the LSAT as a basis for admitting students. A few schools offer select programs that admit students from their own undergraduate institutions, with no need for an LSAT score, if they've achieved a high enough GPA.

Given all these changes, the LSAT is not a great measure of the quality of an incoming class right now--and the median LSAT score in particular is not, even though USNWR uses it anyway.

But LSAC has decided otherwise. It used to provide reports about each school's 25th, 50th, and 75th percentile LSAT scores to the ABA. It will no longer do so. It explains, "Given the current uncertainty about the Section's position on the use of admission tests other than the LSAT, and the current or potential use by some law schools of admission tests other than the LSAT, we no longer believe that this goal can be met."

Pause and reflect on these remarks. Recall that, so far, only Arizona Law has overtly expressed a desire to use an alternative test (although others are considering it). Recall, too, the many, many present weaknesses in the LSAT scores, some of which have literally been caused by LSAC itself.

It's something of an absurd statement from LSAC. It reflects not a genuine concern about its data but a preemptive strike against the ABA or any law schools that intend to use other admissions tests. It is also a rather unsophisticated swipe at law schools--and a risky one, should antitrust claims ever arise against an organization holding such a powerful grip on law school admissions.

This is hardly the first bizarre letter from LSAC this cycle, as my colleague Rob Anderson has explained over a 2015 letter concerning the usefulness of LSAT scores.

But I thought I would highlight the disingenuous attack on the use of the GRE. While I'm inclined to agree that the GRE has not been demonstrated to be as predictive, the non sequitur that this one thing renders LSAT scores at law schools essentially meaningless is, I think, not accurate.

(UPDATE Sept. 25, 2016: LSAC has decided to withdraw its threat for one year.)