Excess of Democracy

Solving law school admissions; or, how U.S. News distorts student quality

A student's law school application is composed of several parts. It includes a grade point average (GPA) and a Law School Admission Test (LSAT) score. It includes an undergraduate institution and a major or majors. It may include the candidate's sex, race, ethnicity, national origin, work experience, sexual orientation, post-graduate education, military history, religious affiliation, and anything else a school requests or an applicant desires to include (usually in a personal statement). Schools compete for applicants and want to assemble the best class they can. A school's definition of "best" can include a number of factors, but LSAT and GPA are usually two of the most significant.

U.S. News & World Report (USNWR) ranks law schools annually. Almost a quarter of the ranking is based on the median LSAT score and median GPA of the entering class, a measure intended to capture how "selective" the school is.

Professor Brian Leiter has noted elsewhere that LSAT and GPA are "highly manipulable" by law schools. But I'd like to focus on a slightly different area: the distortion of student quality by the reporting of medians.

A student, after all, has both an LSAT score and a GPA. But the USNWR ranking isolates LSAT and GPA. An incoming class under USNWR, then, is no longer the composite of students; it is the composite of LSAT scores and GPAs, each independently evaluated.

At the same time, schools are still trying to accept the "best" students, and they need a metric for identifying the "best." Most schools have an "index" formula, which combines LSAT and GPA into a single number, weighting each differently.
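To make the idea concrete, here is a minimal sketch of what such an index might look like, in Python. The weights are purely hypothetical; each school sets (and rarely publishes) its own, and the school examined below is no exception.

```python
def index_score(lsat, gpa, w_lsat=1.0, w_gpa=10.0, constant=0.0):
    """Combine an LSAT score and a GPA into a single admissions index.

    The weights here are hypothetical. Many index formulas take the
    linear form index = w_lsat * LSAT + w_gpa * GPA + constant, but
    each school chooses its own coefficients.
    """
    return w_lsat * lsat + w_gpa * gpa + constant

# Example: an applicant with a 168 LSAT and a 3.6 GPA
print(index_score(168, 3.6))  # 168 + 36.0 = 204.0
```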

So there's a metric schools use to identify the "best" students, calculated by their indices; and there's a metric that USNWR uses to identify the selectivity of each institution, calculated by isolating LSAT and GPA medians. What happens when the two don't align?

Solving law school admissions

To solve law school admissions, we'll need admissions data. Schools, understandably, don't disclose this data, so we'll use the next best thing: LawSchoolNumbers.com (LSN). (I'm grateful to myLSN.info for help in aggregating LSN's data.)

Prospective students have voluntarily submitted over 250,000 data points to LSN over the last 10 years. Even though the data is self-reported, it's a valuable resource for examining admissions.

I took a large, fairly selective law school as the basis for this examination. (You could probably identify the school easily, but its identity is irrelevant to this analysis.) For fun, here's an animated GIF (from MakeAGif.com) of the last ten years of admitted students (with data from the 2012-2013 cycle through last month).

In law school admissions, there's usually a very high degree of certainty about an applicant's LSAT score: students rarely take the LSAT after applying to law school, much less after gaining acceptance. There's slight uncertainty for GPA: applicants still in college often have at least one semester of grades remaining if they are admitted before the summer, and their GPA could rise or fall slightly.

Here's a closer look at the students who self-reported being accepted (including off the waitlist) or rejected (including off the waitlist) in 2010, 2011, and 2012. (I'm grateful to my colleague Rob Anderson at Witnesseth for his help and insight.)

Green dots represent admitted applicants; red squares represent rejected applicants. The yellow triangles in the middle of each graph represent the median LSAT and GPA for those years.
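For readers who want to reproduce a chart like this, here is a rough matplotlib sketch. It assumes a hypothetical pandas DataFrame `apps` with numeric `lsat` and `gpa` columns and a boolean `admitted` column; an actual LSN export will look different.

```python
import matplotlib.pyplot as plt

def plot_cycle(apps, ax=None):
    """Scatter one cycle of applicants: green dots for admits, red
    squares for rejections, and a yellow triangle at the median
    LSAT/GPA point of the admitted group."""
    ax = ax or plt.gca()
    admits = apps[apps["admitted"]]
    rejects = apps[~apps["admitted"]]
    ax.scatter(rejects["lsat"], rejects["gpa"], marker="s", c="red",
               s=12, label="Rejected")
    ax.scatter(admits["lsat"], admits["gpa"], marker="o", c="green",
               s=12, label="Admitted")
    ax.scatter(admits["lsat"].median(), admits["gpa"].median(),
               marker="^", c="gold", s=120, label="Medians")
    ax.set_xlabel("LSAT")
    ax.set_ylabel("GPA")
    ax.legend(loc="lower right")
    return ax
```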

Using a version of this school's index formula, we can derive a line from the medians: the set of LSAT-GPA combinations that produce the same index score as the median LSAT and median GPA. As you can see, the line does a pretty good job of predicting an applicant's prospects: virtually everyone to the right of the line (i.e., at or above the index score) is admitted; most (but not "virtually everyone") of those to the left of the line (i.e., below the index score) are not admitted.
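Here is a sketch of how one might derive that line and test its predictive power, building on the hypothetical `index_score` above and the same assumed `apps` DataFrame:

```python
def index_line(median_lsat, median_gpa, w_lsat=1.0, w_gpa=10.0, constant=0.0):
    """Return a function mapping an LSAT score to the GPA needed to
    match the index score of the (median LSAT, median GPA) point."""
    target = index_score(median_lsat, median_gpa, w_lsat, w_gpa, constant)
    return lambda lsat: (target - w_lsat * lsat - constant) / w_gpa

def rates_by_side(apps, median_lsat, median_gpa):
    """Admit rates for applicants at/above versus below the index line."""
    target = index_score(median_lsat, median_gpa)
    scores = index_score(apps["lsat"], apps["gpa"])
    return {
        "at_or_above_line": apps.loc[scores >= target, "admitted"].mean(),
        "below_line": apps.loc[scores < target, "admitted"].mean(),
    }
```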

A simplistic examination focusing on medians would capture part of this picture. Essentially everyone "above-above" (i.e., above both the LSAT median and the GPA median) is admitted; essentially everyone "below-below" (i.e., below both medians) is not; and portions of those above one median but not the other are admitted. But the index line helps identify which students are or are not admitted even when they are not "above-above" applicants.
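That median-only view can be computed the same way. A sketch, again using the assumed `apps` DataFrame, that labels each applicant by region relative to the medians and compares admit rates:

```python
def quadrant(row, median_lsat, median_gpa):
    """Label an applicant relative to the two reported medians."""
    above_lsat = row["lsat"] >= median_lsat
    above_gpa = row["gpa"] >= median_gpa
    if above_lsat and above_gpa:
        return "above-above"
    if above_lsat or above_gpa:
        return "above one median"
    return "below-below"

def rates_by_quadrant(apps, median_lsat, median_gpa):
    """Admit rate within each median-based region."""
    labels = apps.apply(quadrant, axis=1,
                        args=(median_lsat, median_gpa))
    return apps.groupby(labels)["admitted"].mean()
```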

How U.S. News distorts student quality

If we examine these charts closely, however, we can identify some zones of unusual acceptance rates. These occur below the index line in two different pockets. The charts below illustrate two "wings" of admitted students below the index line, along with a "donut hole" of high-quality students who are not admitted.

There are two "wings" blocked out in blue. The smaller wing at the top of each chart includes those with above-median GPAs and below-median LSAT scores, who appear to be accepted at a relatively high rate. The larger wing at the bottom includes those with below-median GPAs and above-median LSAT scores, who also appear to be accepted at a relatively high rate. This reflects a couple of realities.

First, high LSAT scores are a scarce commodity, which means they are in high demand. LSAT scores are distributed roughly like a bell curve, so above the median each additional point corresponds to fewer and fewer applicants. It makes sense, then, that schools dip deeper into an applicant pool for high LSAT scores than for high GPAs (which are not similarly scarce, because the many undergraduate institutions award grades in whatever manner each sees fit).

Second, applicants are still evaluated against the index. Merely having a high LSAT score or GPA is not enough; being closer to the index line gives applicants a higher chance of admission.

But what about those in the "donut hole"? These are high-quality candidates--that is, they are very close to the school's index line, comparable to many others who are admitted. Yet these students have virtually no chance of being accepted: their odds are worse than 10% each year. Meanwhile, their peers with comparable--or worse--index scores, but with a higher isolated GPA or LSAT score, are admitted at dramatically higher rates. That is because the students in the "donut hole" are "below-below" candidates: they cannot help the school's LSAT or GPA medians.
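To put rough numbers on the "wings" and the "donut hole," one could carve out those zones directly. The sketch below builds on the earlier hypothetical `index_score` and `apps` DataFrame; the band defining "close to the index line" is an illustrative assumption, not something drawn from the school's actual practice.

```python
def zone_admit_rates(apps, median_lsat, median_gpa, band=2.0):
    """Compare admit rates in the 'wings' and the 'donut hole.'

    Both zones sit below the index line. The wings hold applicants
    above exactly one of the two medians; the donut hole holds
    applicants below both medians whose index score is within `band`
    points of the line, i.e., candidates the index itself rates as
    comparable to many admits.
    """
    target = index_score(median_lsat, median_gpa)
    scores = index_score(apps["lsat"], apps["gpa"])
    below_line = scores < target
    above_lsat = apps["lsat"] >= median_lsat
    above_gpa = apps["gpa"] >= median_gpa

    wings = below_line & (above_lsat ^ above_gpa)
    donut_hole = (below_line & ~above_lsat & ~above_gpa
                  & (scores >= target - band))

    return {
        "wings": apps.loc[wings, "admitted"].mean(),
        "donut_hole": apps.loc[donut_hole, "admitted"].mean(),
    }
```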

In the end, this is where USNWR drives law school admissions. Schools admit students in the "wings" but exclude students in the "donut hole"; and they not only accept students in the "wings" with index scores comparable to those in the "donut hole," they accept students with worse index scores. Schools accept students they believe to be "worse" than these "donut hole" candidates--not by some subjective measure, but by their own preexisting index formula designed to gauge student quality--in an effort to secure students with one LSAT score or GPA that meets or exceeds a target median they intend to report to USNWR.

When schools have the opportunity to evaluate candidates, they evaluate the candidate as a whole: a student who has an LSAT score, and a GPA, and other factors, too. But when faced with the prospect of filling out a class, the school will set aside its own holistic evaluation in favor of isolating the traits that USNWR values. Left to its own devices, the school would actually select a stronger class: its admitted students would have higher index scores. But the pressure to meet two isolated median targets distorts the quality of the class and leads schools to select weaker students (by their own definition) in an effort to meet USNWR's targets.

What's the solution? Well, here's one: USNWR could create its own index, then evaluate the median student. This would give schools more flexibility in filling out a class and would reflect individual students rather than isolated scores. That, of course, is controversial: schools weight LSAT scores and GPAs differently; some don't use an index score at all; and still others fold additional crude mathematical approximations, like undergraduate institution quality, into their index. Additionally, it would simply drive a different arms race, toward the highest median student instead of the existing median targets. Finally, it would incentivize schools to admit extreme "splitters" (e.g., an extremely high GPA paired with an extremely low LSAT score) who meet a target index score.
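For what it's worth, computing that alternative figure is simple once an index exists; the hard part is agreeing on one. A sketch, with the same hypothetical weights as above:

```python
def median_index(entering_class, w_lsat=1.0, w_gpa=10.0, constant=0.0):
    """Median of each student's combined index score: one whole
    'median student' rather than two isolated medians. The weights
    remain hypothetical; a ranking would have to pick its own."""
    scores = index_score(entering_class["lsat"], entering_class["gpa"],
                         w_lsat, w_gpa, constant)
    return scores.median()
```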

So I don't have a good answer, but I'm interested in the thoughts others may have in this area. And I'm happy to receive any corrections or critiques of this analysis.

(Methodological points: I excluded self-identified "underrepresented minorities" from the tabulation above, because schools tend to consider these applicants differently than other candidates, who are evaluated primarily on LSAT and GPA. Schools may also value non-LSAT, non-GPA factors like work experience, religious affiliation, or undergraduate institution quality, but those are not self-reported at the same rate, and their weight often varies across institutions. So these charts are, admittedly, imperfect. It's also possible that the self-reported data is biased in some way. Not every school uses an index, and not every school adheres to its index consistently. I excluded students who failed to report an LSAT score, a GPA, or both. Finally, when I use terms like "worse" or "weaker" applicants, I mean according to the school's index score. There may be reasons why a high LSAT score or a high GPA is more valuable on its own; there are also schools whose indices weight LSAT and GPA differently.)