Excess of Democracy

Which law schools are affected the most by the USNWR dropping at-graduation employment rates?

Following up on my post, “Who's likely to benefit from the new USNWR law school rankings formula?,” I’ve continued to look at other changes to the rankings now that USNWR uses only publicly available data. I went through some of those changes, but I wanted to look at another one: the absence of at-graduation employment outcomes.

USNWR does not publicly disclose precisely how it weights the various employment metrics, but it does offer some relative cues, and it also says it gives “full weight” to full-time, long-term, bar passage-required or J.D. advantage jobs. How a school places in that category is highly correlated with how it places in the overall employment category. Changes to this category are coming, as I noted (i.e., USNWR will now give “full weight” to school-funded positions and to students pursuing graduate degrees).

Setting aside that change for the moment, USNWR also bases 4% of its ranking on a school’s at-graduation employment rate. This metric tends to favor the more “elite” law schools that place a significant number of graduates into judicial clerkships or large law firms, because those employers tend to hire before a 3L has graduated. That, however, is not data collected by the ABA, which collects only 10-month employment statistics.

I looked at “elite” employment outcomes for students 10 months after graduation and compared them to the overall at-graduation employment rate reported to USNWR. (“Elite” being placement in private practice law firms with 101 or more attorneys and in federal judicial clerkships.) In this first chart, you can see a pretty good relationship between the at-graduation rates and the elite placement rates. (You can also see a number of schools that don’t report their at-graduation rate at all.)

Now here’s the chart for the relationship between those same jobs and the 10-month employment rate. As you can see, overall employment rates rise significantly among the schools with the least “elite” employment outcomes. That means the shift from at-graduation to 10-month figures may well favor placement into public interest, government, and smaller law firm jobs compared to how those positions have been weighted in the past.
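To put a rough number on that relationship, a correlation coefficient across the two charts works. Here’s a minimal sketch in Python; the file name and column names (elite_rate, at_grad_rate, ten_month_rate) are placeholders for illustration, not the actual dataset:

```python
# Sketch of the comparison behind the two charts. "placement.csv" and its
# columns are placeholders: elite_rate is large-firm (101+ attorneys) plus
# federal clerkship placement; the other two are the employment rates.
import pandas as pd

df = pd.read_csv("placement.csv")

# Schools that don't report an at-graduation rate appear as missing values.
reported = df.dropna(subset=["at_grad_rate"])

# Chart 1: elite placement vs. at-graduation rate (reporting schools only).
print(reported["elite_rate"].corr(reported["at_grad_rate"]))

# Chart 2: elite placement vs. 10-month rate (all schools).
print(df["elite_rate"].corr(df["ten_month_rate"]))
```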

On to the change in methodology. I ran last year’s figures to see what would happen if the 10-month employment rate were weighted at 18% instead of its present 14% and the at-graduation employment rate were abolished. I used only the top-line full-weight “employment” figures, so these are less precise than the proprietary blend USNWR uses for its actual ranking; but I did standardize each score and looked at where it fell. While imprecise, this should give a “band” of the schools most likely to over-perform and under-perform based on this change alone. It should be noted that many schools do not presently share at-graduation employment statistics with USNWR, and probably all of them would be better off, to one degree or another.
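In code, that calculation looks something like the following sketch. The file and column names are placeholders; the weights come from the figures above (14% and 4% under the old scheme, 18% under the new):

```python
# Sketch of the reweighting comparison: standardize each employment metric,
# then compare each school's contribution under the old and new weights.
import pandas as pd

df = pd.read_csv("employment.csv")  # placeholder file of last year's figures

# Standardize each metric into a z-score before weighting.
for col in ["at_grad_rate", "ten_month_rate"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

# Old scheme: 14% on the 10-month rate, 4% on the at-graduation rate.
old = 0.14 * df["ten_month_rate_z"] + 0.04 * df["at_grad_rate_z"]

# New scheme: 18% on the 10-month rate, at-graduation rate abolished.
new = 0.18 * df["ten_month_rate_z"]

# Positive delta = likely to benefit; negative = likely adversely affected.
# (Schools that don't report at-graduation rates need separate handling.)
df["delta"] = new - old
print(df.sort_values("delta", ascending=False)[["school", "delta"]])
```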

SCHOOLS LIKELY TO BENEFIT

School                  At grad   10 month
Elon                    10.2%     78.7%
Dayton                  30.0%     87.1%
Willamette              30.6%     85.2%
Texas A&M               46.9%     93.8%
Gonzaga                 31.7%     83.7%
Regent                  31.2%     83.1%
Houston                 31.9%     81.9%
Arkansas                39.3%     86.6%
Northern Illinois       26.3%     77.5%
Samford                 31.4%     80.7%
Arkansas-Little Rock     7.1%     64.6%
DePaul                  32.3%     79.3%
Campbell                33.6%     78.6%
North Dakota            29.9%     76.1%
Idaho                   30.6%     76.5%
Seattle                 32.4%     76.9%
Liberty                 24.6%     71.9%
LSU                     41.9%     82.0%
Oklahoma                30.2%     74.5%
Belmont                 36.0%     78.0%

These tend to be schools that do not place an overwhelming number of students into large law firms or judicial clerkships but that do have a fairly strong 10-month employment rate relative to their peers. Interestingly, there are no California law schools on the list, a cohort I had assumed might benefit most given the state’s difficult bar examination and perhaps a greater “wait and see” approach from prospective employers.

Now, to schools more likely to be adversely affected.

SCHOOLS LIKELY TO BE ADVERSELY AFFECTED

School                    At grad   10 month
Massachusetts-Dartmouth   33.9%     47.5%
Yale                      89.2%     89.2%
Stanford                  88.5%     89.0%
BYU                       82.8%     85.9%
Northwestern              87.9%     89.5%
CUNY                      36.1%     56.5%
Loyola-New Orleans        52.1%     66.9%
Vanderbilt                82.2%     86.1%
Georgetown                83.1%     86.8%
NYU                       86.6%     89.5%
Berkeley                  86.7%     90.0%
Chicago                   94.6%     95.1%
Columbia                  95.3%     95.6%
USC                       76.2%     83.6%
Virginia                  92.7%     94.3%
Cornell                   90.3%     92.8%
Montana                   81.2%     87.0%
Irvine                    58.7%     72.7%
Connecticut               58.6%     72.9%
Harvard                   88.1%     91.8%

Recall, of course, that I am looking at the change on this one metric alone. And recall that because the schools’ data are standardized in each category, those likely to gain or lose may look a little different from what one might expect on the raw numbers alone. But it’s a mix of schools that have a very high at-graduation employment rate, which gives them a significant boost relative to their peers; and schools that are fairly low in both categories but sit much farther behind their peers on the 10-month rate than on the at-graduation rate.
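A toy example makes the standardization point concrete. The means and standard deviations below are invented purely for illustration; only Yale’s two 89.2% figures come from the table above:

```python
# Invented distributions, for illustration only.
at_grad_mean, at_grad_sd = 45.0, 20.0
ten_month_mean, ten_month_sd = 80.0, 10.0

# Yale reports 89.2% in both categories (from the table above).
z_at_grad = (89.2 - at_grad_mean) / at_grad_sd        # ~ +2.2: big outlier
z_ten_month = (89.2 - ten_month_mean) / ten_month_sd  # ~ +0.9: modest edge

# The old scheme banked 4% of the big at-graduation z-score; the new scheme
# replaces it with 4% more of the much smaller 10-month z-score.
print(0.04 * z_at_grad)    # ~ +0.09 lost
print(0.04 * z_ten_month)  # ~ +0.04 gained back
```

On made-up numbers like these, a school with identical raw rates in both categories still loses ground, because the at-graduation distribution sits lower and spreads wider.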

There are many other changes that could help or hurt other schools. Note, for instance, that I suggested in an earlier post that BYU could gain significantly in some other categories; here, it appears the school could be more adversely affected. Texas A&M, to name another, performs well here, as it did elsewhere. How much weight USNWR gives to any change matters greatly.

But I think this highlights just how uncertain many changes are in the upcoming rankings. As I pick through the different categories, each one has schools likely to see their performance change. How those changes shake out in the end, whether they tend to be beneficial or not, remains to be seen.