Visualizing legal employment outcomes in Pennsylvania in 2018

This is the second in a series of visualizations on legal employment outcomes for the Class of 2018. Following a post on outcomes in Illinois, here is a visualization for legal employment outcomes of graduates of Pennsylvania law schools for the Class of 2018. (More about the methodology is available at the Illinois post.) Last year's Pennsylvania post is here.

Total graduates were down slightly year-over-year, and the job picture improved a little, with 85.2% employed in bar passage required and J.D. advantage positions, including 11 school-funded positions. Most schools are bunched together in the 80%-90% placement range. The raw number of placed graduates actually fell slightly (from 1058 to 1055), so the improvement in the rate is attributable to the slightly smaller number of graduates, down from 1256 to 1238.

As always, please notify me of any corrections or errata.

[Chart: Class of 2018 legal employment outcomes at Pennsylvania law schools]
| Peer Score | School | 2018 Rate | YoY (pts) | BPR | JDA | LSF | Grads | 2017 Rate | BPR | JDA | LSF | Grads |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 4.4 | University of Pennsylvania | 97.9% | -0.9 | 216 | 12 | 10 | 243 | 98.8% | 232 | 16 | 5 | 256 |
| 1.8 | Duquesne University | 90.0% | 10.6 | 88 | 20 | 0 | 120 | 79.4% | 80 | 20 | 0 | 126 |
| 2.5 | Villanova University | 88.2% | 5.8 | 127 | 15 | 0 | 161 | 82.4% | 120 | 11 | 0 | 159 |
| 2.2 | Pennsylvania State - Dickinson Law | 87.3% | 8.6 | 51 | 4 | 0 | 63 | 78.7% | 41 | 7 | 0 | 61 |
| 2.1 | Drexel University | 83.7% | 2.3 | 95 | 13 | 0 | 129 | 81.5% | 88 | 13 | 0 | 124 |
| 2.7 | Temple University | 83.3% | -3.4 | 161 | 13 | 0 | 209 | 86.6% | 172 | 15 | 1 | 217 |
| 2.3 | Penn State Law | 80.0% | 1.9 | 86 | 13 | 1 | 125 | 78.1% | 82 | 7 | 0 | 114 |
| 2.7 | University of Pittsburgh | 71.9% | -5.7 | 85 | 12 | 0 | 135 | 77.5% | 87 | 20 | 0 | 138 |
| 1.7 | Widener Commonwealth | 62.3% | -4.9 | 32 | 1 | 0 | 53 | 67.2% | 36 | 5 | 0 | 61 |
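For anyone who wants to reproduce the statewide figures from the table above, here is a minimal sketch in Python (pandas assumed; the column names are my own): sum the three placement categories and divide by total graduates.

```python
import pandas as pd

# Class of 2018 counts transcribed from the table above: FTLT bar passage
# required (bpr), J.D. advantage (jda), law-school-funded (lsf), and graduates.
pa_2018 = pd.DataFrame(
    [
        ("University of Pennsylvania", 216, 12, 10, 243),
        ("Duquesne University", 88, 20, 0, 120),
        ("Villanova University", 127, 15, 0, 161),
        ("Pennsylvania State - Dickinson Law", 51, 4, 0, 63),
        ("Drexel University", 95, 13, 0, 129),
        ("Temple University", 161, 13, 0, 209),
        ("Penn State Law", 86, 13, 1, 125),
        ("University of Pittsburgh", 85, 12, 0, 135),
        ("Widener Commonwealth", 32, 1, 0, 53),
    ],
    columns=["school", "bpr", "jda", "lsf", "grads"],
)

placed = int(pa_2018[["bpr", "jda", "lsf"]].sum().sum())  # 1055 graduates placed
grads = int(pa_2018["grads"].sum())                       # 1238 total graduates
print(f"Statewide placement, Class of 2018: {placed / grads:.1%}")  # -> 85.2%
```

The same sums over the 2017 columns give 1058 placed of 1256 graduates, or 84.2%, which is where the roughly one-point improvement comes from.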

Visualizing legal employment outcomes in Illinois in 2018

Following up on a series of posts last year (and previous years), this is the first in a series visualizing employment outcomes of law school graduates from the Class of 2018. The recently released U.S. News & World Report ("USNWR") rankings, which include data for the Class of 2017, are already obsolete. The ABA will release the complete data soon, but individualized employment reports are already available on schools' websites.

The USNWR prints the "employed" rate as "all jobs excluding positions funded by the law school or university that are full-time, long-term and for which a J.D. and bar passage are necessary or advantageous." It does not give "full weight" in its metrics to jobs that were funded by the law school; it gives those positions lower weight, and they are not included in the ranking tables. And while it includes J.D. advantage positions, there remain disputes about whether those positions are actually as valuable as bar passage required jobs. (Some have further critiqued the inclusion of solo practitioners in the bar passage required statistics.) Nonetheless, as a top-level category, I looked at these "full weight" positions.

The top chart is sorted by non-school-funded jobs (or "full weight" positions). The visualization breaks out full-time, long-term, bar passage required positions (not funded by the school); full-time, long-term, J.D. advantage positions (not funded by the school); school-funded positions (full-time, long-term, bar passage required or J.D. advantage positions); and all other outcomes. The visualization also distinguishes bar passage required positions from J.D. advantage positions, even though both are included in "full weight" for USNWR purposes (and I still sort the chart by "full weight" positions).

The table below the chart breaks down the raw data values for the Classes of 2017 and 2018, with relative overall changes year-over-year. Here, I used the employment rate including school-funded positions, which USNWR used to print but no longer does; nevertheless, because there are good-faith disputes, I think, about the value of school-funded positions, I split the difference—I excluded them in sorting the bar charts and included them in the tables for comparison. The columns beside each year break out the three categories in the total placement: FTLT unfunded bar passage required ("BPR"), FTLT unfunded J.D. advantage ("JDA"), and FTLT law-school-funded BPR and JDA positions ("LSF"). This year, I also added the total graduates. (My visualization is limited because the bar widths for each school are the same even though schools vary greatly in size, which means that raw placement at a larger school may be more impressive than the percentages alone suggest.)
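To make the two calculations concrete, here is a minimal sketch (Python with pandas; the column names are mine, and the three example rows are borrowed from the Illinois table below) of the difference between the "full weight" rate used to sort the chart and the placement rate reported in the table:

```python
import pandas as pd

# Three example rows borrowed from the Illinois table below:
# FTLT unfunded bar passage required (bpr), FTLT unfunded J.D. advantage (jda),
# FTLT law-school-funded positions (lsf), and total graduates (grads).
df = pd.DataFrame(
    {
        "school": [
            "University of Chicago",
            "Loyola University Chicago",
            "Southern Illinois University-Carbondale",
        ],
        "bpr": [188, 119, 63],
        "jda": [4, 46, 11],
        "lsf": [10, 0, 0],
        "grads": [206, 193, 110],
    }
)

# "Full weight" rate used to sort the bar chart: unfunded BPR + JDA only.
df["full_weight_rate"] = (df["bpr"] + df["jda"]) / df["grads"]

# Placement rate reported in the table: school-funded positions included.
df["table_rate"] = (df["bpr"] + df["jda"] + df["lsf"]) / df["grads"]

print(df.sort_values("full_weight_rate", ascending=False).round(3))
```

The only difference between the two rates is whether LSF is added to the numerator; the denominator is total graduates either way.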

The first state is Illinois (last year's visualization here). There were 1696 statewide graduates, a 3% decline from last year's class. The total placement rate among those graduates was 82% (including a few school-funded jobs). It is, once again, a slight improvement over last year, driven by the smaller class size. Placement in bar passage required jobs fell slightly again.

As always, if I made a mistake, please feel free to email me or comment; I confess there are always risks in data translation, and I am happy to make corrections.

| Peer Score | School | 2018 Rate | YoY (pts) | BPR | JDA | LSF | Grads | 2017 Rate | BPR | JDA | LSF | Grads |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 4.7 | University of Chicago | 98.1% | 0.4 | 188 | 4 | 10 | 206 | 97.7% | 197 | 4 | 8 | 214 |
| 4.2 | Northwestern University (Pritzker) | 96.9% | 3.0 | 205 | 12 | 5 | 229 | 94.0% | 204 | 24 | 5 | 248 |
| 3.2 | University of Illinois-Urbana-Champaign | 91.9% | 5.3 | 118 | 19 | 0 | 149 | 86.6% | 111 | 12 | 0 | 142 |
| 2.6 | Loyola University Chicago | 85.5% | 8.0 | 119 | 46 | 0 | 193 | 77.5% | 138 | 34 | 0 | 222 |
| 2.6 | Illinois Institute of Technology (Chicago-Kent) | 81.0% | 7.0 | 149 | 39 | 0 | 232 | 74.0% | 133 | 32 | 0 | 223 |
| 2.2 | DePaul University | 73.5% | 3.9 | 126 | 40 | 0 | 226 | 69.6% | 126 | 34 | 0 | 230 |
| 1.7 | The John Marshall Law School | 67.5% | -4.1 | 151 | 33 | 1 | 274 | 71.6% | 161 | 43 | 0 | 285 |
| 1.6 | Southern Illinois University-Carbondale | 67.3% | 8.7 | 63 | 11 | 0 | 110 | 58.6% | 64 | 4 | 0 | 116 |
| 1.6 | Northern Illinois University | 66.2% | -10.9 | 43 | 8 | 0 | 77 | 77.1% | 42 | 11 | 1 | 70 |

February 2019 MBE bar scores bounce back from all-time lows

After cratering to all-time record lows last year, scores on the February administration of the Multistate Bar Exam have bounced back. It's good news, but modest—the rise merely returns scores to roughly their February 2017 level, which was at that time the lowest in history. In other words, scores have bounced back to match the second-lowest figure on record... which is only slightly better.

To be fair (which is not to say I've been unfair!), part of this overall score is likely driven by the Uniform Bar Exam. It used to be that more test-takers who had passed a previous bar exam had to take another test in another jurisdiction, and those who had already passed were likely to score quite well on a second attempt at a new bar. But the National Conference of Bar Examiners has indicated that the rise of the UBE has dropped the number of people taking a second bar, which in turn drops the number of high scorers, which in turn drops the MBE scores. So the drop in MBE scores isn't itself entirely a cause for alarm. It is partly a reflection that the UBE is reducing the number of repeat bar test-takers by some small figure each year.

We now know the mean scaled national February MBE score was 134.0, up 1.2 points from last year's 132.8. We would expect bar exam passing rates to rise in most jurisdictions. Just as repeaters caused most of the drop last time, they are causing most of the rise this time. Repeaters simply appear to be a more volatile cohort of test-takers.

A couple of visualizations are below, showing long-term and short-term trends.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133. The trend is more pronounced when looking at a more recent window of scores.

The first major drop in bar exam scores was revealed to law schools in late fall 2014. That means the 2014-2015 applicant cycle, to the extent schools took heed of the warning, was a time for them to improve the quality of their incoming classes, leading to some expected improvement for the class graduating in May of 2018. But bar pass rates were historically low in July 2018. It’s not clear that law schools have properly adapted even after five years.

For now, we wait and see what the July 2019 exam brings. For more, see Karen Sloan over at NLJ.

My advice to students looking to enroll in classes

I offered a few thoughts on Twitter recently about advice to students looking to enroll in classes. It became popular advice, and then some people added nuance or qualifications, so I thought an extended discussion here might be warranted. While I teach at a law school and think specifically about that, the advice can work well for higher education generally.

1. Take the professor, not the course. In my seven years of higher education, I never regretted a course I took with a professor I liked in an area outside my specialties or interests, and I'd say all of my least-favorite courses were ones I felt I "had" to take or "ought" to take for one reason or another. The quality of the professor often makes or breaks a course. In my conversations with students about their favorite and least-favorite courses, the answer usually turns on the professor rather than the contents of the course.

There is a risk that this becomes some kind of cult of personality around faculty. But I do think we are inclined to learn best from the people we best understand, or whose teaching style is most interesting to us.

There is a risk, too, that we ignore courses that are essential for our major or for an area of legal practice. But I don't worry too much about that (though it does give me some pause). For one, if you like all the faculty in a different area—criminal law when you want to be a corporate attorney, for instance—maybe you picked the wrong field or the wrong school…. And some courses are simply unavoidable, often because they are required. There were also courses I valued precisely because I felt they would help my career—federal courts and criminal procedure in anticipation of my clerkship, for instance (even though I did like the professors!). But on the whole, I advise students to be cautious about letting those considerations override the professor when selecting courses.

2. Find courses with writing and substantial revision requirements. Who hasn’t been the student relieved that they have no exams and only paper courses? But school—particularly, again, I think of law school—is a tremendous opportunity to improve one’s writing ability without the pressures of, say, a demanding client or boss frustrated with your writing ability! Writing opportunities, then, are terrific places to improve this craft. But it’s not just dumping words onto a page at the end of the semester. Find courses that also include revision requirements—a draft due early, a professor’s feedback about the piece’s strengths and weaknesses, and an opportunity to improve it. In law school, this is an essential component of legal research and writing. But finding such opportunities in the upper-division curriculum requires you to seek them out—and requires faculty willing to incorporate draft revision in the syllabus rather than simply expecting some paper at the end.

3. Pick a schedule with course times that help your self-discipline. I loved 8 am courses in school. In college, they helped keep me on a disciplined schedule and ensured I didn’t skip breakfast on my meal plan. In law school, they kept me and my wife (who worked) on similar schedules. I liked morning courses because I paid attention best then; I liked doing homework in the afternoon. I liked scheduling classes every day because it forced me to get into school every day to study. In short, I found out what worked best for me and made sure I planned schedules around it. Too often, it’s tempting to develop schedules around what is convenient. Convenience may be important, but self-discipline—developing habits that will help you avoid your own weaknesses or temptations, like procrastination or laziness—is crucial to future success.

4. Do not assume an elective will be offered next year: take it now. You're looking at the schedule, and you see a neat course with a professor you like. But it's at an inconvenient time, or it runs up against a requirement in your discipline, or whatever it is. And you think, "Well, I'll just take it next year." Don't do that. Don't! Schedules are fickle things. Faculty lateral to another institution, go on sabbatical, visit elsewhere, take parental leave, or retire. A deficiency arises in another area, so faculty give up the elective to help teach something else. Research interests shift. Low enrollment means the course isn't offered again. There are a thousand reasons why there's no guarantee the course will return next year. So take it now (if you can).

*

There are of course many, many factors to consider when scheduling courses. (Many on Twitter have been suggesting other considerations, too.) But these are four of my most common pieces of advice and things that can help improve one’s experience.

The relationship between 1L class size and USNWR peer score

Professor Robert Jones has tracked the peer scores for law schools as reported by USNWR over time. That is, each year, each law school receives a peer score on a scale of 1 to 5, and we can see how that score has changed over time. He’s also tracked the average of all peer scores. That gives us an interesting idea—what do law professors think about the state of legal education as a whole? Perhaps schools are always getting better and we’d see peer scores climb; perhaps schools increasingly rate each other more harshly as they compete to climb the rankings and we’d see the peer scores drop. (There’s some evidence law faculties are pretty harsh about most law schools!)

At the risk of offering a spurious correlation, I noticed that the average score appeared to rise and fall with the conditions of the legal education market. The easiest way to track that would be to look at the overall 1L incoming class size the year the survey was circulated.

You can make a lot of correlations look good on two axes if you shape the two axes carefully enough. But there’s a good relationship between the two here. As the legal education market rose from 2006 to 2010 with increasingly large 1L class sizes, peer scores roughly trended upwards. As the market crashed through 2014, peer scores dropped. Now, the market has modestly improved—and peer scores have moved up much more quickly, perhaps reflecting optimism.
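For anyone who would rather test the relationship than eyeball two carefully shaped axes, a minimal sketch of the comparison might look like the following (the file name and column names are hypothetical; the nationwide 1L matriculant counts and the yearly average peer scores would have to be compiled separately):

```python
import pandas as pd

# Hypothetical input: one row per survey year, with the nationwide 1L
# matriculant count for that year and the average USNWR peer score from
# the survey circulated that fall.
df = pd.read_csv("peer_scores_vs_1l_class_size.csv")
# expected columns: year, first_years, avg_peer_score

# Pearson correlation between 1L class size and average peer score.
r = df["first_years"].corr(df["avg_peer_score"])
print(f"Same-year correlation: {r:.2f}")

# A lagged comparison may be fairer if survey sentiment reacts to the
# previous year's enrollment picture rather than the current one.
r_lagged = df["first_years"].shift(1).corr(df["avg_peer_score"])
print(f"One-year-lag correlation: {r_lagged:.2f}")
```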

All this is really just speculation about why peer scores would change, on average, by more than 0.1 in a single decade, or why they'd move up and down. Intuitively, it makes sense that peer scores would improve as the legal academy feels better about the state of legal education and worsen as it feels worse. There are far better ways to investigate this claim, but the relationship struck me as noteworthy!

A continuing trickle of law school closures

One year ago today—March 22, 2018—I reflected on the “trickle” of law school closures. Some campuses closed (Cooley’s Ann Arbor branch, Atlanta’s John Marshall’s Savannah branch), two schools merged into one (William Mitchell and Hamline), and others announced their closure (Indiana Tech, Whittier, Charlotte, and Valparaiso). In the last year, Arizona Summit and Western State have announced their closures.

Western State closing two years after Whittier is a remarkable turn for legal education in Orange County, California. Orange County, with more than 3 million residents, is one of the most populous and fastest-growing counties in the United States.

California has long had a number of state-accredited law schools, schools that do not have full ABA accreditation. Western State has been around since the 1970s but was not the first Orange County school to gain full ABA accreditation—that was Whittier in 1978. Western State joined newcomer Chapman as fully accredited in 1998. Then UC-Irvine was accredited in 2011. But now two of those four schools have closed.

While we are a long way from the recession, and while law school enrollment has stabilized (and slightly improved) over the last few years, there remain longstanding pressures on legal education, in part from the legacy of the recession—small class sizes can only be sustained so long, scholarships have increased to attract students, the transfer market has disproportionately impacted more marginal schools, lower credentials of incoming students have translated into systemic lower bar passage rates, and so on.

We may still see a few more closures in the years ahead—for-profit schools have borne the brunt of the closures so far, but we'll see what happens in the months to come.

The new arms race for USNWR law specialty rankings

The USNWR law "specialty" rankings long operated this way: schools would identify one faculty member whose specialty matched each of the various USNWR specialty categories (legal writing, trial advocacy, tax, etc.). USNWR would send a survey to those faculty asking them to list up to 15 of the top schools in their area. USNWR would then take the top half of the schools that received a critical mass of votes and rank them based upon who received the most votes—just an ordinal rank with no vote totals listed. For many specialty areas, that meant 10 to 20 ranked schools. And for the other 180 to 190 schools, that meant blissful ignorance.

USNWR changed that methodology this year in a couple of ways. First, its survey asks voters to rank every school on the basis of this specialty on a scale of 1 to 5, similar to how the peer reputation survey works. Second, it ranks all the schools that received a critical mass of votes (i.e., about 10 votes—and most law professors are not shy about rating most schools). Third, it now lists that reputation score, ties and all.

The result is that almost all schools are ranked in almost all categories. And now your school might be 33d or 107th or 56th or something in a category.

The result in some categories is comical compression. A score of 2.0 (out of 5) gets you 91st in International Law, and a score of 1.0 (the bottom) gets you to 177th. Ties are abundant—after all, there are usually at least 180 schools ranked, and given that the scale runs from 5.0 to 1.0, and that virtually all schools fall in the 4.0 to 1.0 range, there are going to be a lot of ties.
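To see why the compression and ties are baked in, here is a minimal sketch with made-up scores: roughly 190 schools, scores reported to one decimal place and effectively squeezed into the 1.0-4.0 range, ranked so that tied schools share the best rank in their group (which appears to be how USNWR handles ties):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Made-up specialty scores for ~190 schools: one decimal place, and
# effectively squeezed into the 1.0-4.0 range.
scores = np.round(rng.normal(loc=1.8, scale=0.6, size=190).clip(1.0, 4.0), 1)
s = pd.Series(scores)

# Competition ("min") ranking: tied schools share the best rank in the group,
# so a single tenth of a point can span dozens of ordinal positions.
ranks = s.rank(method="min", ascending=False).astype(int)

summary = pd.DataFrame({"score": s, "rank": ranks})
print(summary.groupby("score")["rank"].agg(["min", "count"]).sort_index(ascending=False))
```

With only about 30 possible score values and roughly 190 schools, dozens of schools necessarily share each rank, which is why one tenth of a point can move a school many ordinal positions.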

Worse, now schools can advertise their top X program, when X in the past typically wouldn’t drop past 10 to 20. Now, top 30, top 50, top 100 all earn bragging rights.

So now there's a new arms race. Schools know exactly where they sit in this year's survey, how tantalizingly close the next tranche of the ratings is (because of the ties), and how much higher that ranking would look (again, because of the ties). The temptation is to pepper prospective voters with more marketing materials in an ever-escalating race to climb a new set of specialty rankings. In the past, it was blissful ignorance for those below 20th. Today, it's all laid bare.

Perhaps I'm wrong. Maybe schools will mostly ignore the change to the specialty rankings. The compression and the ties alone should cause most to ignore them. But I doubt it. The allure of rankings and the temptation of marketing departments to boast to prospective students and alumni about some figure (especially if that figure is higher than the school's overall USNWR rank) will, I think, overwhelm cooler heads.

Anatomy of a botched USNWR law ranking leak

For the past few years, USNWR has emailed all law school deans an embargoed PDF listing the tentative law school rankings about a week before their formal release. And for the past few years, within minutes (and in disregard of that embargo), that email has been leaked to a private consulting company, which then posts the rankings on its corporate blog, from which they spread via social media and gossip sites.

This year, USNWR did something different. It released most of its graduate school rankings in an Excel spreadsheet on a password-protected site around 8 am ET on Tuesday, March 5. But it did not release the full-time law school rankings or the business school rankings. (I guess we know which schools are beholden to these rankings and where USNWR sees its value!)

Instead, shortly after, individuals at schools received their own school's ranking, and nothing more. This makes leaking much more challenging. If you leak your own school's ranking, it's obvious you leaked it, and USNWR may punish you by not giving you access to that embargoed data early next year. 

But around 5 pm ET on Tuesday, March 5, USNWR sent out a new update. Its Academic Insights database would now have the 2020 rankings data (that is, the rankings data to be released March 12, 2019). 

Academic Insights is a USNWR platform that law schools purchase a license to access and use. It has rankings data stretching back to the beginning. It offers multiple ways to view the data inside AI, or to pull the rankings data out of the database. 

It's fairly user friendly, but it isn't always easy to operate, and like many web databases it can suffer from some wonky behavior. That makes leaking a trickier proposition.

Around 7 pm ET March 5, the private consulting company posted the rankings. But the rankings made it very obvious that there were errors, and it also provided clues about how those errors came about.

To leak this information to someone, some law school administrator made a data request from the database and exported the rankings information to a CSV file. The “Leaderboard” AI database is a swift way to see the ranking of law schools compared to one another across categories. (Recall that the database stretches back through the history of USNWR, so it includes all schools that were ever ranked over the last 30 years, whether or not they’re ranked, or even exist, this year.)

The list then included as “N/A” (i.e., “unranked” this year) schools like Arizona Summit and the University of Puerto Rico. This is unsurprising because USNWR doesn’t rank (1) provisionally-accredited schools, (2) schools under probation, and (3) the schools in Puerto Rico.

But the leaked ranking included other bizarre “unranked” choices: Hamline University; Pennsylvania State University (Dickinson) pre-2017; Rutgers, The State University of New Jersey--Camden; Rutgers, The State University of New Jersey--Newark; Widener University; and William Mitchell College of Law (among others). These schools all no longer exist (along with a couple of others that have announced closures). Why list them as “unranked”?

Separately, the leaked rankings omitted information for Penn State - University Park, Penn State - Dickinson, Rutgers, Widener University (Commonwealth), Widener University Delaware, and Mitchell|Hamline. Why aren’t these schools in the database?

These are obviously not random database omissions. They're omissions of schools that split or merged. Their old schools are in the database. But the Leaderboard database pull request omitted those schools. (Why, I don't know.)

But there are ways of requesting school-specific data. You could request the specific institutional data in the AI database for, say, Penn State - University Park or Rutgers, and the data is now available for your review—including those institutions’ ranks. Of course, a few schools might ultimately be "rank not published," or "Tier 2" schools in the rankings. But they're not "unranked."

(Incidentally, from the revealed metadata, we know a lot of information about which person at which law school leaked the rankings, but that’s not what this blog post is about.)

The real botching came when the leaked ranking, with these strange inclusions and omissions (and some noticeable gaps—think two schools listed at 64 followed by the next school at 67, which means there's an omission in the 64 ranking), was posted and began to spread. Panicked students and prospective students at places like Penn State and Rutgers asked what happened. The private consulting company replied that it "appeared" the schools were "unranked." That spawned a great deal of speculation and worry among these students.

Of course, that wasn't the case. The statements speculating that these schools appeared to be "unranked" were reckless—they were made without an understanding of how the database operates and rested instead on speculation—and they were false, because, as I noted, each of these omitted schools had a ranking in the database, just not in the CSV leaked to this private consulting company. (Later statements began to concede that these schools would probably be ranked, but they came only after the worry and misinformation had spread.)

I pushed back against this false news last week on a couple of social media outlets, because it does no good to perpetuate false rumors about these law schools. These law schools, I insisted, would almost certainly be ranked. They were ranked at that very moment in the AI database; and, barring a change, they would be ranked when the rankings were formally released (i.e., now). (Of course, some schools, like those under probation or those in Puerto Rico, were never going to be ranked.)

The backlash I received on social media was impressive. I confess, I'm not sure why so many prospective law students felt threatened by my insistence that someone had disclosed bad information about schools like Penn State and Rutgers to them! (Happily, such comments roll off easily.) After that, apparently, USNWR asked for those rankings to be taken down, and they were. (Of course, they still floated around social media and gossip sites.)

But we now know that leaking USNWR information from the AI database presents complications for future leaks. Failure to understand how to operate the database may leave an incomplete and inaccurate picture, as occurred this year with the botched leak. What will USNWR do for the 2021 rankings? Are complete but accurate leaks better, or incomplete and inaccurate ones? We shall see.

And for those relying on leaks in the future? Read skeptically. The leak included material errors this year, and I wouldn't be surprised to see material errors in future leaks.