Visualizing legal employment outcomes in Texas in 2018

This is the third in a series of visualizations on legal employment outcomes for the Class of 2018. Following posts on outcomes in Illinois and Pennsylvania, here is a visualization for legal employment outcomes of graduates of Texas law schools for the Class of 2018. (More about the methodology is available at the Illinois post.) Last year's Texas post is here.

Placement in bar passage required jobs improved notably, from 1321 to 1366, while J.D.-advantage jobs declined. Total graduates increased from 1959 to 1979, and overall placement improved from 75.8% to 76.3%.

As always, please notify me of any corrections or errata.

| Peer Score | School | 2018 Pct. | YoY (pts) | BPR | JDA | LSF | Grads | 2017 Pct. | BPR | JDA | LSF | Grads |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4.1 | University of Texas-Austin | 92.8% | 6.0 | 238 | 15 | 6 | 279 | 86.9% | 259 | 18 | 7 | 327 |
| 2.4 | Baylor University | 89.0% | 2.8 | 100 | 5 | 0 | 118 | 86.2% | 107 | 4 | 1 | 130 |
| 2.7 | Southern Methodist University | 87.1% | 4.0 | 192 | 17 | 0 | 240 | 83.1% | 179 | 17 | 0 | 236 |
| 1.9 | Texas Tech University | 85.8% | 7.6 | 125 | 8 | 0 | 155 | 78.2% | 137 | 17 | 0 | 197 |
| 2.7 | University of Houston | 85.4% | 7.9 | 171 | 22 | 0 | 226 | 77.5% | 153 | 26 | 0 | 231 |
| 2.4 | Texas A&M University | 81.9% | 7.6 | 93 | 20 | 0 | 138 | 74.3% | 118 | 18 | 0 | 183 |
| 1.6 | South Texas College of Law Houston | 67.8% | 6.0 | 160 | 24 | 1 | 273 | 61.7% | 157 | 27 | 0 | 298 |
| 1.6 | St. Mary's University | 62.5% | -8.4 | 129 | 11 | 0 | 224 | 70.9% | 118 | 20 | 1 | 196 |
| nr | University of North Texas Dallas | 57.9% | 6.4 | 76 | 8 | 0 | 145 | 51.5% | 17 | 0 | 0 | 33 |
| 1.4 | Texas Southern University | 48.6% | -17.0 | 82 | 6 | 0 | 181 | 65.6% | 76 | 8 | 0 | 128 |

Visualizing legal employment outcomes in Pennsylvania in 2018

This is the second in a series of visualizations on legal employment outcomes for the Class of 2018. Following a post on outcomes in Illinois, here is a visualization for legal employment outcomes of graduates of Pennsylvania law schools for the Class of 2018. (More about the methodology is available at the Illinois post.) Last year's Pennsylvania post is here.

Total graduates were down slightly year-over-year, and the job picture improved a little, with 84.4% employed in bar passage required and J.D.-advantage positions, including 11 school-funded positions. The raw number of placed graduates actually fell slightly, so the improvement in the rate is attributable to the smaller graduating class, which shrank from 1256 to 1238.

As always, please notify me of any corrections or errata. UPDATE: Duquesne’s employment data was accidentally overstated in an earlier post and has been edited to present the accurate data.

| Peer Score | School | 2018 Pct. | YoY (pts) | BPR | JDA | LSF | Grads | 2017 Pct. | BPR | JDA | LSF | Grads |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4.4 | University of Pennsylvania | 97.9% | -0.9 | 216 | 12 | 10 | 243 | 98.8% | 232 | 16 | 5 | 256 |
| 2.5 | Villanova University | 88.2% | 5.8 | 127 | 15 | 0 | 161 | 82.4% | 120 | 11 | 0 | 159 |
| 2.2 | Pennsylvania State - Dickinson Law | 87.3% | 8.6 | 51 | 4 | 0 | 63 | 78.7% | 41 | 7 | 0 | 61 |
| 2.1 | Drexel University | 83.7% | 2.3 | 95 | 13 | 0 | 129 | 81.5% | 88 | 13 | 0 | 124 |
| 2.7 | Temple University | 83.3% | -3.4 | 161 | 13 | 0 | 209 | 86.6% | 172 | 15 | 1 | 217 |
| 1.8 | Duquesne University | 80.0% | 0.6 | 86 | 13 | 0 | 120 | 79.4% | 80 | 20 | 0 | 126 |
| 2.3 | Penn State Law | 80.0% | 1.9 | 86 | 13 | 1 | 125 | 78.1% | 82 | 7 | 0 | 114 |
| 2.7 | University of Pittsburgh | 71.9% | -5.7 | 85 | 12 | 0 | 135 | 77.5% | 87 | 20 | 0 | 138 |
| 1.7 | Widener Commonwealth | 62.3% | -4.9 | 32 | 1 | 0 | 53 | 67.2% | 36 | 5 | 0 | 61 |

Visualizing legal employment outcomes in Illinois in 2018

Following up on a series of posts last year (and previous years), this is the first in a series visualizing employment outcomes of law school graduates from the Class of 2018. The recently released U.S. News & World Report ("USNWR") rankings, which include data for the Class of 2017, are already obsolete. The ABA will release the complete data soon, but individualized employment reports are already available on schools' websites.

The USNWR prints the "employed" rate as "all jobs excluding positions funded by the law school or university that are full-time, long-term and for which a J.D. and bar passage are necessary or advantageous." It does not give "full weight" in its metrics to jobs that were funded by the law school. USNWR gives other positions lower weight, but those positions are not included in the ranking tables. And while it includes J.D.-advantage positions, there remain disputes about whether those positions are actually as valuable as bar passage required jobs. (Some have further critiqued the inclusion of solo practitioners in the bar passage required statistics.) Nonetheless, as a top-level category, I looked at these "full weight" positions.

The top chart is sorted by non-school-funded jobs (or "full weight" positions). The visualization breaks out full-time, long-term, bar passage required positions (not funded by the school); full-time, long-term, J.D.-advantage positions (not funded by the school); school-funded positions (full-time, long-term, bar passage required or J.D.-advantage positions); and all other outcomes. I included a breakdown in the visualization slightly distinguishing bar passage required positions from J.D.-advantage positions, even though both are included in "full weight" for USNWR purposes (and I still sort the chart by "full weight" positions).

The table below the chart breaks down the raw data values for the Classes of 2017 and 2018, with relative overall changes year-over-year. Here, I used the employment rate including school-funded positions, which USNWR used to print but no longer does; nevertheless, because there are good-faith disputes, I think, about the value of school-funded positions, I split the difference—I excluded them in the sorting of the bar graphs, and included them comparatively in the tables. The columns beside each year break out the three categories in the total placement: FTLT unfunded bar passage required ("BPR"), FTLT unfunded J.D. advantage ("JDA"), and FTLT law school funded BPR & JDA positions ("LSF"). This year, I also added the total graduates. (My visualization is limited because the bar widths for each school are the same, even though schools vary greatly in size, and that means raw placement might be more impressive considering class size.)
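As a concrete sketch of the arithmetic behind these tables, the placement rate used here is the sum of BPR, JDA, and LSF positions across schools, divided by total graduates. The figures below are the Illinois Class of 2018 rows from the table in this post; only the column layout and variable names are my own.

```python
# Sketch of the placement-rate arithmetic described above.
# Each tuple: (BPR, JDA, LSF, graduates) for one school's Class of 2018,
# taken from the Illinois table in this post.
rows_2018 = [
    (188, 4, 10, 206),   # Chicago
    (205, 12, 5, 229),   # Northwestern
    (118, 19, 0, 149),   # Illinois
    (119, 46, 0, 193),   # Loyola Chicago
    (149, 39, 0, 232),   # Chicago-Kent
    (126, 40, 0, 226),   # DePaul
    (151, 33, 1, 274),   # John Marshall
    (63, 11, 0, 110),    # SIU-Carbondale
    (43, 8, 0, 77),      # Northern Illinois
]

# Placed = all FTLT BPR + JDA + school-funded positions.
placed = sum(bpr + jda + lsf for bpr, jda, lsf, _ in rows_2018)
grads = sum(g for *_, g in rows_2018)
rate = 100 * placed / grads
print(f"{placed} placed / {grads} grads = {rate:.1f}%")  # 1390 / 1696 = 82.0%
```

The same computation, run on the Texas or Pennsylvania rows, reproduces the statewide figures quoted in those posts.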

The first state is Illinois (last year's visualization here). There were 1696 graduates statewide, a 3% decline from last year's class. The total placement rate among those graduates was 82% (including a few school-funded jobs). It is, once again, a slight improvement over last year, driven by the smaller class size. Placement in bar passage required jobs fell slightly again.

As always, if I made a mistake, please feel free to email me or comment; I confess there are always risks in data translation, and I am happy to make corrections.

| Peer Score | School | 2018 Pct. | YoY (pts) | BPR | JDA | LSF | Grads | 2017 Pct. | BPR | JDA | LSF | Grads |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4.7 | University of Chicago | 98.1% | 0.4 | 188 | 4 | 10 | 206 | 97.7% | 197 | 4 | 8 | 214 |
| 4.2 | Northwestern University (Pritzker) | 96.9% | 3.0 | 205 | 12 | 5 | 229 | 94.0% | 204 | 24 | 5 | 248 |
| 3.2 | University of Illinois-Urbana-Champaign | 91.9% | 5.3 | 118 | 19 | 0 | 149 | 86.6% | 111 | 12 | 0 | 142 |
| 2.6 | Loyola University Chicago | 85.5% | 8.0 | 119 | 46 | 0 | 193 | 77.5% | 138 | 34 | 0 | 222 |
| 2.6 | Illinois Institute of Technology (Chicago-Kent) | 81.0% | 7.0 | 149 | 39 | 0 | 232 | 74.0% | 133 | 32 | 0 | 223 |
| 2.2 | DePaul University | 73.5% | 3.9 | 126 | 40 | 0 | 226 | 69.6% | 126 | 34 | 0 | 230 |
| 1.7 | The John Marshall Law School | 67.5% | -4.1 | 151 | 33 | 1 | 274 | 71.6% | 161 | 43 | 0 | 285 |
| 1.6 | Southern Illinois University-Carbondale | 67.3% | 8.7 | 63 | 11 | 0 | 110 | 58.6% | 64 | 4 | 0 | 116 |
| 1.6 | Northern Illinois University | 66.2% | -10.9 | 43 | 8 | 0 | 77 | 77.1% | 42 | 11 | 1 | 70 |

February 2019 MBE bar scores bounce back from all-time lows

After cratering to all-time record lows last year, scores on the February administration of the Multistate Bar Exam have bounced back. It's good news, but modest: scores have returned to the February 2017 level, which at the time was the lowest in history. In other words, scores now match the second-lowest figure on record… which is slightly better.

To be fair (which is not to say I've been unfair!), part of this overall score is likely driven by the Uniform Bar Exam. It used to be that there were more test-takers who'd passed a previous bar exam and had to take another test in another jurisdiction. Those who'd already passed were likely to score quite well on a second attempt at a new bar. But the National Conference of Bar Examiners has indicated that the rise of the UBE has dropped the number of people taking a second bar, which in turn drops the number of high scorers, which in turn drops the mean MBE score. So the drop in MBE scores isn't itself entirely a cause for alarm. It's a reflection that the UBE is reducing the number of repeat test-takers by some small figure each year.

We now know the mean scaled national February MBE score was 134.0, up 1.2 points from last year's 132.8. We would expect bar exam passing rates to rise in most jurisdictions. Just as repeaters caused most of the drop last time, they are causing most of the rise this time. Repeaters' scores simply appear to be more volatile as a cohort of test-takers.

A couple of visualizations are below, long-term and short-term trends.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133. The trend is more pronounced when looking at a more recent window of scores.

The first major drop in bar exam scores was revealed to law schools in late fall 2014. That means the 2014-2015 applicant cycle, to the extent schools took heed of the warning, was a time for them to improve the quality of their incoming classes, leading to some expected improvement for the class graduating in May of 2018. But bar pass rates were historically low in July 2018. It’s not clear that law schools have properly adapted even after five years.

For now, we wait and see what the July 2019 exam brings. For more, see Karen Sloan over at NLJ.

My advice to students looking to enroll in classes

I offered a few thoughts on Twitter recently about advice to students looking to enroll in classes. It became popular advice, and then some people added nuance or qualifications, so I thought an extended discussion here might be warranted. While I teach at a law school and think specifically about that, the advice can work well for higher education generally.

1. Take the professor, not the course. In my seven years of higher education, I never regretted a course I took with a professor I liked in an area outside my specialties or interests; and all of my least-favorite courses were ones I felt I "had" to take or "ought" to take for one reason or another. The quality of the professor often makes or breaks a course. In my conversations with students about their favorite and least-favorite courses, the answer usually turns on the professor rather than the content of the course.

There is a risk that this becomes some kind of cult of personality around faculty. But I do think we are inclined to learn best from the people we best understand, or whose teaching style is most interesting to us.

There is a risk, too, that we ignore courses that are essential for our major or for an area of legal practice. But I don't worry too much about that, though it does give me some pause. For one, if you like all the faculty in a different area—criminal law when you want to be a corporate attorney, for instance—maybe you picked the wrong field or the wrong school…. And there are some courses that are simply unavoidable, often because they are required. And there are courses I took because I felt they would help my career and valued for that reason—federal courts and criminal procedure in anticipation of my clerkship, for instance (even though I did like the professors!). But I advise students to be cautious when thinking about course selection.

2. Find courses with writing and substantial revision requirements. Who hasn’t been the student relieved that they have no exams and only paper courses? But school—particularly, again, I think of law school—is a tremendous opportunity to improve one’s writing ability without the pressures of, say, a demanding client or boss frustrated with your writing ability! Writing opportunities, then, are terrific places to improve this craft. But it’s not just dumping words onto a page at the end of the semester. Find courses that also include revision requirements—a draft due early, a professor’s feedback about the piece’s strengths and weaknesses, and an opportunity to improve it. In law school, this is an essential component of legal research and writing. But finding such opportunities in the upper-division curriculum requires you to seek them out—and requires faculty willing to incorporate draft revision in the syllabus rather than simply expecting some paper at the end.

3. Pick a schedule with course times that help your self-discipline. I loved 8 am courses in school. In college, they helped keep me on a disciplined schedule and ensured I didn’t skip breakfast on my meal plan. In law school, they kept me and my wife (who worked) on similar schedules. I liked morning courses because I paid attention best then; I liked doing homework in the afternoon. I liked scheduling classes every day because it forced me to get into school every day to study. In short, I found out what worked best for me and made sure I planned schedules around it. Too often, it’s tempting to develop schedules around what is convenient. Convenience may be important, but self-discipline—developing habits that will help you avoid your own weaknesses or temptations, like procrastination or laziness—is crucial to future success.

4. Do not assume an elective will be offered next year: take it now. You’re looking at the schedule, and you see a neat course with a professor you like. But it’s an inconvenient time, or it runs up against a requirement in your discipline, or whatever it is. And you think, “Well, I’ll just take it next year.” Don’t do that. Don’t! Schedules are fickle things. Faculty lateral to another institution, go on sabbatical, visit elsewhere, take parental leave, or retire. There’s a deficiency in another area, so faculty give up the elective to help teach something else. Research interests shift. Low interest means the course isn’t offered again. There are a thousand reasons that there is no guarantee that next year will allow this course to return. So take it now (if you can).


There are of course many, many factors to consider when scheduling courses. (Many on Twitter have been suggesting other considerations, too.) But these are four of my most common pieces of advice and things that can help improve one’s experience.

The relationship between 1L class size and USNWR peer score

Professor Robert Jones has tracked the peer scores for law schools as reported by USNWR over time. That is, each year, each law school receives a peer score on a scale of 1 to 5, and we can see how that score has changed over time. He’s also tracked the average of all peer scores. That gives us an interesting idea—what do law professors think about the state of legal education as a whole? Perhaps schools are always getting better and we’d see peer scores climb; perhaps schools increasingly rate each other more harshly as they compete to climb the rankings and we’d see the peer scores drop. (There’s some evidence law faculties are pretty harsh about most law schools!)

At the risk of offering a spurious correlation, I noticed that the average score appeared to rise and fall with the conditions of the legal education market. The easiest way to track that would be to look at the overall 1L incoming class size the year the survey was circulated.

You can make a lot of correlations look good on two axes if you shape the two axes carefully enough. But there’s a good relationship between the two here. As the legal education market rose from 2006 to 2010 with increasingly large 1L class sizes, peer scores roughly trended upwards. As the market crashed through 2014, peer scores dropped. Now, the market has modestly improved—and peer scores have moved up much more quickly, perhaps reflecting optimism.

All this is really just speculation about why peer scores would change, on average, more than 0.1 in a single decade, or why they’d move up and down. Intuitively, the fact that peer scores may improve as the legal academy feels better about the state of legal education, or worsen as it feels worse, seems to make sense. There are far better ways to investigate this claim, but this relationship struck me as noteworthy!

A continuing trickle of law school closures

One year ago today—March 22, 2018—I reflected on the “trickle” of law school closures. Some campuses closed (Cooley’s Ann Arbor branch, Atlanta’s John Marshall’s Savannah branch), two schools merged into one (William Mitchell and Hamline), and others announced their closure (Indiana Tech, Whittier, Charlotte, and Valparaiso). In the last year, Arizona Summit and Western State have announced their closures.

Western State closing two years after Whittier is a remarkable turn for legal education in Orange County, California. Orange County, with more than 3 million residents, is one of the most populous and fastest-growing counties in the United States.

California has long had a number of state-accredited law schools, schools that do not have full ABA accreditation. Western State has been around since the 1970s but was not the first Orange County school to gain full ABA accreditation—that was Whittier in 1978. Western State joined newcomer Chapman as fully accredited in 1998. Then UC-Irvine was accredited in 2011. But now two of those four schools have closed.

While we are a long way from the recession, and while law school enrollment has stabilized (and slightly improved) over the last few years, there remain longstanding pressures on legal education, in part from the legacy of the recession—small class sizes can only be sustained so long, scholarships have increased to attract students, the transfer market has disproportionately impacted more marginal schools, lower credentials of incoming students have translated into systemic lower bar passage rates, and so on.

We may still see a few more closures in the years ahead—for-profit schools have borne the brunt of the closures so far, but we'll see what happens in the months to come.

The new arms race for USNWR law specialty rankings

The USNWR law "specialty" rankings long operated this way: schools would identify one faculty member whose specialty matched one of the various USNWR specialty categories (legal writing, trial advocacy, tax, etc.). USNWR would send a survey to those faculty asking them to list up to 15 of the top schools in those areas. USNWR would then take the top half of the schools that received a critical mass of votes and rank them by the number of votes received—just an ordinal rank with no vote totals listed. For many specialty areas, that meant 10 to 20 ranked schools. And for the other 180 to 190 schools, that meant blissful ignorance.

USNWR changed that methodology this year in a couple of ways. First, its survey asks voters to rank every school on the basis of this specialty on a scale of 1 to 5, similar to how the peer reputation survey works. Second, it ranks all the schools that received a critical mass of votes (i.e., about 10 votes—and most law professors are not shy about rating most schools). Third, it now lists that reputation score, ties and all.

The result is that almost all schools are ranked in almost all categories. And now your school might be 33d or 107th or 56th or something in a category.

The result in some categories is comical compression. A score of 2.0 (out of 5) gets you 91st in International Law, and a score of 1.0 (the bottom) gets you 177th. Ties are abundant—after all, there are usually at least 180 schools ranked, and given that the scale runs from 5.0 to 1.0, with virtually all schools falling between 4.0 and 1.0, there are going to be a lot of ties.
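The compression follows mechanically from how ties are handled. Assuming USNWR uses standard competition ranking, where a school's rank is one plus the number of schools with a strictly higher score (an assumption, but one consistent with the jumps described above), a short sketch with invented scores:

```python
# Standard competition ranking ("1-2-2-4"), which produces the abundant
# ties described above. Scores here are invented for illustration.
from collections import Counter

def competition_ranks(scores):
    """Rank = 1 + count of strictly higher scores; tied scores share a rank."""
    counts = Counter(scores)
    higher = {}   # score -> number of schools scoring strictly higher
    running = 0
    for s in sorted(counts, reverse=True):
        higher[s] = running
        running += counts[s]
    return [1 + higher[s] for s in scores]

scores = [3.4, 2.1, 2.1, 2.1, 1.8, 1.8, 1.0]
print(competition_ranks(scores))  # [1, 2, 2, 2, 5, 5, 7]
```

With roughly 180 schools squeezed into a scale that only spans 4.0 to 1.0 in 0.1 increments, large blocks of schools inevitably share a rank, and the rank right below a tie block can sit many places away.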

Worse, now schools can advertise their top X program, when X in the past typically wouldn’t drop past 10 to 20. Now, top 30, top 50, top 100 all earn bragging rights.

So now there's a new arms race. Schools know exactly where they sit in this year's survey, how tantalizingly close the next tranche of the ratings is (because of the ties), and how much higher that ranking would be (again, because of the ties). The temptation is to pepper prospective voters with more marketing materials in an ever-escalating race to climb a new set of specialty rankings. In the past, it was blissful ignorance for those below 20th. Today, it's all laid bare.

Perhaps I'm wrong. Maybe schools will mostly ignore the change to the specialty rankings. The compression and ties alone should cause most to ignore them. But I doubt it. The allure of rankings and the temptation of marketing departments to boast to prospective students and alumni about some figure (especially if that figure is higher than the overall USNWR rank) will, I think, overwhelm cooler heads.