Earlier, I blogged about the disconcerting conclusion, drawn from recent bar performance and the results of a California State Bar study, that law school “bar prep programs” appear to have no impact on students’ ability to pass the bar exam.
But what about specific substantive course areas? Does a student’s performance in, say, Torts translate into a stronger bar exam score?
The answer? Probably not.
First, let me clear a little underbrush about what claim I’d like to examine. We all know that students take some subjects that appear on the bar, but most don’t take all of them. Virtually all law school graduates take a specific bar preparation course offered by a for-profit company to help train them for the bar exam.
But law schools might think that they could improve bar passage rates by focusing not simply on “bar prep,” but on the substantive courses that will be tested on the bar exam. If bar passage rates are dropping, then curricular reform requiring students to take more Evidence, Torts, or Property might be a perceived solution.
So what exactly is the relationship between substantive course area performance and the bar exam? Not much.
Back in the 1970s, LSAC commissioned a study looking at law schools in several states and their performance on the bar exam. The then-new Multistate Bar Exam had five subjects. Researchers looked at how law students performed in each of those substantive subject areas in law school: Contracts, Criminal Law, Evidence, Property, Torts. (The results of the study are found at Alfred B. Carlson & Charles E. Werts, Relationships Among Law School Predictors, Law School Performance, and Bar Examination Results, Sep. 1976, LSAC-76-1.)
The LSAC study examined first-year subject-area grades; first-, second-, and third-year grades; and overall law school GPA, and their correlations with MBE subject-area scores. The higher the number, the stronger the relationship.
Torts is an illustrative example. The relationship between TORT/L (grades in Torts) and students’ performance on the Torts portion of the MBE is 0.19, a relatively weak correlation. But grades in Torts were more predictive of performance in Real Property, Evidence, Criminal Law, and Contracts, perhaps a counterintuitive finding. That is, your Torts grade told you more about your performance on the Property portion of the bar exam than on the Torts section.
Again, these correlations are relatively weak, ranging from roughly 0.19 to 0.26, so one shouldn’t draw much from differences within that noise.
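One back-of-the-envelope way to see why differences like 0.19 versus 0.26 are within the noise is to put a confidence interval around a correlation using Fisher’s z-transform. This is a sketch under an assumed, hypothetical sample size of 200 students (the study’s actual sample size isn’t given here):

```python
import math

def fisher_ci(r, n, z_crit=1.96):
    """95% confidence interval for a Pearson correlation via Fisher's z-transform."""
    z = math.atanh(r)            # transform r to an approximately normal scale
    se = 1 / math.sqrt(n - 3)    # standard error of the transformed value
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# Hypothetical n = 200: the interval around r = 0.19 comfortably contains 0.26,
# so the two correlations can't be meaningfully distinguished.
lo, hi = fisher_ci(0.19, 200)
print(f"95% CI for r = 0.19, n = 200: ({lo:.2f}, {hi:.2f})")
```

The point is only directional: at sample sizes typical of a single school or cohort, correlations this weak overlap heavily.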
In contrast, LGPA/L (overall law school GPA) was more highly correlated with each MBE subject area than any individual course grade was, and highly correlated (0.55) with total MBE performance. Recall that overall law school GPA includes a number of courses—bar related and not—and that it’s more predictive than any particular substantive course area.
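To put those magnitudes in perspective, squaring a correlation gives the share of variance the predictor accounts for. The correlations below are the study’s; the framing is mine:

```python
# r^2 is the fraction of variance in bar performance "explained" by the predictor.
for label, r in [("Torts grade vs. MBE Torts", 0.19),
                 ("Overall law school GPA vs. total MBE", 0.55)]:
    print(f"{label}: r = {r}, r^2 = {r * r:.2f} ({r * r:.0%} of variance)")
```

A 0.19 correlation accounts for under 4% of the variance in the Torts score; overall GPA at 0.55 accounts for roughly 30% of the variance in total MBE performance, which is why the GPA result dwarfs any single-course result.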
The LSAC study dug further into the findings and concluded that the bar exam tests “general legal knowledge,” and that performance in any particular subject area in law school is not particularly indicative of performance on that subject area on the bar exam.
The short of it is that this is good evidence that the important thing coming out of three years of law school is not the substantive transmission of knowledge but, for lack of a better phrase, the ability to “think like a lawyer” (or simply to engage in critical legal analysis). Bar prep courses the summer before the bar exam are likely the better place to cram the substantive knowledge for the bar; but the broad base of legal education is what’s being tested (perhaps imperfectly!) on the bar exam.
We also have the results of a recent study by the California State Bar. The study looked at student performance in particular course areas and the relationship with bar exam scores. After examining the results of thousands of students who took the bar in 2013, 2016, and 2017, the study reached almost identical findings.
The correlations between any one subject and that subject on the bar exam are modest, and sometimes a course grade is (slightly) more highly correlated with a different subject area, the same finding as LSAC’s 1976 study. But none of them is nearly as strong as overall law school GPA, which the study finds correlates between 0.6 and 0.7 with the overall MBE and written components. (Unfortunately, this study didn’t break out the relationship between law school GPA and particular MBE topic areas.)
The study did, however, make an interesting finding and reach what I think is an incorrect possible conclusion.
The study discovered that cumulative GPA in California bar exam-related subject areas (listed above) was highly correlated with cumulative GPA in non-California bar exam-related subject areas.
It went on to find no relationship (in some smaller sets of data) between bar passage rates and participation in clinical programs; externships; internships; bar preparation courses; and “Non-Bar Related Specialty Course Units” (e.g., Intellectual Property).
Here’s the finding I’d take issue with: “However, overall CBX [California bar exam] performance correlated more strongly statistically with aggregate performance in all of the bar-related courses than with aggregate performance in all non-bar-related courses, suggesting that there may be some type of cumulative effect operating.”
I’m not sure that’s the right conclusion to draw. I think the report understates the likelihood that grade inflation in seminar courses, greater inconsistency in grading in courses taught by adjuncts, and grades in courses that don’t measure the kinds of skills evaluated on the bar exam (e.g., oral advocacy in graded trial advocacy courses) all affect non-bar-related course GPA. That is, my suspicion is that if one were to measure GPA in other substantively-similar non-bar-related courses (e.g., Federal Courts, Antitrust, Secured Transactions, Administrative Law, Mergers & Acquisitions, Intellectual Property, etc.), one would likely find a relationship similar to that for bar-related course GPA. That’s just a hunch, and it’s what I’d love to see future reports examine.
That said, both in 1976 and in 2017, the evidence suggests that performance in a specific substantive course has little to say about how the student will do on the bar—at least, little unique to that course. Students who do well in law school as a whole do well on each particular subject of the bar exam.
When law schools consider how best to prepare their students for the bar, then, simply channeling students into bar-related subjects is likely ineffective. (And that’s not to say that law schools shouldn’t offer these courses!) Alternative measures should be considered. And I look forward to more substantive course studies like the California study in the future.