Law school work product as a substitute for bar exams
Yesterday, I offered some thoughts about the bar exam and the potential solutions to delays caused by Covid-19. Some solutions, like “emergency diploma privilege,” primarily focus on in-state graduates of the Class of 2020 of ABA-accredited law schools taking the bar for the first time. I mused that there are several cohorts of bar exam test-takers that state bar licensing authorities must consider when adjusting their bar exam policies.
One idea I’ve mulled over is, admittedly, imperfect and half-baked, and it’s labor intensive—but it opens up opportunities for essentially all comers (with some caveats) to get admitted to a jurisdiction.
In short: state bars could review portfolios of prospective attorneys’ work product (from law school, supervised practice, or actual practice) and determine whether the candidate meets the “minimum standards” to practice in the jurisdiction.
Okay, in length: this isn’t easy. But graduates of all ABA-accredited law schools are required to take a legal writing course and an upper-division writing course. Written work product is assuredly something that comes out of every law school.
Students also commonly take a core set of courses covering subjects tested on the bar exam and sit for written exams in those courses—essays examining issues of Civil Procedure, Evidence, Contracts, and Property.
Law school graduates also spend their summers—or some semesters—working with legal employers and developing written work product. They work in clinics or externships during the semester, and they write memos or other work product.
State licensing authorities, then, could require prospective applicants to develop portfolios of written work product. State bar graders could then go through the standard “calibration” exercises they usually do to “grade” written work product. Multiple “graders” would look at each component of the work product.
Now, there are huge problems with this, I know, and I’ll start with a few. First and foremost is the lack of standardization. Not everyone does the same writing assignments in law school, much less the same types of assignments. Exam essays typically lack context (e.g., they don’t have a “statement of facts” the way a typical legal writing assignment would). Not everyone spends a summer in the practice of law (some work in, say, finance), and not everyone can share written work product (e.g., an extern for a judge who doesn’t want work product disclosed). There’s no scaling and equating, as there is with the bar exam, to improve reliability. And grading would take quite some time.
In the alternative, state licensing authorities could authorize “supervised practice” (Proposal 6 in the working paper on bar exam alternatives), and applicants could later submit the work product from that supervised practice to the licensing authority to supplement their law school work product.
But an advantage of this proposal, I think, is that the written product is what we’d expect of attorneys and a good measure of their ability. Law school grades (i.e., mostly assessments of written work product) strongly correlate with bar exam scores and bar exam success. It would extend to in-state or out-of-state applicants, to graduates of ABA-accredited schools or others, to foreign-trained attorneys, and to practitioners licensed in other states. It could even apply to those who’ve failed the bar exam before—if they’re judged on their work to be “minimally qualified,” all the better.
I toss it out as one possible solution that requires little additional or new work on the part of prospective applicants to the bar, that judges them on something relevant to their ability to engage in legal analysis, and that mitigates concerns around different cohorts of applicants to the bar.
Maybe it’s too much work, and the disparities in the types of work product too vast, for us to consider. At the same time, federal judges commonly evaluate clerkship applicants through an open-ended review of written work product. Perhaps there’s something to be said for looking at past written work product.