Visualizing legal employment outcomes in New York in 2014

Following up on posts about California and about DC-Maryland-Virginia, here are outcomes for law schools in New York. (Details about the methodology, and the USNWR methodology basis, are in the California post.) The chart is sorted by non-school-funded jobs (or USNWR "full-weight" positions). The table below the chart breaks down the raw data values for the Classes of 2013 and 2014, with relative overall changes, and is sorted by total placement (as USNWR prints it). The raw data (and overall percentages) include all full-time, long-term, bar passage-required and J.D.-advantage positions, with a parenthetical indicating the total number of school-funded positions.

Employment outcomes improved: 73.1% of graduates secured full-weight (non-school-funded) positions, up from 68.6% for the Class of 2013. That's likely almost exclusively due to the reduction in class size: these 15 schools went from 5009 graduates in the Class of 2013 to 4529 in the Class of 2014. There were just 93 school-funded positions, down from 102, and almost all of them came from Columbia, NYU, and Cornell. More granular data (e.g., breakdowns between bar passage-required and J.D.-advantage positions) is available at each school's website and forthcoming in spreadsheet format from the ABA.

Peer score School 2014 YoY (pts) raw (funded) 2013 raw (funded)
4.5 New York University 96.7% 1.1 463 (39) 95.5% 513 (42)
4.2 Cornell University 96.3% 6.7 184 (11) 89.6% 173 (16)
4.6 Columbia University 95.7% -1.3 448 (31) 97.0% 424 (29)
3.2 Fordham University 74.1% 5.9 340 (0) 68.2% 328 (0)
2.2 St. John's University 73.9% 6.5 190 (0) 67.3% 208 (1)
1.7 Albany Law School 72.5% 4.2 148 (0) 68.4% 134 (2)
2.2 Syracuse University 72.0% 5.4 152 (0) 66.7% 136 (0)
2.1 Hofstra University 71.0% 10.5 225 (4) 60.5% 193 (0)
2.2 University at Buffalo-SUNY 69.6% 0.1 133 (0) 69.5% 162 (0)
1.9 Pace University 68.7% 15.9 149 (5) 52.7% 155 (7)
2.7 Cardozo School of Law 67.1% 5.5 263 (1) 61.6% 245 (0)
1.5 Touro College 64.3% 6.9 126 (0) 57.4% 132 (0)
1.8 New York Law School 64.1% 5.7 266 (2) 58.4% 328 (5)
2.5 Brooklyn Law School 63.9% -5.2 244 (0) 69.0% 330 (0)
2.0 City University of New York 50.7% -5.1 76 (0) 55.8% 77 (0)

UPDATE: This post has had a small data error corrected.

Visualizing legal employment outcomes in DC-Maryland-Virginia in 2014

Following up on my post regarding California employment outcomes, here are outcomes for law schools in the District of Columbia, Maryland, and Virginia. (Details about the methodology, and the USNWR methodology basis, are there.) The chart is sorted by non-school-funded jobs (or "full-weight" positions). The table below the chart breaks down the raw data values for the Classes of 2013 and 2014, with relative overall changes, and is sorted by total placement (as USNWR prints it). The raw data (and overall percentages) include all full-time, long-term, bar passage-required and J.D.-advantage positions, with a parenthetical indicating the total number of school-funded positions.

Total jobs in these bar passage-required and J.D.-advantage positions declined slightly, from 3207 to 3119, due in part, I would assume, to a significant decline in school-funded positions, from 357 to 271. But there were about 260 fewer graduates, down from 4253 to 3992, which means that overall prospects improved for graduates: the overall employment rate was 78.1% (including all funded positions). More granular data is available at each school's website and forthcoming in spreadsheet format from the ABA.

Peer score School 2014 YoY (pts) raw (funded) 2013 raw (funded)
4.3 University of Virginia 96.6% -0.4 337 (34) 97.0% 353 (59)
3.4 George Washington University 89.2% 0.2 521 (78) 89.1% 537 (89)
4.1 Georgetown University 87.2% -2.5 546 (72) 89.8% 579 (80)
3.2 William and Mary Law School 82.3% -1.5 177 (24) 83.9% 182 (48)
2.5 University of Richmond 81.9% 10.0 122 (0) 71.8% 102 (0)
2.7 George Mason University 79.9% 5.4 147 (7) 74.5% 190 (7)
2.9 University of Maryland 75.3% 7.6 223 (2) 67.7% 197 (16)
3.1 Washington and Lee University 74.8% 11.2 95 (1) 63.6% 91 (0)
2.1 Catholic University of America 70.9% 2.5 127 (0) 68.5% 163 (0)
2.0 University of Baltimore 70.4% 5.8 221 (0) 64.6% 201 (0)
2.8 American University 70.2% 9.7 323 (48) 60.6% 307 (54)
2.3 Howard University 65.5% -0.5 74 (1) 65.9% 91 (1)
1.3 Regent University 63.1% -1.4 77 (0) 64.5% 89 (0)
1.2 Liberty University 56.6% 10.3 43 (1) 46.2% 43 (2)
1.4 District of Columbia 44.7% 3.4 46 (2) 41.3% 33 (1)
1.2 Appalachian School of Law 42.1% -13.6 40 (1) 55.7% 49 (0)

Visualizing legal employment outcomes in California in 2014

As is the case every year, U.S. News & World Report releases its rankings with data that becomes obsolete in a matter of weeks. Its rankings include the Class of 2013 employment data, but schools have already released their Class of 2014 employment data. The ABA has not yet released its spreadsheet of that data, but individual schools have posted it. I thought I'd recreate last year's look at California law school employment outcomes, with a couple of tweaks due to external changes.

Complaints from law school deans in California were heeded: employment outcomes are now reported at ten months after graduation instead of nine. But the decline in bar passage rates for the Class of 2014 meant, in all likelihood, a more challenging environment.

The USNWR methodology also changed slightly. It still prints the "employed" rate as "the percentage of all graduates who had a full-time job lasting at least a year for which bar passage was required or a J.D. degree was an advantage." But this year, it announced that it would not give "full weight" in its internal ranking metric to jobs funded by the law school. (Many law schools have migrated toward funding these types of positions; we'll see whether that was motivated by the USNWR methodology and, if so, whether any schools react by rolling back those programs.) USNWR gives other positions lower weight, but those positions are not included in the ranking tables. And while it includes J.D.-advantage positions, there remain disputes about whether those positions are really as valuable as bar passage-required ones.
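To make that weighting change concrete, here is a minimal sketch, in Python, of the difference between the printed "employed" rate and a "full-weight" rate that sets school-funded jobs aside. This is my own illustration, not USNWR's actual formula, and the figures are hypothetical.

```python
# Minimal sketch (not USNWR's actual code): the printed "employed" rate counts
# every full-time, long-term BPR/JD-advantage job; the "full-weight" rate sets
# school-funded jobs aside. Figures are hypothetical.

def employment_rates(ft_lt_jobs: int, school_funded: int, graduates: int):
    """Return (total_rate, full_weight_rate) as percentages."""
    total_rate = 100 * ft_lt_jobs / graduates
    full_weight_rate = 100 * (ft_lt_jobs - school_funded) / graduates
    return round(total_rate, 1), round(full_weight_rate, 1)

# Example: 180 qualifying jobs, 10 of them school-funded, 200 graduates.
print(employment_rates(180, 10, 200))  # (90.0, 85.0)
```

The difference between the two rates corresponds to the school-funded positions reported in the parentheticals in the tables below.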

A few things jump out from the data. First, there were fewer graduates: about 450 fewer from California schools, falling from 5185 for the Class of 2013 to 4731 for the Class of 2014.

Second, total job placement remained flat. Between 2800 and 2900 California graduates have obtained unfunded positions in each of the last three years. This year, 2849 obtained these unfunded, full-weight positions, good for 60.2% of California graduates--a better percentage than in previous years, no doubt because of the smaller graduating classes.

Third, school-funded positions continue to rise. There were 145 school-funded positions from California schools. Two schools significantly increased school-funded positions: USC went from 12 to 33 (15% of the graduating class), and UC-Irvine went from 0 to 13 (14% of the graduating class). Davis added 9, and Stanford and Loyola each added 5 more school-funded positions, among other more modest changes. (Keep in mind that the Class of 2012 had just 24 such school-funded positions among California schools.)

Below is a graph of the unfunded and funded full-time, long-term, bar passage-required and J.D.-advantage positions. The table below that reflects the same data: the 2016 USNWR peer score, the percentage of graduates in full-time, long-term, bar passage-required and J.D.-advantage positions, and the year-over-year increase or decline, in percentage points, from the 2013 rate. It then lists the raw number of students who obtained such positions, with a parenthetical noting how many of those positions were school-funded. The same is listed for 2013.
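As a rough illustration of how each row is assembled (a sketch with made-up numbers, not any school's actual data), the year-over-year figure is just the difference, in percentage points, between the two years' rates:

```python
# Sketch of one table row: rate, year-over-year change in points, and raw
# placements with the school-funded count in parentheses. Numbers are invented.

def table_row(school, rate_2014, raw_2014, funded_2014, rate_2013, raw_2013, funded_2013):
    yoy = round(rate_2014 - rate_2013, 1)  # change in percentage points, not percent
    return (f"{school} {rate_2014:.1f}% {yoy:.1f} {raw_2014} ({funded_2014}) "
            f"{rate_2013:.1f}% {raw_2013} ({funded_2013})")

print(table_row("Hypothetical Law School", 72.4, 150, 3, 68.1, 160, 5))
# Hypothetical Law School 72.4% 4.3 150 (3) 68.1% 160 (5)
```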

Peer score School 2014 YoY (pts) raw (funded) 2013 YoY (pts) raw (funded)
4.4 CALIFORNIA-BERKELEY, UNIVERSITY OF 95.5% 5.1 274 (20) 90.4% 2.3 272 (25)
4.8 STANFORD UNIVERSITY 93.6% 0.8 180 (10) 92.8% -3.9 180 (5)
3.9 CALIFORNIA-LOS ANGELES, UNIVERSITY OF 87.5% 5.3 294 (32) 82.2% 5.1 273 (34)
3.5 SOUTHERN CALIFORNIA, UNIVERSITY OF 85.7% 14.7 186 (33) 71.0% -1.4 169 (12)
3.0 CALIFORNIA-IRVINE, UNIVERSITY OF 84.9% 18.3 79 (13) 66.7% -19 56 (0)
3.3 CALIFORNIA-DAVIS, UNIVERSITY OF 81.7% 8.2 138 (19) 73.5% 5.6 144 (10)
2.5 LOYOLA LAW SCHOOL-LOS ANGELES 71.0% 11.8 281 (10) 59.1% 10.4 230 (5)
2.6 PEPPERDINE UNIVERSITY 59.6% -5.2 118 (1) 64.8% 6.6 138 (0)
1.9 MCGEORGE SCHOOL OF LAW 59.4% 12.5 111 (1) 46.9% 3.1 149 (3)
2.6 SAN DIEGO, UNIVERSITY OF 58.1% -2.0 155 (1) 60.1% 8.4 191 (0)
3.1 CALIFORNIA-HASTINGS, UNIVERSITY OF 57.7% 10.5 232 (2) 47.2% -4.5 176 (2)
1.5 CALIFORNIA WESTERN SCHOOL OF LAW 57.1% 15.4 125 (0) 41.6% -7.8 117 (0)
1.9 SOUTHWESTERN LAW SCHOOL 55.9% 3.9 175 (1) 52.0% -4.5 156 (0)
1.8 CHAPMAN UNIVERSITY 55.1% 9.4 76 (0) 45.7% -2.1 85 (0)
nr LA VERNE, UNIVERSITY OF 50.0% 9.3 22 (0) 40.7% 4.2 35 (1)
2.4 SANTA CLARA UNIVERSITY 47.9% -8.3 125 (0) 56.2% -0.2 181 (1)
2.1 SAN FRANCISCO, UNIVERSITY OF 46.7% -0.8 92 (0) 47.5% 14.9 95 (1)
1.1 WESTERN STATE COLLEGE OF LAW 45.3% 1.4 68 (0) 43.9% 4.1 54 (0)
1.4 WHITTIER LAW SCHOOL 43.8% 13.3 85 (0) 30.5% -15.4 64 (0)
1.3 THOMAS JEFFERSON SCHOOL OF LAW 41.0% 0.0 120 (0) 41.0% 4.8 120 (0)
1.6 GOLDEN GATE UNIVERSITY 31.7% 2.7 58 (2) 28.9% 1.8 66 (1)

The wrong sort of law school applicants, visualized

I have just been thinking, and I have come to a very important decision. These are the wrong sort of bees.

Are they?

Quite the wrong sort. So I should think they would make the wrong sort of honey, shouldn’t you?

Would they?

Yes. So I think I shall come down.
— A.A. Milne, Winnie-the-Pooh (1926)

There's good news and bad news for law schools. The good news is that total law school applicants appear to be reaching the bottom. After projections last year that the worst may be yet to come, it appears that the Class of 2018 will have only slightly fewer applicants than the Class of 2017. Current projections are for a drop of about 2.8% in applicants, and that gap may narrow if the recent trend of late applicants continues.

Of course, it doesn't take much effort to notice that despite the bottoming out, it's still very much below recent trend, both for projected applicants and projected matriculants.

But the bad news is the quality of those applicants. In short, the wrong sort of applicants are applying.

LSAC sends occasional updates, its "Current Volume Summary," describing trends in applicants. LSAC reports that at this point last year it had 87% of the preliminary final applicant count, which means we are fairly late in the applicant cycle.

It also circulates a breakdown of 2015 ABA applicants by high LSAT score, with the percentage change from last year. The numbers are, I think, fairly shocking.

While the raw totals of applicants are largely unchanged (i.e., down slightly), the quality of those applicants is down fairly significantly. The only bands that have seen an increase in applicants are those scoring 144 or lower. Applicants scoring 145 to 149 are largely flat. In contrast, applicants with a high score of 165 or higher are down by double digits, with the steepest decline among the most elite applicants.

Granted, there are only a handful of those with a high score of 175 or higher (this year, just 481 applicants). But the numbers do reflect the predicament law schools find themselves in. They can certainly fill their classes with applicants from a pool of comparable size. But the composition of the pool is worse than it was even last year. And as schools confront lower-quality applicants, with lower predictors, they face the back-end problems of higher bar failure rates and, likely, worse employment outcomes.

So the first level of data tells us some good news for law schools--applicants are not down as much as one might have feared, and the bottom may be approaching. But the second level of data tells us some bad news--applicant quality has worsened, in such a way that law schools are still confronted with as difficult a choice as if they had simply experienced an overall drop in applicants. How schools react remains to be seen.

"Next round in LSAT disability fight"

Last year, I blogged twice about an agreement between the Law School Admission Council and the Department of Justice regarding accommodated LSAT test-takers. Under the agreement, LSAC agreed to stop "flagging" the scores of accommodated test-takers and to ensure additional opportunities for accommodated test-taking. Among other things, I noted:

LSAC wants to provide scores highly predictive of first-year law school grades. On that, it does a very good job--it is the best predictor of first-year grades; it is an even better predictor when combined (with an appropriate formula) with an undergraduate GPA. But the settlement means that LSAC must now provide both these scores, and scores that are less predictive (i.e., accommodated scores, which are not as predictive of first-year law school grades), without any indication to law schools about whether this score fits into one category or into another.

Now comes this piece by Karen Sloan in the National Law Journal. Details of the agreement have yielded disputes, including, unsurprisingly, this:

A spokeswoman declined to detail the council's objections, but issued a written statement citing potential damage to the test's ability to accurately predict who will succeed in law school. "We want to reiterate that we deeply respect the rights of disabled test-takers, but we cannot ignore the impact that certain of the recommendations would have on the overall integrity and fairness of the LSAT accommodation process," the council said.

There's much more to the story from several perspectives. But this crucial issue was, of course, entirely foreseeable.

The slow, steady decline of the LSAT

Imagine you had a tool to predict the future. You'd probably use it. A lot, in fact, especially if that tool predicted success in your industry.

Then, one day, you abruptly stop using that tool. It would probably mean some combination of the following: a better tool for predicting success; a decline in quality of that tool; some significant negative side effect from using that tool; a lack of concern for learning the predictive value offered by that tool; or an alternative advantage that might be gained only if the tool is not used.

For the LSAT, the latter four reasons explain the slow, steady decline in its use.

A decline in the quality of that tool

The LSAT has long been deemed an extremely reliable test. Reliable, in that it highly and consistently correlates with first-year law school grade point averages. (For numerous studies, see the LSAC reports.) It uses item response theory, which allows the scores to reflect similar quality over time--a 170 on each test looks roughly the same, regardless of the month or year in which the test is taken.
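For readers unfamiliar with item response theory, here is a minimal sketch of the two-parameter logistic model, the general family that underlies this kind of score comparability; the parameters below are invented for illustration and are not LSAC's.

```python
# Sketch of a two-parameter logistic (2PL) item response model. Each item has
# its own discrimination (a) and difficulty (b), so an examinee's ability can
# be estimated on a common scale across test forms. Parameters are invented.
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """Probability that an examinee with ability theta answers correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# The same examinee (theta = 1.0) facing an easier and a harder item:
print(round(p_correct(1.0, a=1.2, b=-0.5), 2))  # ~0.86
print(round(p_correct(1.0, a=1.2, b=1.5), 2))   # ~0.35
```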

The LSAT is even better when combined with a prospective law student's undergraduate GPA. And, if a school so desires, it can obtain from LSAC an optimal "index formula" that weights LSAT and UGPA appropriately to best fit the school's first-year grade distribution.
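As a rough sketch of what such an index formula involves (my own illustration with fabricated numbers, not LSAC's validated methodology), think of the weights as a least-squares fit of first-year GPA on LSAT and UGPA:

```python
# Illustrative index formula: fit first-year GPA as a linear combination of
# LSAT and UGPA. Data and resulting weights are fabricated for illustration.
import numpy as np

# Hypothetical applicants: (LSAT, UGPA) and their eventual first-year GPA.
X = np.array([[165, 3.8], [158, 3.5], [172, 3.9], [150, 3.2], [161, 3.6]], dtype=float)
y = np.array([3.4, 3.0, 3.7, 2.6, 3.2])

# Ordinary least squares: index = w_lsat * LSAT + w_ugpa * UGPA + intercept.
A = np.column_stack([X, np.ones(len(X))])
w_lsat, w_ugpa, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

def index_score(lsat: float, ugpa: float) -> float:
    return w_lsat * lsat + w_ugpa * ugpa + intercept

print(round(index_score(163, 3.7), 2))  # predicted first-year GPA for a hypothetical applicant
```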

The LSAT, however, has lost some of this quality.

For many years, schools generally disclosed and relied upon the average of an applicant's LSAT scores. LSAT studies, after all, revealed that the average, not the high or the low score, is the most predictive of the applicant's ability. In 2006, however, the American Bar Association decided to request that schools report applicants' high scores rather than their average scores. Despite the lower predictive value of the high score, schools have increasingly pursued these high-end scores.

Additionally, LSAC recently entered a consent decree requiring it to stop flagging LSAT scores earned through accommodated test-taking and to make it easier to secure accommodations. Because LSAC has found only that scores earned under standard conditions are reliable, the consent decree means that the LSAT scores schools obtain will have lower predictive value.

Some significant negative side effect from using that tool

When U.S. News & World Report calculates its law school rankings, one-eighth of a school's entire score is based on a single LSAT figure: the median of the incoming class. This creates significant distortions in how law schools assemble incoming classes. Schools pursue that median LSAT score instead of the more predictive index score they might otherwise use. Even more troubling, LSAT takers are fewer and fewer, making those scores more difficult to obtain.

As a result, schools have an incentive to avoid the negative side effect of a decline in their LSAT median, which might mean a decline in their USNWR rank. And so, as recent reports indicate, schools have begun to admit a non-trivial number of students without that score. Really, the trend is not new but several years old--it began with new interpretations of regulations that permit alternative metrics, such as SAT or ACT scores, to evaluate incoming students.

Of course, there's no data indicating how reliably SAT or ACT scores correlate with first-year law school grades, or how to index those scores with undergraduate GPA for an even more reliable picture. But the negative externality--the risk of a median decline and a corresponding USNWR hit--is too great a cost. (You'll note, then, that the use of SAT or ACT scores is not, as one might say, a "better tool for predicting success." It is not a tested method at all.)

A lack of concern for learning the predictive value offered by that tool

It might have been the case that the LSAT was valued by admissions departments because it was a way of guessing success. Better students would be at a lower risk of dropping out or failing out. Better students would have a better chance at passing the bar and earning desirable employment outcomes.

But if those metrics matter less than other concerns--such as today's LSAT profile for an incoming class, as against the profile of a graduating class in three years or its employment profile in four years--then schools push them aside. It's not that schools are unconcerned with first-year student success--they undoubtedly are concerned. It is simply that such concerns necessarily recede as the obsession with an LSAT median--rather than the depth of the class, as the abrupt decline in the 25th percentile at many schools suggests--is heightened.
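A toy example (with invented scores, not any school's actual profile) shows why: the reported median is insensitive to what happens in the bottom quarter of the class.

```python
# Toy illustration: the reported median LSAT stays put even as the bottom of
# the class weakens. All scores are invented.
import statistics

earlier_class = [168, 166, 165, 163, 161, 159, 157, 156]
later_class = [168, 166, 165, 163, 161, 155, 151, 148]  # same top half, weaker bottom

for label, scores in [("earlier class", earlier_class), ("later class", later_class)]:
    q25 = statistics.quantiles(scores, n=4)[0]  # 25th percentile
    print(label, "median:", statistics.median(scores), "25th percentile:", q25)
# Both classes report a 162.0 median, but the 25th percentile falls from 157.5 to 152.0.
```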

An alternative advantage that might be gained only if the tool is not used

These are, of course, rather rankings-centric views. But there's also an advantage to be gained in refusing to use LSAT scores for prospective students. If a school is one of the only, or one of the few, doing so, it is a very strong enticement for the, let's face it, lazy prospective law student: forgo taking the LSAT, forgo opportunities at most other law schools in America, and effectively commit to a school without an LSAT requirement (assuming other metrics, like GPA and a "comparable" SAT or ACT score, have been met).

It's a decisive recruiting advantage, particularly for a law school seeking to attract candidates from its home undergraduate institution, a baked-in base likely inclined to attend the same law school anyway. Sure, students lose options elsewhere, but they save the time and financial cost of LSAT preparation and agony. It might be, of course, that this incentivizes all of the wrong sorts of students, but that might be a matter of perspective, depending on whether one views the LSAT as an unnecessary hoop or an objective measure of likely future performance.

*

The LSAT, then, is not abruptly dying. It has been experiencing nicks and scrapes for a decade now, and an increasing number of factors, both internal to LSAC and external to it in the market for legal education, have put it in a precarious position of slow and steady decline.

Pregaming the U.S. News 2016 law school rankings

Despite the pernicious effect U.S. News & World Report's law school rankings have upon legal education, they remain the most trusted resource for 21-year-olds seeking to set a course for the rest of their lives. Before the rankings are released next week, I thought I'd reshare three rankings I posted when the Class of 2013 employment data came out (data that will be incorporated into the forthcoming USNWR rankings). Law schools will likely report their Class of 2014 employment data in mid-April 2015.

Legal employment outcomes in California in 2013 (March 31, 2014)

Legal employment outcomes in 2013 (April 11, 2014)

Law school microranking: federal judicial clerkship placement, 2011-2013 (May 6, 2014)

Three charts to illustrate the present market for law schools

We often read about the "crisis" in legal education and the "drastic" steps that law schools are taking. All that said, I'm actually surprised that law schools are taking such modest steps in the face of fairly long-term declines. That is, given current trends, the Class of 2018 is likely to be a still smaller group of students than the 40-year low of the Class of 2017. And even if this is the bottom, even a rebound would likely not return law schools to any sense of "normalcy" until 2020. Yet we see very few schools reacting with the serious, long-term focus one might expect.

Below are three charts illustrating the total LSAT takers, total JD applicants, and total JD matriculants from 2004-2014, with a projection for 2015 (i.e., the Class of 2018) based on presently-available data. (Data derived from LSAC and ABA resources.)
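For what it's worth, the projection mechanics can be sketched simply (this is my assumption about the approach, not LSAC's published method): scale the year-to-date applicant count by the share of the final total that had arrived by the same date in the prior cycle, the kind of figure LSAC's volume summaries report.

```python
# Minimal sketch (an assumption, not LSAC's published method) of projecting an
# end-of-cycle applicant total from a partial count: divide the year-to-date
# number by the share of last cycle's final count that was in hand by this date.

def project_final_applicants(applicants_to_date: int, prior_share_complete: float) -> int:
    """Estimate the end-of-cycle applicant total from a partial count."""
    return round(applicants_to_date / prior_share_complete)

# Hypothetical: 46,000 applicants so far, with 87% of the prior cycle's final
# count having arrived by the same date last year.
print(project_final_applicants(46_000, 0.87))  # ~52,874
```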

Ranking the Law School Rankings, 2015

On the heels of the first-ever ranking of law school rankings, and last year's second edition, here's the third edition.

The rankings tend to measure one of, or some combination of, three things: law school inputs (e.g., applicant quality, LSAT scores); law school outputs (e.g., employment outcomes, bar passage rates); and law school quality (e.g., faculty scholarly impact, teaching quality). Some rankings prefer short-term measures; others prefer long-term measures.

Lest anyone take these rankings too seriously: there is no rigorous methodology here. It's largely my idiosyncratic preference about which rankings I think are "better" or "worse."

And, as always, I'll decide what rankings to rank. I've removed a couple and added a couple. The year listed is the year the ranking was last updated (not the self-described year of the ranking).

1. NLJ 250 Go-To Law Schools (2014): It's a clear, straightforward ranking of the percentage of graduates from each school who landed a position at an NLJ 250 law firm last year. It does not include judicial clerkships, or elite public interest or government positions, but it is perhaps the most useful metric for elite employment outcomes. As a methodological point, only 178 firms answered the survey, and NLJ relied on its database and independent reporting to supplement. To its great advantage, it includes many interactive charts of the data it has.

2. Sisk-Leiter Scholarly Impact Study (2012): The study has not been updated in a few years, but it's still useful for what it does. Drawing upon the methodology from Professor Brian Leiter, it evaluates the scholarly impact of tenured faculty in the last five years. It's a measure of the law school's inherent quality based on faculty output. In part because peer assessment is one of the most significant categories for the U.S. News & World Report rankings, it provides an objective quantification of academic quality. Admittedly, it is not perfect, particularly as it is not related to law student outcomes (of high importance to prospective law students), but, nevertheless, I think it's a valuable ranking.

3. Princeton Review Rankings (2014): Despite a black box methodology that heavily relies on student surveys, the series of rankings gives direct and useful insight into the immediate law school situation. It is admittedly not comprehensive, which I think is a virtue.

4. Above the Law Rankings (2014): The methodology is heavily outcome-driven (and perhaps driven by an outcome in mind). It relies on a very narrow "employment score" (full-time, long-term, bar passage required, excluding solo practitioners and school-funded positions). It conflates "tuition" with "cost," and it relies heavily on a couple of narrow categories (e.g., Supreme Court clerks). But it's a serious and useful ranking.

5. Enduring Hierarchies in American Legal Education (2013): Using many metrics, this study evaluates the persistence of the hierarchies among law schools. Little has changed over the last several decades in which law schools are regarded as high quality. This study tries to identify the traits underlying those hierarchies, and it categorizes the schools into various tiers.

6. Law School Transparency Score Reports (2013): It's less a "ranking" and more a "report," which means it aggregates the data and allows prospective students to sort and compare. The data is only as useful as what's disclosed--and so while it provides some utility, it's limited by the limited disclosures.

7. Witnesseth Boardroom Rankings (2014): Professor Rob Anderson's analysis is extremely limited: it evaluates which law school graduates end up as directors or executive officers at publicly held companies. But I think it gives a nice data point in an area that's under-discussed: law school graduates, after all, may find success in business and not simply in the practice of law.

8. Roger Williams Publication Study (2013): It selects a smaller set of "elite" journals and ranks schools outside the U.S. News & World Report "top 50." There are a few issues with this, as it relies on a fixed data set of "top 50" journals established years ago, and as it hasn't been updated in a couple of years, but, given its narrow focus, I think it does a nice job filling in some gaps left by the Sisk-Leiter study.

9. AmLaw BigLaw Associates' Satisfaction (2014): It surveys associates for how well their law schools prepared them for firm life. It highly correlates with job satisfaction. It's a nice, small post-graduate measure of law schools.

10. PayScale Rankings by Mid-Career Salary (2014): While this survey mixes all graduate schools together, and while it has some obvious selection bias in the reported salary data, it's another rare ranking that attempts to evaluate mid-career employment outcomes--an under-evaluated area--which makes this study worth considering.

11. QS World University Rankings (2014): I think this ranking tends toward comparing apples, oranges, kumquats, rhododendrons, and lichen: all living things, but extremely hard to compare. But its use of h-index and citations per paper increases the objectivity of this academic-driven ranking.

12. SSRN Top 350 U.S. Law Schools (2015): The total new downloads give you an idea of the recent scholarship of a faculty--with an obvious bias toward heavy-hitters and larger faculties.

13. U.S. News & World Report (2014): I've said before that this ranking isn't placed so low because it's so bad. Over time, I've concluded that, no, it is placed so low because this ranking is bad. It relies heavily on a few metrics that do not measure anything meaningful. It distorts student quality by incentivizing pursuit of the median LSAT and UGPA at the expense of all other quality factors, especially the bottom quartile of the class; it rewards silly categories like high spending and library resources; it prints metrics unrelated to its ranking formula; its "lawyers/judges assessment score" has a notoriously low response rate; peer academic assessment scores have deflated over time as schools sandbag one another; and so on. Yes, these rankings are exceedingly influential. But they are pretty poor. They may mostly get the "right" results, but for all the wrong reasons.

14. Tipping the Scales (2015): The metrics are simply a bit too ad hoc--and that's saying something for a ranking that comes in behind U.S. News & World Report. The factors are idiosyncratic and, while they reflect a superficial appreciation of things like student quality and outputs, the measures used (salary data, which is inherently bimodal and notoriously underreported; acceptance rates, which are not uniform indicators of quality; etc.) do not amount to a serious appreciation of those things.

15. PreLaw Magazine Best Law School Facilities (2014).

16. GraduatePrograms.com Top Law Schools for Social Life (2014).