Useless information about the 2020 presidential election

Baseball writer Jayson Stark regularly writes articles from the “Useless Information Department,” filled with interesting, odd, bizarre, coincidental, or just plain silly statistics or factoids from the world of baseball. With a 120-year-plus history, a 162-game season, 250-plus pitches per game, and a lot more, there are always interesting connections to make.

I’ve occasionally tweeted out some of my own useless information from the 2020 presidential campaign, and I thought I’d turn that into a long-form blog post. Hang on…

* * *

Joe Biden ran—and won—as a vice presidential candidate in 2008 and 2012. But he opted not to run for president in 2016, choosing instead to run in 2020. Typically, a vice president who runs for president (and wasn’t elevated to the office of president due to a vacancy) runs either after his president stops running (e.g., Al Gore in 2000) or runs for the term immediately after losing an election seeking the office of vice president (e.g., Walter Mondale in 1984, who lost on the VP ticket in 1980).

Just four vice presidents have won a presidential election while serving as vice president—John Adams in 1796, Thomas Jefferson in 1800, Martin Van Buren in 1836, and George H.W. Bush in 1988.

So have any vice presidents skipped at least one term, then become president? Just one—Richard Nixon. After serving eight years as Dwight Eisenhower’s vice president (winning the 1952 and 1956 elections), Nixon ran for president in 1960 and lost. He then took 1964 off before running—and winning—in 1968. So even Nixon first tried immediately after his vice presidency, in 1960; it was only on his second try that he won. Nonetheless, Nixon remains the only person to win a presidential election who had served as vice president but was not serving in that office when he won.

If Biden wins the 2020 presidential election, then, he’d be just the second vice president—after Nixon—to win a presidential election while not serving as vice president. And unlike Nixon, 2020 would be his first run for president after serving as vice president.

* * *

Websites like FiveThirtyEight and 270toWin remind us that presidential candidates must secure a majority of votes in the Electoral College to win the presidential election. Obviously, Donald Trump won a bunch of electoral votes in 2016 (304 of them, to be exact). Joe Biden also won a bunch of electoral votes—365 in 2008 and 332 in 2012, both times as the vice presidential nominee. But they’re not alone.

Elizabeth Warren received two vice presidential electoral votes in 2016, one in Hawaii and one in Washington. And Bernie Sanders received one presidential electoral vote in 2016, in Hawaii. Both came from “faithless” electors, presidential electors ostensibly committed to support Hillary Clinton when the Electoral College convened in December 2016 but who ultimately cast votes for these candidates.

* * *

The last Democratic presidential nominee who attended neither Yale nor Harvard was Walter Mondale in 1984. (Yes, that means Michael Dukakis, Bill Clinton, Al Gore, John Kerry, Barack Obama, and Hillary Clinton all have ties to this sliver of the Ivy League.)

But none of the Democratic presidential frontrunners attended either. Joe Biden went to Delaware and Syracuse Law; Bernie Sanders went to Brooklyn College and Chicago. Dipping a little deeper into the candidates, Elizabeth Warren attended George Washington University, Houston, and Rutgers Law (although she did teach at Harvard Law). Kamala Harris went to Howard before attending Hastings Law. Of course, there’s a chance a candidate like Pete Buttigieg (Harvard/Oxford) or Cory Booker (Stanford/Oxford/Yale) pulls through and keeps the Harvard-Yale streak alive.

* * *

Democrats may also keep another education streak alive. Since 1984, every Democratic presidential and vice presidential nominee has attended law school—Walter Mondale, Geraldine Ferraro, Michael Dukakis, Lloyd Bentsen, Bill Clinton, Al Gore, Joe Lieberman, John Kerry, John Edwards, Barack Obama, Joe Biden, Hillary Clinton, and Tim Kaine.

Attended—all but one received a law degree, the exception being Gore, who dropped out of Vanderbilt Law School before completing his Juris Doctor.

The frontrunners are a mixture of lawyers (Biden, Elizabeth Warren, Kamala Harris, Cory Booker, Amy Klobuchar) and non-lawyers (Bernie Sanders, Pete Buttigieg, Beto O’Rourke)—but the safe money may be on the lawyers, and it may rest with a vice presidential nominee to break the streak.*

* * *

Before Trump, Ronald Reagan was the only 70-something to win a presidential election. That came with his 1984 re-election, and he was 73 when sworn in for his second term. Trump became the first 70-something to win a first term.

But we’re seeing a surge of septuagenarian candidates and may see that age record fall. The ages of some candidates as of the next inauguration day, January 20, 2021: Bernie Sanders, 79; Joe Biden, 78; Donald Trump, 74; Elizabeth Warren, 71.

For the record, Reagan was 77 years, 349 days when leaving office. The next-oldest president upon leaving office stands to be Trump: if he completes a second term, he’d be 78 years, 221 days, edging out Reagan. And the only other 70-something ever to serve in the office was Dwight Eisenhower, who left office at 70 years, 98 days.

The election of either Sanders (79 years, 134 days as of January 20, 2021) or Biden (78 years, 61 days) would immediately make that candidate the oldest person ever to serve as president. Both are older than each of the last five Democratic presidential nominees—older than Hillary Clinton, Barack Obama, John Kerry, Al Gore, and Bill Clinton. (But younger than 1988 Democratic nominee Michael Dukakis.)

Compared to recent presidential candidates popularly considered “old”? John McCain would have been 72 years, 144 days on January 20, 2009; Bob Dole would have been 73 years, 182 days on January 20, 1997.

And the combined Election Day ages of Trump-Sanders (153), Trump-Biden (151—Biden doesn’t turn 78 until after Election Day), or Trump-Warren (145) would easily make them the oldest major party opponents in history. Reagan-Mondale, 1984 (129); Van Buren-Harrison, 1840 (124); and Dole-Clinton, 1996 (123) are among the oldest pairs of major party opponents.

* * *

But Democrats are on pace to end a different streak. Three of the last four presidents were born in the same year, 1946. That’s right, Bill Clinton, George W. Bush, and Donald Trump were all born in 1946, in that first year of the “Baby Boom” after World War II. Sanders (1941), Biden (1942), and Warren (1949) all missed that birth year. So did Hillary Clinton, narrowly (1947).

Of course, there have been other 1946 presidential candidates. To name a few: 2000 Republican candidate Gary Bauer, 2004 and 2008 Democratic candidate Dennis Kucinich, 2012 Constitution Party presidential nominee Virgil Goode, and 2016 Democratic candidate Jim Webb.

* * *

A number of Democratic candidates are vying for the title of youngest president. That title is currently held by Teddy Roosevelt, who was 42 years, 322 days when he took office after William McKinley died. The youngest elected president, meanwhile, was John F. Kennedy (43 years, 236 days).

Several candidates, including Pete Buttigieg (39 as of January 20, 2021), Tulsi Gabbard (39), Seth Moulton (42), and Eric Swalwell (40), could eclipse these marks. But they’d hardly be the youngest major party candidates in history. That distinction belongs to William Jennings Bryan, who was just 36 when he secured the Democratic Party nomination in 1896.

Not only that, but one of these candidates might surpass the record age gap between major party opponents—25 years, set by 72-year-old John McCain and 47-year-old Barack Obama in 2008. Trump will be 74 on Election Day, so any opponent under 49 would set a new record.

* * *

Alexandria Ocasio-Cortez is a useful foil for age comparisons. Elected to the House of Representatives in 2018, she was born October 13, 1989 and is just 29 years old—but she has an outsized influence on social media and in the Democratic Party.

This is Joe Biden’s third presidential run, after failed campaigns for the 1988 and 2008 Democratic Party nominations. Biden’s first run ended in September 1987... more than two years before Ocasio-Cortez was even born.

Bernie Sanders was born 24 years after John F. Kennedy. He was also born 48 years before Ocasio-Cortez.

* * *

By my count, 88 of the 100 senators serving in the 93rd Congress as of January 3, 1973 have died. Of the 12 remaining, two are former presidential candidates (Walter Mondale and Bob Dole), and two are running this year (Joe Biden and Mike Gravel).

* * *

I know, the post has focused a lot on age. But there’s so much to do with it! And maybe it’s only fitting that the oldest living former president ever is still with us: Jimmy Carter, who turns 95 in October 2019, and who has also lived longer after leaving office than any other president (38 years and counting).

* * *

*Special thanks to Brian Kalt for this detail.
Please notify me if you find any errors I ought to correct or ambiguities I ought to clarify.

In Memoriam: Professor John Copeland Nagle

With John Nagle at the Notre Dame Law School Commencement, May 20, 2007

I had John as a first-year student in Property in the Fall of 2004. But in my first year of law school, he was more than that. He was the advisor to the Christian Legal Society, a small group of students at Notre Dame, and he hosted occasional gatherings at his home. At an early September 2004 barbeque, he’d run the grill and host dozens of students, the first of my many visits to his home.

In Property, he’d bring in his daughters’ stuffed animals as props on the day we covered animals as property—cheesy, maybe, but a student couldn’t help but smile. He was self-deprecating in the best way—he was one of four co-authors on a casebook, and he said he was only put on it to “do one chapter on environmental law and submit pictures to accompany the other cases.” (And how he loved pictures to accompany the law!)

On top of that, he told us that his daughters (Laura and Julia) picked out who’d be called upon each class, defining the cold call roster and ensuring that he could remain blameless (or, so he believed). It’s a small and amusing choice that I adopted in my own teaching for first-year courses.

In the upper division, the least likely elective I took was Biodiversity & the Law, because John taught it. His exam was, in a way, miserable, in the sense that it was a 24-hour take-home on some fairly open-ended questions about how one would go about protecting an animal species in a fragile ecosystem given existing law and competing concerns. But when I talked the exam over after the fact with Emily, my wife, she reflected, “It sounded like a genuine question, and he’s really interested in hearing what you think about it.” And it’s true. It’s the kind of exam I’m not sure I’d ever be capable of writing. But he wrote an exam that showed a love of the material and a genuine open interest in seeing how we handled a situation that he himself may well have viewed as uncertain. He wanted to see what we’d do with it. Miserable, maybe. But the rare exam that felt like it was helping the professor think through the world and the law.

John gave me my Note topic—out of his own curiosity. He wondered why “public necessity” would allow destruction of property for the public benefit, but that the property owner wouldn’t receive compensation. It was a topic that took me deep into 17th century original sources and admiralty law. It was also my first opportunity to learn that I liked writing legal scholarship enough to pursue academia one day.

John was also a deeply valued mentor for me on my path into academia. He had a somewhat unorthodox entry into law teaching, and he was extraordinarily supportive as I made my way to the market (particularly as only a few from Notre Dame go on to teach). But my biggest encouragement came in an exchange whose power he maybe never knew. Still unsure whether I’d make it on the hiring market, though with some modest publications and work in progress, I was invited by him to a dinner with a couple of other law professors at a conference. They asked me about my work, and I described one recent project, to which John responded to the others, casually but sincerely, in the middle of a bite of food, “You should read it, it’s really good.” That’s when I felt like I might be able to make it—if John could read it and share that sentiment with others so genuinely and spontaneously, I could do okay.

It helped, too, that my first year at Pepperdine was also the first year of Dean Deanell Reece Tacha—for whom John clerked on the Tenth Circuit. He fondly recalled his clerkship with her and held her in the highest esteem. It made his visits to Pepperdine to present papers or participate in a conference all the better.

John’s scholarship was remarkable for reasons I may dive into in depth another time. But I’ll reflect on this small thought now: he was a man who had ideas and wrote about them, ideas that reflected a deep interest across disciplines and that drew comparisons across things. That sounds simple, but it’s something that is all too rare.

His primary research was in environmental law, and he had a number of terrific pieces in election law, too. But how does one come up with a piece like Pornography as Pollution, taking a very environmental law-centered concept and applying it to the pervasive problem of sexually explicit material? Or one merging environmental law and election law, Corruption, Pollution, and Politics, which takes the old metaphor of “pollution” for political corruption and uses it within environmental law’s framework for addressing campaign finance regulation? Or A Twentieth Amendment Parable, which opens with an avowedly biblical allusion and, in its second footnote, follows the statement “The definitive law review article on the Amendment had yet to be written” with this sentence: “This isn’t it.”

These are some (I’ve already gone on too long) of the articles of John’s I remember. I often feel like I barely remember some of my own, much less others’. And in the months ahead, I’ll dig into some of his other work I don’t remember. But these, among others, were wonderful because they said something interesting. They were—are—memorable. They drew comparisons I hadn’t thought about before—and assuredly never would have.

This whole reflection is a bit surreal to write. I suppose that’s what happens when someone young passes away unexpectedly. But more to the point, I’m visiting at Notre Dame in the Spring 2020 term and had looked forward to spending time with John in particular, in part because I hadn’t seen him much in the last couple of years. I’d already been in contact with him about possible housing options in the region.

Not long ago, I asked if he was going to be at AALS, my usual opportunity to try to catch up with people. But he wasn’t, as usual. He wrote in that email, “I've gone from missing the AALS because of Laura's basketball schedule to missing the AALS because the girls are home from college.” We’d just have to connect another time.

I am tearing up as I write about this. John loved his family and knew how to dedicate time to spend with them. I’m so glad he did. And I’m confident he was glad he did. It’s also a reminder to myself to carve out the time needed with loved ones.

To close what’s already starting to feel like my rambling thoughts: a big reason I valued John so greatly as a friend and mentor was that he was a Christian. It’s sometimes hard to think about how faith and vocation fit together for a Christian in any given discipline. But John lived a life committed to his faith and his God, and he thrived in his work. He saw God’s glory in all the world—it’s impossible to understand John’s passion for visiting national parks, for writing about the wilderness and rare animal species and the protection of nature, without seeing his heart rejoicing in the beauty and awe of creation.

I recall pointed questions in Biodiversity & the Law about why we like the “wilderness,” a term that’s usually fraught with danger; or why we’d protect endangered species, perhaps something beyond aesthetic enjoyment (given we may never see them) or utility (given we may not find the “use” of all animals). In both—and it’s a reason I chose to attend Notre Dame—he gently suggested that perhaps our faith, and specifically the Christian faith, could provide answers. That maybe the wilderness, while often a biblical place to be cast out into, is also a place of spiritual retreat and prayer. That animals are all a part of God’s creation, and our dominion and stewardship over them should include their protection regardless of any utility.

John helped me recognize that my Christian faith—importantly, a good, deep faith—of necessity extends beyond the personal life, and even beyond thinking about a moral code of right and wrong, into reflecting on just about anything in the world around us. I admit, I’m still not great at it. But he was very good at it—although I imagine, in characteristic humility, he’d say he was only struggling to figure it out, too. It encourages me as I write this to go deeper still.

My heart aches for Lisa, Laura, and Julia as they mourn a devoted husband and a loving father who has passed away. But with John, may we find comfort in the assurance that nothing “in all creation will be able to separate us from the love of God in Christ Jesus our Lord.” We await the new heaven and the new earth, where God “will wipe away every tear from their eyes, and death shall be no more.” Rest in peace.


The rise and fall of my use of Twitter

I first joined Twitter in 2009 under a pseudonymous account before restarting in May 2012 with my present account. I began to use it more over the last five years for a few reasons.

First, unfiltered news. There is no algorithm determining what content I see. Instead, it's simply the most recent content, all there, if I choose to follow those feeds. I prefer RSS for time-shifting, but Twitter offers the same kind of function.

Second, professional disintermediated contacts. You can talk to people all over the world, in your field and related fields, in a very easy way.

Third, journalists live there. To the extent one is interested in sharing ideas with journalists, they frequently look to Twitter for news and sources.

Fourth, branding. The crass term is simply a reality--it is a way of gaining name recognition in a fairly simple way. (This is particularly true because I have a blog with content I frequently share to a broader audience.)

Fifth, engagement with law professors. Many other law profs are on Twitter, and the discussion occurs there in a way that, perhaps a decade ago, discussions might have occurred on blog comments sections, or listservs. It's a great way to virtually meet people outside of conferences.

But, over time, I found that these benefits have lost much of their appeal, and the cost-benefit analysis has moved me away from using Twitter.

There have been increased attempts from Twitter to tell me what I ought to believe is important, a new kind of filter to the experience. Trending stories are the first in that effort. Moments, another. Autocompleting search terms or displaying preferred search results, still another. And occasionally, it will display "live" events at the top of my feed that it believes I ought to heed. In each of these circumstances, I've found the content offensive--not because it somehow offended my morals, but because it was so utterly trivial and banal that I wondered why it would, in its vaunted algorithmic way, decide I would have any interest in these silly and trite things.

I have found that the reward from "status" on Twitter is simply not great. For journalism, it remains, sadly, nearly ubiquitous. A majority of media inquiries now start from a tweet; indeed, a non-trivial number of media mentions fail to even inquire of me and simply (lazily) cite my tweet. Using Twitter less means fewer citations in journalists' pieces, but such is the tradeoff. Furthermore, I've found that a lot of media now focuses on what people say on Twitter, and then how others react to those statements on Twitter--a deeply meta, and often, I think, deeply superficial way of thinking about newsworthiness.

Furthermore, I've watched a number of law professors (and others) lose a significant amount of their credibility (in my eyes, at least, and I think, to some degree, in the eyes of at least some others) by succumbing to the allure of fleeting social media fame. It moves beyond branding into a quasi-celebrity status. It's something that I want to separate myself from.

I've experienced moments like this. Consider this tweet, which went somewhat "viral" at the end of 2017. I have lacked self-control over the time I spend on the medium. I've reveled in the dopamine pleasure of notifications telling me that someone, anyone, has read my stuff, or interacted with my stuff, or acknowledged my existence in this pithy format. And this kind of "viral" sharing was utterly unfulfilling--fluffy stuff, dopamine hits without any meaningful return.

The good of Twitter, I've found, has increasingly become banal as a form of escape. The pleasant or non-controversial sides of Twitter feel increasingly vacuous (or, at least, I've grown quite aware that they are so). Pleasant people exchanging superficial and trite hashtag greetings and emojis have left me wanting.

And perhaps most of all, I found visiting Twitter a joyless, even painful, experience. It was a chore, or a necessity, not a pleasant way of learning about the news. If it's not the banal, it's the stranger shouting angrily, or the self-laudatory sarcastic point that demolishes or obliterates or decimates one's (usually political) enemies. I found my blood pressure rising too quickly and too easily. I found myself defensive, typing out a hasty or angry or sarcastic response, only to delete it. (Occasionally it escaped my self-editing, to my detriment, I think.)

I would find myself lamenting the lack of subtlety. Or, more significantly, the lack of any ability to have an actual conversation. I found total strangers willing to say consistently hurtful things (fortunately, only rarely to me; too often, to many undeserving targets). I saw the herd mentality of social media, where errors spread like wildfire and outraged mobs congregate. I found that many of the cutting tweeters would be perfectly pleasant to have a conversation, even a disagreement, with face to face, perhaps for hours over a meal. Twitter has been destructive to that end, at least for me.

I realized that I wanted to read more long-form articles, and that I was dedicating too much time to the moment. Many pieces I was reading were not deep or interesting, but designed to secure a click from Twitter with a controversial or sensational headline (hardly a new practice in media, of course, but one that increasingly soured my consumption of news in this format). I receive a print Wall Street Journal every day, and the curated content there is sufficient for most major news, even if it may take 24 or 48 hours to dig deep into "breaking" events. I also subscribe to The New Criterion and First Things for long-form cultural commentary, and I dedicate too little time to them. Finally, I was perpetually reading too few books (in particular, too little fiction), and I needed to cut trivial reading.

I've chopped probably 90% of my Twitter use this year already. I hope to cut it even further. I will still use it, of course, just less frequently. I'll tweet rarely, but I'll do so to, say, share this blog's content.

This is not to say that others have not calculated the cost-benefit differently, and that others might not do much better. Others have thrived on Twitter, and I've come to deeply respect (in some ways, more deeply respect) the work of many because of Twitter. That's a cost, and a loss for me.

These are also, of course, generalizations. There are exceptions to every single thing I've said. And others' experiences may well differ quite a bit from my own.

And it's not to say that it might not improve. Professor Carissa Byrne Hessick has offered a thoughtful and measured take on best practices for Twitter, one that I hope will be widely shared and adopted in the future. (UPDATE: Professor Josh Blackman today posted his own helpful and thoughtful guidelines.)

For me, though, it's time to cut a lot of my use of Twitter. I hope to distance myself from a medium that, I think, on the whole, is more cost than benefit. I'll revisit my habits on an ongoing basis. But after a few weeks with the app uninstalled, and interacting very little with content in the Twitter stream, I feel fairly confident that I'll keep going like this for some time. I hope to blog slightly more (longer forms of such thoughts with more nuance and editing). I hope to read far more.

And I hope to keep away from the tyranny of the urgent for a little while.

A map of the United States according to Supreme Court case citations

Some time ago, I thought about making a map of the United States based on the most significant Supreme Court cases from each state. Specifically, I'd rename each state after the opposing party in the caption of a case in which the state was the principal party.

"Significant" turned out to be a challenge, so I opted for "most cited" according to Westlaw. That led to the results below.

It's worth noting that my searches were constrained by Westlaw's limited capabilities, and I may well be wrong on some--please correct me if so! I simply sought the most cited case from each state.

It turns out that there are many cases I imagined were far more significant, but that weren't the most cited in their states. Those included Alabama (J.E.B., NAACP, & Miller), Arkansas (Epperson), California (Miller), Connecticut (Palko), Florida (Riley), Louisiana (Hans), Michigan (Long), Missouri (Holland), New Jersey (T.L.O.), New York (United States), Ohio (Mapp), Oregon (Muller & Mitchell), Pennsylvania (Prigg), South Carolina (Katzenbach), South Dakota (Dole), Texas (Johnson & Lawrence), and Virginia (Loving, Black, & Cohen). So while a more intriguing map might have been a kind of public vote about the most significant Supreme Court case to arise out of each state, I opted for the easy way out.

UPDATE: Special thanks to Danny Lewin, who notified me of the terrible error of including Strickland as a Washington case and not a Florida case! Washington is now Crawford.

Logical reasoning prep: Chipotle GMO edition

Prospective law students, take a moment to engage in logical reasoning while waiting in line for a delicious burrito.

You read the sign above at a local restaurant. Based solely on the assertions in these statements, which of the following statements is true?

A. Ingredients without GMOs are better than ingredients with GMOs.

B. Chipotle's ingredients are better now than they were in the last 21 years.

C. Chipotle's ingredients are worse now than they were in the last 21 years.

D. It is unknown whether ingredients without GMOs are better than ingredients with GMOs.


Answer: The correct answer is D. All that is known about ingredients with GMOs is that they are "n[o]t . . . better" than ingredients without GMOs. That may also mean the ingredients are of the same quality. A, B, and C all rest on a value judgment about whether ingredients without GMOs are better. Granted, the sign is designed to give the impression that ingredients without GMOs are "better," given the assertion that Chipotle "ha[s] been striving to make our ingredient better" and that this is presented as a step regarding its ingredients. But the first sentence cannot cure the lack of information in the second sentence.

Remembering the Armenian Genocide

A statue in Detroit, Michigan, erected in memory of Gomidas Vartabed and the victims of the Armenian Genocide, via Wikipedia.

This blog, sometimes, is about elections. Candidates in elections behave differently than men and women serving as representatives or senators or governors or presidents. They say different things. They emphasize different things. It's a very real part of the political process, whether those differences are good or bad, whether those differences are right or wrong.

It is one thing when presidential candidates have promised, or currently promise, to recognize the Armenian Genocide--and not with recognition of a "tragedy," or of "terrible events," but with that word, "genocide."

The word "genocide" obviously evokes serious reactions. The Holocaust is probably the first that comes to mind. Poll most Americans about another genocide, and you might find a few scattered responses about Rwanda, Bosnia, or Cambodia.

But few Americans would call to mind the Armenian Genocide. It began in 1915, one hundred years ago, in the middle of the Great War. More than a million Armenians were killed. Indeed, the word "genocide" was coined in 1944, in the midst of World War II, but it arose upon reflection on the history of such killings--Adolf Hitler's was not the first, for the Armenians had been targeted before that. It was striking when I first read of it at some point in college--I had been completely unaware of it. (I've had a deep interest in the history of World War I ever since.) The Armenian Genocide is not widely taught. In many places, it is essentially forgotten.

But politicians behave differently as candidates than they do as elected officials. Both George W. Bush and Barack Obama promised as candidates to recognize the Armenian Genocide, and both refused to do so once in office. The office changes behavior--there is fear of offending American allies with the word "genocide." But it is perhaps that very power of the office that should be used to call Americans, and the world, to recognize and acknowledge and reflect upon that genocide, that historical fact, that truth that some would deny in the hope that all would forget.

April 24, 2015 marks the one hundredth anniversary of the Armenian Genocide. Pepperdine Law has an active and engaging Armenian Law Students Association, which commemorated the event this month through some moving tributes. Many others around the world will also remember that genocide. I close, then, with the right words from President Ronald Reagan's proclamation of April 22, 1981:

"Like the genocide of the Armenians before it, and the genocide of the Cambodians which followed it -- and like too many other such persecutions of too many other peoples -- the lessons of the Holocaust must never be forgotten."