Wednesday, March 30, 2016

Orrin Hatch's Embarrassing New York Times Op-ed

This piece was originally published on History News Network

Senator Orrin Hatch took to the New York Times op-ed page to try to make the case for the Senate refusing to take up President Obama’s Supreme Court nominee.

It didn’t go well.

He starts by praising the late Antonin Scalia, implying that the rules should be different when replacing one of the “greatest jurists in our nation’s history.” The obvious reply is that it does not matter who the president is replacing. All openings on the Court are created equal.

Hatch then asserts that Obama has “contempt” for Scalia’s judicial philosophy. That may or may not be true, but in any case, it is irrelevant. When the electorate once again decisively elected Obama as president in 2012, it did not include an asterisk that said he could only replace justices with whom he agreed.

His next point is that, as a senator, Obama opposed two of President Bush’s nominees. Again, this is irrelevant. No one is claiming that Hatch or any other Republican has to support Obama’s nominee—just that Judge Garland deserves a hearing and a vote. Republicans now are as free as Obama was then to oppose the confirmation of the nominee.

Hatch then moves on to even more absurdly irrelevant points, such as his assertion that Obama has “consistently exceeded the scope of his legitimate constitutional authority.” Putting aside how questionable that point is, what Hatch seems to be suggesting is that if senators think such a thing about a president, the president loses the right to exercise legitimate constitutional powers. The Constitution provides Congress with a remedy for a president who exceeds the scope of legitimate constitutional authority: impeachment. The simple fact that a Republican House has not taken up impeachment reveals Hatch’s point for the nonsense it is.

He then notes that the American people have chosen a Democratic president and Republican Senate. Fair enough. But that in no way leads to Hatch’s conclusion that the Senate can therefore ignore the nomination. What that “split decision” suggests is that the Democratic president should nominate a person who is not his political ideal, but a compromise candidate more acceptable to that Republican Senate. By choosing Merrick Garland, that is precisely what Obama has done. He is respecting the idea of checks and balances, both institutionally and politically. He did not choose someone who is a darling of the Democratic left, but someone who has (in the past) been repeatedly praised by Republicans, including Hatch himself. By refusing even to consider the elected president’s nominee, it is Hatch and the Senate Republicans who are not respecting the “split decision” of the American people. They are saying that the smaller subset of the American public that elected those Senate Republicans can simply ignore the decision of the entire national electorate in the last presidential election.

For an historian, perhaps the most offensive point Hatch makes is this: “Throughout its history, the Senate has never confirmed a nominee to fill a Supreme Court vacancy that occurred this late in a term-limited president’s time in office.” As a history teacher, I am used to the instinct unprepared undergraduates have to bolster a poor argument with the “throughout history” trick. I expect better of United States Senators.

Hatch shows his contempt for his readers with this tortured construction. To make his “throughout its history” line work, Hatch needs to make that history awfully short. He does that with the phrase “term-limited.” The 22nd Amendment, which imposes term limits on presidents, has only been in effect for 65 years. So this particular “throughout its history” means for 65 years—less than Hatch’s own life span.

As I pointed out in my previous piece on this subject, there has only been one other vacancy during that period that was “this late” in a president’s term: LBJ’s nomination of Abe Fortas in 1968. Yes, Fortas was not confirmed as Chief Justice. That nomination received a hearing, however, and a vote. It was not met with this disingenuous nonsense that “we never do this.” And as Hatch well knows, 1968 was one of the most contentious election years in American history. Somehow, the Senate still did its job.

That leads to the next part of Hatch’s “kitchen sink” piece. He blames the “toxic presidential election” for Republican irresponsibility on this nomination. Anyone paying attention knows that the current toxicity comes almost entirely from his own party. Hillary Clinton and Bernie Sanders have mostly conducted their primary contest on a high, substantive level. Hatch calls this the “nastiest election year in recent memory,” but neglects to mention which side is supplying the nastiness. By some inexplicable logic, the fact that the Republican Party is wallowing in the political gutter means that the Democratic president’s nominee for the Supreme Court should not be treated like any other nominee.

Lastly, Hatch notes: “I have witnessed firsthand the deterioration of the confirmation process. Neither party has clean hands on this front.” That is true. It is also true that what Hatch proposes as the responsible course of action is in fact an extraordinary escalation of the politicization of the nomination process far beyond what either party has done in the past. It shows contempt for the 2012 electorate that elected Barack Obama. It shows contempt for the president personally. It shows contempt for American history.

Nothing in Hatch’s piece changes any of that.

If Hatch and his fellow Republicans want to vote against Judge Garland, they have every right to do so. But they should stop being cowards. They should make a substantive argument against him, vote against him, and accept the political consequences of that vote. They should stop pretending that this reckless path they have chosen is anything but a desperate attempt to hold onto a Supreme Court majority.

Sunday, March 6, 2016

BBC 5 Radio Interview on Ted Cruz and his Phony Supreme Court "Tradition"


On February 25, BBC 5 Radio program "Up All Night" with Rhod Sharp interviewed me about the Supreme Court vacancy and the post I wrote about Ted Cruz. The audio file is below.

[Embedded audio player]

Tuesday, February 16, 2016

Ted Cruz's Phony Supreme Court "Tradition"

[This post originally appeared on History News Network]

“It has been 80 years since a Supreme Court vacancy was nominated and confirmed in an election year. There is a long tradition that you don't do this in an election year."—Senator Ted Cruz 
If he honestly believes it is not legitimate to nominate and confirm a justice in an election year, Ted Cruz must hate the appointment of Chief Justice John Marshall. John Adams nominated him in January 1801, after he lost his re-election bid to Thomas Jefferson in the election of 1800. Adams was a lame duck in the truest sense of the term—he was serving out the remainder of his term after being repudiated by the voters. Yet he did not hesitate to fill the vacancy in the Supreme Court, and Marshall was confirmed by a lame duck Senate.

Perhaps the most striking irony of Cruz’s position (and increasingly the position of the entire Republican Party) is that this absurd debate is taking place over the replacement of Antonin Scalia. If there is one thing Scalia was known for, it is his originalist interpretation of the Constitution: it means what the founding generation said it meant. So it seems appropriate to ask: what did the Founders actually do in such circumstances?

In the final year of his presidency, George Washington had two nominations to the Supreme Court approved by the Senate. It was an election year and he was not running for reelection. It doesn’t get more "original intent" than that. Adams could easily have left the Supreme Court vacancy for Jefferson—who had already been elected, after all, and would take office in a matter of weeks—and didn’t. That seems as clear as it could be. The founders saw no impediment to a president in the final year--or even in the final weeks--of the presidency successfully appointing new justices to the Supreme Court.

What about Cruz's contention about the last 80 years? Even that does not hold up.

The facts are pretty simple. In the last 80 years there has only been one instance in which a president was in a position to nominate a justice in an election year and did not have the nominee confirmed. In 1968, LBJ’s nomination of Abe Fortas to be Chief Justice to succeed Earl Warren (and of Homer Thornberry to take the seat held by Fortas) was blocked in the Senate, but not because of some alleged “tradition.” Certainly there were Senators who wanted the next president to name a new justice. But the opposition to Fortas had everything to do with the specific nominee and specific objections to him (particularly charges of cronyism and inappropriate financial dealings). To the best of my knowledge, no one cited Cruz’s “tradition” to say it was not appropriate for Johnson to nominate someone, or that it would have been inappropriate to confirm anyone.

A second instance took place 28 years earlier. In 1940, FDR nominated Frank Murphy in January of that election year and he was confirmed that same month. There was no “tradition” blocking that election-year appointment. (This also shows that Cruz got the math wrong—this happened 76 years ago, not 80.) [Note: The morning after this post first appeared, Orrin Hatch spoke on NPR and amended the claim to say that no "term-limited" president had had a nominee confirmed in an election year--evidently an attempt to exempt FDR's confirmed nominee from the "tradition."]

So, there were two instances similar to the current situation in the last 80 years. In one case the nomination was rejected and in the other it wasn’t. To Ted Cruz, this constitutes “a long tradition that you don't do this.”

Ted Cruz’s invention of this alleged "tradition" that we don’t nominate and confirm Supreme Court justices in an election year would be laughable if so many supposedly responsible political leaders were not taking it seriously.

It is absurd on the face of it. If the Republicans in the Senate want to block any nominee Barack Obama sends them, they have the votes to do it. But they should stop hiding behind the obvious fiction that doing so is part of some “tradition.” It would be nothing but the raw, cynical use of their political power. This suggestion that Obama should not even nominate someone (both John Kasich and Marco Rubio said so in Saturday’s debate), or if he does, that the nominee should be rejected out of hand simply because of the timing (as the Senate Majority Leader and many Republican Senators are now saying), is simply silly. 

True conservatives don’t invent traditions. They work to protect existing ones. Our true tradition is that the president nominates and the Senate votes, regardless of when the vacancy occurs. 

The speed with which Cruz jumped to make this claim, and with which so many others have fallen in line, speaks to the nihilistic radicalism that has infected today's Republican Party. Any position can be taken if it produces the correct result. Facts can be denied, “traditions” can be invented. The only value taken seriously is “does it work to our advantage?”

This tactic may well work politically. It has already had the effect of framing the debate as “Should Obama nominate someone?” That is truly extraordinary. The actual question should be “Should the Senate confirm Obama’s nominee?” That’s a legitimate debate, but it would put the focus on the nominee and that person’s qualifications. By hiding behind this phony “tradition,” Republicans are trying to avoid having to show that a given nominee should be rejected on the merits. In short, they don’t want to take responsibility for rejecting someone who—in all likelihood—will be eminently qualified for the job. That’s not statesmanship. It’s cowardice.

Wednesday, December 16, 2015

A Brief History of American Attitudes Toward Refugees

      [Back in September, in response to efforts opposing the resettlement of Syrian refugees in South Carolina, my colleague Dr. Byron McCane organized a group of Wofford College faculty to present a panel on the subject of refugees. My colleagues Dr. Laura Barbas-Rhoden (Modern Languages), Dr. Phil Dorroll (Religion), Dr. Kim Rostan (English) and I all participated. My job was to give a brief overview of refugees in American history in the September 24 event at Wofford.
      On Nov. 11, we reprised the panel at the University of South Carolina in Columbia, with the welcome additions of USC colleagues Dr. Breanne Grace (College of Social Work) and Dr. Rajeev Bais (Clinical Internal Medicine).
      Due to recent events, the refugee situation has unfortunately become a political issue in the presidential race, with candidates like Ted Cruz and Jeb Bush asserting that only Christian refugees should be admitted into the United States, and Donald Trump calling for a ban on all Muslims entering the United States. Below is an adapted version of my presentations at Wofford and USC. This previously appeared as a series of posts on History News Network.]
     
     
From the earliest days of the republic, the American attitude toward those fleeing conflicts and hardship abroad has been marked by an ambivalence and tension between two contradictory reactions.
     
On the one hand, Americans want to see themselves as a people who welcome refugees. In the 1790s, the American scientist David Rittenhouse said the United States was “an asylum to the good, to the persecuted, and to the oppressed of other climes.” The prominent historian Gordon Wood writes: “By the early 1790s Americans were not surprised that their country was in fact attracting refugees from the tyrannies of the Old World. The enlightened everywhere had come to recognize the United States as the special asylum for liberty.”
     
On the other hand, Americans have also feared that such people might represent a danger to the United States: religious, political, economic, cultural--or all of the above.
     
When I say from the earliest days, I mean just that. The decade of the 1790s saw nearly 100,000 immigrants come into the United States—at a time when the population of the country was about 4 million people. Probably at least 15-20,000 of them were political refugees, fleeing revolutionary violence and political oppression.
     
The first refugee crisis in United States history came during the first term of George Washington, in 1792. The revolution in Santo Domingo led to thousands of refugees fleeing the island, most of whom came to Richmond, Virginia. One historian’s estimate of perhaps 10,000 is probably too high, but there are records indicating the existence of at least 2,000 such refugees in the US by 1794. We know this because Congress voted a specific appropriation of $15,000 for the relief of the refugees (out of a $6.3 million budget that year). As the historian of this incident concluded: “For the first time in its existence as an independent state, the United States met the refugee problem in its most tragic form, and met it with … generosity and human sympathy.”
     
Many thousands of other refugees also fled to the United States in the 1790s, mostly from the more famous revolution in France. They were, as one historian put it, of all political stripes: “Royalists, Republicans, Catholics, Masons, courtiers, artisans, priests and philosophers.” These political refugees started their own explicitly political newspapers and book presses. They brought their passions with them, and competing groups sometimes engaged in street violence against each other.
     
In 1795, the pro-British Jay’s Treaty damaged American relations with revolutionary France and threatened to result in outright war. If war came, the Federalists feared that the French would use “native collaborators to create revolutionary puppet republics” and “French emigres and Jacobinical sympathizers in the country [might] become collaborators.”
     
Suddenly, asylum seekers were seen as the threat within. In 1798, Federalist Rep. Harrison Gray Otis of Massachusetts said: “Do we not know that the French nation have organized bands of aliens as well as their own citizens, in other countries, to bring about their nefarious purposes? By these means they have overrun all the republics in the world but our own … And may we not expect the same means to be employed against this country?”  Another Federalist said that the new immigrants were “the grand cause of all our present difficulties” and plainly stated: “let us no longer pray that America become an asylum to all nations.”
     
As a result of this growing fear, Congress changed the law. The first Naturalization Law in 1790 had required only two years residency in the US before one could become a citizen. That was extended to five years residency in 1795, and then in 1798, Congress raised it to 14 years. All immigrants were required to register with the government within 48 hours of arrival, and the law forbade all aliens who were citizens or subjects of a nation with which the US was at war from becoming American citizens.
     
The crackdown on immigrants and refugees was inextricably wrapped up in domestic politics. The Alien Act, passed by a Federalist Congress and signed by a Federalist president, was a reaction to their fear that the newcomers were overwhelmingly supporters of Thomas Jefferson’s Republican party. Refugees from revolutionary France were joined by hundreds, perhaps thousands, fleeing political oppression in Ireland. Their historian Michael Durey concludes: “the radicals’ experiences after emigration were too varied and problematic to allow us to any longer assume uncritically that America was a welcoming asylum for them all. For many it was Bedlam.”
     
The Alien Act was allowed to expire, and the anti-French fever broke. But the tendency to both welcome and fear refugees would continue in the 19th century, long after the specific fear of the French dissipated.

Fifty years after the Alien Act, revolution in Europe again produced a similar American reaction to the influx of refugees. The revolutions of 1848, starting in Paris and spreading through much of Europe, also produced a large number of political refugees to the United States, especially Germans who were known in the U.S. as the “Forty-eighters.”

The American government generally welcomed the revolutions, seeing them as democratic in character, and thus consistent with American values. In fact, the United States “was the only major government which saw fit to send greetings to the Parliament at Frankfurt.” President James K. Polk stated: “The great principles of … the Declaration of Independence seem now to be in the course of rapid development throughout the world.”

But as students of the Revolutions of 1848 well know, those revolutions were more complex than that, and so were the refugees who fled to America. According to their historian, the “typical Forty-eighter was a freethinker, if not an atheist. They believed in universal suffrage, abolition of the Sunday laws, taxation of church property, establishment of the eight hour day, and government ownership of the railroads.”

Some Americans thus denounced them as “socialists, rationalists, atheists and desecrators of the Sabbath.” Southerners in particular feared their influence because the Forty-eighters were thought to favor abolitionism.  Some Forty-eighters were, in fact, socialists.  One, Ernst Schmidt, would later run for mayor of Chicago in 1859 on a socialist ticket, while others formed their own utopian socialist communities in the United States.

Some of the Forty-eighters were also liberal Catholics, and of course at the same time thousands upon thousands of Irish Catholics were arriving in the United States as economic refugees of the famine in Ireland. This combination gave rise to an explicitly nativist movement that found political expression in the American Party, more commonly known as the “Know-Nothings.”

The Know-Nothings never actually succeeded in changing American law regarding refugees and immigrants, but in their oath, members pledged to never vote for any man for any office who was not born in the United States. They called for “War to the hilt on political Romanism” and “Hostility to all Papal influences when brought to bear against the Republic.” They effectively argued that Catholicism was not so much a religion deserving First Amendment protection as a dangerous political movement contrary to democracy. (This is reminiscent of Dr. Ben Carson’s recent statement that Islam is “inconsistent with the values and principles of America.”)

The Know-Nothings saw the Irish and Germans as a religious/political threat, bringing “Popery” to the United States and thus undermining American principles. The Know-Nothings wanted to deny the newcomers the right to vote—they called for increasing the required number of years of residency from 5 to 21 before an immigrant could vote. (I cannot help but wonder how the Know-Nothings of the 1850s would have reacted to the sight back in September of the Pope, standing where the President stands when giving the State of the Union, addressing the United States Congress, while the Catholic Speaker of the House and Catholic Vice-President sat behind him.)

Despite these political reactions in the mid-19th century, what seems noteworthy in retrospect is that there was no legislative attempt to actually prevent any people from coming into the United States prior to 1882. When it happened, it was deliberately, openly discriminatory. That year, Congress passed the Chinese Exclusion Act. This was an explicitly racial law, a response to the popular backlash against the large number of Chinese in the west, which barred immigration by the Chinese.

Most Americans are familiar with the fact that many Chinese came to work on the transcontinental railroad, but what is often forgotten is that many were also refugees from one of the bloodiest periods of Chinese history, the era of the Taiping Rebellion—in the 30 years before the Chinese Exclusion Act, an estimated 20-30 million Chinese died in a major civil war and several different rebellions. Over 1.5 million fled China, and historians estimate that 250,000 of them came to the United States. (The 1890 census counted over 100,000 Chinese in the United States.) The Exclusion Act remained law for 60 years, until it was finally repealed during World War II, when China was an American ally in the war against Japan.

Despite the Chinese Exclusion Act, for most of the people of the world, the United States remained a place of asylum. The great turning point was World War I. The previous two decades had seen millions of immigrants, many from southern and eastern Europe, arrive on American shores, leading to increasing calls for limitations.

Once America entered the World War in 1917, the fear that lingering attachments of these relative newcomers to their mother countries might create conflicting loyalties in wartime led to the propaganda theme “100% Americanism.” In addition to the well-known reactions against German-Americans, any so-called “hyphenated American” now became suspect. The Bolshevik Revolution in Russia in 1917 added the fear of radical politics to the mix—this, despite the fact that many of those seeking asylum in the United States because of the revolutions in Russia were fleeing the Bolsheviks, not people who shared their views.

The postwar period saw immigrants—particularly those suspected of radical politics—subjected to heightened levels of scrutiny and even deportation. The drive to put restrictions on immigration eventually led to legislation: first the Emergency Quota Act of 1921, and then a permanent Immigration Act in 1924.

As a result of decades of growing nativist sentiment, the United States for the first time in its history imposed quota limits on the number of people allowed to come into the country: 165,000 maximum per year, with a quota that was based on the number of people from each country in the 1890 census. No specific provision was made in the legislation for refugees. Supporters of the legislation made it clear that the goal of maintaining an “Anglo-Saxon” nation was more important than being an “asylum for the oppressed.”

Senator Ellison DuRant Smith of South Carolina said:
Thank God we have in America perhaps the largest percentage of any country in the world of the pure, unadulterated Anglo-Saxon stock; certainly the greatest of any nation in the Nordic breed. It is for the preservation of that splendid stock that has characterized us that I would make this not an asylum for the oppressed of all countries, but a country to assimilate and perfect that splendid type of manhood that has made America the foremost Nation.
After 140 years of effectively welcoming all those who wished to come, the United States shut the door.

It is probably no coincidence that this change corresponds roughly with the emergence of the nation as a great power on the world stage. While outsiders had long been viewed suspiciously—especially those with different religious or political views—now such people were perceived as not just a potential internal threat, but as what we would now call a “national security threat.”

There was no need for an American policy toward refugees prior to the 1920s, since there were so few restrictions on entering the United States. The immigration restriction legislation, however, changed that. It required that no more than two percent “of the number of foreign-born individuals of such nationality resident in continental United States as determined by the United States Census of 1890” be allowed into the United States in any year. By setting strict numerical quotas based on the country of origin, the law left no flexibility depending on the circumstances in that country, and thus no ability to adjust to a refugee problem.
     
Thus the Immigration Act of 1924 set the stage for two disgraceful incidents in America’s history of dealing with refugees. Despite the rising persecution of German Jews in the late 1930s, all German immigration to the United States was subject to the existing yearly quota (due to the formula noted above, Germany actually had by far the highest quota in the world, over 50,000). In early 1939, in the aftermath of Kristallnacht in November 1938, Sen. Robert Wagner of New York proposed to Congress a Refugee Act that would allow 20,000 German children into the United States, over and above the established yearly national quota.
     
Wagner’s intent was that those children would be German Jews, but fearing that anti-Semitism would doom the bill, he did not specify that in the legislation. Opponents argued that, whatever its merits might be, the bill would undermine the quota system. They also made an economic argument. One said in testimony to Congress: “These children, if admitted, will presumably grow up and as youths will become competitors with American citizens for American jobs.” Opponents killed the bill in Congress, and no refugee children came to the United States. There is no way to know how many children might have been able to enter the United States, but it seems likely that some who might have been saved later died in the Holocaust.
     
In another case that has become much more well known in recent weeks, we know exactly how many could have been saved had they been admitted into the United States. In the midst of the Refugee Bill debate, the ocean liner St. Louis sailed from Hamburg for Cuba with over 900 German Jews fleeing the Nazi regime. Opposition arose in Cuba to letting them into that country, with anti-immigrant groups claiming that the passengers were Communists and thus should not be admitted. Only 22 of the Jewish passengers were allowed into Cuba. 743 were awaiting visas to enter the United States but had not received them. They cabled the State Department and the White House from the ship asking to be allowed into the United States. But at that time 83% of Americans opposed any relaxation of the immigration laws, and since the German quota for the year had already been filled, they were denied entry. The passengers returned to Europe. The British, Dutch, Belgians and French took in the refugees. But because Germany went on to occupy all of those countries save Britain during the war, some 254 of the passengers died in the Holocaust. The United States government, knowing full well that Germany was persecuting its Jews, refused to alter its immigration policy, and 254 lives that could have been saved were lost.
     
World War II, of course, created an unprecedentedly large refugee problem. In 1945, President Truman did what FDR never did: he issued an executive order allowing in 40,000 refugees above the quotas. In 1946, he proposed to Congress the Displaced Persons Act, which produced the same kind of response as Wagner’s Refugee Act did in 1939—opponents charged (despite the nearly full employment postwar economy) that they would take jobs from returning veterans. Some argued that the bill would allow Communists into the United States. Concerns that large numbers of Jews (who were often equated with Communism) would be admitted led supporters of the legislation to stress that most of those admitted would be Christians. This time opponents did not defeat the bill. It passed. They did, however, cut the number admitted into the US in half, from 400,000 to 200,000.
     
Throughout most of the cold war, U.S. policy toward refugees was largely driven by cold war foreign policy and handled on a case-by-case basis. As a rule, those fleeing communism were welcomed. The United States admitted refugees of the Hungarian revolution against the Soviet-backed regime in 1956, for example. Under President Dwight Eisenhower, the United States conducted “Operation Safe Haven” for Hungarian refugees. Eisenhower said: "It is heartening to witness the speed with which free nations have opened their doors to these most recent refugees from tyranny. In this humanitarian effort our own nation must play its part." That pattern was repeated several times: those fleeing the Cuban revolution in 1959, as well as the boat people seeking to escape North Vietnam’s conquest of the south in 1975 (under the Indochinese Migration and Refugee Assistance Act), were welcomed into the United States, while those fleeing other tyrannies were often out of luck. In 1980, the Refugee Act finally put refugees outside the regular immigration system, allowing for 50,000 refugees per year.
     
In sum, the reactions we see today to the prospect of admitting refugees from Syria and elsewhere have a long history in this country. Americans have a history of both welcoming and refusing refugees. Today we face a choice: which of those legacies will we embrace? When I began working on this issue nearly three months ago, I had some hope that it would be the former. The events in Paris and San Bernardino—and more importantly, the generally fearful reaction of many Americans to those events—have left me fearing that Americans are more inclined to opt for the latter. What this overview of the history shows is that such fears have in the past been overblown, and Americans have often had great reason to regret their fear-driven, short-sighted overreactions. Nevertheless, that list of regrets looks likely to grow longer.

Tuesday, June 23, 2015

The Flag Needs to Go Down. So Does the Lie It Represents

Yes, it’s a symbol. That doesn’t mean it doesn’t matter. That is why it matters.

Within hours of Gov. Nikki Haley calling for the removal of the Confederate battle flag from the grounds of the state capitol in Columbia, Wal-Mart announced that it would stop selling articles with that symbol on them. The Republican Speaker of the Mississippi House said it was time to change the state flag that contains it. Amazon and NASCAR have turned against it. The acknowledgement that it deserves no place of honor may be contagious.

But we should not—cannot—be satisfied with the removal of the symbol. We also have a responsibility to combat the lie it represents.

While Gov. Haley’s decision to support removing the flag is undeniably progress, the way she and other elected officials couch their new-found sensitivity to the insult this flag has always been to black citizens is troubling.

In her statement, Haley said: “For many people in our state, the flag stands for traditions that are noble. Traditions of history, of heritage, and of ancestry. At the same time, for many others in South Carolina, the flag is a deeply offensive symbol of a brutally oppressive past." There is room for both views, she said: “We do not need to declare a winner and a loser."

That is where she is wrong. We do need to declare something: the truth wins and the lie loses. Leadership—true leadership—does not create false equivalencies such as this. Both views, she said, are reasonable. They are not. One is in line with historical reality, while the other is the product of historical self-delusion.

Symbols can be tricky. Meaning can vary from person to person. But we’re not talking about a piece of abstract art in this case. We are talking about a symbol of a specific historical entity. I cannot simply declare that, for me personally, the Confederate battle flag represents, say, abolitionism. It was a flag under which men fought against the armies of the United States government, in defense of a government that had as its central tenet the preservation of slavery. That is not up for discussion or debate. (Ta-Nehisi Coates has an exhaustive collection of Confederate leaders saying so, here.)

In 1948, Strom Thurmond’s Dixiecrats waved it to show their opposition to President Truman’s civil rights plank in the Democratic platform. Throughout the civil rights movement, segregationists flew it to show their devotion to Jim Crow and their rejection of racial equality. Rabid segregationists waved it in the faces of civil rights protesters, and Gov. George Wallace of “segregation now, segregation forever” infamy proudly stood in front of it. People who defiantly shoved that flag in the face of people marching for racial equality still walk among us.

In each of those instances, it represented a willingness to fight to maintain white supremacy.

The reality that many people refuse to acknowledge those facts does not change them.

Those who still openly defend that flag are, fortunately, diminishing in number. But the near-universal meaning people attributed to it in the past, we’re still asked to believe, is not the “real” meaning for its supporters now. Now we are told the murderer “hijacked” the Confederate battle flag. It’s not about slavery or segregation now, it’s about “southern pride.”

What does that term mean? One of the murderer’s friends recalled: “I never heard him say anything, but just he had that kind of Southern pride, I guess some would say…. He made a lot of racist jokes, but you don’t really take them seriously like that.” For this friend, making racist jokes was a sign of “southern pride.” Racism is only serious, it seems, when it leads to actual violence. When it’s jokes and racial epithets, it’s “southern pride.”

Even if we allow that today most white southerners would not define “southern pride” that way, when one associates “southern pride” with a flag that the overwhelming majority of black southerners find offensive, there is a damning, unstated admission: their “south” is, of course, a white south. It is not the south of slaves and their descendants. They were denied their humanity under slavery, they were denied their rights under segregation, and they are denied their southern identity by this definition of “southern pride.”

Haley’s remarks say that the “southern pride” view is worthy of respect. It is not. Only by denying the historical reality of how that flag has been used—not by the one or the few, but by the many—can one view it as representing anything “noble.” It is that kind of denialism that allowed the murderer to believe that the flag called for his hateful violence. It has promoted violence in the name of white supremacy throughout its history, but it has persisted in our culture under the guise of a harmless “southern pride.” The murderer did not hijack it, he did not “misappropriate” it. He made manifest, in the ugliest and most awful way, what it has always meant.

He tore the disguise off so utterly that even many of the willfully blind could not help but see.

That will be why the flag comes down.

No one who has dodged this issue in the past, or openly been on what is now clearly the wrong side of it, wants to have to admit having been wrong. But some flag supporters have. The former radio host and speech writer known as the “Southern Avenger” recently wrote: “I was wrong. That flag is always about race.” That’s the kind of honest reckoning with the past that we need.

Most politicians, however, present this act of removal not so much as a change of opinion but as a change of circumstances. They are beneficently going above and beyond due to the extreme circumstances created by this event. But this awful event did not really create new circumstances. It simply made undeniable what has always been true. It has shamed at least some people. They know what that flag means. But they still continue to indulge the fantasies of those who insist it is only about “southern pride,” and tell them that their point of view is a perfectly legitimate one.

There is a price to be paid for indulging a lie.

For 150 years, this nation has failed to recognize fully what Frederick Douglass rightly identified in 1878 as the central truth of the Civil War: “There was a right side and a wrong side in the late war which no sentiment ought to cause us to forget.”

The nation’s willingness to indulge the “Lost Cause” mythology of the defeated Confederates is one of the reasons that 150 years later this mass murderer had no problem finding a false version of history (adhered to by many people beyond the Council of Conservative Citizens and the Sons of Confederate Veterans) that supported his vile racism.

Taking down the Confederate battle flag is the right thing to do—but not just because it stands in the way of unity at a time of bereavement. It should come down because it represents a pernicious lie: that the south worth honoring is a white supremacist one. Taking it down while indulging the lie is still progress. But it nonetheless avoids the hard truths that need to be spoken.

The cause of the Confederacy was not “noble.” The cause of the segregationists was not “noble.” Neither deserves any honor or reverence.

There is a right side and a wrong side in the Confederate flag debate which no sentiment ought to cause us to forget.

Wednesday, February 4, 2015

"There Was a Right Side and a Wrong Side": Art and Historical Memory

[This piece was originally posted on History News Network]

The last several weeks have given rise to much commentary on how drama presents the recent historical past. The films “Selma” and “American Sniper” both provoked passionate, even divisive, disagreement. To some extent, that may be inevitable, since both treat subjects in the living memory of many people.

My recent experience in this area is rather different, concerning as it does long past events. Last weekend, I attended a local performance of “The Civil War: The Musical.” Everything about the production was top notch: the set, the costumes, the direction, the lighting, the singing. Yet I left the theater with a sense that the show itself was deeply flawed.

At the very beginning of the show, a voiceover sets the scene, discussing the firing on Fort Sumter, and ends by quoting Walt Whitman: “the real war will never get in the books.” This sets the theme for the whole show: the war was, first and foremost, the stories of the individuals engaged in it, the vast majority of whom never did—perhaps never could—record what it was truly like.

As far as it goes, there is much to be said for this approach, the kind of “history from the bottom up” story rather than “top-down” history of presidents and generals. There is also a danger in it, however, one that the show inadvertently showcases.

The audience is introduced to three (mostly) separate groups of characters: white northerners, white southerners, and black slaves (called the “Union Army,” the “Confederate Army” and “The Enslaved” in the script). There is virtually no interaction among the groups other than the battle scenes between the two groups of white men. While I’m sure the creators of the show (Frank Wildhorn, Gregory Boyd, and Jack Murphy) had the best of intentions, this division serves a certain interpretation of the war—one that all too often did find its way into the books.

To the best of my recollection (and perhaps I missed something), the subject of slavery is never discussed by any of the white characters—none condemns it, none defends it. The northerners talk of fighting for Union and freedom, of course, but not the issue of slavery itself. The southerners talk of defending their land, their way of life, but don’t talk about slavery as part of that way of life.

The entire nation, north and south, was complicit in slavery. Northern business interests invested in and profited from it. Northern politicians made common cause with southerners to defend it. Some southerners, like South Carolina’s Grimke sisters, openly fought against it. But you’ll hear none of that in the musical. We hear no northern soldier talk with pride of fighting to free the enslaved; we hear no northern soldier speak with resentment about being asked to risk his life to secure the rights of people he considered inferior. Both types existed. We hear no southern soldier denounce Lincoln for wanting to make blacks his equal; we hear no southerner angered at the prospect of dying for the wealthy planter’s right to own human beings. Both types existed.

The subject of slavery is, of course, addressed by the black characters, and one song that recreates a slave auction (“Peculiar Institution”), is emotionally wrenching. But in the larger context of the show, the institution of slavery primarily appears like an act of God, akin to a famine, a plague, a hurricane—anything but a choice made by human beings to enslave their fellow human beings. Even the slave auction scene ironically has that effect. We hear the crack of a whip and a woman recoils from the blow. Theatrically, it is a powerful moment. It has, however, the inadvertent effect of removing any human agency from the whipping. We see the human being on the receiving end, but no human being administers the punishment.

That’s the problem: in this show, no one is to blame. Everyone acts honorably, fights bravely, dies nobly. Historians of the late 19th century will recognize this interpretation of the war. It is the idea that allowed northern and southern whites to come together and put the war behind them after the end of Reconstruction. It is also the one that abandoned the freedpeople to the depredations of the “Redeemers” who took control after Union soldiers ended their occupation of the Confederacy.

As I noted earlier, the characters in this show are not famous—with one notable exception, Frederick Douglass. (He is inaccurately listed among “The Enslaved,” despite the fact that Douglass escaped slavery in 1838 and was a free man during the Civil War.) No doubt Douglass was used by the show’s creators to include his eloquent denunciations of slavery, and that is a welcome addition. But I could not help but think that the historical Douglass would be rolling over in his grave.

There was no greater critic of this show’s “no one is to blame” ethos than Frederick Douglass. In 1878, he stated the exact opposite, as clearly as is humanly possible: “There was a right side and a wrong side in the late war which no sentiment ought to cause us to forget.” Yet in his final decades, he saw sentiment prevailing over memory. “I am not of that school of thinkers that teaches us to let bygones be bygones, to let the dead past bury its dead. In my view, there are no bygones in the world, and the past is not dead and cannot die. The evil as well as the good that men do lives after them…. The duty of keeping in memory the great deeds of the past and of transmitting the same from generation to generation is implied in the mental and moral constitution of man.”

“The Civil War: The Musical” aims to capture the war that didn’t make it into the books by focusing on unknown individuals and their admirable personal qualities. But Frederick Douglass was right. We do ourselves no favor by remembering only the good and forgetting the evil that men do. The Civil War—the historical event—was the product of human choice and human agency. There was a right side and a wrong side. That’s the truth that all too often has failed to make it into the books—and this musical. 

Wednesday, December 17, 2014

Dick Cheney's David Frost Moment

[This post was originally published on the History News Network.]

All of us who teach face a daily, daunting task: how do we take our subject matter—which we know from years of study to be terrifically complicated and nuanced—and make it accessible and understandable to our students, all while avoiding the peril of oversimplification?

We all do our best, succeeding sometimes, failing others. We are eternally grateful when we find a key: that piece of evidence, that compelling argument, that helps us do our jobs. The most prominent example of that for my teaching is perhaps Richard Nixon’s infamous comment to David Frost about his actions during the Watergate scandal: "Well, when the president does it, that means that it is not illegal."

That one simple sentence helps me communicate to students the essence of the danger inherent in the many complicated events that we call “Watergate”: Nixon’s sincerely held belief that as president he was incapable of committing an illegal act; that whatever he deemed necessary to the security of the United States was, by definition, legal. It was a sentiment more consistent with the absolutism of Louis XIV than the constitutional principles that gave birth to the nation. What makes those words so powerful is that they come not from one of Nixon’s many implacable political foes, or from a historian interpreting his actions. They come from the man himself.

Since the release of the Senate Intelligence Committee’s torture report last week, I’ve been struggling with how to synthesize the multiplicity of reactions it provoked. Then on Sunday, I saw former Vice President Dick Cheney on “Meet the Press.”

Amidst all the dissembling, Cheney made one remark that struck me as his David Frost moment.

Moderator Chuck Todd confronted Cheney with evidence that 25% of the detainees were innocent, and that one was physically abused so badly that he died. Cheney replied: “I'm more concerned with bad guys who got out and released than I am with a few that, in fact, were innocent.” When pressed about whether it was acceptable to abuse innocent people even to the point of death, Cheney said: “I have no problem as long as we achieve our objective.”

Keep in mind, Cheney was not talking about the accidental death of innocents on the battlefield. Every war involves the accidental death of innocents, but just war standards command that every reasonable effort be made to avoid them. This was a man who had done no wrong, was mistakenly taken into custody by the United States, and died from physical mistreatment at the hands of its representatives while in that custody. Faced with that travesty of justice, Dick Cheney could not even muster a perfunctory expression of regret.

Confronted with an unquestionable injustice, Cheney says: “I have no problem as long as we achieve our objective.” That is the essence of everything wrong with the Bush-Cheney “war on terror.” It admitted no principle whatsoever as superior to the objective of keeping the nation safe. Fundamental human rights—even of innocent people—can be violated with impunity, Cheney asserts. Even after being presented with evidence that an innocent man was killed, Cheney blithely said, “I'd do it again in a minute.” The end justifies the means.

That is the mindset of the authoritarian. Dictators the world over use that logic every day. Dick Cheney will never admit that the techniques he endorsed constitute torture—to do so would be to admit he is a war criminal. But he has now admitted, beyond any doubt, that he has the mentality of a torturer.