Tuesday, May 30, 2017

Why Trump's Snub of NATO Matters

[This post was first published on HNN: http://historynewsnetwork.org/blog/153934]

Donald Trump went to a NATO meeting last week and never explicitly stated his commitment to Article 5, which states that an attack on one member is an attack on all.

Why is this such a big deal? Because he’s undermining U.S. national interests.

I often have the chance to teach about the creation of NATO, in four different courses (Western Civilization since 1815, U.S. History since 1865, American Diplomatic History, and U.S. since 1945). Whenever I do, I make a point of stressing what an incredible and important departure it was in American foreign policy.

When I tell students that the U.S. created and joined the North Atlantic alliance in 1949, I always ask them when the U.S. had last entered a formal alliance. They often guess World War II, and then World War I. Students understandably assume that since the U.S. fought alongside other nations in both the First and Second World Wars, and we casually refer to America’s “allies” in those wars, there were treaties of alliance. But there were none—in each case, the U.S. quite deliberately maintained its formal separation from those it called its “allies.”

NATO was the first formal alliance for the U.S. in nearly 150 years. In 1800, the Adams administration negotiated an end to the French alliance of 1778 that had helped the Americans win the Revolutionary War, and the U.S. had not agreed to a single treaty of alliance since. When revolutionary France went to war with Britain (America’s largest trading partner) in the 1790s, the alliance seriously complicated not only American foreign policy, but American domestic politics as well, and soured Americans on the idea of any binding foreign commitments.

After its war with Britain ended in 1815, the U.S. assiduously avoided involvement in European political affairs. Its response to conflicts in Europe was essentially “none of our business.” When both World Wars broke out, the American response was to declare its neutrality. In 1949, that changed.

American membership in NATO represented a fundamental shift in American foreign policy. For nearly a century and a half, Americans insisted on complete freedom of action in foreign policy. No binding commitments would threaten to drag the U.S. into a foreign war. That determination was the single largest factor in the Senate’s rejection of Woodrow Wilson’s vision for the League of Nations after World War I. With NATO, the U.S. reversed course and said that it would immediately go to war if one of its allies were attacked. Why such a dramatic change?

The lesson of the two World Wars, in the minds of American foreign policy makers, was that the U.S. could not avoid involvement in a major European war. The only way to stay out of such a war, they decided, was to make sure that one never broke out again. The only way to do that was deterrence through a binding collective security agreement. Send the message to a potential aggressor (the Soviet Union at the time) that American neutrality was unequivocally a thing of the past: if World War III broke out, the (nuclear armed) U.S. would be in it on Day One. That certainty would deter any potential aggression and prevent another war.

That certainty is what Donald Trump recklessly undermined last week. NATO’s effectiveness depends on certainty, and he created uncertainty. During the campaign, Trump suggested that America's commitment to honoring Article 5 would become conditional. When asked if the U.S. under a Trump administration would defend the Baltic states if attacked by Russia, he said “If they fulfill their obligations to us.” That one small word, “if,” has the potential to undermine the entire alliance. The whole point of NATO was to take the “if” out of the calculation.

Last week, Trump had an opportunity as president to repair the damage he had done as a candidate, and he passed on it. Administration officials assured reporters beforehand that Trump would “publicly endorse NATO’s mutual defense commitment.” But he did no such thing. He briefly mentioned it in the context of NATO coming to America’s aid after 9/11, but never stated his commitment to reciprocate. Instead, he harped once again on the need for NATO nations to pay “their fair share.”

There’s nothing inherently wrong with reminding members that they have agreed to spend 2% of GDP on defense. By refusing to state his commitment to Article 5 while making that demand, however, Trump is turning NATO into just another “deal.” He has said in the past that he thinks the U.S. is being “taken advantage of” in NATO. In his transactional framing of the alliance, European members are paying for American protection, and to get them to pay up, he is implicitly threatening to refuse to honor America’s commitment. This is his simple-minded idea of what constitutes “tough” leadership.

As with so many other aspects of his disastrous presidency, Trump here is misapplying his business approach to realms where it is not only not applicable but downright destructive. NATO is not a “deal.” It is not a protection racket. The American creation of NATO was meant to serve American interests. It has done so for nearly 70 years. Undermining the alliance with his childish and churlish attitude is self-defeating. It undeniably damages American interests. The only open question is whether Trump is doing so out of ignorance and foolishness, or for far more disturbing and sinister reasons.

Sunday, December 18, 2016

The Election of 2016 and American Identity

[This post was originally published on HNN: http://historynewsnetwork.org/blog/153856]

Not all presidential elections are created equal. Every election is a choice, of course, but the choices are not equally consequential. In some cases, the country seems largely set on what to do, and is debating little more than how to do it (Kennedy-Nixon in 1960). In others, there are more substantial questions of what we as a nation should do (Reagan-Carter in 1980). The most consequential ones, however, come down to the question of who we are as a people, how we define America as a state.

I would argue that 2016 was the last of these.

It was so because Donald Trump made it so.

The 2008 campaign easily could have been one of those, with the Democrats choosing the first African-American major party nominee, with all that choice symbolized about what kind of country this is. While there were certainly moments in the campaign that threatened to veer in that direction, the Republican nominee, Sen. John McCain, stopped his campaign from exploiting that approach.  When a woman at one of his town hall meetings said she thought Obama was “an Arab,” McCain stopped her: “No, ma'am. He's a decent family man [and] citizen that I just happen to have disagreements with on fundamental issues and that's what this campaign's all about. He's not [an Arab].” McCain was given the chance to make it a campaign that said I am one of “us” and he is one of “them,” and he insisted it should instead be a campaign about issues.

Those two words—“No, ma’am”—made clear that McCain was determined not to take the low road. He would talk about what we should do, not who we are. He would say “no” to his supporters when they went down that other road. They are also the words Donald Trump never uttered in his campaign rallies, no matter what vile shouts his deliberate rabble-rousing provoked.

Long before he became a candidate, Trump took the low road by becoming the most famous “birther” in America, again and again claiming that he was finding proof that Barack Obama was not born in the US, asserting that Obama was secretly some non-American “other.” What McCain disavowed, Trump took up—with glee.  McCain thought there were things more important than winning, an attitude Trump clearly views with utter disdain. To Trump, decency is for losers.

Trump’s birtherism was more than just a way to attract attention (though that may have been its chief attraction for him personally). It was in practice an attempt to repudiate the vision of America that Obama’s presidency represented, an America that defines itself by core beliefs that are available to all people, no matter their race, ethnicity, or religion—rather than by an immutable national type of person.

It is no coincidence that Trump then literally began his campaign by demonizing Mexicans as criminals and rapists. His opening salvo against Mexicans set the tone that he never abandoned: these “other” people are different, they are not good, they do not belong here, they are not “us.” His attack on Judge Curiel demonstrated this perfectly. He said the judge could not be fair to him in the Trump University case because “he’s Mexican.” The fact that the judge was born and raised in the United States did not matter to Trump. “He’s Mexican. I’m building a wall.” For Trump, Curiel’s ethnic heritage was who he was. His birthplace, his profession, his devotion to the law and the Constitution were all irrelevant to Trump. The judge’s identity was his ethnicity, and it was Mexican, not American.

He added to the ethnic dimension a religious one by calling for a ban on Muslims coming into the US. He did not call for a ban on extremists or terrorists. He called for a ban on everyone who adhered to a specific religion. He told CNN: “I think Islam hates us.” Not some Muslims, not even some people from some countries that are predominantly Muslim. “Islam hates us,” he said—ignoring the many American Muslims who are “us.” What that lays bare is that for Trump, Muslims are not “us.” They may be here, but they don’t really belong here, because they are not really of “us.”

His positions and policies (and the rhetoric he used to promote them) made it clear that his slogan—“Make America Great Again”—meant that the US should be defined in racial, ethnic, and religious terms: as a predominantly white, Christian country again. His unabashed bigotry throughout his campaign challenged every American to decide: is this who we are? Is America defined by racial, ethnic, and religious traits or is it not?

As I see it, there have long been two competing visions of what the United States is: a country based on an idea or a nation like all the others.

The first argues that the United States is not any particular ethnicity, language, culture, or religion—some of the traits that usually comprise a “nation.” Instead, the United States is fundamentally an idea, one whose basic tenets were argued in the Declaration of Independence and given practical application in the Constitution. At its core, America is the embodiment of the liberalism that emerged from the Enlightenment, which took as a self-evident truth that all people are equal, that all people are fundamentally the same, no matter where they live. They all have basic rights as humans, rights that no government can grant or deny, but only respect or violate. Because this fundamental liberal idea erased the traditional lines that divided people based on race, ethnicity, or religion, it was a “universalist” (or, to use a common term of derision among Trump supporters, “globalist”) concept. It was open to everyone, everywhere. By extension, the American idea (and America itself) was open to everyone, everywhere.

Unlike the situation in other “nations,” since America was an idea, one could become an American by learning about and devoting oneself to that idea. This fact is embodied today in the citizenship test given to those wishing to become Americans: it is a civics test, with questions about American history and government. The final step is taking an oath of allegiance, in which one pledges to support and defend not the “homeland” but the Constitution. The oath is not to territory or blood, but to what we believe and how we do things: to become an American means to believe in certain ideas and commit to living by them.

The other concept of the state is older and more traditional. The United States is a territory, a piece of land. It is also a particular group of people with unique, identifiable national traits that set them apart from others. Trump’s constant refrain about “the wall” perfectly captures this sense of territory in concrete terms. He says that the borders are absolutely essential to defining the nation: “A nation without borders is not a nation at all.” After the Orlando shooting, Trump tied the idea of the nation explicitly to immigration. Eliding the fact that the killer himself was born in the US, he noted that his parents were immigrants and said: “If we don't get tough and if we don't get smart, and fast, we're not going to have our country anymore. There will be nothing, absolutely nothing left.” Immigrants, he suggested, will destroy the country.

This is why the border must be, in his words, “strong” or “secure.” Keeping “our” country means keeping the wrong people out. Otherwise there will be “people who don’t belong here.” While in theory this could be merely about a given immigrant’s legal status, Trump’s rhetoric and proposals give the lie to that—the Orlando killer’s parents were not “illegal” after all, but they were Afghans and Muslims. The wall won’t be on the border with Canada, either. He singles out Mexicans and Muslims, which has the effect of defining who exactly the people who do “belong here” are—those who are white and Christian. Trump’s nonsensical promise that “we are going to start saying ‘Merry Christmas’ again” signals that he will make America Christian again. He told Tony Perkins: “I see more and more, especially, in particular, Christianity, Christians, their power is being taken away.” The passive voice masks who precisely is doing the taking away, but it is not hard to imagine who he means: it must be non-Christians, maybe secularists, maybe Muslims. Either way, “them,” and not “us.” (It is also noteworthy that he says Christians had “power”—which suggests a previous supremacy that’s been lost.)

By striking these themes, Trump has appealed to this traditional, more tribal concept of what America is, or should be: not an idea based on universal principles, but a state rooted in a particular place and with a specific, dominant identity composed of racial, ethnic, and religious traits that should never change.

The irony is that in doing so, Trump is effectively saying the United States is not really distinctive, at least not in the way it usually thinks of itself. It is a nation like all other nations. Trump has, in fact, explicitly rejected American exceptionalism: “I don't think it's a very nice term. We're exceptional; you're not…. I don't want to say, ‘We're exceptional. We're more exceptional.’ Because essentially we're saying we're more outstanding than you.” While he couched this in business terms, claiming that since the US was being bested in trade it could not claim to be better, he was openly and consciously rejecting a basic tenet of Republican orthodoxy since at least Ronald Reagan. Coming from the standard bearer of the 2016 Republican Party, which has beaten the “American exceptionalism” drum relentlessly (especially in the Obama years), that is rather stunning—but it also makes sense from another perspective.

Jelani Cobb wrote recently in the New Yorker that Trump’s political rise represents the “death of American exceptionalism.” He states: “The United States’ claim to moral primacy in the world, the idea of American exceptionalism, rests upon the argument that this is a nation set apart.” By emulating the “anti-immigrant, authoritarian, and nationalist movements we’ve witnessed in Germany, the U.K., Turkey, and France,” Cobb argues, Trump forfeits that American “claim to moral superiority.”

I agree with Cobb, but I think it goes even deeper than he suggests: it is a rejection of the idea-based definition of what America is and a reversion to an older, European one. American exceptionalism not only encompassed a moral claim, not only set the United States apart from other nations. It even—or maybe especially—set the US apart from those places from which most of its founding generation fled: the states of Europe. Here in America, the thinking went, the people will create something new and different, based on first principles and following the dictates of reason, unrestrained by tradition, culture, religion—by anything but the best ideas. In Thomas Paine’s famous words, “we have it in our power to begin the world over again.” The United States would show the world what could be accomplished when free people creating a new state had the chance to write on John Locke’s tabula rasa. (It should go without saying that this was never literally true, but rather an ideal to which people aspired.)

In doing so, Americans were effectively saying: “We are not our European ancestors. We are different. They are tribal, we are not.” For most of the 19th century and well into the 20th, American isolationism was based on the foundational idea that the US, despite its ancestry, was decidedly not European. It would not be ruled by Europe and it would not be drawn into Europe’s tribal squabbles. The US was different—and better. It may have been born of Europe, but it would supersede it and show it a better way.

More often than not in recent decades, it has been American conservatives who have shown disdain for Europe, sneering at the idea that the US should look to Europe for ideas or leadership of any kind: in law, in public policy, in diplomacy. But scratch the surface and what we see is not contempt for Europe per se but for liberalism as it has developed in Europe since the end of World War II. As right-wing, anti-liberal movements have grown in Europe, so has American conservatism’s appreciation for what Europe has to teach Americans.

As Cobb points out, what is striking about Trump is how much his program resembles that of right-wing extremists in European states who reject that better way America sought to offer in favor of the old European way. Trump’s program is not uniquely American. Arguably, it is following an ancient pattern set in Europe that is rearing its ugly head again in the 21st century. (Trump himself said his election would be “Brexit times 10”—bigger, but not original.) Trump is following more than he is leading, copying a formula that has had some success elsewhere, one that is far from uniquely American. It is, if anything, uniquely European—in the worst sense.

Recently the New York Times had an article on how the far-right European movements have adopted Vladimir Putin as their hero, for his defense of “traditional values.” It quotes an American white Christian nationalist praising Putin: “I see President Putin as the leader of the free world.” (His definition of “free” must be markedly different from the one that has dominated in American political culture, but the framing is telling. Theirs is not the freedom of the Enlightenment, but rather freedom from the threat of the non-western or non-traditional “other.”)

Most American pundits, still caught in a cold-war paradigm, marveled at Trump’s embrace of Putin, and could not understand how it failed to discredit him as it seemingly should have (even this past weekend’s stories on the CIA’s conclusion that Russia sought to help Trump in the election have yet to leave a mark on him). Those critics failed to see that a new paradigm has completely eclipsed that of the cold war. They missed the fact that, despite his KGB pedigree, Putin has transformed himself into “a symbol of strength, racial purity and traditional Christian values in a world under threat from Islam, immigrants and rootless cosmopolitan elites.” In the new paradigm, these are the new enemies, the real enemies of the 21st century. Communists have been vanquished. Islamists, immigrants, globalists, “others” of all kinds, have taken their place. The cold war was a battle of ideologies; this is a battle of identities.

If this take is correct, the combination of Trump’s willingness to jettison American exceptionalism and his embrace of Putinism as “real” leadership portends a significant transformation of what it means to be an American. Rather than a country built on ideas and principles, which defines itself by its devotion to those principles, Trump’s America is simply one (albeit the most powerful) of the many western tribes beating back the “uncivilized” hordes that threaten to undermine the white, Christian traditional identity of the west. In such a world, embracing Putin as a partner makes sense—even if he does have journalists and other political enemies murdered or imprisoned. Embracing anti-liberal autocrats and dictators in order to destroy ISIS becomes not a necessary evil, but a positive good, a desirable state of affairs, a restoration of an ancient European unity against the infidel.

Implicit in this view is a rejection of Enlightenment liberalism. Once you jettison the commitment to an idea and embrace a politics based on racial, ethnic, and religious identity, showing a reckless disregard for democratic norms and processes (as Trump reflexively does) is natural, since those things have no inherent value. How we do things does not matter—all that matters is who we are and what we must do to protect that essential identity. Since American identity is not defined by principles of any kind, it is not important to have principles of any kind. The only standard by which to judge right and wrong is success in defending the homeland from the “other.” So Trump can blithely pledge to restore “waterboarding and a hell of a lot worse than waterboarding” with no qualms whatsoever. After all, he asserts, “torture works.”

Trump has made clear repeatedly that that is his only standard: what works. When asked by the Wall Street Journal after the election whether he had gone too far with his rhetoric during the campaign, he said flatly: “No. I won.” His worldview is entirely instrumental: what works is right, what fails is wrong. Nothing could be more fundamentally opposed to a commitment to liberal process, which values process as a good in itself, as the glue that holds together people with different views and beliefs.

When Marxists, following the logic of economic determinism, claimed that class created identity, fascists countered with racial determinism: the blood determined identity. What has always set liberalism apart from these extremist ideologies is the belief that people create their own identities. As rational beings, we can create who we are by deciding what we believe. We are not merely the products of race, or ethnicity, or class. We are who we choose to be.

What made this election so consequential is that it posed the question of who Americans are as a people as clearly as it has been posed at any time since 1860. Hillary Clinton’s campaign recognized this with its slogan: “Stronger Together.” Trump’s strategy was to encourage white Christian nationalism, and Clinton’s was to say we cannot go back to some tribal concept of American identity. What has disturbed so many of us about Trump’s elevation to the presidency is not simply that our candidate didn’t win. It is that the choice that 46.2% of the voters made is so antithetical to our vision of what America can and should be. It threatens a reversion to a more primitive tribalism that has proved so horrifically destructive in the past. We know the history. We know the danger. That is why this was no normal election and this will be no normal presidency. This country is about to be tested as it has not been since the 1860s, and the outcome is not at all clear.

Wednesday, March 30, 2016

Orrin Hatch's Embarrassing New York Times Op-ed

[This piece was originally published on History News Network]

Senator Orrin Hatch took to the New York Times op-ed page to try to make the case for the Senate refusing to take up President Obama’s Supreme Court nominee.

It didn’t go well.

He starts by praising the late Antonin Scalia, implying that the rules should be different when replacing one of the “greatest jurists in our nation’s history.” The obvious reply is that it does not matter who the president is replacing. All openings on the Court are created equal.

Hatch then asserts that Obama has “contempt” for Scalia’s judicial philosophy. That may or may not be true, but in any case, it is irrelevant. When the electorate once again decisively elected Obama as president in 2012, its decision did not include an asterisk saying he could only replace justices with whom he agreed.

His next point is that, as a senator, Obama opposed two of President Bush’s nominees. Again, this is irrelevant. No one is claiming that Hatch or any other Republican has to support Obama’s nominee—just that Judge Garland deserves a hearing and a vote. Republicans now are as free as Obama was then to oppose the confirmation of the nominee.

Hatch then moves on to even more absurdly irrelevant points, such as his assertion that Obama has “consistently exceeded the scope of his legitimate constitutional authority.” Putting aside how questionable that point is, what Hatch seems to be suggesting is that if senators think such a thing about a president, the president loses the right to exercise legitimate constitutional powers. The Constitution provides Congress with a remedy for a president who exceeds the scope of legitimate constitutional authority: impeachment. The simple fact that a Republican House has not taken up impeachment reveals Hatch’s point for the nonsense it is.

He then notes that the American people have chosen a Democratic president and Republican Senate. Fair enough. But that in no way leads to Hatch’s conclusion that the Senate can therefore ignore the nomination. What that “split decision” suggests is that the Democratic president should nominate a person who is not his political ideal, but a compromise candidate more acceptable to that Republican Senate. By choosing Merrick Garland, that is precisely what Obama has done. He is respecting the idea of checks and balances, both institutionally and politically. He did not choose someone who was a darling of the Democratic left, but someone who has (in the past) been repeatedly praised by Republicans, including Hatch himself. By refusing even to consider the nominee of the elected president, it is Hatch and the Senate Republicans, not the president, who are failing to respect the “split decision” of the American people. They are saying that the smaller subset of the American public that elected those Senate Republicans can simply ignore the decision of the entire national electorate in the last presidential election.

For an historian, perhaps the most offensive point Hatch makes is this: “Throughout its history, the Senate has never confirmed a nominee to fill a Supreme Court vacancy that occurred this late in a term-limited president’s time in office.” As a history teacher, I am used to the instinct unprepared undergraduates have to bolster a poor argument with the “throughout history” trick. I expect better of United States Senators.

Hatch shows his contempt for his readers with this tortured construction. To make his “throughout its history” line work, Hatch needs to make that history awfully short. He does that with the phrase “term-limited.” The 22nd Amendment, which imposes term limits on presidents, has only been in effect for 65 years. So this particular “throughout its history” means for 65 years—less than Hatch’s own life span.

As I pointed out in my previous piece on this subject, there has only been one other vacancy during that period that was “this late” in a president’s term: LBJ’s nomination of Abe Fortas in 1968. Yes, Fortas was not confirmed as Chief Justice. That nomination received a hearing, however, and a vote. It was not met with this disingenuous nonsense that “we never do this.” And as Hatch well knows, 1968 was one of the most contentious election years in American history. Somehow, the Senate still did its job.

That leads to the next part of Hatch’s “kitchen sink” piece. He blames the “toxic presidential election” for Republican irresponsibility on this nomination. Anyone paying attention knows that the current toxicity is almost entirely on his party’s side: Hillary Clinton and Bernie Sanders have mostly conducted their primary contest on a high, substantive level. Hatch calls this the “nastiest election year in recent memory,” neglecting to mention where the nastiness is coming from. By some inexplicable logic, the fact that the Republican Party is wallowing in the political gutter means that the Democratic president’s nominee for the Supreme Court should not be treated like any other nominee.

Lastly, Hatch notes: “I have witnessed firsthand the deterioration of the confirmation process. Neither party has clean hands on this front.” That is true. It is also true that what Hatch proposes as the responsible course of action is in fact an extraordinary escalation of the politicization of the nomination process far beyond what either party has done in the past. It shows contempt for the 2012 electorate that elected Barack Obama. It shows contempt for the president personally. It shows contempt for American history.

Nothing in Hatch’s piece changes any of that.

If Hatch and his fellow Republicans want to vote against Judge Garland, they have every right to do so. But they should stop being cowards. They should make a substantive argument against him, vote against him, and accept the political consequences of that vote. They should stop pretending that this reckless path they have chosen is anything but a desperate attempt to hold onto a Supreme Court majority.

Sunday, March 6, 2016

BBC Radio 5 Interview on Ted Cruz and His Phony Supreme Court "Tradition"


On February 25, Rhod Sharp of the BBC Radio 5 program "Up All Night" interviewed me about the Supreme Court vacancy and the post I wrote about Ted Cruz.


Tuesday, February 16, 2016

Ted Cruz's Phony Supreme Court "Tradition"

[This post originally appeared on History News Network]

“It has been 80 years since a Supreme Court vacancy was nominated and confirmed in an election year. There is a long tradition that you don't do this in an election year.”—Senator Ted Cruz

If he honestly believes it is not legitimate to nominate and confirm a justice in an election year, Ted Cruz must hate the appointment of Chief Justice John Marshall. John Adams nominated him in January 1801, after he lost his re-election bid to Thomas Jefferson in the election of 1800. Adams was a lame duck in the truest sense of the term—he was serving out the remainder of his term after being repudiated by the voters. Yet he did not hesitate to fill the vacancy in the Supreme Court, and Marshall was confirmed by a lame duck Senate.

Perhaps the most striking irony of Cruz’s position (and increasingly the position of the entire Republican Party) is that this absurd debate is taking place over the replacement of Antonin Scalia. If there is one thing Scalia was known for, it is his originalist interpretation of the Constitution: it means what the founding generation said it meant. So it seems appropriate to ask: what did the Founders actually do in such circumstances?

In the final year of his presidency, George Washington had two nominations to the Supreme Court approved by the Senate. It was an election year and he was not running for reelection. It doesn’t get more "original intent" than that. Adams could easily have left the Supreme Court vacancy for Jefferson—who had already been elected, after all, and would take office in a matter of weeks—and didn’t. That seems as clear as it could be. The founders saw no impediment to a president in the final year--or even in the final weeks--of the presidency successfully appointing new justices to the Supreme Court.

What about Cruz's contention about the last 80 years? Even that does not hold up.

The facts are pretty simple. In the last 80 years there has only been one instance in which a president was in a position to nominate a justice in an election year and did not have the nominee confirmed. In 1968, LBJ’s nomination of Abe Fortas to be Chief Justice to succeed Earl Warren (and of Homer Thornberry to take the seat held by Fortas) was blocked in the Senate, but not because of some alleged “tradition.” Certainly there were Senators who wanted the next president to name a new justice. But the opposition to Fortas had everything to do with the specific nominee and specific objections to him (particularly charges of cronyism and inappropriate financial dealings). To the best of my knowledge, no one cited Cruz’s “tradition” to say it was not appropriate for Johnson to nominate someone, or that it would have been inappropriate to confirm anyone.

A second instance took place 28 years earlier. In 1940, FDR nominated Frank Murphy in January of that election year and he was confirmed that same month. There was no “tradition” blocking that election-year appointment. (This also shows that Cruz got the math wrong—this happened 76 years ago, not 80.) [Note: The morning after this post first appeared, Orrin Hatch spoke on NPR and amended the claim to no "term-limited" president had had a nominee confirmed in an election year--evidently an attempt to exempt FDR's confirmed nominee from the "tradition."]

So, there were two instances similar to the current situation in the last 80 years. In one case the nomination was rejected and in the other it wasn’t. To Ted Cruz, this constitutes “a long tradition that you don't do this.”

Ted Cruz’s invention of this alleged "tradition" that we don’t nominate and confirm Supreme Court justices in an election year would be laughable if so many supposedly responsible political leaders were not taking it seriously.

It is absurd on the face of it. If the Republicans in the Senate want to block any nominee Barack Obama sends them, they have the votes to do it. But they should stop hiding behind the obvious fiction that doing so is part of some “tradition.” It would be nothing but the raw, cynical use of their political power. This suggestion that Obama should not even nominate someone (both John Kasich and Marco Rubio said so in Saturday’s debate), or if he does, that the nominee should be rejected out of hand simply because of the timing (as the Senate Majority Leader and many Republican Senators are now saying), is simply silly. 

True conservatives don’t invent traditions. They work to protect existing ones. Our true tradition is that the president nominates and the Senate votes, regardless of when the vacancy occurs. 

The speed with which Cruz jumped to make this claim, and with which so many others have fallen in line, speaks to the nihilistic radicalism that has infected today's Republican Party. Any position can be taken if it produces the correct result. Facts can be denied, “traditions” can be invented. The only value taken seriously is “does it work to our advantage?”

This tactic may well work politically. It has already had the effect of framing the debate as “Should Obama nominate someone?” That is truly extraordinary. The actual question should be “Should the Senate confirm Obama’s nominee?” That’s a legitimate debate, but it would put the focus on the nominee and that person’s qualifications. By hiding behind this phony “tradition,” Republicans are trying to avoid having to show that a given nominee should be rejected on the merits. In short, they don’t want to take responsibility for rejecting someone who—in all likelihood—will be eminently qualified for the job. That’s not statesmanship. It’s cowardice.

Wednesday, December 16, 2015

A Brief History of American Attitudes Toward Refugees

      [Back in September, in response to efforts opposing the resettlement of Syrian refugees in South Carolina, my colleague Dr. Byron McCane organized a group of Wofford College faculty to present a panel on the subject of refugees. My colleagues Dr. Laura Barbas-Rhoden (Modern Languages), Dr. Phil Dorroll (Religion), Dr. Kim Rostan (English) and I all participated. My job was to give a brief overview of refugees in American history in the September 24 event at Wofford.
      On Nov. 11, we reprised the panel at the University of South Carolina in Columbia, with the welcome additions of USC colleagues Dr. Breanne Grace (College of Social Work) and Dr. Rajeev Bais (Clinical Internal Medicine).
      Due to recent events, the refugee situation has unfortunately become a political issue in the presidential race, with candidates like Ted Cruz and Jeb Bush asserting that only Christian refugees should be admitted into the United States, and Donald Trump calling for a ban on all Muslims entering the United States. Below is an adapted version of my presentations at Wofford and USC. This previously appeared as a series of posts on History News Network.]
     
     
From the earliest days of the republic, the American attitude toward those fleeing conflicts and hardship abroad has been marked by an ambivalence and tension between two contradictory reactions.
     
On the one hand, Americans want to see themselves as a people who welcome refugees. In the 1790s, the American scientist David Rittenhouse said the United States was “an asylum to the good, to the persecuted, and to the oppressed of other climes.” The prominent historian Gordon Wood writes: “By the early 1790s Americans were not surprised that their country was in fact attracting refugees from the tyrannies of the Old World. The enlightened everywhere had come to recognize the United States as the special asylum for liberty.”
     
On the other hand, Americans have also feared that such people might represent a danger to the United States: religious, political, economic, cultural--or all of the above.
     
When I say from the earliest days, I mean just that. The decade of the 1790s saw nearly 100,000 immigrants come into the United States—at a time when the population of the country was about 4 million people. Probably at least 15-20,000 of them were political refugees, fleeing revolutionary violence and political oppression.
     
The first refugee crisis in United States history came during the first term of George Washington, in 1792. The revolution in Santo Domingo led to thousands of refugees fleeing the island, most of whom came to Richmond, Virginia. One historian’s estimate of perhaps 10,000 is probably too high, but there are records indicating the existence of at least 2,000 such refugees in the US by 1794. We know this because Congress voted a specific appropriation of $15,000 for the relief of the refugees (out of a $6.3 million budget that year). As the historian of this incident concluded: “For the first time in its existence as an independent state, the United States met the refugee problem in its most tragic form, and met it with … generosity and human sympathy.”
     
Many thousands of other refugees also fled to the United States in the 1790s, mostly from the more famous revolution in France. They were, as one historian put it, of all political stripes: “Royalists, Republicans, Catholics, Masons, courtiers, artisans, priests and philosophers.” These political refugees started their own explicitly political newspapers and book presses. They brought their passions with them, and competing groups sometimes engaged in street violence against each other.
     
In 1795, the pro-British Jay’s Treaty damaged American relations with revolutionary France and threatened to result in outright war. If war came, the Federalists feared that the French would use “native collaborators to create revolutionary puppet republics” and “French emigres and Jacobinical sympathizers in the country [might] become collaborators.”
     
Suddenly, asylum seekers were seen as the threat within. In 1798, Federalist Rep. Harrison Gray Otis of Massachusetts said: “Do we not know that the French nation have organized bands of aliens as well as their own citizens, in other countries, to bring about their nefarious purposes? By these means they have overrun all the republics in the world but our own … And may we not expect the same means to be employed against this country?” Another Federalist said that the new immigrants were “the grand cause of all our present difficulties” and plainly stated: “let us no longer pray that America become an asylum to all nations.”
     
As a result of this growing fear, Congress changed the law. The first Naturalization Law in 1790 had required only two years residency in the US before one could become a citizen. That was extended to five years residency in 1795, and then in 1798, Congress raised it to 14 years. All immigrants were required to register with the government within 48 hours of arrival, and the law forbade all aliens who were citizens or subjects of a nation with which the US was at war from becoming American citizens.
     
The crackdown on immigrants and refugees was inextricably wrapped up in domestic politics. The Alien Act, passed by a Federalist Congress and signed by a Federalist president, was a reaction to their fear that the newcomers were overwhelmingly supporters of Thomas Jefferson’s Republican party. Refugees from revolutionary France were joined by hundreds, perhaps thousands, fleeing political oppression in Ireland. Their historian Michael Durey concludes: “the radicals’ experiences after emigration were too varied and problematic to allow us to any longer assume uncritically that America was a welcoming asylum for them all. For many it was Bedlam.”
     
The Alien Act was allowed to expire, and the anti-French fever broke. But the tendency to both welcome and fear refugees would continue in the 19th century, long after the specific fear of the French dissipated.

Fifty years after the Alien Act, revolution in Europe again produced a similar American reaction to the influx of refugees. The revolutions of 1848, starting in Paris and spreading through much of Europe, also produced a large number of political refugees to the United States, especially Germans who were known in the U.S. as the “Forty-eighters.”

The American government generally welcomed the revolutions, seeing them as democratic in character, and thus consistent with American values. In fact, the United States “was the only major government which saw fit to send greetings to the Parliament at Frankfurt.” President James K. Polk stated: “The great principles of … the Declaration of Independence seem now to be in the course of rapid development throughout the world.”

But as students of the Revolutions of 1848 well know, those revolutions were more complex than that, and so were the refugees who fled to America. According to their historian, the “typical Forty-eighter was a freethinker, if not an atheist. They believed in universal suffrage, abolition of the Sunday laws, taxation of church property, establishment of the eight hour day, and government ownership of the railroads.”

Some Americans thus denounced them as “socialists, rationalists, atheists and desecrators of the Sabbath.” Southerners in particular feared their influence because the Forty-eighters were thought to favor abolitionism. Some Forty-eighters were, in fact, socialists. One, Ernst Schmidt, would later run for mayor of Chicago in 1879 on a socialist ticket, while others formed their own utopian socialist communities in the United States.

Some of the Forty-eighters were also liberal Catholics, and of course at the same time thousands upon thousands of Irish Catholics were arriving in the United States as economic refugees of the famine in Ireland. This combination gave rise to an explicitly nativist movement that found political expression in the American Party, more commonly known as the “Know-Nothings.”

The Know-Nothings never actually succeeded in changing American law regarding refugees and immigrants, but in their oath, members pledged to never vote for any man for any office who was not born in the United States. They called for “War to the hilt on political Romanism” and “Hostility to all Papal influences when brought to bear against the Republic.” They effectively argued that Catholicism was not so much a religion deserving First Amendment protection as a dangerous political movement contrary to democracy. (This is reminiscent of Dr. Ben Carson’s recent statement that Islam is “inconsistent with the values and principles of America.”)

The Know-Nothings saw the Irish and Germans as a religious/political threat, bringing “Popery” to the United States and thus undermining American principles. The Know-Nothings wanted to deny the newcomers the right to vote—they called for increasing the required number of years of residency from 5 to 21 before an immigrant could vote. (I cannot help but wonder how the Know-Nothings of the 1850s would have reacted to the sight back in September of the Pope, standing where the President stands when giving the State of the Union, addressing the United States Congress, while the Catholic Speaker of the House and Catholic Vice-President sat behind him.)

Despite these political reactions in the mid-19th century, what seems noteworthy in retrospect is that there was no legislative attempt to actually prevent any people from coming into the United States prior to 1882. When it happened, it was deliberately, openly discriminatory. That year, Congress passed the Chinese Exclusion Act. This was an explicitly racial law, a response to the popular backlash against the large number of Chinese in the west, which barred immigration by the Chinese.

Most Americans are familiar with the fact that many Chinese came to work on the transcontinental railroad, but what is often forgotten is that many were also refugees from one of the bloodiest periods of Chinese history, the era of the Taiping Rebellion—in the 30 years before the Chinese Exclusion Act, an estimated 20-30 million Chinese died in a major civil war and several different rebellions. Over 1.5 million fled China, and historians estimate that 250,000 of them came to the United States. (The 1890 census counted over 100,000 Chinese in the United States, most of them in the West.) The Exclusion Act remained law for 60 years, until it was finally repealed during World War II, when China was an American ally in the war against Japan.

Despite the Chinese Exclusion Act, for most of the people of the world, the United States remained a place of asylum. The great turning point was World War I. The previous two decades had seen millions of immigrants, many from southern and eastern Europe, arrive on American shores, leading to increasing calls for limitations.

Once America entered the World War in 1917, the fear that lingering attachments of these relative newcomers to their mother countries might create conflicting loyalties in wartime led to the propaganda theme “100% Americanism.” In addition to the well-known reactions against German-Americans, any so-called “hyphenated American” now became suspect. The Bolshevik Revolution in Russia in 1917 added the fear of radical politics to the mix—this, despite the fact that many of those seeking asylum in the United States because of the revolutions in Russia were fleeing the Bolsheviks, not radicals who shared their views.

The postwar period saw immigrants—particularly those suspected of radical politics—subjected to heightened levels of scrutiny and even deportation. The drive to impose restrictions on immigration eventually led to legislation: first the Emergency Quota Act of 1921, and then a permanent Immigration Act in 1924.

As a result of decades of growing nativist sentiment, the United States for the first time in its history imposed quota limits on the number of people allowed to come into the country: 165,000 maximum per year, with a quota that was based on the number of people from that country in the 1890 census. No specific provision was made in the legislation for refugees. Supporters of the legislation made it clear that the goal of maintaining an “Anglo-Saxon” nation was more important than being an “asylum for the oppressed.”

Senator Ellison DuRant Smith of South Carolina said:
Thank God we have in America perhaps the largest percentage of any country in the world of the pure, unadulterated Anglo-Saxon stock; certainly the greatest of any nation in the Nordic breed. It is for the preservation of that splendid stock that has characterized us that I would make this not an asylum for the oppressed of all countries, but a country to assimilate and perfect that splendid type of manhood that has made America the foremost Nation.
After 140 years of effectively welcoming all those who wished to come, the United States shut the door.

It is probably no coincidence that this change corresponds roughly with the emergence of the nation as a great power on the world stage. While outsiders had long been viewed suspiciously—especially those with different religious or political views—now such people were perceived as not just a potential internal threat, but as what we would now call a “national security threat.”

There was no need for an American policy toward refugees prior to the 1920s, since there were so few restrictions on entering the United States. The immigration restriction legislation, however, changed that. It required that no more than two percent “of the number of foreign-born individuals of such nationality resident in continental United States as determined by the United States Census of 1890” be allowed into the United States in any year. By setting strict numerical quotas based on the country of origin, the law left no flexibility depending on the circumstances in that country, and thus no ability to adjust to a refugee problem.
     
Thus the Immigration Act of 1924 set the stage for two disgraceful incidents in America’s history of dealing with refugees. Despite the rising persecution of German Jews in the late 1930s, all German immigration to the United States was subject to the existing yearly quota (due to the formula noted above, Germany actually had by far the highest quota in the world, over 50,000). In early 1939, in the aftermath of Kristallnacht in November 1938, Sen. Robert Wagner of New York proposed to Congress a Refugee Act that would allow 20,000 German children into the United States, over and above the established yearly national quota.
     
Wagner’s intent was that those children would be German Jews, but fearing that anti-Semitism would doom the bill, he did not specify that in the legislation. Opponents argued that, whatever its merits might be, the bill would undermine the quota system. They also made an economic argument. One said in testimony to Congress: “These children, if admitted, will presumably grow up and as youths will become competitors with American citizens for American jobs.” Opponents killed the bill in Congress, and no refugee children came to the United States. There is no way to know how many children might have been able to enter the United States, but it seems likely that some who might have been saved later died in the Holocaust.
     
In another case that has become much more well known in recent weeks, we know exactly how many could have been saved had they been admitted into the United States. In the midst of the Refugee Bill debate, the ocean liner St. Louis sailed from Hamburg for Cuba with over 900 German Jews fleeing the Nazi regime. Opposition arose in Cuba to letting them into that country, with anti-immigrant groups claiming that the passengers were Communists and thus should not be admitted. Only 22 of the Jewish passengers were allowed into Cuba. 743 were awaiting visas to enter the United States but had not received them. They cabled the State Department and the White House from the ship asking to be allowed into the United States. But at that time 83% of Americans opposed any relaxation of the immigration laws, and since the German quota for the year had already been filled, they were denied entry. The passengers returned to Europe. The British, Dutch, Belgians and French took in the refugees. But due to the wartime German occupation of all of those countries save Britain, some 254 of them died in the Holocaust. The United States government, knowing full well that Germany was persecuting its Jews, refused to alter its immigration policy, and 254 lives that could have been saved were lost.
     
World War II, of course, created an unprecedentedly large refugee problem. In 1945, President Truman did what FDR never did: he issued an executive order allowing in 40,000 refugees above the quotas. In 1946, he proposed to Congress the Displaced Persons Act, which produced the same kind of response as Wagner’s Refugee Act did in 1939—opponents charged (despite the nearly full employment postwar economy) that they would take jobs from returning veterans. Some argued that the bill would allow Communists into the United States. Concerns that large numbers of Jews (who were often equated with Communism) would be admitted led supporters of the legislation to stress that most of those admitted would be Christians. This time opponents did not defeat the bill. It passed. They did, however, cut the number admitted into the US in half, from 400,000 to 200,000.
     
Throughout most of the cold war, U.S. policy toward refugees was largely driven by cold war foreign policy and made on a case-by-case basis. As a rule, those fleeing communism were welcomed. The United States admitted refugees of the Hungarian revolution against the Soviet-backed regime in 1956, for example. Under President Dwight Eisenhower, the United States conducted “Operation Safe Haven” for Hungarian refugees. Eisenhower said: “It is heartening to witness the speed with which free nations have opened their doors to these most recent refugees from tyranny. In this humanitarian effort our own nation must play its part.” That pattern was repeated several times: those fleeing the Cuban revolution in 1959, as well as the boat people seeking to escape North Vietnam’s conquest of the south in 1975 (under the Indochinese Migration and Refugee Assistance Act), were welcomed into the United States, while those fleeing other tyrannies were often out of luck. In 1980, the Refugee Act finally put refugees outside the regular immigration system, allowing for 50,000 refugees per year.
     
In sum, the reactions we see today to the prospect of admitting refugees from Syria and elsewhere have a long history in this country. Americans have a history of both welcoming and refusing refugees. Today we face a choice: which of those legacies will we embrace? When I began working on this issue nearly three months ago, I had some hope that it would be the former. The events in Paris and San Bernardino—and more importantly, the generally fearful reaction of many Americans to those events—have left me fearing that Americans are more inclined to opt for the latter. What this overview of the history shows is that such fears have in the past been overblown, and Americans have often had great reason to regret their fear-driven, short-sighted overreactions. Nevertheless, that list of regrets looks likely to grow longer.

Tuesday, June 23, 2015

The Flag Needs to Go Down. So Does the Lie It Represents

Yes, it’s a symbol. That doesn’t mean it doesn’t matter. That is why it matters.

Within hours of Gov. Nikki Haley calling for the removal of the Confederate battle flag from the grounds of the state capitol in Columbia, Wal-Mart announced that it would stop selling articles with that symbol on them. The Republican Speaker of the Mississippi House said it was time to change the state flag that contains it. Amazon and NASCAR have turned against it. The acknowledgement that it deserves no place of honor may be contagious.

But we should not—cannot—be satisfied with the removal of the symbol. We also have a responsibility to combat the lie it represents.

While Gov. Haley’s decision to support removing the flag is undeniably progress, the way she and other elected officials couch their new-found sensitivity to the insult this flag has always been to black citizens is troubling.

In her statement, Haley said: “For many people in our state, the flag stands for traditions that are noble. Traditions of history, of heritage, and of ancestry. At the same time, for many others in South Carolina, the flag is a deeply offensive symbol of a brutally oppressive past." There is room for both views, she said: “We do not need to declare a winner and a loser."

That is where she is wrong. We do need to declare something: the truth wins and the lie loses. Leadership—true leadership—does not create false equivalencies such as this. Both views, she said, are reasonable. They are not. One is in line with historical reality, while the other is the product of historical self-delusion.

Symbols can be tricky. Meaning can vary from person to person. But we’re not talking about a piece of abstract art in this case. We are talking about a symbol of a specific historical entity. I cannot simply declare that, for me personally, the Confederate battle flag represents, say, abolitionism. It was a flag under which men fought against the armies of the United States government, in defense of a government that had as its central tenet the preservation of slavery. That is not up for discussion or debate. (Ta-Nehisi Coates has compiled an exhaustive collection of Confederate leaders saying so.)

In 1948, Strom Thurmond’s Dixiecrats waved it to show their opposition to President Truman’s civil rights plank in the Democratic platform. Throughout the civil rights movement, segregationists flew it to show their devotion to Jim Crow and their rejection of racial equality. Rabid segregationists waved it in the faces of civil rights protesters, and Gov. George Wallace of “segregation now, segregation forever” infamy proudly stood in front of it. People who defiantly shoved that flag in the face of people marching for racial equality still walk among us.

In each of those instances, it represented a willingness to fight to maintain white supremacy.

The reality that many people refuse to acknowledge those facts does not change them.

Those who still openly defend that flag are, fortunately, diminishing in number. But we are still asked to believe that the near-universal meaning people attributed to it in the past is not the “real” meaning for its supporters now. Now we are told the murderer “hijacked” the Confederate battle flag. It’s not about slavery or segregation now, it’s about “southern pride.”

What does that term mean? One of the murderer’s friends recalled: “I never heard him say anything, but just he had that kind of Southern pride, I guess some would say…. He made a lot of racist jokes, but you don’t really take them seriously like that.” For this friend, making racist jokes was a sign of “southern pride.” Racism is only serious, it seems, when it leads to actual violence. When it’s jokes and racial epithets, it’s “southern pride.”

Even if we allow that today most white southerners would not define “southern pride” that way, when one associates “southern pride” with a flag that the overwhelming majority of black southerners find offensive, there is a damning, unstated admission: their “south” is, of course, a white south. It is not the south of slaves and their descendants. They were denied their humanity under slavery, they were denied their rights under segregation, and they are denied their southern identity by this definition of “southern pride.”

Haley’s remarks say that the “southern pride” view is worthy of respect. It is not. Only by denying the historical reality of how that flag has been used—not by the one or the few, but by the many—can one view it as representing anything “noble.” It is that kind of denialism that allowed the murderer to believe that the flag called for his hateful violence. It has promoted violence in the name of white supremacy throughout its history, but it has persisted in our culture under the guise of a harmless “southern pride.” The murderer did not hijack it, he did not “misappropriate” it. He made manifest--in the ugliest, most awful way--what it has always meant.

He tore the disguise off so utterly that even many of the willfully blind could not help but see.

That will be why the flag comes down.

No one who has dodged this issue in the past, or openly been on what is now clearly the wrong side of it, wants to have to admit having been wrong. But some flag supporters have. The former radio host and speech writer known as the “Southern Avenger” recently wrote: “I was wrong. That flag is always about race.” That’s the kind of honest reckoning with the past that we need.

Most politicians, however, present this act of removal not so much as a change of opinion but as a change of circumstances. They are beneficently going above and beyond due to the extreme circumstances created by this event. But this awful event did not really create new circumstances. It simply made undeniable what has always been true. It has shamed at least some people. They know what that flag means. But they still continue to indulge the fantasies of those who insist it is only about “southern pride,” and tell them that their point of view is a perfectly legitimate one.

There is a price to be paid for indulging a lie.

For 150 years, this nation has failed to recognize fully what Frederick Douglass rightly identified in 1878 as the central truth of the Civil War: “There was a right side and a wrong side in the late war which no sentiment ought to cause us to forget.”

The nation’s willingness to indulge the “Lost Cause” mythology of the defeated Confederates is one of the reasons that 150 years later this mass murderer had no problem finding a false version of history (adhered to by many people beyond the Council of Conservative Citizens and the Sons of Confederate Veterans) that supported his vile racism.

Taking down the Confederate battle flag is the right thing to do—but not just because it stands in the way of unity at a time of bereavement. It should come down because it represents a pernicious lie: that the south worth honoring is a white supremacist one. Taking it down while indulging the lie is still progress. But it nonetheless avoids the hard truths that need to be spoken.

The cause of the Confederacy was not “noble.” The cause of the segregationists was not “noble.” Neither deserves any honor or reverence.

There is a right side and a wrong side in the Confederate flag debate which no sentiment ought to cause us to forget.