Wednesday, December 16, 2015

A Brief History of American Attitudes Toward Refugees

      [Back in September, in response to efforts opposing the resettlement of Syrian refugees in South Carolina, my colleague Dr. Byron McCane organized a group of Wofford College faculty to present a panel on the subject of refugees. My colleagues Dr. Laura Barbas-Rhoden (Modern Languages), Dr. Phil Dorroll (Religion), Dr. Kim Rostan (English) and I all participated. My job was to give a brief overview of refugees in American history in the September 24 event at Wofford.
      On Nov. 11, we reprised the panel at the University of South Carolina in Columbia, with the welcome additions of USC colleagues Dr. Breanne Grace (College of Social Work) and Dr. Rajeev Bais (Clinical Internal Medicine).
      Due to recent events, the refugee situation has unfortunately become a political issue in the presidential race, with candidates like Ted Cruz and Jeb Bush asserting that only Christian refugees should be admitted into the United States, and Donald Trump calling for a ban on all Muslims entering the United States. Below is an adapted version of my presentations at Wofford and USC. This previously appeared as a series of posts on History News Network.]
From the earliest days of the republic, the American attitude toward those fleeing conflict and hardship abroad has been marked by a tension between two contradictory reactions.
On the one hand, Americans want to see themselves as a people who welcome refugees. In the 1790s, the American scientist David Rittenhouse said the United States was “an asylum to the good, to the persecuted, and to the oppressed of other climes.” The prominent historian Gordon Wood writes: “By the early 1790s Americans were not surprised that their country was in fact attracting refugees from the tyrannies of the Old World. The enlightened everywhere had come to recognize the United States as the special asylum for liberty.”
On the other hand, Americans have also feared that such people might represent a danger to the United States: religious, political, economic, cultural--or all of the above.
When I say from the earliest days, I mean just that. The decade of the 1790s saw nearly 100,000 immigrants come into the United States—at a time when the population of the country was about 4 million people. Probably at least 15,000 to 20,000 of them were political refugees, fleeing revolutionary violence and political oppression.
The first refugee crisis in United States history came during the first term of George Washington, in 1792. The revolution in Santo Domingo led to thousands of refugees fleeing the island, most of whom came to Richmond, Virginia. One historian’s estimate of perhaps 10,000 is probably too high, but there are records indicating the existence of at least 2,000 such refugees in the US by 1794. We know this because Congress voted a specific appropriation of $15,000 for the relief of the refugees (out of a $6.3 million budget that year). As the historian of this incident concluded: “For the first time in its existence as an independent state, the United States met the refugee problem in its most tragic form, and met it with … generosity and human sympathy.”
Many thousands of other refugees also fled to the United States in the 1790s, mostly from the more famous revolution in France. They were, as one historian put it, of all political stripes: “Royalists, Republicans, Catholics, Masons, courtiers, artisans, priests and philosophers.” These political refugees started their own explicitly political newspapers and book presses. They brought their passions with them, and competing groups sometimes engaged in street violence against each other.
In 1795, the pro-British Jay’s Treaty damaged American relations with revolutionary France and threatened to result in outright war. If war came, the Federalists feared that the French would use “native collaborators to create revolutionary puppet republics” and “French emigres and Jacobinical sympathizers in the country [might] become collaborators.”
Suddenly, asylum seekers were seen as the threat within. In 1798, Federalist Rep. Harrison Gray Otis of Massachusetts said: “Do we not know that the French nation have organized bands of aliens as well as their own citizens, in other countries, to bring about their nefarious purposes? By these means they have overrun all the republics in the world but our own … And may we not expect the same means to be employed against this country?”  Another Federalist said that the new immigrants were “the grand cause of all our present difficulties” and plainly stated: “let us no longer pray that America become an asylum to all nations.”
As a result of this growing fear, Congress changed the law. The first Naturalization Law in 1790 had required only two years’ residency in the US before one could become a citizen. That was extended to five years’ residency in 1795, and then in 1798, Congress raised it to 14 years. All immigrants were required to register with the government within 48 hours of arrival, and the law forbade all aliens who were citizens or subjects of a nation with which the US was at war from becoming American citizens.
The crackdown on immigrants and refugees was inextricably wrapped up in domestic politics. The Alien Act, passed by a Federalist Congress and signed by a Federalist president, was a reaction to their fear that the newcomers were overwhelmingly supporters of Thomas Jefferson’s Republican party. Refugees from revolutionary France were joined by hundreds, perhaps thousands, fleeing political oppression in Ireland. Their historian Michael Durey concludes: “the radicals’ experiences after emigration were too varied and problematic to allow us to any longer assume uncritically that America was a welcoming asylum for them all. For many it was Bedlam.”
The Alien Act was allowed to expire, and the anti-French fever broke. But the tendency to both welcome and fear refugees would continue in the 19th century, long after the specific fear of the French dissipated.

Fifty years after the Alien Act, revolution in Europe again produced a similar American reaction to an influx of refugees. The revolutions of 1848, starting in Paris and spreading through much of Europe, sent a large number of political refugees to the United States, especially Germans, who were known in the U.S. as the “Forty-eighters.”

The American government generally welcomed the revolutions, seeing them as democratic in character, and thus consistent with American values. In fact, the United States “was the only major government which saw fit to send greetings to the Parliament at Frankfurt.” President James K. Polk stated: “The great principles of … the Declaration of Independence seem now to be in the course of rapid development throughout the world.”

But as students of the Revolutions of 1848 well know, those revolutions were more complex than that, and so were the refugees who fled to America. According to their historian, the “typical Forty-eighter was a freethinker, if not an atheist. They believed in universal suffrage, abolition of the Sunday laws, taxation of church property, establishment of the eight hour day, and government ownership of the railroads.”

Some Americans thus denounced them as “socialists, rationalists, atheists and desecrators of the Sabbath.” Southerners in particular feared their influence because the Forty-eighters were thought to favor abolitionism.  Some Forty-eighters were, in fact, socialists.  One, Ernst Schmidt, would later run for mayor of Chicago in 1879 on a socialist ticket, while others formed their own utopian socialist communities in the United States.

Some of the Forty-eighters were also liberal Catholics, and of course at the same time thousands upon thousands of Irish Catholics were arriving in the United States as economic refugees of the famine in Ireland. This combination gave rise to an explicitly nativist movement that found political expression in the American Party, more commonly known as the “Know-Nothings.”

The Know-Nothings never actually succeeded in changing American law regarding refugees and immigrants, but in their oath, members pledged never to vote for any man for any office who was not born in the United States. They called for “War to the hilt on political Romanism” and “Hostility to all Papal influences when brought to bear against the Republic.” They effectively argued that Catholicism was not so much a religion deserving First Amendment protection as a dangerous political movement contrary to democracy. (This is reminiscent of Dr. Ben Carson’s recent statement that Islam is “inconsistent with the values and principles of America.”)

The Know-Nothings saw the Irish and Germans as a religious/political threat, bringing “Popery” to the United States and thus undermining American principles. The Know-Nothings wanted to deny the newcomers the right to vote—they called for increasing the required number of years of residency from 5 to 21 before an immigrant could vote. (I cannot help but wonder how the Know-Nothings of the 1850s would have reacted to the sight back in September of the Pope, standing where the President stands when giving the State of the Union, addressing the United States Congress, while the Catholic Speaker of the House and Catholic Vice-President sat behind him.)

Despite these political reactions in the mid-19th century, what seems noteworthy in retrospect is that there was no legislative attempt to actually prevent any people from coming into the United States prior to 1882. When it happened, it was deliberately, openly discriminatory. That year, Congress passed the Chinese Exclusion Act, an explicitly racial law that responded to the popular backlash against the large number of Chinese in the West by barring immigration by the Chinese.

Most Americans are familiar with the fact that many Chinese came to work on the transcontinental railroad, but what is often forgotten is that many were also refugees from one of the bloodiest periods of Chinese history, the era of the Taiping Rebellion—in the 30 years before the Chinese Exclusion Act, an estimated 20-30 million Chinese died in a major civil war and several different rebellions. Over 1.5 million fled China, and historians estimate that 250,000 of them came to the United States. (The 1890 census counted over 100,000 Chinese in the entire country.) The Exclusion Act remained law for 60 years, until it was finally repealed during World War II, when China was an American ally in the war against Japan.

Despite the Chinese Exclusion Act, for most of the people of the world, the United States remained a place of asylum. The great turning point was World War I. The previous two decades had seen millions of immigrants, many from southern and eastern Europe, arrive on American shores, leading to increasing calls for limitations.

Once America entered the World War in 1917, the fear that lingering attachments of these relative newcomers to their mother countries might create conflicting loyalties in wartime led to the propaganda theme “100% Americanism.” In addition to the well-known reactions against German-Americans, any so-called “hyphenated American” now became suspect. The Bolshevik Revolution in Russia in 1917 added the fear of radical politics to the mix—this, despite the fact that many of those seeking asylum in the United States because of the revolutions in Russia were fleeing the Bolsheviks, not radicals who shared their views.

The postwar period saw immigrants—particularly those suspected of radical politics—subjected to heightened levels of scrutiny and even deportation. The drive to put restrictions on immigration eventually led to legislation: first the Emergency Quota Act of 1921, and then a permanent Immigration Act in 1924.

As a result of decades of growing nativist sentiment, the United States for the first time in its history imposed quota limits on the number of people allowed to come into the country: 165,000 maximum per year, with each nation’s quota based on the number of people from that country in the 1890 census. No specific provision was made in the legislation for refugees. Supporters of the legislation made it clear that the goal of maintaining an “Anglo-Saxon” nation was more important than being an “asylum for the oppressed.”

Senator Ellison DuRant Smith of South Carolina said:
Thank God we have in America perhaps the largest percentage of any country in the world of the pure, unadulterated Anglo-Saxon stock; certainly the greatest of any nation in the Nordic breed. It is for the preservation of that splendid stock that has characterized us that I would make this not an asylum for the oppressed of all countries, but a country to assimilate and perfect that splendid type of manhood that has made America the foremost Nation.
After 140 years of effectively welcoming all those who wished to come, the United States shut the door.

It is probably no coincidence that this change corresponds roughly with the emergence of the nation as a great power on the world stage. While outsiders had long been viewed suspiciously—especially those with different religious or political views—now such people were perceived as not just a potential internal threat, but as what we would now call a “national security threat.”

There was no need for an American policy toward refugees prior to the 1920s, since there were so few restrictions on entering the United States. The immigration restriction legislation, however, changed that. It required that no more than two percent “of the number of foreign-born individuals of such nationality resident in continental United States as determined by the United States Census of 1890” be allowed into the United States in any year. By setting strict numerical quotas based on the country of origin, the law left no flexibility depending on the circumstances in that country, and thus no ability to adjust to a refugee problem.
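To make the formula concrete, here is a minimal sketch of the quota arithmetic in Python. The two percent rule is quoted above; the census figure below is my own assumption, supplied only for illustration, not a number from the law or from the post.

def quota_1924(foreign_born_1890, rate=0.02):
    # Annual quota: two percent of a nationality's foreign-born
    # population as counted in the 1890 census.
    return int(foreign_born_1890 * rate)

# Assumed (hypothetical) 1890 census count of German-born residents:
german_born_1890 = 2_800_000

print(quota_1924(german_born_1890))  # 56000, consistent with the German
                                     # quota of "over 50,000" noted below

Under this formula, a nationality barely represented in the 1890 census received a tiny quota no matter what later happened in its home country—precisely the inflexibility described above.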
Thus the Immigration Act of 1924 set the stage for two disgraceful incidents in America’s history of dealing with refugees. Despite the rising persecution of German Jews in the late 1930s, all German immigration to the United States was subject to the existing yearly quota (due to the formula noted above, Germany actually had by far the highest quota in the world, over 50,000). In early 1939, in the aftermath of Kristallnacht in November 1938, Sen. Robert Wagner of New York proposed to Congress a Refugee Act that would allow 20,000 German children into the United States, over and above the established yearly national quota.
Wagner’s intent was that those children would be German Jews, but fearing that anti-Semitism would doom the bill, he did not specify that in the legislation. Opponents argued that, whatever its merits might be, the bill would undermine the quota system. They also made an economic argument. One said in testimony to Congress: “These children, if admitted, will presumably grow up and as youths will become competitors with American citizens for American jobs.” Opponents killed the bill in Congress, and no refugee children came to the United States. There is no way to know how many children might have been able to enter the United States, but it seems likely that some who might have been saved later died in the Holocaust.
In another case that has become much better known in recent weeks, we know exactly how many could have been saved had they been admitted into the United States. In the midst of the Refugee Bill debate, the ocean liner St. Louis sailed from Hamburg for Cuba with over 900 German Jews fleeing the Nazi regime. Opposition arose in Cuba to letting them into that country, with anti-immigrant groups claiming that the passengers were Communists and thus should not be admitted. Only 22 of the Jewish passengers were allowed into Cuba. Of the rest, 743 were awaiting visas to enter the United States but had not received them. They cabled the State Department and the White House from the ship asking to be allowed into the United States. But at that time 83% of Americans opposed any relaxation of the immigration laws, and since the German quota for the year had already been filled, they were denied entry. The passengers returned to Europe. The British, Dutch, Belgians and French took in the refugees. But because Germany later occupied all of those countries save Britain, some 254 of them died in the Holocaust. The United States government, knowing full well that Germany was persecuting its Jews, refused to alter its immigration policy, and 254 lives that could have been saved were lost.
World War II, of course, created an unprecedentedly large refugee problem. In 1945, President Truman did what FDR never did: he issued an executive order allowing in 40,000 refugees above the quotas. In 1946, he proposed to Congress the Displaced Persons Act, which produced the same kind of response as Wagner’s Refugee Act did in 1939—opponents charged (despite the nearly full employment postwar economy) that they would take jobs from returning veterans. Some argued that the bill would allow Communists into the United States. Concerns that large numbers of Jews (who were often equated with Communism) would be admitted led supporters of the legislation to stress that most of those admitted would be Christians. This time opponents did not defeat the bill. It passed. They did, however, cut the number admitted into the US in half, from 400,000 to 200,000.
Throughout most of the cold war, U.S. policy toward refugees was largely driven by cold war foreign policy and made on a case-by-case basis. As a rule, those fleeing communism were welcomed. The United States admitted refugees of the Hungarian revolution against the Soviet-backed regime in 1956, for example. Under President Dwight Eisenhower, the United States conducted “Operation Safe Haven” for Hungarian refugees. Eisenhower said: "It is heartening to witness the speed with which free nations have opened their doors to these most recent refugees from tyranny. In this humanitarian effort our own nation must play its part." That pattern was repeated several times: those fleeing the Cuban revolution in 1959, as well as the boat people seeking to escape North Vietnam’s conquest of the south in 1975 (under the Indochinese Migration and Refugee Assistance Act), were welcomed into the United States, while those fleeing other tyrannies were often out of luck. In 1980, the Refugee Act finally put refugees outside the regular immigration system, allowing for 50,000 refugees per year.
In sum, the reactions we see today to the prospect of admitting refugees from Syria and elsewhere have a long history in this country. Americans have a history of both welcoming and refusing refugees. Today we face a choice: which of those legacies will we embrace? When I began working on this issue nearly three months ago, I had some hope that it would be the former. The events in Paris and San Bernardino—and more importantly, the generally fearful reaction of many Americans to those events—have left me fearing that Americans are more inclined to opt for the latter. What this overview of the history shows is that such fears have in the past been overblown, and Americans have often had great reason to regret their fear-driven, short-sighted overreactions. Nevertheless, that list of regrets looks likely to grow longer.

Tuesday, June 23, 2015

The Flag Needs to Go Down. So Does the Lie It Represents

Yes, it’s a symbol. That doesn’t mean it doesn’t matter. That is why it matters.

Within hours of Gov. Nikki Haley calling for the removal of the Confederate battle flag from the grounds of the state capitol in Columbia, Wal-Mart announced that it would stop selling articles with that symbol on them. The Republican Speaker of the Mississippi House said it was time to change the state flag that contains it. Amazon and NASCAR have turned against it. The acknowledgement that it deserves no place of honor may be contagious.

But we should not—cannot—be satisfied with the removal of the symbol. We also have a responsibility to combat the lie it represents.

While Gov. Haley’s decision to support removing the flag is undeniably progress, the way she and other elected officials couch their new-found sensitivity to the insult this flag has always been to black citizens is troubling.

In her statement, Haley said: “For many people in our state, the flag stands for traditions that are noble. Traditions of history, of heritage, and of ancestry. At the same time, for many others in South Carolina, the flag is a deeply offensive symbol of a brutally oppressive past." There is room for both views, she said: “We do not need to declare a winner and a loser."

That is where she is wrong. We do need to declare something: the truth wins and the lie loses. Leadership—true leadership—does not create false equivalencies such as this. Both views, she said, are reasonable. They are not. One is in line with historical reality, while the other is the product of historical self-delusion.

Symbols can be tricky. Meaning can vary from person to person. But we’re not talking about a piece of abstract art in this case. We are talking about a symbol of a specific historical entity. I cannot simply declare that, for me personally, the Confederate battle flag represents say, abolitionism. It was a flag under which men fought against the armies of the United States government, in defense of a government that had as its central tenet the preservation of slavery. That is not up for discussion or debate. (Ta-Nehisi Coates has an exhaustive collection of Confederate leaders saying so, here.)

In 1948, Strom Thurmond’s Dixiecrats waved it to show their opposition to President Truman’s civil rights plank in the Democratic platform. Throughout the civil rights movement, segregationists flew it to show their devotion to Jim Crow and their rejection of racial equality. Rabid segregationists waved it in the faces of civil rights protesters, and Gov. George Wallace of “segregation now, segregation forever” infamy proudly stood in front of it. People who defiantly shoved that flag in the face of people marching for racial equality still walk among us.

In each of those instances, it represented a willingness to fight to maintain white supremacy.

The reality that many people refuse to acknowledge those facts does not change them.

Those who still openly defend that flag are, fortunately, diminishing in number. But the near universal meaning people attributed to it in the past, we’re still asked to believe, is not the “real” meaning for its supporters now. Now we are told the murderer “hijacked” the Confederate battle flag. It’s not about slavery or segregation now, it’s about “southern pride.”

What does that term mean? One of the murderer’s friends recalled: “I never heard him say anything, but just he had that kind of Southern pride, I guess some would say…. He made a lot of racist jokes, but you don’t really take them seriously like that.” For this friend, making racist jokes was a sign of “southern pride.” Racism is only serious, it seems, when it leads to actual violence. When it’s jokes and racial epithets, it’s “southern pride.”

Even if we allow that today most white southerners would not define “southern pride” that way, when one associates “southern pride” with a flag that the overwhelming majority of black southerners find offensive, there is a damning, unstated admission: their “south” is, of course, a white south. It is not the south of slaves and their descendants. They were denied their humanity under slavery, they were denied their rights under segregation, and they are denied their southern identity by this definition of “southern pride.”

Haley’s remarks say that the “southern pride” view is worthy of respect. It is not. Only by denying the historical reality of how that flag has been used—not by the one or the few, but by the many—can one view it as representing anything “noble.” It is that kind of denialism that allowed the murderer to believe that the flag called for his hateful violence. It has promoted violence in the name of white supremacy throughout its history, but it has persisted in our culture under the guise of a harmless “southern pride.” The murderer did not hijack it, he did not “misappropriate” it. He made manifest--in the ugliest, most awful way--what it has always meant.

He tore the disguise off so utterly that even many of the willfully blind could not help but see.

That will be why the flag comes down.

No one who has dodged this issue in the past, or openly been on what is now clearly the wrong side of it, wants to have to admit having been wrong. But some flag supporters have. The former radio host and speech writer known as the “Southern Avenger” recently wrote: “I was wrong. That flag is always about race.” That’s the kind of honest reckoning with the past that we need.

Most politicians, however, present this act of removal not so much as a change of opinion but as a change of circumstances. They are beneficently going above and beyond due to the extreme circumstances created by this event. But this awful event did not really create new circumstances. It simply made undeniable what has always been true. It has shamed at least some people. They know what that flag means. But they still continue to indulge the fantasies of those who insist it is only about “southern pride,” and tell them that their point of view is a perfectly legitimate one.

There is a price to be paid for indulging a lie.

For 150 years, this nation has failed to recognize fully what Frederick Douglass rightly identified in 1878 as the central truth of the Civil War: “There was a right side and a wrong side in the late war which no sentiment ought to cause us to forget.”

The nation’s willingness to indulge the “Lost Cause” mythology of the defeated Confederates is one of the reasons that 150 years later this mass murderer had no problem finding a false version of history (adhered to by many people beyond the Council of Conservative Citizens and the Sons of Confederate Veterans) that supported his vile racism.

Taking down the Confederate battle flag is the right thing to do—but not just because it stands in the way of unity at a time of bereavement. It should come down because it represents a pernicious lie: that the south worth honoring is a white supremacist one. Taking it down while indulging the lie is still progress. But it nonetheless avoids the hard truths that need to be spoken.

The cause of the Confederacy was not “noble.” The cause of the segregationists was not “noble.” Neither deserves any honor or reverence.

There is a right side and a wrong side in the Confederate flag debate which no sentiment ought to cause us to forget.

Wednesday, February 4, 2015

"There Was a Right Side and a Wrong Side": Art and Historical Memory

[This piece was originally posted on History News Network]

The last several weeks have given rise to much commentary on how drama presents the recent historical past. The films “Selma” and “American Sniper” both provoked passionate, even divisive, disagreement. To some extent, that may be inevitable, since both treat subjects in the living memory of many people.

My recent experience in this area is rather different, concerning as it does long past events. Last weekend, I attended a local performance of “The Civil War: The Musical.” Everything about the production was top notch: the set, the costumes, the direction, the lighting, the singing. Yet I left the theater with a sense that the show itself was deeply flawed.

At the very beginning of the show, a voiceover sets the scene, discussing the firing on Fort Sumter, and ends by quoting Walt Whitman: “the real war will never get in the books.” This sets the theme for the whole show: the war was, first and foremost, the stories of the individuals engaged in it, the vast majority of whom never did—perhaps never could—record what it was truly like.

As far as it goes, there is much to be said for this approach, the kind of “history from the bottom up” story rather than “top-down” history of presidents and generals. There is also a danger in it, however, one that the show inadvertently showcases.

The audience is introduced to three (mostly) separate groups of characters: white northerners, white southerners, and black slaves (called the “Union Army,” the “Confederate Army” and “The Enslaved” in the script). There is virtually no interaction among the groups other than the battle scenes between the two groups of white men. While I’m sure the creators of the show (Frank Wildhorn, Gregory Boyd, and Jack Murphy) had the best of intentions, this division serves a certain interpretation of the war—one that all too often did find its way into the books.

To the best of my recollection (and perhaps I missed something), the subject of slavery is never discussed by any of the white characters—none condemns it, none defends it. The northerners talk of fighting for Union and freedom, of course, but not the issue of slavery itself. The southerners talk of defending their land, their way of life, but don’t talk about slavery as part of that way of life.

The entire nation, north and south, was complicit in slavery. Northern business interests invested in and profited from it. Northern politicians made common cause with southerners to defend it. Some southerners, like South Carolina’s Grimke sisters, openly fought against it. But you’ll hear none of that in the musical. We hear no northern soldier talk with pride of fighting to free the enslaved; we hear no northern soldier speak with resentment about being asked to risk his life to secure the rights of people he considered inferior. Both types existed. We hear no southern soldier denounce Lincoln for wanting to make blacks his equal; we hear no southerner angered at the prospect of dying for the wealthy planter’s right to own human beings. Both types existed.

The subject of slavery is, of course, addressed by the black characters, and one song that recreates a slave auction (“Peculiar Institution”) is emotionally wrenching. But in the larger context of the show, the institution of slavery primarily appears like an act of God, akin to a famine, a plague, a hurricane—anything but a choice made by human beings to enslave their fellow human beings. Even the slave auction scene ironically has that effect. We hear the crack of a whip and a woman recoils from the blow. Theatrically, it is a powerful moment. It has, however, the inadvertent effect of removing any human agency from the whipping. We see the human being on the receiving end, but no human being administers the punishment.

That’s the problem: in this show, no one is to blame. Everyone acts honorably, fights bravely, dies nobly. Historians of the late 19th century will recognize this interpretation of the war. It is the idea that allowed northern and southern whites to come together and put the war behind them after the end of Reconstruction. It is also the one that abandoned the freedpeople to the depredations of the “Redeemers” who took control after Union soldiers ended their occupation of the Confederacy.

As I noted earlier, the characters in this show are not famous—with one notable exception, Frederick Douglass. (He is inaccurately listed among “The Enslaved,” despite the fact that Douglass escaped slavery in 1838 and was a free man during the Civil War.) No doubt Douglass was used by the show’s creators to include his eloquent denunciations of slavery, and that is a welcome addition. But I could not help but think that the historical Douglass would be rolling over in his grave.

There was no greater critic of this show’s “no one is to blame” ethos than Frederick Douglass. In 1878, he stated the exact opposite, as clearly as is humanly possible: “There was a right side and a wrong side in the late war which no sentiment ought to cause us to forget.” Yet in his final decades, he saw sentiment prevailing over memory. “I am not of that school of thinkers that teaches us to let bygones be bygones, to let the dead past bury its dead. In my view, there are no bygones in the world, and the past is not dead and cannot die. The evil as well as the good that men do lives after them…. The duty of keeping in memory the great deeds of the past and of transmitting the same from generation to generation is implied in the mental and moral constitution of man.”

“The Civil War: The Musical” aims to capture the war that didn’t make it into the books by focusing on unknown individuals and their admirable personal qualities. But Frederick Douglass was right. We do ourselves no favor by remembering only the good and forgetting the evil that men do. The Civil War—the historical event—was the product of human choice and human agency. There was a right side and a wrong side. That’s the truth that all too often has failed to make it into the books—and this musical. 

Wednesday, December 17, 2014

Dick Cheney's David Frost Moment

[This post was originally published on the History News Network.]

All of us who teach face a daily, daunting task: how do we take our subject matter—which we know from years of study to be terrifically complicated and nuanced—and make it accessible and understandable to our students, all while avoiding the peril of oversimplification?

We all do our best, succeeding sometimes, failing others. We are eternally grateful when we find a key: that piece of evidence, that compelling argument, that helps us do our jobs. The most prominent example of that for my teaching is perhaps Richard Nixon’s infamous comment to David Frost about his actions during the Watergate scandal: "Well, when the president does it, that means that it is not illegal."

That one simple sentence helps me communicate to students the essence of the danger inherent in the many complicated events that we call “Watergate”: Nixon’s sincerely held belief that as president he was incapable of committing an illegal act; that whatever he deemed necessary to the security of the United States was, by definition, legal. It was a sentiment more consistent with the absolutism of Louis XIV than the constitutional principles that gave birth to the nation. What makes those words so powerful is that they come not from one of Nixon’s many implacable political foes, or from a historian interpreting his actions. They come from the man himself.

Since the release of the Senate Intelligence Committee’s torture report last week, I’ve been struggling with how to synthesize the multiplicity of reactions it provoked. Then on Sunday, I saw former Vice President Dick Cheney on “Meet the Press.”

Amidst all the dissembling, Cheney made one remark that struck me as his David Frost moment.

Moderator Chuck Todd confronted Cheney with evidence that 25% of the detainees were innocent, and that one was physically abused so badly that he died. Cheney replied: “I'm more concerned with bad guys who got out and released than I am with a few that, in fact, were innocent.” When pressed about whether it was acceptable to abuse innocent people even to the point of death, Cheney said: “I have no problem as long as we achieve our objective.”

Keep in mind, Cheney was not talking about the accidental death of innocents on the battlefield. Every war involves the accidental death of innocents, but just war standards command that every reasonable effort be made to avoid them. This was someone in the custody of the United States, who had done no wrong and was mistakenly taken into custody, whose physical mistreatment by representatives of the United States killed him while in custody. Faced with that travesty of justice, Dick Cheney could not even muster a perfunctory expression of regret.

Confronted with an unquestionable injustice, Cheney says: “I have no problem as long as we achieve our objective.” That is the essence of everything wrong with the Bush-Cheney “war on terror.” It admitted no principle whatsoever as superior to the objective of keeping the nation safe. Fundamental human rights—even of innocent people—can be violated with impunity, Cheney asserts. Even after being presented with evidence that an innocent man was killed, Cheney blithely said, “I'd do it again in a minute.” The end justifies the means.

That is the mindset of the authoritarian. Dictators the world over use that logic every day. Dick Cheney will never admit that the techniques he endorsed constitute torture—to do so would be to admit he is a war criminal. But he has now admitted, beyond any doubt, that he has the mentality of a torturer.

Wednesday, October 22, 2014

Can Obama Do in Iraq What Nixon and Ford Couldn't in Vietnam?

[Originally published on History News Network]

Practically every American intervention abroad since the 1960s has prompted comparisons to Vietnam. So it was hardly surprising when on October 8, in response to President Obama’s decision to expand the campaign against ISIS into Syria, Frederik Logevall and Gordon M. Goldstein authored an op-ed in the New York Times that asked “Will Syria Be Obama’s Vietnam?”

I’m not sure that’s the right question. The American concern over ISIS originated in Iraq, after all—an intervention that is now eleven years old. America’s air campaign against ISIS today reminds me less of the intervention that happened in Vietnam than the one that didn’t happen—in the spring of 1975.

This past June, when ISIS suddenly broke through America’s collective effort to forget about Iraq and seemed poised to take Baghdad, it was easy to wonder if we were about to witness a repeat of the fall of Saigon.

More than two years after the peace agreement that led to the withdrawal of American troops from Vietnam, a North Vietnamese offensive against South Vietnam met with little effective resistance, much like last June’s stories of Iraqi armed forces dropping their arms and failing to fight ISIS. Compare these passages from the New York Times coverage of the fall of Hue in March 1975 and Mosul in June 2014:
“By the thousands, the people are abandoning Hue…. The armed forces are also moving out, some by landing craft, some in military vehicles, some bundled into trucks with family members, furniture and food. No one seemed in the slightest doubt yesterday that Hue and the rest of the north were being left to the Communists.”
“Thousands of civilians fled south toward Baghdad…. The Iraqi Army apparently crumbled in the face of the militant assault, as soldiers dropped their weapons, shed their uniforms for civilian clothes and blended in with the fleeing masses…. ‘They took control of everything, and they are everywhere,’ said one soldier who fled the city.”
The political reaction this summer also eerily echoed the reaction to events of nearly 40 years ago.

In his Memoirs, Richard Nixon argued that he had won the Vietnam war and that American bombing of the North would have preserved the South Vietnamese government. It had survived for two years after the peace agreement. That meant Nixon’s Vietnamization had worked.

“When Congress reneged on our obligations under the agreements,” Nixon wrote, “the Communists predictably rushed in to fill the gap.” Nixon had privately assured South Vietnamese President Thieu that violations of the peace agreement by Hanoi would be met with renewed American bombing. But in June 1973, the Church-Case amendment forbade funding for any military operations in Vietnam. “The congressional bombing cutoff, coupled with the limitations placed on the President by the War Powers Resolution in November 1973, set off a string of events that led to the Communist takeover.” The war was “lost within a matter of months once Congress refused to fulfill our obligations,” Nixon said.

Henry Kissinger has also repeatedly argued that the peace agreement reached with Hanoi had secured the independence of South Vietnam, and that he and Nixon intended to use air power to thwart any North Vietnamese aggression against the South. But, he asserts, Watergate so weakened Nixon that they were unable to overcome the opposition of Congress. In a meeting with Singapore’s Lee Kuan Yew on August 4, 1973, Kissinger said: “We have suffered a tragedy because of Watergate … We were going to bomb North Vietnam for a week, then go to Russia, then meet with [North Vietnam’s lead negotiator] Le Duc Tho. Congress made it impossible.”

Lewis Sorley, in his 1999 work A Better War: The Unexamined Victories and Final Tragedy of America’s Last Years in Vietnam, argued that “[t]here came a time when the war was won.” Due to the pacification efforts of Gen. Creighton Abrams, he writes, victory in Vietnam “can probably best be dated in late 1970.” The countryside was pacified, and South Vietnamese forces were “capable of resisting aggression so long as America continued to provide logistical and financial support, and … renewed application of U.S. air and naval power should North Vietnam violate the terms of that agreement.”

The argument that continued American application of its air power against North Vietnam could have preserved South Vietnam has thus been a staple of Vietnam War revisionism.

In June, Sen. John McCain made an argument about Iraq similar to the one that Nixon, Kissinger, and Sorley made about Vietnam:

"We had it won," McCain said. "Gen. [David] Petraeus had the conflict won, thanks to the surge. And if we had left a residual force behind, that we could have, we would not be facing the crisis we are today. Those are fundamental facts ... The fact is, we had the conflict won, and we had a stable government.” Sen. Lindsey Graham of South Carolina added: “There is no scenario where we can stop the bleeding in Iraq without American air power."

There are no do-overs in history, and no one can say for certain whether the renewed application of American air power after the 1973 peace agreement might have prevented the fall of Saigon—or if it did, for how long. But we are currently seeing why Congress sought to limit the executive branch’s options back in 1973.

The fear then was that, despite the peace agreement, Nixon and Kissinger would continue to fight a war that the country overwhelmingly wanted to be over. Kissinger’s repeated statements indicate that they in fact intended to do just that, not just in Vietnam but possibly in Cambodia, too. The Church-Case Amendment was how Congress expressed the national consensus against reviving the war.

Today, there seems little will in Congress to restrain the president’s war-making powers. If anything, the loudest voices have been those arguing for even greater military action. In response to such pressure, the president has already expanded the air war to Syria.

Just last week, McCain argued that “pinprick” airstrikes were proving ineffective, and called for further expansions of the war: “They’re winning, and we’re not,” McCain told CNN. “The Iraqis are not winning. The Peshmerga, the Kurds are not winning.” Thus, he argued, there was a need for “more boots on the ground … in the form of forward air controllers, special forces and other people like that…. You have to arm the Peshmerga … Buffer zone in Syria, no-fly zone, take on Bashar al Assad the same as we have ISIS.”

McCain’s vision of a renewed, ever-expanding war is precisely what Congress in 1973 meant to prevent Nixon and Kissinger from doing. After nearly a decade of war, Americans had decided that the fall of South Vietnam, Cambodia, and Laos would not be a mortal threat to American security.

Today, what stands between the United States and the full-scale revival of a war Americans thought was over is not Congress, but the president himself. Obama has repeatedly stated that he will not re-introduce American combat troops to Iraq, and he is trying to maintain a sense of balance about the nature of the threat: “While we have not yet detected specific plotting against our homeland, these terrorists have threatened America and our allies. And the United States will meet this threat with strength and resolve.”

McCain, however, is doing the opposite, hyping the threat to the U.S. Back in June he said: “We are now facing an existential threat to the security of the United States of America.” Last week he said: “it is a threat to the United States of America if they are able to establish this caliphate.”

A September CNN public opinion poll suggests that Americans agree with McCain about the threat, while siding with Obama on the limits of the U.S. response. Ninety percent say ISIS represents a threat to the U.S., with 45 percent calling the threat “serious,” 22 percent saying it is “fairly serious” and 23 percent saying it is “somewhat serious.” (Two years after 9/11, in 2003, 49 percent considered Al Qaeda a “serious” threat to the U.S.) Seventy-one percent believe ISIS terrorists are already in the U.S. But at the same time, by a 61-38 margin, Americans oppose using American ground forces to defeat ISIS.

ISIS has succeeded in making Americans think that Iraq matters again, and that U.S. interests require its defeat, but it has not yet convinced them that it is worth Americans doing the fighting and dying. That's Obama's dilemma. If air power is not enough, does he take the chance that Iraq (or Syria) falls to ISIS, or does he break his promise?

In the spring of 1975, Congressional and public opinion meant that President Ford had little choice but to watch as the North Vietnamese Army rolled into Saigon. Nearly 40 years later, President Obama faces a far more difficult task: prevent the collapse of the Iraqi government (and, increasingly, the Syrian opposition) without fully reviving a war he spent years trying to end—all in the face of an opposition that is intent on proving that the Iraq war it supported was won until the president lost it.

Whether Obama will be able to keep his promise not to send American ground forces back to Iraq is very much an open question. Having taken the first step to save Iraq by applying American air power—what Nixon, Kissinger and Ford could not do in Vietnam—he may find it increasingly hard to resist subsequent steps if air power proves not to be enough.

Tuesday, September 9, 2014

Lies, Damn Lies, and Statistics (Higher Education "Reform" Edition)

Following this summer's seminar on the liberal arts at Transylvania University, I resolved to more consciously talk about the liberal arts with my new crop of first-year students in my Humanities class this semester. Last week, we spent a full class period talking about their reasons for coming to Wofford, and Wofford's commitment to a liberal arts education. We'll spend two more classes this week discussing it. They should understand what they're getting into, I think.

The beginning of the academic year always prompts some thinking about the purpose of education, even among those not engaged in it. Frank Bruni has an interesting piece in the New York Times arguing that higher education has an obligation to challenge students: "college needs to be an expansive adventure, yanking students toward unfamiliar horizons and untested identities rather than indulging and flattering who and where they already are." I couldn't agree more.

The Times also carried another piece that conveys the more dominant view in American culture: that college, first and foremost, is about getting a job.

Ben Carpenter, vice chairman of the CRT Capital Group, argues that what is missing from college today is "career education." For Carpenter, it is not enough for colleges to provide majors geared toward professional pursuits, and to have Career Services offices. The college must also offer courses in "career training":
So what can be done to make certain these young adults are being prepared for life post-graduation? The answer is simple: Colleges need to create, and require for graduation, a course in career training that would begin freshman year and end senior year.
(Note to self: remind students to always beware whatever statement follows the phrase "The answer is simple.")

The first thing worth noticing here is Carpenter's choice of words. He is clear about what his concern is: "how to get, and succeed at, a job." But the title of the article isn't "Is Your Student Prepared for a Job?"--it is "Is Your Student Prepared for Life?" Throughout the piece, Carpenter uses the words "job," "career," and "life" interchangeably.

It does not take a liberal arts education to know that those words do not mean the same things. Too often in discussions of education, we elide the differences, so when talking to my students last week, I made the difference explicit. A liberal arts education is meant to prepare you, I said, not just to make a living, but to make a life.

I do not know whether Carpenter intentionally conflates "job" and "life" to confuse the reader, or if he honestly does not see a meaningful distinction between the two. Either way, doing so has the effect of perpetuating the idea that your job is your life, and so college is only about getting a job.

The second issue that got my attention was that Carpenter employs what seems to me the knee-jerk "reform" response to every perceived challenge in higher education: make it part of the curriculum! I have no problem with the idea that colleges should help students find post-graduate employment. Here at Wofford, The Space is devoted to that project, and does a lot of good for our students. But it is not part of the curriculum.

That's not what Carpenter is calling for; in fact, he denigrates Career Service offices as suffering from a "major disconnect" with students. He wants "a course," one that lasts for four years and is required of all students. Since Carpenter does not get more specific, it is hard to know whether he means a course every semester for four years, or one course a year, or one course that lasts four years. But he clearly is talking about making it part of the curriculum.

It is self-evident that every new course requirement reduces the electives available for students to take to investigate their own passions or interests. The more expansive Carpenter's plan, the fewer academic courses students in it will take. It is hard not to wonder if that isn't part of the idea. If college exists merely to train workers, what do they need those electives for, anyway?

Finally, there is the matter of the precise problem that is driving his proposal. At the start of the article, Carpenter states:
According to a recent poll conducted by AfterCollege, an online entry-level job site, 83 percent of college seniors graduated without a job this spring.
In contrast, toward the end, he cites an example that suggests the efficacy of what he proposes:
One year after graduation, 96 percent of all Connecticut College alumni report that they are employed or in graduate school.
One of the things my liberal arts education taught me is to look closely and carefully when someone cites statistics. On the surface, the difference seems stark: 83 percent with no job, 96 percent employed! See, the answer is simple! Certainly that's what Carpenter wants us to think. But a moment's consideration shows that he's doing the old apples and oranges comparison.

The AfterCollege survey purports to measure only how many students reported having a job lined up before graduation. The accuracy of that number may be questionable, since it was an online survey, not, as Carpenter says, a scientific "poll." Second, the fine print on the survey reveals that the respondents were not just students about to graduate--a majority had already graduated, 23.38 percent were college seniors, and 12.25 percent were juniors. (Safe to say that few if any juniors already have a job lined up for after graduation.)

The 83 percent number comes just from students still in school, including those juniors. For recent grads, the number is 76.3 percent. No doubt that's a big number, but it is not 83. In addition, since the survey was conducted between February 27 and April 15, 2014, some seniors who answered "no" in late February or March may well have had jobs by the time they graduated in May 2014.

In short, it is not true that 83 percent of last year's graduates had no job at graduation, even according to this survey.

Now let's look at the Connecticut College numbers. By contrast, they are not a mix of recent grads and current juniors and seniors. They measure an entire graduating class. In no way can that group be reasonably compared to the AfterCollege survey respondents. In addition, it measures the outcome for those students one year after graduation.

A true comparison would require surveying only graduating seniors right after they graduated and then comparing the number with jobs to the number with jobs one year later. A year makes a huge difference in the job search, as does being out of school--I recall not feeling much urgency about getting a job until after I graduated. In my experience, most college seniors are preoccupied with either the successful completion of their degrees or enjoying the final months with friends they've known for three and a half years, or both. The job search gets serious after graduation.

In addition, the Connecticut number lumps together the employed and those who are going to graduate school--those planning to attend graduate school of course do not have a job lined up before graduation. For all we know, a significant percentage of those reporting "no job" in the AfterCollege survey may well have had plans to go to graduate school.
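To see the apples-and-oranges problem in miniature, consider a toy cohort in Python. Every figure below is hypothetical, chosen only to show how the two measures diverge by construction; none comes from either survey.

cohort = 1000                    # hypothetical graduating class

# AfterCollege-style measure: surveyed BEFORE graduation,
# asking whether a job is already lined up.
with_job_at_survey = 200         # hypothetical
pct_no_job_before = 100 * (1 - with_job_at_survey / cohort)

# Connecticut-style measure: the same class ONE YEAR LATER, counting
# employment OR graduate school as a positive outcome.
employed_after_year = 820        # hypothetical
in_grad_school = 140             # hypothetical; counted as "placed"
pct_placed_after_year = 100 * (employed_after_year + in_grad_school) / cohort

print(round(pct_no_job_before))      # 80 -- the "no job" style headline
print(round(pct_placed_after_year))  # 96 -- the "placed" style headline

The same cohort yields both headline numbers; the gap comes from timing, definitions, and denominators, not from anything a career-training course did or did not accomplish.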

The Connecticut College program may well be worthwhile and do great good. But Carpenter's comparison is misleading, whether or not he realizes it. I have to think that if he had direct apples-to-apples comparisons that served his argument, he would have used them instead. I suspect they would not have been nearly as stark as the ones he uses.

As I stated in my last post, the idea that colleges are miserably failing their students by not preparing them for the working world is simply not true. It is true that few graduates move seamlessly from college straight into their dream jobs. But the idea that somehow there is a problem so significant that students must replace some of their academic courses with "career training" courses--and that such courses will solve the problem in what is still an extremely competitive and tight job market--is just silly.

But that's what passes for intelligent commentary on higher education these days.

Tuesday, July 29, 2014

On the State of the Liberal Arts

For teachers, one of the most enjoyable things to do is spend time being students again.

So it was that I spent the past weekend at Transylvania University’s seminar on Twenty-First Century Liberal Education, along with 18 other academics from a variety of liberal arts institutions.

We all read hundreds of pages of material in preparation. In the span of 65 hours at the seminar, we spent two hours listening to formal lectures (and another hour discussing them), 10 hours in formal discussion sessions, and countless more hours informally continuing those exchanges.

Yes, this is what teachers do in the summer for fun. And it was fun—as well as intellectually illuminating and invigorating.

It was also sobering, coming as it did at a time when higher education faces plenty of public scrutiny and criticism, and when the liberal arts and liberal arts colleges in particular face charges of irrelevance.

The value of this kind of intensive consideration of a topic is that it inevitably focuses the mind. Many of the issues we discussed have been bouncing around my brain for a while (sometimes showing up in this blog), but I’ve never considered them as intensely as I did at the seminar.

Since I’m forever preaching to my students that the best way to figure out what they think about a reading or discussion is to write about it, I’ll attempt to do that myself. (All of the pieces I quote below are from the wonderful reader that the Transylvania seminar leaders put together.)

For the historian, the easiest and most obvious conclusion to take from our readings is that there is nothing new about the liberal arts—or higher education in general—being under siege. It rather seems like a permanent state of affairs. That’s no excuse for complacency about its current challenges, to be sure, but it does help leaven one’s reaction to all of the apocalyptic warnings of the demise of liberal arts. This is not new: the liberal arts college has been through this before and survived. As Alan O. Pfnister put it in 1984, “the free-standing liberal arts college in America has been a study in persistence amid change, continuity amid adaptation.”

“Continuity and change” is the essence of history, and the story of the liberal arts has seen plenty of both. The perennial debate seems to revolve mostly around the question of value and utility: What precisely is the value of the liberal arts? How do we determine that value, and how do we present it to prospective students and their parents?

For clarity’s sake, the sides can be simplified: 1) the liberal arts have value that cannot be quantified and assessed in any meaningful way, but they prepare students to lead better, more meaningful lives; and 2) the liberal arts must demonstrate their practical value in concrete, accessible ways that give others outside the academy reason to believe they are worth the time and money expended in studying them.

Since these are simplifications, few people are likely to identify with either without some kind of reservation, but I’d argue that at some point everyone concerned with the topic will end up choosing one as having primacy over the other.

I choose the first. I am not unaware of the pressures being brought to bear to make college education ever more “practical” (read “directly applicable to post-graduation employment”) to justify its high price tag. I simply believe first causes matter and that something essential is lost when we, as another participant in the seminar put it, allow external rather than internal causes to determine what and how we teach.

The second point of view, however, seems to dominate the field these days. Writing in 2007, David C. Paris, professor of government at Hamilton College (and one-time participant in the Transylvania seminar) said: “the liberal arts and the academy in general need to make peace with, or at least acknowledge, the importance of the market.”

I’ll meet Paris half-way: I acknowledge that the market matters. Despite his rather disdainful portrayal of the traditional liberal arts as appearing “esoteric and apart from real concerns” or “ornamental,” and of its defenders as not concerned with the “real world,” I am not oblivious to reality.

But no, I will not “make peace” with the idea that the market should determine what and how educators in the liberal arts teach. Paris argues that “the liberal arts are threatened,” at least in part, by “too narrow a self-concept” among their practitioners. He writes that “promoting a good life recognizes that there are many ways of living such a life.” The latter is true. But it is not the liberal arts that are “too narrow.” It is the market that defines the good life in the narrowest way possible, i.e., by a single standard: the dollar sign.

Our students do not need the liberal arts to tell them that money matters. The entire culture tells them that relentlessly. They cannot escape it. It is our job as educators to open them to some of the other possible answers to that basic question: “What makes a good life?”

The liberal arts have a long history of addressing that question and advancing our understanding of the good. Liberal education has been a vehicle for addressing questions of inequality and oppression, empowering students to challenge the institutions that buttress those conditions, primarily through encouraging independent thinking. It has been a truly liberating force, and it has not achieved that by asking what the market wants from it.

What message does it send about the answer to that fundamental question of the good when the Association of American Colleges and Universities (AAC&U) resorts to focus groups of students and employers to tell educators what liberal education should be? Or when the AAC&U endorses and privileges certain educational trends (“active” or “high-impact”) as superior to others and justifies its prescriptions by noting that “employers strongly endorsed” them and that they will receive “very strong support from the employer community”?

Whether they realize it or not, they are saying in effect: Let the market decide. They are abdicating their responsibility as educators to shape curriculum. They are buying into not just the language but the values of the market: if it is demanded, it must be supplied.

David L. Kirp writes in Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education: “This is more than a matter of semantics and symbols.” When we use business vocabulary, “we enforce business-like ways of thinking.” (Thanks to Transylvania’s Jeffrey B. Freyman for this quotation from his paper, “The Neoliberal Turn in Liberal Education.”)

Though the proponents of this point of view often come from the progressive side of the political spectrum, they are unwittingly endorsing a decidedly illiberal view of education. As Christopher Flannery and Rae Wineland Newstad point out in “The Classical Liberal Arts Tradition,” the phrase “liberal arts” literally means the “arts of freedom,” as opposed to those practiced by slaves: “Slaves are subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves.” So-called “practical” training was for slaves, and the liberal arts would ruin slaves for their role in society as servants to their superiors.

Liberal education later evolved—particularly in the United States—into not just the privilege of the already free, but a vehicle for freeing the young from servile status. As Frederick Douglass makes clear in his autobiography, the liberating quality of education was the reason American slaves were denied it: “Knowledge unfits a child to be a slave.” Liberal education equips students to take their places as equals in a free society, as makers of their own lives.

But note how the AAC&U approached its call for reform in 2008. In advocating its “Engaged Learning Reforms” (which closely mirror John Dewey’s practical learning agenda of the 1930s--it is nothing new), AAC&U president Carol Geary Schneider justified the plan primarily with a table showing the “Percentage of Employers Who Want Colleges to ‘Place More Emphasis’ on Liberal Education Outcomes.” Leading the pack was “science and technology,” with the support of 82%. Next came “teamwork skills in diverse groups,” with 76%.

The clinching argument for Schneider is this: “these goals for college learning are strongly endorsed by the constituency that today’s students particularly want to please—their future employers.”

That sentence, to my mind, lays bare the essential problem with the AAC&U approach: rather than strongly reaffirming the goal of educating students to think for themselves—the traditional goal of liberal education—the AAC&U implicitly admits that it has substituted for that goal the aim of pleasing students’ future employers. At the end of the day, how far is that from students being “subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves”?

This vision of the liberal arts does not free students; it puts the liberal arts at the service of society’s economic masters. It is natural that economic fear in uncertain times leads college students to want to please future employers. That does not mean that educators should seek to assuage that fear by shirking their responsibility to provide their students with far more than that, or should bend the curriculum to meet the desires of employers.

Schneider’s statement is not an isolated case, either. AAC&U’s LEAP program (Liberal Education and America’s Promise) published a piece in 2005 titled “Liberal Education for the 21st Century: Business Expectations” by Robert T. Jones, president of Education and Workforce Policy. Jones is not shy about how he sees the role of higher education: it “must respond to these trends by keeping the curriculum aligned with the constantly changing content and application of technical specialties in the workplace.”

Note, education “must” serve the needs of the workplace. That which business does and wants, higher education must do—because, at the end of the day, education serves business. Education must submit to business’ “assessment” of how well it produces the “outcomes” business wants; it must get “continual input from both employers and graduates” and change its ways accordingly.

Jones states that employers “are less concerned with transcripts than the demonstration of achievement and competency across a variety of general and specialized skills.” Knowledge, wisdom, perspective—none of these traditional liberal arts goals fits this account of what employers want. “Competency” in “general and specialized skills” is the aim. Today, “competencies” has become a common buzzword in education discussions, even opening the door to granting academic credit for work experience and threatening to make the classroom experience virtually unnecessary.

The new liberal education, Jones says, “now enhanced with practical learning [how’s that for product branding?] is the essential foundation for success in every growing occupation.”

Jones is smart enough to compliment liberal education, even as he asserts that it is, at least in its current form, wholly inadequate and must be altered to serve the workplace better. But his ultimate purpose could not be clearer: education must “make peace” with the market.

Yes, there are substantial economic pressures on students today. Do we as educators, however, serve them best by surrendering our purposes to what prospective employers tell us they want? I say no. The question we need to ask is this: are the traits that employers say they want, and the means we are urged to adopt to meet them, wholly compatible with liberal education?

Take one example: Schneider tells us that colleges should change their curricula to include more “experiential learning” such as internships and “team-based assignments”—the latter because 76% of employers want more emphasis in college on “teamwork skills.”

Do employers and faculty mean the same things when they advocate “teamwork skills” as an educational goal? If employers next tell us we're not producing the correct "outcome" when we teach teamwork, will we be called upon to change practices once again? Is it not possible that when some employers say they want employees with “teamwork skills,” they mean people who will not rock the boat and bring up the less essential “ethical values” that the team might be violating? I’d suggest that the recent record of the banking and financial industries shows that we may be teaching too much teamwork and not enough ethics.

It may not be coincidental that the two lowest priorities for employers on Schneider’s survey were “ethics and values” at 56% and “cultural values/traditions” at 53%. Would those who use such survey results to justify their preferred educational reforms also accept that the curriculum should not emphasize ethics and values, because employers don’t seem to care so much about them? Shouldn’t the low priority the employers placed on ethics and values suggest to us that perhaps their goals are not the same as liberal education’s, and make us at least question whether we should give priority to their preferences?

A liberal arts education should empower students with a sense of perspective, but that is precisely what is sorely lacking in this debate. The AAC&U approach smacks of fear and desperation, but is the reality really so dire that we need to look to surveys of employers to tell us what to do? Yes, the price of higher education is high (though not as high as the sticker price suggests, since most students do not pay that price), and students and their parents have a right to expect that a high-priced college education will prepare its graduates for life—including the working life.

But today’s sense of panic comes less from those realities than from a culture that reflexively and unthinkingly ridicules the liberal arts as impractical, simply because they do not immediately and automatically funnel graduates into high-paying jobs. Seemingly everyone from Click and Clack on “Car Talk” to President Obama buys into the idea that the art history major won’t get you a good job. We laugh and nod knowingly when people joke that all that liberal arts majors really need to know is how to ask “Do you want fries with that?”

But it is simply not true, as an AAC&U report shows. It may seem true when graduation comes and that dream job (paying, say, at least as much as last year’s tuition cost) does not materialize. It certainly seemed true to me when I was in that boat. But I see much better now than I did then. Thirty years down the road, the full value to me of my liberal arts education continues to emerge.

A liberal education will not pay its dividends—economic or otherwise—in one or two or five years. When we expect it to do so, we are unthinkingly adopting the short-run values of today’s market mentality, with its concern for the next quarter’s profit rather than the long-term viability of the company (see, again, the banking and financial industries). When we then change the way we teach in deference to such illusory expectations, we begin to sacrifice what we have always done best in the service of a mirage.

It is hard for liberal arts colleges to preach patience and perspective; perhaps it has rarely been harder to do so than it is now. But it is true: a liberal arts education has long-term value, value that cannot be reduced to income earned two or four years out, as the President’s “College Scorecard” seems intent on doing.

The fact of the matter is that ten or twenty or thirty years down the road, liberal arts majors are doing fine. True, they may not make as much as their counterparts in the STEM fields. Some may need a graduate degree to further enhance their economic well-being. But the traditional liberal arts curriculum does NOT condemn liberal arts graduates to a life of poverty, and we do not serve our students well when we buy into the lie that it does.

When we accept that false narrative as true, when we contort ourselves and embrace any curricular reform that promises to make us more “practical” and “useful,” when we adopt educational practices for their branding or marketing potential rather than their educational value, we betray our fundamental mission: the education of our students for freedom, not for servitude.