Tuesday, July 29, 2014

On the State of the Liberal Arts

For teachers, one of the most enjoyable things to do is spend time being students again.

So it was that I spent the past weekend at Transylvania University’s seminar on Twenty-First Century Liberal Education, along with 18 other academics from a variety of liberal arts institutions.

We all read hundreds of pages of material in preparation. In the span of 65 hours at the seminar, we spent two hours listening to formal lectures (and another hour discussing them), ten hours in formal discussion sessions, and countless more hours informally continuing those exchanges.

Yes, this is what teachers do in the summer for fun. And it was fun—as well as intellectually illuminating and invigorating.

It was also sobering, coming as it did at a time when higher education faces plenty of public scrutiny and criticism, and when the liberal arts and liberal arts colleges in particular face charges of irrelevance.

The value of this kind of intensive consideration of a topic is that it inevitably focuses the mind. Many of the issues we discussed have been bouncing around my brain for a while (sometimes showing up in this blog), but I’ve never considered them as intensely as I did at the seminar.

Since I’m forever preaching to my students that the best way to figure out what they think about a reading or discussion is to write about it, I’ll attempt to do that myself. (All of the pieces I quote below are from the wonderful reader that the Transylvania seminar leaders put together.)

For the historian, the easiest and most obvious conclusion to take from our readings is that there is nothing new about the liberal arts—or higher education in general—being under siege. It rather seems like a permanent state of affairs. That’s no excuse for complacency about its current challenges, to be sure, but it does help leaven one’s reaction to all of the apocalyptic warnings of the demise of liberal arts. This is not new: the liberal arts college has been through this before and survived. As Alan O. Pfnister put it in 1984, “the free-standing liberal arts college in America has been a study in persistence amid change, continuity amid adaptation.”

“Continuity and change” is the essence of history, and the story of the liberal arts has seen plenty of both. The perennial debate seems to revolve mostly around the question of value and utility: What precisely is the value of the liberal arts? How do we determine that value, and how do we present it to prospective students and their parents?

For clarity’s sake, the sides can be simplified: 1) the liberal arts have value that cannot be quantified and assessed in any meaningful way, but they prepare students to lead better, more meaningful lives; and 2) the liberal arts must demonstrate their practical value in concrete, accessible ways that give others outside the academy reason to believe they are worth the time and money expended in studying them.

Since these are simplifications, few people are likely to identify with either without some kind of reservation, but I’d argue that at some point everyone concerned with the topic will end up choosing one as having primacy over the other.

I choose the first. I am not unaware of the pressures being brought to bear to make college education ever more “practical” (read “directly applicable to post-graduation employment”) to justify its high price tag. I simply believe that first causes matter and that something essential is lost when we, as another participant in the seminar put it, allow external rather than internal causes to determine what and how we teach.

The second point of view, however, seems to dominate the field these days. Writing in 2007, David C. Paris, professor of government at Hamilton College (and one-time participant in the Transylvania seminar) said: “the liberal arts and the academy in general need to make peace with, or at least acknowledge, the importance of the market.”

I’ll meet Paris halfway: I acknowledge that the market matters. Despite his rather disdainful portrayal of the traditional liberal arts as appearing “esoteric and apart from real concerns” or “ornamental,” and of its defenders as not concerned with the “real world,” I am not oblivious to reality.

But no, I will not “make peace” with the idea that the market should determine what and how educators in the liberal arts teach. Paris argues that “the liberal arts are threatened,” at least in part, by “too narrow a self-concept” among its practitioners. He writes that “promoting a good life recognizes that there are many ways of living such a life.” The latter is true. But it is not the liberal arts that are “too narrow.” It is the market that defines the good life in the most narrow way possible, i.e., by a single standard: the dollar sign.

Our students do not need the liberal arts to tell them that money matters. The entire culture tells them that relentlessly. They cannot escape it. It is our job as educators to open them to some of the other possible answers to that basic question: “What makes a good life?”

The liberal arts have a long history of addressing that question and advancing our understanding of the good. Liberal education has been a vehicle for addressing questions of inequality and oppression, empowering students to challenge the institutions that buttress those conditions, primarily through encouraging independent thinking. It has been a truly liberating force, and it has not achieved that by asking what the market wants from it.

What message does it send about the answer to that fundamental question of the good when the Association of American Colleges and Universities (AAC&U) resorts to focus groups of students and employers to tell educators what liberal education should be? Or when the AAC&U endorses and privileges certain educational trends (“active” or “high-impact” learning) as superior to others and justifies its prescriptions by noting that “employers strongly endorsed” them and that they will receive “very strong support from the employer community”?

Whether they realize it or not, they are saying in effect: Let the market decide. They are abdicating their responsibility as educators to shape curriculum. They are buying into not just the language but the values of the market: if it is demanded, it must be supplied.

David L. Kirp writes in Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education: “This is more than a matter of semantics and symbols.” When we use “business vocabulary we enforce business-like ways of thinking.” (Thanks to Transylvania’s Jeffrey B. Freyman for this quotation from his paper, “The Neoliberal Turn in Liberal Education.”)

Though the proponents of this point of view often come from the progressive side of the political spectrum, they are unwittingly endorsing a decidedly illiberal view of education. As Christopher Flannery and Rae Wineland Newstad point out in “The Classical Liberal Arts Tradition,” the phrase “liberal arts” literally means the “arts of freedom” as opposed to the arts practiced by slaves. “Slaves are subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves.” So-called “practical” training was for slaves, and the liberal arts would ruin slaves for their role in society as servants to their superiors.

Liberal education later evolved—particularly in the United States—into not just the privilege of the already free, but a vehicle for freeing the young from servile status. As Frederick Douglass makes clear in his autobiography, the liberating quality of education was the reason American slaves were denied it: “Knowledge unfits a child to be a slave.” Liberal education equips students to take their places as equals in a free society, as makers of their own lives.

But note how the AAC&U approached its call for reform in 2008. In advocating its “Engaged Learning Reforms” (which closely mirror John Dewey’s practical learning agenda of the 1930s--this is nothing new), AAC&U president Carol Geary Schneider justified the plan primarily with a table showing the “Percentage of Employers Who Want Colleges to ‘Place More Emphasis’ on Liberal Education Outcomes.” Leading the pack was “science and technology,” with the support of 82%. Next came “teamwork skills in diverse groups,” with 76%.

The clinching argument for Schneider is this: “these goals for college learning are strongly endorsed by the constituency that today’s students particularly want to please—their future employers.”

That sentence, to my mind, lays bare the essential problem with the AAC&U approach: rather than strongly reaffirming the goal of educating students to think for themselves—the traditional goal of liberal education—the AAC&U implicitly admits that it has substituted for it the goal of pleasing students’ future employers. At the end of the day, how far is that from students being “subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves”?

This vision of the liberal arts does not free students; it puts the liberal arts at the service of society’s economic masters. It is natural that economic fear in uncertain times leads college students to want to please future employers. That does not mean that educators should seek to assuage that fear by shirking their responsibility to provide their students with far more than that, or should bend the curriculum to meet the desires of employers.

Schneider’s statement is not an isolated case, either. AAC&U’s LEAP program (Liberal Education and America’s Promise) published a piece in 2005 titled “Liberal Education for the 21st Century: Business Expectations” by Robert T. Jones, president of Education and Workforce Policy. Jones is not shy about how he sees the role of higher education: it “must respond to these trends by keeping the curriculum aligned with the constantly changing content and application of technical specialties in the workplace.”

Note, education “must” serve the needs of the workplace. That which business does and wants, higher education must do—because, at the end of the day, education serves business. Education must submit to business’ “assessment” of how well it produces the “outcomes” business wants; it must get “continual input from both employers and graduates” and change its ways accordingly.

Jones states that employers “are less concerned with transcripts than the demonstration of achievement and competency across a variety of general and specialized skills.” Knowledge, wisdom, perspective—none of these traditional liberal arts goals fit this account of what employers want. “Competency” in “general and specialized skills” is the aim. Today, “competencies” has become a common buzzword in education discussions, even opening the door for granting academic credit for work experience, and threatening to make the classroom experience virtually unnecessary.

The new liberal education, Jones says, “now enhanced with practical learning [how’s that for product branding?] is the essential foundation for success in every growing occupation.”

Jones is smart enough to compliment liberal education, even as he asserts that it is, at least in its current form, wholly inadequate and must be altered to serve the workplace better. But his ultimate purpose could not be clearer: education must “make peace” with the market.

Yes, there are substantial economic pressures on students today. Do we as educators, however, serve them best by surrendering our purposes to what prospective employers tell us they want? I say no. The question we need to ask is this: are the traits that employers say they want, and the means we are urged to adopt to meet them, wholly compatible with liberal education?

Take one example: Schneider tells us that colleges should change curriculum to include more “experiential learning” such as internships and “team-based assignments”—the latter because 76% of employers want more emphasis in college on “teamwork skills.”

Do employers and faculty mean the same things when they advocate “teamwork skills” as an educational goal? If employers next tell us we're not producing the correct "outcome" when we teach teamwork, will we be called upon to change practices once again? Is it not possible that when some employers say they want employees with “teamwork skills,” they mean people who will not rock the boat and bring up the less essential “ethical values” that the team might be violating? I’d suggest that the recent record of the banking and financial industries shows that we may be teaching too much teamwork and not enough ethics.

It may not be coincidental that the two lowest priorities for employers on Schneider’s survey were “ethics and values” at 56% and “cultural values/traditions” at 53%. Would those who use such survey results to justify their preferred educational reforms also accept that the curriculum should not emphasize ethics and values, because employers don’t seem to care so much about them? Shouldn’t the low priority the employers placed on ethics and values suggest to us that perhaps their goals are not the same as liberal education’s, and make us at least question whether we should give priority to their preferences?

A liberal arts education should empower students with a sense of perspective, but that is precisely what is sorely lacking in this debate. The AAC&U approach smacks of fear and desperation, but is the reality really so dire that we need to look to surveys of employers to tell us what to do? Yes, the price of higher education is high (though not as high as the sticker price suggests, since most students do not pay that price), and students and their parents have a right to expect that a high-priced college education will prepare its graduates for life—including the working life.

But today’s sense of panic comes less from those realities than from a culture that reflexively and unthinkingly ridicules the liberal arts as impractical, simply because they do not immediately and automatically funnel graduates into high-paying jobs. Seemingly everyone from Click and Clack on “Car Talk” to President Obama buys into the idea that the art history major won’t get you a good job. We laugh and nod knowingly when people joke that all that liberal arts majors really need to know is how to ask “Do you want fries with that?”

But it is simply not true, as an AAC&U report shows. It may seem true when graduation comes and that dream job (making, say, at least as much money as last year’s tuition cost) does not materialize. It certainly seemed true to me when I was in that boat. But I see much better now than I did then. Thirty years down the road, the full value to me of my liberal arts education continues to emerge.

A liberal education will not pay its dividends—economic or otherwise—in one or two or five years. When we expect it to do so, we are unthinkingly adopting the short-run values of today’s market mentality, with its concern for the next quarter’s profit rather than the long-term viability of the company (see, again, the banking and financial industries). When we then change the way we teach in deference to such illusory expectations, we begin to sacrifice what we have always done best in the service of a mirage.

It is hard for liberal arts colleges to preach patience and perspective; perhaps it has rarely been harder to do so than it is now. But it is true: a liberal arts education has long-term value, value that cannot be reduced to income earned two or four years out, as the President’s “College Scorecard” seems intent on doing.

The fact of the matter is that ten or twenty or thirty years down the road, liberal arts majors are doing fine. True, they may not make as much as their cohorts in the STEM fields. Some may need a graduate degree to enhance their economic well-being further. But the traditional liberal arts curriculum does NOT condemn liberal arts graduates to a life of poverty, and we do not serve our students well when we buy into the lie that it does.

When we accept that false narrative as true, when we contort ourselves and embrace any curricular reform that promises to make us more “practical” and “useful,” when we adopt educational practices for their branding or marketing potential rather than their educational value, we betray our fundamental mission: the education of our students for freedom, not for servitude.

Tuesday, July 22, 2014

Historically Moving

After more than four years doing this blog, I'm starting a new venture. History News Network recently invited me to blog on their site, and with this post, "Historical Humility," I begin.

I'll still be posting my pieces here, probably a day after they make their debut on HNN. And I will continue to use this space for the occasional less historical and more personal piece.

I'd like to thank you readers who have been following this blog--some since it began early in 2010. In retrospect, it seems that every time I began to wonder if it was worth the time and effort, someone would, out-of-the-blue, send me a nice compliment, or ask me when the next piece was coming. So thanks to everyone who did that.

I just wish my Dad was still here to see the new blog. He was probably the biggest fan of "The Past Isn't Past." Nothing gave me more satisfaction than when he would drop a casual "I liked your blog post" into our weekly Sunday afternoon phone call. After he passed, I went on his computer to send a message to his contacts to let them know, and noticed that "The Past Isn't Past" was the first bookmark on his web browser.

So, for that Great Web Browser in the Sky--and the rest of you, too--here's the bookmark for my new web home, Mark Byrnes's Facing Backwards.

Friday, July 4, 2014

I love the Fourth of July

(Re-posted from July 1, 2010)

I love the Fourth of July.

Not just because of fireworks (though who doesn't love a good fireworks display?). And not just because of cookouts (and, since you can throw a veggie burger on the grill too, who doesn't love a good cookout?). And not just because it gives me a reason to play two of my favorite songs, Bruce Springsteen's "Fourth of July, Asbury Park (Sandy)" and Dave Alvin's "Fourth of July" (though, seriously, this would be reason enough).

I love the Fourth because of the Declaration of Independence.

It began sometime in my childhood. At some point, on some vacation, at some historical site, my parents bought me a facsimile of the Declaration. It probably tells you all you need to know about me that I thought this was a great souvenir. It was hard, brittle, yellowed paper that crackled when you handled it. For some time I thought all official documents were thus. So when, in the fifth grade, my classmates called upon me to write a peace treaty ending the Great Spitball War between Group 2 and Group 3 (a foreshadowing that I would one day study diplomatic history?), I insisted on taking the piece of paper, coloring it with a yellow crayon, and then crumpling it up in a ball and flattening it out so that, at least to my eye, it looked like my copy of the Declaration. Then it was official.

Eventually, I stopped wondering why there were so many "f"s where there should clearly be "s"s, and thought more about its content. Just about every American is familiar with the most famous passage about the self-evident truths. But there is a lot more to the Declaration. Much of it, the bulk of it really, is essentially an indictment of George III justifying the break. Reading it with an historian’s rather than a patriot’s eye, many of the points don’t really hold up. But my favorite part of the Declaration isn’t one of the well-known lines, or something obscure from the list of charges. It comes at the end, just a simple, short phrase, and it encapsulates for me what is best about the Fourth of July.

When you think about it, July 4 isn’t really the most natural date for the nation’s birth. There are other turning points we could have chosen, for example, the outbreak of hostilities. Using that criterion, April 19, 1775, the date of the battles of Lexington and Concord, would be a better choice. Perhaps February 6, 1778, the date a great power, France, recognized American independence and entered an alliance with the U.S. that would help win the war, would be fitting. Legally one could argue that April 9, 1784, the date Britain recognized independence with its acceptance of the Treaty of Paris, was the true independence day.

But we didn’t choose the date of a battle, or the recognition of a great power, or the acceptance of the mother country. We chose the date of a declaration. What does July 4, 1776, mark, after all? A decision. An intention. Not a change in fact, but a change of mind. Looked at coldly, purely as a matter of fact, the Declaration is an absurdity. The colonies declared that they were independent, but they clearly were not. The colonies were still ruled by royal governors appointed by the King, and were occupied by tens of thousands of British soldiers. But the Declaration nonetheless boldly states, in the words of a resolution first proposed by Richard Henry Lee nearly a month earlier, that “these united Colonies are, and of Right ought to be Free and Independent States.”

And it’s that phrase that I love: “and of Right ought to be.” The Declaration is not one of fact. It is one of what “of Right ought to be.” This country was founded with its eyes on the Right. Those men who signed the declaration were not always right. About some things, many of them, in many ways, were tragically wrong. But they knew the importance of what ought to be. And they knew that the most important date was not the one when men took up arms, but when they decided to do what was right. When it has been at its worst, this country has settled passively for what is, or what cynics said has always been and thus must always be. When it has been at its best, it has remembered to keep its eyes on what "of Right ought to be."

Have a wonderful Fourth of July, and sometime between the cookout and the fireworks, think a little about what of Right ought to be. And then work to make it a reality. That’s what the Fourth, and being an American, means to me.

Tuesday, June 24, 2014

Maliki is the New Diem

Some people are talking coup d'etat in Iraq.

David Ignatius writes that "President Obama sensibly appears to be leaning toward an alternative policy that would replace Maliki with a less sectarian and polarizing prime minister."

The impulse to replace Maliki is understandable. Most observers of Iraq argue that he has played a large role in the growing sectarian divide between the majority Shi'ites and the minority Sunnis, and thus bears responsibility for the growth of ISIS in the north.

The unstated assumption, of course, is that another popularly elected, plausible leader could have governed differently and guided Iraq into a functioning democracy, and that now, the fact that elections produced Maliki should not stop the United States from maneuvering behind the scenes to get a more able (read "pliable") leader in his place. Then the United States can go about fixing Iraq.

President George W. Bush shakes hands with Iraqi Prime Minister Nuri al-Maliki, July 25, 2006. Photo by Kimberlee Hewitt, public domain via Wikimedia Commons.
Perhaps. More likely is that the internal conditions in Iraq produced the kind of leader Maliki became. If that's the case, then a coup to oust Maliki will do no good at all. Instead, it is likely to make things worse.

There is certainly precedent for that. In the mid-1950s in South Vietnam, the Eisenhower administration sought a non-communist popular leader who would not be tarnished by associations with the departing French colonizers. It settled on Ngo Dinh Diem.

For about six years, Diem seemed the answer to American prayers. He created a separate South Vietnamese government as a counter to Ho Chi Minh's communist North. He led a fairly stable regime that served American interests in the region.

President Dwight D. Eisenhower shakes hands with South Vietnamese President Ngo Dinh Diem, May 8, 1957. U.S. National Archives and Records Administration.
But then in 1960, the National Liberation Front began its offensive against Diem's government. As pressure grew, Diem grew more oppressive, in particular cracking down on the majority Buddhists. By the fall of 1963, the American embassy and elements of the Kennedy administration decided that Diem was the problem and needed to go. American officials sent signals to South Vietnamese generals, who then ousted and murdered Diem and his brother.

Ignatius effectively proposes that the United States do the same thing in Iraq today:
The people who will pull the plug on Maliki are Kurdish leader Massoud Barzani and other Iraqi kingmakers. The United States should push them to signal unmistakably that Maliki is finished…. Saudi Arabia wants Obama to announce that he opposes Maliki. It would be better just to move him out, rather than hold a news conference.
One can only hope that Obama resists such pressure. Things with Diem didn't work out well.

In a February 1, 1966 conversation with Sen. Eugene McCarthy, LBJ put it bluntly. Kennedy was told, he said, that Diem
was corrupt and he ought to be killed. So we killed him. We all got together and got a goddamn bunch of thugs and we went in and assassinated him. Now, we've really had no political stability since then.
The political instability that followed the Diem coup was a major contributing factor in LBJ's disastrous decision to Americanize the war in Vietnam.

The desire to replace Maliki is another example of the imperial attitude toward Iraq: America gets to decide when it is time for the leader to go. I have little doubt that if the United States determined to do so, it could mount a coup against Maliki.  But as always, the question is: what then?

As with the initial invasion, it is relatively easy to destroy. It is much harder to build. The United States can probably destroy Maliki if it so chooses. But can it build anything to replace him?

Sunday, June 22, 2014

David Brooks and Pottery Barn Imperialism

One of the reasons I continue to read David Brooks is that he is often unintentionally revealing. Since he is, I think, quite sincere, he does not indulge in clever subterfuge in making his arguments. Thus he sometimes lays bare what otherwise remains hidden behind what Andrew Sullivan last week (ironically) called "noble lies."

In his June 13 column, Brooks tries to lay the blame for Iraq's current travails at the foot of Barack Obama. Before American troops left in 2011, he writes:
American diplomats rode herd on Prime Minister Nuri Kamal al-Maliki to restrain his sectarian impulses. American generals would threaten to physically block Iraq troop movements if Maliki ordered any action that seemed likely to polarize the nation.
After U.S. troops left, he writes:
Almost immediately things began to deteriorate. There were no advisers left to restrain Maliki’s sectarian tendencies. The American efforts to professionalize the Iraqi Army came undone.
Brooks never acknowledges the obvious (though unstated) assumption behind all of this: that Iraq could not be expected to function without the United States. It seems that Nuri al-Maliki (hand-picked by George W. Bush in 2007, by the way) bears no responsibility for indulging his "sectarian impulses" (and note that Maliki is ruled by "impulse," not thought or calculation), and the Iraqi army bears no responsibility for not being professional. It is all due to the absence of Americans, who of course, know best.

Brooks says, quite without irony, that "Iraq is in danger of becoming a non-nation." It never occurs to him that a state that--according to him--cannot function without American diplomats riding herd and American generals threatening its leader might already be a "non-nation."

Without knowing it, Brooks embraces an imperial role for the United States. It was America's job to control the Iraqi government, make it do the right thing. The United States should have stayed in Iraq for as long as it took. Leaving Iraq was "American underreach."

Brooks also embraces the reflexive American-centric mindset far too common on both the left and the right in the United States: the idea that whatever happens abroad happens because of something the United States either did or did not do. An incorrect American policy of withdrawal led to this state of affairs. It necessarily follows that whatever is going on in Iraq now can be fixed by the correct American policy.

Neither of those things is true. It is an illusion that Americans cherish because they think it gives them control over a chaotic world.

The American invasion of Iraq in 2003 broke Iraq. Iraqis thus far have not been able to put it back together. Maybe they never will. The lesson to be learned from that, however, is not what Brooks would have us believe: "The dangers of American underreach have been lavishly and horrifically displayed."

In the lead-up to the Iraq War in 2003, Colin Powell allegedly talked about the so-called "Pottery Barn rule: You break it, you own it." The true lesson of Iraq is this: American military intervention can easily break a country. It does not follow that American military intervention can just as easily make a country. Having disastrously bungled in breaking Iraq, Brooks would now have the United States once again bungle in trying to make it.

What the United States must "own" is not the state of Iraq, but the responsibility for breaking that state. Those are not the same thing. Responsibility begins with not making the situation worse by repeating the original mistake.

David Brooks, it seems, never learned that lesson. One hopes Barack Obama has.

Friday, June 20, 2014

Somebody Told Us There'd Be Days Like These

With chaos returning to Iraq due to the growing power of ISIS (Islamic State of Iraq and Syria) in the north, the partisan divide over the American war there has resurfaced as well. Supporters of the war charge President Obama with losing Iraq because he withdrew American forces, while critics of the war fume at the gall of the architects of that disastrous war now posing as experts on the region.

Because Republicans and Democrats have lined up rather predictably, there is a sense that this is merely a partisan dispute. It is not, though the partisan nature of the current debate makes it seem so.

Rather than go back to the 2003 debate, I decided to look back a little further--to the first war with Iraq in 1991, and the criticism of the George H. W. Bush administration for its refusal to go "on to Baghdad." Those Republican foreign policy leaders defended their decision by predicting undesirable outcomes--ones which we are now seeing come to fruition.

Re-reading the memoirs of Colin Powell (then Chair of the Joint Chiefs) and James Baker (then Secretary of State), it becomes immediately apparent that they foresaw today's events as the nearly inevitable outcome of a U.S. invasion to topple Saddam.

President George H. W. Bush, Secretary of State James Baker, National Security Advisor Brent Scowcroft, Gen. Colin Powell, Jan. 15, 1991. U.S. National Archives and Records Administration.
Writing in 1995, Gen. Powell quoted U.S. ambassador to Saudi Arabia, Charles Freeman, who wrote in a 1991 cable: "For a range of reasons, we cannot pursue Iraq's unconditional surrender and occupation by us. It is not in our interest to destroy Iraq or weaken it to the point that Iran and/or Syria are not constrained by it."

Baker also observed in 1995 that "as much as Saddam's neighbors wanted to see him gone, they feared that Iraq might fragment in unpredictable ways that would play into the hands of the mullahs in Iran, who could export their brands of Islamic fundamentalism with the help of Iraq's Shi'ites and quickly transform themselves into the dominant regional power."

Supporters of the war who now bemoan the growth of Iran's influence in Iraq have no one but themselves to blame. We were told it would be like this.

The current situation--a stable Kurdistan, ISIS in control of much of the Sunni-dominated areas, Shi'ites rallying to the defense of their holy sites--portends the possible partition of Iraq, either formally or de facto. That, too, was foreseen in 1991.

Powell: "It would not contribute to the stability we want in the Middle East to have Iraq fragmented into separate Sunni, Shia, and Kurd political entities. The only way to have avoided this outcome was to have undertaken a largely U.S. conquest and occupation of a remote nation of twenty million people."

The United States spent eight long years doing just that, occupying Iraq to keep it together. But that was never a sustainable long-term prospect. It went on too long as it was. Nevertheless, there are some neocons today suggesting that the United States never should have left Iraq.

Baker, who was known for his domestic political skills before he went to the State Department, knew that scenario was untenable: "Even if Saddam were captured and his regime toppled, American forces would still be confronted with the specter of a military occupation of indefinite duration to pacify the country and sustain a new government in power. The ensuing urban warfare would surely result in more casualties to American GIs than the war itself, thus creating a political firestorm at home."

Twenty years ago, these Republican statesmen predicted the situation we now see in Iraq. They warned anyone who would listen that an American intervention to overthrow Saddam Hussein would have undesirable consequences contrary to American interests, regardless of any specific actions the United States did or did not take in pursuit of that larger goal.

Keep in mind that they said these things would happen with their president in charge, with themselves making policy. They understood that there are forces that such an act would set loose which the United States could not control, no matter who was in office. They said all this long before anyone had ever even heard of Barack Obama. The idea that any specific act by the president is primarily responsible for the current state of affairs in Iraq is absurd on the face of it.

That won't stop people from saying so. But it should keep the rest of us from believing it.

Monday, June 16, 2014

Leadership and Interventionism Are Not the Same Thing

Robert Kagan has written a piece in the New Republic entitled "Superpowers Don't Get to Retire." In it, he bemoans what he perceives as America's retreat from its responsibility to preserve a liberal world order. Kagan argues: "Many Americans and their political leaders in both parties, including President Obama, have either forgotten or rejected the assumptions that undergirded American foreign policy for the past seven decades."

Kagan is correct that public attitudes towards America's role in the world have shifted recently, but he dramatically overstates the case when he posits a break with a 70-year tradition. He seems to equate "leadership" with military interventionism. Americans have rejected the latter, not the former.

What Kagan does not recognize is that the public's current aversion to military interventionism abroad is consistent not only with America's pre-World War II foreign policy but also with the golden age of leadership he praises.

Kagan's fundamental mistake is to think that the American people embraced military interventionism during and after World War II. They did not.

Americans have always been averse to military actions leading to large numbers of American casualties and extended occupations of hostile territory. In the two years before Pearl Harbor, Americans (even the so-called "interventionists") desperately clung to the idea that they could protect American interests merely by supplying the British (and later the Soviets) with the weapons to do the fighting.

While conventional wisdom suggests that Pearl Harbor changed all that, the reality is different. Even after the United States entered the war, it was reluctant to launch military operations that posed the threat of huge casualties. As David M. Kennedy has stated, this American predilection to avoid combat with Germany's forces in France led Stalin to conclude: "it looks like the Americans have decided to fight this war with American money and American machines and Russian men."

Even the major architect of the postwar order, Franklin Roosevelt, did not envision an America that would permanently station large numbers of U.S. soldiers abroad, much less deploy them on a regular basis. Yes, he did see the United States as the leading power in the new United Nations. But the point of having the so-called "Four Policemen" was to ensure that the other three would be the ones to send soldiers to keep order in their respective spheres of interest. He imagined that the American role would be primarily in the form of naval and air power. "The United States will have to lead," FDR said of the UN, but its role would be to use "its good offices always to conciliate to help solve the differences which will arise between the others."

FDR, Churchill, and Stalin at Teheran. By Horton (Capt), War Office official photographer, public domain, via Wikimedia Commons.
As the historian Warren Kimball has written, at the 1943 Teheran conference, when Stalin pressed him on how the United States would comport itself as one of the policemen, "FDR resorted to his prewar notion of sending only planes and ships from the United States to keep the peace in Europe." In FDR's mind, the United States would be primarily responsible for order in the western hemisphere, a role it had played for decades.

Even the so-called American declaration of cold war, the Truman Doctrine speech of March 1947, avoided the implication that American military forces would be deployed to uphold the doctrine. The speech signaled to the world both that the United States was assuming some of Britain's responsibilities and that it had given up on the idea of cooperation with the Soviet Union. Nonetheless, Truman explicitly stated that the aid he was requesting would not be military: "I believe that our help should be primarily through economic and financial aid which is essential to economic stability and orderly political processes." Truman presented aid to Greece and Turkey as mere money to make good on the far larger investment of lives and treasure during World War II: "The assistance that I am recommending for Greece and Turkey amounts to little more than 1 tenth of 1 per cent of this investment. It is only common sense that we should safeguard this investment and make sure that it was not in vain."

The Korean War changed that by requiring quick American military intervention to prevent the collapse of South Korea in the summer of 1950, but when it bogged down into a stalemate after the Chinese intervention in November, the public quickly soured on the war. In January 1951, "49% thought the decision was a mistake, while 38% said it was not, and 13% had no opinion," according to Gallup. While those numbers fluctuated over the next two years, and more Americans thought the war was not a mistake whenever an end to the war was in sight, the American public in general did not support military actions that led to substantial American casualties and prolonged combat. The public's disillusionment with the war was one of the reasons that an increasingly unpopular President Truman decided not to run for reelection in 1952.

The next president, Dwight Eisenhower, moved quickly to end that war and, more importantly, instituted a foreign policy that had at its core the principle of avoiding any Korea-style wars in the future. Eisenhower determined that engaging in limited wars in every world hot spot would bankrupt the country. He preferred "massive retaliation": the idea that a threat to essential American interests would be met with a nuclear threat, not a conventional response in kind. Even when the French faced defeat in Vietnam, Eisenhower refused to intervene, and he never seriously considered deploying American troops to Vietnam.

While John Kennedy came into office criticizing that approach, pledging to "pay any price, bear any burden," the sobering experience of the Cuban missile crisis made him rethink that mindset. The cold war, he said in June 1963, imposed "burdens and dangers to so many countries," and specifically noted that the US and Soviet Union "bear the heaviest burdens." He spoke of the American aversion to war: "The United States, as the world knows, will never start a war. We do not want a war. We do not now expect a war. This generation of Americans has already had enough -- more than enough -- of war and hate and oppression."

While one may argue that Kennedy's policies led to the next American war in Vietnam under his successor Lyndon Johnson, it is also the case that Johnson sought to avoid a land war. Significantly, he looked first to use air power. Operation Rolling Thunder, the American air campaign against North Vietnam, was meant to forestall the need for American ground troops in large numbers. It was only after the clear failure of bombing to achieve American aims that Johnson escalated the war with more ground troops.

When that effort too proved futile, Richard Nixon again returned to air power as America's main instrument for maintaining order abroad. His Vietnamization policy tried to balance the withdrawal of American troops with the deployment of increased air power. The "Nixon Doctrine" announced that henceforth "we shall furnish military and economic assistance when requested in accordance with our treaty commitments. But we shall look to the nation directly threatened to assume the primary responsibility of providing the manpower for its defense." In other words, America's friends should not expect American troops to do their fighting for them.

I'd argue that from Nixon up until George W. Bush's invasion of Iraq, that was American policy. Ronald Reagan, George H. W. Bush, and Bill Clinton all avoided open-ended military commitments of American troops (Clinton's air-only campaign against Serbia in 1999 is the best example).

Only the first war against Iraq in 1991 challenged that trend, and even that war involved a longer preliminary air campaign than a ground one: five weeks of bombing preceded the ground campaign, which lasted only 100 hours. According to Colin Powell, Bush had the Vietnam War in mind when he resisted the calls of "on to Baghdad." Bush "had promised the American people that Desert Storm would not become a Persian Gulf Vietnam," Powell writes in his memoir, "and he kept his promise." Within two weeks of the ceasefire, the 540,000 U.S. troops began their withdrawal from the Persian Gulf.

Even the American war in Afghanistan in 2001 was planned to keep the American "footprint" light, relying on American air power and the Afghan Northern Alliance to do much of the fighting. It was the invasion and prolonged occupation of Iraq beginning in 2003 that predictably soured Americans once again on the prospect of extended military engagements.

In sum, what Americans are experiencing now is not exceptional, but rather normal. In the aftermath of extended, costly military interventions leading to the loss of American lives, the American people revert to their historical aversion to solving problems by fighting in and occupying foreign states. That does not mean the United States ceases to be relevant, or ceases to lead. It simply means that Americans have been reminded once again that not every problem can be solved by an invasion, and that leadership is more than a reflexive application of American military might.