Wednesday, December 17, 2014

Dick Cheney's David Frost Moment

[This post was originally published on the History News Network.]

All of us who teach face a daily, daunting task: how do we take our subject matter—which we know from years of study to be terrifically complicated and nuanced—and make it accessible and understandable to our students, all while avoiding the peril of oversimplification?

We all do our best, succeeding sometimes, failing others. We are eternally grateful when we find a key: that piece of evidence, that compelling argument, that helps us do our jobs. The most prominent example of that for my teaching is perhaps Richard Nixon’s infamous comment to David Frost about his actions during the Watergate scandal: "Well, when the president does it, that means that it is not illegal."

That one simple sentence helps me communicate to students the essence of the danger inherent in the many complicated events that we call “Watergate”: Nixon’s sincerely held belief that as president he was incapable of committing an illegal act; that whatever he deemed necessary to the security of the United States was, by definition, legal. It was a sentiment more consistent with the absolutism of Louis XIV than the constitutional principles that gave birth to the nation. What makes those words so powerful is that they come not from one of Nixon’s many implacable political foes, or from a historian interpreting his actions. They come from the man himself.

Since the release of the Senate Intelligence Committee’s torture report last week, I’ve been struggling with how to synthesize the multiplicity of reactions it provoked. Then on Sunday, I saw former Vice President Dick Cheney on “Meet the Press.”

Amidst all the dissembling, Cheney made one remark that struck me as his David Frost moment.

Moderator Chuck Todd confronted Cheney with evidence that 25% of the detainees were innocent, and that one was physically abused so badly that he died. Cheney replied: “I'm more concerned with bad guys who got out and released than I am with a few that, in fact, were innocent.” When pressed about whether it was acceptable to abuse innocent people even to the point of death, Cheney said: “I have no problem as long as we achieve our objective.”

Keep in mind, Cheney was not talking about the accidental death of innocents on the battlefield. Every war involves such deaths, though just war standards demand that every reasonable effort be made to avoid them. This was a man who had done no wrong, who was mistakenly taken into custody, and who was killed by the physical mistreatment he suffered at the hands of representatives of the United States. Faced with that travesty of justice, Dick Cheney could not even muster a perfunctory expression of regret.

Confronted with an unquestionable injustice, Cheney says: “I have no problem as long as we achieve our objective.” That is the essence of everything wrong with the Bush-Cheney “war on terror.” It admitted no principle whatsoever as superior to the objective of keeping the nation safe. Fundamental human rights—even of innocent people—can be violated with impunity, Cheney asserts. Even after being presented with evidence that an innocent man was killed, Cheney blithely said, “I'd do it again in a minute.” The end justifies the means.

That is the mindset of the authoritarian. Dictators the world over use that logic every day. Dick Cheney will never admit that the techniques he endorsed constitute torture—to do so would be to admit he is a war criminal. But he has now admitted, beyond any doubt, that he has the mentality of a torturer.


Wednesday, October 22, 2014

Can Obama Do in Iraq What Nixon and Ford Couldn't in Vietnam?

[Originally published on History News Network]

Practically every American intervention abroad since the 1960s has prompted comparisons to Vietnam. So it was hardly surprising when on October 8, in response to President Obama’s decision to expand the campaign against ISIS into Syria, Fredrik Logevall and Gordon M. Goldstein authored an op-ed in the New York Times that asked “Will Syria Be Obama’s Vietnam?”

I’m not sure that’s the right question. The American concern over ISIS originated in Iraq, after all—an intervention that is now eleven years old. America’s air campaign against ISIS today reminds me less of the intervention that happened in Vietnam than the one that didn’t happen—in the spring of 1975.

This past June, when ISIS suddenly broke through America’s collective effort to forget about Iraq and seemed poised to take Baghdad, it was easy to wonder if we were about to witness a repeat of the fall of Saigon.

More than two years after the peace agreement that led to the withdrawal of American troops from Vietnam, a North Vietnamese offensive against South Vietnam met with little effective resistance, much like last June’s stories of Iraqi armed forces dropping their arms and failing to fight ISIS. Compare these passages from the New York Times coverage of the fall of Hue in March 1975 and Mosul in June 2014:
“By the thousands, the people are abandoning Hue…. The armed forces are also moving out, some by landing craft, some in military vehicles, some bundled into trucks with family members, furniture and food. No one seemed in the slightest doubt yesterday that Hue and the rest of the north were being left to the Communists.”
“Thousands of civilians fled south toward Baghdad…. The Iraqi Army apparently crumbled in the face of the militant assault, as soldiers dropped their weapons, shed their uniforms for civilian clothes and blended in with the fleeing masses…. ‘They took control of everything, and they are everywhere,’ said one soldier who fled the city.”
The political reaction this summer also eerily echoed the reaction to events of nearly 40 years ago.

In his Memoirs, Richard Nixon argued that he had won the Vietnam War: the South Vietnamese government had survived for two years after the peace agreement, which meant that Vietnamization had worked, and renewed American bombing of the North would have preserved it.

“When Congress reneged on our obligations under the agreements,” Nixon wrote, “the Communists predictably rushed in to fill the gap.” Nixon had privately assured South Vietnamese President Thieu that violations of the peace agreement by Hanoi would be met with renewed American bombing. But in June 1973, the Case-Church Amendment forbade funding for any military operations in Vietnam. “The congressional bombing cutoff, coupled with the limitations placed on the President by the War Powers Resolution in November 1973, set off a string of events that led to the Communist takeover.” The war was “lost within a matter of months once Congress refused to fulfill our obligations,” Nixon said.

Henry Kissinger has also repeatedly argued that the peace agreement reached with Hanoi had secured the independence of South Vietnam, and that he and Nixon intended to use air power to thwart any North Vietnamese aggression against the South. But, he asserts, Watergate so weakened Nixon that they were unable to overcome the opposition of Congress. In a meeting with Singapore’s Lee Kuan Yew on August 4, 1973, Kissinger said: “We have suffered a tragedy because of Watergate … We were going to bomb North Vietnam for a week, then go to Russia, then meet with [North Vietnam’s lead negotiator] Le Duc Tho. Congress made it impossible.”

Lewis Sorley, in his 1999 work A Better War: The Unexamined Victories and Final Tragedy of America’s Last Years in Vietnam, argued that “[t]here came a time when the war was won.” Due to the pacification efforts of Gen. Creighton Abrams, he writes, victory in Vietnam “can probably best be dated in late 1970.” The countryside was pacified, and South Vietnamese forces were “capable of resisting aggression so long as America continued to provide logistical and financial support, and … renewed application of U.S. air and naval power should North Vietnam violate the terms of that agreement.”

The argument that the continued application of American air power against North Vietnam could have preserved South Vietnam has thus been a staple of Vietnam War revisionism.

In June, Sen. John McCain made an argument about Iraq similar to the one that Nixon, Kissinger, and Sorley made about Vietnam:

"We had it won," McCain said. "Gen. [David] Petraeus had the conflict won, thanks to the surge. And if we had left a residual force behind, that we could have, we would not be facing the crisis we are today. Those are fundamental facts ... The fact is, we had the conflict won, and we had a stable government.” Sen. Lindsey Graham of South Carolina added: “There is no scenario where we can stop the bleeding in Iraq without American air power."

There are no do-overs in history, and no one can say for certain whether the renewed application of American air power after the 1973 peace agreement might have prevented the fall of Saigon—or if it did, for how long. But we are currently seeing why Congress sought to limit the executive branch’s options back in 1973.

The fear then was that, despite the peace agreement, Nixon and Kissinger would continue to fight a war that the country overwhelmingly wanted to be over. Kissinger’s repeated statements indicate that they in fact intended to do just that, not just in Vietnam but possibly in Cambodia, too. The Case-Church Amendment was how Congress expressed the national consensus against reviving the war.

Today, there seems little will in Congress to restrain the president’s war-making powers. If anything, the loudest voices have been those arguing for even greater military action. In response to such pressure, the president has already expanded the air war to Syria.

Just last week, McCain argued that “pinprick” airstrikes were proving ineffective, and called for further expansions of the war: “They’re winning, and we’re not,” McCain told CNN. “The Iraqis are not winning. The Peshmerga, the Kurds are not winning.” Thus, he argued, there was a need for “more boots on the ground … in the form of forward air controllers, special forces and other people like that…. You have to arm the Peshmerga … Buffer zone in Syria, no-fly zone, take on Bashar al Assad the same as we have ISIS.”

McCain’s vision of a renewed, ever-expanding war is precisely what Congress in 1973 meant to prevent Nixon and Kissinger from doing. After nearly a decade of war, Americans had decided that the fall of South Vietnam, Cambodia, and Laos would not be a mortal threat to American security.

Today, what stands between the United States and the full-scale revival of a war Americans thought was over is not Congress, but the president himself. Obama has repeatedly stated that he will not re-introduce American combat troops to Iraq, and he is trying to maintain a sense of balance about the nature of the threat: “While we have not yet detected specific plotting against our homeland, these terrorists have threatened America and our allies. And the United States will meet this threat with strength and resolve.”

McCain, however, is doing the opposite, hyping the threat to the U.S. Back in June he said: “We are now facing an existential threat to the security of the United States of America.” Last week he said: “it is a threat to the United States of America if they are able to establish this caliphate.”

A September CNN public opinion poll suggests that Americans agree with McCain about the threat, while siding with Obama on the limits of the U.S. response. Ninety percent say ISIS represents a threat to the U.S., with 45 percent calling the threat “serious,” 22 percent saying it is “fairly serious” and 23 percent saying it is “somewhat serious.” (Two years after 9/11, in 2003, 49 percent considered Al Qaeda a “serious” threat to the U.S.) Seventy-one percent believe ISIS terrorists are already in the U.S. But at the same time, by a 61-38 margin, Americans oppose using American ground forces to defeat ISIS.

ISIS has succeeded in making Americans think that Iraq matters again, and that U.S. interests require its defeat, but it has not yet convinced them that it is worth Americans doing the fighting and dying. That's Obama's dilemma. If air power is not enough, does he take the chance that Iraq (or Syria) falls to ISIS, or does he break his promise?

In the spring of 1975, Congressional and public opinion meant that President Ford had little choice but to watch as the North Vietnamese Army rolled into Saigon. Nearly 40 years later, President Obama faces a far more difficult task: prevent the collapse of the Iraqi government (and, increasingly, the Syrian opposition) without fully reviving a war he spent years trying to end—all in the face of an opposition that is intent on proving that the Iraq war it supported was won until the president lost it.

Whether Obama will be able to keep his promise not to send American ground forces back to Iraq is very much an open question. Having taken the first step to save Iraq by applying American air power—what Nixon, Kissinger and Ford could not do in Vietnam—he may find it increasingly hard to resist subsequent steps if air power proves not to be enough.

Tuesday, September 9, 2014

Lies, Damn Lies, and Statistics (Higher Education "Reform" Edition)

Following this summer's seminar on the liberal arts at Transylvania University, I resolved to more consciously talk about the liberal arts with my new crop of first-year students in my Humanities class this semester. Last week, we spent a full class period talking about their reasons for coming to Wofford, and Wofford's commitment to a liberal arts education. We'll spend two more classes this week discussing it. They should understand what they're getting into, I think.

The beginning of the academic year always prompts some thinking about the purpose of education, even among those not engaged in it. Frank Bruni has an interesting piece in the New York Times arguing that higher education has an obligation to challenge students: "college needs to be an expansive adventure, yanking students toward unfamiliar horizons and untested identities rather than indulging and flattering who and where they already are." I couldn't agree more.

The Times also carried another piece that conveys the more dominant view in American culture: that college, first and foremost, is about getting a job.

Ben Carpenter, vice chairman of the CRT Capital Group, argues that what is missing from college today is "career education." For Carpenter, it is not enough for colleges to provide majors geared toward professional pursuits, and to have Career Services offices. The college must also offer courses in "career training":
So what can be done to make certain these young adults are being prepared for life post-graduation? The answer is simple: Colleges need to create, and require for graduation, a course in career training that would begin freshman year and end senior year.
(Note to self: remind students to always beware whatever statement follows the phrase "The answer is simple.")

The first thing worth noticing here is Carpenter's choice of words. He is clear about what his concern is: "how to get, and succeed at, a job." But the title of the article isn't "Is Your Student Prepared for a Job?"--it is "Is Your Student Prepared for Life?" Throughout the piece, Carpenter uses the words "job," "career," and "life" interchangeably.

It does not take a liberal arts education to know that those words do not mean the same things. Too often in discussions of education, we elide the differences, so when talking to my students last week, I made the difference explicit. A liberal arts education is meant to prepare you, I said, not just to make a living, but to make a life.

I do not know whether Carpenter intentionally conflates "job" and "life" to confuse the reader, or if he honestly does not see a meaningful distinction between the two. Either way, doing so has the effect of perpetuating the idea that your job is your life and that college is therefore only about getting a job.

The second issue that got my attention was that Carpenter employs what seems to me the knee-jerk "reform" response to every perceived challenge in higher education: make it part of the curriculum! I have no problem with the idea that colleges should help students find post-graduate employment. Here at Wofford, The Space is devoted to that project, and does a lot of good for our students. But it is not part of the curriculum.

That's not what Carpenter is calling for; in fact, he denigrates Career Service offices as suffering from a "major disconnect" with students. He wants "a course," one that lasts for four years and is required of all students. Since Carpenter does not get more specific, it is hard to know whether he means a course every semester for four years, or one course a year, or one course that lasts four years. But he clearly is talking about making it part of the curriculum.

It is self-evident that every new course requirement reduces the electives available for students to take to investigate their own passions or interests. The more expansive Carpenter's plan, the fewer academic courses students in it will take. It is hard not to wonder if that isn't part of the idea. If college exists merely to train workers, what do they need those electives for, anyway?

Finally, there is the matter of the precise problem that is driving his proposal. At the start of the article, Carpenter states:
According to a recent poll conducted by AfterCollege, an online entry-level job site, 83 percent of college seniors graduated without a job this spring.
In contrast, toward the end, he cites an example that suggests the efficacy of what he proposes:
One year after graduation, 96 percent of all Connecticut College alumni report that they are employed or in graduate school.
One of the things my liberal arts education taught me is to look closely and carefully when someone cites statistics. On the surface, the difference seems stark: 83 percent with no job, 96 percent employed! See, the answer is simple! Certainly that's what Carpenter wants us to think. But a moment's consideration shows that he's doing the old apples and oranges comparison.

The AfterCollege survey purports to measure only how many students reported having a job lined up before graduation. The accuracy of that number may be questionable, since it was an online survey and not, as Carpenter says, a scientific "poll." Second, the fine print on the survey reveals that the respondents were not just students about to graduate--a majority had already graduated, 23.38 percent were college seniors, and 12.25 percent were juniors. (Safe to say that few if any juniors already have a job lined up for after graduation.)

The 83 percent number comes just from students still in school, including those juniors. For recent grads, the number is 76.3 percent. No doubt that's a big number, but it is not 83. In addition, since the survey was conducted between February 27 and April 15, 2014, some seniors who answered "no" in late February or March may well have had jobs by the time they graduated in May 2014.

In short, it is not true that 83 percent of last year's graduates had no job at graduation, even according to this survey.

Now let's look at the Connecticut College numbers. By contrast, they are not a mix of recent grads and current juniors and seniors. They measure an entire graduating class. In no way can that group be reasonably compared to the AfterCollege survey respondents. In addition, it measures the outcome for those students one year after graduation.

A true comparison would require surveying only graduating seniors right after they graduated and then comparing the number with jobs to the number with jobs one year later. A year makes a huge difference in the job search, as does being out of school--I recall not feeling much urgency about getting a job until after I graduated. In my experience, most college seniors are preoccupied with either the successful completion of their degrees or enjoying the final months with friends they've known for three and a half years, or both. The job search gets serious after graduation.

In addition, the Connecticut number lumps together the employed and those who are going to graduate school--those planning to attend graduate school of course do not have a job lined up before graduation. For all we know, a significant percentage of those reporting "no job" in the AfterCollege survey may well have had plans to go to graduate school.

The Connecticut College program may well be worthwhile and do great good. But Carpenter's comparison of the two numbers is misleading, whether he realizes it or not. I have to think that if he had a direct apples-to-apples comparison that served his argument, he would have used it instead. I suspect it would not have been nearly as stark as the numbers he uses.

As I stated in my last post, the idea that colleges are miserably failing their students by not preparing them for the working world is simply not true. It is true that few graduates move seamlessly from college straight into their dream jobs. But the idea that somehow there is a problem so significant that students must replace some of their academic courses with "career training" courses--and that such courses will solve the problem in what is still an extremely competitive and tight job market--is just silly.

But that's what passes for intelligent commentary on higher education these days.

Tuesday, July 29, 2014

On the State of the Liberal Arts

For teachers, one of the most enjoyable things to do is spend time being students again.

So it was that I spent the past weekend at Transylvania University’s seminar on Twenty-First Century Liberal Education, along with 18 other academics from a variety of liberal arts institutions.

We all read hundreds of pages of material in preparation. In the span of 65 hours at the seminar, we spent two hours listening to formal lectures (and another hour discussing them), 10 hours in formal discussion sessions, and countless more hours informally continuing those exchanges.

Yes, this is what teachers do in the summer for fun. And it was fun—as well as intellectually illuminating and invigorating.

It was also sobering, coming as it did at a time when higher education faces plenty of public scrutiny and criticism, and when the liberal arts and liberal arts colleges in particular face charges of irrelevance.

The value of this kind of intensive consideration of a topic is that it inevitably focuses the mind. Many of the issues we discussed have been bouncing around my brain for a while (sometimes showing up in this blog), but I’ve never considered them as intensely as I did at the seminar.

Since I’m forever preaching to my students that the best way to figure out what they think about a reading or discussion is to write about it, I’ll attempt to do that myself. (All of the pieces I quote below are from the wonderful reader that the Transylvania seminar leaders put together.)

For the historian, the easiest and most obvious conclusion to take from our readings is that there is nothing new about the liberal arts—or higher education in general—being under siege. It rather seems like a permanent state of affairs. That’s no excuse for complacency about its current challenges, to be sure, but it does help leaven one’s reaction to all of the apocalyptic warnings of the demise of liberal arts. This is not new: the liberal arts college has been through this before and survived. As Alan O. Pfnister put it in 1984, “the free-standing liberal arts college in America has been a study in persistence amid change, continuity amid adaptation.”

“Continuity and change” is the essence of history, and the story of the liberal arts has seen plenty of both. The perennial debate seems to revolve mostly around the question of value and utility: What precisely is the value of the liberal arts? How do we determine that value, and how do we present it to prospective students and their parents?

For clarity’s sake, the sides can be simplified: 1) the liberal arts have value that cannot be quantified and assessed in any meaningful way, but they prepare students to lead better, more meaningful lives; and 2) the liberal arts must demonstrate their practical value in concrete, accessible ways that give others outside the academy reason to believe they are worth the time and money expended in studying them.

Since these are simplifications, few people are likely to identify with either without some kind of reservation, but I’d argue that at some point everyone concerned with the topic will end up choosing one as having primacy over the other.

I choose the first. I am not unaware of the pressures being brought to bear to make college education ever more “practical” (read “directly applicable to post-graduation employment”) to justify its high price tag. I simply believe first causes matter and that something essential is lost when we, as another participant in the seminar put it, allow external rather than internal causes to determine what and how we teach.

The second point of view, however, seems to dominate the field these days. Writing in 2007, David C. Paris, professor of government at Hamilton College (and one-time participant in the Transylvania seminar) said: “the liberal arts and the academy in general need to make peace with, or at least acknowledge, the importance of the market.”

I’ll meet Paris half-way: I acknowledge that the market matters. Despite his rather disdainful portrayal of the traditional liberal arts as appearing “esoteric and apart from real concerns” or “ornamental,” and of its defenders as not concerned with the “real world,” I am not oblivious to reality.

But no, I will not “make peace” with the idea that the market should determine what and how educators in the liberal arts teach. Paris argues that “the liberal arts are threatened,” at least in part, by “too narrow a self-concept” among its practitioners. He writes that “promoting a good life recognizes that there are many ways of living such a life.” The latter is true. But it is not the liberal arts that are “too narrow.” It is the market that defines the good life in the most narrow way possible, i.e., by a single standard: the dollar sign.

Our students do not need the liberal arts to tell them that money matters. The entire culture tells them that relentlessly. They cannot escape it. It is our job as educators to open them to some of the other possible answers to that basic question: “What makes a good life?”

The liberal arts have a long history of addressing that question and advancing our understanding of the good. Liberal education has been a vehicle for addressing questions of inequality and oppression, empowering students to challenge the institutions that buttress those conditions, primarily through encouraging independent thinking. It has been a truly liberating force, and it has not achieved that by asking what the market wants from it.

What message does it send about the answer to that fundamental question of the good when the Association of American Colleges and Universities (AAC&U) resorts to focus groups of students and employers to tell educators what liberal education should be? Or when the AAC&U endorses and privileges certain educational trends as superior (“active” or “high-impact”) to others and justifies its prescriptions by noting that “employers strongly endorsed” them and that they will receive “very strong support from the employer community”?

Whether they realize it or not, they are saying in effect: Let the market decide. They are abdicating their responsibility as educators to shape curriculum. They are buying into not just the language but the values of the market: if it is demanded, it must be supplied.

David L. Kirp writes in Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education: “This is more than a matter of semantics and symbols.” When we use “business vocabulary we enforce business-like ways of thinking.” (Thanks to Transylvania’s Jeffrey B. Freyman for this quotation from his paper, “The Neoliberal Turn in Liberal Education.”)

Though the proponents of this point of view often come from the progressive side of the political spectrum, they unwittingly are endorsing a decidedly illiberal view of education. As Christopher Flannery and Rae Wineland Newstad point out in “The Classical Liberal Arts Tradition,” the phrase “liberal arts” literally means the “arts of freedom” as opposed to those practiced by slaves. “Slaves are subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves.” So-called “practical” training was for slaves, and the liberal arts would ruin slaves for their role in society as servants to their superiors.

Liberal education later evolved—particularly in the United States—into not just the privilege of the already free, but as a vehicle for freeing the young from servile status. As Frederick Douglass makes clear in his autobiography, the liberating quality of education was the reason American slaves were denied it: “Knowledge unfits a child to be a slave.” Liberal education equips students to take their places as equals in a free society, as makers of their own lives.

But note how the AAC&U approached its call for reform in 2008. In advocating its “Engaged Learning Reforms” (which closely mirror John Dewey’s practical learning agenda of the 1930s--it is nothing new), AAC&U president Carol Geary Schneider justified the plan primarily with a table showing the “Percentage of Employers Who Want Colleges to ‘Place More Emphasis’ on Liberal Education Outcomes.” Leading the pack was “science and technology,” with the support of 82%. Next came “teamwork skills in diverse groups,” with 76%.

The clinching argument for Schneider is this: “these goals for college learning are strongly endorsed by the constituency that today’s students particularly want to please—their future employers.”

That sentence, to my mind, lays bare the essential problem with the AAC&U approach: rather than strongly reaffirming the goal of educating students to think for themselves—the traditional goal of liberal education—the AAC&U implicitly admits that it has substituted the goal of pleasing their future employers. At the end of the day, how far is that from students being “subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves”?

This vision of the liberal arts does not free students; it puts the liberal arts at the service of society’s economic masters. It is natural that economic fear in uncertain times leads college students to want to please future employers. That does not mean that educators should seek to assuage that fear by shirking their responsibility to provide their students with far more than that, or should bend the curriculum to meet the desires of employers.

Schneider’s statement is not an isolated case, either. AAC&U’s LEAP program (Liberal Education and America’s Promise) published a piece in 2005 titled “Liberal Education for the 21st Century: Business Expectations” by Robert T. Jones, president of Education and Workforce Policy. Jones is not shy about how he sees the role of higher education: it “must respond to these trends by keeping the curriculum aligned with the constantly changing content and application of technical specialties in the workplace.”

Note, education “must” serve the needs of the workplace. That which business does and wants, higher education must do—because, at the end of the day, education serves business. Education must submit to business’ “assessment” of how well it produces the “outcomes” business wants, it must get “continual input from both employers and graduates” and change its ways accordingly.

Jones states that employers “are less concerned with transcripts than the demonstration of achievement and competency across a variety of general and specialized skills.” Knowledge, wisdom, perspective—none of these traditional liberal arts goals fit this account of what employers want. “Competency” in “general and specialized skills” is the aim. Today, “competencies” has become a common buzzword in education discussions, even opening the door for granting academic credit for work experience, and threatening to make the classroom experience virtually unnecessary.

The new liberal education, Jones says, “now enhanced with practical learning [how’s that for product branding?] is the essential foundation for success in every growing occupation.”

Jones is smart enough to compliment liberal education, even as he asserts that it is, at least in its current form, wholly inadequate and must be altered to serve the workplace better. But his ultimate purpose could not be clearer: education must “make peace” with the market.

Yes, there are substantial economic pressures on students today. Do we as educators, however, serve them best by surrendering our purposes to what prospective employers tell us they want? I say no. The question we need to ask is this: are the traits that employers say they want, and the means we are urged to adopt to meet them, wholly compatible with liberal education?

Take one example: Schneider tells us that colleges should change curriculum to include more “experiential learning” such as internships and “team-based assignments”—the latter because 76% of employers want more emphasis in college on “teamwork skills.”

Do employers and faculty mean the same things when they advocate “teamwork skills” as an educational goal? If employers next tell us we're not producing the correct "outcome" when we teach teamwork, will we be called upon to change practices once again? Is it not possible that when some employers say they want employees with “teamwork skills,” they mean people who will not rock the boat and bring up the less essential “ethical values” that the team might be violating? I’d suggest that the recent record of the banking and financial industries shows that we may be teaching too much teamwork and not enough ethics.

It may not be coincidental that the two lowest priorities for employers on Schneider’s survey were “ethics and values” at 56% and “cultural values/traditions” at 53%. Would those who use such survey results to justify their preferred educational reforms also accept that the curriculum should not emphasize ethics and values, because employers don’t seem to care so much about them? Shouldn’t the low priority the employers placed on ethics and values suggest to us that perhaps their goals are not the same as liberal education’s, and make us at least question whether we should give priority to their preferences?

A liberal arts education should empower students with a sense of perspective, but that is precisely what is sorely lacking in this debate. The AAC&U approach smacks of fear and desperation, but is the reality really so dire that we need to look to surveys of employers to tell us what to do? Yes, the price of higher education is high (though not as high as the sticker price suggests, since most students do not pay that price), and students and their parents have a right to expect that a high-priced college education will prepare its graduates for life—including the working life.

But today’s sense of panic comes less from those realities than from a culture that reflexively and unthinkingly ridicules the liberal arts as impractical, simply because they do not immediately and automatically funnel graduates into high-paying jobs. Seemingly everyone from Click and Clack on “Car Talk” to President Obama buys into the idea that the art history major won’t get you a good job. We laugh and nod knowingly when people joke that all that liberal arts majors really need to know is how to ask “Do you want fries with that?”

But it is simply not true, as an AAC&U report shows. It may seem true when graduation comes and that dream job (making, say, at least as much money as last year’s tuition cost) does not materialize. It certainly did for me when I was in that boat. But I see much better now than I did then. Thirty years down the road, the full value to me of my liberal arts education continues to emerge.

The liberal education will not pay its dividends—either economic or otherwise—in one or two or five years. When we expect it to do so, we are unthinkingly adopting the short-run values of today’s market mentality, with its concern with the next quarter’s profit, not the long-term viability of the company (see, again, the banking and financial industries). When we then change the way we teach in deference to such illusory expectations, we begin to sacrifice what we have always done best in the service of a mirage.

It is hard for liberal arts colleges to preach patience and perspective; perhaps it has rarely been harder to do so than it is now. But it is true: a liberal arts education has long-term value, value that cannot be reduced to income earned two or four years out, as the President’s “College Scorecard” seems designed to do.

The fact of the matter is that ten or twenty or thirty years down the road, liberal arts majors are doing fine. True, they may not make as much as their cohorts in the STEM fields. Some may need a graduate degree to further enhance their economic well-being. But the traditional liberal arts curriculum does NOT condemn liberal arts graduates to a life of poverty, and we do not serve our students well when we buy into the lie that it does.

When we accept that false narrative as true, when we contort ourselves and embrace any curricular reform that promises to make us more “practical” and “useful,” when we adopt educational practices for their branding or marketing potential rather than their educational value, we betray our fundamental mission: the education of our students for freedom, not for servitude.

Tuesday, July 22, 2014

Historically Moving

After more than four years doing this blog, I'm starting a new venture. History News Network recently invited me to blog on their site, and with this post, "Historical Humility," I begin.

I'll still be posting my pieces here, probably a day after they make their debut on HNN. And I will continue to use this space for the occasional less historical and more personal piece.

I'd like to thank you readers who have been following this blog--some since it began early in 2010. In retrospect, it seems that every time I began to wonder if it was worth the time and effort, someone would, out-of-the-blue, send me a nice compliment, or ask me when the next piece was coming. So thanks to everyone who did that.

I just wish my Dad was still here to see the new blog. He was probably the biggest fan of "The Past Isn't Past." Nothing gave me more satisfaction than when he would drop a casual "I liked your blog post" into our weekly Sunday afternoon phone call. After he passed, I went on his computer to send a message to his contacts to let them know, and noticed that "The Past Isn't Past" was the first bookmark on his web browser.

So, for that Great Web Browser in the Sky--and the rest of you, too--here's the bookmark for my new web home, Mark Byrnes's Facing Backwards.

Friday, July 4, 2014

I love the Fourth of July

(Re-posted from July 1, 2010)

I love the Fourth of July.

Not just because of fireworks (though who doesn't love a good fireworks display?). And not just because of cookouts (and, since you can throw a veggie burger on the grill too, who doesn't love a good cookout?). And not just because it gives me a reason to play two of my favorite songs, Bruce Springsteen's "Fourth of July, Asbury Park (Sandy)" and Dave Alvin's "Fourth of July" (though, seriously, this would be reason enough).

I love the Fourth because of the Declaration of Independence.

It began sometime in my childhood. At some point, on some vacation, at some historical site, my parents bought me a facsimile of the Declaration. It probably tells you all you need to know about me that I thought this was a great souvenir. It was hard, brittle, yellowed paper that crackled when you handled it. For some time I thought all official documents were thus. So when, in the fifth grade, my classmates called upon me to write a peace treaty ending the Great Spitball War between Group 2 and Group 3 (a foreshadowing that I would one day study diplomatic history?), I insisted on taking the piece of paper, coloring it with a yellow crayon, and then crumpling it up in a ball and flattening it out so that, at least to my eye, it looked like my copy of the Declaration. Then it was official.

Eventually I stopped wondering why there were so many "f"s where there should clearly be "s"s, and thought more about its content. Just about every American is familiar with the most famous passage about the self-evident truths. But there is a lot more to the Declaration. Much of it, the bulk of it really, is essentially an indictment of George III justifying the break. Read with an historian’s rather than a patriot’s eye, many of the points don’t really hold up. But my favorite part of the Declaration isn’t one of the well-known lines, or something obscure from the list of charges. It comes at the end, just a simple, short phrase, and it encapsulates for me what is best about the Fourth of July.

When you think about it, July 4 isn’t really the most natural date for the nation’s birth. There are other turning points we could have chosen, for example, the outbreak of hostilities. Using that criterion, April 19, 1775, the date of the battles of Lexington and Concord, would be a better choice. Perhaps February 6, 1778, the date a great power, France, recognized American independence and entered an alliance with the U.S. that would help win the war, would be fitting. Legally one could argue that April 9, 1784, the date Britain recognized independence with its acceptance of the Treaty of Paris, was the true independence day.

But we didn’t choose the date of a battle, or the recognition of a great power, or the acceptance of the mother country. We chose the date of a declaration. What does July 4, 1776 mark, after all? A decision. An intention. Not a change in fact, but a change of mind. Looked at coldly, purely as a matter of fact, the Declaration is an absurdity. The colonies declared that they were independent, but they clearly were not. The colonies were still ruled by royal governors appointed by the King, and were occupied by tens of thousands of British soldiers. But the Declaration nonetheless boldly states, in the words of a resolution first proposed by Richard Henry Lee nearly a month earlier, that “these united Colonies are, and of Right ought to be Free and Independent States.”

And it’s that phrase that I love: “and of Right ought to be.” The Declaration is not one of fact. It is one of what “of Right ought to be.” This country was founded with its eyes on the Right. Those men who signed the declaration were not always right. About some things, many of them, in many ways, were tragically wrong. But they knew the importance of what ought to be. And they knew that the most important date was not the one when men took up arms, but when they decided to do what was right. When it has been at its worst, this country has settled passively for what is, or what cynics said has always been and thus must always be. When it has been at its best, it has remembered to keep its eyes on what "of Right ought to be."

Have a wonderful Fourth of July, and sometime between the cookout and the fireworks, think a little about what of Right ought to be. And then work to make it a reality. That’s what the Fourth, and being an American, means to me.

Tuesday, June 24, 2014

Maliki is the New Diem

Some people are talking coup d'etat in Iraq.

David Ignatius writes that "President Obama sensibly appears to be leaning toward an alternative policy that would replace Maliki with a less sectarian and polarizing prime minister."

The impulse to replace Maliki is understandable. Most observers of Iraq argue that he has played a large role in the growing sectarian divide between the majority Shi'ites and the minority Sunnis, and thus bears responsibility for the growth of ISIS in the north.

The unstated assumption, of course, is that another popularly elected, plausible leader could have governed differently and guided Iraq into a functioning democracy, and that now, the fact that elections produced Maliki should not stop the United States from maneuvering behind the scenes to get a more able (read "pliable") leader in his place. Then the United States can go about fixing Iraq.

President George W. Bush shakes hands with Iraqi Prime Minister Nuri al-Maliki, July 25, 2006. Photo by Kimberlee Hewitt, public domain via Wikimedia Commons
Perhaps. More likely is that the internal conditions in Iraq produced the kind of leader Maliki became. If that's the case, then a coup to oust Maliki will do no good at all. Instead, it is likely to make things worse.

There is certainly precedent for that. In the mid-1950s in South Vietnam, the Eisenhower administration sought a non-communist popular leader who would not be tarnished by associations with the departing French colonizers. It settled on Ngo Dinh Diem.

For about six years, Diem seemed the answer to American prayers. He created a separate South Vietnamese government as a counter to Ho Chi Minh's communist North. He led a fairly stable regime that served American interests in the region.

President Dwight D. Eisenhower shakes hands with South Vietnamese President Ngo Dinh Diem, May 8, 1957. U.S. National Archives and Records Administration
But then in 1960, the National Liberation Front began its offensive against Diem's government. As pressure grew, Diem became more oppressive, in particular cracking down on the majority Buddhists. By the fall of 1963, the American embassy and elements of the Kennedy administration decided that Diem was the problem and needed to go. American officials sent signals to South Vietnamese generals, who then ousted and murdered Diem and his brother.

Ignatius effectively proposes that the United States do the same thing in Iraq today:
The people who will pull the plug on Maliki are Kurdish leader Massoud Barzani and other Iraqi kingmakers. The United States should push them to signal unmistakably that Maliki is finished…. Saudi Arabia wants Obama to announce that he opposes Maliki. It would be better just to move him out, rather than hold a news conference.
One can only hope that Obama resists such pressure. Things with Diem didn't work out well.

In a February 1, 1966 conversation with Sen. Eugene McCarthy, LBJ put it bluntly. Kennedy was told, he said, that Diem
was corrupt and he ought to be killed. So we killed him. We all got together and got a goddamn bunch of thugs and we went in and assassinated him. Now, we've really had no political stability since then.
The political instability that followed the Diem coup was a major contributing factor in LBJ's disastrous decision to Americanize the war in Vietnam.

The desire to replace Maliki is another example of the imperial attitude toward Iraq: America gets to decide when it is time for the leader to go. I have little doubt that if the United States determined to do so, it could mount a coup against Maliki.  But as always, the question is: what then?

As with the initial invasion, it is relatively easy to destroy. It is much harder to build. The United States can probably destroy Maliki if it so chooses. But can it build anything to replace him?

Sunday, June 22, 2014

David Brooks and Pottery Barn Imperialism

One of the reasons I continue to read David Brooks is that he is often unintentionally revealing. Since he is, I think, quite sincere, he does not indulge in clever subterfuge in making his arguments. Thus he sometimes lays bare what otherwise remains hidden behind what Andrew Sullivan last week (ironically) called "noble lies."

In his June 13 column, Brooks tries to lay the blame for Iraq's current travails at the foot of Barack Obama. Before American troops left in 2011, he writes:
American diplomats rode herd on Prime Minister Nuri Kamal al-Maliki to restrain his sectarian impulses. American generals would threaten to physically block Iraq troop movements if Maliki ordered any action that seemed likely to polarize the nation.
After U.S. troops left, he writes:
Almost immediately things began to deteriorate. There were no advisers left to restrain Maliki’s sectarian tendencies. The American efforts to professionalize the Iraqi Army came undone.
Brooks never acknowledges the obvious (though unstated) assumption behind all of this: that Iraq could not be expected to function without the United States. It seems that Nuri al-Maliki (hand-picked by George W. Bush in 2006, by the way) bears no responsibility for indulging his "sectarian impulses" (and note that Maliki is ruled by "impulse," not thought or calculation), and the Iraqi army bears no responsibility for not being professional. It is all due to the absence of Americans, who, of course, know best.

Brooks says, quite without irony, that "Iraq is in danger of becoming a non-nation." It never occurs to him that a state that--according to him--cannot function without American diplomats riding herd and American generals threatening its leader might already be a "non-nation."

Without knowing it, Brooks embraces an imperial role for the United States. It was America's job to control the Iraqi government, make it do the right thing. The United States should have stayed in Iraq for as long as it took. Leaving Iraq was "American underreach."

Brooks also embraces the reflexive American-centric mindset far too common on both the left and the right in the United States: the idea that whatever happens abroad happens because of something the United States either did or did not do. An incorrect American policy of withdrawal led to this state of affairs. It necessarily follows that whatever is going on in Iraq now can be fixed by the correct American policy.

Neither of those things is true. It is an illusion that Americans cherish because they think it gives them control over a chaotic world.

The American invasion of Iraq in 2003 broke Iraq. Iraqis thus far have not been able to put it back together. Maybe they never will. The lesson to be learned from that, however, is not what Brooks would have us believe: "The dangers of American underreach have been lavishly and horrifically displayed."

In the lead-up to the Iraq War in 2003, Colin Powell allegedly talked about the so-called "Pottery Barn rule: You break it, you own it." The true lesson of Iraq is this: American military intervention can easily break a country. It does not follow that American military intervention can just as easily make a country. Having disastrously bungled in breaking Iraq, Brooks would now have the United States once again bungle in trying to make it.

What the United States must "own" is not the state of Iraq, but the responsibility for breaking that state. Those are not the same thing. Responsibility begins with not making the situation worse by repeating the original mistake.

David Brooks, it seems, never learned that lesson. One hopes Barack Obama has.

Friday, June 20, 2014

Somebody Told Us There'd Be Days Like These

With chaos returning to Iraq due to the growing power of ISIS (Islamic State of Iraq and Syria) in the north, the partisan divide over the American war there has resurfaced as well. Supporters of the war charge President Obama with losing Iraq because he withdrew American forces, while critics of the war fume at the gall of the architects of that disastrous war now posing as experts on the region.

Because Republicans and Democrats have lined up rather predictably, there is a sense that this is merely a partisan dispute. It is not; unfortunately, the partisan nature of the current debate makes it seem so.

Rather than go back to the 2003 debate, I decided to look back a little further--to the first war with Iraq in 1991, and the criticism of the George H. W. Bush administration for its refusal to go "on to Baghdad." Those Republican foreign policy leaders defended their decision by predicting undesirable outcomes--ones which we are now seeing come to fruition.

Re-reading the memoirs of Colin Powell (then Chair of the Joint Chiefs) and James Baker (then Secretary of State), it becomes immediately apparent that they foresaw today's events as the nearly inevitable outcome of a U.S. invasion to topple Saddam.

President George H. W. Bush, Secretary of State James Baker, National Security Advisor Brent Scowcroft, Gen. Colin Powell, Jan. 15, 1991. U.S. National Archives and Records Administration
Writing in 1995, Gen. Powell quoted U.S. ambassador to Saudi Arabia, Charles Freeman, who wrote in a 1991 cable: "For a range of reasons, we cannot pursue Iraq's unconditional surrender and occupation by us. It is not in our interest to destroy Iraq or weaken it to the point that Iran and/or Syria are not constrained by it."

Baker also observed in 1995 that "as much as Saddam's neighbors wanted to see him gone, they feared that Iraq might fragment in unpredictable ways that would play into the hands of the mullahs in Iran, who could export their brands of Islamic fundamentalism with the help of Iraq's Shi'ites and quickly transform themselves into the dominant regional power."

Supporters of the war who now bemoan the growth of Iran's influence in Iraq have no one but themselves to blame. We were told it would be like this.

The current situation--a stable Kurdistan, ISIS in control of much of the Sunni-dominated areas, Shi'ites rallying to the defense of their holy sites--portends the possible partition of Iraq, either formally or de facto. That, too, was foreseen in 1991.

Powell: "It would not contribute to the stability we want in the Middle East to have Iraq fragmented into separate Sunni, Shia, and Kurd political entities. The only way to have avoided this outcome was to have undertaken a largely U.S. conquest and occupation of a remote nation of twenty million people."

The United States spent eight long years doing just that, occupying Iraq to keep it together. But that was never a sustainable long-term prospect. It went on too long as it was. Nevertheless, there are some neocons today suggesting that the United States never should have left Iraq.

Baker, who was known for his domestic political skills before he went to the State Department, knew that scenario was untenable: "Even if Saddam were captured and his regime toppled, American forces would still be confronted with the specter of a military occupation of indefinite duration to pacify the country and sustain a new government in power. The ensuing urban warfare would surely result in more casualties to American GIs than the war itself, thus creating a political firestorm at home."

More than twenty years ago, these Republican statesmen predicted the situation we now see in Iraq. They warned anyone who would listen that an American intervention to overthrow Saddam Hussein would have undesirable consequences contrary to American interests, regardless of any specific actions the United States did or did not take in pursuit of that larger goal.

Keep in mind that they said these things would happen with their president in charge, with themselves making policy. They understood that there are forces that such an act would set loose which the United States could not control, no matter who was in office. They said all this long before anyone had ever even heard of Barack Obama. The idea that any specific act by the president is primarily responsible for the current state of affairs in Iraq is absurd on the face of it.

That won't stop people from saying so. But it should keep the rest of us from believing it.

Monday, June 16, 2014

Leadership and Interventionism Are Not the Same Thing

Robert Kagan has written a piece in the New Republic entitled "Superpowers Don't Get to Retire." In it, he bemoans what he perceives as America's retreat from its responsibility to preserve a liberal world order. Kagan argues: "Many Americans and their political leaders in both parties, including President Obama, have either forgotten or rejected the assumptions that undergirded American foreign policy for the past seven decades."

Kagan is correct that public attitudes towards America's role in the world have shifted recently, but he dramatically overstates the case when he posits a break with a 70-year tradition. He seems to equate "leadership" with military interventionism. Americans have rejected the latter, not the former.

What Kagan does not recognize is that the public's current aversion to military interventionism abroad is not only consistent with America's pre-World War II foreign policy, but with the golden age of leadership he praises.

Kagan's fundamental mistake is to think that the American people embraced military interventionism during and after World War II. They did not.

Americans have always been averse to military actions leading to large numbers of American casualties and extended occupations of hostile territory. In the two years before Pearl Harbor, Americans (even the so-called "interventionists") desperately clung to the idea that they could protect American interests merely by supplying the British (and later the Soviets) with the weapons to do the fighting.

While conventional wisdom suggests that Pearl Harbor changed all that, the reality is different. Even after the United States entered the war, it was reluctant to launch military operations that posed the threat of huge casualties. As David M. Kennedy has stated, this American predilection to avoid combat with Germany's forces in France led Stalin to conclude: "it looks like the Americans have decided to fight this war with American money and American machines and Russian men."

Even the major architect of the postwar order, Franklin Roosevelt, did not envision an America that would permanently station large numbers of U.S. soldiers abroad, much less deploy them on a regular basis. Yes, he did see the United States as the leading power in the new United Nations. But the point of having the so-called "Four Policemen" was to insure that the other three would be the ones to send soldiers to keep order in their respective spheres of interest. He imagined that the American role would be primarily in the form of naval and air power. "The United States will have to lead," FDR said of the UN, but its role would be to use "its good offices always to conciliate to help solve the differences which will arise between the others."

[Image: FDR, Churchill, and Stalin at Teheran. Photo by Horton (Capt), War Office official photographer; public domain, via Wikimedia Commons.]
As the historian Warren Kimball has written, at the 1943 Teheran conference, when Stalin pressed him on how the United States would comport itself as one of the policemen, "FDR resorted to his prewar notion of sending only planes and ships from the United States to keep the peace in Europe." In FDR's mind, the United States would be primarily responsible for order in the western hemisphere, a role it had played for decades.

Even the so-called American declaration of cold war, the Truman Doctrine speech of March 1947, avoided the implication that American military forces would be deployed to uphold the doctrine. The speech simultaneously signaled to the world that the United States was assuming some of Britain's responsibilities and that it had given up on the idea of cooperation with the Soviet Union. Nonetheless, Truman explicitly stated that the aid he was requesting would not be military: "I believe that our help should be primarily through economic and financial aid which is essential to economic stability and orderly political processes." Truman presented aid to Greece and Turkey as mere money to make good on the far larger investment of lives and treasure during World War II: "The assistance that I am recommending for Greece and Turkey amounts to little more than 1 tenth of 1 per cent of this investment. It is only common sense that we should safeguard this investment and make sure that it was not in vain."

The Korean War changed that by requiring quick American military intervention to prevent the collapse of South Korea in the summer of 1950, but when it bogged down into a stalemate after the Chinese intervention in November, the public quickly soured on the war. In January 1951, "49% thought the decision was a mistake, while 38% said it was not, and 13% had no opinion," according to Gallup. While those numbers fluctuated over the next two years, and more Americans thought the war was not a mistake whenever an end to the war was in sight, the American public in general did not support military actions that led to substantial American casualties and prolonged combat. The public's disillusionment with the war was one of the reasons that an increasingly unpopular President Truman decided not to run for reelection in 1952.

The next president, Dwight Eisenhower, moved quickly to end that war, and, more importantly, instituted a foreign policy that had at its core the principle of avoidance of any Korea-style wars in the future. Rather than engage in limited wars in every world hot spot, Eisenhower determined that such a course would bankrupt the country. He preferred "massive retaliation": the idea that a threat to essential American interests would be met with a nuclear threat, not a conventional response in kind. Even when the French faced defeat in Vietnam, Eisenhower refused to intervene, and never seriously considered deploying American troops to Vietnam.

While John Kennedy came into office criticizing that approach, pledging to "pay any price, bear any burden," the sobering experience of the Cuban missile crisis made him rethink that mindset. The cold war, he said in June 1963, imposed "burdens and dangers to so many countries," and specifically noted that the US and Soviet Union "bear the heaviest burdens." He spoke of the American aversion to war: "The United States, as the world knows, will never start a war. We do not want a war. We do not now expect a war. This generation of Americans has already had enough -- more than enough -- of war and hate and oppression."

While one may argue that Kennedy's policies led to the next American war in Vietnam under his successor Lyndon Johnson, it is also the case that Johnson sought to avoid a land war. Significantly, he looked first to use air power. Operation Rolling Thunder, the American air campaign against North Vietnam, was meant to forestall the need for American ground troops in large numbers. It was only after the clear failure of bombing to achieve American aims that Johnson escalated the war with more ground troops.

When that effort too proved futile, Richard Nixon again returned to air power as America's main instrument to maintain order abroad. His Vietnamization policy tried to balance the withdrawal of American troops with the deployment of increased air power. The "Nixon Doctrine" announced that henceforth "we shall furnish military and economic assistance when requested in accordance with our treaty commitments. But we shall look to the nation directly threatened to assume the primary responsibility of providing the manpower for its defense." In other words, America's friends should not expect American troops to do their fighting for them.

I'd argue that from Nixon up until George W. Bush's invasion of Iraq, that was American policy. Ronald Reagan, George H. W. Bush, and Bill Clinton all avoided open-ended military commitments of American troops (Clinton's air-only campaign against Serbia in 1999 is the best example).

Only the first war against Iraq in 1991 challenged that trend, and even there the preliminary air campaign far outlasted the ground war: five weeks of bombing preceded the ground campaign, which lasted only 100 hours. According to Colin Powell, Bush had the Vietnam War in mind when he resisted the calls of "on to Baghdad." Bush "had promised the American people that Desert Storm would not become a Persian Gulf Vietnam," Powell writes in his memoir, "and he kept his promise." Within two weeks of the ceasefire, the 540,000 U.S. troops began their withdrawal from the Persian Gulf.

Even the American war in Afghanistan in 2001 was planned to keep the American "footprint" light, relying on American air power and the Afghan Northern Alliance to do much of the fighting. It was the invasion and prolonged occupation of Iraq beginning in 2003 that predictably soured Americans once again on the prospect of extended military engagements.

In sum, what Americans are experiencing now is not exceptional, but rather normal. In the aftermath of extended, costly military interventions leading to the loss of American lives, the American people revert to their historical aversion to solving problems by fighting in and occupying foreign states. That does not mean the United States ceases to be relevant, or ceases to lead. It simply means that Americans have been reminded once again that not every problem can be solved by an invasion, and that leadership is more than a reflexive application of American military might.

Thursday, June 12, 2014

Won't You Let Me Take You On a Sea Cruise?

Frank Bruni wrote a piece in the New York Times the other day, urging politicians to seek more solitude:
Take more time away. Spend more time alone. Trade the speechifying for solitude, which no longer gets anything close to the veneration it’s due, not just in politics but across many walks of life.
It’s in solitude that much of the sharpest thinking is done and many of the best ideas are hatched.
Coincidentally, I was reading about how FDR came up with the Lend-Lease program to aid Britain before the United States entered World War II, which makes Bruni's point perfectly.

[Image: Film title from an earlier FDR cruise, from an FDR Library archival film.]
After winning his unprecedented third term the previous month, on December 2, 1940, FDR set off aboard the cruiser USS Tuscaloosa for a two-week cruise in the Caribbean.

Now, try to imagine the indignation today if Barack Obama slipped away from Washington (without any notice, no less!) for a two-week sea cruise.

[Image: FDR's cruise was not hidden, but rather filmed for use in Navy recruiting.]
Not only did FDR not hesitate to take a vacation, he did not even pretend it was a "working" vacation. He took a few close friends and advisors, and according to David Kaiser in his fine new book, No End Save Victory, they "spent the two weeks fishing, playing poker, sunning themselves and watching movies in the evening." Though the White House tried to portray it as a base-inspection tour, FDR "boasted proudly after his return that he did not read any of the working papers he had brought with him."

[Image: FDR fishing during a February 1940 southern cruise.]
That did not mean, however, that this was unproductive time.

FDR did read at least one item of business, what Winston Churchill called one of the most important letters he ever wrote--an appeal for the United States to drop its "cash and carry" requirement on aid to Britain because Britain no longer had the cash to pay.

Churchill later wrote:
Harry Hopkins [one of FDR's companions on the trip] told me later that Mr Roosevelt read and re-read this letter as he sat alone on his deck chair, and that for two days he did not seem to have reached any clear conclusion. He was plunged in intense thought, and brooded silently.
Hopkins said:
I didn't know for quite awhile what he was thinking about, but then--I began to get the idea that he was refueling, the way he so often does when he seems to be resting and care-free. So I didn't ask him any questions. Then, one evening, he suddenly came out with it--the whole program.
Two things are key here--FDR's own understanding that he needed to occasionally "refuel" in order to do his job well, and the understanding of his close friend and advisor Hopkins that FDR needed to be left alone to think. He allowed his boss the time to brood silently.

Could there be a better riposte to today's obsession with being busy for the sake of being busy, meeting for the sake of meeting? None of us bear the tremendous burdens that FDR had at the time--a world war to navigate the nation through--yet we are so prone to exaggerate our own importance and pose as too busy to "waste" time.

FDR was wiser. There was nothing wasteful about his sea cruise vacation. It was an investment, and one that paid off for the entire world.

Wednesday, May 14, 2014

"God Save Me From My Friends"

I've often used this space to bemoan the absence of the spirit of compromise in American politics. Not everything is a matter of principle, and political leaders do not serve us well when they act as if everything is.

That does not mean that compromise is always the right response to every question, however.

Yesterday's South Carolina Senate deal on the USC-Upstate and College of Charleston book controversy is a case in point. Some members of the state legislature (particularly in the House), angered by required readings dealing with LGBT issues at the two institutions, have tried to punish them by cutting their state funding by the amount spent on the reading programs.

The Senate "compromise" was to restore the funding, but at the same time demand that the institutions spent that much money on teaching the Declaration of Independence, Constitution, Federalist Papers, and other founding documents. In addition, it required that students be allowed in the future to opt out of a reading if they object to the subject matter “based on a sincerely held religious, moral, or cultural belief.”

While some supporters of academic freedom hailed the compromise as a qualified success for avoiding the punitive cuts, I see it as a surrender of principle.

The point was not simply to avoid financial punishment for an educational choice--it was to uphold the principle that educators must be free to assign reading material they deem to be well suited to their educational purposes. Both "compromise" measures violate that basic principle.

The first part does so by effectively restoring the money cut on the condition that it be used for purposes determined by the state legislature. The worthiness of studying the Declaration of Independence, Constitution, Federalist Papers, and other founding documents is not the issue. I happily teach them in my American history classes. The fundamental question is: who decides? In that crucial matter, the putative supporters of the institutions under attack actually give the same answer as those attacking them: the state legislature decides. The only difference is that rather than dictating which material should not be taught by punishing the institutions for assigning it, they dictate which material must be taught. In both cases, they remove the essential power to make the judgment about academic content of assignments from educators.

The second part similarly undermines the authority of educators. No one can reasonably judge whether or not a student's objection to subject matter is the product of a "sincerely held … belief." Of necessity, all students must be taken at their word if they say so. This then means that every student has been issued a veto power over content. This proviso amounts to granting every student the right to not have a belief challenged. The entire academic enterprise hinges on the ability of educators to subject ideas to critical analysis. If a student may say "my sincerely held belief may not be scrutinized, I refuse to read something that might challenge my beliefs," then educators are forced to teach with their hands tied.

I don't doubt that this "compromise" was legislatively necessary to avoid Senate approval of the budget cuts. The money was kept in the budget by the slimmest of margins: 22-21. It seems likely that some fearful senators were convinced to support the restoration of the money only on the condition that they would then be given the chance to vote for mandating the teaching of the founding documents and giving students the opt-out power.

I'm sure those who crafted and supported the compromise in order to maintain the funding think that they served the cause of academic freedom, but they did not. If forced to choose between the bigoted and ignorant idea of punishing institutions of higher education for the content they assign, and the allegedly "reasonable" idea that led to this compromise, I'd prefer the former. It is open, honest, and straightforward in its opposition to academic freedom. However well-intentioned the compromisers were, they actually showed that they don't understand the principle of academic freedom, and in trying to serve it, they actually undermined it.

The whole thing reminds me of a proverb: "God save me from my friends. I can protect myself from my enemies."

Thursday, April 10, 2014

"A Sweet Fool"


"Dost thou know the difference, my boy, between a bitter fool and a sweet fool?"--the Fool, in Shakespeare's "King Lear"

Stephen Colbert is leaving "The Colbert Report" to take over David Letterman's slot on "The Late Show" on CBS.

I don't normally post about TV, but then again, this isn't a post about TV. It's about the value of satire in a democracy.

As an avid fan of "The Daily Show," going all the way back to the not-at-all lamented Craig Kilborn days, I can remember when Colbert was "the new guy." (Most people have forgotten that Colbert actually preceded Jon Stewart on TDS by two years.) I had always liked him, but never as much as when he developed the Bill O'Reilly-esque persona for which he is now famous.
[Image by David Shankbone, via Wikimedia Commons.]

I've been a devoted fan of the show. I may not have seen every episode, but I've probably come close (and may in fact have seen 100% since I got a DVR). What I've enjoyed the most is the relish Colbert takes in his satire. His talents (and those of his writers) have created what I consider to be the best satirical character in modern American history.

That's why this career move gives me pause.

Joan Walsh published a nice piece just the day before the announcement about Colbert's value to the progressive movement. That's true, but I'd go further. He is valuable to our democracy.

Humor, particularly sharply satirical humor, is incompatible with the totalitarian mind. It punctures holes in the immense pretensions of totalitarians. One of the lesser-acknowledged attributes shared by totalitarians of the right and left alike is their humorlessness. They are so deadly serious about not only their ideas but themselves that they cannot abide any mockery. As O'Brien says to Winston Smith in George Orwell's 1984, under Big Brother "[t]here will be no laughter, except the laugh of triumph over a defeated enemy."

By contrast, I think you can judge the health of a democracy by the extent of its self-mocking humor. The liberalism born in the 18th century had as a cornerstone its openness to critique--an acknowledgement that, however well thought out one may believe a position to be, it is always subject to argument and new evidence--and to mockery, which in the form of satire is, itself, a kind of argument that exposes unfounded assumptions and unacknowledged hypocrisy.

Colbert's satire has always been at the expense of the powerful, not of the "defeated enemy." Americans, at their best, have always seen their leaders as fit subjects for mockery. It is one of the ways we remind them that they are, after all, just like us: no better or worse, just temporarily entrusted with power. We have often loved best those leaders who show they have a sense of humor, especially of the self-deprecating kind (Lincoln most of all, FDR to a lesser extent), while judging harshly those who appear humorless (see Herbert Hoover and Richard Nixon in particular).

But at the same time, Americans have also been--at least to my mind--insufficiently appreciative of good satire. Ever since the ridiculous controversy over Randy Newman's "Short People" in 1978, it's been clear to me that if the general public could not see that Newman's song was meant to satirize prejudice, America must suffer from a severe irony-deficiency. That's why I've been so heartened by the success of Colbert's right-wing pundit character. People got it. That had to be a good thing.

Now that this success has catapulted the real Colbert to late night network stardom, however, that satirical character will be no more. He's a smart, talented man. I'm sure he can and will do other things well, and succeed in his new job. But his gain is our loss.

Colbert's combination of sharp intellect, courage, and human decency has made him ideal for political satire. (His 2006 speech at the White House Correspondents Dinner remains, to use his language, the ballsiest act of political comedy I've ever seen.) There is a deep compassion for the weak, the downtrodden, and the suffering that informs Colbert's satire. No doubt that quality will continue to inform his future work. But in the satire of "The Colbert Report," it combined with the swift sword of his intellect in a particularly effective way. It is hard to imagine it will be the same when he emerges from that character and has to entertain the broad, irony-deficient expanse of all of America.

He may prove me wrong.  I certainly hope so. And if not, America is still the better for nine years of Stephen Colbert's brilliant satire on "The Colbert Report."

Friday, February 14, 2014

The Czech Balance

In the last dozen years, I've traveled in two communist states, China and Vietnam, but I'd never been to a former communist state before last month. Other than its beauty and generally rich history, one of the appealing things for me about going to Prague was its place in 20th century history, particularly the Cold War. I've spent a large part of my career studying and teaching about the Cold War, and I was curious to get a taste of how people who lived under its shadow looked back on it.

Thus, I was particularly looking forward to our group's visit to the Museum of Communism.

I'm not sure exactly what I was expecting, but I wasn't expecting to be able to walk right past it without seeing it from the street. I should note for the record that I have a notoriously bad sense of direction, but in this case I was actually on the right street, Na Příkopě, and in the right place. But you can't see it from the street. The brochure helpfully points out, however, that it is "above McDonald's, next to Casino."

That was my first clue that this would be no ordinary museum experience.

Once inside, that insight was continually reaffirmed. The first thing one notices is the prevalence of Soviet-era paintings and statues of Marx, Lenin, and Stalin. They had to go somewhere after the fall of communism, I suppose, and this is where at least some of them went.

One of the more interesting aspects of the museum for me was its willingness to examine the role that Czechs themselves played in the communist regime--particularly in its establishment in 1948. Unlike Poland and some of the other regimes in the old Soviet bloc, Czechoslovakia did not immediately fall under Soviet domination after World War II. Czech communists did fairly well in reasonably free postwar elections, and were included in the government of Edvard Beneš, the Czech nationalist who served as president from 1940 to 1948.

The Czech reality is that homegrown communists were largely responsible for the imposition of communism after 1948, and the museum does not shy away from that fact.

It is also impossible to escape the Czech sense of humor, which permeates the entire museum. One of its displays, for example, is a communist-era shop, with almost nothing but a few nondescript cans on the shelves. Apart from being historically accurate, this is a highly comic choice, and it draws some of its power from its humor. It reminded me of the comments our wonderful Czech guide, Helena, would make whenever we'd pass an example of what she liked to refer to as "socialist architecture": "Coming up on the right is something no one should see," she'd wryly say. "So please, close your eyes."

The museum, like the Czechs themselves, does not play everything for humor, not at all. The re-creation of a secret police interrogation room and the display telling the story of the 1968 Soviet invasion make that readily apparent.

But the humor is never far removed from the tragedy. It is that sense of balance--the knowledge that comedy and tragedy are not opposites but integrally related parts of life--that gives this museum its character.

The absence of basic necessities is no joke. Nonetheless, walking down a hallway, you pass this picture tucked away in a little nook. At first it appears to be a bit of socialist realist "art," but then you read the caption: "Like their sisters in the west they would have burned their bras, if there were any in the shops."

Working my way through the museum, with its largely chronological approach leading inexorably to the amazing events of 1989, I found it hard not to feel a touch of American triumphalism. You move from the dreary existence of the 1950s, through the brief optimism of the Prague Spring, only to see it brutally snuffed out by the Soviet invasion. You learn of the repression of post-'68 "normalization," and thrill at the rise of the Charter 77 dissidents and Václav Havel. Then the forty-year Czech nightmare comes to an end, and the people embrace the liberal values that the U.S. stood for in the Cold War.

Upon arriving at the gift shop and looking for some postcards to take home as souvenirs, I came across this one that seemed to capture just that pro-American sentiment. "We're above McDonald's--Across from Bennetton--Viva La Imperialism!"

It's a funny card. It takes such glee in mocking Lenin, one of the original critics of imperialism. I had to buy one.

I kept rummaging through the shop, and found a couple of collections of Soviet-era anti-American and anti-capitalist propaganda posters, spending far too much on them and rationalizing that I could make use of them in class one day.

And I made one last purchase. Another postcard, but this one had a message significantly different from "Viva La Imperialism!" The two of them together tell, I think, a meaningfully different story than either one does in isolation.

"Come and see the times when Voice of America was still the voice of freedom." Now there's a quick cure for the American tourist's sense of triumphalism. I have no way of knowing when exactly this card was first produced and therefore can only guess at what particular events or policies produced it, but there is no escaping its message of disillusionment with the post-cold war U.S.

There was that sense of balance again. Even in a museum dedicated to demonstrating the failure of the ideology of America's Cold War nemesis, there was a refusal to indulge in a mirror-image worship of America's victorious ideology.

What I took from the museum, what I took from much of the reading I did to prepare for the trip (in particular works by Ivan Klima and Milan Kundera), what I took from my admittedly brief and superficial exposure to the Czechs, is that it is the uncritical embrace of ideology--perhaps as much as the content of that ideology--that leads people to destructive fanaticism. If we take our ideas so seriously that we cannot laugh at ourselves and see the humor in the sometimes absurd manifestations of our own beliefs, that is when we lose our way. I'm sure that's not a uniquely Czech view of life, but I saw enough of it there to associate it with the lovely city of Prague. That, as much as the beauty of the city itself, is what I think I'll most remember.