Wednesday, October 22, 2014

Can Obama Do in Iraq What Nixon and Ford Couldn't in Vietnam?

[Originally published on History News Network]

Practically every American intervention abroad since the 1960s has prompted comparisons to Vietnam. So it was hardly surprising when on October 8, in response to President Obama’s decision to expand the campaign against ISIS into Syria, Fredrik Logevall and Gordon M. Goldstein authored an op-ed in the New York Times that asked “Will Syria Be Obama’s Vietnam?”

I’m not sure that’s the right question. The American concern over ISIS originated in Iraq, after all—the site of an American intervention that is now eleven years old. America’s air campaign against ISIS today reminds me less of the intervention that happened in Vietnam than the one that didn’t happen—in the spring of 1975.

This past June, when ISIS suddenly broke through America’s collective effort to forget about Iraq and seemed poised to take Baghdad, it was easy to wonder if we were about to witness a repeat of the fall of Saigon.

More than two years after the peace agreement that led to the withdrawal of American troops from Vietnam, a North Vietnamese offensive against South Vietnam met with little effective resistance, much as Iraqi forces last June dropped their arms and failed to fight ISIS. Compare these passages from the New York Times coverage of the fall of Hue in March 1975 and Mosul in June 2014:
“By the thousands, the people are abandoning Hue…. The armed forces are also moving out, some by landing craft, some in military vehicles, some bundled into trucks with family members, furniture and food. No one seemed in the slightest doubt yesterday that Hue and the rest of the north were being left to the Communists.”
“Thousands of civilians fled south toward Baghdad…. The Iraqi Army apparently crumbled in the face of the militant assault, as soldiers dropped their weapons, shed their uniforms for civilian clothes and blended in with the fleeing masses…. ‘They took control of everything, and they are everywhere,’ said one soldier who fled the city.”
The political reaction this summer also eerily echoed the reaction to events of nearly 40 years ago.

In his memoirs, Richard Nixon argued that he had won the Vietnam War and that renewed American bombing of the North would have preserved the South Vietnamese government. That government had, after all, survived for two years after the peace agreement; to Nixon, that proved Vietnamization had worked.

“When Congress reneged on our obligations under the agreements,” Nixon wrote, “the Communists predictably rushed in to fill the gap.” Nixon had privately assured South Vietnamese President Thieu that violations of the peace agreement by Hanoi would be met with renewed American bombing. But in June 1973, the Church-Case Amendment forbade funding for any military operations in Vietnam. “The congressional bombing cutoff, coupled with the limitations placed on the President by the War Powers Resolution in November 1973, set off a string of events that led to the Communist takeover.” The war was “lost within a matter of months once Congress refused to fulfill our obligations,” Nixon said.

Henry Kissinger has also repeatedly argued that the peace agreement reached with Hanoi had secured the independence of South Vietnam, and that he and Nixon intended to use air power to thwart any North Vietnamese aggression against the South. But, he asserts, Watergate so weakened Nixon that they were unable to overcome the opposition of Congress. In a meeting with Singapore’s Lee Kuan Yew on August 4, 1973, Kissinger said: “We have suffered a tragedy because of Watergate … We were going to bomb North Vietnam for a week, then go to Russia, then meet with [North Vietnam’s lead negotiator] Le Duc Tho. Congress made it impossible.”

Lewis Sorley, in his 1999 work A Better War: The Unexamined Victories and Final Tragedy of America’s Last Years in Vietnam, argued that “[t]here came a time when the war was won.” Due to the pacification efforts of Gen. Creighton Abrams, he writes, victory in Vietnam “can probably best be dated in late 1970.” The countryside was pacified, and South Vietnamese forces were “capable of resisting aggression so long as America continued to provide logistical and financial support, and … renewed application of U.S. air and naval power should North Vietnam violate the terms of that agreement.”

The argument that the continued application of American air power against North Vietnam could have preserved South Vietnam has thus been a staple of Vietnam War revisionism.

In June, Sen. John McCain made an argument about Iraq similar to the one that Nixon, Kissinger, and Sorley made about Vietnam:

"We had it won," McCain said. "Gen. [David] Petraeus had the conflict won, thanks to the surge. And if we had left a residual force behind, that we could have, we would not be facing the crisis we are today. Those are fundamental facts ... The fact is, we had the conflict won, and we had a stable government.” Sen. Lindsey Graham of South Carolina added: “There is no scenario where we can stop the bleeding in Iraq without American air power."

There are no do-overs in history, and no one can say for certain whether the renewed application of American air power after the 1973 peace agreement might have prevented the fall of Saigon—or if it did, for how long. But we are currently seeing why Congress sought to limit the executive branch’s options back in 1973.

The fear then was that, despite the peace agreement, Nixon and Kissinger would continue to fight a war that the country overwhelmingly wanted to be over. Kissinger’s repeated statements indicate that they in fact intended to do just that, not just in Vietnam but possibly in Cambodia, too. The Church-Case Amendment was how Congress expressed the national consensus against reviving the war.

Today, there seems little will in Congress to restrain the president’s war-making powers. If anything, the loudest voices have been those arguing for even greater military action. In response to such pressure, the president has already expanded the air war to Syria.

Just last week, McCain argued that “pinprick” airstrikes were proving ineffective, and called for further expansions of the war: “They’re winning, and we’re not,” McCain told CNN. “The Iraqis are not winning. The Peshmerga, the Kurds are not winning.” Thus, he argued, there was a need for “more boots on the ground … in the form of forward air controllers, special forces and other people like that…. You have to arm the Peshmerga … Buffer zone in Syria, no-fly zone, take on Bashar al Assad the same as we have ISIS.”

McCain’s vision of a renewed, ever-expanding war is precisely what Congress in 1973 meant to prevent Nixon and Kissinger from pursuing. After nearly a decade of war, Americans had decided that the fall of South Vietnam, Cambodia, and Laos would not be a mortal threat to American security.

Today, what stands between the United States and the full-scale revival of a war Americans thought was over is not Congress, but the president himself. Obama has repeatedly stated that he will not re-introduce American combat troops to Iraq, and he is trying to maintain a sense of balance about the nature of the threat: “While we have not yet detected specific plotting against our homeland, these terrorists have threatened America and our allies. And the United States will meet this threat with strength and resolve.”

McCain, however, is doing the opposite, hyping the threat to the U.S. Back in June he said: “We are now facing an existential threat to the security of the United States of America.” Last week he said: “it is a threat to the United States of America if they are able to establish this caliphate.”

A September CNN public opinion poll suggests that Americans agree with McCain about the threat, while siding with Obama on the limits of the U.S. response. Ninety percent say ISIS represents a threat to the U.S., with 45 percent calling the threat “serious,” 22 percent saying it is “fairly serious” and 23 percent saying it is “somewhat serious.” (Two years after 9/11, in 2003, 49 percent considered Al Qaeda a “serious” threat to the U.S.) Seventy-one percent believe ISIS terrorists are already in the U.S. But at the same time, by a 61-38 margin, Americans oppose using American ground forces to defeat ISIS.

ISIS has succeeded in making Americans think that Iraq matters again, and that U.S. interests require its defeat, but it has not yet convinced them that it is worth Americans doing the fighting and dying. That's Obama's dilemma. If air power is not enough, does he take the chance that Iraq (or Syria) falls to ISIS, or does he break his promise?

In the spring of 1975, Congressional and public opinion meant that President Ford had little choice but to watch as the North Vietnamese Army rolled into Saigon. Nearly 40 years later, President Obama faces a far more difficult task: prevent the collapse of the Iraqi government (and, increasingly, the Syrian opposition) without fully reviving a war he spent years trying to end—all in the face of an opposition that is intent on proving that the Iraq war it supported was won until the president lost it.

Whether Obama will be able to keep his promise not to send American ground forces back to Iraq is very much an open question. Having taken the first step to save Iraq by applying American air power—what Nixon, Kissinger and Ford could not do in Vietnam—he may find it increasingly hard to resist subsequent steps if air power proves not to be enough.

Tuesday, September 9, 2014

Lies, Damn Lies, and Statistics (Higher Education "Reform" Edition)

Following this summer's seminar on the liberal arts at Transylvania University, I resolved to more consciously talk about the liberal arts with my new crop of first-year students in my Humanities class this semester. Last week, we spent a full class period talking about their reasons for coming to Wofford, and Wofford's commitment to a liberal arts education. We'll spend two more classes this week discussing it. They should understand what they're getting into, I think.

The beginning of the academic year always prompts some thinking about the purpose of education, even among those not engaged in it. Frank Bruni has an interesting piece in the New York Times arguing that higher education has an obligation to challenge students: "college needs to be an expansive adventure, yanking students toward unfamiliar horizons and untested identities rather than indulging and flattering who and where they already are." I couldn't agree more.

The Times also carried another piece that conveys the more dominant view in American culture: that college, first and foremost, is about getting a job.

Ben Carpenter, vice chairman of the CRT Capital Group, argues that what is missing from college today is "career education." For Carpenter, it is not enough for colleges to provide majors geared toward professional pursuits, and to have Career Services offices. The college must also offer courses in "career training":
So what can be done to make certain these young adults are being prepared for life post-graduation? The answer is simple: Colleges need to create, and require for graduation, a course in career training that would begin freshman year and end senior year.
(Note to self: remind students to always beware whatever statement follows the phrase "The answer is simple.")

The first thing worth noticing here is Carpenter's choice of words. He is clear about what his concern is: "how to get, and succeed at, a job." But the title of the article isn't "Is Your Student Prepared for a Job?"--it is "Is Your Student Prepared for Life?" Throughout the piece, Carpenter uses the words "job," "career," and "life" interchangeably.

It does not take a liberal arts education to know that those words do not mean the same things. Too often in discussions of education, we elide the differences, so when talking to my students last week, I made the difference explicit. A liberal arts education is meant to prepare you, I said, not just to make a living, but to make a life.

I do not know whether Carpenter intentionally conflates "job" and "life" to confuse the reader, or if he honestly does not see a meaningful distinction between the two. Either way, doing so has the effect of perpetuating the idea that your job is your life and so college is only about getting a job.

The second issue that got my attention was that Carpenter employs what seems to me the knee-jerk "reform" response to every perceived challenge in higher education: make it part of the curriculum! I have no problem with the idea that colleges should help students find post-graduate employment. Here at Wofford, The Space is devoted to that project, and does a lot of good for our students. But it is not part of the curriculum.

That's not what Carpenter is calling for; in fact, he denigrates Career Services offices as suffering from a "major disconnect" with students. He wants "a course," one that lasts for four years and is required of all students. Since Carpenter does not get more specific, it is hard to know whether he means a course every semester for four years, or one course a year, or one course that lasts four years. But he clearly is talking about making it part of the curriculum.

It is self-evident that every new course requirement reduces the electives available for students to take to investigate their own passions or interests. The more expansive Carpenter's plan, the fewer academic courses students in it will take. It is hard not to wonder if that isn't part of the idea. If college exists merely to train workers, what do they need those electives for, anyway?

Finally, there is the matter of the precise problem that is driving his proposal. At the start of the article, Carpenter states:
According to a recent poll conducted by AfterCollege, an online entry-level job site, 83 percent of college seniors graduated without a job this spring.
In contrast, toward the end, he cites an example that suggests the efficacy of what he proposes:
One year after graduation, 96 percent of all Connecticut College alumni report that they are employed or in graduate school.
One of the things my liberal arts education taught me is to look closely and carefully when someone cites statistics. On the surface, the difference seems stark: 83 percent with no job, 96 percent employed! See, the answer is simple! Certainly that's what Carpenter wants us to think. But a moment's consideration shows that he's doing the old apples and oranges comparison.

The AfterCollege survey purports to measure only how many students reported having a job lined up before graduation. First, the accuracy of that number may be questionable, since it was an online survey, not, as Carpenter says, a scientific "poll." Second, the fine print on the survey reveals that the respondents were not just students about to graduate--a majority had already graduated, 23.38 percent were college seniors, and 12.25 percent were juniors. (Safe to say that few if any juniors already have a job lined up for after graduation.)

The 83 percent number comes just from students still in school, including those juniors. For recent grads, the number is 76.3 percent. No doubt that's a big number, but it is not 83. In addition, since the survey was conducted between February 27 and April 15, 2014, some seniors who answered "no" in late February or March may well have had jobs by the time they graduated in May 2014.

In short, it is not true that 83 percent of last year's graduates had no job at graduation, even according to this survey.

Now let's look at the Connecticut College numbers. By contrast, they are not a mix of recent grads and current juniors and seniors. They measure an entire graduating class. In no way can that group be reasonably compared to the AfterCollege survey respondents. In addition, it measures the outcome for those students one year after graduation.

A true comparison would require surveying only graduating seniors right after they graduated and then comparing the number with jobs to the number with jobs one year later. A year makes a huge difference in the job search, as does being out of school--I recall not feeling much urgency about getting a job until after I graduated. In my experience, most college seniors are preoccupied with either the successful completion of their degrees or enjoying the final months with friends they've known for three and a half years, or both. The job search gets serious after graduation.

In addition, the Connecticut number lumps together the employed and those who are going to graduate school--those planning to attend graduate school of course do not have a job lined up before graduation. For all we know, a significant percentage of those reporting "no job" in the AfterCollege survey may well have had plans to go to graduate school.

The Connecticut College program may well be worthwhile and do great good. But Carpenter's comparison of the two numbers is misleading, whether he realizes it or not. I have to think that if he had a direct apples-to-apples comparison that served his argument, he would have used it instead. But I suspect it would not have been nearly as stark as the numbers he uses.

As I stated in my last post, the idea that colleges are miserably failing their students by not preparing them for the working world is simply not true. It is true that few graduates move seamlessly from college straight into their dream jobs. But the idea that somehow there is a problem so significant that students must replace some of their academic courses with "career training" courses--and that such courses will solve the problem in what is still an extremely competitive and tight job market--is just silly.

But that's what passes for intelligent commentary on higher education these days.

Tuesday, July 29, 2014

On the State of the Liberal Arts

For teachers, one of the most enjoyable things to do is spend time being students again.

So it was that I spent the past weekend at Transylvania University’s seminar on Twenty-First Century Liberal Education, along with 18 other academics from a variety of liberal arts institutions.

We all read hundreds of pages of material in preparation. In the span of 65 hours at the seminar, we spent two hours listening to formal lectures (and another hour discussing them), 10 hours in formal discussion sessions, and countless more hours informally continuing those exchanges.

Yes, this is what teachers do in the summer for fun. And it was fun—as well as intellectually illuminating and invigorating.

It was also sobering, coming as it did at a time when higher education faces plenty of public scrutiny and criticism, and when the liberal arts and liberal arts colleges in particular face charges of irrelevance.

The value of this kind of intensive consideration of a topic is that it inevitably focuses the mind. Many of the issues we discussed have been bouncing around my brain for a while (sometimes showing up in this blog), but I’ve never considered them as intensely as I did at the seminar.

Since I’m forever preaching to my students that the best way to figure out what they think about a reading or discussion is to write about it, I’ll attempt to do that myself. (All of the pieces I quote below are from the wonderful reader that the Transylvania seminar leaders put together.)

For the historian, the easiest and most obvious conclusion to take from our readings is that there is nothing new about the liberal arts—or higher education in general—being under siege. It rather seems like a permanent state of affairs. That’s no excuse for complacency about its current challenges, to be sure, but it does help leaven one’s reaction to all of the apocalyptic warnings of the demise of liberal arts. This is not new: the liberal arts college has been through this before and survived. As Alan O. Pfnister put it in 1984, “the free-standing liberal arts college in America has been a study in persistence amid change, continuity amid adaptation.”

“Continuity and change” is the essence of history, and the story of the liberal arts has seen plenty of both. The perennial debate seems to revolve mostly around the question of value and utility: What precisely is the value of the liberal arts? How do we determine that value, and how do we present it to prospective students and their parents?

For clarity’s sake, the sides can be simplified: 1) the liberal arts have value that cannot be quantified and assessed in any meaningful way, but they prepare students to lead better, more meaningful lives; and 2) the liberal arts must demonstrate their practical value in concrete, accessible ways that give others outside the academy reason to believe they are worth the time and money expended in studying them.

Since these are simplifications, few people are likely to identify with either without some kind of reservation, but I’d argue that at some point everyone concerned with the topic will end up choosing one as having primacy over the other.

I choose the first. I am not unaware of the pressures being brought to bear to make college education ever more “practical” (read “directly applicable to post-graduation employment”) to justify its high price tag. I simply believe first causes matter and that something essential is lost when we, as another participant in the seminar put it, allow external rather than internal causes to determine what and how we teach.

The second point of view, however, seems to dominate the field these days. Writing in 2007, David C. Paris, professor of government at Hamilton College (and one-time participant in the Transylvania seminar) said: “the liberal arts and the academy in general need to make peace with, or at least acknowledge, the importance of the market.”

I’ll meet Paris half-way: I acknowledge that the market matters. Despite his rather disdainful portrayal of the traditional liberal arts as appearing “esoteric and apart from real concerns” or “ornamental,” and of its defenders as not concerned with the “real world,” I am not oblivious to reality.

But no, I will not “make peace” with the idea that the market should determine what and how educators in the liberal arts teach. Paris argues that “the liberal arts are threatened,” at least in part, by “too narrow a self-concept” among its practitioners. He writes that “promoting a good life recognizes that there are many ways of living such a life.” The latter is true. But it is not the liberal arts that are “too narrow.” It is the market that defines the good life in the most narrow way possible, i.e., by a single standard: the dollar sign.

Our students do not need the liberal arts to tell them that money matters. The entire culture tells them that relentlessly. They cannot escape it. It is our job as educators to open them to some of the other possible answers to that basic question: “What makes a good life?”

The liberal arts have a long history of addressing that question and advancing our understanding of the good. Liberal education has been a vehicle for addressing questions of inequality and oppression, empowering students to challenge the institutions that buttress those conditions, primarily through encouraging independent thinking. It has been a truly liberating force, and it has not achieved that by asking what the market wants from it.

What message does it send about the answer to that fundamental question of the good when the Association of American Colleges and Universities (AAC&U) resorts to focus groups of students and employers to tell educators what liberal education should be? Or when the AAC&U endorses and privileges certain educational trends as superior (“active” or “high-impact”) to others and justifies its prescriptions by noting that “employers strongly endorsed” them and that they will receive “very strong support from the employer community”?

Whether they realize it or not, they are saying in effect: Let the market decide. They are abdicating their responsibility as educators to shape curriculum. They are buying into not just the language but the values of the market: if it is demanded, it must be supplied.

David L. Kirp writes in Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education: “This is more than a matter of semantics and symbols.” When we use “business vocabulary we enforce business-like ways of thinking.” (Thanks to Transylvania’s Jeffrey B. Freyman for this quotation from his paper, “The Neoliberal Turn in Liberal Education.”)

Though the proponents of this point of view often come from the progressive side of the political spectrum, they unwittingly are endorsing a decidedly illiberal view of education. As Christopher Flannery and Rae Wineland Newstad point out in “The Classical Liberal Arts Tradition,” the phrase “liberal arts” literally means the “arts of freedom” as opposed to those practiced by slaves. “Slaves are subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves.” So-called “practical” training was for slaves, and the liberal arts would ruin slaves for their role in society as servants to their superiors.

Liberal education later evolved—particularly in the United States—into not just the privilege of the already free, but as a vehicle for freeing the young from servile status. As Frederick Douglass makes clear in his autobiography, the liberating quality of education was the reason American slaves were denied it: “Knowledge unfits a child to be a slave.” Liberal education equips students to take their places as equals in a free society, as makers of their own lives.

But note how the AAC&U approached its call for reform in 2008. In advocating its “Engaged Learning Reforms” (which closely mirror John Dewey’s practical learning agenda of the 1930s--it is nothing new), AAC&U president Carol Geary Schneider justified the plan primarily with a table showing the “Percentage of Employers Who Want Colleges to ‘Place More Emphasis’ on Liberal Education Outcomes.” Leading the pack was “science and technology,” with the support of 82%. Next came “teamwork skills in diverse groups,” with 76%.

The clinching argument for Schneider is this: “these goals for college learning are strongly endorsed by the constituency that today’s students particularly want to please—their future employers.”

That sentence, to my mind, lays bare the essential problem with the AAC&U approach: rather than strongly reaffirming the goal of educating students to think for themselves—the traditional goal of liberal education—the AAC&U implicitly admits that it has substituted the goal of pleasing their future employers. At the end of the day, how far is that from students being “subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves”?

This vision of the liberal arts does not free students; it puts the liberal arts at the service of society’s economic masters. It is natural that economic fear in uncertain times leads college students to want to please future employers. That does not mean that educators should seek to assuage that fear by shirking their responsibility to provide their students with far more than that, or should bend the curriculum to meet the desires of employers.

Schneider’s statement is not an isolated case, either. AAC&U’s LEAP program (Liberal Education and America’s Promise) published a piece in 2005 titled “Liberal Education for the 21st Century: Business Expectations” by Robert T. Jones, president of Education and Workforce Policy. Jones is not shy about how he sees the role of higher education: it “must respond to these trends by keeping the curriculum aligned with the constantly changing content and application of technical specialties in the workplace.”

Note, education “must” serve the needs of the workplace. That which business does and wants, higher education must do—because, at the end of the day, education serves business. Education must submit to business’ “assessment” of how well it produces the “outcomes” business wants, it must get “continual input from both employers and graduates” and change its ways accordingly.

Jones states that employers “are less concerned with transcripts than the demonstration of achievement and competency across a variety of general and specialized skills.” Knowledge, wisdom, perspective—none of these traditional liberal arts goals fit this account of what employers want. “Competency” in “general and specialized skills” is the aim. Today, “competencies” has become a common buzzword in education discussions, even opening the door for granting academic credit for work experience, and threatening to make the classroom experience virtually unnecessary.

The new liberal education, Jones says, “now enhanced with practical learning [how’s that for product branding?] is the essential foundation for success in every growing occupation.”

Jones is smart enough to compliment liberal education, even as he asserts that it is, at least in its current form, wholly inadequate and must be altered to serve the workplace better. But his ultimate purpose could not be clearer: education must “make peace” with the market.

Yes, there are substantial economic pressures on students today. Do we as educators, however, serve them best by surrendering our purposes to what prospective employers tell us they want? I say no. The question we need to ask is this: are the traits that employers say they want, and the means we are urged to adopt to meet them, wholly compatible with liberal education?

Take one example: Schneider tells us that colleges should change the curriculum to include more “experiential learning” such as internships and “team-based assignments”—the latter because 76% of employers want more emphasis in college on “teamwork skills.”

Do employers and faculty mean the same things when they advocate “teamwork skills” as an educational goal? If employers next tell us we're not producing the correct "outcome" when we teach teamwork, will we be called upon to change practices once again? Is it not possible that when some employers say they want employees with “teamwork skills,” they mean people who will not rock the boat and bring up the less essential “ethical values” that the team might be violating? I’d suggest that the recent record of the banking and financial industries shows that we may be teaching too much teamwork and not enough ethics.

It may not be coincidental that the two lowest priorities for employers on Schneider’s survey were “ethics and values” at 56% and “cultural values/traditions” at 53%. Would those who use such survey results to justify their preferred educational reforms also accept that the curriculum should not emphasize ethics and values, because employers don’t seem to care so much about them? Shouldn’t the low priority the employers placed on ethics and values suggest to us that perhaps their goals are not the same as liberal education’s, and make us at least question whether we should give priority to their preferences?

A liberal arts education should empower students with a sense of perspective, but that is precisely what is sorely lacking in this debate. The AAC&U approach smacks of fear and desperation, but is the reality really so dire that we need to look to surveys of employers to tell us what to do? Yes, the price of higher education is high (though not as high as the sticker price suggests, since most students do not pay that price), and students and their parents have a right to expect that a high-priced college education will prepare its graduates for life—including the working life.

But today’s sense of panic comes less from those realities than from a culture that reflexively and unthinkingly ridicules the liberal arts as impractical, simply because they do not immediately and automatically funnel graduates into high-paying jobs. Seemingly everyone from Click and Clack on “Car Talk” to President Obama buys into the idea that the art history major won’t get you a good job. We laugh and nod knowingly when people joke that all that liberal arts majors really need to know is how to ask “Do you want fries with that?”

But it is simply not true, as an AAC&U report shows. It may seem true when graduation comes and that dream job (making, say, at least as much money as last year’s tuition cost) does not materialize. It certainly seemed true to me when I was in that boat. But I see much better now than I did then. Thirty years down the road, the full value to me of my liberal arts education continues to emerge.

A liberal education will not pay its dividends—economic or otherwise—in one or two or five years. When we expect it to do so, we are unthinkingly adopting the short-run values of today’s market mentality, with its concern for the next quarter’s profit rather than the long-term viability of the company (see, again, the banking and financial industries). When we then change the way we teach in deference to such illusory expectations, we begin to sacrifice what we have always done best in the service of a mirage.

It is hard for liberal arts colleges to preach patience and perspective; perhaps it has rarely been harder to do so than it is now. But it is true: a liberal arts education has long-term value, value that cannot be reduced to income earned two or four years out, as the President’s “College Scorecard” seems intent on doing.

The fact of the matter is that ten or twenty or thirty years down the road, liberal arts majors are doing fine. True, they may not make as much as their counterparts in the STEM fields. Some may need a graduate degree to further enhance their economic well-being. But the traditional liberal arts curriculum does NOT condemn liberal arts graduates to a life of poverty, and we do not serve our students well when we buy into the lie that it does.

When we accept that false narrative as true, when we contort ourselves and embrace any curricular reform that promises to make us more “practical” and “useful,” when we adopt educational practices for their branding or marketing potential rather than their educational value, we betray our fundamental mission: the education of our students for freedom, not for servitude.

Tuesday, July 22, 2014

Historically Moving

After more than four years doing this blog, I'm starting a new venture. History News Network recently invited me to blog on their site, and with this post, "Historical Humility," I begin.

I'll still be posting my pieces here, probably a day after they make their debut on HNN. And I will continue to use this space for the occasional less historical and more personal piece.

I'd like to thank you readers who have been following this blog--some since it began early in 2010. In retrospect, it seems that every time I began to wonder if it was worth the time and effort, someone would, out of the blue, send me a nice compliment, or ask me when the next piece was coming. So thanks to everyone who did that.

I just wish my Dad was still here to see the new blog. He was probably the biggest fan of "The Past Isn't Past." Nothing gave me more satisfaction than when he would drop a casual "I liked your blog post" into our weekly Sunday afternoon phone call. After he passed, I went on his computer to send a message to his contacts to let them know, and noticed that "The Past Isn't Past" was the first bookmark on his web browser.

So, for that Great Web Browser in the Sky--and the rest of you, too--here's the bookmark for my new web home, Mark Byrnes's Facing Backwards.

Friday, July 4, 2014

I love the Fourth of July

(Re-posted from July 1, 2010)

I love the Fourth of July.

Not just because of fireworks (though who doesn't love a good fireworks display?). And not just because of cookouts (and, since you can throw a veggie burger on the grill too, who doesn't love a good cookout?). And not just because it gives me a reason to play two of my favorite songs, Bruce Springsteen's "Fourth of July, Asbury Park (Sandy)" and Dave Alvin's "Fourth of July" (though, seriously, this would be reason enough).

I love the Fourth because of the Declaration of Independence.

It began sometime in my childhood. At some point, on some vacation, at some historical site, my parents bought me a facsimile of the Declaration. It probably tells you all you need to know about me that I thought this was a great souvenir. It was hard, brittle, yellowed paper that crackled when you handled it. For some time I thought all official documents were thus. So when, in the fifth grade, my classmates called upon me to write a peace treaty ending the Great Spitball War between Group 2 and Group 3 (a foreshadowing that I would one day study diplomatic history?), I insisted on taking the piece of paper, coloring it with a yellow crayon, and then crumpling it up in a ball and flattening it out so that, at least to my eye, it looked like my copy of the Declaration. Then it was official.

Later, I stopped wondering why there were so many "f"s where there should clearly be "s"s, and thought more about its content. Just about every American is familiar with the most famous passage about the self-evident truths. But there is a lot more to the Declaration. Much of it, the bulk of it really, is essentially an indictment of George III justifying the break. Read with an historian’s rather than a patriot’s eye, many of the points don’t really hold up. But my favorite part of the Declaration isn’t one of the well-known lines, or something obscure from the list of charges. It comes at the end, just a simple, short phrase, and it encapsulates for me what is best about the Fourth of July.

When you think about it, July 4 isn’t really the most natural date for the nation’s birth. There are other turning points we could have chosen, for example, the outbreak of hostilities. Using that criterion, April 19, 1775, the date of the battles of Lexington and Concord, would be a better choice. Perhaps February 6, 1778, the date a great power, France, recognized American independence and entered an alliance with the U.S. that would help win the war, would be fitting. Legally one could argue that April 9, 1784, the date Britain recognized independence with its acceptance of the Treaty of Paris, was the true independence day.

But we didn’t choose the date of a battle, or the recognition of a great power, or the acceptance of the mother country. We chose the date of a declaration. What does July 4, 1776 mark, after all? A decision. An intention. Not a change in fact, but a change of mind. Looked at coldly, purely as a matter of fact, the Declaration is an absurdity. The colonies declared that they were independent, but they clearly were not. The colonies were still ruled by royal governors appointed by the King, and were occupied by tens of thousands of British soldiers. But the Declaration nonetheless boldly states, in the words of a resolution first proposed by Richard Henry Lee nearly a month earlier, that “these united Colonies are, and of Right ought to be Free and Independent States.”

And it’s that phrase that I love: “and of Right ought to be.” The Declaration is not one of fact. It is one of what “of Right ought to be.” This country was founded with its eyes on the Right. Those men who signed the declaration were not always right. About some things, many of them, in many ways, were tragically wrong. But they knew the importance of what ought to be. And they knew that the most important date was not the one when men took up arms, but when they decided to do what was right. When it has been at its worst, this country has settled passively for what is, or what cynics said has always been and thus must always be. When it has been at its best, it has remembered to keep its eyes on what "of Right ought to be."

Have a wonderful Fourth of July, and sometime between the cookout and the fireworks, think a little about what of Right ought to be. And then work to make it a reality. That’s what the Fourth, and being an American, means to me.

Tuesday, June 24, 2014

Maliki is the New Diem

Some people are talking coup d'etat in Iraq.

David Ignatius writes that "President Obama sensibly appears to be leaning toward an alternative policy that would replace Maliki with a less sectarian and polarizing prime minister."

The impulse to replace Maliki is understandable. Most observers of Iraq argue that he has played a large role in the growing sectarian divide between the majority Shi'ites and the minority Sunnis, and thus bears responsibility for the growth of ISIS in the north.

The unstated assumption, of course, is that another popularly elected, plausible leader could have governed differently and guided Iraq into a functioning democracy, and that now, the fact that elections produced Maliki should not stop the United States from maneuvering behind the scenes to get a more able (read "pliable") leader in his place. Then the United States can go about fixing Iraq.

President George W. Bush shakes hands with Iraqi Prime Minister Nuri al-Maliki, July 25, 2006. Photo by Kimberlee Hewitt, public domain via Wikimedia Commons.
Perhaps. More likely is that the internal conditions in Iraq produced the kind of leader Maliki became. If that's the case, then a coup to oust Maliki will do no good at all. Instead, it is likely to make things worse.

There is certainly precedent for that. In the mid-1950s in South Vietnam, the Eisenhower administration sought a non-communist popular leader who would not be tarnished by associations with the departing French colonizers. It settled on Ngo Dinh Diem.

For about six years, Diem seemed the answer to American prayers. He created a separate South Vietnamese government as a counter to Ho Chi Minh's communist North. He led a fairly stable regime that served American interests in the region.

President Dwight D. Eisenhower shakes hands with South Vietnamese President Ngo Dinh Diem, May 8, 1957. U.S. National Archives and Records Administration.
But then in 1960, the National Liberation Front began its offensive against Diem's government. As pressure grew, Diem became more oppressive, in particular cracking down on the majority Buddhists. By the fall of 1963, the American embassy and elements of the Kennedy administration decided that Diem was the problem and needed to go. American officials sent signals to South Vietnamese generals, who then ousted and murdered Diem and his brother.

Ignatius effectively proposes that the United States do the same thing in Iraq today:
The people who will pull the plug on Maliki are Kurdish leader Massoud Barzani and other Iraqi kingmakers. The United States should push them to signal unmistakably that Maliki is finished…. Saudi Arabia wants Obama to announce that he opposes Maliki. It would be better just to move him out, rather than hold a news conference.
One can only hope that Obama resists such pressure. Things with Diem didn't work out well.

In a February 1, 1966 conversation with Sen. Eugene McCarthy, LBJ put it bluntly. Kennedy was told, he said, that Diem
was corrupt and he ought to be killed. So we killed him. We all got together and got a goddamn bunch of thugs and we went in and assassinated him. Now, we've really had no political stability since then.
The political instability that followed the Diem coup was a major contributing factor in LBJ's disastrous decision to Americanize the war in Vietnam.

The desire to replace Maliki is another example of the imperial attitude toward Iraq: America gets to decide when it is time for the leader to go. I have little doubt that if the United States determined to do so, it could mount a coup against Maliki.  But as always, the question is: what then?

As with the initial invasion, it is relatively easy to destroy. It is much harder to build. The United States can probably destroy Maliki if it so chooses. But can it build anything to replace him?

Sunday, June 22, 2014

David Brooks and Pottery Barn Imperialism

One of the reasons I continue to read David Brooks is that he is often unintentionally revealing. Since he is, I think, quite sincere, he does not indulge in clever subterfuge in making his arguments. Thus he sometimes lays bare what otherwise remains hidden behind what Andrew Sullivan last week (ironically) called "noble lies."

In his June 13 column, Brooks tries to lay the blame for Iraq's current travails at the foot of Barack Obama. Before American troops left in 2011, he writes:
American diplomats rode herd on Prime Minister Nuri Kamal al-Maliki to restrain his sectarian impulses. American generals would threaten to physically block Iraq troop movements if Maliki ordered any action that seemed likely to polarize the nation.
After U.S. troops left, he writes:
Almost immediately things began to deteriorate. There were no advisers left to restrain Maliki’s sectarian tendencies. The American efforts to professionalize the Iraqi Army came undone.
Brooks never acknowledges the obvious (though unstated) assumption behind all of this: that Iraq could not be expected to function without the United States. It seems that Nuri al-Maliki (hand-picked by George W. Bush in 2007, by the way) bears no responsibility for indulging his "sectarian impulses" (and note that Maliki is ruled by "impulse," not thought or calculation), and the Iraqi army bears no responsibility for not being professional. It is all due to the absence of Americans, who of course, know best.

Brooks says, quite without irony, that "Iraq is in danger of becoming a non-nation." It never occurs to him that a state that--according to him--cannot function without American diplomats riding herd and American generals threatening its leader might already be a "non-nation."

Without knowing it, Brooks embraces an imperial role for the United States. It was America's job to control the Iraqi government, make it do the right thing. The United States should have stayed in Iraq for as long as it took. Leaving Iraq was "American underreach."

Brooks also embraces the reflexive American-centric mindset far too common on both the left and the right in the United States: the idea that whatever happens abroad happens because of something the United States either did or did not do. An incorrect American policy of withdrawal led to this state of affairs. It necessarily follows that whatever is going on in Iraq now can be fixed by the correct American policy.

Neither of those things is true. It is an illusion that Americans cherish because they think it gives them control over a chaotic world.

The American invasion of Iraq in 2003 broke Iraq. Iraqis thus far have not been able to put it back together. Maybe they never will. The lesson to be learned from that, however, is not what Brooks would have us believe: "The dangers of American underreach have been lavishly and horrifically displayed."

In the lead-up to the Iraq War in 2003, Colin Powell allegedly talked about the so-called Pottery Barn rule: "You break it, you own it." The true lesson of Iraq is this: American military intervention can easily break a country. It does not follow that American military intervention can just as easily make a country. Having disastrously bungled in breaking Iraq, Brooks would now have the United States once again bungle in trying to make it.

What the United States must "own" is not the state of Iraq, but the responsibility for breaking that state. Those are not the same thing. Responsibility begins with not making the situation worse by repeating the original mistake.

David Brooks, it seems, never learned that lesson. One hopes Barack Obama has.