Sunday, November 24, 2013

"Pain Which Cannot Forget"

The past week was marked by remembrances of JFK on the 50th anniversary of his murder. The historian in me can't help but take some satisfaction in the impulse to re-visit the past. Nonetheless, all week long the coverage produced in me a nagging unease, whose source I could not pin down.

On the day itself, it came to me. At least in the coverage I saw, heard, and read, it seemed there was an awful lot of re-living, but precious little reflection.

Over and over, people who were in Dallas and who played some role--reporters who covered the story, the Secret Service agent who jumped onto the president's car, doctors at the hospital, people lining the motorcade route--all re-told their stories. Average people repeated where they were when they heard the terrible news. Perhaps because at that time I was alive but not yet aware, these stories seemed, ultimately, somewhat unsatisfactory.

I think my inner historian was waiting for someone to seriously reflect and not simply remember. The closest most accounts ever got to reflection was trotting out the tired, clichéd remark that America lost its "innocence" that day. How a nation that had lived through the Civil War, or more recently the Great Depression and World War II, could be described as "innocent" escapes me.

Reflection is more than remembering and re-living. It involves a search for meaning and perspective. What do we do with those memories, how do we process them, and how are we different when we re-emerge from that process?

A discussion with a friend and colleague on the anniversary for some reason triggered a memory not about JFK, but RFK and the speech he gave the night of Martin Luther King Jr.'s murder four and a half years after his own brother had been gunned down. In that age before instant communication, Kennedy learned the news on his way to give a speech in Indianapolis, knowing that most if not all of the people gathered to hear him would be unaware of what happened.

The police feared a riot and advised Kennedy to cancel the speech. Instead, he insisted on going ahead with it. According to Evan Thomas' biography of RFK, the "police escort peeled off when he entered the ghetto." It's a remarkable speech, well worth watching in its entirety.

The reason it came to my mind is the way RFK takes his own pain at the death of his brother and uses it to try to assuage the pain and anger he knows his audience feels.

He quoted Aeschylus:

"Even in our sleep, pain which cannot forget falls drop by drop upon the heart, until, in our own despair, against our will, comes wisdom through the awful grace of God."

That RFK quoted that passage from Aeschylus is no accident. Thomas reports that, in his grief after his brother's murder, in his search for answers and meaning, RFK took the advice of Jacqueline Kennedy and began reading the works of the ancient Greeks: "The saving grace for Kennedy was the exaltation Greeks found in suffering. 'In agony learn wisdom!' cries the herald in Aeschylus' Prometheus. The Greeks understood that 'injustice was the nature of things,' but that the awfulness of fate could be borne and redeemed through pain."

RFK reflected. He learned. He found wisdom. He adopted some humility to balance the brash, youthful arrogance for which he had become known. He became a better man.

By the time history assigned him that role to play on April 4, 1968, he had transformed himself in such a way that the casting was ideal. He converted his personal pain into comfort for others.

Perhaps that's something only individuals, and not nations, can do. But I can't help but wish that this past week's remembrances had revealed a nation that had reflected and learned. That had become more humble. That was better. Whose pain had led to wisdom through the awful grace of God.

Monday, November 4, 2013

Experience Becoming

One of the great things about reading a variety of kinds of writing from a variety of sources is the occasional serendipitous connection it helps you make.

I read yesterday's Education Life section of the New York Times, growing increasingly agitated at the mindless cheerleading in its section titled "The Disrupters." Article after article treats the reader to largely uncritical accounts of various "edupreneurs" (no, that word is not my snarky coinage, but what some of these people evidently call themselves). 

We are promised (or is it threatened?) that "the disrupters" are on the verge of "disrupting" higher education by bringing the always perfect approaches that universally serve the private sector so well to that hopelessly outdated institution, the American college/university. (That sentence is snarky.)

There are so many things wrong with every one of those articles that I could not focus on any one of them. The avalanche of mindless corporatespeak passing itself off as wisdom and insight and innovation was just too overwhelming.

So to break the spell of banality, I went online to Andrew Sullivan's The Dish and saw this link to a letter by Kurt Vonnegut (which everyone should take a minute or two to read).

At first, I simply enjoyed the letter. But within a few minutes, I realized that it had brought into perfect focus one of the things that had bothered me most about the New York Times articles. Vonnegut's advice to a group of high school students was simple:
Practice any art, music, singing, dancing, acting, drawing, painting, sculpting, poetry, fiction, essays, reportage, no matter how well or badly, not to get money and fame, but to experience becoming, to find out what's inside you, to make your soul grow.

That's what was missing in all of these breathless edupreneurial proposals to disrupt higher education, but its absence was most obvious in one in particular: awarding degrees based simply on demonstrating competence in various areas.

Something calling itself College for America offers an associate's degree for $1,250 per six-month term. Students can breeze through as quickly as they like. The article highlights one young man who completed all "120 competency goals he was given" in only "three months and five days"--in other words, he got a two-year degree in one semester, for very little cost.

Pretty impressive, huh? Well, not if the goal was education.

This man got a degree. He did not get an education. Education is about becoming. It is not simply a checklist of (often employer-determined) competencies. This man had no time at all for reflection, no time for actual learning, no time for any of the discrete assignments he tackled to percolate in his unconscious, no time for the unplanned, unexpected connection to form and develop and blossom.

None of these supposedly "disrupting" ideas care one bit about those things. All we hear is that they will cut costs or speed up the process of getting a degree. Whether these ideas actually do anything to help students become something (other than a person with a "marketable diploma") seems to be of little or no concern to these Disruptive Masters of the Educational Universe.

A college is not simply a job-training institute. Its purpose is not to turn out interchangeable cogs who have been trained in specific marketable skills that our corporate masters dictate they must have.

At its best, it gives young women and men the chance to find out what's inside them, to become who they want to be. You don't do that in three months of producing "deliverables." You don't do that sitting at home in front of your computer looking at online videos. You don't do that with the aid of "academic success coaches."

Those things might get you a "marketable diploma" but they will never get you an education. As long as it makes them a profit, the edupreneurs could not care less.

The rest of society should.

Monday, October 28, 2013

To Still a Wackobird

"Simply because we were licked a hundred years before we started is no reason for us not to try to win."

I came across this line last week while re-reading To Kill a Mockingbird for my humanities class, and it struck me that it nicely captures the appeal of the GOP's recent quixotic effort to defund Obamacare that resulted in the government shutdown.

The line belongs to Atticus Finch, the attorney who takes on the legal defense of Tom Robinson, a black man falsely accused of rape. While I think it absurd to compare the principled nobility of that fictional act to the GOP's attempt to destroy the Affordable Care Act, the people who rallied to Texas Sen. Ted Cruz's cause don't. They believe in the nobility of the hopeless fight.

Harper Lee's novel retains its power more than 50 years after it was first published because it not only sends a clear message of condemnation of racial prejudice, but also tries to understand how whites came to hold those views, and to turn those views back on them.

One of the lessons Atticus imparts to his children is the need to understand those we are tempted to dismiss or condemn: "You never really understand a person until you consider things from his point of view ... until you climb into his skin and walk around in it." On some level, Lee's novel is an exercise in just that.

Lee understands the depth of the racial prejudice she attacks, and knows how to reveal it. She adroitly connects her hero to the very quality ostensibly prized by the adherents of the South's "Lost Cause" mentality. When Atticus says it is worth fighting even when you know you will lose, Scout associates the sentiment with his Cousin Ike.
"Tell you, Atticus," Cousin Ike would say, "the Missouri Compromise was what licked us, but if I had to go through it agin I'd walk every step of the way there an' every step back just like I did it before an' furthermore we'd whip 'em this time ..."
Lee's brilliance consists of taking this mindset--the one that allowed defeated Confederates to salvage something from their defeat by focusing on the honor of making a good fight rather than the system of slavery that victory would have perpetuated--and transferring it to a character who stands for the rule of law and equal justice, rather than the Jim Crow oppression that the "Lost Cause" sentimentality made possible. Atticus tells Scout: "This time we aren't fighting the Yankees, we're fighting our friends." Lee neatly equates the proponents of white supremacy with the hated Northerners.

If what white Southerners truly value is a principled fight, Lee suggests, then they should stand with Atticus Finch. When they don't, they show what it is they truly value: the preservation of a system that institutionalizes their racial privilege. In some sense, Lee's novel revolves around that insight: the gap between the purported ideals and the ugly reality that they mask.

The power of the ideal is undeniable: persisting in the face of certain defeat is supposed to prove the purity of the motive. The honor of the fight is all.

This is the ideal embraced by the supporters of the shutdown strategy. More mainstream Republican figures said the idea was crazy and bound to fail (John McCain has memorably called Cruz a "wackobird"). To them, an effort that has no chance of success is foolish, even counterproductive. For others, however, the fact that it has no chance is precisely what recommends it.

In the aftermath of his utter lack of success, Cruz refused to express any regret. He called the fight a "courageous stand" and a "profile in courage." When asked if the fight was worth it, Rep. Michele Bachmann replied: "Absolutely.... What we did is fought the right fight."

It is too easy to dismiss Cruz's shutdown advocacy as a stunt meant to propel him into the rank of 2016 presidential contenders. Of course it was that. The more important question is this: why did he think (evidently correctly) that it would have appeal among the Republican Tea Party base?

I would argue it is because he rightly recognized the appeal of the "Lost Cause" mentality.

In the course of American history, that concept is most closely associated with white Southerners, and while it may be more common among them, it is not at all uniquely "Southern." It is, however, an idea that has a special appeal to people who believe they have already lost the battle.

Many Southerners are Tea Party supporters, but not all Tea Party supporters are Southerners. During the shutdown, when a Tea Party protest at the White House brought out a Confederate battle flag, it was easy (too easy, really) to label all Tea Partiers as "neo-Confederates." Yes, race is an element here. It is not, however, everything. What is going on is more complex than that.

What we too loosely refer to as the "southern" mentality of the Tea Party is not geographic, but cultural. The Tea Party represents a subset of the larger culture: more white, more rural, more elderly, more traditional. The reason the apocalyptic rhetoric, the dig-in-your-heels style, and the confrontational (even anti-democratic) tactics appeal to Tea Party supporters is due to a simple fact: they see "their" America dying.

That's what they have in common with the Southern fire-eaters of the pre-Civil War era.

What many people don't understand about secession is that it was prompted merely by the fact of Lincoln's election, not anything concrete he had done--he had not even taken the office yet when seven states seceded. Secession was a response to what his election represented: the end of the Southern veto over national policy. Lincoln's election proved the northern states could elect a president without the aid of the southern states.

At that point, the fire-eaters decided the democratic game was over within the United States: they would always lose. Thus the only way to win, the only way to preserve "their" America, was to separate and create a new one in which they would be the permanent majority.

Today's GOP faces something similar. Demographic trends suggest that in the future, the GOP will not be able to remain the same ideologically and also be a majority party. Since it is not geographically defined the way the pro-slavery South was, the Tea Party core cannot secede in order to create a new majority (though "secession" and "nullification" have predictably enjoyed a recent resurgence in Tea Party circles).

One solution to this dilemma would be ideological change, which would require writing off the Tea Party. But the party is unwilling to take that step. Nothing shows that better than the way Speaker John Boehner abdicated all leadership in deference to the Tea Party caucus during the recent shutdown.

So what the party has been trying to do instead is change the rules so that they can control government without having to be an actual majority party.

That is what holds together the variety of the tactics used by the GOP since Obama's election in 2008. The abuse of the filibuster in the Senate has become a vehicle of minority veto, a way to say no to everything, to make a supermajority the new requirement for things that traditionally required a regular majority. The voter ID laws reflect the same desire to rig the outcome: if we cannot get a majority of the existing electorate, we can find a legal way to redefine the electorate and create an artificial majority.

The shutdown debacle was the same thing--unable to achieve the "correct" result through normal democratic process, Tea Partiers decided to hold the funding of government hostage to achieve their end of defunding Obamacare.

Why choose that issue? The term "Obamacare" has come to encompass everything they despise: the man himself, the electoral coalition that brought him to power and successfully kept him in office, the governmental philosophy he represents. It is the embodiment of their fear--bordering on certainty--that history is passing them by, that the America they believe in is passing away.

The desperation in the rhetoric is real. As long as cynics like Ted Cruz continue to pander to it, it will not diminish, and the Tea Party will remain politically relevant--and destructive.

Republicans are trying their best to frame the current divisions within the party as merely a matter of "tactics and strategies," as Cruz recently put it. It is not. It is a fight between those who think time is short and compromise is betrayal, and those who don't. The Tea Partiers are right, I think, if deep down they believe that they are fighting a losing battle. As long as they continue to demonstrate power within the GOP primaries, however, the racket of the wackobirds will go on and on.

Wednesday, October 2, 2013

Trey Gowdy Thinks Obamacare is Comparable to Segregation

The lunacy never ends.

It's bad enough that one of the two great American political parties has gone completely off the deep end, indulging in magical thinking that says it can reverse the results of the last presidential election simply by politically holding its breath. Now they want us to see their political temper tantrum as the equivalent of the Civil Rights movement.

In today's Spartanburg Herald-Journal, my representative, Trey Gowdy (R-SC), said the following about the government shutdown and the Tea Party obsession with derailing the Affordable Care Act:
“Some people might say that we should go along with what the President wants, and the Supreme Court ruling, but I would submit that just because it's law doesn't make it a good law,” Gowdy said. “There was a time when it would have been unlawful for (Sen.) Tim Scott and I to sit-down at a restaurant and eat together. There are bad laws and those worth staying and fighting for and I happen to think this is one of them.”
It shouldn't be necessary to point out how utterly absurd this comparison is, but as George Orwell once said, "we have now sunk to a depth at which the restatement of the obvious is the first duty of intelligent men."

So let me state the obvious: a law that makes it possible for uninsured Americans to buy health insurance at a price they can afford is not comparable to segregation.

Gowdy postures as a man of principle, but anyone with an elementary understanding of what constitutes principles would understand the difference. Yes, the general idea (a law is not necessarily a good law) is true.

Everything else Gowdy says is despicable, an insult to everyone who fought to end segregation. Those people quite literally put their lives on the line to challenge a moral and constitutional injustice.

What has Trey Gowdy done?
[Photo: Trey Gowdy, talking to a history class in his district. Hopefully he did not tell them that Obamacare is like segregation.]

He has refused to approve ongoing funding of the federal government because he insists on defunding a law he does not like.

And what has Trey Gowdy sacrificed for this "principle"?

He continues to draw his salary while hundreds of thousands of federal employees, many trying desperately to make ends meet, are furloughed and lose their incomes. He continues to enjoy the security of knowing that he and his family are well-insured if, God forbid, any health problems should arise, while he tries to prevent less fortunate people from having that same peace of mind.

A real profile in courage, that.

And what will happen if Trey Gowdy gets his way? Who will his principled stand help?

No one.

But he can be proud of having made a principled stand to free other people from the burden of having health insurance.

If Trey Gowdy were truly concerned with principle, he would take seriously the principle upon which our government rests: that our laws are the result of a process, one which was followed and ratified in every particular in the case of this law. It is not simply "what the President wants." It is the law of the land.

Trey Gowdy has every right--even the duty--to argue his case, to do all he can to convince Americans to elect representatives, senators, and a president who want to repeal the law. If he succeeds, people like me who favor the idea will have to accept repeal. Because we accept the legitimacy of the process.

But what he and his Tea Party cohort want is not to play by the rules, but to blow them up. They don't give a damn about process. If it does not serve their ends, they will manipulate it, undermine it, pervert it until they get what they want.

Their arguments failed to sway the Supreme Court, they failed to win the presidential election, they failed to gain control of the Senate, and they think none of that should matter. They should get their way regardless, because they are RIGHT.

After all, they think they are like the Civil Rights protesters.

The lunacy never ends.

Monday, September 9, 2013

Elevate the Debate

It hasn't exactly been an inspiring week for democratic discourse.

When President Obama decided to submit to Congress the question of military action against Syria, I wrote that he was doing the right thing in showing respect for process. I still believe that. But I also said that it was now up to Congress to "have a dignified and intelligent debate." So far, not so much.

There have been more lowlights than highlights. We've been treated to Rep. Jeff Duncan, Republican of South Carolina, embarrassing himself by launching an ad hominem attack on Secretary of State John Kerry in the guise of a question: “Is the power of the executive branch so intoxicating," Duncan said, "that you would abandon past caution in favor for pulling the trigger on a military response so quickly?”

This is not a statesman making an argument. This is a hack trying to score cheap political points.

Sen. John McCain, who has long supported military intervention in Syria, makes the "credibility" argument. “If the Congress were to reject a resolution like this, after the president of the United States has already committed to action, the consequences would be catastrophic, in that the credibility of this country with friends and adversaries alike would be shredded,” McCain said.

The "credibility" case is perhaps the worst possible argument for military intervention. It amounts to saying that it is better to do something stupid than take a chance that you might be seen as fickle or weak by deciding not to do the stupid thing you said you would do. If military strikes against Syria make sense as policy, proponents need to make that case, and not hide behind the absurd "credibility" argument that helped drag the United States into Vietnam.

Congressional Democrats have been no better.

Democratic Rep. Elijah Cummings of Maryland, who seems to want to vote yes in order to support a Democratic president, explained that over 90% of his constituents are against military action against Syria, and cited an exchange with a nurse, who opposes strikes:

"So I said, 'Do you understand there's chemical weapons?' She said, 'Folks have been using chemical weapons for a long time.'"

The problem is that the nurse is factually wrong. Since their widespread use in World War I and subsequent banning in 1925, such weapons have in fact rarely been used. Cummings either did not know that, or did not bother to correct her. Leadership sometimes means telling the people when they are wrong.

The whole idea behind a punitive military response against Syria is to reassert the idea that using such weapons is beyond the pale and ensure that it does not now become commonplace. A member of Congress about to vote on the proposal should know that, and has no obligation to be swayed by the uninformed opinions of constituents.

Explaining why he was leaning against supporting military strikes against Syria, Rep. Gregory W. Meeks, Democrat of New York, said: “I wasn’t elected just to go along to get along. I was elected to utilize my thought process and to determine what I think is in the best interest of my district.”

No, Rep. Meeks. When it comes to foreign policy, your job is not to think about "the best interest of my district." A congressional district does not have national security interests; the United States does. In these cases, you think as an American, not as the reflexive servant of your constituents. Meeks was trying to paint his fear of opposing constituent wishes as the political courage to be independent of the president, but instead made himself look like a politician about to cravenly submit to the voters, rather than deciding what he thinks is right.

If members of Congress think the president is wrong, they should explain why, and not hide behind platitudes about constituent wishes or political independence. If they think he is right, they ought not to use "credibility" to avoid explaining exactly what American interests are at stake.

Perhaps in these hyper-partisan times, an elevated debate was too much to hope for. Rep. Tim Murphy, Republican of Pennsylvania, admitted that his constituents openly say that they oppose action against Syria simply because Obama is asking for it: “Generally, the calls are like this: ‘I can’t stand President Obama; don’t you dare go along with him,’” he said.

I've spent the last several years researching the American debate over involvement in World War II, so I inevitably tend to see this debate through that lens. That debate also had its low points, with demagoguery on both sides often drowning out more reasoned discourse. Some people no doubt opposed FDR's proposals simply because they came from "that man."

Nonetheless, there was a substantive debate over American policy, one that went on for 27 months. In those specific historical circumstances, the United States had the luxury of time. Since then--in part due to the difficulties FDR had in moving Congress toward intervention--presidents have often eschewed Congressional debates before taking action, citing the need for quick action. (It is also likely that they feared getting bogged down in precisely the kind of self-interested and often partisan Congressional posturing we've just seen.)

Sadly, the last week shows why those previous presidents acted the way they did. If Congress wants to reassert its role in making foreign and military policy, if it wants to show that those previous presidents were wrong to act without Congress and that future presidents should follow Obama's example, today's Representatives and Senators need to elevate the debate to a level commensurate with the stakes. If they fail, they may squander their last best chance to show that the legislative branch can be a responsible partner in the making of American national security policy.

Monday, September 2, 2013

Bare Minimum

Since today is Labor Day, it seems appropriate to note that what was lost in last week's Eleazar David Melendez piece in the Huffington Post on the 1949 minimum wage increase was the role played by American labor.

Melendez rightly notes that the Truman administration and its conservative opponents compromised to reach the agreement to increase the minimum wage, but fails to note what motivated Truman and the Democrats to push so hard for the increase: the desire to fulfill at least one of their promises to labor.

Most union members made far more than the minimum wage in 1949. Although the final bill raised the minimum from 40 cents to 75, the average wage increase for most workers was only 5 to 10 cents, since they already earned more than the minimum. Yet labor made the increase a priority because it saw itself as representing all workers, and believed that raising the minimum wage would have a ripple effect that would ultimately benefit them all.

Politically, labor mattered. In his 1948 campaign, which many political observers dismissed as futile, Truman had run on a platform that pledged two major things to American labor: an increase in the minimum wage (which Truman had first asked for in 1945) and repeal of the anti-union Taft-Hartley Act of 1947, which passed the Republican-controlled Congress over Truman's veto.

In his State of the Union address in January 1949, Truman had called for repeal: "At present, working men and women of the Nation are discriminated against by a statute [the Taft-Hartley Act] that abridges their rights, curtails their constructive efforts, and hampers our system of free collective bargaining.... That act should be repealed!"

The reality, however, was that Truman lacked the votes to repeal the act, despite the fact that Democrats had regained control of Congress. The conservative coalition of Republicans and southern Democrats made it impossible.

What was possible was an increase in the minimum wage, and Democratic leaders in Congress quickly gave up on Taft-Hartley repeal and focused on that instead. While labor wanted $1 an hour, Truman had asked for "at least" 75 cents while also seeking to dramatically expand (perhaps by 5 million) the number of workers covered by the law.

Predictably, conservatives tried to derail the proposal--but not by using today's obstructionist tactics. They actually proposed an alternative: limiting the increase to 65 cents an hour, indexing the wage to inflation, and eliminating the expansion of workers covered. Truman and the Democrats held firm on 75 cents, and Majority leader John McCormack made that number a matter of party loyalty, citing the 1948 platform. But they accepted the fact that they could not get both an increase in the wage and an increase in coverage, and accepted a bill that, in the short run, actually reduced the number of workers covered by the law.

That compromise led to a 361-35 vote in the House in favor of its version of the bill (this is the vote I referred to as "extraordinary" in the Melendez piece--he mistakenly attributed my statement to another 186 to 116 vote and has not responded to requests to correct the record).

What makes that vote extraordinary is that there were only 263 Democrats in the House. In other words, a large number of Republicans voted to increase the minimum wage.

Contrast that with today's conservative orthodoxy, which resolutely resists any increase in the minimum wage. Sunday's Spartanburg Herald-Journal made a typical free-market argument: "The federal minimum wage is an artificial control on the market system" which "will only spur businesses to raise prices and cut jobs." The editorial acknowledges that inflation has eroded the real value of the minimum wage, but rejects the idea that this is any reason to increase it.

For the last 30 years, conservatives have resisted increases in the minimum wage, effectively lowering the wage when accounting for inflation. The minimum wage reached its height in 1967, when it was an inflation-adjusted $9.79 an hour. In fact, from 1962 to 1979, the minimum wage was always more than $9.00 an hour. Beginning in 1980 (coinciding with the start of the Reagan era), it began a steady decline, reaching an inflation-adjusted low of $6.59 in 2007.

That finally prompted the Democratically-controlled House in 2007 to pass an increase. They had the votes to do it alone, but a mere 6 years ago, 82 House Republicans also voted to increase the minimum wage. Can anyone imagine today's Republican House members casting such a vote?

The fact that President Obama's current proposal to increase the minimum wage to $9.00 an hour seems dead in the water is testimony to how reactionary today's Republicans have become. That rate today would only restore the minimum wage to where it was at the end of 1961. Despite a general nostalgia for the America of 50 years ago, in this one respect today's conservatives do not want to go back.

It is no coincidence that the erosion of the minimum wage parallels the decline of the power of the American labor movement. Ronald Reagan famously broke the air traffic controller strike in 1981, and it would be 9 years before the minimum wage increased again (it had increased 7 times in the previous 9 years). It is also no coincidence that the same period has seen a marked redistribution of wealth upward.

In 1979, when the minimum wage was an inflation-adjusted $9.33, the bottom 99% controlled 79.5% of the national wealth; in 2010, it was down to 64.6%.

Raising the minimum wage is one of the tools we have to try to maintain the kind of balanced economy that produces widespread prosperity. Today's conservative refusal to use that tool betrays a reactionary agenda that seeks to enhance, rather than alleviate, the trend toward maldistribution of wealth.

For all of its well-documented faults, American labor was a countervailing force that balanced the power of corporate America from the mid-1940s to the late-1970s, to the benefit of all Americans. Its decline in the decades since has led to a distorted, winner-take-all economy which is incapable of maintaining balanced, long-term economic growth. It may be impossible to revive the American labor movement, but it is imperative that we find a political substitute to play the role that unions once played in American political life.

Saturday, August 31, 2013

"Roll of the Dice"

Having just watched President Obama's statement on Syria, in which he both strongly made the case for his belief that the U.S. should launch punitive military strikes against the Assad regime and said he would ask for Congressional authorization to do so, I listened in amazement as the CNN commentators displayed just how poorly they know this president.

Wolf Blitzer kept repeating that this was a "roll of the dice," because by asking for Congressional authorization, Obama was taking the chance that the answer would be "no." In that case, he suggested, Obama risked looking weak.

In short, Blitzer et al simply could not seem to fathom that Obama may have been thinking of this decision in any terms other than the crassly political. It never seems to have occurred to them that he may believe that, however much he thinks that this is the right course of action, he had a responsibility to provide the time for the people's representatives to weigh in before acting. He may have actually believed a Congressional vote was the right and proper thing to do even if the Congress did not approve a military strike.

I've studied American political history and foreign policy long enough to know that presidents are never unaware of or unconcerned with the political ramifications of their military decisions. But that is not the same thing as saying that politics always trumps other considerations.

Over the last 30 to 40 years, we Americans have become so accustomed to presidents justifying the assertion of unilateral power to do nearly anything around the world by referring to their powers as "commander-in-chief" that a president refraining from doing so seems inexplicable, an irrational "roll of the dice."

President Obama made clear that he believes a strike is justified, even required, in this instance. But he did not therefore conclude that he had a unilateral right to do it. Perhaps if Congress refuses to grant the authority, he will act anyway. My guess is that he will not. My belief is that he was saying something that far too few Americans--in the media, and especially in Congress--seem to understand: process matters.

However important he thinks it is for the United States to make a statement that the use of chemical weapons cannot be tolerated, he does not think it more important than that basic principle. In our political life, we have accepted the corrosive idea that the only thing that matters is getting our way, process be damned.

The American system of government, if it is to work again, requires that all Americans recommit to the idea that the most important thing is not that we get our way by hook or by crook, but that we all agree to respect, abide by, and not abuse the process. If we have an election and our candidates and ideas lose, we do not then seek to subvert the result, or hold the government hostage in order to undo the results of that election.

If our ideas are rejected by the majority, we have every right to continue to believe them and advocate for them. But when we connive to impose them on others, when we run roughshod over the process in the name of achieving our desired result, we undermine the only thing that can ever make the system work.

In his deference to Congress today, President Obama has shown respect for process. Now it is up to Congress to show similar respect, have a dignified and intelligent debate, and face the responsibility of making its decision. If it does, regardless of the result, our system of government will be the stronger for it.

And with any luck, some members of Congress might even get in the habit of putting process over results. We can only hope.

Wednesday, August 28, 2013

Truman and the Minimum Wage

Huffington Post has an article out today on the 1949 minimum wage law. I spoke with the writer, Eleazar David Melendez, for about 40 minutes a couple of weeks ago, helping him understand how the law got passed despite the general opposition to Truman's Fair Deal proposals that year.

I intend to elaborate more on the dynamics of passing this legislation in a future post, but for now, the article does a good job laying out the basics and quotes some of my observations.

Sunday, August 18, 2013

Presidential Historian "Branding"

"Branding" seems to be everywhere. The concept of a "brand" began in business, defined by Businessweek as "the genuine 'personality' of your company." But in the increasingly commodified, "Glengarry Glen Ross" society in which we are all expected to "always be selling," the idea has become virtually indistinguishable from marketing and self-promotion.

As a result, to my ear the word "brand" smacks of manipulation. I prefer "reputation," since a reputation carries with it the sense of something earned by one's actions, not fabricated by one's conscious self-promotion.

To a greater or lesser degree, everyone on social media engages in some form of "brand" creation--am I someone who posts regularly or irregularly to Facebook? Are my posts personal, political, inspirational, religious, etc? When I decided to start this blog, I had to decide what (if anything) it might be known for, and since my wish was to apply my historian's perspective to contemporary events, most (though not all) of my posts have roughly fit that category.

In recent years, no one has been more successful at this than "presidential historian" Michael Beschloss. He's a regular on PBS and NBC, and recently he has made something of a splash on Twitter (@BeschlossDC). His account was named to Time magazine's "140 Best Twitter Feeds of 2013" in the category of "Politics," though it is really more historical than political.

Beschloss has specialized in tweeting interesting photographs, and so it might be more accurate to say that he is doing a kind of history through photography. My friend, the attorney Bill Carleton, who follows Beschloss on Twitter and has an interest in intellectual property and new media, noted to me a while back that it seemed odd that Beschloss almost never identified the sources of the photos or the photographer.
The Twitter profile of Michael Beschloss

Carleton has tweeted Beschloss on this subject, and last week he wrote a blog post about it. He made his point with this brilliant mock photo credit:
Pictured, from @BeschlossDC: Truman and LBJ in 1965. It's remarkable that Michael Beschloss would have had both the access to the Presidents and the facility with camera equipment of the time to pull this photograph off. From the camera angle, we can infer that he was unusually tall for a child (he would have been 9 years old in 1965)
Despite the tone, the issue Carleton raises is a serious one. Beschloss has added "Twitter photo historian" to his brand, and gained some fairly high level exposure for it, such as this Gwen Ifill interview from December 2012. But it is hard not to notice how Beschloss artfully ducked Ifill's question about how he finds these photos by talking instead about why he finds them interesting.

More recently, Beschloss did the same thing in this interview with Jonathan Karl when asked directly (about 5 minutes into the interview) "where do you get these photos?" Beschloss explained when he does it (on the weekends) and why he does it. When asked a follow-up about where he found a specific photo of Lou Gehrig and Frank Sinatra, Beschloss said it was from an "archive that was connected I believe to Lou Gehrig who has a lot of fan sites." Finally, Beschloss simply said that he relies on his memory: "I remembered seeing that image somewhere and I went out and grabbed it."

Historians will know where I'm going with this. When it comes to citing primary sources, "I went out and grabbed it" does not cut it. In his books, Beschloss--like all authors--has to credit the photos he uses, in the same way that a historian is trained to cite all primary sources.

As a blogger who occasionally likes to use photos with a post, I can sympathize with Beschloss. The internet has made the replication of images easy, and it can be difficult to track down the original provenance of every photo one would like to use. (Thinking about this issue has convinced me of the need to be more vigilant about citing photos in future blog posts.) It is also the case that in the classroom, we teachers frequently make use of photos pulled from the internet in our PowerPoint presentations without crediting them.

At what point does the size of the audience matter? Beschloss now has over 60,000 followers on Twitter. When he tweets photos, is he more like a teacher using them in a classroom or more like an author publishing them in a book?

Complicating the issue further is the fact that the greater the notoriety that Beschloss gains, the more the photos he tweets in some sense "become" his photos to followers.  For example, the CBS Sports web page made the Sinatra/Gehrig picture its "Photo of the Day," and said it was "Courtesy of presidential historian Michael Beschloss."

Given all of the above, it is good to see that Beschloss (probably because of the prompting of Bill Carleton and others) has now started to credit most of his photos, and promises a future website which will have the sources of the images. I do wish that he had openly acknowledged the change, however, and offered an explanation to his Twitter followers. That, too, could have been a form of educating the public, by letting them know that crediting the original sources is a value historians hold dear.

Unfortunately, it does not seem that Beschloss is overly interested in acknowledging mistakes. Last month, he tweeted a picture of Andrew Jackson and wrote: "Andrew Jackson tday [sic] 1832 vetoed Bank of US renewal ending tradition of veto's use only against unconstitutionality." As soon as I read that, I knew it was wrong: Jackson had explicitly argued in his veto message that he believed the Bank to be unconstitutional.

I tweeted Beschloss a quotation from Jackson's message ("the powers and privileges possessed by the existing bank are unauthorized by the Constitution") and included a link to the full text of the statement. There was no reply, but I later noticed that the tweet was gone. Fortunately, I had done a screen capture of the original tweet, pictured here.

Upon learning of his mistake (no doubt from others as well), Beschloss merely deleted the tweet and thus the evidence of his error.

I don't blame Beschloss for getting something wrong, particularly given the volume of tweeting he does. But the fact that he did not acknowledge and correct the error strikes me as beyond the pale. Errors of fact should not just be dropped down the technological memory hole.

I noticed the tweet was gone and I followed up: "I like how you use Twitter, but I think simply deleting the erroneous Jackson tweet is insufficient. Anyone can make a mistake-But a historian has an obligation to correct mistakes. You have nearly 48,000 followers--how many saw it and thought it true?" Beschloss neither responded nor issued a correction. He just removed the offending tweet from the record.

Perhaps by making his mistake disappear, and by belatedly (though without explanation) beginning to credit some of the photos he uses, Beschloss is protecting his "brand." But at least for this historian, his reputation has suffered.

Wednesday, August 14, 2013

The Egyptian Tiananmen

In light of today's horrors in Egypt, Andrew Sullivan, Marc Lynch, and others are now calling for the cutting off of aid to Egypt. While that is the right thing to do now, it is a classic case of too little, too late.

The time to end the aid was after the coup, as the law required. By twisting itself in knots to pretend the coup was not a coup, the Obama administration signaled to the Egyptian generals that it valued its relationship with them more than the democratic process. The military no doubt took that as effectively a green light for today's events.

The administration allowed a simplistic idea of realpolitik to convince it that the worldly-wise way to approach the coup that removed Morsi from power was to finesse the situation. It would maintain its influence with the generals by showing that it had faith in their intentions to restore democracy. Lynch writes:
It seemed prudent to many in Washington to wait and see how things would play out, especially given the intense arguments of those defending what they considered popular revolution. It didn't help that neither the United States nor other outside actors knew quite what they wanted. Few particularly wanted to go to the mat for the Muslim Brotherhood or a Morsy restoration, and Washington quickly understood that this was not in the cards. But they also didn't want a return to military rule.
What Obama should have done instead was use the law requiring an aid cut-off as a way of pressuring the Egyptian military to restore quickly a legitimate government with a popular mandate. Obama would have had the excuse of saying that the coup left him with no options. Secretary of State Kerry then could have quietly made assurances that the aid would be immediately resumed once an elected government was in place.

Such a course would have given the administration actual leverage. Instead, its refusal to call a coup a coup sent precisely the wrong message.

What should have been clear before is now undeniable: when the military acted to remove Morsi from power, it was not acting on the popular will. It was rather exploiting the anti-Morsi protests to do what it wanted to do all along: decapitate the Muslim Brotherhood. By not objecting, the administration implied that it shared that objective. Was it really so odd that the Egyptian generals believed that if they could remove an elected president without consequences, they could also violently disperse protestors?

In academic discussions of American foreign policy, there is a common division between those who argue that U.S. diplomacy should be guided by ideals and those who say it should only serve material interests. In this case, that is a false choice. A stable Egypt, with real respect for democratic process, in which the Muslim Brotherhood has a stake in electoral politics, is in America's interest, but today that result seems sadly unlikely. By taking an allegedly "hard-headed" approach focused purely on interests, the Obama administration has served neither American ideals nor its interests.

As Ethar El Katataney says in the tweet pictured above, "Pandora's Box is wide open. How are we going to close it?"

Q & A on Jaron Lanier's "Who Owns the Future?"

I spent the last week in Seattle, visiting my college roommate and good friend Bill Carleton (@Wac6 on Twitter). Bill is a lawyer with expertise in securities and intellectual property law, and lots of experience with startups.

Bill Carleton
Over cigars and scotch, we had a long discussion/debate over the NSA revelations, which then morphed into a discussion of how private companies like Amazon, Google, and Facebook are amassing big data on all of us as well. Bill referred me to the work of Jaron Lanier, a technologist and author of You Are Not a Gadget. He also gave me his copy of Who Owns the Future?, which I read.

Lanier's argument, as I see it, is essentially this: these companies seduce the user with the lure of "free" services, and then mine all of us for valuable data. Lanier refers to these as "Siren Servers." The information we voluntarily cede has value, which those companies then convert into profits. But we users are never compensated for that valuable data. Lanier proposes a new model, which would acknowledge the monetary value that data has, so that each time we surrender such data, we receive a "nanopayment" to reflect that value.

Our resulting discussions led to this Q & A, which Bill has posted on his blog, William Carleton, Counselor @ Law.

I intend to expand on these preliminary thoughts on MOOCs in a future post here, but for now, this gives some sense of where I'm heading.

Thursday, August 1, 2013

British Diplomacy and Royal Baby Mania

A new heir to a powerless throne was born last week, and a portion of the American public went into a collective tizzy over it. Predictably, another portion went into a tizzy over that, and muttered some variant of "George Washington [or some other Founder] must be rolling over in his grave."

As an American of Irish descent, I don't go in much for British royalty worship, but I suppose it is no more harmful than many other types of American celebrity adoration. But these two types of responses got me thinking about the nature of the American relationship with the former mother country.

For the vast majority of living Americans, it has always been the "special relationship," but that is actually of fairly recent vintage, forged in the crucible of World War II (and visually captured by this image of FDR and Churchill, meeting on a British naval vessel in August 1941, singing "Onward Christian Soldiers").

As so often happens, the memory of the living is incomplete.

One need not go back to the Revolution to find Americans who would be appalled at the sight of last week's American attention to a British royal birth. For at least the first century after American independence, Britain was the enemy of the United States: at war from 1812 to 1815, and at odds for most of the next few decades (including a near-war over Oregon in 1846). No other country caused 19th-century American presidents more consternation than Britain.

Not even fighting on the same side in the first World War completely changed that. American Anglophobia was still alive as FDR and Churchill were meeting in the summer of 1941. The United States was in the midst of the Great Debate over the extent of American involvement in World War II. Pearl Harbor lay months in the future, and isolationists continued to argue that the United States should not enter the war. There were plenty of Americans who rejected the idea that there was any "special relationship," or that American interests overlapped much with those of Great Britain.

While their arguments against American involvement were numerous, one is particularly relevant here: it would mean fighting for British imperialism. In April 1941, in a nationally broadcast radio debate, Fay Bennett, executive secretary of the Youth Committee Against War said: "if you have any faith that the British Empire is going to bring peace and democracy to the world, I would like one bit of evidence to that effect."

A particular focus of this line of argument was British rule in India. Gandhi's independence movement had put the British in the awkward position of fighting German imperialism in Europe while defending their own in Asia. As Churchill later said in November 1942, "we mean to hold our own. I have not become the King's First Minister in order to preside over the liquidation of the British Empire."

The isolationist argument that America should not assist British imperialism was so common that an interventionist anticipated it in one debate: “I don’t think the present regime in England … and even what is going on in India—I suppose some of you people will want to bring that up—can in any way compare to the world we would have to live in if Hitler were the victor.”

The defenders of aid to Britain often resorted not to appeals to any "special relationship," but instead fell back on the idea that the British were simply the lesser of evils in this fight.

Pearl Harbor ended all that, as the German declaration of war brought the U.S. fully into the war on the same side as Great Britain, and Churchill came to the U.S. and lived at the White House for several weeks.

That result was no accident, however. It was the product of over 40 years of wise diplomacy by the British Foreign Office, which surveyed the world situation in the 1890s and made a conscious decision to cultivate American friendship. It saw three rising powers in the world: the U.S., Germany, and Japan. It could not successfully oppose all three, so Britain chose to accommodate the rise of American power and hoped to enlist it in the fight to thwart the other two. In December 1941, Britain saw that decision pay off magnificently, as America joined forces with Britain to defeat the designs of Germany and Japan.

I'm quite confident that officials of the British Foreign Office never anticipated American "royal baby mania" in 2013. In some sense, however, the unthinking American adoption of the British royal family witnessed last week is also an unintended product of their diplomacy.

Saturday, July 20, 2013

Orwell on Hitler

A great writer will surprise you.

For the last few months, I have been intermittently dipping into George Orwell's collected Essays. A few pieces have seemed rather dated, most have been interesting and enlightening, and not a few (like his extended musings on Dickens) are extraordinary.

The other night, I was reading in bed, finishing his review of Henry Miller's Tropic of Cancer (which is less a review than an examination of the role of literature now that World War II had come), and I turned the page to find that the next piece was titled "Review of Mein Kampf, by Adolf Hitler, unabridged translation" from March 1940.

"Well," I thought, "THIS should be interesting." I decided I could stay up reading just a little longer.

The first half deals mostly with how Hitler's image in Britain had changed over the last year. Then Orwell writes something that stopped me dead in my tracks:
"I should like to put it on record that I have never been able to dislike Hitler."
I reread the sentence, certain that I had missed something, but I hadn't. How could Orwell, with his unremitting hatred of totalitarianism, not hate Hitler? Orwell spends the second half of the essay persuasively explaining himself, but the brief answer is this: simply hating Hitler is easy, lazy, and self-defeating.

Precisely because he despises totalitarianism, Orwell is interested in the reason that Germans have accepted Hitler's leadership. He starts by recognizing that Hitler's political success was due in part to the "attraction of his own personality." Orwell writes that while he has thought that, given the chance, "I would certainly kill him," he would "feel no personal animosity" because "there is something deeply appealing about him." That is, really, the horrible truth. Hitler otherwise never would have become so powerful.

Orwell argues that it is Hitler's portrayal of himself as a kind of underdog that is so affecting:
He is the martyr, the victim, Prometheus chained to the rock, the self-sacrificing hero who fights single-handed against impossible odds.... The attraction of such a pose is of course enormous; half the films that one sees turn upon some such theme.
What makes Orwell's analysis so powerful is not simply that he identifies the source of Hitler's appeal, but that he admits that he himself is susceptible to it. He does not separate himself from (and thereby elevate himself above) the Germans who support Hitler. He identifies with and understands them.

Even more, he gives the devil his due. It is not merely that Hitler's personality can be attractive, Orwell argues. Hitler's appeal is also due to his ideology, which has at its foundation an important insight:
Also he has grasped the falsity of the hedonistic attitude toward life.... Hitler, because in his own joyless mind he feels it with exceptional strength, knows that human beings don't only want comfort, safety, short working-hours, hygiene, birth control and, in general, common sense; they also, at least intermittently, want struggle and self-sacrifice, not to mention drums, flags and loyalty-parades.
It goes without saying that Orwell finds Hitler's ideology repugnant; why then say that "Fascism and Nazism are psychologically far sounder than any hedonistic conception of life"? It is precisely because he finds it so horrific that he must recognize its power. Few people of his time knew better than Orwell the awful places that totalitarianism would soon lead humanity. He was able to see where it would lead because he understood its psychological power. He did not unthinkingly dismiss it as evil, he did not live in denial. He grappled with it.

In a passage I suspect will resonate with most of my friends who are parents, Orwell writes:
The Socialist who finds his children playing with soldiers is usually upset, but he is never able to think of a substitute for the tin soldiers; tin pacifists somehow won't do.
His point is not that the parent need approve or encourage that part of the child's make-up, but rather that it is foolish and unproductive to ignore or deny its reality.

Orwell knew that merely demonizing the enemy is in fact doing the enemy a favor. Understanding the appeal of your enemy and your enemy's ideas does not mean abandoning one's own views, or excusing those of the enemy. It is, instead, key to defeating the enemy.

Orwell believed that Hitler's way was bound to produce "years of slaughter and starvation" for Germany. At that point, he writes, "Greatest happiness of the greatest number" would once again be "a good slogan." But, he says,
at this moment, "Better an end with horror than a horror without end" is a winner. Now that we are fighting against the man who coined it, we ought not to underrate its emotional appeal.
That's a lesson we all can take from Orwell's surprising take on Hitler.

Wednesday, June 19, 2013

The Enduring Appeal of the "Great Leader" Myth

“Some folks still don’t think I spend enough time with Congress. ‘Why don’t you get a drink with Mitch McConnell?’ they ask. Really? Why don’t you get a drink with Mitch McConnell?”

That's probably my favorite joke from President Obama's Correspondents Dinner routine, because it exposes the silliness of some of the criticism he's taken for his alleged lack of leadership. If only he'd spend more time schmoozing with members of Congress, then he might be able to get more legislation passed. Right. Mitch McConnell and John Boehner would just love to help the president with his agenda if only Obama would spend more time with them.

This kind of critique is all too common, because it is all too easy. Truly understanding the power dynamics of the House and the Senate, as well as the Republican and Democratic parties, is hard work. It is so much easier to just say the president hasn't led, and pretend there is some easy and obvious solution to the problem; say, schmoozing.

This is, of course, nothing new. I've been working for the last few weeks on the historical literature on the period before Pearl Harbor, and there are plenty of historians who level the same charge of lack of leadership at FDR. One of the more egregious examples comes from Stephen Ambrose. Charging that FDR's pre-Pearl Harbor policy was a "dismal failure," he slams Roosevelt for failing to convince the American people to go to war with Germany before the Japanese attack.

Ambrose notes that polls showed the public was against declaring war up until Dec. 7, 1941. "What the polls could not measure," Ambrose writes, "was how attitudes might have changed had presidential leadership been stiffer." Leaving aside the rather important question of whether or not FDR actually wanted all-out war with Germany at that point (and I am not convinced he did), Ambrose never says what "stiffer" leadership could have been.

Since the war in Europe began, FDR had consistently and persistently argued that a German victory in the war would be a disaster for the United States. He urged greater and greater assistance for Britain as its needs grew in order to prevent that outcome. No one who reads what FDR said over those two years can have any doubt that he made the strongest possible argument against Nazi Germany. Yet to Ambrose, somehow or other, FDR's leadership should have been "stiffer." There was something else he could have said that magically would have convinced the public to support a declaration of war on Hitler, even though Hitler had not declared war on the United States.

The flip side of this kind of thinking is similarly to attribute whatever political success a president enjoys to leadership--usually in the form of giving a speech. That was on full display last week in the New York Times on the 50th anniversary of JFK's civil rights speech. In a news piece entitled "When Presidential Words Led to Swift Action," Adam Clymer writes: "These days it is hard to imagine a single presidential speech changing history," but the civil rights speech is now seen as a "critical turning point." In an op-ed piece entitled "Kennedy’s Finest Moment," Peniel E. Joseph says it "might have been the single most important day in civil rights history."

Now, I think the speech is important--I always talk about it when teaching about the civil rights movement. But "most important day"? Or "critical turning point"? Hardly.

Clymer says the speech "led quickly and directly to important changes." In fact, the speech did neither.

The Civil Rights Act, which JFK proposed that night, was passed more than a year later (MLK's March on Washington two months after JFK's speech was in response to the lack of quick movement in Congress). That legislation's passage was due not directly to the speech, but to the efforts of LBJ (who was aided greatly by the fact that JFK's death allowed him to portray the legislation as a tribute to the martyred president) and to bipartisan support in Congress.

More importantly, what both pieces neglect is that the speech is not a good example of presidential leadership (at least not in the sense that they mean it). By the time the Civil Rights Act became law, ten years had passed since the Supreme Court's Brown v. Board of Education case stated that separate is inherently unequal. That decade was marked primarily not by presidential leadership, but by grassroots organizing and protest. Before giving that speech, JFK had spent two and a half years riding the fence on civil rights, while sit-ins and freedom rides and demonstrations relentlessly pushed the issue into the public consciousness. What forced JFK at last to take sides was the movement, specifically recent events in Birmingham. Whatever you call that, it wasn't "leadership."

Presidents do not make history on their own, however much some people like to pretend they do. FDR famously once told a group of supporters that he agreed with what they wanted him to do. "Now make me do it," he added.

That's how democracy works. Leaders have no magic wands that they can wave. The people have to demand that it get done, and create the political pressure. In June 1940, Eleanor Roosevelt met with some supporters who were frustrated with the lack of progressive reform in recent years. She said to them:
If you believe in democracy, you have got to work for the majority and have got to be willing to wait for it. I have done organization work practically all my life and I know that until you organize a thing down to the precincts and get a real demand from there up, there is not that majority demand for the thing and you cannot get it. You come away with the feeling that the President is willing to lead, but never too far in advance of public opinion, because that is the way things work in a democracy as understood by a politician and a democrat.
The people who don't understand that are often neither politicians nor democrats, but ideologues who are so utterly convinced of the virtue of their positions that they cannot imagine that anything other than incompetence or malice could be standing in the way of achieving their self-evidently desirable goals. They want the leader to simply do it, already!

This idea of "great person" leadership is not only mistaken, it is profoundly undemocratic. When critics bellyache about the lack of leadership, they are often betraying an authoritarian impulse to impose their own views on others, without doing the hard work of convincing others that it is the right way to go. You've got to be willing to wait for it. As frustrating as that work can be, it certainly beats dictatorship by the ideologically certain.

Yes, presidents matter. Speeches matter. Stating where one stands matters. But mostly, change comes when people decide that change matters.

Monday, May 27, 2013

Three-Fifths Understanding

The infamous three-fifths clause is back in the news.

The Twittersphere exploded last Wednesday when the GOP nominee for Lt. Governor of Virginia, E. W. Jackson, made this comment:

“Rev. [Charles Wallace] Smith must not have understood the 3/5ths clause was an anti-slavery amendment. Its purpose was to limit the voting power of slave-holding states.”

Jackson has made all sorts of interesting comments, so it is not surprising that some people put the worst possible interpretation on his remarks, suggesting that he was defending the clause. But Jackson was not entirely wrong.

The clause is probably both one of the most well-known and most misunderstood clauses in the Constitution:
Representatives and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers, which shall be determined by adding to the whole Number of free Persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three fifths of all other Persons.
In the popular mind, the clause has come to mean that the Constitution considered a slave to be three-fifths of a person, and in a metaphorical sense, that is true. That is not, however, what motivated the clause.

What was actually at issue here was how to deal with the reality that the new government would be a union of states, some of which had done away with slavery (or were in the process of doing so; most northern states did away with slavery gradually) and others which had large numbers of slaves. Since the new House of Representatives was to apportion representatives based on population, the framers needed to decide whether the slave population would be taken into account.

From a modern perspective, it seems simple: just add up the total number of people (or "Persons" in the Constitution's language). In this case (as in so many others for the next six decades), slavery took what should have been an easy question and made it a complicated one. In a legal sense, slaves were property, not "Persons." So northerners argued that they should not be counted for representation at all.

That's right: the anti-slavery portion of the country argued that slaves should not be three-fifths of a person, but zero-fifths of a person. To add to the absurdity, southerners--who otherwise insisted that slaves were not "Persons" any more than horses were--argued that in this instance, and this instance alone, slaves should be counted as whole Persons, equal to every white man, woman, and child.

Each section adopted the convenient argument which would enhance its political power. To get past the impasse, they compromised on three-fifths. So compared to the southern position, Jackson is correct: three-fifths is relatively anti-slavery in that it gave slave states less power in the House than they would have had if slaves were counted as whole Persons. Compared to the northern position, however, three-fifths can only be seen as pro-slavery, since it allowed white southerners to count 60% of their slaves as both Persons and property.

The important question, however, is not whether the three-fifths clause was pro- or anti-slavery. What matters is what this story tells us about the nature of compromise in the American political system.

How do we evaluate such a compromise? From today's perspective, most of us are understandably appalled, and shudder when someone fails to be.

For example, three months ago, James Wagner, the president of Emory, stepped into it. He called "the agreement to count three-fifths of the slave population for purposes of state representation in Congress" an example of "constitutional compromise" which he praised as one of the "[p]ragmatic half-victories [that] kept in view the higher aspiration of drawing the country more closely together."

He was pilloried for those comments, and was forced to issue this statement: "I do not consider slavery anything but heinous, repulsive, repugnant, and inhuman."

Wagner's mistake was not in praising compromise in general, but in holding up this compromise as an example to emulate.

No one, north or south, looks good in this story. No one. The practical business of creating a framework for government brought the Constitutional Convention face-to-face with a fundamentally moral question, one whose very existence exposed the essential, inescapable immorality of slavery: is a slave a Person or property?

Faced with a stark moral choice, the framers side-stepped it and reframed it as merely a practical matter. They split the difference. They compromised.

Today, we rightly recoil from the immorality of this compromise. But this instance should not deceive us into rejecting all compromise.

The American political system is built for compromise. It breaks down when faced with moral questions that do not lend themselves to compromise solutions. That is why the federal government today is in danger of becoming utterly dysfunctional: the GOP has seemingly decided that virtually all compromise is ignoble surrender.

There are times when compromise truly is ignoble, and the three-fifths clause seems like one of them. Today we rightly recoil at the idea that slaves were considered three-fifths of a Person.

But given the two proposals, which one would you choose? Would the north's zero-fifths option have been preferable moral ground? Would the south's blatantly hypocritical stand--that for purposes of political power, slaves were whole Persons, but under state law they were mere property--have been morally satisfying?

No, there was no good moral ground on which to stand. So pervasive was the moral corruption caused by the evil of slavery that it left no good political choices. Everything was tainted by it. The only truly moral choice would have been to abolish it completely and immediately, and few white American leaders took that idea seriously in the late 1780s. And the nation made that choice only after the political system broke down completely and the country descended into civil war.

Most of the decisions government makes, however, don't involve basic moral principles. They engage practical questions, ones where splitting the difference makes good sense. Ideologues who insist on treating every disagreement as one over principles threaten to disrupt the political system and render it as useless on all questions as it once proved on the issue of slavery.

Perhaps the distinguishing characteristic of the Tea Party movement is this reflex to raise every decision to that moral level. The demagogue du jour, Sen. Ted Cruz of Texas, does this as well as anyone. Recently, he has refused to let Senate and House conferees meet to decide on a budget because he is convinced it will lead to a deal to raise the debt limit. Cruz casts this (and virtually every legislative issue) as a matter of "fighting to defend liberty, ... fighting to defend the Constitution."

This is absurd. There is no moral (or immoral) answer to whether (or how much) to raise the debt ceiling.

Political maturity requires knowing the difference between matters of principle and matters of utility. By casting nearly everything as the former rather than the latter, the Tea Party members of Congress show that they lack the most basic judgment needed to govern effectively. And as long as saner members of Congress appease them, they risk making the national legislature utterly unworkable.

Thursday, May 23, 2013

"Read the History"

Last week, a reporter asked President Obama the inevitable question: was his administration like that of Richard Nixon? That the question was so predictable made it no less absurd. "I’ll let you guys engage in those comparisons," Obama replied. "And you can go ahead and read the history, I think, and draw your own conclusions."

Of course a historian will like his advice: "read the history." The president's problem is that people do not know the history of Watergate and are unlikely to read it. (For those who are interested, Elizabeth Drew has a nice summary of Nixon's misdeeds.)

Since the early 1970s, almost every case of any kind of misbehavior in the political realm gets saddled with the Watergate comparison, and just about every comparison to Nixon and Watergate ends up being silly (for example, the case of Wisconsin Governor Scott Walker's phony Koch brothers tape as I noted here).

Why do we repeatedly do this? The problem, as I see it, is in the name: "Watergate." By assigning that single word to the Nixon administration's many crimes, we reduce them to something small and petty. They were anything but that, but most of us don't know it.

For Americans born after the 1960s, the term "Watergate" is likely little more than a synonym for political scandal. Even those (like me) with some memory of the scandal are unlikely to remember more than a few basic facts: a group of thugs associated with the Nixon re-election campaign stupidly broke into the Democratic headquarters and got caught; Nixon panicked and tried to cover it up; and he was done in by the foolish effort to cover up an incident that, in the end, was really no big deal.

In a perfect example of that flawed conventional wisdom, Newt Gingrich, when asked about the president's recent troubles, said: "It's always the cover-up that kills you."

That's simply wrong, but you hear it whenever there is a new Washington "scandal."

What it neglects is this: Nixon had plenty he needed to cover up. I've never thought that Nixon ordered the break-in at the Watergate hotel, and to this day no definitive evidence shows that he did. The important fact, however, is that that does not matter. Nixon's motive for the cover-up was that the men involved in the break-in were involved in plenty of other nefarious activities, ones that Nixon did know about and often did order. The burglars answered to people close to Nixon who had direct knowledge of the president's involvement in numerous criminal activities.

Nixon's plan was to throw the burglars under the bus, but bribe them to keep quiet. It didn't work. They talked, and then others began to talk too--most notably Nixon's lawyer, John Dean. (Dean is on record saying that nothing going on today approaches the seriousness of the Nixon years.) That led to the revelations of widespread abuse of power that finally forced Nixon from power.

The Nixon of the common understanding of Watergate was stupid, or crazy, or both. I reject that interpretation. Nixon was many things, but stupid is not one of them. As the pressures of the Watergate scandal grew, one can make the case that he may have become mentally unhinged. But the actions that forced him from office were done when Nixon was in full command of his faculties. They reflected who he was. One's actions usually do.

Perhaps we will find that a secretly Nixonian Obama gave orders that the IRS screw anyone associated with the Tea Party (if so, they did a terrible job of it--none of the organizations subjected to extra scrutiny were denied tax-exempt status). Maybe we'll learn that he became so obsessed with leaks that he suggested poisoning reporters, as Nixon suggested should be done to Jack Anderson.

But it is far more likely that we'll find that this brouhaha has been just another uninformed use of the Watergate analogy. If we read the history.

Monday, May 20, 2013

Political Targeting, Unconscionable Delays, Harassing Questions

Have you heard about the scandal in Washington? Turns out that people are being targeted for their political beliefs. People in positions of authority are abusing their power. For no good reason (other than political animus) the powerful are imposing unconscionable delays on simple requests. Applicants are being forced to answer an absurd number of questions by out-of-control, power-drunk people in government.

But I'm not talking about the IRS. I'm talking about Republican Senators.

Recently, the Senate finally confirmed the nominee to head the Privacy and Civil Liberties Oversight Board.

It took them 510 days to do it.

President Obama has nominated Gina McCarthy to head the EPA. According to a New York Times report, Republican Senators have submitted 1,100 questions for her to answer. Treasury Secretary Jack Lew received 395 questions from Republicans (compared to 49 questions from Democrats to George W. Bush's Treasury nominee).

The committee considering Ms. McCarthy's nomination has been unable to vote on it because no Republican member of the committee will show up for a meeting, denying it a quorum.

The IRS officials who used search terms to identify conservative organizations for extra scrutiny were obviously wrong to do so. But it is more than a little ironic to hear Republicans loudly denounce as a scandalous abuse of power harassment tactics that they regularly use to deny the president his nominees.

This is not "advise and consent" by any reasonable definition of that phrase. It is pure obstructionism.