AL JAZEERA COLUMN: Too Soon To Tell

I am pleased to announce that I am now writing a weekly long-form column for Al Jazeera English. Here is my second piece for Al Jazeera:

One Year Early, Obama’s Reelection Far From Certain

The American punditocracy (and, perhaps more importantly, Las Vegas oddsmakers) currently cite Barack Obama as their odds-on favorite to win next year’s presidential election. Some even predict a landslide.

Mainstream media politicos acknowledge the atrocious economy, with its real unemployment rate nearly matching the worst years of the Great Depression of the 1930s, as an obstacle to reelection. But most of them believe that other factors will prove decisive: disarray in the field of candidates for the nomination of the opposition Republican Party, the GOP’s reliance on discredited Reagan-style austerity measures for the masses coupled with tax cuts for the wealthy, and Obama’s assassination of Osama bin Laden.

Maybe they’re right. But if I were the President, I wouldn’t be offering the White House chef a contract renewal any time soon. Count me among the majority of Americans (54 to 44 percent) who told a March 2011 CNN/Opinion Research poll they think Obama will lose the 2012 election.

I could be wrong.

Scott Keeter, director of survey research at the Pew Research Center, doesn’t think much of these so-called “trial-run” polls. “A review of polls conducted in the first quarter of the year preceding the election found many of them forecasting the wrong winner—often by substantial margins,” Keeter wrote in 2007, citing three elections as far back as 1968.

However, a historical analysis of more recent presidential races, those of the past two decades, reveals an even wider gap between early predictions and eventual outcomes. The year before a U.S. presidential election, the conventional wisdom is almost always wrong. The early favorite at this point on the calendar usually loses. So betting against the pundits—in this case, against Obama—is the safe bet.

The meta question is: what difference does it make who wins next year? In practical terms, not much.

For one thing, American presidents tend to find more heartbreak than political success during their second terms. Had Richard Nixon retired after his first term, for example, he would have been fondly remembered as the architect of the Paris peace talks that ended the Vietnam War, the founder of the Environmental Protection Agency, and the defender of the working and middle class (for imposing wage and price controls to soften the effect of inflation). Instead, his second term saw him sinking into, and ultimately succumbing to, the morass of the Watergate scandal.

The next second-termer, Ronald Reagan, was similarly preoccupied by scandal, in his case the Iran-Contra imbroglio, in which the United States traded arms to Iran in return for the release of hostages held by Iranian-allied militants in Lebanon and illegally funded right-wing death squads in Central America. Bill Clinton’s last four years were overshadowed by his romance with intern Monica Lewinsky and by the consequences of its revelation. George W. Bush’s second term, from 2005 to 2009, was defined by his administration’s inept response to Hurricane Katrina in New Orleans, the deteriorating security situation in U.S.-occupied Afghanistan and Iraq, and the economic collapse that began in 2008. His number-one political priority, privatizing the U.S. Social Security system, never got off the ground.

Presidents rarely accomplish much of significance during their second term. So why do they bother to run again? Good question. Whether it’s ego—1600 Pennsylvania Avenue is one hell of an address—or something else, I don’t know. Whatever the reason, I have long maintained that a sane president would think of himself as standing for one four-year term, then announce his intention not to run again at the last possible moment.

From the standpoint of the American people and the citizens of countries directly affected by U.S. foreign policy, it is unlikely that the basic nature of the beast will change much regardless of Obama’s fortunes in the next election. One only has to consider the subtle “differences” between the tenures of Presidents Bush and Obama.

On the domestic front Obama continued and expanded upon Bush’s non-reaction to the economic crisis, exploiting the panic created by widespread unemployment, the bursting of the housing bubble and a massive foreclosure crisis that put millions of Americans out of their homes, in order to pour hundreds of billions of federal dollars into the pockets of the top executives of the nation’s largest banks, with no resulting stimulus effect whatsoever. Controversial attacks on privacy rights and civil liberties inaugurated during the Bush years were expanded and extended: the USA PATRIOT Act, the National Security Agency “domestic surveillance” program that allowed the government to spy on U.S. citizens’ phone calls, emails and other communications. Obama even formalized Bush’s assertion that the president has the right to unilaterally order the assassination of anyone, including a U.S. citizen, without evidence or proof that he or she has committed a crime.

As promised during the 2008 campaign, Obama expanded the U.S. war against Afghanistan, transforming what Bush described as a short-term attempt to find Osama bin Laden after 9/11 into the most protracted military conflict in the history of the United States. The war continued in Iraq, albeit with “combat” troops redefined as “trainers.” During the last few years, the “global war on terror” expanded into Pakistan, east Africa, Libya and Yemen. Drone attacks escalated. Violating his campaign promises, he continued to keep torture available as a legal option—indeed, he ordered it against a U.S. soldier, Private First Class Bradley Manning—and kept Guantánamo and other Bush-era concentration camps open.

If Obama goes down to defeat next year, then, the results should be viewed less as a shift in overall U.S. policy—hegemonic, imperialistic, increasingly authoritarian—than as something symbolic. An Obama defeat would reflect the anger of ordinary Americans caught in the “two-party trap,” flailing back and forth between the Dems and the Reps, voting against the party in power to express their impotent rage, particularly at the economy. Mr. Hopey-Changey’s trip back to Chicago would mark the end of a brief, giddy moment of reformism.

The argument that an overextended, indebted empire can be repaired via internal changes of personnel would be dead. With the reformism that Obama embodied no longer politically viable, American voters would be once again faced, as are the citizens of other repressive states, with the choice between sullen apathy and revolution.

Obamaism is currently believed to be unstoppable. If history serves as an accurate predictor, that belief is good cause to predict its defeat next November.

During the late spring and early summer of 1991, just over a year before the 1992 election, President George H.W. Bush was soaring in the polls in the aftermath of the Persian Gulf War, which the American media portrayed as a quick, internationalist success that cost the lives of few American soldiers. A March 1991 CBS poll gave him an 88 percent approval rating—a record high.

By October 1991 Bush was heavily favored to win. A Pew Research poll found that 78 percent of Democratic voters thought Bush would defeat any Democratic nominee. New York governor Mario Cuomo, an eloquent, charismatic liberal star of the party, sized up 1992 as unwinnable and decided not to run.

When the votes were counted, however, Democrat Bill Clinton defeated Bush, 43 to 37.5 percent. Although Republicans blamed insurgent billionaire Ross Perot’s independent candidacy for siphoning away votes from Bush, subsequent analyses do not bear this out. In fact, Perot’s appeal had been bipartisan, attracting liberals opposed to the North American Free Trade Agreement (NAFTA) between the U.S., Canada and Mexico and globalization in general, as well as conservative deficit hawks.

The most credible explanation for Bush’s defeat was handwritten on a sign that James Carville, the victorious Bill Clinton’s chief strategist, famously taped to the wall of the Dems’ war room: “It’s the economy, stupid.” As the early-1990s recession deepened, Bush’s ratings tumbled to around 30 percent. A February 1992 incident, in which Bush was depicted by The New York Times as wearing “a look of wonder” when confronted with a supermarket price scanning machine, solidified his reputation with voters as patrician, out of touch, and unwilling to act to stimulate the economy or alleviate the suffering of the under- and unemployed. Exit polls, considered exceptionally reliable because they query voters moments after they leave their polling places, showed that 75 percent of Americans thought the economy was “bad” or “very bad.”

In 1995, Bill Clinton was preparing his reelection bid. On the Republican side, Kansas senator and 1976 vice presidential candidate Bob Dole was expected to (and did) win his party’s nomination. Perot ran again, but suffered from a media blackout; newspapers and broadcast outlets had lost interest in him after a bizarre meltdown during the 1992 race in which he accused unnamed conspirators of plotting to violently disrupt his daughter’s wedding. He received eight percent in 1996.

Clinton trounced Dole, 49 to 40 percent. In 1995, however, that outcome was anything but certain. Bill Clinton had been severely wounded by a series of missteps during his first two years in office. His first major policy proposal, to allow gays and lesbians to serve openly in the U.S. military, was so unpopular that he was forced to water it down into the current “Don’t Ask, Don’t Tell” compromise. Clinton’s 1993 attempt to overhaul the healthcare system, mocked as HillaryCare after he put his wife in charge of the effort, went down to defeat. He signed the pro-corporate, Republican-backed trade agreement, NAFTA, alienating his party’s liberal and progressive base. Low voter turnout by the American left in the 1994 midterm elections led to the “Republican Revolution,” a historic sweep of both houses of the American Congress by right-wing conservatives who installed the fiery Newt Gingrich as Speaker of the House.

1995 saw the so-called “co-presidency” between Gingrich and a cowed Bill Clinton, who was reduced to telling a press conference that “the president is relevant.” The United States, which does not have a European-style parliamentary system, had never seen a president so politically weak while remaining in office.

During the spring and summer of 1995 Bob Dole was already the heir apparent to the nomination of a Republican Party that traditionally rewards those who wait their turn. Dole was a seasoned campaigner, a Plains States centrist with a gentlemanly demeanor and credentials as a hero of World War II. Conventional wisdom had him beating Clinton. So did the polls. A March 1995 Los Angeles Times poll had Dole defeating Clinton, 52 to 44 percent, in a head-to-head match-up. “Among all voters, Clinton’s generic reelect remains dismal, with 40 percent inclined to vote him in again and 53% tilting or definitely planning a vote against him,” reported the Times.

By late autumn, however, the polls had flipped. Though statisticians differ about how big a factor it was, the late-1995 shutdown of the federal government, blamed on the refusal of Gingrich’s hardline Republicans to approve the budget, turned the tide. At the end of the year the die was cast. As Americans began to pay more attention to his challenger they recoiled at Dole’s age—if elected, he would have been the oldest man ever elected to a first presidential term, older even than Reagan—as it contrasted with Clinton’s youthful vigor. The Democrat coasted to reelection. But that’s not how things looked at this stage in the game.

When analyzing the 2000 race, remember that Republican George W. Bush lost the popular vote to Al Gore and won the White House only because of a bizarre quirk of the American system, the Electoral College. The popular vote in each of the 50 states determines which candidate receives that state’s electors; the candidate who wins a majority of the electoral votes becomes president.

Most of the time, the same candidate wins the national popular vote and the Electoral College tally. About 2000 there is no dispute: Democrat Al Gore won the popular vote, 48.4 to 47.9 percent. There was a legal dispute over the 25 electoral votes cast by the state of Florida; ultimately the U.S. Supreme Court decided, along party lines, to award the state to Bush despite indications that Gore would have prevailed in a full statewide recount.

Regardless of one’s views of the 2000 Florida recount controversy, from a predictive standpoint one should count Gore as the winner, since no one could have anticipated a split between the popular vote and the Electoral College.

Under normal circumstances Gore should have faced, to borrow Pentagon adviser Kenneth Adelman’s infamous prediction about the Iraq invasion, a cakewalk. A popular sitting vice president, he enjoyed the trappings of incumbency and a reputation as a thoughtful environmentalist and government efficiency expert. The economy was booming—always a good argument for the “don’t change horses in midstream” sales pitch. The early favorite on the Republican side, George W. Bush, was considered an intellectual lightweight who would get eaten alive the first time the two met in a presidential debate. But Monicagate had wounded Bill Clinton to the extent that Gore made a fateful decision to disassociate himself from the president who had chosen him as his running mate.

A January 1999 CNN poll had Bush over Gore, 49 to 46 percent. By June 2000 the same poll had barely budged: now it was 49 to 45 percent. “The results indicate that the public is far more likely to view Texas Governor George W. Bush as a strong and decisive leader, and is also more confident in Bush’s ability to handle an international crisis—a worrisome finding for a vice president with eight years of international policy experience,” analyzed CNN in one of the most frightening summaries of the American people’s poor judgment ever recorded.

Gore didn’t become president. But he won the 2000 election. Once again, the media was wrong.

In the 2004 election, it was my turn to screw up. Howard Dean, the combative liberal darling and former Vermont governor, was heavily favored to win the Democratic nomination and face incumbent George W. Bush. I was so convinced of his inevitability after the early primaries, and of the importance of unifying the Democratic Party behind a man who could defeat Bush, that I authored a column I wish I could chuck down the memory hole calling for the party to suspend its remaining primaries and back Dean. Instead, John Kerry won the 2004 nomination.

Oops.

But I wasn’t alone. Polls and pundits agreed that George W. Bush, deeply embarrassed by the failure to find weapons of mass destruction in Iraq, would lose to Kerry, a Democrat with a rare combination of credentials: he was a bona fide war hero during the Vietnam War and a noted opponent of the war after his service there.

Bush trounced Kerry. “How can 59,054,087 people be so DUMB?” asked Britain’s Daily Mirror. Good question. Maybe that’s why no one saw it coming.

Which brings us to the most recent presidential election. First, the pundit class was wrong about the likely Democratic nominee. Former First Lady and New York Senator Hillary Rodham Clinton, everyone “knew,” would win. It wasn’t even close. An August 2007 Gallup/USA Today poll had Clinton ahead of Obama, 48 to 26 percent. As it turned out, many Democratic primary voters were wowed by Obama’s charisma and annoyed by Clinton’s refusal to apologize for her brazenly cynical 2002 vote in favor of the Iraq war. Aging Arizona Senator John McCain, on the other hand, remained the consistent favorite on the Republican side.

Obama’s advantages over McCain became clear by 2008. “The political landscape overwhelmingly favors Obama,” reported USA Today in June of that year. But at this point in 2007?

He didn’t stand a chance.

Ted Rall is an American political cartoonist, columnist and author. His most recent book is The Anti-American Manifesto. His website is rall.com.

SYNDICATED COLUMN: Stamped Out

The Statue of Liberty Stamp Error and the End of America

It may seem like a minor thing. Objectively it is a minor thing. But the Great Statue of Liberty Stamp Screw-up of 2011 presents a picture-perfect portrait of a society in the midst of collapse.

You can tell a lot about the state of a country from its stamps and its currency. At a nation’s peak its graphic iconography tends to be striking, elegant and original. As it begins to wane abstraction gives way to self-caricature, innovative design to self-parody, high art to kitsch.

Look at U.S. stamps and paper money from 100 or 50 or even 30 years ago and you’ll see my point. Quarters were once 90 percent silver; now they’re mystery metal (a copper-nickel alloy).

America: we’re not what we used to be.

A century ago President Theodore Roosevelt commissioned the famous Beaux-Arts sculptor Augustus Saint-Gaudens to redesign the nation’s coinage. Among the results was Saint-Gaudens’ breathtaking $20 gold “double eagle”; numismatists consider it one of the most elegant coins of the 20th century.

How the mighty have fallen! According to U.S. Mint officials, recent revamps of the $100, $50, $20, $10 and $5 bills were undertaken without the slightest consideration for aesthetics. They didn’t even consult an art director. Stymieing counterfeiters was the sole concern.

Now the U.S. Postal Service has issued its newest first-class “forever” stamp. As the most widely used denomination, a new forever stamp is a big deal.

The new stamp features a photo of the head of the Statue of Liberty. Well…not exactly. Instead of the Statue of Liberty paid for by coins donated by French schoolchildren, the proud iconic figure which has greeted millions of immigrants to New York, the stamp bears the visage of the small replica which stands in front of the New York-New York casino in Las Vegas.

Mistakes happen. As every philatelist knows, another error—the 1918 “Inverted Jenny,” which features an image of an upside-down airplane—is one of the most prized collectibles in philately because Post Office officials destroyed all but a single sheet of 100 stamps.

That’s the usual response to a catastrophe in stampdom. In 1999 the Postal Service recalled and destroyed the entire run of a stamp that wrongly placed the Grand Canyon in Colorado.

But that was before the economic collapse that began in 2008. The Postal Service is broke. Quality standards? Can’t afford them. Incredibly, postal officials are allowing this monstrosity, this bastard creation, this artistic obscenity—the face is clearly the wrong one—to be sold at your local post office.

“We still love the stamp design and would have selected this photograph anyway,” USPS spokesman Roy Betts told The New York Times.

Uh-huh.

“The [postal] service selected the image from a photography service, and issued rolls of the stamp bearing the image in December,” reported the newspaper. “This month, it issued a sheet of 18 Lady Liberty and flag stamps. Information accompanying the original release of the stamp included a bit of history on the real Statue of Liberty. Las Vegas was never mentioned.”

It’s bad enough that they use photographs. Stamps should be engraved. Engraved stamps look classier and more substantial.

But whether they are using an engraver, illustrator or photographer, a U.S. stamp ought to be a big gig. For an assignment such as this I would expect the USPS to hire a professional and pay huge money—six-figures huge. It’s a stamp, for Chrissake.

Nope. The U.S. Postal Service buys stock photos. For stamp design. That’s right—the same cheesy clipart you can download for your kid’s birthday party invitation.

Insane.

In and of itself, this is no big deal. These are lean times. Austerity abounds. Why not save a few bucks?

It matters because symbolism matters. The kind of country that puts stock photos on its stamps is the kind of country that puts a single air traffic controller in charge of one of its biggest airports. The kind of country that doesn’t fix its mistakes is the kind that tells people under the age of 55 that they can go to hell and die when they get old and sick because it’s more important to cut taxes for rich scum than to fund Medicare.

As for the symbolism of a phony Statue of Liberty that stands in front of a casino in the nation’s gambling capital—well, that’s obvious.

It would be fine if the money being saved by printing crappy stamps went to new textbooks in inner-city schools. But it doesn’t. It goes to Halliburton and Bill Gates. Now that American workers have been hung out to dry, robbed and fleeced, wrung out and burned out, the government and its associated agencies (the USPS is quasi-governmental) have turned on themselves in service to the 21st century robber barons.

Don’t get mad about the stamps. Get mad at what they mean.

(Ted Rall is the author of “The Anti-American Manifesto.” His website is tedrall.com.)

COPYRIGHT 2011 TED RALL
