After a devastating fire laid waste to much of the Cathedral of Notre Dame in Paris, three spectacularly wealthy French industrial families pledged 700 million euros to rebuild the iconic structure. It was a generous gesture. But it was disconcerting that purse strings would open so quickly to repair a damaged building while so many actual living, breathing human beings in France were suffering; the so-called “yellow vest movement” had been rioting in the streets of Paris just a few weeks earlier.
The Charlie Hebdo massacre couldn’t have happened here in the United States. But it’s not because American newspapers have better security.
Gunmen could never kill four political cartoonists in an American newspaper office because no paper in the U.S. employs two, much less four, staff political cartoonists — the number who died Wednesday in Paris. There is no equivalent of Charlie Hebdo, which puts political cartoons front and center, in the States. (The Onion never published political cartoons — and it ceased print publication last year. MAD, for which I draw, focuses on popular culture.)
When I began drawing political cartoons professionally in the early 1990s, hundreds of my colleagues worked on staff at newspapers, with full salaries and benefits. That was already down from journalism’s mid-century glory days, when there were thousands. Many papers employed two. Shortly after World War II, The New York Times, which today has none, employed four cartoonists on staff. Today there are fewer than 30.
Most American states have zero full-time staff political cartoonists.
Many big states — California, New York, Texas, Illinois — have one.
No American political magazine, on the left, center or right, has one.
No American political website (Huffington Post, Talking Points Memo, Daily Kos, Slate, Salon, etc.) employs a political cartoonist. Although its launch video was done in cartoons, eBay billionaire Pierre Omidyar’s new $250 million left-wing start-up First Look Media refuses to hire political cartoonists — or pay tiny fees to reprint syndicated ones.
These outfits have tons of staff writers.
During the last few days, many journalists and editors have spread the “Je Suis Charlie” meme through social media in order to express “solidarity” with the victims of Charlie Hebdo, political cartoonists (who routinely receive death threats, whether they live in France or the United States) and freedom of expression. That’s nice.
No it’s not.
As far as political cartoonists are concerned, editorials pledging “solidarity” with the Charlie Hebdo cartoonists are an empty gesture — corporate slacktivism. Less than 24 hours after the shootings at Charlie Hebdo, the Fort Lauderdale Sun-Sentinel fired its long-time, award-winning political cartoonist, Chan Lowe.
Political cartoonists: editors love us when we’re dead. While we’re still breathing, they’re laying us off, slashing our rates, stealing our copyrights and disappearing us from where we used to appear — killing our art form.
American editors and publishers have never been as willing to publish satire, whether in pictures or in words, as their European counterparts. But things have gone from bad to apocalyptic in the last 30 years.
Humor columnists like the late Art Buchwald earned millions syndicating their jokes about politicians and current events to American newspapers through the 1970s and 1980s. Miami Herald humor writer Dave Barry was a rock star through the 1990s, routinely cranking out bestselling books. Then came 9/11.
When I began working as an executive talent scout for the United Media syndicate in 2006, my sales staff informed me that, if Barry had started out then, they wouldn’t have been able to sell him to a single newspaper, magazine or website — not even if they gave his work to them for free. Barry was still funny, but there was no market for satire anywhere in American media.
That’s even truer today.
The youngest working political cartoonist in the United States, Matt Bors, is 31. When people ask me who the next up-and-comer is, I tell them there isn’t one — and there won’t be one any time soon.
Americans are funny. Americans like funny. They especially like wicked funny. We’re so desperate for funny that we think Jon Stewart is hilarious. (But…Richard Pryor. He really was.) But editors and producers won’t give them funny, much less mean-funny.
Like any other disaster, media censorship of satire — especially graphic satire — in the U.S. is caused by several contributing factors.
Most media outlets are owned by corporations, not individual proprietors. Publicly traded companies are risk-averse. Executives prefer to publish boring, safe content that won’t generate complaints from advertisers or shareholders, much less force them to hire extra security guards.
Half a century ago, many editors had working-class backgrounds and rose through the ranks from the bottom. Now they’re graduates of pricey graduate journalism programs that don’t offer scholarships — and don’t teach a single class about comics, cartoons, humor or graphic art. It takes an unusually curious editor to make the effort to educate himself or herself about political cartoons.
Corporate journalism executives view cartoons as frivolous, less serious than “real” commentary like columns or editorials. Unfortunately, some editorial cartoonists make this problem worse by drawing silly gags about current events (as opposed to trenchant attacks on the powers that be) because they’ve seen their blandest work win Pulitzers and coveted spots in the major weekend cartoon “round-ups.” When asked to cut their budget, editors often look at their cartoonist first.
There is still powerful political cartooning online. Ironically, the Internet contributes to the death of satire in America by sating the demand for hard-hitting political art. Before the Web, if a paper canceled my cartoons they would receive angry letters from my fans. Now my readers find me online — but the Internet pays pennies on the print dollar. I’m stubbornly hanging on, but many talented cartoonists, especially the young, won’t work for free.
It’s not that media organizations are broke. Far from it. Many are profitable. American newspapers and magazines employ tens of thousands of writers — they just don’t want anyone writing or drawing anything that questions the status quo, especially not in a form as powerful as political cartooning.
The next time you hear editors pretending to stand up for freedom of expression, ask them if they employ a cartoonist.
(Ted Rall, syndicated writer and cartoonist for The Los Angeles Times, is the author of the new critically-acclaimed book “After We Kill You, We Will Welcome You Back As Honored Guests: Unembedded in Afghanistan.” Subscribe to Ted Rall at Beacon.)
COPYRIGHT 2015 TED RALL, DISTRIBUTED BY CREATORS.COM
Originally published by The Los Angeles Times:
An event like yesterday’s slaughter at the Paris office of the newspaper Charlie Hebdo, which killed at least 10 staff members, including four political cartoonists, as well as two policemen, elicits so many responses that it’s hard to sort them out.
If you have a personal connection, that comes first.
I met a group of Charlie Hebdo cartoonists, including one of the victims, a few years ago at the annual cartoon festival in Angoulême, France, the biggest gathering of cartoonists and their fans in the world. They had sought me out, partly as fans of my work (for whatever reason, my stuff seems to travel well overseas) and partly because I was an American cartoonist who speaks French. We did what cartoonists do: we got drunk, complained about our editors and exchanged trade secrets, including pay rates.
If I lived in France, that’s where I’d want to work.
My French counterparts struck me as more self-confident and cockier than the average cartoonist. Unlike at the older, venerable Le Canard enchaîné, cartoons, not prose, are the centerpiece of Charlie Hebdo. The paper has suffered financial troubles over the years, yet somehow the French have kept it afloat because they love comics.
Here’s how much France values graphic satire:
- More full-time staff political cartoonists were killed in Paris yesterday than are employed at newspapers in the states of California, Texas and New York combined.
- More full-time staff cartoonists were killed in Paris yesterday than work at all American magazines and websites combined.
The Charlie Hebdo artists knew they were working at a place that not only allowed them to push the envelope, but encouraged it. Hell, they didn’t even tone things down after their office got firebombed.
They weren’t paid much, but they were having fun. The last time that I met print journalists as punk rock as those guys, they were at the old Spy magazine.
They would definitely want that attitude to outlive them.
Next comes the “there but for the grace of God” reaction.
Every political cartoonist receives threats. After 9/11 especially, people promised to blow me up with a bomb, slit the throats of every member of my family, rape me, and deprive me of a livelihood by organizing sketchy boycott campaigns. (That last one almost worked.)
The most chilling came from a New York police officer, a sergeant, who was so careless and/or unconcerned about getting in trouble that his caller ID popped up.
Who was I going to call to complain? The cops?
As far as I know, no editorial cartoonist has been murdered in the United States in response to the content of his or her work, but there’s a first time for everything. Political cartoonists have been killed and brutally beaten in other countries. Here in the United States, the murder of an outspoken radio talk-show host reminds us that political murder isn’t something that only happens somewhere else.
Every political cartoonist takes a risk to exercise freedom of expression.
We know that our work, strident and opinionated, makes a lot of people very angry, and that we live in a country where a lot of people have a lot of guns. Whether you work in a newspaper office guarded by a minimum wage security guard or, as is increasingly the norm, in your own home, you are always one pull of a trigger away from death when you hit “send” to fire off your cartoon to your syndicate, blog or publication.
Which brings me to my big-picture reaction to yesterday’s horror:
Cartoons are incredibly powerful.
Not to denigrate writing (especially since I do a lot of it myself), but cartoons elicit far more response from readers, both positive and negative, than prose. Websites that run cartoons, especially political cartoons, are consistently amazed at how much more traffic they generate than words. I have twice been fired by newspapers because my cartoons were too widely read — editors worried that they were overshadowing their other content.
Scholars and analysts of the form have tried to articulate exactly what it is about comics that makes them so effective at drawing an emotional response, but I think it’s the fact that such a deceptively simple art form can pack such a wallop. Particularly in the political cartoon format, nothing more than workaday artistic chops and a few snide sentences can be enough to cause a reader to question his long-held political beliefs, national loyalties, even his faith in God.
That drives some people nuts.
Think of the rage behind the gunmen who invaded Charlie Hebdo’s office yesterday, and that of the men who ordered them to do so. It’s too early to say for sure, but it’s a fair guess that they were radical Islamists. I’d like to ask them: how weak is your faith, how lame a Muslim must you be, to allow yourself to be reduced to the murder of innocents, over ink on paper colorized in Photoshop? In a sense, they were victims of cartoon derangement syndrome, the same affliction that led to the burning of embassies over the Danish Mohammed cartoons, the repeated outrage over The New Yorker’s insipid yet controversial covers, and that NYPD sergeant in Brooklyn who called me after he read my cartoon criticizing the invasion of Iraq.
Political cartooning in the United States gets no respect. I was thinking about that this morning when I heard NPR’s Eleanor Beardsley call Charlie Hebdo “gross” and “in poor taste.” (I should certainly hope so! If it’s in good taste, it ain’t funny.) It was a hell of a thing to say, not to mention untrue, while the bodies of dead journalists were still warm. But these were cartoonists, and therefore unworthy of the same level of decorum that a similar event at, say, The Onion – which mainly runs words – would merit.
But no matter. Political cartooning may not pay well, or often at all, and media elites can ignore it all they want. (Hey book critics: graphic novels exist!) But it matters.
Almost enough to die for.
Heavily armed men who took over Crimea last week refused to say who they were, so foreign media outlets dutifully refused to accuse Russia of invading Ukraine until after it had happened. Imagine how much better the invasion of Iraq would have gone if nobody had been able to blame the United States for it.
If Bradley Manning had murdered Iraqi civilians as a soldier, he’d be a hero. Instead, he revealed proof that his superiors killed Iraqi civilians. It’s all about understanding your role in the hierarchy that keeps our society together…which we should all want for some reason.
Obama and the US media are taking credit for Gaddafi’s downfall, but it was the Libyan fighters who won the war.
The fall of Moammar Gaddafi was a Libyan story first and foremost. Libyans fought, killed and died to end the Colonel’s 42-year reign.
No doubt, the U.S. and its NATO proxies tipped the military balance in favor of the Benghazi-based rebels. It’s hard for any government to defend itself when denied the use of its own airspace as enemy missiles and bombs blast away its infrastructure over the course of more than 20,000 sorties.
Still, it was Libyans who took the biggest risks and paid the highest price. They deserve the credit. From a foreign policy standpoint, it behooves the West to give it to them. Consider a parallel: the fall 2001 bombing campaign against the Taliban. With fewer than a thousand Special Forces troops on the ground in Afghanistan to bribe tribal leaders and guide bombs to their targets, the U.S. military and CIA relied almost entirely on air power to allow the Northern Alliance to advance. The premature announcement that major combat operations had ceased, followed by the installation of Hamid Karzai as de facto president—a man widely seen as a U.S. figurehead—set the stage for what would eventually become America’s longest war.
As did the triumphalism of the U.S. media, which treated the “defeat” (more like the dispersal) of the Taliban as Bush’s victory. The Northern Alliance was a mere afterthought, condescended to at every turn by the punditocracy. To paraphrase Bush’s defense secretary Donald Rumsfeld, the U.S. went to war with the ally it had, not the one it would have liked to have had. America’s attitude toward Karzai and his government reflected that in many ways: snipes and insults, including the suggestion that the Afghan leader was mentally ill and ought to be replaced, as well as years of funding levels too low to meet payroll and other basic needs, limiting his government’s power to metro Kabul and a few other major cities. In retrospect it would have been smarter for the U.S. to have graciously credited (and funded) the Northern Alliance for the defeat of the Taliban, content to remain the power behind the throne.
Despite this experience in Afghanistan, “victory” in Libya has prompted a renewal of triumphalism in the U.S. media.
Like a slightly drunken crowd at a football match giddily shouting “U-S-A,” editors and producers keep thumping their chests long after it stops being attractive.
When Obama announced the anti-Gaddafi bombing campaign in March, Stephen Walt issued a relatively safe pair of predictions. “If Gaddafi is soon ousted and the rebel forces can establish a reasonably stable order there, then this operation will be judged a success and it will be high-fives all around,” Walt wrote in Foreign Policy. “If a prolonged stalemate occurs, if civilian casualties soar, if the coalition splinters, or if a post-Gaddafi Libya proves to be unstable, violent, or a breeding ground for extremists…his decision will be judged a mistake.”
It’s only been a few days since the fall of Tripoli, but high-fives and victory dances abound.
“Rebel Victory in Libya a Vindication for Obama,” screamed the headline in U.S. News & World Report.
I am pleased to announce that I am now writing a weekly long-form column for Al Jazeera English. Here is my second piece for Al Jazeera:
One Year Early, Obama’s Reelection Far From Certain
The American punditocracy (and, perhaps more importantly, Las Vegas oddsmakers) currently cite Barack Obama as their odds-on favorite to win next year’s presidential election. Some even predict a landslide.
Mainstream media politicos acknowledge the atrocious economy, with its real unemployment rate nearly matching the worst years of the Great Depression of the 1930s, as an obstacle to reelection. But most of them believe that other factors will prove decisive: disarray in the field of candidates for the nomination of the opposition Republican Party, the GOP’s reliance on discredited Reagan-style austerity measures for the masses coupled with tax cuts for the wealthy, and Obama’s assassination of Osama bin Laden.
Maybe they’re right. But if I were the President, I wouldn’t be offering the White House chef a contract renewal any time soon. Count me among the majority of Americans (54 to 44 percent) who told a March 2011 CNN/Opinion Research poll they think Obama will lose the 2012 election.
I could be wrong.
Scott Keeter, director of survey research at the Pew Research Center, doesn’t think much of these so-called “trial-run” polls. “A review of polls conducted in the first quarter of the year preceding the election found many of them forecasting the wrong winner—often by substantial margins,” Keeter wrote in 2007, citing three elections as far back as 1968.
However, a historical analysis of the more recent presidential races, those of the past two decades, reveals an even bigger gap. The year before a U.S. presidential election, the conventional wisdom is almost always wrong. The early favorite at this point on the calendar usually loses. So betting against the pundits—in this case, against Obama—is the safe bet at this point.
The meta question is: what difference does it make who wins next year? In practical terms, not much.
For one thing, American presidents tend to find more heartbreak than political success during their second terms. Had Richard Nixon retired in 1972, for example, he would have been fondly remembered as the architect of the Paris peace talks that ended the Vietnam War, the founder of the Environmental Protection Agency, and the defender of the working and middle class (for imposing wage and price controls to soften the effect of inflation). His second term saw him sinking into, and ultimately succumbing to, the morass of the Watergate scandal.
The next second-termer, Ronald Reagan, was similarly preoccupied by scandal, in this case the Iran-Contra imbroglio, in which the United States traded arms to Iran in return for American hostages held by Iranian-backed militants in Lebanon and illegally funded right-wing death squads in Central America. Bill Clinton’s last four years were overshadowed by his romance with intern Monica Lewinsky, and by the consequences of its revelation. George W. Bush’s second term, from 2005 to 2009, was defined by his administration’s inept response to Hurricane Katrina in New Orleans, the deteriorating security situation in U.S.-occupied Afghanistan and Iraq, and the economic collapse that began in 2008. His number-one political priority, privatizing the U.S. Social Security system, never got off the ground.
Presidents rarely accomplish much of significance during their second terms. So why do they bother to run again? Good question. Whether it’s ego—1600 Pennsylvania Avenue is one hell of an address—or something else, I don’t know. Whatever the reason, I have long maintained that a sane president would plan to serve a single four-year term, announcing his intention not to run again at the last possible moment.
From the standpoint of the American people and the citizens of countries directly affected by U.S. foreign policy, it is unlikely that the basic nature of the beast will change much regardless of Obama’s fortunes in the next election. One only has to consider the subtle “differences” between the tenures of Presidents Bush and Obama.
On the domestic front, Obama continued and expanded upon Bush’s non-reaction to the economic crisis. Exploiting the panic created by widespread unemployment, the bursting of the housing bubble and a massive foreclosure crisis that put millions of Americans out of their homes, he poured hundreds of billions of federal dollars into the pockets of the top executives of the nation’s largest banks, with no resulting stimulus effect whatsoever. Controversial attacks on privacy rights and civil liberties inaugurated during the Bush years were expanded and extended: the USA-Patriot Act, the National Security Agency “domestic surveillance” program that allowed the government to spy on U.S. citizens’ phone calls, emails and other communications. Obama even formalized Bush’s assertion that the president has the right to unilaterally order the assassination of anyone, including a U.S. citizen, without evidence or proof that he or she has committed a crime.
As promised during the 2008 campaign, Obama expanded the U.S. war against Afghanistan, transforming what Bush described as a short-term attempt to find Osama bin Laden after 9/11 into the most protracted military conflict in the history of the United States. The war continued in Iraq, albeit with “combat” troops redefined as “trainers.” During the last few years, the “global war on terror” expanded into Pakistan, east Africa, Libya and Yemen. Drone attacks escalated. Violating his campaign promises, he continued to keep torture available as a legal option—indeed, he ordered it against a U.S. soldier, Private First Class Bradley Manning—and kept Guantánamo and other Bush-era concentration camps open.
If Obama goes down to defeat next year, then, the result should be viewed less as a shift in overall U.S. policy—hegemonic, imperialistic, increasingly authoritarian—than as a symbolic rebuke. An Obama defeat would reflect the anger of ordinary Americans caught in the “two-party trap,” flailing back and forth between the Dems and the Reps, voting against the party in power to express their impotent rage, particularly at the economy. Mr. Hopey-Changey’s trip back to Chicago would mark the end of a brief, giddy moment of reformism.
The argument that an overextended, indebted empire can be repaired via internal changes of personnel would be dead. With the reformism that Obama embodied no longer politically viable, American voters would be once again faced, as are the citizens of other repressive states, with the choice between sullen apathy and revolution.
Obamaism is currently believed to be unstoppable. If history serves as an accurate predictor, that belief is good cause to predict its defeat next November.
During the late spring and early summer of 1991, just over a year before the 1992 election, President George H.W. Bush was soaring in the polls in the aftermath of the Persian Gulf War, which the American media positively portrayed as successful, quick and internationalist, and as having cost the lives of few American soldiers. A March 1991 CBS poll gave him an 88 percent approval rating—a record high.
By October 1991 Bush was heavily favored to win. A Pew Research poll found that 78 percent of Democratic voters thought Bush would defeat any Democratic nominee. New York governor Mario Cuomo, an eloquent, charismatic liberal star of the party, sized up 1992 as unwinnable and decided not to run.
When the votes were counted, however, Democrat Bill Clinton defeated Bush, 43 to 37.5 percent. Although Republicans blamed insurgent billionaire Ross Perot’s independent candidacy for siphoning away votes from Bush, subsequent analyses do not bear this out. In fact, Perot’s appeal had been bipartisan, attracting liberals opposed to the North American Free Trade Agreement (NAFTA) between the U.S., Canada and Mexico and globalization in general, as well as conservative deficit hawks.
The most credible explanation for Bush’s defeat was handwritten on a sign that the victorious Bill Clinton’s campaign manager famously taped to the wall of the Dems’ war room: “It’s the economy, stupid.” As the early-1990s recession deepened, Bush’s ratings tumbled to around 30 percent. A February 1992 incident, in which Bush was depicted by The New York Times as wearing “a look of wonder” when confronted with a supermarket price-scanning machine, solidified his reputation with voters as patrician, out of touch, and unwilling to act to stimulate the economy or alleviate the suffering of the under- and unemployed. Exit polls, considered exceptionally reliable because they query voters just after they leave polling places, showed that 75 percent of Americans thought the economy was “bad” or “very bad.”
In 1995, Bill Clinton was preparing his reelection bid. On the Republican side, Kansas senator and 1976 vice presidential candidate Bob Dole was expected to (and did) win his party’s nomination. Perot ran again, but suffered from a media blackout; newspapers and broadcast outlets had lost interest in him after a bizarre meltdown during the 1992 race in which he accused unnamed conspirators of plotting to violently disrupt his daughter’s wedding. He received eight percent in 1996.
Clinton trounced Dole, 49 to 40 percent. In 1995, however, that outcome was anything but certain. Bill Clinton had been severely wounded by a series of missteps during his first two years in office. His first major policy proposal, to allow gays and lesbians to serve openly in the U.S. military, was so unpopular that he was forced to water it down into the current “Don’t Ask, Don’t Tell” compromise. Clinton’s 1993 attempt to deprivatize the healthcare system, mocked as HillaryCare after he put his wife in charge of marketing it, went down to defeat. He signed the pro-corporate, Republican-backed trade agreement, NAFTA, alienating his party’s liberal and progressive base. Low voter turnout by the American left in the 1994 midterm elections led to the “Republican Revolution,” a historic sweep of both houses of the American Congress by right-wing conservatives led by the fiery new Speaker of the House, Newt Gingrich.
1995 saw the so-called “co-presidency” between Gingrich and a cowed Bill Clinton, who was reduced to telling a press conference that “the president is relevant.” The United States, which does not have a European-style parliamentary system, had never seen a president so politically weak while remaining in office.
During the spring and summer of 1995 Bob Dole was already the heir apparent to the nomination of a Republican Party that traditionally rewards those who wait their turn. Dole was a seasoned campaigner, a Plains States centrist whose gentlemanly demeanor and credentials as a World War II hero made him a formidable contender. Conventional wisdom had him beating Clinton. So did the polls. A March 1995 Los Angeles Times poll had Dole defeating Clinton, 52 to 44 percent, in a head-to-head match-up. “Among all voters, Clinton’s generic reelect remains dismal, with 40 percent inclined to vote him in again and 53% tilting or definitely planning a vote against him,” reported the Times.
By late autumn, however, the polls had flipped. Though statisticians differ about how big a factor it was, a late-1995 shutdown of the federal government, blamed on the refusal of Gingrich’s hardline Republicans to approve the budget, turned the tide. At the end of the year the die was cast. As Americans began to pay more attention to his challenger, they recoiled at Dole’s age—if elected, he would have been the oldest first-term president in history, even older than Reagan—as it contrasted with Clinton’s youthful vigor. The Democrat coasted to reelection. But that’s not how things looked at this stage in the game.
When analyzing the 2000 race, remember that Republican George W. Bush lost the popular vote to Al Gore and became president only through a bizarre quirk of the American system, the Electoral College. The popular vote in each of the 50 states determines which slate of electors that state sends to the College; the candidate who wins a majority of those electors is elected president.
Most of the time, the same candidate wins the national popular vote and the Electoral College tally. In 2000, there was no dispute on the first count: Democrat Al Gore won the popular vote, 48.4 to 47.9 percent. There was a legal dispute over the 25 electoral votes cast by the state of Florida; ultimately the U.S. Supreme Court decided, along party lines, to award the state to Bush despite indications that Gore might have prevailed in a full statewide recount.
Regardless of one’s views of the 2000 Florida recount controversy, from a predictive standpoint Gore should be counted as the winner, because no one could have anticipated a split between the electoral and popular votes.
Under normal circumstances Gore should have faced what Bush adviser Ken Adelman later predicted for the Iraq invasion: a “cakewalk.” A popular sitting vice president, he enjoyed the trappings of incumbency and a reputation as a thoughtful environmentalist and government-efficiency expert. The economy was booming—always a good argument for the “don’t change horses in midstream” sales pitch. The early favorite on the Republican side, George W. Bush, was considered an intellectual lightweight who would get eaten alive the first time the two met in a presidential debate. But Monicagate had wounded Bill Clinton to the extent that Gore made a fateful decision to disassociate himself from the president who had chosen him as his running mate.
A January 1999 CNN poll had Bush over Gore, 49 to 46 percent. By June 2000 the same poll had barely budged: now it was 49 to 45 percent. “The results indicate that the public is far more likely to view Texas Governor George W. Bush as a strong and decisive leader, and is also more confident in Bush’s ability to handle an international crisis—a worrisome finding for a vice president with eight years of international policy experience,” analyzed CNN in one of the most frightening summaries of the American people’s poor judgment ever recorded.
Gore didn’t become president. But he won the 2000 election. Once again, the media was wrong.
In the 2004 election, it was my turn to screw up. Howard Dean, the combative liberal darling and former Vermont governor, was heavily favored to win the Democratic nomination and challenge incumbent George W. Bush. I was so convinced of his inevitability early in the race, and of the importance of unifying the Democratic Party behind a man who could defeat Bush, that I authored a column—one I wish I could chuck down the memory hole—calling for the party to suspend the remaining primaries and back Dean. Instead, John Kerry won the nomination.
But I wasn’t alone. Polls and pundits agreed that George W. Bush, deeply embarrassed by the failure to find weapons of mass destruction in Iraq, would lose to Kerry, a Democrat with a rare combination of credentials: he was a bona fide hero of the Vietnam War and a noted opponent of the war after his service there.
Bush trounced Kerry. “How can 59,054,087 people be so DUMB?” asked Britain’s Daily Mirror. Good question. Maybe that’s why no one saw it coming.
Which brings us to the most recent presidential election. First, the pundit class was wrong about the likely Democratic nominee. Former First Lady and New York Senator Hillary Rodham Clinton, everyone “knew,” would win. It wasn’t even close. An August 2007 Gallup/USA Today poll had Clinton ahead of Obama, 48 to 26 percent. As it turned out, many Democratic primary voters were wowed by Obama’s charisma and annoyed by Clinton’s refusal to apologize for her brazenly cynical 2002 vote in favor of the Iraq war. Aging Arizona Senator John McCain, meanwhile, remained the best-funded candidate, and thus the consistent favorite, on the Republican side.
Obama’s advantages over McCain became clear by 2008. “The political landscape overwhelmingly favors Obama,” reported USA Today in June. At this point in 2007?
He didn’t stand a chance.
Ted Rall is an American political cartoonist, columnist and author. His most recent book is The Anti-American Manifesto. His website is rall.com.