Possible GOP Presidential candidate Marco Rubio, whose father entered the US illegally, says the US is full up, that there just isn’t any room left in the US for new immigrants, including the children arriving at the border with Mexico. But when you actually consider how much space there is in the country, that’s obviously untrue.
Obama and the US media are taking credit for Gaddafi’s downfall, but it was the Libyan fighters who won the war.
The fall of Moammar Gaddafi was a Libyan story first and foremost. Libyans fought, killed and died to end the Colonel’s 42-year reign.
No doubt, the U.S. and its NATO proxies tipped the military balance in favor of the Benghazi-based rebels. It’s hard for any government to defend itself when denied the use of its own airspace as enemy missiles and bombs blast away its infrastructure over the course of more than 20,000 sorties.
Still, it was Libyans who took the biggest risks and paid the highest price. They deserve the credit. From a foreign policy standpoint, it behooves the West to give it to them. Consider a parallel, the fall 2001 bombing campaign against the Taliban. With fewer than a thousand Special Forces troops on the ground in Afghanistan to bribe tribal leaders and guide bombs to their targets, the U.S. military and CIA relied almost exclusively on air power to allow the Northern Alliance to advance. The premature announcement that major combat operations had ceased, followed by the installation of Hamid Karzai as de facto president—a man widely seen as a U.S. figurehead—set the stage for what would eventually become America’s longest war.
As did the triumphalism of the U.S. media, which treated the “defeat” (more like the dispersal) of the Taliban as Bush’s victory. The Northern Alliance was a mere afterthought, condescended to at every turn by the punditocracy. To paraphrase Bush’s defense secretary Donald Rumsfeld, the U.S. went to war with the ally it had, not the one it would have liked to have had. America’s attitude toward Karzai and his government reflected that in many ways: snipes and insults, including the suggestion that the Afghan leader was mentally ill and ought to be replaced, as well as years of funding levels too low to meet payroll and other basic needs, thus limiting his power to metro Kabul and a few other major cities. In retrospect it would have been smarter for the U.S. to have graciously credited (and funded) the Northern Alliance for its victory over the Taliban, content to remain the power behind the throne.
Despite this experience in Afghanistan, “victory” in Libya has prompted a renewal of triumphalism in the U.S. media.
Like a slightly drunken crowd at a football match giddily shouting “U-S-A,” editors and producers keep thumping their chests long after it stops being attractive.
When Obama announced the anti-Gaddafi bombing campaign in March, Stephen Walt issued a relatively safe pair of predictions. “If Gaddafi is soon ousted and the rebel forces can establish a reasonably stable order there, then this operation will be judged a success and it will be high-fives all around,” Walt wrote in Foreign Policy. “If a prolonged stalemate occurs, if civilian casualties soar, if the coalition splinters, or if a post-Gaddafi Libya proves to be unstable, violent, or a breeding ground for extremists…his decision will be judged a mistake.”
It’s only been a few days since the fall of Tripoli, but high-fives and victory dances abound.
“Rebel Victory in Libya a Vindication for Obama,” screamed the headline in U.S. News & World Report.
I am pleased to announce that I am now writing a weekly long-form column for Al Jazeera English. Here is my second piece for Al Jazeera:
One Year Early, Obama’s Reelection Far From Certain
The American punditocracy (and, perhaps more importantly, Las Vegas oddsmakers) currently rates Barack Obama as the odds-on favorite to win next year’s presidential election. Some even predict a landslide.
Mainstream media politicos acknowledge the atrocious economy, with its real unemployment rate nearly matching the worst years of the Great Depression of the 1930s, as an obstacle to reelection. But most of them believe that other factors will prove decisive: disarray in the field of candidates for the nomination of the opposition Republican Party, the GOP’s reliance on discredited Reagan-style austerity measures for the masses coupled with tax cuts for the wealthy, and Obama’s assassination of Osama bin Laden.
Maybe they’re right. But if I were the President, I wouldn’t be offering the White House chef a contract renewal any time soon. Count me among the majority of Americans (54 to 44 percent) who told a March 2011 CNN/Opinion Research poll they think Obama will lose the 2012 election.
I could be wrong.
Scott Keeter, director of survey research at the Pew Research Center, doesn’t think much of these so-called “trial-run” polls. “A review of polls conducted in the first quarter of the year preceding the election found many of them forecasting the wrong winner—often by substantial margins,” Keeter wrote in 2007, citing three elections as far back as 1968.
However, a historical analysis of more recent presidential races, those of the past two decades, reveals an even bigger gap. The year before a U.S. presidential election, the conventional wisdom is almost always wrong. The early favorite at this point on the calendar usually loses. So betting against the pundits—in this case, against Obama—is the safe bet.
The meta question is: what difference does it make who wins next year? In practical terms, not much.
For one thing, American presidents tend to find more heartbreak than political success during their second terms. Had Richard Nixon retired in 1972, for example, he would have been fondly remembered as the architect of the Paris peace talks that ended the Vietnam War, the founder of the Environmental Protection Agency, and the defender of the working and middle class (for imposing wage and price controls to soften the effect of inflation). His second term saw him sinking into, and ultimately succumbing to, the morass of the Watergate scandal.
The next second termer, Ronald Reagan, was similarly preoccupied by scandal, in this case the Iran-Contra imbroglio, in which the United States traded arms to Iran in return for hostages held by Iranian-allied militants in Lebanon and illegally funded right-wing death squads in Central America. Bill Clinton’s last four years were overshadowed by his affair with intern Monica Lewinsky and the fallout from its revelation. George W. Bush’s second term, from 2005 to 2009, was defined by his administration’s inept response to Hurricane Katrina in New Orleans, the deteriorating security situation in U.S.-occupied Afghanistan and Iraq, and the economic collapse that began in 2008. His number-one political priority, privatizing the U.S. Social Security system, never got off the ground.
Presidents rarely accomplish much of significance during their second term. So why do they bother to run again? Good question. Whether it’s ego—1600 Pennsylvania Avenue is one hell of an address—or something else, I don’t know. Whatever the reason, I have long maintained that a sane president would think of himself as standing for one four-year term, then announce his intention not to run again at the last possible moment.
From the standpoint of the American people and the citizens of countries directly affected by U.S. foreign policy, it is unlikely that the basic nature of the beast will change much regardless of Obama’s fortunes in the next election. One only has to consider the subtle “differences” between the tenures of Presidents Bush and Obama.
On the domestic front Obama continued and expanded upon Bush’s non-reaction to the economic crisis, exploiting the panic created by widespread unemployment, the bursting of the housing bubble and a massive foreclosure crisis that put tens of millions of Americans out of their homes in order to pour hundreds of billions of federal dollars into the pockets of the top executives of the nation’s largest banks, with no resulting stimulus effect whatsoever. Controversial attacks on privacy rights and civil liberties inaugurated by the Bush years were expanded and extended: the USA-Patriot Act, the National Security Agency “domestic surveillance” program that allowed the government to spy on U.S. citizens’ phone calls, emails and other communications. Obama even formalized Bush’s assertion that the president has the right to unilaterally order the assassination of anyone, including a U.S. citizen, without evidence or proof that he or she has committed a crime.
As promised during the 2008 campaign, Obama expanded the U.S. war against Afghanistan, transforming what Bush described as a short-term attempt to find Osama bin Laden after 9/11 into the most protracted military conflict in the history of the United States. The war continued in Iraq, albeit with “combat” troops redefined as “trainers.” During the last few years, the “global war on terror” expanded into Pakistan, east Africa, Libya and Yemen. Drone attacks escalated. Violating his campaign promises, he continued to keep torture available as a legal option—indeed, he ordered it against a U.S. soldier, Private First Class Bradley Manning—and kept Guantánamo and other Bush-era concentration camps open.
If Obama goes down to defeat next year, then, the results should be viewed less as a shift in overall U.S. policy—hegemonic, imperialistic, increasingly authoritarian—than one that is symbolic. An Obama defeat would reflect the anger of ordinary Americans caught in the “two-party trap,” flailing back and forth between the Dems and the Reps, voting against the party in power to express their impotent rage, particularly at the economy. Mr. Hopey-Changey’s trip back to Chicago would mark the end of a brief, giddy, moment of reformism.
The argument that an overextended, indebted empire can be repaired via internal changes of personnel would be dead. With the reformism that Obama embodied no longer politically viable, American voters would be once again faced, as are the citizens of other repressive states, with the choice between sullen apathy and revolution.
Obamaism is currently believed to be unstoppable. If history serves as an accurate predictor, that belief is good cause to predict its defeat next November.
During the late spring and early summer of 1991, just over a year before the 1992 election, President George H.W. Bush was soaring in the polls in the aftermath of the Persian Gulf War, which the American media portrayed as successful, quick, internationalist, and nearly bloodless for American soldiers. A March 1991 CBS poll gave him an 88 percent approval rating—a record high.
By October 1991 Bush was heavily favored to win. A Pew Research poll found that 78 percent of Democratic voters thought Bush would defeat any Democratic nominee. New York governor Mario Cuomo, an eloquent, charismatic liberal star of the party, sized up 1992 as unwinnable and decided not to run.
When the votes were counted, however, Democrat Bill Clinton defeated Bush, 43 to 37.5 percent. Although Republicans blamed insurgent billionaire Ross Perot’s independent candidacy for siphoning away votes from Bush, subsequent analyses do not bear this out. In fact, Perot’s appeal had been bipartisan, attracting liberals opposed to the North American Free Trade Agreement (NAFTA) between the U.S., Canada and Mexico and globalization in general, as well as conservative deficit hawks.
The most credible explanation for Bush’s defeat was handwritten on a sign that the victorious Bill Clinton’s campaign manager famously taped to the wall of the Dems’ war room: “It’s the economy, stupid.” As the 1989-1993 recession deepened, Bush’s ratings tumbled to around 30 percent. A February 1992 incident, in which Bush was depicted by The New York Times as wearing “a look of wonder” when confronted with a supermarket price scanning machine, solidified his reputation with voters as patrician, out of touch, and unwilling to act to stimulate the economy or alleviate the suffering of the under- and unemployed. Exit polls, considered exceptionally reliable because they query voters seconds after they leave polling places, showed that 75 percent of Americans thought the economy was “bad” or “very bad.”
In 1995, Bill Clinton was preparing his reelection bid. On the Republican side, Kansas senator and 1976 vice presidential candidate Bob Dole was expected to (and did) win his party’s nomination. Perot ran again, but suffered from a media blackout; newspapers and broadcast outlets had lost interest in him after a bizarre meltdown during the 1992 race in which he accused unnamed conspirators of plotting to violently disrupt his daughter’s wedding. He received eight percent in 1996.
Clinton trounced Dole, 49 to 40 percent. In 1995, however, that outcome was anything but certain. Bill Clinton had been severely wounded by a series of missteps during his first two years in office. His first major policy proposal, to allow gays and lesbians to serve openly in the U.S. military, was so unpopular that he was forced to water it down into the current “Don’t Ask, Don’t Tell” compromise. Clinton’s 1993 attempt to deprivatize the healthcare system, mocked as HillaryCare after he put his wife in charge of marketing it, went down to defeat. He signed the pro-corporate, Republican-backed trade agreement, NAFTA, alienating his party’s liberal and progressive base. Low voter turnout by the American left in the 1994 midterm elections led to the “Republican Revolution,” a historic sweep of both houses of the American Congress by right-wing conservatives led by the fiery new Speaker of the House, Newt Gingrich.
1995 saw the so-called “co-presidency” between Gingrich and a cowed Bill Clinton, who was reduced to telling a press conference that “the president is relevant.” The United States, which does not have a European-style parliamentary system, had never seen a president so politically weak while remaining in office.
During the spring and summer of 1995 Bob Dole was already the heir apparent to the nomination of a Republican Party that traditionally rewards those who wait their turn. Dole was a seasoned campaigner, a Plains States centrist whose gentlemanly demeanor and credentials as a World War II hero made him a formidable challenger. Conventional wisdom had him beating Clinton. So did the polls. A March 1995 Los Angeles Times poll had Dole defeating Clinton, 52 to 44 percent in a head-to-head match-up. “Among all voters, Clinton’s generic reelect remains dismal, with 40 percent inclined to vote him in again and 53% tilting or definitely planning a vote against him,” reported the Times.
By late autumn, however, the polls had flipped. Though statisticians differ about how big a factor it was, a late-1995 shutdown of the federal government blamed on the refusal of Gingrich’s hardline Republicans to approve the budget turned the tide. At the end of the year the die was cast. As Americans began to pay more attention to his challenger they recoiled at Dole’s age—if elected, he would have been the oldest president in history, even older than Reagan—as it contrasted with Clinton’s youthful vigor. The Democrat coasted to reelection. But that’s not how things looked at this stage in the game.
When analyzing the 2000 race, remember that Republican George W. Bush lost the popular vote to Al Gore and won the election only through a bizarre quirk of the American system, the Electoral College. The popular vote within each state determines which slate of electors that state sends to the College; the candidate who wins a majority of the electors is elected president.
Most of the time, the same candidate wins the national popular vote and the Electoral College tally. In 2000 there was no dispute: Democrat Al Gore won the popular vote, 48.4 to 47.9 percent. There was a legal dispute over the 25 electoral votes cast by the state of Florida; ultimately the U.S. Supreme Court decided, along party lines, to award the state to Bush despite indications that Gore could have won a statewide recount.
Regardless of one’s views of the 2000 Florida recount controversy, from a predictive standpoint, one should assume that Gore won because no one could have anticipated a difference between the results of the electoral and popular votes.
Under normal circumstances Gore should have faced, to borrow the word later used to sell the Iraq invasion, a “cakewalk.” A popular sitting vice president, he enjoyed the trappings of incumbency and a reputation as a thoughtful environmentalist and government efficiency expert. The economy was booming—always a good argument for the “don’t change horses in midstream” sales pitch. The early favorite on the Republican side, George W. Bush, was considered an intellectual lightweight who would get eaten alive the first time the two met in a presidential debate. But Monicagate had wounded Bill Clinton to the extent that Gore made a fateful decision to disassociate himself from the president he had served under.
A January 1999 CNN poll had Bush over Gore, 49 to 46 percent. By June 2000 the same poll had barely budged: now it was 49 to 45 percent. “The results indicate that the public is far more likely to view Texas Governor George W. Bush as a strong and decisive leader, and is also more confident in Bush’s ability to handle an international crisis—a worrisome finding for a vice president with eight years of international policy experience,” analyzed CNN in one of the most frightening summaries of the American people’s poor judgment ever recorded.
Gore didn’t become president. But he won the 2000 election. Once again, the media was wrong.
In the 2004 election, it was my turn to screw up. Howard Dean, the combative liberal darling and former Vermont governor, was heavily favored to win the Democratic nomination to challenge incumbent George W. Bush. I was so convinced of his inevitability after early primary elections, and of the importance of unifying the Democratic Party behind a man who could defeat Bush, that I authored a column I wish I could chuck down the memory hole calling for the party to suspend remaining primaries and back Dean. In 2004, John Kerry won the nomination.
But I wasn’t alone. Polls and pundits agreed that George W. Bush, deeply embarrassed by the failure to find weapons of mass destruction in Iraq, would lose to Kerry, a Democrat with a rare combination of credentials: he was a bona fide war hero during the Vietnam War and a noted opponent of the war after his service there.
Bush trounced Kerry. “How can 59,054,087 people be so DUMB?” asked Britain’s Daily Mirror. Good question. Maybe that’s why no one saw it coming.
Which brings us to the most recent presidential election. First, the pundit class was wrong about the likely Democratic nominee. Former First Lady and New York Senator Hillary Rodham Clinton, everyone “knew,” would win. It wasn’t even close. An August 2007 Gallup/USA Today poll had Clinton ahead of Obama, 48 to 26 percent. As it turned out, many Democratic primary voters were wowed by Obama’s charisma and annoyed by Clinton’s refusal to apologize for her brazenly cynical 2002 vote in favor of the Iraq war. Aging Arizona Senator John McCain, on the other hand, remained the best-funded candidate, and thus the consistent favorite, on the Republican side.
Obama’s advantages over McCain became clear by 2008. “The political landscape overwhelmingly favors Obama,” reported USA Today in June. At this point in 2007?
He didn’t stand a chance.
Ted Rall is an American political cartoonist, columnist and author. His most recent book is The Anti-American Manifesto. His website is rall.com.
Yesterday I published my first column for Al Jazeera English. I get more space than my syndicated column (2000 words compared to the usual 800) and it’s an exciting opportunity to run alongside a lot of other writers whose work I respect.
Here it is:
Stalemate in Libya, Made in USA
Republicans in the United States Senate held a hearing to discuss the progress of what has since become the war in Libya. It was one month into the operation. Senator John McCain, the Arizona conservative who lost the 2008 presidential race to Barack Obama, grilled top U.S. generals. “So right now we are facing the prospect of a stalemate?” McCain asked General Carter Ham, chief of America’s Africa Command. “I would agree with that at present on the ground,” Ham replied.
How would the effort to depose Colonel Gaddafi conclude? “I think it does not end militarily,” Ham predicted.
That was over two months ago.
It’s a familiar ritual. Once again a military operation marketed as inexpensive, short-lived and—naturally—altruistic, is dragging on, piling up bills, with no end in sight. The scope of the mission, narrowly defined initially, has radically expanded. The Libyan stalemate is threatening to become, along with Iraq and especially Afghanistan, America’s third quagmire.
Bear in mind, of course, that the American definition of a military quagmire does not square with the one in the dictionary, namely, a conflict from which one or both parties cannot disengage. The U.S. could pull out of Libya. But it won’t. Not yet.
Indeed, President Obama would improve his chances in his upcoming reelection campaign were he to order an immediate withdrawal from all four of America’s “hot wars”: Libya, along with Afghanistan, Iraq, and now Yemen. When the U.S. and NATO warplanes began dropping bombs on Libyan government troops and military targets in March, only 47 percent of Americans approved—relatively low for the start of a military action. With U.S. voters focused on the economy in general and joblessness in particular, this jingoistic nation’s typical predilection for foreign adventurism has given way to irritation at anything that distracts from efforts to reduce unemployment. Now a mere 26 percent support the war—a figure comparable to those for the Vietnam conflict at its nadir.
For Americans “quagmire” became a term of political art after Vietnam. It refers not to a conflict that one cannot quit—indeed, the U.S. has not fought a war where its own survival was at stake since 1815—but one that cannot be won. The longer such a war drags on, with no clear conclusion at hand, the more that American national pride (and corporate profits) are at stake. Like a commuter who keeps waiting for a late bus because of the time already invested, the more time, dead soldiers and materiel have been squandered, the harder it is to throw up one’s hands and give up. So Obama will not call off his dogs—his NATO allies—regardless of the polls. Like a gambler on a losing streak, he will keep doubling down.
U.S. ground troops in Libya? Not yet. Probably never. But don’t rule them out. Obama hasn’t.
It is shocking, even by the standards of Pentagon warfare, how quickly “mission creep” has imposed itself in Libya. Americans, at war for as long as they can remember, recognize the signs: more than half the electorate believes that U.S. forces will be engaged in combat in Libya at least through 2012.
One might rightly point out: this latest American incursion into Libya began recently, in March. Isn’t it premature to worry about a quagmire?
“Like an unwelcome specter from an unhappy past, the ominous word ‘quagmire’ has begun to haunt conversations among government officials and students of foreign policy, both here and abroad,” R.W. Apple, Jr. reported in The New York Times. He was talking about Afghanistan.
Apple was prescient. He wrote his story on October 31, 2001, three weeks into what has since become the United States’ longest war.
Obama never could have convinced a war-weary public to tolerate a third war in a Muslim country had he not promoted the early bombing campaign as a humanitarian effort to protect Libya’s eastern-based rebels (recast as “civilians”) from imminent Srebrenica-esque massacre by Gaddafi’s forces. “We knew that if we waited one more day, Benghazi—a city nearly the size of Charlotte [North Carolina]—could suffer a massacre that would have reverberated across the region and stained the conscience of the world,” the President said March 28th. “It was not in our national interest to let that happen. I refused to let that happen.”
Obama promised a “limited” role for the U.S. military, which would be part of a “broad coalition” to “protect civilians, stop an advancing army, prevent a massacre, and establish a no-fly zone.” There would be no attempt to drive Gaddafi out of power. “Of course, there is no question that Libya—and the world—would be better off with Gaddafi out of power,” he said. “I, along with many other world leaders, have embraced that goal, and will actively pursue it through non-military means. But broadening our military mission to include regime change would be a mistake.”
“Regime change [in Iraq],” Obama reminded, “took eight years, thousands of American and Iraqi lives, and nearly a trillion dollars. That is not something we can afford to repeat in Libya.”
The specifics were fuzzy, critics complained. How would Libya retain its territorial integrity—a stated U.S. war aim—while allowing Gaddafi to keep control of the western provinces around Tripoli?
The answer, it turned out, was essentially a replay of Bill Clinton’s bombing campaign against Serbia during the 1990s. U.S. and NATO warplanes targeted Gaddafi’s troops. Bombs degraded Libyan military infrastructure: bases, radar towers, even ships. American policymakers hoped against hope that Gaddafi’s generals would turn against him, either assassinating him in a coup or forcing the Libyan strongman into exile.
If Gaddafi had disappeared, Obama’s goal would have been achieved: easy in, easy out. With a little luck, Islamist groups such as Al Qaeda in the Islamic Maghreb would have little to no influence on the incoming government to be created by the Libyan National Transitional Council. With more good fortune, the NTC could even be counted upon to sign over favorable oil concessions to American and European energy concerns.
But Gaddafi was no Milosevic. The dictator dug in his heels. This was at least in part due to NATO’s unwillingness or inability to offer him the dictator retirement plan of Swiss accounts, gym bags full of bullion, and a swanky home on the French Riviera.
Stalemate was the inevitable result of America’s one-foot-in, one-foot-out Libya war policy—an approach that continued after control of the operation was officially turned over to NATO, specifically Britain and France. Allied jets were directed to deter attacks on Benghazi and other NTC-held positions, not to win the revolution for them. NTC forces, untrained and poorly armed, were no match for Gaddafi’s professional army. On the other hand, loyalist forces were met by heavy NATO air strikes whenever they tried to advance into rebel-held territory. Libya was bifurcated. With Gaddafi still alive and in charge, this was the only way Obama Administration policy could have played out.
No one knows whether Gaddafi’s angry bluster—the rants that prompted Western officials to attack—would have materialized in the form of a massacre. It is clear, on the other hand, that Libyans on both sides of the front are paying a high price for the U.S.-created stalemate.
At least one million of Libya’s six million people have fled the country or become internally displaced refugees. There are widespread shortages of basic goods, including food and fuel. According to the Pakistani newspaper Dawn, the NTC has pulled children out of schools in areas it administers and put them to work “cleaning streets, working as traffic cops and dishing up army rations to rebel soldiers.”
NATO jets fly one sortie after another; the fact that they’re running out of targets doesn’t stop them from dropping their payloads. Each bomb risks killing more of the civilians they are ostensibly protecting. Libyans will be living in rubble for years after the war ends.
Coalition pilots were given wide leeway in the definition of “command and control centers” that could be targeted; according to government spokesman Mussa Ibrahim, one air strike against the Libyan leader’s home killed his 29-year-old son Saif al-Arab, along with three of his grandchildren. Gaddafi himself remained in hiding. Officially, however, NATO was not allowed to even think about trying to assassinate him.
Pentagon brass told Obama that more firepower was required to turn the tide in favor of the ragtag army of the Libyan National Transitional Council. But he couldn’t do that. He was faced with a full-scale rebellion by a coalition of liberal antiwar Democrats and Republican constitutionalists in the U.S. House of Representatives. Furious that the President had failed to request formal Congressional approval for the Libyan war within 60 days as required by the 1973 War Powers Resolution, they voted against a military appropriations bill for Libya.
The planes kept flying. But Congress’ reluctance now leaves one way to close the deal: kill Gaddafi.
As recently as May 1, after the killing of Gaddafi’s son and grandchildren, NATO was still denying that it was trying to dispatch him. “We do not target individuals,” said Lieutenant General Charles Bouchard of Canada, commanding military operations in Libya.
By June 10th CNN television confirmed that NATO was targeting Libya’s Brother Leader for death. “Asked by CNN whether Gaddafi was being targeted,” the network reported, “[a high-ranking] NATO official declined to give a direct answer. The [UN] resolution applies to Gaddafi because, as head of the military, he is part of the control and command structure and therefore a legitimate target, the official said.”
In other words, a resolution specifically limiting the scope of the war to protecting civilians and eschewing regime change was being used to justify regime change via political assassination.
So what happens next?
First: war comes to Washington. On June 14th House of Representatives Speaker John Boehner sent Obama a rare warning letter complaining of “a refusal to acknowledge and respect the role of Congress” in the U.S. war against Libya and a “lack of clarity” about the mission.
“It would appear that in five days, the administration will be in violation of the War Powers Resolution unless it asks for and receives authorization from Congress or withdraws all U.S. troops and resources from the mission [in Libya],” Boehner wrote. “Have you…conducted the legal analysis to justify your position?” he asked. “Given the gravity of the constitutional and statutory questions involved, I request your answer by Friday, June 17, 2011.”
Next, the stalemate/quagmire continues. Britain can keep bombing Libya “as long as we choose to,” said General Sir David Richards, the UK Chief of Defense Staff.
One event could change everything overnight: Gaddafi’s death. Until then, NATO and the United States must accept the moral responsibility for dragging out what would likely have been an aborted uprising in eastern Libya into a protracted civil war with no military—or, contrary to NATO pronouncements, political—solution in the foreseeable future. Libya is assuming many of the characteristics of a proxy war such as Afghanistan during the 1980s, wherein outside powers armed warring factions to rough parity but not beyond, with the effect of extending the conflict at tremendous cost of life and treasure. This time around, only one side, the NTC rebels, is receiving foreign largess—but not enough to score a decisive victory against Gaddafi by capturing Tripoli.
Libya was Obama’s first true war. He aimed to show how Democrats manage international military efforts differently than neo-cons like Bush. He built an international coalition. He made the case on humanitarian grounds. He declared a short time span.
In three short months, all of Obama’s plans have fallen apart. NATO itself is fracturing. There is talk about dissolving it entirely. The Libya mission is stretching out into 2012 and beyond.
People all over the world are questioning American motives in Libya and criticizing the thin veneer of legality used to justify the bombings. “We strongly believe that the [UN] resolution [on Libya] is being abused for regime change, political assassinations and foreign military occupation,” South African President Jacob Zuma said this week, echoing criticism of the invasion of Iraq.
Somewhere in Texas, George W. Bush is smirking.
(C) 2011 Ted Rall, All Rights Reserved.
The Conspiracy to Abolish Cash
For many years figures on the political fringe, especially on the right, have claimed that the government and its corporate owners want to transform us into a cashless society. Their warnings about the conspiracy against paper money fell on deaf ears, primarily because the digitalization of financial transactions seemed more like the result of organic business trends than the manifestation of some sinister conspiracy.
Now, however, those who want to do away with liquid currency are stepping out of the shadows. They talk about increased efficiency and profit potential, but their real agenda is nothing less than enslavement of the human race.
“Physical currency is a bulky, germ-smeared, carbon-intensive, expensive medium of exchange. Let’s dump it,” argued David Wolman in Wired.
Citing a 2002 study for the Organisation for Economic Co-operation and Development that states “money’s destiny is to become digital,” a Defense Department-affiliated economics professor has authored an Op/Ed for The New York Times that asks: “Why not eliminate the use of physical cash worldwide?” Jonathan Lipow urges President Obama to “push for an international agreement to eliminate the largest-denomination bills” and urges the replacement of bills and coins by “smart cards with biometric security features.”
Lipow’s justification for calling for the most radical change to the fundamental nature of commerce since industrialization is, of all things, fighting terrorism. “In a cashless economy, insurgents’ and terrorists’ electronic payments would generate audit trails that could be screened by data mining software; every payment and transfer would yield a treasure trove of information about their agents, their locations and their intentions,” Lipow writes. “This would pose similar challenges for criminals.”
Terrorism is a mere fig leaf. According to the annual “Patterns of Global Terrorism” report compiled by the U.S. State Department, the highest total death toll attributed to terrorism in the last 20 years occurred in—surprise—2001. Including 9/11, only 3,547 people were killed in 346 acts of violence worldwide. Tragic. Obviously. But, in the overall scheme of things, terrorism is not a big deal.
Measured in terms of loss of life and economic disruption, terrorism is a trivial problem, hardly worth mentioning. According to the UN, 36 million people die annually from hunger and malnutrition. Over half a million die in car wrecks—but you don’t hear people like Lipow demanding that we get rid of cars. A more legitimate concern is the “loss” of taxes upon the underground economy, estimated by the IMF at 15 percent of transactions in developed nations.
What the anti-cash movement really wants is digital totalitarianism: a dystopian nightmare in which the entire human race is enslaved by international corporations and their pet governments. An anti-establishment gadfly like WikiLeaks founder Julian Assange could be instantly deprived of money—and thus freedom of movement—with a couple of keystrokes. (We saw a preview of this when PayPal and Amazon shut down WikiLeaks’ donation mechanism and web server, respectively.) The high-tech hell depicted in the film “Enemy of the State” would become reality.
It is true that, in a society where every good and service has to be paid for with a debit or credit card, terrorist groups would find it much harder to operate. Don’t forget, however, that today’s terrorists often become tomorrow’s liberators. Anti-British terrorists George Washington and Thomas Jefferson wouldn’t have stood a chance if the Brits had been able to intercept wire transfers from France.
Decashification would establish digital totalitarianism, a form of corporo-government control so rigid, thorough and all-encompassing that by comparison it would make Hitler and Stalin look like easygoing surfer dudes. The abolition of unregulated financial transactions would freeze the political configuration of the world, making it impossible for opposition movements—much less revolutionary ones—to challenge the status quo.
A society without dissent has no hope. Even if we lived in a perfect world where everyone was ruled by wildly popular, benevolent, scrupulously honest regimes—ha!—eliminating the slightest possibility of opposition would lead to barbarism.
We’re already more than halfway to a cashless society. In the U.S. few young adults still use checks. In many countries debit and credit card transactions now exceed those made via cash and checks combined. In 2007 the chairman of Visa Europe predicted the abolition of cash by 2012. Obviously he was wrong. But that’s where we’re headed. The U.K. plans to phase out checks entirely by 2018.
Even if you love your government, don’t want it to change, and think political opponents belong in prison, you ought to worry. As things currently stand, we know the big banks can’t be trusted. Remember when they introduced ATM cards? Banks wanted us to use them so they could lay off tellers. Then they instituted “convenience fees”—which they have raised, and raised, to the point that taking $20 out of an out-of-town ATM could cost you $5 in fees ($2 for their bank, $3 for yours).
Imagine what your life will look like under digital totalitarianism. Your pay is direct-deposited into your bank account. You’ll pay for small purchases with your cellphone; if you owe a few bucks to a friend you’ll be able to bump your phone against your friend’s to settle up. Nowadays, some corporations allow you to control when your bills get deducted; in the future they’ll demand that you authorize them to do it automatically. What if you have a disputed charge? They’ll already have your dollars, or work credits, or whatever they’ll call them. Good luck trying to get it back from the Indian call-center guy.
As corporate ownership becomes increasingly monopolized and intertwined, your overdue phone bill might be owned by the same outfit as your bank, which would simply take what it says you owe.
The law of unintended consequences is getting a serious workout thanks to digitalization. Motorists in New York were thrilled when the E-ZPass system allowed them to breeze past lines at toll bridges—at a discount, no less. Then divorce lawyers began subpoenaing E-ZPass records to prove that a spouse was cheating. Next police set up E-ZPass scanners on the bridges; if you pass two of them too fast, a speeding ticket is automatically generated. The next step is to eliminate cash lanes entirely; drivers without E-ZPass tags will soon have their license plates scanned and receive a bill by mail—plus a $2 to $3 “handling” fee.
Think there are too many fees now? If you think you can’t trust banks today, imagine how badly they’ll gouge you when they control every single commercial transaction down to the purchase of a pack of gum. Angry about taxes? When tax agencies can take the money out of your account without asking, they will. Unlike cash, that phone bump to pay your friend will be a trackable, data-mineable, fully taxable commercial transaction.
As if the post-2008 economic collapse hadn’t proven that no one is looking out for We the People but ourselves—and then barely so—the digivangelists tell us not to worry, that Big Brother, Inc. will smooth out the rough patches on the road to techno-fascist domination. From Wolman in Wired: “Opponents used to argue that killing cash would hurt low-income workers—for instance, by eliminating cash tips. But a modest increase in the minimum wage would offset that loss; government savings from not printing money could go toward lower taxes for employers.” Sure. The same way banks passed on the savings they earned by replacing tellers with ATMs to their customers.
Americans are skipping into the digital inferno wearing a smile and relishing the smell of their own burning flesh. Countless friends and acquaintances pay all their bills online. “I’m all about using my checking account in place of cash and would love to be able to eliminate cash entirely from my life,” gushed PCWorld’s Tony Bradley recently.
“Give Me Convenience or Give Me Death” was the title of an album by the punk band Dead Kennedys.
We’ll get both.
(Ted Rall is the author of “The Anti-American Manifesto.” His website is tedrall.com.)
COPYRIGHT 2010 TED RALL