Time To Go: Please, Boomers, Just Retire Already!

Originally published by Breaking Modern:

It’s too bad, but Baby Boomers continue to live down to generational stereotypes. In a recent survey, they overwhelmingly said they feel either too healthy or too financially insecure to retire at the traditional age of 65. Even to the bitter end, they overshadow the Generation Xers and Millennials who need them to step aside gracefully and make room.

Sardonic to the End

SYNDICATED COLUMN: The Joy of Hopelessness


Desire, the Indian philosopher Jiddu Krishnamurti taught, causes suffering.

I managed to make it half a century, and thus likely more than halfway to death (which, Arthur Schopenhauer teaches us, is the goal of life), not only by failing to internalize the belief that optimism breeds disappointment, but by passionately refusing to believe it. Without desire, I fervently believed, there is no motivation and thus no accomplishment.

Without ambition, how does one succeed in one’s work or find the love of one’s life? I know people who don’t want anything. They’re called potheads.

But I’ve changed my mind. The stoners may be on to something.

Give up hope — and you might find happiness. I did!

As I’ve read and heard often occurs with spiritual journeys, I arrived at my epiphany as the result of an unexpected accident.

Like other cartoonists, I apply for the Pulitzer Prize, America’s most prestigious journalism award, every January.

I hate it. Yet I do it.

I hate it because it’s a lot of work, the odds are long, and the choice of the winner is usually — to be diplomatic — baffling. Out of the 20-ish times I’ve entered, spending a full day or two each year printing out and pasting up cartoons and clips into a binder (and in the computer age, formatting and uploading them), not to mention 20-ish $50 application fees, all I have to show for my efforts is one finalistship. Back in 1996.

To datestamp this story: the letter was typed. As in: on a typewriter.

Like Charlie Brown trying to kick Lucy’s football, I apply for the Big P under the old New York Lotto dictum that you have to be in it to win it. What if the year I don’t enter is the year that I would have won?

Contest Judge #1 to other Judges: So that’s all the entries in the cartooning category.

Judge #2: Wait a minute. Where’s Ted Rall?

Judge #1: He didn’t apply.

Judge #2: WTF?

Judge #3: I specifically came here to give Ted Rall his long-overdue award!

Judge #1: Me too. I doublechecked. Tragically for journalism, he did not enter.

Judge #4: Can we call him?

Judge #1: That would be against the solemn Rules. We must choose from the other entries.

Judges #2-#4 commit suicide in interesting ways.

The deadline used to be January 30th, so I thought it still was, but they changed it a few years ago to the 25th, God knows why. I blew the deadline.

As though carried off by a drone labeled “Short-Sighted Defense Policy,” a metaphorical weight bigger than a crosshatched albatross labeled “National Debt” lifted from my shoulders.

I didn’t enter. So I would not, could not, win.

Which meant I couldn’t be passed up in favor of someone else. To be precise, I couldn’t lose to someone I didn’t think was as good as me.

What a relief!

I really really really don’t mind losing to someone good. When someone good has won, I have been happy for the winner. I did not grit my teeth. I congratulated them, and meant it, and resolved to do better next year.

The problem is, the winner of the Pulitzer is usually very not good. Not as good as me. Not pretty good. Not even as good as average.

Losing to someone whose work I don’t respect hurts because it means either (a) the sucky winner is better than me, so therefore I suck even more, or (b) the Pulitzers are judged by dolts, so I must be an idiot to submit to the process, much less care about the results. I strongly suspect (b), though (a) could be true.

From late January, when I realized that I couldn’t enter, to early April, when they announced the results, I felt lighter on my feet. When my colleagues called to handicap the prize, my usual toxic mix of ambition, dread and fear of disappointment was replaced by the carelessness of knowing that I had no dog in the race and that whatever happened wouldn’t be a reflection upon me. So what if someone bad won? The judges never saw my stuff. So I wouldn’t have to spend weeks and months wondering how it was possible that anyone could look at the cartoons by the terrible winner next to mine and choose him instead of me.

I should confess that other cartoonists, no doubt smarter than me, arrived at this wisdom when they were younger. One, 10 years my junior, casually remarked that she gave herself a mini Pulitzer Prize every year by not entering: $50 a year adds up. Not to mention the time she saved compiling entries.

Last year’s winner turned out to be someone whose cartoons couldn’t possibly be more different from mine. Ditto for the finalists. Given who they chose, the judges weren’t interested in the genre of cartooning I do, so I would never have stood a chance.

Not entering was the right move. Or non-move.

This year, however, I remembered the deadline. To enter or not to enter? I entered.

Now I wish I hadn’t.

(Ted Rall, syndicated writer and cartoonist for The Los Angeles Times, is the author of the new critically-acclaimed book “After We Kill You, We Will Welcome You Back As Honored Guests: Unembedded in Afghanistan.” Subscribe to Ted Rall at Beacon.)

COPYRIGHT 2015 TED RALL, DISTRIBUTED BY CREATORS.COM

Amnesia: Yet Another Reason Why Newspapers Are Dying

Originally published by ANewDomain.net:

Legacy news organizations are failing for a lot of reasons, most of them self-inflicted, but there’s one that rarely if ever gets remarked upon: they have forgotten the definition of “news.”

As you and I know, news is stuff that happened that a significant number of people would like to know about. By definition, news is surprising.

All too often in recent decades, however, corporate media conglomerates have conflated news with press releases – in other words, informing us not about what we need or want to know, but about what they would like us to know.

A major driver of this trend is the misguided belief by press and broadcast organizations that the powers that be — politicians, government agencies and businesses — create news and thus must be coddled, their every official pronouncement disseminated in the form of news, lest the outlets be denied access, which would of course put an end to their ability to do their jobs.

One symptom of this too-close-for-comfort relationship between the fourth estate and those it is supposed to cover is the willingness of outlets like the New York Times to suppress or delay stories at the request of intelligence agencies due to so-called “national security concerns.”

The idea that reporters need access to PR flacks is nonsense. The opposite is true: publicists need journalists. A press conference is a news-free zone, a place where spin and propaganda rule. Unfortunately for them and for us — since the vast majority of reporting still originates in corporate-owned newspapers — the trend is accelerating.

Check out, for example, this excuse for a news story: “Obama condemns ‘brutal and outrageous murders.’”

According to Google, this story – about the president’s reaction to the murder of three Muslim students in Chapel Hill, North Carolina – was reproduced over 78,000 times in American and foreign media outlets.

There is nothing wrong with what Obama said. To the contrary: I agree with him 100%. Most likely, so do you. So do 100% of sane Americans, which means perhaps 90% of all Americans. His reaction was the exact reaction that you would expect from anyone and, to my point, specifically from him.

In other words, a news “story” about President Obama saying that mass murder is bad (outside the context of, say, wars of choice and drone assassinations) is no story at all. It is “dog bites man.” And not a particularly interesting dog or a particularly interesting man.

You really have to question the judgment of those thousands of editors and producers who put that story out yesterday. Who, exactly, did they think that story served? Certainly not the readers or viewers or listeners. Not one of them was surprised; not one of them cared.

Every newsroom receives hundreds if not thousands of emails a day from people who want their story or product or person covered. Publishers want their books reviewed. Manufacturers want a free plug for their products. Agents want their pet musician profiled. The vast majority of them are, of course, ignored. Pertinent story: a friend who works at a major American newspaper tells me about the fax machine no one ever checks, running 24 hours a day, endless press releases dumping straight into the recycled-paper bin, pointless for all concerned. Yet I know for a fact that that same paper ran the story about Obama taking the bold risk of coming out against random mass murder. Why that story and not the others?

I’m not arguing that traditional media outlets ought to descend to Huffington Post’s SEO-optimized clickbait or BuzzFeed’s “18 ridiculously cute photos of insipid pets” listicles. But the Internet is certainly a lot better at knowing what people might actually want to read or see. Stories like the one above make that painfully obvious.

As an editor at the New York Times told me once, “the President of the United States controls the world’s largest armies and presides over the world’s largest economy. By definition, anything he says and does is news.”

They live by that attitude. They are also dying by it.

The Freelance Workers Manifesto

Originally published by Breaking Modern:

Occupy Wall Street is no more, but its demand that America treat its workers better remains at the forefront of the national conversation.

Jobs and stagnant salaries will probably be important issues in the 2016 presidential campaign. Even Republicans, traditionally the party of business, are building their platform around the problem of rising income inequality and how conservative ideas can alleviate it. President Obama wants to improve conditions for American workers by requiring employers to provide guaranteed paid sick days and family leave, and making it easier for them to join a union.

All great news for workers, who have been taking it on the chin since at least the 1970s, the last time real wages kept up with inflation. Yet there’s a glaring gap in the discussion: freelancers.

Ten million Americans are completely self-employed — that’s way up, from just 1.3 million people in 2001. A whopping 53 million people devote part of their workweek to freelance work. “Even though there may not be jobs in the conventional sense, there is still work,” urban analyst Bill Fulton told Forbes. “That’s the whole idea of the 1099 economy. It’s just a different way of organizing the economy.”

When politicians and the media talk about workers and how to improve their lot, independent contractors and entrepreneurs are almost always left out. But self-employment isn’t going away. Though the current tentative economic recovery has caused a slight dip in the percentage of U.S. workers who receive 1099s (as opposed to W-2s) at the end of each year, labor experts anticipate that more workers will become freelancers. This will either be by choice or, after being laid off, out of necessity. As automation and international outsourcing continue to reduce the demand for full-time workers, and CEOs increasingly turn to the “contingent workforce” to fulfill their staffing needs on an as-needed basis, being dumped like a dirty napkin when demand slackens is common.

The software company Intuit predicts that a whopping 40% of American workers will be freelancers, contractors or temporary workers by the year 2020.

Are proposed reforms enough?

No. None of the proposed reforms would do anything to help these lone wolves.

When you work for yourself there’s no employer to give you paid vacation days, much less paid sick days or parental leave. What are you going to do, unionize against yourself? Forget about going on strike for higher wages — which, given that 12% of freelancers are on food stamps, the self-employed could certainly use.

Freelancers earn less than full-timers. They work longer hours. They’re less economically secure. Because they can’t afford to say no when a possible client calls, their time isn’t their own, even on weekends and holidays. Speaking of which: what holidays?

If the balance between labor and management has inexorably shifted toward the latter in traditional workplaces over the past half-century, the move toward an increasingly insecure, off-and-on-again workforce will only accelerate that trend. It’s a seismic shift and the main force driving down average wages. Yet public policy hasn’t merely failed to catch up — it hasn’t even begun to think about it.

As David Atkins wrote in Washington Monthly: “Simply letting the economy slide into the enforced uncertainty of the freelance economy without helping workers achieve dignity and stability is not an acceptable outcome.” But how can we avoid it?

Sara Horowitz of the Freelancers Union (not a union in the traditional sense, mostly just a way for independent workers to buy pooled health insurance) tells The Washington Post that one way to even out the feast-or-famine problem would be for Congress to authorize 529-like savings schemes: “Freelancers could be allowed to set up pre-tax accounts for their earnings that would go tax-free if they fell below a certain level, to keep them out of poverty during dry spells.” In England, government officials have experimented with a “central database of available hours” as a public option for freelance work scheduling.

Also in the Post: “As a general philosophy, social welfare benefits might need to shift towards how they work in Europe, where entitlements are attached to the individual, rather than their relationship with an employer. Some academics have described a new ‘dependent contractor’ status that would cover workers who serve mostly one client. These workers, the argument goes, should have more protections — unemployment insurance, for example, or workers compensation — than those who pick and choose their assignments from a number of different sources.”

Good suggestions, but pretty weak tea compared to the really big problems — much lower pay, much less security — faced by the new rising class of on-demand workers.

The best way to reverse decreasing wages

The best way to reverse downward pressure on wages would be for the federal government to set prices for labor on everything from the cost of a new roof to the price per word received by a writer to create an article like this one. For Americans accustomed to letting the “magic of the marketplace” govern their financial destinies, this would be a radical reform. But it’s not unprecedented. Wage and price controls have been deployed in India, the world’s biggest democracy. In 1971 President Nixon went after inflation driven by predatory corporations by freezing all wages and prices — a move conservatives declared a failure but that dramatically helped working poor people like my mom, who still says it saved our lives (even though she hated Nixon).

This would require a new government bureaucracy, but hey, hiring federal workers would reduce unemployment. Setting minimum wages for freelance work would be challenging, but experts know the marketplace. As a writer, I know that outfits that offer $25 for 1,000 words ought to be ashamed of themselves — no one should write anything for less than $1 a word.

Congress should extend protections against workplace discrimination based on race, age, gender, sexual orientation and disability to allow wronged freelancers to sue for compensatory as well as punitive damages.

Companies and individuals who engage the services of freelance workers should be required to pay into a general compensation fund managed by the federal government. This would probably be remitted as a percentage of compensation. Freelancers should be able to draw on the fund to take paid vacation and sick time off, as well as paternity and maternity benefits.

The only way to prevent American freelance workers from sliding into a chattel class reminiscent of life in a third-world country will be to give them the same rights, privileges and protections as those enjoyed by full-time workers.

Which, of course, will likely reduce the number of employers who transition from a full-time to an on-demand workforce.

Goodbye, Jon Stewart: Please Let the New Guy Be Funnier

Originally published by ANewDomain.net:

Jon Stewart’s decision to leave “The Daily Show” at what critics universally call the top of his game serves as another reminder of just how humor-deprived contemporary American television has become.

Stewart is, like his fellow Comedy Central alum Stephen Colbert and standup megastar Louis C.K., one of the most overrated talents of our time. Not that he isn’t fast on his feet – he is. Not that the camera doesn’t love him – it does. Not that he doesn’t understand timing – he does.

What Stewart isn’t is falling-down-on-the-floor hilarious. His fits-and-starts, lurching style of monologue elicits plenty of knowing guffaws and the occasional eye-rolling laugh at the expense of, typically, an ideologically inconsistent politician. But because he refuses to take the chance of alienating his audiences by offending them, he never risks falling off the high wire you have to climb in order to achieve comedy greatness.

If you want to be really funny, you have to be dangerous.

(To illustrate this point, I was going to cite a farm-based joke by Rudy Ray Moore, the black comedian and Blaxploitation filmmaker of the 1970s and 1980s, but it’s so outrageous and so obscene that I’m pretty sure I’d never work again if I did. Now that’s some wickedly funny stuff.)

I remember – actually, as a cartoonist, I am traumatized by recollecting – a female friend telling me why she turned against the late great George Carlin.

She loved Carlin. She owned many of his albums. She had seen him in concert many times. She couldn’t stop talking about how brilliant he was. Then, she explained, he told one joke that offended her feminist sensibilities. After that, he was dead to her.

I was baffled and a little disgusted. “In baseball, if you hit the ball 35 percent of the time, you’re a God. So you mean to tell me that George Carlin told thousands of jokes that you loved, gave you hours of pleasure and countless laugh-out-loud moments, but because of one joke, he was dead to you? You fired a guy with a .999 batting average!” (I’m more in the 30 percent range.)

He was.

Here’s the joke that pissed her off: “Have you ever noticed that the women who are against abortion are women you wouldn’t want to fuck anyway?”

Neither Jon Stewart nor Stephen Colbert nor John Oliver is ever going to say anything that funny. Or that mean. That’s not their business model. They walk within a very narrow set of lines defined by decades of political correctness.

Which is fine. Really. I don’t have a problem with what they do. The issue isn’t that they play it safe; the problem is that America is so starved for comedy that they manage to pass this bland stuff off as the real thing. The only reason they have been so successful is that they followed decades of horrible late-night tedium: Jay Leno, David Letterman and the inexplicably still-on-the-air “Saturday Night Live,” which, contrary to conventional wisdom, was never very funny and is certainly much less so now.

“The humor that makes me laugh hardest is the material I know would offend or insult someone else,” wrote “Dilbert” cartoonist Scott Adams in 2008. “But offending isn’t enough. The audience gets more out of humor if the messenger is putting himself in danger.” Adams says it’s a universal law, and I agree with him. It certainly applies to me. My most outrageous work – on 9/11 widows, Pat Tillman, making fun of American soldiers fighting in Afghanistan and Iraq – is also some of my funniest. And it definitely put me in danger: I stopped counting the death threats at 1000. And I lost some good jobs.

Every now and then, someone has to kill a humorist to remind us how dangerous good humor can be.

Of all things, last month’s massacre of – whether you like them or not, outrageously funny – cartoonists at the Charlie Hebdo magazine in Paris reminded some Americans of what exists elsewhere, but has been lost here, or perhaps never existed: an over-the-top, ribald, take-no-prisoners culture of satire, particularly in print but also on television.

Every few years, I make the rounds in Hollywood trying to pitch TV show ideas. During the peak of the Bush years, and before the idiotic “That’s My Bush!,” I presented to executives all over Los Angeles my idea for a comedy, either live action or animated, that made fun of the Bush family and the president’s top officials. The hook was, Bush was actually a reluctant leader, didn’t want to be there, and was secretly brilliant but didn’t want to let on. His daughter Jenna was really running the show. Dick Cheney was a softhearted wimp who broke into tears over nothing.

Maybe the show was a dumb idea, I don’t know, I’m not a TV executive. But that’s not the point of the story. The point is, Hollywood was so satirically illiterate that they rejected the idea based on legal fears: they were worried about being sued by the first family. As I repeatedly explained, Bush and Cheney and their families were public figures, so it would have been possible to mock them six ways to Sunday without having to worry about a successful lawsuit. Besides, almost every other Western country on earth had some sort of comedy show that sent up its political leaders: France, Germany, Spain, Portugal, even, at the time, Russia. They were all quite popular.

I explained that, as an editorial cartoonist, I routinely say all sorts of terrible things about the president, and yet, here I am, not in prison, at least not yet. But the pitch meetings never got beyond the legal questions. That’s how safe TV has become: they don’t even know what the legal landscape looks like.

There have been some bright spots. But you have to wonder, would anyone greenlight “The Simpsons” today? How about “South Park”?

There is no denying the success of the Comedy Central approach. Millennial viewers who would never watch the evening news nevertheless enjoy, and learn from, the fake news format pioneered by Stewart and Colbert. But make no mistake: that is not hard-hitting political satire.

Louis C.K., who is undeniably much funnier than those two, nonetheless likes to keep things safe as well. Although he is incredibly incisive when delivering humorous observations about divorce, relationships, parenthood and popular culture, he generally shies away from straight-ahead politics.

The fact that it hasn’t always been this way tells us that things can change. In the 1960s and 1970s, even the relatively tame Bob Newhart and Bill Cosby routinely delivered more trenchant humor than you’ll find on television today. Richard Pryor, of course, was a God. Hell, Lenny Bruce got arrested! That’s not going to happen to the big-time comedians that we are constantly being told are so funny today.

Don’t get me wrong, there’s plenty of great humor out there – but not on the national stage, not on network television.

So now that the big Comedy Central stars have left, Colbert to the blander-than-bland Late Show (notice how no one talks about him anymore?), Stewart to whatever he figures out, we have an opportunity to reconsider the fact that, as a humor-loving people, Americans have the God-given right to watch dangerously funny TV shows — and there has never been a time when they were more needed.

Shhh! The Samsung TV is Listening! Or is It?

Originally published at Breaking Modern:

This week it came out that Samsung was warning users of its new smart televisions not to discuss personal information around their TVs because it could be transmitted to a third party.

“Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition,” said the company’s privacy policy. A day after the so-called revelation, though, Samsung said it was removing the warning and that the Samsung TV doesn’t eavesdrop on or store conversations at all.

Only $1984!
