Monday, September 26, 2011

The "Antichrist"

September 25, 2011

Why the Antichrist Matters in Politics

Pullman, Wash.

THE end is near — or so it seems to a segment of Christians aligned with the religious right. The global economic meltdown, numerous natural disasters and the threat of radical Islam have fueled a conviction among some evangelicals that these are the last days. While such beliefs might be dismissed as the rantings of a small but vocal minority, apocalyptic fears helped drive the antigovernment movements of the 1930s and ’40s and could help define the 2012 presidential campaign as well.

Christian apocalypticism has a long and varied history. Its most prevalent modern incarnation took shape a century ago, among the vast network of preachers, evangelists, Bible-college professors and publishers who established the fundamentalist movement. Baptists, Methodists, Presbyterians, Pentecostals and independents, they shared a commitment to returning the Christian faith to its “fundamentals.”

Biblical criticism, the return of Jews to the Holy Land, evolutionary science and World War I convinced them that the second coming of Jesus was imminent. Basing their predictions on biblical prophecy, they identified signs, drawn especially from the books of Daniel, Ezekiel and Revelation, that would foreshadow the arrival of the last days: the growth of strong central governments and the consolidation of independent nations into one superstate led by a seemingly benevolent leader promising world peace.

This leader would ultimately prove to be the Antichrist, who, after the so-called rapture of true saints to heaven, would lead humanity through a great tribulation culminating in the second coming and Armageddon. Conservative preachers, evangelists and media personalities of the 20th century, like Billy Sunday, Aimee Semple McPherson, Billy Graham and Jerry Falwell, shared these beliefs.

Fundamentalists’ anticipation of a coming superstate pushed them to the political right. As the government grew in response to industrialization, fundamentalists concluded that the rapture was approaching. Their anxieties worsened in the 1930s with the rise of fascism. Obsessed with matching biblical prophecy with current events, they studied Mussolini, Hitler and Stalin, each of whom seemed to foreshadow the Antichrist.

President Franklin D. Roosevelt troubled them as well. His consolidation of power across more than three terms in the White House, his efforts to undermine the autonomy of the Supreme Court, his dream of a global United Nations and especially his rapid expansion of the government confirmed what many fundamentalists had feared: the United States was lining up with Europe in preparation for a new world dictator.

As a result, prominent fundamentalists joined right-wing libertarians in their effort to undermine Roosevelt. That this mix of millennialism and activism seemed inconsistent — why work for reform if the world is destined for Armageddon? — never troubled them. They simply asserted that Jesus had called them to “occupy” until he returned (Luke 19:13). Like orthodox Marxists who challenge capitalism even though they say they believe it represents an inevitable step on the road to the socialist paradise, conservative Christians never let their conviction that the future is already written lead them to passivity.

The world in 2011 resembles the world of the 1930s in many respects. International turmoil and a prolonged economic downturn have fueled distrust of government, as has the rise of a new libertarianism represented in the explosive growth of the Tea Party.

For some evangelicals, President Obama is troubling. The specious theories about his place of birth, his internationalist tendencies, his measured support for Israel and his Nobel Peace Prize fit their long-held expectations about the Antichrist. So does his commitment to expanding the reach of government in areas like health care.

In 2008, the campaign of Senator John McCain, the Republican nominee, presciently tapped into evangelicals’ apocalyptic fears by producing an ad, “The One,” that sarcastically heralded Mr. Obama as a messiah. Mr. McCain was onto something. Not since Roosevelt have we had a president of such charisma and global popularity who so perfectly fits the evangelicals’ Antichrist mold.

While Depression-era fundamentalists represented only a small voice among the anti-Roosevelt forces of the 1930s, evangelicals have grown ever savvier and now constitute one of the largest interest groups in the Republican Party. In the past, relatively responsible leaders like Mr. Graham, who worked with Presidents Lyndon B. Johnson and Richard M. Nixon, and even Mr. Falwell, who reined in evangelical excess in exchange for access to the Reagan White House, channeled their evangelical energy.

Not now. A leadership vacuum exists on the evangelical right that some Republicans — Rick Perry, Michele Bachmann and even Ron Paul — are exploiting. How tightly their strident anti-statism will connect with evangelical apocalypticism remains to be seen.

The left is in disarray while libertarianism is on the ascent. A new generation of evangelicals — well-versed in organizing but lacking moderating influences — is lining up behind hard-right anti-statists. While few of the faithful truly think that the president is the Antichrist, millions of voters, like their Depression-era predecessors, fear that the time is short. The sentiment that Mr. Obama is preparing the United States, as Roosevelt did, for the Antichrist’s global coalition is likely to grow.

Barring the rapture, Mrs. Bachmann or Mr. Perry could well ride the apocalyptic anti-statism of conservative Christians into the Oval Office. Indeed, the tribulation may be upon us.

Matthew Avery Sutton, an associate professor of history at Washington State University, is the author of “Aimee Semple McPherson and the Resurrection of Christian America.”

Monday, September 19, 2011

Dear Important Novelists

Dear Important Novelists: Be Less Like Moses and More Like Howard Cosell

“To start with, look at all the books.” That’s the promising first sentence of Jeffrey Eugenides’s new novel, “The Marriage Plot,” due in bookstores this fall. The book is about three college friends from Brown University, and at first taste — I’ve been toting around an advance copy — it appears to be luscious: humane, recondite and funny. Part of the heavy anticipation for “The Marriage Plot,” though, is a simple emotional response: We’ve missed this man. It has been nine years since Eugenides’s last novel, the Pulitzer Prize-winning “Middlesex.” It has been a long, lonely vigil. We’d nearly forgotten he was out there.

Eugenides, it turns out, has always worked on the nine-year plan. “The Marriage Plot” is only his third novel in 18 years. His books, like certain comets, are rare events. So much time elapses between them that his image in dust-jacket photographs can change alarmingly. On the flap of his first novel, “The Virgin Suicides” (1993), he seemed shy and earnest; he had some baby fat and a bit of hair. By the time “Middlesex” appeared, he was thinner, balder and possessed of a wily Van Dyke beard. Suddenly he resembled Ezra Pound or Errol Flynn in a swashbuckling role. I’m a bit afraid to look at his next one.

“To start with, look at all the books.” That’s a terrific first sentence. But at the pace he’s publishing — Eugenides is 51 — there will never be many of his books at which to look.

Distressingly, this kind of long gestation period is pretty typical for America’s corps of young, elite celebrity novelists. Jonathan Franzen took nine years to follow “The Corrections” (2001) with his next novel, “Freedom” (2010), and “The Corrections” itself was nine years in the making. Donna Tartt vanished for a decade between “The Secret History” (1992) and “The Little Friend” (2002); at this pace we’re due for a fat new Tarttlet next fall. Michael Chabon has gone seven years between major novels. David Foster Wallace was still working on his follow-up to “Infinite Jest” (1995) when he died in 2008, though in between he published excellent books of nonfiction and stories.

Obviously, some of this is about personal style. There have always been prolific writers as well as slow-moving, blocked, gin-addled or silent ones. It’s worth suggesting, though, that something more meaningful may be going on here; these long spans between books may indicate a desalinating tidal change in the place novelists occupy in our culture. Suddenly our important writers seem less like color commentators, sifting through the emotional, sexual and intellectual detritus of how we live today, and more like a mountaintop Moses, handing down the granite tablets every decade or so to a bemused and stooped populace. We roll our eyes at how seldom Time magazine puts writers on its cover — it once did so quite often — and sense this is evidence of the public’s shrinking appetite for quality literature. Perhaps it has got more to do with our novelists’ lagging output, their eroded willingness to be central to the cultural conversation.

Take, as a counterpoint, Saul Bellow, who over 11 industrious years delivered four novels, several of them among the 20th century’s best: “The Adventures of Augie March” (1953); “Seize the Day” (1956); “Henderson the Rain King” (1959); and “Herzog” (1964). Bellow snatched control, with piratical confidence and a throbbing id, of American literature’s hive mind. “We are always looking,” he once said, “for the book it is necessary to read next.” For this vivifying span, the book to read next was nearly always one of Bellow’s own.

Bellow could have spent those 11 years differently. He might have toiled on a “grander” book, let’s say a slablike “Augie March.” This hurts my head to ponder. Philip Roth, who has been on a phosphorescent late-career run, issuing nearly a novel a year from 2000 through 2010, might have instead decided to decant all that strong sinister wine into a single 1,200-page vessel that you would be tempted to title “Horny Goat Stares Down Death.” Thank you, Philip Roth, for not afflicting us thus. Among the mesmerizing things about his recent work is that we’ve felt as if we were reading him in something close to real time; this has given his books heat as well as light.

John Updike kept up a casually herculean pace his entire career. He wrote so much — 60 books during his lifetime — that Martin Amis called him, memorably, a “psychotic Santa of volubility.” When Updike reviewed a novel by another sadistically productive human being, Joyce Carol Oates, he wondered aloud if she needed “a lustier audience” of “Victorian word eaters.”

Updike and Oates are extreme examples, but there’s something to be said for what might be called the Woody Allen Method: Good times, bad times, you keep making art. Many of your productions will hit; some will miss; some will miss by a lot. But there’s no time for the flatulent gas of pretension to seep into your construction’s sheetrock. This is how Trollope, Balzac and Dickens worked. Each would have agreed with Gore Vidal, who once declared of those who moan about writer’s block: “You’re not meant to be doing this. Plenty more where you came from.”

This is not a plea for hasty work or for the death of the big novel. If there’s a “Middlemarch” or a “Magic Mountain” or a “House for Mr. Biswas” to be had, yes, please, bring it on. These novels are sustained and sustaining; they are also extreme rarities.

It’s not hard to sense what these modern, parsimonious writers are rebelling against. Surely they’re in flight from the shackling apparatus of modern publishing: the long press tours (“Hello, Cleveland!”), the much-hated publicity stops. The very economics of being a writer function as a set of speed bumps. Most novelists hold down teaching positions that subsidize their work; these jobs are also work-thwarters.

Some novelists may be in revolt against today’s almost militarily mechanized pop writers. Not so long ago a dignified genre writer — a John Grisham, let’s say — was expected to issue a book a year. Now we confront James Patterson, who publishes as many as nine a year; they pop from the chute like Krispy Kremes. Many of Patterson’s books are composed with other writers, as if he had a tree filled with Keebler elves outfitted with laptops and wee kegs of Red Bull.

For some novelists who write long and slow, there may be a subconscious critical strategy at play. As the critic Dwight Macdonald once put it, “It is difficult for American reviewers to resist a long, ambitious novel: they are betrayed by the American admiration of size and scope, also by the American sense of good fellowship; they find it hard to say to the author, after all his work: ‘Sorry, but it’s terrible.’ ”

Another result of the once-a-decade approach is that you feel obliged to put out novels that appear to have genuinely taken 10 years to write. Franzen’s “Freedom” (576 pages) could have stood some liposuction, and Tartt’s “Little Friend” (640 pages) would have benefited from a great deal of it. Few readers wished these books — or Eugenides’s “Middlesex” (544 pages) — longer. The Brits have always been better at not overstaying their welcome. In the 1980s and ’90s, Julian Barnes delivered a running master class in the shortish novel, and Ian McEwan, running alongside him, picked up the baton. But here in America, we respect girth. “Middlesex” may go down as Eugenides’s signal accomplishment. But it’s his ethereal and almost novella-length “Virgin Suicides” I’ll reread first and perhaps even often. I feel about it the way its young protagonists did, watching one of the teenage Lisbon sisters in an intimate moment: “The zipper opened all the way down our spines.”

Art is supposed to emerge, unless you are Jackson Pollock or Banksy, slowly; it drips rather than flows. We are suspicious of ease. It has always been easy to poke fun at overwriters. Most-prolific lists are packed with regrettable names like Barbara Cartland (700-plus books), Isaac Asimov (400-plus) and the South African Mary Faulkner, no relation to William, who wrote more than 900 books. At minimum, a writer’s books should be like his or her serious boyfriends or girlfriends; if there are some you can’t remember, you have had too many.

Serious writers are sometimes mocked for logorrhea as well. According to the novelist and biographer Jay Parini, himself no slouch in the production department, there’s a story about a graduate student who telephoned the great literary critic Harold Bloom at home. Bloom’s wife answered and said, “I’m sorry, he’s writing a book.” The student replied: “That’s all right. I’ll wait.”

A long silence can, on occasion, help a writer. Thomas Pynchon’s spectral reputation only rose in the 17 years between “Gravity’s Rainbow” (1973) and “Vineland” (1990). A long silence can wound as well. William Styron’s inability to complete a novel in the last 27 years of his life became entangled with the depression chronicled in his memoir, “Darkness Visible.”

You sense this generation of writers’ relative absence in other ways. It’s a calamity for our literary culture that so few of them write criticism on anything close to a regular basis. Not so long ago, writing occasionally about your peers was seen as part of being in the guild. It provided a different way for a writer to walk his or her wits, in ways that often reflected back on their own work. Attending to criticism was a way of stumbling upon new and articulate voices. I recall ordering my first Diane Johnson novel after reading her coruscating review of some books about food and entertaining in The New York Review of Books.

I’ve picked on Franzen and Eugenides, writers I admire, more than enough in this small essay. (And I’ve ignored many fine writers who publish more frequently.) But let me conclude with this autumnal observation. At their current rate of production, by the next time a novel from either appears, my children, not yet in high school, will have graduated from college. Actuarial tables inform me that my dogs, their muzzles not yet close to gray, will have died and been buried in the backyard. Two presidents may have come and gone.

I won’t entirely have forgotten these writers, but I will have learned to live without them. They’re like old friends who are never around enough to quite miss. The sign I’ll hang in my mental shop window will read: “I’ll be around when you get back. Sort of.”

Flip that sign over, and it will declare: “If you and your peers wish to regain a prominent place in the culture, one novel a decade isn’t going to cut it.”

Saturday, September 17, 2011

Know-Nothing Candidates

Egghead and Blockheads

WASHINGTON

THERE are two American archetypes that were sometimes played against each other in old Westerns.

The egghead Eastern lawyer who lacks the skills or stomach for a gunfight is contrasted with the tough Western rancher and ace shot who has no patience for book learnin’.

The duality of America’s creation story was vividly illustrated in “The Man Who Shot Liberty Valance,” the 1962 John Ford Western.

Jimmy Stewart is the young attorney who comes West to Shinbone and ends up as a U.S. senator after gaining fame for killing the sadistic outlaw Liberty Valance, played by Lee Marvin. John Wayne is the rancher, a fast-draw Cyrano who hides behind a building and actually shoots Marvin because he knows Stewart is hopeless in a duel. He does it even though they’re in love with the same waitress, who chooses the lawyer because he teaches her to read.

A lifetime later, on the verge of becoming a vice presidential candidate, Stewart confesses the truth to a Shinbone newspaperman, who refuses to print it. “When the legend becomes fact,” the editor says, “print the legend.”

At the cusp of the 2012 race, we have a classic cultural collision between a skinny Eastern egghead lawyer who’s inept in Washington gunfights and a pistol-totin’, lethal-injectin’, square-shouldered cowboy who has no patience for book learnin’.

Rick Perry, from the West Texas town of Paint Creek, is no John Wayne, even though he has a ton of executions notched on his belt. But he wears a pair of cowboy boots with the legend “Liberty” stitched on one. (As in freedom, not Valance.) He plays up the effete-versus-mesquite stereotypes in his second-grade textbook of a manifesto, “Fed Up!”

Trashing Massachusetts, he writes: “They passed state-run health care, they have sanctioned gay marriage, and they elected Ted Kennedy, John Kerry, and Barney Frank repeatedly — even after actually knowing about them and what they believe! Texans, on the other hand, elect folks like me. You know the type, the kind of guy who goes jogging in the morning, packing a Ruger .380 with laser sights and loaded with hollow-point bullets, and shoots a coyote that is threatening his daughter’s dog.”

At a recent campaign event in South Carolina, Perry grinned, “I’m actually for gun control — use both hands.”

Traveling to Lynchburg, Va., to speak to students at Liberty University (as in Falwell, not Valance), Perry made light of his bad grades at Texas A&M.

Studying to be a veterinarian, he stumbled over chemistry and made a D one semester and an F in another. “Four semesters of organic chemistry made a pilot out of me,” said Perry, who went on to join the Air Force.

“His other D’s,” Richard Oppel wrote in The Times, “included courses in the principles of economics, Shakespeare, ‘Feeds & Feeding,’ veterinary anatomy and what appears to be a course called ‘Meats.’ ”

He even got a C in gym.

Perry conceded that he “struggled” with college, and told the 13,000 young people in Lynchburg that in high school, he had graduated “in the top 10 of my graduating class — of 13.”

It’s enough to make you long for W.’s Gentleman’s C’s. At least he was a mediocre student at Yale. Even Newt Gingrich’s pseudo-intellectualism is a relief at this point.

Our education system is going to hell. Average SAT scores are falling, and America is slipping down the list of nations for college completion. And Rick Perry stands up with a smirk to talk to students about how you can get C’s, D’s and F’s and still run for president.

The Texas governor did help his former chief of staff, who went on to lobby for a pharmaceutical company that donated to Perry, so he at least knows the arithmetic of back-scratching.

Perry told the students, “God uses broken people to reach a broken world.” What does that even mean?

The Republicans are now the “How great is it to be stupid?” party. In perpetrating the idea that there’s no intellectual requirement for the office of the presidency, the right wing of the party offers a Farrelly Brothers “Dumb and Dumber” primary in which evolution is avant-garde.

Having grown up with a crush on William F. Buckley Jr. for his sesquipedalian facility, I find it hard to watch the right wing of the G.O.P. revel in anti-intellectualism and anti-science cant.

Sarah Palin, who got outraged at a “gotcha” question about what newspapers and magazines she read, is the mother of stupid conservatism. Another “Don’t Know Much About History” Tea Party heroine, Michele Bachmann, seems rather proud of not knowing anything, simply repeating nutty, inflammatory medical claims that somebody in the crowd tells her.

So we’re choosing between the overintellectualized professor and blockheads boasting about their vacuity?

The occupational hazard of democracy is know-nothing voters. It shouldn’t be know-nothing candidates.

Thursday, September 15, 2011

"New" Atheism & Theism

Beyond ‘New Atheism’

The Stone, a forum for contemporary philosophers on issues both timely and timeless, features occasional posts by Gary Gutting, a professor of philosophy at the University of Notre Dame, that apply critical thinking to information and events in the news.

Led by the biologist Richard Dawkins, the author of “The God Delusion,” atheism has taken on a new life in popular religious debate. Dawkins’s brand of atheism is scientific in that it views the “God hypothesis” as obviously inadequate to the known facts. In particular, he employs the facts of evolution to challenge the need to postulate God as the designer of the universe. For atheists like Dawkins, belief in God is an intellectual mistake, and honest thinkers need simply to recognize this and move on from the silliness and abuses associated with religion.

Most believers, however, do not come to religion through philosophical arguments. Rather, their belief arises from their personal experiences of a spiritual world of meaning and values, with God as its center.

In the last few years there has emerged another style of atheism that takes such experiences seriously. One of its best exponents is Philip Kitcher, a professor of philosophy at Columbia. (For a good introduction to his views, see Kitcher’s essay in “The Joy of Secularism,” perceptively discussed last month by James Wood in The New Yorker.)

Instead of focusing on the scientific inadequacy of theistic arguments, Kitcher critically examines the spiritual experiences underlying religious belief, particularly noting that they depend on specific and contingent social and cultural conditions. Your religious beliefs typically depend on the community in which you were raised or live. The spiritual experiences of people in ancient Greece, medieval Japan or 21st-century Saudi Arabia do not lead to belief in Christianity. It seems, therefore, that religious belief very likely tracks not truth but social conditioning. This “cultural relativism” argument is an old one, but Kitcher shows that it is still a serious challenge. (He is also refreshingly aware that he needs to show why a similar argument does not apply to his own position, since atheistic beliefs are themselves often a result of the community in which one lives.)

Even more important, Kitcher takes seriously the question of whether atheism can replace the sense of meaning and purpose that believers find in religion. Pushed to the intellectual limit, many believers will prefer a religion of hope if faith is not possible. For them, Tennyson’s “‘The stars,’ she whispers, ‘blindly run’” is a prospect too bleak to sustain our existence. Kitcher agrees that mere liberation from theism is not enough. Atheists, he maintains, need to undertake the positive project of showing how their worldview can take over what he calls the ethical “functions” of theism.

There are those — Dawkins, for one example; existentialists like Sartre, for another — who are invigorated at the very thought that there is no guiding power in the universe. Many others, however, need convincing that atheism (or secular humanism, as Kitcher prefers) has the resources to inspire a fulfilling human life. If not, isn’t the best choice to retreat to a religion of hope? Why not place our bet on the only chance we have of real fulfillment?

Kitcher has a two-part answer. First, he offers a refined extension of Plato’s famous dilemma argument in “Euthyphro” (is an act good because the gods command it, or do the gods command it because it is good?) to show that, contrary to widespread opinion, theism is not in fact capable of grounding the ethical values that make life worthwhile. Second, to show that secularism is capable of grounding these values, he offers a sophisticated account of how ethics could have evolved as a “social technology” — a set of optimally designed practices and norms — to satisfy basic human desires.

Kitcher’s case is open to serious objections, but it has the conceptual and logical weight that is lacking in the polemics of the scientific atheists. It also lets Kitcher enter into genuine dialogue with believers like the philosopher Charles Taylor, whose defense of religion in “A Secular Age” offers an essential counterpoint to almost everything Kitcher says.

For a long time, meaningful engagement between believers and nonbelievers was, especially in the United States, blocked by an implicit mutual agreement: religious belief was exempted from challenge, provided it remained within a private sphere of religious life, and was not asserted as relevant to any issues of public concern. Over the last few decades, however, conservative Christians have rejected this agreement, particularly over issues like abortion and evolution. The scientific atheists, led by Dawkins, rightly responded with their aggressive insistence that militant believers justify the claims they wanted taken seriously in the public sphere.

The resulting polemics cleared some murky air but now have little use except to keep assuring each side of the other’s perversity. Kitcher’s secular humanism reanimates the debate, promising much-needed serious reflection on whether the divine can or should be eliminated from our moral lives.

Such a debate may not result in a victory for secular humanism. But even if it does, secular humanists would still face the much greater practical task of embedding their convictions in secular versions of the religious institutions, rituals and customs that even today remain vital fixtures in our social world. But Kitcher’s challenge, unlike Dawkins’s, is one that reflective believers have no easy way of evading, and meeting it may well seriously revise their understanding of their faith.

Wednesday, September 14, 2011

Global Warming

Is It Weird Enough Yet?

Every time I listen to Gov. Rick Perry of Texas and Representative Michele Bachmann of Minnesota talk about how climate change is some fraud perpetrated by scientists trying to gin up money for research, I’m reminded of one of my favorite movie lines, the one Jack Nicholson delivers to the needy neighbor who knocks on his door in “As Good As It Gets”: “Where do they teach you to talk like this?” Nicholson asks. “Sell crazy someplace else. We’re all stocked up here.”

Thanks, Mr. Perry and Mrs. Bachmann, but we really are all stocked up on crazy right now. I mean, here is the Texas governor rejecting the science of climate change while his own state is on fire — after the worst droughts on record have propelled wildfires to devour an area the size of Connecticut. As a statement by the Texas Forest Service said last week: “No one on the face of this earth has ever fought fires in these extreme conditions.”

Remember the first rule of global warming. The way it unfolds is really “global weirding.” The weather gets weird: the hots get hotter; the wets wetter; and the dries get drier. This is not a hoax. This is high school physics, as Katharine Hayhoe, a climatologist in Texas, explained on Joe Romm’s invaluable Climateprogress.org blog: “As our atmosphere becomes warmer, it can hold more water vapor. Atmospheric circulation patterns shift, bringing more rain to some places and less to others. For example, when a storm comes, in many cases there is more water available in the atmosphere and rainfall is heavier. When a drought comes, often temperatures are already higher than they would have been 50 years ago, and so the effects of the drought are magnified by higher evaporation rates.”
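For the curious, the piece of high school (or early college) physics Hayhoe is invoking is the Clausius-Clapeyron relation, which governs how much water vapor saturated air can hold as it warms. A minimal sketch, using standard textbook constants rather than anything from the column:

\[
\frac{de_s}{dT} = \frac{L\,e_s}{R_v T^2}
\quad\Longrightarrow\quad
e_s(T) \approx e_s(T_0)\,\exp\!\left[\frac{L}{R_v}\left(\frac{1}{T_0}-\frac{1}{T}\right)\right]
\]

Here \(e_s\) is the saturation vapor pressure, \(L \approx 2.5 \times 10^6\) J/kg is the latent heat of vaporization of water and \(R_v \approx 461\) J/(kg·K) is the gas constant for water vapor. Near a surface temperature of \(T_0 \approx 288\) K, the factor \(L/(R_v T_0^2)\) comes to roughly 0.07 per kelvin: each degree of warming lets the air hold about 7 percent more water vapor, which is why the wets get wetter where storms form and the dries get drier where evaporation runs faster.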

CNN reported on Sept. 9 that “Texas had the distinction of experiencing the warmest summer on record of any state in America, with an average of 86.8 degrees. Dallas residents sweltered for 40 consecutive days of grueling 100-plus degree temperatures. ... Temperature-related energy demands soared more than 22 percent above the norm this summer, the largest increase since record-keeping of energy demands began more than a century ago.”

There is still much we don’t know about how climate change will unfold, but it is no hoax. We need to start taking steps, as our scientists urge, “to manage the unavoidable and avoid the unmanageable.” If you want a quick primer on the latest climate science, tune in to “24 Hours of Reality,” a live, worldwide online update at climaterealityproject.org, running over 24 hours from Sept. 14-15 with contributors from 24 time zones.

Not only has the science of climate change come under attack lately, so has the economics of green jobs. Here the critics have a point — sort of. I wasn’t surprised to read that the solar panel company Solyndra, which got $535 million in loan guarantees from the Department of Energy to make solar panels in America, filed for bankruptcy protection two weeks ago and laid off 1,100 workers. This story is an embarrassment to the green jobs movement, but the death by bankruptcy was a collaboration of the worst Democratic and Republican impulses.

How so? There is only one effective, sustainable way to produce “green jobs,” and that is with a fixed, durable, long-term price signal that raises the price of dirty fuels and thereby creates sustained consumer demand for, and sustained private sector investment in, renewables. Without a carbon tax, gasoline tax or cap-and-trade system that makes renewable energies competitive with dirty fuels while they achieve scale and move down the cost curve, green jobs will remain a hobby.

President Obama has chosen not to push for a price signal for political reasons. He has opted for using regulations and government funding. In the area of regulation, he deserves great credit for just pushing through new fuel economy standards that will ensure that by 2025 the average U.S. car will get the mileage (and have the emissions) of today’s Prius hybrid. But elsewhere, Obama has relied on green subsidies rather than a price signal. Some of this has really helped start-ups leverage private capital, but you also get Solyndras. The G.O.P. has blocked any price signal and fought every regulation. The result too often is taxpayer money subsidizing wonderful green innovation, but with no sustainable market within which these companies can scale.

Let’s fix that. We need revenue to balance the budget. We need sustainable clean-tech jobs. We need less dependence on Mideast oil. And we need to take steps to mitigate climate change — just in case Governor Perry is wrong. The easiest way to do all of this at once is with a gasoline tax or price on carbon. Would you rather cut Social Security and Medicare or pay a little more per gallon of gas and make the country stronger, safer and healthier? It still amazes me that our politicians have the courage to send our citizens to war but not to ask the public that question.
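To put rough numbers on that closing question (my back-of-the-envelope arithmetic, not Friedman’s, and assuming the commonly cited figure of roughly 135 billion gallons of gasoline consumed in the United States each year):

\[
\$0.10/\text{gallon} \times 1.35 \times 10^{11}\ \text{gallons/year} \approx \$13.5\ \text{billion/year}
\]

So even a dime-a-gallon tax would raise on the order of $13 billion to $14 billion a year, and a dollar a gallon roughly ten times that — real revenue, though actual receipts would run somewhat lower as higher prices trimmed consumption.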

Sunday, September 04, 2011

So Who's Schmart?

Pass, Fail and Politics

IT’S a foolish question, asking how smart a politician is.

It’s too vague. It ignores all the different wrinkles of intelligence and ways to measure it, along with the debatable link between brain power as it’s typically defined and skilled governance in terms of actual results. It’s a vessel for prejudices, a stand-in for grievances.

And yet it comes back around almost every election cycle, as it’s doing now.

Meet Rick Perry. At Texas A&M University, his grades were so poor he was on academic probation. He flunked advanced organic chemistry, which, in his defense, sounds eminently flunkable. He got a C in animal breeding, which doesn’t. For a “principles of economics” course, he attained a glittering D, as The Huffington Post detailed. You won’t be hearing him mention that much amid all his talk about Texas jobs creation.

His academic background, coupled with his rejection of climate change and fondness for gauzy generalities, prompted a story in Politico last week with this subtle headline: “Is Rick Perry Dumb?”

Based on grades alone, it seems so. But by that yardstick, even a politician as outwardly cerebral as Al Gore has some explaining to do. Gore got his very own college D — in a course about man’s place in nature, no less. Granted, this was at Harvard. But still.

Perry can’t dazzle in policy discussions. That’s also clear. The farther he ventures from Texas, the smaller he shrinks. When the radio talk show host Laura Ingraham recently tried to get him to say something specific — anything specific — about how America should deal with China, he clung so tightly to banalities that she was forced twice to plead: “What does that mean?”

But he’s savvy enough to have assembled a political team and adopted a political strategy that have him leading the (flawed) Republican field in a raft of recent polls. There’s something to that. Something more than excellent hair.

I’m less troubled by how thickheaded Perry may be than by how wrongheaded we already know he is on issues like evolution, which he says is just a theory, and homosexuality, which he has likened to alcoholism.

President Obama has those issues right. And he can talk authoritatively about them and most others. A former president of the Harvard Law Review, he has that kind of mind, that kind of fluency. In this one poised man, erudition and eloquence join hands.

But they don’t save him. Last week, he set himself up once again to look like the nation’s deferrer in chief by proposing a date for his jobs speech that had the possibility of provoking Republican opposition and did precisely that, at which point he retreated. Is this the Mother-May-I presidency? With John Boehner in the role of paddle-wielding matriarch?

That many Republicans will viciously seize any opportunity to defy and undercut Obama is a lesson he should have learned by now. Regardless of who was being unreasonable, it was he who actually ended up sending an e-mail to supporters with the one-word subject line “frustrated.” The president of the United States is supposed to salve our frustrations, not meekly bemoan his own.

Shouldn’t he or someone in his inner circle have foreseen the potential for events unfolding in such a humiliating fashion and made sure to avoid it? Apparently no one did, and that suggests a deficit of smarts by almost any definition of that ludicrously imprecise term.

Worse yet, this was only the latest in a long series of questionable calculations. Was it smart/prudent/pick-your-adjective to lavish all that precious post-election political capital on health care reform rather than economic revitalization and jobs creation, especially if it winds up being the first in a chain of dominoes that leads to defeat in 2012 and the repeal of that precise legislation?

Was it smart/prudent/pick-your-adjective not to head off a debt-ceiling showdown by settling the matter during last year’s lame-duck session of Congress, before Republicans took the reins in the House? And, during the showdown, didn’t Obama and his advisers misjudge both the zeal of some House Republicans and the magnitude of his own powers of persuasion?

Time and again, Obama hasn’t been a prescient or brutal enough tactician and hasn’t adjusted his high-minded ways to the low-minded sport of Congressional politics. That’s a failure of some kind, and intelligence may be one word for it.

“Is Obama Smart?” the Wall Street Journal columnist Bret Stephens asked in early August. That was the headline, and it’s at least as good a question, in terms of the president’s political efficacy of late, as the one Politico posed about Perry.

THAT Perry’s headline contained the harsher adjective — “dumb” — is typical, say many Republicans, who complain that journalists tend to equate the anti-intellectualism and populist affectations of many of their party’s candidates with outright stupidity. They cite Ronald Reagan as an example of someone first dismissed as a dunce and understood only later to be wise in some basic, consequential ways.

And they say that Democrats get a greater pass on gaffes than Republicans do. There’s merit to the argument. The recent verbal hiccup with which Joe Biden seemed to endorse China’s one-child policy lengthened a formidable list of Bidenisms, including his statement in 2007 that Obama, as a presidential contender, was “the first mainstream African-American who is articulate and bright and clean.” But Biden’s intelligence is seldom questioned, not the way it would be if he had a Tea Party affiliation and Southern drawl.

Then there’s the whole matter of whether we’re well served by a brainy president. In an excellent piece in Slate in 1999, Jacob Weisberg explored a growing body of thought that a president’s supposed brilliance (or lack thereof) has no bearing on success in office. By this theory, relatively ordinary smarts yielded extraordinary accomplishment (Reagan, Franklin D. Roosevelt, Harry S. Truman) while extreme intelligence led to defeat (Jimmy Carter, Richard Nixon, Herbert Hoover).

Weisberg rightly noted the huge flaws with this analysis, including the small sample size and the subjectivity involved in judging achievement. Here’s another: what makes Carter more brilliant than Roosevelt — or, for that matter, Reagan?

That assessment reflects a narrow, traditional understanding of smarts as a sort of academic aptitude, a facility with facts and language. But a whole genre of best-selling books over the last decade and a half insists — correctly, I think — that there are various ways to be clever: “Emotional Intelligence,” “The Wisdom of Crowds,” a knack for gut responses formed in the span of a “Blink.” None require exemplary SAT scores.

Instead of talking about how smart politicians are or aren’t, we should have an infinitely more useful, meaningful conversation about whether we share and respect their values and whether they have shown themselves to be effective. Someone who rates high on both counts is someone to rally unreservedly around.

Right now, neither Perry nor Obama fits that double bill.

One-Term "Wonder"

One and Done?

WASHINGTON

ONE day during the 2008 campaign, as Barack Obama read the foreboding news of the mounting economic and military catastrophes that W. was bequeathing his successor, he dryly remarked to aides: “Maybe I should throw the game.”

On the razor’s edge of another recession; blocked at every turn by Republicans determined to slice him up at any cost; starting an unexpectedly daunting re-election bid; and puzzling over how to make a prime-time speech about infrastructure and payroll taxes soar, maybe President Obama is wishing that he had thrown the game.

The leader who was once a luminescent, inspirational force is now just a guy in a really bad spot.

His Republican rivals for 2012 have gone to town on the Labor Day weekend news of zero job growth, using the same line of attack Hillary used in 2008: Enough with the big speeches! What about some action?

Polls show that most Americans still like and trust the president; but they may no longer have faith that he’s a smarty-pants who can fix the economy.

Just as Obama miscalculated in 2009 when Democrats had total control of Congress, holding out hope that G.O.P. lawmakers would come around on health care after all but three senators had refused to vote for the stimulus bill; just as he misread John Boehner this summer, clinging like a scorned lover to a dream that the speaker would drop his demanding new inamorata, the Tea Party, to strike a “grand” budget bargain, so the president once more set a trap for himself and gave Boehner the opportunity to dis him on the timing of his jobs speech this week.

Obama’s re-election chances depend on painting the Republicans as disrespectful. So why would the White House act disrespectfully by scheduling a speech to a joint session of Congress at the exact time when the Republicans already had a debate planned?

And why is the White House so cocky about Obama as a TV draw against quick-draw Rick Perry? As James Carville acerbically noted, given a choice between watching an Obama speech and a G.O.P. debate, “I’d watch the debate, and I’m not even a Republican.”

The White House caved, of course, and moved to Thursday, because there’s nothing the Republicans say that he won’t eagerly meet halfway.

No. 2 on David Letterman’s Top Ten List of the president’s plans for Labor Day: “Pretty much whatever the Republicans tell him he can do.”

On MSNBC, the anchors were wistfully listening to old F.D.R. speeches, wishing that this president had some of that fight. But Obama can’t turn into F.D.R. for the campaign because he aspires to the class that F.D.R. was a traitor to; and he can’t turn into Harry Truman because he lacks the common touch. He has an acquired elitism.

MSNBC’s Matt Miller offered “a public service” to journalists talking about Obama — a list of synonyms for cave: “Buckle, fold, concede, bend, defer, submit, give in, knuckle under, kowtow, surrender, yield, comply, capitulate.”

And it wasn’t exactly Morning in America when Obama sent out a mass e-mail to supporters Wednesday under the heading “Frustrated.”

It unfortunately echoed a November 2010 parody in The Onion with the headline, “Frustrated Obama Sends Nation Rambling 75,000-Word E-Mail.”

“Throughout,” The Onion teased, “the president expressed his aggravation on subjects as disparate as the war in Afghanistan, the sluggish economic recovery, his live-in mother-in-law, China’s undervalued currency, Boston’s Logan Airport, and tort reform.”

You know you’re in trouble when Harry Reid says you should be more aggressive.

If the languid Obama had not done his usual irritating fourth-quarter play, if he had presented a jobs plan a year ago and fought for it, he wouldn’t have needed to elevate the setting. How will he up the ante next time? A speech from the space station?

Republicans who are worried about being political props have a point. The president is using the power of the incumbency and a sacred occasion for a political speech.

Obama is still suffering from the Speech Illusion, the idea that he can come down from the mountain, read from a Teleprompter, cast a magic spell with his words and climb back up the mountain, while we scurry around and do what he proclaimed.

The days of spinning illusions in a Greek temple in a football stadium are done. The One is dancing on the edge of one term.

The White House team is flailing — reacting, regrouping, retrenching. It’s repugnant.

After pushing and shoving and caving to get on TV, the president’s advisers immediately began warning that the long-yearned-for jobs speech wasn’t going to be that awe-inspiring.

“The issue isn’t the size or the newness of the ideas,” one said. “It’s less the substance than how he says it, whether he seizes the moment.”

The arc of justice is stuck at the top of a mountain. Maybe Obama was not even the person he was waiting for.