The right eloquence needs no bell to call the people together and no constable to keep them. ~ Emerson

Thursday, July 30, 2009

Too Much to Bear



A Different Take on Profiling and Run-Ins with Authority

NOTE – Although this post is a parody, the event it describes did actually happen, if not the subsequent reaction and debate.

Everett Skinner of Grants Pass, Oregon, was awakened by his daughter Nicole last Saturday night because she believed a bear was inside their house. Everett grabbed his shotgun and went downstairs to investigate. Sure enough, a black bear returning from a long trip spent foraging in chinaberry bushes had ripped off a window screen and climbed into the family’s den.

Skinner reports that he and the bear saw each other at about the same time. He said the animal stood up and headed toward him at that point, so he fired at it four times, killing the animal. Skinner and his family were none the worse for their ordeal, but he reported they no longer slept with the windows open and would have to replace the carpeting in their den, as the current rug was “a bloody mess.”

Skinner would soon find himself at the center of a bloody mess of a very different kind.

As the news services spread his story, groups such as People for the Ethical Treatment of Animals Civil Liberties Union (PETACLU) responded angrily. They charged Skinner with overreacting as well as hunting out of season without a license. They noted the bear in question was elderly, graying, distinguished-looking, and walked with a limp. In their opinion, it was clear the bear posed no real threat.

Skinner argued that since the bear was in his home at the time, he was free to shoot it just because he felt like it.

PETACLU spokesperson Zell Ott conceded the bear was in Skinner’s house but countered, “Mr. Skinner built his house in the woods that were the bear’s home. We think Skinner is the real trespasser in this instance.”

Outrage was prevalent throughout the U.S. bear community but a few bears mildly stuck up for Skinner.

U.S. Forest Service mascot Smokey the Bear (semi-retired), interviewed by CNN’s Larry King, said the bear “might have come outside, talked to the human instead of growling at him, and that might have been the end of it.”

Smokey noted the bear was just back from the chinaberry bushes and exhausted. “All he wanted to do was hibernate,” he said.

However, “I think the bear should have reflected on whether or not this was the time to make that big a deal,” Smokey added.

On the other side of the issue, former cartoon star Yogi Bear believes Skinner is a speciesist and that his actions are symptomatic of a larger problem in Twenty-First Century America.

“That bear could have been in the house to use the bathroom or he might have mistaken it for a liquor store,” Yogi told reporters. “However, as soon as Mr. Skinner saw it was a bear, he immediately assumed it was dangerous and had to be put down. That is species profiling and it is hurtful and offensive to U.S. bears, who have suffered from it ever since white Europeans and other primates first came to American shores.”

Yogi then posed a hypothetical question. “Who here believes that Mr. Skinner would have shot to kill if he had discovered a black man, rather than a black bear, prowling about his house late at night?” he asked.

“I know I don’t believe he would,” Yogi continued, answering his own question and thereby causing most of the reporters in the room to lower their raised hands quickly and abashedly.

Smokey the Bear admits he too has been a victim of species profiling. “There is no bear in this country that has not been exposed to this kind of situation,” he said.

But, Smokey continued, “When you are faced with a human father trying to protect his family and get to the bottom of something, this is not the time to get in an argument with him. I was taught that as a cub.”

Yogi Bear rejects this line of reasoning.

“I respect him for his many accomplishments,” he told reporters, “but Smokey the Bear is what some of us bears nickname an ‘Uncle Teddy.’ He’s almost as big a suck-up as that goody-two-shoes little Boo-Boo I used to hang around with.”

Since leaving cartoons, Yogi has become a respected national authority and advocate on human/bear relations. He currently shits in the woods behind the Goldilocks and the Three Bears Tree of Ursine Studies at Jellystone National Park. Yogi said he was inspired to take up his new career to “advance inter-species dialogue, promote tolerance and cooperation, and . . . you know . . . make a little coin.”

He says Skinner was quick to feel fear when he confronted the bear but wonders if the man has ever thought about what was going through the animal’s mind.

“When he suddenly saw a human being standing in the doorway with a gun, that bear did not see a mild-mannered husband and father,” says Yogi. “He saw ‘Mr. Ranger, Sir’ and I can tell you from experience that carries a lot of emotional baggage and shame with it.”

Things got more complicated when Nicole Skinner held a charged press conference, where she fought back tears while denying that she had ever labeled the household intruder she first sounded the alarm about as a bear. She insisted she had simply told her father “something big and hairy was moving around downstairs.” Only after he pressed her to be more specific, she maintained, did she say it “might have been” a bear or possibly a Hispanic.

Yogi has since backed off some of his initial incendiary rhetoric. He now says he would like to use what happened at the Skinner house as a “teachable moment” to educate human beings about the lingering problem of speciesism in America.

President Obama was asked for his thoughts on the incident. He replied that although he lacked all the facts and was biased because Yogi Bear was a personal friend, he still thought the whole thing “sounded specious.”

When challenged by a reporter that he meant to say, “speciesist,” Obama replied, “I stand by my initial remark.”

Asked if he would like an invitation to the White House for a beer with Skinner and the President, Yogi was highly receptive to the idea.

“That sounds great,” he said. “But why stop at a beer? Why not some sandwiches and fried chicken too . . . maybe even a piece of chocolate cake . . . and maybe they could put it in a basket or something so we could take it out and have a picnic in the Rose Garden or the White House lawn . . . hey, I’m just saying.”

Wednesday, July 29, 2009

Trimming the Fat



The Link Between Cutting Healthcare Costs and Massive Government Intervention Is Closer than Many Realize

Reports of increased costs associated with the Congressional healthcare reform plans written by progressives have alarmed the public. In the midst of all this generated anxiety, moderates and conservatives see an opportunity to present themselves as brakemen on a runaway spending train.

Three Republican members of the Senate Finance Committee have teamed up with an equal number of centrist Democrats and announced they are close to compromise on an alternate bill that places reducing costs within the existing system ahead of universal coverage. Central to their proposal is eliminating any type of government-run public insurance option, anathema to most Republicans, as well as eliminating mandates requiring even large companies to provide health insurance to all employees.

Meanwhile, in the House, seven Blue Dog Democrats on the Energy and Commerce Committee have stalled that panel into inaction by demanding more cost savings.

President Obama signaled he might be willing to compromise on these points if he deems the counterproposals equally effective. However, most liberal Democrats have cried foul, insisting any such changes effectively gut reform.

Former DNC Chairman Howard Dean’s comments were typical. “This compromise does nothing except reform health insurance,” he told reporters. “It is not worthless because it makes [health insurance] fair, but it is not healthcare reform.”

The Washington Post’s Harold Meyerson agrees, today writing that the Senate Finance Committee seemed “on track to produce a plan that falls short of universal coverage, omits the savings that a competitive public plan would create, and might actually make health care harder to get. The only justification for such a bill is that it might win some Republican support. Why that is a goal worth pursuing at the expense of decent reform, however, is not at all apparent.”

Nonetheless, cost-cutting approaches are likely to garner enthusiastic public support. A recent Rasmussen poll found that Americans viewed reducing existing healthcare costs as more important than achieving universal coverage by a hefty forty-point margin.

Common complaints about reforms leading to universal coverage include a Congressional Budget Office projection that they will initially increase healthcare costs rather than reduce them, as well as fears they will result in more government intrusion and loss of individual choice. In addition, the CBO insists the progressive plans will do little to fight the underlying drivers of healthcare cost inflation.

Let us consider one such driver. A 2004 study, published in the journal Health Affairs by healthcare economist Kenneth Thorpe of Emory University, tracked three hundred and seventy medical conditions and found a mere fifteen accounted for over fifty percent of the $200 billion rise in health spending between 1987 and 2000.

The top five conditions – heart disease, mental illness, pulmonary conditions (e.g. asthma, allergies), cancer, and hypertension – alone accounted for a third of the increase. The remaining ten conditions, in descending order of contribution, are trauma, cerebrovascular disease, arthritis, diabetes, back problems, skin disorders, pneumonia, infectious disease, endocrine disease, and kidney disease.

The Washington Post hailed the study, noting, “By documenting the most costly conditions, Thorpe’s findings offer the beginnings of a road map for controlling health costs.”

Now, five years later, researchers have drawn a great big Interstate highway down the middle of that roadmap. A study just released this week by the nonprofit research group RTI International, again published in Health Affairs, identifies obesity as the key risk factor increasing the likelihood of virtually all those most expensive health complaints.

Obesity-related health spending topped $147 billion last year, double what it was nearly a decade ago. Obesity care now constitutes over nine percent of all healthcare spending, up from six and a half percent ten years ago. The RTI study found medical spending averages $1,400 more per year for an obese person than for someone of normal weight.

Thorpe is already in agreement with RTI’s conclusions. This past May, he testified before the Senate Health, Education, Labor, and Pensions Committee that six obesity-related medical conditions – diabetes, hypertension, hyperlipidemia, asthma, back problems and comorbid depression – are “key factors driving growth in traditional FFS Medicare.”

The problem of obesity is widespread in our nation, with about a third of adult Americans qualifying as obese. The obesity rate rose thirty-seven percent in the past decade.

For those fearing big government intrusions because of healthcare reform, the identification of obesity as the number one healthcare cost driver is not good news. Although some researchers have begun to reveal genetic predispositions and chemical imbalances that cause some individuals to eat more and/or convert food into fat more rapidly than others, the majority of the medical and healthcare community continues to regard obesity as a modifiable lifestyle choice.

“We have ways of changing behavior and changing those health outcomes so that we don't have to deal with the medical consequences of obesity,” explains Jeff Levi of the nonprofit Trust for America's Health, which advocates community-based programs that promote physical activity and better nutrition.

In addition, the CBO and other economic studies conclude that people too seldom heed the warning signs identified in traditional preventive care, such as regular screenings and checkups, for such care to result in any significant cost savings.

The bottom line here is that those seeking to combat healthcare cost inflation must conclude the main reason we’re sick is because we’re fat and the main reason we’re fat is because we’re lazy, over-privileged, and stupid. The “hard choices” faced by government regarding healthcare reform are to force us to make the hard choices for a healthier diet and lifestyle that we are apparently unwilling to make for ourselves.

Instead of tying the cost and availability of health insurance to age and family medical history, as is often done today, how about tying it to weight and body mass index? How about an income tax penalty or surtax along the lines of the Alternative Minimum Tax, using a formula tied to how many pounds each of us exceeds the “ideal weight” for our height? How about national ID cards we swipe at pre-approved establishments to prove we are putting in a minimum number of hours of exercise per week at the gym?

How about a hefty federal sales tax on Big Macs, Coca-Cola, and ice cream, similar to the taxes levied on consumers of products like tobacco and alcohol? How many small businesses do you think that might drive out of operation, and how many total jobs would be lost along the manufacturing and supply chain?

In many ways, these dire predictions are as much a bugaboo against this type of healthcare reform as the often-raised specter of “socialized medicine” against a government-run public insurance option. Yet just as cold-blooded bureaucrats could dictate who really needs various tests and treatments, cold-hearted economists could just as easily dictate who must shape up or ship out their hard-earned incomes.

In the end, a government-run public option has less to do with the type and quality of healthcare provided and much more to do with how Americans might obtain their health insurance. It is only when we pursue healthcare cost drivers and seek to contain or eliminate them that we begin to consider reforming the type and quality of healthcare provided and flirt with the massive government intervention into private decisions feared by so many.

If we want to trim the fat from healthcare costs, the best place to start is by trimming the fat from our waistlines. The old cliché remains the rule for healthcare reform – we cannot have our cake and eat it too . . . at least not without paying extra.

Monday, July 27, 2009

They Really Like Us Now! So What?



Perhaps the Most Encouraging Sign in a New Poll Is Who Likes Us Less Nowadays

President Obama has not had much encouraging news in the polls lately. Today’s Rasmussen daily tracking poll finds him falling below fifty percent popularity for the first time in his Presidency. Similarly, voters dislike the proposed Democratic Congressional healthcare plans, Obama’s signature initiative for this year, by about a ten point margin. This is a complete turnaround from a mere month ago.

Thus, it must have seemed like manna from heaven for his Administration last Thursday when the Pew Research Center published the results of a poll taken in two dozen foreign countries showing that the world’s perception of America has improved dramatically since Obama took office. In twenty-one countries scattered throughout Western Europe, Latin America, Africa, and Asia, an average of seventy-one percent of respondents voiced a positive view of the United States, up from a mere seventeen percent in those same countries when former President Bush was in office.

For those gushing, “They like us! They really like us!” Sally Field-style over this legitimately encouraging news, the harder follow-up question cannot be avoided – “So what?”

Popularity is a fine thing but it does not guarantee respect or influence. On the other hand, it can help grease the skids toward them. Moreover, while influence is certainly possible without popularity, true respect seldom follows from unpopularity.

This accomplishment is less impressive in its own right than as a first step leading to greater international cooperation and consensus with U.S. foreign policy. Whether those subsequent steps will come to fruition remains unclear. However, even on its own, it does counter the argument that Obama’s reliance on diplomacy is just feel-good mumbo jumbo that will lead to a loss of authority and esteem for the U.S.

Still, there are warning signs of fragility in the poll results. Much like his situation at home, Pew reports improvements in the perception of America by other nations “are being driven much more by personal confidence in Obama than by . . . specific [U.S.] policies.”

Obama’s greatest strength among foreigners may be the distance perceived between him and the policies of his predecessor. His decision to close the detention facility at Guantanamo Bay and his establishment of a timeline for withdrawing U.S. troops from Iraq met with universal approval. This is also consistent with continuing public patience for his domestic policies. Rasmussen recently found that fifty-four percent of Americans still place primary blame on former President Bush for the nation’s current economic problems, unchanged from a month earlier.

Yet the most significant finding of the poll may not be who likes us more but rather who likes us less nowadays. Among all the nations surveyed, regard for the U.S. decreased in only one since Obama took office – Israel. The significance of this derives from the fact that while Obama has gained modest ground for America in some parts of the Muslim world, such as Egypt, Jordan, and Indonesia, distrust remains unchanged and at very high levels in the Palestinian territories.

Key to negotiating a lasting settlement in the long-standing Israeli-Palestinian conflict is achieving compromise and concessions from both parties, and that requires both to trust the U.S. as a broker. The Palestinians will never trust the U.S. to support them at least some of the time if we continue to give them every reason to believe we will never fail to support Israel all of the time.

We have certainly given Israel that impression up to this point. Although we often criticize the Israeli government, we have never seriously threatened to withdraw aid, let alone level sanctions against them in reprisal for their truculence or aggression. Within that safety net, they have sometimes willingly made concessions only to subsequently withdraw them or act intrusively in other ways.

While it should not be U.S. policy to deliberately foster bad blood with a long-established ally, visible signs of Israeli disapproval toward us may do more to foster good blood with Palestinians than any set of our promises. The unflinching sternness exhibited by President Obama and Secretary of State Clinton of late toward Israeli provocations is yet another encouraging first step that could lead to limited trust and influence for the U.S. within certain parts of the Muslim world long before we gain popularity there (if ever).

Friday, July 24, 2009

Lessons Unlearned



Professor Gates Erred Badly by Injecting Racism into a Case of Dual Bruised Egos

Henry Louis “Skip” Gates is the Alphonse Fletcher University Professor at Harvard University, where he is Director of the W. E. B. Du Bois Institute for African and African American Research. He is a noted author and PBS television personality. He commands respect from all for his attempts to explore and communicate the black experience in America throughout its history. He earns veneration from many of his fellow African Americans.

Last week, in a single unfortunate incident, he did much to erase all of those previous contributions.

By now, we have all heard the story of Gates’s run-in with a white police officer investigating an erroneous report that Gates was breaking into his own home. The versions of events described by Gates and the arresting officer, Sergeant James Crowley, vary wildly.

According to Crowley, Gates was agitated and kept yelling at him. He repeatedly accused Crowley of being a “racist officer.” He asked if he were under suspicion because he was “a black man in America.” He also warned Crowley, “You don’t know who you’re messing with” and that police had not heard the last of him or this incident. He initially refused Crowley’s request to produce identification.

In Gates’s version of events, he was suspicious but cooperative and astonished when Crowley appeared to continue his investigation even after Gates had provided identification proving he lived at the property. He said he began asking Crowley repeatedly for his name and badge number but Crowley refused to respond. Only then, according to Gates, did he allege, “You're not responding because I'm a black man, and you're a white officer.”

Crowley agrees that Gates repeatedly demanded his name and badge number and claims that he provided them twice but that Gates paid no attention and just kept yelling at him.

Cambridge police released Gates without bail after booking and subsequently dropped all charges against him in the firestorm that broke out over his arrest. However, Gates seems determined that Crowley and the Cambridge police have not heard the last of him. He says he plans to discuss the incident in his classes as a teaching tool and may explore a possible PBS special on racial profiling.

“This is not about me; this is about the vulnerability of black men in America,” he told CNN.

Others have been quick to join Gates in his assessment.

“This arrest is indicative of at best police abuse of power or at worst the highest example of racial profiling I have seen,” proclaimed the Reverend Al Sharpton. “I have heard of driving while black and even shopping while black but now even going to your own home while black is a new low in police community affairs.”

This charge ignores two pertinent facts. First, police responded not because of their own observations of Gates on the porch of his home but to a citizen’s complaint. That phone call may well have had some racist-inspired phobia behind it but Crowley and the other officers were still duty-bound to answer it.

Second, Gates’s arrest never had anything to do with suspicion of trespassing. Crowley states in his report, “I was led to believe that Gates was lawfully in the residence” even before Gates provided any identification. Instead, Gates was arrested for “loud and tumultuous behavior in a public place at a uniform police officer who was present investigating a report of a crime in progress.”

For many, Gates’s reputation did more to prove his arrest must have been unwarranted than anything that actually happened that day. “If a mild-mannered, bespectacled Ivy League professor who walks with a cane can be pulled from his own home and arrested on a minor charge, the rest of us don’t stand a chance,” bemoaned Jimi Izrael in The Root.

Yet unless the police report is a pack of lies, Gates was distinctly less than “mild-mannered” with police. Officer Carlos Figueroa, Crowley’s partner, concurs that Gates refused to comply with Crowley’s initial request to provide identification and angrily accused Crowley of being racist. He also described Gates as shouting, uncooperative, and refusing to listen.

Allen Counter, a longtime Professor of Neuroscience at Harvard, insists the Cambridge police have long engaged in racial profiling. As “proof,” he offers his own near arrest five years ago, when police mistook him for a robbery suspect and he could not produce identification. “We do not believe that this arrest would have happened if Professor Gates was white,” Counter maintains.

Gates goes one step further. “I can't believe that an individual policeman on the Cambridge police force would treat any African-American male this way and I am astonished that this happened to me; and more importantly I'm astonished that it could happen to any citizen of the United States, no matter what their race.”

So no citizen should expect questions from a police officer responding to a break-in report, because the mere fact that they are standing inside a house proves beyond question it must be their home? Gates keeps setting new bars with his disingenuousness about this matter.

“I'd be glad if somebody called the police if somebody was breaking into my house,” Michael Schaffer, one of Gates’s neighbors, told a local reporter.

If Cambridge Police had dismissed the white neighbor’s phone report as a racist delusion and it turned out she was witnessing an actual break-in, would Gates have subsequently applauded them for not falling into profiling patterns? Or would he have charged them with racism for giving less priority to protecting the home of a black resident?

Crowley has no other known charges of racism or profiling on his record. In fact, his superiors picked Crowley to teach a course at the Lowell Police Academy demonstrating to recruits how to avoid racial profiling in their duties, which he has done for the past five years with consistently high marks in student evaluations.

Both Crowley and Gates are probably guilty of bias in their recall of the incident. One only has to look at the language used by both sides to see this is less about break-ins or disorderly conduct or racism or profiling than it is about dual cases of bruised egos.

The charge of racism “deeply hurts the pride of this agency,” admits Cambridge Police Commissioner Robert Haas.

Asked whether he would sue the Cambridge Police, Gates’s lawyer, Charles Ogletree replied, “We’re not focusing on a lawsuit right now. We’re focusing on trying to move forward and clarifying what happened and how to repair the damage to personalities." [my emphasis]

On the one hand, a distinguished black Harvard professor is surprised and embarrassed to find himself under suspicion by police of breaking into his own home. Instead of recognizing him and asking for his autograph, the police officer asks for ID. In his agitation and chagrin, he remembers countless legitimate examples of racism against blacks by police and impulsively injects it into his protests.

On the other hand, a respected white Cambridge police sergeant, responding to a call, is “surprised and confused” by the combative nature of the otherwise distinguished elderly man he finds in the house. They exchange words and the man accuses him of racism. Stung by this charge, the officer decides to teach the malcontent a lesson by placing him under arrest.

The saddest thing about this incident is that it happened not between a black man and a white police officer but between two teachers. Both Gates and Crowley should have known better. Gates should never have played the race card in response to a routine and reasonable police inquiry and Crowley should not have arrested a homeowner for no reason other than being understandably if loudly upset.

Still, the greater dishonor must rest with Gates in the end. Crowley was just doing his job if not necessarily as well as he could or often does. Gates conflated an unfortunate incident with an issue that did not appear to be there by any other measure and now plans to develop a TV special out of it. Again, unless the police report is a pack of lies, Gates will only be hurt as more and more details about his arrest become known. In the name of championing racial understanding, charging racism in his arrest will only serve to feed the polarizing prejudices of both blacks and whites.

One of the oldest adages regarding racism is that education is the only cure. Unfortunately, Gates proves one can earn a Ph.D. and still leave lessons unlearned.

Wednesday, July 22, 2009

Economics, Healthcare, and Social Justice



We Will Never Achieve Universal Coverage so Long as We View the End of Healthcare Reform as a Viable Tradeoff with the Means

Senator Ted Kennedy of Massachusetts is someone with whom I have often disagreed over the years. Nevertheless, he makes a valid point in the current issue of Newsweek when he observes, “Social justice is often the best economics.”

Such thinking is not always intuitively obvious. Traditionally, we look upon caring for society’s less fortunate, although a noble endeavor, as a “nice to have” rather than a necessity. Charitable causes are those we take up only after first ensuring we have excess funds left over to meet our own needs.

Yet if we are honest with ourselves, there is often a gap between our ability to afford charity and our willingness to do so. It is only human nature to upgrade our own wants to needs and dismiss the needs of others as selfish wants.

It is also human nature to see only the up-front costs associated with a particular action while ignoring or undervaluing its long-term rewards. The same is true for the opportunity costs associated with inaction.

In the case of healthcare reform, Kennedy posits, “If we don't reform the system, if we leave things as they are, healthcare inflation will cost far more over the next decade than health-care reform. We will pay far more for far less – with millions more Americans uninsured or underinsured.”

As President Obama pressures Congress to enact healthcare reform this year, three separate bills have emerged between the House and Senate. The Washington Post notes today that a common thread running through all of them is a mandate for every adult American to carry health insurance.

The impact of such a mandate is twofold. First, it means the forty-seven million individuals without health insurance will be required to purchase it (possibly with subsidies) or face financial penalties. Second, it means all but the smallest businesses will be required to contribute to the cost of their employees’ healthcare.

If these bills end up dying, it will be because certain legislators objected to them on the basis that they cost too much (i.e., they will contribute too much to the already burgeoning deficit) and that they do too little to contain the drivers causing current healthcare costs to spiral out of control. In addition, there are fears a government-run option – meant to cover the poorest, oldest, and sickest individuals normally rejected by private insurance – will eventually swallow up private insurance and lead to inefficient, bureaucratic socialized medicine.

These are all practical, reasonable concerns. Republicans and moderate-to-conservative Democrats, sometimes referred to as “Blue Dog Democrats”, are primarily those voicing them in Congress. At the heart of the debate is whether their concerns justify halting healthcare reform in its tracks. Columnist Michael Gerson praises Blue Dog reticence as “a self-correcting mechanism that is Madisonian in its balance and elegance,” whereas columnist Harold Meyerson bemoans their attitude as the reason why “America is now the world's leading can't-do country.”

The problem I see with the cost-conscious crowd is that they apparently view failing to achieve the end of healthcare reform as a viable tradeoff for failing to achieve it by their desired means. In other words, quite a few of us view leaving one-sixth of our fellow Americans uninsured as a perfectly acceptable price for leaving our red ink ever-so-slightly pinker and retaining our own coverage without changes.

Yesterday, forty Senators voted, in a losing cause, to continue production of the F-22 fighter jet, despite the fact that the Pentagon does not view it as a viable weapon for the future, because ending production would likely mean a loss of jobs their states cannot afford. Some of those same Senators will undoubtedly also vote against healthcare reform with universal coverage that would help protect their constituents against catastrophic hardships – like losing their jobs – because they cannot afford it either. Why are we forsaking something everyone needs for things nobody does?

Many may question whether universal coverage is really the end goal of healthcare reform. A recent Rasmussen poll found that voters saw controlling costs as a bigger problem in healthcare reform than lack of universal coverage by a whopping forty-point margin. Yet when presented with a choice between healthcare reform with a tax hike and no healthcare reform with no tax hike, the preference for the lowest possible cost drops to a mere six-point advantage.

Even with our natural instinct for selfish concerns, most of us recognize the inherent unfairness of the current healthcare system.

The principal goal of healthcare reform is and always has been ensuring a minimum level of basic coverage for all. Pricing of services, who provides/oversees those services, and who pays for them are all aspects of how we achieve that goal.

Our nation and our society possess the wealth to pay for universal healthcare, even in these bleak economic times. The question is how much we are willing to spend and what other things we are willing to forego in order to achieve it.

As Meyerson notes, Blue Dogs and other conservatives seem appalled at “the notion that actual individuals might have to pay to secure the national interest.” They venerate soldiers who give their lives to keep other people’s children free. They stand aghast at giving up a few of their own dollars to keep other people’s children healthy. They call the latter action socialism. I call both actions patriotism.

If healthcare reform requires spending that will increase the deficit, then increase the deficit. If it will require tax hikes on the wealthy and middle class alike, then hike taxes. If it means European-style socialized medicine, then hello socialized medicine. Just get it done and make universal coverage the reality.

None of this means we should not look for ways to reduce costs, improve quality, and increase individual choice – both in the current legislation and in the future. However, we must view these things within the constraint of universal coverage and not as a tradeoff against it, as is too often the case today.

Sometimes positive economic outcomes occur in unexpected ways. The much-maligned healthcare reform initiative in Massachusetts fell short of its goal of universal coverage, with eighty-six thousand of its nearly four million tax filers preferring simply to pay a penalty and opt out. At the same time, the state mandate spurred nearly one hundred and fifty thousand individuals to purchase employer-provided health insurance they had previously declined as a cost savings. Many of these newly insured were relatively young and healthy, putting more into the system than they took out.

For those grumbling over government intervention, it is hard to see any benefits offsetting the increased price being asked of us – both in terms of dollars and sacrificed individual choice – until it is our spouse or child diagnosed with cancer requiring specialized treatment, or our elderly parent diagnosed with Alzheimer’s requiring long-term assisted care.

We take out insurance to protect our immediate families and ourselves. We form governments to protect ourselves as a larger collective. It is understandable human nature to balk at writing out a big check for health insurance, as a family or as a government. It seems the height of pragmatism to go slower. However, where healthcare is concerned, we have been using going slower as an excuse for going nowhere for too long.

The day a democracy denies social justice to some of its members because it is not economically justifiable is the day it ceases to deserve either justice or democracy.

Monday, July 20, 2009

A Day that Will Live in Memory and in Grief



By Challenging Us Each Night to Face “The Way It Is,” Walter Cronkite Reminded Us of the Way It Could Be

Today, July 20, 2009 is a day that lives in memory because exactly forty years ago today, two men from the United States of America, representing all of humankind, first set foot upon the moon. Whatever disappointments have passed since then in manned space flight, even if we never venture forth again, it was still a remarkable endeavor – a moment of national pride in our achievement and one of universal wonderment in our exploration of the unknown.

Yet today also lives in grief because the man who brought that moment into so many of our homes is no longer here to share it with us.

Walter Cronkite died last Friday at age ninety-two. He gained his greatest fame by anchoring the CBS Evening News for nineteen years, from 1962 to 1981.

The tributes heaped upon Cronkite over the past few days have been legion. President Obama noted of him, “He invited us to believe in him and he never let us down.” In so saying, our young President may have been wistfully engaging in hopeful expectation of his own legacy. Cronkite seems to be a standard against which many seek to measure themselves.

This was clear over and over again in the esteem paid Cronkite by his fellow journalists, both contemporaries and successors. Bob Schieffer, host of CBS’s Face the Nation, perhaps said it best. “Walter was who I wanted to be when I grew up.”

When speaking of “greatness” among news anchors, many have lionized Edward R. Murrow to the point of deification. As the practical inventor of television journalism, he deserves great praise but, in many ways, he remained fundamentally a reporter throughout his career. Cronkite was the one who developed and advanced the traditional role of the anchor as dispassionate observer seated at a desk.

Murrow once said of television, “This instrument can teach, it can illuminate, yes, it can inspire. But it can only do so to the extent that humans are determined to use it to those ends.”

Although some of his writings and speeches after retirement revealed a personal ideology that ran left of center, Cronkite was absolutely committed to the concept of objectivity, fairness, and even-handedness. He regarded his chance to read the news each evening as a privilege and a responsibility.

In a 1990 column for the New York Times, he conveyed his “long-held principle” that no journalist, having achieved national recognition, should ever consider subsequently running for public office, even after retirement. In such a case, he shuddered, “the public is going to have every reason to question whether that person had been tailoring the news to build a political platform. The burden of credibility is already heavy enough without that extra load.”

The “burden of credibility” was always borne well by Cronkite. Numerous opinion polls voted him the “Most Trusted Man in America.” When the archconservative Archie Bunker referred to Cronkite as a pinko and a communist on the sitcom All in the Family, most Americans found it funny because it struck them as outrageously incongruous with Cronkite’s character and reputation.

However, in spite of his discipline and commitment to objectivity, Cronkite never forgot he was a human being reading news about human beings to other human beings. He was neither afraid nor ashamed to allow his underlying humanity to show. During the Apollo spaceflights, he shouted and trembled with boyish excitement. He choked up on air as he read the news of President Kennedy’s assassination. He delivered an uncharacteristic editorial in which he angrily denounced the Johnson Administration over Vietnam.

Some right-wingers roundly criticized Cronkite as unpatriotic for this last example. Yet his actual editorial never denounced Vietnam as evil or unworthy, never rejected the U.S. presence there as a mistake. Cronkite’s ire derived solely from his clear perception that our government’s leadership was lying to the American people about our military’s ability to end the stalemate and win a clear victory there.

Something about Cronkite struck a chord with the public despite, or perhaps because of, these brief but striking outbursts of empathy. Former Washington Post editor Ben Bradlee, writing about Cronkite in Newsweek, remembered, “He conveyed seriousness through that face. That face and his behavior . . . He was not young and hustling; he was not overly aggressive. He was such a nice person on top of everything else . . . Everyone respected him.”

Possibly more impressive than the extent of Cronkite’s reputation as a guardian or trustee of the Truth was its durability. He spanned decades in the anchor’s chair and left his position just as well regarded, if not more so, than when he started. The most recent spate of network anchors has been noticeably lacking in this particular accomplishment.

At the end of the broadcast of President Kennedy’s funeral, Cronkite made a few closing remarks that sought to challenge his listeners as much as comfort them.

It is said that the human mind has a greater capacity for remembering the pleasant than the unpleasant. But today was a day that will live in memory and in grief . . . Were these dark days the harbingers of even blacker ones to come or, like the black before the dawn, shall they lead to some still as yet indiscernible sunrise of understanding among men, that violent words, no matter what their origin or motivation, can lead only to violent deeds?
. . .

Tonight there will be few Americans who will go to bed without carrying with them the sense that somehow they have failed.

To me, this was the greatness of Walter Cronkite. He was not a marble model. He was capable of bias and mistakes in judgment, just like any other person. Nevertheless, in a time before twenty-four hour cable news channels, the Internet, Twitter, and fact-checking organizations, when he was just one of a few voices shaping a nation’s perception of reality, he attempted, with steadfast determination, never to shape that reality, even to soften it.

Cronkite presented us with the unvarnished Truth. We never grew cynical from that, at least not due to him, because we sensed he never grew cynical about it.

In a eulogy/editorial this morning, the New York Times opined, “Some deaths end only a life. Some end a generation. Walter Cronkite’s death ends something larger and more profound. He stood for a world, a century, that no longer exists. His death is like losing the last veteran of a world-changing war, one of those men who saw too much but was never embittered by it.”

Every evening, Cronkite sat behind his desk and analytically bade us look upon the world “the way it is.” For the simple reason that his demeanor was as free of despair as it was of jingoism, he inspired countless numbers of us not only to tune in but to care that the world was that way and to hope for, and to seek, something better – the way it could be.

Friday, July 17, 2009

Slamming Shut the Gate



It Wasn’t Assassinating Foreign Terrorists that Rightly Worried Panetta; It Was a Perceived Cover-Up

Eyebrows rose throughout Washington when President-elect Obama nominated Leon Panetta for CIA Director. Although Panetta had extensive government experience as Director of the Office of Management and Budget as well as Chief of Staff in the Clinton Administration, he lacked practical intelligence experience.

The CIA had suffered repeated political bashings in the aftermath of September 11 for intelligence failures. Many felt the CIA needed a CIA person at the helm – someone who understood the agency and would be able to protect it. Panetta was unknown, and therefore distrusted, by many career intelligence officials. The Senate ultimately confirmed him despite these misgivings.

Many now feel such fears were justified. Panetta recently cancelled a heretofore confidential program to assassinate foreign terrorist leaders and immediately ran tattletale to Congress about it. He stands accused of giving Congressional Democrats ammunition against his own agency.

To recap as briefly as possible –

Former President Bush authorized the killing of al-Qaida leaders in 2001 and duly notified Congress. Sometime shortly thereafter, the CIA began initial planning to create and train anti-terrorist assassination teams. The planning dragged along for several years until former CIA Director George Tenet cancelled it in 2004, citing the agency’s inability to work out practical details. The concept had also lost much of its urgency, as the CIA had found foreign intelligence services and missiles launched from unmanned drones could do the job just as well.

Tenet’s successor, Porter Goss, resurrected the teams in 2005. Yet by the time Michael Hayden succeeded Goss in 2006, the program was back in mothballs due to the same old logistical problems.

Senior agency officials brought the assassination teams to Panetta’s attention last month because they were finally ready to move beyond planning into a “somewhat more operational phase.” At that point, Panetta permanently killed the concept and informed Congress of its past existence. According to Panetta, the CIA had kept Congress in the dark about the program because then-Vice President Dick Cheney had directly ordered them to do so.

Congressional Democrats, led by Senator Russ Feingold of Wisconsin, expressed their outrage over the program’s secrecy. They accused the CIA of violating the law through its failure to inform them and began tentative consideration of a formal investigation. Congressional Republicans counter that Democrats are blowing the revelation out of all proportion in order to prop up Speaker of the House Pelosi’s controversial accusations that CIA officials had lied to her during past briefings on interrogation techniques.

Republicans further vilify Democrats as undermining crucial anti-terrorism intelligence gathering by their criticisms of the CIA. “Will anyone go to jail? Probably not. But you will leave a trail of destroyed officers,” predicted one CIA veteran.

Columnist David Ignatius of the Washington Post was one of a handful of voices among pundits originally defending Panetta’s selection at CIA, arguing his previous government posts had given him “tangential exposure” to intelligence operations. He may well have come to regret his endorsement. On Wednesday, he moaned over elected officials “turning the CIA into a political football.” Such actions, admonished Ignatius, were counter to proper Congressional oversight functions. Their only outcome would be to “lacerate [U.S.] intelligence services.”

However, the problem here is not that the CIA was planning to assassinate al-Qaida leaders whenever possible. This was already established policy. Nor was the disclosure of such teams to Congress doomed to make them incapable of carrying out their missions.

Dennis Blair, the Obama Administration's Director of National Intelligence, questioned whether Panetta was legally bound to inform Congress but defended his actions as the right thing to do. Even the CIA officials asking for Panetta’s permission to go forward with assassination teams did so with the recommendation to brief Congress.

I suspect the thing that made Panetta jettison the program so quickly and absolutely had little to do with its nature. Instead, it was the past decision to hide the program purposefully from Congress. Panetta has been around politics long enough to know nothing sets legislators, the press, and the public into a frenzy of suspicion faster than the appearance of a cover-up.

Far too few officials share his savvy on this point. Since the original Watergate scandal in the early 1970s, America has experienced Koreagate in 1976, Billygate in 1979, Debategate in 1980, Irangate/Contragate in 1986, Travelgate, Whitewatergate, and Troopergate I (the Bill Clinton version) in 1993, Filegate in 1996, Monicagate/Lewinskygate/Sexgate/Zippergate in 1998, Plamegate/Leakgate in 2003, Rathergate/Memogate in 2004, Hookergate in 2005, Katrinagate/FEMAgate in 2005, Macacagate in 2006, and NAFTAgate, Troopergate II (the Sarah Palin version), and Blagogate in 2008.

In each of these incidents, the original “crime” often proved to be far less politically damaging – or even proved nonexistent – than the defendants’ subsequent attempts to deny the charges or cover up any suspicious evidence. It is hard to blame Panetta for attempting to avoid padding the list with Assassingate in 2009.

On Wednesday, in a speech in Michigan, President Obama departed from his scripted remarks to tell the crowd, “I love these folks who helped get us in this mess and then suddenly say, ‘Well, this is Obama's economy.’ That's fine. Give it to me!” Many pundits credited Obama for accepting full ownership for the economy, even if it was likely to come back to haunt him later.

Seven months into his Presidency, ready or not, it is probably more than time for Obama to take ownership for . . . well, everything. One of the things you do when you take over ownership of the farm is to make sure all the gates are shut, so the cattle and other livestock do not go wandering off.

Far from acting against the CIA’s or U.S. intelligence’s best interests, Panetta may well have been acting to protect them from further taints of scandal. He was not being a loyal Democratic partisan by going to Congress so much as a loyal farmhand to his President by slamming shut a potentially damaging open gate.

Wednesday, July 15, 2009

Physician, Repair Thyself



Neither Cheap Car Insurance nor Henry David Thoreau Make Good Arguments Against Universal Healthcare

Doctor Thomas Szasz, Emeritus Professor of Psychiatry at Upstate Medical University in Syracuse, New York, pens an attack on universal healthcare in today’s Wall Street Journal. His arguments seem such healthy, common-sense libertarian wisdom at skin level yet suffer from such putrid and diseased logic underneath that they warrant a response.

Szasz’s first objection is that virtually all health insurance, as offered today, robs the individual of choice. He argues that policies must pay for protection against the catastrophic costs of treating too many types of diseases, occurrences, and behaviors that society frowns upon.

While Szasz is sure any reasonable person would wish to bear the costs of protecting themselves against the onslaught of cancer or the accidental loss of a limb, he asserts many of us would decline protection against “voluntary, goal-directed behavior.” Szasz includes activities/conditions such as smoking and obesity in this latter category. He also includes depression and various other forms of mental illness.

Szasz allows that society mandating a person’s protection from all possible harms that might befall them is “a fine religious sentiment and moral ideal.” However, he insists, “As political and economic policy, it is vainglorious delusion.”

As an example, he points to auto insurance, in which an individual driver can achieve a lower premium, if desired, by limiting coverage of the possible types of injuries to their car as well as the dollar extent of that coverage. “People who seek the services of auto mechanics want car repair, not ‘auto care,’” maintains Szasz.

His example would probably have more traction if made in an antediluvian time when automobiles were little more than simple internal combustion engines covered by steel shells. In those days, any man with basic intelligence, decent hand coordination, and some simple tools might well keep the family car running smoothly, despite normal wear and tear as well as small, unexpected problems.

The modern car, by contrast, is a complicated, interrelated set of computer-driven electronic, hydraulic, and mechanical systems that are beyond the average person’s acumen and wallet to fix quickly, cheaply, and easily. What is more, many people lack the time and interest to perform their own maintenance. The concept of more comprehensive, albeit more expensive, “auto care,” of the sort that Szasz is so quick to deride, is fast on the rise to becoming, if it has not already become, the norm for most of us.

When we take a chance on car insurance and lose, it may be economically difficult for us to shoulder the burden but the stakes are very different where health insurance is concerned. A human being is not something that we can junk quite so easily.

Szasz has a ready answer to this as well. We make a mistake, he warns, in seeking universal healthcare that provides “the same low quality health care to everyone.” Yet he also admits that providing good healthcare to all is not only cost-prohibitive but also impracticable because “Not all doctors are equally good physicians and not all sick persons are equally good patients.” We must learn to accept that life is unfair.

Szasz concedes the affluent generally are healthier but not only because they can afford better/more healthcare. They are also smarter about taking care of themselves. The answer then, in Szasz’s view, is “educational . . . advancement for everyone.” Yet this is exactly the goal of “well care” and other types of healthcare programs that Szasz dismisses as unnecessary and burdensome mandates.

Presumably, it is the mandate part and not the goal of such programs to which Szasz objects. He quotes Henry David Thoreau – “If I knew for a certainty that a man was coming to my house with the conscious design of doing me good, I should run for my life.” – and concludes from him that government intervention is contrary to our native Yankee sensibilities.

Yet in his famous work, Civil Disobedience, Thoreau states, “I ask for, not at once no government, but at once a better government.” And he sets forth his famous dictum, “That government is best which governs not at all” as an aspiration that he believes will be achieved only “when men are prepared for it.”

Szasz, on the other hand, seems very much bought into the bromide that people should strive for continual self-improvement. Yet he shares the curious conservative revulsion at any attempt by them to do so collectively.

Moreover, while Thoreau would undoubtedly agree with Szasz so far as automobiles are concerned, much like any other tool or possession, it seems less likely he would be quite so cavalier on the subject of human health. This is the writer, after all, who proclaimed, “Every man is the builder of a temple called his body,” as well as, “I stand in awe of my body,” and “What is called genius is the abundance of life and health.”

Thoreau maintained a healthy skepticism of government but this did not mean he thought it immoral for one man or society to express concern for the well-being of another, even if he dreaded such occasional intrusiveness in his own solitude. “Every creature is better alive than dead . . . and he who understands it aright will rather preserve its life than destroy it.” Thoreau further understood that quality of life was often just as important as being alive. “To affect the quality of the day, that is the highest of arts.”

Even as an economic tradeoff, Thoreau has a rebuttal to Szasz regarding healthcare. “The cost of a thing is the amount of life which is required to be exchanged for it, immediately or in the long run.” It is easy to confuse the price of a thing today with its true value for the future, a confusion many in Congress are experiencing right now with regard to the affordability of universal healthcare.

Doctor Szasz is a psychiatrist and writer, among whose books is The Myth of Mental Illness. The need for universal healthcare in this country is no myth. The myth lies with the idea that such coverage is somehow inconsistent with the principles of democracy, egalitarianism, or even economy.

In the end, Szasz’s arguments all boil down to the same selfish precept so often made against universal healthcare – it is better for millions of individuals to remain without any coverage than for a single individual with coverage to experience decreased quality. The luxurious aspects of many private plans are now ingrained as necessities for too many of us. “Simplicity, simplicity, simplicity!” exhorts Thoreau. “Most of the luxuries and many of the so-called comforts of life are not only not indispensable but positive hindrances to the elevation of mankind.”

This seems a classic case of “Physician, heal thyself!” Or, to express it in terms with which Doctor Szasz can better relate – He is in desperate need of a tune-up, regardless of whether he realizes it and/or is willing to pay for it.

Friday, July 10, 2009

Ultra Deep Fields



Whether Galaxies or Healthcare Bureaucracy, a Lot More Is Already Out There Than May Appear

This is the story of two photographs.

The first photograph, taken several years ago, ranks among the most famous ever produced by the Hubble Space Telescope. The Hubble was already legendary for allowing human beings to view the heavens with unmatched clarity and detail. However, Dr. Harry Ferguson and some other astronomers noted that while NASA built Hubble to see new things, astronomers mostly pointed it at celestial objects we could already see with the naked eye or using Earth-bound telescopes.



As a result, they created the Ultra Deep Field Project, which, along with its precursor, the Deep Field Project, attempted to look further out into space (and, thus, further back in time) than we had ever previously peered. Their efforts produced a series of breathtaking images. The photograph you see here, like most pictured in books and on web pages, is only a cross-section, because the full images are so large. They contain over three thousand specks of light, and all but a few of them are not local stars but entire galaxies.

NASA obtained these stunning pictures by pointing the Hubble at a piece of what astronomers previously had referred to as “empty space.” When we look up at the night sky, we mostly see the bright stars of our own galaxy separated by black void. Yet the Ultra Deep Field Project demonstrates that when we look at the blackness, we are not looking at a void at all but countless billions of stars, albeit stars too far away and faint for our eyes to perceive.

The lesson here is simple enough – just because we cannot see something, or give it a name that is the opposite of its reality (e.g. “empty space”), does not mean it does not exist. It may, in fact, be larger and grander than anything we can see.

The second photograph, taken this week, captures a much smaller space – a hearing/conference room at the Dirksen Senate Office Building in Washington D.C. Inside it, the Senate Committee on Health, Education, Labor and Pensions was working on healthcare overhaul legislation. Plenty of press photographers were on hand and most focused their lenses on the twenty-two Senators sitting at the table in the middle of the room – the “local stars” of the occasion, as it were.

But one of them, Robb Hill, a freelancer working with a team of reporters from National Public Radio, decided the more interesting picture – in terms of size and sheer numbers – was behind him. So he turned his camera around and snapped a shot of all the other people in the room. The version of his photograph you see here, like the Hubble Ultra Deep Field, is also a cross-section. Capturing his entire subject required a panoramic composition, viewable at this link from NPR.

All but a few of the two hundred plus individuals you see packed into the room watching the Senators are healthcare lobbyists. You can bet if the room were bigger, there would be still more of them.

NPR has teamed with ProPublica, an independent, non-profit news organization that conducts investigative journalism in the public interest, to identify each of the lobbyists portrayed in the photograph as well as the causes they represent. The effort is just beginning but, to date, all of those identified lobby for healthcare providers – healthcare consumers (i.e. us) are virtually unrepresented in the room.

The number of registered healthcare lobbyists more than doubled over the past decade to over thirty-five hundred, according to the Center for Responsive Politics. This is more than the number of galaxies in the Hubble Ultra Deep Field photo. Last year alone, healthcare lobbyists spent nearly $484.5 million attempting to influence government legislators and regulators. That amount should skyrocket with major healthcare reform on the table this year.

One of the initiatives favored by President Obama and Democratic legislators is a public option to cover people traditionally passed over by private healthcare. Republican legislators as well as an array of conservative thinkers oppose such a plan, arguing that government is inherently behemoth, bureaucratic, slow, and inefficient as an administrator. They assure us that private markets are always the infinitely better choice.

After all, private markets are also known as “free markets.” Who can object to “free?” Why pay massive taxes or incur massive deficits to have government run healthcare when natural market competition will inevitably force the best possible quality for the lowest possible cost? The problem with that logic, as the “ultra deep field” photograph taken by NPR demonstrates, is that our current healthcare system already has loads of bureaucracy in place at great cost.

David Leonhardt, an economics columnist for the New York Times, pointed out on Wednesday that whatever its other advantages, private healthcare deserves no praise for being trim and efficient. “The answer [to healthcare reform] isn’t obvious. But this much is – The current health care system is hard-wired to be bloated and inefficient.”

David Brooks echoes Leonhardt in his column today. “The basic problem is that the American people have gotten used to high-tech, all-everything health care, under the illusion that they don’t have to pay for it and that it’s always better for them. Politicians are unwilling to force voters and donors to give up that sort of system, even the parts that are ineffective.”

It is true that the public option Congress is currently proposing will do little if anything to rein in out-of-control costs. However, the most recent draft out of the Senate committee would leave only three percent of the population uncovered. If it takes a massive bureaucracy to run such a system, then it will be more coverage with no more bureaucracy than we have today.

We may soon be able to name over two hundred different healthcare lobbyists who would vehemently disagree with this conclusion, if NPR and ProPublica prove successful in their efforts. However, the photograph of them all seated in a single room testifies to this conclusion’s honest reality. Just because we call them free markets does not mean they are without significant unnecessary costs. Just because we cannot always readily see it does not mean behemoth, slow, and inefficient healthcare bureaucracy is not already in place today.

Wednesday, July 8, 2009

Lame Duck to Sitting Duck



By Resigning as Governor, Palin Made a Smart Move . . . Maybe the Last Smart Move Open to Her

Endless speculation revolves around Sarah Palin’s surprise announcement last Friday. The former Republican Vice-Presidential candidate will not seek a second term as Governor of Alaska and will resign her current term, with a year and a half remaining, at the end of the month. Is this the end of her political career? Or is it the start of a 2012 Presidential bid?



I am not sure either scenario is true. I am not sure if Palin herself has decided this yet. However, I do think it was a smart move for her . . . perhaps the last smart move open to her at this point in her national career.

Palin may not be a demonstrable policy wonk but she has shrewd political instincts. She has more raw charisma than does any other elected official in this country today. The charisma is the good news. The bad news is the raw part.

Prior to her selection by John McCain as his 2008 running mate, Palin was one of the golden young up-and-comers in Republican politics. She was a widely popular Governor in a state where she had risen to power quickly and enjoyed bipartisan legislative accomplishments. Ethics investigations against her were few and likely to die quietly from lack of interest. Her family’s personal life remained private and respected.

However, McCain did select her and things changed for Palin, seldom for the better. Like Obama, Palin will face 2012 or any future Presidential bid not as a newcomer/outsider but as a known quantity with a known record. Unlike Obama, Palin has had far less success in controlling the image of her portrayed to the public. Palin is a walking demonstration that, even in the age of Obama, it is possible to suffer from moving too far too fast.

We could debate endlessly whether Palin’s perception problems are the result of her own lack of finesse and basic competency as opposed to a hostile media’s lack of restraint and basic decency. The truth in this case, as with most matters, probably lies somewhere in-between. Regardless of their source, Palin has perception problems and eight months after the 2008 election, it is clear they remain persistent and damaging. Consider the judgment of her fellow Republicans.

After her disjointed resignation speech, New York Times columnist David Brooks bemoaned her as “A woman who aspires to a high public role but is unfamiliar with the traits of equipoise and constancy, which are the sources of authority and trust.”

Similarly, the Wall Street Journal editorial board, after giving her decision to resign and political career every benefit of the doubt, mournfully concluded, “The GOP nominee in 2012 will need an explanation for how we got into this [economic] mess . . . as well as an agenda for how to restore U.S. prosperity . . . Republicans will need more than a critical riff about spending and budget deficits. On the evidence so far, Mrs. Palin isn't yet up to that task.”

Palin and her family could not visit New York without ending up in a feud with late night talk show host David Letterman. A scathing article in the current issue of Vanity Fair features unnamed McCain aides questioning whether Palin possesses or can learn the basics required for the Presidency. As a final straw, on the very day of her announcement, the National Society of Newspaper Columnists named her the winner of its annual Sitting Duck Award, choosing her over ousted Illinois Governor Rod Blagojevich as the most ridiculed newsmaker in the United States.

However fairly or unfairly journalists have treated her, Palin consistently made it clear that she would meet any criticisms by combating rather than courting the press.

Even in the earliest, headiest days of her Saint Paul convention speech, she drew her battle lines. “I’m not a member of the permanent political establishment. And I’ve learned quickly these past few days that if you’re not a member in good standing of the Washington elite, then some in the media consider a candidate unqualified for that reason alone.”

Her crusade against the media, with herself cast in the Joan of Arc role as combination sturdy warrior-maiden and vulnerable martyr, continued right through her Facebook ruminations over her resignation. “How sad that Washington and the media will never understand; it's about country. And though it's honorable for countless others to leave their positions for a higher calling and without finishing a term, of course we know by now, for some reason, a different standard applies for the decisions I make.”

Palin went on to explain that her media woes lay at the heart of her decision to withdraw. She accused “political operatives” of using the very state ethics law she championed to bombard her with complaints. Despite surviving fifteen such accusations, Palin rued the expense to the state and herself from constantly defending against them. “I know I promised no more ‘politics as usual,’ but this isn’t what anyone had in mind for Alaska,” she lamented.

Yet even if Palin was sincere in fearing she was costing her state too much, her decision to quit probably derives at least as much from the fact that state office was costing her dearly as well. While her campaign against Alaskan politics as usual earned rightful praise, it has become increasingly obvious that this campaign may have come in like a lion but is fated to go out like a lamb.

Media attention and all those ethics charges significantly diminished Palin’s once ubiquitous popularity. What is more, she now deals with an increasingly truculent Legislature. The Senate recently shot down her nominee for Attorney General. She faces a potential override of her veto against $29 million in federal stimulus funds for energy efficiency programs. Some Alaskan political observers now believe that final enactment of a trans-Alaska natural gas pipeline, once held up as her signature legislative accomplishment, is more likely to pass without her support than with it.

Palin could probably survive charges of being “just another politician” – every Presidential aspirant has to deal with this complaint from time to time. She is likewise sufficiently savvy to run an anti-media campaign successfully. The two things in combination, however, are deadly to her, threatening to label her permanently as “that flaky, abrasive, and ineffective Governor from up North.”

Palin could never renounce her combative relationship with the press – it simply is not in her character and it is further doubtful the press would allow her to do it even if she were so inclined. As a result, she jettisoned the other liability associated with her and was smart to do so.

By freeing herself from the political failures and setbacks associated with day-to-day governance, Palin can focus on attacking the media (and Democrats) from within – her education is in journalism rather than political science, after all – or on its fringes as a conservative spokesperson. In either case, Palin can control her message far better by giving commentary and speeches rather than interviews. She need not change her message as much as hone its presentation.

In 2008, Palin proved immensely popular with the Republican base. Despite its exotic remoteness, it seems that Alaska is very much the American heartland on steroids. Unfortunately, this image is exactly what scared the bejesus out of so many outside the far right about Palin, as did the often-angry tone of her rallies. If Obama rallies appealed to what is best in each of us, Palin rallies had an unfortunate tendency to bring out some of the worst.

Palin needs to bolster her knowledge of facts and tone down her opinions. She needs to remind moderates and Independent voters that they already share some/many of her conservative, traditional values as opposed to waging a winner-take-all culture war that pits reactionary regression against socialist liberalism. Her natural conservative/libertarian views on limited government are likely to resonate more positively with moderates and Independents this time around, given aggressive government spending and deficit growth under Obama.

Palin’s recent decision resulted from getting in touch with her inner duck. She is trading being a lame duck for being a sitting duck. The latter is far from a comfortable position but at least a sitting duck is still in the game; a lame duck, by definition, is not. Sarah Palin is still very much in the game.

Monday, July 6, 2009

You Can Call Me Al



Even More than “Chad,” It’s the Name Most Associated with Spurious Post-Election Lawsuits

When the Minnesota Supreme Court handed down its ruling against him on June 30, former Republican Senator Norm Coleman finally conceded his long-disputed election with Democrat Al Franken, and Democrats as a whole heaved a collective sigh of relief. After all, most political and legal analysts believed Coleman had little chance of actually prevailing and public opinion had turned solidly against him. Coleman’s repeated lawsuits drew suspicion as everything from national-level strategic delaying tactics to personal vendetta.

Alas, Democrats have nowhere but the mirror to look for the source of their frustration and long wait. In many ways, the protracted resolution of the 2008 Minnesota Senate race was but the culmination – and one hopes the vanquishment, although this seems sadly unlikely – of a phenomenon first begun in Florida during the 2000 Presidential campaign. The similarities between these two races are legion.

Both were statewide elections with larger national implications. Both were highly partisan and bitterly fought. Both had multiple candidates on the ballot beyond nominees of the two major Parties.

When the voting was complete, the outcomes were very, very close. In 2000 Florida, George W. Bush led by 1,784 votes. In 2008 Minnesota, Norm Coleman was in front by as much as 726 votes, although his lead dropped to 215 votes by the time results were officially certified. Both contests were within one-half of one percent – sufficient to trigger a mechanized recount in Florida and a manual one in Minnesota.

In both cases, the recount ate into the original leader’s margin. In 2000 Florida, Bush held on to win by a mere 327 votes. In 2008 Minnesota, the tide changed sufficiently to place Franken ahead by an even slimmer 49 votes. Subsequent inclusion of wrongly rejected absentee ballots grew Franken’s lead to 225 votes.

This should have been the end of things in both cases. Regrettably, emotions were running high in 2000 Florida and everywhere Democrats looked that year they saw voting irregularities.

Little old Jewish ladies from Miami tearfully claimed to have voted for the wrong candidate in their confusion. Other voters became worried they may have made similar mistakes hours or days after returning from the polls. “Butterfly ballots” and other complicated ballot designs drew criticism. People gasped in horror to learn “hanging chads” and other problems could keep punch card ballots from scanning properly. Al Gore requested manual recounts in four counties but local officials ran out of time in three of them before results were due to the Secretary of State.

The result was a flurry of threatened or filed legal actions challenging Florida’s results. At the crux of these lawsuits was insistence upon a standard of vote counting never before contemplated.

More than seeking mere remedies for specific instances of voter fraud and intimidation at the polls – although these were alleged as well – Democrats argued that unless absolutely every vote was counted and counted correctly according to individual voter intent, every voter in the state of Florida had been disenfranchised and, given the nature of the contest, every voter in the nation along with them. They asked the courts to redress this wrong.

The Florida State Supreme Court, perhaps so caught up in the prevailing hysteria that they succumbed to it, ordered a statewide manual recount by a narrow four to three decision. The Bush campaign immediately sued in federal court, arguing such a recount violated the Equal Protection Clause of the Fourteenth Amendment because there was no statewide standard. This meant a ballot rejected in one county might well count in another, thereby resulting in the very disenfranchisement the recount supposedly prevented.

(It is relevant to note in 2008 Minnesota that local officials directed any contested or questionable ballots for interpretation and counting by a non-partisan State Canvassing Board. While still subjective, this approach at least ensured consistency.)

The U.S. Supreme Court agreed with this logic, ordering the manual recount stopped the day after it began by a narrow five to four vote. In the order to stop counting, Justice Scalia wrote, “One of the principal issues in the appeal we have accepted is precisely whether the votes that have been ordered to be counted are, under a reasonable interpretation of Florida law, ‘legally cast votes’ . . . Count first and rule upon legality afterwards is not a recipe for producing election results that have the public acceptance democratic stability requires.”

After hearing oral arguments, a respectable seven to two majority of the Court agreed, “The statewide standard (that a ‘legal vote’ is ‘one in which there is a 'clear indication of the intent of the voter’) could not guarantee that each county would count the votes in a Constitutionally permissible fashion.”

The subsequent remedy to stop the recount and remand the case back to the Florida State Supreme Court for further consideration was by a much closer five to four vote. However, it is notable that two of the dissenters (Souter and Breyer) favored counting for the democratic appeal of counting but acknowledged the process as carried out was neither impartial nor consistent. Thus, they bizarrely favored the continuance of an inherently unfair process in the name of fairness.

Many have criticized the Supreme Court for their decision in Bush v. Gore. Personally, I side with those who believe the Court got the right answer despite vague and poorly spelled-out reasoning. The heart of their decision probably lies not in the rambling majority opinion but in Justice Stevens’s blistering dissent, in which he rails against “the Nation's [loss of] confidence in the judge as an impartial guardian of the rule of law.”

Despite receiving darts for messing where it did not belong with Bush v. Gore, this is exactly the message the Supreme Court sent to other courts. Specifically, it suggested the Florida State Supreme Court had no business impinging on the election procedures established by the Florida Legislature and carried out by its Executive Branch, imposing an impossible standard of its own, and then declaring itself and the Judiciary in general the only ones capable of deciding when that standard had been reasonably met.

Although the Florida State Supreme Court got the message and backed off, followed quickly by Gore, it was too late to prevent permanent damage. Every politician in a close race since has invoked the same unreasonable standard and turned to the courts as an instrument for remedy. Norm Coleman was simply the latest and longest lasting to do so.

There is no doubt that courts should intervene when state and local election laws, infrastructure, and procedures violate Constitutional rights and other federal mandates. However, voting as a “right” focuses principally on allowing every person who desires to vote to do so. The actual mechanics of voting correctly, let alone wisely, are more a matter of individual voter responsibility. They require vigilant study beforehand, followed by scrupulous care at the polls.

While it is, again, the duty of the State to record each vote accurately, it must be understood that human beings and human-made machines will introduce into the process some regrettable but unavoidable and unintentional errors – errors of the variety that continue to plague every other human endeavor, including brain surgery and rocket science. In short, no election board, regardless of its scrupulousness, can guarantee counting every vote according to voter intent any more than individual voters can guarantee casting their votes beyond all error.

If the hanging chad could name itself, it would surely pick the moniker “Al.” After all, it belongs to a spectacle that began with Al Gore and most recently included Al Franken. At this point, both major Parties find themselves tied in their attempts to win spurious post-election lawsuits. It is significant that, for all their efforts, neither has been successful in overturning the formal count/recount verdicts. This is where it always ought to end. Let it do so from now on.

Unfortunately, this sentiment is as likely to find fruition as the Michael Jackson funeral winding up a quiet, distinguished affair.

Wednesday, July 1, 2009

Watering the Tree of Liberty in Honduras



Attempting to Defend Its Constitution, the Honduran Government Has Torn It to Shreds Instead

The tree of liberty must be refreshed from time to time with the blood of patriots and tyrants. It is its natural manure.
~ Thomas Jefferson, 1787

Roberto Micheletti, the acting President of Honduras, has vowed, “No one can make me resign.” He further declared that he would never permit former President Manuel Zelaya to return to power “unless another Latin American country comes and imposes him using guns.”

Such an outcome will probably never happen but might be cosmically appropriate, considering that Micheletti achieved office because of Zelaya’s forced resignation at gunpoint by the Honduran military. Then there is Micheletti’s own promise to use that same military to arrest Zelaya if he so much as tries to set foot in Honduras again.

To be sure, Zelaya is not a good person. He is a known ally of power-hungry Venezuelan President Hugo Chavez. His recent actions as President sometimes skirted the Honduran Constitution and sometimes defied it outright.

Unable to amend the law to allow him to run for another term, Zelaya first tried to call for a convention to rewrite the Constitution, which had been in effect since 1982. When the Honduran Supreme Court deemed this illegal as well, Zelaya insisted upon holding a non-binding referendum anyway and ordered the military to begin distributing ballots as part of its election role. When the head of the military refused to do so, Zelaya dismissed him.

What happened next is the source of much controversy. The Honduran Constitution apparently allows for impeachment of the President but is fuzzy if not altogether silent on the exact mechanisms for doing so. The Honduran Legislature, unsure what to do next with an Executive who appeared inclined to disregard laws with which he disagreed, turned back to the Supreme Court.

The Court faced a variety of options. It could have ordered the Honduran military as a whole not to comply with Zelaya’s ballot distribution directive as an illegal order – something with which the military seemed already inclined to agree. The Court might have ordered impeachment proceedings against Zelaya to begin and either laid out their exact mechanisms or ordered the Legislature to do so.

Instead, the Honduran Supreme Court took the catastrophic step of ordering the military to forcibly arrest Zelaya, remove him from office, and expel him from the country. The Court declared, “The armed forces, in charge of supporting the Constitution, acted to defend the state of law and have been forced to apply legal dispositions against those who have expressed themselves publicly and acted against the dispositions of the basic law.”

The Honduran Legislature, delighted to see their dirty work done for them, immediately followed with their own unanimous if now empty resolution to remove Zelaya for “manifest irregular conduct” and “putting in present danger the state of law.”

In fact, the actions taken by Honduras in the name of “Rule of Law” are diametrically opposed to everything for which that concept stands and represent a far worse trashing of the Constitution than anything perpetrated by Zelaya. It is as though the U.S. Supreme Court, in deciding Bush v. Gore, had ordered the Army to deport the loser of the election to Canada and jail those supporting him for being too much of a nuisance for a functioning democracy to handle.

The use of the military to override difficult civilian problems, rather than maintaining civilian authority over it, is so obviously dangerous and undemocratic that Honduras has received almost universal condemnation for it.

Both President Obama and Secretary of State Clinton have issued statements declaring that Zelaya’s removal violates the principles of the Inter-American Democratic Charter. The United Nations and the Organization of American States (OAS) called for the unconditional return and restoration of the Constitutional President. The Bolivarian Alliance for the Americas, Caribbean Community (CARICOM), the Association of Caribbean States, and the Union of South American Nations issued similar condemnations.

Governments in the region, including those of Costa Rica, Guatemala, Nicaragua, El Salvador, Panama, Chile, Argentina, Peru, Paraguay, Uruguay, Bolivia, Colombia, and Brazil, joined others from around the world in condemning the military action, calling for Zelaya’s return, and/or refusing to recognize the new Honduran head of state. All refer to Zelaya’s removal as a coup, rather than a legal, peaceful transition of power.

It is true that Zelaya flouted the Honduran Constitution for selfish reasons of personal gain while the Honduran Supreme Court and Legislature did so for altruistic, civic ones. Unfortunately, a democratic government of laws may not disobey or circumvent its own rules, even when a villain is involved. When it does so, it breaks the sacred contract with the People that gives it authority – the very contract it sought to defend in this case.

Álvaro Vargas Llosa, a senior fellow at the nonpartisan Independent Institute and an expert on Latin America, considers the Honduran government’s blow for democracy an unmitigated disaster that is likely to bring about the exact opposite of its intent. He argued yesterday in the New York Times that military intervention had turned “an unpopular President who was nearing the end of his term into an international cause célèbre.”

Worse still, laments Vargas Llosa, the situation allows socialist strongman Chavez of Venezuela, Zelaya’s closest ally in the region, to cast himself as “the unlikely champion of Jeffersonian democracy in Latin America.”

Well, maybe not so unlikely. As the quote at the top of this post indicates, Jefferson was a fervent proponent of democracy as an underlying spirit among the common people but not such a great believer in the staying power of democratic governments. Indeed, he extolled occasional forays into bloodshed and anarchy more than once in his writing, much to the chagrin/outrage of the other Founders.

Part of Jefferson’s seeming penchant for carnage derived from his desire to defend the worst barbarities and excesses of the French Revolution, in which a country he deeply loved threw off a system of monarchy and nobility he deeply hated. Then, as now, most Americans recognized that hoping to perpetuate Rule of Law by pulling it down from time to time is as treacherous as it is nonsensical.

The government of Honduras thought it was defending democracy. It now faces a world that no longer considers it a democracy but a country in the midst of a new revolution. Its Constitution had a good twenty-seven year run but it is now kaput – torn to shreds. Jefferson might have approved but those who truly trust in the Rule of Law find this revolutionary view of democracy a load of manure.