Democracy and equality

The coverage of Margaret Thatcher’s funeral, with the emphasis on the fact that she was the first female Prime Minister of Britain, got me thinking a bit about democracy and equality.

We usually assume that these go hand in hand: that democracy inevitably brings with it greater respect for human rights, greater equality of opportunity, and so forth. But here’s an interesting fact. The British system began to be more democratic in the late 1600s and early 1700s. The office of Prime Minister is usually thought to have come into existence in 1721, or 293 years ago.

The Prime Minister has been a woman for 11 of those 293 years. The Monarch, on the other hand, has been a woman for 124 of those 293 years.

The United States has existed for 237 years, and the Presidency of the United States has existed for 224 years. Although the United States was conceived from the beginning with a much clearer commitment to democracy and equality of opportunity, the United States has yet to elect a female chief executive.

I don’t mean to suggest that the Monarchy is more of an equal-opportunity job than the Prime Ministership or the US Presidency. It’s not.

The monarchy is in many ways an arbitrary institution. But that arbitrariness introduces a random element. Even in a very patriarchal society, if the rules of inheritance declare that the monarchy must pass to a woman then the monarchy does, indeed, pass to a woman.

Democracy seeks to give those who are governed more control over who will govern them. This is a very good thing (and, in my view, a significant improvement over monarchy). But note that democracy also means that the prejudices of the governed have much more power to determine who will govern.

A patriarchal society might accidentally find itself ruled by a Queen. But it would not accidentally elect a female President or Prime Minister.

I am not arguing that monarchy is preferable to democracy: it is not. But I hope that those of us who support democracy and believe in popular sovereignty will recognize that empowering the people empowers their vices as well as their virtues. Popular sovereignty is only as good as the people who hold it, and so a good democracy requires a virtuous citizenry.

Posted in politics | 1 Comment

A note to my gun-owning friends

I’m not a gun control extremist. I grew up in a rural area, and know that there are lots of legitimate reasons to own guns, and that many gun owners use their weapons responsibly. I would love to protect kids from gun violence in ways that do not restrict the freedom of law-abiding gun owners.

But here’s the deal: 20 little kids just got killed. If, in the face of that tragedy, I get the sense that you care more about your right to bear arms than you care about those kids, you have lost your credibility. I’m not interested in your arguments for gun ownership. I’m only interested in the fact that you appear to care more for your guns than for kids.

When I first heard news of the tragedy in CT today, the first thought that crossed my mind was not gun control. It was to assume that the shooter was mentally ill (like several previous mass shooters), and to think we need to do more to make sure that people like that get treated before they hurt people.

But then I began to see comments from gun owners that seemed to show more concern for protecting the right to bear arms than for the kids who have died, the kids who have survived, the families affected. If I were to receive a call right now from a gun control organization asking for donations, I’d be tempted to clean out my bank account to support them.

I’m a pretty rational guy, and I don’t make impulsive emotional decisions. I also don’t have kids. I don’t see those images, and think, “What if that was my kid who had just been shot?” So if I feel this way, I can guarantee that there are a bunch of parents out there who are reacting far more strongly, with far deeper personal emotional investment, than I am.

So, if you’re a gun owner, or a defender of gun rights, do yourself a favor. Remember that the lives of those kids are worth infinitely more than your guns. Make sure that, before you say anything, you take the time to think of those kids. Think what it’s like for the families. And make sure that anything you say in defense of your guns is not going to alienate the folks who care for those kids.

I presume that, in the aftermath of this tragedy, there’s going to be some political debate over gun control. Like I said, I know lots of responsible gun owners. I’m more than willing to support the right of responsible gun owners to keep and bear arms. But you’ll be a lot more likely to get my vote in that debate if I’m convinced you care as much about the kids as I do.

If I worry you care more about your guns than about the kids, I’m gonna be 10x as likely to support proposals that would further restrict gun ownership.

So, over the next few days, I suggest that you think carefully before you speak. A lot of people are upset over this kind of tragedy. Don’t stick your foot in your mouth.

Posted in Uncategorized | 4 Comments

The power of arguments?

I talked to a friend of mine today. He was frustrated with a long philosophical discussion he had with some of his friends. The upshot of it all was that after hours of argument, he had made no progress in convincing his friends of the flaws in their arguments, or the strengths of his.

Tomorrow, classes start. I will be teaching medical ethics, and I expect the whole semester to be somewhat like this: potentially intractable arguments over questions where the stakes are, literally, life and death.

Of course, I cannot expect, in a one semester undergraduate class, to finally resolve dilemmas which our culture has not been able to resolve for more than a generation. At best, the students will understand the arguments better, learn to think more deeply and critically, and hopefully come to somewhat better grounded beliefs about medical practice.

Still, reflecting on the intractability of ethical debate in the last few decades reminded me of this passage from Robert Nozick’s Philosophical Explanations (p. 4):

Children think an argument involves raised voices, anger, negative emotion. To argue with someone is to attempt to push him around verbally. But a philosophical argument isn’t like that—is it?

The terminology of philosophical art is coercive: arguments are powerful and best when they are knockdown, arguments force you to a conclusion, if you believe the premisses you have to or must believe the conclusion, some arguments do not carry much punch, and so forth. A philosophical argument is an attempt to get someone to believe something, whether he wants to believe it or not. A successful philosophical argument, a strong argument, forces someone to a belief.

Though philosophy is carried on as a coercive activity, the penalty philosophers wield is, after all, rather weak. If the other person is willing to bear the label of “irrational” or “having the worse arguments”, he can skip away happily maintaining his previous belief. He will be trailed, of course, by the philosopher furiously hurling philosophical imprecations: “What do you mean, you’re willing to be irrational? You shouldn’t be irrational because . . .” And although the philosopher is embarrassed by his inability to complete this sentence in a non-circular fashion—he can only produce reasons for accepting reasons—still, he is unwilling to let his adversary go.

Wouldn’t it be better if philosophical arguments left the person no possible answer at all, reducing him to impotent silence? Even then, he might sit there silently, smiling, Buddhalike. Perhaps philosophers need arguments so powerful they set up reverberations in the brain: if the person refuses to accept the conclusion, he dies. How’s that for a powerful argument? Yet, as with other physical threats (“your money or your life”), he can choose defiance. A “perfect” philosophical argument would leave no choice.

If I happen to come up with a perfect philosophical argument this semester, I will be sure to share it here first.

Posted in humanities, philosophy | 1 Comment

Democracy and the nuclear state

Perhaps the most striking sentence I came across in my readings on the atomic bomb had nothing (directly) to do with the morality of the atomic bombs.

As the Manhattan Project neared completion, General Leslie Groves, the project’s head, commissioned a physicist named Henry De Wolf Smyth to write an in-depth report on the development of the atomic bomb. The report—titled Atomic Energy for Military Purposes: The Official Report on the Development of the Atomic Bomb Under the Auspices of the United States Government—was released to the public on August 12, 1945, just 6 days after the Hiroshima bombing, and 3 days before the Japanese surrendered on August 15.

The report did not provide any direct information about how to build an atomic bomb. However, it provided an in-depth account of the Manhattan project and the implications of atomic energy in war.

The most striking sentence is the very first sentence of the Report’s preface:

The ultimate responsibility for our nation’s policy rests on its citizens and they can discharge such responsibilities wisely only if they are informed.

This is a philosophical premise with enormous political implications.

The first thing that strikes me about this sentence is the degree to which it clashes with the emphasis on secrecy that has come to characterize the War on Terror. Numerous vitally important policy questions—like what interrogation techniques are consistent with morality and with international law, or under what circumstances (if any) extra-judicial killing can be legitimate—are treated as state secrets, unavailable for public debate.

Of course, given the secrecy surrounding the Manhattan Project itself, and the fact that the decision to use the bomb was taken without any public debate, we may question whether the Manhattan Project itself conformed to the principle stated on the first page of this report.

Nevertheless, the fact that such an in-depth report was released immediately after the first atomic bombings, explicitly inviting public debate on the future of nuclear weapons, seems to me significant, and a rather different attitude from that which has taken hold as a result of the Cold War and the War on Terror.

In any case, regardless of how we answer those questions, I think it is important to recognize that democratic government is only possible where citizens know enough about government policy to be able to make an informed decision in the voting booth. This requires not only that the government provide adequate information about its policies, but also that candidates engage in thoughtful and informed discussion of these issues during the campaign.

I realize that this is, perhaps, a hope which is almost certain to be disappointed in the current political climate. Nevertheless, I think the principle is worth remembering, even if only in the breach.

And it is in the hope of making some small contribution to that discussion that I have collected these ongoing reflections.

Posted in just war | Leave a comment

Remembering Hiroshima and Nagasaki

Sixty-seven years ago this month, the United States dropped atomic bombs on Hiroshima and Nagasaki, killing somewhere between 150,000 and a quarter million people, most of them civilians. Of these, about half were killed in the blast itself, and about half died by the end of 1945 from severe burns and radiation, both of which were made worse by the lack of adequate medical care in the aftermath of the attack.

Two views of the attack

The decision to use the bomb remains controversial.

Some argue that it was justified because it saved many lives—both American and Japanese—and hastened the end of the war. For example, the eminent cultural and literary historian Paul Fussell (author of The Great War and Modern Memory) makes this argument in “Thank God for the Atomic Bomb” (PDF).

Others regard it as a serious war crime. For example, the Second Vatican Council declared, “Any act of war aimed indiscriminately at the destruction of entire cities or extensive areas along with their population is a crime against God and man himself. It merits unequivocal and unhesitating condemnation” (Gaudium et Spes, 80). The Catholic Catechism reiterates this judgment in the section on Just War (2314). This is also the argument made by Elizabeth Anscombe in “Mr. Truman’s Degree” (PDF).

I am firmly on the side of those who believe that the bombings were a war crime. But I recognize that this position seems unrealistic and out of touch to many. It seems to them that out of devotion to an abstract “moral” principle, I am willing to insist on killing millions of people in order to have a more “just” war.

Not surprisingly, this sort of “morality” strikes a lot of people as crazy.

Digging deeper

Of course, for a philosopher, it’s insufficient to say, “I believe that the bombings were a war crime.” An ethicist does not simply command or forbid: the goal of ethical inquiry is to help students to understand the reasons behind ethical judgments. Ethical argument seeks to interpret, to justify, and to explain its conclusions.

To mark the anniversary of the bombings, I spent some time this month reading a number of different resources on the bombing, hoping to deepen my understanding of the different sides in the conflict.

Although I am a philosopher, I think that fiction often provides an important window into philosophical questions. So I began with Nuclear Holocausts: Atomic War in Fiction 1895-1984 (web) by Paul Brians, and Masuji Ibuse’s novel Black Rain (Amazon). Black Rain is a particularly powerful exploration of the way the bomb affected the lives of ordinary people.

In a similar vein, John Hersey’s Hiroshima (Amazon), originally published in the New Yorker on the first anniversary of the bombings, provides a non-fictional narrative of the impact of the bombings on survivors Hersey interviewed in the months after the explosion.

For a more detached and technical treatment of the development of the bombs and their aftermath, I looked at Atomic Energy for Military Purposes: The Official Report on the Development of the Atomic Bomb Under the Auspices of the United States Government (web) by Henry De Wolf Smyth and The Manhattan Engineer District’s report, The Atomic Bombings of Hiroshima and Nagasaki (Amazon Kindle).

These two provided an in-depth account of the development of the bomb and its effects; the Manhattan Engineer District’s report also provided the account of an eyewitness, Fr. John A. Siemes, a German Jesuit priest who was a professor of modern philosophy at Tokyo’s Catholic University, and was staying at a Jesuit house on the outskirts of Hiroshima at the time of the bombing.

Although Siemes is a Catholic priest, a professor of philosophy, and himself a survivor of the atomic bombings, who saw the horrible effects of the attack first-hand, he is ambivalent about the morality of the use of the bomb. He reports that he and his fellow priests “have discussed among ourselves the ethics of the use of the bomb”:

Some consider it in the same category as poison gas and were against its use on a civil population. Others were of the view that in a total war, as carried on in Japan, there was no difference between civilians and soldiers, and that the bomb itself was an effective force tending to end the bloodshed, warning Japan to surrender and thus to avoid total destruction. It seems logical to me that he who supports total war in principle cannot complain of war against civilians. The crux of the matter is whether total war in its present form is justifiable, even when it serves a just purpose. Does it not have material and spiritual evil as its consequences which far exceed whatever good that might result? When will our moralists give us a clear answer to this question?

In the next several posts, I will share some observations stemming from this rather diverse reading, with the goal of both increasing awareness of the bombings themselves, and trying to shed more light on my own answer to Fr. Siemes’s question.

Posted in just war | Leave a comment

The value of the humanities

It isn’t often that you hear the CEO of a multi-billion dollar corporation blame another multi-billion dollar corporation’s failures on its lack of attention to the humanities and liberal arts.

So I was somewhat surprised when I came across this quote, from “Microsoft’s Lost Decade” in this month’s Vanity Fair:

In Walter Isaacson’s authorized biography Steve Jobs, Jobs acknowledged Ballmer’s role in Microsoft’s problems: “The company starts valuing the great salesmen, because they’re the ones who can move the needle on revenues, not the product engineers and designers. So the salespeople end up running the company.… [Then] the product guys don’t matter so much, and a lot of them just turn off. It happened at Apple when [John] Sculley came in, which was my fault, and it happened when Ballmer took over at Microsoft. Apple was lucky and it rebounded, but I don’t think anything will change at Microsoft as long as Ballmer is running it.”

Most interesting, however, is that Jobs put the ultimate blame on Bill Gates: “They were never as ambitious product-wise as they should have been. Bill likes to portray himself as a man of the product, but he’s really not. He’s a businessperson. Winning business was more important than making great products. Microsoft never had the humanities and liberal arts in its DNA.” (emphasis added)

I worked at Microsoft during the closing years of Bill Gates’ tenure, and the first years of Ballmer’s leadership, and watched the transition the article describes begin to happen. From my own experience, and from what I have heard from friends who remained at the company, the article offers a lot of insight into the ways that Microsoft—despite its enormous advantages in cash and market position—failed to keep up its leadership, and was surpassed by Apple and Google.

The primary lesson of the article is that Microsoft’s senior leaders fundamentally did not understand people: they did not understand the people who worked for them, and they did not understand the people they were trying to sell to. Within the company, they created bureaucratic structures that destroyed teamwork and long-term focus. In research and development, they ignored products, like e-readers, that could have had great customer appeal; and they also created products, like the Zune, that nobody wanted to buy.

Ultimately, businesses thrive or stagnate depending on whether human beings think their products are worth buying. Their success in producing products that customers love depends on their ability to organize large groups of human beings to come up with good ideas, refine them, and bring them to the market. Both of these tasks demand an in-depth understanding of human beings—of the human condition.

This understanding is seldom taught in business or engineering classes. It’s also, unfortunately, not always something you learn in liberal arts classes. Many in the liberal arts have tried to mimic science and engineering by turning the humanities into specialized, technical disciplines. Unfortunately, this approach usually does not inspire students to reflect more deeply on their own lives or to feel a hunger to understand the human condition.

In his 2005 Stanford University commencement address, Steve Jobs talked about how a calligraphy class he took as an undergraduate helped him, many years later, to craft the unique aesthetic of the Apple Macintosh. He also spoke at length of love, loss, and death, and how these shaped his attitudes toward his own work:

[transcript of the address here]

Viewed through a kind of narrow lens, the humanities may seem useless, as a calligraphy class seems useless for a future tech entrepreneur. And the big questions—love, death, vocation—may seem irrelevant to the bottom line. But Jobs does a good job of showing us how these human stories are intimately related to the larger story of his phenomenal successes with Apple and Pixar.

Of course, I am not interested in the humanities primarily because of the bottom line. I made more in a single year at Microsoft than I’ve made in 5 years of grad school. But I am much more satisfied with my life now than I was then. The big questions of human life are interesting and worth pursuing for their own sake. A life driven by the bottom line is, paradoxically, worth less than a life open to more important things.

Steve Jobs was driven more by aesthetics than by the bottom line. But because many people value aesthetics more than money, they are willing to pay more for a beautiful computer or phone than for one which is utilitarian but clunky and unattractive.

Those of us who teach in the humanities should be willing to challenge the claim that an engineering or business education is more “useful” than an education in the humanities, or that humanities classes are a “waste of time” for students in engineering or business. Apple is surely as good a case study of business and technological success as any, but its success is clearly rooted in aesthetic and human values that are not usually found in business or engineering curricula.

It’s helpful to point out that inattention to human nature can lead to stagnation in business, and that an appreciation for aesthetics and a focus on what customers want can lead to runaway success.

Love, beauty, and the big questions of human life are the crown jewels of the humanities. We should not be embarrassed by them, nor shy away from them in an effort to model our inquiries on the narrow and technical model of the apparently more prestigious disciplines within the university.

Am I suggesting that the humanities can replace business or engineering education? Of course not. Apple’s success required technical expertise and business savvy as much as it required an appreciation for aesthetics and human-centered computing.

But is an education in the liberal arts valuable for knowing how to use talents in business and engineering to build products that human beings will respond to? Does the knowledge of human nature that comes from wide study in the humanities help with the team interactions necessary to bring a complex product to the market? To both questions, we should be willing to argue—with good evidence to back up our argument—that the answer is a resounding “Yes!”

Above all, we should be willing to “stay hungry” and “stay foolish” by approaching the big questions found in art, philosophy, theology, and literature with the confidence that we have something important to offer our students—something more important than mere financial success.

Posted in humanities, technology | 1 Comment

The power of checklists in medicine

As usually happens before a new semester begins, I am reading a lot of articles, looking for material to make my medical ethics classes more interesting.

One of the most fascinating articles I read this week was “The Checklist,” by Atul Gawande. Gawande is a surgeon at Brigham and Women’s Hospital in Boston. He also teaches at the Medical School and School of Public Health at Harvard University.

The first thing to say about Gawande is that he is a good story-teller. He tells the story of a 3-year-old girl in Austria, rescued and successfully resuscitated after spending 30 minutes at the bottom of a frozen pond, and the story of Anthony DeFilippo, a middle-aged limousine driver who nearly died of surgical complications. Among these stories, he weaves a fascinating discussion of research by Peter Pronovost about the power of checklists to reduce medical errors and improve patient outcomes.

On a sheet of plain paper, [Pronovost] plotted out the steps to take in order to avoid infections when putting a line in. Doctors are supposed to (1) wash their hands with soap, (2) clean the patient’s skin with chlorhexidine antiseptic, (3) put sterile drapes over the entire patient, (4) wear a sterile mask, hat, gown, and gloves, and (5) put a sterile dressing over the catheter site once the line is in. Check, check, check, check, check. These steps are no-brainers; they have been known and taught for years. So it seemed silly to make a checklist just for them. Still, Pronovost asked the nurses in his I.C.U. to observe the doctors for a month as they put lines into patients, and record how often they completed each step. In more than a third of patients, they skipped at least one.

The next month, he and his team persuaded the hospital administration to authorize nurses to stop doctors if they saw them skipping a step on the checklist; nurses were also to ask them each day whether any lines ought to be removed, so as not to leave them in longer than necessary. This was revolutionary. Nurses have always had their ways of nudging a doctor into doing the right thing, ranging from the gentle reminder (“Um, did you forget to put on your mask, doctor?”) to more forceful methods (I’ve had a nurse bodycheck me when she thought I hadn’t put enough drapes on a patient). But many nurses aren’t sure whether this is their place, or whether a given step is worth a confrontation. (Does it really matter whether a patient’s legs are draped for a line going into the chest?) The new rule made it clear: if doctors didn’t follow every step on the checklist, the nurses would have backup from the administration to intervene.

Pronovost and his colleagues monitored what happened for a year afterward. The results were so dramatic that they weren’t sure whether to believe them: the ten-day line-infection rate went from eleven per cent to zero. So they followed patients for fifteen more months. Only two line infections occurred during the entire period. They calculated that, in this one hospital, the checklist had prevented forty-three infections and eight deaths, and saved two million dollars in costs. (Emphasis added.)

Gawande then tells the story of the Keystone Initiative, an effort by the state of Michigan to implement checklists in the state’s ICUs. The project was a phenomenal success:

In December, 2006, the Keystone Initiative published its findings in a landmark article in The New England Journal of Medicine. Within the first three months of the project, the infection rate in Michigan’s I.C.U.s decreased by sixty-six per cent. The typical I.C.U.—including the ones at Sinai-Grace Hospital—cut its quarterly infection rate to zero. Michigan’s infection rates fell so low that its average I.C.U. outperformed ninety per cent of I.C.U.s nationwide. In the Keystone Initiative’s first eighteen months, the hospitals saved an estimated hundred and seventy-five million dollars in costs and more than fifteen hundred lives. The successes have been sustained for almost four years—all because of a stupid little checklist.

Despite the effectiveness of checklists, however, they have not entered widespread use. Pronovost argues that:

The fundamental problem with the quality of American medicine is that we’ve failed to view delivery of health care as a science. The tasks of medical science fall into three buckets. One is understanding disease biology. One is finding effective therapies. And one is insuring those therapies are delivered effectively. That third bucket has been almost totally ignored by research funders, government, and academia. It’s viewed as the art of medicine. That’s a mistake, a huge mistake. And from a taxpayer’s perspective it’s outrageous.

Read the whole thing. If you find it interesting, you might want to look at The Checklist Manifesto (Metropolitan Books, 2009), Gawande’s book-length expansion of the ideas in this article.

Posted in medical ethics | Leave a comment