What's Gone Right in the Study of What's Gone Wrong

By RICHARD M. VALELLY

Ever have discipline envy? As a political scientist, I've lately felt that the grass is greener in, of all things, the sociology of disasters.

That subfield is distinguished by studies of, for instance, the Mann Gulch firefighting debacle of August 1949, or of how informal social networks left hundreds of Chicago's weakest citizens vulnerable to a blistering 1995 heat wave. Such inquiry overlaps with more technical work by disaster-prevention and management engineers at the Universities of Colorado and Delaware, George Washington University, and the National Science Foundation. Let me be clear: It's not that I like calamities. Like most people, I fear and abhor them. That is why the sociology of disaster seems so promising. Work that unravels the logics of avoidable catastrophes could prove extraordinarily useful to policy makers and government officials.

Such work also ought to strongly attract political scientists. We typically enter our discipline because we care about good government and productive politics. Yet many of us are put off by the case-study or "small N" nature of disaster studies. Except sometimes in international relations and security studies, the best graduate students in political science are told by their mentors never to even whisper the dread words "case study." Political scientists offer good reasons for that advice. The case-study design inevitably narrows the explanatory focus. By zeroing in on mechanisms gone wrong, case studies sharply truncate the range of variation in the phenomenon under investigation. The worry is that the causal explanation offered by the case-study analyst is never, or only inadequately, tested against alternative values of the study variable - every "what went wrong" shadowed by a "what went right."

In contrast, in quantitative "large N" research design, episodes of failure by, say, the National Aeronautics and Space Administration fit at one slim end of a distribution of outcomes. The observations range from unmitigated calamity at one end to unalloyed success at the other, with mixed cases in between. Given enough comparable situations, a researcher can pick a variable - for instance, organizational culture - and see what role it plays across the full range of outcomes and in interaction with other variables, such as the average tenure of key officials and executives. If you don't design a study in that way, how can you generate a theory of organizational failure? Can you say why a bureaucracy or agency will sometimes fall prey to a frightful short-circuiting of its mission, yet at other times put in virtuoso performances?
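To make that design concrete, here is a minimal, hypothetical sketch in Python (using pandas and statsmodels): a regression of an outcome score, running from calamity to success, on an organizational-culture index, the average tenure of key officials, and their interaction. The variable names and the simulated data are invented for illustration; they are not drawn from any actual study.

```python
# Hypothetical illustration only: simulated agencies, invented variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=0)
n = 200  # imagined number of comparable agencies or missions

data = pd.DataFrame({
    "culture": rng.normal(0.0, 1.0, n),  # e.g., a survey-based culture index
    "tenure": rng.normal(5.0, 2.0, n),   # average years in post for key officials
})
# Simulated outcome score (calamity at the low end, success at the high end):
# here culture, tenure, and their interaction all help, plus random noise.
data["outcome"] = (
    0.5 * data["culture"]
    + 0.3 * data["tenure"]
    + 0.2 * data["culture"] * data["tenure"]
    + rng.normal(0.0, 1.0, n)
)

# 'culture * tenure' expands to both main effects plus their interaction term.
model = smf.ols("outcome ~ culture * tenure", data=data).fit()
print(model.summary())
```

Nothing in this toy regression is specific to NASA or any real agency; it only illustrates how, once enough comparable cases exist, the interplay of culture and tenure across the full range of outcomes could be estimated rather than inferred from a single failure.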

There are also bleak paradoxes at work in disaster sociology's reliance on the case study. Imagine that a case study proposes, and seems to validate, a strong hypothesis for why failure occurred. It would take a second disaster of the same sort, with apparently similar sources, to deepen one's confidence that the hypothesis is correct. But would such a recurrence ever happen? Wouldn't that mean the relevant organizations had learned nothing? Besides, recurrences of an organizationally produced cataclysm are rare. Much time may pass in the interim. How can one be certain that the second disaster, or even a third, is really a comparable disaster, with comparable origins? As a matter of science, doesn't one need a politically and socially unacceptable frequency of catastrophic performance failures to be confident in any hypothesis?

Those are good questions, but that logic, taken to an extreme, would be paralyzing, and such grim methodological concerns ought not to block political scientists from participating in this line of work. In the wake of 9/11 and the creation of the Department of Homeland Security, it's obvious that government, and only government, is our front line of defense against Al Qaeda and other terrorist threats. Only government handles the delivery of certain public goods.

As for risky technologies, sure, the private sector handles and uses them every day. But the August 14, 2003, blackout underscores that private-sector errors in the handling of risky technologies cannot be traded in a market the same way that emissions can. When it comes to a certain class of events and outcomes, in other words, there is no escape from governmental execution of certain tasks or the close governmental regulation of licensed agents. Do political scientists really want to sit on the sidelines while other social scientists try to improve government capacities for disaster prevention and response?

It might help if we political scientists grasped the considerable methodological strength that does undergird the sociological literature on disaster. Yes, a case study leaves indeterminate the precise degree of confidence one can assign to rival hypotheses. But case studies offer compelling detail and narrative depth that most empirical research can't match. Busy professionals running an airport, for example, surely need that kind of depth. They need to see in slow motion, as it were, just how disaster happens and why a particular hypothesis about its causes tends to hold up through close scrutiny of the facts. That characteristic of the literature makes "disasterology" particularly policy-relevant.

One can see both traits - the persuasive depiction of causal mechanisms that would otherwise go undetected, and consequent policy relevance - in the prize-winning work of Robin Wagner-Pacifici, a colleague at Swarthmore. (Disclosure: I don't get anything from discussing her work, except maybe a warm hello the next time we bump into each other.) She has a particular interest in the law-enforcement standoff. She has studied the very deadly May 1985 assault by 500 Philadelphia police officers on the urban redoubt of MOVE (a cult that for many years had made its neighbors quite miserable), as well as the law-enforcement tragedies at Ruby Ridge, Idaho, in 1992 and Waco, Tex., in 1993.

Wagner-Pacifici has discovered, quite counterintuitively, that such appalling events pivot around socially and organizationally constructed concepts of space and time adopted by the different actors in the standoff. She shows in her third book on such issues, Theorizing the Stand-Off: Contingency in Action (Cambridge University Press, 2000), that law-enforcement agents on the one hand, and outlaw or aggressively dissident cults and figures on the other, will view time and space in completely different ways. That disjuncture is fraught with danger.

Although she does not explicitly offer policy prescriptions for police forces, the relevance of Wagner-Pacifici's analysis to their work is clear: Law-enforcement agents should never establish a well-demarcated perimeter around a holed-up group. Although establishing a perimeter seems to be an obvious and necessary practice, seeking to control space in that way sets in motion a cascade of further, continually escalating standoffs that will engage wholly irresolvable, rival conceptions of time. Once there is a perimeter, the standoff drama subtly acquires a necessary ending. At some point, the exceptional circumstance of the perimeter's existence must end, or so the law-enforcement actors and their political superiors reason. That ending will signal the restoration of normalcy. Even if law enforcement strives to avoid thinking that way, the perimeter's existence quickly draws sustained news-media attention that sooner or later crystallizes the issues of when the authorities will act and when the standoff will end.

But the other side in the standoff hardly takes the same view of those pressures. For instance, the Branch Davidians understood the passage of time in quasi-apocalyptic terms. For them, a fiery ending to the standoff fulfilled religious prophecy. A group's cognitive dissonance - like the Davidians' vision of a religious drama as opposed to a law-enforcement scenario - can goad police officers into setting ultimatums. When a fiery, deadly resolution ensues, the catastrophe demoralizes the highly trained officers, appalls the public, and indelibly stains the reputations of dedicated public servants, like the Rev. Wilson Goode, the former mayor of Philadelphia. And, as the case of Waco shows, a fierce military engagement may even sow the seeds of another disaster, such as the Oklahoma City bombing, which was executed by Timothy McVeigh, a veteran of the first gulf war angered by what happened at Waco.

After living through both Ruby Ridge and Waco, Attorney General Janet Reno appeared to realize the importance of the perimeter. At any rate, the Montana Freemen and Elian Gonzalez standoffs that occurred later on her watch did not feature the establishment of obvious perimeters. Perhaps not coincidentally, they were peaceably resolved. The lessons of such studies encounter enormous organizational and political resistance. Why they do, and how such resistance breaks down, reflect social and political processes that deserve analysis themselves.

Consider the story of Diane Vaughan, the Boston College sociologist who wrote The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA (University of Chicago Press, 1996). On the day of the Columbia space-shuttle disaster, February 1, 2003, she began to receive phone calls within an hour and a half of the news. For six weeks, 12 hours a day, she responded to requests for analysis of the parallels between the two events. Soon, Vaughan was asked to testify before the Columbia Accident Investigation Board. With the vigorous backing of the board's chairman, Adm. Harold W. Gehman Jr., retired from the Navy, Vaughan participated extensively in writing the board's report, even writing one of its chapters. Upon the report's release, NASA's lack of contact with Vaughan became an issue. Vaughan was finally brought into full, productive contact with senior NASA administrators.

Vaughan's 1996 case study proposed a strong hypothesis for why the 1986 Challenger failure occurred, namely, the agencywide organizational and cultural "normalization" of disturbing signals about the performance of the risky technology that NASA handles routinely. Because NASA's people were so good at rapidly integrating and acting on huge amounts of information, they tended to not quite catch the high pitch of the serious signals when something was wrong. In a sort of perpetual, and usually productive, crisis mode, they were blind to crises of a more serious caliber. Seven years later the Columbia Accident Investigation Board concluded that in effect the same accident recurred in 2003 despite strong efforts at organizational learning within NASA after the 1986 calamity. The Challenger and Columbia disasters were all too comparable.

Contrast Vaughan's ultimate role in NASA's discussion with the experience of Eric Klinenberg, a rising star in the sociology department at New York University. His book, Heat Wave: A Social Autopsy of Disaster in Chicago (University of Chicago Press, 2002), showed that there was very little "natural" about either Chicago's vulnerability to the 1995 heat wave that killed 739 people or the inadequate nature of its response. Although he emphasized that the social breakdown he chronicled had complex social and organizational causes for which no one in particular could be held responsible, Chicago officials kept him at arm's length.

When the European heat wave of August 2003 occurred, about a year after the publication of Klinenberg's book, it caused about 35,000 "excess deaths" in France, England, Germany, Spain, and Italy - that is, deaths in excess of the number of fatalities expected. Belatedly, French government and other officials critically examined their inaction in light of early warning signs, like a spike in hospital admissions for heat flush and increased calls for paramedic services. An information campaign on how to handle the heat and stay hydrated, in a country that shuns air-conditioning, might have lessened the heat wave's toll. And the French began looking harshly at social factors, too, like the isolation of the elderly, particularly during the summer-vacation period when their families are out of town.

But although some French officials made contact with Klinenberg, as did journalists from major European publications, no one from the governments of the other four nations has been in touch with him, he said in a recent phone interview.

If government agencies do eventually pay attention to relevant disaster studies, why don't they do so earlier? Can taking account of such highly professional case-study work make a difference to the agencies' performance? Under what conditions? Those are questions that political scientists are well equipped to take on.

Of relevance here is the concept of "causal stories" - that is, rival forms of educated explanation. The idea was proposed by the political scientist Deborah Stone in her book, Policy Paradox: The Art of Political Decision Making (Norton, 1997). One function of "causal stories" is to separate events into either natural or man-made phenomena. That helps explain why Klinenberg's work has not had the policy impact that it deserves. As Klinenberg has stressed, heat waves are the deadliest form of recurring disaster in modern, urbanized societies. But people can plausibly see them as "natural." Harsh weather, many conclude, killed off people who were already likely to die. The problem with that perspective, Klinenberg says, is that a vulnerable senior nestled within a social network of friends, neighbors, associations, or family will be much more likely to survive a heat wave than an identical counterpart who lacks the same social context. It's not weather; it's whether the senior is isolated. That's a hard sell, though, because the idea that Mother Nature simply harvests people whose time has come lets public officials and private institutions off the hook.

Until Klinenberg's point is accepted, the deadly impact of heat waves will continue to be treated as essentially beyond human remedy. Klinenberg has given political scientists who specialize in urban politics a reason to take up observation posts every summer from now on, to help us overcome the angry-Mother-Nature causal story. Their data could offer operational insights, and could also help lower the number of elderly people who, in Klinenberg's words, "die alone."

Of course, even if a calamity has unmistakably organizational origins, the same intra-organizational culture that contributed to or produced the outcome also arms members with cognitive defenses against outside criticism. Academic studies can be dismissed as the logorrhea of flighty critics who "don't really get it."

That's when the news media's assessment of a case study's relevance becomes critical. Sociologists and political scientists will, no doubt, find that the politically effective level of press scrutiny is hardly guaranteed. On the other hand, training in social-science methodology and select social-science findings in journalism schools might make journalists more alert to policy-relevant research. One can imagine foundation-backed formal links between journalism programs and departments of sociology and political science.

Political scientists can also collaborate with sociologists based on our knowledge of bureaucracies. We're pretty sure - thanks to James Q. Wilson's book Bureaucracy (Basic Books, 1989) - that bureaucracies cannot be pulled apart and put back together without paying a cost in diminished performance for some time. We also believe - from Wilson, and his foremost student John DiIulio's Governing Prisons (Free Press, 1987) - that improvement in organizational performance is rarely a matter of resources and instead results from fruitful relationships between talented leadership and various types of operators lower in a hierarchy.

The top executives of many, if not most, agencies do not tarry long at their appointments and instead quickly move on, aborting the possibility of such intra-organizational bonds. Long tenures like those found within NASA are rare, and Janet Reno's long tenure at the Department of Justice and George Tenet's at the CIA are also truly exceptional. More typical are the short, tense tenures of Paul O'Neill, the former treasury secretary, or Christine Todd Whitman, who led the Environmental Protection Agency.

The problem of brief tenure for government executives might change if political scientists and sociologists found a relationship between improved organizational performance and the integration of helpful case-study research. Thanks in large part to Vaughan, such cross-fertilization may already be under way at NASA, putting emphasis on administrative culture rather than the scapegoating of current or recently dumped officials, or the reform for reform's sake of incoming white-knight appointees.

It may be wildly optimistic to contemplate the prospect, but perhaps the bureaucratic rose will regain some of its bloom. When actors in a bureaucracy see themselves as negotiating cultures that can be profoundly dysfunctional or profoundly adaptive and performance-enhancing, then their jobs become exciting, not staid "government work." Taking disasterology seriously might pave the way for new bureaucratic entrepreneurs and organizational revitalization.

There are dangers, of course. The impact that disaster sociology has had so far partly depends on conceptual clarity that can be summarized in a vivid or memorable phrase - "dying alone" (Heat Wave) or "the normalization of deviance" (The Challenger Launch Decision). Klinenberg and Vaughan fashioned such terms after they made their discoveries and grasped their policy relevance. If the subfield attracts more practitioners, however, social scientists will need to impose informal professional safeguards to keep ideologues from tailoring their research to conform to prefabricated bumper-sticker concepts.

The bottom line, though, is that the qualitative sociology of disasters serves as a reminder to all the social sciences of how useful they can be. Social science can and should be many things besides useful, of course, including rigorous and formal. But utility and empiricism aren't at odds. They can be complementary and, in avoiding preventable horrors, crucial.

Richard M. Valelly is a professor of political science at Swarthmore College.

Section: The Chronicle Review

Volume 50, Issue 32, Page B6