Richard Cook, director of the University of Chicago’s Cognitive Technologies Laboratory, is an anesthesiologist by training. But he’s best known for his work on spectacular accidents outside of health care–Three Mile Island, Chernobyl, the Challenger disaster–which concludes that such accidents are caused not by human error but by flaws in systems. His lab’s mission is to study “safety culture” and devise training methods that treat human judgment as a resource rather than an obstacle.

RC: Most people didn’t become fascinated with human error in complex systems until the ’60s and ’70s, when we started to create elaborate computer-controlled systems in manufacturing, airplane cockpits, and nuclear power plants. The watershed event in the field was the Three Mile Island meltdown in 1979. At first this appeared to be a simple matter of human error: the operators mismanaged the plant. But on closer examination over several years, people discovered that the operators hadn’t mismanaged the plant. They had been trapped by the complexity of the system and by its usual state of being partially broken. They were experienced operators with plenty of knowledge, but they had something like 100 alarms go off in the first five minutes of the event.

RC: Hindsight bias is a durable characteristic of human cognition. Among other things, it leads us to believe that an event was more likely to occur because it has occurred. A lot of the reaction to 9/11 is hindsight bias: It’s so obvious! We should have foreseen it! But the plain fact is, people didn’t foresee it. The problem is that our understanding of the world has now been transformed by the event. It’s what we call a fundamental surprise.

HH: Another standard viewpoint is that safety should always be the number one priority.

RC: I don’t put it that way, because that can lead into almost religious arguments. It’s just that making error the target of our efforts doesn’t lead to fruitful results. The old military idea is to reinforce success, not failure.