Richard Cook (safety researcher)

Dr. Richard I. Cook (May 3, 1953 – August 31, 2022) was a system safety researcher, physician, anesthesiologist, university professor, and software engineer. Cook did research in safety, incident analysis, cognitive systems engineering, and resilience engineering across a number of fields, including critical care medicine, aviation, air traffic control, space operations, semiconductor manufacturing, and software services.

Biography
Cook graduated cum laude from Lawrence University in 1975 from a customized program that included physics and urban planning. After completing his bachelor's degree, Cook took a position as a lead systems analyst at Control Data Corporation, working with finite element analysis programs such as ANSYS and NASTRAN on the CDC STAR-100, and managing teams of programmers and support analysts.

In 1986, Cook received his MD degree from the University of Cincinnati, where he also completed a General Surgery internship. In 1994, he completed his anesthesiology residency at the Ohio State University.

Cook served in the Department of Anesthesia and Critical Care at the University of Chicago as an associate professor and Director of the Cognitive Technologies Laboratory from 1994 to 2012, where he provided clinical care and engaged in teaching, training, research, and community service.

In 2012, Cook was named Sweden's first Professor of Patient Safety at KTH Royal Institute of Technology, where he served until retiring from the position as Professor Emeritus in 2015.

From 2015 to 2020, Cook worked as a research scientist at the Ohio State University in the Department of Integrated Systems Engineering. During this time, he also had a part-time appointment as a clinical professor of anesthesiology at the Ohio State University Wexner Medical Center, where he provided patient care and trained new medical practitioners.

In 2017, Cook, along with John Allspaw and David Woods, founded a consulting company, Adaptive Capacity Labs.

Patient safety work
Cook was active in the patient safety movement from the mid-1990s to the mid-2000s. He was one of the founding board members of the National Patient Safety Foundation and served on its executive committee until 2007. From 1998 to 2000, he advised the U.S. Veterans Health Administration (VA) on patient safety initiatives, and in 2000 was appointed co-director of a VA "Gaps Center", which was funded on the basis of Cook's research.

In 1997, Cook helped organize the workshop "Assembling the Scientific Basis for Progress on Patient Safety" in Chicago, and co-authored the resulting report published the following year: A Tale of Two Stories: Contrasting Views of Patient Safety.

In 2011, Cook served on an advisory panel for the Institute of Medicine on the topic of health IT and patient safety. In the final published report, he wrote a dissent in which he argued that health IT software should be regulated as a class III medical device.

How complex systems fail
In 1998, Cook wrote a treatise titled How Complex Systems Fail, later republished in the book Web Operations: Keeping the Data On Time and in Hindsight magazine, in which he identified eighteen characteristics of complex system failure.

In 2012, Cook gave a talk on the topic at the O’Reilly Velocity conference.

New look
Cook was a proponent of what came to be known as the "new look" of safety (referred to by Sidney Dekker as the "new view").

According to the New Look, operators within safety-critical systems are faced with competing demands, dilemmas, conflicts (both technical and organizational), and uncertainty. In particular, operators are always faced with the competing demands of achieving production goals and maintaining failure-free operations.

When accidents occur, they tend to be attributed to human error because of hindsight bias. Interventions made in the wake of accidents lead to a "cycle of error": they increase the complexity of the system and create the potential for new failure modes.

Instead of focusing on the people in the system as the source of accidents, the "new look" perspective argues that it is the people in the system who create the safety in the system: the work of human operators compensates for gaps in the designed system, and successful work is far more common than failure.

"Going sour"
Cook (along with David Woods and John McDonald) introduced the term going sour to refer to incidents in which system performance degrades slowly over time. Cook noted that going sour incidents are more complex and more difficult to describe than acute incidents. In addition, in these types of incidents, the actions of human operators play a greater role in how the incident unfolds.

"Going solid"
Cook (along with Jens Rasmussen) introduced the term going solid to describe a significant shift in systems operations when a form of capacity becomes exhausted. The term originates from the nuclear power industry, where it is used as slang to refer to a technical situation that has become difficult to manage. More literally, the term describes a change in system behavior related to the state of a steam boiler. Typically, a steam boiler contains a mixture of steam and liquid water. When the boiler becomes completely filled with liquid water, it is said to "go solid".

Cook applied the concept of "going solid" to an intensive care unit in a hospital that undergoes a "bed crunch", when there are no longer enough beds to assign to patients. Cook noted that "going solid" situations tend to foster opportunities for accidents to occur.

Line of representation
Cook noted that operators of software systems are not able to interact directly with the systems that they supervise, but instead they interact through representations. Operators see visual representations of the internal state of software systems, and they manipulate representations in order to act on the system.

Cook used the term line of representation as a metaphor for distinguishing between two sets of entities. Above the line of representation lie people, organizations, and human processes. Below the line of representation are the software artifacts and infrastructure.

Cook noted that the people within an organization, who exist above the line, will describe in concrete terms the entities below the line, despite not being able to directly observe or act upon those entities.

Selected publications

 * 1994 — Cook, R.I., Woods, D.D., Operating at the Sharp End: The Complexity of Human Error, in: Human Error in Medicine, Bogner, M.S., ed., CRC Press, June 1994 (10.1201/9780203751725)
 * 1998 — Cook, R.I., How Complex Systems Fail, Cognitive Technologies Laboratory, University of Chicago, Revision D (00.04.21)
 * 1998 — A Tale of Two Stories: Contrasting Views of Patient Safety, National Health Care Safety Council of the National Patient Safety Foundation at the AMA, 1998
 * 1998 — Cook, R.I., Two years before the mast: learning how to learn about patient safety, In: Scheffler, A.L., Zipperer, L.A., eds. Proceedings of the Second Annenberg Conference on Enhancing Patient Safety and Reducing Errors in Healthcare. Rancho Mirage, CA: National Patient Safety Foundation, pp. 61–64.
 * 1998 — Cook, R.I. Being Bumpable, Proceedings of the Fourth Conference on Naturalistic Decision Making, May 29–31, 1998, The Airlie Conference Center, Warrenton, VA
 * 2000 — Cook, R.I., Render, M., Woods, D.D., Gaps in the continuity of care and progress on patient safety, BMJ Clinical Research 2000;320(7237):791–794.
 * 2005 — Cook, R.I., Rasmussen, J., “Going solid”: a model of system dynamics and consequences for patient safety, BMJ Quality & Safety 2005;14:130–134.
 * 2005 — Cook, R.I., A brief look at the New Look in complex system failure, error, safety, and resilience, Cognitive Technologies Laboratory, University of Chicago, Revision AA (05.11.07).
 * 2006 — Cook, R.I., Woods, D.D., Distancing Through Differencing: An Obstacle to Organizational Learning Following Accidents, in: Resilience Engineering, Nov. 2006 (10.1201/9781315605685-28)
 * 2010 — Woods, D.D., Dekker, S., Cook, R.I., Johannesen, L., Sarter, N. Behind Human Error, 2nd Edition, CRC Press, 2010.
 * 2020 — Cook, R.I., Above the Line, Below the Line, ACM Queue 2020; 17(6) (10.1145/3380774.3380777)

Recorded talks
 * The Future of above-the-line Tooling, SRECon Americas 2022
 * A Few Observations on the Marvelous Resilience of Bone & Resilience Engineering, REdeploy Conference, San Francisco, CA, 2019
 * Resilience In Complex Adaptive Systems, O'Reilly Velocity Web Performance and Operations Conference, New York, NY, 2013
 * How Complex Systems Fail, O'Reilly Velocity Web Performance and Operations Conference, Santa Clara, CA, 2012
 * Lectures on the study of cognitive work, The Royal Institute of Technology, Huddinge, Sweden, 2012