These accidents often resemble Rube Goldberg devices in the way that small errors of judgment, flaws in technology, and seemingly insignificant damage combine to form an emergent disaster.
Langewiesche writes about "an entire pretend reality that includes unworkable chains of command, unlearnable training programs, unreadable manuals, and the fiction of regulations, checks, and controls."
The ValuJet Flight 592 accident brought widespread attention to the airline's management problems, including the inadequate training of employees in the proper handling of hazardous materials.[4]
In a 2014 monograph, economist Alan Blinder stated that complicated financial instruments made it hard for potential investors to judge whether the price was reasonable.
"When investors don't understand the risks that inhere in the securities they buy (examples: the mezzanine tranche of a CDO-Squared; a CDS on a synthetic CDO ...), big mistakes can be made–especially if rating agencies tell you they are triple-A, to wit, safe enough for grandma."[11]
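Blinder's point can be made concrete with a toy calculation. The sketch below is illustrative only and is not drawn from any cited source: a minimal one-factor Gaussian-copula Monte Carlo, with entirely hypothetical parameters, showing that the expected loss on a mezzanine tranche swings widely with an assumed default correlation that investors could not directly observe, which is one reason a fair price was so hard to judge.

```python
import random
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def pool_loss(n_loans: int, p_default: float, rho: float) -> float:
    """One draw of the fraction of loans defaulting (one-factor Gaussian copula)."""
    m = random.gauss(0.0, 1.0)                       # shared market factor
    threshold = N.inv_cdf(p_default)                 # per-loan default barrier
    # Conditional default probability given the market factor m
    p_cond = N.cdf((threshold - rho ** 0.5 * m) / (1.0 - rho) ** 0.5)
    return sum(random.random() < p_cond for _ in range(n_loans)) / n_loans

def tranche_loss(loss: float, attach: float, detach: float) -> float:
    """Fraction of a tranche (absorbing pool losses in [attach, detach]) wiped out."""
    return max(0.0, min(loss, detach) - attach) / (detach - attach)

def expected_tranche_loss(rho: float, trials: int = 20_000) -> float:
    # Hypothetical mezzanine tranche attaching at 3% and detaching at 7% of pool losses
    draws = (tranche_loss(pool_loss(100, 0.05, rho), 0.03, 0.07) for _ in range(trials))
    return sum(draws) / trials

# Same pool, same ratings inputs; only the unobservable correlation changes.
for rho in (0.0, 0.2, 0.4):
    print(f"assumed correlation {rho:.1f} -> expected mezzanine loss {expected_tranche_loss(rho):.1%}")
```

Because the tranche's risk depends on how defaults cluster rather than on any single loan, a small change in the correlation assumption reprices the security dramatically, and a CDO-Squared compounds this sensitivity a second time.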
"[11] Despite a significant increase in airplane safety since 1980s, there is concern that automated flight systems have become so complex that they both add to the risks that arise from overcomplication and are incomprehensible to the crews who must work with them.
As an example, professionals in the aviation industry note that such systems sometimes switch modes or engage on their own; the crew in the cockpit are not necessarily privy to the rationale for the auto-engagement, which causes confusion.
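The kind of silent mode change being described can be sketched in a few lines. The toy state machine below is purely hypothetical: the mode names and the 210-knot trigger are invented for illustration and do not reflect any real autopilot's logic. The point is structural: the automation changes state on its own condition, and the only thing surfaced to the crew is the resulting mode, not the reason for it.

```python
class FlightGuidance:
    """Hypothetical toy model of an automatic mode reversion (not a real avionics API)."""

    def __init__(self) -> None:
        self.mode = "VNAV PATH"                  # vertical mode the pilot selected

    def update(self, airspeed_kts: float) -> str:
        # Envelope protection: below a threshold the automation reverts to a
        # speed-protection mode on its own, without announcing why.
        if self.mode == "VNAV PATH" and airspeed_kts < 210:
            self.mode = "LVL CHG"                # automatic reversion
        return self.mode                         # the display shows only the new mode

fg = FlightGuidance()
for speed in (250, 230, 205):
    print(f"airspeed {speed} kts -> mode annunciator reads {fg.update(speed)}")
```

The annunciator reports the new state but not the trigger, so the crew must reconstruct the rationale after the fact, which is the perplexity described above.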
Langewiesche quotes engineer Earl Wiener, who takes the humorous statement attributed to the Duchess of Windsor, that one can never be too rich or too thin, and adds "or too careful about what you put into a digital flight-guidance system."[3]
In practice, steps in procedures may be changed and adapted from the formal safety rules, often in ways that seem appropriate and rational, and that may be essential in meeting time constraints and work demands.
In a 2004 Safety Science article, reporting on research partially supported by the National Science Foundation and NASA, Nancy Leveson writes:[13] "However, instructions and written procedures are almost never followed exactly as operators strive to become more efficient and productive and to deal with time pressures ... even in such highly constrained and high-risk environments as nuclear power plants, modification of instructions is repeatedly found, and the violation of rules appears to be quite rational, given the actual workload and timing constraints under which the operators must do their job."