Failing badly and failing well are concepts in systems security and network security (and engineering in general) describing how a system reacts to failure.
The terms were popularized by Bruce Schneier, a cryptographer and security consultant.[1][2]
A system that fails badly is one whose failure produces a catastrophic result.
A system that fails well, by contrast, is one that compartmentalizes or contains its failure.
Designing a system to 'fail well' has also been argued to be a better use of limited security funds than the typical quest to eliminate all potential sources of errors and failure.
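The contrast between failing badly and failing well can be illustrated with a small sketch (an assumption for illustration, not an example from the source): an authorization check whose backend throws an error. A "fail-open" default lets the failure escalate into unauthorized access, while a "fail-closed" default contains the failure by denying access.

```python
def authorize_fail_open(user, check):
    """Fails badly: an internal error silently grants access."""
    try:
        return check(user)
    except Exception:
        return True  # catastrophic default: error becomes access

def authorize_fail_closed(user, check):
    """Fails well: the failure is contained by denying access."""
    try:
        return check(user)
    except Exception:
        return False  # safe default: error becomes denial

def broken_check(user):
    # Hypothetical backend that is currently failing.
    raise RuntimeError("auth backend down")

print(authorize_fail_open("alice", broken_check))    # True  (failure escalates)
print(authorize_fail_closed("alice", broken_check))  # False (failure contained)
```

Both functions see the identical failure; only the chosen default determines whether the system fails badly or well.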