[2] The assignment is typically done in the context of an overarching system, where the worst case consequences of software failures are investigated.
For example, the automotive standard ISO 26262 requires a Hazard Analysis and Risk Assessment ("HARA") at the vehicle level to derive the ASIL of the software executed on a component.
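The ASIL derived in the HARA is determined from three ratings of each hazardous event: severity (S1–S3), probability of exposure (E1–E4), and controllability (C1–C3). The following sketch uses the well-known additive shortcut that reproduces the lookup table in ISO 26262-3 (Table 4); the function name and interface are illustrative, not taken from the standard:

```python
def asil(s: int, e: int, c: int) -> str:
    """Map severity (S1-S3), exposure (E1-E4), and controllability (C1-C3)
    ratings of a hazardous event to an ASIL, using the additive shortcut
    that matches the classification table in ISO 26262-3."""
    if not (1 <= s <= 3 and 1 <= e <= 4 and 1 <= c <= 3):
        raise ValueError("expected ratings S1-S3, E1-E4, C1-C3")
    # Sum of the three ratings: 10 -> ASIL D (highest), 9 -> C, 8 -> B,
    # 7 -> A, anything lower -> QM (quality management, no ASIL assigned).
    return {10: "ASIL D", 9: "ASIL C", 8: "ASIL B", 7: "ASIL A"}.get(s + e + c, "QM")
```

For instance, a highly severe, frequent, and barely controllable event (S3, E4, C3) yields ASIL D, while a low-severity, rare, easily controllable one (S1, E1, C1) needs no ASIL at all.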
It is essential to use an adequate development and assurance process, with appropriate methods and techniques, commensurate with the safety criticality of the software.
Comprehensive documentation of the complete development and assurance process is required by virtually all software safety standards.
In system safety engineering, it is common to allocate upper bounds for failure rates of subsystems or components.
It must then be shown that these subsystems or components do not exceed their allocated failure rates, or otherwise redundancy or other fault tolerance mechanisms must be employed.
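The budget-and-redundancy reasoning above can be sketched numerically. The component names, allocated rates, and mission time below are hypothetical examples, and the redundancy calculation assumes independent failures under a simple exponential model:

```python
import math

# Hypothetical failure-rate allocations (failures per hour) for illustration.
ALLOCATED = {"sensor": 1e-7, "controller": 1e-8, "actuator": 5e-8}

def within_budget(demonstrated: dict[str, float]) -> dict[str, bool]:
    """Check each component's demonstrated failure rate against its allocation."""
    return {name: demonstrated[name] <= ALLOCATED[name] for name in ALLOCATED}

def duplex_failure_probability(lam: float, mission_hours: float) -> float:
    """Probability that BOTH channels of a redundant (1-out-of-2) pair fail
    within the mission, assuming independent exponential failures:
    p_single = 1 - exp(-lam * t), p_pair = p_single ** 2."""
    p_single = 1.0 - math.exp(-lam * mission_hours)
    return p_single ** 2
```

If, say, the sensor cannot demonstrate its allocated 1e-7 per hour on its own, duplicating it drives the joint failure probability well below the single-channel figure, which is the kind of fault tolerance argument the text refers to. Common-cause failures would invalidate the independence assumption and must be analyzed separately.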
Software failures are systematic rather than random, so, unlike hardware failure rates, their probabilities cannot be quantified. Software safety and security may have differing interests in some cases.
Software that employs artificial intelligence techniques such as machine learning follows a radically different lifecycle.
For example, EN 50716 (Table A.3) states that artificial intelligence and machine learning are not recommended for any safety integrity level.
This might be partly caused by statements such as "working software over comprehensive documentation", which is found in the Manifesto for Agile Software Development.