Gordon–Loeb model

The Gordon–Loeb model provides a framework for determining how much to invest in cybersecurity, using a cost-benefit approach.

The model considers a potential loss L from a security breach, the probability v that a breach occurs (the firm's vulnerability), and an investment z in security that lowers the breach probability according to a security breach probability function S(z, v). Gordon and Loeb demonstrated that, for the classes of breach functions they analyzed, the optimal level of security investment z* does not exceed 1/e (roughly 37%) of the expected loss from a breach.
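
As a rough illustration of this result, the sketch below numerically maximizes the expected net benefit of investment for a breach probability function of the form S(z, v) = v / (αz + 1)^β, one of the functional forms analyzed by Gordon and Loeb; the specific parameter values here are assumptions chosen only to illustrate the bound, not figures from the original paper.

```python
import numpy as np

# Illustrative parameter values (assumed for this sketch, not taken from the paper)
L = 1_000_000.0            # monetary loss if a breach occurs
v = 0.6                    # vulnerability: breach probability with zero investment
alpha, beta = 0.0001, 1.0  # productivity parameters of the breach function

def S(z):
    """Breach probability after investing z, using the form S(z, v) = v / (alpha*z + 1)**beta."""
    return v / (alpha * z + 1) ** beta

def expected_net_benefit(z):
    """Reduction in expected loss achieved by the investment, minus its cost."""
    return (v - S(z)) * L - z

# Locate the optimal investment z* numerically on a grid over [0, v*L]
zs = np.linspace(0.0, v * L, 200_001)
z_star = zs[np.argmax(expected_net_benefit(zs))]

expected_loss = v * L
print(f"optimal investment z* ~= {z_star:,.0f}")
print(f"expected loss v*L      = {expected_loss:,.0f}")
print(f"z* as share of v*L     = {z_star / expected_loss:.3f}  (bound: 1/e ~= {1 / np.e:.3f})")
```

For these assumed values the optimal investment comes out well below the 1/e share of the expected loss, consistent with the bound.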

The model is widely regarded as one of the leading analytical tools in cybersecurity economics.[8]

The model has also been covered by mainstream media, including The Wall Street Journal[9] and the Financial Times.[10]

Subsequent research has critiqued the model's assumptions, suggesting that some security breach functions may require investing no less than 1/2 of the expected loss, challenging the universality of the 1/e factor.
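
The flavor of such critiques can be illustrated with a breach function that violates the smooth, decreasing-returns assumptions of the original model. The threshold function below is a hypothetical construction for illustration only, not one taken from the published critiques: investment has no effect until it reaches a threshold, and the optimal investment can then be a far larger share of the expected loss than 1/e.

```python
import numpy as np

# Hypothetical threshold-type breach function (violates the model's smoothness
# and decreasing-returns assumptions): investment has no effect until it
# reaches the threshold T, at which point breaches are prevented entirely.
L, v = 1_000_000.0, 0.6   # assumed illustrative values
T = 0.55 * v * L          # threshold set at 55% of the expected loss

def S_threshold(z):
    return v if z < T else 0.0

def expected_net_benefit(z):
    return (v - S_threshold(z)) * L - z

zs = np.linspace(0.0, v * L, 100_001)
z_star = zs[np.argmax([expected_net_benefit(z) for z in zs])]

print(f"z* as share of v*L = {z_star / (v * L):.2f}  # well above 1/e ~= 0.37")
```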

Figure: Ideal level of investment in company computer security, given decreasing incremental returns.