[2] The burst suppression pattern was first observed by Derbyshire et al. in 1936 while studying the effects of anesthetics on feline cerebral cortices; the researchers noticed mixed slow and fast electrical activity whose amplitude decreased as anesthesia deepened.
[3] In 1948, Swank and Watson coined the term "burst-suppression pattern" to describe the alternation of spikes and flat periods in the brain's electrical activity during deep anesthesia.
[4] Bursts are accompanied by depletion of extracellular cortical calcium ions to levels that inhibit synaptic transmission, which leads to suppression periods.
[4] During suppression, neuronal pumps restore calcium ion concentrations to normal levels, allowing the cortex to undergo the same process again.
[8] The shortening of bursts and lengthening of suppression periods are caused by the central nervous system's inability to regulate calcium levels properly, owing to increased permeability of the blood–brain barrier.
[8] At the cellular level, hyperpolarization of the membrane potential of cortical neurons reliably precedes any overt electroencephalographic activity of burst suppression.
[10] Another theory is that alterations in brain metabolism regulate activity-dependent slow modulation of ATP-gated potassium channel conductance, which induces burst suppression.
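A minimal rate-model sketch of this idea is given below. It is an illustration only, not the published conductance-based formulation: a cortical population activity variable r is gated by a slowly recovering metabolic resource m standing in for ATP availability, so that activity depletes m, depleted m silences the population, and recovery of m lets activity resume. All variable names and parameter values are illustrative assumptions.

```python
import numpy as np

# Toy rate-model sketch of metabolically gated burst suppression.
# Illustration only, not the published conductance-based model:
# activity r depletes a metabolic resource m, low m silences the
# population, and slow recovery of m lets activity resume, giving
# alternating active ("burst") and quiescent ("suppression") epochs.

def f(u, k=0.05):
    """Sigmoidal population activation function."""
    return 1.0 / (1.0 + np.exp(-u / k))

dt, t_end = 1e-3, 10.0          # time step and total simulated time (s)
steps = int(t_end / dt)

tau_r = 0.02                    # fast time constant of population activity (s)
tau_rec = 2.0                   # slow metabolic recovery time constant (s)
beta = 2.0                      # metabolic cost of activity
w, i0, theta = 1.0, 0.25, 0.3   # recurrent weight, background drive, threshold

r, m = 0.0, 1.0                 # initial activity and metabolic resource
trace_r = np.empty(steps)
for i in range(steps):
    drive = m * (w * r + i0) - theta      # excitability scaled by metabolism
    r += dt * (-r + f(drive)) / tau_r     # fast activity dynamics
    m += dt * ((1.0 - m) / tau_rec - beta * r * m)  # slow depletion/recovery
    trace_r[i] = r

active = trace_r > 0.5          # label samples as burst vs. suppression
print(f"fraction of time in burst: {active.mean():.2f}")
print(f"number of burst episodes: {int(np.sum(np.diff(active.astype(int)) == 1))}")
```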
[18] These methods separate burst and suppression episodes based on EEG features such as entropy measures, the nonlinear energy operator, voltage variance, or an adaptation of the constant false alarm rate (CFAR) algorithm.[19]
When these features distinguish burst from suppression patterns, a fixed threshold chosen from an ROC curve, or a machine learning method,[18] is used for segmentation.
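A minimal sketch of this thresholding approach is shown below, assuming a single-channel EEG trace and using the voltage-variance feature mentioned above. The window length and threshold value are illustrative choices rather than validated parameters, and the function name segment_burst_suppression is hypothetical; in practice the threshold would be chosen from an ROC curve or replaced by a trained classifier.

```python
import numpy as np

def segment_burst_suppression(eeg, fs, window_s=0.5, var_threshold=25.0):
    """Label each window of an EEG trace: True = suppression, False = burst.

    Uses the variance of the signal in non-overlapping windows as the
    feature and a fixed (illustrative) threshold for segmentation.
    """
    win = int(window_s * fs)
    n_windows = len(eeg) // win
    variances = np.array([
        np.var(eeg[i * win:(i + 1) * win]) for i in range(n_windows)
    ])
    return variances < var_threshold   # low variance -> suppression

if __name__ == "__main__":
    fs = 250                                        # sampling rate (Hz)
    rng = np.random.default_rng(0)
    # Synthetic trace: 2 s of high-amplitude "burst" activity followed by
    # 2 s of near-flat "suppression", repeated three times.
    burst = 30.0 * rng.standard_normal(2 * fs)        # ~30 µV noise-like burst
    suppression = 2.0 * rng.standard_normal(2 * fs)   # ~2 µV flat period
    eeg = np.concatenate([burst, suppression] * 3)
    labels = segment_burst_suppression(eeg, fs)
    bsr = labels.mean()       # burst suppression ratio: fraction of time suppressed
    print(f"suppression fraction (BSR): {bsr:.2f}")
```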
[20] where x_i is the brain's suppression state at time iΔ, Δ is the length of the analysis interval, and x_i ranges over all real numbers.
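The state x_i can range over all real numbers because it is not itself a probability. A common choice in state-space formulations of the burst suppression probability, shown here as an illustrative assumption rather than necessarily the exact expression of the cited model, maps the state to a probability through a logistic transform:

\[
p_i \;=\; \frac{e^{x_i}}{1 + e^{x_i}}, \qquad x_i \in \mathbb{R}, \quad 0 < p_i < 1,
\]

so that the estimated probability of suppression at time iΔ is always confined to the unit interval.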
[17] There is, however, evidence linking sedation-induced burst suppression with positive outcomes in patients recovering from coma after traumatic brain injury, suggesting a neuroprotective effect.