Hebbian theory attempts to explain synaptic plasticity, the adaptation of brain neurons during the learning process.
Hebb states it as follows: "Let us assume that the persistence or repetition of a reverberatory activity (or 'trace') tends to induce lasting cellular changes that add to its stability. ... When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
This aspect of causation in Hebb's work foreshadowed what is now known about spike-timing-dependent plasticity, which requires temporal precedence.
It also provides a biological basis for errorless learning methods in education and memory rehabilitation.
In the study of neural networks in cognitive function, it is often regarded as the neuronal basis of unsupervised learning.
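In its simplest rate-based form, the theory reduces to a correlational update: each synaptic weight grows in proportion to the product of presynaptic and postsynaptic activity. The following minimal sketch illustrates this basic rule (the linear response, the learning rate, and all variable names are illustrative assumptions, not part of Hebb's original formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

eta = 0.01                 # learning rate (illustrative value)
x = rng.random(5)          # presynaptic firing rates
w = rng.random(5) * 0.1    # initial synaptic weights

for _ in range(100):
    y = w @ x              # postsynaptic rate, assuming a linear response y = w.x
    w += eta * y * x       # Hebb's rule: co-active pre/post pairs are strengthened

print(w)                   # weights have grown along the presented input pattern
```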
Alan Allport posits additional ideas regarding cell assembly theory and its role in forming engrams, along the lines of the concept of auto-association, described as follows: "If the inputs to a system cause the same pattern of activity to occur repeatedly, the set of active elements constituting that pattern will become increasingly strongly inter-associated."[4]: 44
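Allport's auto-association idea can be illustrated with a simple outer-product (Hopfield-style) weight matrix. The sketch below is a minimal illustration, assuming binary ±1 activity patterns, one-step recall, and an arbitrary pattern size; it is not Allport's own formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# A repeatedly presented activity pattern over 8 elements, coded as +/-1.
pattern = rng.choice([-1, 1], size=8)

# Hebbian outer-product storage: co-active elements become inter-associated.
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0)          # no self-connections

# Cue the network with a corrupted copy of the pattern (2 elements flipped).
cue = pattern.copy()
cue[:2] *= -1

recalled = np.sign(W @ cue)     # one step of recall through the associations
print(np.array_equal(recalled, pattern))  # True: the stored pattern is completed
```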
Work in the laboratory of Eric Kandel has provided evidence for the involvement of Hebbian learning mechanisms at synapses in the marine gastropod Aplysia californica.
Much of the work on long-lasting synaptic changes between vertebrate neurons (such as long-term potentiation) involves the use of non-physiological experimental stimulation of brain cells.
However, some of the physiologically relevant synapse modification mechanisms that have been studied in vertebrate brains do seem to be examples of Hebbian processes.[6] Klopf's model reproduces a great many biological phenomena and is also simple to implement.
However, it can be shown that Hebbian plasticity does pick up the statistical properties of the input in a way that can be categorized as unsupervised learning.
As defined in the previous sections, Hebbian plasticity describes the evolution in time of the synaptic weight \(w\). Averaging over the input statistics and assuming a linear response \(y = w^{T}x\), the weight dynamics take the form \(\frac{dw}{dt} = \eta C w\), where \(\eta\) is the learning rate and \(C = \langle x x^{T} \rangle\) is the correlation matrix of the input. Since \(C\) is symmetric, it is also diagonalizable, and the solution can be found, by working in its eigenvector basis, to be of the form \(w(t) = \sum_{i} k_{i} e^{\eta \alpha_{i} t} c_{i}\), where the \(k_{i}\) are constants fixed by the initial conditions, the \(c_{i}\) are the eigenvectors of \(C\), and the \(\alpha_{i}\) are their corresponding (non-negative) eigenvalues, so the component along the eigenvector with the largest eigenvalue comes to dominate.
This is an intrinsic problem: this version of Hebb's rule is unstable, because in any network with a dominant signal the synaptic weights will increase or decrease exponentially.
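This divergence is easy to demonstrate numerically. In the sketch below (the input statistics, learning rate, and duration are arbitrary illustrative choices), a linear neuron trained with the plain Hebbian rule develops weights that grow without bound while aligning with the principal eigenvector of the input correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated 2-D inputs: the dominant eigenvector of C = <x x^T> is ~[1, 1].
cov = np.array([[1.0, 0.9],
                [0.9, 1.0]])
eta = 0.005
w = rng.standard_normal(2) * 0.01
w0_norm = np.linalg.norm(w)

for _ in range(2000):
    x = rng.multivariate_normal([0, 0], cov)
    y = w @ x                   # linear response
    w += eta * y * x            # plain Hebbian update: dw/dt = eta*C*w on average

eigvals, eigvecs = np.linalg.eigh(cov)
principal = eigvecs[:, np.argmax(eigvals)]
print(w0_norm, np.linalg.norm(w))                   # |w| has grown by orders of magnitude
print(abs(w / np.linalg.norm(w)), abs(principal))   # directions coincide up to sign
```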
One may think a solution is to limit the firing rate of the postsynaptic neuron by adding a non-linear, saturating response function, but in fact it can be shown that for any neuron model Hebb's rule is unstable; for this reason, network models of neurons usually employ other, stabilized learning rules such as BCM theory, Oja's rule, or the generalized Hebbian algorithm.
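For illustration, Oja's rule stabilizes the same setup by adding a decay term proportional to the squared output. In the sketch below (same illustrative input statistics as above), the weight norm converges to one and the weight vector settles along the first principal component instead of diverging:

```python
import numpy as np

rng = np.random.default_rng(3)

cov = np.array([[1.0, 0.9],
                [0.9, 1.0]])
eta = 0.01
w = rng.standard_normal(2) * 0.01

for _ in range(5000):
    x = rng.multivariate_normal([0, 0], cov)
    y = w @ x
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian term minus y^2 weight decay

eigvals, eigvecs = np.linalg.eigh(cov)
principal = eigvecs[:, np.argmax(eigvals)]
print(np.linalg.norm(w))        # ~1.0: the norm no longer diverges
print(abs(w @ principal))       # ~1.0: w aligns with the first principal component
```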
Because Hebbian strengthening requires information about postsynaptic activity to reach the presynaptic cell, a retrograde messenger has been proposed.[11] The compound most commonly identified as fulfilling this retrograde transmitter role is nitric oxide, which, due to its high solubility and diffusivity, often exerts effects on nearby neurons.[12] This type of diffuse synaptic modification, known as volume learning, is not included in the traditional Hebbian model.[13]

Hebbian learning and spike-timing-dependent plasticity have been used in an influential theory of how mirror neurons emerge.
When an individual performs an action, they also see, hear, and feel themselves performing it. These re-afferent sensory signals trigger activity in neurons responding to the sight, sound, and feel of the action. After repeated experience of this re-afference, the synapses connecting the sensory and motor representations of an action become so strong that the motor neurons start firing to the sound or the sight of the action, and a mirror neuron is created.
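A toy version of this account can be sketched in a few lines (the two-unit setup, rates, pairing count, and threshold are illustrative assumptions, not a model taken from the mirror-neuron literature): repeated co-activation during execution strengthens the sensory-to-motor connection until sensory input alone drives the motor unit:

```python
eta = 0.05        # learning rate (illustrative)
w_sm = 0.0        # sensory -> motor connection strength
threshold = 0.5   # motor firing threshold (illustrative)

# Execution phase: performing the action activates the motor unit (m = 1.0),
# and re-afference activates the matching sensory unit (s = 1.0).
for _ in range(20):
    s, m = 1.0, 1.0
    w_sm += eta * s * m        # Hebbian strengthening of the co-active pair

# Observation phase: the action is only seen or heard, so only s is active.
s = 1.0
print(w_sm * s > threshold)    # True: sensory input alone now drives the motor unit
```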
Evidence for this perspective comes from many experiments showing that motor programs can be triggered by novel auditory or visual stimuli after repeated pairing of the stimulus with the execution of the motor program (for a review of the evidence, see Giudice et al., 2009[18]).[19] Consistent with the fact that spike-timing-dependent plasticity occurs only if the presynaptic neuron's firing predicts the postsynaptic neuron's firing,[20] the link between sensory stimuli and motor programs also seems to be potentiated only if the stimulus is contingent on the motor program.
Spike-timing-dependent plasticity (STDP), for example, refines Hebbian principles by incorporating the precise timing of neuronal spikes: whether a synapse is strengthened or weakened depends on whether the presynaptic spike precedes or follows the postsynaptic spike.
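This timing dependence can be written as a simple pair-based update with exponentially decaying windows; the time constants and amplitudes below are generic textbook-style values, not parameters from a specific study:

```python
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    Pre before post (dt > 0) -> potentiation; otherwise -> depression.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * np.exp(-dt / tau)    # causal pairing: potentiate
    else:
        return -a_minus * np.exp(dt / tau)   # anti-causal pairing: depress

print(stdp_dw(t_pre=10.0, t_post=15.0))  # > 0: pre predicts post
print(stdp_dw(t_pre=15.0, t_post=10.0))  # < 0: post fired first
```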
Experimental work has also linked Hebbian learning to complex behaviors, such as decision-making and emotional regulation.
Current studies in artificial intelligence continue to leverage Hebbian concepts for developing adaptive algorithms and improving machine learning models.[21][22] A growing area of interest is the application of Hebbian learning in quantum computing: while classical neural networks remain the primary area of application for Hebbian theory, recent studies have begun exploring how Hebbian principles could inform quantum-inspired algorithms and more efficient quantum machine learning models.
Studies suggest that inhibitory neurons can provide critical regulation for maintaining stability in neural circuits and might prevent runaway positive feedback in Hebbian learning.
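One way to picture this regulation is a deliberately simplified sketch in which feedback inhibition grows with the postsynaptic output and so pushes the Hebbian update toward a fixed point; the quadratic inhibition term and all parameter values here are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

eta, g_inh = 0.01, 1.0   # learning rate and inhibitory gain (illustrative)
w = 0.1

for _ in range(20000):
    x = rng.random()                    # presynaptic rate in [0, 1)
    drive = w * x                       # excitatory drive to the postsynaptic cell
    y = drive - g_inh * drive**2        # net drive after activity-dependent inhibition
    w += eta * x * y                    # Hebbian update; negative y weakens the synapse

print(w)  # settles near w = 4/3 for these statistics instead of growing without bound
```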
It is hypothesized that Hebbian plasticity in such circuits may underpin behaviors like habit formation, reinforcement learning, and even the development of social bonds.