[1][2] In the most general definition, a GL network consists of a countable number of elements (idealized neurons) that interact by sporadic nearly-instantaneous discrete events (spikes or firings).
In this model, the probability that a given neuron fires in a time step may depend on the complete firing history of the network since the last time that neuron itself fired. Thus each neuron "forgets" all previous spikes, including its own, whenever it fires.
In specific versions of the GL model, the past network spike history since the last firing of a neuron N may be summarized by an internal variable, the potential of that neuron, that is a weighted sum of those spikes.
The potential may include the spikes of only a finite subset of other neurons, thus modeling arbitrary synapse topologies.
For simplicity, it is assumed that these sampling times extend to infinity in both directions, implying that the network has existed since forever.
In the GL model, all neurons are assumed to evolve synchronously and atomically between successive sampling times.
Let X[t':t] denote the matrix whose rows are the histories of all neuron firings from time t' to time t, that is

  X[t':t] = ( (X_i[s])_{i∈I} )_{s = t', ..., t}

and let τ_i(t) denote the time of the last firing of neuron i strictly before time t. Then the general GL model says that

  Prob( X_i[t] = 1 | X[-∞ : t-1] ) = Φ_i( X[τ_i(t) : t-1] )

for some firing function Φ_i of neuron i. Moreover, the firings in the same time step are conditionally independent, given the past network history, with the above probabilities: for any finite set K of neurons and any firing bits a_i ∈ {0, 1},

  Prob( X_i[t] = a_i for all i ∈ K | X[-∞ : t-1] ) = ∏_{i∈K} Prob( X_i[t] = a_i | X[-∞ : t-1] ).
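One synchronous step of this general dynamics can be sketched in code. The sketch below is a minimal illustration, not part of the model's definition: the firing function `phi` and the representation of the history as a list of firing-bit rows are arbitrary implementation choices.

```python
import random

def gl_step(history, last_fire, phi, t):
    """One synchronous step of a general GL network.

    history[s] is the list of firing bits (X_i[s]) for past times s < t;
    last_fire[i] is the last time neuron i fired before t (tau_i(t));
    phi(i, window) gives the firing probability of neuron i, computed
    from the network history since that neuron's own last firing.
    """
    n = len(last_fire)
    new_state = []
    for i in range(n):
        # Neuron i sees only the history since its last firing.
        window = history[last_fire[i]:t]
        p = phi(i, window)
        # Firings at time t are conditionally independent given the past.
        new_state.append(1 if random.random() < p else 0)
    return new_state
```

Because each neuron's probability is evaluated from the frozen history and the firing bits are drawn independently, the conditional-independence property above holds by construction.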
In a common special case of the GL model, the part of the past firing history between τ_i(t) and t-1 is summarized by a real-valued internal state variable or potential V_i(t) (analogous to the membrane potential of a biological neuron), which is a weighted sum of those firings:

  V_i(t) = Σ_{s = τ_i(t)}^{t-1} ( E_i[s] + Σ_{j∈I} w_{j,i} X_j[s] ) α_i(t-1-s)

and the firing probability depends on the history only through this potential:

  Prob( X_i[t] = 1 | X[-∞ : t-1] ) = Φ_i( V_i(t) ).

Here E_i[s], the external input, represents some additional contribution to the potential that may arrive between times s and s+1 from sources other than the firings of other neurons; w_{j,i} is the weight of the synapse from neuron j to neuron i; and α_i is a history weight function that modulates the contributions of firings that happened t-1-s whole steps before the current time.
The synaptic weights w_{j,i} may be negative, so that a firing of neuron j lowers the potential of neuron i. This is the way inhibitory synapses are approximated in the GL model.
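The potential of one neuron in this special case can be computed directly from the definition. In the sketch below, the particular weights, inputs, and history weight function passed in are illustrative choices; the model itself leaves them free.

```python
def potential(i, history, ext, w, alpha, tau_i, t):
    """Potential V_i(t) of the special-case GL model: a weighted sum
    of external inputs and synaptic firings since neuron i's last
    firing at time tau_i.

    history[s][j] is the firing bit X_j[s]; ext[s] is the external
    input E_i[s]; w[j][i] is the synaptic weight from j to i;
    alpha(g) weights a contribution from g steps before time t-1.
    """
    v = 0.0
    for s in range(tau_i, t):
        contrib = ext[s] + sum(w[j][i] * history[s][j]
                               for j in range(len(w)))
        v += contrib * alpha(t - 1 - s)
    return v
```

An inhibitory synapse is obtained simply by making the corresponding entry of `w` negative.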
In the leaky integrate-and-fire variant of the GL model, the potential V_i(t) is defined to be a decaying weighted sum of the firings of other neurons. Namely, when neuron i fires, its potential is reset to zero; until its next firing, each spike of a neuron j adds the constant amount w_{j,i} to the potential of i, and the external input E_i[t] is added as it arrives. Apart from those contributions, during each time step, the potential decays by a fixed recharge factor μ_i towards zero. In this variant, the evolution of the potential can be expressed by a recurrence formula

  V_i(t+1) = E_i[t] + Σ_{j∈I} w_{j,i} X_j[t]                 if X_i[t] = 1,
  V_i(t+1) = μ_i V_i(t) + E_i[t] + Σ_{j∈I} w_{j,i} X_j[t]    if X_i[t] = 0.

Or, more compactly,

  V_i(t+1) = (1 - X_i[t]) μ_i V_i(t) + E_i[t] + Σ_{j∈I} w_{j,i} X_j[t].

This special case results from taking the history weight factor α_i(g) of the general model to be μ_i^g.
Since the sum over j includes the term j = i, a firing of neuron i itself contributes the self-weight w_{i,i} to its potential at the next step. This self-weight therefore represents the reset potential that the neuron assumes just after firing, apart from other contributions.
Separating the contribution of the self-weight, the recurrence can also be written

  V_i(t+1) = w_{i,i} X_i[t] + (1 - X_i[t]) μ_i V_i(t) + E_i[t] + Σ_{j∈I, j≠i} w_{j,i} X_j[t].

These formulas imply that the potential decays towards zero with time, when there are no external or synaptic inputs and the neuron itself does not fire.
Under the same conditions, the membrane potential of a biological neuron instead tends to some nonzero resting level. However, this apparent discrepancy exists only because it is customary in neurobiology to measure electric potentials relative to that of the extracellular medium; the discrepancy disappears if one takes the neuron's resting potential as the reference instead.
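The recurrence of the leaky integrate-and-fire variant can be exercised directly. The following sketch simulates a small network; the logistic firing function and the example weights are made-up choices for illustration only, not part of the model.

```python
import math
import random

def simulate(w, mu, ext, phi, steps, seed=0):
    """Simulate leaky integrate-and-fire GL neurons using
    V_i(t+1) = (1 - X_i[t]) mu_i V_i(t) + E_i[t] + sum_j w[j][i] X_j[t].
    ext(i, t) supplies the external input E_i[t]; phi maps a
    potential to a firing probability."""
    rng = random.Random(seed)
    n = len(mu)
    v = [0.0] * n
    trace = []
    for t in range(steps):
        # Draw the (conditionally independent) firings of this step.
        x = [1 if rng.random() < phi(v[i]) else 0 for i in range(n)]
        trace.append(x)
        # Apply the recurrence: reset or decay, plus new inputs.
        v = [(1 - x[i]) * mu[i] * v[i]
             + ext(i, t)
             + sum(w[j][i] * x[j] for j in range(n))
             for i in range(n)]
    return trace, v

# Example: two mutually exciting neurons, logistic firing probability.
w = [[0.0, 0.8], [0.8, 0.0]]
mu = [0.9, 0.9]
phi = lambda v: 1.0 / (1.0 + math.exp(-(v - 1.0)))
trace, v = simulate(w, mu, lambda i, t: 0.3, phi, steps=50)
```

With phi identically zero no neuron ever fires, and each potential converges geometrically to E / (1 - μ), exhibiting the pure decay-and-recharge behavior described above.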
Some authors use a slightly different refractory variant of the integrate-and-fire GL neuron,[3] which ignores all external and synaptic inputs (except possibly the self-synapse w_{i,i}) during the time step in which the neuron fires. The equation for this variant is

  V_i(t+1) = w_{i,i}                                        if X_i[t] = 1,
  V_i(t+1) = μ_i V_i(t) + E_i[t] + Σ_{j≠i} w_{j,i} X_j[t]   if X_i[t] = 0;

or, more compactly,

  V_i(t+1) = X_i[t] w_{i,i} + (1 - X_i[t]) ( μ_i V_i(t) + E_i[t] + Σ_{j≠i} w_{j,i} X_j[t] ).

Even more specific sub-variants of the integrate-and-fire GL neuron are obtained by setting the recharge factor μ_i to zero. Then the potential V_i(t+1) reduces to the sum of the inputs received during the previous time step, and the firing probability of the neuron depends only on those inputs.
That is, the neuron does not have any internal state, and is essentially a (stochastic) function block.
In these sub-variants, while the individual neurons do not store any information from one step to the next, the network as a whole still can have persistent memory because of the implicit one-step delay between the synaptic inputs and the resulting firing of the neuron.
The state of the network then consists of one firing bit X_i[t] for each neuron, which can be assumed to be stored in its axon in the form of a traveling depolarization zone.
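The memoryless sub-variant (μ_i = 0) needs no stored potential at all, only the previous step's firing bits. The sketch below illustrates this; the threshold firing rule and the two-neuron weights are hypothetical choices, used here to show how the network as a whole can still carry state.

```python
import random

def memoryless_step(x_prev, w, ext, phi, rng):
    """mu_i = 0 sub-variant: the firing probability of neuron i
    depends only on the inputs received in the previous step, so
    the neuron is a stochastic function block with no potential."""
    n = len(x_prev)
    return [1 if rng.random() < phi(ext[i] + sum(w[j][i] * x_prev[j]
                                                 for j in range(n)))
            else 0
            for i in range(n)]
```

For example, with two neurons coupled by excitatory weights and a deterministic threshold rule, starting from the state [1, 0] the network alternates between [0, 1] and [1, 0] forever: each individual neuron is memoryless, yet the one-step synaptic delay lets the network retain information.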
The GL model was defined in 2013 by mathematicians Antonio Galves and Eva Löcherbach.
[1] Its inspirations included Frank Spitzer's interacting particle system and Jorma Rissanen's notion of stochastic chain with memory of variable length.
Another influence was Bruno Cessac's work on a leaky integrate-and-fire model.[4] Galves and Löcherbach referred to the process that Cessac described as "a version in a finite dimension" of their own probabilistic model.
[5] The Galves–Löcherbach model distinguishes itself because it is inherently stochastic, incorporating probabilistic measures directly in the calculation of spikes.
It is also relatively easy to apply from a computational standpoint, offering a good balance between cost and efficiency.
Later contributions to the model include the hydrodynamic limit of the interacting neuronal system,[6] the long-range behavior and the prediction and classification of behaviors as a function of the model parameters,[7][8] and the generalization of the model to continuous time.