Predictive coding is a member of a wider set of theories that follow the Bayesian brain hypothesis.
Theoretical ancestors to predictive coding date back as early as 1860 with Helmholtz's concept of unconscious inference.[1] Unconscious inference refers to the idea that the human brain fills in visual information to make sense of a scene.
McClelland and Rumelhart's parallel distributed processing model describes perception as the meeting of top-down (conceptual) and bottom-up (sensory) elements.
In the late 1990s, the idea of top-down and bottom-up processing was translated into a computational model of vision by Rao and Ballard.
The model assumes that the brain maintains active internal representations of distal causes, which enable it to predict its sensory inputs.[7] For instance, the noise in the visual signal varies between dawn and dusk, such that greater conditional confidence is assigned to sensory prediction errors in broad daylight than at nightfall.
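The core of this generative-model idea can be shown in a minimal sketch (this is an illustration of the principle, not Rao and Ballard's actual hierarchical model; the weight matrix `U` and the hidden cause are invented): an internal estimate `r` of the distal cause is adjusted until the generative mapping `U @ r` reproduces the sensory input, i.e. until the prediction error is driven toward zero.

```python
import numpy as np

# Illustrative generative weights and hidden cause (invented values).
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # assumed mapping from causes to sensations
hidden_cause = np.array([2.0, -1.0])
sensory_input = U @ hidden_cause    # the input this hidden cause would produce

r = np.zeros(2)                     # internal representation of the cause
for _ in range(200):
    error = sensory_input - U @ r   # prediction error
    r += 0.1 * U.T @ error          # descend the squared-error gradient

# The internal state now predicts the input and recovers the hidden cause.
assert np.allclose(U @ r, sensory_input, atol=1e-6)
assert np.allclose(r, hidden_cause, atol=1e-6)
```

Minimizing prediction error by gradient descent is one simple way to implement the inference step; the weights `U` themselves could be learned by a similar error-driven rule.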
It has also been proposed that such weighting of prediction errors in proportion to their estimated precision is, in essence, attention,[9] and that the process of devoting attention may be neurobiologically accomplished by the ascending reticular activating system (ARAS) optimizing the “gain” of prediction error units.[11] Much of the early work that applied a predictive coding framework to neural mechanisms came from sensory processing, particularly in the visual cortex.
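The precision-weighting idea can be illustrated with a toy update rule (the function name and all numeric values are invented for illustration): a prediction error is scaled by its estimated precision, the inverse of the signal's variance, so a reliable daylight signal moves the belief more than the same signal at noisy nightfall.

```python
# Toy illustration of precision-weighted belief updating (invented values).
def precision_weighted_update(belief, observation, precision):
    """Move the belief toward the observation in proportion to precision."""
    prediction_error = observation - belief
    return belief + precision * prediction_error

prior_belief = 0.0
daylight = precision_weighted_update(prior_belief, 1.0, precision=0.9)
nightfall = precision_weighted_update(prior_belief, 1.0, precision=0.2)

# The high-precision (low-noise) error drives the larger belief update,
# which is the sense in which precision weighting resembles attention.
assert daylight > nightfall
```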
In 2013, Anil Seth proposed that our subjective feeling states, otherwise known as emotions, are generated by predictive models that are actively built out of causal interoceptive appraisals.[19] In relation to how we attribute internal states of others to causes, Sasha Ondobaka, James Kilner, and Karl Friston (2015) proposed that the free energy principle requires the brain to produce a continuous series of predictions with the goal of reducing the amount of prediction error that manifests as “free energy”.[22][23] In a predictive coding model, Barrett hypothesizes that, in interoception, our brains regulate our bodies by activating "embodied simulations" (full-bodied representations of sensory experience) to anticipate what the external world will throw at us sensorially and how we will respond to it with action.
Then, in a trial-error-adjust process, our bodies find similarities in goals among certain successful anticipatory simulations and group them together under conceptual categories.
If a simulation's anticipated sensory input does not match what actually arrives, the prediction, the simulation, and perhaps the boundaries of the conceptual category are revised in the hopes of higher accuracy next time, and the process continues.
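This trial-error-adjust loop can be sketched as a toy program (the categories, prototype values, and revision rule are all invented for illustration): the simulation closest to the incoming experience is selected, and if the match is poor, the category is nudged toward the experience and widened.

```python
# Toy sketch of a trial-error-adjust categorization loop (invented values).
categories = {"calm": 0.2, "alarm": 0.9}   # hypothetical category prototypes
tolerance = {"calm": 0.15, "alarm": 0.15}  # how far a match may stray

def categorize(experience):
    # Pick the prototype (simulation) with the smallest prediction error.
    best = min(categories, key=lambda c: abs(categories[c] - experience))
    error = abs(categories[best] - experience)
    if error > tolerance[best]:
        # Revise: move the prototype toward the experience and widen
        # the conceptual category's boundary for next time.
        categories[best] += 0.5 * (experience - categories[best])
        tolerance[best] += 0.05
    return best

# An experience close to the "calm" prototype matches without revision.
assert categorize(0.25) == "calm"
```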
In this sense, Barrett proposes that we construct our emotions because the conceptual category framework our brains use to compare new experiences, and to pick the appropriate predictive sensory simulation to activate, is built on the go.[28] Future research could focus on clarifying the neurophysiological mechanism and computational model of predictive coding.