For a less topic-specific treatment of the computational link between neural activity and visual perception and behavior, see the textbook "Understanding Vision: Theory, Models, and Data", published by Oxford University Press in 2014.[2]
A recent study[3] using event-related potentials (ERPs) linked increased neural activity in the occipito-temporal region of the brain to the visual categorization of facial expressions.
Scientists[3] used classification image techniques[6] to determine which parts of a complex visual stimulus (such as a face) participants rely on when asked to assign it to a category, such as an emotion.
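The core logic of classification images (reverse correlation) can be sketched briefly. The following is a toy simulation, not the procedure from the cited study: a simulated observer bases its yes/no response on a single, arbitrarily chosen "diagnostic" pixel, and averaging the noise from "yes" trials against the noise from "no" trials recovers that pixel as the region the observer relied on.

```python
import random

def classification_image(n_trials=2000, n_pixels=16, seed=0):
    """Toy reverse-correlation sketch. The simulated observer answers
    'yes' whenever the noise at a hypothetical diagnostic pixel
    (index 3) is positive; subtracting the mean 'no' noise from the
    mean 'yes' noise reveals which pixel drove the decisions."""
    rng = random.Random(seed)
    yes_sum = [0.0] * n_pixels
    no_sum = [0.0] * n_pixels
    yes_n = no_n = 0
    for _ in range(n_trials):
        noise = [rng.gauss(0, 1) for _ in range(n_pixels)]
        if noise[3] > 0:  # observer relies only on pixel 3
            yes_sum = [a + b for a, b in zip(yes_sum, noise)]
            yes_n += 1
        else:
            no_sum = [a + b for a, b in zip(no_sum, noise)]
            no_n += 1
    # Classification image: mean yes-noise minus mean no-noise.
    return [ys / yes_n - ns / no_n for ys, ns in zip(yes_sum, no_sum)]

ci = classification_image()
print(max(range(len(ci)), key=lambda i: abs(ci[i])))  # → 3
```

In a real experiment the "observer" is the participant, the stimuli are faces embedded in noise, and the resulting image highlights facial regions (eyes, mouth) that drive categorization.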
The N170 peaked slightly earlier for the fear stimuli, at about 175 milliseconds, meaning that participants took less time to recognize the facial expression.
However, when processing a happy expression, where the mouth is crucial to categorization, downward integration must take place, and the N170 peak therefore occurred later, at around 185 milliseconds.
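The latency comparison above amounts to finding the time of the most negative deflection within a search window, since the N170 is a negative-going component. A minimal sketch, using synthetic waveforms (Gaussian bumps placed at 175 ms and 185 ms purely for illustration, not real ERP data):

```python
import math

def peak_latency(times_ms, waveform, window=(140, 220)):
    """Return the latency (ms) of the most negative deflection
    within the search window."""
    candidates = [(v, t) for t, v in zip(times_ms, waveform)
                  if window[0] <= t <= window[1]]
    return min(candidates)[1]

# Hypothetical waveforms: negative Gaussian bumps standing in for
# the N170 evoked by fear and happy expressions.
times = list(range(0, 400))
fear = [-math.exp(-((t - 175) ** 2) / (2 * 15 ** 2)) for t in times]
happy = [-math.exp(-((t - 185) ** 2) / (2 * 15 ** 2)) for t in times]
print(peak_latency(times, fear), peak_latency(times, happy))  # → 175 185
```

Real analyses average many trials per condition before measuring the peak, but the windowed-minimum step is the same.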
This gives a more complete view of how the world is constantly perceived visually, and may provide insight into the link between perception and consciousness.
A cell whose receptive field falls on the gray square surrounded by white receives more lateral inhibition from its brightly stimulated neighbors; it therefore fires less often, and the square appears darker.
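Lateral inhibition of this kind can be modeled very simply: each cell's output is its own input minus a fraction of its neighbors' inputs. In this one-dimensional sketch (the 0.2 inhibition weight is an arbitrary illustrative value), a gray cell bordered by the bright white surround ends up firing less than a gray cell bordered by other gray cells:

```python
def lateral_inhibition(stimulus, inhibition=0.2):
    """Each cell's firing = its own input minus a fraction of its
    two neighbours' inputs (a simple lateral-inhibition model)."""
    out = []
    n = len(stimulus)
    for i, x in enumerate(stimulus):
        left = stimulus[i - 1] if i > 0 else x
        right = stimulus[i + 1] if i < n - 1 else x
        out.append(x - inhibition * (left + right))
    return out

# White surround (input 1.0) with a gray patch (input 0.5) inside.
stim = [1.0] * 4 + [0.5] * 4 + [1.0] * 4
resp = lateral_inhibition(stim)
# Gray cell at the white border fires less than an interior gray cell.
print(round(resp[4], 2), round(resp[5], 2))  # → 0.2 0.3
```

The suppressed firing at the border is the model's account of why the same gray looks darker against a bright surround.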
Insight into this process allows clinical psychologists to gain a greater understanding of what may be causing visual disorders in their patients.