A single-cell recording experiment in monkeys demonstrated that mouth mirror neurons show different levels of activation when a monkey observes mouth movements, depending on the context (ingestive actions such as sucking juice vs. communicative actions such as lip-smacking or tongue protrusion).
The shared neural representation of a motor behavior and its observation has been extended into the domains of feelings and emotions.
In an fMRI study, the same brain regions were activated when people imitated and when they observed emotional facial expressions of happiness, sadness, anger, surprise, disgust, and fear.[12] These results suggest that understanding another's feelings and emotions is driven not by cognitive deduction of what a stimulus means but by automatic activation of somatosensory neurons.
A recent study of pupil size directly demonstrated that emotion perception is an automatic process modulated by mirror systems.
Based on findings from neuroimaging studies, de Vignemont and Singer proposed empathy as a crucial factor in human communication: "Empathy might enable us to make faster and more accurate predictions of other people's needs and actions and discover salient aspects of our environment."[3] In an fMRI study, a mirror system has been proposed as a common neural substrate that mediates the experience of basic emotions.[15] Participants watched video clips of happy, sad, angry, and disgusted facial expressions, and the researchers measured their empathy quotient (EQ).
However, a number of other studies using magnetoencephalography and functional MRI have since demonstrated that empathy for pain does involve the somatosensory cortex, which supports the simulation theory.
There is an impressive history of research suggesting that empathy, when activated, causes people to act in ways that benefit the other, such as volunteering to receive electric shocks in the other's place.