Simulation theory of empathy

2009-12-16 12:23:01

From Wikipedia, the free encyclopedia

The simulation theory is not primarily a theory of empathy, but rather a theory of how we understand others: that we do so by way of a kind of empathetic response. The theory holds that humans anticipate and make sense of the behavior of others by activating mental processes that, if carried into action, would produce similar behavior. This includes intentional behavior as well as the expression of emotions.

Origin

The simulation theory has its roots in philosophy of mind, especially the work of Alvin Goldman and Robert Gordon. The later discovery of mirror neurons in macaque monkeys provided a physiological mechanism for common coding between perception and action (see Wolfgang Prinz) [1] and prompted the hypothesis of a similar mirror neuron system in the human brain.[2][3] Since the discovery of the mirror neuron system, many studies have examined the role of this system in action understanding, emotion, and other social functions.

Development

Mirror neurons are activated both when actions are executed and when they are observed. This unique property of mirror neurons may explain how people recognize and understand the states of others: by mirroring observed actions in the brain as if the observer were performing them [4].

Two sets of evidence suggest that mirror neurons in the monkey have a role in action understanding. First, the activation of mirror neurons requires a biological effector such as a hand or mouth; mirror neurons do not respond to actions performed with tools such as pliers. [5] Nor do they respond to the sight of an object alone or to an action without an object (an intransitive action). Umilta and colleagues [6] demonstrated that a subset of mirror neurons fired even when the final, critical part of the action was not visible to the observer. The experimenter showed his hand moving toward a cube and grasping it, and then showed the same action with the grasp hidden (the cube was placed behind an occluder). Mirror neurons fired in both the visible and the hidden conditions. In contrast, they did not discharge when the observer knew that there was no cube behind the occluder.

Second, the responses of mirror neurons to the same action differ depending on the context of the action. A single-cell recording experiment with monkeys demonstrated different levels of activation of mouth mirror neurons when the monkey observed mouth movements, depending on context (ingestive actions such as sucking juice vs. communicative actions such as lip-smacking or tongue protrusion). [7] An fMRI study likewise showed that the mirror system responds differently to the action of grasping a cup depending on context (grasping to drink coffee vs. grasping to clean the table on which the cup was placed). [8]

Emotion understanding

The idea of a shared neural representation for a motor behavior and its observation has been extended into the domain of feelings and emotions. Not only movements but also facial expressions activate the same brain regions that are activated by direct experience. In an fMRI study, the brain regions involved in action representation were activated both when people imitated and when they observed emotional facial expressions of happiness, sadness, anger, surprise, disgust, and fear. [9] Observing video clips displaying facial expressions of disgust activated the neural networks typical of the direct experience of disgust. [10]

Similar results have been found for touch. Watching videos in which someone's legs or face were touched activated the somatosensory cortex that responds to being touched directly. [11] A similar mirror system exists for the perception of pain: when people see others in pain, they respond not only affectively [12] but also sensorially. [13]

These results suggest that understanding others' feelings and emotions is driven not by cognitive deduction about what a stimulus means but by automatic activation of somatosensory neurons. A study of pupil size directly demonstrated that emotion perception is an automatic process modulated by mirror systems. [14] When people viewed sad faces, the pupil size of those faces influenced how viewers perceived and judged the emotional state, without any explicit awareness of the differences in pupil size. When the pupils were enlarged to 180% of their original size, viewers perceived a sad face as less negative and less intense than when the pupils were at or below their original size. This effect was correlated with activity in a brain region implicated in emotion processing, the amygdala. Furthermore, viewers' own pupils mimicked the size of the pupils in the sad faces they watched. Because pupil size is beyond voluntary control, this change in pupil size during emotion judgment is a good indication that understanding emotions is an automatic process. However, the study did not find that other emotional faces, such as happy or angry ones, influenced pupil size in the way sad faces did.
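
As a rough illustration of the kind of comparison described above, the following Python sketch contrasts hypothetical negativity ratings for a sad face shown with pupils enlarged to 180% versus pupils at or below their original size. The sample size, rating scale, and condition means are invented for illustration and are not taken from the study [14].

```python
# Hypothetical sketch of a within-subject comparison between two pupil-size
# conditions. All numbers and variable names are made up for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_viewers = 20

# Simulated negativity ratings (1 = not negative at all, 7 = very negative).
ratings_large_pupil = rng.normal(loc=4.2, scale=0.8, size=n_viewers)  # pupils at 180%
ratings_small_pupil = rng.normal(loc=5.0, scale=0.8, size=n_viewers)  # pupils <= 100%

# Paired comparison: the same viewers rate both versions of the face.
t_stat, p_value = stats.ttest_rel(ratings_large_pupil, ratings_small_pupil)
print(f"mean rating, 180% pupils:   {ratings_large_pupil.mean():.2f}")
print(f"mean rating, <=100% pupils: {ratings_small_pupil.mean():.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```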

Epistemological role of empathy

Understanding others' actions and emotions is believed to facilitate efficient human communication. Based on findings from neuroimaging studies, de Vignemont and Singer [15] proposed that empathy is a crucial factor in human communication, arguing for its epistemological role: empathy might enable us to make faster and more accurate predictions of other people's needs and actions and to discover salient aspects of our environment. Mental mirroring of actions and emotions may enable humans to quickly understand others' actions and the environment they relate to, and thus help humans communicate efficiently [16].

In an fMRI study, a mirror system was proposed as a common neural substrate mediating the experience of basic emotions [17]. Participants watched video clips of happy, sad, angry, and disgusted facial expressions, and their Empathy Quotient (EQ) was measured. Activation in brain regions specific to each of the four emotions correlated with EQ, while activation in the mirror system (the left dorsal inferior frontal gyrus/premotor cortex) correlated with EQ across all emotions. The authors interpreted this result as evidence that action representation mediates the link between face perception and emotion perception.
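
The correlation analysis described above can be illustrated with a minimal Python sketch: hypothetical EQ scores are correlated with simulated activation in a mirror-system region, both within each emotion condition and across all emotions. The sample size, effect size, and all values are invented and do not come from the study [17].

```python
# Hypothetical sketch of correlating Empathy Quotient (EQ) scores with fMRI
# activation in a mirror-system region. All values are invented for illustration.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_participants = 24
eq_scores = rng.normal(loc=42, scale=10, size=n_participants)

emotions = ["happy", "sad", "angry", "disgust"]
# Simulated premotor/IFG activation per emotion, loosely tied to EQ plus noise.
activation = {
    emo: 0.02 * eq_scores + rng.normal(scale=0.3, size=n_participants)
    for emo in emotions
}

# Correlation of EQ with activation within each emotion condition.
for emo in emotions:
    r, p = pearsonr(eq_scores, activation[emo])
    print(f"{emo:8s}: r = {r:.2f}, p = {p:.3f}")

# Correlation of EQ with mean activation across all emotions.
mean_activation = np.mean([activation[emo] for emo in emotions], axis=0)
r_all, p_all = pearsonr(eq_scores, mean_activation)
print(f"all emotions: r = {r_all:.2f}, p = {p_all:.3f}")
```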

Empathy for pain

A paper published in Science (Singer et al., 2005)[18] challenges the idea that pain sensations and mirror neurons play a role in empathy for pain. Specifically, the authors found that activity in the anterior insula and the anterior cingulate cortex, two regions known to be responsible for the affective experience of pain, was present both when participants themselves received a painful stimulus and when another person did, but the rest of the pain matrix, responsible for sensation, was not active. Furthermore, participants merely saw the hand of the other person with the electrode on it, making it unlikely that 'mirroring' could have caused the empathic response. However, a number of other studies using magnetoencephalography and functional MRI have since demonstrated that empathy for pain does involve the somatosensory cortex, which supports the simulation theory.[19][20][21][22]

Support for the anterior insula and anterior cingulate cortex being neural substrates of empathy includes Wicker et al. (2003), who report that their "core finding is that the anterior insula is activated both during observation of disgusted facial expressions and during the emotion of disgust evoked by unpleasant odorants" [23] (p. 655).

Furthermore, one study demonstrated that "for actions, emotions, and sensations both animate and inanimate touch activates our inner representation of touch." The authors note, however, that "it is important at this point to clarify the fact that we do not believe that the activation we observe evolved in order to empathize with other objects or human beings" [24] (p. 343).