
The generation of emotions, including pleasure, sadness, anxiety, and anger, enables individuals to respond to physical and psychological upsets. Emotional expression is the behavioral representation of an emotional state and can occur with or without self-awareness [1]. Emotional expression is often measured by perceivers' ratings of communication accuracy or the self-clinical monitoring scale [1]. Emotionality has traditionally been conceptualized as a tendency to shift from a positive or neutral state to a negative state [2,3].
Most affective neuroscience studies use pictures from the International Affective Picture System or standardized facial expressions to elicit emotional experiences [4]. To enrich the emotional content of such stimuli, several methods have been developed, including sentences or paragraphs representing emotions and pictorial vignettes depicting emotional content [5].
Emotional experience and expression usually share the Papez circuit and autonomic body functions [6,7]. The Papez circuit, a well-known loop from the hippocampal formation to the anterior cingulate cortex, is involved in both emotional experience and expression [7]. The brain regions involved in emotional experience may include the amygdala, anterior cingulate and insular cortices, fusiform gyrus, and prefrontal cortex [6]. However, emotional experiences may be attenuated by processing within higher brain regions [8-10]. Thus, reasoning, rationalizing, and labeling human experiences could attenuate the original emotional expression [8,10]. Hariri et al. [9] suggested that the neocortex might attenuate emotional experiences by interpreting and labeling them, and this labeling effect is associated with the prefrontal cortex. Ma et al. [11] reported that attention systems, including the prefrontal cortex, can mediate emotional regulation in response to stimulation with emotional faces.
Lindquist et al. [12] suggested that the experience of a pleasant or an unpleasant feeling, as well as the representation of objects as 'positive' or 'negative', is associated with hedonic valence-related brain regions. Several brain imaging studies have suggested that the attention network may be involved in the control of affective networks in both healthy individuals and patient groups [13,14]. Kragel et al. [13] reported that the cortico-limbic network could be a candidate neural network for negative valence-specific processing. In a go-no-go task, children with attention deficit hyperactivity disorder showed hypoactivation within the anterior and middle cingulate cortices, extending into the supplementary motor area, and hyperactivation within the left temporal gyrus [14,15].
We hypothesized that emotional experience is associated with brain activity within the neocortex and that its modification within the neocortex is associated with brain activity within the attention system.
The Institutional Review Board of Gachon University Gil Medical Center approved the study protocol (IRB no. GBIRB2020-275). All participants provided written informed consent after receiving a full explanation of the study procedures before they underwent magnetic resonance imaging (MRI) scans.
Thirty-one healthy adult participants were recruited through advertisement. The inclusion criteria were as follows: (i) age between 19 and 50 years; (ii) absence of past or current Axis I diagnosis according to the Diagnostic and Statistical Manual of Mental Disorders, 5th edition (DSM-5) [16], which was established using the Structured Clinical Interview for DSM-5 [17]; (iii) absence of a history of substance abuse or dependence; and (iv) absence of a history of medical or neurological disorders. Participants were excluded if they had first-degree relatives with a history of DSM-5 disorders or any medical conditions that might interfere with MRI scans, such as pacemakers or metal implants.
Thirty-one participants underwent brain MRI scans, and 27 completed the self-rating questionnaire. The demographic information of the participants and their scores on the self-rating questionnaire are presented in Table 1.
To assess the expression of happiness, the Korean version of the Subjective Happiness Scale, which consists of four items rated on a 7-point Likert scale, was used [18]. Total scores range from 4 to 28, with higher scores indicating greater happiness. The scale shows good internal consistency (Cronbach's α = 0.86−0.89) [18,19].
To assess the expression of anxiety, the State-Trait Anxiety Inventory (STAI), which consists of 40 items (20 state anxiety and 20 trait anxiety items) rated on a 4-point frequency scale, was used [20,21]. Scores on each subscale range from 20 to 80, with higher scores indicating greater anxiety. The scale shows good internal consistency (Cronbach's α = 0.90−0.92).
To assess the expression of sadness, the Beck Depression Inventory, which consists of 21 items rated on a 4-point frequency scale, was used [22]. Total scores range from 0 to 63, with higher scores indicating greater sadness. The scale shows good internal consistency (Cronbach's α = 0.85−0.94) [23].
To assess the expression of anger, the Korean version of the Buss-Perry Aggression Questionnaire, which consists of 27 items rated on a 5-point scale, was used [24]. Total scores range from 27 to 135, with higher scores indicating greater anger. The scale shows good internal consistency (Cronbach's α = 0.87−0.88) [25].
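All four questionnaires are summed Likert scales whose reliability is summarized with Cronbach's α. As a minimal sketch, assuming item-level responses are available as a subjects × items array (the data below are simulated for illustration, not study data), the total score and α can be computed as follows:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, n_items) item-score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Simulated example: 27 respondents on the 4-item Subjective Happiness Scale
# (7-point Likert); real data would come from the administered questionnaires.
rng = np.random.default_rng(0)
shs_items = rng.integers(1, 8, size=(27, 4)).astype(float)
total_scores = shs_items.sum(axis=1)  # possible range: 4 to 28
print(f"alpha = {cronbach_alpha(shs_items):.2f}")
```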
Stimuli for the emotional experience task were presented in a block design composed of alternating 10-second stimulus and blank periods. All visual and audio stimuli were programmed using E-Prime software (Psychology Software Tools). To elicit an emotional experience, each stimulus block presented a still image of a facial expression together with a corresponding emotional voice for 10 seconds. Still images and voice recordings were sampled from the Ryerson Audio-Visual Database of Emotional Speech and Song open dataset [26]. Each voice clip was five seconds long and was repeated twice. So that emotion was conveyed by voice tone alone, the spoken text was semantically neutral. The blank block presented a cross flickering once per second in the middle of the screen. Each "emotion-blank" pair was randomly assigned one of five emotions: neutral, anger, anxiety, depression, and happiness. A total of 20 "emotion-blank" blocks were presented over 410 seconds (Fig. 1A).
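As an illustration of this block design, the following sketch builds a randomized onset schedule under the stated timing (10-s stimulus, 10-s blank, 20 pairs). The blocks alone sum to 400 seconds; the reported 410-second run is assumed to include additional lead-in time not modeled here, and all function names are hypothetical:

```python
import random

EMOTIONS = ["neutral", "anger", "anxiety", "depression", "happiness"]
STIM_DUR = BLANK_DUR = 10.0  # seconds per block, as described above
N_PAIRS = 20

def build_schedule(seed: int = 0):
    """Return (onset, duration, condition) events for the emotion task:
    each pair is a 10-s face-plus-voice block followed by a 10-s blank block."""
    rng = random.Random(seed)
    events, t = [], 0.0
    for _ in range(N_PAIRS):
        emotion = rng.choice(EMOTIONS)  # emotion drawn at random for each pair
        events.append((t, STIM_DUR, emotion))
        t += STIM_DUR
        events.append((t, BLANK_DUR, "blank"))  # flickering fixation cross
        t += BLANK_DUR
    return events

for onset, dur, cond in build_schedule()[:4]:
    print(f"{onset:6.1f} s  {dur:4.1f} s  {cond}")
```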
The go-no-go task was presented in a block design composed of dot, control, cue, and task blocks. The dot block included a cross and a dot, which flickered alternately with a 750-ms duration for nine seconds. The control block included a cross and an overlapping "O" and "X", which flickered alternately with a 750-ms duration for nine seconds. The cue block included a white cross and a red or blue cross, each of which appeared once for a 1-second duration. The color of the cross determined which target the subject should respond to. The task block was composed of an "X" or "O" that randomly appeared on the left or right side of the screen; each target was shown for 750 ms over nine seconds. Depending on the color of the cue, the subjects pressed the button for different letters: when the blue cross was shown in the cue block, the subjects were instructed to press the button as soon as an "O" appeared; when the red cross appeared, they pressed the button as soon as an "X" appeared. Each set of blocks lasted 30 seconds, and ten sets were presented over 310 seconds (Fig. 1B).
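The cue-target contingency can be summarized in code. The sketch below, with hypothetical function names, prints the expected response for each 750-ms target in a single 9-second task block (12 targets) under a blue cue:

```python
import random

def is_go(cue_color: str, letter: str) -> bool:
    """Cue-target contingency: blue cue -> respond to 'O'; red cue -> respond to 'X'."""
    return letter == ("O" if cue_color == "blue" else "X")

def run_task_block(cue_color: str, seed: int = 0):
    """One 9-s task block: 12 targets at 750 ms each, shown left or right."""
    rng = random.Random(seed)
    for i in range(12):
        letter = rng.choice(["O", "X"])
        side = rng.choice(["left", "right"])
        action = "press" if is_go(cue_color, letter) else "withhold"
        print(f"{i * 0.75:5.2f} s  {letter} ({side}): {action}")

run_task_block("blue")
```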
MRI was performed using a 3-Tesla MRI scanner (MAGNETOM Vida, Siemens Healthcare GmbH) with a dedicated 20-channel head/neck coil (BioMatrix head/neck 20 TCS; Siemens Healthcare GmbH).
For each functional task, MRI data were acquired using a simultaneous multi-slice (SMS) accelerated echo-planar imaging sequence with the following parameters: repetition time (TR) = 2,500 ms, echo time (TE) = 30 ms, flip angle (FA) = 77°, pixel size = 2.3 × 2.3 mm², thickness = 2.3 mm, matrix size = 104 × 104, number of slices = 62, acceleration mode = SMS, SMS factor = 2, and partial Fourier = 7/8 (along the phase-encoding direction). A total of 164 volumes for emotional experience stimulation were collected with a total scan time of 7 minutes, and 124 volumes for the go-no-go task were collected with a total scan time of 5 minutes and 20 seconds. Structural brain images were acquired using a 3-dimensional T1-weighted magnetization-prepared rapid gradient echo sequence with the following parameters: TR = 1,800 ms, TE = 2.61 ms, inversion time = 900 ms, FA = 10°, voxel size = 0.5 × 0.5 × 1.0 mm³, matrix size = 512 × 416, number of slices = 176, acceleration mode = generalized auto-calibrating partially parallel acquisitions, and acceleration factor = 2. All participants were instructed to relax and stay awake in the scanner, and head movements were minimized with foam pads.
MRI data preprocessing was performed using the functional connectivity toolbox (CONN 18b, www.nitrc.org/projects/conn) based on statistical parametric mapping (SPM12 v.7771, http://www.fil.ion.ucl.ac.uk/spm/software/spm12/) functions. To remove artifacts, functional images were realigned, unwarped, and slice-timing corrected. The functional and structural images were then segmented and normalized. Functional images were smoothed with a 6-mm full width at half maximum Gaussian kernel. After preprocessing, functional images were separated according to task timing and block design.
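The pipeline itself ran in CONN/SPM12 under MATLAB. Purely as an illustration of the final smoothing step, an equivalent operation in Python with nilearn (file names hypothetical) would be:

```python
from nilearn import image

# Hypothetical file name; the study's actual pipeline ran in CONN/SPM12.
func = image.load_img("sub01_task_bold.nii.gz")

# Spatial smoothing with a 6-mm full-width-at-half-maximum Gaussian kernel,
# matching the kernel size reported for the preprocessing pipeline.
func_smoothed = image.smooth_img(func, fwhm=6)
func_smoothed.to_filename("sub01_task_bold_smoothed.nii.gz")
```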
Multiple linear models in SPM12 were used to estimate the effect of brain activity within attention circuits on the connectivity between emotional expression and emotional experience, after adjusting for participants' sex and age. All tests were two-sided, and differences were considered statistically significant at p < 0.05. All statistical analyses were performed using the general linear model in Statistica software 7.0 (TIBCO Software Inc.).
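A minimal sketch of such a covariate-adjusted linear model, assuming a hypothetical per-subject table of connectivity and attention estimates (using statsmodels rather than the SPM12/Statistica software actually employed), is:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-subject table: 'coupling' is the experience-expression
# connectivity estimate, 'attention' is go-no-go activity in attention circuits.
df = pd.read_csv("subject_measures.csv")  # columns: coupling, attention, age, sex

# Linear model testing the attention effect while adjusting for sex and age;
# tests are two-sided with significance declared at p < 0.05.
fit = smf.ols("coupling ~ attention + age + C(sex)", data=df).fit()
print(fit.summary())
```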
Brain activity was higher in response to each emotional experience than in response to the neutral experience in the following regions: happiness, the left superior frontal gyrus and right medial frontal gyrus; sadness, the right temporal lobe and left inferior frontal gyrus; anxiety, the left temporal lobe; and anger, the left medial frontal gyrus. Furthermore, the scores corresponding to each emotion were correlated with brain activity within the following regions (Fig. 2): Subjective Happiness Scale scores, the right and left transverse temporal gyri; Beck Depression Inventory scores, the left superior temporal gyrus; STAI scores, the left limbic lobe, uncus, and left inferior temporal gyrus; and Korean version of the Aggression Questionnaire scores, the left insular cortex.
For each emotion, brain activity in the following region pairs was positively correlated (Fig. 3): happiness, the right medial frontal gyrus with the right transverse temporal gyrus (r = 0.44, p = 0.01); anxiety, the left temporal lobe with the left limbic lobe and uncus (r = 0.39, p = 0.02); sadness, the left inferior frontal gyrus with the left superior temporal gyrus (r = 0.81, p < 0.01); and anger, the left medial frontal gyrus with the left insular cortex (r = 0.75, p < 0.01).
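These region-pair correlations are standard two-sided Pearson tests. A minimal sketch with simulated data (27 subjects, matching the questionnaire sample; the variable names merely stand in for per-subject activity estimates from one region pair) is:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
# Simulated per-subject activity (e.g., beta estimates), standing in for
# the left inferior frontal and left superior temporal gyri in the sadness pair.
ifg = rng.normal(size=27)
stg = 0.8 * ifg + rng.normal(scale=0.5, size=27)

r, p = pearsonr(ifg, stg)  # two-sided Pearson correlation, as in Fig. 3
print(f"r = {r:.2f}, p = {p:.3f}")
```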
In response to the go-no-go task, the right anterior cingulate gyrus was activated (Talairach coordinates x, y, z = 10, 28, 26; 178 voxels; T = 4.11).
The following associations were observed for each emotion (Table 2). For sadness, brain activity within the left inferior frontal gyrus in response to emotional experience was associated with brain activity within the right posterior cingulate gyrus in response to the go-no-go task and within the left superior temporal gyrus in response to emotional expression. For anxiety, brain activity within the left temporal lobe in response to emotional experience was associated with brain activity within the right posterior cingulate cortex in response to the go-no-go task and within the left limbic lobe and uncus in response to emotional expression. Finally, for happiness and anger, there was no association among brain activity in response to emotional expression, emotional experience, and the go-no-go task.
The current results showed that emotional experience was associated with brain activity within the frontotemporal cortices, while emotional expression was associated with brain activity within the temporal and insular cortices. In addition, the association of brain activity between emotional experiences and expressions of sadness and anxiety was affected by brain activity within the anterior cingulate gyrus in response to the go-no-go task.
In the current study, emotional expressions of happiness, sadness, anxiety, and anger were associated with brain activity within the temporal cortex, including the transverse temporal gyrus, limbic uncus, superior temporal gyrus, and insular gyrus. These findings are consistent with several previous reports. The temporal and insular lobes are crucial parts of the limbic system (Papez circuit) [6], which is involved in emotional expression [7].
Interestingly, in the current study, emotional experience was associated with brain activity within the frontotemporal cortex. Frontotemporal regions are thought to be associated with the processing of facial and social emotions [27]. In particular, the frontal lobe is known to attenuate original emotional expression by reasoning, rationalizing, and labeling human experiences [8-10].
In the current study, only the experience of anxiety was associated with brain activity within the temporal lobe, while other emotional experiences were associated with brain activity within the frontal cortex [28]. Several studies have suggested that altered functions of the temporal lobe are associated with anxiety [29,30]. Servan-Schreiber et al. [29] reported that selective pharmacological activation of the temporal lobe could evoke serious anxiety symptoms, such as panic. Davies et al. [30] reported that an altered mediotemporal response to anxiety processing is associated with a high risk of clinical psychosis.
In the current study, the experience and expression of sadness and anxiety were affected by brain activity within the attention system. The relationship between negative emotions (anxiety and anger) and the attention system is associated with increased emotional lability [31]. Emotional lability is thought to be related to hyperconnectivity of the amygdala network, which includes the anterior cingulate cortex [32]. Moreover, top-down cognitive processes are thought to influence emotional processing. For example, the prefrontal cortex was more activated during cognitive control tasks with emotional stimuli than during the same tasks with non-emotional stimuli alone [33]. A hyperactivated prefrontal cortex can reduce the reactivity of the amygdala to emotional stimuli [34]. Attentionally demanding tasks performed during emotional stimulation can reduce emotional responses in patients with anxiety disorders [35].
Taken together, the current results suggest that the attention system could affect emotional attenuation during emotional experiences of sadness and anxiety.
The current study has several limitations. First, the number of subjects was too small for the results to be generalizable to the general population with respect to all emotions. Indeed, the interaction between the attention and emotional systems was not statistically significant for happiness and anger. Second, the tasks used to elicit emotional experiences were devised for this study and have not been formally validated. Future studies should verify these tasks in a larger sample.
In conclusion, emotional expression may be associated with brain activity within the temporal cortex, whereas emotional experience may be associated with brain activity within the frontotemporal cortices. In addition, the attention system can interfere with the connection between emotional expressions and experiences.
This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. NRF-2020R1A4A1019623).
No potential conflict of interest relevant to this article was reported.
Conceptualization: Na Rae Won, Sun Mi Kim, Jong-Hoon Kim. Data acquisition: Sujin Bae, Young-Don Son, Jeong Hee Kim, Jong-Hoon Kim. Formal analysis: Young-Don Son, Jong-Hoon Kim. Writing−original draft: Na Rae Won, Sun Mi Kim, Doug Hyun Han. Writing−review & editing: Sujin Bae, Doug Hyun Han.