1. Neural Tracking of Race-Related Information During Face Perception.
Chenyu PANG ; Na ZHOU ; Yiwen DENG ; Yue PU ; Shihui HAN
Neuroscience Bulletin 2025;41(11):1957-1976
Previous studies have identified two group-level processes, neural representations of interracial between-group difference and intraracial within-group similarity, that contribute to the racial categorization of faces. What remains unclear is how the brain tracks race-related information that varies across different faces as an individual-level neural process involved in race perception. In three studies, we recorded functional MRI signals when Chinese adults performed different tasks on morphed faces in which proportions of pixels contributing to perceived racial identity (Asian vs White) and expression (pain vs neutral) varied independently. We found that, during a pain expression judgment task, tracking other-race and same-race-related information in perceived faces recruited the ventral occipitotemporal cortices and medial prefrontal/anterior temporal cortices, respectively. However, neural tracking of race-related information tended to be weakened during explicit race judgments on perceived faces. During a donation task, the medial prefrontal activity also tracked race-related information that distinguished between two perceived faces for altruistic decision-making and encoded the Euclidean distance between the two faces that predicted decision-making speeds. Our findings revealed task-dependent neural mechanisms underlying the tracking of race-related information during face perception and altruistic decision-making.
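A minimal sketch of the face-pair distance mentioned above, under the assumption that each morphed face can be described by its morph proportions (e.g., the fraction of pixels drawn from the White vs. Asian prototype and from the pain vs. neutral prototype); this is an illustrative reading of the abstract, not the authors' code, and the example values are hypothetical.

# Sketch: Euclidean distance between two faces in a morph-proportion space.
import numpy as np

def face_pair_distance(face_a, face_b):
    """Euclidean distance between two faces described by morph proportions."""
    return float(np.linalg.norm(np.asarray(face_a) - np.asarray(face_b)))

# Hypothetical example: face A is 30% White-morphed and 60% pain-morphed,
# face B is 80% White-morphed and 20% pain-morphed.
print(face_pair_distance([0.30, 0.60], [0.80, 0.20]))  # ~0.64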
Adult
;
Female
;
Humans
;
Male
;
Young Adult
;
Brain/diagnostic imaging*
;
Brain Mapping
;
Decision Making/physiology*
;
Facial Recognition/physiology*
;
Judgment/physiology*
;
Magnetic Resonance Imaging
;
Photic Stimulation
;
Racial Groups
;
Social Perception
;
East Asian People
2. Dissecting Social Working Memory: Neural and Behavioral Evidence for Externally and Internally Oriented Components.
Hanxi PAN ; Zefeng CHEN ; Nan XU ; Bolong WANG ; Yuzheng HU ; Hui ZHOU ; Anat PERRY ; Xiang-Zhen KONG ; Mowei SHEN ; Zaifeng GAO
Neuroscience Bulletin 2025;41(11):2049-2062
Social working memory (SWM), the ability to maintain and manipulate social information in the brain, plays a crucial role in social interactions. However, research on SWM is still in its infancy, and SWM is often treated as a unitary construct. In the present study, we propose that SWM can be conceptualized as having two relatively independent components: "externally oriented SWM" (e-SWM) and "internally oriented SWM" (i-SWM). To test this external-internal hypothesis, participants were tasked with memorizing and ranking either facial expressions (e-SWM) or personality traits (i-SWM) associated with images of faces. We then examined the neural correlates of these two SWM components and their functional roles in empathy. The results showed distinct activations: the e-SWM task activated the postcentral and precentral gyri, whereas the i-SWM task activated the precuneus/posterior cingulate cortex and superior frontal gyrus. Distinct multivariate activation patterns were also found within the dorsal medial prefrontal cortex in the two tasks. Moreover, partial least squares analyses combining brain activation with individual differences in empathy showed that e-SWM and i-SWM brain activities were mainly correlated with affective empathy and cognitive empathy, respectively. These findings implicate distinct brain processes as well as functional roles for the two types of SWM, providing support for the external-internal hypothesis of SWM.
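A minimal sketch of a brain-behavior partial least squares analysis in the spirit described above, relating per-participant activation features to empathy scores; the array shapes, variable names, and random data are illustrative assumptions, not the authors' pipeline.

# Sketch: PLS relating brain activation features to empathy measures.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_subjects, n_features = 40, 500
brain = rng.normal(size=(n_subjects, n_features))   # e.g. task activation per voxel/ROI
empathy = rng.normal(size=(n_subjects, 2))          # e.g. affective and cognitive empathy scores

pls = PLSRegression(n_components=1)
pls.fit(brain, empathy)
brain_scores, empathy_scores = pls.transform(brain, empathy)

# Correlation of the first latent pair indicates how strongly the activation
# pattern covaries with the empathy measures.
r = np.corrcoef(brain_scores[:, 0], empathy_scores[:, 0])[0, 1]
print(f"latent correlation: {r:.2f}")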
Humans
;
Memory, Short-Term/physiology*
;
Male
;
Female
;
Empathy/physiology*
;
Young Adult
;
Magnetic Resonance Imaging
;
Adult
;
Brain/diagnostic imaging*
;
Brain Mapping
;
Facial Expression
;
Social Behavior
;
Facial Recognition/physiology*
;
Social Perception
;
Personality/physiology*
3. Perception of Smile Aesthetics and Attractiveness among Saudi Females.
Nozha Sawan ; Mamata Hebbal ; Abeer Alshami ; Afnan Ben Gassem ; Yara Alromaih ; Eman Alsagob
Archives of Orofacial Sciences 2022;17(1):113-122
ABSTRACT
Smile aesthetics, defined as the static and dynamic relationship of the dentition and supporting structures to the facial soft tissues, is one of the most important elements of facial attractiveness. The objective of the study was to assess the perception of smile aesthetics and attractiveness through digital image manipulation of aesthetic variables and to compare those perceptions according to diverse sociodemographic data among female Saudi laypeople attending the dental clinic. In this cross-sectional study, 193 female Saudi participants were randomly selected and consented to answer the study questionnaire. Nine smile photograph images were created to compare perceptions of different smile aesthetics. Two groups were recruited: 120 participants in the first group (under 30 years old) and 73 in the second group (30 years old or above). All participants in both groups were asked to rate the attractiveness of each smile image using multiple-choice options. A statistically significant finding showed that normal buccal corridors were chosen as the most attractive smile by 42.5% of the participants in the younger group and by a significantly higher proportion (49%) of the participants with a bachelor's degree or higher level of education (p < 0.05). Laypeople's preferences regarding smile attractiveness vary, but a normal appearance was the ideal choice for the majority. Orthodontic treatment should consider the general sociocultural understanding of smile perception.
Esthetics, Dental--psychology
;
Facial Recognition
;
Saudi Arabia
4. Interaction Between Conscious and Unconscious Information-Processing of Faces and Words.
Shiwen REN ; Hanyu SHAO ; Sheng HE
Neuroscience Bulletin 2021;37(11):1583-1594
It is widely acknowledged that holistic processing is a key characteristic of face perception. Although holistic processing implies the automatic integration of face parts, it is unclear whether such processing requires awareness of the face parts. Here, we investigated the interactions between visible face parts and face parts rendered invisible using continuous flash suppression (CFS). In the first experiment, with the upper half-face visible and the lower half-face invisible, perceived face identity was influenced by the invisible lower half-face, suggesting that integration occurs between visible and invisible face parts, a variant of the "composite face effect". In the second experiment, we investigated the influence of visible face parts on the processing of invisible face parts, as measured by the time it took for the invisible parts to break through CFS. The results showed a visible-to-invisible facilitation effect: invisible face parts broke through CFS faster when they were aligned with the visible parts than when they were misaligned. Visible eyes had a stronger influence on the invisible nose/mouth than the other way around. Such facilitation of processing from visible to invisible parts was also found when Chinese characters were used as stimuli. These results show that information integration occurs across the consciousness boundary.
Awareness
;
Consciousness
;
Eye
;
Face
;
Facial Recognition
;
Photic Stimulation
5. Discriminative Effects of Social Skills Training on Facial Emotion Recognition among Children with Attention-Deficit/Hyperactivity Disorder and Autism Spectrum Disorder.
Ji Seon LEE ; Na Ri KANG ; Hui Jeong KIM ; Young Sook KWAK
Journal of the Korean Academy of Child and Adolescent Psychiatry 2018;29(4):150-160
OBJECTIVES: This study investigated the effect of social skills training (SST) on facial emotion recognition and discrimination in children with attention-deficit/hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). METHODS: Twenty-three children aged 7 to 10 years participated in our SST, including 15 children diagnosed with ADHD and 8 with ASD. The participants' parents completed the Korean version of the Child Behavior Checklist (K-CBCL), the ADHD Rating Scale, and the Conners' Scale at baseline and post-treatment. The participants completed the Korean Wechsler Intelligence Scale for Children-IV (K-WISC-IV) and the Advanced Test of Attention at baseline, and the Penn Emotion Recognition and Discrimination Task at baseline and post-treatment. RESULTS: No significant changes in facial emotion recognition and discrimination occurred in either group from before to after SST. However, when controlling for the processing speed of the K-WISC-IV and the social subscale of the K-CBCL, the ADHD group showed more improvement than the ASD group in total (p=0.049), female (p=0.039), sad (p=0.002), mild (p=0.015), female extreme (p=0.005), male mild (p=0.038), and Caucasian (p=0.004) facial expressions. CONCLUSION: SST improved facial expression recognition more effectively for children with ADHD than for children with ASD, who need additional training in emotion recognition and discrimination.
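One plausible way to implement the covariate-adjusted group comparison described above is an ANCOVA-style linear model; the abstract does not specify the exact model, and the column names and values below are hypothetical, so this is only an illustrative sketch.

# Sketch: group comparison adjusted for covariates (ANCOVA-style linear model).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "improvement":      [4, 2, 5, 1, 3, 0, 2, 1],                         # change in recognition accuracy
    "group":            ["ADHD", "ADHD", "ADHD", "ASD", "ADHD", "ASD", "ASD", "ASD"],
    "processing_speed": [98, 105, 110, 92, 101, 88, 95, 90],
    "social_subscale":  [55, 60, 58, 70, 62, 68, 72, 66],
})

model = smf.ols("improvement ~ C(group) + processing_speed + social_subscale", data=df).fit()
print(model.summary())  # the C(group) coefficient tests the group difference adjusted for covariates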
Autism Spectrum Disorder*
;
Autistic Disorder*
;
Checklist
;
Child Behavior
;
Child*
;
Discrimination (Psychology)
;
Facial Expression
;
Facial Recognition
;
Female
;
Humans
;
Intelligence
;
Male
;
Parents
;
Social Skills*
6. Characteristics of facial expression recognition ability in patients with Lewy body disease.
Yuriko KOJIMA ; Tomohiro KUMAGAI ; Tomoo HIDAKA ; Takeyasu KAKAMU ; Shota ENDO ; Yayoi MORI ; Tadashi TSUKAMOTO ; Takashi SAKAMOTO ; Miho MURATA ; Takehito HAYAKAWA ; Tetsuhito FUKUSHIMA
Environmental Health and Preventive Medicine 2018;23(1):32-32
BACKGROUND:
The facial expressions of medical staff are known to greatly affect the psychological state of patients, making them feel uneasy or, conversely, cheering them up. By clarifying the characteristics of facial expression recognition ability in patients with Lewy body disease, this study aimed to identify ways to facilitate smooth communication between caregivers and patients whose cognitive function has deteriorated.
METHODS:
From March 2016 to July 2017, we examined recognition of the six facial expressions of "happiness," "sadness," "fear," "anger," "surprise," and "disgust" in 107 people aged 60 years or more, both outpatients and inpatients, whom hospital specialists had diagnosed with the Lewy body diseases Parkinson's disease, Parkinson's disease with dementia, or dementia with Lewy bodies. Based on the facial expression recognition test results, we classified the patients by cluster analysis and clarified the features of each type.
RESULTS:
In patients with Lewy body disease, recognition of happiness remained unaffected by aging, age of onset, disease duration, cognitive function, and apathy, whereas recognizing the facial expression of fear was difficult. In addition, aging, cognitive decline, and apathy reduced recognition of sadness and anger; cognitive decline in particular reduced recognition of all facial expressions except happiness. Cluster analysis classified the test accuracy rates into three types: "stable type," "mixed type," and "reduced type." In the "reduced type," overall facial expression recognition ability declined except for happiness, and in the "mixed type," recognition of anger was particularly impaired.
CONCLUSION:
There were several facial expressions that patients with Lewy body disease could not accurately identify. Caregivers are therefore encouraged to compensate in such situations with language, physical contact, or other means to convey the intended feeling to patients of each type.
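A minimal sketch of the kind of cluster analysis reported in the RESULTS above, grouping patients by their per-expression recognition accuracy; the abstract states only that cluster analysis yielded three types, so the choice of k-means, the feature order, and the data are illustrative assumptions.

# Sketch: clustering per-expression accuracy rates into three patient types.
import numpy as np
from sklearn.cluster import KMeans

# Rows: patients; columns: accuracy for happiness, sadness, fear, anger, surprise, disgust.
accuracy = np.array([
    [0.95, 0.80, 0.40, 0.75, 0.85, 0.60],
    [0.90, 0.50, 0.20, 0.30, 0.60, 0.40],
    [0.92, 0.70, 0.35, 0.20, 0.70, 0.55],
    [0.88, 0.45, 0.15, 0.25, 0.50, 0.35],
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(accuracy)
print(labels)  # cluster assignment for each patient, e.g. stable / mixed / reduced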
Aged
;
Aged, 80 and over
;
Cluster Analysis
;
Cognition
;
physiology
;
Emotions
;
Facial Expression
;
Facial Recognition
;
physiology
;
Female
;
Humans
;
Lewy Body Disease
;
physiopathology
;
psychology
;
Male
;
Middle Aged
7. Anodal Transcranial Direct-Current Stimulation Over the Right Dorsolateral Prefrontal Cortex Influences Emotional Face Perception.
Li-Chuan YANG ; Ping REN ; Yuan-Ye MA
Neuroscience Bulletin 2018;34(5):842-848
The dorsolateral prefrontal cortex (DLPFC) is considered to play a crucial role in many high-level functions, such as cognitive control and emotional regulation. Many studies have reported that the DLPFC can be activated during the processing of emotional information in tasks requiring working memory. However, it is still not clear whether modulating the activity of the DLPFC influences emotional perception in a detection task. In the present study, using transcranial direct-current stimulation (tDCS), we investigated (1) whether modulating the right DLPFC influences emotional face processing in a detection task, and (2) whether the DLPFC plays an equal role in processing positive and negative emotional faces. The results showed that anodal tDCS over the right DLPFC specifically facilitated the perception of positive faces but did not influence the processing of negative faces. In addition, anodal tDCS over the right primary visual cortex enhanced performance in the detection task regardless of emotional valence. Our findings suggest, for the first time, that modulating the right DLPFC influences emotional face perception, especially for faces showing positive emotion.
Adult
;
Emotions
;
Facial Recognition
;
physiology
;
Female
;
Humans
;
Male
;
Neuropsychological Tests
;
Prefrontal Cortex
;
physiology
;
Social Perception
;
Transcranial Direct Current Stimulation
;
Young Adult
8. Facial Expression Enhances Emotion Perception Compared to Vocal Prosody: Behavioral and fMRI Studies.
Heming ZHANG ; Xuhai CHEN ; Shengdong CHEN ; Yansong LI ; Changming CHEN ; Quanshan LONG ; Jiajin YUAN
Neuroscience Bulletin 2018;34(5):801-815
Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, little is currently known about how emotion perception and its neural substrates differ between facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expressions (angry, happy, or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primes and bimodal materials as targets) in which participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region-of-interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on the perceived emotion of bimodal targets, irrespective of target valence. These findings suggest that facial expressions are associated with enhanced emotion perception compared to equivalent vocal prosodies.
Adult
;
Brain Mapping
;
methods
;
Cerebral Cortex
;
diagnostic imaging
;
physiology
;
Emotions
;
physiology
;
Facial Expression
;
Facial Recognition
;
physiology
;
Female
;
Humans
;
Magnetic Resonance Imaging
;
Psychomotor Performance
;
physiology
;
Social Perception
;
Speech Perception
;
physiology
;
Young Adult
9. Faces of the Face.
Archives of Plastic Surgery 2017;44(3):251-256
No abstract available.
Face
;
Pattern Recognition, Visual
;
Human
;
Sociology, Medical
;
Facial Expression
;
Facial Muscles
;
Animals
;
Biological Evolution
;
Species Specificity
10. Studies of visual mismatch negativity elicited by cartoon facial expressions.
Shumei JI ; Wei LI ; Peng LIU ; Zhijie BIAN
Journal of Biomedical Engineering 2013;30(3):476-480
A modified "cross-modal delayed response" paradigm was used to investigate whether the visual mismatch negativity can be elicited by cartoon facial expressions, and to define the mechanism underlying automatic processing of facial expressions. Subjects taking part in the tests were instructed to discriminate the type of the tones they heard as quickly and accurately as possible, and to act merely when they heard the response imperative signal. Neutral, happy and angry faces were presented during intervals between a tone and a response imperative signal. Visual mismatch negativity (VMMN) was obtained by subtracting the event - related potential (ERP) elicited by neutral faces from that elicited by happy faces or angry faces. The angry-related VMMN was more negative than happy-related VMMN, and both were more negative in the left than in the right cerebral hemisphere. The results indicated that VMMN can be elicited by the cartoon facial expressions, and the facial expressions can be processed automatically.
Adult
;
Brain
;
physiology
;
Cartoons as Topic
;
Evoked Potentials, Visual
;
physiology
;
Facial Expression
;
Female
;
Humans
;
Male
;
Pattern Recognition, Visual
;
physiology
;
Photic Stimulation
;
Visual Perception
;
physiology
;
Young Adult

