1. Research on bimodal emotion recognition algorithm based on multi-branch bidirectional multi-scale time perception.
Peiyun XUE ; Sibin WANG ; Jing BAI ; Yan QIANG
Journal of Biomedical Engineering 2025;42(3):528-536
Emotion reflects human psychological and physiological health, and the main channels of human emotional expression are voice and facial expression. How to extract and effectively fuse these two modalities of emotion information is one of the main challenges in emotion recognition. In this paper, a multi-branch bidirectional multi-scale time perception model is proposed, which can detect forward and reverse speech Mel-frequency spectrum coefficients in the time dimension. The model also uses causal convolution to obtain temporal correlation information between features at different scales, and assigns attention maps to them according to this information, thereby achieving multi-scale fusion of speech emotion features. In addition, this paper proposes a bimodal feature dynamic fusion algorithm, which draws on the strengths of AlexNet and uses overlapping max-pooling layers to obtain richer fusion features from the concatenated matrices of the two modal features. Experimental results show that the accuracy of the proposed multi-branch bidirectional multi-scale time perception bimodal emotion recognition model reaches 97.67% and 90.14% on two public audio-visual emotion datasets, respectively, outperforming other common methods. This indicates that the proposed model can effectively capture emotional feature information and improve the accuracy of emotion recognition.
Humans
;
Emotions
;
Algorithms
;
Facial Expression
;
Time Perception
;
Neural Networks, Computer
;
Speech
2. Application of multi-scale spatiotemporal networks in physiological signal and facial action unit measurement.
Journal of Biomedical Engineering 2025;42(3):552-559
Multi-task learning (MTL) has demonstrated significant advantages in the field of physiological signal measurement. This approach enhances the model's generalization ability by sharing parameters and features between similar tasks, even in data-scarce environments. However, traditional multi-task physiological signal measurement methods face challenges such as feature conflicts between tasks, task imbalance, and excessive model complexity, which limit their application in complex environments. To address these issues, this paper proposes an enhanced multi-scale spatiotemporal network (EMSTN) based on Eulerian video magnification (EVM), super-resolution reconstruction and convolutional multilayer perceptron. First, EVM is introduced in the input stage of the network to amplify subtle color and motion changes in the video, significantly improving the model's ability to capture pulse and respiratory signals. Additionally, a super-resolution reconstruction module is integrated into the network to enhance the image resolution, thereby improving detail capture and increasing the accuracy of facial action unit (AU) tasks. Then, convolutional multilayer perceptron is employed to replace traditional 2D convolutions, improving feature extraction efficiency and flexibility, which significantly boosts the performance of heart rate and respiratory rate measurements. Finally, comprehensive experiments on the Binghamton-Pittsburgh 4D Spontaneous Facial Expression Database (BP4D+) fully validate the effectiveness and superiority of the proposed method in multi-task physiological signal measurement.
Humans
;
Neural Networks, Computer
;
Signal Processing, Computer-Assisted
;
Face/physiology*
;
Video Recording
;
Facial Expression
;
Heart Rate
;
Algorithms
3. Three-dimensional morphological analysis of posed smile.
Yujia XIAO ; Bochun MAO ; Yanheng ZHOU
Journal of Peking University(Health Sciences) 2025;57(5):989-995
OBJECTIVE:
To investigate the changes and symmetry of facial soft tissue during the posed smile, to analyze the features of the posed smile in different genders, and to verify the reproducibility of the posed smile.
METHODS:
Three-dimensional (3D) facial images of 41 adults (16 males and 25 females, mean age 26.76±2.70 years) were taken with a FaceScan three-dimensional sensor, including one rest-position image and two posed-smile images for each subject. These images were imported into 3D soft tissue analysis software for model repositioning, and the 3D morphable model method (3DMM) was used for automatic landmark setting. Measurements of the eyes, cheeks, nose, and perioral area were then carried out for 3D soft tissue analysis. Finally, the changes and symmetry of the soft tissues between the two expression states and the gender differences during the posed smiles were compared, and the reproducibility of the posed smile was statistically tested.
RESULTS:
Compared with the rest position, except for the nasolabial angle (1.45°±7.65°), the 3D soft tissue measurements in all other regions changed during the posed smile (P < 0.001); notably, the eye region also changed significantly (P < 0.001). The prominent features of the posed smile were that the alar base length became longer, the upper and lower vermilions became narrower and thinner, the mentolabial furrow became shallower, and the chin extended anteriorly while the mouth retracted. During the posed smile, the labial fissure asymmetry [2.78 (1.73, 3.49) mm], mid-infraorbital asymmetry [2.36 (1.22, 3.27) mm], and outercanthal asymmetry [2.31 (1.29, 2.80) mm] were most apparent. Compared with the rest position, asymmetry was not significantly increased during the posed smile except at the cheilion and alar curvature points (P>0.05). In the posed smile, the changes in the right palpebral fissure height and the thickness of the lower vermilion (|Li-Stoi| z) were greater in males than in females (P < 0.05), and the asymmetry of the exocanthion and cheekbone increased more in males than in females (P < 0.05). There was no obvious difference between the two posed smiles.
CONCLUSION
In this study, the soft tissues of the eyes, cheeks, nose, lips, and chin changed to different degrees during the posed smile, and the asymmetry of the cheilion and alar curvature point was greater than in the rest position. In addition, the reproducibility of the posed smile was excellent, which can serve as a reference for clinical aesthetics and functional research on the smile.
Humans
;
Smiling/physiology*
;
Female
;
Male
;
Adult
;
Imaging, Three-Dimensional/methods*
;
Face/anatomy & histology*
;
Young Adult
;
Facial Expression
4. Dissecting Social Working Memory: Neural and Behavioral Evidence for Externally and Internally Oriented Components.
Hanxi PAN ; Zefeng CHEN ; Nan XU ; Bolong WANG ; Yuzheng HU ; Hui ZHOU ; Anat PERRY ; Xiang-Zhen KONG ; Mowei SHEN ; Zaifeng GAO
Neuroscience Bulletin 2025;41(11):2049-2062
Social working memory (SWM)-the ability to maintain and manipulate social information in the brain-plays a crucial role in social interactions. However, research on SWM is still in its infancy and is often treated as a unitary construct. In the present study, we propose that SWM can be conceptualized as having two relatively independent components: "externally oriented SWM" (e-SWM) and "internally oriented SWM" (i-SWM). To test this external-internal hypothesis, participants were tasked with memorizing and ranking either facial expressions (e-SWM) or personality traits (i-SWM) associated with images of faces. We then examined the neural correlates of these two SWM components and their functional roles in empathy. The results showed distinct activations as the e-SWM task activated the postcentral and precentral gyri while the i-SWM task activated the precuneus/posterior cingulate cortex and superior frontal gyrus. Distinct multivariate activation patterns were also found within the dorsal medial prefrontal cortex in the two tasks. Moreover, partial least squares analyses combining brain activation and individual differences in empathy showed that e-SWM and i-SWM brain activities were mainly correlated with affective empathy and cognitive empathy, respectively. These findings implicate distinct brain processes as well as functional roles of the two types of SWM, providing support for the internal-external hypothesis of SWM.
Humans
;
Memory, Short-Term/physiology*
;
Male
;
Female
;
Empathy/physiology*
;
Young Adult
;
Magnetic Resonance Imaging
;
Adult
;
Brain/diagnostic imaging*
;
Brain Mapping
;
Facial Expression
;
Social Behavior
;
Facial Recognition/physiology*
;
Social Perception
;
Personality/physiology*
5. Research Progress and Application Prospect of Facial Micro-Expression Analysis in Forensic Psychiatry.
Wen LI ; Hao-Zhe LI ; Chen CHEN ; Wei-Xiong CAI
Journal of Forensic Medicine 2023;39(5):493-500
Research on facial micro-expression analysis has been going on for decades. Micro-expressions can reflect an individual's true emotions and have important application value in the auxiliary diagnosis and monitoring of mental disorders. In recent years, the development of artificial intelligence and big data technology has made automatic recognition of micro-expressions possible, which will make micro-expression analysis more convenient and more widely applicable. This paper reviews the development of facial micro-expression analysis and its application in forensic psychiatry, and discusses future application prospects and directions of development.
Humans
;
Forensic Psychiatry
;
Artificial Intelligence
;
Mental Disorders/diagnosis*
;
Facial Expression
;
Emotions
6. Evaluation of the reproducibility of non-verbal facial expressions in normal persons using a dynamic stereophotogrammetric system.
Tian Cheng QIU ; Xiao Jing LIU ; Zhu Lin XUE ; Zi Li LI
Journal of Peking University(Health Sciences) 2020;52(6):1107-1111
OBJECTIVE:
To assess the reproducibility of non-verbal facial expressions (smile with lips closed, smile with lips open, lip purse, and cheek puff) in normal persons using dynamic three-dimensional (3D) imaging and to provide reference data for future research.
METHODS:
In this study, 15 adults (7 males and 8 females) without facial asymmetry or facial nerve dysfunction were recruited. Each participant was seated upright in front of the 3D imaging system in natural head position, so that the whole face could be captured by all six cameras. The dynamic 3D system captured 60 3D images per second. Four facial expressions were included: smile with lips closed, smile with lips open, lip purse, and cheek puff. Before starting, the subjects were instructed to practice the facial expressions to develop muscle memory. During recording, each facial expression took about 3 to 4 seconds. At least 1 week later, the procedure was repeated. The rest position (T0) was considered the base frame. The first quartile of the expression (T1), the moment just after reaching the maximum state of the expression (T2), the moment just before the end of the maximum state (T3), the third quartile of the expression (T4), and the end of motion (T5) were selected as key frames. Using stable parts of the face, such as the forehead, each key frame (T1-T5) of the different expressions was aligned to the corresponding frame at rest (T0). The root mean square (RMS) distance between each key frame and its corresponding frame at rest was calculated. The Wilcoxon signed ranks test was applied to assess statistical differences between the corresponding frames of the different facial expressions.
RESULTS:
Facial expressions such as smile with lips closed, smile with lips open, and cheek puff were reproducible, whereas lip purse was not: statistically significant differences were found at the T2 frame of the repeated lip purse movement.
CONCLUSION
Dynamic 3D imaging can be used to evaluate the reproducibility of facial expressions. Compared with qualitative and two-dimensional analyses, dynamic 3D images can represent facial expressions more faithfully, which makes the research more reliable.
Adult
;
Face/diagnostic imaging*
;
Facial Expression
;
Female
;
Humans
;
Imaging, Three-Dimensional
;
Lip/diagnostic imaging*
;
Male
;
Photogrammetry
;
Reproducibility of Results
;
Smiling
7. Möbius Syndrome Demonstrated by High-Resolution MR Imaging: A Case Report and Review of the Literature
Minhee HWANG ; Hye Jin BAEK ; Kyeong Hwa RYU ; Bo Hwa CHOI ; Ji Young HA ; Hyun Jung DO
Investigative Magnetic Resonance Imaging 2019;23(2):167-171
Möbius syndrome is a rare congenital condition characterized by abducens and facial nerve palsy, resulting in limitation of lateral gaze movement and facial diplegia. However, to our knowledge, there have been few studies evaluating the cranial nerves on MR imaging in Möbius syndrome. Herein, we describe a rare case of Möbius syndrome presenting with limitation of lateral gaze and weakness of facial expression since the neonatal period. In this case, high-resolution MR imaging played a key role in diagnosing Möbius syndrome by directly visualizing the corresponding cranial nerve abnormalities.
Cranial Nerves
;
Facial Expression
;
Facial Nerve
;
Magnetic Resonance Imaging
;
Paralysis
8. The Influence of Anxiety on the Recognition of Facial Emotion Depends on the Emotion Category and Race of the Target Faces
Wonjun KANG ; Gayoung KIM ; Hyehyeon KIM ; Sue Hyun LEE
Experimental Neurobiology 2019;28(2):261-269
The recognition of emotional facial expressions is critical for our social interactions. While some prior studies have shown that a high anxiety level is associated with more sensitive recognition of emotion, other reports suggest that anxiety has no effect on, or even reduces, the sensitivity of facial emotion recognition. To reconcile these results, here we investigated whether the effect of individual anxiety on the recognition of facial emotions depends on the emotion category and the race of the target faces. We found that, first, there was a significant positive correlation between the individual anxiety level and the recognition sensitivity for angry faces but not for sad or happy faces. Second, while the correlation was significant for both low- and high-intensity angry faces during the recognition of the observer's own-race faces, there was a significant correlation only for low-intensity angry faces during the recognition of other-race faces. Collectively, our results suggest that the influence of anxiety on the recognition of facial emotions is flexible, depending on the characteristics of the target face stimuli, including emotion category and race.
Anxiety
;
Continental Population Groups
;
Facial Expression
;
Humans
;
Interpersonal Relations
9. Experiences of Precocious Puberty in Primary School Girls with Hormone Therapeutics
Soon Mi CHEON ; Hye Young JUNG
Journal of Korean Academic Society of Nursing Education 2019;25(4):459-470
PURPOSE: The purpose of this qualitative study was to identify the nature of precocious puberty and to explore what it means for primary school girls. METHODS: The participants of this phenomenological study were nine primary school girls who were diagnosed with precocious puberty and had experienced hormone therapeutics, selected by convenience sampling. Data were collected from July 2017 to January 2018 through individual in-depth interviews with the participants, including gestures, facial expressions, and other nonverbal means. The data analysis followed the method of Giorgi. RESULTS: The study identified 37 concepts, 12 clusters, and five themes from the experiences of precocious puberty. The five essential themes were as follows: ‘ashamed and concealing experience’, ‘there is no therapeutics option’, ‘difficulties in the process of therapeutics’, ‘difficulties in daily life’, and ‘ambivalence toward therapeutics’. CONCLUSION: The findings of this study indicate the physical, psychological, and social difficulties faced by girls with precocious puberty. Based on these results, it is necessary to develop nursing intervention programs focusing on healthy growth and development for children with precocious puberty.
Child
;
Facial Expression
;
Female
;
Gestures
;
Growth and Development
;
Humans
;
Methods
;
Nursing
;
Puberty, Precocious
;
Qualitative Research
;
Statistics as Topic
10. Clinical effect of integrated sandplay therapy in children with Asperger syndrome.
Guo-Kai LI ; Pin GE ; Gui-Hua LIU ; Xin-Xin HUANG ; Guo-Bin LU ; Yan-Xia WANG ; Qin-Fang QIAN ; Ping OU ; Yu-Ying XU
Chinese Journal of Contemporary Pediatrics 2019;21(3):234-238
OBJECTIVE:
To study the clinical effect of integrated sandplay therapy in preschool children with Asperger syndrome (AS).
METHODS:
A total of 44 preschool children with AS were randomly divided into an experimental group and a control group, with 22 children in each group. The children in the control group were given routine training, and those in the experimental group were given integrated sandplay therapy in addition to the routine training. The treatment response was assessed by the Social Responsiveness Scale (SRS), emotion recognition tools, and changes in sandplay theme characteristics after 6 months of treatment.
RESULTS:
Before intervention, there were no significant differences between the two groups in the total score of SRS, the score of each factor of SRS, and correct rates of facial expression recognition of the upright position, inverted position, upper face and lower face (P>0.05). After 6 months of intervention, both groups had significant reductions in the total score of SRS and the score of each factor of SRS (P<0.01); the control group had significant increases in the correct rates of facial expression recognition of all positions except the upright position (P<0.05), while the experimental group had significant increases in the correct rates of facial expression recognition of all positions (P<0.05). Compared with the control group after intervention, the experimental group had significantly lower total score of SRS and scores of all factors of SRS except social perception (P<0.01) and significantly higher correct rates of facial expression recognition of all positions (P<0.01). The experimental group had a significant change in the number of sandplay theme characteristics after intervention (P<0.01).
CONCLUSIONS
Integrated sandplay therapy can improve social responsiveness and emotion recognition ability in preschool children with AS.
Asperger Syndrome
;
Child, Preschool
;
Emotions
;
Facial Expression
;
Humans
;
Play Therapy
