1. Dynamic continuous emotion recognition method based on electroencephalography and eye movement signals.
Yangmeng ZOU ; Lilin JIE ; Mingxun WANG ; Yong LIU ; Junhua LI
Journal of Biomedical Engineering 2025;42(1):32-41
Existing emotion recognition research is typically limited to static laboratory settings and does not fully capture the changes in emotional states that occur in dynamic scenarios. To address this problem, this paper proposes a method for dynamic continuous emotion recognition based on electroencephalography (EEG) and eye movement signals. Firstly, an experimental paradigm was designed to cover six dynamic emotion transition scenarios: happy to calm, calm to happy, sad to calm, calm to sad, nervous to calm, and calm to nervous. EEG and eye movement data were collected simultaneously from 20 subjects to fill the gap in current multimodal dynamic continuous emotion datasets. In the valence-arousal two-dimensional space, emotion ratings for the stimulus videos were collected every five seconds on a scale of 1 to 9, and the resulting dynamic continuous emotion labels were normalized. Subsequently, frequency band features were extracted from the preprocessed EEG and eye movement data. A cascade feature fusion approach was used to combine the EEG and eye movement features into an information-rich multimodal feature vector. This feature vector was input into four regression models, namely support vector regression with a radial basis function kernel, decision tree, random forest, and K-nearest neighbors, to develop the dynamic continuous emotion recognition model. The results showed that the proposed method achieved the lowest mean square error for valence and arousal across the six dynamic continuous emotion transitions. The approach accurately recognizes emotion transitions in dynamic situations and offers higher accuracy and robustness than either EEG or eye movement signals alone, making it well suited for practical applications.
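A minimal sketch of the pipeline described above, assuming that "cascade feature fusion" amounts to concatenating per-window EEG band features with eye movement features; the feature dimensions, window count, and random data here are illustrative placeholders, not the authors' dataset.

```python
# Sketch: concatenate EEG and eye-movement features, then regress
# continuous valence ratings with the four models named in the abstract.
import numpy as np
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_windows = 600                                  # hypothetical 5 s rating windows
eeg_feats = rng.normal(size=(n_windows, 310))    # e.g., band powers per channel
eye_feats = rng.normal(size=(n_windows, 33))     # e.g., pupil/fixation statistics
valence = rng.uniform(0.0, 1.0, size=n_windows)  # normalized 1-9 ratings

# "Cascade" fusion assumed here to be simple feature concatenation.
fused = np.hstack([eeg_feats, eye_feats])

X_train, X_test, y_train, y_test = train_test_split(
    fused, valence, test_size=0.2, random_state=0)

models = {
    "SVR (RBF)": SVR(kernel="rbf"),
    "Decision tree": DecisionTreeRegressor(random_state=0),
    "Random forest": RandomForestRegressor(random_state=0),
    "KNN": KNeighborsRegressor(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: MSE = {mse:.4f}")
```

The same loop would be run for arousal labels; with real data, the per-model mean square errors correspond to the evaluation reported in the abstract.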
Humans; Electroencephalography/methods*; Emotions/physiology*; Eye Movements/physiology*; Signal Processing, Computer-Assisted; Support Vector Machine; Algorithms
2. A method for emotion transition recognition using cross-modal feature fusion and global perception.
Lilin JIE ; Yangmeng ZOU ; Zhengxiu LI ; Baoliang LYU ; Weilong ZHENG ; Ming LI
Journal of Biomedical Engineering 2025;42(5):977-986
Current studies on electroencephalogram (EEG) emotion recognition concentrate primarily on discrete stimulus paradigms under controlled laboratory settings, which cannot adequately represent the dynamic transition characteristics of emotional states during multi-context interactions. To address this issue, this paper proposes a novel method for emotion transition recognition that leverages a cross-modal feature fusion and global perception network (CFGPN). Firstly, an experimental paradigm encompassing six types of emotion transition scenarios was designed, and EEG and eye movement data were simultaneously collected from 20 participants and annotated with dynamic continuous emotion labels. Subsequently, deep canonical correlation analysis integrated with a cross-modal attention mechanism was employed to fuse features from the EEG and eye movement signals, producing multimodal feature vectors enriched with highly discriminative emotional information. These vectors were then input into a parallel hybrid architecture that combines convolutional neural networks (CNNs) and Transformers: the CNN captures local time-series features, whereas the Transformer leverages its global perception capability to model long-range temporal dependencies, enabling accurate recognition of dynamic emotion transitions. The results demonstrate that the proposed method achieves the lowest mean square error in both valence and arousal recognition tasks on the dynamic emotion transition dataset and on a classic multimodal emotion dataset, and exhibits superior recognition accuracy and stability compared with five existing unimodal and six multimodal deep learning models. The approach improves adaptability and robustness in recognizing emotional state transitions in real-world scenarios, showing promising potential for applications in biomedical engineering.
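A minimal sketch of the parallel CNN + Transformer stage described above, assuming PyTorch, illustrative layer sizes, and fused multimodal vectors arranged as a (batch, time, features) sequence; the paper's DCCA and cross-modal attention fusion is taken as already applied upstream, so this is not the authors' CFGPN implementation.

```python
# Sketch: parallel CNN (local patterns) + Transformer (long-range
# dependencies) over fused feature sequences, regressing valence/arousal.
import torch
import torch.nn as nn

class ParallelCNNTransformer(nn.Module):
    def __init__(self, feat_dim=128, hidden=64, n_heads=4, n_layers=2):
        super().__init__()
        # CNN branch: local time-series features.
        self.cnn = nn.Sequential(
            nn.Conv1d(feat_dim, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Transformer branch: global temporal perception.
        self.proj = nn.Linear(feat_dim, hidden)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Head regresses the two continuous targets: valence and arousal.
        self.head = nn.Linear(2 * hidden, 2)

    def forward(self, x):                                     # x: (batch, time, feat)
        local = self.cnn(x.transpose(1, 2)).mean(dim=2)       # (batch, hidden)
        global_ = self.transformer(self.proj(x)).mean(dim=1)  # (batch, hidden)
        return self.head(torch.cat([local, global_], dim=1))  # (batch, 2)

model = ParallelCNNTransformer()
fused_seq = torch.randn(8, 30, 128)  # 8 windows x 30 steps x 128 fused features
print(model(fused_seq).shape)        # torch.Size([8, 2]) -> valence, arousal
```

Training such a model against the normalized continuous labels with a mean-square-error loss would mirror the evaluation metric reported in the abstract.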
Humans; Emotions/physiology*; Electroencephalography; Neural Networks, Computer; Eye Movements; Perception
3. Development of a time-resolved fluorescence immunoassay for rapid determination of cardiac troponin I antigen levels in human serum
Jie ZHANG ; Lingling LU ; Xiaofu PAN ; Lilin ZOU ; Jimin GAO
Chinese Journal of Immunology 2014;(8):1093-1097
Objective: To establish a time-resolved fluoroimmunoassay (TRFIA) for the detection of cardiac troponin I (cTnI) and to apply it in the clinic. Methods: The assay was performed by TRFIA using a double-antibody sandwich method. The standard protocol was evaluated in terms of standard curve, limit of detection, stability, precision, and cross-reaction. A healthy reference population and clinical serum specimens were measured to establish the reference interval and to evaluate the prospects for clinical application. Results: The standard curve was Y=7485.878+1400.924X with a correlation coefficient of 0.999. The limit of detection was 0.052 ng/ml. The intra- and inter-assay coefficients of variation (CV) were all <10%. The reference value was <0.14 μg/L. The area under the ROC curve was 0.971, with a sensitivity of 96.45%, a specificity of 91.43%, and an accuracy of 95.69%; the positive and negative predictive values were 98.45% and 82.05%, respectively. The correlation coefficient between the proposed method and commercially available CLIA kits was 0.993. There was no statistically significant difference compared with ECG, CK-MB, and cTnT (P>0.05), whereas cTnI levels in patients with acute myocardial infarction (AMI) differed significantly before and after treatment (P<0.001). Conclusion: The TRFIA method for detecting cTnI meets the standards for clinical application and may be used for the diagnosis and serum monitoring of patients with acute myocardial infarction.
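As a worked illustration of the reported standard curve, a fluorescence reading can be converted back to a cTnI concentration by inverting Y=7485.878+1400.924X; treating Y as fluorescence counts and X as concentration in ng/ml is an assumption consistent with the reported limit of detection, and the example reading is invented.

```python
# Invert the reported standard curve Y = 7485.878 + 1400.924 * X to
# estimate cTnI concentration (X, ng/ml) from a fluorescence count (Y).
INTERCEPT = 7485.878
SLOPE = 1400.924
LOD_NG_ML = 0.052  # reported limit of detection

def ctni_concentration(counts: float) -> float:
    """Estimate cTnI (ng/ml) from a fluorescence reading; flags sub-LOD values."""
    conc = (counts - INTERCEPT) / SLOPE
    if conc < LOD_NG_ML:
        raise ValueError(f"below limit of detection ({LOD_NG_ML} ng/ml)")
    return conc

# Hypothetical reading: 8 000 counts gives about 0.367 ng/ml, above the
# reported reference value of 0.14 ug/L (equivalent to 0.14 ng/ml).
print(f"{ctni_concentration(8000):.3f} ng/ml")
```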
