1. Dynamic continuous emotion recognition method based on electroencephalography and eye movement signals.
Yangmeng ZOU ; Lilin JIE ; Mingxun WANG ; Yong LIU ; Junhua LI
Journal of Biomedical Engineering 2025;42(1):32-41
Existing emotion recognition research is typically limited to static laboratory settings and does not fully handle changes in emotional states in dynamic scenarios. To address this problem, this paper proposes a method for dynamic continuous emotion recognition based on electroencephalography (EEG) and eye movement signals. Firstly, an experimental paradigm was designed to cover six dynamic emotion transition scenarios: happy to calm, calm to happy, sad to calm, calm to sad, nervous to calm, and calm to nervous. EEG and eye movement data were collected simultaneously from 20 subjects to fill the gap in current multimodal dynamic continuous emotion datasets. In the valence-arousal two-dimensional space, emotion ratings for the stimulus videos were performed every five seconds on a scale of 1 to 9, and the dynamic continuous emotion labels were normalized. Subsequently, frequency band features were extracted from the preprocessed EEG and eye movement data. A cascade feature fusion approach was used to effectively combine EEG and eye movement features, generating an information-rich multimodal feature vector. This feature vector was input into four regression models, namely support vector regression with a radial basis function kernel, decision tree, random forest, and K-nearest neighbors, to develop the dynamic continuous emotion recognition model. The results showed that the proposed method achieved the lowest mean square error for valence and arousal across the six dynamic continuous emotions. This approach can accurately recognize various emotion transitions in dynamic situations, offering higher accuracy and robustness than using either EEG or eye movement signals alone, making it well-suited for practical applications.
Humans
;
Electroencephalography/methods*
;
Emotions/physiology*
;
Eye Movements/physiology*
;
Signal Processing, Computer-Assisted
;
Support Vector Machine
;
Algorithms
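As a minimal sketch of the pipeline described in this abstract, cascade feature fusion can be read as concatenating the EEG and eye movement feature vectors before regression. The snippet below illustrates this with synthetic stand-in features and the RBF-kernel support vector regressor, one of the four models compared; all dimensions and data here are illustrative assumptions, not the authors' actual dataset.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic stand-ins for EEG band-power features and eye movement features,
# one row per 5-second rating window (dimensions are illustrative).
eeg_feats = rng.normal(size=(200, 32))
eye_feats = rng.normal(size=(200, 12))
valence = rng.uniform(1, 9, size=200)  # continuous labels on the 1-9 scale

# Cascade (concatenation) fusion: stack the two feature vectors side by side.
fused = np.hstack([eeg_feats, eye_feats])

# RBF-kernel support vector regression on the fused multimodal vector.
model = SVR(kernel="rbf")
model.fit(fused[:150], valence[:150])
pred = model.predict(fused[150:])
mse = mean_squared_error(valence[150:], pred)
print(fused.shape, round(mse, 3))
```

The same fused matrix would be fed to the other three regressors (decision tree, random forest, K-nearest neighbors) for the comparison reported in the abstract.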
2. A method for emotion transition recognition using cross-modal feature fusion and global perception.
Lilin JIE ; Yangmeng ZOU ; Zhengxiu LI ; Baoliang LYU ; Weilong ZHENG ; Ming LI
Journal of Biomedical Engineering 2025;42(5):977-986
Current studies on electroencephalogram (EEG) emotion recognition primarily concentrate on discrete stimulus paradigms under controlled laboratory settings, which cannot adequately represent the dynamic transition characteristics of emotional states during multi-context interactions. To address this issue, this paper proposes a novel method for emotion transition recognition that leverages a cross-modal feature fusion and global perception network (CFGPN). Firstly, an experimental paradigm encompassing six types of emotion transition scenarios was designed, and EEG and eye movement data were simultaneously collected from 20 participants, each annotated with dynamic continuous emotion labels. Subsequently, deep canonical correlation analysis integrated with a cross-modal attention mechanism was employed to fuse features from EEG and eye movement signals, resulting in multimodal feature vectors enriched with highly discriminative emotional information. These vectors are then input into a parallel hybrid architecture that combines convolutional neural networks (CNNs) and Transformers. The CNN is employed to capture local time-series features, whereas the Transformer leverages its robust global perception capabilities to effectively model long-range temporal dependencies, enabling accurate dynamic emotion transition recognition. The results demonstrate that the proposed method achieves the lowest mean square error in both valence and arousal recognition tasks on the dynamic emotion transition dataset and a classic multimodal emotion dataset. It exhibits superior recognition accuracy and stability when compared with five existing unimodal and six multimodal deep learning models. The approach enhances both adaptability and robustness in recognizing emotional state transitions in real-world scenarios, showing promising potential for applications in the field of biomedical engineering.
Humans
;
Emotions/physiology*
;
Electroencephalography
;
Neural Networks, Computer
;
Eye Movements
;
Perception
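The parallel CNN-plus-Transformer architecture described in this abstract can be sketched as two branches over the same fused feature sequence: a 1-D convolution for local time-series patterns and a Transformer encoder for long-range dependencies, pooled and concatenated for valence/arousal regression. This is an illustrative simplification, not the authors' CFGPN; all layer sizes are assumptions.

```python
import torch
import torch.nn as nn

class CNNTransformerSketch(nn.Module):
    """Illustrative parallel CNN + Transformer regressor (not the authors' CFGPN)."""
    def __init__(self, feat_dim=44, d_model=64):
        super().__init__()
        # CNN branch: local temporal patterns via 1-D convolution over time.
        self.cnn = nn.Sequential(
            nn.Conv1d(feat_dim, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Transformer branch: global perception of long-range dependencies.
        self.proj = nn.Linear(feat_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        # Fuse both branches and regress valence and arousal.
        self.head = nn.Linear(2 * d_model, 2)

    def forward(self, x):                               # x: (batch, seq_len, feat_dim)
        c = self.cnn(x.transpose(1, 2)).mean(dim=2)     # (batch, d_model)
        t = self.transformer(self.proj(x)).mean(dim=1)  # (batch, d_model)
        return self.head(torch.cat([c, t], dim=1))      # (batch, 2): valence, arousal

out = CNNTransformerSketch()(torch.randn(8, 20, 44))
print(out.shape)  # torch.Size([8, 2])
```

The deep canonical correlation analysis and cross-modal attention fusion would run upstream of this model, producing the fused sequence it consumes.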
3. Transcranial temporal interference stimulation precisely targets deep brain regions to regulate eye movements.
Mo WANG ; Sixian SONG ; Dan LI ; Guangchao ZHAO ; Yu LUO ; Yi TIAN ; Jiajia ZHANG ; Quanying LIU ; Pengfei WEI
Neuroscience Bulletin 2025;41(8):1390-1402
Transcranial temporal interference stimulation (tTIS) is a novel non-invasive neuromodulation technique with the potential to precisely target deep brain structures. This study explores the neural and behavioral effects of tTIS on the superior colliculus (SC), a region involved in eye movement control, in mice. Computational modeling revealed that tTIS delivers more focused stimulation to the SC than traditional transcranial alternating current stimulation. In vivo experiments, including Ca2+ signal recordings and eye movement tracking, showed that tTIS effectively modulates SC neural activity and induces eye movements. A significant correlation was found between stimulation frequency and saccade frequency, suggesting direct tTIS-induced modulation of SC activity. These results demonstrate the precision of tTIS in targeting deep brain regions and regulating eye movements, highlighting its potential for neuroscientific research and therapeutic applications.
Animals
;
Superior Colliculi/physiology*
;
Transcranial Direct Current Stimulation/methods*
;
Eye Movements/physiology*
;
Male
;
Mice
;
Mice, Inbred C57BL
4. Distributions of Visual Receptive Fields from Retinotopic to Craniotopic Coordinates in the Lateral Intraparietal Area and Frontal Eye Fields of the Macaque.
Lin YANG ; Min JIN ; Cong ZHANG ; Ning QIAN ; Mingsha ZHANG
Neuroscience Bulletin 2024;40(2):171-181
Even though retinal images of objects change their locations following each eye movement, we perceive a stable and continuous world. One possible mechanism by which the brain achieves such visual stability is to construct a craniotopic coordinate by integrating retinal and extraretinal information. There have been several proposals on how this may be done, including eye-position modulation (gain fields) of retinotopic receptive fields (RFs) and craniotopic RFs. In the present study, we investigated coordinate systems used by RFs in the lateral intraparietal (LIP) cortex and frontal eye fields (FEF) and compared the two areas. We mapped the two-dimensional RFs of neurons in detail under two eye fixations and analyzed how the RF of a given neuron changes with eye position to determine its coordinate representation. The same recording and analysis procedures were applied to the two brain areas. We found that, in both areas, RFs were distributed from retinotopic to craniotopic representations. There was no significant difference between the distributions in the LIP and FEF. Only a small fraction of neurons was fully craniotopic, whereas most neurons were between the retinotopic and craniotopic representations. The distributions were strongly biased toward the retinotopic side but with significant craniotopic shifts. These results suggest that there is only weak evidence for craniotopic RFs in the LIP and FEF, and that transformation from retinotopic to craniotopic coordinates in these areas must rely on other factors such as gain fields.
Animals
;
Macaca
;
Visual Fields
;
Frontal Lobe/physiology*
;
Eye Movements
;
Brain
5. Effect and differentiation of spontaneous nystagmus of acute unilateral vestibulopathy on saccade in the video head impulse test.
Qiaomei DENG ; Xueqing ZHANG ; Chao WEN ; Xiaobang HUANG ; Taisheng CHEN ; Wei WANG
Journal of Clinical Otorhinolaryngology Head and Neck Surgery 2024;38(12):1122-1133
Objective: To explore the performance characteristics of spontaneous nystagmus (SN) in the video head impulse test (vHIT) and its possible effects on saccades. Methods: Vestibular function tests, including vHIT and SN recording, were conducted in 48 patients with acute unilateral vestibulopathy (AUVP). The saccade characteristics of vHIT in patients with and without SN were analyzed, as well as the expression characteristics of SN in vHIT. Results: Among the 48 AUVP patients, 34 had SN, of whom 31 showed saccades on the healthy side: 11 with both same- and opposite-direction eye movements, 19 with opposite-direction only, and 1 with same-direction only; 3 had no saccades. Of the 14 patients without SN, 10 showed saccades on the healthy side: 4 with both same- and opposite-direction eye movements, 2 with opposite-direction only, and 4 with same-direction only; 4 had no saccades. Reverse saccades on the healthy side correlated with the presence of SN. In vHIT, SN can appear opposite to the direction of eye movement on the healthy side, while on the affected side it can appear in the same direction as the eye movement and may produce more dispersed overt saccades. Thirty-two patients were in the acute phase (≤2 weeks): 29 had SN, with an SN intensity of (6.7 ± 3.2)°/s, and 3 had no SN. Sixteen patients were in the non-acute phase (>2 weeks): 5 had SN, with an SN intensity of (3.7 ± 2.1)°/s, and 11 had no SN. In the acute phase, 30 patients showed saccades on the healthy side: 10 with both same- and opposite-direction eye movements, 18 with opposite-direction only, and 2 with same-direction only; 2 had no saccades. Disease duration correlated with the occurrence of reverse saccades on the healthy side. The SN intensity cut-off for reverse saccades in healthy-side lateral semicircular canal vHIT was 2.1°/s.
Conclusion: Compensatory saccades and SN waves with similar waveforms frequently coexist in the vHIT of AUVP patients. The SN wave runs opposite to the eye movement wave on the healthy side, whereas on the affected side it runs in the same direction as the dominant saccades and mixes with them, which can affect the dispersion and amplitude of overt saccades in vHIT. Accurate identification of SN in the vHIT of AUVP patients is not only key to identifying compensatory saccades but also aids the diagnosis and compensatory assessment of AUVP.
Humans
;
Head Impulse Test/methods*
;
Nystagmus, Pathologic/physiopathology*
;
Saccades/physiology*
;
Male
;
Female
;
Vestibular Diseases/physiopathology*
;
Middle Aged
;
Adult
;
Eye Movements/physiology*
;
Aged
6. Research on eye movement data classification using support vector machine with improved whale optimization algorithm.
Yinhong SHEN ; Chang ZHANG ; Lin YANG ; Yuanyuan LI ; Xiujuan ZHENG
Journal of Biomedical Engineering 2023;40(2):335-342
When performing eye movement pattern classification for different tasks, support vector machines are greatly affected by their parameters. To address this problem, we propose an algorithm based on an improved whale optimization algorithm to optimize support vector machines and enhance the performance of eye movement data classification. According to the characteristics of eye movement data, this study first extracts 57 features related to fixations and saccades, then uses the ReliefF algorithm for feature selection. To address the whale algorithm's low convergence accuracy and tendency to fall into local minima, we introduce inertia weights to balance local and global search and accelerate convergence, and we use a differential mutation strategy to increase individual diversity and escape local optima. Experiments on eight test functions show that the improved whale algorithm achieves the best convergence accuracy and convergence speed. Finally, the support vector machine model optimized by the improved whale algorithm is applied to classifying eye movement data in autism, and experimental results on a public dataset show that classification accuracy is greatly improved compared with the traditional support vector machine method. Compared with the standard whale algorithm and other optimization algorithms, the optimized model proposed in this paper has higher recognition accuracy and provides a new approach to eye movement pattern recognition. In the future, eye movement data can be obtained with eye trackers to assist in medical diagnosis.
Animals
;
Support Vector Machine
;
Whales
;
Eye Movements
;
Algorithms
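The two modifications this abstract describes, an inertia weight that balances global and local search and a differential mutation step that preserves population diversity, can be sketched on a classic test function. The sketch below is a simplified whale optimization loop with those two additions; the weight schedules, mutation rate, and other constants are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def improved_woa(obj, dim=2, n_whales=20, iters=100, lb=-5.0, ub=5.0, seed=0):
    """Simplified whale optimization with an inertia weight and a
    differential-mutation step (hypothetical parameter choices)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_whales, dim))
    fitness = np.apply_along_axis(obj, 1, X)
    best = X[fitness.argmin()].copy()
    for t in range(iters):
        a = 2 - 2 * t / iters                      # encircling coefficient: 2 -> 0
        w = 0.9 - 0.5 * t / iters                  # inertia weight: global -> local search
        for i in range(n_whales):
            r = rng.random(dim)
            A, C = 2 * a * r - a, 2 * rng.random(dim)
            if rng.random() < 0.5:
                if np.abs(A).mean() < 1:           # exploit: encircle the best whale
                    X[i] = w * best - A * np.abs(C * best - X[i])
                else:                              # explore: move around a random whale
                    Xr = X[rng.integers(n_whales)]
                    X[i] = w * Xr - A * np.abs(C * Xr - X[i])
            else:                                  # spiral update toward the best
                l = rng.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + w * best
            # Differential mutation: perturb with the difference of two random
            # whales to increase diversity and escape local optima.
            j, k = rng.choice(n_whales, size=2, replace=False)
            trial = X[i] + 0.5 * (X[j] - X[k])
            X[i] = np.clip(np.where(rng.random(dim) < 0.2, trial, X[i]), lb, ub)
        fitness = np.apply_along_axis(obj, 1, X)
        if fitness.min() < obj(best):
            best = X[fitness.argmin()].copy()
    return best, obj(best)

sphere = lambda x: float(np.sum(x ** 2))           # a classic benchmark test function
best, val = improved_woa(sphere)
print(best, val)
```

In the paper's setting, `obj` would instead score an SVM's cross-validation error as a function of its hyperparameters (e.g. C and the kernel width), so each whale position encodes one candidate parameter pair.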
7. A review on voluntary or involuntary eye movement classification methods based on electro-oculogram and their applications.
Jiarong LIU ; Linyao WANG ; Yingnian WU ; Qing HE
Journal of Biomedical Engineering 2022;39(4):833-840
Eye-computer interaction technology based on the electro-oculogram provides users with a convenient way to control devices, which has great social significance. However, eye-computer interaction is often disturbed by involuntary eye movements, resulting in misjudgment, degrading the user experience, and even causing danger in severe cases. Therefore, this paper starts from the basic concepts and principles of eye-computer interaction, surveys the current mainstream methods for classifying voluntary and involuntary eye movements, and analyzes the characteristics of each technique. Performance is analyzed in combination with specific application scenarios, and the problems that remain to be solved are summarized, providing a reference for researchers in related fields.
Computers
;
Electrooculography/methods*
;
Eye Movements
;
Movement
9. A Gaussian mixture-hidden Markov model of human visual behavior.
Huaqian LIU ; Xiujuan ZHENG ; Yan WANG ; Yun ZHANG ; Kai LIU
Journal of Biomedical Engineering 2021;38(3):512-519
Vision is an important way for human beings to interact with the outside world and obtain information. To study human visual behavior under different conditions, this paper uses a Gaussian mixture-hidden Markov model (GMM-HMM) to model the scanpath and proposes a new model optimization method, time-shifting segmentation (TSS). The TSS method highlights the characteristics of the time dimension in the scanpath, improves pattern recognition results, and enhances the stability of the model. A linear discriminant analysis (LDA) method is used for multi-dimensional feature pattern recognition to evaluate the rationality and accuracy of the proposed model. Four sets of comparative trials were carried out for model evaluation. The first group applied the GMM-HMM to model the scanpath; the average classification accuracy reached 0.507, above the chance level for three-class classification (0.333). The second set of trials applied the TSS method, raising the mean classification accuracy to 0.610. The third group combined the GMM-HMM with the TSS method; the mean classification accuracy reached 0.602, and this model was more stable than the second. Finally, comparing the model analysis results with a saccade amplitude (SA) feature analysis, the modeling approach performs much better than the basic information analysis method. Analyzing the characteristics of the three task types shows that the free viewing task yields a higher specificity value and the cued object search task a higher sensitivity. In summary, the GMM-HMM performs well in scanpath pattern recognition, and introducing the TSS method enhances the differences between scanpath characteristics. The model is especially advantageous for recognizing the scanpaths of search-type tasks, and it also provides a new solution for single-state eye movement sequences.
Algorithms
;
Discriminant Analysis
;
Eye Movements
;
Humans
;
Markov Chains
;
Normal Distribution
;
Probability
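The time-shifting segmentation idea in this abstract can be read as cutting one scanpath into overlapping, time-shifted segments so that temporal structure is emphasised before model fitting. The sketch below illustrates TSS on a synthetic scanpath, with a Gaussian mixture over fixation positions standing in for the HMM's per-state emission model; segment length, shift, and component count are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def time_shift_segments(scanpath, seg_len=10, shift=2):
    """Time-shifting segmentation (TSS) sketch: cut one scanpath into
    overlapping, time-shifted segments (seg_len and shift are illustrative)."""
    return np.array([scanpath[s:s + seg_len]
                     for s in range(0, len(scanpath) - seg_len + 1, shift)])

rng = np.random.default_rng(1)
scanpath = rng.uniform(0, 1, size=(60, 2))       # 60 fixations, (x, y) on screen

segments = time_shift_segments(scanpath)
print(segments.shape)                            # (26, 10, 2)

# A Gaussian mixture over fixation positions stands in for the HMM's
# per-state emissions; a full GMM-HMM would add hidden-state transitions
# between these components along the segment sequence.
gmm = GaussianMixture(n_components=3, random_state=0).fit(scanpath)
states = gmm.predict(scanpath)
print(states.shape)
```

In the full pipeline, per-segment model features would then be passed to LDA for the three-task classification the abstract reports.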
10. Eye movement autophony: A unique presenting symptom of semicircular canal dehiscence syndrome
Philippine Journal of Otolaryngology Head and Neck Surgery 2020;35(1):74-75
A 31-year-old woman presented with the very unusual symptom of being able to hear the movement of her eyeballs in her left ear: “I can hear my eyeballs move!” She initially described hearing a recurrent “swishing” sound that would occur intermittently. She eventually realized that its occurrence coincided with eyeball movement. In the eight months’ duration of her symptom, she had been unable to obtain a diagnosis from physicians whom she consulted and had even been referred for psychiatric evaluation and treatment. An otolaryngologist whom she consulted had a standard pure tone audiometric examination done, and this showed normal hearing acuity in both ears. A Magnetic Resonance Imaging (MRI) of the inner ear and brain likewise showed no abnormalities. Due to the peculiarity of the patient’s complaint, the otolaryngologist consulted with a neurotologist who suspected the presence of a semicircular canal dehiscence. A computerized tomographic imaging study of the temporal bone confirmed the presence of a left superior semicircular canal dehiscence syndrome.
Semicircular Canal Dehiscence
;
Semicircular Canals
;
Eye Movements

