Research on emotion recognition in electroencephalogram based on independent component analysis-recurrence plot and improved EfficientNet.
- DOI: 10.7507/1001-5515.202406029
- Authors: Guohong FENG 1; Xiao ZHENG 1; Bin ZHANG 1; Hongen WANG 1
- Author Information:
1. College of Mechanical and Electrical Engineering, Northeast Forestry University, Harbin 150040, P. R. China.
- Publication Type: Journal Article
- Keywords: Attention mechanism; EfficientNet; Electroencephalogram; Emotion recognition; Independent component analysis; Recurrence plot
- MeSH: Electroencephalography/methods*; Emotions; Humans; Signal Processing, Computer-Assisted; Principal Component Analysis; Algorithms; Convolutional Neural Networks
- From: Journal of Biomedical Engineering 2024;41(6):1103-1109
- Country: China
- Language: Chinese
- Abstract:
To accurately capture and effectively integrate the spatiotemporal features of electroencephalogram (EEG) signals and thereby improve the accuracy of EEG-based emotion recognition, this paper proposes a new method that combines independent component analysis and recurrence plots with an improved EfficientNet version 2 (EfficientNetV2). First, independent component analysis is used to extract independent components containing spatial information from key channels of the EEG signals. These components are then converted into two-dimensional images using recurrence plots, so that emotional features can be better extracted from the temporal information. Finally, the two-dimensional images are fed into an improved EfficientNetV2, which incorporates a global attention mechanism and a triplet attention mechanism, and the emotion class is output by a fully connected layer. To validate the effectiveness of the proposed method, this study conducts comparative experiments, channel selection experiments, and ablation experiments on the Shanghai Jiao Tong University Emotion Electroencephalogram Dataset (SEED). The results show that the average recognition accuracy of the proposed method is 96.77%, significantly higher than that of existing methods, offering a new perspective for research on EEG-based emotion recognition.
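
A minimal sketch of the pipeline outlined in the abstract is given below, assuming scikit-learn's FastICA for the independent component analysis step, a hand-rolled thresholded recurrence plot, and torchvision's stock efficientnet_v2_s in place of the paper's modified network. The channel selection, SEED preprocessing, and global/triplet attention modules described in the abstract are not reproduced; the segment length, component count, threshold, and class count are illustrative assumptions, not the authors' settings.

```python
"""Illustrative sketch: ICA -> recurrence-plot images -> EfficientNetV2 classifier.

All sizes and hyperparameters below are assumptions for demonstration only.
"""
import numpy as np
import torch
from sklearn.decomposition import FastICA
from torchvision.models import efficientnet_v2_s


def recurrence_plot(signal: np.ndarray, eps: float = 0.1) -> np.ndarray:
    """Binary recurrence plot: R[i, j] = 1 if |x_i - x_j| < eps, else 0."""
    dist = np.abs(signal[:, None] - signal[None, :])
    return (dist < eps).astype(np.float32)


# Toy stand-in for one EEG segment: 8 channels x 224 samples (random data).
rng = np.random.default_rng(0)
eeg_segment = rng.standard_normal((8, 224))

# 1) ICA: unmix the selected channels into independent components.
ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(eeg_segment.T)            # shape (224, 3)

# 2) Recurrence plots: one 224x224 image per component, stacked like RGB channels.
images = np.stack([recurrence_plot(components[:, k]) for k in range(3)])  # (3, 224, 224)

# 3) Plain torchvision EfficientNetV2-S (without the paper's added attention
#    modules) mapping the image stack to 3 hypothetical emotion classes.
model = efficientnet_v2_s(weights=None, num_classes=3)
model.eval()
with torch.no_grad():
    x = torch.from_numpy(images).unsqueeze(0)             # (1, 3, 224, 224)
    logits = model(x)
print(logits.shape)                                       # torch.Size([1, 3])
```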