Multivariate emotional AI model for enhancing students’ ideological education and mental health via brain-computer interfaces and biomechanics
Abstract
This paper investigates attention fatigue detection and multi-granularity emotion classification for student mental health through an educational brain-computer interface (BCI), with a focus on integrating biomechanical principles into signal interpretation. Recognizing the growing importance of students’ mental health and well-being, the study introduces a domain generalization approach to transfer learning that improves cross-subject BCI model accuracy and addresses the challenge of individual variability. The proposed model uses only seven electrodes and achieves 90% accuracy in differentiating between two cognitive-behavioral tasks. A truncated weighting algorithm optimizes electrode combinations, enabling effective generalization across subjects. To address the practical challenges of emotion recognition in educational settings, the study reduces data sampling points by identifying the key brain regions and frequency bands associated with emotion. Machine learning algorithms, including support vector machines (SVM), Bayesian networks, and K-nearest neighbors (KNN), further enhance recognition accuracy. By fusing eye movement and electroencephalography (EEG) signals with deep canonical correlation analysis, the model achieves cumulative accuracy improvements of 15% and 12% over unimodal EEG and eye movement data, respectively, across 12 subjects. Incorporating biomechanical principles, the study also examines the mechanical properties of neural tissue and their influence on signal propagation. By analyzing the viscoelastic behavior of brain tissue and its effect on EEG signal transmission, the research offers insight into how mechanical stress and strain affect neural activity. This biomechanical perspective improves the understanding of individual variability in EEG signals and supports the development of more robust, personalized BCI models.
Integrating biomechanics with AI-driven emotion classification and attention fatigue detection thus offers a comprehensive approach to student mental health: the fused model outperforms its unimodal counterparts on both tasks and holds substantial potential for real-time interventions and improved educational outcomes.
Copyright (c) 2025 Author(s)

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright on all articles published in this journal is retained by the author(s), who grant the publisher the right to publish the article as its original publisher.
Articles published in this journal are licensed under a Creative Commons Attribution 4.0 International License, which means they may be shared, adapted, and distributed provided the original published version is cited.