HUMAN BEHAVIOUR ANALYSIS USING CNN

Dr. Anupama Budhewar
Sanika Purbuj
Darshika Rathod
Mrunal Tukan
Palak Kulshrestha

Keywords

Affective Computing, Multimodal Emotion Recognition, Facial Expression Analysis, EEG Data Processing, Deep Learning Models, Feature Extraction, Real-time Emotion Detection, Individual Differences, Personalized Emotional Profiling, Human-Computer Interaction, Collaborative Environments, Mental Health Applications, Ethical Considerations, Privacy Protection, Informed Consent, Machine Learning Algorithms, Model Training and Optimization, Algorithmic Approaches, Virtual Reality Integration, Neuro-feedback Systems

Abstract

Emotion recognition has been the subject of extensive research due to its significant impact on various domains, including healthcare, human-computer interaction, and marketing. Traditional methods of emotion recognition rely on visual cues, such as facial expressions, to decipher emotional states. However, these methods often fall short when dealing with individuals who have limited ability to express emotions through facial expressions, such as individuals with certain neurological disorders.


This research paper proposes a novel approach to emotion recognition by combining facial expression analysis with electroencephalography (EEG) data. Deep learning techniques are applied to extract features from facial expressions captured through video analysis, while simultaneously analyzing the corresponding EEG signals. The goal is to improve emotion recognition accuracy by utilizing the complementary information offered by the interaction between facial expressions and EEG data.


Emotion recognition is a challenging task that has attracted considerable attention in recent years. Diverse and refined approaches based on facial expressions, voice analysis, physiological signals, and behavioural patterns have been developed. While facial expression analysis has been a dominant approach, it falls short in instances where individuals cannot effectively express emotions through their faces. To overcome these limitations, alternative methods that provide a more accurate assessment of emotions are needed. This research paper investigates the collaboration and interaction between facial expressions and EEG data for emotion recognition. By combining the information from both modalities, the accuracy and robustness of emotion recognition systems are expected to improve. The proposed work spans conducting literature reviews; designing and fine-tuning deep learning models for feature extraction; developing fusion models that combine features from facial expressions and EEG data; performing experimentation and evaluation; writing papers and documentation; preparing presentations for dissemination; and holding regular meetings and discussions for effective collaboration. Ethical considerations, robustness and generalizability, continual learning and skill development, and the use of collaboration tools and platforms are also essential to the project's success.
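The fusion strategy described above, where features are extracted from each modality separately and then combined for classification, can be illustrated with a minimal NumPy sketch. The input shapes (48x48 face crops, 32-channel EEG windows of 128 samples), the random stand-in projection weights, and the four emotion classes are illustrative assumptions only, not details taken from the paper; a real system would learn CNN filters and a trained classifier from data.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Stand-in "learned" weights; a real pipeline would train CNN feature
# extractors for each modality instead of fixed random projections.
W_face = rng.standard_normal((48 * 48, 64)) * 0.01   # face branch
W_eeg = rng.standard_normal((32 * 128, 64)) * 0.01   # EEG branch
W_out = rng.standard_normal((128, 4)) * 0.1          # 4 emotion classes (assumed)

def face_features(frame):
    """Map a (48, 48) grayscale face crop to a 64-dim feature vector."""
    return np.tanh(frame.reshape(-1) @ W_face)

def eeg_features(signal):
    """Map a (32, 128) channels-by-samples EEG window to 64 features."""
    return np.tanh(signal.reshape(-1) @ W_eeg)

def fuse_and_classify(frame, signal):
    """Feature-level fusion: concatenate both branches, then softmax."""
    fused = np.concatenate([face_features(frame), eeg_features(signal)])
    logits = fused @ W_out
    probs = np.exp(logits - logits.max())   # numerically stable softmax
    return probs / probs.sum()

# Synthetic inputs standing in for a video frame and an EEG window.
frame = rng.random((48, 48))
signal = rng.standard_normal((32, 128))
probs = fuse_and_classify(frame, signal)
```

The concatenation step is the simplest form of feature-level fusion; the deep autoencoder and hybrid fusion networks cited by related work replace it with learned joint representations.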

