Title: Movie Identification from Electroencephalography Response Using Convolutional Neural Network
Authors: Sonawane, Dhananjay; Pandey, Pankaj; Mukopadhyay, Dyutiman; Miyapuram, Krishna Prasad
Type: Conference Paper (Book Series)
Date issued: 2021-01-01
Date accessioned/available: 2025-08-31
ISBN: 9783030869922
ISSN: 1611-3349
DOI: 10.1007/978-3-030-86993-9_25
Scopus ID: 2-s2.0-85115870285
Handle: http://repository.iitgn.ac.in/handle/IITG2025/25619
Pages: 267-276
Keywords: Brain signals | Classification | CNN | EEG | Neural entrainment

Abstract: Visual, audio, and emotional perception by human beings has been an interesting research topic over the past few decades. Electroencephalography (EEG) signals are one way to represent human brain activity. It has been shown that different brain networks correspond to processes elicited by varieties of emotional stimuli. In this paper, we demonstrate a deep learning architecture for the movie identification task from the EEG response using a Convolutional Neural Network (CNN). The dataset includes nine movie clips spanning different emotional states. The EEG time series data were collected from 20 participants. Given a one-second EEG response of a particular participant, we predict its corresponding movie ID. We also discuss the various pre-processing steps for data cleaning and the data augmentation process. All participants are included in both the training and test data. We obtained 80.22% test accuracy for this movie classification task. We also tried cross-participant testing using the same model, and the performance was poor for unseen participants. Our results give insight into the creation of identifiable patterns in the brain during audiovisual perception.
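The abstract describes a CNN that maps one-second EEG windows to one of nine movie IDs. The record does not specify the network layout, channel count, or sampling rate, so the following is only a minimal sketch of such a classifier, assuming placeholder values (128 EEG channels, 250 samples per one-second window) and a generic 1D-CNN design; the authors' actual architecture and hyperparameters may differ.

```python
import torch
import torch.nn as nn

class EEGMovieCNN(nn.Module):
    """Hypothetical 1D-CNN sketch: classifies a 1-second EEG window into one of 9 movie clips.
    n_channels and n_samples are assumed placeholders, not values from the paper."""
    def __init__(self, n_channels=128, n_samples=250, n_classes=9):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):              # x: (batch, n_channels, n_samples)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)      # logits over the 9 movie IDs


# Usage example with random data standing in for preprocessed EEG windows.
model = EEGMovieCNN()
logits = model(torch.randn(32, 128, 250))
print(logits.shape)  # torch.Size([32, 9])
```

As in the abstract's within-participant setting, such a model would be trained and evaluated on windows drawn from all 20 participants; evaluating it on held-out participants instead corresponds to the cross-participant test the authors report as performing poorly.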