Decoding listener identity: person identification from high-frequency EEG rhythms during music perception

Source
TechRxiv
Date Issued
2024-06-01
DOI
10.36227/techrxiv.20502423.v2
Abstract
Research in music perception and brain activity has led to the development of Music Brain-Computer Interface Systems. While previous studies have focused on aspects such as song identification, stimulus-response correlation, and inter-subject correlations, they have often overlooked individual differences in subjective experience. In this research, our objective is to identify listener-specific neural signatures by analyzing EEG data from six naturalistic music datasets collected in the USA, Greece, and India, involving a total of 161 listeners. Our approach consists of a feature representation pipeline that decomposes the EEG signals into five primary brain waves and extracts 21 features to predict the listener's identity. We use linear and non-linear features to train Random Forest classifiers and determine the most discriminating feature among listeners. The results demonstrate that neural oscillations in higher frequency bands are crucial in distinguishing subjective differences. Specifically, beta waves emerge as the most effective in predicting these differences, yielding an average accuracy of 91.03% across all datasets. Furthermore, our findings highlight the Hjorth Mobility feature as having the highest predictive ability. Additionally, we observe that the frontal region exhibits increased sensitivity in capturing unique features for person identification. We also discuss previous studies that align with our research direction, focusing on the encoding of individual-related information within high-frequency brain rhythms. These findings have significant implications for biometrics in Music Brain-Computer Interface Technology, providing insights into personalized user experiences and paving the way for future advancements in this research area.
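The abstract describes a pipeline of band decomposition, feature extraction (including Hjorth Mobility), and Random Forest classification of listener identity. The sketch below is an illustrative outline of that kind of pipeline, not the authors' implementation: the beta band edges (13-30 Hz), sampling rate, epoch length, and classifier settings are assumptions, and the synthetic data merely stands in for real EEG epochs.

```python
# Hedged sketch: band-pass EEG epochs into an assumed beta band, compute the
# Hjorth Mobility feature per channel, and train a Random Forest to predict
# listener identity. Parameter values are illustrative, not from the paper.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter along the time (last) axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def hjorth_mobility(x):
    """Hjorth Mobility: sqrt(var(dx/dt) / var(x)), one value per channel."""
    dx = np.diff(x, axis=-1)
    return np.sqrt(np.var(dx, axis=-1) / np.var(x, axis=-1))

def beta_mobility_features(epochs, fs, low=13.0, high=30.0):
    """epochs: (n_epochs, n_channels, n_samples) -> (n_epochs, n_channels)."""
    beta = bandpass(epochs, low, high, fs)
    return np.vstack([hjorth_mobility(e) for e in beta])

# Toy usage with synthetic data standing in for real EEG recordings.
fs, n_subjects, n_epochs, n_channels, n_samples = 128, 5, 40, 32, 256
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((n_subjects * n_epochs, n_channels, n_samples))
y = np.repeat(np.arange(n_subjects), n_epochs)  # listener identity labels

X = beta_mobility_features(X_raw, fs)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In this setup each epoch contributes one feature vector (beta-band Hjorth Mobility per channel), and cross-validated accuracy measures how well listener identity can be recovered from those features.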
URI
https://d8.irins.org/handle/IITG2025/19895