EEG-Based Emotion Recognition during Music Listening

Authors: 
N. Thammasan, K. Fukui, K. Moriyama & M. Numao
Year: 
2014
Venue: 
The 28th Annual Conference of the Japanese Society for Artificial Intelligence
Abstract: 

In music emotion research, the goal is to quantify and explain how music influences our emotional states. Because the human brain plays a central role in emotion, researchers study emotion by capturing information directly from the brain, and the electroencephalogram (EEG) is an efficient tool for recording brainwaves. This research proposes a framework for recognizing human emotions during music listening using EEG. MIDI music files are used as stimuli, and 12 EEG electrodes are selected and placed according to the international 10-20 system. A two-dimensional (arousal-valence) model is used to represent human emotions. Each experimental session proceeds from music selection, through music listening with simultaneous brainwave recording, to continuous emotion self-annotation. For data analysis, fractal dimension values are computed with the Higuchi algorithm, and a Support Vector Machine is applied for classification. The emotion classification results are satisfactory, with average accuracies of 90% for arousal and 86% for valence.
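The feature-extraction step described above relies on Higuchi's fractal dimension. The following is a minimal illustrative sketch of that algorithm in Python (using NumPy), not the authors' implementation; the choice of `kmax = 8` and the signal names are assumptions for the example.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's algorithm.

    For each delay k = 1..kmax, the mean normalized curve length L(k) is
    computed over the k offset subsequences of x; the fractal dimension is
    the slope of log L(k) versus log (1/k).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean_lengths = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)                 # subsequence x[m], x[m+k], ...
            total = np.abs(np.diff(x[idx])).sum()    # sum of absolute increments
            norm = (n - 1) / ((len(idx) - 1) * k)    # Higuchi normalization factor
            lengths.append(total * norm / k)
        mean_lengths.append(np.mean(lengths))
    # Linear fit: slope of log L(k) against log(1/k) is the fractal dimension.
    ks = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(mean_lengths), 1)
    return slope
```

Such per-channel fractal dimension values would then serve as input features to a classifier (an SVM in this work); for a smooth, line-like signal the estimate is close to 1, while noisier, more complex signals yield values approaching 2.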