Facial Expression-Based Emotion Detection and Music Player using Convolutional Neural Network

Rahul Mannade, Ganesh Dongre, Sonu Ballal, Renuka Shinde, Pramod Dighole

Abstract

Music is a divine gift to humanity. It exists in many forms, and every individual has unique musical preferences. The choice of music often depends on a person's mood: when a person is happy, they tend to listen to cheerful, pleasant music, whereas in moments of sadness they may prefer melancholic songs. However, selecting music to match one's mood is usually a manual task. To address this, we propose a system that automatically recommends music according to the user's emotional state. The system captures an image of the user through a webcam and analyzes it to detect the user's mood: Angry, Sad, Happy, Neutral, or Surprised. Based on the detected emotion, the system plays a predefined playlist of songs corresponding to that mood. The proposed system was tested on desktop and laptop computers running the Windows operating system, and the experimental results show that it detects emotions correctly with an accuracy of 68.75%. The paper is organized as follows: Section 1 presents the introduction, Section 2 reviews related work, Section 3 describes the design of the proposed system, Section 4 discusses the test results and performance analysis, and Sections 5 and 6 present the conclusion and future scope, respectively.
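The capture-classify-play pipeline summarized in the abstract could be sketched as follows. This is a minimal illustration only, assuming Python with OpenCV for webcam capture and a trained Keras CNN; the song titles, model file name (`emotion_cnn.h5`), 48×48 input size, and helper names are all hypothetical assumptions, not taken from the paper.

```python
# Sketch of the mood-to-playlist pipeline described in the abstract.
# Playlist contents, model file, and input size are illustrative assumptions.

EMOTIONS = ["Angry", "Sad", "Happy", "Neutral", "Surprised"]

# Predefined playlist per detected emotion (placeholder file names).
PLAYLISTS = {
    "Angry":     ["soothing_1.mp3", "soothing_2.mp3"],
    "Sad":       ["melancholic_1.mp3", "melancholic_2.mp3"],
    "Happy":     ["cheerful_1.mp3", "cheerful_2.mp3"],
    "Neutral":   ["ambient_1.mp3", "ambient_2.mp3"],
    "Surprised": ["upbeat_1.mp3", "upbeat_2.mp3"],
}

def playlist_for(emotion):
    """Return the predefined playlist for a detected emotion."""
    if emotion not in PLAYLISTS:
        raise ValueError("unknown emotion: %s" % emotion)
    return PLAYLISTS[emotion]

if __name__ == "__main__":
    # Webcam capture and CNN inference; requires OpenCV and a trained
    # Keras model ('emotion_cnn.h5' is assumed, not provided here).
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    model = load_model("emotion_cnn.h5")  # hypothetical trained CNN
    cam = cv2.VideoCapture(0)             # default webcam
    ok, frame = cam.read()                # grab a single frame
    cam.release()
    if ok:
        # Preprocess: grayscale, resize to the CNN's input, normalize.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        face = cv2.resize(gray, (48, 48)).astype("float32") / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1))
        emotion = EMOTIONS[int(np.argmax(probs))]
        print("Detected:", emotion, "->", playlist_for(emotion))
```

The emotion-to-playlist lookup is kept separate from the capture/inference step so it can be exercised without a webcam or a trained model.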


How to Cite
Rahul Mannade. (2020). Facial Expression-Based Emotion Detection and Music Player using Convolutional Neural Network. International Journal on Recent and Innovation Trends in Computing and Communication, 8(9), 21–28. Retrieved from https://mail.ijritcc.org/index.php/ijritcc/article/view/11952