THE OXFORD COLLEGE OF ENGINEERING BOMMANAHALLI, BANGALORE 560068
DEPARTMENT OF ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
PROJECT TITLE:
FACIAL EMOTION RECOGNITION USING CNN
CONTENTS
Abstract
Introduction
Problem Statement
Objective
Literature Survey
Methodology
Hardware & Software Requirements
Screenshots of Execution
Conclusion
Future Enhancement
References
ABSTRACT
Face reading is one of the most important ways to understand human behavior: expressions often speak better than words, reflecting a person's perspective and mental state. The aim of this project is to detect faces in an image, extract facial
expressions, and classify them into emotions such as sadness, happiness, anger, disgust,
and neutrality. This project presents a facial emotion recognition technique based on
computer vision. The proposed approach uses a convolutional neural network (CNN)
in two stages: first, removal of the background from the image, and second, extraction
of facial features as an expression vector (EV). Applications include medical treatment,
teaching, police investigation, and human-robot interfaces.
INTRODUCTION
Facial Emotion Recognition (FER) is a subfield of affective computing that focuses on
automatically recognizing and interpreting human emotions from facial expressions.
FER has become increasingly important in various applications, including human-
computer interaction, healthcare, education, and marketing. This project proposes a
deep convolutional neural network (CNN) approach for facial emotion recognition
using transfer learning. The proposed approach fine-tunes pre-trained CNNs on
benchmark datasets. Experimental results demonstrate improved performance
compared to traditional machine learning approaches.
PROBLEM STATEMENT AND OBJECTIVES
PROBLEM STATEMENT:
• Build a prototype that automatically recognizes human emotions from facial expressions in images.
OBJECTIVE:
• Emotion Detection: Identify and classify human emotions from facial expressions.
• Emotion Classification: Categorize emotions into basic categories (e.g., happiness,
sadness, anger, fear, surprise, disgust).
• Emotion Intensity Estimation: Measure the intensity of emotions.
LITERATURE REVIEW TABLE
1. Zhang et al., 2020 - "Emotion Recognition Using Convolutional Neural Networks and Attention Mechanism"
   Objective: Propose a CNN-based model with an attention mechanism for FER.
   Key findings: Attention mechanisms improve emotion recognition by focusing on relevant facial regions.
   Relevance to FER: Enhances performance by improving focus on key facial features.

2. Mollahosseini et al., 2021 - "Facial Expression Recognition with Transfer Learning on FER2013"
   Objective: Use transfer learning for the FER task on the FER2013 dataset.
   Key findings: Transfer learning significantly reduces training time and improves model accuracy.
   Relevance to FER: Demonstrates how transfer learning can enhance FER efficiency and scalability.

3. Wu et al., 2023 - "Few-Shot Learning for Facial Emotion Recognition"
   Objective: Apply few-shot learning techniques to FER in resource-scarce environments.
   Key findings: Few-shot learning achieves high accuracy even with limited training samples.
   Relevance to FER: Addresses the challenge of training FER models with limited labeled data.

4. Jha et al., 2023 - "Real-Time Facial Emotion Recognition Using Deep Learning"
   Objective: Implement real-time FER on edge devices using deep learning models.
   Key findings: Achieves real-time FER with low latency and high accuracy.
   Relevance to FER: Important for deploying FER in real-time applications like surveillance and robotics.
METHODOLOGY
1. Data Collection: Use a labeled facial emotion dataset (e.g., FER-2013, AffectNet).
2. Preprocessing: Detect faces, normalize images, and remove backgrounds.
3. Feature Extraction: Use convolutional neural networks (CNNs) to extract key facial features (eyes, mouth, nose).
4. Expression Vector (EV) Generation: Capture facial movements and generate an expression vector representing emotional changes.
5. Emotion Classification: Apply a fully connected layer with softmax activation to classify emotions.
6. Training: Optimize with supervised learning and backpropagation.
7. Evaluation: Assess performance using metrics like accuracy, precision, and recall.
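As an illustrative sketch (not the trained project model), steps 3-5 above can be walked through end to end with NumPy: a single untrained convolution filter stands in for the CNN's feature extractor, the flattened feature map plays the role of the expression vector (EV), and a random fully connected layer with softmax produces emotion probabilities. All weights here are random, so the predicted label is meaningless; the point is the data flow.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["anger", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one filter."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fm, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h, w = fm.shape
    h, w = h - h % size, w - w % size
    return fm[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Steps 1-2: a stand-in for a detected, preprocessed 48x48 face crop.
face = rng.random((48, 48))

# Step 3: feature extraction with one untrained 3x3 filter, ReLU, 2x2 pooling.
features = max_pool(np.maximum(conv2d(face, rng.standard_normal((3, 3))), 0))

# Step 4: the expression vector (EV) is the flattened feature map.
ev = features.flatten()

# Step 5: fully connected layer + softmax over the 7 emotion classes.
logits = rng.standard_normal((len(EMOTIONS), ev.size)) @ ev
probs = softmax(logits)
print(EMOTIONS[int(np.argmax(probs))], round(float(probs.max()), 3))
```

In the real system, the convolution filters and the fully connected weights are learned by backpropagation (step 6) rather than drawn at random, and the softmax probabilities can also serve as a rough emotion-intensity estimate.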
1. Face detection
Locate and identify human faces in images or videos.
2. Preprocessing
Adjust the image's lighting, orientation, and scale, and crop
the image to focus on the face.
3. Feature extraction
Analyze specific points on the face, such as the eyes,
eyebrows, and mouth, to extract relevant features.
4. Classification
Use machine learning algorithms to analyze the extracted
features and classify them into emotions like anger, sadness, or
happiness.
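As a minimal illustration of stage 2, assuming the face bounding box has already been found by a detector in stage 1, the crop/scale/normalize step can be sketched with NumPy alone. The function name `preprocess_face` and the toy frame are hypothetical, introduced only for this example.

```python
import numpy as np

def preprocess_face(image, box, out_size=48):
    """Crop a detected face region, resize it by nearest-neighbour
    sampling, and normalize pixel intensities to [0, 1]."""
    x, y, w, h = box
    face = image[y:y + h, x:x + w].astype(np.float64)
    # Nearest-neighbour resize to out_size x out_size.
    rows = np.arange(out_size) * h // out_size
    cols = np.arange(out_size) * w // out_size
    face = face[rows][:, cols]
    # Compensate for lighting by rescaling intensities to [0, 1].
    lo, hi = face.min(), face.max()
    return (face - lo) / (hi - lo) if hi > lo else np.zeros_like(face)

img = np.arange(100 * 100).reshape(100, 100) % 256   # dummy grayscale frame
crop = preprocess_face(img, box=(20, 30, 40, 40))    # box = (x, y, w, h)
print(crop.shape, float(crop.min()), float(crop.max()))
```

A production pipeline would typically use OpenCV for the detection and resizing, but the operations themselves are the same: crop to the box, rescale to a fixed input size, and normalize intensities before the CNN sees the image.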
Hardware and Software Requirements
Hardware Requirements:
• Computer/workstation
• Processor (CPU)
• RAM: 8 GB
• Storage: 256 GB SSD
Software Requirements:
• Python: core language with extensive machine learning libraries
• OpenCV: comprehensive computer vision library
• Scikit-Image: image processing library
• Pillow: image processing library
Datasets:
• FER2013
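FER2013 is distributed as a single CSV file in which each row stores an emotion label (0-6) and the 48x48 grayscale face as space-separated pixel values. The sketch below uses a toy row in that layout to show one way a sample could be decoded with NumPy; the column names follow the published CSV, but treat the exact layout as an assumption to verify against your copy of the dataset.

```python
import csv
import numpy as np
from io import StringIO

# A toy stand-in for the real fer2013.csv: each row holds an emotion
# label (0-6) and 48*48 = 2304 space-separated grayscale pixel values.
toy_csv = StringIO(
    "emotion,pixels,Usage\n"
    "3," + " ".join(["128"] * (48 * 48)) + ",Training\n"
)

def load_fer2013(fh):
    """Yield (label, 48x48 float image scaled to [0, 1]) pairs
    from a FER2013-style CSV file handle."""
    for row in csv.DictReader(fh):
        pixels = np.fromiter(map(int, row["pixels"].split()),
                             dtype=np.uint8, count=48 * 48)
        yield int(row["emotion"]), pixels.reshape(48, 48) / 255.0

label, img = next(load_fer2013(toy_csv))
print(label, img.shape)
```

Scaling pixels to [0, 1] at load time matches the normalization assumed in the preprocessing stage above.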
Screenshots of Execution
References
[1] Tian, Y. L., Kanade, T., & Cohn, J. F. (2005). Facial expression analysis. In S. Z. Li & A. K. Jain (Eds.), Handbook of Face Recognition (pp. 247-275). Springer London.
[2] Li, T. H. S., Kuo, P. H., Tsai, T. N., & Luan, P. C. (2019). CNN and LSTM based facial expression analysis model for a humanoid robot. IEEE Access, PP(99), 1-1.
[3] Siqueira, H., Magg, S., & Wermter, S. (2020). Efficient facial feature learning with wide ensemble-based convolutional neural networks. arXiv.
[4] Nguyen, B. T., Trinh, M. H., Phan, T. V., & Nguyen, H. D. (2017). An efficient real-time emotion detection using camera and facial landmarks. Seventh International Conference on Information Science & Technology. IEEE.
[5] https://youtu.be/eDIj5LuIL4A?si=oq7cxRvRJK7GXnjb
[6] https://www.geeksforgeeks.org/face-recognition-using-artificial-intelligence/
Conclusion
In conclusion, facial emotion recognition (FER) holds significant potential in
enhancing human-computer interaction, security, healthcare, and various other
fields. Advancements in deep learning and multi-modal analysis are driving
improvements in accuracy and reliability, allowing for the detection of complex
and subtle emotions. However, challenges such as privacy concerns, biases in
data, and ethical implications remain critical. As FER technology continues to
evolve, addressing these issues will be crucial for its widespread and
responsible adoption. Ultimately, FER systems will play an essential role in
creating more emotionally aware, interactive, and empathetic technological
solutions.
Future Enhancement
Facial emotion recognition (FER) involves analyzing facial expressions to identify emotions,
a key component of human-computer interaction, security, and healthcare. Future
advancements in FER are likely to focus on improving accuracy through deep learning
techniques and large-scale datasets. These technologies will enable more nuanced
recognition of complex emotions, including micro-expressions. Additionally, multi-modal
approaches integrating facial data with voice and body language will enhance emotion
detection. Ethical considerations, including privacy concerns and bias reduction in FER
systems, will become critical. As AI continues to evolve, FER will play a central role in
creating emotionally intelligent systems.
ANY QUESTIONS?
THANK YOU