Facial Emotion Detection Using International Affective Picture System (IAPS)

Abstract

Recent neuroscience findings demonstrate the fundamental role of emotion in the


maintenance of physical and mental health. In the present study, a novel architecture is proposed
for the robust discrimination of emotional physiological signals evoked upon viewing pictures
selected from the International Affective Picture System (IAPS). Biosignals are multichannel
recordings from both the central and the autonomic nervous systems. Following the bidimensional
emotion theory model, IAPS pictures are rated along two dimensions, namely their valence and
arousal. Following this model, the biosignals in this paper are first differentiated along the
valence dimension by means of a data mining approach, the C4.5 decision tree
algorithm. The valence and gender information then serve as input to a Mahalanobis
distance classifier, which separates the data into high- and low-arousal classes. Results are described in
Extensible Markup Language (XML) format, thereby accounting for platform independency,
easy interconnectivity, and information exchange. The average recognition (success) rate was
77.68% for the discrimination of four emotional states, differing both in their arousal and valence
dimensions. It is, therefore, envisaged that the proposed approach holds promise for the efficient
discrimination of negative and positive emotions, and it is discussed how future
developments may be steered toward affective healthcare applications, such as the
monitoring of the elderly or the chronically ill.
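The second stage of the pipeline described above, the Mahalanobis distance classifier that splits the data into high- and low-arousal classes, can be sketched as follows. This is a minimal illustration in Python with made-up two-dimensional features and class statistics; the paper's actual biosignal features, C4.5 stage and XML output are not reproduced here.

```python
def mean_cov2(samples):
    """Mean vector and 2x2 covariance of a list of 2-D feature points."""
    n = len(samples)
    mx = sum(s[0] for s in samples) / n
    my = sum(s[1] for s in samples) / n
    sxx = sum((s[0] - mx) ** 2 for s in samples) / n
    syy = sum((s[1] - my) ** 2 for s in samples) / n
    sxy = sum((s[0] - mx) * (s[1] - my) for s in samples) / n
    return (mx, my), ((sxx, sxy), (sxy, syy))

def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance in 2-D, via the explicit 2x2 inverse."""
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (_, d) = cov
    det = a * d - b * b
    # inverse of the symmetric matrix [[a, b], [b, d]] is [[d, -b], [-b, a]] / det
    return (d * dx * dx - 2 * b * dx * dy + a * dy * dy) / det

def classify_arousal(x, high_stats, low_stats):
    """Assign a sample to the class whose mean is nearest in Mahalanobis distance."""
    d_high = mahalanobis2(x, *high_stats)
    d_low = mahalanobis2(x, *low_stats)
    return "high" if d_high < d_low else "low"

# Hypothetical training features; real ones would come from the biosignal recordings.
high = [(5.0, 5.0), (6.0, 4.0), (7.0, 6.0), (6.0, 5.0)]
low = [(1.0, 1.0), (2.0, 2.0), (1.0, 2.0), (2.0, 1.0)]
high_stats, low_stats = mean_cov2(high), mean_cov2(low)
print(classify_arousal((6.0, 5.0), high_stats, low_stats))  # "high"
```

In the paper's architecture, the valence label produced by the C4.5 tree and the subject's gender would be folded into the feature vector before this distance computation.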

Introduction
Emotions and affect are researched in various scientific disciplines such as neuroscience,
psychology and cognitive sciences. Development of automatic affect analysers depends
significantly on the progress in the aforementioned sciences. Hence, we start our analysis by
exploring the background in emotion theory, perception and recognition. According to research
in psychology, three major approaches to affect modelling can be distinguished [1]: the categorical,
dimensional and appraisal-based approaches. The categorical approach claims that there exists a
small number of emotions that are basic, hard-wired in our brain and recognised universally
(e.g., [2]). This theory on universality and interpretation of affective nonverbal expressions in
terms of basic emotion categories has been the most commonly adopted approach in research on
automatic measurement of human affect. However, a number of researchers have shown that in
everyday interactions people exhibit non-basic, subtle and rather complex affective states like
thinking, embarrassment or depression. Such subtle and complex affective states can be
expressed via dozens of anatomically possible facial and bodily expressions, audio or
physiological signals. Therefore, a single label (or any small number of discrete classes) may not
reflect the complexity of the affective state conveyed by such rich sources of information [3].
Hence, a number of researchers advocate the use of dimensional description of human affect,
where affective states are not independent from one another; rather, they are related to one
another in a systematic manner (e.g., [1], [3], [4], [5]). The most widely used dimensional model
is a circular configuration called Circumplex of Affect introduced by Russell [3]. This model is
based on the hypothesis that each basic emotion represents a bipolar entity that is part of the
same emotional continuum. The proposed poles are arousal (relaxed vs. aroused) and valence
(pleasant vs. unpleasant).
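As a small concrete illustration of the arousal–valence plane, the function below maps a (valence, arousal) pair to one of the four circumplex quadrants. The coordinate range and the example emotion labels are illustrative choices for this sketch, not terms taken verbatim from Russell's model.

```python
def circumplex_quadrant(valence, arousal):
    """Map (valence, arousal), each assumed in [-1, 1], to a coarse quadrant
    of the circumplex of affect. Labels are illustrative examples only."""
    if valence >= 0:
        return ("pleasant-aroused (e.g., excited)" if arousal >= 0
                else "pleasant-calm (e.g., relaxed)")
    return ("unpleasant-aroused (e.g., afraid)" if arousal >= 0
            else "unpleasant-calm (e.g., bored)")

print(circumplex_quadrant(0.8, 0.6))  # pleasant-aroused (e.g., excited)
```

This quadrant view is exactly what the two-stage pipeline in the abstract reconstructs: one classifier decides the valence sign, a second decides the arousal sign.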
Another well-accepted and commonly used dimensional description is the 3D emotional
space of pleasure – displeasure, arousal – nonarousal and dominance – submissiveness [4], at
times referred to as the PAD emotion space [6] or as emotional primitives [7]. To guarantee a
more complete description of affective colouring, some researchers include expectation (the
degree of anticipating or being taken unaware) as the fourth dimension [8], and intensity (how
far a person is away from a state of pure, cool rationality) as the fifth dimension (e.g., [9]).
Scherer and colleagues introduced another set of psychological models, referred to as
componential models of emotion, which are based on the appraisal theory [1], [5], [8]. In the
appraisal-based approach emotions are generated through continuous, recursive subjective
evaluation of both our own internal state and the state of the outside world (relevant
concerns/needs) [1], [5], [8], [10]. Despite the pioneering efforts of Scherer and colleagues (e.g.,
[11]), how to use the appraisal-based approach for automatic measurement of affect remains an open
research question, as this approach requires complex, multicomponential and sophisticated
measurements of change. One possibility is to reduce the appraisal models to dimensional
models (e.g., 2D space of arousal-valence). Ortony and colleagues proposed a computationally
tractable model of the cognitive basis of emotion elicitation, known as OCC [12]. OCC is now
established as a standard (cognitive appraisal) model for emotions, and has mostly been used in
affect synthesis (in embodied conversational agent design, e.g., [13]). Despite the existence of
diverse affect models, the search for an optimal low-dimensional representation of affect, for
analysis and synthesis and for each modality or cue, remains an open problem.

EXISTING SYSTEM:

Machine analysis of facial expressions has been an active research topic for the last two
decades.
Most existing work has focused on analyzing a set of prototypic emotional facial
expressions, using data collected by asking subjects to deliberately pose these expressions.

DISADVANTAGES OF EXISTING SYSTEM:

 Analyzing spontaneous, non-posed expressions is a challenging problem
 It is very difficult to capture the complex decision boundary among spontaneous
expressions
 Very limited data were used in these studies

PROPOSED SYSTEM:

In this paper, we focus on smile detection in face images captured in real-world scenarios.
We present an efficient approach to emotion detection, in which the intensity
differences between pixels in grayscale face images are used as simple features.
The International Affective Picture System (IAPS) is then adopted to choose and combine
weak classifiers based on pixel differences to form a strong classifier for smile detection.
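The pixel-difference scheme above can be sketched as follows: each weak classifier thresholds the intensity difference of one pixel pair, and the strong classifier is a weighted vote over many such pairs. The pixel pairs, thresholds, polarities and weights shown are placeholders; in practice they would be selected by a boosting-style training procedure on labeled smile/non-smile images.

```python
def pixel_diff(img, p1, p2):
    """Feature: grayscale intensity difference between two pixel positions.
    img is a 2-D list of pixel intensities (rows of the face image)."""
    return img[p1[0]][p1[1]] - img[p2[0]][p2[1]]

def weak_classify(img, pair, threshold, polarity):
    """Vote +1 (smile) or -1 (non-smile) by thresholding one pixel-pair difference."""
    return polarity if pixel_diff(img, *pair) > threshold else -polarity

def strong_classify(img, learners):
    """Weighted vote over weak classifiers; each learner is
    (pixel_pair, threshold, polarity, weight)."""
    score = sum(w * weak_classify(img, pair, t, pol)
                for pair, t, pol, w in learners)
    return 1 if score >= 0 else -1

# Hypothetical learners; in practice these come from training, not by hand.
learners = [(((0, 0), (1, 1)), 30, 1, 0.8),
            (((0, 1), (1, 0)), 10, 1, 0.5)]
face = [[200, 90],
        [60, 40]]  # toy 2x2 grayscale "image"
print(strong_classify(face, learners))  # 1 (smile)
```

The appeal of this feature is its cost: each weak classifier needs only one subtraction and one comparison, which is why accuracy figures are reported per number of pixel pairs examined.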

ADVANTAGES OF PROPOSED SYSTEM:

 Experimental results show that our approach achieves similar accuracy to the state-of-
the-art method but is significantly faster.

 Our approach provides 85% accuracy by examining 20 pairs of pixels and 88% accuracy
with 100 pairs of pixels.

Software Requirements:

 Operating system : Windows 7
 Front-end : .NET Framework
 Back-end : SQL Server 2008

Hardware Requirements:

 System : Intel Core i3 processor
 Hard disk : 500 GB
 RAM : 4 GB
