Facial Emotion Detection Using International Affective Picture System (IAPS)
Introduction
Emotions and affect are researched in various scientific disciplines such as neuroscience,
psychology and cognitive science. The development of automatic affect analysers depends
significantly on progress in these sciences. Hence, we start our analysis by
exploring the background in emotion theory, perception and recognition. According to research
in psychology, three major approaches to affect modelling can be distinguished [1]: categorical,
dimensional and appraisal-based. The categorical approach claims that there exists a
small number of emotions that are basic, hard-wired in our brain and recognised universally
(e.g., [2]). This theory on universality and interpretation of affective nonverbal expressions in
terms of basic emotion categories has been the most commonly adopted approach in research on
automatic measurement of human affect. However, a number of researchers have shown that in
everyday interactions people exhibit non-basic, subtle and rather complex affective states like
thinking, embarrassment or depression. Such subtle and complex affective states can be
expressed via dozens of anatomically possible facial and bodily expressions, audio or
physiological signals. Therefore, a single label (or any small number of discrete classes) may not
reflect the complexity of the affective state conveyed by such rich sources of information [3].
Hence, a number of researchers advocate the use of dimensional description of human affect,
where affective states are not independent from one another; rather, they are related to one
another in a systematic manner (e.g., [1], [3], [4], [5]). The most widely used dimensional model
is a circular configuration called the Circumplex of Affect, introduced by Russell [3]. This model is
based on the hypothesis that each basic emotion represents a bipolar entity that is part of the
same emotional continuum. The proposed poles are arousal (relaxed vs. aroused) and valence
(pleasant vs. unpleasant).
Another well-accepted and commonly used dimensional description is the 3D emotional
space of pleasure – displeasure, arousal – nonarousal and dominance – submissiveness [4], at
times referred to as the PAD emotion space [6] or as emotional primitives [7]. To guarantee a
more complete description of affective colouring, some researchers include expectation (the
degree of anticipating or being taken unaware) as the fourth dimension [8], and intensity (how
far a person is away from a state of pure, cool rationality) as the fifth dimension (e.g., [9]).
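The dimensional models described above can be made concrete as coordinates. The following Python sketch (our own illustration, not part of any cited work) represents an affective state as a point in the 2D valence-arousal plane or the 3D PAD space; the class names and the example coordinates are hypothetical placements, not empirical values.

```python
from dataclasses import dataclass

@dataclass
class AffectPoint:
    """A point in the valence-arousal plane (Russell's Circumplex).

    Both coordinates are normalised to [-1, 1]: valence spans
    unpleasant (-1) to pleasant (+1); arousal spans relaxed (-1)
    to aroused (+1).
    """
    valence: float
    arousal: float

@dataclass
class PADPoint(AffectPoint):
    """Extends the 2D point with the dominance axis of the PAD model
    (submissive -1 to dominant +1)."""
    dominance: float

def quadrant(p: AffectPoint) -> str:
    """Name the circumplex quadrant a point falls into."""
    v = "pleasant" if p.valence >= 0 else "unpleasant"
    a = "aroused" if p.arousal >= 0 else "relaxed"
    return f"{v}/{a}"

# Illustrative (made-up) placements of two emotions in each space.
happy_2d = AffectPoint(valence=0.8, arousal=0.5)
fear_3d = PADPoint(valence=-0.7, arousal=0.8, dominance=-0.6)
```

Extending the 2D point by subclassing mirrors how the PAD model extends the circumplex: the same valence and arousal axes are kept and only the dominance axis is added.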
Scherer and colleagues introduced another set of psychological models, referred to as
componential models of emotion, which are based on the appraisal theory [1], [5], [8]. In the
appraisal-based approach emotions are generated through continuous, recursive subjective
evaluation of both our own internal state and the state of the outside world (relevant
concerns/needs) [1], [5], [8], [10]. Despite pioneering efforts of Scherer and colleagues (e.g.,
[11]), how to use the appraisal-based approach for automatic measurement of affect is an open
research question as this approach requires complex, multicomponential and sophisticated
measurements of change. One possibility is to reduce the appraisal models to dimensional
models (e.g., 2D space of arousal-valence). Ortony and colleagues proposed a computationally
tractable model of the cognitive basis of emotion elicitation, known as OCC [12]. OCC is now
established as a standard (cognitive appraisal) model for emotions, and has mostly been used in
affect synthesis (in embodied conversational agent design, e.g., [13]). Despite the existence of
diverse affect models, the search for an optimal low-dimensional representation of affect, for
analysis and synthesis, and for each modality or cue, remains an open problem.
EXISTING SYSTEM:
Machine analysis of facial expressions has been an active research topic for the last two
decades.
Most existing work has focused on analysing a set of prototypic emotional facial
expressions, using data collected by asking subjects to deliberately pose these expressions.
PROPOSED SYSTEM:
In this paper, we focus on smile detection in face images captured in real-world scenarios.
We present an efficient approach to emotion detection, in which the intensity
differences between pixels in grayscale face images are used as simple features.
The International Affective Picture System (IAPS) is then adopted to choose and combine
weak classifiers based on pixel differences to form a strong classifier for smile detection.
Experimental results show that our approach achieves accuracy similar to the state-of-
the-art method but is significantly faster.
Our approach achieves 85% accuracy by examining 20 pairs of pixels and 88% accuracy
with 100 pairs of pixels.
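The pipeline described above, pixel-difference features fed to weak classifiers that are combined into a strong classifier, can be sketched as follows. This is a minimal illustration under our own assumptions: the function names, the threshold form of the weak classifier, and the boosting-style weighted vote are hypothetical, not the paper's actual implementation.

```python
def pixel_diff(image, p1, p2):
    """Pixel-difference feature: intensity at p1 minus intensity at p2.

    `image` is a 2D list of grayscale values; p1 and p2 are (row, col)
    coordinates of the pixel pair.
    """
    (r1, c1), (r2, c2) = p1, p2
    return image[r1][c1] - image[r2][c2]

def weak_classify(image, p1, p2, threshold, polarity):
    """Weak classifier on one pixel pair: vote +1 (smile) or -1
    (non-smile) depending on whether the difference exceeds the
    threshold; `polarity` (+1 or -1) flips the direction of the test."""
    return polarity if pixel_diff(image, p1, p2) > threshold else -polarity

def strong_classify(image, weak_learners):
    """Strong classifier: weighted vote over the selected weak learners,
    in the style of boosting. `weak_learners` is a list of
    (alpha, p1, p2, threshold, polarity) tuples, where alpha is the
    weight assigned to that weak classifier during training."""
    score = sum(alpha * weak_classify(image, p1, p2, t, pol)
                for alpha, p1, p2, t, pol in weak_learners)
    return 1 if score >= 0 else -1
```

Because each feature is just a subtraction of two pixel values, evaluating a strong classifier built from 20 or 100 pairs costs only that many subtractions, comparisons, and weighted additions per image, which is consistent with the speed claim above.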
Software Requirements:
Hardware Requirements: