Empathic Computing in VR: A Bio-Sensing Approach
Kunal Gupta
IVE 2024 Short Course Lecture 9 - Empathic Computing in VR
● VR Needs to Understand Users' Emotions and Individual Differences
● VR Needs to Adapt Based on Psycho-Physiological Responses
● Trustworthy Virtual Agents Need to Respond Empathetically
● Empathic Systems Should Be Context-Aware
What is an Emotion? 5
Emotions are Real, but Not Objective
Emotions are Real in the Same Sense as Money is Real
A conscious mental reaction subjectively experienced as a strong feeling, usually directed toward a specific object and typically accompanied by physiological and behavioral changes in the body.
Source: Verywell
10
The Mind Explained
11
Emotions Are Universal
— Charles Darwin, "The Expression of the Emotions in Man and Animals"
How do you distinguish one emotion from another?
13
DISCRETE EMOTIONS
Paul Ekman's Basic Emotions
● Sadness
● Contempt
● Anger
● Happiness
● Fear
● Disgust
14
DISCRETE EMOTIONS
Robert Plutchik's "Wheel of Emotions"
● Joy vs Sadness
● Anger vs Fear
● Trust vs Disgust
● Surprise vs Anticipation
15
CONTINUOUS EMOTIONS
Russell's Circumplex Model
● Valence
● Arousal
How do we measure emotions?
18
CONTINUOUS EMOTIONS
● Valence
● Arousal
● Dominance
Self-Assessment Manikin (SAM)
FACIAL ACTION CODING SYSTEM (FACS)
Image Source: iMotions
Cheek Raiser + Lip Corner Puller = Happiness/Joy
Brow Lowerer + Lip Corner Depressor = Sadness
Image Source: Microsoft
Camera, Facial Electromyography (fEMG)
Image Source: g.Nautilus, g.tec
Brainwaves (EEG), Electrodermal Activity, Heart Rate Variability, Eye Tracker, Eye Tracker for VR, Speech, Respiration
What Emotion Do You See in the Face?
22
Source: "How Emotions Are Made" - Lisa Feldman Barrett
23
What Emotion Do You See in her Face Now?
24
Source: "How Emotions Are Made" - Lisa Feldman Barrett
25
How can context-aware and empathic
VR systems be developed to accurately
assess and respond to users’
emotional states?
RQ 3
26
How can Context-Aware Empathic Interactions (CAEIxs) in VR
environments enhance the emotional and cognitive aspects of UX?
Empathy is "an affective response more appropriate to another's situation than one's own."
— Martin L. Hoffman
30
Model of Empathy
PAM Model of Empathy: Perception + Action (de Waal, 2008)
● Empathic Appraisal / Cognitive Empathy: detects user emotions, appraises event causes through self-projection, and forms an empathic emotion.
● Empathic Response: involves adapting and expressing responses to the perceived empathic emotion.
31
Empathic Appraisal
● AffectivelyVR (Physiological Emotions): a real-time VR emotion recognition system using EEG, EDA, and HRV with up to 96.1% generalised accuracy, enabling CAEIxs.
● Self-Projected Appraisal (SPA) [Situational Emotions]: involves projecting oneself into the user's situation to understand their emotional state and appraise the event that caused the emotional response. Mainly responsible for providing contextual understanding.
● Empathic Emotion Features (EEFs): provide a real-time blend of AffectivelyVR's physiological emotions and SPA's situational emotions to guide empathic VR interactions.
Image Source: g.Nautilus, g.tec
AffectivelyVR: System Setup
AffectivelyVR: Results
AffectivelyVR: Learnings
● Brainwaves (EEG), Heart Rate Variability (HRV) and Skin
Sweat Responses (EDA) can sense emotional states in VR
● The AffectivelyVR framework can be used to design an emotion recognition system
● Machine Learning Classification techniques such as SVM and
KNN can be implemented to model personalized emotions
with up to 96.5% cross-validation accuracy.
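For illustration, a minimal sketch of how per-user SVM and KNN emotion classifiers could be cross-validated on pre-extracted EEG/EDA/HRV feature windows is given below; the feature matrix, labels, and hyperparameters are placeholders, not the values used in AffectivelyVR.

```python
# Hypothetical sketch: personalised emotion classification with SVM and KNN.
# X: one row per analysis window of pre-extracted EEG/EDA/HRV features,
# y: emotion labels (1=Happy, 2=Stress, 3=Sad, 4=Relax) -- synthetic here.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))      # placeholder feature matrix
y = rng.integers(1, 5, size=200)    # placeholder labels

for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    pipe = make_pipeline(StandardScaler(), clf)   # scale features per fold
    scores = cross_val_score(pipe, X, y, cv=5)    # 5-fold cross-validation
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```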
Self-Projected Appraisal
● Emulates user perspective to understand needs & desires.
● Contextual understanding defines interaction states.
● Goal: Elicit situational emotions (SE) based on user activity & goals.
● Examples:
○ Navigation: Stress if insufficient time to reach destination; happiness if on track.
○ Photo-taking: Sadness if insufficient photos; happiness upon capturing unique perspectives.
● Uses contextual appraisal rules:
○ Navigation: Aim to visit maximum monuments; elicits emotions based on estimated time (ET) vs.
remaining journey time (RJT).
○ Photo-taking: Emphasizes multiple shots for diverse perspectives; emotions based on remaining
photos (RP) and monuments left (ML).
36
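A minimal illustration of how such appraisal rules might be encoded is sketched below; the thresholds, argument names (ET, RJT, RP, ML), and returned labels are assumptions for illustration, not the study's exact rule table.

```python
# Hypothetical sketch of Self-Projected Appraisal (SPA) rules that map
# contextual variables to a situational emotion (SE). Thresholds are
# illustrative assumptions, not the study's exact rules.

def appraise_navigation(et_minutes: float, rjt_minutes: float) -> str:
    """Compare estimated time needed (ET) with remaining journey time (RJT)."""
    if et_minutes > rjt_minutes:
        return "stress"   # not enough time left to reach the monument
    return "happy"        # on track

def appraise_photo_taking(remaining_photos: int, monuments_left: int) -> str:
    """Compare remaining photos (RP) with monuments left to capture (ML)."""
    if remaining_photos < monuments_left:
        return "sad"      # cannot photograph every remaining monument
    return "happy"        # enough photos for diverse perspectives

if __name__ == "__main__":
    print(appraise_navigation(et_minutes=4.0, rjt_minutes=2.5))          # stress
    print(appraise_photo_taking(remaining_photos=1, monuments_left=3))   # sad
```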
Empathic Emotion Features
● Goal: Guide interactions based on user's emotional & contextual info.
● Categories:
○ Empathic Emotion (EE): Reflects user's current emotion (PE or SE). Enhances emotional connection &
bonding.
○ Comfort & Reassurance (CR): Provides emotional support and validation, promoting well-being.
○ Motivation & Support (MS): Encourages action and goal-pursuit.
○ Positive Reinforcement & Encouragement (PRE): Boosts self-esteem and confidence.
○ Urgency & Pressure (UP): Drives immediate action when needed.
37
Empathic Emotion Features
● Selection Criteria:
○ Positive SEs use PE; Negative SEs use SE for EE.
○ CR, MS, PRE, UP: Boolean-based (1 = focus on category).
● Note: the EEF table presented in this research was designed logically, considering the features involved in context-relevant empathic interactions.
38
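As an illustration of the selection rule, the sketch below derives the Empathic Emotion (EE) and the boolean category flags from a physiological emotion (PE) and a situational emotion (SE); the specific flag combinations are illustrative assumptions, since the full EEF table is not reproduced here.

```python
# Hypothetical sketch of EEF selection. EE follows the stated rule
# (positive SE -> use PE, negative SE -> use SE); the boolean flags for
# CR/MS/PRE/UP are illustrative, not the study's full EEF table.

POSITIVE = {"happy", "relax"}

def select_eefs(pe: str, se: str) -> dict:
    ee = pe if se in POSITIVE else se          # Empathic Emotion rule
    return {
        "EE": ee,
        "CR": int(se in {"sad", "stress"}),    # comfort & reassurance
        "MS": int(se == "sad"),                # motivation & support
        "PRE": int(se in POSITIVE),            # positive reinforcement
        "UP": int(se == "stress"),             # urgency & pressure
    }

if __name__ == "__main__":
    print(select_eefs(pe="happy", se="stress"))  # negative SE -> EE = "stress"
    print(select_eefs(pe="relax", se="happy"))   # positive SE -> EE = "relax" (PE)
```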
Empathic Response Strategies
39
● Emotion Contagion: the tendency to automatically mimic and synchronise responses with another person's emotional states.
● Emotion Regulation: the process of managing one's own emotional reactions to better respond to a situation.
Empathic Response
40
● Empathic Tone: enhances emotional understanding, builds trust, and bolsters system credibility [Breazeal et al., Rosenzweig et al.].
● Empathic Dialogue: in relational virtual agents, acknowledges user emotions, enhancing trust and rapport.
Empathic Tone
● Background: Neural network-based speech emotion recognition explored in
prior research [104]. Gap identified: No specific research on the best TTS
tone style as an empathic response for each emotional state.
● Approach: Azure TTS utilized to enable a virtual agent to express emotion through voice. The Empathic Dialogues dataset was chosen for its range of emotion-laden conversations [Rashkin et al.].
● Goal: Identify the most empathetic tone for four key emotional states.
● Examples:
○ For the "Happy" emotion, conversations from the dataset tagged as
'happy' were used.
○ For the "Stress" emotion, conversations labeled 'stress' were selected,
and so on.
41
Empathic Tone
● Pilot Study Method:
○ Conversations played with multiple tones such as 'angry', 'cheerful', and 'sad' using the "en-US-JennyNeural" voice.
○ Six participants rated empathy levels post-listening on a 5-point Likert scale.
○ Results mapped on a 2D Valence-Arousal plot to determine the best tone for each emotion.
42
● Findings: 'Excited' was the most empathetic for Happy,
'empathetic' for Stress, 'hopeful' for Sadness, and
'whispering' for Relaxation.
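The exact SSML used in the study is not shown here, but a minimal sketch of how the emotion-to-tone mapping from these findings could be rendered as Azure-style SSML (mstts:express-as) is below; the dialogue text and the fallback style are assumptions.

```python
# Hypothetical sketch: render the emotion -> tone mapping from the pilot
# study as SSML for a neural TTS voice. The dialogue text is illustrative.

EMOTION_TO_TONE = {          # from the pilot-study findings above
    "happy": "excited",
    "stress": "empathetic",
    "sad": "hopeful",
    "relax": "whispering",
}

def build_ssml(text: str, emotion: str, voice: str = "en-US-JennyNeural") -> str:
    style = EMOTION_TO_TONE.get(emotion, "friendly")   # assumed fallback style
    return (
        '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" '
        'xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="en-US">'
        f'<voice name="{voice}">'
        f'<mstts:express-as style="{style}">{text}</mstts:express-as>'
        '</voice></speak>'
    )

if __name__ == "__main__":
    print(build_ssml("It's okay, take a moment. I'm here to help.", "stress"))
```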
Empathic Dialogue
● Objective: Design virtual agents that offer contextually relevant empathic responses.
● Incorporating Empathy: Use Empathic Emotion Features (EEFs).
● Role: Guides the selection of empathic dialogues tailored to user emotions & context.
● Past Approaches:
○ Rule-based state machine systems [187]
○ AI techniques with deep learning [245]
● Our Structure: Context-Aware Empathic Assistance (CAEA)
○ CAEA = Empathic Dialogue + Assistance + UP (if Applicable)
43
Empathic Dialogue
● EEFs Implementation:
○ Based on: Type of assistance, user emotions
(both physiological & situational), desired
empathic response (e.g., CR, MS, PRE, UP).
● Example (Photo-taking):
○ Scenario: User is happy with sufficient photos
remaining.
○ EEFs Suggestion: Use PRE feature.
○ Response: “Splendid shot! You have 7 photos
left. Do you want to save this?”
45
● Example (Navigation):
○ Scenario: The user is stressed in a relaxed situation.
○ EEFs Suggestion: Use CR feature.
○ Response: "It's okay, take a moment. I'm here to help. Walk towards the right."
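To show how such responses might be assembled, the sketch below composes a Context-Aware Empathic Assistance (CAEA) utterance from an empathic dialogue opener, the assistance content, and an optional urgency cue, following CAEA = Empathic Dialogue + Assistance + UP (if applicable); the dialogue templates are hypothetical, not the study's dialogue set.

```python
# Hypothetical sketch of CAEA utterance assembly:
# CAEA = Empathic Dialogue + Assistance (+ urgency cue if UP is set).
# The dialogue templates below are illustrative placeholders.

DIALOGUE_OPENERS = {
    ("happy", "PRE"): "Splendid shot!",
    ("stress", "CR"): "It's okay, take a moment. I'm here to help.",
    ("sad", "MS"): "Don't worry, there is still time to get a great photo.",
}

def build_caea(ee: str, eefs: dict, assistance: str) -> str:
    # Pick the first active empathic category and its opener, if any.
    opener = ""
    for category in ("CR", "MS", "PRE"):
        if eefs.get(category):
            opener = DIALOGUE_OPENERS.get((ee, category), "")
            break
    urgency = " Please hurry." if eefs.get("UP") else ""
    return " ".join(part for part in (opener, assistance) if part) + urgency

if __name__ == "__main__":
    eefs = {"EE": "stress", "CR": 1, "UP": 0}
    print(build_caea("stress", eefs, "Walk towards the right."))
```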
● Environmental Colors - Emotions Relationship:
Research shows specific colors like orange evoke
tenderness, purple incites anguish, bright/saturated
colors are pleasant, green induces relaxation, and
yellow promotes feelings of joy and empathy.
● Reappraisal Technique: Used the "emotional
awareness and acceptance" technique, promoting
mindfulness of emotions and enhancing
understanding and regulation.
● Color Assignments: Based on insights, PVPA (Happy)
was linked with Yellow, NVPA (Stress) with Red, NVNA
(Sad) with Blue, and PVNA (Relax) with Green.
Emotion-Adaptive Responses
46
● Implementation of EAR:
○ Texture Application: Used a texture on the
User's 'MainCamera', simulating the feeling of
wearing coloured sunglasses.
○ Emotion-Adaptive Cues: Focused on color
adaptation for emotion cues rather than
empathic ones.
○ Color Manipulation: Adjusted the texture's color
based on the physiological emotion (PE) derived
from AffectivelyVR, following established color
assignments.
Emotion-Adaptive Responses
47
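For illustration, the established color assignments could be held in a simple lookup that the camera-overlay tint is driven from; the RGBA values and overlay alpha below are assumptions, since the study only specifies the color names per emotion quadrant.

```python
# Hypothetical sketch of the emotion -> overlay color lookup used for
# emotion-adaptive cues. RGBA values and alpha are illustrative assumptions.

EMOTION_COLOR = {
    "happy":  (1.00, 0.92, 0.00, 0.15),  # PVPA -> Yellow
    "stress": (1.00, 0.00, 0.00, 0.15),  # NVPA -> Red
    "sad":    (0.00, 0.35, 1.00, 0.15),  # NVNA -> Blue
    "relax":  (0.00, 0.80, 0.20, 0.15),  # PVNA -> Green
}

def overlay_color(physiological_emotion: str):
    """Return the RGBA tint to apply to the camera overlay texture."""
    # Fall back to a fully transparent overlay if the emotion is unknown.
    return EMOTION_COLOR.get(physiological_emotion, (0.0, 0.0, 0.0, 0.0))

if __name__ == "__main__":
    print(overlay_color("stress"))   # red, low-alpha tint
```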
VRdoGraphy
Context: Positive Psychotherapy
VR Photography competition where
participants will try their best to capture 8
Monuments.
● Photowalks: 4
● Monuments/ walk: 2
● Time per walk: 7 minutes
The top 3 photos will win a $20 Westfield voucher.
Each participant will receive a postcard of
their favorite VR photo
Experimental Procedure
Start → Pre-Study Procedure (20 minutes) → 4 × [Rest (1 minute) → Condition (10 minutes) → Questionnaires (4 minutes)] → Post-Study Procedure (10 minutes) → Finish
Total Duration: ~90 minutes
System Architecture
53
Biosignal Manager
54
● Core Functionality: Gathers and streams biosignals (16-channel EEG, EDA, PPG) during VR
sessions using the LabStreaming Layer (LSL) protocol and LSL4Unity API.
● Sampling Rates: EEG sampled at 125 Hz; EDA and PPG at 128 Hz.
● Data Synchronization: Implements event markers like start, stop, CFA, and photo capture
within Unity to address asynchronous data collection risks.
● Enhanced Features: Captures timestamps, predicted emotional states from AffectivelyVR,
and stores data in CSV for offline analysis.
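As a rough illustration of the LSL side of this component, the Python sketch below resolves an EEG stream, pulls chunks with timestamps, and pushes event markers; the stream names, marker labels, and use of pylsl are assumptions, since the study's implementation runs inside Unity via LSL4Unity.

```python
# Hypothetical sketch of LSL-based biosignal capture and event marking
# (the study uses LSL4Unity inside Unity; this standalone sketch uses pylsl).
from pylsl import StreamInlet, StreamInfo, StreamOutlet, resolve_byprop

# Resolve an EEG stream on the network and open an inlet.
streams = resolve_byprop("type", "EEG", timeout=5.0)
if not streams:
    raise RuntimeError("no EEG stream found on the network")
inlet = StreamInlet(streams[0])

# Outlet for event markers (e.g., start, stop, photo capture).
marker_info = StreamInfo("VRdoGraphyMarkers", "Markers", 1, 0, "string", "marker-uid")
markers = StreamOutlet(marker_info)

markers.push_sample(["start"])
for _ in range(10):
    # Pull a chunk of EEG samples together with LSL timestamps.
    samples, timestamps = inlet.pull_chunk(timeout=1.0)
    if timestamps:
        print(f"received {len(timestamps)} EEG samples")
markers.push_sample(["stop"])
```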
● Mapping Emotions in VR Photowalk Experience:
○ Happy (PVPA): Linked with joy and excitement, perhaps from capturing an exceptional photograph or
the thrill of potentially winning.
○ Relaxation (PVNA): Representing a contented calm, possibly after securing a satisfactory shot or
immersing in the VR scenery.
○ Stress (NVPA): Reflecting anxiety, possibly due to not getting the ideal shot, feeling time-constrained,
or facing competition.
○ Sad (NVNA): Indicating disappointment, like missing a photo opportunity or not performing as hoped.
● Continuous to Discrete Emotion Integration:
○ While emotions inherently span a continuous spectrum, discrete categorizations simplify data
interpretation, user feedback, and system interventions in VR.
○ The chosen discrete emotions (Happy, Relaxation, Stress, Sad) are contextually relevant to the VR
photography contest experience and its rewards.
AffectivelyVR
● Emotion Classification:
○ The ‘AffectivelyVR ML’ model predicts emotions like ‘happy’ (PVPA), ‘stress’ (NVPA), ‘sad’ (NVNA),
and ‘relaxation’ (PVNA) with 83.6% cross-validation accuracy.
○ These emotions were tailored for the VR photography experience of the study.
● Data Collection & Storage:
○ Biosignal data is collected from the Biosignal Manager system via LSL.
○ Data is initially buffered for 60 seconds (7500 EEG samples & 7680 EDA/PPG samples).
● Data Processing & Overlapping Window:
○ After reaching capacity, the first 30 seconds of data is deleted, ensuring a continuous 60-
second window with a 30-second overlap.
○ Predicts emotional states every 30 seconds after the initial 60 seconds.
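A minimal sketch of the 60-second buffer with a 30-second overlap is given below; predict() is a stand-in for the AffectivelyVR preprocessing, feature extraction, and classification pipeline, and the sample counts follow the rates stated earlier (125 Hz EEG, 128 Hz EDA/PPG).

```python
# Hypothetical sketch of the overlapping-window buffer: keep 60 s of data,
# classify, then drop the oldest 30 s so consecutive windows overlap by 30 s.
import numpy as np

EEG_RATE, EDA_RATE = 125, 128          # Hz, as stated above
WINDOW_S, STEP_S = 60, 30              # 60 s window, 30 s hop
EMOTION_CODES = {1: "Happy", 2: "Stress", 3: "Sad", 4: "Relaxation"}

def predict(eeg_window: np.ndarray, eda_window: np.ndarray) -> int:
    """Placeholder for preprocessing + feature extraction + ML classification."""
    return 1  # e.g., Happy

eeg_buf = np.empty((0, 16))            # 16-channel EEG
eda_buf = np.empty((0, 1))

def on_new_samples(eeg_chunk: np.ndarray, eda_chunk: np.ndarray):
    global eeg_buf, eda_buf
    eeg_buf = np.vstack([eeg_buf, eeg_chunk])
    eda_buf = np.vstack([eda_buf, eda_chunk])
    if len(eeg_buf) >= WINDOW_S * EEG_RATE:            # 7500 EEG samples
        code = predict(eeg_buf[:WINDOW_S * EEG_RATE],
                       eda_buf[:WINDOW_S * EDA_RATE])  # 7680 EDA/PPG samples
        print("predicted:", code, EMOTION_CODES[code]) # streamed back to Unity
        eeg_buf = eeg_buf[STEP_S * EEG_RATE:]          # drop oldest 30 s
        eda_buf = eda_buf[STEP_S * EDA_RATE:]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for _ in range(45):  # simulate 90 s of incoming 2-second chunks
        on_new_samples(rng.normal(size=(2 * EEG_RATE, 16)),
                       rng.normal(size=(2 * EDA_RATE, 1)))
```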
AffectivelyVR
● Emotion Prediction & Streaming:
○ Data undergoes preprocessing and feature extraction, then is classified against the AffectivelyVR ML model.
○ Predicted emotional states are streamed back to Unity with classifications: 1 for Happy, 2 for
Stress, 3 for Sad, and 4 for Relaxation.
AffectivelyVR
Companion Manager
58
● Functionality & Activation: Manages companion interactions, offers voice assistance,
oversees photos, and aids in time management; only activates upon assistance request.
● Contextual Data Collection: Extracts and streams navigation data like direction, journey
duration, on-time status, and photo details to the Companion Adaptor via LSL.
● Companion Visualization & Speech: Portrayed as an orb-shaped avatar with speech
enabled by Microsoft Azure's TTS SDK; allows customization and integration with SSML for
nuanced speech playback in Unity.
● Trigger Points & Management: Companion activates at events like photo capture, the halfway time mark, the last minute, and the final ten seconds. Tracks and manages photo count and storage locally.
Companion Adaptor
59
● System Foundation:
○ Developed using Python 3.7.
○ Aim: Collect context from the Companion Manager system, determine relevant TTS
tone style, empathic dialogue, and assistance content, and then produce the SSML file
for the TTS SDK.
● Data Collection and Emotional Analysis:
○ Used LabStreaming Layer (LSL) to gather contextual data.
○ Derived situational emotion (SE) as per Self-Projected Appraisal (SPA)
● Merged current emotional state (Physiological Emotion) with SE to generate Empathic
Emotional Features (EEFs) as per guidelines in EEF Table.
● Used Empathic Emotion (EE) to pick the right tone style and Empathic Dialogue
Companion Adaptor
60
● Assistance Content Determination:
○ Content set based on activity type and given info. (e.g., For navigation, "Walk towards Right" if
direction to next monument is "Right").
○ For photo activities, it provides photo count details (e.g., "You have 5 photos Left. Do you wish to
save this photo?").
● In CAE systems, the empathic dialogue is combined with the assistance content; in non-CAE systems, only the assistance content is delivered.
● File Generation and Communication:
○ Generated an SSML file with the chosen tone and content.
○ Once file creation is complete, an acknowledgment is sent to Unity's Companion Manager system
through LSL.
RQ 3
● 2x2 within-subject study with 14 participants (7 Female)
● Independent Variables
○ Emotion-Adaptive Environment (EA)
○ Empathic Companion (EC)
● Dependent Variables
○ Physiology: Electroencephalogram (EEG), Electrodermal Activity (EDA), & Heart Rate Variability (HRV)
○ Subjective: Self-Assessment Manikin (SAM) scale for emotional state, IGroup Presence Questionnaire (IPQ) for presence, NASA-TLX for cognitive load, Game Experience Questionnaire (GEQ) for positive affect, negative affect, and empathy, Flow Short State Questionnaire (FSSQ) for flow state, and System Trust Scale (STS) for the user's trust in the agent.
○ Behavior: Task performance, error rate, rate of assistance
61
Experimental Methodology
Conditions | No-EC | EC
No-EA | NEC-NEA | EC-NEA
EA | NEC-EA | EC-EA
RQ 3
● Valence Shift: CAE substantially enhanced
valence (SAM ratings, p < 0.001), indicating an
improved emotional state.
● Empathy in VR: Significant increase in
empathy towards virtual agents when CAE is
present (GEQ scores, p = 0.004).
● Cognitive Demand: CAE linked to lower task
load perception (NASA-TLX scores, p = 0.01),
suggesting more effortless interactions.
● Trust Dynamics: CAE exerted a significant effect on the System Trust Scale (STS ratings, p = 0.01).
62
CAEIxs Enhancing User Experience in VR
RQ 3
● Physiological Indicators: EA influenced both
skin conductance (EDA-PA, p = 0.002) and
heart rate variability (LF/HF, p < 0.01), marking
physiological engagement.
● EEG Correlations: CAE significantly impacted
specific EEG markers like Frontal Central
Theta (FC-Theta, p < 0.006) and Frontal
Parietal Beta (FP-Beta, p < 0.006), possibly
related to cognitive processing and attention.
63
CAEIxs Enhancing User Experience in VR
RQ 3
64
CAEIxs Enhancing User Experience in VR
● Adaptability in Research
○ Flexibility to adapt experiments and theories based on emerging data
○ The importance of iterating design based on user feedback and physiological data insights
● Methodological Rigor:
○ The need for comprehensive data collection methodologies to ensure the robustness of findings
○ Importance of balancing qualitative insights with quantitative and behavioral data for a well-rounded understanding
● Interdisciplinary Approach:
○ Leveraging knowledge from psychology, HCI, and data analysis to enrich VR research
○ Collaborating across disciplines to innovate and refine empathic VR technologies
66
Lessons Learned
● User-Centered Design
○ Placing User Experience at the forefront when developing VR systems
○ Understanding the underlying process to personalize is the key to effective empathic response in
VR
● Technological Challenges:
○ Navigating the limitations of current VR and sensor technology.
○ Recognizing the importance of advancing hardware for more nuanced data capture and
interaction.
● Ethical Considerations:
○ Addressing privacy concerns with biometric data usage.
○ Considering the psychological impact of immersive experiences on diverse user groups. 67
Lessons Learned
Research Fellow at Empathic Computing Laboratory
within Auckland Bioengineering Institute at The
University of Auckland, working with Prof. Mark
Billinghurst.
Research revolves around the intersection of Empathic Computing, Virtual Reality, Digital Agents, and Context-Aware Interactions.
93
ABOUT ME
Worked as UX Researcher at Google, India
● Various projects related to Google Search
Social Impact: Education, Civic Services, and
Health
Interaction Designer at Assistive Wearable Startup
in India
● Lechal - Smart Insoles assisting in eye-free
navigation through haptic feedback and
tracking fitness metrics
● Lechal 2.0 - Additional feature: Fall Risk Intervention for the elderly, understanding the user's walking behavior and predicting fall risk
94
INDUSTRY EXPERIENCE
Research Fellow @ ECL
96
THANK YOU!
CAEVR
Empathic Appraisal: AffectivelyVR
Gradient Boosting Classifier
Experimental Conditions
A. NEC-NEA (No Empathic Companion - No Emotion Adaptation)
- Baseline condition with goal and tasks
- No color filter
- VA provides on-demand assistance, with recommendations for navigation and task assistance
B. NEC-EA (No Empathic Companion - Emotion Adaptation)
- Baseline condition with goal and tasks
- Color filters added depending on the real-time emotional state
- VA provides on-demand assistance, with recommendations for navigation and task assistance
Experimental Conditions
C. EC-NEA (Empathic Companion - No Emotion Adaptation)
- Baseline condition with goal and tasks
- No color filter
- VA provides on-demand assistance, with recommendations for navigation and task assistance, opening with a context-relevant empathic dialogue delivered in an emotional tone
D. EC-EA (Empathic Companion - Emotion Adaptation)
- Baseline condition with goal and tasks
- Color filters added depending on the real-time emotional state
- VA provides on-demand assistance, with recommendations for navigation and task assistance, opening with a context-relevant empathic dialogue delivered in an emotional tone
CAEVR System
101
An AR glass capable of adding a color filter based on the participant's emotional state.
- Low-saturation colors evoke a sense of depression, while high-saturation ones lead people to feel cheerful and pleasant.
