
Multimodal Detection of Affective States:

A Roadmap from Brain-Computer Interfaces, Face-Based Emotion Recognition, Eye Tracking Systems and Other Sensors
Javier Gonzalez-Sanchez, Maria-Elena Chavez-Echeagaray, Robert Atkinson, Winslow Burleson, Robert Christopherson
School of Computing, Informatics, and Decision Systems Engineering, Arizona State University

Copyright is held by the author/owner(s). CHI 2012, May 5-10, 2012, Austin, Texas, USA. ACM 12/05.

About Us

Learning Sciences Research Lab | Motivational Environments Group


School of Computing, Informatics, and Decision Systems Engineering Arizona State University

Principal Investigator

Robert Atkinson. Dr. Robert Atkinson is an Associate Professor in the Ira A. Fulton Schools of Engineering and in the Mary Lou Fulton Teachers College. His research explores the intersection of cognitive science, informatics, instructional design, and educational technology. His scholarship involves the design of instructional material, including books and computer-based learning environments, according to our understanding of human cognitive architecture and how to leverage its unique constraints and affordances. His current research focus is the study of engagement and flow in games.

This work was supported by the Office of Naval Research under Grant N00014-10-1-0143

Principal Investigator

Winslow Burleson Dr. Winslow Burleson received his PhD from the MIT Media Lab, Affective Computing Group. He joined ASU's School of Computing and Informatics and the Arts, Media, and Engineering graduate program at ASU in 2006. He has worked with the Entrepreneurial Management Unit at the Harvard Business School, Deutsche Telekom Laboratories, SETI Institute, and IBM's Almaden Research Center where he was awarded ten patents. He holds an MSE from Stanford University's Mechanical Engineering Product Design Program and a BA in Bio-Physics from Rice University. He has been a co-Principal Investigator on the Hubble Space Telescope's Investigation of Binary Asteroids and consultant to UNICEF and the World Scout Bureau.

This work was supported by the Office of Naval Research under Grant N00014-10-1-0143

Graduate Student

Javier Gonzalez-Sanchez. Doctoral student in Computer Science at Arizona State University. His research interests are software architecture, software engineering, affective computing, and educational technology. He holds an MS in Computer Science from the Center for Research and Advanced Studies of the National Polytechnic Institute and a BS in Computer Engineering from the University of Guadalajara. His experience includes 12+ years as a software engineer, and 9+ years teaching undergraduate courses at Tecnologico de Monterrey and graduate courses at Universidad de Guadalajara.

This work was supported by the Office of Naval Research under Grant N00014-10-1-0143

Graduate Student

Helen Chavez-Echeagaray. Doctoral student in Computer Science at Arizona State University. Her interests are in the areas of affective computing, educational technology (including robotics), and learning processes. The Tecnológico de Monterrey, campus Guadalajara, conferred upon her the degrees of MS in Computer Science and BS in Computer Systems Engineering. Before starting her PhD program she was a faculty member for 8+ years at Tecnológico de Monterrey. Her experience includes administrative positions and software development.

This work was supported by the Office of Naval Research under Grant N00014-10-1-0143

Graduate Student

Robert Christopherson. Doctoral student in Education Technology. His interests are in affective computing, animated pedagogical agents, self-regulated learning, effective feedback, and evaluation of learning through psychophysiological expressions. His background is in Instructional Design and Technology, with an MS from Western Illinois University, and Multimedia Communication and Technology, with a BS from Utah Valley University. His experience includes developing multimedia learning environments and instructional modules for corporate, educational, and government entities, and conducting lab and classroom research on the cognitive and affective processes associated with learning.

This work was supported by the Office of Naval Research under Grant N00014-10-1-0143

Preface

Context

Justification

A novel aspect of planning and designing the interaction between people and computers, related to the goal of securing user satisfaction, is the capability of systems to adapt to their individual users by showing empathy.

Being empathetic implies that the computer is able to recognize users' affective states and understand the implications of those states.

Detection of affective states is a step toward providing machines with the necessary intelligence to identify and understand human emotions and then interact appropriately with humans. Therefore, it is necessary to equip computers with hardware and software that enable them to perceive users' affective states and then use this understanding to create more harmonious interactions.


Motivation

This course provides a description and demonstration of the tools and methodologies necessary for automatically detecting affective states. Automatic detection of affective states requires the computer: (1) to sense information that is complex and diverse; (2) to process and understand information by integrating several sources (senses), ranging from brain-wave signals and biofeedback readings, through face-based and gesture emotion recognition, to posture and pressure sensing; and (3) once the data is integrated, to apply data processing tools to understand the user's status. During the course we will review algorithms, software, and examples (datasets) obtained from previous research studies.


Objectives

Attendees of this course will:

Learn about sensing devices used to detect affective states, including brain-computer interfaces, face-based emotion recognition systems, eye tracking systems, and physiological sensors.

Understand the pros and cons of the sensing devices used to detect affective states.

Learn about the data gathered from each sensing device and understand its characteristics.

Learn what it takes to manage (preprocess and synchronize) affective data.

Learn about approaches and algorithms used to analyze affective data, and how the results can be used to drive computer functionality or behavior.


Schedule | Session 1

1. Introduction to Affective Human-Computer Interfaces (20 minutes)
2. Sensing Devices (60 minutes)
2.1. Brain-computer interfaces.
2.2. Face-based emotion recognition systems.

Break

2.3. Eye-tracking systems.
2.4. Physiological sensors (skin conductance, pressure, posture).


Schedule | Session 2

3. Data Filtering and Integration (20 minutes)
3.1. Gathered data
3.2. Data synchronization
4. Analyzing Data (20 minutes)
4.1. Regression
4.2. Clustering and classification

Break

5. Sharing Experience and Group Discussion (20 minutes)
6. Conclusions and Q&A (20 minutes)


Session I

I. Introduction

Concepts

Multimodal Detection of Affective States

(Concept diagram: physiological and physical dimensions; instinctual reactions to stimulation; feelings and emotions. How do you feel?)


Concepts

Until recently, much of the affective data gathered by systems relied heavily on learners' self-reports of their affective states, observation, or software data logs [1]. Many systems have now started to include data from the physical manifestations of affective states through the use of sensing devices and the application of novel machine learning and data mining algorithms to deal with the vast amounts of data generated by the sensors [2][3].

[1] R.S.J. Baker, M.M.T. Rodrigo, and U.E. Xolocotzin, "The Dynamics of Affective Transitions in Simulation Problem-solving Environments," Proc. Second International Conference on Affective Computing and Intelligent Interaction (ACII 07), A. Paiva, R. Prada, and R.W. Picard (Eds.), Springer-Verlag, Lecture Notes in Computer Science 4738, pp. 666-677. [2] I. Arroyo, D.G. Cooper, W. Burleson, B.P. Woolf, K. Muldner, and R. Christopherson, "Emotion Sensors Go to School," Proc. Artificial Intelligence in Education: Building Learning Systems that Care: from Knowledge Representation to Affective Modelling (AIED 09), V. Dimitrova, R. Mizoguchi, B. du Boulay, and A. Graesser (Eds.), IOS Press, July 2009, Frontiers in Artificial Intelligence and Applications 200, pp. 17-24. [3] R.W. Picard, Affective Computing, MIT Press, 1997.


Concepts

(Concept diagram: physical appearance measurement, physiological measures, and self-report identify the presence of affective states.)


Model

(Diagram: Multimodal emotion recognition system. The User produces brainwaves, eye movements, facial expressions, and physiological signals; Sensing Devices capture them as Raw Data; Perception Mechanisms map raw data into Beliefs; an Integration Algorithm combines the beliefs into a State.)

Sensing Devices

Sensing devices obtain data from the user about their physiological responses and body reactions. Sensing devices are hardware devices that collect quantitative data as measures of physiological signals of emotional change; we call the measures provided by the sensing devices raw data. Our approach includes the use of brain-computer interfaces, eye tracking systems, biofeedback sensors, and face-based emotion recognition systems [4]. The use of several sensing devices, either to recognize a broad range of emotions or to improve the accuracy of recognizing one emotion, is referred to as a multimodal approach.

[4] J. Gonzalez-Sanchez, R.M. Christopherson, M.E. Chavez-Echeagaray, D.C. Gibson, R. Atkinson, and W. Burleson, "How to Do Multimodal Detection of Affective States?," Proc. 2011 IEEE 11th International Conference on Advanced Learning Technologies (ICALT), 2011, pp. 654-655.


Session I

2. Sensing Devices | Brain Computer Interfaces


BCI

Brain-Computer Interfaces (BCI)


A BCI is a particular type of physiological instrument that uses brainwaves as an information source (electrical activity along the scalp produced by the firing of neurons within the brain). The Emotiv EPOC headset [5] will be used to show how to collect and work with this kind of data.

[5] Emotiv - Brain Computer Interface Technology. Retrieved April 26, 2011, from http://www.emotiv.com.


BCI

Wireless Emotiv EEG Headset. The device reports data at intervals of 125 ms (8 Hz). The raw data output includes 14 values (7 channels on each brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values for the acceleration of the head when leaning (GyroX and GyroY). The Affectiv suite reports 5 emotions: engagement, boredom, excitement, frustration, and meditation. The Expressiv suite reports facial gestures: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh.


BCI

Electrodes are situated and labeled according to the CMS/DRL configuration [6][7]

[6] F. Sharbrough, G.-E. Chatrian, R.P. Lesser, H. Lüders, M. Nuwer, and T.W. Picton, "American Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature," J. Clin. Neurophysiol. 8: 200-202. [7] Electroencephalography. Retrieved November 14, 2010, from Electric and Magnetic Measurement of the Electric Activity of Neural Tissue: www.bem.fi/book/13/13.htm


BCI

The Emotiv headset (Emotiv Systems, $299) provides EEG data, detected emotions, and facial gestures.


BCI

Demo Wireless Emotiv EEG Headset



Affectiv Suite

Expressiv Suite

BCI

Timestamp. The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. Values: format "yymmddhhmmssSSS" (y = year, m = month, d = day, h = hour, m = minutes, s = seconds, S = milliseconds).

UserID. Identifies the user. Values: an integer value.

Wireless Signal Status. Shows the strength of the signal. Values: 0 to 4, with 4 being the best.

Blink, Wink Left and Right, Look Left and Right, Raise Brow, Furrow, Smile, Clench, Smirk Left and Right, Laugh. Part of the Expressiv suite. Values: between 0 and 1, with 1 representing the highest power/probability for the expression.

Short Term and Long Term Excitement, Engagement/Boredom, Meditation, Frustration. Part of the Affectiv suite. Values: between 0 and 1, with 1 representing the highest power/probability for the emotion.

AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4. Raw data coming from each of the 14 channels; the field names were defined according to the CMS/DRL configuration [6][7]. Values: from 4000 and higher.

GyroX, GyroY. Information about how the head moves/accelerates along the X and Y axes, respectively. Values: variates.
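Since every sensing device in this tutorial stamps its log with the same "yymmddhhmmssSSS" format, parsing it is the natural first step toward cross-sensor synchronization. A minimal Python sketch (the helper name is ours, and two-digit years are assumed to fall in the 2000s):

```python
from datetime import datetime

def parse_sensor_timestamp(ts: str) -> datetime:
    """Parse a 'yymmddhhmmssSSS' timestamp such as '101116112544901'."""
    return datetime(
        2000 + int(ts[0:2]), int(ts[2:4]), int(ts[4:6]),   # assumed 2000s; date
        int(ts[6:8]), int(ts[8:10]), int(ts[10:12]),       # time of day
        int(ts[12:15]) * 1000,                             # SSS (ms) -> microseconds
    )

print(parse_sensor_timestamp("101116112544901"))  # 2010-11-16 11:25:44.901000
```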

Raw Data

Sample raw data: each row contains the timestamp, the readings from the 14 channels (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4, roughly in the 4200-4700 range in this capture), and the AccX and AccY head-movement values.

Expressiv Data

Timestamp        ID Signal Blink WinkL WinkR LookL LookR Eyebrow Furrow Smile   Clench SmirkL   SmirkR   Laugh
101116091145065  0  2      0     0     0     1     0     0       0      0       0      0        0.988563 0
101116091145190  0  2      0     0     0     1     0     0       0      0.45465 0      0        0        0
101116091145315  0  2      0     0     0     1     0     0       0      0.46701 0      0        0        0
101116091145440  0  2      0     0     0     1     0     0       0      0.40101 0      0        0        0
101116091145565  0  2      0     0     0     1     0     0       0      0.24867 0      0        0        0
101116091145690  0  2      0     0     0     1     0     0       0      0.17302 0      0        0        0
101116091145815  0  2      0     0     0     1     0     0       0      0.16279 0      0        0        0
101116091145940  0  2      0     0     0     1     0     0       0      0.15649 0      0        0        0
101116091146065  0  2      0     0     0     1     0     0       0      0       0      0.776925 0        0
101116091146190  0  2      0     0     0     1     0     0       0      0       0      0        0.608679 0
101116091146315  0  2      0     0     0     1     0     0       0      0       0      0        0.342364 0
101116091146440  0  2      0     0     0     1     0     0       0      0       0      0        0.149695 0
101116091146565  0  2      0     0     0     1     0     0       0      0       0      0        0.08644  0
101116091146690  0  2      0     0     0     1     0     0       0      0       0      0        0.073348 0
101116091146815  0  2      0     0     0     1     0     0       0      0       0      0        0.118965 0
101116091146941  0  2      0     0     0     1     0     0       0      0       0      0        0.259171 0

Affectiv Data

Timestamp        Short Term Excitement  Long Term Excitement  Engagement  Meditation  Frustration
101116091145065  0.447595               0.54871               0.834476    0.333844    0.536197
101116091145190  0.447595               0.54871               0.834476    0.333844    0.536197
101116091145315  0.447595               0.54871               0.834476    0.333844    0.536197
101116091145440  0.487864               0.546877              0.834146    0.339548    0.54851
101116091145565  0.487864               0.546877              0.834146    0.339548    0.54851
101116091145690  0.487864               0.546877              0.834146    0.339548    0.54851
101116091145815  0.487864               0.546877              0.834146    0.339548    0.54851
101116091145940  0.521663               0.545609              0.839321    0.348321    0.558228
101116091146065  0.521663               0.545609              0.839321    0.348321    0.558228
101116091146190  0.521663               0.545609              0.839321    0.348321    0.558228
101116091146315  0.521663               0.545609              0.839321    0.348321    0.558228
101116091146440  0.509297               0.544131              0.84401     0.358717    0.546771
101116091146565  0.509297               0.544131              0.84401     0.358717    0.546771
101116091146690  0.509297               0.544131              0.84401     0.358717    0.546771
101116091146815  0.509297               0.544131              0.84401     0.358717    0.546771
101116091146941  0.451885               0.541695              0.848087    0.368071    0.533919

Session I

2. Sensing Devices | Face-Based Emotion Recognition


Face-Based Recognition

Face-based emotion recognition systems


These systems infer affective states by capturing images of the user's facial expressions and head movements. We will show the capabilities of face-based emotion recognition systems using a simple 30 fps USB webcam and software from the MIT Media Lab [8].

[8] R. E. Kaliouby and P. Robinson, Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures, Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW 04), IEEE Computer Society, June 2004, Volume 10, p. 154.


Face-Based Recognition

The MindReader API enables real-time analysis, tagging, and inference of cognitive-affective mental states from facial video. The framework combines vision-based processing of the face with predictions of mental-state models to interpret the meaning underlying head and facial signals over time. It provides results at intervals of approximately 100 ms (10 Hz). With this system it is possible to infer mental states such as agreeing, concentrating, disagreeing, interested, thinking, and unsure. It builds on the Facial Action Coding System (Ekman and Friesen, 1978), which defines 46 actions (plus head movements).


Face-Based Recognition

Demo MindReader Software from MIT Media Lab


Face-Based Recognition

Facial Action Coding System examples: 19 Lip Corner Depressor, 26 Jaw Drop, 27 Mouth Stretch.

Mind Reader

Mind Reader

Timestamp. The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. Values: format "yymmddhhmmssSSS" (y = year, m = month, d = day, h = hour, m = minutes, s = seconds, S = milliseconds).

Agreement, Concentrating, Disagreement, Interested, Thinking, Unsure. The probability of the mental state being present in the user at a particular time (frame). Values: between 0 and 1; a value of -1 means it was not possible to infer an emotion, which happens when the user's face is out of the camera focus.
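As a sketch of how a consumer of these per-frame probabilities might pick the dominant mental state while honoring the -1 convention above (field names follow the table; the function is illustrative, not part of the MindReader API):

```python
def dominant_state(frame: dict) -> str | None:
    """Most probable mental state for one frame, or None when the face
    was out of the camera focus (values reported as -1)."""
    valid = {state: p for state, p in frame.items() if p >= 0}
    return max(valid, key=valid.get) if valid else None

frame = {"agreement": 0.0018, "concentrating": 0.9999, "disagreement": 0.0002,
         "interested": 0.1649, "thinking": 0.5711, "unsure": 0.0460}
print(dominant_state(frame))  # concentrating
```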


Mind Reader

Timestamp        Agreement    Concentrating  Disagreement  Interested   Thinking     Unsure
101116112838516  0.001836032  0.999917       1.79E-04      0.16485406   0.57114255   0.04595062
101116112838578  0.001447654  0.9999516      1.29E-04      0.16310683   0.5958921    0.042706452
101116112838672  5.97E-04     0              1.50E-04      0.44996294   0.45527613   0.00789697
101116112838766  2.46E-04     0              1.75E-04      0.77445686   0.32144752   0.001418217
101116112838860  1.01E-04     0              2.04E-04      0.93511915   0.21167138   2.53E-04
101116112838953  4.18E-05     0              2.38E-04      0.983739     0.13208677   4.52E-05
101116112839016  1.72E-05     0              2.78E-04      0.9960774    0.07941038   8.07E-06
101116112839110  7.10E-06     0              3.24E-04      0.99906266   0.046613157  1.44E-06
101116112839156  2.92E-06     0              3.77E-04      0.99977654   0.026964737  2.57E-07
101116112839250  1.21E-06     0              4.40E-04      0.9999467    0.015464196  4.58E-08
101116112839391  4.97E-07     0              5.12E-04      0.9999873    0.008824189  8.18E-09
101116112839438  2.05E-07     0              5.97E-04      0.999997     0.005020725  1.46E-09
101116112839547  8.43E-08     0              6.96E-04      0.9999993    0.002851939  2.60E-10
101116112839578  3.47E-08     0              8.11E-04      0.9999999    0.001618473  4.64E-11
101116112839688  1.43E-08     0              9.45E-04      0.99999994   9.18E-04     8.29E-12
101116112839781  5.90E-09     0              0.001101404   1            5.21E-04     1.48E-12
101116112839828  2.43E-09     0              0.001283521   1            2.95E-04     2.64E-13

Session I

2. Sensing Devices | Eye Tracking Systems


Eye Tracking System

Eye-tracking systems
These are instruments that measure eye position and eye movement in order to detect the zones in which the user shows particular interest at a specific moment. Datasets from the Tobii eye-tracking system [9] will be shown.

[9] Tobii Technology - Eye Tracking and Eye Control. Retrieved April 26, 2011, from http://www.tobii.com.


Eye Tracking System

Tobii Eye Tracker.


The device reports data at intervals of 100 ms (10 Hz). The output provides data concerning attention direction (gaze-x, gaze-y), duration of fixation, and pupil dilation.


Eye Tracking System

Demo Tobii Eye Tracker



Eye Tracking System

Timestamp. The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. Values: format "yymmddhhmmssSSS" (y = year, m = month, d = day, h = hour, m = minutes, s = seconds, S = milliseconds).

GazePoint X. The horizontal screen position for either eye or the average for both eyes; also used for the fixation definition. Values: 0 is the left edge; the maximum value of the horizontal screen resolution is the right edge.

GazePoint Y. The vertical screen position for either eye or the average for both eyes; also used for the fixation definition. Values: 0 is the bottom edge; the maximum value of the vertical screen resolution is the top edge.

Pupil Left. Pupil size (left eye) in mm. Values: variates.

Validity Left. Validity of the gaze data. Values: 0 to 4; 0 if the eye is found and the tracking quality is good; 4 if the eye cannot be found by the eye tracker.

Pupil Right. Pupil size (right eye) in mm. Values: variates.

Validity Right. Validity of the gaze data. Values: 0 to 4; 0 if the eye is found and the tracking quality is good; 4 if the eye cannot be found by the eye tracker.

FixationDuration. The time in milliseconds that a fixation lasts. Values: variates.

Event. Events, automatic and logged, show up under Event. Values: variates.

AOI. Areas of Interest; used when fixations on multiple AOIs are to be written on the same row. Values: variates.
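A sketch of how the validity codes might be used to discard samples where the eyes were lost before averaging pupil size (row keys are ours, not Tobii SDK identifiers):

```python
def usable(row: dict) -> bool:
    """Keep only samples where both eyes were found (validity code 0)."""
    return row["validity_left"] == 0 and row["validity_right"] == 0

rows = [
    {"validity_left": 0, "validity_right": 0, "pupil_left": 2.76, "pupil_right": 2.88},
    {"validity_left": 4, "validity_right": 4, "pupil_left": -1, "pupil_right": -1},
]
mean_pupil = [(r["pupil_left"] + r["pupil_right"]) / 2 for r in rows if usable(r)]
print(mean_pupil)  # [2.82] -- the lost-eye sample is dropped
```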

Eye Tracking System

Timestamp        GPX  GPY  Pupil Left  Validity L  Pupil Right  Validity R  Fixation  Event     AOI
101124162405582  636  199  2.759313    0           2.88406      0           48                  Content
101124162405599  641  207  2.684893    0           2.855817     0           48                  Content
101124162405615  659  211  2.624458    0           2.903861     0           48                  Content
101124162405632  644  201  2.636186    0           2.916132     0           48                  Content
101124162405649  644  213  2.690685    0           2.831013     0           48                  Content
101124162405666  628  194  2.651784    0           2.869714     0           48                  Content
101124162405682  614  177  2.829281    0           2.899828     0           48                  Content
101124162405699  701  249  2.780344    0           2.907665     0           49                  Content
101124162405716  906  341  2.853761    0           2.916398     0           49                  Content
101124162405732  947  398  2.829427    0           2.889944     0           49                  Content
101124162405749  941  400  2.826602    0           2.881179     0           49                  Content
101124162405766  938  403  2.78699     0           2.87948      0           49        KeyPress  Content
101124162405782  937  411  2.803387    0           2.821803     0           49                  Content
101124162405799  934  397  2.819166    0           2.871547     0           49                  Content
101124162405816  941  407  2.811687    0           2.817927     0           49                  Content
101124162405832  946  405  2.857419    0           2.857427     0           49                  Content
101124162405849  0    0    -1          4           -1           4           49                  Content

Session I

2. Sensing Devices | Galvanic Skin Conductance Sensor



Galvanic Skin Conductance

Provides information on the activity of an individual's physiological functions; used for arousal detection. The sensor measures the electrical conductance of the skin, which varies with its moisture level; moisture depends on the sweat glands, which are controlled by the sympathetic and parasympathetic nervous systems [10]. The hardware was designed by the MIT Media Lab. It is a wireless Bluetooth device that reports data at intervals of approximately 500 ms (2 Hz).

[10] M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R.W. Picard, The HandWave Bluetooth Skin Conductance Sensor, Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII 05), Springer-Verlang, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90.
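Skin conductance is usually interpreted relative to a resting baseline rather than as an absolute level; a sketch of that idea (the baseline window is an arbitrary choice of ours, not part of the HandWave specification [10]):

```python
def relative_arousal(conductance, baseline_samples=10):
    """Express each reading as the change from the mean of the first samples."""
    baseline = sum(conductance[:baseline_samples]) / baseline_samples
    return [round(c - baseline, 6) for c in conductance]

readings = [1.030696, 1.023404, 1.019813, 1.041658, 0.998280]
print(relative_arousal(readings, baseline_samples=3))
```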


Galvanic Skin Conductance

Demo Skin electrical conductance Sensor



Galvanic Skin Conductance

Timestamp. The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. Values: format "yymmddhhmmssSSS" (y = year, m = month, d = day, h = hour, m = minutes, s = seconds, S = milliseconds).

Battery Voltage. Level of the battery voltage. Values: 0-3 volts.

Conductance. Level of arousal. Values: 0-3 volts.

Galvanic Skin Conductance

Timestamp        Voltage      Conductance
101116101332262  2.482352941  1.030696176
101116101332762  2.482352941  1.023404165
101116101333262  2.482352941  1.019813274
101116101333762  2.482352941  1.041657802
101116101334247  2.482352941  0.998280273
101116101334747  2.482352941  0.991181142
101116101335247  2.482352941  0.980592229
101116101335747  2.482352941  0.998280273
101116101336247  2.482352941  1.012586294
101116101336762  2.482352941  1.012586294
101116101337231  2.482352941  1.012586294
101116101337747  2.482352941  1.009008251
101116101338247  2.482352941  0.998280273
101116101338747  2.482352941  0.991181142
101116101339247  2.482352941  0.987628521
101116101339731  2.482352941  0.987628521
101116101340231  2.482352941  0.980592229

Session I

2. Sensing Devices | Pressure Sensor



Pressure Sensor

Provides information on the activity of an individual's physiological functions. Pressure sensors are able to detect the increasing amount of pressure (correlated with levels of frustration) that the user puts on a mouse or any other controller (such as a game controller) [11]. The hardware was designed by the MIT Media Lab. It is a serial device that reports data at intervals of approximately 150 ms (6 Hz).

[11] Y. Qi, and R. W. Picard, "Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure Pattern Classification," Proc. International Conference on Pattern Recognition (ICPR 02), Aug. 2002, vol 3, pp. 30448, doi:10.1109/ICPR.2002.1047973.


Pressure Sensor

Demo Mouse Pressure Sensor



Pressure Sensor

Timestamp. The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. Values: format "yymmddhhmmssSSS" (y = year, m = month, d = day, h = hour, m = minutes, s = seconds, S = milliseconds).

Right Rear. Sensor positioned in the right rear of the mouse. Values: 0-1024, with 0 the highest pressure.

Right Front. Sensor positioned in the right front of the mouse. Values: 0-1024, with 0 the highest pressure.

Left Rear. Sensor positioned in the left rear of the mouse. Values: 0-1024, with 0 the highest pressure.

Left Front. Sensor positioned in the left front of the mouse. Values: 0-1024, with 0 the highest pressure.

Middle Rear. Sensor positioned in the middle rear of the mouse. Values: 0-1024, with 0 the highest pressure.

Middle Front. Sensor positioned in the middle front of the mouse. Values: 0-1024, with 0 the highest pressure.

Pressure Sensor

Timestamp        Right Rear  Right Front  Left Rear  Left Front  Middle Rear  Middle Front
110720113306312  1023        1023         1023       1023        1023         1023
110720113306468  1023        1023         1023       1023        1023         1023
110720113306625  1023        998          1023       1002        1023         1023
110720113306781  1023        1009         1023       977         1023         1023
110720113306937  1023        794          1023       982         1023         1023
110720113307109  1023        492          1022       891         1023         1023
110720113307265  1023        395          1021       916         1019         1023
110720113307421  1023        382          1021       949         1023         1023
110720113307578  1023        364          1022       983         1023         1023
110720113307734  1023        112          1021       1004        1023         1023
110720113307890  1023        204          1021       946         1023         1023
110720113308046  1023        465          1022       971         1023         1023
110720113308203  1023        404          1022       1023        1023         1023
110720113308359  1023        166          1021       1023        1023         1023
110720113308515  1023        145          1021       1023        1023         1023
110720113308687  1023        154          1021       1023        1023         1023
110720113308843  1023        126          1021       1023        1023         1023

Pressure Sensor

The raw data from the mouse is processed to obtain meaningful information [12]. The data obtained is a normalized value computed from the six different sensors on the mouse.

[12] Cooper, D., Arroyo, I., Woolf, B., Muldner, K., Burleson, W., and Christopherson, R. (2009). Sensors model student self concept in the classroom, User Modeling, Adaptation, and Personalization, 30--41
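The sample data below is consistent with the normalized value being the sum of the six readings divided by the per-sensor maximum of 1023 (six idle sensors at 1023 give exactly 6.0). A sketch under that assumption, which may differ from the exact procedure in [12]:

```python
def mouse_value(sensors):
    """Normalize six pressure readings against the 1023 per-sensor maximum."""
    return sum(sensors) / 1023.0

# Third sample row: right rear/front, left rear/front, middle rear/front.
print(mouse_value([1023, 998, 1023, 1002, 1023, 1023]))  # 5.9550..., as in the table
```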


Pressure Sensor

Timestamp        Right Rear  Right Front  Left Rear  Left Front  Middle Rear  Middle Front  Value
110720113306312  1023        1023         1023       1023        1023         1023          6
110720113306468  1023        1023         1023       1023        1023         1023          6
110720113306625  1023        998          1023       1002        1023         1023          5.955034213
110720113306781  1023        1009         1023       977         1023         1023          5.941348974
110720113306937  1023        794          1023       982         1023         1023          5.736070381
110720113307109  1023        492          1022       891         1023         1023          5.350928641
110720113307265  1023        395          1021       916         1019         1023          5.275659824
110720113307421  1023        382          1021       949         1023         1023          5.299120235
110720113307578  1023        364          1022       983         1023         1023          5.315738025
110720113307734  1023        112          1021       1004        1023         1023          5.088954057
110720113307890  1023        204          1021       946         1023         1023          5.122189638
110720113308046  1023        465          1022       971         1023         1023          5.402737048
110720113308203  1023        404          1022       1023        1023         1023          5.393939394
110720113308359  1023        166          1021       1023        1023         1023          5.160312805
110720113308515  1023        145          1021       1023        1023         1023          5.139784946
110720113308687  1023        154          1021       1023        1023         1023          5.1485826
110720113308843  1023        126          1021       1023        1023         1023          5.121212121

Session I

2. Sensing Devices | Posture Sensor



Posture Sensor

Posture detection using a low-cost, low-resolution, pressure-sensitive seat cushion and back pad, developed at ASU based on experience using a more expensive high-resolution unit from the MIT Media Lab [13].

[13] S. Mota, and R. W. Picard, "Automated Posture Analysis for Detecting Learners Interest Level," Proc. Computer Vision and Pattern Recognition Workshop (CVPRW 03), IEEE Press, June 2003, vol. 5, pp. 49, doi:10.1109/CVPRW.2003.10047.


Posture Sensor

Demo Chair Posture Sensor



Posture Sensor

Timestamp. The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. Values: format "yymmddhhmmssSSS" (y = year, m = month, d = day, h = hour, m = minutes, s = seconds, S = milliseconds).

AccX. Value of the X axis of the accelerometer. Values: variates.

AccY. Value of the Y axis of the accelerometer. Values: variates.

Right Seat. Sensor positioned in the right side of the seat cushion. Values: 0-1024, with 1024 the highest pressure.

Middle Seat. Sensor positioned in the middle of the seat cushion. Values: 0-1024, with 1024 the highest pressure.

Left Seat. Sensor positioned in the left side of the seat cushion. Values: 0-1024, with 1024 the highest pressure.

Right Back. Sensor positioned in the right side of the back pad. Values: 0-1024, with 1024 the highest pressure.

Middle Back. Sensor positioned in the middle of the back pad. Values: 0-1024, with 1024 the highest pressure.

Left Back. Sensor positioned in the left side of the back pad. Values: 0-1024, with 1024 the highest pressure.

Posture Sensor

Timestamp        AccX  AccY  Right Seat  Middle Seat  Left Seat  Right Back  Middle Back  Left Back
110720074358901  980   -171  1015        1019         1012       976         554          309
110720074359136  969   -169  1008        1004         1012       978         540          305
110720074359401  1008  -165  1015        1012         1008       974         554          368
110720074359636  993   -166  1001        1004         1016       975         548          306
110720074359854  994   -167  1015        1011         1003       967         559          418
110720074400120  970   -167  1011        1008         1001       968         620          358
110720074400354  977   -166  1011        1011         1013       968         541          413
110720074400589  985   -128  1012        1010         1006       974         565          314
110720074400839  996   -182  1016        1014         1012       972         668          290
110720074401089  991   -185  1012        1012         1004       858         108          2
110720074401526  1068  -310  1019        522          1001       32          0            421
110720074401557  937   -124  92          0            0          0           1            247
110720074401714  979   -104  103         0            0          0           0            87
110720074401745  957   -165  143         3            0          0           0            0
110720074402026  945   -171  126         0            0          0           1            3
110720074402339  948   -169  0           0            1          0           0            0
110720074402620  952   -166  3           0            1          0           0            0

Posture Sensor

The raw data from the six chair sensors is processed [12] to obtain net seat change, net back change, and sit-forward values.

[12] Cooper, D., Arroyo, I., Woolf, B., Muldner, K., Burleson, W., and Christopherson, R. (2009). Sensors model student self concept in the classroom, User Modeling, Adaptation, and Personalization, 30--41
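The net-change columns in the table below are consistent with summing the absolute differences of each zone between consecutive readings; a sketch under that assumption (which may differ from the exact procedure in [12]):

```python
def net_change(prev, curr):
    """Total absolute change across one sensor group (seat or back zones)
    between two consecutive readings."""
    return sum(abs(c - p) for p, c in zip(prev, curr))

# Seat zones (right, middle, left) of the first two sample rows:
print(net_change((1015, 1019, 1012), (1008, 1004, 1012)))  # 22
# Back zones (right, middle, left) of the same rows:
print(net_change((976, 554, 309), (978, 540, 305)))        # 20
```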


Posture Sensor

Timestamp        Right Seat  Middle Seat  Left Seat  Right Back  Middle Back  Left Back  NetSeat Change  NetBack Change  SitForward
110720074358901  1015        1019         1012       976         554          309        12              152             0
110720074359136  1008        1004         1012       978         540          305        22              20              0
110720074359401  1015        1012         1008       974         554          368        19              81              0
110720074359636  1001        1004         1016       975         548          306        30              69              0
110720074359854  1015        1011         1003       967         559          418        34              131             0
110720074400120  1011        1008         1001       968         620          358        9               122             0
110720074400354  1011        1011         1013       968         541          413        15              134             0
110720074400589  1012        1010         1006       974         565          314        9               129             0
110720074400839  1016        1014         1012       972         668          290        14              129             0
110720074401089  1012        1012         1004       858         108          2          14              962             0
110720074401526  1019        522          1001       32          0            421        500             1353            0
110720074401557  92          0            0          0           1            247        2450            207             1
110720074401714  103         0            0          0           0            87         11              161             1
110720074401745  143         3            0          0           0            0          43              87              1
110720074402026  126         0            0          0           1            3          20              4               1
110720074402339  0           0            1          0           0            0          127             4               1
110720074402620  3           0            1          0           0            0          3               0               1

Summary

Emotiv EEG headset (128 Hz). Legacy software: Emotiv SDK. Senses: brain waves. Reports: EEG activity in 14 channels [6] (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4); face activity (blink, wink left and right, look left and right, raise brow, furrow brow, smile, clench, smirk left and right, and laugh); and emotions (excitement, engagement, boredom, meditation, and frustration).

Standard webcam (10 Hz). Legacy software: MIT Media Lab MindReader. Senses: facial expressions. Reports: emotions (agreeing, concentrating, disagreeing, interested, thinking, and unsure).

MIT skin conductance sensor (2 Hz). Legacy software: USB driver. Senses: skin conductivity. Reports: arousal.

MIT pressure sensor (6 Hz). Legacy software: USB driver. Senses: pressure. Reports: one pressure value per sensor allocated in the input/control device.

Tobii eye tracking (60 Hz). Legacy software: Tobii SDK. Senses: eye tracking. Reports: gaze point (x, y).

MIT posture sensor (6 Hz). Legacy software: USB driver. Senses: pressure. Reports: pressure values in the back and the seat (right, middle, and left zones) of a chair cushion.

Session 2


Background

The datasets shown in the next slides correspond to data collected in three studies. The stimuli, protocol, and participants are described briefly in the following paragraphs.

1. Study One: Subjects playing a video game.

2. Study Two: Subjects reading documents with and without pictures.

3. Study Three: Subjects solving tasks in a Tutoring System.


Study One

Stimuli. The high-fidelity, deeply engaging Guitar Hero video game, which involves holding a guitar interface while listening to music and watching a video screen. Protocol. The study consists of a one-hour session with: (1) 15 minutes of practice, so that the user becomes familiar with the game controller and the environment; and (2) 45 minutes in which users played four songs of their choice, one at each level: easy, medium, hard, and expert. Participants. The call for participation was an open call among Arizona State University students. The experiment was run with 21 subjects, 67% men and 33% women, aged 18 to 28. The study includes subjects with different (self-reported) experience playing video games: 15% had never played before, 5% were not skilled, 28% reported being slightly skilled, 33% somewhat skilled, 14% very skilled, and only 5% reported themselves as experts.


Study Two

Stimuli. On-screen reading material was used for this study. Two different types of reading material were used: one with either on-task or off-task images, captions, and drawings; and a second one containing only text. Protocol. The study consists of one 60-minute session in which the subject is presented with 10 pages from a popular Educational Psychology textbook and asked to read for understanding. Each participant was asked to complete a pre- and post-test. Participants. The call for participation was again an open call among Arizona State University students. The study was run with 28 subjects.


Study Three

Stimuli. A tutoring system for system dynamics. The tutor teaches students how to model system behavior using a graphical representation and algebraic expressions. The model is represented using a directed-graph structure that defines a topology formed by nodes and links. Once the model is complete, students can execute and debug it [14]. Protocol. Students were asked to perform a set of tasks (about system dynamics modeling) using the tutoring system. While the student was working on these tasks, the tutoring system collected emotional-state information with the intention of generating better and more accurate hints and feedback. Participants. Pilot test during Summer 2010 with 2 groups of 30 high-school students.

[14] K. VanLehn, W. Burleson, M.E. Chavez-Echeagaray, R. Christopherson, J. Gonzalez-Sanchez, Y. Hidalgo-Pontet, and L. Zhang. The Affective Meta-Tutoring Project: How to motivate students to use effective meta-cognitive strategies, Proceedings of the 19th International Conference on Computers in Education. Chiang Mai, Thailand: Asia- Pacific Society for Computers in Education. October 2011. In press.


Session 2

3. Data Filtering and Integration


Filtering

Filtering the data includes cleaning, synchronizing, averaging, and passing a threshold or a range.
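A minimal sketch of two of these filter steps, range thresholding followed by moving-average smoothing (the window size is an arbitrary choice):

```python
def range_filter(samples, lo, hi):
    """Drop readings outside the plausible range for the sensor."""
    return [s for s in samples if lo <= s <= hi]

def moving_average(samples, window=4):
    """Smooth noise by averaging each reading with its recent history."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A spike of 9.99 is removed before smoothing (conductance is 0-3 volts).
print(moving_average(range_filter([1.03, 1.02, 9.99, 1.04, 1.00], 0.0, 3.0)))
```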


Filtering and Integration

Multimodal emotion recognition considers the existence of different sources of data that contribute to inferring the affective state of an individual. Each sensing device has its own type of data and sample rate. The challenge is how to combine the data coming from these diverse sensing devices, which have proved their independent functionality, to create an improved unique output.


Filtering and Integration

(Diagram: sensing-device agents feed a data logger and a multimodal centre; the integrated output is consumed by the tutoring system.)

Framework

We developed a framework that follows the organizational strategy of an agent federation [15].

The federation assigns one agent to collect raw data from each sensing device. That agent implements the perception mechanism for its assigned sensing device to map raw data into beliefs. Each agent is autonomous and encapsulates one sensing device and its perception mechanisms into an independent, individual, and intelligent component. All the data is timestamped and independently identified by agent.

[15] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011).


Framework

When several agents need to interact, it is important to establish an organizational strategy for them, which defines authority relationships, data flows, and coordination protocols [16][17].

These beliefs are reported to a central agent, which integrates them into one affective state report.

Third-party systems are able to obtain affective state reports from the central agent using a publish-subscribe style.
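A toy sketch of this publish-subscribe arrangement (class and method names are illustrative, not ABE's actual interfaces [15]):

```python
class CentralAgent:
    """Integrates sensor-agent beliefs and publishes affective state reports."""

    def __init__(self):
        self.beliefs = {}       # latest belief per sensor agent
        self.subscribers = []   # third-party callbacks

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def report_belief(self, agent_name, belief):
        self.beliefs[agent_name] = belief
        report = dict(self.beliefs)   # naive integration: merge latest beliefs
        for notify in self.subscribers:
            notify(report)

hub = CentralAgent()
hub.subscribe(lambda report: print("tutor received:", report))
hub.report_belief("bci", {"frustration": 0.50})
hub.report_belief("face", {"concentrating": 0.99})
```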

[16] F. Tuijnman, and H. Afsarmanesh, "Distributed objects in a federation of autonomous cooperating agents," Proc. International Conference on Intelligent and Cooperative Information Systems, May 1993, pp. 256-265, doi:10.1109/ICICIS.1993.291763. [17] M. Wood, and S. DeLoach, An overview of the multiagent systems engineering methodology, Agent-Oriented Software Engineering, (AOSE 2000), Springer-Verlag, 2001, pp. 207-221, doi:10.1007/3-540-44564-1_14.


Framework | Architecture

[15] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011).


Integration | sparse

Example of sparse data before integration. Columns: timestamp; fixation index, gaze point X/Y, mapped fixation point X/Y, and fixation duration (eye tracker); short and long term excitement, engagement/boredom, meditation, and frustration (BCI); conductance (skin sensor); agreement and concentrating (face-based). Each device fills only its own columns at the timestamps where it produced a reading; all other cells in that row remain empty.

[18] J. Liu, S. Ji, and J. Ye. SLEP: Sparse Learning with Efficient Projections. Arizona State University, 2009. http://www.public.asu.edu/~jye02/Software/SLEP.


Integration | state machine

Example of the same data integrated with a state machine: the columns are the same, but each device's most recent value is held until a new reading arrives, so every timestamped row carries a complete set of values (in this capture the BCI and conductance values 0.436697, 0.521059, 0.550011, 0.335825, 0.498908, and 0.40169063 repeat unchanged across all 31 rows while the gaze data varies).
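The held-value behavior illustrated above is what a forward fill over an outer join produces; a sketch with pandas on toy two-sensor data (column names are ours):

```python
import pandas as pd

gaze = pd.DataFrame({"ts": [0, 100, 200, 300], "gaze_x": [574, 566, 701, 728]})
affect = pd.DataFrame({"ts": [0, 125, 250], "frustration": [0.499, 0.512, 0.507]})

merged = (
    gaze.set_index("ts")
        .join(affect.set_index("ts"), how="outer")  # union of both sensor clocks
        .sort_index()
        .ffill()  # hold each sensor's last value until its next reading
)
print(merged)
```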


Integration | time window

Example of time-window integration: readings are aggregated over fixed time windows, one row per window, with each signal summarized within the window. The columns combine posture features (net seat change, net back change, sit forward, and the right/middle/left seat and back pressures) with mouse-pressure features (the overall mouse value and the right/left rear and front sensors).

Session 2

4. Analyzing Data


Tool | Eureqa

mathematical relationships in data


Tool | Eureqa

For reverse engineering searches of the data, the Eureqa tool [19] is used to discover mathematical expressions of the structural relationships in the data records.

For example, if a record holds information about the physical and emotional behavior of an individual who was engaged in a single experimental setting, Eureqa could take all the available sources of data and reveal both how the measure of engagement is calculated from specific data streams as well as how other sensors may influence the proposed emotional construct.

[19] R. Dubčáková, "Eureqa: software review," Genetic Programming and Evolvable Machines (2010), online first. doi:10.1007/s10710-010-9124-z.


Tool | Weka

Weka supports data pre-processing, classification, clustering, visualization, and exploration.

Tool | Weka

For clustering and classification approaches, Weka [20] is used. Weka implements a collection of machine learning algorithms for data mining tasks; it is used here to explore the data composition and relationships and to derive useful knowledge from the data records.

[20] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I.H. Witten, "The WEKA Data Mining Software: An Update," SIGKDD Explorations, 2009.
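Weka itself is driven through its GUI or its Java API; as a language-neutral illustration of the same cluster-then-inspect workflow, here is a scikit-learn sketch on a toy feature matrix (not the study data):

```python
import numpy as np
from sklearn.cluster import KMeans

# One row per time window; columns could be engagement, frustration, conductance.
X = np.array([[0.83, 0.54, 1.03],
              [0.84, 0.55, 1.01],
              [0.21, 0.91, 1.64],
              [0.25, 0.88, 1.59]])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # two distinct affective regimes, e.g. [1 1 0 0]
```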


Tool | Weka

Screenshots: Weka exploring data from P0009 (a novice playing an expert-level song), and Weka's documented API for Java developers.

Session 2

5. Sharing Experiences and Group Discussion


Experiences


Visualization

BCI and Gaze Points

engagement

This figure shows the engagement fixation points of an expert player playing in expert mode. The size of the circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
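A sketch of this kind of plot with matplotlib (toy fixations; marker area tracks fixation duration and gray level tracks the emotion value):

```python
import matplotlib.pyplot as plt

xs, ys = [570, 696, 804], [408, 405, 387]   # fixation points (screen pixels)
durations = [216, 216, 183]                 # fixation duration in ms
intensity = [0.55, 0.84, 0.34]              # emotion value in [0, 1]

plt.scatter(xs, ys, s=[d * 2 for d in durations], c=intensity,
            cmap="Greys", alpha=0.7, edgecolors="black")
plt.gca().invert_yaxis()                    # screen coordinates grow downward
plt.title("Engagement over fixation points")
plt.show()
```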


Visualization

BCI and Gaze Points

frustration

This figure shows the frustration fixation points of an expert player playing in expert mode. The size of the circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.


Visualization

BCI and Gaze Points

boredom

This figure shows the boredom fixation points of an expert player playing in expert mode. The size of the circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.


Visualization

BCI and Gaze Points

engagement

This figure shows the engagement gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons). For this user, the text on the bottom part of the first column was engaging.


Visualization

BCI and Gaze Points

frustration

This figure shows the frustration gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons). Looking at the cartoon is associated with a high frustration level.


Visualization

BCI and Gaze Points

boredom

This figure shows the boredom gaze points (above a threshold of 0.5) of a user reading material with seductive details (i.e., cartoons). Notice that the text in the middle part of the second column of that page was boring.


Matching Values

BCI and Face-Based Emotion Recognition


BCI-based value    Correlation with face-based model
excitement         0.284
engagement         0.282
meditation         0.188
frustration        0.275

Face-based value   Correlation with BCI-based model
agreement          0.760
concentrating      0.765
disagreement       0.794
interested         0.774
thinking           0.780
unsure             0.828

Our hypothesis is that, to a great extent, the values from one source can be inferred from the other.


Matching Values

Random Forest

"

[21] L. Breiman. Random forests. Machine Learning, 2001. Volume 45, Number 1. Pages 532.
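A minimal sketch of this matching step, assuming (as we read the previous slides) that a random forest [21] is trained to predict one face-based value from the BCI-based values; the file name bci_to_face.arff and the attribute layout are hypothetical, and Weka's implementation stands in for whatever toolkit is used:

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MatchModalities {
    public static void main(String[] args) throws Exception {
        // Hypothetical ARFF: BCI-based values as attributes, one face-based value as target.
        Instances data = DataSource.read("bci_to_face.arff");
        data.setClassIndex(data.numAttributes() - 1); // numeric target = face-based value
        RandomForest forest = new RandomForest();
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(forest, data, 10, new Random(1));
        // For a numeric target, the correlation coefficient mirrors the tables above.
        System.out.println("correlation = " + eval.correlationCoefficient());
    }
}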


Networks

BCI raw data | Engagement

Structural equations, adjacency matrices, and network graphs

Brain schematic showing the channels that contribute to engagement
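As an illustration of how these three representations fit together (the notation is ours, not taken from the slides): each channel gets a structural equation, the estimated coefficients are collected into an adjacency matrix, and non-zero entries become graph edges

  x_i(t) = \sum_{j \neq i} a_{ij}\, x_j(t) + \varepsilon_i(t), \qquad A = [a_{ij}]

where x_i is the signal on EEG channel i. An edge j \to i is drawn whenever a_{ij} is non-zero, and the channels with the strongest connections into the engagement model are the ones highlighted on the brain schematic.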


Group Discussion

devices

data

inferences


Session 2

6. Conclusions and Q&A


Conclusion

The goal of this tutorial was to provide attendees with enough information and examples to enable them to start their own investigations of the cognitive-affective elements of learning [22]. While the course does not present an exhaustive list of all the methods available for collecting, manipulating, analyzing, and interpreting affective sensor data, the tutorial describes the basis of a multimodal approach that attendees can use to launch their own research efforts.

[22] B. du Boulay, Towards a Motivationally-Intelligent Pedagogy: How should an intelligent tutor respond to the unmotivated or the demotivated?, Proc. New Perspectives on Affect and Learning Technologies, R. A. Calvo & S. D'Mello (Eds.), Springer-Verlag, in press.


References

1. R.S.J. Baker, M.M.T. Rodrigo, and U.E. Xolocotzin. The Dynamics of Affective Transitions in Simulation Problem-Solving Environments. Proc. Affective Computing and Intelligent Interaction: Second International Conference (ACII 07), A. Paiva, R. Prada & R. W. Picard (Eds.), Springer-Verlag, Lecture Notes in Computer Science 4738, pp. 666-677.
2. I. Arroyo, D. G. Cooper, W. Burleson, B. P. Woolf, K. Muldner, and R. Christopherson. Emotion Sensors Go to School. Proc. Artificial Intelligence in Education: Building Learning Systems that Care: from Knowledge Representation to Affective Modelling (AIED 09), V. Dimitrova, R. Mizoguchi, B. du Boulay & A. Graesser (Eds.), IOS Press, July 2009, Frontiers in Artificial Intelligence and Applications 200, pp. 17-24.
3. R. W. Picard. Affective Computing. MIT Press, 1997.
4. J. Gonzalez-Sanchez, R. M. Christopherson, M. E. Chavez-Echeagaray, D. C. Gibson, R. Atkinson, and W. Burleson. How to Do Multimodal Detection of Affective States? Proc. 11th IEEE International Conference on Advanced Learning Technologies (ICALT 2011), pp. 654-655, 2011.
5. Emotiv - Brain Computer Interface Technology. Retrieved April 26, 2011, from http://www.emotiv.com.
6. F. Sharbrough, G.-E. Chatrian, R.P. Lesser, H. Lüders, M. Nuwer, and T.W. Picton. American Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature. J. Clin. Neurophysiol. 8: 200-202.
7. Electroencephalography. Retrieved November 14, 2010, from Electric and Magnetic Measurement of the Electric Activity of Neural Tissue: http://www.bem.fi/book/13/13.htm
8. R. El Kaliouby and P. Robinson. Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures. Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW 04), IEEE Computer Society, June 2004, Vol. 10, p. 154.
9. Tobii Technology - Eye Tracking and Eye Control. Retrieved April 26, 2011, from http://www.tobii.com.
10. M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R.W. Picard. The HandWave Bluetooth Skin Conductance Sensor. Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII 05), Springer-Verlag, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90.
11. Y. Qi and R. W. Picard. Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure Pattern Classification. Proc. International Conference on Pattern Recognition (ICPR 02), Aug. 2002, vol. 3, p. 448, doi:10.1109/ICPR.2002.1047973.
12. D. Cooper, I. Arroyo, B. Woolf, K. Muldner, W. Burleson, and R. Christopherson. Sensors Model Student Self Concept in the Classroom. Proc. User Modeling, Adaptation, and Personalization (UMAP 09), 2009, pp. 30-41.
13. S. Mota and R. W. Picard. Automated Posture Analysis for Detecting Learner's Interest Level. Proc. Computer Vision and Pattern Recognition Workshop (CVPRW 03), IEEE Press, June 2003, vol. 5, p. 49, doi:10.1109/CVPRW.2003.10047.
14. K. VanLehn, W. Burleson, M.E. Chavez-Echeagaray, R. Christopherson, J. Gonzalez-Sanchez, Y. Hidalgo-Pontet, and L. Zhang. The Affective Meta-Tutoring Project: How to Motivate Students to Use Effective Meta-Cognitive Strategies. Proc. 19th International Conference on Computers in Education, Asia-Pacific Society for Computers in Education, Chiang Mai, Thailand, October 2011, in press.
15. J. Gonzalez-Sanchez, M.E. Chavez-Echeagaray, R. Atkinson, and W. Burleson. ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework. Proc. Working IEEE/IFIP Conference on Software Architecture (WICSA 2011), June 2011.
16. F. Tuijnman and H. Afsarmanesh. Distributed Objects in a Federation of Autonomous Cooperating Agents. Proc. International Conference on Intelligent and Cooperative Information Systems, May 1993, pp. 256-265, doi:10.1109/ICICIS.1993.291763.
17. M. Wood and S. DeLoach. An Overview of the Multiagent Systems Engineering Methodology. Agent-Oriented Software Engineering (AOSE 2000), Springer-Verlag, 2001, pp. 207-221, doi:10.1007/3-540-44564-1_14.
18. J. Liu, S. Ji, and J. Ye. SLEP: Sparse Learning with Efficient Projections. Arizona State University, 2009. http://www.public.asu.edu/~jye02/Software/SLEP.
19. R. Dubcakova. Eureqa - Software Review. Genetic Programming and Evolvable Machines, 2010, online first, doi:10.1007/s10710-010-9124-z.
20. M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I.H. Witten. The WEKA Data Mining Software: An Update. SIGKDD Explorations, 2009, Volume 11, Issue 1.
21. L. Breiman. Random Forests. Machine Learning, 2001, Volume 45, Number 1, pp. 5-32.
22. B. du Boulay. Towards a Motivationally-Intelligent Pedagogy: How Should an Intelligent Tutor Respond to the Unmotivated or the Demotivated? Proc. New Perspectives on Affect and Learning Technologies, R. A. Calvo & S. D'Mello (Eds.), Springer-Verlag, in press.

Questions | Answers


Acknowledgements

This research was supported by the Office of Naval Research under Grant N00014-10-1-0143, awarded to Dr. Robert Atkinson, and by the National Science Foundation, Award 0705554, IIS/HCC Affective Learning Companions: Modeling and Supporting Emotion During Teaching, awarded to Dr. Beverly Woolf and Dr. Winslow Burleson.


lsrl.lab.asu.edu | hci.asu.edu | {javiergs, helenchavez}@asu.edu
