CHAPTER-1
INTRODUCTION
1.1 AIMS
1) To design smarter devices
2) To create devices with emotional intelligence
3) To create computational devices with perceptual abilities
The idea of giving computers personality or, more accurately, "emotional
intelligence" may seem creepy, but technologists say such machines would offer important
advantages.
Despite their lightning speed and awesome powers of computation, today's PCs
are essentially deaf, dumb, and blind. They can't see you, they can't hear you, and they certainly
don't care a whit how you feel. Every computer user knows the frustration of nonsensical error
messages, buggy software, and abrupt system crashes. We might berate the computer as if it were
an unruly child, but, of course, the machine can't respond. "It's ironic that people feel like
dummies in front of their computers, when in fact the computer is the dummy," says Rosalind
Picard, a computer science professor at the MIT Media Lab in Cambridge. An emotionally
intelligent machine could, for instance, sense that a stressed-out user is about to explode and
warn him to slow down and cool off.
TRACKS USED
Our emotional changes are mostly reflected in our pulse rate,
breathing rate, facial expressions, eye movements, voice, etc. These are therefore the
parameters on which Blue Eyes technology is being developed.
Making computers see and feel: Blue Eyes uses sensing technology to identify
a user's actions and to extract key information. This information is then analyzed to determine
the user's physical, emotional, or informational state, which in turn can be used to help make the
user more productive by performing expected actions or by providing expected information.
Beyond making computers more personable, researchers say there is another compelling
reason for giving machines emotional intelligence. Contrary to the common wisdom that
emotions contribute to irrational behavior, studies have shown that feelings actually play a vital
role in logical thought and decision-making. Emotionally impaired people often find it difficult
to make decisions because they fail to recognize the subtle clues and signals (does this make me
feel happy or sad, excited or bored?) that help direct healthy thought processes. It stands to
reason, therefore, that computers that can emulate human emotions are more likely to behave
rationally, in a manner we can understand. Emotions are like the weather: we only pay attention
to them when there is a sudden outburst, like a tornado, but in fact they are constantly operating
in the background, helping to monitor and guide our day-to-day activities.
Picard, who is also the author of the groundbreaking book Affective
Computing, argues that computers should operate under the same principle. "They have
tremendous mathematical abilities, but when it comes to interacting with people, they are
autistic," she says. "If we want computers to be genuinely intelligent and interact naturally with
us, we must give them the ability to recognize, understand, and even to have and express
emotions." Imagine the benefit of a computer that could remember that a particular Internet
search had resulted in a frustrating and futile exploration of cyberspace. Next time, it might
modify its investigation to improve the chances of success when a similar request is made.
1.2
The world of science cannot be measured in terms of development and progress; it shows how
far the human mind can work and think. It has now reached the technology known as Blue Eyes,
which can sense and control human emotions and feelings through gadgets. The eyes,
fingers and speech are the elements which help to sense the emotion level of the human body. This
paper implements a new technique, known as the Emotion Sensory World of Blue Eyes technology,
which identifies human emotions (sad, happy, excited or surprised) using image processing
techniques: the eye portion is extracted from the captured image and compared with
stored images in a database. After the mood is identified, songs are played to bring the human
emotion level back to normal.
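As a sketch, the pipeline described above can be reduced to a nearest-template match followed by a song lookup. Everything below is invented for illustration: the template vectors, mood labels and song filenames are hypothetical, and real eye-region feature extraction (e.g. cropping the eye portion from the captured image) is abstracted away.

```python
# Hypothetical sketch of the Emotion Sensory World pipeline: an eye-region
# feature vector is compared against stored template vectors, and the
# nearest template's mood label selects a song to normalize the mood.
import math

# Assumed template database: mood -> representative eye-region feature vector
TEMPLATES = {
    "happy":     [0.9, 0.2, 0.1],
    "sad":       [0.1, 0.8, 0.3],
    "excited":   [0.8, 0.1, 0.9],
    "surprised": [0.3, 0.9, 0.8],
}

SONGS = {  # hypothetical mood -> song mapping
    "happy":     "upbeat.mp3",
    "sad":       "soothing.mp3",
    "excited":   "calm.mp3",
    "surprised": "ambient.mp3",
}

def classify_mood(features):
    """Return the mood whose stored template is nearest (Euclidean)."""
    return min(TEMPLATES, key=lambda m: math.dist(features, TEMPLATES[m]))

def song_for(features):
    return SONGS[classify_mood(features)]

print(song_for([0.85, 0.15, 0.2]))  # nearest to "happy" -> upbeat.mp3
```

A real implementation would replace the hand-written vectors with features computed from the extracted eye image, but the compare-against-database step has this shape.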
Blue Eyes Technology aims at creating computational machines with perceptual and
sensory abilities like those of human beings. The Blue Eyes system is thus a versatile system
which can be modified to cater to the working environment. It consists of hardware with
software loaded on it, and can be applied in every working environment requiring the operator's
permanent attention. The hardware comprises a Data Acquisition Unit and a Central System Unit;
the heart of the Data Acquisition Unit is the ATMEL 89C52 microcontroller, and Bluetooth
technology is used for communication and coordination between the two units. The system
provides technical means for monitoring and recording a human operator's physiological
condition. Blue Eyes is also a project aiming to serve as a stress reliever, driven by the
advanced technology of studying facial expressions to judge the intensity of the stress being
handled. In totality, Blue Eyes aims at adding perceptual abilities to computers, which would
result in a healthy, stress-free working environment.
1.3
Paul Ekman's facial expression work gave the correlation between a person's emotional
state and the person's physiological measurements, and described the Facial Action Coding
System (Ekman and Rosenberg, 1997).
His experiment involved participants attached to devices that recorded certain
measurements, including pulse, galvanic skin response (GSR), temperature and somatic
movement.
Six participants were trained to exhibit the facial expressions of the six
basic emotions: anger, fear, sadness, disgust, joy and surprise. The physiological changes
associated with affect were assessed and analyzed.
CHAPTER-2
METHODOLOGY
2.1.Affective Computing:
The process of making emotional computers with sensing abilities is known as affective
computing. The steps used in this are:
1) Giving sensing abilities
2) Detecting human emotions
3) Responding properly
The first step, researchers say, is to give machines the equivalent of the eyes,
ears, and other sensory organs that humans use to recognize and express emotion. To that end,
computer scientists are exploring a variety of mechanisms including voice-recognition software
that can discern not only what is being said but the tone in which it is said; cameras that can track
subtle facial expressions, eye movements, and hand gestures; and biometric sensors that can
measure body temperature, blood pressure, muscle tension, and other physiological signals
associated with emotion.
In the second step, the computers have to detect even the minor variations of
our moods. For example, a person may hit the keyboard very fast either in a happy mood or in an
angry mood.
In the third step, the computers have to react in accordance with the
emotional states. Various methods of accomplishing affective computing are:
1) AFFECT DETECTION
2) MAGIC POINTING
3) SUITOR
4) EMOTION MOUSE
1) AFFECT DETECTION
This is the method of detecting our emotional states from the expressions on
our face. Algorithms amenable to real-time implementation that extract information from facial
expressions and head gestures are being explored. Most of the information is extracted from the
position of the eyebrows and the corners of the mouth.
2) MAGIC POINTING
MAGIC stands for Manual And Gaze Input Cascaded pointing. A computer with
this technology could move the cursor by following the direction of the user's eyes. This type
of technology will enable the computer to automatically transmit information related to the
screen that the user is gazing at. Also, it will enable the computer to determine, from the
user's expression, whether he or she understood the information on the screen, before
automatically deciding to proceed to the next program. The user pointing is still done by
hand, but the cursor always appears at the right position as if by MAGIC. By combining manual
input technology and eye tracking, we get MAGIC pointing.
3) SUITOR
SUITOR stands for Simple User Interest Tracker. It implements a
method for putting computational devices in touch with their users' changing moods. By
watching which Web page the user is currently browsing, SUITOR can find additional
information on that topic. The key is that the user simply interacts with the computer as usual,
and the computer infers user interest based on what it sees the user do.
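The SUITOR idea above can be sketched as a small interest tracker. The class name, topics and dwell times below are hypothetical; a real system would feed it gaze and browsing events rather than hand-written observations.

```python
# Hedged sketch of SUITOR-style interest inference: accumulate how long
# the user dwells on each topic while browsing, then surface additional
# information for the most-attended topic.
from collections import defaultdict

class InterestTracker:
    def __init__(self):
        self.attention = defaultdict(float)  # topic -> seconds of dwell time

    def observe(self, topic, dwell_seconds):
        """Record that the user spent dwell_seconds on a page about topic."""
        self.attention[topic] += dwell_seconds

    def suggest(self):
        """Return the topic the user currently appears most interested in."""
        return max(self.attention, key=self.attention.get)

tracker = InterestTracker()
tracker.observe("eye tracking", 12.5)
tracker.observe("bluetooth", 3.0)
tracker.observe("eye tracking", 8.0)
print(tracker.suggest())  # -> eye tracking
```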
4) EMOTION MOUSE
This is a mouse embedded with sensors that can sense physiological
attributes such as temperature, body pressure, pulse rate, touching style, etc.
The computer can determine the user's emotional state from a single touch. IBM is still
performing research on this mouse, which is expected to be available in the market within the
next two or three years. The expected accuracy is 75%.
One goal of human-computer interaction (HCI) is to make an adaptive, smart
computer system. This type of project could possibly include gesture recognition, facial
recognition, eye tracking, speech recognition, etc. Another non-invasive way to obtain
information about a person is through touch. People use their computers to obtain, store and
manipulate data.
In order to start creating smart computers, the computer must start gaining
information about the user. Our proposed method for gaining user information through touch is
via a computer input device, the mouse. From the physiological data obtained from the user, an
emotional state may be determined which would then be related to the task the user is currently
doing on the computer. Over a period of time, a user model will be built in order to gain a sense
of the user's personality.
The scope of the project is to have the computer adapt to the user in
order to create a better working environment where the user is more productive. The first steps
towards realizing this goal are described here.
2.2. EMOTION AND COMPUTING
Rosalind Picard (1997) describes why emotions are important to the computing
community. There are two aspects of affective computing: giving the computer the ability to
detect emotions and giving the computer the ability to express emotions. Not only are emotions
crucial for rational decision making, but emotion detection is an important step towards an
adaptive computer system. An adaptive, smart computer system has been driving our efforts to
detect a person's emotional state.
By matching a person's emotional state with the context of the expressed
emotion over a period of time, the person's personality is exhibited. Therefore, by giving
the computer a longitudinal understanding of the emotional state of its user, the computer could
adapt a working style which fits its user's personality. The result of this collaboration could
be increased productivity for the user. One way of gaining information from a user
non-intrusively is by video; cameras have been used to detect a person's emotional state. We
have explored gaining information through touch, and one obvious place to put sensors is on the
mouse.
2.3. MAGIC Pointing:
This work explores a new direction in utilizing eye gaze for computer input. Gaze tracking has
long been considered as an alternative or potentially superior pointing method for computer
input. We believe that many fundamental limitations exist with traditional gaze pointing. In
particular, it is unnatural to overload a perceptual channel such as vision with a motor control
task. We therefore propose an alternative approach, dubbed MAGIC (Manual and Gaze Input
Cascaded) pointing. With such an approach, pointing appears to the user to be a manual task,
used for fine manipulation and selection. However, a large portion of the cursor movement is
eliminated by warping the cursor to the eye gaze area, which encompasses the target.
Two specific MAGIC pointing techniques, one conservative and one liberal,
were designed, analyzed, and implemented with an eye tracker we developed. They were then
tested in a pilot study. This early stage exploration showed that the MAGIC pointing techniques
might offer many advantages, including reduced physical effort and fatigue as compared to
traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and
possibly faster speed than manual pointing.
In our view, there are two fundamental shortcomings to the existing gaze
pointing techniques, regardless of the maturity of eye tracking technology
First, given the one-degree size of the fovea and the subconscious jittery
motions that the eyes constantly produce, eye gaze is not precise enough to operate UI widgets
such as scrollbars, hyperlinks, and slider handles. Second, and perhaps more importantly, the eye,
as one of our primary perceptual devices, has not evolved to be a control organ. Sometimes its
movements are voluntarily controlled while at other times it is driven by external events. With
the target selection by dwell time method, considered more natural than selection by blinking [7],
one has to be conscious of where one looks and how long one looks at an object. If one does not
look at a target continuously for a set threshold (e.g., 200ms), the target will not be successfully
selected.
Once the cursor position had been redefined, the user would need to only make
a small movement to, and click on, the target with a regular manual input device. We have
designed two MAGIC pointing techniques, one liberal and the other conservative in terms of
target identification and cursor placement.
4. Speed. Since the need for large magnitude pointing operations is less than with pure manual
cursor control, it is possible that MAGIC pointing will be faster than pure manual pointing.
5. Improved subjective speed and ease-of-use. Since the manual pointing amplitude is smaller,
the user may perceive the MAGIC pointing system to operate faster and more pleasantly than
pure manual control, even if it operates at the same speed or more slowly.
The fourth point warrants further discussion. According to the well-accepted Fitts'
law, manual pointing time is logarithmically proportional to the A/W ratio, where A is the
movement distance and W is the target size. In other words, targets which are smaller or farther
away take longer to acquire.
For MAGIC pointing, since the target size remains the same but the cursor
movement distance is shortened, the pointing time can hence be reduced. It is less clear whether
eye gaze control follows Fitts' law. In Ware and Mikaelian's study, selection time was shown to
be logarithmically proportional to target distance, thereby conforming to Fitts' law. To the
contrary, Sibert and Jacob [9] found that trial completion time with eye tracking input
increases little with distance, therefore defying Fitts' law.
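The argument above can be made concrete with the common Shannon formulation of Fitts' law, MT = a + b·log2(A/W + 1). The constants a and b below are hypothetical device-dependent values chosen only for illustration; the point is that shortening the movement distance A while W stays fixed lowers the predicted pointing time.

```python
# Numerical illustration of Fitts' law (Shannon formulation) as used in
# the argument above: MT = a + b * log2(A/W + 1). Constants a and b are
# invented device-dependent parameters, not measured values.
import math

def fitts_time(A, W, a=0.1, b=0.15):
    """Predicted movement time (s) for distance A and target width W."""
    return a + b * math.log2(A / W + 1)

# Warping the cursor near the target shortens A while W is unchanged,
# so the predicted pointing time drops:
print(round(fitts_time(A=800, W=20), 3))  # full-distance manual pointing
print(round(fitts_time(A=120, W=20), 3))  # after a MAGIC warp (shorter A)
```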
In addition to problems with today's eye tracking systems, such as delay,
error, and inconvenience, there may also be many potential human factor disadvantages to the
MAGIC pointing techniques we have proposed, including the following:
1. With the more liberal MAGIC pointing technique, the cursor warping can be overactive at
times, since the cursor moves to the new gaze location whenever the eye gaze moves more than a
set distance (e.g., 120 pixels) away from the cursor. This could be particularly distracting when
the user is trying to read. It is possible to introduce additional constraint according to the context.
For example, when the user's eye appears to follow a text reading pattern, MAGIC pointing can
be automatically suppressed.
2. With the more conservative MAGIC pointing technique, the uncertainty of the exact location
at which the cursor might appear may force the user, especially a novice, to adopt a cumbersome
strategy: touch (use the manual input device to activate the cursor), wait (for the cursor to
appear), and move (the cursor to the target manually). Such a strategy may prolong the target
acquisition time. The user may have to learn a novel hand-eye coordination pattern to be
efficient with this technique.
[Figure: the conservative MAGIC pointing technique. Labels: gaze position reported by the eye
tracker; eye-tracking boundary with 95% confidence; the true target will be within the circle
with 95% probability; the cursor is warped to the boundary of the gaze area along the initial
actuation vector; previous cursor position, far from target; initial manual actuation vector.]
3. With pure manual pointing techniques, the user, knowing the current cursor location, could
conceivably perform his motor acts in parallel to visual search. Motor action may start as soon as
the user's gaze settles on a target. With MAGIC pointing techniques, the motor action
computation (decision) cannot start until the cursor appears. This may negate the time saving
gained from the MAGIC pointing technique's reduction of movement amplitude. Clearly,
experimental (implementation and empirical) work is needed to validate, refine, or invent
alternative MAGIC pointing techniques.
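The liberal technique's warping rule from point 1 above can be sketched as follows. The 120-pixel threshold is the example distance given in the text; the function name and coordinate representation are ours.

```python
# Sketch of the liberal MAGIC pointing rule: the cursor warps to the gaze
# position whenever the gaze strays more than a set distance from the
# current cursor, and stays put (for manual fine control) otherwise.
import math

WARP_THRESHOLD_PX = 120  # example threshold from the text

def magic_cursor(cursor, gaze):
    """Return the new cursor position under the liberal warping rule."""
    if math.dist(cursor, gaze) > WARP_THRESHOLD_PX:
        return gaze      # gaze moved far away: warp cursor to the gaze area
    return cursor        # small drift: leave the cursor to manual control

print(magic_cursor((0, 0), (300, 50)))  # far gaze -> cursor warps
print(magic_cursor((0, 0), (40, 30)))   # nearby gaze -> cursor stays
```

The context-sensitive suppression mentioned in the text (e.g. during reading) would simply gate this function off when the gaze follows a text-reading pattern.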
2.4. Emotional Mouse:
For Hand:
Emotion Mouse
Sentic Mouse
For Eyes:
Expression Glasses
Magic Pointing
Eye Tracking
For Voice:
Artificial Intelligence Speech Recognition
One proposed, non-invasive method for gaining user information through touch is via a
computer input device, the mouse. This allows the user's cardiac rhythm, body temperature,
electrical conductivity of the skin and other physiological attributes to be related to mood.
This has led to the creation of the Emotion Mouse. The device can measure heart rate,
temperature, galvanic skin response and minute bodily movements and matches them with six
emotional states: happiness, surprise, anger, fear, sadness and disgust. The mouse includes a set
of sensors, including infrared detectors and temperature-sensitive chips. These components,
researchers stress, will also be crafted into other commonly used items such as the office chair,
the steering wheel, the keyboard and the phone handle. Integrating the system into the steering
wheel, for instance, could allow an alert to be sounded when a driver becomes drowsy.
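As a hedged sketch of how the Emotion Mouse's readings might be matched to the six states named above: the per-state reference readings below (heart rate in bpm, skin temperature in °C, GSR in microsiemens) are invented for illustration, and a real system would calibrate them per user.

```python
# Nearest-profile matching of Emotion Mouse sensor readings to the six
# emotional states. All reference values are hypothetical placeholders.

STATE_PROFILES = {  # state -> (heart rate bpm, skin temp C, GSR uS)
    "happiness": (72, 33.5, 2.0),
    "surprise":  (85, 33.0, 4.0),
    "anger":     (95, 34.5, 5.5),
    "fear":      (98, 31.5, 6.0),
    "sadness":   (65, 32.0, 1.5),
    "disgust":   (78, 33.8, 3.0),
}

SCALES = (100.0, 35.0, 6.0)  # rough full-scale values to normalize units

def match_state(reading):
    """Return the state whose profile is nearest over normalized channels."""
    def dist(profile):
        return sum(((r - p) / s) ** 2
                   for r, p, s in zip(reading, profile, SCALES))
    return min(STATE_PROFILES, key=lambda st: dist(STATE_PROFILES[st]))

print(match_state((96, 31.7, 5.8)))  # closest to the "fear" profile
```

Normalizing each channel by a full-scale value keeps the mixed units (bpm, °C, µS) from letting one sensor dominate the distance.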
Information Obtained From the Emotion Mouse
1. Behavior
   a. Mouse movements
2. Physiological information
   a. Heart rate (electrocardiogram (ECG/EKG), photoplethysmogram (PPG))
   b.
has to make many comparisons before recognition occurs. This necessitates the use of very
high-speed processors. A large RAM is also required: even though a spoken word may last only a
few hundred milliseconds, it is translated into many thousands of digital words. It is
important to note that the alignment of words and templates must be matched correctly in time
before computing the similarity score. This process, termed dynamic time warping, recognizes
that different speakers pronounce the same words at different speeds as well as elongate
different parts of the same word. This is important for speaker-independent recognizers.
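The dynamic time warping step described above can be sketched as follows. The one-dimensional "feature frames" stand in for real acoustic feature vectors; the algorithm is the classic quadratic-time DTW recurrence, not any particular recognizer's implementation.

```python
# Minimal dynamic time warping (DTW): align two spoken-word feature
# sequences of different lengths before computing a similarity score.

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW cost between two feature sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local frame distance
            cost[i][j] = d + min(cost[i - 1][j],      # stretch sequence a
                                 cost[i][j - 1],      # stretch sequence b
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

# The same "word" spoken slowly and quickly aligns with zero cost,
# which is exactly why DTW handles speaker speed variation:
slow = [1, 1, 2, 2, 3, 3]
fast = [1, 2, 3]
print(dtw_distance(slow, fast))  # -> 0.0
```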