
Emotion and Group Decision Making in Artificial Intelligence

Goreti Marreiros1, Carlos Ramos1 and José Neves2


1 GECAD – Knowledge Engineering and Decision Support Group
Institute of Engineering – Polytechnic of Porto
Porto, Portugal
{goreti, csr}@dei.isep.ipp.pt

2 University of Minho
Braga, Portugal
jneves@di.uminho.pt
Abstract
New economic conditions, particularly the increase of inter- and intra-organizational
competitiveness, lead more and more to assigning decision-making responsibility to groups
rather than to individuals. Traditionally, emotions and affects have been separated from cognitive
and rational thinking; they have carried a negative connotation with regard to individuals'
behaviour, in particular with regard to the decision-making process. However, in recent years,
researchers from several distinct areas (psychology, neuroscience, philosophy, etc.) have
begun to explore the role of emotion as a positive influence on the human decision-making
process. Current research in Artificial Intelligence also demonstrates a growing interest in
emotional agents. From human-computer interaction to the development of believable agents for the
entertainment industry, and the modelling and simulation of human behaviour, emotional agents
have a wide variety of application areas.
This paper presents a survey of the role of emotions in individual and group decision making
and discusses the process of emotional contagion. It then discusses the application of
those concepts to the simulation of group decision making through the use of emotional software
agents. Ethical aspects related to the use of emotional agents are also debated. All these concepts
are being applied in the development of a prototype under the scope of the ArgEmotionAgents
project (POSI/EIA/56259/2004 - Argumentative Agents with Emotional Behaviour Modelling for
Participants' Support in Group Decision-Making Meetings), a project supported by FCT – the
Portuguese Science and Technology Foundation.

Keywords: Group Decision Making; Emotional Agents

Introduction
The problem of group decision making has gained great relevance in the scope of Decision
Support Systems, which were initially designed as individual tools. Those tools quickly proved
to be limited, in the sense that in today's organizations several persons, entities or
agents are involved in most decision processes. In this way, decision problems are
considered from different points of view, with different opinions about the importance of the
decision criteria (for instance, in the purchase of a car we may consider criteria like
price, technical characteristics, design or manufacturer). Numerous commercial and
non-commercial Group Decision Support Systems (GDSS) have been developed in the last few years
(GroupSystems software; Marreiros et al., 2004; Karacapilidis and Papadias, 2001). Despite the
quality of these systems, they present some limitations. In our recent work we have proposed
some new ideas for GDSS (Marreiros et al., 2005a): the use of Multi-Agent Systems to model
group participants, and the inclusion of argumentation and emotional aspects in the group
decision-making process.
The use of Multi-Agent Systems seems very suitable for simulating the behaviour of groups of
people working together and, in particular, for modelling group decision making, because it allows
(Marreiros et al., 2005b):
• Individual modelling – each participant in the group decision making can be represented by
an agent that will interact with other agents. Agents can be modelled with social and
emotional characteristics in order to become more realistic.
• Flexibility – with this approach it is easy to incorporate or remove entities. It is also possible
to change the characteristics of the individuals, for instance in order to analyse their impact
on the group behaviour.
• Data distribution – frequently, in group decision making, participants are geographically
distributed. With this approach, the agents that represent participants may be running on
different machines.
What is the role of emotion in generic decision-making processes? The neuroscientist António
Damásio, as well as other researchers, argues that emotion affects the decision-making process
(Damásio, 1994). Moreover, various researchers identify emotion as one of the key elements of
the intelligence and adaptability of human beings (Goleman, 1995; LeDoux, 1996; Bechara
et al., 1997).
This contradicts the thought that dominated for several centuries, according to which
emotion is an obstacle to reason. Plato, for instance, stated that passions, desires
and fears make it impossible for us to think (LeDoux, 1996). In the XVII century Descartes
echoed the idea that emotion and reason are incompatible, with his famous
thought "I think, therefore I am" (Descartes, 1989). His theory posits the separation of mind
and body, where emotions are needs and impulses created by the body, and the mind is
responsible for all the processes associated with reason.
In this paper, and more generally in the ArgEmotionAgents project, we will try to illustrate the
importance of emotions in decision-making processes. We present a survey on the role of
emotions in individual and group decision making and, in particular, discuss the process
of emotional contagion between group members (in the case of group decision making). Further,
we give a brief overview of how emotions have been treated in the Artificial Intelligence
domain, discuss how the presented concepts will be integrated in the scope of the
ArgEmotionAgents project, and present some conclusions in the final section of the paper.

Emotion and Decision Making


Common sense usually tells us that a great deal of emotion can harm the decision-making process
but, on the other hand, Rosalind Picard, for instance, claims that too little emotion can impair
decision making as well (Picard, 1997). It seems that, in decision-making processes, emotion is
needed in a balanced way.
The terms emotion, mood and affect are often used interchangeably. According to Forgas
(1995), affect is the most generic term and is usually used to refer to both mood and emotion.
Emotion normally refers to an intense experience, of short duration (seconds to minutes), with a
specific origin, of which the individual is generally conscious. In contrast, moods tend to
be less intense, longer lasting (hours or even days) and may remain unconscious to the individual.
Moods may be caused by an intense or recurrent emotion, or by environmental aspects.

Individual Decision Making


The work developed by the neuroscientist António Damásio (Damásio, 1994) presented new
neurological evidence of the relevance of emotion in individual decision making. In his work he
establishes a clear relation between specific brain lesions (pre-frontal cortex) and emotional
(in)capabilities. The analysed patients show normal results in IQ tests, but they are unable to deal
with real-world decisions such as, for instance, scheduling an appointment with someone else. In
conclusion, Damásio argues that emotion and reason are constitutive elements, which means that
there is no opposition between emotion and rationality (Damásio, 2000). It should not be
necessary to choose between reason, emotion, knowledge or logic. The human being needs all
these mechanisms to make timely, high-quality decisions.
In the psychological literature several examples can be found of how emotions and moods
affect the individual decision-making process:
• Individuals are more predisposed to recall past memories that are congruent with their present
feelings.
• Positive moods tend to promote risk-averse behaviour, while negative moods promote
risk-taking behaviour.
• Affect influences the individual's information-processing strategy. Positive moods tend to be
associated with heuristic processing, while negative moods are more related to systematic
processing.

Group Decision Making


As seen in the last section, several researchers are devoted to the study of the role of
emotion in individual decision making. The impact of emotion on group decision-making
processes has been less discussed. Most researchers analyse group performance based on
concepts like group size, group heterogeneity and group diversity. Emotion will influence the
individual decisions of the group members, but during group decision making, members
may also be influenced by the emotions displayed by other members.
Emotional contagion can be defined as the tendency to automatically mimic and
synchronize facial expressions, vocalizations, postures, and movements with those of another
person and, consequently, to converge emotionally (Hatfield, Cacioppo and Rapson, 1992). The
process of emotional contagion can be analysed based on the emotions that a group member is
feeling (for instance, if one or more members are feeling fear, this may induce that emotion in
other members, besides alerting them to the possibility of something bad happening), or based
on the group members' moods (Neumann and Strack, 2000): if a member of the group is in a
negative/positive mood, this may induce that mood in the other members.
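
As an illustration of how this mood contagion process might be operationalised in a group of software agents, the Python sketch below updates an agent's mood as a weighted blend of its own mood and the moods displayed by the other members. The [-1, 1] mood scale and the susceptibility parameter are assumptions made for illustration; they are not taken from the cited psychological work.

def update_mood(own_mood: float, displayed_moods: list, susceptibility: float = 0.3) -> float:
    # Sketch of mood contagion: drift the agent's mood towards the average mood
    # displayed by the other group members. Moods use an assumed [-1, 1] scale
    # (negative to positive); the linear blending rule is also an assumption.
    if not displayed_moods:
        return own_mood
    group_mood = sum(displayed_moods) / len(displayed_moods)
    new_mood = (1.0 - susceptibility) * own_mood + susceptibility * group_mood
    return max(-1.0, min(1.0, new_mood))   # keep the result inside the assumed range

For instance, update_mood(0.2, [-0.6, -0.4, 0.1]) pulls a mildly positive member slightly towards the group's more negative mood (the result is 0.05).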
One of the disadvantages usually associated with group decision making is groupthink, which means
that thinking as a group may unintentionally suppress some critical thoughts in order to, for
instance, maintain group unanimity. In our opinion the process of emotional contagion will enhance
the decision process without the problems associated with groupthink.
Emotion in Artificial Intelligence
Wooldridge and Jennings (1995) distinguished two notions of agency: (i) a weak definition,
where an agent is defined according to the following characteristics: autonomy, social ability,
reactivity and pro-activeness; and (ii) a strong definition, which comes essentially from
researchers in the Artificial Intelligence (AI) area and sees agents as having anthropomorphic
characteristics.
The study of emotions in AI is not new; in the 1960s Herbert Simon studied the role of
emotions in the cognitive process (Simon, 1967). However, only recently have researchers
devoted more attention to this subject (Bates, 1994; Bazzan et al., 2002; Botelho and Coelho, 2001;
Cañamero, 1997; Elliott, 1992; Ortony et al., 1988; Ortony, 2003; Picard, 1997, 2003; Velásquez,
1998).
Emotion has several definitions and usages inside AI. For authors who work with believable
agents, emotion is a way of improving the believability of artificial agents (Bates, 1994; Ortony,
2003). Rosalind Picard (2003) enumerates four major reasons to give machines emotional
characteristics:
• Emotions may be useful in the creation of robots and believable characters with the capacity
to emulate humans and animals. The use of emotion gives agents more credibility.
• The capacity to express and understand emotions could be very useful for a better interaction
between humans and machines, making this relationship less frustrating.
• The possibility of building intelligent machines, although this concept is somewhat vague.
• The possibility of understanding human emotions by modelling them.
The last point is particularly relevant to the ArgEmotionAgents project because, although the main
goal is certainly not a deep study of human emotions, we intend to simulate the
behaviour of a group in a decision-making process. To realize this simulation it is, in our opinion,
absolutely necessary to incorporate emotions, because they influence group and individual
behaviour.
Is the use of agents free of concerns? No. Concepts such as privacy, responsibility, trust and
delegation are always under discussion:
• The delegation of competencies to an agent implies the existence of trust in that agent. How
can we assure that the agent really represents the interests of the user? The process of trust
acquisition is complex and slow.
• If the agent acts on behalf of the user, how can we be sure that the user's private data is
kept private?
• If a problem occurs, who should be held accountable? The multi-agent system, the person
(or organization) who conceives the system, or the entity responsible for the delegation of
competencies?
The use of emotional agents raises other questions, for instance:
• Should agents be allowed to hide their emotions from each other and from humans?
• Are users always aware that they are interacting with agents, or may the agents' believability
leave users confused?
Using Emotional Agents in Group Decision Making Simulation
The ArgEmotionAgents project envisages the use of a Multi-Agent Systems approach for
supporting group decision-making processes, where argumentation and emotion components
are especially important. Emotional agents will be used in this project to simulate the members of
a decision-making group. The architecture of the participant agents is composed of several
modules: the emotional module, the argumentation module and the decision-making module. The
focus of this paper is on the emotional component, whose architecture is shown in figure 1.

[Figure 1 – Emotional module: appraisal, selection and decay components, with emotions and mood]

The emotion model adopted in our implementation of emotional agents is a revised version of
the OCC model (Ortony, 2003), initially created by Ortony, Clore and Collins (Ortony et al.,
1988). In this model, five categories of positive emotions (joy, hope, relief, pride and gratitude)
and five categories of negative emotions (distress, fear, disappointment, remorse and anger) are
considered. The emotional module is composed of three main components: appraisal – based on
the OCC model, the intensities of potential emotions are calculated; selection – each emotion has
an activation threshold, which can be influenced by the agent's mood, and this component selects
the dominant emotion; and decay – emotions have a short duration, but they do not vanish
instantaneously; they have a period of decay. The agent's mood is calculated based on the
emotions the agent felt in the past and on the moods of the remaining participants. In our approach
only the process of mood contagion is being considered.
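
As an illustration of how these three components might fit together, the following sketch implements a simplified emotional module. The OCC-based appraisal computation itself is abstracted away (emotion intensities are assumed to be supplied by an upstream appraisal of events), and the thresholds, decay rate and mood influence used here are illustrative assumptions, not the parameters actually used in the project.

import time
from dataclasses import dataclass, field

# Illustrative activation thresholds for the ten OCC categories mentioned above.
DEFAULT_THRESHOLDS = {e: 0.3 for e in (
    "joy", "hope", "relief", "pride", "gratitude",
    "distress", "fear", "disappointment", "remorse", "anger")}
POSITIVE = {"joy", "hope", "relief", "pride", "gratitude"}

@dataclass
class EmotionalModule:
    mood: float = 0.0                           # mood on an assumed [-1, 1] scale
    decay_rate: float = 0.1                     # intensity lost per second (assumed)
    active: dict = field(default_factory=dict)  # emotion -> [intensity, last_update]

    def appraise(self, intensities: dict) -> None:
        # Appraisal: record the intensities of potential emotions
        # (the OCC-based calculation is assumed to happen upstream).
        now = time.time()
        for emotion, intensity in intensities.items():
            self.active[emotion] = [intensity, now]

    def decay(self) -> None:
        # Decay: emotions lose intensity over time rather than vanishing at once.
        now = time.time()
        for emotion, (intensity, last) in list(self.active.items()):
            remaining = intensity - self.decay_rate * (now - last)
            if remaining <= 0:
                del self.active[emotion]
            else:
                self.active[emotion] = [remaining, now]

    def dominant_emotion(self):
        # Selection: pick the strongest emotion above its threshold; the agent's
        # mood lowers the threshold of mood-congruent emotions (assumed effect).
        self.decay()
        best, best_intensity = None, 0.0
        for emotion, (intensity, _) in self.active.items():
            threshold = DEFAULT_THRESHOLDS[emotion]
            if (self.mood > 0) == (emotion in POSITIVE):
                threshold -= 0.1 * abs(self.mood)
            if intensity > threshold and intensity > best_intensity:
                best, best_intensity = emotion, intensity
        return best

For example, after module.appraise({"fear": 0.6, "hope": 0.4}) a call to module.dominant_emotion() returns "fear", the strongest active emotion above its threshold, until its intensity decays.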
In the group decision simulation, the participant agents will exchange arguments in order to
achieve a consensual solution. The selection of arguments to be sent and the evaluation of
received arguments will take into account the agent's internal emotional state and the moods of
other agents, as well as other characteristics that compose the agent's profile: debts of gratitude,
agents whom the participant agent trusts, agents that the participant agent believes consider it
credible, friendly agents and enemy agents.
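
To make the role of this profile concrete, the sketch below scores a received argument by combining its base strength with simple relational factors (trust, friendship, enmity, debts of gratitude) and the receiver's mood. The profile fields come from the description above, but the numeric weights and the scoring formula are purely hypothetical; the paper does not specify how these factors are combined.

from dataclasses import dataclass, field

@dataclass
class AgentProfile:
    # Relational information a participant agent keeps about the other agents.
    trusted: set = field(default_factory=set)        # agents this agent trusts
    credible_to: set = field(default_factory=set)    # agents believed to see this agent as credible
    friends: set = field(default_factory=set)
    enemies: set = field(default_factory=set)
    gratitude_debts: dict = field(default_factory=dict)  # agent id -> number of owed favours

def evaluate_argument(sender: str, base_strength: float,
                      profile: AgentProfile, mood: float) -> float:
    # Hypothetical evaluation: relational factors and the receiver's mood
    # scale the argument's base strength (all weights are illustrative).
    weight = 1.0
    if sender in profile.trusted:
        weight += 0.3
    if sender in profile.friends:
        weight += 0.2
    if sender in profile.enemies:
        weight -= 0.3
    weight += 0.1 * profile.gratitude_debts.get(sender, 0)
    weight *= 1.0 + 0.1 * mood   # a positive mood makes the agent slightly more receptive
    return max(0.0, base_strength * weight)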
In the case of group decision-making simulation, most of the concerns associated with the use
of agents (referred to in the previous section) do not apply because, as mentioned, it is a simulation.
Delegation and trust are still open questions: an agent could delegate its decision power to
another agent of the group, and an agent could change its preferences just because an agent
whom it trusts asks it to. We think that in real group decision making, if one of the members is
"replaced" by an emotional agent, all the concerns referred to above become pertinent.

Conclusion
In our opinion the use of emotional agents to simulate the role of humans in group decision-making
processes will lead to more believable behaviour and consequently to better simulations.
Our multi-agent simulation of group decision problems will not consider the influence
of personality on the way agents feel emotions. Some authors apply the personality factor to
change the emotion activation threshold, to select the dominant emotion, etc. Also not considered
is the process of emotional inhibition, which means that the fact that an agent is feeling emotion
X will inhibit it from feeling emotion Y (e.g. fear inhibits happiness).
In classical decision methods, proposals are evaluated according to normative models such as
multi-attribute theory. Sometimes group decision making is the result of the mathematical
aggregation of the group members' preferences. We argue that the inclusion of argumentation and
emotional aspects in group decision-making simulation will enhance the quality of decisions.
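
For contrast, the classical baseline just mentioned can be sketched very compactly: under an additive multi-attribute model a proposal's value is a weighted sum of normalised attribute scores, and a group choice can be obtained by averaging the members' individual evaluations. The attribute names, weights and scores below reuse the car example from the introduction and are invented purely for illustration.

# Additive multi-attribute evaluation of one car proposal (illustrative data).
weights = {"price": 0.4, "technical": 0.3, "design": 0.2, "manufacturer": 0.1}

# Normalised scores in [0, 1] given by two group members for the same proposal.
member_scores = [
    {"price": 0.7, "technical": 0.9, "design": 0.5, "manufacturer": 0.6},
    {"price": 0.4, "technical": 0.8, "design": 0.9, "manufacturer": 0.7},
]

def value(scores: dict) -> float:
    # Weighted additive value of one member's evaluation.
    return sum(weights[a] * s for a, s in scores.items())

# Mathematical aggregation: simple average of the members' individual values.
group_value = sum(value(s) for s in member_scores) / len(member_scores)
print(f"group value = {group_value:.2f}")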

References
Bates, J. (1994); The role of emotion in believable agents; Communications of the ACM, vol. 37, No.7
(pp.122–125).
Bazzan, A.; Adamatti, D. and Bordini, R. (2002); Extending the Computational Study of Social Norms
with the Use of a Systematic Model of Emotions; Advances in Artificial Intelligence: Proceedings
(LNAI). Berlin: Springer-Verlag (pp. 108-117).
Botelho, LM and Coelho, H. (2001); Machinery for artificial emotions; Cybernetics and Systems, Vol 32,
No. 5 (pp.465-506).
Cañamero, D. (1997); Modelling Motivations and Emotions as a Basis for Intelligent Behavior;
Proceedings of the First International Symposium on Autonomous Agents.
Damásio, A. (1994); O erro de Descartes: emoção, razão e cérebro humano; Publicações Europa América.
Damásio, A. (2000); O Sentimento de Si - O Corpo, a Emoção e a Neurobiologia; Publicações Europa-
América.
Elliott, C. (1992); The Affective Reasoner: A Process Model of Emotions in a Multi-agent System; Ph.D.
Dissertation, Northwestern University.
Goleman, D. (1995); Emotional Intelligence; New York: Bantam Books.
Hatfield, E., Cacioppo, J. and Rapson, R. (1992); Primitive emotional contagion; Review of Personality
and Social Psychology: Emotion and Social Behavior, Vol 14 (pp 151-177).
Karacapilidis, N. and Papadias, D. (2001); Computer supported argumentation and collaborative decision
making: The Hermes system; Information Systems, Vol. 26 No. 4 (pp. 259-277).
LeDoux, J. (1996); The emotional brain; Simon & Schuster: New York (p. 24).
Marreiros, G.; C. Ramos and J. Neves (2005a); Modelling group decision meeting participants with an
Agent-based approach; Selected for publication in an upcoming special issue of the International
Journal of Engineering Intelligent Systems.
Marreiros, G., Santos R.; Ramos C. and J. Neves (2005b); Agent Based Simulation for Group Formation;
SCS-ESM2005 19th European Simulation Multiconference, Riga, Latvia.
Marreiros, G.; Sousa J.P. and Ramos C. (2004). WebMeeting - a group decision support system for multi-
criteria decision problems. International Conference on Knowledge Engineering and Decision Support
ICKEDS04 (pp. 63-70)
Neumann, R. and Strack, F. (2000); Mood contagion: The automatic transfer of mood between persons;
Journal of Personality and Social Psychology, Vol 79 (pp. 211-223).
Ortony, A. (2003); On making believable emotional agents believable; In Trappl, R.; Petta, P. and
Payr, S. (eds), Emotions in Humans and Artefacts. Cambridge: MIT Press.
Ortony, A.; Clore, GL; Collins, A. (1988); The cognitive structure of emotions; Cambridge: Cambridge
University Press.
Picard, R. (1997); Affective Computing; MIT Press, Cambridge, MA.
Picard, R. (2003); What does it mean for a computer to have emotions?; In Trappl, R.; Petta, P. and
Payr, S. (eds), Emotions in Humans and Artefacts. Cambridge: MIT Press.
Simon, H. (1967); Motivational and emotional controls of cognition; Psychological Review, Vol. 74.
Velásquez, J. (1998); Modeling Emotion-Based Decision-Making; In: Proceedings of the 1998 AAAI Fall
Symposium Emotional and Intelligent: The Tangled Knot of Emotion.
Wooldridge, M. and Jennings, N.R. (1995); Intelligent agents: Theory and practice; Knowledge
Engineering Review, Vol. 10, No. 2 (pp. 115-152).
