
WOLKITE UNIVERSITY

COLLEGE OF COMPUTING AND INFORMATICS


DEPARTMENT OF SOFTWARE ENGINEERING
HUMAN AND COMPUTER INTERACTION
ASSIGNMENT: INTELLIGENT USER INTERFACE

Group 6
Name                    ID
2. Fisseha Abebe        194/07
3. Kaleabe Yalewdeg     274/07
4. Moges Mokonen        339/07
6. Hermela Manchalew    252/07

Contents
1. Introduction
2. Some IUI Objectives
3. History
3.1. 1945–1968: Batch interface
3.2. 1969–present: Command-line user interface
3.3. 1985: SAA User Interface or Text-Based User Interface
4. Central issues of intelligent user interface design
4.1. Systems where the intelligence lies mainly behind the user interfaces
5. Some Major IUI Challenges

1. Introduction
The increasing complexity of today's applications, e.g. the growing number of
available options, often leads to decreased usability of the user interface.
This effect can be countered with intelligent user interfaces (IUIs) that
support the user in performing her tasks by facilitating the interaction as
much as possible. IUIs facilitate information retrieval by suggesting
relevant information, or they support system use, e.g. by providing
explanations, performing tasks for the user, or adapting the interface. In
this paper, we focus on user-adaptive IUIs, which are able to adapt their
behavior to individual users. In our opinion this is a key feature of IUIs, as
the support an IUI should provide heavily depends on the needs and
preferences of each user. We present here the results of a literature
survey of the main design challenges that have to be faced when designing
a user-adaptive IUI and list existing approaches for coping with these
challenges. We thereby focus on issues relevant to human-computer
interaction rather than on the underlying algorithms.

Within human-computer interaction (HCI), IUIs focus not only on enabling the
user to perform intelligent actions, but also on ways to incorporate knowledge
so as to assist the user in performing actions. In contrast to traditional
research in artificial intelligence (AI), IUIs do not aim to make the
computer smart by itself, but to make the interaction between computer
and human smarter. The goal of IUIs is to make both the interaction itself
and the presentation of information more effective and efficient, in order to
better support the user's current needs. The way to achieve this ranges from
supporting a more natural interaction, e.g. by allowing multimodal or
natural language input, to intelligent tutoring systems and recommender
systems. Based on the definition by Maybury and Wahlster [1998], we
define IUIs as follows:

Intelligent User Interfaces are human-machine interfaces that aim to
improve the efficiency, effectiveness, and naturalness of human-machine
interaction by representing, reasoning, and acting on models of the user,
domain, task, discourse, context, and device. The explosion of available
materials on corporate, national, and global information networks is
driving the need for more effective, efficient, and natural interfaces to
support access to information, applications, and people. This is
exacerbated by the increasing complexity of systems, the shrinking of task
time lines, and the need to reduce the cost of application and interface
development. Fortunately, the basic infrastructure for advanced
multimedia user interfaces is rapidly appearing or already available.

As with traditional interfaces, principled intelligent interfaces should be
learnable, usable, and transparent. In contrast, however, intelligent user
interfaces promise to provide additional benefits to users that can enhance
interaction, such as:

- Comprehension of possibly imprecise, ambiguous, and/or partial multimodal input
- Generation of coordinated, cohesive, and coherent multimodal presentations
- Semi- or fully automated completion of delegated tasks
- Management of the interaction (e.g., task completion, tailoring interaction styles, adapting the interface) by representing, reasoning, and exploiting models of the user, domain, task, and context

2. Some IUI Objectives

- Increase productivity
- Decrease expenditures
- Improve the efficiency, effectiveness, and naturalness of interaction

How? For example, by using knowledge representation, reasoning, machine
learning (ML), and adaptation.
Example: Email Filter / Email Response System / Dialogue System (a minimal
email-filter sketch follows below).
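
To make the email-filter example concrete, the following is a minimal sketch of
a filter that adapts to a user's own labels by training a naive Bayes text
classifier. The messages and labels are hypothetical placeholders, and
scikit-learn is assumed to be available.

```python
# Minimal sketch of a learning email filter (hypothetical data).
# Assumes scikit-learn is installed: pip install scikit-learn
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training data: a few messages labeled by the user.
messages = [
    "Meeting moved to 3pm, please confirm",
    "Win a free prize, click here now",
    "Project report attached for review",
    "Cheap loans, limited offer, act fast",
]
labels = ["keep", "spam", "keep", "spam"]

# Represent each message as a bag-of-words vector.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)

# Train a naive Bayes classifier on the user's own labels,
# so the filter adapts to this particular user's mail.
classifier = MultinomialNB()
classifier.fit(X, labels)

# Classify a new, unseen message.
new_message = ["Claim your free prize today"]
prediction = classifier.predict(vectorizer.transform(new_message))
print(prediction[0])  # expected: "spam"
```

Whenever the user corrects a misclassified message, the model can simply be
retrained on the enlarged set of labels; this feedback loop is what makes the
filter adaptive rather than a fixed rule set.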

3. History

Before 1960, the term user interface was practically non-existent. Computer
scientists focused on making calculations, and the presentation of results
hardly received any attention. But then things started to change.
In 1963 Ivan Sutherland published his MIT PhD thesis about a system called
Sketchpad [Sutherland 1963]. Sketchpad made it possible to create graphic
images directly on a computer screen using a light pen and was the first
Graphical User Interface (GUI) and Direct Manipulation (DM) system.
Around the same time, Douglas Engelbart and William English were
working on their "Augmenting Human Intellect" project [Engelbart and
English 1968]. As part of this project they developed a new control device,
which was the first mouse. In December 1968 Engelbart presented the NLS
system, a revolutionary system that was the first to incorporate many
things such as hypertext, multiple overlapping windows, elaborate
document control, and on-screen video teleconferencing. The new
technologies in the NLS system were really astonishing and
groundbreaking at that time. In fact, the NLS system was so far ahead of its
time that many people attending Engelbart's first demonstration of the
system did not believe it was real. In the 1970s the technology-driven focus
on interfaces that was present in the 60s slowly changed. Instead, the user
started to become the focus of attention, and in the 1980s the new field of
Human-Computer Interaction (HCI) had turned into a user-centered research
field with usability as its main goal and technology as a supporting tool.
Around 1981 the DM interfaces, developed by Sutherland and later by
Xerox PARC and Apple, were finally incorporated into commercial software
programs.

The WIMP (Windows, Icons, Menus, Pointers) model became widespread and proved
to be very successful, thus becoming a guiding principle for all
interfaces. Although during the late 1980s and 1990s DM interfaces were
enhanced with things like embedded context-menus, new types of mice,
joysticks and other controls, the basic technology has not changed much
since its introduction. A problem is that new technologies such as data

mining, machine learning, speech recognition, and computer vision are
difficult to use with the existing interfaces. As a reaction the field of IUIs
slowly started to take form. Around 1994, intelligent agents and
recommender systems appeared on the Internet. In 1996, the first practical
speech recognition and natural language applications appeared. Then in
1997, Microsoft released its intelligent Office Assistant help system.
However, since then progress on IUIs for the commercial market seems to
have come to a stop and only a few IUIs have appeared in recent years.

3.1. 1945–1968: Batch interface


In the batch era, computing power was extremely scarce and expensive. User
interfaces were rudimentary. Users had to accommodate computers rather than
the other way around; user interfaces were considered overhead, and software
was designed to keep the processor at maximum utilization with as little
overhead as possible.
The input side of the user interface for batch machines was mainly punched
cards or equivalent media like paper tape. The output side added line printers to
these media. With the limited exception of the system operator's console, human
beings did not interact with batch machines in real time at all.
Submitting a job to a batch machine involved, first, preparing a deck of punched
cards describing a program and a dataset. Punching the program cards wasn't
done on the computer itself, but on keypunches, specialized typewriter-like
machines that were notoriously balky, unforgiving, and prone to mechanical
failure. The software interface was similarly unforgiving, with very strict syntaxes
meant to be parsed by the smallest possible compilers and interpreters.
Once the cards were punched, one would drop them in a job queue and wait.
Eventually, operators would feed the deck to the computer, perhaps
mounting magnetic tapes to supply another dataset or helper software. The job
would generate a printout, containing final results or (all too often) an abort
notice with an attached error log. Successful runs might also write a result on
magnetic tape or generate some data cards to be used in later computation.

3.2. 1969–present: Command-line user interface

Command-line interfaces (CLIs) evolved from batch monitors connected to the
system console. Their interaction model was a series of request-response
transactions, with requests expressed as textual commands in a specialized
vocabulary. Latency was far lower than for batch systems, dropping from days or
hours to seconds. Accordingly, command-line systems allowed the user to change
his or her mind about later stages of the transaction in response to real-time or
near-real-time feedback on earlier results. Software could be exploratory and
interactive in ways not possible before. But these interfaces still placed a
relatively heavy mnemonic load on the user, requiring a serious investment of
effort and learning time to master.
The earliest command-line systems combined teleprinters with computers,
adapting a mature technology that had proven effective for mediating the
transfer of information over wires between human beings. Teleprinters had
originally been invented as devices for automatic telegraph transmission and
reception; they had a history going back to 1902 and had already become well-
established in newsrooms and elsewhere by 1920. In reusing them, economy was
certainly a consideration, but psychology and the Rule of Least Surprise mattered
as well; teleprinters provided a point of interface with the system that was
familiar to many engineers and users.

3.3. 1985: SAA User Interface or Text-Based User Interface

In 1985, with the advent of Microsoft Windows and other graphical user
interfaces, IBM created what is called the Systems Application Architecture
(SAA) standard, which includes the Common User Access (CUA) derivative. CUA
successfully created what we know and use today in Windows, and most of the
more recent DOS or Windows console applications follow that standard as well.
It defined that a pull-down menu system should be at the top of the screen and
a status bar at the bottom, and that shortcut keys should stay the same for
all common functionality (F2 to Open, for example, would work in all
applications that followed the SAA standard). This greatly increased the speed
at which users could learn an application, so it caught on quickly and became
an industry standard.

4. Central issues of intelligent user interface design


IUIs deal with different forms of input and output and try to help the user in
an intelligent fashion. They try to solve some of the problems that current
direct-manipulation interfaces cannot, such as:
Creating personalized systems:
No two persons are the same; people have different habits, preferences,
working methods, and environments. An intelligent
interface that takes these differences into account can provide a
personalized method of interaction. The interface knows the user and can
use that knowledge in its communication with the user.

Information overflow or filtering problems:


Finding the right information on your computer or on the Internet can be
like looking for a needle in a haystack. Intelligent interfaces can reduce the
information overload that often results from searching for information in
large databases or complex systems. By filtering out irrelevant information,
the interface can reduce the cognitive load on the user. In addition, the IUI
can propose new and useful information sources not known to the user (a
minimal filtering sketch follows after this list).

Providing help on using new and complex programs:


Computer systems can be very complicated to work with when you first
start to use them. As you struggle to get to know and understand a new
program, new software versions or updates may appear that include new
functionality. Many computer users fail to keep up with these
developments. Intelligent help systems can detect and correct user
misconceptions, explain new concepts, and provide information to simplify
tasks.

Taking over tasks from the user:


An IUI can also look at what you are doing, understand and recognize your
intent, and take over some of your tasks completely, allowing you to focus
on other things.

Other forms of interaction:
Currently, the most common interaction devices are the keyboard and the
mouse. IUI research looks at other forms of human-computer interaction
(e.g. speech or gestures). By providing multiple forms of interaction, people
with disabilities can use computers more easily.
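
To make the filtering idea above concrete, here is a minimal sketch of a
content-based information filter: it ranks incoming documents against a
profile of the user's interests using TF-IDF vectors and cosine similarity.
The documents, the profile terms, and the relevance threshold are hypothetical
placeholders, and scikit-learn is assumed to be available.

```python
# Minimal sketch of a content-based information filter (hypothetical data).
# Assumes scikit-learn is installed: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical incoming documents the user could be shown.
documents = [
    "New results in speech recognition for dialogue systems",
    "Stock market closes higher after quiet trading day",
    "Adaptive user interfaces reduce cognitive load in complex tasks",
    "Local weather forecast: rain expected through the weekend",
]

# A simple user profile: terms describing the user's interests.
profile = "intelligent user interfaces adaptation speech interaction"

# Vectorize the profile and the documents in the same TF-IDF space.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform([profile] + documents)
profile_vec, doc_vecs = matrix[0], matrix[1:]

# Score each document by similarity to the profile and keep the relevant ones.
scores = cosine_similarity(profile_vec, doc_vecs).ravel()
for score, doc in sorted(zip(scores, documents), reverse=True):
    if score > 0.05:  # hypothetical relevance threshold
        print(f"{score:.2f}  {doc}")
```

A user-adaptive version would update the profile from the user's observed
reading behavior instead of relying on a fixed keyword list.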
IUI research is a subfield of Human-Computer Interaction (HCI). To make things
a bit complicated, in the literature the term "intelligent user interface" is
used to denote both a particular type of interface and the research field.
Other often-mentioned synonyms are adaptive interfaces, multimodal interfaces,
and intelligent interface technology. The first two are actually subtypes of
intelligent interfaces, whereas the latter is used as a synonym for IUIs as a
research field. The main problem in defining the term "intelligent user
interface" lies in the word "intelligent". For decades, researchers have tried
to define intelligence. Back in the 1950s, Alan Turing already came up with a
proposal to define intelligence using what we now call the Turing Test
[Turing 1950], but the debate is still not settled. Over the years numerous
definitions of intelligence have been devised.
Most definitions mention the ability to adapt (learn and deal with new
situations), the ability to communicate, and the ability to solve problems. A
normal user interface is defined as a method of communication between a human
user and a machine. If we extend this definition, we can say that an intelligent user
interface uses some kind of intelligent technology to achieve this human-machine
communication. In other words, IUIs are interfaces with the ability to adapt to the
user, communicate with the user, and solve problems for the user.

Intelligent user interfaces specifically aim to enhance the flexibility,
usability, and power of human-computer interaction for all users. In doing so,
they exploit
knowledge of users, tasks, tools, and content, as well as devices for supporting
interaction within differing contexts of use. [Maybury 2001]

Adaptation and problem solving are important topics addressed by research on
artificial intelligence (AI), and therefore many IUIs draw heavily on the techniques
developed in AI research. However, not all intelligent user interfaces have learning
or problem solving capabilities. Many interfaces that we call intelligent focus on the
communication channels between the user and machine. These interfaces often
apply new interaction techniques such as speech processing, gaze tracking or facial
recognition. Other research fields have also influenced IUIs. Some examples are:
psychology, ergonomics, human factors, cognitive science and social sciences.
Most techniques used to realize intelligent systems have their origins in AI,
though in many cases a subfield has formed around a given type of technique
and is no longer primarily associated with AI. The canonical intelligent
system includes a wide variety of capabilities, including sensing and
perception, knowledge representation and reasoning, learning, creativity,
planning, autonomous motion and manipulation, natural language processing,
and social interaction. The intelligence of a system can be located in
different places:

- The intelligent processing is found in the user interface(s) of the system, and its purpose is to enable an effective, natural, or otherwise appropriate interaction of users with the system. For example, the system may support human-like communication methods such as speech or gesture, or it may adapt its style of interaction to individual users.
- The intelligent processing is found in the back end of the system, and its primary purpose is to serve some beneficial function, such as performing actions partly autonomously on behalf of the users; the relevance of the system's intelligence to interaction with users can then vary.
- The intelligent processing is used not directly in the system itself but in the process of designing, implementing, and/or testing the system. Hence, the system that the users interact with may not itself be an intelligent system.

Typical examples of systems whose intelligence lies in the user interface include:

- Systems with adaptive user interfaces that are automatically adapted to the inferred capabilities or needs of the user (a minimal sketch follows after this list).
- Multimodal systems that aim to enable more natural, human-like forms of input and output.
- Systems with human-like virtual characters that enable the user to interact with a system in a way that is partly similar to human-human interaction.
- Smart environments in which embedded objects interact intelligently with their users.
- Personalized websites, in which the displayed content is adapted to the inferred interests of the user.
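
As a minimal illustration of an adaptive user interface, the sketch below
reorders a menu so that the commands a given user invokes most often appear
first. This is only one simple adaptation strategy; the command names and the
usage log are hypothetical placeholders.

```python
# Minimal sketch of an adaptive menu (hypothetical commands and usage log).
from collections import Counter

# Default menu order, as a designer might ship it.
menu = ["Open", "Save", "Print", "Export", "Share", "Settings"]

# Hypothetical log of the commands this user has invoked so far.
usage_log = ["Save", "Save", "Export", "Save", "Open", "Export", "Export"]

def adapted_menu(menu, usage_log):
    """Return the menu reordered by this user's observed usage.

    Frequently used commands move to the front; unused commands keep
    their original relative order, so the layout stays predictable.
    """
    counts = Counter(usage_log)
    # Sort by descending usage count; ties fall back to the default order.
    return sorted(menu, key=lambda cmd: (-counts[cmd], menu.index(cmd)))

print(adapted_menu(menu, usage_log))
# ['Save', 'Export', 'Open', 'Print', 'Share', 'Settings']
```

A real adaptive interface would also have to keep the layout stable enough
that the user does not lose orientation, which is part of the trust problem
discussed in Section 5.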

We need computer interfaces that can understand and help people and explain to
them how to use the available functions. Many computer users are experiencing
problems, and most of these problems are related to the interface: confusing
menu choices, incomprehensible error messages, unnatural and rigid
interaction, etc.
Especially beginners, the elderly, or people with disabilities are having trouble, but
experienced computer users often bump into problems as well. We need to make
sure that computers, and other computerized devices for that matter, remain
accessible for everyone.
If we look at the way we interact with computers now, a lot has
changed compared to twenty years ago. Instead of the constrained
input we used then, interfaces have become much more flexible.
Modern-day interfaces try to be intuitive by using the desktop
metaphor, which consists of multiple windows showing folders and
documents (files). However, most modern-day interfaces are very
limited in handling the differences between users and lack
personalization. Intelligent user interfaces (IUIs) form a subfield of
Human-Computer Interaction research.

The goal of IUIs is to improve human-computer interaction by using
smart and new technology. This interaction is not limited to a
computer, but can also be applied to improve the interface of other
computerized machines, for example the television, refrigerator, or
mobile phone. Using techniques from artificial intelligence, IUIs deal
with different forms of input and output and try to help the user in an
intelligent fashion.

4.1. Systems where the intelligence lies mainly behind the user interfaces
- Recommender systems, which present products, documents, or other items that are expected to be of interest to the current user (a minimal sketch follows after this list).
- Systems that employ intelligent technology to support information retrieval.
- Learning environments that offer learning assistance on the basis of assessments of each learner's capabilities and needs.
- Interface agents that perform complex or repetitive tasks with some guidance from the user.
- Situated assistance systems that monitor and support a user's daily activities.
- Systems for capturing knowledge from domain experts who are not knowledge engineers.
- Games that make use of AI technology to create the opponents against which the human players play.
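
As an illustration of the first item above, here is a minimal sketch of
user-based collaborative filtering, one common recommender technique: it finds
the user most similar to the target user (cosine similarity over a ratings
matrix) and suggests items that this neighbor rated highly. The users, items,
and ratings are hypothetical placeholders, and NumPy is assumed to be available.

```python
# Minimal sketch of user-based collaborative filtering (hypothetical ratings).
# Assumes NumPy is installed: pip install numpy
import numpy as np

items = ["Book A", "Book B", "Book C", "Book D"]

# Rows are users, columns are items; 0 means "not yet rated".
ratings = np.array([
    [5, 4, 0, 1],   # user 0 (target)
    [4, 5, 5, 0],   # user 1
    [1, 0, 5, 4],   # user 2
])

def recommend(target, ratings, items):
    """Suggest unrated items that the target user's nearest neighbor liked."""
    # Cosine similarity between the target user and every other user.
    norms = np.linalg.norm(ratings, axis=1) * np.linalg.norm(ratings[target])
    sims = ratings @ ratings[target] / norms
    sims[target] = -1.0  # exclude the target user themselves

    neighbor = int(np.argmax(sims))
    # Recommend items the target has not rated but the neighbor rated highly.
    return [items[i] for i in range(len(items))
            if ratings[target, i] == 0 and ratings[neighbor, i] >= 4]

print(recommend(0, ratings, items))  # nearest neighbor is user 1 -> ['Book C']
```

A deployed system would aggregate over many neighbors and weight their votes
by similarity, but the nearest-neighbor version already shows the core idea.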

5. Some Major IUI Challenges


The main goal in developing IUIs is that they should be usable, useful, and
trustable [Myers, 2007]. This aligns with the main challenges identified by
Maes [1994]: Presentation, Competence, and Trust. Presentation is concerned
with the human-computer interaction part of IUIs, whereas Competence focuses
on the artificial intelligence techniques that can be applied. The development
of IUIs has to take special care of Trust, as the user is not willing to
delegate tasks to an IUI she does not trust, thus rendering the IUI useless.
However, for IUIs it is much more challenging to earn the user's trust than
for traditional user interfaces, because IUIs apply artificial intelligence
techniques whose results often cannot be directly foreseen by the user, which
reduces the user's feeling of being in control of the system. In the
following, we point out for each of these issues which challenges have to be
faced when developing user-adaptive IUIs and describe possible ways to cope
with them. An overview of the identified challenges is given below. The
challenges are not disjoint and are heavily interrelated; they are meant to
highlight some of the focus points for developing user-adaptive IUIs.
Further, this list is not meant to be complete, and not all challenges have to
be faced in each IUI; e.g. collaborative filtering systems usually do not have
to cope with the problem of scarce usage data.

- Mixed-initiative dialogue (will be discussed)
- Modeling what users want
- Eliciting what users want
- Not knowing the true world state (partial observability) and acting
- Planning and reasoning ahead
- Continually learning model parameters or whole models (never-ending learning)
- Speech understanding and activity recognition

