
Implicit Human Computer Interaction Through Context

Albrecht Schmidt
Telecooperation Office (TecO), University of Karlsruhe
Vincenz-Prießnitz-Str. 1, 76131 Karlsruhe
Germany
albrecht@teco.edu
www.teco.edu

Abstract

In this paper the term implicit human computer interaction is defined. It is discussed how the availability of processing power and advanced sensing technology can enable a shift in HCI from explicit interaction, such as direct manipulation GUIs, towards a more implicit interaction based on situational context. In the paper, an algorithm is given that is based on a number of questions to identify applications that can facilitate implicit interaction. An XML-based language to describe implicit HCI is proposed. The language uses contextual variables that can be grouped using different types of semantics, as well as actions that are called by triggers. The term perception is discussed and four basic approaches are identified that are useful when building context-aware applications. Providing two examples, a wearable context-awareness component and a sensor board, it is shown how sensor-based perception can be implemented. It is also discussed how situational context can be exploited to improve the input and output of mobile devices.

Keywords: context awareness, context sensing, implicit human computer interaction, perception, ubiquitous computing.

1 Introduction

The way people interact with devices is vital for their success. Looking at HCI it is apparent that interaction techniques are limited by the available technology. Furthermore, the anticipated user groups influence the interaction metaphors to a large extent. This was observable in the shift from punch cards to interactive text terminals, and again in the shift from command line interfaces to graphical user interfaces (GUIs).

Bearing in mind current and upcoming technologies, such as increased processing power (even on mobile devices) and the availability of sensors (ranging from simple temperature sensors to cameras), the resulting perceptional capabilities, and the fact that the main user group of current computing devices (e.g. mobile phones, PDAs, etc.) are non-experts, we may observe yet another shift in HCI. Devices that have perceptional capabilities (even if they are very limited) will start the shift from explicit HCI towards a more implicit interaction with machines.

A vision of future devices

We will be able to create (mobile) devices that can see, hear and feel. Based on their perception, these devices will be able to act and react according to the situational context in which they are used.

In this paper it will be shown that this vision is not as far ahead as it seems. In our research we start with the perception of simple concepts and with their exploitation. Providing a number of examples and demonstrators, it is discussed how basic perception could enable a shift from explicit towards implicit HCI.
2 Implicit Interaction

Observing communication between humans, we can see that a lot of information is exchanged only implicitly. The way people interact with each other, and the situation in which they interact, carry information that is often exploited implicitly in the exchange of messages. In a conversation, the behavior of the participants as well as what happens in the surrounding environment supplies valuable information that is often vital for the understanding of messages. In many cases the robustness of human-to-human communication is based on implicitly introduced contextual information, such as gestures, body language, and voice. Another example is the redundancy between body language (e.g. nodding) and spoken language (e.g. the word 'yes'). This implicitly introduced knowledge is also used to disambiguate information: in a discussion with a student, pointing at a computer, the term 'sun' has a different meaning than the same term used on the beach together with friends; a more in-depth discussion is given in [11].

2.1 Implicit vs. Explicit Human Computer Interaction

With current computer technology, interaction is explicit: the user tells the computer at a certain level of abstraction (e.g. by command line, direct manipulation using a GUI, gesture, or speech input) what she expects the computer to do. This is considered explicit interaction.

Definition: Implicit Human Computer Interaction
Implicit human computer interaction is an action, performed by the user, that is not primarily aimed at interacting with a computerized system but which such a system understands as input.

The action of a user is always performed in a certain environment. Implicit interaction is based on the assumption that the computer has a certain understanding of our behavior in the given situation. This knowledge is then considered as an additional input to the computer while doing a task.

A simple example is the garbage bin [14] that scans the bar codes of products thrown away and uses the information to suggest a shopping list. The action performed by the user (e.g. throwing an empty can into the bin) is the same as with any other garbage bin. The recognition by the system (scanning the bar code) and the built-in interpretation of the system (all things that go into the bin may be on the next shopping list again) make use of the action performed by the user. The user herself does not explicitly interact with the computer; thus the process describes an implicit interaction. As we see from the example, implicit interaction is based on two main concepts:

• perception
• interpretation.

For most applications, implicit interaction will be used in addition to explicit interaction.

There are other systems that also implement the idea of implicit interaction on a rudimentary level, e.g. automatic light control (switching on the light when it is dark and someone walks by) and active badge systems (automatically opening a door when someone with appropriate permission wants to enter the building). In current computer systems we can observe that agent technology is used to build systems that have a certain ability to act proactively. These approaches are mainly based on user profiles and usage information [13]. In these cases perception is limited to information gathered in the virtual space.

If we look at the concepts that are needed to facilitate implicit interaction, three basic building blocks can be identified:

1. the ability to perceive the user, the environment, and the circumstances,
2. mechanisms to understand what the sensors see, hear and feel, and
3. applications that can make use of this information.

On a conceptual level, (1) and (2) can be described as situational context, and (3) are applications that are context enabled. In the next section context is discussed in more detail.

2.2 What is Context

The notion of context is used in many different ways. In our work we propose to regard situational context, such as location, surrounding environment or state of the device, as implicit input to the system. We use the term situational context to describe implicit interaction fragments. This extends the concept of context beyond the informational context into real world environments.

The word context in general use has a multitude of meanings. Even within the field of computer science, different disciplines, such as artificial intelligence, natural language processing, image recognition, and more recently mobile computing, have their very own understanding of what context is. In our work we found that very general descriptions of context as given by a dictionary, and also synonyms found in a thesaurus, come very close to our understanding. To illustrate this we provide the following definitions:

Context n. 1: discourse that surrounds a language unit and helps to determine its interpretation [syn: linguistic context, context of use] 2: the set of facts or circumstances that surround a situation or event; "the historical context" (Source: WordNet 1.6)

Context: That which surrounds, and gives meaning to, something else. (Source: The Free On-line Dictionary of Computing)

Synonyms for context: circumstance, situation, phase, position, posture, attitude, place, point; terms; regime; footing, standing, status, occasion, surroundings, environment, location, dependence. (Source: www.thesaurus.com)

To build applications that have knowledge of their situational context it is important to gain an understanding of what context is. Current research in context awareness in mobile computing shows a strong focus on location [1], [12]. Location is a concept that is well understood, and the benefit of location awareness is clear: at certain locations, particular services are more important than others. An architectural approach based on a smart environment is described by Schilit et al. [17]. Other scenarios use RF and GPS to determine the user's location, e.g. [4], [15]. But, as pointed out in [20], context is more than location. Considering mobile computing, we use the term context in a more general way, as also suggested by [2], to describe the environment, situation, state, surroundings, task, and so on. A wider view of context is also given by [19], who suggest that the way a device is used (mobile phone in the user's hand, on the table, in a pocket, etc.) should be treated as context.

2.3 Applications in Context

Analyzing the way people use ultra-mobile devices (e.g. personal digital assistants, smart mobile phones, handheld and wearable computers), it becomes apparent that the periods of interaction are much shorter than in traditional mobile settings. Notebooks – considered as mobile computers – are mainly used in stationary settings, e.g. one takes a notebook to a meeting to take notes, or a salesman takes a mobile computer to a customer for a presentation. In general, in these scenarios the application is used in a stationary setting for between several minutes and hours. With ultra-mobile devices, in contrast, interaction periods are often much shorter: looking up an address takes only a few seconds, and making a note on a PDA is often in the range of several seconds up to some minutes. Also, the fact that the applications are mainly used while doing something else, or to carry out a certain task (like tools in the real world), calls for a reduction of the explicit human-machine interaction and creates the need to shift towards implicit HCI.

Knowledge about the situational context is of primary interest to the application, because we consider that the application will adapt to the context.

It can be observed that an application (mobile or stationary alike) is:
(a) running on a specific device (e.g. input system, screen size, network access, portability, etc.),
(b) at a certain time (absolute time, e.g. 9:34 p.m., or class of time, e.g. in the morning),
(c) used by one or more users (concurrently or sequentially),
(d) in a certain physical environment (absolute location, type of location, conditions such as light, audio, and temperature, infrastructure, etc.),
(e) in a social setting (people co-located and social role), and
(f) to solve a particular task (single task, group of tasks, or a general goal).

We consider the items (a) to (f) the basic building blocks of context. For mobile applications, especially (d) and (e) are of major interest. In mobile settings the physical environment can change while an application is executed, e.g. making a phone call while walking from the office desk to the car park: the telephone application keeps running while the noise level changes between office and outside.
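To make the building blocks (a) to (f) more tangible, the following is a minimal sketch of how such a situational context could be represented as a data structure. It is illustrative only; all class and field names are assumptions and not part of the systems described in this paper.

# Illustrative sketch: one possible representation of the context
# building blocks (a)-(f); names and fields are assumptions, not the
# data model used in the paper.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class SituationalContext:
    # (a) device the application is running on
    device: str                      # e.g. "PalmPilot", screen size, input system
    # (b) time: absolute time plus a coarse class such as "morning"
    time: datetime = field(default_factory=datetime.now)
    time_class: str = "unspecified"
    # (c) one or more users, concurrently or sequentially
    users: List[str] = field(default_factory=list)
    # (d) physical environment: location, light, audio, temperature, ...
    location: str = "unknown"
    light_level: float = 0.0         # relative brightness
    noise_level: float = 0.0         # relative loudness
    temperature_c: float = 20.0
    # (e) social setting: people co-located and the social role
    people_colocated: int = 0
    social_role: str = "unspecified"
    # (f) the task the user is trying to solve
    task: str = "unspecified"

# Example: the phone-call scenario from the text, walking from the
# office desk to the car park while the noise level changes.
ctx = SituationalContext(device="mobile phone", users=["caller"],
                         location="car park", noise_level=0.8,
                         task="phone call")
print(ctx.location, ctx.noise_level)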

2.4 Identifying Implicit Human Computer Interaction

To identify applications that can be improved by implicit HCI, the input and output of the application and the real world environment in which it is executed have to be analyzed. Then ways to capture the situational context must be assessed. Furthermore, mechanisms for the interpretation of the situational context have to be found. Finally, the reaction of the application has to be defined. The following questions help to identify these points:

• What happens around an application while the application is in use? Are there any changes at all?
• Do the surroundings (behavior, environment, circumstances) carry any valuable information for the application? Does it matter to the application?
• Are there any means to capture and extract the information in a way that is acceptable for the application or device (processing cost, sensor cost, weight, etc.)?
• How can the information be understood? What interpretation and reasoning is possible and useful? What is an appropriate way for the application to react?

Putting all of this together, we can set up the algorithm in figure 1. The algorithm works as follows:

On 1: C is the set of surrounding conditions that carry information useful for the application. Each element Ci stands for one condition, e.g. location, temperature, current user, device orientation, etc. The set is created by asking what conditions change in the environment.

On 2: D is initialized – at the beginning no sensing devices are identified.

On 3: For each Ci, the accuracy Ai and the update rate Ui that are needed to make the measurements useful are defined. Then a sensing device Si that matches these requirements is identified. If the cost for the identified sensing device is acceptable, the vector describing the condition, the sensing device, the required accuracy and the update rate is added to the set D. For conditions that cannot be sensed, the cost is infinite.

On 4: If any conditions exist that are feasible to measure, then for each of these conditions one or more meaningful value ranges (temperature between 15°C and 25°C, location is inside my office, user is moving, etc.) are identified, and for these ranges the reaction of the application (switch to notepad, etc.) is defined.
1. create the set C
2. set D = {}
3. for each Ci ∈ C
       define Ai          // accuracy
       define Ui          // update rate
       identify Si        // a sensor device that is appropriate
       if cost(Si, Ai, Ui) is acceptable then
           D = D ∪ {(Ci, Si, Ai, Ui)}
       fi
   next
4. if D ≠ {} then
       for each vector Di in D
           define a set of application reactions Ri = {(Iij, Rij)}
           // Iij is an input range, Rij is the application reaction
   else
       // implicit interaction is not used
       // (either no conditions are useful or sensing is too costly)

Figure 1: Identifying Implicit HCI.
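To make the procedure in figure 1 concrete, a minimal executable sketch is given below. The cost function, the candidate sensor list, and the data structures are illustrative assumptions, not part of the original work.

# Sketch of the procedure in figure 1 (illustrative assumptions only):
# for each condition, find a sensor meeting accuracy/update requirements
# at acceptable cost, then attach input-range/reaction pairs.
def identify_implicit_hci(conditions, candidate_sensors, cost, budget):
    """conditions: list of dicts with 'name', 'accuracy', 'update_rate'.
    candidate_sensors: dict mapping condition name -> sensor name (or None).
    cost: function(sensor, accuracy, update_rate) -> float.
    budget: maximum acceptable cost per sensing device."""
    D = []                                      # step 2: no devices selected yet
    for c in conditions:                        # step 3
        sensor = candidate_sensors.get(c["name"])
        if sensor is None:
            continue                            # cannot be sensed: infinite cost
        if cost(sensor, c["accuracy"], c["update_rate"]) <= budget:
            D.append((c["name"], sensor, c["accuracy"], c["update_rate"]))
    reactions = {}
    if D:                                       # step 4: (input range, reaction) pairs
        for name, sensor, _, _ in D:
            reactions[name] = []                # filled in per application, e.g.
                                                # [("15..25 degC", "switch to notepad")]
    return D, reactions

# Example use with made-up numbers:
conds = [{"name": "light", "accuracy": 0.1, "update_rate": 1.0},
         {"name": "location", "accuracy": 1.0, "update_rate": 0.1}]
sensors = {"light": "photodiode", "location": None}
D, R = identify_implicit_hci(conds, sensors, lambda s, a, u: 1.0, budget=5.0)
print(D)   # [('light', 'photodiode', 0.1, 1.0)]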

2.5 Modeling Implicit Human Computer Interaction

To specify applications that facilitate implicit HCI, it is necessary to have a specification language that describes situational context linked to events or changes that occur in the application. In our recent work we found it helpful to use a notation that is human readable as well as easy to process by a computer. We decided to use a markup language specified in XML for this purpose. Extending the SGML-based description model introduced by Brown in [2], [3], we added two more concepts – grouping of contexts with matching attributes, and trigger attributes – to make the description more expressive and suitable for our projects. See figure 2 for the XML data type definition (DTD).

<!ELEMENT context_interaction (context, action)>
<!ELEMENT context (group+)>
<!ELEMENT group (#PCDATA)>
<!ATTLIST group match
    (one | all | none) #REQUIRED >
<!ELEMENT action (#PCDATA)>
<!ATTLIST action
    time CDATA '0'
    trigger (enter | leave | in) #REQUIRED >

Figure 2: Data Type Definition

In the <context> section, contextual variables are used to describe the conditions. These variables consist of two parts: the first specifies the context sensing module – in figure 3 the sensor module (sensor_module) and the palm pilot (pilot) – and the second part names the variable provided by this module.

In the <action> section, function calls are used to specify the action to be carried out in case the trigger evaluates to true. These calls are also hierarchically structured, specifying the device, the application, and the function to be performed. Depending on the platform (e.g. a context sensing module on a microcontroller) we use a different implementation language.

If contexts are composed of a number of components, we found it very helpful to have a mechanism to bundle certain contextual variables into groups and to select a matching semantic for each group. For matching in a group we provide the following semantics: one (match one or more of the variables in the group), all (match all variables in the group), and none (match none of the variables in the group). All groups within a context description must evaluate to true to cause the trigger.

We discriminate three different triggers: 'enter a context', 'leave a context', and 'while in a context'. The 'enter' and 'leave' triggers take a time value that specifies the time after which the action is triggered, provided the context stays stable over this time.¹ For the 'while in a context' trigger, the time indicates the interval at which the trigger is fired again.

¹ The parameter indicating the time after which an action is performed is often 0 (immediate context-action coupling) or positive. In certain circumstances, when future situations can be predicted (e.g. you drive your car into the parking lot, so the context walking will appear soon), a negative value makes sense, too.

<context_interaction>
  <context>
    <group match='one'>
      sensor_module.touch
      pilot.on
    </group>
    <group match='none'>
      sensor_module.alone
      pilot.pen_down
    </group>
  </context>
  <action trigger='enter' time='3'>
    pilot.notepad.confidential
  </action>
</context_interaction>

Figure 3: Context description

In figure 3 an example of a description of a context and an action is shown. The context description consists of two groups of contextual variables. In the first group the match semantics is that at least one of the variables must be true: in this case either the device is touched or the state of the device is on. In the second group the match semantics is 'none', which means that the contextual variable alone must not be true and that the user must not have touched the screen with the pen.

If the context evaluates to true, an action is triggered. Here the semantics is that if the context is entered and stays stable for at least three seconds, then the action is performed.

The complete description means: if the device is on or in the user's hand, and if the user is not alone and is not touching the screen with the pen, then after three seconds the display content should be hidden by an image, as depicted in figure 6(d) in a later section.
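As an illustration of how such a description could be evaluated at run time, the sketch below checks the group match semantics against a set of contextual variables that are currently true and applies the 'enter' trigger with its time parameter. The dictionary-based encoding of figure 3 and the function names are assumptions; the actual systems process the XML description with platform-specific code.

# Illustrative evaluator for the group/trigger semantics described above.
def group_matches(match, variables, true_vars):
    """match: 'one' | 'all' | 'none'; variables: names from a <group>;
    true_vars: set of contextual variables currently true."""
    hits = sum(v in true_vars for v in variables)
    if match == "one":
        return hits >= 1
    if match == "all":
        return hits == len(variables)
    if match == "none":
        return hits == 0
    raise ValueError(match)

def context_active(groups, true_vars):
    # All groups must evaluate to true to cause the trigger.
    return all(group_matches(m, vs, true_vars) for m, vs in groups)

# Encoding of the figure 3 example (an assumption, not generated code).
groups = [("one",  ["sensor_module.touch", "pilot.on"]),
          ("none", ["sensor_module.alone", "pilot.pen_down"])]

def on_enter(stable_seconds, required_seconds=3):
    # 'enter' trigger: fire once the context has stayed stable long enough.
    return stable_seconds >= required_seconds

true_vars = {"pilot.on"}                      # device on, user not alone, no pen
if context_active(groups, true_vars) and on_enter(stable_seconds=3):
    print("pilot.notepad.confidential")       # hide the display content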

3 Perception

There are several ways to equip devices with perceptional capabilities. The range of complexity to consider is very wide, starting from simple sensors that know the way a device is held [18] up to complex audio and video analysis. We identified the following four basic approaches:

• device databases (e.g. calendars, to-do lists, address books, profiles, etc.)
• input to the running application (notepad – taking notes, calendar – looking up a date, etc.)
• active environments (active badges [10], IR networks, cameras, audio, etc.)
• sensing context using sensors (TEA [5], [21], Sensor Badges [1], GPS, cameras, audio [16], etc.)

The perceptual capabilities can be located in the device itself, in the environment, or in another device that shares the context over a network (e.g. a body area network).

In the remainder of this section we concentrate on sensor-based perception, knowing that in most scenarios a combination of all four cases is the way of choice. First we introduce two sensor devices developed in our group and then provide some information on other sensor-based devices that supply contextual information.

3.1 Context Awareness Component

In this part a wearable context-awareness component that integrates low-cost sensors is described. Simple methods are used to derive context information from sensor data. The derived context is application-independent and can be exploited by other wearable or personal technologies in a body network, for instance wearable computers, mobile phones, digital cameras, and personal digital assistants.

Here we chose to address a number of contexts that relate to how interruptible the user is. These contexts describe only a certain aspect of real world situations, but they are general in the sense that they can be exploited by a range of applications. Such context is, for instance, implemented and used in the audio wearable described in [16], mimicking the human ability to recognize situations in which it is rude to interrupt, for instance when a person is engaged in a conversation or giving a talk. Context information of this kind is also useful for other applications: calendars, email notification, pagers and mobile phones can make use of any context that gives an indication of whether or not it is a good time to interrupt the user.

The specific contexts that we chose for our study are based on aural information – user speaking, others speaking, noisy, and quiet – and on movement of the user: walking, running, stationary. Movement context was included as it gives an indication as to whether a user can be interrupted visually.

For recognition of aural and movement contexts, we integrated two microphones and an accelerometer in our design. One of the microphones is placed near the user's throat, the other pointing away from the user. With this configuration the distinction between speaker and environment is feasible with minimal processing cost. The acceleration sensor is used to discriminate whether a user is standing still, walking or running.

The sensor placement considerations led us to build the context-awareness component into a tie – it is also conceivable to build it into other accessories worn in similar ways (e.g. jewelry, a neckerchief, or a necklace). We also liked that a tie stresses the component's design as an accessory rather than as a stand-alone device, see figure 4.

Figure 4: Context-Awareness Tie.

The hardware of our context-awareness component is built around a TINY-Tiger microcontroller, which offers four analog inputs and two serial lines. The two signals from the microphones are amplified and connected to the analog inputs. To measure motion we used a two-axis accelerometer (Analog Devices ADXL202). A more detailed description has been published in [22].

The software is realized in Tiger-BASIC, a multitasking BASIC dialect for the TINY-Tiger. It reads and analyzes sensor data in a time window of about four seconds. The methods used to analyze the signals are deliberately simple; they work in the time domain and are based on basic statistical measurements. Based on the features calculated from the sensor data, the contexts are detected.
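The paper does not list the individual features; as a rough illustration, the sketch below shows the kind of time-domain statistics and thresholding such a component could apply to one four-second window. All thresholds and decision rules are assumptions, not the values used on the TINY-Tiger.

# Rough sketch of simple time-domain analysis over a ~4 s window.
from statistics import mean, pstdev

def classify_window(mic_throat, mic_env, accel_x, accel_y,
                    speak_thresh=2.0, noise_thresh=0.1,
                    walk_thresh=0.05, run_thresh=0.2):
    """Each argument is a list of samples from one four-second window."""
    throat_energy = mean(abs(s) for s in mic_throat)
    env_energy = mean(abs(s) for s in mic_env)
    motion = pstdev(accel_x) + pstdev(accel_y)   # spread of the acceleration signal

    # Aural context: compare the throat microphone against the environment.
    if throat_energy > speak_thresh * max(env_energy, 1e-6):
        aural = "user speaking"
    elif env_energy > noise_thresh:
        aural = "others speaking / noisy"
    else:
        aural = "quiet"

    # Movement context from acceleration variability.
    if motion > run_thresh:
        movement = "running"
    elif motion > walk_thresh:
        movement = "walking"
    else:
        movement = "stationary"
    return aural, movement

print(classify_window([0.9, 1.0, 0.8], [0.1, 0.2, 0.1],
                      [0.0, 0.01, 0.0], [0.0, 0.0, 0.01]))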
The communication is based on a serial line connection at 9600 bit/s, in a simple request-reply manner. The client requests the contextual variables from the context-awareness component, which sends back the variables together with their values.
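The exact wire format is not given here; as an illustration, a host-side client along the following lines could poll the component over the 9600 bit/s link. The pyserial library, the port name, and the line-oriented message format are assumptions.

# Illustrative host-side client for the request-reply exchange.
# Port name, message framing, and variable names are assumptions.
import serial   # pyserial, assumed to be available on the host device

def request_context(port="/dev/ttyS0"):
    with serial.Serial(port, baudrate=9600, timeout=1.0) as link:
        link.write(b"GET context\r\n")             # request all contextual variables
        reply = link.readline().decode("ascii")    # e.g. "speaking=user;movement=walking"
        return dict(item.split("=", 1)
                    for item in reply.strip().split(";") if "=" in item)

if __name__ == "__main__":
    print(request_context())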
Experimentation with the context-aware tie showed that contexts were recognized in a very reliable way. Both 'user speaking' vs. 'others speaking' and 'stationary' vs. 'walking' vs. 'running' were discriminated correctly. A key finding is that sensor placement can be used effectively to increase reliability and to reduce the required processing.

The device can provide information on the situational context of the user to other personal technologies in a body area network. Using this device, implicit HCI can be facilitated.

3.2 Sensor Board

Using this board we collect data on the situational context with a combination of low-level sensors. In this project we built a context recognition device equipped with a light sensor, an acceleration sensor, a passive infrared sensor, a touch sensor, and a temperature sensor. All sensors but the touch sensor are standard sensors and produce an analog voltage level; the touch sensor recognizes the human body as a capacitor and supplies a digital value. The heart of the device is a BASIC-Tiger microcontroller that reads from all the physical input channels (it offers four analog-digital converters and a number of digital IOs); statistical methods are then applied to recognize contexts. The board is depicted in figure 5. The PDA requests the contextual variables while the application is idle, e.g. by catching the NullEvent on the PalmPilot.

Figure 5: Context Sensing Device and PalmPilot.
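As an illustration of this polling scheme, the sketch below requests the current contextual variables only when no user event is pending. The event representation, the read_context() placeholder, and the timing are assumptions, not the PalmOS implementation.

# Illustrative PDA-side polling loop: request context only while idle.
import time

def read_context():
    # Placeholder for the serial request-reply exchange sketched earlier.
    return {"in_hand": True, "light_level": 0.1, "alone": True, "moving": False}

def handle_event(event):
    print("handling", event)

def event_loop(events, poll_interval=0.5):
    last_poll = 0.0
    for event in events:
        if event is None:                      # no pending event: the idle case
            if time.monotonic() - last_poll > poll_interval:
                context = read_context()       # ask the sensor board
                print("context update:", context)
                last_poll = time.monotonic()
        else:
            handle_event(event)

event_loop([None, "pen_down", None])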
3.3 Related Work on Context Sensing

In robotics this kind of perception is widely used, but with a different objective: giving machines the ability to operate autonomously.

For use with handheld devices, the project TEA [5] developed a sensor board (equipped with eight sensors: light, acceleration, pressure, temperature, etc.) that supplies contextual information; communication is done via a serial line. The application described is a mobile phone that recognizes its context (in the user's hand, on the table, in a suitcase, outdoors) and adapts its ringing mode according to the user's preferences in that situation [19].

Using a similar approach, a system to facilitate indoor location awareness based on low-level sensors is described in [8]. The system reads data from different sensors (acceleration, light, magnetic field, etc.) and provides location information.

In [7] a cup is described that has an acceleration and a temperature sensor built in, together with a microcontroller and infrared communication. The cup is aware of its state (warm, cold, on a table, drinking, moved). The information from a number of cups, communicated to a server, is then used to supply information about the group of users. All these projects focus on a completely sensor-based approach to context awareness.

A jacket that knows whether it is on the hanger or worn by the user is presented in [6]. The sensor jacket has woven-in sensors that give information on whether the user is wearing the jacket, what movements the user is making, etc. As one application, correcting movements in sports (an automated tennis coach) is suggested in the paper. In this project the development of robust sensing technology is central.

4 How Can HCI Benefit from Context?

HCI for mobile devices is concerned with the general trade-off between device qualities (e.g. small size, light weight, little energy consumption, etc.) and the demand for optimal input-output capabilities. Here implicit HCI can offer interesting alternatives.

4.1 Output in Context

Over recent years the output systems for mobile devices have become much better; features such as stereo audio output, high-resolution color screens for PDAs and even mobile phones, as well as display systems for wearable computers, are commercially available. Also, unobtrusive notification mechanisms (e.g. vibration) have become widely used in phones and PDAs. Still, at the lower end, devices with very poor display quality enter the market. Situational context can help to:

• adapt the output to the current situation (font size, volume, brightness, privacy settings, etc.) [19],
• find the most suitable time for an interruption [16], [22],
• reduce the need for interruptions (e.g. you don't need to remind someone to go to a meeting if he is already there).

4.2 Input in Context

Considering very small appliances, the space for a keyboard is very limited, which results in bad usability. Other input systems, such as graffiti and handwriting recognition, have been developed further but still lack speed and accuracy [9]. Advances in voice recognition have been made in recent years, but for non-office settings (e.g. in a car, in a crowded place, in shared rooms, and in industrial workplaces) the recognition performance is still poor. Also, privacy and acceptance issues are a major concern. Implicit HCI does not solve these problems in general, but it can help to:

• adapt the input system to the current situation (e.g. audio filters, recognition algorithms, etc.),
• limit the need for input (e.g. information that is already provided by the context can be captured), and
• reduce the selection space (e.g. only offer options appropriate in the current context).

4.3 ContextNotePad on a PalmPilot

To explore ways of implicit communication between users and their environment with mobile devices, we built a context-aware NotePad application. The system uses the perceptional capabilities of the sensor board described in the previous section and provides an application that is very similar in functionality to the built-in notepad application on the PalmPilot. Additionally, the application can adapt to the current situational context and can thus react to implicit interaction. The application changes its behavior according to the situation. The following context adaptations have been implemented:

• On/Off. The user has the device in her hand. In this case the application is switched on; if the user puts the device out of her hand, it is switched off after a certain time. The assumption is that if the user takes the device into her hand, she wants to work with it.

• Fontsize. If the device is moved (e.g. while walking or on a bumpy road), the font size is increased to ease reading. While the device is in a stable position (e.g. stationary in your hand or on the table), the font is made smaller to display more text on the same screen, see figures 6(a) and (b).

• Backlight. This adaptation is straightforward but still not built into current PDAs. By monitoring the light conditions, the application switches on the backlight if the brightness level in the environment is below a certain threshold. Accordingly, if it becomes brighter, the light is switched off again, see figure 6(c).

• Privacy settings. If you are not alone and you are not writing (or touching the screen), the content on the display is hidden by an image, see figure 6(d). To sense whether someone is walking by, the passive infrared sensor is deployed.

Figure 6: Adaptation to Context: a) small font, b) large font, c) backlight, d) privacy.

Currently we are decreasing the size of the context-awareness device to make it feasible to plug it into the Pilot, in order to allow proper user studies.
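As an illustration of how these adaptations could be driven by the contextual variables of the sensor board, the sketch below expresses the four behaviors as simple rules. The variable names, thresholds, and the adapt() interface are assumptions, not the actual PalmPilot implementation.

# Illustrative adaptation rules for the ContextNotePad behaviors described above.
def adapt(context, settings):
    """context: dict of contextual variables from the sensor board;
    settings: dict of application settings, updated in place."""
    # On/Off: switch on when held, off a while after being put down.
    settings["on"] = context.get("in_hand", False) or context.get("recently_held", False)
    # Fontsize: larger font while the device is moved, smaller when stable.
    settings["font_size"] = "large" if context.get("moving", False) else "small"
    # Backlight: switch on below a brightness threshold.
    settings["backlight"] = context.get("light_level", 1.0) < 0.2
    # Privacy: hide the content if others are present and the pen is not used.
    settings["hide_content"] = (not context.get("alone", True)
                                and not context.get("pen_down", False))
    return settings

print(adapt({"in_hand": True, "moving": True, "light_level": 0.05, "alone": False}, {}))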
5 Conclusion and Further Work

Based on observations of new sensing technology, available sensors and anticipated users, a new interaction metaphor is proposed. Implicit HCI is defined as an action, performed by the user, that is not primarily aimed at interacting with a computerized system but which such a system understands as input. It is further identified that perception and interpretation of the user, the environment, and the circumstances are key concepts for implicit HCI. Furthermore, applications that exploit this information are required.

Perception and interpretation are considered as situational context. We therefore motivate a broad view of context, and also suggest that the context be described from the perspective of the application. To identify applications that can make use of situational context, and thus can facilitate implicit HCI, a number of questions are raised and an algorithm is suggested. It is based on the central questions: what happens around the application, how can this be sensed or captured, how can this information be interpreted, and how can applications make use of it?

From current projects we learned that there is a need for a simple specification language for implicit HCI, based on situational context. We propose an XML-based markup language that supports three different trigger semantics. The language is easily human readable and also easy to process.

Basic mechanisms of perception to acquire situational context are discussed. In the first example a wearable context-awareness component built into a tie is described. Also, a sensor-based context-awareness device is introduced. Both devices supply context to other devices via a simple request-reply protocol over a serial line.

In a further section, the benefits of implicit interaction through situational context for HCI are discussed. In an example implementation the feasibility of the concepts introduced earlier is demonstrated.

References

[1] Beadle, P., Harper, B., Maguire, G.Q. and Judge, J. Location Aware Mobile Computing. Proc. of IEEE Intl. Conference on Telecommunications, Melbourne, Australia, April 1997.

[2] Brown, P. J., Bovey, J. D., Chen, X. Context-Aware Applications: From the Laboratory to the Marketplace. IEEE Personal Communications, October 1997.

[3] Brown, P. J. The stick-e Document: A Framework for Creating Context-Aware Applications. Proc. EP'96, Palo Alto, CA (published in EP-odds, Vol. 8, No. 2, pp. 259-72), 1996.

[4] Cheverst, K., Blair, G., Davies, N., and Friday, A. Supporting Collaboration in Mobile-aware Groupware. Personal Technologies, Vol. 3, No. 1, March 1999.

[5] Esprit Project 26900. Technology for Enabling Awareness (TEA). 1998. http://tea.starlab.net/.

[6] Farringdon, J., Moore, A.J., Tilbury, N., Church, J., Biemond, P.D. Wearable Sensor Badge & Sensor Jacket for Context Awareness. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.

[7] Gellersen, H.-W., Beigl, M., Krull, H. The MediaCup: Awareness Technology Embedded in an Everyday Object. 1st International Symposium on Handheld and Ubiquitous Computing (HUC99), Karlsruhe, Germany, 1999.

[8] Golding, A., Lesh, N. Indoor Navigation Using a Diverse Set of Cheap, Wearable Sensors. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.

[9] Goldstein, M., Book, R., Alsiö, G., Tessa, S. Non-Keyboard QWERTY Touch Typing: A Portable Input Interface for the Mobile User. Proceedings of CHI 99, Pittsburgh, USA, 1999.

[10] Harter, A. and Hopper, A. A Distributed Location System for the Active Office. IEEE Network, Vol. 8, No. 1, 1994.

[11] Lenat, D. The Dimensions of Context Space. 1999. http://www.cyc.com/publications.html.

[12] Leonhardt, U., Magee, J., Dias, P. Location Service in Mobile Computing Environments. Computers & Graphics, Special Issue on Mobile Computing, Volume 20, Number 5, September/October 1996.

[13] Maes, P. P. Maes on Software Agents: Humanizing the Global Computer. IEEE Internet Computing, July-August 1997.

[14] NCR Corp. Mülleimer informiert Supermarkt (garbage bin informs supermarket). http://www.heise.de/newsticker/data/anm-28.10.99-001/.

[15] Pascoe, J., Ryan, N. S., and Morse, D. R. "Human Computer Giraffe Interaction: HCI in the Field". Workshop on Human Computer Interaction with Mobile Devices, University of Glasgow, United Kingdom, 21-23 May 1998. GIST Technical Report G98-1, 1998.

[16] Sawhney, N. and Schmandt, C. "Nomadic Radio: A Spatialized Audio Environment for Wearable Computing." Proceedings of the International Symposium on Wearable Computing, Cambridge, MA, October 13-14, 1997.

[17] Schilit, B.N., Adams, N.L., Want, R. Context-Aware Computing Applications. Proc. of the Workshop on Mobile Computing Systems and Applications, Santa Cruz, CA, December 1994. IEEE Computer Society, 1994.

[18] Schmidt, A., Beigl, M., Gellersen, H.-W. Sensor-based Adaptive Mobile User Interfaces. In Proceedings of the 8th International Conference on Human-Computer Interaction, München, Germany, August 1999.

[19] Schmidt, A., Aidoo, K.A., Takaluoma, A., Tuomela, U., Van Laerhoven, K., Van de Velde, W. Advanced Interaction in Context. 1st International Symposium on Handheld and Ubiquitous Computing (HUC99), Karlsruhe, Germany, 1999; Lecture Notes in Computer Science, Vol. 1707, ISBN 3-540-66550-1, Springer, 1999.

[20] Schmidt, A., Beigl, M., Gellersen, H.-W. There is More to Context than Location. Proc. of the Intl. Workshop on Interactive Applications of Mobile Computing (IMC98), Rostock, Germany, November 1998.

[21] Schmidt, A., Forbess, J. What GPS Doesn't Tell You: Determining One's Context with Low-Level Sensors. The 6th IEEE International Conference on Electronics, Circuits and Systems, September 5-8, 1999, Paphos, Cyprus, 1999.

[22] Schmidt, A., Gellersen, H.-W., Beigl, M. A Wearable Context-Awareness Component – Finally a Good Reason to Wear a Tie. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
