
COVER STORY | SAFETY SYSTEMS

STEERING WHEEL FOR ACTIVE DRIVER STATE DETECTION
The steady increase in driver assistance and communication systems fitted to vehicles has made it necessary to look at driver workload and accident risk. The flood of information may even overload the driver. Although a lot is known about the environment around the vehicle via sensors such as cameras, radar and lidar, the situation inside the vehicle is still relatively unknown. Takata presents systems integrated into the steering wheel, such as hand detection and an integrated camera, that allow biometric monitoring of the driver's condition. This minimises the safety risk posed by additional driver activities.

Authors

STEFANIE ESSERS is Director for EMEA Core Engineering & Research at Takata AG in Berlin (Germany).

JASON LISSEMAN is Director North America Core Engineering for Steering Wheel Systems at Takata Holdings, Inc. in Auburn Hills, Michigan (USA).

HEIKO RUCK is Director for EMEA Application Engineering for Steering Wheels and Airbags at Takata AG in Aschaffenburg (Germany).

DRIVER WORKLOAD AND ACCIDENT RISK

Car buyers are being increasingly confronted with the field of autonomous driving by the media. One such showcase is the CES conference in Las Vegas, where for the last few years many vehicle manufacturers and suppliers have presented advanced driver assistance systems (ADAS) and vehicles with varying degrees of autonomous driving functionality.

In 2002, Landau [1] did intensive research on automation and specifically evaluated the influence of automation in cars. The relationship between driver workload and accident risk and the degree of available assistance and communication systems was already considered. It was anticipated that, after facilitating the driving task and thereby reducing accident risk and driver workload, we would come to a turning point: the increasing number of ADAS and sensor data fusion would lead to increased driver workload and accident risk, because vehicle assistance systems and nomadic devices place an additional demand on driver attention while driving.

Both forecasts have come true. In highly equipped cars, people are sometimes overloaded in managing the available assistance and infotainment systems in parallel to their main task of driving [2]. In the meantime, one is faced with increasing numbers of accidents due to distraction [3]. People feel safer in today's cars than ever before. They use their smartphones for texting, selecting music and making calls, and continue with activities such as eating or drinking.

Interestingly, Landau [1] had already included driver monitoring as a future device, and this is what will be discussed more extensively in the following.

DATA COLLECTION

If we look at vehicles in the D segment and above, we are now able to "see" a 360° surround view of the vehicle, exceeding the human senses. Lidar sensor systems, for example, are able to detect and classify obstacles 300 m ahead of the vehicle. Today, exterior sensors typically supply one or more smart assistance systems, which belong to one of three categories (grouped in the sketch below):
:: Safety systems (collision warning, pedestrian protection, emergency braking);
:: Vehicle control (ACC stop-and-go, lane keeping); and
:: Driver support (traffic sign recognition, blind spot detection).
With automatic parking and traffic jam assistance, we already experience today what will evolve further – a combination of vehicle control and driver support based on sensor data fusion.
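A minimal Python sketch of this grouping, using only the functions named above; the data structure itself is our own illustration, not something prescribed by the article:

```python
from enum import Enum

class AdasCategory(Enum):
    SAFETY = "safety systems"
    VEHICLE_CONTROL = "vehicle control"
    DRIVER_SUPPORT = "driver support"

# Functions named in the text, grouped by category (illustrative only).
ADAS_FUNCTIONS = {
    AdasCategory.SAFETY: ["collision warning", "pedestrian protection", "emergency braking"],
    AdasCategory.VEHICLE_CONTROL: ["ACC stop-and-go", "lane keeping"],
    AdasCategory.DRIVER_SUPPORT: ["traffic sign recognition", "blind spot detection"],
}

def category_of(function_name: str) -> AdasCategory:
    """Look up which of the three categories an assistance function belongs to."""
    for category, functions in ADAS_FUNCTIONS.items():
        if function_name in functions:
            return category
    raise KeyError(f"unknown ADAS function: {function_name}")
```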

1 ADAS characteristics – chronological development towards combined assistance systems (timeline from about 2015 to 2030, grouped into safety systems, vehicle control and driver support: from parking assist, collision warning, emergency braking, lane keeping, traffic sign recognition and automatic parking towards traffic jam assist, highway assist, emergency steering and, ultimately, autonomous highway and city driving)




2 Five levels of autonomous driving (Source: VDA) – Level 0: driver only; Level 1: assisted (driver controls either longitudinal or lateral steering); Level 2: partially automated (driver must permanently monitor the system); Level 3: highly automated (system detects its boundaries and calls for the driver to take over with a sufficient time reserve); Level 4: fully automated (no driver needed in the specific application); Level 5: driverless (system handles all situations during the entire trip)
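To pin down the figure's terminology, here is a minimal Python encoding of the level definitions taken from the figure above. It is our own illustrative sketch, not a VDA artefact:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AutomationLevel:
    level: int
    name: str
    driver_monitors_permanently: bool   # must the driver watch the system at all times?
    system_requests_takeover: bool      # does the system hand back control with a time reserve?

# Encoding of the levels shown in the figure (illustrative simplification).
VDA_LEVELS = [
    AutomationLevel(0, "driver only",         True,  False),
    AutomationLevel(1, "assisted",            True,  False),
    AutomationLevel(2, "partially automated", True,  False),
    AutomationLevel(3, "highly automated",    False, True),
    AutomationLevel(4, "fully automated",     False, False),
    AutomationLevel(5, "driverless",          False, False),
]

def will_request_takeover(level: AutomationLevel) -> bool:
    """At level 3 the system detects its boundaries and calls for the driver to take
    over with a sufficient time reserve, which presupposes knowing the driver state."""
    return level.system_requests_takeover
```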

Within the next decade we will see combined ADAS that include safety features, vehicle control and driver support in one system, allowing the car to drive automatically, ①. Future cars will not only offer increased assistance based on sensor data fusion, but they will also be fully connected – offering additional new functionalities to the occupants, including the driver. As automation increases, the feedback loop between vehicle and driver will be influenced. The German Association of the Automotive Industry (VDA) [4] as well as the German Federal Highway Research Institute (BASt) [5] provided definitions in 2014 that picked up on earlier automation classifications and added a fifth level: the autonomous, driverless car, ②.

Today's modern cars are just at the threshold of moving from partial to highly automated driving – level 2 to level 3, ②. The driver will be allowed to step out of the loop for a limited amount of time. Nevertheless, the safety of the occupants must be ensured at all times. This requires knowledge of the driver's status and activities. So, what does that mean for the role of the driver?

THE FUTURE ROLE OF THE DRIVER

The driver will become partly passive in the loop and will no longer be provided with information solely about the system state and the ADAS. In fact, the intelligence of the vehicle (electronic control unit, ECU) will monitor and assess information about the driver status. However, from today's perspective we approach a technical gap, ③: there is an "intelligent", connected vehicle, which is very aware of everything in its environment, and there is a driver who can be in any of the following states – focused and stable; distracted by secondary or tertiary tasks; drowsy; aggressive; bored; or overloaded.
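The driver statuses listed above map naturally onto a small state vocabulary that the vehicle's electronics could work with. The sketch below is purely illustrative; it mainly makes the "technical gap" explicit: without interior sensing, the estimate can only be "unknown".

```python
from enum import Enum, auto
from typing import Optional

class DriverState(Enum):
    """Driver statuses named in the text."""
    FOCUSED_AND_STABLE = auto()
    DISTRACTED = auto()      # by secondary or tertiary tasks
    DROWSY = auto()
    AGGRESSIVE = auto()
    BORED = auto()
    OVERLOADED = auto()

def estimate_driver_state(interior_sensing_available: bool) -> Optional[DriverState]:
    """Today's situation: rich exterior perception, but without interior sensing
    there is no basis for any estimate of the driver state (the technical gap)."""
    if not interior_sensing_available:
        return None
    raise NotImplementedError("needs the sensing described in the following sections")
```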

POSSIBILITIES FOR DRIVER STATE DETECTION

Unfortunately, the "intelligent" vehicle does not yet see the driver's behaviour via its external sensors. It has very good "eyes" for the exterior – but it is missing all the senses needed to be aware of its inner side. The MIT AgeLab [6] made a model of all the data that is needed to detect the driver status.

3 Technical gap between well-known environment recognition via the "intelligent" vehicle (left) and unknown driver behaviour (right)

4 Steering wheel with integrated lightbar as warning function (left) and a sectioned view of the steering wheel with a hands-on detection sensor mat (right)

Ultimately, this requires a lot of data, some of which can be provided by existing vehicle performance or environmental status devices, and some of which will drive the development of new sensors and monitoring systems that do not exist today.

However, the real key will be interrogating this data to build a holistic picture of the driver's mental workload, level and sources of distraction, fitness level and intentions. In order to accomplish this, there are three main sensing focus areas (a sketch of how they could be combined follows after this list):
:: Biometrics (heart rate, respiration, skin conductance);
:: Visual attention and fatigue (gaze measurements or percentage eye closure); and
:: Emotions (face recognition or voice analysis).
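The following minimal Python sketch illustrates how readings from these three focus areas could be fused into a coarse driver picture. The data classes, field names and thresholds are our own assumptions for illustration, not Takata's implementation.

```python
from dataclasses import dataclass

@dataclass
class BiometricReading:
    heart_rate_bpm: float
    respiration_rate_bpm: float
    skin_conductance_us: float      # skin conductance in microsiemens

@dataclass
class VisualAttentionReading:
    gaze_on_road_ratio: float       # share of recent time with gaze on the road (0..1)
    perclos: float                  # percentage eye closure over a window (0..1)

@dataclass
class EmotionReading:
    arousal: float                  # from face or voice analysis (0..1)
    valence: float                  # negative..positive (-1..1)

def holistic_driver_picture(bio: BiometricReading,
                            vision: VisualAttentionReading,
                            emotion: EmotionReading) -> dict:
    """Coarse rule-based fusion of the three sensing focus areas.
    All thresholds are placeholders, not validated values."""
    return {
        "drowsy": vision.perclos > 0.15,
        "distracted": vision.gaze_on_road_ratio < 0.7,
        "high_workload": bio.heart_rate_bpm > 100 and bio.skin_conductance_us > 10.0,
        "agitated": emotion.arousal > 0.7 and emotion.valence < 0.0,
    }
```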
Many of these systems exist in the medical industry, and in the automotive industry these measurements are already taken to determine the driver's fitness to drive safely. As mentioned previously, people eat, drink and talk on the phone while driving, and in recent years have also started to text, tweet and browse the internet.

Two aspects of these additional driver activities are dangerous. On the one hand, there is the visible behaviour of the driver: he takes his hands off the wheel and his eyes off the road. On the other hand, the GUIs of mobile devices are not designed for in-vehicle usage. For example, the user interface of a smartphone app does not have to comply with the standards for automotive usage [7]. The safety risk posed by this distracted behaviour is proven by the multitude of statistics that are commonly available [8, 9].

DETECTION OF HAND CONTACT AT STEERING WHEEL

In order to reduce the safety risk of additional driver activities, a first step for the near future could be the addition of a steering wheel hand detection system. This Hands-On Detection (HOD) system makes it possible to identify when the driver is in control of the car and to detect situations when this is not the case.

By utilising hidden capacitive sensors underneath the steering wheel rim surface material, it is possible to detect not only hand contact but also the location of contact on the steering wheel at all vehicle speeds, including when the vehicle is stationary. The first generation of HOD steering wheels is in production today. HOD can also be combined with a highly visible, wheel-mounted light warning function that reminds the driver to put their hands back on the steering wheel, ④. This quickly communicates the vehicle state and can attract visual attention back to the forward-looking zone.
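To make the principle concrete, here is a hedged sketch of how a hands-on decision could be derived from per-zone capacitive readings. The zone layout, units and threshold are hypothetical and are not taken from the production HOD system; the point is only that capacitive sensing works independently of steering torque and vehicle speed.

```python
from typing import Dict, List

TOUCH_THRESHOLD = 120   # placeholder calibration value, in counts above the unloaded baseline

def hands_on_zones(zone_readings: Dict[str, int],
                   threshold: int = TOUCH_THRESHOLD) -> List[str]:
    """Return the rim zones whose capacitance change indicates hand contact."""
    return [zone for zone, value in zone_readings.items() if value >= threshold]

def hands_on_detected(zone_readings: Dict[str, int]) -> bool:
    """Hands-on if at least one zone shows contact. Because the measurement is
    capacitive, it works at any speed, including standstill (unlike torque-based checks)."""
    return len(hands_on_zones(zone_readings)) > 0

# Example: contact at the 9 o'clock position only.
readings = {"9h": 180, "12h": 15, "3h": 40, "6h": 8}
if not hands_on_detected(readings):
    print("activate lightbar warning: hands back on the wheel")
else:
    print("hand contact at:", hands_on_zones(readings))
```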
CAMERA-BASED DRIVER STATE DETECTION

Combining this functionality with an interior-focused vision system allows the "intelligent" vehicle to monitor the visual attention of the driver. There are different regions where a driver is likely to focus: the road, the infotainment system, a smartphone, or off-road.

With these "inner eyes", the vehicle is able to track whether the driver is paying full attention and is therefore "in the loop", or whether the driver is distracted and needs increased assistance or support. With a camera and illumination via an infrared light emitting diode (IR-LED) included in the steering wheel, ⑤ (right), we have one of the best perspectives on the driver's face and an optimal viewing angle for further measurements such as gaze direction, occupant size and position, heart rate extracted from skin colour variances, identification and age estimation.

All these measurements support both passive and active safety functions. For almost full automation, additional measurements such as eye closure for drowsiness detection and pupillometry for workload measurement become increasingly important. With smart vision algorithms we will soon be able to more accurately detect the age, gender, posture, stature and identity of the occupants.
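A simple sketch of the attention-tracking idea: map the camera's gaze estimate onto the focus regions named above and combine it with percentage eye closure (PERCLOS) for drowsiness. All angle boundaries and thresholds are placeholders of our own choosing, not values from the article.

```python
from enum import Enum
from typing import List

class GazeRegion(Enum):
    ROAD = "road"
    INFOTAINMENT = "infotainment"
    SMARTPHONE = "smartphone"
    OFF_ROAD = "off-road"

def classify_gaze(yaw_deg: float, pitch_deg: float) -> GazeRegion:
    """Map a gaze direction estimated by the steering wheel camera to one of the
    focus regions named in the text. Angle boundaries are placeholders."""
    if abs(yaw_deg) < 15 and -10 < pitch_deg < 15:
        return GazeRegion.ROAD
    if yaw_deg > 20 and pitch_deg < -10:
        return GazeRegion.INFOTAINMENT
    if abs(yaw_deg) < 20 and pitch_deg < -25:
        return GazeRegion.SMARTPHONE
    return GazeRegion.OFF_ROAD

def driver_in_the_loop(recent_regions: List[GazeRegion], perclos: float) -> bool:
    """Attentive ("in the loop") if most recent glances are on the road and
    eye closure stays below a drowsiness threshold (placeholder: 15 %)."""
    on_road_share = recent_regions.count(GazeRegion.ROAD) / max(len(recent_regions), 1)
    return on_road_share >= 0.7 and perclos < 0.15
```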




5 Steering wheel with biometric sensors in the rim (left) as well as integrated camera and infrared light emitting diode (right)

This will in turn enable the car to personalise the passive safety restraints for optimum performance and to optimise the active safety system by adapting the automation.

VITAL STEERING WHEEL

One area of interest would be expanding these measurements to include an electrocardiogram (ECG), skin conductance and respiratory rate, using further sensors within the steering wheel or the seatbelt. Such data could help derive the fitness level, workload and emotional state of the driver. More measurements, such as brain waves, would be beneficial; but by combining the three core measurements mentioned above with vision data, it is possible to obtain a classification of stress level or workload, fitness level or wellness, emotional state and driver intention.
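As one illustration of what can be derived from such contact sensors, the sketch below computes heart rate and a common heart-rate-variability measure (RMSSD, often used as a strain or workload proxy) from R-peak timestamps. The upstream peak detection and any interpretation thresholds are assumed, not specified by the article.

```python
from typing import List

def heart_rate_bpm(r_peak_times_s: List[float]) -> float:
    """Mean heart rate from timestamps of detected ECG R-peaks (in seconds).
    Peak detection is assumed to happen upstream in the sensor electronics."""
    intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

def rmssd_ms(r_peak_times_s: List[float]) -> float:
    """RMSSD of successive inter-beat intervals in milliseconds, a common
    heart-rate-variability measure used as a strain/workload proxy."""
    intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    diffs = [(b - a) * 1000.0 for a, b in zip(intervals, intervals[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

beats = [0.00, 0.82, 1.61, 2.45, 3.24, 4.10]    # example R-peak times in seconds
print(round(heart_rate_bpm(beats)), "bpm, RMSSD", round(rmssd_ms(beats), 1), "ms")
```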
Continued sensing research is expected to deliver alternative, non-intrusive ways to gain this data, but as of today the majority of these measurements can only be derived by direct contact. As the steering wheel is the only location in the vehicle with direct skin contact, it represents the ideal location to capture such data. The steering wheel prototype in ⑤ (left) is one such wheel; it is equipped with independent biometric sensors around the steering wheel rim (as a sensor mat), which allow skin conductance and heart rate measurements in a reliable and accurate manner.

However, this is just one potential solution, because hand position is not stable and, in autonomous mode, the hands may not always be on the wheel. In general, no matter whether visual or surface sensors are used, redundant measurements are key for stable classification results and for a valid prognosis of driver behaviour. Last but not least, a smart vehicle will also need a "brain" – a powerful processing platform with complex algorithms – that combines all necessary data to classify the driver state and to adapt the automation and the interior to the driver's needs.
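How such a "brain" might arbitrate between redundant channels is sketched below, again purely as an illustration: a contact-based and a camera-based heart-rate estimate are cross-checked, and a toy rule set turns the fused signals into an action. The plausibility window and the rules are our own assumptions.

```python
from typing import Optional

def fuse_heart_rate(rim_bpm: Optional[float], camera_bpm: Optional[float]) -> Optional[float]:
    """Cross-check the contact-based and the camera-based heart-rate channel;
    fall back to whichever single channel is available."""
    if rim_bpm is not None and camera_bpm is not None:
        if abs(rim_bpm - camera_bpm) <= 10:    # plausibility window (placeholder)
            return (rim_bpm + camera_bpm) / 2.0
        return rim_bpm                          # prefer direct skin contact on conflict
    return rim_bpm if rim_bpm is not None else camera_bpm

def arbitrate(heart_rate: Optional[float], hands_on: bool, gaze_on_road: bool) -> str:
    """Toy arbitration that turns the fused driver state into an action
    for adapting the automation and the interior."""
    if not hands_on and not gaze_on_road:
        return "driver out of the loop: issue takeover warning"
    if heart_rate is None:
        return "driver state unknown: keep assistance at a conservative level"
    if heart_rate > 110:
        return "high strain: reduce secondary information, increase assistance"
    return "driver in the loop: no intervention"
```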
CONCLUSION FOR DRIVER STATE SENSING

Takata continues to strive to bring all these pieces together into a complete system, such as a virtual co-pilot. This would be the "personification" of the vehicle and its assistance systems.

While the assistance systems and automation become more and more powerful, the interface has not evolved significantly. For a complex automation task, more than the several on/off buttons of the various assistance systems is needed. In fact, seamless, natural interaction between two "intelligent" parties – between vehicle and driver – will be expected by the customer.

From a safety or misuse perspective, some level of driver and/or occupant sensing is indispensable to do automation safely. The steering wheel will play a significant role in detecting driver state. After all, a vehicle on an autonomous drive from Munich to Hamburg should know whether the driver is healthy and, ultimately, still there.

REFERENCES

[1] Landau, K.: The development of driver assistance systems following usability criteria. In: Behaviour & Information Technology 21 (2002), No. 5, pp. 341-344
[2] Jahn, G.; Oehme, A.; Rösler, D.; Krems, J. F.: Kompetenzerwerb im Umgang mit Fahrerinformationssystemen. Berichte der Bundesanstalt für Straßenwesen, Abt. Fahrzeugtechnik, Heft F 47, Bremerhaven: Wirtschaftsverlag NW, 2011
[3] Klauer, S. G.; Dingus, T. A.; Neale, V. L.; Sudweeks, J. D.; Ramsey, D. J.: The impact of driver inattention on near-crash/crash risk – an analysis using the 100-car naturalistic driving study. NHTSA, April 2006, National Technical Information Service, Springfield, Virginia, 22161
[4] VDA: Automatisiertes Fahren. In: https://www.vda.de/de/themen/innovation-und-technik/automatisiertes-fahren.html, last access on 23 January 2015
[5] Gasser, T.: Herausforderungen automatischen Fahrens und Forschungsschwerpunkte. 6. Tagung Fahrerassistenz, TÜV Süd, München, 2013
[6] Coughlin, J. F.; Mehler, B.; Reimer, B.: Monitoring, managing, and motivating driver safety and well-being. In: Pervasive Computing, 10 (2011), No. 3, pp. 14-21
[7] Alliance of Automobile Manufacturers (AAM), Driver Focus-Telematics Working Group: Statement of principles, criteria and verification procedures on driver interactions with advanced in-vehicle information and communication systems. Washington, DC, 2002
[8] U. S. Department of Transportation: Facts and statistics. In: http://www.distraction.gov/content/get-the-facts/facts-and-statistics.html, last access on 23 January 2015
[9] Jolley, D.: Ford survey shows many European drivers ignore distracted driving risks. In: http://europe.autonews.com/article/20120414/ANENE/120419968, last access on 23 January 2015

