
Eye movement detection for wheelchair control application
Dr. Mandeep Singh1, Prateek Jain2, Shaurya Chopra3
Electrical & Instrumentation Engineering Department,
Thapar University, Patiala
1mdsingh@thapar.edu, 2pj91000@gmail.com, 3shauryachopra66@gmail.com
Abstract- An electric wheelchair is an aid for disabled people who have lost the ability to move. A conventional wheelchair is manually driven and cannot be used by people with full-body impairment, so a model is needed that can benefit them. Various motor-operated wheelchairs are available, but none of them is perfectly accurate, and inaccuracy of a wheelchair can have disastrous results for the operator. A wheelchair navigation design with high accuracy is therefore required. This paper presents a method to navigate the wheelchair with high accuracy by combining the E.O.G. method with a camera interfaced in front of the eye, giving the highest priority to the safety of the user.
Keywords- E.O.G., electrooculography, eye-controlled wheelchair, microcontroller

I. INTRODUCTION

Census 2001 revealed that over 21 million people in India suffer from one kind of disability or another. This is equivalent to 2.1% of the population, out of which 0.6% are physically disabled. Disability in mobility can be either congenital or acquired with age. Many types of orthopaedic or neuromuscular impairment can affect mobility, including but not limited to amputation, paralysis, cerebral palsy, stroke, multiple sclerosis, muscular dystrophy, arthritis and spinal cord injury. To support the mobility of disabled people, wheelchairs are being developed using various techniques, some of which are brain-activated control, speech control and eye control. Eye control can include E.O.G. (electrooculography) technology and a camera interfaced with the eyes. This paper targets people with full-body impairment and focuses on eye-activated control techniques for controlling an electric wheelchair. The camera interface detects eye movements from pictorial images of the eyeball, while in E.O.G. technology signals are acquired across the eye and then processed for decision making.

II. EOG SIGNAL

The front of the eye (the cornea) has a bioelectric potential that is positive with respect to the back of the eye (the retina) and varies from 0.4 mV to 1.0 mV. The potential is approximately aligned with the optical axis of the eye and can be registered using surface skin electrodes placed around the eye. This technique is known as electrooculography (EOG). When the eyes move, the potentials at the electrodes vary in proportion to the sine of the rotation angle of the eyes; the linearity becomes progressively worse for angles beyond 30 degrees. The output voltage level of the EOG varies from person to person, but the EOG signal profile remains the same (Fig. 3). The noise in EOG recordings arises from different sources such as facial muscles, body and head movement, and other activities such as speaking.

DESIGN OF EOG SYSTEM

Horizontal and vertical electrodes are placed across the eyes in the horizontal and vertical positions respectively in order to obtain the left-right and up-down positions of the eyeball, as per Fig. 1. A ground electrode is placed on the forehead and connected to the DC ground of the electronic circuit. Signals from the electrodes are amplified and processed using an instrumentation amplifier. The output of the instrumentation amplifier is fed to the 10-bit ADC of the microcontroller (refer to Table 1 for signal levels). Initially the operator has to look in the forward direction to obtain a reference signal value, because the reference signal may vary from operator to operator. To move in a required direction, the operator directs his or her eyes in that direction, and the acquired signal is compared with the reference signal registered in the microcontroller at initialization of the program. A decision is taken on the basis of the profile of the signal, and control signals are given to the motors of the wheelchair to move in the desired direction. To stop the motors, the operator has to look in the upward direction, and to start the motors, he or she has to look in the downward direction.

Fig. 1. Placement of electrodes


Fig. 2. Output from the instrumentation amplifier schematic

An instrumentation amplifier is used because the signals from the electrodes are of the order of 0.4 mV to 1 mV; therefore the signals from the two electrodes need to be amplified and processed. Both the op amp and the differential amplifier are integrated in the instrumentation amplifier (AD620), and its error is less than that of a circuit using a separate op amp and differential amplifier (Fig. 4). The signal level at the output of the instrumentation amplifier is measured by the 10-bit ADC (Fig. 2). The average of every 64 samples is taken in the ADC of the microcontroller before decision making in order to reduce ripple and attain high accuracy.
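As a rough illustration of this averaging step (the authors implement it in the microcontroller firmware, so the Python below, the simulated read_adc() and the 5 V ADC reference are only assumptions), blocks of 64 raw samples are averaged before any comparison:

```python
import random

def read_adc():
    """Hypothetical stand-in for the microcontroller's 10-bit ADC read;
    here it simulates a noisy reading around a fixed level."""
    return 39 + random.randint(-3, 3)   # roughly 192 mV at an assumed 5 V reference

def averaged_sample(n_samples=64, vref_mv=5000.0, full_scale=1023):
    """Average n_samples raw ADC readings to suppress ripple, then
    convert the averaged 10-bit count to millivolts."""
    total = sum(read_adc() for _ in range(n_samples))
    return (total / n_samples) * vref_mv / full_scale

print(round(averaged_sample(), 1))  # averaged EOG level in mV
```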

Fig. 3. Signal profile at the output of the instrumentation amplifier for the centre, left and right eye positions

Table 1. Voltage for different eye positions

Parameter                 Voltage (mV)
Centre voltage (peak)     192
Left voltage (peak)       236
Right voltage (peak)      164
Signal ripple             28
Left-centre voltage       44
Centre-right voltage      28

Fig. 4. Error comparison between the AD620 and three-op-amp IA designs
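The paper does not spell out the firmware's decision rule, so the following Python sketch is only one plausible reading of it: the Table 1 peak levels are shifted to the operator's start-up reference, and the averaged sample is assigned to the nearest level. Vertical-channel handling (looking up to stop, down to start) would follow the same pattern.

```python
# Peak levels from Table 1 (mV at the instrumentation-amplifier output).
CENTRE_MV, LEFT_MV, RIGHT_MV = 192, 236, 164

def classify_horizontal(sample_mv, reference_mv):
    """Classify an averaged horizontal-channel sample as LEFT, RIGHT or CENTRE.

    The operator-specific reference captured at start-up replaces the nominal
    centre level; the left/right peaks are shifted by the same offset and the
    sample is assigned to the nearest level (this decision rule is an assumption).
    """
    offset = reference_mv - CENTRE_MV
    levels = {
        "CENTRE": CENTRE_MV + offset,
        "LEFT": LEFT_MV + offset,
        "RIGHT": RIGHT_MV + offset,
    }
    return min(levels, key=lambda pos: abs(sample_mv - levels[pos]))

# Example: a reading of 230 mV against a 192 mV reference is classified as LEFT.
print(classify_horizontal(230, reference_mv=192))
```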

III. WHEELCHAIR CONTROL BY CAMERA INTERFACE

A camera is placed in front of the eye to capture an image of the pupil and sclera. The captured image is converted into a binary image, in which the iris appears as the black region and the sclera as the white region. The binary image is divided into three vertical sectors (Fig. 5). Initially the operator has to look in the forward direction to obtain the reference image for the subsequent images taken in real time. The number of non-zero terms (white pixels) is calculated for each sector of the reference image and compared with the respective sectors of the real-time images. Based on these comparisons, a decision is taken about the position of the eyeball and a control signal is given to the motors of the wheelchair. To stop and start the wheelchair, the operator has to blink for two seconds.
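One plausible way to compute these per-sector white-pixel counts is sketched below; the authors use MATLAB, so this OpenCV/NumPy version and its binarisation threshold are illustrative assumptions only:

```python
import cv2
import numpy as np

def sector_white_counts(gray_eye_image, threshold=80, n_sectors=3):
    """Binarise an eye image (iris dark, sclera bright) and return the
    number of white pixels in each of n_sectors vertical sectors."""
    _, binary = cv2.threshold(gray_eye_image, threshold, 255, cv2.THRESH_BINARY)
    sectors = np.array_split(binary, n_sectors, axis=1)   # split along image width
    return [int(np.count_nonzero(s)) for s in sectors]
```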

Fig. 5. Division of the reference and real-time images into sectors (Sector 1, Sector 2, Sector 3)

Fig. 7. Right position detected in MATLAB

When the eyelids are closed, the number of white pixels in the image increases greatly compared with the reference image. This signal is used to start or stop the wheelchair.

Fig. 8. Left position detected in MATLAB

VERIFICATION OF CAMERA INTERFACE MODEL WITH MATLAB

When the operator moves his eyes in the left direction, the white pixels decrease in sector 1, increase in sector 2 and remain constant in sector 3. From this comparison with the reference image, a decision is taken to make a left turn. Similarly, when the operator looks in the right direction, the white pixels decrease in sector 3, increase in sector 2 and remain constant in sector 1, and a decision is taken to move the wheelchair in the right direction.
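A minimal sketch of this comparison logic, including the blink case, is given below; the paper states only the direction of change in each sector, so the tolerance and blink factor used here are assumptions:

```python
def eye_decision(ref_counts, live_counts, tol=0.10, blink_factor=1.5):
    """Compare live sector white-pixel counts with the reference image.

    ref_counts / live_counts: [sector1, sector2, sector3] white-pixel counts.
    tol: relative change treated as 'unchanged' (assumption).
    blink_factor: overall white-pixel increase treated as a blink (assumption).
    """
    if sum(live_counts) > blink_factor * sum(ref_counts):
        return "BLINK"                        # eyelids closed: start/stop toggle

    def change(i):                            # -1 decreased, 0 constant, +1 increased
        diff = live_counts[i] - ref_counts[i]
        if abs(diff) <= tol * ref_counts[i]:
            return 0
        return 1 if diff > 0 else -1

    s1, s2, s3 = change(0), change(1), change(2)
    if s1 == -1 and s2 == 1 and s3 == 0:
        return "LEFT"
    if s3 == -1 and s2 == 1 and s1 == 0:
        return "RIGHT"
    return "FORWARD"
```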

Fig. 9. Blink detected in MATLAB

The image captured by the camera is converted to a binary image on which image processing is carried out. Based on the decision making, the output is given to the microcontroller by RS232 communication for wheelchair motor control. The output results of MATLAB are shown in Fig. 6 to Fig. 9 for the forward, right, left and blink actions respectively.
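The serial protocol itself is not specified in the paper; as an illustration only, a one-byte command per decision could be sent with pyserial as follows (the port name, baud rate and command codes are assumptions, and the authors send the result from MATLAB rather than Python):

```python
import serial  # pyserial

# Hypothetical one-byte command codes; the actual protocol is not specified.
COMMANDS = {"FORWARD": 1, "LEFT": 2, "RIGHT": 3, "BLINK": 4}

def send_decision(decision, port="/dev/ttyUSB0", baudrate=9600):
    """Send the camera-interface decision to the microcontroller over RS232."""
    with serial.Serial(port, baudrate, timeout=1) as link:
        link.write(bytes([COMMANDS[decision]]))
```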

Fig. 6. Forward position detected in MATLAB


Fig. 10. Flow diagram for implementation of the eye-controlled wheelchair with E.O.G. and camera interface (camera → image processing → decision making; EOG signal → signal processing → decision making; the two decisions feed an AND gate, and if the result is false the system is reset)

IV. INTEGRATING E.O.G. AND CAMERA INTERFACE ON WHEELCHAIR

The safety of the wheelchair operator is the primary concern, so a model with high accuracy is required, as described in Fig. 10. To achieve this, both technologies, i.e. E.O.G. and the camera interface, are incorporated together on the wheelchair prototype. Image processing is performed in MATLAB and the results obtained are transferred to an ATmega88 microcontroller through RS232 communication. EOG signal processing and decision making are performed on the same microcontroller. When the decisions of both sensors are identical, the control signal is given to the motors of the wheelchair; otherwise the motion of the wheelchair is terminated for the operator's safety and the decision of both sensors is taken 4-5 times (programmable). If they still do not match, the system is reset to acquire a new reference image for the camera and a new reference signal for the E.O.G. system. This model ensures high accuracy of the system and prevents wrong decisions.
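The fusion logic described above can be sketched as follows; the helper functions are hypothetical placeholders, and the retry count corresponds to the programmable 4-5 attempts mentioned above:

```python
def fused_control_step(get_eog_decision, get_camera_decision,
                       drive_motors, stop_motors, system_reset,
                       max_retries=5):
    """Drive the wheelchair only when the EOG and camera decisions agree.

    On disagreement the chair is stopped and the comparison is retried a
    programmable number of times; if the decisions still differ, the system
    is reset to acquire a fresh reference image and EOG reference signal.
    """
    for _ in range(max_retries):
        eog = get_eog_decision()
        cam = get_camera_decision()
        if eog == cam:                 # the 'AND gate' of Fig. 10
            drive_motors(eog)
            return eog
        stop_motors()                  # disagreement: stop for the operator's safety
    system_reset()                     # still no agreement: re-acquire references
    return None
```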
V. RESULT
Fifty trial runs were conducted for each of the two individual technologies installed on the wheelchair prototype. All trials were conducted with the same operator under the same conditions. A trial was considered successful if the wheelchair moved in the same direction as the eyeball. Thirty-one successful samples were recorded for the wheelchair fitted with E.O.G. technology, i.e. a 62% success rate, and 40 successful samples were recorded for the wheelchair fitted with the camera interface, i.e. an 80% success rate. When both technologies were incorporated in the wheelchair, there was a significant increase to 46 successful samples, i.e. a 92% success rate (Table 2).

Table 2. Comparison of different models in terms of accuracy

MODEL                        NO. OF TRIALS   NO. OF SUCCESSES   ACCURACY PERCENTAGE
E.O.G.                       50              31                 62%
CAMERA INTERFACE             50              40                 80%
E.O.G. & CAMERA INTERFACE    50              46                 92%

VI. CONCLUSION
This research project is aimed at guiding the wheelchair in the desired direction. This was done by incorporating both E.O.G. and the camera interface on the wheelchair prototype, and accuracy increased from 80% with the camera interface alone to 92%. Considering the practical aspects of the project, it is proposed that the camera and the E.O.G. system should be connected to a single microcontroller unit for decision making. Obstacle sensors such as an ultrasound sensor, an IR proximity sensor or an infrared range sensor can also be installed on the wheelchair to ensure that the operator does not collide with any obstacle. Thus, by combining both technologies, the wheelchair can be manoeuvred by eye movements with high precision while keeping the safety of the operator in mind.


