Mobile Robots
4) Sensors

Outline:
Sensor classification
Sensor characterization
Wheel/motor encoders
Heading/orientation sensors
Ground-based beacons
Active ranging
Motion/speed sensors
Vision (sensors, depth from focus, stereo vision, optic flow)

Bibliography:
Introduction to Autonomous Mobile Robots (ch. 4), R. Siegwart and I. Nourbakhsh, MIT Press, 2004
Sensors for Mobile Robots: Theory and Applications, H.R. Everett, A K Peters Ltd., 1995
Exteroceptive sensors
provide information about the robot's environment: distances to objects, intensity of the ambient light, etc.
How:
Passive sensors
measure energy coming from the environment
Active sensors
emit their own energy and measure the reaction of the environment; generally better performance, but they have some influence on the environment and usually consume more energy
Sep-29-10 Mobile Robots - EPFL - J.-C. Zufferey (adapted from Siegwart & Nourbakhsh, 2004, ch. 4)
Size, weight, power consumption, voltage, interface
Field of view (FOV)
Range (or full scale)
lower and upper limits; e.g. a triangulation range finder may provide unpredictable values outside its working range
Resolution
smallest input change that can be detected (due to noise or physical limitations, or determined by the A/D conversion)
Hysteresis
path dependency of the output (e.g. thermostats, some tactile sensors)
Bandwidth or frequency
the speed at which a sensor can provide a stream of readings; there is usually an upper limit depending on the sensor and the sampling rate; one also has to consider the delay (phase) of the signal
Accuracy
degree of conformity between the measurement and the true value, often expressed as a portion of the true value:

accuracy = 1 - |m - v| / v

where m = measured value and v = true value
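The accuracy definition above can be turned into a one-line function; the sensor reading below is a hypothetical example:

```python
def accuracy(m, v):
    """Accuracy as a portion of the true value: 1 - |m - v| / v."""
    return 1.0 - abs(m - v) / v

# A range sensor reads 9.8 m for a wall that is truly 10.0 m away:
print(round(accuracy(9.8, 10.0), 3))  # -> 0.98, i.e. 98% accurate
```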
Sensitivity
ratio of output change to input change:

sensitivity = Δoutput / Δinput
Kinds of errors
Systematic errors: deterministic
caused by factors that can (in theory) be modeled and predicted, e.g. distortion caused by the optics of a camera
Random errors: non-deterministic
cannot be predicted, only described probabilistically
Others
cross-sensitivity of sensors, motion blur; rarely possible to model -> appear as random errors but are neither systematic nor random. Systematic and random errors might be well defined in a controlled environment; this is often not the case for mobile robots!
Magnetic compasses
In use since before 2000 B.C., when the Chinese suspended a piece of naturally magnetized magnetite (lodestone) from a silk thread and used it to guide a chariot over land.
E.g.: the HMC1043 is a miniature three-axis magnetoresistive magnetometer of only 3x3 mm
Major drawback
weakness of the Earth's magnetic field (0.3-0.6 gauss, i.e. 30-60 µT); easily disturbed by magnetic objects or other sources; not usable in indoor environments (except locally)

Gyroscopes
The goal of gyroscopic systems is to measure changes in vehicle orientation by taking advantage of physical laws that produce predictable effects under rotation. Three functioning principles:
Mechanical
Optical
Micro-electromechanical systems (MEMS)
Two categories
Orientation gyros -> directly measure angles (very rare in robotics!)
Rate gyros (often called strapdown systems) -> measure rotation velocity, which can be integrated
A problem common to all gyroscopes is that of drift => unless the error is corrected through reference to some alternate measurement, the drift will eventually exceed the required accuracy of the system.
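A rate gyro's readings must be integrated to obtain an orientation, and any bias on the readings integrates into a drift that grows with time; a minimal sketch with made-up numbers:

```python
def integrate_rate_gyro(rates, dt, bias=0.0):
    """Integrate rate-gyro readings (rad/s) into a heading angle (rad).
    A constant bias on the readings produces a drift that grows
    linearly with time, which is why an external reference is
    eventually needed."""
    heading = 0.0
    for r in rates:
        heading += (r + bias) * dt
    return heading

# Robot turning at a constant 0.1 rad/s for 10 s, sampled at 100 Hz:
dt = 0.01
true_rates = [0.1] * 1000
print(integrate_rate_gyro(true_rates, dt))              # ideal result: 1.0 rad
# The same readings corrupted by a small 0.001 rad/s bias:
print(integrate_rate_gyro(true_rates, dt, bias=0.001))  # drifts to ~1.01 rad
```

After 10 s the bias has already added 0.01 rad of error; over minutes it dominates, which is the drift problem described above.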
Mechanical gyroscopes
Rely on angular momentum (gyroscopic inertia): the tendency of a rotating object to keep rotating at the same angular speed about the same axis of rotation in the absence of an external torque. The reactive torque τ is proportional to the spinning speed ω, the rate of precession Ω, and the wheel's inertia I:

τ = I · ω · Ω

No torque can be transmitted from the outer pivot to the wheel axis => the spinning axis is therefore space-stable. Nevertheless, the remaining friction in the bearings of the gyro axis introduces some errors over time (drift). A high-quality mechanical gyro can cost up to $100,000, with drift kept as low as 0.02°/h. Mechanically complex => not much used in mobile robotics.
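The relation τ = I·ω·Ω as a tiny worked sketch; the wheel parameters below are hypothetical:

```python
def reactive_torque(inertia, spin_rate, precession_rate):
    """Reactive (gyroscopic) torque: tau = I * omega * Omega."""
    return inertia * spin_rate * precession_rate

# Hypothetical wheel: I = 1e-4 kg*m^2, spinning at 1000 rad/s,
# precessing at 0.01 rad/s -> tau = 1e-4 * 1000 * 0.01 ~ 0.001 N*m
print(reactive_torque(1e-4, 1000.0, 0.01))
```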
Optical gyroscopes
Based on the behavior of an optical standing wave in a rotating frame (the Sagnac effect). Optical gyroscopes use the interference of light to detect mechanical rotation:
two light beams travel in opposite directions (e.g. along an optical fiber of up to 5 km in length)
the beam traveling against the rotation experiences a slightly shorter path than the other beam
the resulting phase shift affects how the beams interfere with each other when they are combined
the intensity of the combined beam then depends on the rotation rate of the device
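The slide does not give the Sagnac formula; a commonly quoted form for a fiber-optic gyro is Δφ = 2π·L·D·Ω / (λ·c), with L the fiber length and D the coil diameter. A sketch under that assumption, with illustrative numbers:

```python
import math

def sagnac_phase_shift(fiber_length, coil_diameter, wavelength, rotation_rate):
    """Phase shift of a fiber-optic gyro (Sagnac effect), assuming
    dphi = 2 * pi * L * D * Omega / (lambda * c)."""
    c = 299_792_458.0  # speed of light [m/s]
    return (2.0 * math.pi * fiber_length * coil_diameter * rotation_rate
            / (wavelength * c))

# 5 km of fiber wound on a 10 cm coil, 850 nm light, rotating at the
# Earth rate (7.29e-5 rad/s) -> a phase shift below one milliradian:
print(sagnac_phase_shift(5000.0, 0.1, 850e-9, 7.29e-5))
```

The tiny result illustrates why long fiber coils are needed to sense slow rotations.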
MEMS gyros have no rotating parts, low power consumption, and small size => they are quickly replacing mechanical and optical gyros in robotic applications.
Sep-29-10 Mobile Robots - EPFL - J.-C. Zufferey (adapted from the Handbook of Robotics, 2008, ch.20.4)
Beacon-based positioning
Beacons are signaling devices with a precisely known position. Beacon-based navigation has been used since humans started to travel:
Natural beacons (landmarks) like stars, mountains, or the sun
Artificial beacons like lighthouses
The more recently introduced Global Positioning System (GPS) revolutionized modern navigation technology:
already one of the key sensors for outdoor mobile robotics
for indoor robots, GPS is not applicable
IMUs/INSs are extremely sensitive to measurement errors => an external source of information, such as GPS and/or a magnetometer, is often required to correct this (cf. ch. 8 for more information on sensor fusion). List of IMU/INS products: http://damien.douxchamps.net/research/imu/
Mobile Robots - EPFL - J.-C. Zufferey (adapted from Siegwart & Nourbakhsh, 2004, ch. 5)
Technical challenges:
Time synchronization between the individual satellites and the GPS receiver
Real-time update of the exact location of the satellites
Precise measurement of the time of flight
Interference with other signals, reflections
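GPS ultimately turns time-of-flight ranges to satellites into a position. As a toy illustration of that idea only — ignoring clock bias and satellite motion, which are exactly the hard parts listed above — here is a minimal 2D least-squares trilateration sketch (Gauss-Newton, hypothetical beacon positions):

```python
import math

def trilaterate_2d(beacons, ranges, guess=(0.0, 0.0), iters=20):
    """Least-squares 2D position from ranges to beacons at known
    positions (Gauss-Newton on the range residuals)."""
    x, y = guess
    for _ in range(iters):
        # Accumulate the normal equations J^T J d = J^T r beacon by beacon.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (bx, by), r in zip(beacons, ranges):
            dx, dy = x - bx, y - by
            d = math.hypot(dx, dy) or 1e-12
            jx, jy = dx / d, dy / d   # gradient of the predicted range
            res = r - d               # measured minus predicted range
            a11 += jx * jx
            a12 += jx * jy
            a22 += jy * jy
            b1 += jx * res
            b2 += jy * res
        det = a11 * a22 - a12 * a12
        x += (a22 * b1 - a12 * b2) / det
        y += (a11 * b2 - a12 * b1) / det
    return x, y

# Three hypothetical beacons and exact ranges from the point (2, 1):
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (2.0, 1.0)
ranges = [math.hypot(truth[0] - bx, truth[1] - by) for bx, by in beacons]
print(trilaterate_2d(beacons, ranges, guess=(5.0, 5.0)))  # recovers ~(2, 1)
```

A real GPS receiver additionally solves for its own clock offset (a fourth unknown), which is why at least four satellites are needed.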
Mobile Robots - EPFL - J.-C. Zufferey (adapted from Everett, 1995, section 14.2)
Ultrasonic sensors and laser range sensors make use of the propagation speed of sound and electromagnetic waves, respectively. The distance traveled by a sound or electromagnetic wave is given by

d = c · t

where
d = distance traveled (usually round-trip)
c = speed of wave propagation
t = time of flight
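To get a feel for the timing precision that electromagnetic time-of-flight sensors require, a small sketch of d = c·t/2 for a round-trip pulse; the target distance and accuracy figures are illustrative:

```python
C = 299_792_458.0  # speed of light [m/s]

def round_trip_distance(time_of_flight):
    """One-way distance from a round-trip time of flight: d = c * t / 2."""
    return C * time_of_flight / 2.0

def time_resolution_for(accuracy):
    """Timing resolution needed for a given one-way range accuracy."""
    return 2.0 * accuracy / C

# A target 10 m away returns the pulse after roughly 67 ns:
print(round_trip_distance(66.7e-9))   # ~10 m
# Resolving 2 cm in range requires timing to roughly 133 ps:
print(time_resolution_for(0.02))
```

The picosecond-level requirement is why sound waves, which travel about a million times slower, are so much easier to time.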
Recent commercial GPS receivers have a position accuracy within 20 m (typically 10 m) in the horizontal plane and 45 m in the vertical plane, depending on the number of satellites within line of sight and on multipath issues. WAAS allows getting close to 1-2 m accuracy. The update rate is typically only 1 to 4 Hz.
d = c · t / 2

The speed of sound c in a gas is given by

c = √(γ · R · T)

where
γ: ratio of specific heats (adiabatic index), 1.402 for air
R: gas constant (287.05 J·kg⁻¹·K⁻¹ for air)
T: temperature in kelvin
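The two formulas above combine into a small ultrasonic-ranging sketch (the gas constants are those for air given above; the echo time is illustrative):

```python
import math

def speed_of_sound(temp_celsius, gamma=1.402, R=287.05):
    """c = sqrt(gamma * R * T), with T converted to kelvin."""
    return math.sqrt(gamma * R * (temp_celsius + 273.15))

def sonar_distance(time_of_flight, temp_celsius=20.0):
    """One-way distance from a round-trip ultrasonic echo: d = c * t / 2."""
    return speed_of_sound(temp_celsius) * time_of_flight / 2.0

print(speed_of_sound(20.0))  # ~343 m/s at room temperature
print(sonar_distance(0.01))  # a 10 ms echo -> ~1.7 m
```

Note the temperature dependence: the same echo time yields a slightly different distance on a cold day, a systematic error if left uncompensated.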
[Figure: measurement cone of an ultrasonic sensor — amplitude (dB) as a function of beam angle (30°, 60°)]
Also known as laser radar or LIDAR (LIght Detection And Ranging):
Transmitted and received beams are coaxial
Transmitter illuminates a target with a collimated beam (laser)
Diffuse reflection occurs with most surfaces because the wavelength is typically 824 nm
Receiver detects the time needed for the round-trip
An optional mechanism sweeps the light beam to cover the required scene (in 2D or 3D)
[Figure: transmitted and reflected beams — amplitude (V) vs. time, with phase difference θ]

The distance D between the beam splitter and the target is given by (Woodbury et al., 1993):

D = (λ / 4π) · θ

where θ is the phase difference between the transmitted and reflected beams and λ = c / f is the wavelength of the modulating signal.
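A minimal sketch of the phase-shift relation D = (λ/4π)·θ, assuming a hypothetical 5 MHz modulation frequency; since θ ∈ [0, 2π), ranges beyond λ/2 are ambiguous:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def phase_shift_distance(phase, modulation_freq):
    """Phase-shift ranging: D = (lambda / (4 * pi)) * theta,
    with lambda = c / f_mod the modulation wavelength."""
    wavelength = C / modulation_freq
    return wavelength * phase / (4.0 * math.pi)

# 5 MHz modulation -> lambda ~60 m -> unambiguous range lambda/2 ~30 m.
f_mod = 5e6
print(phase_shift_distance(math.pi, f_mod))  # theta = pi -> ~15 m
```

Lower modulation frequencies extend the unambiguous range but reduce range resolution for a given phase-measurement accuracy.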
Hokuyo, Japan
Confidence in the range (phase/time estimate) is inversely proportional to the square of the received signal amplitude. Hence dark, distant objects will not produce as good range estimates as closer, brighter objects.
Example outputs
IBEO: 4 layers x 160°
Velodyne HDL-64E
64 beams x 360° (27° vertical FOV)
0.1° angular resolution
5-15 Hz update rate
>1.333M points per second
50-120 m range
2 cm accuracy
20 W, 13 kg
75,000 USD!! (as of 2008)
Velodyne Inc., CA
176 x 144 pixels
Up to 8 m range
Up to 54 fps
12 V, max. 1 A
1 W of emitted light
500 g
Designed for indoor use
Resolution is inversely proportional to frame rate
http://www.mesa-imaging.ch
Mobile Robots - EPFL - J.-C. Zufferey (adapted from Scherer et al., 2007)
1D triangulation sensors
Both collimated LEDs and lasers are used to emit a visible or infrared (IR) beam. Position-Sensing Detectors (PSDs) are the simplest and fastest way to measure the position of the centroid of a light spot. PSDs are based on a lateral-effect diode that converts the position of incident radiation into signal currents that are directly proportional to the light's position on the active area of the detector. Another approach consists of using a linear camera.
The diffuse reflection of the beam is imaged through a lens of focal length f; with a baseline b between emitter and lens and a measured spot position u on the detector, the distance is

d = f · b / u

Doppler-effect sensors
The Doppler effect appears either if the transmitter (or receiver) is moving or when the wave reflects off a moving object:
f_r = f_t / (1 + v/c) if the transmitter is moving
f_r = f_t · (1 + v/c) if the receiver is moving
A factor 2 needs to be introduced in the reflecting case to account for the round-trip.
Doppler frequency shift: Δf = 2 · f_t · v · cos(θ) / c
Relative speed: v = Δf · c / (2 · f_t · cos(θ))
where θ is the relative angle between the direction of motion and the beam axis.
Sound waves (ultrasonic): e.g. industrial process control, security, measurement of ground speed.
Electromagnetic waves (RF, laser): e.g. radar systems, object tracking; the VORAD sensor (http://www.roadranger.com) can detect cars up to 150 m away, covering a speed range of 0 to 160 km/h with 1 km/h resolution.
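Both relations — triangulation distance d = f·b/u and Doppler relative speed v = Δf·c / (2·f_t·cos θ) — can be sketched numerically; all sensor parameters below are hypothetical:

```python
import math

def triangulation_distance(focal_length, baseline, u):
    """1D optical triangulation: d = f * b / u,
    where u is the spot position on the detector."""
    return focal_length * baseline / u

def doppler_relative_speed(freq_shift, transmit_freq, wave_speed, angle=0.0):
    """Relative speed from the round-trip (reflecting) Doppler shift:
    v = df * c / (2 * f_t * cos(theta))."""
    return freq_shift * wave_speed / (2.0 * transmit_freq * math.cos(angle))

# Hypothetical IR triangulation sensor: f = 20 mm, b = 50 mm, spot at u = 2 mm
# -> d = 0.020 * 0.050 / 0.002 ~ 0.5 m:
print(triangulation_distance(0.020, 0.050, 0.002))

# Hypothetical 24 GHz radar, 1 kHz Doppler shift, head-on target -> ~6.2 m/s:
print(doppler_relative_speed(1000.0, 24e9, 299_792_458.0))
```

Note how the triangulation reading u is inversely proportional to distance, so resolution degrades for far targets.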
A light source projects a known pattern of structured light onto the environment. The image is then processed to identify the pattern's reflection. The size of an object (along the z-axis) can be directly determined from the projection geometry:

H = D · tan(α)

The range to an illuminated point can then be determined from simple geometry (see next slide).
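A one-line worked example of H = D·tan(α), with illustrative numbers:

```python
import math

def object_height(distance, angle):
    """Structured-light geometry: H = D * tan(alpha)."""
    return distance * math.tan(angle)

# A projected light plane seen at alpha = 10 deg on an object 2 m away:
print(object_height(2.0, math.radians(10.0)))  # ~0.35 m
```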
Vision sensors
CCD (Charge-Coupled Device)
few (usually 1) charge-to-voltage converters
large pixels -> high fill factor
good pixel uniformity
e.g. 2048 x 2048 CCD array; Sony DFW-X700
[Figure: pair of images, left (L) and right (R)]
Changing camera geometry: depth from (de)focus
Using (at least) 2 cameras: stereo vision
Using a sequence of images (while moving): optic flow
The observed image g is the focused image f blurred by the camera optics:

g(x, y) = h_R(x, y) * f(x, y)

where h_R is a blur (point-spread) kernel whose radius R depends on the depth of the corresponding scene point. This equation relates the depth of scene points, via R, to the observed image g. Solving for R would provide us with the depth map, but the focused image f is another unknown. This issue can be solved by taking at least two images of the same scene with different optical settings (e.g. the aperture), since f would remain constant.
Depth from defocus: depth is recovered based on a series of images that have been taken with known camera geometries (cf. next slide).
Stereo vision
Idealized camera geometry for stereo vision
Disparity between the two images -> computation of depth
http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/FAVARO1/dfdtutorial.html
Stereo vision
Distance is inversely proportional to disparity
Closer objects can be measured more accurately (nonlinear resolution).
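The inverse relation between distance and disparity can be sketched directly; the camera below is hypothetical (500 px focal length, 10 cm baseline):

```python
def depth_from_disparity(focal_px, baseline, disparity_px):
    """Idealized stereo geometry: z = b * f / d (disparity in pixels)."""
    return baseline * focal_px / disparity_px

f_px, b = 500.0, 0.10  # hypothetical camera: 500 px focal length, 10 cm baseline
for d in (50.0, 25.0, 5.0, 4.0):
    print(d, depth_from_disparity(f_px, b, d))
# Depth is inversely proportional to disparity: going from 5 px to 4 px
# (a single pixel!) moves the estimate from 10 m to 12.5 m, while going
# from 50 px to 49 px barely changes a 1 m estimate -> distant objects
# are measured far less accurately than close ones.
```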
Feature-based approach
Feature extraction (edges, line terminations, etc.). Relies on the following constraints:
Limited search space: an element in the left image is sought only within a specific region of the right image, in particular on the epipolar line.
Feature attributes: look for the same type (e.g., edge, end of line) and the same characteristics (e.g., color, polarity of contrast).
Ordering constraint: each element in the left image is assigned to exactly one element in the right image, and matches must respect the left-to-right sequence.
Intensity-based approach
Uses all pixel values. Measures the similarity between shifted image patches from the left and right cameras (much like some optic-flow techniques).
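The intensity-based idea can be sketched as block matching along a scanline: slide a left-image patch across the right image and keep the disparity with the lowest sum of squared differences (SSD). A toy one-row example with entirely synthetic data and a 3-pixel disparity:

```python
import numpy as np

def match_along_scanline(left_patch, right_row, x_left, max_disparity):
    """Find the disparity minimizing the SSD between a left-image patch
    and windows of the corresponding right-image scanline."""
    w = left_patch.shape[1]
    best_d, best_ssd = 0, np.inf
    for d in range(max_disparity + 1):
        x = x_left - d  # the matching right-image window lies to the left
        if x < 0:
            break
        cand = right_row[:, x:x + w]
        ssd = float(np.sum((left_patch - cand) ** 2))
        if ssd < best_ssd:
            best_d, best_ssd = d, ssd
    return best_d

# Synthetic 1-row "images": the pattern [9, 5, 1] sits at x = 5 in the
# left view and at x = 2 in the right view -> true disparity is 3.
left = np.array([[0., 0, 0, 0, 0, 9, 5, 1, 0, 0]])
right = np.array([[0., 0, 9, 5, 1, 0, 0, 0, 0, 0]])
patch = left[:, 5:8]
print(match_along_scanline(patch, right, 5, 5))  # -> 3
```

Real implementations restrict the search to the epipolar line exactly as in the constraint list above; here the single row plays that role.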
Optic flow is the 2D projection onto a retina of the relative 3D motion of scene points. Let us assume a spherical visual system in a stationary environment. A specific viewing direction is indexed with i and the unit vector d_i, whereas D_i represents the distance between the visual system and an object viewed in this direction. When the system moves with translation T and rotation R, the resulting motion field is the set of vectors p_i given by (Koenderink & van Doorn, 1987):

p_i = -(T - (T · d_i) d_i) / D_i - R × d_i

[Figure: spherical vision system with viewing direction d_i, distance D_i, translation T, and rotation R]
This allows estimating distances if the egomotion (rotation and translation) is known; this is called structure from motion. Alternatively, one can estimate the time to contact (TTC) in the case of pure translation, i.e. TTC = D / ||T||, or estimate the egomotion itself, given certain assumptions about the surrounding distances.
Figure from: Krapp, H.G., Hengstenberg, R. (1996) Estimation of self-motion by optic flow processing in single visual interneurons. Nature, 5 December 1996, vol 384.
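A minimal numeric sketch of the spherical motion field and the time-to-contact computation; the egomotion values are illustrative, and the formula used is the Koenderink & van Doorn relation p_i = -(T - (T·d_i)d_i)/D_i - R×d_i:

```python
import numpy as np

def motion_field(d_i, D_i, T, R):
    """Optic-flow vector for viewing direction d_i (unit vector), object
    distance D_i, and egomotion translation T and rotation R:
    p_i = -(T - (T . d_i) d_i) / D_i - R x d_i."""
    d_i = np.asarray(d_i, dtype=float)
    T = np.asarray(T, dtype=float)
    R = np.asarray(R, dtype=float)
    return -(T - np.dot(T, d_i) * d_i) / D_i - np.cross(R, d_i)

# Pure forward translation toward an object 5 m ahead:
T = np.array([1.0, 0.0, 0.0])  # 1 m/s along the gaze direction
d = np.array([1.0, 0.0, 0.0])  # looking straight ahead
print(motion_field(d, 5.0, T, np.zeros(3)))  # zero flow: focus of expansion
# A sideways viewing direction sees nonzero flow instead:
print(motion_field(np.array([0.0, 1.0, 0.0]), 5.0, T, np.zeros(3)))
# Time to contact for pure translation: TTC = D / ||T||
print(5.0 / np.linalg.norm(T))  # -> 5.0 s
```

Note that rotation contributes flow that is independent of distance, which is why it carries no depth information.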
Examples of applications:
USC helicopter
LIS indoor microflyer
LIS outdoor flying wing
robotics lab exercises (TP) at LIS

Intensity-based techniques look for the patch displacement (D_x, D_y) that minimizes a correlation measure such as the sum of squared differences:

SSD(D_x, D_y) = Σ_(x,y) ( I(x, y, t) - I(x + D_x, y + D_y, t + D_t) )²

where I(x, y, t) is the irradiance of the image point (x, y) at time t, and v_x(x, y) = D_x / D_t and v_y(x, y) = D_y / D_t are the optic flow components at that point.
Feature-based (or feature-tracking) techniques consist of identifying features in an image and looking for the same features in the next image.
Note: these techniques are very similar to the ones used in stereo vision. What happens when patches contain no texture?
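The closing question can be illustrated numerically: on a textureless patch the SSD score is identical for every candidate shift, so no unique displacement (and hence no flow) can be recovered. A toy example with synthetic data:

```python
import numpy as np

def ssd_profile(patch, row, max_shift):
    """SSD between a patch and each shifted window of a 1-row image."""
    w = patch.shape[1]
    return [float(np.sum((patch - row[:, s:s + w]) ** 2))
            for s in range(max_shift + 1)]

textured = np.array([[1., 7, 3, 9, 2, 8, 4, 6, 0, 5]])
flat = np.ones((1, 10))

# Matching a textured patch gives one clear SSD minimum (at shift 0)...
print(ssd_profile(textured[:, 0:3], textured, 7))
# ...but every shift of a textureless patch scores identically (all zeros),
# so the displacement is undefined there (the aperture/blank-wall problem):
print(ssd_profile(flat[:, 0:3], flat, 7))
```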
Mobile Robots - EPFL - J.-C. Zufferey (adapted from Russell & Norvig, 2003, ch. 24)