
Implementing Multi Fuzzy Controllers for Multi Sensors to Steer a Wheeled Mobile Robot

Mohammed Majed Mohammed AlKhalidy, Ph.D.*, Rami A. Mahir, Ph.D., and Mohammed Z. Al-Faiz, Ph.D.
*E-mail: drenmoh@yahoo.com

Abstract: The paper addresses three important topics in mobile robotics: the path follower, the multi-sensors, and the steering fuzzy controllers for a wheeled mobile robot. It presents a novel computer vision methodology for building a system capable of determining the presence of a path follower, tracking the objects on that path, and recognizing object shapes as vertical or horizontal in complex and dynamic environments. The mechanism of multi-sensors together with the steering fuzzy controllers is the basis for the robot's capability of following a certain path. The results obtained by this procedure provide an adequate basis for the robot to successfully perform the path-following task.

Keywords: Kinematics of nonholonomic wheeled mobile robot; Multi-Sensors; Multi Fuzzy Controllers; Computer Vision.

1. Introduction
The field of mobile robot control has been the focus of active research in the past decades. Despite the apparent simplicity of the kinematic model of a Wheeled Mobile Robot (WMR), the existence of nonholonomic constraints makes the design of stabilizing control laws for these systems a considerable challenge. When implementing mobile robots, there are problems related to the mathematical modeling of the kinematics and dynamics, difficulties in estimating the orientation and position of the robot, and complexity in both the control design and the planning of a path to be tracked.
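The no-slip assumption underlying these difficulties can be stated concretely. For a unicycle-type WMR with pose (x, y, φ), the standard textbook rolling-without-slipping constraint, given here for illustration rather than taken from this paper, is:

```latex
% Rolling without slipping: the velocity component normal to the
% wheel plane is zero. This constraint is non-integrable, which is
% what makes the system nonholonomic.
\dot{x}\sin\varphi - \dot{y}\cos\varphi = 0
```

Because this constraint cannot be integrated into a constraint on the pose alone, no smooth time-invariant static state feedback can stabilize the pose (Brockett's condition), which is the difficulty referred to above.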

2. Construction of the Wheeled Mobile Robot

In this paper, a mobile robot made up of a rigid body and non-deforming wheels is considered. It is assumed that the vehicle moves on a plane without slipping, i.e., there is pure rolling contact between the wheels and the ground. The WMR, which we name Roc1, is illustrated in Figure (1).

Figure (1) Roc1 WMR: a) without cover; b) with cover

The free-body diagram for the new model of the nonholonomic WMR (Roc1) is illustrated in Figure (2).

Figure (2) Free body diagram for WMR

The motion model of the new nonholonomic WMR, as derived in [1], expresses the pose rates (ẋ, ẏ, φ̇) in terms of the right and left wheel rates θ̇R and θ̇L (1). The translational components carry the familiar differential-drive factors (r/2)cos φ and (r/2)sin φ, while the rotational component contains an additional sgn(·)sin(·) correction term in the geometric parameters r1, r2, d1, d2 and l. The forward velocity of the model (2) follows from the same expressions; the complete derivation is given in [1].
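For illustration, the simplified textbook differential-drive kinematics (a special case with a single wheel radius r and half-axle length d; the full model in [1] adds the sgn(·)sin(·) correction terms and the parameters r1, r2, d1, d2, l) can be integrated numerically:

```python
import math

# Minimal sketch of standard differential-drive kinematics. This is an
# illustrative simplification, not the paper's full model from [1]: we assume
# a common wheel radius r and half-axle length d.

def step(pose, w_r, w_l, r=0.05, d=0.15, dt=0.01):
    """Euler-integrate one time step; pose = (x, y, phi), wheel rates in rad/s."""
    x, y, phi = pose
    v = r * (w_r + w_l) / 2.0          # forward velocity
    w = r * (w_r - w_l) / (2.0 * d)    # yaw rate
    return (x + v * math.cos(phi) * dt,
            y + v * math.sin(phi) * dt,
            phi + w * dt)

# Equal wheel speeds -> straight-line motion along the current heading:
pose = (0.0, 0.0, 0.0)
for _ in range(100):                   # 1 s of motion at dt = 0.01
    pose = step(pose, 10.0, 10.0)
```

With these assumed parameters, one second of equal wheel rates advances the robot 0.5 m along its heading without changing it; unequal rates rotate the heading.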

3. Robot Vision & Sensors


Two types of color camera are used: the first is a CMOS camera and the second a CCD camera, mounted on the head of the WMR. Two magnetic sensors are fixed on the front sides of the robot, one on the left and the other on the right. A motion detection sensor is also used, fixed at the front centre, as shown in Figure (3).

Figure (3) Construction of Roc1 WMR

The PC USB camera used is the PHILIPS cu 2001. It is a color CMOS camera with a true 640 x 480 resolution, auto brightness, white balance, saturation and IrDA night-vision crystal light. The focus can be set manually, and the orientation of the camera is easy to change. Vision in Roc1 depends on this camera: it is the eye of the robot, through which the robot recognizes its correct path. The CCD camera used in this work is a wireless camera, model Lyd-208C. Magnetic sensors are new in use with WMRs; it is believed that this is the first time such an application has been implemented, and it has proved the efficiency of this type of sensor in robotic technologies. As mentioned before, two of these sensors are used, one for the left and the other for the right. They assist the robot in generating the correct steering commands in complement with the other (vision) sensor; when the vision sensor fails for some reason, the magnetic sensors are good compensators. Motion detection sensors are familiar in robotic technologies, although their applications differ. In this work the motion detection sensor is used to detect any motion crossing the robot's track in front of it and to generate a signal that stops the robot for a few seconds. The detector's coverage angle is 110°-120° and its maximum detection range is 12 m.
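The fallback behavior described above can be sketched as a simple priority scheme. This is a minimal sketch under assumed conventions: the function name, the command strings and the ordering are illustrative, not the authors' implementation.

```python
# Illustrative priority arbitration for Roc1's sensors (assumed logic):
# 1) any detected front motion stops the robot;
# 2) otherwise the camera's command is used when vision is available;
# 3) otherwise the left/right magnetic sensors steer the robot.

def arbitrate(motion_detected, vision_cmd, mag_left, mag_right):
    """Return a steering command: 'stop', 'forward', 'left' or 'right'."""
    if motion_detected:             # motion detector has highest priority
        return "stop"
    if vision_cmd is not None:      # camera is the primary path sensor
        return vision_cmd
    if mag_left and not mag_right:  # left magnetic marker sensed -> turn left
        return "left"
    if mag_right and not mag_left:
        return "right"
    return "forward"                # no marker on either side

assert arbitrate(True, "left", 0, 0) == "stop"
assert arbitrate(False, None, 1, 0) == "left"
```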

4. Steering Fuzzy Controllers Design


In this work the fuzzy controllers were designed depending on the randomly distributed inputs from the sensors: two magnetic sensors, one for the left (MSL) and the other for the right (MSR), a forward motion detection sensor (MdS), and a forward camera. The image from the camera on the WMR is transmitted back to a computer; this computer has the necessary image processing software and runs a program to perform the automatic lateral control (see Figure (4)). The image processing step obtains the position of the vehicle with respect to the path follower from the snapshot image. In the second step this information is used as the input to a control algorithm, whose output is a steering angle that maintains the WMR in the desired position on the path. The control algorithm can be based on fuzzy controllers.

Figure (4) Block diagram of camera control

In general, since various pieces of information are taken from the snapshot images of the camera (threshold, line detection, edge detection, Euler number and number of objects), the information obtained is not always consistent, and some of it is affected by:
1. The appearance of blemishes or stains in the image.
2. Inconstant illumination.
3. Unclear lines and edges.
4. Exceeding the path limits because of differential-drive error.

Because of all that, and for greater certainty, it was found necessary to design a Preceding Fuzzy Controller (PFC) for the camera that prioritizes the information to be used as input to the fuzzy controller (FC), as illustrated in Figure (5).

Figure (5) Block diagram of PFC & FC

After testing and analyzing one hundred snapshot images of multiple paths, as illustrated in Figure (6) for vertical and horizontal object positions, it was found that whenever vertical edge detection by the Prewitt operator and edge detection by the Roberts operator are determined for these images, the numbers of objects, subtracted from the count in the original binarized image, are always proportional to the shape positions. Table (1) shows the object decision obtained this way.

Figure (6) Snapshot path images: a) vertical objects; b) horizontal objects

Table (1) Object decision

Binarized image | Object counts (V: vertical edge detection by Prewitt, H: edge detection by Roberts) | Decision
bwh | V = bwh - H | object in horizontal position
bwv | V = bwv - H | object in vertical position
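The decision procedure above can be sketched in code. This is a hedged illustration: the Prewitt and Roberts kernels are the standard ones, but the 4-connected object counter and the final comparison rule (our reading of Table (1)) are assumptions, not the authors' implementation.

```python
# Sketch of the object-decision step: count objects in the binarized image,
# in a Prewitt vertical-edge image (V) and in a Roberts edge image (H), then
# compare the counts. Pure Python, for illustration only.

def convolve2d(img, kernel):
    """Valid-mode 2-D convolution (absolute response) on a list-of-lists image."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    return [[abs(sum(kernel[a][b] * img[i + a][j + b]
                     for a in range(kh) for b in range(kw)))
             for j in range(w - kw + 1)]
            for i in range(h - kh + 1)]

def count_objects(binary):
    """Count 4-connected components of nonzero pixels (assumed connectivity)."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if binary[i][j] and not seen[i][j]:
                count += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

PREWITT_V = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]  # responds to vertical edges
ROBERTS = [[1, 0], [0, -1]]                       # one Roberts cross kernel

def object_decision(binary, thresh=1):
    """Decide vertical vs. horizontal position (our reading of Table (1))."""
    edge = lambda k: [[1 if p >= thresh else 0 for p in row]
                      for row in convolve2d(binary, k)]
    v = count_objects(edge(PREWITT_V))  # objects in the Prewitt edge image
    h = count_objects(edge(ROBERTS))    # objects in the Roberts edge image
    bw = count_objects(binary)          # objects in the binarized image
    return "horizontal" if v == bw - h else "vertical"
```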
To validate this method, a practical statistical ratio over one hundred snapshot path images was obtained, as shown in Table (2). The output results of this method are used as input to the PFC before the FC, as mentioned before, and the new matching algorithm becomes as illustrated in Figure (7).

Table (2) Practical statistics ratio

Conformity: 85%
Nonconformity: 10%
Unknown: 5%

The matching algorithm of Figure (7) proceeds as follows: the camera scans the path and captures an image; a threshold is determined and the image is binarized; the number of objects and the Euler number are determined and fed to PFC1; vertical edge detection by the Prewitt operator and edge detection by the Roberts operator are determined, their object counts are compared, and the resulting object decision is fed to PFC2; finally, the outputs of PFC1 and PFC2 are combined in the CPFC.

Figure (7) Matching architecture algorithm

The final block diagram, which describes the control of the WMR with its sensors, is shown in Figure (8).

Figure (8) Block diagram of sensors control

The design of each fuzzy controller mentioned in the previous section is presented in this section as follows:

Design of Precedence Fuzzy Controller1 (PFC1)

The primary control goal of this controller is to detect the direction and deliver a proper output, used for greater certainty, to the collector stage. The two inputs of this controller are taken from the camera: the threshold, and the number of objects together with the Euler number. The threshold universe of discourse is the range [0-1], and that of the number of objects and Euler number (Nob&Euler) is [0-15]. In a fuzzy logic controller (FLC), the dynamic behavior of a fuzzy system is characterized by a set of linguistic description rules based on expert knowledge; Table (3) shows the rule base of the PFC1.
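The paper does not specify the membership function shapes over these universes; a common choice, shown purely as an assumed sketch (the partition and the placement of the labels S1, L1, F1, R1 are illustrative), is triangular memberships over the threshold universe [0-1]:

```python
# Assumed triangular membership functions over the threshold universe [0, 1].
# The shapes and label placement are illustrative; the paper does not give
# the actual membership functions.

def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

THRESHOLD_MF = {
    "S1": lambda x: tri(x, -0.25, 0.0, 0.25),
    "L1": lambda x: tri(x, 0.0, 0.33, 0.66),
    "F1": lambda x: tri(x, 0.33, 0.66, 1.0),
    "R1": lambda x: tri(x, 0.66, 1.0, 1.25),
}

def fuzzify(x):
    """Degree of membership of a crisp threshold value in each linguistic label."""
    return {label: mf(x) for label, mf in THRESHOLD_MF.items()}
```

With this partition, a mid-range threshold such as 0.5 has partial membership in L1 and F1 and zero membership elsewhere, which is the behavior the rule base then acts on.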

Table (3) Rule base for the PFC1

Nob&Euler \ threshold:  S1   F1   L1   R1
F2:                     S    F    F    F
L2:                     S    L    L    L
R2:                     S    R    R    R
S2:                     S    S    S    S

where the output linguistic labels are: F = forward = 2, L = left = 1, R = right = 3, S = stop = 0.

Design of Precedence Fuzzy Controller2 (PFC2)

The main goal of this controller is to confirm the turn direction. Its two inputs are taken from the camera after the image processing and matching development algorithm mentioned before: V~Prewitt and H~Roberts. The V~Prewitt universe of discourse is the range [0-15], and so is that of H~Roberts. Table (4) shows the rule base of the PFC2.

Table (4) Rule base for the PFC2

V~Prewitt \ H~Roberts:  Sh   Fh   Lh   Rh
Fv:                     S    S    S    S
Lv:                     V    S    N    V
Rv:                     V    S    H    N
Sv:                     N    S    H    H

where the output linguistic labels are: V = forward = 2, H = turn = 1, N = equal = 3, S = stop = 0.

Design of Collector Precedence Fuzzy Controller (CPFC)

This controller collects the outputs of the two precedence fuzzy controllers (PFC1, PFC2) and uses them as inputs. The output of the CPFC is the final decision obtained from the camera sensor. The PFC1 universe of discourse is the range [0-3], and so is that of PFC2. Table (5) shows the rule base of the CPFC.

Table (5) Rule base for the CPFC

PFC1 \ PFC2:  S    V    N    H
F:            Sc   Fc   Fc   Fc
L:            Sc   Fc   Lc   Lc
R:            Sc   Fc   Rc   Rc
S:            Sc   Sc   Sc   Sc

where the output linguistic labels are: Fc = forward = 2, Lc = left = 1, Rc = right = 3, Sc = stop = 0.
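Read as a crisp lookup, the CPFC rule base of Table (5) reduces to a small function. The reading below (a stop from either controller dominates, a V from PFC2 forces forward, and otherwise PFC1's direction passes through) is our interpretation of the table, not the authors' code:

```python
# Crisp reading of the CPFC rule base (Table 5), an interpretation only.
# Inputs are the linguistic outputs of PFC1 ("F", "L", "R", "S") and
# PFC2 ("S", "V", "N", "H"); the output is the CPFC direction label.

def cpfc(pfc1, pfc2):
    """Combine the two precedence controllers' outputs into one decision."""
    if pfc2 == "S" or pfc1 == "S":
        return "S"      # either controller requesting a stop wins
    if pfc2 == "V":
        return "F"      # a confirmed vertical object -> keep moving forward
    return pfc1         # N or H: trust PFC1's direction (F, L or R)
```

For example, the table row R with column V gives Fc, and this function returns "F" for `cpfc("R", "V")` accordingly.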

Design of Fuzzy Controller (FC)

The fuzzy controller (FC) is the final stage of the fuzzy command processing. This controller combines two input signals: the signal from the sensors (the two magnetic sensors, left MSL and right MSR, and the forward motion detection sensor MdS) and the output of the CPFC. The sensors' universe of discourse is the range [0-15], and that of the CPFC is [0-3]. Table (6) shows the rule base of the FC.

Table (6) Rule base for the FC

Sensors \ CPFC:  Sc   Fc   Lc   Rc
Fs:              F    F    F    F
Ls:              L    L    L    L
Rs:              R    R    R    R
Ss:              S    S    S    S

where the sensors' input signals are represented as: Fs = forward = 2 or 14; Ls = left = 10; Rs = right = 6; Ss = stop = 3, 7, 11 or 15. The output linguistic labels, which represent the WMR steering commands, are: F = forward = 2, L = left = 1, R = right = 3, S = stop = 0. Figure (9) illustrates these directions.
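The crisp sensor codes listed above are consistent with a 4-bit sensor word. The sketch below decodes them; the bit assignment (bit 0: motion detector, bit 2: right magnetic sensor, bit 3: left magnetic sensor) is our inference from the listed values, not something stated explicitly in the paper.

```python
# Decode the FC's crisp sensor codes (Fs = 2 or 14, Ls = 10, Rs = 6,
# Ss = 3, 7, 11 or 15) as a 4-bit word. The bit meanings are an assumed
# reading of the listed values, for illustration only.

def decode_sensors(word):
    """Map a 4-bit sensor word to the FC's linguistic input label."""
    motion = word & 0b0001          # motion detector (assumed bit 0)
    right = word & 0b0100           # right magnetic sensor (assumed bit 2)
    left = word & 0b1000            # left magnetic sensor (assumed bit 3)
    if motion:
        return "Ss"                 # every odd code stops the robot
    if left and not right:
        return "Ls"
    if right and not left:
        return "Rs"
    return "Fs"                     # 2 (no magnet) or 14 (both magnets)
```

Applied to the listed codes, this reproduces the mapping in the text: 2 and 14 decode to Fs, 10 to Ls, 6 to Rs, and 3, 7, 11, 15 to Ss.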

Figure (9) WMR direction steering commands (F: forward, L: left, R: right, S: stop)

5. Practical Results
The path-following WMR is tested using the magnetic sensors together with the vision sensor. The results of this case are illustrated in Figure (10): the pictures show, in order, the WMR reaching the natural magnet; detecting the left natural magnet; reaching the left turn; passing through the left turn (two pictures); detecting the right natural magnet; reaching the right turn; passing through the right turn (back view, then front view); finishing the right turn; tracking the path; and reaching the end of the path.

Figure (10) WMR follows the path

6. Conclusion
The paper presents a novel methodology for computer vision-based robot navigation. A robust, multi-sensor navigation loop is also required to provide high-accuracy, high-frequency vehicle pose estimates. A key observation is that successful navigation systems result from the synergistic combination of a set of fundamental principles:
1. A synergistic combination of multiple sensors.
2. An appropriate path to follow.
3. An attention mechanism for handling the complexity of the perception process for the multi-sensor signals, allowing an efficient use of computational resources.
4. A steering (action) controller whose behavior, though arduous to tune against the path-follower error, makes the developed fuzzy controller improve rapidly and provide smoother steering control.

The speed of the robot was set to 0.5 rad/sec in the live tests, but tests on saved images show that higher speeds could be handled successfully. The focus was not on creating a high-speed application; robustness was more important, and the robot can only drive at 0.5 rad/sec in any case. All in all, the path-following ability was high, and none of the design choices were regretted. The WMR path follower created in this project performed well in batch tests; the few unmotivated stops occurred in images with difficult conditions. Single paths were followed in a satisfactory manner. Robustness was high: the robot never tried to drive off the path and almost always stopped when it was supposed to. All tests of the sensor types on the path and its surroundings were handled successfully. The factors limiting the path width are the camera, its height, and the angle at which it is pointed toward the ground.

7. References
[1] M. M. Khalidy, "Design and Implementation of Wheeled Mobile Robot Using Intelligent Controllers", Ph.D. dissertation, University of Technology, Baghdad, 2007.
[2] Evolution Robotics, Inc., "Introduction to Evolution Robotics Technologies", Evolution Robotics, ERSP, 2003.
[3] S. M. LaValle, "Planning Algorithms", University of Illinois, August 31, 2005.
[4] O. L. Abdlkadir, "Improvement of an Image Processing System for Autonomous Mobile Robot Navigation", 2000.
[5] J. F. Reid, "Precision Guidance of Agricultural Vehicles", UILU-ENG-7031, 1998.
[6] K. H. Low and Y. P. Leow, "Kinematic modeling, mobility analysis and design of wheeled mobile robots", Advanced Robotics, Vol. 19, No. 1, pp. 73-99, VSP and Robotics Society of Japan, 2005.
[7] J. R. Asensio and L. Montano, "A Kinematic and Dynamic Model-Based Motion Controller for Mobile Robots", Department of Computer Science and Systems Engineering, University of Zaragoza, Spain, 2002.
[8] Z. P. Jiang, E. Lefeber and H. Nijmeijer, "Stabilization and Tracking of a Nonholonomic Mobile Robot with Saturating Actuators", 3rd Portuguese Conference on Automatic Control, Coimbra, Portugal, 9-11 September 1998.
[9] M. Bisgaard, D. Vinther, K. Østergaard and J. Bendtsen, "Simulation, Sensor Fusion and Model Verification for a Mobile Robot", Department of Control Engineering, Aalborg University, Aalborg East, Denmark, 2003.
[10] R. Costantini and S. Süsstrunk, "Virtual Sensor Design", Audiovisual Communication Laboratory, EPFL, Lausanne, Switzerland, 2004.
[11] R. Madhavan and E. Messina, "Quantifying Uncertainty Towards Information-Centric Unmanned Navigation", National Institute of Standards and Technology (NIST), Gaithersburg, MD, 2003.
[12] Y. Guo, Z. Qu and J. Wang, "A New Performance-Based Motion Planner for Nonholonomic Mobile Robots", School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, FL, 1998.
[13] K. Macek, I. Petrovic and R. Siegwart, "A Control Method for Stable and Smooth Path Following of Mobile Robots", Swiss Federal Institute of Technology, Lausanne, Switzerland, 2005.
[14] C. Zhou, "Fuzzy-Arithmetic-Based Lyapunov Synthesis in the Design of Stable Fuzzy Controllers: A Computing-With-Words Approach", Int. J. Appl. Math. Comput. Sci., Vol. 12, No. 3, pp. 411-421, 2002.
[15] J. Park, J. Kim and B. Ahn, "Evolutionary Learning for Fuzzy Path Planning of Shooting Action for Robot Soccer", Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 2004.
[16] A. Riid and E. Rüstern, "Fuzzy Logic in Control: Truck Backer-Upper Problem Revisited", Department of Computer Control, Tallinn Technical University, 2001.
[17] V. M. Olivera and R. Simmons, "Implementing Human-Acceptable Navigational Behavior and a Fuzzy Controller for an Autonomous Robot", Spanish Ministry of Science and Technology, PROFIT program (FIT070000-2001-118), 2001.
[18] J. Pan, "Fuzzy-Nav: A Vision-Based Robot Navigation Architecture Using Fuzzy Inference for Uncertainty Reasoning", Proceedings of the World Congress on Neural Networks, Vol. 2, pp. 602-607, Washington D.C., July 1995.
[19] T. Kyriacou, "Vision-Based Urban Navigation Procedures for Verbally Instructed Robots", Robotic Intelligence Laboratory, School of Computing, University of Plymouth, United Kingdom, 2002.
[20] Y. Amit, "2D Object Detection and Recognition", MIT Press, 2002.
[21] R. Van Dam and G. Briggs, "Enhanced Quagent Control with Vision-Based Object Recognition", Computer Science 242: Artificial Intelligence, April 27, 2004.
[22] R. Costantini and S. Süsstrunk, "Virtual Sensor Design", Audiovisual Communication Laboratory, EPFL, Lausanne, Switzerland, 2000.
[23] S. E. Umbaugh, "Computer Vision and Image Processing", Prentice-Hall, Inc., 1998.
[24] R. C. Gonzalez and R. E. Woods, "Digital Image Processing", Addison-Wesley Publishing Company, Inc., 1992.
[25] R. Henderson, "CMOS Digital Camera Controller and Frame Capture Device", Oregon Institute of Technology, May 26, 2002.
[26] D. Litwiller, "CMOS vs. CCD", reprinted from Photonics Spectra, Laurin Publishing, August 2005.
[27] G. E. Healey and R. Kondepudy, "Radiometric CCD Camera Calibration and Noise Estimation", IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 16, No. 3, March 1994.
[28] R. E. Cummings, "Intelligent Robot Vision Sensors in VLSI", Department of Electrical and Computer Engineering, Johns Hopkins University, Baltimore, MD, 1998.
[29] P. Chandak, "Study and Implementation of Follow the Leader", M.Sc. thesis, University of Cincinnati, 2002.


[30] P. Ridao, J. Forest, L. Pacheco and X. Cufí, "Sensorial and Navigation Systems for a Mobile Robot (ROGER)", Computer Vision and Robotics Group, 1994.
[31] K. C. Ng and M. M. Trivedi, "A Neuro-Fuzzy Controller for Mobile Robot Navigation and Multirobot Convoying", IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 28, No. 6, December 1998.
[32] N. M. Taalabi, "Design and Application of Fuzzy Logic for the Speed Control of a Direct-Drive DC Motor", BP 6207 Rabat Instituts, Rabat, Morocco, March 2002.
[33] H. R. Berenji, "Fuzzy Logic Controllers", in An Introduction to Fuzzy Logic Applications in Intelligent Systems, R. R. Yager and L. A. Zadeh, Eds., Boston, MA: Kluwer, 1992, pp. 69-96.
[34] "Fuzzy Logic Toolbox User's Guide", Version 2, The MathWorks, Inc., 2002.
[35] K. M. Passino and S. Yurkovich, "Fuzzy Control", Addison Wesley Longman, Inc., 1998.
[36] L. Reznik, "Fuzzy Controllers", Victoria University of Technology, Melbourne, Australia, 1997.
[37] J. Jantzen, "Design of Fuzzy Controllers", Technical University of Denmark, May 15, 1998.
[38] J. Mireles, "Kinematic Models of Mobile Robots", EE 5325/4315, Summer 2004.
[39] J.-P. Laumond, "Trajectories for Mobile Robots with Kinematic and Environment Constraints", in Proc. of International Conference on Intelligent Autonomous Systems, pp. 346-354, 1986.
[40] J.-P. Laumond, "Robot Motion Planning and Control", Springer-Verlag, Berlin, 1998. Available online at http://www.laas.fr/jpl/book.html.
[41] Z. Li and J. F. Canny, "Nonholonomic Motion Planning", Kluwer Academic Publishers, Boston, MA, 1993.
[42] University of Pennsylvania, "Mobile Robot Kinematics and Control", 2004.
[43] J. D. Bendtsen, P. Andersen and T. S. Pedersen, "Robust Feedback Linearization-based Control Design for a Wheeled Mobile Robot", 2000.
[44] A. Riid and E. Rüstern, "Fuzzy Logic in Control: Truck Backer-Upper Problem Revisited", Department of Computer Control, Tallinn Technical University, Ehitajate tee 5, Tallinn, 19086, Estonia, 2001.
[45] J. Zhang, K. Huebner and A. Knoll, "Learning-Based Situation Recognition by Sectoring Omnidirectional Images for Robot Localisation", Technical Computer Science, Faculty of Technology, 2001.
[46] O. M. C. Williams, "Guidance of a Mobile Robot Using Computer Vision over a Distributed System", 2001/2002.
[47] C. M. Bergren, "Anatomy of a Robot", McGraw-Hill Companies, Inc., 2003.
[48] P. E. Sandin, "Robot Mechanisms and Mechanical Devices Illustrated", McGraw-Hill Companies, Inc., 2003.
[49] B. Sandler, "Robotics: Designing the Mechanisms for Automated Machinery", Academic Press, Inc., 1999.
[50] J. Luecke, "Analog and Digital Circuits for Electronic Control System Applications", Elsevier Inc., 2005.
[51] W. Kester, "Data Conversion Handbook", Newnes, Elsevier, by Analog Devices, Inc., 2005.
[52] Y. Q. Shi and H. Sun, "Image and Video Compression for Multimedia Engineering", CRC Press LLC, 1999.
[53] E. R. Dougherty, "Random Processes for Image and Signal Processing", The Society of Photo-Optical Instrumentation Engineers, 1999.
[54] D. Phillips, "Image Processing in C", R&D Publications, Inc., 1994.
[55] R. C. Gonzalez and R. E. Woods, "Digital Image Processing", Addison-Wesley Publishing Company, Inc., 1992.
[56] V. B. Rao and H. V. Rao, "C++ Neural Networks and Fuzzy Logic", Management Information Source, Inc., 1993.
[57] J. Gomes and L. Velho, "Image Processing for Computer Graphics", Springer-Verlag New York, Inc., 1997.
[58] H. Demuth and M. Beale, "Neural Network Toolbox", The MathWorks, Inc., 1998.
[59] P. J. Gibson and C. H. Power, "Introductory Remote Sensing: Digital Image Processing and Applications", Goldin-Rudahl Systems, Amherst, MA, USA, 2000.
[60] T. B. Lauwers, G. A. Kantor and R. L. Hollis, "A Dynamically Stable Single-Wheeled Mobile Robot with Inverse Mouse-Ball Drive", Proc. IEEE Intl. Conf. on Robotics and Automation, Orlando, FL, May 15-19, 2006.
[61] D. C. Conner, H. Choset and A. A. Rizzi, "Towards Provable Navigation and Control of Nonholonomically Constrained Convex-Bodied Systems", Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, Florida, May 2006.
[62] P. An, "PC Interfacing Using Centronic, RS232 and Game Ports", Newnes, 1998.

