Acknowledgments
We bow our heads in most humble thanks to Almighty Allah, the most gracious, the most merciful, for granting us the wisdom and strength to complete this work. It is our great pleasure to express our most sincere and deep gratitude to our respected supervisor, Sir Muhammad Habib, for his unabated guidance, constant help and encouragement throughout the course of this project. We also wish to express our gratitude and admiration to all the faculty members of Air University for their highly useful suggestions and co-operation. We would be failing in our duty if we did not extend our thanks to our friends and colleagues for their help and encouragement whenever needed. Last but not least, we would like to thank our families for their moral support, love and understanding, which enabled us to complete this work.
Table of Contents
Acknowledgments
Chapter 1: INTRODUCTION
1.1 Automation
1.1.1 Advantages and disadvantages of automation
1.2 Bottling
Chapter 2: DESIGN PROCEDURE AND DETAILS
2.1 Hardware
2.1.1 Mechanical hardware
2.1.1.1 Conveyor belt
2.1.1.2 Robotic arm
2.1.2 Electrical hardware
2.1.2.1 Infrared sensor
2.1.2.2 DC power supply
2.1.2.3 Motor drives
Unidirectional drive
Bidirectional drive
Chapter 3: IMAGE PROCESSING IN MATLAB
3.1 Image acquisition
3.2 Separation of RGB components
3.3 Size of image
3.4 Image information
3.5 Image conversion
3.5.1 RGB to gray scale
3.5.2 Gray scale to binary image
3.6 Color segmentation
3.7 Morphological operations
3.7.1 Structuring elements
3.7.2 Closing and opening of image
3.8 Labeling
3.9 Edge detection
3.10 Algorithm
Chapter 4: RESULTS AND DISCUSSION
4.1 Data set
4.2 Limitations
Chapter 5: CONCLUSION & FUTURE ENHANCEMENT
5.1 Conclusion
5.2 Future enhancements
List of Figures
Figure 2.1 Conveyor Belt
Figure 2.2 Robotic Arm
Figure 2.3 Infrared Circuit
Figure 2.4 5V DC Supply
Figure 2.5 12V DC Supply
Figure 2.6 15V DC Supply
Figure 2.7 Unidirectional Motor Drive
Figure 2.8 Bidirectional Motor Drive
Figure 3.1 RGB Components
Figure 3.2 Conversion of Gray Scale Image
Figure 3.3 RGB to Binary
Figure 3.4.1 Color Segmentation of Lid and Label
Figure 3.4.2 Color Segmentation of Solution
Figure 3.5 Closing and Opening
Figure 3.6 Edge Detection
Chapter 1: INTRODUCTION
In life, there is always room for improvement. Everything made by humans keeps being improved, and this is a never-ending process; cars and planes are just two examples. During the last few decades, the term quality has become one of the most stressed words in the field of production. Companies today are under constant pressure to become more efficient in their manufacturing processes: to obtain higher yields, achieve faster throughput, and increase productivity while keeping wastage and costs down. In addition to efficiency, accuracy is imperative as companies race to develop and maintain procedural standards, meet ISO 9000 compliance, and position themselves for corporate survival in the twenty-first century. Every manufacturer today wants to continually improve the quality of the products it produces. A few of the many important factors that affect quality are contamination, deviation of the process from its validated state, and end-product impurity. Contamination can be caused by air, water and, especially, human intervention. Manufacturing a product is a procedure, and quality control is a procedure to monitor that procedure with the goal of making it more efficient. Today the need is to explore ways to make the quality control procedure itself more efficient by automating it.
1.1 Automation
To get started with this project, one first needs to understand the word automation. According to Wikipedia, automation is the use of control systems (such as numerical control, programmable logic control, and other industrial control systems), in concert with other applications of information technology (such as computer-aided technologies), to control industrial machinery and processes, reducing the need for human intervention.[1] In the scope of industrialization, automation is a step beyond mechanization. Whereas mechanization provided human operators with machinery to assist them with the physical requirements of work, automation greatly reduces the need for human sensory and mental involvement as well. Processes and systems can also be automated.[2] Currently, for manufacturing companies, the purpose of automation has shifted from increasing productivity and reducing costs to broader issues, such as increasing quality and flexibility in the manufacturing process.
1.2 Bottling
Bottling, as the name suggests, refers to the process in which bottles are filled, labeled and packed into cartons, boxes or crates. The term is well known throughout the bottle-related industries, where the bottling process is one of the most significant concerns. All the major drinks producers, such as Coke, Pepsi and Nestlé, give great importance to this process for better quality, production and business. To achieve these objectives, most companies until now have been paying for manual labor at very high cost, but with the advancement of science and technology all the major industries have been looking to replace this costly manual labor with fast, economical and efficient machine work. Automation of bottling is the best solution to meet the requirements of bottling industries in today's advanced world. Over the last decade, automated machines and robots have helped such industries greatly in achieving their goals.
The solution would not only identify a defective product by checking the above aspects but also remove it from the production line.
The robotic arm used in our project is of the simplest form, having only one motion, i.e. yaw. Its purpose is to eliminate defective bottles from the production line. Whenever the image processing stage finds a defective item that fails to meet any of the standards, the robotic arm is responsible for knocking the bottle off the production line. Our robotic arm has a 90-degree yaw motion, which is enough to meet our requirements. Two limit switches are placed at the two extremes of the motion and provide a signal when either physical limit is reached. Figure 2.2 is a photograph of the robotic arm used in the project.
An infrared sensor pair consists of a transmitter and a receiver operating in the infrared band. The transmitter generates infrared radiation along the line of sight, and the receiver is responsible for detecting it. Infrared sensors can therefore be used to detect any obstacle or object that crosses the line of sight between the transmitter and the receiver: the object blocks the radiation path, and the resulting transition at the receiver indicates the blockage.
The transmitter and the receiver are placed across the width of the conveyor belt. The output at the receiver end remains constant as long as the line-of-sight connection is maintained. Whenever a bottle crosses the line of sight (i.e. enters the frame of the camera), the transition in voltage at the receiver indicates to the microcontroller that the bottle is in the frame. After receiving the signal from the sensor, the microcontroller stops the motion of the belt and signals the camera to take an image.
Circuitry:
Figure 2.3 shows the circuit of the infrared sensors used in the project. The two legs are placed across the width of the conveyor belt: the one with the transistor is the transmitter and the other is the receiver. When a signal is applied at the base of transistor Q1 (2N22), it turns on and current begins to flow from source to sink, making the LED (D1) transmit infrared radiation, which is continuously received by the receiver diode (D2). As long as the radiation is being received, the voltage at the output point remains constant. Whenever the path is interrupted by an obstacle, the receiver detects the blockage and a transition occurs at the output point. This transition is sensed by the microcontroller, confirming that the bottle is exactly in the frame of the camera.
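The microcontroller's side of this can be sketched in a few lines. The Python fragment below is only an illustration of the detection logic; the function name and sample values are hypothetical, not the actual firmware.

```python
def detect_blockage(samples, idle_level=1):
    """Return the index of the first sample where the receiver output
    leaves its idle level, i.e. the infrared beam is blocked, or None."""
    for i, level in enumerate(samples):
        if level != idle_level:
            return i
    return None

# The receiver output stays constant until a bottle interrupts the beam:
readings = [1, 1, 1, 0, 0, 1]
print(detect_blockage(readings))  # index 3: stop the belt, trigger the camera
```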
Unidirectional drive:
The motor driving the conveyor belt only ever turns in one direction, so a unidirectional motor drive was designed. The drive takes an input signal from the microcontroller (a pulse waveform) and switches the motor on and off accordingly. When a pulse is applied at the input, the power MOSFET turns on during the high-logic (5 V) portion of each period, allowing the motor to run at the voltage provided across its terminals, and turns the motor off during the logic-0 portion. The width of each cycle thus determines the on and off times of the motor, and this continuous pulse signal controls the motor's speed.
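In other words, the drive performs pulse-width modulation: the motor's average terminal voltage, and hence its speed, scales with the fraction of each period the MOSFET is on. A minimal sketch of that relationship (the 12 V supply figure is an assumption for illustration):

```python
def average_motor_voltage(duty_cycle, v_supply=12.0):
    """Average voltage across the motor under PWM: the MOSFET conducts
    for duty_cycle of each period and blocks for the rest."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must lie between 0 and 1")
    return duty_cycle * v_supply

print(average_motor_voltage(1.0))  # full speed: 12.0
print(average_motor_voltage(0.5))  # half the supply voltage: 6.0
print(average_motor_voltage(0.0))  # belt stopped: 0.0
```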
When the bottle enters the camera frame, the microcontroller stops the conveyor belt after receiving the signal from the sensor. This stopping of the belt is achieved by setting the duty cycle at the drive input to zero.
Circuitry:
Figure 2.7 Unidirectional Motor Drive
The figure above shows the circuit of the unidirectional motor drive, which controls the speed of the conveyor belt. The input of the motor drive is the microcontroller pulse. The input waveform and the corresponding output waveforms at the various points marked in the figure are as follows.
When a signal is applied at point I (fig. 2.7.1), an inverted waveform appears at the output of the optocoupler (4N35), i.e. at point A (fig. 2.7.2). During logic 1 at point I, transistor Q1 is on and the optocoupler's output transistor is off, so ground (0 V) appears at point A. When point A is at zero volts, transistor Q2 turns off and VCC appears directly at the gate of the MOSFET, turning it on (fig. 2.7.3) and providing the motor with the maximum voltage difference between its two terminals. The reverse occurs for logic 0 at the input point, turning the MOSFET off.
Bidirectional drive
The motors fitted in the robotic arm that control the yaw, pitch and clipper motions all need to be bidirectional in order to remove the bottle from the production line, so bidirectional motor drives were also designed. Each bidirectional motor drive consists of an H-bridge circuit built from two P-MOSFETs (IRF9540) and two N-MOSFETs (IRF540). The transition of the voltage
at the center of the two legs controls the direction of rotation, while the duty cycle of the pulse, as in the unidirectional case, controls the speed of the motor.
Circuitry:
Figure 2.8 shows the circuit of the H-bridge drive used to control the speed and direction of all the bidirectional motors. The motor is connected between the two points marked A and B in the figure. It is then simple to control the direction of the motor just by changing the polarity between the two points: if point A is at a higher voltage than B, the motor turns in one direction, and if point B is at the higher voltage, it turns in the other. To achieve this, transistors Q1 and Q4 are switched on simultaneously for one direction, and Q2 and Q3 are switched on (with the other two off) for the opposite direction.
Transistors Q1 and Q2 are P-MOS (IRF9540), whereas Q3 and Q4 are N-MOS (IRF540). An N-MOS turns on when logic 1 is applied at its gate, and a P-MOS turns on when logic 0 is applied at its gate. To switch Q1 (P-MOS) and Q4 (N-MOS) on at the same time, Q1 is given logic 0 at the gate and Q4 is given logic 1. The same signaling, applied through inverters to the gates of the other two transistors, produces the opposite motion. The gate signaling pattern itself is generated by the same procedure described in the previous section for the unidirectional motor drive.
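The switching pattern can be summarized as a small truth table. The sketch below is illustrative only; the transistor names follow the figure, and the logic levels follow the active-low P-MOS / active-high N-MOS convention described above.

```python
def h_bridge_gates(direction):
    """Gate logic levels for the four H-bridge transistors.
    Q1, Q2 are P-MOS (on at logic 0); Q3, Q4 are N-MOS (on at logic 1).
    Q1+Q4 drive one direction, Q2+Q3 the other; the two transistors in
    one leg are never both on, which would short the supply."""
    if direction == "forward":
        return {"Q1": 0, "Q2": 1, "Q3": 0, "Q4": 1}  # Q1 and Q4 conduct
    if direction == "reverse":
        return {"Q1": 1, "Q2": 0, "Q3": 1, "Q4": 0}  # Q2 and Q3 conduct
    raise ValueError("direction must be 'forward' or 'reverse'")

print(h_bridge_gates("forward"))
```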
Chapter 3: IMAGE PROCESSING IN MATLAB
3.1 Image acquisition
In MATLAB (see Appendix A for the code) this task is accomplished with a single command, getsnapshot. Syntax: y = getsnapshot(vid)
Figure 3.1.1 shows the input RGB image of the bottle. Using the commands given above, we can separate the R, G and B components of the image; figures 3.1.2, 3.1.3 and 3.1.4 show the R, G and B components respectively.
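For readers without MATLAB, the same channel separation can be sketched in Python with NumPy; the tiny synthetic image here is only a stand-in for the captured frame.

```python
import numpy as np

# Stand-in for a captured H x W x 3 RGB frame.
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[..., 0] = 200  # red channel
img[..., 1] = 100  # green channel
img[..., 2] = 50   # blue channel

# Indexing the third axis separates the components,
# like y(:,:,1), y(:,:,2), y(:,:,3) in MATLAB.
R, G, B = img[..., 0], img[..., 1], img[..., 2]
print(R[0, 0], G[0, 0], B[0, 0])  # 200 100 50
```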
The uint8 entry refers to one of the several MATLAB data classes discussed earlier under data types.
Figure 3.2.1 shows the input RGB image; using the command rgb2gray, we can easily convert it into a gray-scale image.
y = im2bw(f,T) converts a gray-scale image to a binary image. Valid input classes are uint8, uint16 and double. The command produces the output binary image g from the intensity image f by thresholding: g has the value 0 for all pixels in the input image with intensity
values less than the threshold T, and 1 for all other pixels. The value specified for T must be in the range [0,1], regardless of the class of the input. The output binary image is automatically declared as a logical array by im2bw. If we write g = im2bw(f), a default threshold of 0.5 is used.
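The thresholding rule itself is a one-liner in any array language. Here is a Python/NumPy equivalent for illustration (the function name is borrowed from MATLAB; this is not the toolbox implementation):

```python
import numpy as np

def im2bw(f, T=0.5):
    """Binary image from a gray-scale image scaled to [0, 1]:
    0 where the intensity is below T, 1 everywhere else."""
    return (np.asarray(f, dtype=float) >= T).astype(np.uint8)

gray = np.array([[0.1, 0.4],
                 [0.6, 0.9]])
print(im2bw(gray, 0.5))  # [[0 0] [1 1]]
print(im2bw(gray))       # default threshold 0.5 gives the same result
```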
Figure 3.3.1 shows the binary image of figure 3.2.1 with a threshold value of 90, figure 3.3.2 shows the binary image with a threshold value of 90, and figure 3.3.3 shows the binary image with a threshold value of 170.
With this thresholding, the lid and label pixels become 1 and all pixels outside these limits become 0.
For the segmentation of the solution in the bottle, we set the thresholding values as given below:

R(i,j)<=70 && G(i,j)<=60 && B(i,j)<=70

With these values of the R, G and B components of the image, the solution pixels become 1s and all other pixels become 0s.
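The same three-channel test can be expressed compactly with NumPy masks. The sketch below mirrors the per-pixel loop in Appendix A; the sample pixel values are made up for illustration.

```python
import numpy as np

def segment_solution(R, G, B):
    """1 where all three channels fall below the thresholds used
    for the dark solution, 0 everywhere else."""
    return ((R <= 70) & (G <= 60) & (B <= 70)).astype(np.uint8)

# One dark (solution) pixel and one bright (background) pixel:
R = np.array([[60, 200]])
G = np.array([[50, 180]])
B = np.array([[65, 190]])
print(segment_solution(R, G, B))  # [[1 0]]
```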
Figure 3.4.1 shows the color segmentation of the lid and label of the bottle, and figure 3.4.2 shows the color segmentation of the solution inside the bottle.
processed. The center pixel of the structuring element, called the origin, identifies the pixel of interest--the pixel being processed. The pixels in the structuring element containing 1's define the neighborhood of the structuring element. These pixels are also considered in the dilation or erosion processing. Three dimensional, or nonflat, structuring elements use 0's and 1's to define the extent of the structuring element in the x- and y-plane and add height values to define the third dimension.
se = strel('diamond',3)
se =
0 0 0 1 0 0 0
0 0 1 1 1 0 0
0 1 1 1 1 1 0
1 1 1 1 1 1 1
0 1 1 1 1 1 0
0 0 1 1 1 0 0
0 0 0 1 0 0 0
The structuring element used in the code is se = strel('rectangle',[4 9]); it is a 4×9 matrix of 1s:

se = strel('rectangle',[4 9])
se =
Neighborhood:
1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1
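To make the closing and opening operations concrete, here is a pure-NumPy sketch of binary dilation and erosion, the building blocks behind imclose and imopen. It is illustrative only, not an optimized implementation, and it assumes an odd-sized structuring element so the centre is unambiguous.

```python
import numpy as np

def dilate(img, se):
    """Binary dilation: a pixel becomes 1 if the structuring element,
    centred on it, overlaps any 1-pixel of the image."""
    sh, sw = se.shape
    ph, pw = sh // 2, sw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.any(padded[i:i+sh, j:j+sw][se == 1])
    return out

def erode(img, se):
    """Binary erosion: a pixel stays 1 only if the structuring element
    fits entirely inside the 1-region around it."""
    sh, sw = se.shape
    ph, pw = sh // 2, sw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.all(padded[i:i+sh, j:j+sw][se == 1])
    return out

# Closing = dilation then erosion; opening = erosion then dilation.
se = np.ones((3, 3), dtype=int)
img = np.zeros((5, 5), dtype=int)
img[2, 2] = 1
print(dilate(img, se).sum())  # the single pixel grows into a 3x3 block: 9
```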
3.8 Labeling
The bwlabel command is used to label the connected regions. We obtain the segmented portions by labeling the different connected regions, i.e. pixel values of 1, 2, 3 or 4 for the different connected regions.
Syntax: [L,num] = bwlabel(f,conn), where f is an input binary image and conn specifies the desired connectivity, which can be either 4 or 8. The output L is called the label matrix, and num (optional) gives the total number of connected components found. The example below shows the label matrix L corresponding to the matrix F, computed using bwlabel(F,4). The pixels in each connected component are assigned a unique integer, from 1 to the total number of connected components. In other words, the pixels labeled 1 belong to the first connected component, the pixels labeled 2 belong to the second connected component, and so on.
F =
1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1
0 0 0 0 0 0 0 0
0 1 1 0 0 0 0 0
0 1 1 0 0 0 1 0
0 0 0 1 1 1 0 0
0 0 0 0 0 0 0 0

L =
1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1
0 0 0 0 0 0 0 0
0 2 2 0 0 0 0 0
0 2 2 0 0 0 3 0
0 0 0 4 4 4 0 0
0 0 0 0 0 0 0 0
The function find is useful when working with label matrices. For example, the following call to find returns the row and column indices of all the pixels belonging to the third object: [r,c] = find(L==3) gives r = 6, c = 7.
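For illustration, 4-connected component labeling can be written in a few lines of Python; applied to the matrix F above it reproduces the label matrix L. This is a sketch of the algorithm, not MATLAB's implementation.

```python
import numpy as np
from collections import deque

def bwlabel4(f):
    """Label the 4-connected components of a binary image via
    breadth-first flood fill; returns (label_matrix, num_components)."""
    f = np.asarray(f)
    L = np.zeros(f.shape, dtype=int)
    num = 0
    for si in range(f.shape[0]):
        for sj in range(f.shape[1]):
            if f[si, sj] and L[si, sj] == 0:
                num += 1                      # start a new component
                L[si, sj] = num
                q = deque([(si, sj)])
                while q:                      # flood-fill its pixels
                    i, j = q.popleft()
                    for ni, nj in ((i-1, j), (i+1, j), (i, j-1), (i, j+1)):
                        if (0 <= ni < f.shape[0] and 0 <= nj < f.shape[1]
                                and f[ni, nj] and L[ni, nj] == 0):
                            L[ni, nj] = num
                            q.append((ni, nj))
    return L, num

F = np.array([[1, 1, 1, 1, 1, 1, 1, 1],
              [1, 1, 1, 1, 1, 1, 1, 1],
              [1, 1, 1, 1, 1, 1, 1, 1],
              [0, 0, 0, 0, 0, 0, 0, 0],
              [0, 1, 1, 0, 0, 0, 0, 0],
              [0, 1, 1, 0, 0, 0, 1, 0],
              [0, 0, 0, 1, 1, 1, 0, 0],
              [0, 0, 0, 0, 0, 0, 0, 0]])
L, num = bwlabel4(F)
print(num)      # 4 connected components
print(L[5, 6])  # the isolated pixel belongs to component 3
```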
1. Find places where the first derivative of the intensity is greater in magnitude than a specified threshold.[5]
2. Find places where the second derivative of the intensity has a zero crossing.[5]
We used Sobel edge detection to find the edges of the bottle. The Sobel edge detector uses the masks Gx and Gy shown below.
Gx =
-1 0 1
-2 0 2
-1 0 1

Gy =
-1 -2 -1
 0  0  0
 1  2  1
b(2,2) = m(1,1)a(1,1) + m(1,2)a(1,2) + m(1,3)a(1,3) + m(2,1)a(2,1) + m(2,2)a(2,2) + m(2,3)a(2,3) + m(3,1)a(3,1) + m(3,2)a(3,2) + m(3,3)a(3,3)
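The formula above is the mask response at pixel (2,2): each output value is the sum of elementwise products of the 3×3 mask m and the 3×3 image neighbourhood a. A small Python sketch of this using the Gx mask (the sample image is made up):

```python
import numpy as np

# Sobel horizontal mask Gx, as given above.
Gx = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]])

def apply_mask(a, m):
    """Slide the 3x3 mask over the interior pixels of a; each output
    pixel is the sum of elementwise products, as in the b(2,2) formula."""
    h, w = a.shape
    b = np.zeros_like(a)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            b[i, j] = np.sum(m * a[i-1:i+2, j-1:j+2])
    return b

# A vertical step edge from 0 to 10: Gx responds strongly along it.
img = np.array([[0, 0, 10, 10]] * 4)
print(apply_mask(img, Gx))
```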
Figure 3.6.1 Edge Detection Process
3.10 Algorithm
The basic idea is to find the corner points of the lid, the label and the level of the product. To check the correctness of the bottle, we compute the pixel ratio of the corner points of the lid to the base of the bottle, the ratio of the corner points of the label to the base point, and the ratio of the corner points of the level to the base point. Since the camera is fixed, each ratio always remains the same and does not depend on the placement of the bottle. We fixed threshold values for the placement of the lid, the label and the level of the product. If any of the ratios rises above or falls below its threshold band, the bottle is declared incorrect and discarded; otherwise it is declared a correct bottle.
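As a sketch, the acceptance test reduces to three band checks. The bands below are the ones hard-coded in Appendix A; the pixel coordinates in the example are made-up values for illustration.

```python
def bottle_ok(base, lid, label, level):
    """Accept the bottle only if every base-to-feature pixel ratio
    falls inside its fixed band (bands taken from Appendix A)."""
    checks = [
        (base / lid,   4.1, 4.8),  # lid position
        (base / label, 1.6, 1.8),  # label position
        (base / level, 1.9, 2.4),  # fill level
    ]
    return all(lo <= ratio <= hi for ratio, lo, hi in checks)

print(bottle_ok(base=440, lid=100, label=260, level=200))  # all bands met
print(bottle_ok(base=440, lid=60,  label=260, level=200))  # lid ratio out of band
```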
Chapter 4: RESULTS AND DISCUSSION
4.2 Limitations
To keep the project simple and easy to implement, a few limitations were accepted. Some of the major limitations are as follows:
- The algorithm was designed for a single brand (Coke) only.
- The camera position was fixed.
- The distance between the camera and the conveyor belt was also fixed.
- Lighting issues were resolved by placing a light above the camera.
imaging techniques may be used to check for the presence of suspended particles in the product, or sensors can be incorporated to check other properties of the product. If this project is applied to the food industry, e.g. biscuits, then weight and humidity sensors can be deployed along with the camera for a broader scope.
REFERENCES

[1] "Definitions from Dictionary.com". dictionary.reference.com.
[4] MATLAB Image Processing Toolbox.
[5] Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing.
APPENDIX A
Code for Image Processing
vid = videoinput('winvideo',3,'RGB24_352x288'); % camera initialization
obj = getselectedsource(vid);                   % camera properties
obj.FocusMode = 'manual';                       % set focus mode
s = serial('com14');                            % serial port initialization
fopen(s)                                        % open serial port
%get image
        if R(i,j)>=80 && G(i,j)<=90 && B(i,j)<=90
            imnew(i,j)=1;
        else
            imnew(i,j)=0;
        end
    end
end
%color segmentation
L = bwlabel(imnew,4);
imview(L)
[r,c] = find(L==2);
a1 = min(r);
b = min(c);
d = max(r);
e = max(c);
plid = [a b ; a e ; d b ; d e]
[r,c] = find(L==1);
al = min(r);
b = min(c);
d = max(r);
%labeling
%corner detection
% color segmentation of level
R = y(:,:,1);
G = y(:,:,2);
B = y(:,:,3);
[a,b,c] = size(y);
for i = 2:1:a-1
    for j = 2:1:b-1
        if R(i,j)<=70 && G(i,j)<=65 && B(i,j)<=70
            imnew(i,j)=1;
        else
            imnew(i,j)=0;
        end
    end
end
se = strel('rectangle',[6 6]);
imnew = imclose(imnew,se);
imnew = imopen(imnew,se);
L = bwlabel(imnew,4);
imview(L)
[r,c] = find(L==2);
ap = min(r);
b = min(c);
a3 = max(r);
e = max(c);
[r,c] = find(L==1);
ao = min(r);
b = min(c);
a6 = max(r);
e = max(c);
[r,c] = find(L==3);
ai = min(r);
b = min(c);
a4 = max(r);
e = max(c);
[r,c] = find(L==4);
au = min(r);
b = min(c);
a5 = max(r);
e = max(c);
a2 = max([a3 a4 a5 a6]);
as = min([ap ao ai au]);
if (a2/a1>=4.1 && a2/a1<=4.8 && a2/al>=1.6 && a2/al<=1.8 && a2/as>=1.9 && a2/as<=2.4) % threshold values
    fwrite(s,'Y')
    p = [5]
else