
Optimization of Antenna Parameters Using Intelligent Computation

Alok Pandey 1, Archana Sharma 2 and Swati Gaur 3


1 Associate Professor, Dept. of ECE, SIIT, Jaipur 302029, alok.pandey@ieee.org

2 Assistant Professor, Dept. of Mathematics, SIIT, Jaipur 302029, jmdarchanasharma@gmail.com

3 Assistant Professor, Dept. of ECE, SIIT, Jaipur 302029, swatig22ece@gmail.com

Abstract: Co-Active Neuro-Fuzzy Inference Systems (CANFIS) are systems that can be trained to remember the behavior of a modeled structure at given operational points and can then be used to approximate the behavior of the structure away from the training points. These approximation abilities of neural networks are demonstrated on the modeling of a microstrip antenna. Parameters are tuned for the accuracy and the efficiency of the neural models. The association of neural models and fuzzy algorithms in microstrip antenna design is discussed in this paper. The CANFIS model integrates adaptable fuzzy inputs with a modular neural network to rapidly and accurately approximate complex functions. Fuzzy inference systems are also valuable because they combine the explanatory nature of rules (membership functions) with the power of "black box" neural networks.

Keywords: Co-active neural networks & fuzzy inference system, microstrip antennas, modeling, optimization

1. Introduction

Microstrip antennas, due to their many attractive features, have gained popularity among researchers and in industrial applications. In high-performance spacecraft, aircraft, airship, aerostat, missile, and satellite applications, where size, weight, cost, performance, ease of installation, and aerodynamic profile are constraints, low-profile antennas may be required. Presently, there are many government and commercial applications, such as mobile, radio, and wireless communications, that have similar specifications. To meet these requirements, microstrip antennas can be used [13]. In the literature, artificial neural network (ANN) models have been built for the design and analysis of rectangular microstrip antennas. In this work, the width and the length of rectangular microstrip antennas are obtained using a new method based on artificial neural networks. Artificial neural networks developed from neurophysiology by morphologically and computationally mimicking the human brain. Although the precise operational details of artificial neural networks are quite different from those of human brains, they are similar in three respects: they consist of a very large number of processing elements; each neuron connects to a large number of other neurons; and the functionality of the network is determined by modifying the strengths of the connections during a learning phase. The purpose of this article is to provide an overview of recent developments in the design and analysis of microstrip antennas using neural networks, illustrated by an example of microstrip antenna design using an ANN.

2. CANFIS Architecture

Figure 1: CANFIS Architecture

Figure 1 shows the adaptive neuro-fuzzy inference system. Fuzzy rule-based systems and artificial neural networks originated from different philosophies and were originally considered independent of each other. Later studies revealed that

they actually have a close relation. Buckley et al. [1] discussed the functional equivalence between neural networks and fuzzy expert systems. The integration of fuzzy logic and neural networks has given birth to an emerging technology field, fuzzy neural networks. The theory of fuzzy logic provides a mathematical strength to capture the uncertainties associated with human cognitive processes, such as thinking and reasoning; it also provides a mathematical morphology to emulate certain perceptual and linguistic attributes associated with human cognition. While fuzzy theory provides an inference mechanism under cognitive uncertainty, computational neural networks offer exciting advantages such as learning, adaptation, fault tolerance, parallelism and generalization. Computational neural networks are capable of coping with computational complexity, nonlinearity and uncertainty. It is interesting to note that fuzzy logic is another powerful tool for modeling uncertainties associated with human cognition, thinking and perception [2,3]. Many authors have proposed various neuro-fuzzy models as well as complex training algorithms. Of these, Jang [4] proposed the famous neuro-fuzzy model ANFIS (adaptive network-based fuzzy inference system), which has been successfully applied in various fields. In ANFIS, a hybrid learning algorithm is adopted that integrates the back-propagation (BP) algorithm with the recursive least squares algorithm to adjust the parameters. ANFIS was later extended to the coactive ANFIS in [5] and to the generalized ANFIS in [6]. Horikawa et al. [7] proposed a neuro-fuzzy model using sigmoid functions to generate the bell-shaped input membership functions and trained it with the BP algorithm. However, some practical difficulties associated with gradient descent are slow convergence and ineffectiveness at finding a good solution [8].

The Proposed Neuro-Fuzzy Model

The proposed neuro-fuzzy model is a multilayer neural-network-based fuzzy system with a total of five layers. In this connectionist structure, the input and output nodes represent the input states and the output response, respectively, and the hidden layers contain nodes functioning as membership functions (MFs) and rules. This eliminates the disadvantage of a normal feedforward multilayer network, which is difficult for an observer to understand or to modify. Throughout the simulation examples presented in this paper, all the MFs used are bell-shaped (Gaussian) functions defined

as:

μ(x) = exp( -(x - c)^2 / (2σ^2) )

A Gaussian membership function is determined by c and σ: c represents the centre of the MF, and σ determines the width of the MF. A detailed description of the components of the model's structure and functionality, and the philosophy behind this architecture, is given below.
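For illustration, here is a minimal Python sketch of such a bell-shaped (Gaussian) MF; the function name and the example centre and width are illustrative, not taken from the paper.

```python
import math

def gaussian_mf(x, c, sigma):
    # Membership degree of crisp input x for a Gaussian MF
    # with centre c and width sigma.
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

# Example: membership of x = 1.7 in a term centred at c = 1.5 with width 0.4
print(gaussian_mf(1.7, 1.5, 0.4))  # approx. 0.88
```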

a. Input Layer

Nodes in this layer are input nodes that represent the input linguistic variables as crisp values. The nodes in this layer only transmit the input values to the next layer, the membership function layer. Each node is connected only to those nodes of layer 2 that represent the linguistic values of the corresponding linguistic variable.

b. Fuzzy Input Layer

Nodes in this layer act as membership functions to represent the terms of the respective linguistic variables. The input values are fed to the fuzzy input layer, which calculates the membership degrees. This is implemented using Gaussian membership functions with two parameters, the mean (or centre, c) and the variance (or width, σ). This layer implements fuzzification of the inputs and represents the fuzzy quantization of the input variables:

y_i^(FI) = exp( -(x_i - c_i)^2 / (2σ_i^2) )

where x_i is the crisp input connected to node i.
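A minimal sketch of this fuzzification step, assuming one list of Gaussian (centre, width) terms per input; all names and parameter values are hypothetical.

```python
import math

def fuzzify(inputs, terms):
    # inputs: crisp input values x_i
    # terms[i]: list of (centre, width) pairs for the linguistic terms of input i
    # Returns the membership degrees y_i^(FI) for every (input, term) pair.
    return [[math.exp(-((x - c) ** 2) / (2.0 * s ** 2)) for (c, s) in term_list]
            for x, term_list in zip(inputs, terms)]

# Example: two inputs, each quantized by two Gaussian terms ("low", "high")
x = [1.7, 7.55]
terms = [[(1.0, 0.5), (2.5, 0.5)], [(6.0, 1.0), (9.0, 1.0)]]
print(fuzzify(x, terms))
```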

c. Rule Nodes Layer

The third layer contains rule nodes that evolve through learning. Evolving means that all nodes in the third layer are created during learning. The rule nodes represent prototypes of input-output data associations that can be graphically represented as associations of hyper-spheres in the fuzzy input and fuzzy output spaces. Hence, the function of this layer is

y_t^(R) = min_{i ∈ I_t} y_i^(FI)

where I_t is the set of indices of the nodes in the fuzzy input layer that are connected to node t in the rule layer, and y_i^(FI) is the output of node i in the fuzzy input layer.
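A minimal sketch of the rule-node activation, taking the minimum over the connected fuzzy-input outputs; the connectivity map is hypothetical.

```python
def rule_layer(fuzzy_out, connections):
    # fuzzy_out: {fuzzy-input node index: membership degree y_i^(FI)}
    # connections[t]: set I_t of fuzzy-input nodes feeding rule node t
    return {t: min(fuzzy_out[i] for i in idx) for t, idx in connections.items()}

# Example: two rule nodes over four fuzzy-input nodes
y_fi = {0: 0.88, 1: 0.10, 2: 0.65, 3: 0.40}
rules = {0: {0, 2}, 1: {1, 3}}
print(rule_layer(y_fi, rules))  # {0: 0.65, 1: 0.10}
```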

d. Fuzzy Output Layer

The fourth layer is the fuzzy output layer, where each node represents the fuzzy quantization of an output variable. The activation of a node represents the degree to which this membership function is supported by all fuzzy rules together. The connection weights w_kj of the links connecting node k in the fuzzy output layer to node j in the rule nodes layer conceptually represent the CFs of the corresponding fuzzy rules when inferring fuzzy output values:

y_k^(FO) = min_{j ∈ I_k} ( y_j^(R) · w_kj )

where I_k is the set of indices of the nodes in the rule layer that are connected to node k in the fuzzy output layer.
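A minimal sketch of the fuzzy-output activation, following the min of weighted rule activations given above; the weights and connectivity are hypothetical.

```python
def fuzzy_output_layer(rule_out, connections, w):
    # rule_out: {rule node index: activation y_j^(R)}
    # connections[k]: set I_k of rule nodes feeding fuzzy-output node k
    # w[(k, j)]: weight of the link between fuzzy-output node k and rule node j
    return {k: min(rule_out[j] * w[(k, j)] for j in idx)
            for k, idx in connections.items()}

# Example: one fuzzy-output node fed by two rule nodes
y_r = {0: 0.65, 1: 0.10}
conn = {0: {0, 1}}
w = {(0, 0): 0.9, (0, 1): 0.7}
print(fuzzy_output_layer(y_r, conn, w))  # {0: 0.07}
```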

e. Output Layer

This layer represents the output variables of the system. These nodes and the links attached to them act as a defuzzifier. A node in this layer computes a crisp output signal. The output variable layer performs the defuzzification of the fuzzy output variables. The input-output relationship of the units in this layer is defined by the following equation:

y_t^(O) = ( Σ_k w_tk · σ_tk · c_tk · y_k^(FO) ) / ( Σ_k w_tk · σ_tk · y_k^(FO) )

where c_tk and σ_tk are, respectively, the centroid and the width of the membership function of the output linguistic value represented by node k in the fuzzy output layer, and w_tk is the weight of the corresponding connection.
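A minimal sketch of this defuzzification step as reconstructed above (a weighted centre-of-gravity over the fuzzy-output nodes); the numerical values are purely illustrative.

```python
def defuzzify(y_fo, w, c, sigma):
    # y_fo[k]: activation of fuzzy-output node k
    # w[k], c[k], sigma[k]: connection weight, centroid and width of the
    # output MF represented by node k
    num = sum(w[k] * sigma[k] * c[k] * y_fo[k] for k in y_fo)
    den = sum(w[k] * sigma[k] * y_fo[k] for k in y_fo)
    return num / den if den else 0.0

# Example: one crisp output computed from two fuzzy-output terms
y_fo = {0: 0.07, 1: 0.52}
w = {0: 1.0, 1: 1.0}
c = {0: 12.0, 1: 15.0}
sigma = {0: 1.0, 1: 1.0}
print(defuzzify(y_fo, w, c, sigma))  # approx. 14.64
```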
3. Results

A. Structure of the CANFIS

The CANFIS network, which has a configuration of 4 input neurons, 10 and 5 neurons in 2 hidden layers, and 2 output neurons, with learning rate = 0.1 and goal = 0.001, was trained for 400 epochs. The membership functions are Gaussian and the fuzzy model is Tsukamoto. A sigmoid axon with the momentum learning rule is used in training the CANFIS network. CANFIS was trained with 45 samples and tested with 21 samples determined according to the definition of the problem; 5 inputs and 2 outputs were used for the analysis of CANFIS. The results of the CANFIS analysis for an isotropic material (εr = εy) and the comparison with the targets are given in Table 1 and Figure 2. The simulation was carried out in NeuroSolutions 6 and CST 4.
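As a rough illustration of how the comparison in Table 1 can be quantified (the original workflow used NeuroSolutions, not Python), here is a short sketch computing the mean absolute error between desired and CANFIS-predicted dimensions for the first three test samples:

```python
# (W desired, L desired, CANFIS W, CANFIS L) for the first three rows of Table 1
rows = [
    (15.40, 11.95, 15.4148, 11.9469),
    (14.53, 11.60, 14.5349, 11.6029),
    (14.99, 11.78, 14.9860, 11.8052),
]

mae_w = sum(abs(wd - wc) for wd, _, wc, _ in rows) / len(rows)
mae_l = sum(abs(ld - lc) for _, ld, _, lc in rows) / len(rows)
print(f"MAE(W) = {mae_w:.4f}, MAE(L) = {mae_l:.4f}")
```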

Table 1: Results of the CANFIS algorithm and comparison with the targets.

S.No.  h     εr    f     W (desired)  L (desired)  CANFIS (W)  CANFIS (L)
1      2.33  1.75  7.55  15.4         11.95        15.4148     11.9469
2      2.33  1.20  8     14.53        11.6         14.5349     11.6029
3      2.33  1.50  7.75  14.99        11.78        14.986      11.8052
4      2.50  1.50  8.56  13.24        10.21        13.2475     10.2011
5      2.40  1.60  7.15  16.09        12.61        16.1        12.6442
6      2.20  1.60  7.28  16.28        12.89        16.28608    12.8399
7      1.00  2.00  7.92  15.3         12.6         15.3407     12.6077
8      1.40  2.20  7.32  15.6         12.8         15.6392     12.8634
9      1.70  2.50  6.69  15.9         13.1         15.9035     13.1433
10     1.90  2.70  6.37  13.9         13.3         13.8848     13.3094
11     1.00  2.90  6.4   13.7         13.5         13.7933     13.5376
12     1.80  2.60  7.03  13.2         12.2         13.264      12.2472
13     1.20  3.00  6.6   14.4         12.4         14.4188     12.4326
14     1.70  2.40  6.7   15.4         13.4         15.4963     13.4356
15     1.50  2.10  8.64  13           11           12.9528     10.9998
16     1.60  2.80  6.28  15.2         13.2         15.2198     13.202
17     1.70  2.90  6.08  15.4         13.4         15.4497     13.4602
18     1.80  27.0  11    15.8         13.8         15.7943     13.7897
19     1.80  2.80  6.81  14.8         12           14.8078     12.0884
20     1.70  2.90  6.66  14.6         12.2         14.6154     12.2632
21     1.70  2.60  7.28  13           11.8         13.08       11.8101

Figure 2: Graphical representation of the results of Table 1 (W (desired), L (desired), CANFIS (W) and CANFIS (L) versus sample number).

4. References

[1] C. A. Balanis, 1997: Antenna Theory, 3rd edition, John Wiley and Sons, Hoboken, NJ.
[2] I. J. Bahl and P. Bhartia, 1980: Microstrip Antennas, Artech House, Dedham, MA.
[3] D. M. Pozar, 1992: Microstrip antennas, Proceedings of the IEEE, vol. 80, pp. 79-81.
[4] J. J. Buckley and Y. Hayashi, 1994: Fuzzy neural networks: a survey, Fuzzy Sets and Systems, pp. 1-13.
[5] S. Chokri and T. Abdelwahed, 2003: Neural network for modeling: a new approach, Springer-Verlag Berlin, Lecture Notes in Computer Science, pp. 159-168.
[6] H. Takagi, 1995: Fusion techniques of fuzzy systems and neural networks, and fuzzy systems and genetic algorithms, Proc. SPIE 2061, pp. 402-413.
[7] J.-S. Jang, 1993: ANFIS: adaptive-network-based fuzzy inference system, IEEE Trans. Systems, Man, and Cybernetics, vol. 23, pp. 665-685.
[8] E. Mizutani and J.-S. Jang, 1995: Coactive neural fuzzy modeling, Proceedings of the IEEE International Conference on Neural Networks, vol. 2, Perth, Australia, pp. 760-765.
[9] M. F. Azeem et al., 2000: Generalization of adaptive neuro-fuzzy inference systems, IEEE Trans. Neural Networks, vol. 11, pp. 1332-1346.
[10] S. Horikawa et al., 1992: On fuzzy modeling using fuzzy neural networks with the backpropagation algorithm, IEEE Trans. Neural Networks, vol. 3, pp. 801-806.
[11] G. Puskorius and L. Feldkamp: Neurocontrol of nonlinear dynamical systems with Kalman filter trained recurrent networks, IEEE Trans. Neural Networks, vol. 5, pp. 279-297.
[12] L. Zadeh, 1965: Fuzzy sets, Information and Control, vol. 8, pp. 338-353.
[13] E. H. Mamdani and S. Assilian, 1975: An experiment in linguistic synthesis with a fuzzy logic controller, Int. J. Man-Machine Studies, vol. 7.
