Genetic algorithms and evolutionary programming
fabian@theis.name
Regensburg, 11-Jan-05
Outline
- Introduction
  - Reinforcement learning
  - Imitate nature
  - Genetic algorithms
- Algorithm
  - Basic algorithm
  - Data representation
  - Selection
  - Reproduction
- Examples
  - 2d-function optimization
  - Genetic Mastermind
  - Hyperplane detection
- Conclusions
Introduction
Optimization
Genetic algorithms
- advantages
  - global, not only local, optimization
  - simple and hence easy to implement
  - easy parallelization possible
- disadvantages
  - how to encode the phase-space position?
  - rather low speed and high computational cost
  - parameter dependencies (population size, selection and reproduction parameters)
Algorithm
Individual
Selection
- goal: select the individuals that produce the next generation
- probabilistic selection
  - based on fitness function f
  - better individuals have an increased chance of reproduction
  - usually selection with replacement → very fit individuals may reproduce several times
- selection probabilities
  - roulette wheel (Holland 1975):

    P(choice of individual i) = f(i) / Σ_j f(j)
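The roulette-wheel rule above can be sketched directly; this is a minimal illustration (population, fitness values, and sample size are made up for the example), not the implementation used in the talk:

```python
import random

def roulette_select(population, fitness, k):
    """Select k individuals with replacement, each drawn with
    probability proportional to fitness: P(i) = f(i) / sum_j f(j)."""
    total = sum(fitness)
    weights = [f / total for f in fitness]
    return random.choices(population, weights=weights, k=k)

# Hypothetical example: individual "c" holds 8/10 of the total fitness,
# so it should make up about 80% of the selected parents.
random.seed(0)
pop = ["a", "b", "c"]
fit = [1.0, 1.0, 8.0]
parents = roulette_select(pop, fit, 1000)
```

Selection is with replacement, matching the slide: a very fit individual can appear among the parents many times.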
Reproduction
Crossover
- let x, y ∈ Aⁿ be the genes of the two parents
- simple crossover:
  - choose r randomly in {1, . . . , n}
  - generate children x′, y′ ∈ Aⁿ by

    x′_i := x_i if i < r, y_i otherwise
    y′_i := y_i if i < r, x_i otherwise
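The simple (single-point) crossover defined above can be written as a short sketch; genomes are represented here as Python lists, an illustrative choice not fixed by the slides:

```python
import random

def simple_crossover(x, y):
    """Single-point crossover: choose r in {1, ..., n}, then the
    children take genes from one parent for positions i < r and
    from the other parent for the remaining positions."""
    assert len(x) == len(y)
    n = len(x)
    r = random.randint(1, n)
    # slicing uses 0-based indices, so 1-based position i < r maps to x[:r-1]
    child_x = x[:r - 1] + y[r - 1:]
    child_y = y[:r - 1] + x[r - 1:]
    return child_x, child_y

random.seed(1)
c1, c2 = simple_crossover(list("AAAA"), list("BBBB"))
```

At every position the two children carry complementary parental genes, so together they contain exactly the genetic material of the two parents.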
Mutation
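The body of the mutation slide did not survive extraction. In the standard formulation, mutation resamples each gene independently with a small probability; the following sketch assumes that formulation, and both the alphabet and the rate p are illustrative defaults, not parameters from the talk:

```python
import random

def mutate(genome, alphabet, p=0.01):
    """Resample each gene independently from the alphabet with
    probability p (p = 0.01 is a hypothetical default)."""
    return [random.choice(alphabet) if random.random() < p else g
            for g in genome]

random.seed(2)
mutated = mutate([0] * 1000, [0, 1], p=0.1)
```

Mutation keeps the population from collapsing onto a few genotypes and lets the search escape local optima that selection and crossover alone would preserve.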
Examples
- continuous example
  - global optimization of a continuous function f : [a, b] → R
- binary example
  - genetic Mastermind
  - select the optimal guess using a GA
- example from our research
  - perform overcomplete blind source separation by sparse component analysis
  - key problem: hyperplane detection
  - solution: optimize the cost function using GAs
2d-function optimization

[Figure: "multipeak" test function over x ∈ [−10, 10]; only axis ticks survived extraction.]
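The plots themselves were lost in extraction. As a stand-in, the following sketch runs the basic GA loop (roulette-wheel selection, simple crossover, mutation) to maximize a continuous function on an interval, encoding the phase-space position as a bit string; the test function and all parameter values are illustrative assumptions, not those used in the talk:

```python
import math
import random

def decode(bits, a, b):
    """Map a bit string to a point in [a, b] (binary gene encoding)."""
    x = int("".join(map(str, bits)), 2)
    return a + (b - a) * x / (2 ** len(bits) - 1)

def ga_maximize(f, a, b, n_bits=16, pop_size=40, gens=60, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(gens):
        fit = [f(decode(ind, a, b)) for ind in pop]
        m = min(fit)
        weights = [g - m + 1e-9 for g in fit]  # shift so weights are positive
        new_pop = []
        while len(new_pop) < pop_size:
            # roulette-wheel selection with replacement
            x, y = random.choices(pop, weights=weights, k=2)
            # simple crossover at random point r
            r = random.randint(1, n_bits)
            cx, cy = x[:r - 1] + y[r - 1:], y[:r - 1] + x[r - 1:]
            # bit-flip mutation with small probability
            for c in (cx, cy):
                new_pop.append([1 - g if random.random() < p_mut else g
                                for g in c])
        pop = new_pop[:pop_size]
    return max((decode(ind, a, b) for ind in pop), key=f)

# Hypothetical multipeak test function on [-10, 10]
f = lambda x: math.sin(x) + math.sin(2 * x) - 0.02 * x ** 2
random.seed(3)
best = ga_maximize(f, -10, 10)
```

Because selection is only proportional to (shifted) fitness, the population explores several peaks before concentrating near a global one — the behavior the multipeak figure illustrated.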
Genetic Mastermind
Hyperplane detection

[Figure: 3d scatter plots illustrating hyperplane detection; only axis ticks survived extraction.]
Conclusions

Resources
- books: Goldberg [1989], Schöneburg et al. [1994]
- Matlab GA optimization toolbox: http://www.ie.ncsu.edu/mirage/GAToolBox/gaot
- details and papers on my website: http://fabian.theis.name
- this research was supported by the DFG¹ and the BMBF²

References
- P. Georgiev, F. Theis, and A. Cichocki. Sparse component analysis and blind source separation of underdetermined mixtures. IEEE Trans. on Neural Networks, in print, 2004.
- D. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison Wesley Publishing, 1989.
- E. Schöneburg, F. Heinzmann, and S. Feddersen. Genetische Algorithmen und Evolutionsstrategien. Addison Wesley Publishing, 1994.
- F. Theis, P. Georgiev, and A. Cichocki. Robust overcomplete matrix recovery for sparse sources using a generalized Hough transform. In Proc. ESANN 2004, pages 343–348, Bruges, Belgium, 2004. d-side, Evere, Belgium.

¹ graduate college: Nonlinearity and Nonequilibrium in Condensed Matter
² project 'ModKog'
Theis Genetic algorithms and evolutionary programming