Lecture 11 OPTIMIZATION
- Basic Ideas
- One-dimensional Unconstrained Optimization
- Golden-Section Search
- Quadratic Interpolation
- Newton's Method
- Multidimensional Unconstrained Optimization
- Direct Methods
- Gradient Methods
Suranaree University of Technology, Nakornratchsima
School of Civil Engineering
Mongkol JIRAVACHARADET
BASIC IDEAS
[Figure: roots and optima of f(x)]
At an optimum the slope vanishes: f'(x) = 0.
Maximum: f'(x) = 0 and f''(x) < 0
Minimum: f'(x) = 0 and f''(x) > 0
Roots are the points where f(x) = 0.
Examples of Optimization Problems in Engineering
[Figure: a function f(x) with a global maximum, a local maximum, a local minimum, and a global minimum]
GOLDEN-SECTION SEARCH
- Pick a fourth point, then keep either the first or the last three points.
Choice of Intermediate Points
[Figure: f(x) with a maximum between the bracket points xL and xU]
First iteration: L0 = L1 + L2

Second iteration: the same proportions must hold:

    L1 / L0 = L2 / L1   i.e.   L1 / (L1 + L2) = L2 / L1

Set R = L2 / L1:

    1 / (1 + R) = R   or   R² + R − 1 = 0

Solving for the positive root:

    R = (−1 + √(1 + 4(1))) / 2 = (√5 − 1) / 2 = 0.61803
R = 0.61803 is the golden ratio; it allows optima to be found efficiently.
[Figure: The Parthenon in Athens, whose proportions follow the ratio 0.61803]
This golden ratio was considered aesthetically pleasing by the Greeks.
Initial Step of the Golden-section Search
[Figure: f(x1) and f(x2) at interior points x2 < x1 inside [xL, xU], each a distance d from an end; the subinterval that cannot contain the maximum is eliminated]
1) Guess an initial bracket [xL, xU].
2) Choose two interior points x1 and x2 according to the golden ratio:

    d = (√5 − 1)/2 · (xU − xL)
    x1 = xL + d ,  x2 = xU − d

3) If f(x1) > f(x2), eliminate [xL, x2] and set xL = x2 for the next round.
Next Step to Complete the Algorithm
[Figure: after eliminating [xL, x2], the old x1 becomes the new x2]
4) Only the new x1 needs to be determined:

    x1 = xL + (√5 − 1)/2 · (xU − xL)
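The algorithm above can be sketched in Python (a minimal sketch: the function name `golden_max`, the tolerance, and the stopping rule are illustrative choices, not from the lecture):

```python
import math

def golden_max(f, xl, xu, tol=1e-6):
    """Golden-section search for the maximum of f on [xl, xu]."""
    R = (math.sqrt(5) - 1) / 2           # golden ratio, 0.61803...
    x1 = xl + R * (xu - xl)              # upper interior point
    x2 = xu - R * (xu - xl)              # lower interior point
    f1, f2 = f(x1), f(x2)
    while (xu - xl) > tol:
        if f1 > f2:                      # maximum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1      # old x1 becomes the new x2
            x1 = xl + R * (xu - xl)      # only the new x1 is evaluated
            f1 = f(x1)
        else:                            # maximum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2
            x2 = xu - R * (xu - xl)
            f2 = f(x2)
    return (xl + xu) / 2

# The worked example from the lecture: f(x) = 2 sin x - x^2/10 on [0, 4]
f = lambda x: 2 * math.sin(x) - x**2 / 10
x_star = golden_max(f, 0, 4)
```

Each pass shrinks the bracket by the factor 0.618 while reusing one of the two previous function evaluations, which is exactly why the golden ratio is chosen.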
Example: Golden-Section Search to find the maximum of

    f(x) = 2 sin x − x²/10 ,  xL = 0 and xU = 4
Solution: (1) Create two interior points:

    d = (√5 − 1)/2 · (4 − 0) = 2.472
    x1 = 0 + 2.472 = 2.472
    x2 = 4 − 2.472 = 1.528

(2) Evaluate the function at the interior points:

    f(x1) = f(2.472) = 2 sin(2.472) − 2.472²/10 = 0.63
    f(x2) = f(1.528) = 1.765
(3) Because f(x2) > f(x1), eliminate the upper part:

    New xU = x1 = 2.472
    New x1 = x2 = 1.528

(4) Compute the new x2:

    d = (√5 − 1)/2 · (2.472 − 0) = 1.528
    x2 = 2.472 − 1.528 = 0.944

[Iteration table with columns i, xL, x2, f(x2), x1, f(x1), xU, d; rows not recoverable from the original]
QUADRATIC INTERPOLATION
Fit a parabola through three points x0, x1, x2 that bracket the optimum; the optimum x3 of the parabola is

    x3 = [f(x0)(x1² − x2²) + f(x1)(x2² − x0²) + f(x2)(x0² − x1²)] / [2f(x0)(x1 − x2) + 2f(x1)(x2 − x0) + 2f(x2)(x0 − x1)]

[Figure: parabola through x0, x1, x2 with its optimum at x3]

    x0 = 1       f(x0) = 1.5829
    x1 = 1.5055  f(x1) = 1.7691
    x2 = 4       f(x2) = −3.1136
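One quadratic-interpolation step can be sketched in Python. The starting points x0 = 0, x1 = 1, x2 = 4 are an assumption, chosen because they reproduce the x1 = 1.5055 value shown above:

```python
import math

def quad_interp_step(f, x0, x1, x2):
    """Vertex of the parabola through (x0, f0), (x1, f1), (x2, f2)."""
    f0, f1, f2 = f(x0), f(x1), f(x2)
    num = (f0 * (x1**2 - x2**2) + f1 * (x2**2 - x0**2)
           + f2 * (x0**2 - x1**2))
    den = (2 * f0 * (x1 - x2) + 2 * f1 * (x2 - x0)
           + 2 * f2 * (x0 - x1))
    return num / den

# Same objective as the golden-section example
f = lambda x: 2 * math.sin(x) - x**2 / 10
x3 = quad_interp_step(f, 0, 1, 4)
```

Running this gives x3 = 1.5055 with f(x3) = 1.7691, matching the values in the slide; the point with the lowest function value is then discarded and the step repeated.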
NEWTON'S METHOD
Newton-Raphson for a root of f(x) = 0:

    xi+1 = xi − f(xi) / f'(xi)

Applying it to g(x) = f'(x) = 0 locates an optimum:

    xi+1 = xi − g(xi) / g'(xi) = xi − f'(xi) / f''(xi)
Example: Use Newton's method to find the maximum of

    f(x) = 2 sin x − x²/10 ,  initial guess: x0 = 2.5
Solution: (1) Evaluate the 1st and 2nd derivatives of the function:

    f'(x) = 2 cos x − x/5
    f''(x) = −2 sin x − 1/5

(2) Substitute into Newton's formula:

    xi+1 = xi − (2 cos xi − xi/5) / (−2 sin xi − 1/5)

(3) Substitute the initial guess:

    x1 = 2.5 − (2 cos 2.5 − 2.5/5) / (−2 sin 2.5 − 1/5) = 0.99508 ,  f(0.99508) = 1.57859
i x f(x)
0 2.5 0.57194
1 0.99508 1.57859
2 1.46901 1.77385
3 1.42764 1.77573
4 1.42755 1.77573
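The iteration table can be reproduced with a short Python sketch (the function name and fixed iteration count are illustrative):

```python
import math

def newton_opt(x, n_iter=4):
    """Newton's method for the optimum of f(x) = 2 sin x - x^2/10:
    iterate x <- x - f'(x)/f''(x)."""
    for _ in range(n_iter):
        fp = 2 * math.cos(x) - x / 5        # f'(x)
        fpp = -2 * math.sin(x) - 1 / 5      # f''(x)
        x = x - fp / fpp
    return x

x_star = newton_opt(2.5)   # matches the table: 1.42755 after 4 iterations
```

As the table shows, convergence is quadratic once the iterate is near the optimum, but a poor initial guess can diverge since the method has no bracket.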
MULTIDIMENSIONAL UNCONSTRAINED OPTIMIZATION
[Figure: lines of constant f in the x-y plane]
2-D searches: ascending a mountain (maximization) or descending into a valley (minimization).
DIRECT METHODS
Random Search
Univariate and Pattern Searches
Example (Random Search): f(x, y) = y − x − 2x² − 2xy − y²
function [maxx,maxy,maxf] = rndsrch(objfun,xmin,xmax,ymin,ymax)
% Random search: sample uniform points in the box and keep the best
iter = 100;
maxf = -inf;
for i = 1:iter
    x = xmin + (xmax - xmin)*rand;
    y = ymin + (ymax - ymin)*rand;
    fn = feval(objfun,x,y);
    if fn > maxf
        maxf = fn;
        maxx = x;
        maxy = y;
    end
end
>>[maxx,maxy,maxf]=rndsrch('objfun1',-2,2,1,3)
Pattern directions
[Figure: points 1-4 of a univariate search alternating between the x and y directions]
From (1), move along the x axis with y held constant to the maximum at (2).
Powell's Method
Use pattern directions to find the optimum efficiently.
- Points 1 and 2 are obtained by 1-D searches from different starting points.
[Figure: pattern directions in the x-y plane]
GRADIENT METHODS
Use the derivative to locate optima.
Gradient:
[Figure: a direction h at an angle θ in the x-y plane]

    ∂f/∂h = slope along axis h

    ∇f = (∂f/∂x) i + (∂f/∂y) j = steepest direction
Hessian determinant:

    |H| = (∂²f/∂x²)(∂²f/∂y²) − (∂²f/∂x∂y)²

If |H| > 0 and ∂²f/∂x² > 0 → local minimum
If |H| > 0 and ∂²f/∂x² < 0 → local maximum
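The determinant test can be illustrated numerically. The test function f(x, y) = −(x² + y²) and the finite-difference Hessian below are assumptions for demonstration, not from the lecture:

```python
def hessian_test(f, x, y, h=1e-4):
    """Classify a stationary point of f via the Hessian determinant.
    Second derivatives are estimated with central finite differences."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    det = fxx * fyy - fxy**2
    if det > 0 and fxx > 0:
        return "local minimum"
    if det > 0 and fxx < 0:
        return "local maximum"
    if det < 0:
        return "saddle point"
    return "inconclusive"

# Hypothetical example: f = -(x^2 + y^2) is stationary at (0, 0),
# where |H| = 4 > 0 and fxx = -2 < 0, so the test reports a maximum.
kind = hessian_test(lambda x, y: -(x**2 + y**2), 0.0, 0.0)
```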
x = 0:0.1:4;
y = 0:0.1:4;
fxy = zeros(length(y),length(x));
for i = 1:length(y)
    for j = 1:length(x)
        fxy(i,j) = x(j)*y(i)^2;   % f(x,y) = x*y^2
    end
end

>> contour(x,y,fxy)
>> surf(x,y,fxy)
>> mesh(x,y,fxy)

[Figure: contour plot of f(x,y) = x*y^2 on [0,4] x [0,4]]
Determine the elevation: f(2, 2) = 2 · 2² = 8

Gradient at (2, 2): ∂f/∂x = y² = 4 ,  ∂f/∂y = 2xy = 8

Direction: θ = tan⁻¹(8/4) = 63.4° relative to the x axis

Magnitude of ∇f: √(4² + 8²) = 8.944

[Figure: steepest-ascent arrow at (2, 2) on the contour plot, at 63.4° to the x axis]
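The elevation, direction, and magnitude above can be checked with a few lines of Python:

```python
import math

# Gradient of f(x, y) = x*y**2 evaluated at (2, 2):
# df/dx = y**2 = 4, df/dy = 2*x*y = 8
fx, fy = 2**2, 2 * 2 * 2

magnitude = math.hypot(fx, fy)             # sqrt(4^2 + 8^2) = 8.944
theta = math.degrees(math.atan2(fy, fx))   # 63.4 degrees from the x axis
```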
Relationship between the steepest direction and x-y coordinates:

    Define g(h) = f(x0 + (∂f/∂x) h , y0 + (∂f/∂y) h)

    Set g'(h*) = 0 to find h*
Example: Developing a 1-D function along the gradient direction.
Starting from (x0, y0) = (−1, 1) with ∂f/∂x = 6 and ∂f/∂y = −6, the optimal step h* = 0.2 gives

    x = −1 + 6(0.2) = 0.2
    y = 1 − 6(0.2) = −0.2

Set the new starting point to (0.2, −0.2) and repeat the process.
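One steepest-ascent step can be sketched in Python. The objective f(x, y) = 2xy + 2x − x² − 2y² is an assumption: it is chosen because it is consistent with the gradient (6, −6) at (−1, 1) and the step h* = 0.2 shown above:

```python
import math

def f(x, y):
    # Assumed objective, consistent with the numbers in the example
    return 2 * x * y + 2 * x - x**2 - 2 * y**2

x0, y0 = -1.0, 1.0
fx = 2 * y0 + 2 - 2 * x0       # df/dx at (-1, 1) -> 6
fy = 2 * x0 - 4 * y0           # df/dy at (-1, 1) -> -6

def g(h):
    # 1-D cross-section of f along the gradient direction
    return f(x0 + fx * h, y0 + fy * h)

# Golden-section search for the maximizing step h* on [0, 1]
R = (math.sqrt(5) - 1) / 2
lo, hi = 0.0, 1.0
h1, h2 = lo + R * (hi - lo), hi - R * (hi - lo)
while hi - lo > 1e-8:
    if g(h1) > g(h2):
        lo, h2 = h2, h1
        h1 = lo + R * (hi - lo)
    else:
        hi, h1 = h1, h2
        h2 = hi - R * (hi - lo)
h_star = (lo + hi) / 2
x_new, y_new = x0 + fx * h_star, y0 + fy * h_star
```

This reduces each 2-D step to the 1-D searches covered earlier in the lecture; any 1-D optimizer could replace the golden-section loop.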
MATLAB Optimization Toolbox
fminunc : finds the minimum of an unconstrained multivariable function

x =
    0.5000   -1.0000