Scientific Computing: An Introductory Survey
Chapter 6: Optimization

Michael T. Heath

Outline
  Optimization Problems
  One-Dimensional Optimization
  Multi-Dimensional Optimization
  Definitions
  Existence and Uniqueness
  Optimality Conditions
Optimization Problems

  min f(x)
  subject to g(x) = 0 and h(x) ≤ 0

where f: R^n → R, g: R^n → R^m, h: R^n → R^p
Global Optimization
Existence of Minimum

If f is continuous on closed and bounded set S ⊆ R^n, then f has global minimum on S

If S is closed but unbounded, f is still guaranteed a global minimum on S if f is coercive, i.e., f(x) → +∞ as ‖x‖ → ∞
Level Sets

For given γ ∈ R, sublevel set of f is

  L_γ = {x ∈ S : f(x) ≤ γ}

If continuous function f on S ⊆ R^n has nonempty sublevel set that is closed and bounded, then f has global minimum on S

Uniqueness of Minimum
First-order necessary condition: at interior minimum x,

  ∇f(x) = 0

where ∇f(x) is gradient vector of f, whose ith component is ∂f(x)/∂xi

But not all critical points are minima: they can also be maxima or saddle points

At critical point x*, if Hf(x*) is
  positive definite, then x* is minimum of f
  negative definite, then x* is maximum of f
  indefinite, then x* is saddle point of f
where Hf(x) is Hessian matrix of f,

  {Hf(x)}ij = ∂²f(x) / ∂xi ∂xj

which is symmetric

Constrained Optimality
For equality-constrained problem

  min f(x)  subject to  g(x) = 0

where f: R^n → R and g: R^n → R^m, with m ≤ n, necessary condition for feasible point x* to be solution is that negative gradient of f lie in space spanned by constraint normals,

  −∇f(x*) = Jgᵀ(x*) λ*

where Jg is Jacobian matrix of g and λ* is vector of Lagrange multipliers

Lagrangian function

  L(x, λ) = f(x) + λᵀ g(x)

has critical points where

  ∇L(x, λ) = [ ∇f(x) + Jgᵀ(x) λ ;  g(x) ] = 0

and Hessian

  HL(x, λ) = [ B(x, λ)   Jgᵀ(x) ]
             [ Jg(x)        O   ]

where

  B(x, λ) = Hf(x) + Σᵢ λi Hgi(x),  summed over i = 1, …, m
One-Dimensional Optimization
Unimodality
Golden Section Search

τ = (√5 − 1)/2 ≈ 0.618
x1 = a + (1 − τ)(b − a); f1 = f(x1)
x2 = a + τ(b − a); f2 = f(x2)
while ((b − a) > tol) do
    if (f1 > f2) then
        a = x1
        x1 = x2
        f1 = f2
        x2 = a + τ(b − a)
        f2 = f(x2)
    else
        b = x2
        x2 = x1
        f2 = f1
        x1 = a + (1 − τ)(b − a)
        f1 = f(x1)
    end
end

Example

Use golden section search to minimize f(x) = 0.5 − x exp(−x²) on bracket [a, b] = [0, 2]

Example, continued

    x1      f1      x2      f2
  0.764   0.074   1.236   0.232
  0.472   0.122   0.764   0.074
  0.764   0.074   0.944   0.113
  0.652   0.074   0.764   0.074
  0.584   0.085   0.652   0.074
  0.652   0.074   0.695   0.071
  0.695   0.071   0.721   0.071
  0.679   0.072   0.695   0.071
  0.695   0.071   0.705   0.071
  0.705   0.071   0.711   0.071
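The pseudocode above transcribes directly into Python; a minimal sketch, using f and the bracket [0, 2] from the example:

```python
import math

def golden_section(f, a, b, tol=1e-3):
    """Golden section search for the minimum of unimodal f on [a, b]."""
    tau = (math.sqrt(5) - 1) / 2              # ~0.618
    x1 = a + (1 - tau) * (b - a); f1 = f(x1)
    x2 = a + tau * (b - a);       f2 = f(x2)
    while (b - a) > tol:
        if f1 > f2:                           # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a); f2 = f(x2)
        else:                                 # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - tau) * (b - a); f1 = f(x1)
    return (a + b) / 2

f = lambda x: 0.5 - x * math.exp(-x * x)
xmin = golden_section(f, 0.0, 2.0)
print(round(xmin, 3))                         # near 1/sqrt(2) ~ 0.707
```

Only one new function evaluation is needed per iteration, since one interior point is always reused.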
Successive Parabolic Interpolation

Example: use successive parabolic interpolation to minimize f(x) = 0.5 − x exp(−x²), starting from points 0.0, 0.6, 1.2

Example, continued

    xk      f(xk)
  0.000    0.500
  0.600    0.081
  1.200    0.216
  0.754    0.073
  0.721    0.071
  0.692    0.071
  0.707    0.071
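A sketch reproducing the iterates above, assuming each step fits a parabola through the three most recent points and jumps to its vertex (the vertex formula is standard but not spelled out in the slides):

```python
import math

def parabola_vertex(u, v, w, fu, fv, fw):
    """x-coordinate of the vertex of the parabola through three points."""
    num = (v - u)**2 * (fv - fw) - (v - w)**2 * (fv - fu)
    den = (v - u) * (fv - fw) - (v - w) * (fv - fu)
    return v - 0.5 * num / den

f = lambda x: 0.5 - x * math.exp(-x * x)
xs = [0.0, 0.6, 1.2]                   # three starting points from the example
for _ in range(4):
    u, v, w = xs[-3], xs[-2], xs[-1]   # always use the three most recent points
    xs.append(parabola_vertex(u, v, w, f(u), f(v), f(w)))
print([round(x, 3) for x in xs])       # iterates approach 0.707
```

No derivatives are needed; convergence is superlinear but the raw iteration has no bracketing safeguard.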
Newton's Method

Example: apply Newton's method, xk+1 = xk − f′(xk)/f″(xk), to minimize f(x) = 0.5 − x exp(−x²), starting from x0 = 1

    xk      f(xk)
  1.000    0.132
  0.500    0.111
  0.700    0.071
  0.707    0.071
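A sketch reproducing the iterates above; the derivatives f′(x) = (2x² − 1)e^(−x²) and f″(x) = 2x(3 − 2x²)e^(−x²) are worked out by hand for f(x) = 0.5 − x exp(−x²):

```python
import math

def fp(x):    # f'(x) for f(x) = 0.5 - x*exp(-x^2)
    return (2*x*x - 1) * math.exp(-x*x)

def fpp(x):   # f''(x)
    return 2*x * (3 - 2*x*x) * math.exp(-x*x)

x = 1.0
iterates = [x]
for _ in range(3):
    x = x - fp(x) / fpp(x)               # Newton step on f'(x) = 0
    iterates.append(x)
print([round(v, 3) for v in iterates])   # [1.0, 0.5, 0.7, 0.707]
```

Note the iteration seeks a zero of f′, so it converges to whatever critical point is nearby; it does not distinguish minima from maxima.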
Safeguarded Methods
Multi-Dimensional Optimization
  Unconstrained Optimization
  Nonlinear Least Squares
  Constrained Optimization

Steepest Descent

Steepest descent iteration has form

  xk+1 = xk − αk ∇f(xk)

where αk is line search parameter that determines how far to go in given direction
Example

Use steepest descent to minimize f(x) = 0.5 x1² + 2.5 x2², with ∇f(x) = [x1, 5x2]ᵀ, starting from x0 = [5, 1]ᵀ, with exact line search

Example, continued

       xkᵀ           f(xk)       ∇f(xk)ᵀ
  5.000   1.000     15.000     5.000   5.000
  3.333  −0.667      6.667     3.333  −3.333
  2.222   0.444      2.963     2.222   2.222
  1.481  −0.296      1.317     1.481  −1.481
  0.988   0.198      0.585     0.988   0.988
  0.658  −0.132      0.260     0.658  −0.658
  0.439   0.088      0.116     0.439   0.439
  0.293  −0.059      0.051     0.293  −0.293
  0.195   0.039      0.023     0.195   0.195
  0.130  −0.026      0.010     0.130  −0.130
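A short Python sketch of the iteration on this example; for a quadratic with Hessian H the exact line search parameter has the closed form α = gᵀg / gᵀHg (an assumption consistent with the iterates shown, here with H = diag(1, 5)):

```python
def grad(x):
    # f(x) = 0.5*x1^2 + 2.5*x2^2, so grad f = (x1, 5*x2)
    return (x[0], 5 * x[1])

x = (5.0, 1.0)
traj = [x]
for _ in range(9):
    g = grad(x)
    # exact line search for this quadratic: alpha = g'g / g'Hg, H = diag(1, 5)
    alpha = (g[0]**2 + g[1]**2) / (g[0]**2 + 5 * g[1]**2)
    x = (x[0] - alpha * g[0], x[1] - alpha * g[1])
    traj.append(x)
print(round(traj[1][0], 3), round(traj[1][1], 3))   # 3.333 -0.667
```

The iterates zigzag across the narrow valley, shrinking by only a constant factor (2/3 here) per step, which is the slow linear convergence the table exhibits.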
Newton's Method

Broader view can be obtained by local quadratic approximation, which is equivalent to Newton's method

In multidimensional optimization, we seek zero of gradient, so Newton iteration has form

  xk+1 = xk − Hf(xk)⁻¹ ∇f(xk)

where Hf(x) is Hessian matrix of second partial derivatives of f,

  {Hf(x)}ij = ∂²f(x) / ∂xi ∂xj
If Hessian is not positive definite, Newton step may fail to make progress; step sk is descent direction only if

  ∇f(xk)ᵀ sk < 0
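For the quadratic example f(x) = 0.5 x1² + 2.5 x2², the Hessian is the constant diagonal matrix diag(1, 5), so the Newton step can be sketched with a trivial diagonal solve, and a single iteration reaches the minimizer:

```python
def newton_step(x):
    # f(x) = 0.5*x1^2 + 2.5*x2^2: grad = (x1, 5*x2), Hessian = diag(1, 5)
    g = (x[0], 5 * x[1])
    s = (-g[0] / 1.0, -g[1] / 5.0)   # solve Hf s = -grad (diagonal solve)
    return (x[0] + s[0], x[1] + s[1])

x1 = newton_step((5.0, 1.0))
print(x1)   # (0.0, 0.0): one step suffices for a quadratic
```

This one-step behavior is special to quadratics; for general f, Newton converges quadratically only once the iterates are near the solution.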
Quasi-Newton Methods
BFGS Method
x0 = initial guess
B0 = initial Hessian approximation
for k = 0, 1, 2, …
    Solve Bk sk = −∇f(xk) for sk
    xk+1 = xk + sk
    yk = ∇f(xk+1) − ∇f(xk)
    Bk+1 = Bk + (yk ykᵀ)/(ykᵀ sk) − (Bk sk skᵀ Bk)/(skᵀ Bk sk)
end
Example

Use BFGS to minimize f(x) = 0.5 x1² + 2.5 x2², with ∇f(x) = [x1, 5x2]ᵀ, starting from x0 = [5, 1]ᵀ and B0 = I

Example, continued

        xkᵀ             f(xk)       ∇f(xk)ᵀ
   5.000   1.000       15.000      5.000    5.000
   0.000  −4.000       40.000      0.000  −20.000
  −2.222   0.444        2.963     −2.222    2.222
   0.816   0.082        0.350      0.816    0.408
  −0.009  −0.015        0.001     −0.009   −0.077
  −0.001   0.001        0.000     −0.001    0.005
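A minimal sketch of the BFGS algorithm above, specialized to the 2-D example with unit step length (no line search) and hand-rolled 2×2 linear algebra:

```python
def solve2(B, b):
    # Solve the 2x2 system B x = b by Cramer's rule
    det = B[0][0]*B[1][1] - B[0][1]*B[1][0]
    return ((B[1][1]*b[0] - B[0][1]*b[1]) / det,
            (B[0][0]*b[1] - B[1][0]*b[0]) / det)

def grad(x):
    return (x[0], 5 * x[1])            # f(x) = 0.5*x1^2 + 2.5*x2^2

x = (5.0, 1.0)
B = [[1.0, 0.0], [0.0, 1.0]]           # B0 = I
for _ in range(8):
    g = grad(x)
    s = solve2(B, (-g[0], -g[1]))      # solve Bk sk = -grad f(xk)
    xn = (x[0] + s[0], x[1] + s[1])
    y = (grad(xn)[0] - g[0], grad(xn)[1] - g[1])
    ys = y[0]*s[0] + y[1]*s[1]
    Bs = (B[0][0]*s[0] + B[0][1]*s[1], B[1][0]*s[0] + B[1][1]*s[1])
    sBs = s[0]*Bs[0] + s[1]*Bs[1]
    if ys < 1e-30 or sBs < 1e-30:      # numerically converged; avoid 0/0
        x = xn
        break
    # rank-two BFGS update of the Hessian approximation
    B = [[B[i][j] + y[i]*y[j]/ys - Bs[i]*Bs[j]/sBs for j in range(2)]
         for i in range(2)]
    x = xn
print(x)   # approaches (0, 0)
```

The first step (with B0 = I) is just a steepest descent step to (0, −4); the update then builds curvature information and the iterates collapse toward the solution, matching the table.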
Nonlinear Least Squares

Given data (ti, yi), i = 1, …, m, and model function f(t, x) with parameter vector x, least squares data fitting minimizes

  φ(x) = ½ rᵀ(x) r(x)

where ith component of residual vector is ri(x) = yi − f(ti, x)
Hessian of φ is

  Hφ(x) = Jᵀ(x) J(x) + Σᵢ ri(x) Hi(x)

where J(x) is Jacobian of r(x) and Hi(x) is Hessian of ri(x), so Newton step sk satisfies linear system

  ( Jᵀ(xk) J(xk) + Σᵢ ri(xk) Hi(xk) ) sk = −Jᵀ(xk) r(xk)

Gauss-Newton Method

This motivates Gauss-Newton method for nonlinear least squares, in which second-order term is dropped and linear system

  Jᵀ(xk) J(xk) sk = −Jᵀ(xk) r(xk)

is solved for approximate Newton step sk at each iteration; this is system of normal equations for linear least squares problem

  J(xk) sk ≅ −r(xk)
Example

Use Gauss-Newton method to fit model function

  f(t, x) = x1 exp(x2 t)

to data

  t    0.0   1.0   2.0   3.0
  y    2.0   0.7   0.3   0.1

For this model,

  ∂ri(x)/∂x1 = −exp(x2 ti),   ∂ri(x)/∂x2 = −x1 ti exp(x2 ti)

Example, continued

If we take x0 = [1, 0]ᵀ, then Gauss-Newton step s0 is given by linear least squares problem

  [ 1  0 ]        [  1.0 ]
  [ 1  1 ] s0  ≅  [ −0.3 ]
  [ 1  2 ]        [ −0.7 ]
  [ 1  3 ]        [ −0.9 ]

whose solution is s0 = [0.69, −0.61]ᵀ

Example, continued

       xkᵀ           ‖r(xk)‖²₂
  1.000   0.000       2.390
  1.690  −0.610       0.212
  1.975  −0.930       0.007
  1.994  −1.004       0.002
  1.995  −1.009       0.002
  1.995  −1.010       0.002
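A sketch of the Gauss-Newton iteration on this data-fitting example, forming the normal equations JᵀJ sk = −Jᵀr explicitly (fine at this tiny size, though an orthogonal factorization of the least squares problem is numerically preferable):

```python
import math

t = [0.0, 1.0, 2.0, 3.0]
y = [2.0, 0.7, 0.3, 0.1]

def gn_step(x1, x2):
    # residuals r_i = y_i - x1*exp(x2*t_i); Jacobian rows (-e, -x1*t_i*e)
    A, b = [[0.0, 0.0], [0.0, 0.0]], [0.0, 0.0]
    for ti, yi in zip(t, y):
        e = math.exp(x2 * ti)
        J = (-e, -x1 * ti * e)
        r = yi - x1 * e
        for i in range(2):
            b[i] -= J[i] * r              # accumulate -J'r
            for j in range(2):
                A[i][j] += J[i] * J[j]    # accumulate J'J
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    s = ((A[1][1]*b[0] - A[0][1]*b[1]) / det,
         (A[0][0]*b[1] - A[1][0]*b[0]) / det)
    return x1 + s[0], x2 + s[1]

x = (1.0, 0.0)
for _ in range(5):
    x = gn_step(*x)
print(round(x[0], 3), round(x[1], 3))   # near (1.995, -1.010)
```

The first step reproduces s0 = (0.69, −0.61) from the slide, since JᵀJ = [[4, 6], [6, 14]] and −Jᵀr = (−0.9, −4.4) at x0 = (1, 0).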
Levenberg-Marquardt Method

Levenberg-Marquardt method replaces Gauss-Newton system with

  ( Jᵀ(xk) J(xk) + μk I ) sk = −Jᵀ(xk) r(xk)

where μk ≥ 0 is scalar parameter; corresponding linear least squares problem is

  [ J(xk)  ]        [ −r(xk) ]
  [ √μk I  ] sk  ≅  [    0   ]
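A sketch of Levenberg-Marquardt on the same exponential-fit data; the μ update strategy here (multiply or divide by 10 on rejected or accepted steps) is a common simple choice, not something specified in the slides:

```python
import math

t = [0.0, 1.0, 2.0, 3.0]
y = [2.0, 0.7, 0.3, 0.1]

def resid(x):
    return [yi - x[0] * math.exp(x[1] * ti) for ti, yi in zip(t, y)]

def lm_step(x, mu):
    # Assemble (J'J + mu*I) and -J'r for the model x1*exp(x2*t)
    A = [[mu, 0.0], [0.0, mu]]
    b = [0.0, 0.0]
    r = resid(x)
    for k, ti in enumerate(t):
        e = math.exp(x[1] * ti)
        J = (-e, -x[0] * ti * e)
        for i in range(2):
            b[i] -= J[i] * r[k]
            for j in range(2):
                A[i][j] += J[i] * J[j]
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    return (x[0] + (A[1][1]*b[0] - A[0][1]*b[1]) / det,
            x[1] + (A[0][0]*b[1] - A[1][0]*b[0]) / det)

x, mu = (1.0, 0.0), 1.0
for _ in range(20):
    xn = lm_step(x, mu)
    if sum(r*r for r in resid(xn)) < sum(r*r for r in resid(x)):
        x, mu = xn, mu / 10     # accept step, trust the model more
    else:
        mu *= 10                # reject step, damp harder
print(round(x[0], 3), round(x[1], 3))
```

As μ shrinks the method behaves like Gauss-Newton; as μ grows the step turns toward a short steepest descent step, which is what makes the iteration robust far from the solution.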
Equality-Constrained Optimization

For equality-constrained minimization problem

  min f(x)  subject to  g(x) = 0

where f: R^n → R and g: R^n → R^m, with m ≤ n, we seek critical point of Lagrangian L(x, λ) = f(x) + λᵀ g(x)

Applying Newton's method to nonlinear system ∇L(x, λ) = 0 gives linear system

  [ B(x, λ)   Jgᵀ(x) ] [ s ]      [ ∇f(x) + Jgᵀ(x) λ ]
  [ Jg(x)        O   ] [ δ ]  = − [       g(x)       ]

to be solved for Newton step (s, δ) in (x, λ) at each iteration
Merit Function

Once Newton step (s, δ) is determined, we need merit function to measure progress toward overall solution, for use in line search or trust region
Inequality-Constrained Optimization

Penalty Methods

Merit function can also be used to convert equality-constrained problem into sequence of unconstrained problems

If x*(ρ) is solution to

  min φ_ρ(x) = f(x) + ½ ρ g(x)ᵀ g(x)

then as ρ → ∞, x*(ρ) converges to solution of original constrained problem
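For the quadratic example with linear constraint used later in these slides (f(x) = 0.5 x1² + 2.5 x2², g(x) = x1 − x2 − 1; pairing it with the penalty method is an assumption here, since the penalty slide itself names no example), ∇φ_ρ(x) = 0 is a 2×2 linear system that can be solved exactly for each ρ:

```python
def penalty_min(rho):
    # Unconstrained minimizer of f(x) + 0.5*rho*g(x)^2 for
    # f = 0.5*x1^2 + 2.5*x2^2, g = x1 - x2 - 1.
    # Setting the gradient to zero gives the linear system:
    #   (1+rho)*x1 -     rho*x2 =  rho
    #      -rho*x1 + (5+rho)*x2 = -rho
    a11, a12, b1 = 1 + rho, -rho, rho
    a21, a22, b2 = -rho, 5 + rho, -rho
    det = a11*a22 - a12*a21
    return ((a22*b1 - a12*b2) / det, (a11*b2 - a21*b1) / det)

for rho in (1.0, 10.0, 100.0, 1000.0):
    xa, xb = penalty_min(rho)
    print(rho, round(xa, 3), round(xb, 3))
# minimizers approach the constrained solution (0.833, -0.167) as rho grows
```

The drawback visible here is that large ρ makes the linear system increasingly ill-conditioned, which is the usual practical limitation of pure penalty methods.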
Barrier Methods

For inequality-constrained problems, barrier functions, such as

  φ_μ(x) = f(x) − μ Σᵢ 1/hi(x)

or

  φ_μ(x) = f(x) − μ Σᵢ log(−hi(x)),  summed over i = 1, …, p

increasingly penalize feasible points as they approach boundary of feasible region

Example: Constrained Optimization

Consider quadratic programming problem

  min f(x) = 0.5 x1² + 2.5 x2²
  subject to g(x) = x1 − x2 − 1 = 0

Lagrangian function is given by

  L(x, λ) = f(x) + λ g(x) = 0.5 x1² + 2.5 x2² + λ (x1 − x2 − 1)

Since

  ∇f(x) = [ x1, 5x2 ]ᵀ  and  Jg(x) = [ 1  −1 ]

we have

  ∇xL(x, λ) = ∇f(x) + Jgᵀ(x) λ = [ x1 + λ, 5x2 − λ ]ᵀ
Example, continued

So system to be solved for critical point of Lagrangian is

  x1 + λ = 0
  5x2 − λ = 0
  x1 − x2 = 1

which in this case is linear system

  [ 1   0   1 ] [ x1 ]   [ 0 ]
  [ 0   5  −1 ] [ x2 ] = [ 0 ]
  [ 1  −1   0 ] [ λ  ]   [ 1 ]

Solving this system, we obtain solution

  x1 = 0.833,   x2 = −0.167,   λ = −0.833
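The 3×3 system above can be solved with a few lines of Gaussian elimination:

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            m = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= m * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                 # back substitution
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

A = [[1.0,  0.0,  1.0],
     [0.0,  5.0, -1.0],
     [1.0, -1.0,  0.0]]
b = [0.0, 0.0, 1.0]
x1, x2, lam = solve3(A, b)
print(round(x1, 3), round(x2, 3), round(lam, 3))   # 0.833 -0.167 -0.833
```

The exact values are x1 = 5/6, x2 = −1/6, λ = −5/6; note the coefficient matrix is symmetric but indefinite, as is typical of such saddle-point systems.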
Linear Programming

One of most important and common constrained optimization problems is linear programming, in which objective and constraint functions are all linear; standard form is

  min cᵀx  subject to  Ax = b  and  x ≥ 0
Example, continued

  x1 + 3x2 ≤ 12,   x1 ≥ 0,   x2 ≥ 0