
EE263 Autumn 2007-08

Stephen Boyd

Lecture 15
Symmetric matrices, quadratic forms, matrix norm, and SVD

• eigenvectors of symmetric matrices
• quadratic forms
• inequalities for quadratic forms
• positive semidefinite matrices
• norm of a matrix
• singular value decomposition

Eigenvalues of symmetric matrices

suppose $A \in \mathbf{R}^{n \times n}$ is symmetric, i.e., $A = A^T$

fact: the eigenvalues of $A$ are real

to see this, suppose $Av = \lambda v$, $v \neq 0$, $v \in \mathbf{C}^n$

then
$$\overline{v}^T A v = \overline{v}^T (Av) = \lambda \overline{v}^T v = \lambda \sum_{i=1}^n |v_i|^2$$

but also
$$\overline{v}^T A v = (A \overline{v})^T v = (\overline{\lambda}\,\overline{v})^T v = \overline{\lambda} \sum_{i=1}^n |v_i|^2$$

so we have $\lambda = \overline{\lambda}$, i.e., $\lambda \in \mathbf{R}$ (hence, can assume $v \in \mathbf{R}^n$)
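a minimal NumPy check of this fact (the random matrix here is illustrative, not from the slides); we feed a symmetric matrix to the general eigenvalue routine and see that the imaginary parts vanish:

```python
import numpy as np

# quick numerical check that a symmetric matrix has real eigenvalues
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                      # symmetrize: A = A^T

lam = np.linalg.eigvals(A)             # general (complex-capable) eigenvalue routine
print(np.max(np.abs(lam.imag)))        # ~0: imaginary parts are numerical noise
```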



Eigenvectors of symmetric matrices

fact: there is a set of orthonormal eigenvectors of $A$, i.e., $q_1, \ldots, q_n$ s.t. $Aq_i = \lambda_i q_i$, $q_i^T q_j = \delta_{ij}$

in matrix form: there is an orthogonal $Q$ s.t.
$$Q^{-1} A Q = Q^T A Q = \Lambda$$

hence we can express $A$ as
$$A = Q \Lambda Q^T = \sum_{i=1}^n \lambda_i q_i q_i^T$$

in particular, $q_i$ are both left and right eigenvectors
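a minimal NumPy sketch of this factorization (the example matrix is random, for illustration; np.linalg.eigh is NumPy's routine for symmetric matrices and returns eigenvalues in ascending order):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2

lam, Q = np.linalg.eigh(A)                        # eigenvalues, orthonormal eigenvectors
print(np.allclose(Q.T @ Q, np.eye(4)))            # True: the q_i are orthonormal
print(np.allclose(Q.T @ A @ Q, np.diag(lam)))     # True: Q^T A Q = Lambda
# A as a linear combination of 1-dimensional projections lambda_i q_i q_i^T
A_sum = sum(lam[i] * np.outer(Q[:, i], Q[:, i]) for i in range(4))
print(np.allclose(A, A_sum))                      # True
```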



Interpretations

$$A = Q \Lambda Q^T$$

(block diagram: $x \to Q^T \to Q^T x \to \Lambda \to \Lambda Q^T x \to Q \to Ax$)

linear mapping $y = Ax$ can be decomposed as

• resolve into $q_i$ coordinates
• scale coordinates by $\lambda_i$
• reconstitute with basis $q_i$
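a short sketch of this three-step decomposition in NumPy (illustrative, with a random symmetric $A$):

```python
import numpy as np

def apply_sym(A, x):
    """Apply y = A x via the eigendecomposition A = Q Lambda Q^T."""
    lam, Q = np.linalg.eigh(A)
    c = Q.T @ x          # resolve x into q_i coordinates
    c = lam * c          # scale each coordinate by lambda_i
    return Q @ c         # reconstitute with basis q_i

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3)); A = (B + B.T) / 2
x = rng.standard_normal(3)
print(np.allclose(apply_sym(A, x), A @ x))   # True
```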

or, geometrically,

• rotate by $Q^T$
• diagonal real scale (dilation) by $\Lambda$
• rotate back by $Q$

decomposition
$$A = \sum_{i=1}^n \lambda_i q_i q_i^T$$
expresses $A$ as linear combination of 1-dimensional projections


example:

$$A = \begin{bmatrix} -1/2 & 3/2 \\ 3/2 & -1/2 \end{bmatrix}
= \left( \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \right)
\begin{bmatrix} 1 & 0 \\ 0 & -2 \end{bmatrix}
\left( \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \right)^T$$

(figure: $x$ resolved into the projections $q_1 q_1^T x$ and $q_2 q_2^T x$; scaling gives $\lambda_1 q_1 q_1^T x$ and $\lambda_2 q_2 q_2^T x$, which sum to $Ax$)
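a quick numerical check of this example, reconstructing $A$ from $Q$ and $\Lambda$ as given above:

```python
import numpy as np

# verify the 2x2 example: A = Q diag(1, -2) Q^T with Q = (1/sqrt(2)) [[1, 1], [1, -1]]
Q = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
Lam = np.diag([1.0, -2.0])
A = Q @ Lam @ Q.T
print(A)                               # [[-0.5  1.5]  [ 1.5 -0.5]]

# decompose an arbitrary x into the two 1-dimensional projections
x = np.array([1.0, 0.3])
p1 = np.outer(Q[:, 0], Q[:, 0]) @ x    # q1 q1^T x
p2 = np.outer(Q[:, 1], Q[:, 1]) @ x    # q2 q2^T x
print(np.allclose(A @ x, 1.0 * p1 + (-2.0) * p2))   # True
```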

proof (case of $\lambda_i$ distinct)

suppose $v_1, \ldots, v_n$ is a set of linearly independent eigenvectors of $A$:
$$Av_i = \lambda_i v_i, \qquad \|v_i\| = 1$$

then we have
$$v_i^T (A v_j) = \lambda_j v_i^T v_j = (A v_i)^T v_j = \lambda_i v_i^T v_j$$
so $(\lambda_i - \lambda_j) v_i^T v_j = 0$

for $i \neq j$, $\lambda_i \neq \lambda_j$, hence $v_i^T v_j = 0$

• in this case we can say: eigenvectors are orthogonal
• in general case ($\lambda_i$ not distinct) we must say: eigenvectors can be chosen to be orthogonal

Example: RC circuit

(figure: capacitors $c_1, \ldots, c_n$ with voltages $v_1, \ldots, v_n$ and currents $i_1, \ldots, i_n$, connected to a resistive circuit)

$$c_k \dot{v}_k = -i_k, \qquad i = Gv$$

$G = G^T \in \mathbf{R}^{n \times n}$ is conductance matrix of resistive circuit

thus $\dot{v} = -C^{-1} G v$ where $C = \mathop{\bf diag}(c_1, \ldots, c_n)$

note $-C^{-1}G$ is not symmetric

use state $x_i = \sqrt{c_i}\, v_i$, so
$$\dot{x} = C^{1/2} \dot{v} = -C^{-1/2} G C^{-1/2} x$$
where $C^{1/2} = \mathop{\bf diag}(\sqrt{c_1}, \ldots, \sqrt{c_n})$

we conclude:

• eigenvalues $\lambda_1, \ldots, \lambda_n$ of $-C^{-1/2} G C^{-1/2}$ (hence, $-C^{-1}G$) are real
• eigenvectors $q_i$ (in $x_i$ coordinates) can be chosen orthogonal
• eigenvectors in voltage coordinates, $s_i = C^{-1/2} q_i$, satisfy
$$-C^{-1} G s_i = \lambda_i s_i, \qquad s_i^T C s_j = \delta_{ij}$$
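a small numerical sketch of this symmetrization trick; the capacitances $c$ and conductance matrix $G$ below are made-up illustrative values, not from the slides:

```python
import numpy as np

# hypothetical values: 3 capacitors and a symmetric conductance matrix G
c = np.array([1.0, 2.0, 0.5])
G = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  3.0, -1.0],
              [ 0.0, -1.0,  1.5]])          # G = G^T

Ci = np.diag(1 / c)                          # C^{-1}
S = np.diag(1 / np.sqrt(c)) @ G @ np.diag(1 / np.sqrt(c))   # C^{-1/2} G C^{-1/2}, symmetric

lam_sym, Q = np.linalg.eigh(-S)              # real eigenvalues, orthonormal q_i
lam_raw = np.linalg.eigvals(-Ci @ G)         # same eigenvalues from the nonsymmetric form
print(np.allclose(np.sort(lam_sym), np.sort(lam_raw.real)))   # True

# eigenvectors in voltage coordinates: s_i = C^{-1/2} q_i, with s_i^T C s_j = delta_ij
Svec = np.diag(1 / np.sqrt(c)) @ Q
print(np.allclose(Svec.T @ np.diag(c) @ Svec, np.eye(3)))     # True
```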

Quadratic forms

a function $f : \mathbf{R}^n \to \mathbf{R}$ of the form
$$f(x) = x^T A x = \sum_{i,j=1}^n A_{ij} x_i x_j$$
is called a quadratic form

in a quadratic form we may as well assume $A = A^T$ since
$$x^T A x = x^T \left( (A + A^T)/2 \right) x$$
($(A + A^T)/2$ is called the symmetric part of $A$)

uniqueness: if $x^T A x = x^T B x$ for all $x \in \mathbf{R}^n$ and $A = A^T$, $B = B^T$, then $A = B$
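a one-line check of the symmetric-part identity in NumPy (random $A$, for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))              # not symmetric in general
A_sym = (A + A.T) / 2                        # symmetric part of A

x = rng.standard_normal(4)
print(np.isclose(x @ A @ x, x @ A_sym @ x))  # True: same quadratic form
```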

Examples

• $\|Bx\|^2 = x^T B^T B x$
• $\sum_{i=1}^{n-1} (x_{i+1} - x_i)^2$
• $\|Fx\|^2 - \|Gx\|^2$

sets defined by quadratic forms:

• $\{\, x \mid f(x) = a \,\}$ is called a quadratic surface
• $\{\, x \mid f(x) \leq a \,\}$ is called a quadratic region


Inequalities for quadratic forms

suppose $A = A^T$, $A = Q \Lambda Q^T$ with eigenvalues sorted so $\lambda_1 \geq \cdots \geq \lambda_n$

$$\begin{aligned}
x^T A x &= x^T Q \Lambda Q^T x \\
&= (Q^T x)^T \Lambda (Q^T x) \\
&= \sum_{i=1}^n \lambda_i (q_i^T x)^2 \\
&\leq \lambda_1 \sum_{i=1}^n (q_i^T x)^2 \\
&= \lambda_1 \|x\|^2
\end{aligned}$$

i.e., we have $x^T A x \leq \lambda_1 x^T x$

similar argument shows $x^T A x \geq \lambda_n \|x\|^2$, so we have
$$\lambda_n x^T x \leq x^T A x \leq \lambda_1 x^T x$$

sometimes $\lambda_1$ is called $\lambda_{\max}$, $\lambda_n$ is called $\lambda_{\min}$

note also that
$$q_1^T A q_1 = \lambda_1 \|q_1\|^2, \qquad q_n^T A q_n = \lambda_n \|q_n\|^2,$$
so the inequalities are tight
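a quick NumPy check of these bounds on random vectors (illustrative, not a proof):

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((5, 5)); A = (B + B.T) / 2
lam = np.linalg.eigvalsh(A)                  # ascending: lam[0] = lambda_min, lam[-1] = lambda_max

# the Rayleigh quotient x^T A x / x^T x always lies in [lambda_min, lambda_max]
for _ in range(1000):
    x = rng.standard_normal(5)
    r = (x @ A @ x) / (x @ x)
    assert lam[0] - 1e-12 <= r <= lam[-1] + 1e-12
print("all Rayleigh quotients within [lambda_min, lambda_max]")
```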


Positive semidefinite and positive definite matrices

suppose $A = A^T \in \mathbf{R}^{n \times n}$

we say $A$ is positive semidefinite if $x^T A x \geq 0$ for all $x$

• denoted $A \geq 0$ (and sometimes $A \succeq 0$)
• $A \geq 0$ if and only if $\lambda_{\min}(A) \geq 0$, i.e., all eigenvalues are nonnegative
• not the same as $A_{ij} \geq 0$ for all $i, j$

we say $A$ is positive definite if $x^T A x > 0$ for all $x \neq 0$

• denoted $A > 0$
• $A > 0$ if and only if $\lambda_{\min}(A) > 0$, i.e., all eigenvalues are positive
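a minimal sketch of an eigenvalue-based test, with an example showing that entrywise nonnegativity is not the same as $A \geq 0$:

```python
import numpy as np

def is_psd(A, tol=1e-10):
    """Check positive semidefiniteness of a symmetric matrix via its eigenvalues."""
    return np.all(np.linalg.eigvalsh(A) >= -tol)

# a matrix with all positive entries that is NOT positive semidefinite
A = np.array([[1.0, 2.0], [2.0, 1.0]])      # eigenvalues 3 and -1
print(is_psd(A))                             # False
print(is_psd(A.T @ A))                       # True: B^T B >= 0 always
```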

Matrix inequalities

• we say $A$ is negative semidefinite if $-A \geq 0$
• we say $A$ is negative definite if $-A > 0$
• otherwise, we say $A$ is indefinite

matrix inequality: if $B = B^T \in \mathbf{R}^{n \times n}$ we say $A \geq B$ if $A - B \geq 0$, $A < B$ if $B - A > 0$, etc.

for example:

• $A \geq 0$ means $A$ is positive semidefinite
• $A > B$ means $x^T A x > x^T B x$ for all $x \neq 0$

many properties that you'd guess hold actually do, e.g.,

• if $A \geq B$ and $C \geq D$, then $A + C \geq B + D$
• if $B \leq 0$ then $A + B \leq A$
• if $A \geq 0$ and $\alpha \geq 0$, then $\alpha A \geq 0$
• $A^2 \geq 0$
• if $A > 0$, then $A^{-1} > 0$

matrix inequality is only a partial order: we can have
$$A \not\geq B, \qquad B \not\geq A$$
(such matrices are called incomparable)
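a concrete pair of incomparable matrices (an illustrative example, not from the slides):

```python
import numpy as np

# two symmetric matrices that are incomparable: neither A - B nor B - A is PSD
A = np.diag([2.0, 0.0])
B = np.diag([0.0, 2.0])
print(np.linalg.eigvalsh(A - B))   # [-2.  2.]: indefinite, so A >= B fails
print(np.linalg.eigvalsh(B - A))   # [-2.  2.]: indefinite, so B >= A fails
```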


Ellipsoids

if $A = A^T > 0$, the set
$$\mathcal{E} = \{\, x \mid x^T A x \leq 1 \,\}$$
is an ellipsoid in $\mathbf{R}^n$, centered at 0

(figure: ellipse with semi-axes $s_1$, $s_2$)

semi-axes are given by $s_i = \lambda_i^{-1/2} q_i$, i.e.:

• eigenvectors determine directions of semiaxes
• eigenvalues determine lengths of semiaxes

note:

• in direction $q_1$, $x^T A x$ is large, hence ellipsoid is thin in direction $q_1$
• in direction $q_n$, $x^T A x$ is small, hence ellipsoid is fat in direction $q_n$
• $\sqrt{\lambda_{\max}/\lambda_{\min}}$ gives maximum eccentricity

if $\tilde{\mathcal{E}} = \{\, x \mid x^T B x \leq 1 \,\}$, where $B > 0$, then $\mathcal{E} \subseteq \tilde{\mathcal{E}} \iff A \geq B$
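a short sketch computing the semi-axes for an illustrative $A > 0$:

```python
import numpy as np

# semi-axes of the ellipsoid { x | x^T A x <= 1 } for an illustrative A > 0
A = np.array([[5.0, 1.0], [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)                   # lam ascending, all positive here

for i in range(2):
    s = Q[:, i] / np.sqrt(lam[i])            # semi-axis s_i = lambda_i^{-1/2} q_i
    print(f"semi-axis {i}: length {1/np.sqrt(lam[i]):.3f}, direction {Q[:, i]}")
    print(np.isclose(s @ A @ s, 1.0))        # True: endpoint lies on the boundary
```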



Gain of a matrix in a direction

suppose $A \in \mathbf{R}^{m \times n}$ (not necessarily square or symmetric)

for $x \in \mathbf{R}^n$, $\|Ax\|/\|x\|$ gives the amplification factor or gain of $A$ in the direction $x$

obviously, gain varies with direction of input $x$

questions:

• what is maximum gain of $A$ (and corresponding maximum gain direction)?
• what is minimum gain of $A$ (and corresponding minimum gain direction)?
• how does gain of $A$ vary with direction?

Matrix norm

the maximum gain
$$\max_{x \neq 0} \frac{\|Ax\|}{\|x\|}$$
is called the matrix norm or spectral norm of $A$ and is denoted $\|A\|$

$$\max_{x \neq 0} \frac{\|Ax\|^2}{\|x\|^2} = \max_{x \neq 0} \frac{x^T A^T A x}{\|x\|^2} = \lambda_{\max}(A^T A)$$

so we have $\|A\| = \sqrt{\lambda_{\max}(A^T A)}$

similarly the minimum gain is given by
$$\min_{x \neq 0} \|Ax\|/\|x\| = \sqrt{\lambda_{\min}(A^T A)}$$

note that

• $A^T A \in \mathbf{R}^{n \times n}$ is symmetric and $A^T A \geq 0$ so $\lambda_{\min}, \lambda_{\max} \geq 0$
• max gain input direction is $x = q_1$, eigenvector of $A^T A$ associated with $\lambda_{\max}$
• min gain input direction is $x = q_n$, eigenvector of $A^T A$ associated with $\lambda_{\min}$
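a minimal NumPy sketch of these facts (random $A$ for illustration; np.linalg.norm with ord 2 computes the spectral norm):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((3, 2))

lam, Q = np.linalg.eigh(A.T @ A)             # ascending eigenvalues of A^T A
max_gain, min_gain = np.sqrt(lam[-1]), np.sqrt(lam[0])
print(np.isclose(max_gain, np.linalg.norm(A, 2)))          # True: spectral norm
print(np.isclose(np.linalg.norm(A @ Q[:, -1]), max_gain))  # gain along q_1
print(np.isclose(np.linalg.norm(A @ Q[:, 0]), min_gain))   # gain along q_n
```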


example: $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}$

$$A^T A = \begin{bmatrix} 35 & 44 \\ 44 & 56 \end{bmatrix}
= \begin{bmatrix} 0.620 & 0.785 \\ 0.785 & -0.620 \end{bmatrix}
\begin{bmatrix} 90.7 & 0 \\ 0 & 0.265 \end{bmatrix}
\begin{bmatrix} 0.620 & 0.785 \\ 0.785 & -0.620 \end{bmatrix}^T$$

then $\|A\| = \sqrt{\lambda_{\max}(A^T A)} = 9.53$:

$$\left\| \begin{bmatrix} 0.620 \\ 0.785 \end{bmatrix} \right\| = 1, \qquad
\left\| A \begin{bmatrix} 0.620 \\ 0.785 \end{bmatrix} \right\|
= \left\| \begin{bmatrix} 2.18 \\ 4.99 \\ 7.78 \end{bmatrix} \right\| = 9.53$$


min gain is $\sqrt{\lambda_{\min}(A^T A)} = 0.514$:

$$\left\| \begin{bmatrix} 0.785 \\ -0.620 \end{bmatrix} \right\| = 1, \qquad
\left\| A \begin{bmatrix} 0.785 \\ -0.620 \end{bmatrix} \right\|
= \left\| \begin{bmatrix} -0.46 \\ -0.14 \\ 0.18 \end{bmatrix} \right\| = 0.514$$

for all $x \neq 0$, we have
$$0.514 \leq \frac{\|Ax\|}{\|x\|} \leq 9.53$$
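the whole example can be checked in a few lines of NumPy:

```python
import numpy as np

# verify the 3x2 example numerically
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
lam, Q = np.linalg.eigh(A.T @ A)
print(lam)                                    # approx [0.265, 90.7]
print(np.sqrt(lam[-1]), np.sqrt(lam[0]))      # approx 9.53 and 0.514
print(np.linalg.norm(A @ Q[:, -1]))           # max gain, approx 9.53
print(np.linalg.norm(A @ Q[:, 0]))            # min gain, approx 0.514
```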


Properties of matrix norm

• consistent with vector norm: matrix norm of $a \in \mathbf{R}^{n \times 1}$ is $\sqrt{\lambda_{\max}(a^T a)} = \sqrt{a^T a}$
• for any $x$, $\|Ax\| \leq \|A\| \|x\|$
• scaling: $\|aA\| = |a| \, \|A\|$
• triangle inequality: $\|A + B\| \leq \|A\| + \|B\|$
• definiteness: $\|A\| = 0 \iff A = 0$
• norm of product: $\|AB\| \leq \|A\| \|B\|$
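a quick numerical spot-check of a few of these properties on random matrices (illustrative, not a proof):

```python
import numpy as np

rng = np.random.default_rng(6)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
nrm = lambda M: np.linalg.norm(M, 2)          # spectral norm

print(nrm(A + B) <= nrm(A) + nrm(B) + 1e-12)  # triangle inequality
print(nrm(A @ B) <= nrm(A) * nrm(B) + 1e-12)  # norm of product
x = rng.standard_normal(3)
print(np.linalg.norm(A @ x) <= nrm(A) * np.linalg.norm(x) + 1e-12)  # ||Ax|| <= ||A|| ||x||
```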


Singular value decomposition

more complete picture of gain properties of $A$ given by singular value decomposition (SVD) of $A$:

$$A = U \Sigma V^T$$

where

• $A \in \mathbf{R}^{m \times n}$, $\mathop{\bf Rank}(A) = r$
• $U \in \mathbf{R}^{m \times r}$, $U^T U = I$
• $V \in \mathbf{R}^{n \times r}$, $V^T V = I$
• $\Sigma = \mathop{\bf diag}(\sigma_1, \ldots, \sigma_r)$, where $\sigma_1 \geq \cdots \geq \sigma_r > 0$
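in NumPy, np.linalg.svd with full_matrices=False computes a reduced SVD close to this form (a sketch; the random matrix is illustrative, and see the caveat in the comments about rank-deficient $A$):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 3))   # here r = 3

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # reduced SVD; s is descending
print(s)                                           # sigma_1 >= ... >= sigma_r > 0
print(np.allclose(A, U @ np.diag(s) @ Vt))         # True: A = U Sigma V^T
# note: for rank-deficient A, NumPy still returns min(m, n) singular values,
# some of them (numerically) zero; the compact SVD above keeps only the r nonzero ones
```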

with $U = [u_1 \ \cdots \ u_r]$, $V = [v_1 \ \cdots \ v_r]$,

$$A = U \Sigma V^T = \sum_{i=1}^r \sigma_i u_i v_i^T$$

• $\sigma_i$ are the (nonzero) singular values of $A$
• $v_i$ are the right or input singular vectors of $A$
• $u_i$ are the left or output singular vectors of $A$


$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma^2 V^T$$

hence:

• $v_i$ are eigenvectors of $A^T A$ (corresponding to nonzero eigenvalues)
• $\sigma_i = \sqrt{\lambda_i(A^T A)}$ (and $\lambda_i(A^T A) = 0$ for $i > r$)
• $\|A\| = \sigma_1$


similarly,
$$A A^T = (U \Sigma V^T)(U \Sigma V^T)^T = U \Sigma^2 U^T$$

hence:

• $u_i$ are eigenvectors of $A A^T$ (corresponding to nonzero eigenvalues)
• $\sigma_i = \sqrt{\lambda_i(A A^T)}$ (and $\lambda_i(A A^T) = 0$ for $i > r$)
• $u_1, \ldots, u_r$ are orthonormal basis for $\mathop{\bf range}(A)$
• $v_1, \ldots, v_r$ are orthonormal basis for $\mathcal{N}(A)^\perp$


Interpretations

$$A = U \Sigma V^T = \sum_{i=1}^r \sigma_i u_i v_i^T$$

(block diagram: $x \to V^T \to V^T x \to \Sigma \to \Sigma V^T x \to U \to Ax$)

linear mapping $y = Ax$ can be decomposed as

• compute coefficients of $x$ along input directions $v_1, \ldots, v_r$
• scale coefficients by $\sigma_i$
• reconstitute along output directions $u_1, \ldots, u_r$

difference with eigenvalue decomposition for symmetric $A$: input and output directions are different
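a short sketch of the three-step SVD decomposition of $y = Ax$ (random $A$, for illustration):

```python
import numpy as np

def apply_svd(A, x):
    """Apply y = A x in three steps via the reduced SVD A = U Sigma V^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    c = Vt @ x           # coefficients of x along input directions v_i
    c = s * c            # scale coefficients by sigma_i
    return U @ c         # reconstitute along output directions u_i

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 3)); x = rng.standard_normal(3)
print(np.allclose(apply_svd(A, x), A @ x))   # True
```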

• $v_1$ is most sensitive (highest gain) input direction
• $u_1$ is highest gain output direction
• $Av_1 = \sigma_1 u_1$


SVD gives clearer picture of gain as function of input/output directions

example: consider $A \in \mathbf{R}^{4 \times 4}$ with $\Sigma = \mathop{\bf diag}(10, 7, 0.1, 0.05)$

• input components along directions $v_1$ and $v_2$ are amplified (by about 10) and come out mostly along plane spanned by $u_1$, $u_2$
• input components along directions $v_3$ and $v_4$ are attenuated (by about 10)
• $\|Ax\|/\|x\|$ can range between 10 and 0.05
• $A$ is nonsingular
• for some applications you might say $A$ is effectively rank 2
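a sketch constructing such an $A$ (the orthogonal $U$ and $V$ here are random, chosen only for illustration) and checking the "effectively rank 2" claim via the best rank-2 approximation:

```python
import numpy as np

# construct an A with the stated gain profile Sigma = diag(10, 7, 0.1, 0.05)
rng = np.random.default_rng(9)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal U
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal V
A = U @ np.diag([10.0, 7.0, 0.1, 0.05]) @ V.T

print(np.linalg.matrix_rank(A))                    # 4: nonsingular
# "effectively rank 2": the best rank-2 approximation is very close to A
A2 = U[:, :2] @ np.diag([10.0, 7.0]) @ V[:, :2].T
print(np.linalg.norm(A - A2, 2))                   # 0.1 = sigma_3, small vs. ||A|| = 10
```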


example: $A \in \mathbf{R}^{2 \times 2}$, with $\sigma_1 = 1$, $\sigma_2 = 0.5$

• resolve $x$ along $v_1$, $v_2$: $v_1^T x = 0.5$, $v_2^T x = 0.6$, i.e., $x = 0.5v_1 + 0.6v_2$
• now form $Ax = (v_1^T x)\sigma_1 u_1 + (v_2^T x)\sigma_2 u_2 = (0.5)(1)u_1 + (0.6)(0.5)u_2$

(figure: $x$ resolved along $v_1$, $v_2$; $Ax$ expressed along $u_1$, $u_2$)
