
Properties of Eigenvalues

(P1) A square matrix and its transpose have the same eigenvalues.

Proof : Let $\lambda$ be an eigenvalue of a square matrix A. Since
$|A^T - \lambda I| = |(A - \lambda I)^T| = |A - \lambda I| = 0$, the matrices A and $A^T$ have the same eigenvalues.

(P2) The eigenvalues of a triangular matrix are its diagonal elements.

(P3) The eigenvalues of an idempotent matrix are either zero or unity.


(Note : A matrix A is an idempotent matrix if $A^2 = A$.)

(P4) The sum of the eigenvalues of a matrix A is equal to its Trace.

Proof : We know that the sum of the eigenvalues of a matrix = sum of the roots of the characteristic equation
$= \dfrac{-(-S_1)}{1} = S_1 = \operatorname{Trace}(A)$.

(P5) The product of the eigenvalues of a matrix A is equal to its determinant.

Proof : We know that the product of the eigenvalues of a matrix = product of the roots of the characteristic equation
$= (-1)^n \cdot \dfrac{(-1)^n S_n}{1} = (-1)^{2n}\,|A| = |A|$, since $S_n = |A|$.

(Note : In terms of the eigenvalues, the matrix A is non-singular if all its eigenvalues are non-zero. If at least one eigenvalue is zero, then A is a singular matrix.)
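
Both P4 and P5 can be checked numerically. Below is a minimal sketch assuming NumPy is available; the example matrix is an arbitrary illustration, not taken from the notes.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)

# P4: the sum of the eigenvalues equals the trace.
assert np.isclose(eigenvalues.sum(), np.trace(A))

# P5: the product of the eigenvalues equals the determinant.
assert np.isclose(eigenvalues.prod(), np.linalg.det(A))

print(eigenvalues, np.trace(A), np.linalg.det(A))  # [1, 5], 6, 5
```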
(P6) If $\lambda_1, \lambda_2, \dots, \lambda_n$ are the eigenvalues of a matrix A, then

$k\lambda_1, k\lambda_2, \dots, k\lambda_n$ are the eigenvalues of $kA$, for a non-zero scalar k.

$\lambda_1^p, \lambda_2^p, \dots, \lambda_n^p$ are the eigenvalues of $A^p$, for a positive integer p.

$\dfrac{1}{\lambda_1}, \dfrac{1}{\lambda_2}, \dots, \dfrac{1}{\lambda_n}$ are the eigenvalues of $A^{-1}$, provided the eigenvalues are non-zero.

$\lambda_1 - k, \lambda_2 - k, \dots, \lambda_n - k$ are the eigenvalues of $A - kI$, where I is the identity matrix.
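
All four parts of P6 can be verified on an example. A sketch assuming NumPy, with the matrix, k, and p chosen arbitrarily:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])
k, p = 3.0, 2
lam = np.sort(np.linalg.eigvals(A))          # eigenvalues of A: [1, 5]

# Eigenvalues of kA are k * lambda_i.
assert np.allclose(np.sort(np.linalg.eigvals(k * A)), k * lam)

# Eigenvalues of A^p are lambda_i^p.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, p))), lam ** p)

# Eigenvalues of A^{-1} are 1/lambda_i (A is non-singular here).
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1 / lam))

# Eigenvalues of A - kI are lambda_i - k.
assert np.allclose(np.sort(np.linalg.eigvals(A - k * np.eye(2))), np.sort(lam - k))
```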

(P7) If $\lambda$ is an eigenvalue of a non-singular matrix A, then $\dfrac{|A|}{\lambda}$ is an eigenvalue of adjoint(A).

Proof : $A - \lambda I = AI - \lambda AA^{-1} = A[I - \lambda A^{-1}]$. Since $A^{-1} = \dfrac{\operatorname{Adj}(A)}{|A|}$, we get

$A - \lambda I = A\left[I - \lambda\,\dfrac{\operatorname{Adj}(A)}{|A|}\right] = \dfrac{-\lambda A}{|A|}\left[\operatorname{Adj}(A) - \dfrac{|A|}{\lambda} I\right]$.

As the characteristic equation of A is $|A - \lambda I| = 0$, and the factor $\dfrac{-\lambda A}{|A|}$ is non-singular (A is non-singular, so $\lambda \neq 0$), we have $\left|\operatorname{Adj}(A) - \dfrac{|A|}{\lambda} I\right| = 0$,

which is the characteristic equation of $\operatorname{Adj}(A)$, and hence $\dfrac{|A|}{\lambda}$ is an eigenvalue of $\operatorname{Adj}(A)$.
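
P7 can also be checked numerically. NumPy has no adjugate routine, so the sketch below uses the identity $\operatorname{Adj}(A) = |A|\,A^{-1}$; the matrix is an arbitrary example.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])
detA = np.linalg.det(A)
adjA = detA * np.linalg.inv(A)               # adjugate via Adj(A) = |A| A^{-1}

lam = np.sort(np.linalg.eigvals(A))
# Each |A|/lambda_i is an eigenvalue of Adj(A).
assert np.allclose(np.sort(np.linalg.eigvals(adjA)), np.sort(detA / lam))
```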

(P8) The eigenvalues of a real symmetric matrix are real.

Proof : Let $\lambda$ be an eigenvalue, and X its associated eigenvector, of a real symmetric matrix A.

Then, $AX = \lambda X$ (1)

Applying the complex conjugate, and the fact that A is real, we get

$A\bar{X} = \bar{\lambda}\bar{X}$ (2)

Applying the transpose on (2), we have $\bar{X}^T A^T = \bar{\lambda}\bar{X}^T$. As A is symmetric, we have
$\bar{X}^T A = \bar{\lambda}\bar{X}^T$. Now, post-multiplying by X, we get

$\bar{X}^T A X = \bar{\lambda}\,\bar{X}^T X$ (3)

Pre-multiplying both sides of (1) by $\bar{X}^T$, we get

$\bar{X}^T A X = \lambda\,\bar{X}^T X$ (4). From (3) and (4), we have $(\lambda - \bar{\lambda})\,\bar{X}^T X = 0$. As $\bar{X}^T X > 0$ for $X \neq 0$, this gives $\lambda = \bar{\lambda}$, indicating that
$\lambda$ is real. Thus, the eigenvalues of a real symmetric matrix are real.
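
A quick numerical illustration of P8 (a sketch assuming NumPy; both matrices are arbitrary examples): the symmetric matrix has real eigenvalues, while a non-symmetric real matrix need not.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # real symmetric
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])         # real but not symmetric (a rotation)

print(np.linalg.eigvals(S))   # real: (5 +/- sqrt(5)) / 2
print(np.linalg.eigvals(R))   # complex: +/- i
```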
SIMILAR MATRICES

Let A and B be two matrices of the same order. The matrix B is said to be similar to the matrix A if there
exists a non-singular matrix P such that $B = P^{-1}AP$.
(P9) Similar matrices have the same eigenvalues.
Proof : Let A and B be two similar matrices. By definition, we have

$B = P^{-1}AP$, where P is a non-singular matrix.

Let $\lambda$ be an eigenvalue of A.

Now, $B - \lambda I = P^{-1}AP - \lambda I = P^{-1}AP - P^{-1}(\lambda I)P = P^{-1}[A - \lambda I]P$. Therefore,

$|B - \lambda I| = |P^{-1}[A - \lambda I]P| = |P^{-1}|\,|A - \lambda I|\,|P| = |A - \lambda I|\,|P^{-1}P| = |A - \lambda I|$
Since the characteristic polynomials are same, is also an eigenvalue of B.
Thus, similar matrices have the same eigenvalues.
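
P9 is easy to confirm on an example. A sketch assuming NumPy, with A and the non-singular P chosen arbitrarily:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])          # non-singular (det = 1)

B = np.linalg.inv(P) @ A @ P        # B is similar to A
assert np.allclose(np.sort(np.linalg.eigvals(B)),
                   np.sort(np.linalg.eigvals(A)))
```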

ORTHOGONAL MATRICES

Definition(s):

A matrix A is called an orthogonal matrix if $A^T A = A A^T = I$, where I is the identity matrix.


A matrix A is called an orthogonal matrix if the column (or row) vectors of A are pairwise
orthogonal unit vectors.

Property : The determinant of an orthogonal matrix is either 1 or -1.

Proof : Let A be an orthogonal matrix. Then, $|A^T A| = |I| \Rightarrow |A^T|\,|A| = 1 \Rightarrow |A|^2 = 1 \Rightarrow |A| = \pm 1$.


(P10) The eigenvalues of an orthogonal matrix A are of magnitude 1 (unit modulus).
Proof : As A is an orthogonal matrix, $A^T A = A A^T = I$.

Let $\lambda$ be an eigenvalue of A and X the associated eigenvector. Then, $AX = \lambda X$.

Pre-multiplying the above equation by $[AX]^T = X^T A^T$, we get
$X^T A^T A X = X^T A^T \lambda X$
$X^T X = \lambda\,X^T (A^T X)$ (A is orthogonal)
$X^T X = \lambda\,X^T (\lambda X)$ (A and $A^T$ have the same eigenvalues)
$(\lambda^2 - 1)\,X^T X = 0$
As $X \neq 0$, we have $\lambda = 1$ or $\lambda = -1$.

(P11) If $\lambda$ is an eigenvalue of an orthogonal matrix A, then $\dfrac{1}{\lambda}$ is also an eigenvalue of A.

Proof : If $\lambda$ is an eigenvalue of an orthogonal matrix A, then

$AX = \lambda X$

$\Rightarrow X = \lambda A^{-1}X = \lambda A^T X$, as A is orthogonal; note that $\lambda \neq 0$, since $|A| = \pm 1$ makes A non-singular.

As X is non-zero, this gives $A^T X = \dfrac{1}{\lambda}X$, so $\dfrac{1}{\lambda}$ is an eigenvalue of $A^T$. Since A and $A^T$ have the same eigenvalues, $\dfrac{1}{\lambda}$ is also an eigenvalue of A.
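
P10 and P11 can be illustrated numerically. A sketch assuming NumPy, using a rotation matrix as an arbitrary orthogonal example (its eigenvalues are $e^{\pm i\theta}$, complex of unit modulus):

```python
import numpy as np

theta = 0.3                                   # arbitrary angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(A.T @ A, np.eye(2))        # A is orthogonal

lam = np.linalg.eigvals(A)                    # e^{+i theta}, e^{-i theta}
assert np.allclose(np.abs(lam), 1.0)          # P10: unit modulus

# P11: the reciprocals form the same set of eigenvalues.
assert np.allclose(np.sort_complex(1 / lam), np.sort_complex(lam))
```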

Problems

1. Find the eigenvalues of (i) $2A^2$, if $A = \begin{pmatrix} 4 & 1 \\ 3 & 2 \end{pmatrix}$; (ii) $3A^{-1}$, if $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$.

2. Find the sum and product of the eigenvalues of $A = \begin{pmatrix} 1 & 2 & 2 \\ 1 & 0 & 3 \\ 2 & 1 & 3 \end{pmatrix}$.

3. Two eigenvalues of $A = \begin{pmatrix} 2 & 2 & 1 \\ 1 & 3 & 1 \\ 1 & 2 & 2 \end{pmatrix}$ are equal to 1 each. Find the third eigenvalue.

4. If the sum of two eigenvalues and the trace of a 3 x 3 matrix A are equal, find the value of $|A|$.

5. Find the sum of the eigenvalues of the inverse of $A = \begin{pmatrix} 3 & 0 & 0 \\ 8 & 4 & 0 \\ 6 & 2 & 5 \end{pmatrix}$.
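
Answers to such problems can be cross-checked numerically. For instance, in problem 3 the third eigenvalue must be $\operatorname{Trace}(A) - 2 = 7 - 2 = 5$ by P4; a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 2.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 2.0, 2.0]])

print(np.sort(np.linalg.eigvals(A)))   # expected: [1, 1, 5]
```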
Linearly dependent and independent vectors

A set of (row or column) vectors $X_1, X_2, \dots, X_n$ is said to be linearly independent if their linear
combination $\alpha_1 X_1 + \alpha_2 X_2 + \dots + \alpha_n X_n = 0$ implies the scalars $\alpha_1 = \alpha_2 = \dots = \alpha_n = 0$.

A set of vectors $X_1, X_2, \dots, X_n$ is said to be linearly dependent if they are not linearly independent.

Test for the linear dependence / independence of vectors (in terms of matrices)

Consider A to be a square matrix, having the n vectors $X_1, X_2, \dots, X_n$ as its columns (or rows). If
$|A| \neq 0$, then the vectors $X_1, X_2, \dots, X_n$ are linearly independent. If $|A| = 0$, then the vectors
$X_1, X_2, \dots, X_n$ are linearly dependent.
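
A sketch of the determinant test, assuming NumPy; the three vectors are an arbitrary example in which the third is deliberately a sum of the first two:

```python
import numpy as np

X1 = np.array([1.0, 2.0, 3.0])
X2 = np.array([0.0, 1.0, 4.0])
X3 = np.array([1.0, 3.0, 7.0])             # X3 = X1 + X2, so dependent

A = np.vstack([X1, X2, X3])                # vectors as the rows of A
print(np.isclose(np.linalg.det(A), 0.0))   # True -> linearly dependent
```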

Properties of Eigenvectors

(P1) Eigenvector corresponding to an eigenvalue of a matrix is not unique.

(P2) An eigenvector of a matrix A corresponds to only one eigenvalue of the matrix A.

[Alternatively, an eigenvector of a matrix A cannot correspond to two different eigenvalues of
the matrix A.]

(P3) Eigenvectors corresponding to distinct eigenvalues of a matrix are linearly independent.

Proof : Let $\lambda_1$ and $\lambda_2$ be two different eigenvalues of A. Let $X_1$ and $X_2$ be the
corresponding eigenvectors. Then, $AX_1 = \lambda_1 X_1$ and $AX_2 = \lambda_2 X_2$.

Consider the linear combination $aX_1 + bX_2 = 0$ (1)

Now, $A[aX_1 + bX_2] = 0 \Rightarrow a\lambda_1 X_1 + b\lambda_2 X_2 = 0$ (2)

$\lambda_1$ times (1) gives $a\lambda_1 X_1 + b\lambda_1 X_2 = 0$ (3)

(3)-(2) gives $b(\lambda_1 - \lambda_2)X_2 = 0 \Rightarrow b = 0$

Hence, (1) will give $a = 0$.
Hence, the eigenvectors are linearly independent.
[Note : (i) If all the eigenvalues of a square matrix are different, then the corresponding
eigenvectors are linearly independent; (ii) Eigenvectors corresponding to repeated
eigenvalues of a matrix may be linearly independent or dependent.]
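
P3 connects to the determinant test above: for a matrix with distinct eigenvalues, the matrix whose columns are the eigenvectors is non-singular. A sketch assuming NumPy, on an arbitrary example:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])           # eigenvalues 1 and 5 (distinct)

eigenvalues, V = np.linalg.eig(A)    # columns of V are eigenvectors
print(np.linalg.det(V))              # non-zero -> independent columns
```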
Orthogonal vectors

Two vectors X and Y are orthogonal vectors if their dot product $Y \cdot X$ is zero. Equivalently,
in terms of matrix multiplication, the vectors X and Y are orthogonal vectors if $Y^T X = 0$.

A set of vectors form an orthogonal set if all vectors in the set are mutually orthogonal.

Orthonormal vectors

Two vectors X and Y are orthonormal vectors if they are orthogonal and are unit vectors.

A set of vectors form an orthonormal set if all vectors in the set are of unit length and are mutually
orthogonal.

(P4) Eigenvectors corresponding to two distinct eigenvalues of a symmetric matrix are orthogonal.

Proof : Let $(\lambda_1, X)$ and $(\lambda_2, Y)$ be two eigenpairs of a symmetric matrix A, with $\lambda_1 \neq \lambda_2$. Then,

$AX = \lambda_1 X$ (1)

$AY = \lambda_2 Y$ (2)

Taking the dot product of (1) and Y, we get $AX \cdot Y = \lambda_1 X \cdot Y$. That is,

$Y^T A X = \lambda_1 Y^T X$ (3)

Taking the dot product of X and (2), we get $X \cdot AY = X \cdot \lambda_2 Y$. That is,

$Y^T A^T X = \lambda_2 Y^T X$. Since A is symmetric, we get

$Y^T A X = \lambda_2 Y^T X$ (4)

From (3) & (4), we get

$(\lambda_2 - \lambda_1)\,Y^T X = 0 \Rightarrow Y^T X = 0 \Rightarrow X \cdot Y = 0$
Hence X and Y are orthogonal.
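
A final numerical check of P4 (a sketch assuming NumPy; the symmetric matrix is an arbitrary example with distinct eigenvalues). `eigh`, NumPy's routine for symmetric matrices, returns the eigenvectors as orthonormal columns:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])           # symmetric, distinct eigenvalues

eigenvalues, V = np.linalg.eigh(A)
print(V[:, 0] @ V[:, 1])             # ~0: the eigenvectors are orthogonal
```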
