(P1) A square matrix and its transpose have the same eigenvalues.
Proof: We know that the sum of the eigenvalues of a matrix equals the sum of the roots of the characteristic equation $\lambda^n - S_1\lambda^{n-1} + S_2\lambda^{n-2} - \dots + (-1)^n S_n = 0$,
which is $\dfrac{-(-S_1)}{1} = S_1 = \mathrm{Trace}(A)$.
Proof: We know that the product of the eigenvalues of a matrix equals the product of the roots of the characteristic equation,
which is $(-1)^n \cdot \dfrac{(-1)^n S_n}{1} = (-1)^{2n} S_n = S_n = |A|$.
(Note: In terms of the eigenvalues, the matrix A is non-singular if all its eigenvalues are non-zero. If at least one eigenvalue is zero, then A is a singular matrix.)
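The two results above (sum of eigenvalues = trace, product of eigenvalues = determinant) are easy to check numerically. A minimal sketch using numpy; the matrix is an arbitrary example:

```python
import numpy as np

# Arbitrary 2x2 example matrix.
A = np.array([[4.0, 1.0],
              [3.0, 2.0]])

eigvals = np.linalg.eigvals(A)

# Sum of eigenvalues = Trace(A); product of eigenvalues = |A|.
assert np.isclose(eigvals.sum(), np.trace(A))
assert np.isclose(np.prod(eigvals), np.linalg.det(A))
```

Here the eigenvalues are 5 and 1, so their sum is 6 (the trace) and their product is 5 (the determinant).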
(P6) If $\lambda_1, \lambda_2, \dots, \lambda_n$ are the eigenvalues of a matrix A, then
$k\lambda_1, k\lambda_2, \dots, k\lambda_n$ are the eigenvalues of $kA$, for a non-zero scalar k.
$\lambda_1^p, \lambda_2^p, \dots, \lambda_n^p$ are the eigenvalues of $A^p$, for a positive integer p.
$\dfrac{1}{\lambda_1}, \dfrac{1}{\lambda_2}, \dots, \dfrac{1}{\lambda_n}$ are the eigenvalues of $A^{-1}$, provided the eigenvalues are non-zero.
$\lambda_1 - k, \lambda_2 - k, \dots, \lambda_n - k$ are the eigenvalues of $A - kI$, where I is the identity matrix.
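All four parts of (P6) can be verified numerically. A sketch using numpy, with an arbitrary symmetric example matrix (so the eigenvalues are real) and arbitrary choices $k = 3$, $p = 2$:

```python
import numpy as np

# Arbitrary symmetric example matrix (real eigenvalues).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
k, p = 3.0, 2

lam = np.sort(np.linalg.eigvals(A).real)

# kA has eigenvalues k*lambda_i.
assert np.allclose(np.sort(np.linalg.eigvals(k * A).real), np.sort(k * lam))
# A^p has eigenvalues lambda_i^p.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, p)).real),
                   np.sort(lam ** p))
# A^-1 has eigenvalues 1/lambda_i.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A)).real),
                   np.sort(1 / lam))
# A - kI has eigenvalues lambda_i - k.
assert np.allclose(np.sort(np.linalg.eigvals(A - k * np.eye(2)).real),
                   np.sort(lam - k))
```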
(P7) If $\lambda$ is an eigenvalue of a non-singular matrix A, then $\dfrac{|A|}{\lambda}$ is an eigenvalue of adj(A).
Proof: $A - \lambda I = AI - \lambda AA^{-1} = A[I - \lambda A^{-1}]$
$= A\left[I - \dfrac{\lambda\,\mathrm{adj}(A)}{|A|}\right]$, since $A^{-1} = \dfrac{\mathrm{adj}(A)}{|A|}$
$= \dfrac{-\lambda}{|A|}\,A\left[\mathrm{adj}(A) - \dfrac{|A|}{\lambda} I\right]$
Hence $|A - \lambda I| = \left(\dfrac{-\lambda}{|A|}\right)^{n} |A|\,\left|\mathrm{adj}(A) - \dfrac{|A|}{\lambda} I\right|$.
As the characteristic equation of A is $|A - \lambda I| = 0$, and $\lambda \neq 0$ because A is non-singular, we have $\left|\mathrm{adj}(A) - \dfrac{|A|}{\lambda} I\right| = 0$,
which is the characteristic equation of adj(A), and hence $\dfrac{|A|}{\lambda}$ is an eigenvalue of adj(A).
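(P7) can be checked numerically using the identity $\mathrm{adj}(A) = |A|\,A^{-1}$ for non-singular A. A sketch with an arbitrary example matrix:

```python
import numpy as np

# Arbitrary non-singular example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

detA = np.linalg.det(A)
adjA = detA * np.linalg.inv(A)   # adj(A) = |A| * A^{-1} for non-singular A

lam = np.linalg.eigvals(A)
adj_eigs = np.linalg.eigvals(adjA)

# The eigenvalues of adj(A) are |A| / lambda for each eigenvalue lambda of A.
assert np.allclose(np.sort(adj_eigs.real), np.sort((detA / lam).real))
```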
Let A and B be two matrices of the same order. The matrix B is said to be similar to the matrix A if there exists a non-singular matrix P such that $B = P^{-1}AP$.
(P9) Similar matrices have the same eigenvalues.
Proof: Let A and B be two similar matrices. By definition, we have $B = P^{-1}AP$ for some non-singular matrix P.
Let $\lambda$ be an eigenvalue of A. Then
$B - \lambda I = P^{-1}AP - \lambda P^{-1}P = P^{-1}[A - \lambda I]P$, so $|B - \lambda I| = |P^{-1}|\,|A - \lambda I|\,|P| = |A - \lambda I|\,|P^{-1}P| = |A - \lambda I|$.
Since the characteristic polynomials are the same, $\lambda$ is also an eigenvalue of B.
Thus, similar matrices have the same eigenvalues.
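A quick numerical illustration of (P9), with an arbitrary matrix A and an arbitrary non-singular P:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example matrix with eigenvalues 2 and 3 (triangular).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
# Random P, shifted by 2I so it is comfortably non-singular.
P = rng.random((2, 2)) + 2 * np.eye(2)

B = np.linalg.inv(P) @ A @ P     # B is similar to A

# Similar matrices have the same eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(B).real),
                   np.sort(np.linalg.eigvals(A).real))
```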
ORTHOGONAL MATRICES
Definition(s): A square matrix A is orthogonal if $A^T A = A A^T = I$, i.e. $A^{-1} = A^T$.
(P11) If $\lambda$ is an eigenvalue of an orthogonal matrix A, then $\dfrac{1}{\lambda}$ is also an eigenvalue of A.
Proof: If $\lambda$ is an eigenvalue of an orthogonal matrix A with eigenvector X, then
$AX = \lambda X$.
Since $|A| = \pm 1 \neq 0$, we have $\lambda \neq 0$ and $A^{-1}X = \dfrac{1}{\lambda}X$, so $\dfrac{1}{\lambda}$ is an eigenvalue of $A^{-1} = A^T$.
As X is non-zero and A and $A^T$ have the same eigenvalues (P1), $\dfrac{1}{\lambda}$ is also an eigenvalue of A.
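A numerical check of (P11) using a rotation matrix, the standard example of an orthogonal matrix (the angle here is an arbitrary choice):

```python
import numpy as np

theta = 0.7
# A rotation matrix is orthogonal: A^T A = I.
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(A.T @ A, np.eye(2))

lam = np.linalg.eigvals(A)       # e^{i*theta} and e^{-i*theta}

# The set of reciprocals {1/lambda} equals the set of eigenvalues {lambda}.
assert np.allclose(np.sort_complex(1 / lam), np.sort_complex(lam))
```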
Problems
Find the eigenvalues of (i) $2A^2$, if $A = \begin{bmatrix} 4 & 1 \\ 3 & 2 \end{bmatrix}$; (ii) $3A^{-1}$, if $A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}$.
Find the sum and product of the eigenvalues of $A = \begin{bmatrix} 1 & 2 & 2 \\ 1 & 0 & 3 \\ 2 & 1 & 3 \end{bmatrix}$.
Two eigenvalues of $A = \begin{bmatrix} 2 & 2 & 1 \\ 1 & 3 & 1 \\ 1 & 2 & 2 \end{bmatrix}$ are equal to 1 each. Find the third eigenvalue.
If the sum of two eigenvalues and the trace of a 3 × 3 matrix A are equal, find the value of $|A|$.
Find the sum of the eigenvalues of the inverse of $A = \begin{bmatrix} 3 & 0 & 0 \\ 8 & 4 & 0 \\ 6 & 2 & 5 \end{bmatrix}$.
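As a sanity check on the last problem: A is lower triangular, so its eigenvalues are the diagonal entries 3, 4, 5, and by (P6) the eigenvalues of $A^{-1}$ are their reciprocals, with sum $\frac{1}{3} + \frac{1}{4} + \frac{1}{5} = \frac{47}{60}$. A numpy sketch:

```python
import numpy as np

A = np.array([[3.0, 0.0, 0.0],
              [8.0, 4.0, 0.0],
              [6.0, 2.0, 5.0]])

# Triangular matrix: eigenvalues are the diagonal entries 3, 4, 5.
inv_eig_sum = np.linalg.eigvals(np.linalg.inv(A)).sum()

assert np.isclose(inv_eig_sum.real, 1/3 + 1/4 + 1/5)   # 47/60
```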
Linear dependent and independent vectors
A set of vectors $X_1, X_2, \dots, X_n$ is said to be linearly independent if $c_1X_1 + c_2X_2 + \dots + c_nX_n = 0$ only when $c_1 = c_2 = \dots = c_n = 0$. The set is said to be linearly dependent if it is not linearly independent, i.e. if such a relation holds with at least one $c_i \neq 0$.
Test for the linear dependence / independence of vectors (in terms of matrices)
Consider A to be a square matrix having the n vectors $X_1, X_2, \dots, X_n$ as its columns (or rows). If $|A| \neq 0$, then the vectors $X_1, X_2, \dots, X_n$ are linearly independent. If $|A| = 0$, then the vectors $X_1, X_2, \dots, X_n$ are linearly dependent.
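The determinant test above is easy to apply numerically. A sketch with two arbitrary example sets, one independent and one with $X_3 = X_1 + X_2$:

```python
import numpy as np

# Columns are the vectors X1, X2, X3.
independent = np.array([[1.0, 0.0, 1.0],
                        [0.0, 1.0, 1.0],
                        [0.0, 0.0, 1.0]])
dependent = np.array([[1.0, 2.0, 3.0],   # third column = first + second
                      [1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])

assert not np.isclose(np.linalg.det(independent), 0.0)  # |A| != 0 -> independent
assert np.isclose(np.linalg.det(dependent), 0.0)        # |A| = 0  -> dependent
```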
Properties of Eigenvectors
Two vectors X and Y are orthogonal if their dot product is zero: $Y \cdot X = 0$. Equivalently, in terms of matrix multiplication, X and Y are orthogonal if $Y^T X = 0$.
A set of vectors form an orthogonal set if all vectors in the set are mutually orthogonal.
Orthonormal vectors
Two vectors X and Y are orthonormal vectors, if they are orthogonal and unit vectors.
A set of vectors form an orthonormal set if all vectors in the set are of unit length and are mutually
orthogonal.
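A compact way to test both conditions at once: stack the vectors as the columns of a matrix Q; the set is orthonormal exactly when $Q^T Q = I$ (unit diagonal gives unit lengths, zero off-diagonal gives mutual orthogonality). A sketch with an arbitrary example pair:

```python
import numpy as np

# An orthonormal set: mutually orthogonal unit vectors (columns of Q).
q1 = np.array([1.0, 1.0]) / np.sqrt(2)
q2 = np.array([1.0, -1.0]) / np.sqrt(2)
Q = np.column_stack([q1, q2])

# Orthonormal columns <=> Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(2))
```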
(P4) Eigenvectors corresponding to two distinct eigenvalues of a symmetric matrix are orthogonal.
Proof: Let $(\lambda_1, X)$ and $(\lambda_2, Y)$ be two eigenpairs of a symmetric matrix A, with $\lambda_1 \neq \lambda_2$. Then,
$AX = \lambda_1 X$ (1)
$AY = \lambda_2 Y$ (2)
Premultiplying (1) by $Y^T$ gives $Y^T A X = \lambda_1 Y^T X$ (3)
Transposing (2) and using $A^T = A$ gives $Y^T A = \lambda_2 Y^T$; postmultiplying by X, $Y^T A X = \lambda_2 Y^T X$ (4)
Subtracting (3) from (4): $(\lambda_2 - \lambda_1) Y^T X = 0 \Rightarrow Y^T X = 0 \Rightarrow X \cdot Y = 0$.
Hence X and Y are orthogonal.
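(P4) can be illustrated numerically: for a symmetric matrix with distinct eigenvalues, the eigenvectors returned by a general eigensolver come out orthogonal. A sketch with an arbitrary symmetric example:

```python
import numpy as np

# Arbitrary symmetric example matrix with distinct eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

lam, V = np.linalg.eig(A)        # columns of V are eigenvectors
X, Y = V[:, 0], V[:, 1]

assert not np.isclose(lam[0], lam[1])   # distinct eigenvalues
assert np.isclose(Y @ X, 0.0)           # Y^T X = 0: eigenvectors orthogonal
```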