
System of Linear Equations

For a system of linear equations AX = b, recall that

the (known) matrix A is called the coefficient matrix,

the (unknown) column vector X is called the solution vector, and

the (known) column vector b is called the constant vector.

If b = 0, then the system is called a homogeneous system.
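
For a concrete illustration (a minimal sketch, not part of the original notes; the numbers and the use of numpy are assumptions), the code below sets up a small system AX = b and solves it, and also shows that the homogeneous case b = 0 admits the trivial solution X = 0.

```python
import numpy as np

# Coefficient matrix A and constant vector b (arbitrary illustrative values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# Solution vector X of the system AX = b.
X = np.linalg.solve(A, b)
print(X)                                  # [0.8 1.4]

# Homogeneous case: b = 0 always admits the trivial solution X = 0.
print(np.linalg.solve(A, np.zeros(2)))    # [0. 0.]
```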

Solution of a Homogeneous System

Consider a homogeneous system of n linear equations of the form AX = 0, in which A is the coefficient matrix and X = [x_1, x_2, ..., x_i, ..., x_n]^T is the solution (column) vector.

If |A| ≠ 0, then the system will have only the trivial (or zero) solution.

If |A| = 0, then the system will have infinitely many non-trivial (or non-zero) solutions, given by
\[
\frac{x_1}{C_{i1}} = \frac{x_2}{C_{i2}} = \cdots = \frac{x_j}{C_{ij}} = \cdots = \frac{x_n}{C_{in}} = k,
\]
where k is an arbitrary constant and C_ij denotes the cofactor of the (i, j)-th element of A.

Note : For a non-trivial solution, choose a suitable row of the system (say, row i) such that at least one of the cofactors in that row is non-zero.
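
As a numerical sketch of this cofactor rule (added for illustration only; numpy and the example matrix are assumptions), take a singular 3×3 matrix and form a non-trivial solution from the cofactors of one suitable row:

```python
import numpy as np

# Singular coefficient matrix (|A| = 0), so AX = 0 has non-trivial solutions.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])

def cofactor(M, i, j):
    """Cofactor C_ij: (-1)**(i+j) times the minor formed by deleting row i and column j."""
    minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

i = 1                                            # row i = 2 (0-based index 1); its cofactors are not all zero
x = np.array([cofactor(A, i, j) for j in range(3)])
print(x)                                         # [ 1. -2.  1.]  (any multiple k*x also works)
print(A @ x)                                     # ~[0. 0. 0.], confirming a non-trivial solution
```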

Principal minor of a matrix

A principal minor of a square matrix A is a minor M of the matrix A in which the diagonal elements of the minor M are diagonal elements of the matrix A (that is, M is obtained by deleting the same set of rows and columns from A).
For a square matrix of order n, the orders of the principal minors vary from 1 to n.
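
To make this concrete, the sketch below (illustrative only; numpy and itertools are assumptions) enumerates the principal minors of each order for a 3×3 matrix by keeping the same set of rows and columns; the sums printed here reappear later as the coefficients S_k of the characteristic equation.

```python
import numpy as np
from itertools import combinations

A = np.array([[1.0, 1.0, 3.0],
              [1.0, 5.0, 1.0],
              [3.0, 1.0, 1.0]])
n = A.shape[0]

# A principal minor of order k is the determinant of the submatrix obtained by
# keeping the SAME k row and column indices, so its diagonal lies on A's diagonal.
for k in range(1, n + 1):
    minors = [np.linalg.det(A[np.ix_(idx, idx)]) for idx in combinations(range(n), k)]
    print(f"order {k}: {[round(m, 4) for m in minors]}, sum = {round(sum(minors), 4)}")
```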

Linear dependence and independence of vectors

Let A be a square matrix having the n vectors X_1, X_2, ..., X_n as its columns (or rows).
If |A| ≠ 0, then the vectors X_1, X_2, ..., X_n are said to be linearly independent. If |A| = 0, then the vectors X_1, X_2, ..., X_n are said to be linearly dependent.
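
A quick numerical check of this determinant test (an illustrative sketch, not part of the notes; numpy and the sample vectors are assumptions):

```python
import numpy as np

# Columns of each matrix are the vectors X1, X2, X3.
independent = np.array([[1.0, 0.0, 1.0],
                        [0.0, 1.0, 1.0],
                        [0.0, 0.0, 1.0]])
dependent = np.array([[1.0, 2.0, 3.0],      # third column = first column + second column
                      [4.0, 5.0, 9.0],
                      [7.0, 8.0, 15.0]])

for name, A in [("independent", independent), ("dependent", dependent)]:
    # |A| != 0 -> columns linearly independent; |A| = 0 -> linearly dependent.
    print(name, "|A| =", round(np.linalg.det(A), 6))
```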
Eigenvalue Problem

Consider a column vector X in R^n and a square matrix A of order n. The vector AX is also a column vector in R^n.

In general, the vectors X and AX are not necessarily parallel. If they are parallel, then for some scalar λ we get the system of linear equations AX = λX, which can also be written as [A - λI]X = 0. The determination of the scalar λ and the non-zero solutions of this homogeneous system constitute the eigenvalue problem.
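
To see the parallel/non-parallel distinction concretely, here is a minimal sketch (added for illustration; numpy is an assumption) using the 2×2 matrix of practice problem (a) below:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])

# For a generic X, AX points in a different direction from X ...
X = np.array([1.0, 0.0])
print(A @ X)          # [4. 3.]  -- not a scalar multiple of [1, 0]

# ... but for an eigenvector, AX is exactly lambda * X (here lambda = 5).
X = np.array([1.0, 1.0])
print(A @ X)          # [5. 5.]  = 5 * X
```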

Eigenvalues and Eigenvectors

The scalar λ is called an eigenvalue of A if there is a non-zero vector X such that AX = λX, or [A - λI]X = 0.
Any non-zero vector X satisfying this relationship is called an eigenvector associated with the eigenvalue λ.

Notes :

The matrix A - λI is called the characteristic matrix of A, and its determinant |A - λI| is called the characteristic polynomial of A, denoted by P_A(λ).

The equation |A - λI| = 0 is called the characteristic equation of A, and the roots of this equation are called the eigenvalues of A.

The system [A - λI]X = 0 will have non-trivial solutions if |A - λI| = 0. Each non-zero solution of this system is called an eigenvector of A associated with λ.

An eigenvalue is also known as a characteristic root or latent root. Similarly, an eigenvector is also known as a characteristic vector or latent vector.
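
These definitions can be checked numerically; the sketch below (illustrative only; numpy is an assumption) computes the eigenvalues and eigenvectors of the matrix from practice problem (a) and verifies the defining relation AX = λX for each pair.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [3.0, 2.0]])

# Eigenvalues and eigenvectors (stored column-wise) of A.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                            # [5. 1.]  (order may vary)

for lam, X in zip(eigenvalues, eigenvectors.T):
    # Verify AX = lambda*X, i.e. [A - lambda*I]X = 0, for each eigenpair.
    print(lam, np.allclose(A @ X, lam * X))   # True for both pairs
```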

Method for finding the characteristic equation

If A is a square matrix of order n, then its characteristic equation |A - λI| = 0 can be written simply as
\[
\lambda^n - S_1 \lambda^{n-1} + S_2 \lambda^{n-2} - \cdots + (-1)^n S_n = 0,
\]
where S_n = |A| and S_k = sum of the principal minors of order k, for k = 1, 2, ..., n - 1.
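
The sketch below (illustrative; numpy and itertools are assumptions) builds the characteristic equation from these principal-minor sums for the matrix of practice problem (c) and compares the result with numpy's own characteristic-polynomial coefficients.

```python
import numpy as np
from itertools import combinations

def char_poly_coeffs(A):
    """Coefficients of lambda^n - S1*lambda^(n-1) + S2*lambda^(n-2) - ... + (-1)**n * Sn,
    where S_k is the sum of the principal minors of order k of A."""
    n = A.shape[0]
    S = [sum(np.linalg.det(A[np.ix_(idx, idx)]) for idx in combinations(range(n), k))
         for k in range(1, n + 1)]
    return [1.0] + [(-1) ** k * S[k - 1] for k in range(1, n + 1)]

# Matrix from practice problem (c) below.
A = np.array([[2.0, 2.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 2.0, 2.0]])

print(char_poly_coeffs(A))            # ~[1, -7, 11, -5]  ->  lambda^3 - 7*lambda^2 + 11*lambda - 5 = 0
print(np.roots(char_poly_coeffs(A)))  # roots ~5, 1, 1 = the eigenvalues
print(np.poly(A))                     # numpy's characteristic-polynomial coefficients, for comparison
```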


Worked Problem

1. Find the eigenvalues and eigenvectors of the matrix
\[
A = \begin{pmatrix} 1 & 1 & 3 \\ 1 & 5 & 1 \\ 3 & 1 & 1 \end{pmatrix}.
\]
Sol : Let λ be an eigenvalue of A and X = [x, y, z]^T an associated eigenvector of A. We need to solve the homogeneous system [A - λI]X = 0, that is,

\[
\begin{pmatrix} 1-\lambda & 1 & 3 \\ 1 & 5-\lambda & 1 \\ 3 & 1 & 1-\lambda \end{pmatrix}
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}. \qquad (1)
\]

The characteristic equation of A is
\[
\lambda^3 - S_1 \lambda^2 + S_2 \lambda - S_3 = 0.
\]
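
As a numerical cross-check of this step (an illustrative sketch, not part of the original solution; numpy and itertools are assumptions), the principal-minor sums S_1, S_2, S_3 and the resulting eigenvalues of this A can be computed as follows; the eigenvalues agree with those listed for matrix (b) in the practice table below.

```python
import numpy as np
from itertools import combinations

A = np.array([[1.0, 1.0, 3.0],
              [1.0, 5.0, 1.0],
              [3.0, 1.0, 1.0]])

# S1, S2, S3: sums of the principal minors of orders 1, 2, 3.
S = [sum(np.linalg.det(A[np.ix_(idx, idx)]) for idx in combinations(range(3), k))
     for k in (1, 2, 3)]
print(S)                                    # ~[7.0, 0.0, -36.0]

# Characteristic equation: lambda^3 - 7*lambda^2 + 0*lambda + 36 = 0.
print(np.roots([1.0, -S[0], S[1], -S[2]]))  # eigenvalues -2, 3, 6 (in some order)
```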

Problems for practice :

1. Find the eigenvalues and eigenvectors of each matrix A shown in the following table.

No.  Matrix A (rows listed in order)           Eigenvalues   Associated eigenvectors

(a)  [4, 1], [3, 2]                            1, 5          [1, -3]^T ; [1, 1]^T
(b)  [1, 1, 3], [1, 5, 1], [3, 1, 1]           -2, 3, 6      [1, 0, -1]^T ; [1, -1, 1]^T ; [1, 2, 1]^T
(c)  [2, 2, 1], [1, 3, 1], [1, 2, 2]           5, 1, 1       [1, 1, 1]^T ; [-2, 1, 0]^T ; [1, 0, -1]^T
(d)  [3, 10, 5], [-2, -3, -4], [3, 5, 7]       3, 2, 2       [1, 1, -2]^T ; [5, 2, -5]^T ; [5, 2, -5]^T
(e)  [6, -6, 5], [14, -13, 10], [7, -6, 4]     -1, -1, -1    [0, 5, 6]^T ; [5, 0, -7]^T ; [6, 7, 0]^T
(f)  [-5, -5, -9], [8, 9, 18], [-2, -3, -7]    -1, -1, -1    [3, -6, 2]^T ; [3, -6, 2]^T ; [3, -6, 2]^T
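
The tabulated answers can be verified with a short script (an illustrative sketch; numpy is an assumption) that checks AX = λX for each listed pair, shown here for matrix (d):

```python
import numpy as np

# Practice problem (d): eigenvalue/eigenvector pairs taken from the table.
A = np.array([[3.0, 10.0, 5.0],
              [-2.0, -3.0, -4.0],
              [3.0, 5.0, 7.0]])
pairs = [(3.0, [1.0, 1.0, -2.0]),
         (2.0, [5.0, 2.0, -5.0])]

for lam, X in pairs:
    X = np.array(X)
    # Each tabulated pair should satisfy the defining relation AX = lambda*X.
    print(lam, np.allclose(A @ X, lam * X))   # True, True
```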
