MATRICES

Any row or column matrix is a vector.

Linearly dependent vectors

The vectors X1, X2, X3, … , Xn of the same order are said to be linearly dependent if there exist scalars λ1, λ2, λ3, … , λn, not all zero, such that

λ1X1+λ2X2+λ3X3+…+λnXn = 0

Vectors are said to be parallel if their cross product is 0.

Parallel vectors are always linearly dependent, but the converse is not true.

If the vectors are linearly dependent then we can find a relation between them.
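This can be checked numerically: if a non-trivial relation exists, the vectors stacked as columns have rank less than their number. A minimal sketch assuming NumPy (the vectors here are illustrative, chosen so that X3 = 2X1 + X2):

```python
import numpy as np

# Illustrative vectors: X3 is built as 2*X1 + X2, so the set is linearly dependent.
X1 = np.array([1.0, 2.0, 3.0])
X2 = np.array([0.0, 1.0, 1.0])
X3 = 2 * X1 + X2                      # the hidden relation: 2*X1 + X2 - X3 = 0

# Stack the vectors as columns; dependence <=> rank < number of vectors.
V = np.column_stack([X1, X2, X3])
rank = np.linalg.matrix_rank(V)
dependent = rank < 3                  # True here; the relation above exhibits it
```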

Linearly Independent vectors

The vectors X1, X2, X3, … , Xn of the same order are said to be linearly independent if the only scalars λ1, λ2, λ3, … , λn satisfying the following relation are all zero:

λ1X1+λ2X2+λ3X3+…+λnXn = 0

If the vectors are linearly independent then there is no relation between them.

Vectors are said to be perpendicular if their dot product is zero.

Perpendicular vectors are always linearly independent, but the converse is not true.

If the vectors are linearly independent then we cannot find a relation between them.

Two column vectors of the same order are said to be orthogonal if

X1ᵀ X2 = 0

OR

X2ᵀ X1 = 0

Basically, the vectors X1 and X2 are perpendicular.
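As a quick numerical check of orthogonality, assuming NumPy (the vectors are illustrative):

```python
import numpy as np

# Illustrative column vectors; orthogonal iff X1^T X2 = 0.
X1 = np.array([1.0, 2.0, -1.0])
X2 = np.array([3.0, -1.0, 1.0])

dot = float(X1 @ X2)                  # X1^T X2 = 3 - 2 - 1 = 0
orthogonal = np.isclose(dot, 0.0)     # True: the vectors are perpendicular
```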

Norm / Length of a vector

    | a |
X = | b |  then  || X || = √( a² + b² + c² )
    | c |

The normalized vector (denoted as X̂) is obtained by dividing each element of the vector by its length. The length of a normalized vector is 1.
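A short sketch of the norm and normalization steps, assuming NumPy (the vector is illustrative):

```python
import numpy as np

X = np.array([3.0, 4.0, 12.0])        # illustrative vector [a b c]^T
length = np.linalg.norm(X)            # sqrt(9 + 16 + 144) = sqrt(169) = 13
X_hat = X / length                    # normalized vector: each element divided by the length
# The length of X_hat is 1 by construction.
```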

Eigen value and Eigen vectors

Let A be a square matrix. If there exist a scalar λ and a non zero column vector X such that AX = λX, then λ is called an Eigen value and X is called an Eigen vector of A.

Eigen values are also called characteristic / proper / latent values, or roots of the matrix.

Eigen vector is also called as characteristic vector.

Eigen vector is always a non zero column vector.

Characteristic equation

| A – λI | = 0 is called the characteristic equation of A in λ.

Eigen vectors corresponding to distinct Eigen values are always linearly independent.

Orthogonal (perpendicular) vectors are always linearly independent.

If the matrix is symmetric, then Eigen vectors are orthogonal.

Method of finding Eigen values and Eigen vectors.

Types:

I] All Eigen values are distinct and A may be symmetric or non symmetric matrix.

Note that if the determinant of A is not zero then all Eigen values are non zero (their product equals | A |); otherwise at least one Eigen value is zero.

Formula to obtain values of λ

The following formula can be arrived at by expanding | A-λI | = 0

λ³ – [sum of diagonal elements of A] λ² + [sum of minors of diagonal elements of A] λ – | A | = 0

Note that for a 2x2 matrix the formula reduces to

λ² – λ [sum of diagonal elements of A] + | A | = 0
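The 2x2 formula can be verified numerically; a sketch assuming NumPy (the matrix is illustrative):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # illustrative 2x2 matrix

trace = A[0, 0] + A[1, 1]             # sum of diagonal elements = 7
det = np.linalg.det(A)                # |A| = 4*3 - 1*2 = 10

# Characteristic equation: lam^2 - 7*lam + 10 = 0, roots lam = 5 and lam = 2.
roots = np.roots([1.0, -trace, det])
eigs = np.linalg.eigvals(A)           # should agree with the roots above
```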

To find Eigen vectors

( A – λI ) X = 0 ; where X is the required Eigen vector. Write the final matrix in equation form, select 2 of the rows, and find the answer by Cramer's rule.

Note that if the matrix is symmetric then the Eigen vectors are orthogonal.
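A sketch of the eigenvector step for a 2x2 case, assuming NumPy (the matrix and eigenvalue are illustrative):

```python
import numpy as np

# Illustrative matrix; lam = 5 is one of its Eigen values.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0

# Form (A - lam*I); its null space gives the Eigen vector.
M = A - lam * np.eye(2)               # [[-1, 1], [2, -2]]
# Row 1 reads -x1 + x2 = 0, so X is proportional to [1, 1].
X = np.array([1.0, 1.0])

ok = np.allclose(A @ X, lam * X)      # verifies A X = lam X
```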

II] Eigen values are repeated and A is non symmetric matrix

In this case, if λ = 7, 1, 1, first put the distinct value λ = 7 in the equation ( A – λI ) X = 0 and solve as before. After substituting the repeated value of λ we get a matrix equation, but Cramer's rule cannot be applied; so we first reduce the matrix to Echelon form and find its rank r, then apply the rule that the number of linearly independent Eigen vectors is n – r (say n – r = 2 implies 2 linearly independent Eigen vectors).

III] Eigen values are repeated and A is symmetric matrix.

Assume λ = 5, 2, 2. For λ = 5 we get X1 by applying Cramer's rule. After putting λ = 2 in the equation ( A – λI ) X = 0, we reduce the matrix to Echelon form, find n – r, and assert that many linearly independent Eigen vectors. Say X2ᵀ = [ x1 x2 x3 ]; then put x1 = 0, x2 = 1 and find x3 from the equation obtained from the Echelon form. Thus X1 and X2 are obtained. Now, the matrix A being symmetric, its Eigen vectors are orthogonal; keeping this in mind, we find X3.

Note that when given matrix A is upper or lower triangular matrix then λ is directly given by the diagonal elements of A.

Theorems:
If λ and X are an Eigen value and Eigen vector of A, then 1/λ and X are an Eigen value and Eigen vector of A⁻¹, where A is a non singular matrix.

If λ and X are an Eigen value and Eigen vector of A, then λⁿ and X are an Eigen value and Eigen vector of Aⁿ.

If λ and X are an Eigen value and Eigen vector of A, then f(λ) and X are an Eigen value and Eigen vector of f(A).

If λ and X are an Eigen value and Eigen vector of A, then k+λ and X are an Eigen value and Eigen vector of A+kI.
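These theorems can be spot-checked numerically; a sketch assuming NumPy (the matrix and k are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # upper triangular, so its Eigen values are 2 and 3
k = 4.0

eigs_A = sorted(np.linalg.eigvals(A).real)                        # [2, 3]
eigs_inv = sorted(np.linalg.eigvals(np.linalg.inv(A)).real)       # [1/3, 1/2] = 1/lam
eigs_sq = sorted(np.linalg.eigvals(A @ A).real)                   # [4, 9] = lam^2
eigs_shift = sorted(np.linalg.eigvals(A + k * np.eye(2)).real)    # [6, 7] = k + lam
```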

Note that for a 3x3 matrix A, if there exist 3 linearly independent Eigen vectors, then the matrix A is diagonable.

Cayley-Hamilton theorem

Every square matrix satisfies its characteristic equation

| A – λI | = 0 ; on expanding we get an equation in λ.

Assuming the characteristic equation is λ³ – 6λ² + 9λ – 4 = 0

By the Cayley-Hamilton theorem, A³ – 6A² + 9A – 4I = 0 (note that the constant term becomes 4I, not the scalar 4).

Note that

if A is a 2x2 matrix then f(A) is of the form αA + βI

if A is a 3x3 matrix then f(A) = αA² + βA + γI
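A numerical check of the Cayley-Hamilton theorem, and of using it to obtain an inverse; a sketch assuming NumPy (the matrix is illustrative, not the one whose characteristic equation is quoted above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 2.0]])            # illustrative; trace = 3, |A| = -4
I = np.eye(2)

# Characteristic equation: lam^2 - 3*lam - 4 = 0.
# Cayley-Hamilton: A^2 - 3A - 4I = 0 (the constant term becomes a multiple of I).
residual = A @ A - 3 * A - 4 * I      # the zero matrix

# Rearranging: A(A - 3I) = 4I, so A^-1 = (A - 3I)/4 without a direct inversion.
A_inv = (A - 3 * I) / 4
```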

Reduction of matrix to diagonal form

Modal matrix : Consider a non symmetric matrix A (3x3) where λ1, λ2, λ3 & X1, X2, X3 are its Eigen values and Eigen vectors respectively; then the matrix formed by the Eigen vectors of A is called the modal matrix, i.e. M = [ X1 X2 X3 ]

Theorem

If λ1, λ2, λ3 are Eigen values & M is modal matrix then

            | λ1   0    0  |
M⁻¹ A M =   | 0    λ2   0  |
            | 0    0    λ3 |

This is called the reduction of a matrix to diagonal form, and M is called the modal / transforming / diagonalising matrix.
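The reduction can be reproduced numerically; a sketch assuming NumPy (the matrix is illustrative):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # illustrative; Eigen values 5 and 2

# np.linalg.eig returns the Eigen vectors as columns, i.e. a modal matrix M.
eigvals, M = np.linalg.eig(A)
D = np.linalg.inv(M) @ A @ M          # M^-1 A M = diag(lam1, lam2)
```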

Unimodal matrix: Consider a symmetric matrix A (3x3) where λ1, λ2, λ3 & X1, X2, X3 are its Eigen values and normalized Eigen vectors respectively, then

P = [X1 X2 X3 ]

Note that unimodal matrix is orthogonal matrix.

Note that

           | λ1   0    0  |
P⁻¹ A P =  | 0    λ2   0  |
           | 0    0    λ3 |

The algebraic multiplicity AM of an Eigen value is the number of times it is repeated, and the geometric multiplicity GM is the number of linearly independent Eigen vectors corresponding to it.

Consider for matrix A we have λ = 0, 1, 1 ; then

AM (λ=0) = 1

AM (λ=1) = 2

To obtain GM, say for λ=0, put the value of λ in the equation ( A – λI ) X = 0 ; then find the rank r of the matrix and obtain GM (λ=0) = n – r. Similarly we obtain GM (λ=1). If AM = GM for each value of λ, then we say that the matrix A is diagonable.
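A sketch of a case where AM ≠ GM, assuming NumPy (the matrix is illustrative):

```python
import numpy as np

# Illustrative matrix with lam = 2 repeated twice, so AM(lam=2) = 2.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam, n = 2.0, 2

r = np.linalg.matrix_rank(A - lam * np.eye(n))   # rank of [[0,1],[0,0]] is 1
gm = n - r                                        # GM = 1 < AM = 2, so A is not diagonable
```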

Derogatory Matrix:
A matrix A(nxn) is said to be derogatory if

Degree of minimal equation < n (the order of A)

Non Derogatory matrix:

A matrix A of order (nxn) is said to be non derogatory if

Degree of minimal equation = n (the order of A)

Note that for a homogeneous system of equations A X = 0

n = number of unknowns

r = rank of the matrix

n-r = number of independent solutions

if n = r ; then the system has only the zero (trivial) solution

if r < n ; then the system has non zero solutions.

Note that if

n = r then the vectors are L.I.

r < n then the vectors are L.D.
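The rank rule for A X = 0 can be illustrated with a small example, assuming NumPy (the matrix is illustrative):

```python
import numpy as np

# Illustrative homogeneous system: row 2 is twice row 1, so the rows are dependent.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])
n = 3                                  # number of unknowns
r = np.linalg.matrix_rank(A)           # r = 2 < n
independent_solutions = n - r          # 1 independent non-zero solution exists
```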

Note:
Eigen vectors corresponding to distinct Eigen values are always LI, for any matrix.

For a given 3x3 matrix A, it has at most 3 distinct Eigen values and corresponding Eigen vectors.

If all eigen values are distinct then the matrix is diagonable.

If matrix is symmetric then it is always diagonable.

If a matrix of order 3x3 has 3 linearly independent Eigen vectors then it is diagonable.

The matrices A and B of the same order are said to be similar if there exists a non singular matrix P such that

A = P⁻¹ B P

OR

B = P⁻¹ A P

Monic polynomial : A polynomial in x in which the coefficient of the highest power of x is unity is called a monic polynomial, e.g. x³ + 3x² + 6x + 10 is a monic polynomial of degree 3 and 2x² + 3x + 9 is not monic.

The monic polynomial of lowest degree that annihilates a matrix A is called the minimal polynomial of A.

If f (x) is minimal polynomial of A then f (x) = 0 is called minimal equation of matrix A.

Note

If all Eigen values of matrix A are distinct then the matrix A is non derogatory.

If Eigen values of A are repeated then matrix A may or may not be derogatory.
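A minimal derogatory example can be checked directly, assuming NumPy (the matrix is illustrative):

```python
import numpy as np

# A = 2I (3x3): characteristic equation (lam - 2)^3 = 0, but the monic polynomial
# lam - 2 of degree 1 already annihilates A, so the minimal equation has degree
# 1 < 3 and A is derogatory.
A = 2 * np.eye(3)
I = np.eye(3)

satisfies_minimal = np.allclose(A - 2 * I, 0)    # degree-1 minimal polynomial holds
```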