
Section 7.3: Linear Algebraic Equations; Linear Independence, Eigenvalues, Eigenvectors

Big Idea: The theory of systems of linear differential equations rests on matrix algebra: solving linear algebraic systems, testing vectors for linear independence, and finding eigenvalues and eigenvectors.

Big Skill: You should be able to solve linear systems using augmented matrices, determine whether a set of vectors is linearly independent, and compute the eigenvalues and eigenvectors of a matrix.

Systems of Linear Algebraic Equations

A system of n linear equations in n variables can be written in matrix form as

Ax = b,

where A is the n × n coefficient matrix, x is the n × 1 column vector of unknowns, and b is the n × 1 column vector of constants.

If b = 0, then the system is homogeneous; otherwise, it is nonhomogeneous.

If A is nonsingular (i.e., det(A) ≠ 0), then A⁻¹ exists, and the unique solution to the system is x = A⁻¹b.

A picture for the 3 × 3 system is:

[Figure: three planes intersecting in a single point; solution (-1, 2, -3).]


If A is singular, then either there are no solutions or there are infinitely many solutions.

Example pictures for 3 × 3 systems are:

Three parallel planes → No solution

Note: Inconsistent systems yield false equations (like 0 = 2 or 0 = 4) after trying to solve them.

Planes intersect in three parallel lines → No solution

Note: Inconsistent systems yield a false equation (like 0 = -6) after trying to solve them.

All three planes intersect along the same line → Infinite number of solutions

Note: This type of dependent system yields one equation of 0 = 0 after row operations.

All three planes are the same → Infinite number of solutions

Note: This type of dependent system yields two equations of 0 = 0 after row operations.

If A is singular, then the homogeneous system Ax = 0 will have infinitely many nontrivial solutions (in addition to the trivial solution x = 0).

If A is singular, then the nonhomogeneous system Ax = b will have infinitely many solutions when (b, y) = 0 for all vectors y satisfying A*y = 0 (recall A* is the adjoint of A). These solutions take the form x = x^(0) + ξ, where x^(0) is a particular solution and ξ is the general solution of the corresponding homogeneous system.

In practice, linear systems are solved by performing Gaussian elimination on the augmented matrix (A | b).
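
As a numerical aside, such a system can be checked in Python with NumPy. Below is a minimal sketch using a made-up 3 × 3 system (not one of the practice systems below); np.linalg.solve performs an LU factorization, which is Gaussian elimination in matrix form.

    import numpy as np

    # A made-up nonsingular 3x3 system (for illustration only):
    #    x + 2y -  z =  2
    #   2x +  y + 2z = -1
    #   -x +  y +  z = -2
    A = np.array([[ 1.0, 2.0, -1.0],
                  [ 2.0, 1.0,  2.0],
                  [-1.0, 1.0,  1.0]])
    b = np.array([2.0, -1.0, -2.0])

    # det(A) != 0, so A is nonsingular and the solution is unique.
    print(np.linalg.det(A))

    # np.linalg.solve uses an LU factorization (Gaussian elimination)
    # rather than forming A^(-1) explicitly.
    x = np.linalg.solve(A, b)
    print(x)

    # Check: A @ x should reproduce b.
    print(np.allclose(A @ x, b))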

Practice:

  1. Solve using an augmented matrix.

  2. Solve using an augmented matrix.

  3. Solve using an augmented matrix.

  4. Solve using an augmented matrix.


Linear Independence

A set of k vectors x^(1), …, x^(k) is said to be linearly dependent if there exist k complex numbers c_1, …, c_k, not all zero, such that c_1 x^(1) + ⋯ + c_k x^(k) = 0. This term is used because if some c_j ≠ 0, then one of the vectors depends on (is a linear combination of) the others: x^(j) = -(1/c_j)(c_1 x^(1) + ⋯ + c_{j-1} x^(j-1) + c_{j+1} x^(j+1) + ⋯ + c_k x^(k)).

Practice:

  1. Show that the vectors are linearly dependent.

On the other hand, if the only values of c_1, …, c_k that make c_1 x^(1) + ⋯ + c_k x^(k) = 0 true are c_1 = c_2 = ⋯ = c_k = 0, then the vectors are said to be linearly independent.

The test for linear dependence or independence can be represented with matrix arithmetic:

Consider n vectors, each with n components. Let x_ij be the ith component of vector x^(j), and let X = (x_ij), so the jth column of X is x^(j). The dependence equation then becomes Xc = 0, and:

If det(X) = 0, then there are solutions with nonzero c_i, and thus the vectors are linearly dependent.

If det(X) ≠ 0, then the only solution is c = 0, and thus the vectors are linearly independent.
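
As a numerical sketch of this test (with made-up vectors, not those from the practice problem below), stack the vectors as the columns of X and examine det(X):

    import numpy as np

    # Made-up example vectors: x3 = x1 + x2, so the set
    # should come out linearly dependent.
    x1 = np.array([1.0, 0.0, 2.0])
    x2 = np.array([0.0, 1.0, 1.0])
    x3 = np.array([1.0, 1.0, 3.0])

    # Columns of X are the vectors x^(j).
    X = np.column_stack([x1, x2, x3])

    # det(X) = 0 => dependent; det(X) != 0 => independent.
    print(np.linalg.det(X))          # 0.0 (up to rounding)

    # In floating point, the matrix rank is a more robust test:
    print(np.linalg.matrix_rank(X))  # 2 < 3, confirming dependence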


Practice:

  1. Determine the dependence or the independence of the vectors .

Note: Frequently, the columns of a matrix A are thought of as vectors.

The columns of A are linearly independent iff det(A) ≠ 0.

If C = AB (with A and B square), then det(C) = det(A)det(B). Thus, if the columns of A and the columns of B are linearly independent (both determinants nonzero), then so are the columns of C.
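
A quick numerical check of the product rule, with two made-up 2 × 2 matrices:

    import numpy as np

    # Made-up 2x2 matrices for illustration.
    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])   # det(A) = 6
    B = np.array([[1.0, 4.0],
                  [1.0, 2.0]])   # det(B) = -2

    C = A @ B
    # det(C) should equal det(A) * det(B) = -12.
    print(np.linalg.det(C), np.linalg.det(A) * np.linalg.det(B))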

Eigenvalues and Eigenvectors

The equation Ax = y is a linear transformation that maps a given vector x onto a new vector y. Special vectors that map onto multiples of themselves are very important in many applications, because those vectors tend to correspond to “preferred modes” of behavior of the system. Such vectors are called eigenvectors (“eigen” is German for “own” or “proper”), and the multiple for a given eigenvector is called its eigenvalue.

To find eigenvalues and eigenvectors, we start with the definition:

Ax = λx, which can be written as

(A − λI)x = 0, which has nontrivial solutions iff

det(A − λI) = 0.

The values of λ that satisfy the above determinant equation (the characteristic equation) are the eigenvalues, and each eigenvalue can then be plugged back into the defining equation to find its eigenvectors.
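
As a short worked illustration (this 2 × 2 matrix is made up, not one of the practice matrices below):

    % Worked example: a made-up 2x2 symmetric matrix (not from the notes).
    A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}, \qquad
    \det(A - \lambda I)
      = \begin{vmatrix} 3-\lambda & 1 \\ 1 & 3-\lambda \end{vmatrix}
      = (3-\lambda)^2 - 1 = (\lambda - 2)(\lambda - 4) = 0
    \quad\Rightarrow\quad \lambda_1 = 2,\ \lambda_2 = 4.

    % For \lambda_1 = 2, (A - 2I)x = 0 gives x_1 + x_2 = 0, so
    % x^{(1)} = c\,(1, -1)^T for any nonzero constant c.
    % For \lambda_2 = 4, (A - 4I)x = 0 gives -x_1 + x_2 = 0, so
    % x^{(2)} = c\,(1, 1)^T.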

You will see that eigenvectors are determined only up to an arbitrary nonzero factor; choosing the factor is called normalizing the vector. The most common choice is the factor that results in the eigenvector having a length of 1.
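
To check hand computations numerically, here is a minimal NumPy sketch using the same made-up matrix as the worked example above; np.linalg.eig returns unit-length eigenvectors, i.e., it does the normalization for you:

    import numpy as np

    # Same made-up 2x2 matrix as in the worked example above.
    A = np.array([[3.0, 1.0],
                  [1.0, 3.0]])

    # np.linalg.eig returns the eigenvalues and a matrix whose
    # columns are the corresponding unit-length eigenvectors.
    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)    # [4. 2.] (order may vary)
    print(eigenvectors)

    # Check the defining equation A v = lambda v for each pair.
    for lam, v in zip(eigenvalues, eigenvectors.T):
        print(np.allclose(A @ v, lam * v))  # True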

Practice:

  1. Find the eigenvalues and eigenvectors of .

  2. Find the eigenvalues and eigenvectors of .

Notes:

·  In these examples, you can see that finding the eigenvalues of an n × n matrix involves solving a polynomial equation of degree n, which means that an n × n matrix always has n eigenvalues, counted according to multiplicity.

·  Also, in these two examples all the eigenvalues were distinct. However, that is not always the case. If a given eigenvalue appears m times as a root of the characteristic equation, then that eigenvalue is said to have algebraic multiplicity m.

·  Every eigenvalue will have q linearly independent eigenvectors, where 1 ≤ q ≤ m. The number q is called the geometric multiplicity of the eigenvalue.

·  Thus, if each eigenvalue of A is simple (has algebraic multiplicity m = 1), then each eigenvalue also has geometric multiplicity q = 1.

·  If λ₁ and λ₂ are distinct eigenvalues of a matrix A, then their corresponding eigenvectors are linearly independent.

·  So, if all the eigenvalues of an n × n matrix are simple, then all its eigenvectors are linearly independent. However, if there are repeated eigenvalues, then there may be fewer than n linearly independent eigenvectors, which poses complications later on when solving systems of differential equations (which we won’t have time to get to…).

Practice:

  1. Find the eigenvalues and eigenvectors of .

Notes:

·  In this example, A was symmetric (Aᵀ = A), and we see that even though there was a repeated eigenvalue, there were still three linearly independent eigenvectors.

·  Real symmetric matrices are a special case of Hermitian, or self-adjoint, matrices, for which A* = A.

·  Hermitian matrices have the following properties:

·  All eigenvalues are real.

·  There are always n linearly independent eigenvectors, regardless of multiplicities.

·  All eigenvectors of distinct eigenvalues are orthogonal.

·  If an eigenvalue has algebraic multiplicity m, then it is always possible to choose m mutually orthogonal eigenvectors, as the sketch below illustrates.
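
Here is a minimal NumPy sketch of these properties, using a made-up real symmetric matrix; np.linalg.eigh is specialized for Hermitian (including real symmetric) matrices:

    import numpy as np

    # Made-up real symmetric (hence Hermitian) matrix.
    A = np.array([[2.0, 1.0, 1.0],
                  [1.0, 2.0, 1.0],
                  [1.0, 1.0, 2.0]])

    # np.linalg.eigh returns real eigenvalues (in ascending order)
    # and an orthonormal set of eigenvectors.
    eigenvalues, V = np.linalg.eigh(A)
    print(eigenvalues)  # [1. 1. 4.]: eigenvalue 1 has multiplicity 2

    # Columns of V are n linearly independent, mutually orthogonal
    # eigenvectors, even with the repeated eigenvalue: V^T V = I.
    print(np.allclose(V.T @ V, np.eye(3)))  # True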