Eigenvalues and Eigenvectors

#folder

Abstract

  1. Find eigenvalues

    Find $C_A(\lambda) = \det(A - \lambda I)$, factor it, and read off the roots $\lambda$

  2. Find eigenvectors
    For each eigenvalue $\lambda$, find $\vec{x} \neq \vec{0}$ so $(A - \lambda I)\vec{x} = \vec{0}$, which is a homogeneous system of equations -> RREF, which should have free variables.

  3. Find the bases for the eigenspaces
    Solve $(A - \lambda I)\vec{x} = \vec{0}$ -> RREF (which was already done in the previous part), then each vector attached to a free variable is a basis vector.

  4. Multiplicities
    Algebraic: # of times $\lambda$ appears as a root of $C_A(\lambda)$
    Geometric: dimension of the eigenspace $E_\lambda(A)$, i.e., the number of basis vectors (free variables)

Continue with Diagonalization
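The four steps above can be sketched in code for the $2 \times 2$ case. This is a minimal pure-Python sketch; the function name `eigen_2x2` and the restriction to real eigenvalues are my own assumptions, not from the notes.

```python
import math

def eigen_2x2(a, b, c, d):
    """Steps 1-4 above for A = [[a, b], [c, d]] with real eigenvalues.

    Returns {eigenvalue: (algebraic multiplicity, one basis eigenvector)}.
    """
    # Step 1: C(lam) = lam^2 - (a+d)*lam + (ad - bc); factor it via the
    # quadratic formula.
    trace, det = a + d, a * d - b * c
    disc = trace * trace - 4 * det
    if disc < 0:
        raise ValueError("complex eigenvalues: not handled in this sketch")
    r = math.sqrt(disc)
    roots = sorted({(trace - r) / 2, (trace + r) / 2})

    out = {}
    for lam in roots:
        # Steps 2-3: a nonzero vector in Null(A - lam*I). Either (b, lam - a)
        # or (lam - d, c) works when nonzero; if both vanish, A = lam*I and
        # every nonzero vector is an eigenvector.
        v = (b, lam - a)
        if v == (0, 0):
            v = (lam - d, c)
        if v == (0, 0):
            v = (1, 0)
        # Step 4: algebraic multiplicity = number of times lam is a root.
        out[lam] = (2 if len(roots) == 1 else 1, v)
    return out
```

For instance, `eigen_2x2(1, 2, -1, 4)` recovers the eigenvalues $2$ and $3$ with eigenvectors proportional to $(2, 1)$ and $(1, 1)$.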

Example

Find the eigenvalues, eigenvectors, bases for the eigenspaces, and multiplicities of $A = \begin{bmatrix} 1 & 2 \\ -1 & 4 \end{bmatrix}$

  1. Find eigenvalues
    $$C_A(\lambda) = \det(A - \lambda I) = \det \begin{bmatrix} 1 - \lambda & 2 \\ -1 & 4 - \lambda \end{bmatrix} = (1 - \lambda)(4 - \lambda) + 2 = \lambda^2 - 5\lambda + 6 = (\lambda - 2)(\lambda - 3)$$
    Thus $\lambda_1 = 2$ and $\lambda_2 = 3$.
  2. Find eigenvectors
    • For $\lambda_1 = 2$
      Solve the system $(A - 2I)\vec{x} = \vec{0}$:
      $$A - 2I = \begin{bmatrix} 1 & 2 \\ -1 & 4 \end{bmatrix} - \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} -1 & 2 \\ -1 & 2 \end{bmatrix} \sim \begin{bmatrix} -1 & 2 \\ 0 & 0 \end{bmatrix}$$
      so $x_1 = 2x_2$ and $x_2$ is free, so $\vec{x} = x_2 \begin{bmatrix} 2 \\ 1 \end{bmatrix}$
    • For $\lambda_2 = 3$
      Solve the system $(A - 3I)\vec{x} = \vec{0}$:
      $$A - 3I = \begin{bmatrix} 1 & 2 \\ -1 & 4 \end{bmatrix} - \begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix} = \begin{bmatrix} -2 & 2 \\ -1 & 1 \end{bmatrix} \sim \begin{bmatrix} -1 & 1 \\ 0 & 0 \end{bmatrix}$$
      so $x_1 = x_2$ and $x_2$ is free, so $\vec{x} = x_2 \begin{bmatrix} 1 \\ 1 \end{bmatrix}$
  3. Find bases for eigenspaces
    We have already solved $(A - \lambda I)\vec{x} = \vec{0}$, and have found the eigenvectors with free variables.
    • The basis for the eigenspace $E_2(A)$ is $\left\{ \begin{bmatrix} 2 \\ 1 \end{bmatrix} \right\}$
    • The basis for the eigenspace $E_3(A)$ is $\left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}$
  4. Multiplicities
    We have $a_A(2) = 1$ and $a_A(3) = 1$, since each appears only once as a root of $C_A(\lambda)$.
    We have $g_A(2) = 1$ and $g_A(3) = 1$, since the bases for the eigenspaces only have 1 element each.
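The eigenpairs found above can be checked directly against the defining equation $A\vec{x} = \lambda\vec{x}$. A quick pure-Python check (the helper `matvec` is my own, not from the notes):

```python
# Check A*v = lam*v for each eigenpair from the worked example.
A = [[1, 2], [-1, 4]]

def matvec(M, v):
    # 2x2 matrix-vector product
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

pairs = {2: [2, 1], 3: [1, 1]}
for lam, v in pairs.items():
    assert matvec(A, v) == [lam * x for x in v]
```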

Eigenvalues and Eigenvectors

Definition

For $A \in M_{n \times n}(\mathbb{R})$, a scalar $\lambda$ is an eigenvalue of $A$ if $A\vec{x} = \lambda\vec{x}$ for some non-zero vector $\vec{x}$. The vector $\vec{x}$ is then called an eigenvector of $A$ corresponding to $\lambda$.

Intuition

An eigenvector is a vector which stays on its own span after a linear transformation. The eigenvalue ($\lambda$) is the factor by which it is stretched or compressed.

Visualization

Consider the transformation $A$:

Here, one basis vector gets "knocked" off its span, while the other basis vector and the red vector stay on their own spans (that is, their magnitude may change, but not their direction).

Computing Eigenvalues and Eigenvectors

If we have $A\vec{x} = \lambda\vec{x}$, how do we solve for $\lambda$? Well, we can move stuff around:

$$A\vec{x} - \lambda\vec{x} = \vec{0}$$
$$A\vec{x} - \lambda I\vec{x} = \vec{0}$$
$$(A - \lambda I)\vec{x} = \vec{0}$$

(see identity matrix, zero vector)

Which is a homogeneous system.

By the invertible matrix theorem and the relation between the determinant and matrix invertibility, this system has a non-zero solution $\vec{x}$ exactly when $A - \lambda I$ is not invertible, i.e., when $\det(A - \lambda I) = 0$:

Theorem

Let $A \in M_{n \times n}(\mathbb{R})$. A number $\lambda$ is an eigenvalue of $A$ if and only if $\lambda$ satisfies the equation:

$$\det(A - \lambda I) = 0$$

If $\lambda$ is an eigenvalue, then all nonzero solutions of the homogeneous system of equations

$$(A - \lambda I)\vec{x} = \vec{0}$$

are the eigenvectors corresponding to $\lambda$.

So to find the eigenvectors for an eigenvalue $\lambda$, we find the null space of $A - \lambda I$ by solving the homogeneous system $(A - \lambda I)\vec{x} = \vec{0}$.
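The determinant condition can be seen concretely on the $2 \times 2$ matrix from the worked example above (a pure-Python check; the helper `det_shift` is my own name):

```python
A = [[1, 2], [-1, 4]]

def det_shift(lam):
    # det(A - lam*I) for the 2x2 matrix A
    return (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]

# The determinant vanishes exactly at the eigenvalues 2 and 3...
assert det_shift(2) == 0 and det_shift(3) == 0
# ...and is nonzero elsewhere, where (A - lam*I)x = 0 has only x = 0.
assert det_shift(0) != 0 and det_shift(5) != 0
```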

Characteristic Polynomial

Definition

Let $A \in M_{n \times n}(\mathbb{R})$. The characteristic polynomial of $A$ is

$$C_A(\lambda) = \det(A - \lambda I)$$

(see determinant)

Theorem

Let $A \in M_{n \times n}(\mathbb{R})$. Then $C_A(\lambda)$ is a polynomial of degree $n$.
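For the $2 \times 2$ case, the claim can be verified by direct expansion (a standard computation, added here for illustration):

$$C_A(\lambda) = \det \begin{bmatrix} a - \lambda & b \\ c & d - \lambda \end{bmatrix} = (a - \lambda)(d - \lambda) - bc = \lambda^2 - (a + d)\lambda + (ad - bc)$$

which is indeed a polynomial of degree $n = 2$.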

Eigenspace

Definition

Let $\lambda$ be an eigenvalue of $A$. The set containing all the eigenvectors of $A$ corresponding to $\lambda$, together with the zero vector, is called the eigenspace of $A$ corresponding to $\lambda$, and is denoted by $E_\lambda(A)$.

Remark

This is simply $E_\lambda(A) = \operatorname{Null}(A - \lambda I)$ (see null space)

Eigenspace is a Subspace

Theorem

Let $\lambda$ be an eigenvalue of $A$.

  • If $A \in M_{n \times n}(\mathbb{R})$, then $E_\lambda(A)$ is a subspace of $\mathbb{R}^n$.
  • If $A \in M_{n \times n}(\mathbb{C})$, then $E_\lambda(A)$ is a subspace of $\mathbb{C}^n$.

Bases of Eigenspaces are Linearly Independent

Lemma

Let $A$ be an $n \times n$ matrix and let $\lambda_1, \lambda_2, \dots, \lambda_k$ be distinct eigenvalues of $A$.

If $B_i$ is a basis for the eigenspace $E_{\lambda_i}(A)$ for each $i = 1, \dots, k$, then $B_1 \cup B_2 \cup \cdots \cup B_k$ is linearly independent. That is, the basis vectors of all the eigenspaces are linearly independent of each other.
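For the worked example above, the basis vectors of $E_2(A)$ and $E_3(A)$ can be checked for independence with a determinant: for two vectors in $\mathbb{R}^2$, the determinant of the matrix having them as columns is nonzero if and only if they are linearly independent. A quick pure-Python check:

```python
# Basis vectors of E_2(A) and E_3(A) from the example: (2, 1) and (1, 1).
v1, v2 = (2, 1), (1, 1)

# det of the matrix [v1 | v2]; nonzero iff {v1, v2} is linearly independent.
det = v1[0] * v2[1] - v1[1] * v2[0]
assert det != 0
```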

Multiplicity

Theorem

For any eigenvalue $\lambda$ of $A \in M_{n \times n}(\mathbb{R})$,

$$1 \le g_A(\lambda) \le a_A(\lambda)$$
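The geometric multiplicity can be strictly smaller than the algebraic one; the shear matrix below is a standard example of this (not from the notes, added for illustration):

```python
# A = [[1, 1], [0, 1]] has C(lam) = (1 - lam)^2, so a(1) = 2.
# But A - I = [[0, 1], [0, 0]] forces x2 = 0, so Null(A - I) is spanned
# by (1, 0) alone and g(1) = 1: here 1 = g(1) < a(1) = 2.
A = [[1, 1], [0, 1]]

def apply(M, v):
    # 2x2 matrix-vector product
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

assert apply(A, [1, 0]) == [1, 0]   # (1, 0) is an eigenvector for lam = 1
assert apply(A, [0, 1]) != [0, 1]   # (0, 1) is not
```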

Algebraic Multiplicity

Definition

Let $A \in M_{n \times n}(\mathbb{R})$ with eigenvalue $\lambda$. The algebraic multiplicity of $\lambda$, denoted by $a_A(\lambda)$, is the number of times $\lambda$ appears as a root of $C_A(\lambda)$.

(see multiplicity)

Geometric Multiplicity

Definition

Let $A \in M_{n \times n}(\mathbb{R})$ with eigenvalue $\lambda$. The geometric multiplicity of $\lambda$, denoted by $g_A(\lambda)$, is the dimension of the eigenspace: $g_A(\lambda) = \dim(E_\lambda(A))$.