Orthogonal Diagonalization

Recall that for diagonalization, we want to find an invertible matrix $P$ and a diagonal matrix $D$ such that $A = PDP^{-1}$, and recall that for orthogonal matrices, we have $Q^{-1} = Q^T$. For orthogonal diagonalization, we want to find an orthogonal matrix $Q$ and a diagonal matrix $D$ such that $A = QDQ^T$.
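Combining these two facts: if the change-of-basis matrix $Q$ in a diagonalization happens to be orthogonal, then inverting it is just transposing it, so
$$A = QDQ^{-1} = QDQ^{T}.$$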

Orthogonal Subspaces

Definition

Let $U$ and $W$ be two subspaces of $\mathbb{R}^n$. We say $U$ and $W$ are orthogonal if $\vec{u} \cdot \vec{w} = 0$ for every $\vec{u} \in U$ and $\vec{w} \in W$.
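For example, in $\mathbb{R}^3$ the plane $U = \operatorname{span}\{\vec{e}_1, \vec{e}_2\}$ and the line $W = \operatorname{span}\{\vec{e}_3\}$ are orthogonal subspaces, since for all scalars $a$, $b$, $c$,
$$(a\vec{e}_1 + b\vec{e}_2) \cdot (c\vec{e}_3) = ac\,(\vec{e}_1 \cdot \vec{e}_3) + bc\,(\vec{e}_2 \cdot \vec{e}_3) = 0.$$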

Orthogonally Diagonalizable Matrix

Definition

An $n \times n$ matrix $A$ is orthogonally diagonalizable if there exists an orthogonal matrix $Q$ and an $n \times n$ diagonal matrix $D$ so that $A = QDQ^T$.

In this case, we say $Q$ orthogonally diagonalizes $A$ to $D$.
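Unpacking the definition: writing $Q = \begin{bmatrix} \vec{q}_1 & \cdots & \vec{q}_n \end{bmatrix}$ and $D = \operatorname{diag}(d_1, \dots, d_n)$, the equation $A = QDQ^T$ is equivalent to $AQ = QD$, which column by column says
$$A\vec{q}_i = d_i\,\vec{q}_i, \qquad i = 1, \dots, n.$$
So $A$ is orthogonally diagonalizable exactly when there is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$, with the corresponding eigenvalues on the diagonal of $D$.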

Lemma

Let $A \in M_{n \times n}(\mathbb{R})$. If $A$ is orthogonally diagonalizable, then $A$ is diagonalizable.

However, the converse is not true: not every diagonalizable matrix is orthogonally diagonalizable.
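For example, the matrix
$$B = \begin{bmatrix} 1 & 1 \\ 0 & 2 \end{bmatrix}$$
has two distinct eigenvalues, $1$ and $2$, so it is diagonalizable; but $B^T \neq B$, so by the theorem on symmetric matrices later in this section it is not orthogonally diagonalizable.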

Symmetric Matrices have Real Eigenvalues

Theorem

Let $A \in M_{n \times n}(\mathbb{R})$ be a symmetric matrix. Then every eigenvalue of $A$ is real.
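A sketch of the standard argument: suppose $A\vec{z} = \lambda\vec{z}$ for some (possibly complex) $\lambda$ and $\vec{z} \neq \vec{0}$. Using $A^T = A$ and the fact that $A$ has real entries,
$$\overline{\lambda}\,(\overline{\vec{z}}^{\,T}\vec{z}) = (\overline{A\vec{z}})^T\vec{z} = \overline{\vec{z}}^{\,T}A\vec{z} = \lambda\,(\overline{\vec{z}}^{\,T}\vec{z}),$$
and since $\overline{\vec{z}}^{\,T}\vec{z} = \sum_i |z_i|^2 > 0$, it follows that $\overline{\lambda} = \lambda$, so $\lambda$ is real.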

Dot Product Commutes with Matrix-Vector Product Iff Symmetric

Lemma

Let $A \in M_{n \times n}(\mathbb{R})$. Then $A$ is symmetric if and only if $(A\vec{x}) \cdot \vec{y} = \vec{x} \cdot (A\vec{y})$ for all $\vec{x}, \vec{y} \in \mathbb{R}^n$.
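The forward direction is a one-line computation with the transpose: if $A^T = A$, then for all $\vec{x}, \vec{y} \in \mathbb{R}^n$,
$$(A\vec{x}) \cdot \vec{y} = (A\vec{x})^T\vec{y} = \vec{x}^{\,T}A^T\vec{y} = \vec{x}^{\,T}(A\vec{y}) = \vec{x} \cdot (A\vec{y}).$$
For the converse, applying the identity to the standard basis vectors $\vec{x} = \vec{e}_i$ and $\vec{y} = \vec{e}_j$ gives $a_{ji} = a_{ij}$, so $A$ is symmetric.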

Eigenspaces of a Symmetric Matrix are Orthogonal Subspaces

Theorem

Let $A \in M_{n \times n}(\mathbb{R})$ be a symmetric matrix. If $\lambda_1$ and $\lambda_2$ are distinct eigenvalues of $A$, then $E_{\lambda_1}(A)$ and $E_{\lambda_2}(A)$ are orthogonal subspaces of $\mathbb{R}^n$.
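This follows from the previous lemma: take $\vec{u} \in E_{\lambda_1}(A)$ and $\vec{v} \in E_{\lambda_2}(A)$. Then
$$\lambda_1(\vec{u} \cdot \vec{v}) = (A\vec{u}) \cdot \vec{v} = \vec{u} \cdot (A\vec{v}) = \lambda_2(\vec{u} \cdot \vec{v}),$$
so $(\lambda_1 - \lambda_2)(\vec{u} \cdot \vec{v}) = 0$, and since $\lambda_1 \neq \lambda_2$ this forces $\vec{u} \cdot \vec{v} = 0$.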

Symmetric Matrices are Orthogonally Diagonalizable

Theorem

Let $A \in M_{n \times n}(\mathbb{R})$. Then $A$ is symmetric if and only if $A$ is orthogonally diagonalizable.
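One direction is quick to check: if $A = QDQ^T$ with $Q$ orthogonal and $D$ diagonal, then
$$A^T = (QDQ^T)^T = (Q^T)^T D^T Q^T = QDQ^T = A,$$
so $A$ is symmetric. The other direction, that every symmetric matrix is orthogonally diagonalizable, is the substantial part and is often called the Spectral Theorem.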

Giant Problem

Example

Orthogonally diagonalize the $3 \times 3$ symmetric matrix $A$.

Solution
We first find and factor the characteristic polynomial $c_A(\lambda) = \det(A - \lambda I)$. So we have the following eigenvalues and algebraic multiplicities:

Eigenvalue      Algebraic multiplicity
$\lambda_1$     $2$
$\lambda_2$     $1$

We now find a basis for each eigenspace. For $\lambda_1$, we solve $(A - \lambda_1 I)\vec{x} = \vec{0}$ by row reducing $A - \lambda_1 I$ and reading off the general solution. Hence a basis for $E_{\lambda_1}(A)$ is $\{\vec{v}_1, \vec{v}_2\}$, and $\dim E_{\lambda_1}(A) = 2$.

For $\lambda_2$, we solve $(A - \lambda_2 I)\vec{x} = \vec{0}$ in the same way. Hence a basis for $E_{\lambda_2}(A)$ is $\{\vec{v}_3\}$, and $\dim E_{\lambda_2}(A) = 1$.
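As a reminder of what is being solved at each step: the eigenspace for an eigenvalue $\lambda$ is the null space of $A - \lambda I$,
$$E_{\lambda}(A) = \{\vec{x} \in \mathbb{R}^3 : (A - \lambda I)\vec{x} = \vec{0}\} = \operatorname{Null}(A - \lambda I),$$
so each basis above is read off from the row reduced form of $A - \lambda I$.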


Now let $\vec{v}_1$, $\vec{v}_2$, and $\vec{v}_3$ denote the basis vectors found above.

We should verify that $A\vec{v}_1 = \lambda_1\vec{v}_1$, $A\vec{v}_2 = \lambda_1\vec{v}_2$, and $A\vec{v}_3 = \lambda_2\vec{v}_3$, and also verify that $\vec{v}_3$ is orthogonal to both $\vec{v}_1$ and $\vec{v}_2$. Note that $A$ is diagonalizable since the geometric multiplicity of $\lambda_1$ equals its algebraic multiplicity and the geometric multiplicity of $\lambda_2$ equals its algebraic multiplicity.

We now find an orthogonal basis for $E_{\lambda_1}(A)$ by applying the Gram-Schmidt Procedure to $\{\vec{v}_1, \vec{v}_2\}$. Let
$$\vec{w}_1 = \vec{v}_1$$
and let
$$\vec{w}_2 = \vec{v}_2 - \frac{\vec{v}_2 \cdot \vec{w}_1}{\vec{w}_1 \cdot \vec{w}_1}\,\vec{w}_1.$$
Thus $\{\vec{w}_1, \vec{w}_2\}$ is an orthogonal basis for $E_{\lambda_1}(A)$ and $\{\vec{v}_3\}$ is an orthogonal basis for $E_{\lambda_2}(A)$. Moreover, since the eigenspaces are orthogonal, $\{\vec{w}_1, \vec{w}_2, \vec{v}_3\}$ is an orthogonal basis for $\mathbb{R}^3$ (since it is a linearly independent set of 3 vectors in $\mathbb{R}^3$).
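A quick check that the Gram-Schmidt step did what we want: by construction,
$$\vec{w}_2 \cdot \vec{w}_1 = \vec{v}_2 \cdot \vec{w}_1 - \frac{\vec{v}_2 \cdot \vec{w}_1}{\vec{w}_1 \cdot \vec{w}_1}\,(\vec{w}_1 \cdot \vec{w}_1) = 0,$$
and $\vec{w}_2$ still lies in $E_{\lambda_1}(A)$ since it is a linear combination of $\vec{v}_1$ and $\vec{v}_2$.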


We then normalize the vectors to obtain an orthonormal basis for $\mathbb{R}^3$:
$$\vec{q}_1 = \frac{\vec{w}_1}{\|\vec{w}_1\|}, \qquad \vec{q}_2 = \frac{\vec{w}_2}{\|\vec{w}_2\|}, \qquad \vec{q}_3 = \frac{\vec{v}_3}{\|\vec{v}_3\|}.$$

Finally, we see that $A = QDQ^T$ where
$$Q = \begin{bmatrix} \vec{q}_1 & \vec{q}_2 & \vec{q}_3 \end{bmatrix} \qquad \text{and} \qquad D = \begin{bmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_1 & 0 \\ 0 & 0 & \lambda_2 \end{bmatrix}.$$