Eigen Decomposition or, sometimes, Eigenvalue Decomposition (shortcut EVD)

Eigenvalue decomposition is a decomposition of a matrix into a "canonical form": we can turn a matrix into a diagonal one by using its eigenvectors.

Diagonalization:
* $S^{-1} A S = \Lambda$
* $\Lambda = \text{diag } (\lambda_1, \ ... \ , \lambda_n)$

Eigenvalue Decomposition:
* for symmetric $A$: $A = Q \Lambda Q^T$, where $Q$ is orthogonal (the spectral theorem)
* same for diagonalizable matrices: $A = S \Lambda S^{-1}$

Why can we do it?
Suppose we have $n$ linearly independent [[Eigenvalues and Eigenvectors|eigenvectors]] $\mathbf x_i$ of $A$
* let's put them in the columns of a matrix $S$ - the eigenvector matrix
* <math>S = \begin{bmatrix}
| & & | \\
\mathbf x_1 & ... & \mathbf x_n \\
| & & | \\
\end{bmatrix}</math>
Now what if we multiply $AS$?
* since all these $\mathbf x_i$ are eigenvectors, $A \mathbf x_i = \lambda_i \mathbf x_i$
* thus, <math>AS = A \begin{bmatrix}
| & | & & | \\
\mathbf x_1 & \mathbf x_2 & ... & \mathbf x_n \\
| & | & & | \\
\end{bmatrix} = \begin{bmatrix}
| & | & & | \\
\lambda_1 \, \mathbf x_1 & \lambda_2 \, \mathbf x_2 & ... & \lambda_n \, \mathbf x_n \\
| & | & & | \\
\end{bmatrix}</math>
* ok, now let's take the $\lambda_i$'s out: <math>\begin{bmatrix}
| & & | \\
\lambda_1 \, \mathbf x_1 & \lambda_2 \, \mathbf x_2 & ... & \lambda_n \, \mathbf x_n \\
| & & | \\
\end{bmatrix} = \begin{bmatrix}
| & & | \\
\mathbf x_1 & \mathbf x_2 & ... & \mathbf x_n \\
| & & | \\
\end{bmatrix} \begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n \\
\end{bmatrix}</math>
* let's call this diagonal matrix <math>\Lambda = \begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n \\
\end{bmatrix}</math>
* so $AS = S \Lambda$
* since the $\mathbf x_i$ are linearly independent, $S$ is invertible, so $S^{-1} A S = \Lambda$ and $A = S \Lambda S^{-1}$
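
Here's a minimal numerical check of these identities, assuming NumPy (not part of the original notes); <code>np.linalg.eig</code> returns the eigenvalues and an eigenvector matrix whose columns play the role of $S$:

<pre>
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # eigenvalues 5 and 2, so A is diagonalizable

lam, S = np.linalg.eig(A)    # lam: eigenvalues, S: eigenvectors in columns
Lambda = np.diag(lam)

print(np.allclose(np.linalg.inv(S) @ A @ S, Lambda))  # True: S^-1 A S = Lambda
print(np.allclose(S @ Lambda @ np.linalg.inv(S), A))  # True: A = S Lambda S^-1
</pre>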

Suppose you are given a vector $\mathbf u_0$ and a recurrent formula $\mathbf u_{k+1} = A \, \mathbf u_k$
* then $\mathbf u_k = A^k \, \mathbf u_0$, so we need the powers of $A$
* example: the Fibonacci Numbers
* $F_{k+2} = F_{k+1} + F_{k}$
* it's a second-order Recurrence Equation
* Let's build a system: <math>\begin{cases}
F_{k+2} = F_{k+1} + F_{k} \\
F_{k+1} = F_{k+1} \\
\end{cases}</math>
* so let $\mathbf u_k = \begin{bmatrix} F_{k+1} \\ F_{k} \end{bmatrix}$
* then $\mathbf u_{k + 1} = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} F_{k+1} \\ F_{k} \end{bmatrix} = A \, \mathbf u_k$
* note that $A$ is symmetric, so it's diagonalizable
* let's find eigenvalues:
** <math>| A - \lambda I | = \begin{vmatrix}
1 - \lambda & 1 \\
1 & -\lambda
\end{vmatrix} = \lambda^2 - \lambda - 1 = 0</math>
** or $\lambda_{1,2} = \cfrac{1 \pm \sqrt{1+4}}{2} = \cfrac{1 \pm \sqrt{5}}{2}$
** $\lambda_1 = 0.5 \, (1 + \sqrt{5}) \approx 1.618$
** $\lambda_2 = 0.5 \, (1 - \sqrt{5}) \approx -0.618$
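** the eigenvectors aren't worked out above; one convenient choice (using $\lambda_i^2 = \lambda_i + 1$ from the characteristic equation) is <math>\mathbf v_i = \begin{bmatrix} \lambda_i \\ 1 \end{bmatrix}</math>, since <math>A \, \mathbf v_i = \begin{bmatrix} \lambda_i + 1 \\ \lambda_i \end{bmatrix} = \begin{bmatrix} \lambda_i^2 \\ \lambda_i \end{bmatrix} = \lambda_i \, \mathbf v_i</math>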
* now expand $\mathbf u_0$ in the eigenvector basis: $\mathbf u_0 = c_1 \mathbf v_1 + c_2 \mathbf v_2$
* so, $\mathbf u_{100} = A^{100} \, \mathbf u_0 = c_1 \lambda_1^{100} \mathbf v_1 + c_2 \lambda_2^{100} \mathbf v_2$
** $\lambda_1 > 1$, so its powers grow, while $|\lambda_2| < 1$, so its powers decay
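
A short sketch of this in NumPy (an illustration, not from the original notes; the helper <code>fib</code> is hypothetical):

<pre>
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

lam, S = np.linalg.eig(A)   # eigenvalues ~ 1.618 and -0.618 (order not guaranteed)

def fib(k):
    # A^k = S Lambda^k S^{-1}: only the eigenvalues are raised to the power k
    Ak = S @ np.diag(lam ** k) @ np.linalg.inv(S)
    u0 = np.array([1.0, 0.0])   # u_0 = [F_1, F_0]
    return (Ak @ u0)[1]         # u_k = [F_{k+1}, F_k]

print(round(fib(10)))  # 55
</pre>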

Calculating powers of matrices this way is also useful for finding the steady state probabilities of Markov Chains - via powers of Stochastic Matrices, the matrix representation of a Markov Chain.
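
For instance, a sketch with a made-up $2 \times 2$ transition matrix (assuming NumPy; the matrix $P$ here is hypothetical):

<pre>
import numpy as np

# P[i, j] = probability of moving from state i to state j (rows sum to 1)
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# every row of a high power of P tends to the steady state distribution
print(np.linalg.matrix_power(P, 100))   # rows ~ [0.833, 0.167]

# same distribution from the eigenvector of P^T for eigenvalue 1 (pi P = pi)
lam, S = np.linalg.eig(P.T)
v = S[:, np.argmin(np.abs(lam - 1.0))].real
print(v / v.sum())   # ~ [0.833, 0.167]
</pre>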