Suppose we have a [[Matrix]] $A$. What does it do with a vector?
* Suppose we multiply $A$ by a vector $\mathbf x$ - and we get a vector $A \mathbf x$

What if $\lambda = 0$?
* i.e. $\mathbf x \in N(A)$ - the eigenvector $\mathbf x$ belongs to the [[Nullspace]]
* so if $A$ is singular, then $\lambda = 0$ is its eigenvalue

$A \mathbf x = \lambda \mathbf x$
* let's rewrite it as $A \mathbf x - \lambda \mathbf x = (A - \lambda I) \mathbf x = \mathbf 0$
* so, $(A - \lambda I) \mathbf x = \mathbf 0$ for $\mathbf x \ne \mathbf 0$
* we want the vector $\mathbf x$ to be non-zero, and that is only possible when the matrix $A - \lambda I$ is singular: its columns must be linearly dependent, so a non-trivial combination of them can produce the zero vector
* the matrix $A - \lambda I$ is singular when its [[Determinant]] is zero
* thus, we need to solve $\text{det}(A - \lambda I) = 0$ - this is called the ''characteristic equation'' of $A$
* this equation is an $n$th order polynomial in $\lambda$ and has $n$ roots, so you'll find $n$ eigenvalues $\lambda_i$
* in other words, the eigenvalues are exactly the values of $\lambda$ for which $A - \lambda I$ is singular

Finding the eigenvector
* Look at $(A - \lambda I) \mathbf x = \mathbf 0$
* The corresponding eigenvector $\mathbf x$ is in the [[Nullspace]] of $A - \lambda I$
* So we need to solve this equation to get the vector
* Since the matrix is singular, there are infinitely many such eigenvectors - same direction, but different scale
* We just fix some scale and find the direction
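The recipe above can be sketched numerically (not part of the original notes; the matrix is a made-up example). NumPy's `poly` builds the coefficients of $\text{det}(A - \lambda I)$ and `roots` solves the characteristic equation:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # a made-up example matrix

# coefficients of the characteristic polynomial det(A - lambda*I),
# highest power first: lambda^2 - 4*lambda + 3
coeffs = np.poly(A)
print(coeffs)

# the n roots of the characteristic equation are the n eigenvalues
eigenvalues = np.roots(coeffs)
print(sorted(eigenvalues))  # approximately [1.0, 3.0]
```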
− | |||
− | |||
− | + | == Examples == | |
− | + | ||
− | == | + | |
=== Example 1 ===
Let <math>A = \begin{bmatrix}
3 & 1 \\
1 & 3 \\
\end{bmatrix}</math>
* <math>\text{det}(A - \lambda I) = \begin{vmatrix}
3 - \lambda & 1 \\
1 & 3 - \lambda \\
\end{vmatrix} = (3 - \lambda)^2 - 1 = 0</math>
* or $(3 - \lambda)^2 = 1$
* $3 - \lambda = \pm 1$
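Not in the original notes: a quick numerical check of this example. $(3 - \lambda)^2 = 1$ gives $\lambda_1 = 2$ and $\lambda_2 = 4$, and NumPy agrees:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# numpy solves det(A - lambda*I) = 0 for us
eigenvalues = np.linalg.eigvals(A)
print(sorted(eigenvalues))  # approximately [2.0, 4.0]
```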

=== Example 2 ===
Let <math>A = \begin{bmatrix}
0 & 1 \\
-2 & 3 \\
\end{bmatrix}</math>
* need to solve $\text{det}(A - \lambda I) = 0$:
* $- \lambda (3 - \lambda) + 2 = 0$
* $\lambda_{1,2} = \cfrac{-b \pm \sqrt{b^2 - 4ac}}{2a} = \cfrac{3 \pm \sqrt{9 - 4 \cdot 2}}{2} = \cfrac{3 \pm 1}{2}$
* so $\lambda_1 = 1$ and $\lambda_2 = 2$

Eigenvector: $A \mathbf v_1 = \lambda_1 \mathbf v_1$
* $(A - \lambda_1 I) \mathbf v_1 = \mathbf 0$
* <math>\begin{bmatrix}
0 - 1 & 1 \\
-2 & 3 - 1 \\
\end{bmatrix} = \begin{bmatrix}
-1 & 1 \\
-2 & 2 \\
\end{bmatrix}</math>, <math>\begin{bmatrix}
-1 & 1 \\
-2 & 2 \\
\end{bmatrix} \mathbf v_1 = \mathbf 0</math>
* Reduce the matrix to upper triangular form - then we see that there are zeros in the last row (the matrix is singular)
** <math>\begin{bmatrix}
-1 & 1 \\
0 & 0 \\
\end{bmatrix}</math>
** can fix some value of $v_{12}$, e.g. $v_{12} = 1$
** then $v_{11} = 1$
* so <math>\mathbf v_1 = c \cdot \begin{bmatrix}
1 \\
1 \\
\end{bmatrix}</math> where $c$ is some constant
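As a sanity check (not part of the original notes), NumPy finds the same eigenvalues and the same direction for $\mathbf v_1$:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues))  # approximately [1.0, 2.0]

# the eigenvector for lambda = 1 is returned normalized, but it still
# points along (1, 1): the ratio of its components is 1
v1 = eigenvectors[:, np.argmin(eigenvalues)]
print(v1[1] / v1[0])
```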
+ | |||
+ | |||
+ | === Example 3: [[Rotation Matrices|Rotation Matrix]] === | ||
+ | Let <math>Q = \begin{bmatrix} | ||
0 & -1 \\ | 0 & -1 \\ | ||
1 & 0 \\ | 1 & 0 \\ | ||
− | \end{bmatrix} | + | \end{bmatrix}</math> |
* know that $\text{tr } Q = \sum_i \lambda_i$, so $\lambda_1 + \lambda_2 = 0$
* $\text{det } Q = \lambda_1 \, \lambda_2 = 1$
* how is this possible?
* <math>\text{det}(Q - \lambda I) = \begin{vmatrix}
-\lambda & -1 \\
1 & -\lambda \\
\end{vmatrix} = \lambda^2 + 1 = 0</math>
* so $\lambda_1 = i$ and $\lambda_2 = -i$ - complex numbers
* note that they are complex conjugates
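Not from the original notes - NumPy confirms that this real matrix has the purely imaginary pair $\pm i$ as eigenvalues, and that the trace and determinant identities still hold:

```python
import numpy as np

Q = np.array([[0.0, -1.0],
              [1.0, 0.0]])

eigenvalues = np.linalg.eigvals(Q)
print(eigenvalues)  # the pair i, -i (numpy writes them as 0.+1.j, 0.-1.j)

# lambda_1 + lambda_2 = tr Q = 0 and lambda_1 * lambda_2 = det Q = 1
print(eigenvalues.sum(), eigenvalues.prod())
```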

== Properties ==
* [[Gaussian Elimination]] changes the eigenvalues of $A$
** Triangular $U$ has its eigenvalues on the diagonal - but they are not the eigenvalues of $A$
* eigenvectors can be multiplied by any non-zero constant and will still remain eigenvectors
** so it's convenient to make these vectors unit vectors

=== Trace and Determinant ===
* $\text{tr } A = \sum\limits_i \lambda_i$
** $\text{tr } A$ is the [[Trace (Matrix)|Trace]] of $A$
* $\text{det } A = \prod\limits_i \lambda_i$
** $\text{det } A$ is the [[Determinant]] of $A$

{{ TODO | prove it }}
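While the proof is still a TODO, both identities are easy to check numerically. A sketch (not part of the original notes) on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))  # a random 4x4 matrix

eigenvalues = np.linalg.eigvals(A)

# tr A = sum of eigenvalues and det A = product of eigenvalues;
# complex eigenvalues come in conjugate pairs, so both results are real
print(np.trace(A), eigenvalues.sum().real)
print(np.linalg.det(A), eigenvalues.prod().real)
```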
− | |||
Line 118: | Line 145: | ||
* For $A^k$ the eigenvalues are $\lambda_i^k$
* Eigenvectors stay the same and don't mix up, only the eigenvalues are raised to the power
* this is used in [[Power Iteration]] and other methods for finding approximate values of eigenvalues and eigenvectors
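A small sketch (not part of the original notes; it reuses the matrix from Example 2) checking both claims - powers of $A$ raise the eigenvalues to that power, and power iteration converges to the dominant eigenvector:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])  # eigenvalues 1 and 2 (Example 2)

# eigenvalues of A^3 are 1^3 = 1 and 2^3 = 8
A3 = np.linalg.matrix_power(A, 3)
print(sorted(np.linalg.eigvals(A3)))

# power iteration: repeatedly applying A and renormalizing converges
# to the eigenvector of the largest |lambda| - here (1, 2) for lambda = 2
x = np.array([1.0, 0.0])
for _ in range(50):
    x = A @ x
    x /= np.linalg.norm(x)
print(x[1] / x[0])  # ratio approaches 2
```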

If all eigenvalues $\lambda_1, \ ... \ , \lambda_n$ are different then all eigenvectors $\mathbf x_1, \ ... \ , \mathbf x_n$ are linearly independent

Thm. $\mathbf x_1, \ ... \ , \mathbf x_n$ that correspond to distinct eigenvalues are linearly independent.

Proof
* $n = 2$ case:
* General case:
$\square$

Suppose we have $n$ linearly independent eigenvectors $\mathbf x_i$ of $A$
* $S = \Bigg[ \mathop{\mathbf x_1}\limits_|^| \ \mathop{\mathbf x_2}\limits_|^| \ \cdots \ \mathop{\mathbf x_n}\limits_|^| \Bigg]$
* This matrix is used for [[Matrix Diagonalization]]
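The construction above can be sketched numerically (not part of the original notes; NumPy's `eig` already returns the eigenvectors as columns, i.e. the matrix $S$):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])  # eigenvalues 1 and 2 (Example 2)

eigenvalues, S = np.linalg.eig(A)  # columns of S are the eigenvectors x_i

# S is invertible because the eigenvectors are linearly independent,
# and S^{-1} A S is the diagonal matrix of the eigenvalues
Lambda = np.linalg.inv(S) @ A @ S
print(np.round(Lambda, 10))
```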