
LU Decomposition is the simplest matrix factorization: it can be seen as a by-product of Gaussian Elimination

When we do elimination, we have some elimination matrices:

- $E_1 \cdots E_k \, A = U$, where $E_1, \ldots, E_k$ are the elimination matrices, one per elimination step
- Let $E = E_1 \cdots E_k$, so $EA = U$
- let $L = E^{-1}$, so we have $A = LU$
- $U$ is upper-triangular by construction - because elimination zeroes out all elements below the main diagonal
- $L$ is lower-triangular

Why $L$ is lower-triangular:

- $L = E^{-1} = (E_1 \cdots E_k)^{-1} = E_k^{-1} \cdots E_1^{-1}$
- each $E_i$ has zeros above the diagonal, i.e. it is lower-triangular, and the inverse of a lower-triangular matrix is again lower-triangular
- when we multiply a bunch of lower-triangular matrices, we get a lower-triangular matrix
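The construction above can be sketched in pure Python. This is a minimal Doolittle version (ones on the diagonal of $L$), assuming no zero pivots are encountered, so no pivoting is needed; the function name is ours:

```python
def lu_doolittle(A):
    """Return (L, U) with A = L U, L unit lower-triangular, U upper-triangular."""
    n = len(A)
    U = [row[:] for row in A]  # U starts as a copy of A and gets eliminated in place
    L = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    for k in range(n):  # eliminate everything below the pivot U[k][k]
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]  # the multiplier; it is exactly the (i, k) entry of L
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U

A = [[2.0, 1.0], [4.0, 5.0]]
L, U = lu_doolittle(A)
# L = [[1, 0], [2, 1]], U = [[2, 1], [0, 3]], and L U reproduces A
```

Note that $L$ is filled with the multipliers directly: inverting the elimination steps just flips the sign of each multiplier, and stacking them places each one in its own column.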

Types of $LU$ decomposition:

- It's possible to have ones on the main diagonal of either $L$ or $U$
- Doolittle decomposition - the diagonal entries of $L$ are ones
- Crout decomposition - the diagonal entries of $U$ are ones

- To avoid division by small pivots, we permute rows during elimination so that the largest (in absolute value) remaining element in the current column becomes the pivot - this is called partial pivoting
- We can express the row permutation with a permutation matrix $P$
- Then $PA$ is $A$ with permuted rows, so we have $E \, (PA) = U$
- So the decomposition becomes $PA = LU$
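A minimal sketch of elimination with partial pivoting, tracking the permutation as a list of row indices instead of a full matrix $P$ (the names `lu_partial_pivot` and `perm` are ours; assumes a square nonsingular matrix):

```python
def lu_partial_pivot(A):
    """Return (perm, L, U) such that PA = LU, where row i of PA is row perm[i] of A."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[0.0] * n for _ in range(n)]
    perm = list(range(n))
    for k in range(n):
        # pick the row with the largest |entry| in column k as the pivot row
        p = max(range(k, n), key=lambda i: abs(U[i][k]))
        U[k], U[p] = U[p], U[k]
        L[k], L[p] = L[p], L[k]  # swap the already-computed multipliers too
        perm[k], perm[p] = perm[p], perm[k]
        L[k][k] = 1.0
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return perm, L, U

A = [[0.0, 2.0], [3.0, 4.0]]
perm, L, U = lu_partial_pivot(A)  # plain elimination would divide by A[0][0] = 0
</code>```

Without the swap, the very first step would divide by zero here; with it, the rows are exchanged (`perm == [1, 0]`) and elimination proceeds.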

We can go further:

- and obtain the factorization $A = LU = LDU^*$, where $D$ is diagonal
- i.e. we factor $U = DU^*$ by pulling the pivots of $U$ out into $D$
- this way both $L$ and $U^*$ have ones on their diagonals: if $L$ had ones and $U$ didn't, the diagonal of $U$ moves into $D$
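Pulling the diagonal out of $U$ is mechanical: $D$ takes the pivots, and each row of $U$ is divided by its pivot. A sketch (assuming nonzero pivots; `split_ldu` is our own name):

```python
def split_ldu(U):
    """Split an upper-triangular U into a diagonal D and a unit upper-triangular U*."""
    n = len(U)
    D = [U[i][i] for i in range(n)]  # the pivots form the diagonal factor
    U_star = [[U[i][j] / U[i][i] for j in range(n)] for i in range(n)]
    return D, U_star

U = [[2.0, 1.0], [0.0, 3.0]]
D, U_star = split_ldu(U)  # D = [2.0, 3.0], U* has ones on its diagonal
```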

Solving $A\mathbf x = \mathbf b$

- We can do that with Gaussian Elimination via LU Decomposition
- Let $LU = A$
- then $A \mathbf x = \mathbf b$ becomes $LU \mathbf x = \mathbf b$
- Now let $U \mathbf x = \mathbf y$, so $LU \mathbf x = \mathbf b$ becomes $L \mathbf y = \mathbf b$
- With this substitution we solve two equations instead of one:
- Solve $L \mathbf y = \mathbf b$ to get $\mathbf y$
- Solve $U \mathbf x = \mathbf y$ to get $\mathbf x$

- These two equations are simpler than the original one because:
- $L$ is lower-triangular, so $L \mathbf y = \mathbf b$ can be solved with "Forward Substitution", and
- $U$ is upper-triangular, so $U \mathbf x = \mathbf y$ can be solved with "Back Substitution"

First, we solve $L \mathbf x = \mathbf b$

$2 \times 2$ case:

- $\begin{bmatrix} l_{11} & 0 \\ l_{21} & l_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \end{bmatrix}$
- $x_1 = b_1 / l_{11}$
- $x_2 = (b_2 - l_{21} x_1) / l_{22}$

General procedure:

- $x_i = (b_i - \sum_{j=1}^{i - 1} l_{ij} x_j) / l_{ii}$
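The general procedure translates directly to code; a sketch (assumes nonzero diagonal entries, `forward_substitution` is our own name):

```python
def forward_substitution(L, b):
    """Solve L x = b for lower-triangular L."""
    n = len(b)
    x = [0.0] * n
    for i in range(n):  # solve the top row first, then work downwards
        s = sum(L[i][j] * x[j] for j in range(i))  # already-known terms
        x[i] = (b[i] - s) / L[i][i]
    return x

L = [[2.0, 0.0], [4.0, 3.0]]
b = [2.0, 10.0]
x = forward_substitution(L, b)  # x = [1.0, 2.0]: x1 = 2/2, x2 = (10 - 4*1)/3
```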

Then we solve $U \mathbf x = \mathbf b$

$2 \times 2$ case:

- $\begin{bmatrix} u_{11} & u_{12} \\ 0 & u_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \end{bmatrix}$
- $x_2 = b_2 / u_{22}$
- $x_1 = (b_1 - u_{12} x_2) / u_{11}$

General procedure:

- $x_i = (b_i - \sum_{j=i+1}^{n} u_{ij} x_j) / u_{ii}$
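This is the mirror image of forward substitution: we start from the bottom row and work upwards. A sketch under the same assumptions (nonzero diagonal, our own function name):

```python
def back_substitution(U, b):
    """Solve U x = b for upper-triangular U."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # solve the bottom row first, then work upwards
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))  # already-known terms
        x[i] = (b[i] - s) / U[i][i]
    return x

U = [[2.0, 1.0], [0.0, 3.0]]
b = [4.0, 6.0]
x = back_substitution(U, b)  # x = [1.0, 2.0]: x2 = 6/3, x1 = (4 - 1*2)/2
```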

To find the matrix inverse, we need to solve the equation $AX = I$, where $X = A^{-1}$

- Let's decompose $A = LU$, so we have: $LUX = I$
- Now let $UX = Y$ and thus $LY = I$
- This gives us two systems:
- first solve $LY = I$ - it's easy because $L$ is lower-triangular, so we just apply forward substitution column by column and obtain $Y$
- then we solve $UX = Y$ - it's also easy because $U$ is upper-triangular, so we apply back substitution

- after that we get $X$ which is our inverse
- if we need pivoting, then:
- we solve $PAX = PI = P$
- since $PA = LU$, we have $LUX = P$
- then we solve $LY = P$ and $UX = Y$
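The column-by-column scheme above can be sketched as follows, for the no-pivoting case: each column $e_k$ of $I$ yields one column of $X = A^{-1}$ via a forward solve followed by a back solve (the helper names are ours; $L$ and $U$ are assumed given):

```python
def solve_lower(L, b):
    """Forward substitution: solve L y = b."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][j] * y[j] for j in range(i))) / L[i][i]
    return y

def solve_upper(U, b):
    """Back substitution: solve U x = b."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

def inverse_from_lu(L, U):
    """Solve L U X = I one column at a time; returns X = A^{-1}."""
    n = len(L)
    cols = []
    for k in range(n):
        e_k = [float(i == k) for i in range(n)]  # k-th column of the identity
        cols.append(solve_upper(U, solve_lower(L, e_k)))
    # cols[k] is the k-th column of X; transpose to get X itself
    return [[cols[k][i] for k in range(n)] for i in range(n)]

# A = [[2, 1], [4, 5]] factors as L = [[1, 0], [2, 1]], U = [[2, 1], [0, 3]]
X = inverse_from_lu([[1.0, 0.0], [2.0, 1.0]], [[2.0, 1.0], [0.0, 3.0]])
```

With pivoting, the only change is the right-hand sides: we solve against the columns of $P$ instead of the columns of $I$.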