A matrix $A$ has four subspaces:

- Column Space $C(A)$
- Nullspace $N(A)$
- Row Space $C(A^T)$ — the row space of $A$ is the column space of $A^T$
- Nullspace of $A^T$ (also called "Left Nullspace")

Suppose $A$ is an $m \times n$ matrix of rank $r$.

- Nullspace of $A$ is orthogonal to the row space: $N(A) \; \bot \; C(A^T)$
- Left nullspace of $A$ is orthogonal to the column space: $N(A^T) \; \bot \; C(A)$
- see the proof in Space Orthogonality#Row space and Nullspace
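The orthogonality $N(A) \; \bot \; C(A^T)$ can be checked numerically. A minimal sketch with NumPy, using a made-up $3 \times 4$ rank-2 matrix (third row = first + second) and its hand-computed special solutions:

```python
import numpy as np

# a hypothetical 3x4 rank-2 matrix (third row = first row + second row)
A = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 2.],
              [1., 2., 1., 3.]])

# special solutions of Ax = 0 (free variables x2 and x4) span N(A)
n1 = np.array([-2., 1., 0., 0.])
n2 = np.array([-1., 0., -2., 1.])

# each nullspace vector is orthogonal to every row of A,
# hence to the whole row space C(A^T)
print(A @ n1)   # [0. 0. 0.]
print(A @ n2)   # [0. 0. 0.]
```

Every row of $A$ lies in $C(A^T)$, so $A \mathbf n = \mathbf 0$ is exactly the statement that $\mathbf n$ is orthogonal to the row space.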

- $\text{dim } C(A) = r$, there are $r$ pivot columns
- basis: the $r$ pivot columns of $A$

- $\text{dim } C(A^T) = r = \text{dim } C(A)$, there are $r$ pivot rows, so the row space has the same dimension as the column space
- Let $R$ be Row Reduced Echelon Form of $A$, then $C(A^T) = C(R^T)$
- basis: first $r$ rows of $R$

- $\text{dim } N(A) = n - r$ - the number of free variables
- basis: special solutions for $A\mathbf x = \mathbf 0$

- The left nullspace is the nullspace of $A^T$, an $n \times m$ matrix also of rank $r$
- $\text{dim } N(A^T) = m - r$: $A^T$ has $m$ columns, hence $m$ variables, and $m - r$ of them are free
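These four dimensions can be verified from the rank alone. A quick check with NumPy, reusing a hypothetical $3 \times 4$ rank-2 matrix:

```python
import numpy as np

# a hypothetical 3x4 matrix of rank 2 (third row = first + second)
A = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 2.],
              [1., 2., 1., 3.]])
m, n = A.shape
r = np.linalg.matrix_rank(A)

print(r)       # dim C(A) = dim C(A^T) = 2
print(n - r)   # dim N(A)   = 4 - 2 = 2
print(m - r)   # dim N(A^T) = 3 - 2 = 1
```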

We know how to find a basis for each of these subspaces

- e.g. use Gaussian Elimination to bring the matrix to echelon form and read the bases off it
- but these bases are not "perfect": their vectors are not orthogonal in general. We want to use Orthogonal Vectors instead

SVD finds these bases:

- if $A = U \Sigma V^T$, i.e. $A V = U \Sigma$, then
- $\mathbf v_1, \ ... \ , \mathbf v_r$ is the basis for the row space $C(A^T)$
- $\mathbf v_{r+1}, \ ... \ , \mathbf v_{n}$ is the basis for the nullspace $N(A)$
- $\mathbf u_1, \ ... \ , \mathbf u_r$ is the basis for the column space $C(A)$
- $\mathbf u_{r+1}, \ ... \ , \mathbf u_{m}$ is the basis for the left nullspace $N(A^T)$
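A sketch of extracting all four orthonormal bases from the full SVD with NumPy (the matrix is a made-up $3 \times 4$ rank-2 example; `np.linalg.svd` returns $V^T$, so its columns are recovered by transposing):

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [0., 0., 1., 2.],
              [1., 2., 1., 3.]])   # hypothetical 3x4 matrix, rank 2
m, n = A.shape

U, s, Vt = np.linalg.svd(A)        # full SVD: U is m x m, Vt is n x n
r = np.sum(s > 1e-10)              # numerical rank
V = Vt.T

row_space = V[:, :r]    # v_1 .. v_r:     orthonormal basis of C(A^T)
nullspace = V[:, r:]    # v_{r+1} .. v_n: orthonormal basis of N(A)
col_space = U[:, :r]    # u_1 .. u_r:     orthonormal basis of C(A)
left_null = U[:, r:]    # u_{r+1} .. u_m: orthonormal basis of N(A^T)

print(np.allclose(A @ nullspace, 0))     # True: N(A) vectors solve Ax = 0
print(np.allclose(A.T @ left_null, 0))   # True: N(A^T) vectors solve A^T y = 0
```

Unlike the echelon-form bases, these are orthonormal within each subspace, which is exactly why the SVD bases are "perfect".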

- Linear Algebra MIT 18.06 (OCW)
- The fundamental theorem of linear algebra, G. Strang [1]
- The Four Fundamental Subspaces: 4 Lines, G. Strang, [2]
- Seminar Hot Topics in Information Management IMSEM (TUB)