
In Linear Algebra, an $m \times n$ matrix $A$ is a rectangular array of numbers with $m$ rows and $n$ columns:

$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$

The $a_{ij}$ (also written $(A)_{ij}$) are the components (entries) of the matrix $A$.

If $m = n$, then $A$ is called *square*; otherwise it is *rectangular*.

$(a_{11}, a_{22}, \ldots, a_{nn})$ are the diagonal elements (of a square matrix).
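A minimal NumPy sketch of these definitions (the array values here are just illustrative):

```python
import numpy as np

# a 3x3 (square) matrix
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

m, n = A.shape          # number of rows and columns
a_12 = A[0, 1]          # component a_{12} (NumPy indices are 0-based)
diagonal = np.diag(A)   # diagonal elements (a_11, a_22, a_33)
```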

Operations on matrices:

- Matrix Multiplication: a matrix can be multiplied by a scalar, by a vector, or by another matrix
- Matrix Transposition
- Inversion
- ...
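The first three operations above can be sketched in NumPy (the sample matrix is illustrative; it is invertible since $\det A = -2 \neq 0$):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([1.0, 1.0])

scaled = 2 * A            # multiplication by a scalar
Av = A @ v                # matrix-vector product
AA = A @ A                # matrix-matrix product
At = A.T                  # transposition
Ainv = np.linalg.inv(A)   # inversion (defined only for invertible square matrices)
```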

Matrices can be:

- square ($n \times n$) and rectangular ($m \times n$)
- Rank-1 Matrices
- Identity matrices
- Symmetric Matrices
- Orthogonal Matrices
- Rotation Matrices
- Similar Matrices
- Positive-Definite Matrices
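A few of these types can be checked numerically; a sketch with illustrative matrices (the symmetric example is also positive-definite, and a rotation matrix is a special case of an orthogonal matrix):

```python
import numpy as np

I = np.eye(3)                     # identity matrix

S = np.array([[2.0, 1.0],
              [1.0, 3.0]])        # symmetric: S equals its transpose

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation matrix

assert np.allclose(S, S.T)                 # symmetric
assert np.allclose(R.T @ R, np.eye(2))     # orthogonal: R^T R = I
assert np.all(np.linalg.eigvalsh(S) > 0)   # positive-definite: all eigenvalues > 0
```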

Matrix decompositions:

- LU Decomposition: $A = LU$ where $L$ is lower triangular and $U$ is upper triangular
- QR Decomposition: $A = QR$ where $Q$ is orthogonal and $R$ is upper triangular
- Eigendecomposition: $A = S \Lambda S^{-1}$ with diagonal $\Lambda$
  - special case of EVD for symmetric $A$ (Spectral Theorem): $A = Q \Lambda Q^T$ with diagonal $\Lambda$ and orthogonal $Q$
- Singular Value Decomposition: $A = U \Sigma V^T$ with diagonal $\Sigma$ and orthogonal $U$ and $V$
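Most of these decompositions are available in `numpy.linalg` (LU is in `scipy.linalg.lu`, omitted here to keep the sketch NumPy-only); the test matrix below is symmetric, so the spectral theorem applies and `eigh` returns an orthogonal eigenvector matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])  # symmetric

# QR decomposition: A = Q R
Q, R = np.linalg.qr(A)

# eigendecomposition of a symmetric matrix: A = Q Lambda Q^T
eigvals, eigvecs = np.linalg.eigh(A)

# singular value decomposition: A = U Sigma V^T
U, sigma, Vt = np.linalg.svd(A)

assert np.allclose(Q @ R, A)
assert np.allclose(eigvecs @ np.diag(eigvals) @ eigvecs.T, A)
assert np.allclose(U @ np.diag(sigma) @ Vt, A)
```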

Matrices can themselves be seen as vectors, and they can form Vector Spaces:

- see Matrix Vector Spaces
- they have an inner product (element-wise) and a norm (the Frobenius Norm)
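The element-wise inner product and the Frobenius norm can be sketched as follows (the norm is the square root of the inner product of a matrix with itself):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0]])

inner = np.sum(A * B)           # <A, B> = sum over i, j of a_ij * b_ij
fro = np.linalg.norm(A, 'fro')  # Frobenius norm = sqrt(<A, A>)
```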

Sources:

- Linear Algebra MIT 18.06 (OCW)
- Kurosh A. G., Higher Algebra (Курс Высшей Алгебры)