
Linear Algebra Review

This review covers everything from the Linear Algebra section. Use it to test yourself across all topics before moving on.

Vectors

  • Vectors: $\langle a, b, c \rangle$ with magnitude $\lVert \mathbf{v} \rVert = \sqrt{a^2 + b^2 + c^2}$
  • Dot product: $\mathbf{u} \cdot \mathbf{v} = u_1v_1 + u_2v_2 + u_3v_3$ (measures alignment, used for projection and lighting)
  • Cross product: $\mathbf{u} \times \mathbf{v}$ gives a perpendicular vector (used for normals and area)

Matrices

  • Addition/subtraction require matching dimensions
  • Multiplication: row-by-column dot products; an $(m \times n)(n \times p)$ product is $m \times p$
  • Not commutative: $AB \neq BA$ in general
  • Determinant: $ad - bc$ for a 2x2 matrix; tells you invertibility and area scaling
  • Inverse: $A^{-1}$ exists only when $\det(A) \neq 0$

Gaussian Elimination

  • Write the system as an augmented matrix, then apply row operations
  • REF: staircase of zeros below the pivots
  • RREF: pivots are 1, with zeros above and below
  • Three outcomes: unique solution, no solution, infinitely many solutions

Linear Independence, Basis, and Dimension

  • Linearly independent: no vector is a combination of the others
  • Basis: an independent set that spans the space
  • Dimension: the number of vectors in any basis
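The core vector operations above can be sketched in plain Python. This is a minimal illustration using tuples and the standard library only; the helper names (`dot`, `magnitude`, `cross`) are not from the text:

```python
import math

def dot(u, v):
    # Sum of componentwise products; zero when u and v are perpendicular.
    return sum(a * b for a, b in zip(u, v))

def magnitude(v):
    # ||v|| = sqrt(v . v)
    return math.sqrt(dot(v, v))

def cross(u, v):
    # Perpendicular to both inputs; its length equals the parallelogram area.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

print(magnitude((3, 4, 12)))        # 13.0
print(dot((1, 0, 0), (0, 1, 0)))    # 0 (perpendicular)
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```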

Eigenvalues, Eigenvectors, and Diagonalization

  • $A\mathbf{v} = \lambda\mathbf{v}$: eigenvectors are scaled, not rotated
  • Find eigenvalues via $\det(A - \lambda I) = 0$
  • Diagonalization: $A = PDP^{-1}$ makes $A^n = PD^nP^{-1}$ trivial

Applications in Games

  • MVP pipeline: Projection $\times$ View $\times$ Model $\times$ vertex
  • Lighting: brightness $= \mathbf{n} \cdot \mathbf{L}$
  • Collision: projection removes the wall-normal component of velocity for sliding
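The lighting and collision bullets both reduce to a dot product. A minimal sketch, assuming unit-length normals and light directions (the function names `brightness` and `slide` are illustrative, not from the text):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def brightness(normal, light_dir):
    # Lambertian shading: clamp n . L at zero so back-facing surfaces go dark.
    return max(0.0, dot(normal, light_dir))

def slide(velocity, wall_normal):
    # Subtract the component of velocity along the (unit) wall normal,
    # leaving only the motion parallel to the wall.
    d = dot(velocity, wall_normal)
    return tuple(v - d * n for v, n in zip(velocity, wall_normal))

print(brightness((0, 1, 0), (0, 1, 0)))          # 1.0: light hits head-on
print(slide((3.0, -2.0, 0.0), (0.0, 1.0, 0.0)))  # (3.0, 0.0, 0.0): slides along the wall
```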

Example 1: Vector Projection

Project $\mathbf{u} = \langle 5, 12 \rangle$ onto $\mathbf{v} = \langle 3, 4 \rangle$.

$\mathbf{u} \cdot \mathbf{v} = (5)(3) + (12)(4) = 15 + 48 = 63$

$\lVert \mathbf{v} \rVert^2 = 9 + 16 = 25$

$\text{proj}_{\mathbf{v}} \mathbf{u} = \frac{63}{25} \langle 3, 4 \rangle = \langle 7.56, 10.08 \rangle$
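The same computation as a quick Python sketch (the `project` helper is illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(u, v):
    # proj_v(u) = (u . v / ||v||^2) v
    scale = dot(u, v) / dot(v, v)
    return tuple(scale * c for c in v)

print(project((5, 12), (3, 4)))  # approximately (7.56, 10.08)
```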

Example 2: Matrix Multiplication

$\begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 0 & -2 \\ 3 & 0 \end{bmatrix}$

With column vectors, the rightmost matrix acts first, so this product rotates 90° and then scales by (2, 3). Order matters: scaling first and then rotating gives a different result.
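A minimal sketch of row-by-column multiplication, used here to confirm that the two orderings disagree (the `matmul` helper is illustrative):

```python
def matmul(A, B):
    # (m x n)(n x p) -> (m x p): entry (i, j) is row i of A dotted with column j of B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

S = [[2, 0], [0, 3]]   # scale by (2, 3)
R = [[0, -1], [1, 0]]  # rotate 90 degrees counterclockwise

print(matmul(S, R))  # [[0, -2], [3, 0]] -- rotate, then scale
print(matmul(R, S))  # [[0, -3], [2, 0]] -- scale, then rotate: different!
```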

Example 3: 3x3 Gaussian Elimination

$\begin{cases} x + y + z = 6 \\ 2x + 3y + z = 14 \\ x - y + 2z = 2 \end{cases}$

After row reduction: $x = 4$, $y = 2$, $z = 0$.
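The row reduction can be sketched as Gaussian elimination with partial pivoting on the augmented matrix. This is a minimal stdlib-only illustration, not production code:

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting on the augmented matrix [A | b].
    n = len(A)
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution from the last row up.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

print(solve([[1, 1, 1], [2, 3, 1], [1, -1, 2]], [6, 14, 2]))  # [4.0, 2.0, 0.0]
```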

Example 4: Eigenvalues

$A = \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix}$

$\det(A - \lambda I) = (4-\lambda)(3-\lambda) - 2 = \lambda^2 - 7\lambda + 10 = (\lambda - 5)(\lambda - 2) = 0$

Eigenvalues: $\lambda_1 = 5$, $\lambda_2 = 2$.
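For a 2x2 matrix the characteristic polynomial is a quadratic in $\lambda$ with trace $a + d$ and determinant $ad - bc$ as its coefficients, so the eigenvalues follow from the quadratic formula. A minimal sketch, assuming real eigenvalues (the `eigenvalues_2x2` name is illustrative):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    # det(A - xI) = x^2 - (a+d)x + (ad - bc) = 0, solved by the quadratic formula.
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

print(eigenvalues_2x2(4, 1, 2, 3))  # (5.0, 2.0)
```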

Example 5: Determinant and Invertibility

$A = \begin{bmatrix} 6 & 3 \\ 4 & 2 \end{bmatrix}$

$\det(A) = (6)(2) - (3)(4) = 12 - 12 = 0$

$A$ is singular, so no inverse exists. The rows are linearly dependent (row 1 = 1.5 $\times$ row 2).
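The determinant test and the standard 2x2 inverse formula can be sketched together (the helper names `det2` and `inverse2` are illustrative):

```python
def det2(A):
    # ad - bc for a 2x2 matrix.
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inverse2(A):
    # 2x2 inverse: (1/det) [[d, -b], [-c, a]]; undefined when det(A) == 0.
    det = det2(A)
    if det == 0:
        return None  # singular: rows are linearly dependent
    a, b = A[0]
    c, d = A[1]
    return [[d / det, -b / det], [-c / det, a / det]]

print(det2([[6, 3], [4, 2]]))      # 0
print(inverse2([[6, 3], [4, 2]]))  # None -- no inverse exists
print(inverse2([[4, 1], [2, 3]]))  # [[0.3, -0.1], [-0.2, 0.4]]
```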

Study Tips

  • Practice matrix multiplication and Gaussian elimination until they feel automatic
  • Visualize vectors and transformations whenever possible
  • The key insight: matrices represent linear transformations
  • Connect concepts: dot product leads to projection leads to lighting; eigenvalues lead to diagonalization leads to efficient computation
  • When stuck on a problem, ask: “What does this look like geometrically?”
Self-Check Questions

  1. The dot product $\mathbf{u} \cdot \mathbf{v}$ equals zero when the vectors are:
  2. Matrix multiplication is:
  3. Gaussian elimination is used to:
  4. A set of vectors is a basis if it is linearly independent and:
  5. The MVP pipeline transforms vertices using:
  6. If $\det(A) = 0$, then matrix $A$:
  7. An eigenvector $\mathbf{v}$ of matrix $A$ satisfies:
  8. The cross product $\mathbf{u} \times \mathbf{v}$ produces:
  9. In game lighting, surface brightness is determined by:
  10. Diagonalization $A = PDP^{-1}$ is useful because:
  11. Linear independence means:
  12. The dimension of $\mathbb{R}^3$ is:
  13. Translation is not a linear transformation unless we use:
  14. The view matrix represents:
  15. If two vectors are linearly dependent, one can be written as:
  16. The projection matrix is responsible for:
  17. Eigenvalues tell us:
  18. For a 2x2 matrix, the determinant $ad - bc$ tells you:
  19. A major advantage of using matrices in games is:
  20. The most important big idea in linear algebra for graphics is: