This review covers everything from the Linear Algebra section. Use it to test yourself across all topics before moving on.
- Vectors: ⟨a, b, c⟩ with magnitude ∥v∥ = √(a² + b² + c²)
- Dot product: u⋅v = u₁v₁ + u₂v₂ + u₃v₃ (measures alignment, used for projection and lighting)
- Cross product: u×v gives a perpendicular vector (used for normals and area)
- Addition/subtraction require matching dimensions
- Multiplication: row-by-column dot products, result is m×p from (m×n)(n×p)
- Not commutative: AB ≠ BA in general
- Determinant: ad−bc for 2x2, tells you invertibility and area scaling
- Inverse: A⁻¹ exists only when det(A) ≠ 0
- Write as augmented matrix, apply row operations
- REF: staircase of zeros below pivots
- RREF: pivots are 1, zeros above and below
- Three outcomes: unique solution, no solution, infinitely many
- Linearly independent: no vector is a combination of the others
- Basis: independent set that spans the space
- Dimension: number of vectors in any basis
- Av=λv: eigenvectors are scaled, not rotated
- Find eigenvalues via det(A−λI)=0
- Diagonalization: A=PDP−1 makes An=PDnP−1 trivial
- MVP pipeline: Projection × View × Model × vertex
- Lighting: brightness = n⋅L
- Collision: projection removes wall-normal component for sliding
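The vector operations above are short enough to sketch directly (a minimal pure-Python sketch; the function names `dot`, `cross`, and `magnitude` are illustrative, not from any particular library):

```python
import math

def dot(u, v):
    # Sum of componentwise products — measures alignment
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    # Vector perpendicular to both u and v (3D only)
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def magnitude(v):
    # ∥v∥ = √(v⋅v)
    return math.sqrt(dot(v, v))
```

For example, `dot((5, 12), (3, 4))` gives 63 and `cross((1, 0, 0), (0, 1, 0))` gives (0, 0, 1), the expected perpendicular.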
Example 1: Vector Projection
Project u=⟨5,12⟩ onto v=⟨3,4⟩.
u⋅v = (5)(3) + (12)(4) = 15 + 48 = 63
∥v∥² = 3² + 4² = 9 + 16 = 25
proj_v u = (63/25)⟨3, 4⟩ = ⟨7.56, 10.08⟩
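The same computation in code (a pure-Python sketch; `project` is an illustrative helper name):

```python
def project(u, v):
    # proj_v(u) = (u⋅v / v⋅v) v
    dot_uv = sum(a * b for a, b in zip(u, v))
    dot_vv = sum(a * a for a in v)
    k = dot_uv / dot_vv
    return tuple(k * a for a in v)

project((5, 12), (3, 4))  # ⟨7.56, 10.08⟩, matching the worked example
```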
Example 2: Matrix Multiplication
[2 0; 0 3][0 −1; 1 0] = [0 −2; 3 0]
Applied to a column vector, this rotates 90° and then scales by (2, 3) — transformations compose right to left. Order matters: scaling first and then rotating gives a different matrix, [0 −3; 2 0].
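A quick way to see that order matters is to multiply both ways (a pure-Python sketch; `matmul` is an illustrative helper, not a library function):

```python
def matmul(A, B):
    # Row-by-column dot products: (m×n)(n×p) → m×p
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

S = [[2, 0], [0, 3]]   # scale by (2, 3)
R = [[0, -1], [1, 0]]  # rotate 90° counterclockwise

SR = matmul(S, R)  # [[0, -2], [3, 0]] — rotate, then scale
RS = matmul(R, S)  # [[0, -3], [2, 0]] — scale, then rotate: different!
```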
Example 3: 3x3 Gaussian Elimination
x + y + z = 6
2x + 3y + z = 14
x − y + 2z = 2
After row reduction: x=4, y=2, z=0.
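The elimination steps can be automated (a minimal sketch assuming a square system with a unique solution; `solve_linear` is an illustrative name, and each row of the augmented matrix holds the coefficients followed by the right-hand side):

```python
def solve_linear(aug):
    # Gaussian elimination with partial pivoting on an augmented matrix
    n = len(aug)
    A = [row[:] for row in aug]  # work on a copy
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        # Zero out everything below the pivot (REF)
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    # Back-substitution from the last row up
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (A[i][n] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

solve_linear([[1, 1, 1, 6],
              [2, 3, 1, 14],
              [1, -1, 2, 2]])  # ≈ [4, 2, 0]
```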
Example 4: Eigenvalues
A = [4 1; 2 3]
det(A − λI) = (4−λ)(3−λ) − 2 = λ² − 7λ + 10 = (λ−5)(λ−2) = 0
Eigenvalues: λ1=5, λ2=2.
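For a 2×2 matrix the characteristic equation is just a quadratic in λ, so the eigenvalues fall out of the quadratic formula (a sketch assuming real eigenvalues; `eig2` is an illustrative name and takes the entries of [[a, b], [c, d]]):

```python
import math

def eig2(a, b, c, d):
    # Characteristic polynomial of [[a, b], [c, d]]:
    #   λ² − (a+d)λ + (ad − bc) = 0
    tr = a + d          # trace
    det = a * d - b * c  # determinant
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

eig2(4, 1, 2, 3)  # (5.0, 2.0), matching the worked example
```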
Example 5: Determinant and Invertibility
A = [6 3; 4 2]
det(A)=(6)(2)−(3)(4)=12−12=0
A is singular. No inverse exists. The rows are linearly dependent (row 1 = 1.5 × row 2).
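Checking the determinant before inverting is exactly how code should guard against singular matrices (a pure-Python sketch; `inverse2` is an illustrative name taking the entries of [[a, b], [c, d]]):

```python
def inverse2(a, b, c, d):
    # 2×2 inverse: (1/det) [d -b; -c a], or None when singular
    det = a * d - b * c
    if det == 0:
        return None  # singular — no inverse exists
    return [[d / det, -b / det],
            [-c / det, a / det]]

inverse2(6, 3, 4, 2)  # None — the matrix from Example 5 is singular
```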
- Practice matrix multiplication and Gaussian elimination until they feel automatic
- Visualize vectors and transformations whenever possible
- The key insight: matrices represent linear transformations
- Connect concepts: dot product leads to projection leads to lighting; eigenvalues lead to diagonalization leads to efficient computation
- When stuck on a problem, ask: “What does this look like geometrically?”