Appendix — Further Reading
You have finished the course. You can work with vectors and dot products, build and compose transformation matrices, solve systems of equations, compute determinants, and reason about eigenvalues. You have seen these tools running inside a 3D rendering pipeline and a neural network forward pass.
The question now is: where do you go next?
This appendix is organized by destination. Pick the path that matches where you want to take this knowledge.
Path 1 — I Want to Go Deeper Into Theory
These resources are for readers who enjoyed the why as much as the what, and who want to fill in the mathematical foundations more rigorously.
Immersive Linear Algebra — Ström, Åström, and Akenine-Möller
URL: immersivemath.com
The best first stop after this course. Immersive Linear Algebra is a free, browser-based textbook with fully interactive figures — you can grab a vector and drag it while watching the dot product update in real time. It covers the same topics as this course (vectors, dot products, matrices, Gaussian elimination, determinants, rank, eigenvalues) but goes one layer deeper in each. The interactive format makes the geometry visceral in a way that static diagrams cannot.
3Blue1Brown — Essence of Linear Algebra
URL: 3blue1brown.com — Linear Algebra
If you ever felt that a concept clicked visually but you couldn't quite put it into words, this video series is the cure. Grant Sanderson's Essence of Linear Algebra is a series of sixteen short videos that build geometric intuition from the ground up. The treatment of linear transformations as "moving grid lines" is particularly good. Watch these alongside any textbook — the videos reinforce intuition while the textbook reinforces precision.
MIT OpenCourseWare 18.06 — Gilbert Strang
URL: MIT OCW 18.06 Linear Algebra
Gilbert Strang's undergraduate linear algebra course, recorded in full and freely available. Strang is a legendary teacher: his explanations are direct, his examples are geometric, and he has a gift for making abstract ideas feel inevitable. The course follows his textbook Introduction to Linear Algebra, but the video lectures stand on their own. Topics include everything in this course plus vector spaces and subspaces, orthogonality and least squares, singular value decomposition (SVD), and positive definite matrices. If you want one canonical resource for the full undergraduate treatment, this is it.
Where to start on 18.06
If the early chapters feel like review, skip ahead to the lectures on orthogonality (Lecture 14 onward). SVD — which Strang calls the climax of linear algebra — is covered in Lectures 29 and 30, and it is the foundation for the PCA technique introduced in Chapter 8.
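If you want a preview of what SVD computes before watching those lectures, here is a minimal NumPy sketch of the SVD–PCA connection (the data below is synthetic, invented for illustration):

```python
import numpy as np

# Synthetic 2D data: 200 points stretched far more along one axis than the other.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# PCA via SVD: center the data, then the right singular vectors of the
# centered matrix are the principal directions, ordered by variance.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# np.linalg.svd sorts singular values in descending order, so the first
# row of Vt points along the direction of greatest variance.
principal_direction = Vt[0]
print(S[0] >= S[1])
```

Strang's point about SVD being the "climax" is visible even here: one factorization simultaneously gives the principal directions (`Vt`) and how much variance each captures (`S`).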
Mathematics for Machine Learning — Deisenroth, Faisal, and Ong
URL: mml-book.github.io
A free PDF from Cambridge University Press. The first half of this book is a rigorous but accessible treatment of the mathematics underlying modern ML: linear algebra, analytic geometry, matrix decompositions, vector calculus, and probability. The second half applies those tools to regression, dimensionality reduction, density estimation, and SVMs. It is the bridge between "I understand the operations" and "I can read ML papers." Chapter 4, on matrix decompositions (LU, Cholesky, QR, and eigendecomposition), is especially good as a follow-up to Chapters 6–8 of this course.
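The decompositions in MML's Chapter 4 can all be tried directly in NumPy. A small sketch (the matrix below is made up for illustration; it is symmetric positive definite so that Cholesky applies):

```python
import numpy as np

# A small symmetric positive definite matrix (invented for illustration).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# QR: any matrix factors into an orthogonal Q and an upper-triangular R.
Q, R = np.linalg.qr(A)
print(np.allclose(Q @ R, A))

# Cholesky: for SPD matrices, A = L @ L.T with L lower-triangular.
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))
```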
Path 2 — I Want to Build Games
These resources are for readers who want to apply linear algebra inside game engines, physics simulations, and 3D rendering code.
3D Math Primer for Graphics and Game Development — Dunn and Parberry
URL: gamemath.com
The definitive book for developers who want to understand the math behind 3D games. The entire second edition is free to read online. It covers everything in this course — vectors, matrices, transformations — plus topics specific to 3D games: Euler angles and gimbal lock, quaternions for rotation, geometric primitives (rays, planes, AABBs, spheres), and triangle meshes. The writing is precise and practical; every section ends with applications to real rendering and physics problems. If you work in a game engine and want to stop treating quaternions as magic, start here.
Quaternions
Quaternions did not appear in this course because they require a small amount of complex number theory to motivate properly. They are the standard way game engines represent 3D rotations, avoiding the gimbal lock problem that Euler angles suffer from. Dunn and Parberry's Chapter 8 is the clearest introduction to them in any game math text.
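To make the idea concrete before reading that chapter, here is a dependency-free sketch of quaternion rotation. The Hamilton product formula is standard; the helper names and the example rotation are our own:

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians
    about the unit vector `axis`."""
    half = angle / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * v * q^-1."""
    qc = (q[0], -q[1], -q[2], -q[3])   # conjugate = inverse for a unit quaternion
    w, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), qc)
    return (x, y, z)

# 90 degrees about the z-axis takes the x-axis to the y-axis.
q = quat_from_axis_angle((0.0, 0.0, 1.0), math.pi / 2)
print(rotate(q, (1.0, 0.0, 0.0)))   # approximately (0.0, 1.0, 0.0)
```

Note what is absent: no rotation order to choose, and hence no configuration where two axes collapse into one — which is exactly the gimbal lock problem Euler angles have.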
LearnOpenGL — Transformations
URL: learnopengl.com — Transformations
LearnOpenGL is a complete, free tutorial series for modern OpenGL programming. The Transformations chapter is a natural continuation of Chapters 4 and 9 from this course: it shows how to construct rotation, scaling, and translation matrices, compose them using the GLM mathematics library, and upload them as uniforms to a GLSL vertex shader. The broader site covers the full rendering pipeline — lighting models, shadow mapping, framebuffers — all of which are linear algebra in action. Work through the "Getting Started" section first; you will find every matrix operation familiar.
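The composition that chapter performs with GLM can be sketched in NumPy. The helper names below are invented for illustration; the right-to-left convention for column vectors matches what GLM (and Chapters 4 and 9) use:

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

def scale(sx, sy, sz):
    """4x4 homogeneous scaling matrix."""
    return np.diag([sx, sy, sz, 1.0])

def rotation_z(theta):
    """4x4 homogeneous rotation about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.eye(4)
    R[:2, :2] = [[c, -s], [s, c]]
    return R

# With column vectors, transforms apply right-to-left:
# this scales first, then rotates, then translates.
M = translation(1.0, 0.0, 0.0) @ rotation_z(np.pi / 2) @ scale(2.0, 2.0, 2.0)

p = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous point
print(M @ p)   # scale -> (2,0,0), rotate -> (0,2,0), translate -> (1,2,0)
```

In the tutorial, the single matrix `M` is what gets uploaded to the vertex shader as a uniform; the GPU then applies it to every vertex.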
Unity — Scripting with Vectors
URL: docs.unity3d.com — Moving objects with vectors
Unity's official documentation on its Vector2, Vector3, and Vector4 types. This is the practical complement to the theory. The page covers vector arithmetic, dot and cross products, and scalar multiplication exactly as introduced in Chapters 2–3, but in the context of Unity C# scripting. If you are building or planning to build in Unity, reading this page alongside Chapter 3 will close the gap between the math and the API.
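Unity's API is C#, but the operations on that page translate directly. A sketch in NumPy of the two workhorses, dot and cross (the vectors here are invented for illustration; `Vector3.Dot` and `Vector3.Cross` compute the same quantities):

```python
import numpy as np

# Two direction vectors (made up for illustration).
forward = np.array([0.0, 0.0, 1.0])
to_target = np.array([1.0, 0.0, 1.0])

# The dot product of unit vectors is the cosine of the angle between them —
# the quantity behind checks like "is the target in front of me?"
cos_angle = np.dot(forward, to_target) / np.linalg.norm(to_target)
angle_deg = np.degrees(np.arccos(cos_angle))
print(round(angle_deg))   # 45

# The cross product is perpendicular to both inputs — useful for
# computing a "side" axis from forward and up vectors.
side = np.cross(forward, to_target)
print(np.dot(side, forward), np.dot(side, to_target))   # both 0
```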
The Nature of Code — Daniel Shiffman
URL: natureofcode.com
A free online book (and associated Processing/p5.js code) that builds physics simulations — forces, particles, springs, flocking, cellular automata — from first principles. The linear algebra is relatively light, but the book is exceptional at showing how vectors, forces, and the dot product translate into motion. Chapters 1–3 (Vectors, Forces, Oscillations) are an excellent complement to Chapters 2–3 of this course for anyone interested in simulation and procedural animation.
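The core loop of those early chapters — force becomes acceleration, acceleration changes velocity, velocity changes position — fits in a few lines. A minimal sketch in the book's spirit (the constants and starting values are invented for illustration):

```python
# A particle under gravity, stepped forward with Euler integration.
gravity = (0.0, -9.8)   # constant acceleration (m/s^2)
dt = 0.1                # timestep (s)

pos = [0.0, 10.0]       # starting position
vel = [2.0, 0.0]        # starting velocity: drifting right

for _ in range(5):
    # Euler integration: acceleration updates velocity,
    # then velocity updates position.
    vel[0] += gravity[0] * dt
    vel[1] += gravity[1] * dt
    pos[0] += vel[0] * dt
    pos[1] += vel[1] * dt

print(pos)   # the particle has drifted right and fallen
```

Everything in Shiffman's first three chapters — springs, drag, attraction — is a variation on which vector gets added into the acceleration each frame.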
Path 3 — I Want to Do ML and AI
These resources are for readers who want to use the linear algebra from this course as a foundation for machine learning, deep learning, and data science.
Dive into Deep Learning — Appendix: Mathematics for Deep Learning
URL: d2l.ai — Geometry and Linear Algebraic Operations
Dive into Deep Learning (D2L) is an open-source textbook used in university ML courses worldwide. The mathematics appendix covers the geometry of high-dimensional vectors, matrix operations, eigendecompositions, and how all of this connects to modern neural network architectures. What makes D2L distinctive is that every section includes runnable code in PyTorch (and optionally JAX/TensorFlow), so you can go directly from reading to experimenting. The full book covers CNNs, RNNs, attention mechanisms, and transformers — all with the same combination of math and working code.
PyTorch — Build the Neural Network
URL: docs.pytorch.org — Build the Neural Network
The official PyTorch tutorial that Chapter 9 of this course references. It walks through defining an nn.Module, building a small fully-connected network, and tracing how dimensions flow from layer to layer. After following Chapter 9's worked example by hand, this tutorial is the next step: writing the same computation in a framework that will train and differentiate it for you. The broader Learn the Basics series (linked from that page) covers datasets, training loops, and saving models — everything you need to run your first real experiment.
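Before switching to the framework, it can help to see the tutorial's dimension bookkeeping stripped to bare matrix arithmetic. A NumPy sketch using the tutorial's 28×28 → 512 → 512 → 10 layer sizes (the random weights are stand-ins, not trained parameters):

```python
import numpy as np

# Layer sizes from the tutorial's fully-connected MNIST network.
rng = np.random.default_rng(0)
sizes = [28 * 28, 512, 512, 10]

# One weight matrix and bias vector per layer; W maps n inputs to m outputs.
weights = [rng.normal(0, 0.01, (m, n)) for n, m in zip(sizes, sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    """Forward pass: matrix-vector products with ReLU on hidden layers."""
    for i, (W, b) in enumerate(zip(weights, biases)):
        x = W @ x + b
        if i < len(weights) - 1:      # ReLU on hidden layers only
            x = np.maximum(x, 0.0)
    return x

logits = forward(rng.normal(size=28 * 28))
print(logits.shape)   # (10,)
```

Every line of this maps onto a line of the PyTorch version: `W @ x + b` is `nn.Linear`, `np.maximum(x, 0.0)` is `nn.ReLU` — the framework's job is to train the weights that are random here.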
Mathematics for Machine Learning — Deisenroth, Faisal, and Ong
URL: mml-book.github.io
Listed under Theory as well, because it belongs on both lists. For the ML path, focus on:
- Chapter 4 (Matrix Decompositions) — the eigendecomposition and SVD that power PCA, collaborative filtering, and latent semantic analysis
- Chapter 10 (Dimensionality Reduction with PCA) — a full derivation of the technique introduced in Chapter 8 of this course
- Chapter 9 (Linear Regression) — the least-squares solution $\hat{\mathbf{x}} = (A^T A)^{-1} A^T \mathbf{b}$, derived from the system-solving ideas in Chapter 6
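The least-squares formula in that last bullet can be checked numerically in a few lines. A sketch with made-up data (NumPy's `lstsq` is the numerically preferred route; the normal equations are shown to match the formula above):

```python
import numpy as np

# An overdetermined system (data invented for illustration): fit
# y ~ a*x + b to four points, i.e. solve A @ [a, b] = y in the
# least-squares sense.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])
A = np.column_stack([x, np.ones_like(x)])

# The normal-equations formula x_hat = (A^T A)^{-1} A^T y ...
coef_normal = np.linalg.inv(A.T @ A) @ A.T @ y

# ... agrees with NumPy's dedicated least-squares solver.
coef_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(coef_normal, coef_lstsq))
```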
Stanford CS229 Notes — Linear Algebra Review
URL: cs229.stanford.edu — Section Notes
Andrew Ng's Stanford machine learning course (CS229) includes a concise, freely available linear algebra review written for ML practitioners. It covers vectors, matrices, the trace and transpose, norms, special matrices (symmetric, orthogonal, diagonal), matrix calculus, and the eigenvalue decomposition. This is a good rapid reference when you are reading an ML paper and encounter notation that feels unfamiliar. It is denser than this course but shorter than any full textbook.
Quick Reference Table
| Resource | Format | Focus | Cost |
|---|---|---|---|
| Immersive Linear Algebra | Interactive web | Theory + geometry | Free |
| 3Blue1Brown Essence of LA | Video series | Visual intuition | Free |
| MIT OCW 18.06 | Video lectures | Full undergrad theory | Free |
| Mathematics for Machine Learning | PDF / print | Theory + ML | Free PDF |
| 3D Math Primer (Dunn & Parberry) | Web / print | Game math | Free online |
| LearnOpenGL Transformations | Web tutorial | OpenGL / GLSL | Free |
| Unity Vector Docs | Documentation | Unity / C# | Free |
| The Nature of Code | Web / print | Simulation / animation | Free |
| D2L Mathematics Appendix | Web + code | Deep learning math | Free |
| PyTorch Build a Neural Network | Web tutorial | PyTorch basics | Free |
| MML Book | PDF / print | ML foundations | Free PDF |
| Stanford CS229 Notes | PDF notes | ML notation review | Free |
The topics you have studied here — vectors, matrices, linear transformations, eigenvalues — are the foundation that every path above builds on. You are ready for all of them.