Introduction — Why Should a Developer Care?

Linear algebra is the invisible engine behind game physics, 3D graphics, and machine learning. This chapter shows you where it hides, why it matters, and what you will learn in this course. By the end of these pages you will know what vectors, matrices, and transformations are at a high level — and you will have every reason to keep reading.

A Bullet, a Wall, and a Whole Lot of Math

Picture this. You are playing a first-person shooter. Your character sprints down a corridor, rounds a corner, and fires. The bullet tears through the air, strikes a metal wall at an angle, and ricochets into a crate. The crate splinters. Fragments scatter. The overhead light casts a shadow that shifts as a door swings open behind you.

That sequence — movement, collision, reflection, lighting, shadow — took maybe two seconds of gameplay. Underneath, the game engine performed thousands of mathematical operations every frame. Almost all of them belong to a single branch of mathematics: linear algebra.

The bullet's trajectory? A vector. The ricochet angle off the wall? A dot product. The rotation of the door? A matrix. The camera that followed your character around the corner? Another matrix. The projection that flattened the entire 3D scene onto your flat monitor? Yet another matrix.

If you have ever opened a game engine like Unity or Unreal and encountered a Transform component with fields for position, rotation, and scale, you were staring at linear algebra in a trench coat.

So What Is Linear Algebra?

Linear algebra is the branch of mathematics that studies vectors, matrices, and linear transformations — functions that map vectors to other vectors in a straight-line, proportional way. The Encyclopaedia Britannica defines it as "a mathematical discipline that deals with vectors and matrices and, more generally, with vector spaces and linear transformations."[^1]

That sounds abstract, so let us make it concrete.

  • A vector is a list of numbers that describes a magnitude and a direction. In a game, it might represent a character's velocity: "10 meters per second to the north-east."
  • A matrix is a grid of numbers that describes a transformation. It can rotate, scale, stretch, or squish things. When you rotate a 3D model, you multiply its vertices by a rotation matrix.
  • A linear transformation is the idea behind the matrix — a function that takes vectors in and sends vectors out, preserving the basic structure of space (straight lines stay straight, the origin stays put).
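The three objects above can be shown in a few lines of NumPy. This is a minimal sketch (the specific numbers are illustrative): a 2D velocity vector, a 90-degree rotation matrix, and the transformation that applies one to the other.

```python
import numpy as np

# A vector: a character's velocity, "north-east at 10 m/s" in 2D.
velocity = np.array([7.07, 7.07])      # roughly 10 * (cos 45°, sin 45°)

# A matrix: a 90-degree counter-clockwise rotation of the plane.
rotate_90 = np.array([[0.0, -1.0],
                      [1.0,  0.0]])

# A linear transformation: the matrix acting on the vector.
rotated = rotate_90 @ velocity          # north-east becomes north-west
print(rotated)
```

Note that the origin stays fixed and straight lines stay straight, exactly as the definition above requires.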

These three objects — vectors, matrices, and transformations — are the core vocabulary of linear algebra, and they will be the core vocabulary of this course.

Where Game Developers Meet Linear Algebra

If you build games or interactive 3D applications, linear algebra is not optional. It is the language your tools speak. Here is a quick tour of where it shows up.

Positioning and movement

Every object in a game world has a position, usually stored as a vector (x, y, z). Moving that object means adding a velocity vector to its position vector, frame after frame. This is vector addition — the simplest operation in linear algebra, and one of the most frequently executed.

Think about a racing game. Each car has a position vector and a velocity vector. Every frame — sixty times per second — the engine adds the velocity, scaled by the frame's elapsed time, to the position. If the car is at position (100, 0, 50) and its per-frame displacement is (2, 0, -1), then one frame later it is at (102, 0, 49). That tiny addition, repeated millions of times across every object in the scene, drives the entire simulation forward.
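The racing-game update above is a one-liner in code. Here is a minimal sketch using the numbers from the text:

```python
import numpy as np

position = np.array([100.0, 0.0, 50.0])   # the car's position from the text
velocity = np.array([2.0, 0.0, -1.0])     # displacement per frame

# One frame of the simulation: plain vector addition.
position = position + velocity
print(position)                            # → [102.   0.  49.]
```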

Rotation, scaling, and transformation

Rotating a spaceship, scaling a UI element, or skewing a sprite are all matrix operations. In the standard graphics pipeline, each object passes through a chain of transformations — from model space (the object's own coordinate system) to world space to camera space to screen space. Each step is a matrix multiplication.[^2]

A rotation around the y-axis in 3D, for instance, is represented by a specific 3×3 matrix whose entries involve the sine and cosine of the rotation angle.[^2] You do not need to re-derive the formula each time; the matrix packages it neatly.

Here is the beautiful part: because each transformation is a matrix, you can compose them. Need to scale an object, then rotate it, then move it into world position? Multiply the three matrices together to get a single matrix that does all three at once. This is why graphics APIs like OpenGL and DirectX revolve around matrix stacks — they are composing transformations for you, layer by layer.[^2]
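A sketch of that composition, assuming a y-axis rotation and a uniform scale (translation needs the 4×4 homogeneous form that graphics APIs use, so it is omitted here):

```python
import numpy as np

def rotation_y(theta):
    """3x3 rotation about the y-axis; entries are the sine and cosine of the angle."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

scale = np.diag([2.0, 2.0, 2.0])            # uniform scale by 2

# Compose: one matrix that scales, then rotates.
# (Matrix products apply right-to-left: the rightmost factor acts first.)
model = rotation_y(np.pi / 2) @ scale

vertex = np.array([1.0, 0.0, 0.0])
print(model @ vertex)                        # ≈ (0, 0, -2): scaled, then rotated
```

The key point: `model` is a single matrix, so every vertex pays for one multiplication instead of two.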

Lighting and shading

When a game engine decides how brightly to shade a surface, it compares two vectors: the direction the surface faces (the normal vector) and the direction the light is coming from. The comparison tool is the dot product. If the two vectors point in roughly the same direction, the surface is well-lit. If they are perpendicular, the surface gets no direct light. This is the basis of Lambertian shading, one of the oldest and most common lighting models in real-time graphics.
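The Lambertian rule above fits in a few lines. This sketch assumes both vectors are normalized before the dot product and clamps negative values to zero (surfaces facing away from the light get no direct light):

```python
import numpy as np

def lambert(normal, light_dir):
    """Diffuse brightness: dot product of the unit normal and unit light direction."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return max(0.0, float(n @ l))   # clamp: no negative brightness

# Surface facing the light: fully lit.
print(lambert(np.array([0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0])))  # → 1.0

# Light perpendicular to the surface: no direct light.
print(lambert(np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0])))  # → 0.0
```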

The dot product also powers another common game mechanic: field-of-view checks. Want to know if an enemy is in front of the player? Compute the dot product between the player's forward direction and the vector pointing toward the enemy. A positive result means the enemy is ahead; a negative result means the enemy is behind. One line of linear algebra replaces a tangle of angle calculations.
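That field-of-view check, sketched with a hypothetical `is_in_front` helper:

```python
import numpy as np

def is_in_front(player_pos, player_forward, enemy_pos):
    """Positive dot product => enemy is within 90 degrees of the forward direction."""
    to_enemy = enemy_pos - player_pos
    return float(player_forward @ to_enemy) > 0.0

forward = np.array([0.0, 0.0, 1.0])

print(is_in_front(np.zeros(3), forward, np.array([1.0, 0.0, 5.0])))   # → True
print(is_in_front(np.zeros(3), forward, np.array([0.0, 0.0, -3.0])))  # → False
```

No trigonometry, no angle wrapping: the sign of a single dot product does the work.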

Physics and collision detection

Detecting whether two objects overlap, calculating bounce angles, and simulating rigid-body dynamics all lean on vectors, dot products, cross products, and matrix transformations. The physics engine is, at heart, a linear algebra engine.

TIP

If you have ever looked at the source code of a physics engine and felt overwhelmed by the math, this course is designed to fix that. Every formula has a geometric story behind it.

Where Machine Learning Meets Linear Algebra

Games are not the only place. Over the past decade, machine learning — and especially deep learning — has become one of the largest consumers of linear algebra on the planet.

A neural network is, at its core, a chain of matrix multiplications punctuated by simple nonlinear functions. Each layer of the network takes an input vector, multiplies it by a weight matrix, adds a bias vector, and then applies an activation function. The output becomes the input to the next layer.[^3]

Here is the key insight: the weight matrix is the layer. Everything the network has "learned" is encoded in the numbers inside its matrices. Training a neural network means adjusting those numbers until the network's output matches reality.

Consider a tiny example. Suppose a neural network layer accepts three inputs and produces two outputs. That layer's weight matrix is a 2×3 grid of numbers. Multiplying the 3-element input vector by this matrix produces a 2-element output vector — a transformation from a three-dimensional space to a two-dimensional one.[^3] Stack several of these layers together, and you get a network that can classify images, translate languages, or generate text.
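The tiny layer above can be written out directly. The weights and bias below are made-up numbers chosen only to illustrate the shapes; the activation here is ReLU, a common choice:

```python
import numpy as np

# Hypothetical weights and bias for the 3-input, 2-output layer from the text.
W = np.array([[0.5, -0.2,  0.1],
              [0.3,  0.8, -0.5]])          # 2x3 weight matrix
b = np.array([0.1, -0.1])                  # bias vector

def layer(x):
    """One layer: multiply by W, add b, apply the ReLU activation."""
    return np.maximum(0.0, W @ x + b)

x = np.array([1.0, 2.0, 3.0])              # 3-element input
print(layer(x))                            # → [0.5 0.3], a 2-element output
```

A 3-vector went in, a 2-vector came out: the matrix carried the input from three dimensions down to two, exactly as described.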

This is not a loose analogy. A neural network's forward pass is literally a sequence of matrix multiplications. When you call model.forward(x) in PyTorch, the framework multiplies your input tensor by weight matrices, one layer at a time. The nonlinear activation functions between layers are the only things that prevent the entire network from collapsing into a single matrix. Understanding this architecture — why linearity alone is not enough, and what the matrices contribute — starts with understanding linear algebra.

Beyond neural networks, linear algebra underpins:

  • Principal Component Analysis (PCA), which reduces high-dimensional data to its most important axes using eigenvectors and eigenvalues.[^4]
  • Recommendation systems, which decompose user-item matrices to predict preferences.[^4]
  • Computer vision, where images are matrices of pixel values that get multiplied, convolved, and transformed.
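The first bullet can be sketched end to end. This is a bare-bones PCA on random data (real pipelines would use a library routine, and the data here is synthetic): center the data, form the covariance matrix, take its eigenvectors, and project onto the top axes.

```python
import numpy as np

# PCA sketch: eigenvectors of the covariance matrix give the principal axes.
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 3))            # 200 samples, 3 features
data -= data.mean(axis=0)                   # center the data

cov = data.T @ data / (len(data) - 1)       # 3x3 covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh: sorted ascending

# Project onto the two axes with the largest eigenvalues: 3D -> 2D.
top_two = eigenvectors[:, -2:]
reduced = data @ top_two
print(reduced.shape)                        # → (200, 2)
```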

If you plan to work with any ML framework — PyTorch, TensorFlow, JAX — you will be working with tensors, which are just multi-dimensional generalizations of vectors and matrices. Understanding linear algebra means understanding what those frameworks are actually doing under the hood.

INFO

You do not need to know machine learning to follow this course. The ML examples are here to show breadth, but the core content focuses on building geometric intuition that applies everywhere.

A Preview of the Journey

Here is the road map for the rest of the course. Each chapter builds on the previous one, and every concept connects back to real applications.

  1. Vectors — What they are, how to add and scale them, and how they represent positions, velocities, and directions in a game world.
  2. The Dot Product — How to measure the angle between two vectors and why that matters for lighting, line-of-sight checks, and collision detection.
  3. Matrices — Grids of numbers that encode transformations. You will learn to multiply them, compose them, and read them as recipes for rotating, scaling, and moving space.
  4. Linear Transformations — The deeper idea: a matrix is a function. You will learn to see what a transformation does by reading its columns.
  5. Solving Systems — How to use matrices to solve systems of linear equations, and what happens when solutions do not exist.
  6. Determinants — A single number that tells you how much a transformation stretches or squishes space, and whether it can be reversed.
  7. Eigenvalues and Eigenvectors — The special vectors that a transformation only stretches, never rotates. These power PCA, physics simulations, and more.
  8. Putting It All Together — Two extended walkthroughs: a 3D game physics pipeline and a neural network forward pass, end to end.

By the final chapter, you will be able to pick up a game engine manual or an ML research paper and read the matrix equations without flinching. That is the promise.

Why You, Specifically

You might be thinking: "I already use Unity (or PyTorch, or Three.js). The framework handles the math for me. Why should I learn what is happening inside?"

Fair question. Three reasons.

First, debugging. When a model renders inside-out, when a physics simulation explodes, when a neural network produces garbage — these bugs are often linear algebra bugs. A transposed matrix. A non-normalized vector. A singular matrix where an invertible one was expected. Understanding the math turns a mysterious black screen into a solvable problem.

Second, performance. Knowing that two successive rotations can be collapsed into a single matrix multiplication — instead of rotating every vertex twice — is the kind of optimization that only comes from understanding the underlying algebra.

Third, creative power. The developers who push boundaries — who invent new shaders, new physics effects, new ML architectures — are the ones who understand the math well enough to bend it. Want to write a custom vertex shader that warps space around a black hole? You need to understand how transformation matrices work. Want to design a novel neural network layer? You need to know what a matrix multiplication does to the geometry of your data. Linear algebra is not a gate you pass through once. It is a tool you pick up and use, over and over, for the rest of your career.

TIP

You do not need to remember everything from a previous math class. This course assumes you have a hazy memory of vectors and matrices at best. We will rebuild from the ground up.

Let's Go

The next chapter starts where every journey starts: with a single step — or rather, a single vector. We will define what a vector is, give it a direction and a magnitude, and watch it move a character across a game world.

Turn the page.

References

[^1]: "Linear algebra." Encyclopaedia Britannica. https://www.britannica.com/science/linear-algebra

[^2]: Snyder, W. "Computer Graphics — Linear Algebra, Geometry, and Computation." Boston University CS 132, Spring 2021. https://www.cs.bu.edu/fac/snyder/cs132-book/L13ComputerGraphics-Spring2021.html

[^3]: "Linear Algebra." Dive into Deep Learning, Section 2.3. https://d2l.ai/chapter_preliminaries/linear-algebra.html

[^4]: "What Is Linear Algebra for Machine Learning?" IBM. https://www.ibm.com/think/topics/linear-algebra-for-machine-learning