
# Linear Algebra for Machine Learning

Last Updated on August 9, 2019

You do not need to learn linear algebra before you get started in machine learning, but at some time you may wish to dive deeper.

In fact, if there was one area of mathematics I would suggest improving before the others, it would be linear algebra. It will give you the tools to help you with the other areas of mathematics required to understand and build better intuitions for machine learning algorithms.

In this post we take a closer look at linear algebra and why you should make the time to improve your skills and knowledge in linear algebra if you want to get more out of machine learning.

If you already know your way around eigenvectors and SVD, this post is probably not for you.

Discover vectors, matrices, tensors, matrix types, matrix factorization, PCA, SVD and much more in my new book, with 19 step-by-step tutorials and full source code.

Let’s get started.

*Linear Algebra for Machine Learning. Photo by Sarah, some rights reserved.*

## What is Linear Algebra?

Linear Algebra is a branch of mathematics that lets you concisely describe coordinates and interactions of planes in higher dimensions and perform operations on them.

Think of it as an extension of algebra (dealing with unknowns) into an arbitrary number of dimensions. Linear Algebra is about working on linear systems of equations (linear regression is an example: y = Ax). Rather than working with scalars, we start working with matrices and vectors (vectors are really just a special type of matrix).
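To make that concrete, here is a minimal sketch of posing and solving a small system y = Ax with NumPy (the 2x2 matrix and values here are made up purely for illustration):

```python
import numpy as np

# A small linear system y = Ax with a 2x2 coefficient matrix
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([5.0, 10.0])

# Solve for the unknown vector x
x = np.linalg.solve(A, y)
print(x)  # [1. 3.]
```

This is the matrix-and-vector view of "dealing with unknowns": two equations in two unknowns, answered in one library call.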

Broadly speaking, in linear algebra data is represented in the form of linear equations. These linear equations are in turn represented in the form of matrices and vectors.

— Vignesh Natarajan in answer to the question “How is Linear Algebra used in Machine Learning?”

As a field, it’s useful to you because you can describe (and even execute with the right libraries) complex operations used in machine learning using the notation and formalisms from linear algebra.

Linear algebra finds widespread application because it generally parallelizes extremely well. Further to that, most linear algebra operations can be implemented without message passing, which makes them amenable to MapReduce implementations.

— Raphael Cendrillon in answer to the question “Why is Linear Algebra a prerequisite behind modern scientific/computational research?”

### Need help with Linear Algebra for Machine Learning?

Take my free 7-day email crash course now (with sample code).

Click to sign-up and also get a free PDF Ebook version of the course.

## Minimum Linear Algebra for Machine Learning

Linear Algebra is a foundational field. By this I mean that its notation and formalisms are used by other branches of mathematics to express concepts that are also relevant to machine learning.

For example, matrices and vectors are used in calculus, needed when you want to talk about function derivatives when optimizing a loss function. They are also used in probability when you want to talk about statistical inference.

…it’s used everywhere in mathematics, so you’ll find it used wherever math is used…

— David Joyce, in answer to the question “What is the point of linear algebra?”

If I were to convince you to learn a minimum of linear algebra to improve your capabilities in machine learning, it would be the following 3 topics:

• Notation: Knowing the notation will let you read algorithm descriptions in papers, books and websites to get an idea of what is going on. Even if you use for-loops rather than matrix operations, at least you will be able to piece things together.
• Operations: Working at the next level of abstraction in vectors and matrices can make things clearer. This can apply to descriptions, to code and even to thinking. Learn how to do or apply simple operations like adding, multiplying, inverting, transposing, etc. matrices and vectors.
• Matrix Factorization: If there was one deeper area I would recommend diving into over any other, it would be matrix factorization, specifically matrix decomposition methods like SVD and QR. The numerical precision of computers is limited, and working with decomposed matrices lets you sidestep a lot of the overflow/underflow madness that can result. Also, a quick LU, SVD, or QR decomposition using a library will give you an ordinary least squares solution for your regression problem, a bedrock of machine learning and statistics.
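As a sketch of that last point, here is how a QR decomposition from a library yields an ordinary least squares fit in NumPy (the data below is synthetic and the variable names are my own, purely for illustration):

```python
import numpy as np

# Synthetic data: y is roughly 2*x + 1 with a little noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.normal(scale=0.01, size=x.size)

# Design matrix with a column of ones for the intercept
A = np.column_stack([x, np.ones_like(x)])

# Ordinary least squares via QR: A = QR, then solve R @ coef = Q^T y
Q, R = np.linalg.qr(A)
coef = np.linalg.solve(R, Q.T @ y)
print(coef)  # approximately [2.0, 1.0] (slope, intercept)
```

Solving via QR rather than forming the normal equations A^T A is one example of the numerical care mentioned above: the decomposed form is far better conditioned.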

If you want to get into the theory of it all, you need to know linear algebra. If you want to read white papers and consider cutting edge new algorithms and systems, you need to know a lot of math.

— Jesse Reiss in answer to the question “How important is linear algebra in computer science?”

## 5 Reasons To Improve Your Linear Algebra

Of course, I don’t want you to stop at the minimum. I want you to go deeper.

If the desire to know more and get better doesn’t motivate you down the path, here are five reasons that might give you that push.

1. Building Block: Let me state it again. Linear algebra is absolutely key to understanding the calculus and statistics you need in machine learning. Better linear algebra will lift your game across the board. Seriously.
2. Deeper Intuition: If you can understand machine learning methods at the level of vectors and matrices you will improve your intuition for how and when they work.
3. Get More From Algorithms: A deeper understanding of the algorithm and its constraints will allow you to customize its application and better understand the impact of tuning parameters on the results.
4. Implement Algorithms From Scratch: You require an understanding of linear algebra to implement machine learning algorithms from scratch. At the very least to read the algorithm descriptions and at best to effectively use the libraries that provide the vector and matrix operations.
5. Devise New Algorithms: The notation and tools of linear algebra can be used directly in environments like Octave and MATLAB allowing you to prototype modifications to existing algorithms and entirely new approaches very quickly.

Linear Algebra will feature heavily in your machine learning journey whether you like it or not.

## 3 Video Courses To Learn Linear Algebra

If you are looking to beef up your linear algebra, there are three options that you could start with.

These are video courses and lectures I found and went through recently in preparation for this post. I found each decent and suited to a different audience.

I watch all videos at double speed, and definitely recommend doing so with all of these sources. Also, take notes.

### 1. Linear Algebra Refresher

This is a quick whip around the topics in linear algebra you should be familiar with. It is for those who took linear algebra in college and are looking for a reminder rather than an education.

The video is titled “Linear Algebra for machine learning” and was created by Patrick van der Smagt using slides from University College London.

### 2. Linear Algebra Crash Course

The second option is the linear algebra crash course presented as an optional module in Week 1 of Andrew Ng’s Coursera Machine Learning course.

This is suited to the engineer or programmer who has little or no familiarity with linear algebra and is looking for a first bootstrap into the topic.

It contains 6 short videos and you can access a YouTube playlist here titled “Machine Learning – 03. Linear Algebra Review“.

The topics covered include:

1. Matrices and Vectors
2. Addition and Scalar Multiplication
3. Matrix Vector Multiplication
4. Matrix Matrix Multiplication
5. Matrix Multiplication Properties
6. Inverse and Transpose
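These topics map directly onto a handful of NumPy calls; a minimal sketch (the example matrix and vector are made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([1.0, 1.0])

print(A @ v)             # matrix-vector multiplication: [3. 7.]
print(A @ A)             # matrix-matrix multiplication
print(A.T)               # transpose
print(np.linalg.inv(A))  # inverse (A must be non-singular)

# A useful property check: A @ inv(A) recovers the identity matrix,
# even though matrix multiplication is not commutative in general
print(A @ np.linalg.inv(A))
```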

### 3. Linear Algebra Course

The third option is to take a complete introductory course into Linear Algebra. A slow grind that puts the whole field into your head.

I recommend the Linear Algebra stream on Khan Academy.

It’s amazing. Not only is the breadth impressive, with spot-check questions throughout, but Sal is a great communicator who cuts straight to the applied side of the material. Much better than any university course I took.

Sal’s course is divided into 3 main modules:

• Vector Spaces
• Matrix Transformations
• Alternative Coordinate Systems (bases)

Each module contains 5-7 sub-modules, and each sub-module contains 2-7 videos or question sets that range from 5 to 25 minutes (faster at double speed!).

It’s great material and a slow burn, and I would recommend doing all of it, perhaps in weekend binges.

### More Resources To Learn Linear Algebra

If you are looking for more general advice, check out the answers to the question “How can I self study Linear Algebra?“. There are some real gems in here.

## Programming Linear Algebra

As a programmer or engineer, you likely learn best by doing. I know I do.

As such, you may wish to grab a programming environment or library and start coding up matrix multiplication, SVD and QR decompositions with test data.

Below are some options you might like to consider.

• Octave: Octave is the open source counterpart to MATLAB, and for most operations they are equivalent. These platforms were built for linear algebra. This is what they do, and they do it very well. They are a joy to use.
• R: R can do it, but it’s less beautiful than Octave. Check out this handy report: “Introduction to linear algebra with R” (PDF)
• SciPy numpy.linalg: Easy and fun if you are a Python programmer with clean syntax and access to all the operations you need.
• BLAS: The Basic Linear Algebra Subprograms, standard low-level routines such as vector and matrix multiplication. Ported to or available in most programming languages.
• LAPACK: The Linear Algebra PACKage, successor to LINPACK. The place to go for the various matrix factorizations and the like. Like BLAS, ported to or available in most programming languages.
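If you go the Python route, a minimal sketch of exercising SVD and QR on test data with numpy.linalg might look like this (the test matrix is random, seeded only so the run is repeatable):

```python
import numpy as np

# Random test matrix
rng = np.random.default_rng(42)
M = rng.normal(size=(4, 3))

# Singular value decomposition: M = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(M, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, M)

# QR decomposition: M = Q @ R, with Q orthonormal and R upper triangular
Q, R = np.linalg.qr(M)
assert np.allclose(Q @ R, M)

print("singular values:", s)
```

Reassembling the original matrix from its factors, as the assertions do, is a handy sanity check when learning by doing.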

There’s also a Coursera course titled “Coding the Matrix: Linear Algebra through Computer Science Applications” by Philip Klein, which has an accompanying book of the same name, “Coding the Matrix: Linear Algebra through Applications to Computer Science”. It may be worth a look if you are a Python programmer looking to beef up your linear algebra.

## Linear Algebra Books

I learn best from applied examples, but I also read a lot. If you’re anything like me, you’ll want a good textbook on the shelf, just in case.

This section lists some of the top textbooks on Linear Algebra for beginners.

### Foundations

These are beginner textbooks that cover the foundations of linear algebra. Either would be a good complement to taking the course on Khan Academy.

### Applied

These are books that lean more towards the application of linear algebra.

I really like the latter book, “Matrix Computations”, because it gives you snippets of theory and algorithm pseudocode. Very cool for both the math guy and the programming guy in me. If you want to implement the procedures yourself from scratch (rather than use a library), this may be the book for you.

For more suggestions of good beginner books on Linear Algebra, check out: What is the best book for learning Linear Algebra?

## Summary

In this post you have taken a look at Linear Algebra and the important role it plays in Machine Learning (and in broader mathematics). You also noted the minimum of linear algebra worth looking at first.

We touched on three options that you can use to learn linear algebra, a refresher, crash course or a deeper video course, all available to you now for free. We also looked at the top textbooks on the topic in case you wanted to go deeper.

I hope this has sparked your interest in the importance and power of getting better at linear algebra. Pick one resource and read/watch it to completion. Take that next step and improve your understanding of machine learning.

Update: Two additional high-quality resources mentioned in the Reddit discussion of this post are the book Linear Algebra Done Right by Sheldon Axler and the MIT OpenCourseWare course on Linear Algebra taught by Gilbert Strang (author of some of the books mentioned above).

## Get a Handle on Linear Algebra for Machine Learning!

#### Develop a working understanding of linear algebra

…by writing lines of code in Python

Discover how in my new Ebook:
Linear Algebra for Machine Learning

It provides self-study tutorials on topics like:
Vector Norms, Matrix Multiplication, Tensors, Eigendecomposition, SVD, PCA and much more…