
Linear Algebra Resources

Cosmin Negruseri
June 15, 2017

Linear algebra is very useful in engineering, but in school it is usually taught as a dry subject: problems seem to be solved by mechanically following rules, without much intuition behind them.

I've watched a short course on YouTube with good insights into the geometric intuition behind linear algebra concepts. I highly recommend it.

Essence of linear algebra

Thanks to Catalin Tiseanu for suggesting it.

For a machine learning view of linear algebra, you can go through chapter 2 of the Deep Learning book, available online:
Deep Learning, Chapter 2: Linear Algebra

I recently attended a Q&A session about this chapter, and about deep learning in general, given by Yaroslav Bulatov (OpenAI, previously Google Street View). It's pretty good both for beginners and for more advanced people.
Yaroslav Bulatov (OpenAI) Q&A Deep Learning Book, Chapter 2: Linear Algebra

The best resource is Gilbert Strang's linear algebra course taught at MIT. He explains things very clearly and with a lot of simple examples. It's all available on YouTube!
Gilbert Strang MIT Linear Algebra Video Lectures
You may want to play it at 1.5x or 2x speed though :). There are a lot of lectures; you can skip to the ones you're interested in.

If you just want some visual intuition behind eigenvalues and eigenvectors, there's a very good blog post:
Eigenvalues and Eigenvectors Explained Visually
where the authors have dynamic visualizations. It's super fun to move things around and observe the effects.
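If you want to poke at the same idea numerically, here's a minimal NumPy sketch (the matrix is my own illustrative choice, not from the post): an eigenvector is a direction that a matrix only stretches, never rotates.

```python
import numpy as np

# An illustrative 2x2 matrix (upper triangular, so its
# eigenvalues 2 and 3 can be read off the diagonal).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # The defining property: A only scales v, it doesn't rotate it.
    assert np.allclose(A @ v, lam * v)
    print(f"eigenvalue {lam:.1f}, eigenvector {v}")
```

Moving the matrix entries around and re-running this is a rough command-line analogue of dragging things in the interactive visualizations.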

There are lots of applications of linear algebra:
- PageRank, the algorithm behind Google's success, is based on eigenvalues and eigenvectors
- the winning entry in the Netflix Prize was based on Singular Value Decomposition
- 3D games use matrix multiplication to compute rotations, translations, and shearing transforms
- in machine learning, whether the function you are learning is well conditioned for gradient descent corresponds to the ratio between the largest and smallest eigenvalues of the Hessian matrix (the condition number) being small
- again in machine learning, figuring out whether you're at a saddle point, a local minimum, or a local maximum can be done by looking at the signs of the eigenvalues of the Hessian at the current point
- and of course they are sometimes used in coding contests :) (fast matrix exponentiation, Gaussian elimination, etc.)
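As a small illustration of the contest trick mentioned last, here's a sketch of fast matrix exponentiation computing Fibonacci numbers modulo a prime (the function names and the modulus are my own choices, just the standard contest setup):

```python
def mat_mult(a, b, mod):
    # Multiply two 2x2 matrices modulo `mod`.
    return [[(a[0][0] * b[0][0] + a[0][1] * b[1][0]) % mod,
             (a[0][0] * b[0][1] + a[0][1] * b[1][1]) % mod],
            [(a[1][0] * b[0][0] + a[1][1] * b[1][0]) % mod,
             (a[1][0] * b[0][1] + a[1][1] * b[1][1]) % mod]]

def mat_pow(m, n, mod):
    # Binary exponentiation: O(log n) matrix multiplications.
    result = [[1, 0], [0, 1]]  # 2x2 identity matrix
    while n > 0:
        if n & 1:
            result = mat_mult(result, m, mod)
        m = mat_mult(m, m, mod)
        n >>= 1
    return result

def fibonacci(n, mod=10**9 + 7):
    # [[1, 1], [1, 0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]],
    # so F(n) sits in the top-right corner.
    if n == 0:
        return 0
    return mat_pow([[1, 1], [1, 0]], n, mod)[0][1]

print(fibonacci(10))  # F(10) = 55
```

The same O(log n) exponentiation idea works for any linear recurrence once you encode it as a matrix.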

Have fun!
