There are lots of applications of linear algebra:
- PageRank, the algorithm behind Google's success, is based on eigenvalues and eigenvectors
- the winning entry in the Netflix Prize was based on Singular Value Decomposition
- 3D games use matrix multiplication to compute rotations, translations, and shearing transforms
- in machine learning, figuring out whether the function you are learning is well conditioned for Gradient Descent corresponds to the Hessian matrix having a small ratio between its largest and smallest eigenvalues (a condition number close to 1)
- again in machine learning, you can tell whether you are at a saddle point, a local minimum, or a local maximum by looking at the signs of the eigenvalues of the Hessian at that point
- and of course they sometimes show up in programming contests :) (fast matrix exponentiation, Gaussian elimination etc.)
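The PageRank idea can be sketched with power iteration: repeatedly multiplying by a column-stochastic link matrix converges to its dominant eigenvector, whose entries are the page scores. This is an illustrative toy (the three-page link matrix below is made up by me, not Google's actual setup):

```python
import numpy as np

# Hypothetical 3-page web: entry M[i][j] is the probability of following
# a link from page j to page i; each column sums to 1.
M = np.array([[0.0, 0.5, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0]])

v = np.full(3, 1.0 / 3.0)      # start from a uniform distribution
for _ in range(100):
    v = M @ v                  # one step of power iteration
    v /= v.sum()               # renormalize to keep a probability vector

# v now approximates the eigenvector of M for eigenvalue 1,
# i.e. the stationary ranking of the three pages
print(v)
```

Because the second-largest eigenvalue of this matrix has absolute value 0.5, the iteration converges quickly; real PageRank also adds a damping factor, omitted here for brevity.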
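The two Hessian tests above can be sketched in a few lines (the functions f below are examples I picked, not from any particular model):

```python
import numpy as np

def classify(hessian):
    """Classify a critical point from the eigenvalue signs of the Hessian."""
    eig = np.linalg.eigvalsh(hessian)   # eigvalsh: for symmetric matrices
    if np.all(eig > 0):
        return "local minimum"
    if np.all(eig < 0):
        return "local maximum"
    return "saddle point"               # mixed signs

# Hessian of f(x, y) = x^2 - y^2 at the origin: eigenvalues 2 and -2
H_saddle = np.array([[2.0, 0.0],
                     [0.0, -2.0]])
print(classify(H_saddle))               # saddle point

# Hessian of f(x, y) = x^2 + 10*y^2: a minimum, but the condition number
# (largest eigenvalue / smallest eigenvalue) is 10, so gradient descent
# zig-zags; a ratio near 1 means the problem is well conditioned.
H_min = np.array([[2.0, 0.0],
                  [0.0, 20.0]])
eig = np.linalg.eigvalsh(H_min)
print(classify(H_min), "condition number:", eig.max() / eig.min())
```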
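As a taste of the contest side, here is a sketch of fast matrix exponentiation: raising the 2x2 matrix [[1, 1], [1, 0]] to the n-th power by repeated squaring yields Fibonacci numbers in O(log n) multiplications (computed modulo a prime, as is typical in contest problems):

```python
MOD = 10**9 + 7  # a common contest modulus

def mat_mul(a, b):
    """Multiply two 2x2 matrices modulo MOD."""
    return [[(a[0][0]*b[0][0] + a[0][1]*b[1][0]) % MOD,
             (a[0][0]*b[0][1] + a[0][1]*b[1][1]) % MOD],
            [(a[1][0]*b[0][0] + a[1][1]*b[1][0]) % MOD,
             (a[1][0]*b[0][1] + a[1][1]*b[1][1]) % MOD]]

def mat_pow(m, n):
    """Raise a 2x2 matrix to the n-th power by repeated squaring."""
    result = [[1, 0], [0, 1]]           # identity matrix
    while n:
        if n & 1:
            result = mat_mul(result, m)
        m = mat_mul(m, m)
        n >>= 1
    return result

def fib(n):
    """n-th Fibonacci number mod MOD, using [[1,1],[1,0]]^n."""
    return mat_pow([[1, 1], [1, 0]], n)[0][1]

print(fib(10))   # 55
```

The same squaring trick works for any linear recurrence once you write it as a matrix, which is why it keeps showing up in contest problems with huge n.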
Have fun!