Differences for blog/deep-learning-alien-labs between revisions #7 and #8

There are no differences between the titles.

Differences in the content:

I've been thinking about deep learning ever since I saw Andrew Ng's 'talk at Google':https://www.youtube.com/watch?v=n1ViNeWhC24 back in 2013, but somehow never made much progress other than talking to people.
I've been thinking about deep learning ever since I saw Andrew Ng's 'talk at Google':https://www.youtube.com/watch?v=n1ViNeWhC24 back in 2012, but somehow never made much progress other than talking to people.
Deep learning has recently (~2010-2012) become very effective across a large set of problems where progress used to be very slow. It would be a shame not to spend some time on it.
* Baidu is working on speech recognition for Mandarin (lots of illiterate people with phones).
* Word embeddings give you natural language models that do away with the hand-crafted tweaks previously used to encode all sorts of idiosyncrasies of the English language (a small sketch follows this list).
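A tiny illustration of the word-embedding idea, not code from the original post: words become dense vectors, and related words end up close together. The 3-d vectors below are made up for this sketch; real embeddings such as word2vec or GloVe have hundreds of dimensions learned from text.

<pre><code class="python">
import numpy as np

# Made-up 3-d embeddings, purely for illustration.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # low: unrelated words
</code></pre>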
OpenAI CTO: "As Ilya likes to say, deep learning is a *shallow field* — it's actually relatively easy to pick up and start making contributions."
*Shallow field* OpenAI CTO: "As Ilya likes to say, deep learning is a shallow field — it's actually relatively easy to pick up and start making contributions."
* short learning curve, deep learning techniques started to be effective recently (2010? 2012?)
* deep learning techniques started to be effective recently (2010? 2012?)
* previous experience with the old techniques is not very relevant
* a home setup is enough to start (a Google datacenter of GPUs would help)
* state-of-the-art results significantly improve on the previous state of the art
* *Rectified Linear Units* f(x) = max(0, x) work much better than the historically used sigmoid and hyperbolic tangent functions (see the first sketch after this list)
* *Dropout* - some neurons are ignored with a set probability, inspired by how the neurons in the brain fire with some probability (see the second sketch after this list)
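To make the ReLU point concrete, here is a minimal numpy sketch (not from the original post) comparing the three activations and their gradients; the sample inputs are arbitrary:

<pre><code class="python">
import numpy as np

x = np.array([-4.0, -0.5, 0.0, 0.5, 4.0])    # arbitrary pre-activations

relu    = np.maximum(0.0, x)                  # f(x) = max(0, x)
sigmoid = 1.0 / (1.0 + np.exp(-x))
tanh    = np.tanh(x)

# Gradients: ReLU passes gradients through unchanged for positive inputs,
# while sigmoid/tanh saturate, so their gradients shrink toward 0 for large |x|.
relu_grad    = (x > 0).astype(float)          # either 0 or 1
sigmoid_grad = sigmoid * (1.0 - sigmoid)      # at most 0.25
tanh_grad    = 1.0 - tanh ** 2

print(relu_grad)     # [0. 0. 0. 1. 1.]
print(sigmoid_grad)  # tiny at x = -4 and x = 4
</code></pre>

The lack of saturation (no vanishing gradients for positive inputs) is the usual explanation for why ReLUs make deeper networks easier to train.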
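And a minimal sketch of dropout (the "inverted dropout" variant; again, not code from the original post). The keep_prob parameter is an assumed hyperparameter, commonly somewhere around 0.5-0.8:

<pre><code class="python">
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, keep_prob=0.8, training=True):
    """Zero out each unit with probability 1 - keep_prob during training."""
    if not training:
        return activations                     # use every neuron at test time
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob      # rescale so expectations match

h = rng.standard_normal((2, 5))                # a fake batch of layer activations
print(dropout(h))                              # some entries zeroed out
print(dropout(h, training=False))              # unchanged at test time
</code></pre>

Randomly silencing neurons forces the network not to rely on any single unit, which acts as a regularizer.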
*Some deep learning news*:
 
* 2016 "ACRush, 3 times Google Code Jam winner, joins Baidu to work on autonomous cars":https://twitter.com/AndrewYNg/status/723640197875830784
So not only PhDs in the field but coding contest guys as well :).
* "Introducing OpenAi":https://openai.com/blog/introducing-openai/
"these funders have committed $1 billion, although we expect to only spend a tiny fraction of this in the next few years."
* 2014 "Baidu started a deep learning lab, Andrew Ng and other Stanford people went there":http://deeplearning.net/2014/05/17/andrew-ng-is-hired-by-baidu/
* 2014 'Oxford natural language processing research group bought by Google':http://techcrunch.com/2014/10/23/googles-deepmind-acqui-hires-two-ai-teams-in-the-uk-partners-with-oxford/
* 2014 "Google acquired DeepMind for more than 500 million dollars.":http://deeplearning.net/2014/01/27/google-acquires-deep-mind/ That's how much it costs to beat Go
* 2013 'Facebook starts a deep learning lab':https://gigaom.com/2013/12/09/facebook-hires-nyu-deep-learning-expert-to-run-its-new-ai-lab/
* 2013 'Wired interview with Geoff Hinton, one of the neural net pioneers':http://www.wired.com/2013/03/google_hinton/
* 2012 'Google cat detector on 16000 machines':http://www.wired.com/2012/06/google-x-neural-network/
Mircea Pasoi and Cristian Strat, after their successful stint at Twitter, recently founded Alien Labs. They use deep learning to build intelligent chat bots for an office environment. This is an awesome opportunity to work together again; it's been 8 years since we last did. At Google, we worked on an ads inventory management problem using 'network flows':http://goo.gl/5SDRrM. Our claim to fame is that we got help from Cliff Stein, the S in 'CLRS':https://mitpress.mit.edu/books/introduction-algorithms :). This is also an opportunity for me to jump on the deep learning train, tackling real-world problems.

There are no differences in the security settings.

The forum topic was not changed.