
Deep learning is a large-scale application of Restricted Boltzmann Machines, of which Hinton (among others) was a pioneer. But that was in the 80s, not in the 2000s.

http://en.wikipedia.org/wiki/Restricted_Boltzmann_machine
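For anyone unfamiliar with RBMs, here is a minimal sketch (assuming NumPy) of a binary RBM's energy function and one Gibbs sampling step over visible and hidden units; the names W, b_vis, b_hid are just illustrative, not from any particular library:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Energy of a joint configuration: E(v, h) = -v.b_vis - h.b_hid - v.W.h
    def energy(v, h, W, b_vis, b_hid):
        return -(v @ b_vis) - (h @ b_hid) - (v @ W @ h)

    # One Gibbs step: sample hidden units given visible, then reconstruct visible.
    def gibbs_step(v, W, b_vis, b_hid, rng):
        p_h = sigmoid(v @ W + b_hid)            # P(h_j = 1 | v)
        h = (rng.random(p_h.shape) < p_h) * 1.0
        p_v = sigmoid(h @ W.T + b_vis)          # P(v_i = 1 | h)
        v_new = (rng.random(p_v.shape) < p_v) * 1.0
        return h, v_new

Training by contrastive divergence repeats this step and nudges W toward the data statistics; stacking several trained RBMs is what gives the "deep" belief net.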




I don't believe the term "deep learning" is restricted to RBMs – at least that's not the way I've seen the term used in the literature (e.g. deep convolutional neural networks, various deep autoencoders, etc.).


Convolutional networks were also developed in the 80s, as were backpropagation and autoencoders. The way I see it used, "deep" usually just means many layers – a difference in quantity, not quality (see the sketch below).

Point is, the science has been there since the 80s, and not much has changed.
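To illustrate that "deep" is mostly about stacking more of the same layers, a rough NumPy sketch of a forward pass through an arbitrary number of fully connected layers; the layer sizes here are made up for illustration:

    import numpy as np

    # "Deep" vs "shallow" is just the length of this list of (W, b) pairs.
    def forward(x, layers):
        for W, b in layers:
            x = np.tanh(x @ W + b)
        return x

    rng = np.random.default_rng(0)
    sizes = [784, 512, 256, 128, 64, 10]   # many layers -> a "deep" network
    layers = [(rng.normal(scale=0.01, size=(m, n)), np.zeros(n))
              for m, n in zip(sizes[:-1], sizes[1:])]
    output = forward(rng.random(784), layers)

The math is the same as in the 80s; what changed is that training a long list like this became computationally feasible.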


Sure, but these types of deep architectures haven't really been practical until relatively recently.

Well, then we're in agreement about the meaning of the term: Deep Learning would be Machine Learning using any of these deep architectures, be they Restricted Boltzmann Machines or otherwise.


I sometimes wonder if in the 2030s, people will be complaining about how all the interesting stuff was really invented back in the 2010s.

But yes, the available computing power has been a huge limitation for much AI research.





