Deep learning is a large-scale application of Restricted Boltzmann Machines, of which Hinton (among others) was a pioneer. But that work dates to the 80s, not the 2000s.
I don't believe the term "deep learning" is restricted to RBMs only – at least that's not the way I've seen the term used in literature (e.g. Deep Convolutional Neural Networks, various deep Autoencoders, etc.).
Convolutional networks were also developed in the 80s, as were backpropagation-trained architectures such as autoencoders. The way I see it used, "deep" usually means many layers, indicating a difference in quantity, not in quality.
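To make the "many layers" reading concrete, here is a minimal sketch (in plain NumPy, with illustrative names and sizes that are my own, not from this discussion): a feedforward network is just repeated affine maps plus nonlinearities, and "depth" is simply how many of them you stack.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Elementwise nonlinearity between layers.
    return np.maximum(0.0, x)

def deep_forward(x, layers):
    """Apply each (W, b) layer in turn; depth is just len(layers)."""
    for W, b in layers:
        x = relu(x @ W + b)
    return x

# A "shallow" net and a "deep" net differ only in the number of
# stacked layers, not in the kind of computation performed.
sizes = [4, 16, 16, 16, 16, 16, 16, 16, 3]  # 8 layers of weights
layers = [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

out = deep_forward(rng.standard_normal((2, 4)), layers)
print(out.shape)  # (2, 3)
```

The point the sketch illustrates: nothing qualitatively new appears as you add layers; the loop body is identical at depth 1 and depth 8.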
The point is, the science has been there since the 80s, and not much has changed.
Sure, but these types of deep architectures haven't really been practical until relatively recently.
Well, then we're in agreement about the meaning of the term.
Deep Learning, then, would be Machine Learning using any of these deep architectures – be they Restricted Boltzmann Machines, or otherwise.
http://en.wikipedia.org/wiki/Restricted_Boltzmann_machine