They took the images from the brain, but they might not have gotten them from the mind - the signal may have come from circuitry associated with the eyes, rather than the circuitry associated with thinking.
I would like to see if they could read images if a person just thought about them, rather than actually saw them.
Exactly. The visual system produces a very stable activation pattern, relative to other senses and multidimensional objects. The noise inherent in those other systems is probably the best protection of our inner privacy. There simply isn't a code to be read because there's little systematicity from one moment, or person, to the next. The visual system is as good as it gets and even then only when you're looking at a prescribed object.
In other words, ignore the PR about downloading thoughts and dreams. We can decode some stuff from fMRI activity patterns but we're soon swamped by noise, even when using machine learning techniques. The data just isn't there. It's like trying to predict the weather two weeks from now. The system is inherently noisy.
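To make the "we can decode some stuff, then we're swamped by noise" point concrete, here's a minimal sketch of the kind of multi-voxel pattern analysis these papers use, run on synthetic data (the category templates, voxel counts, and noise levels are all made up for illustration, not taken from any real study). A simple correlation classifier does well when trial-to-trial noise is low and slides toward chance as noise grows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example (not real fMRI data): two stimulus categories,
# each with a fixed "true" voxel pattern; every trial is that
# pattern plus Gaussian measurement noise.
n_voxels, n_trials = 50, 40
templates = rng.normal(size=(2, n_voxels))   # one pattern per category
labels = rng.integers(0, 2, size=n_trials)   # true category of each trial

def decode(trials, templates):
    """Correlation classifier: pick the template each trial matches best."""
    return np.array([
        int(np.argmax([np.corrcoef(trial, tpl)[0, 1] for tpl in templates]))
        for trial in trials
    ])

accs = {}
for noise_sd in (1.0, 10.0):
    trials = templates[labels] + rng.normal(scale=noise_sd,
                                            size=(n_trials, n_voxels))
    accs[noise_sd] = float((decode(trials, templates) == labels).mean())
    print(f"noise_sd={noise_sd}: decoding accuracy {accs[noise_sd]:.2f}")
```

At low noise the classifier is near ceiling; crank the noise up and the same classifier on the same amount of data drifts back toward the 50% chance level. That's roughly the situation once you move past the visual system's unusually stable activation patterns.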
For instance, here's a seminal paper predicting which category (face, place, or object) was about to be recalled from studied pictures, a second or two before it was actually recalled:
They are using voxels from visual areas V1, V2, and V4, which are early stages in the visual regions of the cortex. What you call "thinking" would correlate more with IT (inferotemporal cortex) and the PFC (prefrontal cortex).
Also, they are using fMRI activations, which means they don't record from neurons directly, but look at cerebral blood flow (which is linked to neuronal activation; the more the neuron does, the more oxygen it needs).
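That indirectness can be sketched numerically: the BOLD signal fMRI measures is roughly the train of neural events convolved with a slow hemodynamic response. Here's a toy illustration using a double-gamma HRF shape; the parameter values are the common SPM-style defaults, which is my assumption and not something from the article:

```python
import numpy as np
from math import gamma

def hrf(t, a1=6.0, a2=16.0, ratio=6.0):
    """Canonical double-gamma hemodynamic response (SPM-style defaults)."""
    g = lambda x, a: x ** (a - 1) * np.exp(-x) / gamma(a)
    return g(t, a1) - g(t, a2) / ratio

dt = 0.1
t = np.arange(0.0, 30.0, dt)

# A single brief burst of neural activity at t = 2 s...
neural = np.zeros_like(t)
neural[int(2.0 / dt)] = 1.0

# ...shows up in the measured BOLD signal only seconds later,
# smeared out by the slow vascular response.
bold = np.convolve(neural, hrf(t))[: len(t)] * dt
peak_time = float(t[np.argmax(bold)])
print(f"neural event at 2.0 s -> BOLD peak near {peak_time:.1f} s")
```

The upshot is that fMRI sees a delayed, low-pass-filtered proxy for neural firing, which is one more reason the "thought reading" headlines outrun the data.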
I would like to see if they could read images if a person just thought about them, rather than actually saw them.
I imagine it would take a bit of training to be able to visualize images in ways that a computer can read them - I imagine children would be best at learning thought-commands quickly. Just as my grandparents could never quite learn how to touch-type, I imagine my grandchildren will kick my ass at thinking into their computers ;)
It would have been lame if they had gotten the information from the neurons in the retina, but the article says they took the readings from the cerebral visual cortex. I'm pretty sure that it's used both for processing vision and thinking about images.
Not for thinking about them. That far back, the activity is a reconstruction of the data compression that happens in the retina. Attention can shift activation there slightly, but there's little abstraction from the retinal image.