>Seems that DeepMind’s original thesis about “solving intelligence and then using it to solve everything else” fundamentally misunderstands what intelligence is. Intelligence is the ability to solve specific problems; it therefore necessarily exists in the context of all in which it lives and what came before it, and the goal of “solving intelligence” is meaningless.
The definition seems off. Many problems only seem specific in hindsight, and a human solving problems on command, with full awareness of the problem space, is more a measure of obedience than of intelligence. Napoleon famously didn't want intelligent generals, he wanted lucky ones.
DeepMind et al. probably refined the motto internally to "solving reasoning" from a great number of perspectives, "then solving everything else".