Isn't this a form of intuitive intelligence that doesn't rely on "reasoning"? To me, "reasoning" sounds like intentionally trying to solve some explicit problem, while another form of insight is the ability to figure out that something is a problem in the first place.
That's, by the way, something LLMs are very much not good at. They possess a superhuman amount of knowledge covering all areas of academia, including math, science, philosophy, engineering, computer science, the social sciences, and so on, but that doesn't lead them to come up with novel hypotheses and theories, something that would be easy for a smart human with even a fraction of an LLM's academic knowledge.