It certainly feels like a GC pause, but it could be a lot of things. Regardless of the cause, I feel like the fact that people don't think about how memory is managed points at the real problem: people are programming computers without really understanding how they work. And while that is fine, even encouraged, for a lot of use cases, I think anyone calling themselves a professional programmer should be held to a higher standard.
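Checking whether it really is the collector doesn't even require going very far down the stack. As a minimal sketch in Go (the allocation loop, sizes, and threshold are invented purely to generate garbage, not taken from any real program), you can read the pause history the runtime already records:

    package main

    import (
        "fmt"
        "runtime"
        "time"
    )

    func main() {
        // Churn through allocations so the collector has to run a few times.
        var keep [][]byte
        for i := 0; i < 1_000_000; i++ {
            keep = append(keep, make([]byte, 1024))
            if len(keep) > 10_000 {
                keep = nil // drop references so earlier allocations become garbage
            }
        }

        // The runtime tracks every stop-the-world pause; MemStats exposes them.
        var m runtime.MemStats
        runtime.ReadMemStats(&m)
        fmt.Printf("GC cycles:   %d\n", m.NumGC)
        fmt.Printf("total pause: %v\n", time.Duration(m.PauseTotalNs))
        // PauseNs is a circular buffer; the most recent pause is at (NumGC+255)%256.
        fmt.Printf("last pause:  %v\n", time.Duration(m.PauseNs[(m.NumGC+255)%256]))
    }

Running the real program with GODEBUG=gctrace=1 set prints similar per-cycle numbers without touching the code at all. My point is that this kind of first-level check is exactly the sort of thing I'd expect from that higher standard.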
So how many levels or layers down or up should we expect someone to go? Say we take Go / Java as a starting point: L-1 would be C / C++, L-2 would be Assembly / SIMD / AVX processor instructions, L-3 would be chip design / architecture, L-4 would be diode chemistry and fabrication, and L-5 would be quantum mechanics?
And on the other side, L+1 would be databases, L+2 system architecture / systems design, L+3 user experience and communication, L+4 organisation and team design, and L+5 would be LISP?
How many levels would someone have to straddle to meet the higher standard? And does it matter at which level their midpoint sits?
I too despair when I remember the days of software responding instantly to keypresses. But I fear that no matter how much today's developers knew about computer architecture, software would still be slow, because latency is like a gas: it expands to fill whatever latency tolerance a program's audience has. Compressing it back down is expensive, ergo it's a race to the bottom in which business leadership would never let those developers actually take the time to make software more responsive.