"Next fifteen years of Moore's Law?" Intel's recent abandonment of its "tick-tock" cadence of alternating process shrinks and new microarchitectures suggests that however performance improves over the next 15 years, extrapolating the last 15 years of Moore's Law forward is a bad idea. For anything crypto-related, I'd also think about how far quantum computing may have advanced by the 2030s.