What about latency, though? Fortnite may be possible, but Quake or StarCraft? Sure, on the lowest of the lowest tiers. People used to say 24 fps was great, too. It only took the masses 20 years to catch up.
Anything below 120 feels sluggish to me. At 60 fps locally, I move my mouse and the picture changes after what feels like an eternity. Can't imagine cloud-rendered 60 being better. In fact, it's guaranteed to be worse.
There is already inherent latency in networked games. Cloud gaming could somewhat compensate by having the servers running the game clients close to the game hosts.
Fast-paced networked games typically solve that by running a local simulation ahead of the server. The button you clicked looks depressed the instant you click it, not once the server knows about it. In FPS-style games your character typically starts walking forward the instant you press the forward key, and you shoot the instant you click, not when the server finds out about it.
This has weird effects. Each player is actually playing in a slightly different world. You might see yourself hitting something and they might see themselves blocking the shot, and only one of you can be right. The different worlds will retroactively correct themselves to be consistent in some form or another (depending on the game it might be that the person shooting is always correct, or it might be that the person blocking is always correct, or it might be that whoever's packets reached the server first is correct, or really some complex combination of all of the above). The weird effects are worthwhile because people are really sensitive to latency in response to their inputs.
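The prediction-and-correction loop described above can be sketched in a few lines. This is a minimal illustration, not how any particular engine does it; all the names (`PredictedClient`, `press_forward`, etc.) are made up for the example:

```python
# Sketch of client-side prediction with server reconciliation.
# The client applies inputs locally right away, remembers which ones
# the server hasn't acknowledged yet, and replays those on top of each
# authoritative snapshot it receives.

class PredictedClient:
    def __init__(self):
        self.position = 0
        self.pending = []     # inputs sent but not yet acknowledged by the server
        self.next_seq = 0

    def press_forward(self):
        """Apply the input locally *immediately*, then queue it for the server."""
        seq = self.next_seq
        self.next_seq += 1
        self.position += 1                # predicted move, shown on screen now
        self.pending.append((seq, +1))
        return seq                        # in a real game this is sent to the server

    def on_server_state(self, acked_seq, server_position):
        """Authoritative state arrives: snap to it, then replay unacked inputs."""
        self.pending = [(s, d) for (s, d) in self.pending if s > acked_seq]
        self.position = server_position   # the server's world wins...
        for _, delta in self.pending:     # ...then re-apply what it hasn't seen yet
            self.position += delta


client = PredictedClient()
client.press_forward()   # position becomes 1 instantly, no round trip
client.press_forward()   # position becomes 2
# The server has only processed input 0 so far, and agrees the move was legal:
client.on_server_state(acked_seq=0, server_position=1)
print(client.position)   # → 2: snapped to the server, then replayed the unacked input
```

If the server had disagreed (say, the move was blocked), `server_position` would differ and the replay would produce the retroactive "correction" the parent comment describes, which players perceive as rubber-banding.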
Even in slow-paced games that use simpler networking models, I'm pretty sure the UI is basically always local. For example, you might press the button that says "do the thing" and see the button shift into its "pressed" state, but the server decides that the thing doer is dead before that button press reaches the server, so it ignores that button press.
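That "local UI, authoritative server" split can be sketched like this. A hedged toy example, with invented names (`do_the_thing`, `thing_doer_alive`) standing in for whatever the game actually calls things:

```python
# Sketch of an optimistic UI: the button's visual state flips locally
# and instantly, while the server independently decides whether the
# action is still valid by the time the input arrives.

class Button:
    def __init__(self):
        self.pressed = False      # visual state, updated with zero network latency

    def click(self):
        self.pressed = True       # style flips before any packet is sent
        return {"action": "do_the_thing"}


class Server:
    def __init__(self):
        self.thing_doer_alive = True

    def handle(self, msg):
        # The server is authoritative: if the actor died while the click
        # was in flight, the input is simply ignored.
        if msg["action"] == "do_the_thing" and self.thing_doer_alive:
            return "thing done"
        return "ignored"


button, server = Button(), Server()
msg = button.click()             # UI reacts immediately
server.thing_doer_alive = False  # ...but the thing doer dies in transit
result = server.handle(msg)
print(button.pressed, result)    # → True ignored
```

The button looked responsive the whole time; the player only finds out the press did nothing when the server's response comes back.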
I am talking about input latency. Cloud solutions cannot compensate for it unless they start rendering all possible frames all the time, which makes zero sense, never mind being impractical to borderline impossible right now.
Read gpm's comment carefully, or, I don't know, start playing games? It really helps.
The display still has to be rendered on the player’s screen for them to react to it. Cloud gaming only increases the volume of data coming down to the client so it seems logical any latency issues would be amplified, even if using a top video codec.
Moving the latency to the player's client just makes everything feel terrible. It's like playing in mud, because your actions take 50–100 ms to show up on your screen.