I think the technology is about 80% there. To my eye, the graphics and performance are the same as running the game locally.
Provided you have a good internet connection, if you could just run an AAA game off of your crummy computer or TV, why would you buy an expensive graphics card? It is definitely where things are going.
You can't just have a "good" internet connection. For some games, the difference between 30ms and 100ms of latency is the difference between a great experience and something almost unplayable. That's not an exaggeration, and the millions of gamers who actually measure their ping will tell you the same.
The latency between peripheral input and visual feedback is much more important than the latency between a game client and server. The servers have all sorts of mitigation strategies to compensate. For inputs, it’s all on the human. Which feels bad.
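To make "mitigation strategies" concrete, here's a minimal sketch of client-side prediction with server reconciliation, one of the usual tricks in the classic client/server setup. All of the names (PlayerState, apply_input, PredictingClient) are made up for illustration, not any particular engine's API:

    # Minimal sketch of client-side prediction + server reconciliation.
    # Hypothetical names; real engines do this per entity, with
    # interpolation and smoothing layered on top.
    from dataclasses import dataclass

    @dataclass
    class PlayerState:
        x: float = 0.0

    def apply_input(state: PlayerState, move: float) -> PlayerState:
        # Deterministic simulation step shared by client and server.
        return PlayerState(x=state.x + move)

    class PredictingClient:
        def __init__(self):
            self.state = PlayerState()
            self.pending = []      # (sequence, input) not yet acknowledged
            self.next_seq = 0

        def on_local_input(self, move: float):
            # Apply immediately so the player gets instant feedback,
            # and remember the input so it can be replayed later.
            self.state = apply_input(self.state, move)
            self.pending.append((self.next_seq, move))
            self.next_seq += 1
            # ...send (seq, move) to the server here...

        def on_server_state(self, acked_seq: int, server_state: PlayerState):
            # The server is authoritative but ~RTT/2 in the past: adopt its
            # state, drop acknowledged inputs, replay the rest on top.
            self.state = server_state
            self.pending = [(s, m) for (s, m) in self.pending if s > acked_seq]
            for _, move in self.pending:
                self.state = apply_input(self.state, move)

Note that none of this hides the gap between pressing a button and seeing the result: prediction only works because the input is applied locally first.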
> The servers have all sorts of mitigation strategies to compensate.
I haven't seen any papers where cloud gaming servers implement rollback logic on your inputs on behalf of the game developer.
Even for games that do support native rollback, most rollback netcode implicitly assumes you're local to your game client, not sitting behind a relay box 7 frames or more in the past.
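For anyone unfamiliar, a rollback loop looks roughly like the sketch below (names and structure are simplified and hypothetical, not GGPO or any specific library). Note the baked-in assumption: the local input is sampled and applied on the same frame, and only remote inputs ever trigger a rewind. Put a streaming relay in front of the game client and even the "local" input shows up several frames late, which is exactly the case this scheme doesn't cover:

    # Simplified rollback loop. history[f] is the game state at frame f,
    # simulate(state, inputs) is a deterministic step, predict(f) guesses a
    # missing remote input. All names are hypothetical.

    def advance_frame(history, frame, local_input, remote_inputs, predict, simulate):
        # Local input is applied with zero delay; the remote input is
        # predicted if it hasn't arrived yet.
        inputs = {"local": local_input,
                  "remote": remote_inputs.get(frame, predict(frame))}
        history[frame + 1] = simulate(history[frame], inputs)

    def on_late_remote_input(history, current_frame, input_frame, actual_input,
                             local_inputs, remote_inputs, predict, simulate):
        # A remote input for an old frame just arrived: rewind to that frame
        # and resimulate forward with the corrected input.
        remote_inputs[input_frame] = actual_input
        for f in range(input_frame, current_frame):
            inputs = {"local": local_inputs[f],
                      "remote": remote_inputs.get(f, predict(f))}
            history[f + 1] = simulate(history[f], inputs)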
I was talking about clients and servers in the classic setup, not about cloud gaming. My parent comment was talking about game ping, and I was pointing out that cloud services are even more sensitive, since you're not interacting with the game client locally.
The latency between a peripheral input and the visual feedback includes that latency between the client and server in a streaming scenario. The TV isn't smart enough to implement rollback netcode for every game.
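A rough input-to-photon budget shows the difference. The numbers below are illustrative assumptions, not measurements, but the structure holds: streaming pays everything local play pays, plus a network round trip and a video encode/decode:

    # Back-of-the-envelope input-to-photon latency (illustrative numbers only).
    local_ms = {
        "input polling + OS": 5,
        "simulate + render at 60 fps": 17,
        "display": 10,
    }
    streaming_extra_ms = {
        "client -> datacenter": 20,   # half of an assumed 40 ms RTT
        "video encode": 5,
        "datacenter -> client": 20,
        "video decode": 5,
    }
    print("local    :", sum(local_ms.values()), "ms")        # ~32 ms
    print("streamed :", sum(local_ms.values())
                        + sum(streaming_extra_ms.values()), "ms")  # ~82 ms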
Rocket League is an extremely twitchy game, especially in 1s. Several of the best 1s players in the world were playing from KSA for years on well over 100 ping. They were winning tournaments without a single loss against players on single-digit ping.
I think there's still a market for what you're describing, but in terms of players and revenue, the most popular and profitable titles seem to be competitive multiplayer games that don't require particularly beefy hardware and where network latency matters a lot.
For example, looking at September's games by MAUs[1], #6, #10, and #17 seem like good fits for cloud gaming, but the rest seem like a bad fit: fairly benign recommended hardware combined with latency sensitivity. Developers seem aware of this and willing to target fairly modest hardware specs for these titles as well. For example, the new CoD's recommended GPU is a GTX 1070 and the previous game's was a GTX 1060, which lines up with the Steam hardware survey.[2]