I'm already in the beta-test program.
So basically, I don't think many read HOW they plan to achieve this.
Yes, it is possible. No, no one should want this.
For those who only read the title: they plan to achieve this by guessing what you'll do next and just doing it for you, ignoring what you actually do.
Next you're gonna tell me Stadia runs games in a state of quantum superposition and leverages the wave-function collapse caused by observation to ensure the next frame has already been simulated and rendered by the time the player presses a button.
F U C K
O F F F
Though that does get me thinking- it wouldn't be impossible to invent a system whereby a game's current state could be forked into several different potential 'futures' (based on the possible set of next player inputs) that are run as separate simulations and then collapsed down into one 'correct' one when the actual player input arrives, kind of like rollback netcode. It'd be pretty damn CPU-intensive though, particularly for games that have a lot of potential 'next player inputs' on any given frame.
And that still wouldn't validate the claim of 'better than local no matter the hardware', because you'd still be wasting part of the potential gains on streaming lag. Even if zero-latency computing and networking was a thing, their tech would only ever be able to match local hardware doing the same stuff, not exceed it.
Now, if they were to invent time travel...
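The forking idea above can be sketched in a few lines. This is purely a toy illustration of speculative simulation with a rollback-style collapse; all the names here (`step`, `fork_futures`, `collapse`) are made up for the example, and a real game state would be vastly richer than an integer:

```python
# Toy sketch: speculatively simulate one future per possible next input,
# then keep the branch matching the input that actually arrives.

def step(state: int, inp: str) -> int:
    """Deterministic toy game update: move a 1-D position."""
    return state + {"left": -1, "right": 1, "none": 0}[inp]

def fork_futures(state: int, possible_inputs: list[str]) -> dict[str, int]:
    """Fork the current state into one simulated future per candidate input."""
    return {inp: step(state, inp) for inp in possible_inputs}

def collapse(futures: dict[str, int], actual: str, state: int) -> int:
    """Discard mispredicted branches; simulate late if the input wasn't speculated."""
    return futures.get(actual, step(state, actual))

state = 0
futures = fork_futures(state, ["left", "right", "none"])
state = collapse(futures, "right", state)  # the player actually pressed "right"
print(state)  # 1
```

Note that the cost grows linearly with the number of possible inputs per frame, which is exactly the CPU-intensity worry above, and combinatorially if you speculate more than one frame ahead.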
“Ultimately, we think in a year or two we’ll have games that are running faster and feel more responsive in the cloud than they do locally,” Bakar says to Edge, “regardless of how powerful the local machine is.”
Is there a reason a sufficiently powerful AI computer couldn't predict exactly what you will do before you do it? I mean, it would turn gaming into an exercise in existential dread, but it would be super 'responsive'.

Hmm, I guess that depends on how much information such an AI would have available to it. If it had to go purely off your gamepad inputs, then I think there would be a sufficient margin of error to render it imperfect, like a sudden muscle spasm or an IRL projectile causing you to mis-input or drop the controller.
One of the quotes from the Google founder and CIA operative was: 'We want you to be able to type in "what college should I go to" and give you the correct answer.' Google is creepy AF.
Data caps, data caps, data caps. None of this is viable until ISP data caps are universally eliminated.

Sorry, but this is def a non-issue in many countries.
Netflix and Spotify are still a thing.

Except the USA, where internet infrastructure is way behind the rest of the world, and data caps exist with almost every ISP. And laws intended to help prevent ISPs from doing whatever they want have been rolled back, meaning it's not going to get any better any time soon. But I guess if Stadia's success is somehow only tied to Europe, then you have a point...?
So how is the machine learning going to know that all of a sudden I want to go off the beaten path?

ITT: a bunch of people thinking they're more than a uniquely specialised slow computer. It's not like they need to emulate the whole human brain, just button presses in a game where all calculations and physics are known natively. It's a piece of piss for machine learning if it's always on and tracking. Your reaction time, your decision bias, your playtime fatigue, your performance at different times of the day, multipliers on how high you are from the first couple of combos. Even patience, and the consequences on your playstyle while agitated; maybe even automatic difficulty adjustment to keep you playing instead of quitting. It doesn't need to be perfect for us not to notice instances of auto-play. You're underestimating machine learning and overestimating these lower-tier species that are us.

Bring on the future, I say.
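For what it's worth, the crudest version of that kind of always-on tracking is just counting which button tends to follow which. This is a hypothetical toy, a first-order Markov-style predictor over a made-up button history, nothing like whatever a production model would be:

```python
from collections import Counter, defaultdict

# Hypothetical button history logged from one player's session.
presses = ["X", "X", "Y", "X", "X", "Y", "X", "X"]

# Count how often each button follows each other button.
follows = defaultdict(Counter)
for prev, nxt in zip(presses, presses[1:]):
    follows[prev][nxt] += 1

def predict(last: str) -> str:
    """Guess the button most often seen after `last` in the history."""
    return follows[last].most_common(1)[0][0]

print(predict("X"))  # in this history, "X" is most often followed by another "X"
```

A real system would fold in the context the comment above describes (fatigue, time of day, game state), but the principle of predicting the next press from logged behaviour is the same.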
Give it a few years, when VLEO sat services give traditional telecoms fair competition.
Give it time.
Specifically, Bakar notes that Google's "negative latency" will act as a workaround for any potential lag between player and server. The term describes a buffer of predicted latency, inherent to a Stadia player's setup or connection, in which the Stadia system will run lag mitigation. This can include rapidly increasing fps to reduce latency between player input and display, or even predictive button presses.
Yes, you heard that correctly. Stadia might start predicting what action, button, or movement you’re likely to do next and do it for you – which sounds rather frightening.
So how is the machine learning going to know that all of a sudden I want to go off the beaten path?

I don't think auto-play would work for extended periods of time, only intervals of a fraction of a second up to a second, in the midst of a ping spike during active play, i.e. active button mashing in combat, which is mostly unconscious, learned reaction patterns. Off-the-cuff reactions couldn't be predicted, because that would require simulating your entire brain, which is not possible at this point in time, if it ever will be.
Obviously this is PR bullshit through and through.
BUT
The way Stadia's servers are designed to scale up means that developers won't get locked into the same "generational" development cycles that we've seen in the past with console gaming. Whether or not developers choose to utilize the extra, continuously improving resources they're given is another story. Google definitely has an uphill battle ahead if it thinks developers will optimize their games for Stadia rather than just maintaining a separate build branch with parity to traditional console releases; most developers and publishers don't even bother putting additional effort into making their games run better on PC these days.
If in "one or two years" Google is moneyhatting developers to make Stadia exclusive games, then I'll give this statement a "maybe".
Devs will NEVER be able to utilize the continuously improving resources, because it takes too many years to make a game. And you'll still have to support it on consoles and PC first.

My read of his post and yours says you are both in agreement. I think his message was: if Google pays for all development costs, the devs will utilize the benefit; otherwise, forget it.