
Xbox One | Understanding Microsoft's Cloud claims | Tech panel and ArsTech article

oVerde

Banned
Lightmaps, from my understanding, are quite large files. Like 200 MB+ even at not-so-high resolutions. You'd have to have some crazy fast internet to move the kind of file sizes you're talking about in a matter of seconds. Not viable except for a tiny minority of people with uber connections.

Why would every player have to upload it? Makes no sense. The server is dedicated and has the assets; it just needs to send you the finished calculations.
 
Maybe they're planning for this to happen several years into the generation.

But I expect the most we see out of this early on will be stuff like billboards in Forza downloading new ads "from the cloud".

Amazon charges per hour for EC2 cloud usage, but Microsoft is going to give away 3TF of cloud compute power to every Xbone owner... to make some nicer lighting that 90% of their users will never notice? yeahokay.gif
 

shandy706

Member
cloud computing won't do a thing, people buying this are getting duped.

I bet you think the insane science/tech behind Kinect 2.0 is fake too huh?

Read up on how it works. The R&D behind the new Kinect is sci-fi like....and it's true. I'm guessing you've not looked into it?
 

jayu26

Member
Okay! I don't really understand any of this, but here are some of the questions that come to my mind:

Why would any third party developer do this (assuming this will add unnecessary cost to development)?
What happens when servers stop supporting older games (do they never look as good as intended in the future ever again)?

Also, does this not defeat the whole point of a mass market gaming console if one person has a better experience on that console than another due to proximity to servers and/or a better internet connection?
 

Klocker

Member
I'm getting kind of annoyed. I haven't found a mention of this on the MSDN sites, and this also didn't seem to be on the agenda of the "Windows Azure Technical Briefing for the Games Industry" in Hamburg on 29 April. I would love to see actual documentation for this but my google-fu is failing me.

http://research.microsoft.com/pubs/72894/NOSSDAV2007.pdf
Here is something; although not related to Xbox, it is research on MMOs and latency.

Also, this is about the Halo team testing with MS Research in January 2013.
 
I bet you think the insane science/tech behind Kinect 2.0 is fake too huh?

Read up on how it works. The R&D behind the new Kinect is sci-fi like....and it's true. I'm guessing you've not looked into it?

i don't give a fuck about Kinect, why should i look into it?
 

bobbytkc

ADD New Gen Gamer
I bet you think the insane science/tech behind Kinect 2.0 is fake too huh?

Read up on how it works. The R&D behind the new Kinect is sci-fi like....and it's true. I'm guessing you've not looked into it?

This is not a thread about Kinect, and nobody thought Kinect 2.0 was sci-fi. What are you talking about? It is an expected evolution of existing tech.
 
D

Deleted member 8095

Unconfirmed Member
common sense.

Oh, so you don't know anything about it. You're just spouting your opinion as fact for some reason. I see. Why don't we wait until we know more about it before we claim people are getting duped. Is that so hard?
 
Oh, so you don't know anything about it. You're just spouting your opinion as fact for some reason. I see. Why don't we wait until we know more about it before we claim people are getting duped. Is that so hard?

i don't need to get stabbed to know it hurts.

if people wanna believe cloud computing is gonna make their X1s 40x stronger, by all means, be my guests.
 

fallingdove

Member
Maybe if they didn't spend money on Kinect and extra non-gaming features, they could have better tech, thus negating the need for the cloud. The cloud just adds another layer of DRM and shit that could go wrong. If your game needs to offload data because the system can't handle it, you're doing it wrong.
It seems that Microsoft is going to end up having the same issue as the Wii U, though not as pronounced - they have to cut costs on CPU, GPU and RAM to keep the price down enough to include Kinect 2.0 with every unit sold.
 
How can lighting be pre-calculated? That makes no sense to me. How does it know which direction I'll be facing by the time the response is sent back from the "cloud"?

Like this
[image]
 

Karak

Member
Now, the real question is why Microsoft is pushing extremely hard on the notion that cloud computing is the future when faced with this information. Well, that's a rhetorical question though.

Same question I keep coming up with. Ever since the leaked PowerPoint they have been setting their sights on it, and that thing was dated years ago. Now the money... the volumes of money they are spending are crazy.
 
Can someone here make a gif of Crysis 3 or any game with PhysX on at full settings with the tag "Cloud on", transitioning to Xbox 360 LOD without PhysX with the tag "Cloud off"? It would be pretty damn funny and an indication of what we can expect if such a thing is forced.
 

oVerde

Banned
Can someone here make a gif of Crysis 3 or any game with PhysX on at full settings with the tag "Cloud on", transitioning to Xbox 360 LOD without PhysX with the tag "Cloud off"? It would be pretty damn funny and an indication of what we can expect if such a thing is forced.

oh lawd ahahaha, if I was good at giffing I would be doing it just for the lulz
and somehow explain it
.

Like this

Awesome :D
 

nib95

Banned
As I said earlier, my assumption, or at least hope, for this generation is for more and more to go dynamic, from lighting, to physics, to geometry, procedural damage, foliage, etc. If things do head in that direction, that's where cloud computing becomes even less viable and where added local hardware and its bandwidth become more advantageous.

It's next gen. Baked lighting should be a thing of the past.
 

i-Lo

Member
Same question I keep coming up with. Ever since the leaked PowerPoint they have been setting their sights on it, and that thing was dated years ago. Now the money... the volumes of money they are spending are crazy.

The final evolution of console gaming would inevitably be cloud (like OnLive) due to the artificial limits on the power-draw-to-performance ratio that a console has to conform to (unless we can come up with newer materials for fab). But given the current inconsistent and unreliable state of the internet, its proliferation and inherent latencies, we are far from it, and to me this is an underhanded approach to achieving their ulterior motive at this point in time.

Even during E3 they can pull the wool over our eyes, so the best way to test will be when actual people who don't work at MS have access.
 

Kunan

Member
I'm not sure it's worth the significant re-tooling of engines to pursue it. If it ends up being a One thing, and Sony doesn't push it (and devs have to set up their own servers for PC ports), then it would be lots of man-hours for a single platform. It sounds like a financial risk that would be easy to say no to.

This is before we even begin to talk about what exactly could/should be moved.

As I said earlier, my assumption, or at least hope, for this generation is for more and more to go dynamic, from lighting, to physics, to geometry, procedural damage, foliage, etc. If things do head in that direction, that's where cloud computing becomes even less viable and where added local hardware and its bandwidth become more advantageous.
This is on point. Dynamic rendering/damage/physics/etc. are where we're heading, finally dropping a lot of the faked effects from last gen. These are the very things cloud computing would sell short.
 

tfur

Member
Ars should kindly sift through these claims, instead of just regurgitating PR. Nothing real-time or near-real-time that depends on heavy computation will be done. You are not going to run fluid dynamics on the fly for a scene. These types of calculations are time-based iterative simulations, and they are not calculable or retrievable at a rate required for a live game.

You could, however, pre-run simulations and keep a database of results, but even then the retrieval time is suboptimal.
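
To make that pre-run idea concrete, here is a minimal sketch (all names and numbers are hypothetical): an expensive, time-stepped simulation is computed ahead of time for a coarse grid of parameters, and at runtime the console can only look up and replay the nearest match. A local lookup fits easily inside a ~16 ms frame budget at 60 fps; the same fetch over a 50-100 ms internet round trip would not, which is exactly the retrieval problem.

```python
import time

FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

def simulate_offline(wind_speed: float, debris_count: int) -> list:
    """Stand-in for an expensive, time-stepped simulation run ahead of time."""
    frames = []
    positions = [float(i) for i in range(debris_count)]
    for _ in range(120):  # two seconds of keyframes at 60 Hz
        positions = [p + wind_speed / 60.0 for p in positions]
        frames.append(list(positions))
    return frames

# The "database of results": pre-run offline for a coarse grid of parameters.
cache = {(w, n): simulate_offline(w, n) for w in (0.0, 5.0, 10.0) for n in (8, 16)}

def lookup(wind_speed: float, debris_count: int) -> list:
    """Fetch the nearest pre-run result instead of simulating live."""
    key = min(cache, key=lambda k: abs(k[0] - wind_speed) + abs(k[1] - debris_count))
    return cache[key]

start = time.perf_counter()
frames = lookup(wind_speed=7.3, debris_count=16)
elapsed_ms = (time.perf_counter() - start) * 1000.0
print(f"local lookup took {elapsed_ms:.3f} ms (frame budget {FRAME_BUDGET_MS:.1f} ms)")
# The same fetch over a real network round trip would blow the frame budget,
# which is why pre-run results suit loading screens, not live scenes.
```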

This whole PR campaign is extremely disingenuous.
 

140.85

Cognitive Dissonance, Distilled
MS has to demonstrate the value of this in a clear and impressive way.

Right now it sounds like the SimCity situation - i.e., an exaggerated claim used to leverage DRM.
 

kaching

"GAF's biggest wanker"
The current console generation, for starters. Longest hardware generation the market's seen, fully internet connected consoles from the start of it, ample cloud-computing server-farms available throughout. You'd think someone might want to have taken advantage of this much sooner...
 
Let's be realistic here: even if there is some legit use, the likely outcome is one of the following:

1) Nobody will use it, because why put in the extra effort and complexity for one platform when you're on a tight deadline with rising dev costs?

2) A third party will do it because it's not limited to the Xbox One, and since the PS4 can do it too the advantage is moot.

3) Only first parties will use it so only a handful of games at best will take advantage of it since it only works in specific scenarios.
 

nomis

Member
They're going to be spending an awful lot of time and money on BS R&D to make this offloaded fluid dynamics, etc. work just to have a cover story for being able to check your copyright licences every 24 hours.
 

Karak

Member
The final evolution of console gaming would inevitably be cloud (like OnLive) due to the artificial limits on the power-draw-to-performance ratio that a console has to conform to (unless we can come up with newer materials for fab). But given the current inconsistent and unreliable state of the internet, its proliferation and inherent latencies, we are far from it, and to me this is an underhanded approach to achieving their ulterior motive at this point in time.

Even during E3 they can pull the wool over our eyes, so the best way to test will be when actual people who don't work at MS have access.

OH I am sure we will hear from other devs about it soon.
 

kadotsu

Banned
http://research.microsoft.com/pubs/72894/NOSSDAV2007.pdf
Here is something; although not related to Xbox, it is research on MMOs and latency.

Also, this is about the Halo team testing with MS Research in January 2013.

The approach of the first paper is very interesting but it doesn't address the problem.

1. If this method were implemented, the only stuff that would be offloaded would be stateless, deterministic operations.
2. It's not a cloud computing solution but a grid computing application. The algorithm relies on having many clients, not one server, for performance, and on redundancy to ensure correctness. This redundancy also addresses the latency and thus can't be reproduced in a one-client, one-server structure.
 
Come on guys it's not too hard to come up with examples of stuff that doesn't need to happen right away. A basic version of this would be something like a chess game or RPG where the cloud could compute the AI opponent's turn in a matter of seconds where it might take minutes on the console. The data it sends back is nothing more than the next move, but the calculation required to figure it out wouldn't be practical on the console.

Or what about terrain generation where the game builds a random set of terrain while it's loading? On the console, this may take a lot longer than is practical, so it's dumbed down, but with cloud computing, it could be way more complex, even use data from other games currently in progress and send it back down to the console in a matter of seconds which is no big deal when you consider that loading into the level can take 20-30 seconds in a typical game anyway.
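
Both of those examples follow the same latency-tolerant pattern: fire the request off, keep the game running, and fall back to a cheaper local answer if the reply doesn't arrive in time. A minimal sketch of that pattern (the "cloud" call below is just a stand-in function that sleeps, not a real service):

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

# One background worker that talks to the (hypothetical) cloud service.
pool = ThreadPoolExecutor(max_workers=1)

def cloud_ai_move(board_state: str) -> str:
    """Stand-in for a deep AI search running on a remote server.
    In a real game this would be an HTTP/RPC call; here we just sleep."""
    time.sleep(2.0)  # pretend the server spent two seconds searching many plies deep
    return "deep move for " + board_state

def local_ai_move(board_state: str) -> str:
    """Cheap fallback the console can always compute by itself."""
    return "shallow move for " + board_state

def next_move(board_state: str, wait_seconds: float = 5.0) -> str:
    # Kick off the expensive remote computation without blocking the game loop.
    future = pool.submit(cloud_ai_move, board_state)
    try:
        # The player is watching an "opponent is thinking" animation anyway,
        # so a few seconds of latency is acceptable for this kind of work.
        return future.result(timeout=wait_seconds)
    except TimeoutError:
        # Connection too slow or dropped: fall back to the dumbed-down local answer.
        return local_ai_move(board_state)

print(next_move("mid-game position"))
pool.shutdown()
```

The key point is that nothing in the frame-to-frame loop waits on the network; only work that already tolerates seconds of delay goes remote.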

Those thinking this would be used to do hit testing, collision response, or other physics in real time are missing the point. That's not what it's for. (However, any modern FPS already does this to some extent: Quake did this, for example. When you'd shoot someone, the server (aka cloud) figured out if you hit them and then sent the response to the client with the result. That's a basic form of it, but it hopefully illustrates the technique. That's why lag kills FPS games).

And for those asking why PC games don't do this, they actually do. Any MMO does stuff like this all the time. Most PC games don't have thousands of dedicated servers at the disposal of everyone playing at any given time (unless they are an MMO of some kind).
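
And to illustrate the Quake-style split two paragraphs up, here is a toy version of server-authoritative hit detection (all names invented): the client only reports where it aimed, and the server resolves the shot against its own copy of the world and sends back a result a few bytes long. This is the part that already runs remotely in every client-server shooter and MMO, and it is also why lag hurts.

```python
# Server-side truth: player id -> position along a 1D line (toy world state).
WORLD = {"shooter": 0.0, "target": 10.0}

def server_resolve_shot(shooter: str, aim_at: float, spread: float = 0.5) -> dict:
    """Runs on the server/'cloud': the client never decides hits itself."""
    actual_pos = WORLD["target"]  # authoritative position, not the client's stale view
    hit = abs(aim_at - actual_pos) <= spread
    return {"shooter": shooter, "hit": hit, "target_pos": actual_pos}

# The client last saw the target at 9.6 (slightly stale) and fires there;
# the server replies with the authoritative outcome.
reply = server_resolve_shot("shooter", aim_at=9.6)
print("hit!" if reply["hit"] else "miss", reply)
```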
 
Even if this works, which I doubt it will, it'll basically turn the Xbox One into the console equivalent of the gametrailers player. Varying quality depending on factors totally out of your control.

Whereas the PS4'll just have the guts to do it all locally.

I wouldn't be surprised if Microsoft just ends up faking the Cloud effect for first party titles and forces a lower quality mode if the internet connection drops to fool people.
 

Klocker

Member
The approach of the first paper is very interesting but it doesn't address the problem.

1. If this method were implemented, the only stuff that would be offloaded would be stateless, deterministic operations.
2. It's not a cloud computing solution but a grid computing application. The algorithm relies on having many clients, not one server, for performance, and on redundancy to ensure correctness. This redundancy also addresses the latency and thus can't be reproduced in a one-client, one-server structure.

thanks

you know more than I do; I was hoping it might shed some light.
As I said, I'm sure how it works exactly is still secret, but it will become more public once (if) they are ready to show it.
 

Windu

never heard about the cat, apparently
i think i am going to believe the engineers at microsoft and not random people on gaf. Interesting to see what comes of this.
They're going to be spending an awful lot of time and money on BS R&D to make this offloaded fluid dynamics, etc. work just to have a cover story for being able to check your copyright licences every 24 hours.
They have been building their PaaS and IaaS stuff for a while for enterprise (they are all in with the cloud or whatever). They have the infrastructure already in place and are probably not using it all, so this is probably not much of an extra cost for them in the grand scheme of things.
 

jayu26

Member
Even if this works, which I doubt it will, it'll basically turn the Xbox One into the console equivalent of the gametrailers player. Varying quality depending on factors totally out of your control.

Whereas the PS4'll just have the guts to do it all locally.

I wouldn't be surprised if Microsoft just ends up faking the Cloud effect for first party titles and forces a lower quality mode if the internet connection drops to fool people.

Wow! I don't think anyone would go to that length to sabotage their own game just because they might be salty!
 

Kunan

Member
Come on guys it's not too hard to come up with examples of stuff that doesn't need to happen right away. A basic version of this would be something like a chess game or RPG where the cloud could compute the AI opponent's turn in a matter of seconds where it might take minutes on the console. The data it sends back is nothing more than the next move, but the calculation required to figure it out wouldn't be practical on the console.

Or what about terrain generation where the game builds a random set of terrain while it's loading? On the console, this may take a lot longer than is practical, so it's dumbed down, but with cloud computing, it could be way more complex, even use data from other games currently in progress and send it back down to the console in a matter of seconds which is no big deal when you consider that loading into the level can take 20-30 seconds in a typical game anyway.

Those thinking this would be used to do hit testing, collision response, or other physics in real time are missing the point. That's not what it's for. (However, any modern FPS already does this to some extent: Quake did this, for example. When you'd shoot someone, the server (aka cloud) figured out if you hit them and then sent the response to the client with the result. That's a basic form of it, but it hopefully illustrates the technique. That's why lag kills FPS games).

And for those asking why PC games don't do this, they actually do. Any MMO does stuff like this all the time. Most PC games don't have thousands of dedicated servers at the disposal of everyone playing at any given time (unless they are an MMO of some kind).
There are absolutely a ton of things that could be offloaded. Like you said, complicated AI determination routines could be offloaded. When you approach an enemy and change direction (or switch weapon, start running, etc.), a split-second delay in its reaction and new plan of action is more human-like than a mirrored response, and could free the CPU to do other tasks.

The main reason rendering/dynamic tasks are being brought up a lot is that MS reps have brought them up as a way of closing a perceived power gap and leaping forward. Wars don't have much place in this thread, but it's good to establish context, for anyone reading, about what kinds of things we will truly achieve.
 
i think i am going to believe the engineers at microsoft and not random people on gaf. Interesting to see what comes of this.

To my knowledge not a single MS engineer has been interviewed on these topics. It's all been dirtbag businessmen and pr people. An engineer who actually worked on such tech could probably explain it clearly and concisely. Or even better, they could show it. Instead of just saying MAN IT'LL BE SO GREAT WITH THE CLOUD NAH MAN USED GAMES DRM SHHH THE CLOUD
 

Riggs

Banned
To my knowledge not a single MS engineer has been interviewed on these topics. It's all been dirtbag businessmen and pr people. An engineer who actually worked on such tech could probably explain it clearly and concisely. Or even better, they could show it. Instead of just saying MAN IT'LL BE SO GREAT WITH THE CLOUD NAH MAN USED GAMES DRM SHHH THE CLOUD

.
 

WoolyNinja

Member
Like this

You still have to figure out how the lighting from the sun reacts with everything. And if it doesn't react with anything then it wouldn't be hard to just calculate that locally. It would take up little to no resources to show a bright sun in the sky whose light doesn't interact with anything.
 
So, on a purely technical level, this isn't totally impossible. As has been pointed out in this thread, the idea is conceptually similar to how client-server multiplayer games already work, where the game state is computed on the server, and then sent to the client every tick. The thing is, the magic that makes that all work is prediction. The information from the server is always a few frames behind, so to hide that latency the client needs to be able to predict the current state of the game based on the old information from the server. This is a fairly well understood problem, but when latency or packet loss get too high, the algorithms break down and you start to see weird artifacts (players suddenly teleport or move backwards, you discover you've actually been dead for three seconds, etc.).

So, let's consider what this approach might look like for the example application we've been told about, lighting. Basically, you end up needing three different lighting systems:

1) There's the fancy all singing, all dancing lighting system that runs "in the cloud" and produces gorgeous looking effects, but is calculating the lights based on information from a frame or two ago.

2) There's the predictive lighting system running on the console. It tries to take the lighting information given to it from the system above, and (as cheaply as possible) figure out how to update it so that it actually looks correct for the frame that's currently being computed. This is potentially a very, very hard problem.

(You can try to minimize the work (2) has to do by moving some of the prediction to (1), but since the server can never know exactly when the information it sends will arrive at the console, (2) can never be fully eliminated).

3) You also need a complete fallback lighting system that can run real-time on the console. This is necessary for when the cloud is unavailable (for whatever reason) and for when something happening in the game totally invalidates the lighting information sent from the server (for example, the player turns on a light in a previously dark room).

This is a lot of complexity for what will probably end up being an incremental improvement in lighting quality. It would also cost a whole lot of money to run all those servers.

The other problem with the idea is that it doesn't help differentiate the Xbox from the competition. As cloud computing goes, Azure is an also-ran product and MS is hardly a powerhouse in the field. There's nothing stopping Sony from introducing the same capability for use on the PS4, or for that matter stopping EA or Activision from setting up their own clouds that work cross platform. There's also nothing about this idea that couldn't have been done on current gen consoles. The fact that almost no one has tried it before is somewhat telling.
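
For what it's worth, here is a minimal sketch of the client half of (2) and (3) above, with invented names and a single scalar intensity standing in for a full lighting solution: the console receives a value the server computed for an earlier moment, extrapolates it forward to the current frame, and drops to a cheap local model whenever the data is missing, too stale, or has been invalidated by something the player did.

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class CloudLighting:
    """What the (hypothetical) server sends back each tick."""
    computed_at: float     # the moment the result describes, always slightly in the past
    intensity: float       # stand-in for a full lighting solution
    rate_of_change: float  # how fast the server saw the value changing

MAX_STALENESS = 0.25  # seconds; beyond this, extrapolation is no longer trusted

def local_fallback() -> float:
    """System (3): a cheap, fully local approximation that always works offline."""
    return 0.5

def shade_frame(now: float, cloud: Optional[CloudLighting], scene_invalidated: bool) -> float:
    """System (2): predict the current value from stale cloud data, or fall back."""
    if cloud is None or scene_invalidated:
        return local_fallback()   # cloud unreachable, or the player turned on a light
    age = now - cloud.computed_at
    if age > MAX_STALENESS:
        return local_fallback()   # too old to extrapolate safely
    # Simple linear extrapolation from "a frame or two ago" to "now".
    return cloud.intensity + cloud.rate_of_change * age

now = time.monotonic()
stale = CloudLighting(computed_at=now - 0.05, intensity=0.8, rate_of_change=-0.4)
print(shade_frame(now, stale, scene_invalidated=False))  # predicted from cloud data
print(shade_frame(now, stale, scene_invalidated=True))   # fallback: result invalidated
print(shade_frame(now, None, scene_invalidated=False))   # fallback: cloud unreachable
```

Even in this toy form you can see where the cost goes: (2) has to be good enough that the seams don't show, and (3) has to exist anyway, so the cloud only ever buys you the difference between them.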
 

shinnn

Member
To my knowledge not a single MS engineer has been interviewed on these topics. It's all been dirtbag businessmen and pr people. An engineer who actually worked on such tech could probably explain it clearly and concisely. Or even better, they could show it. Instead of just saying MAN IT'LL BE SO GREAT WITH THE CLOUD NAH MAN USED GAMES DRM SHHH THE CLOUD

what.. they discussed it in a panel with hundreds of journalists watching.

http://www.twitch.tv/microsoftstudios/c/2315924
 

Windu

never heard about the cat, apparently
To my knowledge not a single MS engineer has been interviewed on these topics. It's all been dirtbag businessmen and pr people. An engineer who actually worked on such tech could probably explain it clearly and concisely. Or even better, they could show it. Instead of just saying MAN IT'LL BE SO GREAT WITH THE CLOUD NAH MAN USED GAMES DRM SHHH THE CLOUD
that twitch talk had engineers talking about it.
 

nib95

Banned
So, on a purely technical level, this isn't totally impossible. As has been pointed out in this thread, the idea is conceptually similar to how client-server multiplayer games already work, where the game state is computed on the server, and then sent to the client every tick. The thing is, the magic that makes that all work is prediction. The information from the server is always a few frames behind, so to hide that latency the client needs to be able to predict the current state of the game based on the old information from the server. This is a fairly well understood problem, but when latency or packet loss get too high, the algorithms break down and you start to see weird artifacts (players suddenly teleport or move backwards, you discover you've actually been dead for three seconds, etc.).

So, let's consider what this approach might look like for the example application we've been told about, lighting. Basically, you end up needing three different lighting systems:

1) There's the fancy all singing, all dancing lighting system that runs "in the cloud" and produces gorgeous looking effects, but is calculating the lights based on information from a frame or two ago.

2) There's the predictive lighting system running on the console. It tries to take the lighting information given to it from the system above, and (as cheaply as possible) figure out how to update it so that it actually looks correct for the frame that's currently being computed. This is potentially a very, very hard problem.

(You can try to minimize the work (2) has to do by moving some of the prediction to (1), but since the server can never know exactly when the information it sends will arrive at the console, (2) can never be fully eliminated).

3) You also need a complete fallback lighting system that can run real-time on the console. This is necessary for when the cloud is unavailable (for whatever reason) and for when something happening in the game totally invalidates the lighting information sent from the server (for example, the player turns on a light in a previously dark room).

This is a lot of complexity for what will probably end up being an incremental improvement in lighting quality. It would also cost a whole lot of money to run all those servers.

The other problem with the idea is that it doesn't help differentiate the Xbox from the competition. As cloud computing goes, Azure is an also-ran product and MS is hardly a powerhouse in the field. There's nothing stopping Sony from introducing the same capability for use on the PS4, or for that matter stopping EA or Activision from setting up their own clouds that work cross platform. There's also nothing about this idea that couldn't have been done on current gen consoles. The fact that almost no one has tried it before is somewhat telling.

Good post.
 

Bumblebeetuna

Gold Member
How viable is the claim regarding the cloud acting as dedicated servers for all games? Because that sounds awesome. Something that would make 3rd party games like Call of Duty way better on Xbone potentially.
 
Greenawalt alluded to it being used in Forza 5, so maybe we will see sooner rather than later.

That's a first party game. It's to be expected, but really, how many third party developers are going to use this? Not many. I mean, despite Sony having some pretty great tools for development on Cell by year three, developers still chose the cheapest path of least resistance. As long as the PS4 has a sizable market share that's comparable to the Xbone, this isn't going to gain much traction in third party use. Developers will continue to make sure their games run on code that's as cross-platform as can be. This is basically analogous to developing PS3 games versus 360/PC games in terms of code utilization.
 