
Cloud demoing for gaming at Microsoft Build

MS is not going to give away $20 odd of compute power per player per hour for free; even with massive discounts (say 50%) it is still way too much for devs or players to bear.

Who says it's for free? People pay for Xbox Live. And if you take the number of paying customers versus the number of concurrent users, I'd say it pretty much covers the usage of these servers.
 
Was it just me, or did it look like the simulation in the demo was running in slow motion? I don't know how they expect to ever use this in a shipping product if, even in a best-case scenario, it can't run at full speed.
 
I seriously don't get how MS is going to pay for this. Say in a few years there are 40 million Xbox owners wanting to play games with this kind of tech. How many people can one of these servers handle at the same time? I mean, Live brings in $5 a month or something; how are they going to keep all these servers up for that kind of money?
 
Was it just me, or did it look like the simulation in the demo was running in slow motion? I don't know how they expect to ever use this in a shipping product if, even in a best-case scenario, it can't run at full speed.

What the hell are you talking about? lol
 
Who says it's for free? People pay for Xbox Live. And if you take the number of paying customers versus the number of concurrent users, I'd say it pretty much covers the usage of these servers.

One of these machines would need to serve 144 people at the same time if all the money from Live went to it, and that's assuming it only costs $1 an hour to run once hardware, electricity, support, maintenance, etc. are included. That seems improbable.
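The 144 figure above follows from simple break-even arithmetic; here is a quick sketch (every number is the poster's assumption, not real pricing):

```python
# Back-of-envelope check of the subscription-vs-server-cost claim.
# All figures are assumptions from the post, not real Azure pricing.
MONTHLY_FEE = 5.00           # assumed Xbox Live revenue per subscriber per month
SERVER_COST_PER_HOUR = 1.00  # assumed all-in cost (hardware, power, support)
HOURS_PER_MONTH = 24 * 30    # a server running around the clock

monthly_server_cost = SERVER_COST_PER_HOUR * HOURS_PER_MONTH  # $720
# Subscribers needed to fund one always-on server:
subscribers_per_server = monthly_server_cost / MONTHLY_FEE
print(subscribers_per_server)  # 144.0
```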
 
I seriously don't get how MS is going to pay for this. Say in a few years there are 40 million Xbox owners wanting to play games with this kind of tech. How many people can one of these servers handle at the same time? I mean, Live brings in $5 a month or something; how are they going to keep all these servers up for that kind of money?

I'm sure there are more than 40 million people using their Azure servers for things besides gaming on a daily basis.

Having said that, divide 40 million by the number of people online at any one time. Then divide that number by the number of people who are playing a game that requires cloud computing.
 
I'm sure there are more than 40 million people using their Azure servers for things besides gaming on a daily basis.

Having said that, divide 40 million by the number of people online at any one time. Then divide that number by the number of people who are playing a game that requires cloud computing.

But how much of that involves heavy computation? I looked at about the first 10-15 companies on their partner sites, and it's mostly just having people connected to each other, or dedicated servers.
 
Those users are actually paying for the service.

Well, the end user is theoretically paying via Xbox Live, but considering Microsoft is rather fond of using Xbox Live as a paywall, I can never see them letting people use these servers for free.

Something has to give; this is the games industry, where a video game console has apps that are free on every other piece of hardware, including fridges, locked behind a paywall.
 
What the hell are you talking about? lol

I don't know. All the destruction looked like it was happening at like half speed. I'd blame the video playback on my tablet, but the people and sound were normal. But like I said, maybe I'm imagining things.
 
Correct me if I'm wrong but Azure is a CPU farm, not a GPU farm. Combine that with the following observation about the CPU utilization of Infamous: Second Son, and XB1 games are still going to be GPU bound even with cloud computing.

It was also hard to max out the CPU (while we learned in a previous interview that the GPU is used at its maximum capacity most of the time), with 50-70% used for main jobs and 5-16% for other threads.
How inFAMOUS: Second Son Used the PS4′s 8 (4.5) GB of RAM, CPU and GPU Compute to Make Our Jaws Drop
 
I don't know. All the destruction looked like it was happening at like half speed. I'd blame the video playback on my tablet, but the people and sound were normal. But like I said, maybe I'm imagining things.

Ok, I'll give a more serious reply.

The speed at which objects move in a computer program is a design choice. The only thing that really matters here is the framerate, which wasn't noticeably impacted in the cloud demonstration (and was actually running slow in the non-cloud version). For example, take two 60fps racers, OutRun 2 and F-Zero GX. OutRun isn't running in slow motion compared to F-Zero simply because everything appears to be moving slower to you. It just means that F-Zero is designed to move objects further along in the same amount of time.
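The distinction being made here (apparent speed is a design parameter, framerate is the cost) can be illustrated with a toy fixed-timestep integrator; the function and values below are purely illustrative:

```python
# Minimal sketch: apparent speed is a design choice, independent of framerate.
# The integrator advances the simulation by dt * time_scale each frame;
# halving time_scale halves on-screen speed while the frame count
# (and the per-frame work) is unchanged. Names are illustrative.
def simulate_fall(frames, fps=60, time_scale=1.0, g=9.81):
    dt = (1.0 / fps) * time_scale
    y, v = 0.0, 0.0
    for _ in range(frames):
        v += g * dt          # same number of operations per frame
        y += v * dt          # regardless of time_scale
    return y

full_speed = simulate_fall(60)                  # one second of sim time
half_speed = simulate_fall(60, time_scale=0.5)  # looks like slow motion
print(full_speed, half_speed)
```

Over the same 60 frames the half-speed run covers a quarter of the distance, yet both runs perform exactly the same arithmetic.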
 
Ok, I'll give a more serious reply.

The speed at which objects move in a computer program is a design choice. The only thing that really matters here is the framerate, which wasn't noticeably impacted in the cloud demonstration (and was actually running slow in the non-cloud version). For example, take two 60fps racers, OutRun 2 and F-Zero GX. OutRun isn't running in slow motion compared to F-Zero simply because everything appears to be moving slower to you. It just means that F-Zero is designed to move objects further along in the same amount of time.

I'm well aware, and that's actually part of my point. It's supposed to be a physical simulation, right? Based on certain distances, weights, momenta and gravity? To make a simulation appear realistic, you would set those things to approximate the real physical laws that govern things like the rate of acceleration of an object falling in standard gravity.

For me the simulation presented in this demo seems off, like they dilated time (and this is independent of, and irrespective of, the frame rate) the way CCP does when too many ships are fighting in one place in EVE Online. Not to that extreme, of course. Maybe they didn't think about it. Maybe they set it up the way they did to make it easier to see the number of chunks flying around, to emphasize the destruction. Or maybe something about the latency introduced by the cloud processing can be better hidden at this time scale.

In any case we're still a long way away from this kind of processing being viable for end user applications for myriad reasons.
 
Battlefield uses the cloud in the same way? Well, that's mighty impressive.

Battlefield does with its physics stuff, and I believe there was a post on Reddit from an engineer saying that the reason BF4 was so buggy and had a rough launch was that all the server-side calculations were too much, and something about either tuning it down or asking for more or better servers (which is obviously something that got shot down).
 
I'm well aware, and that's actually part of my point. It's supposed to be a physical simulation, right? Based on certain distances, weights, momenta and gravity? To make a simulation appear realistic, you would set those things to approximate the real physical laws that govern things like the rate of acceleration of an object falling in standard gravity.

For me the simulation presented in this demo seems off, like they dilated time (and this is independent of, and irrespective of, the frame rate) the way CCP does when too many ships are fighting in one place in EVE Online. Not to that extreme, of course. Maybe they didn't think about it. Maybe they set it up the way they did to make it easier to see the number of chunks flying around, to emphasize the destruction. Or maybe something about the latency introduced by the cloud processing can be better hidden at this time scale.

In any case we're still a long way away from this kind of processing being viable for end user applications for myriad reasons.

Man, the demo wasn't about having the most realistic destruction; it was about showing how the cloud can process those calculations.
Like the user above said, it was a matter of design.
 
"I doubt that'd happen often, if at all, anyway": words an engineer loves to hear. Also, if using the time the rocket spends traveling to its destination is your primary strategy, you might be surprised when a player decides to fire a rocket at point-blank range.
Or you can set parameters for larger pieces to interact locally. Or, maybe the best way: you can't interact with them at all, or it's instant death. If you're shooting at point-blank range, kill the player and render something canned. And prediction technology exists these days: start calculating when the player is aiming. Rockets in games usually take nearly half a second to go from idle, to aiming down sights, to launched anyway. There are smart ways to do this, just not many as instantly visible as a building exploding.

Looks like MS took the "Will my game look worse without cloud?" to heart.
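The "start calculating when the player is aiming" idea above is essentially speculative prefetching with a local fallback. A minimal sketch, where `cloud_destruction` and the timings are made-up stand-ins for a remote physics call:

```python
# Sketch of the latency-hiding idea: kick off the expensive cloud request
# as soon as the player aims, so by the time the rocket lands the result
# is (hopefully) already back. Everything here is illustrative.
from concurrent.futures import ThreadPoolExecutor
import time

def cloud_destruction(target):
    time.sleep(0.05)  # pretend round trip + server compute (50 ms)
    return f"debris-for-{target}"

pool = ThreadPoolExecutor()

# Player starts aiming: fire off the request speculatively.
pending = pool.submit(cloud_destruction, "building-7")

time.sleep(0.2)  # rocket travel time masks the latency

if pending.done():
    debris = pending.result()    # cloud result arrived in time
else:
    debris = "canned-animation"  # point-blank shot: fall back locally
print(debris)  # debris-for-building-7
```

The point-blank case is exactly the fallback branch: when travel time is too short to hide the round trip, show something canned instead.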
 
Man, the demo wasn't about having the most realistic destruction; it was about showing how the cloud can process those calculations.
Like the user above said, it was a matter of design.

It was a demo intended to show the technique is becoming a viable solution. For that to be persuasive, the demo needs to show it working the way you'd present it in an actual product. Unless it's only supposed to be used in games set underwater or with slow-motion powers, I'm left with certain questions about what can really be accomplished.
 
I'm well aware, and that's actually part of my point. It's supposed to be a physical simulation, right? Based on certain distances, weights, momenta and gravity? To make a simulation appear realistic, you would set those things to approximate the real physical laws that govern things like the rate of acceleration of an object falling in standard gravity.

For me the simulation presented in this demo seems off, like they dilated time (and this is independent of, and irrespective of, the frame rate) the way CCP does when too many ships are fighting in one place in EVE Online. Not to that extreme, of course. Maybe they didn't think about it. Maybe they set it up the way they did to make it easier to see the number of chunks flying around, to emphasize the destruction. Or maybe something about the latency introduced by the cloud processing can be better hidden at this time scale.

In any case we're still a long way away from this kind of processing being viable for end user applications for myriad reasons.

Ah ok, your point makes a lot more sense now. My guess would be that the speed was chosen primarily to allow us to see the effects of each impact. I don't think the latency would make much noticeable difference in this case, as I imagine they'd be able to get that down to 1 or 2 frames at this frame rate. There's also the fact that only the rocket shots would show this delay, as the detonation was invisible, so we'd not know when it was triggered anyway.

I do agree that the use case shown here doesn't make much sense for anything other than a fully online game right now, but then... I don't have much of an imagination. I also want to apologise if my previous post came across as patronizing. Sometimes on here I end up in discussions with people who don't understand some of the basic concepts, so I kinda misjudged what you were implying by slow motion. :P
 
In the first demo (the one without server compute), the area surrounding the main building is empty. See 0:52 - he turns to the left and there is nothing there.

In the server compute demo, the area to the left, and all around, are filled with other buildings and objects. Some of them are destructible (all while the main building is falling apart).

The implication seems to be that by allowing the server compute to handle the realtime calculations for the destruction, there is reserve processing power to fill the world with more detail...? Could this work for a game like Destiny (just an example) where much of the world can be processed through server compute... seeing as it is an online game...?
 
I'm sure there are more than 40 million people using their Azure servers for things besides gaming on a daily basis.

Having said that divide 40 million by the number of people online at any one time. Then divide that number by the number of people who are playing a game that requires cloud computing.

And some of those are going to be in multiplayer situations in which you'll only have to calculate the global state once, then distribute the results to multiple people.

Gold subscribers are also worth far more than just the cost of their subscription; many of them spend a lot of money on Live.

In any case, I like how we're moving on from "the power of TEH CLOUD lol" to "but who's going to pay for that??"
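The compute-once, distribute-to-many point can be sketched like this (all names are illustrative, not any real engine API):

```python
# Sketch of the amortisation argument: in a shared match the server
# simulates the world once per tick and fans the same result out to every
# connected player, so per-player compute cost falls with match size.
def advance_physics(state):
    return state + 1  # stand-in for the expensive simulation step

def server_tick(world_state, players):
    new_state = advance_physics(world_state)  # computed once
    for p in players:
        p.append(new_state)                   # distributed N times
    return new_state

players = [[] for _ in range(16)]  # a 16-player match
state = 0
for _ in range(3):                 # three server ticks
    state = server_tick(state, players)
print(state, players[0])  # 3 [1, 2, 3]
```

One expensive `advance_physics` call per tick serves all sixteen players; only the cheap distribution step scales with player count.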
 
And some of those are going to be in multiplayer situations in which you'll only have to calculate the global state once, then distribute the results to multiple people.

Gold subscribers are also worth far more than just the cost of their subscription; many of them spend a lot of money on Live.

In any case, I like how we're moving on from "the power of TEH CLOUD lol" to "but who's going to pay for that??"

Nope, not what I expect. Although you are partially right: it's not who is going to pay for it, but rather who is going to pay for the extra development costs for minor benefits.

Also really amusing that almost every defender of the cloud is a junior member...

Correct me if I'm wrong but Azure is a CPU farm, not a GPU farm. Combine that with the following observation about the CPU utilization of Infamous: Second Son, and XB1 games are still going to be GPU bound even with cloud computing.

Fantastic point showing that the GPU is far more important than the CPU, and the cloud can't help with the GPU.
 
And some of those are going to be in multiplayer situations in which you'll only have to calculate the global state once, then distribute the results to multiple people.

Gold subscribers are also worth far more than just the cost of their subscription; many of them spend a lot of money on Live.

In any case, I like how we're moving on from "the power of TEH CLOUD lol" to "but who's going to pay for that??"

The viability of the server compute, and who pays for it, are two different arguments.

The demo isn't the be-all and end-all, but it does show the benefits of Azure compute services. So naturally people will shift towards attacking the idea from another angle: price. It is what people do.
 
Nope, not what I expect. Although you are partially right: it's not who is going to pay for it, but rather who is going to pay for the extra development costs for minor benefits.

Also really amusing that almost every defender of the cloud is a junior member...



Fantastic point showing that the GPU is far more important than the CPU, and the cloud can't help with the GPU.

Just curious, but why are you so certain that this would lead to substantially increased development costs? I don't know the specifics myself, but apparently one of the big things about Thunderhead is that it has simple integration with the Xbox SDK. It may simply be a case of placing your physics module on their service and calling that rather than the standard function calls you'd usually have.

And yes, the Xbox One will still be limited by its CPU either way (so is the PS4, btw). However, CPU computing can help with some of the tasks assigned to a GPU, even if it is far less optimal to do so. It also helps that the CPU in this case will be tasked with one thing, rather than also being responsible for rendering everything else visible to the player at the time. So I don't see how highlighting how Infamous uses the GPU actually invalidates anything. It's not like utilising the cloud invalidates the X1's own GPU...

EDIT: I'm also not a junior member. :)
 
I readily admit I know nothing about this stuff, but to me at least, if both computers were rendering the same thing at the same resolution and the cloud one stayed at 32 fps while the non-cloud one dropped to 2 fps, then that's kind of impressive.

Depends. If the 'cloud' was 5 servers with super CPUs and Titan Black GPUs doing the grunt work, connected at high speed with 2 ms ping to the device...

Or normal commercial Azure servers many miles away on a normal internet connection with 50 ms ping...

I remain sceptical...

I don't believe much of what I see at a staged event...
 
What happens if there are too many users playing a game?

Obviously, the calculations in the cloud are possible because some other CPUs/GPUs are doing the work for you. But how many do they have? They will need millions of them to cover the whole user base. Whole server farms, to be exact.

All this seems like a waste of resources and electricity to me. I mean, each person will need 2, 3 or more computers with strong GPUs working exclusively for him. And for what? Just so he can have more particles and polygons? Aren't these things getting better as technology progresses anyway? Isn't optimization of the code also helping?

Also, imagine the cloud becoming a standard. I can see it making developers care less about optimizing their code, because they will have the cloud to do the work for them. Games that would probably work without the cloud with some optimization will probably require it in the future...
 
It was a demo intended to show the technique is becoming a viable solution. For that to be persuasive, the demo needs to show it working the way you'd present it in an actual product. Unless it's only supposed to be used in games set underwater or with slow-motion powers, I'm left with certain questions about what can really be accomplished.

That is your vision because you seem to know nothing of game programming.
The actual design of how realistic the destruction is doesn't matter at all, that is a question of fine tuning all the variables such as gravity, attrition, mass, etc...

The calculations are going to be the same (CPU load in other words) either you have 0 gravity or earth gravity. When they explode the chunks they need to calculate all the math according to the variables given, but the numbers of the variables don't change the amount of calculations enormously.
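The claim that the variable values don't change the CPU load can be illustrated with a toy per-chunk integrator; the structure below is illustrative, not the demo's actual code:

```python
# The per-chunk work of an integration step is the same whether g is 0 or
# 9.81 -- only the numbers change, not the number of operations.
def step_chunks(chunks, g, dt=1.0 / 30):
    ops = 0
    for c in chunks:
        c["vy"] += g * dt        # one multiply-add per chunk...
        c["y"] += c["vy"] * dt   # ...and one more, regardless of g
        ops += 2
    return ops

make = lambda n: [{"y": 0.0, "vy": 0.0} for _ in range(n)]
ops_zero_g = step_chunks(make(32000), g=0.0)
ops_earth_g = step_chunks(make(32000), g=9.81)
print(ops_zero_g == ops_earth_g)  # True: same CPU load either way
```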
 
I was mostly frustrated by MS spouting 'cloud' every other sentence around E3 with no concrete info to back it up, hoping that people just buy it as a magical thing.

There are areas where it helps, but by the nature of the cloud, that isn't unique to Xbox - also something MS seemed to imply

Anything heavy-duty, like calculating physics so the 'client' can just draw it and save computing power (e.g. think of that GTA V vehicle crashing into a wall: easy to play back the canned movements of the wall breaking), sounds good, but would require a significant amount of processing power, and the cloud isn't set up like that at the moment. It might move towards that, but it is more suited to file serving. Plus you have issues like latency (you'd need a lot of power available to crunch the data quickly enough to send back) and bandwidth (sending the canned animation data for those physics objects could require a decent internet connection). The same goes for things like having the cloud calculate lighting changes. Plus your game would need to account for situations where you don't have that data, increasing its complexity significantly.


The best uses would be asynchronous stuff like Forza 5's Drivatars. There you can let slower processors crunch over a period of time; you don't need the results back instantly.
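One way to frame the synchronous/asynchronous split described above: a task suits the cloud when its deadline dwarfs the network round trip. The numbers below are assumptions, not measurements:

```python
# A task is a good cloud candidate when its deadline comfortably exceeds
# the round trip. Figures are illustrative assumptions.
ROUND_TRIP_MS = 100  # assumed typical consumer internet latency

def cloud_viable(deadline_ms, rtt_ms=ROUND_TRIP_MS):
    return deadline_ms > rtt_ms

frame_physics = cloud_viable(16)             # must land within one 60fps frame
drivatar_training = cloud_viable(3_600_000)  # results needed "sometime later"
print(frame_physics, drivatar_training)  # False True
```

Per-frame physics fails the test at any realistic latency, while Drivatar-style background crunching passes by six orders of magnitude.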


And any benefits are still usable by Sony
 
That is your vision, because you seem to know nothing of game programming. How realistic the destruction looks doesn't matter at all; that is a question of fine-tuning variables such as gravity, friction, mass, etc.

The calculations are going to be the same (the CPU load, in other words) whether you have zero gravity or Earth gravity. When they explode the chunks, they need to calculate all the math according to the variables given, but the values of those variables don't change the amount of calculation enormously.

But the timescale chosen changes the amount of time you have to do those calculations.
 
It would be better if the demo was actually impressive. There's no reason a decent gaming PC would only get 2fps with that demo, unless the code is really unoptimized. We've seen more impressive physics in actual games.
 
But the timescale chosen changes the amount of time you have to do those calculations.

Even if it does, they are blowing up 30,000 chunks in less than a minute. You don't do that in any game, anywhere. Each chunk is an individual object; that is the magic of all this. And they keep calculating it all through the demo.
1 minute, 32,000 chunks. https://www.youtube.com/watch?v=QxHdUDhOMyw&feature=youtu.be

Find me any game that even comes close to that.

https://www.youtube.com/watch?v=wDciQ2nAPdQ (amazing destruction; if you pause you cannot see more than 1,000 chunks)
 
Longevity of support bothers me.

Say I have an SP game that offloads something to Azure as an integral part of the game design.

Now let's say it doesn't sell well.

How long will MS subsidise my cloud-dependent flop?

Even if it is successful, how long will MS support it?
 
Longevity of support bothers me.

Say I have an SP game that offloads something to Azure as an integral part of the game design.

Now let's say it doesn't sell well.

How long will MS subsidise my cloud-dependent flop?

Even if it is successful, how long will MS support it?

Even with cloud's inherent scalability, that's a fair question that needs to be answered.
 
Even if it does, they are blowing up 30,000 chunks in less than a minute. You don't do that in any game, anywhere. Each chunk is an individual object; that is the magic of all this. And they keep calculating it all through the demo.
1 minute, 32,000 chunks. https://www.youtube.com/watch?v=QxHdUDhOMyw&feature=youtu.be

Find me any game that even comes close to that.

https://www.youtube.com/watch?v=wDciQ2nAPdQ (amazing destruction; if you pause you cannot see more than 1,000 chunks)

30,000 is peanuts. How about 200,000? In 1080p at 60fps?

"The entire environment - everything - is built up from these voxels," explains Kruger. "All of the cubes that you see flying around - there's no gimmick, no point sprites, it's not a particle effect, they're actual physical cubes. In gameplay, dynamic cubes with collisions, floating around, you can get up to 200,000. Our engine supports up to 500,000 but in actual gameplay scenarios, it rarely goes over 200K."

http://www.eurogamer.net/articles/digitalfoundry-vs-resogun
 
MS have a big uphill battle convincing people about the viability of their server compute.

Talking about it was weak. Showing a concept in action is better, and it is creating some better discussion around the idea of server compute. Now it needs to be seen happening on a 3 Mb/s connection, with 100-200 ms latency, with high-res textures, lighting, shadows, etc.

If it works convincingly in that scenario, then I'm sold.
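A rough feasibility check for the 3 Mb/s scenario above: naively streaming per-chunk transforms, under assumed sizes (no delta compression or culling), would need roughly two orders of magnitude more bandwidth than that connection offers:

```python
# Rough bandwidth estimate for streaming updated transforms for 32,000
# chunks every frame. All sizes are assumptions (e.g. a quantised
# position + orientation packed into 16 bytes, no compression).
CHUNKS = 32000
BYTES_PER_CHUNK = 16  # assumed: packed position + orientation
FPS = 30

bytes_per_second = CHUNKS * BYTES_PER_CHUNK * FPS
megabits_per_second = bytes_per_second * 8 / 1_000_000
print(megabits_per_second)  # 122.88, far beyond a 3 Mb/s line
```

Real systems would compress heavily and only send active chunks, but the gap this sketch shows is why the bandwidth question keeps coming up.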
 
Also really amusing that almost every defender of the cloud is a junior member...



Fantastic point showing that the GPU is far more important than the CPU, and the cloud can't help with the GPU.

If you have a concern pm a mod and let them deal with it.

You're entirely missing the point on this vs. Infamous: if Infamous was trying to compute 32,000 objects, you can bet your ass it'd be CPU bound. This demo is to show that in processing scenarios so intense that they drive the machine from GPU bound to CPU bound, you can remove the CPU burden and get it back to GPU bound.
 
Yeah, what's your point?

Looks OK on a decent PC rig at full HD and a consistent frame rate.
It's a demo of cloud computing under the conditions you stated. I'm not going to derail this thread into Titanfall cloud discussion - it's already been done to death in the Respawn Engadget interview thread a few weeks back.
 
30,000 is peanuts. How about 200,000? In 1080p at 60fps?



http://www.eurogamer.net/articles/digitalfoundry-vs-resogun

Whilst touting 32000 chunks was a bit misguided, Resogun doesn't trouble itself with rotations as far as I can tell. That makes everything significantly simpler, and prevents it from being comparable really.

The Next Car Game comparison is also flawed though, because that stuff simply wouldn't work well with the delayed responses you'd get from the cloud computations.
 
Point I don't get is this.

Say this demo runs slowly on a high-end PC, and could run better on, say, 2 servers, being conservative.

How many Azure servers are there? Let's say 300,000 just for argument's sake.

Now, if a game utilising this sells 2 million copies, pretty normal for a decent console game...

Then how many servers would be needed? 4 million? 6 million?

It looks OK for a single-unit demo in known conditions, but the reality does not add up at all to me.

Anybody the wiser, or does this look totally impractical?
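The scaling worry can be put into numbers using the poster's own guesses (none of these figures are real; the concurrency rate is an added assumption):

```python
# Back-of-envelope server count for one cloud-dependent game, using the
# post's guesses plus an assumed peak-concurrency rate.
COPIES_SOLD = 2_000_000
SERVERS_PER_PLAYER = 2   # assumed: the demo needed ~2 servers per session
PEAK_CONCURRENCY = 0.10  # assume 10% of owners online at once

peak_players = COPIES_SOLD * PEAK_CONCURRENCY
servers_needed = int(peak_players * SERVERS_PER_PLAYER)
print(servers_needed)  # 400000, more than the 300,000 assumed for Xbox above
```

Even with only a tenth of owners online at once, this one title would exceed the 300,000-server figure used for argument's sake; shared multiplayer sessions and time-slicing are the obvious ways the real number would come down.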
 
Whilst touting 32000 chunks was a bit misguided, Resogun doesn't trouble itself with rotations as far as I can tell. That makes everything significantly simpler, and prevents it from being comparable really.

The Next Car Game comparison is also flawed though, because that stuff simply wouldn't work well with the delayed responses you'd get from the cloud computations.

The cubes in Resogun definitely rotate; here's a screenshot. They're all just cubes though, which indeed makes it a lot simpler. Still, 200,000 at 1080p60 is pretty impressive.
 
Nope, not what I expect. Although you are partially right: it's not who is going to pay for it, but rather who is going to pay for the extra development costs for minor benefits.

Also really amusing that almost every defender of the cloud is a junior member...



Fantastic point showing that the GPU is far more important than the CPU, and the cloud can't help with the GPU.

There are current and future techniques that make more use of the CPU than the GPU, like ray tracing. So not everything graphical has to be tied to the GPU.
 
The silence on NeoGAF today is deafening...

You don't have to dislike it just because it's great news for Xbox One; it's great for gamers in general! We should all be excited about the potential of this tech!
 
The silence on NeoGAF today is deafening...

You don't have to dislike it just because it's great news for Xbox One; it's great for gamers in general! We should all be excited about the potential of this tech!

No, it's just that we are smart enough to know cloud compute is no big deal.

http://www.extremetech.com/extreme/...than-google-but-more-than-amazon-says-ballmer

So, MS has 300,000 servers projected for Xbox, and 1 million in total according to Ballmer. The average cost of a server is $1,000; think of that as a top-end PC...

MS is going for Office 365 and other cloud services that companies will pay for. They aren't going to give each Xbox gamer their own server (unless you are a MisterX fan and live in stupid land).

If a game uses 1 server to do all the computing for it (probably like that stage demo), and the game sells 2 million, then they don't have enough servers. TO RUN 1 GAME LIKE THIS.

And I am giving internet latency, bandwidth and ping the benefit of the doubt...

Most people understand this and see it for the FUD it is. Obviously you don't.

If you are wiser, then tell us...
 
Would MS have to put 'Online Only' on all of their game cases? Wouldn't it be breaking the law if they didn't? And if MS starts adding this into all of their games, isn't it DRM by the back door?

It won't happen...

'Sorry, you can't play this single-player game; it requires an internet connection...'

It's still not working.

'Sorry, you can't play this single-player game; it requires a good internet connection.'

But I live in Africa.

'Sorry, you can't play this single-player game; it requires a good internet connection, and it won't be releasing in your country.'

The game didn't sell well at all.

'Sorry, but the servers for The Experiment have had to be shut down.'
 
MS have a big uphill battle convincing people about the viability of their server compute.

Talking about it was weak. Showing a concept in action is better, and it is creating some better discussion around the idea of server compute. Now it needs to be seen happening on a 3 Mb/s connection, with 100-200 ms latency, with high-res textures, lighting, shadows, etc.

If it works convincingly in that scenario, then I'm sold.

Yeah, irrespective of the timescale concerns I brought up, I would not be surprised if the reason the demo is capped at 32fps and the graphics are so flat and simple is that they have to spend most of each frame's rendering time waiting for results to be delivered from "the cloud".
 