Out of curiosity... What did you expect him to say?
he didn't have to say anything at all.
MS is not going to give away $20-odd of compute power per player per hour for free. Even with massive discounts (say 50%) it is still way too much for devs or players to bear.
Was it just me, or did it look like the simulation in the demo was running in slow motion? I don't know how they expect to ever use this in a shipping product if, even in a best-case scenario, it can't run at full speed.
Who says it's for free? People pay for Xbox Live. And if you take the number of paying customers versus the number of concurrent users, I'd say it pretty much covers the usage of these servers.
I seriously don't get how MS is going to pay for this. Say in a few years there are 40 million Xbox owners wanting to play games with this kind of tech. How many people can one of these servers handle at the same time? I mean, Live brings in $5 a month or something; how are they going to keep all these servers up for that kind of money?
I'm sure there are more than 40 million people using their Azure servers for things besides gaming on a daily basis.
Having said that, divide 40 million by the number of people online at any one time. Then divide that number by the number of people who are playing a game that requires cloud computing.
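To make the argument above concrete, here's a back-of-envelope version of that division. Every number here is an assumption picked for illustration (install base, peak concurrency, share of players in cloud-compute titles, sessions per server), not anything from MS:

```python
# Back-of-envelope server-capacity estimate; all numbers are invented assumptions.
total_owners = 40_000_000      # hypothetical install base
peak_concurrency = 0.10        # assume 10% online at the busiest hour
cloud_game_share = 0.25        # assume 25% of those are in a cloud-compute title

concurrent_cloud_players = total_owners * peak_concurrency * cloud_game_share
print(f"Concurrent cloud players: {concurrent_cloud_players:,.0f}")

# If one server can handle, say, 20 simultaneous sessions:
sessions_per_server = 20
servers_needed = concurrent_cloud_players / sessions_per_server
print(f"Servers needed at peak: {servers_needed:,.0f}")
```

Under these made-up numbers the fleet needed is far smaller than "one server per owner", which is the point the post is making.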
What the hell are you talking about? lol
It was also hard to max out the CPU (while we learned in a previous interview that the GPU is used at its maximum capacity most of the time), with 50-70% used for main jobs and 5-16% for other threads.
How inFAMOUS: Second Son Used the PS4's 8 (4.5) GB of RAM, CPU and GPU Compute to Make Our Jaws Drop
I don't know. All the destruction looked like it was happening at like half speed. I'd blame the video playback on my tablet, but the people and sound were normal. But like I said, maybe I'm imagining things.
Ok, I'll give a more serious reply.
The speed that objects move in a computer program is by design. The only thing that really matters here is the framerate, which wasn't noticeably impacted in the cloud demonstration (and was actually running slow in the non-cloud version). For example, take two 60fps racers, Outrun 2 and F-Zero GX. Outrun isn't running in slow motion compared to F-Zero just because everything appears to be moving slower; F-Zero is simply designed to move objects further along in the same amount of time.
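The point above can be sketched in a few lines: two games can run the same fixed 60fps timestep, and per-step displacement is purely a designer-chosen speed value. The speeds and durations below are invented for illustration:

```python
# Two "games" at the same 60fps move objects at very different speeds,
# because per-step displacement is a design parameter, not a consequence
# of the framerate. Illustrative numbers only.
FPS = 60
DT = 1.0 / FPS                  # fixed timestep, identical for both games

def simulate(speed_units_per_sec, seconds):
    """Advance a single object for `seconds` of game time at a chosen speed."""
    pos = 0.0
    for _ in range(int(seconds * FPS)):
        pos += speed_units_per_sec * DT   # same framerate, different tuning
    return pos

slow_racer = simulate(50.0, 2.0)    # tuned to feel leisurely
fast_racer = simulate(400.0, 2.0)   # tuned to feel blisteringly fast
print(slow_racer, fast_racer)       # both ran exactly the same number of frames
```

Neither run is "in slow motion" in any technical sense; they differ only in the speed constant the designer picked.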
Battlefield uses cloud in the same way? Well that's mighty impressive.
I'm well aware, and that's actually part of my point. It's supposed to be a physical simulation, right? Based on certain distances, weights, momentums and gravity? To make a simulation appear realistic you would set those to approximate the real physical laws that govern things like the rate of acceleration of an object falling under standard gravity.
For me the simulation presented in this demo seems off, like they dilated time (and this is independent from, and irrespective of, the frame rate) the way CCP does when too many ships are fighting in one place in EVE Online. Not to that extreme, of course. Maybe they didn't think about it. Maybe they set it up the way they did to make it easier to see the number of chunks flying around, to emphasize the destruction. Or maybe something about the latency introduced by the cloud processing can be better hidden at this time scale.
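For readers unfamiliar with the EVE Online mechanic referenced here, a minimal sketch of that style of time dilation follows. The dilation factor and load threshold are invented; the idea is just that game time advances by a scaled fraction of wall-clock time when the simulation is overloaded, so the physics stays consistent per simulated second but plays out slower to the viewer:

```python
# Hedged sketch of EVE-style "time dilation" (tidi). The 0.25 factor and
# the 0.9 overload threshold are made up for illustration.
def game_dt(wall_dt, server_load, tidi_factor=0.25, overload_at=0.9):
    """Return how much game time advances for `wall_dt` of wall-clock time."""
    if server_load > overload_at:
        return wall_dt * tidi_factor   # overloaded: 1 wall second -> 0.25 game seconds
    return wall_dt                     # normal load: game time tracks wall time

print(game_dt(1.0, 0.5))    # normal load
print(game_dt(1.0, 0.95))   # overloaded, dilated
```

Crucially, the physics integration itself is unchanged; only the mapping from wall time to game time is scaled, which is why dilated footage looks like slow motion without the framerate dropping.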
In any case we're still a long way away from this kind of processing being viable for end user applications for myriad reasons.
Or, you can set parameters for larger pieces to interact locally. Or, maybe the best way, you can't interact with them at all, or InstaDeath: if you're shooting at point blank, kill the player and render something canned. And prediction technology exists these days; start calculating when the player is aiming. Rockets in games usually take nearly half a second to go from idle to ADS to launched anyway. There are smart ways to do this, just not many as instantly visible as a building exploding.

"I doubt that'd happen often, if at all, anyway" is exactly the kind of thing an engineer loves to hear. Also, if using the time the rocket is traveling to its destination is your primary strategy, you might be surprised when a player decides to fire a rocket at point-blank range.
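The latency-hiding idea in the exchange above reduces to a simple timing inequality: kick off the cloud request when the player starts aiming, and the round trip is hidden as long as aim time plus projectile flight time covers it. All the millisecond figures below are invented:

```python
# Toy check of the "start calculating when the player is aiming" strategy.
# All timings are illustrative assumptions, not measurements.
def result_arrives_in_time(aim_ms, flight_ms, round_trip_ms):
    """True if the cloud result can be back before the rocket lands."""
    return aim_ms + flight_ms >= round_trip_ms

print(result_arrives_in_time(400, 800, 150))   # long shot: plenty of slack
print(result_arrives_in_time(400, 10, 600))    # point blank on a bad link: too late
```

The second case is exactly the point-blank failure mode the reply warns about, which is why the post pairs prediction with a canned fallback for close-range shots.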
Man, the demo wasn't about being the most realistic destruction; it was about showing how the Cloud can process those calculations.
Like the user above said, it was a matter of design.
And some of those are going to be in multiplayer situations in which you'll only have to calculate the global state once, then distribute the results to multiple people.
Gold subscribers are also worth far more than just the cost of their subscription; many of them spend a lot of money on Live.
In any case, I like how we're moving on from "the power of TEH CLOUD lol" to "but who's going to pay for that??"
Correct me if I'm wrong, but Azure is a CPU farm, not a GPU farm. Combine that with the following observation about the CPU utilization of inFAMOUS: Second Son, and XB1 games are still going to be GPU-bound even with cloud computing.
Nope, not from what I expect. Although you are partially right: it's not who is going to pay for it, but rather who is going to pay for the extra development costs for minor benefits.
Also really amusing that almost every defender of the cloud is a junior member...
Fantastic point showing that the GPU is far more important than the CPU, and the cloud can't help with the GPU.
I readily admit I know nothing about this stuff, but to me at least if both computers were rendering the same thing at the same resolution and the cloud one stayed at 32 fps while the non-cloud one dropped to 2 fps then that's kind of impressive.
What happens if you have a slow internet connection?
It was a demo intended to show the technique is becoming a viable solution. For that to be persuasive the demo needs to show it's possible in the way you'd present it in an actual product. Unless it's only supposed to be used in games set under water or with slow motion powers I'm left with certain questions about what can really be accomplished.
That is your vision because you seem to know nothing of game programming.
The actual design of how realistic the destruction is doesn't matter at all; that is a question of fine-tuning all the variables such as gravity, friction, mass, etc.
The calculations are going to be the same (CPU load, in other words) whether you have zero gravity or Earth gravity. When they explode the chunks they need to calculate all the math according to the variables given, but the values of those variables don't change the amount of calculation enormously.
But the timescale chosen changes the amount of time you have to do those calculations.
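Both sides of this exchange can be sketched numerically. The per-step cost is the same whatever value gravity takes, but the chosen timescale changes how much wall-clock time is available to compute each simulated second. Chunk count is taken from the demo figures quoted in this thread; the per-chunk cost and step rate are invented:

```python
# Per-step physics cost does not depend on the gravity value, but the
# timescale sets the wall-clock budget per simulated second.
# cost_per_chunk_per_step and steps_per_sim_second are illustrative.
chunks = 32_000
cost_per_chunk_per_step = 1.0       # arbitrary work units; same for any gravity value
steps_per_sim_second = 60

work_per_sim_second = chunks * cost_per_chunk_per_step * steps_per_sim_second
print(work_per_sim_second)          # identical for moon gravity or Earth gravity

def wall_budget(time_scale):
    """Wall-clock seconds available to compute one simulated second."""
    return 1.0 / time_scale

print(wall_budget(1.0))             # real time: 1 wall second per sim second
print(wall_budget(0.5))             # half speed: 2 wall seconds per sim second
```

So both posters can be right at once: the workload is gravity-independent, and running the scene at half speed still doubles the time available to get through it.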
Longevity of support bothers me.
Say I have an SP game that offloads something to Azure as an integral part of the game design.
Now let's say it doesn't sell well.
How long will MS subsidise my cloud-dependent flop?
Even if it is successful, how long will MS support it?
Even if it does, they are blowing up 30,000 chunks in less than a minute. You don't do that in any game, anywhere. Each chunk is an individual object; that is the magic of all this. And they keep calculating it all through the demo.
1 minute, 32000 chunks. https://www.youtube.com/watch?v=QxHdUDhOMyw&feature=youtu.be
Find me any game that even comes close to that.
https://www.youtube.com/watch?v=wDciQ2nAPdQ (amazing destruction; if you pause you cannot see more than 1,000 chunks)
"The entire environment - everything - is built up from these voxels," explains Kruger. "All of the cubes that you see flying around - there's no gimmick, no point sprites, it's not a particle effect, they're actual physical cubes. In gameplay, dynamic cubes with collisions, floating around, you can get up to 200,000. Our engine supports up to 500,000 but in actual gameplay scenarios, it rarely goes over 200K."
Have you SEEN Titanfall? ;-)
If it works convincingly in that scenario then I'm sold
Have you SEEN Titanfall? ;-)
It's a demo of cloud computing under the conditions you stated. I'm not going to derail this thread into Titanfall cloud discussion; it's already been done to death in the Respawn Engadget interview thread a few weeks back.
Yeah, what's your point?
Looks OK on a decent PC rig at full HD and consistent frame rate
30,000 is peanuts. How about 200,000? In 1080p at 60fps?
http://www.eurogamer.net/articles/digitalfoundry-vs-resogun
According to Phil Spencer that demo isn't a throw away thing.
https://twitter.com/XboxP3/status/451909464892116992
My guess early look at what they are trying to do with Crackdown.
Whilst touting 32,000 chunks was a bit misguided, Resogun doesn't trouble itself with rotations as far as I can tell. That makes everything significantly simpler, and prevents it from really being comparable.
The Next Car Game comparison is also flawed though, because that stuff simply wouldn't work well with the delayed responses you'd get from the cloud computations.
The silence in NeoGAF today is deafening...
You don't have to dislike it because it's great news for Xbox One, it's great for gamers in general! We should all be excited about the potential for this tech!
MS have a big uphill battle convincing people about the viability of their server compute.
Talking about it was weak. Showing a concept in action is better, and is creating some better discussion around the idea of server compute. Now it needs to happen on a 3Mb/s connection, with 100-200ms latency, with high-res textures, lighting, shadows, etcetera.
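The 3Mb/s figure above invites a quick sanity check against the chunk counts quoted earlier in the thread. Assuming 32,000 active chunks and a 30Hz update rate (the update rate is my assumption), the raw per-chunk bandwidth budget is tiny:

```python
# Rough bandwidth sanity check for streaming chunk state from the cloud.
# Link speed and chunk count come from figures quoted in the thread;
# the 30Hz update rate is an assumption.
link_bits_per_sec = 3_000_000   # 3Mb/s connection
chunks = 32_000
updates_per_sec = 30

bits_per_chunk_update = link_bits_per_sec / (chunks * updates_per_sec)
print(f"{bits_per_chunk_update:.3f} bits per chunk per update")
```

Around 3 bits per chunk per update is nowhere near enough for a raw position plus rotation, so heavy compression, relevance filtering, or much lower update rates would be essential for anything like this to ship over such a link.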
If it works convincingly in that scenario then I'm sold