
PS4 Architect Mark Cerny: 'Cloud won't work well to boost graphics'

It's true that the traditional graphics pipeline could offload quite a few things to the cloud, but that's because you precalculate a lot of things on load in a traditional situation. Any threads which are dynamic and need continuous communication to function will not be offloaded to Azure by anyone who has the slightest bit of common sense, because doing so would be the equivalent of ensuring a shitty experience for 99% of gamers because of their internet speeds so that 1% of your possible customers can get more sparks.
 
You are obviously being intentionally vague. Why would this be?

The cat is dead. It has been exposed over and over again. Let it go.

Not at all. The real-time pipeline is just a part of a graphics engine. It is a very narrow focus. There is a lot more to making a game look good.
 

FranXico

Member
It's true that the traditional graphics pipeline could offload quite a few things to the cloud, but that's because you precalculate a lot of things on load in a traditional situation. Any threads which are dynamic and need continuous communication to function will not be offloaded to Azure by anyone who has the slightest bit of common sense, because doing so would be the equivalent of ensuring a shitty experience for 99% of gamers because of their internet speeds so that 1% of your possible customers can get more sparks.

Quoted for truth. Thank you.
 

abic

Banned
It's true that the traditional graphics pipeline could offload quite a few things to the cloud, but that's because you precalculate a lot of things on load in a traditional situation. Any threads which are dynamic and need continuous communication to function will not be offloaded to Azure by anyone who has the slightest bit of common sense, because doing so would be the equivalent of ensuring a shitty experience for 99% of gamers because of their internet speeds so that 1% of your possible customers can get more sparks.


That solution still doesn't work in many cases, because offloading the large preloaded chunks will render the game unplayable without internet, e.g. maps and models.

Unless we're willing to be always on, only non-vital things can be offloaded, i.e. core assets and related computations stay local.
 
Remember kids, the cloud is only useful for gimmicks and always-on DRM. (Not MMO-esque persistence, because we have this thing called "servers" that does that better.)

Anyone who says they improve graphics is a Microsoft fanboy and should be labelled as such, then mocked.
 
Lighting happens to be part of graphical output and:

"10x the power of xbox 360 without cloud while 40x with it."
"Or 3 times resources of one local made available to developers."

They have never clarified their statement, and the onus is on them. They also didn't state whether this cloud stuff will be for offline or online games. These reaches for the sake of defending MS's honour are pitiable.

You're right, there is more that MS could do to clarify their claims, but I think they (and others like Respawn) have gone into more depth than just the two quotes you provided. And you're also correct that the onus is on them to prove their claims, but I think it will be a while after the system and some games are out before a reasonable verdict can be made regarding the benefits of their cloud computing claims.

As using cloud computing requires the internet, I would assume it will be only for online games.

I'm not trying to defend their honor. I'm not trying to prove cloud computing works either. And I'm not saying the infinite power of the cloud will be revolutionary to gaming. A lot of people seem to be taking what Cerny said as a dismissal of MS's cloud claims, when Cerny didn't really address any of those claims.

Perhaps I need to get a better understanding of how game design and game engines work before I make such a claim as "lighting =/ graphics". If what I said was incorrect, then I apologize for being mistaken. But in most modern games isn't there a graphical/rendering engine, a lighting model/engine, and a physics engine? (I know that's an over-simplification, and I'm sure there's a lot more to it than that.) Example: the graphics in Forza 5 are amazing; too bad it's using static, prebaked lighting.
 

Alebelly

Member
'Cloud won't work well to boost graphics'

But if you build a massive server farm, you've got to stick that bullet point somewhere.
 

FranXico

Member
Not at all. The real-time pipeline is just a part of a graphics engine. It is a very narrow focus. There is a lot more to making a game look good.

Streaming pre-calculated data and environments? That was already being done long before the "infinite power" PR. Of course, it also does not make a game look better; it only ensures an endless stream of content.

Explain how exactly, outside a real time pipeline, we can use remote servers for enhancing graphics. Please, do not disappoint us with streaming pre-baked assets.
 
That solution still doesn't work in many cases, because offloading the large preloaded chunks will render the game unplayable without internet, e.g. maps and models.

Unless we're willing to be always on, only non-vital things can be offloaded, i.e. core assets and related computations stay local.

I was thinking more of an optional situation, like the traditional render-farm setup used to render CGI. In that scenario the Xbox could render the environment on its own but with decent network speeds could parcel out operations to other nodes provided by the cloud infrastructure and end up at the same result faster. I don't think it's impossible, but I think it's improbable that it would ever make a big difference to anything which isn't extremely static.

And yeah, people without a network connection or with a slow one would suffer so that people with a fast connection could halve their load times.
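Roughly the kind of split I mean, as a toy sketch (the tile and render helper names here are made up for illustration, not from any real SDK); the point is that every tile has a local fallback, so a missing or slow connection just means the console does the work itself:

```python
import concurrent.futures

TILE_COUNT = 16    # split the frame/bake job into independent tiles
CLOUD_NODES = []   # addresses handed out by the service; empty = offline

def render_tile_locally(tile_id):
    # placeholder for the console doing the work itself
    return f"tile-{tile_id}-rendered-locally"

def render_tile_remotely(node, tile_id):
    # placeholder for shipping the job to a cloud node
    raise TimeoutError("pretend the network was too slow")

def render_frame():
    results = {}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = {}
        for tile_id in range(TILE_COUNT):
            node = CLOUD_NODES[tile_id % len(CLOUD_NODES)] if CLOUD_NODES else None
            if node:
                futures[pool.submit(render_tile_remotely, node, tile_id)] = tile_id
            else:
                results[tile_id] = render_tile_locally(tile_id)
        for fut, tile_id in futures.items():
            try:
                results[tile_id] = fut.result(timeout=0.05)
            except Exception:
                # slow or missing connection: the console just does it itself
                results[tile_id] = render_tile_locally(tile_id)
    return [results[i] for i in range(TILE_COUNT)]

print(len(render_frame()), "tiles assembled")
```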
 
Streaming pre-calculated data and environments? That was already being done long before the "infinite power" PR. Of course, it also does not make a game look better; it only ensures an endless stream of content.

Explain how exactly, outside a real time pipeline, we can use remote servers for enhancing graphics. Please, do not disappoint us with streaming pre-baked assets.

Streaming pre-baked assets? Lol. That is exactly what the Cloud does not do. At the very least it will stream dynamic assets and environments, freeing up the local CPU/GPU.

Anyway... I have given plenty of examples on NeoGaf already in the past. Let's just see what devs come up with... you might be surprised. I have seen some cool Cloud things already. With the popularity of rogue-likes/Minecrafts atm, I can see the cloud versions pushing this concept into new territories.
 

He is explicitly saying there's an issue with latency and bandwidth and that it would only be suitable for graphical tasks that can deal with that, so what's wrong with that statement?

GI from the sun is something you could do on the cloud; it's hardly latency sensitive. Shadows and sun direction aren't things that change fast if you do a 60-120 minute day/night transition. There are algorithms/simulations that could be done on the cloud so they don't have to take up GPGPU resources.

There is quite some latency between pulling and releasing your controller trigger and seeing your grenade get thrown; in that time the server could have calculated the impact and physics simulation before the grenade reaches a wall and explodes. There are several latency-hiding mechanics devs can use; I believe GoW:A makes extensive use of those techniques to hide latency in multiplayer.

The question is how much performance is given to the devs and how easy it will be to implement. If you have multiplayer with 32 players, the server can run one simulation, push the data to the 32 players, and let the engine only render it, perhaps buffering two or more states to interpolate between them like almost every MMORPG out there (a sketch of that is below).
In the end $$$ says it all; if it's too expensive it won't be used.
MS's Azure SDK has a "burst mode", where the deployment can immediately be expanded to many more machines to complete a task and come back to a reduced state afterwards. A few months ago the Azure team posted a blog post about an impressive benchmark where the deployment expanded to about 500 servers and was able to maintain higher than 90% efficiency (compared to the theoretical maximum all those machines could produce)...

With their "3x the Xbone in the cloud" statements, my guess is that the server can be expanded by that multiple times the number of connected players at a given time.
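To make the interpolation part concrete, here's a minimal sketch of the client side (the snapshot format and numbers are invented for illustration, not from any real engine); the client renders slightly in the past and blends between the last two server states:

```python
def lerp(a, b, t):
    return a + (b - a) * t

class SnapshotBuffer:
    """Client-side buffer of server states, rendered ~100 ms behind the server."""
    def __init__(self, delay=0.1):
        self.delay = delay
        self.snapshots = []          # list of (server_time, {entity_id: position})

    def push(self, server_time, positions):
        self.snapshots.append((server_time, positions))
        self.snapshots = self.snapshots[-32:]    # keep a short history

    def sample(self, client_time):
        t = client_time - self.delay             # render slightly in the past
        for (t0, s0), (t1, s1) in zip(self.snapshots, self.snapshots[1:]):
            if t0 <= t <= t1:
                alpha = (t - t0) / (t1 - t0)
                return {eid: lerp(s0[eid], s1[eid], alpha) for eid in s0}
        return self.snapshots[-1][1] if self.snapshots else {}

buf = SnapshotBuffer()
buf.push(0.00, {"player_7": 0.0})
buf.push(0.05, {"player_7": 1.0})
print(buf.sample(0.125))   # blends the two server states -> {'player_7': 0.5}
```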
 

Chumpion

Member
"So when you walk into a room, it might be that for the first second or two the fidelity of the lighting is done by the console, but then, as the cloud catches up with that, the data comes back down to the console and you have incredibly realistic lighting," Matt Booty, General Manager of Redmond Game Studios and Platforms, told Ars.

LOL, what is this bullshit? Yeah, MS could do that. They could also relocate Xbox Live servers to the Moon.

I imagine it went something like this at MS:

PR: We need something to counter Sony's hardware advantage. I want you to think up ways to leverage cloud computing... in a really serious way.
Tech: This is what I've been dreaming about! Think of the incredible possibilities! We're going to create massively parallel simulations of...
PR: Whoa, hold on there, little buddy! We'll do no such thing. Do you realize how expensive that cloud stuff is hehehe? I just want something that's possible in theory, so that we can be all like "We're not gonna do it? Then prove it LOL!"
Tech: ...
PR: Now go earn your paycheck egghead!
 
MS's Azure SDK has a "burst mode", where the deployment can immediately be expanded to many more machines to complete a task and come back to a reduced state afterwards. A few months ago the Azure team posted a blog post about an impressive benchmark where the deployment expanded to about 500 servers and was able to maintain higher than 90% efficiency (compared to the theoretical maximum all those machines could produce)...

With their "3x the Xbone in the cloud" statements, my guess is that the server can be expanded by that multiple times the number of connected players at a given time.

I've been working with virtualised infrastructure for about 10 years now, what you're describing is elasticity and has been a core aspect of all cloud systems since forever.
 

FranXico

Member
Streaming pre-baked assets? Lol. That is exactly what the Cloud does not do. At the very least it will stream dynamic assets and environments, freeing up the local CPU/GPU.

Oh, so calling the environments "dynamic" changes everything. MMOs already do that. Nice of you to confirm.

* Procedural textures
* Model deformations
* Environmental effects (e.g. weather)
* Tessellation
* Calculating transformations

Again, examples of some of those things already exist. I am afraid that network latency will mess with some of them, though. But fair enough, you took your shot at making it sound "magical".
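For what it's worth, about the only way any of those survive real-world latency is as asynchronous, non-blocking requests with a local fallback, something like this sketch (the wind-field names are invented for illustration):

```python
import threading

current_wind_field = {"strength": 1.0, "direction": 0.0}   # cheap local default

def request_cloud_wind_field():
    # stand-in for a network call; in reality this takes tens to hundreds of ms
    return {"strength": 1.4, "direction": 35.0}

def refresh_wind_field_async():
    def worker():
        global current_wind_field
        try:
            current_wind_field = request_cloud_wind_field()
        except Exception:
            pass          # network failed: keep using the local approximation
    threading.Thread(target=worker, daemon=True).start()

def game_tick():
    # the frame never waits on the network; it reads whatever is current
    return current_wind_field["strength"]

refresh_wind_field_async()
print(game_tick())
```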

You guys have no clue...

You would be surprised.
 

nib95

Banned
Streaming pre-baked assets? Lol. That is exactly what the Cloud does not do. At the very least it will stream dynamic assets and environments, freeing up the local CPU/GPU.

Anyway... I have given plenty of examples on NeoGaf already in the past. Let's just see what devs come up with... you might be surprised. I have seen some cool Cloud things already. With the popularity of rogue-likes/Minecrafts atm, I can see the cloud versions pushing this concept into new territories.

No way in hell could it compute those dynamic elements; they need to be computed and rendered in milliseconds every frame (30 frames per second up to 60). Stuff like dynamic lighting, shadows, post-process effects, etc. can change at any moment and frame depending on the player's actions and a massive range of variables. You're kidding yourself if you think the cloud is going to be used to render these sorts of hyper latency-sensitive things.
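The arithmetic behind that (assuming a fairly optimistic 60 ms round trip to a data centre):

```python
for fps in (30, 60):
    frame_budget_ms = 1000 / fps
    round_trip_ms = 60          # assumed consumer broadband RTT to a data centre
    print(f"{fps} fps -> {frame_budget_ms:.1f} ms per frame, "
          f"round trip alone is {round_trip_ms / frame_budget_ms:.1f} frames")
```

The whole frame budget is smaller than a single round trip, so anything per-frame has to stay local.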
 

Takuya

Banned
Moving the goalpost syndrome again. Whatever.

It's not moving any goalposts. That you think being a mobile game dev changes anything or gives you any more credibility than the lead architect of the PS4 is mind-boggling.

Anyone who buys into Microsoft's cloud graphical processing BS is setting themselves up for quite the disappointment.
 
I've been working with virtualised infrastructure for about 10 years now, what you're describing is elasticity and has been a core aspect of all cloud systems since forever.

Yeah, I know, I was merely speaking of a particular benchmark that shows how effective Azure is.

Higher than 90% of the theoretical output is really impressive imo. And if I'm not mistaken, the benchmark was even validated as an entry on the Top 500 supercomputers list.
 

Raist

Banned
I find it interesting that MS has made certain claims about cloud computing, and Sony, who seems to be eager to stick it to MS whenever they can, have done nothing to refute the claims that MS has made.

MS has claimed that the developers will have access to three times the compute in the cloud to be used in games, and that it will help with A.I., physics, and lighting. As well as do all the stuff that dedicated servers are known to do. Never once (that I have seen) has MS claimed that the cloud will help to boost graphics (lighting =/ graphics)

In the first instance where Sony was questioned about cloud computing they tried to dismiss MS's claim, saying they could do that too; claiming linking, matching, etc - basically using dedicated servers. No mention of the other claims about the cloud.

Then in this interview Cerny uses a common marketing technique: making a factual statement that is related, but not relevant, to the claims of their competitor in order to transfer doubt onto the competitor's claims. MS never claimed graphics would be boosted; Cerny is saying graphics can't be boosted, and hoping people take that to mean that MS's other claims about the cloud are also invalid.

I'm not trying to say that the cloud is a magical wonder; I just find it interesting how Sony dances around the claims that MS is making. I'd love for someone from Sony to actually address MS's claims, rather than answer in roundabout marketing terms. Real or not, I think it's still too early to unequivocally pass judgement on the cloud before the systems or games that (might) use it are out. It will likely be 1 or 2 years before we can really assess what kind of benefit, if any, it provides.

Cerny is using marketing talk? lolz.

I guess "thanks to the unlimited power of the cloud your xbone will be 4 times more powerful" is not marketing talk, like lighting isn't graphics.
 

Johnny

Member
All computing will be done in the cloud eventually, it's the most efficient model. Think of all of the devices you have in your home right now: computers, gaming devices, phones. How much of that raw processing power is being utilized on average, 10%? Now imagine moving all of that technology to a centralized location, where its potential can be maximized. You've suddenly cut your costs drastically, or gained access to computational power that you would otherwise not have.

In a few years time we'll forego current tech in exchange for more simple, inexpensive and energy-efficient devices. Screens that do little more than receive a video feed, with a quality and response time indistinguishable from local sources. We'll no longer have "console generations", instead games will evolve on a daily basis. What you play on your "TV" can be played on your "phone", the only difference being screen size.

It's only a matter of infrastructure at this point, and many are there already with fibre and LTE.
 

FranXico

Member
It's not moving any goalposts. That you think being a mobile game dev changes anything or gives you any more credibility than the lead architect of the PS4 is mind-boggling.

Anyone who buys into Microsoft's cloud graphical processing BS is setting themselves up for quite the disappointment.

Clearly, you have no idea what the future is like. Those dynamic weather effects and transformation computations offloaded to the cloud over your 4G network are going to blow your mind!

/sarcasm
 

FranXico

Member
All computing will be done in the cloud eventually, it's the most efficient model. Think of all of the devices you have in your home right now: computers, gaming devices, phones. How much of that raw processing power is being utilized on average, 10%? Now imagine moving all of that technology to a centralized location, where its potential can be maximized. You've suddenly cut your costs drastically, or gained access to computational power that you would otherwise not have.

In a few years time we'll forego current tech in exchange for more simple, inexpensive and energy-efficient devices. Screens that do little more than receive a video feed, with a quality and response time indistinguishable from local sources. We'll no longer have "console generations", instead games will evolve on a daily basis. What you play on your "TV" can be played on your "phone", the only difference being screen size.

It's only a matter of infrastructure at this point, and many are there already with fibre and LTE.

Thin clients are back! And to think that it took almost 30 years!

http://www.youtube.com/watch?v=yzLT6_TQmq8
 
All computing will be done in the cloud eventually, it's the most efficient model. Think of all of the devices you have in your home right now: computers, gaming devices, phones. How much of that raw processing power is being utilized on average, 10%? Now imagine moving all of that technology to a centralized location, where its potential can be maximized. You've suddenly cut your costs drastically, or gained access to computational power that you would otherwise not have.

In a few years time we'll forego current tech in exchange for more simple, inexpensive and energy-efficient devices. Screens that do little more than receive a video feed, with a quality and response time indistinguishable from local sources. We'll no longer have "console generations", instead games will evolve on a daily basis. What you play on your "TV" can be played on your "phone", the only difference being screen size.

It's only a matter of infrastructure at this point, and many are there already with fibre and LTE.

No, we won't do this. Certainly not in "a few years time", probably not ever.
 
No way in hell could it compute those dynamic elements; they need to be computed and rendered in milliseconds every frame (30 frames per second up to 60). Stuff like dynamic lighting, shadows, post-process effects, etc. can change at any moment and frame depending on the player's actions and a massive range of variables. You're kidding yourself if you think the cloud is going to be used to render these sorts of hyper latency-sensitive things.

Seriously.... reading problems????

I said NOT in a traditional real time pipeline.

Anyway... enough for me... have fun.

Thin clients are back! And to think that it took almost 30 years!

Did you somehow miss mobile apps? Or *gasp* HTML?
 
I find it interesting that MS has made certain claims about cloud computing, and Sony, who seems to be eager to stick it to MS whenever they can, have done nothing to refute the claims that MS has made.

MS has claimed that the developers will have access to three times the compute in the cloud to be used in games, and that it will help with A.I., physics, and lighting. As well as do all the stuff that dedicated servers are known to do. Never once (that I have seen) has MS claimed that the cloud will help to boost graphics (lighting =/ graphics)

I would disagree here. Lighting (and therefore shading) is probably the most important aspect of graphics. I'm by no means an expert, but you could test this yourself by downloading UDK and rendering one of the example levels in unlit mode. It goes from quite pretty current-gen stuff to an early-2000s level of crap. Geometry and textures remain exactly the same, but the effect is profound, even though it's just baked lighting. Dynamic lighting can make even more of a difference, so if the cloud could be used to enhance that, it would be a major win for graphics quality in games. Of course it's all bullshit though.
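Or, without downloading anything, here's roughly the same lit-versus-unlit comparison as a toy Lambert diffuse term (textbook shading, nothing to do with UDK's actual renderer):

```python
def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def lambert(albedo, normal, light_dir):
    # classic diffuse term: surface colour scaled by max(0, N.L)
    n, l = normalize(normal), normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(c * n_dot_l for c in albedo)

albedo = (0.8, 0.6, 0.5)                      # "unlit" look: just the texture colour
lit = lambert(albedo, (0, 1, 0), (0.5, 1, 0.2))
print("unlit:", albedo)
print("lit:  ", tuple(round(c, 3) for c in lit))
```

Same geometry, same texture; the only thing that changed is the lighting term, and the output colour is completely different.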
 
Did you somehow miss mobile apps? Or *gasp* HTML?

Except that HTML requires a fair bit of local processing to render and interact with despite the fact that the transmission of data is highly efficient. And if mobile apps were a validation of cloud-based processing we wouldn't see a continuous increase in the raw processing power of mobile devices year on year.
 
No, we won't do this. Certainly not in "a few years time", probably not ever.

I disagree. I've read some of the Gaikai presentations where they claim they've already achieved controller-like input lag, certainly better than some LCD TVs we currently use. And that's TODAY. I think something like that is certainly the future, and it's only dependent on internet infrastructure improving so it can reach the masses (which will certainly take a few years, I'll give you that).
 
It's not moving any goalposts. That you think being a mobile game dev changes anything or gives you any more credibility than the lead architect of the PS4 is mind-boggling.

Just like Mr Cerny, I studied comp science and I have been in the industry since the 80s. I am just offering an opinion that I think is just as valid, and unlike most of the "PR bullshit" people, I am trying to offer options that might be used to enhance our experiences as gamers. There are people out there doing a lot of work on cloud-game related stuff.

Obviously you just prefer to swallow the "cloud is shit" 100%.... I am not blocking you from doing that.
 

madmackem

Member
This whole cloud thing has been snake-oil-salesman-like from the off. Even with my primary school understanding of the hardware, I got that it wouldn't work as some believed it would.
 

Durante

Member
Remember kids, the cloud is only useful for gimmicks and always-on DRM. (Not MMO-esque persistence, because we have this thing called "servers" that does that better.)
Actually, the one thing the cloud is really useful for is hosting servers -- it's not required of course, but it's more resource-efficient. The thing is, it doesn't offer anything new compared to servers; it just allows you to host them more cheaply.

It's a cost-saving technology.
 

kaching

"GAF's biggest wanker"
Did you somehow miss mobile apps? Or *gasp* HTML?
Wait, mobile apps, which specifically arose from a desire and need to use more local compute power to accomplish requested tasks than web apps have been able to, and HTML, which in its latest iteration is making an effort to close that gap and provide more local compute power? That's kind of going in the opposite direction intended for thin clients...
 
I disagree. I've read some of the Gaikai presentations where they claim they've already achieved controller-like input lag, certainly better than some LCD TVs we currently use. And that's TODAY. I think something like that is certainly the future, and it's only dependent on internet infrastructure improving so it can reach the masses (which will certainly take a few years, I'll give you that).

The problem with all of this thinking is that the world doesn't start with North America and end with Japan. The notion that technology companies will decide to switch their processing to a paradigm which only works well in a fraction of the planet is absurd; they want to develop new markets in developing nations, where it's going to take a lot longer than a few years for the infrastructure to come close to the point where cloud computing in the way described by that post is viable.

It's not surprising to hear Gaikai and OnLive people touting the amazing work they've done because that's in their best interests to do so, and good for them. However, the more likely scenario for the majority of the computing world is that devices will continue to become more energy efficient and powerful, while using cloud infrastructure for more appropriate purposes such as content and application delivery.
 

Respawn

Banned

[laughing reaction GIF]
 
How do you solve the problem of the latency imposed by the network connection then?

Most engines already have 100+ ms of latency between pulling and releasing the trigger or button and seeing an action being performed.
The moment the input thread notices the player releasing the trigger, it could tell the server the player's position and look-at vector (reticle). The server can calculate, for example, where the grenade will land. If it is close to a wall, the server could calculate the fracture points where the wall should break.

Send those points back to the console, and it's probably done before the player even sees the grenade being thrown. Of course this is a silly example and I would do this on the GPU, but it's more an illustration that there is quite some latency in games. This should be easier to hide in 30 fps games than in 60 fps games.
 

Chobel

Member
Just like Mr Cerny, I studied comp science and I have been in the industry since the 80s. I am just offering an opinion that I think is just as valid, and unlike most of the "PR bullshit" people, I am trying to offer options that might be used to enhance our experiences as gamers. There are people out there doing a lot of work on cloud-game related stuff.

Obviously you just prefer to swallow the "cloud is shit" 100%.... I am not blocking you from doing that.

What we're arguing about here is "cloud for graphics is shit". I studied comp sci too, and I'm pretty sure "latency" doesn't go with "real-time graphics". You said you can use cloud computing in "real-time graphics" but not "traditional real-time graphics"; could you cite some examples?
 
All computing will be done in the cloud eventually, it's the most efficient model. Think of all of the devices you have in your home right now: computers, gaming devices, phones. How much of that raw processing power is being utilized on average, 10%? Now imagine moving all of that technology to a centralized location, where its potential can be maximized. You've suddenly cut your costs drastically, or gained access to computational power that you would otherwise not have.

Nahhhhh. No, all computing won't be done in the cloud unless we figure out a way to send information everywhere at the speed of light or, at the very least, at speeds similar to those within microchips (i.e. not ever, since latency will always be a function of distance).

I agree a lot of tasks will be offloaded to the cloud, but only those that really benefit from social/mass computing. If it's simply something that needs a lot of processing power and needs to be done in real time, it'll still be done locally.
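Back-of-the-envelope on the distance point (assuming light in fibre at roughly 200,000 km/s and a data centre 1,500 km away):

```python
fibre_speed_km_per_s = 200_000     # light in glass is roughly 2/3 of c
distance_km = 1_500                # assumed one-way distance to a data centre

one_way_ms = distance_km / fibre_speed_km_per_s * 1000
round_trip_ms = 2 * one_way_ms
print(f"round trip floor: {round_trip_ms:.1f} ms")   # ~15 ms before any routing/queueing
```

That's the physical floor before routers, queues, and the last mile even get involved; a chip talking to itself is orders of magnitude faster.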
 

Nilaul

Member
The cloud will always be a DRM measure to me; by the time technology advances enough to allow the cloud to be used for real-time dynamic calculations everywhere in the world, our 100-dollar devices will have enough processing power to produce seemingly photorealistic graphics.
 

jWILL253

Banned
NemesisPrime, if you've been in the industry since the 80s, then you know it's best to actually explain your point with details in a reasonable manner, rather than just responding to everyone with petulance.

Ah, who am I kidding? It seems like everybody who works in the game industry has the social skills of Honey Boo-Boo...
 
Just like Mr Cerny, I studied comp science and I have been in the industry since the 80s. I am just offering an opinion that I think is just as valid, and unlike most of the "PR bullshit" people, I am trying to offer options that might be used to enhance our experiences as gamers. There are people out there doing a lot of work on cloud-game related stuff.

Obviously you just prefer to swallow the "cloud is shit" 100%.... I am not blocking you from doing that.
Your inability to go into anything specific doesn't do your credibility any favors. Hell, I'm in CompSci currently, looking to break into the industry, and I can point out how much bullshit the "infinite cloud" nonsense is, bullet point by bullet point.
 
Dang, so many comp sci people in this thread.
Like with anything new in computation, comp sci people divide into camps and fight each other. New skool vs old skool and shit.

So, real question: tabs or spaces?
 