Used in the real world? Nope.
Used in proof of concept? Sure.
It will be used in real-world games sooner rather than later.
Used in the real world? Nope.
Used in proof of concept? Sure.
Yeah but DriveClub looks poor graphically so it'd be hard to argue that offloading the lighting wouldn't be of any benefit.
Killzone Shadow Fall is using GI too. And these are LAUNCH titles running on far worse gpus than a Titan...
That was a 35% build, pre-alpha.
K:SF is using prebaked GI.
That's quite a statement.
The pop in becomes really noticeable at 200ms of latency. That might get a little distracting in environments where lights are turning on and off all the time (say, any action game).
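For what it's worth, the pop could be softened client-side by easing toward late-arriving indirect lighting instead of snapping to it. A toy Python sketch; the function name and the 200 ms fade window are my own assumptions, not anything from the demo:

```python
# Toy sketch: instead of snapping to indirect lighting that arrives
# ~200 ms late from the server, ease the on-screen value toward it
# over a short fade window. All names and values are illustrative.

def fade_indirect(current, target, dt, fade_time=0.2):
    """Move `current` toward `target` over roughly `fade_time` seconds."""
    if fade_time <= 0:
        return target
    step = min(1.0, dt / fade_time)
    return current + (target - current) * step

value = 0.0       # indirect light currently shown on screen
target = 1.0      # new value that just arrived from the server
for _ in range(10):                  # ten 60 Hz frames after arrival
    value = fade_indirect(value, target, dt=1 / 60)
print(round(value, 3))               # → 0.581, partway through the fade
```

The eye notices a hard cut far more than a short cross-fade, so even at 200 ms this would trade "pop" for a brief lag in the bounce lighting.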
The catchphrase of denial
It seems the best use is real time lighting for the overall world. I would think lighting that moves quickly (flashlights, fire, particles) should be left to local rendering. It would certainly help free up resources when the local client doesn't have to render world lighting.
Also, dat latency.
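The split being suggested boils down to a trivial routing rule. A sketch in Python, where the `Light` class and the 1 m/s speed threshold are invented for illustration:

```python
# Sketch of the proposed split: fast-moving lights (flashlights, fire,
# particles) stay on the local renderer; slow "world" lighting is
# eligible for the cloud. The Light type and threshold are made up.

from dataclasses import dataclass

@dataclass
class Light:
    name: str
    speed: float  # how fast the light source moves, in m/s

def assign_renderer(light, speed_threshold=1.0):
    """Route a light to the local renderer or the cloud by its speed."""
    return "local" if light.speed > speed_threshold else "cloud"

lights = [Light("sun", 0.0), Light("flashlight", 5.0), Light("muzzle flash", 50.0)]
print({l.name: assign_renderer(l) for l in lights})
# → {'sun': 'cloud', 'flashlight': 'local', 'muzzle flash': 'local'}
```

Anything routed to the cloud has to tolerate a round trip of latency, which is exactly why only slow-changing world lighting is a plausible candidate.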
I'm willing to eat crow at launch if you are.
It's funny, even with it being in the OP, people are ignoring that this requires a lot of GPU power. Something Azure does not have.
I only saw an issue at 1000ms, even 500ms looked ok in the demo. Anything under 200ms looked damn near perfect.
But isn't that exactly what's being done in this tech demo as well?
No, it's real-time GI, just not on the client side.
Perhaps in the last segment. The midway scenes had pop in even at decent latency: lighting pop in, flickering, all sorts. Certain pillars go from lit to unlit in a split second, etc.
Just to clarify, you're saying neither KZ SF nor DriveClub uses real time GI?
It's funny, even with it being in the OP, people are ignoring that this requires a lot of GPU power. Something Azure does not have.
I don't know about DriveClub, but KZ:SF for sure is not using real time GI.
Right now it doesn't.
But that's the great thing about having these resources in the cloud. MS is free to update and improve the hardware in their servers over the life of the console, adding powerful GPUs, updating the CPU/RAM, etc. On the console itself, the hardware is going to stay the same for the entire generation, but that isn't the case with Azure.
Guessing DriveClub uses real time global illumination, given the sun moves around, sets, rises, etc. A DriveClub dev comments on it here.
http://www.youtube.com/watch?v=VoengumG6FI&t=3m40s
Even more curious to see how this game ends up looking and performing at launch now.
Yeah, I know what GI is, but does this implementation increase quality though? Global illumination makes a big difference in the quality of lighting and therefore improves the graphics.
[IMG]http://planetside.co.uk/wiki/images/5/54/GI_garden_760x240.jpg[/IMG]
Also, dat latency.
I think cloud computing is a little more than just a buzzword lol
Found this.
They haven't announced it yet; who knows, maybe they do have, or are building, Nvidia Titan racks at Azure right now. From what I saw, it is extremely easy to replace the containers holding the CPU racks at Azure data centers.
trololol, that is where the E3 Titan cards were, for emulating the X1 and cloud resources....
If folks read the actual Siggraph presentation you will see that they look at the entire rendering pipeline to see which parts would make sense for Cloud computing.
Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process
They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.
And to do the demo they needed:
1) A GeForce Titan in every server
2) 43 Mb/s of Internet bandwidth for a Battlefield 3-size map (using the Photon system)
So, a good research paper, but not economically practical on either the client or the server side yet.
Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.
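To put that bandwidth figure in perspective, here's a quick back-of-envelope. Only the 43 Mb/s number comes from the presentation; the per-photon size and frame rate below are my own assumptions:

```python
# Back-of-envelope on the quoted 43 Mb/s photon stream. Only the
# bandwidth figure comes from the presentation; the photon size and
# frame rate are assumptions for illustration.

BANDWIDTH_BPS = 43 * 1_000_000   # 43 Mb/s, per the presentation
FRAME_RATE = 60                  # assumed client frame rate
PHOTON_BYTES = 16                # assumed packed position/direction/power

bits_per_frame = BANDWIDTH_BPS / FRAME_RATE
photons_per_frame = bits_per_frame / (PHOTON_BYTES * 8)
print(int(photons_per_frame))    # → 5598 photons refreshed per frame
```

A few thousand photons per frame is workable for slowly changing indirect light, but it also shows why 43 Mb/s is far beyond a typical home connection today.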
So it's real....Well damn.
I don't think Azure has GPU clusters. From my understanding only Amazon has GPU clusters to rent out.
Exactly. Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.
They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.
And to do the demo they needed:
1) A GeForce Titan in every server
2) 43Mb/s Internet bandwidth for a Battlefield 3 size map (using Photon system)
So, the impossible.....is looking possible?
And even then it only makes sense in a subscription-style package; otherwise it would be idiotic to keep such a setup running for games past their prime selling period.
So, the impossible.....is looking possible?
Yeah, no. I was specifically told that I was an idiot for believing that the cloud would ever be used to improve graphics.
...Even Cerny said it could be done. No one said it couldn't be done.