
Killzone 4 Uses Raytracing For Its Real-Time Reflections

Status
Not open for further replies.
These days ROPS are not that important.

Lol that's not the point of my comment. >_<

I'm not exactly sure how they work. I do know they output the "3D scene" into 2D format for the screen.

Why can't they render it higher than 1080p and then downscale for a clean SSAA output?
 

Ty4on

Member
Most of the blurring is due to the lighting model. The reflections take into account the roughness of the material in question and you get more or less blurry reflections due to that.
Kind of like here:
glossy_reflectiona.jpg
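The idea of roughness-driven blur can be sketched in a few lines. This is purely illustrative Python (my own toy model, not Guerrilla's shader): a rougher material spreads the reflection cone wider, so reflected objects blur more the farther they are from the surface.

```python
import math

def reflection_blur_radius(roughness, distance, base_px=1.0):
    """Toy model: blur radius grows with surface roughness and with how
    far the reflected object sits from the reflector (cone spread)."""
    # GGX-style lobes widen roughly with roughness^2; scale by distance
    # so far-away reflections smear more than close ones.
    cone_angle = roughness * roughness * (math.pi / 4)  # half-angle, radians
    return base_px + distance * math.tan(cone_angle)

# A mirror (roughness 0) stays sharp at any distance;
# brushed metal (roughness 0.5) blurs more the farther the object is.
sharp = reflection_blur_radius(0.0, 10.0)   # stays at base_px
near = reflection_blur_radius(0.5, 10.0)
far = reflection_blur_radius(0.5, 50.0)     # larger than `near`
```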

Fuck everything else, just give me reflections like that and I'm happy!
 

KKRT00

Member
Most of the blurring is due to the lighting model. The reflections take into account the roughness of the material in question and you get more or less blurry reflections due to that.
Kind of like here:

Yes i know, but scene i pointed out has glass as a material.
 
Lol that's not the point of my comment. >_<

I'm not exactly sure how they work. I do know they output the "3D scene" into 2D format for the screen.

Why can't they render it higher than 1080p and then downscale for a clean SSAA output?

Because the hardware is too weak, it's as simple as that.
All ROPs do is read, blend, and write to framebuffers.
You still need to calculate stuff. They probably went with 32 because there isn't a step between 16 -> 32.
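For what it's worth, the "render higher and downscale" idea (SSAA) is conceptually trivial; the catch is the cost, not the downscale. A minimal box-filter sketch in Python (illustrative only; real GPUs do this in hardware): at 2x SSAA the renderer has to shade 4x as many pixels before this averaging step ever runs.

```python
def ssaa_downscale(img, factor):
    """Box-filter downsample: average each factor x factor block.
    `img` is a 2D list of grayscale values rendered at `factor` times
    the target resolution. The averaging is cheap; shading factor^2 as
    many pixels to produce `img` in the first place is what's expensive."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# 2x SSAA on a tiny 4x4 "render" -> 2x2 output
img = [[0, 0, 4, 4],
       [0, 0, 4, 4],
       [8, 8, 2, 2],
       [8, 8, 2, 2]]
print(ssaa_downscale(img, 2))  # [[0.0, 4.0], [8.0, 2.0]]
```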
 

farisr

Member
Nice, but what's the point if nobody noticed when the game was unveiled? Most people weren't blown away by the KZ4 demo.

Speak for yourself. I was impressed and noticed the reflection stuff right away, and there was barely anyone who didn't think "now this is next-gen!" when the dropship sequence at the end happened.

Edit: Yes, I said speak for yourself and then proceeded to speak for people. LOL.
 

sunnz

Member
Considering how Sleeping Dogs went from looking so average during the day to straight-up beautiful at night (when raining) because of the reflections, I hope more games focus on having these realistic reflections; they make everything look very good.
 
I was gonna come in here and say "I think you're mistaking ray-tracing for ray-casting or some other hack being used to approximate ray-traced reflections."

And then saw about a dozen people had already covered it.

Love to see this stuff in action even if it isn't "Ray-tracing" in the strictest sense.
 

Sorral

Member
To quote myself from the announcement thread. Here are some examples of its implementation in the trailer.

There was a gif at this moment of the video that I was unable to find for now. This doesn't even show how good/fast the reflection was that well.

6g4XbmgFN68N9HUsEfitfQoGGMYfg.jpg
 
Didn't they say they had some form of ray-tracing in KZ2 and 3?

I'm sure it is further fleshed out here in KZ SF.. but the point is ray-tracing isn't foreign to the franchise.
 

JCreasy

Member
Forgive my cluelessness . . .

Can some version of the tech here be used to pull off an approximation of SVOGI?
 

KKRT00

Member
Not every type of glass is super reflective, so he's right.

It doesn't need to be super reflective at this angle :) There are other examples in the trailer.
It could be that half-res rendering, a crappy blur, and a small number of rays give such a low-quality reflection, or that, because it's an alpha version and they wanted it to be stable, they decreased the quality for the presentation.

This is half-res from 720p and it looks more precise than half-res from 1080p - http://www.mobygames.com/images/sho...aystation-3-screenshot-aphrodite-seems-to.jpg
 

RoboPlato

I'd be in the dick
Forgive my cluelessness . . .

Can some version of the tech here be used to pull off an approximation of SVOGI?
It can and is being used for GI here; the bits about light reflecting off of reflections at the end of the talk confirm that, but not SVO. SVO is an entirely different type of light simulation.
 
There was a gif at this moment of the video that I was unable to find for now. This doesn't even show how good/fast the reflection was that well.

6g4XbmgFN68N9HUsEfitfQoGGMYfg.jpg
Wait... would screen space work in rendering the side of the Helghan? Because it's clearly doing that as opposed to just approximating a reflection.

Didn't they say they had some form of ray-tracing in KZ2 and 3?

I'm sure it is further fleshed out here in KZ SF.. but the point is Ray tracing isn't foreign to the franchise.

They used ray tracing for their bullet impacts, not the lighting.
 

JCreasy

Member
Wow, next-gen is going to be exciting.

Adding great new tech to franchises I already love, and new IPs born from the start with this tech in mind, is going to be absolutely wonderful!
 
These days ROPS are not that important.
Go calculate the fillrate = ROPs * frequency.
You will come to the conclusion that the ROP count is overkill for 1080p gaming.
It's probably something they want to use later on for 4K movies, with other gimmicks like high framerate and 3D.

The number of ROPs is proportional to the memory bandwidth.
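The fillrate arithmetic being argued about is easy to run through. A quick sketch, assuming the commonly cited PS4 figures of 32 ROPs at 0.8 GHz (treat these as assumptions, and note peak fillrate is a theoretical ceiling, not sustained throughput):

```python
def pixel_fillrate_gpix(rops, clock_ghz):
    """Peak pixel fillrate = ROPs * clock, at one pixel per ROP per cycle."""
    return rops * clock_ghz

def pixels_per_frame_gpix(width, height, fps):
    """Pixels written per second for one full-screen pass, in Gpix/s."""
    return width * height * fps / 1e9

# Assumed figures: 32 ROPs at 0.8 GHz (commonly cited for PS4).
fillrate = pixel_fillrate_gpix(32, 0.8)            # 25.6 Gpix/s peak
one_pass = pixels_per_frame_gpix(1920, 1080, 60)   # ~0.124 Gpix/s
# Roughly 200x headroom for a single 1080p60 pass -- hence "overkill",
# though real frames do many passes (overdraw, alpha, post-processing).
headroom = fillrate / one_pass
```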
 

RoboPlato

I'd be in the dick
AA won't be a problem with 32 ROPs. The game will run at 1080p and I do expect at least a 4xAA mode.

My only concern with MSAA is that GG uses a lot of deferred rendering and MSAA does not jive well with it. If that wasn't the case, I'd assume at least 2xMSAA would be a given.
 

KKRT00

Member
Hmm. I find it odd how that works. So it can "ray trace" things on screen, but if the object is off-screen, it can't? Why can't you include it to reflect things in a wider view?

You could, but then you would need to draw more geometry. This technique runs after triangle occlusion and the geometry and lighting rendering, so it only works for geometry that the frame will draw; that's why it's called screen-space.
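The screen-space limitation can be seen directly in a toy ray march. This is an illustrative Python sketch of the general SSR idea (not Guerrilla's implementation): the ray only ever samples the depth buffer, so the moment it steps outside the screen there is simply no data to reflect.

```python
def ssr_trace(depth_buffer, start, direction, max_steps=64):
    """March a reflection ray through the depth buffer in screen space.
    Returns the (x, y) hit pixel, or None when the ray leaves the screen --
    which is exactly why off-screen geometry can never show up in the
    reflection: the buffer holds no information about it."""
    h, w = len(depth_buffer), len(depth_buffer[0])
    x, y, z = start
    dx, dy, dz = direction
    for _ in range(max_steps):
        x, y, z = x + dx, y + dy, z + dz
        xi, yi = int(x), int(y)
        if not (0 <= xi < w and 0 <= yi < h):
            return None  # ray walked off-screen: reflection data missing
        if z >= depth_buffer[yi][xi]:
            return (xi, yi)  # ray went behind the stored depth: a hit
    return None

flat = [[5.0] * 4 for _ in range(4)]  # a tiny buffer, 5 units deep everywhere
print(ssr_trace(flat, (0, 0, 0), (0.5, 0.5, 2)))  # (1, 1): on-screen hit
print(ssr_trace(flat, (0, 0, 0), (2, 0, 0)))      # None: ray left the screen
```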
 
The number of ROPs is proportioned to the memory bandwidth.

Ah, okay.
Do you have any links I can read? Information about ROPs is really scarce on the internet; you get countless links that tell what feels like 30% of the story.
I want to clear up for myself what ROPs precisely do.
 
Raytracing reminds me of a quote about raycasting:

Tim Rogers said:
Once upon a time, game design genius Clifford "CliffyB" Bleszinski said, in an interview, that because raycasting is such a simple function in 3D simulations, it's easy to make a game understand when any given point is geometrically aligned with another. Bleszinski said that entertainment experiences, in general, are about reaching out and touching people, and that because of the ease of raycasting and its natural similarity to aiming and pointing a gun, games, as entertainment, are more often than not about "reaching out and touching someone with your gun".

http://www.actionbutton.net/?p=3006
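The "geometrically aligned" point in that quote really is a one-liner's worth of math. A minimal sketch of the classic aim check (my own illustration of raycasting in general, not anything from the article): does a ray from the gun pass within some radius of the target?

```python
def ray_hits_sphere(origin, direction, center, radius):
    """Classic raycast aim check: does a ray from `origin` along the
    (assumed normalized) `direction` pass within `radius` of `center`?"""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = center
    # Vector from ray origin to the target center
    lx, ly, lz = cx - ox, cy - oy, cz - oz
    # Project it onto the ray to find the closest point along the ray
    t = lx * dx + ly * dy + lz * dz
    if t < 0:
        return False  # target is behind the shooter
    # Squared distance from target center to that closest point
    closest_sq = (lx * lx + ly * ly + lz * lz) - t * t
    return closest_sq <= radius * radius

# Aiming straight down +z at a target 10 units ahead:
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0.3, 0, 10), 0.5))  # True
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (5, 0, 10), 0.5))    # False
```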
 

EinSof

Member
CliffyB's name reminds me of the locked weapon skins @108 kb in Gears of War 3. Don't care what he says; let's keep him out of this thread.

Considering he's done more for the video game industry than you will ever know, he deserves our time and respect. This thread is about a technique used in a game engine, which he probably knows something about, considering he and the talented folks at Epic built an engine that has had a far-reaching impact on the video game industry and has helped scale down the cost of video game development. Regardless of how you feel about his game design decisions, he is a prominent and influential member of the video game industry and is widely respected for what he has contributed to it.
 

Randdalf

Member
Fuck everything else, just give me reflections like that and I'm happy!

Having just written a raytracer, I can tell you right now that we probably won't even see true glossy reflections in the generation after next-gen. To render them you need to super-sample reflection rays stochastically (the randomness is determined by the roughness) and then average the results together. To get a decent-looking result you need to sample at least 32 rays (per reflection, so imagine how quickly the number of traces needed for reflection rays that bounce off multiple surfaces grows), and even then, reflections from further away in the scene will be noisy. Of course there are doubtless going to be ways to fake it and accelerate the process, but at this point in time any distributed raytracing techniques (glossy reflections, true-to-life depth of field, soft shadows, etc.) would take far too much computational time to be of any use in a real-time application.
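That stochastic-supersampling loop is worth seeing concretely. A toy Python sketch of distributed raytracing for a glossy reflection (illustrative only; `trace` here is a stand-in for a full ray trace into the scene): jitter the mirror direction by roughness, trace each sample, average. Noise falls off only as 1/sqrt(n), which is why 32 rays per bounce gets expensive so fast.

```python
import random

def glossy_reflection(trace, reflect_dir, roughness, n_samples=32):
    """Distributed-raytracing sketch: perturb the mirror direction by a
    roughness-scaled random offset, trace each sample ray, and average.
    `trace` maps a direction to a color (a stand-in for a real tracer)."""
    total = 0.0
    for _ in range(n_samples):
        jitter = tuple(c + random.gauss(0.0, roughness) for c in reflect_dir)
        total += trace(jitter)
    return total / n_samples

# Toy "scene": brightness is just how well the ray stays aligned with +z,
# so a rough surface averages in dimmer off-axis samples.
def toy_trace(d):
    norm = sum(c * c for c in d) ** 0.5
    return max(0.0, d[2] / norm)

random.seed(42)
mirror = glossy_reflection(toy_trace, (0, 0, 1), roughness=0.0)
rough = glossy_reflection(toy_trace, (0, 0, 1), roughness=0.8)
print(mirror)  # exactly 1.0: every sample hits the same spot
print(rough)   # < 1.0 and noisy: samples spread across the wide lobe
```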
 

onQ123

Member
Having just written a raytracer, I can tell you right now that we probably won't even see true glossy reflections in the generation after next-gen. To render them you need to super-sample reflection rays stochastically (the randomness is determined by the roughness) and then average the results together. To get a decent-looking result you need to sample at least 32 rays (per reflection, so imagine how quickly the number of traces needed for reflection rays that bounce off multiple surfaces grows), and even then, reflections from further away in the scene will be noisy. Of course there are doubtless going to be ways to fake it and accelerate the process, but at this point in time any distributed raytracing techniques (glossy reflections, true-to-life depth of field, soft shadows, etc.) would take far too much computational time to be of any use in a real-time application.

Fake it until we make it!
 