
Killzone 4 Uses Raytracing For Its Real-Time Reflections

Status
Not open for further replies.
I'm sure there are more smartphones & tablets in homes than there are Wii U's.

That's beside the point.

How many people will both own the nextbox and a tablet?

The demographics aren't the same.

You won't see the same innovation that the Wii U will get, because it's never guaranteed to the devs that every Xbox owner will have a tablet.

For the Wii U, not only is it an all-in-one package at a lower price, but it's also built into the controller.


edit: wait, where did smartphones come into this?

It's bad enough that people will have to use an Xbox controller while figuring out how to hold their tablet in a comfortable position, but to do this with a small screen?
 

squidyj

Member
They talk about the implementation probably 15 mins or so in, and far more in depth 35 mins or so in

Yeah, I've just been watching it all the way through; I'm at 37 minutes now. Some good stuff. I wonder what the error is on the linear decomp, though.
 

pottuvoi

Banned
Anisotropic filtering solves the problem by (in effect) including "stretched" textures into our mip maps. Look back at the image that contains both the stripes and scaled-down grey texture. Notice the missing stuff on the upper right and lower left? We can put additional, and very useful, scaled textures there.
Actually, rip mapping is the one that solves the problem with stretched textures (using 4x the memory of the original texture).
Anisotropic filtering takes more texture samples within the projected pixel's sample area in texture space to get the required information from normal mipmaps (thus mipmaps still require only an additional ~33% of memory).
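Those two memory figures can be sanity-checked with a quick sketch (a toy calculation for a square power-of-two texture; the function names are just illustrative):

```python
def mip_chain_size(base):
    """Total texels for a full mip chain of a base x base texture."""
    total, size = 0, base
    while size >= 1:
        total += size * size
        size //= 2
    return total

def rip_map_size(base):
    """Total texels for a rip map: every (base/2^i) x (base/2^j) combination,
    including the anisotropically "stretched" variants mipmaps lack."""
    total, w = 0, base
    while w >= 1:
        h = base
        while h >= 1:
            total += w * h
            h //= 2
        w //= 2
    return total

base = 1024
orig = base * base
print(mip_chain_size(base) / orig)  # ~1.333: mipmaps add about a third on top
print(rip_map_size(base) / orig)    # ~4.0: ripmaps need roughly 4x the original
```

The geometric series works out to 4/3 for the mip chain and 4x for the full rip map, which is exactly why ripmapping lost out to sample-based AF despite solving the stretching problem directly.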
 

low-G

Member
At 1 hour 9 minutes they show the extent of the real-time raytracing: it runs at 1/2 resolution, with distance-based sample counts. But it does seem like a significant step forward in raytracing in games...

The game CAN have multiple reflection bounces (though it sounds like that has to be specified explicitly), so not just raycasting but TRUE raytracing.
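One plausible reading of "distance-based samples" is that the ray march spends its step budget on nearby pixels and tapers off in the distance. A toy sketch of that idea (the numbers and function name here are purely illustrative, not Guerrilla's):

```python
def steps_for_distance(view_depth, near_steps=64, far_steps=8, far_depth=100.0):
    """Spend many ray-march steps on nearby pixels, few on distant ones,
    by linearly blending the step count with normalized view depth."""
    t = min(max(view_depth / far_depth, 0.0), 1.0)
    return round(near_steps + (far_steps - near_steps) * t)

print(steps_for_distance(0.0))    # 64 steps for the closest pixels
print(steps_for_distance(100.0))  # 8 steps far away
print(steps_for_distance(50.0))   # 36 in between
```

Combined with tracing at half resolution, this kind of scheme keeps the per-frame cost roughly bounded regardless of how much of the screen is reflective.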
 

HTupolev

Member
Actually, rip mapping is the one that solves the problem with stretched textures.
Anisotropic filtering adds samples into the sample area to get proper information from normal mipmaps (thus mipmaps still require only an additional ~33% of memory).
Yeah. I tried to clarify that by using terms like "in effect."

I'd argue that rip mapping is itself sort of a form of AF, even if it's not what modern GPUs implement as hardware AF; it's an additional mechanism in the texture filtering process meant to deal with the exact same problems caused by anisotropic effects. You could even argue that the mechanism is almost the same, as using ripmaps is basically just precomputing a set of AF sampling results.

I wasn't really trying to mislead, but I see that what I wrote might be problematic.
 

pottuvoi

Banned
Lack of AF does hurt IQ a lot; 2003 hardware could get usable 16x AF, and hardware from 2005 like the 7800 had no problems with it at all performance-wise (at least with games from around the same time).
The performance hit for 16x is ~7% on a 7970, so it certainly is there.
On the X360, one of the biggest reasons for sub-optimal AF performance was the small texture cache (32 kB).
Yeah. I tried to clarify that by using terms like "in effect."

I'd argue that rip mapping is itself sort of a form of AF, even if it's not what modern GPUs implement in hardware; it's an additional mechanism in the texture filtering process meant to deal with the exact same problems caused by anisotropic effects. You could even argue that the mechanism is almost the same, as using ripmaps is basically just precomputing a set of AF sampling results.
Agreed.
The memory cost of ripmapping is something that still comes up in discussions on anisotropic filtering, which was the reason for my post. :)
 

squidyj

Member
they should use that ray-tracing buffer for some sort of visual effect, if not in killzone then in another game, spirit walking or something. looks cool.
 

JCreasy

Member
Um guys . . . I may be hella late on this, but I need to know . . .

Where is this from? It looks like a Helghast version of the scenario from the demo.

IMG_00981-380x253.jpg
 
Um guys . . . I may be hella late on this, but I need to know . . .

Where is this from? It looks like a Helghast version of the scenario from the demo.

IMG_00981-380x253.jpg

It's just concept artwork by Guerrilla. It was posted in the announcement thread a while back.

And if you're thinking we might get to play the Helghast in the campaign, the unfortunate answer is no :(
 
Um guys . . . I may be hella late on this, but I need to know . . .

Where is this from? It looks like a Helghast version of the scenario from the demo.

IMG_00981-380x253.jpg

Helghast single player campaign confirmed :p

What if that wall divides one big play area for the SP, and you can choose which side you want to play on? Unlikely, but what if...
 
At 1 hour 9 minutes they show the extent of the real-time raytracing: it runs at 1/2 resolution, with distance-based sample counts. But it does seem like a significant step forward in raytracing in games...

The game CAN have multiple reflection bounces (though it sounds like that has to be specified explicitly), so not just raycasting but TRUE raytracing.

Yes, I think this is the part he was referring to when he said he couldn't go into much detail about it now.
 

RiverBed

Banned
Um guys . . . I may be hella late on this, but I need to know . . .

Where is this from? It looks like a Helghast version of the scenario from the demo.

IMG_00981-380x253.jpg

I believe it is from the official KZ website, under the story section. It is basically a vertical strip of the entire Killzone universe timeline; pretty artwork too.
 

JCreasy

Member

bubble burst

sooo, how about that new IP GG

At 1 hour 9 minutes they show the extent of the real-time raytracing: it runs at 1/2 resolution, with distance-based sample counts. But it does seem like a significant step forward in raytracing in games...

The game CAN have multiple reflection bounces (though it sounds like that has to be specified explicitly), so not just raycasting but TRUE raytracing.

Damn, next gen is sounding better and better.

Their new IP will benefit from all the lessons they learn from Shadow Fall. That makes my toes curl. Shadow Fall seems more like a preliminary exercise, an awesome-looking exercise, but an earnest warm-up. I think we'll be truly impressed with next-gen games.
 

Perkel

Banned
From the presentation:


- Every light source in the game is an area light (even the sun)
- Heavy emphasis on the physical properties of materials and how artists use them; thanks to that, they can work really fast and use every asset anywhere in the game (they don't need baked shadowing, etc.)
- They showed a fairly large staircase and said it was textured in about an hour because they used their predefined materials, and then they put 3 hours into making it old and dirty (by just layering textures on the materials)
- In 10 hours they can convert every asset from Killzone 3 to the Killzone 4 material-based model
 

squidyj

Member
So why are the reflections on the big buildings all faked?

Because the real-time reflection system fails to generate reflections for it, and falls back onto the cubemap or potentially the skybox.


The real-time system likely fails because it's a screen-space technique, and the buildings are reflecting information that is off-screen.
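That fallback logic can be sketched in miniature: march a reflection ray against a depth buffer, and if the ray leaves the screen before hitting anything, fall back to a cubemap sample. Here a 1-D list of depths stands in for the screen; everything is illustrative, not Guerrilla's actual code:

```python
def trace_ssr(depth_buffer, start_x, start_depth, dir_x, dir_z, steps=64):
    """March a reflection ray in 'screen space'; return the hit pixel or None.

    Returns None when the ray exits the screen -- exactly the case where a
    renderer must fall back to a cubemap or skybox sample."""
    x, z = float(start_x), start_depth
    for _ in range(steps):
        x += dir_x
        z += dir_z
        px = int(x)
        if px < 0 or px >= len(depth_buffer):
            return None          # ray left the screen: no SSR data available
        if z >= depth_buffer[px]:
            return px            # ray passed behind stored depth: hit
    return None

def reflect(depth_buffer, x, depth, dir_x, dir_z, cubemap_color, colors):
    """Use the SSR result when available, otherwise the cubemap fallback."""
    hit = trace_ssr(depth_buffer, x, depth, dir_x, dir_z)
    return colors[hit] if hit is not None else cubemap_color

depths = [5.0] * 8 + [2.0] * 8   # a "wall" closer to the camera on the right
colors = ["floor"] * 8 + ["wall"] * 8
print(reflect(depths, 2, 4.0, 1.0, -0.2, "sky", colors))   # hits the wall
print(reflect(depths, 2, 4.0, -1.0, -0.2, "sky", colors))  # exits screen -> "sky"
```

The second call is the "big building" case from the post: the reflected information lies outside the frame, so the trace fails and the cubemap (or skybox) color is all you get.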
 

Perkel

Banned
So why are the reflections on the big buildings all faked?

Well, they are not (entirely).

They mix all the effects to make the final result look good.



Here is only the cubemap from the presentation:

jbjG27JNQBlw6Y.png


As he said, it looks pretty shitty because the cubemap is displayed for only half of the water, and it is positioned incorrectly for the point of view.

Now this is what they get when they use all 3 at the same time (cubemap, raytracing, skybox):

jblj8hzFHapb5U.png


The effect, imo, is amazing considering it is not a fully real-time solution.
 

TheD

The Detective
It is a bit disingenuous to call it raytraced reflections; SSR has a bunch of flaws that real raytraced reflections do not have.
 
It is a bit disingenuous to call it raytraced reflections; SSR has a bunch of flaws that real raytraced reflections do not have.

It is disingenuous.
And if people are wondering what they mean by inter-reflections: he literally just means that the SSR reflections are done after bounce light is calculated from their GI (however they do GI; it is not mentioned), and hence reflected light from the GI ends up in the reflected images.

It is not actually reflections of reflections, just the diffuse component from the GI also being present in the reflections.
 

low-G

Member
It is disingenuous.
And if people are wondering what they mean by inter-reflections: he literally just means that the SSR reflections are done after bounce light is calculated from their GI (however they do GI; it is not mentioned), and hence reflected light from the GI ends up in the reflected images.

It is not actually reflections of reflections, just the diffuse component from the GI also being present in the reflections.

He specifically says they can do reflections on reflections.
 
He specifically says they can do reflections on reflections.

No. Given the technique, it does not make sense how one screen-space reflection would reflect in another. But what does make sense is what he says about the GI bounce being part of the reflection; the SSR is done after every other screen pass (excluding the cubemaps).

He mentions that GI bounces are in the reflections.

Those are technically reflections... but nothing like glossy reflections.

Can we please get a change of thread title? It is... more than inaccurate.
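The pass ordering being argued about here reduces to something very simple: if SSR samples a buffer that was already lit with diffuse GI, the GI term shows up in reflections "for free", with no second trace. A toy sketch (all names and numbers illustrative):

```python
def shade_pixel(direct, gi_diffuse):
    """Lighting pass: the lit buffer already includes the diffuse GI bounce."""
    return direct + gi_diffuse

def ssr_pass(lit_buffer, reflected_index):
    """SSR runs after lighting, so it reflects direct light AND the GI term."""
    return lit_buffer[reflected_index]

direct = [0.5, 0.125]
gi     = [0.25, 0.125]
lit = [shade_pixel(d, g) for d, g in zip(direct, gi)]

# Pixel 1 reflects pixel 0: the reflection carries pixel 0's GI contribution
# (0.25) even though no second SSR trace ever ran.
print(ssr_pass(lit, 0))  # 0.75 = 0.5 direct + 0.25 diffuse GI
```

This is the distinction being drawn in the post: the GI bounce appears *inside* a reflection, which is not the same thing as one glossy reflection appearing inside another.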
 
it is disengneous.
And if people are wondering what they mean by inter reflections.. he literally just means that the SSR reflection are done after bounce light is calculated from their GI (whatever way they do GI: it is not mentioned).. and hence reflected light from GI ends up in the reflected images.

It is not actually reflections on reflections.. just the diffuse component from the GI also being in the reflections.

Maybe I'm wrong, but he said they are using lightmaps; why would you need lightmaps if you are doing realtime GI?
Edit: Nvm, you probably didn't mean realtime GI, right?

Cubemaps reflecting all the environment, including track detail, tunnels, etc.?

Ignore the comparison; just the first pic I found:
http://www.abload.de/img/fm4gt5aezpy.jpg

They reflect whatever is baked into the cubemaps. Most of the time they are made during production, if I'm not mistaken, and not updated in realtime. You can see back in the slides that the red/pinkish colored box outlines in the scene mark where they place the cubemaps.
 

BigTnaples

Todd Howard's Secret GAF Account
Just to be clear, he hasn't said 16x AF, but FP16 HDR format.

Why the fuck isn't 16xAF standard on all console games?


The effect on IQ is HUGE and I have been forcing it for well over a decade on every PC game ever and it has NEVER affected my framerate negatively.
 
Why the fuck isn't 16xAF standard on all console games?


The effect on IQ is HUGE and I have been forcing it for well over a decade on every PC game ever and it has NEVER affected my framerate negatively.

I am pretty sure this game will use at least 8x AF. I would not worry about that.
I think AF is the least of their problems; their available shading power is the bigger one...
 
They reflect whatever is baked into the cubemaps. Most of the time they are made during production, if I'm not mistaken, and not updated in realtime. You can see back in the slides that the red/pinkish colored box outlines in the scene mark where they place the cubemaps.

Yeah, but would that cause very noticeable transitions in things like entering/exiting tunnels? It's something you don't see in GT5.
 

Stallion Free

Cock Encumbered
Why the fuck isn't 16xAF standard on all console games?


The effect on IQ is HUGE and I have been forcing it for well over a decade on every PC game ever and it has NEVER affected my framerate negatively.

Watch out, you probably just got labelled a PC elitist and will have your console-user credits checked.
 

antic604

Banned
Why the fuck isn't 16xAF standard on all console games?

Chill out, they haven't said they won't; it's kind of a given, considering the GPU, that they'll have good AF.

I'd really like somebody with a deeper understanding of the tech to explain the differences between their approach and what's in CryEngine 3. To my layman eyes (and brain...) it is strikingly similar. I also re-watched the gameplay demos and indeed started noticing how what they described looks on screen. I can imagine it only gets better while you actually play, because it gets 'interactive' :)
 

Portugeezer

Member
I already thought it looked great (character models could use some work, though); now, watching the video, it's cool to notice the little effects: when shooting the guys down the stairs, seeing their bullets reflected looks great.

16xAF is almost free on the hardware that will be in future consoles.


Now AA, on the other hand... yeah, there will be games that still have problems with AA and fucking tearing.

2xAA at 1080p is acceptable though; as long as we never see 0xAA games ever again, I will be happy.

Hmmm, I don't find much of a difference between 16xAF and 8xAF on the PC games I have played...

Same, and TBH I run most PC games with 8xAF because I don't notice the difference from 16x, and although people say it's basically "free" I still lower it.
 

Stallion Free

Cock Encumbered
Hmmm, I don't find much of a difference between 16xAF and 8xAF on the PC games I have played...

There isn't any visible performance difference between 8x and 16x, so there is no reason not to use 16x. I just have 16x forced globally and never run into issues, with the exception of Rage (use in-game) and Crysis (use a mod to fix POM/AF).
 