
Killzone 4 Uses Raytracing For Its Real-Time Reflections

Status
Not open for further replies.

RoboPlato

I'd be in the dick
right right, so now we know how Sony is "co-developing" Versus13 with Square :p



Quantic Dream, Heavy Rain cost them 16 million Euros to develop.

Naughty Dog outsourced the multiplayer character models for Uncharted 2 and 3 (and it shows).

This lighting presentation is pretty cool so far, although my connection is being shoddy so I can only watch it in spurts. The lighting goal with the Helghast model is gorgeous. I hope they can hit that.
 

B.O.O.M

Member
What is there to backup when it was confirmed more or less by GG at least 2 years ago?



You can probably add SSM's new IP to that list too in terms of first party.

If anything gets me excited for next-gen it's new IP's

Did they actually confirm the new IP? I don't recall them ever doing that

Oh and yes I did forget about the new IP Stig is supposedly making. Exciting stuff
 

RoboPlato

I'd be in the dick
Will this shorten development time?

Yeah. I may be misunderstanding based on his accent but it sounds like once the team got a hold on the new lighting tech it more than doubled their speed getting things looking right without having to prebake and it looks better. He outright says they would not be able to get the game out near launch looking as good as it will without this tech.
 

RoboPlato

I'd be in the dick
I thought that the reflections were all faked? Or are the shiny building ones faked and the in-level ones ray traced?

The giant buildings were all faked through cubemaps. In game it's a combination of raytracing, cube maps, and the skybox depending on how far the subject is.
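The mix described above can be sketched as a simple per-surface decision. This is a hypothetical illustration of the idea, not Guerrilla's actual code; the function name and distance thresholds are made up:

```python
# Illustrative sketch: choose a reflection source per surface by distance,
# as the post describes. Names and thresholds are assumptions, not GG's.

def pick_reflection_technique(distance_m, near_limit=50.0, far_limit=500.0):
    """Return which reflection source to use for a surface at a given distance."""
    if distance_m < near_limit:
        return "screen-space raytrace"   # accurate, but only works for nearby geometry
    elif distance_m < far_limit:
        return "cubemap"                 # prebaked local probe for the mid-range
    else:
        return "skybox"                  # distant surfaces just reflect the sky

print(pick_reflection_technique(10.0))    # screen-space raytrace
print(pick_reflection_technique(120.0))   # cubemap
print(pick_reflection_technique(2000.0))  # skybox
```

The point of blending techniques like this is that the expensive, accurate method is only paid for where the player can actually see the difference.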
 

Saberus

Member
They also said at the beginning of this presentation that the Feb demo was running on a final Dev kit. And in my new Edge mag, Guerrilla said they received a new dev kit two weeks before the show. So sounds like the hardware could be almost ready for production.
 

RoboPlato

I'd be in the dick
They also said at the beginning of this presentation that the Feb demo was running on a final Dev kit. And in my new Edge mag, Guerrilla said they received a new dev kit two weeks before the show. So sounds like the hardware could be almost ready for production.

Even though he said final, a lot of the rumors said that the January dev kits were the last ones before the final ones come. He may have just misspoke and it's the last one before the real ones arrive.
 

RoboPlato

I'd be in the dick
It would have been pathetic if AF wasn't standard next-gen.

Do you play many console games? AF is very rare currently and certainly not at 16x. I just thought it was worth mentioning since it's a big jump from current gen. Nothing mindblowing, just good to know.
 

Stallion Free

Cock Encumbered
Do you play many console games? AF is very rare currently and certainly not at 16x. I just thought it was worth mentioning since it's a big jump from current gen. Nothing mindblowing, just good to know.

I know it's non-existent this gen. It's one of the most irritating omissions from current games. This is a massive leap in hardware. The performance hit on PC graphics cards has been non-existent for years now.
 

Stallion Free

Cock Encumbered
What does AF do?

6wsHnR2.png
 

milsorgen

Banned
Just to clear things up, raytracing is used for a lot of stuff...
Collision detection, AI... whatever needs a distance and a direction will probably use some form of raytracing.
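The kind of raycast those systems build on can be shown with a minimal ray-vs-box intersection (the classic "slab" test). This is a generic textbook sketch, not code from any particular engine:

```python
# Minimal ray-vs-AABB (axis-aligned bounding box) "slab" test: the building
# block behind raycast collision checks and AI line-of-sight queries.

def ray_aabb(origin, direction, box_min, box_max):
    """Return the entry distance along the ray, or None if the box is missed."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:                  # ray parallel to this pair of slabs
            if o < lo or o > hi:
                return None                # and outside them: no hit possible
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
    if t_near > t_far or t_far < 0:
        return None                        # intervals don't overlap, or box is behind us
    return max(t_near, 0.0)

# A ray from the origin along +x hits a box two units away:
print(ray_aabb((0, 0, 0), (1, 0, 0), (2, -1, -1), (3, 1, 1)))  # 2.0
```

The same query answers "how far until I hit something" (collision) and "can A see B" (AI), which is why so many engine systems end up being raytracing in some form.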

Thanks for the info man, I try and stay somewhat abreast of general game soft-tech but I guess I never put two and two together.
 
It's a basic feature that makes a massive difference. Sony and MS kinda fucked up with their hardware design making it difficult to include.
It doesn't make a massive difference... I can never tell the difference. A bigger issue than AF is missing vegetation and grass in large environments. The recent BF3 railroad track comes to mind.
 

TheD

The Detective
Let the PC elitist talk begin...

It is not "elitist" to point out that most console games either lack AF or only use a low amount, that its performance cost has not been a problem on newer GPUs for a long time, and that the new consoles should therefore have it more often!
 

Niks

Member
wow that was great!

Would love to see a talk like this from Polyphony Digital. The stuff they pulled on PS3 hardware was unreal.
 

HTupolev

Member
I see, nice that they are doing proper stuff in game rather than faking it
Screen-space reflections are a pretty long way from not faking it. Aside from being more versatile than planar reflections in one or two respects (i.e. they're a lot easier to apply to curved surfaces), they're about as faked as reflections get. They attempt to determine what should be reflected in surfaces based only on the things currently on screen, which results in about as much visual stability as a Halo 2 cutscene running on an original PlayStation.

The extent to which they're fake is why devs are interested in them, as opposed to expensive (but far more stable) methods like planar reflections which require rendering entire extra images, or full geometry ray tracing which takes tons of stuff into account.
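The core of the technique is just stepping a ray through the depth buffer until it passes behind visible geometry. Here's a toy one-dimensional version, purely illustrative (real SSR marches in 2D with perspective-correct depth, but the loop is the same shape):

```python
# Toy screen-space ray march over a 1-D "depth buffer". Step along the
# reflected ray; a hit is where the ray goes behind stored depth, and a
# miss is the ray leaving the screen -- the instability described above.

def ssr_march(depth_buffer, start_x, start_depth, dx, dz, max_steps=64):
    """Return the pixel index where the ray passes behind the depth buffer,
    or None if it leaves the screen (no on-screen data to reflect)."""
    x, z = float(start_x), start_depth
    for _ in range(max_steps):
        x += dx
        z += dz
        px = int(round(x))
        if px < 0 or px >= len(depth_buffer):
            return None                # off-screen: the reflection has no source
        if z >= depth_buffer[px]:
            return px                  # ray went behind visible geometry: hit
    return None

# A floor at depth 10 with a pillar (depth 5) at pixels 6-7:
depths = [10, 10, 10, 10, 10, 10, 5, 5, 10, 10]
print(ssr_march(depths, 0, 0.0, 1.0, 1.0))  # 6 (ray hits the pillar)
```

Note how the `None` cases fall straight out of the loop: anything occluded or off-screen simply cannot appear in the reflection, which is exactly the instability being described.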

What does AF do?
AF is an extension of mipmapping, to deal with problems created by the basic mipmapping technique.

If you load up a 3d game from the early through mid 90's, and you rotate the camera, any detailed textures in the distance will be seen to shimmer. Suppose that the red dots represent the center points of pixels on your screen, and the black and white strips are the actual texture on a far-off object:

8K3pWw2.png


At this moment, this part of your screen will be solid black, because all of the pixel centers fall on black spots on the texture. But if you rotate your screen a bit to the right, the red dots all might slip onto the white, and suddenly the whole part of your screen looks white. This is very incorrect, shimmery behaviour. In the real world, if you were to view an object made of thin black and white strips from a distance, the black and white strips would blend to look grey. And it wouldn't shimmer in your vision as you look around.

The issue happens because you're sampling with a small number of sample points (the pixel centers) from a thing with lots of high-frequency detail (the strips). One solution is to sample from a lower-frequency image. Or, in simpler terms, a scaled down image. So, in addition to having your big texture with black and white strips, you might also have a scaled down copy of that texture, which in this case might just look like a grey blob:

o8LlbPm.png


(Pretend that the stripes are already only like a 10x10 texture for this to look correct, so that it actually does become a grey blob when you scale down. Obviously a ~200x200 texture like this wouldn't turn into a grey blob when scaled to ~70x70 like in this picture.)

Now that we have this scaled-down version of the texture on hand, we can use it when the surface is viewed from a large distance to get correct results:

4tii6qK.png


Yay, we have now implemented mip mapping!
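The mip chain described above can be sketched in a few lines: repeatedly average 2x2 blocks, and watch the high-frequency stripes blend toward grey at small sizes. A simple illustration of the concept, not how a GPU actually stores or filters mips:

```python
# Build a mip chain by repeatedly averaging 2x2 blocks of a square texture
# (a list of rows). High-frequency stripe detail averages out to grey.

def downscale(tex):
    """Halve a square texture by averaging each 2x2 block into one texel."""
    n = len(tex) // 2
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

def build_mips(tex):
    """Return the full chain: original texture down to a single texel."""
    mips = [tex]
    while len(mips[-1]) > 1:
        mips.append(downscale(mips[-1]))
    return mips

# 4x4 texture of 1-texel black (0) / white (1) vertical stripes:
stripes = [[x % 2 for x in range(4)] for _ in range(4)]
mips = build_mips(stripes)
print(mips[-1])  # [[0.5]] -- the stripes average out to grey
```

Sampling from the right level of this chain is what replaces the shimmering point-samples with the stable grey the real world would show.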

HOWEVER! There's a problem. While this technique works great when viewing surfaces head-on, it isn't quite right for viewing surfaces at oblique angles.

Suppose that the surface with this striped texture is suspended off the ground, maybe like a freeway sign. Suppose that we walk toward its location so that we're just about standing beneath it. And then suppose we look up at it. Because we're viewing it at an oblique angle, it's filling only a small slice of our view vertically, even though it's still filling a pretty good chunk of our vision horizontally.
So... which mip map level do we use? The stripes that are appropriate for viewing a texture up close, or the grey blob that we use when we view a texture from a distance?
In general, there isn't a good answer. If the texture has lots of high-frequency detail in the vertical direction (this one's stripes only have high-frequency detail in the horizontal direction, but we can't build a technique based on the quirks of our single specific texture), using the big version can cause shimmering when the camera is rotated up and down. But using the small one obviously obscures horizontal detailing (the stripes, in this case) that we really should still be able to see!
In games with regular mipmapping, choosing the smaller texture is standard, since it ensures image stability. This is why in console games today, textures are almost always fairly blurry when viewed at oblique angles.

Anisotropic filtering solves the problem by (in effect) including "stretched" textures into our mip maps. Look back at the image that contains both the stripes and scaled-down grey texture. Notice the missing stuff on the upper right and lower left? We can put additional, and very useful, scaled textures there.
The lower left spot can hold a vertically-squished texture that we can use like the situation of looking up at a surface. And the upper right spot can hold a horizontally-squished texture that we could use in situations like looking at a wall from an oblique angle.
In the case of this texture, because the high-frequency detail is entirely in the horizontal direction, these extra textures aren't very interesting...

bprJrzR.png


...but again, we need to account for the general case where we might have vertical details as well.

And that's the gist of AF (its motivation and results, anyway; the exact implementation details don't usually work exactly like this these days, but it's the idea). It's awesome, but PS360 games haven't seen a whole lot of it because it taxes memory. A PS4 with 8GB of blazing fast RAM should be able to handle it much more easily.
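The difference between one isotropic sample and an anisotropic footprint can be shown in one dimension. This is a rough sketch of the idea only (real hardware picks mip levels per axis rather than point-sampling like this); the function and its parameters are illustrative:

```python
# Rough sketch of the anisotropic idea: when a pixel's footprint on the
# texture is stretched (say 8 texels wide but 1 tall), take several taps
# along the long axis and average them, instead of one sample from a mip
# small enough to cover the whole stretch (which would blur everything).

def sample_footprint(tex_row, center, width, taps):
    """Average `taps` point samples spread across a footprint of `width`
    texels in a 1-D texture row, clamping at the edges."""
    total = 0.0
    for i in range(taps):
        # spread taps evenly across [center - width/2, center + width/2]
        t = center - width / 2.0 + width * (i + 0.5) / taps
        px = min(max(int(t), 0), len(tex_row) - 1)
        total += tex_row[px]
    return total / taps

row = [0, 1] * 8                             # 1-texel black/white stripes
iso = sample_footprint(row, 8.0, 8.0, 1)     # one sample: lands on black or white
aniso = sample_footprint(row, 8.0, 8.0, 8)   # 8 taps: the stable grey average
print(iso, aniso)  # 0.0 0.5
```

The single tap flickers between 0 and 1 as the footprint shifts; the multi-tap result stays at the correct grey. That extra sampling along one axis is, in effect, what "16x AF" buys you.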
 

Desty

Banned
Yeah their lighting is really advanced. Like some sort of hybrid between real time and a movie pipeline. I really hope they put up some papers and examples about all the crazy stuff they are doing.
 

Lord Error

Insane For Sony
10 years late.
Sigh, you know what...

I'm not going to go there, but I will say, I don't do 16x on my PC or laptop even today. I've seen some games look buggy with 16xAF, so I just stick with 4X, which is what I think some console games use as well. I also kind of think 16xAF performance hit on 2003 hardware would be unacceptable, but even if true, I realize that it would not change the nature of the comment, which was regardless unnecessary.
 
D

Deleted member 80556

Unconfirmed Member
Yeah. I may be misunderstanding based on his accent but it sounds like once the team got a hold on the new lighting tech it more than doubled their speed getting things looking right without having to prebake and it looks better. He outright says they would not be able to get the game out near launch looking as good as it will without this tech.

Excellent news. I hope that they're sharing this with the ICE team over in America so that third parties can have access to this kind of technology.

Streamlining the development of difficult and noticeable things like lighting can only be good for everyone. Will still watch the entire presentation this week when I have more time.
 
D

Deleted member 80556

Unconfirmed Member
EDIT: Double post because of 508 error. Lemme talk about something then!

It's a basic feature that makes a massive difference. Sony and MS kinda fucked up with their hardware design making it difficult to include.

But would it have been possible and cost-effective ten years ago as you say?
 

TheD

The Detective
Sigh, you know what...

I'm not going to go there, but I will say, I don't do 16x on my PC or laptop even today. I've seen some games look buggy with 16xAF, so I just stick with 4X, which is what I think some console games use as well. I also kind of think 16xAF performance hit on 2003 hardware would be unacceptable, but even if true, I realize that it would not change the nature of the comment, which was regardless unnecessary.

The only modern game I am aware of that has problems with high levels of AF is Rage, due to the frankly shitty megatexture system.

Lack of AF does hurt IQ a lot. 2003 hardware could get usable 16x AF, and hardware from 2005 like the 7800 had no performance problems with it at all (at least with games from around the same time).
 

James Sawyer Ford

Gold Member
Yeah. I may be misunderstanding based on his accent but it sounds like once the team got a hold on the new lighting tech it more than doubled their speed getting things looking right without having to prebake and it looks better. He outright says they would not be able to get the game out near launch looking as good as it will without this tech.

What's the timestamp on when he discusses this?
 
Nintendo pushing innovation...

Sony pushing hardware...

Both with awesome 1st party...

Microsoft and halo(even tho i loved the first halo)

Get outa ma face..
 

onQ123

Member
Nintendo pushing innovation...

Sony pushing hardware...

Both with awesome 1st party...

Microsoft and halo(even tho i loved the first halo)

Get outa ma face..
e65.gif


Kinect 2 & Smart Glass could be just as innovative as the Wii U pad, & the Xbox 3 shouldn't be that far behind the PS4 in hardware.
 

Stallion Free

Cock Encumbered
Sigh, you know what...

I'm not going to go there, but I will say, I don't do 16x on my PC or laptop even today. I've seen some games look buggy with 16xAF, so I just stick with 4X, which is what I think some console games use as well. I also kind of think 16xAF performance hit on 2003 hardware would be unacceptable, but even if true, I realize that it would not change the nature of the comment, which was regardless unnecessary.
What games?
 
Kinect 2 & Smart Glass could be just as innovative as the Wii U pad & Xbox 3 shouldn't be that far behind PS4 in hardware.

Smart Glass won't be nearly the same. Why? It requires people to buy both the next Xbox and a tablet. Not everyone can afford that. So it won't have the same dev treatment as, let's say, the Wii U, which gives it all in one package and will be priced considerably less.

I do agree the next Xbox won't be far behind the PS4 in hardware, but Sony, out of the three, is pushing it the most. Plus their 1st party is much better.



EDIT: I feel bad using the gif i used. Not trying to be racist, but it comes off that way and i apologize.

Ill remove that gif, if you also want to from quoting me..
 
Finally had the time to watch this. Seeing how far out they use the SSRT is pretty impressive. The shot of the main area used it out to a very far distance.
 

onQ123

Member
Smart Glass won't be nearly the same. Why? It requires people to buy both the next Xbox and a tablet. Not everyone can afford that. So it won't have the same dev treatment as, let's say, the Wii U, which gives it all in one package and will be priced considerably less.

I do agree the next Xbox won't be far behind the PS4 in hardware, but Sony, out of the three, is pushing it the most. Plus their 1st party is much better.

I'm sure there are more smartphones & tablets in homes than there are Wii U's
 