
Killzone: Shadow Fall technical presentation slides

Violater

Member
Tech and graphics are the least of my overall concerns for the series going forward, though it would be nice to see the destructible environments make a strong comeback.
I also hope the supposed tighter DS4 controls really shine in SF and they settle on a control scheme and keep it.
Additionally, the time lapse gives G.G. the opportunity to craft a more engaging story and cast of characters.
Obligatory fix that shit G.G.
 

LiquidMetal14

hide your water-based mammals
3GB for OS is absurd

I don't know where the numbers are coming from but when I spoke to someone, they said 6. Then another gaffer said 7 but I don't know if I should believe that since what I heard came from someone who has had time with the HW (PS4).
 

KageMaru

Member
I don't know where the numbers are coming from but when I spoke to someone, they said 6. Then another gaffer said 7 but I don't know if I should believe that since what I heard came from someone who has had time with the HW (PS4).

I thought a dev stated they have access to 7GB, no?
 

kaching

"GAF's biggest wanker"
But I'm not saying they put all of the additional RAM into the OS.
I didn't say you did.

And it's not a "maybe, who knows" proposition.
It certainly is for Sony if they weren't already planning for it before February. Unless they have definitive plans for that memory of their own (and we've had absolutely no indication that they do), they're just playing a wait-and-see game with MS, and they'll be reactive rather than proactive. That means significant lag between seeing what MS is doing, seeing whether it takes hold with the market, and Sony deciding whether they need to counter and then implement a competing service. Meanwhile, that memory sits fallow. That's not a good justification for adding significant manufacturing cost to their hardware.

The other aspect here is that they're running very different operating systems, which have very different memory requirements for OS functions and services. So Sony could still be aiming to compete on equal footing wrt apps/services with a smaller OS memory reserve.
 
Tech and graphics are the least of my overall concerns for the series going forward, though it would be nice to see the destructible environments make a strong comeback.
I also hope the supposed tighter DS4 controls really shine in SF and they settle on a control scheme and keep it.
Additionally, the time lapse gives G.G. the opportunity to craft a more engaging story and cast of characters.
Obligatory fix that shit G.G.

Well said. I want a destruction system in every FPS; it's just an incredible feature. But this is a launch game, so I am not expecting it.

I will buy this for the multiplayer, I just love KZ multiplayer.
 
Another interesting thing that I noticed from the slides - their CPU profiler has six bars. Could that mean that 6 cores are available for development with two being locked out for OS or whatever else?

Also, it just dawned on me how it's kind of crazy that the game is already using so much memory. Just for sound it's using more than the PS3's total amount of memory, and 4.5GB is probably more than any PC game I've seen. I think BF3 on Ultra is something like 3GB when you add up main + VRAM.

No, it just means they are currently using 6 cores.

1GB seems more than enough for a console OS. It's not exactly a full-blown Windows OS.

1GB also seems way too much. Vita uses around 90MB and has a lot of OS functions already. I think the PS4 will use less than 256MB for the OS, with a total of 512 MB reserved for future upgrades.
 

That sounds like it doesn't really have anything to do with the cores. In fact, in the statement right above that they specifically say "they're using all the cores". Seems like they might not have been using the audio chip to its full capacity.

Sony reserving two cores is quite disappointing and seems excessive, although MS is rumored to do the same thing. I would think this is something that could always change before PS4 is released; it's very early on, and they probably could make a software change like this closer to release. Also, there's little to no chance Sony is disabling a core for yields, as they wouldn't call it an "8-core processor." They never said Cell was a 9-core processor or that the PS3 had 8 SPEs and 1 PPE, for example. It was always 7 SPEs (one reserved for the OS).
 

Zoator

Member
I didn't say you did.

It certainly is for Sony if they weren't already planning for it before February. Unless they have definitive plans for that memory of their own (and we've had absolutely no indication that they do), they're just playing a wait-and-see game with MS, and they'll be reactive rather than proactive. That means significant lag between seeing what MS is doing, seeing whether it takes hold with the market, and Sony deciding whether they need to counter and then implement a competing service. Meanwhile, that memory sits fallow. That's not a good justification for adding significant manufacturing cost to their hardware.

The other aspect here is that they're running very different operating systems, which have very different memory requirements for OS functions and services. So Sony could still be aiming to compete on equal footing wrt apps/services with a smaller OS memory reserve.

Why do they have to wait and see specifically what MS is doing? They have probably known for well over a year now that MS was going to target applications and services pretty heavily, so there's no reason Sony can't support that right out of the gate. A lot of it comes down to how the operating system can be used, regardless of any specific applications that may exist. If they want to do extensive, seamless multitasking inclusive of all of their system features, memory is a must.

And I feel that the argument for manufacturing costs actually goes the other way. If they just wanted to match the RAM that MS is giving to developers, they could have increased their pool to 6GB instead of 8GB, and continue to reserve 1GB for the OS. That would put both MS and Sony at 5GB for games, with Sony having the advantage in bandwidth and GPU power. So why add more to their costs with another 2GB on top of that for games, which would result in very minimal returns? Instead, they could build out a more robust operating system, and support their console with non-game applications and services to the same extent that MS will be able to.
 

RoboPlato

I'd be in the dick
That sounds like it doesn't really have anything to do with the cores. In fact, in the statement right above that they specifically say "they're using all the cores". Seems like they might not have been using the audio chip to its full capacity.

Sony reserving two cores is quite disappointing and seems excessive, although MS is rumored to do the same thing. I would think this is something that could always change before PS4 is released; it's very early on, and they probably could make a software change like this closer to release. Also, there's little to no chance Sony is disabling a core for yields, as they wouldn't call it an "8-core processor." They never said Cell was a 9-core processor or that the PS3 had 8 SPEs and 1 PPE, for example. It was always 7 SPEs (one reserved for the OS).
I'm also hoping that the CPU reserve goes down. I had heard it was less than 1 core in the past, which would be awesome.
 
So is there any way to tell the total amount of triangles SF is pushing on screen and how it compares to KZ3? I know they said the NPCs have 4x as much.
 
That sounds like it doesn't really have anything to do with the cores. In fact, in the statement right above that they specifically say "they're using all the cores". Seems like they might not have been using the audio chip to its full capacity.

Sony reserving two cores is quite disappointing and seems excessive, although MS is rumored to do the same thing. I would think this is something that could always change before PS4 is released; it's very early on, and they probably could make a software change like this closer to release. Also, there's little to no chance Sony is disabling a core for yields, as they wouldn't call it an "8-core processor." They never said Cell was a 9-core processor or that the PS3 had 8 SPEs and 1 PPE, for example. It was always 7 SPEs (one reserved for the OS).

No one said Sony reserved two cores for the OS. I even remember someone considered reliable on this board saying all 8 cores are available minus some resources taken from one core. So until we have some official info we shouldn't jump to conclusions.
 
No one said Sony reserved two cores for the OS. I even remember someone considered reliable on this board saying all 8 cores are available minus some resources taken from one core. So until we have some official info we shouldn't jump to conclusions.

In the slides they show their profiler, and you can clearly see that, at the moment, they're using 6 cores. They also made a comment in the article up above that they're utilizing all cores.

Digital Foundry also confirmed this in their analysis of the KZ: SF presentation, but they made sure to point out that it could change, and this is just what it is for the time being.
 

RiverBed

Banned
Wouldn't/shouldn't the extra hardware that deals with sharing also handle OS RAM, leaving only a minimum to connect the two (game and OS under the same UI)?
 

DieH@rd

Banned
In the slides they show their profiler, and you can clearly see that, at the moment, they're using 6 cores. They also made a comment in the article up above that they're utilizing all cores.

Digital Foundry also confirmed this in their analysis of the KZ: SF presentation, but they made sure to point out that it could change, and this is just what it is for the time being.

Guerrilla devs mentioned that devkits did not have dedicated hardware for sound, and that they had to emulate that with the regular hardware [read: CPU]. One core in the KZ:SF PlayStation Meeting demo could easily have been used for sound. More info about OS resources in retail boxes will come nearer to launch, as always.
 
In the slides they show their profiler, and you can clearly see that, at the moment, they're using 6 cores. They also made a comment in the article up above that they're utilizing all cores.

Digital Foundry also confirmed this in their analysis of the KZ: SF presentation, but they made sure to point out that it could change, and this is just what it is for the time being.

And one thing to underline: in the presentation summary they say nothing about CPU performance (however, they do say the GPU is very fast). For Guerrilla, who brute-forced Cell, 6 Jaguar cores at 1.6GHz must be shit. No wonder Sony will try to get their clocks as high as possible (1.8-2GHz), especially if 2 cores are also reserved (I suppose they are waiting to see what MS will do before finally setting this in stone).
 

RoboPlato

I'd be in the dick
And one thing to underline: in the presentation summary they say nothing about CPU performance (however, they do say the GPU is very fast). For Guerrilla, who brute-forced Cell, 6 Jaguar cores at 1.6GHz must be shit. No wonder Sony will try to get their clocks as high as possible (1.8-2GHz), especially if 2 cores are also reserved (I suppose they are waiting to see what MS will do before finally setting this in stone).

My theory on the CPU core clocks and why they haven't announced anything is that they're trying to raise them while keeping the clocks matched with the GPU to maintain efficiency during HSA functions. They need to be a clean multiple of each other for that to work. 1.6GHz is exactly what it would need to be for the 800MHz GPU, and any increases to the CPU clock are probably bound by the GPU clock and the TDP envelope that comes with it. Upclocking a Jaguar CPU isn't a big deal, but taking the GPU up to 900MHz for a 1.8GHz CPU, or even 1GHz for a 2GHz CPU, would be a pretty big jump that could cause problems. Of course, if they could do it, then on top of the extra CPU power they'd get the PS4 over 2 TFLOPS, but they've been saying the 1.84 TFLOP number a lot, so it's probably not going to change.
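Just to sanity-check those numbers with a quick back-of-the-envelope sketch, assuming the widely reported 18 GCN compute units (the CU count isn't something from the slides themselves):

```python
# Rough sketch: GCN throughput = CUs x 64 lanes x 2 FLOPs (one FMA) per cycle.
# The 18-CU figure is an assumption from public reports, not from the slides.
def gcn_tflops(clock_ghz, compute_units=18):
    return clock_ghz * compute_units * 64 * 2 / 1000.0

print(gcn_tflops(0.8))  # ~1.84 TFLOPS at 800 MHz, the quoted number
print(gcn_tflops(0.9))  # ~2.07 TFLOPS if the GPU went to 900 MHz
```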
 
In the slides they show their profiler, and you can clearly see that, at the moment, they're using 6 cores. They also made a comment in the article up above that they're utilizing all cores.

Digital Foundry also confirmed this in their analysis of the KZ: SF presentation, but they made sure to point out that it could change, and this is just what it is for the time being.


Does the demo run on final PS4 hardware? No, it does not. They also said tools are very immature at this point. So we can't really assume from that presentation that two cores are reserved, especially because reserving two cores for the OS wouldn't make sense.

And one thing to underline: in the presentation summary they say nothing about CPU performance (however, they do say the GPU is very fast). For Guerrilla, who brute-forced Cell, 6 Jaguar cores at 1.6GHz must be shit. No wonder Sony will try to get their clocks as high as possible (1.8-2GHz), especially if 2 cores are also reserved (I suppose they are waiting to see what MS will do before finally setting this in stone).

I think it's the opposite. For running games, Cell is very shitty compared to a proper x86 processor like Jaguar. They are probably very glad to have something so much easier to program for. And although Cell's single-precision performance is great, in double-precision operations Jaguar beats Cell by a long margin. Cell's double-precision performance is about 21 GFLOPS, while Jaguar is around 51 GFLOPS at 1.6GHz.
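For what it's worth, the Jaguar figure lines up if you assume all 8 cores and 4 double-precision FLOPs per core per cycle (a 128-bit add plus a 128-bit multiply); the Cell number is just taken from the post above:

```python
# Sanity check of the quoted Jaguar figure; the 4 DP FLOPs/cycle/core rate and
# the 8-core count are assumptions for illustration, not from the slides.
cores = 8
clock_ghz = 1.6
dp_flops_per_cycle = 4
print(cores * clock_ghz * dp_flops_per_cycle)  # ~51.2 GFLOPS double precision
```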
 

stryke

Member
Can someone who has a copy of the PowerPoint presentation upload it? It seems the links at GG don't work anymore.
 

i-Lo

Member
Pertaining to 7 LoDs as opposed to 3 LoDs for most opponents: why doesn't GG, being a leading studio in visual fidelity (among Sony's 1st party), use (adaptive) tessellation to bridge the gap between those LoDs (or even reduce their number) on the fly, instead of resorting to switching them out completely and causing pop-in (albeit the effect would be less jarring purely because there are 4 more LoDs to go through now)?
 
Pertaining to 7 LoDs as opposed to 3 LoDs for most opponents: why doesn't GG, being a leading studio in visual fidelity (among Sony's 1st party), use (adaptive) tessellation to bridge the gap between those LoDs (or even reduce their number) on the fly, instead of resorting to switching them out completely and causing pop-in (albeit the effect would be less jarring purely because there are 4 more LoDs to go through now)?
They might do something like that for their next game, but right now they have a lot to work on and learn. They'll just use up some of that extra memory to keep those extra LODs loaded (as they probably don't have much use for it at the moment).
 

spwolf

Member
Pertaining to 7 LoDs as opposed to 3 LoDs for most opponents: why doesn't GG, being a leading studio in visual fidelity (among Sony's 1st party), use (adaptive) tessellation to bridge the gap between those LoDs (or even reduce their number) on the fly, instead of resorting to switching them out completely and causing pop-in (albeit the effect would be less jarring purely because there are 4 more LoDs to go through now)?

their "lower" lods might be much more detailed now though, so you might not notice much if any difference at all.
 

i-Lo

Member
They might do something like that for their next game, but right now they have a lot to work on and learn. They'll just use up some of that extra memory to keep those extra LODs loaded (as they probably don't have much use for it at the moment).

Thank you. I think if, by some fortuitous event, devs (like GG) can be asked about this specifically, then further discussion on this matter is warranted.

their "lower" lods might be much more detailed now though, so you might not notice much if any difference at all.

I know it's Watch_Dogs, but observe the 00:59, 5:31, and 5:35 time marks: at the first two, LoD pop-in occurs (the lady with the purple dress and the garage door at the far end, respectively). At the last one, a vent pops up.

Honestly, at this point, if they can't solve this issue after touting the benefits of tessellation and its more mainstream adoption, then I'd honestly take a lower-fidelity model outright.
 

Violater

Member
New engine. New lighting, new renderer, etc.

I think either they are lacking the vision or the launch time frame has hindered what they can do so far with the game. But there are more features they could be incorporating into the game to make the world seem less static. There need to be more things to interact with, not just shooting barrels or pushing buttons to open doors.
 
I think either they are lacking the vision or the launch time frame has hindered what they can do so far with the game. But there are more features they could be incorporating into the game to make the world seem less static. There need to be more things to interact with, not just shooting barrels or pushing buttons to open doors.

Launch title mang.

They just want to give the kids a shoot-bang to play when they get their new console.
 

Lord Error

Insane For Sony
Pertaining to 7 LoDs as opposed to 3 LoDs for most opponents: why doesn't GG, being a leading studio in visual fidelity (among Sony's 1st party), use (adaptive) tessellation to bridge the gap between those LoDs (or even reduce their number) on the fly, instead of resorting to switching them out completely and causing pop-in (albeit the effect would be less jarring purely because there are 4 more LoDs to go through now)?
If I had to guess, I'd say it's because they have more than enough memory to store as many LOD levels as they want, but they don't have infinite GPU power to use for tessellation, which usually still destroys performance on every GPU. They can kind of pre-compute tessellation and get it performance free.
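To make the discrete-LOD approach concrete, here's a minimal sketch of picking a pre-built mesh from camera distance; the switch distances are made up for illustration, not taken from the slides:

```python
# Hypothetical switch distances (in metres) for a 7-LOD chain; the real values
# Guerrilla uses aren't public, these numbers are purely illustrative.
LOD_SWITCH_DISTANCES = [5, 10, 20, 40, 80, 160]

def select_lod(camera_distance, switch_distances=LOD_SWITCH_DISTANCES):
    """Return the index of the pre-built mesh to draw at this distance."""
    for lod, limit in enumerate(switch_distances):
        if camera_distance < limit:
            return lod
    return len(switch_distances)  # coarsest LOD beyond the last threshold

print(select_lod(3))    # 0 -> highest-detail mesh
print(select_lod(300))  # 6 -> coarsest mesh
```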
 

i-Lo

Member
If I had to guess, I'd say it's because they have more than enough memory to store as many LOD levels as they want, but they don't have infinite GPU power to use for tessellation, which usually still destroys performance on every GPU. They can kind of pre-compute tessellation and get it performance free.

That does make sense. That said, I thought these new-generation GPUs (esp. HD7XXX) were being primed to adequately handle tessellation. Given that tessellation aims to keep the overall number of polys nearly constant, I didn't think it could tax the GPU that much.

I was hoping that the LoD models would be the pre-determined tether points along which tessellation/de-tessellation would progress, instead of arbitrarily increasing the poly count of what's closest to the screen. Of course, the highest LoD models next gen are going to look suitably rounded (in motion, current-gen games have done a most impressive job of imbuing central characters with an apt amount of polys for detail) and it's the transition that feels jarring.
 

RoboPlato

I'd be in the dick
That does make sense. That said, I thought these new-generation GPUs (esp. HD7XXX) were being primed to adequately handle tessellation. Given that tessellation aims to keep the overall number of polys nearly constant, I didn't think it could tax the GPU that much.

I was hoping that the LoD models would be the pre-determined tether points along which tessellation/de-tessellation would progress, instead of arbitrarily increasing the poly count of what's closest to the screen. Of course, the highest LoD models next gen are going to look suitably rounded (in motion, current-gen games have done a most impressive job of imbuing central characters with an apt amount of polys for detail) and it's the transition that feels jarring.

They probably didn't have hardware to truly work with high levels of tessellation for most of the development process. They're definitely using edge tessellation, but fully tessellating everything would take more time than they probably have before launch, so they're just using LoDs since they have more knowledge of them, wouldn't have to redo any work, and can brute-force quality on PS4 with it.
 

Portugeezer

Member
I know it's Watch_Dogs, but observe the 00:59, 5:31, and 5:35 time marks: at the first two, LoD pop-in occurs (the lady with the purple dress and the garage door at the far end, respectively). At the last one, a vent pops up.

Honestly, at this point, if they can't solve this issue after touting the benefits of tessellation and its more mainstream adoption, then I'd honestly take a lower-fidelity model outright.

I saw grass pop-in in the BF4 gameplay which was running on some beast PC apparently.

I will wait for the final versions before criticising though.
 
No, it just means they are currently using 6 cores.



1GB also seems way too much. Vita uses around 90MB and has a lot of OS functions already. I think the PS4 will use less than 256MB for the OS, with a total of 512 MB reserved for future upgrades.

It looks like Sony is reserving 2 cores, so games at most will be able to use 6 cores, and I expect that to be final. That's still a great amount of performance. If there was an additional core they could have been using, but weren't, I would imagine they would have called that out for future enhancements in their presentation.

Plus, if they reserve 2 cores, it gives them a lot more flexibility in what they could do in the OS down the road. There's nothing stopping them from reducing that in the future if they find out they don't need it, but you can't go the other way around.

But like I said, we have nothing to worry about. 6 cores is still a hell of a lot of power, as proven by Killzone during the reveal.
 

i-Lo

Member
They probably didn't have hardware to truly work with high levels of tessellation for most of the development process. They're definitely using edge tessellation, but fully tessellating everything would take more time than they probably have before launch, so they're just using LoDs since they have more knowledge of them, wouldn't have to redo any work, and can brute-force quality on PS4 with it.

I did not find any reference to this in their PDF. How are you in the know?

I saw grass pop-in in the BF4 gameplay which was running on some beast PC apparently.

I will wait for the final versions before criticising though.

I saw the light fixtures pop in during the tunnel section as well, and it surprised me. Simply put, regardless of the platform, I am curious whether devs will be developing solutions for this inherent 3D graphics issue.
 

pottuvoi

Banned
If I had to guess, I'd say it's because they have more than enough memory to store as many LOD levels as they want, but they don't have infinite GPU power to use for tessellation, which usually still destroys performance on every GPU. They can kind of pre-compute tessellation and get it performance free.
Also, tessellation cannot handle all the topology changes needed to transform a low-resolution mesh into a high-resolution one (i.e. separate fingers, a pack hanging on a jacket, teeth within the mouth, etc.).

There is also the matter of shader, bone structure, etc. simplifications for distant LoDs; you do not want to run all facial animations and such for a character a couple of pixels high.

Of course, performance can be a lot better when you have a hand- or offline-optimized mesh, especially if the mesh is quite coarse.
I did not find any reference to this in their PDF. How are you in the know?
I certainly didn't see any evidence for that.
 

Blizzje

Member
I already posted this in the 'general' Shadow Fall thread, but I figure I should have asked it here instead. This is where the techies hang out, I suppose? :)

I was wondering about the lighting: it looks really nice, but is the 'vignette' effect from Killzone 3 still being used? I see somewhat darker corners, especially in the top left and top right of the screen. Or does that have to do with the dynamic lighting? Anyone care to explain how that stuff works? I absolutely hate vignetting in Killzone 3 and Mass Effect 3, it's way too pronounced.
 

pottuvoi

Banned
I was wondering about the lighting: it looks really nice, but is the 'vignette' effect from Killzone 3 still being used? I see somewhat darker corners, especially in the top left and top right of the screen. Or does that have to do with the dynamic lighting? Anyone care to explain how that stuff works? I absolutely hate vignetting in Killzone 3 and Mass Effect 3, it's way too pronounced.
Vignetting as used in Killzone and Mass Effect is just a post-process where you multiply the frame by an image with darkened corners.

The actual effect in photography is due to the walls of the lens blocking light.
The Tri-Ace presentation is still the best resource on the subject. (.PPTX)

So it doesn't have anything to do with the lights in the scene and such.
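A minimal sketch of that kind of post-process, assuming a floating-point RGB frame; the falloff curve and strength value are just illustrative, not the actual parameters either game uses:

```python
import numpy as np

def apply_vignette(frame, strength=0.4):
    """Darken the frame toward the corners by multiplying with a radial mask."""
    h, w = frame.shape[:2]
    y, x = np.mgrid[0:h, 0:w]
    # Normalised coordinates: 0 at the centre, ~sqrt(2) at the corners.
    nx = (x - w / 2.0) / (w / 2.0)
    ny = (y - h / 2.0) / (h / 2.0)
    dist2 = nx * nx + ny * ny          # squared distance from screen centre
    mask = np.clip(1.0 - strength * dist2 / 2.0, 0.0, 1.0)
    return frame * mask[..., None]     # corners end up darkest
```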
 

Blizzje

Member
Vignetting as used in Killzone and Mass Effect is just a post-process where you multiply the frame by an image with darkened corners.

The actual effect in photography is due to the walls of the lens blocking light.
The Tri-Ace presentation is still the best resource on the subject. (.PPTX)

So it doesn't have anything to do with the lights in the scene and such.

Can anyone tell if they are using vignetting in Shadow Fall? Like I said, I notice the darker corners in the top left and right of the screen, but I thought it had to do with the light source (the effect seems more pronounced when looking at the sun, or right when the main character is lying down and kills the Helghast with a knife).
Thanks for the clarification, this is really interesting to me for some weird reason. :)
 