
Killzone Shadow Fall (PS4) announced, launch title [1080p/30fps]

30 fps? I shouldn't be surprised. Next gen isn't even here and it already sucks.

And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?

8 GB of memory and yet still stuck with 30 fps. Sigh
 
And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?

8 GB of memory and yet still stuck with 30 fps. Sigh

Go with 60fps and you get less graphical detail.

Oh, and RAM has nothing to do with framerate.
 
Hitting a constant 60 fps at 1080p is very hard; you need a fast GPU and a fast CPU. You could also make compromises by removing motion blur, DOF, and AA, or by reducing shadow quality... all of which would result in a shitty-looking game.
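To put numbers on it, the frame budget math is simple: going from 30 to 60 fps halves the time the whole engine gets per frame:

Code:
1000 ms / 30 fps = 33.33 ms per frame
1000 ms / 60 fps = 16.67 ms per frame

Every system (rendering, post-processing, physics, AI) has to fit in that budget, so at 60 fps everything gets half the time.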
 
And there in lies GG ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non stop on how GG was in on ps4 development so if its hardware related then why the hell haven't they mentioned it to sony in first place?

8 GB of memory and yet still stuck with 30 fps. Sigh

60fps or 30fps with much more graphical detail? Personally I'm fine with 30fps.
 
It might happen that you choose an approach that is more computationally intensive but uses a lot less memory, in which case this might be untrue.
Also, additional RAM means certain effects that are taxing to generate can be approximated to look similar, with simply more particles or whatnot emulating the previous effect, reducing processing time (to increase framerate).
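A minimal sketch of that memory-for-compute trade in C, with a plain lookup table standing in (hypothetically) for whatever precomputed effect data an engine would actually store:

Code:
#include <math.h>
#include <stdio.h>

/* Trade RAM for per-frame math: precompute a sine table once instead
   of calling sinf() every time. 4096 floats = 16 KB of RAM. */
#define TABLE_SIZE 4096
#define TWO_PI     6.2831853f
static float sin_table[TABLE_SIZE];

static void init_table(void) {
    for (int i = 0; i < TABLE_SIZE; i++)
        sin_table[i] = sinf((float)i * TWO_PI / TABLE_SIZE);
}

/* Cheap lookup, at the cost of memory and a little accuracy. */
static float fast_sin(float radians) {
    int idx = (int)(radians * (TABLE_SIZE / TWO_PI)) & (TABLE_SIZE - 1);
    return sin_table[idx];
}

int main(void) {
    init_table();
    printf("%f\n", fast_sin(1.5707963f));  /* ~1.0, i.e. sin(pi/2) */
    return 0;
}

Whether this wins anything depends on the workload; as the post says, a more compute-heavy but memory-light approach can flip the trade the other way.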
 

GopherD

Member
And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?

8 GB of memory and yet still stuck with 30 fps. Sigh

How the hell is it a weakness? GG has decided to make a slower, more graphically adept shooter with an open mission structure and a stronger reliance on story-driven gameplay. As a graphical showcase for the PS4 at launch, this is exactly what it should be. 30 fps is sometimes not a technical limitation but a design choice made for particular reasons.

Me and many, many others can't wait to see GG's creation without it being sandboxed into a particular design ethos. I'd rather wait and judge its effect on the game as a whole.
 

Demon Ice

Banned
And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?

8 GB of memory and yet still stuck with 30 fps. Sigh

Haha

I'm guessing you don't know the meaning of at least half of what you just said.

"Maybe next gen needs 16 GB of GDDGDDGDDR6 for 60 FPS!"
 

RoboPlato

I'd be in the dick
And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?

8 GB of memory and yet still stuck with 30 fps. Sigh
I can't remember the last flagship exclusive console title that was 60fps outside of Mario and Gran Turismo.
 
I think some people are thinking too much about Crysis 3 when they think about 30 fps, which is an entirely different thing.

According to Eurogamer, KZ: SF was locked at 30 fps, none of that peaks-and-troughs shit from Crysis 2/3, where 30fps was often the maximum framerate during gameplay.

An absolutely solid 30 fps is fine. Would I prefer 60? Of course. But a locked 30 fps in exchange for higher visual fidelity is perfect.
 

Superflat

Member
If the game never dips below 30, I am 100% okay with it.

After watching the HQ version of this I saw something really jarring and had to play it again to make sure my eyes weren't bugging out. It was an NPC woman running by you during a firefight.

I didn't read all of the past 70-something pages so it could be redundant, but it gave me a chuckle to see a textureless/unfinished character running around:

[image: untextured NPC model]


Of course, game isn't finished/put together just for the meeting/not indicative of final product/etc.

I was actually really impressed with the overall visuals of this demo.

The only KZ game I played was 3 and I enjoyed it though I don't remember a lot of it now. The premise is the most appealing part, with the Cold War/Berlin Wall inspiration.
 
And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.

We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?

8 GB of memory and yet still stuck with 30 fps. Sigh

You were funnier with the crazy Ken avatar; right now, not so much.
 

RoboPlato

I'd be in the dick
I think some people are thinking too much about Crysis 3 when they think about 30 fps, which is an entirely different thing.

According to Eurogamer, KZ: SF was locked at 30 fps, none of that peaks-and-troughs shit from Crysis 2/3, where 30fps was often the maximum framerate during gameplay.

An absolutely solid 30 fps is fine. Would I prefer 60? Of course. But a locked 30 fps in exchange for higher visual fidelity is perfect.
Yeah, I'm totally fine with a smooth 30fps as long as it doesn't drop or tear. A lot of people just seem to be under the impression that new hardware means 60fps, but they forget about the moving goalposts of graphics. I hope we get smooth 30fps games next gen instead of the numerous games we have this gen that merely top out at 30. I love 60fps and want it as much as possible, but I know that's not going to happen in a console setting.
 
Killing Rico is the right direction for the franchise. If anything, let us play as Helghast for like 20 minutes on a kill-Rico mission.

That's a bit cruel. KZ: SF is set 30 years after KZ3, so going on a mission to kill an old man seems a bit much, even though he is a dick.
 

Nizz

Member
Yeah, I'm totally fine with a smooth 30fps as long as it doesn't drop or tear. A lot of people just seem to be under the impression that new hardware means 60fps, but they forget about the moving goalposts of graphics. I hope we get smooth 30fps games next gen instead of the numerous games we have this gen that merely top out at 30. I love 60fps and want it as much as possible, but I know that's not going to happen in a console setting.
This would be fine for me. The problem with the current gen is games that struggle to hit that 30fps target. My hope, at least, is that with the PS4, devs have enough headroom that we can have games like Killzone: SF and Battlefield 4 looking great with all (or a lot of) the bells and whistles and crazy chaos going on without ever dipping below 30.

Although I would like to see less graphically intensive games hit that 60fps mark on PS4.
 

i-Lo

Member
This is wild speculation, but why KZ1's intro and not KZ2's?

My guess: it epitomizes the start of the new conflict. In that speech, Visari outlined what the Helghast have had to overcome to reach where they are, how their sacrifice has made them stronger and more unified, and above all how he was the architect of this success. Lastly, he points to the cause that started the chain reaction.

Honestly, KZ1 and KZ2 have two of the best intro speeches in video games. A stark contrast to what we see in the game. Then again, perhaps it's because they are the ISA that they act like teenage jocks and meatheads.

If the game never dips below 30, I am 100% okay with it.

After watching the HQ version of this I saw something really jarring and had to play it again to make sure my eyes weren't bugging out. It was an NPC woman running by you during a firefight.

I didn't read all of the past 70-something pages so it could be redundant, but it gave me a chuckle to see a textureless/unfinished character running around:

[image: untextured NPC model]

Of course, game isn't finished/put together just for the meeting/not indicative of final product/etc.

I was actually really impressed with the overall visuals of this demo.

The only KZ game I played was 3 and I enjoyed it though I don't remember a lot of it now. The premise is the most appealing part, with the Cold War/Berlin Wall inspiration.

Wow! This is the first time I've even noticed this. Great capture. I love women of clay. Kind of reminds me of inFAMOUS.
 

Piggus

Member
30 fps is fine as long as it's responsive and consistent. I play plenty of games on PC at a locked 30 fps if they are unable to maintain 60 fps (AC3, for example). I mean, the alternative is no vsync, but then you just get a ton of tearing and the experience feels inconsistent. With a locked 30 fps, at least you can't tell when the game is being bogged down.

Maybe it was just the video, but it seemed like there was still some framerate optimization to do. It looked more like 25 fps in some parts, not 30. But they have plenty of time to make it consistent. Still amazing considering it's a launch title.
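For what it's worth, "locked 30" in practice means vsync plus keeping every frame's work under budget; a crude frame limiter looks something like this (a minimal POSIX sketch, purely illustrative; real console engines sync to the display's vblank instead):

Code:
#include <time.h>

#define FRAME_NS 33333333L  /* one 30fps frame = 1/30 s in nanoseconds */

int main(void) {
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int frame = 0; frame < 300; frame++) {
        /* update(); render(); -- the game's actual work goes here */
        next.tv_nsec += FRAME_NS;           /* advance absolute deadline */
        if (next.tv_nsec >= 1000000000L) {  /* carry into seconds */
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        /* sleep to an absolute time so timing drift doesn't accumulate */
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    return 0;
}

If a frame's work ever runs past 33.3 ms, the sleep returns immediately and you get exactly the kind of dips described above.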

You can pre-calculate some effects and store them in RAM to save computation time and hit 60fps.

lmao

computuhs how do they work?! Such a way of thinking is why ATi and Nvidia are able to sell so many shitty low-end GPUs to people who think the amount of VRAM is all that matters. "Dis 1 has 2 gbs so it must b amazin!!!11"
 

Jack_AG

Banned
Pertaining to visuals, I noticed two inconsistencies (well, the latter is part of an existing problem):

Observe in this picture how the whole screen zooms in on the target, when only the sight picture in the scope ought to undergo the process:

Whoa whoa... slow down there, tiger... do you know how piss-poor fucking hard it would be to aim with a tiny scope zooming the image while the rest of the screen moves at its own pace? You'd make one tiny movement and 30 feet would go by in the scope when it looks like you only moved an inch.

Zooming is usually handled this way when the scope only takes up a fraction of the screen. If the scope were much larger on screen you could make adjustments, since the player's eyes will miss most of the negative space; but in this case, the negative space fills up a larger portion of the screen.

From a design/playability standpoint, it's a smart decision.
 
I know full well that RAM has nothing to do with framerate. And that's why I find it ironic to see how everyone is praising Cerny and Sony for the huge amount of RAM and yet they skimped on the CPU.

The bottleneck for the PS4 is going to be the CPU, and Sony's flagship racer is proof of that. But the point I am trying to make is that the slow gameplay of Killzone right now is just so outdated.

You can have all the brilliant warzone scenarios and awesome-looking muzzle flash, but a slow game is just what it is. It's like GT at 30fps. No GT fan will ever trade framerate for graphics, even for a graphically heavy game like GT.
 

Kambing

Member
I know the chances of what I want happening are in the single digits, but a toggle option between 1080p/30fps and 720p/60fps would seriously kick ass.

For single-player games I don't mind playing at a capped and/or consistent 30fps. For multiplayer, however, the difference between 30 and 60 fps is huge IMO. For those that disagree, I recommend downloading that BF4 60fps footage and comparing it with the KZ:SF 30fps gameplay side by side. I mean, hypothetically it should be feasible to achieve 60 fps at 720p, as 1080p has 2.25 times as many pixels to push. But perhaps it's not that easy, and that's the reason we have not heard about it =(
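The 2.25x figure is just the pixel counts:

Code:
1920 x 1080 = 2,073,600 pixels
1280 x  720 =   921,600 pixels
2,073,600 / 921,600 = 2.25

Fill rate is only part of the frame cost, though; vertex work, AI, physics, and the like don't shrink with resolution, which may be why a straight 720p/60 toggle isn't as simple as the pixel math suggests.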
 

i-Lo

Member
Whoa whoa... slow down there, tiger... do you know how piss-poor fucking hard it would be to aim with a tiny scope zooming the image while the rest of the screen moves at its own pace? You'd make one tiny movement and 30 feet would go by in the scope when it looks like you only moved an inch.

Zooming is usually handled this way when the scope only takes up a fraction of the screen. If the scope were much larger on screen you could make adjustments, since the player's eyes will miss most of the negative space; but in this case, the negative space fills up a larger portion of the screen.

From a design/playability standpoint, it's a smart decision.

Valid point. However, it is a touch too unrealistic for my taste. Given that it basically functions like an ACOG scope, they could bring it closer (like it is in other shooters). When it comes to guns, KZ2 offered quite a bit of realism (spread and recoil). So consistency matters (to me, at least).
 

RoboPlato

I'd be in the dick
Valid point. However, it is a touch too unrealistic for my taste. Given that it basically functions like an ACOG scope, they could bring it closer (like it is in other shooters). When it comes to guns, KZ2 offered quite a bit of realism (spread and recoil). So consistency matters (to me, at least).

This is something that's bothered me for a while too, but pretty much all games do it. It's a complete usability/playability issue, and I'm fine with that as a solution since there's good reason behind it.
 

Drencrom

Member
I know the chances of what I want happening are in the single digits, but a toggle option between 1080p/30fps and 720p/60fps would seriously kick ass.

For single-player games I don't mind playing at a capped and/or consistent 30fps. For multiplayer, however, the difference between 30 and 60 fps is huge IMO. For those that disagree, I recommend downloading that BF4 60fps footage and comparing it with the KZ:SF 30fps gameplay side by side. I mean, hypothetically it should be feasible to achieve 60 fps at 720p, as 1080p has 2.25 times as many pixels to push. But perhaps it's not that easy, and that's the reason we have not heard about it =(

This would be awesome
 

RoboPlato

I'd be in the dick
I know full well that RAM has nothing to do with framerate. And that's why I find it ironic to see how everyone is praising Cerny and Sony for the huge amount of RAM and yet they skimped on the CPU.

The bottleneck for the PS4 is going to be the CPU, and Sony's flagship racer is proof of that. But the point I am trying to make is that the slow gameplay of Killzone right now is just so outdated.

You can have all the brilliant warzone scenarios and awesome-looking muzzle flash, but a slow game is just what it is. It's like GT at 30fps. No GT fan will ever trade framerate for graphics, even for a graphically heavy game like GT.

I don't think the CPU is going to be that bad on the PS4, considering a lot of the stuff that CPUs traditionally did in previous gens is handled by dedicated hardware. Audio rendering and video decoding have their own dedicated chips, and physics processing/advanced lighting calculations will be done by the GPU. The CPU will pretty much only be doing AI and geometry setup, which it should be sufficient for.
 
If the game never dips below 30, I am 100% okay with it.

After watching the HQ version of this I saw something really jarring and had to play it again to make sure my eyes weren't bugging out. It was an NPC woman running by you during a firefight.

I didn't read all of the past 70-something pages so it could be redundant, but it gave me a chuckle to see a textureless/unfinished character running around:

[image: untextured NPC model]

Of course, game isn't finished/put together just for the meeting/not indicative of final product/etc.

I was actually really impressed with the overall visuals of this demo.

The only KZ game I played was 3 and I enjoyed it though I don't remember a lot of it now. The premise is the most appealing part, with the Cold War/Berlin Wall inspiration.

lol nice catch!!
 

velociraptor

Junior Member
Yeah, I'm totally fine with a smooth 30fps as long as it doesn't drop or tear. A lot of people just seem to be under the impression that new hardware means 60fps, but they forget about the moving goalposts of graphics. I hope we get smooth 30fps games next gen instead of the numerous games we have this gen that merely top out at 30. I love 60fps and want it as much as possible, but I know that's not going to happen in a console setting.
Indeed.

Give me a smooth, locked framerate. I don't care if it's 30 or 60, but it has to be butter smooth with absolutely no dips.

God of War III is probably the best use of a variable framerate, though. Coupled with the motion blur, it looked really, really smooth.
 

RoboPlato

I'd be in the dick
Indeed.

Give me a smooth, locked framerate. I don't care if it's 30 or 60, but it has to be butter smooth with absolutely no dips.

God of War III is probably the best use of a variable framerate, though. Coupled with the motion blur, it looked really, really smooth.
Agreed about GoWIII. The 45 or so FPS with motion blur looked wonderful.
 

Kleegamefan

K. LEE GAIDEN
I know full well that RAM has nothing to do with framerate. And that's why I find it ironic to see how everyone is praising Cerny and Sony for the huge amount of RAM and yet they skimped on the CPU.

The bottleneck for the PS4 is going to be the CPU, and Sony's flagship racer is proof of that. But the point I am trying to make is that the slow gameplay of Killzone right now is just so outdated.

You can have all the brilliant warzone scenarios and awesome-looking muzzle flash, but a slow game is just what it is. It's like GT at 30fps. No GT fan will ever trade framerate for graphics, even for a graphically heavy game like GT.

PC gaming is waiting for you, because in terms of hardware power, the PS4 is probably the best you are gonna get for the next 5 or so years.....

In other words, Xbox 3 or (lol) Wii U is not gonna give you more power either -_-

Oh, and btw, developers are praising the 8GB of fast RAM from a game design point of view as well as graphics.......consider the PS360 gen: it didn't matter how much hardware power Xenon/Xenos/RSX/Cell had, if the game's scope was too ambitious and couldn't fit into RAM, sacrifices had to be made to pare the game's scope down to fit into (the too small) RAM...


This is what I see as most exciting for game developers.....the flexibility of scope afforded by dat 8gb
 
My point is this: the PS4 will realistically have 4+ gigs just being used by the GPU. The GPU that's rumored to be in the dev kits is supposedly "less powerful" than the one in the final kit. Let's say it's a 7850. The largest VRAM option for that card is about 2 gigs.

2, not 4, not 6.

Devs DID NOT (and likely still don't) have a dev kit with the amount of VRAM that will be available in the PS4.

TL;DR: Devs don't have 8, let alone 4, gigs of VRAM.

Yes, but this isn't really a problem. Game assets are always created at very high quality, so if there's more RAM in the final silicon, they can simply use higher-quality assets.
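For scale, some purely illustrative texture math (uncompressed RGBA, ignoring block compression and mip chains):

Code:
2048 x 2048 x 4 bytes = 16 MB
4096 x 4096 x 4 bytes = 64 MB

So a bigger RAM budget largely decides which resolution of the already-authored asset actually ships.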
 
I know full well that RAM has nothing to do with framerate. And that's why I find it ironic to see how everyone is praising Cerny and Sony for the huge amount of RAM and yet they skimped on the CPU.

The bottleneck for the PS4 is going to be the CPU, and Sony's flagship racer is proof of that. But the point I am trying to make is that the slow gameplay of Killzone right now is just so outdated.

You can have all the brilliant warzone scenarios and awesome-looking muzzle flash, but a slow game is just what it is. It's like GT at 30fps. No GT fan will ever trade framerate for graphics, even for a graphically heavy game like GT.

It won't. The PS4 CPU has plenty of power and it uses one of the best architectures AMD has created to date. The CPU is the best you can have considering the TDP limits.

http://beyond3d.com/showpost.php?p=1...&postcount=155

Piledriver core (CPU used in Trinity) is designed for high clock speeds (turbo up to 4.3 GHz, overclock up to 8 GHz). In order to reach such high clocks, several sacrifices had to be made. The CPU pipelines had to be made longer, because there's less time to finish each pipeline stage (as clock cycles are shorter). The cache latencies are longer, because there's less time to move data around the chip (during a single clock cycle). The L1 caches are also simpler (less associativity) compared to Jaguar (and Intel designs). In order to combat the IPC loss of these sacrifices, some parts of the chip needed to be beefed up: The ROBs must be larger (more TLP is required to fill longer pipelines, more TLP is required to hide longer cache latencies / more often occurring misses because of lower L1 cache associativity) and the branch predictor must be better (since long pipeline causes more severe branch mis-predict penalties). All these extra transistors (and extra power) are needed just to negate the IPC loss caused by the high clock headroom.

Jaguar compute unit (4 cores) has the same theoretical peak performance per clock as a two-module (4 core) Piledriver. Jaguar has shorter pipelines and better caches (less latency, more associativity). Piledriver has slightly larger ROBs and a slightly better branch predictor. But these are required to negate the disadvantages in the cache design and the pipeline length. The per-module shared floating point pipeline in Piledriver is very good for single threaded tasks, but for multithreaded workloads, the module design is a hindrance, because of various bottlenecks (shared 2-way L1 instruction cache and shared instruction decode). Steamroller will solve some of these bottlenecks (before the end of this year?), but it's still too early to discuss it yet (with the limited information available).

Jaguar and Piledriver IPC will be in the same ballpark. However, when running these chips at low clocks (<19W), all the transistors spent in the Piledriver design that allow the high clock ceiling are wasted, but all the disadvantages are still present. Thus Piledriver needs more power and more chip area to reach performance similar to Jaguar's. There's no way around this. The Jaguar core has better performance per watt.

My guess:
- Multi threaded workload (GPU stressed): Jaguar wins by +10% (because of clock difference: 1.6 GHz vs 1.815 GHz)
- Multi threaded workload (GPU idle): Tie(depends highly on how much PD can turbo clock in this case)
- Single threaded workload (GPU idle): PD wins by up to +50% (single core in PD module runs up to 20% faster when the other core is idling + PD can turbo clock to 2.4 GHz when only a single core is active).

GPU performance difference is impossible to estimate at this point, because we do not yet know exact details about the Temash/Kabini APU configurations. However Jaguar core takes (considerably) less die space and consumes slightly less power than a Piledriver core (of similar performance). This means that AMD could equip the Jaguar based APU with a more powerful GPU than Trinity at the same TDP limit. It's all about market segmentation. If AMD plans to place the high end Kabini APUs against Intel's Ultrabook chips, we might see a more powerful GPU (AMD has technology available for this already, just look at the PS4 design). However if AMD sees Kabini as a low end laptop / netbook platform, they are likely going to focus on making a small chip that is cheap to produce, and limit the GPU options to low performance models.

The Xbox360/PS3 CPUs are pretty shitty compared to the PS4 CPU.
 

lord pie

Member
You can pre-calculate some effects and store them in RAM to save computation time and hit 60fps.

The length of time taken to process and render a frame is determined by the maximum of the amount of processing performed and the amount of memory that is read/written.
Very simply, the maximum of flops and bandwidth.

What you are saying is that by increasing the amount of memory used, you can reduce the amount of processing performed. This is typically called caching, and for the most part it is true; however, it misses an important subtlety in the equation above.

What happens if processing and bandwidth time are already in a roughly balanced state?
A game will be engineered to balance processing and bandwidth requirements. This is part of optimisation. A common optimisation is to trade one for the other when you can measure that one is overused and the other is underused.


However, you are suggesting that you can halve total processing time for all systems by some increase in memory use (which will correspond with an increase in bandwidth). Let's say that in order to halve processing, you have to double bandwidth (very approximate, but probably fairly realistic).

Now, let's also say that the bandwidth/processing demands of the game are already roughly balanced (say, within +/-20%), as the game has been well engineered and optimised by a team of highly skilled experts.
What you get is the following three worst cases for borderline 30fps:

Code:
		processing	bandwidth	frame rate
balance		33ms		33ms		30fps
proc'limited	33ms		26ms		30fps
band'limited	26ms		33ms		30fps

Ok?
That hopefully makes sense.

Now, let's consider what happens if you trade a 2x bandwidth hit for half the processing in the exact same cases:

Code:
		processing	bandwidth	frame rate
balance		17ms		66ms		15fps
proc'limited	17ms		52ms		19fps
band'limited	13ms		66ms		15fps

With this in mind, you can draw a conclusion:
The faster the frame rate, the less processing that can occur and the less bandwidth a game can use per frame.
The lower the bandwidth available, the less unique memory a game can access within a frame*.
The less unique memory a game can access per frame almost always corresponds with using less memory in total.

Hence we can conclude that in the majority of cases a 60fps game will use less memory than a 30fps game.

* The amount of unique memory accessed per frame also depends on how much memory is read/written more than once in a frame, a non-trivial problem that depends highly on the engine architecture and fundamental design choices made early in the project.
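Both tables fall straight out of the frame time = max(processing, bandwidth) model; a quick sketch in C to reproduce them, using the same approximate numbers as above:

Code:
#include <stdio.h>

/* frame time = max(processing, bandwidth); fps = 1000 / frame time.
   The trade considered above: halve processing, double bandwidth. */
static double max2(double a, double b) { return a > b ? a : b; }

int main(void) {
    const char  *name[] = { "balance", "proc'limited", "band'limited" };
    const double proc[] = { 33, 33, 26 };
    const double band[] = { 33, 26, 33 };
    for (int i = 0; i < 3; i++) {
        double before = max2(proc[i], band[i]);
        double after  = max2(proc[i] / 2.0, band[i] * 2.0);
        printf("%-13s before: %2.0f fps   after trade: %2.0f fps\n",
               name[i], 1000.0 / before, 1000.0 / after);
    }
    return 0;
}

which prints 30 fps for all three cases before the trade and 15/19/15 fps after, matching the tables.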
 
60fps or 30fps with much more graphical detail? Personally I'm fine with 30fps.

This.

Halo has always been fun at 30fps.

What will make a bigger difference is if they can make the controls feel less stodgy than they did in the PS3 titles.

Locking it at 30 and going to town on the effects and lighting gets my vote.
 

Kerub

Banned
Of course the framerate will double if the RAM size is doubled. Can't you see how the numbers add up? If PS4 had 6 GB RAM, all games would run at 45 fps.
 

jaosobno

Member
30 fps? I shouldn't be surprised. Next gen isn't even here and it already sucks.

I see that many have already forgotten how the present gen started; we had sub-30 FPS games from the very start (Heavenly Sword, Oblivion, etc.). The present gen was all about the "power of the Cell", revolutionary unified shaders, monstrous GPUs and CPUs, etc. One would expect such heavy-duty hardware to produce nothing less than 720p@60fps, but somehow we got stuck with 30 or (often) fewer FPS and 720p or (quite often) lower resolution.

Enter next gen, and now rock-solid 1080p@30fps with insane particle and alpha effects, incredible reflections, high-res textures, physics, AA, tessellation, SSAO, etc. somehow sucks.
 

Limanima

Member
There are some really awesome moments in that demo. The fly-over of the city at the beginning and the zoom-out at the end are superb. Can't imagine what a God of War on PS4 will be like when KZ is already showing such a huge scale.

Regarding 30fps vs 60fps, I think this will be the same as this gen: only a few titles will be 60fps.
My TV does image interpolation, so 30fps games have a smooth look, almost like 60fps; I'm fine with 30.
 