That would give him more significance than he deserves. Rico needs to be killed by accident in an easily missed scripted event.
True, that fool needs to be forgotten.
30 fps? I shouldn't be surprised. Next gen isn't even here and it already sucks.
And therein lies GG's ultimate weakness. I don't understand the mentality of a flagship studio outputting 30 fps on their flagship game.
We have been hearing non-stop about how GG was in on PS4 development, so if it's hardware related, then why the hell haven't they mentioned it to Sony in the first place?
8 GB of memory and yet still stuck with 30 fps. Sigh.
Go with 60fps and you get less graphical detail.
Oh, and RAM has nothing to do with framerate.
Also, additional RAM means certain effects that are taxing to generate can be swapped for cheaper ones that look similar (simply more particles, or whatnot, to emulate the previous effect), which reduces processing time and so increases framerate.

It might happen that you choose an approach which is more computationally intensive but uses a lot less memory, in which case this might be untrue.
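A minimal sketch of that memory-for-compute trade, in Python (the table size and the falloff function here are made-up illustrations, not anything from KZ:SF): precomputing an expensive effect into a RAM-resident table turns per-frame work into a cheap lookup, at the cost of memory; recomputing on the fly is the opposite trade.

```python
import math

TABLE_SIZE = 1 << 20  # ~1M floats resident in RAM: the memory cost of the trade

# Pay the computation cost once, up front, and keep the results in RAM.
falloff_table = [math.exp(-(i / TABLE_SIZE) ** 2) for i in range(TABLE_SIZE)]

def falloff_lookup(x: float) -> float:
    """Cheap per-frame path: one index into the precomputed table (x in [0, 1))."""
    return falloff_table[min(int(x * TABLE_SIZE), TABLE_SIZE - 1)]

def falloff_compute(x: float) -> float:
    """Low-memory path: recompute the effect every time it is needed."""
    return math.exp(-x * x)
```

Which side of the trade wins depends on exactly the caveat above: if the recomputation is cheap enough, the table is just wasted RAM.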
30 or 60fps, it's fine by me. The game looked sick.
I can't remember the last flagship exclusive console title that was 60fps outside of Mario and Gran Turismo.
Yeah, I'm totally fine with smooth 30fps as long as it doesn't drop or tear. A lot of people just seem to be under the impression that new hardware means 60fps, but they forget about the moving goalposts of graphics. I hope we get smooth 30fps games next gen instead of the numerous games we have this gen that top out at 30. I love 60fps and want it as much as possible, but I know that's not going to happen in a console setting.

I think some people are thinking too much about Crysis 3 when they think about 30 fps, which is an entirely different thing.
According to Eurogamer, KZ: SF was locked at 30 fps, none of that peaks-and-troughs shit as with Crysis 2/3, where 30fps was often the maximum framerate during gameplay.
An absolutely solid 30 fps is absolutely fine. Would I prefer 60? Of course. But locked 30 fps for higher visual fidelity is perfect.
Killing Rico is the right direction for the franchise. If anything, let us play as Helghast for like 20 minutes on a kill-Rico mission.
You can pre-calculate some effects and store them in RAM to save computation time and hit 60fps.
Is it possible to have console games that never drop below 30fps but aren't limited to 30fps?
That's a bit cruel; KZ: SF is set 30 years after KZ3, so going on a mission to kill an old man seems a bit much, even though he is a dick.
This would be fine for me. The problem with the current gen is games that struggle to hit that 30fps target. My hope, at least, is that with the PS4 devs have enough headroom that we can have games like Killzone: SF and Battlefield 4 looking great, with all (or a lot of) the bells and whistles and crazy chaos going on, without ever dipping below 30.
This is wild speculation, but why KZ1's intro and not KZ2's?
If the game never dips below 30, I am 100% okay with it.
After watching the HQ version of this I saw something really jarring and had to play it again to make sure my eyes weren't bugging out: an NPC woman running by you during a firefight.
I didn't read all of the past 70-something pages so it could be redundant, but it gave me a chuckle to see a textureless/unfinished character running around:
[screenshot of the untextured character]
Of course: the game isn't finished, it was put together just for the meeting, it's not indicative of the final product, etc.
I was actually really impressed with the overall visuals of this demo.
The only KZ game I played was 3 and I enjoyed it though I don't remember a lot of it now. The premise is the most appealing part, with the Cold War/Berlin Wall inspiration.
Here's a link to the vid.
Pertaining to visuals, I noticed two inconsistencies (well, the latter is part of an existing problem):
Observe in this picture how the whole screen zooms in on the target, when only the view through the scope ought to undergo the process:
"So clean."

Well, not clean enough, goddammit. Kill the remaining jaggies!
Whoa whoa... slow down there, tiger... do you know how piss-poor fucking hard it would be to aim with a tiny scope zooming the image while the rest of the screen moves at its own pace? You'd make one tiny movement and 30 feet would go by in the scope when it looks like you only moved an inch.
Zooming is usually handled this way when the scope only takes up a fraction of the screen. If the scope were much larger on screen you could make adjustments, since the player's eyes would miss most of the negative space, but in this case the negative space fills up a larger portion of the screen.
From a design/playability standpoint, it's a smart decision.
Valid point. However, it is a touch too unrealistic for my taste. Given that it basically functions like an ACOG scope, they could bring it closer (as it is in other shooters). When it comes to guns, KZ2 offered quite a bit of realism (spread and recoil). So, consistency matters (to me at least).
I know the chances of what I want happening are in the single digits, but a toggle option between 1080p/30fps and 720p/60fps would seriously kick ass.
For single-player games I don't mind playing at a capped and/or consistent 30fps. For multiplayer, however, the difference between 30 and 60 fps is huge IMO. For those that disagree, I recommend downloading that BF4 60fps footage and comparing it with KZ:SF 30fps gameplay side by side. I mean, hypothetically it should be feasible to achieve 60 fps at 720p, as 1080p has 2.25 times as many pixels to push. But perhaps it's not that easy, and that's the reason we have not heard about it =(
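The pixel arithmetic there checks out; a quick check in Python (counting raster pixels only, and assuming, purely for illustration, that cost scales linearly with pixel count):

```python
px_1080p = 1920 * 1080  # 2,073,600 pixels per frame
px_720p = 1280 * 720    #   921,600 pixels per frame

print(px_1080p / px_720p)  # 2.25

# If pixel work were the only cost, a budget that yields 30fps at 1080p
# would leave room for ~30 * 2.25 = 67fps at 720p. In practice geometry,
# CPU, and other per-frame costs don't shrink with resolution, which is
# the "perhaps it's not that easy" part.
print(30 * px_1080p / px_720p)  # 67.5
```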
I know full well that RAM has nothing to do with framerate. And that's why I find it ironic to see everyone praising Cerny and Sony for the huge amount of RAM when they skimped on the CPU.
The bottleneck for the PS4 is going to be the CPU, and Sony's flagship racer is proof of that. But the point I am trying to make is that the slow gameplay of Killzone right now is just so outdated.
You can have all the brilliant Warzone scenarios and awesome-looking muzzle flash, but a slow game is just what it is. It's like GT at 30fps. No GT fan will ever trade framerate for graphics, even for a pretty graphics-heavy game like GT.
Indeed.
Agreed about GoWIII. The 45 or so fps with motion blur looked wonderful.
Give me a smooth, locked framerate. I don't care if it's 30 or 60, but it has to be butter smooth with absolutely no dips.
God Of War III is probably the best usage of a variable framerate though. Coupled with the motion blur, it looked really, really smooth.
My point is this: the PS4 will realistically have 4+ gigs being used just for the GPU. The GPUs rumored to be in the dev kits are supposedly "less powerful" than in the final kit. Let's say it's a 7850. The largest GPU RAM option for that is about 2 gigs.
2, not 4, not 6.
Devs DID NOT (and likely still don't) have a dev kit with the VRAM that is available in the PS4.
TL;DR: Devs don't have 8, let alone 4 gigs of VRAM.
Yes, but this isn't really a problem. The game assets are always created at very high quality. So if they have more RAM in the final silicon, it means they can use higher quality assets.
I know full well that RAM has nothing to do with framerate. And that's why i find it ironic to see how everyone is praising cerny and sony for the huge amount of ram and yet they skimped on the cpu.
The bottleneck for the ps4 is going to be cpu and sony's flagship racer is prove of that. But the point i am trying to make is the slow gameplay of killzone right now is just so outdated.
You can have all the brilliant warzone scenarios and awesome looking muzzle flash but a slow game is just what it is. Its like GT on 30fps. No Gt fan will ever trade framerate for graphics even for a pretty heavy graphics game like GT
Piledriver core (the CPU used in Trinity) is designed for high clock speeds (turbo up to 4.3 GHz, overclocks up to 8 GHz). In order to reach such high clocks, several sacrifices had to be made. The CPU pipelines had to be made longer, because there's less time to finish each pipeline stage (as clock cycles are shorter). The cache latencies are longer, because there's less time to move data around the chip (during a single clock cycle). The L1 caches are also simpler (less associativity) compared to Jaguar (and Intel designs). In order to combat the IPC loss from these sacrifices, some parts of the chip needed to be beefed up: the ROBs must be larger (more ILP is required to fill longer pipelines and to hide the longer cache latencies and the more frequent misses caused by lower L1 cache associativity) and the branch predictor must be better (since a long pipeline causes more severe branch mispredict penalties). All these extra transistors (and extra power) are needed just to negate the IPC loss caused by the high clock headroom.
A Jaguar compute unit (4 cores) has the same theoretical peak performance per clock as a two-module (4-core) Piledriver. Jaguar has shorter pipelines and better caches (less latency, more associativity). Piledriver has slightly larger ROBs and a slightly better branch predictor, but these are required to negate the disadvantages of its cache design and pipeline length. The per-module shared floating-point pipeline in Piledriver is very good for single-threaded tasks, but for multithreaded workloads the module design is a hindrance, because of various bottlenecks (shared 2-way L1 instruction cache and shared instruction decode). Steamroller will solve some of these bottlenecks (before the end of this year?), but it's still too early to discuss it (with the limited information available).
Jaguar and Piledriver IPC will be in the same ballpark. However, when running these chips at low clocks (<19W), all the transistors spent in the Piledriver design to allow the high clock ceiling are wasted, while all the disadvantages are still present. Thus Piledriver needs more power and more chip area to reach performance similar to Jaguar's. There's no way around this. The Jaguar core has better performance per watt.
My guess:
- Multi-threaded workload (GPU stressed): Jaguar wins by +10% (because of clock difference: 1.6 GHz vs 1.815 GHz)
- Multi-threaded workload (GPU idle): Tie (depends highly on how much PD can turbo clock in this case)
- Single-threaded workload (GPU idle): PD wins by up to +50% (a single core in a PD module runs up to 20% faster when the other core is idling + PD can turbo clock to 2.4 GHz when only a single core is active).
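One way those two numbers fit together, sketched in Python: assume IPC parity (as the post argues) and treat the ~20% module-sharing hit mentioned above as a throughput penalty when both cores in a PD module are busy. The penalty factor is my inference from the post, not an official figure.

```python
# Toy model: throughput ~ IPC * effective clock * active cores, with IPC
# assumed equal for Jaguar and Piledriver (per the post above).

MODULE_PENALTY = 0.8  # inferred: a PD core is ~20% slower when its module-mate is busy

# Multi-threaded, all 4 cores busy:
jaguar_mt = 1.6 * 4                       # Jaguar cores share no critical resources
piledriver_mt = 1.815 * MODULE_PENALTY * 4
print(jaguar_mt / piledriver_mt)          # ~1.10 -> "Jaguar wins by +10%"

# Single-threaded, module-mate idle, turbo engaged:
jaguar_st = 1.6
piledriver_st = 2.4                       # full module resources at 2.4 GHz turbo
print(piledriver_st / jaguar_st)          # 1.5 -> "PD wins by up to +50%"
```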
GPU performance difference is impossible to estimate at this point, because we do not yet know the exact details of the Temash/Kabini APU configurations. However, a Jaguar core takes (considerably) less die space and consumes slightly less power than a Piledriver core of similar performance. This means that AMD could equip the Jaguar-based APU with a more powerful GPU than Trinity's at the same TDP limit. It's all about market segmentation. If AMD plans to place the high-end Kabini APUs against Intel's Ultrabook chips, we might see a more powerful GPU (AMD has the technology available for this already, just look at the PS4 design). However, if AMD sees Kabini as a low-end laptop / netbook platform, they are likely going to focus on making a small chip that is cheap to produce, and limit the GPU options to low-performance models.
You also need more processing power to work with those higher-res assets.
             processing   bandwidth   frame rate
balance      33ms         33ms        30fps
proc'limited 33ms         26ms        30fps
band'limited 26ms         33ms        30fps

             processing   bandwidth   frame rate
balance      17ms         66ms        15fps
proc'limited 17ms         52ms        19fps
band'limited 13ms         66ms        15fps
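Reading those tables: a frame can't finish until both the processing work and the memory traffic are done, so (assuming the two fully overlap) the frame time is set by the slower of the two, i.e. frame rate ≈ 1000 / max(processing ms, bandwidth ms). A tiny sketch reproducing the rows:

```python
def fps(processing_ms: float, bandwidth_ms: float) -> int:
    # Whichever budget is exhausted last sets the frame time.
    return round(1000 / max(processing_ms, bandwidth_ms))

rows = [(33, 33), (33, 26), (26, 33),  # first table: every row lands on 30fps
        (17, 66), (17, 52), (13, 66)]  # second table: bandwidth-bound, 15-19fps
for proc_ms, bw_ms in rows:
    print(f"{proc_ms}ms / {bw_ms}ms -> {fps(proc_ms, bw_ms)}fps")
```

In the second table the bandwidth cost dominates, so cutting processing time (17ms to 13ms) changes nothing, while cutting the bandwidth cost (66ms to 52ms) lifts the frame rate to 19fps.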
60fps or 30fps with much more graphical detail? Personally I'm fine with 30fps.