
More Praise: Next Consoles Are Delivering High FrameRate In Spades

Gamer79

Predicts the worst decade for Sony starting 2022
Why not just wait a few months and get the PS5 at RRP? Play some smaller, less demanding games on your old PC in the meantime.
My plan! I am playing a hell of a great library on the Nintendo Switch and soon to get Oculus Quest 2. I can wait until the ps5 gets more real next generation games.
 

VFXVeteran

Banned
Nah, 60fps games in 3 years will look better than current 60fps games.
Same goes for 30fps games.

ALL 60FPS games right now use dynamic resolution. Unless you are talking about art direction or lowering floor resolution even more than 1440p, I don't see MORE tech thrown on top. If the consoles had that much extra power to give up then we'd be seeing native 4k. I'm not sure why most of us think there is a significant amount of untapped power in these consoles when we clearly know what their limitations are (2080s don't run games at native 4k/60FPS).
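For anyone curious what "dynamic resolution" actually does under the hood, here is a minimal sketch, not taken from any particular engine: the render scale is nudged up or down each frame depending on how close the GPU time is to the 16.6 ms budget, and the result is upscaled to the output resolution. The names and thresholds below are illustrative assumptions.

```cpp
// Minimal dynamic-resolution heuristic (illustrative only, not from any shipped engine).
// Each frame, compare measured GPU time to the 60fps budget and nudge the
// internal render scale between a ~1440p floor and native 4K.
#include <algorithm>

struct DynamicResolution {
    float scale = 1.0f;                          // 1.0 = native output resolution
    static constexpr float kBudgetMs = 16.6f;    // 60fps frame budget
    static constexpr float kMinScale = 0.66f;    // ~1440p floor on a 4K output
    static constexpr float kStep     = 0.02f;    // small step to avoid visible pumping

    void update(float gpuFrameMs) {
        if (gpuFrameMs > kBudgetMs * 0.95f)      // running hot: drop resolution
            scale -= kStep;
        else if (gpuFrameMs < kBudgetMs * 0.80f) // plenty of headroom: raise it
            scale += kStep;
        scale = std::clamp(scale, kMinScale, 1.0f);
    }

    // Internal render width for a given output width (e.g. 3840 for 4K).
    int renderWidth(int outputWidth) const {
        return static_cast<int>(outputWidth * scale);
    }
};
```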
 

IntentionalPun

Ask me about my wife's perfect butthole
ALL 60FPS games right now use dynamic resolution. Unless you are talking about art direction or lowering floor resolution even more than 1440p, I don't see MORE tech thrown on top. If the consoles had that much extra power to give up then we'd be seeing native 4k. I'm not sure why most of us think there is a significant amount of untapped power in these consoles when we clearly know what their limitations are (2080s don't run games at native 4k/60FPS).
There are an entirely new generation of engines still yet to come to next gen consoles, that were designed with them in mind.

From UE 5 to ID Tech, to everyone else in the industry.

Not to mention the likelihood of improvements to dynamic scaling technology for next-gen consoles.

Every console generation sees games improve their fidelity over time, and they do so while historically targeting 30 FPS. They get better over the generation while maintaining those targets.

This generation is likely to be even more dramatic over time, not less. And it looks like 60FPS is popular with both devs and consumers, so there's no real reason for it not to get better over the gen.
 
This didn't happen the same way during the previous generation switch because the PS4 and XBO had CPUs that were barely more powerful than the previous generation. It was really just a GPU and memory leap that time.
I'd argue the CELL BE was more powerful than those pitiful Jaguar CPUs, but what do I know..
 

Krisprolls

Banned
or really old games

I know you're trolling and shitposting for fun, but Demon's Souls is 100% a remake from scratch and it's the most next-gen game right now.

Graphics, lighting and sounds are incredible. Everything is ultra smooth with instant loading too.
 

Skifi28

Member
It's not locked though, it dips into 30-40fps on occasion.

Most games have drops on occasion (even on powerful PCs), Valhalla certainly does as I'm sure others do. It's still at 60 99% of the time, so how does that disprove the original article?
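As a side note, the "99% of the time" sort of claim is easy to quantify if you log frame times. A rough sketch of the arithmetic, assuming the frame-time list comes from some external capture tool; the function name and tolerance below are made up for illustration.

```cpp
// Rough sketch: what fraction of captured frames hit a 60fps target.
// frameTimesMs is assumed to come from an external capture; a small tolerance
// keeps 16.8 ms frames from being counted as "drops".
#include <cstddef>
#include <vector>

double fractionAtTarget(const std::vector<double>& frameTimesMs,
                        double targetMs = 1000.0 / 60.0,
                        double toleranceMs = 0.5) {
    if (frameTimesMs.empty()) return 0.0;
    std::size_t onTarget = 0;
    for (double t : frameTimesMs)
        if (t <= targetMs + toleranceMs) ++onTarget;
    return static_cast<double>(onTarget) / frameTimesMs.size();
}
// A 40-second stretch running in the 50s can still leave the whole-game figure
// well above 90%, which is why both sides of this argument can feel "right".
```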
 

waquzy

Member
Most games have drops on occasion (even on powerful PCs), Valhalla certainly does as I'm sure others do. It's still at 60 99% of the time, so how does that disprove the original article?

99% of the time, sure buddy... it is a poorly optimised game.

 

Skifi28

Member
99% of the time, sure buddy... it is a poorly optimised game.



So you posted 40 seconds of a 100 hour game that runs in the 50s instead of 60fps. That's nice, but it doesn't disprove my point, on top of having nothing to do with the thread anymore.

Let me remind you that this thread is about the majority of games released on the new consoles being 60fps. Posting a 30-second clip of a game running at 55fps is just silly; I'm not even sure what point you are trying to make. Perhaps that a game doesn't count as 60fps if there's a single dropped frame? I don't understand you.
 

waquzy

Member
So you posted 40 seconds of a 100 hour game that runs in the 50s instead of 60fps. That's nice, but it doesn't disprove my point, on top of having nothing to do with the thread anymore.

Let me remind you that this thread is about the majority of games released on the new consoles being 60fps. Posting a 30-second clip of a game running at 55fps is just silly; I'm not even sure what point you are trying to make.

lol, I posted a 40-second clip because that was the length of the clip.
My point is the game does not run at 60fps 99% of the time, it's more like 80% of the time. Have you even played the game yet? If you had, you would know that the shooting sequence is not even the most demanding area of the game, not by a mile. In some areas it will drop below 40fps.
 

Skifi28

Member
lol, I posted a 40-second clip because that was the length of the clip.
My point is the game does not run at 60fps 99% of the time, it's more like 80% of the time. Have you even played the game yet? If you had, you would know that the shooting sequence is not even the most demanding area of the game, not by a mile. In some areas it will drop below 40fps.

I actually have it on PS5 and I have yet to see any areas where it drops that low, including that sequence; it probably performs better than on the X. Once again though, what is your point? Valhalla also drops into the 50s quite often, is it not a 60fps game? I think I'm gonna stop now because we're just derailing the thread and this adds nothing to the original discussion.
 

waquzy

Member
I actually have it on PS5 and I have yet to see any areas where it drops that low, including that sequence; it probably performs better than on the X. Once again though, what is your point? Valhalla also drops into the 50s quite often, is it not a 60fps game? I think I'm gonna stop now because we're just derailing the thread and this adds nothing to the original discussion.

I have the game on PS5 as well, so you need to go and play a bit more, or maybe you can't tell the difference between a locked 60fps and 40fps. Anyway, the point I was making is that your claim of 60fps 99% of the time on next-gen consoles isn't true. That's all.
 

Skifi28

Member
I have the game on PS5 as well, so you need to go and play a bit more, or maybe you can't tell the difference between a locked 60fps and 40fps. Anyway, the point I was making is that your claim of 60fps 99% of the time on next-gen consoles isn't true. That's all.

Your original post that triggered this discussion was that Cyberpunk was not a 60fps game, which I think it is despite a few drops. We can argue percentages, whether it's 99% or 80% or 93.2%, but it's pointless and doesn't add anything to what the thread is about.
 

waquzy

Member
Your original post that triggered this discussion was that Cyberpunk was not a 60fps game, which I think it is despite a few drops. We can argue percentages, whether it's 99% or 80% or 93.2%, but it's pointless and doesn't add anything to what the thread is about.

The thread is about next gen consoles delivering frame rate in spades. I pointed out that it's not the case with Cyberpunk 2077, where next gen consoles are failing to deliver it in spades.
 

Soodanim

Member
This is good. This is the sort of thing that could bring me back to console.

Whether it continues will depend on the metrics they get, and I hope developers share that information so we can get an idea of how frequently performance modes are used.
 

VFXVeteran

Banned
There are an entirely new generation of engines still yet to come to next gen consoles, that were designed with them in mind.

From UE 5 to ID Tech, to everyone else in the industry.

Not to mention the likelihood of improvements to dynamic scaling technology for next-gen consoles.

Every console generation sees games improve their fidelity over time, and they do so while historically targeting 30 FPS. They get better over the generation while maintaining those targets.

This generation is likely to be even more dramatic over time, not less. And it looks like 60FPS is popular with both devs and consumers, so there's no real reason for it not to get better over the gen.

I never said there won't be improvements. My argument is that there will NOT be giant leaps like most console lovers here seem to hope for, and I have the evidence of last-gen games to prove it. Look, we have gone back and forth on this across several threads, so I think what I'll do is start an OT on the matter so we can keep track during the course of the generation.
 

Skifi28

Member
The thread is about next gen consoles delivering frame rate in spades. I pointed out that it's not the case with Cyberpunk 2077, where next gen consoles are failing to deliver it in spades.

It's a good thing your "in spades" is different to mine in this case.
 

Matt_Fox

Member
Whatever some may claim about 'game x' or 'game y', imho we haven't yet seen a game that can make a bona fide claim to be a generational leap in graphics over PS4 / Xbox One.

It may turn out that next gen never brings that graphical leap, but instead we get 60fps / 4K.
 

Hunnybun

Member
Whatever some may claim about 'game x' or 'game y', imho we haven't yet seen a game that can make a bona fide claim to be a generational leap in graphics over PS4 / Xbox One.

It may turn out that next gen never brings that graphical leap, but instead we get 60fps / 4K.

It's subjective but imo Ratchet on PS5 looks as big a leap over the PS4 game as that was over the PS3 ones.
 

RoadHazard

Gold Member
I'd argue the CELL BE was more powerful than those pitiful Jaguar CPUs, but what do I know..

If used to its fullest, yeah, it was pretty beastly. But that basically only happened with Sony 1st party games. For most devs the Jaguar probably gave a bit better performance with less effort.
 

VFXVeteran

Banned
Reconstruction/Scaling techniques are now the standard across the board. Native resolution is bordering on redundant at this point.

Doesn't take away the fact that reconstruction costs *less* GPU power than just rendering natively. My point is that there isn't enough excess power to render natively @ 60FPS, and once a developer goes off and starts to add significantly more rendering algorithms, those will certainly take more GPU cycles and force a reconstruction technique anyway. We are already at the reconstruction technique with what we have on display now. Where are the extra GPU cycles?
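To put rough numbers on that point, here is a back-of-the-envelope sketch, under the simplifying assumption that pixel-shading cost scales roughly linearly with pixel count:

```cpp
// Back-of-the-envelope pixel counts, assuming shading cost scales ~linearly with pixels.
#include <cstdio>

int main() {
    const long long native4k = 3840LL * 2160;   // ~8.29M pixels
    const long long p1440    = 2560LL * 1440;   // ~3.69M pixels
    std::printf("4K / 1440p pixel ratio: %.2fx\n",
                static_cast<double>(native4k) / p1440);  // ~2.25x
    // Reconstructing from a 1440p internal resolution shades well under half the
    // pixels of native 4K; that saved budget is what pays for the 60fps modes we
    // see now, rather than spare cycles left over on top of native rendering.
    return 0;
}
```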
 
ALL 60FPS games right now use dynamic resolution. Unless you are talking about art direction or lowering floor resolution even more than 1440p, I don't see MORE tech thrown on top. If the consoles had that much extra power to give up then we'd be seeing native 4k. I'm not sure why most of us think there is a significant amount of untapped power in these consoles when we clearly know what their limitations are (2080s don't run games at native 4k/60FPS).
Microsoft will be getting new software tools

Sony and devs will be getting to grips with the variable PS5 clocks.

It's really early days and I can see 1440p upscaled @ 60fps or fauxK@60fps

Add to that, the SSD being used on a closed architecture is a first. Let's see what tricks devs can do. Because Forza on the 1X (Horizon?) is one of the best looking games I've ever seen. Likewise Spidey/GoW/Ghost. Those devs made those games look that good on shite hardware. Imagine what they can do with some real power.
 

VFXVeteran

Banned
Microsoft will be getting new software tools

Sony and devs will be getting to grips with the variable PS5 clocks.

It's really early days and I can see 1440p upscaled @ 60fps or fauxK@60fps

Add to that, the SSD being used on a closed architecture is a first. Let's see what tricks devs can do. Because Forza on the 1X (Horizon?) is one of the best looking games I've ever seen. Likewise Spidey/GoW/Ghost. Those devs made those games look that good on shite hardware. Imagine what they can do with some real power.

This is the hangup I have with 99% of people who think that what they see now is NOT at least 50% of what they will see in the future. The evidence just doesn't support this. I am going to start a thread on this very thought process and we will follow it throughout the generation. I think people on these boards severely underestimate developer's knowledge (and previous experience) in rendering. I also think people severely lack understanding of simple linear and exponential behavior when it concerns "cost" of rendering (i.e. having a large cache doesn't mean 10x more polygons on screen). I'd like to give technical details as to why several of you won't see what you think you will this generation. It was already shown last generation but people tend to avoid the conversation (comparing early last gen games to late last gen games on a technical level).
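One concrete piece of arithmetic behind that argument, as a simplified sketch that ignores CPU, bandwidth and reconstruction entirely:

```cpp
// Simplified throughput arithmetic: moving from 1080p/30 to 4K/60 alone needs
// ~8x the pixel throughput, before any new rendering features are added.
#include <cstdio>

int main() {
    const double pixels1080p = 1920.0 * 1080.0;
    const double pixels4k    = 3840.0 * 2160.0;
    const double resFactor   = pixels4k / pixels1080p;  // 4x pixels
    const double fpsFactor   = 60.0 / 30.0;             // 2x frames per second
    std::printf("Pixel throughput factor: %.0fx\n", resFactor * fpsFactor); // 8x
    // The raw PS4 -> PS5 GPU uplift is roughly 5-6x in TFLOP terms, which this
    // jump alone already consumes; that is the sense in which there is little
    // "untapped" headroom sitting on top of what launch games show.
    return 0;
}
```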
 

Clear

CliffyB's Cock Holster
Doesn't take away the fact that reconstruction costs *less* GPU power than just rendering natively. My point is that there isn't enough excess power to render natively @ 60FPS, and once a developer goes off and starts to add significantly more rendering algorithms, those will certainly take more GPU cycles and force a reconstruction technique anyway. We are already at the reconstruction technique with what we have on display now. Where are the extra GPU cycles?

Same as anything else in computing. When you exhaust options to improve efficiency at the lowest level of operation, you transfer your attention to mid- and then high-level efficiencies. Because if you can shave down the number of low-level ops needed to achieve the desired end result, you have an effective performance gain.
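A trivial generic example of that shift from low-level to higher-level efficiency, not tied to any engine: instead of making each draw cheaper, you skip whole draws that can't contribute to the image. The types and names below are invented for illustration.

```cpp
// Generic sketch of a "high-level" efficiency: cull whole objects before they
// ever generate low-level GPU work, instead of micro-optimising each draw.
#include <vector>

struct Sphere { float x, y, z, radius; };
struct Plane  { float nx, ny, nz, d; };          // plane equation: n.p + d = 0
struct Object { Sphere bounds; /* mesh, material, ... */ };

bool outsidePlane(const Sphere& s, const Plane& p) {
    float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
    return dist < -s.radius;                     // sphere fully on the negative side
}

// Returns only the objects whose bounds intersect the view frustum
// (planes assumed to face inward).
std::vector<const Object*> frustumCull(const std::vector<Object>& scene,
                                       const Plane (&frustum)[6]) {
    std::vector<const Object*> visible;
    for (const Object& obj : scene) {
        bool culled = false;
        for (const Plane& p : frustum)
            if (outsidePlane(obj.bounds, p)) { culled = true; break; }
        if (!culled) visible.push_back(&obj);    // everything else never costs GPU ops
    }
    return visible;
}
```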
 
This is the hangup I have with 99% of people who think that what they see now is NOT at least 50% of what they will see in the future. The evidence just doesn't support this. I am going to start a thread on this very thought process and we will follow it throughout the generation. I think people on these boards severely underestimate developer's knowledge (and previous experience) in rendering. I think people severely lack understanding of simple linear and exponential behavior when it concerns "cost" of rendering (i.e. having a large cache doesn't mean 10x more polygons on screen). I'd like to give technical details as to why several of you won't see what you think you will this generation. It was already shown last generation but people tend to avoid the conversation (comparing early last gen games to late last gen games on a technical level).
I won't disagree with anything you say technically, you've got more knowledge than me, hands down. However, when you say the evidence doesn't support this, I have to disagree, reginald.

The evidence is in looking at any launch game of any console and comparing it to the swan song. Was Gears of War 1 beautiful? Yep. Gears of War 3 blew it out of the water though. Same with PS3, look at TLoU, or OG Halo and Halo 2... Some tricks will be pulled out of the bag, some boffin will come up with a new rendering technique or speed up transfers or some magical mumbo-jumbo I'm not even going to pretend I understand.

Six months ago I was being told that 4K or fauxK with RT was impossible. Now launch games have it, in spades, at 60fps.
 

VFXVeteran

Banned
Same as anything else in computing. When you exhaust options to improve efficiency at the lowest level of operation, you transfer your attention to mid- and then high-level efficiencies. Because if you can shave down the number of low-level ops needed to achieve the desired end result, you have an effective performance gain.

Agreed. The question is whether it will make a noticeable difference in the overall render of the scene. I will say confidently: no.

Discuss

 

VFXVeteran

Banned
I won't disagree with anything you say technically, you've got more knowledge than me, hands down. However, when you say the evidence doesn't support this, I have to disagree, reginald.

The evidence is in looking at any launch game of any console and comparing it to the swan song. Was Gears of War 1 beautiful? Yep. Gears of War 3 blew it out of the water though. Same with PS3, look at TLoU, or OG Halo and Halo 2... Some tricks will be pulled out of the bag, some boffin will come up with a new rendering technique or speed up transfers or some magical mumbo-jumbo I'm not even going to pretend I understand.

Six months ago I was being told that 4K or fauxK with RT was impossible. Now launch games have it, in spades, at 60fps.


Discuss:


And bring examples. This is a thread about proving what you say.
 
Discuss:


And bring examples. This is a thread about proving what you say.
I honestly can't be bothered. I don't mean that in a bad way, but if people can't remember the evolution of games over the past 3 decades, nothing I bring to the table will change their mind. 🤷‍♂️
 

VFXVeteran

Banned
I honestly can't be bothered. I don't mean that in a bad way, but if people can't remember the evolution of games over the past 3 decades, nothing I bring to the table will change their mind. 🤷‍♂️

That's a very dismissive comment and you also completely misrepresented my argument. My claim has ALWAYS been WITHIN a generation. Not over 30 yrs, dude.

Just say you don't feel like replying to my comments on the matter...
 

BuffNTuff

Banned
I love it. Especially on PS5. I've had a chance to compare XSX and PS5: I gamed on the XSX by itself for a month after launch until I could find a PS5.
Something about the PS5 makes it seem so much smoother.

Can't quite put a finger on the Xbox Series X, but it almost feels like there's something wrong at the OS level that causes some stutter / frame pacing issues at 60Hz that just make it feel less smooth than the PS5 for whatever reason. While it may be quantifiably very small, it makes a huge difference.

Even Valhalla, which technically has a better frame rate on XSX, feels so much better on PS5.
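That "feels less smooth at the same frame rate" impression is usually frame pacing. A crude way to quantify it, assuming you can log per-frame present times with a capture tool, is to look at frame-to-frame jitter rather than the average; the sketch below is illustrative, not any console's actual tooling.

```cpp
// Crude frame-pacing check: two captures can both average ~60fps, but the one
// with higher frame-to-frame jitter will feel less smooth.
#include <cmath>
#include <vector>

struct PacingStats { double avgMs; double jitterMs; };

PacingStats analyse(const std::vector<double>& frameTimesMs) {
    if (frameTimesMs.empty()) return {0.0, 0.0};

    double sum = 0.0;
    for (double t : frameTimesMs) sum += t;
    const double avg = sum / frameTimesMs.size();

    double varSum = 0.0;
    for (double t : frameTimesMs) varSum += (t - avg) * (t - avg);
    const double stddev = std::sqrt(varSum / frameTimesMs.size());

    return { avg, stddev };   // e.g. 16.7ms avg with 0.2ms jitter vs 3ms jitter
}
```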
 
That's a very dismissive comment and you also completely misrepresented my argument. My claim has ALWAYS been WITHIN a generation. Not over 30 yrs, dude.

Just say you don't feel like replying to my comments on the matter...
That's fair. I should have explained better. If you take a snapshot of any launch games and compare them to the swan song games, you'll see a difference. The 30 years bit was just about how many gens we have to compare my guess with, vs an isolated case.

That's not true. And I'm sorry if it came off that way.
 

VFXVeteran

Banned
That's fair. I should have explained better. If you take a snapshot of any launch games and compare them to the swan song games, you'll see a difference. The 30 years bit was just about how many gens we have to compare my guess with, vs an isolated case.

That's not true. And I'm sorry if it came off that way.

You really should at least read my OP before you write it off. I explain better what my problem is and it's definitely worth addressing in a technical way instead of hijacking threads all the time because people feel so strongly about it (including you). At least hear my case.
 
Every console generation sees games improve their fidelity over time, and they do so while historically targeting 30 FPS. They get better over the generation while maintaining those targets.
16-bit 3D: Often below 10fps
32-bit 3D: Anywhere between 20 and 60fps, with the PS1 having a decent amount of 60fps games
Dreamcast, PS2, OG Xbox, GC: Quite a lot of 60fps titles, I think it was the expectation back then
PS360: This is the gen where <30fps came back in force
PS4: Most 30fps titles are pretty steady now, we also have more 60fps games than the last gen did, and they hold it better... It was obvious even on day one things would turn out this way, with better 30 and more common 60.

There has never been a standard frame rate across generations.

Most of the launch games offer 60fps modes that are pretty good, if not 120 (with serious drops).

It's fair to expect 60fps to be the new 30, and that games will now at the very least offer a choice between 30, 60 and more fps, when 60 is not the only mode.
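That "choice between 30, 60 and more fps" usually ships as a small table of mode presets. A hypothetical example of what such a table might look like; every value below is invented for illustration, not taken from any specific game.

```cpp
// Hypothetical per-game mode table (values invented for illustration).
struct GraphicsMode {
    const char* name;
    int targetFps;
    int outputHeight;     // internal/target resolution before upscaling
    bool rayTracing;
};

constexpr GraphicsMode kModes[] = {
    { "Fidelity",       30, 2160, true  },   // native-ish 4K, RT on
    { "Performance",    60, 1440, false },   // dynamic res, RT off
    { "Performance RT", 60, 1080, true  },   // lower floor to keep RT at 60
    { "120Hz",         120, 1080, false },   // for HDMI 2.1 displays
};
```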
 

quest

Not Banned from OT
16-bit 3D: Often below 10fps
32-bit 3D: Anywhere between 20 and 60fps, with the PS1 having a decent amount of 60fps games
Dreamcast, PS2, OG Xbox, GC: Quite a lot of 60fps titles, I think it was the expectation back then
PS360: This is the gen where <30fps came back in force
PS4: Most 30fps titles are pretty steady now, we also have more 60fps games than the last gen did, and they hold it better... It was obvious even on day one things would turn out this way, with better 30 and more common 60.

There has never been a standard frame rate across generations.

Most of the launch games offer 60fps modes that are pretty good, if not 120 (with serious drops).

It's fair to expect 60fps to be the new 30, and that games will now at the very least offer a choice between 30, 60 and more fps, when 60 is not the only mode.
Keep dreaming. They will need the ray tracing checkbox more than the frame rate one to sell their game. Most won't bother with a non-ray-tracing mode. Fully expect 30fps in 2022 when next gen really starts.
 
Same as anything else in computing. When you exhaust options to improve efficiency at the lowest level of operation, you transfer your attention to mid- and then high-level efficiencies. Because if you can shave down the number of low-level ops needed to achieve the desired end result, you have an effective performance gain.
Sure, but have you looked at what we have seen so far on these new consoles? (no-sweat locked 60, with 120fps modes that drop)

Compare that to early last gen, where we had mostly 30+ fps games day one, with the 30fps games being locked (on the PS4 at least).

I will not include resolution in this discussion because I don't think it's as relevant as it was before, with modern reconstruction techniques and all.... And they go against one another.

I also think that despite your experience in vfx/3d you are more wrong than the average forum poster for some reason.
 

BuffNTuff

Banned
Keep dreaming. They will need the ray tracing checkbox more than the frame rate one to sell their game. Most won't bother with a non-ray-tracing mode. Fully expect 30fps in 2022 when next gen really starts.

Especially when you think about things like ray-traced global illumination... this kind of feature doesn't just impact the user in terms of visuals, it also radically changes the development pipeline.

Use an off-the-rack GI solution at 30 FPS and dramatically cut your dev time and costs... or spend countless more hours pre-baking assets to make a performance mode that will please some people on the internet... seems obvious how this will go down.
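A heavily simplified illustration of why that pipeline difference matters, in sketch-level C++ that isn't any real engine's API: baked GI is a cheap lightmap fetch at runtime but needs an offline bake whenever lighting or geometry changes, while ray-traced GI skips the bake and pays per frame instead.

```cpp
// Heavily simplified contrast between baked and ray-traced GI (illustrative only;
// the stub return values are placeholders).
struct Color   { float r, g, b; };
struct Surface { float u, v; /* position, normal, ... */ };

struct Lightmap {
    // Produced by an OFFLINE bake that can take hours and must be redone
    // whenever lighting or level geometry changes; sampling it is nearly free.
    Color sample(float /*u*/, float /*v*/) const { return {0.2f, 0.2f, 0.2f}; }
};

struct RayTracedGI {
    // No bake step at all: lighting reacts to scene changes immediately,
    // but every frame pays for tracing and denoising rays.
    Color trace(const Surface& /*s*/) const { return {0.2f, 0.2f, 0.2f}; }
};

// Pick whichever indirect-lighting path the mode/budget allows.
Color indirectLight(const Surface& s, const Lightmap& baked, const RayTracedGI* rtgi) {
    if (rtgi) return rtgi->trace(s);       // "ship RT, target 30fps" path
    return baked.sample(s.u, s.v);         // "pre-baked, cheaper 60fps mode" path
}
```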
 

Nezzeroth

Member
I hope devs keep giving 60fps modes. A lot of us are still on 1080p TVs; wasting those resources on resolution would be a shame.
 