TGO
Yeah I suppose, still there should be more right?
Check out Days Gone and Ghost of Tsushima
I’m quite confident newer games such as Cyberpunk all have a PS5 boost profile. Not sure which others tho..
Yeah I suppose, still there should be more right?
My plan! I am playing a hell of a great library on the Nintendo Switch and soon to get an Oculus Quest 2. I can wait until the PS5 gets more real next-generation games.
Why not just wait a few months and get the PS5 at RRP? Play some smaller, less demanding games on your old PC in the meantime.
Nah, 60fps games in 3 years will look better than current 60fps games.
Same goes for 30fps games.
There is an entirely new generation of engines still to come to next-gen consoles, engines that were designed with them in mind.
ALL 60FPS games right now use dynamic resolution. Unless you are talking about art direction or lowering the floor resolution even more than 1440p, I don't see MORE tech thrown on top. If the consoles had that much extra power to give up then we'd be seeing native 4K. I'm not sure why most of us think there is a significant amount of untapped power in these consoles when we clearly know what their limitations are (2080s don't run games at native 4K/60FPS).
I'd argue the CELL BE was more powerful than those pitiful Jaguar CPUs, but what do I know...
This didn't happen the same way during the previous generation switch because the PS4 and XBO had CPUs that were barely more powerful than the previous generation's. It was really just a GPU and memory leap that time.
Demon’s Souls
they're only running crossgen games
or really old games
Cyberpunk 2077 disagrees with this article
It's 60 on both Series X and PS5.
It's not locked though, dips into 30-40fps on occasions.
Most games have drops on occasion (even on powerful PCs); Valhalla certainly does, as I'm sure others do. It's still at 60 99% of the time, so how does that disprove the original article?
99% of the time, sure buddy... it is a poorly optimised game.
So you posted 40 seconds of a 100-hour game that ran in the 50s instead of 60fps. That's nice, but it doesn't disprove my point, on top of having nothing to do with the thread anymore.
Let me remind you that this thread is about the majority of games released on the new consoles being 60fps. Posting a 30-second clip of a game running at 55fps is just silly; I'm not even sure what point you are trying to make.
lol, I posted a 40-second clip, because that was the length of the clip.
My point is the game does not run at 60fps 99% of the time; it's more like 80% of the time. Have you even played the game yet? If you had, you would know that the shooting sequence is not even the most demanding area of the game, not by a mile. In some areas it will drop below 40fps.
I actually have it on PS5 and I have yet to see any areas where it drops below that, including that sequence; it probably performs better than on the X. Once again though, what is your point? Valhalla also drops into the 50s quite often; is it not a 60fps game? I think I'm gonna stop now because we're just derailing the thread, as this adds nothing to the original discussion.
I have the game on PS5 as well, so you need to go and play a bit more, or maybe you can't tell the difference between a locked 60fps and 40fps. Anyway, the point I was making is that your claim of 60fps 99% of the time on next-gen consoles isn't true. That's all.
Your original post that triggered this discussion was that Cyberpunk was not a 60fps game, which I think it is despite a few drops. We can argue percentages, whether it's 99% or 80% or 93.2%, but it's pointless and doesn't add anything to what the thread is about.
There is an entirely new generation of engines still to come to next-gen consoles, engines that were designed with them in mind.
From UE5 to id Tech, to everyone else in the industry.
Not to mention the likelihood of improvements to dynamic scaling technology for next-gen consoles.
Every console generation sees games improve their fidelity over time, and they do so while historically targeting 30 FPS. They get better over the generation while maintaining those targets.
This generation is likely to be even more dramatic over time, not less. And it looks like 60FPS is popular with both devs and consumers; there's no real reason for it not to get better over the gen.
The thread is about next gen consoles delivering frame rate in spades. I pointed out that it's not the case with Cyberpunk 2077, where next gen consoles are failing to deliver it in spades.
Whatever some may claim about 'game x' or 'game y', imho we haven't yet seen a game that can bona fide claim to be a generational leap in graphics over PS4 / Xbox One.
It may turn out to be that next gen never brings that graphical leap, but instead we get 60fps / 4K.
I'd argue the CELL BE was more powerful than those pitiful Jaguar CPUs, but what do I know..
ALL 60FPS games right now use dynamic resolution.
Reconstruction/Scaling techniques are now the standard across the board. Native resolution is bordering on redundant at this point.
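For anyone wondering what "dynamic resolution" actually does under the hood, here's a toy sketch of the basic idea: watch last frame's GPU time and nudge the render scale up or down to protect the frame budget. Every name and threshold below is invented for illustration; real engines use far more sophisticated controllers.

```python
# Toy dynamic resolution scaler (illustrative only, not from any real engine).
# The idea: if the GPU blew the frame budget, render fewer pixels next frame;
# if there's comfortable headroom, claw resolution back up toward native.

TARGET_FRAME_MS = 1000 / 60        # ~16.6 ms budget for a 60fps target
MIN_SCALE, MAX_SCALE = 0.67, 1.0   # e.g. a 1440p floor up to native 2160p

def adjust_render_scale(scale, gpu_frame_ms):
    """Return the render scale to use next frame, based on last frame's GPU time."""
    if gpu_frame_ms > TARGET_FRAME_MS:            # over budget: drop resolution
        scale -= 0.05
    elif gpu_frame_ms < TARGET_FRAME_MS * 0.85:   # clear headroom: raise it
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))  # clamp to the allowed window
```

A reconstruction pass (checkerboard, temporal upsampling, etc.) then fills the gap between the internal resolution and the output resolution, which is why the drops are hard to spot.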
ALL 60FPS games right now use dynamic resolution. Unless you are talking about art direction or lowering floor resolution even more than 1440p, I don't see MORE tech thrown on top. If the consoles had that much extra power to give up then we'd be seeing native 4k. I'm not sure why most of us think there is a significant amount of untapped power in these consoles when we clearly know what their limitations are (2080s don't run games at native 4k/60FPS).
Microsoft will be getting new software tools
Sony and devs will be getting to grips with the variable PS5 clocks.
It's really early days and I can see 1440p upscaled @ 60fps or fauxK@60fps.
Add to that, the SSD being used on a closed architecture is a first. Let's see what tricks devs can do, because Forza on the 1X (Horizon?) is one of the best looking games I've ever seen. Likewise Spidey/GoW/Ghost. Those devs made those games look that good on shite hardware. Imagine what they can do with some real power.
That doesn't take away from the fact that reconstruction costs *less* in GPU power than just rendering natively. My point is that there isn't enough excess power to render natively @ 60FPS, and once a developer starts adding significantly more rendering algorithms, which will certainly take more GPU cycles, they're forced back onto reconstruction techniques. We are already at the reconstruction stage with what's on display now, so where are the extra GPU cycles?
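The pixel arithmetic behind that point is easy to sketch. These are illustrative counts only: GPU cost doesn't scale perfectly linearly with pixel count, so treat the ratio as a rough bound, not a measurement.

```python
# Back-of-the-envelope pixel counts, showing why reconstruction from a
# 1440p internal resolution is so much cheaper than native 4K rendering.

native_4k    = 3840 * 2160  # 8,294,400 shaded pixels per frame
internal_qhd = 2560 * 1440  # 3,686,400 shaded pixels (a common 60fps floor)

ratio = native_4k / internal_qhd
print(f"Native 4K shades {ratio:.2f}x the pixels of a 1440p internal frame")
# A 2.25x jump in shaded pixels per frame is headroom these GPUs don't
# have at 60fps, which is the "where are the extra cycles" point above.
```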
This is the hangup I have with 99% of people who think that what they see now is NOT at least 50% of what they will see in the future. The evidence just doesn't support this. I am going to start a thread on this very thought process and we will follow it throughout the generation. I think people on these boards severely underestimate developers' knowledge (and previous experience) in rendering. I think people severely lack understanding of simple linear and exponential behavior when it concerns the "cost" of rendering (i.e. having a large cache doesn't mean 10x more polygons on screen). I'd like to give technical details as to why several of you won't see what you think you will this generation. It was already shown last generation, but people tend to avoid the conversation (comparing early last-gen games to late last-gen games on a technical level).
Same as anything else in computing. When you exhaust options to improve efficiency at the lowest level of operation, you transfer your attention to mid- and then high-level efficiencies, because if you can shave down the number of low-level ops needed to achieve the desired end result, you have an effective performance gain.
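A toy example of that transfer of attention: however well each low-level operation is tuned, a higher-level change can remove whole classes of operations for the same result. The function names here are made up for illustration.

```python
# Low-level view: n additions, each as fast as the hardware allows.
def sum_first_n_loop(n):
    total = 0
    for i in range(1, n + 1):  # one addition per iteration
        total += i
    return total

# High-level view: a closed-form formula replaces the loop entirely,
# doing a constant number of ops regardless of n, with the same answer.
def sum_first_n_formula(n):
    return n * (n + 1) // 2

# Both return 500500 for n=1000; the second never touches the loop at all.
```

That's the kind of "effective performance gain" that doesn't show up in raw hardware specs.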
I won't disagree with anything you say technically, you've got more knowledge than me, hands down. However, when you say evidence doesn't support this, i have to disagree, reginald.
The evidence is in looking at any launch game of any console and comparing it to the swan song. Was Gears of War 1 beautiful? Yep. Gears of War 3 blew it out of the water, though. Same with PS3, look at TLoU, or OG Halo and Halo 2... Some tricks will be pulled out of the bag, some boffin will come up with a new rendering technique or speed up transfer or some magical mumbo-jumbo I'm not even going to pretend I understand.
6 Months ago i was being told that 4k or fauxK with RT was impossible. Now launch games have it, in spades, at 60fps.
Discuss:
Evolution of Graphics Technology
All: There's a repeated discussion about potential gains during this gen of consoles. Maybe it needs its own OT because that's actually an interesting discussion. I started this thread because there is a lot of misunderstanding and arguing over what we will actually see with the evolution... (www.neogaf.com)
And bring examples. This is a thread about proving what you say.
I honestly can't be bothered. I don't mean that in a bad way but if people can't remember the evolution of games over the past 3 decades, nothing i bring to the table to will change their mind.
That's a very dismissive comment and you also completely misworded my argument. My claim has ALWAYS been WITHIN a generation. Not over 30 yrs dude.
Just say you don't feel like replying to my comments over the matter...
That's fair. I should have explained better. If you take a snapshot of any launch games and compare them to the Swan song games, you'll see a difference. The 30 years bit was just how many gens we have to compare my guess with, vs an isolated case.
That's not true. And I'm sorry if it came off that way.
Every console generation sees games improve their fidelity over time, and they do so while historically targeting 30 FPS. They get better over the generation while maintaining those targets.
16-bit 3d: Often below 10 fps
32-bit 3d: Anywhere from 20 to 60fps, with the PS1 having a decent amount of 60fps games
Dreamcast, PS2, OG Xbox, GC: Quite a lot of 60fps titles, I think it was the expectation back then
PS360: This is the gen where <30fps came back in force
PS4: Most 30fps titles are pretty steady now, we also have more 60fps games than last gen did, and they hold it better... It was obvious even on day one things would turn out this way, with better 30 and more common 60.
There has never been a standard frame rate across generations.
Most of the launch games offer 60fps modes that are pretty good, if not 120 (with serious drops).
It's fair to expect 60fps to be the new 30, and that games will now at the very least offer a choice between 30, 60 and more fps, when 60 is not the only mode.
Same as anything else in computing. When you exhaust options to improve efficiency at the lowest level of operation, you transfer your attention to mid- and then high-level efficiencies, because if you can shave down the number of low-level ops needed to achieve the desired end result, you have an effective performance gain.
Sure, but have you looked at what we have seen so far on these new consoles? (No-sweat locked 60, with 120fps modes that drop.)
Keep dreaming, they will need the ray-tracing checkbox to sell their game. Most won't bother with a non-ray-tracing mode. Fully expect 30fps in 2022 when next gen really starts.