The PS5 should have GTX 1080Ti performance!

#51
It does depend on the game; Far Cry 5, for instance, runs a lot smoother at 4K/30FPS than it does with the same settings on a 1060. That's down to optimisation at the end of the day though.
Yeah, but the RX 580 handles 4K/30 at X1X settings fine, as this is an AMD-optimized game. The point is the X1X is roughly in 1060/580 territory, depending on the game.
 
#52
Yeah, but the RX 580 handles 4K/30 at X1X settings fine, as this is an AMD-optimized game. The point is the X1X is roughly in 1060/580 territory, depending on the game.
Does it? I've only seen 1060 benchmarks with it over on Jermgaming, which show FC5 has an absurd drop in performance going from 80-90% of 4K to 100% native 4K.

No argument on your main point though, I agree. In fact, I don't even need to agree; it's a fact, and I acknowledge that.
 
#53
While I'm always for more power in game consoles, I hope the next generation is defined by more creative and artistic choices in regard to graphics, rather than just exploiting technical prowess to put together amazing visuals.

We've already seen developers make intelligent, creative use of art assets to build visually amazing games that are just as well optimized for performance; I just hope we see more of that.
 
#54
The GPU isn't really the bottleneck anymore. It's the reduced-cost, low-power CPUs that have been crippling new consoles for years now. You can adjust post-processed effects and engage in shader trickery to get some amazing visuals, but your CPU will always limit the entirety of your game logic, dynamic object/scene density and animation. This is why the PS4 Pro and Xbox One X are still stuck in 30fps-town in most cases; they could easily run most games on medium-high at 60fps if they only had something approaching a low-end desktop Ryzen 1200 from a generation ago.

This is tragic, because their GPUs are grossly bottlenecked. This is how you know they are truly stop-gap, half-measure products, completely unbalanced in order to capitalize on HDR and 4K at 30fps.

To put things into perspective, a friend's OC'd FX-8350 @ 4.4GHz (much faster than the Jaguar cores in the Xbox One X) was bottlenecking a 1070 by 20-30fps. The CPUs in the PS4 Pro and Xbox One X are crippling peak system-on-a-board performance by at least ~50%.

So next-generation consoles will seek to address this. Now is precisely the time when a low-power, low-cost embedded CPU can finally deliver adequate performance. As for the GPU, anything approaching a 1080 would be more than enough for 4K/60 with better graphics than you are seeing presently.

There needs to be an industry-wide 60fps mandate. 30fps is a disgrace and should be ridiculed at every turn, unless you're dealing with a <10W handheld.
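
To make that concrete, here's a rough back-of-the-envelope sketch in Python. The per-frame millisecond costs are made up purely for illustration (not measured data); the point is just that when CPU and GPU work are pipelined, the slower side sets the frame rate, so a stronger GPU alone buys you nothing.

# Hypothetical per-frame costs in milliseconds; illustrative only.
cpu_ms = 28.0   # game logic, AI, animation, draw-call submission on a slow Jaguar-class CPU
gpu_ms = 12.0   # rendering work on a comparatively strong GPU
frame_ms = max(cpu_ms, gpu_ms)   # the slower stage paces the pipelined frame
print(f"Frame time: {frame_ms:.1f} ms -> {1000 / frame_ms:.0f} fps (60 fps needs <= 16.7 ms)")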
 
#55
The GPU isn't really the bottleneck anymore. It's the reduced-cost, low-power CPUs that have been crippling new consoles for years now. You can adjust post-processed effects and engage in shader trickery to get some amazing visuals, but your CPU will always limit the entirety of your game logic, dynamic object/scene density and animation. This is why the PS4 Pro and Xbox One X are still stuck in 30fps-town in most cases; they could easily run most games on medium-high at 60fps if they only had something approaching a low-end desktop Ryzen 1200 from a generation ago.

This is tragic, because their GPUs are grossly bottlenecked. This is how you know they are truly stop-gap, half-measure products, completely unbalanced in order to capitalize on HDR and 4K at 30fps.

To put things into perspective, a friend's OC'd FX-8350 @ 4.4GHz (much faster than the Jaguar cores in the Xbox One X) was bottlenecking a 1070 by 20-30fps. The CPUs in the PS4 Pro and Xbox One X are crippling peak system-on-a-board performance by at least ~50%.

So next-generation consoles will seek to address this. Now is precisely the time when a low-power, low-cost embedded CPU can finally deliver adequate performance. As for the GPU, anything approaching a 1080 would be more than enough for 4K/60 with better graphics than you are seeing presently.

There needs to be an industry-wide 60fps mandate. 30fps is a disgrace and should be ridiculed at every turn, unless you're dealing with a <10W handheld.
How could I not agree with this?!?!
You're completely spot on.
 
#56
While I agree 30 fps is a disgrace and should never be used in current games, I would love to see consoles push past 60 fps on some titles.

While 4K/60fps with all the bells and whistles is great, especially in single-player games, once a person plays games like Fortnite at 144Hz or better, it's really hard to go back to 60fps.

I love that Xbox is pushing forward with its 120Hz update, and even though games don't really take full advantage of 120 fps (yet), the console overall seems snappier, plus input lag is supposedly slightly reduced in every game.
 
#57
The GPU isn't really the bottleneck anymore. It's the reduced-cost, low-power CPUs that have been crippling new consoles for years now. You can adjust post-processed effects and engage in shader trickery to get some amazing visuals, but your CPU will always limit the entirety of your game logic, dynamic object/scene density and animation. This is why the PS4 Pro and Xbox One X are still stuck in 30fps-town in most cases; they could easily run most games on medium-high at 60fps if they only had something approaching a low-end desktop Ryzen 1200 from a generation ago.

This is tragic, because their GPUs are grossly bottlenecked. This is how you know they are truly stop-gap, half-measure products, completely unbalanced in order to capitalize on HDR and 4K at 30fps.

To put things into perspective, a friend's OC'd FX-8350 @ 4.4GHz (much faster than the Jaguar cores in the Xbox One X) was bottlenecking a 1070 by 20-30fps. The CPUs in the PS4 Pro and Xbox One X are crippling peak system-on-a-board performance by at least ~50%.

So next-generation consoles will seek to address this. Now is precisely the time when a low-power, low-cost embedded CPU can finally deliver adequate performance. As for the GPU, anything approaching a 1080 would be more than enough for 4K/60 with better graphics than you are seeing presently.

There needs to be an industry-wide 60fps mandate. 30fps is a disgrace and should be ridiculed at every turn, unless you're dealing with a <10W handheld.
Couldn't agree with you more. 30 FPS is terrible.

I remember everyone getting excited about the Xbox One X because of the TFLOPS numbers being thrown around. Then I read about how underpowered the CPU is, and for that very reason 60 FPS games weren't going to be as common as we all hoped.

I can't think of one time I would ever choose 30 FPS over 60 FPS. TLoU had an option on PS4 where you could run 30 FPS with higher-res shadows or 60 FPS with reduced shadow quality. After switching back and forth between the two modes, it was obvious 60 FPS was miles better. I hate that "cinematic" excuse that gets thrown around.
 
#58
While I agree 30 fps is a disgrace and should never be used in current games, I would love to see consoles push past 60 fps on some titles.

While 4K/60fps with all the bells and whistles is great, especially in single-player games, once a person plays games like Fortnite at 144Hz or better, it's really hard to go back to 60fps.

I love that Xbox is pushing forward with its 120Hz update, and even though games don't really take full advantage of 120 fps (yet), the console overall seems snappier, plus input lag is supposedly slightly reduced in every game.
I didn't read about their push for 120Hz; was that recent? They already support, or will be supporting, FreeSync TVs, which is very awesome to see.

On a side note, I believe latency increases as Hz goes up. Blur Busters will have an article on this, I know it.
 
#59
I didn't read about their push for 120Hz; was that recent? They already support, or will be supporting, FreeSync TVs, which is very awesome to see.

On a side note, I believe latency increases as Hz goes up. Blur Busters will have an article on this, I know it.
FreeSync is already out for Xbox, and the 120Hz update is in testing for us preview members now; it will be out for everyone in the May update.
 
#60
The problem is that Nvidia's teraflop performance is higher than AMD's. A Vega 64 with 12.6 TFLOPS = a GTX 1080 with 9 TFLOPS. These two GPUs do not manage 4K/60fps. So I think that to achieve the performance of a 1080 Ti you would need something around 14 TFLOPS from AMD.
I speculate that the ideal for a PS5 is something between 14 and 16 TFLOPS, however improbable that seems.
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's so wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP... that's the IDEA of a FLOP... It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or other parts of the GPU pipeline.
 
#61
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's so wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP... that's the IDEA of a FLOP... It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or other parts of the GPU pipeline.
There's no need to reply like that; calm down. He meant that Nvidia's GPUs deliver better performance even while having fewer floating-point operations per second than AMD's GPUs. That's all.
 
#62
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's so wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP... that's the IDEA of a FLOP... It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or other parts of the GPU pipeline.

The point he's trying to repeat there is that, for a given number of floating-point operations per second the hardware can theoretically do, Nvidia cards put out more real-world gaming performance. Put another way, AMD cards usually have more FLOPS per unit of performance.

It's like saying a MHz is a MHz: true enough on an objective hardware level, but clearly the architecture has a massive impact on how much performance is put to the ground at a given clock speed.

GPU GFLOPS are just a dumb paper calculation of shader cores * clock speed * operations per core per clock (they can do 2 per clock), and AMD's design of having smaller, more numerous cores makes that number higher, while Nvidia chooses to try to get more work out of fewer cores.

Neither philosophy is better or worse; Nvidia is tending towards better performance per watt now, but that wasn't always true. It's just two different ways of doing things.
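
For anyone curious, that paper calculation is simple enough to sanity-check in a couple of lines of Python. The shader counts and boost clocks below are approximate reference specs (treat them as ballpark assumptions), but they land close to the advertised numbers:

# Theoretical single-precision TFLOPS = shader cores x clock (GHz) x 2 ops per clock (FMA), / 1000
def tflops(cores, clock_ghz, ops_per_clock=2):
    return cores * clock_ghz * ops_per_clock / 1000.0

print(f"Vega 64:  {tflops(4096, 1.546):.1f} TFLOPS")   # ~12.7
print(f"GTX 1080: {tflops(2560, 1.733):.1f} TFLOPS")   # ~8.9

Same formula for both vendors; the gap you see in games comes from everything that formula leaves out.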
 
#63
I recall the higher-fps option in The Last of Us Remastered on PS4 was just an unlocked framerate option. This meant it jumped around anywhere from 30 to 60. Not ideal, but TLOU had terrible control latency, so an fps boost greatly improved quality of life there.

60fps locked > 30fps locked > unlocked fps of any description in the 1-60fps range, unless you're sporting G-Sync or FreeSync and aren't dropping below 40fps at any point.

Anyway, what the Switch does with just 8W on a 720p screen in handheld mode is remarkable. Zelda, for example, has very responsive controls even at 25-30, and Mario Odyssey is a dream.

Getting back to the thread topic, I think we've hit heavy diminishing returns when it comes to GPU power on 60Hz 4K displays. What the next consoles need is a beefy CPU to make those visuals shine at a higher frame rate. 60fps should be the minimum goal, because we won't be seeing high-refresh-rate, low-latency 4K televisions any time soon outside of the enthusiast segment.

Any GPU at or close to base 1080 levels, plus greatly increased CPU power (easy to do considering how slow the 2012-era Jaguar is), would achieve this. However, optical media is kind of tired, as it necessitates a hard drive and internet access for frequent, huge updates anyway, so the optical drive just adds to the cost with little benefit. Some Blu-ray games you buy don't even work without a day-one patch.

Solid-state storage has gotten very affordable and very fast, and will be more so going forward. So I envision games coming on SD cards, from 32GB+ as standard. This means no more game installations and no more need for giant internal hard drives (which also add to cost). Instead, if you want to go digital-only, invest in a cheap 4TB portable HDD and plug it into the USB 3 port of your console. Alternatively, if you want to buy games and collect them physically, have them on self-contained storage like the Switch.

If the PS5 and Xbox One X2 (jeez, the title gore) go the route of just enough CPU for 30fps or inconsistent 60, with a focus on ever more 4K post-processed visuals, internal storage, optical drives and game installs in a locked proprietary box that receives incremental upgrades every 3 years again... then I might as well stick with a custom PC build. It may be more expensive initially, but I know I'll have guaranteed 60+fps there, and much cheaper games long term.
 
#64
I don't see modern games pushing away from 30 FPS anytime soon. Devs and the market in general will always prefer that the console's power be used for something else.
I don't have a lot of confidence in my fellow man: 4K is visible (give me 60fps over 4K any day), prettier graphics are visible, but you really can't market 60 FPS outside the video game enthusiast crowd; it's very hard to advertise.
 
#65
The point of diminishing returns has been reached. That's why consoles have a mediocre CPU this gen. No point in adding high end hardware when only half the games will take advantage of it.
 
#66
The point of diminishing returns has been reached. That's why consoles have a mediocre CPU this gen. No point in adding high end hardware when only half the games will take advantage of it.
Jaguar was about the only option they had for a console sized APU in 2013 with the graphics they wanted. They were both near the interposer size limits...

The CPU is terribly slow now, but it made sense as a choice when you look at what was available at release. Given that they wanted an APU, the one thing that could be argued is they could have split them out.
 
#67
I really don't want to write it, but FPS really has nothing to do with the CPU.
The developer always decides at how many frames and at what resolution a game runs. If he chooses to only "use" a 30 FPS cap, then he can spend more render time on each frame. If he chooses to go with 60 fps, the game needs to be optimized for that, both the GPU-heavy and the CPU-heavy parts. It is not impossible. 8 Jaguar cores (well, 6.5-7 for a game) are not slow; it is just a question of how the developer decides to use the cores.
If we get Ryzen CPUs in the next-gen consoles, the same problem will occur again and again.

Another thing that we shouldn't forget is that it was always planned for GPUs to take over more and more tasks that were CPU tasks in the past. That did not happen. In most cases it is just better to invest GPU resources into graphics, because that sells better.

What I really find interesting is the support for 120Hz. This way a console could also output v-synced 40fps. But the problem is that TVs must still support this. 40-45fps would be a good compromise between eye candy and input lag if only it could be output in sync with the TV. 50fps would work with most TVs today if the game output at 50Hz; every HD TV should be able to handle that refresh rate. Maybe that would be another option: unlocked games should output a 50Hz signal.
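
As a quick sketch of why 40fps only paces cleanly on a 120Hz output (and why 50fps wants a 50Hz signal): judder-free v-sync needs the refresh rate to be an integer multiple of the frame rate. A few lines of Python show which rates qualify:

# Frame rates of at least 20 fps that divide a refresh rate evenly, i.e. can be v-synced with even pacing.
def clean_rates(refresh_hz):
    return [refresh_hz // n for n in range(1, refresh_hz // 20 + 1) if refresh_hz % n == 0]

print(60, clean_rates(60))     # [60, 30, 20]
print(120, clean_rates(120))   # [120, 60, 40, 30, 24, 20]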
 
#68
I really don't want to write it, but FPS really has nothing to do with the CPU.
The developer always decides at how many frames and at what resolution a game runs. If he chooses to only "use" a 30 FPS cap, then he can spend more render time on each frame. If he chooses to go with 60 fps, the game needs to be optimized for that, both the GPU-heavy and the CPU-heavy parts. It is not impossible. 8 Jaguar cores (well, 6.5-7 for a game) are not slow; it is just a question of how the developer decides to use the cores.
If we get Ryzen CPUs in the next-gen consoles, the same problem will occur again and again.

Another thing that we shouldn't forget is that it was always planned for GPUs to take over more and more tasks that were CPU tasks in the past. That did not happen. In most cases it is just better to invest GPU resources into graphics, because that sells better.

What I really find interesting is the support for 120Hz. This way a console could also output v-synced 40fps. But the problem is that TVs must still support this. 40-45fps would be a good compromise between eye candy and input lag if only it could be output in sync with the TV. 50fps would work with most TVs today if the game output at 50Hz; every HD TV should be able to handle that refresh rate. Maybe that would be another option: unlocked games should output a 50Hz signal.
My man, there is such a thing as a CPU bottleneck, and it is very present on the Xbox One X. Everyone was going crazy about the "6 TFLOPS, basically a GTX 1070" debate, yet does that thing give you the same performance you get from a PC with a GTX 1070 and a far more advanced modern CPU? No, definitely not. The bottleneck is clearly the CPU here, and there have already been a few articles about this very thing. I know Digital Foundry did a video about the Xbox One X. Please watch it.
 
#69
Jaguar was about the only option they had for a console sized APU in 2013 with the graphics they wanted. They were both near the interposer size limits...

The CPU is terribly slow now, but it made sense as a choice when you look at what was available at release. Given that they wanted an APU, the one thing that could be argued is they could have split them out.
Yup, the decision to use an APU shows that high-end hardware makes little sense because of diminishing returns. Also, most people don't want to invest a lot of money when they buy a console.
 
#71
I don’t believe we are at a point of diminishing returns as far as graphics are concerned. There are plenty of graphical effects that are still too processing intensive for existing GPU’s (Ray-Tracing for example) that will make a huge difference once achieved and implemented.
Anyway, if the PS5 is built with more focus on VR 120fps is very important to avoid gag bagging and to obtain smooth visuals. I am hoping this will equate to high quality graphics running at 60fps for the non VR portion of the game. Just a thought....
 
#73
I really don't want to write it, but FPS really has nothing to do with the CPU.
IF both the PS4 Pro and Xbox One X had a decent CPU, most of their games would run at a locked 60fps. Due to how shitty the CPU is, most of the GPU power is not utilized. Especially on the Xbox One X, as it has a decent GPU that could easily run games at 60fps with a decent CPU.
 
#75
My man, there is such a thing as a CPU bottleneck, and it is very present on the Xbox One X. Everyone was going crazy about the "6 TFLOPS, basically a GTX 1070" debate, yet does that thing give you the same performance you get from a PC with a GTX 1070 and a far more advanced modern CPU? No, definitely not. The bottleneck is clearly the CPU here, and there have already been a few articles about this very thing. I know Digital Foundry did a video about the Xbox One X. Please watch it.
IF both the PS4 Pro and Xbox One X had a decent CPU, most of their games would run at a locked 60fps. Due to how shitty the CPU is, most of the GPU power is not utilized. Especially on the Xbox One X, as it has a decent GPU that could easily run games at 60fps with a decent CPU.
I think you don't get it. The CPU is only a bottleneck if the developer decides it is. Yes, the Jaguar cores are not the best CPUs you can get, but they are still very capable CPUs.
The problem here is just how developers decide to use the CPU power, and how not to use it.
If you create a game for a PC and then port it to consoles, you will always have problems with one component.
Also, if the PS4 Pro and XB1X had a Ryzen CPU, they still could not deliver 60fps in every case, simply because the GPU would be the limit at the current level of detail.
But if you just mean taking the XB1/PS4 base game at 1080p/30 and bumping it up to 60fps, you are correct: then the CPU is the limit, because the game was designed for a 1.6GHz CPU, and a 2.1-2.3GHz CPU just can't deliver double the performance.

If we get Ryzen with the next gen, games will be designed to max out that CPU. Whether it then runs at 30 or 60fps is still the developer's decision.
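
As a back-of-the-envelope illustration of that last point, here's a tiny Python sketch. It optimistically assumes CPU work scales linearly with clock speed, and uses a hypothetical 33.3ms of CPU work per frame for a 30fps design:

# Jaguar clock bump from base PS4 (1.6 GHz) to roughly X1X levels (2.3 GHz)
base_clock, boosted_clock = 1.6, 2.3
scaling = boosted_clock / base_clock            # ~1.44x, well short of the 2x needed for 30 -> 60 fps
cpu_frame_ms = 33.3                             # hypothetical CPU cost per frame in a 30 fps design
print(f"Clock scaling: {scaling:.2f}x")
print(f"Best-case CPU frame time: {cpu_frame_ms / scaling:.1f} ms (60 fps needs <= 16.7 ms)")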