The PS5 should have GTX 1080Ti performance!

#51
It does depend on the game; Far Cry 5, for instance, runs a lot smoother at 4K/30fps than it does at the same settings on a 1060. That's down to optimisation at the end of the day, though.
Yeah, but an RX 580 handles 4K/30 at X1X settings fine, since this is an AMD-optimized game. The point is that the X1X is roughly in 1060/580 territory, depending on the game.
 
#52
Yeah, but an RX 580 handles 4K/30 at X1X settings fine, since this is an AMD-optimized game. The point is that the X1X is roughly in 1060/580 territory, depending on the game.
Does it? The only 1060 benchmarks I've seen, over on Jermgaming, show FC5 has an absurd drop in performance going from 80-90% of 4K to 100% native 4K.

No argument on your main point though, I agree. In fact, I don't even need to agree, it's a fact and I acknowledge that.
 
#53
While I'm always for more power in game consoles, I hope the next generation is defined by more creative and artistic choices in graphics rather than just exploiting technical prowess to put together amazing visuals.

We've already seen developers make intelligent and creative use of art assets to build visually amazing games that are just as optimized in performance; I just hope we see more of that.
 
#54
The GPU isn't really the bottleneck anymore. It's the cost-reduced, low-power CPUs that have been crippling new consoles for years now. You can adjust post-processing effects and engage in shader trickery to get some amazing visuals, but the CPU will always limit the entirety of your game logic, dynamic object and scene density, and animation. This is why the PS4 Pro and Xbox One X are still stuck in 30fps-town in most cases; they could easily run most games on medium-high at 60fps if they only had something approaching a low-end desktop Ryzen 3 1200 from a generation ago.

This is tragic, because their GPUs are grossly bottlenecked. It's how you know they are truly stop-gap, half-measure products, completely unbalanced in order to capitalize on HDR and 4K at 30fps.

To put things into perspective, a friend's overclocked FX-8350 @ 4.4GHz (much faster than the Jaguar cores in the Xbox One X) was bottlenecking a 1070 by 20-30fps. The CPUs in the PS4 Pro and Xbox One X are crippling peak system-on-a-board performance by at least ~50%.

So next-generation consoles will seek to address this. Now is precisely the time when a low-power, low-cost embedded CPU can finally deliver adequate performance. As for the GPU, anything approaching a 1080 would be more than enough for 4K/60 with better graphics than you are seeing presently.

There needs to be an industry-wide 60fps mandate. 30fps is a disgrace and should be ridiculed at every turn, unless you're dealing with a <10W handheld.
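To make the CPU argument concrete: the simulation and render budget per frame halves going from 30 to 60fps. A minimal sketch of that arithmetic (the framerates chosen are just illustrative):

```python
# Per-frame time budget at a target framerate: game logic, animation,
# physics, and draw-call submission must all finish inside this window.
def frame_budget_ms(fps: int) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 30 fps allows 33.33 ms; 60 fps only 16.67 ms, which is why a slow CPU
# caps the framerate no matter how much GPU headroom is left.
```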
 
Last edited:
#55
The GPU isn't really the bottleneck anymore. It's the cost-reduced, low-power CPUs that have been crippling new consoles for years now. You can adjust post-processing effects and engage in shader trickery to get some amazing visuals, but the CPU will always limit the entirety of your game logic, dynamic object and scene density, and animation. This is why the PS4 Pro and Xbox One X are still stuck in 30fps-town in most cases; they could easily run most games on medium-high at 60fps if they only had something approaching a low-end desktop Ryzen 3 1200 from a generation ago.

This is tragic, because their GPUs are grossly bottlenecked. It's how you know they are truly stop-gap, half-measure products, completely unbalanced in order to capitalize on HDR and 4K at 30fps.

To put things into perspective, a friend's overclocked FX-8350 @ 4.4GHz (much faster than the Jaguar cores in the Xbox One X) was bottlenecking a 1070 by 20-30fps. The CPUs in the PS4 Pro and Xbox One X are crippling peak system-on-a-board performance by at least ~50%.

So next-generation consoles will seek to address this. Now is precisely the time when a low-power, low-cost embedded CPU can finally deliver adequate performance. As for the GPU, anything approaching a 1080 would be more than enough for 4K/60 with better graphics than you are seeing presently.

There needs to be an industry-wide 60fps mandate. 30fps is a disgrace and should be ridiculed at every turn, unless you're dealing with a <10W handheld.
How can anyone not agree with this!?
You're completely spot on.
 
#56
While I agree that 30fps is a disgrace and should never be used in current games, I would love to see consoles push past 60fps on some titles.

While 4K/60fps with all the bells and whistles is great, especially in single-player games, once a person plays games like Fortnite at 144Hz or better, it's really hard to go back to 60fps.

I love that Xbox is pushing forward with its 120Hz update, and even though games don't really take full advantage of 120fps (yet), the console overall seems snappier; plus, supposedly, input lag is slightly reduced in every game.
 
#57
The GPU isn't really the bottleneck anymore. It's the cost-reduced, low-power CPUs that have been crippling new consoles for years now. You can adjust post-processing effects and engage in shader trickery to get some amazing visuals, but the CPU will always limit the entirety of your game logic, dynamic object and scene density, and animation. This is why the PS4 Pro and Xbox One X are still stuck in 30fps-town in most cases; they could easily run most games on medium-high at 60fps if they only had something approaching a low-end desktop Ryzen 3 1200 from a generation ago.

This is tragic, because their GPUs are grossly bottlenecked. It's how you know they are truly stop-gap, half-measure products, completely unbalanced in order to capitalize on HDR and 4K at 30fps.

To put things into perspective, a friend's overclocked FX-8350 @ 4.4GHz (much faster than the Jaguar cores in the Xbox One X) was bottlenecking a 1070 by 20-30fps. The CPUs in the PS4 Pro and Xbox One X are crippling peak system-on-a-board performance by at least ~50%.

So next-generation consoles will seek to address this. Now is precisely the time when a low-power, low-cost embedded CPU can finally deliver adequate performance. As for the GPU, anything approaching a 1080 would be more than enough for 4K/60 with better graphics than you are seeing presently.

There needs to be an industry-wide 60fps mandate. 30fps is a disgrace and should be ridiculed at every turn, unless you're dealing with a <10W handheld.
Couldn't agree with you more. 30fps is terrible.

I remember everyone getting excited about the Xbox One X because of the TFLOPS numbers that were being thrown around. Then I read about how underpowered the CPU is, and for that very reason 60fps games weren't going to be as common as we all hoped.

I can't think of one time I would ever choose 30fps over 60fps. TLoU on PS4 had an option where you could run at 30fps with higher-resolution shadows or at 60fps with reduced shadow quality. After switching back and forth between the two, it was obvious that 60fps was miles better. I hate that "cinematic" excuse that gets thrown around.
 
#58
While I agree that 30fps is a disgrace and should never be used in current games, I would love to see consoles push past 60fps on some titles.

While 4K/60fps with all the bells and whistles is great, especially in single-player games, once a person plays games like Fortnite at 144Hz or better, it's really hard to go back to 60fps.

I love that Xbox is pushing forward with its 120Hz update, and even though games don't really take full advantage of 120fps (yet), the console overall seems snappier; plus, supposedly, input lag is slightly reduced in every game.
I didn't read about their push on 120Hz; was that recent? They already support, or will be supporting, FreeSync TVs, which is very awesome to see.

On a side note, I believe latency increases as the refresh rate goes up. Blur Busters would have an article on this, I know it.
 
#59
I didn't read about their push on 120Hz; was that recent? They already support, or will be supporting, FreeSync TVs, which is very awesome to see.

On a side note, I believe latency increases as the refresh rate goes up. Blur Busters would have an article on this, I know it.
FreeSync is already out for Xbox, and the 120Hz update is in testing for preview members now; it will be out for everyone in the May update.
 
#60
The problem is that Nvidia's per-teraflop performance is higher than AMD's. A Vega 64 with 12.6 TFLOPS = a GTX 1080 with 9 TFLOPS. Neither of these GPUs manages 4K/60fps. So I think to achieve the performance of a 1080 Ti you would need something around 14 TFLOPS from AMD.
I speculate that the ideal for a PS5 is something between 14 and 16 TFLOPS, however improbable that seems.
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP; that's the whole IDEA of a FLOP. It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or the other parts of the GPU pipeline.
 
Last edited:
#61
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP; that's the whole IDEA of a FLOP. It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or the other parts of the GPU pipeline.
There's no need to reply like that; calm down. He meant that Nvidia's GPUs perform better despite having fewer floating-point operations per second than AMD's GPUs. That's all.
 
#62
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP; that's the whole IDEA of a FLOP. It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or the other parts of the GPU pipeline.

The point he's trying to make there is that, for a given number of floating-point operations per second a piece of hardware can theoretically do, Nvidia cards put out more end performance. Put another way, AMD cards usually need more FLOPS per unit of performance.


It's like saying a MHz is a MHz: true enough on an objective hardware level, but clearly the architecture has a massive impact on how much performance is put to the ground at a given clock speed.

GPU GFLOPS are just a dumb paper calculation of shader cores × clock speed × operations per core per clock (they can do 2 per clock), and AMD's design of having smaller, more numerous cores makes that number higher, while Nvidia chooses to try to get more work out of fewer cores.
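That paper calculation can be sketched in a few lines. The core counts and boost clocks below are the commonly published specs for these cards and should be treated as approximate:

```python
# Theoretical throughput: shader cores * clock (GHz) * 2 ops per clock
# (a fused multiply-add counts as two floating-point operations).
def theoretical_tflops(shader_cores: int, clock_ghz: float) -> float:
    return shader_cores * clock_ghz * 2 / 1000

# Vega 64: 4096 cores at ~1.55 GHz boost; GTX 1080: 2560 cores at ~1.73 GHz.
print(f"Vega 64:  ~{theoretical_tflops(4096, 1.55):.1f} TFLOPS")
print(f"GTX 1080: ~{theoretical_tflops(2560, 1.73):.1f} TFLOPS")
# Vega's wider, slower-clocked design yields the bigger paper number even
# though the two cards land close together in real games.
```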


Neither philosophy is better or worse. Nvidia tends to have better performance per watt now, though that wasn't always true; they're just two different ways of doing things.
 
Last edited:
#63
I recall the higher-fps option in The Last of Us Remastered on PS4 was just an unlocked framerate, which meant it jumped around anywhere from 30 to 60. Not ideal, but TLoU had terrible control latency, so an fps boost greatly improved quality of life there.

60fps locked > 30fps locked > unlocked fps of any description in the 1-60fps range, unless you're sporting G-Sync or FreeSync and aren't dropping below 40fps at any point.

Anyway, what the Switch does with just 8W on a 720p screen in handheld mode is remarkable. Zelda, for example, has very responsive controls even at 25-30fps, and Mario Odyssey is a dream.

Getting back to the thread topic, I think we've hit significant diminishing returns when it comes to GPU power on 60Hz 4K displays. What the next consoles need is a beefy CPU to make those visuals shine at a higher framerate. 60fps should be the minimum goal, because we won't be seeing high-refresh-rate, low-latency 4K televisions any time soon outside the enthusiast segment.

Any GPU at or close to 1080 levels, plus greatly increased CPU power (easy to do considering how slow the 2012-era Jaguar is), would achieve this. Optical media is getting tired, however: it necessitates a hard drive and internet access for frequent, huge updates anyway, so the optical drive just adds to the cost with little benefit. Some Blu-ray games you buy don't even work without a day-one patch.

Solid-state storage has gotten very affordable and very fast, and will be more so going forward. So I envision games coming on SD cards, 32GB and up as standard. That means no more game installations and no more need for giant hard drives (which also add to the cost). If you want to go digital-only, invest in a cheap 4TB portable HDD and plug it into the console's USB 3 port. Alternatively, if you want to buy and collect games physically, have them on self-contained storage like the Switch does.

If the PS5 and Xbox One X2 (jeez, the title gore) go the route of just enough CPU for 30fps or an inconsistent 60, with a focus on ever more 4K post-processed visuals, internal storage, optical drives, and game installs in a locked proprietary box that receives incremental upgrades every 3 years again... then I might as well stick with a custom PC build. It may be more expensive initially, but I know I'll have guaranteed 60+fps there, and much cheaper games long term.
 
Last edited:
#64
I don't see modern games pushing away from 30fps anytime soon. Devs and the market in general will always prefer the console's power be spent on something else.
I don't have a lot of confidence in my fellow man: 4K is visible (though give me 60fps over 4K any day) and prettier graphics are visible, but you really can't market 60fps outside the video-game enthusiast crowd; it's very hard to advertise.
 
#65
The point of diminishing returns has been reached. That's why consoles have a mediocre CPU this gen: there's no point in adding high-end hardware when only half the games will take advantage of it.
 
#66
The point of diminishing returns has been reached. That's why consoles have a mediocre CPU this gen: there's no point in adding high-end hardware when only half the games will take advantage of it.
Jaguar was about the only option they had for a console-sized APU in 2013 with the graphics they wanted. They were both near the interposer size limits...

The CPU is terribly slow now, but it made sense as a choice when you look at what was available at release. Given that they wanted an APU, the one thing that could be argued is that they could have split the CPU and GPU out.
 
#67
I really don't want to write it, but FPS actually has nothing to do with the CPU.
The developer always decides at how many frames and at what resolution a game runs. If he chooses a 30fps cap, he can spend more render time on each frame. If he chooses to go with 60fps, the game needs to be optimized for that, both the GPU-heavy and the CPU-heavy parts. It is not impossible. Eight Jaguar cores (well, 6.5-7 for a game) are not slow; it is just a matter of how the developer decides to use them.
If we get Ryzen CPUs in the next-gen consoles, the same problem will occur again and again.

Another thing we shouldn't forget is that it was always planned for GPUs to take over more and more tasks that were CPU tasks in the past. That did not happen. In most cases it is just better to invest GPU resources into graphics, because that sells better.

What I find really interesting is the support for 120Hz. That way a console could also output vsynced 40fps. The problem is that TVs must still support it. 40-45fps would be a good compromise between eye candy and input lag, if only it could be output in sync with the TV. 50fps would work with most TVs today if the game output a 50Hz signal; every HD TV should be able to handle that refresh rate. Maybe that's another thing: unlocked games should output a 50Hz signal.
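The arithmetic behind that 40fps point: a capped framerate only paces evenly when each frame is held for a whole number of refreshes, i.e. when the framerate divides the refresh rate. A small sketch:

```python
# Framerates that vsync cleanly at a given refresh rate are its divisors:
# each frame is then displayed for a whole number of refresh cycles.
def clean_framerates(refresh_hz: int) -> list[int]:
    return [refresh_hz // n for n in range(1, refresh_hz + 1)
            if refresh_hz % n == 0]

print(clean_framerates(60))   # starts [60, 30, 20, 15, ...]
print(clean_framerates(120))  # starts [120, 60, 40, 30, 24, ...]
# 40fps is absent at 60Hz but clean at 120Hz (3 refreshes per frame),
# which is exactly why a 120Hz mode would make a 40fps cap viable.
```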
 
Last edited:
#68
I really don't want to write it, but FPS actually has nothing to do with the CPU.
The developer always decides at how many frames and at what resolution a game runs. If he chooses a 30fps cap, he can spend more render time on each frame. If he chooses to go with 60fps, the game needs to be optimized for that, both the GPU-heavy and the CPU-heavy parts. It is not impossible. Eight Jaguar cores (well, 6.5-7 for a game) are not slow; it is just a matter of how the developer decides to use them.
If we get Ryzen CPUs in the next-gen consoles, the same problem will occur again and again.

Another thing we shouldn't forget is that it was always planned for GPUs to take over more and more tasks that were CPU tasks in the past. That did not happen. In most cases it is just better to invest GPU resources into graphics, because that sells better.

What I find really interesting is the support for 120Hz. That way a console could also output vsynced 40fps. The problem is that TVs must still support it. 40-45fps would be a good compromise between eye candy and input lag, if only it could be output in sync with the TV. 50fps would work with most TVs today if the game output a 50Hz signal; every HD TV should be able to handle that refresh rate. Maybe that's another thing: unlocked games should output a 50Hz signal.
My man, there is such a thing as a CPU bottleneck, and it is very much present on the Xbox One X. Everyone was going crazy about the "6 TFLOPS, basically a GTX 1070" debate, yet does that thing give you the same performance as a PC with a GTX 1070 and a far more modern CPU? No, definitely not. The bottleneck is clearly the CPU, and there have already been a few articles about this very thing. I know Digital Foundry did a video about the Xbox One X; please watch it.
 
#69
Jaguar was about the only option they had for a console-sized APU in 2013 with the graphics they wanted. They were both near the interposer size limits...

The CPU is terribly slow now, but it made sense as a choice when you look at what was available at release. Given that they wanted an APU, the one thing that could be argued is that they could have split the CPU and GPU out.
Yup, the decision to use an APU shows that high-end hardware makes little sense because of diminishing returns. Also, most people don't want to invest a lot of money when they buy a console.
 
#71
I don't believe we are at a point of diminishing returns as far as graphics are concerned. There are plenty of graphical effects that are still too processing-intensive for existing GPUs (ray tracing, for example) and that will make a huge difference once achieved and implemented.
Anyway, if the PS5 is built with more of a focus on VR, 120fps is very important for avoiding motion sickness and obtaining smooth visuals. I am hoping this will equate to high-quality graphics running at 60fps for the non-VR portion of a game. Just a thought...
 
#73
I really don't want to write it, but FPS actually has nothing to do with the CPU.
If both the PS4 Pro and Xbox One X had a decent CPU, most of their games would run at a locked 60fps. Because the CPU is so shitty, most of the GPU power is not utilized, especially on the Xbox One X, which has a decent GPU that could easily run games at 60fps alongside a decent CPU.
 
#75
My man, there is such a thing as a CPU bottleneck, and it is very much present on the Xbox One X. Everyone was going crazy about the "6 TFLOPS, basically a GTX 1070" debate, yet does that thing give you the same performance as a PC with a GTX 1070 and a far more modern CPU? No, definitely not. The bottleneck is clearly the CPU, and there have already been a few articles about this very thing. I know Digital Foundry did a video about the Xbox One X; please watch it.
If both the PS4 Pro and Xbox One X had a decent CPU, most of their games would run at a locked 60fps. Because the CPU is so shitty, most of the GPU power is not utilized, especially on the Xbox One X, which has a decent GPU that could easily run games at 60fps alongside a decent CPU.
I think you don't get it. The CPU is only a bottleneck if the developer decides it is. Yes, the Jaguar cores are not the best CPUs you can get, but they are still very capable.
The problem here is just how developers decide how to use the CPU power and how not to.
If you create a game for PC and then port it to consoles, you will always have problems with one component.
Also, if the PS4 Pro and XB1X had a Ryzen CPU, they could still not deliver 60fps in every case, simply because the GPU would be the limit at current detail levels.
But if you just mean taking the XB1/PS4 base game at 1080p/30 and bumping it up to 60fps, you are correct: then the CPU is the limit, because the game was designed for a 1.6GHz CPU, and a 2.1-2.3GHz CPU just can't deliver double the performance.

If we get Ryzen with the next gen, games will be designed to max out that CPU. Whether a game then runs at 30 or 60fps is still the developer's decision.
 
#76
Vega 64 and GTX 1080 performance are the most plausible.

Optimizations will provide better graphics and performance approaching the GTX 1080 Ti's, but it all depends on the developers.
 
#77
Now I know this post is already over a week old, but... sigh, man, I... I can't even begin to... ugh.
That's wrong on so many levels. An AMD FLOP is EXACTLY THE SAME as an Nvidia FLOP; that's the whole IDEA of a FLOP. It's just a simple statement of how many floating-point operations a given piece of hardware can do in a second.
Saying they are different is like saying 100km/h in a Porsche is faster than 100km/h in a Honda. The fact that some Nvidia GPUs with fewer TFLOPS are better in gaming than AMD GPUs with more TFLOPS comes from the fact that FLOPS don't say anything about shading power or the other parts of the GPU pipeline.
You're the sort of person who thinks a ton of feathers weighs the same as a ton of bricks.
 
#78
I think you don't get it. The CPU is only a bottleneck if the developer decides it is. Yes, the Jaguar cores are not the best CPUs you can get, but they are still very capable.
The problem here is just how developers decide how to use the CPU power and how not to.
If you create a game for PC and then port it to consoles, you will always have problems with one component.
No, mate. You just don't have any idea how garbage they are. I had an AMD APU with one of those (only 4 cores instead of 8), and compared to my i5-3570 @ 4GHz it couldn't do shit. The difference between them is several TIMES, especially when you're doing actual hard work, and that was at 3.5GHz+, not some measly 2GHz.
Secondly, you don't understand how code is written. Despite having 8 threads available, a lot of code still needs to run on the first core.
So while the total power looks interesting on paper, in practice it is garbage.

And no, "the CPU is only a bottleneck if the developer decides" doesn't cut it. The CPU does a lot more work than just telling the GPU what to do. And if you assume everyone is a code wizard, you are wrong; there are only a handful of people in the industry who can deliver code wizardry in a timely fashion.

Also, if the PS4 Pro and XB1X had a Ryzen CPU, they could still not deliver 60fps in every case, simply because the GPU would be the limit at current detail levels.
If the Xbox One X had a Ryzen CPU, it could deliver The Witcher 3 at a locked 1080p/60 without a shadow of a doubt. You don't know the performance of PC parts. And we are talking about a console here, so the hardware should work even faster thanks to better optimization.
 
Last edited:
#81
Well, we are about to have a few answers, with that new Chinese console coming out. It will be a Ryzen CPU coupled with an integrated GPU equivalent to the PS4 Pro's. I think you will see some surprising performance, as the PS4 Pro is grossly bottlenecked by its crappy Jaguar cores. It will probably exceed Xbox One X performance in most scenarios.

https://www.techradar.com/au/news/amd-brings-vega-graphics-to-game-consoles-in-china-where-to-next
Sony doesn't need more than RX Vega 64 performance to bring us true next-gen graphics.

First, because console games run at medium/high settings. Developers never use very-high/ultra settings, max AA, or max anisotropic filtering. And I'm just talking about filters.

The second important thing is that developers always optimize games with newer engines, APIs, and software.
An RX Vega 64 runs all current-gen games at 4K ultra settings at an average of 45fps. So if we reduce settings to medium/high, it will deliver 4K/60fps easily (we all know 30fps is the standard). The secret is just taking all those spare frames and converting them into more polygons, particles, animations, and new-generation graphical effects in place of ultra textures, shadows, and AA filters. That happens every generation.

The problem is if Sony opts for RX Vega 56/GTX 1070 performance. :lollipop_crying::lollipop_frowning_mouth::lollipop_anguish::lollipop_astonished: I can't imagine a true generational leap with less than GTX 1080/Vega 64 performance.
 
Last edited:
#83
That's because the X1X runs things at lower quality than PC; run them both at equal settings and the 1070 is always the faster one.
My personal experience is that the X1X is exactly RX 580/1060 territory.
It's been shown time and time again that the X1X beats a 1060 easily and is much closer to a 1070, as the OP said. Plenty of benchmarks, from Wolfenstein to Far Cry, show this.
 
#85
Hardware optimisations only get you so far (i.e. PS4 Pro checkerboarding or X1X hardware commands). Software optimisations specific to the hardware help a lot too; there were heaps of commands Xbox 360 games used that were technically available on PC GPUs but never used, because of the lack of driver standardisation.

Basically, performance is everything working together. The fastest car around Top Gear's track is never the one with the most horsepower.

It's odd how some say AMD flops are different from Nvidia flops, then claim console game performance doesn't count because it was optimised for AMD. Most PC games are optimised for Nvidia and most console games are optimised for AMD; that's just how it is. It's not Nvidia flops vs AMD cheat codes.

The PS5 is likely to be at about 1080 Ti performance, 9-14 TFLOPS. What I think is really going to matter is cooling, RAM, and SSD inclusion. A little more GPU won't be as useful as being able to stream more textures faster.
 
Last edited:
#86
It's nonsense to compare a GPU in a PC directly with the GPU in the next PS5; apples and oranges.
The PS4 has a GPU equivalent to the HD 7850, a six-year-old mid-range part, and we have graphics like Horizon Zero Dawn, God of War, Spider-Man, etc. It's better than any graphics on PC, and that's with 1.8 teraflops.
Now imagine 10-12 TFLOPS with a Ryzen CPU as the lowest common denominator.
 
#87
Hardware optimisations only get you so far (i.e. PS4 Pro checkerboarding or X1X hardware commands). Software optimisations specific to the hardware help a lot too; there were heaps of commands Xbox 360 games used that were technically available on PC GPUs but never used, because of the lack of driver standardisation.

Basically, performance is everything working together. The fastest car around Top Gear's track is never the one with the most horsepower.

It's odd how some say AMD flops are different from Nvidia flops, then claim console game performance doesn't count because it was optimised for AMD. Most PC games are optimised for Nvidia and most console games are optimised for AMD; that's just how it is. It's not Nvidia flops vs AMD cheat codes.

The PS5 is likely to be at about 1080 Ti performance, 9-14 TFLOPS. What I think is really going to matter is cooling, RAM, and SSD inclusion. A little more GPU won't be as useful as being able to stream more textures faster.
I'm still predicting GTX 1080/Vega 64 performance for the PS5, nothing more and nothing less. That's more than enough to bring us a true generational leap.

I just don't know if Navi at 7nm will provide more efficiency per TFLOP, making 10 TFLOPS equal to Nvidia's Pascal in performance.
For example: a 9-TFLOPS GTX 1080 = a 9-TFLOPS Navi "RX 780", instead of needing the 12.6-TFLOPS Vega 64 to be equivalent to a GTX 1080.

We don't know how much performance these next-gen 7nm GPUs will gain. But I think if Sony puts 10 TFLOPS inside that box, they'll know what they are doing.
 
Last edited:
#88
It's nonsense to compare a GPU in a PC directly with the GPU in the next PS5; apples and oranges.
The PS4 has a GPU equivalent to the HD 7850, a six-year-old mid-range part, and we have graphics like Horizon Zero Dawn, God of War, Spider-Man, etc. It's better than any graphics on PC, and that's with 1.8 teraflops.
Now imagine 10-12 TFLOPS with a Ryzen CPU as the lowest common denominator.
Do you think these are possible on the current generation, or are they next/cross-gen games?

They are so far beyond everything I have ever seen!
No release date...

 
Last edited:
#89
Do you think these are possible on the current generation, or are they next/cross-gen games?

They are so far beyond everything I have ever seen!
No release date...
If those aren't straight-up pre-rendered bullshot movies, then those "gameplay" trailers are running on very expensive dev hardware generations ahead of what a $200 console can do. Sony bros should be used to this by now; never trust the E3 presentations.
 
#90
Did you guys seriously necro this old-ass, borderline-joke thread for the sake of continuing a console war?

What is wrong with you?
 
Last edited:
#92
It's nonsense to compare a GPU in a PC directly with the GPU in the next PS5; apples and oranges.
The PS4 has a GPU equivalent to the HD 7850, a six-year-old mid-range part, and we have graphics like Horizon Zero Dawn, God of War, Spider-Man, etc. It's better than any graphics on PC, and that's with 1.8 teraflops.
Now imagine 10-12 TFLOPS with a Ryzen CPU as the lowest common denominator.

You don't play many PC games at ultra/60fps on a 1440p-4K monitor... do you? Horizon on a base PS4 has terrible image quality, as does God of War; motion blur is used to hide that on your TV. There are many PC and cross-platform games that look (and, more importantly, run) far better when maxed out than anything on any console (Forza, FFXV, Kingdom Come, etc.). See those at ultra textures, 4K, 60fps+ in person and you will be singing a different tune.

Uncharted 4 looks great, but the environments are small, and look closely... the visual trickery breaks down.

You can absolutely compare PC and console hardware, as they are now identical architectures. This is why every single multiplat looks and runs better on a 1070 than on a PS4 Pro, or even an Xbox One X (which is hugely bottlenecked by its CPU).

Did you guys seriously necro this old-ass, borderline-joke thread for the sake of continuing a console war?

What is wrong with you?
THE N64 WAS A BEAST! PSX was inferior!
 
Last edited:
#93
You can absolutely compare PC and console hardware, as they are now identical architectures. This is why every single multiplat looks and runs better on a 1070 than on a PS4 Pro, or even an Xbox One X (which is hugely bottlenecked by its CPU).
A GPU that costs $400? The same price as a PS4 Pro. Of course it has to look and run better! Why not compare a $400 PC build (CPU, GPU, RAM, case, OS, board, PSU) with a $400 PS4 Pro or a $500 Xbox One X? It's not fair to say a $400 GPU is better than any console on the market right now.
 
Last edited:
#94
A GPU that cost $400? The same price as a PS4 Pro. Of course it has to look and run better! Why not compare a $400 PC build (CPU, GPU, RAM, Case, OS, Board, PSU) with a PS4 Pro or a Xbox One X at $500? Is not fair to say a $400 GPU it's better than any console right now on the market.
Now take into account paying for online play, and the cost of let's say....50 games over the lifetime of the console (physical and digital), and even a monster $1,500 pc with hundreds of games ends up being cheaper than your ps4. Games are simply far cheaper on PC, especially when bought in bulk bundles during sales. PSN monthly give aways can't compete-plus you lose access the second you stop paying. I have ~600 games on my steam account, did not spend more than $800 for them during sales and humble bundles, at full price or even partial sales (eg typical console situation of 30% off) they would be worth $5,000-$7,000.

You've been lured into the closed ecosystem and tricked into staying. It's actually more expensive over the lifespan of a console generation to game exclusively on a console. And no, you don't need to upgrade your PC every two years; I do it every five and I'm still WAY ahead. I've spent ~$10,000+ on console gaming since 2013. Not worth it, lol. Trust me, I know all about the comparative long-term costs. My brand-spanking-new monster PC cost me $1,200 to build last year, and the games are all still there, ready to play, going all the way back to 2008 when I first opened my Steam account. PCs rule, consoles drool.
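A back-of-the-envelope version of this comparison, in the spirit of the post above. All the figures here are rough illustrative assumptions (game counts, average prices, subscription cost), not measured data:

```python
# Rough lifetime-cost comparison, console vs PC, over one generation.
# Every number below is an assumption for illustration only.

GAMES = 50  # games bought over the generation

console = {
    "hardware": 400,               # PS4 Pro at launch price
    "online":   60 * 6,            # ~6 years of a paid online subscription
    "games":    GAMES * 60 * 0.7,  # average ~30% off full price
}
pc = {
    "hardware": 1500,              # "monster" build lasting the generation
    "online":   0,                 # free multiplayer
    "games":    GAMES * 15,        # deep sales / bundles, ~$15 average
}

print("console total:", sum(console.values()))  # 2860.0
print("pc total:     ", sum(pc.values()))       # 2250
```

Under these (debatable) assumptions the PC comes out ahead despite the much higher hardware cost, which is the whole argument: per-game price dominates over a long generation.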
 
Last edited:
#95
Now take into account paying for online play and the cost of, let's say, 50 games over the lifetime of the console (physical and digital), and even a monster $1,500 PC with hundreds of games ends up being cheaper than your PS4. Games are simply far cheaper on PC, especially when bought in bulk bundles during sales. PSN's monthly giveaways can't compete, plus you lose access the second you stop paying. I have ~600 games on my Steam account and didn't spend more than $800 on them during sales and Humble Bundles; at full price, or even at typical console discounts (e.g. 30% off), they would be worth $5,000-$7,000.

You've been lured into the closed ecosystem and tricked into staying. It's actually more expensive over the lifespan of a console generation to game exclusively on a console. And no, you don't need to upgrade your PC every two years; I do it every five and I'm still WAY ahead. I've spent ~$10,000+ on console gaming since 2013. Not worth it, lol. Trust me, I know all about the comparative long-term costs. My brand-spanking-new monster PC cost me $1,200 to build last year, and the games are all still there, ready to play, going all the way back to 2008 when I first opened my Steam account. PCs rule, consoles drool.

You can wait years and buy almost any game for a couple of dollars, consoles included. That's far from a "plus". $1,200 isn't buying a "monster" PC either.
 
#96
No, typically you wait a few weeks for legit CD-key sites to offer deals on new releases. Suddenly a wild 50% off appears. The benefits of globalism and an open platform: if the big corporations can benefit from cross-market trade, so can I.

And if you wait years on console, the multiplayer component of your game won't even function... so...
 
Last edited:
#97
A GPU that costs $400? The same price as a PS4 Pro. Of course it has to look and run better! Why not compare a $400 PC build (CPU, GPU, RAM, case, OS, board, PSU) with a PS4 Pro, or a $500 Xbox One X? It's not fair to say a $400 GPU is better than any console on the market right now.
And here you are wrong. What makes you think PC gamers buy a complete new PC every time a console comes out? The only thing I've had to upgrade on my PC in the last 10 years was the GPU; after reselling my old GPU, I paid a grand total of 110 bucks, and I play everything at 60 fps+ on high settings without issues, and will most likely play PS5/next Xbox titles without issues on top of it.

The logic of buying a complete new PC doesn't hold up. It's the logic a console user would apply, though, because that's the upgrade behaviour consoles have trained them into. The PS4 Pro is nothing but a GPU upgrade that costs exactly 400 bucks if you already have the base version.

PC gaming can be very cheap.

I think you don't get it. The CPU is only a bottleneck if the developer lets it be. Yes, the Jaguar cores are not the best CPUs you can get, but they are still quite capable.
The real issue is how developers decide to use the CPU power, and how not to.
If you create a game for PC and then port it to consoles, you will always have problems with one component or another.
Also, even if the PS4 Pro and XB1X had a Ryzen CPU, they still couldn't deliver 60 fps in every case, simply because the GPU would be the limit at current detail levels.
But if you just mean taking the base XB1/PS4 game at 1080p/30 and bumping it up to 60 fps, then you're correct: there the CPU is the limit, because the game was designed around a 1.6 GHz CPU, and a 2.1-2.3 GHz CPU simply can't deliver double the performance.
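The clock-scaling point is easy to sanity-check with naive best-case arithmetic (assuming performance scales linearly with clock on the same Jaguar core, which is already generous):

```python
# Best-case Jaguar clock scaling: performance assumed linear in clock speed.
base_clock = 1.6   # GHz, base PS4 / Xbox One
pro_clock  = 2.3   # GHz, upper end for PS4 Pro / Xbox One X
needed     = 2.0   # speedup required to go from 30 fps to 60 fps

speedup = pro_clock / base_clock
print(f"best-case speedup: {speedup:.2f}x")  # ~1.44x, well short of 2x
```

Even in this best case the mid-gen refreshes get nowhere near the 2x CPU throughput a straight 30-to-60 fps bump would demand.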

If we get Ryzen with the next gen, games will be designed to max out that CPU. Whether a game then runs at 30 or 60 fps is still the developer's decision.
That's the thing: Ryzen will be just as much of a dud in the PS5 as Jaguar was in the PS4. If CPU performance moves upward drastically, the GPU can't keep pace anyway. With 4K as the main focus, most of the GPU's performance will already disappear into resolution; add 60 fps on top of that, and good luck finding a 20-tflop GPU that can keep up while delivering a next-generation look.

The CPU is not going to be the main focus in the PS5 because it makes no sense for them to do so. They will pick a modest Ryzen that can hit their 30 fps mark and pour their resources into the GPU, since that's what's going to take the main beating this time around. Also, even with the weak Jaguar at this day and age, there is not a single game they care about that doesn't run on those consoles. This gen could go on for another five years if they wanted it to.
 
Last edited:
#98
And here you are wrong. What makes you think PC gamers buy a complete new PC every time a console comes out? The only thing I've had to upgrade on my PC in the last 10 years was the GPU; after reselling my old GPU, I paid a grand total of 110 bucks, and I play everything at 60 fps+ on high settings without issues, and will most likely play PS5/next Xbox titles without issues on top of it.

The logic of buying a complete new PC doesn't hold up. It's the logic a console user would apply, though, because that's the upgrade behaviour consoles have trained them into. The PS4 Pro is nothing but a GPU upgrade that costs exactly 400 bucks if you already have the base version.

PC gaming can be very cheap.



That's the thing: Ryzen will be just as much of a dud in the PS5 as Jaguar was in the PS4. If CPU performance moves upward drastically, the GPU can't keep pace anyway. With 4K as the main focus, most of the GPU's performance will already disappear into resolution; add 60 fps on top of that, and good luck finding a 20-tflop GPU that can keep up while delivering a next-generation look.

The CPU is not going to be the main focus in the PS5 because it makes no sense for them to do so. They will pick a modest Ryzen that can hit their 30 fps mark and pour their resources into the GPU, since that's what's going to take the main beating this time around. Also, even with the weak Jaguar at this day and age, there is not a single game they care about that doesn't run on those consoles. This gen could go on for another five years if they wanted it to.
I hope Sony reads your post!
Several developers are sharing their requests and wishes with Mark Cerny right now about what a PS5 should be over the next two years.
I think many of them are wishing for a more powerful CPU; they want to deliver better physics and AI in next-gen games. But they are forgetting how many resources 4K consumes.
I don't really know if they care about increasing polygon density, textures, animations, lighting, and so on.
I still say the PS5 should deliver Vega 64/GTX 1080-level performance. Those two GPUs are totally capable of producing next-gen graphics if developers use their talent, mainly because console games run at 30 fps as standard. But if Sony settles for less than GTX 1080/Vega 64 performance... I don't know...
 
Last edited:
#99
You don't play many PC games at ultra/60 fps on a 1440p-4K monitor... do you? Horizon on a base PS4 has terrible IQ, as does God of War; motion blur is used to hide that on your TV.
There are many PC games and cross-platform games that look (and, more importantly, run) far better when maxed than anything on any console (Forza, FF15, Kingdom Come, etc.). See them at ultra textures, 4K, 60+ fps in person and you will be singing a different tune.
Cross-platform games look better on a better-specced PC, but to this day I haven't seen or played a better-looking game than Horizon on the Pro. An AAA multiplatform title is never going to look better than an AAA exclusive, for exactly that reason.
Uncharted 4 looks great, but the environments are small, and if you look closely, the visual trickery breaks down.
Lost Legacy has big environments and the same visual quality as Uncharted 4, and it's not even the best-looking game on PS4, so your statement breaks down as well.
You can absolutely compare PC and console hardware, as they now share identical architectures. This is why every single multiplat looks and runs better on a 1070 than on a PS4 Pro or even an Xbox One X (which is hugely bottlenecked by its CPU).
Every processor in a console is a customized version of its PC counterpart; that makes them different, so you can't compare them directly.
 
Does it? I've only seen 1060 benchmarks with it over on Jermgaming that show FC5 has an absurd drop in performance going from 80%/90% of 4K to 100% native 4K.

No argument on your main point though, I agree. In fact, I don't even need to agree, it's a fact and I acknowledge that.
In Digital Foundry's Far Cry 5 analysis they tested the 1060 and the RX 580, and both were too weak to provide a good 30 fps experience at Xbox One X settings. The Xbox One X GPU is an RX 580 on steroids: it has an improved architecture and new features compared to the RX 580 (for example, the improved delta color compression makes a big difference to memory bandwidth, and MS also built DX12 into it at the hardware level), so of course it should be faster than the RX 580, especially in a game like Far Cry 5 that loves memory bandwidth (which is why going from 80% to 100% of 4K makes such a big difference in performance). But right now I'm most impressed with Wolfenstein 2's performance on the X: dynamic 4K, high settings, and still 55-60 fps, with 45 fps on average in the locked native 4K mode. For comparison, a GTX 1060 with a 2 GHz OC runs the same game at low/medium settings at around 40-55 fps in dynamic 4K, and 30-45 fps locked at native 4K.

It's impressive what MS has achieved: with just 1.8 tflops more, their console sometimes pushes twice as many pixels as the PS4 Pro (4.2 tflops), which clearly shows that tflops alone are not everything and that GPU architecture matters just as much. If the next Xbox or PlayStation features a 12-tflop GPU, then with optimisations similar to what MS did for the Xbox One X, it should provide a true next-gen experience.
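The "twice the pixels from 1.8 tflops more" claim is easy to put in ratio form. A minimal sketch, assuming for illustration that the X runs native 4K where the Pro checkerboards (i.e. renders half the pixels per frame); the exact split varies per game:

```python
# Naive compute-vs-pixels comparison, Xbox One X vs PS4 Pro.
# Assumption: X renders native 4K where the Pro checkerboards (half the pixels).
x1x_tflops, pro_tflops = 6.0, 4.2

native_4k  = 3840 * 2160      # ~8.3 million pixels per frame on the X
pro_pixels = native_4k / 2    # checkerboarded output on the Pro

print(f"compute ratio: {x1x_tflops / pro_tflops:.2f}x")  # ~1.43x the tflops...
print(f"pixel ratio:   {native_4k / pro_pixels:.0f}x")   # ...for 2x the pixels
```

Getting 2x the rendered pixels out of ~1.43x the raw compute is exactly why architecture improvements, not tflops alone, carry the result.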

You don't play many PC games at ultra/60 fps on a 1440p-4K monitor... do you? Horizon on a base PS4 has terrible IQ, as does God of War; motion blur is used to hide that on your TV. There are many PC games and cross-platform games that look (and, more importantly, run) far better when maxed than anything on any console (Forza, FF15, Kingdom Come, etc.). See them at ultra textures, 4K, 60+ fps in person and you will be singing a different tune.

Uncharted 4 looks great, but the environments are small, and if you look closely, the visual trickery breaks down.

You can absolutely compare PC and console hardware, as they now share identical architectures. This is why every single multiplat looks and runs better on a 1070 than on a PS4 Pro or even an Xbox One X (which is hugely bottlenecked by its CPU).



THE N64 WAS A BEAST! PSX was inferior!
So far, Uncharted 4 is the most detailed game I have ever played: insane attention to detail up close (especially in Nathan's house at the end of the game), and the levels were quite open in design compared with the corridor-like levels of Uncharted 1-3.
 
Last edited: