
Consoles will still lead in Industry Winning Graphics....

Barnabot

Member
Wait, what?! What exactly happened? I checked the review thread and didn't see anything. And the post linked to his ban, at least in my opinion, didn't seem like it warranted one. Look at his next post after that.
I guess those two were arguing with fanboys who think the ninth-generation consoles are the second coming, and they thought they could change those console warriors' minds. Why even bother in the first place?

Learn from Filthy Frank.
 

Md Ray

Member
So you think that a PS5 Pro will make a massive jump from the 1080 Ti-class PS5 all the way to 3090 levels?
It seemed impossible 2 years ago for consoles to have a GPU with hardware-based RT and an NVMe SSD, let alone a PCIe Gen 4-based one, and all that for $500. Yet here we are.
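
Just to put very rough numbers on that comparison (my own back-of-the-envelope arithmetic from public paper specs, nothing official, and the "PS5 ≈ 1080 Ti" framing is itself only an approximation):

```python
# Rough peak FP32 throughput from public paper specs (2 FLOPs per
# shader unit per clock). Paper TFLOPS ignore architecture and memory
# differences, so treat these as ballpark figures only.

def tflops(shader_units: int, boost_ghz: float) -> float:
    return shader_units * 2 * boost_ghz / 1000.0

gpus = {
    "PS5 (36 CUs x 64 = 2304 @ 2.23 GHz)": tflops(2304, 2.23),   # ~10.3
    "GTX 1080 Ti (3584 @ ~1.58 GHz)":      tflops(3584, 1.582),  # ~11.3
    "RTX 3090 (10496 @ ~1.70 GHz)":        tflops(10496, 1.695), # ~35.6
}

for name, value in gpus.items():
    print(f"{name}: {value:.1f} TFLOPS")
```

By that crude measure, "3090 levels" would be roughly a 3.5x jump over the PS5, a bigger step than the PS4 to PS4 Pro mid-cycle bump (~1.84 to ~4.2 TFLOPS, about 2.3x).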
 

Md Ray

Member
Two smart technical posters @Thugnificient @VFXVeteran banned just now. Why?
Wait, what?! What exactly happened? I checked the review thread and didn't see anything. And the post linked to his ban, at least in my opinion, didn't seem like it warranted one. Look at his next post after that.

Because of this unnecessary post: 👇🏼

It's also available on PC. Can't wait to see the more graphical effects over the PS5. :D

And it's getting tiring.
 
Because of this unnecessary post: 👇🏼



And it's getting tiring.
With these guys saying PS5 is going to look better than PC in these games, I can't really blame him, as anyone with common sense knows which hardware will be more performant. But I probably would have worded it a little differently.
 

Md Ray

Member
With these guys saying PS5 is going to look better than PC in these games, I can't really blame him, as anyone with common sense knows which hardware will be more performant. But I probably would have worded it a little differently.
But the post VFX replied to didn't say anything like that, did it?
 

Md Ray

Member
With these guys saying PS5 is going to look better than PC in these games, I can't really blame him, as anyone with common sense knows which hardware will be more performant. But I probably would have worded it a little differently.
For all we know, Kena on PS5 could be using PC's max settings. Since it's a cross-gen game, it's totally plausible. I could see a scenario where the PS4 version uses PC's Medium/High settings while the PS5 version is maxed out in comparison.
 
But the post VFX replied to didn't say anything like that, did it?
Maybe check out their post history. VFX was replying to a guy who has a history of trying to downplay his posts.

For all we know, Kena on PS5 could be using PC's max settings. Since it's a cross-gen game, it's totally plausible. I could see a scenario where the PS4 version uses PC's Medium/High settings while the PS5 version is maxed out in comparison.
Not to say that can't happen, but it hasn't happened in the past. If it's using maxed out PC settings, the game wouldn't be a good PC port, as you wouldn't be able to enable any extra settings or graphical features.
 

Md Ray

Member
Maybe check out their post history. VFX was replying to a guy who has a history of trying to downplay his posts.




Not to say that can't happen, but it hasn't happened in the past. If it's using maxed out PC settings, the game wouldn't be a good PC port, as you wouldn't be able to enable any extra settings or graphical features.
Pretty sure it's happened with early cross-gen games. Assassin's Creed Black Flag comes to mind. It's fine to have a game running at PC's max settings on a console, especially for a cross-gen indie title. I wouldn't say "the game wouldn't be a good PC port" just because a console is able to render all of the top settings early on. But next-gen games will for sure have additional rendering features and higher-quality effects on PC over XSX & PS5.
 
Not to say that can't happen, but it hasn't happened in the past. If it's using maxed out PC settings, the game wouldn't be a good PC port, as you wouldn't be able to enable any extra settings or graphical features.
The Unreal demo, for example, used 8K textures and Hollywood-level asset geometry with polygon-per-pixel rendering. Geometry-wise, a PC game cannot have more visible geometric detail than something already rendering a polygon per pixel. Texture-wise, sure, you could include extra "nightmare" 16K textures, but that wouldn't be perceptible; it would just be bragging and a misuse of rendering resources. The image quality was also extremely good in the Unreal demo.

We are getting to a point where the level of texture and polygon detail cannot be meaningfully increased beyond what the consoles can handle.

Judging by the NVIDIA Marbles demo, unless more complex animation is significantly more taxing on path-traced lighting, it also seems we are one or two console generations away from path-traced lighting in mainline games. At that point you can add that lighting on top of texture and polygon detail that cannot be meaningfully improved.

What remains after that is physics, animation, and framerate. If DLSS-like solutions are implemented in the future, expect consoles to easily handle 60+ fps at 4K; the Unreal demo already uses Hollywood-level assets and runs fine on PS5 (some estimates say 45+ fps), so additional performance can go into framerate.

Sure, you will be able to say you game at 8K 120fps on a future high-end rig, but that won't be much of a difference from 4K 60fps with the same assets, the same lighting, and excellent image quality.

Before, when you had blurry textures, crappy image quality, and sub-30fps on consoles, you could say there's a big difference. But as consoles get sharp textures, excellent image quality, 60fps, Hollywood-level assets and geometry, and, a few generations down the line, path-traced lighting, the difference becomes much smaller.
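
To put rough numbers on the polygon-per-pixel and resolution points above (my own arithmetic, purely illustrative):

```python
# Back-of-the-envelope pixel and triangle budgets (illustrative only).

def pixels(width: int, height: int) -> int:
    return width * height

res_4k = pixels(3840, 2160)  # ~8.3 million pixels
res_8k = pixels(7680, 4320)  # ~33.2 million pixels

# With polygon-per-pixel rendering, the useful on-screen triangle
# count is capped by the pixel count; extra geometry isn't visible.
print(f"4K triangle budget per frame: ~{res_4k / 1e6:.1f} million")
print(f"8K triangle budget per frame: ~{res_8k / 1e6:.1f} million")

# Raw pixel throughput of 8K 120fps vs 4K 60fps with identical assets:
ratio = (res_8k * 120) / (res_4k * 60)
print(f"8K 120fps pushes {ratio:.0f}x the pixels of 4K 60fps")  # 8x
```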
 
Pretty sure it's happened with early cross-gen games. Assassin's Creed Black Flag comes to mind. It's fine to have a game running at PC's max settings on a console, especially for a cross-gen indie title. I wouldn't say "the game wouldn't be a good PC port" just because a console is able to render all of the top settings early on. But next-gen games will for sure have additional rendering features and higher-quality effects on PC over XSX & PS5.
Imagine if RDR2 used the same low-medium settings that consoles used and didn't do anything extra for the PC port? It would be such a letdown. So of course PC gamers expect better visuals and performance than consoles; otherwise they would be playing it on console instead.

Most devs give PC players settings that they can crank up and showcase how they intended the game to look, without restrictions.
The Unreal demo, for example, used 8K textures and Hollywood-level asset geometry with polygon-per-pixel rendering. Geometry-wise, a PC game cannot have more visible geometric detail than something already rendering a polygon per pixel. Texture-wise, sure, you could include extra "nightmare" 16K textures, but that wouldn't be perceptible; it would just be bragging and a misuse of rendering resources. The image quality was also extremely good in the Unreal demo.

We are getting to a point where the level of texture and polygon detail cannot be meaningfully increased beyond what the consoles can handle.

Judging by the NVIDIA Marbles demo, unless more complex animation is significantly more taxing on path-traced lighting, it also seems we are one or two console generations away from path-traced lighting in mainline games. At that point you can add that lighting on top of texture and polygon detail that cannot be meaningfully improved.

What remains after that is physics, animation, and framerate. If DLSS-like solutions are implemented in the future, expect consoles to easily handle 60+ fps at 4K; the Unreal demo already uses Hollywood-level assets and runs fine on PS5 (some estimates say 45+ fps), so additional performance can go into framerate.

Sure, you will be able to say you game at 8K 120fps on a future high-end rig, but that won't be much of a difference from 4K 60fps with the same assets, the same lighting, and excellent image quality.

Before, when you had blurry textures, crappy image quality, and sub-30fps on consoles, you could say there's a big difference. But as consoles get sharp textures, excellent image quality, 60fps, Hollywood-level assets and geometry, and, a few generations down the line, path-traced lighting, the difference becomes much smaller.
That same Unreal Engine demo is coming to PC in the coming months, so we can test how PC fares against PS5 in that exact demo. With better hardware, you could push fidelity, animation, physics, raytracing, etc., to a higher level of detail. Whether that can be perceived is up to the end user, as some people cannot tell the difference between 1080p and 4K, or 30fps and 144fps.
 
That same Unreal Engine demo is coming to PC in the coming months, so we can test how PC fares against PS5 in that exact demo. With better hardware, you could push fidelity, animation, physics, raytracing, etc., to a higher level of detail. Whether that can be perceived is up to the end user, as some people cannot tell the difference between 1080p and 4K, or 30fps and 144fps.
Yes, it's up to the user, but even framerate aficionados have trouble perceiving the difference between 120fps and 144fps, for example.

It is also only a matter of time before we have photorealistic images produced in real time. Once we have that on consoles, the generations that follow will go towards framerate.

Say once we have ~4K 120fps photoreal images on consoles, can we say ~8K 240fps provides a meaningful difference? Sure, some may say they see the difference, but even people with so-called superhuman vision, far above average, do not find a significant image quality improvement between movies in 4K and 8K at a reasonable distance on 100-inch screens. PC gamers who are sensitive to framerate have trouble telling any benefit of 144fps over 120fps.

We get to a point similar to connoisseurs who claim to tell vast differences in ultra-expensive products, yet are easily tricked into preferring cheap products in blind tests.
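
For what it's worth, here are the two quick calculations behind those claims (my own numbers; the ~60 pixels-per-degree figure for 20/20 acuity is a common rule of thumb, not a hard limit, and the 3 m viewing distance is just an assumed example):

```python
import math

# 1) Frame-time gap between 120 fps and 144 fps.
ft_120 = 1000 / 120  # ~8.33 ms per frame
ft_144 = 1000 / 144  # ~6.94 ms per frame
print(f"120 -> 144 fps saves ~{ft_120 - ft_144:.2f} ms per frame")

# 2) Pixels per degree of vision on a 100-inch 16:9 screen at 3 m.
def pixels_per_degree(h_res: int, diagonal_in: float, distance_m: float) -> float:
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # screen width in metres
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return h_res / fov_deg

for label, h_res in [("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(h_res, 100, 3.0)
    print(f"{label} on a 100-inch screen at 3 m: ~{ppd:.0f} px/deg")
```

Both resolutions land well above the ~60 px/deg acuity rule of thumb at that distance, which is why the 4K-to-8K step is so hard to see, while 120 vs 144fps works out to about 1.4 ms per frame.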
 
Yes, it's up to the user, but even framerate aficionados have trouble perceiving the difference between 120fps and 144fps, for example.

It is also only a matter of time before we have photorealistic images produced in real time. Once we have that on consoles, the generations that follow will go towards framerate.

Say once we have ~4K 120fps photoreal images on consoles, can we say ~8K 240fps provides a meaningful difference? Sure, some may say they see the difference, but even people with so-called superhuman vision, far above average, do not find a significant image quality improvement between movies in 4K and 8K at a reasonable distance on 100-inch screens. PC gamers who are sensitive to framerate have trouble telling any benefit of 144fps over 120fps.

We get to a point similar to connoisseurs who claim to tell vast differences in ultra-expensive products, yet are easily tricked into preferring cheap products in blind tests.
That's where I feel raytracing will definitely be the differentiator this generation. PC seems to have more robust hardware, and much of that power will go to raytracing, as opposed to just higher resolution and higher framerate. We'll start to see this with Cyberpunk and other releases soon after.
 

Md Ray

Member
Imagine if RDR2 used the same low-medium settings that consoles used and didn't do anything extra for the PC port? It would be such a letdown. So of course PC gamers expect better visuals and performance than consoles; otherwise they would be playing it on console instead.

Most devs give PC players settings that they can crank up and showcase how they intended the game to look, without restrictions.
RDR2 is not a good example. It's a game built from the ground up for current-gen consoles and released in the middle of their lifecycle almost 2 years ago; it's not a cross-gen game like Kena. That's why I gave you the example of AC Black Flag: it was a cross-gen PS3/X360 game using lower-quality graphics, but the PS4/XB1 versions were largely identical to PC's max settings in terms of assets and effects work at that time. It's OK for cross-gen games to get PC's max settings on a new console. AC Unity was a game built from the ground up for the new machines, leaving PS3/X360 behind, and it had additional draw distance plus higher-res textures and shadows on PC compared to PS4/XB1. This is going to continue for PS5/XSX games as well.

Hence the sentence in my previous post: "But next-gen games will for sure have additional rendering features and higher-quality effects on PC over XSX & PS5."
 