
Digital Foundry is doing an Avatar console vs PC comparison using Performance Mode PS5 vs PC Unobtanium Settings

ChiefDada

Gold Member
He wasn't comparing the max fidelity between platforms. He was comparing the scalability of the engine at a given refresh rate. The PS5 could just as easily have been swapped out for a 3060.

That is exactly what the pictures he tweeted were showcasing. There was nothing related to FPS shown in the captures; they illustrated differences in graphical presets and not much else.

Anyways, I'm beyond bored with this conversation.

He very likely will do an optimized settings guide for mid-tier rigs, as he so often does. This is just to demonstrate how the game scales at 60fps... and he uses 60fps because 30fps would be completely irrelevant to the PC crowd; no one wants that.

And that's fantastic! Everything I've said in this thread is simply based on the tweets. Myself and others consider the tweet comparisons flawed. Yourself and others think it's fine. I think Avatar looks really good on all platforms and I'm pleasantly surprised about Series S results. 4090/12900k is unquestionably the best looking/performing in all manners of comparison.
 

Gaiff

SBI’s Resident Gaslighter
And that's fantastic! Everything I've said in this thread is simply based on the tweets. Myself and others consider the tweet comparisons flawed. Yourself and others think it's fine. I think Avatar looks really good on all platforms and I'm pleasantly surprised about Series S results. 4090/12900k is unquestionably the best looking/performing in all manners of comparison.
Oh, I know this perfectly well and pointed it out. The tweet is part of a larger video that will come later, but the OP conveniently left this out. Cheers.
 

FireFly

Member
That is exactly what the pictures he tweeted were showcasing. There was nothing related to FPS shown in the captures; they illustrated differences in graphical presets and not much else.
He explained the context of the pictures in a subsequent tweet, also in the OP. If the argument is that his first tweet was misleading, then OK. But if you're going to disbelieve his own explanation for making the comparison, then yeah I don't think the discussion is going to go anywhere.
 

shamoomoo

Member
Because then there would be absolutely NO reason to use DLSS Performance if this was the objective? IQ is part of the image fidelity (especially in this day and age when more than just pixels scale with resolution) and you're out to lunch if you don't think it is.

The objective was 60fps vs 60fps because no one gives a shit about 30fps on PC, especially not anyone playing on a 4090.
Image clarity is irrelevant when the comparison is about the quality of the tech put into the console version in quality mode vs comparable GPUs and CPUs on PC.
 

shamoomoo

Member
He wasn't comparing the max fidelity between platforms. He was comparing the scalability of the engine at a given refresh rate. The PS5 could just as easily have been swapped out for a 3060.
Alex has relatively low-end GPUs; his PC vs PS5 comparison would be kind of irrelevant if he were focusing on PC alone for a tech comparison.
 

rodrigolfp

Haptic Gamepads 4 Life
Output resolution and graphical presets (e.g. "max settings") are independent of each other. You all know this. Stop being weird and selectively ignorant.
As people have already pointed out, blurry IQ and/or reconstruction artifacts hurt the quality of every asset.
 

DeepEnigma

Gold Member
But they literally spent half of the last DF direct saying GTA6's 30 fps limit is fine because sometimes you have to give up 60 fps for higher fidelity.
30fps was fine for Starfield too.
 

Killer8

Member
The point of the comparison was to show how scalable the engine is, so pitting performance mode on PS5 against the max settings on PC showcases the full range of what to expect from the engine.

It would've been more clear if he just left out all of the talk about framerates for each comparison, since it's not the main point.
 

SlimySnake

Flashless at the Golden Globes
The point of the comparison was to show how scalable the engine is, so pitting performance mode on PS5 against the max settings on PC showcases the full range of what to expect from the engine.

It would've been more clear if he just left out all of the talk about framerates for each comparison, since it's not the main point.
If that was the goal, then he should've picked the Series S version.
 

shamoomoo

Member
As people have already pointed out, blurry IQ and/or reconstruction artifacts hurt the quality of every asset.
So you can't tell how many trees are in a scene, or the level of volumetric effects, because of image quality? As in whether those effects are there or not.

And how is a game at a higher resolution with a stable but lower frame rate worse than a high frame rate game at a lower resolution?
 

farmerboy

Member
Entirely valid to compare the 60fps target.
Now, if you wanted to compare just the "best" it can look, you wouldn't do this.
But as a ps5 user who tends to opt for performance, this comparison suits me just fine.
 
Beefy doesn't require $5000. Take that from an owner of a PC that cost about that (perhaps a tad more).
This game is extremely demanding. Notice he's on a 4090 yet still needs DLSS to run at max settings for 60. What are you talking about, a weaker PC can do max settings?
 

Thirty7ven

Banned
The point of the comparison was to show how scalable the engine is, so pitting performance mode on PS5 against the max settings on PC showcases the full range of what to expect from the engine.

That doesn’t make any sense; he should be using the Series S then. Or the Steam Deck.

A lot of people are playing dumb with this when there’s nothing ambiguous about it.
 

HeWhoWalks

Gold Member
This game is extremely demanding. Notice he's on a 4090 yet still needs DLSS to run at max settings for 60. What are you talking about, a weaker PC can do max settings?
You do not need a $5000 PC to run this game at max settings. So, what are you on about?
 

Thirty7ven

Banned
How would that demonstrate the scalability of the engine to future PC hardware?

How is the PS5 in performance helping demonstrate the scalability?

Honestly this is such a dumb conversation. It’s like the trick is to go round and round until people simply accept the lack of logic as proof of logic.
 
I love how DF always shows the internal resolution on consoles, even when they are using scaling algorithms, but they will only label "4K DLSS Quality" or "4K DLSS Performance" in their PC videos.
I never even caught that before, but good on you for pointing that out.
 

Senua

Member
I never even caught that before, but good on you for pointing that out.
Well, we know what internal res those DLSS modes render at for a 4K target, whereas consoles are variable. I guess they could put the internal res in brackets for the different DLSS modes for less nerdy viewers.
 

FireFly

Member
How is the PS5 in performance helping demonstrate the scalability?

Honestly this is such a dumb conversation. It’s like the trick is to go round and round until people simply accept the lack of logic as proof of logic.
Games are primarily designed for PS5/XSX/3060 class hardware, and setting and resolution cuts are made to allow for playability on the Series S and Steam Deck. So it's common to see scalability upward from the S and Steam Deck, but less common to see scalability from console class hardware to, say, a 4090. How do you assess the visual impact of such scalability when it exists? Answer: by comparing the game on those two sets of hardware. Alex could have used the 3060 running at 60 FPS, and the comparison wouldn't have been any less valid. If he had used a 3060, I doubt we would have a thread full of angry 3060 owners questioning why he wasn't testing at 30 FPS with the settings and resolution cranked up.
 

Thirty7ven

Banned
Games are primarily designed for PS5/XSX/3060 class hardware, and setting and resolution cuts are made to allow for playability on the Series S and Steam Deck. So it's common to see scalability upward from the S and Steam Deck, but less common to see scalability from console class hardware to, say, a 4090. How do you assess the visual impact of such scalability when it exists? Answer: by comparing the game on those two sets of hardware. Alex could have used the 3060 running at 60 FPS, and the comparison wouldn't have been any less valid. If he had used a 3060, I doubt we would have a thread full of angry 3060 owners questioning why he wasn't testing at 30 FPS with the settings and resolution cranked up.

Well there you go college, you got there on your own.
 

rodrigolfp

Haptic Gamepads 4 Life
So you can't tell how many trees are in a scene, or the level of volumetric effects, because of image quality? As in whether those effects are there or not.

And how is a game at a higher resolution with a stable but lower frame rate worse than a high frame rate game at a lower resolution?
The number of trees and the presence or absence of effects is not a measure of quality.

And how is worse controller response and visual fluidity worse than blurry IQ?
 
Average people don't watch DF videos. DF has been doing this for years. It's no dig at consoles.

Because those terminologies don't really exist on consoles. There is usually no FSR Performance or Quality. You'll get a Performance and a Quality Mode and the base resolution will be at whatever the devs decide to set it to. Then FSR or another upscaling method will be used to reach an output resolution. DF has to specify the base resolution because it varies a lot on consoles as I said before. It can be anything.

On PC, pretty much only 3 resolutions are relevant; 1080p, 1440p, and 4K. 99% of benchmarks will be run at those resolutions. You don't need to beat someone over the head with this. DF aren't the only channel not mentioning the internal resolution of the reconstruction when the target resolution and quality setting are known. Hell, sometimes they even do for uncommon choices such as FSR/DLSS Performance with a 1080p output because the internal resolution will be 540p, which is incredibly low.
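The arithmetic behind those labels is simple: each DLSS quality mode renders at a fixed fraction of the output resolution per axis, so the internal res is fully determined once you know the target and the mode. A minimal sketch using the published per-axis DLSS scale factors (the helper name is just for illustration):

```python
# Internal render resolution for DLSS quality modes at a given output target.
# Scale factors are the standard published per-axis DLSS ratios.
DLSS_SCALE = {
    "Quality": 2 / 3,            # 66.7% per axis
    "Balanced": 0.58,            # 58% per axis
    "Performance": 0.5,          # 50% per axis
    "Ultra Performance": 1 / 3,  # 33.3% per axis
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal (render) resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# "4K DLSS Performance" renders internally at 1080p...
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
# ...while 1080p DLSS Performance drops to 540p, as the post notes.
print(internal_res(1920, 1080, "Performance"))  # (960, 540)
```

This is why reviewers can omit the internal number on PC: "4K DLSS Quality" always means a 2560x1440 render, whereas a console's dynamic base resolution has to be measured per game.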


Clearly. That's why your puny brain is hurting trying to comprehend why big bad DF is trying to make consoles look bad.
I honestly think the comparison is fine just would be cool to also show the quality mode alongside it
 
Wait, so you want him to test the PC on every configuration just because not everybody has his exact configuration? Not everybody has a 3060, after all; only around 9% of Steam users do. And btw, the 3060 is ballpark PS5 performance **AT RASTER**. The PS5 performs at the level of a 5700 XT, 1080 Ti, or Pascal Titan, and the 3060 is slightly faster than a 5700 XT at raster. The 3060 runs circles around the PS5 at RT and demolishes it at AI, meaning the 3060 can look flat-out better even at the same settings thanks to DLSS and Ray Reconstruction. Of the three performance metrics, the PS5 is only competitive in one, and that's raster. Most people in this sub don't actually follow hardware, hence their misunderstanding of PC hardware, but the reality is that Nvidia > AMD when it comes to hardware.

And btw, of course he's testing the PC at its best to show what the game can do; it's literally the same thing he's doing on console. Do you want him to run the game on a Series S vs a 4060 and Ryzen 7600? Because, again, that would make consoles look even worse and PC even better.

Or perhaps you want him to use old hardware instead of Ada (40 series) and Zen 4 (7600), maybe a 2060 and an Intel 9600 that launched the same year console gamers were playing God of War 2018 and Spider-Man. And guess what: even ancient hardware like that would still demolish the Series S in this game, because the 2060 beats the PS5 and Series X at ray tracing and upscaling, let alone the Series S.
The 3060 isn’t comparable to the PS5 in raster; the Ti is the one that’s closer.
 

Dorfdad

Gold Member
The fact that they are even discussing it, a PS5 console vs a 4090 GPU, and it's anywhere close, tells me all I need to know. These cards we have out now will never be maximized to the extent that consoles use them. Tools and optimization are just so much better on consoles overall. I'm seriously considering no gaming PC come next gen. It's way too expensive and troublesome for me on PC anymore.

If we get a decent bump and we can do real 4K/60/120, I'll have no need for PC gaming.

/Hides
 
The 3060 isn’t comparable to the PS5 in raster; the Ti is the one that’s closer.
I already explained it all right there. The PS5 is at the 5700 XT, 1080 Ti, Pascal Titan level; you can stretch that to a 2070 Super in some games, but the Super more often than not outperforms the PS5 in raster. Either way, the 1080 Ti is in the same ballpark as the 2080, 5700 XT, 2070 Super, and Pascal Titan. All of these cards will perform differently depending on the drivers and the game chosen, but they all land in that same ballpark.

Forgot to add that the 3060 Ti is beyond that ballpark; it's so fast it's even faster than the 4060 (and the 4060 beats all the cards in the paragraph above). It uses the same GA104 chip as the 3070, just a more cut-down version, so the 3060 Ti is essentially a baby 3070. The 3060 Ti is 89% of the performance of the 2080 Ti (which was the 4090 of Turing) and the 3070, and the 3070 is more than 50% faster than a PS5. In Alan Wake 2 the 3070 is about 50% faster than the PS5. You know what other card the 3070 is 50% faster than? The 5700 XT.
 

SenkiDala

Member
30fps was fine for Starfield too.
Yeah... And I don't get their bias on this one. Sure, "let's compare the 60fps mode to 60fps on PC"... well, no? And I mean, he promotes that by... posting screenshots to make sure the PS5 is at a disadvantage?

Anyway, those comparisons are dumb as fuck... Does anybody compare F1 cars with standard cars? Or "let's compare the frame rate of Genshin Impact between the $400 Galaxy A53 and the $1100 iPhone 15 Pro Max; really wondering which will win, the suspense is at its best"...
 
I already explained it all right there. The PS5 is at the 5700 XT, 1080 Ti, Pascal Titan level; you can stretch that to a 2070 Super in some games, but the Super more often than not outperforms the PS5 in raster. Either way, the 1080 Ti is in the same ballpark as the 2080, 5700 XT, 2070 Super, and Pascal Titan. All of these cards will perform differently depending on the drivers and the game chosen, but they all land in that same ballpark.

Forgot to add that the 3060 Ti is beyond that ballpark; it's so fast it's even faster than the 4060 (and the 4060 beats all the cards in the paragraph above). It uses the same GA104 chip as the 3070, just a more cut-down version, so the 3060 Ti is essentially a baby 3070. The 3060 Ti is 89% of the performance of the 2080 Ti (which was the 4090 of Turing) and the 3070, and the 3070 is more than 50% faster than a PS5. In Alan Wake 2 the 3070 is about 50% faster than the PS5. You know what other card the 3070 is 50% faster than? The 5700 XT.
The PS5 is a 5700 XT… if you buy Alex's claims and ignore real-world performance.
 
I already explained it all right there. The PS5 is at the 5700 XT, 1080 Ti, Pascal Titan level; you can stretch that to a 2070 Super in some games, but the Super more often than not outperforms the PS5 in raster. Either way, the 1080 Ti is in the same ballpark as the 2080, 5700 XT, 2070 Super, and Pascal Titan. All of these cards will perform differently depending on the drivers and the game chosen, but they all land in that same ballpark.

Forgot to add that the 3060 Ti is beyond that ballpark; it's so fast it's even faster than the 4060 (and the 4060 beats all the cards in the paragraph above). It uses the same GA104 chip as the 3070, just a more cut-down version, so the 3060 Ti is essentially a baby 3070. The 3060 Ti is 89% of the performance of the 2080 Ti (which was the 4090 of Turing) and the 3070, and the 3070 is more than 50% faster than a PS5. In Alan Wake 2 the 3070 is about 50% faster than the PS5. You know what other card the 3070 is 50% faster than? The 5700 XT.
Alan Wake is a heavily RT game, and we are discussing pure raster. I also suspect that game favors Nvidia cards. Use 90% of the other games out right now.
 
People defending this ridiculous explanation from Alex for choosing performance modes in his comparison videos make me laugh… if he wants to find optimized settings for PC, he picks different PC configurations and compares them against each other at a given frame rate… We all know consoles are a different animal… they are closed boxes with limited performance, and the 30fps modes are the main modes for graphical settings, not the 60fps ones… we all know that developers spend most of their optimization time perfecting the quality mode (30fps) graphics settings and not the performance mode (60fps), because that's the way consoles work… Choosing the worst graphics mode on consoles to judge graphics prowess against a max-settings PC with the absolute best hardware money can buy is intellectually dishonest and pointless for both the PC and console communities…
 
Alan Wake is a heavily RT game, and we are discussing pure raster. I also suspect that game favors Nvidia cards. Use 90% of the other games out right now.
Yes, we are talking raster: the PS5 version isn't using RT in that comparison, and the 3070 was running at PS5 graphics settings. It uses a software-based RT technique (a technique that's been used since the PS3 days) that runs on traditional raster-only hardware. It's irrelevant to the RT discussion because it's not hardware-accelerated; in other words, it runs fine on hardware without hardware RT. If you have an older PC, boot up Fortnite and turn on software Lumen; it'll run just fine. But turn on hardware Lumen in a UE5 game and the performance will tank. A 1080 Ti can't run Cyberpunk with RT on, but it can run a UE5 game with software RT just fine. That's what's going on with AW2 on PS5: the terrible RT performance of the PS5 is irrelevant in that game.
 

DeepEnigma

Gold Member
Yeah... And I don't get their bias on this one. Sure, "let's compare the 60fps mode to 60fps on PC"... well, no? And I mean, he promotes that by... posting screenshots to make sure the PS5 is at a disadvantage?

Anyway, those comparisons are dumb as fuck... Does anybody compare F1 cars with standard cars? Or "let's compare the frame rate of Genshin Impact between the $400 Galaxy A53 and the $1100 iPhone 15 Pro Max; really wondering which will win, the suspense is at its best"...
Feel the inferiority console PS5 peasant.
 
The PS5 is a 5700 XT… if you buy Alex's claims and ignore real-world performance.
It's very similar to a real 5700 XT or a 6700 (non-RT). In practice the 6700 is much faster for a variety of reasons (it has Infinity Cache, isn't thermally limited, and isn't sharing resources like memory with the CPU). The 5700 XT performs close to the PS5 in practice thanks to the advancement of low-level APIs on PC like DX12 and Vulkan, and also because it has a bit more hardware, like more CUs. If this were the DX9 days, the PS5 would be much faster in practice due to the high overhead of abstracted APIs like DX9.
 
Feel the inferiority console PS5 peasant.
No matter the comparison, the PS5 would come out below (4090, 4070, 4060, 3060; even a half-decade-old 2060 would make the PS5 look bad at RT). There's one critical reason why that most console people miss: it's not Sony vs Nvidia, it's AMD vs Nvidia. Sony doesn't make GPUs or CPUs, and the AMD GPU in the PS5 is only as good as AMD made it.

Here's a good example. In 2020 AMD launched its new generation of GPUs, the first from AMD to bring hardware-accelerated RT. And guess what: Nvidia had created the market for RT GPUs two AMD generations earlier. This 2020 architecture, RDNA2, was competitive with Nvidia at rasterization but struggled to compete in RT and image quality (AI-hardware-powered DLSS). It ran on the most cutting-edge fab process for GPUs (TSMC 7N).

Meanwhile, Nvidia released Ampere on the "ghetto" Samsung 8N process, which was two generations' worth of performance behind TSMC 7N. On this power-hungry and inefficient process, Nvidia managed to match AMD's RDNA2 at raster and demolished it at RT and AI (DLSS). You know what hardware the PS5 is using? RDNA2 (with certain features, like hardware VRS, fused off). You know where the PS5 excels? Raster, just like RDNA2. You know where it struggles? RT, just like RDNA2. You know what it flat-out can't do? AI, just like RDNA2.
 

Redefine07

Member
Just got this game with my AMD RX 7900 XT Nitro+; it works really well. 4K FSR3 Quality with Frame Gen on, ultra with a mix of high/medium, 90-120fps (fake frames or whatever they are, it's smooth on an LG C1).
 