
Horizon Forbidden West Complete Edition PC specifications revealed

We're actually talking about how a poster foolishly claimed that even with a $1000 PC, you can't get what the PS5 does which is utter bullshit.

The most laughable part of these comparisons is that we always gotta gut PC features to bring them down to the level of the consoles. As much as people love shitting on PC GPUs for their 8GB, these same people are completely silent when the consoles even in Quality Mode are forced to run shit-tier 4x AF that's impossible to upgrade. Then in order to make those comparisons "fair", we drop PC's settings down to 4x AF as well when 16x has been free for over a decade. As if anyone on PC games with 4x AF lol.

Another comical fact is how DLSS gets completely ignored and RTX GPUs in comparison use garbage FSR to level the playing field. Once again, no one with an RTX GPU will use FSR over DLSS when both are available. How is it our problem that consoles don't have DLSS? This very thread is proof of it. Who so far has acknowledged that you can simply toggle DLSS Quality at 4K to get similar image quality and better fine detail resolve than on PS5 but with much better performance? Again, crickets chirping.

Fact is, with DLSS, Frame Generation, and better RT, RTX GPUs can dunk on consoles but we always use them in a way that inherently favors consoles because consoles can't use what PC GPUs do. As far as an academic exercise is concerned, this is how it should be done, but, if we're talking value, that's not how it should be done. The RTX 3060 only gets 60+fps at 1080p? Increase the res to 1620p, toggle DLSS Quality, and watch it have a much better image quality than the CBR on the PS5 while also performing on par, but no one acknowledges that.

You get what you pay for and for $400-500, consoles have nothing like DLSS, frame generation, Reflex, or high-quality RT and pretty much always run bad AF. These are all things that get glossed over when comparing "value".
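For anyone who wants the actual numbers behind "toggle DLSS Quality": a quick sketch of how the upscaler presets map to internal render resolutions. The per-axis scale factors below are the commonly published DLSS 2 / FSR 2 defaults; individual games can and do deviate.

```python
# Internal render resolution for the common DLSS/FSR quality presets.
# Per-axis scale factors are the commonly published DLSS 2 / FSR 2
# defaults; individual games can and do deviate.
SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,            # ~58%
    "Performance": 0.5,          # 50%
    "Ultra Performance": 1 / 3,  # ~33.3%
}

def internal_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# "DLSS Quality at 4K" renders internally at 1440p:
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
# The 1620p DLSS Quality example renders internally at 1080p:
print(internal_resolution(2880, 1620, "Quality"))  # (1920, 1080)
```

So a 4K DLSS Quality frame and a typical ~1440p console quality mode start from comparable pixel counts; the reconstruction step is where the image-quality gap comes from.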

You seem upset because people buy consoles instead of PCs....

And even those who game on PC usually have worse specs than current gen consoles

If you are so right, why doesn't PC have 90% of the market share??

Why is it getting console ports?
 

Diddy X

Member
Console version is usually 1440p 30fps (can drop to 1080p at times and below 30 even), so you need at least 2x the GPU for 4K (4070S) and another 2x for 60FPS.

So ~4x the power of the PS5; even the 4090 won't be able to do it without upscaling:

[image: benchmark chart]

Consoles are underpowered af
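The ~4x figure above is just pixel-rate arithmetic. A rough sketch, assuming GPU cost scales linearly with pixels per second (it doesn't quite, but it's a reasonable first-order estimate):

```python
# Naive throughput multiplier needed to move from one resolution/framerate
# target to another, assuming cost scales linearly with pixels per second.
def required_multiplier(src_res, src_fps, dst_res, dst_fps):
    src_rate = src_res[0] * src_res[1] * src_fps
    dst_rate = dst_res[0] * dst_res[1] * dst_fps
    return dst_rate / src_rate

# PS5-style 1440p/30 quality mode -> native 4K/60:
print(required_multiplier((2560, 1440), 30, (3840, 2160), 60))  # 4.5
```

2.25x the pixels times 2x the framerate gives 4.5x, in line with the ~4x ballpark.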
 

Gaiff

SBI’s Resident Gaslighter
You seem upset because people buy consoles instead of PCs....

And even those who game on PC usually have worse specs than current gen consoles

If you are so right, why doesn't PC have 90% of the market share??

Why is it getting console ports?
I've owned every console every single generation dating back to the PS2 era and am waiting for a PS5 Pro. No one is upset. The discourse is just incredibly dishonest all the time.

Look at when Bojji was fiddling around to find shitty CPUs because apparently, consoles were CPU-bound in some DF comparisons, and even when the dude found bottom-of-the-barrel CPUs that were getting 100fps when not GPU-bound, he was told that it wasn't fair and that we needed a freakin' 4800S to make it fair. Literally no one has that desktop kit and I even looked around to get one and couldn't find it. Rich from DF had to find it in some Chinese auction lol.

Again, for academic discourses and to know how efficient PCs or consoles are, we should absolutely strive to make things as fair as possible. When it comes to value though? lol. There's a reason NVIDIA wipes the floor with AMD GPUs despite being generally more expensive for the same tier of performance. They offer features no one else has and that's value that always gets ignored.

I love my consoles and you'll seldom find me shitting on them. It's the hypocritical fanboys who annoy me.
 
At the end of the day you need a xx70 card to surpass the three-and-a-half-year-old PS5

And that alone costs more than the whole console

You can do the 4070 Super vs PS5 game like DF does but it's still a joke....
 

Gaiff

SBI’s Resident Gaslighter
At the end of the day you need a xx70 card to surpass the three-and-a-half-year-old PS5

And that alone costs more than the whole console

You can do the 4070 Super vs PS5 game like DF does but it's still a joke....
No, you don't. The 4060 Ti 16GB is plenty. And that's just considering rasterization WITHOUT DLSS.

If you actually bother using DLSS and RT like any sane person owning an RTX GPU would, then the 3060 will get you an equivalent if not better experience and this card is almost as old as the PS5.

On AMD's side, a 6700 will match it. A 6700 XT will beat it. A 6800 completely outclasses it. These are all 2020-2021 cards.

This is exactly the kind of bullshit I was referring to in my earlier post. Never mind the fact that the basic 4070 is much faster.
 

Zathalus

Member
At the end of the day you need a xx70 card to surpass the three-and-a-half-year-old PS5

And that alone costs more than the whole console

You can do the 4070 Super vs PS5 game like DF does but it's still a joke....
Yeah, but it exceeds it by anything from 1.8x to 3x or more in some titles. You certainly can't beat a PS5 at its price point, but for roughly double the price you can get roughly double the performance. Hence why somebody claiming you can't beat the PS5 for under $1000 is laughable.

But PC gaming has never been about having the absolute best hardware price/performance ratio; the appeal and strengths of the platform lie elsewhere and that is enough for it to remain a popular choice for many. Consoles have their own strengths as well, hence they are also popular.
 
No, you don't. The 4060 Ti 16GB is plenty. And that's just considering rasterization WITHOUT DLSS.

If you actually bother using DLSS and RT like any sane person owning an RTX GPU would, then the 3060 will get you an equivalent if not better experience and this card is almost as old as the PS5.

I'm sorry I just don't see it that way

And most people don't see it that way at all

Nobody would pick a 3060 PC vs a PS5, not even PC-only gamers (unless they game at 1080p MAX)

They will spend significantly more money to build a better rig to make the difference meaningful in some way
 

Gaiff

SBI’s Resident Gaslighter
I'm sorry I just don't see it that way
Yeah and?
And most people don't see it that way at all

Nobody would pick a 3060 PC vs a PS5, not even PC-only gamers
Is this why the RTX 3060 is the most popular GPU?
They will spend significantly more money to build a better rig to make the difference meaningful in some way
No, they won't. Most PC gamers have mid-tier rigs with GPUs in the class of the 2070 to 4060 Ti.
 
Is this why the RTX 3060 is the most popular GPU?

Do you know that the same survey says the most used resolution is 1920x1080?

Yeah at 1080p a 3060 is more than enough....

I bought a 4060 for the same reason, but it's not really comparable to a PS5 that targets 1440p and above
 

SlimySnake

Flashless at the Golden Globes
Consoles are underpowered af
Compared to a card with an MSRP of $1,649 that routinely goes for $2,000? Not really.

You are getting 3-4x more performance from a 3-4x more expensive card. They are both equally great value.
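That "equally great value" claim is just performance per dollar. A sketch using the thread's own rough figures ($500 PS5 as the 1.0x baseline, ~$2,000 street-price 4090 at ~4x the performance); these are illustrative numbers from the posts above, not measured benchmarks:

```python
# Performance per dollar. Inputs are the thread's rough figures,
# not measured benchmarks: PS5 at $500 as the 1.0x baseline, a 4090
# at ~$2,000 street price delivering ~4x the performance.
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

ps5 = perf_per_dollar(1.0, 500)
rtx_4090 = perf_per_dollar(4.0, 2000)
print(ps5, rtx_4090)  # 0.002 0.002 -- identical perf/$ at these numbers
```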
 

Gaiff

SBI’s Resident Gaslighter
Do you know that the same survey says the most used resolution is 1920x1080?
Yeah, and who cares? Most PC gamers will gun for 60fps.
Yeah at 1080p a 3060 is more than enough....
For 60fps. Why do you think people on this forum are always crying about the low resolution of the base consoles in Performance Mode where they have to drop to 720p and reconstruct so as not to choke their GPUs?
I bought a 4060 for the same reason, but it's not really comparable to a PS5 that targets 1440p and above
I mean, this is provably false unless you got the 8GB version and go for 4K. DF just did a whole ass benchmark and the PS5 performed on the level of a 6700.
 

ChiefDada

Gold Member
We're actually talking about how a poster foolishly claimed that even with a $1000 PC, you can't get what the PS5 does which is utter bullshit.

The most laughable part of these comparisons is that we always gotta gut PC features to bring them down to the level of the consoles. As much as people love shitting on PC GPUs for their 8GB, these same people are completely silent when the consoles even in Quality Mode are forced to run shit-tier 4x AF that's impossible to upgrade. Then in order to make those comparisons "fair", we drop PC's settings down to 4x AF as well when 16x has been free for over a decade. As if anyone on PC games with 4x AF lol.

I've never seen any comparison like this, but if so, yeah, that would be terrible methodology.

Another comical fact is how DLSS gets completely ignored and RTX GPUs in comparison use garbage FSR to level the playing field. Once again, no one with an RTX GPU will use FSR over DLSS when both are available. How is it our problem that consoles don't have DLSS? This very thread is proof of it. Who so far has acknowledged that you can simply toggle DLSS Quality at 4K to get similar image quality and better fine detail resolve than on PS5 but with much better performance? Again, crickets chirping.

I agree 100% with this sentiment. But iirc you were guilty of the same when it came to PS5 decompression hw.

And per DF's last video on this, the PC port improved but there are still significant CPU bottlenecks due to how the ND engine is crafted. That is a plus for the PS5's hardware decompression; it doesn't make the comparison unfair any more than DLSS allowing for better image quality at a cheaper cost on Nvidia cards does

The comparison you used wasn't "unfair", it was just outdated. You want to compare GPU performance, therefore you have to remove the CPU bottleneck as much as possible and when we see a 4070 hitting 80% usage on a CPU that hits 90% usage, we know the GPU is being held back. Per more recent metrics, the PS5 is ahead of the 2070S by 30-48%, which puts it somewhere around a 2080 Ti/6750 XT to a 6800 but typically closer to the former.
 
Yeah, and who cares? Most PC gamers will gun for 60fps.

For 60fps. Why do you think people on this forum are always crying about the low resolution of the base consoles in Performance Mode where they have to drop to 720p and reconstruct so as not to choke their GPUs?

I mean, this is provably false unless you got the 8GB version and go for 4K. DF just did a whole ass benchmark and the PS5 performed on the level of a 6700.

Actually, when going for 30-40fps, the PS5 targets 2160p, not 1440p

I was specifically talking about 60fps

And frame rate is also impacted by the CPU, which is weaker on a console by default compared to a desktop PC, for obvious reasons
 

Topher

Gold Member
Actually, when going for 30-40fps, the PS5 targets 2160p, not 1440p

That used to be the case, but Alan Wake 2 renders at 1270p and Avatar uses DRS and ranges between 1296p and 1800p. And now we find out Dragon's Dogma 2 isn't getting a performance mode at all. It will be interesting to see what resolution 30fps will get you, but I seriously doubt it will be 2160p.
 

Diddy X

Member
Compared to a card with an MSRP of $1,649 that routinely goes for $2,000? Not really.

You are getting 3-4x more performance from a 3-4x more expensive card. They are both equally great value.

But 30 fps and 1080p? That's not the next gen they told us about.

4K 60+fps along with other tweaks, that's true next-gen, even if very expensive.
 

yamaci17

Member
Yeah, and who cares? Most PC gamers will gun for 60fps.

For 60fps. Why do you think people on this forum are always crying about the low resolution of the base consoles in Performance Mode where they have to drop to 720p and reconstruct so as not to choke their GPUs?

I mean, this is provably false unless you got the 8GB version and go for 4K. DF just did a whole ass benchmark and the PS5 performed on the level of a 6700.
my personal problem here is that 1440p dlss performance often looks better than other upscalers rendering around 1080p
and 4k dlss performance often looks better than 1440p

so i cannot take any of these discussions seriously anymore. as you said it is part of what makes rtx gpus valuable in a manner of speaking. and it is just leagues ahead compared to other upscalers. I had to endure jedi survivor with FSR and it looked horrible at 4K/FSR Quality. Come the DLSS patch, I was able to enjoy 1440p DLSS Balanced. That is a wide gap of tolerance when it comes to upscalers. Just huge. 1440p dlss balanced in that game is more serviceable, more temporally stable, and looks more coherent than 4k fsr quality. it is just insane.

even at 1080p there are games where it is just usable despite what the internal resolution suggests (all comparisons are done in fast movement, the worst case scenario for temporal upscalers)



this was the most brutal:

I play tested last of us part 1's 1440p mode on PS5 against 4k dlss performance on PC and PC just looked better (same screen, same viewing distance, same conditions)

did the test on PC, wasn't surprised


just look at this performance boost. and image still looks brilliant. just a bit reduced over 4k but nothing "substantial"


just for the luls, there's not a single upscaler that can achieve whatever DLSS is doing here. not one.



both reflex and dlss are black magic but people refuse to see any value in them and try to compare 1 to 1 raster benchmarks at all costs (i'm not saying we shouldn't do that, but at some point people completely ignore how much of a value dlss brings to the table)

and reflex makes even the gamepad experience much more fun. once you get used to the snappy experience, games become more fun to play on PC. PS5's input lag is horrible compared to what you get on PC.

I get less input lag at 30 fps on PC than most PS5 1st party games do at 60 fps. let that sink in.
 

Gaiff

SBI’s Resident Gaslighter
I've never seen any comparison like this, but if so, yeah, that would be terrible methodology.
It's not a terrible methodology because there are two sides to this: One is academic, the other is purely what kind of experience you can get. From a purely academic perspective, yes, we need to match settings and drop down the PC to console levels because we cannot really tweak settings on consoles.

And look at Spider-Man on PS5 for instance. 4x AF.

[image: Spider-Man on PS5 showing 4x AF]

I agree 100% with this sentiment. But iirc you were guilty of the same when it came to PS5 decompression hw.
No. The example you were using was from an old-as-hell benchmark with an old patch and I pointed out that CPU performance massively improved in subsequent patches by up to 35%. The objective in that case was purely GPU performance vs GPU performance and not only was the 4070 at low usage, but it was also running on an old patch with a massive CPU bottleneck that has since been improved. If I wanted to do that, I could have simply used DLSS at 4K and compared that against PS5 running native but that wouldn't have been an honest comparison given the academic context of the discussion.

I'll reiterate; you want to compare parts such as GPU vs GPU? Then sure, we need matched settings and all that jazz. You want to compare value? Then you need to factor in how people actually use the products. Nobody uses 4x AF on PC and everybody and their mom uses DLSS. For instance, if the 3060 at 4K with DLSS Quality gets the same performance as the PS5 at 4K native, the PS5 is obviously the better performer, but you're effectively getting the same experience. Therefore in terms of value, you can't seriously say that the PS5 isn't being matched. Not PC's problem that the PS5 doesn't have DLSS.

Actually, when going for 30-40fps, the PS5 targets 2160p, not 1440p
Only a few Sony games have a 40Hz mode, so why are you even bringing this up?
I was specifically talking about 60fps
I can't think of a single non-Sony game that runs at 1440p/60fps on the PS5 aside from perhaps shooters. 99% of the time, games in Performance Mode run at resolutions below 1440p such as 1080p or 1260p and reconstruct to 4K or something like that. The PS5 is not a 1440p/60fps machine most of the time.
And frame-rate is also impacted by the CPU that is weaker on a console by default compared to a desktop PC for obvious reasons
Which is the problem of the console. Can't just ignore that your $800 rig has a much, much faster CPU when comparing value.

Performance-wise, can a rig centered around an RTX 3060 match the PS5? Most of the time, no, unless there's a lot of RT.

Value-wise, can it match the PS5? Especially when we consider the ubiquitous 16x AF, DLSS, and better RT support? Absolutely. You more often than not can reach higher fps at lower resolution and at higher resolutions, you can simply use DLSS to claw back the performance deficit while maintaining a similar image quality.
 

SlimySnake

Flashless at the Golden Globes
But 30 fps and 1080p? That's not the next gen they told us about.

4K 60+fps along with other tweaks, that's true next-gen, even if very expensive.
sorry what game is 1080p 30 fps?

most games nowadays target fsr 4k quality so that's 1440p internal resolution at 30 fps. Using FSR to upscale so you get 4k textures and a much cleaner image than 1440p.

it's the 60 fps modes that are the problem because they are dropping to 720p, but i had to buy an $850 graphics card and build a $2000 pc around it to ensure i was getting 1440p 60 fps. Getting 4k 60 fps for these modern games is simply not possible even on expensive $800 GPUs. you need a 4090 for that.
 

yamaci17

Member
I don't think this is some evil grand design on their part. Just the division that makes GPUs made the stupid decision to go with 8GB of VRAM at minimum, and the division that makes features like DLSS3 didn't expect games' VRAM requirements to balloon like they did. They made the 16GB 4060 Ti after all the backlash and I don't think they planned this SKU from the start. 10GB should be the minimum for xx60 GPUs in 2023.

Or maybe this was planned from the start like you said; Nvidia doesn't care that much about their consumers.
im not calling it evil, but nvidia will always do things that they can get away with

look at most ue 5 games. they dont even fill the 8 gb buffer at 4k.

imagine being a 4060 user and playing these new "ue 5" titles at 1080p with fg:





this is what nvidia banked on, and it is paying off. and consider that ue 5 will be the most used engine this gen; many studios are moving to it. and if its low vram consumption trend continues, more than half of the games will be compliant with 1080p/1440p + frame gen on 8 gb vram. naturally, whenever fg causes a problem in some other engine, people blame the game, not the vram, because then they will be able to say "i played this, that and those ue games and never had any problem"

genie is out of the bottle with this one. im not defending 8 gb vram but even last of us developers have proved that you can do better with 8 gb and still have vram to spare.

[image: side-by-side texture quality comparison]


it will be especially brutal when cases like this have already happened. hardware unboxed went on to tell stories about how the textures to the left were the fate of the 8 gb cards going forward, that it was unfixable, and that this budget was unworkable for games like this. and here it is, less vram usage, better quality textures.

im personally okay with compromises but not when it becomes ps3-like textures. genie is out of the bottle. proof of the concept exists. it is too late for developers. if they were brave, they would've continued their stance and never fixed the problem above, so that more people would be scared away from 8 gb gpus. instead developers went and fixed their games. i dont even care how much money it cost them. im an end user and they're the developers. if they enjoy <50% steam scores with ps3 textures on 60%+ of the gpu population, all the power to them imo
 

shamoomoo

Member
We're actually talking about how a poster foolishly claimed that even with a $1000 PC, you can't get what the PS5 does which is utter bullshit.

The most laughable part of these comparisons is that we always gotta gut PC features to bring them down to the level of the consoles. As much as people love shitting on PC GPUs for their 8GB, these same people are completely silent when the consoles even in Quality Mode are forced to run shit-tier 4x AF that's impossible to upgrade. Then in order to make those comparisons "fair", we drop PC's settings down to 4x AF as well when 16x has been free for over a decade. As if anyone on PC games with 4x AF lol.

Another comical fact is how DLSS gets completely ignored and RTX GPUs in comparison use garbage FSR to level the playing field. Once again, no one with an RTX GPU will use FSR over DLSS when both are available. How is it our problem that consoles don't have DLSS? This very thread is proof of it. Who so far has acknowledged that you can simply toggle DLSS Quality at 4K to get similar image quality and better fine detail resolve than on PS5 but with much better performance? Again, crickets chirping.

Fact is, with DLSS, Frame Generation, and better RT, RTX GPUs can dunk on consoles but we always use them in a way that inherently favors consoles because consoles can't use what PC GPUs do. As far as an academic exercise is concerned, this is how it should be done, but, if we're talking value, that's not how it should be done. The RTX 3060 only gets 60+fps at 1080p? Increase the res to 1620p, toggle DLSS Quality, and watch it have a much better image quality than the CBR on the PS5 while also performing on par, but no one acknowledges that.

You get what you pay for and for $400-500, consoles have nothing like DLSS, frame generation, Reflex, or high-quality RT and pretty much always run bad AF. These are all things that get glossed over when comparing "value".
But consoles can't use DLSS; if there were a cheaper and/or better upscaler on consoles, then DLSS would be irrelevant. Also, Nvidia's frame-gen is locked to the 4000 series, so why bring up something that's new in comparison to 4-year-old tech?
 

Gaiff

SBI’s Resident Gaslighter
But consoles can't use DLSS; if there were a cheaper and/or better upscaler on consoles, then DLSS would be irrelevant.
And if my grandma had wheels, she’d be a bicycle.
Also, Nvidia's frame-gen is locked to the 4000 series, so why bring up something that's new in comparison to 4-year-old tech?
Because Zathalus posted a rig with a 4060 and a poster replied it wouldn't match a PS5 in terms of gaming experience, which is bullshit.
 

Gaiff

SBI’s Resident Gaslighter
my personal problem here is that 1440p dlss performance often looks better than other upscalers rendering around 1080p
and 4k dlss performance often looks better than 1440p

so i cannot take any of these discussions seriously anymore. as you said it is part of what makes rtx gpus valuable in a manner of speaking. and it is just leagues ahead compared to other upscalers. I had to endure jedi survivor with FSR and it looked horrible at 4K/FSR Quality. Come the DLSS patch, I was able to enjoy 1440p DLSS Balanced. That is a wide gap of tolerance when it comes to upscalers. Just huge. 1440p dlss balanced in that game is more serviceable, more temporally stable, and looks more coherent than 4k fsr quality. it is just insane.

even at 1080p there are games where it is just usable despite what the internal resolution suggests (all comparisons are done in fast movement, the worst case scenario for temporal upscalers)



this was the most brutal:

I play tested last of us part 1's 1440p mode on PS5 against 4k dlss performance on PC and PC just looked better (same screen, same viewing distance, same conditions)

did the test on PC, wasn't surprised


just look at this performance boost. and image still looks brilliant. just a bit reduced over 4k but nothing "substantial"


just for the luls, there's not a single upscaler that can achieve whatever DLSS is doing here. not one.



both reflex and dlss are black magic but people refuse to see any value in them and try to compare 1 to 1 raster benchmarks at all costs (i'm not saying we shouldn't do that, but at some point people completely ignore how much of a value dlss brings to the table)

and reflex makes even the gamepad experience much more fun. once you get used to the snappy experience, games become more fun to play on PC. PS5's input lag is horrible compared to what you get on PC.

I get less input lag at 30 fps on PC than most PS5 1st party games do at 60 fps. let that sink in.
At least in stills, 4K DLSS Performance looks better than 1440p with TAA in TLOU Part I. Not sure about how it looks in motion though and that's the big thing.
 

yamaci17

Member
At least in stills, 4K DLSS Performance looks better than 1440p with TAA in TLOU Part I. Not sure about how it looks in motion though and that's the big thing.
it's not a still actually, i always do fast camera pans or runs in my comparisons (which is why none of my comparisons are perfectly aligned)

I take 50+ shots consecutively in motion for each comparison, then match the ones that align best

sometimes I take high bitrate videos and do side by sides with Nvidia ICAT. 4k dlss performance consistently looks much clearer than native 1440p taa/dlaa. the reason it ends up looking better than DLAA is that dlss/dlaa output quality scales with output resolution. in other words, you get better image quality if you tell dlss to upscale 1080p to 4K than from 1440p DLAA. dlaa is still a huge improvement over taa though.
 

Mister Wolf

Gold Member
it's not a still actually, i always do fast camera pans or runs in my comparisons (which is why none of my comparisons are perfectly aligned)

I take 50+ shots consecutively in motion for each comparison, then match the ones that align best

sometimes I take high bitrate videos and do side by sides with Nvidia ICAT. 4k dlss performance consistently looks much clearer than native 1440p taa/dlaa. the reason it ends up looking better than DLAA is that dlss/dlaa output quality scales with output resolution. in other words, you get better image quality if you tell dlss to upscale 1080p to 4K than from 1440p DLAA. dlaa is still a huge improvement over taa though.

I appreciate you always debunking people who downplay DLSS. Expect to see some of it in the Dragon's Dogma 2 PC DF thread.
 

yamaci17

Member
I appreciate you always debunking people who downplay DLSS. Expect to see some of it in the Dragon's Dogma 2 PC DF thread.
I didn't purchase that game so I won't be able to provide any insights on it (i'm not interested in dragon's dogma much, to be honest). But my friend has horizon forbidden west on steam and I have a PS5 at my disposal right now, so I will be able to do actual comparisons between them
 

shamoomoo

Member
And if my grandma had wheels, she’d be a bicycle.

Because Zathalus posted a rig with a 4060 and a poster replied it wouldn't match a PS5 in terms of gaming experience, which is bullshit.
Depending on the game, there is probably some truth to that, whether you go with the PS5 or the lower-end 4000 series.
 

Gaiff

SBI’s Resident Gaslighter
Depending on the game, there is probably some truth to that, whether you go with the PS5 or the lower-end 4000 series.
If you try to use console settings like a moron instead of using the card the way it's supposed to be used, sure.

Just ignore DLSS, ray tracing, and frame generation. Next, crank down AF to 4x and aim for 4K/30fps instead of a much more reasonable 1440p/DLSS Quality/60fps with 16x AF. That should bring it down to the level of a PS5 or lower. Not only are you choking the 8GB of the base model, but you're effectively making the far better CPU irrelevant by aiming for 30fps, which no one on PC does, yet some idiots insist we use PCs like consoles.
 

Mr Moose

Member
Depending on the game, there is probably some truth to that, whether you go with the PS5 or the lower-end 4000 series.
If there's a VRAM limit hit then sure, like with the shitty 4060 8GB model, though it does come with the fake frame feature (thanks, Nvidia).
The next-gen consoles are a bit better than my GPU (a 3060, which is slightly better than a 2070) and my 3600, which for some fucking reason is capped at 3575MHz.
 