
Horizon Forbidden West Complete Edition PC specifications revealed

Zathalus

Member
That's what I mean. The PS5 for $400 can do Dynamic 4k/60. Show me a PC that can do even remotely that for $1k.

Whew that was rather easy! Knock a bit off the price if you build it yourself as well.
 
If you really think the sole reason someone plays on PC is power, then I get why you think so, but you couldn't be further from the truth, pal. Even with a GPU equivalent to the consoles', PCs don't have the same bottleneck (the CPU, where even low-end desktop parts tend to be better than the consoles') and they have technology the console versions don't, like a choice of upscalers and frame generation, which would be great for Sony games since they already mask their high latencies behind "smooth animations".

Also: controller options, more fine-grained settings, playing with an Xbox controller (symmetric sticks suck for many), and a long list of etceteras.
I'm actually super hard on the CPU (arguably the biggest CPU stickler on this site), but Horizon isn't CPU limited due to being cross-gen (maybe in the 120fps mode), so we can actually focus on the GPU side.
 

ChiefDada

Gold Member
This should do well technically on high-end PC. Although since there's no RT, I think the PS5 will be more competitive than usual, a la TLOU Remake and Death Stranding DC. Probably no need to worry much about PC asset streaming, since Guerrilla didn't bother to improve pop-in/faster world traversal like Insomniac did masterfully with SM2. Burning Shores deserved much better there.
 

ChiefDada

Gold Member
According to DF, the PS5 version runs 1800p DRS using checkerboarding and PC settings equivalent to "low" to get to 60 fps. Considering everything in the chart above is native, I'd say you are looking at an RTX 3070 to "beat" the PS5 specs. Think we can dispense with the bullshit take that you need a 4080 to do that.

Let's get this out of the way now - the 3070 is DOA, especially with Burning Shores. It will not compete with the PS5 because it will have comparatively crap textures. Textures are a major part of what makes HFW/HBS look good.
 

Topher

Gold Member
Let's get this out of the way now - the 3070 is DOA, especially with Burning Shores. It will not compete with the PS5 because it will have comparatively crap textures. Textures are a major part of what makes HFW/HBS look good.

Why will 3070 have crap textures?
 

Xdrive05

Member
Who would have predicted that the RTX 3060 12GB's unusually large VRAM configuration would give it a second life in the face of its direct successor?

Nvidia should have found a way to stick with 12GB and eaten the cost (yeah like that will ever happen again!).
 

yamaci17

Member
Seems strange when the 3070 is listed in the High/1440p/60 column of GG's chart. But Bojji seems to agree with him, so I'm interested in his take as well.
you can get very high quality textures in nixxes ports, but often you have to sacrifice compute performance because the game ends up stalling the render to make constant transfers between ram and vram



with the help of dlss and brute power, you can kind of absorb that kind of overhead and get decent results

they make some attempt to make high quality textures (not very high) look decent but you will get inconsistent texture quality

proper way of handling things is doing it like alan wake 2/avatar where textures very far from the camera are being reduced and most people don't actually notice it (me included). but sony games are designed with fixed / sony hardware in mind so you either have enough budget or not

after seeing brilliant textures in alan wake 2 on my 8 gb card WHILE running ray tracing at 1440p, I've completely stopped caring about broken PS ports that handle vram poorly

alan wake 2 has better geometry and overall perceived texture and model quality on the 3070 at 1440p, all while being able to utilize path tracing, than last of us part 1 does, and all while last of us part 1 requires/uses more vram at 1080p... that should tell you all you need to know.

i'm not defending 8 gb vram, but it is a big enough number that you can optimize towards it.

peak lastgen titles ran decently on 4 gb cards at 1080p (rdr 2). technically something that looks 2x better than rdr 2 at 1440p dlss quality with 8 GB VRAM should be possible. and alan wake 2 practically proves that
 

Bojji

Member
Seems strange when the 3070 is listed in the High/1440p/60 column of GG's chart. But Bojji seems to agree with him, so I'm interested in his take as well.

On shit textures on the 3070? I don't think I agree; it depends completely on the game and the devs. Sony games tend to have massive VRAM requirements, but the two Decima games we had before were decent in this aspect. This game is cross-gen, but at the same time the PS5 version has very good textures, so we will see.

The 3070 is in a bad position anyway; it has a good amount of power, but that VRAM is crippling it in many games at higher settings.
 

iHaunter

Member

Whew that was rather easy! Knock a bit off the price if you build it yourself as well.
That's a piece of shit and still over $1,000. That won't do 4K/60 on good settings at all. Not stable. You'd be lucky to get 4K/45 on medium settings.

Delusional.
 

Topher

Gold Member
On shit textures on the 3070? I don't think I agree; it depends completely on the game and the devs. Sony games tend to have massive VRAM requirements, but the two Decima games we had before were decent in this aspect. This game is cross-gen, but at the same time the PS5 version has very good textures, so we will see.

The 3070 is in a bad position anyway; it has a good amount of power, but that VRAM is crippling it in many games at higher settings.

It is going to be interesting to see how well the hardware in the charts measures up in real-world benchmarks. I'm only good for surface-level discussion on this stuff as I don't get into the weeds of it all like you and yamaci17, so appreciate the insight.
 

yamaci17

Member
On shit textures on the 3070? I don't think I agree; it depends completely on the game and the devs. Sony games tend to have massive VRAM requirements, but the two Decima games we had before were decent in this aspect. This game is cross-gen, but at the same time the PS5 version has very good textures, so we will see.

The 3070 is in a bad position anyway; it has a good amount of power, but that VRAM is crippling it in many games at higher settings.
3070 has two saving graces

ps5 games often try to target a 4k buffer (internal resolution can be recklessly low, but the 4k output buffer itself has a serious vram impact)
as a result, when you target a 1440p buffer on desktop, you get some serious vram reductions based on that alone. that really is an important note. I cannot, for example, play ratchet with very high textures at 4k output despite the gpu being capable (sadge); the GPU just gets wrecked. but a 1440p buffer? that practically gives enough "relief" to the GPU. since the game is designed with 4k-buffer vram usage in mind, it gets an unexpected vram usage reduction when you output the game at 1440p

second saving grace is DLSS itself, which looks very decent at 1440p, and with the latest DLSS DLLs even balanced is workable/acceptable.

third potential saving grace is competent texture streaming.

when these 3 factors are combined, and if the game was optimized to fill a 10 gb budget on sx/ps5 with a dynamic 4k buffer, it gracefully scales to around a 7.2 GB DXGI budget on PC at 1440p/dlss quality with a bit of texture streaming.

the 3070 technically should outpace consoles even under dynamic 4k conditions, but 8 GB of VRAM is a true limitation at 4K buffers.

I've come across situations where I couldn't stop the performance from dropping even with 4K DLSS ultra performance, but the damn game ran just fine at 1440p.

Of course this can change if developers start to target a 1440p "output" buffer on consoles. Let's hope they don't. (again, internal resolution doesn't matter much; the bulk of the VRAM increase comes from the output resolution)
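To put rough numbers on the output-buffer point, here is a back-of-the-envelope Python sketch; the list of full-resolution render targets and their per-pixel sizes are invented for illustration and are not Decima's (or any Nixxes port's) actual allocations.

```python
# Back-of-the-envelope sketch: how much the *output* buffer resolution alone
# can swing VRAM, assuming a handful of full-resolution render targets with
# hypothetical per-pixel sizes (not real engine data).

RESOLUTIONS = {"4K output": (3840, 2160), "1440p output": (2560, 1440)}

# Hypothetical per-pixel byte costs for targets that scale with output res
# (swap chain, HDR scene color, upscaler history, motion vectors, depth...).
ASSUMED_TARGETS_BYTES_PER_PIXEL = {
    "swap chain (RGBA8 x2)": 8,
    "scene color (RGBA16F)": 8,
    "upscaler history (RGBA16F)": 8,
    "motion vectors (RG16F)": 4,
    "depth + stencil": 5,
}

for name, (w, h) in RESOLUTIONS.items():
    total = sum(bpp * w * h for bpp in ASSUMED_TARGETS_BYTES_PER_PIXEL.values())
    print(f"{name}: ~{total / 2**20:.0f} MiB just for output-sized targets")

# 4K output: ~261 MiB vs. 1440p output: ~116 MiB for this made-up set alone.
# Real engines keep many more full-res targets, so the gap grows accordingly,
# which is the "unexpected vram usage reduction" described above when you drop
# the output buffer to 1440p even though the internal (DLSS) resolution is
# what drives shading cost.
```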
 

Bojji

Member
It is going to be interesting to see how well the hardware in the charts measures up in real-world benchmarks. I'm only good for surface-level discussion on this stuff as I don't get into the weeds of it all like you and yamaci17, so appreciate the insight.

I think devs are more sensitive to VRAM usage after all that outrage in the middle of last year. Most games released in Q3, Q4 and Q1 '24 top out around 10GB of usage.

But at the same time the difference between 8 and 16GB is massive, and I doubt that the 3070 will be able to match texture settings with the 6800:



3070 has two saving graces

ps5 games often try to target a 4k buffer (internal resolution can be recklessly low, but the 4k output buffer itself has a serious vram impact)
as a result, when you target a 1440p buffer on desktop, you get some serious vram reductions based on that alone. that really is an important note. I cannot, for example, play ratchet with very high textures at 4k output despite the gpu being capable (sadge); the GPU just gets wrecked. but a 1440p buffer? that practically gives enough "relief" to the GPU. since the game is designed with 4k-buffer vram usage in mind, it gets an unexpected vram usage reduction when you output the game at 1440p

second saving grace is DLSS itself, which looks very decent at 1440p, and with the latest DLSS DLLs even balanced is workable/acceptable.

third potential saving grace is competent texture streaming.

when these 3 factors are combined, and if the game was optimized to fill a 10 gb budget on sx/ps5 with a dynamic 4k buffer, it gracefully scales to around a 7.2 GB DXGI budget on PC at 1440p/dlss quality with a bit of texture streaming.

the 3070 technically should outpace consoles even under dynamic 4k conditions, but 8 GB of VRAM is a true limitation at 4K buffers.

I've come across situations where I couldn't stop the performance from dropping even with 4K DLSS ultra performance, but the damn game ran just fine at 1440p.

Of course this can change if developers start to target a 1440p "output" buffer on consoles. Let's hope they don't. (again, internal resolution doesn't matter much; the bulk of the VRAM increase comes from the output resolution)

True, with DLSS and a target resolution change you can get some decent savings. But there are some games where the target resolution doesn't change the massive VRAM requirements, like TLoU before the patches.

When I had a 4070 Ti I had trouble running Phantom Liberty with FG and PT; it simply doesn't fit in a 4K framebuffer with DLSS even on Performance. Dropping the resolution target to 1800p allowed me to have a good experience :)
 

Gaiff

SBI’s Resident Gaslighter
That's what I mean. The PS5 for $400 can do Dynamic 4k/60. Show me a PC that can do even remotely that for $1k.
Is there a single game where the PS5 does dynamic 4K/60? Or do you mean 1080p reconstructed to 4K using FSR2?
 

yamaci17

Member
I think devs are more sensitive to VRAM usage after all that outrage in the middle of last year. Most games released in Q3, Q4 and Q1 '24 top out around 10GB of usage.

But at the same time the difference between 8 and 16GB is massive, and I doubt that the 3070 will be able to match texture settings with the 6800:





True, with DLSS and a target resolution change you can get some decent savings. But there are some games where the target resolution doesn't change the massive VRAM requirements, like TLoU before the patches.

When I had a 4070 Ti I had trouble running Phantom Liberty with FG and PT; it simply doesn't fit in a 4K framebuffer with DLSS even on Performance. Dropping the resolution target to 1800p allowed me to have a good experience :)

to be honest i tried hogwarts legacy at 1080p high + high ray tracing and couldn't replicate that kind of horrible texture loading on my end (in the certain cases that happen in the video)

regardless, insisting on ultra textures will at times cause worse textures to be loaded; sometimes engines react violently. while you indeed won't always match 16 gb cards' textures, the difference, at least at close range, won't be as stark. differences should show up in the distance as long as you're not overburdening vram. in my experience, running 1-1.5 gb short of the required vram is often enough to get away with decent texture quality at close to mid-range distances

ray tracing in hogwarts legacy is bugged in general, to the point that it's problematic even on 12-16 gb cards

regardless, if devs don't want to see a <50 steam review score on their brand new game, they will have to do better. nvidia or not, this is the reality, and most people still sit on competent 8 gb cards, so devs will have to improvise. the popularity of the 4060 will practically decide the future of 8 gb cards for the next 5 years. if it becomes popular enough, devs won't be able to give 8 gb cards the "horrible n64 textures" treatment
 

Bojji

Member
to be honest i tried hogwarts legacy at 1080p high + high ray tracing and couldn't replicate that kind of horrible texture loading on my end (in the certain cases that happen in the video)

regardless, insisting on ultra textures will at times cause worse textures to be loaded; sometimes engines react violently. while you indeed won't always match 16 gb cards' textures, the difference, at least at close range, won't be as stark. differences should show up in the distance as long as you're not overburdening vram. in my experience, running 1-1.5 gb short of the required vram is often enough to get away with decent texture quality at close to mid-range distances

ray tracing in hogwarts legacy is bugged in general, to the point that it's problematic even on 12-16 gb cards

Hogwarts Legacy is a fucking mess; it can consume 24GB of RAM and 14GB of VRAM (saw that on my 6800) and at the same time barely look better than the Xbox version, which has 13GB of total memory available...

I think a few games last year were really, really badly optimized on the memory front, and this massive VRAM requirements spike won't become a trend in the end. But 8GB is really on the edge; I can't use a card like that with a 4K screen, but for a 1440p target it can probably still be decent.
 

ChiefDada

Gold Member
It's not just textures here. HFW's character animation is in another league compared to the original; I have to imagine it's consuming lots of memory. But anyway, I expect most other cards with adequate VRAM will allow it to scale nicely above PS5.
 

Kataploom

Gold Member
I'm actually super hard on the CPU (arguably the biggest CPU stickler on this site), but Horizon isn't CPU limited due to being cross-gen (maybe in the 120fps mode), so we can actually focus on the GPU side.
No game on PC is CPU limited unless it's a garbage port or a high-end simulation type of game, so even that is a big advantage. Even with a similar GPU, any low to mid range CPU would be enough to surpass anything from the consoles and still be idling most of its resources lol
 

yamaci17

Member
Hogwarts Legacy is a fucking mess; it can consume 24GB of RAM and 14GB of VRAM (saw that on my 6800) and at the same time barely look better than the Xbox version, which has 13GB of total memory available...

I think a few games last year were really, really badly optimized on the memory front, and this massive VRAM requirements spike won't become a trend in the end. But 8GB is really on the edge; I can't use a card like that with a 4K screen, but for a 1440p target it can probably still be decent.
yeah, 4k was fun while it lasted. it really looks gorgeous over 1440p. the only saving grace of 1440p for me is that it looks better than 1080p by a noticeable margin, but by itself it isn't as impressive as 4k (of course it shouldn't be, 4k is just a brilliant target). i like 4k dlss performance better than native 1440p, for example. but it is what it is, a 4k buffer with 8 gb vram going forward is just not feasible

1080p folks will probably do decently with 8 GB. I will have to squeeze every bit out of optimized settings to make 1440p dlss balanced work as a minimum baseline. For me, I feel like I will be able to hold out till GTA 6. And by then my GPU will be 6 years old, so I will be able to say I got great mileage out of it (counting all the niche path-traced old titles, dlss/reflex stuff, low latency gaming, DLDSR, and so on etc. etc.). it has been 3.5 years and i'm mostly fine with the product, but it helps that i got it for MSRP, 500 bucks. For MSRP 500 bucks, being able to play gorgeous games like alan wake 2 / avatar decently in 2023, I would say it did okay. And it will keep doing so for a while from the looks of it. But people who got it at absurd 900+ bucks prices, whelp, they probably have every right to complain about it... (they also have themselves to blame for it though)

dlss itself is a great feature imo and for that alone it was a worthy investment. it is the only upscaler I can stomach at 1440p

also notice how the exact same problem now happens in the final fantasy game. it's probably 1-2 gb short of what the devs intended within the PS5's vram budget, and as a result the game loads garbage textures all over the place. i cannot say this is a problem with unreal engine in general. why? because i played jedi survivor with ray tracing at 1440p and the textures look phenomenal at all times. so i don't even understand how certain games get so bad so quickly. there are good unreal examples and bad unreal examples, which makes it hard to make blanket statements about it.

it probably has to do with texture classification. identifying textures based on their distance and importance must play an important role in proper texture streaming. although jedi survivor is an unoptimized mess, it actually has the most decent texture streamer of any unreal engine game i've ever seen. even in the 4 gb vs 8 gb video hardware unboxed did, the 4 gb card actually looked "decentish" considering how heavy the game appears to be on VRAM. not once while playing jedi survivor did I see potato, garbage or horrible textures, despite pushing the visuals to their limits

texture streaming however is crucial and will be the name of the game if devs want to scale their games gracefully from series s to 8 gb cards to series x/ps5. it will benefit everyone, and probably... nvidia preyed on/hoped for this to happen.
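As a minimal sketch of that distance/importance idea, here's some illustrative Python. It is not how Decima, Northlight, or Unreal actually stream textures; the texture names, importance scores, mip floor, and budget are all made up for illustration.

```python
# Minimal illustrative sketch of distance/importance-based texture streaming.
# NOT a real engine's streamer; all names, thresholds, and the budget are invented.
from dataclasses import dataclass

@dataclass
class StreamedTexture:
    name: str
    distance_m: float      # distance from camera to nearest user of the texture
    importance: float      # 0..1, e.g. hero asset vs. background clutter
    full_mip_mb: float     # size of the full mip chain in MB

def resident_cost(tex: StreamedTexture, dropped_mips: int) -> float:
    # Each dropped mip level roughly quarters the texture's memory footprint.
    return tex.full_mip_mb / (4 ** dropped_mips)

def plan_mips(textures: list[StreamedTexture], budget_mb: float) -> dict[str, int]:
    """Drop mips from the least important / most distant textures first
    until the resident set fits the VRAM budget."""
    plan = {t.name: 0 for t in textures}
    # Lower score = first candidate to lose detail.
    by_priority = sorted(textures, key=lambda t: t.importance / (1.0 + t.distance_m))
    while sum(resident_cost(t, plan[t.name]) for t in textures) > budget_mb:
        for t in by_priority:
            if plan[t.name] < 3:          # never drop below mip 3 in this sketch
                plan[t.name] += 1
                break
        else:
            break                          # nothing left to drop; stay over budget
    return plan

textures = [
    StreamedTexture("aloy_face", 2.0, 1.0, 64),
    StreamedTexture("cliff_far", 400.0, 0.3, 128),
    StreamedTexture("foliage_mid", 60.0, 0.5, 96),
]
print(plan_mips(textures, budget_mb=150))
# {'aloy_face': 0, 'cliff_far': 3, 'foliage_mid': 1}
```

The point is only that a streamer which sheds detail from distant, low-importance textures first can fit a fixed VRAM budget without touching what's right in front of the camera, which is what most people never notice.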
 
No game on PC is CPU limited unless it's a garbage port or a high-end simulation type of game, so even that is a big advantage. Even with a similar GPU, any low to mid range CPU would be enough to surpass anything from the consoles and still be idling most of its resources lol
I meant the console version isn’t cpu limited
 

Kataploom

Gold Member
It's not just textures here. HFW's character animation is in another league compared to the original; I have to imagine it's consuming lots of memory. But anyway, I expect most other cards with adequate VRAM will allow it to scale nicely above PS5.
Everything it does, any mid-range PC will do the same or better. The game seems to be artistically impressive to some, but it's a cross-gen game in the end; it runs with the limitations of the PS4 and has just too many tricks behind it to make you think it's brute forcing that stuff with magic
 

Zathalus

Member
That's a piece of shit and still over $1,000. That won't do 4K/60 on good settings at all. Not stable. You'd be lucky to get 4K/45 on medium settings.

Delusional.
The 4060 is similar in GPU performance to the PS5. Looking at the recommended requirements, this PC should be able to do 4K with DLSS Performance, which will look similar to (if not better than) the checkerboarded 1800p the PS5 runs at.

It won't do native 4K/60, but then again neither does the PS5, and if you think it does, the only delusional person here is you.

Edit: Should have just linked a build: https://pcpartpicker.com/list/TfNJZJ

Just over $900, and significantly more powerful than the PS5.
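For reference, the rough pixel math behind that comparison, as a sketch rather than a benchmark: the DLSS scale factors below are the standard ones, and checkerboarding is approximated as shading half of the target's pixels per frame.

```python
# Rough pixel-count comparison behind "4K DLSS Performance vs. 1800p checkerboard".
# DLSS Performance renders at 50% per axis; CBR is approximated as half the
# pixels of its target resolution per frame.

def pixels(w, h):
    return w * h

dlss_perf_internal = pixels(int(3840 * 0.50), int(2160 * 0.50))   # 1920x1080
cbr_1800p_shaded   = pixels(3200, 1800) // 2                      # ~half per frame

print(f"4K DLSS Performance shades ~{dlss_perf_internal / 1e6:.2f} MP per frame")
print(f"1800p checkerboard shades  ~{cbr_1800p_shaded / 1e6:.2f} MP per frame")
# ~2.07 MP vs ~2.88 MP: the checkerboarded PS5 image actually shades more raw
# pixels per frame; the claim above rests on DLSS's temporal reconstruction to
# a 4K output closing (or beating) that gap in perceived image quality.
```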
 

shamoomoo

Member
The 4060 is similar in GPU performance to the PS5. Looking at the recommended requirements, this PC should be able to do 4K with DLSS Performance, which will look similar to (if not better than) the checkerboarded 1800p the PS5 runs at.

It won't do native 4K/60, but then again neither does the PS5, and if you think it does, the only delusional person here is you.

Edit: Should have just linked a build: https://pcpartpicker.com/list/TfNJZJ

Just over $900, and significantly more powerful than the PS5.
That isn't true; the demands of the game dictate how fast and at what resolution it will run. 4K/30 isn't uncommon, though most games don't reach that resolution.
 

yamaci17

Member
The 4060 is similar in GPU performance to the PS5. Looking at the recommended requirements, this PC should be able to do 4K with DLSS Performance, which will look similar to (if not better than) the checkerboarded 1800p the PS5 runs at.

It won't do native 4K/60, but then again neither does the PS5, and if you think it does, the only delusional person here is you.

Edit: Should have just linked a build: https://pcpartpicker.com/list/TfNJZJ

Just over $900, and significantly more powerful than the PS5.
you didn't get the memo, you can use dynamic scaling on console but can't on pc

even if that dedicated upscaler can make better use of 480p than other regular upscalers do at 960p :)

 

ChiefDada

Gold Member
Everything it does, any mid-range PC will do the same or better. The game seems to be artistically impressive to some, but it's a cross-gen game in the end; it runs with the limitations of the PS4 and has just too many tricks behind it to make you think it's brute forcing that stuff with magic

This game is a great example of why this term is ultimately meaningless. As DF stated, the difference between HFW PS4 and PS5 is major, particularly when it comes to asset quality, character lighting, and water rendering. Also, Burning Shores is PS5 only.
 
The 4060 is similar in GPU performance to the PS5. Looking at the recommended requirements, this PC should be able to do 4K with DLSS Performance, which will look similar to (if not better than) the checkerboarded 1800p the PS5 runs at.

The RTX 4060 is a 1080p card MAX for AAA games (VRAM and memory bandwidth say hello)

You are comparing apples to oranges
 

Senua

Member
This game is a great example of why this term is ultimately meaningless. As DF stated, the difference between HFW PS4 and PS5 is major, particularly when it comes to asset quality, character lighting, and water rendering. Also, Burning Shores is PS5 only.
I wonder why burning shores is ps5 only if cross gen has no bearing on anything
 

Zathalus

Member
The RTX 4060 is a 1080p card MAX for AAA games (VRAM and memory bandwidth say hello)

You are comparing apples to oranges
Yes, nobody should buy a 4060 for 4k gaming, but since the 8GB 3070 is recommended for native 1440p/60 or 4k/30, I'm sure the 4060 should be able to do 4k/60, albeit with DLSS performance.
 

Kenpachii

Member
The 4060 is similar in GPU performance to the PS5. Looking at the recommended requirements, this PC should be able to do 4K with DLSS Performance, which will look similar to (if not better than) the checkerboarded 1800p the PS5 runs at.

It won't do native 4K/60, but then again neither does the PS5, and if you think it does, the only delusional person here is you.

Edit: Should have just linked a build: https://pcpartpicker.com/list/TfNJZJ

Just over $900, and significantly more powerful than the PS5.

The 4060 has frame gen; it will perform a lot better than the PS5.
 

Topher

Gold Member
The 4060 is similar in GPU performance to the PS5. Looking at the recommended requirements, this PC should be able to do 4K with DLSS Performance, which will look similar to (if not better than) the checkerboarded 1800p the PS5 runs at.

It won't do native 4K/60, but then again neither does the PS5, and if you think it does, the only delusional person here is you.

Edit: Should have just linked a build: https://pcpartpicker.com/list/TfNJZJ

Just over $900, and significantly more powerful than the PS5.

Couldn't you trade out the 4070 for a 7700 XT or a 7800 XT? About the same price or less but without the memory limitation.
 

Zathalus

Member
Couldn't you trade out the 4070 for a 7700 XT or a 7800 XT? About the same price or less but without the memory limitation.
7700XT is 12 GB as well, but is quite a bit cheaper. 7800 XT vs 4070 is basically more VRAM and raster performance vs DLSS and RT performance. If you want to go AMD then for $20 more you can get the 7900 GRE.
 

Bojji

Member
The 4060 has frame gen; it will perform a lot better than the PS5.

Frame gen can cause the card to run out of VRAM, and then performance will be worse.

Nvidia are like idiots sometimes; they shouldn't allow frame gen on cards below 12GB. I had problems with it even on cards with 12GB...
 

Topher

Gold Member
And lose DLSS and better RT? I dunno.

Sure, but when talking about how this game and the specs on the chart measure up against the PS5, an AMD option would seem to be a more natural comparison.

7700XT is 12 GB as well, but is quite a bit cheaper. 7800 XT vs 4070 is basically more VRAM and raster performance vs DLSS and RT performance. If you want to go AMD then for $20 more you can get the 7900 GRE.

Ah yeah.....good point about the 7900 GRE
 

yamaci17

Member
Frame gen can cause the card to run out of VRAM, and then performance will be worse.

Nvidia are like idiots sometimes; they shouldn't allow frame gen on cards below 12GB. I had problems with it even on cards with 12GB...
they're not idiots, they do it on purpose

they practically have no competition. and people cannot recommend 3060/6700xt over 4060 because dlss 2 dlss 3 has strong mindshare in people. in the case of dlss 3, it truly will be useless for 4060 due to vram limitation unless games drastically change the way they work

nvidia specifically targeted low-vram games like a plague tale requiem, atomic heart and such to market DLSS 3 on vram-limited cards. once people run into situations where vram is stressed, they will quickly find out dlss 3 is useless on 8 gb hardware. dare I say, problems will pop up even with 12 gb cards in certain titles. because it is an extra feature that developers have no obligation to cater for, they don't have to redesign how their games work around an 8 gb budget while keeping dlss 3 compatibility in mind

however people keep buying. I guess 1080p dlss quality may give some games enough headroom to make it work... ue5 seems to have very low vram footprint overall. it may work well there consistently. but other games... yeah. questionable.
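As a rough illustration of why frame generation and 8 GB don't mix, here's a hedged estimate; the buffer list and per-pixel sizes are assumptions for the sake of the arithmetic, not NVIDIA's actual DLSS 3 allocations.

```python
# Hedged estimate (assumed buffer list, not real allocation data) of why frame
# generation hurts on 8 GB cards: its working set scales with the OUTPUT
# resolution, on top of everything the game already keeps resident.

W, H = 3840, 2160          # 4K output
MIB = 2**20

assumed_fg_buffers_bpp = {
    "previous frame color (RGBA16F)": 8,
    "generated frame color (RGBA16F)": 8,
    "optical flow / motion field": 8,
    "depth + exposure/mask data": 6,
}

extra = sum(bpp * W * H for bpp in assumed_fg_buffers_bpp.values()) / MIB
print(f"~{extra:.0f} MiB of extra output-resolution buffers at 4K (rough guess)")
# A few hundred MiB of headroom that an 8 GB card already running near its
# limit simply doesn't have, which is the problem described above.
```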
 

Gaiff

SBI’s Resident Gaslighter
Sure, but when talking about how this game and the specs on the chart measure up against the PS5, an AMD option would seem to be a more natural comparison.
We're actually talking about how a poster foolishly claimed that even with a $1000 PC, you can't get what the PS5 does which is utter bullshit.

The most laughable part of these comparisons is that we always gotta gut PC features to bring them down to the level of the consoles. As much as people love shitting on PC GPUs for their 8GB, these same people are completely silent when the consoles even in Quality Mode are forced to run shit-tier 4x AF that's impossible to upgrade. Then in order to make those comparisons "fair", we drop PC's settings down to 4x AF as well when 16x has been free for over a decade. As if anyone on PC games with 4x AF lol.

Another comical fact is how DLSS gets completely ignored and RTX GPUs in comparison use garbage FSR to level the playing field. Once again, no one with an RTX GPU will use FSR over DLSS when both are available. How is it our problem that consoles don't have DLSS? This very thread is proof of it. Who so far has acknowledged that you can simply toggle DLSS Quality at 4K to get similar image quality and better fine detail resolve than on PS5 but with much better performance? Again, crickets chirping.

Fact is, with DLSS, Frame Generation, and better RT, RTX GPUs can dunk on consoles, but we always use them in a way that inherently favors consoles because consoles can't use what PC GPUs do. As far as an academic exercise is concerned, this is how it should be done, but if we're talking value, that's not how it should be done. The RTX 3060 only gets 60+fps at 1080p? Increase the res to 1620p, toggle DLSS Quality, and watch it have much better image quality than the CBR on the PS5 while also performing on par, but no one acknowledges that.

You get what you pay for and for $400-500, consoles have nothing like DLSS, frame generation, Reflex, or high-quality RT and pretty much always run bad AF. These are all things that get glossed over when comparing "value".
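A quick sanity check of the 1620p + DLSS Quality suggestion, assuming the standard ~2/3 per-axis Quality scale factor:

```python
# With DLSS Quality's ~2/3 per-axis factor, a 2880x1620 ("1620p") output
# renders internally at roughly the same 1080p the RTX 3060 already hits
# 60+ fps at, so the frame cost barely changes while the output improves.

output_w, output_h = 2880, 1620          # "1620p"
quality_scale = 2 / 3                    # DLSS Quality per-axis factor

internal = (round(output_w * quality_scale), round(output_h * quality_scale))
print(internal)   # (1920, 1080) -> same internal pixel load, better output image
```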
 

Bojji

Member
they're not idiots, they do it on purpose

they practically have no competition. and people cannot recommend 3060/6700xt over 4060 because dlss 2 dlss 3 has strong mindshare in people. in the case of dlss 3, it truly will be useless for 4060 due to vram limitation unless games drastically change the way they work

nvidia specifically targeted low-vram games like a plague tale requiem, atomic heart and such to market DLSS 3 on vram-limited cards. once people run into situations where vram is stressed, they will quickly find out dlss 3 is useless on 8 gb hardware. dare I say, problems will pop up even with 12 gb cards in certain titles. because it is an extra feature that developers have no obligation to cater for, they don't have to redesign how their games work around an 8 gb budget while keeping dlss 3 compatibility in mind

however people keep buying. I guess 1080p dlss quality may give some games enough headroom to make it work... ue5 seems to have very low vram footprint overall. it may work well there consistently. but other games... yeah. questionable.

I don't think this is some evil grand design on their part. The division that makes the GPUs made the stupid decision to go with 8GB of VRAM at minimum, and the division that makes features like DLSS 3 didn't expect games to balloon in VRAM requirements like they did. They made the 16GB 4060 Ti after all the backlash, and I think they didn't plan that SKU from the start. 10GB should be the minimum for xx60 GPUs in 2023.

Or maybe this was planned from the start like you said; Nvidia doesn't care that much about their consumers.
 

Topher

Gold Member
We're actually talking about how a poster foolishly claimed that even with a $1000 PC, you can't get what the PS5 does which is utter bullshit.

The most laughable part of these comparisons is that we always gotta gut PC features to bring them down to the level of the consoles. As much as people love shitting on PC GPUs for their 8GB, these same people are completely silent when the consoles even in Quality Mode are forced to run shit-tier 4x AF that's impossible to upgrade. Then in order to make those comparisons "fair", we drop PC's settings down to 4x AF as well when 16x has been free for over a decade. As if anyone on PC games with 4x AF lol.

Another comical fact is how DLSS gets completely ignored and RTX GPUs in comparison use garbage FSR to level the playing field. Once again, no one with an RTX GPU will use FSR over DLSS when both are available. How is it our problem that consoles don't have DLSS? This very thread is proof of it. Who so far has acknowledged that you can simply toggle DLSS Quality at 4K to get similar image quality and better fine detail resolve than on PS5 but with much better performance? Again, crickets chirping.

Fact is, with DLSS, Frame Generation, and better RT, RTX GPUs can dunk on consoles, but we always use them in a way that inherently favors consoles because consoles can't use what PC GPUs do. As far as an academic exercise is concerned, this is how it should be done, but if we're talking value, that's not how it should be done. The RTX 3060 only gets 60+fps at 1080p? Increase the res to 1620p, toggle DLSS Quality, and watch it have much better image quality than the CBR on the PS5 while also performing on par, but no one acknowledges that.

You get what you pay for and for $400-500, consoles have nothing like DLSS, frame generation, Reflex, or high-quality RT and pretty much always run bad AF. These are all things that get glossed over when comparing "value".

Preaching to the choir my man. Yeah, I saw that post you are referring to and I just rolled my eyes. Par for the course lately from those who are trying to tear down PC as much as possible for......reasons, I guess? I don't get it. If someone prefers console then fine......no need to justify that.

 