
Horizon Forbidden West Complete Edition PC specifications revealed

GymWolf

Member
DLAA at 4K is probably going to be a 4090-only affair if you want to maintain a rock-solid 60 fps. I doubt my 4080 can manage that even with frame gen.

That thing is HEAVY.
 

shamoomoo

Member
According to DF: On paper, this sounds like outright butchery but in actual gameplay, what we're actually seeing is a carefully curated equivalent to 'PC low settings' up against an all-out ultra-class experience on PlayStation 5.

And if you do believe they aren't very different, then the PS5 isn't faring very well either since it's running a PS4 game at 1600x1800 at 60fps but it's something like 8x the power of the PS4. Furthermore, we have yet to know what settings the PS5 uses compared to PC. Is it comparable to Medium? High? Very High? Because if it's a mix of Medium/High or just Medium, then it isn't that much above the 3060 which is weaker than the PS5's GPU anyway. Your one big benefit over the PS5 is better ray tracing and DLSS.

You guys are already trying to shit on PC when we haven't even got a comparison out. And you of all people should know that the 12 TFLOPS narrative is moronic and always has been.
I know you said "like", but the PS5 is about 5.4x in terms of fill rates, and in terms of TFLOPS it's literally two PS4 Pros plus the base PS4.
 

Senua

Gold Member
Yeah but it won’t be noticeably large.

The game on ps5 already looks outstanding and sharp on a 4k screen regardless of which mode you pick

It is highly optimized for the PS5 and that just won’t be the case for PC
 
Doesn’t the PS5 do 1440/60 and I assume High? If yes, then a 3070 and especially a 6800 seem like a lot to ask.

If I’m not mistaken, those are the same requirements for TLOU2. Decima does alright on PC though, so we’ll see what the real results are.
The PS5 does CB 1800p or 4K (I forget which one), which is actually a touch more demanding than native 1440p. Also, if it's anything like Ratchet or other PS5 games, I think a couple of the settings are at Very High, specifically the textures, given how good they look. And this is actually reasonable; those GPUs performed worse in games like The Last of Us Part I.
 

Topher

Gold Member
Yeah but it won’t be noticeably large.

The game on ps5 already looks outstanding and sharp on a 4k screen regardless of which mode you pick

It is highly optimized for the PS5 and that just won’t be the case for PC

Eh....I can't see into the future my man. Hopefully they do the game justice and we get the visuals that match the power of the system.
 

Senua

Gold Member
The PS5 does CB 1800p or 4K (I forget which one), which is actually a touch more demanding than native 1440p. Also, if it's anything like Ratchet or other PS5 games, I think a couple of the settings are at Very High, specifically the textures, given how good they look. And this is actually reasonable; those GPUs performed worse in games like The Last of Us Part I.
That game is still unoptimised; please stop using it as a metric.
 

yamaci17

Member
Seriously doubt that. Have to remember this isn't just an upgrade from console to PC, but an upgrade from AMD to Nvidia.

From @ArtHands post....

"Horizon Forbidden West Complete Edition on PC features unlocked frame rates, customizable graphics settings and a broad range of performance-enhancing technologies, including NVIDIA DLSS 3 upscaling and frame generation. AMD FSR and Intel XeSS are also supported. For players with high-end hardware and extra headroom, image-enhancing NVIDIA DLAA is also available. The game leverages DirectStorage for quick loading times on PC."

So lower tier 4000 series Nvidia GPUs are going to see improvement.
HFW had awful temporal reconstruction tech at release. It got improved, but it is still subpar compared to some other games. DLSS/DLAA will probably bring massive improvements to this game's temporal presentation. In the PS5's performance mode, foliage often looks blurred together into a soupy mush (quality mode at native 4K is better but still fails to retain certain subpixel information, and foliage ends up looking spotless), and most of the subpixel detail is erased by their temporal upscalers. DLSS will probably have far better temporal reconstruction and improve on those fronts.

IGTI is the one upscaler that DF praises the most, and even that cannot come anywhere near DLSS when it comes to reconstructing subpixel details.



4:38 etc.

DLSS alone is a tech upgrade that most baseline NVIDIA PCs have now. It literally has better quality per pixel rendered, and that cannot be denied. If you don't need the performance, you can run DLAA and get the benefits, and this game will support it, as has been said before. (This is why direct performance comparisons have become meaningless for me: DLSS will often provide 1.5x better quality per pixel, which in turn makes a fair comparison impossible. Say you get equivalent performance to the PS5 with a 2070 Super, but the image quality you get at the same resolution will be better with the assistance of tensor cores/DLSS. How is that fair? It is really hard to assign "value" to this.)

[comparison screenshots: IGTI vs DLSS at matched performance]


As you can see, at the same performance level they don't look similar at all. I can easily say DLSS looks at least 1.5x better, because IGTI fails to reconstruct details the way DLSS can. In most games, foliage like this becomes a blurry soup with the regular upscaler solutions that GoW Ragnarok, HFW and the like use on PS5. It is only once you get used to DLSS that regular upscalers look bad, though, so I don't necessarily mean that IGTI looks bad, or that Spider-Man looks bad, or that HFW looks bad on PS5. But DLSS is on another level, and for that we should all criticise Sony for not preparing similar hardware-assisted upscaling, instead forcing devs to keep relying on software solutions, wasting time working out the intricacies of upscaling only to still get a mostly blurry output compared to the package that does it all automatically.
 

shamoomoo

Member
Okay, I admit you got me there. Cool of you to have an RTX 3060 (I hope you get rid of it, since you cannot seem to understand its value).

Now it is your turn to explain why the PS5 needs checkerboarded 1800p to hit 60 fps while the PS4 can hit "so-called" 1080p 30 fps with "so-called" comparable visuals.

By the famous SlimySnake's logic, if you can hit 1080p 30 fps with 1.84 TFLOPS on an antique GCN2-3 GPU (2M pixels), how can the PS5 require 1600x1800 (2.8M pixels) to hit 60 fps with 10 TFLOPS on much more modern RDNA 2 hardware? 5.6 times the computational performance, plus architectural improvements, and all that amounts to is 2x the framerate and a bare 1.4x pixel increase?

It should at least have gotten native 4K 60 fps there. But it doesn't. Why? Why would it need a 30 fps target to hit native 4K if the PS4 can do "1080p 30 fps"? Why is there even a native 4K 30 fps mode?
Obviously, Horizon Forbidden West is a cross-gen game. Devs aren't going to put in any more effort than they need to, so they will use the extra performance for minor improvements.
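The throughput arithmetic being argued over in the quoted post can be sanity-checked in a few lines of Python. The frame sizes are the figures the posters cite (PS4 at 1920x1080/30, PS5 performance mode shading 1600x1800 pixels per frame), and the TFLOPS values are the commonly cited spec-sheet numbers, so treat this as a sketch of the posters' own assumptions:

```python
# Sanity-check of the pixel-throughput arithmetic in the post above.
# Assumed: PS4 renders 1920x1080 @ 30 fps; PS5 performance mode shades
# 1600x1800 pixels per frame (checkerboarded) @ 60 fps.
ps4_pixels = 1920 * 1080            # ~2.07 MP per frame
ps5_pixels = 1600 * 1800            # ~2.88 MP per frame

res_ratio = ps5_pixels / ps4_pixels           # ~1.39x pixels per frame
fps_ratio = 60 / 30                           # 2x frame rate
throughput = res_ratio * fps_ratio            # ~2.78x pixel throughput
tflops_ratio = 10.28 / 1.84                   # ~5.6x raw compute

print(f"{res_ratio:.2f}x resolution, {throughput:.2f}x pixel throughput, "
      f"{tflops_ratio:.1f}x TFLOPS")
```

So the ~5.6x compute gap buys roughly 2.8x the pixel throughput in this mode, which is the gap the two posters are arguing about; the remainder goes into whatever settings uplift the PS5 version carries over the PS4 one.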
 

SlimySnake

Flashless at the Golden Globes
That game is still unoptimised; please stop using it as a metric.
For whatever reason, Decima games like HZD and Death Stranding run very well on PC GPUs. HZD had a poor and buggy launch, but the GPU performance was fine. I highly doubt you will need a 3070 to get PS5 quality graphics like you do in ND games. Especially on PCs that have way better CPUs than the one in consoles.

Actually, the 3070 might be vram bound at native 4k, but similar power GPUs like the 2080 Ti should be fine. IIRC we were doing some comparisons with Death Stranding on PC on gaf a couple of years ago, and the PS5 was performing like a 2080 Super or 3060Ti. A bit higher than the typical 2070 Super performance we see in other games, but nothing like a 3070.
 

DeaDPo0L84

Member
I see I see, how much is one of those then?
You make a drive-by post, claim you have a 4090 to convince people you're not a troll, then proceed not to know how much GPUs released in the past few years cost. Please stop; you're not fooling anyone. It's embarrassing.
 

Fredrik

Member
Nice I’ll be able to reach 4K60 going by this info. Just hoping there won’t be any bigger drops. No VRR on the TV here, every drop will result in a 30fps stutter.

Anyway can’t wait to jump in again. I dropped it on PS5 before I even got a chance to dive in the water, Elden Ring arrived.
 

Gaiff

SBI’s Resident Gaslighter
For whatever reason, Decima games like HZD and Death Stranding run very well on PC GPUs. HZD had a poor and buggy launch, but the GPU performance was fine. I highly doubt you will need a 3070 to get PS5 quality graphics like you do in ND games. Especially on PCs that have way better CPUs than the one in consoles.
What do you mean, "for whatever reason", lol? PCs seem to run the Decima engine alright. The outlier is ND's engine, which for whatever reason is trash on PC.
Actually, the 3070 might be vram bound at native 4k, but similar power GPUs like the 2080 Ti should be fine. IIRC we were doing some comparisons with Death Stranding on PC on gaf a couple of years ago, and the PS5 was performing like a 2080 Super or 3060Ti. A bit higher than the typical 2070 Super performance we see in other games, but nothing like a 3070.
The 2080 was something around 97% the performance of a PS5, but in some scenes it could run a bit faster. The PS5 is 2080/2080S tier in that game. The 3060 Ti is a bit out of reach since it sits closer to the 2080 Ti/3070.
 

THE DUCK

voted poster of the decade by bots
You say 12tf like that is supposed to mean anything.

Understood that tf is not the be all end all but it's still a partial yardstick. Considering 3060 supports dlss you would expect a card of this power level to do better than 1080p.
 

SlimySnake

Flashless at the Golden Globes
What do you mean, "for whatever reason", lol? PCs seem to run the Decima engine alright. The outlier is ND's engine, which for whatever reason is trash on PC.
Nixxes' Spider-Man ports also needed a 3070 to do PS5-quality visuals, IIRC.

GOW 2018 is another example of a Sony first party engine just not translating well to PCs. A 6 tflops 580 was basically performing like a 4.2 tflops PS4 Pro. Ghost of Tsushima is next and we will see how well that engine is ported.
 

Topher

Gold Member
Understood that tf is not the be all end all but it's still a partial yardstick. Considering 3060 supports dlss you would expect a card of this power level to do better than 1080p.

It really isn't a yardstick at all outside of its own architecture. What you are seeing in the chart is native resolution without factoring in DLSS.
 

poppabk

Cheeks Spread for Digital Only Future
Understood that tf is not the be all end all but it's still a partial yardstick. Considering 3060 supports dlss you would expect a card of this power level to do better than 1080p.
Native or reconstructed though? This list doesn't indicate anything about base resolution unless they are listing the base resolution.
 
I love it when people compare the PS version, or PS vs PC versions, like "yeah oMg nO difference" when they have no idea how optimization works. I can tell someone has never played PC games when they have no idea how severely demanding draw distances are. Take the TLOU2 remaster, for example: initially you'd say it looks the same if you stare at it closely, but moving around the world with almost no severe pop-in IS A FUCKING HUGE W. Same with Horizon; it may not come with major asset differences from PS5 to PC, but I'm sure as hell those better draw distances will be a godsend on PC.
 

yamaci17

Member
Understood that tf is not the be all end all but it's still a partial yardstick. Considering 3060 supports dlss you would expect a card of this power level to do better than 1080p.
Nothing stops you from using a 3060 at 1440p DLSS Quality. You will get similar, if not better, performance.



Even 1440p DLSS Balanced, with its ~840p internal resolution, will often look better than native 1080p with comparable or better performance.

In some cases 1440p DLSS Balanced looks and performs better than 1080p.

They list the 3060 as 1080p spec because most people bought the 3060 for 1080p (not their fault; I see it as a competent 1440p card with the aid of DLSS. The problem is that the casual userbase usually doesn't see it that way and often tries to get something above a 3060 Ti for 1440p, even though they're going to use upscaling all the same, lol).

There are legitimate cases in my experience with the 3070 where 1440p DLSS Balanced looked better, performed better and used less VRAM than native 1080p, to the point of making 1080p obsolete by all metrics, even from a hardware perspective, even for a 3060. But people have this weird mentality of "I would rather play native 1080p than render internally at 840p on a 1440p screen", so here we are, with people led to the mindset that the 3060 is only usable for 1080p, and devs will act accordingly. With this mindset, most people are afraid of 1440p screens and stick with 1080p/3060 specs when, with DLSS, they would get comparable or better performance and surely better image quality on a 1440p screen. It is their loss. Rendering at native 1080p in 2024 is just a waste of resources; it is better to render at 1080p internally and upscale to 1440p/1620p/4K instead:


Even if you have low-end hardware, it is pointless to target a 1080p screen anymore. Unless you're VRAM limited, I'd say all 3060+ users should just target a 1440p screen no matter what. You will get better image quality at 1440p with aggressive upscaling than native 1080p can ever hope to achieve.
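For reference, the internal resolutions behind the DLSS presets mentioned above follow from the standard DLSS 2.x per-axis scale factors (Quality 66.7%, Balanced 58%, Performance 50%); a quick sketch, using those publicly documented defaults rather than anything game-specific:

```python
# Internal render resolutions behind the DLSS quality presets.
# Scale factors are the standard DLSS 2.x per-axis defaults.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    """Return the internal render resolution for an output resolution."""
    s = PRESETS[preset]
    return round(width * s), round(height * s)

for preset in PRESETS:
    w, h = internal_res(2560, 1440, preset)
    print(f"1440p output, DLSS {preset}: renders internally at {w}x{h}")
```

1440p Balanced comes out at 1485x835, i.e. the "~840p internal resolution" cited above; 4K Quality lands at exactly 2560x1440, which is why 4K DLSS Quality is often compared against native 1440p cost.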

[screenshot: native 1080p image quality]



Even DLAA cannot save it, because you can only polish a turd so much (the turd being 1080p itself).
 

THE DUCK

voted poster of the decade by bots
It really isn't a yardstick at all outside of its own architecture. What you are seeing in the chart is native resolution without factoring in DLSS.

But let's be real, you're saying a 3060 should run this game at the level of the Series S? Seems like weaksauce and indicative of other games running on a 3060.

But as others have suggested, guess we will see how it really performs when it's released. Dlss and frame recon might help a lot with minimal effect on image.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
Horizon Forbidden West is a major game to land on PC, I completed the campaign recently on PS5 and for the record it’s an all time game, with the Burning Shores DLC you’ll end up putting in 60-100 hours easily.
 
I believe it uses some form of reconstruction to get to 1800p or something. It looks good, but the pixel count is probably way below native 1440.
Remember, running the CB algorithm has its own cost (around 20-30%). The native pixel count for CB 1800p works out to something like 1296p, so I wouldn't consider that way below 1440p.
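The checkerboard math here can be written out. Checkerboard 1800p shades half the pixels of a full 3200x1800 frame; the raw half-count converts to a native-equivalent 16:9 height just under the ~1296p figure quoted, with the 20-30% reconstruction overhead (the poster's estimate) making up the difference:

```python
import math

# CB 1800p shades half the pixels of a full 3200x1800 frame.
cb_pixels = 3200 * 1800 // 2                 # 2,880,000 shaded pixels

# Equivalent native 16:9 height: solve w*h = pixels with w = 16h/9.
eq_height = math.sqrt(cb_pixels * 9 / 16)    # ~1273

print(f"CB 1800p shades {cb_pixels:,} px ≈ native {eq_height:.0f}p")
print(f"native 1440p is {2560 * 1440:,} px")
```

So on raw shaded pixels alone CB 1800p sits at roughly 1273p-equivalent, about 78% of native 1440p's pixel count, before adding the reconstruction cost.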
 

Kenpachii

Member
Tbf, that's not real 4K 120. I'm surprised how demanding this game is on PC; seems you will need a 5080, if not a 5090, for 4K 120.

Nothing is real; it's a game. 4K 120 fps will be easily attainable with a 4090 if DLSS 3.0 is factored into the specifications.
 

Topher

Gold Member
But let's be real, you're saying a 3060 should run this game at the level of the Series S? Seems like weaksauce and indicative of other games running on a 3060.

But as others have suggested, guess we will see how it really performs when it's released. Dlss and frame recon might help a lot with minimal effect on image.

First, I'm just talking about this chart which is not definitive. Not going to speculate on XSS, but no console or GPU is set hard to one "level". There are games running on PS5 and XSX that upscale from 1080p. But yeah, best to wait for actual benchmarks.
 

Jigsaah

Gold Member
Can't wait. Nixxes is doing God's work.

Gonna opt for as high a framerate as possible at 1440p. I'll try it in 4K... but I'm not gonna be greedy. The 4090 is ready.

Interesting video on this.

 

yamaci17

Member
But let's be real, you're saying a 3060 should run this game at the level of the Series S? Seems like weaksauce and indicative of other games running on a 3060.

But as others have suggested, guess we will see how it really performs when it's released. Dlss and frame recon might help a lot with minimal effect on image.
Eh, there's also the fact that the 3060 can be 20-25% slower than the PS5 anyway, and that can play an important role too. The 3060 is actually rather slow in raster compared to the PS5.

This game's performance mode on PS5 is a mystery box, as it runs at an 1800p checkerboarded resolution. Natively that is only 2.8 million pixels (hello slimy). Then you have people like the poster above who will say the checkerboarding process is tough and can cost around 20% to 30%. Let's say the PS5 is rendering at 1500p/60 fps in that case.

There's also the fact that the Decima engine does not scale well to lower resolutions, more so on Ampere.



You can see that 1440p runs at about 70% of the 1080p frame rate on a 3060 (give or take) on this game engine.

Let's assume they used a safe margin and the 3060 will actually give you a 1080p 75 fps average. With the 70% worst-case scaling, that means it would render 1440p at 50 fps or so.

When you look at things from this perspective, it quickly becomes:

3060: 1440p 50 fps
PS5: 1500p 60 fps

(1.2x fps difference, 1.04x res difference)

which makes sense considering there can be a 20-25% difference between the PS5's GPU performance and the 3060's raw GPU-bound raster performance.

Perspective matters a lot. And of course none of this brings DLSS into the discussion (which will warp all of the calculations above).

Almost all Ampere GPUs, including the 3060, actually scale badly to 1080p. It is just the nature of the architecture; those bloated TFLOPS metrics become even more useless at lower resolutions, and more meaningful at higher resolutions, so to speak.
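The back-of-envelope estimate above can be written out explicitly. Every number here is the poster's assumption (the 75 fps baseline, the 70% scaling figure, the 1500p effective PS5 target), not a benchmark:

```python
# Sketch of the poster's 3060-vs-PS5 estimate; all inputs are assumptions.
fps_1080p = 75                       # assumed 3060 average at 1080p
scaling = 0.70                       # assumed 1440p-vs-1080p scaling on Decima
fps_1440p = fps_1080p * scaling      # ~52 fps, rounded to "50 or so" above

ps5_fps = 60                         # PS5 performance mode target
ps5_height = 1500                    # assumed effective render height after CB cost

print(f"3060 at 1440p ≈ {fps_1440p:.0f} fps")
print(f"PS5 gap: {ps5_fps / fps_1440p:.2f}x fps, {ps5_height / 1440:.2f}x res")
```

Under these assumptions the combined gap lands around the 20-25% raster deficit the poster attributes to the 3060, which is the whole point of the exercise.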
 

angrod14

Member
Console required specs for X game: 2020 potato.

PC required specs for the port of the same game: two RTX 4090s in SLI config + an Apple M3 Pro chip with a hamster wheel as a cooling solution + waiting for the game to get 50 patches after release because of shader compilation problems or whatever.
 

daninthemix

Member
Lemme guess: a new Sony port has set the cat amongst the pigeons yet again?

Sony's porting strategy is the gift that keeps on giving.
 
Console version is usually 1440p 30fps (can drop to 1080p at times and below 30 even), so you need at least 2x the GPU for 4K (4070S) and another 2x for 60FPS.

So ~4x the power of PS5, 4090 won't be able to do it without upscaling:

[chart: relative GPU performance]
Don’t expect the normal 2x gpu scale of 4070s since this is an exclusive
 

Katatonic

Member
Every time I'm tempted to build a new PC, I'm reminded of the diminishing returns. 4080 for 4k 60 fps and I bet there'll be hitches still.

Maybe if one day I switch from Mac to PC for music production, I can justify building a monster rig with gaming as a bonus.
 

Senua

Gold Member
Every time I'm tempted to build a new PC, I'm reminded of the diminishing returns. 4080 for 4k 60 fps and I bet there'll be hitches still.

Maybe if one day I switch from Mac to PC for music production, I can justify building a monster rig with gaming as a bonus.
Well, tbh dude, native 4K with DLAA will be a massive improvement in image quality over the PS5. Or you could lower the internal res with DLSS, retain great IQ, and push framerates beyond 60.
 

Gaiff

SBI’s Resident Gaslighter
Every time I'm tempted to build a new PC, I'm reminded of the diminishing returns. 4080 for 4k 60 fps and I bet there'll be hitches still.

Maybe if one day I switch from Mac to PC for music production, I can justify building a monster rig with gaming as a bonus.
Or get a 4070, toggle frame generation and DLSS Quality at 4K. Done. Better image quality than PS5 and 60fps. Flip on Reflex for lower latency.

$1000ish rig with twice the performance and better image quality.
 

Topher

Gold Member
PC required specs for the port of the same game: two RTX 4090s in SLI config + an Apple M3 Pro chip with a hamster wheel as a cooling solution + waiting for the game to get 50 patches after release because of shader compilation problems or whatever.

 

Senua

Gold Member
Or get a 4070, toggle frame generation and DLSS Quality at 4K. Done. Better image quality than PS5 and 60fps. Flip on Reflex for lower latency.

$1000ish rig with twice the performance and better image quality.
Look who has entered the chat

 

yamaci17

Member
Nixxes' Spider-Man ports also needed a 3070 to do PS5-quality visuals, IIRC.

GOW 2018 is another example of a Sony first party engine just not translating well to PCs. A 6 tflops 580 was basically performing like a 4.2 tflops PS4 Pro. Ghost of Tsushima is next and we will see how well that engine is ported.
No. The Spider-Man games will tank your performance by half if you're VRAM bottlenecked, which in turn skews the comparisons unfairly. When not VRAM bottlenecked, the 16 GB 4060 Ti will be above the PS5 by whatever margin it should be (25-30%).

You take massive 1.5x to 2x hits to performance (depending on PCIe bandwidth) if you're VRAM bound in Ratchet and Spider-Man (which in turn defeats the purpose of comparing raw GPU power in relation to the PS5).





It is a personal L that 8 GB users like me have to take versus the PS5 if we want to play with equivalent textures. It is not caused by the GPU chip's power itself; the game engine stalls the entire GPU to transfer textures from RAM to VRAM over PCIe all the time. This simply does not happen on lower-end GPUs like the 12 GB 3060, which proves it is specific to the 8 GB 3070 and the like, nothing more, nothing less, and it should not be used in academic research when analysing the PS5's performance against equivalent or similar desktop GPUs.



You can literally hit 60+ fps with 4K DLSS Balanced + ray tracing on High (PS5-equivalent ray tracing) on this GPU in Spider-Man. It is optimized well, but it scales horribly for 8 GB cards in terms of performance. (4K DLSS Balanced has a perf cost around native 1620p, which means the 3060 is able to hit 1620p 60 fps with ray tracing. Considering the PS5 can hit 1440p 70-80 fps with ray tracing, the port is fine as long as you have enough VRAM, given the 3060's ray tracing advantage and weaker raster. It performs exceptionally well.)



It is too late for these 8 GB GPUs. If Nixxes uses a similar trick for 8 GB cards in Horizon Forbidden West, you will see similar results there too.

Also, God of War's case is simple: the game was designed around the PS4's 8 ACEs and built completely around async compute, and the PC port is literally DX11, the most distant thing you can find from async. Even then, most GCN and Pascal cards had lackluster async compared to consoles. This is not the case anymore; most GPUs at the time did not support async properly, especially Pascal/Maxwell cards. With them being phased out, async-based outliers will be gone. But thanks to NVIDIA's VRAM stinginess, we will have tons of VRAM-based outliers instead (sadly).
 
Played the main game when it released on the PS5, have not yet played the DLC. I'm thinking about getting this on PC eventually to play the DLC.
If you still have your PS5 copy you can also get it on the Pro. Do you have a 4080 or 4090? If not, I would certainly recommend the Pro over the PC version.
 
It's not a Sony exclusive, and even without that, aside from The Last of Us, Sony games scale pretty well with GPU power.
Nah, Rift Apart doesn't scale as well as other PC games. The PS5 has more GPU grunt in that game than a 3070 and approaches the 3080, even though that shouldn't be the case.
 

Zathalus

Member
What is with the debate regarding performance vs the PS5? The PS5 does 1800p checkerboard at 60fps, while a 3070 should do 1440p at 60fps. That's 2.88 million pixels vs 3.68 million pixels, so 28% more pixels, which is basically the power difference between a PS5 and the 3070. Nothing really to debate over.
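The pixel counts in this post check out; a two-line verification (assuming CB 1800p means half of a 3200x1800 frame shaded per frame):

```python
# PS5 performance mode (1800p checkerboard) vs a 3070 at native 1440p.
ps5_px = 3200 * 1800 // 2        # 2,880,000 shaded pixels per frame
pc_px = 2560 * 1440              # 3,686,400 pixels per frame

print(f"native 1440p pushes {pc_px / ps5_px - 1:.0%} more pixels than CB 1800p")
```

That ~28% lines up with the typical rasterization gap quoted between the PS5 and a 3070 elsewhere in the thread.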
 