Forspoken PC Requirements

SlimySnake

The Contrarian
Ultra settings... no FSR



Looks like a Switch game running on an emulator. Where is all the detail?
 
New Nvidia drivers are out as well, and they say this game supports DLSS 3. Strange, since it's an AMD-funded title.

That shouldn't make any difference: regardless of who sponsors a game, every PC game now should support DLSS (2 and 3), FSR, and XeSS, as well as other upscalers such as UE4's own temporal upscaling. Forcing one on everyone benefits no one; just look at The Callisto Protocol, with its sub-par FSR implementation that distorts text at settings below Quality.
 

01011001

Gold Member
Downloading the demo now too. I wonder how fast it will load on my mid-ass SSDs lol.

I'll try it on my slower Samsung external SSD first before moving it to my faster internal one.
 

rofif

Member
Trying the demo out of curiosity before I start the PS5 version, and it runs well.
4K30 ultra with a 3080.
With DLSS or FSR it's of course around 60. Fine.

There is ray-traced ambient occlusion, but it doesn't seem to work. Anyone figured out what it does? There is no difference.

Ground textures are very, very low res... sometimes a high-res texture will flicker in.
 

RoboFu

One of the green rats
Anyone else notice photo mode adds more grass/foliage to the scenes? I have the highest settings, yet photo mode still adds in more stuff.


example:

 
Last edited:

Brock2621

Member
Anyone else notice photo mode adds more grass/foliage to the scenes? I have the highest settings, yet photo mode still adds in more stuff.
I wonder if that means there's an .ini file somewhere people can mess with to better represent some of the earlier (better-looking) footage?
 

01011001

Gold Member
There is ray-traced ambient occlusion, but it doesn't seem to work. Anyone figured out what it does? There is no difference.

I mean, that's what I thought when I first tried the RT shadows mode on PS5. The RT in this game is so mediocre that it's really hard to see what's different, and I wouldn't be surprised if the RT AO is the same quality as the FidelityFX AO.
 

rofif

Member
I mean, that's what I thought when I first tried the RT shadows mode on PS5. The RT in this game is so mediocre that it's really hard to see what's different, and I wouldn't be surprised if the RT AO is the same quality as the FidelityFX AO.
RT shadows at least do... something. Not worth enabling, but there is an effect. RT AO does absolutely nothing here as far as I can tell.

Anyway, I've found the same spot from the PS5 demo and did a comparison for anyone curious. Yes, the one infamous shot that's been compressed and cropped here over the past few days.
3080, 4K, ultra, no FSR, no DLSS.

PS5 vs PC (PS5 is the top picture!)

It's funny you can disable fog in the PC version. Looks like shit without it. Why would they even allow it?
 

01011001

Gold Member
RT shadows at least do... something. Not worth enabling, but there is an effect. RT AO does absolutely nothing here as far as I can tell.

Anyway, I've found the same spot from the PS5 demo and did a comparison for anyone curious. Yes, the one infamous shot that's been compressed and cropped here over the past few days.
3080, 4K, ultra, no FSR, no DLSS.

PS5 vs PC (PS5 is the top picture!)

It's funny you can disable fog in the PC version. Looks like shit without it. Why would they even allow it?

Maybe the AO only applies to completely static and opaque objects? That wouldn't surprise me in the slightest.
 

rofif

Member
Maybe the AO only applies to completely static and opaque objects? That wouldn't surprise me in the slightest.
I did comparisons on the castle fortress wall in shade, and nothing.
And for normal AO, Standard vs. High just makes the AO shadow line thicker on Standard than on High.
 

01011001

Gold Member
I did comparisons on the castle fortress wall in shade, and nothing.
And for normal AO, Standard vs. High just makes the AO shadow line thicker on Standard than on High.

I just started the demo, I'll also try to look for it lol... Wouldn't surprise me if it's literally so bad it's not visible, or if it's missing entirely.

The performance so far is surprisingly good tho, way better than on PS5; it's not even close.
This is a shot with everything maxed, 1080p, DLSS Quality mode (which renders at about 720p internally, I think),
dynamic res off, and everything maxed including RT shadows and AO.



I'm hovering around 50 fps, sometimes 60 fps, with highly unoptimized settings... I expected a disaster tbh.
A few shader stutters are there tho, and they can be annoying the first time you use a spell that has to be compiled.
 

rofif

Member
The ground textures not loading fully is weird, the bad AO is weird, and there's definitely a memory leak: FPS drops to lower digits in the same spots if I run around and come back.
Not the worst PC launch I've seen. Should be ironed out in a week.
 

Bo_Hazem

Gold Dealer
RT shadows at least do... something. Not worth enabling, but there is an effect. RT AO does absolutely nothing here as far as I can tell.

Anyway, I've found the same spot from the PS5 demo and did a comparison for anyone curious. Yes, the one infamous shot that's been compressed and cropped here over the past few days.
3080, 4K, ultra, no FSR, no DLSS.

PS5 vs PC (PS5 is the top picture!)

It's funny you can disable fog in the PC version. Looks like shit without it. Why would they even allow it?

 

yamaci17

Member
The ground textures not loading fully is weird, the bad AO is weird, and there's definitely a memory leak: FPS drops to lower digits in the same spots if I run around and come back.
Not the worst PC launch I've seen. Should be ironed out in a week.
Use Standard settings. 10 gigs of VRAM is not going to cut it for the higher texture settings.
You'd do yourself a service with Standard VRAM settings; at least the proper textures would be loaded. Otherwise the engine can simply cheat (it already cheats even on PS5; watch the DF video, it sometimes refuses to load textures to stay within budget).

The game most likely requires a PS5 with a 64 GB budget, all while looking worse than TLOU2. You're welcome.
 

rofif

Member
Use Standard settings. 10 gigs of VRAM is not going to cut it for the higher texture settings.
You'd do yourself a service with Standard VRAM settings; at least the proper textures would be loaded. Otherwise the engine can simply cheat (it already cheats even on PS5; watch the DF video, it sometimes refuses to load textures to stay within budget).

The game most likely requires a PS5 with a 64 GB budget, all while looking worse than TLOU2. You're welcome.
It defaulted to High. I thought this would be enough for 10 GB, so I avoided Ultra for the VRAM setting.
Man... the 3080 with 10 GB really is fucked.
 

Reizo Ryuu

Member
The ground textures not loading fully is weird, the bad AO is weird, and there's definitely a memory leak: FPS drops to lower digits in the same spots if I run around and come back.
Not the worst PC launch I've seen. Should be ironed out in a week.
No, a 3080 has 10 GB of VRAM; they said that for the High texture setting you need 12 GB+ for it to work properly. I posted it twice on the previous page:
In the preset “High”, other elements of game assets consume more VRAM than in the “Standard” preset. With 8GB VRAM, the GPU has less headroom for texture streaming, which might result in a noticeable downgrade of texture quality in certain parts of the game.
We recommend having 12GB or more VRAM available when using the preset “High”, in order to experience the game with high quality visuals.
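That 12 GB recommendation is easy to sanity-check with back-of-envelope texture math. A minimal sketch, where every number (BC7-style compression at roughly 1 byte per texel, the map count, the resident-material count) is an illustrative assumption, not anything from the game's actual assets:

```python
# Rough VRAM estimate for resident streamed textures.
# Assumptions (illustrative only): block compression ~1 byte/texel,
# and a full mip chain adds ~1/3 on top of the base level.

def texture_bytes(width, height, bytes_per_texel=1.0, mips=True):
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mips else base

# One "material" = albedo + normal + roughness/metallic maps.
per_material = 3 * texture_bytes(4096, 4096)

# Hypothetical open-world scene: 500 materials resident at once.
budget_gib = 500 * per_material / 2**30
print(f"{budget_gib:.2f} GiB")
```

At these made-up numbers, keeping everything resident would need over 30 GiB, which is exactly why the engine streams textures and why a card with only 8-10 GB of headroom ends up showing lower-res mips.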
 
RT shadows at least do... something. Not worth enabling, but there is an effect. RT AO does absolutely nothing here as far as I can tell.

Anyway, I've found the same spot from the PS5 demo and did a comparison for anyone curious. Yes, the one infamous shot that's been compressed and cropped here over the past few days.
3080, 4K, ultra, no FSR, no DLSS.

PS5 vs PC (PS5 is the top picture!)

It's funny you can disable fog in the PC version. Looks like shit without it. Why would they even allow it?
This must be the ugliest world in an open-world game yet...
 

Graciaus

Member
Coming from a heavily modded Skyrim earlier in the day, this game looks bad in comparison. The combat bored me. The dialogue was awful, but maybe the Japanese audio could save it. Other than the texture pop-in, I don't think it ran that badly, but I mainly just ran around the world before deleting it. Even if it were free, I wouldn't install the full game.
 

WhartoX

Member
Ran the in-game benchmark. Here are my results.

No DLSS, No FSR, Max settings
CPU: AMD Ryzen 9 5950X
GPU: ASUS TUF RTX 4090 GAMING OC (OC = +200 Core, +1600 Memory, +100 Core Voltage, 133% Power Limit)
RAM: 64GB DDR4 (3600 MHz)
SSD: WD_BLACK 2TB SN850

 

Mister Wolf

Member
It really grinds my gears that we've had NVMe SSDs on PC for years but didn't get to take advantage of them in games until these lazy motherfuckers at Microsoft got off their asses and released the DirectStorage SDK, and even that's mainly for their precious little consoles. We got it on PC as a side effect.

Blame Nvidia as well. They only just now came up with GDeflate.
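A toy model of why GPU decompression matters here: a load pipeline runs at the speed of its slowest stage, and once NVMe drives hit multiple GB/s, CPU-side inflate becomes that stage. The throughput numbers below are illustrative assumptions, not measurements of any real CPU, GPU, or drive:

```python
# Loading pipeline: read compressed data from disk, then decompress.
# Effective throughput is bounded by the slower of the two stages.

def load_time_s(asset_gb, disk_gbps, decompress_gbps):
    return asset_gb / min(disk_gbps, decompress_gbps)

assets_gb = 4.0  # compressed assets to load (assumed)

cpu_path = load_time_s(assets_gb, disk_gbps=7.0, decompress_gbps=1.5)
gpu_path = load_time_s(assets_gb, disk_gbps=7.0, decompress_gbps=20.0)
print(f"CPU inflate: {cpu_path:.2f} s, GPU decompress: {gpu_path:.2f} s")
```

With the CPU stage assumed at 1.5 GB/s, the 7 GB/s drive sits mostly idle; offloading decompression to the GPU (what GDeflate enables via DirectStorage 1.1) lets the drive, not the decompressor, set the floor.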
 

01011001

Gold Member
I'm doing a hardcore load time test next.
I'm currently moving the game onto my external HDD, which is the stock PS4 HDD that I use for old games and small files.

This is gonna be interesting. I'll try a cold boot into the game and load my save file to see how long it takes :D ...moving the game over takes quite a while tho (VERY slow HDD...)
 

Celcius

°Temp. member
Ran the in-game benchmark. Here are my results.

No DLSS, No FSR, Max settings
CPU: AMD Ryzen 9 5950X
GPU: ASUS TUF RTX 4090 GAMING OC (OC = +200 Core, +1600 Memory, +100 Core Voltage, 133% Power Limit)
RAM: 64GB DDR4 (3600 MHz)
SSD: WD_BLACK 2TB SN850

Getting only 75 fps at 4K with an overclocked 4090, it doesn't look like it runs all that well... especially considering the visuals don't look like anything special.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Ran the in-game benchmark. Here are my results.

No DLSS, No FSR, Max settings
CPU: AMD Ryzen 9 5950X
GPU: ASUS TUF RTX 4090 GAMING OC (OC = +200 Core, +1600 Memory, +100 Core Voltage, 133% Power Limit)
RAM: 64GB DDR4 (3600 MHz)
SSD: WD_BLACK 2TB SN850

133% power limit... on a 4090?
You might be the first person I've seen actually bump up the 4090 power limit; everyone I know drops it down to 75-80%.
 

GHG

Member
Getting only 75 fps at 4K with an overclocked 4090, it doesn't look like it runs all that well... especially considering the visuals don't look like anything special.

Getting 70-72 on a stock 4090.

Overclocking a 4090 is the very definition of diminishing returns.
 

01011001

Gold Member
I'm doing a hardcore load time test next.
I'm currently moving the game onto my external HDD, which is the stock PS4 HDD that I use for old games and small files.

This is gonna be interesting. I'll try a cold boot into the game and load my save file to see how long it takes :D ...moving the game over takes quite a while tho (VERY slow HDD...)

Yeah, so DirectStorage clearly doesn't work miracles on a slow HDD :messenger_tears_of_joy: but I had to test it. These are extreme conditions tho; it's hard to find an HDD this slow these days, those PS4 drives were really awful.

 

01011001

Gold Member
Made a shitty video on Steam Deck; low settings, 30-45 fps.


How are the loading times into the saved game from the main menu? This can't use DirectStorage after all, so it would be interesting to see raw SSD or flash memory speeds.
 

01011001

Gold Member
7.1 seconds, but I use a 3.5 GB/s NVMe drive in it.

Interesting. Didn't the devs actually say that DirectStorage reduced loading in their tests from 10 sec down to less than 1 sec? This would be in that ballpark if true, and would kinda confirm the load speed increase from DirectStorage.
 

Kenpachii

Member
Interesting. Didn't the devs actually say that DirectStorage reduced loading in their tests from 10 sec down to less than 1 sec? This would be in that ballpark if true, and would kinda confirm the load speed increase from DirectStorage.

No clue; the Steam Deck uses Linux tho (SteamOS), and I'm not sure DirectStorage is even a thing there. I don't have a Windows installation on it, so I can't really test, but I assume it then runs the same as any PCIe 3.0 NVMe SSD, maybe a bit slower because the CPU cores are a bit less powerful.
 

01011001

Gold Member
No clue; the Steam Deck uses Linux tho (SteamOS), and I'm not sure DirectStorage is even a thing there. I don't have a Windows installation on it, so I can't really test, but I assume it then runs the same as any PCIe 3.0 NVMe SSD, maybe a bit slower because the CPU cores are a bit less powerful.

My Samsung USB SSD in Windows 10 loads as fast as the PS5 version (basically instantly), so DirectStorage does its job even on SSDs that aren't that great.
That SSD is rated for 540 MB/s.

Too bad you have no Windows partition; that would be really interesting to test given the low-spec nature of the Deck, because DirectStorage decompression is of course GPU-bound.
 

Kenpachii

Member
My Samsung USB SSD in Windows 10 loads as fast as the PS5 version (basically instantly), so DirectStorage does its job even on SSDs that aren't that great.
That SSD is rated for 540 MB/s.

Too bad you have no Windows partition; that would be really interesting to test given the low-spec nature of the Deck, because DirectStorage decompression is of course GPU-bound.

Ah, forgot about that, it's a GPU solution. But yeah, I won't be installing Windows on it for now.
 

yamaci17

Member
Having no trouble with 16 GB, a 3070, and a 2700 over here.

4K DLSS Balanced + mixed Standard/High settings, with High texture quality (not Ultra High).

Getting sane RAM and VRAM consumption. The CPU easily does 40+ frames consistently.

I'm not sure why the game recommends 24-32 GB of RAM. It runs pretty much fine, without stutters and hitches, as well.

The graphics are not that impressive, but the RAM requirement is a hoax; no way this kind of usage amounts to a 16 GB "minimum" for 720p low.

Actually, I managed to play at 1080p/low with a locked, consistent 45 FPS on 8 GB of memory, DESPITE the game warning me "do not play with 8 gigs bro".
 

octiny

Member
Decided to install the demo on my larger NVMe drive, which only gets 2,100/1,700 MB/s, to see if there's a difference. Still instant load times when quitting and restarting from the title screen.

Using my 4.9-liter console build (Velka 5, 12600K, 6800 XT, 32GB).

Getting about 65-85 fps at maxed-out settings w/ RT, using FSR2 Quality @ 1440p (OC'ed to around 2610 MHz in-game, faster than a stock 6900 XT, closer to 6950 XT performance). Usually around 70-75; the lowest I saw was 65, during the big-ass bridge fight and the big zombie fight. Without FSR, the lowest I saw was 52, with the average hovering close to 60 and highs around 65-70.

The game definitely seems to be better optimized on AMD cards than on Nvidia from what I'm seeing around the interwebz, performing at around 4070 Ti/3090/3090 Ti level w/ or w/o FSR/DLSS (max settings).

Edit: Forgot to mention I'm not using VRS; I noticed no performance hit with it off. Also, since when did Square hire Jarvis from the MCU?
 

yamaci17

Member
It defaulted to High. I thought this would be enough for 10 GB, so I avoided Ultra for the VRAM setting.
Man... the 3080 with 10 GB really is fucked.
Nope, just use PS5-equivalent settings. You have an identical memory budget to the PS5, so you have to adhere to that. This includes omitting ray tracing (and yes, this will make the 3000 series' ray tracing advantage over a PS5 obsolete in future-gen titles if you want PS5-equivalent texture quality at upscaled 4K; however, if you target 1440p or lower resolutions, you will most likely be able to keep PS5-equivalent textures and extra ray tracing together).

Practically, you will have options. You won't necessarily be damned, but yeah, the GPU will be stranded on certain options in certain situations.

I know you like your frames as low as possible (jking), so yeah, you would be better off with a 12 GB 3060 lol. Because in future-gen titles, what the 3080 will additionally provide at 4K with upscaling is higher framerates and much better reconstruction. I think DLSS alone justifies all RTX cards.

I will be blunt with you: I'm running this game at 1620p DLSS Balanced (internally around 900p) and it looks damn near like native 1440p. The PS5 uses regular FSR to upscale from around 720p-900p, and the results are super fuzzy, blurry, and weird (I saw them in the DF video). 1620p DLSS Balanced is leaps and bounds better than what the PS5 spits out in this game, and of course I get a higher framerate as well at this spec. I get both better performance and better IQ this way on the 3070. But of course most people, even you, most likely aren't realizing the kind of potential Ampere and Turing cards have.

Of course, I know this will have no meaning for you, but it does to me. DLSS's 900p and FSR's 900p are not the same. FSR 2 is best used when it upscales 1440p to 4K; DLSS, however, does a majestic job even at low internal resolutions and lower-res targets.
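For anyone wanting to check the internal resolutions being quoted here: the per-axis render scales below are DLSS 2's published defaults (FSR 2's quality modes use essentially the same ratios); the helper itself is just a sketch:

```python
# Per-axis render scale for common upscaler quality modes.
SCALE = {
    "quality": 1 / 1.5,            # 66.7% per axis
    "balanced": 1 / 1.72,          # ~58% per axis
    "performance": 1 / 2.0,        # 50% per axis
    "ultra_performance": 1 / 3.0,  # 33.3% per axis
}

def internal_height(output_height, mode):
    """Approximate internal render height for a given output height."""
    return round(output_height * SCALE[mode])

print(internal_height(1620, "balanced"))  # 942 -- "around 900p"
print(internal_height(2160, "quality"))   # 1440
```

1620p Balanced landing at roughly 940p matches the "internally around 900p" figure above.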
 

Kenpachii

Member
Testing it out with a 3.5 GB/s Samsung 970 PCIe 3.0 SSD, a 9900K at 5 GHz (8 cores/16 threads), 32 GB of RAM, and Windows 10, with a stock 10 GB 3080 TUF model.

3440x1440 ultrawide, everything maxed including Ultra High textures, and RT on: about ~50 fps with drops to the high 30s.
With DLSS Quality, ~50s; DLSS Performance is about ~60s, and I never saw it drop to the 50s even in demanding combat.

Screenshot taken at ultra settings + DLSS Quality, the max quality you can get.

The game looks like ass tho, really.

Also, VRAM consumption seems to halt at 8.8 GB from what I saw at this resolution; system memory usage is higher because I have a fuck-ton of other programs running in the background. Didn't update the Nvidia drivers tho.

Load time is pretty much instant.

SSD usage: the highest I saw was 360 MB/s while roaming around.
 

WhartoX

Member
Getting 70-72 on a stock 4090.

Overclocking a 4090 is the very definition of diminishing returns.

Overclocking the 4090 improves performance for me in every game, but it varies heavily. Some games may get a 1-5 fps boost; some may get around a 10-15 fps boost.
 

GHG

Member
Overclocking the 4090 improves performance for me in every game, but it varies heavily. Some games may get a 1-5 fps boost; some may get around a 10-15 fps boost.

Did you up the core speed or the memory speed (or both)?

From the reading I've done on the subject, it's overclocking the memory that can yield the bigger gains, especially at higher resolutions.
 