
Forspoken PC Requirements

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Is the difference as big even at 4k?
Single digit differences.
Even if you built two identical machines, one with a 13600K and one with a 13900K, put them side by side, let people test games for however long, and then asked them which was which, it would be pure guesswork.

These tests are kind of like how people suddenly can't stand playing Returnal because they just found out it renders internally at 1080p.
But before they knew that, the game was fine.

Once the difference between CPUs is sub-10%, hell, even sub-20%, while still well into the hundreds of FPS, it's purely academic.
 

//DEVIL//

Member
360Hz monitors are a thing.
lol, it does not work like that, bud. Especially with the 4090: your GPU will be bottlenecked by the CPU and will not give you higher frames. The CPU can't keep up with a 4090 at FHD.

Look at 2K / FHD and 4K. Between 2K and FHD he is getting the same frames, so having a 360Hz monitor means nothing when your GPU is getting choked. The 4090 is in another class, a halo product. You do not touch FHD, or in some cases even 2K, with a 4090. That card is a pure 4K / 8K experience card.

Even when testing CPU performance, it shouldn't be done with a 4090, for the exact same reason.

Also, with all respect, for anyone who intends to buy, or has bought, a 4090 to play at FHD, the word stupid is not enough to describe them. Might as well limit the frame rate to 30fps on that FHD, you know, for ultimate optimization... fml.
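A rough way to picture the bottleneck argument (a sketch with made-up numbers, not anyone's actual benchmark data): the delivered frame rate is roughly the minimum of what the CPU can simulate and what the GPU can render at the chosen resolution, so lowering resolution only adds frames until the CPU cap is reached.

```python
# Rough bottleneck model: delivered FPS is capped by whichever of the CPU
# or GPU is slower at a given resolution. All numbers are illustrative.
def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

cpu_cap = 200.0  # hypothetical CPU simulation limit, resolution-independent
gpu_caps = {"2160p": 120.0, "1440p": 250.0, "1080p": 400.0}  # hypothetical 4090-class numbers

for res, gpu_cap in gpu_caps.items():
    print(res, delivered_fps(cpu_cap, gpu_cap))
# 2160p -> 120 (GPU-bound); 1440p and 1080p -> 200 (CPU-bound, so dropping
# resolution further stops adding frames)
```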
 

Gaiff

Member
lol, it does not work like that, bud. Especially with the 4090: your GPU will be bottlenecked by the CPU and will not give you higher frames. The CPU can't keep up with a 4090 at FHD.

Look at 2K / FHD and 4K. Between 2K and FHD he is getting the same frames, so having a 360Hz monitor means nothing when your GPU is getting choked. The 4090 is in another class, a halo product. You do not touch FHD, or in some cases even 2K, with a 4090. That card is a pure 4K / 8K experience card.

Even when testing CPU performance, it shouldn't be done with a 4090, for the exact same reason.

Also, with all respect, for anyone who intends to buy, or has bought, a 4090 to play at FHD, the word stupid is not enough to describe them. Might as well limit the frame rate to 30fps on that FHD, you know, for ultimate optimization... fml.

And to prove your point you're showing a game that goes from 150fps at 4K to 230fps+ at 1080p?

You also got Hunt Showdown doing 300fps+ lol.



There is absolutely a class of people who will drop their settings to minimum, get a 360Hz TN panel and play at 1080p to get maximum performance.

And once again, the point is to demonstrate CPU differences. I don't even game at 1080p myself.

Once the difference between CPUs is sub-10%, hell, even sub-20%, while still well into the hundreds of FPS, it's purely academic.
Great, now a 20% difference doesn't matter. Talk about being dishonest. 100 to 120 isn't purely academic.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
At least it will be at 60fps and not 480p 20fps on a PC of equivalent value 😂🤣
Mate.
You don't want this fight when you can see a bunch of PC guys arguing about 1% lows.

A PS5 is what, $500 tax free?

5600/12400 - 150
B550M/B450 - 100
16GB of RAM - 50
1TB SSD SATA - 50
RTX 2060Super - 200

Total = 550 give or take

^This machine will run this game at 60fps above 720p
You may see yourself out.





(I know the DE is cheaper, but let's not go there.)
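For anyone sanity-checking that total, a quick sketch using the ballpark prices listed above (not live market prices):

```python
# Sum the ballpark component prices quoted above (USD, give or take).
parts = {
    "Ryzen 5600 / i5-12400": 150,
    "B550M / B450 motherboard": 100,
    "16GB RAM": 50,
    "1TB SATA SSD": 50,
    "RTX 2060 Super": 200,
}
print(sum(parts.values()))  # 550 -> roughly PS5 money before tax
```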
 

DenchDeckard

Moderated wildly
The new 7700X is like twice as fast as the 3700X, which surprised me very much and makes me question my PC gaming allegiance even more lol. Feels like I got this shitty CPU and platform just recently, but it was actually July 2019!!!
I can tell you, the CPU was good, but that X570 was terrible at first, and installing a new BIOS every fucking week was getting tiring fast.
A 1-minute cold boot, coming from 20 seconds on a 2500K, was not what I expected, but they ironed it out after a year.

Is it time for a new platform, CPU and mobo?
Yes.
Will I upgrade? Fuck no.

We know you hate PC...but we still love you!

On the real, I understand the pain on that. Sounds rough.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Great, now a 20% difference doesn't matter. Talk about being dishonest. 100 to 120 isn't purely academic.
Even at 100fps (I wasn't being literal, by the way), the frametime is what, 10ms, right?
At 120fps, the frametime is about 8.3ms.

You're telling me you're going to notice a roughly 1.7 millisecond frametime difference whenever it happens??
Mate, please.
Now imagine when the difference is 10% or less.
This isn't a 30 vs 40 vs 60fps situation... we are PC gamers.

And I said 100s, not exactly 100; we are seeing these CPUs pull hundreds of frames per second easily in many games, so that frametime difference becomes even smaller.
There's a point of diminishing returns as you go up in frame rate; well above 100fps, spotting a 10% difference becomes hard.

I've got a 165Hz screen and I lock it to 144 or 120 depending on the game.
I couldn't tell you the difference; I sometimes forget to turn it down to 120 and leave it at 144, so it VRRs between 144 and 12x... I couldn't even tell you that shit was happening.
I legit need to have RTSS open, plus my OCD, to notice that the frametime graph isn't flat; only then do I notice the difference and go lock the frames.
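Putting numbers on the frametime point (a quick sketch; the frame rates are just the ones discussed above):

```python
# Frametime in milliseconds for a given frame rate, and the gap between two rates.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for low, high in [(100, 120), (120, 144), (144, 165)]:
    gap = frametime_ms(low) - frametime_ms(high)
    print(f"{low} vs {high} fps: {frametime_ms(low):.2f} ms vs {frametime_ms(high):.2f} ms, gap {gap:.2f} ms")
# 100 vs 120 fps: 10.00 ms vs 8.33 ms, gap 1.67 ms
# The gap keeps shrinking as the base frame rate climbs, which is the
# diminishing-returns point being made here.
```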
 

Gaiff

Member
Even at 100fps (I wasn't being literal, by the way), the frametime is what, 10ms, right?
At 120fps, the frametime is about 8.3ms.

You're telling me you're going to notice a roughly 1.7 millisecond frametime difference whenever it happens??
Mate, please.
Now imagine when the difference is 10% or less.
This isn't a 30 vs 40 vs 60fps situation... we are PC gamers.

And I said 100s, not exactly 100; we are seeing these CPUs pull hundreds of frames per second easily in many games, so that frametime difference becomes even smaller.
There's a point of diminishing returns as you go up in frame rate; well above 100fps, spotting a 10% difference becomes hard.

I've got a 165Hz screen and I lock it to 144 or 120 depending on the game.
I couldn't tell you the difference; I sometimes forget to turn it down to 120 and leave it at 144, so it VRRs between 144 and 12x... I couldn't even tell you that shit was happening.
I legit need to have RTSS open, plus my OCD, to notice that the frametime graph isn't flat; only then do I notice the difference and go lock the frames.
That's just where things stand right now. There are already 20% differences at 100 vs 120fps. Four years from now, what is it going to be? 54 vs 65fps? Maybe 49 vs 60? The point is that, depending on your configuration, the 13900K might outlast the 13600K. That isn't a 3-4% advantage. Is the money worth it for most of us? Fuck no, just build a new rig 4 years from now. If you have a lot of cash to spare and care about the absolute best uncompromised performance with the safest upgrade path, get the 13900K.
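One way to read the "what does 20% become in four years" question (a sketch; the demand-growth factors are purely illustrative, like the projected fps figures above): if future games load both CPUs equally harder, the ~20% ratio holds while the absolute numbers fall, so the faster part stays above thresholds like 60fps for longer.

```python
# If CPU demand grows by some factor, both parts scale down together and
# the ratio between them is preserved (illustrative numbers only).
fast_today, slow_today = 120.0, 100.0
for growth in (1.0, 1.5, 2.0):
    fast, slow = fast_today / growth, slow_today / growth
    print(f"demand x{growth}: {slow:.0f} vs {fast:.0f} fps")
# demand x1.0: 100 vs 120, x1.5: 67 vs 80, x2.0: 50 vs 60
```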
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If you have a lot of cash to spare and care about the absolute best uncompromised performance with the safest upgrade path, get the 13900K.
I'm not even arguing against ^this point.
In 4-5 years there is a strong chance you'll be upgrading your CPU and GPU.
For me, and I'm sure for most people, 4-5 years is about the upgrade cycle.
Assuming the 12600K and 13600K are on their last legs at that point, then the entire Alder Lake and Raptor Lake line will be on its last legs too (assuming you still want high framerates and we have GPUs that "easily" do 4K120), so you'll be upgrading anyway regardless of which K or X you chose.

But if you've got the cash to throw at the "problem", then sure, buy the biggest, baddest everything... if you are being measured and planning a gaming-first build, the 6-core CPUs are bang for buck and will last as long as their higher-core brothers.
 

yamaci17

Member
I'm not even arguing against ^this point.
In 4-5 years there is a strong chance you'll be upgrading your CPU and GPU.
For me, and I'm sure for most people, 4-5 years is about the upgrade cycle.
Assuming the 12600K and 13600K are on their last legs at that point, then the entire Alder Lake and Raptor Lake line will be on its last legs too (assuming you still want high framerates and we have GPUs that "easily" do 4K120), so you'll be upgrading anyway regardless of which K or X you chose.

But if you've got the cash to throw at the "problem", then sure, buy the biggest, baddest everything... if you are being measured and planning a gaming-first build, the 6-core CPUs are bang for buck and will last as long as their higher-core brothers.
I'd like these people to show us situations where the 3-year-old 10900K achieves the colossal performance benefits over the 10600K that they claim it would (for example, they claim one would fall below 60fps whereas the other would stay above 60, etc.).

They can also show us how gracefully the 6-year-old 1700X has aged compared to the 1600X.
 

//DEVIL//

Member
And to prove your point you're showing a game that goes from 150fps at 4K to 230fps+ at 1080p?

You also got Hunt Showdown doing 300fps+ lol.



There is absolutely a class of people who will drop their settings to minimum, get a 360Hz TN panel and play at 1080p to get maximum performance.

And once again, the point is to demonstrate CPU differences. I don't even game at 1080p myself.


Great, now a 20% difference doesn't matter. Talk about being dishonest. 100 to 120 isn't purely academic.

Please watch the video again and read my comments before you write yours.

I was giving you proof that between FHD and 2K with a 4090 the difference is almost nothing, because the 4090 is bottlenecked at FHD. Like, why are you even trying to argue with this?

A 4090 and FHD don't go together. We're done. Every chart you bring here that pairs a 4090 with FHD makes you look like you don't know shit, and honestly I am starting to question that. Do not quote me back again about this; I will not even bother to read it.

And then you link me to a video of someone playing a game with a 4090 at 4K on high and low settings? What the FUCK does that have to do with a 4090 at FHD? You don't have a point. You can't prove a point. You don't even have an idea of a point to prove. Just stop >_<

Also, those people with 360Hz monitors... sure, they exist. But NOT WITH A FUCKING 4090!!!!! IT DOES NOT FUNCTION AT FHD.

How many times do I have to say it??

4090 DOES NOT WORK PROPERLY WITH FHD. YOU WON'T GET EXTRA FRAMES
4090 DOES NOT WORK PROPERLY WITH FHD. YOU WON'T GET EXTRA FRAMES
4090 DOES NOT WORK PROPERLY WITH FHD. YOU WON'T GET EXTRA FRAMES
4090 DOES NOT WORK PROPERLY WITH FHD. YOU WON'T GET EXTRA FRAMES

Damn...
 

Gaiff

Member
I'd like these people to show us situations where the 3-year-old 10900K achieves the colossal performance benefits over the 10600K that they claim it would (for example, they claim one would fall below 60fps whereas the other would stay above 60, etc.).

They can also show us how gracefully the 6-year-old 1700X has aged compared to the 1600X.
I'd like to see you quote anyone making such a claim. Your posts have been nothing but hyperbole and strawmen. "You guys said that the 6-core is useless trash!!!" when I said the exact opposite.

Please watch the video again and read my comments before you write yours.

I was giving you proof that between FHD and 2K with a 4090 the difference is almost nothing, because the 4090 is bottlenecked at FHD. Like, why are you even trying to argue with this?

A 4090 and FHD don't go together. We're done. Every chart you bring here that pairs a 4090 with FHD makes you look like you don't know shit, and honestly I am starting to question that. Do not quote me back again about this; I will not even bother to read it.


Of course the 4090 and 1080p don't go together. I literally told you the post is a CPU benchmark and this is why they used a 4090 at 1080p as has been done for 20 years on review sites. Few games will scale with CPUs enough to get you 300fps to take advantage of the 360Hz monitor.

And please shut the hell up. You're acting like an annoying spoiled kid.
 

GymWolf

Member
The 5080-90 won't be twice as fast as their predecessors.

Again, missing the forest for the trees. I said that the faster CPU would last longer, suffer less over time, and could also give you 1-2 more years or an extra generation. This entirely depends on the configuration you have. The takeaway is simply that the better CPU potentially gives you more upgrade paths than the weaker one, and those 10-20% differentials could be the difference between a major bottleneck and none. Hence why I used the 9900K vs 9600K as examples.

And you still don't get it. The 13900K is still appreciably faster than the 13600K, when you implied there was no difference. 20% in some scenarios is a substantial difference. Furthermore, you're still hung up on that 6-core bit, which isn't the point. 6-core isn't the disease, it's the symptom. The problem isn't the 6-core parts, it's their placement in the market and what chiefly Intel does to justify the higher SKUs. Take the 9600K, an i5 part whose silicon supports hyperthreading, but they decided to disable it to upsell the 9700K and 9900K. While 6 cores with 12 threads is fine, 6 cores with 6 threads isn't so good. Same for the 12400: a 6-core part but a much worse bin, resulting in it being a budget performer. Taking a look at the ACTUAL 6-core CPUs over the past 6 years shows us that they have aged worse than their bigger cousins, and that's not necessarily due to them being 6-core, but they have aged worse nonetheless.

It's not 6-core vs 8-core vs 10-core that you guys are going on about. It's actual 6-core parts that we can measure against better ones, and when we saw obvious differences that could go above 20% (when at release there were almost none), you guys started downplaying them.

360Hz monitors are a thing. And this is just to test CPU performance.


Way smaller but I honestly don't game at 4K. This is purely a CPU benchmark.
Oh OK, I thought you were still vaguely referring to my case.
 

Umbasaborne

Banned
I found the visuals and performance to be pretty abysmal on PS5, but it was fun enough to play. I'll be curious to try this when I get my 4090 PC.
 

sachos

Member
I wouldn't go crazy yet about the recommended system specs. 99% of the time devs put whatever in those charts and real performance is different. I'm interested in how DirectStorage will compare with the PS5's loading; the PS5 load times are pretty much instant, sub-2 seconds.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I wouldn't go crazy yet about the recommended system specs. 99% of the time devs put whatever in those charts and real performance is different. I'm interested in how DirectStorage will compare with the PS5's loading; the PS5 load times are pretty much instant, sub-2 seconds.
As far as I remember, the load times using an M.2 drive and DirectStorage were ~2 seconds.
They demoed it some time ago, and I would argue it certainly felt as fast as the PS5, to the point you might think you accidentally closed a menu instead of actually having loaded anything.
And I believe that was using DirectStorage 1.0, not even 1.1, which is even faster.



DirectStorage 1.1 can do transfer rates of over 10GB/s with a PCIe Gen 3 drive... ~15GB/s with Gen 4... I don't even want to imagine what Gen 5 pulls.




The sooner more games start using DirectStorage the better.
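Back-of-the-envelope on why those throughput figures mean near-instant loads (a sketch; the level size and overhead are made-up numbers, and real loads also pay CPU/decompression costs):

```python
# Rough load-time estimate: data to stream divided by effective throughput,
# plus a fixed overhead. Asset size and overhead are illustrative.
def load_seconds(level_gb: float, throughput_gb_s: float, overhead_s: float = 0.5) -> float:
    return level_gb / throughput_gb_s + overhead_s

for label, rate in [("SATA SSD ~0.5 GB/s", 0.5),
                    ("Gen 3 + DirectStorage ~10 GB/s", 10.0),
                    ("Gen 4 + DirectStorage ~15 GB/s", 15.0)]:
    print(f"{label}: {load_seconds(8.0, rate):.1f} s for an 8 GB level")
# ~16.5 s on SATA vs ~1.3 s / ~1.0 s with DirectStorage-class throughput
```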
 

Kataploom

Gold Member
When it comes to PC gaming I'm proud of it. Not upgrading my parts looks like one of the smartest decisions I've ever made.
Actually, it's getting cheaper to get into PC gaming than consoles if you aim for 1080p 60fps... An almost PS5-equivalent GPU can already be found for around $200 or less. We'll see how things look two years from now.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Actually, it's getting cheaper to get into PC gaming than consoles if you aim for 1080p 60fps... An almost PS5-equivalent GPU can already be found for around $200 or less. We'll see how things look two years from now.
If you are only aiming for 1080p60 you can certainly get there for "pocket change".
Even aiming higher, take for instance this machine from earlier in the thread:

5600/12400 - 150
B550M/B450 - 100
16GB of RAM - 50
1TB SSD SATA - 50
RTX 2060Super - 200

Total = 550 give or take

Add a hundred or so dollars for a better GPU and you've got yourself a solid machine that will likely last the entire generation with ease.
 

Kataploom

Gold Member
If you are only aiming for 1080p60 you can certainly get there for "pocket change".
Even aiming higher, take for instance this machine from earlier in the thread:

5600/12400 - 150
B550M/B450 - 100
16GB of RAM - 50
1TB SSD SATA - 50
RTX 2060Super - 200

Total = 550 give or take

Add a hundred or so dollars for a better GPU and you've got yourself a solid machine that will likely last the entire generation with ease.
The GPU can be equal, better or worse depending on game optimization and whether you want RT, but that Ryzen CPU is definitely way better, so games like A Plague Tale should run better than on PS5; at least they should reach 60fps, even at 1080p... Even if you put in $50 more to get an NVMe drive instead of SATA you'd be getting an amazing deal... Now imagine how it's going to look two years from now.
 

yamaci17

Member
I wouldn't go crazy yet about the recommended system specs. 99% of the time devs put whatever in those charts and real performance is different. I'm interested in how DirectStorage will compare with the PS5's loading; the PS5 load times are pretty much instant, sub-2 seconds.
Yup. I play Spider-Man at 4K with ray tracing at mostly around 60 FPS on my 16 gigs of RAM and a low-end Zen+ CPU, despite the devs claiming you would need 32 gigs and a 5900X at that spec. RAM usage at max is 11-12 GB total... nowhere near maxing out 16 GB to begin with. Sure, I have a clean system. Everyone can have one. No need to let tons of launchers run amok in the background if you have a limited amount of memory.
 

winjer

Gold Member
If you are only aiming for 1080p60 you can certainly get there for "pocket change".
Even aiming higher, take for instance this machine from earlier in the thread:

5600/12400 - 150
B550M/B450 - 100
16GB of RAM - 50
1TB SSD SATA - 50
RTX 2060Super - 200

Total = 550 give or take

Add a hundred or so dollars for a better GPU and you've got yourself a solid machine that will likely last the entire generation with ease.

To make it more comparable with the PS5, the CPU should be a 3600. And the GPU should be a 2070 Super or a 3060.
Yes, the 2060 Super matches the PS5 on RT, but it loses on rasterization.
Or change it for a 6700XT, getting the performance of a Series X.

BTW, there are a few things missing. Like the PSU, case, mouse and keyboard.
 

rofif

Can’t Git Gud
The hate boner for this is insane. PCMR is all proud until a game really requires good gear.
I don't get it. An 8600K and a 1060 are 6-year-old crap.
16GB of RAM has been the standard for 10 years!!!!
Sure, ultra is very demanding, but that's how it should be. Let people crank their gear.

Vids like this are disgusting.

None of this matters until the game is actually broken.
And for sure none of this should matter. Too much technicality recently. People refuse to buy games because they are not up to some imaginary fake standard.
So many people would skip Dark Souls if it came out today, and many of the other best games ever. It's not what is important!

Sure, I get annoyed at bugs and broken HDR, and I am not happy when this stuff comes first and the game later. I used to be a perfectionist about technicalities… that's why I've mainly played console recently. Game performance is what it is on console, so all I can do is play the game… after inspecting the silly modes. But when I play on PC these days, I am much more reasonable about inspecting settings. As a kid I really spent more time testing settings than playing :p but now I have a better PC and more experience. I feel many new PC gamers go through the same perfectionism when gaming on PC as I did in the 2000s. It's probably the natural course of action and platform-hobbyist behaviour. They will come around and realise what matters with age. Maybe.

I've not decided where, or if, I will get this. PC should perform better for me, but PS5 should be OK too. Maybe even safer.

That said, I am now tempted to get this because everyone hates it… and I sometimes end up 180 degrees from the general opinion. I loved Cyberpunk and DS2 is my favourite Souls. But I recently hated Callisto too. Sometimes you gotta check yourself.
 
Was gonna say, 32GB system RAM + ~10GB VRAM is a crazy requirement considering consoles have 16GB unified... but if the PS5 is hitting 720p, it doesn't sound nearly as crazy.

The game's graphics don't look good though, so it's hard to get excited about steep requirements.

Also, I haven't tried Spider-Man or the very newest games, but I've personally never played a game that used more than 16GB of system RAM. >16GB pagefile, sure, but that's it.
Do any games use more than 16? (Not pagefile.)
I want an excuse to get 64GB when I upgrade soon.
 

yamaci17

Member
Was gonna say, 32GB system RAM + ~10GB VRAM is a crazy requirement considering consoles have 16GB unified... but if the PS5 is hitting 720p, it doesn't sound nearly as crazy.

The game's graphics don't look good though, so it's hard to get excited about steep requirements.

Also, I haven't tried Spider-Man or the very newest games, but I've personally never played a game that used more than 16GB of system RAM. >16GB pagefile, sure, but that's it.
Do any games use more than 16? (Not pagefile.)
I want an excuse to get 64GB when I upgrade soon.
They will most likely use a lot of RAM if there IS an ample amount of RAM. It also depends on the user's habits. I practically shut everything down except the game launcher when I play games.

As I said, Spider-Man's 32GB recommended tier only uses 7-9 GB of RAM (raw, game-only usage) on my end. Devs are practically overprovisioning for people who bloat their systems. Some people think there are rules for each gen, that somehow each gen we magically get a 2x increase in RAM requirements. This is the first time consoles did not get a huge memory upgrade (PS3 512 MB, to PS4 8192 MB, to PS5 16384 MB total memory budget).

We will see how Forspoken / Hogwarts Legacy / Returnal, with their 32GB recommended specs, run on my 16GB setup. I will gladly share footage whether it fails colossally or runs gracefully.

You will surely see a lot of disgruntled 16GB users, as I said, if they leave their systems bloated. Bloated is the keyword here. Trying to play a game with 10 browser tabs open, a Twitch stream on the second monitor, and 4 launchers automatically starting at boot: that would make the "32GB" recommendation actually make sense. This is definitely not me.

This is not to suggest anyone should buy 16 gigs now; everyone should build with 32 gigs from now on, to be on the safe side and be free as the wind with background stuff.

But people who already have 16GB have nothing to fear as long as they keep background stuff in check.

For reference, at 4K / high / very high ray tracing, Spider-Man sits at around 11-12 GB of total RAM usage on my system, even though, based on Insomniac/Nixxes's recommendations, that spec should need 32GB of RAM and a 5900X for a playable experience.
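If anyone wants to check their own numbers rather than trust the spec sheet, here is a minimal sketch of reading a running game's resident memory with psutil (the process name is a placeholder, not the game's actual executable name):

```python
# Minimal sketch: report resident memory of a running game process.
# Requires `pip install psutil`; "Forspoken.exe" is a placeholder name.
import psutil

TARGET = "Forspoken.exe"

for proc in psutil.process_iter(["name", "memory_info"]):
    if proc.info["name"] == TARGET:
        rss_gb = proc.info["memory_info"].rss / 1024 ** 3  # resident set size
        print(f"{TARGET}: {rss_gb:.1f} GB resident")
```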


 

Guilty_AI

Member
The hate boner for this is insane. PCMR is all proud until a game really requires good gear.
I don't get it. An 8600K and a 1060 are 6-year-old crap.
16GB of RAM has been the standard for 10 years!!!!
Sure, ultra is very demanding, but that's how it should be. Let people crank their gear.

Vids like this are disgusting.

None of this matters until the game is actually broken.
And for sure none of this should matter. Too much technicality recently. People refuse to buy games because they are not up to some imaginary fake standard.
So many people would skip Dark Souls if it came out today, and many of the other best games ever. It's not what is important!

Sure, I get annoyed at bugs and broken HDR, and I am not happy when this stuff comes first and the game later. I used to be a perfectionist about technicalities… that's why I've mainly played console recently. Game performance is what it is on console, so all I can do is play the game… after inspecting the silly modes. But when I play on PC these days, I am much more reasonable about inspecting settings. As a kid I really spent more time testing settings than playing :p but now I have a better PC and more experience. I feel many new PC gamers go through the same perfectionism when gaming on PC as I did in the 2000s. It's probably the natural course of action and platform-hobbyist behaviour. They will come around and realise what matters with age. Maybe.

I've not decided where, or if, I will get this. PC should perform better for me, but PS5 should be OK too. Maybe even safer.

That said, I am now tempted to get this because everyone hates it… and I sometimes end up 180 degrees from the general opinion. I loved Cyberpunk and DS2 is my favourite Souls. But I recently hated Callisto too. Sometimes you gotta check yourself.

The game's visuals don't justify the requirements, which points to poor optimization. I don't know what's so hard to understand about that.
 

Hugare

Member
The hate boner for this is insane. PCMR is all proud until a game really requires good gear.
I don't get it. An 8600K and a 1060 are 6-year-old crap.
16GB of RAM has been the standard for 10 years!!!!
Sure, ultra is very demanding, but that's how it should be. Let people crank their gear.

Vids like this are disgusting.

None of this matters until the game is actually broken.
And for sure none of this should matter. Too much technicality recently. People refuse to buy games because they are not up to some imaginary fake standard.
So many people would skip Dark Souls if it came out today, and many of the other best games ever. It's not what is important!

Sure, I get annoyed at bugs and broken HDR, and I am not happy when this stuff comes first and the game later. I used to be a perfectionist about technicalities… that's why I've mainly played console recently. Game performance is what it is on console, so all I can do is play the game… after inspecting the silly modes. But when I play on PC these days, I am much more reasonable about inspecting settings. As a kid I really spent more time testing settings than playing :p but now I have a better PC and more experience. I feel many new PC gamers go through the same perfectionism when gaming on PC as I did in the 2000s. It's probably the natural course of action and platform-hobbyist behaviour. They will come around and realise what matters with age. Maybe.

I've not decided where, or if, I will get this. PC should perform better for me, but PS5 should be OK too. Maybe even safer.

That said, I am now tempted to get this because everyone hates it… and I sometimes end up 180 degrees from the general opinion. I loved Cyberpunk and DS2 is my favourite Souls. But I recently hated Callisto too. Sometimes you gotta check yourself.

Don't die on this hill, man.

The game goes down to PS3 resolutions on the PS5 while looking like a last-gen game.

The game may end up being fun, but don't make excuses for Square's mistakes. They are not an indie company with no budget. Shit like this shouldn't be acceptable when they are charging you a premium for it.

I'm no perfectionist or whatever; I've played Deadly Premonition on the PS3 and that's one of my favorite games. But Swery didn't have 1/10 of the budget this game probably has.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
To make it more comparable with the PS5, the CPU should be a 3600. And the GPU should be a 2070 Super or a 3060.
Yes, the 2060 Super matches the PS5 on RT, but it loses on rasterization.
Or change it for a 6700XT, getting the performance of a Series X.

BTW, there are a few things missing. Like the PSU, case, mouse and keyboard.
Give or take:
$30-70 for the PSU.
Mouse and keyboard: whoever sells you the parts will try to offload some on you, or you find them in the trash at the back of any office complex. I legit turn down mice and keyboards because there are so many just lying around you'd think they were the dominant species on the planet.
Your motherboard comes with a free case :messenger_winking_tongue:

The game's visuals don't justify the requirements, which points to poor optimization. I don't know what's so hard to understand about that.
A 5500XT for 720p? (Bet it handles 1080p easily.)
A 6700XT for 1440p?
A 6800XT for 2160p?

That's not dissimilar to Far Cry 6, The Outer Worlds, Horizon Zero Dawn, Dying Light 2, or Shadow of the Tomb Raider performance-wise, and I'd argue this game is at least on par graphically with those titles.
The particle effects alone in this game justify that.

The 6800XT is pretty much a baseline 4K60 GPU.
That's what this game is asking for, and people are shocked?
 

Dream-Knife

Banned
Upgrading your GPU without upgrading your CPU is kind of dumb tbh. Get a cpu that can max out your GPU and then do another build when the time comes.

lol, it does not work like that, bud. Especially with the 4090: your GPU will be bottlenecked by the CPU and will not give you higher frames. The CPU can't keep up with a 4090 at FHD.

Look at 2K / FHD and 4K. Between 2K and FHD he is getting the same frames, so having a 360Hz monitor means nothing when your GPU is getting choked. The 4090 is in another class, a halo product. You do not touch FHD, or in some cases even 2K, with a 4090. That card is a pure 4K / 8K experience card.

Even when testing CPU performance, it shouldn't be done with a 4090, for the exact same reason.

Also, with all respect, for anyone who intends to buy, or has bought, a 4090 to play at FHD, the word stupid is not enough to describe them. Might as well limit the frame rate to 30fps on that FHD, you know, for ultimate optimization... fml.

4090 will be a 1440p high refresh card in another year or two. Tech advances.
 

Guilty_AI

Member
A 5500XT for 720p? (Bet it handles 1080p easily.)
A 6700XT for 1440p?
A 6800XT for 2160p?

That's not dissimilar to Far Cry 6, The Outer Worlds, Horizon Zero Dawn, Dying Light 2, or Shadow of the Tomb Raider performance-wise, and I'd argue this game is at least on par graphically with those titles.

The 6800XT is pretty much a baseline 4K60 GPU.
That's what this game is asking for, and people are shocked?
Except for Dying Light 2 (which is an open world with much denser and more detailed environments than Forspoken), all the games you mentioned run on a GTX 1050 Ti... at 1080p... with good settings. And I'd argue some of them actually look better than it does.

The particle effects alone in this game justify that.
Yes, I noticed it had tons of fancy effects, and that should be one of the first things they let us control the quality of, and possibly reduce for the PS5 performance mode, assuming the effects really are one of the causes (I personally think their asset streaming tech is just too inefficient). Reminds me of that talk about FFXIV's flower pots.
 

BennyBlanco

aka IMurRIVAL69
Upgrading your GPU without upgrading your CPU is kind of dumb tbh. Get a cpu that can max out your GPU and then do another build when the time comes.


4090 will be a 1440p high refresh card in another year or two. Tech advances.

You can ride a CPU for quite a long time with little drawback in my experience. GPUs see huge performance gains gen over gen.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Except for Dying Light 2 (which is an open world with much denser and more detailed environments than Forspoken), all the games you mentioned run on a GTX 1050 Ti... at 1080p... with good settings. And I'd argue some of them actually look better than it does.

Yes, I noticed it had tons of fancy effects, and that should be one of the first things they let us control the quality of and reduce for the PS5 performance mode. Reminds me of that talk about FFXIV's flower pots.
Do you really think a GTX 1060 won't be able to run this game at 1080p with "good settings"?
Are you actually taking the rec-spec sheet literally?
 

Guilty_AI

Member
Do you really think a GTX 1060 won't be able to run this game at 1080p with "good settings"?
Are you actually taking the rec-spec sheet literally?
I'm following the recommended specs they gave us, which say a GTX 1060 does minimum settings at 720p/30fps.

Of course they could just be overshooting them, but the performance we've seen from the game on the PS5 doesn't fill me with confidence.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm following the recommended specs they gave us, which say a GTX 1060 does minimum settings at 720p/30fps.

Of course they could just be overshooting them, but the performance we've seen from the game on the PS5 doesn't fill me with confidence.
It's releasing shortly; no need to speculate for too long.

All this doom and gloom from a rec-spec sheet is really weird for a gaming forum, because we ought to know better.
Sure, it says GTX 1060, 720p30 (unknown settings).
But realistically, play with those settings and 1080p shouldn't be an issue for the 1060.
If 4K60 Ultra is achievable with a 6800XT, then this game in theory falls right in line with most AAA games that have come out in the last 2 years.
 

//DEVIL//

Member
Upgrading your GPU without upgrading your CPU is kind of dumb tbh. Get a cpu that can max out your GPU and then do another build when the time comes.


4090 will be a 1440p high refresh card in another year or two. Tech advances.

Not necessarily. The 3090 is still a 4K card, or almost, in most games, and it's 2 years old (by your standards). And it's still CPU-bottlenecked at 1080p.

We might jump to 8K, where the 4090 is not enough.

The 980 Ti still does decently for full HD gaming at 60 frames (minus ray tracing), and that was released during the ice age.
 

01011001

Banned
Upgrading your GPU without upgrading your CPU is kind of dumb tbh.

Not if you simply want to use higher resolutions and settings but are fine with the framerates you're getting.

Like, if you play on a 4K screen, a 4070 Ti will basically be as performant at that resolution with a 3600X as it would with a 5600X in most games.

Or maybe you're on a 1440p screen and your current GPU constantly needs upscaling or reconstruction tech to reach that res; then upgrading to a card that gets you there basically 100% of the time is absolutely an option without changing your CPU.
 

DenchDeckard

Moderated wildly
My 8700K served me for years because you could overclock CPUs by a lot back then.

Nowadays everything is running close to peak performance so maybe we will need to upgrade sooner.

I'll see how this 13900KS lasts.
 

KyoZz

Tag, you're it.
The hate boner for this is insane. PCMR is all proud until a game really requires good gear.
Stop the bait, man; that game doesn't justify those requirements AT ALL. And I'm all in for a new Crysis-like game that pushes boundaries, but it will not be Forspoken.
I have a fuckin beast of a PC, and no way that game is getting my money.

Dipping to the low 20s on PS5 and 720p? How can you defend this?
Bad (very, very bad) AO, poor texturing, bad LOD, stupid AI, very dull lighting, etc. etc...
The AMD FidelityFX suite of effects is bad. Compared to Nvidia GameWorks it just fails in every aspect.



And this is a next gen only game??



Enjoy this game if you want, but don't come after people that call this out.
 

rofif

Can’t Git Gud
Stop the bait, man; that game doesn't justify those requirements AT ALL. And I'm all in for a new Crysis-like game that pushes boundaries, but it will not be Forspoken.
I have a fuckin beast of a PC, and no way that game is getting my money.

Dipping to the low 20s on PS5 and 720p? How can you defend this?
Bad (very, very bad) AO, poor texturing, bad LOD, stupid AI, very dull lighting, etc. etc...
The AMD FidelityFX suite of effects is bad. Compared to Nvidia GameWorks it just fails in every aspect.



And this is a next gen only game??



Enjoy this game if you want, but don't come after people that call this out.
It's not ideal. I am not defending it. Why would I?
But at the same time, I couldn't care less. People are way too focused on technicalities these days.
I understand it; I am too sometimes, and was even more so a few years ago...
It's not broken enough to prevent enjoyment. Dark Souls 1 was 15fps in places.
None of this really matters. People are now boycotting games because they are not perfect, not a perfectly stable 60fps, this and that. It's petty, pathetic, and really these people are not gamers.
 

KyoZz

Tag, you're it.
It's not ideal. I am not defending it. Why would I?
You are. You even launched that demo again to take screenshots and prove your point (in a bad way, since those screens don't look good at all, passable/OK at best).

But at the same time, I couldn't care less. People are way too focused on technicalities these days.
Of course! We entered a new generation (uh, sorry, the consoles are already more than 2 years old, but whatever), so obviously people want to see great-looking games, especially when they are current-gen exclusives.

It's not broken enough to prevent enjoyment. Dark Souls 1 was 15fps in places.
And here it is 20 FPS and 720p. That sucks ass no matter how you put it. Not to mention I don't care about other games; we are talking about Forspoken here.

None of this really matters


Seriously, rofif, I want to understand you, I really do, but it's hard. And if you don't care, stop spending your time in these kinds of threads, shitting on PC for whatever reason or because you are tired of PC settings.

People are now boycotting games because they are not perfect, not a perfectly stable 60fps, this and that. It's petty, pathetic, and really these people are not gamers.
As always, it's the loud minority; the vast majority of people just play the games and don't come here to talk about these subjects.
But on the other hand, I do think that a game like Forspoken should look good and have at least a stable framerate in quality mode. And 720p... what year is it??
 

JackSparr0w

Banned
It's not ideal. I am not defending it. Why would I?
But at the same time, I couldn't care less. People are way too focused on technicalities these days.
I understand it; I am too sometimes, and was even more so a few years ago...
It's not broken enough to prevent enjoyment. Dark Souls 1 was 15fps in places.
None of this really matters. People are now boycotting games because they are not perfect, not a perfectly stable 60fps, this and that. It's petty, pathetic, and really these people are not gamers.
Coming across your posts everywhere, you seem to be losing it day by day.

I would take a break from this forum for a while.
 

rofif

Can’t Git Gud
You are. You even launched that demo again to take screenshots and prove your point (in a bad way, since those screens don't look good at all, passable/OK at best).
I relaunched the demo and took the screenshot to debunk that terrible cropped, compressed screenshot. Which I did.
Coming across your posts everywhere, you seem to be losing it day by day.

I would take a break from this forum for a while.
I am fine. What's the problem? I seem like the only sane one here. People going on crazy hate crusades because it's not the best looking game and characters are too chatty?
Of course this is all important to me. Gaming is my passion, man. I am just annoyed by all the boycotters and haters around so many game releases nowadays. Why are they even gaming at all?

BTW, I am not delusional. I see how flat the game looks in some parts. Worse than FFXV... maybe the PC version will help, but it will probably be a broken mess.
 

JackSparr0w

Banned
I relaunched the demo and took the screenshot to debunk that terrible cropped, compressed screenshot. Which I did.

I am fine. What's the problem? I seem like the only sane one here. People going on crazy hate crusades because it's not the best looking game and characters are too chatty?

Of course this is all important to me. Gaming is my passion, man. I am just annoyed by all the boycotters and haters around so many game releases nowadays. Why are they even gaming at all?
People love memeing on dumb shit. You, on the other hand, care too much for some reason, and your posts scream "I need a break from all this nonsense".

Why don't you go play some cool video games with your mates? Shoot the shit, relax etc.
 

rofif

Can’t Git Gud
People love memeing on dumb shit. You, on the other hand, care too much for some reason, and your posts scream "I need a break from all this nonsense".

Why don't you go play some cool video games with your mates? Shoot the shit, relax etc.
The internet was a mistake. Seriously :p
I take the bait too much. I know.
 

Reizo Ryuu

Gold Member
A 6700XT for 1440p?
A 6800XT for 2160p?
I don't really understand these requirements, because for the 6700XT they say 1440p30 while the 6800XT is supposedly good for UHD60? The 6800XT is on average a bit more than 40% faster than the 6700XT, so how is it able to push double the frame rate at more than double the pixels?
 

Kataploom

Gold Member
I don't really understand these requirements, because for the 6700XT they say 1440p30 while the 6800XT is supposedly good for UHD60? The 6800XT is on average a bit more than 40% faster than the 6700XT, so how is it able to push double the frame rate at more than double the pixels?
Probably the 6700XT can pull 60fps, but they're counting on the CPU to hold it back.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I don't really understand these requirements, because for the 6700XT they say 1440p30 while the 6800XT is supposedly good for UHD60? The 6800XT is on average a bit more than 40% faster than the 6700XT, so how is it able to push double the frame rate at more than double the pixels?



We find out tomorrow.
I dunno how many people on GAF are actually gonna buy the game, but I think we need a performance thread.
I'm pretty sure the 6700XT will easily pull 1440p60 if a 6900XT can pull 2160p60.
 

yamaci17

Member
I don't really understand these requirements, because for the 6700XT they say 1440p30 while the 6800XT is supposedly good for UHD60? The 6800XT is on average a bit more than 40% faster than the 6700XT, so how is it able to push double the frame rate at more than double the pixels?
You have a point, but sometimes the 6700XT gets so bandwidth-bound at higher resolutions that it ends up 60-70% slower than the 6800XT. This game seems to be heavily bandwidth-limited; it pushes below a 1080p average resolution on PS5 to hit the 60 FPS target WHILE only upscaling to 1440p (not even 4K).

Never forget: the PS5 is never far from a 6700XT, especially when bandwidth-bound situations are present. It sits firmly between the 6600XT and the 6700XT, bouncing between them depending on how bandwidth-bound the game is. At not-so-bandwidth-intensive resolutions or settings, the PS5 practically keeps pace with the 6600XT; in heavily bandwidth-bound scenarios, it keeps pace with the 6700XT or gets close to it. The 6600XT is nowhere close to the PS5 with matched ray-traced settings in Spider-Man, whereas the 6700XT and the PS5 align nicely. The 6600XT only gets like-for-like performance against the PS5 at native 1080p, where it is not crushed under heavy bandwidth load. Infinity Cache is cool and all, but it is no remedy for higher resolutions.

Now, I believe the 6700XT requirement is correct. What I believe is incorrect is the 4K recommendation; it has to be with FSR Quality at 4K. Conversely, if the PS5 averages below 1080p to hit a barely locked 60 at medium or whatever the recommended settings are, it is impossible for the 6800XT to hit 4K, ultra and 60fps at the same time. Maybe it can double the resolution, but at higher settings too? That seems unlikely.
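For the arithmetic behind that skepticism (a naive sketch that ignores bandwidth and CPU limits, which is exactly the caveat above):

```python
# Naive pixel-throughput comparison of the two recommended tiers.
pixels_1440p = 2560 * 1440
pixels_2160p = 3840 * 2160

work_1440p30 = pixels_1440p * 30   # 6700XT tier
work_2160p60 = pixels_2160p * 60   # 6800XT tier

print(pixels_2160p / pixels_1440p)   # 2.25x the pixels
print(work_2160p60 / work_1440p30)   # 4.5x the raw pixel throughput
# A card that's ~1.4x faster on average can't deliver ~4.5x the work, which is
# why the 4K60 row likely assumes upscaling (e.g. FSR Quality), as argued above.
```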
 