
Alan Wake II: Allegedly only 7% of players will be able to run the game at real 1080p 60FPS

Spyxos

Member


 
Aw yeah, my 3080 stumbling in, overworked and tired at the finish line.

It's interesting reading the Reddit comments. It's people coming face-to-face with "PC hardware inflation", since it hasn't been much of an issue in the past 5 years or so. I 'member the late 90s when your PC was outdated basically yearly.
 

Spyxos

Member
Do ya'll want games to push the visual bar or not?


I swear, if a game runs well on shit hardware, everyone complains that "iT'S nOt NeXt gEn". If a game looks incredible and has high hardware requirements to go along with that, it's "nO oNe CaN pLaY tHiS aT 60fPs".
Of course I want that, but it also has to remain realistic, especially with today's graphics card prices. It seems like many new games get very little or no optimization before being thrown onto the market.

If you have a 4080, a 4090, or a 7900 XTX, sure, you don't care much, but everyone below that is already starting to sweat.
 

diffusionx

Gold Member
The problem here is that hardware is staying on the market way too long and is absurdly priced. The 4xxx generation is not good enough for this new round of games and is also too expensive. If you have to spend $650 on a GPU just to run a game at 1080p 60fps it's just not sustainable as a market.
 

HeWhoWalks

Gold Member
The fair warning has been given! Hopefully, folks with 1080 Tis don't come complaining next weekend! :p
 

HeWhoWalks

Gold Member
Do ya'll want games to push the visual bar or not?


I swear, if a game runs well on shit hardware, everyone complains that "iT'S nOt NeXt gEn". If a game looks incredible and has high hardware requirements to go along with that, it's "nO oNe CaN pLaY tHiS aT 60fPs".
Spot on!
 

GudOlRub

Neo Member
I'm really curious to see the final product. I see all the "people didn't complain when it was Crysis, blah blah blah" arguments. Well, when Crysis came out it was a truly incredible generational leap in graphics; will this game do the same without ray tracing? It doesn't look that way to me from the previews...
If the game justifies its rasterization requirements with mind-blowing graphics, then cool. But if it's one of those games where you need to zoom in 100x on a screenshot to see the differences, then everyone spouting these "Crysis-like" arguments will just look silly, especially when games like RDR2 and TLOU2 run on PS4 hardware.

I'm okay with "Ultra" settings being future-proofing settings that will only run well on mid-tier cards in 5 or 10 years' time, but when you tell me a 3070 can only hit 1080p/60fps with DLSS Performance on (540p native), then you had better deliver some mind-blowing shit.
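For reference, here is a quick sketch of the arithmetic behind that "540p native" figure. It assumes the commonly cited DLSS per-axis scale factors (Quality ~2/3, Balanced ~0.58, Performance 1/2, Ultra Performance ~1/3); these are the usual community-quoted numbers, not an official spec.

# Approximate internal render resolution for a given DLSS mode,
# assuming the commonly cited per-axis scale factors (community figures).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the approximate resolution DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output with DLSS Performance -> roughly 960x540 internal ("540p native")
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)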
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Settings, how do they work?

Yes, a game made with cutting-edge tech in 2023 is not going to let you play at max settings on your potato.
If your potato actually could max the game out at 1080p60, we should be questioning why the devs don't want to utilize the hardware we spend hundreds of dollars on.



<---- Has a lowly RTX 3080, min-maxing my way to 1440p60 (DLSS) with RT medium and PT on.
 

Kadve

Member
I’m ok with high requirements as long as the game is mind-blowing as a result, like Control and AW were. It’s time to push the limits of this fancy 4070.
Well, since it's Alan Wake, I expect them to render every drop of that scenery-obstructing mist! /s
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
A 4-7-year-old GPU not being supported is OK; at some point support must be dropped to push higher limits.
But I want to play games with my GPU from 2016 at max settings... it's not fair. I could play at max settings in 2016, so why can't I play at max settings 7 years later?

Totally unoptimized shit and lazy devs, because obviously if my hardware has been stagnant for over half a decade, then the devs have to be stagnant with their software too.
 

Holdfing

Member
Do ya'll want games to push the visual bar or not?


I swear, if a game runs well on shit hardware, everyone complains that "iT'S nOt NeXt gEn". If a game looks incredible and has high hardware requirements to go along with that, it's "nO oNe CaN pLaY tHiS aT 60fPs".
Do games push the visual bar enough to justify these requirements, though? AW2 didn't wow me at all on the graphics front. In fact, most games don't.
I know, diminishing returns and all that. Improvements nowadays are really marginal.
 

Sleepwalker

Member
If they were releasing GPUs every year, OK, but in 7 years only 3 generations of Nvidia hardware have come out, and prices have exploded since then.
I think with the leaps in upscaling and RT, it's gonna be harder and harder to support pre-Turing GPUs in next-gen games. Between that and devs dropping the Xbox One/PS4 gen, it's only gonna get worse.

Yeah, the prices are higher now, but that's not the devs' fault. Aside from Alan Wake, other games will support those cards, but performance will keep getting weaker, and no amount of complaining will change it.

I have a 2080S in my secondary PC, and I've made my peace with it being a low/medium settings card nowadays.
 

diffusionx

Gold Member
I think with the leaps in upscaling and RT, it's gonna be harder and harder to support pre-Turing GPUs in next-gen games. Between that and devs dropping the Xbox One/PS4 gen, it's only gonna get worse.

Yeah, the prices are higher now, but that's not the devs' fault. Aside from Alan Wake, other games will support those cards, but performance will keep getting weaker, and no amount of complaining will change it.

I have a 2080S in my secondary PC, and I've made my peace with it being a low/medium settings card nowadays.
I'm fine with increasing requirements, and this shift happens every single gen when devs start moving their tech over to "next gen", it's just been weird this time around.

BUT... we also have a situation where new hardware is not coming out and prices are not going down. Like, if Nvidia had a 5060 that ran as well as the 4070 for $300, or a 6050 at $200, this would not be such a big deal because that is a much easier upgrade than a $650+ GPU. So the problem is that they are just gouging this market with hardware that is looking underpowered very quickly.
 

Sleepwalker

Member
I'm fine with increasing requirements, and this shift happens every single gen when devs start moving their tech over to "next gen", it's just been weird this time around.

BUT... we also have a situation where new hardware is not coming out and prices are not going down. Like, if Nvidia had a 5060 that ran as well as the 4070 for $300, or a 6050 at $200, this would not be such a big deal because that is a much easier upgrade than a $650+ GPU. So the problem is that they are just gouging this market with hardware that is looking underpowered very quickly.

I 100% agree about prices. Hopefully Intel provides a good product at a decent price with Battlemage.
 
Friendly reminder that A) we need more competition in the consumer GPU space and B) in my opinion, it's unlikely that competition will come from AMD because their CEO and NVIDIA's are family. No chance of collusion there, right?

 

marjo

Member
The problem here is that hardware is staying on the market way too long and is absurdly priced. The 4xxx generation is not good enough for this new round of games and is also too expensive. If you have to spend $650 on a GPU just to run a game at 1080p 60fps it's just not sustainable as a market.

Exactly. The fact that Nvidia's current-gen $500 card (4060 Ti) will likely struggle to play this game at a decent resolution and frame rate is absurd.
 
Do ya'll want games to push the visual bar or not?


I swear, if a game runs well on shit hardware, everyone complains that "iT'S nOt NeXt gEn". If a game looks incredible and has high hardware requirements to go along with that, it's "nO oNe CaN pLaY tHiS aT 60fPs".

Game doesn’t look that impressive though? And we’re talking 1080p here.

This is poor optimization with devs crutching on DLSS.
 

Spyxos

Member
People are still playing dumb about DLSS. It's completely disingenuous. DLSS adds information, and in the case of 4K DLSS Performance it looks far superior to 1440p and perhaps even 1800p. Get over your sour grapes, guys.
It can look better, but that is not always the case.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Game doesn’t look that impressive though? And we’re talking 1080p here.

This is poor optimization with devs crutching on DLSS.

Have you seen any of the New York gameplay?
This game looks absolutely stunning and super dense. I can see why they are using mesh shaders; this shit would likely be super heavy using vertex shaders... well, I guess the PC requirements are proving that.

P.S. The base RTX 2060 even being invited to the party is proof enough that the game is well optimized, considering the density they are pushing on screen and the world-switching shit.

Look at the CPU requirements and then say the game is poorly optimized again.
A 7500K... a frikken 3700X? That shit gets decked by $100 four-core CPUs of yesteryear.
 
Do games push the visual bar enough to justify these requirements, though? AW2 didn't wow me at all on the graphics front. In fact, most games don't.
I know, diminishing returns and all that. Improvements nowadays are really marginal.
Game doesn’t look that impressive though? And we’re talking 1080p here.

This is poor optimization with devs crutching on DLSS.

It'll be in the conversation for best-looking game ever made, and it will be one of only 3-4 retail games that support full path tracing/RTGI.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Any word about Arc GPUs, btw? Not compatible at all?

The Arc A770 and A750 sit between the RTX 3060 and RTX 3060 Ti depending on the title, and they have full DX12U support, so they do support mesh shaders.
You can extrapolate their performance from there.
 

Mister Wolf

Gold Member
So why aren't people claiming that we need a PC PRO and that PC is now outdated? 1080p in 2023 and stuff like that. I'm confused :messenger_mr_smith_who_are_you_going_to_call:

Because 4K DLSS Performance > 1080p or 1440p. Are you still confused?

Now, if 4K DLSS Quality, which uses 1440p as its base, looks this much better than native 1440p, wouldn't it stand to reason that the same would apply to native 1080p vs. 4K DLSS Performance?
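As a rough sanity check on that comparison (again assuming the commonly cited scale factors of ~2/3 per axis for Quality and 1/2 for Performance, which are community figures rather than anything official), the internal pixel counts at 4K output line up exactly with native 1440p and native 1080p:

# 4K output: compare DLSS internal render pixel counts with native resolutions.
# Assumes Quality renders at ~2/3 per axis and Performance at 1/2 per axis.
def pixels(w, h):
    return w * h

quality_internal = pixels(round(3840 * 2 / 3), round(2160 * 2 / 3))   # 2560 x 1440
performance_internal = pixels(round(3840 * 0.5), round(2160 * 0.5))   # 1920 x 1080

print(quality_internal == pixels(2560, 1440))       # True: 4K Quality starts from a 1440p image
print(performance_internal == pixels(1920, 1080))   # True: 4K Performance starts from a 1080p image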
 