> Sub 720p... in 2023, on the high-end systems... what an absolute joke...

Cerny actually warned us of this in his Road to PS5 talk. He said RT has a pretty significant VRAM cost that would have its own hit on performance beyond taking up the GPU.
Where's Mark Cerny and beardy man? I demand answers
> The game bottoms out at 648p? Even COD games during the 360/PS3 era ran higher.

Yeah, the last AAA game I recall running at 640p was MGS4 in 2008. Xbox games had been 720p since Gears came out, but PS3 games struggled for a bit until the end of 2007. Uncharted and Ratchet were 720p, but Ass Creed and other third-party games were still sub-720p. But once devs got used to the Cell, it was all good. Those 2009 games like Batman: Arkham Asylum, AC2, and, well, every Sony first-party game were all 720p.
> That's what I thought when I saw the state TLOU2 shipped in, but those damn devs did it. Took them a month, but they did it. The game is pretty much flawless now in terms of VRAM, CPU, and GPU optimization.

Oh? Is the performance up to snuff now? I definitely didn't expect this. Thought they still had lots of work to do.
> It's relative. The consoles are cheap and have limited RT capabilities (the game crashing is bad though; DF said they experienced one crash, hopefully it was just an anomaly).

It's not tho... GPU is mostly under 50% as far as I've seen; it's the CPU utilization that's the problem.
A $2000 GPU struggling between 30-60fps is dogshit. At the very least they should have delayed the PC version.
Everything I've said in this post so far is a joke, but I honestly do believe that the current crop of devs don't like playing video games.* If they did, they would actually go home and play their own games, or other games that came out in the last 6 months, and say, "Hey, maybe we should learn from that and not fuck up like the other devs did." But they don't play games. Maybe it's just another job to them, but if an industry doesn't have passionate people willing to stay up all night while a director and actor go through 100 takes of a scene, then they should not be in that industry. And I'm not just talking about performance here; I'm talking about a general lack of ambition that's plagued this industry over the last decade or so. Something's changed, and it's affected the entire industry. Even Japanese devs.
- Speaking of nerds, I am beginning to believe that we need to go back to hiring exclusively nerdy dudes who have nothing better to do than make games all day.
- No more cool kids with girlfriends. I want virgins designing video games.
- If you have kids, that means you managed to convince a real-life woman to let you nut in her. You are too cool to make video games.
- I don't want the purple-haired crowd with 10k followers on Twitter. I want the guy who has no followers.
- I want women devs who look like Amy Hennig, not Alanah Pearce.
- I want devs who actually own an OLED and care about shit like HDR. Honestly don't think any of these guys actually play their own games.
- I want them to only hire devs who own a 4000-series GPU, or they can piss off and go work for an indie studio.
- Put it in the job requirements: if you're not a hardcore gamer, don't make games with us.
- Imagine if Nolan, Spielberg, or Kubrick didn't like watching movies. Imagine if J.K. Rowling didn't like reading books. That's the feeling I get from the current crop of developers.

*I remember how at one point Bungie had to set aside time at 4 PM every day for their devs to play Destiny, because none of them were playing it at home and there was a big disconnect between the fanbase and the developers.
> Oh? Is the performance up to snuff now? I definitely didn't expect this. Thought they still had lots of work to do.

Yeah, the patch on Tuesday fixed everything. No VRAM-related stutters for me on my 3080. No bizarre dips to 40 fps, even at lower resolutions where I wasn't VRAM-bound. No bizarre stutters to 5 fps after playing for an hour. No crashes when quitting to the main menu! Way faster loading, and even the shader compilation only took around 6-10 minutes. It used to take 35-40 minutes.
> The game bottoms out at 648p? Even COD games during the 360/PS3 era ran higher.

Those bottomed out at 540p-600p, just a hair over the Dreamcast's common 480p, but you probably excused them because of everything else they were doing on those systems that the Dreamcast couldn't ever hope to achieve, rather than pretending all else is equal between BLOPS and Outtrigger or whatever DC game.
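For scale, here's a rough sketch of the raw pixel counts behind these "p" labels. The widths are assumptions on my part: 16:9 frames for the HD-era figures, and 4:3 (640x480) for the Dreamcast's common 480p.

```python
# Rough pixel-count comparison for the resolutions mentioned in the thread.
# Assumed aspect ratios: 16:9 for the HD-era figures, 4:3 for Dreamcast 480p.
resolutions = {
    "480p (4:3, Dreamcast)": (640, 480),
    "540p (16:9)": (960, 540),
    "648p (16:9)": (1152, 648),
    "720p (16:9)": (1280, 720),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")
```

Since both axes grow, a 540p frame is roughly 1.7x the pixels of 4:3 480p, and 648p about 1.4x the pixels of 540p; the per-axis "p" label understates the actual differences in rendering load.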
What's the point of RT in performance mode when the resolution is so low you can't even see it?
> Yeah, the patch on Tuesday fixed everything. No VRAM-related stutters for me on my 3080. No bizarre dips to 40 fps, even at lower resolutions where I wasn't VRAM-bound. No bizarre stutters to 5 fps after playing for an hour. No crashes when quitting to the main menu! Way faster loading, and even the shader compilation only took around 6-10 minutes. It used to take 35-40 minutes.

That's great. Hopefully they keep at it and get their methodology for PC optimization honed in. When Factions releases, they definitely don't want a repeat of this.
sub 720p?
Interesting that they focused on the PS5 version here, but it’s the best version so I guess it makes sense.
Series S is not using RT, so it's probably the most stable 30fps console version, hahaha.
Why even bring this into console warring when the issue affects every console?
Do you realize how immature that is?
Who cares?! The game is in an awful state on everything.
> I'll laugh if the same reviewers that gave this game high scores criticize Redfall for releasing with only a 30 fps mode. Maybe the lesson is to just include a performance mode, whether or not it's actually playable.

I am sorry, but yes, it's better to just include a performance mode. Especially when you are developing for just one or two platforms. And first party, no less. And calling this unplayable is a stretch... at least it has a performance mode.
> Sub 900p, wtf. Lol, so blurry with performance mode... FSR 2 is the cherry on top.

And a good example of a misleading post.
Better off for them to just use UE4's TAAU if they're going that low, jeez.
While I agree that this game falls short of the mark, I feel the OP and the general tone of this thread are misleading.

E.g., saying the game is running at 720p is very misleading. Why is it OK on PC to call FSR/DLSS presets "4K Quality"/"4K Performance" or "1440p Quality"/"1440p Performance" and ignore the base native resolutions the games have to render at for those presets? But when it's done on a console, we sensationalize things and talk about those base resolutions as if they aren't doing the same reconstruction that is being heralded everywhere else.
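As a sketch of what those preset labels hide: the internal resolution follows from the output resolution and the upscaler's per-axis scale factor. The factors below are FSR 2's documented ratios; treating them as representative of DLSS presets too is my assumption, and the helper function is purely illustrative.

```python
# Internal render resolution implied by an upscaler quality preset.
# Per-axis scale factors are FSR 2's documented ratios (assumed here
# to be representative of DLSS presets as well).
SCALE = {
    "Quality": 1.5,        # e.g. 4K output renders at 1440p internally
    "Balanced": 1.7,
    "Performance": 2.0,    # e.g. 1440p output renders at 720p internally
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Divide each output axis by the preset's scale factor."""
    s = SCALE[preset]
    return round(out_w / s), round(out_h / s)

# "1440p Performance" is really a ~720p image being reconstructed:
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```

The same math applies whether the label is on a PC graphics menu or a console mode, which is the poster's point: the "1440p performance mode" figure and the "720p internal" figure describe the same frame.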
Your post matches your pfp.
Stop being a console-warring cretin and grow up. The game runs like shit on everything, literally everything. This isn't the time to parade our favourite plastic boxes; it's time to expose shitty practices like this.
Y'all thinking the Series S is REALLY dragging stuff down that much? Seems like the PS5's wax wings are melting.
> No one does that. Everyone understands that DLSS Quality means that the game runs at a lower resolution and is getting reconstructed. Also, FSR2 looks like garbage, and therefore you can easily tell that the resolution is low as fuck. At no given time does this game look like 1440p in performance mode.

Ok, cool, I can agree with that. Because to do 1440p in performance mode it's actually running natively at around 720p and reconstructing that. And we know FSR2 is the suck when targeting lower resolutions.
> While I agree that this game falls short of the mark, I feel the OP and the general tone of this thread are misleading. E.g., saying the game is running at 720p is very misleading. Why is it OK on PC to call FSR/DLSS presets "4K Quality"/"4K Performance" or "1440p Quality"/"1440p Performance" and ignore the base native resolutions the games have to render at for those presets? But when it's done on a console, we sensationalize things and talk about those base resolutions as if they aren't doing the same reconstruction that is being heralded everywhere else.

In this particular case the results are a bit too poor, be it the very low input resolution or a bad implementation of FSR, but I generally agree with your point and I've said the same before.
> Lol, if SS would use the same settings as PS5 and SX it would probably be doing 320-480p.

You're only proving my point. It didn't. It doesn't. And it's running on all 3. I'm sure patches will help.
> In this particular case the results are a bit too poor, be it the very low input resolution or a bad implementation of FSR, but I generally agree with your point and I've said the same before.

Exactly my point, and you made a very good example too. I completely agree that this game is generally doing all this poorly; I just expect more objectivity... e.g., we have a lot of really good-looking stuff on these consoles already, and one would think that if a game is flooring the hardware, it must be in contention for best-looking game of the year... this game doesn't even look as good as HFW/Ratchet, or even Dead Space from your example.
When talking about reconstruction on PC, people tend to focus on the output result rather than the input pixel count, while in discussions like this for console the reconstruction is completely ignored and we focus on just the raw pixel count. Dead Space was also laughed out of the park for being sub-1080p in performance mode, but I found the actual result very pleasant on a 1440p screen and very native-looking for the most part.
> Why developers bother with RT on these consoles is something that amuses me.

I am really beginning to lose hope in, or question the sense of, most of these devs.
As far as I know, the SS only has a 30fps mode, worse graphics settings, and a worse resolution, probably 600p?