
Resident Evil Village on Nintendo Switch 2 is not a stable 60fps.

"thEy aRe ClOse"

YLxPzj8qN5Cl36Uw.png
3RN1YCr6A05eGKFF.png


rfP4NHhhnqekP6Om.png


gRswqItRCpycUqrh.png


lJ6udFNBQSeJbd6J.png
I did say some textures are better, and of course it's 60fps, no shit Sherlock. I can take several pics like you of Sonic Racing Worlds running via emulation and it looks the same 99% of the time, but keep trolling. Again, this game can run at 60fps on max settings on Steam Deck, so it's a useless comparison when we know the PS4 Pro GPU is vastly more powerful, but of course you don't care and want to push a false narrative. You've been doing the same crap every generation.
 
Last edited:
I did say some textures are better, and of course it's 60fps,

Some textures, some geometry, some details, some ambient occlusion, some shaders, some...

no shit Sherlock, I can take several pics like you of Sonic Racing Worlds running via emulation and it looks the same 99% of the time, but keep trolling.

Please do 🤡

Let's all witness that poor ass attempt as you dig further towards the center of the earth.

Again, this game can run at 60fps on max settings on Steam Deck, so it's a useless comparison when we know the PS4 Pro GPU is vastly more powerful, but of course you don't care and want to push a false narrative.

Oh the fucking irony of you calling anyone pushing false narrative lol :messenger_tears_of_joy:

Again, just a few days ago you said "I can't see the difference" about Monster Hunter Stories 3, where it's missing OBVIOUS fucking effects like ambient occlusion

You need to

stfu GIF


Your thread posting privileges for that sort of shit should be revoked

You've been doing the same crap every generation.

I'm sorry, you've been here since 2025? The fuck do you know about me, you little troll?

Or you just outed yourself as an alt?
 
I never signe
You know you can read forums without signing up, right? I think even in the Wii U days you were such a fanboy that despite almost every game running better on 360, you kept up the false narrative that the Wii U was way more powerful, and history repeats itself. Again, Sonic Racing Worlds runs at high settings and 60fps on the Steam Deck, meaning the Pro should and can run this game much better, so move on to the next game. You've been wrong about everything, and the Pro has won almost every single comparison when you said it never would.
 
Good that Resident Evil Village will be available on Switch 2, but it's not a platform I'd want to play the game on personally. For one, the Switch 2's LCD screen is pretty poor and fast-moving 60 fps games just look dreadful on it in my experience. Two, the docked experience is blighted by the lack of VRR support, so in a game that almost but not quite hits 60 fps it would be a jerky experience, even though the game would look fantastic on my LG C3 OLED TV. I know how horrid a wobbly 60 fps framerate can be from the PS5 version, which was released before VRR was supported on the console.

Love my Switch 2, but for me it is a device that is best used docked, and even then only with games that have properly frame-paced 30 fps and locked 60 fps modes. I made the mistake of buying Gear.Club Unlimited 3 on Switch 2 last week and that game is dreadful (the game itself is utterly average anyway), having neither a locked 30 nor 60 fps mode, so it just seems to be constantly jerky. Absolutely horrible to play.
 
Or you just outed yourself as an alt?
I would suggest old Sweet tooth has just outed themselves lol..
 
I never signe

John Candy Reaction GIF


Just like that, from Gaming-Age in 1998 to 2025, and then you decided to make an account 🤡

You're not fooling anyone alt.

You know you can read forums without signing up, right? I think even in the Wii U days you were such a fanboy that despite almost every game running better on 360, you kept up the false narrative that the Wii U was way more powerful, and history repeats itself.

dafuq are you talking about

Let's see, the biggest graphical talk threads of Wii U

0 posts

0 posts

0 posts

0 posts

0 posts

0 posts

0 posts

1 post

oh wow

Phew, I spent all my time defending the Wii U, as we can see here, gentlemen. I have fewer posts about the Wii U across the nearly 15 years of discussion that could have happened than subzero has threads shitting on the Switch 2 in his less than a year on NeoGAF.

Are you that fucking deranged in your head that you're dragging a fucking console I don't care about into this thread?


Again, Sonic Racing Worlds runs at high settings and 60fps on the Steam Deck

I don't care. Is the Steam Deck the same architecture as the PS4 Pro? You will not refocus the conversation from the PS4 Pro to the Deck. The Deck at 800p with a 0.75 scaler is not even in the same conversation.
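For rough context on the pixel-count gap being argued here, a back-of-the-envelope sketch (the 1280x800 panel and 0.75 linear scale are the figures quoted above; the 1440p target is the Pro figure claimed elsewhere in the thread):

```python
# Back-of-the-envelope pixel counts for the Steam Deck comparison above.
# Assumed figures: Deck panel 1280x800 at a 0.75 linear resolution scale,
# vs a 2560x1440 output claimed for PS4 Pro elsewhere in the thread.
deck_pixels = int(1280 * 0.75) * int(800 * 0.75)   # 960 x 600 = 576,000
pro_pixels = 2560 * 1440                           # 3,686,400
print(deck_pixels, pro_pixels, round(pro_pixels / deck_pixels, 1))  # 576000 3686400 6.4
```

So under these assumed figures the Pro target pushes roughly 6.4x the pixels, which is the "not even in the same conversation" point.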

Aren't you supposed to be taking switch 1 emulator pics to say that it's equivalent to PS4 Pro?

meaning pro should

Hand On Shoulder GIF


and can run this game much better, so move on to the next game. You've been wrong about everything

lollllll dude

Everyone saw your posts in the past days and is laughing at you, troll

and pro has won in almost every single comparison when you said it never would.

You never recovered from your Cyberpunk 2077 meltdown, have you?
 
Nothing to see here everyone, just another thread rightfully critical of the Switch 2 taken over by the brigading usual suspects, Nintendo Fanboys Fans, swinging their Miyamoto-branded autism around till every normal person gets tired of dealing with them.

cYJMa32eP7L5dswm.gif
 
Just like the "solid 60fps" RE8 on PS4 Pro?

Doiak0k0fhFII6f3.jpg
U blind?🤨
For comparison's sake - Resident Evil Village stats from VGTech:
Kj4JjnuhF5Z8T6pp.png

Its actual compute is half the 3TF paper spec, plus ~20-30%. So ~2 TFLOPs in docked mode and ~1 TFLOP in portable mode
The half-FLOP theory? Holy shit, what is this, are we back five fucking years ago?
As we can all see - 50.4% of TFLOPs, but 78% of performance.
7uvxAFEplsfsrXoj.png


Again - 47.2% of TFLOPs, but 76% of performance.
IFnqd93J0s5YHQVU.png


So yeah, Haint is right. It is half and then you multiply it by 1.25 (1.33 at best).
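As a quick sanity check of the arithmetic above, using only the two TFLOP-share/performance-share pairs quoted (the specific cards behind those benchmarks are not shown here):

```python
# Sanity check of the "half, then x1.25 (1.33 at best)" claim using the two
# TFLOP-share / performance-share pairs quoted above. The factor is how many
# "older-architecture-equivalent" TFLOPs one Ampere TFLOP is worth under
# this argument: share of TFLOPs divided by share of delivered performance.
for tflop_share, perf_share in [(0.504, 0.78), (0.472, 0.76)]:
    factor = tflop_share / perf_share
    print(round(factor, 2))  # 0.65, then 0.62 -- i.e. roughly 0.5 x 1.25-1.33
```

Both factors land between 0.5 × 1.25 = 0.625 and 0.5 × 1.33 = 0.665 (the second just under), which is where the "half, then multiply" rule of thumb comes from.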
One look at the TMUs between the generations tells you everything you need to know.
Just take a look at a few different Geforce generations and maybe (just maybe) you'll understand if Nvidia follows your flawed logic or if they actually follow real logic.

If anyone really feels the need to compare FLOPs between architectures, you can do so between Maxwell, Pascal, Turing, GCN 4, RDNA1 and RDNA 2, but THAT'S IT.
And it's not like Pascal and Turing or GCN 4 and RDNA 1 have the same 'FLOPs efficiency' but they're roughly comparable.
The FLOPs of Ampere, Ada Lovelace, Blackwell, RDNA 3 and RDNA 4 are all heavily inflated. Actually, RDNA 3 is in a league of its own when it comes to FLOPflation...🤭

And to quote your words from another thread: 'As if GCN 1.1 Tahiti generation would be even comparable to Ampere architecture'

You're right for once.
GCN 1 FLOPs aren't in any way comparable to Ampere's FLOPs.
In gaming, GCN 1 is much more performant per FLOP than Ampere is.
hvOxFsxLIZVtoQuu.png

And before you point to mesh shaders or some other barely used tech - how many games use hardware RT or mesh shaders on Switch 2? It's getting close to 200 games released on S2. Is it even 5%?:poop:

DLSS has less banding artifacts than CDPR's TAA, fine details like rain resolve much better, and it's nowhere near the grainy SSR shimmering artifact. Nearly every scene with details parallel to the camera view will shimmer even when not moving.
Switch 2 drops frames during traversal like a madman. Unlike PS4 Pro.
And I told you last month that I'm done looking at Cyberpunk.
CDPR is not a technical reference (LOL, the first open-world AAA path-tracing game, and they're not good technically! :messenger_tears_of_joy:)
What path tracing are you speaking of?
The one on Switch 2? 🤡
The one on Xbox Series X? 🤡
Or maybe the one on PS5 Pro? 🤡

If it weren't for Nvidia's fondness for the neo-noir artstyle with all its colorful neon lights, skyscrapers, and steam, Cyberpunk wouldn't have ever gotten all the attention from Nvidia.

Last time I personally cared about Cyberpunk 2077 was back in the fall of 2020 - when I was selling my CDR stock a few weeks before the release of the game.
Because it was obvious that they were hiding the base 8th-gen console versions for an absolutely obvious reason - abysmal performance.
The difference in framerates during car traversal in CP2077 between the PS4 Pro (high 20s) and the Switch 2 (low 20s) is very similar to CD Projekt Red's intentions in the summer and fall of 2020 - extremely obvious 🤷‍♂️
And here comes the clown again with Elden Ring, which is the epitome of spaghetti code that defies all hardware, but is still being evaluated before release
So, if Elden Ring defies all hardware, please tell me - why is the Xbox One S version more or less what one would expect it to be when compared to the PS4 version?
And why is the PS4 Pro version what you would expect it to be when compared to the PS4 one?
And also - do you think FromSoftware put more time into optimizing the Xbox version than it has put into the already-delayed Elden Ring Tarnished Edition on Switch 2 (it was supposed to release in 2025)? 🤔



And since you're so knowledgeable and so moderate in your claims - could you please explain what's going on with DLSS 4 SR on Switch 2?

It's not like you were fantasizing and daydreaming about the capabilities of your beloved plastic box, right?

FANTASY:
The announcement from last night, the move to change from old DLSS Convolutional Neural Networks (CNN) to transformer model is applicable for all RTX hardware

Even Switch 2 with rumoured old ass Ampere will have the best upscaling tech of console world

REALITY: 'On the Switch 2, moving objects just kind of look lower res and flickery on their edges as soon as any sight of movement kind of starts. This applies to any and all moving objects in all scenes in the Switch 2 version.' (DF, Oct 3rd, 2025, 16m0s)



Poor Buggy was fantasizing about the DLSS Transformer model, but instead he got something worse than the DLSS 3.x SR CNN model in the majority of games that actually opt to use Tensor cores for upscaling.
And as for all your wild, outlandish claims - it's not like AI upscaling is even something that is universally used on the Switch 2. Less than half of the S2 games use any kind of DLSS, and the ones that actually use the full CNN model can be counted on the fingers of one hand.
 
You never recovered from your Cyberpunk 2077 meltdown, have you?
lol, Cyberpunk, the game running 10fps better and at a higher resolution in the developer's own benchmarks. You should put a clown pic as your profile pic. Meanwhile we have 10-15 games running at a higher resolution on the Pro, but let's look at Sonic Worlds, a game that can run on a toaster at 60fps high settings, as proof of Switch 2 superiority.
 

Not in a single average here are you anywhere near 10-15% above a 3060, and this is without RT! Most of the time the 3060 leads.

So by your calculation, 8.5 TFLOPs is ahead of 10.6 TFLOPs. Make it make sense lol

With RT, the 3060 leads by +20%.

So for 276 mm² with "fake trouble TFlops", competent ML which now has a transformer model, and competent ray-tracing blocks, it still manages to match or exceed the 236 mm² dedicated mainly to raster. On a much denser TSMC node than Samsung's 8nm, too.

So what year exactly did AMD catch up in functionality clock for clock, core for core with Ampere?



Hooolllyyyy shit the cope

Not even ayyMD subreddit would muster the courage to type the nonsense you just did

Developers ignoring them… by making hardware-agnostic function calls through their DX12 or Vulkan APIs? Do you understand how any of this works?



Which is AMD's solution that clearly never worked for gaming, unlike Nvidia's Ampere and Ada



Taped out in 2021

Revisions can happen after

But yes Nintendo did sit on it



And is that 7 years?

Haint mathing



Star Wars outlaws uses all the advantages of a newer architecture minus mesh shaders, which it would actually benefit from performance wise unless you do not understand the purpose of it.




So games that don't use more VRAM, a better CPU than the base PS4, RT, compute-shader features like mesh shaders, the SSD, or I/O texture decompression/streaming would be akin to base PS4 games

Fucking hell

Yes yes, you figured it out.
The 15% delta is literally in the very first benchmark you posted, which you conveniently and disingenuously chose to show 1440p results for on budget cards incapable of running high end games at that resolution (as evidenced by the sub-60 frame rates)

cyberpunk-2077-1920-1080.png
relative-performance_1920-1080.png
 
The 15% delta is literally in the very first benchmark you posted, which you conveniently and disingenuously chose to show 1440p results for on budget cards incapable of running high end games at that resolution (as evidenced by the sub-60 frame rates)
Not the first time either.
Took a section saying PS4 Pro sticks to 60fps well and screenshotted the cutscene transition during that exact moment.
Where he took the screenshot from:



Notice they're saying they hit the 60fps target well and it's pretty solid. He cherrypicked that so well for his agenda.
 
lol, Cyberpunk, the game running 10fps better and at a higher resolution in the developer's

cray GIF


Gouge out your eyes, because you clearly don't need them.
LfzthLxfm4uxuRWC.jpg


PlayStation consoles are cutting so much content that we might as well call them different generations


own benchmarks. You should put a clown pic as your profile pic. Meanwhile we have 10-15 games running at a higher resolution on the Pro

With worse image quality, missing features, missing effects, not rendering everything

but let's look at Sonic Worlds, a game that can run on a toaster at 60fps high settings, as proof of Switch 2 superiority.

Aren't you supposed to be running Sonic CrossWorlds in an emulator by now, clown, to prove your point? We're all waiting

While you're proposing that Hitman 2, Hogwarts Legacy and Elden Ring are the benchmark? Do list those 10-15 games :messenger_tears_of_joy:

Make-Up Face GIF by Justin


The 15% delta is literally in the very first benchmark you posted,

No?

CP2077_1440p.png


which you conveniently and disingenuously chose to show 1440p results

Iron Man Eye Roll GIF


1080p, 2025, 25 games average rasterization

relative-performance-1920-1080.png


for on budget cards incapable of running high end games at that resolution (as evidenced by the sub-60 frame rates)

cyberpunk-2077-1920-1080.png
relative-performance_1920-1080.png

You had to go back to 2021 and pick a $680 card over a $550 one to find that sole benchmark, which does not even come close to your claim of "on average it's +15%"?

Here, another 2021 legacy

OgPQeKpZM84qWArI.png


In the meantime, 2025 25 games average kills your whole argument.
 
Yakuza 0: 1080p/60fps, just like base PS4. The PS4 Pro would run this at 1440p easily.
Tony Hawk's Pro Skater: 1080p, just like base PS4. The Pro would run this at 1440p.
Hogwarts Legacy: 720p upscaled using DLSS lite; PS4 Pro is 1100p and runs much better.
SF6: 1440p on Pro, while Switch 2 is upscaled from 520p; higher settings like SSR remain intact.
Fortnite: higher resolution on Pro, 1100p-1440p.
Skyrim: 4K on Pro, while Switch 2 is 1440p.
Persona 3: 1600p on Pro vs 1080p on Switch 2 in quality mode.
GRID Legends: 1440p/50-60fps; Switch 2 only has 30fps with Pro settings.
RE8: runs much better.
RE7: 1600p vs 1080p on Switch 2.
FF7 Remake: better shadows and SSR, and runs at 1600p.
Hitman 3: 1080p/60fps vs 1080p/30fps.
Elden Ring: they can't even get it out because they're struggling so badly with Switch 2.
Cyberpunk: higher resolution on PS4 Pro while running better.

The PS4 Pro is superior, end of discussion. You can literally look at all these cross-gen games; only one goes over 1080p. Anyone with common sense would get it. DLSS is mostly DLSS lite, which doesn't even work in motion.
 
You had to go back to 2021 and pick a $680 card over a $550 one to find that sole benchmark, which does not even come close to your claim of "on average it's +15%"?

Here, another 2021 legacy

OgPQeKpZM84qWArI.png


In the meantime, 2025 25 games average kills your whole argument.


Allow me to disclose how I produced this deep seek ancient unicorn to support my claim... I googled "Techpowerup 6600XT". That is literally the first result/review with benchmarks (the actual first result is the generic overview spec page).

And though it was unintentional on your part, thank you for proving my point on AMD's cards being significantly handicapped by terrible drivers and developer indifference. Did you not stop to think why you are finding such disparities in multi-game averages, particularly over time?
 

Unlike you who can't tell PS4 pro is not rendering effects and assets while driving a car, no.

As we can all see - 50.4% of TFLOPs, but 78% of performance.
7uvxAFEplsfsrXoj.png


Again - 47.2% of TFLOPs, but 76% of performance.
IFnqd93J0s5YHQVU.png


So yeah, Haint is right. It is half and then you multiply it by 1.25 (1.33 at best).

vs Turing

Who has talked about Turing vs Ampere here?

AMD RDNA 2 has to be scaled down too for even comparing with Turing. Pick a 6800 vs 2080 Ti and oops, scale down

Haint :

P1oOKWLLUn5s5ibY.png


RDNA 2 vs Ampere he's asking for.

Imagine even trying to interpolate this to PS4's GCN :messenger_tears_of_joy:


One look at the TMUs between the generations tells you everything you need to know.
Just take a look at a few different Geforce generations and maybe (just maybe) you'll understand if Nvidia follows your flawed logic or if they actually follow real logic.

If anyone really feels the need to compare FLOPs between architectures, you can do so between Maxwell, Pascal, Turing, GCN 4, RDNA1 and RDNA 2, but THAT'S IT.
And it's not like Pascal and Turing or GCN 4 and RDNA 1 have the same 'FLOPs efficiency' but they're roughly comparable.
The FLOPs of Ampere, Ada Lovelace, Blackwell, RDNA 3 and RDNA 4 are all heavily inflated. Actually, RDNA 3 is in a league of its own when it comes to FLOPflation...🤭

And to quote your words from another thread: 'As if GCN 1.1 Tahiti generation would be even comparable to Ampere architecture'

You're right for once.
GCN 1 FLOPs aren't in any way comparable to Ampere's FLOPs.
In gaming, GCN 1 is much more performant per FLOP than Ampere is.
hvOxFsxLIZVtoQuu.png

And before you point to mesh shaders or some other barely used tech - how many games use hardware RT or mesh shaders on Switch 2? It's getting close to 200 games released on S2. Is it even 5%?:poop:

Ok, so let's go with your logic and Haint's logic

Puny tiny 2TF docked-mode Switch 2 vs the PS4 Pro's underrated GCN 1 FLOPs, which are much more performant than Ampere's - which puts the PS4 Pro at what, 5-6 TFLOPs?

The Switch 2 slapped the ever-fucking-living shit out of it in Cyberpunk 2077, Phantom Liberty included, at 19W, without cheaping out on rendering during traversal, and it actually doesn't drop 10fps for an alleyway battle with no "streaming assets coming in" excuse.

What a fucking beast

Celebrate In Love GIF by HBO Max



Switch 2 drops frames during traversal like a madman. Unlike PS4 Pro.

Because what it's rendering does not even look like the same generation

LfzthLxfm4uxuRWC.jpg



Qa04sLITmFVLX96m.jpg


LejkEZY3Q0gPWXNS.jpg


sYZ0VDCYe3vf7PUs.jpg



QQ4F71oHFqcW9YBW.jpg



bvwahmVQaorZ3R1u.jpg

And I told you last month that I'm done looking at Cyberpunk.

Convenient

What path tracing are you speaking of?
The one on Switch 2? 🤡
The one on Xbox Series X? 🤡
Or maybe the one on PS5 Pro? 🤡

PC path tracing, hur dur dur

Your buddy subzero was trying to convince neogaf that these devs don't have technical knowledge. Are you ready to also support those claims and join the circus?

If it weren't for Nvidia's fondness for the neo-noir artstyle with all its colorful neon lights, skyscrapers, and steam, Cyberpunk wouldn't have ever gotten all the attention from Nvidia.

Last time I personally cared about Cyberpunk 2077 was back in the fall of 2020 - when I was selling my CDR stock a few weeks before the release of the game.
Because it was obvious that they were hiding the base 8th gen consoles versions for absolutely obvious reason - abysmal performance.
The difference in framerates during car traversal in CP2077 between the PS4 Pro (high 20s) and the Switch 2 (low 20s) is very similar to CD Projekt Red's intentions in the summer and fall of 2020 - extremely obvious 🤷‍♂️

What the fuck are you even babbling about.

So, if Elden Ring defies all hardware, please tell me - why is the Xbox One S version more or less what one would expect it to be when compared to the PS4 version?
And why is the PS4 Pro version what you would expect it to be when compared to the PS4 one?

Why were people on PS5 running the PS4 version via backward compatibility for the most stable experience? Try to defend this technical mess and elevate FromSoftware to a dev with technical know-how; please do, I want to see the spin.

Know what? Post your spin on graphical fidelity thread too. They need a laugh.

And also - do you think FromSoftware put more time into optimizing the Xbox version than it has put into the already-delayed Elden Ring Tarnished Edition on Switch 2 (it was supposed to release in 2025)? 🤔

You think From Software is putting their A team on the port?

But I don't have any footage of the game they will launch after they decided to delay it.

Do you?

You're the only one here picking footage from a dev that clearly stated they have to go back to working on it, and you're using that for performance metrics. Imagine doing this with a serious face. You're not credible; you guys are clowns. Nobody does this shit on NeoGAF. Both you bozos from 2025 come in with this nonsense. Again, not sure why you aren't banned yet.


And since you're so knowledgeable and so moderate in your claims - could you please explain what's going on with DLSS 4 SR on Switch 2?

It's not like you were fantasizing and daydreaming about the capabilities of your beloved plastic box, right?

REALITY: 'On the Switch 2, moving objects just kind of look lower res and flickery on their edges as soon as any sight of movement kind of starts. This applies to any and all moving objects in all scenes in the Switch 2 version.' (DF, Oct 3rd, 2025, 16m0s)



Poor Buggy was fantasizing about the DLSS Transformer model, but instead he got something worse than the DLSS 3.x SR CNN model in the majority of games that actually opt to use Tensor cores for upscaling.
And as for all your wild, outlandish claims - it's not like AI upscaling is even something that is universally used on the Switch 2. Less than half of the S2 games use any kind of DLSS, and the ones that actually use the full CNN model can be counted on the fingers of one hand.


It has better upscaling than all the current-gen platforms' FSR2 pixel soup, and I would say even including PiSSR. How many games on PS5 Pro use PSSR? How many games on the standard consoles used FSR2? A piss-poor ratio.

Reality: Alex from DF compared Switch 2's DLSS implementation in Cyberpunk 2077 to running 1080p DLAA on PC, and yet there are many times it resolved better than the CNN preset E. 🤷‍♂️
 
And though it was unintentional on your part, thank you for proving my point on AMD's cards being significantly handicapped by terrible drivers and developer indifference. Did you not stop to think why you are finding such disparities in multi-game averages, particularly over time?

Nvidia drivers finewine 🤷‍♂️

Isn't that AMD's whole personality? What happened there?

Please, go into AMD threads in this place to tell them their cards are handicapped by terrible drivers, I'll grab the popcorn.

Developer indifference - again, I'm not sure you understand how this goes. The game engine talks to the API, which talks to the GPU. Pick DX12, Vulkan, doesn't matter. The game sends "function call X" into the pipeline. This is hardware-agnostic. Devs do not change the calls going through these APIs. It's then all up to the GPU drivers to take that call, translate it into low-level commands, and send it down the pipeline with as much optimization as the architecture or the driver allows.

Devs have fuck-all to do with this, dude. No games are developed on a per-architecture basis
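The call flow described above can be sketched as a toy example (all class and function names here are invented for illustration, not a real DX12/Vulkan API):

```python
# Toy sketch of the flow described above: the game issues the same
# hardware-agnostic API call; only the vendor's driver layer differs.
# All names are hypothetical, invented purely for this illustration.
class Driver:
    def translate(self, call: str) -> str:
        raise NotImplementedError

class VendorADriver(Driver):
    def translate(self, call: str) -> str:
        # vendor-specific lowering of the generic call into "microcode"
        return f"vendorA-microcode[{call}]"

class VendorBDriver(Driver):
    def translate(self, call: str) -> str:
        return f"vendorB-microcode[{call}]"

def game_render(driver: Driver) -> str:
    # the game engine only ever sees the generic, hardware-agnostic call
    return driver.translate("draw_indexed(count=36)")

print(game_render(VendorADriver()))  # vendorA-microcode[draw_indexed(count=36)]
print(game_render(VendorBDriver()))  # vendorB-microcode[draw_indexed(count=36)]
```

The game-side code is identical in both runs; everything vendor-specific lives below the API boundary, which is the point being made about drivers versus developers.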
 
I used to get tight when people said Nintendo was for kids… but damn near every "mature adult masterpiece" I'd actually be proud to play in public runs like it's fighting for its life on that hardware?

Y'all stamped it yourselves, man...
kgnwqsWb9moMv9HM.gif
 
Go to the eye doctor ASAP and
Meanwhile you're in threads saying it's not a big difference, 600p/40-60fps upscaled vs 1440p/60fps. Bruh, you guys will never change. Like, the Pro and Switch versions look 10% better at best; there's hardly anything that demanding going on there that shows the Switch 2 or PS4 Pro is 10x more powerful. Go to sleep. You can't even notice the difference between 60fps locked and 40-60fps and you want to tell me to get my eyes checked.
 
Looks almost the same to my eyes. Looks like a game on the same generation of hardware.
You went from
"The Switch 2 version doesn't feature the same graphical features as the PS4 Pro and XSS; that's why it can run at 60 fps."
to
"Oliver from DF stated the Switch 2 version is just an upressed Switch 1 version without any gen 8 (PS4/X1/PS4 Pro/X1X) graphical features."
to
"Actually, Switch 1 version at 4k resolution looks exactly like PS4 Pro."
to
GGJqv51XwAABoTE.jpg
 
You went from
"The Switch 2 version doesn't feature the same graphical features as the PS4 Pro and XSS; that's why it can run at 60 fps."
to
"Oliver from DF stated the Switch 2 version is just an upressed Switch 1 version without any gen 8 (PS4/X1/PS4 Pro/X1X) graphical features."
to
"Actually, Switch 1 version at 4k resolution looks exactly like PS4 Pro."
to
GGJqv51XwAABoTE.jpg
That's what Oliver stated, and you even admitted he probably made a mistake. So yeah, it does look the same; I see a little more grass and slightly better textures. There's no reason the PS4 Pro should not run this at 1440p/60fps when it's 11x more powerful than the Switch 1, and the game is not even CPU-limited. The Steam Deck runs this at max settings at 800p/60fps, and no way in hell is the Deck on par with the PS4 Pro GPU, which is what this game is suggesting. Really a pointless comparison, but I understand it's the only win Switch 2 has.
 
You went from
"The Switch 2 version doesn't feature the same graphical features as the PS4 Pro and XSS; that's why it can run at 60 fps."
to
"Oliver from DF stated the Switch 2 version is just an upressed Switch 1 version without any gen 8 (PS4/X1/PS4 Pro/X1X) graphical features."
to
"Actually, Switch 1 version at 4k resolution looks exactly like PS4 Pro."
to
GGJqv51XwAABoTE.jpg

He's doubling down that the Switch 1 upres is the same as the PS4 Pro :messenger_tears_of_joy:

Please take more screenshots if you can, I'll grab the popcorn
 
The PS4 Pro is superior, end of discussion. You can literally look at all these cross-gen games; only one goes over 1080p. Anyone with common sense would get it. DLSS is mostly DLSS lite, which doesn't even work in motion.
no one cares shut up GIF

Does Not Matter Jessica Chastain GIF by Saturday Night Live


This entire thread is embarrassing, but since you are the black hole of this thread, you should have said:

See Ya Comedy GIF by Rooster Teeth


But instead:

something ps GIF
 
Guys, chill. What subzero83 is trying to say is that the PS4, PS4 Pro, Switch 1 and Switch 2 are in the same ballpark of power and bring similar results in game rendering. What he's trying to say is that the Switch 1 is a miracle and that the PS4 Pro should be a portable.
 
Guys, chill. What subzero83 is trying to say is that the PS4, PS4 Pro, Switch 1 and Switch 2 are in the same ballpark of power and bring similar results in game rendering. What he's trying to say is that the Switch 1 is a miracle and that the PS4 Pro should be a portable.
The Switch 1 is not anywhere near them. Sonic CrossWorlds on Switch 2, though, other than some textures, 60fps, and resolution, does not look that much better than on PS4 Pro. Switch 2 gets 60fps, but the PS4 Pro just got a terrible port.
 
The Switch 1 is not anywhere near them. Sonic CrossWorlds on Switch 2, though, other than some textures, 60fps, and resolution, does not look that much better than on PS4 Pro. Switch 2 gets 60fps, but the PS4 Pro just got a terrible port.

But it looks so close, according to you...
 
But it looks so close, according to you...
I said the Sonic Worlds Switch 1 version with a resolution bump to 1440p and 60fps would look close to the PS4 Pro and Switch 2 versions. I mean that the Switch 1 emulation pic at 60fps would be vastly better than playing at 30fps, and close enough for me. I look at those comparison pics and it's a very small downgrade.
 
I said the Sonic Worlds Switch 1 version with a resolution bump to 1440p and 60fps would look close to the PS4 Pro and Switch 2 versions. I mean that the Switch 1 emulation pic at 60fps would be vastly better than playing at 30fps, and close enough for me. I look at those comparison pics and it's a very small downgrade.

So you're telling me PS4 Pro is a Switch 1 with a resolution bump. Got it now.
 
U blind?🤨




As we can all see - 50.4% of TFLOPs, but 78% of performance.


Again - 47.2% of TFLOPs, but 76% of performance.


So yeah, Haint is right. It is half and then you multiply it by 1.25 (1.33 at best).
One look at the TMUs between the generations tells you everything you need to know.
Just take a look at a few different GeForce generations and maybe (just maybe) you'll understand whether Nvidia follows your flawed logic or actually follows real logic.

If anyone really feels the need to compare FLOPs between architectures, you can do so between Maxwell, Pascal, Turing, GCN 4, RDNA1 and RDNA 2, but THAT'S IT.
And it's not like Pascal and Turing or GCN 4 and RDNA 1 have the same 'FLOPs efficiency' but they're roughly comparable.
The FLOPs of Ampere, Ada Lovelace, Blackwell, RDNA 3 and RDNA 4 are all heavily inflated. Actually, RDNA 3 is in a league of its own when it comes to FLOPflation...🤭
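The per-FLOP gap being argued here can be sanity-checked with quick arithmetic. A minimal sketch, assuming the 50.4%/78% and 47.2%/76% pairs quoted above (the function name is my own, not from any benchmark tool):

```python
# Hypothetical sketch: per-FLOP "efficiency" implied by the ratios quoted above.
# Inputs are the shares from the post (TFLOPs share, performance share),
# not independently measured benchmarks.

def perf_per_flop_ratio(tflops_share: float, perf_share: float) -> float:
    """How much more performance per FLOP the smaller GPU delivers
    than its raw TFLOPs share would suggest."""
    return perf_share / tflops_share

# Pair 1: 50.4% of the TFLOPs delivering 78% of the performance
r1 = perf_per_flop_ratio(0.504, 0.78)   # ~1.55x per-FLOP efficiency
# Pair 2: 47.2% of the TFLOPs delivering 76% of the performance
r2 = perf_per_flop_ratio(0.472, 0.76)   # ~1.61x per-FLOP efficiency

print(round(r1, 2), round(r2, 2))
```

In other words, if these screenshots are representative, the older architecture extracts roughly 1.5–1.6x as much gaming performance per paper FLOP, which is the "FLOPflation" point being made.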

And to quote your words from another thread: 'As if GCN 1.1 Tahiti generation would be even comparable to Ampere architecture'

You're right for once.
GCN 1 FLOPs aren't in any way comparable to Ampere's FLOPs.
In gaming, GCN 1 is much more performant per FLOP than Ampere is.

And before you point to mesh shaders or some other barely used tech - how many games use hardware RT or mesh shaders on Switch 2? It's getting close to 200 games released on S2. Is it even 5%?:poop:


Switch 2 drops frames during traversal like a madman. Unlike PS4 Pro.
And I told you last month that I'm done looking at Cyberpunk.

What path tracing are you speaking of?
The one on Switch 2? 🤡
The one on Xbox Series X? 🤡
Or maybe the one on PS5 Pro? 🤡

If it weren't for Nvidia's fondness for the neo-noir art style, with all its colorful neon lights, skyscrapers, and steam, Cyberpunk wouldn't ever have gotten all that attention from Nvidia.

Last time I personally cared about Cyberpunk 2077 was back in the fall of 2020 - when I was selling my CDR stock a few weeks before the release of the game.
Because it was obvious that they were hiding the base 8th-gen console versions for an absolutely obvious reason - abysmal performance.
The difference in framerates during car traversal in CP2077 between the PS4 Pro (high 20s) and the Switch 2 (low 20s) is, much like CD Projekt Red's intentions in the summer and fall of 2020, extremely obvious 🤷‍♂️

So, if Elden Ring defies all hardware, please tell me - why is the Xbox One S version more or less what one would expect it to be when compared to the PS4 version?
And why is the PS4 Pro version what you would expect it to be when compared to the PS4 one?
And also - do you think FromSoftware put more time into optimizing the Xbox version than into the already-once-delayed (it was supposed to release in 2025) Elden Ring Tarnished Edition on Switch 2? 🤔



And since you're so knowledgeable and so moderate in your claims - could you please explain what's going on with DLSS 4 SR on Switch 2?

It's not like you were fantasizing and daydreaming about the capabilities of your beloved plastic box, right?

FANTASY:


REALITY: 'On the Switch 2, moving objects just kind of look lower res and flickery on their edges as soon as any sight of movement kind of starts. This applies to any and all moving objects in all scenes in the Switch 2 version.' (DF, Oct 3rd, 2025, 16m0s)



Poor Buggy was fantasizing about the DLSS Transformer model, but instead he got something worse than the DLSS 3.x SR CNN model in the majority of games that actually opt to use Tensor cores for upscaling.
And as for all your wild, outlandish claims - it's not like AI upscaling is even something that is universally used on the Switch 2. Less than half of the S2 games use any kind of DLSS, and the ones that actually use the full CNN model can be counted on the fingers of one hand.

How many accounts does old sweetie tooth have lol..
 