Yep, there is a severe lack of understanding with regard to minimum requirements and optimisations. It's not about the memory being slow, it's about the amount itself.
Series X / PS5 most likely have around 10 GB allocated for GPU operations and 3.5 GB for CPU operations (sound, physics and such). The Series X has a clear-cut line between GPU and CPU memory: 10 GB @ 560 GB/s and 3.5 GB @ 336 GB/s (plus an extra 2.5 GB for the system). Towards the end of the generation, I expect the game-side CPU allocation to grow to 4 GB and the system reservation to shrink to 2 GB with extra optimisations. So it's most likely a 10 GB VRAM + 3.5/4 GB RAM config on Series X and PS5.
For the Series S, things are not so good. 2 GB @ 56 GB/s goes directly to the system (it's too slow for anything else, useless for both GPU and CPU operations). That leaves one unified 8 GB pool @ 224 GB/s for CPU + GPU operations combined. The CPU-side amount is not easily scalable with resolution; we're talking about stuff like sound, physics etc. The Series S will have to allocate at least 3-3.5 GB of RAM for CPU operations, which leaves 4.5-5 GB of VRAM for GPU operations, nearly half of what Series X/PS5 can allocate.
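To make the budget arithmetic concrete, here's a minimal sketch; the splits are the estimates above (my reading of the specs), not official figures:

```python
# Rough memory-budget sketch using the figures above. Every number here
# is an assumption from this post, not official platform documentation.
consoles = {
    "Series X / PS5": {"total": 16.0, "system": 2.5, "cpu": 3.5},
    "Series S":       {"total": 10.0, "system": 2.0, "cpu": 3.5},
}

for name, c in consoles.items():
    vram = c["total"] - c["system"] - c["cpu"]
    print(f"{name}: {vram:.1f} GB left for GPU operations")

# Series X / PS5: 10.0 GB left for GPU operations
# Series S: 4.5 GB left for GPU operations
```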
And here is where the problems start. In RDR 2, for example, you can run the intended game textures at 4K on a 4 GB card, but you cannot run the intended textures on a 2 GB card even if you drop the game to 360p (literally). The game needs a minimum of roughly 3 GB of buffer for its intended textures regardless of resolution, and nothing in between works either.
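The reason dropping the resolution barely helps is that render targets scale with pixel count while the texture pool does not. A back-of-the-envelope sketch, with the buffer count and pixel format as illustrative assumptions:

```python
def render_targets_mb(width, height, bytes_per_pixel=4, buffers=3):
    # Framebuffer-style memory scales with pixel count; the 4 bytes/pixel
    # and 3 buffers here are illustrative assumptions, not RDR 2's setup.
    return width * height * bytes_per_pixel * buffers / 2**20

TEXTURE_POOL_GB = 3.0  # the rough resolution-independent minimum cited above

for w, h in [(3840, 2160), (640, 360)]:
    print(f"{w}x{h}: ~{render_targets_mb(w, h):.0f} MB of render targets, "
          f"plus ~{TEXTURE_POOL_GB:.0f} GB of textures either way")

# 3840x2160: ~95 MB of render targets, plus ~3 GB of textures either way
# 640x360: ~3 MB of render targets, plus ~3 GB of textures either way
```

Going from 4K all the way down to 360p saves you well under 100 MB here; it doesn't touch the gigabytes the texture pool needs.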
In other words, let me construct an example so this is easier to follow.
The PS4 / Xbox One had 8 GB of RAM, of which they could allocate 3.5/4 GB to games for GPU operations and 2.5-3 GB for CPU operations.
Say there was an additional console, a "PS3.5" or an "Xbox Half" xD, with a total of 5 GB of RAM compared to the PS4's 8 GB.
Now, this 5 GB console would need to allocate at least 1-1.5 GB to the system. Say the developers cut some intricate corners and managed to fit CPU operations into a 2 GB buffer. That leaves us with roughly 2 GB of VRAM for the GPU's own operations.
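Plugging those numbers in, a quick sanity check of this hypothetical budget against RDR 2's rough texture minimum (every figure is assumed from the example above):

```python
# Hypothetical "PS3.5" from the example above; every figure here is assumed.
total_gb, system_gb, cpu_gb = 5.0, 1.0, 2.0
vram_gb = total_gb - system_gb - cpu_gb   # 2.0 GB left for the GPU

RDR2_MIN_TEXTURE_GB = 3.0  # rough resolution-independent minimum from above
shortfall = RDR2_MIN_TEXTURE_GB - vram_gb
print(f"VRAM: {vram_gb:.1f} GB, intended textures need "
      f"{RDR2_MIN_TEXTURE_GB:.1f} GB -> short by {shortfall:.1f} GB at any resolution")
# VRAM: 2.0 GB, intended textures need 3.0 GB -> short by 1.0 GB at any resolution
```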
Let's see how RDR 2 looks on a 2 GB buffer:

[screenshot: RDR 2 running with its low texture setting]
Now back to the topic: if such a theoretical PS3.5 did exist, the low textures couldn't be allowed to look like that. Instead, Rockstar would have had to create a set of 2 GB-compliant textures that still looked decent. That's the "pain" part. The Series S will practically force developers to create an alternative set of textures tailored specifically for it. They call it a pain because most devs think the gains won't justify the costs, and I totally agree with them. That's another topic, of course.
As I said, some people in this thread are delusional and keep saying "devs already have to care about a min spec that's lower than the Series S". The picture above is proof that devs don't really care about min spec either. Do you really think Rockstar put any real care into 2 GB min-spec GPUs? Do those textures look okay to you? They probably just butchered their original textures with a generalised algorithm that churned out what we call the "low" textures in mere hours, and called it a day.
They can't do that for the Series S. If a PS3.5 existed, they couldn't do it for that console either; they would have to put in "extra" effort to make the game's textures look good within that potential PS3.5's 2 GB budget for GPU operations.
All of this is relevant to the Series S. Textures intended for a 10 GB buffer won't easily be cut down to 5 GB by reducing resolution alone. With a mere "flick of a switch", they would end up looking the way RDR 2's low textures look on 2 GB GPUs compared to 4 GB GPUs. With extra effort, they will look okay (the pain part).
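For a feel of what the "flick of a switch" actually does, here's a sketch of the mip-drop arithmetic (the texture size is illustrative, not RDR 2 data):

```python
def mip_chain_mb(top_level_mb):
    # Each mip level is 1/4 the size of the one above it, so a full chain
    # sums to roughly 4/3 of the top level.
    total, level = 0.0, top_level_mb
    while level >= 0.001:
        total += level
        level /= 4
    return total

top = 64.0  # illustrative: a 4096x4096 texture at 4 bytes per texel
full = mip_chain_mb(top)
switched = mip_chain_mb(top / 4)  # "flick of a switch": drop the top mip
print(f"full chain: {full:.1f} MB, top mip dropped: {switched:.1f} MB "
      f"({switched / full:.0%} of original)")
# full chain: 85.3 MB, top mip dropped: 21.3 MB (25% of original)
```

So the crude switch can roughly quarter texture memory, but only by throwing away the highest-detail level of every texture uniformly, which is exactly the blur you see in the screenshot above. A hand-tuned Series S set would spend the remaining budget where it's actually visible.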
And your example, Red Dead Redemption 2, is one of the most graphically impressive games ever. However, it's a 2018 game, which doesn't bode well for a spec sheet that will have to last until 2027.