The game engine is not even GCN-optimized; it is simply old. Without using newer features and techniques, it brute-forces its way through with code that was tuned to hit 30fps on 2007 hardware.

What an odd thread. Crysis Remastered on consoles is not indicative of anything. It's a poorly optimized engine from over a decade ago with a hack job of RT extensions bolted onto it. On top of that, the consoles are running it in backward-compatibility mode, so none of the RDNA 2 features are in play, such as the IPC gains and hardware-accelerated RT (as confirmed by MS themselves). Add to that the software overhead of the compatibility layer doing the translation.
If the game had actually been built on the modern CryEngine as a native title using the RT acceleration in RDNA 2, then sure, use it as a benchmark if you wish. In that case, however, it would be performing significantly better on both consoles.
Your swipe at AMD is also odd. Going with Nvidia for the PS4/Xbox One generation would have been a terrible idea: Kepler was simply worse than GCN, and it only stayed ahead of GCN on PC because of AMD's lackluster drivers and DX11 performance at the time. Low-level APIs such as DX12 and Vulkan, with async compute, clearly demonstrated that GCN was the superior architecture.
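To make the async-compute point concrete: on a low-level API like Vulkan, it comes down to submitting compute work on a queue family separate from the graphics one, so the two workloads can overlap on the GPU instead of serializing. A minimal sketch in C (the helper name and the 16-family cap are mine, purely illustrative, not from any engine):

```c
#include <vulkan/vulkan.h>
#include <stdint.h>

// Returns the index of a compute-capable queue family that is NOT the
// graphics family, or UINT32_MAX if the device exposes none.
uint32_t find_async_compute_family(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, NULL);

    VkQueueFamilyProperties props[16];
    if (count > 16) count = 16;  // cap purely to keep the sketch simple
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, props);

    for (uint32_t i = 0; i < count; ++i) {
        VkQueueFlags flags = props[i].queueFlags;
        // Compute-capable but without the graphics bit: work submitted
        // here can run alongside the graphics queue.
        if ((flags & VK_QUEUE_COMPUTE_BIT) && !(flags & VK_QUEUE_GRAPHICS_BIT))
            return i;
    }
    return UINT32_MAX;
}
```

GCN's dedicated compute engines made this kind of overlap close to free in hardware, which is why console code of that generation leaned on it so heavily.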
Even if both consoles could have had a single SoC combining an AMD CPU with an Nvidia GPU this generation, that would not have delivered better console performance. AMD and Nvidia are very close in normal rasterization, with Nvidia leading the way in RT performance, but Nvidia trails AMD in both performance per watt and performance per unit of die area. Even accounting for the 8nm vs 7nm process difference, going with Nvidia would have meant a weaker GPU in normal rasterization and somewhat better RT performance, since consoles are constrained by die size and power consumption.
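A back-of-envelope illustration of that die-area point, using rough public die sizes for the big 2020 desktop parts and assuming (purely for illustration, not a measurement) equal raster performance between them:

```c
#include <stdio.h>

int main(void) {
    const double ga102_mm2   = 628.0;  // Nvidia GA102, Samsung 8nm (approx.)
    const double navi21_mm2  = 520.0;  // AMD Navi 21, TSMC 7nm (approx.)
    const double raster_score = 100.0; // assumed roughly equal raster perf

    // Performance per mm^2 is what matters when a console SoC has a fixed
    // area budget shared with the CPU, IO, and memory interfaces.
    printf("Nvidia: %.3f perf/mm2\n", raster_score / ga102_mm2);
    printf("AMD:    %.3f perf/mm2\n", raster_score / navi21_mm2);
    return 0;
}
```

On those rough numbers, AMD gets the same raster performance out of roughly 20% less silicon, and that gap is exactly what a fixed-size console SoC cares about.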
This thread feels like a thinly disguised attempt to discredit consoles once again (and, oddly enough, AMD as well).