The Unreal Engine 5 demo, for example, used 8K textures and Hollywood-level asset geometry with polygon-per-pixel rendering. Geometry-wise, a PC game cannot have more geometric detail than something already rendering roughly one polygon per pixel. Texture-wise, sure, you could throw in extra-nightmare 16K textures, but that wouldn't be perceptible; it would just be bragging and a misuse of rendering resources. The image quality in the demo was also extremely good.
We are getting to a point where the level of detail, in terms of textures or polygons, cannot be meaningfully increased beyond what the consoles can already handle.
Judging by the NVIDIA Marbles demo, and unless more complex animation turns out to be significantly more taxing on path-traced lighting, it also seems we are one or two console generations away from path-traced lighting in mainline games. At that point, lighting joins textures and polygon detail in the category of things that cannot be meaningfully improved.
What remains after that is physics, animation, and framerate. If DLSS-like solutions are implemented in the future, expect consoles to easily handle 60+ fps at 4K; the Unreal demo already uses Hollywood-level assets and runs fine on the PS5 (some estimates say 45+ fps), so any additional performance can go into framerate.
Sure, you will be able to say you game at 8K 120 fps on a future high-end rig, but that won't be much of a difference from 4K 60 fps with the same assets, the same lighting, and excellent image quality.
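To put rough numbers on that gap, here is a back-of-the-envelope sketch of the raw pixel throughput each target asks for; the resolutions and framerates are just the ones mentioned above, and this ignores everything else that scales with resolution:

```python
# Rough pixel-throughput comparison: 8K/120 vs 4K/60 with identical assets
# and lighting. Purely illustrative arithmetic, not a benchmark.

RES_4K = 3840 * 2160   # pixels per frame at 4K
RES_8K = 7680 * 4320   # pixels per frame at 8K (4x the pixels of 4K)

throughput_4k60 = RES_4K * 60    # pixels shaded per second at 4K 60 fps
throughput_8k120 = RES_8K * 120  # pixels shaded per second at 8K 120 fps

print(throughput_4k60)                     # 497664000
print(throughput_8k120)                    # 3981312000
print(throughput_8k120 / throughput_4k60)  # 8.0x the raw pixel work
```

In other words, that future rig would be spending roughly eight times the pixel work to draw the same assets with the same lighting, which is exactly the kind of spend that stops buying a visible difference.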
Before, when you had blurry textures, crappy image quality, and sub-30 fps on consoles, you could say, yeah, there's a big difference. But as consoles get sharp textures, excellent image quality, 60 fps, Hollywood-level textures and geometry, and, a few generations down the line, path-traced lighting, the difference becomes much smaller.