We don't know the official specs yet. At least 4GB of RAM (if not more) will be used by the OS, and keep in mind that all of the memory on consoles is shared: it's basically a PC without separate RAM and VRAM pools for different workloads. On PC, both VRAM and RAM are in use while you're playing (the OS included). For example, Resident Evil 2 eats up to 9GB of VRAM at native 4K (or close to it) plus 6GB of RAM, and if you play at 4K on a memory-limited GPU, RAM usage can spike to 10GB at 4K and 9GB at 1440p. That's almost 19-20GB (RAM + VRAM) in total for 1440p / 4K, provided you even have that much; otherwise, game assets will be streamed from the SSD / HDD instead.
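To make the split concrete, here's a minimal sketch, assuming Windows and DXGI (error handling trimmed), of how a PC program sees the two pools as separate budgets:

```cpp
// Minimal sketch (Windows + DXGI assumed): system RAM and dedicated VRAM are
// exposed as two separate pools on PC, unlike a console's single shared pool.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    // System RAM, queried through the OS.
    MEMORYSTATUSEX mem{};
    mem.dwLength = sizeof(mem);
    GlobalMemoryStatusEx(&mem);
    printf("System RAM:     %.1f GB\n",
           mem.ullTotalPhys / (1024.0 * 1024.0 * 1024.0));

    // Dedicated VRAM, queried per adapter through DXGI.
    IDXGIFactory* factory = nullptr;
    if (SUCCEEDED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory))) {
        IDXGIAdapter* adapter = nullptr;
        if (factory->EnumAdapters(0, &adapter) == S_OK) {
            DXGI_ADAPTER_DESC desc{};
            adapter->GetDesc(&desc);
            printf("Dedicated VRAM: %.1f GB\n",
                   desc.DedicatedVideoMemory / (1024.0 * 1024.0 * 1024.0));
            adapter->Release();
        }
        factory->Release();
    }
    return 0;
}
```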
You can clearly see that some games on the base consoles dropped their rendering resolution to 720p in some cases. That's because they are memory limited: memory bandwidth, along with overall compute performance (floating-point operations per second), is just not good enough anymore for modern games to run at higher resolutions at 30 FPS, and even the PS4 Pro struggles to hold a locked 30 FPS in some modern titles. There's nothing you can do about that on a console, but on PC you can just upgrade your GPU without touching your RAM or CPU (if they're already high-end). For a rough sense of how steeply memory traffic scales with resolution, see the sketch below.
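A back-of-the-envelope sketch, not a benchmark: the 32-bit color and single-framebuffer-pass assumptions are mine, and real frames generate many times more memory traffic per pixel (overdraw, textures, G-buffers). It only shows how steeply raw pixel throughput scales: 4K writes about 9x as many pixels per frame as 720p.

```cpp
// Lower-bound memory traffic for writing one 32-bit framebuffer at 30 FPS.
#include <cstdio>

int main() {
    struct Mode { const char* name; int w, h; } modes[] = {
        {"720p",  1280,  720},
        {"1080p", 1920, 1080},
        {"1440p", 2560, 1440},
        {"4K",    3840, 2160},
    };
    for (const Mode& m : modes) {
        double pixels = double(m.w) * m.h;
        // 4 bytes per pixel, 30 frames per second.
        double gbPerSec = pixels * 4.0 * 30.0 / 1e9;
        printf("%-6s %10.0f pixels/frame, >= %.2f GB/s for one pass\n",
               m.name, pixels, gbPerSec);
    }
    return 0;
}
```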
1. Having separate physical RAM pools is a problem, not an advantage.
2. Most of the time assets are duplicated between main RAM and VRAM, mainly because on PC the game code has no reliable visibility into how much VRAM is actually in use (see the sketch below).
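To illustrate point 2, here's a minimal sketch of the usual upload pattern, assuming an OpenGL-style API with a context already created (the function and variable names are just for illustration). There are vendor-specific ways to query VRAM (DXGI's QueryVideoMemoryInfo on Windows, the GL_NVX_gpu_memory_info extension on NVIDIA), but nothing portable, so engines often keep the CPU-side copy alive in case a texture has to be re-uploaded:

```cpp
// Sketch only: assumes a desktop OpenGL context exists (platform setup omitted).
#include <GL/gl.h>
#include <vector>

GLuint uploadTexture(const std::vector<unsigned char>& rgba, int w, int h) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // The driver copies `rgba` into GPU-accessible memory here. From this
    // point the same asset exists twice: once in main RAM, once in VRAM.
    // On a unified-memory console the second physical copy can be avoided.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba.data());
    // The caller often keeps `rgba` alive (the duplication), because it can't
    // tell whether VRAM is under pressure or when a re-upload will be needed.
    return tex;
}
```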