Originally Posted by onQ123
On the graphics side of things: if the memory bandwidth is 176GB/s and the game runs at 60FPS, the most memory the game can touch in a given frame is 2.93GB, and at 30FPS you couldn't use more than 5.86GB per frame.
In other words, you can't move 8GB of data each frame over 176GB/s unless the game ran at 22FPS, so why not put the other GB of RAM to use for the OS?
You are indeed correct. Based on the bandwidth figures, the per-frame memory ceiling for each console works out as follows.
| 8GB DDR3 at 68GB/s (5GB available to devs)
at 60FPS the maximum memory accessible per frame is 1.133GB
at 30FPS the maximum memory accessible per frame is 2.266GB
| 8GB GDDR5 at 176GB/s (5.5GB available to devs, 512MB of which is swap space paged to the HDD)
at 60FPS the maximum memory accessible per frame is 2.933GB
at 30FPS the maximum memory accessible per frame is 5.866GB
This is the actual maximum amount of memory each console can access per frame, irrespective of how much of the total the OS reserves.
If people are wondering how the figures are derived: the bandwidth dictates the maximum amount of RAM that can be accessed per second, so 68GB/s means at most 68GB of memory traffic per second. If a game runs at 30FPS, 30 frames are rendered in each second, so you just divide 68 (the memory traffic available per second, and thus the RAM reachable per second) by 30 (the number of frames rendered in that second) to get the per-frame figure.
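The division above can be sketched in a few lines of Python (the bandwidth and frame-rate numbers are the ones used in this thread; the function name is just for illustration):

```python
# Per-frame bandwidth ceiling: bus bandwidth (GB/s) divided by frame rate (FPS).
def gb_per_frame(bandwidth_gb_per_s, fps):
    return bandwidth_gb_per_s / fps

# 8GB DDR3 at 68 GB/s vs 8GB GDDR5 at 176 GB/s, at the two common frame rates
for label, bw in [("DDR3 @ 68 GB/s", 68), ("GDDR5 @ 176 GB/s", 176)]:
    for fps in (60, 30):
        print(f"{label}, {fps} FPS: {gb_per_frame(bw, fps):.3f} GB per frame")

# The quoted 22FPS figure is the frame rate at which the 176 GB/s bus
# could move the full 8GB every frame: 176 / 8 = 22.
print(176 / 8)  # frames per second needed to touch all 8GB each frame
```

The same function reproduces all four figures in the list above (1.133, 2.266, 2.933, 5.866 GB, to rounding) and the 22FPS break-even point from the quoted post.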