Osiris
I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Actually:
PS4 GPU > 7850 in terms of CUs, TMUs (iirc), added GPGPU functionality, GCN 1.x features, and TF (1.84 vs 1.76). It loses out to the 7850 in triangle setup rate (1.6B vs 1.72B/sec) and core clock.
Rumoured XB3 GPU > 7770 in terms of CUs (12 vs 10, which is why some here suggest it may be a detuned Pitcairn instead), added GPGPU functionality, and triangle setup rate (same as the PS4's GPU). It loses out in terms of core clock and TF count (1.23 vs the 7770's 1.28).
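For anyone wondering where those TF numbers come from, here's a rough sketch of the standard GCN peak-throughput arithmetic (CUs x 64 lanes x 2 ops per FMA x clock). The CU counts and clocks below are just the rumoured/published figures from this thread, not confirmed specs:

```python
def gcn_tflops(cus, clock_ghz):
    """Peak single-precision TFLOPS for a GCN GPU:
    CUs x 64 shader lanes x 2 ops (fused multiply-add) x clock (GHz) / 1000."""
    return cus * 64 * 2 * clock_ghz / 1000.0

# Rumoured/known figures (assumed, not confirmed):
print(gcn_tflops(18, 0.800))  # PS4: 18 CUs @ 800 MHz -> ~1.84
print(gcn_tflops(16, 0.860))  # HD 7850: 16 CUs @ 860 MHz -> ~1.76
print(gcn_tflops(12, 0.800))  # rumoured XB3: 12 CUs @ 800 MHz -> ~1.23
print(gcn_tflops(10, 1.000))  # HD 7770: 10 CUs @ 1 GHz -> 1.28
```

Which is why the XB3 ends up just under the 7770 in raw TF despite having more CUs: the 7770's much higher clock makes up the difference.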
More importantly, they are both being optimised to be more flexible and much more efficient, which means both will perform better than their closest PC counterparts.
Agreed, but for the simplest, most direct comparison they will suffice.
Please explain the joke, or else just answer my rather simple question: can Microsoft do the same thing with DDR3 as Sony did with GDDR5?
Unlikely. DDR3 density capped out at 4Gb chips a few years ago, I believe; Sony were just incredibly fortunate that the timing of further development and increased chip density for GDDR5 allowed them to make such a change.