blu
Wants the largest console games publisher to avoid Nintendo's platforms.
As much as I hate doing this...
You couldn't bother to correctly compute the BW of a well known RAM & bus configuration, even though you got the basic multipliers right (hint: it's not 10.2GB/s) - how you managed to do that is beyond me.
You come up with the most absurd of ideas that U-GPU's ROPs would be bound to the slower DDR3 pool, and not to the eDRAM pool.
You end up your posts with 'Fuck you, console vendor X'.
Have you considered the possibility you might be in the wrong thread, as per your current state of mind?
Pal, no offense, but your posts (like the one above) have contributed nothing to this thread. On the contrary, they've brought the level of discussion down several notches.

You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? DDR3-1600 on a 64-bit bus: 4 chips, 512 MB capacity each, each on a 16-bit bus. 16 bit × 4 chips = 64 bit. 1600 MT/s × 64 bit ÷ 8 = 12.8 GB/s of bandwidth. This is in comparison to the next-gen consoles, which appear to be using 256-bit buses for their main memory pools. The Xbox 360 and PS3 also used a 128-bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller, there's no way the Wii U's RAM is on par with the Xbox 360's memory, even in the real world.
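The arithmetic above can be sketched as a quick back-of-the-envelope calculation (a hedged sketch; the 700 MHz GDDR3 figure for the Xbox 360 is the commonly cited one, and these are theoretical peaks, not real-world numbers):

```python
# Peak theoretical bandwidth: transfer rate (MT/s) x bus width (bits) / 8
# gives MB/s; divide by 1000 for GB/s (decimal, the way vendors quote it).

def peak_bandwidth_gb_s(transfers_mt_s: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s for a given memory configuration."""
    return transfers_mt_s * bus_width_bits / 8 / 1000

# Wii U MEM2: DDR3-1600 (1600 MT/s effective) on a 64-bit bus
wii_u = peak_bandwidth_gb_s(1600, 64)      # 12.8 GB/s

# Xbox 360: GDDR3 at 700 MHz (1400 MT/s effective) on a 128-bit bus
xbox_360 = peak_bandwidth_gb_s(1400, 128)  # 22.4 GB/s

print(wii_u, xbox_360)
```

Same formula both times; the 360's wider bus is what buys it the extra headroom.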
Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependent; there's no point adding more ROPs unless you can feed them data fast enough.
I expanded on this by using the Xbox 360 as an example. On the Xbox 360 the ROPs were integrated into the eDRAM die, and thanks to that configuration they could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not seem to be similar, with its bus being considerably slower.
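To put a rough number on "bandwidth dependent", here's a crude fill-rate estimate (a sketch under assumed figures: the ~550 MHz GPU clock is the commonly reported one, and this counts only 32-bit color writes, ignoring Z, blending, and any compression):

```python
# Crude estimate of the write bandwidth a set of ROPs can demand at peak:
# ROPs x clock (Hz) x bytes written per pixel.

def rop_write_bw_gb_s(num_rops: int, clock_hz: float, bytes_per_pixel: int) -> float:
    """Peak color-write bandwidth demanded by the ROPs, in GB/s (decimal)."""
    return num_rops * clock_hz * bytes_per_pixel / 1e9

# 8 ROPs at ~550 MHz writing 4-byte (32-bit) color pixels
demand = rop_write_bw_gb_s(8, 550e6, 4)  # 17.6 GB/s
print(demand)
```

Even this bare-minimum estimate lands above a 12.8 GB/s DDR3 pool on color writes alone, which is why you'd expect the ROPs to sit on the eDRAM rather than the DDR3.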
Or the fact that the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.
There's no smoke without fire. In the case of the Wii U's CPU, the house is well and truly alight. We've seen DICE slam it, Crytek slam it, unnamed sources months ago slam it, and even developers publicly comment on how it was an obstacle they had to work around.
There's also no denying the CPU is based on the decade-plus-old IBM PPC 750 architecture, has the transistor count of a single Intel Atom core, and an incredibly low TDP.
Technical knowledge from a small-time indie developer who has made one simple, small game for the eShop. They also just so happen to only make games for Nintendo hardware.
Yeah, totally indicative of the Wii U's performance, and such an unbiased source.
Also, you're just as full of rhetoric as anyone else here. You come into this thread criticising people for their arguments and views, yet offer none of your own.