Going back as far as the first WUST, we talked about Nintendo possibly choosing DDR3 over GDDR5 due to latency. The memory clock isn't really surprising, since it isn't dramatically lower than what Thraktor and I discussed in this exchange:
http://67.227.255.239/forum/showthread.php?p=42579861#post42579861

I will say, though, that if Nintendo did stick with clock multiples (the DSP was originally listed at 120MHz), the memory could very well be 720MHz for all we know, if they underclocked it. It wouldn't make sense to overclock a part when faster-rated parts are available to underclock instead.

What I found interesting is that ifixit.com has a teardown (Step 12), and their unit had Micron memory. Looking at the specs of the available options, I believe Nintendo chose latency (1.25ns @ CL = 11) over bandwidth, and at the same time they were limited to a 64-bit bus because 32-bit-wide DDR3 was not yet ready or available for production. The Wii U's timing is unfortunate, since Micron is taking the same memory module and sampling a "TwinDie" version that doubles the density and bandwidth, so the Wii U could conceivably have had 4GB with twice the BW.

All that said, I think some are giving too much weight/concern to the memory bandwidth. Nintendo has always been picky about balance, and BW is not the only factor in memory speed. So I decided to put forth the effort to look for/at the latency numbers. Based on the formula I found (and assuming I applied it correctly), this is what we're looking at:
Wii U DDR3 - 13.75ns (based on 800MHz)
Xbox 360 GDDR3 - 14.29ns (low end), 21.43ns (high end)
PS3 XDR - 35ns (taken from a PS3 wiki)
PS3 GDDR3 - 15.38ns or 16.92ns
I think there was a dev who talked about Wii U's latency compared to the others, so this should put some actual numbers to that. The reason there are multiple numbers for the GDDR3 is that Samsung's data sheets give multiple CAS latency options for each possible clock speed, and since I didn't know which CL MS and Sony chose for their respective GDDR3 memories, I calculated all of them. There were three other possible outcomes for the 360, so I stuck with the high and low. Going back to Nintendo's choice, they seem to have gone with the lowest latency, using Micron's info:
800MHz - 13.75ns
900MHz - 14.44ns
933MHz - 13.93ns
1000MHz - 14ns
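For anyone who wants to check the math, here's a small sketch of the formula these numbers appear to come from: CAS latency in nanoseconds = CL cycles divided by the memory clock in MHz, times 1000. The CL values below are my assumptions inferred from the numbers in this post (e.g. CL 11 at 800MHz gives 13.75ns, matching Micron's 1.25ns clock period), not confirmed datasheet picks for each console.

```python
def cas_latency_ns(cl_cycles, clock_mhz):
    """Time to first data after a column read: CL cycles at the command clock."""
    return cl_cycles / clock_mhz * 1000

# Wii U DDR3 @ 800MHz command clock, CL 11 (per the Micron spec above)
print(round(cas_latency_ns(11, 800), 2))   # 13.75

# Xbox 360 GDDR3: assuming a 700MHz command clock, CL 10 and CL 15
# reproduce the low/high figures quoted above
print(round(cas_latency_ns(10, 700), 2))   # 14.29
print(round(cas_latency_ns(15, 700), 2))   # 21.43
```

Note this is only the CAS component; full access latency adds RAS-to-CAS and precharge delays on top, but CL is the piece the data sheets let you compare directly.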
So I didn't do this to justify Nintendo's decision; as you know, I've said in the past that the Wii U falls short of what I think a next-gen console (from a power perspective) should look like. As with other speculation, this was to get a look at Nintendo's thinking.