I'll requote this since it was a good question that got stuck at the bottom of the page.
*some* don't look great and yet *some* look better. the speculation for that has been based on the fact that games designed for the 360 and PS3 are built around systems which (comparatively) are CPU centric, rather than GPU centric. adapting a game for a system which leans more of its power on the GPU isn't straightforward, even if running code you built for 360 and PS3 games is straightforward.
these numbers wouldn't shine any new light there really. we'd still be looking at a system with a more powerful GPU than either the 360 or PS3 had: roughly 1.5 times more powerful than the 360's, and more than that in comparison to the PS3's.
presumably the multiplatform games that look better on Wii U are ones that already leant more heavily on GPU than CPU. the ones that don't, either require more out of the CPU than the Wii U can handle, or weren't as well optimized as they could have been (or both).
when you're straining to make launch and you have your game up and running at what you deem an acceptable framerate (even if it isn't as good as that seen on the other consoles), I don't think you're going to lose too much sleep over it. obviously EA deemed Mass Effect 3's framerate good enough, because it averages higher than the PS3 version's. Activision got COD:BlOps 2 running at a locked 60 fps in multiplayer, and probably weren't too concerned that single player was running at around 45 fps.
I'm sure going forwards we'll still see the occasional multiplat that runs worse on Wii U (because a design led on the 360 doesn't fit too well into its architecture), but I'm sure we'll see this less often now that people have more time to get things right.