If the PS3/360 games are doing heavy floating-point lifting, then you are correct: the CPU in the Wii U cannot keep up, and you're pretty much screwed barring a total rewrite. (IMHO this is why there is no Frostbite/UE4 on Wii U, etc.)
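For a rough sense of the gap, here is the back-of-the-envelope peak-throughput arithmetic, assuming the commonly reported clocks and SIMD widths (3.2 GHz Xenon with 4-wide VMX128 FMA, ~1.24 GHz Espresso with 2-wide paired-single FMA). These are theoretical ceilings, not measurements:

```c
#include <stdio.h>

int main(void)
{
    /* Commonly reported figures -- assumptions, not benchmarks. */
    double xenon_ghz    = 3.2;   /* Xbox 360 Xenon clock       */
    double espresso_ghz = 1.24;  /* Wii U Espresso clock       */
    int    cores        = 3;     /* both CPUs have three cores */

    /* flops per cycle per core: SIMD width x 2 (fused multiply-add) */
    int xenon_fpc    = 4 * 2;    /* VMX128: 4-wide single-precision FMA */
    int espresso_fpc = 2 * 2;    /* paired singles: 2-wide FMA          */

    double xenon_gflops    = xenon_ghz    * cores * xenon_fpc;
    double espresso_gflops = espresso_ghz * cores * espresso_fpc;

    printf("Xenon peak:    %.1f GFLOPS\n", xenon_gflops);     /* ~76.8 */
    printf("Espresso peak: %.1f GFLOPS\n", espresso_gflops);  /* ~14.9 */
    printf("Ratio:         %.1fx\n", xenon_gflops / espresso_gflops);
    return 0;
}
```

Even ignoring the PS3's SPUs entirely, that is roughly a 5x theoretical deficit on the FP side, and no amount of per-clock cleverness claws that back without restructuring the workload.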
Taking advantage of the other benefits you mentioned for the Wii U's CPU (out-of-order execution, etc.) requires, at minimum, recompiling for the target processor.
I'm still not convinced that the Wii U's CPU can outperform a current-gen CPU on a typical game workload; its low clock speed and weak floating-point performance likely negate any architectural advantages. But we are getting off topic again. Please try to focus.
Are you sure it is not using the same shading API? If that is true, that would be phenomenally boneheaded. People know HLSL/Cg and their variants; asking developers to learn something completely new, when Nintendo probably could have asked AMD for a functional shader compiler, would be utter madness.
Do we know if all texture data must live inside the eDRAM for the GPU to see it? IIRC something like this was true on GameCube/Wii, but there the texture memory (TMEM) was a hardware-managed cache, so texture management was automatic.
If this is true:
http://www.vgleaks.com/wii-u-memory-map/
Then textures are clearly stored in main memory, but it's possible there is still a GPU limitation requiring them to be copied into MEM1 before sampling. Perhaps the graphics libraries take over all of MEM1 precisely because they implement a transparent texture cache there. That would not be optimal in all cases, but the alternative would be PS2-style manual texture management, which would probably mean a lot of work for anyone porting an engine over. If the leak is accurate, I suspect Nintendo has done something more automagic, since it really seems to imply MEM1 is off-limits to developers...though maybe they will let people manage MEM1 themselves if they are working on a Wii U exclusive.
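To make the "transparent cache" idea concrete, here is a minimal sketch in C of how a graphics library could stage textures into MEM1 behind the game's back. Every name in it (mem1_alloc, dma_copy, bind_texture, and so on) is invented for illustration; this is just the general shape such a scheme could take, not anything from Nintendo's actual SDK:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Stand-ins for a MEM1 (eDRAM) allocator and a GPU DMA copy.
 * All hypothetical -- real code would carve from the 32 MB
 * eDRAM pool and kick an asynchronous DMA instead. */
#define mem1_alloc  malloc
#define mem1_free   free
#define dma_copy    memcpy

typedef struct Texture {
    size_t          size;
    void           *mem2_addr;  /* master copy in main memory (MEM2)      */
    void           *mem1_addr;  /* resident copy in eDRAM; NULL if absent */
    uint64_t        last_use;   /* timestamp for LRU eviction             */
    struct Texture *next;       /* intrusive list of resident textures    */
} Texture;

static Texture *resident;  /* textures currently staged in MEM1 */
static uint64_t tick;

/* Evict the least-recently-used resident texture to free eDRAM. */
static void evict_lru(void)
{
    Texture **p, **lru = NULL;
    for (p = &resident; *p; p = &(*p)->next)
        if (!lru || (*p)->last_use < (*lru)->last_use)
            lru = p;
    if (lru) {
        Texture *t = *lru;
        *lru = t->next;
        mem1_free(t->mem1_addr);
        t->mem1_addr = NULL;
    }
}

/* The game binds textures as if they lived in main memory; the
 * library stages them into MEM1 behind the caller's back. */
void bind_texture(Texture *tex)
{
    tex->last_use = ++tick;
    if (!tex->mem1_addr) {
        void *dst = mem1_alloc(tex->size);
        while (!dst && resident) {   /* cache miss: make room in eDRAM */
            evict_lru();
            dst = mem1_alloc(tex->size);
        }
        if (!dst)
            return;                  /* texture larger than all of MEM1 */
        dma_copy(dst, tex->mem2_addr, tex->size);
        tex->mem1_addr = dst;
        tex->next = resident;
        resident  = tex;
    }
    /* ...then point the GPU's sampler at tex->mem1_addr... */
}
```

The weakness the sketch makes obvious is the miss path: a frame that touches more texture data than the eDRAM holds will thrash on synchronous copies, which is exactly where hand-managed MEM1 (prefetching ahead of use) would win.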
Of course, if something like MEM1 management is automatic, that would mean the amount of code that needs to be rewritten is relatively minor. But maybe you are right: you'd still have to tweak things to optimize for the cache, even if the details were hidden from you.
I don't know if I would call the PS3 and 360 "similar." They are similar in some ways and not in others. IMHO the Wii U is closer to the 360 than the PS3 is.
My understanding was that the main difference for Skyrim was memory: the PS3 splits its 512 MB into separate 256 MB system and 256 MB video pools, while the 360's 512 MB is unified, so the PS3 version had somewhat less memory to work with and ran out sooner.
I don't really know what happened with Bayonetta; most likely the engine wasn't parallelized in a way that mapped well onto the SPUs, and they targeted the 360 GPU as a baseline. (Didn't another team handle the PS3 port?)
I am not so optimistic about ports from Xbox One to the Wii U. Things on the GPU side might not be so ugly as long as the min-spec PC version is low enough, but the Xbox One CPU is probably at least 2-3x faster than the Wii U's all told. Moving to the Wii U might bottleneck gameplay code, which doesn't scale nicely (and IMHO there is already evidence of this bottlenecking in current-gen ports). We went through this discussion in the CPU thread; I suspect the Wii U will get either completely different games ("Call of Duty: Gaiden") or nothing at all once the PS3 and 360 stop getting cross-gen games.
You all are gonna hate me for saying this, but perversely enough the Wii U's best hope might be for the Vita to succeed; together they might represent a large enough target for 3rd parties to aim for. We saw this last gen, when quite a few 3rd parties lumped the Wii and PSP together.