
Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

DCKing

Member
Wait - I remember I had a discussion with blu before about what would have happened if Nintendo had added programmable shaders to the Wii GPU. He replied that it was simply not possible to have two different pipelines in the graphics setup. TEV + actual shaders was impossible.

I really don't think Nintendo is going to carry anything over from the Wii in this console. What would be the benefit of spending silicon on TEV units when they could use actual unified shaders?
 

IdeaMan

My source is my ass!
Well then we should stop speculating about any news we get... If we disregard the Wii U not performing well in real gaming scenarios* all we have left is some vague statements saying "oh it's not so powerful" or "oh it's quite powerful", and some hardware parameters from some time ago.

I'm just trying to piece the rumours together, and I'm not going to disregard these recent statements because of 'vagueness'. They're the most concrete rumours we have!

* the E3 demos you posted really did not show off the power you claim that they did. There's absolutely no way you can even quantify 'how many times an Xbox 360' those demos were, but even if you could mirroring the graphics from the main screen on the Upad or drawing a picture on the Upad definitely doesn't qualify as '0.5x a 360'.

Oh, I didn't forbid anyone from talking about these statements, my friend :) It's just that people tend to treat them as the whole truth about the Wii U's power, without considering the context, which is important.

As for the E3 demos demonstration, you get the general idea. You can shift some of the "0.Xx" between padlet usage and the "bonus" that comes from more modern technology. But a 3D scene, different from the main screen, at 480p (plus the fact that it's pretty taxing to render that on the same hardware), well, it's something.
 
Ok, let's do this.

Ignore everything written in these speculation threads over the last 10 months.

Now check a video of the Japanese garden demo on Wii U from E3 2011, with, AT THE VERY LEAST, Xbox 360+ content on the main screen plus 480p content on the padlet.

You know as a FACT that this demo was quickly built, ran on the first iterations of the dev kit (we are four dev kit revisions later at the moment), on the first SDKs, etc etc etc.

Check a video of Ghost Recon Online with the UAV drone. You have, at the very least, a current-gen HD game plus different content on the padlet. Again, that was one year ago, on the first dev kits, etc etc etc.

So you have at the very least:

An Xbox 360 on the main screen
A 480p Xbox 360 on the second screen. Let's say 480p + calculating a second view/scene/content on the same hardware is demanding, so it's roughly 0.5x an Xbox 360

1x Xbox 360 + 0.5x Xbox 360 = 1.5x Xbox 360

This was the situation one year ago, with all the elements derived from the very important context that I've described I don't know how many times. Add to that, even in the worst-case scenario, just tweaking here and there: small boosts, optimizations, advancements in title development, graphical polishing of these titles, devs becoming more and more accustomed to the asymmetrical setup (again, a small improvement here), etc etc. All of that, over one year, gives you, let's say, roughly another 0.5x Xbox 360 (through slightly better framerate, better/greater AA/texture filtering, and some things that current-gen GPUs can't handle).

So 1x Xbox 360 + 0.5x Xbox 360 + 0.5x Xbox 360 = 2x Xbox 360, at the very least, considering the two screens. The Wii U isn't merely "on par" technologically, period, and this demonstration has always assumed the worst case.
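The back-of-the-envelope estimate above can be sketched in a few lines of Python. To be clear, every multiplier here is a speculative forum estimate from the post, not a measured number:

```python
# Rough sketch of the worst-case arithmetic above.
# All multipliers are speculative estimates, not measurements.
main_screen = 1.0        # Xbox 360-level content on the TV
padlet_scene = 0.5       # separate 480p scene, counted as ~0.5x a 360
one_year_of_gains = 0.5  # tweaks, optimizations, newer dev kits since E3 2011

total = main_screen + padlet_scene + one_year_of_gains
print(f"Estimated minimum: {total}x Xbox 360")  # -> Estimated minimum: 2.0x Xbox 360
```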


Yes, all of that, PLUS the awesome Zelda demo.
 

StevieP

Banned
Can anyone here get any more information about IBM's "Project Flanker"?

The codename may be related to IBM's CPU for the Wii U.
Some engineers at Infotech Enterprises are working on a 45nm IBM SOI CPU.

Infotech Enterprises is in Seattle, WA - steps away from Redmond (NOA).
Here is an example of "Project Flanker":

http://www.linkedin.com/pub/yu-luan/1b/10a/48
Current ASIC Design Engineer at Infotech Enterprises
Chip Integration Engineer at IBM

ASIC Design Engineer

Infotech Enterprises

Public Company; 5001-10,000 employees; INFOTECENT; Information Technology and Services industry

2010– Present (2 years)

Working on the physical design and timing closure for IBM Cu45HP Flanker project





Chip Integration Engineer

IBM

Public Company; 10,001+ employees; IBM; Information Technology and Services industry

2006– Present (6 years)

Physical Design/Integration Engineer for Cu45HP LSI X1 design. Worked on the physical design and timing closure for the LSI X1 design and floorplanning improvement on the LSI PPC476 processor. Also created and developed the IBM ChipBench and Cadence Marpo power routing for Cu65HP. The power-routing methodology was used across all Cu65HP ASIC projects.
 
Say, I made an offhanded comment essentially along the lines of "80W seems like a reasonable upper limit for a gpu in the Wii U", and somebody pointed out that I didn't really have any basis for such a judgement. They were totally right, given that I wasn't even thinking about the fact that the Wii U is still much smaller than the Xbox 360 or Playstation 3.

Could one of the techheads here give a good rule of thumb and explanation (more or less) for what a good upper limit power draw would be given the internal volume of the system?
 

z0m3le

Banned
Well then we should stop speculating about any news we get... If we disregard the Wii U not performing well in real gaming scenarios* all we have left is some vague statements saying "oh it's not so powerful" or "oh it's quite powerful", and some hardware parameters from some time ago.

I'm just trying to piece the rumours together, and I'm not going to disregard these recent statements because of 'vagueness'. They're the most concrete rumours we have!

* the E3 demos you posted really did not show off the power you claim that they did. There's absolutely no way you can even quantify 'how many times an Xbox 360' those demos were, but even if you could mirroring the graphics from the main screen on the Upad or drawing a picture on the Upad definitely doesn't qualify as '0.5x a 360'.

Well if you believe that the demos ran at exactly 360 level:

720p content on the tv screen = 921600 pixels (assuming this is what the 360 can do)
480p content on the Upad screen = 409920 pixels (assuming 854x480)

That is a 44.47% increase in pixels, which should be directly connected to performance. Without even getting into the performance drain of driving multiple displays, you have ~1.5x a 360. Unless you think that the 360 could outperform that tech demo, I'd say this is baseline speculation that should be considered whenever talking about the system.
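The pixel arithmetic above can be checked in a couple of lines of Python. The resolutions are the post's assumptions (720p on the TV, 854x480 on the pad), not confirmed specs:

```python
# Pixel counts behind the "~1.5x" estimate above.
tv_pixels = 1280 * 720   # 921,600 (assumed 720p output)
pad_pixels = 854 * 480   # 409,920 (assumed pad resolution)

increase = pad_pixels / tv_pixels
print(f"Extra pixels: {increase:.2%}")  # -> Extra pixels: 44.48% (the post truncates to 44.47%)
```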
 

AzaK

Member
That's where the hypothetical "shortcut" (fixed function) part would come into play. It's probably kinda slow when running standard shaders, but could achieve visible results typically requiring a lot more raw horsepower if developers use Nintendo's proprietary extensions.

If this is the case I would have thought words sounding like Puck Hoo would come from Epic no?

Would the big engine makers really add support for fixed function into their pipelines?
Would Fixed function really increase performance sufficiently?
Wouldn't most devs find it a pain to add to their engines?
And those devs that don't add fixed function support will get ports running hardly better than current gen.

It all just seems counter to what Nintendo wants to achieve: an easy-to-port-to machine.
 
Can anyone here get any more information about IBM's "Project Flanker"?

The codename may be related to IBM's CPU for the Wii U.
Some engineers at Infotech Enterprises are working on a 45nm IBM SOI CPU.

Infotech Enterprises is in Seattle, WA - steps away from Redmond (NOA).
Here is an example of "Project Flanker":

http://www.linkedin.com/pub/yu-luan/1b/10a/48

Oooh this is interesting!

What a fucking weird codename though. Unless it'll 'flank' the opposition sneakily!

No-one expects the Wii U to be an octa-cored 3.6 GhZ beast!
 

z0m3le

Banned
I'd say these days, as long as Epic and maybe Crytek are cool with Nintendo's custom extensions, they'd be fine. It's even possible companies like Epic had input there as well - instead of designing an engine around a GPU, they'd design a GPU around an engine. The audio algorithms embedded in the Gamecube GPU were in part developed by Factor 5 for example, and they also did the MusyX audio middleware for the system.
Doesn't this go against having Darksiders 2 up and running on v1 devkits in 5 weeks with a small group of people (under 10)? Seems that having a modern shader unit has a lot more benefits when it comes to third party developers.
 

wsippel

Banned
Can anyone here get any more information about IBM's "Project Flanker"?

The codename may be related to IBM's CPU for the Wii U.
Some engineers at Infotech Enterprises are working on a 45nm IBM SOI CPU.

Infotech Enterprises is in Seattle, WA - steps away from Redmond (NOA).
Here is an example of "Project Flanker":

http://www.linkedin.com/pub/yu-luan/1b/10a/48
Interesting. From Infotech's website:

With the average life span of a game shrinking drastically, gamers look for the newest and the hottest machines each day. Be it the Casino floor or the Console segment, Infotech integrates technology and innovation to create a better experience for your customers at a pace that ensures you are always ahead in the bigger game.

Service offering

Gaming consoles

Product Design and development
User Interface and multimedia graphics design
Gaming algorithm and application development
Access control application
Testing & Validation
http://www.infotech-enterprises.com/industries/consumer/gaming-equipment/
 

DCKing

Member
Well if you believe that the demos ran at exactly 360 level:

720p content on the tv screen = 921600 pixels (assuming this is what the 360 can do)
480p content on the Upad screen = 409920 pixels (assuming 854x480)

That is a 44.47% increase in pixels, which should be directly connected to performance. Without even getting into the performance drain of driving multiple displays, you have ~1.5x a 360. Unless you think that the 360 could outperform that tech demo, I'd say this is baseline speculation that should be considered whenever talking about the system.
No. None of the demos we've actually seen do anything meaningful with the Upad. Drawing a map on the Upad or mirroring the buffer that was already rendered on the main screen is a very small operation in the grand scheme of things. The E3 demos did not really showcase the Wii U power at all (and they couldn't have since it was the old problematic devkit).

Furthermore, rendering a different scene on the Upad takes roughly 0% extra CPU power, almost 50% more pixel fillrate, and possibly 100% more vertex processing in an unoptimized scenario. If we assume the Xbox 360 is the baseline here, then that's very unimpressive. Of course, this is above being on par with the Xbox 360, but we're not arguing that. Whatever the case, if the Wii U turns out to be only able to run Xbox 360 games on the main screen and a smaller different viewpoint on the Upad (it likely won't, but let's pretend for now), then that is seriously unimpressive. That's why I consider these statements bad news.
 

wsippel

Banned
Doesn't this go against having Darksiders 2 up and running on v1 devkits in 5 weeks with a small group of people (under 10)? Seems that having a modern shader unit has a lot more benefits when it comes to third party developers.
It would be an extension, not a replacement. This isn't about TEV, it's about a proprietary fork of SM4.1 developed in parallel with SM5.
 
I don't know if these were taken from E3 2011, but Nintendo used Panasonic sets there, which generally tend to default to showing 95% of the image. It's probably just a case of Nintendo not bothering to change the settings in the aspect adjustment menu.

Interesting.

Didn't know that.
 

z0m3le

Banned
No. None of the demos we've actually seen do anything meaningful with the Upad. Drawing a map on the Upad or mirroring the buffer that was already rendered on the main screen is a very small operation in the grand scheme of things. The E3 demos did not really showcase the Wii U power at all (and they couldn't have since it was the old problematic devkit).

Furthermore, rendering a different scene on the Upad takes roughly 0% extra CPU power, almost 50% more pixel fillrate, and possibly 100% more vertex processing in an unoptimized scenario. If we assume the Xbox 360 is the baseline here, then that's very unimpressive. Of course, this is above being on par with the Xbox 360, but we're not arguing that. Whatever the case, if the Wii U turns out to be only able to run Xbox 360 games on the main screen and a smaller different viewpoint on the Upad (it likely won't, but let's pretend for now) then that is seriously unimpressive. That's why I consider these statements bad news.

The bird demo did offer different viewpoints on the Upad. If you watch the video again, those scenes don't change much while a ton of stuff goes on on the TV screen, but it's still rendering the scene twice. My entire quote is pretty accurate: it's doing at least 1.5x what the 360 can do in that demo.

IdeaMan's post logically assumed that the newest version of the devkits is able to do at least 0.5x more; that only means it could display to a second tablet as well. These aren't wild speculations. I think this thread has finally hit illogical pessimism.
 
Can anyone here get any more information about IBM's "Project Flanker"?

The codename may be related to IBM's CPU for the Wii U.
Some engineers at Infotech Enterprises are working on a 45nm IBM SOI CPU.

Infotech Enterprises is in Seattle, WA - steps away from Redmond (NOA).
Here is an example of "Project Flanker":

http://www.linkedin.com/pub/yu-luan/1b/10a/48

Very interesting! I love the name Project Flanker. BTW, off-topic: Flanker is the NATO reporting name of the Soviet/Russian Su-27 fighter that rivals the U.S. F-15 Eagle.

I wonder if Project Flanker is in fact the codename for the Wii U CPU.
 

Donnie

Member
I consider that a possibility, yes. Cu45-HP CPU manufactured at the East Fishkill fab, Cu32 GPU manufactured at the new Fab 8 (initial production for devkits in East Fishkill).




I believe it was 400 or 450MHz (it's been a while), and it was still overheating when actually pushed. And yes, that was the AMD GPU. The Nintendo GPU wasn't done until many months later.

If it was still overheating when pushed, then why wouldn't Nintendo just downclock it more? The only explanation would be that, if it were downclocked any further, performance wouldn't be close to that of the final GPU.

A 4830 clocked at 400MHz is still 512 GFLOPS, or 576 GFLOPS at 450MHz.

What exactly have you heard to make you suddenly think the Wii U GPU is going to be a 300 GFLOPS GPU? I mean, that wouldn't even be enough to run a 360 game plus a second complex scene on the controller.
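Those figures fall straight out of the HD 4830's public specs: 640 stream processors, each doing a multiply-add (2 FLOPs) per cycle. A quick Python check:

```python
# How the GFLOPS figures above derive from the HD 4830's specs:
# 640 stream processors, 2 FLOPs each per cycle (multiply-add).
stream_processors = 640
flops_per_cycle = 2

def gflops(clock_mhz):
    """Peak single-precision throughput in GFLOPS at a given core clock."""
    return stream_processors * flops_per_cycle * clock_mhz / 1000

print(gflops(400))  # -> 512.0
print(gflops(450))  # -> 576.0
```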
 

z0m3le

Banned
It would be an extension, not a replacement. This isn't about TEV, it's about a proprietary fork of SM4.1 developed in parallel with SM5.

Ok, as long as we are talking about a unified shader unit, that is what the 3DS is built around after all, and I can't see Nintendo taking a step forward on their hardware and then 2 steps back on their console.
 

DCKing

Member
I think this thread has finally hit illogical pessimism.
What? We're both agreeing here. 1.5x an Xbox 360 just isn't that impressive, and for the most part of the Wii U speculation we thought it was better than that. (Also the 3DS does not have unified shaders, but you're right that its shaders are very different to what's found in the Wii)
 

z0m3le

Banned
What? We're both agreeing here. 1.5x an Xbox 360 just isn't that impressive, and for the most part of the Wii U speculation we thought it was better than that.

OK, maybe it was just the big "No" at the beginning of your last post. As for 2x the 360, that is just a minimum; I fully expect it to be 3x+ in the GPU, but with some of the hardware parts roughly equal to the 360.

Simply more RAM, more GFLOPS, and a better, faster multithreaded 3-core CPU would give us a nice bump in graphics.

By the way, I think the end result of next gen will, in the worst case, be that the Wii U is roughly 3x PS360 while the other consoles are closer to 9x.

That simply puts the Wii U in the same position that the PS360 are about to find themselves in.
 

ElFly

Member
Wait - I remember I had a discussion with blu before about what would have happened if Nintendo had added programmable shaders to the Wii GPU. He replied that it was simply not possible to have two different pipelines in the graphics setup. TEV + actual shaders was impossible.

I really don't think Nintendo is going to carry anything over from the Wii in this console. What would be the benefit of spending silicon on TEV units when they could use actual unified shaders?

They are just gonna have a translation layer. If Dolphin can reverse engineer it, surely Nintendo and the ATI guys can emulate it perfectly.
 

DCKing

Member
It will be, what exactly has changed to make you believe otherwise?
I don't believe anything really, there's really nothing clear to make out of these conflicting reports. However, assuming a Wii U that is less powerful than what we thought seems to make the most sense to explain what is actually running on the Wii U, going by what Vigil and IdeaMan said. People seemed to think I was arguing that it was on par with the 360, which is nonsense.
 

Mitsurugi

Neo Member
Well if you believe that the demos ran at exactly 360 level:

720p content on the tv screen = 921600 pixels (assuming this is what the 360 can do)
480p content on the Upad screen = 409920 pixels (assuming 854x480)

That is a 44.47% increase in pixels, which should be directly connected to performance. Without even getting into the performance drain of driving multiple displays, you have ~1.5x a 360. Unless you think that the 360 could outperform that tech demo, I'd say this is baseline speculation that should be considered whenever talking about the system.

Correct me if I'm wrong: in a local multi-player game like COD, the Wii U would have to process two separate and potentially very dissimilar-looking 720p/60fps video streams (in this instance, the first player could be on the ground in a tank and the second player could be in a chopper providing air support), send one directly to the HDTV, downscale the other to 480p, and send it to the controller. Right?

So at the very least, when tasked with running a port, the Wii U is twice as powerful as a 360. Even more so if you take into account the fact that COD is 600p on the 360 and that the Wii U may be able to handle two 1080p/30fps streams.
 

wsippel

Banned
Ok, as long as we are talking about a unified shader unit, that is what the 3DS is built around after all, and I can't see Nintendo taking a step forward on their hardware and then 2 steps back on their console.
3DS isn't using unified shaders; there are no pixel shaders at all. Maestro is fixed function.
 

Donnie

Member
I don't believe anything really, there's really nothing clear to make out of these conflicting reports. However, assuming a Wii U that is less powerful than what we thought seems to make the most sense to explain what is actually running on the Wii U, going by what Vigil and IdeaMan said. People seemed to think I was arguing that it was on par with the 360, which is nonsense.

I've always thought it would be about 3 times 360, as far as graphics processing and RAM. The vague comments from Vigil haven't had any effect on that view and Ideaman's comments have been pretty much in line with what I was expecting AFAICS.
 
They are just gonna have a translation layer. If dolphin can reverse engineer it, surely Nintendo and the ATI guys can emulate it perfectly.

Doesn't Dolphin require a prohibitive level of performance for its occasionally poor level of emulation? And haven't there been some emulator issues even for twenty-year old systems on the VC?
 

z0m3le

Banned
3DS isn't using unified shaders at all; there are no pixel shaders. Maestro is fixed function.

That I didn't know. When I originally looked at the hardware in the 3DS, I saw the PICA200, or maybe it was the 400 (though I think it ended up custom). Since I'm not as familiar with mobile graphics units, I just assumed from the developer comments about it being modern hardware that they were talking about unified shaders, and that was reinforced by a comment made a page or two back... anyways, thanks for the info.

Doesn't Dolphin require a prohibitive level of performance for its occasionally poor level of emulation? And haven't there been some emulator issues even for twenty-year old systems on the VC?
That has a lot to do with the console using an IBM processor. Still, yes, the GPU and bandwidth are problems, but Nintendo could emulate the system much like they did with the N64 Zelda games on GameCube.
 

ElFly

Member
Doesn't Dolphin require a prohibitive level of performance for its occasionally poor level of emulation? And haven't there been some emulator issues even for twenty-year old systems on the VC?

Dunno, but their minimum requirements mention a "Radeon 2600 Pro or Better", which is an R600, one generation older than the rumored R700 the Wii U would be based on.

Besides, a lot of the performance cost comes from emulating the PowerPC CPU, and that'd be easier for IBM.
 

Christine

Member
I'd say these days, as long as Epic and maybe Crytek are cool with Nintendo's custom extensions, they'd be fine. It's even possible companies like Epic had input there as well - instead of designing an engine around a GPU, they'd design a GPU around an engine. The audio algorithms embedded in the Gamecube GPU were in part developed by Factor 5 for example, and they also did the MusyX audio middleware for the system.

This is very weird, but I haven't heard many other good explanations for the length of time AMD is reputed to have worked on the design process.
 

z0m3le

Banned
This is very weird, but I haven't heard many other good explanations for the length of time AMD is reputed to have worked on the design process.

How about: it takes years to design a GPU. If they started in 2009, that means it took them a little under three years to finish and have chips ready for fall 2012.
 

neeksleep

Member
Does anyone know how good the speakers are on the U-pad? Would it be asking too much to expect at least cell phone speaker quality?

Yep, flanker doesn't rhyme with anything!

On a side note, it's more mature than 'Dolphin' and 'Flipper'
Now what will the GPU be called? Blitzkrieg? :p

Wanker =I
 

Christine

Member
How about: it takes years to design a GPU. If they started in 2009, that means it took them a little under three years to finish and have chips ready for fall 2012.

It shouldn't take years to come up with a specific configuration of highly modular components that were being released in a wide variety of production profiles at the time, in any case.
 

Nibel

Member
Rösti;36473088 said:
IdeaMan mentioned Zen Studios in post #10837 about an interview regarding their work with the eShop (on Nintendo 3DS). Zen Studios has now been added to the category "Digital Download/Gaming On Demand - Nintendo Wii U" of the E3 exhibitor list.

http://www.zenstudios.com

Source: http://www.mapyourshow.com/shows/index.cfm?Show_ID=E312

Nothing big, but I thought I should mention it.

Very cool. Thanks Rösti!
I hope that the DD roster of the Wii U will have some surprises for us. :)

And hey, if we give the GPU a German name, then we pick Project Salatgurke!
 