
Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

Status
Not open for further replies.

DCKing

Member
That said, 1.5GHz may have been a bit too low. 2.0GHz should be fine.
Your spec sheet is quite in line with what I'm expecting, though with 1.5 GHz you're probably underestimating it. Even though it's an out-of-order architecture, it has been designed to run at fairly high clock speeds because of its long pipeline. I don't expect it to be much of a problem to run a tri-core POWER7 customization at 2.5+ GHz frequencies when they reduce chip complexity.

A consideration Nintendo needs to make here is that a lot of the 360's power comes from its specialized VMX128 units. Despite being in-order, the 360 has a lot of code written for it that assumes three 3.2 GHz SIMD units calculating stuff in an in-order way. Nintendo will probably have a similar unit in their chip, but it won't be able to match the 360 if the chip is clocked that slowly. Basically, they need better VMX units, or they need a chip running at high frequencies.
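For what it's worth, the clock argument is just throughput arithmetic. A rough Python sketch (the Xenon figures are the commonly cited ones; the 2.0 GHz alternative is purely hypothetical, for comparison only):

```python
# Peak SIMD throughput ~ cores * SIMD units/core * vector lanes * flops/lane * clock.
# Xenon numbers are the commonly cited ones (3 cores, one VMX128 unit each,
# 4-wide single precision, fused multiply-add = 2 flops/lane).
def peak_simd_gflops(cores, units_per_core, lanes, flops_per_lane, ghz):
    return cores * units_per_core * lanes * flops_per_lane * ghz

xenon = peak_simd_gflops(3, 1, 4, 2, 3.2)         # 76.8 GFLOPS peak
hypothetical = peak_simd_gflops(3, 1, 4, 2, 2.0)  # 48.0 GFLOPS peak (made-up clock)

# At the same unit count and width, the 2.0 GHz part reaches only 62.5%
# of the 3.2 GHz part's peak, hence "better VMX units or higher clocks".
print(xenon, hypothetical, hypothetical / xenon)
```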
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
So which came first? The chi.. Gekko or the 750CL?
Nah, that's an easy one - Gekko ; )

Yeah I remember wsippel saying that back at the beginning of the thread. Oh and that was an interesting find about configurable threads. But if two threads per core is true, why drop to that?
Maybe somebody read in the manual about some momentary state of the CPU and decided that that's how things are? Or perhaps Nintendo did give up on 2 out of the 4 thread contexts per core *shrug* Apparently I'm wild-guessing here.
 

sfried

Member
Your spec sheet is quite in line with what I'm expecting, though with 1.5 GHz you're probably underestimating it. Even though it's an out-of-order architecture, it has been designed to run at fairly high clock speeds because of its long pipeline. I don't expect it to be much of a problem to run a tri-core POWER7 customization at 2.5+ GHz frequencies when they reduce chip complexity.

A consideration Nintendo needs to make here is that a lot of the 360's power comes from its specialized VMX128 units. Despite being in-order, the 360 has a lot of code written for it that assumes three 3.2 GHz SIMD units calculating stuff in an in-order way. Nintendo will probably have a similar unit in their chip, but it won't be able to match the 360 if the chip is clocked that slowly. Basically, they need better VMX units, or they need a chip running at high frequencies.
What about the 1T-SRAM cache (was it for L1 and L2)? Also, aren't there more advantages to an out-of-order architecture? If the DS (and the 3DS, for that matter) is anything to go by...
 

BurntPork

Banned
Your spec sheet is quite in line with what I'm expecting, though with 1.5 GHz you're probably underestimating it. Even though it's an out-of-order architecture, it has been designed to run at fairly high clock speeds because of its long pipeline. I don't expect it to be much of a problem to run a tri-core POWER7 customization at 2.5+ GHz frequencies when they reduce chip complexity.

A consideration Nintendo needs to make here is that a lot of the 360's power comes from its specialized VMX128 units. Despite being in-order, the 360 has a lot of code written for it that assumes three 3.2 GHz SIMD units calculating stuff in an in-order way. Nintendo will probably have a similar unit in their chip, but it won't be able to match the 360 if the chip is clocked that slowly. Basically, they need better VMX units, or they need a chip running at high frequencies.

But can the Wii U even handle the heat and power consumption of higher frequencies? I have serious doubts about that.

The minimum frequency for a stock POWER7 is 2.4GHz. IMO, in Wii U's case (lol, double meaning), that would actually be the absolute highest speed we have any hope of seeing.
 

MDX

Member
I'm still of the opinion that it may resemble a Xenon in that it puts together modified cores on their own die with a pool of eDRAM for L2 cache. So I could see the L2 cache being "trimmed" from the core. Considering IBM's process achieves a density of around 1MB per 2mm^2 and looking at how large Xenon's 1MB of L2 cache is compared to its cores, I would think they could place a pretty large amount in a shared pool like Xenon's.
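Taking the density figure above at face value (about 1 MB of eDRAM per 2 mm² on IBM's process), the die area of a shared pool is easy to estimate. The pool sizes below are illustrative only, not a spec claim:

```python
# IBM's cited eDRAM density: roughly 1 MB per 2 mm^2 of die area.
MM2_PER_MB = 2.0

def edram_area_mm2(megabytes):
    return megabytes * MM2_PER_MB

# Illustrative pool sizes only.
for mb in (1, 8, 16, 32):
    print(f"{mb:2d} MB -> {edram_area_mm2(mb):4.0f} mm^2")
```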


http://tec20ten.files.wordpress.com/2010/02/p7chip.jpg

4-way SMT per core vs 2-way on POWER6 results in a net 32 threads/chip. This is twice the threads per core and 8× the threads per chip.
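The thread math in that quote checks out; threads per chip is just cores times SMT ways:

```python
# Threads per chip = cores * SMT ways per core.
def threads_per_chip(cores, smt_ways):
    return cores * smt_ways

power6 = threads_per_chip(2, 2)  # POWER6: 2 cores, 2-way SMT -> 4 threads
power7 = threads_per_chip(8, 4)  # POWER7: 8 cores, 4-way SMT -> 32 threads

print(power7)            # 32 threads/chip
print(power7 // power6)  # 8x the threads per chip
```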


If Nintendo goes for the 4 core variant:

POWER7 has 8 cores per chip vs 2 on POWER6. It can also run with 6 or 4 cores in “turbo” mode. Turbo shuts down 4 of the 8 cores and increases the clock speed while sharing the L3 cache, slicing the “pie” amongst fewer cores (more cache per core).

Is Nintendo looking at this "Turbo" mode?

Power 780 / 795 TurboCore chip:
TurboCore chips: 4 available cores
Aggregation of the L3 caches of unused cores
TurboCore chips have 2× the L3 cache per chip available (L3 = 32 MB)
Performance gain over POWER6: up to 1.5× core-to-core

You may need to do a double take on these facts: how can per-core performance improve while clock speeds are reduced? Simply, lower latencies and twice the threads per core.

First, this is impacted by lowering the number of transistors/per core and second by using sensible clock speeds. If you keep in mind that the energy use increases at log scale as processors reach maximum frequency, the solution to “turn down” clock speeds in order to save power makes sense and also sets the stage for “turbo mode” which we described.
http://tec20ten.wordpress.com/2010/02/08/power-7-roundup/
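The "turn down the clock" argument in that roundup follows from the standard dynamic-power relation P = C·V²·f: since voltage usually has to rise with frequency, power grows much faster than linearly near the top of the clock range. A sketch with made-up constants (none of these are real POWER7 figures):

```python
# Dynamic CMOS power: P = C * V^2 * f. Voltage typically rises with
# frequency, so power grows superlinearly near the maximum clock.
# All constants below are made up for illustration.
def dynamic_power(capacitance, volts, ghz):
    return capacitance * volts ** 2 * ghz

p_fast = dynamic_power(1.0, 1.2, 4.0)  # pushed clock, higher voltage
p_slow = dynamic_power(1.0, 1.0, 3.0)  # "turned down" clock and voltage

# In this sketch, 1.33x the clock costs ~1.92x the power.
print(p_fast / p_slow)
```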


Maybe what Nintendo did was simply offer a three-core version of Broadway for their dev kits?

Crytek co-founder Avni Yerli said he is impressed with the Wii U tech specs, and claimed that the team at Crytek UK are getting to grips with the hardware.
"The Wii U specs are very good," Yerli told GamesIndustry.biz

The system’s unique interface is however “a challenge for designers”, Yerli added.
“But once thought through it can add value, and that's what ultimately important. Our guys in Nottingham (Crytek UK) are very happy with their tests on the dev kits and they're excited about it."
http://www.develop-online.net/news/38667/Wii-U-passes-the-Crytek-test
 

v1oz

Member
...And peaks at 400MPix/s of theoretical physical fillrate (200MHz * 2 ROPs) for the case where its unified shader architecture does nothing but the lightest imaginable pixel shading (basically nothing more than a single texture). Unfortunately, in more realistic scenarios the same shader units also have vertex work to do, so there goes your fillrate before you'd even start doing any fancy shading work (at a 25/75 split of vs/ps your max theoretical fillrate has already dropped to 300MPix/s). So the 535 has to rely on its HSR to save the day. Unfortunately, TBDR's HSR advantage vanishes when fillrate is needed the most, e.g. in situations with heavy translucency overdraw like particles, etc. In contrast, Hollywood has some 1GPix/s of absolutely sustainable physical fillrate (with one texture and vertex color), and a separate TnL unit to boot. And Hollywood has an embedded fb with hefty BW plus Early-Z HSR (not exactly as efficient as TBDR at opaque overdraw, but still a very snappy HSR tool when used right), which does counter TBDR's traditional advantages over IMRs.

As I said, the Wii is decisively better than the ipad in most game-related scenarios.
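The fillrate numbers in the quote above can be reproduced directly: peak fill is clock times ROPs, and on a unified-shader part only the pixel-shader share of ALU time feeds the ROPs:

```python
# Theoretical fillrate = core clock (MHz) * ROPs, in MPix/s.
def peak_fillrate_mpix(clock_mhz, rops):
    return clock_mhz * rops

# On a unified-shader GPU, vertex work steals ALU time from pixel shading,
# so only the pixel-shader share of the clock feeds the ROPs.
def effective_fillrate_mpix(clock_mhz, rops, ps_share):
    return peak_fillrate_mpix(clock_mhz, rops) * ps_share

sgx535_peak = peak_fillrate_mpix(200, 2)              # 400 MPix/s
sgx535_25_75 = effective_fillrate_mpix(200, 2, 0.75)  # 300 MPix/s at a 25/75 vs/ps split

print(sgx535_peak, sgx535_25_75)
```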

Unfortunately we are yet to see it in games. Games with the visual fidelity of Real Racing 2, Infinity Blade and Rage are impossible on the Wii. And the iPad has the pixel pushing power to output 720p. I believe the little PowerVR cores in iOS devices trump Broadway in every measurable way.

...
 

wsippel

Banned
For one, you can immediately scratch out the DFU (decimal floating-point unit).
Yep, the DFPU would go. Waste of die space and money, anyway. And again, one VMX and two dual-pipeline VSX units makes no sense either. Best case: DFPU and VMX are out, ten execution units per core.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
wsippel said:
Yep, the DFPU would go. Waste of die space and money, anyway. And again, one VMX and two dual-pipeline VSX units makes no sense either. Best case: DFPU and VMX are out, ten execution units per core.
The VMX might need to stay, though, for 360 ports. Instead, several of those DP-tuned fp units can go.

Unfortunately we are yet to see it in games. Games with the visual fidelity of Real Racing 2, Infinity Blade and Rage are impossible on the Wii. And the iPad has the pixel pushing power to output 720p. I believe the little PowerVR cores in iOS devices trump Broadway in every measurable way.
If you actually paid attention, I was referring to the sgx535 alone throughout that paragraph. But you are absolutely entitled to your beliefs, and I respect that.
 

DCKing

Member
But can the Wii U even handle the heat and power consumption of higher frequencies? I have serious doubts about that.
I don't know that. But I'd guess that with a lot of simplifications compared to the server chip, they could maintain a decent clock speed at low power usage. The architecture doesn't seem to be designed to run at sub-2 GHz speeds anyway, so I doubt that will happen.
 

BurntPork

Banned
I don't know that. But I'd guess that with a lot of simplifications compared to the server chip, they could maintain a decent clock speed at low power usage. The architecture doesn't seem to be designed to run at sub-2 GHz speeds anyway, so I doubt that will happen.

Sucks that we won't know the answer to this question until a few months after launch. :lol
 
Unfortunately we are yet to see it in games. Games with the visual fidelity of Real Racing 2, Infinity Blade and Rage are impossible on the Wii. And the iPad has the pixel pushing power to output 720p. I believe the little PowerVR cores in iOS devices trump Broadway in every measurable way.

...

The iPad's advantage over the Wii is its ability to use modern shaders. The Wii was also unable to do 720p due to design, not raw power.
 
Nah, that's an easy one - Gekko ; )


Maybe somebody read in the manual about some momentary state of the CPU and decided that that's how things are? Or perhaps Nintendo did give up on 2 out of the 4 thread contexts per core *shrug* Apparently I'm wild-guessing here.

Ok. I was just wondering if there may be some power-saving benefits or something. But back to the 750CL: I can believe that, since all this stuff was probably being developed concurrently anyway.


http://tec20ten.files.wordpress.com/2010/02/p7chip.jpg



If Nintendo goes for the 4 core variant:



Is Nintendo looking at this "Turbo" mode?

Power 780 / 795 TurboCore Chip:





http://tec20ten.wordpress.com/2010/02/08/power-7-roundup/


Maybe what Nintendo did was simply offer a three-core version of Broadway for their dev kits?


http://www.develop-online.net/news/38667/Wii-U-passes-the-Crytek-test

I guess you forgot (or didn't know) this thread was originally a POWER7-based speculation thread. Could have saved you the trouble of those searches. :)

Way back when, wsippel said the TurboCore feature was unnecessary.

Still seems like a waste of die space to use an actual POWER7 even if yields are a concern.
 
I just had this thought after reading BP's revised guesstimate,
but what if Nintendo's solution to the storage problem isn't just cloud storage for saves and things of that sort, but an OnLive approach to the VC and Wii U Ware?

Where instead of storing all your games locally, you could launch them from the cloud and play them on any Wii U you can log into your account from, streaming live from the internet.

I'm sure I'm not the only one to have that idea, but I don't remember seeing it posted before.
 
It's funny looking back at the posts back then. BP's second personality hadn't visited yet so he was pretty rational in his posts. ;)

I just had this thought after reading BP's revised guesstimate,
but what if Nintendo's solution to the storage problem isn't just cloud storage for saves and things of that sort, but an OnLive approach to the VC and Wii U Ware?

I'm sure I'm not the only one to have that idea, but I don't remember seeing it posted before.

I know very little about OnLive, but I've seen several mentions about latency issues. I doubt Nintendo would try something like that if latency is an issue.

Will it have Mode7?

Yes. There is also a special LSI that Sega helped design for blast processing.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
But back to the 750CL: I can believe that, since all this stuff was probably being developed concurrently anyway.
Oh, it seems like it most definitely was. The ppc7xx / G3 generation was an amazing one. Though my favorite one will always remain the G2 - just the striking simplicity of the 603e design, reflecting the unadulterated elegance of the ISA.

/grumpy old man
 
I know very little about OnLive, but I've seen several mentions about latency issues. I doubt Nintendo would try something like that if latency is an issue.

Considering that all of the VC library and probably the majority of the WiiUware library will fit in the system RAM (with ease, in most cases), it doesn't sound so far-fetched.
 
^ I wonder why people would say the OnLive games have problems with latency then?

Oh, it seems like it most definitely was. The ppc7xx / G3 generation was an amazing one. Though my favorite one will always remain the G2 - just the striking simplicity of the 603e design, reflecting the unadulterated elegance of the ISA.

/grumpy old man

LOL. This post needs more complaining about "new-fangled technology".

And looking back at the part you quoted, I can see why you responded the way you did. I should have been clearer that I was referring to your other post about the "off-the-shelf POWER7", and that, comparing it to the 750CL, I agree we could see the same thing with the Wii U CPU, since it would have been developed almost concurrently with POWER7 based on the timeline given to wsippel.
 

BurntPork

Banned
It's funny looking back at the posts back then. BP's second personality hadn't visited yet so he was pretty rational in his posts. ;)



I know very little about OnLive, but I've seen several mentions about latency issues. I doubt Nintendo would try something like that if latency is an issue.



Yes. There is also a special LSI that Sega helped design for blast processing.

I don't have a second personality. I just have mild OCD.
 
Whenever I come here I dunno wth is going on lol. So confused. I hope they release some specs. Me, I only judged it on the E3 Zelda trailer and the bird demo, and both looked amazing, so it's already powerful enough.
 
I don't have a second personality. I just have mild OCD.

Nah, I have mild OCD. It bothers me when I make a spelling error or something that gets quoted since I can't fix their quote.

What does A Link To The Past's overworld have to do with this thread?

Because the CPU is really a Tri-Force CPU, not a Tri-core CPU.

Whenever I come here I dunno wth is going on lol. So confused. I hope they release some specs. Me, I only judged it on the E3 Zelda trailer and the bird demo, and both looked amazing, so it's already powerful enough.

I agree that they looked good. I can only imagine what they'll look like on final hardware because what we saw can still be improved on.
 

Azure J

Member
the xbox 3 thread seems to have more solid specs than this old thread :(
damn Nintendo damn u and your secrets

Anyone else feel like this could turn into a real issue? All the MS next console hype could really take interest away from Nintendo if they don't own up and show off something to get the public's eyes soon.
 
Anyone else feel like this could turn into a real issue? All the MS next console hype could really take interest away from Nintendo if they don't own up and show off something to get the public's eyes soon.

The public don't care a year in advance, and they certainly don't care about specs.

When Nintendo comes out at E3 with Mario, Smash Bros, et al, that's when the public will care. They could reveal GPU facts until they're blue in the face, but games are the hypebuilders, not the specs.
 

Shtof

Member
Anyone else feel like this could turn into a real issue? All the MS next console hype could really take interest away from Nintendo if they don't own up and show off something to get the public's eyes soon.

It's bound to happen. That's probably MS's incentive and has been all along.
 

Osuwari

Member
There's a 99% chance the Wii U's specs will end up lower than the next Xbox's, and Nintendo actively avoids console spec pissing matches since they're usually the "weak" system the competition uses to show off.
Also, we never got full specs for the DS, Wii and 3DS, so it doesn't look like that's going to change anytime soon.
 
There's a 99% chance the Wii U's specs will end up lower than the next Xbox's, and Nintendo actively avoids console spec pissing matches since they're usually the "weak" system the competition uses to show off.
Also, we never got full specs for the DS, Wii and 3DS, so it doesn't look like that's going to change anytime soon.

Too many people in this thread seem to have started gaming THIS gen, because I've been seeing this comment way too often. The Wii IS AN EXCEPTION. EVERY other Nintendo console has been technologically comparable to the competition. People need to stop claiming otherwise and ignoring 20+ years of counterexamples.
 
The public don't care a year in advance, and they certainly don't care about specs.

When Nintendo comes out at E3 with Mario, Smash Bros, et al, that's when the public will care. They could reveal GPU facts until they're blue in the face, but games are the hypebuilders, not the specs.

It's bound to happen. That's probably MS's incentive and has been all along.


One is right, one is wrong... but which one?
 
Too many people in this thread seem to have started gaming THIS gen, because I've been seeing this comment way too often. The Wii IS AN EXCEPTION. EVERY other Nintendo console has been technologically comparable to the competition. People need to stop claiming otherwise and ignoring 20+ years of counterexamples.

This.
 
Too many people in this thread seem to have started gaming THIS gen, because I've been seeing this comment way too often. The Wii IS AN EXCEPTION. EVERY other Nintendo console has been technologically comparable to the competition. People need to stop claiming otherwise and ignoring 20+ years of counterexamples.

That's right, but you missed an important point:

Nintendo released the N64 a year and a half, and the SNES two years, after the competition.
 

DCKing

Member
Too many people in this thread seem to have started gaming THIS gen, because I've been seeing this comment way too often. The Wii IS AN EXCEPTION. EVERY other Nintendo console has been technologically comparable to the competition. People need to stop claiming otherwise and ignoring 20+ years of counterexamples.
Although he is clearly wrong in a historical context, you can't deny that Nintendo hasn't released a console with up-to-date internals since 2001. The DS wasn't exactly cutting edge, the 3DS most certainly isn't, and let's not even get started again on the Wii. There's no reason to assume by default that the Wii U will change that, although luckily there's been some positive news.
 
Watch the Wii U be totally different at E3. Looks different, better specs than everyone thought, lots of games shown, online done, new OS, $199 price tag, rainbows, unicorns, Reggie fired, etc...
 
Nintendo was never going to show the games before the Wii's last holiday season.

How long afterward is the question.

I can imagine two scenarios: Nintendo showing something (whether it be a game preview or some info) shortly after the holiday season, say January or February, and following it up with more tidbits, game announcements, etc. through March to May; OR Nintendo showing absolutely nothing until E3.
 
Too many people in this thread seem to have started gaming THIS gen, because I've been seeing this comment way too often. The Wii IS AN EXCEPTION. EVERY other Nintendo console has been technologically comparable to the competition. People need to stop claiming otherwise and ignoring 20+ years of counterexamples.

It bears repeating.

NES (1983) ≈ Master System (1985, was a bit better than NES)
SNES (1990) ≥ Genesis/Mega Drive (1988)
N64 (1996) ≥ PSX/Saturn (1994,1995)
PS2 (2000) ≤ GameCube (2001) ≤ Xbox (also 2001 and cost $100 more than Cube)
Wii (2006) < Xbox 360 (2005) ≈ PS3 (2006)

Long and short of it is: with the singular exception of the Wii, whenever Nintendo has been late to release in a generation, their hardware is comparable to or better than the competition. Whenever they've been early (Famicom/NES), they were not royally outclassed by the competition until the next generation began.

The Wii was a gamble that bucked previous Nintendo trends, and it paid off for several years. The question is: is this the approach they will take again? Hopefully the 3rd party influence they're supposedly listening to tells them not to be too tight on the purse strings for horsepower.

EDIT: DCKing: The 3DS may not be "cutting edge" from a raw specs perspective but you can practically count on one hand the number of commercial devices using an autostereoscopic screen.
 
It's gonna be hilarious when Reggie does eventually retire and the new NOA head is in every way so much more fiscally conservative that NCL turtles into Japan almost completely.

Because the economy isn't getting better within the next generation.
 
Although he is clearly wrong in a historical context, you can't deny that Nintendo hasn't released a console with up-to-date internals since 2001. The DS wasn't exactly cutting edge, the 3DS most certainly isn't, and let's not even get started again on the Wii. There's no reason to assume by default that the Wii U will change that, although luckily there's been some positive news.

Yeah, but their handhelds have followed consistent jumps in power. The PSP was especially powerful because that's what Sony was going for last gen. But the DS was a full gen over the GBC, and the 3DS is a full gen over the DS, and certainly as close to the Vita as, say, the DC was to the Xbox (which I consider the same gen; the Saturn was from the N64/PS1 gen).
 

DCKing

Member
EDIT: DCKing: The 3DS may not be "cutting edge" from a raw specs perspective but you can practically count on one hand the number of commercial devices using an autostereoscopic screen.
I'll give you that, but its internals bear the mark of Nintendo's idiosyncrasy of using stuff from people they like, instead of what's the best solution. It makes no sense that Nintendo went with DMP's graphics processor instead of just partnering with PowerVR, Qualcomm, NVIDIA, ARM or any established party in the mobile graphics business.

Before the announcements at E3 I was expecting similar decisions by Nintendo, but for now it doesn't seem as bad. Rumours about them using 1T-SRAM (is that stuff still relevant?) again could show that Nintendo will stubbornly have their way, even when it might not be the most optimal solution.
 
I'll give you that, but its internals bear the mark of Nintendo's idiosyncrasy of using stuff from people they like, instead of what's the best solution. It makes no sense that Nintendo went with DMP's graphics processor instead of just partnering with PowerVR, Qualcomm, NVIDIA, ARM or any established party in the mobile graphics business.

Before the announcements at E3 I was expecting similar decisions by Nintendo, but for now it doesn't seem as bad. Rumours about them using 1T-SRAM (is that stuff still relevant?) again could show that Nintendo will stubbornly have their way, even when it might not be the most optimal solution.

But I would assume going with 1T-SRAM helps with backwards compatibility.
 