
Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012


disap.ed

Member
More likely a 3-core POWER7 at moderate clock speed (they're probably still using Xenos in the devkit) with 12 MB EDRAM, a 512-SPU Southern Islands-like core with 32 MB EDRAM, and 1 GB of mysterious memory. I don't think the final hardware will be much faster than the devkit, just more modern. Shouldn't the devkit be intended to set some sort of realistic baseline?

My guess is something around these specs, but I am hoping for a higher SPU count (considering the relatively low clocks). I think I read somewhere (Beyond3D?) that the GPU part will be around 1 GFLOPS, but maybe this number is also for the whole system (which would still be quite a bit above the upcoming AMD Trinity APUs (~700 MFLOPS)).
 
I haven't followed WiiU news at all since E3, so apologies if this has been addressed.

My understanding is that WiiU is backward compatible with Wii games out of the box. Has Nintendo talked about whether there will be an auto resolution upgrade? Could they theoretically do something similar to the Dolphin emulator if they wished to? I really want to play Skyward Sword, but don't really want to buy a Wii at this point just for that game, especially if I can get a better looking version of it sometime next year. Thanks for any information!

They've already said that Wii games will only run in 480p (so it probably includes the original Wii GPU/CPU)
 

AlStrong

Member
Depends on if developers want double buffering or triple buffering

This doesn't affect your working framebuffer. Triple buffering comes after you've already rendered the scene.
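
A rough illustrative sketch of that point (my own, nothing platform-specific assumed): with triple buffering the GPU still renders into a single back buffer at a time; the extra buffer only queues a finished frame for display, so the working framebuffer cost doesn't grow.

// Illustrative only: buffer roles rotate, the buffer count needed for rendering doesn't.
#include <array>
#include <cstdio>
#include <utility>

int main() {
    std::array<const char*, 3> buffers = {"A", "B", "C"};
    int back = 0, pending = 1, front = 2;  // render target, queued frame, displayed frame
    for (int frame = 0; frame < 3; ++frame) {
        std::printf("frame %d: render into %s, display %s\n",
                    frame, buffers[back], buffers[front]);
        std::swap(back, pending);   // the just-finished frame is queued
        std::swap(pending, front);  // the display flips to the newest queued frame
    }
    return 0;
}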

Then you get into HDR, which means you need to generate each scene twice and then render a final buffer. The more accurate your HDR is, the more framebuffer it needs.
This was Bungie's (and Metro 2033's) method for HDR on 360 because of certain costs of utilizing FP16 (64bpp) over FP10 (32bpp) - Bungie wanted alpha blending support, which Xenos does not have when using FP16 render targets. Also they output two different colour ranges from the pixel shader and combine them, not render the whole thing (geometry etc) twice.
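
For a sense of the memory cost involved, a back-of-the-envelope sketch (my own numbers, assuming a plain 1280x720 colour target with no MSAA):

// FP10 packs HDR colour into 32bpp; FP16 needs 64bpp, doubling the colour buffer.
#include <cstdio>

int main() {
    const long long pixels = 1280LL * 720LL;  // 921,600 pixels at 720p
    const long long fp10 = pixels * 4;        // 32bpp -> ~3.5 MB
    const long long fp16 = pixels * 8;        // 64bpp -> ~7.0 MB
    std::printf("FP10: %.2f MB, FP16: %.2f MB\n",
                fp10 / (1024.0 * 1024.0), fp16 / (1024.0 * 1024.0));
    return 0;
}

Doubling the bytes per pixel is what hurts when the fast framebuffer memory is only around 10MB, as on the 360.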

I've gotten better with the calculation, but the application is something I still want to understand better. How would that affect things from a game standpoint?

You need depth for a lot of things, not the least of which is being able to render the scene correctly (the order of opaque objects, since alpha-blended items aren't part of this; alpha test can take advantage of the z). Then there's shadowmapping and some post-processing (depth of field, for example).
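
As a minimal illustration of the first point, a toy software z-buffer (my own sketch, not any console's hardware): opaque geometry can be drawn in any order because each pixel keeps the nearest depth seen so far.

#include <limits>
#include <vector>

struct Framebuffer {
    int w, h;
    std::vector<unsigned> colour;
    std::vector<float> depth;
    Framebuffer(int w_, int h_)
        : w(w_), h(h_), colour(w_ * h_, 0),
          depth(w_ * h_, std::numeric_limits<float>::infinity()) {}

    // Keep the fragment only if it is nearer than what's already stored.
    void writeFragment(int x, int y, float z, unsigned rgba) {
        const int i = y * w + x;
        if (z < depth[i]) { depth[i] = z; colour[i] = rgba; }
    }
};

int main() {
    Framebuffer fb(4, 4);
    fb.writeFragment(1, 1, 5.0f, 0xff0000ffu);  // far surface, drawn first
    fb.writeFragment(1, 1, 2.0f, 0xffff0000u);  // nearer surface wins regardless of order
    fb.writeFragment(1, 1, 9.0f, 0xff00ff00u);  // farther surface is rejected
    return 0;
}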

By "single pass", brainstew means without tiling


He is also assuming a particular setup of the framebuffer - depth + 1 render target. Deferred shading requires a lot more.
 

DCKing

Member
They've already said that Wii games will only run in 480p (so it probably includes the original Wii GPU/CPU)
Yeah I thought so too. Embedding the Wii CPU/GPU/1T-SRAM would guarantee 100% backwards compatibility, plus they could use Broadway for running the background OS and Hollywood to optionally render the tablet graphics in demanding games. Hollywood is well suited for running decent 854x480 resolution graphics, and that way they could save precious framebuffer time and space on the main GPU.
 

BurntPork

Banned
One more just for good measure:

http://www.neogaf.com/forum/showpost.php?p=29863782&postcount=4627



Seeing as how I went back 10+ pages after that and couldn't find the direct quote referenced there, I'm going to assume something big is in this detail. Or I overlooked it in my comb through. Once more, if this is prying too much, just shoot me a message and this all disappears.



While not a direct modification for the sake of meeting DX12 specs, they might opt for more efficient portions of DX11/12-compliant silicon. Off the top of my head, there's always the tessellation unit if they go raw from the R700 base. That could see a facelift just to be more efficient at what it does.

The tessellator in the R700 series is 100% useless. It's too weak to do anything. They'll either replace it or remove it if they stick with R700, which is part of why I think they won't.
 

User Tron

Member
I should probably rephrase to be sure I get the answer I want.

How much embedded memory is necessary to achieve 1080p @ 60fps, some form of good AA, and 2 tablets that also support their full resolution (800x540?) @ 60fps w/ good AA?

Will the rumored 32MB of eDRAM (or 1T-SRAM) cover that?


Is it possible that Nintendo would include 48MB or 64MB?

I think it does not work this way. There's only one framebuffer, and you can decide where it gets rendered to. So if you just want a copy of the main screen (X) on the tablet, the code would look like this:

RenderXToFramebuffer();
RenderToScreen();
RenderToTablet();

If you want to have something different (Y) on the tablet it would look like this:

RenderXToFramebuffer();
RenderToScreen();
ClearFramebuffer();
RenderYToFramebuffer();
RenderToTablet();

If it is that way you wouldn't need any more framebuffer, "just" more rendering speed.
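
To put rough numbers on the question above (my own arithmetic, assuming an RGBA8 colour buffer plus a 32-bit depth/stencil buffer, no MSAA, and the 854x480 tablet resolution mentioned elsewhere in the thread):

#include <cstdio>

// 4 bytes RGBA8 colour + 4 bytes depth/stencil per pixel.
double bufferMB(long long w, long long h) {
    return w * h * (4 + 4) / (1024.0 * 1024.0);
}

int main() {
    std::printf("1080p TV pass:  %.1f MB\n", bufferMB(1920, 1080)); // ~15.8 MB
    std::printf("854x480 tablet: %.1f MB\n", bufferMB(854, 480));   // ~3.1 MB
    return 0;
}

So if the two images are rendered one after another as in the pseudocode, the 1080p pass sets the size requirement, and MSAA multiplies those numbers per sample.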
 

disap.ed

Member
I think it does not work this way. There's only one framebuffer, and you can decide where it gets rendered to. So if you just want a copy of the main screen (X) on the tablet, the code would look like this:

RenderXToFramebuffer();
RenderToScreen();
RenderToTablet();

If you want to have something different (Y) on the tablet it would look like this:

RenderXToFramebuffer();
RenderToScreen();
ClearFramebuffer();
RenderYToFramebuffer();
RenderToTablet();

If it is that way you wouldn't need any more framebuffer, "just" more rendering speed.

Could someone please validate this? Thx.
 

BurntPork

Banned
Each time I visit this thread ... it amazes me all the things the people can discuss without any real info.

Well, if you would toss us a bone.... ;)

I'm guessing you won't answer this, but has the RAM type or amount changed since E3? If so, in the positive or negative direction?
 
Yeah I thought so too. Embedding the Wii CPU/GPU/1T-SRAM would guarantee 100% backwards compatibility, plus they could use Broadway for running the background OS and Hollywood to optionally render the tablet graphics in demanding games. Hollywood is well suited for running decent 854x480 resolution graphics, and that way they could save precious framebuffer time and space on the main GPU.

That tablet screen resolution is surely not just a coincidence...
Hint, it's 480p widescreen. Exactly the perfect resolution for displaying GCN/Wii/VC games and Hollywood produced graphics...

The specs are the same.

Does that include the CPU and GPU as well, or is it just the RAM that has remained the same since E3 2011?
 

dr_rus

Member
DirectX 12 on a Nintendo console..?
There is no DX12, and there certainly won't be by the time the Wii U hits the market. Windows 8 will bring DX11.1 (which is more or less a software update and will be available to some extent on all DX10+ h/w).

As I've said previously, the only reason to use something like the RV770/5830 today is because the final GPU will have the same feature set - meaning DX10.1 plus some kind of tessellator. If this GPU had support for DX11, then why use such old h/w as RV7x0 GPUs? Performance-wise there are GPUs in AMD's current DX11 lineup which are on par with the RV770/5830 - why not use them?
 

BurntPork

Banned
Okay. Slight changes.

I'm bored, so this is random.

My new spec guess, now with more sanity!

-2.0-2.5GHz POWER7 based tri-core with 2-way SMT @45nm.

-500MHz customized RV740 GPU @40nm. Customizations allow perfect hardware emulation of Broadway. Tessellation unit has been removed.

-CPU and GPU are part of an APU with 24-32MB of shared eDRAM.

-1GB GDDR3 unified RAM

- Proprietary Wii U Optical Disc Drive, capable of reading dual-layer Wii U Optical Discs. Each layer holds up to 25GB, allowing 50GB total.

- 8GB internal flash storage. Game saves can also be transferred to a cloud service, which will be revealed by Nintendo early next year as part of the next major 3DS update. Still no internal HDD, but games can be saved on USB drives.

- USB 3.0 ports. Still no built-in Ethernet or optical audio out.

- 802.11b/g/n. No dual band support.

Other predictions.

- GCN games will be added to the Virtual Console. Standard price will be $14.99 in the US.

- There will be a unified friend code-based friends list, but third-parties will be allowed to make games or online systems with separate, name-based friends lists. All lists will have some method of transfer.

- It will not be possible to buy most third-party DLC via the eShop. Each publisher will have their own shop interface. (Unless EA gets what they want...)

- Only one person needs to enter a friend code; the second person receives an invite.

- Party text, voice, and video chat.

- Everyone will still hate it.

- Nintendo will place a lot more focus on DD.

- Several new IPs will be shown at E3 2012. None will release until Holiday 2013 at the earliest, at least one will be canceled, and only the worst non-canceled one will make it to the US.

- Reggie will be assassinated. It will be swift. When he suddenly collapses, the first question asked will be "What's wrong with you?" before the bullet is noticed. The assassin will then check Reggie's box on his list.

- The funk is real.

-People will still say that Pikmin 3 isn't coming out, even after it comes out.

Changes in bold.
 

AzaK

Member
This doesn't affect your working framebuffer. Triple buffering comes after you've already rendered the scene.
.

Really? That implies that after rendering the framebuffer is then copied to some other place in order to be displayed. Wouldn't it be better to just have enough memory and then effectively point the display hardware at the correct buffer?
 

Turrican3

Member
How is this even a discussion? For the games the industry wanted to make, the Wii did not have enough power to run them. There. End of story. [...] Ergo, by all logic, the standard at which the industry was at was not met by the Wii. So, underpowered.

By this same logic, you could consider the PC overpowered. But publishers don't like that platform - not because it's too powerful, but because the margins are thinner.
Are you claiming that EA (like Capcom) decided to make a Dead Space on-rails spinoff on the Wii because of technical limitations?

I'd argue it was a purely marketing choice instead: they thought it would be a better fit for the audience that way (guess it wasn't - also, let's not forget the PSP, hardly as powerful as the Wii, got Assassin's Creed, Split/Second... I think this is further confirmation that the issue was not a technical one, if they really wanted to exploit those brands).

And that's ultimately what I believe is one of the primary reasons for less-than-stellar 3rd party support, which, funnily enough, is somewhat similar to what you said about the PC: thin margins.
Even assuming straight downporting from the PS360 was not technically possible, a properly made (= with a decent budget and teams) Wii spinoff probably required companies to spend too much money.

In short, it wasn't worth it, at least from their point of view: they could have done those, but decided not to, mostly because it was a risky/unproven market and, most of all, it required dedicated efforts. But dedicated effort does not equal technically unfeasible.

That's my point. That's why I believe that, talking about "serious" 3rd party support, even a WiiU specced comparably to the PS4/Xbox720-Loop-whatever would guarantee nothing but easy/cheap ports.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Really? That implies that after rendering the framebuffer is then copied to some other place in order to be displayed. Wouldn't it be better to just have enough memory and then effectively point the display hardware at the correct buffer?
It would be better in a world where edram is a cheap resource, or target resolutions are small. In a world where edram is this precious little resource, and the target resolutions are "HD" and up, all GPUs that use said memory for their framebuffer tend to use it for their backbuffer alone (and sometimes not an *entire* fb at that) - the front buffer from any double/triple buffering scheme sits in UMA. The (small) latency from the resolve of the embedded fb to UMA is usually hidden by the fact that the next frame starts with some 'break' for the GPU's ROPs.
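
Rough numbers behind that (my own arithmetic, taking a 1080p RGBA8 target with 32-bit depth/stencil and 4xAA purely as an example):

#include <cstdio>

int main() {
    const long long w = 1920, h = 1080, msaa = 4;
    const double back  = w * h * msaa * (4 + 4) / 1e6;  // colour + z/s per sample, ~66 MB
    const double front = w * h * 4 / 1e6;               // resolved RGBA8 only, ~8 MB
    std::printf("4xAA backbuffer: %.1f MB, resolved front buffer: %.1f MB\n", back, front);
    return 0;
}

The resolved front buffer is small and only read for scanout, so parking it in UMA costs little, while the fat working backbuffer is what competes for the embedded memory.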
 

BurntPork

Banned
lherre's laughing at us, knowing there will be breakdowns abound once the final specs are leaked. :(

So, are the parts still off-the-shelf?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
lherre's laughing at us, knowing there will be breakdowns abound once the final specs are leaked. :(

So, are the parts still off-the-shelf?
It's entirely off-the-shelf, edram included. In fact, the entire WiiU will be sold in the form of a DRM'd PDF containing the shopping list for the correct parts, which new customers will then buy on their own.
 

DCKing

Member
Each time I visit this thread ... it amazes me all the things the people can discuss without any real info.
Be amazed at what we can discuss when we do get it ;)
Now give us more.
Nuclear Muffin said:
That tablet screen resolution is surely not just a coincidence...
We're onto something here...
dr_rus said:
As I've said previously, the only reason to use something like the RV770/5830 today is because the final GPU will have the same feature set - meaning DX10.1 plus some kind of tessellator. If this GPU had support for DX11, then why use such old h/w as RV7x0 GPUs? Performance-wise there are GPUs in AMD's current DX11 lineup which are on par with the RV770/5830 - why not use them?
It's a weird choice, though. The RV770LE is a faulty RV770 chip that consumes more power on a larger die for the same performance as many alternatives (I'm especially looking at the RV740 here). There's also nothing really stopping Nintendo from using actual 2011 tech for this. I have a theory as to why they might be using the RV770 here.
 

Jocchan

Ὁ μεμβερος -ου
It's entirely off-the-shelf, edram included. In fact, the entire WiiU will be sold in the form of a DRM'd PDF containing the shopping list for the correct parts, which new customers will then buy on their own.
You can probably get everything for $20 at Newegg.
 

[Nintex]

Member
lherre's laughing at us, knowing there will be breakdowns abound once the final specs are leaked. :(

So, are the parts still off-the-shelf?

The only spec bump the Wii U ever got from the more credible rumor mill was that they increased the clock speed of the GPU, I believe. We got some good rumors/sources on that when they didn't show it much at E3 because of the overheating issue. They fixed that, removed the hardware lock on the clock speeds and went from there. They're not going to change the architecture at this point; if the RV770LE is what they were gunning for (Radeon 4000 series), then that's what you'll get.

They've certainly optimized the hardware and increased performance software-wise, which is why the support for multiple Wii U Pads was rumored to have been added recently. They also likely replaced the off-the-shelf parts with the actual Wii U silicon at this point (which was rumored to be a SoC like the Xbox 360 Slim / 3DS), or will soon.

Nintendo is probably more focused on ease of development, the features of their new controller and their first party games than they are on keeping up spec-wise with the competition. There's nothing to gain in that field when Zelda HD looks as good as the tech demo they showed.
 
Why would Nintendo include Broadway/Hollywood in the WiiU for BC? Wouldn't it be a lot more efficient to have the new CPU and GPU place themselves in a "Wii" mode to achieve the same result?

Whatever method they use, I hope that you still retain access to the OS while in BC mode, because the hassle you had to go through to attach controllers and memory cards to the Wii every time you wanted to play GC games was a big reason why I rarely did it.
 

Mako_Drug

Member
Whatever method they use, I hope that you still retain access to the OS while in BC mode, because the hassle you had to go through to attach controllers and memory cards to the Wii every time you wanted to play GC games was a big reason why I rarely did it.

That's...not going to happen.
 

BurntPork

Banned
[Nintex];32853453 said:
The only spec bump the Wii U ever got from the more credible rumor mill was that they increased the clock speed of the GPU, I believe. We got some good rumors/sources on that when they didn't show it much at E3 because of the overheating issue. They fixed that, removed the hardware lock on the clock speeds and went from there. They're not going to change the architecture at this point; if the RV770LE is what they were gunning for (Radeon 4000 series), then that's what you'll get.

They've certainly optimized the hardware and increased performance software-wise, which is why the support for multiple Wii U Pads was rumored to have been added recently. They also likely replaced the off-the-shelf parts with the actual Wii U silicon at this point (which was rumored to be a SoC like the Xbox 360 Slim / 3DS), or will soon.

Nintendo is probably more focused on ease of development, the features of their new controller and their first party games than they are on keeping up spec-wise with the competition. There's nothing to gain in that field when Zelda HD looks as good as the tech demo they showed.

Yeah, I guess hoping for something newer was wishful thinking. They're going to stick to a philosophy of using completely obsolete hardware. This is bad, since it'll get in the way of porting. GDDR3 in 2012 is inexcusable.
 

wazoo

Member
Whatever method they use, I hope that you still retain access to the OS while in BC mode, because the hassle you had to go through to attach controllers and memory cards to the Wii every time you wanted to play GC games was a big reason why I rarely did it.

The Wii U does not read GC discs, so there will be an OS, and it supports all Wii peripherals. So Wii BC will be much easier to use.
 
My gut is telling me that it has:
  • A 4-core POWER7 CPU at approximately 2.1-2.6GHz with 32MB of eDRAM
  • A surprisingly modern GPU that is only remotely related to the RV7XX, with at least 640 SPUs at 600MHz
  • At least 1.5GB of RAM - I guess GDDR3, which with some novel Nintendo magic will be faster than expected.

okwiththis.png


In fact, this would be great. It's just a bit more than a bit better than expected... if you understand...
 
Yeah, I guess hoping for something newer was wishful thinking. They're going to stick to a philosophy of using completely obsolete hardware. This is bad, since it'll get in the way of porting. GDDR3 in 2012 is inexcusable.

Tell that to Xbox Ten...

Still, I doubt they'll go with GDDR3. If there's anything Nintendo are absolutely adamant about with their hardware design, it's RAM speed. They always go with the fastest RAM available at the time (GCN= 1T-SRAM, Wii = 1T-SRAM & GDDR3, 3DS = FCRAM), even at the expense of RAM quantity. They hate loading times with a passion!
 

disap.ed

Member
My gut is telling me that it has:
  • A 4-core POWER7 CPU at approximately 2.1-2.6GHz with 32MB of eDRAM
  • A surprisingly modern GPU that is only remotely related to the RV7XX, with at least 640 SPUs at 600MHz
  • At least 1.5GB of RAM - I guess GDDR3, which with some novel Nintendo magic will be faster than expected.

I think POWER7 has "only" 4MB/core, so I guess 12MB (3 cores) or 16MB (4 cores) is more likely.

32MB is more likely on the GPU side, I guess.
 
It's like multi-quote was made just for me.

Azure? Why be concerned about researching too much? That's part of the purpose of this thread. :)

My guess for the WiiU is still 27MB of 1T-SRAM, which would allow Wii BC and a framebuffer for 720p w/ 4xAA or 1080p w/o AA and 2 tablets (even 4, but this probably won't happen for other reasons).

My guess is something around these specs, but I am hoping for a higher SPU count (considering the relatively low clocks). I think I read somewhere (Beyond3D?) that the GPU part will be around 1 GFLOPS, but maybe this number is also for the whole system (which would still be quite a bit above the upcoming AMD Trinity APUs (~700 MFLOPS)).

With stew saying 32MB, there's really no need to guess unless they change it.

And I hope it has way more power than that. That's not even a Gamecube. :p That rumor came from a Japanese website and said that the kit had an AMD GPU that was "beyond 1TFLOP", but wasn't a 4890. The 4870 is the only one beyond 1TFLOP and could definitely explain the early overheating. If it were a 4870 clocked down to 500MHz, then it would be 800GFLOPS with a pixel fillrate of 8GP/s and a texture fillrate of 20GT/s. Compare that to other cards:

(GFLOPS / pixel fill GP/s / texture fill GT/s)
4770 (RV740) - 896/12/24
4830 (RV770LE) - 736/9.2/18.2
4730 (RV770CE @ 750MHz) - 896/6/24
4670 (RV730XT) - 480/6/24

On paper it would be outperformed by some of these. It could be used to explain why Digital Foundry came to the conclusion they did.
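
As a sanity check on the downclocked-4870 figures a few lines up (my own arithmetic, using the stock RV770's 800 shader ALUs, 16 ROPs and 40 texture units):

#include <cstdio>

int main() {
    const double clockGHz = 0.5;                 // the hypothetical 500MHz downclock
    const int alus = 800, rops = 16, tmus = 40;  // RV770 / Radeon 4870 units
    std::printf("~%.0f GFLOPS, %.0f GP/s, %.0f GT/s\n",
                alus * 2 * clockGHz,  // 2 flops (multiply-add) per ALU per clock
                rops * clockGHz,
                tmus * clockGHz);
    return 0;
}

That lands on the same 800 GFLOPS, 8 GP/s and 20 GT/s quoted above.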

The explanation for that is that the amount of GPU-local edram one can expect on the WiiU cannot meet the needs of deferred shading algorithms, not without some form of tiling.

What we can hope for is edram covering a 720p x 4AA or 1080p x 2AA RGBA8 + zs32 backbuffer, which any way you look at it is 30-34MB. Hence my last 'optimistic' guess in this thread was 40MB (albeit presented in a 6bit signed int ; )
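
For reference, the arithmetic behind that 30-34MB range (my own sketch, counting decimal megabytes, RGBA8 colour plus zs32 per sample):

#include <cstdio>

double backbufferMB(long long w, long long h, int msaa) {
    return w * h * (long long)msaa * (4 + 4) / 1e6;  // 4B colour + 4B depth/stencil per sample
}

int main() {
    std::printf("720p  x 4xAA: %.1f MB\n", backbufferMB(1280, 720, 4));   // ~29.5 MB
    std::printf("1080p x 2xAA: %.1f MB\n", backbufferMB(1920, 1080, 2));  // ~33.2 MB
    return 0;
}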

It should. Though history knows cases of early devkits playing bad jokes on devs *cough* dual 970MP filling in for a Xenon *cough*

Ok. Yeah I remember the tiling part since that's why I wanted to learn about the calculation. I guess the (32bpp Z) is what I still haven't grasped totally. But like luigiv

And be fair, MS meant well on that. They just, well, messed up pretty badly trying to rush the launch. ;)

3 cores in the dev kit, or in the final design?
How would he know what Nintendo plans to do in their final design?

I assume console makers give out targets, but don't give developers final designs till late in the game due to leaks. Especially leaks to the competition.

I'm sticking with 4 cores, due to it being a POWER7, the addition of the extra controller, and the perception of looking like the last-gen Xbox 360.

Final design. Those devs/pubs are a console maker's business partners, since they are using the console to sell their games. You don't do that to business partners. They tell them what the plan is for the console. Things can always change, but even when they do change, you let them know. And one thing about Nintendo, looking back at how they've worked: that kind of perception won't matter, because they will focus more on how it looks on screen.

What? You guys are going back to old posts now?

this thread is madness!

New leaks please :(

32MB of eDRAM. :)

You need depth for a lot of things, not the least of which is being able to render the scene correctly (the order of opaque objects, since alpha-blended items aren't part of this; alpha test can take advantage of the z). Then there's shadowmapping and some post-processing (depth of field, for example).

Which is why, when I first did my own calculations, I think you and blu were talking about the amount needing to be much larger to do it "properly"?

lherre's laughing at us, knowing there will be breakdowns abound once the final specs are leaked. :(

So, are the parts still off-the-shelf?

He said they had the ugly version of the retail case so maybe they've gotten to prototype parts. Just a guess.

As I've said previously, the only reason to use something like the RV770/5830 today is because the final GPU will have the same feature set - meaning DX10.1 plus some kind of tessellator. If this GPU had support for DX11, then why use such old h/w as RV7x0 GPUs? Performance-wise there are GPUs in AMD's current DX11 lineup which are on par with the RV770/5830 - why not use them?

My guess was that they were going to use a VLIW4 GPU, but since there are none available with a lower ALU count, they used a GPU that would be close to it.
 

Azure J

Member
I think POWER7 has "only" 4MB/core, so I guess 12MB (3 cores) or 16MB (4 cores) is more likely.

32MB is more likely on the GPU side, I guess.

There's also the strong hint that the eDRAM is its own RAM pool courtesy of wsippel's speculation and a "wink" on it from brain_stew.
 
Tell that to Xbox Ten...

Still, I doubt they'll go with GDDR3. If there's anything Nintendo are absolutely adamant about with their hardware design, it's RAM speed. They always go with the fastest RAM available at the time (GCN= 1T-SRAM, Wii = 1T-SRAM & GDDR3, 3DS = FCRAM), even at the expense of RAM quantity. They hate loading times with a passion!

That is true. I've been giving more credence to GDDR5 as time has passed.

I think POWER7 has "only" 4MB/core, so I guess 12MB (3 cores) or 16MB (4 cores) is more likely.

32MB is more likely on the GPU side, I guess.

I agree about the GPU. If they copy Xenon's design they could do three cores and have 16MB of L2 cache.
 

disap.ed

Member
My guess is something around these specs, but I am hoping for a higher SPU count (considering the relatively low clocks). I think I read somewhere (Beyond3D?) that the GPU part will be around 1 GFLOPS, but maybe this number is also for the whole system (which would still be quite a bit above the upcoming AMD Trinity APUs (~700 MFLOPS)).
And I hope it has way more power than that. That's not even a Gamecube. :p

Holy s***. Thx, sure I meant 1 TFLOPS instead of 1GFLOPS and 700 GFLOPS instead of 700 MFLOPS.
 

DCKing

Member
That is true. I've been giving more credence to GDDR5 as time has passed.
Given that they seem to be working around latency issues by putting a lot of EDRAM on the CPU, the "GDDR3 has lower latency" argument gets less relevant. GDDR5 and even XDR2 might be a real possibility.

With lherre implying that it has more than 1 GB of memory, I'm wondering where the rest is coming from. 2 GB unified seems less feasible because of motherboard complexity (8 memory chips are needed for that), so it must be some other memory type? I'd say they could put 128+MB of 1T-SRAM in there, but lherre has stated none was present in the devkits. Perhaps it's a separate pool of DDR3?
 
I think tessellation will be there. It's heavily used in most modern games isn't it? Plus didn't we already go over that, and show how pointless it would be for them to completely remove it?
 
Saint Gregory said:
Why would Nintendo include Broadway/Hollywood in the WiiU for BC? Wouldn't it be a lot more efficient to have the new CPU and GPU place themselves in a "Wii" mode to achieve the same result?

Whatever method they use, I hope that you still retain access to the OS while in BC mode, because the hassle you had to go through to attach controllers and memory cards to the Wii every time you wanted to play GC games was a big reason why I rarely did it.
There's the thing--if they go into a "Wii mode" like Wii does with GCN games or 3DS does with DS games, any extra stuff might be out the window; and that would especially be a big problem if they hope to sell downloadable GCN and Wii games.
 

Gaborn

Member
lherre's laughing at us, knowing there will be breakdowns abound once the final specs are leaked. :(

So, are the parts still off-the-shelf?

If lherre's in a position to know about the Wii U's specs - and I think we all pretty much expect he is - he must be in that position because he WANTS to be; no one is forced to make games, etc., so he must not be totally unhappy with it.

Although, in fairness, I suspect WHATEVER the actual specs of the Wii U (and frankly I find the speculation on the hardware more annoying, because it's more important how the games look and how they play than what exact hardware is being used, at least in my view), there will be breakdowns. The same is true of the Loop/Ten/NeXtBox and the PS4/whatever they call it (I still say it won't be "4" because of Japan's cultural tetraphobia). People always over- or under-inflate their expectations based on what PCs are doing, what they think of the competition and other preconceived notions, and the truth is, when the system and the games are actually released, none of it will matter because it will come down to the experience of playing them.
 
I think tessellation will be there. It's heavily used in most modern games isn't it? Plus didn't we already go over that, and show how pointless it would be for them to completely remove it?

Yes. It would be akin to Nintendo removing their hardwired mapping options from the TEV unit. I expect them to use the 8th gen. unit before anything else.

If lherre's in a position to know about the Wii U's specs - and I think we all pretty much expect he is - he must be in that position because he WANTS to be; no one is forced to make games, etc., so he must not be totally unhappy with it.

Although, in fairness, I suspect WHATEVER the actual specs of the Wii U (and frankly I find the speculation on the hardware more annoying, because it's more important how the games look and how they play than what exact hardware is being used, at least in my view), there will be breakdowns. The same is true of the Loop/Ten/NeXtBox and the PS4/whatever they call it (I still say it won't be "4" because of Japan's cultural tetraphobia). People always over- or under-inflate their expectations based on what PCs are doing, what they think of the competition and other preconceived notions, and the truth is, when the system and the games are actually released, none of it will matter because it will come down to the experience of playing them.

I definitely agree that how the game looks and plays is very important, but you can't completely separate that from what's under the hood. The debates about the Link and bird demos (I thought they looked great, all things considered) are proof of that. And that's just what we are trying to find out.

Given that they seem to be working around latency issues by putting a lot of EDRAM on the CPU, the "GDDR3 has lower latency" argument gets less relevant. GDDR5 and even XDR2 might be a real possibility.

With lherre implying that it has more than 1 GB of memory, I'm wondering where the rest is coming from. 2 GB unified seems less feasible because of motherboard complexity (8 memory chips are needed for that), so it must be some other memory type? I'd say they could put 128+MB of 1T-SRAM in there, but lherre has stated none was present in the devkits. Perhaps it's a separate pool of DDR3?

I agree with you on the first part, though I get the feeling XDR2 might be too expensive. Comparing the launch BOMs for the PS3 and 360, MS paid a little over $8 per 512Mbit GDDR3 chip while Sony paid $12 per 512Mbit XDR chip. MS did order twice as many per console, but I would believe that even if the order were increased at that time Sony would barely be able to get them for $10, let alone $8.

As for the memory type(s), one of my more recent speculations (I think before you were cleared to join) had a split pool, and he shot that down. After all, the GC had PC-100 memory in it, so I figured there was some plausibility of the U having DDR3.

They could do eight since the 360 had eight at least at launch (4 on top of the MB, 4 on the underside). Wii also has chips on the underside. Cost might prevent 2GB though. I could see 1.5 GB of GDDR5 with a 3/3 split.
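
As a rough illustration of what a six-chip 3/3 split like that would imply (my own numbers; the 2Gbit chip density and the per-pin data rate are assumptions for the sketch, not rumours):

#include <cstdio>

int main() {
    const int chips = 6;             // 3 + 3 split as suggested above, 2Gbit each = 1.5 GB
    const int bitsPerChip = 32;      // a typical GDDR5 device interface width
    const double gbpsPerPin = 4.0;   // assumed effective data rate per pin

    const int busWidth = chips * bitsPerChip;                 // 192 bits
    const double bandwidthGBs = busWidth * gbpsPerPin / 8.0;  // ~96 GB/s
    std::printf("%d-bit bus, ~%.0f GB/s at %.1f Gbps per pin\n",
                busWidth, bandwidthGBs, gbpsPerPin);
    return 0;
}

A wider bus is also where the extra motherboard and pad-ring complexity comes from, which is the trade-off raised further down.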

Interestingly after everything we've heard recently, my current thought isn't that far from my original one. The main changes would be GDDR5 instead of DDR3, and maybe the GPU clock and ALU count.

http://www.neogaf.com/forum/showpost.php?p=30236063&postcount=5354

I don't even remember why I said 32MB, but that seems to be correct. I'll have to look at some of my other posts to see why.

When it comes to clocks, I really think Nintendo is waiting to see if they can get the GPU at 28nm. I believe they have their targets in mind though. I think 607.5MHz would be for 40nm. And not long ago on B3D I guessed that the clock could be 729MHz or possibly even 810MHz at 28nm. Those numbers are based on multiples from the CPU, since Nintendo likes doing that. I doubt it would be those exact numbers, just something to get an idea of what Nintendo might do. At one point I was basing my memory clock on those leaked 7000-series specs, but I've recently seen it said that they were fake. I heard there were fake ones, I just didn't know it was those. So I'm going with either 911.25MHz or 1215MHz.

And finally I still think there's a possibility we could see at least a 12-cluster VLIW4 GPU (14 max). I'm still stuck on Nintendo wanting the transistor savings.
 

MDX

Member
What about some other components that can ensure a great gaming experience?
The MMU?
S3TC?
Disc drive... What's a good data transfer speed?

Like what could Nintendo do to ensure fast game loading?
Anything special out there?
 

Ormberg

Member
I still can't wrap my head around why Nintendo is choosing POWER7. I sure hope it is more than "that's what the competition is using", because that is catching up with current technology, not bringing any improvement.

I'm trying to read up on POWER7, but I'm having a hard time finding actual data and/or information about it. I sure hope the final chip uses 32 nm rather than 45 nm technology.
From what I can tell, IBM seems to push the power-efficiency angle: aggressive OoOE technology and also a new memory interface for accessing the DRAM.

Does anyone have any good information on POWER7? I'm really curious about it, since it's one of the reliable tidbits of information we have, hence my interest in it :)
 

DCKing

Member
I agree with you on the first part, though I get the feeling XDR2 might be too expensive. Comparing the launch BOMs for the PS3 and 360, MS paid a little over $8 per 512Mbit GDDR3 chip while Sony paid $12 per 512Mbit XDR chip. MS did order twice as many per console, but I would believe that even if the order were increased at that time Sony would barely be able to get them for $10, let alone $8.
I'm not sure how GDDR5 and XDR2 prices compare. From what I can Google, XDR2 has a (minor) power advantage and a performance advantage (I guess that's mostly marketing though), but importantly it seems that there are 4Gbit XDR2 chips available. With those chips they could accomplish 1GB of memory with only two chips. That would be attractive to Nintendo, I think.
As for the memory type(s), one of my more recent speculations (I think before you were cleared to join) had a split pool, and he shot that down. After all, the GC had PC-100 memory in it, so I figured there was some plausibility of the U having DDR3.
DDR3 has no real disadvantages compared to GDDR3 (right?), and also comes in 4 Gbit chips. Each type of memory (GDDR3, GDDR5, DDR3 and XDR2) has something going for it, I'm not sure we can say anything sensible about it at this time. GDDR3 seems to have been in use in the devkit, but that could be a stopgap solution much like the CPU and GPU could be.
They could do eight since the 360 had eight at least at launch (4 on top of the MB, 4 on the underside). Wii also has chips on the underside. Cost might prevent 2GB though. I could see 1.5 GB of GDDR5 with a 3/3 split.
Wouldn't that imply a 192-bit bus though? That increases CPU, GPU and motherboard complexity.
1T-SRAM on the GPU instead of EDRAM is an interesting thought. I'm curious whether 1T-SRAM is still a competitor to EDRAM nowadays.
And not long ago on B3D I guessed that the clock could be 729MHz or possibly even 810MHz at 28nm. Those numbers are based on multiples from the CPU, since Nintendo likes doing that.
Why would Wii clocks still be relevant in the Wii U? They could just choose a custom clock in BC mode if it is necessary at all?
And finally I still think there's a possibility we could see at least a 12-cluster VLIW4 GPU (14 max). I'm still stuck on Nintendo wanting the transistor savings.
12 VLIW4 clusters implies a 960 shader GPU like the Barts Pro. I'd say that's too much... 6 to 8 is more realistic I think, especially when compared to the RV770LE.
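
For reference, the per-cluster arithmetic (my own note, nothing Nintendo-specific): Barts-style VLIW5 SIMDs and Cayman-style VLIW4 SIMDs pack different ALU counts, which changes what "12 clusters" adds up to.

#include <cstdio>

int main() {
    const int lanes = 16;  // lanes per SIMD engine in both designs
    std::printf("12 VLIW5 SIMDs: %d ALUs\n", 12 * lanes * 5);  // 960, Barts Pro-style
    std::printf("12 VLIW4 SIMDs: %d ALUs\n", 12 * lanes * 4);  // 768, Cayman-style
    return 0;
}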
I still can't wrap my head around why Nintendo is choosing POWER7. I sure hope it is more than "that's what the competition is using", because that is catching up with current technology, not bringing any improvement.
There are not many high-end architectures to choose from anymore. There's IBM Power, and there's Intel/AMD x86. Other architectures are usually focused on mobile use or highly parallel servers. Since Nintendo likes the Power architecture, having used it in the GC and Wii, they chose that. IBM has four offerings in that area: POWER6, POWER7, PowerPC A2 and the Power Processor Element (used in the PS3/360). Both POWER6 and the PPE are in-order designs, are considered inefficient compared to the others, and are fading into irrelevance. PowerPC A2 is again focused on highly parallel tasks, and probably not ready yet. The only one left is POWER7, which is more comparable to traditional high-performance CPUs. It should be a good gaming architecture.
 
So if we are speculating that it is an APU, then what? CPU and GPU both at 45nm, right? I doubt Nintendo is targeting a 28nm GPU. Any real or imagined "delay" of the system has more to do with software support and market conditions, I'm guessing. Holding out for the 28nm process to work out its kinks is way too risky. The Wii was fabbed at 90nm at a time when 45nm was just starting to come into its own, iirc. I don't believe this time will be any different.

Can anyone (wsippel? blu?) speculate as to how AMD could get the TDP of the GPU down without drastically shrinking the chip (most 700-series chips seem to have been made at 55 or 40nm)? If they are targeting 640 SPUs and 500MHz, how could they get it to run cool enough?
 