
Rumor: Wii U final specs

NBtoaster

Member
No, that isn't speculation, not in the slightest. The 360 has to split its frames into tiles because they don't fit into its 10MB of eDRAM. 32MB doesn't have that problem.

It's speculation that's what the eDRAM will be used for.

Framebuffer sizes next gen will also be much bigger. Killzone 2 already managed 36MB.
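For reference, the arithmetic behind a figure like that works out as follows (a rough sketch in C; the layout here - a 720p deferred G-buffer with four 32-bit render targets plus depth/stencil at 2xMSAA - is an assumption for illustration, not a confirmed Killzone 2 spec):

#include <stdio.h>

/* Hypothetical deferred-renderer footprint at 720p.
 * All numbers are illustrative assumptions, not confirmed specs. */
int main(void) {
    const int w = 1280, h = 720;
    const int bytes_per_target = 4;  /* 32 bits per render target */
    const int targets = 5;           /* assumed: 4 G-buffer RTs + depth/stencil */
    const int msaa = 2;              /* assumed 2x multisampling */
    double mb = (double)w * h * bytes_per_target * targets * msaa
              / (1024.0 * 1024.0);
    printf("G-buffer footprint: %.1f MB\n", mb);  /* prints ~35.2 MB */
    return 0;
}

Anything in that range obviously dwarfs the 360's 10MB, which is the tiling problem in a nutshell.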
 

Donnie

Member
It's speculation that's what the eDRAM will be used for.

Framebuffer sizes next gen will also be much bigger. Killzone 2 already managed 36mb.

It isn't speculation; the idea of putting a load of extremely fast memory on your GPU and then bypassing it when rendering the frame is frankly insane.

Also, source on Killzone 2? It renders at 720p with AA AFAIK.
 

Donnie

Member
eDRAM has many different uses. PS2 and 360 use it very differently.

They both use eDRAM for their frame buffer, despite the differences in how the memory was connected and configured (and the fact that PS2 also used some as a texture cache). Like I said it'll be used as a frame buffer and possibly more.
 

beril

Member
I think you're partially right, in that it's intended both as a framebuffer and as low-latency GPGPU memory.

In fact, that brings me to another thought I'd had. I've been of the opinion for a while that this chip being manufactured in Fab 8 is the Wii U GPU, something which would imply that the eDRAM is on-die with the GPU, rather than on a separate die. If IBM is involved in manufacturing the GPU, there might be other implications, though. A few pages back, Matt talked about the Wii U's GPU having a "significant" increase in registers over the R700 series. More register memory would be a benefit to GPGPU functionality, but usually comes at the expense of added transistors, which means higher power usage, more heat and a larger, more expensive die.

Isn't that pretty much the definition of eDRAM?

Also, it's unlikely that it will be used for GPGPU stuff, as access from the CPU will likely be much slower than main RAM, if it's even possible.
 

NBtoaster

Member
They both use eDRAM for their frame buffer, despite the differences in how the memory was connected and configured (and the fact that PS2 also used some as a texture cache). Like I said it'll be used as a frame buffer and possibly more.

Like you say, PS2 eDRAM was more flexible. But with Wii U it's more likely to be accessible by both CPU and GPU, so if it's usable for the framebuffer you're unlikely to get the full 32MB.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Eh, how are they supposed to share eDRAM? It can't really be embedded in both the GPU and the CPU at the same time, can it?
Nope, embedded in only one of those.

The CPU may be able to access it in some manner but I'd assume that would be far slower than accessing the main memory.
And yet something has to provide the same or better latency to the CPU as the original 24MB of 1T-SRAM on the Cube/Wii. There's only one such candidate in the system. Like I said, I'm very curious what AMD have come up with for the interconnect.

It's very likely, however, that the main memory is shared, and that the GPU can write to main memory when doing GPGPU stuff (or when you run out of eDRAM because you have lots of render targets).
I think that's a given. Xenos could do memexport back in the day.
 

Donnie

Member
Like you say, PS2 eDRAM was more flexible. But with Wii U it's more likely to be accessible by both CPU and GPU, so if it's usable for the framebuffer you're unlikely to get the full 32MB.

Well unless they want 1080p plus AA they're never going to come close to using 32MB. So yes, I agree that it seems very unlikely it's solely for the frame buffer, but currently that's all we know for sure it'll be used for. I mentioned other tasks, and had I gone into the possibilities that would have been speculation. Nothing speculative about saying that it'll be used as a frame buffer though, because that part is a certainty. :)

By the way, can you post a source for that Killzone 2 comment? As I said, AFAIK it rendered at 720p with AA; not sure where 36MB would come from.
 
Well unless they want 1080p plus AA they're never going to come close to using 32MB. So yes, I agree that it seems very unlikely it's solely for the frame buffer, but currently that's all we know for sure it'll be used for. I mentioned other tasks, and had I gone into the possibilities that would have been speculation. Nothing speculative about saying that it'll be used as a frame buffer though, because that part is a certainty. :)

Actually, the Xbox 360 also used the eDRAM to implement additional effects, like a special HDR mode.
 

Earendil

Member
Look what I just noticed in my inbox today...Who is writing these responses? What does he actually know? Is this some sort of conspiracy? Maybe I'll get more emails...

[Image: Nb0Ap.png]

I really don't understand what is going on here. Why would he have emailed you back? And I'm still skeptical that he would know anything, or be able to talk about it even if he did know something.

Well, as I've said for a while, I think the whole CPU issue comes down to differences rather than necessarily a problem with raw performance. One of the developers interviewed recently by Eurogamer about the CPU put the problems they were having down to this: basically it's a new CPU for them and they haven't had time to optimise things, but they expect better performance once they do. Once developers get to grips with the CPU I think it's going to have performance very similar to Xenon's, and probably better in some ways.

But moving away from a direct CPU comparison for a moment, it's important to remember that, unlike the 360, the Wii U doesn't have just a main CPU to handle the usual CPU tasks. It has three separate processors: a main triple-core IBM CPU with 1 thread per core (each thread effectively 33.33% of the CPU's total performance), an audio processor (DSP) and an I/O processor (most likely an ARM CPU clocked somewhere around 500MHz). The 360's CPU is a triple-core IBM CPU with 2 threads per core (each thread is effectively 16.66% of the CPU's performance). More threads don't necessarily give you better performance, but they can help efficiency if you expect to use a CPU for many different tasks at once. A typical use for Xenon in a game could look something like this:

AI (1 Thread)
physics (2 Threads)
Audio (1 Thread)
I/O (1 Thread)
Game Code (1 Thread)

With WiiU right now you might have:

AI/GameCode (1 core)
Physics (1 core)
Audio/I/O (1 core)

But use the extra hardware and you can go:

AI (1 core)
Physics (1 core)
Game Code (1 core)
Audio (DSP)
I/O (ARM CPU)

So while IMO the CPU's raw performance will be Xenon-level or better, I doubt it's as happy as Xenon when handling 5 or 6 different tasks at once, due to having fewer threads, and that is likely one of the things causing efficiency problems. Use the CPU plus the dedicated audio and I/O hardware, though, and it should leave Xenon behind by a decent margin: not only do you leave 3 tasks for 3 threads (which should improve efficiency), you also take 30%+ of the load away from the CPU and free it up for tasks like AI and game code.

Don't forget that physics can be done on the GPU, so that may end up looking like this:

AI (1 core)
Physics (GPU)
Game Code (1 core)
Audio (DSP)
I/O (ARM CPU)
Realtime Iwata Hair AI (1 core)
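To put rough numbers on the efficiency argument above (a back-of-the-envelope sketch in C; the per-thread shares are the figures from the post, and the ~30% offload figure is the post's own estimate, not a measurement):

#include <stdio.h>

/* Per-thread share of total CPU throughput, assuming threads
 * split a core evenly (the simplification used in the post). */
int main(void) {
    printf("360 (3 cores x 2 threads): %.2f%% per thread\n", 100.0 / 6.0);
    printf("Wii U (3 cores x 1 thread): %.2f%% per core\n", 100.0 / 3.0);

    /* Post's estimate: audio + I/O would cost 30%+ of the CPU if run
     * on it; moving them to the DSP and ARM core frees that share. */
    printf("CPU freed by DSP/ARM offload: ~30%%+\n");
    return 0;
}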
 

Absinthe

Member
I just got a response from AMD as well. I sent my request the same day Zoramon089 posted his email.

Although it is very similar to Zoramon089, it looks like this person has made sure not to mention the exact model.

[Image: 1pn2s.png]
 
Not speculation. A cursory googling will get you all the information you want on console OS footprints and backends. Neither the 360 nor the PS3 gives game devs the full 512MB of RAM to work with.

Speculation: the part where you said "some of the extra 1GB will be added to that later on".

And eDRAM has a lot of uses, not only as a framebuffer.
 

wsippel

Banned
I doubt high speed GDDR5 will be used. 2GB of that would be 20W+ alone, wouldn't it?
Yeah. I think it'll be DDR3. And that's not necessarily a bad thing. They'd need at least four 4Gb chips, and assuming they use IBM's quad channel controller (POWER7 has two of those per chip), that would be enough for an impressive ~100GB/s.
 
I just got a response from AMD as well. I sent my request the same day Zoramon089 posted his email.

Although it is very similar to Zoramon089, it looks like this person has made sure not to mention the exact model.

[Image: 1pn2s.png]

Could it be that AMD staff had been casually told what the Wii U GPU is, and now that attention is focusing on them they've been briefed to lock down information?
 
I just got a response from AMD as well. I sent my request the same day Zoramon089 posted his email.

Although it is very similar to Zoramon089, it looks like this person has made sure not to mention the exact model.

[Image: 1pn2s.png]

So now we need to figure out what embedded GPU it is.
Did the R700 line even have any embedded GPUs?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Yeah. I think it'll be DDR3. And that's not necessarily a bad thing. They'd need at least four 4Gb chips, and assuming they use IBM's quad channel controller (POWER7 has two of those per chip), that would be enough for an impressive ~100GB/s.
100GB/s is 256-bit IO at 3.2Gb/s. Even if Nintendo had 32bit-wide 2Gb DDR3 chips (so that 8x = 256-bit IO, = 2GB size), that's still 1.6GHz IO. I don't think such a DDR3 chip exists yet - the top one I've seen on recent roadmaps was ~1.2GHz, IIRC.
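The arithmetic there is just IO width times per-pin data rate (a quick sketch in C; the ~1.2GHz roadmap figure is a recollection, not a datasheet number):

#include <stdio.h>

/* Peak bandwidth (GB/s) = bus width in bytes x per-pin rate in Gb/s.
 * DDR transfers twice per clock, so a 1.6GHz IO clock = 3.2Gb/s per pin. */
int main(void) {
    printf("256-bit @ 3.2Gb/s: %.1f GB/s\n", 256 / 8.0 * 3.2);  /* ~102.4 */
    printf("256-bit @ 2.4Gb/s: %.1f GB/s\n", 256 / 8.0 * 2.4);  /* ~76.8, i.e. a 1.2GHz clock */
    return 0;
}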

So now we need to figure out what embedded GPU it is.
Did the R700 line even have any embedded GPUs?
E4690
 

Donnie

Member
Speculation, the part where you've said "some of the extra 1GB will be added to that later on".

And edram have a lot of uses, not only as framebuffer.

Erm, I think you're referring to me, and I've already answered you: the first part is speculation, but very likely. The second isn't speculation; where did I say it would be used exclusively as a frame buffer? Also, why is that even relevant? You asked what advantage the extra eDRAM could provide and I gave you one example. The leaked info from Nintendo already confirmed that the Wii U:

"supports 720p 4x MSAA or 1080p rendering in a single pass"

The 360 didn't support that, and the reason the Wii U can is down to the extra eDRAM.
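The footprint math behind that spec line (a sketch in C, assuming a simple 32-bit colour + 32-bit depth target; the actual formats may differ):

#include <stdio.h>

/* Framebuffer size for colour + depth/stencil, 4 bytes each,
 * scaled by the MSAA sample count. */
static double fb_mb(int w, int h, int msaa) {
    return (double)w * h * (4 + 4) * msaa / (1024.0 * 1024.0);
}

int main(void) {
    printf("720p, no AA:  %.1f MB\n", fb_mb(1280, 720, 1));   /* ~7.0: fits 360's 10MB */
    printf("720p, 4xMSAA: %.1f MB\n", fb_mb(1280, 720, 4));   /* ~28.1: needs tiling on 360, fits in 32MB */
    printf("1080p, no AA: %.1f MB\n", fb_mb(1920, 1080, 1));  /* ~15.8: also over 10MB, under 32MB */
    return 0;
}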
 
Sounds like the tech staffer who answered my emails goofed... and he keeps goofing.

My guess is still with the E6760 when looking at the numbers / specs.

[Image: WwCWq.png]

How much of a chip is customizable? Like, could Nintendo technically add more texture units if they wanted?

edit: Looking at the responses other people are getting...Looks like I am the chosen one
 

wsippel

Banned
100GB/s is 256-bit IO at 3.2Gb/s. Even if Nintendo had 32bit-wide 2Gb DDR3 chips (so that 8x = 256-bit, = 2GB), that's still 1.6GHz IO. I don't think such a DDR3 chip exists yet - the top one I've seen on recent roadmaps was ~1.2GHz, IIRC.
My bad, I guess I mixed up individual chips and DIMMs. POWER7 achieves 180GB/s with two quad channel DDR3 controllers according to the data sheets, but it's not using individual chips of course.

Then again, have you ever seen that weird custom 3DS FCRAM chip? Instead of using an off-the-shelf 128MB/32bit part, Nintendo had Fujitsu design a 128MB/64bit chip with two 64MB/32bit macros on a single die. As far as I understand, that is...
 

AzaK

Member

WTF?!? MY BRAIN! Who's right? Who's wrong? E4690, RV730, R700 base, E6760?? ARRGHH! Does anyone actually know any damned thing at all?

I can't see an AMD person leaking info about a GPU. Mods, check "akmcbroom" and Zora's IPs to make sure they're not the same person :)


So now I'm asking for the fourth time where the DSP idea is coming from. Where is it coming from?
What DSP idea? Are you asking if Wii U has one? It does.
 
My bad, I guess I mixed up individual chips and DIMMs. POWER7 achieves 180GB/s with two quad channel DDR3 controllers according to the data sheets.

Then again, have you ever seen that weird custom 3DS FCRAM chip? Instead of using an off-the-shelf 128MB/32bit part, Nintendo had Fujitsu design a 128MB/64bit chip with two 64MB/32bit macros on a single die. As far as I understand, that is...

What does that custom chip accomplish that an off-the-shelf chip doesn't?

I'm asking because I'd like to know what motivates Nintendo to customise internal parts; perhaps they'll apply the same sort of design philosophy to the Wii U's innards.
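If I had to guess, the answer is bus width: one 64-bit chip delivers the bandwidth of two 32-bit chips at the same data rate, in a single package (a sketch in C; the per-pin rate here is an illustrative assumption, not the 3DS's actual figure):

#include <stdio.h>

/* Bandwidth scales linearly with IO width at a fixed per-pin rate. */
int main(void) {
    const double rate_gbps = 0.8;  /* assumed per-pin data rate, Gb/s */
    printf("32-bit part: %.1f GB/s\n", 32 / 8.0 * rate_gbps);  /* 3.2 */
    printf("64-bit part: %.1f GB/s\n", 64 / 8.0 * rate_gbps);  /* 6.4 */
    return 0;
}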
 

Absinthe

Member
WTF?!? MY BRAIN! Who's right? Who's wrong? Does anyone actually know any damned thing at all?

I can't see an AMD person leaking info about a GPU. Mods, check "akmcbroom" and Zora's IPs to make sure they're not the same person :)

Don't forget Lump! :)

We have three people (that we know of) getting similar emails, although mine did NOT mention the E6760.

I did send a follow up email to see if I could get a clarification though.

Edit: Make that four people - including the Spanish email that was translated first.
 

Donnie

Member
So now I'm asking for the fourth time where the DSP idea is coming from. Where is it coming from?

The first leak, which was confirmed, listed:

Sound and Audio
Dedicated 120MHz audio DSP.
Support for 6 channel discrete uncompressed audio (via HDMI).
2 channel audio for the Cafe DRC controller.
Monaural audio for the Cafe Remote controller.
 

AgentP

Thinks mods influence posters' politics. Promoted to QAnon Editor.
Actually, the Xbox 360 also used the eDRAM to implement additional effects, like a special HDR mode.

Huh? It's a frame buffer pool; Halo 3 used two frame buffers to produce its HDR. I don't think it's legit to say the eDRAM had special features for this.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Then again, have you ever seen that weird custom 3DS FCRAM chip? Instead of using an off the shelf 128MB/ 32bit part, Nintendo had Fujitsu design a 128MB/ 64bit chip with two 64MB/ 32bit macros on a single die. As far as I understand, that is...
I assume you mean a 16Mb x 64 chip and 16Mb x 32 macros. It sort of rings a bell, though I can't recall where such info (about the dual macros) might have come to me from... Yes, I recall now: Micron was using a similar scheme for their 32-bit-wide DDR3 chip.
 