
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


JordanN

Banned
I'm actually skeptical of the original HD 4850 claims. I find it weird Nintendo would ship out those kits when the console is far far far away from that in power.

Anyone remember 01net? They were basically on an industry leaking spree, and their write-up of the Wii U made no mention of such a card. They said it was an R700 with a notch more power. Looking at the Wii U from when it was first unveiled to now, that analysis seemed about right.

I think it's possible the HD 4850 was just sites being fed misinformation or exaggeration, unless I can see actual developer confirmation somewhere.

Looking at this one IGN article, they said that in addition to the HD 4850, the CPU was 3.2GHz. I think Lherre or Arkam said the first dev kits were 1GHz. There was also Marcan, who said it's now 1.24GHz, "similar to Broadway".
 
DirectX is, quite honestly, a false pretense. It always was; barring Microsoft consoles, no platform was ever DirectX compliant, and that doesn't mean a thing other than... it's not compliant with that API.

That API is nothing to write home about, and some developers even hate it.

DirectX 11.2 is for all intents and purposes only playing catch-up with OpenGL 4.4; it doesn't lead, it follows, and it's nothing special.


Everyone talks about its feature set though, and so we do too; this has happened for years. It's like saying the GameCube was evenly matched with DirectX 7.

GX2 is essentially a variant of either OpenGL or OpenGL ES; and that's the feature set we should be going by, but people are simply not used to that terminology and what it means.

Worst case scenario, though, is OpenGL 3.3, and that's DirectX 10.1; best case scenario is OpenGL 4.3 (R800 GPUs being OpenGL 4.3). If it's anything over 4.0 (3.3 being the last 3.x release), then it's "technically" a DirectX 11 capable part.

That's all there is to it, really.


And then they can throw in some specific calls and accelerations, call it a different name (GX2) and call it a day; that's what we should be focusing on, as figuring out the OpenGL implementation should be easy enough for any developer out there: just try to use a call that is specific to a release later than 3.3; if it works, it's compliant (something like the quick probe sketched below).
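For what it's worth, on PC that kind of probe is trivial; here's a rough Python sketch of the idea (assuming the pyGLFW and PyOpenGL packages; GX2 itself isn't OpenGL, so this only illustrates the capability check on desktop hardware, and the two extensions chosen are just examples):

```python
# Rough capability probe: create a GL context and ask the driver what it exposes.
# Package names (glfw, PyOpenGL) and the extension checks are illustrative only.
import glfw
from OpenGL.GL import (glGetString, glGetStringi, glGetIntegerv,
                       GL_VERSION, GL_EXTENSIONS, GL_NUM_EXTENSIONS)

if not glfw.init():
    raise RuntimeError("GLFW init failed")
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)        # hidden window, offscreen probe
win = glfw.create_window(64, 64, "probe", None, None)
glfw.make_context_current(win)

version = glGetString(GL_VERSION).decode()
count = int(glGetIntegerv(GL_NUM_EXTENSIONS))
extensions = {glGetStringi(GL_EXTENSIONS, i).decode() for i in range(count)}

# If a feature newer than 3.3 is exposed, the part is at least that capable:
# tessellation went core in GL 4.0, compute shaders in GL 4.3.
print("GL_VERSION:", version)
print("tessellation:", "GL_ARB_tessellation_shader" in extensions)
print("compute shaders:", "GL_ARB_compute_shader" in extensions)

glfw.terminate()
```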
 
The R700 part was Shader Model 4.1 and OpenGL 3.3. It really can't be going any lower than that.

R800 is very similar to the R700; Nintendo would have to be crazy not to backport OpenGL modifications to the same architecture and get their DirectX 11-type effects through other means, as it was well within their grasp. R800 is not a new architecture like GCN (R1000); it's not even like R900, where some GPUs (Radeon HD 69xx) reworked the 5-way VLIW implementation into 4-way VLIW. No, it's the very same thing through and through; APIs and other little changes aside, it's mostly an additive design, and the "blocks" other than the core are where the most changed, the tessellation unit being of a different generation, and AVIVO and the multi-display technology also being heavily changed in that revision.

On that topic, we even know the GPU has some R800 specifics like Eyefinity (multi-display tech, mentioned above). OpenGL 4.x is not so unlikely, it's a free ticket to some DirectX 11 delicacies.
 
Thanks for the write-up. This is actually the first I've heard of the HD 4850 being used in devkits, and I'm sure you have it confused with the 4600-series. The Latte is very obviously slower than the 4850, because that one is more than 3x faster than the 360's Xenos and its performance is only slightly below Xbox One specs. Wii U ports of 360 games would all be at 1080p if that were the case, because performance would be 'free' at that resolution with those kinds of specs.

But, yes, GPGPU was always a given since the entire HD4000 series is OpenCL capable. And there are no DX11-specific effects; tessellation just didn't exist within the DX spec until DX11. It still existed in the HD4000 series even though that series wasn't DX11 compliant.

He doesn't have it confused at all; whether there was a 4850 in the devkits or not, it's been a pretty consistently well-known/believed rumour.
 
Eyefinity, OpenGL/DX specification, etc. are all irrelevant because Nintendo's API doesn't use any of those. Eyefinity, in particular, doesn't say anything about the R800-series because multi-monitor support is possible on any GPU if you have the drivers and hardware for it.

DX10 hardware, for example, supports many OpenGL 4.x extensions but it's not the same thing as full support for any OpenGL 4.x variant. It could support some extensions from 4.4 and some extensions from 4.0 but still not be compliant with either.
 

krizzx

Junior Member
The R700 part was Shader Model 4.1 and OpenGL 3.3. It really can't be going any lower than that.

R800 is very similar to the R700 though; Nintendo would have to be crazy not to backport OpenGL modifications to the same architecture and get their DirectX 11-type effects through other means. It was within their grasp via work already done by their licensor.


We even know the GPU has some R800 specifics like Eyefinity. OpenGL 4.3 is not so unlikely

I forgot about the Eyefinity as well. We also have resources backing that it has some DX11-equivalent functionality, from Unity and the Project C.A.R.S. changelog (especially that last one, which listed the Wii U and the PS4 both using the same DX11 feature). Unless people want to start trying to bin the PS4 as having only a DX10-equivalent GPU.

I'm still interested in the dual graphics engine possibility. Getting higher polygon counts at 2x the frame rate of what was common is nothing to scoff at.

It should naturally have some higher polygon-pushing power simply from being 550MHz. How it keeps such high numbers in all of these 60 FPS games, though, is another question. For as long as I can remember, the relationship of polygon count to frame rate was that at 60 FPS you could only draw half as many per frame as you could at 30. Exactly 1/2. I'm seeing increased geometry at higher frame rates for similar games. Either they are using tessellation extensively, or the GPU can draw more polygons than it seems at first glance.
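Just as a sanity check on that halving, here's the back-of-envelope version (the 550M/s figure is purely the "one triangle per clock at 550MHz" theoretical ceiling, not a measured Wii U number):

```python
# Fixed per-second triangle throughput split across frames:
# doubling the framerate halves the per-frame polygon ceiling.
polys_per_second = 550_000_000   # illustrative: one triangle/clock at 550MHz

for fps in (30, 60):
    print(f"{fps} fps -> {polys_per_second // fps:,} polygons per frame ceiling")

# 30 fps -> 18,333,333 polygons per frame ceiling
# 60 fps ->  9,166,666 polygons per frame ceiling (exactly half)
```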
 
He doesn't have it confused at all; whether there was a 4850 in the devkits or not, it's been a pretty consistently well-known/believed rumour.

It's the first that I've heard, in any case. Despite being new here, I follow these things rather closely. I just don't buy it because the 4850 is way faster than what we've seen from the Wii U.

Just to put it in perspective, a 550MHz 4850 would have:

8800 Mpixels/s fillrate vs 13500 Mpixels/s in Xbox One and 4000 Mpixels/s in 360
22000 Mtexels/s vs 40464 Mtexels/s in Xbox One and 8000 in 360
880 GFLOPS vs 1300 GFLOPS in Xbox One and 240 GFLOPS in 360

The 4600-series would place it at 4400 Mpixels/s, 17600 Mtexels/s, and 352 GFLOPS respectively. It's far more plausible, in my opinion, based on what we've seen.
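For anyone wondering where those theoretical numbers come from, it's just units times clock; a quick sketch using the commonly quoted ROP/TMU/ALU counts for each part (nothing here is a confirmed Latte spec):

```python
# pixel fill = ROPs x clock, texel fill = TMUs x clock, FLOPS = ALUs x 2 x clock (one MAD/clock).
# Unit counts are the commonly quoted ones for each PC/console part, for illustration only.
def theoretical(name, clock_mhz, rops, tmus, alus):
    mpix = rops * clock_mhz                  # Mpixels/s
    mtex = tmus * clock_mhz                  # Mtexels/s
    gflops = alus * 2 * clock_mhz / 1000.0   # GFLOPS
    print(f"{name:18s} {mpix:6d} Mpix/s  {mtex:6d} Mtex/s  {gflops:5.0f} GFLOPS")

theoretical("HD 4850 @ 550MHz", 550, 16, 40, 800)  # 8800, 22000, 880
theoretical("HD 4670 @ 550MHz", 550,  8, 32, 320)  # 4400, 17600, 352
theoretical("Xenos (360)",      500,  8, 16, 240)  # 4000,  8000, 240
```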
 
I forgot about the Eyefinity as well. We also have resources backing that it has some DX11-equivalent functionality, from Unity and the Project C.A.R.S. changelog (especially that last one, which listed the Wii U and the PS4 both using the same DX11 feature). Unless people want to start trying to bin the PS4 as having only a DX10-equivalent GPU.

I'm still interested in the dual graphics engine possibility. Getting higher polygon counts at 2x the frame rate of what was common is nothing to scoff at.

It should naturally have some higher polygon-pushing power simply from being 550MHz. How it keeps such high numbers in all of these 60 FPS games, though, is another question. For as long as I can remember, the relationship of polygon count to frame rate was that at 60 FPS you could only draw half as many per frame as you could at 30. Exactly 1/2. I'm seeing increased geometry at higher frame rates for similar games. Either they are using tessellation extensively, or the GPU can draw more polygons than it seems at first glance.
That's interesting, but even without looking at technicalities that could corroborate it (chip layout, or performance comparisons of the same title running on both consoles) I don't find it very likely.

Basically, before, the bottleneck was either texturing capacity (pipelines and textures per pass) or polygons; we didn't hear about either this generation, because they weren't the bottlenecks anymore. I know GC and Xbox had 4 pipelines; I don't know how many Xenos has. Similarly, we know how many polygons per second a lot of 128-bit games pushed, and suddenly this gen it turned out to be a non-factor.

Polygons have reached the point of diminishing returns; going from 5 million polygons per second to 10 million was so noticeable before, but today geometry is always complex provided the engine is not crap and artists invested time and resources into it. Similarly, texturing is clearly not free, but memory and latency are often the bottleneck more so than internal bandwidth, and fillrate is still a bottleneck that impacts both. Stuff like megatexture is cool, but people forget someone has to make those textures, so it turned into a problem of resources, artistry and talent more so than a hardware limitation.

Current gen consoles were bottlenecked through hell and back; the Wii U supposedly isn't, so even if the theoretically attainable polygons per second are close, the end result is very different.

How to say this... 60 frames per second is remarkably hard to get this generation; you have to sacrifice way more than just pulling half the polygons you were planning on pulling. That's why Call of Duty games have gone 880x720 and so many others opt for dynamic framebuffers that output whatever the GPU can give them, or are simply sub-720p. They've basically been doing the Halo 2 tradeoff: more texturing and fewer polygons, because polygons seemed enough, and textures were a better tradeoff to invest in to negate whatever limitations geometry had (normal maps being able to fake volume and all that).

The issue is the fillrate needed to output that, as well as RAM latency and whatever extra steps there are in the pipeline, be it tiling on the 360 or actually using main RAM for framebuffer purposes on the PS3 (not so rare on the X360 as well, I've been told). If you removed those, you could probably pull the same thing without the tradeoffs being so evident, and it's only natural the Wii U can do that and then some; it's also pulling a lot more 1080p without even going into dynamic framebuffer territory. It's clearly less bottlenecked, and that means it can get closer to the "theoretical" peaks than current gen, even if those happen to be close on paper. 550 million polygons per second is nothing to sneeze at; thing is, I'm quite secure in ballparking current gen games at never surpassing the 60 million mark in any circumstance.
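To put rough numbers on that resolution/framerate tradeoff, here's just the raw framebuffer pixel cost at the resolutions mentioned above, before overdraw, post-processing and extra passes multiply everything:

```python
# Raw pixels that have to be shaded/filled per second at each resolution and framerate.
resolutions = {
    "880x720 (CoD-style)": 880 * 720,
    "1280x720":            1280 * 720,
    "1920x1080":           1920 * 1080,
}
for name, pixels in resolutions.items():
    for fps in (30, 60):
        print(f"{name:20s} @ {fps} fps -> {pixels * fps / 1e6:5.1f} Mpixels/s")

# 1280x720 @ 60 fps (~55 Mpix/s) lands in the same ballpark as 1920x1080 @ 30 fps (~62 Mpix/s),
# which is one reason 1080p and 60 fps rarely happen together on bottlenecked hardware.
```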
 

JordanN

Banned
lostinblue said:
How to say this... 60 frames per second are remarkably hard to get this generation
Uh, what are you talking about? Rayman is 60fps. Ninja Gaiden Sigma 2 was 60fps. A whole lot of fighting games are 60fps. They all run at native resolution too.

I said it before: the Wii had a whole lot of 60fps games, and the console was still far weaker than the PS3/360. It's not about power.
 
Uh, what are you talking about? Rayman is 60fps.
It's also 1080p

Rayman and the like don't count; for one, it's mostly not 3D, so it's not even pulling a z-buffer in most instances, which costs RAM and resources to do. And when it does, the 3D is very low-end, so it's more manageable.
Ninja Gaiden Sigma 2 was 60fps. A whole lot of fighting games are 60fps. They all run at native resolution too.
I can't believe you're giving me those examples.

Fighting games don't have much going on in comparison to other genres; they've been the poster children of 60 frames per second since the Sega Saturn. Go figure. FPSs have also been mostly 60 frames per second this generation, albeit sacrificing resolution, and racing games too. That doesn't mean the 60 FPS mark didn't require more work and tradeoffs than it should. These consoles were bottleneck clusterfucks.


As for Ninja Gaiden Sigma 2, they had to optimize a lot in the meantime (after the original released on the X360) and still had to cut the number of enemies on screen quite a bit. It's also not an open-world game by any means; it's basically a corridor hack and slash.
I said it before: the Wii had a whole lot of 60fps games, and the console was still far weaker than the PS3/360. It's not about power.
It was clearly easier to pull 60 frames per second sans bottlenecks on the Wii than on the PS3/X360. For 480p the thing was more than adequate at 60 fps, something current gen wasn't at 720p, with its higher quality assets to boot.

It just happened that the console was way weaker too; but the balance was off on that one. You certainly didn't see anyone going 320x240 in order to pull 60 FPS on it; no need for that. (I know that's an exaggeration.)
 

JordanN

Banned
lostinblue said:
It's also 1080p

Rayman and the like don't count; for one, it's mostly not 3D, so it's not even pulling a z-buffer, which costs RAM and resources to do.

I can't believe you're giving me those examples.

Fighting games don't have much going on in comparison to other genres; they've been the poster children of 60 frames per second since the Sega Saturn. Go figure.

As for Ninja Gaiden Sigma 2, they had to optimize a lot in the meantime (after the original released on the X360) and still had to cut the number of enemies on screen quite a bit. It's also not an open-world game by any means.
This seems like moving the goalposts. 60fps was done a lot last gen and not always at great compromise (in the case of NGS2, I think that's a case where the 360's stronger GPU comes into play).

There was nothing special about the Wii that enabled it to do 60fps. It was coming off the GameCube, dude, which was from the same generation, where I'm sure the PS2/Xbox had plenty of the same games too.
 
I didn't say it wasn't done; 60 fps was a goal for many high-end developers this gen, depending on genre.

My point being: it wasn't as easy to get as it should be, due to bottlenecks in the architecture. 720p @ 30 fps was a much better fit for the hardware, and 1080p was suicidal in most cases. Such is not the case on the Wii U, and it's not because it's a powerhouse.

Regarding NG2, it struggled a lot on the 360, on the stair scene; that was corrected on the PS3 by nuking the enemy count. I'm not sure I blame the GPU, seeing as the game behaves better, albeit with fewer enemies on screen, on the weaker platform. The problem is current gen architecture; no platform was pulling something it shouldn't be able to do going by theoretical performance alone.

60 FPS on a good-looking game required work and sacrifices. I mean, for fuck's sake, Doom 3 kills the 360 and PS3 just for being at 60 FPS; as Carmack puts it, stencil shadows are raster pigs. The dudes in it have pointy heads.


Regarding the Wii: the GC and Wii are the same shit (sorry, I didn't go there), and 60 FPS on them wasn't difficult; it was down to deciding early on what the target framerate was, I mean, F-Zero GX on GC... hell, the PS2 was clearly less bottlenecked for 480p60 than the X360/PS3 are for 720p60, for the type of software it could hope to run, obviously. Current gen can pull 60 FPS, sure, but push much further and it'll go tits up. Hence you have to optimize a lot and rely on the things current gen hardware can do that previous consoles couldn't (better textures, geometry and effects) to get an edge. 720p60 in itself is expensive.
 
I didn't say it wasn't done; 60 fps was a goal for many high-end developers this gen, depending on genre.

My point being: it wasn't as easy to get as it should be, due to bottlenecks in the architecture. 720p @ 30 fps was a much better fit for the hardware, and 1080p was suicidal in most cases. Such is not the case on the Wii U, and it's not because it's a powerhouse.

Regarding NG2, it struggled a lot on the 360, on the stair scene; that was corrected on the PS3 by nuking the enemy count. I'm not sure I blame the GPU, seeing as the game behaves better, albeit with fewer enemies on screen, on the weaker platform. The problem is current gen architecture; no platform was pulling something it shouldn't be able to do going by theoretical performance alone.

60 FPS on a good-looking game required work and sacrifices. I mean, for fuck's sake, Doom 3 kills the 360 and PS3 just for being at 60 FPS; as Carmack puts it, stencil shadows are raster pigs. The dudes in it have pointy heads.


Regarding the Wii: the GC and Wii are the same shit (sorry, I didn't go there), and 60 FPS on them wasn't difficult; it was down to deciding early on what the target framerate was, I mean, F-Zero GX on GC... hell, the PS2 was clearly less bottlenecked for 480p60 than the X360/PS3 are for 720p60, for the type of software it could hope to run, obviously. Current gen can pull 60 FPS, sure, but push much further and it'll go tits up. Hence you have to optimize a lot and rely on the things current gen hardware can do that previous consoles couldn't (better textures, geometry and effects) to get an edge. 720p60 in itself is expensive.
Good post. It should be noted that Nintendo has been very focused on predictable/consistent hardware performance after the severely bottlenecked N64.
If Nintendo's target was to be an HD machine, it is almost certain that the Wii U was designed to run games at at least 720p at a good and consistent frame rate.
 

ahm998

Member
Is Nintendo making the Wii U's specs like this to make the Unity engine easy for indies?

From what I see, Nintendo only cares about indies and the Unity engine.

---------------------------

If there is a developer on GAF who has used the Unity engine with a Wii U devkit, is it easy to use?
 

ikioi

Banned
Also, we know from AMD's own statement that they didn't actually make the GPU at all, Nintendo made it. AMD just provided the hardware and helped them design it.

Can you provide a source for that please.

From what I recall AMD had a small team of staff who built and designed the Wii U's GPU. Nintendo did none of the engineering, design, or fabrication. AMD and other partners did all the design work for Nintendo based on their requirements.

I doubt very much Nintendo have the in-house expertise, resources, or skill to engineer GPUs or CPUs.
 

tipoo

Banned
Also, we know from AMD's own statement that they didn't actually make the GPU at all, Nintendo made it. AMD just provided the hardware and helped them design it. This thread wouldn't have gone on this long if that fabricated DF claim was even 1/10 true.

Maybe I'm misunderstanding, but that doesn't make an iota of sense to me. AMD is not a fabrication plant; they can't just make GPU designs that others send them. They themselves pay others like GloFo or TSMC to make their GPU designs. And it's certainly a collaborative effort in the Wii U; what else would AMD's role be? "Nintendo made it" can only be partially correct, like saying Sony made the Cell processor when it was IBM that was the microprocessor powerhouse. Sony laid out what they wanted and collaborated with them, sure, but the expertise was with IBM. Designing GPUs and CPUs is a massive undertaking, and Nintendo doesn't have that many hardware engineers.
 

disap.ed

Member
Can you provide a source for that please.

From what I recall AMD had a small team of staff who built and designed the Wii U's GPU. Nintendo did none of the engineering, design, or fabrication. AMD and other partners did all the design work for Nintendo based on their requirements.

I doubt very much Nintendo have the in-house expertise, resources, or skill to engineer GPUs or CPUs.

Don't doubt.
 

tipoo

Banned
Don't doubt.

Doubt. They have 5000 employees, a tiny, tiny fraction of which are chip engineers. There's a reason there are so few players in the microprocessor space: it takes billions in R&D for each new chip, not to mention the decades of expertise AMD, IBM and such have. They may give AMD goalposts and they may tweak, but the bulk of the design will have been AMD's.
 
Doubt. They have 5000 employees, a tiny, tiny fraction of which are chip engineers. There's a reason there are so few players in the microprocessor space: it takes billions in R&D for each new chip, not to mention the decades of expertise AMD, IBM and such have.

They most assuredly have engineers who at the very least help customize the chips. They've been in the business for like 30 years...
 
Doubt. They have 5000 employees, a tiny, tiny fraction of which are chip engineers. There's a reason there are so few players in the microprocessor space: it takes billions in R&D for each new chip, not to mention the decades of expertise AMD, IBM and such have.
Yes, I think that Nintendo did not actually make the chip, but they may have contributed and made some rather interesting requests. It did sound as if they played an active role in integrating Hollywood compatibility.
 

krizzx

Junior Member
Yes, I think that Nintendo did not actually make the chip, but they may have contributed and made some rather interesting requests. It did sound as if they played an active role in integrating Hollywood compatibility.

Yes, I was talking about the design when I said make, not manufacture. I can't remember the exact place they said it, but it was right after the Xbox One and PS4 were announced to be using AMD GPUs and AMD commented on their work on them.

They were pretty much, "go ask Nintendo" when it came to the Wii U GPU. They said they provided the tech but Nintendo made it to their specification.
 

tipoo

Banned
they most assuredly have engineers who at the very least help customizing the chips. they've been in the business for like 30 years...

I haven't said they didn't. Sony does too, Microsoft does, and I'm sure Nintendo does. The debate here is whether Nintendo designed MOST of the chip, rather than just tweaking and specifying needs for AMD to go and make.


They were pretty much, "go ask Nintendo" when it came to the Wii U GPU. They said they provided the tech but Nintendo made it to their specification.

Which doesn't mean Nintendo designed most of it; they asked for a custom part, just like Sony and MS have done. If any custom partner asked AMD for a part, I'm sure they'd defer to that partner to reveal specs to the public.
 
How to say this... 60 frames per second is remarkably hard to get this generation; you have to sacrifice way more than just pulling half the polygons you were planning on pulling. That's why Call of Duty games have gone 880x720 and so many others opt for dynamic framebuffers that output whatever the GPU can give them, or are simply sub-720p. They've basically been doing the Halo 2 tradeoff: more texturing and fewer polygons, because polygons seemed enough, and textures were a better tradeoff to invest in to negate whatever limitations geometry had (normal maps being able to fake volume and all that).

Sorry to ask a question largely unrelated to the topic, but are these bottlenecks different if you are rendering 720p/60 than if you are doing 720p/30 in 3D? Shouldn't they be roughly equivalent?
 

krizzx

Junior Member
I haven't said they didn't. Sony does too, Microsoft does, and I'm sure Nintendo does. The debate here is whether Nintendo designed MOST of the chip, rather than just tweaking and specifying needs for AMD to go and make.




Which doesn't mean Nintendo designed most of it; they asked for a custom part, just like Sony and MS have done. If any custom partner asked AMD for a part, I'm sure they'd defer to that partner to reveal specs to the public.

No. They were quite specific on this, that Nintendo's GPU was not created in the same manner as the PS4/Xbox One's. They actually explained the design and functionality of the XOne/PS4 GPUs, but gave next to nothing for the Wii U GPU except for what I stated.

Sorry to ask a question largely unrelated to the topic, but are these bottlenecks different if you are rendering 720p/60 than if you are doing 720p/30 in 3D? Shouldn't they be roughly equivalent?

Everything rendered at 60 FPS must be rendered twice as often as it is at 30 FPS.
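To put rough numbers on it, here's the frame-time view of the same point (a simplification: in practice stereo renderers can share some work between the two views, so the real-world hit is often a bit less than a clean halving):

```python
# Frame-time budget: at 60 fps everything must fit in half the time,
# and a brute-force stereoscopic renderer draws two views inside one budget.
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps: {budget_ms:4.1f} ms per frame, "
          f"{budget_ms / 2:4.1f} ms per eye if rendering both views from scratch")

# 30 fps: 33.3 ms per frame, 16.7 ms per eye
# 60 fps: 16.7 ms per frame,  8.3 ms per eye
```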
 

tipoo

Banned
No. They were quite specific on this, that Nintendo's GPU was not created in the same manner as the PS4/Xbox One's. They actually explained the design and functionality of the XOne/PS4 GPUs, but gave next to nothing for the Wii U GPU except for what I stated.

Sony and MS are also more open with the specs than Nintendo has been for the last decade. Maybe Nintendo just asked them not to say anything. This is a silly point to argue; it doesn't prove or disprove whether a chip was designed by A or B at all. Just going by Occam's razor, though, I'm thinking the bulk of the chip design was left to the actual chip designer. Nintendo could do all the tweaking to a design they asked for later; they just don't have the numbers for a ground-up chip. Look at what they spent on R&D last year, even though it increased, compared to what any big CPU maker has to spend.

I'm not sure what this proves or disproves about the chip anyway; Nintendo making it wouldn't change the ballpark we've put it in. It just feels like antfucking over highly unlikely things at this point.
 

krizzx

Junior Member
Sony and MS are also more open with the specs than Nintendo has been for the last decade. Maybe Nintendo just asked them not to say anything. This is a silly point to argue; it doesn't prove or disprove whether a chip was designed by A or B at all. Just going by Occam's razor, though, I'm thinking the bulk of the chip design was left to the actual chip designer. Nintendo could do all the tweaking to a design they asked for later; they just don't have the numbers for a ground-up chip.

This isn't about being open about specs either, as it wasn't specs that AMD gave out for the PS4/Xbox One GPUs either.

They were discussing their own involvement in the projects.

It is clearly known that the PS4/Xbox One GPUs are HD 7xxx based. The Wii U GPU apparently owes no single GPU to its creation. The level of customization is on two different planes.
 

tipoo

Banned
It is clearly known that the PS4/Xbox One GPUs are HD 7xxx based. The Wii U GPU apparently owes no single GPU to its creation. The level of customization is on two different planes.

Have I been saying otherwise?

It's custom. AMD does do custom chips. Their whole new semicustom strategy is about this, and ATI also did the 360 GPU which was unlike both the prior and next generation PC chips.
 

krizzx

Junior Member
Have I been saying otherwise?

It's custom. AMD does do custom chips. Their whole new semicustom strategy is about this, and ATI also did the 360 GPU which was unlike both the prior and next generation PC chips.

That is all well known, but it's not really pertinent, as in supportive or dismissive of what I said.
 

tipoo

Banned
That is all well known, but it's not really pertinent, as in supportive or dismissive of what I said.

Your own post didn't add or detract anything from the AMD or Nintendo thing we've been talking about; you just said it's more custom than most, which I agree with. Feh, this is silly.
 

krizzx

Junior Member
Your own post didn't add or detract anything from the AMD or Nintendo thing we've been talking about; you just said it's more custom than most, which I agree with. Feh, this is silly.

What exactly was your point?

Now that I remember, even Nintendo made it kind of clear that they were the designers back in the Iwata Asks about the hardware, like when they said how AMD told "them" how they could achieve backwards compatibility in the console without the bulk of the Wii chips. The bulk of the Wii U hardware is to Nintendo's specification from what I can see. They said it in a way suggesting they were designing it and that AMD was providing support to help with the design, like with the MCM, where they were talking about getting two different companies to produce hardware that worked in conjunction. That seems to be what AMD was restating in their comment.
 

jeffers

Member
tipoo: remember this is where we started. He said Ninty didn't contribute at all.
Can you provide a source for that please.

From what I recall AMD had a small team of staff who built and designed the Wii U's GPU. Nintendo did none of the engineering, design, or fabrication. AMD and other partners did all the design work for Nintendo based on their requirements.

I doubt very much Nintendo have the in-house expertise, resources, or skill to engineer GPUs or CPUs.
 
Everything rendered at 60 FPS must be rendered twice as often as it is at 30 FPS.

What I mean is stereoscopic 3D: you need two different 720p images to obtain a single stereoscopic 720p image, if I'm not wrong, so shouldn't stereoscopic 720p@30 be equivalent to non-stereoscopic 720p@60? Why not?
 

tipoo

Banned
What exactly was your point?

That nothing supports this
Nintendo made it. AMD just provided the hardware and helped them design it.

I'm sure it's highly custom. I'm sure Nintendo put a lot into the design. I'm also damn sure the bulk of the custom design will have been done by AMD, the chip engineering company. I'm not sure why the parameters of this debate keep being changed. That is the only point I've been making. Highly custom, Nintendo specified, Nintendo tweaked, Nintendo integrated BC, but AMD is still the main chip architect here.
 

krizzx

Junior Member
What I mean is stereoscopic 3D: you need two different 720p images to obtain a single stereoscopic 720p image, if I'm not wrong, so shouldn't stereoscopic 720p@30 be equivalent to non-stereoscopic 720p@60? Why not?

Not quite but that is mostly correct.

If you run the Unigine Heaven demo through, then enable 3D and run it again, you should get exactly half your previous frame rate.

That nothing supports this


I'm sure it's highly custom. I'm sure Nintendo put a lot into the design. I'm also damn sure the bulk of the custom design will have been done by AMD, the chip engineering company. I'm not sure why the parameters of this debate keep being changed. That is the only point I've been making. Highly custom, Nintendo specified, Nintendo tweaked, Nintendo integrated BC, but AMD is still the main chip architect here.

No. The design was clearly Nintendo's doing going by both their comments and AMD's. AMD is the "manufacturer" of the GPU, but Nintendo designed it to meet their interests. Wasn't it stated before that the first thing made for the Wii U was the shell? Everything was built by Nintendo to fit into that compact design, and then there was the MCM connectivity which I just mentioned. The CPU and GPU communicating on the same MCM was clearly of their design. I remember it being said in this thread a while back that there was a RAM cache on the GPU specifically for communicating with Espresso as well (one of the smaller two).

I doubt IBM/AMD suddenly shook hands and shared each other's secrets, which would be needed for them to make this design.

http://iwataasks.nintendo.com/interviews/#/wiiu/console/0/0

Nintendo has always had hardware teams to put their hardware together. You are discrediting their 30 years of hardware experience.
 

tipoo

Banned
AMD is the "manufacturer" of the GPU, but Nintendo designed it to meet their interests.

AMD doesn't manufacture anything. Like I said, AMD themselves send their CPU and GPU designs to fabrication plants like GlobalFoundries or TSMC for manufacture. If AMD didn't design the chip, Nintendo could go straight to a manufacturer.

The embedded memory as a scratchpad was just a theory we came up with, like everything else here, by the way; it's also thought to be necessary for backwards compatibility.
 

krizzx

Junior Member
AMD doesn't manufacture anything. Like I said, AMD themselves send their CPU and GPU designs to fabrication plants like GlobalFoundries or TSMC for manufacture. If AMD didn't design the chip, Nintendo could go straight to a manufacturer.

I'm not talking about the actual plants where they're produced. I'm talking about the parts provided.

I'll use the word "vendor" then. I didn't think small semantics like that would make such a huge difference. You keep dodging all of the other points as well.
 

tipoo

Banned
I'm not talking about the actual plants where they're produced. I'm talking about the parts provided.

I have no idea what you mean. Nintendo designed the chip according to you, someone else is fabricating them, so what does it mean to you to say AMD "provided" parts?
 

tipoo

Banned
I'll use the word "vendor" then. I didn't think small semantics like that would make such a huge difference. You keep dodging all of the other points as well.

What points am I dodging? It's not semantics. For someone intimately knowledgeable about the silicon industry, using manufacturer, designer, fabricator, vendor, etc. interchangeably changes what you are talking about entirely. What does vendor even mean to you? What was AMD's role in your mind if Nintendo made the chip and someone else fabricated it?
 
Not quite but that is mostly correct.

If you run the Unigine Heaven demo through, then enable 3D and run it again, you should get exactly half your previous frame rate.

http://iwataasks.nintendo.com/interviews/#/wiiu/console/0/0

I was wondering because, while it's true that few AAA games pushed for 60 frames this generation, a lot of these games can do stereoscopic 30 (games like Killzone 3, Uncharted 3 or MotorStorm Apocalypse), and while there is some trade-off, it is not as prominent as one might believe.
For example:
http://www.eurogamer.net/videos/motorstorm-apocalypse-3d-performance-analysis
http://www.eurogamer.net/videos/uncharted-3-airfield-2d-vs-3d

So I have to believe that the reliance on 720p@30 is, for the most part, a deliberate choice by the developers.
 

krizzx

Junior Member
I have no idea what you mean. Nintendo designed the chip according to you, someone else is fabricating them, so what does it mean to you to say AMD "provided" parts?

I mean exactly what I just said. We know that the GPU is not 100% AMD's hardware simply for the fact that there is Renesas hardware in it, plus the Espresso connectivity. That was how it was first realized that it was a custom-made chip and not a tweaked HD 4850 like what was initially believed.

Did you not read the Iwata Asks? Nintendo tailored every component to meet their needs.

Sony and Microsoft are going with augmented stock hardware in comparison.
 

tipoo

Banned
I mean exactly what I just said. We know that the GPU is not 100% AMD's hardware simply for the fact that there is Renesas hardware in it. That was how it was first realized that it was a custom-made chip and not a tweaked HD 4850 like what was previously believed.

Did you not read the Iwata Asks? Nintendo tailored every component to meet their needs.

Sony and Microsoft are going with augmented stock hardware.

Sigh. That has nothing to do with whether the bulk of the chip design was done by Nintendo's engineers or AMD's. I've said repeatedly, maybe you missed it each time, that I KNOW Nintendo tailors the chip. I also read Nintendo's Iwata Asks, and that talks about designing the MCM and working with the three other companies together on it. It doesn't say "we made the whole chip while AMD did nothing". How about you read my other post with the AMD quote: Nintendo is licensing an AMD design.

http://www.neogaf.com/forum/showthread.php?t=673121

So just for a bit of clarity in the last generation of consoles Microsoft and Nintendo were using our technology and we received licensing royalties from that. In this generation of consoles Nintendo is still licensing our technology, Microsoft has switched from a licensing model to a silicon model and Sony is new revenue for us.

I'm out of this debate; no need to get our heads hot over something that doesn't really change the parameters of what the U can do.
 

krizzx

Junior Member
Sigh. That has nothing to do with whether the bulk of the chip design was done by Nintendo's engineers or AMD's. I've said repeatedly, maybe you missed it each time, that I KNOW Nintendo tailors the chip. I also read Nintendo's Iwata Asks, and that talks about designing the MCM and working with the three other companies together on it. It doesn't say "we made the whole chip while AMD did nothing".

http://www.neogaf.com/forum/showthread.php?t=673121



I'm out of this debate, no need getting our heads hot over something that doesn't really change the parameters of what the U can do.

I didn't say they "made the whole chip" either. I said AMD provided the hardware and Nintendo designed it.

I also fail to see how posting that thread and the comments in it matters. It doesn't support or discredit anything that has been stated, unless I'm missing something. On the other hand, the Iwata Asks clearly shows that Nintendo was active in the design of the Wii U hardware and that they were receiving support from AMD.

What is so problematic about this? Nice of you to announce that you are dipping after the argument you started didn't go your way, though.

I was wondering because, while it's true that few AAA games pushed for 60 frames this generation, a lot of these games can do stereoscopic 30 (games like Killzone 3, Uncharted 3 or MotorStorm Apocalypse), and while there is some trade-off, it is not as prominent as one might believe.
For example:
http://www.eurogamer.net/videos/motorstorm-apocalypse-3d-performance-analysis
http://www.eurogamer.net/videos/uncharted-3-airfield-2d-vs-3d

So I have to believe that the reliance on 720p@30 is, for the most part, a deliberate choice by the developers.

You cannot discount hardware limitations and other bottlenecks in this. It's not quite that simple. Of course, design choice is always one of the most critical factors, but what must be sacrificed to make those choices possible tends to vary.


According to the IGN forums, this article http://www.siliconera.com/2013/09/0...-faster-and-more-badass-than-its-predecessor/

Says Bayo 2 is 1080p 60 FPS. I saw the 1080p, but didn't see the 60 FPS confirmation; maybe I missed something.

And yet another source stating that Bayonetta 2 is 1080p 60 FPS. This will be very telling if true.
 

tipoo

Banned
Nice of you to announce that you are dipping after the argument you started didn't go your way, though.

Ah, so mature. I'm leaving because I don't want to be sucked into this circular logic with someone who clearly has no part in the silicon industry all day. You can believe you "won" or whatever if it gives you the willies.
 

krizzx

Junior Member
Ah, so mature. I'm leaving because I don't want to be sucked into this circular logic all day. You can believe Nintendo is a GPU designer if you so choose.

How else could the chip be custom built? Do you think AMD just randomly made a custom chip out of nowhere and Nintendo said, "Wow! Look at that. Give me that chip that is unlike any other chip you've ever made with components from different hardware manufacturers that you just created randomly for no specific reason."

This chip is customized to Nintendo specifications, meaning they are its designer.
 

tipoo

Banned
How else could the chip be custom built? Do you think AMD just randomly made a custom chip out of nowhere and Nintendo said, "Wow! Look at that. Give me that chip that is unlike any other chip you've ever made with components from different hardware manufacturers that you just created randomly for no specific reason."

Yes, that is exactly what I think, since clearly I've never said Nintendo gave AMD requirements to work towards and then further tweaked the design themselves. I think they just put a bunch of chips on a dartboard, then closed their eyes and shot. You win.


You've misread what I've said enough times that I'm thinking you're getting into troll territory.

This chip is customized to Nintendo specifications, meaning they are its designer.

Apple asked Intel for ultra low voltage chips to certain specs before most others. So Apple is the designer of Intel chips. Good to know.
 

krizzx

Junior Member
Yes, that is exactly what I think, since clearly I've never said Nintendo gave AMD requirements to work towards and then further tweaked the design themselves. I think they just put a bunch of chips on a dartboard, then closed their eyes and shot. You win.


You've misread what I've said enough times that I'm thinking you're getting into troll territory.



Apple asked Intel for ultra low voltage chips to certain specs before most others. So Apple is the designer of Intel chips. Good to know.

Those are not the same circumstances, and that is not what I said at all, as that would be the same scenario I suggested about the GPU already being there.

I'm quite certain that from the start I've been saying that Nintendo designed the chip and AMD provided their hardware and support in helping them design it, as the Iwata Asks suggests, along with that other comment about their involvement in the production of the GPUs for all three of the next gen consoles.

Nintendo has 30 years of experience with putting hardware together. What is so difficult to believe about their own hardware teams, some of whom are pictured right there in the Iwata Asks, understanding enough about processor/thermal design to design a chip to meet their own very specific functionality and power needs?

Who do you think developed GX2? AMD? Do you think AMD had a stockpile of Renesas eDRAM, Wii U GamePads, and MCMs with custom PPC CPUs just lying around to make the chip of their own volition?

There are only two possibilities here: one, AMD designed the chip, which just miraculously had all of the exact functionality Nintendo needed; or two, Nintendo themselves designed the chip, licensing AMD's hardware with AMD's support, as the Iwata Asks suggests.
 