
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


JordanN

Banned
z0m3le said:
About the tessellators: the HD 4000 series used ATI's Gen 2 tessellator (capable of true tessellation, not simply surface tessellation like Xenos). Compared to GCN it will certainly fall short, but that is to be expected. Even if they were the same generation of tessellator, Wii U has roughly a third of the polygon throughput of PS4 and XB1 (which are 1.6B and 1.7B respectively). That should be all we really need to frame what Wii U's tessellator is capable of. There are also HD 4000 tessellation demos still easily found on YouTube that give a pretty solid example of it.
I don't think it's that simple.

The HD 4000 had a hardware limit of a 15x tessellation factor, whereas all later DX11-class GPUs (HD 5000 onward) can go up to 64x at minimum.

Also, I'm confused by your last sentence. Are you saying that those max figures are with tessellation? Because Crysis 3 was already doing more than 10 million polygons a frame before tessellation on an HD 7970*. Even at half that, I don't think the numbers for tessellation would come out right. Also IMO, your previous post comes off as a desperate plea for excusing the Wii U being weak. I'm sure if the Wii U had the exact same or better poly performance as the other systems, you wouldn't be saying those things.

*Assuming it's on ultra and going by recommended specs.
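For reference, here's a minimal sketch of where peak polygon figures like those usually come from: GPU clock multiplied by an assumed triangle setup rate per clock. The per-clock rates and clocks below are my assumptions for illustration (1 tri/clock for an older AMD design, 2 tris/clock for GCN), not confirmed specs.

```python
# Rough, hedged back-of-envelope for peak triangle setup rates (assumed figures).
def peak_triangles_per_sec(clock_mhz, tris_per_clock):
    return clock_mhz * 1e6 * tris_per_clock

print(peak_triangles_per_sec(550, 1) / 1e9)  # Latte, assuming 1 tri/clock -> 0.55B/s
print(peak_triangles_per_sec(800, 2) / 1e9)  # PS4 GCN, 2 tris/clock       -> 1.6B/s
print(peak_triangles_per_sec(853, 2) / 1e9)  # XB1 GCN, 2 tris/clock       -> ~1.7B/s
```

Real-world throughput with tessellation enabled will of course be far below these theoretical peaks, which is the point being argued above.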
 

H6rdc0re

Banned
This is correct to my recollection. I remember this brought a lot of ire from people who were praising the hardware like it was a "high end" PC like ATI said.

http://gamerant.com/xbox-one-ps4-pc-comparison/
http://www.xbitlabs.com/news/multim..._PlayStation_4_Is_Not_Just_a_High_End_PC.html


Something I've failed to get around to, due to *issues* over the past few days, is the Bayonetta 1 comparison posted a while back.
http://www.lensoftruth.com/head2head-bayonetta/
I've looked at the fidelity and performance of this game and I honestly wish to know why it cannot be stated that Bayonetta 2 is a clear example of the power of the Wii U GPU and its position as a next gen GPU.

What game *insert* and *insert* are doing will never be as good a comparison as two games of the same type made by the same dev in the same style but with such a tremendous difference in performance.

Inserts Devil May Cry and God of War Ascension. You can keep trying all you want, but the Wii U is nowhere near even the Xbox One, let alone the PS4. The CPU is much weaker core for core and the GPU is much, much weaker, with an older and less efficient architecture.
 

krizzx

Junior Member
Inserts Devil May Cry and God of War Ascension. You can keep trying all you want, but the Wii U is nowhere near even the Xbox One, let alone the PS4. The CPU is much weaker core for core and the GPU is much, much weaker, with an older and less efficient architecture.

Now this is exactly the type of post and behavior I have been speaking of. Not once did I, or anyone I can think of in this thread, ever suggest that the Wii U is as strong as the PS4 or Xbox One. No one even suggested the Wii U was on the same level. The PS4 and Xbox One were not being questioned or addressed at all. This was never debated. I was not addressing the PS4 or Xbox One in any way.

What is the purpose of stating this in response to what I asked? Please, enlighten me.

Also, please provide the source for how you know "the CPU is much weaker core for core" and the source that says "the GPU is much, much weaker with an older and less efficient architecture." I'm waiting intently.
 
Just out of interest, what would you have expected from PS4/XBO for them to be 'next gen'? Not talking specs, more with regards to what's happening on screen.

I think Killzone, Infamous and Driveclub look amazing on PS4 and Ryse and Dead Rising 3 on XBO. These games are running at 1080p native and show off an impressive level of visual fidelity while displaying effects not possible on current gen.

Apart from the frame rates being 60fps, I can't think of how much further they could have gone tbh, especially for the price point and power/heat envelope consoles have.
I know that this question was for z0m3le, but I believe that there were a lot of people who had somewhat realistic expectations for next-gen in general.

1) Some expected them to allow multiplayer games to be on par with or better than current high-end PCs.

2) Others expected games to be almost always 1080p/60fps

3) Others expected the general baseline of next-gen games to be higher than it is (example: the original Watch Dogs demo versus the new one).

4) On the spec side, some expected the XB1/PS4 to be over 2 TFLOPS and to have bigger CPUs than they do.

What I expected was global illumination in titles as open and massive as GTA V or Red Dead Redemption while pushing 1080p at 60fps (to have solid gameplay), heavy use of tessellation, wind simulation, realistic rain and cloth, and textures and character models pushed to the fidelity of UE 3.9's Samaritan demo.

As it is now, PS4 can't do GI in complex games, and target renders are being used a lot right now as well (I heard that the Quantum demo at E3 was run on target specs and not a PS4). We will see how close Deep Down gets to the original reveal, but that was close to what I expected, though it was in closed spaces. XB1 will fall short of all these things, and UE4 for PS4 doesn't even support all the features they had planned for the engine because PS4's performance falls short.

Since this is a Wii U thread, I'll state again: I originally expected Wii U to have XB1's specs; of course, I expected XB1's specs to be 2 to 2.5 TFLOPs with much faster CPUs. Even if Wii U had the original 600 GFLOPs, it would have held up strikingly well against XB1: you'd see games push into 1080p much more often and you would have seen every launch game run at higher fidelity than the last gen consoles. That isn't what we got, so I'm disappointed in all 3 next gen consoles, power-wise even more so in PS4 and XB1 because I expected them to actually give us a new level of fidelity only seen on the highest end PCs. Wii U was never going to give us that, so it giving me the bare minimum just isn't as important to me, since that system was never about getting the best graphics IMO.
Yes, a lot of people overestimated how powerful consoles can be at a certain price range. Having said that, there will be some impressive games on all next-gen systems.
 

krizzx

Junior Member
Has anyone gotten their hands on Splinter Cell Blacklist for the Wii U? I'm not expecting anything spectacular as it's still a port and it got no marketing (meaning minimal effort was put in), but it would be good to see how Wii U ports perform post launch.

I'm mostly interested in whether or not the ports are still showing inconsistent frame rates. Arkham City and Revelations HD had FPS fluctuating from 26-34 while the other consoles were locked at 30.

edit: found 1 video. http://www.youtube.com/watch?v=HLFBpJp3zXI Apparently the consensus in the comments is that the Wii U version is slightly higher quality, though I honestly can't tell. I don't see any frame rate stutter though.

I guess all that's left now is for "them" to analyze it.
 
What I expected was global illumination in titles as open and massive as GTA V or Red Dead Redemption while pushing 1080p at 60fps (to have solid gameplay), heavy use of tessellation, wind simulation, realistic rain and cloth, and textures and character models pushed to the fidelity of UE 3.9's Samaritan demo.

As it is now, PS4 can't do GI in complex games, and target renders are being used a lot right now as well (I heard that the Quantum demo at E3 was run on target specs and not a PS4). We will see how close Deep Down gets to the original reveal, but that was close to what I expected, though it was in closed spaces. XB1 will fall short of all these things, and UE4 for PS4 doesn't even support all the features they had planned for the engine because PS4's performance falls short.

Since this is a Wii U thread, I'll state again: I originally expected Wii U to have XB1's specs; of course, I expected XB1's specs to be 2 to 2.5 TFLOPs with much faster CPUs. Even if Wii U had the original 600 GFLOPs, it would have held up strikingly well against XB1: you'd see games push into 1080p much more often and you would have seen every launch game run at higher fidelity than the last gen consoles. That isn't what we got, so I'm disappointed in all 3 next gen consoles, power-wise even more so in PS4 and XB1 because I expected them to actually give us a new level of fidelity only seen on the highest end PCs. Wii U was never going to give us that, so it giving me the bare minimum just isn't as important to me, since that system was never about getting the best graphics IMO.

Fair enough, I think a lot of people had way, way unrealistic expectations of what the consoles were going to be tbh. I remember BG talking about the PS4/XBO specs around October last year, so I never had expectations above 1.8TFLOPs for PS4 and 1.5TFLOPs for XBO, but I do remember people in the PS4 thread still hoping its GPU would be boosted to 2.5TFLOPs. It looks like they went for an extra 4GB of RAM rather than boosting the GPU further.

Although it's very disappointing to find out that the WiiU GPU is probably only 176GFLOPs (considering BG estimated 1TF in late 2011 and 600GFLOPs in late 2012), games like Pikmin 3, W-101, Sonic, MK8, Smash and Bayonetta 2 all show that it can still produce some very nice looking games at 720p native with no tearing and mostly at 60fps which is even more impressive considering most PS4/XBO games are 30fps.

I don't think there is much more to be found from the die shot and I buy most WiiU exclusive games on release so I will post anything interesting I see in Wind Waker HD, Sonic, DKC and Mario when I get them with relation to the power / feature set of the GPU.

Something I would like to know is the bandwidth of the eDRAM. Could it be over 200GB/s?
 

krizzx

Junior Member
Fair enough, I think a lot of people had way, way unrealistic expectations of what the consoles were going to be tbh. I remember BG talking about the PS4/XBO specs around October last year, so I never had expectations above 1.8TFLOPs for PS4 and 1.5TFLOPs for XBO, but I do remember people in the PS4 thread still hoping its GPU would be boosted to 2.5TFLOPs. It looks like they went for an extra 4GB of RAM rather than boosting the GPU further.

Although it's very disappointing to find out that the WiiU GPU is probably only 176GFLOPs (considering BG estimated 1TF in late 2011 and 600GFLOPs in late 2012), games like Pikmin 3, W-101, Sonic, MK8, Smash and Bayonetta 2 all show that it can still produce some very nice looking games at 720p native with no tearing and mostly at 60fps which is even more impressive considering most PS4/XBO games are 30fps.

I don't think there is much more to be found from the die shot and I buy most WiiU exclusive games on release so I will post anything interesting I see in Wind Waker HD, Sonic, DKC and Mario when I get them with relation to the power / feature set of the GPU.

Something I would like to know is the bandwidth of the eDRAM. Could it be over 200GB/s?

If you mean Latte's eDRAM bandwidth, that has been theorized to be either 72 GB/s or 107 GB/s (not sure if that was the exact high-end number; I remember it was in the 100s and had a 7 in it).
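For what it's worth, here's a minimal sketch of how these bandwidth figures are usually derived: bus width times clock. The bus widths below are purely illustrative assumptions, not confirmed hardware details; only the 550MHz clock is the commonly reported figure.

```python
# Hedged sketch: peak eDRAM bandwidth = (bus width in bits / 8) * clock.
# The bus widths are illustrative assumptions, not confirmed hardware specs.
def edram_bandwidth_gb_s(bus_width_bits, clock_mhz):
    return (bus_width_bits / 8) * clock_mhz * 1e6 / 1e9

print(edram_bandwidth_gb_s(1024, 550))  # ~70.4 GB/s on a hypothetical 1024-bit bus
print(edram_bandwidth_gb_s(2048, 550))  # ~140.8 GB/s on a hypothetical 2048-bit bus
```

So whether it could exceed 200GB/s really comes down to how wide the internal bus actually is, which is exactly what we don't know.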
 
Although it's very disappointing to find out that the WiiU GPU is probably only 176GFLOPs (considering BG estimated 1TF in late 2011 and 600GFLOPs in late 2012)
In my view, this 176 GFLOPS figure isn't comparable in "FLOPS metric" to the 240 GFLOPS of Xenos... It's a completely different architecture; we don't know how this GPU performs if we compare it using the FLOPS metric of the PS3 and 360 GPUs. Maybe this 176 GFLOPS means 352 GFLOPS in efficiency if we use Xenos's GFLOPS metric. Maybe.
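For anyone wondering where the raw numbers come from, here's a minimal sketch of the usual peak-GFLOPS math: ALUs × 2 ops per clock (multiply-add) × clock. The two Latte ALU counts are just the candidates debated in this thread, not confirmed specs; and as the post above says, peak FLOPS across different architectures aren't directly comparable in efficiency terms.

```python
# Hedged sketch: peak GFLOPS = ALUs * 2 ops/clock (multiply-add) * clock in GHz.
def peak_gflops(alus, clock_mhz):
    return alus * 2 * clock_mhz / 1000

print(peak_gflops(160, 550))  # 176.0 -> the low-end Latte estimate
print(peak_gflops(320, 550))  # 352.0 -> the high-end Latte estimate
print(peak_gflops(240, 500))  # 240.0 -> Xenos (48 units x 5 ALUs @ 500MHz)
```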
 

H6rdc0re

Banned
Now this is exactly the type of post and behavior I have been speaking of. Not once did I, or anyone I can think of in this thread, ever suggest that the Wii U is as strong as the PS4 or Xbox One. No one even suggested the Wii U was on the same level. The PS4 and Xbox One were not being questioned or addressed at all. This was never debated. I was not addressing the PS4 or Xbox One in any way.

What is the purpose of stating this in response to what I asked? Please, enlighten me.

Also, please provide the source for how you know "the CPU is much weaker core for core" and the source that says "the GPU is much, much weaker with an older and less efficient architecture." I'm waiting intently.

I've seen your posts in the other thread (Wii U CPU Espresso) as well, so it wouldn't take a genius to see your agenda.

Well, Espresso is a low clocked PPC750 which isn't built for multithreading, doesn't have much integer or floating point performance clock for clock, has terrible branch prediction and wasn't that great an architecture to begin with 15 years ago. The better architecture back in the day was the PPC970.

The Latte (Wii U GPU) is basically a downclocked AMD Radeon HD 4650, which is terrible at pretty much anything as it has low bandwidth, low floating point performance (about 170 GFLOPS), low fillrate (both pixel and texel), low flexibility and a dated architecture (R700 family).

Enlighten me with your sources on what makes the Wii U more than 20% more powerful than the PS360. All it has is a more advanced GPU architecture which can do more in fewer cycles.
 
In my view, this 176 GFLOPS figure isn't comparable in "FLOPS metric" to the 240 GFLOPS of Xenos... It's a completely different architecture; we don't know how this GPU performs if we compare it using the FLOPS metric of the PS3 and 360 GPUs. Maybe this 176 GFLOPS means 352 GFLOPS in efficiency if we use Xenos's GFLOPS metric. Maybe.

Good point, it's definitely outperforming Xenos with 720p native, 60fps and no tearing with better visuals. It would have been unreal if it had been the 600GFLOPs BG thought!
 

guek

Banned
edit: found 1 video. http://www.youtube.com/watch?v=HLFBpJp3zXI Apparently the consensus in the comments is that the Wii U version is slightly higher quality, though I honestly can't tell. I don't see any frame rate stutter though.

I guess all that's left now is for "them" to analyze it.

lol, I wouldn't put much stock at all in YouTube comments. There was a review I read that said the Wii U version has no screen tearing but more frame drops, though I'm not sure how major the difference is. I hope DF does a comparison.
 

pulsemyne

Member
I've seen your posts in the other thread (Wii U CPU Espresso) as well, so it wouldn't take a genius to see your agenda.

Well, Espresso is a low clocked PPC750 which isn't built for multithreading, doesn't have much integer or floating point performance clock for clock, has terrible branch prediction and wasn't that great an architecture to begin with 15 years ago. The better architecture back in the day was the PPC970.

The Latte (Wii U GPU) is basically a downclocked AMD Radeon HD 4650, which is terrible at pretty much anything as it has low bandwidth, low floating point performance (about 170 GFLOPS), low fillrate (both pixel and texel), low flexibility and a dated architecture (R700 family).

Enlighten me with your sources on what makes the Wii U more than 20% more powerful than the PS360. All it has is a more advanced GPU architecture which can do more in fewer cycles.

Espresso is a custom chip and it has likely been designed with multi-core in mind. It's also clocked higher than any other chip in its line, so it's had some serious work done to it. It isn't just a matter of tweaking a setting and boom, that's the clock rate; it has been worked on quite a bit. Also, for integer, the Power7 chips were famous for being monstrously good. They lack in floating point operations (which is something games use a lot), but there are some extra bits on the chip that we know nothing about (the suspicion is extra registers).

As for Latte being a 4650... well, if this thread has proven anything, it's that we only have a very rough idea of what it's based on. There's no real certainty about it at all. It's still up in the air whether it has 160 or 320 ALUs; it may even be somewhere in between. The closest PC part based on the specs we know is the HD 5550 (Redwood LE), although there is some suspicion it's also based off Brazos. Or it is (and in my opinion most likely) a combination of different things from a couple of different generations, a bit from R700 and a bit from R800.

The only real truth to this whole thread is knowing that we know nothing....or at least only a little bit.
 

fred

Member
I've seen your posts in the other thread (Wii U CPU Espresso) as well, so it wouldn't take a genius to see your agenda.

Well, Espresso is a low clocked PPC750 which isn't built for multithreading, doesn't have much integer or floating point performance clock for clock, has terrible branch prediction and wasn't that great an architecture to begin with 15 years ago. The better architecture back in the day was the PPC970.

The Latte (Wii U GPU) is basically a downclocked AMD Radeon HD 4650, which is terrible at pretty much anything as it has low bandwidth, low floating point performance (about 170 GFLOPS), low fillrate (both pixel and texel), low flexibility and a dated architecture (R700 family).

Enlighten me with your sources on what makes the Wii U more than 20% more powerful than the PS360. All it has is a more advanced GPU architecture which can do more in fewer cycles.

Lol, not sure if serious. Funny how a GPU that's 'terrible at pretty much anything' and a CPU as poor as you think it is can produce a game like Bayonetta 2 that's not only displaying a visually impressive boss fight (Gomorrah) in 720p at 60fps with no screen tearing, but also displaying the same boss fight in 480p at 60fps on the GamePad screen at the same time.

Plus you've got more than 3 times as much eDRAM available to the GPU compared to Xenos.

Who knows about percentages..? But if you want to compare GPU performance just compare Bayonetta 1 to Bayonetta 2 and Sonic Generations to Sonic Lost World. The difference is night and day.
 

tkscz

Member
I've seen your posts in the other thread (Wii U CPU Espresso) as well, so it wouldn't take a genius to see your agenda.

Well, Espresso is a low clocked PPC750 which isn't built for multithreading, doesn't have much integer or floating point performance clock for clock, has terrible branch prediction and wasn't that great an architecture to begin with 15 years ago. The better architecture back in the day was the PPC970.

The Latte (Wii U GPU) is basically a downclocked AMD Radeon HD 4650, which is terrible at pretty much anything as it has low bandwidth, low floating point performance (about 170 GFLOPS), low fillrate (both pixel and texel), low flexibility and a dated architecture (R700 family).

Enlighten me with your sources on what makes the Wii U more than 20% more powerful than the PS360. All it has is a more advanced GPU architecture which can do more in fewer cycles.

Someone doesn't know what they're talking about. Like at all.
 

fred

Member
Forgot to mention the multithreading 'issue'. Espresso has a ridiculously short pipeline (4 stages), so it doesn't need multithreading. If I remember correctly, and I stand to be corrected, Xenon has 8 stages and the PS4 has 17. The Wii U is built around efficiency.
 
I am curious if Schnoz got a chance to run any more tests with his Fluke. After doing some reading on the matter, I am inclined to believe his readings are more accurate than a Kill-A-Watt's.

The thing is, even if we grant another few watts, the GPU still seems to draw <20w. Possibly much less. For a quick refresher, here's a list of ICs on the Wii U PCB besides the main processor:

  • Broadcom BCM43237KMLG Wireless LAN module (for USB dongles)
  • Broadcom BCM43362KUB6 802.11n Wireless Module (handles wifi and Gamepad signal)
  • Broadcom BCM20702 Bluetooth 4.0 module
  • Panasonic MN864718 HDMI Controller
  • Samsung KLM8G2FE3B eMMC 8 GB NAND Flash/Memory Controller
  • Micron 2LEI2 D9PXV [part number MT41K256M16HA-125] 4 Gb DDR3L SDRAM x4
  • DRH-WUP 811309G31 (possibly the chip that handles Gamepad A/V encoding)
  • Fairchild DC4AY x3 (MOSFETs?)
  • SMC 1224EE402 (don't know what this does)
  • Samsung K9K8G08U1D 4 Gb (512 MB) NAND Flash
http://www.ifixit.com/Teardown/Nintendo+Wii+U+Teardown/11796/2

I was surprised to find that one chip handles the Wifi for both normal usage and the Gamepad signal. Does that say 5 watts on the chip in the link I gave? Would be in line with the regulation figures I found for a similar dual band device here.

Anyway, even using conservative estimates, this is how I estimate it starting at 38 watts.

PSU @ 90% ~4w
CPU ~ 6w
Fan ~ 1.5w
Optical Drive ~ 1.5w
Wifi/Gamepad signal ~ 3w
DDR3 RAM ~ 1w
FLASH memory ~ 1w
Everything else (HDMI, MOSFETs, encoder chip, SMC chip, board itself) ~?

We're already at ~20 watts for the GPU and there are still things unaccounted for (although they may have little impact, no idea). I tried to lowball all estimates as well. Thus, while I'm still very much interested in the true power draw of this system, as of now I just can't see it making much of a difference. In fact, when we first started on this subject a few days back, I had totally forgotten to take in PSU efficiency as a factor. If anyone can add anything to this or volunteer some more accurate figures, by all means...
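Just to make the arithmetic explicit, here's a minimal sketch of that budget using the rough estimates above (none of these figures are measured values, and the unaccounted-for parts are simply left out):

```python
# Hedged sketch of the power budget above; every figure is a rough estimate.
total_at_wall = 38.0  # watts measured at the outlet
components = {
    "PSU loss (~90% efficient)": 4.0,
    "CPU": 6.0,
    "Fan": 1.5,
    "Optical drive": 1.5,
    "WiFi/GamePad signal": 3.0,
    "DDR3 RAM": 1.0,
    "Flash memory": 1.0,
    # HDMI, MOSFETs, encoder, SMC chip, board itself: unknown, left out
}
gpu_budget = total_at_wall - sum(components.values())
print(f"Left over for the GPU (upper bound): ~{gpu_budget:.0f} W")  # ~20 W
```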
 
Well, Espresso is a low clocked PPC750
No, it's a pretty high clocked PPC750, the highest clocked ever.
which isn't built for multithreading, doesn't have much integer or floating point performance clock for clock, has terrible branch prediction and wasn't that great an architecture to begin with 15 years ago.
"If you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid"

But this is the GPU thread and I'm not chiming in for krizzx; we've had our disagreements in the past and I can certainly relate to some superficial criticism regarding him being too optimistic or pushing too hard for a (positive) end. But I don't get the rabidness or the pile-on against him (and I get his reaction to it); he's been around for a while and his participation has gotten way better, he's grasping at straws far less than before and he's constantly making an effort to be reasonable and articulate. I think the name calling and not letting him save face is misplaced, like bullying in a way.

And the thing is, even if this was about him, you guys are making far worse and misinformed statements. I don't think you should point fingers while not being coherent; that's the best advice I can give you.

Cheer up boys, your make-up is running (and your logic has its pants down).
The better architecture back in the day was the PPC970.
That made me chuckle.

1997 being the same thing as 2002 amirite?

Completely different generations; that's like comparing a Pentium M to a later Core 2 Duo (3 CPU architectures apart).

And that's the one that wasn't all that great; certainly a huge energy hog, unlike PPC750.
The Latte (Wii U GPU) is basically a downclocked AMD Radeon HD 4650, which is terrible at pretty much anything as it has low bandwidth, low floating point performance (about 170 GFLOPS), low fillrate (both pixel and texel), low flexibility and a dated architecture (R700 family).
You should write for Digital Foundry.
 

krizzx

Junior Member
Someone doesn't know what they're talking about. Like at all.

And this is precisely why I asked him for an explanation. I knew it would be filled with everything except fact.

It's absurd how that rushed, fictitious analysis, which cited this thread and the early info in it as its entire basis, is being used to contradict the more mature findings on the GPU in this thread.

I still say people should call out Digital Foundry for that writeup. Too many people take it as the truth when none of it is true at all. The entire basis for their claim was that it "looked like" a 4650/4630, but it does not. It's not even entirely made of AMD technology.
 
Has anyone gotten their hands on Splinter Cell Blacklist for the Wii U? I'm not expecting anything spectacular as it's still a port and it got no marketing (meaning minimal effort was put in), but it would be good to see how Wii U ports perform post launch.

I'm mostly interested in whether or not the ports are still showing inconsistent frame rates. Arkham City and Revelations HD had FPS fluctuating from 26-34 while the other consoles were locked at 30.

edit: found 1 video. http://www.youtube.com/watch?v=HLFBpJp3zXI Apparently the consensus in the comments is that the Wii U version is slightly higher quality, though I honestly can't tell. I don't see any frame rate stutter though.

I guess all that's left now is for "them" to analyze it.
I own it and from what I've played it looks fine, but it's certainly nothing to write home about. The performance and lighting have been very good from what I've played, but there are a lot of jaggies that downgrade the visual fidelity somewhat. I haven't played the other versions so I'm not sure how they compare.

I'm really hoping that this aliasing issue that seems to be in every Wii U game is just because of the early stages of figuring out the hardware and not a limitation of the system because it really detracts from otherwise nice looking visuals in a lot of the games that I've played so far.

On another note krizzx, I can understand why you're getting so frustrated, but you really should take things a little easier. I've been participating in these threads off and on since the 1st WUST and there have always been posters (a lot of them the same ones that you're arguing with) who have moved goalposts back and forth between last gen and next gen in trying to denigrate the console. None of that really matters though, as there are people on both sides who are going to believe what they want to believe.

If you look at what's happening in the XBone and PS4 threads you'll see the same level of dissonance and confirmation bias between fans of those systems claiming that one's visuals are blowing away the other. Even more famously, we had the E3 thread after the Wii U reveal where posters were saying the 3rd party sizzle reel looked worse than the last gen versions when all the footage was taken from the last gen consoles.

So the moral of the story is that everyone is going to have different opinions about subjective matters so don't get so worked up by opposing views.
 

krizzx

Junior Member
I own it and from what I've played it looks fine, but it's certainly nothing to write home about. The performance and lighting have been very good from what I've played, but there are a lot of jaggies that downgrade the visual fidelity somewhat. I haven't played the other versions so I'm not sure how they compare.

I'm really hoping that this aliasing issue that seems to be in every Wii U game is just because of the early stages of figuring out the hardware and not a limitation of the system because it really detracts from otherwise nice looking visuals in a lot of the games that I've played so far.

On another note krizzx, I can understand why you're getting so frustrated, but you really should take things a little easier. I've been participating in these threads off and on since the 1st WUST and there have always been posters (a lot of them the same ones that you're arguing with) who have moved goalposts back and forth between last gen and next gen in trying to denigrate the console. None of that really matters though, as there are people on both sides who are going to believe what they want to believe.

If you look at what's happening in the XBone and PS4 threads you'll see the same level of dissonance and confirmation bias between fans of those systems claiming that one's visuals are blowing away the other. Even more famously, we had the E3 thread after the Wii U reveal where posters were saying the 3rd party sizzle reel looked worse than the last gen versions when all the footage was taken from the last gen consoles.

So the moral of the story is that everyone is going to have different opinions about subjective matters so don't get so worked up by opposing views.

This is difficult, because when I do ignore their attempts at distorting facts or their outrageous claims, I get told "I can't take what I dish out", accused of not listening to others' arguments, and other things of that nature. Though they never, "ever" quote where I did it. When I actually do respond with explicit, pinpoint details and facts, they either ignore them or completely vanish from the thread, leaving me with nothing but wasted effort.

The problem with the last few pages is that they resorted to making straw man arguments and ad hominem attacks (for those not familiar with argumentative fallacies: http://en.wikipedia.org/wiki/Ad_hominem , http://en.wikipedia.org/wiki/Straw_man) when their arguments weren't winning. I called them on it and got flamed in response.

I have no problem with people disagreeing with my opinion, as they do more often than not, but personal attacks and twisting my words are a different story. I do have a problem with that. A big problem.
 

ThaGuy

Member
It seems like the further along the thread goes, the more confused I get. People talk mad shit about Wii U's hardware then Nintendo shits out games like Pikmin 3 which is blowing some people away.

By the way, what happened to Llhere? I last saw him saying he was surprised at what consoles were on the UE4 list and then poof, he was gone.
 

krizzx

Junior Member
It seems like the further along the thread goes, the more confused I get. People talk mad shit about Wii U's hardware then Nintendo shits out games like Pikmin 3 which is blowing some people away.

By the way, what happened to Llhere? I last saw him saying he was surprised at what consoles were on the UE4 list and then poof, he was gone.

Forget UE4. Honestly, it didn't even look as capable as CryEngine 3.

I want to see the console list for CryEngine 4 (not sure if it's actually 4 or more of a 3.5, since it wasn't explicitly stated to my knowledge).

http://www.youtube.com/watch?v=aseq4T81P7g

The automated geometry generation feature is going to be a boon to developers. It would be nice if the Wii U was supported by it.
The procedural GPGPU weather alone knocks out the PS3/360/Wii. The Wii U still holds that possibility, though.

Speaking of which. Has there been any confirmation of Latte's GPGPU features being used in games yet?
 
If you're posting in an online forum and letting people get you so worked up, you're probably giving most of them exactly what they're looking for. These are very controversial topics because a lot of people don't like the direction that Nintendo has been trying to take the industry, so just factor in that emotions are high on both sides and the desire to see Nintendo's low power philosophy fail is very strong.

It seems like the further along the thread goes, the more confused I get. People talk mad shit about Wii U's hardware then Nintendo shits out games like Pikmin 3 which is blowing some people away.

By the way, what happened to Llhere? I last saw him saying he was surprised at what consoles were on the UE4 list and then poof, he was gone.
Llhere still drops in from time to time if you comb through the thread. I'm pretty sure that it was confirmed a while ago that there's nothing stopping 3rd parties from porting UE4 to the Wii U but Epic doesn't officially support it as in "if you run into trouble with your code, don't call us".

They've made it pretty clear that their philosophy is that for anything a developer needs to do on Wii U UE3 is enough and I tend to agree with them. I used to hate UE3 but there have been some beautiful games developed for it lately.

UE4 seems to be more for cutting edge heavy lifting and I'm not even sure if it's really suited for the soon to be released consoles anymore.
 

ugoo18

Member
Forget UE4. Honestly, it didn't even look as capable as CryEngine 3.

I want to see the console list for CryEngine 4 (not sure if it's actually 4 or more of a 3.5, since it wasn't explicitly stated to my knowledge).

http://www.youtube.com/watch?v=aseq4T81P7g

The automated geometry generation feature is going to be a boon to developers. It would be nice if the Wii U was supported by it.
The procedural GPGPU weather alone knocks out the PS3/360/Wii. The Wii U still holds that possibility, though.

Speaking of which. Has there been any confirmation of Latte's GPGPU features being used in games yet?

If you mean the recently announced new CryEngine, then yes, the Wii U is listed as one of the consoles supported.

http://crytek.com/news/crytek-announces-the-arrival-of-the-new-cryengine

On top of these changes, the new CRYENGINE supports development on current and next generation consoles (Xbox One, PlayStation®4, and Wii U™), alongside PC, with further platforms to be added in the near future.
 

krizzx

Junior Member
If you mean the recently announced new CryEngine, then yes, the Wii U is listed as one of the consoles supported.

http://crytek.com/news/crytek-announces-the-arrival-of-the-new-cryengine

Thanks. Now it's only a question of whether EA will let them make a Wii U game with it. We already know what happened with Crysis 3. I have this strange feeling that it would have been the port to beat all Wii U ports. It may be a leap in logic, but I think EA probably didn't want it to show up the other ports and contradict their claim that the Wii U wasn't a next gen console, a stance they had recently adopted after the Origin fallout.
 

FLAguy954

Junior Member
Seeing the Face-Off for Saints Row 4 on the PS360 got me thinking that a Wii U version would have been the definitive one if Deep Silver hadn't written the Wii U off :p (with v-sync, 720p native, better effects and probably a more stable frame-rate).
 

MDX

Member
The only real truth to this whole thread is knowing that we know nothing....or at least only a little bit.


What I really would like to know is the net effect of going through the trouble of placing eDRAM in both the CPU and GPU at the amounts that they did. If anything can keep the WiiU a relevant next gen machine down the line, I think it must be that. This, as well as other technological tricks like lag-free dual screen gaming, places the WiiU as a next gen console in general imho. But it's developers learning the ins and outs of using eDRAM that I think will provide gains in performance and visual fidelity for future games.

It's unfortunate that people are still wondering if the WiiU is a stronger machine than the X360. Retro should have shown off a title to erase doubt that the WiiU is the first console of the next generation, but we got fcking Donkey Kong instead. But, on paper, the eDRAM alone already places the WiiU ahead of the X360, and in line with the XONE.

“Wii U you have enough eDRAM to use it for 1080p rendering... In comparison, on XBOX360 you usually had to render in sub 720p resolutions or in multiple passes.”

That should close the book right there.
1080p vs sub 720p resolutions.
32MB vs 10MB

Then Linzer discussed the further limitations of the Xbox 360 and how Microsoft fixed that with the Xbox One.

“Even if you don’t use MSAA (MultiSample Anti-Aliasing) you already need around 16Mb just for a 1080p framebuffer (with double buffering). You simply don’t have that with XBOX360 eDRAM. As far as I know Microsoft corrected that issue and put also 32MB of Fast Ram into their new console.”


We know MS uses eSRAM
MS states:
Xbox One’s revolutionary architecture, the combination of its CPU, GPU and ESRAM is like having a supercomputer in your living room.
http://www.xbox.com/en-US/xbox-one/innovation

What does that make the WiiU? It has a similar set-up. As a matter of fact, it has more than 32MB of eDRAM: it has 34MB plus 1MB of SRAM. And we know this is in the GPU, not outside it. I'm not sure how the XONE has its eSRAM set up, but I understand it should be less dense than eDRAM, so they may not want it to take up too much space in their GPU.

But actually, the WiiU has 37MB of eDRAM to work with if you count what is assumed to be in the CPU.

Linzer... "Using eDRAM properly is a simple way to get extra performance without any other optimizations.”
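As a sanity check on the ~16MB figure in Linzer's quote, here's a minimal sketch of the framebuffer math, assuming 32 bits per pixel, double buffering and no MSAA (those assumptions are mine, matching the quote's setup):

```python
# Hedged sketch: framebuffer size at 32 bits per pixel with double buffering, no MSAA.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(framebuffer_mb(1920, 1080))  # ~15.8 MB -> fits in 32MB of eDRAM, not in 10MB
print(framebuffer_mb(1280, 720))   # ~7.0 MB
```

Which is exactly why the 360's 10MB forced sub-720p or tiled/multi-pass rendering, while 32MB comfortably holds a double-buffered 1080p target.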
 

AzaK

Member
Seeing the Face-Off for Saints Row 4 on the PS360 got me thinking that a Wii U version would have been the definitive one if Deep Silver hadn't written the Wii U off :p (with v-sync, 720p native, better effects and probably a more stable frame-rate).
Nintendo wrote the Wii U off. They released a console with essentially zero support from their first party. They released a buggy and slow OS. They didn't provide developers with good tools until AFTER the machine released, and who knows how good they even are now. They delayed all their games by 4-8 months. They waited 5+ months to start their VC and even now they trickle out games.

And they have only slightly reduced their arrogance since the NES days. Nintendo are almost solely to blame if developers don't support their machine. It's up to the platform holder to show an ecosystem that looks viable and profitable.
 
Nintendo wrote the Wii U off. They released a console with essentially zero support from their first party. They released a buggy and slow OS. They didn't provide developers with good tools until AFTER the machine released, and who knows how good they even are now. They delayed all their games by 4-8 months. They waited 5+ months to start their VC and even now they trickle out games.

And they have only slightly reduced their arrogance since the NES days. Nintendo are almost solely to blame if developers don't support their machine. It's up to the platform holder to show an ecosystem that looks viable and profitable.
While I still feel that most have been too hard on Nintendo and the Wii U's capabilities, now that I've read some of the comments from developers I have to agree with this.

After developing a system last gen that made it very difficult to port multiplatform games, Nintendo's primary focus with the Wii U should have been documentation and tools to give 3rd parties all the support they needed to hit the ground running. Instead they seem to take a very traditional, old-school development approach of dropping the system in developers' laps and then saying "What? What do you mean you need documentation? Do you need us to hold your dicks when you go to the bathroom too? Figure it out!"

This really came through when I read the interview between Iwata and Kamiya. It seemed like both took a lot of pride in taking the time to figure out hardware on their own and getting the most out of it. The problem is that when you have deadlines and multiple systems to support, most 3rd parties don't have the time to dissect Nintendo's hardware and figure out how to make it perform well. This is probably why we heard so many comments from devs disparaging aspects of the system around the Wii U launch. It's a lot easier to say that the hardware is shit than to say we don't have enough information to do the work that we need to do.

This is not to say that the Wii U is some kind of secret powerhouse, but when you have respected devs like DICE saying that the Wii U can't handle Frostbite but mobile devices can, it just makes all parties involved look ridiculous. With the recent backtracking EA has done in regard to Wii U support, I would guess that developers are starting to get more information about how the system works and are sharing the info between studios, but at this point the damage has been done and it remains to be seen if Nintendo can recover from this latest judgment lapse.
 

krizzx

Junior Member
Nintendo wrote the Wii U off. They released a console with essentially zero support from their first party. They released a buggy and slow OS. They didn't provide developers with good tools until AFTER the machine released, and who knows how good they even are now. They delayed all their games by 4-8 months. They waited 5+ months to start their VC and even now they trickle out games.

And they have only slightly reduced their arrogance since the NES days. Nintendo are almost solely to blame if developers don't support their machine. It's up to the platform holder to show an ecosystem that looks viable and profitable.

Care to explain the 40+ devs that went out of business developing for the PS3 and 360, and why the PS3 didn't have many games at the start of its life?

http://kotaku.com/5876693/every-game-studio-thats-closed-down-since-2006

Please define profitable environment for me.

http://www.destructoid.com/journey-took-thatgamecompany-into-bankruptcy-244311.phtml

Didn't know the dev of the acclaimed PS3 exclusive Journey went bankrupt.

It was always my belief that game development was supposed to be competitive and that you reap what you sow. I guess it's also Nintendo's fault that most of the people who did buy the Wii U didn't buy the terrible ports that got released for it.

While I still feel that most have been too hard on Nintendo and the Wii U's capabilities, now that I've read some of the comments from developers I have to agree with this.

After developing a system last gen that made it very difficult to port multiplatform games, Nintendo's primary focus with the Wii U should have been documentation and tools to give 3rd parties all the support they needed to hit the ground running. Instead they seem to take a very traditional, old-school development approach of dropping the system in developers' laps and then saying "What? What do you mean you need documentation? Do you need us to hold your dicks when you go to the bathroom too? Figure it out!"

This really came through when I read the interview between Iwata and Kamiya. It seemed like both took a lot of pride in taking the time to figure out hardware on their own and getting the most out of it. The problem is that when you have deadlines and multiple systems to support, most 3rd parties don't have the time to dissect Nintendo's hardware and figure out how to make it perform well. This is probably why we heard so many comments from devs disparaging aspects of the system around the Wii U launch. It's a lot easier to say that the hardware is shit than to say we don't have enough information to do the work that we need to do.

This is not to say that the Wii U is some kind of secret powerhouse, but when you have respected devs like DICE saying that the Wii U can't handle Frostbite but mobile devices can, it just makes all parties involved look ridiculous. With the recent backtracking EA has done in regard to Wii U support, I would guess that developers are starting to get more information about how the system works and are sharing the info between studios, but at this point the damage has been done and it remains to be seen if Nintendo can recover from this latest judgment lapse.

Devs like DICE all fall under EA's umbrella. As their publisher, EA dictates what they release their games on. Most of the devs who have been vehemently against the Wii U get their games published by EA. This is why there has been so much backpedaling.

First it was said that the Wii U couldn't run Crysis 3 because of CryEngine 3. Then it was said that it ran CryEngine 3 just fine, and that Crysis 3 was already up and running until EA blocked its release on the Wii U.

First they said the Frostbite engine couldn't run well on the Wii U. Then they said it could run, but there isn't enough interest in the console. There is also the fact that EA intentionally gimped the games it released for the Wii U in the launch period.

Regardless of what Nintendo does with the Wii U, it is a dev's sole responsibility to make their game as appealing as possible, and they seem to have worked to do just the opposite in a lot of cases. It is not Nintendo's fault that I had no interest in Darksiders 2 on the Wii U, or in Arkham City Armored Edition. It's the devs' fault for making the game poorly.

Sure, the tools weren't 100% at launch and more than likely still aren't. But that didn't stop Frozenbyte, Shin'en, WayForward and all of the other devs who made fully capable games on the console and have praised their success on it.
 

The_Lump

Banned
I am curious if Schnoz got a chance to run any more tests with his Fluke. After doing some reading on the matter, I am inclined to believe his readings are more accurate than a Kill-A-Watt's.

The thing is, even if we grant another few watts, the GPU still seems to draw <20w. Possibly much less. For a quick refresher, here's a list of ICs on the Wii U PCB besides the main processor:

  • Broadcom BCM43237KMLG Wireless LAN module (for USB dongles)
  • Broadcom BCM43362KUB6 802.11n Wireless Module (handles wifi and Gamepad signal)
  • Broadcom BCM20702 Bluetooth 4.0 module
  • Panasonic MN864718 HDMI Controller
  • Samsung KLM8G2FE3B eMMC 8 GB NAND Flash/Memory Controller
  • Micron 2LEI2 D9PXV [part number MT41K256M16HA-125] 4 Gb DDR3L SDRAM x4
  • DRH-WUP 811309G31 (possibly the chip that handles Gamepad A/V encoding)
  • Fairchild DC4AY x3 (MOSFETs?)
  • SMC 1224EE402 (don't know what this does)
  • Samsung K9K8G08U1D 4 Gb (512 MB) NAND Flash
http://www.ifixit.com/Teardown/Nintendo+Wii+U+Teardown/11796/2

I was surprised to find that one chip handles the Wifi for both normal usage and the Gamepad signal. Does that say 5 watts on the chip in the link I gave? Would be in line with the regulation figures I found for a similar dual band device here.

Anyway, even using conservative estimates, this is how I estimate it starting at 38 watts.

PSU @ 90% ~4w
CPU ~ 6w
Fan ~ 1.5w
Optical Drive ~ 1.5w
Wifi/Gamepad signal ~ 3w
DDR3 RAM ~ 1w
FLASH memory ~ 1w
Everything else (HDMI, MOSFETs, encoder chip, SMC chip, board itself) ~?

We're already at ~20 watts for the GPU and there are still things unaccounted for (although they may have little impact, no idea). I tried to lowball all estimates as well. Thus, while I'm still very much interested in the true power draw of this system, as of now I just can't see it making much of a difference. In fact, when we first started on this subject a few days back, I had totally forgotten to take in PSU efficiency as a factor. If anyone can add anything to this or volunteer some more accurate figures, by all means...

If Schnoz is correct with his reading of 48w, then at 90% efficiency that would be ~43w going into the box, right?

My estimate (guess) is ~10w for RAM, WiFi, peripherals etc and the rest to the MCM. I'd guess 35w is the absolute max we're ever likely going to get going to the MCM (assuming the system has room to be pushed into drawing more power).

I don't know much about the subject, but I was under the impression that the decision to go with an MCM (containing both CPU and GPU) was partly to reduce power consumption. By how much is impossible to know, I suppose.

We also don't know how much the games we've been able to test it with so far are pushing the system. It would be nice to get a good pool of data (idle consumption vs. various graphically intensive games, online multiplayer with peripherals plugged in, etc.).
 

The_Lump

Banned
Care to explain the 40+ devs that went out of business developing for the PS3 and 360, and why the PS3 didn't have many games at the start of its life?

http://kotaku.com/5876693/every-game-studio-thats-closed-down-since-2006

Please define profitable environment for me.

http://www.destructoid.com/journey-took-thatgamecompany-into-bankruptcy-244311.phtml

Didn't know the dev of the acclaimed PS3 exclusive Journey went bankrupt.

It was always my belief that game development was supposed to be competitive and that you reap what you sow. I guess it's also Nintendo's fault that most of the people who did buy the Wii U didn't buy the terrible ports that got released for it.

What AzaK said is still completely true though. Regardless of shoddy 3rd party ports etc., Nintendo screwed up the launch/launch window big time, as perfectly described by AzaK.

Edit: Also, this isn't tech talk so it's not really relevant here, is it? Let's not get this thread closed after all the work that's gone into it. There's still a lot we need to find out, even if we now have a pretty firm picture of the GPU's likely real world performance.
 

krizzx

Junior Member
What AzaK said is still completely true though. Regardless of shoddy 3rd party ports etc., Nintendo screwed up the launch/launch window big time, as perfectly described by AzaK.

Edit: Also, this isn't tech talk so it's not really relevant here, is it? Let's not get this thread closed after all the work that's gone into it. There's still a lot we need to find out, even if we now have a pretty firm picture of the GPU's likely real world performance.

I do not question that Nintendo screwed up its launch. What I question is the logic that followed: his claim that it was Nintendo's screwed-up launch that made developers not want to go multiplat with the Wii U, rather than other reasons.

That doesn't explain why games didn't come to the Wii, and when fully functional multiplats were released on the Wii, like Call of Duty, they did well. It doesn't explain why multiplats didn't come to the Dreamcast, GameCube, N64, and Saturn. The thing is, most devs have a preferred console for whatever reason. Epic prefers the 360. Naughty Dog prefers the PS3. Shin'en prefers Nintendo hardware. The Angry Birds dev prefers iOS.

There is another key point as well. Nintendo rarely wipes devs' posteriors with money the way Sony and Microsoft do, buying timed exclusives and exclusive DLC, and forcing devs to only show their platform's footage during demonstrations and advertisements.

As for the tech talk, I've been saying this as well. This wouldn't be a problem if people didn't come in here with the primary goal of attacking Nintendo and its business practices, which is irrelevant to the GPU in any way. That is what usually causes the huge derailings.


I would prefer to keep talk strictly to the GPU and CPU talk strictly to its own thread.
 
Devs like DICE all fall under EA's umbrella. As their publisher, EA dictates what they release their games on. Most of the devs who have been vehemently against the Wii U get their games published by EA. This is why there has been so much backpedaling.

First it was said that the Wii U couldn't run Crysis 3 because of CryEngine 3. Then it was said that it ran CryEngine 3 just fine, and that Crysis 3 was already up and running until EA blocked its release on the Wii U.

First they said the Frostbite engine couldn't run well on the Wii U. Then they said it could run, but there isn't enough interest in the console. There is also the fact that EA intentionally gimped the games it released for the Wii U in the launch period.

Regardless of what Nintendo does with the Wii U, it is a dev's sole responsibility to make their game as appealing as possible, and they seem to have worked to do just the opposite in a lot of cases. It is not Nintendo's fault that I had no interest in Darksiders 2 on the Wii U, or in Arkham City Armored Edition. It's the devs' fault for making the game poorly.

Sure, the tools weren't 100% at launch and more than likely still aren't. But that didn't stop Frozenbyte, Shin'en, WayForward and all of the other devs who made fully capable games on the console and have praised their success on it.
While there are some valid points in what you're saying, Nintendo does have a responsibility as the platform holder to make sure that their 3rd party partners have the tools to be successful on their platform. There have been a lot of posts in this thread (hopefully not by me :/) blaming lazy devs for shoddy Wii U ports when, in retrospect, it doesn't seem that anyone, including Nintendo's studios, had the time and dev tools to optimize their games at launch.

When you already have a poor relationship with 3rd parties, creating an environment where they feel that you've helped sabotage their games and made them look like bad developers is only going to make things worse.
Edit: Also, this isn't tech talk so it's not really relevant here, is it? Let's not get this thread closed after all the work that's gone into it. There's still a lot we need to find out, even if we now have a pretty firm picture of the GPU's likely real world performance.
It's not my call, but I think discussion of the lack of documentation and proper dev tools is relevant in a GPU analysis thread, since it directly impacts our ability to make judgments about the hardware based on released software.

Discussions about publisher pricing and release schedules are OT though, IMO.
 

H6rdc0re

Banned
No, it's a pretty high clocked PPC750, the highest clocked ever.


High clocked for a PPC750, but its clock speed is very low compared to other architectures these days, all of which do more DMIPS and FLOPS per clock.

1997 being the same thing as 2002 amirite?

Completely different generations; that's like comparing a Pentium M to a later Core 2 Duo (3 CPU architectures apart).

And that's the one that wasn't all that great; certainly a huge energy hog, unlike PPC750.

I'm not talking about the date of the actual architecture, but rather the GameCube, which was around the same timeframe.

You can't deny the brand new Jaguar cores make short work of the ancient PPC750.


Sure, it's not exactly an HD 4650, but it is close in terms of performance judging by its die size and power draw. It certainly won't be anywhere near the GPU in the Xbox One and/or PS4.

And this is precisely why I asked him for an explanation. I knew it would be filled with everything except fact.

It's absurd how that rushed, fictitious analysis, which cited this thread and the early info in it as its entire basis, is being used to contradict the more mature findings on the GPU in this thread.

I still say people should call out Digital Foundry for that writeup. Too many people take it as the truth when none of it is true at all. The entire basis for their claim was that it "looked like" a 4650/4630, but it does not. It's not even entirely made of AMD technology.

Like I said before, enlighten me. Show me with a detailed response why the Wii U is so impressive, or otherwise give me some sources.
 

krizzx

Junior Member
While there are some valid points in what you're saying, Nintendo does have a responsibility as the platform holder to make sure that their 3rd party partners have the tools to be successful on their platform. There have been a lot of posts in this thread (hopefully not by me :/) blaming lazy devs for shoddy Wii U ports when, in retrospect, it doesn't seem that anyone, including Nintendo's studios, had the time and dev tools to optimize their games at launch.

When you already have a poor relationship with 3rd parties, creating an environment where they feel that you've helped sabotage their games and made them look like bad developers is only going to make things worse.

It's not my call, but I think discussion of the lack of documentation and proper dev tools is relevant in a GPU analysis thread, since it directly impacts our ability to make judgments about the hardware based on released software.

Discussions about publisher pricing and release schedules are OT though, IMO.

I can agree with that to a point. That still doesn't excuse the fact that, just as with the Wii, the devs failed to properly polish and promote their games.

The Wii didn't have banged-up tools at launch and development was around 1/3 the cost of a PS3/360 game, but it received this exact same treatment in most cases. Devs would half-ass their games to sub-PS2 levels. I can't fault the limit of the Wii's power because most devs never reached it. They didn't even try. They failed to do anything that generally makes a game a success.

I'm not excusing Nintendo for not having proper tools ready, but I'm also not excusing the devs for failing to make sure the games were AAA before releasing them to customers, and for trying to charge full price for year-old games and content that you can get cheaper everywhere else.

The flaws in the graphical performance of the games, which is the main aspect relevant to the GPU, are primarily the devs' fault, not Nintendo's. Whether they had to trim content or delay the release, there is no excuse for selling people a glitchy game and then not patching the glitches. I suspect none of that was done because they never put that much effort into the ports to begin with.

The fact that most indie devs have praised their sales on the console and continuously talk about what they will release next shows that Nintendo did make a profitable environment for those who are willing to make the proper effort.

The best example is Capcom. They have been vocal about how they had trouble developing for the Wii U, yet they have stated that Monster Hunter Tri Ultimate was a "smash hit".
 
Hey guys. I wanted to come through to see how things were. Surprised the thread is still going after some of the more recent posts.

I just wanted to chime in on the TDP discussion.

First I wanted to try and help clarify what USC and Fourth were saying. What they are talking about is load (the word I think was missing) along with efficiency. If I put my two cents in, I would say that as of now the PSU is 70-75% efficient. Most readings hit around 32w at the outlet. So in this case: outlet draw - 30% lost to heat (9.6w) = 22.4w used by the console. This would also put the load at ~30% of the PSU's 75w rating.
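To make that math easy to rerun with different numbers, here is a minimal Python sketch of the calculation. The ~32w wall reading, the 70% efficiency, and the 75w brick rating are the assumed inputs from this post, so treat the outputs as approximate.

Code:
# A rough sketch of the PSU math above. The 70% efficiency figure and the
# ~32w wall reading are taken from the post, not measured here.
OUTLET_DRAW_W = 32.0    # power measured at the wall
PSU_EFFICIENCY = 0.70   # assumed 70-75% efficient PSU; low end used
PSU_RATING_W = 75.0     # Wii U power brick rating

console_power = OUTLET_DRAW_W * PSU_EFFICIENCY    # power actually delivered
heat_loss = OUTLET_DRAW_W - console_power         # lost in the supply as heat
load = console_power / PSU_RATING_W               # fraction of rated capacity

print(f"Delivered to console: {console_power:.1f}w")   # 22.4w
print(f"Lost as heat in PSU:  {heat_loss:.1f}w")        # 9.6w
print(f"Load on 75w supply:   {load:.0%}")              # ~30%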

One thing I would disagree with would be the amount of power draw attributed to the PSU itself. USC can correct me, but I think he gave as much as 4w to the power supply on its own. I don't think a PSU with no fan is drawing ~5% of the total 75w at only a 33% load. However, I don't know if there are any tests that look strictly at PSU draw. I think the only thing we know is that, from Anandtech's earlier analysis, the power draw of the Wii U while idle is 0.22 watts.

Now the next part I am going to say is not to argue whether or not the GPU has 160 or 320 ALUs (though I will say at this point I believe I can eliminate my suggestion of 256), but to point out that looking at the current power draw may not be a reliable metric to draw a final conclusion on which of the two it may be.

Looking at the E4690 (disclaimer: I am in no way claiming that this, or a modified version of it, is the GPU in the Wii U; I don't want someone like SpecialRangers saying I am claiming that to be the case): this is a low-TDP embedded GPU that I am using purely as a comparison point. Below is a chart taken from a PDF for this GPU. Its ALU/TMU/ROP count is 320/32/8.

[image: E4690 clock speed vs. TDP chart (dnw1sU9.png)]


As you can see, at 25w the GPU is clocked at 600Mhz and the memory at 700Mhz. From here I will be using some "dirty math", since I'm only working from paper specs and can't take every little detail into account. Across the 150Mhz clock difference in the chart (600Mhz at 25w down to 450Mhz at 17w), the TDP drops 8w, or 1w for every 18.75Mhz reduction in clock speed. That would put this GPU at ~22.3w if it were clocked at 550Mhz like Latte. Last I was here we were given indications of a fab at 40/45nm, so I looked at the closest AMD/ATI GPUs that went from 55nm to 40nm and compared their TDP reductions. Focusing on the low end, an 80 ALU part and a 320 ALU part saw reductions of 23.4% and 18.75% respectively (the reduction was larger on more powerful GPUs). Applying that to the 22.3w gives a range of approx. 17-18w, and that includes both the GPU and memory. In contrast to Latte, you would still need to factor in things like it having at best half the TMUs, and a 2012 40/45nm process vs. a 2009 40nm process. I would also postulate that the DDR3 plus eMemory in the Wii U has a draw equal to or slightly more than the GDDR3 on the E4690.
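Here is the same "dirty math" as a small Python sketch, using the chart's 600Mhz/25w and 450Mhz/17w points and the 23.4%/18.75% process-shrink reductions quoted above. The linear clock-to-TDP scaling and the choice of reduction factor are assumptions, so the output is only a ballpark.

Code:
# Linear TDP-vs-clock interpolation from the E4690 chart points, followed by
# the 55nm -> 40nm reductions observed on low-end parts. Inputs and the
# assumption of linearity come from the post above.
HIGH_CLOCK_MHZ, HIGH_TDP_W = 600.0, 25.0   # chart point: 600Mhz at 25w
LOW_CLOCK_MHZ, LOW_TDP_W = 450.0, 17.0     # chart point: 450Mhz at 17w
LATTE_CLOCK_MHZ = 550.0                    # Wii U GPU clock

w_per_mhz = (HIGH_TDP_W - LOW_TDP_W) / (HIGH_CLOCK_MHZ - LOW_CLOCK_MHZ)
tdp_at_550 = HIGH_TDP_W - (HIGH_CLOCK_MHZ - LATTE_CLOCK_MHZ) * w_per_mhz
print(f"Interpolated TDP at 550Mhz: {tdp_at_550:.1f}w")        # ~22.3w

# Apply the 55nm -> 40nm TDP reductions seen on an 80 ALU and a 320 ALU part.
for name, reduction in [("80 ALU reduction (23.4%)", 0.234),
                        ("320 ALU reduction (18.75%)", 0.1875)]:
    print(f"{name}: {tdp_at_550 * (1 - reduction):.1f}w")      # ~17.1w / ~18.1w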

Going back to load, we can't say for sure at this time that a 33% load (assumed) is the max we will see in games throughout Wii U's life. Will some later games that are assumed to be more intensive cause a larger draw on power? That's something that can only be determined by testing games over the console's life.

So again, I'm just saying this to suggest that looking at TDP may not be conclusive, as I think it could be argued that a 320 ALU part could fit in the current thermal envelope as well. Not saying that Latte has 320 ALUs, just that as of now I don't think it can be ruled out based on TDP either.

EDIT: I fixed my error with the formula. I also understand now what USC was saying. He was talking about power lost as heat, while I was thinking he was talking about power draw. I only glanced at the post that's why I wanted him to correct me if needed. It was needed. :p
 

krizzx

Junior Member

Thanks for the info, BG. Though, I'm curious as to whether different cards that both have the same ALU count, clock, and fab in the same timeframe always give the same wattage as well. In other words, I'm curious about the difference in efficiency of the ALUs themselves.

As for the bolded, I understand this more than you can imagine at the moment.
 
Thanks for the info, BG. Though, I'm curious as to whether different cards that both have the same ALU count, clock, and fab in the same timeframe always give the same wattage as well.

Someone can correct me if I'm wrong since I'm not remembering at the moment, but I think the listed TDP is the total board draw, which is how you can see similar GPUs with different ratings; in this case, embedded vs. discrete.
 

krizzx

Junior Member
I see. I was hoping to find a difference between the ALUs themselves. Say, if they use a different material, design, or logic set, you could make one 30 ALU component on 55nm consume 50% less energy than another 30 ALU component on a different 55nm chip. A difference in efficiency amongst the same type of part.

ALUs confuse me.

I see cards like this http://www.amd.com/us/products/desk...d-5450-overview/Pages/hd-5450-overview.aspx#2 getting such high performance vs. other cards with higher specs, but going by popular thought, it shouldn't, given its clock and ALU count.

Well, thanks for the help.
 
I see. I was hoping to find a difference between the ALUs themselves. Say, if they use a different material, design, or logic set, you could make one 30 ALU component on 55nm consume 50% less energy than another 30 ALU component on a different 55nm chip. A difference in efficiency amongst the same type of part.

ALUs confuse me.

That's a logical issue. While I was looking at listed TDPs, I compared the Cypress LE (5830) and the Barts XT (6870), both of which are 40nm. The latter has more ROPs and a slightly higher clock, yet its listed TDP is 151w while the former's is 175w, a ~13.7% reduction.
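A quick check of that percentage in Python, using the listed board TDPs as given in this post (reference ratings, so approximate):

Code:
# Listed board TDPs from the post: Cypress LE (HD 5830) vs. Barts XT (HD 6870).
CYPRESS_LE_TDP_W = 175.0
BARTS_XT_TDP_W = 151.0

reduction = 1 - BARTS_XT_TDP_W / CYPRESS_LE_TDP_W
print(f"TDP reduction: {reduction:.1%}")   # ~13.7%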
 

JordanN

Banned
Why can't Nintendo just formally release all of this to the public? Is it so fucking hard?
They're stubborn, that's all. They've released specs before (way before the Gamecube crap happened), so there is no real argument now other than PR.

Luckily, the console is out now and the games are not doing anything magical so it's not a big loss that we don't know.

But it sure would have mattered pre-release when we had all those Wust/rumor threads that only ended in pointless bickering. Now the same still exists but it's limited to just this thread!
 
High clocked for a PPC750, but its clock speeds are very low compared to other architectures these days, all of which do more DMIPS and FLOPS per clock.
Suuuuure.

Xbox 1 XCPU: 951.64 DMIPS @ 733 MHz
Pentium III: 1124.311 @ 866 MHz
GC Gekko: 1125 DMIPS @ 486 MHz (har har fast against de-cached Pentium 3 my ass)
Wii Broadway: 1687.5 DMIPS @ 729 MHz

Pentium 4A: 1694.717 @ 2 GHz
PS3 Cell PPE: 1879.630 DMIPS @ 3.2 GHz (sans SPE)
X360 Xenon: 1879.630 DMIPS*3 = 5638.90 DMIPS @ 3.2 GHz (each 3.2 GHz core performing the same as the PS3's PPE)
PowerPC G4: 2202.600 @ 1.25GHz
AMD Bobcat: 2662.5*2 = 5325 DMIPS @ 1 GHz
Wii U Espresso: 2877.32 DMIPS*3 = 8631.94 DMIPS @ 1.24 GHz (again, final performance taking into account 3 fully accessible cores)
Pentium4 3.2GHz: 3258.068
6 core Bobcat: 4260*6 = 25560 DMIPS @ 1.6 GHz (said CPU doesn't exist, but best case scenario Jaguar is supposed to perform 20% better; that would be 5112 DMIPS per core, 30672 DMIPS for 6 cores; it's probably somewhere in between. I'm using 6 cores because that's what devs will have access to)

A 6 core 1.6 GHz PPC750 could actually compete with Bobcat in Dhrystone performance; as for floating point, it was simply not designed that way, but I feel people put too much emphasis on CPU floating point for no good reason.
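To make the scaling in that list explicit, here is a short Python sketch of the arithmetic. The linear scaling with clock and the 20% Jaguar-over-Bobcat uplift are the post's assumptions, not measured results.

Code:
# Rough DMIPS scaling from the per-core figures in the list above.
BOBCAT_DMIPS_PER_CORE_1GHZ = 2662.5   # 2-core Bobcat @ 1 GHz -> 5325 total
ESPRESSO_DMIPS_PER_CORE = 2877.32     # Wii U Espresso core @ 1.24 GHz

bobcat_core_16 = BOBCAT_DMIPS_PER_CORE_1GHZ * 1.6   # 4260 per core @ 1.6 GHz
bobcat_six = bobcat_core_16 * 6                      # 25560 for six cores

jaguar_core = bobcat_core_16 * 1.2                   # 5112 per core (best case)
jaguar_six = jaguar_core * 6                         # 30672 for six cores

espresso_three = ESPRESSO_DMIPS_PER_CORE * 3         # ~8632 for three cores

print(f"6-core Bobcat @ 1.6 GHz:   {bobcat_six:.0f} DMIPS")
print(f"6-core Jaguar (best case): {jaguar_six:.0f} DMIPS")
print(f"3-core Espresso:           {espresso_three:.0f} DMIPS")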


You also have Blu's SIMD benchmarks which are certainly not embarassing for the architecture.
I'm not talking about the date of the actual architecture, but rather the Gamecube, which was around the same timeframe.
Bloody hell.

Gamecube launched in September 2001; PowerPC G5/970 launched in June 2003; that's almost two years, and I don't think I have to remind you how fast things changed back then.

And yeah, because Nintendo should have gone from the 180nm, 4.9W TDP Gekko to a 130nm, 42W PPC970 variant; that's nearly a tenfold increase in power despite the smaller manufacturing process.
You can't deny the brand new Jaguar cores make short work of the ancient PPC750.
Higher clocked and it has more cores; I mean, duh. Like I illustrated above, in DMIPS they could actually be closely matched; as for the rest, it's pretty much a design decision and they have to live and die with it, but it's certainly not the shameful architecture you're making it out to be.

I think Nintendo should have gone with more cores, but with that said, it's still incredibly powerful for the energy drain; that's the thing it has going for it.

Notice we're listing it as a 6W part; how lean the design is is a huge factor. Even if it doesn't bring any bonus to the table (other than a lower power bill you and I really don't care about), the thing is efficient.

The Wii U seems to be remarkably efficient for a 33/38W console, with the HD Twins not being able to dream of going as low even now, after numerous die shrinks and optimizations.

The problem is, as impressive as the performance per watt might be, they shot themselves in the foot by not going higher. An 80W Wii U, say, would have had the margin to be so much more powerful than it is, and that could have been achieved by doubling the logic, including CPU cores.
Sure it's not exactly a HD4650 but it is close in terms of performance judging by it's die size and power drain. It certainly won't be anywhere near the GPU in the Xbox One and/or PS4.
It's not meant to be. Calling it an off-the-shelf part, though, is not only insulting and revealing of how much you seem to know regarding this, it's also misleading should anyone read it and run with it.

They might have had all the wrong priorities with it, as you (and I) might think, but it's still as custom as it gets.
 

Question: How Many watts did Broadway draw?
 

prag16

Banned
lol, I wouldn't put much stock at all in YouTube comments. There was a review I read that said the Wii U version has no screen tearing but more frame drops, though I'm not sure how big the difference is. I hope DF does a comparison.
For what it's worth, an Amazon reviewer claimed there was a day 1 update to fix the frame drops, which reviewers probably didn't have. But take that with a grain of salt. We'll see.
 
Status
Not open for further replies.