
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Huh??? I am chill. I'm just calling out his actions.

All I did was ask him to verify what he said and point out the flaws in his argument, just the same as when you called my statement about multiple Project Cars logs being posted, and I quote, "bullshit", only to later be proved wrong. How is that hostile?

This is a logical discussion. If you don't like what you say being called into question then you are going to have a difficult time here. I am not supposed to take any and everything you say at face value without questioning it.

You've gotten into an argument with a lot of people in this thread. I've already stated I was wrong on my part. But don't you think it's a bit odd that you've gotten into an argument with most people that have posted on here? You need to calm down.

If you want to have a logical discussion, that's fine. But some people just weren't happy with your arguments.

For example, the most recent one: the figures you use for failure rates were from 2007-2008. Those are hardly representative of current rates, for better or worse.

Not only that, but you associate low power consumption with "efficiency", which is just wrong. Yes, a Wii U is more efficient than a PS360, slim or not... But to say that the build quality of a Wii is better than a PS3/360 because the failure rate is lower is wrong.

You can't compare the two. The power consumption and cooling requirements of all 3 are different. More power = more heat = faster hardware degradation. You can't just ignore all the things (especially power) that go into the devices that the Wii simply doesn't have.

Ignoring those is illogical...

Any suggestions? I'm willing to risk my Wii U power supply for science (lol) & have some basic electronics gear + experience.

I would have no idea how to do that... I can create logic gates with circuitry... but I'm terrible at wiring in general. I would have zero clue how to wire a PSU to something that would draw 75W. Someone else probably can.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I get your points, blu. So you don't think that the 450MHz laptop part I mentioned could possibly be some of the runoff from the embedded line?
I don't think I've seen the particular part you're referring to, but in principle nothing stops a laptop from hosting an E-series GPU. It could be hard to justify business-wise as the laptop maker would be paying a premium for that part, but hey, some of those laptops are priced outrageously anyway, so they could afford all kinds of 'exotic' parts inside.

Granted, I still don't think they are a good way of comparison for the reasons I mentioned to bg in that same post. I don't want to belabor the issue any more. I will accept the possibility of them cramming 320 shaders into Latte, looking at nothing other than the power draw. That does not mean that I find it likely, however.
Don't get me wrong - I'm not saying anything about the number of ALUs here (well, save for the fact it must be a multiple of 4), I'm just saying that the 'don't compare them to AMD's E-series of GPUs' argument is based on fallacious reasoning.
 

QaaQer

Member
I would have no idea how to do that... I can create logic gates with circuitry... but I'm terrible at wiring in general. I would have zero clue how to wire a PSU to something that would draw 75W. Someone else probably can.

Maybe I'll hit up one of the diy forums. I'm off to hike the six glaciers at Lake Louise tho, so it won't be today.

ciao.
 

krizzx

Junior Member
You've gotten into an argument with a lot of people in this thread. I've already stated I was wrong on my part. But don't you think it's a bit odd that you've gotten into an argument with most people that have posted on here? You need to calm down.

If you want to have a logical discussion, that's fine. But some people just weren't happy with your arguments.

For example, the most recent one: the figures you use for failure rates were from 2007-2008. Those are hardly representative of current rates, for better or worse.

Not only that, but you associate low power consumption with "efficiency", which is just wrong. Yes, a Wii U is more efficient than a PS360, slim or not... But to say that the build quality of a Wii is better than a PS3/360 because the failure rate is lower is wrong.

You can't compare the two. The power consumption and cooling requirements of all 3 are different. More power = more heat = faster hardware degradation. You can't just ignore all the things (especially power) that go into the devices that the Wii simply doesn't have.

Ignoring those is illogical...

First off, 95% of the time they initiate the argument, not me. Why am I being vilified? It's like no matter what I do, it's wrong with people like you.

If I ignore them when they start throwing stuff at me then I get accused of not responding to opinions I don't like or not being able to handle what I said being questioned.
If I do respond to what they say then I'm accused of being argumentative or attacking people.

It's a discussion thread using theory and hypothesis. Of course there will be arguments. Fourth Storm gets into arguments, z0m3le gets into arguments (heck, z0m3le and Fourth just had a big one a few pages back), lostinblue gets into arguments, and so on. That's what happens when people trade opinions. Why am I suddenly evil for arguing my own point? What matters is that people address each other's points properly.

I could see it if I was just making problems for the sake of making problems, like a lot of people who come at me do, but I don't. I will never claim something that I can't back up. The same can't be said for the rest. If I get into an argument, it is because I have a point and I will substantiate that point.



What you just did is a good example of the stuff I'm talking about that causes problems. It was from 2007-2012. I posted 2 links, not one. I also posted a video to support it. That is how you support your statement.

The point it was used to make was that Nintendo does not pinch pennies on product quality like what was being suggested. You have spun that into me using it to draw the conclusion that the PS3/360 were poor quality, which I didn't, and where did the energy vs. efficiency comparison come into this? Though in retrospect, the Wii is of better build quality for those reasons, since you bring it up.

Also, you very well can make an issue of heat and energy consumption. It's the hardware manufacturer's job to make sure their product works properly under load. If your product was designed with a material/heat balance that causes it to malfunction, then that is a low-quality product. It's the console producer's fault if they want to make and sell a powerful machine that isn't sufficiently stable. I have a PC that has lasted five years. It draws more energy, produces more heat, and I use it more often than any 360 or PS3. My PC still works. My first 360 does not, and the second has to have its red ring fixed every 2 days of use. The reason is that my PC is properly designed for the load it undertakes. It's a quality machine.

Nothing excuses faulty hardware.
 
What you just did is a good example of the stuff I'm talking about that causes problems. It was from 2007-2012. I posted 2 links, not one. I also posted a video to support it. That is how you support your statement.

The point it was used to make was that Nintendo does not pinch pennies on product quality.
I never argued that point. But if I were to, I'd argue otherwise. Or at least argue that they don't know how to manage costs of hardware. I frankly don't care for that argument.

You have spun that into me using it to draw the conclusion that the PS3/360 were poor quality, which I didn't, and where did the energy vs. efficiency comparison come into this?
Well...
ummm, when it comes to heatsinks, the bigger the more pathetic.
It's comments like this that make me wonder where you are going with your discussion. Of course I'm inferring that the word "pathetic" used in this case means inferior...

Also, you very well can make an issue of heat and energy consumption. It's the hardware manufacturer's job to make sure their product works properly under load. If your product was designed with a material/heat balance that causes it to malfunction, then that is a low-quality product. It's the developer's fault if they want to make a powerful machine that isn't sufficiently stable.
Yes, it IS their job.
But it's still easier for things to go wrong when dealing with higher amounts of power.
 

z0m3le

Banned
I don't think I've seen the particular part you're referring to, but in principle nothing stops a laptop from hosting an E-series GPU. It could be hard to justify business-wise as the laptop maker would be paying a premium for that part, but hey, some of those laptops are priced outrageously anyway, so they could afford all kinds of 'exotic' parts inside.


Don't get me wrong - I'm not saying anything about the number of ALUs here (well, save for the fact it must be a multiple of 4), I'm just saying that the 'don't compare them to AMD's E-series of GPUs' argument is based on fallacious reasoning.

I think the big problem with notebook manufacturers using the E4690 (July '09) is that the HD 5650M (July '10) was released a year later, while the E6760 that replaces the E4690 wasn't released until May 2011. That, and the RV730 in desktops and laptops sold less than the embedded parts, which is very hard to do when you are binning the best power-consumption parts, since they are always the "lucky few" - something we saw when AMD first launched their "Black" series of CPUs.

Also, you mean a multiple of 5? VLIW5. Of course they could have changed to VLIW4, but there is nothing to really back that up.
 

krizzx

Junior Member
I never argued that point. But if I were to, I'd argue otherwise. Or at least argue that they don't know how to manage costs of hardware. I frankly don't care for that argument.


Well...
It's comments like this that make me wonder where you are going with your discussion. Of course I'm inferring that the word "pathetic" used in this case means inferior...

Yes, it IS their job.
But it's still easier for things to go wrong when dealing with higher amounts of power.

You just took a comment from a completely different post, made much further back in the thread to a different person, that was used to make a completely different point that had nothing to do with what was being discussed immediately.

You just sandwiched two unconnected things together to make something to argue against.
 
You just took a comment from a completely different post, made much further back in the thread to a different person, that was used to make a completely different point that had nothing to do with what was being discussed immediately.

You just sandwiched two unconnected things together to make something to argue against.

How are those not connected?

I could just be stupid, but we're talking about console power design, etc. Right? How is a heat sink not a part of that?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Also, you mean a multiple of 5? VLIW5. Of course they could have changed to VLIW4, but there is nothing to really back that up.
Four as in the number of SIMD blocks seen on the die.

edit: ...and of course the number of SIMD engines is 8. My bad, memory lanes crosswired.
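For anyone trying to follow the ALU math in this back-and-forth, here's a minimal sketch of how the candidate totals fall out of the block counts being discussed. It assumes an R700-style layout (SIMD engines made of VLIW5 units); the 8 engines come from blu's reading of the die, while the per-engine unit counts are just illustrative values chosen to show where the 160 and 320 figures come from - nothing here is a confirmed Latte spec.

```cpp
#include <cstdio>

int main() {
    // Assumption: R700-style organization. Each SIMD engine holds some number
    // of VLIW5 units, and each VLIW5 unit contributes 5 stream processors ("ALUs").
    const int simd_engines = 8;  // per blu's reading of the die shot
    const int vliw_width   = 5;  // VLIW5 (R700 family)

    // Illustrative per-engine unit counts, chosen to show where 160 and 320 arise.
    const int units_per_engine[] = {4, 8};

    for (int units : units_per_engine) {
        int alus = simd_engines * units * vliw_width;
        std::printf("%d SIMD engines x %d VLIW5 units x %d lanes = %d ALUs\n",
                    simd_engines, units, vliw_width, alus);
    }
    // 8 x 4 x 5 = 160 and 8 x 8 x 5 = 320, which is why the 160 vs. 320 argument
    // keeps coming back to what sits inside each SIMD block.
    return 0;
}
```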
 

krizzx

Junior Member
How are those not connected?

I could just be stupid, but we're talking about console power design, etc. Right? How is a heat sink not a part of that?

No, the one you went back and grabbed from the past was about heatsinks, specifically the claim by the person I was responding to that a small heatsink was "pathetic". It was pretty clear.

The actual relevant discussion that the first quote is from was about Nintendo's hardware quality and how much they spend on it. I also made this pretty clear.

How did you obtain any of what you are talking about from those two posts?
 
Well, there's got to be some hardware on there running translation from TEV code into shader language and such. However we take it, my point was that the design is not just R700 - it's been altered at least somewhat at the hardware level.


Quick question (and I'm probably showing my lack of knowledge on this one), but could they not just run a translator for the TEV using general-purpose code without having to make hardware modifications? Surely a GPU this much more powerful than that of the Wii would have the resources to do so... right?

Or use one of the CPU cores? They share the fast/low latency eDram after all... It would seem wasteful not to use it.
 

krizzx

Junior Member
Quick question (and I'm probably showing my lack of knowledge on this one), but could they not just run a translator for the TEV using general-purpose code without having to make hardware modifications? Surely a GPU this much more powerful than that of the Wii would have the resources to do so... right?

Or use one of the CPU cores? They share the fast/low latency eDram after all... It would seem wasteful not to use it.

That is a hard call. Of course there is "something" in Latte translating it, but the TEV's efficiency at doing things cannot be matched by modern code in a lot of cases, from what I've seen. Say, effects that would take 5 or 6 passes on a modern GPU could be done in 1 or 2 on the TEV. That is how it was able to pull off current-gen shading in games like Mario Galaxy, The Conduit and Overlord.

It's not merely a matter of power. This is something that was hard to explain to people about the Wii's capabilities, since a lot of people just saw it as an overclocked Gamecube, which it wasn't.
 

Schnozberry

Member
Quick update on the wattage. Had a line tech at work calibrate the fluke 1735 meter for me so it should be as accurate as it can be. I ran home and did a quick test on my lunch break, and here's what I got.

1. Wii U without my external hard drive and USB network adapter attached sitting at the home menu: 32w

Same scenario playing Splinter Cell Blacklist from disc drive: 36w

Wii U with External Drive and USB Network Adapter sitting at home menu: 43w

Same Scenario playing Splinter Cell Blacklist from disc drive: 47w

So, it turns out the fluke needed a little calibration, and my initial tests were a little high. I did not test again with the cheap ass Kill-a-Watt meter because I only had a half hour to get home, eat lunch and test, and then get my ass back to work.
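For quick reference, here are the deltas implied by those four readings - nothing new, just arithmetic on the numbers Schnozberry posted above:

```cpp
#include <cstdio>

int main() {
    // Schnozberry's four wall readings, in watts.
    const double menu_bare = 32.0;  // home menu, no USB extras
    const double game_bare = 36.0;  // Blacklist from disc, no USB extras
    const double menu_usb  = 43.0;  // home menu, external HDD + USB network adapter
    const double game_usb  = 47.0;  // Blacklist from disc, with USB extras

    std::printf("Gameplay over menu:     +%.0fW\n", game_bare - menu_bare); // ~4W
    std::printf("USB attachments:        +%.0fW\n", menu_usb - menu_bare);  // ~11W
    std::printf("Both together vs. idle: +%.0fW\n", game_usb - menu_bare);  // ~15W
    return 0;
}
```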
 

krizzx

Junior Member
Quick update on the wattage. Had a line tech at work calibrate the fluke 1735 meter for me so it should be as accurate as it can be. I ran home and did a quick test on my lunch break, and here's what I got.

1. Wii U without my external hard drive and USB network adapter attached sitting at the home menu: 32w

Same scenario playing Splinter Cell Blacklist from disc drive: 36w

Wii U with External Drive and USB Network Adapter sitting at home menu: 43w

Same Scenario playing Splinter Cell Blacklist from disc drive: 47w

So, it turns out the fluke needed a little calibration, and my initial tests were a little high. I did not test again with the cheap ass Kill-a-Watt meter because I only had a half hour to get home, eat lunch and test, and then get my ass back to work.

So the earlier statements that some made were right. Certain games will pull more power from the GPU than others, which means that it has not peaked its power draw. We can get near 45W with no auxiliary attachments.

Iwata's statement about it drawing 45 watts looks to be accurate.
 
That is a hard call. Of course there is "something" in Latte translating it, but the TEV's efficiency at doing things cannot be matched by modern code in a lot of cases, from what I've seen. Say, effects that would take 5 or 6 passes on a modern GPU could be done in 1 or 2 on the TEV. That is how it was able to pull off current-gen shading in games like Mario Galaxy, The Conduit and Overlord.

It's not merely a matter of power. This is something that was hard to explain to people about the Wii's capabilities, since a lot of people just saw it as an overclocked Gamecube, which it wasn't.
It's been a long while, but I thought that someone (Marcan?) identified a separate 8-bit processor on the GPU that was hypothesized as being the TEV-to-shader translator for Wii BC.

I could be remembering all the facts wrong but it was a working theory that there isn't any dedicated silicon at all for Wii mode.

...Or maybe that was my theory. I am getting old :(
¿?

I read 32-36W without attachments and 43-47W WITH attachments (external HDD). Am I missing something?
You're not :/
 
Quick update on the wattage. Had a line tech at work calibrate the fluke 1735 meter for me so it should be as accurate as it can be. I ran home and did a quick test on my lunch break, and here's what I got.

1. Wii U without my external hard drive and USB network adapter attached sitting at the home menu: 32w

Same scenario playing Splinter Cell Blacklist from disc drive: 36w

Wii U with External Drive and USB Network Adapter sitting at home menu: 43w

Same Scenario playing Splinter Cell Blacklist from disc drive: 47w

So, it turns out the fluke needed a little calibration, and my initial tests were a little high. I did not test again with the cheap ass Kill-a-Watt meter because I only had a half hour to get home, eat lunch and test, and then get my ass back to work.

So it looks like other tests published online (likely using a Kill-A-Watt or similar level device) were about 3 watts too low across the board. Very interesting results, Schnoz. Thanks for doing this.
 
Funny, all of the other handhelds provide a dual voltage power supply! Pretty much all the cellphones and tablets I have seen come with dual voltage power supplies, not to mention pretty much 100% of all laptops (because you know, travelling). I have bought some very cheap generic USB chargers and they have all been dual voltage. Seriously, Nintendo is the outlier here - people are genuinely surprised when they fry their AC adapter in another country.

I guess this is like folks saying "the eShop policies are fine, I'm ok with having to fill out a police report if I lose my 3DS. It doesn't affect me"

You began speaking on the efficiency of the Wii U power supply and how their frugal ways point to a cheap power brick. I responded. You quoted me only to spend the above sentences focusing on a handheld power supply issue about dual voltages, completely ignoring the point of what I wrote and also the issue with the Wii U power supply.

So again: Nintendo can be cheap in some areas, but they are far from cheap in others. Their cheapness in not providing certain features stems from issues that do not affect things like the reliability of their hardware. Please re-read my post you quoted.
 
So, it turns out the fluke needed a little calibration, and my initial tests were a little high. I did not test again with the cheap ass Kill-a-Watt meter because I only had a half hour to get home, eat lunch and test, and then get my ass back to work.

lunch? pfffttt.. should have skipped that ;)
 

The_Lump

Banned
Quick update on the wattage. Had a line tech at work calibrate the fluke 1735 meter for me so it should be as accurate as it can be. I ran home and did a quick test on my lunch break, and here's what I got.

1. Wii U without my external hard drive and USB network adapter attached sitting at the home menu: 32w

Same scenario playing Splinter Cell Blacklist from disc drive: 36w

Wii U with External Drive and USB Network Adapter sitting at home menu: 43w

Same Scenario playing Splinter Cell Blacklist from disc drive: 47w

So, it turns out the fluke needed a little calibration, and my initial tests were a little high. I did not test again with the cheap ass Kill-a-Watt meter because I only had a half hour to get home, eat lunch and test, and then get my ass back to work.

Awesome work, thanks for doing that :)
 
So the earlier statements that some made were right. Certain games will pull more power from the GPU than other which means that. We have near 45w with no auxiliary attachments.

Iwata's statement about it drawing 45 watts looks to be accurate.

Warning: This post is completely Off Topic.
I'm not sending this as a PM in hopes it could potentially help the tension in the thread.


Krizzx, I have a quick question. What is your native language? If it isn't English, I think that may be the source of a lot of the issues with your posts... particularly the harshness some of your comments seem to have when debating. The duality of Old English (Germanic) and French (Latin) language that evolved into modern English can be difficult to navigate for non-native speakers. (That's not even getting into the differences between British, American, Aussie, South African, and Scottish English.)

Edit: Nevermind.
 

krizzx

Junior Member
¿?

I read 32-36W without attachments and 43-47W WITH attachments (external HDD). Am I missing something?

I read the same, but the point is that the hardware is not using all of its resources yet. I believe it will reach that 45 watt mark in game before it's all said and done. I can feel it.

Like earlier, with the TVtrope edit that said the Wii U got a clock bump to 3.6 Ghz after the spring update. Most people outright threw it out, but I sort of read in between the lines. I interpreted that as that Nintendo finally added functionality to the OS to delicate idol processes to unused cores which was reported as not being present before. It is possible that whoever got those numbers was reading the voltage of all 3 1.24 Ghz cores at once as 3.6. That's just my hypothesis on that. One thing that's certain is that performance improved in a lot of games after the spring update.

Then we have the summer update next month. I bet that if we test the performance after the Summer update, we will get different watt readings.
 

krizzx

Junior Member
Warning: This post is completely Off Topic.
I'm not sending this as a PM in hopes it could potentially help the tension in the thread.


Krizzx, I have a quick question. What is your native language? If it isn't English, I think that may be the source of a lot of the issues with your posts... particularly the harshness some of your comments seem to have when debating. The duality of Old English (Germanic) and French (Latin) language that evolved into modern English can be difficult to navigate for non-native speakers. (That's not even getting into the differences between British, American, Aussie, South African, and Scottish English.)

English. I generally type everything I have to say at once and edit the grammar and spelling later. I was in a hurry to be somewhere and didn't have time to reread and spell-check that one.
 

jeffers

Member
I read the same, but the point is that the hardware is not using all of its resources yet. I believe it will reach that 45 watt mark in game before it's all said and done. I can feel it.

you can't just phrase your original thing as you said. you made it sound like fact instead of opinion/conjecture >.> the point here is to a) not sound crazy like if you misread or did crazy maths b) avoid people then mistakenly taking your belief as fact

edit: see now you changed it a bit, still not quite there but fairs.
 

krizzx

Junior Member
you can't just phrase your original thing as you said. you made it sound like fact instead of opinion/conjecture >.> the point here is to a) not sound crazy like if you misread or did crazy maths b) avoid people then mistakenly taking your belief as fact

edit: see now you changed it a bit, still not quite there but fairs.

As I said to the other guy, that was not written properly. I wrote it in a hurry and didn't have time to look over it. I just got back, saw all of these people talking about it, and fixed it.
 
Quick update on the wattage. Had a line tech at work calibrate the fluke 1735 meter for me so it should be as accurate as it can be. I ran home and did a quick test on my lunch break, and here's what I got.

1. Wii U without my external hard drive and USB network adapter attached sitting at the home menu: 32w

Same scenario playing Splinter Cell Blacklist from disc drive: 36w

Wii U with External Drive and USB Network Adapter sitting at home menu: 43w

Same Scenario playing Splinter Cell Blacklist from disc drive: 47w

So, it turns out the fluke needed a little calibration, and my initial tests were a little high. I did not test again with the cheap ass Kill-a-Watt meter because I only had a half hour to get home, eat lunch and test, and then get my ass back to work.

Good on you for sacrificing a lunch break for us nerds! :)

So it seems the Wii U has 3W more to play with than we all thought (not including HDDs etc.).

What would 3W mean for performance, *if* it was all used for the GPU?

I wouldn't be surprised if Nintendo tweaked the GPU clock in the last update; didn't they change a lot of stuff in the 3DS with updates (a GPU clock increase and unlocking a second CPU?).

As others have said, the console blows out freezing cold air even after 4/5 hour sessions of W-101. The difference between the Wii U and my PS3, which blows out heat that could cook a steak after about 10 mins, is quite stunning. I wonder how high they could go before heat and power became an issue.
 

Argyle

Member
Like earlier, with the TVtrope edit that said the Wii U got a clock bump to 3.6 Ghz after the spring update. Most people outright threw it out, but I sort of read in between the lines. I interpreted that as that Nintendo finally added functionality to the OS to delicate idol processes to unused cores which was reported as not being present before. It is possible that whoever got those numbers was reading the voltage of all 3 1.24 Ghz cores at once as 3.6. That's just my hypothesis on that. One thing that's certain is that performance improved in a lot of games after the spring update.

Delicate idol who? Read the voltage of the giga-what?

I have no idea what that is even supposed to mean...

Edit: I feel bad because your post made no sense and basically IMHO reflects a bunch of wishful thinking, so I'll try to contribute something here. I have no idea if people really got improved performance post-system update (there are a lot of things that could have improved performance), but in my experience if you want to have things running on another core you need to pretty much explicitly do so. Usually the OS will expose an interface that allows you to spawn a thread on another core, and then you would tell your thread to run whatever it is you want.

Some OSes may not allow you to lock a thread to a core, or a thread might get stopped and restarted on a different core, but unless the OS was broken to the point where it was only spawning and scheduling threads on the main core (which would be a pretty stunning level of incompetence, and frankly I doubt Nintendo would have shipped a console in that state - hell, I doubt it would have run multiplatform anything at close to decent framerates if that were the case), any unused cores are going to be the responsibility of the game programmers and not something that would magically get unlocked in a system update.

In fact, multicore programming can be fraught with peril - if there was a magic system update to "unlock the cores" I would bet that they would maintain the old functionality (only running on one core) for old games so that they don't deadlock at random!
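To make the "you have to do it explicitly" point concrete, here's a rough sketch of what spawning work on another core and pinning it there looks like in generic C++/pthreads (Linux-style affinity). This is not Wii U SDK code, and `worker` is just a placeholder workload; the point is only that the game has to request both the thread and its placement - the OS doesn't redistribute existing single-threaded work on its own.

```cpp
#include <pthread.h>  // pthread_setaffinity_np is a Linux/glibc extension
#include <thread>
#include <cstdio>

// Placeholder workload - stands in for whatever the game wants off-loaded.
void worker(int id) {
    std::printf("worker %d running\n", id);
}

int main() {
    // The programmer explicitly spawns the thread...
    std::thread t(worker, 1);

    // ...and explicitly asks the OS to pin it to core 1. If the OS refuses
    // (or doesn't expose affinity at all), the thread may still migrate.
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(1, &set);
    if (pthread_setaffinity_np(t.native_handle(), sizeof(set), &set) != 0)
        std::printf("affinity request rejected; scheduler decides placement\n");

    t.join();
    return 0;
}
```

(Compile with -pthread; the affinity call is a glibc extension, which is exactly the "some OSes may not allow it" caveat above.)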
 

Argyle

Member
Dedicate idle, I think. (i.e. no spell check, mobile phone post, etc.)

Edit: Argyle, here's what Ideaman said about the CPU cores: http://www.neogaf.com/forum/showpost.php?p=57889424&postcount=656

Correct - and what IdeaMan said may very well be true, and can be chalked up to an oversight on the part of the folks developing those particular games (not a rip on them - things like that happen, good thing they caught it before ship!), and not something that a system update will automatically fix or give magic performance boosts to all games (read IdeaMan's post, he says as much).
 

krizzx

Junior Member
Correct - and what IdeaMan said may very well be true, and can be chalked up to an oversight on the part of the folks developing those particular games (not a rip on them - things like that happen, good thing they caught it before ship!), and not something that a system update will automatically fix or give magic performance boosts to all games (read IdeaMan's post, he says as much).

That is totally not what I was stating at all...
 

Argyle

Member
That is totally not what I was stating at all...


I read the same, but the point is that the hardware is not using all of its resources yet. I believe it will reach that 45 watt mark in game before it's all said and done. I can feel it.

Like earlier, with the TVtrope edit that said the Wii U got a clock bump to 3.6 Ghz after the spring update. Most people outright threw it out, but I sort of read in between the lines. I interpreted that as that Nintendo finally added functionality to the OS to delicate idol processes to unused cores which was reported as not being present before. It is possible that whoever got those numbers was reading the voltage of all 3 1.24 Ghz cores at once as 3.6. That's just my hypothesis on that. One thing that's certain is that performance improved in a lot of games after the spring update.

Then we have the summer update next month. I bet that if we test the performance after the Summer update, we will get different watt readings.

???

I guess the general point isn't clear enough for me because if that wasn't what you were implying (that somehow Nintendo pushed a system update that something something unused cores and that is what he meant by 3.6Ghz - a pretty big stretch IMHO) then perhaps you should try again so that people understand what you were actually trying to say...
 
???

I guess the general point isn't clear enough for me because if that wasn't what you were implying (that somehow Nintendo pushed a system update that something something unused cores and that is what he meant by 3.6Ghz - a pretty big stretch IMHO) then perhaps you should try again so that people understand what you were actually trying to say...

Yes, I'm unsure where K was coming from with that. Now, it would be an interesting theory that Nintendo slightly boosted the clock of the CPU/GPU during the system update, due to them being able to save some energy with software optimizations. We do know that the disc drive doesn't run all of the time like before, for example, though software optimizations alone could account for the small upgrades we are seeing on previously released hardware.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
160 ALUs pull around 15W I think. The numbers were posted before.

But 320 would be about double that... so... ~30W.

That's just too much.
Where did you read that?

Sometimes I wonder why people keep ignoring all the data in plain sight:
http://www.amd.com/us/Documents/47285D_E4690_Discreet_GPU.pdf

55nm, 320 ALUs (VLIW5), 512MB GDDR3:
  • engine @ 150MHz, mem @ 250MHz = 8W
  • engine @ 300MHz, mem @ 400MHz = 12W
  • engine @ 450MHz, mem @ 600MHz = 17W
  • engine @ 600MHz, mem @ 700MHz = 25W
 

krizzx

Junior Member
Where did you read that?

Sometimes I wonder why people keep ignoring all the data in plain sight:
http://www.amd.com/us/Documents/47285D_E4690_Discreet_GPU.pdf

55nm, 320 ALUs (VLIW5), 512MB GDDR3:
  • engine @ 150MHz, mem @ 250MHz = 8W
  • engine @ 300MHz, mem @ 400MHz = 12W
  • engine @ 450MHz, mem @ 600MHz = 17W
  • engine @ 600MHz, mem @ 700MHz = 25W

I'd imagine it's because it doesn't fit their agenda.

Some people seem hell-bent on labeling the Wii U a 160 ALU part for no other reason than the number being lower than the 360's. I personally estimate it to be somewhere in the 200-250 range (24, 28, or 32 ALU components). That would account for the 90% size difference.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
How does this reconcile with Usc-fan's point?
It doesn't.

A good quality PSU would have higher efficiency than 80% at the middle of its rated output. So it's not 80% to begin with. I bet that in reality it's much closer to 90%, or 32.4W of draw for the console (let's round it down to 32W).

A 40nm R800 160ALU core @ 600MHz (+ 512MB of GDDR5 @ 800MHz) is 16W. So in the case of WiiU that leaves 16W+ for CPU, disc and gamepad radio. That'd be quite wasteful of whoever designed the thing.
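For reference, the arithmetic behind that budget - presumably starting from the ~36W wall reading posted earlier, since 36W x 0.9 = 32.4W. Both the 90% efficiency and the 16W GPU figure are blu's estimates, not measured specs; this is just the same sums written out:

```cpp
#include <cstdio>

int main() {
    const double wall_draw_w    = 36.0;  // measured at the outlet (Blacklist, no USB extras)
    const double psu_efficiency = 0.90;  // blu's estimate; not an official rating
    const double gpu_estimate_w = 16.0;  // 40nm 160-ALU R800 @ 600MHz + GDDR5, per blu

    double dc_budget = wall_draw_w * psu_efficiency;  // ~32.4W reaching the board
    double leftover  = dc_budget - gpu_estimate_w;    // what's left for CPU, disc, radio

    std::printf("DC-side budget: %.1fW, non-GPU remainder: %.1fW\n", dc_budget, leftover);
    return 0;
}
```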
 
It doesn't.

A good quality PSU would have higher efficiency than 80% at the middle of its rated output. So it's not 80% to begin with. I bet that in reality it's much closer to 90%, or 32.4W of draw for the console (let's round it down to 32W).

A 40nm R800 160ALU core @ 600MHz (+ 512MB of GDDR5 @ 800MHz) is 16W. So in the case of WiiU that leaves 16W+ for CPU, disc and gamepad radio. That'd be quite wasteful of whoever designed the thing.

Thanks.

Trying to keep up with all the 'facts' thrown around in this thread can be confusing. :)
 
It doesn't.

A good quality PSU would have higher efficiency than 80% at the middle of its rated output. So it's not 80% to begin with. I bet that in reality it's much closer to 90%, or 32.4W of draw for the console (let's round it down to 32W).

A 40nm R800 160ALU core @ 600MHz (+ 512MB of GDDR5 @ 800MHz) is 16W. So in the case of WiiU that leaves 16W+ for CPU, disc and gamepad radio. That'd be quite wasteful of whoever designed the thing.

Any estimation on how the CPU power use would scale? Gekko was like a 5W chip, wasn't it?
 

z0m3le

Banned
Where did you read that?

Sometimes I wonder why people keep ignoring all the data in plain sight:
http://www.amd.com/us/Documents/47285D_E4690_Discreet_GPU.pdf

55nm, 320 ALUs (VLIW5), 512MB GDDR3:
  • engine @ 150MHz, mem @ 250MHz = 8W
  • engine @ 300MHz, mem @ 400MHz = 12W
  • engine @ 450MHz, mem @ 600MHz = 17W
  • engine @ 600MHz, mem @ 700MHz = 25W

I find this chip interesting, as Latte would resemble it at a lower process node (higher energy efficiency), since it doesn't have DX11 ALUs; IIRC the R700 ALUs were smaller and had lower power consumption on average when compared to same-clocked R800 ALUs, thanks to those things.

I see no reason why a 320 ALU R700 chip at 40nm would draw over 15 watts at 550MHz. I'm still not saying this proves it is 320 ALUs, but it means that it could be, unless I'm missing something.
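As a back-of-the-envelope check on that, here's a sketch that linearly interpolates the E4690 table blu posted to the Wii U's reported 550MHz engine clock and then applies a couple of guessed 55nm-to-40nm shrink factors. Both the linear interpolation and the shrink factors are assumptions for illustration only (and the E4690 figures include GDDR3 memory power, which Latte's setup wouldn't share), so treat the output as a rough plausibility check, not a Latte measurement.

```cpp
#include <cstdio>

int main() {
    // AMD E4690 figures quoted earlier in the thread (55nm, 320 VLIW5 ALUs,
    // engine/memory clock pairs -> board power, memory power included).
    const double mhz_lo = 450.0, w_lo = 17.0;  // 450MHz engine / 600MHz mem
    const double mhz_hi = 600.0, w_hi = 25.0;  // 600MHz engine / 700MHz mem

    // Linear interpolation to a 550MHz engine clock (approximation only;
    // real power doesn't scale exactly linearly with clock).
    const double target_mhz = 550.0;
    double w_55nm = w_lo + (target_mhz - mhz_lo) / (mhz_hi - mhz_lo) * (w_hi - w_lo);

    // Crude 55nm -> 40nm shrink guesses. These factors are assumptions for
    // illustration, not a process-scaling law or a measured Latte number.
    for (double shrink : {0.60, 0.70}) {
        std::printf("~%.1fW at 55nm -> ~%.1fW assuming a %.0f%% shrink factor\n",
                    w_55nm, w_55nm * shrink, shrink * 100.0);
    }
    return 0;
}
```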
 