• Hey, guest user. Hope you're enjoying NeoGAF! Have you considered registering for an account? Come join us and add your take to the daily discourse.

WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.
Question: How Many watts did Broadway draw?
180nm Gamecube Gekko - 4.9W @ 486 MHz
90 nm Wii Broadway - 3.9W @ 729 MHz

Bare in mind Espresso biggest change was L2 cache being eDRAM now; quick google says eDRAM uses up less 36% less energy compared to on-die SRAM; meaning a Broadway sans-core-shrink using it could be substantially more energy efficient just by using it.

Also, we're not taking it into account as a consumed "fact", but DMIPS might be as high as 2.7 DMIPS/MHz instead of 2.3 on Gekko, because of the extra cache improving instruction fetch.
 
180nm Gamecube Gekko - 4.9W @ 486 MHz
90 nm Wii Broadway - 3.9W @ 729 MHz

Bare in mind Espresso biggest change was L2 cache being eDRAM now; quick google says eDRAM uses up less 36% less energy compared to on-die SRAM; meaning a Broadway sans-core-shrink using it could be substantially more energy efficient.

Soooooooo...would it be safe to assume that Espresso consumes about 10 watts?
 
Soooooooo...would it be safe to assume that Espresso consumes about 10 watts?
That's what I've been ballparking as a maximum for a long time now; but I really have no way of knowing as core shrink energy gains are not linear and it could have been further fine tuned for operating under low voltage, then there's the change from SRAM to eDRAM.

I'm pretty sure it could never go higher than 10W TDP, and it could be under; 10W is a "worst case scenario".
 
That's a logical issue. While I was looking at the listed TDPs the Cypress LE (5830) and Barts XT (6870), both of which are 40nm, the latter has more ROPs, a slightly higher clock, however its listed TDP is 151W while the former is 175w. A ~13.7% reduction.

Hey bg. Take a look at the die size and transistor count of those two cards. The Cypress seems to be a variant of the high-end version with some blocks disabled. Barts is not. That might explain the discrepancy.

If it can be shown that those embedded parts are not binned/rebranded laptop chips, I will agree with your comparison. I just don't think we can make that leap and claim that there's no binning going on with that line.

Looking at Brazos, for Zacate you have an 18 watt TDP for a dual core variant with an 80:8:4 GPU config. As Bobcat is touted as being able to go as low as <1w, we're already looking at maybe 10-12w minimum for something that's half as capable as the proposed 160 ALU Latte config.
 

krizzx

Junior Member
One thing I've been meaning to bring up as far as comparing the stock cards to Latte is the fact that their mass is about 20 times what it is.

Most video cards these days are 1/3 the size of a PC motherboard or more. Latte is a drip in a bucket compared to them. Might the fact that there is so much more to those those stock GPUs that need to be powered be taken into account when comparing the chips wattage or is that already accounted for?
 
Hey bg. Take a look at the die size and transistor count of those two cards. The Cypress seems to be a variant of the high-end version with some blocks disabled. Barts is not. That might explain the discrepancy.

Thanks. I only did a superficial comparison.

If it can be shown that those embedded parts are not binned/rebranded laptop chips, I will agree with your comparison. I just don't think we can make that leap and claim that there's no binning going on with that line.

Give me a refresher course as to why binned parts would make this an issue.

Looking at Brazos, for Zacate you have an 18 watt TDP for a dual core variant with an 80:8:4 GPU config. As Bobcat is touted as being able to go as low as <1w, we're already looking at maybe 10-12w minimum for something that's half as capable as the proposed 160 ALU Latte config.

I would suggest we need more info about the APU as Caicos is rated as 18w (625Mhz) and it's a 160 ALU part.
 
This is ridiculous. What did I do in this thread for everyone suddenly vilify me like I'm a troll
Honest question?

You did nothing special, other than give your opinion and ask questions, but you tend to use a more confrontational tone than others at times and resort to disregarding those disagreeing with you as "haters" instead of asking them to explain why the feel their opinion is accurate. If you feel that people are misrepresenting your opinion or your intentions, don't you think you could be doing that as well when it comes to other posters? Keep in mind that it seems that multiple people have expressed they have an issue with the way you are expressing yourself, while they seem to be ok with other people in the thread, regardless of their opinion on the hardware.

If you look at your own posts in this thread, you will realize you have spent a serious amount of time accusing other people of pretty negative things. While you likely feel that it's justified, what if you are interpreting other people's actions incorrectly? What if those posters are no different from you, except they have a different opinion? If you disagree with them at times, then give your opinion and move on.

In my experience, these more confrontational arguments are the result of low frustration tolerance and that has little to do with the actions of others. It's a perception problem. Not only it's ok if others disagree with you, it's actually a good thing, since they provide you with a different point of view and that will challenge your opinions and may lead you to a better understanding of the situation.

And just in case it helps you to understand where I'm coming from when it comes to Wii U's hardware, it has more advanced shading capabilities than PS3/360 due to more modern tech, the extra amount of memory is definitely helpful, it will likely show a performance advantage in the long run as well (even if it's not all that big) and it's well below PS4/XBone/Gaming PCs. I think most people here, even the ones that disagree with you, have a similar opinion, even if their expectations vary or are lower than yours and think most Wii U games shown so far don't really show much difference.
 

krizzx

Junior Member
Honest question?

You did nothing special, other than give your opinion and ask questions, but you tend to use a more confrontational tone than others at times and resort to disregarding those disagreeing with you as "haters" instead of asking them to explain why the feel their opinion is accurate. If you feel that people are misrepresenting your opinion or your intentions, don't you think you could be doing that as well when it comes to other posters? Keep in mind that it seems that multiple people have expressed they have an issue with the way you are expressing yourself, while they seem to be ok with other people in the thread, regardless of their opinion on the hardware.

If you look at your own posts in this thread, you will realize you have spent a serious amount of time accusing other people of pretty negative things. While you likely feel that it's justified, what if you are interpreting other people's actions incorrectly? What if those posters are no different from you, except they have a different opinion? If you disagree with them at times, then give your opinion and move on.

In my experience, these more confrontational arguments are the result of low frustration tolerance and that has little to do with the actions of others. It's a perception problem. Not only it's ok if others disagree with you, it's actually a good thing, since they provide you with a different point of view and that will challenge your opinions and may lead you to a better understanding of the situation.

And just in case it helps you to understand where I'm coming from when it comes to Wii U's hardware, it has more advanced shading capabilities than PS3/360 due to more modern tech, the extra amount of memory is definitely helpful, it will likely show a performance advantage in the long run as well (even if it's not all that big) and it's well below PS4/XBone/Gaming PCs. I think most people here, even the ones that disagree with you, have a similar opinion, even if their expectations vary or are lower than yours and think most Wii U games shown so far don't really show much difference.

I have never called anyone in this forum a hater. That is not even a term that I use. I don't use personal attacks in place of real arguments. Also, I don't disregard what people say unless it has no bases to go on(not founded on fact or substantial material) to begin with, they were using argumentative fallacies, or they were asking a question I've already answered in which case I will always tell them to go back in the thread and look for themselves. I'm tired of reposting the same thing like I'm doing right now..

That link takes me to nothing.



As for the no different than me but with a different opinion thing. Everyone in this forum has a different opinion from me. I've stated this endlessly that I have no problem whatsoever with anyone expressing an opinion different than mine. I have no problem whatsoever with people disagreeing with me. I wouldn't be here if I did, and I would think there was something wrong if everyone agreed with me. All I care about is whether or not it said on solid ground. If it isn't I will call it out or ignore the poster as arguing with nothing goes nowhere. I'm not trying to be confrontational. I'm trying to keep things as factual or in the pursuit of fact as possible. I use logic to make my decisions, not empathy. I just find it very annoying when people make claims that are completely fictitious, can't stay on their own topic(changes the focus everything they get in a corner) or misquotes others. There opinion being different is not what I have a problem with.

My problem in the post that you are responding to is that they were attacking me personally instead of addressing what I said and showing how it was not correct which, as I pointed out, is an argumentative fallacy and I will not tolerate fallacious responses.

Show me where I have started attacking people for unspecified things they supposedly said in other threads or threads they have with me. Show me where I have tried to label someone a fanboy for speaking positive about another console like what was done to me.

When I post something its usually accompanied with images, video, links to dev statements and so on. I provide things to back up what I say. Most of the people whom I've gotten into argument with did not, and trying to argue against something that is founded on nothing generally leads nowhere. I'm not ignoring their point. They didn't have one for me to argue against in the first place. Its like answering a loaded question. Addressing it would be giving it validity that it did no have previously.




Secondly, I always ask when uncertain of something. They don't. They make claims and get mad when the validity of what they say is questioned. If someone questions what I say and they have shown themselves to be reasonable and responsive then l will respond with material support it. If they have shown themselves to be dismissive and prone to ignoring anything inconvenient then I will ignore them.

Show me where I have ignored a fact that wasn't convenient.

If it was actually factual and I didn't respond then I must not have seen it. I will not ignore any facts. Opinions on the other hand depend on how well they are grounded.


This is the type of thing irritates me. To many people assume without ever asking or making any effort to verify anything. How I respond to you depends on how you approach me.

I'm tired of hearing all of these "ALLUSIONS" to some unspecified things I supposedly did. If I call someone out I will tell them exactly what they did, when and where. I've yet to see this from anyone trying to accuse me of anything.
 
What are the reasons other than GPU clock that make some believe that the eDRAM bandwidth is only 70GB/s? Has bus width determined, or is there an assumption that Nintendo wouldn't pay
for the wider bus?
 

USC-fan

Banned
Bg the psu consumes power converting , so if you say wiiu pulling 34 watts with 70% efficient psu the console itself only using 23.8 watts.
 

strata8

Member
Suuuuure.

Xbox 1 XCPU: 951.64 DMIPS @ 733 MHz
Pentium III: 1124.311 @ 866 MHz
GC Gekko: 1125 DMIPS @ 486 MHz (har har fast against de-cached Pentium 3 my ass)
Wii Broadway: 1687.5 DMIPS @ 729 MHz

Pentium 4A: 1694.717 @ 2 GHz
PS3 Cell PPE: 1879.630 DMIPS @ 3.2 GHz (sans SPE)
X360 Xenon: 1879.630 DMIPS*3 = 5638.90 DMIPS @ 3.2 GHz (each 3.2 GHz core performing the same as the PS3)
PowerPC G4: 2202.600 @ 1.25GHz
AMD Bobcat: 2662.5*2 = 5325 DMIPS @ 1 GHz
Wii U Espresso: 2877.32 DMIPS*3 = 8631.94 DMIPS @ 1.24 GHz (again, final performance taking into account 3 fully accessible cores)
Pentium4 3.2GHz: 3258.068
6 core Bobcat: 4260*6 = 25560 DMIPS @ 1.6 GHz (said CPU doesn't exist, but best case scenario Jaguar is supposed to perform 20% better; that would be 5112 DMIPS per core, 30672 DMIPS for 6 cpu's, it's probably somewhere in between; I'm using 6 cores because that's what devs will have access to)

A 4-core Jaguar @ 1.5 GHz actually gets 26,660 MIPS, or 6,550 per core. Scale that up to 1.6 GHz and you end up with 6,986 per core, or 41,916 for 6 cores.

I'm guessing the improvement is due to the addition of SSE4.1, SSE4.2, and AVX.

It's a bit misleading to only use 6 cores though unless you're assuming that the Wii U and other consoles reserve none of the CPU for OS functions.
 

krizzx

Junior Member
A 4-core Jaguar @ 1.5 GHz actually gets 26,660 MIPS, or 6,550 per core. Scale that up to 1.6 GHz and you end up with 6,986 per core, or 41,916 for 6 cores.

I'm guessing the improvement is due to the addition of SSE4.1, SSE4.2, and AVX.

It's a bit misleading to only use 6 cores though unless you're assuming that the Wii U and other consoles reserve none of the CPU for OS functions.

They allow access to all 3 CPU cores to the dev. I know in the Wii they used an ARM core(nicknamed Starlet) to handle the OS features and security.

The Wii U may also use an arm core for this but I'm not sure. I think there is an arm core somewhere on it. It was nicknamed Starbuck if I recall. The Wii U also has a DSP to handle sound processing. There are no reserved cores in Espresso like there was with the PS3(had 2 PPE's reserved for the OS, secruity and sound) and 360(used 1 to 1.5 cores for the OS, security and Sound).
 
High clocked for a PPC750 but it's clock speeds are very low compared to other architectures these days. All doing more DMIPS and FLOPS per clock.

You originally said that "Espressso is low clocked PPC750" and were shown that it's the highest clocked. So now you are writing that it's clock speed it lower compared to other architectures for absolutely no reason. Why are you obfuscating your point about the performance of the CPU relative to other CPUs?

You said it was a low clocked PPC750 and you were shown wrong. Move on to your next argument. Why would you compare clock speeds to other architectures? That's not how clock speeds work.
 

z0m3le

Banned
Bg the psu consumes power converting , so if you say wiiu pulling 34 watts with 70% efficient psu the console itself only using 23.8 watts.

The 70% might be more realistic if the PSU was drawing 70-75watts. The best measurement we have is probably the fluke, which gave us 48watts with HDD and USB network adapter plugged in, given that is only about a 60% draw from the PSU (the load), it should be close to it's max efficiency @ 85-90%, going with that we are looking at 41 to 44 watts being used, 3 of the 4 USBs being used so - 7.5watts (worst case) gives us 34 to 37 watts for the entire system to draw after USB devices are unplugged and PSU has been taken into account.
 

ThaGuy

Member
Suuuuure.

Xbox 1 XCPU: 951.64 DMIPS @ 733 MHz
Pentium III: 1124.311 @ 866 MHz
GC Gekko: 1125 DMIPS @ 486 MHz (har har fast against de-cached Pentium 3 my ass)
Wii Broadway: 1687.5 DMIPS @ 729 MHz

Pentium 4A: 1694.717 @ 2 GHz
PS3 Cell PPE: 1879.630 DMIPS @ 3.2 GHz (sans SPE)
X360 Xenon: 1879.630 DMIPS*3 = 5638.90 DMIPS @ 3.2 GHz (each 3.2 GHz core performing the same as the PS3)
PowerPC G4: 2202.600 @ 1.25GHz
AMD Bobcat: 2662.5*2 = 5325 DMIPS @ 1 GHz
Wii U Espresso: 2877.32 DMIPS*3 = 8631.94 DMIPS @ 1.24 GHz (again, final performance taking into account 3 fully accessible cores)
Pentium4 3.2GHz: 3258.068
6 core Bobcat: 4260*6 = 25560 DMIPS @ 1.6 GHz (said CPU doesn't exist, but best case scenario Jaguar is supposed to perform 20% better; that would be 5112 DMIPS per core, 30672 DMIPS for 6 cpu's, it's probably somewhere in between; I'm using 6 cores because that's what devs will have access to)

a 6 core 1.6 GHz PPC750 could actually compete with bobcat in Dhrystone performance; as for Floating Point it was simply not designed that way; but I feel people put too much emphasis on CPU floating point for no good reason.


You also have Blu's SIMD benchmarks which are certainly not embarassing for the architecture.Bloody hell.

Gamecube launched in September 2001; PowerPC G5/970 launched in June 2003; that's almost two years, and I don't think I have to remind you how fast things changed back then.

And yeah, because Nintendo should have went from a 180 nm 4.9W TDP from Gekko to 130 nm 42W PPC970 variant; that's tenfold increase despite the smaller manufacturing method.Higher clocked and it has more cores to it; I mean duh. Like I ilustrated above, in DMIPS they could actually be closely matched; as for the rest it's pretty much design decision and they have to live and die for it, but it's certainly not a shameful architecture as you're making it out to be.

I think Nintendo should have went with more cores but with that said, it's still incredibly powerful for the energy drain; that's the thing it has going for it.

Notice we're listing it as 6W part; how lean the design is is a huge factor; even if it doesn't bring any bonus to it (other than a lower power bill you and me really don't care about); the thing is effective.

The Wii U seems to be remarkably effective for a 33/38W console, with the HD Twins not being able to dream to go as low even now after numerous core shrink and optimizations.

Problem is, as impressive as something might be in the effectiveness per Watt, they shot themselves in the foot for not going higher. Like saying a 80W Wii U could have the margin to be so much more powerful than it is; but that could be achieved by doubling logic, including CPU cores.It's not meant to be. Calling it an off the shelf part though is not only insulting and revealing of how much you seem to know regarding this, but it's also misleading should anyone read it and run along with that.

They might have had all the wrong priorities to it you (and me) might think of, but it's still as custom as it gets.

That list just goes to show how everything in console hardware never reached its full potential at the end of it's generation.

It also shows why Nintendo decided not to play the spec numbers game with customers. It can get very confusing without deep research.
 
The 70% might be more realistic if the PSU was drawing 70-75watts. The best measurement we have is probably the fluke, which gave us 48watts with HDD and USB network adapter plugged in, given that is only about a 60% draw from the PSU (the load), it should be close to it's max efficiency @ 85-90%, going with that we are looking at 41 to 44 watts being used, 3 of the 4 USBs being used so - 7.5watts (worst case) gives us 34 to 37 watts for the entire system to draw after USB devices are unplugged and PSU has been taken into account.

Usually the efficiency is around the same regardless of load. Being honest I'm going with a 70-75% efficiency due to cost. That said I do think the load is going to play a big role in trying to learn more about the hardware.
 

USC-fan

Banned
Yeah I think I calculated "backwards" for lack of a better word.

My point it not fixed number. The power used by the psu is on What looks like a bell curve for efficiency. There is a sweat spot where efficiency is at its peak. If the wiiu has a high end psu it would be 90% but it's very unlikely. It most likely closer to 70-80% at wiiu power consumption like you stated.
 
A 4-core Jaguar @ 1.5 GHz actually gets 26,660 MIPS, or 6,550 per core. Scale that up to 1.6 GHz and you end up with 6,986 per core, or 41,916 for 6 cores.
Those are SiSoft Sandra Dhrystones not the standard Dhrystone v2.1 benchmark.

The Bobcat I listed on that site/methodology gets listed as 5840 MIPS instead of the 5325 DMIPS I listed; there's a clear discrepancy. They should be indicative if you're comparing SiSoft figures to SiSoft figures... But we aren't.


Problem is SiSoft is a PC/x86 thing and closed source, so we really can't take those numbers and compare them to other architectures like PPC for that same reason.

The figures I posted should be indicative, really, I reckon they're taking into account the 20% increase AMD touted religiously, knowing that 20% increases are usually peak situations, therefore not realworld. I've ignored all that and gone for the best case scenario, precisely to be safe/for the numbers to be indicative even if inflated.
I'm guessing the improvement is due to the addition of SSE4.1, SSE4.2, and AVX.
Not so sure.

SSE and AVX are SIMD implementations, they'll have more impact with whetstones than dhrystones.

And of course, PPC750 strength really doesn't lie on whetstones.
It's a bit misleading to only use 6 cores though unless you're assuming that the Wii U and other consoles reserve none of the CPU for OS functions.
I honestly don't think so.

We know devs have access to 3 CPU's on the Wii U, and knowing Nintendo previous decisions with the Wii where the I/O portion of the OS ran on Starlet... Wii U probably offloads those same things to the similar named Starbuck.

Of course there are some dedicated resources, but Nintendo didn't have the ambition Sony and Microsoft had for OS, which is why they reserved way less resources to it (it's not as if Sony/Microsoft are using those resources right now, they most likely aren't, but they planned it so they might).

And honestly, Wii U and XBone/PS4 represent a leap so big in general purpose performance (and predictability) as well as featuring aditional DSP's (like sound) that once developers are used to the different nature of these architectures it shouldn't really be a problem compared to, say... PS3.

Fact is as a developer you'll have access to 3 cores on the wii, and 6 on PS4/XBone, and my point is this iteration of the PPC750 architecture should be pretty capable for what it is; Nintendo just undershot on the core count and perhaps even MHz (because the thing about Nintendo is... They never pull aggressive clocks on their hardware.; instead they go for the balance where they can keep the best clock at the lowest voltage while compensating with fast FSB for the spec. (cases in point, Gekko was 486 MHz, similar PPC750 Cxe part reached 700 MHz, a 44% increase; Broadway was 729 MHz, consumer version PPC 750CL reched 1 GHz, a 37% increase; for Espresso a ~40% clock increase would be 1.74 GHz). No doubt Nintendo could go higher.

I have no doubts this part could easily do 1.6 GHz and higher; but that's simply not Nintendo's thing.
 
My point it not fixed number. The power used by the psu is on What looks like a bell curve for efficiency. There is a sweat spot where efficiency is at its peak. If the wiiu has a high end psu it would be 90% but it's very unlikely. It most likely closer to 70-80% at wiiu power consumption like you stated.

Oh I understand what you're saying. I'm saying what I did was take an amount + 30% (in this scenario) instead of the total - 30%.
 

bomblord

Banned
Shouldn't we be more concerned with single core performance than the number of cores that developers have access to?

I know a lot of modern engines have taken better advantage of multithreaded functionality but I'm sure there are developers who still only dump the code on there and go for a brute force approach instead of taking the time to properly multithread their application.

Even Shinen claimed to only be using 1 core on their first game.
 
Give me a refresher course as to why binned parts would make this an issue.

If those embedded chips only represent the top whatever % in a yield, the rest can still be used as a higher drawing desktop part, a lower clocked mobile version with some disabled SIMDs, etc. With Latte, there is no other market for bad chips to go to, so they need to be extra conservative when it comes to how much they pack on and how high they clock it.

The 70% might be more realistic if the PSU was drawing 70-75watts. The best measurement we have is probably the fluke, which gave us 48watts with HDD and USB network adapter plugged in, given that is only about a 60% draw from the PSU (the load), it should be close to it's max efficiency @ 85-90%, going with that we are looking at 41 to 44 watts being used, 3 of the 4 USBs being used so - 7.5watts (worst case) gives us 34 to 37 watts for the entire system to draw after USB devices are unplugged and PSU has been taken into account.

The Broadcom chip in Wii U that enables the USB dongle can draw up to 6w itself. He gave us the reading without all that attached. 38w.
 

USC-fan

Banned
Those are SiSoft Sandra Dhrystones not the standard Dhrystone v2.1 benchmark.

The Bobcat I listed on that site/methodology gets listed as 5840 MIPS instead of the 5325 DMIPS I listed; there's a clear discrepancy. They should be indicative if you're comparing SiSoft figures to SiSoft figures... But we aren't.


Problem is SiSoft is a PC/x86 thing and closed source, so we really can't take those numbers and compare them to other architectures like PPC for that same reason.

The figures I posted should be indicative, really, I reckon they're taking into account the 20% increase AMD touted religiously, knowing that 20% increases are usually peak situations, therefore not realworld. I've ignored all that and gone for the best case scenario, precisely to be safe/for the numbers to be indicative even if inflated.Not so sure.

SSE and AVX are SIMD implementations, they'll have more impact with whetstones than dhrystones.

And of course, PPC750 strength really doesn't lie on whetstones.I honestly don't think so.

We know devs have access to 3 CPU's on the Wii U, and knowing Nintendo previous decisions with the Wii where the I/O portion of the OS ran on Starlet... Wii U probably offloads those same things to the similar named Starbuck.

Of course there are some dedicated resources, but Nintendo didn't have the ambition Sony and Microsoft had for OS, which is why they reserved way less resources to it (it's not as if Sony/Microsoft are using those resources right now, they most likely aren't, but they planned it so they might).

And honestly, Wii U and XBone/PS4 represent a leap so big in general purpose performance (and predictability) as well as featuring aditional DSP's (like sound) that once developers are used to the different nature of these architectures it shouldn't really be a problem compared to, say... PS3.

Fact is as a developer you'll have access to 3 cores on the wii, and 6 on PS4/XBone.
That's not a fact at all. We don't have information on What is reserved. We just have speculation.
 
That's not a fact at all. We don't have information on What is reserved. We just have speculation.

Marcan has said that the OS kernel runs across all CPU cores. What % of each core is reserved, who knows? The other 2 next gen systems may still be speculation right now, agreed.
 
If those embedded chips only represent the top whatever % in a yield, the rest can still be used as a higher drawing desktop part, a lower clocked mobile version with some disabled SIMDs, etc. With Latte, there is no other market for bad chips to go to, so they need to be extra conservative when it comes to how much they pack on and how high they clock it.

I think the issue for me in this case is that would normally be a concern for a new line of GPUs on a new process. We know the process for Latte is mature whatever it may be. And we also know other than eMemory, there's nothing that exotic about the architecture. So I don't see a high amount of bad chips coming from those wafers.

For me I don't understand the TDP impact issue of a console-designed GPU vs a binned GPU of similar specs. IMO it comes off like saying the former would automatically have a higher TDP, when I would think the opposite is true. I also still believe there is merit to the other points I made.
 
My point it not fixed number. The power used by the psu is on What looks like a bell curve for efficiency. There is a sweat spot where efficiency is at its peak. If the wiiu has a high end psu it would be 90% but it's very unlikely. It most likely closer to 70-80% at wiiu power consumption like you stated.

You are making a valid point, but I'd like to note that "bell curve" term is not used correctly here. Regardless, you just wrote that the PSU is just as inefficient when drawing less or more power equally distant from the "sweet spot." I didn't think PSUs worked this way.
 
That's not a fact at all. We don't have information on What is reserved. We just have speculation.
Regarding Wii U or the new HD-Twins?

I don't think that's speculation at this point, unless a lot of things we assume as facts regarding them are "speculation".

PS4 and XBone are no secret at this point, Sony elaborates on the RAM architecture and Microsoft isn't beating around the bush either. If there weren't only 6 cores available to devs we'd know.

And thing is, most likely no one is complaining because 6 cores that are not utter bullcrap at General Purpose for a change and a sound DSP probably make it so that it's more than adequate.
Marcan has said that the OS kernel runs across all CPU cores. What % of each core is reserved, who knows? The other 2 next gen systems may still be speculation right now, agreed.
We have to assume 6 cores available to developers, and it's more than appropriate that way too.

If that changes it changes, but everything points to it being that way.


Anyway, there's no doubt Jaguar is substantially more supercharged than the PPC750 specification in hand is; if anything because clocks are lower (not an indication of anything, but along with the dhrystone math), they're no slugs and they have more cores going on; which is a reason all in itself to not make it any worse by including two extra cores that are probably reserved for the system anyway. Point really was, on a watt vs general purpose power they're actually very effective for what they are; regardless of how narrow sighted the actual implementation really was (I find it really puzzling that they had a CPU this power efficient and small and went with only 3 cores of it; why not a fourth?)
 

strata8

Member
The figures I posted should be indicative, really, I reckon they're taking into account the 20% increase AMD touted religiously, knowing that 20% increases are usually peak situations, therefore not realworld. I've ignored all that and gone for the best case scenario, precisely to be safe/for the numbers to be indicative even if inflated.Not so sure.

Your other points are fine, but this is completely unsubstantiated. The Sandra benchmarks saw a 50% improvement per clock in MIPS and 20% improvement in FLOPS. Performance per clock in Cinebench 11.5 and x264 improved by 37% and 41% respectively. File decompression improvement hit almost exactly 20% as well.
 
Your other points are fine, but this is completely unsubstantiated. The Sandra benchmarks saw a 50% improvement per clock in MIPS and 20% improvement in FLOPS. Performance per clock in Cinebench 11.5 and x264 improved by 37% and 41% respectively. File decompression improvement hit almost exactly 20% as well.
I haven't seen Jaguar benchmarks yet.

Those numbers were lifted from a post I did months ago, before there were Jaguar benchmarks and going by the things AMD announced regarding Jaguar performance over bobcat; I didn't think they'd downplay the improvements seeing brands like Nvidia always severely inflate them. I still think they *should* be indicative because I was going by what AMD themselves claimed, but it was always a placeholder, sort to speak.

Jaguar has AVX albeit 128 bit one, and SSE 4.1 and 4.2 like you said, so Cinebench, x264, file compression improvements and flops don't surprise me.

50% increase in MIPS is surprising though, I'll have to look into it. And I'd like the see a Dhrystone benchmark and update the list, but not a SiSoft one. It's bound to appear; and Blu could also run it against his other tests sometime.
 
That link takes me to nothing.
Sorry, you are right, my fault.

I will try to find some specific examples to show what I was referring to. Keep in mind I'm just interested in helping you and others see where the conflict is so that we can have a more friendly conversation and there are others that have used a similar tone at times in this thread. I consider this a confrontational tone:

Don't inflect your ideology on me. I've said it a dozen time. I'm not here for console war/fanboy garbage like that. Stop trying to twist my words. You are accusing me of doing the opposite of what you and a few others are doing(as in trying your hardest to downplay the GPU).
http://www.neogaf.com/forum/showpost.php?p=77898657&postcount=8439
I don't care what "impresses" you as I doubt you would find anything released on Nintendo hardware impressive. Please take the console war stuff elsewhere. Too many people come in here for no other reason than to find angles to downplay the proposed gains demonstrated by the hardware and not actually help the analysis. My goal is to help determine the capability of the GPU, not to see which system/game is you find impressive.
http://www.neogaf.com/forum/showpost.php?p=77582693&postcount=8237
This is why it is irksome when people start slinging numbers around. The most important numbers to CPU/GPU performance aren't hertz and flops these day, but that's all that you see people posting, because most have an agenda(painting the hardware as weak) and having people only see low number on the forefront with no logical explanation as to how they relate to performance is key for painting a negative picture.
http://www.neogaf.com/forum/showpost.php?p=76204183&postcount=7704
What goes both ways?

Ah, yes. How could I forget X. Though as I said, nothing can ever convince them. They were anti-Nintendo from the start on all fronts and made up their mind before they ever saw a picture. The fact that they focused solely on something as insignificant as less detailed background buildings from a singe game sequence that stay on the screen for all of 3 seconds do to you zooming past them while ignoring "everything" else makes that clear.
http://www.neogaf.com/forum/showpost.php?p=64326146&postcount=6629
Bayonetta 2, Smash and Mario Kart 8 looked beyond anything I've seen on the PS3/360 technically. I doubt you would consider anything released on the console graphically superior even if the devs themselves claimed it was, though(which they have many times)

Nothing can convince a person who wears tinted glasses.
http://www.neogaf.com/forum/showpost.php?p=64322256&postcount=6626
Thank you. I thought I was alone in noticing this. Its like there is this small group who come in here for not other reason than bash the console and impede progress. Its like the thought of someone hating the Wii U causing them physical harm. Its so disgusting.
http://www.neogaf.com/forum/showpost.php?p=60299721&postcount=6084
That is your opinion, and I doubt you would find anything on this hardware impressive.
http://www.neogaf.com/forum/showpost.php?p=59825833&postcount=5951
Dude, will you stop harassing me and trying to vilify my statements. Unlike you, I am not beyond reason.
http://www.neogaf.com/forum/showpost.php?p=59795233&postcount=5905
This thread is about analyzing the Wii U GPU, but every time someone points out where a Wii U game exceeds a last gen game, here come angry fans trying to dismiss it either by making some arbitrary claim of the 360/PS3 being able to do it as well despite all details to the contrary with no real supporting facts or flipping the argument to thePS4 even though it was never being argued that it was not stronger than the Wii U by anyone. Jordan is notorious for this. Especially the latter. Just look above. What does Killzone 4 have to do with the Wii U or any game on it? How does that further progress this discussion?
http://www.neogaf.com/forum/showpost.php?p=59711649&postcount=5747

At this point, we have to ask ourselves what's the difference between calling someone a hater and saying "you wouldn't find anything on the system impressive anyway", saying they have an agenda to downplay the GPU or that they are anti-Nintendo angry fans? Differente words, same meaning.

It's this confrontational tone what's causing the responses you are seeing, not your opinions. And I agree and have no doubts that you are trying to discuss the GPU and find more info about it. I also agree that you care about facts (even though I must say a screenshot has to clearly support your claim, otherwise it's just a subjective observation). In other words, I have no doubts that you have good intentions and you find that you are doing the right thing, but every time people focuses in confrontation instead of collaboration it brings the discussion to a halt, not to mention it makes many others avoid the discussion altogether. This applies to this thread and everything else. In real life this would lead you nowhere as well. Any successful conversation I have ever had had a component of good faith or trust, as well as friendliness.
 

krizzx

Junior Member
Sorry, you are right, my fault.

I will try to find some specific examples to show what I was referring to. Keep in mind I'm just interested in helping you and others see where the conflict is so that we can have a more friendly conversation and there are others that have used a similar tone at times in this thread. I consider this a confrontational tone:


http://www.neogaf.com/forum/showpost.php?p=77898657&postcount=8439

http://www.neogaf.com/forum/showpost.php?p=77582693&postcount=8237

http://www.neogaf.com/forum/showpost.php?p=76204183&postcount=7704

http://www.neogaf.com/forum/showpost.php?p=64326146&postcount=6629

http://www.neogaf.com/forum/showpost.php?p=64322256&postcount=6626

http://www.neogaf.com/forum/showpost.php?p=60299721&postcount=6084

http://www.neogaf.com/forum/showpost.php?p=59825833&postcount=5951

http://www.neogaf.com/forum/showpost.php?p=59795233&postcount=5905

http://www.neogaf.com/forum/showpost.php?p=59711649&postcount=5747

At this point, we have to ask ourselves what's the difference between calling someone a hater and saying "you wouldn't find anything on the system impressive anyway", saying they have an agenda to downplay the GPU or that they are anti-Nintendo angry fans? Differente words, same meaning.

It's this confrontational tone what's causing the responses you are seeing, not your opinions. And I agree and have no doubts that you are trying to discuss the GPU and find more info about it. I also agree that you care about facts (even though I must say a screenshot has to clearly support your claim, otherwise it's just a subjective observation). In other words, I have no doubts that you have good intentions and you find that you are doing the right thing, but every time people focuses in confrontation instead of collaboration it brings the discussion to a halt, not to mention it makes many others avoid the discussion altogether. This applies to this thread and everything else. In real life this would lead you nowhere as well. Any successful conversation I have ever had had a component of good faith or trust, as well as friendliness.


1st post, the guy makes a pesonal attack and insults me. I called him out on it. He initiated that "confrontation". Also, I was not praising the GPU which would be the polar opposite of what he was doing. That doesn't contradict at all.

2nd post. What exactly did i do wrong in the second post? It was just as I said. He(and many others I encountered) use the fact that something doesn't impress them to disregard it. The unending problem with this is that it was never my goal or the purpose of the post to do so. The focal point was the contrast between the photos but they never addressed it. I called it like it was. There is nothing I could possibly show them on a Nintendo console that would impress them. There was no better way to put it.

3rd post. I don't get how that correlates with the boded at all.

4th post. Same as above. What is the problem in that post? That was what they have shown. They will not engage in a logical discussion no matter how much I try to have one with them. If you present fact they will ignore them and change the focus. They say the games look the same, you show a dev or proffesional analyst comment that says the Wii U version was superior someone, they immediately switch to talking about a flaw in a launch port and never address the evidence I provide or go back to that topic. That is how the conversations have gone every time I tried to have a normal, respectable conversation with them. So I stopped trying to have and simply called out what they were doing. This is not a "1" time occurence. I didn't just lash out at them for speaking negatively about the Wii U. They move the goal post every time their argument falls flat, and that is my problem with what they say, not whether or not they agree with my opinion or like Nintendo like you are trying to paint it These are repeat offenders
. I usually just ignore them but then they started attacking me directly as shown above.


5th post. I stated my opinion similar to what many others have stated. That there is a group of people who come in here for no other reason than to downlplay anything that looks like its promoting the Wii U. They do drive by's and then leave. I am not the only one who sees this.
http://www.neogaf.com/forum/showpost.php?p=78122697&postcount=8585
http://www.neogaf.com/forum/showpost.php?p=78103049&postcount=8574
http://www.neogaf.com/forum/showpost.php?p=78100449&postcount=8566
http://www.neogaf.com/forum/showpost.php?p=78059481&postcount=8531
http://www.neogaf.com/forum/showpost.php?p=78036261&postcount=8514
http://www.neogaf.com/forum/showpost.php?p=78058001&postcount=8527
What have i done wrong in this instance?

6th post. Same as above. When the other guy said he didn't find it impressive. I don't expect them to nor care if they do. What they find impressive was 100% irrellevent to what was being discussed, but all they had to contribute was their dismissal on the grounds that it didn't impress them like that was some form of credible measurement of capability. The point is the difference images showed to one another under the circumstance, but they wouldn't address those. They instead dismissed the analysis on the grounds that it is not impressive with nothing even remotely concrete backing it.

7th post. Goes back to the first post. He was using personal attacks and would not adhere to any reasoning whatsoever which a lot of people called him on if you read anything around it. There was no discussing with him. His sole goal was attacking me directly at that point I had enough. Once again. I respond to people dependent on how they approach me.

8th post. What did i do wrong here. This is a common occurrence. Heck someone did it again to day.
http://www.neogaf.com/forum/showpost.php?p=78097689&postcount=8561
I do not make this up. It happens over and over and over and over and over and over and over in the exact same way. And once again, I would have had no problem with it but he intentional directed it specifically at me coupled with false accusation with the intent of making a problem with me.



You quoted a lot of things I said but none of them were said for the reasons you are trying to make them out to be or towards the end you are trying to place them at. I was not calling them haters in any form. I was address their behavioral history directly. If a person is constantly dismissive of everything I say and constantly ignores all of the material I post to support it then I am not going to keep trying to reason with them. I'm just going to state directly what they have demonstrated. That nothing I say will matter to them in continuing the conversation would be stupidity on my part. That's not calling them a hater. That's saying that I've had enough of trying to hold a discussion with them.

I take a logical tone. How you or anyone chooses to interpret is beyond my control. I am about facts and detail which is another things that I have repeated a ridiculous amount of time.

Feel free to disagree with me, so long as your provide credible material to back up what you say. I certainly will. Unless you are a dev, or a person who has proven his experience like BG_Assassin, Fourth Storm, blu, Lostinblue or the other major posters, then I'm not going to take you own your word at face value with nothing backing it. Especially when you have a history of changing your arguments at the drop of a hat.

I have "one" agenda in this thread and that is to learn what the GPU can and can't do. My post all go toward that end.
 

krizzx

Junior Member
its gonna be a lot of meltdowns and disappointment this generation and i cant wait. but as far as thread topic everyone is due to have their own opinion... lets not get our feelings hurt over everything i hope there are mods watching this thread though... some stuff have gone over the line... now back to the GPU talk.

I wish we could just get back to the GPU too.

I'm so tired of all of these attacks. Its like its a crime to suggest something on the Wii U is better than something on the PS3 or 360. Comparing games is the best gauge I can think of barring another VGleak.

On that note. Does anyone have any clue as to what the the purpose of that 700 MB of unused RAM is for? There is like an entire RAM chip that just stays dormant. Maybe that has something to do with the power draw as well, since if a RAM chip is not being used fully, then it will consume more power when it is finally used.

Another things. It says MEM1 is used for graphic libraries. Does that mean graphics libraries for games or just the OS? That could mean that the other 1GB of RAM that is supposedly for the OS is already being used in games sorta, thus the Wii U has more than 1 GB of RAM contributing to the games already. This is stretch though.

Also why does it list MEM1 as 32MB(the size of the eDRAM) in one diagram and 1 GB in the other? This looks like some sort of labeling mistake.

wii_u_mem.jpg
wii_u_mem2.jpg
 

JordanN

Banned
but you tend to use a more confrontational tone than others at times
I'm so tired of all of these attacks. Its like its a crime to suggest something on the Wii U is better than something on the PS3 or 360.
I HAVE NOT said that the WiiU can not beat the twins, just that nothing that has been SHOWN looks clearly beyond the PS3 and 360
I'm so tired of all of these attacks. Its like its a crime to suggest something on the Wii U is better than something on the PS3 or 360.
Enlighten me with your sources what makes the Wii U more than 20% more powerfull than the PS360. All it has is more advanced GPU architecture which can more in less cycles.
I'm so tired of all of these attacks. Its like its a crime to suggest something on the Wii U is better than something on the PS3 or 360.
I never said Wii U was not better than Xbox 360. I'm only against point blank judgements off what is clearly circumstantial evidence.
I'm so tired of all of these attacks. Its like its a crime to suggest something on the Wii U is better than something on the PS3 or 360.

I'm not even bothered if he/she doesn't read it. Someone on the internet will and they can judge who's right.
 
It is true that a lot of the time when krizzx says (constantly) something along the lines of 'Geez, can we just stay on topic here? I'd like to discuss (enter point). What about that?', then a few posters jump down his throat about it. Not to mention that sometimes, he's ignored completely. It would seem to make sense that people who have a solidified opinion would just move on, and let people who want to discuss, discuss. But it's as if, because sometimes it appears that discussion will move away from certain conclusions that others are satisfied with, they can't. I fully understand krizzx' frustration. Even if I also do understand the frustration that some people have with the shortness with which he also treats people with at times due to that frustration.

Also, it seems that some of the frustration, (and I hope I'm not misrepresenting his feelings here) comes from the fact that sometimes new people, or people who have been away for a while jump in, and expect him to go pages back and quote himself on things he's already covered, or explain anew. Then get upset when he doesn't. When from his end, it appears that their motive for discussing it isn't for moving the conversation forward, but to try and discredit him/shut him up.

(Just saying these things because it really seems like he's being attacked, and perhaps people may want to try and understand a different perspective regarding this).
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
If it can be shown that those embedded parts are not binned/rebranded laptop chips, I will agree with your comparison. I just don't think we can make that leap and claim that there's no binning going on with that line.
They are not. Embedded parts and consumer parts have entirely different lifespans. How do I know that? I've spent ~6 years of my career working with AMD's (and other vendors') embedded parts.
 
You know guys. It all dosen't matter anymore! I haven't really talked much for quite some time in this thread or other threads about how capable Wii U is. For a simple reason...

176 gflops? 352 gflops? It dosen't matter!

Anything being discussed in here dosen't matter to me anymore since E3.

Wii U could have 1 gflop and it wouldn't matter.

Wii U since E3 IMO showerd exactly what i (and several others) expected Wii U to do. Being a good step up above PS360.

If you look at "X" you have to keep in mind that Nintendos games get the really good visual bling bling at the end of the development cycle. Now imagine how this game will look a year from now. Cause if its really coming in 2014, its propably a holiday title. So comparing games that are out or coming in the next weeks to "X" makes no sense. Remember that Pikmin 3 and The Wonderful 101 both improved alot in their last year of development! Also a game like GTA 5 does look incredible because they propably spent over 100 million on it and have a far bigger team on it than any Wii U game will ever see. "X" is propably done with a fraction of GTA 5s budget. A huge budget can make quite a difference that has nothing to do with technical capabilities of the hardware.

Also Wind Waker HD looks incredible on the latest footage. Full 1080p + really awesome lighting effects and higher res textures and proper widescreen. IMO All that was needed to bring this game to 2013 was done.

Mario Kart 8 looks absolutely amazing. Great textures, awesome character/kart models and effects. All in 60 frames/sec. Not buyin the 1080p everyone is speaking of until its official though.

Smash is 1080p60 and looks really amazing. N'uff said :p

Bayonetta 2 looks alot better than Bayonetta 1. And i don't buy the "Bayonetta is a 2010 game so it dosen't count" excuse. Bayonetta 1 came 2010 and was developed on a mature Xbox 360 dev kit from a team with good technical abilities. So the Bayo1/2 comparison is valid. Also both games are propably utilising a comparable budget since i can't see Nintendo "wasting" as much cash on a game as other developers. Especially not on a "B" franchise like Bayonetta wich only appeals to a niche audience. IMO Bayo 1 vs. Bayo 2 is the most exact comparison we have right now if you also use the budget as a factor (And you should)

3D World looks really great and super clean and i wouldn't be surprised if it was 1080p60. (Yet i would be ok woth 720p60 too and wouldn't whine or laugh about it like some people on here would)

The people who still try to push the "Wii U = 360" agenda in terms of capabilities are the same people coming to every Wii U thread to trashtalk so their opinions and statements have absolutely no value to me. Im not calling those people out, you know who you are.

Thats just my 2 cents to the Wii U capabilities talk.
 

z0m3le

Banned
They are not. Embedded parts and consumer parts have entirely different lifespans. How do I know that? I've spent ~6 years of my career working with AMD's (and other vendors') embedded parts.

Yeah this makes sense, thanks for the experienced comment about it. Considering they sell for 2+ years on the same embedded parts and sell millions of units, binning never made much sense to me, especially because even when they are based on the same core, as a desktop series, they are a different config.
 
You know guys. It all dosen't matter anymore! I haven't really talked much for quite some time in this thread or other threads about how capable Wii U is. For a simple reason...

176 gflops? 352 gflops? It dosen't matter!

Anything being discussed in here dosen't matter to me anymore since E3.

Wii U could have 1 gflop and it wouldn't matter.

Wii U since E3 IMO showerd exactly what i (and several others) expected Wii U to do. Being a good step up above PS360.

If you look at "X" you have to keep in mind that Nintendos games get the really good visual bling bling at the end of the development cycle. Now imagine how this game will look a year from now. Cause if its really coming in 2014, its propably a holiday title. So comparing games that are out or coming in the next weeks to "X" makes no sense. Remember that Pikmin 3 and The Wonderful 101 both improved alot in their last year of development! Also a game like GTA 5 does look incredible because they propably spent over 100 million on it and have a far bigger team on it than any Wii U game will ever see. "X" is propably done with a fraction of GTA 5s budget. A huge budget can make quite a difference that has nothing to do with technical capabilities of the hardware.

Also Wind Waker HD looks incredible on the latest footage. Full 1080p + really awesome lighting effects and higher res textures and proper widescreen. IMO All that was needed to bring this game to 2013 was done.

Mario Kart 8 looks absolutely amazing. Great textures, awesome character/kart models and effects. All in 60 frames/sec. Not buyin the 1080p everyone is speaking of until its official though.

Smash is 1080p60 and looks really amazing. N'uff said :p

Bayonetta 2 looks alot better than Bayonetta 1. And i don't buy the "Bayonetta is a 2010 game so it dosen't count" excuse. Bayonetta 1 came 2010 and was developed on a mature Xbox 360 dev kit from a team with good technical abilities. So the Bayo1/2 comparison is valid. Also both games are propably utilising a comparable budget since i can't see Nintendo "wasting" as much cash on a game as other developers. Especially not on a "B" franchise like Bayonetta wich only appeals to a niche audience. IMO Bayo 1 vs. Bayo 2 is the most exact comparison we have right now if you also use the budget as a factor (And you should)

3D World looks really great and super clean and i wouldn't be surprised if it was 1080p60. (Yet i would be ok woth 720p60 too and wouldn't whine or laugh about it like some people on here would)

The people who still try to push the "Wii U = 360" agenda in terms of capabilities are the same people coming to every Wii U thread to trashtalk so their opinions and statements have absolutely no value to me. Im not calling those people out, you know who you are.

Thats just my 2 cents to the Wii U capabilities talk.

Yes. Some of the already available and upcoming games prove that the Wii U has obviously more power than PS360. People that even now have doubts about that could not be taken serious and emberass oneself.

Despite this some parts of this thread are interesting to read because there are some unusal parts of the Wii U hardware you wont find in other consoles or pc tech. Nintendo engineers seems to be as unusal and original as the Nintendo game maker ;)
 
krizzx's post
You don't need to address those posts, they were just examples to make a point about something that creates a negative effect.

What I'm trying to show is that tone is very important in a conversation, and if you don't feel a conversation is going anywhere for whatever reason, the best course of action is to stop and focus on something else. People are here for the GPU analysis and friendly discussion, and when we have these back-and-forth arguments or comments about how unreasonable this or that guy is, not only does it detract from the conversation, it also changes the tone of the discussion in a bad way. It's better to keep it civil or move along.

I realize having this conversation is not only extremely off-topic but also not the best thing to do, but this whole thing has been going on for almost 200 pages and the last few have been especially bad, so I was hoping many would realize that for most people reading the thread it's actually a problem, like going to a bar and seeing the owner arguing with the barmaid.

Does this make sense?


ArchangelWest's post
Yep, as I said, this is a problem with frustration. Some people just deal with it better than others.

All sides are at fault at different times.


I'm not even bothered if he/she doesn't read it. Someone on the internet will and they can judge who's right.
You didn't get my point at all! Quick, how many polys does Princess Peach's underwear have?
 

69wpm

Member
Something I've always wondered: All Wii U games seem to be v-synced. Could this be some kind of (forced) feature of the GPU?
 
Something I've always wondered: All Wii U games seem to be v-synced. Could this be some kind of (forced) feature of the GPU?
IIRC, Darksiders 2 had tearing according to DF, and if that's the case it's not a forced feature. Pretty much every Wii U game is V-synced though, so it may be important due to gamepad streaming.
 

69wpm

Member
IIRC, Darksiders 2 had tearing according to DF, and if that's the case it's not a forced feature. Pretty much every Wii U game is V-synced though, so it may be important due to gamepad streaming.

Darksiders 2 kind of disproves that theory

Well, there goes that theory. I know the PC version had screen tearing, but that got fixed with a patch. Can't believe the Wii U version has the same issues...
 
Well, there goes that theory. I know the PC version had screen tearing, but that got fixed with a patch. Can't believe the Wii U version has the same issues...

I assume they could've fixed it quite easily with a patch too, but there's no one left to patch it anymore.
 
They are not. Embedded parts and consumer parts have entirely different lifespans. How do I know that? I've spent ~6 years of my career working with AMD's (and other vendors') embedded parts.

Thanks for the clarification.


Yeah that's what made it easier for me to go and focus on other things. It's on Nintendo to show us what Wii U is capable of. And thanks for reminding me of E3. :(

It still irks me because MK8 is what I had been looking for in the next 3D Mario and instead I got cats. :/

Yeah, this makes sense, thanks for the experienced comment on it. Considering they sell the same embedded parts for 2+ years and move millions of units, binning never made much sense to me, especially since even when they're based on the same core as a desktop series, they use a different config.

Yep. That's why I wanted to look at the TDP of an embedded chip because its configuration was much closer to a console environment than a discrete card.
 

krizzx

Junior Member
It is true that a lot of the time when krizzx says (constantly) something along the lines of 'Geez, can we just stay on topic here? I'd like to discuss (enter point). What about that?', a few posters jump down his throat about it. Not to mention that sometimes he's ignored completely. It would seem to make sense that people who have a solidified opinion would just move on and let people who want to discuss, discuss. But it's as if, because sometimes it appears that discussion will move away from certain conclusions that others are satisfied with, they can't. I fully understand krizzx's frustration, even if I also understand the frustration some people have with the shortness he treats people with at times because of it.

Also, it seems that some of the frustration (and I hope I'm not misrepresenting his feelings here) comes from the fact that sometimes new people, or people who have been away for a while, jump in and expect him to go pages back and quote himself on things he's already covered, or to explain anew, and then get upset when he doesn't. From his end, it appears that their motive for discussing it isn't to move the conversation forward, but to try and discredit him/shut him up.

(Just saying these things because it really seems like he's being attacked, and perhaps people may want to try and understand a different perspective regarding this.)

You hit the nail on the head perfectly.

Darksiders 2 kind of disproves that theory

V-sync was only one of this game's problems. Darksiders 2 is the worst port to hit the console, barring EA's year-old sports games. It's clear that the game wasn't even finished. They rushed it out for bucks before the bankruptcy took full effect.
 
Usually the efficiency is around the same regardless of load. Being honest, I'm going with 70-75% efficiency due to cost. That said, I do think the load is going to play a big role in trying to learn more about the hardware.

I doubt the efficiency is that low. You're talking cheap Chinese knock-off power supplies that hit that level nowadays. Hell, cheap $30 power supplies are hitting 80%+ efficiency these days, and those are putting out 3-4x the wattage. I'm not saying it's definitely 90%+, but it's at least 80%+.

Less efficient power supplies also burn out more easily/faster, and when has Nintendo ever been known to use something that doesn't last?
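
For what it's worth, here's a minimal sketch of why that percentage matters for the whole Latte power debate: wall draw times PSU efficiency is roughly what's left over for the CPU, GPU and RAM combined. The 33 W wall figure below is just an assumed in-game measurement for illustration, not an official number, and the efficiency values are the ones being argued about here.

```python
# Rough sketch: how much power the internals actually get for a given wall
# measurement, under different assumed PSU efficiencies. The 33 W wall draw
# is an assumption for illustration, not a measured or official figure.

def internal_power(wall_draw_w: float, psu_efficiency: float) -> float:
    """Power delivered to the board, given the draw measured at the wall."""
    return wall_draw_w * psu_efficiency

wall_draw_w = 33.0  # assumed in-game wall draw, watts

for eff in (0.70, 0.75, 0.80, 0.90):
    budget = internal_power(wall_draw_w, eff)
    print(f"{eff:.0%} efficient PSU -> ~{budget:.1f} W for CPU + GPU + RAM + the rest")
```

So the gap between a 70% assumption and a 90% one is several watts of budget, which is a big deal when people are trying to back out what the GPU alone could be drawing.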
 

krizzx

Junior Member
Yes, some of the already available and upcoming games prove that the Wii U obviously has more power than PS360. People who even now have doubts about that can't be taken seriously and embarrass themselves.

Despite this, some parts of this thread are interesting to read, because there are some unusual parts of the Wii U hardware you won't find in other consoles or PC tech. Nintendo's engineers seem to be as unusual and original as Nintendo's game makers ;)

This is what draws me to the hardware. This is also how I believe a console should be.

Stock GPUs have dozens of functions meant for PCs, such as things to help with viewing PowerPoint, various video streaming and decoding, or processing Excel sheets and so on. Things like that are a waste of space and a waste of power in a console.

I cannot view a machine that uses stock parts as a console. It looks like a static PC to me.

This goes back to what I suggested on the last page. Part of the difference in power draw between Latte and the stock card it's being compared to for TDP could simply be the fact that this isn't a stock GPU. It lacks all of those auxiliary components used for things that aren't gaming. Everything in the GPU uses electricity; there is no free power, so it would make sense that a card with a 38 watt TDP would be spending some of that on the extra features in the standard GPU. Take those out, and then what kind of watt-for-watt performance would the GPU be getting?

We know that Nintendo customized the chip with energy efficiency in mind, and let us not forget the quote "there is no wasted silicon". I take this to mean that there is no empty die space and that there are no components that the Wii U does not put to use.

Also, I got no response to the query about RAM power consumption. According to the vgleaks leaked diagram, there is an entire 512 MB chip in the Wii U that isn't being used at all. How much difference in power will it make when that extra 700 MB of unused memory in MEM1 is put to use in these consoles?
 
I doubt the efficiency is that low. You're talking cheap Chinese knock-off power supplies that hit that level nowadays. Hell, cheap $30 power supplies are hitting 80%+ efficiency these days, and those are putting out 3-4x the wattage. I'm not saying it's definitely 90%+, but it's at least 80%+.

Less efficient power supplies also burn out more easily/faster, and when has Nintendo ever been known to use something that doesn't last?

I definitely understand what you're saying and can agree with it. The pessimistic side in me feels the controller ate more into costs than Nintendo intended.

Also, I got no response to the query about RAM power consumption. According to the vgleaks leaked diagram, there is an entire 512 MB chip in the Wii U that isn't being used.

Just wanted to address this because I forgot to with your other post, but that's not how it would work. Put simply, the data and space would be spread out over the 2GB, not per chip. Also, the memory will have a very low impact on power draw.
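
To illustrate the "spread out over the 2GB, not per chip" point, here's a toy sketch of address interleaving across four DRAM chips. The Wii U's actual memory-controller mapping isn't public, so the only real detail below is the chip count matching the board; the interleave granularity is a made-up value purely for illustration.

```python
# Toy model of memory interleaving: consecutive blocks of the physical address
# space rotate across the DRAM chips, so even a small allocation ends up
# touching every chip. NUM_CHIPS matches the four 512 MB DDR3 chips on the
# board; the granularity below is an assumption, not the real mapping.

NUM_CHIPS = 4
INTERLEAVE_BYTES = 1024  # assumed interleave granularity

def chip_for_address(addr: int) -> int:
    """Which chip a physical address lands on under simple interleaving."""
    return (addr // INTERLEAVE_BYTES) % NUM_CHIPS

# A 16 KB region already hits all four chips:
touched = {chip_for_address(a) for a in range(0, 16 * 1024, INTERLEAVE_BYTES)}
print(sorted(touched))  # [0, 1, 2, 3]
```

Which is also part of why "using the other 1GB" shouldn't show up as a meaningful change at the wall: the chips are powered and refreshed whether or not the OS hands that memory to games, and active accesses on DDR3 only add a small amount on top.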
 