
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I guess that comment is going to be subjective then because I don't even get adding transistors from "adjusting" in that statement.

Well, there's got to be some hardware on there running translation from TEV code into shader language and such. However we take it, my point was that the design is not just R700 - it's been altered at least somewhat at the hardware level.


Of course we are. That's what I'm getting at. No one can draw a proper conclusion with that info.

The point you make here would fit the argument for a 320 part more than a 160 part, IMO. And the overheating was also supposedly with a 640 ALU part.

The point I made doesn't support any shader quantity over the other. The only thing that it says, is that for whatever Latte is, it's currently running at the highest clock they could get it while maintaining acceptable yields. I just can't see a 640 ALU part as ever being in the dev kits. It makes no sense of any of the other information we've gotten since.
 
The point I made doesn't support any shader quantity over the other. The only thing that it says, is that for whatever Latte is, it's currently running at the highest clock they could get it while maintaining acceptable yields. I just can't see a 640 ALU part as ever being in the dev kits. It makes no sense of any of the other information we've gotten since.


I guess you need to explain a little more how you come to that conclusion (unless from your POV this is true for every single console design).
 
Not sure how you can correlate yield levels with a cooling problem in the dev kits. Not to mention those early dev kits were not using the MCM but rather had a discrete GPU. As far as I'm aware (and I could very well be wrong) there were no overheating issues with the later kits using the "real" hardware.

I doubt there was just a straight up Radeon card in the dev kits. Have you seen how small the case is? Not much larger than the Wii U itself. I'd imagine there was some type of prototype Wii U chipset in there. The GPU has a 2010 stamp on it, although I'm sure testing and optimization continued for quite some time. The idea of "cramming" a Radeon card into that small case was a rationalization made by us forum members who were holding out for more hardware beef.

Uuh... what? It's pretty unlikely that the very different architectures of PS4 and Wii U randomly result in the same "perfect" CPU/GPU clock ratio.

Nintendo seemed to like even multipliers in the past, and the Wii U architecture isn't entirely removed from the Wii and GCN designs. We were all surprised when the clock speeds were revealed. The current, seemingly arbitrary, clocks seem to be the result of Nintendo pushing everything higher than planned, especially the CPU, after devs got the dev kits and complained. This is speculation, yes, but it makes sense of the whispers we've heard.
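To put rough numbers on the "even multiplier" point, here is a quick sketch using the commonly reported clocks (the Wii U figures being marcan's reported values; treat all of them as approximate, and the comparison as illustration only):

```python
# Rough sketch: CPU/GPU clock ratios for Nintendo consoles vs. PS4.
# Figures are the commonly reported ones (Wii U clocks per marcan's report),
# used here only to illustrate the "even multiplier" point above.
clocks_mhz = {
    "GameCube": (486.0, 162.0),      # Gekko / Flipper
    "Wii":      (729.0, 243.0),      # Broadway / Hollywood
    "Wii U":    (1243.125, 549.999), # Espresso / Latte (reported)
    "PS4":      (1600.0, 800.0),     # Jaguar / GPU (reported)
}

for name, (cpu, gpu) in clocks_mhz.items():
    ratio = cpu / gpu
    print(f"{name:9s} CPU {cpu:8.3f} MHz / GPU {gpu:7.3f} MHz -> ratio {ratio:.3f}")

# GameCube and Wii come out at a clean 3:1 and PS4 at 2:1, while the reported
# Wii U clocks give ~2.26:1 - which is why they look "arbitrary" by comparison.
```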

I guess you need to explain a little more how you come to that conclusion (unless from your POV this is true for every single console design).

Why would they intentionally gimp the GPU? It's not like 550 MHz is some magic number that makes sense of anything else in the hardware. If they could have had it clocked higher, while maintaining good yields at the target TDP, they would have. Or at least I would hope they would have. To deny this is to claim that Nintendo weren't even trying.
 

OryoN

Member
I just can't see a 640 ALU part as ever being in the dev kits. It makes no sense of any of the other information we've gotten since.

Didn't we get tipped off by a pretty trusted insider as to generally what type of card was in the kits? (my memory may be foggy, so correct/refresh me if wrong).

Also, wouldn't they need a pretty decent card in the first kits, especially to compensate for the lack of a large eDRAM pool, among other enhancements? Not saying a 640 ALU part definitely HAD to be in there, but still, I'd imagine it would have been - at least on paper - a bit more 'beasty' than what we see in Latte (minus Latte's performance/efficiency/special enhancement gains). You don't agree?
 

krizzx

Junior Member
So you are done here?

I doubt it. Not as long as people still suggest that the Wii U can perform better than the lowest end hypothesis, its worst ports and high end AAA PS3/360 games.

Didn't we get tipped off by a pretty trusted insider as to generally what type of card was in the kits? (my memory may be foggy, so correct/refresh me if wrong).

Also, wouldn't they need a pretty decent card in the first kits, especially to compensate for the lack of a large eDRAM pool, among other enhancements? Not saying a 640 ALU part definitely HAD to be in there, but still, I'd imagine it would have been - at least on paper - a bit more 'beasty' than what we see in Latte (minus Latte's performance/efficiency/special enhancement gains). You don't agree?

As I recall, it was an underclocked 4850 in earlier dev kits.

Iwata Asks for Bayonetta 2 can be found HERE ;).

It's rather brilliant. Post Of The Year Stuff... Or even the Post Of All The Internets!!

I agree with what you wrote. The differences are clear, and if anybody still doubts it, they should see this breakdown by 3Dude on another site. If this was being released on PS360 consoles, it would be downported and those versions wouldn't be able to keep up. I totally reject the "2010 doesn't count" moving of the goalposts here, when earlier in this thread people were perfectly happy to throw around Just Cause 2 (a title released in North America in the same year, and two months after Bayonetta) in relation to Monolith Soft's Project X in their desperate reach to play down anything and everything the Wii U had going for it over PS360 consoles. 2010 meant that the X360 had been out for almost 4 1/2 years, and the PS3 for about 3 1/2. Please note those time frames, realise that developers were far more familiar with PS360 consoles back then than they are with the Wii U at this point, and then understand why I think the idea that it somehow "doesn't count" is complete BS.

Bayonetta 2 will be released in 2014 - at that point, up to 1 1/2 years into the Wii U's life, and within up to a third of the X360's time. We haven't seen the rest of the game, and there is plenty of time to polish it (after all, the finished article of The Wonderful 101 looks better than when it was revealed at E3 2012...). Even with Project X, one can see how it's progressed since January - the E3 trailer shows marked differences.

Another point to add, which many people fail to do, is that one must allow for the fact that this is Nintendo's first step into the HD gaming development era - their first efforts are already showing improvements over the 7th generation at this stage. Certainly, all of the signs are encouraging. If you can't see the improvements now, then let us come back in 2016, 2017 and 2018, and you will have seen the noticeable steps. Or even 2020 - that way, one can say that the Wii U had eight years, just as the X360 has. But it doesn't need eight years; it's already there.

Thanks for linking the Iwata Asks, and I agree with your post completely. There is this select group of individuals who will use anything they can find, coupled with the most irrational arguments, to play down examples of the Wii U outdoing the last gen consoles by any significant amount. I could name most of them off the top of my head.
 

prag16

Banned
Nintendo seemed to like even multipliers in the past, and the Wii U architecture isn't entirely removed from the Wii and GCN designs. We were all surprised when the clock speeds were revealed. The current, seemingly arbitrary, clocks seem to be the result of Nintendo pushing everything higher than planned, especially the CPU, after devs got the dev kits and complained. This is speculation, yes, but it makes sense of the whispers we've heard.

This was a long time ago at this point, and maybe it was fully validated and explained beyond any possible doubt many times, but are we 100% cock sure on those reported clock speeds? Didn't Marcan glean those while the system was running in Wii mode (I remember he had to hack around a bit to even unlock the other two CPU cores in that mode or something?)?

Did anyone ever corroborate this independently, or was marcan the only one? Again, sorry if this was 100% irrefutably explained before, but I've been in and out on this thread from way back and haven't seen 100% of the content.
 
Well, there's got to be some hardware on there running translation from TEV code into shader language and such. However we take it, my point was that the design is not just R700 - it's been altered at least somewhat at the hardware level.

I gotcha, but I don't see anything being unique enough to cause a high level of concern.


The point I made doesn't support any shader quantity over the other. The only thing that it says, is that for whatever Latte is, it's currently running at the highest clock they could get it while maintaining acceptable yields. I just can't see a 640 ALU part as ever being in the dev kits. It makes no sense of any of the other information we've gotten since.

Ok. However, that is a rather long jump to that conclusion when we know how overly cautious Nintendo is.

I will say that around the time right before I stopped posting last year someone told me something along the lines of Nintendo reducing their original plans after the overheating issue. By then after hearing conflicting things I didn't even care and just wanted to focus on other things.
 
Thanks for linking the Iwata Asks, and I agree with your post completely. There is this select group of individuals who will use anything they can find, coupled with the most irrational arguments, to play down examples of the Wii U outdoing the last gen consoles by any significant amount. I could name most of them off the top of my head.

...you...DO realize that this IwataAsks was a joke post, right?
 
This was a long time ago at this point, and maybe it was fully validated and explained beyond any possible doubt many times, but are we 100% cock sure on those reported clock speeds? Didn't Marcan glean those while the system was running in Wii mode (I remember he had to hack around a bit to even unlock the other two CPU cores in that mode or something?)?

Did anyone ever corroborate this independently, or was marcan the only one? Again, sorry if this was 100% irrefutably explained before, but I've been in and out on this thread from way back and haven't seen 100% of the content.

I'm not 100%. I wouldn't give his analysis any credence, to be honest. Partly because of the fact that it was done in Wii mode. Also, I can't take him seriously after his "duct tape" comment - It was almost as if I was reading some dumb meme. But that's just my opinion, nothing else.
 
I gotcha, but I don't see anything being unique enough to cause a high level of concern.




Ok. However, that is a rather long jump to that conclusion when we know how overly cautious Nintendo is.

I will say that around the time right before I stopped posting last year someone told me something along the lines of Nintendo reducing their original plans after the overheating issue. By then after hearing conflicting things I didn't even care and just wanted to focus on other things.
Hmm, I remember talking to you during those times, and I think we generally agreed that whatever Nintendo had in those earlier underclocked dev kits had either similar or weaker performance than the ones in the final dev kits. This also matches Matt and lherre's info in that the dev kits did get stronger.

Gearbox did say that there were some things they could do on the earlier dev kits but not on the later ones, though. Maybe it was a change in architecture, but the statement was vague.
 
I think it's more due to form factor that they stuck with flash. Where would you stick a HDD in that case? If you read the Iwata Asks on the console, they actually started with the case design and then designed the hardware around what would fit inside it - quite...curious a decision.

I see. It seems the size of the console and the included (and expensive) tablet controller had just as much (if not more) to do with Nintendo's WiiU specs as their wanting to avoid a specs war with Sony or MS.

Are the Lego City loading times long?

It's one of the few WiiU exclusives I don't own as I'm not a fan of the Lego games, but a friend has it and says the loading times are really, really bad but the game is very good.

W-101's loading times aren't awful but it's about a 30 second wait in between the main chapters.

What components of a console affect loading times? I was under the impression it's the speed of the media and RAM bandwidth if the game is coming from a DVD/Blu-ray.
 
I think it's more due to form factor that they stuck with flash. Where would you stick a HDD in that case? If you read the Iwata Asks on the console, they actually started with the case design and then designed the hardware around what would fit inside it - quite...curious a decision.

Curious is putting it nicely. That would lend credence to what I mentioned in the other post.

Hmm, I remember talking to you during those times, and I think we generally agreed that whatever Nintendo had in those earlier underclocked dev kits had either similar or weaker performance than the ones in the final dev kits. This also matches Matt and lherre's info in that the dev kits did get stronger.

Gearbox did say that there were some things they could do on the earlier dev kits but not on the later ones, though. Maybe it was a change in architecture, but the statement was vague.

Forgot about the second part. I know I assumed at the time it was an architectural change, but that other info could corroborate that now that I think about it. And yeah it could be viewed that the improvements were to the "downgraded" dev kits if that is true.
 
Nintendo seemed to like even multipliers in the past, and the Wii U architecture isn't entirely removed from the Wii and GCN designs. We were all surprised when the clock speeds were revealed. The current, seemingly arbitrary, clocks seem to be the result of Nintendo pushing everything higher than planned, especially the CPU, after devs got the dev kits and complained. This is speculation, yes, but it makes sense of the whispers we've heard.


Yepp, it's speculation. But nope, it doesn't make all that much sense ;) It's perfectly possible that Nintendo "liked" even multipliers in the past and it's perfectly possible that they don't like them anymore for whatever reason.




Why would they intentionally gimp the GPU? It's not like 550 MHz is some magic number that makes sense of anything else in the hardware. If they could have had it clocked higher, while maintaining good yields at the target TDP, they would have. Or at least I would hope they would have. To deny this is to claim that Nintendo weren't even trying.


Uh, I didn't say this...

You said that the current clock speed is the highest they could go while maintaining acceptable yields. Now that's a problematic claim for numerous reasons: one being that we don't know anything at this point about the Wii U's GPU yields, and another being that I simply doubt that pushing that GPU to, say, 600 MHz would have changed all that much in terms of yield (it's a pretty old, mature process, it's a reasonably small GPU, and Nintendo, at current GPU clocks, seemed to be able to produce a pretty good number of Wii Us from launch on, etc.). Now, in terms of target TDP, that's something completely different (that wasn't in the post I originally responded to), and of course it's true by definition. As power consumption rises with clock speed (and voltage etc.), and Nintendo probably has set a certain TDP limit, they obviously chose the GPU/CPU clock speeds to get as close to that TDP limit as possible.
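As a rough illustration of that last point (dynamic CMOS power scaling roughly with C·V²·f - the voltages and the capacitance constant below are invented purely for illustration, not actual Latte figures):

```python
# Minimal sketch of the "power rises with clock (and voltage)" point above.
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f; the constants here
# are made up purely for illustration, not actual Latte numbers.
def dynamic_power(freq_mhz, voltage, c_eff=1.0):
    """Relative dynamic power, arbitrary units."""
    return c_eff * voltage**2 * freq_mhz

base = dynamic_power(550, 1.00)

# Same voltage, higher clock: power grows roughly linearly with frequency.
print(dynamic_power(600, 1.00) / base)   # ~1.09 (about +9%)

# If the higher clock also needs a small voltage bump, it grows much faster.
print(dynamic_power(600, 1.05) / base)   # ~1.20 (about +20%)
```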
 
I definitely understand what you're saying and can agree with it. The pessimistic side in me feels the controller ate more into costs than Nintendo intended.

I still don't think it makes sense to assume the lowest, crappiest efficiency for a company known for putting things out that last. Like I said, 70% efficiency is pretty much only seen in the shittiest of cheap PC power supplies. Personally I wouldn't say 90 for sure, but I'd go at least 80.
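For what it's worth, the reason the efficiency guess matters is that it sets how much DC power you infer from a wall reading. A quick sketch, assuming a ballpark ~33 W wall draw under load purely for illustration:

```python
# Quick sketch of why the efficiency guess matters when working backwards
# from wall measurements. The ~33 W wall draw is just an assumed ballpark
# figure for a Wii U under load, not a measured spec.
wall_watts = 33.0

for efficiency in (0.70, 0.80, 0.90):
    dc_watts = wall_watts * efficiency
    print(f"{efficiency:.0%} efficient brick -> ~{dc_watts:.1f} W actually delivered to the console")
```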
 

Argyle

Member
I still don't think it makes sense to assume the lowest, crappiest efficiency for a company known for putting things out that last. Like I said, 70% efficiency is pretty much only seen in the shittiest of cheap PC power supplies. Personally I wouldn't say 90 for sure, but I'd go at least 80.

I have no idea what the actual efficiency is but Nintendo has been known to cut corners on things like that. See the lack of a multi voltage power supply on their portable systems as an example.
 
Yepp, it's speculation. But nope, it doesn't make all that much sense ;) It's perfectly possible that Nintendo "liked" even multipliers in the past and it's perfectly possible that they don't like them anymore for whatever reason.

It's not about "liking" or "disliking." It's about the benefits to latency and such. Benefits which Nintendo formerly touted and which Sony currently also seems to be exploiting w/ the PS4 design.

Uh, I didn't say this...

You said that the current clock speed is the highest they could go while maintaining acceptable yields. Now that's a problematic claim for numerous reasons: one being that we don't know anything at this point about the Wii U's GPU yields, and another being that I simply doubt that pushing that GPU to, say, 600 MHz would have changed all that much in terms of yield (it's a pretty old, mature process, it's a reasonably small GPU, and Nintendo, at current GPU clocks, seemed to be able to produce a pretty good number of Wii Us from launch on, etc.). Now, in terms of target TDP, that's something completely different (that wasn't in the post I originally responded to), and of course it's true by definition. As power consumption rises with clock speed (and voltage etc.), and Nintendo probably has set a certain TDP limit, they obviously chose the GPU/CPU clock speeds to get as close to that TDP limit as possible.

No, it's not. Their yields would be measured by how many chips hit the clocks they needed at the target TDP. If it hits 550 MHz, but at 25 watts, then it's no good.
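A toy illustration of that binning argument (all numbers made up; the point is only that yield is the fraction of chips that meet both the clock target and the power budget):

```python
# Toy illustration of the binning argument above: yield is the fraction of
# chips that hit BOTH the target clock and the power budget. Numbers are
# completely made up; this only shows why "hits 550 MHz" alone isn't enough.
import random

random.seed(0)

def simulate_chip():
    # Hypothetical per-chip variation in max stable clock and power draw.
    max_clock = random.gauss(600, 40)        # MHz
    watts_at_550 = random.gauss(18, 3)       # W for the GPU portion
    return max_clock, watts_at_550

chips = [simulate_chip() for _ in range(10000)]

def yield_at(clock_mhz, power_budget_w):
    good = sum(1 for mc, w in chips if mc >= clock_mhz and w <= power_budget_w)
    return good / len(chips)

print(yield_at(550, 25))  # plenty of headroom -> high yield
print(yield_at(600, 20))  # tighter clock AND tighter budget -> yield drops
```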

Anyway, I'm out of this thread unless some new info comes along. Arguing every single point is exhausting.
 
I have no idea what the actual efficiency is but Nintendo has been known to cut corners on things like that. See the lack of a multi voltage power supply on their portable systems as an example.

Multi-voltage power supplies are for traveling, no? I don't think that's a reason to think they'd cheap out on the efficiency of their power brick. That's not really something they've cut corners on, in my experience.

Or do you mean the lack of a packaged adapter in Europe with the 3DS? On one hand, that was "cheap" in that they didn't put it in the package; on the other hand, it was done because many people already owned a DS brick. That doesn't speak negatively to the quality of the brick itself.
 
I have no idea what the actual efficiency is but Nintendo has been known to cut corners on things like that. See the lack of a multi voltage power supply on their portable systems as an example.

That's different than going with a low efficiency power supply, which runs a way higher risk of blowing/dying than a higher efficiency one. Cheap, low efficiency power supplies kill hardware and themselves all the time. Not including a multi-voltage power supply doesn't affect the effective life of the power supply; going with a cheap, low efficiency one does.
 

Argyle

Member
Multi-voltage power supplies are for traveling, no? I don't think that's a reason to think they'd cheap out on the efficiency of their power brick. That's not really something they've cut corners on, in my experience.

Or do you mean the lack of a packaged adapter in Europe with the 3DS? On one hand, that was "cheap" in that they didn't put it in the package; on the other hand, it was done because many people already owned a DS brick. That doesn't speak negatively to the quality of the brick itself.

Yes, traveling is what people tend to do with their portable systems. I thought it a curious choice to save a few pennies, it was quite annoying when I traveled to Europe.

That's different than going with a low efficiency power supply, which runs a way higher risk of blowing/dying than a higher efficiency one. Cheap, low efficiency power supplies kill hardware and themselves all the time. Not including a multi-voltage power supply doesn't affect the effective life of the power supply; going with a cheap, low efficiency one does.

Sure - but they've shown they are willing to pinch pennies wherever they can, so I guess I'm not ready to give them the benefit of the doubt.

For the record the last two Nintendo systems I have bought (3DS, Wii) were both dead within a week of ownership - Nintendo did take care of me for the Wii (and Amazon for the 3DS) but it doesn't exactly inspire confidence in their build quality. As for that replacement Wii (replaced at launch), it is pretty flaky at this point, I suspect the flash memory is going bad. (Also, didn't Nintendo put the cheapest and most awful DVD drive they could find in the Wii? There's a reason why you haven't really seen any dual layer games on the Wii since Brawl...)

Edit: Looks like they did release a few DL games since - the story I had heard was that it wasn't possible after the fiasco they had with Brawl. Probably explains why they were reluctant to release Xenoblade at first!
 

AzaK

Member
I think it's more due to form factor that they stuck with flash. Where would you stick a HDD in that case? If you read the Iwata Asks on the console, they actually started with the case design and then designed the hardware around what would fit inside it - quite...curious a decision.

WTF! I missed that. I don't know what to say.
 

z0m3le

Banned
No, it's not. Their yields would be measured by how many chips hit the clocks they needed at the target TDP. If it hits 550 MHz, but at 25 watts, then it's no good.

Anyway, I'm out of this thread unless some new info comes along. Arguing every single point is exhausting.

The clocks could also just be about efficiency; certain clocks yield better performance per watt than others.

Maybe you should try not to argue every point. You put things like blu's whole embedded experience into question, yet so far he is the only person to speak up with any real experience with those chips, and they are not APUs, which have an entire product line (the e-series based on R700 only has one SKU, with a unique configuration, for instance). If you really have to make a firm stand against every statement that goes against your conclusion, it just becomes a debate and not an open discussion where others feel they can contribute unless it fits the conclusion you've brought along with you. I really am not trying to attack you; I'm just pointing out that you've been debating this entire time, and while that is part of this thread and very helpful at times, it can be draining.
 

krizzx

Junior Member
Yes, traveling is what people tend to do with their portable systems. I thought it a curious choice to save a few pennies, it was quite annoying when I traveled to Europe.



Sure - but they've shown they are willing to pinch pennies wherever they can, so I guess I'm not ready to give them the benefit of the doubt.

For the record the last two Nintendo systems I have bought (3DS, Wii) were both dead within a week of ownership - Nintendo did take care of me for the Wii (and Amazon for the 3DS) but it doesn't exactly inspire confidence in their build quality. As for that replacement Wii (replaced at launch), it is pretty flaky at this point, I suspect the flash memory is going bad. (Also, didn't Nintendo put the cheapest and most awful DVD drive they could find in the Wii? There's a reason why you haven't really seen any dual layer games on the Wii since Brawl...)

Edit: Looks like they did release a few DL games since - the story I had heard was that it wasn't possible after the fiasco they had with Brawl. Probably explains why they were reluctant to release Xenoblade at first!

Both dead within a week? The only way I can see that happening is if you used them as a frisbee or a football in their downtime.

Nintendo hardware has the lowest failure rate.
http://www.gamespot.com/news/xbox-360-failure-rate-237-ps3-10-wii-27-study-6216691
http://www.nofussreviews.com/survey-results-2012

They also boast the most endurance.
http://www.youtube.com/watch?v=sVRJtqPRZhQ

Nintendo does not use "cheap" parts. People confuse lower power with cheap. The parts they use are high quality. Making things low cost and "pinching pennies" are two different things. They do not cut corners on product quality (NOA excluded in a few circumstances). That is more than likely where most of their costs come from. Like the ridiculously smooth/round discs of the Wii U.

If it wasn't user error, then you must have had the worst luck on the planet Earth to get a new Wii and 3DS that failed within a week of purchase.
 
WTF! I missed that. I don't know what to say.
Yes. They were so committed to keeping it a certain size that even the earliest dev kit was roughly the same size as the final casing. Interestingly, it looks as if they also wanted the system to reach a certain performance target, and that gave them trouble. Nintendo and AMD probably gave themselves a big challenge to reach whatever performance they wanted within that casing. That's likely why the GPU is so customized.
 
Yes, traveling is what people tend to do with their portable systems. I thought it a curious choice to save a few pennies, it was quite annoying when I traveled to Europe.

The usage you describe is not common for the majority of their handheld customers. Most people don't really need an adapter like that. And if there's one thing I'll agree with you (and others) on, it's that when it comes to Nintendo being frugal/cheap... it's almost always in situations where the majority of its customers wouldn't make use of a feature even though that feature, in itself, is useful (or Nintendo thinking that such a feature would only be applicable to a small group of the market).

That being said, it's really unlike Nintendo to actually make and provide a cheap power brick. Like their philosophy or not, they do put their money where their mouth is and I've never questioned them having the courage in their convictions. They do spend money, even though many people question their priorities. Their history as a hardware manufacturer/build quality + customer service is pretty solid.
 
WTF! I missed that. I don't know what to say.
http://iwataasks.nintendo.com/interviews/#/wiiu/console/0/1 said:
Iwata: Was making the casing smaller a clear target from the start? Kitano-san?
Kitano: Yes. At the start of development, Takeda-san gave us the task of making the console a "stagehand", a kind of unobtrusive role behind the scenes.
So maybe they didn't start off with the exact case design we see, but they had a certain size in mind.

The clocks could also just be about efficiency; certain clocks yield better performance per watt than others.

Maybe you should try not to argue every point. You put things like blu's whole embedded experience into question, yet so far he is the only person to speak up with any real experience with those chips, and they are not APUs, which have an entire product line (the e-series based on R700 only has one SKU, with a unique configuration, for instance). If you really have to make a firm stand against every statement that goes against your conclusion, it just becomes a debate and not an open discussion where others feel they can contribute unless it fits the conclusion you've brought along with you. I really am not trying to attack you; I'm just pointing out that you've been debating this entire time, and while that is part of this thread and very helpful at times, it can be draining.

Blu is the only one who presented any clear cut reason to call into question known practices in binning and trends in power consumption. Posters like blu and bg are not the ones who wear on me. It's the constant posts that look for any opening whatsoever to argue for some type of "unlockable" power in Wii U. And for instance, I already pointed out that the e4xxx is RV730 architecture, which is not just used in embedded parts, but also laptops and desktops, so I don't see your point in bringing that up again. Blu's explanations from experience are educational. Your posts have not been. That's not an attack. I just haven't learned anything from your posts.

Your claim that I take a firm stance to defend my predrawn conclusions is utterly false. If I were as hard-headed as you make me out to be, I would not have accepted Schnoz's power measurements or blu's explanation of how embedded GPUs are selected. Just because you couldn't convince me, while presenting no evidence whatsoever - not a single link - does not make me stubborn. It makes me smart. It is because we failed to question what we were told about Wii U from unreliable sources that many of us ended up disappointed when actual details of the hardware emerged.

Don't think that I'm unaware of where your grudge stems from, z0m3le. I fully remember calling into question your level of expertise. Ironically enough, it was for the very thing you accuse me of doing - having foregone conclusions and debating rather than keeping an open mind. When I presented the possibility of Latte being 160 shaders, you said it was "impossible," and were utterly hostile to the idea, as if you knew exactly how these game engines were coded! That is why I took exception to your remarks. I only wonder why you have let it lie dormant, even engaging in friendly conversation when I signed into IRC, rather than hashing things out via PM. In fact, that is probably where your last comment should have gone, as it is utterly irrelevant to the hardware discussion. This is the last time I will publicly address the matter, but the people who rely on your analysis should be aware that it is fueled by vengeance!
 
For the record the last two Nintendo systems I have bought (3DS, Wii) were both dead within a week of ownership
That's unusual; Nintendo hardware can bite the dust just like anything else, and it does, but it's unlikely to be plagued like that. Bad luck on your part.
As for that replacement Wii (replaced at launch), it is pretty flaky at this point, I suspect the flash memory is going bad. (Also, didn't Nintendo put the cheapest and most awful DVD drive they could find in the Wii? There's a reason why you haven't really seen any dual layer games on the Wii since Brawl...)
Actually... No.

Wii never had any real problem with DL DVDs on a hardware level; the problem was that the lenses were coming out of the factory dirty. That didn't matter much for single layer DVDs, which were the main diet, but it proved to be a problem with the release of Super Smash Bros.

The solution? The official cleaning kit.


Wii's DVD drive was actually custom built for it, seeing as no standard slot-in drive supports GameCube mini-DVDs, so Nintendo went to some lengths, and the lens is not bad at all. Failure rates are nowhere near those of some PSone and PS2 models; Sega consoles could also get a mention, or early PS3s.
Edit: Looks like they did release a few DL games since - the story I had heard was that it wasn't possible after the fiasco they had with Brawl. Probably explains why they were reluctant to release Xenoblade at first!
Not really, no.

The JP version was single layer; they made it dual layer just so they could have the English dub alongside the Japanese one.

If NOA wanted it on a single layer DVD they could have; they just had to axe the JP build.


They were being wussies because they're wussies.
 

ozfunghi

Member
I don't buy his reasoning on Bayonetta. Again, Bayonetta 2 looks better, but it looks better in the way we generally expect a sequel to look; there are plenty of action games that have come out since 2010 that look much better than the original. Shit, even Metal Gear Rising as an example.

You can't just dismiss those other games, consumers won't make the distinction you are and they aren't going to see the leap.

And you can't dismiss the fact that developers in 2010 had way more experience with the 360/PS3 hardware than they have now with the WiiU. WiiU launch ports were in the same ballpark as late PS360 multiplat games. Why would anyone assume the console can't see improvements similar to what the other consoles saw going from 2006 to 2013? I'm not going to debate that it's a "generational leap" (never have), but if a small Criterion team can pull off a better looking version than the lead platform they've had tons of experience on, and when Darksiders II is ported by a single digit team (initially it was a team of 3, I seem to recall)... there is obviously plenty of room for improvement.
 
Sure - but they've shown they are willing to pinch pennies wherever they can, so I guess I'm not ready to give them the benefit of the doubt.

For the record the last two Nintendo systems I have bought (3DS, Wii) were both dead within a week of ownership - Nintendo did take care of me for the Wii (and Amazon for the 3DS) but it doesn't exactly inspire confidence in their build quality. As for that replacement Wii (replaced at launch), it is pretty flaky at this point, I suspect the flash memory is going bad. (Also, didn't Nintendo put the cheapest and most awful DVD drive they could find in the Wii? There's a reason why you haven't really seen any dual layer games on the Wii since Brawl...)

Edit: Looks like they did release a few DL games since - the story I had heard was that it wasn't possible after the fiasco they had with Brawl. Probably explains why they were reluctant to release Xenoblade at first!

Your bad experience is counter to the average, though. Most people experience little to no problems with their systems, and there is definitely an acknowledgement that their systems are built to last. Pinching pennies on a multi-voltage PSU is completely different from pinching pennies where it affects how long something will last. They pinch pennies on not having an ethernet port, or not supporting DD or DTS, not on the quality of the product. They're completely separate issues, and pinching pennies on one thing is not a guarantee or a logical basis to assume such actions on the other.

The DVD drive in the Wii is actually a really expensive one, as slot loading drives that support both discs the size of a normal DVD and ones the size of a GC disc are more expensive than regular slot loaders. The issue with DL discs wasn't an across the board problem, and no, they didn't stop releasing DL games. Plus, considering the only market where Xenoblade was delayed, or that saw them reluctant to release it, was NA, it had nothing to do with DL discs and everything to do with sales potential.
 
I see. It seems the size of the console and the included (and expensive) tablet controller had just as much (if not more) to do with Nintendo's WiiU specs as their wanting to avoid a specs war with Sony or MS.



It's one of the few WiiU exclusives I don't own as I'm not a fan of the Lego games, but a friend has it and says the loading times are really, really bad but the game is very good.

W-101's loading times aren't awful but it's about a 30 second wait in between the main chapters.

What components of a console affect loading times? I was under the impression it's the speed of the media and RAM bandwidth if the game is coming from a DVD/Blu-ray.

The 3DS version of Lego City, which I have, has very bad loading times too.
Maybe it's an issue with the developer/software rather than the hardware.
 
The 3DS version of Lego City, which I have, has very bad loading times too.
Maybe it's an issue with the developer/software rather than the hardware.
Loading always depends on the way the software is written; those games are clearly written to take too much data off the medium at once instead of doing it in a disguised fashion (loading whole areas instead of streaming them); on the 3DS it could mean they're using too much compression. On the Wii U it could be because they didn't repeat the data enough times as a means to reduce seek times; but the huge ass loadings keep happening even when installed on something like an SSD, so it really seems like lousy software structure of sorts.
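A toy model of the seek-time point above, with invented numbers, just to show why duplicating data near each level can matter more than raw read speed:

```python
# Toy model of the "duplicate data to cut seek times" idea above.
# All numbers are invented for illustration; real disc layouts are far messier.
def load_time_s(bytes_needed, seeks, read_mb_s=10.0, seek_ms=150.0):
    return seeks * (seek_ms / 1000.0) + bytes_needed / (read_mb_s * 1e6)

# One big level load, assets scattered across the disc: seeks dominate.
print(load_time_s(bytes_needed=200e6, seeks=120))   # ~38 s

# Same data, but duplicated near each level so it reads mostly contiguously.
print(load_time_s(bytes_needed=200e6, seeks=10))    # ~21.5 s
```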
 

z0m3le

Banned
So maybe they didn't start off with the exact case design we see, but they had a certain size in mind.



Blu is the only one who presented any clear cut reason to call into question known practices in binning and trends in power consumption. Posters like blu and bg are not the ones who wear on me. It's the constant posts that look for any opening whatsoever to argue for some type of "unlockable" power in Wii U. And for instance, I already pointed out that the e4xxx is RV730 architecture, which is not just used in embedded parts, but also laptops and desktops, so I don't see your point in bringing that up again. Blu's explanations from experience are educational. Your posts have not been. That's not an attack. I just haven't learned anything from your posts.

Your claim that I take a firm stance to defend my predrawn conclusions is utterly false. If I were as hard-headed as you make me out to be, I would not have accepted Schnoz's power measurements or blu's explanation of how embedded GPUs are selected. Just because you couldn't convince me, while presenting no evidence whatsoever - not a single link - does not make me stubborn. It makes me smart. It is because we failed to question what we were told about Wii U from unreliable sources that many of us ended up disappointed when actual details of the hardware emerged.

Don't think that I'm unaware of where your grudge stems from, z0m3le. I fully remember calling into question your level of expertise. Ironically enough, it was for the very thing you accuse me of doing - having foregone conclusions and debating rather than keeping an open mind. When I presented the possibility of Latte being 160 shaders, you said it was "impossible," and were utterly hostile to the idea, as if you knew exactly how these game engines were coded! That is why I took exception to your remarks. I only wonder why you have let it lie dormant, even engaging in friendly conversation when I signed into IRC, rather than hashing things out via PM. In fact, that is probably where your last comment should have gone, as it is utterly irrelevant to the hardware discussion. This is the last time I will publicly address the matter, but the people who rely on your analysis should be aware that it is fueled by vengeance!

http://www.neogaf.com/forum/showpost.php?p=47300841&postcount=3

I claimed it was 160 shaders in post 3 and the entire thread ran with it for pages. I still don't care what it ends up being, and I only said that I doubted 160 shaders could perform better than Xenos later on. After some research in the area I found how bad Xenos' efficiency was, and the fact that Xenos is 216 shaders, not 240; that cleared those doubts.

You always bring up this childish war between us? What war? You've attacked me from time to time and have apologized for doing so, and I've done the same; the post you quoted wasn't me attacking you, though this post is filled with attacks. Maybe my posts aren't here to educate you but to discuss a topic to the best of my knowledge; I'm here to be constructive, not to teach. I hope you don't bring up these topics again, especially in public. You say that my post should have been a PM; clearly this line of reasoning escaped you when posting this reply. If you find me on that IRC channel I would still be friendly to you, because I don't take these posts of discussion personally. Not everyone can be as educated in the area as blu or as researched in the area as BG, so expecting me to hold to a standard (beyond yours, considering I'm supposed to teach you) is about the lowest I've seen you post regarding me.

I'm trying to discuss the topic when I come in here, but you've run me out of the thread multiple times even though I feel my posts are constructive. (Many posts have been clearly targeted at attacking me, such as this one and the one you bring up in the post I'm replying to, which you've apologized for at a later time.)

I still believe that Wii U is probably 160 ALUs. I've only been exploring the GPU and confirming or challenging what we know about Wii U; if you can't do that, you have come to a conclusion about Wii U and it does in fact control your bias. My questioning of whether AMD's embedded parts are all binned, on the grounds of quantity, does have merit, yet you shrugged it off, and that is OK, but the idea came from solid ground (the embedded parts far outsell the desktop and mobile parts you point out, so binning makes little sense).

Almost all of my posts come from questioning, and I've tried to have an open mind about everything I've come across. This is the last post I am going to direct to you; if I have to refer to you in the future, I guess I'll use "an unnamed poster", since I can't mention your name without incurring your wrath.
 
Has anyone posted the Lens of Truth Splinter Cell Blacklist comparison?

http://www.lensoftruth.com/head2hea...ison-and-analysis-ps3-vs-xbox-360-vs-wii-u/2/

In the end we feel the overall better experience belongs to the Wii U. Although, the load times were much longer on the Wii U and the PlayStation 3 textures were slightly more detailed throughout, we still feel that due to the superior performance of the Wii U version not having any screen tearing and texture resolution almost on par with the PlayStation 3, the overall better experience was on the Wii U. Below are the videos for the Wii U, Xbox 360 and PlayStation 3 analysis.

Frame rate is better, no tearing.


Seems like the lack of caching/installing ability for Wii U severely hinders load times.
Some of those load times are brutal. Sometimes double the load times of the other consoles.


1 set of comparisons.
Wii U image is more crisp, but as you see on the ground, that rug has a lower resolution.


PS3 and Wii U versions seem to be most similar, but as you can see, the image is pretty blurry in comparison.


Not sure why the resolution of the 360 looks so odd.


Check the website for better comparisons.
 

QaaQer

Member
Both dead within a week? The only way I can see that happening is if you used them as a frisbee or a football in their downtime.

Nintendo hardware has the lowest failure rate.
http://www.gamespot.com/news/xbox-360-failure-rate-237-ps3-10-wii-27-study-6216691
http://www.nofussreviews.com/survey-results-2012

They also boast the most endurance.
http://www.youtube.com/watch?v=sVRJtqPRZhQ

Nintendo does not use "cheap" parts. People confuse lower power with cheap. The parts they use are high quality. Making things low cost and "pinching pennies" are two different things. They do not cut corners on product quality (NOA excluded in a few circumstances). That is more than likely where most of their costs come from. Like the ridiculously smooth/round discs of the Wii U.

If it wasn't user error, then you must have had the worst luck on the planet Earth to get a new Wii and 3DS that failed within a week of purchase.

Just a bit more info:

SquareTrade pointed out an April survey by media-research firm Nielsen that concluded the Wii is the least played of the three major consoles, being used for only 516 minutes per month. By contrast, the Xbox 360 is played over twice as much (1,191 minutes per month), with the PS3 lagging slightly behind it (1,053 minutes per month).

The 360's failure rate was, nonetheless, catastrophic in the early years.
 
You can mention me, z0m. Just don't misrepresent my views. That's actually why I was pulled back to posting in here to begin with. You mentioned that my whole theory was centered around a rumor I heard that Latte was 45nm, so I stepped in to clarify. I have no problem debating if it's kept civil. As you said, sometimes good info comes out of it.

You linked to that post early in the thread, and that really highlights the problems I've had with some of your analysis and why we had that initial problem when I began suggesting 160 ALUs. You seem to be obsessed with GFLOPs, as if everything boils down to that one simple measure. You were frantically going back and forth with your calculations, making edit upon edit, as the rest of us were taking our time to examine the chip as a whole and carefully weigh the options.

I do not expect education from every poster - only when they act like they are an authority on a subject, such as you did when you made this pair of posts.

http://www.neogaf.com/forum/showpost.php?p=57721440&postcount=4780
http://www.neogaf.com/forum/showpost.php?p=57745332&postcount=4824

Your focus on flops led you to hastily dismiss the theory which you now admit to being likely. On the contrary, I never claimed anything "impossible" when it came to TDP - only that I thought that what we were seeing made a strong case for 160 ALUs. I will admit that it's not as strong a case as I initially thought, but I still think that comparisons to AMD's embedded line aren't the best, because of the reasons I gave to bgassassin a few posts back:
We can't assume that Latte's manufacturing is all that easy, however. And certainly not on Day 1. Remember, we are talking Renesas, firstly - not TSMC, who have been doing this constantly for years now. Also, even though it may be based on R700, it is not the same chip. The way the blocks are configured is entirely different, which means the interchip communication has likely been reworked. There is probably extra silicon throughout for Wii BC. And the inclusion of the eDRAM in addition to putting the GPU on an MCM with an IBM CPU is also likely to be a somewhat tricky process. The Iwata asks seems to indicate it was. The last thing they would need throughout all this is a shader core blowing out.
 

ozfunghi

Member
Has anyone posted the Lens of Truth Splinter Cell Blacklist comparison?

It's been posted. What amazes me is the fact that, with the extra WiiU RAM, they ended up with worse textures. If there's one thing where NFS looked miles better on WiiU, it was textures, due to the extra RAM.

As for framerate... this has been the case since launch. Framerate in the same ballpark as the other versions, but with the benefit of no torn frames. Which is a feature that people overlook or minimize too often or too easily. To me, no torn frames is maybe as important as a 30 fps to 60 fps increase. Who knows what FPS these games would run at if V-sync were disabled on WiiU.
 

krizzx

Junior Member
Has anyone posted the Lens of Truth Splinter Cell Blacklist comparison?

http://www.lensoftruth.com/head2hea...ison-and-analysis-ps3-vs-xbox-360-vs-wii-u/2/



Frame rate is better, no tearing.


Seems like the lack of caching/installing ability for Wii U severely hinders load times.
Some of those load times are brutal. Sometimes double the load times of the other consoles.


1 set of comparisons.
Wii U image is more crisp, but as you see on the ground, that rug has a lower resolution.


PS3 and Wii U versions seem to be most similar, but as you can see, the image is pretty blurry in comparison.


Not sure why the resolution of the 360 looks so odd.


Check the website for better comparisons.

I posted that and the overall analysis earlier today. The Wii U version clearly didn't get half as much effort put into it, judging by the day 1 patch that was released alone. Though that patch supposedly fixed most of the frame rate drops. The Lens of Truth review was probably using the unpatched Wii U version. Even then, the Wii U still encountered frame rate drops in gameplay less often than the other two versions.

The final verdict was that the Wii U version was the overall superior version.

Just a bit more info:



The 360's failure rate was, nonetheless, catastrophic in the early years.

What does that second quote you added have to do with the hardware failure rate? Was there any purpose to it?
 
It's been posted. What amazes me is the fact that, with the extra WiiU RAM, they ended up with worse textures. If there's one thing where NFS looked miles better on WiiU, it was textures, due to the extra RAM.

As for framerate... this has been the case since launch. Framerate in the same ballpark as the other versions, but with the benefit of no torn frames. Which is a feature that people overlook or minimize too often or too easily. To me, no torn frames is maybe as important as a 30 fps to 60 fps increase. Who knows what FPS these games would run at if V-sync were disabled on WiiU.

Yeah, it seems like a low effort port to me. I can't comment on the load times, but I would have expected better textures at the least.
 

z0m3le

Banned
You can mention me, z0m. Just don't misrepresent my views. That's actually why I was pulled back to posting in here to begin with. You mentioned that my whole theory was centered around a rumor I heard that Latte was 45nm, so I stepped in to clarify. I have no problem debating if it's kept civil. As you said, sometimes good info comes out of it.

I didn't misrepresent your views. I didn't say it was the only reason; if you go back to that post, I said "a reason", or one reason, not the main reason, not the only reason. Again I'm defending my posts; this is why mentioning you might not be worth it for me in the future.

You linked to that post early in the thread, and that really highlights the problems I've had with some of your analysis and why we had that initial problem when I began suggesting 160 ALUs. You seem to be obsessed with GFLOPs, as if everything boils down to that one simple measure. You were frantically going back and forth with your calculations, making edit upon edit, as the rest of us were taking our time to examine the chip as a whole and carefully weigh the options.

The post was edited because I learned a lot about VLIW5 and didn't want to misrepresent what was known about the topic. Had it been page 10 or page 50 I would have just used a new post, but that is the 3rd post in the thread, and many people just coming into the thread for quick info would see it and assume it was fact. Me not just deleting my post is proof that I accepted the 160 ALU possibility, until I compared it to Xenos; I've explained my change on that stance as well... It was about challenging the notion that something is fact when we simply don't know.

I do not expect education from every poster - only when they act like they are an authority on a subject, such as you did when you made this pair of posts.

http://www.neogaf.com/forum/showpost.php?p=57721440&postcount=4780
http://www.neogaf.com/forum/showpost.php?p=57745332&postcount=4824

Your focus on flops led you to hastily dismiss the theory which you now admit to being likely. On the contrary, I never claimed anything "impossible" when it came to TDP - only that I thought that what we were seeing made a strong case for 160 ALUs. I will admit that it's not as strong a case as I initially thought, but I still think that comparisons to AMD's embedded line aren't the best, because of the reasons I gave to bgassassin a few posts back:

At the time, Xenos was assumed to have much higher efficiency than 60%. Had Xenos been as efficient as first thought (80%+), Wii U being 160 ALUs would have been impossible, thanks to evidence from launch ports such as CoD BO2, which ran at the same resolution and effects with better shadows; that isn't possible if Xenos is already being used at 80%+ efficiency. However, Xenos' average efficiency according to Microsoft is around 60%, putting Xenos at ~130 ALUs effectively being used, rather than the false 240 ALU figure that was floating around at the time (Xenos has 216). I've explained this even in the post you are replying to. My posts were dismissing the idea that 160 Wii U ALUs could outperform 240 Xenos ALUs with only a 10% clock increase, on ports that were done on unfinished hardware without the 8+ years of experience they had with Xenos. It was pretty logical.

I've questioned a lot of things and used comparisons to more accurately measure performance; that is all anyone in here is doing, so I don't understand your point with those posts.
 

QaaQer

Member
What does that second quote you added have to do with the hardware failure rate? Was there any purpose to it?

If product A is used twice as much as product B, everything else being equal, product A will have a higher failure rate. Just filling in the picture a bit, that is all.
 
Z0m, I understand that views change as new info is presented. All I am saying is that you have not been as open-minded as you claim and have presented your views as absolutes in the past without having enough info to do so. You went so far as to call the 160 estimate "irresponsible"! lol. I still don't know where you got the idea of how "shader efficient" Xenos was before finding that 60% quote. Those were just vague averages anyway - not good for comparing specific games.

Also, Xenos is 240 shaders, but 216 GFLOPS. I thought you made a typo at first, but you've said it a few times since.
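For reference, the naive peak-FLOPS arithmetic behind the 160-vs-320 numbers that get thrown around (assuming the usual 2 FLOPs per ALU per cycle and the commonly cited clocks; this says nothing about real-world efficiency, and the 216 figure above comes from a different accounting):

```python
# The arithmetic everyone in this thread keeps doing implicitly:
# theoretical peak GFLOPS for a VLIW-style part is usually taken as
# ALUs * clock * 2 (one multiply-add per ALU per cycle). Clocks below are
# the commonly cited ones; real-world efficiency is a separate question.
def peak_gflops(alus, clock_ghz, flops_per_alu_per_cycle=2):
    return alus * clock_ghz * flops_per_alu_per_cycle

print(peak_gflops(160, 0.550))  # 176.0 - Latte if it is 160 ALUs
print(peak_gflops(320, 0.550))  # 352.0 - Latte if it is 320 ALUs
print(peak_gflops(240, 0.500))  # 240.0 - Xenos by the same naive formula
```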
 

Log4Girlz

Member
Has anyone posted the Lens of Truth Splinter Cell Blacklist comparison?

http://www.lensoftruth.com/head2hea...ison-and-analysis-ps3-vs-xbox-360-vs-wii-u/2/



Frame rate is better, no tearing.


Seems like the lack of caching/installing ability for Wii U severely hinders load times.
Some of those load times are brutal. Sometimes double the load times of the other consoles.


1 set of comparisons.
Wii U image is more crisp, but as you see on the ground, that rug has a lower resolution.


PS3 and Wii U versions seem to be most similar, but as you can see, the image is pretty blurry in comparison.


Not sure why the resolution of the 360 looks so odd.


Check the website for better comparisons.

My that's unimpressive.
 

69wpm

Member
My that's unimpressive.

I hope nobody expected anything great from Ubisoft. The only good multi-plat coming out soon is Rayman Legends, because it was originally developed for Wii U. I hope they don't screw us over with Watch Dogs. I learnt my lesson so I will not pre-order any more games from them unless they show us real footage.
 

MDX

Member
I really feel the debate is pretty moot at this point, if the only option on the table is 160 vs 320. There is just no way there are 320 ALUs on that die. We have data on what size the blocks have to be at 40/45nm.

Every bit of data points to 160.


If it's 160, then what can't the WiiU do going forward?
 
And you can't dismiss the fact that developers in 2010 had way more experience with the 360/PS3 hardware than they have now with the WiiU. WiiU launch ports were in the same ballpark as late PS360 multiplat games. Why would anyone assume the console can't see improvements similar to what the other consoles saw going from 2006 to 2013? I'm not going to debate that it's a "generational leap" (never have), but if a small Criterion team can pull off a better looking version than the lead platform they've had tons of experience on, and when Darksiders II is ported by a single digit team (initially it was a team of 3, I seem to recall)... there is obviously plenty of room for improvement.
Bayonetta 2 isn't a launch/extremely early Wii U title? It is a game being released a year and a half to 2 years into the Wii U's life cycle. It doesn't even have a release date yet other than "2014".
 

Log4Girlz

Member
I hope nobody expected anything great from Ubisoft. The only good multi-plat coming out soon is Rayman Legends, because it was originally developed for Wii U. I hope they don't screw us over with Watch Dogs. I learnt my lesson so I will not pre-order any more games from them unless they show us real footage.

I can't wait for a Watchdogs comparison.
 