
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I think ditching PPC for something more potent yet efficient would be a better idea, out-of-the-box BC be damned. The benefit of more likely 3rd party support would be worth it. IBM unfortunately isn't interested enough in the low-power consumer area.

Also, going for a compact, low(er) wattage system ain't cutting it, especially when the key controller is hardly any smaller to begin with.
 
I think ditching PPC for something more potent yet efficient would be a better idea, out-of-the-box BC be damned. The benefit of more likely 3rd party support would be worth it. IBM unfortunately isn't interested enough in the low-power consumer area.

Also, going for a compact, low(er) wattage system ain't cutting it, especially when the key controller is hardly any smaller to begin with.

I was going to make a joke about Nintendo dropping PPC and going back to MIPS, but I don't know if the MIPS architecture went anywhere.
 
Which is what in relation to Wii U power (from what we know)?
That's pretty thin information to judge from; first off, it's Nvidia, and they inflate their numbers like it's April 1st, every single time.

I remember how Tegra 3 promised the world and delivered only 12 GFLOPS. Every Tegra generation is supposed to be the second coming of Jesus, except they never actually deliver on their promises. Some day they might be somewhat right, like the boy who cried wolf, but we're past the point of believing it blindly based on the shit they say.


The 8800 GTX is quite old now, but it still pulls 518 GFLOPS. Tegra 4 is supposed to be around the ~84 GFLOPS figure, no more, so here's to that. Quite far away still; far away from the X360/PS3, let alone the Wii U. As for Tegra 5, they'll probably match the PS3/X360 or fall somewhere in between on paper. Bear in mind their mobile multi-core architecture seems to be quite different from the desktop ones, so pushing it to the extent the X360/PS3 chips were pushed is also in question; this was seemingly a problem on Tegra 3, and it's probably why they're pulling stuff like the Nvidia Shield (as they did with TegraZone) as a means to create incentives just so developers actually optimize their software for the architecture.

As for how the Wii U performs next to an 8800 GTX: possibly on par, possibly surpassing it in certain things (it's a more modern chip), and possibly impaired in fillrate-heavy applications. We don't know.

Tegra 5 won't be even close to an 8800 GTX.
 

antonz

Member
Can't really see Nintendo ever going with Tegra after they ditched it for the 3DS.

Yep, Nvidia failing to deliver for Nintendo with the 3DS means Nvidia is going to have to work really hard if they ever want a chance again. As far along as Nintendo was while relying on Nvidia, they're lucky they found a suitable replacement so late in development.
 

wsippel

Banned
AAAAAND Nvidia just announced the Kepler-based Tegra 5 which, supposedly, outperforms the GeForce 8800 GTX.
Never believe anything Nvidia says, especially when it comes to Tegra.

Also, as pointed out by AnandTech, this is just GFLOPS, which isn't as meaningful as it seems, because you don't know which operations the system can actually perform in a single cycle. The Tegra 5 CUDA cores are most likely stripped down quite a bit, which means the chip probably needs several cycles to perform ops an 8800 could do in one.

This is obviously also important when looking at Latte. If we assume the chip really only has 160 shader units, but take into account that the individual shader units seem to be unusually large, it's certainly possible that the chip has some "shortcuts" that allow it to do more with fewer ops.
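For a rough sense of why raw GFLOPS mislead: the headline number is just cores times FLOPs per cycle times clock, and the FLOPs-per-cycle multiplier is exactly the part that differs between architectures. A minimal sketch (the multipliers and the Latte clock below are illustrative assumptions, not confirmed specs):

```python
# Back-of-the-envelope peak GFLOPS: shader cores x FLOPs/cycle x clock (GHz).
# The FLOPs-per-cycle multipliers are assumptions for illustration.
def peak_gflops(cores, flops_per_cycle, clock_ghz):
    return cores * flops_per_cycle * clock_ghz

# 8800 GTX: 128 SPs at 1.35 GHz, counting MAD+MUL as 3 FLOPs per cycle
print(peak_gflops(128, 3, 1.35))  # ~518 GFLOPS, the figure quoted earlier

# Latte, IF it really has 160 shader units: MADD as 2 FLOPs, assumed 550 MHz
print(peak_gflops(160, 2, 0.55))  # ~176 GFLOPS on paper
```

Identical arithmetic, but what each "core" can actually issue per cycle varies wildly between designs, which is the whole point about stripped-down CUDA cores.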
 
Yep, Nvidia failing to deliver for Nintendo with the 3DS means Nvidia is going to have to work really hard if they ever want a chance again. As far along as Nintendo was while relying on Nvidia, they're lucky they found a suitable replacement so late in development.



Nvidia: We didn't want to do it anyway ™
 
Yep, Nvidia failing to deliver for Nintendo with the 3DS means Nvidia is going to have to work really hard if they ever want a chance again. As far along as Nintendo was while relying on Nvidia, they're lucky they found a suitable replacement so late in development.

Never heard that story. I thought all the Tegra talk was just bogus rumors.
 

antonz

Member
Never heard that story. I thought all the Tegra talk was just bogus rumors.

There were actual boards with Tegra markings delivered at one point.

It's never been said what actually caused the falling out, but considering how Nvidia promised the stars and delivered a turd, it's been assumed they failed to deliver the promised performance in the promised power envelope. Nvidia around that time was also hyping a huge new contract that never surfaced and that they quietly never brought up again.
 

strata8

Member
There were actual boards with Tegra markings.

It's never been said what actually caused the falling out, but considering how Nvidia promised the stars and delivered a turd, it's been assumed they failed to deliver the promised performance in the promised power envelope.

Same thing with Tegra 4. Remember how they said it would destroy every other tablet GPU out there or something like that? Turns out it's beaten by Qualcomm's Snapdragon 600 (which is in the Galaxy S4) and absolutely destroyed by the Snapdragon 800.

edit: This is a good article about NVIDIA's Tegra overpromising:
http://semiaccurate.com/2013/02/18/nvidias-telegraphs-tegras-woes-at-ces/
 
There were actual boards with Tegra markings delivered at one point.

It's never been said what actually caused the falling out, but considering how Nvidia promised the stars and delivered a turd, it's been assumed they failed to deliver the promised performance in the promised power envelope. Nvidia around that time was also hyping a huge new contract that never surfaced and that they quietly never brought up again.

Ah, interesting. Thanks for the info. I never knew about the Tegra 3DS boards.
 

LiamA

Member
Anyone have an idea on how Tegra compares to the 3DS's current CPU and GPU? Not to get too off topic, of course.
 
Nintendo pushes for 60fps more than any other developer. I think it would be in their best interest to make sure that bandwidth is available. Iwata has already mentioned that the console is memory focused. From what I have read, memory bandwidth is one of the most important aspects of maintaining a solid framerate. Also, those framerate issues in COD have been ironed out to nonexistent.
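For a rough sense of the numbers involved, here's a back-of-the-envelope sketch; the bytes-per-pixel, depth-traffic, and overdraw figures are illustrative assumptions, not Wii U measurements:

```python
# Rough framebuffer traffic required for a 60fps target.
# Assumptions (illustrative only): 4-byte color write plus 4-byte depth
# read and write per covered pixel, and an overdraw factor of 2.5.
def raster_bandwidth_gbs(width, height, fps, overdraw=2.5):
    bytes_per_covered_pixel = 4 + 4 + 4  # color write + depth read + depth write
    per_frame = width * height * overdraw * bytes_per_covered_pixel
    return per_frame * fps / 1e9

print(raster_bandwidth_gbs(1280, 720, 60))  # ~1.7 GB/s for raster traffic alone
```

Texture reads come on top of that, which is why keeping the framebuffer in fast on-die memory helps hold a steady framerate.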

I remember the days when Nintendo settled for 20fps with some N64 games to make sure they had the most stunning graphics for the time. I'm looking at you, Ocarina of Time.
 
Nintendo is very coy; we know they considered Tegra until very late in the hardware development, but chances are they were also talking to DMP about the Pica200 all along.

Of course, the Pica200 most likely won the bid on five premises: better power efficiency (not having programmable shader units and going the hardwired fixed-function route), Japanese design, readiness for market (it was a 2006 implementation; Tegra could have had problems getting stock out of the door), licensability (so Nintendo could put it into production wherever they wanted to), and eagerness.

Eagerness is possibly the most important thing here. Nintendo is very custom-oriented and builds platforms with their software and implementations in mind, so it's only normal that they'll ask for custom tuning and hardwired things to be added, if not for performance considerations then just so the development environment is familiar enough to them. Nvidia appears to be a bitch to work with in that sense; they prefer to force-feed ready-made off-the-shelf solutions, and that's what the Tegra 1 was supposed to be.

But of course, this is no more than conjecture. Fact is, the prototypes with Tegra were very real.
 
I remember the days when Nintendo settled for 20fps with some N64 games to make sure they had the most stunning graphics for the time. I'm looking at you, Ocarina of Time.
They were one of the first to go for 60fps in a 3D racer with F-Zero X, though.

The N64 was a pretty difficult platform to work with; 30 frames were difficult, let alone 60. Also, back then a few frames were the difference between pulling something off or not; the same can't be said today.

And a lot of things have changed since then in Nintendo's approach to platform building and the like. In a lot of senses, Nintendo first found itself, with regard to hardware design, with the GameCube.
 

TheD

The Detective
Same thing with Tegra 4. Remember how they said it would destroy every other tablet GPU out there or something like that? Turns out it's beaten by Qualcomm's Snapdragon 600 (which is in the Galaxy S4) and absolutely destroyed by the Snapdragon 800.

edit: This is a good article about NVIDIA's Tegra overpromising:
http://semiaccurate.com/2013/02/18/nvidias-telegraphs-tegras-woes-at-ces/


Do you have a link to that?
Because the benchmarks I have found with the 600 and 800 have the T4 beating the 600 and losing to the 800 (GPU-wise).
 
So I take it that the E3 2010 models had the Pica200 GPU in them?
They did.
edit: This is a good article about NVIDIA's Tegra overpromising:
http://semiaccurate.com/2013/02/18/nvidias-telegraphs-tegras-woes-at-ces/
Great article.
Do you have a link to that?
Because the benchmarks I have found with the 600 and 800 have the T4 beating the 600 and losing to the 800 (GPU-wise).
It's probably down to real-world scenarios: like Tegra 3, it looks better on paper, or when stressed by abstract benchmarks, than when it comes to giving software the power it expects in the way it expects it. The multi-core architecture doesn't seem to help:



GFXBench 2.7 T-Rex HD C24Z16 - Offscreen (1080p):
Nvidia Tegra 4 (Nvidia Shield) - 1,027 frames (18.3 fps)
Snapdragon 600 (Samsung Galaxy S4) - 957 frames (17.1 fps)
Snapdragon 800 (Qualcomm MDP) - 1,456 frames (26.0 fps)

Quadrant 2.0:
Nvidia Tegra 4 (Nvidia Shield) - 16,436
Snapdragon 600 (Samsung Galaxy S4) - 12,684
Snapdragon 800 (Qualcomm MDP) - 22,022
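As a quick sanity check on those T-Rex HD numbers: frames divided by fps should give the same run length for every device if the figures are internally consistent (the ~56 s run length is inferred from the ratio, not a published spec):

```python
# frames / fps should give the same benchmark run length for every device.
results = {
    "Tegra 4 (Nvidia Shield)":       (1027, 18.3),
    "Snapdragon 600 (Galaxy S4)":    (957, 17.1),
    "Snapdragon 800 (Qualcomm MDP)": (1456, 26.0),
}
for name, (frames, fps) in results.items():
    print(f"{name}: {frames / fps:.1f} s")  # all come out to ~56 s
```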
 
Well, to be fair, Tegra 4 seems to beat the Snapdragon 600 slightly, but they're both clocked (if not underclocked) at 1.9 GHz; the Snapdragon 800 wipes the floor with Tegra 4 at 2.3 GHz. The question is how evenly matched they'll be at 2.3 GHz, with Nvidia releasing the Tegra 4i at that same clock speed.

This Tegra 4i, though, seems to be scaled down to reach higher clocks, and it would obviously perform worse than a regular Tegra 4 at 1.9 GHz (Tegra 4 having 72 graphics cores and the Cortex-A15 architecture, Tegra 4i having 60 graphics cores and the Cortex-A9 architecture).

It has the potential to be one of those ridiculous Nvidia "clock speed/number of units" emergency countermeasures. Had it been the very same design at 2.3 GHz, this could have been interesting; as it is, though, I'm anticipating defeat.
 
I'm pretty confident Treyarch will be able to squeeze some decent stuff out of the Wii U, provided they apply themselves. Their Wii CoD games were at times surprisingly comparable to the HD twin versions despite the constraints of the little white box. This is gonna be their second go-around on the hardware, and it very well might be telling of just how comparable Xbox One/PS4 games can be to Wii U in the early running. Granted, the gap will widen, devs will prioritize the Xbox One and PS4, and we'll ultimately see the best of what the Wii U can do from Nintendo, but it'll be interesting to see how big the gulf is at this point in time.
 

krizzx

Junior Member
I think ditching PPC for something more potent yet efficient would be a better idea, out-of-the-box BC be damned. The benefit of more likely 3rd party support would be worth it. IBM unfortunately isn't interested enough in the low-power consumer area.

Also, going for a compact, low(er) wattage system ain't cutting it, especially when the key controller is hardly any smaller to begin with.

You lost me at "more efficient". The entire reason for going PPC is that it is one of the most efficient processor designs on Earth. Watt for watt, there are few CPUs that do more, if any.

Honestly, I'd say that they should stick with the PPC and expand it with more features. Though I must ask again, why are we discussing the CPU in the GPU thread instead of the CPU thread? Espresso has nothing to do with Latte's performance or functionality as far as I know.

I'm pretty confident Treyarch will be able to squeeze some decent stuff out of the Wii U, provided they apply themselves. Their Wii CoD games were at times surprisingly comparable to the HD twin versions despite the constraints of the little white box. This is gonna be their second go-around on the hardware, and it very well might be telling of just how comparable Xbox One/PS4 games can be to Wii U in the early running. Granted, the gap will widen, devs will prioritize the Xbox One and PS4, and we'll ultimately see the best of what the Wii U can do from Nintendo, but it'll be interesting to see how big the gulf is at this point in time.

I wouldn't give them too much credit for the Wii development. Honestly, the simple design of COD is not that hard to reproduce to begin with. High Voltage did more in every aspect with The Conduit, and I doubt that was the Wii's limit.

Even using the next-gen engine, when COD was shown off for the Xbox One it looked really, REALLY unimpressive to me compared to other things I've seen from the next gen.

I'm looking at this the same way I'm looking at Project C.A.R.S., Watch Dogs, and the next Batman, and I'm looking at those the way I looked at the comments about Black Ops on the Wii a long time ago when it was announced. If they were doing anything special or had made any great effort, they would be bragging about it and trying to promote it more.
 

Lonely1

Unconfirmed Member
Remember this?



There were actual boards with Tegra markings delivered at one point.

It's never been said what actually caused the falling out, but considering how Nvidia promised the stars and delivered a turd, it's been assumed they failed to deliver the promised performance in the promised power envelope. Nvidia around that time was also hyping a huge new contract that never surfaced and that they quietly never brought up again.

Also, Tegra III was delayed for several months. There wasn't a consumer Tegra III version by the time of the projected 3DS release (holidays 2010).
 
Even using the next-gen engine, when COD was shown off for the Xbox One it looked really, REALLY unimpressive to me compared to other things I've seen from the next gen.

I'm looking at this the same way I'm looking at Project C.A.R.S., Watch Dogs, and the next Batman, and I'm looking at those the way I looked at the comments about Black Ops on the Wii a long time ago when it was announced. If they were doing anything special or had made any great effort, they would be bragging about it and trying to promote it more.
I don't think they'll be doing anything particularly brag-worthy, either. Nor do I think Ghosts is all that impressive-looking for a next-gen title. It gets by as far as what we've seen, aside from some games that will obviously be out a bit later and built for next-gen platforms, such as FFXV and the like. But it will be interesting if Treyarch can pull off performance comparable to a next-gen game that is just getting by that standard. I'm sure the Wii U is capable of more than that, but I don't think we'll be seeing it too often outside of first-party Nintendo games.
 
I'm sure a company like Activision would rather spend money figuring out how easily they can leverage some 'next gen' features onto their 'next gen' games, while spending the same amount they've been spending on CoD games, than go all out and spend what it takes for their graphics engine to remain visually relevant for the next half decade or so. How CoD games stacked up graphically this past generation compared to the best-looking games will probably hold during this new generation too. And why shouldn't it? The games will look good enough, and people will still buy them by the millions.
 

wilsoe2

Neo Member
I'm looking at this the same way I'm looking at Project C.A.R.S., Watch Dogs, and the next Batman, and I'm looking at those the way I looked at the comments about Black Ops on the Wii a long time ago when it was announced. If they were doing anything special or had made any great effort, they would be bragging about it and trying to promote it more.

I agree with you, but I think there is a small chance that Ghosts could be a different scenario. Perhaps the rivalry between Infinity Ward and Treyarch is motivating? If they could get similar results out of the Wii U as the PS4/XBone, I'm sure it would be worth the bragging rights... not sure if they have the resources or the desire to devote to it, though.
 
You lost me at "more efficient". The entire reason for going PPC is that it is one of the most efficient processor designs on Earth. Watt for watt, there are few CPUs that do more, if any.

Honestly, I'd say that they should stick with the PPC and expand it with more features. Though I must ask again, why are we discussing the CPU in the GPU thread instead of the CPU thread? Espresso has nothing to do with Latte's performance or functionality as far as I know.

It doesn't seem to be the case any longer, with IBM not bothering to advance the platform, and with all the comparative strides being made by Intel and AMD. Especially since the two actually have SoCs going for them, which has bearing on the GPU discussion.

Not to mention that continuing to pursue efficiency to the degree Nintendo has been is a sucker's bet.
 

strata8

Member
You lost me at "more efficient". The entire reason for going PPC is that it is one of the most efficient processor designs on Earth. Watt for watt, there are few CPUs that do more, if any.

Honestly, I'd say that they should stick with the PPC and expand it with more features. Though I must ask again, why are we discussing the CPU in the GPU thread instead of the CPU thread? Espresso has nothing to do with Latte's performance or functionality as far as I know.

It's definitely not more efficient when you're fabbing the processor at 45nm/40nm while everyone else is down to 28nm. In May, AMD released an SoC that has similar GPU performance to the Wii U, 4x the CPU performance, lower power consumption, half the size, and costs like 60 bucks.
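As a rough cross-check on the node gap (idealized scaling only; real designs don't shrink perfectly, so treat this as an upper bound):

```python
# Idealized area scaling between process nodes: area ~ (feature size)^2.
# Real-world shrinks are worse than this, so treat it as an upper bound.
for old, new in [(45, 28), (40, 28)]:
    print(f"{old}nm -> {new}nm: ~{(new / old) ** 2:.0%} of the area")
# 45nm -> 28nm: ~39% of the area
# 40nm -> 28nm: ~49% of the area
```

So "half the size" at the same transistor count is roughly what the node jump alone buys.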
 
I have a question: do you know why the Nintendo games have no AA? Pikmin 3 and Mario Kart 8 lack AA, and especially The Wonderful 101, where the jaggies are everywhere in the trailers and screens; it's disturbing.

Is there a bottleneck in the hardware such that they can't do it, or maybe they don't want to take the time to use it?

If we remember VGLeaks' information, it was:
720p with 4x MSAA
or
1080p with no AA

Just the next Smash Bros. and Wind Waker HD seem to use the second option.
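For what it's worth, those two options line up with the commonly reported 32 MB of eDRAM. A quick sketch, assuming 4-byte color plus 4-byte depth per sample (an assumption; actual formats may differ):

```python
# Framebuffer size for the two reported options, assuming 4-byte color
# + 4-byte depth per sample (assumption; actual formats may differ).
def fb_mb(width, height, msaa_samples):
    return width * height * msaa_samples * (4 + 4) / 2**20

print(fb_mb(1280, 720, 4))   # ~28.1 MB -> fits in 32 MB of eDRAM
print(fb_mb(1920, 1080, 1))  # ~15.8 MB -> fits easily
print(fb_mb(1920, 1080, 4))  # ~63.3 MB -> doesn't fit
```

Which would explain why 1080p and 4x MSAA show up in the leak as either/or rather than together.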
 
I have a question: do you know why the Nintendo games have no AA? Pikmin 3 and Mario Kart 8 lack AA, and especially The Wonderful 101, where the jaggies are everywhere in the trailers and screens; it's disturbing.

Is there a bottleneck in the hardware such that they can't do it, or maybe they don't want to take the time to use it?

If we remember VGLeaks' information, it was:
720p with 4x MSAA
or
1080p with no AA

Just the next Smash Bros. and Wind Waker HD seem to use the second option.
That's what I was wondering. Possibly not enough ROPs in the GPU? Or maybe they just haven't tried to utilize them properly yet, sort of like the Vita: it pretty much has "free" MSAA, but it really hadn't been used until now, with Killzone. Back to the Wii U: looking at W101, that game has some really harsh aliasing. Seems kind of odd, but it could be a similar situation.
 
That's what I was wondering. Possibly not enough ROPs in the GPU?

That's a possibility, I think.
It's certainly not a hardware limitation in principle, but I don't think it would come for free either.

And of course you could always use a post-processing solution. If they didn't implement it in NSMB U, you could say maybe they don't like it because it always applies a slight blur to the whole image. But the way it is, I think they just prefer additional eye candy over anti-aliasing.
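To illustrate where that blur comes from, here's a toy post-process AA pass in the FXAA spirit (a sketch, not the actual algorithm any of these games use): it detects high luma contrast and blends toward the neighborhood average, which inevitably softens legitimate detail too.

```python
import numpy as np

# Toy post-process AA: blend pixels whose 3x3 luma contrast exceeds a
# threshold toward their neighborhood average. rgb is an (H, W, 3) float
# array in [0, 1]. Real FXAA is smarter, but the softening is inherent.
def toy_post_aa(rgb, threshold=0.1):
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    out = rgb.copy()
    for y in range(1, rgb.shape[0] - 1):
        for x in range(1, rgb.shape[1] - 1):
            window = luma[y-1:y+2, x-1:x+2]
            if window.max() - window.min() > threshold:  # likely an edge
                out[y, x] = rgb[y-1:y+2, x-1:x+2].mean(axis=(0, 1))
    return out
```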
 
Did other developers have problems with a lack of AA at the start of PS3/360 development?

I'm absolutely loving Pikmin 3, and after 10 minutes you literally don't notice it, but since this is a tech thread... the jaggies are pretty bad; it's like they have no idea what AA is, tbh.

I suppose we have to remember this is the first time Nintendo is making HD games; they are not used to people seeing their games in such extreme detail.

I think honestly it's just something that they will get better at as the generation progresses.

Was playing ZombiU again this weekend after a long time and noticed the frame rate is worse than at launch, despite the patch notes saying they improved it. I do wonder if it's got something to do with Nintendo stopping the console from constantly spinning the disc; when the frame rate dropped, I heard the disc spin up again, like it was struggling to load data at the rate I was playing.

They really did drop the ball with regard to not including a large enough hard drive so developers could have the option of installing games...
 