
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


The_Lump

Banned
Why not take the measurement with nothing hooked up? Wow, the logic in this thread sometimes.

Calm down dude. You seem to think anyone not throwing their hands up and saying "it's shit" has some kind of agenda. I'm just trying to be logical here.

Even without the external HDD, it was more than the 33W figure you keep turning to, so I don't see how "best case scenario" is 18W to the GPU. My point still stands.
 

SmokyDave

Member
This ^^^

Not to mention FIFA 13 being released with half the game (Ultimate Team) missing.

And then you've got to take into account various interviews where they've slagged the console off. EA certainly wildly changed their stance on the platform after the E3 2011 console reveal.

People are saying it's the Origin business that's done this but I'd say that Nintendo patenting the use of the GamePad as the view of the ground in a golf game has something to do with it too.

Perhaps 'sabotage' is too strong a word, but they haven't made a great deal of effort to help the Wii U be a success either.
EA always half-arse ports at launch. Read up on the cuts made to sports games for gen 7 consoles that weren't made to the gen 6 titles from the same year. It is in no way 'unprecedented'.

Also, people really shouldn't talk about the 'Origin deal' as if it actually happened.

This isn't actually relevant to a tech thread, but it's painful to watch these myths in action.
 

The_Lump

Banned
EA always half-arse ports at launch. Read up on the cuts made to sports games for gen 7 consoles that weren't made to the gen 6 titles from the same year. It is in no way 'unprecedented'.

Also, people really shouldn't talk about the 'Origin deal' as if it actually happened.

This isn't actually relevant to a tech thread, but it's painful to watch these myths in action.


It's equally painful to watch some of the drive-bys that go on in this thread (not you). It's as if not holding a stout "wiiu is bollocks" attitude means you'll be labelled an apologist and cast asunder! :) The thread needs to get back to tech analysis, not the fanboy-war frontline.
 

tesla246

Member
Yes. Xenos is a weird beast in the sense that it's also essentially a VLIW design, but the ALU instruction word is very narrow - 2 ops, where one of the ops is actually a 4-way SIMD. Or you can think of it as a 2-way co-issue instruction word, comprising a vec4+scalar pair. The register file, though, is again of float4. The problem with the popular flops mis-estimation of Xenos comes from the fact that Xenos cannot co-issue a 4-arg vec4 op (i.e. a MAD dst, src0, src1, src2) and a scalar op. So it cannot do a MAD + scalar in a clock.
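For what it's worth, here's a rough back-of-the-envelope sketch of that mis-estimation (my own illustration, assuming the commonly cited 48 ALUs at 500 MHz; these are not blu's numbers):

```python
# Back-of-the-envelope Xenos flops estimate (assumed figures: 48 ALUs @ 500 MHz).
ALUS = 48
CLOCK_HZ = 500e6

# Popular figure assumes a vec4 MAD plus a scalar MAD co-issued every clock:
# (4 + 1) lanes * 2 flops each (multiply-add) = 10 flops per ALU per clock.
popular_gflops = ALUS * (4 + 1) * 2 * CLOCK_HZ / 1e9

# If a vec4 MAD cannot be co-issued with a scalar op (blu's point), a MAD-only
# clock peaks at 4 lanes * 2 flops = 8 flops per ALU per clock.
mad_only_gflops = ALUS * 4 * 2 * CLOCK_HZ / 1e9

print(popular_gflops, mad_only_gflops)  # 240.0 vs 192.0
```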

Blu, your knowledge of architectures in general is quite substantial, and you also seem to know an awful lot about specific console hardware designs such as the GameCube and Xbox 360. I am an extreme tech noob, so don't expect any contribution from me in these matters. However, if you don't mind me asking: what is your take on the ALU and SP count? Do you agree with it being 160, and what is your stance on Fourth Storm's thought-through analysis thus far?
 
I suppose. But then why aren't we seeing installs on Wii U instead of 85-second loading times?
85-second loads are outrageous; even if you were to stream 1024 MB (the available RAM) at 22 MB/s CAV you'd end up with a ~46-second load time, so even accounting for seek times and no repeated data on disc... it's truly horrible. It can only be explained by no care whatsoever going into easing the loads a little, or by much of the data being compressed, which doesn't really make sense with the current software and 25 GB of storage space being available on each disc. Suffice it to say, that's a huge screw-up.
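For reference, the arithmetic behind that figure (the 22 MB/s read rate and 1024 MB payload are the post's assumptions, not measured values):

```python
# Rough best case for streaming a full RAM's worth of data from disc.
available_ram_mb = 1024   # MB available to games, per the post
disc_read_mb_s = 22       # assumed sustained CAV read rate

best_case_s = available_ram_mb / disc_read_mb_s
print(f"best case: {best_case_s:.1f} s")  # ~46 s, before seeks or repeated data
```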

As for installs, I don't know how they are structured, but Microsoft allowed them; Nintendo might discourage them or not, I dunno. Fact is, if we consider 8GB standard (and of that only 3GB is available), that might be acting as a dissuasive factor for most. As for full-disc installation, my X360 sure supports it even with launch games that weren't designed for it; I don't know how it is for Wii U because I don't have one yet, but I believe it's not there yet (is it?)... And if that is the case, then it really should be.
 

AzaK

Member
People on the other thread are saying that both the loading time and the performance don't change significantly with the DD version.
That doesn't surprise me, because I would think that Nintendo would just want to match the speed of the disc drive anyway so they can get the cheapest flash.
 

Earendil

Member
Ah, you're right. Rather than pointing out that some of the blame lies with the choices Nintendo have made for the hardware, I should've been on track like this guy:

They deserve plenty of the blame; most of us in this thread have long since accepted that. However, your comment did nothing but escalate the already prevalent console-wars BS that doesn't belong here. The attitude of your posts does far more harm than the actual content.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Blu, your knowledge of architectures in general is quite substantial, and you also seem to know an awful lot about specific console hardware designs such as the GameCube and Xbox 360. I am an extreme tech noob, so don't expect any contribution from me in these matters. However, if you don't mind me asking: what is your take on the ALU and SP count? Do you agree with it being 160, and what is your stance on Fourth Storm's thought-through analysis thus far?
My take is that this discussion is a prime example of the easy way vs the right way dilemma. I think we should stick to what Fourth Storm & co have been trying to do since the beginning and try to decipher the blocks on the die - that is the right way. All TDP talk should be supplementary and not the primary line of argumentation, simply because we didn't even have proper power readings of the bulk system until very recently, when we got some. In the absence of such, people tend to pull numbers out of their behinds and wave them as banners.
 
Calm down dude. You seem to think anyone not throwing their hands up and saying "it's shit" has some kind of agenda. I'm just trying to be logical here.

Even without the external HDD, it was more than the 33W figure you keep turning to, so I don't see how "best case scenario" is 18W to the GPU. My point still stands.

Actually, it seems about right to me. 36W without anything attached.

@ 90% PSU efficiency: ~32W
-6W for CPU
-1.5W for fan
-1.5W for drive
-3W for GamePad/WiFi radio
-1W for DDR3 x 4
-1W for everything else

...leaves ~18W, and I've tried to be conservative with the estimates, although I stand to be proven wrong. As has been shown, it's not impossible to fit a 320-shader part into that power envelope. I don't think it's likely though.
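A minimal sketch of that subtraction, using the same assumed figures (all of them rough estimates, not measurements):

```python
# Rough Wii U power budget; every number here is an estimate in watts.
wall_draw_w = 36.0
psu_efficiency = 0.90

board_power_w = wall_draw_w * psu_efficiency  # ~32.4 W actually reaching the board

other_loads_w = {
    "cpu": 6.0,
    "fan": 1.5,
    "optical drive": 1.5,
    "gamepad/wifi radio": 3.0,
    "ddr3 x4": 1.0,
    "everything else": 1.0,
}

gpu_budget_w = board_power_w - sum(other_loads_w.values())
print(f"left for the GPU (incl. eDRAM): ~{gpu_budget_w:.1f} W")  # ~18.4 W
```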
 

krizzx

Junior Member
Actually, it seems about right to me. 36W without anything attached.

@ 90% PSU efficiency: ~32W
-6W for CPU
-1.5W for fan
-1.5W for drive
-3W for GamePad/WiFi radio
-1W for DDR3 x 4
-1W for everything else

...leaves ~18W, and I've tried to be conservative with the estimates, although I stand to be proven wrong. As has been shown, it's not impossible to fit a 320-shader part into that power envelope. I don't think it's likely though.

What about 224, 256, or 288 [8x28, 8x32, or 8x36 ALU counts]?

Last I checked, Xenos had 5x48 ALUs according to Wikipedia.

Why are only the numbers 20 and 40 in play? I would think the fact that the components are slightly smaller than known 40-ALU components and much bigger than 20-ALU ones would mean that it's in between.
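Just to put those candidate counts side by side, a quick sketch (my own numbers: 550 MHz is the widely reported Latte clock, and I'm assuming 2 flops per ALU per clock from a single multiply-add):

```python
# Theoretical throughput for the candidate shader counts (assumptions noted above).
clock_hz = 550e6
for alus in (160, 224, 256, 288, 320):
    gflops = alus * 2 * clock_hz / 1e9
    print(f"{alus:3d} ALUs -> {gflops:5.1f} GFLOPS")
# 160 -> 176.0, 224 -> 246.4, 256 -> 281.6, 288 -> 316.8, 320 -> 352.0
```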
 

The_Lump

Banned
Actually, it seems about right to me. 36W without anything attached.

@ 90% PSU efficiency: ~32W
-6W for CPU
-1.5W for fan
-1.5W for drive
-3W for GamePad/WiFi radio
-1W for DDR3 x 4
-1W for everything else

...leaves ~18W, and I've tried to be conservative with the estimates, although I stand to be proven wrong. As has been shown, it's not impossible to fit a 320-shader part into that power envelope. I don't think it's likely though.

How is that best case though? I agree it's likely, but it's hardly the highest possible wattage we could see, is it? Or am I misunderstanding something?

Maybe my 25W was a little high then. But 20W is certainly a reasonable possibility - which is plenty for what the Wii U is aiming for, isn't it? I'm not trying to bring in 320sp or anything like that. I was just trying to get an accurate-ish figure for the top end of what the GPU will be using at full swing.
 
What about 224, 256, or 288 [8x28, 8x32, or 8x36 ALU counts]?

Last I checked, Xenos had 5x48 ALUs according to Wikipedia.

Why are only the numbers 20 and 40 in play? I would think the fact that the components are slightly smaller than known 40-ALU components and much bigger than 20-ALU ones would mean that it's in between.

Could be anything depending on how far you believe Nintendo/Renesas/AMD departed from the Radeon line as we know it.

How is that best case though? I agree it's likely, but it's hardly the highest possible wattage we could see, is it? Or am I misunderstanding something?

Maybe my 25W was a little high then. But 20W is certainly a reasonable possibility - which is plenty for what the Wii U is aiming for, isn't it? I'm not trying to bring in 320sp or anything like that. I was just trying to get an accurate-ish figure for the top end of what the GPU will be using at full swing.

No game yet has been shown to spike it any higher. Take that as you will...
 

The_Lump

Banned
Could be anything depending on how far you believe Nintendo/Renesas/AMD departed from the Radeon line as we know it.



No game yet has been shown to spike it any higher. Take that as you will...


Fair point. Are games being tested regularly for this?

Maybe I'm being too lenient in assuming a couple more watts might be used up in the next few years of development. If 18W is our top mark, then where does that sit among the other similar GPUs AMD were making around the time?

My take is that this discussion is a prime example of the easy way vs the right way dilemma. I think we should stick to what Fourth Storm & co have been trying to do since the beginning and try to decipher the blocks on the die - that is the right way. All TDP talk should be supplementary and not the primary line of argumentation, simply because we didn't even have proper power readings of the bulk system until very recently, when we got some. In the absence of such, people tend to pull numbers out of their behinds and wave them as banners.


This is true. Apologies if anyone thought I was doing that, btw; I wasn't. I was just trying to add up the numbers to prevent others from doing exactly that. We've been working on the 33W PSU draw assumption for a while, so I didn't like the look of people ignoring that we now have a higher number and claiming it didn't make a difference. It does, no matter how small.
 
A big problem that I've tried to highlight several times in other threads is that most 3rd-party publishers don't have dedicated Nintendo teams. The few that did earlier this gen transitioned them into mobile development teams. That means that publishers either have to dedicate separate resources to create teams to learn how to develop for Wii U, or let their teams finish current-gen games and then work on Wii U later.

This is something that is often ignored completely in favor of people shouting 'lazy devs' etc.

Nintendo basically sat out the first round of HD consoles, so now a lot of the publishers have one team for each of the HD twins. Now the likes of DICE have to somehow develop another version for PS4 and XBO; they must be pushed to breaking point already without adding in another console, especially one they are not familiar with from a hardware standpoint.

I think that is the main reason EA are not making versions of FIFA, Madden, Tiger Woods and BF for Wii U this Winter; it has nothing to do with them 'holding a grudge' against Nintendo.

They simply see the PS4 and XBO as a much more lucrative market this Winter. My guess is that they will set up a dedicated Wii U development team early next year, and we will see the return of the big-name EA franchises that still release on PS360 in Winter 2014 and beyond, especially if the Wii U has a good Winter and they can push the install base up near 10 million by next April.

I would love to see the main DICE guys get put on a Wii U version of BF4 for release next Winter, if only to see how close they could get it to the XBO version.

It actually says a lot for the Wii U hardware that mobile / low-tier developers could create some of the launch games with unfinished dev kits, unfinished tools and, in some cases, while only using a single core of the CPU.

Splinter Cell Blacklist is no surprise; it's developed by Ubisoft Shanghai and still keeps up with PS360 frame-rate-wise while running at 720p native resolution, with v-sync enabled... pretty impressive, and it shows the difference final dev kits and tools can make.
 
Maybe I'm being too lenient in assuming a couple more watts might be used up in the next few years of development. If 18W is our top mark, then where does that sit among the other similar GPUs AMD were making around the time?

C'mon Lump, you've been around this thread for a while. There have been examples thrown around the last few pages. If we compare to desktop GPUs, we see ALU counts on the lower side. If we compare to embedded GPUs, we see ALU counts on the higher side. Fact is, none of the examples are the Wii U GPU or even manufactured by Renesas. I think blu is right. There are too many variables involved to make any definitive argument from power draw.
 

krizzx

Junior Member
This is something that is often ignored completely in favor of people shouting 'lazy devs' etc.

Nintendo basically sat out the first round of HD consoles, so now a lot of the publishers have one team for each of the HD twins. Now the likes of DICE have to somehow develop another version for PS4 and XBO; they must be pushed to breaking point already without adding in another console, especially one they are not familiar with from a hardware standpoint.

I think that is the main reason EA are not making versions of FIFA, Madden, Tiger Woods and BF for Wii U this Winter; it has nothing to do with them 'holding a grudge' against Nintendo.

They simply see the PS4 and XBO as a much more lucrative market this Winter. My guess is that they will set up a dedicated Wii U development team early next year, and we will see the return of the big-name EA franchises that still release on PS360 in Winter 2014 and beyond, especially if the Wii U has a good Winter and they can push the install base up near 10 million by next April.

I would love to see the main DICE guys get put on a Wii U version of BF4 for release next Winter, if only to see how close they could get it to the XBO version.

It actually says a lot for the Wii U hardware that mobile / low-tier developers could create some of the launch games with unfinished dev kits, unfinished tools and, in some cases, while only using a single core of the CPU.

Splinter Cell Blacklist is no surprise; it's developed by Ubisoft Shanghai and still keeps up with PS360 frame-rate-wise while running at 720p native resolution, with v-sync enabled... pretty impressive, and it shows the difference final dev kits and tools can make.

That's one way to look at it. You could also say that the fact that they don't have dedicated Nintendo dev teams is a sign that they are lazy/uncaring/cheap/etc.

I'd say it's far from final dev kits. Nintendo hasn't even released the second major performance update yet. That will likely add even more functionality to the OS than the first one did, as well as improving overall system performance further.

I just played Lego City again a little while ago and it reminded me of something. At the beginning of the game, every cutscene needed a load, as did returning from cutscenes, and those loads were extremely long. From the middle to the end of the game, cutscenes don't load individually any more, only area transitions, and the game loads before the load bar finishes. I've still yet to encounter any slowdown in my playthrough like people were insistent about when it first came out.

It will be nice to see how Latte performance improves after it.
 
That's one way to look at it. You could also say that the fact that they don't have dedicated Nintendo dev teams is a sign that they are lazy/uncaring/cheap/etc.

I'd say it's far from final dev kits. Nintendo hasn't even released the second major performance update yet. That will likely add even more functionality to the OS than the first one did, as well as improving overall system performance further.

I just played Lego City again a little while ago and it reminded me of something. At the beginning of the game, every cutscene needed a load, as did returning from cutscenes, and those loads were extremely long. From the middle to the end of the game, cutscenes don't load individually any more, only area transitions, and the game loads before the load bar finishes. I've still yet to encounter any slowdown in my playthrough like people were insistent about when it first came out.

It will be nice to see how Latte performance improves after it.

My friend who owns Lego City swears he noticed an fps improvement after the first OS update; it might just be his eyes though :p.

Hopefully the second OS update opens up some more RAM for developers; I have yet to see why the Wii U OS needs 1GB of RAM. Another half a gig for developers would be fantastic and would give the console a 1GB advantage over PS360 for games.
 

krizzx

Junior Member
My friend who owns Lego City swears he noticed an fps improvement after the first OS update; it might just be his eyes though :p.

Hopefully the second OS update opens up some more RAM for developers; I have yet to see why the Wii U OS needs 1GB of RAM. Another half a gig for developers would be fantastic and would give the console a 1GB advantage over PS360 for games.


It doesn't "need" it. That's just the amount Nintendo chose to reserve.

They likely intended to add more features in the future that would fill it up since the Wii U OS wasn't even complete enough to be on the console when it launched.

Many have attested to a performance increase in a lot of launch games after that update, though no official sources or journalists will do a write-up on it. I guess no one really cared.
 

fred

Member
Just wondering how much more processing having v-sync enabled takes..? I'm guessing that this is extra work done by the GPU, right..? So does it tell us anything about the grunt behind the GPU when comparing the performance between the different SKUs..?

And I'm also assuming that we should stop paying any sort of attention to Digital Foundry and rely on the analysis from the Lens Of Truth fellahs given the former's strange agenda with the console..?

Also, shouldn't the differences between Sonic Generations and Sonic Lost World and Bayonetta 1 and Bayonetta 2 tell us something about the differences in power between Xenon/RSX and Latte..?

I really think that concentrating on the wattages is a bit of a waste of time when Iwata has told us that the total average power draw is going to be 40W (or was it 45..?) and the maximum is going to be 75W. We haven't seen anything that comes anywhere near taxing Latte yet and won't do for another couple of years.

Don't you think we'd be better off looking at comparable titles like the two pairs I've mentioned above to get some sort of idea of what Latte is capable of regardless of how much juice it is or isn't using..?
 

NBtoaster

Member
Just wondering how much more processing having v-sync enabled takes..? I'm guessing that this is extra work done by the GPU, right..? So does it tell us anything about the grunt behind the GPU when comparing the performance between the different SKUs..?

Depends on the game. Black Ops 2, for example, had an extremely small amount of tearing on PS3, so being v-synced on Wii U should not account for the performance difference alone. With a game that has a large amount of tearing, like Splinter Cell, v-sync will reduce the framerate more.
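A small sketch of why v-sync costs more in games that tear a lot: with v-sync on, a frame that misses the refresh window waits for the next one, so render times just over budget get rounded up to a whole extra refresh (assuming a 60 Hz display and double buffering; the render times below are made up for illustration):

```python
import math

REFRESH_S = 1 / 60  # ~16.7 ms per refresh at 60 Hz

def vsynced_fps(render_time_s: float) -> float:
    # With v-sync, presentation waits for the next vertical blank instead of tearing.
    refreshes = math.ceil(render_time_s / REFRESH_S)
    return 1 / (refreshes * REFRESH_S)

for ms in (15.0, 17.0, 25.0, 34.0):
    print(f"{ms:.0f} ms render -> {1000 / ms:5.1f} fps torn, "
          f"{vsynced_fps(ms / 1000):.1f} fps v-synced")
# e.g. a 17 ms frame drops from ~59 fps (with tearing) to 30 fps with v-sync.
```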

And I'm also assuming that we should stop paying any sort of attention to Digital Foundry and rely on the analysis from the Lens Of Truth fellahs given the former's strange agenda with the console..?

Lens of Truth offers vastly less comprehensive analysis with worse captures and they have far less technical knowledge.

Also, shouldn't the differences between Sonic Generations and Sonic Lost World and Bayonetta 1 and Bayonetta 2 tell us something about the differences in power between Xenon/RSX and Latte..?

No, because of the difference in scope between Generations and Lost World. Lost World's abstract environments appear inherently less demanding than Generations' realistic ones. Bayonetta 1 was Platinum's first 360 game and was hardly the best-looking title on the platform.
 

OryoN

Member
Not sure if this was posted before but...

Mark Williams, Technology Director @ VooFoo (team behind Pure Chess), had some interesting words for Wii U.

Wii U really seemed to be an ideal platform to bring Pure Chess to - it's quite a beast of a machine so is very capable of throwing around some pretty impressive visuals.
http://www.officialnintendomagazine...-pure-chess/?cid=OTC-RSS&attr=ONM-General-RSS

While the comment could be thought of as dramatic or even cliché, it seems to have stemmed from genuine impressions of the hardware. I've been seeing more positive impressions of the Wii U now that tools are more complete and devs have had more time to familiarize themselves with the hardware. It reminds me of Criterion finding the Wii U's capabilities to be "incredible," especially for having such a low power draw.
 
Not sure if this was posted before but...

Mark Williams, Technology Director @ VooFoo (team behind Pure Chess), had some interesting words for Wii U.


http://www.officialnintendomagazine...-pure-chess/?cid=OTC-RSS&attr=ONM-General-RSS

While the comment could be thought of as dramatic or even cliché, it seems to have stemmed from genuine impressions of the hardware. I've been seeing more positive impressions of the Wii U now that tools are more complete and devs have had more time to familiarize themselves with the hardware. It reminds me of Criterion finding the Wii U's capabilities to be "incredible," especially for having such a low power draw.

Last week or so.
 
Not sure if this was posted before but...

Mark Williams, Technology Director @ VooFoo (team behind Pure Chess), had some interesting words for Wii U.


http://www.officialnintendomagazine...-pure-chess/?cid=OTC-RSS&attr=ONM-General-RSS

While the comment could be thought of as dramatic or even cliché, it seems to have stemmed from genuine impressions of the hardware. I've been seeing more positive impressions of the Wii U now that tools are more complete and devs have had more time to familiarize themselves with the hardware. It reminds me of Criterion finding the Wii U's capabilities to be "incredible," especially for having such a low power draw.

Performance is so bandwidth-bound, I can't help but think Nintendo went crazy increasing memory in every area that used or needed it for performance. I wonder, if you took a 360 or PS3 and increased the bandwidth and memory in the places where it's needed, what kind of games could have been produced.
 

Log4Girlz

Member
Not sure if this was posted before but...

Mark Williams, Technology Director @ VooFoo (team behind Pure Chess), had some interesting words for Wii U.


http://www.officialnintendomagazine...-pure-chess/?cid=OTC-RSS&attr=ONM-General-RSS

While the comment could be thought of as dramatic or even cliché, it seems to have stemmed from genuine impressions of the hardware. I've been seeing more positive impressions of the Wii U now that tools are more complete and devs have had more time to familiarize themselves with the hardware. It reminds me of Criterion finding the Wii U's capabilities to be "incredible," especially for having such a low power draw.

This is a great quote. I've always wondered how a technically demanding game like Chess would do on the Wii U.
 

StevieP

Banned
Not sure if this was posted before but...

Mark Williams, Technology Director @ VooFoo (team behind Pure Chess), had some interesting words for Wii U.


http://www.officialnintendomagazine...-pure-chess/?cid=OTC-RSS&attr=ONM-General-RSS

While the comment could be thought of as dramatic or even cliché, it seems to have stemmed from genuine impressions of the hardware. I've been seeing more positive impressions of the Wii U now that tools are more complete and devs have had more time to familiarize themselves with the hardware. It reminds me of Criterion finding the Wii U's capabilities to be "incredible," especially for having such a low power draw.

Hope you don't mind, I cross-posted your link in the thread for it here, so the discussion can continue about the game itself:

http://www.neogaf.com/forum/showthread.php?p=78671745#post78671745
 

The_Lump

Banned
C'mon Lump, you've been around this thread for a while. There have been examples thrown around the last few pages. If we compare to desktop GPUs, we see ALU counts on the lower side. If we compare to embedded GPUs, we see ALU counts on the higher side. Fact is, none of the examples are the Wii U GPU or even manufactured by Renesas. I think blu is right. There are too many variables involved to make any definitive argument from power draw.

Yeah sorry, it was sorta rhetorical really.

People are jumping to one conclusion or the other when the numbers could mean either at this stage - I shouldn't get drawn in to playing devils advocate.

I guess we wait for some more scraps of info and then go bananas over them again ;)
 

wsippel

Banned
85-second loads are outrageous; even if you were to stream 1024 MB (the available RAM) at 22 MB/s CAV you'd end up with a ~46-second load time, so even accounting for seek times and no repeated data on disc... it's truly horrible. It can only be explained by no care whatsoever going into easing the loads a little, or by much of the data being compressed, which doesn't really make sense with the current software and 25 GB of storage space being available on each disc. Suffice it to say, that's a huge screw-up.
As I wrote a couple of months ago, it seems quite a few developers don't follow Nintendo's recommendation and use the wrong file IO block size, which leads to dramatically increased load times. Considering the load times in Blacklist are still shit even when it's running from flash, it's certainly possible that Ubisoft didn't use the correct block size, either.
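To illustrate why the block size matters (a generic sketch, not the actual Wii U file IO API; the sizes here are hypothetical), reading the same data with small blocks multiplies the number of read requests, and each request carries fixed seek/latency overhead on an optical drive:

```python
# Generic illustration of file IO block size vs. number of read requests.
def reads_needed(file_size_bytes: int, block_size_bytes: int) -> int:
    return -(-file_size_bytes // block_size_bytes)  # ceiling division

level_size = 512 * 1024 * 1024  # hypothetical 512 MB of level data
for block in (16 * 1024, 128 * 1024, 1024 * 1024):
    n = reads_needed(level_size, block)
    print(f"{block // 1024:5d} KiB blocks -> {n:6d} read requests")
# The bytes transferred are identical; only the per-request overhead changes,
# which is why a wrong block size can dwarf the raw transfer time.
```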
 

Argyle

Member
As I wrote a couple of months ago, it seems quite a few developers don't follow Nintendo's recommendation and use the wrong file IO block size, which leads to dramatically increased load times. Considering the load times in Blacklist are still shit even when it's running from flash, it's certainly possible that Ubisoft didn't use the correct block size, either.

What is interesting to me is that apparently Nintendo doesn't seem to have any kind of load time certification requirement. The load times people are reporting on Wii U games simply would not fly on PS3 or 360.
 

Phazon

Member
As I wrote a couple of months ago, it seems quite a few developers don't follow Nintendo's recommendation and use the wrong file IO block size, which leads to dramatically increased load times. Considering the load times in Blacklist are still shit even when it's running from flash, it's certainly possible that Ubisoft didn't use the correct block size, either.

I think LEGO City might be one of those victims too? The load times were pretty long.

When I play Wonderful 101 DD, however, dayum. Load times are very fast and nice. (It's about 5-7 seconds from the menu to the mission menu and then 15 seconds to load the actual mission.)
 

The_Lump

Banned
As I wrote a couple of months ago, it seems quite a few developers don't follow Nintendo's recommendation and use the wrong file IO block size, which leads to dramatically increased load times. Considering the load times in Blacklist are still shit even when it's running from flash, it's certainly possible that Ubisoft didn't use the correct block size, either.

Is that patchable (if it were the case)? Or is it too fundamental to amend later on?
 

MDX

Member
Also, shouldn't the differences between Sonic Generations and Sonic Lost World and Bayonetta 1 and Bayonetta 2 tell us something about the differences in power between Xenon/RSX and Latte..?

The sheer amount and location of the eDRAM in the GPU alone puts Latte ahead of those two in my book, even if the Wii U only had 512MB of RAM. Let's see how Nintendo is going to make the Wii U relevant against the next-gen consoles.
 

krizzx

Junior Member
A couple of entries from the -latest- pCARS changelog.

I say this should pretty much solidify whether or not the Wii U has DX11 capabilities. Nomura's a liar.

Are there any new screenshots from the Wii U build? You don't have to post them but I wanted to know if there has been any substantial improvement from before.

Just wondering how much more processing having v-sync enabled takes..? I'm guessing that this is extra work done by the GPU, right..? So does it tell us anything about the grunt behind the GPU when comparing the performance between the different SKUs..?

And I'm also assuming that we should stop paying any sort of attention to Digital Foundry and rely on the analysis from the Lens Of Truth fellahs given the former's strange agenda with the console..?

Also, shouldn't the differences between Sonic Generations and Sonic Lost World and Bayonetta 1 and Bayonetta 2 tell us something about the differences in power between Xenon/RSX and Latte..?


I really think that concentrating on the wattages is a bit of a waste of time when Iwata has told us that the total average power draw is going to be 40W (or was it 45..?) and the maximum is going to be 75W. We haven't seen anything that comes anywhere near taxing Latte yet and won't do for another couple of years.

Don't you think we'd be better off looking at comparable titles like the two pairs I've mentioned above to get some sort of idea of what Latte is capable of regardless of how much juice it is or isn't using..?

Those are probably the best and most accurate comparisons you can make, especially in Bayonetta's case.

Bayonetta 2 is a whole level up. The character models and everything overall have more polygons, the textures are higher resolution, there are more effects and animation, and the biggest thing of all is the solid 720p 60fps. There are also a few sources that state it might be 1080p. That will be interesting to see if true. GAF will probably have a meltdown that exceeds the one caused by Bayonetta 2's exclusivity.

Then there is the fact that what we've seen of Bayonetta 2 is on a console with known dev kit problems that isn't even a year old, compared to a console with matured dev kits.

Same with Sonic Generations. They are apples-to-apples comparisons; the only major difference between the two is the hardware. Sonic Generations generally had shorter draw distances with less detailed objects/characters, and it ran at 30fps. Sonic Lost World has already shown an increase in polygon detail and texture quality on top of having larger draw distances, more shadows and foliage, and running at 60fps. It is doing over twice as much on screen as Generations did. Just compare the two photos in my old post http://www.neogaf.com/forum/showpost.php?p=77793633&postcount=8390.

Then, of course, there is also the fact that Generations was made on extremely mature dev kits and firmware by teams that had three Sonic games of experience on the PS3/360. Lost World is being made on a less-than-one-year-old Wii U that hasn't even finished having its OS and dev kits optimized.

I don't see how anyone can logically dismiss them. These are the best comparisons available now.

You can also try Mario Kart 8 vs LBP Karting and PlayStation All-Stars vs Super Smash Bros. U, as those would also be mostly apples-to-apples comparisons (fully backed first-party titles of the same type), but that will bring the Sony strike force down on this thread, so I wouldn't recommend it.
 
What is interesting to me is that apparently Nintendo doesn't seem to have any kind of load time certification requirement. The load times people are reporting on Wii U games simply would not fly on PS3 or 360.

I somewhat doubt that Nintendo even has the sort of leverage with Ubisoft it takes to enforce it, even if there are such requirements. Ubisoft is carrying a huge load for Nintendo on the 3rd party front right now. With Splinter Cell, AC4, Rayman Legends, Watchdogs, and CoD: Ghosts all releasing in these last 4 months of this year, the Wii U is quite legit in the 3rd party department for now. Ubisoft is releasing 4/5 of those games.
 

Argyle

Member
I somewhat doubt that Nintendo even has the sort of leverage with Ubisoft it takes to enforce it, even if there are such requirements. Ubisoft is carrying a huge load for Nintendo on the 3rd party front right now. With Splinter Cell, AC4, Rayman Legends, Watchdogs, and CoD: Ghosts all releasing in these last 4 months of this year, the Wii U is quite legit in the 3rd party department for now. Ubisoft is releasing 4/5 of those games.

True, but if it comes down to a requirement being waived...you would think Nintendo would tell them that they are not using an optimal block size and that they should consider remastering their game, which doesn't seem like something that would be too hard to do.

If there are other reasons for the long load times then I could definitely see Nintendo looking the other way because they definitely need games right now.
 

Jrs3000

Member
True, but if it comes down to a requirement being waived...you would think Nintendo would tell them that they are not using an optimal block size and that they should consider remastering their game, which doesn't seem like something that would be too hard to do.

If there are other reasons for the long load times then I could definitely see Nintendo looking the other way because they definitely need games right now.

Hopefully, whatever the reason Splinter Cell was so crappy at loading, they fix it for Watch Dogs and AC4.
 
This was a long time ago at this point, and maybe it was fully validated and explained beyond any possible doubt many times, but are we 100% sure about those reported clock speeds? Didn't Marcan glean those while the system was running in Wii mode (I remember he had to hack around a bit to even unlock the other two CPU cores in that mode, or something)?
Conjecture, track record and...

[image: screenshot of the tweet linked below]


Source: https://twitter.com/marcan42/status/276134321700610048

He clarified further, actually, but I'm too lazy to dig that deep. Anyway, he had full access to the hardware in his tests, and like I said, his track record speaks for itself.
Did anyone ever corroborate this independently, or was Marcan the only one? Again, sorry if this was 100% irrefutably explained before, but I've been in and out of this thread from way back and haven't seen 100% of the content.
He tested it, and that number came out of it; it's not so hard to get numbers when you have access to the hardware. Plus it's in line with what can be expected from a PPC750 (i.e. it's a short-pipeline design, so it's certainly not going past 1.6/1.8 GHz).

Not corroborated by others, just accepted for being in line with the expected values, and his track record is pretty solid, so an undetected misreading of sorts is not really likely; he knew stuff we only discovered via core dissection (i.e. the nature of the memory banks) by running code alone. The dude is pretty far ahead of us, that's all there is to it; you have to assume he knows his shit when he's proved it numerous times.
 
it's not so hard to get numbers when you have access to the hardware. Plus it's in line with what can be expected from a PPC750 (i.e. it's a short-pipeline design, so it's certainly not going past 1.6/1.8 GHz).

People keep saying this, but there is no "expected" clock speed, considering the cores are already clocked higher than the PPC750 was supposed to go (1GHz). It's heavily modified. We don't know what the upper bound is.
 
People keep saying this, but there is no "expected" clock speed, considering the cores are already clocked higher than the PPC750 was supposed to go (1GHz). It's heavily modified. We don't know what the upper bound is.
It's still code compatible, and that says miles and hills regarding the "heavily modified" part. Listen, it really isn't; otherwise I'd be one of the dudes leaning towards it, seeing that I hate being overly negative. Thing is, it really doesn't have any leeway for that.

Hell, had you read the documentation you'd know code compatibility on a low-level basis is not even a given within the PPC750 line; I reckon FX and GX are incompatible with CL (due to paired singles, probably). Mess with too much and it's gone; in fact... sneeze and it's gone. Hell, Sony broke 100% code compatibility numerous times on PS2, be it with PSone games running on the PSone SoC (rendering them incompatible, glitched or suffering major slowdown, just because some DMA stuff changed) or with the further integration of PS2 hardware, where some PS2 games ceased working mid-generation. Keeping code compatibility while messing with stuff is like undergoing open-heart surgery.

The CPU is a die-shrunk Broadway/750CL, rearranged due to the shrink and the fact that the L2 is now eDRAM rather than SRAM; add an external SMP interface and 2 extra cores and that's it. At most they messed with the 60x bus a little and perhaps added some custom SIMD instructions, seeing as the ones from Gekko/2001 are probably outdated now; everything else would have to be accessed via a separate "optional" logic bank, and the CPU diagrams don't show any signs of such a thing (nor have we heard developers talking about anything along those lines).

Stop hoping the CPU is something it really isn't. And 1.24 GHz is actually fine too; it craps all over X360/PS3 for all they're worth (taking SIMD/FP aside, of course). I only fail to understand why they went with 3 cores rather than 4.
 

JohnB

Member
The thread needs to get back to tech analysis, not the fanboy-war frontline.

I agree, particularly since I paid around $50 for the original die shot. ;-)

So - what have we got as a summation?

Low-power(ful) chip, unusual design, and it's got more grunt than a 10-year-old system.

Is that it?

JB
 

disap.ed

Member
A couple of entries from the -latest- pCARS changelog.

1/2 OT: May I remind you that the pCARS devs clearly said that we shouldn't post the WiiU changelogs outside the WMD board? I know a lot of people don't give a fuck and do this as well, but I don't want you to get banned because of this.
 

disap.ed

Member
^ This. "DX10.1 version" would be the right term from all I read, but nobody gives a fuck about DX10, so it is called the DX11 version.
 

krizzx

Junior Member
So I'm supposed to disregard multiple changelogs over a 5-month period that repeatedly state that they are using DX11 features specifically on Latte, because in DX10 a somewhat, kind-of similar feature may exist?

Some people really go out of their way to downplay gains on this hardware.

They say it's the DX11 feature set and have listed it multiple times, so I will take the devs' word for it.

The Wii U GPU isn't DX10. It uses GX2, which is a custom Nintendo API from all I can tell. That clearly holds DX11-equivalent features.

There have been small confirmations and references to its DX11 capability since before launch. It's not full DX-anything, but it can produce the equivalent graphics, the same as Hollywood's TEV could produce all DX9 shading effects.
 
So I'm supposed to disregard multiple changelogs over a 5-month period that repeatedly state that they are using DX11 features specifically on Latte, because there exist some kind of similar features in DX10?

Some people really go out of their way to downplay gains on this hardware.

They say it's the DX11 feature set, so I will take the devs' word for it.

Yep, some people seem to want to manipulate any info they can find to go after Nintendo, I am so shocked. (Sarcasm)
 
Well, they are porting from their DX11 renderer... of which DX10 is a subset.
Yes. When it comes to comparing DX10.1 and DX11, they have many of the same features, but DX11 is more efficient and better at them, correct? As time goes on, it will be interesting to check out the quality and quantity of the more advanced effects the Wii U can pull off.
 