
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

One thing Nintendo does get right is build quality. I run a repair shop and have 10-20 broken PS3s and Xbox 360s in at any time... I have zero Nintendo consoles. They come through here, but not often enough. All of their consoles have been enjoyable, and they didn't break down every year and force you to run out and buy a new one...

I've actually had my Wii break on me TWICE; once after about a year of having it (clicks of death) and the other as of about 3 weeks ago (the disc drive refuses to spin).
 

japtor

Member
I would agree but do you remember NFS Most Wanted U?

"Wii U's extra memory allows for PC-quality textures and assets"

I wouldn't rule it out.
If you're referring to the graphics (considering this is the GPU thread), yeah, sure, to an extent. Like I said, I wouldn't be surprised if it looks nice and is a reasonable facsimile at a glance, but it's still not going to compare to a super high-end PC rig from a more critical point of view.
 
I would agree but do you remember NFS Most Wanted U?

"Wii U's extra memory allows for PC-quality textures and assets"

I wouldn't rule it out.

Call me kooky, but weren't the PC textures really not that much better than the 360 textures? Let me double-check...

[Image: WiiU_024.png (Wii U screenshot)]


[Image: 360_024.png (Xbox 360 screenshot)]


The Wii U textures do look higher res, but I doubt they would look much better in motion.
 
Hey guys, I wish I could show you a screenshot but I can't figure out how to take one. I've been playing around with that free Animal Crossing app on the Wii U and it looks really nice. It has what seems to be standard on Wii U now, which is very good motion blur and bokeh (or DoF at least). Also, all the animals have nice high-res self-shadowing with not a jaggy in sight. The polygon counts are pretty low, but that might just be because these are based on the 3DS AC characters.

It's not graphically impressive really, but it sure is clean looking.
 

krizzx

Junior Member
I've actually had my Wii break on me TWICE; once after about a year of having it (clicks of death) and the other as of about 3 weeks ago (the disc drive refuses to spin).

Did you store your Wii in a microwave when you were done with it? Historically, Nintendo hardware has always won endurance tests. I remember them being tested on video, and being surprised that the flimsy GC beat the tank known as the Xbox.

They dropped them, submerged them, put them in the oven. Nintendo products always came out on top. I believe this is what people are failing to factor in with the Wii U parts and things like the battery in the GamePad.

Everyone does analysis using the absolute cheapest parts they can find, presuming whatever they don't know is as low-end as possible. They never factor in the quality of the part. There are multiple brands of RAM chips boasting the same specs and stats, but which boast the most endurance and longevity? The battery in the GamePad may only have a 3-hour life, but how often will you have to replace it? How long is that battery's lifespan vs. third-party batteries with more capacity? How resistant is it to overheating or other malfunctions? What is the casing material made of, and how dense is it?

Nintendo doesn't usually use "cheap" parts. That's why the 360 had such a high failure rate and the Wii had such a low one.

Call me kooky, but weren't the PC textures really not that much better than the 360 textures? Let me double-check...

[Image: WiiU_024.png (Wii U screenshot)]


[Image: 360_024.png (Xbox 360 screenshot)]


The Wii U textures do look higher res, but I doubt they would look much better in motion.

It looks substantially better to me, and the shading does as well. The Wii U version has a much better frame rate. The easiest way to see it is to look at objects at a great distance: the higher-quality textured items still retain detail even far away, while you can see detail waning in the 360 version. I'm not talking about the artifacts caused by the limited draw distance either, which was also improved in the Wii U version.
 
Did you store your Wii in a microwave when you were done with it? Historically, Nintendo hardware has always won endurance tests. I remember them being tested on video, and being surprised that the flimsy GC beat the tank known as the Xbox.
GC was a tank. Same can't be said of the Wii or Wii U.

Even if they're still higher quality than the X360's Tupperware finish, they're essentially good for what they are, or appropriate when it comes to finishing, plus a little more stress testing than other parties usually apply; that's it.
There are multiple brands of RAM chips boasting the same specs and stats, but which boast the most endurance and longevity?
Probably something you can ignore altogether, as Nintendo itself does, sourcing parts from various suppliers.

The failure rate for RAM that is not that fast and on a mature process is very low; it comes down to how many write cycles it's rated to endure (and that's a lot).
The battery in the GamePad may only have a 3-hour life, but how often will you have to replace it? How long is that battery's lifespan vs. third-party batteries with more capacity?
Probably a non-factor, really.

Nintendo went with a smaller battery in order to save on costs.
How resistant is it to overheating or other malfunctions? What is the casing material made of, and how dense is it?
I'd say, given how light on chips that thing is, heat resistance was not a concern of the highest order.
Nintendo doesn't usually use "cheap" parts. That's why the 360 had such a high failure rate and the Wii had such a low one.
Not really due to that.

It was the solder. The European Union had banned lead from it, and the lead-free replacement handles thermal cycling worse than the old stuff. So the X360's solder joints would degrade over time and pins would become disconnected, which is why you can resolder them and never have problems again.

The same thing happened to Nvidia GPUs in laptops, on Macs of all things (expensive, supposedly well built).


Nintendo would never approach the heat output and overall parts cost Microsoft was paying back then for the same package. So yeah, even though I agree the construction quality of the X360 was bullcrap, it's really like comparing apples to oranges (and it wasn't really the culprit for the RRoD). Nintendo likes low-wear solutions (slow-spinning fans, cooling not hugely dependent on them, stuff like that), but that kind of design could never have saved the X360 from desoldering/overheating.
 

joesiv

Member
Hmm, considering the different view and the off-camera image for the Wii U version, it's not too clear to judge.
Agreed, you can't really make any comparisons with the WiiU screen since it's totally washed out and doesn't have the dramatic lighting that the PC shot has (remember the WiiU NFS comparisons); time of day makes a huge difference.

Also the PC version might be supersampled to give incredibly crisp details, which the WiiU most definitely won't have.

I can't wait for the comparisons between versions once the game is released, though; should be interesting.
 

krizzx

Junior Member
Agreed, you can't really make any comparisons with the WiiU screen since it's totally washed out and doesn't have the dramatic lighting that the PC shot has (remember the WiiU NFS comparisons); time of day makes a huge difference.

Also the PC version might be supersampled to give incredibly crisp details, which the WiiU most definitely won't have.

I can't wait for the comparisons between versions once the game is released, though; should be interesting.

I recall seeing something about MSAA in the Wii U version change log that was posted. I wouldn't rule it out.
 

HTupolev

Member
I recall seeing something about MSAA in the Wii U version change log that was posted. I wouldn't rule it out.
That would be nice. I'm actually a little surprised that we aren't seeing more MSAA on the WiiU; at a glance it looks like it should be somewhat friendly for it. A fast 32MB multi-purpose eDRAM pool *could* make a decent compromise between the PS3's large render output space (but merely alright bandwidth) and the 360's cartoonishly high multisample bandwidth (into a small pool where you might have to use tiling to make your framebuffer fit).

Still... If we're looking at a typical console multisampling experience (i.e. 720p2xMSAA), I'm easily going to "rule out" it being comparable to a "supersampled to give incredibly crisp details" PC version. Multisampling is hardly a general-purpose "supersample", and you're starting at a low enough resolution that even a non-antialiased 1080p render is going to have comparable geometric quality and substantially better texture and shader sampling.
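To put rough numbers on that comparison, here is a back-of-the-envelope sketch (the 32-bit colour plus 32-bit depth per sample and uncompressed buffers are illustrative assumptions, not confirmed Wii U details):

```python
# Rough sample-count and eDRAM-footprint comparison for the scenario above.
# Assumption: 32-bit colour + 32-bit depth per sample, no compression.

def framebuffer_mib(width, height, samples, bytes_per_sample=8):
    """Colour + depth footprint of a multisampled render target, in MiB."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

# 720p with 2xMSAA: geometry edges get 2 coverage samples per pixel, but
# textures/shaders are still evaluated only once per 1280x720 pixel.
shaded_720p = 1280 * 720            # ~0.92M shader/texture samples
geo_720p_2x = 1280 * 720 * 2        # ~1.84M coverage samples

# 1080p with no AA: every sample is a full shader/texture evaluation.
shaded_1080p = 1920 * 1080          # ~2.07M samples, shaded and geometric

print(f"720p 2xMSAA: {shaded_720p / 1e6:.2f}M shaded, {geo_720p_2x / 1e6:.2f}M geometric samples")
print(f"1080p no AA: {shaded_1080p / 1e6:.2f}M shaded (= geometric) samples")
print(f"720p 2xMSAA framebuffer: ~{framebuffer_mib(1280, 720, 2):.1f} MiB, "
      f"which fits comfortably in a 32 MiB eDRAM pool")
```

So a plain 1080p render already matches the 2xMSAA 720p image on geometric samples and more than doubles the shaded ones, which is the point being made above.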
 

foxuzamaki

Doesn't read OPs, especially not his own
GC was a tank. Same can't be said of the Wii or Wii U.

Even if they're still higher quality than the X360's Tupperware finish, they're essentially good for what they are, or appropriate when it comes to finishing, plus a little more stress testing than other parties usually apply; that's it. Probably something you can ignore altogether, as Nintendo itself does, sourcing parts from various suppliers.

The failure rate for RAM that is not that fast and on a mature process is very low; it comes down to how many write cycles it's rated to endure (and that's a lot). Probably a non-factor, really.

Nintendo went with a smaller battery in order to save on costs. I'd say, given how light on chips that thing is, heat resistance was not a concern of the highest order. Not really due to that.

It was the solder. The European Union had banned lead from it, and the lead-free replacement handles thermal cycling worse than the old stuff. So the X360's solder joints would degrade over time and pins would become disconnected, which is why you can resolder them and never have problems again.

The same thing happened to Nvidia GPUs in laptops, on Macs of all things (expensive, supposedly well built).


Nintendo would never approach the heat output and overall parts cost Microsoft was paying back then for the same package. So yeah, even though I agree the construction quality of the X360 was bullcrap, it's really like comparing apples to oranges (and it wasn't really the culprit for the RRoD). Nintendo likes low-wear solutions (slow-spinning fans, cooling not hugely dependent on them, stuff like that), but that kind of design could never have saved the X360 from desoldering/overheating.

The Wii was a known tank; it's built almost the same way as the GameCube, small but compact, which made it super hard to physically break.
 

krizzx

Junior Member
I have a question about the Upad streaming. I remember it being said that the component that streams data to the GamePad does it at 60 Hz, which had to be split to stream to two, but would it be possible to stream data to the Upad without that component?



The GC and Wii were able to connect to the handhelds of their era without such a component. Might it be possible to use an auxiliary USB device to connect 4 Upads to the console wirelessly, sort of like a wireless mouse receiver? Or maybe even without any auxiliary device at all.

The DS could connect to the Wii wirelessly. Perhaps an updated GamePad v2 might be a way around the two-pad limit.
 
I have a question about the Upad streaming. I remember it being said that the component that streams data to the GamePad does it at 60 Hz, which had to be split to stream to two, but would it be possible to stream data to the Upad without that component?



The GC and Wii were able to connect to the handhelds of their era without such a component. Might it be possible to use an auxiliary USB device to connect 4 Upads to the console wirelessly, sort of like a wireless mouse receiver? Or maybe even without any auxiliary device at all.

The DS could connect to the Wii wirelessly. Perhaps an updated GamePad v2 might be a way around the two-pad limit.

I don't remember: does the GamePad have Bluetooth? I believe the video stream is a proprietary Wi-Fi ad-hoc link, if memory serves.
 

StevieP

Banned
48 shaders at what clock, and with what other features supporting it? There is still the other main point I brought up: aren't the shader units in Latte physically larger than AMD's normal 20-SP components?

Also, another point I forgot to mention on the power consumption vs. hardware performance front is that the Wii U's target power draw is "45 watts", not 36 watts. Iwata actually stated this. The fact that 36 watts is the highest recorded in active use simply means that the hardware isn't being pushed or utilized to its full extent.

It makes sense why Nintendo doesn't release more info on the hardware. There are people who keep complaining about Nintendo not releasing information, yet they disregard the information that has been released by Nintendo as PR talk, or just outright dismiss it when it doesn't support their negative outlook. What point would there be in releasing more? I doubt anyone who looks negatively on the hardware or the company would shift their opinion even in the least. Most of it is the result of bias that existed before they knew anything about the hardware at all.



This is interesting, though it's still a far cry from this. When they get the visuals up to this point on the Wii U, I shall consider buying this game.

[Image: large.jpg (PC screenshot)]


Would be nice to see how it stands against the console version for the 360 and PS3 though. Are there any shots available for those?

You're not going to get the quality of the PC shot on the Wii U, even if it will likely be better than the PS360 versions.
 

muteant

Member
One thing that's niggling me about Wii U games' visuals is the lack of AA. Pikmin 3 is gorgeous, but the jaggies on my 60" TV bug me, and TW101 looks to suffer the same fate (still really stoked about it). I'm wondering if AA is particularly taxing on this system for some reason. Are there games already released for the system that employ effective AA solutions?
 

krizzx

Junior Member
One thing that's niggling me about Wii U games' visuals is the lack of AA. Pikmin 3 is gorgeous, but the jaggies on my 60" TV bug me, and TW101 looks to suffer the same fate (still really stoked about it). I'm wondering if AA is particularly taxing on this system for some reason. Are there games already released for the system that employ effective AA solutions?

It's all up to the devs, as with all things. The ports on the system all had the same AA as their PS3/360 counterparts, so its AA performance is certainly not worse than what the last-gen consoles can do.

It's probably the same thing as with 1080p: most devs feel the resources would be better leveraged elsewhere.
 
The Wii was a known tank; it's built almost the same way as the GameCube, small but compact, which made it super hard to physically break.
Not really, no.

The GC had thicker ABS walls; that's an easy conclusion to come to: just press the Wii on the top or bottom and the plastic will flex inward under the pressure.

That couldn't happen on the GC, for various reasons. First of all, its walls were thicker; second, it wasn't moulded in a form as plain as the Wii. See how plastic water bottles have all those indentations to hold their structure? It's the same thing; the GC had them everywhere: on the bottom with the high-speed ports, network ports and so on, on top with the disc-loading assembly, you name it. Even though it was basically a cube, it didn't have plain surfaces of just flat plastic without ins and outs. All in all, even with the same finishing it would already be sturdier, and on top of that it has thicker walls, as I said.

On top of it all, since the GC was a multi-layered tank, it had structure within the structure, and that means it's sturdy as hell. The Wii wasn't multi-layered, so the only structure within the structure worth mentioning is the shielding, which is something, but not really the same fashion or degree of sturdiness.

Source: I have an industrial design degree; I think I know what I'm talking about. That said, I'm not saying the Wii is poorly built; I'm saying the GC was a Fisher-Price tank. Nintendo's design briefing must have been something along the lines of "it has to survive a soccer game" or "an army of kids that think the thing is edible".

The Wii had a different briefing, less "toy" and less about surviving kids, and more about being the size of three stacked DVD cases; it's only natural the results vary.
 

z0m3le

Banned
AFAIK it's 240 shaders (48 × 5-wide vector units).

The Xenos is a very well documented ATI part.

I'm actually in the camp that 160 shaders is on the lower end for the Wii U, and that the actual count is much higher than that.

If it was truly an RV73x derivative it should have 320 shaders, just like the Mobility Radeon 4670 in my laptop, which would make a heck of a lot more sense in terms of how it can surpass the 360/PS3's graphical prowess and have DX10/11+ capabilities.
Here: http://beyond3d.com/showthread.php?t=45061
We found out a while ago that it's actually 216 GFLOPS for Xenos, since the scalar op is only one FP operation as opposed to 2, as was initially thought. That makes the total FLOPs per ALU per cycle 9 rather than 10.

Also, the ~60% efficiency figure is based on Microsoft's own claim of a 66% improvement in efficiency per shader compared to Xenos with the XB1 (making it around 99% if Xenos is around 60%). These are all theoretical numbers anyway, but it's interesting to compare to the minimum for the Wii U.
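For anyone following the arithmetic behind the 240 vs. 216 GFLOPS Xenos figures, here is the calculation spelled out (a quick sketch assuming the commonly cited 48 vec4+scalar units at 500 MHz):

```python
# Arithmetic behind the 240 vs. 216 GFLOPS Xenos figures quoted above.
# Assumption: 48 vec4+scalar (vec5) ALUs at 500 MHz.
XENOS_UNITS = 48
XENOS_CLOCK_GHZ = 0.5

# Each vec4 lane does a multiply-add per cycle (2 FLOPs). The question is whether
# the 5th (scalar) slot also counts as a MADD (2 FLOPs) or as a single FP op (1).
flops_per_unit_old = 4 * 2 + 2      # 10 FLOPs/cycle -> the old figure
flops_per_unit_revised = 4 * 2 + 1  # 9 FLOPs/cycle  -> the revised figure

print(XENOS_UNITS * flops_per_unit_old * XENOS_CLOCK_GHZ)      # 240.0 GFLOPS
print(XENOS_UNITS * flops_per_unit_revised * XENOS_CLOCK_GHZ)  # 216.0 GFLOPS
```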
 
The real problem is the architecture of the chip: you cannot tell which block is which, and the ALUs are not recognisable. You can hypothesise, but the GPGPU is very heavily customised, and the three-core PowerPC CPU is even more customised; work that previous-generation consoles bet heavily on their CPUs for is now done by Latte. The Wii U is not CPU-heavy but rather GPGPU-heavy.

So some ALUs may run multiple workloads at the same time more efficiently on this hardware, rather than having separate ALUs doing different things. The Wii U is built around efficiency, and the crazy bastards in Nintendo R&D don't say shit in order to keep their work under wraps. I believe that is the only reason Nintendo will not reveal more specs for the Wii U.

We need more games built from the ground up for the console to understand the ALUs by observation. Today when I was playing the W101 demo I compared it with the older build, and they have added quite a lot of shader effects, like depth of field, normal mapping, lightmaps and other stuff that was absent before.

So Latte is not showing everything it has yet. We should see more when heavier productions hit the system, especially from Nintendo.
 
All you guys with Twitter, get onto developers and simply ask 'Is the WiiU GPU 176 GFLOPS or more?'; maybe someone will bite.

Can they really still be under NDA, 10 months after launch?

Also, where are Ideaman and BG? Maybe one of their sources can shed some light on the situation. Would be nice to put it to bed once and for all.
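For reference, the 176 GFLOPS figure in that question follows from the low-end 160-shader estimate at the commonly reported ~550 MHz Latte clock; a quick sketch of the arithmetic (the shader count is exactly the assumption in dispute):

```python
# Where the 176 GFLOPS figure comes from, under the low-end assumption of
# 160 stream processors at the commonly reported ~550 MHz GPU clock.
STREAM_PROCESSORS = 160        # the disputed low-end estimate
CLOCK_GHZ = 0.55               # ~549.99 MHz, as reported for Latte
FLOPS_PER_SP_PER_CYCLE = 2     # one multiply-add per lane per cycle

print(f"{STREAM_PROCESSORS * FLOPS_PER_SP_PER_CYCLE * CLOCK_GHZ:.0f} GFLOPS")  # 176 GFLOPS
# A 320-shader estimate at the same clock would simply double this to 352 GFLOPS.
```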
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
All you guys with Twitter, get onto developers and simply ask 'Is the WiiU GPU 176 GFLOPS or more?'; maybe someone will bite.

Can they really still be under NDA, 10 months after launch?
You haven't seen many NDAs, have you? ; )
 

AzaK

Member
The Wii U is built around efficiency, and the crazy bastards in Nintendo R&D don't say shit in order to keep their work under wraps. I believe that is the only reason Nintendo will not reveal more specs for the Wii U.

Not the only reason. They used to release specs, but the GameCube looked inferior to its competition and got slammed by the enthusiasts, even though it was definitely not a lightweight.

Then they went low-tech with the Wii, so of course they won't tout their specs; they'd look laughable against the PS3 and Xbox 360. The Wii U is similar: it's a bit of a bump over the 360/PS3, but it would be annihilated by the PS4/XB1, and Nintendo, being paranoid, think it would hurt them.

Personally I don't think it would matter. People know the Wii U is weak compared to the PS4/XB1, so there's no news there.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
:( How long do they last, the whole generation?...
Never signed one with N myself, but theoretically they could last indefinitely (in contrast to the non-compete agreements).
 

JordanN

Banned
Not the only reason. They used to release specs, but the GameCube looked inferior to its competition and got slammed by the enthusiasts, even though it was definitely not a lightweight.

Did Nintendo ever clarify this? If it really bothered them so much why not just say "nuh-uh" rather than ignore it?

Just not releasing specs because you chose to present them a different way (and thus got compared differently) doesn't make sense.

They also kept the GameCube specs on their website well after the fact, so I can't really believe it was a problem.
 

Oblivion

Fetishing muscular manly men in skintight hosery
Question: Are the GPU specs decided solely by the R&D divisions at Nintendo/Sony/MS, or do the chip manufacturers provide input as well?

Did Nintendo ever clarify this? If it really bothered them so much why not just say "nuh-uh" rather than ignore it?

I don't think they necessarily ignored it. For example, when it came to the polygon-crunching numbers, the GC was able to do 12 million, a far cry from the 70 million that Sony claimed. Nintendo, however, tried to point out that the PS2's numbers were without any effects added on, while the GC's were supposed to be with them included. No one cared. That's probably why they stopped trying.
 
Not the only reason. They used to release specs, but the GameCube looked inferior to its competition and got slammed by the enthusiasts, even though it was definitely not a lightweight.

Then they went low-tech with the Wii, so of course they won't tout their specs; they'd look laughable against the PS3 and Xbox 360. The Wii U is similar: it's a bit of a bump over the 360/PS3, but it would be annihilated by the PS4/XB1, and Nintendo, being paranoid, think it would hurt them.

Personally I don't think it would matter. People know the Wii U is weak compared to the PS4/XB1, so there's no news there.

The days of the GameCube spec comparisons with the PS2 and Xbox are laughable; I won't say much on that front, but the "theoretical" peaks and polygon counts of that generation did not matter nearly as much as the exclusivity deals with third parties. Nintendo did not have any problem with the specs of the GameCube. The Emotion Engine the PS2 used was so exotic that, to back their claims, the tech demos they presented at E3 were full of bullshit and PR talk, which reminded me of the February PS4 reveal presentation: old man faces, same as Quantic Dream's sorcerer head, target renders.

The Wii U is a very efficient machine; placing it just a little above the Xbox 360 and PlayStation 3 is not correct. I do not know if you are like the people who say more RAM is MORE POWER, because that is the mistake most people unfamiliar with silicon engineering make.

More shared memory does not have anything to do with processing power, and even bandwidth does not apply cleanly in practice, even if in theory it should. The x86 CPU architecture, especially the APU AMD uses for the Xbox One and PS4, is from the cheap end of the shelf. Jaguar is literally an upclocked Bobcat (see the benchmarks for this CPU), and the PS4 and Xbox One have two of them duct-taped together. Who in their right mind says that is a next-gen architecture? Plastering a shared pool of 8 GB of GDDR5 or DDR3 onto that is like hooking up eight mechanical pumps of 17 litres per hour each, on single pipes, to transfer the Pacific Ocean into the Atlantic and claiming you created the fastest way to move one ocean into the other. With the Wii U we do not know shit about how its innards work. If a guy with a dev kit on GAF could run a benchmark and pass along the info in a way that doesn't break the NDA, it would give us an idea of how the machine works.

Some people would call me crazy, BUT I believe the difference between the XOne/PS4 and the Wii U is that of two generations of mid-range GPUs, like an HD Radeon 7430M (Wii U) versus an HD Radeon 7690M XT (Xbox One/PS4), judging from the games I have seen running in real time. Past that, it also depends on the studios and tools that can harness the power of these machines, and we haven't seen much on Xbox One, PS4 or Wii U; it is too early to pass judgement.

Because of its exotic architecture, the GameCube pulled more than twice the processing power its "numbers" suggested. It was like a Volkswagen Touareg with a Porsche 955 Cayenne engine under the hood.

That's the way I think the Wii U was engineered, although I do not have anything beyond GameCube engineering to back my claims. I need games to judge what the Wii U is capable of. Also, no middleware engine comparisons, because no one will bother optimising the Wii U version of a multi-platform game that is cross-gen.
 

KidBeta

Junior Member
More shared memory does not have anything to do with processing power, and even bandwidth does not apply cleanly in practice, even if in theory it should. The x86 CPU architecture, especially the APU AMD uses for the Xbox One and PS4, is from the cheap end of the shelf. Jaguar is literally an upclocked Bobcat (see the benchmarks for this CPU), and the PS4 and Xbox One have two of them duct-taped together.

You are literally wrong.

http://www.3dcenter.org/dateien/abbildungen/AMD-Jaguar-vs-Bobcat.jpg
http://www.notebookcheck.net/filead...Prozessoren/A6-1450_Test/Kabini_vs_Bobcat.jpg

With the Wii U we do not know shit about how its innards work. If a guy with a dev kit on GAF could run a benchmark and pass along the info in a way that doesn't break the NDA, it would give us an idea of how the machine works.

We don't know shit about its innards, except that we know it's around last gen, maybe half a head above, and it certainly doesn't hold a candle to next gen.

Some people would call me crazy, BUT I believe the difference between the XOne/PS4 and the Wii U is that of two generations of mid-range GPUs, like an HD Radeon 7430M (Wii U) versus an HD Radeon 7690M XT (Xbox One/PS4), judging from the games I have seen running in real time. Past that, it also depends on the studios and tools that can harness the power of these machines, and we haven't seen much on Xbox One, PS4 or Wii U; it is too early to pass judgement.

Except that the Wii U isn't even based on GCN, so that gives it an even further performance disadvantage.
 

mo60

Member
The Wii U is stronger than the 360 and PS3. From what I have been reading, it may be as powerful as a PC on low settings. The 360 and PS3 are weaker than a PC on low settings.
 

chaosblade

Unconfirmed Member
Except that the Wii U isn't even based on GCN, so that gives it an even further performance disadvantage.

I don't think not being GCN inherently means there's a performance disadvantage. GCN has a huge advantage on PC, where games aren't developed for a specific GPU architecture, and its thread-level parallelism is easier to take advantage of than VLIW's instruction-level parallelism. All other things equal, VLIW actually has higher theoretical performance (versus GCN 1; not sure about GCN 2).

Of course, the question remains how much of that theoretical performance you can get with a given workload, which is something there doesn't really seem to be an answer for, since workloads vary. Either way, the WiiU GPU can be used more efficiently than a comparable PC VLIW part, since games will be more optimized for it. As a whole, I don't think not being GCN is inherently a major factor for WiiU GPU performance.

Whether or not it would affect ports/multiplats is another story, but I don't think it's going to see many of those from PS4/XB1 anyway.
 

Rvaisse

Member
I know almost nothing about raw power, but I've played Pikmin 3 and I've NEVER seen something like this on PS360. NEVER. Nintendo's special sauce really makes a difference. There's no "on par" here.
 
I know almost nothing about raw power, but I've played Pikmin 3 and I've NEVER seen something like this on PS360. NEVER. Nintendo's special sauce really makes a difference. There's no "on par" here.

Pikmin 3 benefits from the heavy use of DoF and the higher-res textures they use here and there. Had it started on Wii U, it would either look much better or at least be 1080p/60 FPS.
See TW101: it has a lot more going on at any given time, has consistently higher-resolution textures, and runs at 60 fps.
As for PS3/360, Pikmin 3 would most likely be doable with some sacrifices.
 

Rvaisse

Member
Pikmin 3 benefits from the heavy use of DoF and the higher-res textures they use here and there. Had it started on Wii U, it would either look much better or at least be 1080p/60 FPS.
See TW101: it has a lot more going on at any given time, has consistently higher-resolution textures, and runs at 60 fps.
As for PS3/360, Pikmin 3 would most likely be doable with some sacrifices.

Great reply, thanks for the details. I'll have a look at this when replaying the W101 demo; I didn't really pay attention to the details till now, I was quite absorbed by the frenetic gameplay ^^
 

He's... not entirely wrong, actually. Granted, ''low-settings PC'' means something like a dual-core CPU and an 8800GS with 512MB of VRAM (which are my current specs), which is quite a bit older than the stuff inside the WiiU. My PC probably performs better than a WiiU when all is said and done, but the WiiU definitely does things like lighting, DoF, and motion blur a lot better than an 8800GS; those features tend to tank my FPS pretty badly. PS360 ports look and run better on my PC than on consoles (and at 1200p to boot, but that's just my iMac being badass).

Note that most PC games list the above as ''minimum'' specs specifically because they are tied to consoles in some ways. When the new consoles and games come out, the bar will be set quite a bit higher than that.
 

Do you even understand the things you are quoting? I wasn't specific about the numbers, but which part of the ">10% clock speed and >15% instructions per clock" from the AMD spec sheet do you not understand?

edit: I am an idiot about the "literally" part of my previous post. Yes, of course Jaguar did make some hardware modifications, like adding two more cores, more cache per core, and an upclocked speed, but the result was only about 15% more CALCULATING power. The architecture of each core remains essentially the same as Bobcat's.

We don't know shit about its innards, except that we know it's around last gen, maybe half a head above, and it certainly doesn't hold a candle to next gen.

"I do not know anything about the engine of the car, but I know it is NOT a good engine because..." Logic in your post? I couldn't find any.

Except that the Wii U isn't even based on GCN, so that gives it an even further performance disadvantage.

How is it possible to hardware-emulate Wii and GameCube games then (GameCube through homebrew)? Or, better yet, why are you so sure about what IBM and Nintendo's engineers have done with the CPU? PowerPC architecture is way more advanced than AMD's and Intel's commercial x86 processors (not that the Wii U has an x86 CPU). Again, that has nothing to do with what I said in my previous post, which you intentionally misinterpreted.
 

A More Normal Bird

Unconfirmed Member
GCN = Graphics Core Next, AMD's architecture used in the 7-series chips and the XB1 and PS4 GPUs, not Gamecube.
 

strata8

Member
Do you even understand the things you are quoting? I wasn't specific about the numbers, but which part of the ">10% clock speed and >15% instructions per clock" from the AMD spec sheet do you not understand?

Do you? You said that "Jaguar is literally an upclocked Bobcat". Any increase in IPC, or other changes in the architecture, invalidates that statement.

PowerPC architecture is way more advanced than AMD's and Intel's commercial x86 processors (not that the Wii U has an x86 CPU).

Source? I don't see how this could possibly be true.

edit:
edit: I am an idiot about the "literally" part of my previous post. Yes, of course Jaguar did make some hardware modifications, like adding two more cores, more cache per core, and an upclocked speed, but the result was only about 15% more CALCULATING power. The architecture of each core remains essentially the same as Bobcat's.

This is wrong. Within the core itself, Jaguar adds a loop buffer, improves the prefetcher, adds a hardware divider, and doubles the FPU width. This led to a 15-20% increase per clock, or an increase of 25-30% overall when combined with the 10% frequency bump.
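A quick check of how those two figures combine multiplicatively (just the arithmetic implied above, nothing more):

```python
# How the per-clock and frequency gains quoted above combine multiplicatively.
ipc_gain_low, ipc_gain_high = 1.15, 1.20   # 15-20% more instructions per clock
clock_gain = 1.10                          # ~10% frequency bump

overall_low = ipc_gain_low * clock_gain    # ~1.27x
overall_high = ipc_gain_high * clock_gain  # ~1.32x
print(f"Overall speedup: {overall_low:.2f}x to {overall_high:.2f}x")
```

Which lines up with the quoted 25-30% overall figure, give or take rounding.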
 

chaosblade

Unconfirmed Member
Thanks for making it clearer for me to understand. Do we even know whether Latte is or is not a GCN architecture like the Xbox One and PS4 GPUs? Because I did not know about this new approach from AMD.

Latte apparently is based on a GPU that used VLIW (4? 5? I don't know, there was some debate on that earlier in the topic), so it probably doesn't use GCN. The other two are based on GCN; that's already known.

VLIW has higher theoretical performance (assuming all other things are equal), but GCN has proven to have better real-world performance. I mentioned this several posts up.
 
Source? I don't see how this could possibly be true.

I have a benchmark sheet, but it is for the supercomputer server units; I do not have a consumer benchmark right now. I am also in a hurry for an appointment, so I will post later.

edit:


This is wrong. Within the core itself, Jaguar adds a loop buffer, improves the prefetcher, adds a hardware divider, and doubles the FPU width. This led to a 15-20% increase per clock, or an increase of 25-30% overall when combined with the 10% frequency bump.

Are you serious? You add a loop buffer and a divider and call that a new processor? These changes were made because of the jump in core count in the overall design, to help the flow of calculations; to put it simply, the extra cores would otherwise hinder overall performance. A good example: take two trucks with 2-tonne loads and add another 2 tonnes of cargo to one of them; of course that truck is overloaded and can't move like before, so if we split the cargo between them and upgrade each engine from 1000 hp to 1300 hp, the trucks perform 15% better than before. The trucks ARE the same, but the load is different; sure, we made some modifications, but the bottom line is each truck can only go from carrying 2 tonnes to 3. Same with Jaguar and Bobcat. I do not have time now; we will continue after I return from work.
 

strata8

Member
I have a benchmark sheet, but it is for the supercomputer server units; I do not have a consumer benchmark right now. I am also in a hurry for an appointment, so I will post later.

No one's going to be using PowerPC in a supercomputer. POWER7 is not the same thing as the architecture used in the Wii U.


Are you serious? You add a loop buffer and a divider and call that a new processor? These changes were made because of the jump in core count in the overall design, to help the flow of calculations; to put it simply, the extra cores would otherwise hinder overall performance.

If it's the same core, how does a 1.5 GHz Jaguar outperform a 1.6 GHz Bobcat by 22% in a single-threaded benchmark?
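Normalising that benchmark result for the clock difference gives the implied per-clock gap (a rough sketch; it lumps core and memory-system improvements together, which the reply below gets into):

```python
# What a 22% single-threaded win at a lower clock implies per clock.
jaguar_clock_ghz = 1.5
bobcat_clock_ghz = 1.6
performance_ratio = 1.22   # Jaguar score / Bobcat score in the cited benchmark

# Normalise out the clock difference to get the implied per-clock advantage
# (core improvements plus memory-system improvements, lumped together).
per_clock_advantage = performance_ratio * (bobcat_clock_ghz / jaguar_clock_ghz)
print(f"Implied per-clock advantage: ~{(per_clock_advantage - 1) * 100:.0f}%")  # ~30%
```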
 

KidBeta

Junior Member
Latte apparently is based on a GPU that used VLIW (4? 5? I don't know, there was some debate on that earlier in the topic), so it probably doesn't use GCN. The other two are based on GCN; that's already known.

VLIW has higher theoretical performance (assuming all other things are equal), but GCN has proven to have better real-world performance. I mentioned this several posts up.

VLIW doesn't have to have higher theoretical performance; it's mostly dependent on the number of shader cores / work units you have in the GPU. I think the measured speed difference between GCN and the last gen's cards (VLIW4, I think) was ~20% at the same clock speed, maybe higher depending on the workload. GCN brings more than just a different shader unit; it also improves the cache hierarchy, among other things.
 

Donnie

Member
If it's the same core, how does a 1.5 GHz Jaguar outperform a 1.6 GHz Bobcat by 22% in a single-threaded benchmark?

I'm not saying it is exactly the same core, because there are clearly some optimisations. However, to your question of why a 1.5 GHz Jaguar would outperform a 1.6 GHz Bobcat by 22% in the test you linked to: part of the reason is that the Jaguar-based AMD A4-5000 tested has 50% more main memory bandwidth than the Bobcat-based AMD E-350. Jaguar also has an improved L2 cache design, which increases effective bandwidth and therefore performance further. The core itself will also play some part in the increased performance, of course.
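To illustrate the bandwidth part of that answer, a minimal sketch assuming the commonly listed memory configurations (single-channel DDR3-1066 on the E-350 versus single-channel DDR3-1600 on the A4-5000; those configurations are my assumption, not stated in the post):

```python
# Peak theoretical DRAM bandwidth = transfer rate * bus width.
# Assumed configs: E-350 with single-channel DDR3-1066, A4-5000 with DDR3-1600.
def bandwidth_gb_s(megatransfers_per_s, bus_width_bits=64):
    return megatransfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

e350_bw = bandwidth_gb_s(1066)   # ~8.5 GB/s
a4_bw = bandwidth_gb_s(1600)     # ~12.8 GB/s
print(f"E-350:   ~{e350_bw:.1f} GB/s")
print(f"A4-5000: ~{a4_bw:.1f} GB/s ({a4_bw / e350_bw - 1:.0%} more)")
```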
 