
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Schnozberry

Member
Given EA's history, I wouldn't doubt that it was an arbitrary decision made for no other reason than to disparage Wii U owners, the same as pricing the game at 60 dollars and then offering a $30 discount on their own proprietary Origin network on launch day, and the same as announcing that they would no longer support the game with DLC before it even touched a shelf.

EA hosts the multiplayer. It's all run through your Origin Account.
 
That's all GPU stuff, not CPU stuff. If anything, the Wii U port should even free CPU resources, because it enables things like GPU skinning.


That "lazy" shit ruins discussions all the time. The main reasons for inferior ports are the lack of familiarity with the architecture and tools and - most of all - RoI. If the RoI isn't there, no publisher will assign their A team and give them enough time to figure everything out and do a proper job. It's just not going to happen.
Agreed. In the case of RE:R, my impression is that the game is not running as well as it should on any platform except the 3DS. The Wii U would suffer the most due to having the youngest development tools.
 

v1oz

Member
Your argument is filled with fallacies. You are using facts to make claims that do not logically follow.
Hmm, and what exactly are those claims of mine that are filled with fallacies?

No aspect of the CPU would cause cars to be dropped from multiplayer. Only a GPU issue would cause that, and the GPU is clearly better than the PS3/360's for them to do all of the upgrades they did. You don't draw game models on the CPU. The only things the CPU is used for are general game code, A.I. and physics. The single-player campaign would be more taxing on the CPU than multiplayer because it has to calculate the physics over a wider array and the A.I. for the cars. If it were a CPU issue then there would have been far more cuts than a few multiplayer cars. At worst, the CPU would lead to them having to scale back the physics, but that didn't happen.
Who said less players in NFS multiplayer was a CPU issue? It certainly wasn't me.


The car reduction was more than likely the result of either network issues or EA simply wanting to shortchange Nintendo some more, which is far more likely. Sonic & All-Stars Racing has no such problem. Black Ops on the Wii U has no such problem. Tekken Tag on the Wii U has no such problem.
Like I said in my previous post, unless you know exactly what's going on under the hood, you can't be sure of anything. Anything else and you are making gross assumptions.
 

v1oz

Member
That's all GPU stuff, not CPU stuff. If anything, the Wii U port should even free CPU resources, because it enables things like GPU skinning.
Well for one I didn't say or imply that it wasn't GPU stuff.

All I said is that the game's underlying code is different and the enhancements to the graphics point towards that. And that was because you said it was the exact same game as the 3DS version - which is clearly an oversimplification.


That "lazy" shit ruins discussions all the time. The main reasons for inferior ports are the lack of familiarity with the architecture and tools and - most of all - RoI. If the RoI isn't there, no publisher will assign their A team and give them enough time to figure everything out and do a proper job. It's just not going to happen.
From what I've heard there's nothing too unfamiliar about the architecture. It's a very straightforward design, and PPC code is not unfamiliar to anyone in the industry.

As for the ROI stuff. I'm sure the very same team worked on all versions of RE Revelations. There's no reason to believe they were not motivated to do a good job. Unless you know something that we don't all know.
 

StevieP

Banned
From what I've heard there's nothing too unfamiliar about the architecture. It's a very straightforward design, and PPC code is not unfamiliar to anyone in the industry.

Not as simple as you make it out to be.

As for the ROI stuff, I'm sure the same team worked on all versions of RE Revelations. There's no reason to believe they were not motivated to do a good job. Unless you know something that we don't all know.

Less money to be made on the Wii U = less time spent on the Wii U. Not a reflection of skills of the port team as much as it is time spent vs other platforms.
 

guek

Banned
Like I said in my previous post, unless you know exactly what's going on under the hood, you can't be sure of anything. Anything else and you are making gross assumptions.

hmm...

From what I've heard there's nothing too unfamiliar about the architecture. It's a very straightforward design, and PPC code is not unfamiliar to anyone in the industry.

As for the ROI stuff. I'm sure the very same team worked on all versions of RE Revelations. There's no reason to believe they were not motivated to do a good job.

So I guess we can say you know exactly what's going on under the hood then...?
 

wsippel

Banned
Well for one I didn't say or imply that it wasn't GPU stuff.

All I said is that the game's underlying code is different and the enhancements to the graphics point towards that. And that was because you said it was the exact same game as the 3DS version - which is clearly an oversimplification.
You know, I'd agree, but look at MH3U. Same engine, same situation, launch game, similar improvements, no stutter, runs at 1080p. But it's also a more high profile project, exclusive no less, and more effort went into it.

From what I've heard there's nothing too unfamiliar about the architecture. It's a very straightforward design, and PPC code is not unfamiliar to anyone in the industry.

As for the ROI stuff. I'm sure the very same team worked on all versions of RE Revelations. There's no reason to believe they were not motivated to do a good job. Unless you know something that we don't all know.
Where did you hear that? Because we recently got a statement from Shin'en on that topic, and they apparently disagree. And you'll have a hard time finding a studio that has more experience on Nintendo hardware. The system might be "easy to develop for", but that doesn't mean it's easy to exploit - it just means you don't have to bend over backwards to get something up and running on the system. The architecture doesn't really matter, either, Espresso, Xenon and Cell are still extremely different animals.

Also, MT Framework has been running on 360 and PS3 for years, the Wii U version only had a couple of months. It doesn't matter if it's the same team, and this is not about motivation, laziness or some other nebulous bullshit. It's a new and different platform with new and different tools.
 
You know, I'd agree, but look at MH3U. Same engine, same situation, launch game, similar improvements, no stutter, runs at 1080p. But it's also a more high profile project, exclusive no less, and more effort went into it.

Actually, I believe that MH3U presents the same framerate problems; there are discussions about it in the thread dedicated to RE:R...
 

wsippel

Banned
Actually, I believe that MH3U presents the same framerate problems; there are discussions about it in the thread dedicated to RE:R...
I've played MH3U on Wii U for, dunno, 200 or 300 hours? There's no stutter. There are framerate problems related to one specific effect, namely a certain snow particle effect, but that effect isn't even impressive or anything, which means it's almost certainly a bug.
 

krizzx

Junior Member
Indeed, saying that something that runs on the PS3/360 with no problems should run the same on the Wii U is like saying that something made for Windows should be able to run on a Mac, or that something made to run on an Intel CPU should have no problem producing the same performance on an AMD CPU. That is not always the case.

I don't know where these people get their understanding of how computer systems work, and they assert their beliefs so wholeheartedly on top of it. Software won't work as well on hardware it wasn't made for as it will on hardware it was, at least not without effort, even if the hardware it's being ported to is stronger.

ZOE Second Runner and Silent Hill HD are good examples of this.

http://www.cinemablend.com/games/Ko...e-Silent-Hill-HD-Collection-Fiasco-45528.html
http://www.gametrailers.com/side-mi...hd-ps3-getting-framerate-patch-sequel-on-hold
Using the logic I see people using around here toward the Wii U, I guess that means the PS3/360 can hardly keep up with, or are only slightly better than, the PS2.


The red blocks and notes in the list are clearly an indicator that the Xbox 360 is not that much better than the original Xbox.
http://en.wikipedia.org/wiki/List_of_Xbox_games_compatible_with_Xbox_360
 
Actually, I believe that MH3U presents the same framerate problems; there are discussions about it in the thread dedicated to RE:R...

RE:R's and MH's occasional framerate drops have nothing to do with CPU or GPU performance; they're clearly down to unoptimised code. The framerate drops have nothing to do with how much is happening on screen or the complexity of the scene, which is a clear hint of unoptimised code. The MT Framework Capcom uses for all its games is clearly in a pre-alpha state on the Wii U.
 
OK, let's go with that 160 SP number for the time being. To match the 360's performance, Latte would have to be at least 33% more efficient than Xenos ((240-160)/240) to give it a 50% boost. According to aegies, Xbox One's shaders are supposed to be 66% more efficient than Xenos. Going by that, the 768 SPs of Xbox One's GPU are equivalent to ~1275 of Xenos' shaders. That means that the XBONE has at most a little more than 5x the shader power of Wii U. Since XBONE only uses 90% of its power for games, though, that would reduce its advantage to a little less than 5x.

Assuming that the PS4 is as efficient as the Xbox, it will be at most close to 8x the shader power of Wii U. Interestingly, the XBONE appears to be close to being the midpoint between Wii U and PS4 in shader power.

Is my math in this analysis correct?
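
For anyone who wants to check it, here is the post's arithmetic as a quick sketch. Every input - the 160 SP reading, aegies' 66% efficiency figure, the 90% game reservation, and the premise that a 160-SP Latte matches Xenos' 240 SPs - is conjecture from the posts above, not a confirmed spec. (One nit: matching 240 SPs with 160 requires 240/160 - 1 = 50% more per-SP efficiency; (240-160)/240 = 33% is the deficit measured the other way.)

latte_sp = 160      # assumed Latte SP count (conjecture)
xenos_sp = 240      # Xbox 360 "Xenos" SP count
xb1_sp   = 768      # Xbox One SP count
ps4_sp   = 1152     # PS4 SP count

print(xenos_sp / latte_sp - 1)        # 0.5     -> the 50% per-SP boost needed
xb1_equiv = xb1_sp * 1.66             # 1274.88 -> "~1275" Xenos-equivalent SPs
print(xb1_equiv / xenos_sp)           # ~5.31   -> "a little more than 5x"
print(xb1_equiv * 0.9 / xenos_sp)     # ~4.78   -> "a little less than 5x"
print(ps4_sp * 1.66 / xenos_sp)       # ~7.97   -> "close to 8x"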
 

USC-fan

Banned
OK, let's go with that 160 SP number for the time being. To match the 360's performance, Latte would have to be at least 33% more efficient than Xenos ((240-160)/240). According to aegies, Xbox One's shaders are supposed to be 66% more efficient than Xenos. Going by that, the 768 SPs of Xbox One's GPU are equivalent to ~1275 of Xenos' shaders. That means that the XBONE has at most a little more than 5x the shader power of Wii U. Since XBONE only uses 90% of its power for games, though, that would reduce its advantage to a little less than 5x.

Assuming that the PS4 is as efficient as the Xbox, it will be at most close to 8x the shader power of Wii U. Interestingly, the XBONE appears to be close to being the midpoint between Wii U and PS4 in shader power.

Is my math in this analysis correct?
Sure, but that is just one factor. People in here focus way too much on one number. The system is the sum of all its parts. Again, I wouldn't compare the Wii U to the PS4/XBONE at all.

As an outsider looking in, he seems to be one of only a few people trying to determine what the Wii U is rather than dream about what they want it to be. His posts are way more conclusive than half the things I'm seeing in this thread.
It's been like this for the longest time in Wii U tech threads. At least the attack posts have gone down a lot.
 

krizzx

Junior Member
OK, let's go with that 160 SP number for the time being. To match the 360's performance, Latte would have to be at least 33% more efficient than Xenos ((240-160)/240). According to aegies, Xbox One's shaders are supposed to be 66% more efficient than Xenos. Going by that, the 768 SPs of Xbox One's GPU are equivalent to ~1275 of Xenos' shaders. That means that the XBONE has at most a little more than 5x the shader power of Wii U. Since XBONE only uses 90% of its power for games, though, that would reduce its advantage to a little less than 5x.

Assuming that the PS4 is as efficient as the Xbox, it will be at most close to 8x the shader power of Wii U. Interestingly, the XBONE appears to be close to being the midpoint between Wii U and PS4 in shader power.

Is my math in this analysis correct?

This is assuming that the Wii U is not subject to the same drop in efficiency, if not worse, due to its repeatedly confirmed poor documentation and early firmware.
 

tipoo

Banned
Assuming that the PS4 is as efficient as the Xbox, it will be at most close to 8x the shader power of Wii U. Interestingly, the XBONE appears to be close to being the midpoint between Wii U and PS4 in shader power.


Similarly, the ROP counts seem to be 8, 16, and 32, each double that of the next one down. But agreed with the above, singular numbers like that don't mean much. The memory bandwidth of each certainly breaks that scaling.
 

bomblord

Banned
Indeed, saying that something that runs on the PS3/360 with no problems should run the same on the Wii U is like saying that something made for Windows should be able to run on a Mac, or that something made to run on an Intel CPU should have no problem producing the same performance on an AMD CPU. That is not always the case.

I don't know where these people get their understanding of how computer systems work, and they assert their beliefs so wholeheartedly on top of it. Software won't work as well on hardware it wasn't made for as it will on hardware it was, at least not without effort, even if the hardware it's being ported to is stronger.

ZOE Second Runner and Silent Hill HD are good examples of this.

http://www.cinemablend.com/games/Ko...e-Silent-Hill-HD-Collection-Fiasco-45528.html
http://www.gametrailers.com/side-mi...hd-ps3-getting-framerate-patch-sequel-on-hold
Using the logic I see people using around here toward the Wii U, I guess that means the PS3/360 can hardly keep up with, or are only slightly better than, the PS2.


The red blocks and notes in the list are clearly an indicator that the Xbox 360 is not that much better than the original Xbox.
http://en.wikipedia.org/wiki/List_of_Xbox_games_compatible_with_Xbox_360

Software will run on any platform the compiler or interpreter supports.

Software will not run well on a platform that it is not optimized for, though - I would like to make that distinction. If I write a hyperthreaded application that assumes an 8-core CPU and then try to run it on a 3-core CPU, or vice versa, we will probably have issues.

Right now we're at a point in the generation where developers' toolsets are heavily optimized for a certain setup, and that setup is not the Wii U. Until developers' skills and/or tools get to a point where they have a thorough understanding of the Wii U's setup, we will not see well-optimized ports, because the devs and compilers are assuming a different architecture. Another example I would like to use is fairly simple: I'm sure most developers' compilers are set up to offload a large number of GPU functions to the CPU (because of the way the Xbox 360 and PS3 are set up). The problem is that the Wii U's setup needs the exact opposite - the GPGPU should be handling some basic CPU functions while the CPU handles only normal computing. Once developers' toolsets have properly implemented this, I bet even brute-forced code will show a pleasant bump in performance.

The reason we will not see these same issues with the Xbox One and PS4 at launch is that they are using a standard architecture (x86) and fairly out-of-the-box PC hardware, which means they can use the heavily optimized compilers and toolsets that have existed on the platform for decades.
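
As a toy illustration of that core-count point (generic Python, purely a sketch, not console SDK code): sizing work to the machine you're actually on, instead of baking in another platform's topology, is the simplest version of the fix.

import os
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(chunk_id: int) -> int:
    # Stand-in for per-core game work (AI batches, physics islands, etc.).
    return sum(i * i for i in range(100_000))

# Hard-coding a worker count for one console's topology (say, six
# hardware threads) oversubscribes or starves a different machine;
# querying the host avoids baking the assumption in.
workers = os.cpu_count() or 1
with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(simulate_chunk, range(workers)))
print(f"ran {len(results)} chunks across {workers} workers")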
 
Impressive thread!!

I must say that the chip is much more expensive than I would have anticipated...

A big thanks to everybody involved in this analysis and to Chipworks!
 

Meelow

Banned
OK, let's go with that 160 SP number for the time being. To match the 360's performance, Latte would have to be at least 33% more efficient than Xenos ((240-160)/240) to give it a 50% boost. According to aegies, Xbox One's shaders are supposed to be 66% more efficient than Xenos. Going by that, the 768 SPs of Xbox One's GPU are equivalent to ~1275 of Xenos' shaders. That means that the XBONE has at most a little more than 5x the shader power of Wii U. Since XBONE only uses 90% of its power for games, though, that would reduce its advantage to a little less than 5x.

Assuming that the PS4 is as efficient as the Xbox, it will be at most close to 8x the shader power of Wii U. Interestingly, the XBONE appears to be close to being the midpoint between Wii U and PS4 in shader power.

Is my math in this analysis correct?

So...

Wii U = PS2
Xbox One = GameCube
PS4 = Xbox
 

A More Normal Bird

Unconfirmed Member
OK, let's go with that 160 SP number for the time being. To match the 360's performance, Latte would have to be at least 33% more efficient than Xenos ((240-160)/240) to give it a 50% boost. According to aegies, Xbox One's shaders are supposed to be 66% more efficient than Xenos. Going by that, the 768 SPs of Xbox One's GPU are equivalent to ~1275 of Xenos' shaders. That means that the XBONE has at most a little more than 5x the shader power of Wii U. Since XBONE only uses 90% of its power for games, though, that would reduce its advantage to a little less than 5x.

Assuming that the PS4 is as efficient as the Xbox, it will be at most close to 8x the shader power of Wii U. Interestingly, the XBONE appears to be close to being the midpoint between Wii U and PS4 in shader power.

Is my math in this analysis correct?

No, you're comparing shader processor count when you should be comparing FLOPS (even that is imperfect though). 1220x0.9 divided by 176 = 6.238636363... That's assuming 90% usage for the XB1 GPU, 100% usage of the theoretical flops of a 160ALU Latte and equal efficiency between them. In reality, the architectural advances of GCN would likely make the performance gap larger.
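
Spelled out as a sketch, with the same inputs (the 90% figure and the 160-ALU/176 GFLOPS Latte theory are assumptions carried over from the posts above, not confirmed specs):

xb1_gflops   = 1220                      # figure used in the post above
latte_gflops = 176                       # 160 ALUs x 2 ops/clock x 0.55 GHz
print(xb1_gflops * 0.9 / latte_gflops)   # 6.2386... -> the quoted ratio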
 

krizzx

Junior Member
For 3 generations straight Nintendo had the most powerful console? I always thought the Xbox was more powerful than the GameCube.

So...

Wii U = PS2
Xbox One = Xbox
PS4 = GameCube

?

I'm not gonna lie, I kinda miss when Nintendo had the most powerful console...Maybe next generation? Lol.

Actually, only for 2. The Genesis was stronger than the SNES. Its CPU destroyed the one in the SNES, and it could output more sprites at once. The only areas the SNES exceeded it in were coloration and sprite size. That made all the difference to most people, because more colors and bigger sprites meant most SNES games looked better, even if they played like crap comparatively.

Enough gaming history, though. If what I've been seeing is right, the Wii U GPU is more modern than people think. You can't disregard the 5 dual components, which leads me to believe the Wii U uses a dual graphics engine like the later HD6000 models, and it also has a lot of components that bear a close resemblance to later HD6000-series GPU components.

I'm leaning towards a custom mashup of HD4X/6X chip modules. That would place the technology around the same range as the other two next-gen consoles. I'd say the only real limiting factor of the GPU is its raw clock.

Does anyone know the clock of the PS4/XboxOne GPUs?
 
Sure, but that is just one factor. People in here focus way too much on one number. The system is the sum of all its parts. Again, I wouldn't compare the Wii U to the PS4/XBONE at all.

Similarly, the ROP counts seem to be 8, 16, and 32, each double that of the next one down. But agreed with the above, singular numbers like that don't mean much. The memory bandwidth of each certainly breaks that scaling.

Well, I specifically focused on the raw shader power of the SPs, and I do understand that things get a lot more complicated as we add other factors. For RAM, for example, the DDR3 RAM in the Wii U is about 1/6th the speed of the XBONE's. However, the eDRAM in the Wii U is either 30% slower or faster than the ESRAM in the XBONE, and the banks are the same size. The PS4's GDDR5 is even faster than that. With the cache sizes, latency, and setup factored in, though, I'm not sure how we can accurately compare the memory subsystems.

This is assuming that the Wii U is not subject to the same drop in efficiency, if not worse, due to its repeatedly confirmed poor documentation and early firmware.

I was specifically talking about the raw performance. It's safe to assume that neither of these systems will perform close to their max anytime soon.
 

Meelow

Banned
Actually, only for 2. The Genesis was stronger than the SNES. Its CPU destroyed the one in the SNES, and it could output more sprites at once. The only areas the SNES exceeded it in were coloration and sprite size. That made all the difference to most people, because more colors and bigger sprites meant most SNES games looked better, even if they played like crap comparatively.

Enough gaming history, though. If what I've been seeing is right, the Wii U GPU is more modern than people think. You can't disregard the 5 dual components, which leads me to believe the Wii U uses a dual graphics engine like the later HD6000 models, and it also has a lot of components that bear a close resemblance to later HD6000-series GPU components.

I'm leaning towards a custom mashup of HD4X/6X chip modules. That would place the technology around the same range as the other two next-gen consoles. I'd say the only real limiting factor of the GPU is its raw clock.

Does anyone know the clock of the PS4/XboxOne GPUs?

So the Genesis and SNES could each do some things better than the other.

In your opinion, how well will the best Wii U game look compared to the best PS4/One game?
 

krizzx

Junior Member
So the Genesis and SNES could each do some things better than the other.

In your opinion, how well will the best Wii U game look compared to the best PS4/One game?
Those were only 2 small areas out of a huge number of things that the Genesis dwarfed the SNES in. The Genesis, using just its core hardware, could run FPS games better and produce more polygons. The Genesis also had a far superior sound system.

If you tried to run this on the SNES, it would likely run at 1/3 the frame rate on top of having a shorter drawing distance. Blast processing wasn't just an advertisement.
http://www.youtube.com/watch?v=9doqwl-U7jU



As for how the Wii U games will look comparatively, that is mostly up to the devs. If you want an idea of what the Wii U is capable of, look at the garden tech demo:
http://www.youtube.com/watch?v=i2Nsa06KRLo

The limiting factor this gen will undoubtedly be money. We saw how many studios were run out of business by the high costs of last gen. http://www.lazygamer.net/general-ne...-studios-that-have-closed-in-this-generation/ That list only covers up to 2011, and there have been more since then, like Big Huge Games and THQ.

As far as basic games go, there shouldn't be too much difference other than the PS4/XboxOne version normally outputting at a higher resolution and using higher-res textures more often. Only the games that have tremendous budgets will be able to reach a scale that the Wii U would not be able to easily achieve parity with. Though, we are talking some HUGEEEEE budgets. I'm not looking forward to the studio closures that the PS4 will bring, because I'm certain the game sales are not going to be high enough on average to cover the costs.

We've reached an era where million sellers aren't enough anymore. Million sellers don't even break even sometimes with modern big budget games.
 

Meelow

Banned
Those were only 2 small areas out of a huge number of things that the Genesis dwarfed the SNES in. The Genesis, using just its core hardware, could run FPS games better and produce more polygons. The Genesis also had a far superior sound system.

If you tried to run this on the SNES, it would likely run at 1/5 the frame rate on top of having a shorter drawing distance.
http://www.youtube.com/watch?v=9doqwl-U7jU



As for how the Wii U games will look comparatively, that is mostly up to the devs.

The limiting factor this gen will undoubtedly be money. We saw how many studios were run out of business by the high costs of last gen. http://www.lazygamer.net/general-ne...-studios-that-have-closed-in-this-generation/ That list only covers up to 2011, and there have been more since then, like Big Huge Games and THQ.

As far as basic games go, there shouldn't be too much difference other than the PS4/XboxOne version normally outputting at a higher resolution and using higher-res textures more often. Only the games that have tremendous budgets will be able to reach a scale that the Wii U would not be able to easily achieve parity with. Though, we are talking some HUGEEEEE budgets. I'm not looking forward to the studio closures that the PS4 will bring, because I'm certain the game sales are not going to be high enough on average to cover the costs.

We've reached an era where million sellers aren't enough anymore. Million sellers don't even break even sometimes with modern big budget games.

Ah, I know Sega brought out the Sega CD and 32X to make that difference even better, but with Nintendo releasing DKC, which looked better than most Genesis games, I don't know if Sega should have just used the money from the R&D of the Sega CD and 32X for the Saturn.

It is very sad about the huge list of companies that went out of business last generation. I'd rather see worse-looking next-gen games than companies going all out on the graphics and losing millions from it.

I do wonder what the first game to show the real difference between Wii U, PS4 and Xbox One will be. I doubt it's COD Ghosts, since it looked good but not really anything "next gen" - maybe Watch Dogs? And possibly more games we don't know about.

I'm really curious to see the comparison.
 
Ok, elaboration time.

Disclaimer: I'm using only the high-res Brazos die-shot + detailed floorplan and the high-res Latte die-shot + 'unknowns' floorplan, as those provide sufficiently detailed pictures where I don't have to guess the block boundaries /disclaimer

First, a foreword: I do agree that determining the number of TMUs and ROPs can be used as a good guideline for determining the number of SPs. A quick look at R700 and R800 historical data reveals that for those two gens (a small sketch after the list shows how these ratios bound an estimate):
  • SP-to-TMU ratio is in the set of { 10, 20 }-to-1
  • SP-to-ROP ratio is in the set of { 20, 40, 45, 50, 70, 80 }-to-1
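
Here's that sketch - purely illustrative, with hypothetical 8-TMU/8-ROP readings rather than anything measured off the die:

sp_per_tmu = {10, 20}                    # R700/R800 SP-to-TMU ratios
sp_per_rop = {20, 40, 45, 50, 70, 80}    # R700/R800 SP-to-ROP ratios

tmus, rops = 8, 8                        # hypothetical block readings
from_tmus = {tmus * r for r in sp_per_tmu}   # {80, 160}
from_rops = {rops * r for r in sp_per_rop}   # {160, 320, 360, 400, 560, 640}
print(sorted(from_tmus & from_rops))         # [160] -> the one consistent SP count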

Where I disagree with you is your TMU count estimate, for nothing else but your block reading. First, I'm of the opinion that the original 'bunch of unknowns' floorplan from Thraktor's OP is still perfectly valid - each block in that plan is actually a block. That includes the doubles. So what you call block S is, in my opinion, indeed two identical blocks S1 and S2 + some material from T1 and T2 (which, while not identically laid out, are still apparent doubles).

Hmm, I think I follow you. Perhaps my awesome paint skills didn't portray so clearly what I had in mind, though. I agree with the floor plan in the OP as well. S1 and S2 are clearly separate blocks. I believe that Latte is using separate blocks for its L1 caches, which are aligned with the TMUs (T1 and T2), which are also separate blocks - similar to RV770 and Llano. About the only thing I find different between the design of Latte and those two chips is that the TMUs do not come into direct contact with the shader blocks. Actually, if I'm looking at the RV770 die photo correctly, it looks like the L1 lies between the shader cores and the TMUs. And wait, is that LDS between them as well? I've heard people claim that little sliver is shader redundancy, but looking at block diagrams, I'm not so sure.

Brazos is kind of the odd man out in its layout design, with all the L1 and L2 cache in a single block. I believe you can still make the connection though, with the SRAM config I described - the 16 banks and all. Despite the single block, there must still be two separate pools of L1 cache on Brazos, as we know it has 8 TMUs and the block diagrams of the architecture show the caches aligned with the TMUs - just like R700 and most other modern designs.
 
For 3 generations straight Nintendo had the most powerful console? I always thought the Xbox was more powerful than the GameCube.

So...

Wii U = PS2
Xbox One = Xbox
PS4 = GameCube

?

I'm not gonna lie, I kinda miss when Nintendo had the most powerful console...Maybe next generation? Lol.

If we take architecture into consideration, I think a better comparison would be:

Wii U = Dreamcast (very notable gap between it and the others in most of its specs, but efficient for what it is. The GPU features, however, are a lot closer to the other systems' than the Dreamcast's were to the PS2/GCN/XBOX.)

XBONE = GameCube (powerful for what it is, and efficient. Has an "interesting" memory system.)

PS4 = Wii (very similar architecture, GPU raw power boosted 50% over XBONE, better work RAM. Unlike GCN to Wii, the CPU is probably the same performance, and the memory setup is more complicated to compare.)

I don't believe any of the systems are imbalanced like the PS2 was, and PCs would be more like the Xbox, except that the gap between them and the others will significantly increase over time. :)

No, you're comparing shader processor count when you should be comparing FLOPS (even that is imperfect though). 1220x0.9 divided by 176 = 6.238636363... That's assuming 90% usage for the XB1 GPU, 100% usage of the theoretical flops of a 160ALU Latte and equal efficiency between them. In reality, the architectural advances of GCN would likely make the performance gap larger.
Heh, if we are talking about flops it gets more bizarre, since the 176 GFLOPS theory goes along with the conjecture that Latte can match or outperform Xenos' 240 GFLOPS. That would require the Wii U's FLOPS to be at least 27% more efficient than the ones in Xenos. Since FLOPS is apparently a variable instead of a constant, that makes things a bit tricky. I'm also not convinced yet that a 176 GFLOPS Latte would give us the performance that we are seeing in games like Trine 2 and Need for Speed...
 
Considering it's just a simple guess of a post, you really went hard on him. Just speculation, dude.
It was so wrong, though. For example, PS4 is probably like 8x more powerful than the Wii U and he puts it as Xbox (PS4) vs. PS2 (Wii U)... how does this make any sense? Someone with even the most rudimentary knowledge of last gen hardware knows that's not true. Same deal with his Gamecube (Xbox One) vs. PS2 (Wii U) comparison... how do you make a post like that? Consistently the same type of smh-worthy posts.
 

krizzx

Junior Member
Ah, I know Sega brought out the Sega CD and 32X to make that difference even better, but with Nintendo releasing DKC, which looked better than most Genesis games, I don't know if Sega should have just used the money from the R&D of the Sega CD and 32X for the Saturn.

It is very sad about the huge list of companies that went out of business last generation. I'd rather see worse-looking next-gen games than companies going all out on the graphics and losing millions from it.

I do wonder what the first game to show the real difference between Wii U, PS4 and Xbox One will be. I doubt it's COD Ghosts, since it looked good but not really anything "next gen" - maybe Watch Dogs? And possibly more games we don't know about.

I'm really curious to see the comparison.
SEGA did not release the add-ons to match Nintendo. Add-ons themselves were a fad at the time, and SEGA was joining in. The SEGA CD was made to match the PC Engine/TurboGrafx CD expansion, and the 32X was just an attempt at producing next-gen graphics without actually making a new system.

Games like Star Fox, Yoshi's Island and Donkey Kong Country used a special enhancement chip embedded in the cartridge. The SNES could not do that normally.


Don't expect much from Watch Dogs, because I think it has been unofficially confirmed to just be a port of the 360 version and runs worse. I'd say look to Arkham Origins instead. The team making it are the ones who ported Arkham City to the Wii U, so they have more experience with making Batman on the Wii U than on the 360/PS3. Though, don't expect any multi-platform game to show a significant difference from one version to another at this scale. Most devs aren't going to drop that kind of money on their own.

Only the Wii U's exclusives will really show its capability. Once again, I point to the garden demo. This should be the base of what to expect from the Wii U http://www.youtube.com/watch?v=i2Nsa06KRLo
 

atbigelow

Member
Those were only 2 small areas out of a huge number of things that the Genesis dwarfed the SNES in. The Genesis, uses just its core hardware could run FPS games better and produce more polygons. The Genesis also had a far superior sound system.


Whooaoaoaa there. While the Genesis' Yamaha synth chip is pretty rad and makes awesome clean music, I'm not sure you could call it far superior to the SPC. Matter of fact, I think you'd be hard-pressed to make the argument that it was superior overall.
 

Meelow

Banned
SEGA did not release the add-ons to match Nintendo. Add-ons themselves were a fad at the time, and SEGA was joining in. The SEGA CD was made to match the PC Engine/TurboGrafx CD expansion, and the 32X was just an attempt at producing next-gen graphics without actually making a new system.

Games like Star Fox, Yoshi's Island and Donkey Kong Country used a special enhancement chip embedded in the cartridge. The SNES could not do that normally.


Don't expect much from Watch Dogs, because I think it has been unofficially confirmed to just be a port of the 360 version and runs worse. I'd say look to Arkham Origins instead. The team making it are the ones who ported Arkham City to the Wii U, so they have more experience with making Batman on the Wii U than on the 360/PS3.

Yeah, add-ons in the '90s were something all right - the Zapper, etc.

And Watch Dogs was in development alongside the PS4 and Xbox One versions, while the PS3/360 version started after. We have no idea what the Wii U version will look like.
 

JordanN

Banned
When the PS2/GC/Xbox launched, you already had an idea of how the order of power went (e.g. Metal Gear Solid 2, Super Smash Bros. Melee, Dead or Alive 3).

Can't say the same about the Wii U. So far, the PS4/XBO have both got games that are worlds apart from the Wii U in both IQ (1080p vs 720p) and other effects (ray-traced reflections, particles, etc.).

The Wii U, whether through weak hardware or "tools", has a hurdle the PS2/GC/Xbox never had, and that is being closer to last gen (PS3/360) than next gen (PS4/XBO).
 
It was so wrong, though. For example, PS4 is probably like 8x more powerful than the Wii U and he puts it as Xbox (PS4) vs. PS2 (Wii U)... how does this make any sense? Someone with even the most rudimentary knowledge of last gen hardware knows that's not true. Same deal with his Gamecube (Xbox One) vs. PS2 (Wii U) comparison... how do you make a post like that? Consistently the same type of smh-worthy posts.
Well, I don't care if it was wrong or not; we're talking about your animosity lol
 

krizzx

Junior Member
Yeah, add-ons in the '90s were something all right - the Zapper, etc.

And Watch Dogs was in development alongside the PS4 and Xbox One versions, while the PS3/360 version started after. We have no idea what the Wii U version will look like.

You are right. We don't know what it will look like. Things are still looking like the Wii U version will be an afterthought amongst the other ones. Same with Project CARS. The last I saw in the Project CARS thread, the 360/PS3 versions had plenty of progress but the Wii U version hadn't even been started on. It's likely just going to be a port of the PS3 or 360 version and suffer port problems like most others on the console.

Don't get your hopes up.

Only the games that are built from the ground up using tools that were "made for the Wii U" will really show what the Wii U can do.
It was so wrong, though. For example, PS4 is probably like 8x more powerful than the Wii U and he puts it as Xbox (PS4) vs. PS2 (Wii U)... how does this make any sense? Someone with even the most rudimentary knowledge of last gen hardware knows that's not true. Same deal with his Gamecube (Xbox One) vs. PS2 (Wii U) comparison... how do you make a post like that? Consistently the same type of smh-worthy posts.



8X? Those consoles aren't even 8X stronger than their last-gen counterparts. I'd be stretching to say 3X, which is about how much more capable the GC proved to be than the PS2.
 

Meelow

Banned
You are right. We don't know what it will look like. Things are still looking like the Wii U version will be an afterthought amongst the other ones. Same with Project CARS. The last I saw in the Project CARS thread, the 360/PS3 versions had plenty of progress but the Wii U version hadn't even been started on. It's likely just going to be a port of the PS3 or 360 version and suffer port problems like most others on the console.

Don't get your hopes up.

Only the games that are built from the ground up using tools that were "made for the Wii U" will really show what the Wii U can do.

Well, it is Ubisoft, and they haven't done the Wii U wrong so far (I'll ignore Rayman).

Didn't the developers of Project Cars say great things about the Wii U version?

Here it is: http://mynintendonews.com/2013/04/2...ing-sure-wii-u-doesnt-get-the-crappy-version/
 

krizzx

Junior Member
Well, it is Ubisoft, and they haven't done the Wii U wrong so far (I'll ignore Rayman).

Didn't the developers of Project Cars say great things about the Wii U version?

Here it is: http://mynintendonews.com/2013/04/2...ing-sure-wii-u-doesnt-get-the-crappy-version/

That strangely came out after I was told that the Wii U version hadn't even been started on in the main Project C.A.R.S. thread on GAF by people who were actually involved. I thought it was kind of odd.
 

A More Normal Bird

Unconfirmed Member
Heh, if we are talking about flops it gets more bizarre, since the 176 GFLOPS theory goes along with the conjecture that Latte can match or outperform Xenos' 240 GFLOPS. That would require the Wii U's FLOPS to be at least 27% more efficient than the ones in Xenos. Since FLOPS is apparently a variable instead of a constant, that makes things a bit tricky. I'm also not convinced yet that a 176 GFLOPS Latte would give us the performance that we are seeing in games like Trine 2 and Need for Speed...

You do know that the 176 GFLOPS theory is the same as the 160-ALU theory you're using, right? 160 ALUs at 550MHz = 176 GFLOPS. You can't just compare the processor count of Latte to the XB1/PS4 GPUs when those GPUs are clocked higher. There's also little to be gained in taking efficiency figures that we have no context for (let alone invented ones) and making them a cornerstone of any comparison.

FLOPS are an imperfect metric. Without explicit information from developers about the hardware's capability or where the bottlenecks in their software were, we can't know if or how much of a role is played by memory, API, featureset or hardware efficiency in making the Wii U perform better than the 360.
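
For reference, the identity behind those numbers (each ALU retires one multiply-add, i.e. two floating-point ops, per clock on these architectures):

def peak_gflops(alus: int, mhz: float) -> float:
    # Theoretical peak = ALUs x 2 ops/clock (multiply-add) x clock speed.
    return alus * 2 * mhz / 1000.0

print(peak_gflops(160, 550))   # 176.0 -> the 176 GFLOPS Latte theory
print(peak_gflops(240, 500))   # 240.0 -> Xenos' oft-quoted 240 GFLOPS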
 

Meelow

Banned
That strangely came out after I was told that the Wii U version hadn't even been started on in the main Project C.A.R.S. thread on GAF by people who were actually involved. I thought it was kind of odd.

I guess we'll see what happens when it gets shown.
 

bomblord

Banned
If we take architecture into consideration, I think a better comparison would be:

Wii U = Dreamcast (very notable gap between it and the others in most of its specs, but efficient for what it is. The GPU features, however, are a lot closer to the other systems' than the Dreamcast's were to the PS2/GCN/XBOX.)

XBONE = GameCube (powerful for what it is, and efficient. Has an "interesting" memory system.)

PS4 = Wii (very similar architecture, GPU raw power boosted 50% over XBONE, better work RAM. Unlike GCN to Wii, the CPU is probably the same performance, and the memory setup is more complicated to compare.)

I don't believe any of the systems are imbalanced like the PS2 was, and PCs would be more like the Xbox, except that the gap between them and the others will significantly increase over time. :)


Heh, if we are talking about flops it gets more bizarre, since the 176 GFLOPS theory goes along with the conjecture that Latte can match or outperform Xenos' 240 GFLOPS. That would require the Wii U's FLOPS to be at least 27% more efficient than the ones in Xenos. Since FLOPS is apparently a variable instead of a constant, that makes things a bit tricky. I'm also not convinced yet that a 176 GFLOPS Latte would give us the performance that we are seeing in games like Trine 2 and Need for Speed...

I'm not going to pretend I know anything about high-level system hardware, but I do have some pretty good rudimentary computer knowledge, so I hope I'm not completely off base when I ask this.

How can you have a more "efficient" flop? Isn't a flop a direct measurement of how many of a certain kind of calculation are performed in a set amount of time? If you're doing the calculations more efficiently, then they should be completed faster, which would increase the total flops.
 

krizzx

Junior Member
I'm not going to pretend I know anything about high-level system hardware, but I do have some pretty good rudimentary computer knowledge, so I hope I'm not completely off base when I ask this.

How can you have a more "efficient" flop? Isn't a flop a direct measurement of how many of a certain kind of calculation are performed in a set amount of time?

You are right.
FLOPS = Floating-point Operations Per Second

FLOPS are a measure of floating-point performance. Nothing more, but people seem to treat it like the absolute, ultimate measurement of hardware performance when it's not. Real-world FLOPS rarely live up to the boasts. Bragging about it is little more than a marketing gimmick, like Sony did with the 2 TFLOPS performance claim for the CELL-powered PS3. Sony lied their ass off and people ate it for breakfast.
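
To square this with the "efficient flop" question above: the unit itself can't be more efficient, but achieved throughput is the paper peak times however much of the hardware you actually keep busy. A sketch with made-up utilization figures, purely for illustration:

peak = 240.0                         # GFLOPS, e.g. Xenos' paper spec
for util in (0.5, 0.7, 0.9):         # hypothetical utilization levels
    print(f"{util:.0%} busy -> {peak * util:.0f} GFLOPS achieved")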
 

Sipheren

Banned
How can you have a more "efficient" flop? Isn't a flop a direct measurement of how many of a certain kind of calculation are performed in a set amount of time? If you're doing the calculations more efficiently, then they should be completed faster, which would increase the total flops.

I would like to know this as well. I find FLOPS to be a very unreliable way to measure performance. I have 2x HD 6850s in my PC, and combined they produce 2976 GFLOPS, but apparently the new PS4 is the greatest thing ever and it only produces about 1840 GFLOPS, so there must be a lot more to it than these numbers, and I would love someone in the know to give an explanation of it.
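
Those figures do check out as theoretical peaks (HD 6850: 960 ALUs at 775 MHz; PS4: 1152 ALUs at 800 MHz), which rather underlines the question - the paper totals ignore everything that determines delivered performance:

hd6850 = 960 * 2 * 0.775     # 1488.0 GFLOPS per card
print(hd6850 * 2)            # 2976.0 -> the quoted CrossFire total

ps4 = 1152 * 2 * 0.8         # 1843.2 -> the PS4's ~1.84 TFLOPS
print(ps4)
# Peak totals ignore CrossFire scaling losses, VLIW5-vs-GCN differences,
# memory bandwidth and API overhead - hence the mismatch with perception.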
 