
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


lord pie

Member
This is off topic for this thread, but the difference is that the Gekko/Espresso architecture is much more efficient at parallel processing than x86. The PS4's and Xbone's inefficient x86 architecture forces them to have a high number of cores so they can "hack" parallel processing into an architecture that was never designed for it. Espresso was built for it, which may explain why it was able to run games designed for three 360 cores on its main core.

...

1) IBM considered putting multiple cores in GCN. They decided to go with a superscalar design instead (I actually had no idea what that meant, but then I Wikipedia'd it...)

Erm. Pardon me, but I'm going to be very blunt...

Processor architectures are extremely complex systems. Making such a broad claim (that x86 is inefficient and that parallel processing is 'hacked in') is a gross display of ignorance. A claim like that requires substantial supporting evidence and sound reasoning, and you provide neither.
You are also claiming the reverse is true for the Espresso CPU - again with no evidence other than anecdotal correlation.

The problem I have with this is that many people read these forums to learn or find clarification, and often take a post as trustworthy when they lack the knowledge to properly evaluate its content. In blunter terms: I fear people will believe you.

Further, you later show that you lack understanding of a fundamental processor architecture feature (superscalar execution), and you additionally quote Wikipedia selectively in a misleading way... I'm really not sure what to say. If you do not understand superscalar, then honestly what made you feel qualified to make such broad and sweeping statements earlier?
 
If you do not understand superscalar, then honestly what made you feel qualified to make such broad and sweeping statements earlier?
Logic. If the "ancient" Power architecture of the Gekko-derived Espresso is so limited, how could it possibly run games designed for the 360's CPU and Cell (with shitty tools on its end, at that)? I've made no claim that the CPU is faster than the PS4's or Xbone's, just that it's more efficient with what it has. I'm certainly no electronics expert and never claimed to be one. But what I do do well is research.

And yes, I would describe x86 parallelism as "hacked in" when the Power architecture can do the same calculations with fewer cores (literally a more power-efficient solution). Unless I'm mistaken, the primary reason we still use x86 at all is that Microsoft relies on it for compatibility across Windows generations.
 

AzaK

Member
I believe it is a given that it will take resources to drive an independent, fully 3D screen on the pad. The question here is what modifications were made to keep the system from having inconsistent performance. Nintendo has a huge focus on consistent performance. The DS, for example, had two 2D GPUs and a 3D GPU that "forced" 60fps on one screen or split 30fps across both, so the system delivered consistent performance. In the case of the 3DS, they doubled the GPU clock speed to handle the extra load when the system renders in stereoscopic 3D. I would expect Nintendo designed the Wii U to still hit its target performance even with two independent 3D screens.

Of course, they can likely get "better than desired" results if all resources go to one screen. If not, then silicon is sitting idle and wasted, something I doubt Nintendo would allow.
 
Of course, they can likely get "better than desired" results if all resources go to one screen. If not, then silicon is sitting idle and wasted, something I doubt Nintendo would allow.
I agree with that. The point is that Nintendo likely made sure the system could achieve a certain level of performance while rendering at least two fully independent 3D displays. The problem is that we don't know what the performance target was. :)
 

z0m3le

Banned
None of what you list here is even remotely fact, especially the FLOPS, and it certainly doesn't tell you the capability of any of the hardware. The probable FLOPS performance is more likely in the 200-250 GFLOPS range, not 176 or 352.

Clocks alone mean nothing. If they were the main measure of performance, the PS4 CPU would be weaker than Cell, which has 8 SPUs at 3.2 GHz that people (including a dev) were swearing were full-fledged cores in the Espresso thread.

You can't just write off the rest of the RAM, and only 300 MB of the Wii U's RAM is used for system files. That's some of the info that was confirmed through vgleaks. According to the docs, the other 700 MB either isn't being used at all or nobody knows what it's used for, and it will likely be put to use for games or game-related features in future firmware updates.

The only options for GFLOPS are 176, ~282, or 352. The 8 SPUs have either 20, 32, or 40 SPs/ALUs each, giving you 160, 256, or 320 ALUs in total. You take the ALU count, multiply it by 2 (one multiply-add per ALU per clock), and then multiply that by the GPU clock in GHz, so 160 ALUs would look like this: 160 x 2 x 0.550 GHz = 176 GFLOPS.

I'm not opposed to any of those three numbers being real, though if it is 32 ALUs per SPU it would have to be VLIW4, as the ALUs (shaders) would have to be grouped in counts divisible by 4; VLIW5 would mean they have to be divisible by 5.
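For anyone wanting to check the arithmetic, here is a minimal sketch of the calculation described above; the ALU counts are the post's candidate figures, not confirmed hardware specs:

```python
# Candidate GFLOPS figures for Latte: ALUs x 2 FLOPs (one multiply-add
# per ALU per clock) x clock in GHz. The ALU counts are speculative.
GPU_CLOCK_GHZ = 0.550  # reported 550 MHz Wii U GPU clock

for alus in (160, 256, 320):  # 8 SPUs x 20, 32, or 40 SPs each
    gflops = alus * 2 * GPU_CLOCK_GHZ
    print(f"{alus} ALUs -> {gflops:.1f} GFLOPS")
# 160 ALUs -> 176.0 GFLOPS
# 256 ALUs -> 281.6 GFLOPS
# 320 ALUs -> 352.0 GFLOPS
```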

If the GPU die is 40nm, I don't think 160 ALUs makes sense, especially because AMD's DX10.1 shaders are quite a bit smaller than its DX11 shaders. I'm also of the opinion that Nintendo built the Wii U's architecture as a basis for all future hardware. Using 45nm for the GPU and CPU would point to a plan to shrink them in the future and bring some form of the hardware to their next handheld. When you consider all the tools they are making for it (their own version of the Unity engine, Nintendo Web Framework, VC again), you start to see that they are marrying themselves to the Wii U's architecture for the next decade, not just the next four years.
 

Argyle

Member
Well, if you read the post (and the IBM document I originally quoted), the Gekko CPU was designed to run parallel operations (superscalar) even though it was one core, and according to the IBM document, IBM went with this design in lieu of multiple cores for Gekko. Then the second article states that the Power design in general requires fewer cores than x86 to run the same parallel operations. It's not much of a leap to conclude that Power is by design more efficient than x86 (which is why Power is still very popular in server applications). All of this makes sense in light of the fact that x86 was created in the 1970s and Power was developed in the late '80s and early '90s.

Pretty sure nearly everything is superscalar nowadays, including the current-gen consoles (PS3/360)... hell, most if not all cellphones have superscalar processors now. x86 processors have been superscalar since the original Pentium. This is not new and amazing tech anymore.

Keep in mind that each implementation of Power may have completely different architecture under the hood, so comparing current generation server processors with Gekko is like saying that the PS4 is awesome because the latest Intel Core i7 processors are super fast.
 
High motion scenes with a lot of variety in colour are naturally more susceptible to compression artefacts.

You don't understand my question. There are plenty of other games out there with a lot of colour variety and high-motion scenes, such as Monster Hunter 3 Ultimate or even Nintendo Land. Those games look infinitely better on the Wii U GamePad's screen than Sonic Transformed. I swear my copy of Sonic Transformed looks bad even when the picture is more or less static. Does this mean the games themselves handle Wii U GamePad streaming (and all that comes with it, such as bandwidth) rather than the console, which might result in inconsistent quality across games? Or does it have something to do with Sonic Transformed not rendering natively at 720p but below?
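As a rough sanity check on why the stream needs aggressive compression at all, here is a back-of-envelope sketch; the panel resolution is the GamePad's 854x480, but the usable link budget is an assumption picked purely for illustration:

```python
# Uncompressed bitrate of the GamePad stream vs. an assumed wireless
# budget. Shows why the video must be heavily (lossily) compressed,
# which is where high-motion artifacts come from.
W, H, FPS = 854, 480, 60        # GamePad panel, 60 Hz stream
BYTES_PER_PIXEL = 3             # 24-bit RGB, uncompressed

raw_mbps = W * H * BYTES_PER_PIXEL * FPS * 8 / 1e6
print(f"uncompressed: {raw_mbps:.0f} Mbit/s")        # ~590 Mbit/s

ASSUMED_LINK_MBPS = 30          # hypothetical usable 5 GHz budget
print(f"compression needed: ~{raw_mbps / ASSUMED_LINK_MBPS:.0f}:1")
# At a fixed bitrate, frames where many pixels change (high motion)
# get fewer bits per changed pixel, so blocking becomes visible.
```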
 
Keep in mind that each implementation of Power may have completely different architecture under the hood, so comparing current generation server processors with Gekko is like saying that the PS4 is awesome because the latest Intel Core i7 processors are super fast.
I understand the implementation differs per chip; what I said was that, in general, Power processors are more efficient than x86 for certain applications. More efficient != faster or more powerful. Using fewer cores to do the same amount of work as an x86 is the definition of more efficient, as it requires less energy to power them. Now, that doesn't mean the Wii U's chip is efficient relative to current chips (if it is indeed a direct derivative of the PowerPC 750CL). But that's only true if you believe the CPU is basically an unmodified 750, which it may or may not be.

With regard to Gekko/Espresso handling parts of the graphics load: Luigi's Mansion, a GameCube launch game, had parts of its lighting rendered by the CPU. If it was done then, it can be done now. That doesn't mean it's as fast as Cell, the 360's, or any other CPU, but the functionality is there and it's usable.
 

strata8

Member
Pretty sure nearly everything is superscalar nowadays, including the current-gen consoles (PS3/360)... hell, most if not all cellphones have superscalar processors now. x86 processors have been superscalar since the original Pentium. This is not new and amazing tech anymore.

Exactly. Modern big-core x86 processors (Bulldozer/Piledriver, IVB/Haswell) all issue 4 instructions per cycle. Espresso is 3-issue, and the Jaguar cores used in the PS4/XB1 are 2-issue. Issue width is only one aspect of core performance, though (Jaguar has much higher performance per clock than other 2-issue architectures, for example).
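To put rough numbers on that, here is a small sketch of peak issue rate (instructions per cycle times clock); the clock speeds are the commonly reported figures, not something from the post itself:

```python
# Peak issue rate = issue width x clock. A crude ceiling only: real
# throughput depends on the workload, caches, and per-clock efficiency.
cores = {
    "Espresso (3-issue)": (3, 1.24),  # GHz, commonly reported
    "Jaguar (2-issue)":   (2, 1.6),   # reported PS4 clock
    "Haswell (4-issue)":  (4, 3.5),   # desktop-class clock
}
for name, (width, ghz) in cores.items():
    print(f"{name}: {width * ghz:.2f} G instructions/s per core (peak)")
```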

Also this quote GameGuru posted:
"Let’s also look at x86 versus Power servers. Although x86 is good at processing many fast threads, it can only execute two threads per cycle. Power Systems servers can execute four. What this means is right off the bat, Power technology is twice as powerful. Power servers are also known to do compute-intensive jobs more efficiently. Both systems perform well in processing parallel tasks, but to scale x86, you must throw more processing cores into the configuration—more cores burn more power and vendors often charge for applications based on the number of cores. So using x86 solutions may drive up license costs. My point is that, despite what many IT buyers think, x86 is not the answer to running the most optimized solutions, as it doesn’t do every job optimally."

That has nothing to do with POWER vs. x86. It's referring to SMT, which is a way to increase the performance of a single core dealing with multiple threads, but an SMT thread is not even close to a full core. Intel's processors and the Xbox 360 can handle 2 threads per core, while IBM's POWER server chips can handle 4 threads per core. The Wii U doesn't have SMT at all.
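A quick way to see the distinction: SMT multiplies hardware threads, not cores. A trivial sketch, with core counts and SMT ways as commonly described (POWER7 standing in as the 4-way server example):

```python
# Hardware threads = cores x SMT ways. An SMT thread shares one core's
# execution resources; it is nowhere near a full extra core.
chips = {
    "Xbox 360 Xenon": (3, 2),  # 3 cores, 2-way SMT
    "IBM POWER7":     (8, 4),  # server chip, 4-way SMT
    "Wii U Espresso": (3, 1),  # no SMT
}
for name, (cores, smt) in chips.items():
    print(f"{name}: {cores} cores, {cores * smt} hardware threads")
```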
 
I understand the implementation differs per chip; what I said was that, in general, Power processors are more efficient than x86 for certain applications.

Which is not true. PowerPC or x86 is just the instruction set. Anything can be under the hood as long as it implements that.

With regard to Gekko/Espresso handling parts of the graphics load: Luigi's Mansion, a GameCube launch game, had parts of its lighting rendered by the CPU. If it was done then, it can be done now. That doesn't mean it's as fast as Cell, the 360's, or any other CPU, but the functionality is there and it's usable.

That's no special functionality. CPUs can calculate anything; that's what they're for. The only thing stated here is that they enhanced floating-point performance compared to earlier IBM designs by adding support for 2-way SIMD ("paired singles"). The equivalent on Intel's and AMD's CPUs is called SSE.
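For readers unfamiliar with the term, here is a toy illustration of what 2-way SIMD means; plain Python standing in for hardware, so the function below is purely hypothetical:

```python
# "Paired singles": one instruction applies the same float operation to
# a pair of values, doubling FLOPs per instruction for suitable code.
def ps_madd(a, b, c):
    """Toy paired-single multiply-add: two multiply-adds in one step."""
    return (a[0] * b[0] + c[0], a[1] * b[1] + c[1])

# One 'instruction' updates an (x, y) pair instead of a single scalar:
print(ps_madd((1.0, 2.0), (0.5, 0.5), (10.0, 20.0)))  # (10.5, 21.0)
```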
 
So do we really think they are using the CPU for graphical processing? Wouldn't we think the GPU is capable of doing all the stuff the CPU can do for graphics?
 
So do we really think they are using the CPU for graphical processing? Wouldn't we think the GPU is capable of doing all the stuff the CPU can do for graphics?
No. Yes, and it should be doing them much more efficiently. That's why GPUs exist, is it not? That's why the PS360 setup never made sense to me...
 
It just struck me as odd that a poster thought they'd use it for things like lighting because the GC did.
I'm sure that functionality is still there for BC with Wii, but what good would Wii-level assistance be when you have a full-featured GPU? It'd probably be like trying to cook a steak with a match instead of the fire pit right next to it. But who knows...
 
I'm sure that functionality is still there for BC with Wii, but what good would Wii-level assistance be when you have a full-featured GPU? It'd probably be like trying to cook a steak with a match instead of the fire pit right next to it. But who knows...
I understand what you mean. From what we are seeing so far, Nintendo designed the system to shift the extra tasks away from the CPU and offload them to the DSP, the ARM cores, and especially the GPU. This is not unique, since the other next-gen consoles are designed in a similar way. There was, however, a heavier reliance on the CPU in the 360 and especially the PS3.
 
Which is not true. PowerPC or x86 is just the instruction set. Anything can be under the hood as long as it implements that.

That's true, but the whole "PPC is more efficient" mindset has a basis in historical fact. Back in the '90s, PPCs WERE more efficient, as they were RISC machines. Intel kept things competitive by giving x86 processors a RISC core with some extra logic to translate the x86 instruction set into a reduced instruction set.

That translation logic represented a significant number of transistors at the time, meaning PPCs could do the same work with fewer transistors, hence the perception of efficiency. However, as processors used more and more transistors, the number needed to translate x86 instructions didn't grow proportionally; it became an insignificant portion of the processor, eliminating the PPC advantage.
 
That's true, but the whole "PPC is more efficient" mindset has a basis in historical fact. Back in the '90s, PPCs WERE more efficient, as they were RISC machines. Intel kept things competitive by giving x86 processors a RISC core with some extra logic to translate the x86 instruction set into a reduced instruction set.

That translation logic represented a significant number of transistors at the time, meaning PPCs could do the same work with fewer transistors, hence the perception of efficiency. However, as processors used more and more transistors, the number needed to translate x86 instructions didn't grow proportionally; it became an insignificant portion of the processor, eliminating the PPC advantage.

Correct. Maybe I should have added that as an additional explanation (though even back then it had little to do with specific applications being more efficient).
 

69wpm

Member
From the sounds of the IGN review, there's barely any difference between the PS3/360 and Wii U versions of Splinter Cell.

If the Wii U version has tearing, I will cancel my pre-order. Let's wait for DF though; I can't believe Ubisoft was so lazy that they didn't optimise the Wii U version at all.

Edit: After reading the review, it seems the PS360 versions have screen tearing while the Wii U version has frame drops, I guess because of vsync. Now it all depends on how low it gets.
 
If the Wii U version has tearing, I will cancel my pre-order. Let's wait for DF though; I can't believe Ubisoft was so lazy that they didn't optimise the Wii U version at all.

Not unfinished dev kits or old dev tools or small budgets, but lazy developers now?

I can't wait to see the excuses when the WiiU versions of CoD: Ghosts, Assassin's Creed IV and Watch Dogs don't compare favorably to the PS360 versions, never mind the PS4/XBO versions...

Maybe, when all is said and done, the hardware just isn't good enough?
 
If the Wii U version has tearing, I will cancel my pre-order. Let's wait for DF though; I can't believe Ubisoft was so lazy that they didn't optimise the Wii U version at all.

Edit: After reading the review, it seems the PS360 versions have screen tearing while the Wii U version has frame drops, I guess because of vsync. Now it all depends on how low it gets.

I'd take the dropped frames over tearing any day.
 

Lonely1

Unconfirmed Member
Not unfinished dev kits or old dev tools or small budgets, but lazy developers now?

I can't wait to see the excuses when the WiiU versions of CoD: Ghosts, Assassin's Creed IV and Watch Dogs don't compare favorably to the PS360 versions, never mind the PS4/XBO versions...

Maybe, when all is said and done, the hardware just isn't good enough?

We already have multiplats that perform better than on the HD twins...
 
I'd take the dropped frames over tearing any day.

It's only in the cutscenes. No need to worry.

Any character not named Sam Fisher is rather ugly, with very little facial detail and awful hair. That just can’t be ignored in the plentiful pre- and post-mission cutscenes, most of which suffer from rampant distracting screen tearing in the 360 and PS3 versions and framerate stuttering on the Wii U

http://www.ign.com/articles/2013/08/14/splinter-cell-blacklist-review
 

EDarkness

Member
If the Wii U version has tearing, I will cancel my pre-order. Let's wait for DF though; I can't believe Ubisoft was so lazy that they didn't optimise the Wii U version at all.

Edit: After reading the review, it seems the PS360 versions have screen tearing while the Wii U version has frame drops, I guess because of vsync. Now it all depends on how low it gets.

Reading the review and the comments, it seems like IGN didn't even play the Wii U version. We still really don't know what that version is gonna be like. Probably gonna have to wait until the game comes out to know for sure.
 

prag16

Banned
Not unfinished dev kits or old dev tools or small budgets, but lazy developers now?

I can't wait to see the excuses when the WiiU versions of CoD: Ghosts, Assassin's Creed IV and Watch Dogs don't compare favorably to the PS360 versions, never mind the PS4/XBO versions...

Maybe, when all is said and done, the hardware just isn't good enough?

Lazy, rushed, whatever you want to call it. They chopped out local co-op, so it stands to reason they could have fallen short in other areas as well.

But keep beating that drum if it entertains you, I guess.
 
Not unfinished dev kits or old dev tools or small budgets, but lazy developers now?

I can't wait to see the excuses when the WiiU versions of CoD: Ghosts, Assassin's Creed IV and Watch Dogs don't compare favorably to the PS360 versions, never mind the PS4/XBO versions...

Maybe, when all is said and done, the hardware just isn't good enough?
Or, you know, financial decisions about how much effort they could put into it before it starts eating too much of their profit...
They'd have had to recode quite a bit, considering the different design compared to the PS3 or 360... Hint: those consoles are designed to run a lot of GPU functions on their CPUs. That's the only way they can run games that look worth a damn by today's standards.
 
But aren't the PS4 and XBO designed around slower CPUs just like the WiiU, so theoretically cross-gen games like Ghosts, AC IV and Watch Dogs should look a lot better on WiiU compared to PS360?
 
It just struck me as odd that a poster thought they'd use it for things like lighting because the GC did.

I think the idea is that the MCM allows the CPU to pick up the slack for the GPU when necessary, and vice versa. So for the purposes of this example, if one CPU core and unoptimized use of the GPU can run PS360 ports, as has been the case, then two cores can do something quite a bit better, and the third can be used to create effects that perhaps aren't possible otherwise, for an even better-looking experience.
 
But aren't the PS4 and XBO designed around slower CPUs just like the WiiU, so theoretically cross-gen games like Ghosts, AC IV and Watch Dogs should look a lot better on WiiU compared to PS360?
The XBO and PS4 have more raw power to deal with those issues in a more brute-force way. On that note, you shouldn't expect any of those games to be the prettiest games on those systems.
 

krizzx

Junior Member
I don't know where to even begin with this post, but it seems you created a lot of subtext to my post that wasn't intended or present in anything I said. Someone said the character models looked good; I pointed out that they are actually substandard compared to the norm. Whether they'll be improved in the future is irrelevant to the discussion.
No, you didn't. You took a single picture of Shulk's face that wasn't even in-game footage (probably wasn't even rendered on Wii U hardware), as opposed to an actual in-game character model which, by all standards, looks far above the norm for a last-gen game, to make an inaccurate statement about every character in the game with absolutely nothing else backing it.

I called it out because it was simply ludicrous. If you had actually used an in-game character model, as opposed to a single off-screen shot of Shulk's face (which actually looks like the same model from Xenoblade on the Wii, only with higher-res textures), or more than one image, then that would have been different.

Really, if you hadn't entered the discussion with so much outrage and a template response, you would've noticed your mistake. And claiming that I chose "the worst possible image" in the case at hand is absurd, you realize that, right?

What outrage? I simply called your statement as I saw it, and I see that type of thing often, as I said at the end of my post. Any feelings I had were on the far opposite end from rage; try disappointed for more accuracy. I'm disappointed at seeing these same unsubstantiated, cherry-picked, narrow-sighted arguments, always made in the exact same way toward the exact same, often fallacious, ends, popping up so often.

I'm not here for console-war rubbish. I came here for the analysis. Whether you speak positively or negatively about the hardware, the company, or its employees as it relates to the GPU is inconsequential to me. What I am interested in is sound logic and/or substantiated fact, along with materials and examples to back such statements. I rarely see such things, though, because most people come with a short-sighted agenda that approaching with an open mind would only refute.

It's fine if you don't like Nintendo or the Wii U. That is not my business, but if you are going to make such a grand generalization with nothing much backing it, then I am going to call you on it.

It would be like me taking a shot of low-res textures from COD or Metal Gear Solid 4 and saying that all of the textures are low-res, or that that's all the PS3 can do and it's not much better than the consoles of the generation before it.

Oh, look at how low res these textures are. Metal Gear Solid 4 is clearly not that huge of a step up from normal PS2 games.
[image: Metal Gear Solid 4 screenshot]

Wow, look at all of those low-res textures, all of that sparse, flat grass and all that fog in Uncharted. The PS3 clearly can't do that much more than the Xbox1. It's only a small step up from the gen before it.
[image: Uncharted: Drake's Fortune screenshot]

Those are terrible, overgeneralized arguments. Even then, they're still better than the one with Shulk's face, as I'm at least using normal in-game footage.
 
Fine, then find a better image of a human character to back your point.

And we're talking about the quality of the modeling, not image quality, so yes, it is fair to use that picture.

It would be better for the thread, I think, if you either brought something meaningful to the discussion or let it be.
That's fine by me; it was just the confrontational/accusatory tone of his original post that I objected to.
 

krizzx

Junior Member
isn't Blops2 a launch title? :/

Didn't you know? The Wii U was maxed out at launch. The worst showings in the ports were all, 100%, due to the Wii U's inability to keep up with last-gen hardware and absolutely nothing else whatsoever. I thought that was "factually" determined two years ago when the console was announced at E3 2011.

Sometimes it feels like we've made no progress at all with these exact same arguments being used over and over again.

Heck, The Godfather on the PS3 looked and ran worse than the Xbox1 version at times. Then there was GUN on the 360. You never saw people define the PS3's and 360's limits by their launch ports the way I see them do incessantly with the Wii U hardware.

This GPU is clearly a few generations ahead of the PS3/360s.

Going by the Project C.A.R.S. changelog, this is not a simple DX10.1 GPU. There are too many DX11-specific features in use. I'm guessing it's at least OpenGL 4.0-4.2 level.
 
Didn't you know? The Wii U was maxed out at launch. The worst showings in the ports were all, 100%, due to the Wii U's inability to keep up with last-gen hardware and absolutely nothing else whatsoever. I thought that was "factually" determined two years ago when the console was announced at E3 2011.

Sometimes it feels like we've made no progress at all with these exact same arguments being used over and over again.

We are never going to see anything impressive on WiiU though, except maybe Zelda U (which won't arrive until winter 2014 at the earliest). Iwata said they will not compete with Sony or MS on game budgets, so I think the chances of seeing games that exceed the big-name PS360 exclusives (Killzone, God of War, Uncharted, GT, Halo, Gears, Forza etc.) are very, very slim.

The console is also struggling to get current-gen multiplatform games, never mind next-gen ones, so where exactly do you think these improvements will be shown? By Platinum Games, or by Retro (a company with fewer than 100 staff)?

krizzx said:
This GPU is clearly a few generations ahead of the PS3/360s.
Again, what games show that the GPU is "a few generations ahead of PS360"? It has nice DOF and fire effects, but in terms of pushing many more polygons or running games at improved frame rates/resolutions over PS360, I just don't see it.

There is definitely a leap in tech because the console has half a gig more RAM, more eDRAM and a more modern GPU, but I think you drastically overestimate how large that leap is, especially with diminishing returns making improvements harder and harder to see on screen.
 
Dude... Seriously...? I mean, I do not agree with Krizzx's obviously biased opinions, but you are really, REALLY wrong here....

What games do you expect to show a large leap over PS360? Nintendo does not push realistic graphics or spend big on budgets, Platinum puts 60fps above everything else, and Retro is a tiny company in terms of HD development.

I'm not trolling or looking for arguments; I'm genuinely interested in where people think this apparent power gap will be shown, because it certainly won't be in multiplatform games, as most people in this thread seem to think those companies will not put sufficient budget or manpower into them.

Also, if the GPU is more powerful than 176 GFLOPS, shouldn't it be able to run Wii games like MH3 at 1080p/60fps?
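On the MH3 question specifically, naive pixel-rate scaling gives a feel for the gap, though it ignores everything that isn't fill rate (CPU work, bandwidth, porting effort). The Wii game's 640x480 / 30fps figures below are assumptions for illustration:

```python
# How much more pixel throughput 1080p60 needs vs. an assumed
# 640x480 / 30 fps Wii original. Fill rate is only one bottleneck.
wii_pixels_per_sec    = 640 * 480 * 30
target_pixels_per_sec = 1920 * 1080 * 60
print(f"pixel-rate ratio: {target_pixels_per_sec / wii_pixels_per_sec:.1f}x")
# -> 13.5x; raw GFLOPS alone can't answer whether a straight port
#    hits that, since the slowest stage sets the frame rate.
```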
 

krizzx

Junior Member
We are never going to see anything impressive on WiiU though, except maybe Zelda U
What? We've already seen plenty. Then again, you may have a different concept of "impressive" than most people. There are some who wouldn't find anything impressive on Nintendo hardware simply because it's on Nintendo hardware. They will never find anything produced on the console impressive, no matter what any dev or analyst says.

krizzx said:
This GPU is clearly a few generations ahead of the PS3/360s.

Again, what games show that the GPU is "a few generations ahead of PS360"? It has nice DOF and fire effects, but in terms of pushing many more polygons or running games at improved frame rates/resolutions over PS360, I just don't see it.

There is definitely a leap in tech because the console has half a gig more RAM, more eDRAM and a more modern GPU, but I think you drastically overestimate how large that leap is, especially with diminishing returns making improvements harder and harder to see on screen.

As I've just posted in another thread.



I'm not comparing apples to oranges; I'm comparing apples to apples. Not only are these games sporting higher polygon counts, higher-res textures and larger/more detailed effects, but they're doing it at higher output targets (higher resolutions and frame rates).

All of that, while outputting to a second screen at 480p widescreen.
 

LCGeek

formerly sane
Again, what games show that the GPU is "a few generations ahead of PS360"? It has nice DOF and fire effects, but in terms of pushing many more polygons or running games at improved frame rates/resolutions over PS360, I just don't see it.

There is definitely a leap in tech because the console has half a gig more RAM, more eDRAM and a more modern GPU, but I think you drastically overestimate how large that leap is, especially with diminishing returns making improvements harder and harder to see on screen.

How many more polygons are you expecting? Even in the PC world there have been very few massive improvements in geometry, other than having more on screen, which is still quite a bit. Also, the fact that Wii U games hit native resolution more frequently, in both 720p and 1080p, when we just went through a generation where neither was a given on the HD twins, has me thinking you're blind or don't get it. The same could be said for frame rate: a lot of Wii U titles have no problem hitting 60fps. It's harder to push more frames than more pixels, but the fact that the Wii U does both better than the HD twins so far isn't impressing you...

SMH
 

Ishida

Banned
What games do you expect to show a large leap over PS360? Nintendo does not push realistic graphics or spend big on budgets, Platinum puts 60fps above everything else, and Retro is a tiny company in terms of HD development.

I'm not trolling or looking for arguments; I'm genuinely interested in where people think this apparent power gap will be shown, because it certainly won't be in multiplatform games, as most people in this thread seem to think those companies will not put sufficient budget or manpower into them.

Also, if the GPU is more powerful than 176 GFLOPS, shouldn't it be able to run Wii games like MH3 at 1080p/60fps?

The Wii U is indeed more powerful than the other two consoles, bro. I'm sure it's not by much, but it has better performance nonetheless. More RAM, newer GPU. That alone should give better results than 7-8 year old hardware.

The console is still early in its lifetime.
 
Krizzx, I agree the WiiU games do look better, but not to the degree that I would start shouting about the WiiU's GPU being several generations ahead of PS360's.

Also, as people in this thread are always happy to bring up on Nintendo's side, do you really think LBP Karting and PS All-Stars had the same level of budget, or the same level of development talent put on them, as MK8 and Smash?

Would be like comparing -

http://www.platinumgames.co.jp/tw101/wp-content/uploads/2013/08/611.jpg

To -

http://abload.de/img/4uwsab.jpg
 
It's important to mention that Smash also apparently runs at 1080p 60fps on Wii U (I may be mistaken but isn't this confirmed?).

Edit:
Yup, a quick Google search says it's confirmed by Nintendo.

Edit 2: Also, as another measure of power, Mario Kart runs at 60fps at 720p. Nintendo says it's working on 60fps 720p with two-player split screen, but apparently this isn't the easiest thing in the world to pull off and they need to optimize the engine to do it. That should give us a decent idea of what the Wii U's power is. Zelda and X, running at 720p and 30fps, should give us a definitive answer. Also, Bayonetta 2 runs at 60...
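A sketch of why split screen is harder than the raw pixel count suggests: the screen is shared, so pixel work stays roughly constant, but per-view work (culling, draw submission) runs once per viewport. The millisecond figures below are invented purely for illustration:

```python
# Toy frame-time model: fixed pixel cost plus a per-view scene cost.
PIXEL_MS = 10.0   # hypothetical per-frame pixel/shading cost at 720p
SCENE_MS = 5.0    # hypothetical per-view culling/draw-call cost
BUDGET_MS = 1000 / 60  # ~16.7 ms for 60 fps

for views in (1, 2):
    frame_ms = PIXEL_MS + SCENE_MS * views
    verdict = "fits" if frame_ms <= BUDGET_MS else "misses"
    print(f"{views} view(s): {frame_ms:.1f} ms -> {verdict} 60 fps")
# With these made-up costs, 1 view fits the budget and 2 views miss it,
# which is why engine optimization (not just resolution) is needed.
```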
 
The Wii U is indeed more powerful than the other two consoles, bro. I'm sure it's not by much, but it has better performance nonetheless. More RAM, newer GPU. That alone should give better results than 7-8 year old hardware.

The console is still early in its lifetime.

I have never disputed that; I just think it's more powerful than PS360 by a small amount, not 'several generations'.
 

krizzx

Junior Member
Krizzx, I agree the WiiU games do look better, but not to the degree that I would start shouting about the WiiU's GPU being several generations ahead of PS360's.

Also, as people in this thread are always happy to bring up on Nintendo's side, do you really think LBP Karting and PS All-Stars had the same level of budget, or the same level of development talent put on them, as MK8 and Smash?

Would be like comparing -

http://www.platinumgames.co.jp/tw101/wp-content/uploads/2013/08/611.jpg

To -

http://abload.de/img/4uwsab.jpg

It being several generations ahead was stated by an experienced dev.

Going from DX9 to DX11 feature-set-capable tech is several generations. Going from the average game being upscaled to 720p at 30 FPS with tons of frame drops to the average game being either 720p at 60 FPS or 1080p at a solid frame rate is a few generations. Having "over" twice as much RAM usable for games, with no restrictions, is a few generations. The PS3/360 didn't have 512 MB for games; they had 512 MB for everything, and in both cases they couldn't use the full bandwidth of the RAM. The 360's CPU was bottlenecked to the RAM at 10 GB per second. They also didn't have a DSP to handle sound. Not all of that memory was usable by games. Just having a solid 1 GB is more than twice the RAM that the PS3 and 360 had access to, without even counting the substantially faster eDRAM.
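To ground the eDRAM point, here is a quick framebuffer-size calculation; 32 MB for Latte's eDRAM and 10 MB for the 360's are the commonly cited figures, and the 8 bytes/pixel assumes 32-bit color plus 32-bit depth with no MSAA:

```python
# Does a color+depth framebuffer fit in on-die eDRAM?
BYTES_PER_PIXEL = 4 + 4  # 32-bit color + 32-bit Z, no MSAA

for name, w, h in (("720p", 1280, 720), ("1080p", 1920, 1080)):
    mb = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name}: {mb:.1f} MB "
          f"(<=32 MB: {mb <= 32}, <=10 MB: {mb <= 10})")
# 720p:  7.0 MB  -- fits both, but the 360 needed tiling once MSAA was on
# 1080p: 15.8 MB -- fits Latte's 32 MB, impossible in 10 MB
```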
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Krizzx, I agree the WiiU games do look better, but not to the degree that I would start shouting about the WiiU's GPU being several generations ahead of PS360's.
Pardon my curiosity, but how would you know how many generations Latte is ahead of Xenos (or RSX, if you'd prefer)?
 

gundalf

Member
Krizzx, I agree the WiiU games do look better, but not to the degree that I would start shouting about the WiiU's GPU being several generations ahead of PS360's.

But the WiiU's GPU is several generations ahead of PS360's.
People here tend to equate "hardware generation" with performance, which is wrong.
 