
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.

krizzx

Junior Member
Just the type of calculation that has to be done. You can't magically change one to the other, games *need* floating point.



What would I know, I'm just a programmer. I could have sworn we explained fp vs integer to you a while back and you acquiesced saying you understood now, and now you're back to saying they are magically interchangeable.

1080p 60fps exclusives? Beyond 2.5D platformers, what is that?

I never said they are interchangeable or even hinted at it now or then. That is the opposite of what I was saying. You are twisting my words to your own preference. It not being feasible was the entire point I was making. I said it quite clearly.

Stop distorting my posts.

I'ma screen-cap that before he edits it.

Why would I edit that?

Oh. It seems I have another one.
 
The GC and the Wii both used GDDR3 RAM (...)
GC certainly didn't use GDDR3.
Also, it's gDDR3 RAM for the Wii U, I thought. Have they found out what the lower-case g means yet?
Low voltage.
All of the Wii U ports of 360 and PS3 games have much faster loading times on the Wii U. Is that related to the RAM?
No.

That bottleneck was down to optical drive streaming, with the PS3 doing 9 MB/s and the X360 doing 11.2 MB/s on average.

Wii U does 22 MB/s; so if assets weigh the same it's only natural.
 

krizzx

Junior Member
GC certainly didn't use GDDR3.
Low voltage.
No.

That bottleneck was down to optical drive streaming, with the PS3 doing 9 MB/s and the X360 doing 11.2 MB/s on average.

Wii U does 22 MB/s; so if assets weigh the same it's only natural.

Why are you replying to all of these ancient posts?

I had made a mistake with the GC RAM type thing. That was just a slip of the mind from being in so many debates about the GC/Wii at that time.

I'm well aware of what gDDR3 is now, as someone answered the question right after I asked it months ago when I posted that.

I thought the Wii U's RAM was 12 GB/s though?
 

krizzx

Junior Member
You replied


To the question about why floating point can't be made integer. Stop forgetting your own posts, I'm not distorting anything.

No such question was asked.

That was a reply to the question "any particular reason why?", which was a proxy reply to this: "Yes, and unfortunately the bulk of game code is and has to be floating point heavy, not integer heavy". I was saying that the code does not "have to be" floating point. Nothing more, nothing less.

You rewrote the question and then redefined my answer to it both to your liking.
 

atbigelow

Member
Your logic is fine, I just had an issue with the manner you stated the info, contrary to the manner they stated it. Also, consider the things they said in the other interviews they've done, as far as the game not getting a lot of optimization, resources being free, and mainly using a single core. Do you think that if they worked on it more, 1080p with whatever extras they added could be reached?

No idea. Shinen seems capable of a lot of things, but if the hardware just doesn't have the oomph, then it doesn't.

Personally, I don't mind games running at 720p so I'm not too hung up about it. If the hardware was geared to excel at 720p, then that should be the route to take. Especially when you start hitting into the realm of people not even being able to notice.

EDIT:
Wait, aren't floating points just numbers with values after a decimal point?

Basically, yes. However, the math required for adequate floating-point calculations is more intense. A lot of old systems would use fixed-point numbers for faster math and less memory use.
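As a rough illustration of the fixed-point trick (a made-up 16.16 format for the example, not any particular console's): the value is packed into a plain int with 16 fraction bits, and multiplication becomes an integer multiply plus a shift.

```python
# Toy 16.16 fixed-point format: 16 integer bits, 16 fraction bits,
# stored in an ordinary Python int. Purely illustrative.
FRAC_BITS = 16
ONE = 1 << FRAC_BITS  # 1.0 in fixed-point

def to_fixed(x):
    """Convert a float to 16.16 fixed-point."""
    return int(round(x * ONE))

def fixed_mul(a, b):
    """Multiply two fixed-point numbers using only integer ops."""
    return (a * b) >> FRAC_BITS

def to_float(a):
    """Convert back to a float for display."""
    return a / ONE

# 1.5 * 2.25 == 3.375, computed without any floating-point multiply
print(to_float(fixed_mul(to_fixed(1.5), to_fixed(2.25))))  # 3.375
```

The trade-off is range: with only 16 integer bits the values overflow quickly, which is part of why hardware moved to proper FPUs.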
 

krizzx

Junior Member
Wait, aren't floating points just numbers with values after a decimal point?

Pretty much.

I have written code for a game before and didn't use a single floating point.

Basically, yes. However, the math required for adequate floating-point calculations is more intense. A lot of old systems would use fixed-point numbers for faster math and less memory use.

And this is why Espresso, which has far superior integer performance compared to other CPUs, will run considerably better with code written around integers. The CPU is more capable than the numbers suggest.
 

bomblord

Banned
Basically, yes. However, the math required for adequate floating-point calculations is more intense. A lot of old systems would use fixed-point numbers for faster math and less memory use.

I am aware of this, even today most programmers would recommend using integers instead of floating points whenever possible. I was just making sure I'm on the same page as everyone else.

I am a programmer myself.
 

atbigelow

Member
I am aware of this, even today most programmers would recommend using integers instead of floating points whenever possible. I was just making sure I'm on the same page as everyone else.

I am a programmer myself.
Cool cool. No offense intended. :)

The Nintendo DS and PS1 both had lacking floating-point capabilities. You can see this with their geometry engines where the polygon edges shuffle around all gross-like.
 
Why are you replying to all of these ancient posts?
They appeared for some reason and I thought I was on the last page.
I thought the Wii U's RAM was 12 GB/s though?
Some projections suggested that, yes.


The Wii U CPU is overall more efficient than the others and especially x86 CPUs. It doesn't have brute force power, but that is because if you program it right, it doesn't need it.
Double checked this one to see if it was today's:

No. It's especially more efficient in general purpose/integer than long-pipeline architectures with in-order execution, i.e. current-gen console CPUs.

Against x86 it depends on the implementation, pipeline length and other factors. Sure, PPC is cleaner (less legacy support), but putting it like that is a pretty clueless thing to do.

It's essentially an ultra-short pipeline design, so at the same clock rate each instruction takes fewer stages, and thus less time, to complete.

Think of pipeline stages as a highway: the shorter the distance to go, the faster you get through at the same speed, the speed being the MHz your CPU runs at. That's why it is quick for what it is... not because it's not x86, or any of that nonsense.

Being so simple, or reduced to the essential, it's also missing the solutions manufacturers had to come up with because short pipelines scale badly: making a CPU more than a one-bullet-per-cycle thing. In short, other CPUs have bigger cycles but dispatch more instructions; they also happen to have multiple pipelines if need be (2-way, 4-way, SMT), or pipelines with the height (think of it as a high ceiling) to accommodate such traffic. This CPU lacks that, so of course it's not really a powerhouse, but it's darn efficient for what it is.

It's a design choice, at this point; and certainly not the best solution for a modern architecture/implementation, otherwise companies would be returning to it.
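The highway analogy can be put into toy numbers (invented figures, not Espresso's or Xenon's real stage counts or mispredict rates): every mispredicted branch forces a pipeline refill costing roughly one cycle per stage, so the short pipeline pays a much smaller penalty on branchy code.

```python
def avg_cycles_per_instr(pipeline_stages, mispredict_rate):
    """Toy model: one instruction per cycle, plus a refill penalty of
    one cycle per pipeline stage on every mispredicted branch.
    Invented numbers, not real hardware figures."""
    return 1 + mispredict_rate * pipeline_stages

# 6.25% of instructions mispredicted branches (arbitrary): the short
# pipeline pays a much smaller refill tax than the deep one.
print(avg_cycles_per_instr(4, 0.0625))   # 1.25 cycles/instruction
print(avg_cycles_per_instr(20, 0.0625))  # 2.25 cycles/instruction
```

This ignores superscalar dispatch entirely, which is exactly the "more bullets per cycle" feature the post says deep designs use to win back throughput.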
 

tipoo

Banned
I am aware of this, even today most programmers would recommend using integers instead of floating points whenever possible.

So the first hit on google tells us. However, whenever possible is the key there. If you have, say, a large (and I use the term loosely, most modern games are already there) procedurally generated map, you'd be hard pressed not to have lots of FP.

I work for a game development studio in Nova Scotia which I would prefer not to name; games are and will continue to be floating-point heavy.

Cool cool. No offense intended. :)

The Nintendo DS and PS1 both had lacking floating-point capabilities. You can see this with their geometry engines where the polygon edges shuffle around all gross-like.


Good point. If I were to posit a guess as to what the programmers are doing: they reset a new point to the "0" of the map when things move, since they have to use integers for decent performance, causing some shuffling.
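That re-zeroing guess is usually called a "floating origin": when the camera drifts far from (0, 0, 0), shift the whole world back so coordinates stay small and float precision stays high. A purely illustrative sketch (threshold and data invented, not from any shipped engine):

```python
# "Floating origin" sketch: keep world coordinates small so float
# precision stays high. THRESHOLD and the positions are invented.
THRESHOLD = 1000.0

def rebase(camera, objects):
    """If the camera has drifted far from the origin, shift the whole
    world so the camera sits at (0, 0, 0) again. Relative positions
    (what the player actually sees) are unchanged."""
    if max(abs(c) for c in camera) < THRESHOLD:
        return camera, objects
    shifted = [tuple(o - c for o, c in zip(obj, camera)) for obj in objects]
    return (0.0, 0.0, 0.0), shifted

cam, objs = rebase((5000.0, 0.0, 0.0), [(5010.0, 0.0, 0.0)])
print(cam, objs)  # (0.0, 0.0, 0.0) [(10.0, 0.0, 0.0)]
```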
 

krizzx

Junior Member
They appeared for some reason and I thought I was on the last page.
Some projections suggested that, yes.


Double checked this one to see if it was today's:

No. It's especially more efficient in general purpose/integer than long-pipeline architectures with in-order execution, i.e. current-gen console CPUs.

Against x86 it depends on the implementation, pipeline length and other factors. Sure, it's cleaner (less legacy support), but putting it like that is a pretty clueless thing to do.

It's essentially an ultra-short pipeline design, so at the same clock rate each instruction takes fewer stages, and thus less time, to complete.

Think of pipeline stages as a highway: the shorter the distance, the faster you get through at the same speed; that speed is the MHz your CPU runs at. That's why it is quick for what it is... not because it's not x86, or any of that nonsense.

Being so simple, or reduced to the essential, it's also missing the solutions manufacturers had to come up with because short pipelines scale badly: making a CPU more than a one-bullet-per-cycle thing. In short, other CPUs have bigger cycles but dispatch more instructions; they also happen to have multiple pipelines if need be (2-way, 4-way, SMT), or pipelines with the height (think of it as a high ceiling) to accommodate such traffic. This CPU lacks that, so of course it's not really a powerhouse, but it's darn efficient for what it is.

It's a design choice, at this point.

Indeed, that's the point I was making.

It's far more efficient watt for watt and does not need raw power to produce the same results as the 360/PS3 CPUs. It just needs efficient programming.

I'd imagine Latte is similar in design as well. More performance watt for watt as opposed to going the cheaper, more energy consuming, more heat producing brute force path.

Cool cool. No offense intended. :)

The Nintendo DS and PS1 both had lacking floating-point capabilities. You can see this with their geometry engines where the polygon edges shuffle around all gross-like.

That is why that happened? I always thought those were issues brought about by the lack of texture filtering. Well I learned something new.
 

bomblord

Banned
So the first hit on google tells us. However, whenever possible is the key there. If you have, say, a large (and I use the term loosely, most modern games are already there) procedurally generated map, you'd be hard pressed not to have lots of FP.

I work for a game development studio in Nova Scotia which I would prefer not to name; games are and will continue to be floating-point heavy.

Maybe I'm dense but I cannot think of a single place where well formed code couldn't use an int or long int in place of a floating point. I understand this can be harder to do in some cases but I can't think of a single case where it would be impossible to replace.

I will also put out there I'm not very familiar with any game design or graphic libraries as I have a focus in server, database, and web development and have never actually worked on a game before.
 

gtj1092

Member
That I don't know, nor what they planned originally. I guess I probably could find out via Google, but truth be told they showed a beta, and that was probably 30 frames per second at that point, even if they were still looking at 60 frames per second prospects. I know it's been touted around lately as a downgrade; and even if it isn't, it's still telling for a launch game if it chooses to go to those lengths.

But fact is, yes, all Killzones have been 30 frames per second and suffered from input lag against other first-person shooters for it. If it was a deliberate choice by them on a new platform, I'd say they're bollocks and putting graphical prowess above gameplay at this point.

Megabytecr also said as much so there's that notion at least. I don't really pay much attention to Killzone because I never found it appealing.

So instead of just saying you were wrong you go on a tangent letting us know you don't like the game. You followed enough to claim it was downgraded when it was not.
 
It's far more efficient watt for watt and does not need raw power to produce the same results as the 360/PS3 CPUs. It just needs efficient programming.
It's simply more efficient than those, quite honestly also because they're bad CPUs; and yet they're PPC, not x86. That architecture got dropped once (IBM GuTS) because it was a turd; it got revived out of Sony wanting something different, Microsoft wanting whatever they ordered, and IBM wanting to test some more at their clients' expense.

You don't need to be efficient to tackle this CPU or no-one would be saying it punches above its weight; you have to be efficient to tackle the PS3 and X360 architectures, and that's why it's suffering here, because it's of a different nature.

You just need to code for it: it's a 32-bit CPU with a 64-bit FPU. And it behaves like one; short pipeline, so it's fast at giving resources, but it doesn't dispatch many instructions per cycle, opposite of modern solutions. And the 64-bit FPU means the floating-point performance certainly isn't over the moon; but CPUs not so long ago were like that, so it's not like it's a foreign concept.

The fact that it is only 1.24 GHz, though, is damaging for that part of the design.
I'd imagine Latte is similar in design as well. More performance watt for watt as opposed to going the cheaper, more energy-consuming, more heat-producing brute-force path.
A more branched design is not necessarily brute force; brute force is going 4 GHz with liquid cooling. Adding more complexity just so the CPU can do more at the same time was the solution needed in order to keep gaining performance. I mean, we've been sitting on 2.4 GHz being a normal clock for 10 years now; yet pit a 2.4 GHz chip from 10 years ago (a multiprocessor solution, if you will, just so it compares better) against a dual core from 5 years ago and one from today, all within that ballpark, and performance will differ greatly.

That's because architectures became more and more complex internally while striving for balance.

It's a balancing act.
So instead of just saying you were wrong you go on a tangent letting us know you don't like the game. You followed enough to claim it was downgraded when it was not.
Erm... sure dude. I was pretty honest regarding it, that's it. I dunno if it was downgraded, people told me it was, what else do you want from me?


I don't pay much attention to it, no. But 1080p30 on a launch FPS for PS4 is pretty lousy in my book, not because 1080p60 is easy, but if it's not doable then they should be aiming for 720p60.

I won't change my opinion regarding that; it's a first-person shooter, after all. Perhaps I dislike previous installments for the fact they felt laggy and sluggish; that should be right up there among the compromises not to make on a new platform, at least on the first game.


My point being, if their target for Single Player campaign was 30 frames all along, then they're stupid.
 

atbigelow

Member
That is why that happened? I always thought those were issues brought about by the lack of texture filtering. Well I learned something new.
There are actually two artifacts you can see with the DS, in particular.

1) No texture filtering, as you mentioned. That's where you will see the majority of pixels moving around more than they should. Thankfully the DS's texturing capabilities are still better than the PS1's so you don't see them get super distorted.

2) The DS does have anti-aliasing so the edges of polygons are smoothed. However, that doesn't alleviate the shuffling issues. You can see the "shuffling" and shifting pretty well with close-ups of models. A lot of games have "breathing" animations and they kind of jounce around most unsexily.
 

krizzx

Junior Member
It's simply more efficient than those, quite honestly also because they're bad CPUs; and yet they're PPC, not x86. That architecture got dropped once (IBM GuTS) because it was a turd; it got revived out of Sony wanting something different, Microsoft wanting whatever they ordered, and IBM wanting to test some more at their clients' expense.

You don't need to be efficient to tackle this CPU or no-one would be saying it punches above its weight; you have to be efficient to tackle the PS3 and X360 architectures, and that's why it's suffering here, because it's of a different nature.

You just need to code for it: it's a 32-bit CPU with a 64-bit FPU. And it behaves like one; short pipeline, so it's fast at giving resources, but it doesn't dispatch many instructions per cycle, opposite of modern solutions. And the 64-bit FPU means the floating-point performance certainly isn't over the moon; but CPUs not so long ago were like that, so it's not like it's a foreign concept.

The fact that it is only 1.24 GHz, though, is damaging for that part of the design.

A more branched design is not necessarily brute force; brute force is going 4 GHz with liquid cooling. Adding more complexity just so the CPU can do more at the same time was the solution needed in order to keep gaining performance. I mean, we've been sitting on 2.4 GHz being a normal clock for 10 years now; yet pit a 2.4 GHz chip from 10 years ago (a multiprocessor solution, if you will, just so it compares better) against a dual core from 5 years ago and one from today, all within that ballpark, and performance will differ greatly.

That's because architectures became more and more complex internally while striving for balance.

It's a balancing act.

Erm... sure dude. I was pretty honest regarding it, that's it. I dunno if it was downgraded, people told me it was, what else do you want from me?



I don't want to clutter up the GPU thread with too much CPU talk so I will take this to the CPU thread.

Also, I found a confirmation for Smash Brothers being 1080p. I will post it in a moment.

http://www.eurogamer.net/articles/2013-06-11-super-smash-bros-for-wii-u-and-3ds-due-2014

UPDATE: Nintendo has revealed a couple of new Super Smash Bros. details this evening, firstly that the game will run in 1080p.
 

tipoo

Banned
1080p on a 2.5D side scroller is much easier to achieve than in other game types. Small world size, limited number of textures, fewer polygons. Rayman ran at 1080p on the ancient PS3 and 360 hardware; that doesn't mean they can do more intensive games at that res.
 

balohna

Member
I think he meant, how would that specifically impact a particular game scenario.

And "games need floating point" makes it sound like the Wii U can only perform operations on integers. It might not be as powerful operating on floating-point numbers, but it certainly can do them. So then you get to the realization that, with optimizing between integers and floating-point numbers, there is plenty it is capable of if one made the effort.

That's the real issue though... people think that this effort is not worth the time and/or money, rightfully or wrongfully.

I'm far from a professional coder, but in the coding I have done I use integers as basically booleans with more than just true or false conditions. Like if certain conditions are met, this variable equals 0, 1, 2, 3, etc. and causes different outcomes.

Whereas a floating point calculation would be used for stuff like "if this is greater than this, do this" because I want it to be precise. Again, I don't do anything very complex (mostly just C#, ActionScript, etc.).

Float variables are usually better for any sort of actual calculations because you're getting a more accurate result.
 

lyrick

Member
I'm far from a professional coder, but in the coding I have done I use integers as basically booleans with more than just true or false conditions. Like if certain conditions are met, this variable equals 0, 1, 2, 3, etc. and causes different outcomes.

Whereas a floating point calculation would be used for stuff like "if this is greater than this, do this" because I want it to be precise.

Yea, that's a very basic usage of programming. In a 3D gaming environment you're dealing with translational functions on three-dimensional objects, all represented by multiple sets of XYZ floating-point coordinates.
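For example, the simplest of those translational functions, moving every vertex by the same offset, is already three floating-point adds per vertex, every frame anything moves (a toy sketch, not real engine code):

```python
def translate(vertices, dx, dy, dz):
    """Move every (x, y, z) vertex by the same offset: three
    floating-point additions per vertex."""
    return [(x + dx, y + dy, z + dz) for (x, y, z) in vertices]

# A single triangle; real meshes have thousands of vertices.
triangle = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(translate(triangle, 0.5, 0.5, 0.0))
# [(0.5, 0.5, 0.0), (1.5, 0.5, 0.0), (0.5, 1.5, 0.0)]
```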
 

atbigelow

Member
I'm far from a professional coder, but in the coding I have done I use integers as basically booleans with more than just true or false conditions. Like if certain conditions are met, this variable equals 0, 1, 2, 3, etc. and causes different outcomes.

Whereas a floating point calculation would be used for stuff like "if this is greater than this, do this" because I want it to be precise. Again, I don't do anything very complex (mostly just C#, ActionScript, etc.).

Float variables are usually better for any sort of actual calculations because you're getting a more accurate result.
FYI that's called a "finite state machine." Plenty of game code uses those, but that's mostly for AI and state.
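A minimal sketch of that finite-state-machine pattern, with states as plain ints (the state names and transitions here are invented for illustration):

```python
# States as plain ints, exactly the "integer as more-than-boolean"
# pattern described above. Names and transitions are made up.
IDLE, CHASE, ATTACK = 0, 1, 2

def next_state(state, sees_player, in_range):
    """One step of a tiny AI state machine."""
    if state == IDLE and sees_player:
        return CHASE
    if state == CHASE and in_range:
        return ATTACK
    if not sees_player:
        return IDLE
    return state

s = IDLE
s = next_state(s, True, False)  # IDLE -> CHASE
s = next_state(s, True, True)   # CHASE -> ATTACK
print(s)  # 2 (ATTACK)
```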
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Float variables are usually better for any sort of actual calculations because you're getting a more accurate result.
Floats give you better dynamic range over integers, but they're not 'more precise' per se - on the contrary, floats have fewer mantissa bits than the same-sized integer.
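That's easy to check: a 32-bit float carries a 24-bit significand, so it can represent every integer only up to 2^24, while a 32-bit int goes up to 2^31 - 1. Round-tripping through float32 with Python's struct module shows where the precision runs out:

```python
import struct

def as_float32(x):
    """Round-trip a value through an IEEE 754 32-bit float."""
    return struct.unpack('f', struct.pack('f', float(x)))[0]

print(as_float32(2**24))      # 16777216.0 -- representable exactly
print(as_float32(2**24 + 1))  # 16777216.0 -- the +1 is silently lost
```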
 

krizzx

Junior Member
If the CPU is not so good at floating point can't you use the GPU to do it?

The CPU is not bad at floating point. It handles it moderately well. It's just not supercharged for floating point like Xenon and Cell, and its integer performance dwarfs its floating-point performance.

It can still do floating point quite well. It just won't give any miraculous performance like it will with integers.

Also, the GPU can be used for it since it is GPGPU.

I doubt anyone has had any reason to use it yet. I'd imagine the best use of the GPGPU capability would be for aiding tessellation and advanced shading techniques.

Floats give you better dynamic range over integers, but they're not 'more precise' per se - on the contrary, floats have fewer mantissa bits than the same-sized integer.

Looking at Elebits and Boom Blox, the CPU should be capable of some extreme physics. How much difference would it make calculating physics in integers as opposed to floating points, though?
 

krizzx

Junior Member
1080p on a 2.5D side scroller is much easier to achieve than in other game types. Small world size, limited number of textures, fewer polygons. Rayman ran at 1080p on the ancient PS3 and 360 hardware; that doesn't mean they can do more intensive games at that res.

I was just providing proof, which has apparently upset you and a few others. I wasn't trying to make a hardware challenge, but since you brought it up:

Rayman to Smash Brothers is a horrible comparison. Rayman uses sprites and it's hardly hardware-intensive at all.

If you are going to compare games across platforms, then you need to compare games of the same design. PlayStation All-Stars (the PS3 Smash Brothers clone) ran at 720p 60 FPS. Smash U is running at 1080p 60 FPS with far more detail.

[comparison screenshots: Super Smash Bros. for Wii U vs. PlayStation All-Stars Battle Royale]


They are a generation apart in shading, resolution, polygon details, physics and pretty much anything else you can name.
 

tipoo

Banned
I was just providing proof, which has apparently upset you and a few others. I wasn't trying to make a hardware challenge, but since you brought it up:

Rayman to Smash Brothers is a horrible comparison. Rayman uses sprites and it's hardly hardware-intensive at all.

If you are going to compare games across platforms, then you need to compare games of the same design. PlayStation All-Stars (the PS3 Smash Brothers clone) ran at 720p 60 FPS. Smash U is running at 1080p 60 FPS with far more detail.




They are a generation apart in shading, resolution, polygon details, physics and pretty much anything else you can name.

Still fixed stage, still smaller worlds than most game types; everything I said still applies. A side scroller running at 1080p doesn't mean anything more open can, which was my point, which still stands. It's really like pulling teeth with you. And Injustice: Gods Among Us was more detailed than All-Stars and ran at 1080p on the 7th-gen consoles.

http://**************/injustice-gods-among-us/
 
Still fixed stage, still smaller worlds than most game types; everything I said still applies. A side scroller running at 1080p doesn't mean anything more open can, which was my point, which still stands. It's really like pulling teeth with you. And Injustice: Gods Among Us was more detailed than All-Stars and ran at 1080p on the 7th-gen consoles.

http://**************/injustice-gods-among-us/

Why would you say something like that? Not only are Smash and All-Stars (you know, the games of legitimate direct comparison in this conversation) 4-player, the backgrounds in them are far more dynamic and alive than in a game like Injustice. For that matter, Injustice characters look better, but they move like stiff old people.

If every time he brings up a point you are going to look at it only from your own preconceived angle, why should he even bother? Talk about what he brings up in a way that's relevant to what he's trying to get across or just don't bother. It's rather annoying reading this thread, and a bunch of you want to act like it's krizzx alone who is the failure in communication...
 

StevieP

Banned
No, there's too much extremity on both sides. It's unfortunate, as it bogs down the thread and its subject. But that's just my lowly opinion...
 

Log4Girlz

Member
They are a generation apart in shading, resolution, polygon details, physics and pretty much anything else you can name.

A generation apart? Not by a long shot. You'd have to have a 10x increase in all regards to really be considered a technological generation apart.
 

StevieP

Banned
A generation apart? Not by a long shot. You'd have to have a 10x increase in all regards to really be considered a technological generation apart.

You're correct, it isn't a generation apart. However, if "10x increase in all regards" is your metric, then we've got new consoles that aren't a technological generation apart.
 

fred

Member
A generation apart? Not by a long shot. You'd have to have a 10x increase in all regards to really be considered a technological generation apart.

If a game is a 'generation apart' or considered 'next gen' then for me personally it means that said game can't be replicated on the previous gen's hardware. Pikmin 3, The Wonderful 101, Bayonetta 2, X, Mario Kart 8 and SSBU are all a 'generation apart' from anything on the previous gen's consoles.

Bayonetta 2 in particular is technically very impressive, particularly the Gomorrah boss fight. Either that's a VERY high-poly model or it's using tessellation; either way the PS3 and 360 would probably melt attempting that sort of IQ at 60fps on ONE screen, let alone two... or at least slow to a crawl lol
 

NBtoaster

Member
If a game is a 'generation apart' or considered 'next gen' then for me personally it means that said game can't be replicated on the previous gen's hardware. Pikmin 3, The Wonderful 101, Bayonetta 2, X, Mario Kart 8 and SSBU are all a 'generation apart' from anything on the previous gen's consoles.

Bayonetta 2 in particular is technically very impressive, particularly the Gomorrah boss fight. Either that's a VERY high-poly model or it's using tessellation; either way the PS3 and 360 would probably melt attempting that sort of IQ at 60fps on ONE screen, let alone two... or at least slow to a crawl lol

You mean this?
[screenshot: the Gomorrah boss from Bayonetta 2]


The polycount on the model doesn't appear to be particularly high. You can see its silhouette isn't very rounded, and the design, with its jagged back, sharp teeth and spiky features, is quite conducive to a model that looks complex but has a low polycount.
 

JordanN

Banned
Do Fred and Krizzx think the Wii U is 5000 teraflops or something? How are you getting numbers that "melt" the PS3/360?

The console's most promising feature is only 2x better than PS3/360 (the RAM). Everything else about it struggles to live up to that.

Edit: Don't forget you need a strong CPU to keep up with the GPU or else you're going to be bottlenecked. I don't think that puny CPU in the Wii U is enough to melt anything (except itself).
 
Do Fred and Krizzx think the Wii U is 5000 teraflops or something? How are you getting numbers that "melt" the PS3/360?

The console's most promising feature is only 2x better than PS3/360 (the RAM). Everything else about it struggles to live up to that.

Yes, Krizzx is a bit up on the extreme side, but you and others are the opposite; these extremist positions bring nothing to the discussion. One side is trying to make the Wii U look almost like the XB1 and the other wants to make it look worse than the PS360.

I am happy with the games and performance we are getting; 720p60 is no slouch, considering Killer Instinct had to go that route on the XB1.

There is nothing new right now to discuss. I hope we hear from Shinen and Retro soon.
 

USC-fan

Banned
Do Fred and Krizzx think the Wii U is 5000 teraflops or something? How are you getting numbers that "melt" the PS3/360?

The console's most promising feature is only 2x better than PS3/360 (the RAM). Everything else about it struggles to live up to that.

Edit: Don't forget you need a strong CPU to keep up with the GPU or else you're going to be bottlenecked. I don't think that puny CPU in the Wii U is enough to melt anything (except itself).

I think they are just trolling people. Nothing he posts is grounded in reality.
 

Log4Girlz

Member
You mean this?
[screenshot: the Gomorrah boss from Bayonetta 2]


The polycount on the model doesn't appear to be particularly high. You can see its silhouette isn't very rounded, and the design, with its jagged back, sharp teeth and spiky features, is quite conducive to a model that looks complex but has a low polycount.

PS3 would melt

[PS3 screenshots]
 

StevieP

Banned
Do we really have to post PS3 bullshots in a thread that's supposed to discuss the technical side of the Wii U GPU? The discussion has been polarized and full of stupid already.
 

JordanN

Banned
Yes, Krizzx is a bit up on the extreme side, but you and others are the opposite; these extremist positions bring nothing to the discussion. One side is trying to make the Wii U look almost like the XB1 and the other wants to make it look worse than the PS360.

I am happy with the games and performance we are getting; 720p60 is no slouch, considering Killer Instinct had to go that route on the XB1.

There is nothing new right now to discuss. I hope we hear from Shinen and Retro soon.
I don't think anyone really thinks the console is weaker than PS3/360 (I actually quoted a list of people in this thread who said it's stronger). Games though, that's totally understandable if you consider this:

The Wii U is literally the strangest console: it has more power, but the games fumble with it. It's been this way since the console was first unveiled. Even now, games that are supposed to be better still have a flaw somewhere (e.g. Splinter Cell, I think, has lower-FPS cutscenes). It's never been a case of "Wii U games are totally better as a whole". Instead it's "Wii U games do certain stuff better or worse".
 

Log4Girlz

Member
Do we really have to post PS3 bullshots in a thread that's supposed to discuss the technical side of the Wii U GPU? The discussion has been polarized and full of stupid already.

Your opinion StevieP, how much more capable than the last gen is the Wii U GPU? 20%? 50%? 100%? 200%?

Rough estimate.
 
Yea, that's a very basic usage of programming. In a 3D gaming environment you're dealing with translational functions on three-dimensional objects, all represented by multiple sets of XYZ floating-point coordinates.

I think the point is that you can optimize and use both floats and ints to strive for better efficiency with this particular hardware. You can just as easily use large integers in your matrices if you wanted to. Floats differ due to their use of an exponent.

I'm not saying they don't have advantages. What I'm saying is that I'm sure games could accommodate a system that uses both integers and floating points as the hardware allows. Perhaps that's too time consuming or perhaps these devs don't bother because they've always done it using floating point numbers or some combination of reasons.

Floats give you better dynamic range over integers, but they're not 'more precise' per se - on the contrary, floats have fewer mantissa bits than the same-sized integer.
 

StevieP

Banned
I don't think anyone really thinks the console is weaker than PS3/360 (I actually quoted a list of people in this thread who said it's stronger). Games though, that's totally understandable if you consider this:

The Wii U is literally the strangest console: it has more power, but the games fumble with it. It's been this way since the console was first unveiled. Even now, games that are supposed to be better still have a flaw somewhere (e.g. Splinter Cell, I think, has lower-FPS cutscenes). It's never been a case of "Wii U games are totally better as a whole". Instead it's "Wii U games do certain stuff better or worse".

I own Splinter Cell for PS3 and Wii U as well. I'd rather have my cutscenes at 24fps and my game at a pretty solid vsynced 30 the whole way through than worse framerates/tearing the whole way through. The Wii U version is better. The Wii U version of Need for Speed is also better than everything other than the PC version. These are a couple of examples of many, which should amaze you considering the Wii U port budgets/teams are tiny and often an afterthought.

Your opinion StevieP, how much more capable than the last gen is the Wii U GPU? 20%? 50%? 100%? 200%?

Rough estimate.

What's the point of estimates? The thing obviously isn't a powerhouse and shouldn't be evangelized as such. It's also better (in most ways) than the previous generation by a small degree. The sooner that's accepted, the sooner the thread will go from being a polarized mess to something more tolerable.
 

JordanN

Banned
Just for fun, here's what the internals of each console look like. (I kept the hands for scale.)

[internal photos: PS3 Slim, Wii U, Xbox One (couldn't find any PS4 pics), and an HD 7770]

I don't think Nintendo has better engineers than everyone else. Actually, all the GPUs are now made by AMD, so the comparisons couldn't be any closer. It wouldn't make sense for a company that now cares more about making gamepads/Wiimotes to build a machine that outpaces last gen by a lot, when they're the same ones who aren't all that interested in that. The Wii U is definitely more powerful than last gen, but the verdict that it's going to "melt" them has gone on too long.
 

krizzx

Junior Member
Yes, Krizzx is a bit up on the extreme side, but you and others are the opposite; these extremist positions bring nothing to the discussion. One side is trying to make the Wii U look almost like the XB1 and the other wants to make it look worse than the PS360.

I am happy with the games and performance we are getting; 720p60 is no slouch, considering Killer Instinct had to go that route on the XB1.

There is nothing new right now to discuss. I hope we hear from Shinen and Retro soon.

Extreme? I'm actually just stating what the facts point to. Nothing I've said was extreme.

I'm comparing two games of the same type, with one showing an all-around greater level of fidelity.

How did that get blown out of proportion to me thinking the Wii U is 5000 teraflops or me being extreme?

It's like saying anything even remotely positive about the hardware is a crime or makes you a fanboy.
 