
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


krizzx

Junior Member
PS3 and 360 were said to run "DX11" shaders when they sure as hell can't, but you can get similar results by messing with things and approaching the shaders differently.

No they weren't as far as I remember. Show me a single place where this was said.

The Wii U API is called GX2. I think in previous pages of this thread, it was said to be similar to OpenGL 3.3 with tessellation support and perhaps some newer features added on. Honestly, all the dick swinging about DX10- or DX11-equivalent features probably matters little to what you'll end up seeing on screen. You can fake just about anything you want with pre-baked effects and they still look pretty good. You would need to really pick nits to even care. The PS3 and Xbox 360 ended up supporting much more modern effects in this manner than their hardware would otherwise indicate.
 
Of course PS4/Wii U don't support "DX".

And what does the Project Cars changelog say? I can't find it. It would be easier if you posted what it said instead of leaving me to hunt for it. I checked the past five pages and can't find anything about it other than the mention on this page.

No they weren't as far as I remember. Show me a single place where this was said.

You can simulate DX11 shaders in DX10 or DX9. The further down you go, the more difficult and less streamlined it gets. It doesn't mean the hardware is capable of running DX11.

http://www.eurogamer.net/articles/2...-graphics-running-on-crysis-3-on-ps3-xbox-360
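(For anyone curious what "simulating DX11 in DX10" looks like on the PC side: the D3D11 API itself runs on DX10-class cards through feature levels, and the app checks what it got back before enabling SM 5.0 effect paths. A rough sketch, not from any shipping engine:)

Code:
#include <d3d11.h>

// Ask for DX11 first, then fall back through DX10-class feature levels.
bool CreateDeviceWithFallback(ID3D11Device** dev, ID3D11DeviceContext** ctx)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // SM 5.0: hull/domain/compute shaders
        D3D_FEATURE_LEVEL_10_1,  // DX10.1-class hardware
        D3D_FEATURE_LEVEL_10_0,  // DX10-class hardware
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION, dev, &got, ctx);
    if (FAILED(hr))
        return false;

    if (got < D3D_FEATURE_LEVEL_11_0) {
        // DX10-class card: skip the SM 5.0 paths and fall back to the
        // hand-written SM 4.x approximations of those effects.
    }
    return true;
}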
 
Why doesn't Nintendo just release the specs and get someone to make a stunning effects/power/graphics demo to show what the Wii U can do, something that really pushes it? The 2011 E3 Wii U graphics demo looks very dated already!

http://www.youtube.com/watch?v=Shch7LNkVXw

The games will speak for themselves. If GPU horsepower is that much more important, there are other avenues for it. Nintendo isn't going to release the specs. There's no reason for them to do it. The games coming out will show what the system is capable of - at least at a minimum level.

As always, you buy Nintendo systems for the games that come out for them. This console-war nonsense is detracting from the focus of the thread, though.
 

krizzx

Junior Member
Of course PS4/Wii U don't support "DX".

And what does the Project Cars changelog say? I can't find it. It would be easier if you posted what it said instead of leaving me to hunt for it. I checked the past five pages and can't find anything about it other than the mention on this page.

It's been posted in this thread numerous times. Go search for it.
 
The Wii U API is called GX2. I think in previous pages of this thread, it was said to be similar to OpenGL 3.3 with tessellation support and perhaps some newer features added on. Honestly, all the dick swinging about DX10- or DX11-equivalent features probably matters little to what you'll end up seeing on screen. You can fake just about anything you want with pre-baked effects and they still look pretty good. You would need to really pick nits to even care. The PS3 and Xbox 360 ended up supporting much more modern effects in this manner than their hardware would otherwise indicate.

That, and their CPUs were used to crunch some of that work to make up for their GPUs.

Of course PS4/Wii U don't support "DX".

And what does the Project Cars changelog say? I can't find it. It would be easier if you posted what it said instead of leaving me to hunt for it. I checked the past five pages and can't find anything about it other than the mention on this page.

http://www.neogaf.com/forum/showpost.php?p=74557043&postcount=7333
 

krizzx

Junior Member
The games will speak for themselves. If GPU horsepower is that much more important, there are other avenues for it. Nintendo isn't going to release the specs. There's no reason for them to do it. The games coming out will show what the system is capable of - at least at a minimum level.

As always, you buy Nintendo systems for the games that come out for them. This console-war nonsense is detracting from the focus of the thread, though.

I have said this repeatedly in this thread.

We can't make any progress on the analysis, because every time someone proposes anything new that would suggest more capability, the same few people pop up for no apparent reason other than to dismiss it outright.

It would be fine if they actually provided links, up-to-date comments, or substantial material that directly supports their dismissal, but I rarely see them provide anything other than twisted paraphrasing of loosely connected facts that rarely has anything to do with the immediate topic of discussion.

We can't make any headway like this.
 
Another dev states that they are using DX11 for a Wii U game
http://playeressence.com/candle-ind...ect-x11-and-is-confirmed-for-wii-u/#idc-cover

Now let the spinning of facts and twisting of statements begin... (cue the anti-N squad)

Devs have stated it doesn't support it natively and that the features will be adapted, which is how Unity does it anyway: it compiles from DX11 down to DX10 on DX10 hardware.
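(To illustrate the "compiles from DX11 to DX10" point: on PC, the same HLSL source can be compiled against different shader-model targets, and anything SM 5.0-only forces the engine to adapt the effect instead. A sketch; the helper and its inputs are hypothetical:)

Code:
#include <d3dcompiler.h>
#include <cstring>

// Compile one HLSL pixel shader for whichever shader model the GPU runs.
// Link with d3dcompiler.lib.
ID3DBlob* CompilePixelShader(const char* hlsl, bool hasSm5)
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    const char* target = hasSm5 ? "ps_5_0" : "ps_4_0";  // DX11 vs DX10 target

    HRESULT hr = D3DCompile(hlsl, std::strlen(hlsl), "scene.hlsl",
                            nullptr, nullptr, "PSMain", target,
                            D3DCOMPILE_OPTIMIZATION_LEVEL3, 0,
                            &bytecode, &errors);
    // SM 5.0-only constructs fail on ps_4_0 -- that's where an engine has
    // to "adapt" the feature rather than just recompile it.
    if (errors) errors->Release();
    return SUCCEEDED(hr) ? bytecode : nullptr;
}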

It already uses DX11 features in Project C.A.R.S., and a few other devs have stated that the GPU has them.

Please, stop. Your assumptions are constantly contradicting known facts.

I've searched and found nothing. Project Cars being DX11 on PC doesn't mean the Wii U is DX11, unless the devs specifically say that the Wii U is DX11 native.

That is correct, but as I said above, it's going to be twisted/spun.

The Wii U doesn't "support" DirectX anything, but it can produce the same effects as those being used in known games.

No, it doesn't mean what you think it means. The PS4 doesn't support DX11 either, but it does support DX11 features natively. The wording isn't ambiguous in the least; you are reading what you want to read.

In any case, tessellation and many of the features you see in DX11 have been around pre-DX11. The point is whether these designs are or are not designed to be specifically efficient at them, like compute. I fully believe that we will see games on Wii U use some of these features in limited amounts, but it simply doesn't have the horsepower to punch them all out together like other, more advanced and powerful GPUs.

It all comes down to TDP; at 33 watts you can't hope for the impossible. The same goes for the PS4 and PCs.
 

z0m3le

Banned
I believe it was. VLIW4 came after VLIW5 (introduced with the Radeon HD 2000). It's one of the reasons why I say, for example, that just because it has hardware tessellation doesn't mean it will be efficient in-game.

Every GPU in these consoles is custom. That doesn't mean what you imply, i.e. that they forego the core architecture and build something completely different. The Wii U having a more modern architecture and more power than the 360 is common knowledge; I shouldn't need to debate that.



R800 designs are DX11 from the get-go, so while you might be right, I can't help but be confused when devs say it doesn't support DX11 features natively:

“We will ADAPT our DirectX11 features to Wii U, not that it supports them natively. However, we are very happy with Nintendo and its console, and we think that it well deserves that extra effort.” There are clearly some pieces missing from the puzzle still.



I wasn't aware of that, do you have a link?

HD 2000 came at the end of 2007, and it was VLIW5. The Wii U will have to adapt any DirectX effects, as it doesn't natively support DirectX at all. Again, DX11 wasn't a big update over DX10.1 in terms of ability; the main difference is how tessellation was handled, as it needed a higher-spec'd fixed-function unit for tessellation. I saw someone earlier claiming it was a fully programmable shader, but then what would even be the point of a tessellation unit? Think about that for a second and realize that it is fixed-function. The difference for AMD is that DirectX 10.1 used Gen 2 tessellation, which took shortcuts that are not available to the DirectX 11-compatible fixed-function Gen 3 tessellation units. I am not aware whether the tessellation unit has changed beyond Gen 3, whether it was simply beefed up, or whether programmable shaders help the tessellation unit in Gen 3. All I do know is that Nvidia and Microsoft didn't want to go down the path ATI had discovered for tessellation, which on the R700 actually seems to do really well in the Froblins demo compared to the R800's performance in Unigine Heaven, which couldn't maintain a playable frame rate.

DirectX 11 often just means using Microsoft's software solution to a problem rather than a unique solution that might perform better. Valve's performance findings for OpenGL over DirectX should be very telling as to why DX11 isn't really a feature but simply a spec. OpenGL can produce better performance, and we know both the PS4's and the Wii U's APIs are designed along OpenGL lines, thus any DX11 effects have to be adapted to those consoles; obviously OpenGL has comparable effects.
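(On the fixed-function point, the DX11 API itself makes the split visible: the hull and domain shaders are programmable stages, but the tessellator that sits between them has no shader stage at all - you only steer it through the factors the hull shader outputs. A minimal sketch, error handling omitted:)

Code:
#include <d3d11.h>

void BindTessellationStages(ID3D11Device* dev, ID3D11DeviceContext* ctx,
                            const void* hsBlob, SIZE_T hsSize,
                            const void* dsBlob, SIZE_T dsSize)
{
    ID3D11HullShader* hs = nullptr;
    ID3D11DomainShader* ds = nullptr;
    dev->CreateHullShader(hsBlob, hsSize, nullptr, &hs);
    dev->CreateDomainShader(dsBlob, dsSize, nullptr, &ds);

    ctx->HSSetShader(hs, nullptr, 0);  // programmable: emits tess factors
    // (there is no "TSSetShader" - the tessellator itself is fixed-function)
    ctx->DSSetShader(ds, nullptr, 0);  // programmable: places new vertices
    ctx->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
}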
 

krizzx

Junior Member
Found it, posted TWO MONTHS ago, and bullshit it was posted "several times". It was posted once by Disorienter.

http://www.neogaf.com/forum/showthread.php?p=63015721#post63015721

You referred to it several times, but NEVER quoted it.

http://www.neogaf.com/forum/showpost.php?p=74528651&postcount=7317

http://www.neogaf.com/forum/showthread.php?p=74528651&highlight=project+cars#post74528651

With that, I'm done responding to your made-up facts.

I have no reason to believe that you know what you are talking about. You have made multiple claims that entirely contradict known facts and statements, and you always ignore it when anyone posts a contradiction. You never correct yourself. This is why I don't waste my time gathering links. I have, many times, linked you to exact posts, and you just completely ignored them all.
 
http://www.neogaf.com/forum/showpost.php?p=74528651&postcount=7317

http://www.neogaf.com/forum/showthread.php?p=74528651&highlight=project+cars#post74528651

With that, I'm done responding to your made-up facts.

I have no reason to believe that you know what you are talking about. You have made multiple claims that entirely contradict known facts and statements, and you always ignore it when anyone posts a contradiction. You never correct yourself. This is why I don't waste my time gathering links. I have, many times, linked you to exact posts, and you just completely ignored them all.

A couple of things.
1) I only found the first post, from June. It said the DX11 renderer was "stripped down":

* WiiU CRenderer/RenderThread base headers (stripped down from DX11)

How about next time you point out exactly which post it is? I couldn't find it. Hell, I was looking for one of YOUR posts, so that didn't help either.

2) What "multiple" claims? The only claim I made was the one about the DX11. And I obviously didn't know about that post.

3) I never correct myself? Did you not see my post on the last page? After Nostremitus kindly linked the post (something you didn't do), I not only edited my post to admit I was wrong, but I left all the initial text in (struck out) to SHOW how much I was fucking wrong.

I didn't hide behind an edit, I kept it there for everyone to see how I was wrong.

Here, to help you out, let me lead you to that post.

Found it, posted TWO MONTHS ago, and bullshit it was posted "several times". It was posted once by Disorienter.

http://www.neogaf.com/forum/showthread.php?p=63015721#post63015721

You referred to it several times, but NEVER quoted it.


EDIT: Well fuck me sideways. I only found that older changelog post.
Last edited by phosphor112; Today at 04:21 PM.

That was 3 minutes after the link was posted proving me wrong.
 
HD 2000 came at the end of 2007, and it was VLIW5. The Wii U will have to adapt any DirectX effects, as it doesn't natively support DirectX at all. Again, DX11 wasn't a big update over DX10.1 in terms of ability; the main difference is how tessellation was handled, as it needed a higher-spec'd fixed-function unit for tessellation. I saw someone earlier claiming it was a fully programmable shader, but then what would even be the point of a tessellation unit? Think about that for a second and realize that it is fixed-function. The difference for AMD is that DirectX 10.1 used Gen 2 tessellation, which took shortcuts that are not available to the DirectX 11-compatible fixed-function Gen 3 tessellation units. I am not aware whether the tessellation unit has changed beyond Gen 3, whether it was simply beefed up, or whether programmable shaders help the tessellation unit in Gen 3. All I do know is that Nvidia and Microsoft didn't want to go down the path ATI had discovered for tessellation, which on the R700 actually seems to do really well in the Froblins demo compared to the R800's performance in Unigine Heaven, which couldn't maintain a playable frame rate.

DirectX 11 often just means using Microsoft's software solution to a problem rather than a unique solution that might perform better. Valve's performance findings for OpenGL over DirectX should be very telling as to why DX11 isn't really a feature but simply a spec. OpenGL can produce better performance, and we know both the PS4's and the Wii U's APIs are designed along OpenGL lines, thus any DX11 effects have to be adapted to those consoles; obviously OpenGL has comparable effects.

I missed a year; you're right, it's 2007.

When people talk about DX11 they aren't talking about the API; they're talking about hardware generations. Switching the discussion to the API doesn't make sense.

The games will do the talking, and we will see if such a modest GPU has the horsepower to pull off the same features you'll see on the other, much more powerful next-gen GPUs. Time will tell, and if I'm wrong, I'm wrong.
 
If the Wii U can do decent tessellation, that's all I need anyway... that and some global illumination implementation. After that it's all gravy. Ultra-high-res textures aren't that important to me. The Xbox 360 was doing decent tessellation; if the Wii U can do better (and have the stuff be in-game, rather than in demos), I'm happy. Limited GI was also demoed on the 360, so the Wii U should be able to pull that off too.
 

USC-fan

Banned
Another dev states that they are using DX11 for a Wii U game
http://playeressence.com/candle-ind...ect-x11-and-is-confirmed-for-wii-u/#idc-cover

Now let the spinning of facts and twisting of statements begin... (cue the anti-N squad)

From the comments section:

Hi guys! Teku Studios here.

Just to clarify, we will ADAPT our DirectX11 features to Wii U, not that it supports them natively. However, we are very happy with Nintendo and its console, and we think that it well deserves that extra effort :)

Thanks for your comments!


So no, the Wii U does not support DirectX 11 natively.
I can't be the only one who just lol'd at the irony.

We have known the feature set of the Wii U GPU for a while. It's about the only confirmed spec we have for the system.
 
The Wii U API is called GX2. I think in previous pages of this thread, it was said to be similar to OpenGL 3.3 with tessellation support and perhaps some newer features added on. Honestly, all the dick swinging about DX10 or DX11 equivalent features probably matters little to what you'll end up seeing on screen. You can fake just about anything you want with pre-baked effects and they still look pretty good. You would need to really pick nits to even care. The PS3 and Xbox 360 ended up supporting much more modern effects in this manner than the hardware would otherwise indicate it would.

Good post. Actually, for tessellation they probably set it up the way it is in OpenGL 4.0.
PS3 and 360 were said to run "DX11" shaders when they sure as hell can't, but you can get similar results by messing with things and approaching the shaders differently.
This whole gap between what is feasible or not with DX9/DX10.1 compared to DX11 gets blurred as we find more situations like that. It doesn't help when we have devs from COD saying things like, "We made a brand new engine for next-gen and we are able to do a lot of things we didn't do before... We were able to get nearly all of them running on current gen too."
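(For reference, the OpenGL 4.0 setup mirrors the DX11 one: two new programmable stages wrapped around the same fixed-function tessellator. A sketch only; the loader header is an assumption, and the vertex/fragment stages a real program needs are omitted:)

Code:
#include <glad/glad.h>  // any GL 4.0+ function loader

GLuint BuildTessProgram(const char* tcsSrc, const char* tesSrc)
{
    GLuint tcs = glCreateShader(GL_TESS_CONTROL_SHADER);     // ~ DX11 hull shader
    GLuint tes = glCreateShader(GL_TESS_EVALUATION_SHADER);  // ~ DX11 domain shader
    glShaderSource(tcs, 1, &tcsSrc, nullptr); glCompileShader(tcs);
    glShaderSource(tes, 1, &tesSrc, nullptr); glCompileShader(tes);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, tcs);
    glAttachShader(prog, tes);
    glLinkProgram(prog);

    glPatchParameteri(GL_PATCH_VERTICES, 3);  // then draw with GL_PATCHES
    return prog;
}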


Devs have stated it doesn't support it natively and that the features will be adapted, which is how Unity does it anyway: it compiles from DX11 down to DX10 on DX10 hardware.



I've searched and found nothing. Project Cars being DX11 on PC doesn't mean the Wii U is DX11, unless the devs specifically say that the Wii U is DX11 native.



No, it doesn't mean what you think it means. The PS4 doesn't support DX11 either, but it does support DX11 features natively. The wording isn't ambiguous in the least; you are reading what you want to read.

In any case, tessellation and many of the features you see in DX11 have been around pre-DX11. The point is whether these designs are or are not designed to be specifically efficient at them, like compute. I fully believe that we will see games on Wii U use some of these features in limited amounts, but it simply doesn't have the horsepower to punch them all out together like other, more advanced and powerful GPUs.

It all comes down to TDP; at 33 watts you can't hope for the impossible. The same goes for the PS4 and PCs.
There has been word that the tessellation the Wii U uses is not the same as the older R700 DX10.1 version.
HD 2000 came at the end of 2007, and it was VLIW5. The Wii U will have to adapt any DirectX effects, as it doesn't natively support DirectX at all. Again, DX11 wasn't a big update over DX10.1 in terms of ability; the main difference is how tessellation was handled, as it needed a higher-spec'd fixed-function unit for tessellation. I saw someone earlier claiming it was a fully programmable shader, but then what would even be the point of a tessellation unit? Think about that for a second and realize that it is fixed-function. The difference for AMD is that DirectX 10.1 used Gen 2 tessellation, which took shortcuts that are not available to the DirectX 11-compatible fixed-function Gen 3 tessellation units. I am not aware whether the tessellation unit has changed beyond Gen 3, whether it was simply beefed up, or whether programmable shaders help the tessellation unit in Gen 3. All I do know is that Nvidia and Microsoft didn't want to go down the path ATI had discovered for tessellation, which on the R700 actually seems to do really well in the Froblins demo compared to the R800's performance in Unigine Heaven, which couldn't maintain a playable frame rate.

DirectX 11 often just means using Microsoft's software solution to a problem rather than a unique solution that might perform better. Valve's performance findings for OpenGL over DirectX should be very telling as to why DX11 isn't really a feature but simply a spec. OpenGL can produce better performance, and we know both the PS4's and the Wii U's APIs are designed along OpenGL lines, thus any DX11 effects have to be adapted to those consoles; obviously OpenGL has comparable effects.
Hmm. It could be either way. As has been said, though, the effects would still be feasible. It is just that translating DX11 down to DX10.1 will affect efficiency and usage compared to just translating from OpenGL. Perhaps someone could ask those devs to specify.
 

krizzx

Junior Member
I can't be the only one who just lol'd at the irony.

We have known the feature set of the Wii U GPU for a while. It's about the only confirmed spec we have for the system.

You do know what "natively" means, right? I must ask Can Crusher as well, since he selectively skipped that word when underlining.

As has already been stated, the Wii U doesn't support DirectX "anything" natively. Neither did the Wii or GameCube, which led to people saying things like they couldn't do normal mapping, bloom, or other texture effects that they actually could, and did, do in reality.

Note that it says "adapt". If the features were not possible, there would be no way to adapt them. Also, the feature set for the Wii U is GX2, as was restated earlier.

As I have posted numerous times, there have been many dev claims that the Wii U has DX11 capability, and there have been numerous entries in the Project C.A.R.S. changelog showing that it is using DX11 features.
* WiiU fixup 11.11.10 HDR format support (* DX11 Dynamic Envmap prefers 32bit HDR format (11.11.10). Deferred small rendertargets used for envmaps and RVM now attempt to use 11.11.10 for HDR phase if available). I remember someone tried to claim it could be the DX10 version, but it says the DX11 version specifically. Taken from here: http://forums.overclockers.co.uk/showthread.php?t=18529962

* WIP WiiU Multithreaded shadow rendering. DX11 support for multi-threaded shadow rendering (via -DX11MT).

Also this from Unity. http://www.cinemablend.com/games/Wi...ble-DirectX-11-Equivalent-Graphics-47126.html

http://www.nintengen.com/2011/07/wii-us-opengl-will-have-similar-effects.html

We've known from the start that there is no "native" support for DX11 on the Wii U, nor is there native support for DX10 or DX9; DirectX is a Microsoft technology. The Wii U is still capable of producing DX11-equivalent effects.
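(Side note on those changelog entries: "11.11.10" is just a packed 32-bit HDR render-target format - 11-bit red and green floats, 10-bit blue. On the PC/D3D11 side it looks like this; a generic sketch, not Project C.A.R.S. code:)

Code:
#include <d3d11.h>

ID3D11Texture2D* MakeHdrTarget(ID3D11Device* dev, UINT w, UINT h)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = w;
    desc.Height = h;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R11G11B10_FLOAT;  // the "11.11.10" layout:
                                                // HDR range at 8888 bandwidth
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    ID3D11Texture2D* tex = nullptr;
    dev->CreateTexture2D(&desc, nullptr, &tex);
    return tex;
}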
 

alan666

Banned
The games will speak for themselves. If GPU horsepower is that much more important, there are other avenues for it. Nintendo isn't going to release the specs. There's no reason for them to do it. The games coming out will show what the system is capable of - at least at a minimum level.

As always, you buy Nintendo systems for the games that come out for them. This console-war nonsense is detracting from the focus of the thread, though.

There is nothing coming out that looks anywhere near what has been shown for the X1/PS4, though. I know graphics are not everything, but I am talking more about the physics in the games and what is going on with all the 'stuff' on the screen.

Too many people are putting the Wii U in with the X360/PS3 gen, and that is the real issue.

As for "the games coming out will show what the Wii U is capable of": what games? No pun intended, but the Wii U is the most disappointing console I have ever owned.

Back on topic: the PS4 doesn't have DX11 and neither does the Wii U; they both have OpenGL. So once again I ask, shouldn't people be looking at the Wii U not being x86 like the X1 and PS4?

Looking at a chipset can never tell you what it is capable of producing.

The reason Nintendo won't let on what is on the chipset is that it is scared, imo.
 

krizzx

Junior Member
There is nothing coming out that looks anywhere near what has been shown for the X1/PS4, though. I know graphics are not everything, but I am talking more about the physics in the games and what is going on with all the 'stuff' on the screen.

Too many people are putting the Wii U in with the X360/PS3 gen, and that is the real issue.

As for "the games coming out will show what the Wii U is capable of": what games? No pun intended, but the Wii U is the most disappointing console I have ever owned.

Back on topic: the PS4 doesn't have DX11 and neither does the Wii U; they both have OpenGL. So once again I ask, shouldn't people be looking at the Wii U not being x86 like the X1 and PS4?

The reason Nintendo won't let on what is on the chipset is that it is scared, imo.

Real issue? To whom? That sounds like something console-war-obsessed people care about. Also, for the 100th time, no one has ever argued that the Wii U was as strong as the PS4/Xbox One, so why does this constantly keep coming up in conversations?
Looking at a chipset can never tell you what it is capable of producing.

Nintendo hasn't released specs for its hardware since the GC, and it has stated why: the specs aren't important and wouldn't amount to anything good. It's been the same reason consistently. This is exactly the type of console-war crap I'm talking about that keeps getting brought into this thread.

The PS4 and Xbox One both use pretty much the same stock GPUs, and those GPUs likely do support DX11, even the PS4's.
 

AlStrong

Member
As I have posted numerous times, there have been many dev claims that the Wii U has DX11 capability, and there have been numerous entries in the Project C.A.R.S. changelog showing that it is using DX11 features.

Multithreaded rendering is an API support issue, not quite a hardware-level problem (and hasn't been for a while). The 360 had it too.
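(To unpack that: at the API level, "DX11 multi-threaded rendering" means worker threads record into deferred contexts and the main thread replays the command lists; whether the driver actually runs it concurrently is a separate question. A minimal sketch:)

Code:
#include <d3d11.h>

void RecordOnWorkerAndSubmit(ID3D11Device* dev, ID3D11DeviceContext* immediate)
{
    ID3D11DeviceContext* deferred = nullptr;
    dev->CreateDeferredContext(0, &deferred);

    // ... a worker thread records its draw calls into `deferred` here ...

    ID3D11CommandList* cmds = nullptr;
    deferred->FinishCommandList(FALSE, &cmds);   // close the recording
    immediate->ExecuteCommandList(cmds, FALSE);  // replay on the main thread

    cmds->Release();
    deferred->Release();
}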
 
There is nothing coming out that looks anywhere near what has been shown for the X1/PS4, though. I know graphics are not everything, but I am talking more about the physics in the games and what is going on with all the 'stuff' on the screen.

Too many people are putting the Wii U in with the X360/PS3 gen, and that is the real issue.

As for "the games coming out will show what the Wii U is capable of": what games? No pun intended, but the Wii U is the most disappointing console I have ever owned.

Back on topic: the PS4 doesn't have DX11 and neither does the Wii U; they both have OpenGL. So once again I ask, shouldn't people be looking at the Wii U not being x86 like the X1 and PS4?

Looking at a chipset can never tell you what it is capable of producing.

The reason Nintendo won't let on what is on the chipset is that it is scared, imo.

Bayo 2 and MK8 say hi. That is my opinion, of course; I think the Gamersyde 60fps videos are superb. X also shows promise, even with pretty early footage. And SSB is doing 1080p60.

Most other games (first-year games) are 720p60; that alone should tell you something.
 
Bit of a vague way to estimate the number of transistors in Latte, though. Better to just look at the size of the chip and the manufacturing process and compare that to other AMD parts on a similar process. If you do that, it's clear the GPU is well above 500M transistors, more like twice that amount. Obviously the embedded memory will take up a few hundred million transistors, but the GPU logic should still be well over 500M.

When you take these kinds of things into consideration, it again becomes really hard to agree with the 160 ALU count. We're looking at a 500-750M transistor count, with much larger SPUs than 20-ALU SPUs should be at 40nm. No one who backs the 160 ALU count has come up with a compelling argument that reconciles 160 ALUs with these two issues.

160 ALUs would give us 1/2 to 1/3 of that transistor count, and much smaller SPUs.
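(A back-of-the-envelope version of that estimate; all figures are illustrative assumptions, using RV740's published ~826M transistors in ~137 mm^2 on 40nm as the density baseline and ~146 mm^2 for Latte:)

Code:
#include <cstdio>

int main()
{
    const double densityPerMm2 = 826e6 / 137.0;  // ~6.0M transistors/mm^2 (RV740, 40nm)
    const double latteAreaMm2 = 146.0;           // assumed Latte die size
    const double total = densityPerMm2 * latteAreaMm2;

    // Even if a few hundred million go to the 32MB eDRAM and SRAM pools,
    // the remaining logic budget stays well above 500M.
    std::printf("estimated total: %.0fM transistors\n", total / 1e6);  // ~880M
    return 0;
}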
 

c0de

Member
Nice find. This should go in the CPU thread, though, as the GPU has nothing to do with the PPC other than the MCM connectivity, which I would like to see explored more. The CPU thread does not get enough attention.

Seems graphics are all that matters to most people. I guess some things will never change.

Obviously no one cares about facts :) Opinions are more important.
 
There is nothing coming out that looks anywhere near to what has been shown for the X1/PS4 though, i know graphics are not everything but i am talking more about the physics in the games & what is going on with all the 'stuff' on the screen.

Too many people are putting the WiiU in with the X360/PS3 gen & that is the real issue.

As for the games coming out will show what the WiiU is capable of, what games ? no pun intended, but the WiiU is the most disappointing console i have ever owned.
[gifs from the X trailer]


Graphics look as good as ANY PS4/Xbox One game.
 

JordanN

Banned
Man, those X gifs are so annoying.

It's a nice game but it's also barren. PS4/XBO at least have games that are both impressive and show more than one character at a time.
 
Man, those X gifs are so annoying.

It's a nice game but it's also barren. PS4/XBO at least have games that are both impressive and show more than one character at a time.

I agree that gif lists in general are annoying, but there are clearly three (well, the third isn't that clear) characters in the second gif, not counting the likely possibility that the mech is manned.
 
Nowhere near as annoying as people who come here solely for the purpose of attempting to shove the lowest possible opinion of the Wii U down the throats of others...
 

RayMaker

Banned
X has some parts which look beyond the 360/PS3, things like lighting, polycount, and some higher-res textures, but there are also parts which just look like a good-looking 360/PS3 game.

And it does not look as good as a PS4/X1 game; they are a step up in lighting, effects, resolution, and polycount.

I just watched the trailer those gifs are from, and it does look very impressive.

The fact that games like Call of Duty: Ghosts have separate builds for the Wii U/360/PS3 versions and the X1/PS4/PC versions is telling of the Wii U's graphical situation. It seems destined to be lumped in with 360/PS3 ports and get the odd 360+-looking game.
 
X has some parts which look beyond the 360/PS3, things like lighting, polycount, and some higher-res textures, but there are also parts which just look like a good-looking 360/PS3 game.

And it does not look as good as a PS4/X1 game; they are a step up in lighting, effects, resolution, and polycount.

I just watched the trailer those gifs are from, and it does look very impressive.

The fact that games like Call of Duty: Ghosts have separate builds for the Wii U/360/PS3 versions and the X1/PS4/PC versions is telling of the Wii U's graphical situation. It seems destined to be lumped in with 360/PS3 ports and get the odd 360+-looking game.

X is a work in progress. Just thought I'd remind you.
 

Dicer

Banned
Scary that parts of X almost look photo-realistic.

It boils down to this: the Wii U is an efficient machine. The issue is whether devs will bother to write clean, lean code and take advantage of the machine or not.

We have had a pretty good taste of what we can expect if someone puts forth effort, let's see if they do.
 
We are getting a bit off topic.

At best it would be VLIW4, which would still be inefficient for GPGPU compared to GCN. But then it wouldn't make sense to start with a VLIW5 design as the base.

Also, Brazos is an APU design, not a GPU design. The GPUs in Brazos are in fact VLIW5 designs. Considering that the likes of Kingdom Hearts III and Final Fantasy XV aren't coming to the Wii U specifically because of its lack of DX11-level support, I would say we're actually dealing with an R700 core base through and through.

More than likely, most of the heavy customization is directly connected to the main design philosophy behind the system. Low power consumption is clearly a priority here, and you don't get there by beefing up specs all over the place.

On the topic of GPGPU, I asked blu nearly a year ago what additions he would want included if Nintendo asked him what to put in a DX10.1 GPU, and he said, "..a lot of Local Data Share, and/or a fast conduit to the edram for the compute shader threads." From what we saw looking at Latte's magnified photos, there do seem to be extra pockets of SRAM and two small banks of 1T-SRAM/eDRAM within the processor. Perhaps those are a few of the modifications done to improve the efficiency of the VLIW/pre-GCN architecture, though the two small banks are also obviously for GameCube/Wii compatibility.
 

foxuzamaki

Doesn't read OPs, especially not his own
Alright guys, next topic. This pic is possibly from a Wii U Pokémon game, possibly a stadium game. It was shown at the end of this CGI retrospective but looks absolutely nothing like the CGI before it; it looks more like something from a video game. Speculation is leaning towards a stadium/coliseum-type game.
[image: BR1rulwCIAEa5Sc.jpg]
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Good that loop unrolling exists then.

Sure. For certain scenarios. For others, VLIW would be just as good. For yet others, VLIW would actually be better.
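(The unrolling point in a nutshell: a VLIW slot only fills if the compiler finds independent operations to pack side by side, and unrolling is the classic way to expose them. A toy CPU-side illustration of the idea - on the GPU it's the shader compiler doing this packing across the 4 or 5 slots:)

Code:
// Each unrolled line below is independent of the others, so a VLIW-style
// compiler can schedule the multiply-adds side by side instead of leaving
// most of its issue slots idle. n is assumed to be a multiple of 4.
void scale_add(float* out, const float* a, const float* b, int n)
{
    for (int i = 0; i < n; i += 4) {  // unrolled by hand for clarity
        out[i + 0] = a[i + 0] * b[i + 0] + 1.0f;
        out[i + 1] = a[i + 1] * b[i + 1] + 1.0f;
        out[i + 2] = a[i + 2] * b[i + 2] + 1.0f;
        out[i + 3] = a[i + 3] * b[i + 3] + 1.0f;
    }
}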

Please help me out - which one of those quotes undeniably proves you cannot do GPGPU on a VLIW design?

You know what's funny? I'm typing this from a VLIW4 machine, and it's shit in GPGPU scenarios, as seen by the horrid horrid performance in Tomb Raider using TressFX. And that's just hair on one character.
Well, that was a convincing single example. I guess you've actually profiled it and have a fair idea why TressFX underperforms on your GPU?

Essentially you have nothing to go on to even suggest that an architecture that was introduced in 2006 is efficiently designed for compute.
It was designed efficiently enough that it was the platform AMD built upon to bring their entire GPU compute initiative to fruition. Welcome to planet Earth.

I do not need to be a seasoned physicist to understand the concept of physics. You keep baiting; either you want me to ask you the same, or you seem to believe that you need a PhD to be able to use your brain.
You might want to look up the term 'armchair expert' - it could be a real eye-opener for you.

Vague and broad? Completely wrong. See the fact that I can turn on tessellation effects and GPGPU physics on my VLIW4 GPU; it doesn't mean they will run well, nor that the architecture itself was designed specifically with those features in mind, and therefore it might not be efficient at them at all.
Guess what - nothing stops a GCN part from taking a hit on its GPGPU tasks while doing tessellation. It's all a matter of how many resources each task gets allocated for its needs. But you apparently think that proves something - armchair analysis at its best.

Wrong, Brazos is an APU. In 2011 it came with a Radeon HD 6310 or Radeon HD 6250 (Wrestler - VLIW5). In either case these are low-power GPUs with very low performance. It's actually based on the same graphics core as the Mobility HD 4300 series, which is R700.
You have no clue what you're talking about. Brazos happens to be in my notebook. Its GPU (which is called Cedar, since you somehow missed that) comes under a bunch of marketing monikers, including 6xxx and 7xxx ones, and is an R800 through and through. Go educate yourself.

edit: Apropos, I just checked out Wrestler - that's the respin of Cedar and is found in the C70, a respin of the C60. Bottom line: Wrestler has nothing to do with the HD 4xxx, outside of the relation R700 -> R800.

edit2: Before anybody decides Brazos could still host an R7xx in some obscure APU model - here's the list of all GPUs (listed by PCI device ID) found in AMD family 0x14 APUs (i.e. Brazos), and *none* of them is an R7xx - they're all R800 Cedars, Wrestler moniker or not:

Code:
0x9802 : Radeon HD 6310
0x9803 : Radeon HD 6310
0x9804 : Radeon HD 6250
0x9805 : Radeon HD 6250
0x9806 : Radeon HD 6320
0x9807 : Radeon HD 6290
0x9808 : Radeon HD 7340
0x9809 : Radeon HD 7310
0x980a : Radeon HD 7290
 

alan666

Banned
X has some parts which look beyond the 360/PS3, things like lighting, polycount, and some higher-res textures, but there are also parts which just look like a good-looking 360/PS3 game.

And it does not look as good as a PS4/X1 game; they are a step up in lighting, effects, resolution, and polycount.

I just watched the trailer those gifs are from, and it does look very impressive.

The fact that games like Call of Duty: Ghosts have separate builds for the Wii U/360/PS3 versions and the X1/PS4/PC versions is telling of the Wii U's graphical situation. It seems destined to be lumped in with 360/PS3 ports and get the odd 360+-looking game.

This is what I am getting at: it is now becoming ingrained that the Wii U will be put in with the X360/PS3.

If the hardware in the Wii U is so good, devs would be making games for it, but they are not. It could be that the Wii U is hard to develop for, but with the low sales it is not really worth investing the time and money to push for a release on the Wii U.

I have seen lots of threads on here and other sites where people discuss the internals of the Wii U, and some of these people really know what they are talking about. But does it matter? It is all guesswork. It is the games that matter at the end of the day, and there are no games, well, very few. Everyone knows how powerful the PS4/X1 are, and the programmers, developers, and publishers know what they have to work with, but coming to the Wii U it seems they are blind, like they have to work it out for themselves, and nobody can blame them for not bothering to invest the time and money for a small user base.

Does it matter that the Wii U is a compact machine that is energy efficient? Really, when you think about it, does it matter? It doesn't matter to me; it is not like any device in the past has ever wasted energy or space unnecessarily. The Wii U doesn't use much electricity, but who cares if it costs a pound more a month to run an X1/PS4 over a Wii U?

It makes no difference how powerful a console is; it is the games that matter, and the Wii U can only be judged on the games it has. But is it really worth the trade-off to have just a handful of good games over a lifetime rather than a huge range with some good and some bad?
 

plank

Member
Alright guys, next topic. This pic is possibly from a Wii U Pokémon game, possibly a stadium game. It was shown at the end of this CGI retrospective but looks absolutely nothing like the CGI before it; it looks more like something from a video game. Speculation is leaning towards a stadium/coliseum-type game.
[image: BR1rulwCIAEa5Sc.jpg]

But stadium/coliseum games are turn-based RPG battles with CGI cutscenes, aren't they?
 
Barren my ass
Posting a gif of what seems to be a cutscene and another one which displays mech combat gameplay in an empty space, with just a few objects in the foreground and nothing to render in the background, only strengthens the 'barren environment' argument. The sparks / star dust look nice, though.
 
This is what I am getting at: it is now becoming ingrained that the Wii U will be put in with the X360/PS3.

If the hardware in the Wii U is so good, devs would be making games for it, but they are not. It could be that the Wii U is hard to develop for, but with the low sales it is not really worth investing the time and money to push for a release on the Wii U.

I have seen lots of threads on here and other sites where people discuss the internals of the Wii U, and some of these people really know what they are talking about. But does it matter? It is all guesswork. It is the games that matter at the end of the day, and there are no games, well, very few. Everyone knows how powerful the PS4/X1 are, and the programmers, developers, and publishers know what they have to work with, but coming to the Wii U it seems they are blind, like they have to work it out for themselves, and nobody can blame them for not bothering to invest the time and money for a small user base.

Does it matter that the Wii U is a compact machine that is energy efficient? Really, when you think about it, does it matter? It doesn't matter to me; it is not like any device in the past has ever wasted energy or space unnecessarily. The Wii U doesn't use much electricity, but who cares if it costs a pound more a month to run an X1/PS4 over a Wii U?

It makes no difference how powerful a console is; it is the games that matter, and the Wii U can only be judged on the games it has. But is it really worth the trade-off to have just a handful of good games over a lifetime rather than a huge range with some good and some bad?

Is Infinity Ward handling all the other versions while Treyarch exclusively works on the Wii U version of Call of Duty: Ghosts?

The focus on a smaller console has several benefits: smaller fan, easier to keep cool, less material for the casing, easier to make space for the machine, etc. Energy efficiency is just one of the benefits. Those things can positively affect pricing and marketability. If you look at the other next-gen consoles, for example, you can see that these sorts of things have been more of a consideration compared to their predecessors.

I agree with you about the concern over games. Assuming the Wii U survives this holiday, though, games should pick up as Nintendo stabilizes its HD dev teams and properly supports the system to increase the size of the userbase.
 
Good that loop unrolling exists then. etc

I didn't say you can't do GPGPU, just that it won't be efficient enough, nor will the GPU have enough grunt to make it a viable scenario in most games, or big trade-offs will have to be made.

But you're right, I am an armchair analyst. It doesn't really have the negative connotation you want it to have, and I would suggest you actually go learn about armchair theorizing. In the end, if you're right, then we will see Wii U games running in HD and delivering subsurface scattering, tessellation, alpha blending, global illumination, cloth physics, particle physics on GPU, etc. at the same time, because each single feature can be done on the Latte GPU.

For example, if you stick an HD 6310 in there, it will support all that shit. Yet it will run at 10 fps at sub-HD resolutions from the point you turn tessellation on. If we aim for the optimistic target for Latte, at roughly 300 GFLOPs and displaying in HD, you are going to hit a ceiling so quickly I doubt devs will even consider messing around with resource-hog features.

Obviously I can't see the future; you say yes, I say no, and we will see. It will have to be shown in games, not in paper theories.

Scary that parts of X almost look photo-realistic.

Oh come on, not even close.
 

Log4Girlz

Member
So now that we've seen more games from the competition, how does the Latte hold up? I'm starting to think it's perhaps a third as powerful as the competition.
 
So now that we've seen more games from the competition, how does the Latte hold up? I'm starting to think it's perhaps a third as powerful as the competition.

I think that's about right.

If it's 1/3 of the PS4, then X1 is 2/3 of the PS4.

With the Wii U being about 2x (minimum) more powerful than current gen.
 
So now that we've seen more games from the competition, how does the Latte hold up? I'm starting to think it's perhaps a third as powerful as the competition.

I think that's really optimistic. Even if we believe the theory of 320 shader units, we're looking at only 352 GFLOPs with a less modern architecture and feature set (XBO's GPU is 1.31 TFLOPs). There's also less than 1/3 of the available memory and main memory bandwidth.
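(For anyone checking the math, both figures come from the same formula: ALUs x 2 FLOPs per clock (one multiply-add) x clock speed, using the widely reported ~550 MHz Wii U GPU clock and XBO's 768 ALUs at 853 MHz:)

Code:
#include <cstdio>

int main()
{
    // FLOPs = ALUs * 2 (multiply-add) * clock in GHz -> GFLOPs
    std::printf("Latte, if 320 ALUs: %.0f GFLOPs\n", 320 * 2 * 0.550);  // 352
    std::printf("XBO:                %.0f GFLOPs\n", 768 * 2 * 0.853);  // ~1310
    return 0;
}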
 