
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Reallink

Member
What's interesting to me about the no-AA thing on Wii U games is that when a game is in motion, it looks fine. You can't even tell. It's only when there's no motion that whatever it is about how these games are crafted changes. It's the same with Arkham Origins. Terrible jaggies when you just sit there and look at a motionless screen, but when you are actually playing, or observing someone playing, there are some jaggies, but all in all it's night and day.

I'm not sure what they are doing for this to be the case, but in all honesty, if someone thinks less of the game because of how it looks while not playing it, then I don't know what to tell you. The idea of screenshots telling the whole story is outdated.

LCD motion blur, saving Wii U one game at a time.
 

StevieP

Banned
http://www.eurogamer.net/articles/digitalfoundry-batman-arkham-origins-face-off

The Wii U version lacks the anti-aliasing seen even on the ancient PS360 GPUs. I'm not one to go off one game, but the omission is strange.

FXAA and similar solutions are not very computationally intensive, and are post-process effects, obviously, which is why you see them in use on the other consoles. It is not "real" AA as we understand it. FXAA was in the previous Batman Wii U title, from what I recall. It's just that Human Head chose not to implement it for whatever reason (time constraints, budget/team size, etc.).
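For illustration, here's a toy Python/NumPy sketch of the general idea behind FXAA-style post-process AA (a simplification under assumptions, not the real FXAA algorithm and nothing from the Batman port): find high-contrast pixels via luma, then blend them with their neighbours. Real FXAA additionally estimates edge direction and searches along the edge before blending.

import numpy as np

def luma(rgb):
    # Rec. 601-style luma weights
    return rgb @ np.array([0.299, 0.587, 0.114])

def toy_post_aa(img, threshold=0.1):
    # img: HxWx3 float array in [0, 1]; returns a softened copy.
    l = luma(img)
    pad = np.pad(l, 1, mode='edge')
    # Contrast against the 4-neighbourhood decides where the "jaggies" are.
    neigh_max = np.maximum.reduce([pad[:-2, 1:-1], pad[2:, 1:-1],
                                   pad[1:-1, :-2], pad[1:-1, 2:]])
    neigh_min = np.minimum.reduce([pad[:-2, 1:-1], pad[2:, 1:-1],
                                   pad[1:-1, :-2], pad[1:-1, 2:]])
    edge = (neigh_max - neigh_min) > threshold
    # Cheap 3x3 box blur, applied only on the detected edge pixels.
    h, w = img.shape[:2]
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
    blurred = sum(p[dy:dy + h, dx:dx + w]
                  for dy in range(3) for dx in range(3)) / 9.0
    out = img.copy()
    out[edge] = blurred[edge]
    return out

The blur-only-where-contrast-is-high structure is also why FXAA-style filters lose fine detail: any high-contrast texel gets smeared whether it's an edge or not, which is the "vaseline" complaint below.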

The upside is that the game doesn't have the vaseline blur that the other versions do and the previous game did, including on Wii U (there it's compensated for with a sharpening filter - yuck). The downside is that image quality is even worse. I'm not sure which of those options I prefer (when I play on my PC, FXAA is the devil and is always off because of the amount of detail that is lost as a result), but on a 120" 1080p screen, aliasing sticks out just as much as vaseline smearing does.
 
It's just sad. It seems like third parties are doing everything they can to make sure Wii U ports don't take advantage of the hardware and end up as technical failures. Seriously, why would you not let the team that did AC do Origins? It's OK though; at this point I couldn't GAF... bring on 3D World and its graphics that "make everyone else look like they had HD all wrong".
Let's hope that the game will also permanently increase interest in the system so that we may get good ports... or at least ports at all.
 

StevieP

Banned
Let's hope that the game will also permanently increase interest in the system so that we may get good ports... or at least ports at all.

It's best not to get your hopes up. Even if the game sells in the millions and pushes a few extra consoles out the door, you won't really get much third party support. With general output contraction of the major publishers (they're doubling down on a few hardcore/mainstream 16-35 male AAA franchises and trying to create a couple new ones for the most part) and the more experimental stuff moving to DD-indie and mobile, there isn't much outside of a few mega franchises (like Disney stuff/Skylanders/etc) that would fit in with "that audience". Why are we talking about this in the GPU thread again? lol
 

EDarkness

Member
It's best not to get your hopes up.

Seriously. Like last generation, they've made their bed, so good or bad they're in it for the long haul. That's not good for the Wii U even if it ends up being some massive success in the end. All you can do now is enjoy the games we get and pick up one of the other systems to fill the gaps.

Gotta love this industry.
 
It's best not to get your hopes up. Even if the game sells in the millions and pushes a few extra consoles out the door, you won't really get much third party support. With general output contraction of the major publishers (they're doubling down on a few hardcore/mainstream 16-35 male AAA franchises and trying to create a couple new ones for the most part) and the more experimental stuff moving to DD-indie and mobile, there isn't much outside of a few mega franchises (like Disney stuff/Skylanders/etc) that would fit in with "that audience". Why are we talking about this in the GPU thread again? lol
Well, I brought it up because it will be interesting to see better-quality ports as devs become more familiar with the system. I understand why we shouldn't expect too many of those :)
 

StevieP

Banned
I couldn't care less about getting another FPS on Wii U. There is maybe one 3rd party game I want as a "Wii U only" owner. I want quality games more than anything, not games that require day-one patches and are just shitty ports. I am fine with Nintendo's offering this year and what we know so far about next year. I mean, X in itself is going to rule my gameplay next year.
No FPS titles? Well, that's unfortunate. Although I play the majority of those on my PC, the Wii Remote is one of only 2 decent controllers on console for playing FPS titles (and it is much better than Move's gyros when IR is used). I wish more developers would use it, but Nintendo decided against that by not packing a Wii Remote in the box, and consumers have decided that there won't be very many greenlit, as per horrible software sales.

Well, I brought it up because it will be interesting to see better-quality ports as devs become more familiar with the system. I understand why we shouldn't expect too many of those :)

Publishers aren't even printing many copies of the games they are putting on the console, let alone green-lighting more ports :p

Though I do hope that folks at places like Straight Right and HexaDrive get more work.
 
No FPS titles? Well, that's unfortunate. Although I play the majority of those on my PC, the Wii Remote is one of only 2 decent controllers on console for playing FPS titles (and it is much better than Move's gyros when IR is used). I wish more developers would use it, but Nintendo decided against that by not packing a Wii Remote in the box, and consumers have decided that there won't be very many greenlit, as per horrible software sales.



Publishers aren't even printing many copies of the games they are putting on the console, let alone green-lighting more ports :p

Though I do hope that folks at places like Straight Right and HexaDrive get more work.

I agree.

Back on topic, there are still some ongoing questions about Latte:

eDRAM: Bobblehead from the Beyond3D forum stated that devs have access to the 32MB of eDRAM at 31.7 GB/s, and that the 2+1MB of eDRAM/1T-SRAM is not accessible to devs but is somehow being used by the system with Wii U games. We still don't know how those extra banks are being used, and there was something weird mentioned in Latte's docs (I will leave BG to clarify if he wants) that implies that there are some more things going on with MEM1.

160 ALUs and 192 threads?: BG and the others are still trying to figure that out. At this time, it is believed that Nintendo may be using the older definition of GPU "threads" to get to that number, but it is still under discussion.

TMU/ROPs 1:1 Ratio: There may be numerous reasons why Nintendo/AMD decided on that configuration.
 

ahm998

Member
How many shader cores in the Wii U?

From my information:

PS4: 1152 shader cores.

X1: 768 shader cores.

Wii U: 320 shader cores.

Is it true?
 

OryoN

Member
I agree.

Back on topic, there are still some ongoing questions about Latte:

eDRAM: Bobblehead from the Beyond3D forum stated that devs have access to the 32MB of eDRAM at 31.7 GB/s, and that the 2+1MB of eDRAM/1T-SRAM is not accessible to devs but is somehow being used by the system with Wii U games.

Did he confirm how recent those SDKs are? Is that 31.7 GB/s some kind of SDK-imposed limitation for the time being? Assuming that's the actual full BW, I wonder why Nintendo scaled the performance in this manner. In both the GC and Wii, the total eDRAM BW (texture read, and frame & Z-buffer combined) was nearly 6x that of the main RAM's BW. We saw a definite performance gain that justified adding an eDRAM pool.

Going from Wii to Wii U, the main memory BW increased by about 3x, but the total eDRAM BW barely changed? On top of that, the eDRAM offers less than a 3x BW increase over the main RAM (compared to 6x in the past). For all that, why not just make the DDR3 bus wider and faster, and forget about the expensive eDRAM? All that just for better latency and backward compatibility? It doesn't add up...
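For what it's worth, here's that ratio arithmetic as a quick Python back-of-envelope. The 12.8 GB/s DDR3 figure (64-bit bus, DDR3-1600) and the ~4 GB/s Wii GDDR3 figure are the commonly cited ones, not numbers from this thread; the GC figures are the usual 2.6 GB/s main 1T-SRAM vs. 10.4 + 7.6 GB/s for the texture cache + framebuffer eDRAM:

# Assumed figures, see above; only the 31.7 GB/s comes from this thread.
gc_main, gc_edram = 2.6, 10.4 + 7.6        # GB/s
wii_main = 4.0                             # GB/s (approximate)
wiiu_main, wiiu_edram = 12.8, 31.7         # GB/s

print(f"GC eDRAM/main ratio:    {gc_edram / gc_main:.1f}x")      # ~6.9x
print(f"Wii U eDRAM/main ratio: {wiiu_edram / wiiu_main:.1f}x")  # ~2.5x
print(f"Wii -> Wii U main RAM:  {wiiu_main / wii_main:.1f}x")    # ~3.2x

The ~2.5x and ~3.2x results line up with the "less than 3X" and "about 3X" figures above.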

This console is a real mystery. The more questions that get answered, the more are raised. Everyone pretty much understands that it's not a very powerful console, especially on paper. Still, after all these curiously low figures, you can't help but want to understand why the console is able to punch so far above its weight. Too many pieces of this puzzle are missing.

We still don't know how those extra banks are being used, and there was something weird mentioned in Latte's docs (I will leave BG to clarify if he wants) that implies that there are some more things going on with MEM1.
160 ALUs and 192 threads?: BG and the others are still trying to figure that out. At this time, it is believed that Nintendo may be using the older definition of GPU "threads" to get to that number, but it is still under discussion.
TMU/ROPs 1:1 Ratio: There may be numerous reasons why Nintendo/AMD decided on that configuration.

The mystery continues...
 

AzaK

Member
How many shader cores in the Wii U?

From my information:

PS4: 1152 shader cores.

X1: 768 shader cores.

Wii U: 320 shader cores.

Is it true?

Based on what BG has said, the Wii U has 160. That's it. Period. Done. Finished. We can stop worrying about it.


I agree.

Back on topic, there are still some ongoing questions about Latte:

eDRAM: Bobblehead from the Beyond3D forum stated that devs have access to the 32MB of eDRAM at 31.7 GB/s, and that the 2+1MB of eDRAM/1T-SRAM is not accessible to devs but is somehow being used by the system with Wii U games. We still don't know how those extra banks are being used, and there was something weird mentioned in Latte's docs (I will leave BG to clarify if he wants) that implies that there are some more things going on with MEM1.

160 ALUs and 192 threads?: BG and the others are still trying to figure that out. At this time, it is believed that Nintendo may be using the older definition of GPU "threads" to get to that number, but it is still under discussion.

TMU/ROPs 1:1 Ratio: There may be numerous reasons why Nintendo/AMD decided on that configuration.

From what BG said, there aren't 192 actual threads. If you add up all the thread-count constants for the various shader types (geometry, pixel, vertex) you get 192 in one of the configurations, but that doesn't mean you can actually use all of those at any point in time.

Bad analogy time.

Say you have a bridge. Due to laws, you're not allowed more than 100 cars on the bridge. You're also not allowed more than 10 trucks. And you're not allowed more than 2 buses.

Does that mean you can put 100 cars, 10 trucks and 2 buses on the bridge? No, it doesn't. The bridge might only be able to fit/hold, say, 90 cars, 7 trucks and a bus before it's out of room or over the weight limit.
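A tiny Python sketch of that analogy (made-up numbers mirroring the bridge, not real scheduler constants): each per-type cap can be hit individually, but a shared capacity binds first, so the caps don't simply sum.

PER_TYPE_LIMIT = {"car": 100, "truck": 10, "bus": 2}  # the legal caps
WEIGHT = {"car": 1, "truck": 5, "bus": 10}            # relative load
BRIDGE_CAPACITY = 140                                 # the shared limit

def fits(load):
    # load: dict of type -> count; True only if every per-type cap AND
    # the shared capacity hold at the same time.
    if any(load.get(t, 0) > cap for t, cap in PER_TYPE_LIMIT.items()):
        return False
    return sum(WEIGHT[t] * n for t, n in load.items()) <= BRIDGE_CAPACITY

print(fits({"car": 90, "truck": 7, "bus": 1}))    # True  (total load 135)
print(fits({"car": 100, "truck": 10, "bus": 2}))  # False (total load 170)

Swap cars/trucks/buses for vertex/geometry/pixel threads, and the bridge capacity for whatever the sequencer and register file actually support, and you get why the per-type constants can sum to 192 without 192 ever being usable at once.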
 

usmanusb

Member
I think games will look nice because of newer shader technologies and Nintendo's art direction, but I think the main bottleneck will be complex environments and AI, where the Xbox One and PS4 will leave the Wii U far behind.
 
I agree.

Back on topic, there are still some ongoing questions about Latte:

eDRAM: Bobblehead from the Beyond3D forum stated that devs have access to the 32MB of eDRAM at 31.7 GB/s, and that the 2+1MB of eDRAM/1T-SRAM is not accessible to devs but is somehow being used by the system with Wii U games. We still don't know how those extra banks are being used, and there was something weird mentioned in Latte's docs (I will leave BG to clarify if he wants) that implies that there are some more things going on with MEM1.
That's way too low. I mean, even if Nintendo went with a 512-bit bus, at 550 MHz that's 35.2 GB/s.
Does that mean the memory inside the GPU die is only running at 490 MHz? If that's the case, then Nintendo has screwed up to a point that's almost unbelievable. I mean, the point of having that eDRAM is the near-zero latency, but with the memory at 490 MHz and the GPU at 550 MHz there will always be latency due to the memory being that slow.

Why the hell did Nintendo waste 1/3 of the area of the big die on something that:
1. Has a small memory bandwidth compared to what it should have had. The GC's 1MB of texture memory had 10.4 GB/s of theoretical bandwidth. Since we are now talking about 720p resolutions, this 1MB of memory can't be used in the same way and won't be enough.
Even more, if backwards compatibility works like it should, that MB of memory should have a BW of 15.6 GB/s in Wii mode, and if its speed scales proportionally in Wii U mode, it should be faster than the 32MB memory bank.

2. Still has latency with the GPU die due to it being slower than the rest of the chip.

I don't know how Nintendo could screw this up that hard. I mean, they seemed to know how to work on memory architectures in years past, but with this information I'm starting to think that the Wii U is as inefficient a design as the Xbox 360 or the PS3 were, and even less powerful than them.
 

AzaK

Member
No one will know this console in and out like Nintendo. But I just refuse to believe, after what they achieved with the GameCube, that Nintendo would make a dud of an "HD" console. I think it is a balanced console design, and judging by the titles so far, Nintendo is going to make magic and titles that stand the test of time. We will be looking back 10-15 years from now saying damn, that Wii U game still looks good.

Swap "balanced" with "just enough" and you're right. Nintendo made a console just a bit better than what we've had this gen, and just enough better for Nintendo fans who had had the Wii for 6 years. The only balancing they did was in trying to find a sweet spot for effort vs. return.
 

Popstar

Member
That's way too low. I mean, even if Nintendo went with a 512-bit bus, at 550 MHz that's 35.2 GB/s.
Does that mean the memory inside the GPU die is only running at 490 MHz?
512 / 8 * 550 000 000 / 1024³ ≈ 32.78 GB/s.
512 / 8 * 533 000 000 / 1024³ ≈ 31.77 GB/s.
 
Just the fact that it's a member of Beyond3D says enough to me. I have been on those forums and seen the heavy bias towards the Wii U being weaker than 7th gen... I don't believe anything any member of that site says. I almost got banned for suggesting Wii U performance/power shouldn't be judged by launch games, so I gave up and left there for good.
So true. Any speculation that might be good for the Wii U that you bring to that board can get you banned... For me they lost all credibility with acts like that... Everyone can have their own speculation, but trying to shut down all "the Wii U is not that crappy a console after all" talk can mean something...
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
That's way too low. I mean, even if Nintendo went with a 512-bit bus, at 550 MHz that's 35.2 GB/s.
Could be a combination of a typo ('1' instead of '2') and some 10^3-vs-2^10 discrepancies. Behold:

512 * 550 * 10^6 / 8 / 1024^3 ≈ 32.7 GB/s

vs.

512 * 550 * 10^6 / 8 / 1000^3 = 35.2 GB/s
 

Waaghals

Member
So true. Any speculation that might be good for the Wii U that you bring to that board can get you banned... For me they lost all credibility with acts like that... Everyone can have their own speculation, but trying to shut down all "the Wii U is not that crappy a console after all" talk can mean something...

Well, Beyond3D can be pretty brutal to all consoles. They have critiqued AAA exclusives on all systems.

I'm not sure that they have a particular Wii U bias, though.
 
512 / 8 * 550 000 000 / 1024³ ≈ 32.78 GB/s.
512 / 8 * 533 000 000 / 1024³ ≈ 31.77 GB/s.
I screwed up the numbers a bit, but the problem is still the same. With the memory at 533 MHz, you will always introduce additional latency due to it being slower than the GPU.
Since the Wii U doesn't have heat problems, why would they put the memory at 533 MHz and hurt the whole system's performance because of that?
This is not cutting the GPU down in order to save some money; this is screwing up the whole design for no apparent reason (well, maybe a saving of a few milliwatts), which would point towards utmost incompetence in hardware design.

What's worse is that, coming from Iwata, I don't know what to think at this point. I'm not as confident in Nintendo's hardware engineering as I was during the GC/Wii eras, and if this information is legit then the Wii U won't even come close to the Xbox 360 or the PS3 in terms of real performance.

blu said:
Could be a combination of a typo ('1' instead of '2') and some 10^3-vs-2^10 discrepancies.
Well, if it's a typo then this makes a bit more sense from an engineering point of view. I hope this is the case, because otherwise there couldn't be a more screwed-up design than this one.
32.7 GB/s is still a fucking joke of course, but at least it doesn't have the latency problem.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Come on, guys, no need to bash b3d here. Don't judge a forum by whoever's been the most vocal in a thread or two.

Also, BobbleHead did not say that was the entire BW of this pool, only that a game could use it at that BW. For all we know, this figure could be referring to the CPU side alone, and the access to the pool could be multiplexed between the CPU and the GPU. I'm not saying that's the case, just that some of us here have been way too hasty with their worldviews ; )
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Sorry, when you're hitting a rate of like 9 out of 10 people bashing a console, it is to be judged. I literally got like 2 or 3 pages of hateful posts directed towards me, all because I said the Wii U shouldn't be judged by launch games.
Had I done most of my Wii U posting on B3D rather than here, would you have dismissed my posts just as easily? Just curious ; )
 
Not all Beyond3D posters are that bad, and I believe I can trust Bobblehead on this. He was actually responding to a user who implied that Nintendo was wasting those other, smaller eDRAM banks because they are inaccessible to devs. He clarified that nothing in the system is wasted and that they are put to use during Wii U games. The bandwidth figure for the eDRAM was a bonus, though his wording implied that there is something more to this story.
Come on, guys, no need to bash b3d here. Don't judge a forum by whoever's been the most vocal in a thread or two.

Also, BobbleHead did not say that was the entire BW of this pool, only that a game could use it at that BW. For all we know, this figure could be referring to the CPU side alone, and the access to the pool could be multiplexed between the CPU and the GPU. I'm not saying that's the case, just that some of us here have been way too hasty with their worldviews ; )

You seem to have the right idea about the eDRAM bandwidth, Blu. At least, that's what I was thinking.
 

codhand

Member

This is so awful; I have no idea why you'd post this. A bunch of dudes living with their moms complaining that the Wii is "wussy", that they shouldn't have made Wind Waker, and that Nintendo needs to appeal less to everyone and more to 18-35 year old males... really?


Something actually said in this clip:
"I wouldn't be surprised if MS gave Nintendo a payout to have Nintendo take a backseat." LOL, omg...

These guys are still bitter that it wasn't the Nintendo Revolution instead of the Wii. Thank god Nintendo doesn't listen to man-babies.
 

tsab

Member
Could it be possible for Nintendo to optimize the OS processes in order to use less RAM, leaving more RAM for games?

Yeah, sure, like they did on the 3DS, where they allocated more CPU cycles and RAM to the game. This would need a patch for already-released games to take advantage of it, though, or in-development games would have to be compiled with a newer SDK that supports it.
 

tipoo

Banned
eDRAM: Bobblehead from the Beyond3D forum stated that devs have access to the 32MB of eDRAM at 31.7 GB/s, and that the 2+1MB of eDRAM/1T-SRAM is not accessible to devs but is somehow being used by the system with Wii U games. We still don't know how those extra banks are being used, and there was something weird mentioned in Latte's docs (I will leave BG to clarify if he wants) that implies that there are some more things going on with MEM1.

Damn... even the 70 or 130 GB/s assumptions were overestimating it by a long shot, if this is right. How reliable is this guy?


How many shader cores in the Wii U?

From my information:

PS4: 1152 shader cores.

X1: 768 shader cores.

Wii U: 320 shader cores.

Is it true?


Nope, the Wii U is almost certainly 160 at this point.
 
Damn... even the 70 or 130 GB/s assumptions were overestimating it by a long shot, if this is right. How reliable is this guy?
Well, Bobblehead was careful to say that was the bandwidth number accessible to devs, rather than simply saying that was the max bandwidth. That may imply that something else is going on with the bandwidth, like what Blu was suggesting.
 
Given what BG has told me, that eDRAM BW number probably isn't far off. From what he was telling me, 4x the main RAM BW is its peak.

Interesting design, Nintendo. I can't knock it. Theoretically it has many facets of its design that are on paper weaker than the PS3/360, and yet it's not only holding its own but in many ways surpassing what has been achieved.
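Quick arithmetic on that "4x" claim, assuming the commonly cited 12.8 GB/s for the Wii U's DDR3 (an outside figure, not from this thread):

main_ram_bw = 12.8               # GB/s, 64-bit DDR3-1600 (assumed)
edram_peak = 4 * main_ram_bw     # 51.2 GB/s peak, per the 4x claim
dev_visible = 31.7               # GB/s, the figure Bobblehead quoted
print(f"peak {edram_peak:.1f} GB/s vs. dev-visible {dev_visible} GB/s")

On those numbers, the 31.7 GB/s devs see would be roughly 60% of the peak, which is at least consistent with blu's multiplexing guess above.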
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Given what BG has told me, that eDRAM BW number probably isn't far off. From what he was telling me, 4x the main RAM BW is its peak.

Interesting design, Nintendo. I can't knock it. Theoretically it has many facets of its design that are on paper weaker than the PS3/360, and yet it's not only holding its own but in many ways surpassing what has been achieved.
Paper surely is an interesting medium (pardon the worn-out imagery).
 

tipoo

Banned
Interesting design Nintendo. I can't knock it. Theoretically it has many facets of its design that are on paper weaker than PS3/360 and yet its not only holding its own, but in many ways surpassing what has been achieved.

Can't say I agree. "Holding its own" in third-party cross-platform titles usually means being on par, with a few slightly better and a few slightly worse. Looking at these games and thinking the Wii U can at least match the PS360 seems fallacious to me, since those cross-platform games aren't the technical peak of those two consoles. The U has produced neither third-party games with notably better visuals than the PS360 nor first-party games that exceed the peaks of the 7th-gen consoles (i.e., what on the Wii U shows the scope of God of War 3, or the impressive rendering of Halo 4, etc.?).

I'm not saying this proves it's inferior, just that the evidence at this point, IMO, is lacking that it's superior (other than having a bit more than twice the RAM for games).
 

z0m3le

Banned
Uh... that's so incredibly weak :( WHY, NINTENDO, WHY??!?

The number is pretty abstract. I mean, you could never compare the PS4 and XB1 to it; most of the time, the Wii U won't even get the games the others will get. I don't think it mattered much what the shader count is: if it wasn't close enough to get ports that would look similar to the XB1's, then there is really little point in worrying about ALU count, RAM bandwidth or anything else, because the Wii U will mostly end up being bought for its exclusives or last-gen ports, which it can handle, with some care, at a higher fidelity than the 360 or PS3. In the very end that is the conclusion of all Wii U hardware threads; whether you get there now or in 5 years, it doesn't change the outcome.

Can't say I agree. "Holding its own" in third-party cross-platform titles usually means being on par, with a few slightly better and a few slightly worse. Looking at these games and thinking the Wii U can at least match the PS360 seems fallacious to me, since those cross-platform games aren't the technical peak of those two consoles. The U has produced neither third-party games with notably better visuals than the PS360 nor first-party games that exceed the peaks of the 7th-gen consoles (i.e., what on the Wii U shows the scope of God of War 3, or the impressive rendering of Halo 4, etc.?).

I'm not saying this proves it's inferior, just that the evidence at this point, IMO, is lacking that it's superior (other than having a bit more than twice the RAM for games).

You are not taking into account forced V-sync, or the more modern shader effects the Wii U handles at a much lower cost than the last-gen consoles. The Wii U is certainly stronger than the previous gen; it just doesn't really matter much, because what you'll end up with is mostly exclusive content.
 

fred

Member
Can't say I agree. "Holding its own" in third-party cross-platform titles usually means being on par, with a few slightly better and a few slightly worse. Looking at these games and thinking the Wii U can at least match the PS360 seems fallacious to me, since those cross-platform games aren't the technical peak of those two consoles. The U has produced neither third-party games with notably better visuals than the PS360 nor first-party games that exceed the peaks of the 7th-gen consoles (i.e., what on the Wii U shows the scope of God of War 3, or the impressive rendering of Halo 4, etc.?).

I'm not saying this proves it's inferior, just that the evidence at this point, IMO, is lacking that it's superior (other than having a bit more than twice the RAM for games).

I'd say that Super Mario 3D World, Mario Kart 8, Bayonetta 2, X and SSBU exceed the peak of the PS3 and 360.

Out of those titles, Super Mario 3D World is the only one that's finished, and it looks amazing just in screenshots. Plus, as others have mentioned, you have v-sync enabled for a lot of games, which would slow the PS3 and 360 down considerably.

It's obvious that there's something going on under the hood that we're completely unaware of, because the likes of Pikmin 3, The Wonderful 101 and the titles I've listed above shouldn't be possible at 30/60fps with v-sync enabled on a bog-standard 160-ALU part.

Seeing Super Mario 3D World in particular has dispelled any worries I had about the Wii U's power, and that's a late first-generation title. These next few years are going to be very interesting, I think.
 

Xun

Member
I'd say that Super Mario 3D World, Mario Kart 8, Bayonetta 2, X and SSBU exceed the peak of the PS3 and 360.

Out of those titles, Super Mario 3D World is the only one that's finished, and it looks amazing just in screenshots. Plus, as others have mentioned, you have v-sync enabled for a lot of games, which would slow the PS3 and 360 down considerably.

It's obvious that there's something going on under the hood that we're completely unaware of, because the likes of Pikmin 3, The Wonderful 101 and the titles I've listed above shouldn't be possible at 30/60fps with v-sync enabled on a bog-standard 160-ALU part.

Seeing Super Mario 3D World in particular has dispelled any worries I had about the Wii U's power, and that's a late first-generation title. These next few years are going to be very interesting, I think.
Agreed.

I can't wait to see what Nintendo has in-store for the system in the future.
 
I can definitely see where you're coming from, tipoo, and your assessment is probably not wrong in the strictest sense. I just think that when budgeting came around for this year's Wii U titles, with expected unit sales only in the "lucky if we hit 10,000 units" range, the Wii U porting teams weren't given the manpower or money to make a real show of it.

Given that Nintendo has a title using fairly high-quality AO, I think it's safe to say the Wii U finds some things easy that the PS3/360 took years of prodding to achieve.
 

nordique

Member
I can definitely see where you're coming from, tipoo, and your assessment is probably not wrong in the strictest sense. I just think that when budgeting came around for this year's Wii U titles, with expected unit sales only in the "lucky if we hit 10,000 units" range, the Wii U porting teams weren't given the manpower or money to make a real show of it.

Given that Nintendo has a title using fairly high-quality AO, I think it's safe to say the Wii U finds some things easy that the PS3/360 took years of prodding to achieve.

Yes, and at its launch it matched the late/end-of-gen PS360 software pretty well, considering where the PS360 each started off at their respective launches.

As time goes on, the Wii U will impress even more. It might not have as high a ceiling, but looking through the next-gen screenshot thread, it's not like those Wii U games are ugly or anything.

The Wii U is what it is, and given its power draw and output it's pretty impressive, I think. It's not what many hoped for prior to the reveal of the system specs, but for Nintendo's first HD console it gets the job done.
 

Powerwing

Member
Shin'en said:
Theoretical RAM bandwidth in a system doesn't tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.

Shin'en said:
We only know that you need to treat the Wii U differently than other consoles, because of a very different and in our view more accessible architecture.

Shin'en said:
Also your engine layout needs to be different. You need to take advantage of the large shared memory of the Wii U, the huge and very fast eDRAM section and the big CPU caches in the cores. Especially the workings of the CPU caches are very important to master. Otherwise you can lose a magnitude of power for cache-relevant parts of your code.

I know people think Shin'en are biased, but I trust their comments about Wii U hardware.
 
So, I just got a random system update... The whole thing downloaded in less than a minute...

Or, I suppose it could've downloaded in the background and just finished up quickly before it would let me in the eShop...
 

tipoo

Banned
I wish Shin'en would just show their game already.

Did I miss an announcement? Because I thought all their commentary was based off Nano Assault Neo. I'm sure they're making other stuff as we speak, but that game is already out, so what game do you want them to show already?
EDIT: Yes, yes I did miss an announcement. Interesting.

Given that Nintendo has a title using fairly high-quality AO, I think it's safe to say the Wii U finds some things easy that the PS3/360 took years of prodding to achieve.

I've been a hardware geek all my life but have never come across this acronym, AO, in terms of game image quality. Nor can I Google an explanation. Someone make me feel like a complete noob by explaining what it stands for?
 
Did I miss an announcement? Because I thought all their commentary was based off Nano Assault Neo. I'm sure they're making other stuff as we speak, but that game is already out, so what game do you want them to show already?

I've been a hardware geek all my life but have never come across this acronym, AO, in terms of game image quality. Nor can I Google an explanation. Someone make me feel like a complete noob by explaining what it stands for?

http://www.neogaf.com/forum/showthread.php?t=706181&highlight=shinen

Also: AO

http://en.wikipedia.org/wiki/Ambient_occlusion
 
I've been a hardware geek for all my life but have never come across this acronym, AO, in terms of game image quality. Nor can I google an explanation. Someone make me feel like a complete noob by explaining what it stands for?

In my world if you say AO you mean Ambient Occlusion.

Here's a definition for you:

"Ambient Occlusion is a sophisticated raytracing calculation which simulates soft global illumination by faking darkness perceived in corners and at mesh intersections, creases, and cracks, where light is diffused (usually) by accumulated dirt and dust. This has the effect of darkening cracks, corners and points of contact, which is why ambient occlusion is often referred to as a 'dirt shader'."
 
In other words, it makes the King of Red Lions' head look like claymation.

 
On the 192 wavefronts ("threads" in Nintendo-speak), I think that figure could very well be correct, and that the sequencer (or whatever queues the instructions) just doesn't scale linearly as the ALU count increases. I was reading the link below, and it helped me understand things a bit better. While there may be a global limit on threads, how many are in flight on the SIMDs will likely be limited by register space, if anything.

http://www.realworldtech.com/cayman/5/
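A hedged occupancy sketch of that register-space argument (every constant here is an illustrative assumption, not a documented Latte value): the number of wavefronts actually in flight per SIMD is the tightest of several limits, and register pressure usually binds long before any global thread cap.

def waves_in_flight(vgprs_per_thread,
                    regfile_vgprs=16384,  # assumed register file per SIMD
                    wave_size=64,         # threads per wavefront
                    hw_max_waves=32,      # assumed per-SIMD scheduler cap
                    global_cap=192):      # the "threads" figure discussed
    by_registers = regfile_vgprs // (vgprs_per_thread * wave_size)
    return min(by_registers, hw_max_waves, global_cap)

for vgprs in (4, 16, 64):
    print(f"{vgprs:>2} VGPRs/thread -> {waves_in_flight(vgprs)} waves/SIMD")
# 4 -> 32 (scheduler-capped), 16 -> 16, 64 -> 4 (register-bound)

So a shader that's heavy on registers can starve the SIMDs of latency-hiding wavefronts regardless of what the global constant says.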
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
On the 192 wavefronts ("threads" in Nintendo-speak), I think that figure could very well be correct, and that the sequencer (or whatever queues the instructions) just doesn't scale linearly as the ALU count increases. I was reading the link below, and it helped me understand things a bit better. While there may be a global limit on threads, how many are in flight on the SIMDs will likely be limited by register space, if anything.

http://www.realworldtech.com/cayman/5/
That's a particularly good article on AMD's VLIW lineup. Since the scheduling mechanisms from that era remained the same across iterations (with very small exceptions*), pretty much everything (except for cache/LDS sizes) should hold true for Latte. So, good find, Storm.

* Perhaps the biggest change there was when R800 introduced load/read ops in the ALU clause, which previously were not allowed.
 