
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

So the Wii U GPU is (at least) 352 GFLOPS, compared to the 360's Xenos GPU, which is about 240 GFLOPS (according to Wikipedia). I was going to compare it to the PS3's RSX, but it said the RSX was 400.4 GFLOPS. I thought the 360 GPU was better than the RSX? I guess GFLOPS are just part of the equation.
 

tipoo

Banned
Anyone care to take a stab at this? B3D seemed rather convinced about it, but they seem to be convinced about a lot of stuff (they don't know).

I have no clue how big alpha textures alone would be, but certainly all the textures needed in a game couldn't be stored in eDRAM after everything else it's probably used for, so it would still have to hit up the main memory often.
 

Schnozberry

Member
No, going by that comparison there are 8 clusters. The important information to discern from the comparison was that the size difference means there are most likely 40 ALUs per cluster, meaning 320 ALUs.

It also could be completely custom. There are lots of variables.
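For reference, here's a quick back-of-the-envelope sketch (Python) of where the 352 GFLOPS figure quoted earlier comes from, assuming the 8 clusters of 40 ALUs and the rumored 550MHz clock hold up:

```python
# Hypothetical config: 8 SIMD clusters x 40 ALUs at 550 MHz (all assumed).
clusters = 8
alus_per_cluster = 40
clock_hz = 550e6
flops_per_alu_per_cycle = 2  # one fused multiply-add counts as 2 FLOPs

alus = clusters * alus_per_cluster                # 320 ALUs
gflops = alus * flops_per_alu_per_cycle * clock_hz / 1e9
print(f"{alus} ALUs -> {gflops:.0f} GFLOPS")      # 320 ALUs -> 352 GFLOPS
```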
 
It also could be completely custom. There are lots of variables.

Of course, but I didn't want to speculate too much that ALUs could exist on the die outside these clusters, or that there could very well be an odd number of ALUs in there, because we don't know much about the GPU. However, anything is possible with as many open questions as this GPU has.
 

ozfunghi

Member
So the Wii U GPU is (at least) 352 GFLOPS, compared to the 360's Xenos GPU, which is about 240 GFLOPS (according to Wikipedia). I was going to compare it to the PS3's RSX, but it said the RSX was 400.4 GFLOPS. I thought the 360 GPU was better than the RSX? I guess GFLOPS are just part of the equation.

RSX flops are BS numbers.
 

tipoo

Banned
So the Wii U GPU is (at least) 352 GFLOPS, compared to the 360's Xenos GPU, which is about 240 GFLOPS (according to Wikipedia). I was going to compare it to the PS3's RSX, but it said the RSX was 400.4 GFLOPS. I thought the 360 GPU was better than the RSX? I guess GFLOPS are just part of the equation.

Pretty sure Sony used some insane metric to measure the RSX; like back with the PS2, they calculated performance using only shades of grey, not even drawn to the screen, or something like that. The RSX is universally regarded as worse than the 360 GPU. In fact, because of it, the Cell was said to never reach its potential, since it had to waste SPEs on graphics tasks that a better GPU could have handled.

The 360 GPU was also the first to use unified shaders, so it was more like an X1900/HD 2000 series hybrid than a straight X1900 chip.
 

Schnozberry

Member
I have no clue how big alpha textures alone would be, but certainly all the textures needed in a game couldn't be stored in eDRAM after everything else it's probably used for, so it would still have to hit up the main memory often.

I read the patent filing that was posted a few pages back, and it seems the memory controller is specifically designed to reduce the number of writes out to DDR3. I would assume then, and I freely admit I could be very wrong, that the memory subsystem was designed to write a lot of textures out to DDR3 at once, and then stream the information back into eDRAM before being displayed.

In any case, it's an entirely different setup than what was done on 360 and PS3 to achieve good performance, at least from what I've read about the tricks and shortcuts used on those platforms. That, along with the fuzzy nature of the SDK/Final Hardware, would explain why some ports at launch were lacking in the performance department.
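Purely as an illustration of that staging idea (none of these names are a real Wii U API; this is a hypothetical sketch of the pattern), the scheme would look something like prefetching the next batch out of DDR3 while rendering from the fast pool:

```python
# Hypothetical double-buffered streaming loop: render from the fast
# eDRAM-resident batch while the next one streams in from DDR3.
# ddr3 / edram / render_with are stand-ins, not a real SDK interface.
def stream_and_render(batches, ddr3, edram, render_with):
    current = edram.upload(ddr3.read(batches[0]))          # prime first batch
    for upcoming in batches[1:]:
        pending = edram.upload_async(ddr3.read(upcoming))  # prefetch next
        render_with(current)                               # draw from MEM1
        current = pending.wait()                           # swap when ready
    render_with(current)                                   # last batch
```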
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Anyone care to take a stab at this? B3D seemed rather convinced about it, but they seem to be convinced about a lot of stuff (they don't know).
Latte needs to have 70.4GB/s of BW to MEM1 and it would be at virtual parity with Xenos in terms of ROPs/clock (FSAA/zexels notwithstanding), which translates to a 10% advantage in fillrate, assuming 8 ROPs @ 550MHz.
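A sketch of the arithmetic behind those numbers, with the 16 bytes of color+Z traffic per pixel being my own assumption for a worst-case read-modify-write:

```python
# Assumed: 8 ROPs at 550 MHz, 16 bytes/pixel of color+Z read-modify-write
# (4B color read + 4B color write + 4B Z read + 4B Z write).
rops, clock_hz = 8, 550e6
bytes_per_pixel = 16

latte_fill = rops * clock_hz     # 4.4 Gpixels/s
xenos_fill = 8 * 500e6           # Xenos: 8 ROPs @ 500 MHz = 4.0 Gpixels/s
print(latte_fill * bytes_per_pixel / 1e9)        # 70.4 (GB/s needed to MEM1)
print(f"{latte_fill / xenos_fill - 1:.0%}")      # 10% -> the fillrate edge
```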
 
I read the patent filing that was posted a few pages back, and it seems the memory controller is specifically designed to reduce the number of writes out to DDR3. I would assume then, and I freely admit I could be very wrong, that the memory subsystem was designed to write a lot of textures out to DDR3 at once, and then stream the information back into eDRAM before being displayed.

In any case, it's an entirely different setup than what was done on 360 and PS3 to achieve good performance, at least from what I've read about the tricks and shortcuts used on those platforms. That, along with the fuzzy nature of the SDK/Final Hardware, would explain why some ports at launch were lacking in the performance department.

I think that too. That would also eliminate the DDR3 bandwidth limitation problem.
 
Latte needs to have 70.4GB/s of BW to MEM1 and it would be at virtual parity with Xenos in terms of ROPs/clock, FSAA/zexels notwithstanding.

I assume bandwidth is not something measurable just by looking at a die shot. How would we go about finding out how fast the eDRAM and SRAM are? Would we have to run some software-level benchmarks like Héctor Martín?
 

tipoo

Banned
I read the patent filing that was posted a few pages back, and it seems the memory controller is specifically designed to reduce the number of writes out to DDR3. I would assume then, and I freely admit I could be very wrong, that the memory subsystem was designed to write a lot of textures out to DDR3 at once, and then stream the information back into eDRAM before being displayed.

In any case, it's an entirely different setup than what was done on 360 and PS3 to achieve good performance, at least from what I've read about the tricks and shortcuts used on those platforms. That, along with the fuzzy nature of the SDK/Final Hardware, would explain why some ports at launch were lacking in the performance department.

I think that too. That would also eliminate the DDR3 bandwidth limitation problem.


I still don't see how. Even if the DDR3 is streaming textures to the eDRAM, at some point you would be limited by the DDR3 data rate, would you not? From what I'm understanding, it would be like a cache level on a processor: yes, you can have insanely fast cache and increase its size to be pretty big, but at some point you're going to take a penalty waiting for main memory, since it obviously can't stream to the cache/eDRAM as fast as the cache/eDRAM can send the needed information back to the CPU/GPU. Or like the buffer on a hard drive, if you prefer.
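To put rough numbers on that cache analogy (the 70.4 GB/s MEM1 figure is from blu above; the DDR3 number here is just an assumed example value), the harmonic-mean effect of misses is brutal:

```python
# Effective bandwidth when a fraction of traffic misses the fast pool.
edram_bw = 70.4  # GB/s, the MEM1 figure discussed above
ddr3_bw = 12.8   # GB/s, assumed illustrative main-memory figure

def effective_bw(hit_rate):
    # time to move one GB: hits served at eDRAM speed, misses at DDR3 speed
    return 1 / (hit_rate / edram_bw + (1 - hit_rate) / ddr3_bw)

for h in (1.0, 0.9, 0.5):
    print(f"hit rate {h:.0%}: {effective_bw(h):.1f} GB/s effective")
# ~70.4, ~48.6, ~21.7 -- even a modest miss rate drags the average way down
```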
 

ozfunghi

Member
I have no clue how big alpha textures alone would be, but certainly all the textures needed in a game couldn't be stored in eDRAM after everything else it's probably used for, so it would still have to hit up the main memory often.

No, not all of them. The most frequent ones (like in battle scenes). I doubt "all" textures are alpha. Usually these are a few textures being repeated many times (effects), so I would think storage wouldn't be the issue. Also, we know the max read bandwidth of the Wii U is larger than the max read bandwidth of the 360, which never appeared to reach its theoretical performance. And I very much doubt Nintendo would gimp their system in such a specific way. That's why I'm asking about the memory setup. Brain_Stew supposedly said we wouldn't be disappointed with the memory layout... so, to me, that explanation doesn't make sense.

EDIT: since the DDR3 is twice the size (maybe more in the future) of the 360's main RAM, it could load larger chunks at once, having more time to stream to eDRAM, with a total read (or write) speed faster than that of the 360. Only when it starts to do a lot of reading and writing at the same time would it start to be an issue... right?

Latte needs to have 70.4GB/s of BW to MEM1 and it would be at virtual parity with Xenos in terms of ROPs/clock (FSAA/zexels notwithstanding), which translates to a 10% advantage in fillrate, assuming 8 ROPs @ 550MHz.

Ok, so for the less tech-savvy me, that argument is void? Unless the 32MB eDRAM is slower than that? But that is unlikely? And what about tipoo's argument, that not everything could be stored in MEM1?
 

tipoo

Banned
No, not all of them. The most frequent ones (like in battle scenes). I doubt "all" textures are alpha. Usually these are a few textures being repeated many times (effects), so I would think storage wouldn't be the issue. Also, we know the max read bandwidth of the Wii U is larger than the max read bandwidth of the 360, which never appeared to reach its theoretical performance. And I very much doubt Nintendo would gimp their system in such a specific way. That's why I'm asking about the memory setup. Brain_Stew supposedly said we wouldn't be disappointed with the memory layout... so, to me, that explanation doesn't make sense.

I'm sure Nintendo balanced things out as always, but in terms of the eDRAM giving the GPU access to memory much faster than the DDR3 I'm still iffy. On the 360, the 10MB eDRAM was used for high bandwidth operations like FSAA, z-buffering, and alpha blending, and I recall developers saying they were often limited by it because it was *just* large enough for 720p video, and they often had to drop down lower.

The Wii U's is over 3x as large, sure, but if it's doing those same high-bandwidth operations, plus a framebuffer for both screens (or two?), plus potentially being used as a quick data-transfer path between GPU and CPU for GPGPU calculations, how much is left for storing plain old textures, and how much does that reduce the DDR3's burden from textures? My impression was that, apart from resolution, the reason VRAM on graphics cards keeps going up is that texture sizes keep increasing, so obviously they can't fit in a small space. If you could just fit the important ones in a small, fast memory pool, why wouldn't GPU makers do that instead of shoving in 2GB of GDDR5, which costs a lot?
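Some rough framebuffer arithmetic (32-bit color plus 32-bit Z assumed, MSAA modeled simply as a sample multiplier) shows why the 360's 10MB was so tight and how fast 32MB fills up before textures even enter the picture:

```python
# Rough framebuffer sizes in MB: 4B color + 4B Z per pixel (assumed),
# MSAA modeled naively as multiplying the per-pixel footprint.
def fb_mb(w, h, bytes_per_pixel=8, samples=1):
    return w * h * bytes_per_pixel * samples / 2**20

print(f"{fb_mb(1280, 720):.1f}")             # ~7.0 MB: bare 720p fits in 10 MB
print(f"{fb_mb(1280, 720, samples=2):.1f}")  # ~14.1 MB: 2xAA forced tiling on 360
print(f"{fb_mb(1280, 720) + fb_mb(854, 480):.1f}")  # ~10.2 MB of Wii U's 32 MB
# TV + GamePad (854x480) buffers alone eat ~10 MB before any texture data
```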
 

Schnozberry

Member
I still don't see how. Even if the DDR3 is streaming textures to the eDRAM, at some point you would be limited by the DDR3 data rate, would you not? From what I'm understanding, it would be like a cache level on a processor: yes, you can have insanely fast cache and increase its size to be pretty big, but at some point you're going to take a penalty waiting for main memory, since it obviously can't stream to the cache/eDRAM as fast as the cache/eDRAM can send the needed information back to the CPU/GPU. Or like the buffer on a hard drive, if you prefer.

Of course there are always limits imposed by the DDR3 bandwidth. I guess it comes down to "compared to what". In comparison to PS360, it should have far fewer wasted cycles.
 
Yes, clearly a rushed port of an unimpressive multiplatform game by an awful studio, built for the architectures of the HD twins, makes a solid hypothesis.

Just like how Bayonetta proves that the PS3 is garbage and can't maintain the frame rate the 360 can.

Not to mention the fact that most of the horrid textures in X look like they are still alpha (development) textures with 64 x 64 resolution. Meaning? Placeholders.

We need to actually see games built from the ground up on Wii-U to know if memory bandwidth is actually a problem, not some awful port by a third party.

Did you watch the video? You can literally see the WiiU choking! And you can see the EXACT cause of it: it occurs when he uses the brush and the alpha transparencies come into effect. This is a CLEAR example of a system bottlenecking. The main question is why? The B3D hypothesis is that this is due to poor memory bandwidth. I think this hypothesis is well founded and grounded in objective analysis of what we can see. Instead you give a hysterical and meaningless whine that in no way disproves it.
 
Big kudos to the Chipworks guys.

Glad to see that there is some good speculation mixed in here amongst the trolling.
It's going to get even better once the CPU pics go up... and by that I mean the speculation, not the trolling.
Did you watch the video? You can literally see the WiiU choking! And you can see the EXACT cause of it: it occurs when he uses the brush and the alpha transparencies come into effect. This is a CLEAR example of a system bottlenecking. The main question is why? The B3D hypothesis is that this is due to poor memory bandwidth. I think this hypothesis is well founded and grounded in objective analysis of what we can see. Instead you give a hysterical and meaningless whine that in no way disproves it.
What in the world? So the only objective analysis can be that the issue is with the hardware and not the software? I'm not sure what else this could mean.
 
Did you watch the video? You can literally see the WiiU choking! And you can see the EXACT cause of it: it occurs when he uses the brush and the alpha transparencies come into effect. This is a CLEAR example of a system bottlenecking. The main question is why? The B3D hypothesis is that this is due to poor memory bandwidth. I think this hypothesis is well founded and grounded in objective analysis of what we can see. Instead you give a hysterical and meaningless whine that in no way disproves it.

Because the entire hypothesis is nonsense. It's a quick and sloppy port of a game that isn't pushing the system. If it were an issue, why don't much more graphically intensive games have the same problems? Why don't Nintendo's games have them? I haven't seen Nintendo Land so much as hiccup, and it's full of alpha transparencies in the plaza and is MUCH nicer looking.
 

Log4Girlz

Member
Any educated guesses about the transistor count? I think 32MB of eDRAM at any given node size can be easily tabulated. Anyone have an educated guess?
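For what it's worth, the array portion really is easy to tabulate: a standard eDRAM cell is one transistor plus one capacitor (1T1C), so it's roughly one transistor per bit, with sense amps, decoders, and redundancy adding an unknown amount on top:

```python
# 32 MB of 1T1C eDRAM: roughly one transistor per bit for the cell array.
bits = 32 * 2**20 * 8       # 268,435,456 bits
array_transistors = bits    # 1T1C -> ~1 transistor per bit
print(f"~{array_transistors / 1e6:.0f}M transistors for the cell array alone")
# ~268M, before sense amps, row/column decoders, and redundancy
```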
 

Schnozberry

Member
Did you watch the video? You can literally see the WiiU choking! And you can see the EXACT cause of it: it occurs when he uses the brush and the alpha transparencies come into effect. This is a CLEAR example of a system bottlenecking. The main question is why? The B3D hypothesis is that this is due to poor memory bandwidth. I think this hypothesis is well founded and grounded in objective analysis of what we can see. Instead you give a hysterical and meaningless whine that in no way disproves it.

I think it's equally silly to dismiss offhand that unfamiliarity may have caused the developer in question to not leverage the memory subsystem properly. If the documentation was nondescript and the SDK/final kits were very late, it certainly seems possible to me.
 

KingSnake

The Birthday Skeleton
Maybe this needs to be remembered:

Iwata: Every gaming hardware has its specialties. There is a period of hit and miss before the functions can be used fully. We were not able to provide development kits that get out all the power of the Wii U until the middle of last year. With other gaming consoles, firms had 6 to 7 years to experiment, but our console has a different balance, so it is easy to see who has adapted and who hasn't. However, this is something time will heal, so we are not too worried.
 
I think it's equally silly to dismiss offhand that unfamiliarity may have caused the developer in question to not leverage the memory subsystem properly. If the documentation was nondescript and the SDK/final kits were very late, it certainly seems possible to me.

I never dismissed this. The two things can (and probably do) go together. The hypothesis presented was that poor memory bandwidth was the cause. This DOES NOT mean there are no workarounds. But if there is no problem, then why do you need workarounds?

It feels like people don't understand the meaning of a hypothesis. You don't need all the facts to create a hypothesis, just evidence and something to connect those facts.
 

Lonely1

Unconfirmed Member
Anyone care to take a stab at this? B3D seemed rather convinced about it, but they seem to be convinced about a lot of stuff (they don't know).

Just like here, there are people at B3D who are biased towards putting the Wii U in the worst light possible.
 

Schnozberry

Member
I never dismissed this. The two things can (and probably do) go together. The hypothesis presented was that poor memory bandwidth was the cause. This DOES NOT mean there are no workarounds. But if there is no problem, then why do you need workarounds?

Maybe I was too harsh. I guess I meant that it's a matter of using something the way it was designed vs. not. When I first learned how to drive a stick shift, it took me a few months to really get it down to the point where it felt natural. I'm assuming the differences between the memory subsystems of the Wii U and the current HD consoles must be large enough to require some level of training to use properly.

The manual transmission didn't need workarounds; I just didn't grasp the design right away, and I had worse performance from the car as a result.
 

prag16

Banned
Did you watch the video? You can literally see the WiiU choking! And you can see the EXACT cause of it: it occurs when he uses the brush and the alpha transparencies come into effect. This is a CLEAR example of a system bottlenecking. The main question is why? The B3D hypothesis is that this is due to poor memory bandwidth. I think this hypothesis is well founded and grounded in objective analysis of what we can see. Instead you give a hysterical and meaningless whine that in no way disproves it.

And this, my friends, is what we call jumping to conclusions.
 

ozfunghi

Member
I'm sure Nintendo balanced things out as always, but in terms of the eDRAM giving the GPU access to memory much faster than the DDR3 I'm still iffy. On the 360, the 10MB eDRAM was used for high bandwidth operations like FSAA, z-buffering, and alpha blending, and I recall developers saying they were often limited by it because it was *just* large enough for 720p video, and they often had to drop down lower.

The Wii U's is over 3x as large, sure, but if it's doing those same high-bandwidth operations, plus a framebuffer for both screens (or two?), plus potentially being used as a quick data-transfer path between GPU and CPU for GPGPU calculations, how much is left for storing plain old textures, and how much does that reduce the DDR3's burden from textures?

See my edit.

First of all, the problems we are seeing are far worse than what the setup seems to imply in a worst-case scenario, even going by the "worse than 360" scenario. Numbers for 360 bandwidth are theoretical, not achieved in the real world. That number is split between read and write, while the Wii U's is read or write. Since the pool is larger, it will need to do less writing, I suppose. If it only needs to be read at any specific time, it already has more BW than the theoretical 360 numbers. Then there is the "actual" MEM1/eDRAM. So, while I could understand "below 360" performance... I have a hard time believing the B3D explanation based on what we are seeing (Epic Mickey being the worst offender, while running smoothly on Wii).
 

krizzx

Junior Member
Sorry. This isn't the "proof" I'm looking for. The game looks hardly better than the Wii version, which runs perfectly. WiiU < Wii confirmed?

Well stated; too many seem to be drawing irrelevant conclusions that lean toward only the most negative possibility.

Just like here, there are people at B3D who are biased towards putting the Wii U in the worst light possible.
I thought I was the only one who noticed that. I know B3D are supposed to be up there in the tech field, but they are certainly not without bias and agendas.

I'm still waiting for more info regarding Jim Morrison's comment. The bulk of the Digital Foundry analysis seemed to be based on the assumption that the GPU was a 4630? (can't remember) chip, but he said that there are no AMD/ATI markings on the GPU. Also, why do they insist it is based on that chip and not the other ones that were mentioned, like the HD 5550 or e6730? What makes them so certain?
 
I never dismissed this. The two things can (and probably do) go together. The hypothesis presented was that poor memory bandwidth was the cause. This DOES NOT mean there are no workarounds. But if there is no problem, then why do you need workarounds?

It feels like people don't understand the meaning of a hypothesis. You don't need all the facts to create a hypothesis, just evidence and something to connect those facts.

But you don't have evidence; you don't work with the Wii-U. You're free to hypothesize all you want, but the fact that Epic Mickey 2 slowed down IS NOT evidence, and it seems like you are trying to say the slowdown is proof. But proof of what? That they are using the system incorrectly?

What we do know is that the game was built for the HD twins and ported over to the Wii-U in 6 months. We have no problem with you hypothesizing that there is low bandwidth in the Wii-U.

But the video you posted is absolute rubbish. You are saying that a multiplatform game, not optimized for the Wii-U and ported in 6 months, having frame rate issues is proof that there are bandwidth problems, when in fact other games use alpha textures as well and don't suffer from this.

If anything, this is a hypothesis that the full explanation of the memory subsystem in the Wii-U may not have been available to developers creating launch games. Like I said, you will have to see games built from the ground up for the Wii-U struggling with alpha textures to be able to discern any problems with memory bandwidth.
 
Is Epic Mickey a more demanding game than Trine 2?

I think a better question is which one uses more alpha.

I mean, yeah, Epic Mickey 2 is probably not the best game to judge the hardware on, but let's not pretend it's the only game showing issues when there's a lot of alpha going on (BLOPS2, if my memory serves me correctly, and there are others).

EM2 isn't the only game showing issues with alpha, so unless we have a game using lots of alpha that's NOT showing issues, we can't rule out the system having a problem with alpha.

Trine 2, I remember, had a good number of transparency effects, but I don't know how they were achieved or if the volume of them matches what was seen in the games struggling. Perhaps someone can elucidate. We could even try to get one of the Frozenbyte guys in here.
 
Maybe I was too harsh. I guess I meant that it's a matter of using something the way it was designed vs. not. When I first learned how to drive a stick shift, it took me a few months to really get it down to the point where it felt natural. I'm assuming the differences between the memory subsystems of the Wii U and the current HD consoles must be large enough to require some level of training to use properly.

The manual transmission didn't need workarounds; I just didn't grasp the design right away, and I had worse performance from the car as a result.

I don't think this is the case, but I guess it is possible. I do believe the number of games that have this problem lends credence to the hypothesis, but I accept the possibility that all those games' devs were just doing things totally wrong and the bandwidth is fine.

I'm not surprised by the people attacking me because they can't deal with facts, though; it's par for the course in WiiU threads nowadays.
 

guek

Banned
I'm not surprised by the people attacking me because they can't deal with facts, though; it's par for the course in WiiU threads nowadays.

What "facts?" That Epic Mickey runs poorly? You don't see how you might be jumping to conclusions here?

People aren't necessarily disagreeing with the possibility, merely your assertive conclusion.
 

ozfunghi

Member
I don't think this is the case, but I guess it is possible. I do believe the number of games that have this problem lends credence to the hypothesis, but I accept the possibility that all those games' devs were just doing things totally wrong and the bandwidth is fine.

I'm not surprised by the people attacking me because they can't deal with facts, though; it's par for the course in WiiU threads nowadays.

Read my last post. The question I am asking is because the performance we are seeing can in no way, shape, or form be justified by what we know of the memory setup, even if it were worse. The fact that so many games suffer from it so excessively makes me think there has to be another explanation.
 
I think a better question is which one uses more alpha.

I mean, yeah, Epic Mickey 2 is probably not the best game to judge the hardware on, but let's not pretend it's the only game showing issues when there's a lot of alpha going on (BLOPS2, if my memory serves me correctly, and there are others).

EM2 isn't the only game showing issues with alpha, so unless we have a game using lots of alpha that's NOT showing issues, we can't rule out the system having a problem with alpha.

Trine 2, I remember, had a good number of transparency effects, but I don't know how they were achieved or if the volume of them matches what was seen in the games struggling. Perhaps someone can elucidate. We could even try to get one of the Frozenbyte guys in here.

I already pointed out a game. Nintendo Land! It has tons of transparencies in the plaza, especially when it starts getting filled up with more prizes, gadgets, and stuff.
 
Is Epic Mickey a more demanding game than Trine 2?

That's what I'm trying to figure out here. While this is nothing to write home about, Mass Effect 3 was a port to the Wii U, and even then it ran a little better than the PS3 version. I would even say that it has better texture resolution; for example, compare Liara's armor.
Now, I haven't played Epic Mickey, but I would say that ME3 is way more demanding in my eyes.
 
What "facts?" That Epic Mickey runs poorly? You don't see how you might be jumping to conclusions here?

People aren't necessarily disagreeing with the possibility, merely your assertive conclusion.

There are a number of games on the WiiU that have frame rate drops when dealing with alpha textures - FACT. FACT. FACT.

Epic Mickey was used as an EXAMPLE. Because:
1. The issue is on demand - when testing something the best thing is when you know how to recreate the issue. This way we can nicely rule out the problem being something else. In Epic Mickey we can recreate this problem every time.
2. The issue is clear. Some people have problems seeing frame-rate drops. Here both the work the WiiU is struggling with and the frame rate drops are extremely clear.

There was a hypothesis that these drops were due to low memory bandwidth. I agreed with this hypothesis.

That's literally it. Several people attacked me because they can't deal with this.

If presenting a supported hypothesis is jumping to conclusions, then sure, I'll be jumping all day.
 
There are a number of games on the WiiU that have frame rate drops when dealing with alpha textures - FACT. FACT. FACT.

Epic Mickey was used as an EXAMPLE. Because:
1. The issue is on demand - when testing something the best thing is when you know how to recreate the issue. This way we can nicely rule out the problem being something else. In Epic Mickey we can recreate this problem every time.
2. The issue is clear. Some people have problems seeing frame-rate drops. Here both the work the WiiU is struggling with and the frame rate drops are extremely clear.

There was a hypothesis that these drops were due to low memory bandwidth. I agreed with this hypothesis.

That's literally it. Several people attacked me because they can't deal with this.

If presenting a supported hypothesis is jumping to conclusions, then sure, I'll be jumping all day.

Man, you are going to regret a lot of the things you are saying at E3. Mark it, people. Lol
 
See my edit.

First of all, the problems we are seeing are far worse than what the setup seems to imply in a worst-case scenario, even going by the "worse than 360" scenario. Numbers for 360 bandwidth are theoretical, not achieved in the real world. That number is split between read and write, while the Wii U's is read or write. Since the pool is larger, it will need to do less writing, I suppose. If it only needs to be read at any specific time, it already has more BW than the theoretical 360 numbers. Then there is the "actual" MEM1/eDRAM. So, while I could understand "below 360" performance... I have a hard time believing the B3D explanation based on what we are seeing (Epic Mickey being the worst offender, while running smoothly on Wii).

Alpha performance is REALLY dependent on resolution though, isn't it? So I don't think the Wii / Wii U comparison tells us much. Alan Wake runs at such a low framebuffer size on 360 in large part because of how much alpha it has to do.
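The resolution point in numbers: a full-screen transparency pass touches three times as many pixels at 720p as on the Wii, and stacked layers multiply that (the overdraw count here is just an assumed example):

```python
# Alpha blending cost scales with pixels covered x overdraw layers.
wii_pixels = 640 * 480      # 307,200 (Wii renders at 480p)
wiiu_pixels = 1280 * 720    # 921,600
layers = 4                  # assumed overdraw depth, for illustration only

print(f"{wiiu_pixels / wii_pixels:.1f}x the pixels per layer")   # 3.0x
print(f"{wiiu_pixels * layers:,} blended pixels/frame at 4 layers")
```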
 
Man, you are going to regret a lot of the things you are saying at E3. Mark it, people. Lol

The only thing he could possibly 'regret' is agreeing with the hypothesis that we're seeing alpha issues on multiple Wii U titles because of memory bandwidth. The things he calls facts... all are. Epic Mickey 2 does allow you to create the issue on demand, and multiple titles show the same issue.
 
There are a number of games on the WiiU that have frame rate drops when dealing with alpha textures - FACT. FACT. FACT.

Epic Mickey was used as an EXAMPLE. Because:
1. The issue is on demand - when testing something the best thing is when you know how to recreate the issue. This way we can nicely rule out the problem being something else. In Epic Mickey we can recreate this problem every time.
2. The issue is clear. Some people have problems seeing frame-rate drops. Here both the work the WiiU is struggling with and the frame rate drops are extremely clear.

There was a hypothesis that these drops were due to low memory bandwidth. I agreed with this hypothesis.

That's literally it. Several people attacked me because they can't deal with this.

If presenting a supported hypothesis is jumping to conclusions, then sure, I'll be jumping all day.

First of all, no one attacked you; they attacked your argument. Not because we "can't deal with this," but because you have no actual proof. We have cases of both alpha textures working fine and alpha textures causing slowdowns.

BUT, to say the textures aren't working because of bandwidth is absolutely false. These games work great on Xbox 360. FACT. Even without eDRAM, the Wii-U's DDR3 is faster than the Xbox 360's. FACT.

How is the problem bandwidth if the system that has less of it is streaming those oh-so-hard-to-do alpha textures perfectly? What we are trying to say is that it is MUCH MORE LIKELY that there is a problem with the software development and not the hardware.

Don't take it personally; this is a speculation thread, so yes, any speculation will be dissected and discussed.
 

guek

Banned
There are a number of games on the WiiU that have frame rate drops when dealing with alpha textures - FACT. FACT. FACT.

Epic Mickey was used as an EXAMPLE. Because:
1. The issue is on demand - when testing something the best thing is when you know how to recreate the issue. This way we can nicely rule out the problem being something else. In Epic Mickey we can recreate this problem every time.
2. The issue is clear. Some people have problems seeing frame-rate drops. Here both the work the WiiU is struggling with and the frame rate drops are extremely clear.

There was a hypothesis that these drops were due to low memory bandwidth. I agreed with this hypothesis.

That's literally it. Several people attacked me because they can't deal with this.

If presenting a supported hypothesis is jumping to conclusions, then sure, I'll be jumping all day.

You're flipping out because people disagree with your hypothesis. Deal with it. You've had the most marked reaction to this discussion out of everyone involved. Acting like people shouldn't be skeptical that this is an issue with hardware instead of software is, indeed, jumping to conclusions. It didn't help that you used a really poor example.
 

ozfunghi

Member
Alpha performance is REALLY dependent on resolution though, isn't it? So I don't think the Wii / Wii U comparison tells us much. Alan Wake runs at such a low framebuffer size on 360 in large part because of how much alpha it has to do.

I was taking the Wii into account because, for the hardware they were going for at that time (6 years ago), they didn't seem to "gimp" it in a way that it couldn't run decent alpha textures (relative to its graphical output, obviously). It would be strange for Nintendo to overlook this NOW with the WiiU. You could even argue that EM2 would have been better served using the Wii's alpha textures than what the game ended up doing on the WiiU at unplayable framerates.
 