
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Argyle

Member
Nobody has said anything about the TEV Unit from Flipper being the 'secret sauce'. If Latte is using fixed functions (and I believe it is) then the TEV Unit will have to have evolved considerably to prevent the Wii U from having the same problem that the Wii had with ports. Hollywood gave the Wii a nonstandard rendering pipeline making ports impossible. We know this isn't the case with the Wii U because developers have found it easy to port PS3 and 360 games to the console.

And if it isn't using fixed functions and Latte is indeed a 160 ALU GPU then I'd love someone to explain how Bayonetta 2 is possible when the power draw is so low and why the ALUs are also twice the size they should be. Fixed functions of some description explain the size of the ALUs and the low power draw.

It doesn't matter how efficient the shaders are, what we've seen of Super Mario 3D World, Bayonetta 2, Mario Kart 8, X and SSBU shouldn't be possible on a bog standard 160 ALU GPU.

You can call it secret sauce, Nintendo magic, Navi fairy dust, Pikmin hard at work or whatever you want but there's plenty going on under the hood that we're completely unaware of. Again, if you have an alternate theory to fixed functions making all this possible that also explains the huge ALUs and low power draw then we're all ears.

Just my opinion, but you won't like it. There's nothing there in any of those games that makes me think "wow how did they do that on Wii U?" As a result I don't think there is any special sauce involved at all.
 

NBtoaster

Member
I asked you first: if you can explain what isn't technical about it, then we can talk. A lot of people just come in here and talk down on games but never explain their reasoning.

You don't start from the position that a game is technically impressive. You need to point out what is impressive about it first.

But I can point out what is unimpressive: no AA, no AF, low-poly environments, low-res transparencies, bad water effects.

What the game does do is maintain some relatively high res textures and decent detail at a distance, but it obviously doesn't come free.
 
Of course the thread was bumped with more graphics wars, as if we haven't had enough of those. Mario 3D World is impressive, even just for the lighting, the frame rate and the physics engine, which I found to be very precise for a Mario game.
 

fred

Member
Just my opinion, but you won't like it. There's nothing there in any of those games that makes me think "wow how did they do that on Wii U?" As a result I don't think there is any special sauce involved at all.

Most of those titles are 720p native at 60fps with v-sync enabled.

Bayonetta 2 is the standout title for me. The Gomorrah boss fight looks to be pushing an impressive amount of polys - the Gomorrah boss model is huge and detailed, the skyscraper is pretty big and the rest of the city is in the background, including the water.

There's absolutely no way that a bog standard 160 ALU GPU drawing 20-odd Watts can do all that. Unless Nintendo have rewritten the laws of physics.

And you haven't given a plausible explanation of why the ALUs are almost twice the size they should be.
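For what it's worth, the raw-throughput side of this argument is easy to put numbers on. Here's a back-of-envelope sketch, assuming the commonly cited ~550 MHz core clock and 2 FLOPs per ALU lane per cycle (multiply-add); both figures are assumptions, not confirmed Latte specs:

```python
# Peak programmable-shader throughput for a hypothetical 160-ALU part.
# alus: number of ALU lanes; clock_mhz: core clock in MHz;
# flops_per_alu: ops per lane per cycle (2 for a fused multiply-add).
def peak_gflops(alus, clock_mhz, flops_per_alu=2):
    return alus * clock_mhz * 1e6 * flops_per_alu / 1e9

print(peak_gflops(160, 550))  # 176.0 GFLOPS
print(peak_gflops(240, 500))  # 240.0 GFLOPS -- Xenos (360), for scale
```

Of course, raw FLOPS says nothing about fixed-function help or efficiency, which is exactly what's being debated.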
 

Argyle

Member
Most of those titles are 720p native at 60fps with v-sync enabled.

Bayonetta 2 is the standout title for me. The Gomorrah boss fight looks to be pushing an impressive amount of polys - the Gomorrah boss model is huge and detailed, the skyscraper is pretty big and the rest of the city is in the background, including the water.

There's absolutely no way that a bog standard 160 ALU GPU drawing 20-odd Watts can do all that. Unless Nintendo have rewritten the laws of physics.

And you haven't given a plausible explanation of why the ALUs are almost twice the size they should be.

Shrug...how do you know exactly what a 160 ALU GPU can do? You sound like the folks who claim to be able to count polygons from a screenshot.

Twice as big...compared to what? Isn't the Wii U GPU made at a different foundry than whatever you are comparing it to?
 

A More Normal Bird

Unconfirmed Member
I asked you first: if you can explain what isn't technical about it, then we can talk. A lot of people just come in here and talk down on games but never explain their reasoning.
Plenty of the opposite takes place as well. Anything that's vaguely round has to be tessellated or proof of a dual graphics engine. Mario is impossible on a 160ALU GPU because... reasons. Not a list of the effects used and educated estimates of their render time in ms, not a statement from a developer, it just can't be.
 

If you look at the question, it was about the final dev kit. That's not the one the rumor is about.

Nobody has said anything about the TEV Unit from Flipper being the 'secret sauce'. If Latte is using fixed functions (and I believe it is) then the TEV Unit will have to have evolved considerably to prevent the Wii U from having the same problem that the Wii had with ports. Hollywood gave the Wii a nonstandard rendering pipeline making ports impossible. We know this isn't the case with the Wii U because developers have found it easy to port PS3 and 360 games to the console.

And if it isn't using fixed functions and Latte is indeed a 160 ALU GPU then I'd love someone to explain how Bayonetta 2 is possible when the power draw is so low and why the ALUs are also twice the size they should be. Fixed functions of some description explain the size of the ALUs and the low power draw.

It doesn't matter how efficient the shaders are, what we've seen of Super Mario 3D World, Bayonetta 2, Mario Kart 8, X and SSBU shouldn't be possible on a bog standard 160 ALU GPU.

You can call it secret sauce, Nintendo magic, Navi fairy dust, Pikmin hard at work or whatever you want but there's plenty going on under the hood that we're completely unaware of. Again, if you have an alternate theory to fixed functions making all this possible that also explains the huge ALUs and low power draw then we're all ears.

Comparing Latte to Xenos and RSX: Xenos was the first market-ready GPU with a unified shader architecture (USA), while RSX was an EOL pre-USA GPU. Both are also EOL DX9+ level GPUs that started out on a 90nm fab. Latte is a mid-gen USA part that is also an EOL DX10.1+ level GPU, on a fab that's most likely 45nm. Latte doesn't quite have the same raw power, but should be more efficient and have more advanced features. The latter more than likely plays a strong part in the Project CARS team dropping PS360 development while continuing the Wii U version. The ALU core sizes meaning something is pretty irrelevant considering what's known.

And considering those games are tailored to a 160 ALU GPU, they are definitely capable of looking like that with those specs and TDP. This isn't something like BF4 on PC designed with high end GPUs in mind being run on a PC with a low end GPU.
 

wsippel

Banned
If you look at the question, it was about the final dev kit. That's not the one the rumor is about.
I believe I was one of the guys responsible for spreading the 770LE rumor, and my source was dubious. Second-hand, misinterpretation and stuff. Basically, the guy interpreted "RV770, but weaker" as "RV770LE", when it was at no point an RV770 - I think the original source just wanted to say that it's R700-based. Going by what other developers said, every pre-release kit revision was stronger than the one before it, not weaker. I don't think there ever was a downgrade.

EDIT: Also, one of the reasons I considered stuff like TEVs instead of TMUs is, in fact, X. Alpha seems to be a huge problem with ports, but native games, and X in particular, throw alpha-blended shit on screen like there's no tomorrow. And alpha and multi-texturing were the main strong points of the TEVs, as far as I understand. While raw shader performance is awesome, at the end of the day you will use a lot of it on stuff dedicated hardware can do much more efficiently.
 
I believe I was one of the guys responsible for spreading the 770LE rumor, and my source was dubious. Second-hand, misinterpretation and stuff. Basically, the guy interpreted "RV770, but weaker" as "RV770LE", when it was at no point an RV770 - I think the original source just wanted to say that it's R700-based. Going by what other developers said, every pre-release kit revision was stronger than the one before it, not weaker. I don't think there ever was a downgrade.

Yeah, but I don't really attribute any possible validity of the rumor to anything you might have dug up early on. Although similar info did come from other places as well, and I doubt they had the same source. Me personally, I assumed it was the second dev kit, primarily because of Vigil's comment, but the source of this rumor didn't mention a specific dev kit, just that the overheating caused Nintendo to "downgrade" the hardware. For all we know that could have been the downclocking that came out after E3 '11, I believe. Which was even more interesting because, like you said, there was never a public indication of a downgrade beforehand. Though who would really come out and say something like that anyway? Although there clearly and unfortunately was a third-party shift from E3 '11 to E3 '12.

Hopefully I'll join you guys on Miiverse sometime next year.
 

HTupolev

Member
EDIT: Also, one of the reasons I considered stuff like TEVs instead of TMUs is, in fact, X. Alpha seems to be a huge problem with ports, but native games, and X in particular, throw alpha-blended shit on screen like there's no tomorrow. And alpha and multi-texturing were the main strong points of the TEVs, as far as I understand. While raw shader performance is awesome, at the end of the day you will use a lot of it on stuff dedicated hardware can do much more efficiently.
From what I can tell digging around the internet, a TEV is just a small Harvard-architecture processor with a very small amount of program memory, a single op type, and a small instruction set of texture-combining ops. Basically, it's an early pixel shader lacking "branching" and heavy computational ops.

The Gamecube's alpha blending performance was decent, probably because the framebuffer interactions were isolated from other memory access, not because of anything the TEVs were doing. The PS2 and Xbox 360 also had good alpha blending performance, whereas the render-to-big-pool oXbox and PS3 have had some issues there.

And aren't the TEVs called "good for multitexturing" simply because they could handle a then-respectable eight textures per pass? That figure isn't very impressive anymore.
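For readers following along, here's a toy model of the stage pipeline being described. The combiner equation is roughly the one given in homebrew GX documentation (out = clamp((d + lerp(a, b, c) + bias) * scale)), heavily simplified here; this is a sketch of the idea, not an exact hardware spec:

```python
# One simplified TEV-style combiner stage: blend inputs a and b by c,
# add d plus a bias, scale, then clamp to [0, 1]. Values are per-channel.
def tev_stage(a, b, c, d, bias=0.0, scale=1.0):
    lerp = (1.0 - c) * a + c * b
    return max(0.0, min(1.0, (d + lerp + bias) * scale))

# Multitexturing is just chaining: one stage's output feeds the next
# stage's d input.
base = tev_stage(0.8, 0.2, 0.5, 0.0)    # blend two texture samples -> 0.5
lit  = tev_stage(base, 0.0, 0.0, 0.25)  # add a lighting term -> 0.75
```

Which matches HTupolev's point: it really is just a short, branch-free pixel-combining program.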
 
And aren't the TEVs called "good for multitexturing" simply because they could handle a then-respectable eight textures per pass? That figure isn't very impressive anymore.
It's not about being impressive, the Wii U obviously not being that impressive; it's about doing more with less.

See it this way: pulling off what was being done on the GC/Wii on a modern architecture would probably require more computational power, even if you were doing the same thing via modern shaders. There are some reasons for that, starting with the fact that the TEV was very close to the silicon in operation, as well as simple; the simpler something is, usually the more efficient it is, albeit limited.

TEV was fixed function, and thus, predictable. That means whatever it managed to do, other than putting those pipelines to use, was "free".
 

AzaK

Member
Yeah, but I don't really attribute any possible validity of the rumor to anything you might have dug up early on. Although similar info did come from other places as well, and I doubt they had the same source. Me personally, I assumed it was the second dev kit, primarily because of Vigil's comment, but the source of this rumor didn't mention a specific dev kit, just that the overheating caused Nintendo to "downgrade" the hardware. For all we know that could have been the downclocking that came out after E3 '11, I believe. Which was even more interesting because, like you said, there was never a public indication of a downgrade beforehand. Though who would really come out and say something like that anyway? Although there clearly and unfortunately was a third-party shift from E3 '11 to E3 '12.

Hopefully I'll join you guys on Miiverse sometime next year.

Hey BG. Was there ever any mention of how much the downgrade was.

Also, be great to see you on Miiverse :)
 

tipoo

Banned
I don't think it's fair to say things like that when God of War: Ascension is largely fixed view, and the gamer cannot control the camera at all, whereas in Zelda games, and in that 2011 tech demo, the user had complete control, zooming in and out, as well as any direction, change of lighting at a whim, etc.

God of War games look so great in part because of almost everything being pre-rendered in a way that doesn't allow for visual changes or angle of view that aren't scripted. They are on rails games where the gamer controls whether or not the train is moving forward. The Wii U is able to output visuals like that which aren't restricted in that way.

I agree with you on the hope in art style though. They probably won't do it, but hope is what it is.

I can see your point, but it's also an open-roam game (even if it's pretty linear); it's not exactly a 2.5D side-scroller, by far. They can probably cull things better than in a movable-camera, more open-world game, but it's still somewhere in between, and probably closer to fully open games than to side-scrollers. And funnily enough, people used to try to debunk the argument that platformers were easier to process and render earlier in this thread :p
 

TheD

The Detective
Latte doesn't quite have the same raw power, but should be more efficient and have more advanced features. The latter more than likely plays a strong part in the Project CARS team dropping PS360 development while continuing the Wii U version.

That is false: they did not drop support for the 360 and PS3 because of that (the game even still has a DX9c path, which would have a few fewer API features than the last-gen console versions). They dropped support due to there being replacement consoles that they wanted to target instead.
 
I don't think it's fair to say things like that when God of War: Ascension is largely fixed view, and the gamer cannot control the camera at all, whereas in Zelda games, and in that 2011 tech demo, the user had complete control, zooming in and out, as well as any direction, change of lighting at a whim, etc.

God of War games look so great in part because of almost everything being pre-rendered in a way that doesn't allow for visual changes or angle of view that aren't scripted. They are on rails games where the gamer controls whether or not the train is moving forward. The Wii U is able to output visuals like that which aren't restricted in that way.

I agree with you on the hope in art style though. They probably won't do it, but hope is what it is.
I haven't ever actually played a God of War game (I know, I know), but from what I've seen before, there's nothing that suggests anything is pre-rendered, nor that the camera is immovable and the game "on rails." Can you verify that's right? I find that difficult to believe, but then again, it's hard to tell if a camera can move from footage alone.
 
I haven't ever actually played a God of War game (I know, I know), but from what I've seen before, there's nothing that suggests anything is pre-rendered, nor that the camera is immovable and the game "on rails." Can you verify that's right? I find that difficult to believe, but then again, it's hard to tell if a camera can move from footage alone.
There's no camera control. You only view the game from the angles the devs allow... This means that they only have to render what's facing their camera as it moves upon a predetermined, linear path as you move through the level. This also means they can fake geometry by using intricate textures on a few flat planes in areas where the camera remains mostly static.
 

NBtoaster

Member
just for reference.

Red Dead Redemption
John Marston - 14,980
John Marston (Deadly Assassin) - 13,362

Metroid Prime 3: Corruption
Samus - 18,962

Open-world games of the 7th gen might look good, but they sacrifice a lot to look that way, and character polycount is just one example of that. I can guarantee you that the poly count of the main character in X trumps that of RDR's John... but again, that's just an example.

source: http://beyond3d.com/showthread.php?t=43975

That's a pretty bad comparison, because while John's model is lower-poly, it's constantly on screen and features far higher quality shading, textures, normal maps, etc. The model is far more complex than anything from the 6th generation or on Wii.

There's no camera control. You only view the game from the angles the devs allow... This means that they only have to render what's facing their camera as it moves upon a predetermined, linear path as you move through the level.

A dev debunked this ages ago. It's not fixed and they cannot predict everything the player does or sees.

http://forum.beyond3d.com/showpost.php?p=1394584&postcount=1
 
Hey BG. Was there ever any mention of how much the downgrade was.

Also, be great to see you on Miiverse :)

No sir. Just that Nintendo supposedly decided not to go with their original target.

That is false: they did not drop support for the 360 and PS3 because of that (the game even still has a DX9c path, which would have a few fewer API features than the last-gen console versions). They dropped support due to there being replacement consoles that they wanted to target instead.

I didn't claim it as fact.
 

HTupolev

Member
It's not about being impressive, the Wii U obviously not being that impressive; it's about doing more with less.
But when you're already taking advantage of the computational ops in a modern USA? Obviously a TEV would be a cheaper adder than a modern shader, but only when all you need is an adder. GPU scheduling is already pretty wild; throwing in extra hardware off to the side with its own shader type, registers, instruction memory, and whatnot for a case that would see only very small utilization (and merely to slightly improve availability of the main shader processors during pixel-shading steps) seems more than a little inelegant.

TEV was fixed function, and thus, predictable. That means whatever it managed to do, other than putting those pipelines to use, was "free".
Not really. From what I can tell, the only significant difference between TEV and a pixel shader is the variety of supported ops and the program length. Again, that meant that it's much smaller than a shader processor if all you want to do is add textures together, but it's not like it didn't in theory suffer the same sorts of limitations.

If the TEV was "free", it was seemingly only because you don't usually need more texture-combining ops than textures, so the TEV program would usually not wind up longer than the TMU's texture list (or vice versa, hence small hardware with very good utilization). So the TMU would usually not bottleneck on the TEV. But if you did need more TEV steps than textures for some reason, I don't see why pixel throughput wouldn't have to decrease, like with any other shader ever. It's not like the TEV would have been able to execute multiple sequential steps in a single cycle.

Unless I'm just totally wrong, which is possible considering how little solid data I've seen.
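The bottleneck argument above can be sketched in a couple of lines. This is a toy model with illustrative numbers, not measured Flipper behavior:

```python
# If the TMU and TEV run in parallel in a pipeline, per-pixel cost is
# bounded by whichever needs more cycles (assuming one fetch or one
# combine op per cycle, which is an illustrative simplification).
def cycles_per_pixel(texture_fetches, tev_stages):
    return max(texture_fetches, tev_stages)

print(cycles_per_pixel(4, 4))  # 4 -- combines hide behind fetches ("free")
print(cycles_per_pixel(4, 9))  # 9 -- extra TEV steps now cost throughput
```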
 
A dev debunked this ages ago. It's not fixed and they cannot predict everything the player does or sees.

http://forum.beyond3d.com/showpost.php?p=1394584&postcount=1

I stand corrected, I guess. I think he worded his response very well, at the very least. However, I don't buy that this means the world is fully realized behind the camera just because he says there's geometry there. If it is, then these guys are pretty wasteful with their manpower and pretty terrible at resource management: it would be pretty stupid to do all of the texture passes on geometry that is behind the camera when the camera's allowed movement is only a few degrees in each direction, plus panning and zooming. So... geometry behind the camera does not equal a finished world, and he didn't mention whether there would need to be asset reallocation in order to give full camera control without the world looking broken.

Greatly worded response, though. He said enough to refute the surface claim without elaborating on anything, unless I missed other posts.

Then again... I did just wake up, maybe I read it wrong.
 
But when you're already taking advantage of the computational ops in a modern USA? Obviously a TEV would be a cheaper adder than a modern shader, but only when all you need is an adder. GPU scheduling is already pretty wild; throwing in extra hardware off to the side with its own shader type, registers, instruction memory, and whatnot for a case that would see only very small utilization (and merely to slightly improve availability of the main shader processors during pixel-shading steps) seems more than a little inelegant.
Oh, don't get me wrong (and I can totally see why you did by the way I worded it), I'm not defending the theory of it being there.

Everything points to that not being the case, and instead to TEV instructions just being translated somewhere into code a modern shader architecture understands; we reached that conclusion here a long time ago.

I was focusing on the idea of fixed function being efficient, and therefore somewhat convenient in a low-power console like this; but if the Wii U has something along those lines going on, it isn't a TEV pipeline.
Not really. From what I can tell, the only significant difference between TEV and a pixel shader is the variety of supported ops and the program length. Again, that meant that it's much smaller than a shader processor if all you want to do is add textures together, but it's not like it didn't in theory suffer the same sorts of limitations.
True, but I was saying "free" not in the *automagical* way, and more along the lines of: if it's there not getting used, then using it for something means that something is theoretically free.

For instance, MLAA could be considered "free" on some PS3 games, since they wouldn't have been using the SPEs anyway.

TEV was all about "free" texture passes, except you're right; surpass their capability and there goes the free.
If the TEV was "free", it was seemingly only because you don't usually need more texture-combining ops than textures, so the TEV program would usually not wind up longer than the TMU's texture list (or vice versa, hence small hardware with very good utilization). So the TMU would usually not bottleneck on the TEV. But if you did need more TEV steps than textures for some reason, I don't see why pixel throughput wouldn't have to decrease, like with any other shader ever. It's not like the TEV would have been able to execute multiple sequential steps in a single cycle.

Unless I'm just totally wrong, which is possible considering how little solid data I've seen.
I don't think you are.

But still, my point was that it was doing more with less, perhaps I should amend it to doing the same with less; feels more accurate.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Not really. From what I can tell, the only significant difference between TEV and a pixel shader is the variety of supported ops and the program length.
That's from a static-programming-model, don't-care-about-clocks POV. From a pipeline-latency POV, TEV and R700 are worlds apart: not a single op in the R700 pipeline has a latency of less than 8 clocks. For reference, an ALU-heavy TEV routine maxes out at ~16 clocks, and that includes up to 16 ALU ops.
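To make the contrast concrete, here's the standard latency-hiding arithmetic, using only the figures from the post (8-clock minimum op latency on R700, ~16 dependent ALU ops in ~16 clocks on TEV). A sketch, not a claim about either chip's actual scheduler:

```python
# With an op latency of L cycles and one issue slot per cycle, a unit
# needs L independent batches of work in flight to sustain 1 op/clock.
def batches_to_hide(latency_cycles, issue_rate=1):
    return latency_cycles * issue_rate

print(batches_to_hide(8))  # 8 -- R700 needs parallel work in flight to stay busy
# A TEV routine retires ~16 dependent ops in ~16 clocks: ~1 op/clock
# with no parallelism at all, which is the low-latency point being made.
```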
 

tipoo

Banned
I hope that now people can stop posting about how they think the old GameCube GPU is behind all the magic of the Wii U...Looks like there are two GPUs there and they look pretty separate to me.


Which would be implying that this diagram is anything more than what someone pieced together from what we already know. Like Fourth said, it uses names that were made up by people not at Nintendo ("Starbuck" was coined by fail0verflow/Marcan, I think), and it has a few inaccuracies.

Not saying your point is wrong, just the "I hope now people can stop ____", since this isn't any more official than this thread.


That's just a diagram someone made up. It's got a few inaccuracies, such as the amount of L2 cache in the main CPU core (it should be 2 MB and not 1). Plus, "Starbuck" is not an official code name used by devs for the ARM core.
 
To be fair, that diagram is from the 'Marcan and friends' Wii U hack presentation.

So while there is an inaccuracy in regard to the CPU cache, and they did give the ARM core a pet name that seems to have stuck, there may be some usefulness to that diagram.
 

krizzx

Junior Member
I asked you first: if you can explain what isn't technical about it, then we can talk. A lot of people just come in here and talk down on games but never explain their reasoning.

I thought I was the only aware of this.

Well, let's start with the fact that X has been shown at the ALPHA stage of development, and it is already showing you a MASSIVE world to roam and explore... Also, let's look at main character poly counts: RDR, which people like to say looks better than X, has a main character with a polycount on the level of 6th-generation games. There is a lot I could say, but I don't want to bog down the thread with a back and forth. Let's just say we can fully discuss this in its own thread whenever X is released; as of right now it wouldn't do any good, just a back and forth.

just for reference.

Red Dead Redemption
John Marston - 14,980
John Marston (Deadly Assassin) - 13,362

Metroid Prime 3: Corruption
Samus - 18,962

Open-world games of the 7th gen might look good, but they sacrifice a lot to look that way, and character polycount is just one example of that. I can guarantee you that the poly count of the main character in X trumps that of RDR's John... but again, that's just an example.

source: http://beyond3d.com/showthread.php?t=43975

This is what I like to see. Rational examples coupled with detailed explanations.

Yeah, but I don't really attribute any possible validity of the rumor to anything you might have dug up early on. Although similar info did come from other places as well, and I doubt they had the same source. Me personally, I assumed it was the second dev kit, primarily because of Vigil's comment, but the source of this rumor didn't mention a specific dev kit, just that the overheating caused Nintendo to "downgrade" the hardware. For all we know that could have been the downclocking that came out after E3 '11, I believe. Which was even more interesting because, like you said, there was never a public indication of a downgrade beforehand. Though who would really come out and say something like that anyway? Although there clearly and unfortunately was a third-party shift from E3 '11 to E3 '12.

Hopefully I'll join you guys on Miiverse sometime next year.

It certainly would have prevented a lot of turmoil if you had put emphasis on that aspect before.
 

tipoo

Banned
Perhaps this is the explanation of the bigger shaders - this dev (who sounds reliable) says it's 55nm, not our expected 45 or 40

http://www.eurogamer.net/articles/digitalfoundry-2014-secret-developers-wii-u-the-inside-story


At a very basic level, look at the power draw taken by the next-gen consoles compared to the Wii U. The PlayStation 4 draws over 100W more from the mains than Nintendo's console, and it does so using the latest, most power-efficient x86 cores from AMD in concert with a much larger GPU that's a generation ahead and runs on a much smaller fabrication process - 28nm vs. what I'm reliably informed is the 55nm process from Japanese company Renesas.
 

A More Normal Bird

Unconfirmed Member
Perhaps this is the explanation of the bigger shaders - this dev (who sounds reliable) says it's 55nm, not our expected 45 or 40

http://www.eurogamer.net/articles/di...e-inside-story
From what I understand at 55nm the eDRAM would be much larger than it currently is. I have no idea if it's possible for it to be manufactured on a separate process to the rest of the die or not, but if it is that might explain something.

This is what I like to see. Rational examples coupled with detailed explanations.
Yep, when we consider Tron#1's guarantee that the character poly counts in X are significantly higher than 15-20k and fred's assertion that the game is simply not possible on a 160ALU GPU then the Latte being a custom fixed-function/programmable shader hybrid with around half a teraFLOP of processing power and dual geometry engines is all but confirmed.

They are both devs with Monolith right?
 

tipoo

Banned
From what I understand at 55nm the eDRAM would be much larger than it currently is. I have no idea if it's possible for it to be manufactured on a separate process to the rest of the die or not, but if it is that might explain something.

Did someone do the math somewhere?
If the eDRAM was on a separate die in the same package, they could be on different nodes (like Intel's Crystalwell eDRAM, which sits on a separate die from the main Haswell chip), but since they're on the same chip I don't think they could be on separate processes.
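The math in question is just first-order area scaling: with ideal scaling, a block's area goes with the square of the feature size, so the same eDRAM at 55nm would be nearly twice its 40nm footprint. A quick sketch (real SRAM/eDRAM cells never shrink perfectly, so treat this as an upper bound on the difference):

```python
# First-order die-area ratio between two process nodes: area scales
# roughly with (feature size)^2 under ideal scaling assumptions.
def area_scale(node_a_nm, node_b_nm):
    return (node_a_nm / node_b_nm) ** 2

print(round(area_scale(55, 40), 2))  # 1.89 -- 55nm vs the Chipworks 40nm
print(round(area_scale(55, 45), 2))  # 1.49 -- 55nm vs 45nm
```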
 
A

A More Normal Bird

Unconfirmed Member
Did someone do the math somewhere?
If the eDRAM was on a separate die in the same package, they could be on different nodes (like Intel's Crystalwell eDRAM, which sits on a separate die from the main Haswell chip), but since they're on the same chip I don't think they could be on separate processes.
wsippel mentioned it in the secret devs thread, and I think it's cropped up in this thread a few times before. Plus Chipworks seemed pretty confident on 40nm and their opinion isn't to be discounted lightly, for obvious reasons.
 

prag16

Banned
wsippel mentioned it in the secret devs thread, and I think it's cropped up in this thread a few times before. Plus Chipworks seemed pretty confident on 40nm and their opinion isn't to be discounted lightly, for obvious reasons.
Yeah, until we get something more solid, we probably have to assume this anonymous dev was mistaken.
 
So... Anonymous guy from a Eurogamer article says 55nm with a healthy dose of 'lol Wii U' toned frosting, or the guys at Chipworks, who provided the chip for us that has even made this thread possible? Yeah, I'm going to go with the latter.
 
That whole article read like a Wii U hater from some internet message board summing up the Wii U's first-year history and filling in the holes with everything bad, negative, or lowest-common-denominator imaginable.

Another reason I think it's bullshit? The guy speaks so casually of the non-disclosure agreements he supposedly had to sign even as he supposedly breaks them, while they hold his balls/career in a vice. Then he goes on later to talk about how Renesas personally told him that the GPU die was 55nm. If this were true, then even the most inept investigator would have already exposed this guy to Nintendo, and we'd already be reading stories about him being sued.

Whoever wrote that trash tried a little too hard to sound legit, even amidst that much nonsense, and wrote themselves into a corner.

Pathetic.
 
Another reason I think it's bullshit? The guy speaks so casually of the Nondisclosure agreements he supposedly had to sign as he supposedly breaks them, even as they hold his balls/career in a vice.
Pssst, that's why he spoke off the record, without being named. I know speaking as an unnamed source regarding Nintendo is apparently the same as waving an "I made this shit up" flag to you and some others, but it's not at all uncommon, for exactly this reason.

You should come out and say what you're actually thinking: that the author of the article, or the source, is performing deliberate libel to hurt Nintendo, because reasons. In other words, you're accusing the author or the source of committing a pretty serious fucking crime.
 

A More Normal Bird

Unconfirmed Member
That whole article read like a Wii U hater from some internet message board summing up first-year Wii U history and filling in the holes with everything bad, negative, or lowest-common-denominator imaginable.

Another reason I think it's bullshit? The guy speaks so casually of the nondisclosure agreements he supposedly had to sign as he supposedly breaks them, even as they hold his balls/career in a vice. Then he goes on later to talk about how Renesas personally told him that the GPU die was 55nm. If this were true, then the most pathetic investigator would have already exposed this guy to Nintendo, and we'd already be reading stories about him being sued.

Whoever wrote that trash tried a little too hard to sound legit, even amidst that much nonsense, and wrote themselves into a corner.

Pathetic.
What. Not only is the only example of dubious information you listed from the article incorrect (the dev simply stated they had it on good authority, not that they were in contact with Renesas), but do you honestly think a story like that isn't thoroughly vetted before it's published? It certainly wasn't a flattering story for Nintendo but the console warrior tone you're reading into it is coming only from your own warped perspective.
 

krizzx

Junior Member
Also, I thought people were aware by now that Eurogamer will lie without reserve (see the Splinter Cell: Blacklist analysis conclusions vs. Eye of Truth's, and the promotion of COD on the PS4 being 1080p when it had been confirmed that it wasn't at that time).

They are simply biased game journalists with their own agendas at the end of the day. I'll take the words of real, verified developers like Shin'en, Frozenbyte, Criterion, Renegade Kid, Two Tribes, the Project C.A.R.S. devs, or any verifiable dev any day of the week over Eurogamer. I also prefer Eye of Truth's analyses, as theirs are far more accurate and unbiased. They don't write sensationalist articles for hits.

Speaking of the Project C.A.R.S. devs, they've got me very interested in the game now with their recent comments. I was originally certain that it was just going to be a cheap port of the PS3/360 version of the game, but since they have scrapped those two versions and started promoting it on the Wii U, this may just be the third-party game to judge the console's capabilities by.
 

prag16

Banned
What. Not only is the only example of dubious information you listed from the article incorrect (the dev simply stated they had it on good authority, not that they were in contact with Renesas), but do you honestly think a story like that isn't thoroughly vetted before it's published? It certainly wasn't a flattering story for Nintendo but the console warrior tone you're reading into it is coming only from your own warped perspective.

Based on what we know so far, he is probably mistaken on the fab size. Unless they can somehow mix nodes on the same die, as was said. Doesn't mean the rest is made up necessarily.

What I don't get is the Green Hills stuff. The tools are touted as great and super fast for debugging, etc. But this guy says they're shit.

And the Green Hills deal didn't come about until March 2012. Are we to believe devs of a launch game didn't get their initial kits and tools until 8 months before launch??
 

OryoN

Member
I don't see anything suspicious about the anonymous source. The whole 55nm thing - which seems off - isn't information devs are privy to, and was simply hearsay they felt was credible. Some things did sound like they were intentionally painting a very bleak picture, though, like the whole shaders-not-compatible thing (making it seem outdated), or how competing consoles' GPUs are leagues ahead. Aside from lacking the raw power, Wii U's GPU is about as feature-rich as those. I would imagine being "leagues" ahead would involve a host of on-chip GPU features that just aren't possible on Latte in hardware.

In the end, it was their opinion of the Wii U development environment leading up to launch. That particular period is anything but ideal for ANY platform, so this is not really 'news'. That said, while I don't doubt that this dev did have a rough time during that period, I believe the experience would obviously have varied between different studios.

I think the important thing to note is that the Wii U - like any console - has moved past that initial rocky period. It seems like people are treating this article as an indicator of the current environment. Criterion acknowledged that a large part of why they were able to do such an amazing port - in so little time - was that tools and software that were absent at launch were finally in place. Those tools would only have gotten better since then, and will continue to improve.
 

Chronos24

Member
Anyone find it odd, though, that the guy took some tidbits from things others have said before? "Punching above its weight" (referring to the CPU): I know another source used the exact same phrase about the CPU. I feel like the guy read some cliff notes from discussions, including from this website, to sound legit and help put out that article. I'm not saying none of it is true regarding the Wii U's power, but something still seems fishy about that guy.
 

krizzx

Junior Member
I made a thread with this article but it got locked almost immediately... "another" verifiable dev from another company has since weighed in on the first dev's statement, supporting that the "secret developer" is speaking hogwash.

http://nintendoenthusiast.com/news/harder-develop-games-wii-u-case-says-renegade-kid/

So that is two devs from two different companies contradicting the Eurogamer report.

It is becoming increasingly evident that the report from Eurogamer was mostly fabricated and exaggerated to paint as negative a picture as they could to get hits. This wouldn't be the first time.

I'll take the words of the verified developers.
 
I made a thread with this article but it got locked almost immediately... "another" dev from another company has since weighed in on the first dev's statement, supporting that the "secret developer" was speaking hogwash.

http://nintendoenthusiast.com/news/harder-develop-games-wii-u-case-says-renegade-kid/

So that is two devs from two different companies contradicting the Eurogamer report.

It is becoming increasingly evident that the report from Eurogamer was mostly fabricated to paint as negative a picture as they could to get hits. This wouldn't be the first time.

It's a "very informative" article which was "well researched" and most importantly "clear of any glaring biases" by an author with a predetermined agenda. I know when I think of reliable information regarding technical performance, I think of Nintendo Enthusiast, not Eurogamer or Digital Foundry.
 

krizzx

Junior Member
It's a "very informative" article which was "well researched" and most importantly "clear of any glaring biases" by an author with a predetermined agenda. I know when I think of reliable information regarding technical performance, I think of Nintendo Enthusiast, not Eurogamer or Digital Foundry.

Huh? At what point did I in any way say that Nintendo Enthusiast was or was not a reliable source, that they were more credible than Eurogamer, or promote them in any form?

I was sure I said the "verifiable devs" whose responses were in this article.

Why are you attacking the host of the article, who was never part of the argument, and ignoring the actual developers' comments themselves?
 

NBtoaster

Member
It doesn't contradict any of it.

No one has disputed the idea that at and before launch the Wii U had horrible tools, communication, and OS.

Though considering the failings of a lot of ports a year after launch, we can probably conclude there are still problems now, too.
 
I made a thread with this article but it got locked almost immediately... "another" verifiable dev from another company has since weighed in on the first dev's statement, supporting that the "secret developer" is speaking hogwash.

http://nintendoenthusiast.com/news/harder-develop-games-wii-u-case-says-renegade-kid/

So that is two devs from two different companies contradicting the Eurogamer report.

It is becoming increasingly evident that the report from Eurogamer was mostly fabricated and exaggerated to paint as negative a picture as they could to get hits. This wouldn't be the first time.

I'll take the words of the verified developers.
You should probably stop posting something if it was considered so insubstantial, or so disreputably/biased in its sourcing, that it got its thread locked.

But regardless, you somehow ignored what the second dev said: that it's not hard to develop for (the implication being now), but that he likely had a super early dev kit, which the article in question implies based on the challenges he faced. So no, this does jack fucking shit at disproving anything. It simply says that now, a year after launch, the dev tools are mature. That does not in any way invalidate the experiences of other people from a different time. The fact that you're trying to draw some equivalence is honestly fucking ridiculous. Your argument boils down to "People now aren't having issues, so other people before must be lying!!!"
 

fred

Member
So... an anonymous guy from a Eurogamer article says 55nm with a healthy dose of 'lol Wii U'-toned frosting, versus the guys at Chipworks, who provided the chip scans that made this thread possible in the first place? Yeah, I'm going to go with the latter.

And going by what he's said in that quote, it appears to be coming from someone else entirely, so it isn't even first-hand knowledge; it's hearsay that contradicts what Chipworks said.

I've always found it amusing that people with an odd anti-Wii U agenda (and why they have that agenda is very odd; you'd think that Nintendo had killed their family and raped their dog or something) always take negative comments from anonymous devs as the complete truth and don't pay any attention to named devs that praise the system.
 

The_Lump

Banned
I made a thread with this article but it got locked almost immediately... "another" verifiable dev from another company has since weighed in on the first dev's statement, supporting that the "secret developer" is speaking hogwash.

http://nintendoenthusiast.com/news/harder-develop-games-wii-u-case-says-renegade-kid/

So that is two devs from two different companies contradicting the Eurogamer report.

It is becoming increasingly evident that the report from Eurogamer was mostly fabricated and exaggerated to paint as negative a picture as they could to get hits. This wouldn't be the first time.

I'll take the words of the verified developers.


The only part that may be incorrect is the 55nm. That doesn't discount the rest of the article, though. It's mostly information we've known for a year, and it's clearly (and openly) a retrospective of what it was like for some 3rd-party devs to develop a launch game on the Wii U.

I don't see the problem with it. We know the situation has changed now (regarding the immature tool chain and Nintendo's support for 3rd parties) thanks to Criterion confirming just as much in an interview based on more recent information; but even they conceded that in the beginning it was very difficult to develop for due to various issues.

Also, the article you're linking above is clearly talking in the present tense. I'm sure right now the Wii U is no harder to develop for than anything else (experience permitting), but the Eurogamer story is specifically talking about pre-launch, as in working with the very first dev kits. They are not directly comparable articles/quotes.
 
Also, I thought people were aware by now that Eurogamer will lie without reserve (see the Splinter Cell: Blacklist analysis conclusions vs. Eye of Truth's, and the promotion of COD on the PS4 being 1080p when it had been confirmed that it wasn't at that time).

They are simply biased game journalists with their own agendas at the end of the day. I'll take the words of real, verified developers like Shin'en, Frozenbyte, Criterion, Renegade Kid, Two Tribes, the Project C.A.R.S. devs, or any verifiable dev any day of the week over Eurogamer. I also prefer Eye of Truth's analyses, as theirs are far more accurate and unbiased. They don't write sensationalist articles for hits.

Speaking of the Project C.A.R.S. devs, they've got me very interested in the game now with their recent comments. I was originally certain that it was just going to be a cheap port of the PS3/360 version of the game, but since they have scrapped those two versions and started promoting it on the Wii U, this may just be the third-party game to judge the console's capabilities by.

I think what's different is that SMS is using multi-threaded rendering. The last post from an anonymous dev on the forums said that they are getting 5 cars on screen at 25 fps during a race. With optimization they should hit 30, and hopefully a solid 30.
 
Pssst, that's why he spoke off the record, without being named. I know speaking as an unnamed source regarding Nintendo is apparently the same as waving an "I made this shit up" flag to you and some others, but it's not at all uncommon for exactly this reason.

You should come out and say what you're thinking: that the author of the article, or the source, is committing deliberate libel to hurt Nintendo because reasons. In other words, accusing the author or source of committing a pretty serious fucking crime.

Apparently, he's already admitted to breaking a nondisclosure agreement, which is pretty serious already: career-ending, and actionable in the civil sense. So your attempt to paint a picture of 'legit' from 'anonymous' just can't and won't be seen that way by many.
 