
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

1) who cares if the WiiU is "the PS2 of this gen"? Where does this even come from? Don't project weird opinions on people.
2) the WiiU's lack of 3rd party support is explainable entirely based on its different (custom) architecture. It may also be "underpowered", for some arbitrary definition of underpowered, but even if it were a more powerful system than the Xbone and PS4, it would not get the same AAA support as those two. Because those systems are basically x86 PCs with run of the mill GPUs, devs can optimize for what is essentially one platform. Why port to a 2nd platform, especially when the market share isn't there?
 

JordanN

Banned
You guys arguing with JordanN are arguing the impossible. He will make points based off things you aren't even arguing about.
Actually, the reverse is happening. Check this out.

Elfforkusu said:
Why port to a 2nd platform, especially when the market share isn't there?

My post was about why didn't the Wii U get stuff like UE4/Agni's before PS4/XBO? Hell, those consoles don't even have a market share yet so it's funny to see this claim already. I guess PS4/XBO are already expected to win.
 

bomblord

Banned
Actually, the reverse is happening. Check this out.



My post was about why didn't the Wii U get stuff like UE4/Agni's before PS4/XBO? Hell, those consoles don't even have a market share yet so it's funny to see this claim already. I guess PS4/XBO are already expected to win.

really?
 
My post was about why didn't the Wii U get stuff like UE4/Agni's before PS4/XBO? Hell, those consoles don't even have a market share yet so it's funny to see this claim already. I guess PS4/XBO are already expected to win.
Yes, one of PS4/XBone/PC are "expected to win", and they all share what's fundamentally the same architecture. What does that have to do with this thread?
 

Earendil

Member
He finally came around. :)

I'm in the middle of buying a house, so I haven't been able to be part of the conversation as much as I used to. If you need me to do anything, shoot me a PM. I have a little over a week before we close on the house and I'm sans Internet for a while.
 

wsippel

Banned
I think that it is probably the UTDP at this point. I've always seen the two distinct SRAM pools, along with its general vicinity next to the CPU interface and what I see as hardware interpolators, as being indicative of an instruction cache and constant cache. I originally thought the UTDP was an adjacent block and D was just the caches. Now, after going back and forth w/ bg and seeing Brazos, I think it's all-inclusive.

What kind of decompressor were you thinking exactly? Video signal?

I think I might be onto something with the UVD thing down in P though. Still racking my brain on what TAVC could stand for in Brazos. "Target Video Codec?"
Nope, I'm thinking about something lossless. LZMA or something. Having that close to the CPU interface and MEM0 seems like it would make sense as well.
 

krizzx

Junior Member
I've been trying to find a block that resembles the UVD on Latte, but so far, no luck. A and B bear the closest resemblance.
[Image: AMD Llano die slide]

Does anyone remember the link to that other site that was discussing the hardware a while back? I want to see how they have progressed. They were proposing some pretty interesting things.

I need someone to refresh my memory. Isn't Llano a 32nm chip?
http://www.pcworld.com/article/224424/Why_the_32nm_Chip_is_a_Big_Deal_for_AMD_and_Laptop_Users.html
 
Nope, I'm thinking about something lossless. LZMA or something. Having that close to the CPU interface and MEM0 seems like it would make sense as well.

Hmm, what makes you think it's on there? Are you thinking about texture compression? Perhaps because of the Two Tribes texture data comment? How about R?

I've been trying to find a block that resembles the UVD on Latte, but so far, no luck. A and B bear the closest resemblance.

I need someone to refresh my memory. Isn't Llano a 32nm chip?
http://www.pcworld.com/article/224424/Why_the_32nm_Chip_is_a_Big_Deal_for_AMD_and_Laptop_Users.html

Yeah, for some reason I had been assuming UVD wasn't included in Latte, but if wsippel dug up some evidence that says it is, we need to work it in. In my eyes, the only block big enough and with enough SRAM is "P."

Yes, Llano is 32nm SOI from Global Foundries.

Oh bg, Block E on Latte looks kinda like block P on Llano. What do you think? Also happens to be in the same relative position next to the DC.
 

krizzx

Junior Member
Hmm, what makes you think it's on there? Are you thinking about texture compression? Perhaps because of the Two Tribes texture data comment? How about R?



Yeah, for some reason I had been assuming UVD wasn't included in Latte, but if wsippel dug up some evidence that says it is, we need to work it in. In my eyes, the only block big enough and with enough SRAM is "P."

Yes, Llano is 32nm SOI from Global Foundries.

Oh bg, Block E on Latte looks kinda like block P on Llano. What do you think? Also happens to be in the same relative position next to the DC.

Yes, now that you mention it, it does look like P. I was leaning toward A or B because it's in a corner in the photo and kind of L-shaped, with a thick body and short stem like A and B. I thought it would naturally be in a corner on Latte as well. It's also right by the GP I/Os, giving it a nice vantage point.


I will go with P as well.

Seems like a waste in Latte though, since Nintendo doesn't allow DVD movie playback on the console.
http://en.wikipedia.org/wiki/Unified_Video_Decoder
On a side note, I'm guessing it's using UVD 2.2 for the dual-stream decoding. That would make outputting video to the GamePad much better. Would this be of any functional use for gaming?
 

MDX

Member
When it comes down to it, it's up to Nintendo to bring the goods.

So which game that we know of, showcased during the next E3, do you guys think can show what the Wii U is capable of? Which game will undoubtedly prove Latte is a few generations ahead of the PS360?

X? Smash Brothers? Mario 3D? Bayonetta 2? Something else?

And what graphical effects will you be looking for?
 

A More Normal Bird

Unconfirmed Member
When it comes down to it, it's up to Nintendo to bring the goods.

So which game that we know of, showcased during the next E3, do you guys think can show what the Wii U is capable of? Which game will undoubtedly prove Latte is a few generations ahead of the PS360?

X? Smash Brothers? Mario 3D? Bayonetta 2? Something else?

And what graphical effects will you be looking for?

Retro's title, hopefully. BTW, the part of your post I've bolded isn't something that can be disproven/proven by game visuals. Latte could be half as strong as it is and it would still technically be a few generations removed from the GPUs in the 360/PS3.
 

tipoo

Banned
XB1 and PS4 are pretty far apart. PS4 has a GPU with ~66% more processing power.

Just being anal here, but both of these following statements are correct, using the statement interchangeably with the number is not: The One has 66% the shading power of the PS4 (12/18). OR, the PS4 has 1.5x/150% the shading power of the One (18/12). Switching them up and saying the PS4 is 66% more powerful is not accurate. That would imply the PS4 had 20 CUs.
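
For anyone who wants that arithmetic spelled out, here's a quick sketch (assuming the commonly cited 12 CU vs. 18 CU counts):

Code:
xb1_cus, ps4_cus = 12, 18
print(xb1_cus / ps4_cus)   # 0.667 -> the One has ~66% of the PS4's shading power
print(ps4_cus / xb1_cus)   # 1.5   -> the PS4 has 1.5x (150% of) the One's
print(xb1_cus * 1.66)      # ~19.9 -> "66% more than the One" would imply ~20 CUs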
 
I'm not really worthy of this thread (knowledge + language), but I've been wondering if this AMD patent application titled "DYNAMIC CONTROL OF SIMDs" (filed 07/12/2011) has anything to do with the Wii U GPU (or CPU - I'm clueless).

It could be totally generic and already being widely used on a bunch of -commercial- GPUs, so be nice :p (searched thread/gaf)

It piqued my interest because it's aimed at reducing power consumption and achieving "optimal usage of the graphics processing unit", and because of the time frame in which it was filed (July 2011).

I understand that it's something that doesn't have to be exclusive to the WiiU - I'm just asking of what are the chances it's being used here.

Summary:

Hate to be quoting myself but...anyone?

Might not be discussion-worthy I understand - a simple yes/no/"nothing new" will also do.

(better suited for WiiU CPU thread?)

thanks
 

Mithos

Member
Actually, the reverse is happening. Check this out.



My post was about why didn't the Wii U get stuff like UE4/Agni's before PS4/XBO?

UE3 is the first engine Epic Games has officially supported on a Nintendo platform.

Not even UE2 was brought to GameCube or Wii with Epic Games' support; it was ported by the individual companies that had games using that engine.
 

Donnie

Member
Just being anal here, but both of these following statements are correct, using the statement interchangeably with the number is not: The One has 66% the shading power of the PS4 (12/18). OR, the PS4 has 1.5x/150% the shading power of the One (18/12). Switching them up and saying the PS4 is 66% more powerful is not accurate. That would imply the PS4 had 20 CUs.

Think he's referring to this:

http://www.neogaf.com/forum/showthread.php?t=565505

Though obviously he didn't phrase it correctly.
 
I don't want to read 6000+ posts so can someone please summarize what is known?

I would say this is what we know. The ALU count is pretty much 160-320. The TMU count is either 8 or 16, and there are very likely 8 ROPs. Whether the J blocks are TMUs or Interpolators, there seems to be some kind of modification to them. There also seems to be an extraordinary amount of Constant and Instruction cache in Latte compared to other AMD GPUs. Latte retains the 1MB and 2MB eMemories from Flipper/Hollywood. The ARM chip was located. There seems to be a Southbridge in the chip.

I think recent discussion is more about trying to get as concise as possible with what we're seeing.
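
For a rough sense of scale, this is what those ALU counts would mean in raw theoretical throughput, assuming the commonly reported ~550 MHz GPU clock and 2 FLOPs per ALU per cycle (ballpark numbers only, nothing confirmed):

Code:
clock_ghz = 0.55  # assumed, not confirmed
for alus in (160, 320):
    print(alus, "ALUs ->", alus * 2 * clock_ghz, "GFLOPs")  # 176.0 or 352.0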

Oh bg, Block E on Latte looks kinda like block P on Llano. What do you think? Also happens to be in the same relative position next to the DC.

I don't see it. I don't even really see it being much like my previous comparison now. I think I've shared this before, but this picture suggests the UVD isn't quite how we see it in other GPUs.


Also I think I can see what you are seeing at least with the L2 cache in RV770. What made you decide those were the L2 cache?

Also wsippel got that info for us in the first WUST. You were probably wondering why I was listing the UVD in my block list.

Hate to be quoting myself but...anyone?

Might not be discussion-worthy I understand - a simple yes/no/"nothing new" will also do.

(better suited for WiiU CPU thread?)

thanks

I knew there was a post I missed. To me it seems to be strictly about power saving: allowing a lower power state when certain SIMDs are not in use, instead of either "all SIMDs on" or "all SIMDs off".
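
Something like this, purely as an illustration of the idea (not the actual patent logic): individual SIMDs get gated to a low-power state based on pending work, instead of an all-on/all-off choice for the whole array.

Code:
def simd_power_states(num_simds, pending_wavefronts, waves_per_simd=8):
    needed = min(num_simds, -(-pending_wavefronts // waves_per_simd))  # ceiling divide
    return ["on"] * needed + ["low_power"] * (num_simds - needed)

print(simd_power_states(num_simds=8, pending_wavefronts=20))
# ['on', 'on', 'on', 'low_power', 'low_power', ...]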
 

wsippel

Banned
Hmm, what makes you think it's on there? Are you thinking about texture compression? Perhaps because of the Two Tribes texture data comment? How about R?
I just wondered what the block might be, in light of the large SRAM macro and the dual channel SRAM. And a while ago, I saw a (de)compressor for LZ77 or LZMA or something that shared a few similarities, except it was way less complex. Such a thing wouldn't be used for textures (lossy compression doesn't need dictionaries), but for low entropy data that can't be compressed using lossy formats, like geometry.
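
To illustrate the kind of thing I mean (a toy example, nothing Latte-specific): dictionary-based lossless compression like zlib (LZ77 plus Huffman coding) does very well on repetitive, low-entropy streams such as index or vertex data.

Code:
import struct, zlib

# repetitive little index buffer standing in for low-entropy geometry data
indices = struct.pack("<300H", *(i % 64 for i in range(300)))
packed = zlib.compress(indices, 9)
print(len(indices), "->", len(packed), "bytes")  # compresses heavily due to repetition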
 

MDX

Member
Retro's title, hopefully. BTW, the part of your post I've bolded isn't something that can be disproven/proven by game visuals. Latte could be half as strong as it is and it would still technically be a few generations removed from the GPUs in the 360/PS3.


Yeah, and that's a problem. Nintendo can only claim the Wii U is markedly/noticeably more advanced in every way than the Wii. And I think that's what Nintendo wants the public to see: not to worry about how it matches up to the PS3&4 or XB360&One, but to its own last console. And for the millions of Wii owners who did not buy a 360 or PS3, maybe that's all they need to do.


But OK, Sony showed off Killzone to demonstrate how advanced the PS4 is, because it could handle all these real-time reflections. Is that not a visual indicator that would draw a distinction between this gen and the next?
 

Schnozberry

Member
OK, Sony showed off Killzone to demonstrate how advanced the PS4 is, because it could handle all these real-time reflections. Is that not a visual indicator that would draw a distinction between this gen and the next?

Maybe, but at that point you have to ask who it's aimed at and why it's important. Sony seems to be taking the approach that they can be profitable being primarily aimed at core gamers. Microsoft and Nintendo seem to be aiming at expanded audiences and families. Nintendo wants them to play smaller indie games and their own casual game franchises, and Microsoft is aiming for more of a digital convergence device. We'll see in the next two years who does a better job of finding a market, and whose strategy changes the most. I think Nintendo could have an opportunity they likely won't get again this holiday if they get out software that people desire and hit a good price point. I think it will come down to price and software.

Edit: On a GPU related note, is it possible that Nintendo grafted UVD3 onto this GPU? The more advanced hardware would make gamepad streaming a lot easier by taking additional heat off the CPU.
 

Schnozberry

Member
Also, is it possible that the GX2 API could put ASTC to use? I know Nintendo has S3TC licensed, but the Two Tribes comment made me wonder if they had implemented ASTC, which would offer more flexibility with input formats and better peak performance than S3TC. It became part of OpenGL in August of 2012, so I thought it may have been possible to find it integrated in GX2.

It could also be completely unrelated, but I figured I'd throw it out there.
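
For reference, the raw bit rates (format math only; this is no claim that Latte actually supports ASTC in hardware):

Code:
# every ASTC block is 128 bits regardless of its pixel footprint;
# S3TC uses 64 bits (DXT1) or 128 bits (DXT5) per fixed 4x4 block
for w, h in [(4, 4), (6, 6), (8, 8), (12, 12)]:
    print("ASTC %dx%d: %.2f bpp" % (w, h, 128 / (w * h)))  # 8.00, 3.56, 2.00, 0.89
print("DXT1: 4.00 bpp, DXT5: 8.00 bpp")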
 

z0m3le

Banned
Ah, thanks. Although if that was factored in, it's taking the unsafe (imo) assumption that 100% of the PS4 GPU will be free for games, so that's why I just stick with their theoretical power and not reserves.

Yeah, I was talking about game performance; from what we know, only ~1.1 TFLOPs in XB1 is usable for games. PS4 so far has nothing like this, and considering XB1 is the only console to do this so far, I don't think it's safe to assume that PS4 will also reserve GPU processing power for the single OS it likely features. 1229 GFLOPs - 10% = 1106 GFLOPs, + 66% = 1836, which is very close to PS4's 1843 GFLOPs.
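
Spelling that arithmetic out (all figures as quoted above; the 10% PS4 reserve is exactly the hypothetical being questioned):

Code:
xb1_total = 1229             # GFLOPs, as quoted
xb1_games = xb1_total * 0.9  # minus a ~10% GPU reservation
print(xb1_games)             # ~1106
print(xb1_games * 1.66)      # ~1836, right next to the PS4's quoted 1843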
 
I would say this is what we know. The ALU count is pretty much 160-320. The TMU count is either 8 or 16, and there are very likely 8 ROPs. Whether the J blocks are TMUs or Interpolators, there seems to be some kind of modification to them. There also seems to be an extraordinary amount of Constant and Instruction cache in Latte compared to other AMD GPUs. Latte retains the 1MB and 2MB eMemories from Flipper/Hollywood. The ARM chip was located. There seems to be a Southbridge in the chip.

I think recent discussion is more about trying to get as concise as possible with what we're seeing.

[thumbs up]
 

wilsoe2

Neo Member
I've been really enjoying the conversations here and learning a lot. I can tell many of you are real pros, and I appreciate you taking the time to analyze this and discuss it for the pure curiosity of it. Especially in the past few pages there's been some great work in labeling, discussing, etc. as we try to wrap our heads around this. I don't think I know enough about the hardware schema to dispute any of the die shot analysis, and certainly not Fourth Storm's work, so I don't mean to come across as combative. That said, he made a statement a while back that I wanted to clarify based on past examples. I understand the rationale for thinking the Wii U is 160 shaders, as well as the size discrepancies and other factors that make that range 160-320. The truth is we don't have much in-game evidence one way or another to draw on, so I wouldn't want preconceived ideas to influence or bias the analysis.

Here's a question: If Latte has 320 shaders, why wasn't NSMBU in 1080p? We know the eDRAM should be enough to hold the framebuffer and the amount of effects going on isn't astronomical. With 1080p being the buzzword that it is for them, wouldn't they have wanted to go for it?

Originally Nintendo did say the game was going to be 1080p right on their official website. “Experience Mario like never before… in full 1080p HD, only on the Wii U console!”

http://mynintendonews.com/2012/10/11/new-super-mario-bros-u-will-run-at-1080p/

Admittedly that doesn’t really mean anything. As you say 1080p is a buzzword so a marketing person could have written that on the site not knowing any better. I couldn’t find a link so maybe I am mistaken but I thought I remembered Iwata clearing up the NSMBU 1080p thing saying that they experimented with it at 1080p but chose not to do it for the launch game (possibly implying they could have done it with more time? I'm speculating).

Criterion talk about refinements they made in their own lighting engine, however. Sheer programming skill shouldn't be brushed aside in analyzing these matters, and if what we saw is the best they could squeeze out of a 320 shader part, I'd be very surprised.

To put this in perspective I googled for a list of xbox 360 launch titles and unfortunately found a kotaku article that I am very begrudgingly linking here.
http://kotaku.com/the-xbox-360-had-18-games-at-launch-heres-what-they-l-509057349

Granted, Criterion Games is an exceptionally talented studio and in a different league than say NeverSoft, but the point remains. Games like Tony Hawk American Wasteland, which were developed for the PS2 generation, are not dramatically (or at all) better looking on the much more advanced Xenos architecture. I bring it up because I’ve read many people in this thread point to Black Ops 2 Wii U as proof that the system is NOT in any way more capable than PS/360.

But looking back to the silly Kotaku article, even Perfect Dark Zero looks awful in comparison to today's 360 games, and Rare was a very talented developer, especially back then. They were also bought by Microsoft for a gajillion dollars, so I'd assume they were given the time, budget, and tools to complete that project. So it makes me cringe a little to read "best they could squeeze" for a brand new system, with a small team, a small budget, based on previous-gen code, and completed in a couple of months.

I’m not making an argument for or against the hardware I just wanted to caution against having a myopic graphical comparison between 360 and Wii U and having that influence die shot interpretation. So take that for what it’s worth and please please please don’t start debating launch 360 games. Let’s just agree to say that graphically they have improved substantially over its lifetime. And as I’ve shown through past example in this post, ports of previous generation games do not automatically drastically improve on drastically more powerful hardware. (JordanN I’m implying that the 360 was drastically more powerful than the PS2, and NOT implying that the Wii U is drastically more powerful than anything ; ) )
 
I don't see it. I don't even really see it being much like my previous comparison now. I think I've shared this before, but this picture suggests the UVD isn't quite how we see it in other GPUs.

That picture actually helped me think that those Q blocks might be UVD related. That diagram doesn't seem to be worried about precise borders, as I doubt that UVD is only half a block. The block on Brazos is labeled clearly enough. But that diagram has one of those small blocks (Llano's H/G) included in the UVD partition. So we've got the big UVD block and two small identical blocks adjacent to it. Coincidence? On Latte, the arrangement of the SRAM on the left of the Q blocks might hint at there being some interaction between those three blocks as well.

I had figured that Brazos' TAVC block was texture-related, but now I'm not so sure. The TD block does seem to have about the same amount of SRAM that the two T blocks in Latte have - minus those 32 banks, which I located in the TC block. Barring a better guess at what TAVC might stand for, something video related seems like it could work (target analytics, vision, video, catalyst, codec, control, compress, *sigh* hahaha). But there appears to be some SRAM on the border of UVD/TAVC that also might indicate them working together in some fashion. Strange that the UVD on Brazos seems to have less SRAM than the block on Llano, but maybe that's just my eyes playing tricks on me. Both blocks appear to hold a good amount of logic.

Q isn't an exact match for H/G on Llano, but it looks pretty close to me. Llano is 32nm SOI from GF, so again, size comparisons are not likely applicable. Actually, if I didn't know better, I'd think that some of the Latte blocks are stretched just to make the puzzle pieces all fit lol. But that's crazy talk. But there is a similar hierarchy of SRAM in those blocks - 4 different sizes.
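
Rough node math on why raw sizes don't translate (assuming Latte is on a ~40nm process; treat it as ballpark only, since real layouts never scale ideally):

Code:
latte_nm, llano_nm = 40, 32  # the Latte figure is an assumption here; Llano is 32nm SOI
print((latte_nm / llano_nm) ** 2)  # ~1.56 -> the same logic could plausibly take ~56% more area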


Also I think I can see what you are seeing at least with the L2 cache in RV770. What made you decide those were the L2 cache?

It's just got a huge stack of SRAM compared to the other blocks! It ended up working out nicely too when I was attempting to identify the others yesterday. That configuration seems to make a good deal of sense with the memory controller placement.
 

joesiv

Member
Nice work. These are the RV770 and Brazos pics I've been relying on. That was the best RV770 I found.

RV770

Brazos

Thank you sir!

I went and blocked things as I saw fit (lol). This time I included a version that doesn't have the labels, so anyone can go and annotate the blocks as they wish.

Also feel free to give me feedback on the blocking, can't guarantee I'll be able to make the changes right away, but I'll try.

*edit* I'll start with the Wii U GPU so we have an easier comparison:
[Image: annotated Wii U GPU die shot]


Llano:

Blocked and Labeled: http://www.joesiv.net/fourthstorm/LlanoDie_blocked.jpg
Just Blocks: http://www.joesiv.net/fourthstorm/LlanoDie_blocked-nolabels.jpg
Just Die Shot: http://www.joesiv.net/fourthstorm/LlanoDie.jpg

Brazos:

Blocks and Labels: http://www.joesiv.net/fourthstorm/Brazos_blocked_labeled.jpg
Just Blocks: http://www.joesiv.net/fourthstorm/Brazos_blocked.jpg
Just Die: http://www.joesiv.net/fourthstorm/Brazos.jpg

RV770:

Blocks and Labels: http://www.joesiv.net/fourthstorm/RV770_blocked_labeled.jpg
Just Blocks: http://www.joesiv.net/fourthstorm/RV770_blocked.jpg
Just Die: http://www.joesiv.net/fourthstorm/RV770.jpg
 
I've been really enjoying the conversations here and learning a lot. I can tell many of you are real pros, and I appreciate you taking the time to analyze this and discuss it for the pure curiosity of it. Especially in the past few pages there's been some great work in labeling, discussing, etc. as we try to wrap our heads around this. I don't think I know enough about the hardware schema to dispute any of the die shot analysis, and certainly not Fourth Storm's work, so I don't mean to come across as combative. That said, he made a statement a while back that I wanted to clarify based on past examples. I understand the rationale for thinking the Wii U is 160 shaders, as well as the size discrepancies and other factors that make that range 160-320. The truth is we don't have much in-game evidence one way or another to draw on, so I wouldn't want preconceived ideas to influence or bias the analysis.



Originally Nintendo did say the game was going to be 1080p right on their official website. “Experience Mario like never before… in full 1080p HD, only on the Wii U console!”

http://mynintendonews.com/2012/10/11/new-super-mario-bros-u-will-run-at-1080p/

Admittedly that doesn’t really mean anything. As you say 1080p is a buzzword so a marketing person could have written that on the site not knowing any better. I couldn’t find a link so maybe I am mistaken but I thought I remembered Iwata clearing up the NSMBU 1080p thing saying that they experimented with it at 1080p but chose not to do it for the launch game (possibly implying they could have done it with more time? I'm speculating).



To put this in perspective I googled for a list of xbox 360 launch titles and unfortunately found a kotaku article that I am very begrudgingly linking here.
http://kotaku.com/the-xbox-360-had-18-games-at-launch-heres-what-they-l-509057349

Granted, Criterion Games is an exceptionally talented studio and in a different league than say NeverSoft, but the point remains. Games like Tony Hawk American Wasteland, which were developed for the PS2 generation, are not dramatically (or at all) better looking on the much more advanced Xenos architecture. I bring it up because I’ve read many people in this thread point to Black Ops 2 Wii U as proof that the system is NOT in any way more capable than PS/360.

But looking back to the silly Kotaku article, even Perfect Dark Zero looks awful in comparison to today's 360 games, and Rare was a very talented developer, especially back then. They were also bought by Microsoft for a gajillion dollars, so I'd assume they were given the time, budget, and tools to complete that project. So it makes me cringe a little to read "best they could squeeze" for a brand new system, with a small team, a small budget, based on previous-gen code, and completed in a couple of months.

I’m not making an argument for or against the hardware I just wanted to caution against having a myopic graphical comparison between 360 and Wii U and having that influence die shot interpretation. So take that for what it’s worth and please please please don’t start debating launch 360 games. Let’s just agree to say that graphically they have improved substantially over its lifetime. And as I’ve shown through past example in this post, ports of previous generation games do not automatically drastically improve on drastically more powerful hardware. (JordanN I’m implying that the 360 was drastically more powerful than the PS2, and NOT implying that the Wii U is drastically more powerful than anything ; ) )

Good points. I was a bit riled up at the time. I thought we were getting close to drawing some solid conclusions (still do, in fact, but I realize I can't convince everybody - which is fine), and it was frustrating that others rejected the notion, dragging the chip back into murky waters. I want answers, damnit! :p

I believe Nintendo could pull off a 1080p Mario as well. Fool that I am, I actually didn't realize that so much of the NSMB games is made up of sprites and 2D backgrounds! I thought they were just simple polygonal graphics. The somewhat combative point I was making was that it should be easy for such a game to run in 1080p if the hardware were uniformly better than current gen. I basically think they can do it, but it will take a bit more work in optimizing (and redoing assets).
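
For what it's worth, the raw framebuffer math behind that (a rough sketch assuming a 32-bit color target plus a 32-bit depth buffer and the reported 32MB MEM1 eDRAM):

Code:
w, h = 1920, 1080
color = w * h * 4
depth = w * h * 4
print((color + depth) / 2**20)  # ~15.8 MiB, comfortably inside 32MB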

Yes, I don't like making launch game comparisons, but I did so in light of reports that Wii U was using a familiar architecture. I find it a bit laughable that Iwata only backpedaled with comments that it's "new hardware that takes time to exploit" when the first wave of titles failed to visually impress. It's a valid point, and the memory architecture and SDK definitely seem to require some learning, but I don't think it should be as drastic a learning curve as going from the PS2 generation to this one. Back then, devs were forced to learn how to exploit shaders, weird in-order CPUs, non-unified memory pools in the case of PS3, and all that jazz. Yet still, even those initial games gave us that bump up to HD resolutions. That alone showed the hardware to be indisputably more capable.
 

Hermii

Member
Good points. I was a bit riled up at the time. I thought we were getting close to drawing some solid conclusions (still do, in fact, but I realize I can't convince everybody - which is fine), and it was frustrating that others rejected the notion, dragging the chip back into murky waters. I want answers, damnit! :p

I believe Nintendo could pull off a 1080p Mario as well. Fool that I am, I actually didn't realize that so much of the NSMB games is made up of sprites and 2D backgrounds! I thought they were just simple polygonal graphics. The somewhat combative point I was making was that it should be easy for such a game to run in 1080p if the hardware were uniformly better than current gen. I basically think they can do it, but it will take a bit more work in optimizing (and redoing assets).

Yes, I don't like making launch game comparisons, but I did so in light of reports that Wii U was using a familiar architecture. I find it a bit laughable that Iwata only backpedaled with comments that it's "new hardware that takes time to exploit" when the first wave of titles failed to visually impress. It's a valid point, and the memory architecture and SDK definitely seem to require some learning, but I don't think it should be as drastic a learning curve as going from the PS2 generation to this one. Back then, devs were forced to learn how to exploit shaders, weird in-order CPUs, non-unified memory pools in the case of PS3, and all that jazz. Yet still, even those initial games gave us that bump up to HD resolutions. That alone showed the hardware to be indisputably more capable.

PS3 was orders of magnitude more powerful than PS2, and the launch games only looked marginally better because the architecture was so different. The gap to Wii U is nowhere near that big. I don't know how much rewriting it would take to reoptimize a 360 game for a different memory architecture, OoO processing, and a more modern GPU, but I imagine it's far from taking full advantage of the hardware even if Criterion is an excellent team. All the first-party games released so far started their development on Wii, so I don't know how much they tell us.
 
PS3 was orders of magnitude more powerful than PS2, and the launch games only looked marginally better because the architecture was so different.

That's a huge exaggeration imo. PS3 games ran at 3-4x the resolution with proper texture filtering, and certainly had way higher polycounts and texture resolutions.
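
Quick pixel math on that, using typical PS2 render resolutions against 720p:

Code:
ps3 = 1280 * 720
for w, h in [(640, 480), (512, 448)]:  # common PS2 render resolutions
    print(round(ps3 / (w * h), 2))     # 3.0 and ~4.02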
 

joesiv

Member
That's a huge exaggeration imo. PS3 games ran at 3-4x the resolution with proper texture filtering, and certainly had way higher polycounts and texture resolutions.
From what I remember, the PS3 did run at a higher resolution, and compared to the PS2, yup, it looked significantly better. However, compared to the Xbox, for cross-generation titles the difference was less noticeable.

However, at least with EA games (where I was working at the time), the Xbox 360/PS3 was running on the "new" engine and had all those things like sweat beads and cloth animation. For some cross-gen games the difference wasn't huge, mostly more clarity (compared to Xbox), but obviously for exclusives the difference was much bigger.
 

JordanN

Banned
UE3 is the first engine Epic Games has officially supported on a Nintendo platform.

Not even UE2 was brought to GameCube or Wii with Epic Games' support; it was ported by the individual companies that had games using that engine.
Ok? But that's UE3, not UE4.

I specifically recall Mark Rein said they went to the console manufacturers and told them what they want out of next gen consoles, even showing them the Samaritan Demo as a base. Seems like a missed opportunity for Nintendo if Wii U wasn't "buried" by 1 Tflop hardware or less.

That, or Iwata is the most incompetent person ever, though I doubt he would be allowed to have a job if he pulled off something like that (or should be fired if he did).

Edit: Epic games aren't the only developers out there. I also mentioned Agni's Philosophy (Squenix) so what's their excuse?
 
From what I remember, the PS3 did run at a higher resolution, and compared to the PS2, yup, it looked significantly better. However, compared to the Xbox, for cross-generation titles the difference was less noticeable.

examples?

And of course the gap wouldn't be big if you talk about games which used last-gen engines and assets.
Though Call of Duty 3 (US launch game) looked significantly better than the last-gen version.


(PS3 vs. Wii - couldn't find a good picture of the Xbox Version)
 
Ok? But that's UE3, not UE4.

I specifically recall Mark Rein said they went to the console manufacturers and told them what they want out of next gen consoles, even showing them the Samaritan Demo as a base. Seems like a missed opportunity for Nintendo if Wii U wasn't "buried" by 1 Tflop hardware or less.

That, or Iwata is the most incompetent person ever, though I doubt he would be allowed to have a job if he pulled off something like that (or should be fired if he did).

Edit: Epic games aren't the only developers out there. I also mentioned Agni's Philosophy (Squenix) so what's their excuse?

Mark Rein said multiple times that if devs wanted to, they could bring UE4 to Wii U, just like they had to with UE2.
 

joesiv

Member
examples?

And of course the gap wouldn't be big if you talk about games which used last-gen engines and assets.
No examples at this time, just from memory. I was a QA tester for games like FIFA, NBA Live, and NFS during the transition from PS2 to PS3, so I spent a lot of time staring at the games lol. For me to get examples would mean scouring Google for screen grabs, which I could do, but I don't have time ATM :) Maybe later. Screen grabs are hard though, as they're easy to nitpick, but seeing them in motion is closer to the reality of "what people will notice". Though I'm all for pixel peeping! I'll see if I can pull out examples, or you can if you have time. Examples I'm thinking of are NFS Most Wanted (same engine), NBA Live 2006 (it was on a new engine for "next gen", though I think the PS3 SKU was cancelled), and FIFA 2006 (I don't remember what engine it was).
 

Schnozberry

Member
Yes, I don't like making launch game comparisons, but I did so in light of reports that Wii U was using a familiar architecture. I find it a bit laughable that Iwata only backpedaled with comments that it's "new hardware that takes time to exploit" when the first wave of titles failed to visually impress. It's a valid point, and the memory architecture and SDK definitely seem to require some learning, but I don't think it should be as drastic a learning curve as going from the PS2 generation to this one. Back then, devs were forced to learn how to exploit shaders, weird in-order CPUs, non-unified memory pools in the case of PS3, and all that jazz. Yet still, even those initial games gave us that bump up to HD resolutions. That alone showed the hardware to be indisputably more capable.

This may seem like a silly question, but what kind of bump would have been convincing enough to sell that launch lineup to early adopters? Many of them were late ports at full price, or shovel ware, and the Nintendo franchises ready at launch were received like a fart in church by core gamers.

I run a PC at home that is vastly superior in terms of raw computing power to what we're going to see from any of these consoles, and going back and forth between it and console games for me is not nearly as jarring as what we saw last gen just due to the lack of a real resolution bump. Maybe I'm in the minority and I'm willing to accept that, but I think the failure of the Wii U thus far has had far more to do with content (or lack thereof) than computational power.

Edit: I should clarify that I'm just being rhetorical, and I do think you're correct to say that the transition from last gen was much harder, and the gap between the 360/PS3 and Wii U is much smaller than most have come to expect from a generational change. That being said, I don't think the other consoles will be completely immune to that problem. As people who love and obsess about raw specs, I think we lose sight of what is visually apparent to the untrained eye of John Q. Gamer. If you compare the best efforts on the OG Xbox to the best efforts on the 360, the added muscle is obvious and requires no explanation. I think you're going to have to do a lot more explaining going from the 360 to the Xbox One, at least early in its life.
 
Here is why I think porting to Wii U will be hard
Warning, stupid multipliers coming up!

If the Wii U is 2x the strength of the 360 (360 = 250 GFLOPs, Wii U = ~350 + unknown hardware, let's say 500, just for a solid number),
then the Xbone is still over twice the Wii U (Xbone = 1.23 / 0.5 = 2.46) + the efficiencies of HSA/APU.

Let's factor in RAM differences, capacity and bandwidth...

The gap between the 360 and Wii U is smaller than the gap between the Wii U and Xbone.

This is why porting will be difficult.

360 x 2 = Wii U
Wii U x 2.5-3 = Xbone

Of course... without solid figures I can only assume that the Wii U is about 2x as powerful as the 360.

/Stupid multipliers.
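
Consolidating those stupid multipliers (all figures are the rough guesses above, nothing more):

Code:
x360, wiiu_guess, xbone = 250, 500, 1230  # GFLOPs
print(wiiu_guess / x360)    # 2.0  -> 360 to Wii U
print(xbone / wiiu_guess)   # 2.46 -> Wii U to Xbone, before RAM/HSA differences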
 
This may seem like a silly question, but what kind of bump would have been convincing enough to sell that launch lineup to early adopters? Many of them were late ports at full price, or shovel ware, and the Nintendo franchises ready at launch were received like a fart in church by core gamers.

I run a PC at home that is vastly superior in terms of raw computing power to what we're going to see from any of these consoles, and going back and forth between it and console games for me is not nearly as jarring as what we saw last gen just due to the lack of a real resolution bump. Maybe I'm in the minority and I'm willing to accept that, but I think the failure of the Wii U thus far has had far more to do with content (or lack thereof) than computational power.

I agree it's mostly a content problem, but that may, in turn, be due to a computational problem. Already, we have seen devs like DICE and 4A decide that the effort to get their current gen engines up and running on Wii U was not worth it. These are devs that actually tried. We can argue about whether they tried hard enough, but I'm not buying the conspiracy that they just dislike Nintendo. And those decisions were made before Wii U's disastrous launch.

To answer your question about what it would take, going by the WUSTs, I would guess people would have liked to see true 720p across the board, more solid framerates in games like CoD, better AA, and higher-res textures. That would have satisfied me, at least. There was a lot of hype for Wii U from the time it was first rumored and well into last year. The critical moment where the tide seems to have turned was E3 2012, where basically nothing new was announced and the third party titles didn't seem to be looking so hot.
 

Mithos

Member
Ok? But that's UE3, not UE4.

I specifically recall Mark Rein said they went to the console manufacturers and told them what they want out of next gen consoles, even showing them the Samaritan Demo as a base. Seems like a missed opportunity for Nintendo if Wii U wasn't "buried" by 1 Tflop hardware or less.

That, or Iwata is the most incompetent person ever, though I doubt he would be allowed to have a job if he pulled off something like that (or should be fired if he did).

Edit: Epic games aren't the only developers out there. I also mentioned Agni's Philosophy (Squenix) so what's their excuse?

So with UE4, it will be up to developers to port it themselves; that is what Mark has stated: UE4 can run on Wii U, but they (Epic Games) won't port it and maintain an official build. Considering all the in-house engines that seem to be the "thing" now, though, I doubt there will be many games using UE4 that need to be ported for a Wii U version.

Have there been any games from Square Enix released on Wii U yet?
 

JordanN

Banned
So with UE4, it will be up to developers to port it themselves; that is what Mark has stated: UE4 can run on Wii U, but they (Epic Games) won't port it and maintain an official build. Considering all the in-house engines that seem to be the "thing" now, though, I doubt there will be many games using UE4 that need to be ported for a Wii U version.

Have there been any games from Square Enix released on Wii U yet?
It's like you're tempting me to completely lose it or something...

I don't care about the engine. I stated this from the beginning. I'm talking about the technology that uses said engine.

There are no games from Squenix (that run on Luminous) but I don't think that will even matter if said games try to be as impressive as Agni's was.
 
I agree that it is partly a content problem, but that is beyond this thread. Anyway, I think power helps sell devices. That and features, but without power, you can't promote newer, better features.

It's like if Apple tried to sell an iPhone 6, but it was only a half step above iPhone 5 in hardware. They need to see the benefits. And not being able to sell the idea of the gamepad really causes a problem for Nintendo.
 
So with UE4, it will be up to developers to port it themselves; that is what Mark has stated: UE4 can run on Wii U, but they (Epic Games) won't port it and maintain an official build. Considering all the in-house engines that seem to be the "thing" now, though, I doubt there will be many games using UE4 that need to be ported for a Wii U version.

Have there been any games from Square Enix released on Wii U yet?
Dragon Quest X in Japan.
 

Schnozberry

Member
I agree it's mostly a content problem, but that may, in turn, be due to a computational problem. Already, we have seen devs like DICE and 4A decide that the effort to get their current gen engines up and running on Wii U was not worth it. These are devs that actually tried. We can argue about whether they tried hard enough, but I'm not buying the conspiracy that they just dislike Nintendo. And those decisions were made before Wii U's disastrous launch.

To answer your question about what it would take, going by the WUSTs, I would guess people would have liked to see true 720p across the board, more solid framerates in games like CoD, better AA, and higher-res textures. That would have satisfied me, at least. There was a lot of hype for Wii U from the time it was first rumored and well into last year. The critical moment where the tide seems to have turned was E3 2012, where basically nothing new was announced and the third party titles didn't seem to be looking so hot.

Yeah, the decisions were likely hardware related to a degree. How much I don't know. I'm sure some of it was business and finance related too, as THQ went tits up and EA has had to cleave their workforce and drastically reduce their output in the face of losses. I don't blame them at all for doing what they needed to do to make their business work, and I don't think you need conspiracies to explain it either. The companies that did do their ports for launch were clearly at a disadvantage due to the sorry state of the SDK, and I'm sure that left a bad taste in everyone's mouth as well. Nintendo needs to do some private apologies, for certain.

I edited my previous comment with a clarification while you were writing this, so I hope you read it and don't get the wrong impression from my comment.
 

Schnozberry

Member
It's like you're tempting me to completely lose it or something...

I don't care about the engine. I stated this from the beginning. I'm talking about the technology that uses said engine.

There are no games from Squenix (that run on Luminous) but I don't think that will even matter if said games try to be as impressive as Agni's was.

Agni's Philosophy ran on a high-end Intel CPU and an Nvidia GTX 680. If you think the Xbox One and PS4 will match them, I have a bridge to sell you in Brooklyn.
 
It's like you're tempting me to completely lose it or something...

I don't care about engine. I stated this from the beginning. I'm talking about the technology that uses said engine.

There are no games from Squenix but I don't think that will even matter if said games try to be as impressive as Agni's was.

From what I recall, Square Enix may not even focus on a lot of "next-gen" games in general compared to smartphones. We will have to wait and see what they do.
 