
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Donnie

Member
I have already posted the real numbers.


Look if you want to learn about this stuff go read the old thread.

Your evasion of anything you don't understand enough to argue about is quite pathetic, and trying to deflect that lack of understanding onto me won't work.
 
Erm no I'm not. Jaguar is 1 thread per core, so 8 cores equals 8 threads. CMT is clustered multithreading, which AMD uses in Bulldozer, where two cores share the resources of a single module in order to simulate something similar to SMT. Now if you want to claim Jaguar has SMT then fine, but then Durango has a 4 core dual threaded CPU.
I didn't realize you knew that/were taking that into account.

It's technically or theoretically SMT; just a different incarnation of it, but it's up for interpretation I guess and I'll respect your opinion of it.

As for the reason for it to be so, like I said, small-pipelined architectures usually don't benefit much, if at all, from SMT capabilities; SMT is like having a house with a high ceiling, and since ceiling space is not often used/taken advantage of, that's bandwidth going to waste. SMT is a way to try and make use of it.

Small-pipelined CPUs, though, have no such high ceiling. I don't know the pipeline length for Jaguar, but it's certainly not huge; at most Core 2 Duo big, and that chip lacked SMT just the same. The CMT solution is actually very clever, though I don't know how it behaves in real world scenarios.
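
To make the "high ceiling" analogy a bit more concrete, here's a minimal toy model (my own illustration with made-up utilization numbers, not measurements of any of these chips) of why SMT pays off on stall-prone, long-pipeline cores and barely registers on short-pipeline ones:

Code:
# Toy model: a single thread leaves some fraction of issue slots empty (branch
# mispredicts, cache misses; long pipelines make the stalls hurt more), and SMT
# lets a second thread fill part of that idle "ceiling space".
# Every number below is an illustrative assumption, not a measurement.
def smt_speedup(single_thread_utilization, recoverable_fraction=0.6):
    idle = 1.0 - single_thread_utilization
    return 1.0 + idle * recoverable_fraction

# Long-pipeline, stall-heavy core: plenty of idle slots, so SMT looks attractive.
print(smt_speedup(0.55))   # ~1.27x
# Short-pipeline, already well-utilized core (750/Jaguar-class): little headroom,
# and the cost of sharing caches and queues can easily eat the ~9% that is left.
print(smt_speedup(0.85))   # ~1.09x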
Your evasion of anything you don't understand enough to argue about is quite pathetic, and trying to deflect that lack of understanding onto me won't work.
He doesn't understand enough of anything to make an argument, apparently. Pretty sure English is also some kind of barrier there (to understanding and to constructing an argument); it just feels wrong, and I'm far from a native speaker so I'm probably not as sensitive to those things.

It's like arguing with those AI bots, their answers are short and their core goal is to try to keep it ambiguous, because that makes the chances of them not being totally off higher.


On the flip side, I bet the Twitter word limit doesn't feel limiting to him.
 

Donnie

Member
I didn't realize you knew that/were taking that into account.

It's technically or theoretically SMT; just a different incarnation of it, but it's up for interpretation I guess and I'll respect your opinion of it.

As for the reason for it to be so, like I said, small-pipelined architectures usually don't benefit much, if at all, from SMT capabilities; SMT is like having a house with a high ceiling, and since ceiling space is not often used/taken advantage of, that's bandwidth going to waste. SMT is a way to try and make use of it.

Small-pipelined CPUs, though, have no such high ceiling. I don't know the pipeline length for Jaguar, but it's certainly not huge; at most Core 2 Duo big, and that chip lacked SMT just the same. The CMT solution is actually very clever, though I don't know how it behaves in real world scenarios.

Yeah, I agree SMT isn't always needed. I was really just making a point to phosphor112 about how SMT or the lack thereof doesn't make something necessarily good or bad. Something we both seem to agree on.

It's like arguing with those AI bots, their answers are short and their core goal is to try to keep it ambiguous, because that makes the chances of them not being totally off higher.

A very apt description of some extremely irritating behavior :)
 

krizzx

Junior Member
Suddenly it would be more efficient. And even according to the Wii U CPU thread, it's based off of the PPC 750 core that the Wii was based off of.

It's based off of it in that it's part of the same family. There are substantially more improvements in Espresso than a 3-core Broadway at a higher clock speed would suggest; I'm pretty sure the idea of it simply being a 3-core Broadway was ruled out on architectural grounds, if I'm not mistaken.
 

wsippel

Banned
I don't even know if I replied to this (on muh phone) but this is a great post. The stock 750 was locked at a lower clock rate wasn't it? And yes, I agree that the Wii U GPU is supposed to do most of the work. It's capable of compute functions and those fixed functions can make up for any general tasks on the GPU taking up processing power.
The fastest (well, highest clocked) 750 used to be the 750GX, clocked at 1GHz. It also had 1MB L2 cache. That wasn't the last 750, however: The 750CL was. The 750CL is essentially Broadway. Less cache and clocked lower compared to the 750GX, but with all the customizations previously specific to Nintendo's cores: The same extended instruction set, paired singles, write gather pipe, L1d locking and DMA.

Interestingly enough, if you browse the EEMBC database, the 750CL is faster than the 750GX in almost all benchmarks, but there are a few where the 750GX pulls ahead thanks to its larger cache. Espresso combines the extended functionality of the CL with the larger cache of the GX, introduces SMP and apparently an enhanced, wider 60x bus, while also bumping the clock to 1.25GHz - 25% higher than any 750 before it.


EDIT: Also, SMT is mostly used to achieve better utilization on CPUs with long pipelines and/ or lots of execution units. A 750 wouldn't benefit from SMT. In fact, it would do more harm than good.
 

USC-fan

Banned
The reason for the performance hit was the 64-bit FPU datapath in Bobcat. If you ran that test on Jaguar you would get very different results.

Your evasion of anything you don't understand enough to argue about is quite pathetic, and trying to deflect that lack of understanding onto me won't work.

Weird, seeing as I have already posted the reason for the poor performance of Bobcat in that data set. If you can't understand what that means then I'm not going to explain it. I really do not care.
 

Donnie

Member
Weird, seeing as I have already posted the reason for the poor performance of Bobcat in that data set. If you can't understand what that means then I'm not going to explain it. I really do not care.

I won't bother wasting my time with a new post responding to something I already responded to here:

http://www.neogaf.com/forum/showthread.php?p=49391532#post49391532

You don't know what performance difference that makes, and neither do I until we're shown benchmarks. But my point is it's pretty ridiculous to call Espresso cores measly when what people say is the weakest part of their design (floating point performance) is significantly better per clock than a Bobcat core. AMD claim Jaguar is 20% more powerful in floating point performance than Bobcat, according to the numbers phosphor112 posted. Now it may be that in reality it's more than that (though reality rarely if ever exceeds PR), but even if it was, calling Espresso cores measly is still ludicrous given what we've seen even from older Broadway cores vs Bobcat. It's not even based on any knowledge of each CPU's performance, simply the idea that one is "based off older technology" while the other is apparently brand spanking new. Forgetting the fact that Jaguar is going to be based off some older core or another, who honestly cares what each CPU is based on? Performance is all that matters. 8 Jaguar cores at 1.6GHz are going to be quite a bit better than 3 Espresso cores at 1.24GHz, that much is obvious. But I take issue with exaggerations based purely on the perception of "newness" making one piece of hardware massively better than another.

You couldn't carry the discussion on and instead have crawled behind ridiculous platitudes.

NOTE: I never said Jaguar wouldn't be faster than Bobcat. Only that you don't know what is limiting Bobcat in that test really or what kind of difference a wider data path would make specifically.
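
To put rough numbers on the comparison above, here's some napkin math (my own illustration, using only the figures quoted in this discussion; the per-clock factor is left as an explicit unknown because nobody here has benchmarked either chip):

Code:
# Napkin math only: aggregate core-clock throughput, ignoring SIMD width, the
# memory system, and real per-clock differences (which nobody here has measured).
def aggregate_ghz(cores, clock_ghz, per_clock_factor=1.0):
    return cores * clock_ghz * per_clock_factor

jaguar_8core   = aggregate_ghz(8, 1.6)    # 12.8 "core-GHz"
espresso_3core = aggregate_ghz(3, 1.24)   # ~3.7 "core-GHz"
print(jaguar_8core / espresso_3core)      # ~3.4x before any per-clock factor

# AMD's claimed ~20% FP uplift of Jaguar over Bobcat would scale one side, and any
# per-clock edge for Espresso would scale the other; the point is that "measly"
# or not depends on per-clock numbers neither side has actually shown.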
 
The fastest 750 used to be the 750GX, clocked at 1GHz. It also had 1MB L2 cache. That wasn't the last 750, however: The 750CL was. The 750CL is essentially Broadway. Less cache and clocked lower compared to the 750GX, but with all the customizations previously specific to Nintendo's cores: The same extended instruction set, paired singles, write gather pipe, L1d locking and DMA.

Interestingly enough, if you browse the EEMBC database, the 750CL is faster than the 750GX in almost all benchmarks, but there are a few where the 750GX pulls ahead thanks to its larger cache. Espresso combines the extended functionality of the CL with the larger cache of the GX, introduces SMP and apparently an enhanced, wider 60x bus, while also bumping the clock to 1.25GHz - 25% higher than any 750 before it.
Great post.

Do you have those benchmarks at hand by any means? I'd like to take a look at them. :)

And 750 GX actually could pull 1.1 GHz.
EDIT: Also, SMT is mostly used to achieve better utilization on CPUs with long pipelines and/ or lots of execution units. A 750 wouldn't benefit from SMT. In fact, it would do more harm than good.
I said as much (well, minus the "more harm than good" part), but I'm kinda wary of claiming such things out of the blue; I believe no one had put it like that before.

See, I'm a theoretical kind of guy; I mostly grasp stuff, understand SMT and why it's absent from small-pipeline designs (it just makes sense), but I still get cold feet claiming stuff that I haven't seen claimed before. I do realize you're more in the know than I am, so it's good to feel reassured. Thanks.
 

Schnozberry

Member
I don't even know if I replied to this (on muh phone) but this is a great post. The stock 750 was locked at a lower clock rate wasn't it? And yes, I agree that the Wii U GPU is supposed to do most of the work. It's capable of compute functions and those fixed functions can make up for any general tasks on the GPU taking up processing power.

The 750 cores on the Espresso die are an entirely different configuration from anything anyone has used in a mainstream tech product before. The cache is sizable for a 750, and the MCM it sits on has a rather large amount of high bandwidth, low latency memory. It has a very high clock speed for such a short pipeline, and it has the advantage of being out of order in comparison to the 360 and PS3.

The Wii U also has a custom GPU with a more modern core and very likely much better real world output than Xenos or RSX merely from the latency advantages of the architecture, let alone whatever theoretical performance advantage it may have. Nintendo probably thought they were making a platform people would want to program for. It has a very sane CPU and a modern GPU, with a lot more memory to work with than people were used to. A lot of people have said positive things about it; it's just that they are dismissed for being too close to Nintendo, or their game is dismissed for not meeting some arbitrary level of graphical complexity.

Evidence points to Nintendo not even being close to having their tool chain completed until near launch, with most devs not having access to the higher clocked dev kits and final API until they were close to going gold with launch software. That, if anything, is the reason this launch was a blunder. They knew it wasn't ready, and the OS software was a work in progress. They also had a lot of third party software that wasn't well optimized as a result.

The few games coming out this month look much better. If anything, Nintendo has at least been sincere about wanting to fix the problems.

EDIT: Great points made above by wsippel and lostinblue.
 

OryoN

Member
Are people really expecting something on the level of KZ4 for WiiU ? :O.

There is no way imo, that game probably has a budget close to $100 million and it's made by one of the most renowned first party developers (in terms of visuals) in the industry.

Then you have the hardware which is about 6x as powerful as WiiU and more up to date (DX10.1 effects vs DX11.1).

Even at 720p I don't think WiiU is pulling off anything close to KZ4 but that is not to say it won't have some amazing looking first party games.

People need to lower their expectations though.

It's true that expectations can be a little high, but the opposite is true also. That's a pretty bold statement there. It's almost like everyone expects games on every platform in the history of gaming to demonstrate significantly improved visuals as time goes on... except for Wii U.

For the record, I don't think KillZone: Shadow Fall's visuals during actual gameplay were all that impressive (but they looked great), making your statement all the more bold. Just because the PS4 itself is a few times more powerful than Wii U, that doesn't mean this translates to every game, especially this early on. Wii U visuals would have to remain nearly stagnant for its entire lifetime for it NOT to even come CLOSE to (forget matching) those visuals.

Apart from rendering all that at 1080p, I don't see anything in KZ:SF - so far - that would be impossible on Wii U in its lifetime. The more impressive looking portions of the game were all during the points when the devs are in total control of what the player sees. As a result, the demo was easily more visually/artistically impressive than it was technically. If it comes down to sheer visual beauty over technical prowess, I believe that - over time - we may see Wii U games that actually stand favorably against KillZone: SF. It's just way too early in the console's life to put such a definitive cap on what it can/can't achieve graphically.
 
For the record, I don't think KillZone: Shadow Fall's visuals during actual gameplay were all that impressive (but they looked great), making your statement all the more bold.

The more impressive looking portions of the game were all during the points when the devs are in total control of what the player sees.
It's rather strange how often this criticism is leveled against KillZone: Shadow Fall, when not too long ago people were fawning over how impressive pre-staged scenes like the Zelda tech demo and the bird tech demo supposedly were; both ran at 720p30, and the latter was choppy in places even at 30fps according to Digital Foundry.
 
The fastest (well, highest clocked) 750 used to be the 750GX, clocked at 1GHz. It also had 1MB L2 cache. That wasn't the last 750, however: The 750CL was. The 750CL is essentially Broadway. Less cache and clocked lower compared to the 750GX, but with all the customizations previously specific to Nintendo's cores: The same extended instruction set, paired singles, write gather pipe, L1d locking and DMA.

Interestingly enough, if you browse the EEMBC database, the 750CL is faster than the 750GX in almost all benchmarks, but there are a few where the 750GX pulls ahead thanks to its larger cache. Espresso combines the extended functionality of the CL with the larger cache of the GX, introduces SMP and apparently an enhanced, wider 60x bus, while also bumping the clock to 1.25GHz - 25% higher than any 750 before it.


EDIT: Also, SMT is mostly used to achieve better utilization on CPUs with long pipelines and/ or lots of execution units. A 750 wouldn't benefit from SMT. In fact, it would do more harm than good.

Didn't know it had a bus increase. Also, out of curiosity, how does one measure the length of the pipelines? Jaguar is supposed to have really short pipelines, but I never knew how anyone knew those.
 
It's true that expectations can be a little high, but the opposite is true also. That's a pretty bold statement there. It's almost like everyone expects games on every platform in the history of gaming to demonstrate significantly improved visuals as time goes on... except for Wii U.

For the record, I don't think KillZone: Shadow Fall's visuals during actual gameplay were all that impressive (but they looked great), making your statement all the more bold. Just because the PS4 itself is a few times more powerful than Wii U, that doesn't mean this translates to every game, especially this early on. Wii U visuals would have to remain nearly stagnant for its entire lifetime for it NOT to even come CLOSE to (forget matching) those visuals.

Apart from rendering all that at 1080p, I don't see anything in KZ:SF - so far - that would be impossible on Wii U in its lifetime. The more impressive looking portions of the game were all during the points when the devs are in total control of what the player sees. As a result, the demo was easily more visually/artistically impressive than it was technically. If it comes down to sheer visual beauty over technical prowess, I believe that - over time - we may see Wii U games that actually stand favorably against KillZone: SF. It's just way too early in the console's life to put such a definitive cap on what it can/can't achieve graphically.

Technically speaking, Shadow Fall is doing things Wii U wouldn't be able to do at 30fps, even if it was downgraded to 720p. The lighting and volumetric effects are pretty insane. It's basically equal to Crysis 3 on a high-end PC right now. Check out the DF tech analysis for more information.

http://www.eurogamer.net/articles/digitalfoundry-killzone-shadow-fall-demo-tech-analysis
 

wsippel

Banned
Great post.

Do you have those benchmarks at hand by any means? I'd like to take a look at them. :)

And 750 GX actually could pull 1.1 GHz.
Yeah, the 750GX could run at 1.1GHz with sufficient cooling, but I believe that was still beyond IBM's specifications.

As for the benchmarks (CL/ GX, both at 1GHz):

TeleBench 1.1: 32.3/ 30.3
OABench 1.1: 1535.9/ 1452.0
AutoBench 1.1: 1241.2/ 1155.8
TCPmark: 475.0/ 467.1
IPmark: 153.6/ 286.1
DENBench 1.0: 138.7/ 173.6
ConsumerBench 1.1: 95.7/ 124.0

http://www.eembc.org/benchmark/index.php

Those aren't the same benchmarks I've seen last time, but I can't find those anymore. The EEMBC site is terrible. The GX in the benchmarks I've posted above was using faster RAM compared to the CL, which screws up the results a little. Still, when the GX pulls ahead, the difference is significant. Pretty impressive what a couple hundred kB L2 cache can do for a processor.
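
For anyone who'd rather see the relative gaps than the raw scores, here's a quick way to eyeball the numbers quoted above (assuming higher is better in every suite, which matches how the results read):

Code:
# Ratios of the scores quoted above (750CL first, 750GX second), both at 1GHz,
# assuming higher is better in every suite.
scores = {
    "TeleBench 1.1":     (32.3, 30.3),
    "OABench 1.1":       (1535.9, 1452.0),
    "AutoBench 1.1":     (1241.2, 1155.8),
    "TCPmark":           (475.0, 467.1),
    "IPmark":            (153.6, 286.1),
    "DENBench 1.0":      (138.7, 173.6),
    "ConsumerBench 1.1": (95.7, 124.0),
}
for name, (cl, gx) in scores.items():
    lead = "CL" if cl > gx else "GX"
    print(f"{name}: {lead} ahead by {abs(cl - gx) / min(cl, gx):.0%}")
# The CL wins the first four by single digits; where the GX wins (IPmark, DENBench,
# ConsumerBench) the larger L2 (plus the faster RAM noted above) is worth 25-86%.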
 
Yeah, the 750GX could run at 1.1GHz with sufficient cooling, but I believe that was still beyond IBM's specifications.

As for the benchmarks (CL/ GX, both at 1GHz):

TeleBench 1.1: 32.3/ 30.3
OABench 1.1: 1535.9/ 1452.0
AutoBench 1.1: 1241.2/ 1155.8
TCPmark: 475.0/ 467.1
IPmark: 153.6/ 286.1
DENBench 1.0: 138.7/ 173.6
ConsumerBench 1.1: 95.7/ 124.0

http://www.eembc.org/benchmark/index.php

Those aren't the same benchmarks I've seen last time, but I can't find those anymore. The EEMBC site is terrible. The GX in the benchmarks I've posted above was using faster RAM compared to the CL, which screws up the results a little. Still, when the GX pulls ahead, the difference is significant. Pretty impressive what a couple hundred kB L2 cache can do for a processor.

People really underestimate the importance of cache for the CPU. Espresso's cache alone should give it a big speed bump over Gekko, not just the clock speed. Also, IIRC, thanks to the cache being eDRAM and therefore having very low latency, fewer cycles are wasted accessing the cache. Another thing that gives Espresso more juice.
 
Eh... SRAM is always lower latency than DRAM, only reasons to use DRAM are size and cost.
But when dealing with such low frequency processors, isn't there a way to mitigate it, for example by clocking the eDRAM higher than you would clock the SRAM in the same context?
I mean, the WiiU is 100% backwards compatible, and it uses one of the cores, clocked at the same speed as the Wii's Broadway and with half the L2 cache. In Broadway this L2 cache was SRAM, and now it's eDRAM. Doesn't that mean that the eDRAM plus its memory controller on the WiiU is at least equal in terms of efficiency to the SRAM plus its memory controller on the original Wii... at the same CPU clock speed?
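
A minimal sketch of why the L2 latency question matters less than it might seem (a simple average-memory-access-time model with invented cycle counts; Espresso's real figures haven't been published):

Code:
# Average memory access time (AMAT) with invented latencies, in CPU cycles.
# None of these numbers are measured Espresso values; they only show the shape.
def amat(l1_hit, l1_lat, l2_hit, l2_lat, mem_lat):
    return l1_lat + (1.0 - l1_hit) * (l2_lat + (1.0 - l2_hit) * mem_lat)

# Same workload, SRAM-like L2 (say 10 cycles) vs a slower eDRAM L2 (say 15 cycles):
print(amat(0.95, 2, 0.90, 10, 200))   # ~3.5 cycles on average
print(amat(0.95, 2, 0.90, 15, 200))   # ~3.75 cycles on average
# A few extra L2 cycles barely move the average; a denser eDRAM array that raises
# the hit rate (fewer trips to main memory) buys far more, which is the usual
# argument for trading SRAM for eDRAM at these clock speeds.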
 
Yeah, the 750GX could run at 1.1GHz with sufficient cooling, but I believe that was still beyond IBM's specifications.
I believe it was off the shelf like that, multiplier is locked so it would mean messing with FSB otherwise. Notice that kit has the same FSB number for both 1 and 1.1 GHz configurations
Just for the record, and sake of clarity as we ponder, didn't Iwata say that the Wii U CPU is purposely underclocked?
I don't know about that, but it certainly is.

It's a Nintendo trend: they prefer to go lower to keep the FSB/CPU clock differential/multiplier at bay, no higher than 3x (this reduces bottlenecking), and they'll avoid running stuff too hot at all costs.

This architecture also kinda loses its power effectiveness if clocked too high, as illustrated by this PPC750CL table:



Nintendo probably went for a balance of performance versus consumption for every part of the system. Also notice Wii's Broadway 729 MHz rating stands in the middle of that official sheet.
 
It's true that expectations can be a little high, but the opposite is true also. That's a pretty bold statement there. It's almost like everyone expects games on every platform in the history of gaming to demonstrate significantly improved visuals as time goes on... except for Wii U.

For the record, I don't think KillZone: Shadow Fall's visuals during actual gameplay were all that impressive (but they looked great), making your statement all the more bold. Just because the PS4 itself is a few times more powerful than Wii U, that doesn't mean this translates to every game, especially this early on. Wii U visuals would have to remain nearly stagnant for its entire lifetime for it NOT to even come CLOSE to (forget matching) those visuals.

Apart from rendering all that at 1080p, I don't see anything in KZ:SF - so far - that would be impossible on Wii U in its lifetime. The more impressive looking portions of the game were all during the points when the devs are in total control of what the player sees. As a result, the demo was easily more visually/artistically impressive than it was technically. If it comes down to sheer visual beauty over technical prowess, I believe that - over time - we may see Wii U games that actually stand favorably against KillZone: SF. It's just way too early in the console's life to put such a definitive cap on what it can/can't achieve graphically.

One of the main reasons I don't think we will see an exclusive WiiU game on par with KZ4 is simply budget; Nintendo wouldn't consider spending $80 - $100 million on a single game.

With regard to FPSs, if WiiU gets BF4 then I think it will look decent, but much more like the PS360 versions than the PS4/720 versions; although the GPU supports modern features, it's lacking any real grunt compared to PS4's and Durango's GPUs.
 

joesiv

Member
The lighting and volumetric effects are pretty insane.
While I don't think the WiiU will be pulling off Killzone, I also think people are giving Killzone too much credit.

A lot of the "volumetric" effects that people talk about are faked sprite-based effects, some of them laughable. For example, the "wind" from the transports is really poorly done and often pops. Another example is the waterfall off the building; I've heard a bunch of people comment on how great it looks, but it's clearly not volumetric and is just sprite based as well. Explosions look great, however, but could also be pre-rendered (especially for non-interactive portions).

The lighting as well: a lot of people are wowed by the fly-in portions and how the lighting looks. The reflections are not real-time; you can clearly see on the large buildings that it's just a reflection map. The building geometry is instanced and very low detail. It's just decent lighting across a large space with faked reflections that give it most of the wow.

Having said that, there are some lighting effects that would cause the Wii U to struggle, and even doing all those alpha transparencies for effects may also be a struggle. I just find that people assume Killzone isn't faking half the stuff it is. Though it should be said, Killzone isn't released yet, some say it's only using 2GB of VRAM, and it's a launch title...
 
I believe it was off the shelf like that, multiplier is locked so it would mean messing with FSB otherwise. Notice that kit has the same FSB number for both 1 and 1.1 GHz configurations
I don't know about that, but it certainly is.

It's a Nintendo trend: they prefer to go lower to keep the FSB/CPU clock differential/multiplier at bay, no higher than 3x (this reduces bottlenecking), and they'll avoid running stuff too hot at all costs.

This architecture also kinda loses its power effectiveness if clocked too high, as illustrated by this PPC750CL table:



Nintendo probably went for a balance of performance versus consumption for every part of the system. Also notice Wii's Broadway 729 MHz rating stands in the middle of that official sheet.
Well, in terms of frequency vs consumption this has always gone the same way, consumption never scales linearly.
And then, because of the short pipeline, you reach the frequency limit much earlier than with other designs.
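
A rough sketch of what "never scales linearly" looks like (the textbook dynamic-power relation with invented voltage points, purely illustrative and nothing to do with IBM's actual curves):

Code:
# Dynamic power roughly follows P ~ C * V^2 * f, and pushing frequency usually
# means raising voltage too, so power grows much faster than clock speed.
# The voltages below are invented for illustration.
def relative_power(freq_mhz, voltage, base_freq=729, base_voltage=1.15):
    return (freq_mhz / base_freq) * (voltage / base_voltage) ** 2

print(relative_power(729, 1.15))    # 1.0x  -> Broadway-like baseline
print(relative_power(1000, 1.20))   # ~1.5x power for ~37% more clock
print(relative_power(1243, 1.30))   # ~2.2x power for ~70% more clock
# Hence the sweet spot sitting in the middle of the official clock/power table.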
 
Yeah, the 750GX could run at 1.1GHz with sufficient cooling, but I believe that was still beyond IBM's specifications.

As for the benchmarks (CL/ GX, both at 1GHz):

TeleBench 1.1: 32.3/ 30.3
OABench 1.1: 1535.9/ 1452.0
AutoBench 1.1: 1241.2/ 1155.8
TCPmark: 475.0/ 467.1
IPmark: 153.6/ 286.1
DENBench 1.0: 138.7/ 173.6
ConsumerBench 1.1: 95.7/ 124.0

http://www.eembc.org/benchmark/index.php

Those aren't the same benchmarks I've seen last time, but I can't find those anymore. The EEMBC site is terrible. The GX in the benchmarks I've posted above was using faster RAM compared to the CL, which screws up the results a little. Still, when the GX pulls ahead, the difference is significant. Pretty impressive what a couple hundred kB L2 cache can do for a processor.
I've been trying to understand that site for a while now; direct linking is purposefully broken, and the link you provided only says "suite is missing", so it's hard to get anything out of it (for people trying to navigate it, the benchmarks are actually located under scores, not on the benchmarks tab).

Anywho, luckily, after some sweat and tears I've managed to get this out of it:





-> PPC750CL/GX Comparison table

Perhaps that's what you were looking for? Almost every test gives PPC750CL the edge.

As for interpreting the benchmarks in there taking the obvious ones aside, I'll just leave this here.
Well, in terms of frequency vs consumption this has always gone the same way, consumption never scales linearly.
And then, because of the short pipeline, you reach the frequency limit much earlier than with other designs.
Yes, precisely; and Nintendo strives for the middle frequency balance tied to high FSB for the design, so of course it could go higher albeit with a different consumption/heat tradeoff.

Gamecube had a 162 MHz FSB @ 486 MHz whereas the PPC 750CXe it was based on featured a 133 MHz bus tied to a maximum clock frequency of 600 MHz. Broadway had 729 MHz with a 243 MHz bus when PPC 750CL topped out at 1 GHz with a maximum of 240 MHz supported memory bus (ok, here there's not much improvement, but that's because 750 CL is a Broadway anyway). Pretty consistently going lower on cpu clocks and supercharging the FSB. Oh, also every single time the bus has been 64 bit wide whereas CXe and CL usually had 32 bit buses/ram configurations.

If this thing scaled accordingly that would mean the multiplier was still locked at 3x, so a 414 MHz FSB, but wsippel talked about an enhanced 60x bus implementation (in relation to Gekko/Broadway/750CL?), which makes sense considering it has 800 MHz DDR3.

Also, Nintendo in the past sure loved their "perfect" multipliers, but this time around the clocks for the GPU (550 MHz) and the main processor (1243 MHz) seem to bear no relation, so the actual FSB is anyone's guess. That actually gives fuel to the supposition that the CPU was balanced down to favour the GPU, as with the GC/Wii architecture a 550 MHz GPU would mean a 1650 MHz CPU part.
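
Spelling out the multiplier arithmetic from the last two paragraphs (the 3x relation is the historical pattern described above; the Wii U figures marked as speculation are exactly that):

Code:
# Historical GC/Wii pattern: CPU clock = 3x the front-side bus.
print(486 / 162)    # 3.0 -> Gekko (GameCube)
print(729 / 243)    # 3.0 -> Broadway (Wii)

# If Espresso kept the same 3x multiplier (pure speculation):
print(1243 / 3)     # ~414 MHz FSB

# And if the GPU clock were still tied to the CPU the GC/Wii way:
print(550 * 3)      # 1650 MHz CPU, clearly not what shipped, so the old
                    # fixed relation looks broken on Wii U

# If the often-quoted 64-bit DDR3-1600 main memory figure is right (assumption):
print(64 / 8 * 1600e6 / 1e9)   # 12.8 GB/s peak main memory bandwidth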
 

deviljho

Member
One of the main reasons I don't think we will see an exclusive WiiU game on par with KZ4 is simply budget; Nintendo wouldn't consider spending $80 - $100 million on a single game.

This point is often neglected. Nintendo is already avoiding the "tech race" because they see development budgets rising. They've stated this many times. In fact, Iwata (at a recent Q&A) specifically mentioned budgets with respect to their own games. He more or less said that money wouldn't be spent on all games to make them look outstanding, but that there is also an expectation by consumers that a Zelda-type game not look "cheap."

So not only would Wii U miss out on 3rd party multiplatform games had its hardware been more powerful, but Nintendo themselves wouldn't want to commit extra resources to leverage that power for their own games.
 
Wouldn't it be smart of Nintendo (Iwata) to do an Iwata Asks for Need For Speed MW:U? Since they are the first developers to come out and say they are trying to push the console and make a respectable port, that would be an interesting interview and could give us more info about the tech of the console. But even though it's a good idea... it's probably not going to happen.

If EA allows it...
LOL ;)
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I know this is the Latte thread, but since there has been a lot of CPU talk lately, I'll allow myself this small remark.

Weird, seeing as I have already posted the reason for the poor performance of Bobcat in that data set. If you can't understand what that means then I'm not going to explain it. I really do not care.
So according to you, the necessary and sufficient condition for Bobcat to be topped by Broadway in that comparison is found entirely in the 64-bit width of Bobcat's fp pipeline. Unfortunately for your explanation, Broadway's fp pipeline is 'just as 64bit'. If reading the brochure, the way you normally do it, was enough, the two CPUs should have had identical or sufficiently-close per-clock performances. They didn't, though, and not by a negligible margin. What does the brochure say about that?
 
This point is often neglected. Nintendo is already avoiding the "tech race" because they see development budgets rising. They've stated this many times. In fact, Iwata (at a recent Q&A) specifically mentioned budgets with respect to their own games. He more or less said that money wouldn't be spent on all games to make them look outstanding, but that there is also an expectation by consumers that a Zelda-type game not look "cheap."

So not only would Wii U miss out on 3rd party multiplatform games had its hardware been more powerful, but Nintendo themselves wouldn't want to commit extra resources to leverage that power for their own games.

Yip, Nintendo will struggle to max out WiiU graphically imo. If they had added an extra core to the CPU, an extra 2GB of RAM and beefed up the GPU some more, then the console would have been in a far better position regarding third party support for the next five years (basically 50% the power of Durango).

What's done is done though; they chose to spend that $50 - $100 of build cost on the tablet.
 
Uncharted 1 and 2 had budgets of $20M; if Nintendo is engaging in HD development now, they should expect to be spending as much on their titles. Bankrolling Bayonetta 2 will probably be in that realm anyway.
 
Uncharted 1 and 2 had budgets of $20M; if Nintendo is engaging in HD development now, they should expect to be spending as much on their titles. Bankrolling Bayonetta 2 will probably be in that realm anyway.

Certainly, Twilight Princess cost $20 million on Gamecube; I expect the next Zelda to be at least $40 million, but there is no way they will go $60 million+ and into the realms of Killzone etc. imo.

The fact that most of their games are based on a 'cartoon'-like art style will mean their games cost a lot less and don't require Durango-like specs, never mind PS4-like specs.
 

deviljho

Member
Yip, Nintendo will struggle to max out WiiU graphically imo. If they had added an extra core to the CPU, an extra 2GB of RAM and beefed up the GPU some more, then the console would have been in a far better position regarding third party support for the next five years (basically 50% the power of Durango).

What's done is done though; they chose to spend that $50 - $100 of build cost on the tablet.

Except that I'm saying something a bit different, which is that some 3rd parties wouldn't port no matter how powerful the box. So making a powerful box is slightly wasteful in that respect. Coupled with the fact that a powerful box would not be fully utilized by Nintendo themselves with respect to increased budgets.

Anyway, Latte...
 

Darryl

Banned
Yip, Nintendo will struggle to max out WiiU graphically imo. If they had added an extra core to the CPU, an extra 2GB of RAM and beefed up the GPU some more, then the console would have been in a far better position regarding third party support for the next five years (basically 50% the power of Durango).

What's done is done though; they chose to spend that $50 - $100 of build cost on the tablet.

nintendo did a fantastic job pushing every single console they've ever released to the maximum, why do you think they'd struggle with the Wii U?
 

Amir0x

Banned
The fact that most of their games are based on a 'cartoon'-like art style will mean their games cost a lot less and don't require Durango-like specs, never mind PS4-like specs.

why is it that people have this notion in their head that 'cartoon'/stylistic games somehow don't benefit as much from processing power as any other genre/realistic style?

Visually unique stylistic/cartoon games have improved immeasurably this gen, just as much as games utilizing a realistic style. The same will continue to be true into the next power level bracket, and the one after that. There are so many features, effects, abilities etc that power unlocks for styles of all forms, and yes Nintendo's games can benefit from them just the same :p
 

deviljho

Member
nintendo did a fantastic job pushing every single console they've ever released to the maximum, why do you think they'd struggle with the Wii U?

It's not that they can't, technically. It's that they won't, financially. Iwata has already addressed this specifically for their own game development, and his words match the philosophy they've adopted for their hardware in response to rising game dev budgets as an industry-wide issue.
 

Darryl

Banned
It's not that they can't, technically. It's that they won't, financially. Iwata has already addressed this specifically for their own game development, and his words match the philosophy they've adopted for their hardware in response to game dev budgets as an industry wide issue.

i thought that was the reason why they didn't pursue high end specs on the Wii U, not a reason why they wouldn't push those lower-end specs themselves.

why is it that people have this notion in their head that 'cartoon'/stylistic games somehow don't benefit as much from processing power as any other genre/realistic style?

Visually unique stylistic/cartoon games have improved immeasurably this gen, just as much as games utilizing a realistic style. The same will continue to be true into the next power level bracket, and the one after that. There are so many features, effects, abilities etc that power unlocks for styles of all forms, and yes Nintendo's games can benefit from them just the same :p

they can definitely use the extra power just as much as any other game, but i think it's much easier to achieve "visually pleasing" status than it is in a realistic game.
 

Amir0x

Banned
they can definitely use the extra power just as much as any other game, but i think it's much easier to achieve "visually pleasing" status than it is in a realistic game.

Oh sure. Wind Waker is still visually pleasing, same as Okami. I love me some stylistic games, mmm
 

deviljho

Member
i thought that was the reason why they didn't pursue high end specs on the Wii U, not a reason why they wouldn't push those lower-end specs themselves.

Correct, but it's still unrealistic to expect them to increase game development budgets for certain titles just to make them look better.

http://www.nintendo.co.jp/ir/en/library/events/120127qa/04.html

Iwata said:
You are asking for my comment as a judge, but I also need to think about the software content, so my remarks are two sided. Looking at the software for home console systems, there are certainly the software titles for which very rich graphics must be reproduced on HD displays and which demand a large number of developers to spend a very long time to develop. It is one of the truths that a certain number of such software titles must be prepared, or the consumers will not be satisfied. But we do not think that any and all the software must be created in that fashion. When you look at Nintendo’s software, extraordinary rich graphics, massive gameplay volume and astonishing rendition effects are not necessarily the appealing point. It is, in fact, important for us that our games are appealing in other ways as well. An example of this is the Wii software, "RHYTHM HEAVEN FEVER," that we released last year in Japan. It became one of the hits, but if we had adopted rich photo-realistic graphics, it would have lost much of its appeal rather than improving its appeal. Similarly, about the Japanese title "Tomodachi Collection" for Nintendo DS, the developers themselves confirmed that this software is based upon the "cheap concept." It is not necessary for us to deploy a huge number of people in order to develop such games. When we need massive power and have a lack of internal resources, we collaborate with outside resources and pour necessary resources to where they are needed. We are increasing the frequency of working with outside developers where Mr. Miyamoto and our internal developers alone used to develop. At the same time, however, we do not forget to ask ourselves in each such opportunity, "Isn’t this something our internal resources alone could sufficiently deal with?" Also, when we have such a doubt in the development as, "Will such cheap pictures do in terms of today’s home console graphics’ standard?," sometimes we conclude that "showing such pictures are unique and rather appealing, so it’s OK." So, there are a variety of different ways to show the unique appeal of software. What’s important here is not to narrow down what we can do. Rather, we have to create the dynamic range of appeals that the consumers can appreciate. We decided to make a proposal of an additional screen into the Wii U controller because developers could think of a variety of different possibilities here and there of using both a big TV screen and a screen in a player’s hand. As we will showcase the Wii U at E3 in June this year, the detailed announcements must wait until then, but we are aiming to make a system which shall not be forced into competing with the others where the contenders can fight only with massive developer resources and long development times as their weapons. Having said that, however, as I mentioned, it is true that, in some software areas, we need to be engaged in the power games. Take The Legend of Zelda franchise, for example, the fans must be looking for the graphic representations that they do not see as cheap at all when the title is released for the Wii U. When it is necessary, we do not hesitate to role out our resources.

It's slightly ironic how he talks about trying to avoid longer development times. Just slightly.
 
Sounds to me like they aren't absolutely against bigger budgets. Just that they don't intend to increase the entire size of the company employee-wise in order to make such games, since relatively speaking, there will be fewer games that require huge teams and larger payrolls. He pretty much said that they will spend the money on the next Zelda, and that as they take on such projects, they will temporarily team up with other companies who are more expert in such areas. We're seeing it already with Shin Megami Tensei and Fire Emblem having been announced that they are serious about the collaboration aspect as well.
 

deviljho

Member
Sounds to me like they aren't absolutely against bigger budgets. Just that they don't intend to increase the entire size of the company employee-wise in order to make such games, since relatively speaking, there will be fewer games that require huge teams and larger payrolls. He pretty much said that they will spend the money on the next Zelda, and that as they take on such projects, they will temporarily team up with other companies who are more expert in such areas. We're seeing it already with Shin Megami Tensei and Fire Emblem having been announced that they are serious about the collaboration aspect as well.

That's what I said ;)
 
Not that I'd expect Nintendo to ever go the Rockstar way and put 100 million into a game, but they have yet to really decide on a series where they go all out and cost is little hindrance to the scope and effort put into the game. You'd think that Zelda would be the logical choice. It seems to be the somewhat sleeping giant that can go from 4-7 million in sales to a blockbuster HD, open-world hit. Time will tell I guess.

Even if they spend 65-70 million and only make a 10 million profit or so, they'd be bringing more serious gamers into the fold who yes, would expect more such efforts, but would also make use of their console investment with other solid games that were far less expensive to develop.

Point being, if Nintendo can't afford to take a front load of heavy debt to get a more expensive console into the hands of gamers, then they have to find out how to use the same strategy on a scale that works for them. The above just might work I think. Zelda, Metroid, Smash Bros, Xeno(?), i.e. 3-4 expensive games in a generation, along with proper marketing, flanked by a beautiful but less expensive 3D Mario or two, DK, swoops like Bayonetta 2, more collaborative efforts, and eventual 3rd party support would probably spell long term success.
 

Log4Girlz

Member
Not that I'd expect Nintendo to ever go the Rockstar way and put 100 million into a game, but they have yet to really decide on a series where they go all out and cost is little hindrance to the scope and effort put into the game. You'd think that Zelda would be the logical choice. It seems to be the somewhat sleeping giant that can go from 4-7 million in sales to a blockbuster HD, open-world hit. Time will tell I guess.

Even if they spend 65-70 million and only make a 10 million profit or so, they'd be bringing more serious gamers into the fold who yes, would expect more such efforts, but would also make use of their console investment with other solid games that were far less expensive to develop.

Point being, if Nintendo can't afford to take a front load of heavy debt to get a more expensive console into the hands of gamers, then they have to find out how to use the same strategy on a scale that works for them. The above just might work I think. Zelda, Metroid, Smash Bros, Xeno(?), i.e. 3-4 expensive games in a generation, along with proper marketing, flanked by a beautiful but less expensive 3D Mario or two, DK, swoops like Bayonetta 2, more collaborative efforts, and eventual 3rd party support would probably spell long term success.

Much like how car companies release a halo car, even if it is sold at a loss, to garner attention and advertising for their brand, Nintendo needs a big game. I get the sense that is what Retro is working on.
 
Well they'll have big games. For instance, no doubt that NSMB U will sell amazingly over the course of the console's lifetime. But they need big games that graphically and scope-wise make the demographic that likes that sort of thing take the Wii U plunge. Even the most graphics-whorish of anti-Nintendo fanboys would have to think about getting a Wii U if Nintendo put 70 Million or so into a Zelda game that for all of its user friendliness, catered to the 'hardcore' in significant ways.
 

deviljho

Member
Yeah, well, in Iwata's IGN interview on the power of the Wii U he said the opposite. One, he said (as we should know) that not every Nintendo-made game is gonna try and push the system graphically. So yes, we will see the Zelda and Metroid type titles push the system and have bigger budgets, as he said they will look to push those games as the fans expect them to. He also said in that interview that they know there are consumers that (for lack of a better term) are graphics whores. He said they would do everything they can to make sure this upcoming gen isn't like the Wii vs PS360. To me it all comes down to engine and optimization. I think their in-house engine is gonna be a beast if the rumors are true and it's being developed by Retro.

yeah well? i haven't said anything different from what he said in the investor Q&A, or whatever he might have said in an IGN interview.

Nintendo spending big bucks on certain franchises based on market expectations is a given. They already spend lots of money on R&D for Zelda games. Btw, they aren't interested in spending as much money on a Metroid game.

Just because I framed my initial point as "Nintendo wouldn't spend extra money making certain games prettier" doesn't negate the above point. It's just another way of stating that fact. Some games will get big budgets while other games will get smaller budgets. Could those small budget games look better on the Wii U with more money thrown at them? Sure, but Nintendo has already made it clear it won't go down that route. I'm not knocking Nintendo here.

The original point was that improving the tech specs of the console would be a waste since they were never going to spend money utilizing it for all of their games. Which is the exact point Iwata makes over and over with regard to the industry players caught up in the graphics race.

The secondary point, which I made much earlier, goes hand in hand with the above: No matter how powerful they made the Wii U, it still wouldn't get certain 3rd party support.

So the sum of the 2 points above is that it really would be a waste to make a "beefier" machine since Nintendo is already satisfied with the level of tech and quality of output from the Wii U, deeming it "good enough."

Point being, if Nintendo can't afford to take a front load of heavy debt to get a more expensive console into the hands of gamers, then they have to find out how to use the same strategy on a scale that works for them. The above just might work I think. Zelda, Metroid, Smash Bros, Xeno(?), i.e. 3-4 expensive games in a generation, along with proper marketing, flanked by a beautiful but less expensive 3D Mario or two, DK, swoops like Bayonetta 2, more collaborative efforts, and eventual 3rd party support would probably spell long term success.

They don't want to fall into the trap. It's an expensive and costly mistake to have to rely on high production values. They already spend a lot on making Zelda games, and Iwata is right - fans would go apeshit if the next Wii U Zelda game didn't look mind blowing.

Well they'll have big games. For instance, no doubt that NSMB U will sell amazingly over the course of the console's lifetime. But they need big games that graphically and scope-wise make the demographic that likes that sort of thing take the Wii U plunge. Even the most graphics-whorish of anti-Nintendo fanboys would have to think about getting a Wii U if Nintendo put 70 Million or so into a Zelda game that for all of its user friendliness, catered to the 'hardcore' in significant ways.

Nintendo knows they can leverage their talents and resources to make fun games with lower development costs and production values. Margins for games like this (NSMBU) are much higher. Frankly, people who play Metroid games or Xenoblade or even Bayo 2 will accept a lesser level of graphical fidelity for those games than the general Zelda audience will for a Zelda game, IMO. F-Zero GX was expensive and it didn't do all that great, and sales for the three other games above are nowhere near Zelda, typically. And, of course, we're not talking art style here.

I'm not saying these games or other games will look like shit. I'm saying that Nintendo puts thought into assigning games budgets, and there is a spectrum across which the tradeoff between production budget vs. necessity falls. Some games will have large budgets and make the most out of the Wii U hardware. I'm clearly not in a position to say which ones (aside from Zelda), but the rest of the games outside this "Nintendo AAA class" will obviously not focus on production value.
 

krizzx

Junior Member
Does no one else find it odd that the last few pages in this GPU specific thread have been about CPUs and that the CPU thread hasn't seen a post in a week? http://www.neogaf.com/forum/showthread.php?t=513471&page=6

Those 750 Benchmarks would be great for theorizing the actual performance of Espresso.

Also, strangely enough, the last few posts in the CPU thread were about GPUs...

Yay for staying on topic! lol.
 
But when dealing with such low frequency processors, isn't there a way to mitigate it, for example by clocking the eDRAM higher than you would clock the SRAM in the same context?
I mean, the WiiU is 100% backwards compatible, and it uses one of the cores, clocked at the same speed as the Wii's Broadway and with half the L2 cache. In Broadway this L2 cache was SRAM, and now it's eDRAM. Doesn't that mean that the eDRAM plus its memory controller on the WiiU is at least equal in terms of efficiency to the SRAM plus its memory controller on the original Wii... at the same CPU clock speed?

Sorry, I totally missed your reply (my PC died). We already know that the embedded DRAM in Latte matches the 1T-SRAM (which is really dynamic RAM with SRAM-like performance) of Gamecube and Wii in performance, in order to run Wii games, so it must be pretty good. Maybe Espresso has even better eDRAM, then... Good point about clock speeds; DRAM is probably good enough in these circumstances.
 