
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.

wsippel

Banned
Come on? Your head is in the sand. The box has 1GB of usable RAM and a terrible CPU compared to the X1 and PS4. Is the GPU suddenly going to buck this trend? It's horribly underpowered (again, compared to PS4/X1; I'm not talking about PC here). It doesn't take 7000 posts to figure that out.
My head isn't in the sand, you don't understand what this thread is about. Also, all three next gen systems have "terrible" CPUs - the other two (which, again, are not really relevant to the topic at hand in the first place) simply have more shitty cores clocked at a slightly higher frequency. It's still netbook level tech.


I would ask someone like blu or wsippel to check out some of these and give their analysis. Some things I noticed:

- GX2 is mentioned, which further supports the spec document leaked for the Wii U some time ago.
- CPU core 1 is definitely defined as the main core. It looks like they are still working on offloading tasks from it.
- The devs are using eDRAM.
- It seems the Wii U does support some DX11 features, though maybe not all of them.
There's not much to analyze, we already knew most of those things. For example, core 1 is the one with 2MB cache, so it was always pretty obvious that it would be the "main" core. And while multithreaded rendering is a "DX11" feature, it's just a software thing.

What I find a bit strange is that the game has been in development for so long, yet they apparently only started moving render targets to MEM1 a couple of weeks ago.
 
What do other consoles have to do with this topic anyway? That is what is completely irrelevant. The purpose of this topic isn't and has never been about comparing it to those consoles (as much as some people want it to be), it's to try to see how it works and what makes it tick.
It's not completely irrelevant, it helps to have something to compare it to in these types of discussions.

My head isn't in the sand, you don't understand what this thread is about. Also, all three next gen systems have "terrible" CPUs - the other two (which, again, are not really relevant to the topic at hand in the first place) simply have more shitty cores clocked at a slightly higher frequency. It's still netbook level tech.
Yes, I know, even cell phones from 20 years ago are faster than these cpus. Zach Morris' cell runs circles around the X1 and PS4's netbook cpus. The point is, they're much faster and more powerful than the Wii U's. Just like their gpus and amount/speed of ram. Therefore it's fair to say it's "terribly underpowered". There's nothing hidden in this gpu, it's terrible compared to its competition.
 

Raist

Banned
So assuming the 176 GFLOPS figure is accurate, in practical, unbiased terms, where does it really stack up against PS360? A wash, a slight improvement, weaker?

IIRC Xenos is 240 GFLOPS and RSX is 400.
But as others have said, comparisons like this are difficult when architectures are so different.
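For what it's worth, these headline numbers fall straight out of shaders × clock × 2 (one multiply-add, i.e. two ops, per ALU per cycle). A quick sketch, assuming the commonly reported 550 MHz Latte and 500 MHz Xenos clocks:

```python
def gflops(shaders, clock_mhz, ops_per_cycle=2):
    """Peak throughput: each shader ALU retires one multiply-add (2 ops) per cycle."""
    return shaders * clock_mhz * ops_per_cycle / 1000.0

# 160 shaders at 550 MHz gives the 176 GFLOPS figure discussed in this thread;
# 240 shaders at 500 MHz gives the usual Xenos number.
print(gflops(160, 550))
print(gflops(240, 500))
```

Peak numbers only, of course; as noted above, they say nothing about how efficiently either architecture keeps those ALUs fed.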
 

chaosblade

Unconfirmed Member
It's not completely irrelevant, it helps to have something to compare it to in these types of discussions.

For the purpose of what? To complain about how it's an incompetent system and that Nintendo screwed up? That's not the purpose of this topic either. I mean:

Yes, I know, even cell phones from 20 years ago are faster than these cpus. Zach Morris' cell runs circles around the X1 and PS4's netbook cpus. The point is, they're much faster and more powerful than the Wii U's. Just like their gpus and amount/speed of ram. Therefore it's fair to say it's "terribly underpowered". There's nothing hidden in this gpu, it's terrible compared to its competition.

How does your conclusion here add anything to the discussion? I don't think anyone who has the slightest clue what they're talking about doesn't realize that the WiiU is far less powerful than the X1 or PS4. So repeatedly stating that fact doesn't do anyone any good and just serves to turn this into yet another "shit on the WiiU hardware" thread like practically every other not-nostalgia-driven Nintendo thread on GAF.
 
For the purpose of what? To complain about how it's an incompetent system and that Nintendo screwed up? That's not the purpose of this topic either. I mean:



How does your conclusion here add anything to the discussion? I don't think anyone who has the slightest clue what they're talking about doesn't realize that the WiiU is far less powerful than the X1 or PS4. So repeatedly stating that fact doesn't do anyone any good and just serves to turn this into yet another "shit on the WiiU hardware" thread like practically every other not-nostalgia-driven Nintendo thread on GAF.

What else is there to discuss at this point? It's all but confirmed to have 176 GFLOPS, or maybe it has 200, whatever. There's nothing wrong with putting the numbers and other specs we have to use in real-world comparisons.
 

chaosblade

Unconfirmed Member
What else is there to discuss at this point?

Given how slowly information is becoming available, not much. But that's not really a good excuse to go off topic and complain about the "terribly underpowered hardware" relative to its competition when, again, everyone already knows this (and it's repeated in every single WiiU thread constantly), and it's not really relevant to the topic.

I don't really have anything to add to this topic, but as someone who is interested just from a general interest/tech standpoint I'd rather see this bumped once every month or two with some possible insight like those Project CARS change logs than check it every few days and find discourse like what we've got going on now.
 
Given how slowly information is becoming available, not much. But that's not really a good excuse to go off topic and complain about the "terribly underpowered hardware" relative to its competition when, again, everyone already knows this (and it's repeated in every single WiiU thread constantly), and it's not really relevant to the topic.

I don't really have anything to add to this topic, but as someone who is interested just from a general interest/tech standpoint I'd rather see this bumped once every month or two with some possible insight like those Project CARS change logs than check it every few days and find discourse like what we've got going on now.

That sounds reasonable to me, I'll leave.
 
My head isn't in the sand, you don't understand what this thread is about. Also, all three next gen systems have "terrible" CPUs - the other two (which, again, are not really relevant to the topic at hand in the first place) simply have more shitty cores clocked at a slightly higher frequency. It's still netbook level tech.



There's not much to analyze, we already knew most of those things. For example, core 1 is the one with 2MB cache, so it was always pretty obvious that it would be the "main" core. And while multithreaded rendering is a "DX11" feature, it's just a software thing.

What I find a bit strange is that the game has been in development for so long, yet they apparently only started moving render targets to MEM1 a couple of weeks ago.
While it was not going on here, I did see at least a few people from beyond3D who were questioning the validity of those documents a while ago. *shrug*

What is the 32-bit 11.11.10 HDR format? Is that another "DX11" feature/format?
 
While it was not going on here, I did see at least a few people from beyond3D that was questioning the validity of those documents awhile ago. *shrug*

What is the 32-bit 11.11.10 HDR format? Is that another "DX11" feature/format?

When I searched for "32-bit 11.11.10 HDR format", I found this:
DX11 Dynamic Envmap prefers 32bit HDR format (11.11.10).
 

OryoN

Member
If they released the specs and EXPLAINED them, then it would be on the devs; they could be held accountable. Because no matter how well balanced and designed the Wii U is, when we get crappy ports no one holds the devs accountable; they just say the Wii U is underpowered, or crappy tech, or on par with PS360.

Let's face it, we know enough about the console to know that it's a pretty modest offering. That's not to say it's not still very capable, but it suggests that even if they had given detailed specs, you'd still get the same reaction/misunderstanding due to all the crappy ports.

Case in point: you don't need detailed specs - which the general public wouldn't understand - to render accountability. You simply need games that demonstrate what the system is truly capable of. It's always been this way! After this year's E3, I'd say we've seen more than enough to suggest that sloppy ports were in no way a direct result of the console being "underpowered."
 

HTupolev

Member
What is the 32-bit 11.11.10 HDR format? Is that another "DX11" feature/format?
11/11/10 refers to RGB bit allocation, presumably: 11 bits red, 11 bits green, 10 bits blue. That sort of allocation isn't really that weird; 16-bit 5/6/5 used to be a common colour format, where you'd get better colour depth in the greens than in the reds and blues.
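To make the bit-allocation idea concrete, here's a small sketch of the 5/6/5 packing described above. The 11/11/10 HDR format carves up its 32 bits in the same spirit, except each channel holds a small unsigned float rather than an integer:

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit-per-channel RGB into 16-bit 5/6/5 (green gets the extra bit)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(p):
    """Unpack back to 8-bit channels; the low bits are lost to quantization."""
    r = ((p >> 11) & 0x1F) << 3
    g = ((p >> 5) & 0x3F) << 2
    b = (p & 0x1F) << 3
    return r, g, b

# A round trip shows the precision loss: 255 in red comes back as 248.
print(unpack_rgb565(pack_rgb565(255, 128, 64)))
```

Just an illustration of how unequal bit budgets per channel work, not anything specific to Latte.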
 
11/11/10 refers to RGB bit allocation, presumably.. 11 bits red, 11 bits green, 10 bits blue. That sort of allocation isn't really that weird; 16-bit 5/6/5 used to be a common colour format, where you'd get superior colour depth in the greens than in the reds and blues.

oh so kinda like the ps3 and x360 gave extra bits to brown and grey?
 

wsippel

Banned
Yes, I know, even cell phones from 20 years ago are faster than these cpus. Zach Morris' cell runs circles around the X1 and PS4's netbook cpus. The point is, they're much faster and more powerful than the Wii U's. Just like their gpus and amount/speed of ram. Therefore it's fair to say it's "terribly underpowered". There's nothing hidden in this gpu, it's terrible compared to its competition.
Actually, if Latte is a 176 GFLOPS part and still manages to outperform the 240 GFLOPS Xenos, there most likely is something hidden. And that's what this thread is about.
 
When I searched for "32-bit 11.11.10 HDR format", I found this:
DX11 Dynamic Envmap prefers 32bit HDR format (11.11.10).

11/11/10 refers to RGB bit allocation, presumably.. 11 bits red, 11 bits green, 10 bits blue. That sort of allocation isn't really that weird; 16-bit 5/6/5 used to be a common colour format, where you'd get superior colour depth in the greens than in the reds and blues.

R11G11B10 support was exposed in DX10 for PC, but it was already present on 360.
I see. Thank you all for your replies.
 

Frux7

Banned
What you take away from the picture is that it's a very custom design. Sadly, that means calculating its FLOPS could be very difficult. It also seems to have a lot of fixed-function hardware, which makes the GFLOPS number even less meaningful.
The whole thing does explain the port problems, though. It could take a while for devs to push the chip.


And by the time they do the XO and PS4 will have taken off. Nintendo really did screw the pooch. This thing should have been launched at least three years ago alongside Mario Kart.
 
And by the time they do the XO and PS4 will have taken off. Nintendo really did screw the pooch. This thing should have been launched at least three years ago alongside Mario Kart.

I don't think launching late is the problem; it's that there are no must-have games out. Nintendo needed to come out swinging with the Wii U, and they didn't. If anything, going by the release schedule, it should have been pushed back a year in order to have a strong launch.
 

krizzx

Junior Member
It would be nice if we could discuss the GPU for one page without people bringing console-war rubbish in here.

Here we go again with people selectively misusing "hypothesis" as fact for no other reason than to bash Nintendo and promote Sony and Microsoft hardware with the exact same redundant argument.

To this day, the strongest console has never won a console generation. The biggest failure Nintendo ever suffered, as far as console market share goes, came when they made the strongest consoles. They are the only ones who learned from that. All of these people try to hit the business side of Nintendo by correlating it with things like console power and third party support, but Nintendo never went through 5 years of straight losses, and they aren't losing now.

As for the 160 shader thing: I'm still not buying that until I see a more concrete explanation of how the results we've seen can be produced by such substantially hard limits in physical units. I don't even see being on par with the last gen consoles as possible with 160 shaders, regardless of how much more modern the tech is. What aspect of "more modern tech" allows 160 shaders to outperform 240? If that is possible, then surely there must be some other examples or precedent.

My reason for this is that the logic being used to support it views the "more modern tech" from only a single angle. There is also the angle that more shaders could be fit into the components at a smaller size due to modernization of technology. Also, we are skipping a critical fact from the earlier estimate of the hardware size. If my memory serves me correctly, the 8 TMUs in Latte are physically 25% larger than a normal AMD 20 SP component; their size is between that of the 40 SP components and the 20 SP ones. Then there is also the comment about wattage vs performance that has recently arisen. Why is the tech being more modern and efficient not explored from that angle?

I feel like sticking with the earlier suggestion of 8 TMUs with 28 or 32 SPs each. That would place it at 224 or 256. That seems more rational and gives a better explanation for the improvement shown in games. It would also fit the physical size of the components. It would make no sense for them to make a larger component that performs worse than a smaller, cheaper, less energy-consuming, less heat-producing component.

I'm not saying anyone is wrong or right here. I'm just curious as to why possibilities aren't being explored more thoroughly and openly in all plausible directions. There is such a tremendously heavy slant towards having the most negative hypothesis accepted by everyone as fact when there are still so many questions.
 

Log4Girlz

Member
I wonder how high the GPU could be clocked, theoretically. I mean, is the GPU just super underclocked? Could it run at 800 MHz if they had chosen to do so? That would be a real boost in power.
 

prag16

Banned
As for the 160 shader thing: I'm still not buying that until I see a more concrete explanation of how the results we've seen can be produced by such substantially hard limits in physical units. I don't even see being on par with the last gen consoles as possible with 160 shaders, regardless of how much more modern the tech is. What aspect of "more modern tech" allows 160 shaders to outperform 240? If that is possible, then surely there must be some other examples or precedent.
I'm mostly a bystander, and I know it's not an exact comparison, but my 48 (48!) shader GT220 in my HTPC runs Mass Effect 3 maybe not on par with PS360 but damn close at 1366x768.

I'd be willing to bet it's entirely plausible for 160 shader latte to pull ahead of 240 shader Xenos.
 
I wonder how high the GPU could be clocked theoretically. I mean, is the GPU just super under-clocked? Could it run at 800 mhz if they had chosen to do so? That would be a real boost in power.

I don't know if you're being serious :p but a clock rise of 250 MHz is impossible, especially with the power/heat envelope constraints the Wii U has.

If it can really be pushed to 40 watts, then they could maybe boost the GPU by 50 MHz to 600 MHz, but even then I doubt Nintendo would dice with death regarding systems overheating; they simply have no reason to, as the Wii U is more than powerful enough for their first party games.
 

Log4Girlz

Member
I don't know if you're being serious :p but a clock rise of 250MHz is impossible, esp with the power / heat envelope constraints WiiU has.

If it can really be pushed to 40 watts then they could maybe boost the GPU by 50MHz to 600MHz but even then I doubt Nintendo would dice with death regarding systems overheating, they simply have no reason to as WiiU is more than powerful enough for their first party games.

I said theoretically. Imagine taking the identical chip and putting a PC-class fan/heatsink combo on it; would it comfortably run at 800 MHz? What if the Wii U were a box the size of the Xbox One with that giant jet-engine fan in it: could it then be pushed to 800 MHz? 900 MHz? 1 GHz? I wonder what that chip can do in terms of clock speed.
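For a rough sense of why the clock ceiling is really a power question: dynamic CMOS power scales roughly with voltage squared times frequency, and higher clocks usually demand more voltage. A sketch with made-up numbers (the ~15 W GPU share and the voltage bump are pure assumptions for illustration, not measured figures):

```python
def scaled_power(p0_watts, f0_mhz, f1_mhz, v0=1.0, v1=1.0):
    """Classic rule of thumb: dynamic power ~ C * V^2 * f."""
    return p0_watts * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# 550 MHz -> 800 MHz at the same voltage, then with a ~15% voltage bump,
# starting from an assumed ~15 W GPU budget.
print(round(scaled_power(15.0, 550, 800), 1))
print(round(scaled_power(15.0, 550, 800, v1=1.15), 1))
```

Even under these hand-wavy assumptions, the power draw roughly doubles, which is why a 250 MHz overclock in a small, quiet box isn't plausible while a bigger case and heatsink could in principle handle it.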
 

z0m3le

Banned
It's 217 shaders for Xenos, with ~66% efficiency, IIRC. So 160 shaders clocked 10% higher, much more modern, with say 90% efficiency, would run code (incoming napkin math + estimation) ~30% faster, maybe more.
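That napkin math is easy to sanity-check (all of these figures, efficiency percentages included, are the post's assumptions, not measurements). As it happens, the raw numbers land closer to +10% against a 217-shader count, and dead even against 240, so a ~30% gap would have to come from other architectural gains:

```python
# "Effective" shader work: unit count x relative clock x assumed utilization.
latte = 160 * 1.10 * 0.90           # 10% higher clock, assumed 90% efficiency
for xenos_shaders in (217, 240):    # 217 per the post above; 240 is the usual figure
    xenos = xenos_shaders * 1.00 * 0.66
    print(xenos_shaders, round(latte / xenos, 2))
```

Napkin math in, napkin math out; the point is only that the claimed assumptions can be plugged in and checked.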
 
It's 217 shaders for Xenos, with ~66% efficiency iirc. So 160 shaders clocked 10% higher, much more modern with say 90% efficiency would perform code (incoming napkin math + estimation) ~30% faster, maybe more

AFAIK it's 240 shaders (48*5-vector units)

The Xenos is a very well documented ATI part.

I'm actually in the camp that 160 shaders is on the lower end for the Wii U, and that the actual count is much higher than that.

If it was truly an RV73x derivative it should have 320 shaders, just like the Mobility Radeon 4670 card I have in my laptop. That would make a heck of a lot more sense of how it can surpass the 360/PS3's graphical prowess and have DX10/11+ abilities.
 
I have been following the Wii U dev of PCars, it's getting there slowly, can't wait for the first proper screenshots.



 
GREAT POST! I totally agree with what was said. People tend to forget Nintendo is a GAMING company only; they won't put themselves in a situation where they're losing money hand over fist. I'm not an expert on tech, but I'm learning every day, and I do know a couple of developers and people who are more tech savvy. I was worried about the Wii U hardware design after seeing the launch ports myself. Those worries have been eased by learning how the system is designed and that it isn't being used the right way. 2014 is going to be a great year for the Wii U from a technical standpoint. I didn't buy a Wii U because I want a system that is basically a clone of a Sony or Microsoft system. From a graphical and tech standpoint it has enough to get the job done "IF" the developer is committed to getting the most out of the console.

No one ever said it wouldn't produce great-looking Nintendo games. People are quick enough to keep saying 'the WiiU is a next gen console!', but when others start comparing it to the PS4 and XBO, suddenly 'it's not fair', etc, etc.

IMO the WiiU is a PS360-level machine with slightly more RAM and a more modern GPU capable of lighting, shadow, fire, explosion and depth-of-field effects not possible on the DX9-level PS360 GPUs.

If PS360 are a zero and PS4/XBO are a ten in power terms, then the WiiU is a two, or maybe a three at the most. It's much, much closer to last gen than next gen.
 

krizzx

Junior Member
I'm mostly a bystander, and I know it's not an exact comparison, but my 48 (48!) shader GT220 in my HTPC runs Mass Effect 3 maybe not on par with PS360 but damn close at 1366x768.

I'd be willing to bet it's entirely plausible for 160 shader latte to pull ahead of 240 shader Xenos.

48 shaders at what clock, and with what other features supporting it? There is still the other main point I brought up: the shader units in Latte are physically larger than AMD's normal 20 SP components.

Also, another point I forgot to mention on the power consumption vs hardware performance thing: the Wii U's target power draw is "45 watts", not 36 watts. Iwata actually stated this. The fact that 36 watts is the highest recorded in active use simply means that the hardware isn't being pushed or utilized to its full extent.

It makes sense why Nintendo doesn't release more info on the hardware. There are people who keep complaining about Nintendo not releasing information, yet they disregard the information that has been released by Nintendo as PR talk, or just outright dismiss it when it doesn't support their negative outlook. What point would there be in releasing more? I doubt anyone who looks negatively on the hardware or the company would shift their opinion even in the least. Most of it is the result of bias that existed before they knew anything about the hardware at all.

I have been following the Wii U dev of PCars, it's getting there slowly, can't wait for the first proper screenshots.



fe_lightingandshadowfixed.JPG

This is interesting, though it's still a far cry from this. When they get the visuals up to this point on the Wii U, I shall consider buying this game.

large.jpg


Would be nice to see how it stands against the console versions for the 360 and PS3, though. Are there any shots available for those?
 

wsippel

Banned
http://dw.diamond.ne.jp/articles/-/6435?page=2

Now this is an interesting development. It seems the Japanese Renesas factory that produces the Wii U eDRAM is closing, and Nintendo is shifting production elsewhere.

I wonder how this will affect GPU production.
Renesas is subcontracting TSMC. It's not really all that interesting, as Renesas had planned to do this for years, which is why they stopped working on 32nm tech a long time ago.
 

japtor

Member
This is interesting, though its still a far cry from this. When they get the visuals up to this point on the Wii U, I shall consider buying this game.

large.jpg


Would be nice to see how it stands against the console version for the 360 and PS3 though. Are there any shots available for those?
Considering that shot is from the PC version it won't get up to that level, but I figure it'll be reasonably good looking.

And I have no clue if the 360/PS3 versions are still in the works or not (a little while back they were debating dropping them), but for graphical comparison there's the NFS Shift games. I'm more worried about the physics side of PCARS since they're pushing pretty complex stuff on the PC side so far. Graphically it should scale down fine, but for physics I figure they'll have to alter stuff in some way which will affect the gameplay, the trick would be doing that without being too noticeable to the player.
 
Also, another point I forgot to mention on the power consumption vs hardware performance thing is that the Wii U's target watt rate is "45 watts" not 36 watts. Iwata actually stated this. The fact that 36 watts is the highest recorded in active use simply means that the hardware isn't being pushed or utilized to its full extent.

This is interesting, though its still a far cry from this. When they get the visuals up to this point on the Wii U, I shall consider buying this game.

large.jpg

Wasn't the 45 watts already explained as the console using its USB ports to power external HDDs, etc.? It would be interesting if someone took a reading while the console is running a game, and then again running the same game with an external HDD attached, to see if it changes.

With regards to the screenshot above, good luck achieving that kind of visual level on the WiiU, or even PS4/XBO, at an acceptable framerate; that is probably running on a GTX 680-equipped gaming PC, lol!
 

ikioi

Banned
To this day, the strongest console has never won a console generation. The biggest failure Nintendo ever suffered, as far as console market share goes, came when they made the strongest consoles.

Only someone completely ignorant and naive of reality would attribute the commercial failures of consoles like the N64 and Gamecube to their technical power.

Technical power had next to nothing to do with their failures. They failed for other reasons: marketing, choice of media for games, royalty and licensing costs, etc. Technical power was NOT an issue.

They are the only ones who learned from that.

Nintendo didn't learn anything. If they had, they wouldn't have a console that right now is getting smacked in sales by both the N64 and Gamecube.

You seem to be confusing Nintendo's ignorance and arrogance with 'learning'.
 

Log4Girlz

Member
Only someone completely ignorant and naive of reality would attribute the commercial failures of consoles like the N64 and Gamecube to their technical power.

Technical power had next to nothing to do with their failures. They failed for other reasons: marketing, choice of media for games, royalty and licensing costs, etc. Technical power was NOT an issue.

The sad part is, they were powerful enough to be exciting machines with the potential to do great, but Nintendo dropped the ball. Or rather, stabbed the ball a few hundred times and incinerated it: the N64 with cartridges alienating 3rd parties, and the Gamecube being the most adorable purple purse for young ladies, with mini-discs ensuring, again, that 3rd parties were well and truly screwed. If the N64 had had a CD drive, man, they would have dominated. Sega was on the outs and run by idiots, and the PSX was still relatively weak in terms of sales and mindshare when the N64 launched.
 

NateDrake

Member
The sad part is, they were powerful enough to be exciting machines with the potential to do great, but Nintendo dropped the ball. Or rather, stabbed the ball a few hundred times and incinerated it: the N64 with cartridges alienating 3rd parties, and the Gamecube being the most adorable purple purse for young ladies, with mini-discs ensuring, again, that 3rd parties were well and truly screwed. If the N64 had had a CD drive, man, they would have dominated. Sega was on the outs and run by idiots, and the PSX was still relatively weak in terms of sales and mindshare when the N64 launched.

The N64 dominated for a while. The PS1 took over once the big games came out, and it continued to be relevant late into the generation/early PS2 era.
 

Log4Girlz

Member
The N64 dominated for a while. The PS1 took over once the big games came out, and it continued to be relevant late into the generation/early PS2 era.

Yeah, man, the N64 could have sold like fucking bonkers. Honestly, I really do see the Wii as a happy accident of design. Nintendo doesn't really know what it's doing.
 
48 shaders at what clock, and with what other features supporting it? There is still the other main point I brought up: the shader units in Latte are physically larger than AMD's normal 20 SP components.

Also, another point I forgot to mention on the power consumption vs hardware performance thing: the Wii U's target power draw is "45 watts", not 36 watts. Iwata actually stated this. The fact that 36 watts is the highest recorded in active use simply means that the hardware isn't being pushed or utilized to its full extent.

It makes sense why Nintendo doesn't release more info on the hardware. There are people who keep complaining about Nintendo not releasing information, yet they disregard the information that has been released by Nintendo as PR talk, or just outright dismiss it when it doesn't support their negative outlook. What point would there be in releasing more? I doubt anyone who looks negatively on the hardware or the company would shift their opinion even in the least. Most of it is the result of bias that existed before they knew anything about the hardware at all.



This is interesting, though its still a far cry from this. When they get the visuals up to this point on the Wii U, I shall consider buying this game.

large.jpg


Would be nice to see how it stands against the console version for the 360 and PS3 though. Are there any shots available for those?

Hmm, considering the different view and the off-camera image for the Wii U version, it's not too clear to judge.

Seeing those screens, though, reminded me of a question that we still don't really have an answer for yet: what customizations were made to the GPU to handle the extra resources for the controller's screen if devs want to display a full 3D environment that is not just mirrored from the main screen? I believe wsippel noticed that FIFA 13 used a non-mirrored camera on the controller screen (though, IIRC, it was used wastefully), and several games use it for multiplayer modes. I believe there were some talks and theories about particular parts of the GPU being doubled, but I'm unsure if we came to any conclusions about that.
 

Goodlife

Member
Not really relevant to the tech discussion, I'm afraid, but I downloaded Pikmin 3 last night.

It looks amazing. If stuff like that can be produced with 160 shaders, then a) there is definitely some "secret sauce" somewhere in the WiiU, and b) I can see why Nintendo don't bother releasing tech specs and just say to judge by the games.
 

Heyt

Banned
On the Renesas thing: I read on some blog that Nintendo could not simply shift production elsewhere. Supposedly Renesas holds the patent for the eDRAM tech the Wii U is using. If that's true, Nintendo would be trapped in a corner, wouldn't they?
 
On the Renesas thing: I read on some blog that Nintendo could not simply shift production elsewhere. Supposedly Renesas holds the patent for the eDRAM tech the Wii U is using. If that's true, Nintendo would be trapped in a corner, wouldn't they?

From what I've read, Renesas isn't actually dropping Nintendo as a customer, but the particular factory that is making the eDRAM is closing down, so production is being moved to another factory.
 

krizzx

Junior Member
Considering that shot is from the PC version it won't get up to that level, but I figure it'll be reasonably good looking.

And I have no clue if the 360/PS3 versions are still in the works or not (a little while back they were debating dropping them), but for graphical comparison there's the NFS Shift games. I'm more worried about the physics side of PCARS since they're pushing pretty complex stuff on the PC side so far. Graphically it should scale down fine, but for physics I figure they'll have to alter stuff in some way which will affect the gameplay, the trick would be doing that without being too noticeable to the player.

I would agree but do you remember NFS Most Wanted U?

"Wii U's extra memory allows for PC-quality textures and assets"

I wouldn't rule it out.
 

prag16

Banned
I would agree but do you remember NFS Most Wanted U?

"Wii U's extra memory allows for PC-quality textures and assets"

I wouldn't rule it out.

http://www.videogamer.com/xboxone/c...s_being_made_at_amazing_cinema_quality_2.html

This article reminded me of that. The article itself is PR BS to the core, but it had me thinking that maybe some of these teams that are on their second go-around with the Wii U (WB Montreal, Treyarch, etc.) can push the envelope a bit more.

We'll probably mostly be getting PS360 level geometry in multiplats, but maybe we can get PS4/xbone level textures if we're lucky, at least in some cases.
 

JordanN

Banned
Technical power had next to nothing to do with their failures. They failed for other reasons. Marketing, choice of media for games, royalty and licensing costs, etc. Technical power was NOT an issue.
Ironically, technical power was probably what saved them.
If Nintendo had announced to the world that the N64 could only do 2D, they would likely have been dead. The same goes for the Gamecube, had it not been powerful enough for PS2/Xbox multiplats (Dreamcast, much?).

Unless Nintendo can pull a Wii miracle, being underpowered (in the home console space) will always spell disaster.
 
http://www.videogamer.com/xboxone/c...s_being_made_at_amazing_cinema_quality_2.html

This article reminded me of that. The article itself is PR BS to the core, but it had me thinking that maybe some of these teams that are on their second go-around with the Wii U (WB Montreal, Treyarch, etc.) can push the envelope a bit more.

We'll probably mostly be getting PS360 level geometry in multiplats, but maybe we can get PS4/xbone level textures if we're lucky, at least in some cases.

PAHAHAHAHAHAH!!!!!! CINEMA QUALITY ASSETS!! HAHAHAHA!!! ON 5GB OF USEABLE RAM HAHAHAHAHAHAHAHAHA!!!!!!
 

Log4Girlz

Member
Ironically, technical power was probably what saved them.
If Nintendo announced to the world N64 could only do 2D, they would likely have been dead. Same goes for the Gamecube if it was never powerful enough for PS2/Xbox multiplats (Dreamcast much?).

Unless Nintendo can pull a Wii miracle, being underpowered (in the home console space) will always spell disaster.

Agreed. There is a difference between being the weakest console available and being overtly underpowered. It's only worked for the Wii.
 

Jrs3000

Member
Ironically, technical power was probably what saved them.
If Nintendo announced to the world N64 could only do 2D, they would likely have been dead. Same goes for the Gamecube if it was never powerful enough for PS2/Xbox multiplats (Dreamcast much?).

Unless Nintendo can pull a Wii miracle, being underpowered (in the home console space) will always spell disaster.

We are getting off topic here.
 
Ironically, technical power was probably what saved them.
If Nintendo announced to the world N64 could only do 2D, they would likely have been dead. Same goes for the Gamecube if it was never powerful enough for PS2/Xbox multiplats (Dreamcast much?).

Unless Nintendo can pull a Wii miracle, being underpowered (in the home console space) will always spell disaster.

Err, the GC killed the PS2 (The biggest piece of crap in modern gaming)
 
One thing Nintendo does get right is quality. I run a repair shop and have 10-20 broken PS3s and Xbox 360s... I have zero Nintendo consoles. They come through here, but not often enough. All of their consoles have been enjoyable, and they didn't break down every year and make you have to run out and buy a new one...
 