
Rumor: Wii U final specs

If the subpar performance of all these 360 ports so far is the result of "lack of time" and "not getting to know the wii u hardware", should we also expect 360 to ps4 ports to have the same issues? I mean, no dev fully understands the ps4 yet, right?

No, because while Wii U is no doubt in the ballpark of current gen, Durango/Orbis should be several times more powerful. You will probably see some choppy framerates here and there only because developers will be adding more and more complexity to scenes with that extra power. But every system has a learning curve, and results will surely improve over time.
 

LCGeek

formerly sane
If the subpar performance of all these 360 ports so far is the result of "lack of time" and "not getting to know the wii u hardware", should we also expect 360 to ps4 ports to have the same issues? I mean, no dev fully understands the ps4 yet, right?

No, because the raw power of the PS4 will be several times that of the HD twins, which will help out quite a bit. Also, whatever power goes unused early on isn't all that much to begin with.
 
If the subpar performance of all these 360 ports so far is the result of "lack of time" and "not getting to know the wii u hardware", should we also expect 360 to ps4 ports to have the same issues? I mean, no dev fully understands the ps4 yet, right?
The PS4/Durango should have enough raw strength to handle ports with enhancements, though those ports likely won't show either system's true potential.
 

Cosmozone

Member
I really haven't been following this. I heard that the Wii U is underpowered. So how does its potential compare to Xbox 360 and PS3?
Basically, the CPU is somewhat slower, the GPU somewhat better (though basically nothing is known about it), and the main memory is much larger (2x) but also slower. There's more to the memory layout we don't know about (that eDRAM talk), so it's hard to say. In the end it means ports are not as easy as they should be and take a performance hit because of that, whereas optimized engines can probably beat current-gen engines (if they ever arrive).
 

Prez

Member
Basically, the CPU is somewhat slower, the GPU somewhat better (though basically nothing is known about it), and the main memory is much larger (2x) but also slower. There's more to the memory layout we don't know about (that eDRAM talk), so it's hard to say. In the end it means ports are not as easy as they should be and take a performance hit because of that, whereas optimized engines can probably beat current-gen engines (if they ever arrive).

So graphics wise the difference between PS3/360 and Wii U will be similar to the difference between PS2/Xbox and Wii? That's pretty much what I expected.
 

Kenka

Member
So graphics wise the difference between PS3/360 and Wii U will be similar to the difference between PS2/Xbox and Wii? That's pretty much what I expected.
To set a performance expectation, please also say what price you're expecting.

Nintendo have done very well with their power envelope. So if you want an X720 with similar power consumption, it will be in the same ballpark performance-wise. The truth is that for the same price ($350) you can easily get more raw performance, but the trade-off is that the console will be less power-efficient in use.
 

Cosmozone

Member
There are people claiming the large main memory is a waste because of its relatively low throughput, but I'm not so sure about that. Weren't there even games in the past that streamed directly from the (even slower) disc? I simply don't know enough about games development to judge all this. I'm still pretty positive about the whole memory setup and think it's a big advantage. Still, I don't expect very visible improvements; some of the current-gen HD games already look great enough, technically.
 
So graphics wise the difference between PS3/360 and Wii U will be similar to the difference between PS2/Xbox and Wii? That's pretty much what I expected.

Wii U has a more modern GPU than the PS3/360, so it will be better when it comes to graphical effects.
 

z0m3le

Banned
Yes, offloading stuff to the GPU is going to be a major trend going forward, but using your own example, a Jaguar core is a lot more powerful than whatever is inside the WiiU.

No, actually I doubt it will be.
http://www.xbitlabs.com/news/cpu/di...ext_Generation_Jaguar_Micro_Architecture.html

^ That is Jaguar: a 3.1mm² (per core) die size @ 28nm, with a design limit of 4 cores sharing 2MB of L2 cache and a maximum designed clock speed of 2GHz. A single memory channel is also a hard limit of Jaguar, btw, so memory will likely fall to 128-bit DDR3 (if they use DDR3 at all, as DDR4 might not be ready in time for mass production in August/September 2013).

Wii U has 3 cores with a 32mm² (all 3 cores) die size @ 45nm, with an asymmetrical L2 cache of 3MB and an unknown clock speed (let's estimate 1.46GHz as the low end).

Considering that the Wii U has a multicore ARM CPU for the OS and a DSP for sound, and that AMD has no SMT technology, you are looking at 4 cores with 1 thread each, one of which has to handle sound, versus the Wii U's 3 cores that don't touch the OS, I/O, or sound. They are fairly similar in wattage, clock speed, and likely performance. (AMD has never really been the best here.)

I'm curious as to what customizations they would make to this core. It's obviously very limited and designed for low power consumption, and changing it completely into basically what an A10 is both doesn't make sense and might be impossible.
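
Rough napkin math on the per-core area, for anyone who wants it spelled out. The inputs are just the rumored figures above, and the (node ratio)² scaling is an idealization, so treat this as a sketch rather than anything definitive:

```python
# Per-core die area, naively scaled to a common process node.
# Inputs are the rumored figures from this thread, not confirmed specs.

def scaled_area(area_mm2: float, from_nm: int, to_nm: int) -> float:
    """Idealized area if the same logic were shrunk/grown between nodes."""
    return area_mm2 * (to_nm / from_nm) ** 2

jaguar_core_28nm = 3.1        # mm^2 per core @ 28nm (xbitlabs figure)
wiiu_core_45nm = 32.0 / 3     # ~10.7 mm^2 per core @ 45nm (32 mm^2 for all 3 cores)

wiiu_core_at_28nm = scaled_area(wiiu_core_45nm, from_nm=45, to_nm=28)

print(f"Jaguar core @ 28nm:          {jaguar_core_28nm:.1f} mm^2")
print(f"Wii U core scaled to 28nm:  ~{wiiu_core_at_28nm:.1f} mm^2")
# -> ~4.1 mm^2 vs 3.1 mm^2, i.e. the same general size class, which is the point above.
```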
 
No, actually I doubt it will be.
http://www.xbitlabs.com/news/cpu/di...ext_Generation_Jaguar_Micro_Architecture.html

^ That is Jaguar: a 3.1mm² (per core) die size @ 28nm, with a design limit of 4 cores sharing 2MB of L2 cache and a maximum designed clock speed of 2GHz. A single memory channel is also a hard limit of Jaguar, btw, so memory will likely fall to 128-bit DDR3 (if they use DDR3 at all, as DDR4 might not be ready in time for mass production in August/September 2013).

Wii U has 3 cores with a 32mm² (all 3 cores) die size @ 45nm, with an asymmetrical L2 cache of 3MB and an unknown clock speed (let's estimate 1.46GHz as the low end).

Considering that the Wii U has a multicore ARM CPU for the OS and a DSP for sound, and that AMD has no SMT technology, you are looking at 4 cores with 1 thread each, one of which has to handle sound, versus the Wii U's 3 cores that don't touch the OS, I/O, or sound. They are fairly similar in wattage, clock speed, and likely performance. (AMD has never really been the best here.)

I'm curious as to what customizations they would make to this core. It's obviously very limited and designed for low power consumption, and changing it completely into basically what an A10 is both doesn't make sense and might be impossible.

If we go off what that actual PlayStation engineer said a while back, the PS4 will also include lots of dedicated silicon. I'm thinking it will probably also use a separate I/O controller and audio DSP.
 
No, actually I doubt it will be.
http://www.xbitlabs.com/news/cpu/di...ext_Generation_Jaguar_Micro_Architecture.html

^ That is Jaguar: a 3.1mm² (per core) die size @ 28nm, with a design limit of 4 cores sharing 2MB of L2 cache and a maximum designed clock speed of 2GHz. A single memory channel is also a hard limit of Jaguar, btw, so memory will likely fall to 128-bit DDR3 (if they use DDR3 at all, as DDR4 might not be ready in time for mass production in August/September 2013).

Wii U has 3 cores with a 32mm² (all 3 cores) die size @ 45nm, with an asymmetrical L2 cache of 3MB and an unknown clock speed (let's estimate 1.46GHz as the low end).

Considering that the Wii U has a multicore ARM CPU for the OS and a DSP for sound, and that AMD has no SMT technology, you are looking at 4 cores with 1 thread each, one of which has to handle sound, versus the Wii U's 3 cores that don't touch the OS, I/O, or sound. They are fairly similar in wattage, clock speed, and likely performance. (AMD has never really been the best here.)

I'm curious as to what customizations they would make to this core. It's obviously very limited and designed for low power consumption, and changing it completely into basically what an A10 is both doesn't make sense and might be impossible.

First of all, you are making a number of assumptions about the other consoles without any real info. Setting aside extra silicon, which you don't know whether the other consoles have or not, and comparing Jaguar directly with the other low-power CPUs available on the market, it's a very feature-packed core, with modern OoO and floating-point features, designed to power laptops coming out in 2013. It is in no way the "limited" CPU you describe.

I also would like to see your source for the memory limitations you mentioned, as they are entirely false so far as I'm aware.
 
I think I should post a best case scenario as well. I think my clocks are correct so this part stays:

DSP: 121.5MHz (system base clock)
CPU: 1458MHz (12 x 121.5)
GPU: 486MHz (4 x 121.5)
MEM1: 486MHz (4 x 121.5)
MEM2: 729MHz (6 x 121.5)

Other things could be different though:

CPU: ~35GFLOPS (1458MHz x 3 cores x 8 floating point instructions per cycle)
GPU: ~620GFLOPS (assuming 640 shader units)
MEM1: ~463.5GB/s (486MHz x 8192bit)
MEM2: ~10.9GB/s (729MHz x 2 x 64bit)

Pretty sure it's much closer to the worst case scenario though...
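
For anyone who wants to check that arithmetic, here it is spelled out. Every input (base clock, multipliers, core/shader counts, bus widths) is just the estimate above; nothing here is a confirmed spec:

```python
# Back-of-envelope peak figures from the assumed clocks above.

GiB = 1024 ** 3

base = 121.5e6                              # assumed system base clock, Hz
cpu_hz, gpu_hz = 12 * base, 4 * base        # 1458 MHz, 486 MHz
mem1_hz, mem2_hz = 4 * base, 6 * base       # 486 MHz, 729 MHz

cpu_gflops = cpu_hz * 3 * 8 / 1e9           # 3 cores x 8 FP ops/cycle (assumed)
gpu_gflops = gpu_hz * 640 * 2 / 1e9         # 640 shaders x 2 FLOPs (MADD) per cycle (assumed)
mem1_gibs = mem1_hz * (8192 / 8) / GiB      # assumed 8192-bit eDRAM bus
mem2_gibs = mem2_hz * 2 * (64 / 8) / GiB    # 2 transfers/clock on a 64-bit bus

print(f"CPU : ~{cpu_gflops:.0f} GFLOPS")    # ~35
print(f"GPU : ~{gpu_gflops:.0f} GFLOPS")    # ~622
print(f"MEM1: ~{mem1_gibs:.1f} GiB/s")      # ~463.5
print(f"MEM2: ~{mem2_gibs:.1f} GiB/s")      # ~10.9
```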

I can see your clocks as quite possible, though I'd rather they weren't, if only because they undercut my own (133 MHz DSP base and then the same multipliers). It would be crazy enough for Nintendo to stick with the same DSP base and keep the multipliers the same as the Wii's.

If they plan for such a slow GPU clock, I can see them sticking 400 SPUs on there, tops, which is actually higher than my lowball estimate of 320.
 

z0m3le

Banned
First of all, you are making a number of assumptions about the other consoles without any real info. Setting aside extra silicon, which you don't know whether the other consoles have or not, and comparing Jaguar directly with the other low-power CPUs available on the market, it's a very feature-packed core, with modern OoO and floating-point features, designed to power laptops coming out in 2013. It is in no way the "limited" CPU you describe.

I also would like to see your source for the memory limitations you mentioned, as they are entirely false so far as I'm aware.

I have to make assumptions, since, well... these consoles are not real yet, and not everything is even finalized. I thank Fourth Storm for pointing out the I/O and DSP silicon rumors; however, that does not put Jaguar very far from the Wii U's CPU. My low clock estimate for the Wii U could also be 2.19GHz (3x the Wii's CPU clock speed).

http://semiaccurate.com/2012/08/28/amd-let-the-new-cat-out-of-the-bag-with-the-jaguar-core/

Third paragraph: "The chip only has a one memory channel."
 
A GPU that's crippled by a weak CPU and low bandwidth.
If you use the CPU the same way as you use the 360's, there will be issues. The Wii U seems to be balanced differently than the current-gen consoles: it is built around offloading certain tasks to other processors like the DSP and the GPGPU, while the CPU efficiently deals with general code. The memory system is built around using the eDRAM, so the lower main-memory bandwidth may not be an issue, depending on the buses to the eDRAM.

In either case, the GPU is DX10.1-equivalent with some other features beyond that, so it has an advantage over current-gen in that regard. The Wii basically had the same GPU as the GameCube, which was less modern than the original Xbox's, so it didn't benefit from a superior feature set over the previous generation.
 
It all falls back on Nintendo, really. If they wanted to avoid this sort of thing, they should have had something to show from their more graphically intensive games (like a Zelda or a 3D Mario, even if they are a ways off). 2D Mario doesn't show off the capability of the platform, and Pikmin looks like a game that wasn't built from the ground up for the system.

Yeah. All people who followed the process wanted was at least some kind of tease about the future and Nintendo refused.

Ah, BG! Hope all has been well. I have to applaud you for stepping away from GAF; it's something I could stand to do myself. You made many positive contributions to the old threads and taught me a thing or two as well (steered me straight when my math was faulty). Unfortunately, somewhere along the way, much like with Ideaman, some posters were seemingly unable to separate your own speculation from what you had heard. I think a lot of the less tech-savvy people put too much weight behind some of your analysis. I've felt compelled myself to correct a few here and there who have misquoted you in order to support their inflated idea of the Wii U's capabilities. And no, that blame should not rest on you. When I read your posts, it was quite obvious which parts were your opinion and which parts were you forwarding info from other sources. But then again, I probably spent too much time digging through all the available info. :p

Admittedly, however, I'm still trying to reconcile your (or was it wsippel's originally?) reports of what the early dev kits contained with what the final hardware has apparently shaped up to be. I covered some of what I consider the likely explanations in my previous post (system specs were in flux, extra elbow room just in case), but what say you? Do you still believe the source which claimed 640 ALUs in the early dev kits was legit? If so, what do you think happened?

Things have been going well. My schedule is coming together to the point where I'm actually able to try gaming again. I've started playing Arkham Asylum and Rayman Origins through Steam and being rusty doesn't even begin to describe my gameplay, haha. I'm also planning to start Darksiders 1.

But yeah I tried to be as clear as possible as to what was coming from me and what came from others.

As for the last part, I still have no reason to doubt it. But IMO, without knowing at least the final GPU's clock and process size, there's really no way of knowing what it ended up as, because with the GPU I never heard about much deviation performance-wise from beginning to end. Has anyone removed the heat plate yet?
 
I really wish Nintendo would have had a Zelda game or Mario Galaxy type game come out instead of NSMB U. Those would have been much better games to show off the graphics and gamepad use from their side.

Well, at least this way they have more time to explore both the hardware and gamepad gameplay concepts before they do come out with a 3D Mario/Zelda.
 

z0m3le

Banned
If you use the CPU the same way as you use the 360's, there will be issues. The Wii U seems to be balanced differently than the current-gen consoles: it is built around offloading certain tasks to other processors like the DSP and the GPGPU, while the CPU efficiently deals with general code. The memory system is built around using the eDRAM, so the lower main-memory bandwidth may not be an issue, depending on the buses to the eDRAM.

In either case, the GPU is DX10.1-equivalent with some other features beyond that, so it has an advantage over current-gen in that regard. The Wii basically had the same GPU as the GameCube, which was less modern than the original Xbox's, so it didn't benefit from a superior feature set over the previous generation.

Here is a question: do you know of a feature DX11 supports that DX10.1 does not, that isn't directly related to Microsoft's software? From what I've seen, DX10.1 effects and DX11 effects are pretty much on par, which makes sense given that DX11 was a small upgrade over 10.1.

That is just a rumour, as is stated in the next sentence of that same article. But even if it was true, that does not tell you much in terms of what type/speed of memory is used for a given implementation, so like I said previously, we can only compare the CPU in isolation from the rest of the system.

Well, as many people here will tell you, there are really only three options (barring exotic memory types): GDDR5 (limited to 2GB of RAM across 8 chips), DDR3 (a 128-bit bus could yield over 30GB/s on a single channel, I believe), or DDR4, which might not be ready for mass production in 2013.

Either way, what we were talking about is "underpowered" CPUs; whether that applies to the PS4 or the Wii U is up to you. Both look comparable if you just go by die size and power consumption.
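
A quick sanity check on that "over 30GB/s from a 128-bit bus" figure, assuming standard DDR3 speed grades (which grade, if any, they would actually use is pure guesswork):

```python
# Theoretical DDR3 bandwidth for a 128-bit bus at common speed grades.

def ddr_bandwidth_gbs(bus_bits: int, transfers_per_sec: float) -> float:
    return (bus_bits / 8) * transfers_per_sec / 1e9

for grade in (1600, 1866, 2133):
    gbs = ddr_bandwidth_gbs(128, grade * 1e6)
    print(f"128-bit DDR3-{grade}: {gbs:.1f} GB/s")
# -> 25.6, 29.9 and 34.1 GB/s respectively, so "over 30 GB/s" needs the faster grades.
```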
 
A lot of talk about Jaguar around here...but remember:

[Image: jagad02a_med.jpg]
 
Was looking at this link: http://www.sumzi.com/en/articles/11/3906.html

I'm assuming Wii U uses UX8GD (I have a feeling it's available to Nintendo even if it's not on Renesas' page atm). That RAM runs at up to 800 MHz. If wsippel is right on the base clock, I wonder if Nintendo wouldn't want it at 729 MHz, or 1.5x the GPU clock (the same as "MEM2"), which, even with a conservative 64-bit bus from GPU to CPU, would make it a true low-latency alternative to the main system RAM.
 

wsippel

Banned
Hi bgassassin! :)


Was looking at this link: http://www.sumzi.com/en/articles/11/3906.html

I'm assuming Wii U uses UX8GD (I have a feeling it's available to Nintendo even if it's not on Renesas' page atm). That RAM runs at up to 800 MHz. If wsippel is right on the base clock, I wonder if Nintendo wouldn't want it at 729 MHz, or 1.5x the GPU clock (the same as "MEM2"), which, even with a conservative 64-bit bus from GPU to CPU, would make it a true low-latency alternative to the main system RAM.
Yeah, I posted that a while ago. It's most likely UX8GD; it simply makes sense. It's built at the Yamagata plant, a plant NEC built years ago after winning the GameCube contract to fulfill Nintendo's orders. Every chip Nintendo has used since was built either there or at IBM's East Fishkill plant, the same plant already confirmed to produce the Wii U CPU. It's also a weird coincidence that UX8GD was designed to support up to 256Mbit, exactly the amount Nintendo plans to use, and that the technology is not yet available to regular customers. Not to mention NEC listed "game consoles" as a target application for UX8GD.

And I guess it could be running at 729MHz. That would impact the latency on the GPU side, though. Either way, sustained latency probably has to be 2ns or less, if only for BC. Which strikes me as really fucking low. Insanely low.
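
To put that latency figure in perspective, here is what 2 ns works out to at the clocks being floated in this thread (again, all of them guesses):

```python
# How many clock cycles a 2 ns sustained latency corresponds to at the rumored clocks.

clocks = {
    "GPU @ 486 MHz": 486e6,
    "eDRAM @ 729 MHz": 729e6,
    "CPU @ 1458 MHz": 1458e6,
}

for label, hz in clocks.items():
    cycle_ns = 1e9 / hz
    print(f"{label}: 1 cycle = {cycle_ns:.2f} ns, 2 ns ~= {2 / cycle_ns:.1f} cycles")
# A 2 ns access is on the order of one to three clocks, which is why it reads as insanely low.
```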
 

Absinthe

Member
Not sure if this has already been brought up, but this post from b3d has some great points, at least from my understanding of things.

http://forum.beyond3d.com/showpost.php?p=1681704


Originally Posted by function
I've been thinking about what you've said here and I've got a couple of questions about it.

Regarding IO, I know you specifically mention that it isn't likely to be an issue, but I'm not sure why. If we were to assume that Broadway was pad-limited for IO based on its scaling from Gekko, couldn't it also be that the Wii U CPU is IO-limited? It could potentially need 6 times the data that Broadway did (3 cores x twice the speed, even assuming no other increases). Couldn't that mean a potentially greater number of pads, enough to overwhelm the benefits of only needing on-package communication?

No, not really. My argument was two-fold, the first part being that a chip like Gekko/Broadway needs off-chip connections, but if you make a tri-core version, the number of connections isn't going to triple. As you point out, the off-chip data communication needs would increase, but that part is addressed by keeping signals on-package.

On that subject, what is it about on-package communication that reduces IO area requirements? Is it that fewer pads are needed because you can signal faster over the shorter distance, or that smaller contact points are needed because you use less power per 'pin'? Or something else?

Bearing in mind that I'm no IC designer but a computational scientist, to the best of my knowledge both of your points above are correct. What I don't have is hard numbers; that is, if you really want to push the signaling speed per connection, how does that affect the necessary area for the associated drive circuitry? On the other hand, I can't really see that it would be an issue here, and in the cases where I've heard it described in more detail, they've claimed both benefits: much faster signaling at a lower cost in die area.

Finally, what do you think Nintendo have added to the cores, or are they different cores entirely? I think you're probably correct and I've been wondering what the changes might be. Some kind of beefed up SIMD / Vector support seems desirable, especially given the expected low clocks.

Sorry for the all the questions, but this is quite an interesting topic!

Although the thread title says GPU, I'm inclined to agree. :)

As to your question, I'll be damned if I know. No developer has yet been heard gnashing his teeth about having to rewrite all his SIMD code, so Nintendo/IBM adding SIMD blocks to facilitate ports is a possibility. On the other hand, Iwata has publicly made vague noises that could be interpreted as saying the GPU would be the way to go for parallel FP. Or not. They could also have made a complete rework of the core, a la how different manufacturers produce ARMv7 cores of differing complexity. That would cost a bit, though. Or they could have spent gates to beef up only what they deem to be key areas; after all, they have quite a bit of experience by now with where the bottlenecks have proven to be for their particular application space.

While the lack of information is frustrating for the curious, we do know a few things. We know that the die area is 33mm² on 45nm SOI, and that the power draw is in the ballpark of 5W. We also know that it is going to be compatible with Wii titles, which makes it an open question (but not impossible) whether IBM has used a completely unrelated PPC core with sufficient performance headroom per core that performance corner cases can be avoided. "Enhanced" Broadway may indeed be the case.

It's not going to be a powerhouse in raw ALU capabilities under any circumstances compared to contemporary processors. It spends roughly a fifth of the process-adjusted die size per core (logic + cache) of the Apple A6, for instance. On the other hand, the Cell PPE and the Xenon cores aren't particularly strong either for anything but vector-parallel FP code that fits into local storage or L1 cache, respectively. (An imperfect example: the iPhone 5 trumps the PS3 in both the Geekbench integer and floating-point tests.) The take-home message is that even if the Wii U CPU isn't a powerhouse, it isn't necessarily at much of a disadvantage vs. the current HD twins in general processing tasks, even if we think of it as a tweaked Broadway design. If the more modern GPU architecture of the Wii U indeed makes some of the applications the SIMD units were used for unnecessary, maybe it is a better call to simply skip CPU SIMD. This is a game console, after all.

I have to say, though, that given what we know today, it seems to punch above its weight even at this point in time. There are a number of multiplatform ports on the system, at launch day with all that implies, that perform roughly on par with the established competitors. And those games were not developed with the greater programmability, nor the memory organization, of the Wii U in mind. So even without having its strengths fully exploited, it does a similar job at less than half the power draw of its competitors on similar lithographic processes! And it's backwards compatible. To what extent its greater programmability and substantial pool of eDRAM can be exploited to improve visuals further down the line will be interesting to follow.
How what we have seen so far can be construed as demonstrating hardware design incompetence on the part of Nintendo is an enigma to me.
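
To make the power-draw comparison concrete, here is the rough arithmetic behind it. The wattages are ballpark in-game figures commonly cited at the time, so treat them as assumptions rather than measurements:

```python
# Rough arithmetic behind the "less than half the power draw" remark.
# All wattages are assumed ballpark in-game figures, for illustration only.

consoles_watts = {
    "Wii U": 33.0,
    "Xbox 360 S": 90.0,
    "PS3 Slim": 80.0,
}

wiiu = consoles_watts["Wii U"]
for name, watts in consoles_watts.items():
    print(f"{name}: ~{watts:.0f} W (Wii U draws {wiiu / watts:.0%} of this)")
# -> roughly 37% and 41% of the HD twins' assumed in-game draw, i.e. "less than half".
```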
 

Argyle

Member
It could very well happen, and there's a greater chance of that than not, Mr Prestige. Epic confirmed that it is UE4-capable in July; their words were that developers can make UE4 games for the Wii U if they wished to, but they themselves had no plans at the time. So all this talk of it being 'on par with/less than PS360/2005 tech/7th gen' should've been killed once and for all back then (8th gen is here; 8th gen IS the current gen). Truthfully, Epic and UE4 aren't and won't be THAT big a deal anyway, but that it's capable is still welcome news.

We also know that it runs CryEngine 3 'beautifully', and has the new Unity engine, Havok, and also Frostbite 2. As for Luminous, it's scalable, from PCs to mobile and cloud gaming, to the iPad (presumably version 2, as version 3 had been out for a few months and they had been working on that engine for some time) as well as other tablets. That has been reported with quotes at CVG and GameTrailers, among other sites, so unless members here believe that mobiles are currently more powerful, it's not unthinkable to say that the Wii U can have it (in the same quotes, he even says possibly the PS3, FFS...).

The 8th generation WON'T be defined by a giant leap in power and graphics, and all those people expecting one should prepare to be disappointed, because it's clear that it isn't in the industry's interests, and the desire for one isn't as strong as Epic and many members here want to believe. A lot of people are carrying on as if it isn't future-proof or as if it won't get anything once these imaginary, unconfirmed consoles drop, and it's rather pathetic.


Can you humor me and post where you heard about these confirmed engines coming to WiiU? I remember hearing about UE3 but not the others.
 

MDX

Member
How what we have seen so far can be construed as demonstrating hardware design incompetence on the part of Nintendo is an enigma to me.


Nonsense!

http://www.youtube.com/watch?v=YgndOkWGk1A


It's obvious that TANK TANK TANK is a good representation of how powerful the Wii U actually is. Clearly, if the CPU wasn't gimped, if the memory was faster, and if the GPU was more robust, the developers would have been able to make a better-looking game.
 

Alexios

Cores, shaders and BIOS oh my!
Can you humor me and post where you heard about these confirmed engines coming to WiiU? I remember hearing about UE3 but not the others.
I think Unity is provided by Nintendo in the SDK, and I've seen the CryEngine 3 quote; idk about the others. Also, Epic's comment about other devs porting UE4 games to the Wii U was purely hypothetical, with no mention of what downgrades would have to be made and whatnot. If Epic themselves don't bother providing a version for the Wii U (which I guess doesn't have to do with power, when they say current mobiles/tablets could get it), I imagine results may not be ideal, if they even bother. I guess they might bother if the workflow really is that much better on UE4, but it's not going to be easy. It's certainly a loss if Epic never does it.
 

Meesh

Member
A GPU that's crippled by a weak CPU and low bandwidth.
Bear with my noobishness please, but surely there's a completely logical explanation behind what on the surface appears to be a shortsighted decision by those wacky "N" guys... I mean, why intentionally do this to your hardware?
 
This thread has been interesting to read. I don't know shit about tech; I'm just waiting for whatever Retro is working on, cos that should look great.

Another thing: why are some people assuming that the Wii U is going to get PS4/720 ports? It's not even getting some PS3/360 ports, and we know it can at least handle those. I'm talking about games like Crysis 3, Dead Space 3, GTA 5, and BioShock Infinite. Maybe these games are going to be announced sometime in the future, who knows.
 

DieH@rd

Banned
Bear with my noobishness please, but surely there's a completely logical explanation behind what on the surface appears to be a shortsighted decision by those wacky "N" guys... I mean, why intentionally do this to your hardware?

This is a safe way of giving their console HD capability, which will give them the [cheap] chance to do another round of rehashes of their ancient IPs...
 

Absinthe

Member
Nonsense!

http://www.youtube.com/watch?v=YgndOkWGk1A


It's obvious that TANK TANK TANK is a good representation of how powerful the Wii U actually is. Clearly, if the CPU wasn't gimped, if the memory was faster, and if the GPU was more robust, the developers would have been able to make a better-looking game.

No. Your logic doesn't explain the other decent cross-platform ports (optimized for the 360 architecture, no less). Nintendo hardware is not solely responsible for a bad shovelware game by a dev; no hardware can be held solely accountable for a crap game. Every console, at launch, has its fair share of crap games. The fact that ports of codblops2, for example, look good simply proves that your example isn't even valid. You need to re-read the post you quoted for better context.

The eagerness to look for, and extrapolate from, anything that screams the Wii U is a Nintendo failure is getting a little old. Let's wait till all of the specs are in to do that, at minimum.
 

Meesh

Member
This is a safe way of giving their console HD capability, which will give them the [cheap] chance to do another round of rehashes of their ancient IPs...
I think the idea though is to gain third party support, not burn any more bridges....
 
Here is a question: do you know of a feature DX11 supports that DX10.1 does not, that isn't directly related to Microsoft's software? From what I've seen, DX10.1 effects and DX11 effects are pretty much on par, which makes sense given that DX11 was a small upgrade over 10.1.
I'm not familiar enough with them to answer that question. One thing I know about DX11 is that it is supposed to make some features like tessellation a lot more practical and useful than they were in DX10.
 

Kai Dracon

Writing a dinosaur space opera symphony
This is a safe way of giving their console HD capability, which will give them the [cheap] chance to do another round of rehashes of their ancient IPs...

If Nintendo were "cheap" as people keep saying, it seems like they would have released a $200 box nothing but the "pro" style gamepad, that truly was a mere clone of the 360 that was just capable enough of being HD.

At least in a year we'll have "real" next-gen consoles that won't be tainted by any ancient IPs like Madden Football or Need for Speed, and will offer nothing of value or interest beyond being scaled-down gaming PCs with the same media center and streaming functions as the current Xbox and PlayStation units.
 
No. Your logic doesn't explain the other decent cross-platform ports (optimized for the 360 architecture, no less). Nintendo hardware is not solely responsible for a bad shovelware game by a dev; no hardware can be held solely accountable for a crap game. Every console, at launch, has its fair share of crap games. The fact that ports of codblops2, for example, look good simply proves that your example isn't even valid. You need to re-read the post you quoted for better context.

The eagerness to look for, and extrapolate from, anything that screams the Wii U is a Nintendo failure is getting a little old. Let's wait till all of the specs are in to do that, at minimum.

It's pretty obvious he was being sarcastic.
 

kinggroin

Banned
And given a boost by a large amount of high functioning (fully read/write, shared between CPU and GPU) EDRAM and supporting processors like DSPs and ARMs.


Or Retro's game.

To the first part, wouldn't that essentially just put the Wii U on par with the previous generation?
 
And given a boost by a large amount of high functioning (fully read/write, shared between CPU and GPU) EDRAM and supporting processors like DSPs and ARMs.


Or Retro's game.

Or EAD 4's game. The guys over there are extremely underrated when it comes to their technical achievements, IMO. Plus, whatever they have cooking will no doubt have a higher budget than whatever Retro's game is, and it will definitely show.
 