
Wii U Community Thread

Status
Not open for further replies.

Margalis

Banned
It's no coincidence that at least three Wii U launch games are using DoF on tap; it's obviously an onboard hardware feature and not a struggling coded filter effect. Blatantly clear as day.

It's not like DOF is hard to implement. Like... if you have a rendering engine, adding shitty DOF takes like an hour. Having DOF as a hardware feature doesn't make sense. Hardware features that make DOF better? Sure.

Now the WiiU DOF looks a lot better than the typical 360 implementation. That could be for a lot of different reasons - less downsampling in some intermediate buffers, higher quality downsampling and filtering. I believe DX11 has better ways of dealing with the interaction of AA and z-buffers as well. (Which might help with depth discontinuities)

Most post-processing in 360 games is done on crappily downsampled intermediate buffers that are then scaled back up and blurred to hide the artifacts; just having more beef or RAM would improve post-processing effects by a lot.
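For what it's worth, the "takes like an hour" claim above is about right for the naive version: blend the sharp frame with a blurred copy, weighted by each pixel's distance from the focal plane. A minimal CPU-side sketch in NumPy (all names hypothetical; a real engine does this in a pixel shader, usually on a downsampled buffer as described above):

```python
import numpy as np

def box_blur(img, radius):
    # Crude separable box blur, standing in for a real blur kernel.
    out = img.astype(np.float64)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    for axis in (0, 1):
        out = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, out)
    return out

def simple_dof(color, depth, focal_depth, focal_range, max_radius=3):
    """Blend sharp and blurred frames by a per-pixel 'circle of confusion':
    0 = in focus (fully sharp), 1 = fully out of focus (fully blurred)."""
    blurred = box_blur(color, max_radius)
    coc = np.clip(np.abs(depth - focal_depth) / focal_range, 0.0, 1.0)
    return color * (1.0 - coc[..., None]) + blurred * coc[..., None]
```

A console implementation typically blurs a quarter-resolution copy and upsamples it, which is exactly where the 360-era halo artifacts at depth discontinuities come from.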
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
It's no coincidence that at least three Wii U launch games are using DoF on tap; it's obviously an onboard hardware feature and not a struggling coded filter effect. Blatantly clear as day.

The point is that the Wii U using DOF is not an explicit example of DX11 capabilities. I'm not contending the likelihood of a fixed DOF function on the GPU. But even then, the quality of that function is unknown. It looks good. Great. But so does DOF in plenty of other games released this generation.

Hell, being a fixed function on the GPU says nothing other than just that: the Wii U easily supports good quality DOF. That's great, but it doesn't have a lot to do with compute programming capabilities nor DX11 anything.
 
It's not like DOF is hard to implement. Like... if you have a rendering engine, adding shitty DOF takes like an hour. Having DOF as a hardware feature doesn't make sense. Hardware features that make DOF better? Sure.

Now the WiiU DOF looks a lot better than the typical 360 implementation. That could be for a lot of different reasons - less downsampling in some intermediate buffers, higher quality downsampling and filtering. I believe DX11 has better ways of dealing with the interaction of AA and z-buffers as well. (Which might help with depth discontinuities)

Most post-processing in 360 games is done on crappily downsampled intermediate buffers that are then scaled back up and blurred to hide the artifacts; just having more beef or RAM would improve post-processing effects by a lot.

The point is that the Wii U using DOF is not an explicit example of DX11 capabilities. I'm not contending the likelihood of a fixed DOF function on the GPU. But even then, the quality of that function is unknown. It looks good. Great. But so does DOF in plenty of other games released this generation.

Hell, being a fixed function on the GPU says nothing other than just that: the Wii U easily supports good quality DOF. That's great, but it doesn't have a lot to do with compute programming capabilities nor DX11 anything.

You see a lot of arguments on here that seem to go down the road of 'yeah, there's a dead man on the floor and another man standing by him with a smoking gun in his hands, but that doesn't mean a bullet-sized meteor didn't fly in from outer space and kill him, and the man with the gun just tried shooting it out of midair'. I mean... come on now.

If DoF is DX11 (real DoF), then that would make it a feature of the hardware, no?
 

darthdago

Member
More detail, easier to read text, but the same old 360 ugly.

Just sharper.

I am so disappoint.

I'm still catching up on the thread... so maybe someone already gave you the hint to use an HDMI cable from your 360 to your TV; then in the settings menu (if it's not doing it automatically) you have to change to 1080p output...
 
I'm still catching up on the thread... so maybe someone already gave you the hint to use an HDMI cable from your 360 to your TV; then in the settings menu (if it's not doing it automatically) you have to change to 1080p output...

Oh blah.

I ain't stupid.

Besides I'm using a VGA cable.

And anyways I've already admitted that was a lie.

I'm quite enjoying it.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
You see a lot of arguments on here that seem to go down the road of 'yeah, there's a dead man on the floor and another man standing by him with a smoking gun in his hands, but that doesn't mean a bullet-sized meteor didn't fly in from outer space and kill him, and the man with the gun just tried shooting it out of midair'. I mean... come on now.

What the hell are you even talking about? Did you even read my post?
 

MDX

Member
The point is that the Wii U using DOF is not an explicit example of DX11 capabilities.


But of course we are not pointing out just DoF; there are other things going on that, taken together, are a good indication that the WiiU is using features that DX11 excels at. This can also mean that the WiiU is not a DX11 machine, but it's a sign that Nintendo has looked at what features there are in DX11, or what features are important for next gen, and has made sure to prepare the WiiU to use them heavily.
 

NBtoaster

Member
You see a lot of arguments on here that seem to go down the road of 'yeah, there's a dead man on the floor and another man standing by him with a smoking gun in his hands, but that doesn't mean a bullet-sized meteor didn't fly in from outer space and kill him, and the man with the gun just tried shooting it out of midair'. I mean... come on now.

Having higher quality DOF (along with other post processing) is something you expect out of a better modern GPU. PC games have had better quality DOF for years. It doesn't mean it's doing it in hardware.

I expect all post processing to be better on Wii U. The usual 1/4 res stuff on the current consoles looks rough.
 
What the hell are you even talking about? Did you even read my post?

Yeah, you negate the Wii U DoF examples as nothing. Did you read my post? So many games are using it to much better degrees than anything seen on PS360, and you're saying it's not proof positive of DX11-featured hardware? No, maybe not absolute proof, but still: if it looks like a dog and barks like a dog, it probably isn't a duck.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
But of course we are not pointing out just DoF; there are other things going on that, taken together, are a good indication that the WiiU is using features that DX11 excels at. This can also mean that the WiiU is not a DX11 machine, but it's a sign that Nintendo has looked at what features there are in DX11, or what features are important for next gen, and has made sure to prepare the WiiU to use them heavily.

And I'm not arguing against that. As soon as the devkit specs leaked, listing tessellation and compute programming support, I figured Nintendo had made an effort to adopt a somewhat modern feature set to the GPU. And that is something I am very happy about, especially compute programming, as I felt the absence of such features would hinder third party engine efforts next generation (many of which I suspect will be heavy on compute shaders).

But I don't know the capabilities of the tessellation or the compute programming on the Wii U, and I'm not going to pretend to. I also don't know what, if anything, Nintendo has bolted on to the GPU as a fixed function. And I don't feel pointing to Pikmin 3's DOF is a solid example of DX11-featured hardware.

Yeah, you negate the Wii U DoF examples as nothing. Did you read my post? So many games are using it to much better degrees than anything seen on PS360, and you're saying it's not proof positive of DX11-featured hardware? No, maybe not absolute proof, but still: if it looks like a dog and barks like a dog, it probably isn't a duck.

No, you didn't read my post, as I did not negate the Wii U's DOF as 'nothing'. You seem to have decided I'm aiming to discredit DX11-associated feature sets from the Wii U. I am not. I simply do not agree with the notion that current DOF examples are near-absolute proof of DX11 feature sets in the hardware, even though I do believe the hardware does indeed have DX11 feature sets.
 
And I'm not arguing against that. As soon as the devkit specs leaked, listing tessellation and compute programming support, I figured Nintendo had made an effort to adopt a somewhat modern feature set to the GPU. And that is something I am very happy about, especially compute programming, as I felt the absence of such features would hinder third party engine efforts next generation (many of which I suspect will be heavy on compute shaders).

But I don't know the capabilities of the tessellation or the compute programming on the Wii U, and I'm not going to pretend to. I also don't know what, if anything, Nintendo has bolted on to the GPU as a fixed function. And I don't feel pointing to Pikmin 3's DOF is a solid example of DX11-featured hardware.



No, you didn't read my post, as I did not negate the Wii U's DOF as 'nothing'. You seem to have decided I'm aiming to discredit DX11-associated feature sets from the Wii U. I am not. I simply do not agree with the notion that current DOF examples are near-absolute proof of DX11 feature sets in the hardware, even though I do believe the hardware does indeed have DX11 feature sets.

Believe me, I did read your post. And I think the mysterious meteor death story still applies.
 

MDX

Member
No, you didn't read my post, as I did not negate the Wii U's DOF as 'nothing'. You seem to have decided I'm aiming to discredit DX11-associated feature sets from the Wii U. I am not. I simply do not agree with the notion that current DOF examples are near-absolute proof of DX11 feature sets in the hardware, even though I do believe the hardware does indeed have DX11 feature sets.

But you bring up a point that makes people shake their heads at Nintendo.

I understand they don't want all aspects of their hardware specs to be revealed. But one would think that stating the WiiU supports DX11 or its equivalent would be a small gesture to the core gamers they admit to be going after. It says that the WiiU is modern, but doesn't give away the specifics.

edit to add: I feel like I just hosted a fashion make-over event.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
But you bring up a point that makes people shake their heads at Nintendo.

I understand they don't want all aspects of their hardware specs to be revealed. But one would think that stating the WiiU supports DX11 or its equivalent would be a small gesture to the core gamers they admit to be going after. It says that the WiiU is modern, but doesn't give away the specifics.

Well, you know, that's Nintendo's problem. If they want to use buzz words to market their system's power then yeah, a select audience will shake their heads because they want to know more, and Reggie mouthing off about how powerful the system is doesn't say much.

Nintendo is private and conservative when it comes to public information on hardware details. I don't really know why. I guess that's just their business model, to avoid going head-to-head with public information from Microsoft and Sony. After all, they could let us know a bit more about the hardware and you'd still have people complaining about not knowing the nitty-gritty stuff. They'd rather draw attention to the games and the hardware features that can enhance those games.

That's Nintendo for you.
 

Margalis

Banned
But of course we are not pointing out just DoF; there are other things going on that, taken together, are a good indication that the WiiU is using features that DX11 excels at.

DirectX11 has 3 major new features IIRC:

Tessellation
Compute shaders
Multithreading

The WiiU apparently supports at least the first 2 to some degree. So I'm not sure what you guys are debating. The WiiU is not going to use the DirectX API and the exact implementation of tessellation and compute shaders may be somewhat different, but if the WiiU does support tessellation (beyond what the 360 supports which was basically ignored by developers) and compute shaders then it is highly in line with DX11 features.
 
And I don't feel pointing to Pikmin 3's DOF is a solid example of DX11-featured hardware.
tbh, if you had an X360-style unified shader SM3 GPU with one teraflop or less (but more than this gen currently offers) you could probably pull this off somewhat, albeit wasting a lot more resources than more modern architectures do. With SM4.0 it gets easier, and with SM5.0 it's in the specification; the keyword being how taxing it is.

I'm not saying they're jumping through those hoops (probably too much work for a platform starting its life now), but DoF does seem to have a performance hit, seeing that Pikmin 3 runs at 720p @ 30 frames per second. It could be done with SM4.0; as you said, even if my gut feeling is they're doing it via SM5.0, we can't conclude much. For all intents and purposes they might be using the extra overhead they have to force the console to do stuff like this (although the baseline for it is SM4, so it would pull this off much more easily than an SM3 GPU like this gen's could).
 

10k

Banned
How many pages and pages can you guys argue back and forth with zero new information being introduced?

It's absolutely mind-boggling.

You guys do know that if you just wait a couple months all of this stuff will work itself out right? I really don't get why you need to spend like 100 pages arguing when none of you knows much of anything and all your questions will be answered soonish anyway.

It's like a 10 hour debate on what someone will eat for lunch tomorrow. And it's pretty clear you are all arguing what you want to believe rather than what makes sense.

Feature sets are almost irrelevant? Lol no. The WiiU will handle ports from PS4 easier than from PS3? Probably not. You are both saying insane stuff.
Someone tell this man the definition of speculation, stat!

http://dictionary.reference.com/browse/speculation
 

10k

Banned
I'm playing the Mass Effects again.

What res are they at?
720p at 30fps and 2xAA

My take on hardware so far is the CPU will be 45nm and the GPU will be 40nm?

Why?

Well, I bought myself a GTX 570 not too long ago, a 40nm card, and it's pretty quiet and cheap for what it does. I think Nintendo would prefer a 32nm GPU, but the cost would be too high for consumers, so they will stick with the 40nm process. I'm OK with that. I will dance like an idiot if it's 32nm though. It'll be like the GameCube all over again: small but powerful.
 

Aostia

El Capitan Todd

I've found also this: http://nintendoculture.com/more-3rd-party-wiiu-games-coming/
“It’s actually a moment I can’t talk about, because my favourite bit of E3 was finding out about all the cool third party stuff that’s coming for the Wii U, which they decided not to talk about, which was awesome. So I just want to say yeah, there’s loads of third party stuff coming which I reckon we might see at Gamescom, or we might see at TGS, but it’s coming. That was my favourite bit because everyone was saying “No one’s bringing out anything for Wii U”, and I saw some cool things.”
 
My take on hardware so far is the CPU will be 45nm and the GPU will be 40nm?

Why?

Well, I bought myself a GTX 570 not too long ago, a 40nm card, and it's pretty quiet and cheap for what it does. I think Nintendo would prefer a 32nm GPU, but the cost would be too high for consumers, so they will stick with the 40nm process. I'm OK with that. I will dance like an idiot if it's 32nm though. It'll be like the GameCube all over again: small but powerful.
Nah, if it's not 40nm it'll be 28nm.

All graphics card manufacturers (AMD/ATI and Nvidia) skipped the 32nm process.

GPU manufacturers: 55, 40, 28nm
CPU manufacturers: 65, 45, 32, 22nm
 

DrWong

Member
I've found also this: http://nintendoculture.com/more-3rd-party-wiiu-games-coming/
“It’s actually a moment I can’t talk about, because my favourite bit of E3 was finding out about all the cool third party stuff that’s coming for the Wii U, which they decided not to talk about, which was awesome. So I just want to say yeah, there’s loads of third party stuff coming which I reckon we might see at Gamescom, or we might see at TGS, but it’s coming. That was my favourite bit because everyone was saying “No one’s bringing out anything for Wii U”, and I saw some cool things.”
Yeah, I was going to post this too.

With other older quotes with the same statement (the guy from Gametrailer?) also reported here, it seems we can expect some cool - not megaton level - stuff coming within September.
 
I've found also this: http://nintendoculture.com/more-3rd-party-wiiu-games-coming/
“It’s actually a moment I can’t talk about, because my favourite bit of E3 was finding out about all the cool third party stuff that’s coming for the Wii U, which they decided not to talk about, which was awesome. So I just want to say yeah, there’s loads of third party stuff coming which I reckon we might see at Gamescom, or we might see at TGS, but it’s coming. That was my favourite bit because everyone was saying “No one’s bringing out anything for Wii U”, and I saw some cool things.”

Nice find. I'm not sure, but has Nintendo ever given a "there are X number of games in development for the Wii U right now" statement, the way they did with the 3DS and Sony did with the Vita?
 

big_erk

Member
Iwata from the interview I just posted
I think that the Wii U will be powerful enough to run very high spec games but the architecture is obviously different than other consoles so there is a need to do some tuning if you really want to max out the performance.



This jibes with what others in this thread have been trying to say.
 

Rösti

Unconfirmed Member
Here's a rather boring tidbit from the Joint Steering Committee for Development of RDA, regarding the Nintendo Optical Disc:

A plastic optical disc storage medium used to
distribute video games released by Nintendo,
including the Nintendo GameCube Game Disc, Wii
Optical Disc, and Wii U Optical Disc. They range in
diameter from 80-120 mm and disc capacities range
from 1.4 GB to 25 GB per layer.
Source: http://www.rda-jsc.org/docs/6JSC-ALA-16.pdf

I don't know if this is of any greater significance, but "25 GB per layer" is mentioned, hinting at dual-layer discs perhaps? While the information isn't that extensive, I don't expect this organization to fiddle around with things grabbed from the Internet, so I'd say this is legit.

In other news, Tose is apparently working on something Wii U related, or at least looking into it. Nothing major, seeing as they only do everything. And another report mentioning Q3 for the Wii U release has popped up: http://www.ezway.tw/wp-content/uplo...;會重點摘要.pdf

To be specific, it says economic Q3, but it concerns Foxconn, and their fiscal year ends on the 31st of December, so it's the same as the calendar year. I think three or more reports have spoken about Q3 for release; quite peculiar.
 

andthebeatgoeson

Junior Member
Well, you know, that's Nintendo's problem. If they want to use buzz words to market their system's power then yeah, a select audience will shake their heads because they want to know more, and Reggie mouthing off about how powerful the system is doesn't say much.

Nintendo is private and conservative when it comes to public information on hardware details. I don't really know why. I guess that's just their business model, to avoid going head-to-head with public information from Microsoft and Sony. After all, they could let us know a bit more about the hardware and you'd still have people complaining about not knowing the nitty-gritty stuff. They'd rather draw attention to the games and the hardware features that can enhance those games.

That's Nintendo for you.

They don't want to get Dreamcasted? No matter how annoying it is for now, releasing specs justifies the pursuit of high-end specs. Not releasing GC specs did nothing to downplay that the GC was, at least, the #2 most powerful console of its gen. They don't even take part in the conversation, and the Wii proved that they have a point. Not revealing specs to cater to some crowd that may or may not buy the console does nothing for them at this point in time. I can't hold it against them, because even though I'm desperate for info, the overwhelming truth of this industry is games. Quality games will override all other factors.
 
Rösti;39784305 said:
Here's a rather boring tidbit from the Joint Steering Committee for Development of RDA, regarding the Nintendo Optical Disc:


Source: http://www.rda-jsc.org/docs/6JSC-ALA-16.pdf

I don't know if this is of any greater significance, but "25 GB per layer" is mentioned, hinting at dual-layer discs perhaps? While the information isn't that extensive, I don't expect this organization to fiddle around with things grabbed from the Internet, so I'd say this is legit.

In other news, Tose is apparently working on something Wii U related, or at least looking into it. Nothing major, seeing as they only do everything. And another report mentioning Q3 for the Wii U release has popped up: http://www.ezway.tw/wp-content/uplo...;會重點摘要.pdf

To be specific, it says economic Q3, but it concerns Foxconn, and their fiscal year ends on the 31st of December, so it's the same as the calendar year. I think three or more reports have spoken about Q3 for release; quite peculiar.

There are no words for how happy I would feel if it launched in September.

About your other find, is there really a possibility of no dual-layer discs? I thought it was basically set in stone that there would be 50GB discs if needed. Oh well, how many games have even used up a whole 25GB? I can't even imagine what the scale of a game would be like if it took up more than that.
 
diameter from 80-120 mm and disc capacities range
from 1.4 GB to 25 GB per layer
80mm and 1.4 GB capacity discs? GC discs confirmed?

I kid, I kid. (But really, GC discs were 1.4 GB and had an 80mm diameter. wtf)


Gamecube compatibility is definitely out, right?
 
80mm and 1.4 GB capacity discs? GC discs confirmed?

I kid, I kid. (But really, GC discs were 1.4 GB and had an 80mm diameter. wtf)


Gamecube compatibility is definitely out, right?

GCN optical discs are out for Wii U. I get lost in the tech talk sometimes but I know that much.

I have a game idea to throw out there:

Simultaneous gameplay with 2 players. Imagine a 2D Metroid. Huge. One player can use the big screen while the other uses the small screen. They can do different areas at the same time and clear them quickly. Maybe in one area something has to be activated for the player to progress in another area.

They could then join up on the big screen to destroy Ridley or Mother Brain together.
 

10k

Banned
Nah, if it's not 40nm it'll be 28nm.

All graphics card manufacturers (AMD/ATI and Nvidia) skipped the 32nm process.

GPU manufacturers: 55, 40, 28nm
CPU manufacturers: 65, 45, 32, 22nm
Wasn't that Oban GPU supposed to be 32nm? If you are right, then for sure Nintendo is using a 40nm GPU, which is quiet enough for me.

Anyways, I'm wondering if techies here can riddle me this?

I was trying to find the benefits of the eDRAM compared to the 360's. According to this forum post (yeah, it's from GameSpot, but meh) the 360's 10MB of eDRAM wasn't enough to store a 720p image, so it became useless for AA, DOF, etc. at 720p. That's why games like Halo 3 were 640p or lower.
http://www.gamespot.com/forums/topic/26116823
It seemed like a win-win situation where developers could get "free" AA, DOF, motion blur, etc.
Well it turns out that nothing is "free", everything has a price. The 10MB of eDRAM isn't large enough to store a 720p image, the largest image that can be stored is in a non-standard-sub-HD video mode dubbed "640p". Consequently many Xbox 360 games (Halo 3, PGR3/4 etc) aren't native HD.

So if the Wii U does have 32MB of eDRAM on the GPU, will it be enough to store a 720p or even 1080p image and add "free" AA, motion blur, or DOF without taxing the GPU? Will the Wii U be able to render 720p games natively, with AA and without tiling?
 

10k

Banned
There are no words for how happy I would feel if it launched in September.

About your other find, is there really a possibility for no dual-layer discs? I thought it was basically set in stone there would be 50GB discs if needed. Oh well, how many games have even used up a whole 25GB? I can't even imagine what the scale of a game would be like if it took up more than that.
I keep saying the second-to-last Sunday of September. So that would be September 23rd, 2012. Calling it now. I've got $387 saved so far.
 
Interesting interview with Iwata

snippet on Wii U's graphical performance:

Staying with graphics but going back to the idea of getting third parties involved, have you approached Epic with the specs of the Wii U to try to make sure that third-parties using Unreal Engine 4 can easily port their games to Wii U?



Iwata said:
I think that the Wii U will be powerful enough to run very high spec games but the architecture is obviously different than other consoles so there is a need to do some tuning if you really want to max out the performance.



We’re not going to deliver a system that has so much horsepower that no matter what you put on there it will run beautifully, and also, because we’re selling the system with the GamePad – which adds extra cost to the package – we don’t want to inflate the cost of each unit by putting in excessive CPU power.

At the moment of the Wii U’s launch it’s likely that it will be the most powerful console on the market – the Wii U being a much newer system than either PS3 or Xbox 360. Are Nintendo looking to take this opportunity to release a game which takes advantage of this visual horsepower?



Iwata said:
I’m not against beautiful graphics, but my thinking is that unless the play experience is really rich the wonderful graphics won’t really help. I’m really looking forward to beautiful games coming out on Wii U though, with graphics that we couldn’t have done on the Wii.



There’s definitely the chance for not only graphics, but also other features that our competitors’ consoles don’t have. But I think it will become increasingly difficult from now on to compete over graphics. This is because no matter how much we increase the number of polygons we can display and improve the shading, it will become increasingly difficult to tell the difference.



Obviously people who are experts in the field will see these things and will look at some details and be enthusiastic about improvements in that field, but I don’t think that will be enough from the general consumer’s point of view, so I think when we look at the design of a new games console we need a structure and concept that offers more than just good graphics.

beaten: :(
 

This was actually known by many right after the E3 press conference. Good reminder though.


720p at 30fps and 2xAA

My take on hardware so far is the CPU will be 45nm and the GPU will be 40nm?

Why?

Well, I bought myself a GTX 570 not too long ago, a 40nm card, and it's pretty quiet and cheap for what it does. I think Nintendo would prefer a 32nm GPU, but the cost would be too high for consumers, so they will stick with the 40nm process. I'm OK with that. I will dance like an idiot if it's 32nm though. It'll be like the GameCube all over again: small but powerful.

I totally agree with the GameCube example as something to go by for the Wii U tech. How much money did Nintendo spend on R&D for the Wii U? Over a billion, right?






On Wii U being a GPGPU/DX11 GPU, I made a late edit to a previous post:


DirectX 11 was released at the end of 2009, so a GPU with its features should be a given in 2012; it has also been confirmed by developers that the Wii U's GPU is a modern one. While "modern" is open to speculation, if it were not based on the features in DirectX 11 it would not be able to handle the features required for next-gen games. DirectX 11 does everything DirectX 10 does, only more of it and with better performance. Using something based on DirectX 10.1 for the Wii U wouldn't make sense because it's not as optimized.

Compute Shaders were designed for DirectX 11: "Although DirectCompute was introduced with Microsoft* DirectX* 11, it is possible to run a compute shader on Microsoft* DirectX* 10-, 10.1-, and 11-class hardware" http://software.intel.com/en-us/arti...ssor-graphics/

It's possible for them to get Compute Shaders running on DirectX 10.1-class hardware, but it's not the intended target. This is another reason to conclude that the Wii U is using DirectX 11 Compute Shaders and modern features, thus making it a GPGPU, or a modern DX11 GPU.

http://www.neogaf.com/forum/showpost.php?p=39777978&postcount=7795
 
Iwata: "It will become increasingly difficult to tell the difference in graphics"

Internet: Endless controversy over whether Wii U is powerful enough, such that it's almost become the defining point of any discussion of the system.
 
So if the Wii U does have 32mb of eDRAM on the GPU, will it be enough to store a 720p or even 1080p image and add "free" AA or motion blur or DOF without taxing the GPU? Will the Wii U be able to render 720p games natively without tiling and AA?

Pretty positive BG and some others are on record saying 32MB is enough for 720p with 4x MSAA, or 1080p with no MSAA, in a single pass. That's IIRC; I don't know how to do the calculations myself.
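For what it's worth, the back-of-the-envelope version of that calculation is straightforward, assuming 32-bit colour plus 32-bit depth/stencil per sample (a common layout; the real consoles may pack things differently):

```python
def framebuffer_bytes(width, height, msaa=1, color_bpp=4, depth_bpp=4):
    # Colour target + depth/stencil target, one of each per MSAA sample.
    return width * height * msaa * (color_bpp + depth_bpp)

MB = 1024 * 1024

# Xbox 360: 10 MB of eDRAM. A plain 720p target fits (~7.0 MB),
# but 720p with 4x MSAA (~28.1 MB) does not, hence tiling or sub-HD.
assert framebuffer_bytes(1280, 720) <= 10 * MB
assert framebuffer_bytes(1280, 720, msaa=4) > 10 * MB

# Rumoured Wii U: 32 MB of eDRAM. Both 720p with 4x MSAA (~28.1 MB)
# and 1080p with no MSAA (~15.8 MB) fit in a single pass.
assert framebuffer_bytes(1280, 720, msaa=4) <= 32 * MB
assert framebuffer_bytes(1920, 1080) <= 32 * MB
```

Under those assumptions the numbers line up with the quoted claim: 720p + 4x MSAA and 1080p without MSAA both fit in 32MB, while neither AA'd option fits in 10MB.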
 
Looks like you were beaten, Sorcerer. I wonder how literal Iwata's CPU comment is.

DirectX11 has 3 major new features IIRC:

Tessellation
Compute shaders
Multithreading

The WiiU apparently supports at least the first 2 to some degree. So I'm not sure what you guys are debating. The WiiU is not going to use the DirectX API and the exact implementation of tessellation and compute shaders may be somewhat different, but if the WiiU does support tessellation (beyond what the 360 supports which was basically ignored by developers) and compute shaders then it is highly in line with DX11 features.

The compute shader mention is the most interesting part of the target specs. We know that was MS's creation for DX11 and, like you said, the Wii U won't be using DX. Nintendo could have just said OpenCL, but it seems like they were giving an indication of the GPU's capability level. Like how in the PS4 target specs, Sony says the PS4's GPU will be capable of "DX11.5". Obviously Sony, like Nintendo, won't be using DX, and DX11.5 doesn't exist, but they were giving an idea of the capability level of their GPU. DX has apparently become the de facto template for comparison.

Nah, if it's not 40nm it'll be 28nm.

All graphics card manufacturers (AMD/ATI and Nvidia) skipped the 32nm process.

GPU manufacturers: 55, 40, 28nm
CPU manufacturers: 65, 45, 32, 22nm

I don't know about that. Nintendo has always used main nodes over half nodes for their CPUs and GPUs. That's what makes 32nm likely, IMO.
 

USC-fan

Banned
This was actually known by many right after the E3 press conference. Good reminder though.




I totally agree with the Gamecube example as something to go by with the Wii U tech. How much money did Nintendo spend in R&D for the Wii U, over a billion right?






On Wii U being a GPGPU/DX11 GPU, I made a late edit to a previous post:


DirectX 11 was released at the end of 2009, so a GPU with its features should be a given in 2012; it has also been confirmed by developers that the Wii U's GPU is a modern one. While "modern" is open to speculation, if it were not based on the features in DirectX 11 it would not be able to handle the features required for next-gen games. DirectX 11 does everything DirectX 10 does, only more of it and with better performance. Using something based on DirectX 10.1 for the Wii U wouldn't make sense because it's not as optimized.

Compute Shaders were designed for DirectX 11: "Although DirectCompute was introduced with Microsoft* DirectX* 11, it is possible to run a compute shader on Microsoft* DirectX* 10-, 10.1-, and 11-class hardware" http://software.intel.com/en-us/arti...ssor-graphics/

It's possible for them to get Compute Shaders running on DirectX 10.1-class hardware, but it's not the intended target. This is another reason to conclude that the Wii U is using DirectX 11 Compute Shaders and modern features, thus making it a GPGPU, or a modern DX11 GPU.

http://www.neogaf.com/forum/showpost.php?p=39777978&postcount=7795
The Wii U will not use DX11 to run games. The Wii U is likely DX10.1-level, having started out on the R700. It makes little sense to go after the DX11 feature set when they would not be using it; DX11 is an MS product. Neither the PS4 nor the Wii U will use DX11.

The only reason you see us talking about DX11 is because it lets us know the feature set of the card. If it is DX11-level, that means they moved on to an Evergreen or newer GPU, which would be a good thing.

putting in excessive CPU power.
Wonder what he means by "excessive CPU power"; it goes along with all the comments on CPU problems. This pretty much confirms it: they scaled back on the CPU side.
 

disap.ed

Member
Iwata said:
I think that the Wii U will be powerful enough to run very high spec games but the architecture is obviously different than other consoles so there is a need to do some tuning if you really want to max out the performance.



We’re not going to deliver a system that has so much horsepower that no matter what you put on there it will run beautifully, and also, because we’re selling the system with the GamePad – which adds extra cost to the package – we don’t want to inflate the cost of each unit by putting in excessive CPU power.

Iwata putting excessive power into GPU confirmed.
 

DrWong

Member
Nintendo Gamer had also an Interview – Katsuhiro Harada talks Tekken Tag Tournament 2:
NG: How has it been developing for the Wii U? Is it a relatively easy system to develop on?

KH: The first thing we don’t really know at the moment is the release date of the hardware itself, then we’d be able to release more information about our game in particular. Regarding the hardware, the graphical processing capability is not bad at all. We’re actually able to do pretty much what we want in that regard. The one thing where we do kind of have to put our heads to good use is with the CPU itself – I guess because they’re trying to keep energy consumption pretty low we really do need to come up with some unique ideas to make good use of the CPU in order to attain the goals we need to for the game.
 