
WiiU technical discussion (serious discussions welcome)

Fafalada

Fafracer forever
brain_stew said:
It's a compressed, down scaled mirror of the TV screen, what you are seeing is normal.
I'm a bit miffed that Nintendo went with compression there - within their supported ranges, the latest wireless HDMI devices now support full 1080P with 3D and surround uncompressed, so with a 480P stereo target, compression feels like overkill.

Apophis2036 said:
How so? -
WiiU -
Tri-Core IBM OoOE CPU clocked at 1.2 GHz.
Well, if you want to play paper-specs, the 360 CPU is roughly 6x more powerful than the WiiU's (the PS3's would be closer to 18x).
 

AzaK

Member
We're talking about WiiU's tech in relation to where it will stand when the other next gen consoles arrive.

I'm out of all the WiiU tech threads anyway until someone has evidence of how powerful the GPU is and how fast the eDRAM is clocked; without that info these threads are circle arguments.

I understand, but this thread has remained completely focused on tech specs rather than comparing platforms, as has been going on the last couple of days. It'd be nice to keep it that way. There are two more threads for the other stuff.
 

big_erk

Member
We're talking about WiiU's tech in relation to where it will stand when the other next gen consoles arrive.

I'm out of all the WiiU tech threads anyway until someone has evidence of how powerful the GPU is and how fast the eDRAM is clocked; without that info these threads are circle arguments.

I'd probably follow "circle" up with another word, but I get your point.
 

eternalb

Member
I'm a bit miffed that Nintendo went with compression there - within their supported ranges, the latest wireless HDMI devices now support full 1080P with 3D and surround uncompressed, so with a 480P stereo target, compression feels like overkill.

With no latency?
 

Ashes

Banned
400 - 500 GFLOP vs a 1.8 TFLOP GPU, since when has that been a 300% increase lol...

Just over 4x at most for the GPU and 2x the RAM. No one knows what speed the PS4's CPU will be clocked at; if it's anything like the 720's it will be far lower than the 3.2 GHz CPUs found in PS360 though.

400 to 1800 = 350% increase
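(To spell out the arithmetic: going from 400 to 1800 is an increase of (1800 - 400) / 400 = 3.5, i.e. 350%, which is the same thing as saying the raw figure is 1800 / 400 = 4.5 times as big.)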

I must be missing something. Alright, I'm leaving the thread...
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
400 to 1800 = 350% increase

I must be missing something. Alright, I'm leaving the thread...

The only thing you're missing is that Apophis2036 is as bad at math as he is at contributing to tech threads. :p
 

ozfunghi

Member
Mods have shown themselves to be quick on the draw over the past couple of months when it comes to WiiU rumor threads. Maybe one of them can spare the time to moderate about 75% of the last three pages as well, please?

EatChildren? Anyone?
 
Meltdowns, meltdowns. Get your meltdowns.

-----

Anyway, can someone elaborate on how exactly "GPGPU" is going to be the Wii U's panacea, as has been made out on the chalkboard?

From reading B3D, the impression I'm getting is that it likely isn't going to be that much use.

There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM's.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Meltdowns, meltdowns. Get your meltdowns.

-----

Anyway, can someone elaborate on how exactly "GPGPU" is going to be the Wii U's panacea, as has been made out on the chalkboard?

From reading B3D, the impression I'm getting is that it likely isn't going to be that much use.

There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM's.

Personally I hope we see some advancement in GPGPU use on all platforms next gen, PC included, but of course only where the use of such resources makes sense and doesn't greatly affect graphics performance.

Planetside 2 can use it a little to improve particles on a CPU-limited set-up, but on my set-up enabling it does drop the framerate by a few fps.
I've honestly not seen it used anywhere in games for anything other than particle rendering yet. (At least outside gaming it's found a home with SHA256 hashing & Bitcoin).

I honestly can't see it seeing much general use though; the range of workloads stream processors are suitable for is still very narrow, and there's no escaping the fact that any resources a GPU spends on non-graphics functions are taken away from what a GPU is good at: graphics and rendering.
 

NBtoaster

Member
Personally I hope we see some advancement in GPGPU use on all platforms next gen, PC included, but of course only where the use of such resources makes sense and doesn't greatly affect graphics performance.

Planetside 2 can use it a little to improve particles on a CPU-limited set-up, but on my set-up enabling it does drop the framerate by a few fps.
I've honestly not seen it used anywhere in games for anything other than particle rendering yet. (At least outside gaming it's found a home with SHA256 hashing & Bitcoin).

I honestly can't see it seeing much general use though; the range of workloads stream processors are suitable for is still very narrow, and there's no escaping the fact that any resources a GPU spends on non-graphics functions are taken away from what a GPU is good at: graphics and rendering.

Isn't all the PhysX stuff on PC GPGPU based? That's been used for things like cloth physics, fluid simulation, particles, and others.

eg:

http://www.youtube.com/watch?v=EWFkDrKvBRU
 
There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM's.
Maybe in comparison to what it could have been, but it's not likely to literally be anywhere close to as low as main memory bandwidth.

It'd defeat the purpose of its existence.
 
Mods have shown themselves to be quick on the draw over the past couple of months when it comes to WiiU rumor threads. Maybe one of them can spare the time to moderate about 75% of the last three pages as well, please?

EatChildren? Anyone?

Apophis is comedy gold. They're giving him a pass. As they should.

Meltdowns, meltdowns. Get your meltdowns.

-----

Anyway, can someone elaborate on how exactly "GPGPU" is going to be the Wii U's panacea, as has been made out on the chalkboard?

From reading B3D, the impression I'm getting is that it likely isn't going to be that much use.

There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM's.
The Wii U CPU has a very, very weak and slow vector unit - compared clock-for-clock, whatever, it's really bad. The GPU could sacrifice some processing time to bring the Wii U just under par in that area. That goes for physics/gravity, sound (offloaded to a DSP in the WiiU's case), time transforms, and a ton of other gameplay modifiers.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Isn't all the PhysX stuff on PC GPGPU based? That's been used for things like cloth physics, fluid simulation, particles, and others.

eg:

http://www.youtube.com/watch?v=EWFkDrKvBRU

Yes, but then things like cloth physics, fluid simulation and particles are pretty much the narrow workload I was talking about. We're not talking about major parts of what would traditionally be the CPU workload anyway, which is the game logic, AI, etc.

There will be interesting uses, some of which will slightly broaden that narrow workload more as developers find tricks people haven't thought of yet.

But that is a million miles away from seeing the majority of what is usually the domain of the CPU being done on the GPU; those expecting that kind of offloading are going to be disappointed.
 

z0m3le

Banned
Yes, but then things like cloth physics, fluid simulation and particles are pretty much the narrow workload I was talking about. We're not talking about major parts of what would traditionally be the CPU workload anyway, which is the game logic, AI, etc.

There will be interesting uses, some of which will slightly broaden that narrow workload more as developers find tricks people haven't thought of yet.

But that is a million miles away from seeing the majority of what is usually the domain of the CPU being done on the GPU; those expecting that kind of offloading are going to be disappointed.

You also have to think about GPGPU being used as more than just an enhancement - as something integral to the actual game. It will happen next gen, maybe more so on Wii U if that is all the console can really do.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
You also have to think about GPGPU being used as more than just an enhancement - as something integral to the actual game. It will happen next gen, maybe more so on Wii U if that is all the console can really do.

The thing is, in order for it to make sense to move a calculation from the CPU to the GPU, it has to be a fairly small task on a largely parallelized dataset, with little to no branching. Think of shaders acting on vertices in a scene, or pixels in a framebuffer: these are the workloads a GPU is designed to do, and to do very quickly.

Not all tasks fall into this category. In fact, the tasks that can be moved already are being moved, but there are still a lot of tasks the CPU is responsible for that don't make sense to move to the GPU; the GPU would perform very badly on them, if it could do them at all.

It's not that the technology hasn't moved forward, but rather that GPU tech has massively moved forward at undertaking exactly these types of parallel task, to the point that GPUs do not, and never will, do the other types of task well.

And a whole load of those "other types of tasks" are what make up the game logic and AI that run on the CPU in a game.
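To make that concrete, here is a minimal sketch of the kind of job that does fit - written as a CUDA kernel purely for illustration, with made-up names, not anything from an actual engine. Every thread runs the same few arithmetic operations on its own element, with no data-dependent branching and no element depending on its neighbours. Game logic and AI almost never look like this.

// One thread per particle: identical arithmetic for every element, no
// branching on the data, no communication between elements. This is the
// textbook stream-processor-friendly workload.
__global__ void integrateParticles(float3* pos, float3* vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;            // guard for the last, partially filled block

    vel[i].y -= 9.81f * dt;        // apply gravity
    pos[i].x += vel[i].x * dt;     // integrate position
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

// launched as something like: integrateParticles<<<(n + 255) / 256, 256>>>(pos, vel, n, dt);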
 

Ashes

Banned
I think I may have interrupted the first decent thread on Wii U tech. Apologies.

Are we allowed to talk about costs? If so, what is the reason behind the Wii U controller's high cost?

The Eurogamer Wii U review posted on GAF implied it would be cheap, but I've been led to believe that it accounts for nearly half of the console's cost. If this has been discussed already, what is it that's contributing to the high cost?

edit: If there is a better thread for this question, please direct me; I'll be happy to post it there.
 

z0m3le

Banned
The thing is, in order for it to make sense to move a calculation from the CPU to the GPU, it has to be a fairly small task on a largely parallelized dataset, with little to no branching. Think of shaders acting on vertices in a scene, or pixels in a framebuffer: these are the workloads a GPU is designed to do, and to do very quickly.

Not all tasks fall into this category. In fact, the tasks that can be moved already are being moved, but there are still a lot of tasks the CPU is responsible for that don't make sense to move to the GPU; the GPU would perform very badly on them, if it could do them at all.

It's not that the technology hasn't moved forward, but rather that GPU tech has massively moved forward at undertaking exactly these types of parallel task, to the point that GPUs do not, and never will, do the other types of task well.

And a whole load of those "other types of tasks" are what make up the game logic and AI that run on the CPU in a game.

I was talking more about the things that PhysX already does, just more integral to the design.
 
I think I may have interrupted the first decent thread on Wii U tech. Apologies.

Are we allowed to talk about costs? If so, what is the reason behind the Wii U controller's high cost?

The Eurogamer Wii U review posted on GAF implied it would be cheap, but I've been led to believe that it accounts for nearly half of the console's cost. If this has been discussed already, what is it that's contributing to the high cost?

edit: If there is a better thread for this question, please direct me; I'll be happy to post it there.

I think you could talk about component costs. What would be frowned upon would be arguing that less or more should have been spent on it, or that if they had spent less on the controller, they could have spent more on the base unit.

I personally don't know how much that panel cost, or the silicon backing it, and I think a proper tear-down would be really interesting.
 

Ashes

Banned
I think you could talk about component costs. What would be frowned upon would be arguing that less or more should have been spent on it, or that if they had spent less on the controller, they could have spent more on the base unit.

I personally don't know how much that panel cost, or the silicon backing it, and I think a proper tear-down would be really interesting.

Fair enough.

For reference, the eurogamer article implied this:

In a world where Chinese manufacturers can sell complete Android tablets with capacitive touch-screens for £50, it's safe to say that the Wii U GamePad won't be costing Nintendo too much to construct.

That makes sense to me. This doesn't [yet]:

techradar said:
Nintendo hasn't confirmed what it will charge for replacement GamePads, but the standalone tablet controllers will cost ¥13,400 in Japan, or around $172 (£106).

Source.
 
Yes, but then things like cloth physics, fluid simulation and particles are pretty much the narrow workload I was talking about. We're not talking about major parts of what would traditionally be the CPU workload anyway, which is the game logic, AI, etc.

GPGPU could be very useful for complex AI and pathfinding. Not to mention complex physics and world deformations that alter gameplay.

Just look at how the Cell processor is used, since it's much closer to a GPU than most other processors in how it processes data.

Complex AI like what we started to see with Spore, and what we could see in the Sims/SimCity/Elder Scrolls/GTA/etc. in the future, with emergent worlds and the "butterfly effect" being reflected in game, would definitely benefit from the way GPUs and Cell process data in multiple passes.
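For a rough idea of what that could look like in practice, here's a minimal sketch - CUDA, made-up names, a brute-force approach rather than anything an engine actually ships: pathfinding-style work expressed as a massively parallel grid relaxation instead of a branchy A* search. Each thread updates one cell of a distance field from its neighbours, and you repeat the pass until the field stops changing.

// One Jacobi-style relaxation pass over a 2D cost grid. Cells seeded with 0
// act as goals; repeating this kernel until nothing changes yields
// shortest-path distances to them. Embarrassingly parallel, unlike a
// conventional A* implementation.
__global__ void relaxDistanceField(const float* moveCost, const float* distIn,
                                   float* distOut, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    int i = y * w + x;
    float best = distIn[i];
    if (x > 0)     best = fminf(best, distIn[i - 1] + moveCost[i]);
    if (x < w - 1) best = fminf(best, distIn[i + 1] + moveCost[i]);
    if (y > 0)     best = fminf(best, distIn[i - w] + moveCost[i]);
    if (y < h - 1) best = fminf(best, distIn[i + w] + moveCost[i]);
    distOut[i] = best;
}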
 
So here are all the new GPU features (as exposed by D3D on the R700 at least) compared to previous consoles. There is some stuff that isn't here of course, some obscure (like the R700's support for Fetch Shaders which aren't exposed to either D3D or OpenGL) and some you may have heard of (the R700's tessellator).

I'm going to fill each of these in, explaining what they are, but it may take a while. Check back if this post is incomplete when you see it!

Shader Model 4.0: The biggest change is the addition of bitwise operators and native integers.

Geometry Shader: Inserts another shader into the pipeline alongside the vertex and pixel shaders. This allows you to write GPU shaders that create primitives (points, lines or triangles), for instance doing a shadow volume extrusion entirely on the GPU.

Stream out: Allows you to write directly to memory from a geometry or vertex shader, bypassing the pixel shader (and ROPs).
Alpha-to-coverage
8K textures
MSAA textures
2-sided stencil
general render target views
texture arrays
BC4/BC5
optional DirectCompute (CS 4.0)
full floating-point format support

Shader Model 4.1
cubemap arrays
extended MSAA
optional DirectCompute (CS 4.1)

Thanks Popstar. I'm personally interested in this. I believe that the extra GPU features will eventually be a major factor in Wii U games moving ahead of current-gen expectations.
 

NBtoaster

Member
Alpha-to-coverage: When using MSAA, you can convert an alpha value into a number of opaque and fully transparent fragments. Imagine you have a 4x MSAA buffer and a 50% transparent object. Instead of alpha blending you instead simply write the colour to 2 of the 4 fragments making up the pixel.

8K textures: Larger texture sizes.
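Roughly speaking, the hardware turns the alpha value into a per-pixel sample mask. A quick sketch of the idea (the numbers are purely an example, not any console's actual hardware path):

// Convert an alpha value into an MSAA coverage mask: with 4 samples,
// alpha = 0.5 lights up 2 of the 4 samples instead of doing a blend.
unsigned int alphaToCoverage(float alpha, int sampleCount /* e.g. 4 */)
{
    int covered = (int)(alpha * sampleCount + 0.5f);  // 0.5 * 4 -> 2 samples
    if (covered <= 0) return 0u;                      // fully transparent
    if (covered > sampleCount) covered = sampleCount; // clamp at fully opaque
    return (1u << covered) - 1u;                      // e.g. binary 0011 for 2 of 4
}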

Both PS3 and 360 can do alpha to coverage and 360 supports 8K textures.

Of course, while other consoles may be able to do some of these things, it's probably faster/less of a hack on the Wii U.
 
A launch Durango game is going to look better than anything the Wii U is capable of?
I don't know, man. There are some beautiful Wii games when up-rezzed to 1080p.

While this may have already been clarified before, Aegies specifically said it will look better "from a technical perspective". In other words, they will be pushing more polygons, effects, etc. It doesn't necessarily mean that it will look more appealing, which is more of a subjective matter.

Controllers are never sold below cost (or even at cost), so you can't really tell much from the retail price of a standalone gamepad except that it costs them less than ¥13,400 to manufacture.

That makes sense. As an example, I think the Wii-mote's manufacturing cost was less than $10.
 

PetrCobra

Member
Fair enough.

For reference, the eurogamer article implied this:



That makes sense to me. This doesn't [yet]:



Source.

The question is whether the display really makes up the majority of the controller's total cost. They made a big deal out of the technology that allows for the close-to-zero latency. The parts could be cheap to produce, or they could be more expensive than expected; we don't know (or does someone?).

And there are probably also some patents/licensed technologies involved, which would also be reflected in the cost.
 

Ashes

Banned
While this may have already been clarified before, Aegies specifically said it will look better "from a technical perspective". In other words, they will be pushing more polygons, effects, etc. It doesn't necessarily mean that it will look more appealing, which is more of a subjective matter.



That makes sense. As an example, I think the Wii-mote's manufacturing cost was less than $10.

He does say technically. I'll give you that one. :p

Controllers are never sold below cost (or even at cost), so you can't really tell much from the retail price of a standalone gamepad except that it costs them less than ¥13,400 to manufacture.

Yeah. I guess this makes sense too.
 

JordanN

Banned
There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM's.
Where does this idea come from? The whole point of eDRAM is to move data at faster than usual speeds (compared to, say, the 2GB of main RAM), and IBM seems to agree with that as well.

IBM Press said:
IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.
http://www-03.ibm.com/press/us/en/pressrelease/34683.wss
 

wsippel

Banned
Meltdowns, meltdowns. Get your meltdowns.

-----

Anyway, can someone elaborate on how exactly "GPGPU" is going to be the Wii U's panacea, as has been made out on the chalkboard?

From reading B3D, the impression I'm getting is that it likely isn't going to be that much use.

There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM's.
The current discussion on B3D is just a few people who have absolutely no idea how the Gamecube and Wii memory architecture worked, making wild and completely baseless assumptions.
 

Datschge

Member
You also have to think about GPGPU being used as more than just an enhancement - as something integral to the actual game. It will happen next gen, maybe more so on Wii U if that is all the console can really do.

GPGPU on Wii U may well be seen as a testbed for the kind of system AMD is trying to eventually implement with Fusion/heterogeneous computing in x86/x64 space - a unified address space for both CPU and GPU and all those things. Support for this is mostly a question of software maturity, which again depends on how widespread its use is. So I expect it to be used well if similar concepts are used in the Xbox720/PS4, AMD finally sells APUs of that kind, and the technology (HSA/Heterogeneous System Architecture) is widely embraced. If that doesn't happen, I don't expect it to see much use in Wii U games alone either.
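For what it's worth, the basic idea of a unified address space is easy to illustrate with CUDA's managed memory - used here purely as an analogy, this is not Wii U or HSA code: one allocation that both the CPU and the GPU read and write through the same pointer, with no explicit copies between separate memory pools.

#include <cstdio>
#include <cuda_runtime.h>

// GPU kernel that works directly on the shared allocation.
__global__ void scale(float* data, int n, float k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= k;
}

int main()
{
    const int n = 1024;
    float* data = nullptr;
    cudaMallocManaged(&data, n * sizeof(float));     // one allocation, visible to CPU and GPU

    for (int i = 0; i < n; ++i) data[i] = float(i);  // CPU writes through the pointer
    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);  // GPU reads/writes the same memory
    cudaDeviceSynchronize();                         // wait before the CPU touches it again

    printf("%f\n", data[42]);                        // CPU reads the result back directly
    cudaFree(data);
    return 0;
}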
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
GPGPU on Wii U may well be seen as a testbed for the kind of system AMD is trying to eventually implement with Fusion/heterogeneous computing in x86/x64 space - a unified address space for both CPU and GPU and all those things. Support for this is mostly a question of software maturity, which again depends on how widespread its use is. So I expect it to be used well if similar concepts are used in the Xbox720/PS4, AMD finally sells APUs of that kind, and the technology (HSA/Heterogeneous System Architecture) is widely embraced. If that doesn't happen, I don't expect it to see much use in Wii U games alone either.
I largely share your meta-view on the matter - I think WiiU might be a harbinger of future AMD (and not only theirs) APU technologies. Not unlike how Xenos was a harbinger of future AMD (and not only theirs) GPU tech.

BTW, marcan has a few things to add on the CPU front:

@AminKhajehnassi @eubank_josh we suspect a cross between the 750CL and the 750FX but it's unclear. The SMP is new anyway.
 

pottuvoi

Banned
I've honestly not seen it used anywhere in games for anything other than particle rendering yet. (At least outside gaming it's found a home with SHA256 hashing & Bitcoin).
Some more well-known usages in games:

Battlefield 3, tiled deferred rendering (all direct lighting is done by compute shaders).
Just Cause 2, post processing and ocean rendering.
Civilization V, texture decompression.
Rage, texture conversion to DXT.
Crysis 2, post processing, possibly lighting.

There should be many more; post processing especially seems to be one of the more obvious and 'easy' things to use GPGPU for.
 