Please don't do this to this thread.
brain_stew said: It's a compressed, downscaled mirror of the TV screen; what you are seeing is normal.
I'm a bit miffed as to why Nintendo went with compression there - at the supported ranges, the latest wireless HDMI devices now support full 1080p with 3D and surround uncompressed, so with a 480p stereo target, compression feels like overkill.
Apophis2036 said: How so? - WiiU - Tri Core IBM OoOE CPU clocked at 1.2 GHz.
Well, if you want to play paper specs, the 360's CPU is roughly 6x more powerful than the Wii U's (the PS3's would be closer to 18x).
We're talking about the Wii U's tech in relation to where it will stand when the other next-gen consoles arrive.
I'm out of all the Wii U tech threads anyway until someone has evidence of how powerful the GPU is and how fast the eDRAM is clocked; without that info these threads are circular arguments.
400 - 500 GFLOP vs a 1.8 TFLOP GPU, since when has that been a 300% increase lol...
Just over 4x at most for the GPU and 2x the RAM. No one knows what speed the PS4's CPU will be clocked at; if it's anything like the 720's, it will be far lower than the 3.2 GHz CPUs found in the PS3/360 though.
400 to 1800 = 350% increase
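(To spell out the arithmetic, using the figures quoted above: (1800 − 400) / 400 = 3.5, i.e. a 350% increase, or 4.5× the original number. Taking the 500 GFLOPS end of the range instead gives (1800 − 500) / 500 = 2.6, a 260% increase.)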
I must be missing something. Alright, I'm leaving the thread...
The only thing you're missing is that Apophis2036 is as bad at math as he is at contributing to tech threads.
HAHAHAHAHA
GO FUCK YOUR MOTHER.
Yes, perma-ban me.
Meltdowns, meltdowns. Get your meltdowns.
-----
Anyway, can someone elaborate on how exactly "GPGPU" is going to be the Wii U's panacea, as has been made out on the chalkboard?
From reading B3D, the impression I'm getting is that it likely isn't going to be that much use.
There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM.
Personally I hope we see some advancement in GPGPU use on all platforms next gen, PC included - only, of course, where the use of such resources makes sense and doesn't greatly affect graphics performance.
Planetside 2 can use it a little to improve particles on a CPU-limited set-up, but on my set-up enabling it does drop the framerate by a few fps.
I've honestly not seen it used anywhere in games for anything other than particle rendering yet. (At least outside gaming it's found a home with SHA256 hashing & Bitcoin).
I honestly can't see it getting much general use though. The workloads stream processors are suitable for are still very narrow, and there's no escaping the fact that any resources the GPU spends on non-graphics-related functions are resources taken away from what a GPU is good at: graphics and rendering.
There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM.
Maybe in comparison to what it could have been, but it's not likely to literally be anywhere close to the main memory bandwidth.
Mods have shown themselves to be quick on the draw these past couple of months whenever Wii U rumor threads get posted. Maybe one of them can spare the time to moderate about 75% of the last three pages as well, please?
EatChildren? Anyone?
Anyway, can someone elaborate on how exactly "GPGPU" is going to be the Wii U's panacea, as has been made out on the chalkboard?
The Wii U CPU has a very, very weak and slow vector unit - compared clock-for-clock, whatever, it's really bad. The GPU could sacrifice some processing time to bring the Wii U just under par in that area. That goes for physics/gravity, sound (offloaded to a DSP in the Wii U's case), time transforms, and a ton of other gameplay modifiers.
Isn't all the PhysX stuff on PC GPGPU based? That's been used for things like cloth physics, fluid simulation, particles, and others.
eg:
http://www.youtube.com/watch?v=EWFkDrKvBRU
Yes, but then things like cloth physics, fluid simulation and particles are pretty much the narrow workload I was talking about; we're not talking about major parts of what would traditionally be the CPU workload anyway, which would be the game logic, AI, etc.
There will be interesting uses, some of which will slightly broaden that narrow workload more as developers find tricks people haven't thought of yet.
But that is a million miles away from seeing the majority of what is usually the domain of the CPU being done on the GPU; those expecting that kind of offloading are going to be disappointed.
That's too bad. I have an actual serious post I would like to make but it's pretty pointless with what's happened to this thread.
You also have to think about GPGPU being used as more than just enhancements, but as integral to the actual game. It will happen with next gen, maybe more so on Wii U if that is all the console can really do.
The thing is that in order for it to make sense to move a calculation from the CPU to the GPU, it has to be a fairly small task on a largely parallelized dataset, with little to no branching - think of shaders acting on vertices in a scene, or pixels in a framebuffer. These are the workloads a GPU is designed to do, and to do very quickly.
Not all tasks fall into this category; in fact, the tasks that can be moved already are being moved, but there are still a lot of tasks the CPU is responsible for that don't make sense to move to the GPU - the GPU would perform very badly on them, if it could do them at all.
It's not that the technology hasn't moved forward, but rather that GPU tech has moved massively towards undertaking these types of parallel task, to the point that GPUs do not, and never will, do the other types of task well.
And a whole load of those "other types of tasks" are what make up the game logic and AI that run on the CPU in a game.
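To make that concrete, here's a minimal sketch of the kind of branch-free, data-parallel job that maps well onto a GPU: updating thousands of independent particles in one pass. It's written in CUDA purely for illustration (the Wii U obviously doesn't run CUDA, and the kernel and names here are made up for the example).

```cuda
#include <cuda_runtime.h>

// One thread per particle: a tiny amount of work, no branching,
// every particle updated independently - exactly the shape of
// workload stream processors are built for.
__global__ void updateParticles(float3 *pos, float3 *vel,
                                int n, float dt, float3 gravity)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;              // guard for the last partial block

    vel[i].x += gravity.x * dt;
    vel[i].y += gravity.y * dt;
    vel[i].z += gravity.z * dt;

    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

// Launch enough 256-thread blocks to cover every particle, e.g.:
// updateParticles<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt, g);
```

Compare that with game logic or AI decision-making, which is full of branches and pointer chasing - exactly the shape of work that doesn't fit this model.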
I think I may have interrupted the first decent thread on Wii U tech. Apologies.
Are we allowed to talk about costs? If so, what is the reason behind the Wii U controller's high cost?
The Eurogamer Wii U review posted on GAF implied it would be cheap, but I've been led to believe that it's nearly half the cost of the console. If this has been discussed already: what is it that's contributing to the high cost?
edit: If there is a better thread to direct this question to, please point me to it; I'll be happy to post it there.
I think you could talk about component costs. What would be frowned upon would be arguing that less or more should have been spent on it, or that if they had spent less on the controller, they could have spent more on the base unit.
I personally don't know how much that panel cost, or the silicon backing it, and I think a proper tear-down would be really interesting.
In a world where Chinese manufacturers can sell complete Android tablets with capacitive touch-screens for £50, it's safe to say that the Wii U GamePad won't be costing Nintendo too much to construct.
techradar said: Nintendo hasn't confirmed what it will charge for replacement GamePads, but the standalone tablet controllers will cost ¥13,400 in Japan, or around $172 (£106).
So here are all the new GPU features (as exposed by D3D on the R700 at least) compared to previous consoles. There is some stuff that isn't here of course, some obscure (like the R700's support for Fetch Shaders which aren't exposed to either D3D or OpenGL) and some you may have heard of (the R700's tessellator).
I'm going to fill each of these in, explaining what they are, but it may take a while. Check back if this post is incomplete when you see it!
Shader Model 4.0: The biggest change is the addition of bitwise operators and native integers.
Geometry Shader: Inserts another shader stage into the pipeline alongside the vertex and pixel shaders. This allows you to write GPU shaders that create primitives - points, lines or triangles - doing a shadow volume extrusion entirely on the GPU, for instance.
Stream out: Allows you to write directly to memory from a geometry or vertex shader, bypassing the pixel shader (and ROPs).
Alpha-to-coverage
8K textures
MSAA textures
2-sided stencil
general render target views
texture arrays
BC4/BC5
optional DirectCompute (CS 4.0)
full floating-point format support
Shader Model 4.1
cubemap arrays
extended MSAA
optional DirectCompute (CS 4.1)
That makes sense to me. This doesn't [yet]:
Alpha-to-coverage: When using MSAA, you can convert an alpha value into a number of opaque and fully transparent fragments. Imagine you have a 4x MSAA buffer and a 50% transparent object. Instead of alpha blending you instead simply write the colour to 2 of the 4 fragments making up the pixel.
8K textures: Larger texture sizes.
A launch Durango game is going to look better than anything the Wii U is capable of?
I don't know, man. There are some beautiful Wii games when up-rezzed to 1080p.
Controllers are never sold below cost (or even at cost), so you can't really tell much from the retail price of a standalone gamepad except that it costs them less than ¥13,400 to manufacture.
Fair enough.
For reference, the Eurogamer article implied this:
Source.
While this may have already been clarified before, Aegies specifically said it will look better "in a technical perspective". In other words, they will be pushing more polygons, effects, etc. It doesn't necessarily mean that it will look more appealing, which is more of a subjective matter.
Controllers are never sold below cost (or even at cost), so you can't really tell much from the retail price of a standalone gamepad except that it costs them less than ¥13,400 to manufacture.
That makes sense. As an example, I think the Wii-mote's manufacturing cost was less than $10.
There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM.
Where does this idea come from? The whole point of eDRAM is to move data at faster-than-usual speeds (compared to, say, its 2GB of main RAM), and IBM seems to agree with that as well:
IBM Press said: IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.
http://www-03.ibm.com/press/us/en/pressrelease/34683.wss
From reading B3D, the impression I'm getting is that it likely isn't going to be that much use.
There was also some interesting speculation that seemed to imply that eDRAM may not be any sort of magic bullet either... that its bandwidth could be as poor as the system RAM.
The current discussion on B3D is just a few people who have absolutely no idea how the GameCube and Wii memory architecture worked, making wild and completely baseless assumptions.
You also have to think about GPGPU being used as more than just enhancements, but as integral to the actual game. It will happen with next gen, maybe more so on Wii U if that is all the console can really do.
GPGPU on Wii U may well be seen as a testbed for a system that AMD tries to eventually implement with Fusion/heterogeneous computing in x86/x64 space, having a unified address space for both CPU and GPU and all those things. Support for this is mostly a question of software maturity, which again depends on how widespread its use is. So I expect it to be used well if similar concepts are used in the Xbox 720/PS4, AMD finally sells APUs of that kind, and the technology (HSA/Heterogeneous System Architecture) is widely embraced. If that doesn't happen I don't expect it to see much use by Wii U games alone either.
I largely share your meta-view on the matter - I think the Wii U might be a harbinger of future AMD (and not only theirs) APU technologies. Not unlike how Xenos was a harbinger of future AMD (and not only theirs) GPU tech.
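For anyone wondering what a unified CPU/GPU address space actually buys you, here's a tiny sketch of the idea using CUDA's managed memory - just an illustration of the concept, not what AMD's HSA or Nintendo's APIs actually expose, and the kernel and variable names are made up for the example.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

__global__ void scaleScores(float *scores, int n, float factor)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) scores[i] *= factor;
}

int main()
{
    const int n = 1024;
    float *scores = nullptr;

    // One allocation visible to both CPU and GPU - no explicit
    // copies between separate host and device buffers.
    cudaMallocManaged(&scores, n * sizeof(float));

    for (int i = 0; i < n; ++i) scores[i] = float(i);        // CPU writes

    scaleScores<<<(n + 255) / 256, 256>>>(scores, n, 2.0f);  // GPU reads/writes
    cudaDeviceSynchronize();

    printf("scores[10] = %f\n", scores[10]);                 // CPU reads the result
    cudaFree(scores);
    return 0;
}
```

The point is that CPU and GPU code hand the same pointer back and forth instead of shuffling data between two memory pools, which is the software-maturity question mentioned above.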
@AminKhajehnassi @eubank_josh we suspect a cross between the 750CL and the 750FX but it's unclear. The SMP is new anyway.
I've honestly not seen it used anywhere in games for anything other than particle rendering yet. (At least outside gaming it's found a home with SHA256 hashing & Bitcoin).
Some more well-known usages in games.