my attempt at coloring:
red = eDRAM
pink = L1/L2 cache
black = logic
light blue = Audio DSP
white = IO
blue = power connectors?
green = ARM core / PCB
So, are we sure about the 32+4 eDRAM configuration? When I saw it I thought it would be more like 28+4, because I thought it was confirmed that there is 32 MB of eDRAM in total on that die.
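For what it's worth, the 28+4 reading is just arithmetic on the reported total. A throwaway check (the 32 MB total is the figure from this thread; the splits are the two candidates being debated):

```python
# Which eDRAM split is consistent with a reported 32 MB total on the die?
splits = {"32+4": 32 + 4, "28+4": 28 + 4}
for name, total in splits.items():
    print(f"{name} = {total} MB total")
# Only 28+4 sums to 32 MB; 32+4 would imply 36 MB of eDRAM on the die.
```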
COOL im from 954 too.....

Can everyone calm down until we get a definitive count on the flops? (zombie's did this thread no favors :/)
In regards to the Wii U's max power draw, is it not possible that system can draw a higher constant wattage but no game has yet demanded it?
I'm having a real hard time wrapping my head around that kind of performance at that power draw.
all of this and we STILL have no idea what the gpu is and what it can do, i think the reality is this GPU is VERY custom.
Yeah you're right, the GS is just a rasterizer, still I thought the Wii U CPU was similar to one of the cores in the 360 CPU....if it's 12 Gigaflops it's much weaker...although of course there's a different design philosophy behind the whole system (GPGPU).
Using FLOPS as a metric, Xenon was something like 90 GFLOPS iirc. Much weaker? A Xenon core is only 20% faster than a completely unmodified Broadway, and the Wii U is using 3 derivative cores of it that, if the GPU is any indication, are surely heavily modified.
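Figures like "~90 GFLOPS" come from simple peak math: cores × clock × FLOPs per cycle. A quick sketch (the 3 cores / 3.2 GHz figures are Xenon's published specs; the FLOPs-per-cycle count is an assumption, since quoted peaks differ depending on whether dot products and the scalar FPU are counted):

```python
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    """Theoretical peak throughput: cores x clock x FLOPs issued per cycle per core."""
    return cores * clock_ghz * flops_per_cycle

# Xenon: 3 cores at 3.2 GHz; a 4-wide SIMD multiply-add counted as 8 FLOPs/cycle.
print(peak_gflops(3, 3.2, 8))  # ~76.8 GFLOPS; more generous counting gets to 90+
```

This is why two posters can both be "right" while quoting different numbers for the same chip: the formula is fixed, the FLOPs-per-cycle accounting is not.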
They actually ran their library of software - not just FIFA and Mass Effect. They seemed to think it wasn't likely to change in future. Besides, taking the 33W for gaming and then adding peripheral charging etc brings us close to the max anyway iirc.
Xenos can do GPGPU iirc.
Although I can't remember what the conclusion of those discussions were - i.e. 70W PSU = ? max power draw?
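One way to frame the 70W question, purely as a sketch (the 33W gaming draw is the figure from this thread; the peripheral-charging number is a guess based on nominal USB 2.0 port power, 5 V × 0.5 A per port):

```python
rated_w = 70                  # PSU label: a ceiling, not the typical draw
measured_game_w = 33          # reported in-game consumption
usb_charging_w = 2 * 2.5      # guess: two USB 2.0 ports at 5 V x 0.5 A each
nominal_headroom = rated_w - (measured_game_w + usb_charging_w)
print(nominal_headroom)       # 32.0 W of nominal headroom left on the label
```

Supplies are usually rated well above sustained draw, so the window actually available to games would be smaller than this raw difference.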
I thought Wii U couldn't do GPGPU?
There is very likely an up-to-10W window that games could potentially push extra. It's not unusual either, as it happens on titles that really push systems to the metal.
Yeah i'd share that view Antonz.
With the Xbox 360 and PS3 we certainly saw games that consumed more power than others.
The ram isn't a bottleneck anymore (actually never was) so try again (you can only make a case for the cpu being a bottleneck of sorts).
It was confirmed in September that the Wii U has a GPGPU.

It's the viability of its usage that is in question.
Man beyond3d members are on their own planet when it comes to Wii U.
That was always the question. It was known by tech members that the Wii U had compute shader support from day one, since we had the base GPU core and DX 10.1 support.
nothing particularly damning it seems, just that we shouldn't put too much hope into the idea of fixed functions bolstering the gpu performance.
What do you mean? What are they saying about all this new info?
i love nintendoland, one of the most honest places you can go. I have ONLY taken nintendo's word for this console and what it will be able to do. like ive said before, Iwata is one of the most honest CEOs you will find in the gaming industry. so yeah i know Wii U will be fine and will create beautiful games. just cant wait till E3, so all of the numbers stuff will be irrelevant when we see the games the console will be running.... just the way nintendo wants it to be.
Worked Nicely for the GameCube
Wish factor 5 still existed to give WiiU a go.
basically what they have been saying since the beginning. there is a way to say the console isnt super powerful, but they basically now are saying its AT BEST 1.5x the 360. some still believe its a 360, nothing better, saying the console overall is one big bottleneck. when is the last console nintendo made from the ground up that had a significant bottleneck in architecture and design? basically they are trying to take one aspect of the system and doom the system as a whole due to that one thing. what nintendo stated in their last q4 earnings report really put my mind at ease.

You are painting with a pretty broad brush, seeing as a lot of the members in this thread also post on b3d. Pretty sure the guys that started this thread post over there also.
yeah bayonetta has a chance to be the FIRST game to show what the finalized dev kit is capable of. i mean if thats real time on Wii U thats REALLY good for a so called gimped bottleneck system... according to beyond3d
What? You can't see anything in that clip, it's almost completely blacked-out.
Both the director and producer confirmed that to be running in real time on Wii U hardware, but it's too dark, too low-quality, and too short to make any definitive statements about the quality of the entire game.
people keep saying that, yes its a given i cant sum up the whole game off of that clip. i can though look at that clip in a vacuum and say thats REALLY damn good. if the full game is on this level we really have nothing to worry about.
This is a tech thread, not a defense force thread. Please stop posting GIFs.
its supposed to be dark.... im looking at the way the scene is rendered and the texture of the monster. i guess i see something most people dont.
meaning there are no fixed functions on the chip.
Exophase said: It's fun watching the NeoGAF thread.. a lot of people seem to be clinging to this idea that there's all this fixed function graphics magic to make up for the relatively small amount of die area that's dedicated to shaders. What they're forgetting is that this isn't just a GPU, but it's an entire SoC sans CPU. So for instance it needs various peripheral interfaces (SD, USB, NAND) and may include fixed video decoders as well.
If we're going to be talking about this potential I feel it's worth asking - what added fixed function hardware can help a remotely modern GPU design? I figure anything in the critical path for shaders is going to be a problem to go out of the shader array for, or will need additional hardware/software solutions to decouple it like TMUs are.
Well, seeing that the Wii U GPU even after this reveal is still very much a mystery (until Fourth Storm returns with more inside info), at least we know the Wii U can already do this in real time:
I see it as well.
Considering what we are seeing in relation to the supposed "weak cpu", I think we should also factor in whether we think Takeda is a liar or not when he said that "There are people saying that the CPU is weak but that is not true."
Please don't start posting gifs in a thread about the gpu die photo.
could u edit your quote so there aren't gifs on the new page?

Sorry, I was doing just that the moment I realized it.
It's a higher speed for its size, not overall. But it should be extremely low latency even relative to the 32 MB MEM1.

Then why does the diagram from Chipworks refer to the larger pool as "slower"? Same reason (for its size, not overall)? I don't mean to sound antagonistic--I'm genuinely curious.
I'll post this here as it seems to have been forgotten due to being at the bottom of the page.
Apologies. IMO he meant the nature of the macro is slower (lower bandwidth) but you've got enough of it that the bandwidth adds up to more than that of the smaller pool. It's still great to have that extra smaller pool, and the latency should be pretty fantastic.
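"Slower macro, but the bandwidth adds up" is just parallelism: aggregate bandwidth scales with how many macros you can read at once. A sketch with hypothetical numbers (the 550 MHz clock is the commonly rumored GPU clock; the macro count and 256-bit ports are made up for illustration, not taken from the die photo):

```python
def aggregate_gbs(macros, port_bits, clock_mhz):
    """Total bandwidth in GB/s = macros x (port width in bytes) x clock."""
    return macros * (port_bits / 8) * clock_mhz * 1e6 / 1e9

# Hypothetical: four macros with 256-bit ports at 550 MHz vs a single such macro.
print(aggregate_gbs(4, 256, 550))  # 70.4 GB/s in aggregate
print(aggregate_gbs(1, 256, 550))  # 17.6 GB/s for one macro on its own
```

So a pool built from individually "slower" macros can still out-bandwidth a smaller pool in aggregate, while the small pool keeps the latency advantage.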
Its absence from the supposed Wii U spec sheet might mean it's there only for BC, though.