You gotta ask Microsoft/Sony but I think it's pretty clear it's pure calculating power.

10x better at what? See, there's the point of the thread.
I don't want to read 6000+ posts so can someone please summarize what is known?
There are other threads for hearsay and groupthink. The last few pages of this thread are an oasis of logic and reason in a sea of sh*t. Sh*t like this post I just quoted.

Hahahaha.
That's, literally, the BEST summation of this thread on the last few pages.
Please brace for those who still feel there are some Nintendo-exclusive techniques that will make the Wii U incredibly overpowered.
High level summary: Nintendo and AMD customized the hell out of this thing.
I think he means underpowered, similar to how the Wii was back in 2006.
The sheer power difference between Wii U and PS4/XBO will make ports really hard or downright impossible for the majority of devs. This isn't debatable.
You seem to know something everyone else doesn't. The rest of us don't know the "pure calculating power" of the Wii U.
Yep, that's correct.

Same here. Btw, someone mentioned you dug up something on Latte having UVD. Is this true? If so, that block has gotta be somewhere and I have an idea that might hinge on it.
Do you really think Wii U is closer to PS4/XBO than 360/PS3 in power? I think you'll be extremely disappointed if so.
Does anyone really "know" anything?
I'm not saying I know Wii U's exact power but I know what the more likely answer is.

I think you misunderstood what I'm saying.
Do you ever actually contribute to this thread?
I didn't think so.
(baseless conjecture alert!)
I still think 320 ALUs makes the most sense; 160 with all that memory feels weird to me. I mean: why?
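For scale, a hedged back-of-the-envelope: assuming the widely reported ~550 MHz clock for Latte and the usual 2 FLOPs per ALU per cycle (one multiply-add), the two candidate ALU counts work out as follows. The clock figure is an assumption from contemporaneous reporting, not something established in this thread:

```python
# Peak-throughput sketch for the two rumored ALU counts.
# Assumptions (not confirmed here): Latte clocked at ~550 MHz,
# and each ALU retiring 2 FLOPs per cycle (one multiply-add).

CLOCK_HZ = 550e6          # assumed GPU clock
FLOPS_PER_ALU_CYCLE = 2   # a multiply-add counts as 2 ops

def peak_gflops(alu_count):
    """Peak single-precision GFLOPS for a given ALU count."""
    return alu_count * FLOPS_PER_ALU_CYCLE * CLOCK_HZ / 1e9

for alus in (160, 320):
    print(f"{alus} ALUs -> {peak_gflops(alus):.0f} GFLOPS")
# 160 ALUs -> 176 GFLOPS
# 320 ALUs -> 352 GFLOPS
```

For reference, Xenos's commonly cited 240 GFLOPS sits between those two figures, which is part of why the ALU count matters so much to this debate.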
Where's the connection between relative power and interest in chip design? Should we not be interested in Xbox One's GPU or the PS4's GPU because they're so underpowered in comparison to GPUs that have been on the market for a while now in the PC space?
See, there's really no logic to this idea. What's interesting for people here is a GPU that's unexplained. We know loads about the other consoles' GPUs; what's the point in speculating about something you already know?
But of course, nevertheless, we still have the same people coming into this thread with their console wars attitudes. My advice to Heavy and the likes is if you aren't interested in the topic at hand then feel free to not be here; you aren't contributing anything worthwhile.
Can't you have this console wars discussion elsewhere? This thread isn't for this kind of thing. Nobody cares that Wii U isn't as powerful as Xbox One or PS4, or any ridiculously ill-conceived multiples you or anyone else has come up with (especially ill-conceived considering even Xbox One and PS4 aren't exactly close in power themselves).
I'm not saying I know Wii U's exact power but I know what the more likely answer is.
If Wii U isn't closer to PS4/XBO, knowing Wii U's power would be moot for the exact reasons downporting would be hard.
Neither system is ten times more powerful than its predecessor. And a lot of what those systems have over their predecessors is apparently wasted on useless OS features most users will never, ever touch.
Like a lot of others, I gave up.
Too many people are entirely hostile and passionate about this thing, and perhaps unrealistically so.
I think you should tell Microsoft that*.
From the outside looking in (I know nothing about this) I'm wondering why even bother when we know it's underpowered. The Wii U is just slightly more powerful than the 360 and PS3. Why waste any significant time at all trying to analyze its GPU? It's like writing a thesis for your PhD on a Transformers film, in the sense that you've written this incredible, detailed, well-researched paper for an incredibly crappy film.
I think it's more like the Wii U is the girl next door: while she's no supermodel, she's unique and mysterious, and for some reason it's fun trying to figure her out.
Ignoring the fact that not a single spec is 10x that of the Wii U... yeah, looks like we still need this thread, as we keep getting ignorant made-up posts like this.
I don't think 10x is literally "10x everything" but it all adds up.
The power to do 1080p, advanced physics processing, post-processing, the power to process larger textures, a strong tessellation unit, etc., all at once.
It doesn't make sense, either. People should realize by now that the playing field for technological pissing contests is getting smaller every day.

It's so weird how it's always the same 3-4 people shitting up Wii U technical threads.
If you can show me Wii U/PS3/360 doing that I'll surely retract such statements.
Again, I can't answer this. You'll have to contact Microsoft/Sony for a definite answer of what 10x means (or someone with more knowledge can fill me in).

From knowledge of basic algebra I can tell you that if it's 10x then it is literally 10x everything.
If that is not what 10x means then please tell me how you are measuring 10x.
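To put numbers on the disagreement: using commonly cited peak shader-throughput figures, and only that one axis of "power," neither new console is a clean 10x over its predecessor. The GFLOPS values below are approximate press figures, and the PS3 entry covers RSX alone, ignoring Cell's SPEs:

```python
# Generation-over-generation ratios from commonly cited peak GFLOPS.
# All figures are approximate, and raw FLOPS is only one axis of "power"
# (bandwidth, CPU, and RAM budgets all scale differently).

CITED_GFLOPS = {
    "Xbox 360 (Xenos)": 240,
    "Xbox One": 1310,
    "PS3 (RSX only)": 192,
    "PS4": 1843,
}

xbox_ratio = CITED_GFLOPS["Xbox One"] / CITED_GFLOPS["Xbox 360 (Xenos)"]
ps_ratio = CITED_GFLOPS["PS4"] / CITED_GFLOPS["PS3 (RSX only)"]

print(f"Xbox One vs. 360: {xbox_ratio:.1f}x")  # 5.5x
print(f"PS4 vs. PS3:      {ps_ratio:.1f}x")    # 9.6x
```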
I don't think Wii's situation got any better when it was 480p. I'm not seeing it change much.

If XB1 and PS4 demand 1080p from the devs, Wii U would sit quite nicely with 720p downspec ports imo (if it were to get many). With regards to GPU feature-set it is of course much closer to those systems than Wii ever was to 360/PS3. Well, it wasn't actually even close and it still managed to get ports.
If you read this page, you would know I had to bring up PS4/XBO. It's only console wars if you make it to be (I'm not doing it however).
And the XBO/PS4 are close enough to follow with my downport example.
Ironically, it's posts like these that I think "damaged" this thread more than me or some other users did. I think it would be nice to actually follow the conversation before pointing fingers or hastily joining in.

Adds up to what? This thread isn't about proving the Wii U's superiority to other platforms. We're just trying to figure out exactly what to expect from the GPU, and the constant digressions into console wars malarkey are unhelpful.
It's almost comical how some people can't or refuse to grasp the fact that this is nothing like a Wii vs. PS360 situation in terms of capabilities.
The raw power gap is not quite as large (easily more than an order of magnitude vs. easily less than an order of magnitude) and the feature-set gap is closer still. Architecture and design philosophy are not as divergent either.
Wii U could still miss out on most ports due to business reasons, but that's not the point of this thread. This small handful of posters that get off on attacking anybody who doesn't talk down the Wii U as much as they do are the ones wasting their time, not the group investigating this mystery GPU.
Do you have a source for that? I think Ubisoft made a comment at one point to the effect of a $1 million cost for porting. It would be very interesting to know this. If the porting costs were under $2 million, that means flops like FIFA may have even been (barely) profitable, and games like CoD, AC3, and Batman, while not bringing forth windfalls, were solidly in the black. Unless Ghosts, AC4, and Arkham Origins sell worse than their predecessors, I don't see these companies dropping all support. If they do, it makes it that much harder to take advantage if the system does end up with a solid installed base down the line.

Activision or Ubisoft, who've been able to do all their AAA multiplats @ under 2 million USD, which makes selling ~60k units profitable.
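A quick sanity check on those numbers: if ~60k units covers a $2M port, the implied net to the publisher is about $33 per copy. The per-unit figure below is back-solved from the quoted numbers, not a known royalty rate:

```python
import math

# Break-even sketch for the porting-cost discussion. The $2M cost and
# ~60k-unit figures come from the quoted post; the per-unit net revenue
# is back-solved from them and is purely hypothetical.

def break_even_units(port_cost, net_per_unit):
    """Units that must sell before a port recoups its cost."""
    return math.ceil(port_cost / net_per_unit)

implied_net = 2_000_000 / 60_000          # ~$33.33 per copy
print(f"implied net per unit: ${implied_net:.2f}")

# At that rate, the rumored $1M Ubisoft port would break even at ~30k units.
print(break_even_units(1_000_000, implied_net))
```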
I've been reading, don't think there was any need to continue the nonsense that began this little detour from the topic at hand. Let alone bringing up the whole 10x idea which is no more a fact than the sky being purple.
Now you're trying to justify this "fact" by talking about some arbitrary level of performance that apparently denotes a factor of 10 (1080p, processing large textures, etc.). I'm not saying this to insult you, but you don't know what you're talking about.
Hey, I'm just saying PS4/XBO represents big leaps over their predecessor and the 10x alludes to that. I never said I knew for sure what 10x meant and you're within your power to take it up with MS/Sony personally. Don't shoot the messenger I guess.
I brought it up because I wanted to defend myself. If someone never replied, the conversation would have already been over.
With it not being even close to the same situation?
I may as well throw this out there rather than stew on it myself...
I wonder if Block P isn't UVD with blocks Q being whatever it is joesiv labeled blocks "H" and "G" on Llano.
Perhaps this might even make sense of the block labeled "TAVC" on Brazos. It's right next to the UVD and could be a consolidated block like many of the others. It might be an "Accelerated Video Codec" or something like that and basically be 2x Latte's Q blocks.
Where that puts shader export, I don't know. Maybe Block "I" now that I'm going with bgassassin in having Block D as the UTDP...
Edit: Damn, it's like the Shin'en thread spilled into here. :\
Do you have a source for that? I think Ubisoft made a comment at one point to the effect of a 1 million cost for porting. It would be very interesting to know this. If the porting costs were under $2 million, that means flops like FIFA may have even been (barely) profitable, and games like CoD, AC3, and Batman, while not bringing forth windfalls, were solidly in the black. Unless ghosts, AC4, and Arkham origins sell worse than their predecessors, I don't see these companies dropping all support. If they do, it makes it that much harder to take advantage if the system does end up with a solid installed base down the line.
Hard to tell. What's your take on block D by the way? I'd say D looks like some sort of (de)compressor - large SRAM pool for dictionaries, some dual port SRAM as work memory. Because (de)compressors apparently have both of those. But D doesn't look like any block in any known AMD GPU design as far as I can tell.
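To illustrate why a (de)compressor block would want exactly that memory mix — a big read-only pool for dictionaries plus a small dual-port working buffer — here's a toy static-dictionary decompressor. The scheme and every name in it are invented for the example; nothing here is known about block D:

```python
# Toy dictionary-based decompressor. DICTIONARY stands in for the large
# SRAM pool; `work` stands in for the small dual-port working memory
# the decompressor writes into while it expands the token stream.

DICTIONARY = [b"the ", b"block ", b"SRAM ", b"GPU "]

def decompress(tokens):
    """Tokens are ('lit', bytes) literals or ('ref', index) dictionary copies."""
    work = bytearray()
    for kind, value in tokens:
        if kind == "lit":
            work += value
        elif kind == "ref":
            work += DICTIONARY[value]
        else:
            raise ValueError(f"unknown token kind: {kind}")
    return bytes(work)

print(decompress([("ref", 0), ("ref", 3), ("lit", b"die")]))  # b'the GPU die'
```

Real hardware schemes (LZ77 variants, texture codecs) differ in detail, but the two-memory split — static dictionary plus scratch buffer — is the common shape.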
And it probably would have taken 3 or 4 TFLOPs to bury Wii U from a performance standpoint.
And whoosh. Oh well, I guess I'm done now.

z0m3le said: nothing you've just said has to do with tech.
This caught my attention (and I guess it's GPU related).
When I'm hearing zero mention of UE4 or Agni's running on Wii U (the tech involved, not the engine) I'm not sure why anyone would believe this.
I think if Wii U really was this "PS2 of next gen" that I'm sure many here would like to believe, why oh why is it that, years after the console was announced, all this next-gen stuff is showing up on their competitors and not Wii U?
I think 1 or 1.7 TFLOPs is enough to bury Wii U, since it's quite evident devs don't care enough to use Wii U for all these fantastic demos or games.
Please. The Wii U could have been packing a teraflop and it still would miss out on the vast majority of the games it's currently missing out on. And the multiplats would probably still be half-assed right now.
Nice work. These are the RV770 and Brazos pics I've been relying on. That was the best RV770 I found.

Does anyone have better versions of the R770, or a Brazos die shot? The R770 shot in the OP is pretty small, probably wouldn't be worth tweaking, and I didn't see a Brazos shot. Bigger the better, uncompressed ideal!
I've been using Photoshop for 18 years, so I could certainly help with any needed "art".
Alright, here's what I see as going on around the perimeter of RV770. It's a bit hard to tell where some blocks end and others begin, but this should more or less give some idea of where I'm coming from. I've labeled what must be the L2 cache blocks. The other blocks I've outlined are either memory controllers or ROPs. No way for me to say which is which.
http://u.cubeupload.com/Fourth_Storm/rv770annotated.jpg
Hmm, actually if I were to take a guess, I would say that the highlighted block all the way to the top right (with its small portion of brighter SRAM) would be a memory controller. Descriptions of the layout have memory controllers paired w/ the L2 and ROPs. Obviously the actual die is a bit messier, but if we are to go on that, then the block to the left of that separated block of L2 on top should be a memory controller. I don't think the ROPs and L2s interact with each other, so there would be no need for them to be in close proximity.
Edit: I've revised the annotation to reflect these thoughts.
Q is the display controller and analogous to Latte's F block. Both appear near the display phy and have 32 of those small SRAM banks.