About the only thing XBox 3 has which could very well be considered 3-4x WiiU would be main memory.
try harder.
On that note, AFAIK the only developer who has come out and said they had no issues with memory bandwidth was the developer for either Trine 2 or Nano Assault. I can't recall which one, but it was certainly one of the two.
Either way, neither developer's word holds any real weight. They're not big-name developers making complex 3D games and worlds; they make small, basic e-store indie titles. Trine 2 certainly isn't a graphically advanced game, nor is Nano Assault. So the fact they didn't have bandwidth issues or CPU issues means nothing.
Can you provide a source where a big name developer has said they've had no issues with the memory bandwidth of the Wii U? By big name I mean a 3rd party developer or publisher who makes complex 3D games such as COD, Mass Effect, or GTA: Capcom, EA, Ubisoft, etc.
As per above.
Also, given how bad the Trine 2 port on PS3 was, with AA on the menus and text, I wouldn't rate them as particularly competent. They're cool and made a great game, but they're not technical wizards.
"This thread has really turned to shit. Goodbye to the technical discussion. Thanks, anti-Nintendo crowd."
The knee jerk defenders are at least as bad. (Yeah, I'm still up.)
"Either way, neither developer's word holds any real weight. [...]"
Haha, I literally called this kind of post a few minutes before it happened.
"Also, given how bad the Trine 2 port on PS3 was, with AA on the menus and text, I wouldn't rate them as particularly competent. [...]"
Going by that logic we should discount the numerous devs that have had shoddy PS3 ports, which includes a good number of the larger ones.
You won't have 8 cores for games; how many you will have is arguable (rumours were saying 6), but it certainly won't be 8. Also, the WiiU has 3 cores plus an audio DSP.
Goodness. What the fuck happened here?
Time to get back to probably the most mysterious aspect of the Wii U.
What do you guys suspect will be the power output of the GPU? Using PC graphics card as an estimate, what are your conservative, optimistic, and most likely cases?
"Can we please get back to the interesting discussion on the various components and how they work?"
What new news exactly has come about that would provide any more insight? This thread was bumped after the spec leaks for Durango/Orbis rather than anything new about the Wii U. We have the same comments trotted out from Shin'en for the millionth time. Yay?
Wow... I've been reading this thread since it began, and I have to say, it has fallen far, FAR from what it was intended to be. This is the issue: how can you discount developers who've made games for the system because it doesn't fit your narrative or viewpoint?
Would it help if the Aliens guys said it again perhaps or would you not trust them either?
No one is arguing it's more powerful, or hell, even close in power to the other two, but to just write off developers because they don't fit your narrative is intellectually dishonest at BEST. We get it: the Wii U doesn't have "magic parts and algorithms" to make it uber powerful... but comments and crap like this have completely ruined the conversation that was going on from posters like Fourth Storm, Blu, etc.
If the current rumours are accurate: a factor of 4-6 in GPU performance (and a newer architecture as well), up to 7 in CPU performance, and around 6 in external memory bandwidth. (The latter factor is 15 rather than 6 for Orbis, but its memory setup is not comparable.)
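As a rough sanity check, here is that bandwidth arithmetic with the rumoured figures floating around this thread (all of them leaks, none confirmed; treat the exact numbers as assumptions):

```python
# Quick sanity check on the rumoured bandwidth multipliers.
# All three figures are leaks/rumours, not confirmed specs.
wiiu_bw    = 12.8   # GB/s - 64-bit DDR3-1600 (rumoured)
durango_bw = 68.0   # GB/s - 256-bit DDR3 (rumoured)
orbis_bw   = 176.0  # GB/s - 256-bit GDDR5 (rumoured)

print(f"Durango vs Wii U: {durango_bw / wiiu_bw:.1f}x")  # ~5.3x ("around 6")
print(f"Orbis vs Wii U:   {orbis_bw / wiiu_bw:.1f}x")    # ~13.8x (the "15" above)
```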
"On that note, AFAIK the only developer who has come out and said they had no issues with memory bandwidth was the developer for either Trine 2 or Nano Assault. [...]"
In the same interview, it's assumed they had no problems with the memory at all (including bandwidth):
And about the RAM: are other parameters than latency, such as bandwidth, favourable compared to current platforms?
Shin'en: I'm not able to be precise on that, but for us as a developer we see everything works perfectly together.
What's "pushing high horsepower engines" even supposed to mean?No offense, but Shin'en aren't known for pushing modern high horsepower engines. They havent ran into any problems because they're adjusted to working on a much smaller space.
Graphical performance of a card compared to the Wii U: I would say <5670 on GDDR5. 440GFLOP estimates on the GPU is what I'd go with. This will probably translate to 2x the 360's pure GPU performance. The eDRAM in each system is likely very different; it would be much too hard to say what the effect is without much more data.
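For what it's worth, the "2x the 360" part roughly checks out against the commonly cited 240 GFLOPS figure for Xenos; a quick back-of-envelope check, assuming that figure:

```python
# Back-of-envelope check of the "2x the 360" claim.
# Xenos is commonly cited at 240 GFLOPS:
# 48 shaders x 5 ALUs x 2 ops (MADD) x 0.5 GHz.
xenos_gflops = 48 * 5 * 2 * 0.5   # 240.0
wiiu_estimate = 440.0             # the figure quoted above

print(wiiu_estimate / xenos_gflops)  # ~1.83, i.e. roughly 2x
```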
"What's 'pushing high horsepower engines' even supposed to mean?"
IW Engine has high horsepower. See, look at the horsies!
"Where are you even getting this from? Most estimates peg it around 600GFLOPs. It seems like you're ignoring the other GPUs it's said to be close to, and the numerous other estimates."
From the actual chip. Somebody hearing that someone said something about something else doesn't make it likely. There is not enough space on the GPU for 600GFLOPs.
Bottom line: can the Wii U outperform the current gen machines or not?
Yes. Anyone who says otherwise is uninformed or trying to stir up shit.
You base this on what?
Reality.
Please provide a link when you make such accusations. I certainly don't remember anyone calling eDRAM "A Nintard wishful-thinking setup".
As per my previous post, they're a small indie developer who made a rather simplistic game. They're also heavily invested in Nintendo, given they only make titles for Nintendo's platforms.
Which assumes both next-gen consoles will not have a DSP, which we don't know. And also that DSP processing would be any substantial burden for modern CPU cores; on PC, CPU audio is already negligible.
True, if two cores on Durango are OS-reserved the difference would be less, but it's still double the cores at 1.33x the clock rate. Even if the Wii U CPU does substantially more instructions per clock (which I don't know that it does), that's pretty hard to overcome. And for the PS4 I have not heard of the OS taking up any cores.
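A naive core-count-times-clock comparison, assuming the rumoured 6 usable Jaguar cores at 1.6GHz and the leaked 1.24GHz Wii U clock (and ignoring IPC, which is unknown for both):

```python
# Naive core-GHz comparison, ignoring IPC (which is unknown).
# Rumoured: Durango 8 Jaguar cores @ 1.6 GHz, maybe 6 usable for games;
# leaked: Wii U 3 cores @ ~1.24 GHz.
durango_core_ghz = 6 * 1.6    # 9.6
wiiu_core_ghz    = 3 * 1.24   # 3.72

print(durango_core_ghz / wiiu_core_ghz)  # ~2.6x before IPC differences
```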
"While ikioi's post had a negative slant, I don't see how it wasn't technical discussion. Unless the thread is only intended for effusive praise of Nintendo's design choices."
It's political punditry thinly disguised as technical discussion. What information is contained in that post? "Game X has less AA, therefore the WiiU is inferior." So what? If I do a lazy port of an Android game to Windows, does that imply my PC is inferior to my phone?
"Graphical performance of a card compared to the Wii U: I would say <5670 on GDDR5. [...]"
Several things to clarify:
DX11 brought tessellation and many AA techniques; MLAA was the big thing AMD was pushing at the time of DX11. The Wii U will lack tessellation at least, and I think there are also some texture features. I don't know the entire DX11 API, but there is a difference between hardware that can run it and hardware that can't. OpenGL will not magically let you do things the hardware can't do.
The entire CPU is about the size of a single 45nm Atom core. There's not much to say about it except that it's tiny. The 750 core hadn't been in development at IBM for years, except when Nintendo wanted a new console. I'm sure Nintendo didn't pay IBM as much money as Intel sinks into CPU development, so we shouldn't compare it to a Core 2.
I think he's looking at the size of the HD 5670 (104mm²) and the transistor count (627 million) and assuming that the WiiU's GPU will be of a similar size and transistor count once you remove the eDRAM and any other extras on there (DSP, ARM CPU perhaps). He's then dropping the clock to 550MHz, which works out at 440Gflops. A bit conservative maybe, but overall pretty reasonable.
Personally I think 440Gflops or 528Gflops are the two most likely numbers (unless we take Matt's comment about the shader config being very unusual, in which case who knows what the exact number could be).
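Both of those numbers fall straight out of SPU count times clock, assuming the usual 2 ops (MADD) per SPU per cycle and the leaked 550MHz clock; a minimal sketch:

```python
# How the two candidate figures fall out of SPU count x clock,
# assuming 2 ops (MADD) per SPU per cycle and the leaked 550 MHz clock.
def gflops(spus, clock_ghz=0.55):
    return spus * 2 * clock_ghz

print(gflops(400))  # 440.0 GFLOPS (e.g. a 400-SPU, 5670-like config)
print(gflops(480))  # 528.0 GFLOPS (a 480-SPU config)
```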
Although nobody used those exact words, the idea that the eDRAM could alleviate issues with the slow RAM was brought up and dismissed with extreme prejudice many times.
Then the Durango specs leak: it has the same type of slow RAM and the same amount of eDRAM, even though it may be targeting a higher resolution and has more horsepower, meaning in essence it has proportionally less eDRAM than the WiiU, and less L2 cache per core, not proportionally but in absolute terms. And somehow that setup *does* alleviate issues with slow RAM via... magic?
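One way to make the "proportionally less" point concrete: bytes of eDRAM per output pixel, assuming 32MB in both machines (per the rumours) and assuming the Wii U targets 720p while Durango targets 1080p:

```python
# "Proportionally less eDRAM" made concrete: bytes of eDRAM per output
# pixel, assuming 32 MB in both consoles (rumoured) and assuming Wii U
# targets 720p while Durango targets 1080p.
EDRAM = 32 * 1024 * 1024  # bytes

print(EDRAM / (1280 * 720))    # ~36.4 bytes/pixel at 720p
print(EDRAM / (1920 * 1080))   # ~16.2 bytes/pixel at 1080p
```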
Like what?
The Mass Effect 3 port that runs worse than the Xbox 360 version.
Or the Assassin's Creed port?
etc
They're both DDR3, but that doesn't limit them to the same bandwidth and speed. The rumor says it's 68GB/s, 5x the Wii U bandwidth.
What reality are you in? If you've somehow built a bridge between our reality and a reality where Nintendo released a more powerful Wii U, help me cross over!
Please, don't leave me behind!
"How can you discount developers who've made games for the system because it doesn't fit your narrative or viewpoint?"
Technical discussion: data, e.g. die sizes, FLOPs; opinions from developers on bottlenecks (or lack thereof); etc.
Not technical discussion: screenshot-informed conjecture from dudes on the internet.
The rhetorical trick that certain people like to use is that whenever anyone says something negative about the WiiU, those people are to be trusted, especially if they aren't making WiiU games and have no horse in that race, even if it means they have no real technical knowledge and are not speaking from experience.
Meanwhile, if you have actual technical knowledge of the WiiU and a new dev kit, you are making a WiiU game, and therefore it's in your best interest to pimp the game and the system, and therefore nothing positive you say has merit.
It's a pretty clever rhetorical strategy that filters out anything positive by design.
Isn't this 5670 a 61W TDP card? The whole Wii U uses half of that running games. Look at the 5550: it's a 39W TDP part, and even that uses more power than the entire Wii U.
The whole Wii U console uses 33 watts running games...
The Wii U would be lucky to hit over 350 GFLOPS. The 40nm 5550 uses 39 watts and sits at 352 GFLOPS.
No offense, but Shin'en aren't known for pushing modern high horsepower engines. They haven't run into any problems because they're adjusted to working in a much smaller space.
What's "pushing high horsepower engines" even supposed to mean?
You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? DDR3-1600 on a 64-bit bus: 4 chips of 512 megabytes capacity, each on a 16-bit bus, 16-bit x 4 = 64-bit. At 1600 MT/s that's 1600 x 8 bytes = 12.8GB/s of bandwidth. This is in comparison to the next-gen consoles, which appear to be using 256-bit buses for their main memory pools. The Xbox 360 and PS3 also used a 128-bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller, there's no way the Wii U's RAM is on par with the Xbox 360's memory in the real world.
Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth-dependent; there's no point adding more ROPs unless you can feed them data fast enough.
I expanded on this using the Xbox 360 as an example: on the Xbox 360 the ROPs were integrated into the eDRAM die, and due to this configuration the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not appear to be similar; its bus is considerably slower.
Or the fact the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.
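The bandwidth figures above follow from the standard peak-bandwidth formula (bus width in bytes times transfer rate); a small sketch, with the Durango figure being a rumour:

```python
# Peak theoretical bandwidth from bus width and transfer rate.
def peak_bw_gbs(bus_bits, mt_per_s):
    return (bus_bits / 8) * mt_per_s / 1000  # bytes/transfer x MT/s -> GB/s

print(peak_bw_gbs(64, 1600))   # 12.8  - Wii U: 4 x 16-bit DDR3-1600 chips
print(peak_bw_gbs(128, 1400))  # 22.4  - 360's 128-bit GDDR3 @ 700 MHz DDR
print(peak_bw_gbs(256, 2133))  # ~68.3 - the rumoured Durango DDR3 setup
```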
ozfunghi said: Are we supposed to discredit the opinion of a developer because he is used to working on less powerful hardware? So while he can squeeze more out of this hardware, it somehow "doesn't count" because they are used to working on underpowered hardware? As if it's basically cheating, and therefore the results he gets, and by extension his opinion, are invalid?
It's always a binned part... They would likely be using some newer power gating, and they clocked the chip much lower than the retail 5670. It's more like comparing to the mobile 5750.
"Don't know why you're always so focused on direct TDP comparisons. For a start, that GPU (the 61W card) is using 2GB of GDDR5, and that stuff easily uses 20 or so watts by itself. Second, the GPU is clocked at 775MHz; downclock it to 550MHz (29% lower) and you'll lower the TDP by a larger percentage than you've lowered the clock speed. Finally, this is a custom part, which could easily have some legacy PC features removed in order to save power. The idea that a 156mm² GPU with around 1 billion transistors will be lucky to hit 350Gflops at 550MHz is quite mad."
You have 33 watts to power the whole system, do the math... Funny how you have a card that matches the specs, but you overlook it because it wrecks your math: the 5550.
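The downclock point can be made with a first-order power model: dynamic power scales roughly with frequency times voltage squared, so if the lower-clocked part also runs at lower voltage, power falls faster than clock. The voltages below are hypothetical, purely for illustration:

```python
# First-order dynamic power model: P is roughly proportional to f * V^2.
# If the downclocked part can also run at a lower voltage, power falls
# faster than clock. Voltages here are hypothetical, for illustration.
f_hi, f_lo = 775, 550        # MHz (retail 5670 vs leaked Wii U GPU clock)
v_hi, v_lo = 1.10, 0.95      # volts - assumed operating points

scale = (f_lo / f_hi) * (v_lo / v_hi) ** 2
print(f"clock at {f_lo / f_hi:.0%}, power at roughly {scale:.0%}")
# -> clock at 71%, power at roughly 53% under these assumed voltages
```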
I'd agree with this request. This thread is specifically for in-depth technical discussion, and not really the place for discussing how nice the Zelda demo does or doesn't look. Off-hand comments are fine, but big GIFs like those are very attention-grabbing, so inevitably draw the discussion towards them (particularly when quoted again).
Plus, you're drawing attention away from the boring spreadsheet I was about to post
Okay, so I realised that, even though the R700 architecture doesn't have a strict ratio between the number of SPUs and texture units, there are certain combinations which are possible, and certain ones which aren't. So, I put together a handy table on what the possible configurations are, and also what Gflops and texture fillrate values you'd get for each one:
On the left are the number of SPUs and on the top are the number of texture units (both using "Wikipedia numbering"). The table then shows you whether a configuration is possible or not, with a green Y meaning it is a viable configuration, and a red N meaning (surprise surprise) it isn't. Then, when you've found a configuration that's viable, you can look to the column of numbers on the right to tell you how many Gflops you've got, and then you can look to the row of numbers on the bottom to tell you your texture fillrate. The number of ROPs in R700 is unconstrained by the other specifications, it just has to be a multiple of 4 and you're good (and you can multiply the number of ROPs by 0.55 to get your pixel fillrate in GP/s).
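For anyone who wants to play with the numbers, here's the arithmetic along the table's edges, assuming the leaked 550MHz clock (the example 400/20/8 configuration below is just an illustration, not a claim about the actual chip):

```python
# The arithmetic behind the table, at the leaked 550 MHz clock. Whether a
# given SPU/TMU combination is viable is per the table itself; this just
# reproduces the numbers along its edges.
CLOCK_GHZ = 0.55

def rates(spus, tmus, rops):
    assert rops % 4 == 0, "R700 ROPs come in multiples of 4"
    return {
        "GFLOPS":         spus * 2 * CLOCK_GHZ,  # 2 ops (MADD) per SPU/cycle
        "texture (GT/s)": tmus * CLOCK_GHZ,
        "pixel (GP/s)":   rops * CLOCK_GHZ,      # the "x 0.55" rule above
    }

print(rates(400, 20, 8))
# {'GFLOPS': 440.0, 'texture (GT/s)': 11.0, 'pixel (GP/s)': 4.4}
```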
"You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? [...]"
Well, you're starting with facts, I'll give you that.
Stop pretending to know what you are talking about... just because you take what other people say and reorganize it before regurgitating it doesn't mean you understand it.
AFAIK, the 360's eDRAM was off-chip, connected by a very bandwidth-limiting bus.
I'm not completely knowledgeable on this. But then again, I'm not going to act like I am either.