darkazcura
Member
There were people who said (in this thread) that the Wii U is closer to the Xbox One than 360. And that's either nonsense or fanboy talk.
In terms of features it surely is, but in terms of power, it is not.
There were people who said (in this thread) that the Wii U is closer to the Xbox One than 360. And that's either nonsense or fanboy talk.
In terms of features it surely is, but in terms of power, it is not.

Has this been confirmed? We don't even know if Wii U has hull and domain shaders.
Aren't you the one who used your experience in Unity to discuss the potential of Frostbite 3 coming to the Wii U?
There were people who said (in this thread) that the Wii U is closer to the Xbox One than 360. And that's either nonsense or fanboy talk.
Has this been confirmed? We don't even know if Wii U has hull and domain shaders.
Nice ad hominem.
Nobody in their right mind has said such a thing, and I don't remember it being mentioned at all. But carry on.
Architecture wise it certainly is. Raw power wise it is closer to the previous gen consoles. If you don't understand that distinction there are plenty of places to read up on it, even here.
Has this been confirmed? We don't even know if Wii U has hull and domain shaders.
Just a tessellator hinted to be in the HD 2000/3000/4000 range. That would put it in the middle.
Even if they're not officially part of the company, I see their impartiality as roughly in the same place as those that are. They've clearly aligned themselves with Nintendo at this point and I treat their statements accordingly.

This is not the same at all.
Unlike the companies you posted, Shin'en don't work for Nintendo. They just publish the majority of their games on Nintendo hardware. They are completely third party, with no contractual ties.
Shin'en has seen success on Nintendo hardware since the GBA regardless of how every other third party dev was doing. They have nothing to lose or gain by doing this. They are just speaking from their own experience and view point.
What is with all of these people trying to write off their statement as just PR or marketing? I could see this hurting their rep more than helping it.
There were people who said (in this thread) that the Wii U is closer to the Xbox One than 360. And that's either nonsense or fanboy talk.
Yepp, I agree. It happens particularly often in threads about gaming hardware. And even more often in threads about Nintendo gaming hardware. I really hate all this "ha, my console is more powerful than yours" discussion. It's childish and annoying.
Especially when people are too fucking narrow-minded to even read the OP properly. Instead, they just take any opportunity to derail any thread with their stupid fanboy agendas.
Shin'en said the Wii U has newer tech, and programmers need time to utilize it properly. Something that's always been like that for every console ever made. There is no need to put that statement out of context and bring out PS4/Wii U hardware wars.
There is no fucking war. Both machines will be able to produce nice looking graphics. And if one of them isn't your cup of tea and doesn't meet your expectations - then fucking deal with it, but please stop derailing threads with your embarrassing, hidden fanboyism.
I really thought the notion that the Wii U will be comparable to the PS4/Uno had been squashed. Do some people truly believe that?
Architecture wise it certainly is. Raw power wise it is closer to the previous gen consoles. If you don't understand that distinction there are plenty of places to read up on it, even here.
This affirmation doesn't make any sense. The HD5000 series started development before the HD4000 series was finalized, and the HD4000 series was finalized well before going on sale.

M°°nblade said:
Yes of course, everybody knows that. But the date of finalisation of a chip says little about architecture choices.
2008 GPU tech customised in 2011 is still 2008 tech at its core, because it's based on 2008 GPU architecture with a DX10 feature set, which is different from actual 2011 GPU architecture with a DX11 feature set. I doubt anyone currently discussing this has enough technical expertise to form a valid opinion on whether the difference between a DX9 GPU and a DX10 GPU is bigger or smaller than the difference between a DX10 and a DX11 GPU. The same goes for x86 and PowerPC architecture.
See, if you say the gap between Wii U and XBO is smaller than the gap between 360 and XBO, meaning that Wii U is more capable than 360, that is correct. That person is not necessarily saying Wii U is more comparable to XBO than it is to 360 in terms of performance.
Reading comprehension.
It's cool how you didn't even need to read the post to have an opinion.
http://www.neogaf.com/forum/showpost.php?p=60021281&postcount=288
There's no need to start yelling. If you reread my post you'll see that I'm not downplaying anything, since I address people on both sides claiming it's either closer to the X360 or closer to the XBone. In my simple mind, the Wii U GPU is as many years and generations ahead of Xenos as the Xbone GPU seems to be ahead of the Wii U GPU, and it is as many DirectX feature set levels apart from the X360 as from the Xbone.

This affirmation doesn't make any sense. The HD5000 series started development before the HD4000 series was finalized, and the HD4000 series was finalized well before going on sale.
I don't understand this necessity to downplay whatever facts could benefit Nintendo. EVERY GPU ON THE MARKET STARTS ITS DEVELOPMENT FROM SOME BASE MODEL, WHICH OF COURSE IS AN OLDER MODEL, AND THEN EVOLVES FROM IT. AND EVERY GPU HAS A DEVELOPMENT CYCLE OF AT LEAST 2 YEARS (in the case of Nintendo, it may have been a longer development, since they go for ultra-customized designs that diverge from standard designs a lot more than what is normal even for console vendors).
Nobody's saying it's impossible.

I can't understand why the fact that Nintendo started to work on its GPU in 2008 makes it impossible for them to go higher than any 2008 design, when some of the GPUs you claim to be superior in terms of architectural advances started their development BEFORE that and upon OLDER DESIGNS.
Of course you're downplaying it. You're drawing a picture here where the Wii U GPU is old because its development started in 2007-2008, as if the Xbox One and PS4 GPUs were made in less than a year. That's a FALLACY.

M°°nblade said:
There's no need to start yelling. If you reread my post you'll see that I'm not downplaying anything, since I address people on both sides claiming it's either closer to the X360 or closer to the XBone. In my simple mind, the Wii U GPU is as many years and generations ahead of Xenos as the Xbone GPU seems to be ahead of the Wii U GPU, and it is as many DirectX feature set levels apart from the X360 as from the Xbone.
The real problem is that the (usual) people claiming it is closer to one or another seem to have even less understanding of GPU technology than I do.
Some "bells and whistles"? The design we've seen differs completely from any R700 design.

M°°nblade said:
Nobody's saying it's impossible.
However, people do think it's weird to assume that, if Nintendo wanted to go higher than a 2008 design, they would pick a 2008 GPU and hang some bells and whistles on it in the first place instead of just picking a 2009, 2010 or 2011 GPU, which seems a much easier way to achieve the same result.
If the launch was premature, and Nintendo has had problems finalizing the system as it should have, then it's natural that a machine with a much more modern and different architecture than the 360/PS3 has so many problems showing its strength.

And like other people just mentioned: does architecture even matter?
The console has been out for more than 6 months and we still haven't seen any releases, screenshots or other footage of 2013 games that put it above PS360 level. Not from multiplatform games, not from exclusive games, not even from Nintendo games.
The proof is in the pudding and all the Wii U pudding we can eat this year tastes like PS360 pudding.
There were people who said (in this thread) that the Wii U is closer to the Xbox One than 360. And that's either nonsense or fanboy talk.
Cliff notes version:
0)Dearest freezamite. eDRAM should never appear in the same sentence as "low latency". eDRAM is a denser, slower drop-in for SRAM. Repeat after me: eDRAM is smaller and slower than SRAM.
2)"generations ahead" is a feature set metric. As in, "The Geforce GT610 is several generations ahead of the Geforce 8800 (but this implies absolutely nothing about performance)".
3)Optimizing for caches is both barely possible and hardly necessary. Caches automatically optimize latency and bandwidth of repeat access patterns. That's the whole idea, and it hasn't changed in 30 years.
The only angle you even have is A)manual prefetch instructions and B)cache-bypassing stores for data you know won't be valuable as a cache entry. Intel's had the relevant instructions since the Pentium 3, AMD since the original Athlon, and of course Xenon and Cell PPE have them as well. Nothing about this is new.
And we can immediately throw away A: manual prefetch, because any modern CPU has automatic prediction-based prefetching, and all you're going to do with manual prefetch instructions is cause interference.
B is a fringe benefit in total edge cases. You gain a complete cache refill if you happen to stream out exactly enough data to overwrite your cache hierarchy once. If you stream less, you don't lose your whole cache, so the penalty diminishes. If you stream more, the penalty stays the same but gets amortized over more work performed.
I've written software "seriously" for 20 years, and I seriously can't tell you how and what to optimize for a CPU cache, much less for a CPU cache of a certain size. My best generic advice is "don't". Completely puzzled by that statement.
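For anyone curious, point A above looks roughly like this in practice. This is a generic sketch using GCC/Clang's `__builtin_prefetch` extension (not anything console-specific); the prefetch distance of 16 elements is an arbitrary illustrative choice:

```c
#include <assert.h>
#include <stddef.h>

/* Point A in practice: a manual prefetch hint while summing an array.
 * __builtin_prefetch (GCC/Clang) is only a hint; a modern CPU's hardware
 * prefetcher already handles a linear walk like this on its own, which is
 * exactly why manual prefetch rarely buys anything today. */
static long sum_with_prefetch(const int *data, size_t n)
{
    long sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + 16 < n)
            __builtin_prefetch(&data[i + 16]); /* request data 16 elements ahead */
        sum += data[i];
    }
    return sum;
}
```

The result is identical with or without the hint; only the memory timing can change. Point B, cache-bypassing stores, would use non-temporal store intrinsics such as SSE2's `_mm_stream_si128` on x86, omitted here since they are architecture-specific.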
In power it's closer to the 360. In architecture it's closer to the Xbone. Is that so hard to grasp?
Next gen architecture, current gen power.
I don't even know why you quoted me if you just want to write your nonsense.
People were talking about the power of the WiiU and Xbox One.
The talk about architecture is just goalpost moving at this point.
IBM's eDRAM implementation in the POWER7 series had an access latency of only 5.7 nanoseconds, compared to 7.5 ns for the 6MB SRAM on "Tukwila" designs.

Rolf NB said:
0)Dearest freezamite. eDRAM should never appear in the same sentence as "low latency". eDRAM is a denser, slower drop-in for SRAM. Repeat after me: eDRAM is smaller and slower than SRAM.
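To put figures like 5.7 ns vs 7.5 ns in perspective, converting a latency to core clock cycles is a single multiplication. A minimal sketch (the 3.5 GHz clock used in the comment is an illustrative assumption, not a quoted POWER7 spec):

```c
#include <assert.h>
#include <math.h>

/* Convert an access latency in nanoseconds to CPU clock cycles:
 * a core clocked at f GHz completes f cycles per nanosecond. */
static double latency_in_cycles(double latency_ns, double clock_ghz)
{
    return latency_ns * clock_ghz;
}

/* At an assumed 3.5 GHz clock, 5.7 ns is roughly 20 cycles and
 * 7.5 ns roughly 26 cycles: the same ballpark, not different leagues. */
```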
Of course it is a feature set metric, but newer features mean better performance in a console. At least first and second parties should take advantage of them.

Rolf NB said:
2)"generations ahead" is a feature set metric. As in, "The Geforce GT610 is several generations ahead of the Geforce 8800 (but this implies absolutely nothing about performance)".
Optimizing for much bigger caches, or for caches with different specifications, is of course possible and very necessary, as Shin'en stated.

Rolf NB said:
3)Optimizing for caches is both barely possible and hardly necessary. Caches automatically optimize latency and bandwidth of repeat access patterns. That's the whole idea, and it hasn't changed in 30 years.
That statement had other implications as I read it. While you focus on low-level instruction optimization, I think they were focusing on the fact that you have a huge 2MB+1MB lockable L2 cache, so you can make use of it in a more flexible way to reduce accesses to main memory.

Rolf NB said:
The only angle you even have is A)manual prefetch instructions and B)cache-bypassing stores for data you know won't be valuable as a cache entry. Intel's had the relevant instructions since the Pentium 3, AMD since the original Athlon, and of course Xenon and Cell PPE have them as well. Nothing about this is new.
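For what it's worth, using a big cache "in a more flexible way" usually comes down to blocking: processing data in chunks sized to stay cache-resident across repeated passes. A generic sketch (the 2 MB block size echoes the L2 figure in the post above; this is plain portable C, not Wii U SDK code):

```c
#include <assert.h>
#include <stddef.h>

/* Process a large array in blocks sized to fit a hypothetical 2 MB cache.
 * Blocking changes nothing for a single pass; it pays off when each block
 * is reused several times while it is still cache-resident, so the repeat
 * passes hit cache instead of main memory. */
#define CACHE_BYTES (2u * 1024u * 1024u)
#define BLOCK_ELEMS (CACHE_BYTES / sizeof(float))

static double blocked_sum(const float *data, size_t n, int passes)
{
    double total = 0.0;
    for (size_t start = 0; start < n; start += BLOCK_ELEMS) {
        size_t end = (start + BLOCK_ELEMS < n) ? start + BLOCK_ELEMS : n;
        /* Re-read the same block 'passes' times: after the first pass the
         * block is (ideally) served from cache rather than main memory. */
        for (int p = 0; p < passes; p++)
            for (size_t i = start; i < end; i++)
                total += data[i];
    }
    return total;
}
```

The arithmetic result is the same as iterating the whole array `passes` times; only the memory traffic pattern differs, which is the whole point of sizing work to the cache.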
These discussions about the Wii U's power are starting to get ridiculous. I think we all have a pretty good idea of what the console is going to be capable of when developed specifically for the Wii U. Take a PS360 game, add high resolution textures due to having 2x the amount of RAM, better draw distances, and some modern lighting/GPU techniques due to having a newer GPU feature-set, and Bam. Done. That's the Wii U.
Would be interesting to know how many of the people in the WiiU GPU thread do have a degree in computer engineering.
Why do people keep comparing the WiiU to the X360 & PS3?
Also, if the WiiU is so powerful, where are the games?
I know the WiiU has eDRAM & it's supposed to make all the difference, but why did Nintendo bother with eDRAM, as it is so expensive, & not just go for more DDR3 or even GDDR5?
I have had a WiiU since launch & I want to know where the games are!
That's pretty much what I'm expecting, and I'm fine with that.
I honestly think the best looking Wii U games will match the first PS4/Xbone games, albeit at 720p.
I don't see the Wii U ever getting a game on par with the high-end late gen PS3 games, let alone their launch PS4 games. Nintendo will never put those kinds of resources into a game.
We already have games announced that crap on PS3 graphics, so much for that.
I don't see the Wii U ever getting a game on par with the high-end late gen PS3 games, let alone their launch PS4 games. Nintendo will never put those kinds of resources into a game.
Do you honestly believe that? If so, can I buy some pot off you?
I dunno why this thread was bumped, but just in case people are wondering what those guys know, and just in case people are still believing they don't understand anything outside of Nintendo hardware: http://www.youtube.com/watch?v=er8CoAAmv2A
What does this have to do with the Wii U?

Nothing. That's the point. Those guys can exploit hardware in ways probably not even the guys who designed the systems really understand. And that's not limited to Nintendo systems.