> Yes it is cheaper. Nothing in the Wii U was based on performance. It was based on keeping costs low.
Yeah, that's bullshit.
The Wii U is a different architecture from the PS360, so it's too early to definitively say exactly how much more or less powerful it is compared to them.
I see. That sounds like a very bad choice from Nintendo then. Why make it different and harder for developers when getting support is already tricky in the first place?
> Yes it is cheaper. Nothing in the Wii U was based on performance. It was based on keeping costs low.
Oh lord.
> I see. That sounds like a very bad choice from Nintendo then. Why make it different and harder for developers when getting support is already tricky in the first place?
Different doesn't necessarily mean harder. Just that devs haven't yet had the time or experience to really work much magic with the hardware, as they've been doing for years on the PS360.
The proof is in the pudding, and currently Wii U titles seem to be bandwidth starved. Choking on transparencies, almost no MSAA use, still sub-HD resolutions...
> Facts
> 1. Generally speaking, although a non-negligible parameter, RAM bandwidth is less vital than GPU power or the memory amount...
So how do they intend to keep the GPU fed without this bandwidth? Sorry, but I stopped reading about here.
> eDRAM was added to the Wii U as a cost-saving measure.
As soon as I read this I knew exactly who posted it without even having to glance over at the username. I have to tip my hat to your persistence, however.
Yes, the Wii U doesn't have enough bandwidth.
Yeah, that's bullshit.
Not enough... for what? It has games, it plays games.
Wsippel, my dear friend wsippel, my good buddy wsippel, please learn to ignore USC-fan.
OT: Some good news about the WiiU with a title that makes it sound like bad news for the WiiU? Ingenious! Not even being sarcastic about this one.
It's not really news though, is it? I'm not that far through the whole thing, but isn't this just Ideaman's speculation?
Well, the hypothesis part, which is small, yes. The rest, no, it's facts, and there is a section in the second half with statements from two different Wii U developers, plus a little comment from Hynix at the end.
> Facts about RAM in general though, not so much the Wii U. There's an awful lot of speculation in the main body of the article, which contains a lot of coulds and mights.
> I wouldn't hold up developer comments as a beacon of truth either, with how things like Darksiders turned out in relation to the comments they made.
Which comments did they make?
> Which comments did they make?
The Wii U version will be the best because of the console's power, or something along those lines. DF's analysis had it as the worst performing.
When used correctly, the Wii U may have the potential to be slightly faster than its 7-year-old competitors.
Or: not enough sighs.
Facts about RAM in general though, not so much the Wii U. There's an awful lot of speculation in the main body of the article, which contains a lot of coulds and mights.
I wouldn't hold up developer comments as a beacon of truth either, with how things like Darksiders turned out in relation to the comments they made.
They never said that...
As in not yet using the eDRAM at all? No, no way in hell. Games would be running at 5-10 FPS instead of 20-30.
I remember those times when we just played games and legendary ones arrived on the SNES. What did we think back then?
eDRAM was added to the Wii U as a cost-saving measure.
Yes, the Wii U doesn't have enough bandwidth.
Is anyone really surprised by this?
Nintendo has been known to build efficient consoles. The GameCube imho was one of the best-engineered consoles of all time. On paper it was clearly inferior to the Xbox, yet in the real world the two consoles were far more evenly matched.
The GameCube featured eDRAM, something we've since seen the Xbox 360, Wii, and now Wii U feature. Its IBM PPC Gekko CPU featured double the cache of the Xbox's Intel CPU, and by all accounts, despite being clocked at almost half the speed of the Xbox's CPU, it was superior in many ways. Its memory was arguably the best of that generation, with its 1T-SRAM providing high bandwidth and low latency. The GameCube still had less RAM than the Xbox, but due to its higher bandwidth and lower latency, along with some fantastic texture compression tech from ATI, the GameCube was if anything superior in this regard. The Cube's bus configuration between memory, GPU, and CPU was incredibly efficient.
I'm not at all surprised to hear that in the real world the Wii U's memory bandwidth is not an issue. The CPU has a significantly larger amount of cache than the Xbox 360's and PS3's, and it also features a very short pipeline and out-of-order execution, which should help prevent stalls. The MCM has likely allowed for a significantly higher bus speed between the CPU and GPU than in the HD twins. The 32 megabytes of eDRAM on the GPU is of course a big help, as is the increased register count of the GPU itself. No doubt, as with other Nintendo consoles, AMD has provided some efficient texture compression and other related hardware features to further reduce bandwidth use and data sizes. Then there's the MEM1 pool, which by all accounts has significantly lower latency than the Xbox 360's and PS3's memory and is also bidirectional; with double the amount of physical memory available, developers should again be able to significantly reduce memory I/O.
As per the OP's post, in the real world the Xbox 360's memory is not capable of 22 gigabytes per second or anywhere near that. The Xbox 360's bus maxes out at around 10 gigabytes per second each way, and its bandwidth drops even lower than that once you factor in overheads and stalls. The Xbox 360 also has less than 512 megabytes of RAM available to games; from memory it's around 468 megabytes, in comparison to the Wii U's 1024 megabytes available to developers. A lot more memory swapping and I/O needs to occur on the Xbox 360 to overcome its smaller memory capacity, whereas on the Wii U developers can reduce memory reads and writes, as they can store significantly more data at a time.
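To make the numbers above concrete, here's a rough back-of-the-envelope sketch of where the often-quoted 22.4 GB/s figure for the Xbox 360 comes from, and why real-world throughput lands lower. Peak bandwidth is just bus width times transfer rate; the 45% efficiency factor below is an assumption for illustration, chosen to match the "around 10 GB/s" claim in the post, not a measured value.

```python
# Theoretical peak bandwidth = (bus width in bits / 8) * transfers per second.
def peak_bandwidth_gbps(bus_width_bits, mega_transfers_per_s):
    """Theoretical peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return bus_width_bits / 8 * mega_transfers_per_s * 1e6 / 1e9

# Xbox 360: 128-bit GDDR3 at 1400 MT/s effective -> 22.4 GB/s theoretical.
xbox360 = peak_bandwidth_gbps(128, 1400)

# Wii U: 64-bit DDR3-1600 -> 12.8 GB/s theoretical.
wiiu = peak_bandwidth_gbps(64, 1600)

# Assumed efficiency factor (overheads, stalls, refresh); illustrative only.
EFFICIENCY = 0.45

print(f"Xbox 360: {xbox360:.1f} GB/s peak, ~{xbox360 * EFFICIENCY:.1f} GB/s effective")
print(f"Wii U:    {wiiu:.1f} GB/s peak")
```

The point of the sketch is that a paper spec and delivered bandwidth can differ by a factor of two, which is why comparing the two consoles on theoretical numbers alone is misleading.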
Hey this is my article
I wrote it a month ago, but because of my PC problems plus being busy IRL, it was postponed until now.
I want to thank several GAFers, particularly blu and Alstrong, but also Thraktor, Durante, and Popstar, for their insight.
I hope you find this interesting.
Wait, Idea Man wrote this? Hmmmmmm. Interesting
There's so much wrong with this post, I'm not sure where to start or if there's even a point considering how your mind is already made up.
good news? Wii U? color me surprised!
Why didn't they go for a more traditional approach, though? Sure, fast RAM is more expensive, but if we're talking about something like 1GB of GDDR5, we're talking about 30 dollars or something? Or is it because the GPU doesn't support it, or something?
I can't wait to see what next-gen Mario and Zelda look like.
...NES lost a bit of my trust...but okay.
They put this under fucking 'facts'.
/sigh
This was a site I was hopeful for... remember all them interviews, GAF? Them were the days.