
Wii U has 2GB of DDR3 RAM [Update: RAM 43% slower than 360/PS3 RAM]

True, but there's also saving money.

And there are always price drops as well. If the Wii U isn't for you right now, just wait until a few years into the generation, when Nintendo will have its best games out, the price will be cheaper, and you'll have a better idea of the system. People act like they must own everything at launch and then get pissed when they have buyer's remorse.
 

v1oz

Member
They didn't spend $300 on the console - that's part of the problem. A lot of R&D and mfg cost went into the controller and I think we're seeing the results of that by way of cut corners in the console proper.
They do weird things with their money. I remember ATI saying Nintendo had spent as much money on R&D for the Wii GPU as Microsoft.
 

AmFreak

Member
Dude, you might just want to take a break from video games for a while. If it makes you feel better, two years after the other two launch, they will look outdated from a PC perspective.



This is not how manufacturing costs work.

Sure, their cost to assemble it all isn't included, but what the hell do you think manufacturing costs?! Even if you went crazy and added €20 for manufacturing and shipping, it wouldn't change anything (i.e., you could still build such a PC for €280).
 
Honestly, I don't know why I expected anything more from Nintendo. I guess I should have just kept my expectations in check, but I think (hope) I'll still have some fun with this console once the Sony/Microsoft consoles come out. But this launch has been a fucking mess.
 

thirty

Banned
How much memory bandwidth did the original Xbox have? If the Wii was pretty much on par with the Xbox, wouldn't this be like Xbox 1.7?
 

beril

Member
They do weird things with their money. I remember ATI saying Nintendo had spent as much money on R&D for the Wii GPU as Microsoft.

The 360 GPU was a slightly modified PC component, while the Wii GPU was a slightly modified GameCube component. Would one really be much more expensive than the other as far as R&D goes?
 

Oemenia

Banned
The 360 GPU was a slightly modified PC component, while the Wii GPU was a slightly modified GameCube component. Would one really be much more expensive than the other as far as R&D goes?
That's really selling Xenos short; it was almost a year ahead of the PC GPU generations that came out at the same time.
 
I'll be willing to bet that both the PS4 and Xbox 3 will have >4GB of DDR3 RAM. Probably faster than 800MHz, but still slower than what's in the current generation of consoles.
 

Alex

Member
It seems a little weaker than I'd have figured, but I never really guessed it was going to be even a mild step up, mostly just par.

We've known the CPU is a bit weak for a while, and while the GPU is stronger, people seem to ignore the fact that it has to draw an additional 854×480 in fillrate to the tablet, unless I missed something, which is pretty huge. Memory seems kind of disappointing, though.

Maybe there's some silver lining to all this, but I didn't actually think they were going to go with a solution more radical than a better dumping ground for their own software and its delivery. It seems like... another Wii, and that's really what I expected.
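The extra fill work the GamePad adds is easy to ballpark (a rough sketch; the 1280×720 main-output resolution is my assumption, since many Wii U titles render at 720p):

```python
# Rough fillrate overhead of driving the GamePad screen alongside the TV.
# Assumes a 1280x720 main render; the GamePad panel is 854x480.
main_px = 1280 * 720          # 921,600 pixels per frame on the TV
pad_px = 854 * 480            # 409,920 pixels per frame on the GamePad
overhead = pad_px / main_px   # extra fill as a fraction of the main render

print(f"{pad_px:,} extra px/frame, ~{overhead:.0%} on top of 720p")
```

Roughly 44% more pixels per frame on top of a 720p render, which is why the tablet screen is not free.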
 

v1oz

Member
The 360 GPU was a slightly modified PC component, while the Wii GPU was a slightly modified GameCube component. Would one really be much more expensive than the other as far as R&D goes?
Xenos was an original custom part. I wouldn't call it a slightly modified PC part. Microsoft actually owns the design.
 

Biggzy

Member
It seems a little weaker than I'd have figured, but I never really guessed it was going to be even a mild step up, mostly just par.

We've known the CPU is a bit weak for a while, and while the GPU is stronger, people seem to ignore the fact that it has to draw an additional 854×480 in fillrate to the tablet, unless I missed something, which is pretty huge. Memory seems kind of disappointing, though.

Maybe there's some silver lining to all this, but I didn't actually think they were going to go with a solution more radical than a better dumping ground for their own software and its delivery. It seems like... another Wii, and that's really what I expected.

Secretly we all knew this would happen, but some people got themselves worked up and convinced themselves this would be a significant step up from the 360/PS3.

I am not too bothered, as all I wanted was a Nintendo system that didn't look awful on my 46" HDTV.
 
Secretly we all knew this would happen, but some people got themselves worked up and convinced themselves this would be a significant step up from the 360/PS3.

I am not too bothered, as all I wanted was a Nintendo system that didn't look awful on my 46" HDTV.

Supposing the rumored Durango and Orbis specs are accurate, I wonder how they would be priced.

If they are significantly more powerful than the Wii U, I would assume the consoles themselves would have a much higher entry point.

The Wii U, like the Wii, may be the budget solution for the 8th generation of gaming.
 

Biggzy

Member
Supposing the rumored Durango and Orbis specs are accurate, I wonder how they would be priced.

If they are significantly more powerful than the Wii U, I would assume the consoles themselves would have a much higher entry point.

The Wii U, like the Wii, may be the budget solution for the 8th generation of gaming.

That's if Nintendo can show people that the Wii U has experiences you can't find anywhere else, because that is predominantly why the Wii was so successful.

Durango and Orbis will definitely be more expensive than the Wii U, but both Sony and Microsoft know that you can't price them too high, otherwise you get into the situation Sony experienced with the PS3.
 

golem

Member
They didn't spend $300 on the console - that's part of the problem. A lot of R&D and mfg cost went into the controller and I think we're seeing the results of that by way of cut corners in the console proper.

WTF, it's an ancient touchscreen with buttons on the side.

It seems like Nintendo pretty much built a system from parts they fished out of a Best Buy recycle bin. How they are losing money (or claiming to, at least) per system is beyond me.
 

Feep

Banned
WTF, it's an ancient touchscreen with buttons on the side.

It seems like Nintendo pretty much built a system from parts they fished out of a Best Buy recycle bin. How they are losing money (or claiming to, at least) per system is beyond me.
You really don't understand how the controller works.
 

djyella

Member
Yes, I'm desperate not to experience a repeat of the Wii years. Having to sit through all those commercials for high-definition Xbox games was soul-crushing.
Yeah, I felt the same way... soul-crushing is a good way of putting it, hahaha. I still had fun with the Wii's exclusive games, though. About the RAM: why are people glossing over the fact that Durango is rumored to be using slow RAM as well? And that Orbis may follow suit so they can up the amount? All three next-gen systems would be in the same boat when it comes to ports from the PS3/360. Why isn't this argument brought up more?
 

LeleSocho

Banned
WTF, it's an ancient touchscreen with buttons on the side.

It seems like Nintendo pretty much built a system from parts they fished out of a Best Buy recycle bin. How they are losing money (or claiming to, at least) per system is beyond me.

The wireless tech of the GamePad is some of the best available on the planet, and it's actually pretty impressive; that's what made the cost go up.
Still, I don't think it's worth having a pricey console with weak hardware just for good wireless tech. I would have been so much happier (and would actually have bought one, unlike the Wii U) with a $200 console with the same hardware, or a $350 console based only on the hardware, both without the pad stuff.
 
The specs absolutely are the reason. The GC had pretty good parity with the other consoles for ports, online features notwithstanding. Third parties didn't give two shits about the Wii because a port wasn't merely moving over assets and an HD-twins engine... it was essentially creating a whole new game, and very often a poor imitation of the HD twins' version. Hence, no one gave a shit. A very bad downward spiral that doesn't appear to have been broken with the Wii U.

GC missed out on plenty of multiplatform titles that it was perfectly capable of running, because publishers didn't see an audience for them big enough to generate sufficient ROI. Wii U isn't getting most of the announced early-2013 multiplatform titles because...
 
Not really; you guys forget Microsoft lied pretty hard about the actual numbers the Xbox managed to pull, like how they said the GPU did 140 GFLOPS, then 80 GFLOPS.

I'm a little rusty on that, but I know for a fact it wasn't higher than 3.2 GB/s; probably less. If my memory serves, it was probably closer to a 2 GB/s peak transfer rate. We're talking about DDR @ 200 MHz, after all, plus severe bottlenecks.


Also bear in mind the Xbox had no dedicated framebuffer; they had to use that main pool of RAM for everything, including Z-buffering.
 

Raist

Banned
Not really; you guys forget Microsoft lied pretty hard about the actual numbers the Xbox managed to pull, like how they said the GPU did 140 GFLOPS, then 80 GFLOPS.

I'm a little rusty on that, but I know for a fact it wasn't higher than 3.2 GB/s; probably less. If my memory serves, it was probably closer to a 2 GB/s peak transfer rate. We're talking about DDR @ 200 MHz, after all, plus severe bottlenecks.


Also bear in mind the Xbox had no dedicated framebuffer; they had to use that main pool of RAM for everything, including Z-buffering.

The memory used on the Xbox is very fast by PC memory standards but only decent by video memory standards. The Samsung chips used on our unit are 5ns parts that run at 200MHz DDR offering the effective bandwidth of a 400MHz solution. When combined with NVIDIA's 128-bit TwinBank memory architecture this offers a total of 6.4GB/s of memory bandwidth that is to be shared between the IGP and the CPU.

http://www.anandtech.com/show/853
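The quoted figure checks out against the usual peak-bandwidth formula (effective transfer rate × bus width); a quick sketch:

```python
# Peak DDR bandwidth = effective rate (MT/s) x bus width in bytes.
# Xbox, per the Anandtech quote: 200 MHz DDR (400 MT/s effective)
# on NVIDIA's 128-bit TwinBank bus.
def peak_bandwidth_gbs(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * (bus_bits // 8) / 1e9

print(peak_bandwidth_gbs(400, 128))  # 6.4 GB/s, matching the quote
print(peak_bandwidth_gbs(400, 64))   # 3.2 GB/s, the 64-bit figure from earlier
```

The 3.2 GB/s number floated earlier in the thread is exactly what you get if you assume a 64-bit bus instead of the 128-bit TwinBank arrangement.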
 

netBuff

Member
The wireless tech of the GamePad is some of the best available on the planet, and it's actually pretty impressive; that's what made the cost go up.
Still, I don't think it's worth having a pricey console with weak hardware just for good wireless tech. I would have been so much happier (and would actually have bought one, unlike the Wii U) with a $200 console with the same hardware, or a $350 console based only on the hardware, both without the pad stuff.

And this "miracle" WiFi streaming tech is going to be implemented in pretty much every device on the planet soon (Android 4.2 on Nexus devices, Intel Wireless Display 3.5 in pretty much every Intel laptop chipset): Miracast is not going to seem all that impressive for long.

Yes, how it works with the Wii U GamePad is pretty impressive, but I find it hard to believe that this is very expensive technology (although the needed extra chips probably add up).

But I also don't feel like the Wii U is an expensive console - people seem to forget about inflation: €270/€315 including taxes seems very reasonable to me.
 
Why is tech talk from that era so hard to find these days?

The RAM's theoretical peak was 6.4 GB/s (thanks; I was doing the math for what a 200 MHz DDR part with dual channel and 64 bits could get to and getting 3.2 GB/s), yes, but I recall effective bottlenecks meant the real-world numbers didn't add up completely.


I'm pretty pissed that I can't make my point because I can't find the evidence I've been looking for on Google, though. I can only insist that I don't think I've gone crazy just yet.
 

TheD

The Detective
Why are people glossing over the fact that Durango is rumored to be using slow RAM as well? And that Orbis may follow suit so they can up the amount? All three next-gen systems would be in the same boat when it comes to ports from the PS3/360. Why isn't this argument brought up more?

Because what you are stating is stupid.

It is easy as hell to get faster RAM than what is in the 360 and PS3!
GDDR5 will easily beat the GDDR3 or XDR in the 360 and PS3 on even a 64-bit bus!
DDR4 on the same size bus as the 360 and PS3 is also a fair bit faster (min 34,132 MB/s).
They would not go backwards from what they already have on the market!
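The DDR4 figure above appears to correspond to DDR4-2133 on a 128-bit bus (the same width as the 360's main memory bus); a quick sanity check under that assumption:

```python
# Peak bandwidth in MB/s = effective rate (MT/s) x bus width in bytes.
def peak_mbs(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits / 8)

ddr4 = peak_mbs(2133.33, 128)    # DDR4-2133 on 128-bit: ~34,133 MB/s
gddr3 = peak_mbs(1400, 128)      # 360's GDDR3 (700 MHz DDR): 22,400 MB/s

print(round(ddr4), round(gddr3))
```

So even plain DDR4 on the same bus width would comfortably beat the 360's 22.4 GB/s main memory bandwidth, which is the point being made.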
 

Raist

Banned
Why is tech talk from that era so hard to find these days?

The RAM's theoretical peak was 6.4 GB/s (thanks; I was doing the math for what a 200 MHz DDR part with dual channel and 64 bits could get to and getting 3.2 GB/s), yes, but I recall effective bottlenecks meant the real-world numbers didn't add up completely.


I'm pretty pissed that I can't make my point because I can't find the evidence I've been looking for on Google, though. I can only insist that I don't think I've gone crazy just yet.

The one bottleneck I remember is that the CPU couldn't access more than 20% of the RAM or something.
 
The one bottleneck I remember is that the CPU couldn't access more than 20% of the RAM or something.
I believe you're referring to this:

Memory subsystem: 4x16MB (64MB) 5ns DDR SDRAM operating at 200MHz (effectively 400MHz). TwinBank 128-bit DDR SDRAM memory controllers offering 6.4GB/s shared memory bandwidth. Maximum of 1.06GB/s used by CPU. Minimum of 5.34GB/s used by all other hardware.

The bank itself was unified, but the bandwidth ratio wasn't. I've never heard about a 20% access limitation, but I suspect that would be a pain in the arse for console homebrew.



But nothing points to me being right on the rest. Gotta admit defeat. There was a really nice tech discussion thread on G4tv a few years back in which I suspect I read something about the bus actually being halved, but it's been dead for years now. Thinking more about it, I might also be confusing the CPU GFLOPS thing; I recall quite well that a Pentium III at 733 MHz doesn't pull 3 GFLOPS.
 
Well, now anti-Nintendo fanboys will spread the misconception that the Wii U is not even on par with the PS3/360 technically. I just hope this doesn't make up third parties' minds and become common wisdom regarding the Wii U.
 

netBuff

Member
Well, now anti-Nintendo fanboys will spread the misconception that the Wii U is not even on par with the PS3/360 technically. I just hope this doesn't make up third parties' minds and become common wisdom regarding the Wii U.

Publishers and developers aren't basing their publishing/development decisions on forum ramblings, but on their knowledge of the hardware as well as the estimated market potential for their game on a specific platform.
 
I'll take you up on a ban bet that you're wrong.
Are you implying that one will be pleasantly surprised or disappointed, relative to what he wrote?
They didn't spend $300 on the console - that's part of the problem. A lot of R&D and mfg cost went into the controller and I think we're seeing the results of that by way of cut corners in the console proper.
I thought the implication was that each unit was being sold at a negative margin, with regard to variable costs per unit (ergo I don't think that includes sunk R&D costs).

And I just can't see where that money's going - controller or not.
 

jerd

Member
This thread and others make me wish I was around GAF when the Wii specs started to come out. I bet that was a great fun time to be a gaffer.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Much more realistic this time around.

Some still clinging to pie in the sky, but overall a whole hell of a lot more grounded.

Wait for spaceworld. ED RAM GPGPU PhysX expansion pack will be bundled with Retro Zelda and you will all eat DrinkyCrow. That's what the extra space is for in the controller battery pack!
 
Wait for spaceworld. ED RAM GPGPU PhysX expansion pack will be bundled with Retro Zelda and you will all eat DrinkyCrow. That's what the extra space is for in the controller battery pack!
Like I said, some still clung to the pie in the sky.

Those with even a cursory understanding were much more grounded. That's not to say we were right. The consensus at one point was DDR3 at 17 GB/s, a GPU clocked at around 500 MHz pushing something like 596 GFLOPS, and a tri-core, single-threaded Gekko variant, probably based on something modern and stripped down in the CPU sense, with its 32MB scratchpad and audio DSP.

By the end of it we had everything fairly close, but most of it was clocked higher than in actuality.

Earlier on it was a crapshoot, but we've known for well over a year what parts were being used in the SDKs. So we always had a general idea of the capability being shot for.

Some were crazy... but this is the internet. WiiUspec thread ramblings might have veered into the realm of crazy... but that's not a Nintendo fan exclusive phenomenon.
 

Camilos

Banned
This thread and others make me wish I was around GAF when the Wii specs started to come out. I bet that was a great fun time to be a gaffer.
If I remember correctly, it was around that time that anyone who enjoyed nice visuals was being called a graphics whore.
 

benny_a

extra source of jiggaflops
Some were crazy... but this is the internet. WiiUspec thread ramblings might have veered into the realm of crazy... but that's not a Nintendo fan exclusive phenomenon.

You're right that no one group on GAF has the monopoly on crazies.

If the PS4 or 720 ends up being worse than anticipated, or just one of them does, you will see the exact same thing again.
Consoles are not the only ones with questionable people, though; you have some PCGAF people saying that no next-generation game will match The Witcher 2's graphics on high, a game that came out on PC in 2011.
 