
Sumo: Wii U specs are surprising; "way more memory" than PS3/360

TheD

The Detective
Sound can actually be quite intensive on consoles. It's one reason why Nintendo has put things like the DSP in the Wii U. Nintendo is giving tools to ease the CPU burden, but it's also an issue of programming to use them. Straight ports that aren't tweaked to use all the bonus goodies Nintendo has implemented in the hardware could suffer.

No, it is not.

Just because one dev made a 360 game with over 200 simultaneous sounds that ate a whole core does not mean other games get anywhere close to doing that.


It's actually difficult for Nintendo to wind up with a worse GPU/RAM than the PS3/360 in the year 2012. But the CPU is the one area where they could actually screw up, and they very well may have.

They likely utilized a more powerful version of the current Wii's IBM CPU (rather than switching to a far superior CPU from AMD or Intel) in order to provide easy backwards compatibility with the Wii and GameCube, with no effort required on their part to emulate them. While this is nice, from the dev complaints so far it appears the CPU is lacking even compared to the 360 and PS3, and that's a significant reason for concern.

It is very easy to get worse GPUs and less RAM than the 360 or PS3.

The Wii CPU is tiny; it would be perfectly fine to chuck it into the Wii U just for BC, and a POWER7 CPU would have no trouble emulating it either.
 
http://www.eurogamer.net/articles/digitalfoundry-ibm-teases-on-wii-u-cpu

If the Wii U CPU were based on Broadway, I doubt IBM would be so open to comparing some of the tech with Watson, which was based on POWER7...

It's just PR. As far as IBM is concerned both are "powered" by the same PowerPC ISA. There's no reason to draw any additional inference. It's like nVidia promoting the Tegra chips as having GeForce graphics, even though the actual embedded GPUs are based on a really dated architecture that bears little resemblance to their current DX11 enabled unified shader GPUs. Marketing doesn't care if these technologies are from very different parts of the family tree.
 
http://www.eurogamer.net/articles/digitalfoundry-ibm-teases-on-wii-u-cpu

If the Wii U CPU were based on Broadway, I doubt IBM would be so open to comparing some of the tech with Watson, which was based on POWER7...
It's called PR. I think someone's pointed out that technically eDRAM is "technology that powers the Watson."
Sound can actually be quite intensive on consoles. It's one reason why Nintendo has put things like the DSP in the Wii U. Nintendo is giving tools to ease the CPU burden, but it's also an issue of programming to use them. Straight ports that aren't tweaked to use all the bonus goodies Nintendo has implemented in the hardware could suffer.
How difficult or time consuming do you think it would be to utilise the audio DSP (EDIT for clarification: in porting up from the PS360 or down from the PS4720)?

It sounds like a similar situation to the PS3, where developers could utilise the SPEs when porting from the 360 but often didn't, at least early on, afaik?
 
Many games use sound middleware like FMOD anyway, so in those cases you don't really have to do anything to use the DSP; just let the middleware provider take care of it, which I'm sure all of them have done already.
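For illustration, here's roughly what the game-side code looks like when audio goes through middleware (a minimal sketch against the FMOD Ex C++ API of that era; the filename is made up). Nothing here says where the mixing happens - the platform port of the middleware decides whether it runs on a CPU core or on a DSP:

```cpp
#include <fmod.hpp> // FMOD Ex; later FMOD Studio versions changed some signatures

int main()
{
    // Create and initialize the sound system with 64 virtual channels.
    FMOD::System* system = nullptr;
    FMOD::System_Create(&system);
    system->init(64, FMOD_INIT_NORMAL, nullptr);

    // Load and play a sound; the mixer backend is invisible to this code.
    FMOD::Sound* sound = nullptr;
    system->createSound("explosion.wav", FMOD_DEFAULT, nullptr, &sound);

    FMOD::Channel* channel = nullptr;
    system->playSound(FMOD_CHANNEL_FREE, sound, false, &channel);

    // Tick once per frame; on a DSP-equipped platform the heavy lifting
    // can happen entirely off the CPU.
    system->update();

    sound->release();
    system->release();
    return 0;
}
```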
 
If the first game is anything to go by, the main development platform for Transformed will most likely be the PS3. I mean, what they've shown of the Wii U version so far doesn't look any different from the others. The Wii U may possibly have more memory, but I doubt multiplatform developers are going to use it.

Personally I think games built from the ground up for Wii U will look great (just take a look at Rayman Legends for reference), but with multiplatform software we won't be able to tell the difference. At the same time, I'm quite sceptical about the Wii U, since the Xbox and PS "next gen" are just around the corner. The whole screen-in-a-controller thing just doesn't do enough for me.
 

Stewox

Banned
So the Wii U has more RAM, and the GPU can do the same effects as the X360. The only new info is that they were worried the Wii U GPU wasn't capable of the same effects as an X360 before they got dev kits.

The GPU can do effects three generations more advanced, and has new features that neither the PS3 nor the X360 have.
 
But every game needs sound, and using the DSP you have more processing power that can be used for anything, because the CPU is freed from this one task. Am I wrong?

DSPs, and custom chips in general, are a good idea for performing specific tasks. I've loved the idea of having custom chips since the 8/16-bit generations, and I particularly liked the way they were implemented on the Amiga, but it doesn't make much sense nowadays. Back in those days CPUs were quite weak, so it made a lot of sense to have custom chips perform critical tasks, but it's not particularly cost-effective anymore. The industry has been moving away from that concept for a while now. Even modern GPUs are massively parallel processors that have moved from being fixed-function to fully programmable, and full convergence between CPUs and GPUs seems inevitable down the line.

I don't think there's any need for DSPs with modern CPUs and GPUs having enough power and being more flexible, but it's nice to have it on the Wii U, seeing as its CPU seems to be a bit on the weak side compared to the rest of the system. The problem right now is that developers, particularly those working on launch titles, might disregard it for lack of time.


The GPU can do effects three generations more advanced, and has new features that neither the PS3 nor the X360 have.

That's quite strange wording. You would be better off simply saying that the Wii U has a DX11-level GPU, whereas RSX and Xenos are DX9.0c+ level GPUs.
 

lockload

Member
Good to know the Wii U compares well with six-year-old hardware. Seriously, how many of these threads do we need?

Shouldn't this just be expected? I doubt it's possible to source hardware less powerful six years on.
 

wsippel

Banned
DSPs, and custom chips in general, are a good idea for performing specific tasks. I've loved the idea of having custom chips since the 8/16-bit generations, and I particularly liked the way they were implemented on the Amiga, but it doesn't make much sense nowadays. Back in those days CPUs were quite weak, so it made a lot of sense to have custom chips perform critical tasks, but it's not particularly cost-effective anymore. The industry has been moving away from that concept for a while now. Even modern GPUs are massively parallel processors that have moved from being fixed-function to fully programmable, and full convergence between CPUs and GPUs seems inevitable down the line.

I don't think there's any need for DSPs with modern CPUs and GPUs having enough power and being more flexible, but it's nice to have it on the Wii U, seeing as its CPU seems to be a bit on the weak side. The problem right now is that developers, particularly those working on launch titles, might disregard it for lack of time.
It still is very much cost- and energy-efficient. CPUs don't perform very well at audio tasks. It's not much different from GPUs - why use a GPU if you have a powerful, massively parallel CPU? Easy: even that massively parallel CPU sucks at 3D graphics. And it sucks at audio. Chips like an AD TigerSHARC run circles around an i7 at a fraction of the cost and power consumption. Audio processing isn't playing an MP3 in iTunes, after all.

The main reason DSPs became less and less common on the PC side is that it's too much work to support them properly. At least in games.
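To make that concrete, the bread and butter of audio processing is loops like the one below (an illustrative sketch, not code from any shipped game): a single biquad filter stage. Every sample is a handful of multiply-accumulates with a serial feedback dependency - exactly what a DSP retires at one MAC per cycle with zero-overhead looping, and what a general-purpose CPU grinds through as a dependent chain, times dozens of voices and effects:

```cpp
#include <cstddef>

// One biquad (second-order IIR) filter stage, Direct Form I:
//   y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]
struct Biquad {
    float b0, b1, b2, a1, a2;              // filter coefficients
    float x1 = 0, x2 = 0, y1 = 0, y2 = 0;  // delay-line state
};

void process(Biquad& f, const float* in, float* out, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i) {
        float x = in[i];
        float y = f.b0 * x + f.b1 * f.x1 + f.b2 * f.x2
                - f.a1 * f.y1 - f.a2 * f.y2;
        f.x2 = f.x1; f.x1 = x;             // shift the input history
        f.y2 = f.y1; f.y1 = y;             // shift the output history
        out[i] = y;
    }
}
// A game mixing 64 voices at 48 kHz with a few such stages per voice
// runs this loop body tens of millions of times per second.
```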
 
Good to know the Wii U compares well with six-year-old hardware. Seriously, how many of these threads do we need?

Shouldn't this just be expected? I doubt it's possible to source hardware less powerful six years on.

It's definitely possible, but it doesn't make any sense. Nintendo invests in cost-effective hardware so they can make a profit from hardware sales early on, but that doesn't mean that they go for the cheapest parts. They invest in what makes sense to them, and going for less capable hardware wouldn't make sense seeing that people are ready to make the jump to a Nintendo HD system. Having hardware capable of easily receiving ports from current or newer platforms is very important for them this time around.
 

TheD

The Detective
But every game needs sound, and using the DSP you have more processing power that can be used for anything, because the CPU is freed from this one task. Am I wrong?

No, the cost of the DSP would have been better spent on a more powerful CPU.

Even the pile of derp that is Xenon does not have a problem with the advanced audio in BFBC2, BF3, Crackdown, etc.


It still is very much cost- and energy-efficient. CPUs don't perform very well at audio tasks. It's not much different from GPUs - why use a GPU if you have a powerful, massively parallel CPU? Easy: even that massively parallel CPU sucks at 3D graphics. And it sucks at audio. Chips like an AD TigerSHARC run circles around an i7 at a fraction of the cost and power consumption. Audio processing isn't playing an MP3 in iTunes, after all.

CPUs handle audio tasks just fine, as proven by all the games that use them for advanced audio effects and the fact that most DAWs only support software plugins!

Audio is much simpler than 3D graphics.

The main reason DSPs became less and less common on the PC side is that it's too much work to support them properly. At least in games.

No, it is because they became pointless in light of ever faster CPUs!
 
No, the cost of the DSP would have been better spent on a more powerful CPU.

Even the pile of derp that is the xenon does not have a problem with the advanced audio in BFBC2, BF3, Crackdown ect.

I'm not saying it's a good choice, but most likely the options were having the DSP with the same CPU, or not having it, still with the same CPU.

Nintendo's choices rarely make sense. Did it make sense to limit their hardware with the Wii? No, it hindered 3rd party development. Did it make sense on the GCN to have a pool of 24 MB of blazing fast RAM alongside 16 MB of incredibly slow RAM, and a miniDVD that stored barely over a gig? Or the N64-era cartridges?

Nintendo's priorities have always been different from the norm.
 

TheD

The Detective
I'm not saying it's a good choice, but most likely the options were having the DSP with the same CPU, or not having it, still with the same CPU.

Nintendo's choices rarely make sense. Did it make sense to limit their hardware with the Wii? No, it hindered 3rd party development. Did it make sense on the GCN to have a pool of 24 MB of blazing fast RAM alongside 16 MB of incredibly slow RAM, and a miniDVD that stored barely over a gig? Or the N64-era cartridges?

Nintendo's priorities have always been different from the norm.

Remember that this is all just rumors at this point.

The cost of a hardware DSP and the board space needed for it would be very unwelcome in such a small console.
 
No, the cost of the DSP would have been better spent on a more powerful CPU.

Even the pile of derp that is Xenon does not have a problem with the advanced audio in BFBC2, BF3, Crackdown, etc.




CPUs handle audio tasks just fine, as proven by all the games that use them for advanced audio effects and the fact that most DAWs only support software plugins!

Audio is much simpler than 3D graphics.



No, it is because they became pointless in light of ever faster CPUs!

We don't know the Wii U's CPU power, do we? That's all speculation over vague interviews with devs working on quick and dirty ports :) I don't think the CPU will be weaker than the 360's; in my opinion the DSP should be seen as a plus for future games. Devs won't have to worry about the sound and its costs. I think it will make sense if the devs use it.

@Thundermonkey: Wii discs only had about one gig? I don't remember that, I think you are wrong^^
 

Triple U

Banned
I personally don't see the Wii U doing much in the way of advanced audio. I think the DSP + modest CPU is a cheaper combination Nintendo found, and we know they are all about profit.
 
Because devs in general did very well in faking a lot of things later this gen. And we've (I've) been seeing a lot more comments from people who haven't been that impressed by some of the next-gen demos.

You're kidding yourself. Just Crysis 3 fully maxed is pretty much next gen (watch the trailer, look at the textures, etc.). And at 1080p.

PS4 specs should deliver that with ease as a straightforward port. Optimized PS4 titles four years later will be almost unimaginable.

Nintendo's choices rarely make sense. Did it make sense to limit their hardware with the Wii? No, it hindered 3rd party development. Did it make sense on the GCN to have a pool of 24 MB of blazing fast RAM alongside 16 MB of incredibly slow RAM, and a miniDVD that stored barely over a gig? Or the N64-era cartridges?

Yeah, this is what bugs me when people trot out "This is Nintendo, they know what they're doing" or the like.

I'd argue the evidence shows they definitely do not know what they're doing when it comes to hardware.

Bottom line, Iwata said it best about the Wii U: Nintendo literally does not care that the competition will bring forth machines with better graphics. Those are Iwata's words, not mine. I wish I had the interview on hand but, yeah, he said that. He did not say "we did the best we could" or "we tried to deliver a lot for our budget". He said "we don't care that the other guys will have better graphics".


ALL THAT SAID, I'm not sure better hardware would have done much for the Wii, except decimate Nintendo's bottom line. That machine's appeal was not based on visuals. Unfortunately, I think the Wii U is (intended or not) aimed strictly at the heart of the core, and that's why there's all this consternation over its capability.
 

op_ivy

Fallen Xbot (cannot continue gaining levels in this class)
Serious question: why are the Wii U's specs still a secret?
 
I personally don't see the Wii U doing much in the way of advanced audio. I think the DSP + modest CPU is a cheaper combination Nintendo found, and we know they are all about profit.

Exactly, it's a cost-cutting measure. DSPs can be dirt cheap. It's cheaper to have an inexpensive CPU and offload the audio processing to the DSP. It wouldn't make sense at all to have it in the first place if they had a beefy CPU, because audio processing is hardly taxing on a modern CPU when compared to AI, rendering, physics, etc.


Serious question: why are the Wii U's specs still a secret?

Nintendo doesn't like people focusing on raw specifications.
 
Exactly, it's a cost-cutting measure. DSPs can be dirt cheap. It's cheaper to have an inexpensive CPU and offload the audio processing to the DSP. It wouldn't make sense at all to have it in the first place if they had a beefy CPU, because audio processing is hardly taxing on a modern CPU when compared to AI, rendering, physics, etc.




Nintendo doesn't like people focusing on raw specifications.

That didn't stop MS and Sony from making their extremely weak CPUs do audio though...
 
All modern non-SoC GPUs from AMD and Nvidia are GPGPUs! Even the 360 has a GPU that can act as a GPGPU.

A GPGPU is only suited to a smallish subset of tasks; it cannot help a CPU with any problem that is not hugely parallel or that is in any way latency-sensitive, and doing so takes away from graphics processing.

Sound processing is hardly a CPU killer; the transistors would be better spent on the CPU.

http://www.neogaf.com/forum/showpost.php?p=40205173&postcount=3195

The cost of a hardware DSP and the board space needed for it would be very unwelcome in such a small console.

Wii has a DSP.

You're kidding yourself. Just Crysis 3 fully maxed is pretty much next gen (watch the trailer, look at the textures, etc.). And at 1080p.

PS4 specs should deliver that with ease as a straightforward port. Optimized PS4 titles four years later will be almost unimaginable.

You should revisit some of the past threads from when these demos first came out, then. I still remember comments from people not being able to see the difference between BF3 on PC and on console.
 
Nintendo doesn't like people focusing on raw specifications.

Sure, but the reason they don't like people focusing on their specifications is that the specs are probably pretty lackluster. It's one thing not to make a big deal of the power of the hardware upon its announcement, but it's quite another to keep the specs secret and embargoed under penalty of death.

Ironically, there'd probably be a lot less focus on the specs of the Wii U if Nintendo actually confirmed them. That the discussion is dominated by them now is pretty much entirely because we don't know anything about them outside of a bunch of unreliable (and usually contradictory) 'industry sources', so people still have room to speculate wildly.
 
@Thundermonkey: Wii discs only had about one gig? I don't remember that, I think you are wrong^^
That was in direct reference to the GCN. Wii discs are standard dual-layer DVDs, up to like 9 gigs.
Yeah, this is what bugs me when people trot out "This is Nintendo, they know what they're doing" or the like.

I'd argue the evidence shows they definitely do not know what they're doing when it comes to hardware.

Bottom line, Iwata said it best about the Wii U: Nintendo literally does not care that the competition will bring forth machines with better graphics. Those are Iwata's words, not mine. I wish I had the interview on hand but, yeah, he said that. He did not say "we did the best we could" or "we tried to deliver a lot for our budget". He said "we don't care that the other guys will have better graphics".
Don't get me wrong.

Some of those odd design choices have paid off in the past. Without that blazing fast pool of RAM, a lot of the texture effects in something like Star Fox Adventures wouldn't have been possible. But on the other hand, its fairly small framebuffer left two generations of hardware with fairly heavy dithering. I just think most devs would have liked it to amount to more than 24 MB. Because of that, games ran better when constantly streaming or built in bite-sized chunks, like the Rogue Leader games.

I'm firmly in the camp that the hardware itself will be a meager jump over the two current consoles. Depending on how games are designed for it, they could be stunners (though limited in comparison to what is possible), or show little in the way of improvements beyond higher-res textures, greater asset variety, and higher-precision effects. But as with anything, time, budget, and talent will decide how beautiful Wii U games are.
 

Stewox

Banned
The amount of RAM is of no consequence until we know:


How much is reserved for the OS.

How much is used due to the screen in the controller.

Type of RAM.

Memory interface bandwidth (bus width and speed).

consequence? ... maybe relevance.

The OS won't reserve more than 200 MB.

This high prediction comes from the fact that games are suspendable in mid-play and you can browse-

Maybe developers who suck out all the RAM will be able to take it all without worrying about this and disable the suspend mode altogether.

The screen and controller are irrelevant; you can just have a static menu with practically no use (5 megs, whatever)
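On the "type of RAM" and "memory interface" points in the quoted list: peak bandwidth is just bus width times effective transfer rate, which is why those matter more than the raw amount. A toy calculation for illustration - the DDR3 figures below are invented, not leaked Wii U numbers; only the 360 comparison is the real, well-known figure:

```cpp
#include <cstdio>

int main()
{
    // Hypothetical pool: DDR3-1600 on an assumed 64-bit bus.
    const double bus_bytes = 64.0 / 8.0;  // bus width in bytes
    const double rate      = 1600e6;      // 1.6 GT/s effective transfer rate
    std::printf("hypothetical DDR3: %.1f GB/s\n", bus_bytes * rate / 1e9); // 12.8

    // Known reference: the 360's GDDR3, 128-bit bus at 1.4 GT/s effective.
    std::printf("X360 main RAM:     %.1f GB/s\n", (128.0 / 8.0) * 1.4e9 / 1e9); // 22.4
    return 0;
}
```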
 

Shion

Member
You should revisit some of the past threads from when these demos first came out, then. I still remember comments from people not being able to see the difference between BF3 on PC and on console.
I find it hard to believe that, by 2014/2015, games aren't going to look a hell of a lot better than anything that we've seen on current gen consoles.
 
http://www.neogaf.com/forum/showpost.php?p=40205173&postcount=3195
You should revisit some of the past threads from when these demos first came out, then. I still remember comments from people not being able to see the difference between BF3 on PC and on console.

I'd actually agree to an extent, but try telling some PC guys that. To them, BF3 on console, or choose your console version here, might as well be an 8-bit Nintendo game or something. Never mind the horrors of 30 FPS and lowly 720p (which I have to agree with; after watching my brother PC game a lot, I am ready for a move to 1080p, even though 720p never bothered me before).

In other words, ironically, I'm typically on your side, arguing that the console version of game X is not THAT far behind the PC version. I'd even argue that for BF3, to an extent. I think this is because PC ports are based on the console versions, though, not because of any diminishing-returns factor.

But I was speaking of Crysis 3 anyway, and again, that trailer, just the textures alone... you run that at 1080p and I think you're in the ballpark of next gen. Though I think next gen will far exceed it eventually (i.e., what Killzone 2 was to first-gen 360 games).

Watch Dogs and Star Wars 1313 are other good examples of sort-of next gen, running now. They impress me much, much more than, say, the UE4 demo, because they are real.
 

Stewox

Banned
I'd actually agree to an extent, but try telling some PC guys that. To them, BF3 on console, or choose your console version here, might as well be an 8-bit Nintendo game or something. Never mind the horrors of 30 FPS and lowly 720p (which I have to agree with; after watching my brother PC game a lot, I am ready for a move to 1080p, even though 720p never bothered me before).

In other words, ironically, I'm typically on your side, arguing that the console version of game X is not THAT far behind the PC version. I'd even argue that for BF3, to an extent. I think this is because PC ports are based on the console versions, though, not because of any diminishing-returns factor.

But I was speaking of Crysis 3 anyway, and again, that trailer, just the textures alone... you run that at 1080p and I think you're in the ballpark of next gen. Though I think next gen will far exceed it eventually (i.e., what Killzone 2 was to first-gen 360 games).

Watch Dogs and Star Wars 1313 are other good examples of sort-of next gen, running now. They impress me much, much more than, say, the UE4 demo, because they are real.

So you're talking about consoles - that's good to point out, because you cannot say Crysis 3 is next-gen if it's out this gen, so which gen are you talking about, then?

This kind of next-gen/previous-gen talk has always been on my list of things not to say. I never use these silly words.

In the PC world, generations are decided by GPU releases. 1080p is an HDTV standard; PCs have had more than that for years.
 
I find it hard to believe that, by 2014/2015, games aren't going to look a hell of a lot better than anything that we've seen on current gen consoles.
Assuredly.

But that means different things in this era than it has in the past.

The difference between crunching 3 million pps with no per-pixel effects and 20 million with them is a hell of a lot more stark than between 500 million and 1 billion.

Realtime GI and advanced tessellation techniques alone may make all the difference in the world.

Until we've seen exactly what the next generation of engines is doing, I'm not prepared to say the Wii U won't be able to do the same in a stripped-down, possibly faked manner.
 

omonimo

Banned
No, it is not.

Just because one dev made a 360 game with over 200 simultaneous sounds that ate a whole core does not mean other games get anywhere close to doing that.




It is very easy to get worse GPUs and less RAM than the 360 or PS3.

The Wii CPU is tiny; it would be perfectly fine to chuck it into the Wii U just for BC, and a POWER7 CPU would have no trouble emulating it either.

Well... in 2012? If the Wii U really is like that, Nintendo has a natural talent for screwing up hardware at this point...
 
Related question: wasn't there a relatively thorough, albeit old, leak of specs a while ago anyway, one that was apparently copy-pasted from SDK documentation?

Is there any reason we're not using that for reference anymore?
 

wsippel

Banned
CPUs handle audio tasks just fine, as proven by all the games that use them for advanced audio effects and the fact that most DAWs only support software plugins!

Audio is much simpler than 3D graphics.
Yeah, what do I know? It's not like I actually worked on pro audio applications and drivers for DSP boards or something...


Exactly, it's a cost-cutting measure. DSPs can be dirt cheap. It's cheaper to have an inexpensive CPU and offload the audio processing to the DSP. It wouldn't make sense at all to have it in the first place if they had a beefy CPU, because audio processing is hardly taxing on a modern CPU when compared to AI, rendering, physics, etc.
Quite a few modern games waste more CPU cycles on audio than on AI or physics, actually. They have to, as processing audio on a CPU is simply not very efficient. There's a presentation by the Microsoft Xbox Division about CPU usage floating around, feel free to google it. It's quite an eye opener. And with companies like Firelight and Audiokinetic working on middleware that uses production grade effects in realtime in games, it's not unreasonable to expect the CPU load to go up quite a bit - still requiring a similar percentage (~15-30%) even on CPUs an order of magnitude more powerful.
 

hodgy100

Member
Broadway is a PowerPC chip, so wouldn't that mean that any "enhanced Broadway" could technically just be a newer PowerPC chip?
 
Related question: wasn't there a relatively thorough, albeit old, leak of specs a while ago anyway, one that was apparently copy-pasted from SDK documentation?

Is there any reason we're not using that for reference anymore?

Those leaked specs did not say as much about the CPU as the GPU:

"Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core."

http://www.neogaf.com/forum/showthread.php?t=476997


Apparently those specs were right on the money for earlier dev kits from late last year....
 
Those leaked specs did not say as much about the CPU as the GPU:

"Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core."

http://www.neogaf.com/forum/showthread.php?t=476997


Apparently those specs were right on the money for earlier dev kits from late last year....
Yip. Those are the ones. Basically, I'm wondering why we're now essentially ignoring those specs. This thread, for example, ignores that we already knew the Wii U will have ~1.5 GB of RAM (possibly more like 2 GB in final units, iirc), ergo this isn't really new news.

Presumably, it's just because there isn't any new news.
 
The Wii U is rumored to have Compute Shaders (Shader Model 5), and compiling Shader Model 4 code to run on Shader Model 5 hardware has a negative performance impact.

Could this be the reason why the Wii U seems weaker than its true potential? Poor ports and a lack of understanding of the next-gen architecture?
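For the curious, the SM4-vs-SM5 split is literally just a compile target. A hedged sketch using the standard Direct3D 11 D3DCompile API (the shader source string and entry point name are hypothetical); a port that keeps shipping ps_4_0 bytecode never exercises the SM5 feature set the hardware may offer:

```cpp
#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

// Compile the same HLSL source against a caller-chosen profile.
// target = "ps_4_0" for a Shader Model 4 pixel shader,
//          "ps_5_0" for the Shader Model 5 build of the same code.
ID3DBlob* CompilePixelShader(const char* src, size_t len, const char* target)
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors   = nullptr;
    HRESULT hr = D3DCompile(src, len, nullptr, nullptr, nullptr,
                            "main", target, 0, 0, &bytecode, &errors);
    if (FAILED(hr))
    {
        // errors->GetBufferPointer() holds the compiler log on failure.
        if (errors) errors->Release();
        return nullptr;
    }
    if (errors) errors->Release(); // warnings, if any
    return bytecode;
}

// usage (hypothetical source string 'hlsl'):
//   CompilePixelShader(hlsl, strlen(hlsl), "ps_4_0"); // straight-port path
//   CompilePixelShader(hlsl, strlen(hlsl), "ps_5_0"); // SM5-native path
```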
 

stupidvillager

Neo Member
Question: if the 750 line from IBM has been pretty much phased out, then aside from backwards compatibility, why go with that line? Why not build the backwards compatibility into a new chip? From what I have read and understand (though it isn't much), IBM never intended the 750 family to go smaller than 90nm. So instead of retooling, enhancing, and shrinking Broadway, why not go with a new chip?
 
Those leaked specs did not say as much about the CPU as the GPU:

"Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core."

http://www.neogaf.com/forum/showthread.php?t=476997


Apparently those specs were right on the money for earlier dev kits from late last year....

Yip. Those are the ones. Basically, I'm wondering why we're now essentially ignoring those specs. This thread, for example, ignores that we already knew the Wii U will have ~1.5 GB of RAM (possibly more like 2 GB in final units, iirc), ergo this isn't really new news.

Presumably, it's just because there isn't any new news.

.
 