If the speculation is right, I mean.
Sound can actually be quite intensive on consoles. It's one reason why Nintendo has put things like the DSP in the Wii U. Nintendo is giving developers tools to ease the CPU burden, but it's also a matter of programming to use them. Straight ports that aren't tweaked to use all the bonus goodies Nintendo has implemented in the hardware could suffer.
It's actually difficult for Nintendo to wind up with a worse GPU/RAM than the PS3/360 in the year 2012. But the CPU is the one area where they could actually screw up and they very well may have.
They likely utilized a more powerful version of the current Wii's IBM CPU (rather than switching to a far superior CPU from AMD or Intel) in order to provide easy backwards compatibility with the Wii and GameCube, with no effort required on their part to emulate them. While this is nice, from the dev complaints so far it appears the CPU is lacking even compared to the 360 and PS3, and that's a significant reason for concern.
http://www.eurogamer.net/articles/digitalfoundry-ibm-teases-on-wii-u-cpu
If the Wii U CPU was based on Broadway, I doubt IBM would be so open to compare some of the tech with Watson, which was based on POWER7.....
It's called PR. I think someone's pointed out that technically eDRAM is "technology that powers the Watson."
How difficult or time-consuming do you think it would be to utilise the audio DSP (EDIT for clarification: when porting up from the PS360 or down from the PS4/720)?
Isn't an audio DSP better to have than using the CPU to do the same thing?
No, having processing power that can be used for anything instead of just audio is much better.
But every game needs sound, and by using the DSP you have more processing power available for anything else, because the CPU is freed from that one task. Am I wrong?
Less flexible but far more efficient. You get comparable audio processing performance at a fraction of the cost, die size and power consumption.
So the Wii U has more RAM, and the GPU can do the same effects as the X360. The only new info is that they were worried the Wii U GPU wasn't capable of the same effects as an X360 before they got dev kits.
The GPU can do effects three generations better, plus new features that neither the PS3 nor the X360 have.
It still is very much cost and energy efficient. CPUs don't perform very well at audio tasks. It's not much different from GPUs - why use a GPU if you have a powerful, massively parallel CPU? Easy: even that massively parallel CPU sucks at 3D graphics. And it sucks at audio. Chips like an AD TigerSHARC run circles around an i7 at a fraction of the cost and power consumption. Audio processing isn't playing an MP3 in iTunes, after all.

DSPs, and custom chips in general, are a good idea for performing specific tasks. I've loved the idea of having custom chips since the 8/16-bit generations, and I particularly liked the way they were implemented on the Amiga, but it doesn't make much sense nowadays. Back in those days CPUs were quite weak, so it made a lot of sense to have custom chips perform critical tasks, but it's not particularly cost-effective anymore. The industry has been moving away from that concept for a while now. Even modern GPUs are massively parallel processors that have moved from being fixed-function to fully programmable, and full convergence between CPUs and GPUs seems inevitable down the line.
I don't think there's any need for DSPs with modern CPUs and GPUs having enough power and being more flexible, but it's nice to have one on the Wii U, seeing as its CPU seems to be a bit on the weak side. The problem right now is that developers, particularly those working on launch titles, might disregard it for lack of time.
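To put the DSP-vs-CPU argument in concrete terms, here's a minimal sketch of the per-sample work a plain software mixer does - exactly the kind of loop a dedicated DSP takes off the CPU's plate. The sample rate, voice count and the mixer itself are illustrative assumptions, not any console's actual audio path:

```python
# Illustrative only: a naive software mixer. The constants below are
# hypothetical, plausible game-audio figures, not real console specs.
import math

SAMPLE_RATE = 48_000  # samples per second
VOICES = 64           # simultaneous sounds, a plausible game workload

def mix(buffers, gains):
    """Mix several mono sample buffers into one, with per-voice gain."""
    n = len(buffers[0])
    out = [0.0] * n
    for buf, g in zip(buffers, gains):
        for i in range(n):
            out[i] += buf[i] * g  # one multiply-add per voice per sample
    return out

# Two 10 ms sine-wave voices as stand-in source data.
frames = SAMPLE_RATE // 100
def tone(hz):
    return [math.sin(2 * math.pi * hz * i / SAMPLE_RATE) for i in range(frames)]

mixed = mix([tone(440), tone(880)], gains=[0.5, 0.5])

# Even this bare mix (no resampling, filtering or reverb) costs one
# multiply-add per voice per sample, every second, forever:
madds_per_sec = SAMPLE_RATE * VOICES
print(madds_per_sec)  # prints 3072000
```

Real game audio adds resampling, filtering, occlusion and reverb per voice on top of that inner loop, which is why a chip built to do nothing but multiply-accumulate can beat a general-purpose core on cost and power here.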
Good to know the Wii U compares well with six-year-old hardware. Seriously, how many of these threads do we need?
Shouldn't this just be expected? I doubt it's possible to source less powerful hardware six years on.
The main reason DSPs have become less and less common on the PC side is that it's too much work to support them properly. At least in games.
No, the cost of the DSP would have been better spent on a more powerful CPU.
Even the pile of derp that is the Xenon does not have a problem with the advanced audio in BFBC2, BF3, Crackdown etc.
I'm not saying it's a good choice, but most likely the options were having the DSP alongside the same CPU, or not having it alongside the same CPU.
Nintendo's choices rarely make sense. Did it make sense to limit their hardware with the Wii? No, it hindered third-party development. Did it make sense on the GCN to have a pool of 24 MB of blazing-fast RAM and 16 MB of incredibly slow RAM, plus a miniDVD that stored barely over a gig? Or the N64-era cartridges?
Nintendo's priorities have always been different from the norm.
CPUs handle audio tasks just fine, as proven by all the games that use them for advanced audio effects and by the fact that most DAWs only support software plugins!
Audio is much simpler than 3D graphics.
No, it is because they became pointless in light of ever faster CPUs!
Speculation = fact
The cost of a hardware DSP, and the board space needed for it, would be very unwelcome in such a small console.
Because devs in general did very well in faking a lot of things later this gen. And we've (I've) been seeing a lot more comments from people who haven't been that impressed by some of the next-gen demos.
I personally don't see Wii-U doing much in the way of advanced audio. I think that the DSP + modest CPU is a cheaper combination Nintendo found, and we know they are all about the profit.
serious question - why are the wii-u's specs still a secret?
Exactly, it's a cost-cutting measure. DSPs can be dirt cheap. It's cheaper to have an inexpensive CPU and offload the audio processing to the DSP. It wouldn't make sense at all to have it in the first place if they had a beefy CPU, because audio processing is hardly taxing on a modern CPU when compared to AI, rendering, physics, etc.
Nintendo doesn't like people focusing on raw specifications.
All modern non-SoC GPUs from AMD and Nvidia are GPGPUs! Even the 360 has a GPU that can act as a GPGPU.
A GPGPU is only suited to a smallish subset of tasks; it cannot help a CPU with any problem that is not hugely parallel or that is in any way latency-sensitive, and doing so will take away from graphics processing.
Sound processing is hardly a CPU killer; the transistors would be better spent on the CPU.
You're kidding yourself. Just Crysis 3 fully maxed is pretty much next-gen (watch the trailer, look at the textures, etc.). And at 1080p.
PS4 specs should deliver that with ease as a straightforward port. Optimized PS4 titles 4 years later will be almost unimaginable.
@Thundermonkey: Wii discs only had about one gig? Don't remember that, I think you're wrong ^^
That was in direct reference to the GCN. Wii discs are standard dual-layer DVDs. Up to like 9 gigs.
Yeah, this is what bugs me when people trot out "This is Nintendo, they know what they're doing" or the like.
I'd argue the evidence shows they definitely do not know what they're doing when it comes to hardware.
Bottom line: Iwata said it best about the Wii U. Nintendo literally does not care that the competition will bring forth machines with better graphics. Those are Iwata's words, not mine. I wish I had the interview on hand but, yeah, he said that. He did not say "we did the best we could" or "we tried to deliver a lot for our budget". He said "we don't care that the other guys will have better graphics".
The amount of RAM is of no consequence until we know:
How much is reserved for the OS.
How much is used due to the screen in the controller.
Type of RAM.
Memory interface bandwidth (bus width and speed).
You should revisit some of the past threads from when these demos first came out, then. I still remember comments from people not being able to see the difference between BF3 on PC vs. the console.
http://www.neogaf.com/forum/showpost.php?p=40205173&postcount=3195
Way more memory than 512 MB? Impressive.
I'd actually agree to an extent, but try telling some PC guys that. To them, BF3 on console, or choose your console version here, might as well be an 8-bit Nintendo game or something. Never mind the horrors of 30 FPS and lowly 720p (which, I have to agree, after watching my brother PC game a lot, I am ready for a move to 1080p, even though 720p never bothered me before).
In other words, ironically, I'm typically on your side, arguing that the console version of game X is not THAT far behind the PC version. I'd even argue that for BF3, to an extent. I think this is because PC ports are based on the console versions, though, not because of any diminishing-returns factor.
But I was speaking of Crysis 3 anyway, and again, that trailer, just the textures alone... you run that at 1080p and I think you're in the ballpark of next gen. Though I think next gen will far exceed it eventually (i.e., what Killzone 2 was to first-gen 360 games).
Watch Dogs and Star Wars 1313 are other good examples of sort-of next gen, running now. They impress me much, much more than, say, the UE4 demo, because they are real.
Assuredly. I find it hard to believe that, by 2014/2015, games aren't going to look a hell of a lot better than anything we've seen on current-gen consoles.
No, it is not.
Just because one dev made a 360 game with over 200 sounds at a time, eating one whole core, does not mean that just about any other game gets close to doing that.
It is very easy to get worse GPUs and less RAM than the 360 or PS3.
The Wii CPU is tiny; it would be perfectly fine to chuck it in the Wii U just for BC, and a POWER7 CPU would have no trouble emulating it either.
Yeah, what do I know? It's not like I actually worked on pro audio applications and drivers for DSP boards or something...
Quite a few modern games waste more CPU cycles on audio than on AI or physics, actually. They have to, as processing audio on a CPU is simply not very efficient. There's a presentation by the Microsoft Xbox division about CPU usage floating around; feel free to Google it. It's quite an eye-opener. And with companies like Firelight and Audiokinetic working on middleware that uses production-grade effects in realtime in games, it's not unreasonable to expect the CPU load to go up quite a bit - still requiring a similar percentage (~15-30%) even on CPUs an order of magnitude more powerful.
Related question: wasn't there a relatively thorough, albeit old, leak of specs a while ago anyway, apparently copy-pasted from SDK documentation?
Is there any reason we're not using that for reference anymore?
Those leaked specs did not say as much about the CPU as the GPU:
"Main Application Processor
PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core."
http://www.neogaf.com/forum/showthread.php?t=476997
Apparently those specs were right on the money for earlier dev kits from late last year....
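For what it's worth, the per-core numbers in that leaked spec sheet are internally consistent. A quick check (just arithmetic on the quoted figures):

```python
# Per-core L2 sizes from the leaked spec sheet, in KB.
l2_kb = {"core 0": 512, "core 1": 2048, "core 2": 512}

total_kb = sum(l2_kb.values())
print(total_kb)          # prints 3072
print(total_kb // 1024)  # prints 3 -> matches the "3MB aggregate" line
```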
The Wii having more than the Xbox surprises me. Do you know what the GC had?
arguing that its abundance helps make the console's graphics equal to that of the Xbox 360 and PlayStation 3
Hurray!
That Sonic & Sega All-Stars series is always pushing the graphics envelope...
Yip. Those are the ones. Basically, I'm wondering why we're now essentially ignoring those specs - for example, this thread ignores that we already knew the Wii U will have ~1.5 GB of RAM (possibly more like 2 GB in final units, IIRC), ergo this isn't really new news.
Presumably, it's just because there isn't any new news.
That didn't stop MS and Sony from making their extremely weak CPUs do audio though...