Don't believe that Darksiders 2 bullshit. It takes a couple of months to get anything working on a new platform.
This quote brought to mind something that wsippel mentioned earlier, back in June...
Maybe this is along the same lines as the Eurogamer dev quote; it sounds very familiar.
Odd, I thought that Nintendo was pleased with their SDK.
Because Microsoft isn't the standard
goruka said:
Disclaimer: IALD (I am a Licensed Developer)
Because of NDA I can't really say much, but I'd take developing for the Wii U over the 360 or PS3 any day. The hardware and APIs are much simpler and more familiar. The hardware in the Wii U is DX10 level, while the 360 and PS3 are DX9 level with some extra stuff hacked on.
Basically that means that, beyond the friendlier and more flexible hardware, most common rendering techniques (OpenGL 3.x features, OpenCL) can be implemented more efficiently.
So it's not just about "raw performance". By contrast, the DX11-level hardware that will likely power the PS4 or Xbox 720, even if it's likely to be much faster, won't be much different from the Wii U to program for.
Anonymous Coward said:
Posting anonymously, just because.
Speaking as a developer who's worked on the PS3, the Xbox 360 and the Wii U: the CPU in the Wii U has some nice things in it, but it's not as powerful as the Xbox 360 chip. I think N went to IBM and asked them, 'What's cheap to put on the chip?' IBM said, 'Well, we have this sh*t that no one wants,' and N said, 'We'll take it.' It does have better branch prediction than the PPCs in the PS3 and Xbox 360.
The Espresso chip doesn't have any sort of vector processing. It does have paired singles, but they're a pain, a real pain, to use. The floating-point registers are 64-bit doubles, so when people talked about paired singles I assumed you split the register in two. No: the registers are actually 96 bits wide, a double plus a single. To load one you have to load your first single, do a merge operation to move it into the upper 32 bits, then load your second one. This makes stacks explode, because saving a floating-point register in the callee takes three operations and 12 bytes, no matter what.
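For readers unfamiliar with paired singles, the load/merge/load sequence described above can be sketched in plain C. This is only an illustration of the idea; `ps_reg` and `ps_pack` are hypothetical names for this sketch, not real Espresso intrinsics or instructions.

```c
/* Emulates the register layout the poster describes: one paired-single
 * register behaves like an upper and a lower 32-bit float slot. */
typedef struct { float hi; float lo; } ps_reg;

/* Packing two adjacent floats takes three steps, mirroring the
 * load / merge / load dance described for the real hardware. */
static ps_reg ps_pack(const float *p) {
    ps_reg r;
    r.lo = p[0];   /* step 1: load the first single           */
    r.hi = r.lo;   /* step 2: merge it into the upper 32 bits */
    r.lo = p[1];   /* step 3: load the second single          */
    return r;
}
```

The three-step pattern is also why the quoted callee-save cost is 12 bytes per register rather than 8: both halves plus the merge state have to be spilled.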
While the Wii U has 1 GB of RAM available for the game to use, that RAM is slow. The cache on the chip is also slow. We tested memory bandwidth between cache and main memory on the Xbox 360 and the Wii U. Main memory access on the Xbox 360 is about 2x-4x as fast as accessing the cache on the Wii U. Yes, I mean that the off-chip RAM on the Xbox 360 is faster than the cache memory on the Wii U. I don't remember the full results, but I think we figured out that accessing the hard drive on the Xbox 360 was faster than the RAM on the Wii U, too.
The optical drive is also slow. I don't know for sure, but it feels like the same drive that went into the PS3. On the PS3 we used the hard drive to cache things to improve load speeds; without a hard drive on the Wii U, we can't do that.
I won't go into the OS and the programming environment, but let me just say I hate programming for Windows, and I prefer programming on the Xbox 360 to the Wii U.
While the GPU in the Wii U is better (probably because ATI doesn't make anything worse these days), it doesn't have the CPU and RAM to back it up. Who knows, maybe things will improve after launch, but I'm glad to leave the Wii U behind.
cpct0 said:
OP AC:
I used to code for the Wii. I haven't coded for the Wii U, so I can't tell; I'm only extrapolating from what you're saying here.
However, what you're describing is mostly the same as what the Wii used to have. I expect they kept full compatibility between the Wii U and the Wii so they could emulate the old system; that probably explains the chips.
Your PS (Paired Single) experience is mostly what I would expect from a newbie assembly programmer, sorry. Yes, it's very hard to code PSes, but once you get the hang of it, it's very efficient.
As for your memory experience, I would expect the Wii U to use the equivalent of the Wii setup, meaning a very fast internal memory and a cacheless external memory. It's powerful if you understand how to work its magic, but you need to know how to use caches or other accumulators to transfer data.
I'm not saying it isn't a pain. It is, especially if you code as a general-purpose guy (big company) with compatibility across multiple platforms. Most multiplatform engines assume one kind of memory, so they expect fast and efficient RAM for the whole game. However, if you code solely for the Wii U and have a background on the Wii or the GameCube, you'll feel right at home, I'm sure. I read your comments, and it all rang bells.
LordLimecat:
It would make sense if the Wii U uses the same system as the Wii. The Wii uses two kinds of RAM: the first is very quick for random access, but you get very little of it; the second is very quick for sequential write access but horribly slow for random read access. Depending on the test, you can see an order of magnitude of slowdown in that kind of RAM on the Wii. Now, I don't have experience with the Wii U (and even if I did, I would keep it confidential, to be honest), but I do feel I'm in a familiar place.
-full disclosure- I work for EA; all info here was double-checked for availability in the likes of Wikipedia and Google. Opinions are mine.
We need to get Jackson (and any other devs) in this thread. Every other report so far has said the opposite, and considering we've historically always heard devs complain about these things publicly, I doubt they'd be afraid to speak the truth. Unless this is about the SDK changing a lot up to the final version.
Hope they can access the eDRAM to fix all these issues soon.
There were a few interesting posts on a Slashdot story from people claiming to be licensed developers.
http://games.slashdot.org/story/12/11/18/2234225/nintendo-wii-u-teardown-reveals-simple-design
Tell us more about the useless bit?
No, Vita and 360 are very similar now.
Yes, they're using the Multi debugger. It's what they've been using forever. It's ancient and useless.
The second quote pretty much seals the deal: from that comment, as well as others, I don't think people should expect much-improved ports in the future; the 360 and PS3 will likely continue to be where it's at for consoles.
X360 and Vita both started with solid tool-chains from day one, as well as very solid OS foundations. 3DS probably did as well.
Isn't that how all new platforms start out?
Why not read the third quote? There's no way the Wii U's RAM is slower than the 360's HDD, even if the Wii U were using N64-era RAM.
Well that's not very encouraging.
Because the third developer is speculating and apparently hasn't had any access to the Wii U or its specs and hardware setup - and is contradicting the second posting at that.
That is... ugh.
Source?
And let's not forget the fact that pretty much all 3rd-party games on the Wii U seem to have problems (even Trine 2 is apparently missing some lighting effects), and as for 1st-party software, ZombiU (pretty much the only demanding game in that category) suffers from slowdowns.
I can't stop reading the article in the DBZ announcer's voice now, what the hell.
Yea, unlike the 360 and PS3, where no games have slowdowns.
This is launch software. Remember Splinter Cell: Double Agent on the PS3? We should wait for the next round of titles before making definitive judgements.
Oh, he was doing so well at fooling people until that rubbish I highlighted. I mean, really: the Xbox 360's HDD is faster than 12.8 GB/s DDR3 memory?
Also, the Wii U's drive is faster than the PS3's. We already know that; he doesn't, and he's a (fake) developer, no less.
Quite a few people mentioned it in the GiantBomb live-stream thread. I haven't seen or played the game myself in any way, so this could be false.
"The next round of titles" is pretty much irrelevant: next spring, only a very few high-profile 3rd-party titles are slated to release on the Wii U. And fall releases absolutely won't matter in terms of technical proficiency, as not only will 360 and PS3 ports still be more important than Wii U versions, but PS4 and Xbox 720 titles are going to take the spotlight.
I wouldn't be surprised if random access times are pretty much identical. Optical media are extremely slow in that respect. With the Wii U apparently not providing any disk cache, developers won't have an easy way to deal with long access times.
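The "use the HDD as a disc cache" trick mentioned in the quoted posts boils down to copy once, read fast afterwards. A rough sketch of the idea, with illustrative file paths and plain C stdio standing in for any console's actual file API:

```c
#include <stdio.h>

/* Copy `src` to `dst` once; later loads should open `dst` instead.
 * Returns 0 on success, -1 on failure. */
static int stage_to_cache(const char *src, const char *dst) {
    FILE *in = fopen(src, "rb");
    if (!in) return -1;
    FILE *out = fopen(dst, "wb");
    if (!out) { fclose(in); return -1; }
    char buf[64 * 1024];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, in)) > 0)
        fwrite(buf, 1, n, out);
    fclose(in);
    fclose(out);
    return 0;
}

/* Open through the cache: use the fast copy if present, else stage it
 * first; fall back to the slow source if staging fails. */
static FILE *open_cached(const char *disc_path, const char *hdd_path) {
    FILE *f = fopen(hdd_path, "rb");
    if (f) return f;                    /* cache hit: fast path     */
    if (stage_to_cache(disc_path, hdd_path) != 0)
        return fopen(disc_path, "rb");  /* fall back to the disc    */
    return fopen(hdd_path, "rb");
}
```

Without a hard drive (or an equivalently fast writable store) there is simply nowhere for `hdd_path` to live, which is the point being made about the Wii U.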
They were comparing it to the PC version.
Isn't the next Xbox/PS3 at least still a year off? The whole double-E3 reveal (one E3 to reveal the systems, then the next E3 for the launch line-ups)?
Well now, that second anonymous dev quote lines up with all the weak-CPU rumblings before launch, and now the low-speed RAM reveals. That probably doesn't bode too well, then!
Two E3s aren't necessary to show and launch a system.
You realise that one of those so-called developers tried to claim that the 360's HDD (with a hundred or so MB/s of bandwidth and 12 ms latency) is faster to access than the Wii U's 12.8 GB/s, 10 ns RAM? One of the tallest tales I've heard in a long time. It ranks right up there with the time someone tried to make me believe my next-door neighbour had flown to the moon in a bin when I was a kid.
Are you kidding me? You read the bit claiming an HDD was faster than DDR3 RAM and you think, "Yeah, he's believable"?
It says something bad about Wii U power, so it must be true! It doesn't matter that what he's saying defies even the most pessimistic takes on the system, or that, if true, it would put the system well below the Dreamcast in performance (even below the N64!). What was the bandwidth and latency on the SNES RAM?
Quite a few people claimed Trine for Wii U to be missing light effects compared to the 360 version - not the PC port.
That could be an anomaly in the way they tested performance - the anonymous developer isn't recounting hard facts, but his/her experience developing for the system.