The Madden and Killzone demos have been reached?
pls... don't...
Madden, no, but Killzone actually does look like the trailer.
It's funny how people are freaking out about port performance when this was discussed at length and expected in the old Wii U speculation threads. Where's BG when you need him? lol
Ports from current gen games need "work" to run on the Wii U decently, just like a PC does.
So this is basically just Nintendo magic:
http://www.youtube.com/watch?feature=player_embedded&v=y9uYCU8jFiU
http://www.youtube.com/watch?v=6OHUwDShrD4&playnext=1&list=PL7982B2177A8DA0D2&feature=results_video
These were running on pre-alpha dev kits. What's the big surprise that ports of current-gen games, made for older hardware, have some performance issues on Wii U? Every time a performance hiccup occurs on a Wii U, it seems the hardware is to blame and not the developers. Wasn't it just confirmed that the Wii U can run Unreal Engine 4 and Unity 4?
Slower RAM and lower-clocked CPU be damned, those demos are more than convincing enough for me, and it was confirmed that the Wii U dev kits had several enhancements since that time, not the opposite. If Gearbox and Epic are impressed with the Wii U, have already confirmed that it's more powerful than current-gen hardware, and even consider it "next gen" because it's modern technology, why did everyone suddenly jump off the cliff over these ports, which in no way indicate what the Wii U is actually capable of when a game is developed for the system itself?
Because there has never been a 'next-gen' console that struggled to achieve parity with the previous generation of hardware, even at launch. That is the top & bottom of it.
Do you really think developers with Nintendo dev kits are going to publicly criticize an as-yet-unreleased console? Just like Nintendo, they have a vested interest in making the Wii U a success. Everyone who divulged actual information on the Wii U hardware, or criticized it, did so anonymously.
Some would argue Wii struggled to achieve parity with Xbox.
What if Nintendo downgraded the specs in the final hardware?
This is difficult to believe, but after people pointed to the Arkam post in the speculation thread, I've read several pages of it, and some posts just seem strange.
Going by the posts in the speculation thread, it seems there is information about 3 different Wii U dev kits (maybe even more):
1. The earliest one; for obvious reasons, the most different of the three from the final hardware.
2. The middle one, the dev kit Arkam commented on, saying it was a little less powerful than an Xbox 360; similar to the final hardware but not the same.
3. The final dev kit, similar to the previous one but with better (and final) hardware; we now know it's at the very least better than an Xbox 360 in several aspects.
What is strange is that several people, people who don't usually go around telling lies on this forum (and going by what they posted in the past, some of them could well have some connection with developers), reacted to the Arkam post saying that, going by their sources, the early dev kit was several times more powerful than an Xbox 360. They never claimed to have direct info, so maybe this is the problem, but I think it's difficult to believe their sources lied to all of them (unless it's always the same source for everyone, but that also doesn't seem believable).
wsippel (I think; I read the thread yesterday and I'm going by memory) mentioned that the early dev kit used GDDR3. He doesn't mention a source, so I'm not sure if that info is reliable or not, but I started to think... Is it possible that Nintendo changed the memory configuration from GDDR3 in the early dev kit to (a bigger quantity of) DDR3 in the middle and final versions?
Given that an early dev kit is usually a "normal" PC, I suppose that's possible; it can't really be that similar to the final hardware anyway.
In the end that change was probably a good choice, but this thought fueled the idea that, over the dev kit progression, some changes at least shifted the balance of the hardware enough to require a different approach from an engine point of view (frankly, though, if the "several times more powerful than an Xbox 360" info from the early dev kit era was true, the only explanation is a significant downgrade of some parts).
Now, after writing all this, what do other people think? Crazy talk? Or is there something believable here?
To spin this the other way: do you really think Randy Pitchford would have said generally positive and supportive things over and over that weren't true if he didn't have to? You might lie by omission, perhaps, but you don't go out of your way to praise something.
SmokeyDave said: Because there has never been a 'next-gen' console that struggled to achieve parity with the previous generation of hardware, even at launch. That is the top & bottom of it.
Why do we need such a complex explanation to account for all the fake rumors and "sources" people have put on the internet? Maybe the information was just wrong, of course.
Good point, although that was partially because the Xbox was a beast as well as the Wii being weak. Still, it's a relatively new phenomenon.
There are more than 3 dev kit models (or upgrades) for the Wii U.
The first ones were really "gimped" in performance, but I think V3-4-5... are nearly "the same"; no big changes, if I remember correctly.
lherre, can you explain in layman's terms how the information coming from the disc drive is processed by the RAM for the OS, the RAM for the games, the eDRAM, and finally the GPU? I really want to understand how a system can be "unbalanced" (or balanced, for that matter).
I'm dying here people.
1. A PC has VRAM on the graphics card, which is the main pool, not normal RAM; fast system RAM is nice to have, but it isn't like VRAM.
2. Fast RAM only matters once a game is advanced enough that very big assets need to be loaded extremely fast; add a bazillion effects like in Crysis 2 and that's where high-speed RAM starts to matter.
It's good you had good times.
My point is that THIS is the main RAM on the Wii U. Intensive stuff will run out of the eDRAM.
My point is that Crysis runs fine on DDR2-800 as long as you have a decent GPU, and it streams A LOT of hi-res assets.
Oh, by the way, it would be funny to see those who mocked me now that Anandtech corrected and DOUBLED their estimated bandwidth.
But that decent GPU in your PC has dedicated VRAM; the Wii U has 2 GB of RAM total (+ the eDRAM).
Thanks for the information.

The changes were rather noticeable between V3 and V4, more limited between V4 and V5. But within a V5 framework, several teams managed to improve the performance (the framerate) of their projects by a lot (a combination of this newest dev kit + SDK + software optimization). Some even found, rather late in the development cycle, a problem in their use of the CPU that hindered its total power; it was quickly fixed and added a nice boost.
Nintendo should have monitored those ports and ensured that their system, their SDK, the documentation, all the tools to develop on Wii U, are built in a way that even small teams with low resources could take advantage of the specificities of the platform. I hope this whole chain will improve with time.
Still, they could have been a bit more generous for their own games.

Ah well, as long as it can run Nintendo's first-party stuff, what do they care? Xbox 720 and PS4 can have the third parties to themselves again in another year.
I checked Anandtech's page and they still think the RAM has a peak BW of 6.4 GB/s.
Anandtech 11/19 said: There are four 4Gb (512MB) Hynix DDR3-800 devices surrounding the Wii U's MCM (Multi Chip Module). Memory is shared between the CPU and GPU, and if I'm decoding the DRAM part numbers correctly it looks like these are 16-bit devices giving the Wii U a total of 6.4 GB/s of peak memory bandwidth. That doesn't sound like a lot, but the Wii U is supposed to have a good amount of eDRAM for both the CPU and GPU to use.
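For what it's worth, both figures drop straight out of the part specs quoted above: peak bandwidth is just effective transfer rate times bus width times device count. A quick sketch of the arithmetic (device count and bus width taken from the quote; reading the chips as DDR3-1600 is the later correction mentioned in the thread):

```python
# Peak DRAM bandwidth = effective transfer rate * bus width in bytes * device count.
def peak_bandwidth_gbs(mt_per_s, bus_bits, num_devices):
    """Peak bandwidth in GB/s for DDR devices on independent channels."""
    return mt_per_s * 1e6 * (bus_bits / 8) * num_devices / 1e9

# Four 16-bit devices read as DDR3-800 (800 MT/s), as in the Anandtech quote:
print(peak_bandwidth_gbs(800, 16, 4))   # 6.4 GB/s
# The same four chips read as DDR3-1600 (1600 MT/s) give the doubled figure:
print(peak_bandwidth_gbs(1600, 16, 4))  # 12.8 GB/s
```

So the whole "doubled bandwidth" argument comes down to whether the chips run at an effective 800 or 1600 MT/s; nothing else in the calculation changes.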
What the hell is the CPU supposed to do with eDRAM?
The Killzone one yes.
But a PC has no unified pool of RAM. GPUs are more like an isolated subsystem attached over the PCIe bus; a GPU running over PCIe 2.1 @ x16 has a peak 8 GB/s transfer link. If you run out of VRAM, your performance will die, since you can't stream data efficiently from main RAM at that speed. Consoles do not work that way.
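The 8 GB/s PCIe figure above checks out: PCIe 2.x signals at 5 GT/s per lane with 8b/10b line coding, leaving 500 MB/s of payload per lane per direction. A minimal sketch of that calculation:

```python
# PCIe 2.x link bandwidth: 5 GT/s per lane, 8b/10b line coding (80% efficiency).
def pcie2_peak_gbs(lanes):
    """Peak usable bandwidth in GB/s, per direction, for a PCIe 2.x link."""
    raw_gbits_per_lane = 5.0                      # raw signalling rate per lane
    payload_gbits = raw_gbits_per_lane * 8 / 10   # 8 data bits per 10 line bits
    return payload_gbits / 8 * lanes              # bits -> bytes

print(pcie2_peak_gbs(16))  # 8.0 GB/s for an x16 slot, the figure cited above
```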
DDR3-1600 is the standard memory right now for high-end PCs, btw. Anything over that is an overclock.
Well, unlike the Xbox 360, the eDRAM might not be built into the GPU, so the CPU could share it.
Well there is a third tiny die on the package which might be the eDRAM, unless it has already been identified as something else...
Like I said on IRC, I don't think the problem we see with Wii U ports originates from a lack of "total power" (the performance, under optimal use, that the system can deliver and that could be measured somewhat). It really seems it's more a matter of an architecture that isn't friendly to quick work by small teams, an SDK not well enough designed for titles like that, poor documentation, something like that.
I think I've revealed that in a V4 dev kit context, my sources struggled to keep the framerate of their projects (exclusives) at a steady 30 fps (at least six months ago) with an intricate 3D usage of the second screen, but since then, the final dev kit + new SDK + other parameters got them to roughly 60 fps. So since E3, I've never heard of performance issues, or of a problem with the main RAM speed that could have hindered the games, etc.
There is something going on with ports. And even if it could be partially explained by a lack of optimization and tailoring of these titles for the Wii U's specifics, I'm pretty disappointed by the near-total absence of benefit those ports gained on the system. With 2x more memory, a more modern GPU architecture, more cache, a bigger framebuffer, etc., we should at the very least have expected a tad more anti-aliasing, better texture filtering, slightly higher-quality textures, longer viewing distance, whatever. Nintendo should have monitored those ports and ensured that their system, their SDK, the documentation, all the tools to develop on Wii U, are built in a way that even small teams with low resources could take advantage of the specificities of the platform. I hope this whole chain will improve with time.
Unlikely to be anything else. The fact it's a separate package suggests the CPU may well have (limited) access to it.
- Good enough visuals. I'm pretty confident that the machine is capable of 1.5x-2x the 360. That is good enough for me.
Do we know what kind of area that third die is?
Nvm, the site says 2.65mm^2.
I'm not sure that can be eDRAM. Based on density estimates for Power7 eDRAM at 32 nm, for example, only 3-4 MB of eDRAM would fit in that kind of space. This likely isn't 32 nm either (?)
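To put rough numbers on that argument: if 3-4 MB fits in 2.65 mm², the implied macro density is about 1.1-1.5 MB per mm². That density range is an assumption reverse-engineered from the Power7-based estimate above, not a measured figure, but it shows why the die can only hold a few MB:

```python
# Rough die-capacity sanity check. The density range is an assumption derived
# from the Power7-based guess above ("3-4 MB in that kind of space").
die_area_mm2 = 2.65                # third die's area, per Anandtech
density_mb_per_mm2 = (1.1, 1.5)    # assumed 32nm eDRAM macro density range

lo = die_area_mm2 * density_mb_per_mm2[0]
hi = die_area_mm2 * density_mb_per_mm2[1]
print(f"{lo:.1f}-{hi:.1f} MB")     # ~2.9-4.0 MB: only a few MB fits
```

A die ten or more times larger would be needed for a main eDRAM pool, which is the point made later in the thread.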
The extra ARM processor?
Which is why you buy a GPU with enough memory to support the resolution you plan to run at. You don't want it to run out of local memory and have to fetch from system RAM, because that's so slow.
Your comparison wasn't valid, end of.
Thanks for your info. Hope you can give us a little more in light of this teardown, but we take what we can get.
It is launch time, and a lot of these games finished development some time ago. Even in your example, the developers discovered something quite late in the development cycle, and it would take some time for them to scan through the entire game to ensure the extra processor speed was effectively being used. Some of the problems, like the slowdowns, happen more randomly, and the developers probably didn't have time to go through all the issues. These titles are generally more ambitious and larger than in any console launch before, and a lot of them were ported from systems that are not a traditional generation weaker but are very different in architecture. The games coming after launch will have bigger advantages, including being able to access Miiverse and more time with the final dev kits.
That third tiny die is most certainly not the eDRAM. It's TINY (2.65 mm² according to Anandtech), far, far too small to be the eDRAM. The eDRAM should easily be 10+ times that size.