Twenty7KVN
Member
I wonder how close it would get at 1.2GHz.

That is not what he said:
Source: https://twitter.com/marcan42/status/274181216054423552
He said at the same clockspeed it would win. It is definitely not the same clockspeed.
Right. Crowd AI/pathfinding isn't. And that's something GPUs are really damn good at.
And on SIMD code, it wouldn't even win at the same clock speed, from which it's removed by more than a factor of two.

He said at the same clockspeed it would win. It is definitely not the same clockspeed.
Skyrim and games like it need CPU performance to run all the scripts for all the actors in the cell. Basically, your "handful of active enemies/npcs" assumption is off.

Ok, this isn't really relevant to the thread, but I've seen it mentioned a few times in relation to CPU power. Why the hell would Skyrim be particularly CPU demanding? I haven't played it, but it seems close enough to Oblivion or Morrowind: a slow-paced RPG with a handful of enemies/NPCs at most active at once, with few other dynamic objects and awful animations (ok, not really relevant, but I just can't mention the series without commenting on the animations). Other than a massive streaming world there doesn't seem to be anything particularly impressive from a technical point of view.
I didn't see that tweet, was just going by what was quoted here. Still, Xenon IPC is really, really terrible.

That is not what he said:
Source: https://twitter.com/marcan42/status/274181216054423552
He said at the same clockspeed it would win. It is definitely not the same clockspeed.
Is your avatar from Dundersalt?

All I can say is what I've been playing on the Wii U the last couple weeks looks awesome....
For 5 minutes, then I'm completely into the game and forget about how detailed everything is.
Wait, are you telling me that somehow this thing will be able to port next gen stuff easier than current gen stuff due to similarities in architecture?
2 questions to some people:
- Is having a CPU comparable to Xenon in performance an achievement in 2012?
- Is there anything special in the Zelda demo? Maybe I'm blind (it's an honest question). I don't see anything special in it (except that it's Zelda in HD).
Aw, man, that is just awful.

Are you fucking kidding me? 1.25 GHz?
I change my statement.
Devs were not lazy with ports on Wii U; they were fucking wizards to achieve that on Wii U.
It may be easier to get good utilization out of the Wii U architecture that way, but better utilization alone won't bridge a 5x+ performance difference.

Wait, are you telling me that somehow this thing will be able to port next gen stuff easier than current gen stuff due to similarities in architecture?
Sorry I annoyed you?

Not to call you out specifically, but it's getting almost as annoying to see an "I DON'T CARE ABOUT GRAPHICS" post in a graphics thread as an "I DON'T CARE ABOUT SALES
Is your avatar from Dundersalt?
2 questions to some people:
- Is having a CPU comparable to Xenon in performance an achievement in 2012?
- Is there anything special in the Zelda demo? Maybe I'm blind (it's an honest question). I don't see anything special in it (except that it's Zelda in HD).
Those are good questions. Especially the second one, I really don't get why that demo is held to such a high standard by people.
Man, the possibility of the WiiU being hacked so early in its life cycle scares me.
I mean, on one side I'm happy that we can avoid Nintendo's region-locking bullshit.
But on the other, if piracy finds its way to the console so quickly and easily, there's no way this won't harm the console in the long run.
Being from a country where piracy is practically socially acceptable, I can count on the fingers of my hands the number of people I know who buy original Wii and DS games.
Nintendo seems to have done a good job with the 3DS security; hopefully it will be the same with the WiiU.
In a perfect world, hackers would find a way to allow region-free gaming and homebrew on the WiiU without opening the door for criminals to exploit their work.
Kenka mentioned me a few pages back, so I might as well give my two cents.
First, it's worth keeping in mind that the general expectation until very recently was a CPU around 2GHz (many estimates around the 1.8GHz mark) and a GPU 500MHz or under (my guess was 480MHz).
The main take-home from the real clock speeds (higher clocked GPU than expected, lower clocked CPU than expected) is that the console is even more GPU-centric than expected. And, from the sheer die size difference between the CPU and GPU, we already knew it was going to be seriously GPU centric.
Basically, Nintendo's philosophy with the Wii U hardware is to have all Gflop-limited code (ie code which consists largely of raw computational grunt work, like physics) offloaded to the GPU, and keep the CPU dedicated to latency-limited code like AI. The reason for this is simply that GPUs offer much better Gflop per watt and Gflop per mm² characteristics, and when you've got a finite budget and thermal envelope, these things are important (even to MS and Sony, although their budgets and thermal envelopes may be much higher). With out-of-order execution, a short pipeline and a large cache the CPU should be well-suited to handling latency-limited code, and I wouldn't be surprised if it could actually handle pathfinding routines significantly better than Xenon or Cell (even with the much lower clock speed). Of course, if you were to try to run physics code on Wii U's CPU it would likely get trounced, but that's not how the console's designed to operate.
The thing is that, by all indications, MS and Sony's next consoles will operate on the same principle. The same factors of GPUs being better than CPUs at many tasks these days applies to them, and it looks like they'll combine Jaguar CPUs (which would be very similar to Wii U's CPU in performance, although clocked higher) with big beefy GPUs (obviously much more powerful than Wii U's).
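To make the distinction in that post concrete, here is a toy Python sketch (nothing to do with actual Wii U code; both functions and all data are hypothetical) contrasting throughput-limited work, which maps well onto a GPU or SIMD unit, with latency-limited, branchy work like pathfinding, which favors the CPU traits described above:

```python
# Toy illustration of the two workload shapes discussed above.
# On real hardware the first loop would be a vectorized/GPU kernel;
# the point is the shape of the work, not the Python.

def physics_step(positions, velocities, dt):
    # Throughput-limited: the same multiply-add applied to every element,
    # with no branching -- the kind of work a GPU or SIMD unit chews
    # through far more efficiently than a scalar CPU core.
    return [p + v * dt for p, v in zip(positions, velocities)]

def find_path(graph, start, goal):
    # Latency-limited: pointer-chasing through irregular data with
    # unpredictable branches -- work where a short pipeline, out-of-order
    # execution and a big cache matter more than raw Gflops.
    frontier, came_from = [start], {start: None}
    while frontier:
        node = frontier.pop(0)  # breadth-first search
        if node == goal:
            break
        for nxt in graph.get(node, []):
            if nxt not in came_from:
                came_from[nxt] = node
                frontier.append(nxt)
    # Walk back from goal to start to recover the path.
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return path[::-1]
```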
@marcan42 said: "So yes, the Wii U CPU is nothing to write home about, but don't compare it clock per clock with a 360 and claim it's much worse. It isn't."
Compared to hardware that launched at the same time, the PSP, yes it was. Of course looking back Nintendo made great decisions on the DS. My point is they are not interested in an arms race with sony/MS.
I actually find it amazing that Nintendo can do this. Iwata is a very shrewd businessman. I think, however, that Nintendo is blowing a huge chance to be truly dominant for years by not looking forward just a little bit more on the tech side.
Ok, this isn't really relevant to the thread, but I've seen it mentioned a few times in relation to CPU power. Why the hell would Skyrim be particularly CPU demanding? I haven't played it, but it seems close enough to Oblivion or Morrowind: a slow-paced RPG with a handful of enemies/NPCs at most active at once, with few other dynamic objects and awful animations (ok, not really relevant, but I just can't mention the series without commenting on the animations). Other than a massive streaming world there doesn't seem to be anything particularly impressive from a technical point of view.
Does Dundersalt still exist? It reminds me of my childhood =)

Yup, family fav when we visit Norway.
While this news is slightly disappointing, I don't think everyone should be concerned or too worried. I mean if we can get amazing looking games like the Wonderful 101, NintendoLand, and ZombieU (not to mention the great looking games currently in their pipelines), then surely the state of game development shouldn't be too dire going forward. And this is just the beginning, I'm sure devs will be able to squeeze more juice out of this machine in due time.
I hate to sound like a broken record, but I will reiterate something I've said many times around the Wii's launch: third party devs must take a page out of Nintendo's first parties' book and focus on prioritizing style over polygon count. It's time to save costs, think outside the box, and give it an honest shot. Lest we forget "limitations can breed innovation."
The industry is on shaky ground as it is; we really can't afford any more unfeasible development cycles or another generational budget jump.
There's nothing special about the Zelda demo except that it's easily one of the best-looking 3D Zeldas they've made. Are you against Zelda looking better? Your own statements lead me to believe that you wouldn't want a GC-looking Zelda in 20XX either.
Direct PS360->Wii ports were impossible except for 2D games like Rayman Origins.
Direct PS4/720->WiiU ports shouldn't be impossible, but the question is whether it will be worth the effort for devs.
If the next Zelda game is contained in a single room, yeah, we could.
Nintendo doesn't seem to have "gimped" on anything within the price range they had to remain within, considering the total components that make up the system.
You better hope those developers manage to GPGPU everything and still have enough GPU performance left over for graphics.
You say this, but it's by design. It's either/or: you either sacrifice clock speed for IPC, or sacrifice IPC for clock speed. It would be prohibitively impractical to try to build a CPU with the best of both, especially a consumer-grade one.

I didn't see that tweet, was just going by what was quoted here. Still, Xenon IPC is really, really terrible.
How would we know? It's not about what the demo does, it's about how it does it. And this we don't know. You can't really look at the results and say, "yes, that's obviously DX9/10/11". You'd need to look at the code. Almost everything can be faked/approximated on pretty much any hardware.

My question is about comparing it with games that are out there. There is nothing in it that couldn't be achieved on PS360 systems; this is why I ask why it's so special, not that the game is irrelevant or uninteresting to me. I'm speaking strictly about the tech involved in that demo. I saw a lot of claims that it is the best "thing" they've seen and that it is not possible on PS3/X360. My question was about this.
Whether it's good only because it's Zelda is another question that is not tech related.
I think that the Wii U will be powerful enough to run very high spec games but the architecture is obviously different than other consoles so there is a need to do some tuning if you really want to max out the performance.

We’re not going to deliver a system that has so much horsepower that no matter what you put on there it will run beautifully, and also, because we’re selling the system with the GamePad – which adds extra cost to the package – we don’t want to inflate the cost of each unit by putting in excessive CPU power.

Staying with graphics but going back to the idea of getting third parties involved, have you approached Epic with the specs of the Wii U to try to make sure that third-parties using Unreal Engine 4 can easily port their games to Wii U?
Is this supposed to be the console's GPU?
Can we settle the generation argument by agreeing that game systems come out in batches?
The Dreamcast, GBA, PS2, Gamecube, and Xbox were the 1998-2001 batch.
The Nintendo DS, PSP, Xbox 360, Wii, and PS3 are the 2004-2006 batch.
The 3DS, Vita, Wii U, Durango, and Orbis will all be part of the 2011-2013 batch.
Does Dundersalt still excist? It reminds me of my childhood =)
Tell me the last time anything ever needed to be updated in real time in any room other than the one you're currently in, in a Zelda title.
Zelda rooms have *always* been pretty much standalone.
Is that why the UI supposedly runs like ass? CPU is just struggling on it?
Would I be wrong to think that Nintendo's extremely low TDP target was more of a factor in any "gimping" than the price target?
Not sure if this one was posted yet; trawling through these pages is making me cry inside:
marcan said: "So yes, the Wii U CPU is nothing to write home about, but don't compare it clock per clock with a 360 and claim it's much worse. It isn't."
No, it's never been like that with regard to technology and never will be. WiiU is current gen, barely.
I guess we've been playing different games / systems. I can't remember ever having access to such a diverse range of software before.
My question is about comparing it with games that are out there. There is nothing in it that couldn't be achieved on PS360 systems; this is why I ask why it's so special, not that the game is irrelevant or uninteresting to me. I'm speaking strictly about the tech involved in that demo. I saw a lot of claims that it is the best "thing" they've seen and that it is not possible on PS3/X360. My question was about this.
Whether it's good only because it's Zelda is another question that is not tech related.
I'm supposed to buy this console for Monster Hunter. Dafuq.
The problem Nintendo has is their market aims. It seems to be a conscious choice.
To the enthusiast gamer, everything that's not as powerful as current technology allows is "gimped". It's a bit like an auto enthusiast who rails that a mass market consumer minivan is a piece of crap because somewhere in the world, there are Ferraris. It's true that the minivan is no Ferrari, so the enthusiast has a point - but he's also missing the point that the minivan's job is not to be a Ferrari. And in some ways, it's better... like cargo capacity, fuel efficiency, etc.
It's hard for me even with this news to see the Wii U's hardware as "Nintendo cheaps out on j00 suckers". Because Nintendo's goal was not to make a $500 console that was way more powerful than PS360 plus included an iPad. Again, their self-chosen path and problem is that they deal with the mass market. A $300 console (the base model) sounds like their absolute upper limit for MSRP, to not scare away the authentic mainstream audience. Within that price, their concept for the system included an expensive-to-develop and not-cheap-to-produce touch screen / motion sensing interface device.
Nintendo doesn't seem to have "gimped" on anything within the price range they had to remain within, considering the total components that make up the system. If the Xbox 360 had had a cheaper CPU, it could have had more RAM, for example. But there were specific priorities and they were followed. Wii U was designed with specific priorities and this is what we got.
The joke with the FUD being spread is that you still have ports like ACIII at launch, made in a rush, that effectively look and run about like the PS3 version of the same game. If people stopped and thought for a moment, they'd see that clearly, something in Nintendo's design strategy for the console is working. Otherwise that game would not exist on Wii U and if it did, never with that kind of port parity.
Edit: I would add that the most questionable thing in the entire matter IMO is Nintendo's very obvious entreaties to 3rd parties about Wii U being friendly towards them from a development and power standpoint. Obviously, working on the console involves some major strategic shifts and while that doesn't mean the hardware is bad, it probably does make Nintendo's official PR line sound like damage control. But then we have all those months and months of some 3rd parties saying the hardware is great, some griping it sucks, etc etc. Opinions, woohah!
Except, you know, ARM CPUs haven't been in-order since the Cortex-A9.

Good lord, the amount of morons in this thread is ridiculous. Go look up in-order and out-of-order execution: phone ARM processors are in-order, the WiiU is out-of-order.
There are more factors to it than that. A Core i7 is relatively highly clocked and has better IPC than anything else out there by a country instruction.

You say this, but it's by design. It's either/or: you either sacrifice clock speed for IPC, or sacrifice IPC for clock speed. It would be prohibitively impractical to try to build a CPU with the best of both, especially a consumer-grade one.
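A toy back-of-the-envelope sketch of that trade-off. All IPC figures below are made up purely for the sake of argument; neither chip's real IPC is public:

```python
# Scalar throughput scales roughly with instructions-per-cycle times clock.
# The numbers here are illustrative guesses, NOT measured figures.

def relative_perf(ipc, clock_ghz):
    """Rough single-threaded throughput score: IPC x clock (GHz)."""
    return ipc * clock_ghz

# Hypothetical values: a high-clock/low-IPC design vs a low-clock/higher-IPC one.
high_clock_low_ipc = relative_perf(ipc=0.5, clock_ghz=3.2)
low_clock_high_ipc = relative_perf(ipc=1.3, clock_ghz=1.24)
```

With these made-up inputs the two designs land in the same ballpark, which is the crux of the "don't compare clock for clock" point: clock speed alone tells you very little.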