
Metro Last Light dev: 'Wii U has horrible, slow CPU' [Up: DICE dev comments]

Chaplain

Member
I'll be getting a Wii U, but now that comments from real developers are starting to come out... I don't think Nintendo listened to developers as they claimed a few months ago.

I believe this is also one of the reasons Nintendo didn't release real specs: they knew that if they did, people (like the hardcore) would not buy it. Pretty deceitful if you ask me. =[
 

Nirolak

Mrgrgr
It depends entirely on the game/engine and settings. Look up some Arma 2 benchmarks for fun.

Yep:

[image: Arma 2 CPU benchmark chart at 1280x1024]
 
When looking for Mirror's Edge 2 reactions on Twitter, I noticed that a DICE designer had also commented on the CPU:

[image: screenshot of the DICE developer's tweets; transcribed in a later post]

If there really were problems with the CPU, it would be really easy for a decent developer to use the GPGPU for AI and physics effects. That will be true for the PS4/Xbox 3 too. And they could use the Wii U's DSP chip for audio processing, which took up to 30% of CPU power in some Xbox 360 games.

But some developers are not so experienced right now.
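
For what it's worth, the kind of offload being described looks roughly like this. A minimal sketch, assuming a CUDA-style compute API purely for illustration (the Wii U's GPU has its own compute path, and every name here is hypothetical):

// Minimal sketch: moving a physics integration step off the CPU onto
// the GPU (GPGPU). Illustrative CUDA only -- the Wii U's GPU would use
// a different compute API, and these names are hypothetical.
#include <cuda_runtime.h>

struct Particle { float3 pos, vel; };

__global__ void integrateParticles(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Same math a CPU loop would run, but spread across thousands of
    // GPU threads: apply gravity, then integrate position.
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

void stepPhysicsOnGpu(Particle* d_particles, int n, float dt) {
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    integrateParticles<<<blocks, threads>>>(d_particles, n, dt);
}

That said, this is the friendliest possible case: thousands of independent particles and no branching.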
 

pestul

Member
Yeah, those colorful graphics are nice, but the Wii U's GPU is also sharing a very slow memory pool compared to what traditional modern GPUs are used to. Yeah, yeah, eDRAM, I know... but it isn't always that simple to take a conventional engine and recode it to compensate for that kind of architecture (or for GPGPU).
 

Vinci

Danish
Eh, I don't know. Personally, though I can't in good conscience claim to be above the fray, as that's not a call one can make about oneself, I honestly don't feel like I've got a dog in this fight. I've probably on the whole been more critical than not of the Wii U, but I do own the machine and like Nintendo software, so I have no vested interest in laughing at them as they falter against mightier machines.

I understand their game. There once was a time when they tried to be competitive in having the best graphics in town, and that didn't work all that well for them. Then the Wii was a runaway success. So, yeah, I understand that there's more than one way to succeed in this biz.

However, I'm not quite convinced that their gamble paid off this time. It's possible to take a nuanced stance: I don't necessarily think they should have put out a $400-$500 machine that would knock your socks off visually and sell at a massive loss.

At the same time, going the low-tech route once again, when there was a chance to be at least a marginal improvement over the competition, while seemingly going all-in on the GamePad concept, was a rather dubious decision. People are allowed different reactions, sure, but that particular innovation hasn't struck me as anything more than "neat." I'm not entirely convinced that it's integral or that there's a clear vision for what really sets this system apart.

We're talking about 'tech' threads in general here, though. Let's be honest: when people on GAF talk about tech, they by and large don't factor things like controllers or peripherals into that discussion. People dismissed whatever technical innovations were in the Wii Remote, so it's safe to assume they will do the same with the Wii U's controller.

It's about the CPU, GPU, and so on. I think the breakdown I listed is fairly representative of why tech threads, by and large, are so divisive. Yes, there are exceptions, but these three categories seem to make up a pretty large share of the population posting on the gaming side - and thus are a good illustration of why such threads go bonkers.

I tend to avoid tech threads because, honestly, I have a hard time agreeing with any of these sides completely. I think Nintendo's approach is smarter; I think Sony and MS's approach is cooler; and I think the PC is the best gaming experience available and wonder why more console gamers don't play in that space more often.

Oh well.

I believe this is also one of the reasons Nintendo didn't release real specs: they knew that if they did, people (like the hardcore) would not buy it. Pretty deceitful if you ask me. =[

Yes, the 'hardcore' - always the faithful proponent and fanboi of Nintendo's wares.
 

Nirolak

Mrgrgr
For the benefit of those of us who may be at work, behind a company firewall that blocks image sharing websites and therefore cannot read the pasted comment from the DICE dev...

What did the DICE dev say?

Here's a text version:

DICE said:
Gustav Halling ‏@gustavhalling

This is also what I been hearing within the industry, to bad since it will shorten its life a lot when new gen. starts. http://m.kotaku.com/5962354/the-wii-u-has-a-horrible-slow-cpu-according-to-these-developers

---

Jason Leeming ‏@JayLeemin

@gustavhalling Sounds like they just couldn't be bothered. Even with a slow CPU, the Wii U has a faster GPU and a lot more RAM than PS360.

---

Gustav Halling ‏@gustavhalling

@JayLeemin GPU and ram is nice to have shaders/textures loaded. Physics and gameplay run on CPU mostly so player count is affected etc.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
I'll be getting a Wii U, but now that comments from real developers are starting to come out... I don't think Nintendo listened to developers as they claimed a few months ago. More like, "We'll do our own shit, deal with it."

Exactly. My guess is that the more Nintendo heard what third parties were saying, the more they were convinced it was the wrong way to go.

For what reason I don't know. I wonder if they just think that business model is unsustainable. Perhaps they wanted to offer a console that would cost much less to develop for? It's puzzling.

Either way, we've got yet another GameCube/Wii third-party situation on our hands. As long as the first-party games are there I'm happy, since I'll be getting one of the other new consoles as well, but I feel bad for those interested in gaming who end up with only a Wii U.
 
Wasn't the out-of-order execution supposed to make up for raw clock speed in comparison to the 360/PS3? I remember reading that, but I'm pretty technologically ignorant, so I'm not going to pretend I know what I'm talking about.
 
If there really were problems with the CPU, it would be really easy for a decent developer to use the GPGPU for AI and physics effects. That will be true for the PS4/Xbox 3 too.

But some developers are not so experienced right now.

Yeah, that guy at DICE who asserts that it could be problematic probably isn't a decent developer.
 
I'm both surprised and not surprised that this thread is so many pages. I mean, what is there to say other than that this is a pretty big disappointment no matter which way you slice it?
 

pestul

Member
but it is 1280x1024... CPU bottlenecks do not really occur at that resolution

Also, I see a lot of people with insane setups getting very bad performance in Arma 2, so that isn't a good engine to go by, imo...
I'm not sure what you're trying to say. Most Wii U games are going to be 1280x720... fewer pixels than in that graphic.

EDIT: Oh, I see, you're talking about that specific game engine and its issues.
 

Nirolak

Mrgrgr
but it is 1280x1024... CPU bottlenecks do not really occur at that resolution

Also, I see a lot of people with insane setups getting very bad performance in Arma 2, so that isn't a good engine to go by, imo...

Your CPU isn't usually doing per-pixel calculations, so the resolution isn't going to determine whether you're CPU-bottlenecked. It's an integer-focused processor normally used for gameplay and simulation mechanics.
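
A minimal sketch of that split, with hypothetical names (plain C++ for the CPU half; the GPU half is described in the comment):

// Sketch of where per-frame costs land. CPU work scales with entity
// count; GPU shading work scales with pixel count.
struct Entity { float x, y, vx, vy; };

void simulateFrame(Entity* entities, int numEntities, float dt) {
    // CPU side: O(numEntities) per frame. Dropping the render
    // resolution from 1920x1080 to 1280x720 changes nothing here,
    // which is why low resolutions expose the CPU as the bottleneck.
    for (int i = 0; i < numEntities; ++i) {
        entities[i].x += entities[i].vx * dt;
        entities[i].y += entities[i].vy * dt;
    }
}

// GPU side (conceptually): the fragment shader runs once per covered
// pixel, so its cost is roughly O(width * height) and shrinks as the
// resolution drops -- while the CPU cost above stays fixed.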
 

Chaplain

Member
Nintendo never play the numbers game.

That is not an accurate statement. Nintendo only started doing this with the Wii; all systems up until the Wii had real specs revealed.

If a company doesn't want to reveal real specs, it is because they are hiding something. Developers coming out now and saying what the Wii U can actually do shows that no matter how hard a company tries to conceal the truth, it will eventually come out.

I find it strange, though, that these developers couldn't have released this info last week, because if they had, many would not have bought the Wii U.
 

TheExodu5

Banned
True, but Arma 2 is not really an optimized game.

If you read the article you can see they recreate these outcomes with 6 different games; all I'm saying is the CPU isn't everything.

And I can recreate outcomes similar to Arma 2 with many other games as well. Take BFBC2 (possibly BF3), StarCraft 2, World of Warcraft, Civilization V. These games are all going to be CPU-limited unless you're running an overclocked Sandy Bridge or Ivy Bridge CPU.

It's really a sliding scale for a lot of games. As you change settings/resolution or change hardware, the bottleneck tips from the CPU to the GPU and vice versa.

If a company doesn't want to reveal real specs, it is because they are hiding something. Developers coming out now and saying what the Wii U can actually do shows that no matter how hard a company tries to conceal the truth, it will eventually come out.

No, it simply means hardware specs are not the main focus of this piece of hardware.

See Apple if you want another example of this. They don't focus on hardware specs, and yet they have the fastest tablet on the market by a fairly large margin.
 

Durante

Member
Guru3D did a cool article on CPU scaling a few years ago; for gaming, the GPU and RAM always mattered a lot more than the CPU above certain resolutions and clock speeds:

This was, for example, Stalker:

[image: Guru3D CPU-scaling benchmark chart for Stalker]
Stalker is a game from 5 years ago.

Assassin's Creed 3:
[image: Assassin's Creed 3 CPU benchmark chart]


but it is 1280x1024... CPU bottlenecks do not really occur at that resolution
Huh? That's the wrong way around entirely. The lower the resolution, the more likely your framerate is to be bottlenecked by the CPU.
 

Vinci

Danish
That is not an accurate statement. Nintendo only started doing this with the Wii; all systems up until the Wii had real specs revealed.

If a company doesn't want to reveal real specs, it is because they are hiding something. Developers coming out now and saying what the Wii U can actually do shows that no matter how hard a company tries to conceal the truth, it will eventually come out.

I find it strange, though, that these developers couldn't have released this info last week, because if they had, many would not have bought the Wii U.

Nintendo revealed GameCube specs. Those specs looked weaker than the PS2's and Xbox's because Nintendo gave real-world performance numbers rather than theoretical numbers like the other two did. This was used to attack the GameCube and Nintendo. Afterwards, they decided not to reveal specs anymore.

You can disagree with this choice. But the notion that they got rid of specs simply to hide how weak the Wii was is paranoia at its finest. No one, not even for a second, thought the Wii was anywhere in the same realm as the 360 or PS3. No one.
 
The funny thing is: I took it for granted that it was going to be more powerful, and yet I would have found it embarrassing even if the console were just marginally superior (let's say 2x-3x, in general terms).

It turned out that in many ways it's even inferior, which is somewhat hilarious when you consider that we are talking about 6-7-year-old tech, and even some smartphones and tablets are starting to match that level of hardware capability.

It's really astounding, isn't it? Months and months of speculation and 'insider info' said it was going to be anywhere from 3-6x or what have you; it was basically confirmed. I believed it 100%. People were so sure and confident about it, with forum insiders predicting it. And here we are, in a scenario worse than anyone but the most ardent anti-Ninty fanboy could have imagined.
 

MedIC86

Member
Stalker is a game from 5 years ago.

Huh? That's the wrong way around entirely. The lower the resolution, the more likely your framerate is to be bottlenecked by the CPU.

Yeah, sorry, that's what I meant; I wrote it wrong. The higher your resolution, the more the GPU becomes a factor.
 

Durante

Member
If there really were problems with the CPU, it would be really easy for a decent developer to use the GPGPU for AI and physics effects. That will be true for the PS4/Xbox 3 too. And they could use the Wii U's DSP chip for audio processing, which took up to 30% of CPU power in some Xbox 360 games.
Really easy? I have to call you out on that. I've done GPGPU since 2005, and "really easy" is the last thing I'd call it. It has been getting progressively easier since then, but it's still far more involved than just running the work on a CPU.
 

TheExodu5

Banned
It's really astounding, isn't it? Months and months of speculation and 'insider info' said it was going to be anywhere from 3-6x or what have you; it was basically confirmed. I believed it 100%. People were so sure and confident about it, with forum insiders predicting it. And here we are, in a scenario worse than anyone but the most ardent anti-Ninty fanboy could have imagined.

Does this number keep getting higher or something?

First it was 1.5x. Then it was 2-3x. Now it's 3-6x?

What's with this revisionist history?
 

gofreak

GAF's Bob Woodward
If there really were problems with the CPU, it would be really easy for a decent developer to use the GPGPU for AI and physics effects.

In theory some of the 360/PS3 ports could use GPGPU for some tasks to improve performance where the CPU is limiting framerate at the moment.

It's not a common way of doing things with 360/PS3 games though, and developers are unlikely to explore that for one SKU :/

When that kind of processing becomes more common - PS4/720 - the Wii U will probably have big relative GPU performance problems.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Does this number keep getting higher or something?

First it was 1.5x. Then it was 2-3x. Now it's 3-6x?

What's with this revisionist history?

Even IdeaMan, who was super-positive about the system, maxed out at 2.25-2.5x the 360, and that was including the output to the screen.
 

Alexios

Cores, shaders and BIOS oh my!
It's really astounding, isn't it? Months and months of speculation and 'insider info' said it was going to be anywhere from 3-6x or what have you; it was basically confirmed. I believed it 100%. People were so sure and confident about it, with forum insiders predicting it. And here we are, in a scenario worse than anyone but the most ardent anti-Ninty fanboy could have imagined.
What's this 100% confirmed scenario we ended up with then? Is it 0.5 or what is the current number? Need to know pls.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
I find it strange, though, that these developers couldn't have released this info last week, because if they had, many would not have bought the Wii U.

And I find it strange you think so many people in the "Average Joe" consumer base actually care about RAM speed/CPU power.
 

Vinci

Danish
Does this number keep getting higher or something?

First it was 1.5x. Then it was 2-3x. Now it's 3-6x?

What's with this revisionist history?

It keeps getting higher as his disappointment grows.

And I find it strange you think so many people in the "Average Joe" consumer base actually care about RAM speed/CPU power.

In his defense, the 'Average Joe' does care about whether the system gets games he wants or not. The magical formula which gets those games on a system? You're right. 'Average Joe' is, like, "What the fuck do I know?" but it's still relevant to his interests.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
What's this 100% confirmed scenario we ended up with then? Is it 0.5 or what is the current number? Need to know pls.

It might be around equal in power, but I know the screen takes up some resources, so I'd estimate a little above the 360.

My guess is we won't see any game that looks clearly above the 360 until E3. Probably the next Zelda/EAD Mario.
 

2MF

Member
In theory some of the 360/PS3 ports could use GPGPU for some tasks to improve performance where the CPU is limiting framerate at the moment.

It's not a common way of doing things with 360/PS3 games though, and developers are unlikely to explore that for one SKU :/

When that kind of processing becomes more common - PS4/720 - the Wii U will probably have big relative GPU performance problems.

Porting code from the CPU to the GPU to gain performance is not always possible due to the very different ways they work. Even when it's possible, it's harder than porting from PPU to SPU on the PS3's Cell, and we all know how much trouble most devs had with the latter...

I wouldn't count much on the GPU to save a weak CPU, if the CPU is in fact weak. Some developers may pull it off for specific games.
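
To make "the very different ways they work" concrete, here's a hypothetical CUDA-style sketch, not anyone's actual engine code: typical gameplay logic is branchy and index-chasing, which is exactly the pattern GPUs execute worst.

// GPUs run threads in lockstep groups (warps/wavefronts). When
// neighboring threads take different branches, the hardware executes
// each path serially; scattered reads also defeat the coalesced
// memory access GPUs depend on.
struct AIAgent {
    int   state;   // 0 = patrol, 1 = chase, 2 = flee
    int   target;  // index of another agent (pointers into a host-side
                   // object graph wouldn't even be valid on the GPU)
    float x;
};

__global__ void updateAgents(AIAgent* a, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    switch (a[i].state) {   // divergent branch: neighbors in a warp
    case 0:                 // rarely share a state, so the paths run
        a[i].x += dt;       // one after another
        break;
    case 1:                 // scattered read through an index
        a[i].x += (a[a[i].target].x - a[i].x) * dt;
        break;
    default:
        a[i].x -= dt;
        break;
    }
}
// Particle physics maps onto this model well; state-machine gameplay
// code usually has to be restructured before the GPU buys you anything.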
 

gofreak

GAF's Bob Woodward
Even IdeaMan, who was super-positive about the system, maxed out at 2.25-2.5x the 360, and that was including the output to the screen.

In fairness, there were at least one or two posters in there harping on and on about 600 GFLOPS in the GPU - which would be on the 3x end of that scale. I don't know how seriously people were taking them... but it came up a lot.

I don't think anyone was ever speculating much beyond that, though, let alone 6x.


Porting code from the CPU to the GPU to gain performance is not always possible due to the very different ways they work. Even when it's possible, it's harder than porting from PPU to SPU on the PS3's Cell, and we all know how much trouble most devs had with the latter...


That's all true. But I think we will see the model gain some traction, and middleware, on the physics side for example, will provide some relatively transparent solutions early on. The CPUs in the next systems are unlikely to be 'big' - even if I do hope they at least outperform the 360's :)
 

SmokyDave

Member
It's really astounding, isn't it? Months and months of speculation and 'insider info' said it was going to be anywhere from 3-6x or what have you; it was basically confirmed. I believed it 100%. People were so sure and confident about it, with forum insiders predicting it.
Nah, I saw this coming from the moment they revealed the system....

And here we are, in a scenario worse than anyone but the most ardent anti-Ninty fanboy could have imagined.

....ahem. Um....

*whistles innocently*
 
What's this 100% confirmed scenario we ended up with then? Is it 0.5 or what is the current number? Need to know pls.

Like 1.25x, maybe? No idea. A DBZ power level would be more accurate than my estimate, but from what I gather it's probably anywhere from 1.10x to 1.5x, so 110% to 150%.

In fairness, there were at least one or two posters in there harping on and on about 600 GFLOPS in the GPU - which would be on the 3x end of that scale. I don't know how seriously people were taking them... but it came up a lot.

I don't think anyone was ever speculating much beyond that, though, let alone 6x.
Wasn't that bgassassin? Correct me if I'm wrong.
 

TheExodu5

Banned
Like 1.25x, maybe? No idea. A DBZ power level would be more accurate than my estimate, but from what I gather it's probably anywhere from 1.10x to 1.5x, so 110% to 150%.

There is no current number. Some aspects of the system are better, some are worse. It's all going to depend on how the game is developed. Games that are developed primarily for the Wii U hardware will obviously fare better than ports that originate on other platforms.
 

MedIC86

Member
Games that are developed primarily for the Wii U hardware will obviously fare better than ports that originate on other platforms.

This is also the case with the PS3: despite having a pretty powerful processor, most multiplatform games performed worse on the PS3 than on the Xbox 360. But games developed for the PS3 (like Uncharted) looked and performed pretty well.
 
Nah, I saw this coming from the moment they revealed the system....



....ahem. Um....

*whistles innocently*

I dunno, I seriously doubt that. Like, if you had to put some money on it, would you have said it was equal to or just a tiny bit above 7-year-old hardware? That the RAM would be 43% slower, that the CPU would be a joke? There were plenty of people who said that in a trollish kind of way, but did you genuinely believe it? I certainly didn't; I always said it would be more powerful.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
In fairness, there were at least one or two posters in there harping on and on about 600 GFLOPS in the GPU - which would be on the 3x end of that scale. I don't know how seriously people were taking them... but it came up a lot.

I don't think anyone was ever speculating much beyond that, though, let alone 6x.

True. It's GAF, though. There's always one or two.

I think the words GPGPU and eDRAM should be bannable.

Also we all should have listened to Arkam. He tried to tell us.

Again, to be fair, when somebody comes into a thread and says that a system released 6 years after its competitors will have a worse CPU than said competitors, it deserves to be met with criticism. It's just so illogical it's mind-boggling. I bet even the most ardent anti-Nintendo people wouldn't have thought they'd actually go ahead and do it.
 
I really don't want to derail, but since I'm not into specs and such things, I want to know what config I could get for $300 (not counting monitor and speakers, since clearly everyone has a TV and not everyone has a PC monitor, but counting mouse, keyboard, and controller)?

I'm not a specialist, but I'm almost sure I won't get a good gaming PC for $300... am I wrong?

So maybe some of Nintendo's choices are debatable, but it's not like having more RAM, a better CPU, and everything else wouldn't have led to a $400 or $500 system.

God, I hate all this... it's like I'm on the French hardware.fr forum.

I know hardware is part of gaming, or at least it clearly shows the orientation of the games we'll play, but does no one here know Nintendo's strategy? You really need to read the article about the Wii and its creation. It's either this or Nintendo goes head-to-head with MS and Sony... which would be really, really stupid.
 

Data West

coaches in the WNBA
You guys should have known something was up when Vigil said it was really powerful. Just like how they said the console versions of Darksiders 2 wouldn't have tearing.
 

Vinci

Danish
Again, to be fair, when somebody comes into a thread and says that a system released 6 years after its competitors will have a worse CPU than said competitors, it deserves to be met with criticism. It's just so illogical it's mind-boggling. I bet even the most ardent anti-Nintendo people wouldn't have thought they'd actually go ahead and do it.

I always assumed it would be on the low end, but they sort of went beyond the low end of the range I had in mind.

Chû Totoro;44595807 said:
I know hardware is part of gaming, or at least it clearly shows the orientation of the games we'll play, but does no one here know Nintendo's strategy? You really need to read the article about the Wii and its creation. It's either this or Nintendo goes head-to-head with MS and Sony... which would be really, really stupid.

Nintendo has come out with a system that, like its predecessor, creates massive roadblocks for third parties working with it. That is bad, definitively. I mean, I'm fine if Nintendo brings out the Wii U and MS and Sony bring out the Durango and Orbis, and those two systems are dramatically more powerful... so long as games are portable from one to the other without too massive an investment or too much trouble.

This aspect of the Wii U has, IMO, been botched. And I am not a Nintendo hater by any stretch.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
There is no current number. Some aspects of the system are better, some are worse. It's all going to depend on how the game is developed. Games that are developed primarily for the Wii U hardware will obviously fare better than ports that originate on other platforms.

Same thing with the GameCube and PS3. Next E3 will be the big one for Nintendo to show these off, if they exist.
 

Nirolak

Mrgrgr
This is also the case with the PS3: despite having a pretty powerful processor, most multiplatform games performed worse on the PS3 than on the Xbox 360. But games developed for the PS3 (like Uncharted) looked and performed pretty well.

The PS3 also has memory architecture issues and a less-than-ideal GPU.

You really don't want your system to stick out as a poor performer on the basics; otherwise, porting goes badly.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Chû Totoro;44595807 said:
I really don't want to derail, but since I'm not into specs and such things, I want to know what config I could get for $300 (not counting monitor and speakers, since clearly everyone has a TV and not everyone has a PC monitor, but counting mouse, keyboard, and controller)?

I'm not a specialist, but I'm almost sure I won't get a good gaming PC for $300... am I wrong?

So maybe some of Nintendo's choices are debatable, but it's not like having more RAM, a better CPU, and everything else wouldn't have led to a $400 or $500 system.

God, I hate all this... it's like I'm on the French hardware.fr forum.

I know hardware is part of gaming, or at least it clearly shows the orientation of the games we'll play, but does no one here know Nintendo's strategy? You really need to read the article about the Wii and its creation. It's either this or Nintendo goes head-to-head with MS and Sony... which would be really, really stupid.

One thing people keep ignoring on GAF is the cost of the controller. There was a thread last week about how new the tech in the controller was, which I'm sure drove costs up.

I agree with you on the last part. Nintendo just isn't in the market to go head-to-head with Sony/MS. As I've said for over a year now, I believe Nintendo is firmly entrenching itself as the "stopgap" system. My guess is we see the next Nintendo system about halfway between the release of the PS4 and PS5.
 