
Metro Last Light dev: 'Wii U has horrible, slow CPU' [Up: DICE dev comments]

ThatObviousUser

ὁ αἴσχιστος παῖς εἶ
You never know with Nintendo and secrets. They don't like to announce things until it's the last moment.

For all we know Metro: Last Light and BF4 might be announced Dec 10

Delusion: It's what's for dinner tonight.

With a side helping of snuggling a body pillow with Reggie's face taped to it.
 

KageMaru

Member
My guess is that the eDRAM provides some benefit to using compute shaders. And that's probably the extent to which it's any more of a GPGPU than the others in the R700 line. It's certainly not going to be modified to the extent that it's comparable to Southern Islands at GPGPU. Way too many SRAM registers on those cards to duplicate at 40nm.

The eDRAM may have some benefit but I still don't see GPGPU making that big of an impact. IMO it'll be limited to what we see in games today (particle physics is one example).
 
You people are stupid.


Let me break it down for you using something we can all understand. The Kenny ratio.


Now Kenny Rogers wrote a song called Lucille back in '77 that essentially launched his solo career after leaving The First Edition. Personally I think he never should've left those guys instead of swinging his smelly cock and balls around with Dolly Parton. Why get rid of a good thing? Who the hell is she to tell Kenny what to do? Anyway, he did a few movies with a gorilla and did a commercial for some knife company. Afterwards he went on to star in the classic film Six Pack. Six Pack was basically The Bad News Bears with more reckless drunken child abuse. Hilarious. Kenny loved getting drunk and employing underage labor, so as far as he knew he was filming a documentary. It was a triumph, and it brought our fractured nation together after the whole fiasco the year before at the Knoxville World's Fair with New Edition/The First Edition. We were ready to move on and forgive.

So now Kenny's beard at the time was a sexy salt and peppered little number gracefully hugging his meaty, ample jowls. That's the way we liked it. That's the way we'll remember Kenny - beating Dolly Parton senselessly on stage at the 1981 country music awards after losing to Alabama's Elvira. We all loved that song. Alabama went on to become Georgia's state band. Hats were made.

So using inequality rules with Kenny's beard as the absolute value of a complex number we can then calculate how many Kenny's would be needed to fuck the shit out of the Wii U's CPU. Simple.




So now we're at roughly 12 full Kenny's and a chinstrap beard.



Case closed.



You're welcome.
 

zoukka

Member
These same developers will have no issues "learning a new console" when it's, you know, more powerful.

Yep, the level of effort depends on the dev. Some are eager to try developing for simpler hardware like handhelds, mobile and weaker home consoles. Some are more rigid in their planning, and that's perfectly OK for them.
 
IMO, Nintendo has made a bet: put in a decent GPU, but a slower CPU. They did everything to ease the CPU's pain: a DSP for sound, ARM cores for the OS, and maybe GPGPU functions.
That's what Nintendo is betting on: developers using the CPU less, Sony and Microsoft aiming for a slower CPU too; in short, everyone going GPGPU.
If so, Nintendo could get downgraded ports as well; if not, 3rd party support is unlikely to happen. That may explain why the Wii U """"struggles"""" with current ports: they're made for consoles relying on CPUs with high frequencies, lots of threads and FP.
 

OniShiro

Banned
IMO, Nintendo has made a bet: put in a decent GPU, but a slower CPU. They did everything to ease the CPU's pain: a DSP for sound, ARM cores for the OS, and maybe GPGPU functions.
That's what Nintendo is betting on: developers using the CPU less, Sony and Microsoft aiming for a slower CPU too; in short, everyone going GPGPU.
If so, Nintendo could get downgraded ports as well; if not, 3rd party support is unlikely to happen. That may explain why the Wii U """"struggles"""" with current ports: they're made for consoles relying on CPUs with high frequencies, lots of threads and FP.

The difference being that the Wii U CPU is supposedly barely on par with or worse than the PS360 CPUs, while the PS4 and X720, while not being beasts, will be fairly superior to the PS360 CPUs.
 

Ryoku

Member
Yep, the level of effort depends on the dev. Some are eager to try developing for simpler hardware like handhelds, mobile and weaker home consoles. Some are more rigid in their planning, and that's perfectly OK for them.

I'd put some of it on the publisher, as well.
 
The difference being that the Wii U CPU is supposedly barely on par with or worse than the PS360 CPUs, while the PS4 and X720, while not being beasts, will be fairly superior to the PS360 CPUs.




Well, it depends. If those are Jaguar CPUs, they may not be that much faster. It may depend on the purpose. But my point was that we may not have games relying that much on the CPU (high frequencies, lots of threads, etc.). At least, that's what Nintendo is betting; personally, I don't know what to expect.
 
I'd put some of it on the publisher, as well.
Yeah, I've never worked in the VG industry, but pubs probably allocate budgets/resources to particular SKUs depending on the strength of their respective business cases. Considering no one will legitimately expect Wii U versions of multiplat games to sell as well as their PS360 counterparts, I'm not sure they'll invest a lot in them.
 

Durante

Member
With all this talk about how underpowered the Wii U is, I thought of something interesting. Will the fact that the Wii U is cheap and underpowered put Nintendo in the best position for cloud gaming?
I don't see it. If you just want to do cloud gaming (let's leave the question of who would want that aside for a moment) then something like Ouya is sufficient, and it's much cheaper than even the Wii U.

My guess is that the eDRAM provides some benefit to using compute shaders. And that's probably the extent to which it's any more of a GPGPU than the others in the R700 line. It's certainly not going to be modified to the extent that it's comparable to Southern Islands at GPGPU. Way too many SRAM registers on those cards to duplicate at 40nm.
That makes sense.
 

CrunchinJelly

formerly cjelly
Eurogamer got a response from someone at 4A/THQ:

http://www.eurogamer.net/articles/2...-slow-cpu-claim-but-developer-concern-remains

Yesterday Eurogamer spoke with THQ's Huw Beynon, who works full time as a representative of 4A Games and Metro, to expand on Shishkovtsov's comment.

“I think there was one comment made by Oles the programmer - the guy who built the engine,” he said.

“It's a very CPU intensive game. I think it's been verified by plenty of other sources, including your own Digital Foundry guys, that the CPU on Wii U on the face of it isn't as fast as some of the other consoles out there. Lots of developers are finding ways to get around that because of other interesting parts of the platform.

“I think that what frustrates me about the way the story's been spun out is that there's been no opportunity to say, 'Well, yes, on that one individual piece maybe it's not as... maybe his opinion is that it's not as easy for the way that the 4A engine's been built as is the others.

“What it doesn't go on to look at is to say that, you know, we could probably get around that. We could probably get Metro to run on an iPad if we wanted, or on pretty much anything. Just as in the same way that between PC and current console versions there are some compromises that need to be made in certain places and we strive to get the very best performance that we can from any platform we release on.

“But I understand that there's a real appetite in the media at the moment because the Wii U is a hot topic to spam some stories that are going to attract a lot of links if they present it in a certain way.”

So, will Metro: Last Light ever appear on Nintendo's first high definition home console?

“We looked at Wii U as a target platform,” Beynon said. "It's a really small studio. There were 50 for Metro 2033, there are 80 now. With Metro 2033 most of their experience was with the PC. The Xbox 360 was their first console version. We've now added PlayStation 3 to the mix.

“We genuinely looked at what it would take to bring the game to Wii U. It's certainly possible, and it's something we thought we'd like to do. The reality is that would mean a dedicated team, dedicated time and effort, and it would either result in a detriment to what we're trying to focus on, already adding a PlayStation 3 SKU, or we probably wouldn't be able to do the Wii U version the justice that we'd want.

“It would be a port or we wouldn't be able to get to grips with the system. That's really the essence of it. It's something we can potentially look at and return to later. Given the targets we've set for the game, it didn't make sense to proceed with it at this point.”
 
So let's say this is true, and the Wii U does indeed become another system that 3rd parties don't want to develop for. Where does that leave Nintendo? Most people would think right down in the gutter. Yet there is one aspect of gaming that most of us completely ignore and that also seems to be hitting a high mark now. Indie gaming is something that Nintendo can really pay huge attention to, which I'm sure they are aware of by now.

With Nintendo's online being more open, which is sorta like a free open door for indie devs to really open up their games and get them out there, I strongly believe that indie devs are going to be in paradise with this system. Now of course a lot of you out there will not agree with me on this, but I'm just saying that more and more indie devs would love to push their games onto the Wii U; hell, even mobile gaming will also see its way onto the Wii U down the road. I'm hoping it will anyway, cause I believe that indie devs are really going to change this industry for the best in the near future. At least that's what I think.
 

Panajev2001a

GAF's Pleasant Genius
So let's say this is true, and the Wii U does indeed become another system that 3rd parties don't want to develop for. Where does that leave Nintendo? Most people would think right down in the gutter. Yet there is one aspect of gaming that most of us completely ignore and that also seems to be hitting a high mark now. Indie gaming is something that Nintendo can really pay huge attention to, which I'm sure they are aware of by now.

With Nintendo's online being more open, which is sorta like a free open door for indie devs to really open up their games and get them out there, I strongly believe that indie devs are going to be in paradise with this system. Now of course a lot of you out there will not agree with me on this, but I'm just saying that more and more indie devs would love to push their games onto the Wii U; hell, even mobile gaming will also see its way onto the Wii U down the road. I'm hoping it will anyway, cause I believe that indie devs are really going to change this industry for the best in the near future. At least that's what I think.

Usually small indie studios are not the ones with super expensive middleware and tools, and they're sometimes without the time or experience to optimize things down to the metal. Those creative indie developers are the ones who would suffer more from slow HW; big AAA studios can throw artists and programmers at the problem, optimize each asset and each scene, and work around big technical bottlenecks thanks to their experience in faking detail and great effects with believable hacks. Well, up to a certain point.
With that said, I hope Unity mentions more of their Wii U plans soon and that Nintendo opens up more to the indie scene like MS and Sony have done.
 
“But I understand that there's a real appetite in the media at the moment because the Wii U is a hot topic to spam some stories that are going to attract a lot of links if they present it in a certain way.”

Sums up my thoughts on the whole situation. An article about Last Light that has gotten more press due to current Wii U concerns among enthusiasts. This piece will be remembered for that and likely nothing else. But it does draw some attention to 4A & Metro, so it's kinda like duct-tape PR in a roundabout way (though not intentionally).
 

Kenka

Member
Back to my cluelessness and questions about "how hardware works". The X360 has a tri-core/six-thread CPU with a frequency of 3.2 GHz. I remember people saying that the architecture plays a big role, and that a Pentium 4 at 3 GHz is comfortably surpassed by an i3-2100 clocked lower. Maybe this is a similar case? The Wii U CPU may have a lower clock frequency but higher output? It's a serious question; I am not interested in siding with anyone, I just want to learn a bit.
 
Usually small indie studios are not the ones with super expensive middleware and tools, and they're sometimes without the time or experience to optimize things down to the metal. Those creative indie developers are the ones who would suffer more from slow HW; big AAA studios can throw artists and programmers at the problem, optimize each asset and each scene, and work around big technical bottlenecks thanks to their experience in faking detail and great effects with believable hacks. Well, up to a certain point.
With that said, I hope Unity mentions more of their Wii U plans soon and that Nintendo opens up more to the indie scene like MS and Sony have done.

Agreed.
 

Log4Girlz

Member
Back to my cluelessness and questions about "how hardware works". The X360 has a tri-core/six-thread CPU with a frequency of 3.2 GHz. I remember people saying that the architecture plays a big role, and that a Pentium 4 at 3 GHz is comfortably surpassed by an i3-2100 clocked lower. Maybe this is a similar case? The Wii U CPU may have a lower clock frequency but higher output? It's a serious question; I am not interested in siding with anyone, I just want to learn a bit.

Architecture has a lot to do with performance. Transistor counts, frequencies, architecture all factor into how "powerful" a CPU is. Now, unless the Wii CPU was some beast, which I have yet to hear anyone ever claim it was, I doubt a triple-core Broadway is going to impress anyone.
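To make the clock-vs-architecture point concrete: a common first-order model is throughput ≈ IPC (instructions per cycle) × clock. The IPC numbers below are purely illustrative placeholders, not measured values for any real chip:

```python
# First-order model: single-thread throughput ~ IPC * clock.
# IPC figures here are illustrative, not benchmarks of real CPUs.

def relative_perf(ipc, clock_ghz):
    """Very rough single-thread throughput score."""
    return ipc * clock_ghz

# Hypothetical: a narrow, high-clock design vs. a wider, lower-clock one.
narrow_high_clock = relative_perf(ipc=0.8, clock_ghz=3.2)  # P4-style pipeline
wide_low_clock = relative_perf(ipc=1.8, clock_ghz=2.4)     # modern wide core

# 0.8*3.2 = 2.56 vs 1.8*2.4 = 4.32: the lower-clocked design wins.
print(narrow_high_clock, wide_low_clock)
```

This is the mechanism behind the Pentium 4 vs. i3 comparison in the question: clock alone tells you little without the per-cycle work each core gets done.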
 

TheD

The Detective
Back to my cluelessness and questions about "how hardware works". The X360 has a tri-core/six-thread CPU with a frequency of 3.2 GHz. I remember people saying that the architecture plays a big role, and that a Pentium 4 at 3 GHz is comfortably surpassed by an i3-2100 clocked lower. Maybe this is a similar case? The Wii U CPU may have a lower clock frequency but higher output? It's a serious question; I am not interested in siding with anyone, I just want to learn a bit.

No.

The reason the i3 is faster is that it uses a much more advanced architecture that takes up a lot more transistors (165 million vs 504 million), and more transistors mean a bigger die (at the same process node).

IBM says the WiiU CPU is on a 45nm process, the same as the CPU in the 360 slim.
The WiiU CPU die size is 33mm squared, the 360 CPU is (by my calculations) 93.4mm squared.
 

Persona86

Banned
If I had a quarter for every person who is using the term GPGPU and probably have absolutely no idea what they go on about...
Haha looks like we've got a know it all here.

All I'm saying is; if they stick more ram on the CPU harddrive, then the developers can actually make the most out of the 360 bit systems without having to spend too much time hacking the GPGPU motherboard.

Unfortunately with the WiiU, developers now have to waste time trying free up limited harddrive space from the CPU as if the CPU ram has a magical workaround haha. It's all bullcrap, and it's all Nintendo's fault. They need to pull their socks up ASAP.
 
Reading this thread feels like I've gone back in time to 2006/7, with PS3/Sony fans in exactly the same denial mode when sub-par ports of common games were being released because of complexities in its architecture. Lots of talk about SPEs acting as GPGPU-like parts and feeding general-purpose code onto them.

The Wii-U is going to get sub-par ports of multi-platform games while the PS3/360 are out, that is for sure. Once they are both EOL'd, it is likely that third parties will migrate to Orbis/Durango and kill current-gen development; if the Wii-U does not have the userbase for non-Nintendo games, it is likely that it will be dead for third parties. However, just like the PS3 got the best-looking games of the generation via strong first-party development, the Wii-U will also get great-looking games via Nintendo's internal studios, who will use the weird architecture to its fullest.

To me it means that instead of becoming more third-party friendly, Nintendo are set to become more reliant on first-party games, the opposite of what was intended with the Wii-U. All it would have taken is $10 more on the CPU and $5 more on the RAM. For less than $20 per unit Nintendo have basically sabotaged healthy third-party relations. The difference between Nintendo and Sony (who were in the same position in 2006) is that Sony sent out first-party developers and engineers to third parties to help them get to grips with the PS3's oddball architecture. Sometimes that meant they wound up helping the 360 version of the game as well, but they still did it and ensured that the PS3 received all major third-party games; they even got previously exclusive games like Bioshock and Mass Effect ported. I'm not sure that Nintendo will take the same steps to ensure third-party support.
 

Clockwork

Member
No.

The reason the i3 is faster is that it uses a much more advanced architecture that takes up a lot more transistors (165 million vs 504 million), and more transistors mean a bigger die (at the same process node).


IBM says the WiiU CPU is on a 45nm process, the same as the CPU in the 360 slim.
The 360 CPU is nearly 3 times the size.


I haven't looked, but aren't the CPU and GPU integrated together in the 360 Slim? So is that 3x size just for Xenon, or for Xenon+Xenos?
 

TheD

The Detective
I haven't looked, but aren't the CPU and GPU integrated together in the 360 Slim? So is that 3x size just for Xenon, or for Xenon+Xenos?

That is just the CPU size.

I took the die size of the old 65nm non integrated CPU (135mm squared) then took into account the process shrink from 65nm to 45nm (45 goes 1.4444444444444 times into 65, then took 135 and divided it by 1.4444444444444, getting 93.4).
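For reference, the estimate above can be reproduced in a couple of lines. Note that it divides the area by the node ratio once; an ideal shrink would scale area by the ratio squared (both die dimensions shrink), so the poster's method is the more conservative of the two:

```python
# Reproducing the die-shrink estimate from the post above.
old_area_mm2 = 135.0  # 65nm Xenon die area (mm^2)
ratio = 65 / 45       # ~1.444 node ratio

linear_estimate = old_area_mm2 / ratio       # poster's method: ~93.5 mm^2
ideal_shrink = old_area_mm2 / ratio ** 2     # ideal area scaling: ~64.7 mm^2

print(round(linear_estimate, 1), round(ideal_shrink, 1))
```

Real shrinks typically land somewhere between these two numbers, since not every structure on a die scales with the process.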
 

Clockwork

Member
That is just the CPU size.

I took the die size of the old 65nm non integrated CPU (135mm squared) then took into account the process shrink from 65nm to 45nm (45 goes 1.4444444444444 times into 65, then took 135 and divided it by 1.4444444444444).

Thanks. It's 5am and I'm on my cell phone.

I just wanted to paint a correct picture in my head.
 
That is just the CPU size.

I took the die size of the old 65nm non integrated CPU (135mm squared) then took into account the process shrink from 65nm to 45nm (45 goes 1.4444444444444 times into 65, then took 135 and divided it by 1.4444444444444).

In practice it doesn't scale linearly, though. For example, Xenon was originally about 30% larger at 90nm than at 65nm (176mm^2 vs 135mm^2, according to AnandTech).

I agree that the Wii U CPU can't be very powerful if you look at the die size and approximate transistor count. However, it's still of some importance how this room is spent. Some will argue that Xenon focuses too much on floating-point performance (neglecting integer performance) and that an enhanced Broadway might be more balanced for typical game code, especially when it's relieved by a dedicated DSP for audio.
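The 90nm-to-65nm Xenon figures cited above show how far real shrinks fall short of ideal scaling; a quick check (areas from the post, originally via AnandTech):

```python
# Real vs. ideal area scaling for Xenon's 90nm -> 65nm shrink.
area_90nm = 176.0  # mm^2, from the post
area_65nm = 135.0  # mm^2, from the post

actual_ratio = area_65nm / area_90nm  # ~0.77: only ~23% area reduction
ideal_ratio = (65 / 90) ** 2          # ~0.52: an ideal shrink would halve it

print(round(actual_ratio, 2), round(ideal_ratio, 2))
```

So the real shrink recovered only about half of the theoretical area saving, which is why the simple divide-by-ratio estimate for the 45nm Xenon is a reasonable middle-ground guess rather than an exact figure.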
 

TheD

The Detective
In practice it doesn't scale linearly, though. For example, Xenon was originally about 30% larger at 90nm than at 65nm (176mm^2 vs 135mm^2, according to AnandTech).

I agree that the Wii U CPU can't be very powerful if you look at the die size and approximate transistor count. However, it's still of some importance how this room is spent. Some will argue that Xenon focuses too much on floating-point performance (neglecting integer performance) and that an enhanced Broadway might be more balanced for typical game code, especially when it's relieved by a dedicated DSP for audio.

Not scaling linearly does not mean much for the Wii U CPU (it would mean the 360 CPU is even bigger).

The SIMD units in Xenon are not huge, total die space wise.

Broadway is not a fast CPU at all; it is nearly the same as the CPUs in the Apple G3 Macs.
They did not stand any chance at all vs the P4, let alone a CPU (Xenon) that is 1.5x as powerful.
 

Clockwork

Member
I also want to know what the smaller chip is on the MCM. AnandTech states off-chip memory, but that was just an assumption. They even put a question mark on it.
 

AlStrong

Member
That is just the CPU size.

I took the die size of the old 65nm non integrated CPU (135mm squared) then took into account the process shrink from 65nm to 45nm (45 goes 1.4444444444444 times into 65, then took 135 and divided it by 1.4444444444444, getting 93.4).

Just take the SoC image and go from there. The CPU takes up roughly 40% of the die shot (no memory I/O as well since that was always on the GPU), and we know the die is about 168mm^2.

http://www.hotchips.org/wp-content/uploads/hc_archives/archive22/HC22.23.215-1-Jensen-XBox360.pdf

The XeCPU ends up around 67mm^2 give or take.
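AlStrong's estimate is easy to check: 40% of the ~168 mm² SoC gives roughly 67 mm², about twice the 33 mm² quoted earlier in the thread for the Wii U CPU die:

```python
# Estimating XeCPU area from the 360 slim SoC die shot (figures from the post).
soc_area_mm2 = 168.0  # 45nm 360 SoC die area
cpu_fraction = 0.40   # CPU's rough share of the die shot, per the post

xecpu_area = soc_area_mm2 * cpu_fraction  # ~67 mm^2
wiiu_cpu_area = 33.0                      # mm^2, quoted earlier in the thread

print(round(xecpu_area, 1), round(xecpu_area / wiiu_cpu_area, 1))
```

Both die-shot fractions and quoted areas carry real measurement error, so "around 67 mm², give or take" is the right level of precision here.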
 