
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Well you're hive minding a bit there dude. You're generalizing an entire thread and 'demographic' based on one post, without specifically calling out any one bit of information.

You're also transposing a different meaning onto blu's post: You could still "reject the notion that power matters" whilst agreeing with what he said. He didn't even mention the power debate, just that it's refreshing to hear a developer be so open and speak about the technical aspects. That doesn't happen a lot with WiiU.

The stereotypical Nintendo zealot may, in many people's view, preach "gameplay > power" etc, but even then this is a thread about the GPU; so people enjoying hearing experienced, competent developers openly talk about it is to be expected.
I agree. To be fair, though, devs have NDAs to prevent them from talking too much about its tech. Some developers will attempt to work around them, while others will understandably try to avoid letting any valuable info out. For example, look at how careful lherre has been with Wii U info.
That doesn't invalidate those that don't. Every engine has different demands and there are guaranteed to be bottlenecks with Wii U hardware that prevent it from running all engines well. Like every other console.
True, though at this time we are not sure if those issues are from true bottlenecks or are due to unoptimized software tools, lack of experience with Wii U's unique architecture, or budget management.
 
I think Nintendo likes 5-6 year cycles, so yes. I can see PS4/One going till 2022, like I've said before. With their new console I hope they stick with the Power architecture and go with a Power7-based processor (or whatever is current at the time), x86 must die!

Might as well wish for the death of the PC gaming industry then, bub. The other two switched to x86 partly to make ports less dreadful.

That, and IBM doesn't seem interested in contemporizing their low-end line (i.e. creating SoCs or processors that scale better at lower wattages, the latter being why Apple dumped them). Someone argued x86 is an older architecture; x86 gets continuous updates across practically every segment.
 

The_Lump

Banned
I agree. To be fair, though, devs have NDAs to prevent them from talking too much about its tech. Some developers will attempt to work around them, while others will understandably try to avoid letting any valuable info out. For example, look at how careful lherre has been with Wii U info.

Yup, exactly. That's why it's cool when someone in the know openly discusses it on a technical level (regardless of the contents of the discussion) :) Which I think is what blu was saying.
 
Might as well wish for the death of the PC gaming industry then, bub. The other two switched to x86 partly to make ports less dreadful

Or Microsoft could get around to making Windows not so dependent on the x86 architecture. They've made it work on ARM (another architecture that's more efficient than x86). Also, isn't the PC market as a whole shrinking anyway? PC should be following consoles, and not the other way around.
 
Or Microsoft could get around to making Windows not so dependent on the x86 architecture. They've made it work on ARM

MS made Windows work on ARM, yes, but with none of the legacy software (they had to create new ground-up versions of Office, etc.). And it bombed.

(another architecture that's more efficient than x86).

WAS more efficient, and only at lower wattages/performance brackets. (Silvermont, the next Atom, should be an even bigger jump.)

Also, isn't the PC market as a whole shrinking anyway? PC should be following consoles, and not the other way around.

Contracting, and for different reasons (i.e. tablets). And PC naysayers have been a thing for ages. Saying PCs should follow consoles is as asinine as saying Nintendo should go 3rd party. We don't need the multiplatform bar being lowered any further, which is why x86 lending a bit of familiarity is a godsend.
 
I think at this point everybody knows Wii U has more horsepower than 360 and PS3, with the main problem being that the system is selling so poorly that budgets are the one big roadblock.

We can see how this affects visual comparisons, with GTA V simply looking better as a whole than X due to the massive difference in budget. X does in fact seem to have more advanced lighting, and is running at 1080p (right?), so if GTA V was made for the Wii U it would look noticeably better than the 360 and PS3 versions, just like Bayonetta 2 looks better than any Platinum game on the HD twins and runs at 1080p 60fps.

In a way, it's pretty impressive how such a modest but calculated hardware improvement over current gen can produce significant results on screen. What's not nearly as impressive is the price it's being sold at.

And that's before you take into account the DX9 feature sets and compare them to the weird DX10.1/DX11 mashed-potato feature set (confirmed by documentation from Slightly Mad Studios, Shin'en with regards to tessellation, Unity regarding their engine on the Wii U being capable of DX11-equivalent features, and Iwata, who confirmed compute shaders).

Wii U being capable of DX11-equivalent features and compute shaders, when said like that, is disingenuous. Truth is, DX10 hardware has been capable of tessellation (instanced) at least since 2008. Same for compute shaders on DX10 hardware.
For a wider perspective on these things:

ATI Toy Shop Demo - read it as ATI already pushing for GPGPU back in 2005, as they use the GPU for water simulation. Or how the 360 GPU could be used for physics.

It's not about being able to replicate or support certain features, it's about the hardware being designed to make them feasible in real time. AMD and NVIDIA strive to address bottlenecks that prevent some of these features from working in tandem, and to allow developers to actually put them to use.

AMD's solution is the APU HSA design, which is being employed in a more advanced and mature fashion on the Xbox One and especially the PlayStation 4 (GDDR5 became ready earlier than AMD anticipated; ESRAM was the previous short-term solution to hUMA bandwidth limitations).
In short, Wii U is neither architecturally designed nor does it have enough processing power to actually take advantage of many of the features it theoretically supports. At least in the sense that it's able to actually put them to work, together, in a game.
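(For context on what "using the GPU for water simulation" actually looks like: the workload is a grid of cells that can all be updated independently, which is exactly the kind of data-parallel job compute shaders and GPGPU were built for. Below is a minimal, purely illustrative NumPy sketch of that pattern running on the CPU; the grid size, damping factor and function name are made up for the example and aren't taken from the ATI demo or any real engine. On a GPU, each cell's update would simply map to one compute-shader thread.)

```python
import numpy as np

def ripple_step(height, prev_height, damping=0.99):
    """One step of a toy water-heightfield simulation.

    Each cell's new value depends only on its four neighbours and its own
    previous value, so every cell can be updated independently -- the
    embarrassingly parallel pattern a compute shader maps one thread per
    cell onto.
    """
    neighbours = (np.roll(height, 1, axis=0) + np.roll(height, -1, axis=0) +
                  np.roll(height, 1, axis=1) + np.roll(height, -1, axis=1)) / 2.0
    new_height = (neighbours - prev_height) * damping
    return new_height, height  # new state, and current becomes "previous"

# Usage: 256x256 grid with a single disturbance in the middle.
h = np.zeros((256, 256))
h[128, 128] = 1.0
h_prev = np.zeros_like(h)
for _ in range(100):
    h, h_prev = ripple_step(h, h_prev)
```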
 

prag16

Banned
I think at this point everybody knows Wii U has more horsepower than 360 and PS3, with the main problem being that the system is selling so poorly that budgets are the one big roadblock.

We can see how this affects visual comparisons, with GTA V simply looking better as a whole than X due to the massive difference in budget. X does in fact seem to have more advanced lighting, and is running at 1080p (right?), so if GTA V was made for the Wii U it would look noticeably better than the 360 and PS3 versions, just like Bayonetta 2 looks better than any Platinum game on the HD twins and runs at 1080p 60fps.

In a way, it's pretty impressive how such a modest but calculated hardware improvement over current gen can produce significant results on screen. What's not nearly as impressive is the price it's being sold at.



Wii U being capable of DX11-equivalent features and compute shaders, when said like that, is disingenuous. Truth is, DX10 hardware has been capable of tessellation (instanced) at least since 2008. Same for compute shaders on DX10 hardware.
For a wider perspective on these things:

ATI Toy Shop Demo - read it as ATI already pushing for GPGPU back in 2005, as they use the GPU for water simulation. Or how the 360 GPU could be used for physics.

It's not about being able to replicate or support certain features, it's about the hardware being designed to make them feasible in real time. AMD and NVIDIA strive to address bottlenecks that prevent some of these features from working in tandem, and to allow developers to actually put them to use.

AMD's solution is the APU HSA design, which is being employed in a more advanced and mature fashion on the Xbox One and especially the PlayStation 4 (GDDR5 became ready earlier than AMD anticipated; ESRAM was the previous short-term solution to hUMA bandwidth limitations).
In short, Wii U is neither architecturally designed nor does it have enough processing power to actually take advantage of many of the features it theoretically supports. At least in the sense that it's able to actually put them to work, together, in a game.
I don't like some of your commentary, but this is actually a very good and reasonable post. Can't find any significant fault with it; well done.
 

Colonel

Neo Member
Indeed, that's why they're still in business. They don't spend 100 million to profit 10 million. They spend 10 million (if that) to profit 100 million. Hell, Pokemon probably does that twice a generation. The NSMB games probably see even greater margins.

Doesn't Nintendo have a Nintendo 101 where the newly hired are tasked to make an SMB level? If so, then I could see why NSMB has such a greater margin than most games.
 
Have you guys seen the Splinter Cell footage for Wii U? Looks really good. Shadows seem pretty high quality.

http://www.neogaf.com/forum/showthread.php?t=652667

One thing I noticed. Some views, like thermal, are mirrored onto the gamepad.

But when you do something like snake cam, or the tri-copter thingy, you get two different views, but the main screen goes into this "static" view at like...sub 5 fps.

 

prag16

Banned
Have you guys seen the Splinter Cell footage for Wii U? Looks really good. Shadows seem pretty high quality.

http://www.neogaf.com/forum/showthread.php?t=652667

One thing I noticed. Some views, like thermal, are mirrored onto the gamepad.

But when you do something like snake cam, or the tri-copter thingy, you get two different views, but the main screen goes into this "static" view at like...sub 5 fps.

Wonder what the DF faceoff will bring. IGN says frame drops during cutscenes on Wii U (but counteracted by tearing in PS360 versions). Do we have any other info yet?

I think you said earlier (I think it was you) that you'll take frame drops over tearing any day. I agree 100%. There's very little as immersion breaking as the view constantly splitting in pieces whenever the camera moves, even if the framerate is insanely high.
 
Wonder what the DF faceoff will bring. IGN says frame drops during cutscenes on Wii U (but counteracted by tearing in PS360 versions). Do we have any other info yet?

I think you said earlier (I think it was you) that you'll take frame drops over tearing any day. I agree 100%. There's very little as immersion breaking as the view constantly splitting in pieces whenever the camera moves, even if the framerate is insanely high.

Tearing is awful; thankfully it's something rarely seen on the Wii U.
 
Have you guys seen the Splinter Cell footage for Wii U? Looks really good. Shadows seem pretty high quality.

http://www.neogaf.com/forum/showthread.php?t=652667

One thing I noticed. Some views, like thermal, are mirrored onto the gamepad.

But when you do something like snake cam, or the tri-copter thingy, you get two different views, but the main screen goes into this "static" view at like...sub 5 fps.


Unless I am seeing things incorrectly, the AF looks really low
 
Wonder what the DF faceoff will bring. IGN says frame drops during cutscenes on Wii U (but counteracted by tearing in PS360 versions). Do we have any other info yet?

I think you said earlier (I think it was you) that you'll take frame drops over tearing any day. I agree 100%. There's very little as immersion breaking as the view constantly splitting in pieces whenever the camera moves, even if the framerate is insanely high.

Frame drops are less jarring than tearing. I swear tearing makes me want to vomit. Gives me such a headache.

The v-sync that seems to be standard for Wii U games is very nice.
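To put rough numbers on the frame-drops-versus-tearing trade-off being discussed: with v-sync on, a frame that misses the refresh deadline is held until the next vblank, so the effective frame rate falls to a divisor of the refresh rate instead of the image tearing mid-scanout. A small sketch of that arithmetic, assuming a 60 Hz display and simple double buffering; the render times are illustrative.

```python
import math

def effective_fps_with_vsync(render_ms, refresh_hz=60.0):
    """Frame rate when v-sync forces each frame to wait for the next vblank.

    A frame that takes longer than one refresh interval is held until the
    following vblank, so its on-screen time rounds up to a whole number of
    refresh periods (triple buffering and adaptive v-sync ignored).
    """
    refresh_ms = 1000.0 / refresh_hz
    display_ms = math.ceil(render_ms / refresh_ms) * refresh_ms
    return 1000.0 / display_ms

print(effective_fps_with_vsync(15))  # 15 ms fits in one 16.7 ms refresh -> 60 fps
print(effective_fps_with_vsync(20))  # 20 ms misses it -> held a full refresh -> 30 fps
# With v-sync off, that 20 ms frame would display at ~50 fps instead,
# but every frame boundary lands mid-scanout as a visible tear.
```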
 

fred

Member
Yup, it was indeed he, and I concurred. There's nothing that annoys me more than a developer upping IQ so that everything looks great but not enabling v-sync, so it all looks great when the image is static only to turn to complete shite whenever you move the camera. I mean, what's the fucking point ffs..?!?!!?

Grrrrrrrr
 

Log4Girlz

Member
I hope Dave becomes a mod sooner or later.

If they really want an awesome combo make both me and Dave mods. We can fill the role of Ami without resorting to some of the questionable tactics.

If they ever decide to promote an idiot to modhood I hope it's me. I would bring unprecedented stupidity to GAF.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
In short, Wii U is neither architecturally designed nor does it have enough processing power to actually take advantage of many of the features it theoretically supports. At least in the sense that it's able to actually put them to work, together, in a game.
Are you saying that from the position of a person who's tried and failed?
 

krizzx

Junior Member
So now Halo, Gears and Uncharted aren't impressive because they don't run at 60fps?... wow, just wow.

This thread really is unreal. If you post in here and say anything other than 'WiiU games don't look impressive because of small budgets / unfinished dev kits / unfinished dev tools / third parties don't care, etc, etc' you are swarmed by the same four or five people who still care enough to post.

Instead of swarming on people who dare to have a different opinion, why don't you ask yourself why PS360 games run at a lower resolution and framerate on WiiU than on 8-year-old hardware?

Why Wii up-ports like MH3U can only hit 45fps max while running at 1080p?

Why several companies refuse to call the WiiU a next-gen system?

Why several publishers have dropped support because the console's hardware is not powerful enough to support the latest engines (Frostbite 3)?

Why the big winter 3D Mario is a 3DS up-port in all but name?

Why Retro are making a 2D platformer that could have run on Wii instead of an impressive-looking tech showcase for the system?

WiiU was NEVER intended to be anything more than a tiny upgrade to PS360 while adding the tablet controller for 'innovation', IMO, and nothing shown for the console so far has done anything to change my view.

This is such a fallacious post. You are drawing irrelevant conclusions en masse. In most, if not all, of these cases there is already a substantial reason given, and it has nothing to do with whether or not the Wii U is next gen or whether it is stronger.

Monster Hunter Tri has a variable frame rate, and Capcom has stated that they've had trouble developing for the Wii U hardware. It has nothing to do with the system's ability or inability to do such a thing. It has to do with the developers having difficulty getting it up and running properly, like they said. Most of the launch-period ports had issues due to the tools.

It's not "publishers", it's one publisher that publishes for a lot of development companies. DICE, the devs who make the Frostbite engine, Crytek and all of those other devs have their games published by EA, and everyone knows that EA is upset with Nintendo over their refusal to take the Origin deal and is actively blocking the releases.

Crytek said that Crysis was up and running on the Wii U just fine, but that "EA" cancelled it, not Crytek. The makers of the Frostbite engine later went back and said that the reason it's not coming to the Wii U is that there isn't enough interest in the console to make the effort to port the engine. None of these devs have a choice in the matter because EA publishes their games and they can't go against EA's decision.

http://kotaku.com/5988140/wii-u-can-handle-crysis-3-and-almost-did-says-crytek-head
http://mynintendonews.com/2013/07/0...ble-on-wii-u-but-the-console-is-low-priority/
http://www.nintendolife.com/news/20...pers_can_use_unreal_engine_4_for_wii_u_titles


Also, a lot of devs, like Frozenbyte (who said that Trine 2: DC couldn't even run on the 360/PS3 anymore without downgrades), VooFoo, Precursor Entertainment and Shin'en, have all boasted about how much more powerful the Wii U is than most people realize.

http://nintendoenthusiast.com/22684...its-full-potential-has-not-yet-been-realised/
http://www.nintendolife.com/news/20...several_generations_ahead_of_current_consoles
http://www.nintendolife.com/news/20...trine_2_wii_u_eshop_and_working_with_nintendo
http://www.videogamer.com/wiiu/deus...ii_u_directors_cut_enhancements_detailed.html

You see these "relevant" links? That's called substantiating your statement with facts. You are not backing up anything you claim. You are just making broad generalizations with no proof, and drawing conclusions that contradict the facts. In fact, in cases like when you say Frostbite isn't coming to the Wii U due to strength, you are telling a lie, because it was specifically stated that the Wii U was strong enough to run Frostbite 3, the next-gen Frostbite engine, but that there just wasn't enough interest.

Donkey Kong and Mario 3D Land U are being made for the Wii U rather than the Wii because:
1. Nintendo isn't making any more games for the Wii. They stopped over a year ago, before the Wii U was even released.
2. The games won't cost much or take a long time to make.
3. The Wii U is in dire need of more game releases at a faster pace, which leads back to 2.
4. Because they can. They still have other, larger and more expensive games in the works.
You are taking facts and drawing conclusions that don't follow.

How did you even come to the conclusion that two games being based on already-made work means that the entirety of the system is weak? That would entail that the system is capable of nothing else. You are making "MASSIVE" leaps in logic and generalization based on small cherry-picked occurrences that in no way are the result of what you are trying to make them out to be.
 

JordanN

Banned
Are you saying that from the position of a person who's tried and failed?
Anyone remember Wii U's power envelope? I think that's a bigger limiting factor than anything.

Built on the 45(?)nm process, with a really tiny amount of silicon, drawing only 33 watts(?), I think it's reasonable to assume the console won't likely do that DX10.1/11 stuff plus push a high resolution with geometry, physics etc. (how else would the game look good?), as Can Crusher said. Let alone, to a degree, mirror PS4/XBO as some were/are pushing.

Maybe if the console wasn't designed to be a small box with the power draw of a... CD Player... while having to support a screen controller, you wouldn't have to worry about how it could be hamstrung somewhere.

Really tiny, and I'm supposed to believe it's got this 2-3x magic somewhere (or it's going to do intensive 1080p)? That's not Gamecube.

Edit: Woah, I was way off base on the toaster. Good thing I don't use that thing too often. I guess the closest thing would be: CD Player. Wii U's architecture is only good for music.
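The arithmetic both sides of the power-envelope argument are implicitly doing is just power budget × efficiency. A hedged sketch; the GPU's share of the ~33 W and the GFLOPS-per-watt figures are assumptions picked to look plausible for a 40/45 nm-class part, not measured values, so shifting either one shifts the conclusion with it.

```python
def rough_gpu_gflops(total_system_w=33.0, gpu_share=0.5, gflops_per_watt=10.0):
    """Back-of-the-envelope GPU throughput from a power budget.

    total_system_w is the ~33 W wall figure mentioned above; gpu_share and
    gflops_per_watt are assumptions, not measurements of Latte.
    """
    return total_system_w * gpu_share * gflops_per_watt

# An illustrative efficiency range for a 40/45 nm era GPU:
print(rough_gpu_gflops(gflops_per_watt=8.0))   # 132.0 GFLOPS
print(rough_gpu_gflops(gflops_per_watt=12.0))  # 198.0 GFLOPS
```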
 

Schnozberry

Member
Anyone remember Wii U's power envelope? I think that's a bigger limiting factor than anything.

Built on the 45(?)nm process, with a really tiny amount of silicon, drawing only 33 watts(?), I think it's reasonable to assume the console won't likely do that DX10.1/11 stuff plus push a high resolution with geometry, physics etc. (how else would the game look good?), as Can Crusher said. Let alone, to a degree, mirror PS4/XBO as some were/are pushing.

Maybe if the console wasn't designed to be a small box with the power draw of a toaster while having to support a screen controller, you wouldn't have to worry about how it could be hamstrung somewhere.

Really tiny, and I'm supposed to believe it's got this 2-3x magic somewhere? That's not Gamecube.

A toaster consumes 800-1500W of power. If the Wii U were designed around the power draw of a toaster, it would likely be a tri-SLI PC.

I don't think anybody with any attachment to reality is saying Wii U will mirror the performance of PS4/XBO. There are quite a few very competent developers saying that it has some nice feature improvements, and once people exploit them it should give a nice bump in certain areas over the PS3 and Xbox 360. If you're unhappy with that, then good for you, but I think you're going to miss out on some very cool game experiences you won't find anywhere else.

The real limiting factor in getting the best performance out of the Wii U is going to be budgets and the SDK documentation that everyone has said is the drizzling shits.
 
Anyone remember Wii U's power envelope? I think that's a bigger limiting factor than anything.

Built on the 45(?)nm process, with a really tiny amount of silicon, drawing only 33 watts(?), I think it's reasonable to assume the console won't likely do that DX10.1/11 stuff plus push a high resolution with geometry, physics etc. (how else would the game look good?), as Can Crusher said. Let alone, to a degree, mirror PS4/XBO as some were/are pushing.

Maybe if the console wasn't designed to be a small box with the power draw of a... CD Player... while having to support a screen controller, you wouldn't have to worry about how it could be hamstrung somewhere.

Really tiny, and I'm supposed to believe it's got this 2-3x magic somewhere (or it's going to do intensive 1080p)? That's not Gamecube.

Edit: Woah, I was way off base on the toaster. Good thing I don't use that thing too often. I guess the closest thing would be: CD Player. Wii U's architecture is only good for music.

Is that an educated guess or just an assumption based on what you've read about toasters compared to Latte?
 
Are you saying that from the position of a person who's tried and failed?

I'm obviously speculating, but what other conclusion could you arrive at? Unless you believe Nintendo created a miracle design, and other hardware manufacturers are simply run by people who don't have a clue.
 
I'm obviously speculating, but what other conclusion could you arrive at? Unless you believe Nintendo created a miracle design, and other hardware manufacturers are simply run by people who don't have a clue.

I don't think they quite did it this time, but they have been known to do this...
Look at the GameCube compared to the PS2 and Xbox... The bolded pretty much describes that generation from a computational stand-point... other hardware mistakes notwithstanding.
 

Donnie

Member
Anyone remember Wii U's power envelope? I think that's a bigger limiting factor than anything.

Built on the 45(?)nm process, with a really tiny amount of silicon, drawing only 33 watts(?), I think it's reasonable to assume the console won't likely do that DX10.1/11 stuff plus push a high resolution with geometry, physics etc. (how else would the game look good?), as Can Crusher said. Let alone, to a degree, mirror PS4/XBO as some were/are pushing.

Maybe if the console wasn't designed to be a small box with the power draw of a... CD Player... while having to support a screen controller, you wouldn't have to worry about how it could be hamstrung somewhere.

Really tiny, and I'm supposed to believe it's got this 2-3x magic somewhere (or it's going to do intensive 1080p)? That's not Gamecube.

Edit: Woah, I was way off base on the toaster. Good thing I don't use that thing too often. I guess the closest thing would be: CD Player. Wii U's architecture is only good for music.

2-3x what? Also, tiny compared to what? WiiU's GPU clearly has significantly more transistors than 360's GPU, for instance. Also, it's incorrect to suggest that newer features (DX10.1/11 etc.) would necessarily require more performance to use. A lot of the features in these new APIs are not just about making something better looking but also about doing something faster and more efficiently.
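One concrete example of the "faster, not just prettier" point: hardware instancing, a DX10-era feature, lets a single draw call submit many copies of a mesh, slashing CPU submission cost without changing anything on screen. A back-of-the-envelope sketch; the per-draw-call cost here is a made-up illustrative figure, not a measurement of any real driver.

```python
def submission_cost_ms(num_objects, per_draw_call_us=50.0, instanced=False):
    """Rough CPU cost of submitting a scene's draw calls.

    per_draw_call_us is purely illustrative; the point is only that
    instancing collapses num_objects draw calls into a single one.
    """
    draw_calls = 1 if instanced else num_objects
    return draw_calls * per_draw_call_us / 1000.0

# 5,000 trees/rocks/grass clumps:
print(submission_cost_ms(5000))                  # 250.0 ms -- hopeless per frame
print(submission_cost_ms(5000, instanced=True))  # 0.05 ms
```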
 

Donnie

Member
I'm obviously speculating, but what other conclusion could you arrive at? Unless you believe Nintendo created a miracle design, and other hardware manufacturers are simply run by people who don't have a clue.

You could not come to a conclusion unless it has some kind of basis in fact. What features are you referring to, and what basis do you have for believing them to be unusable on WiiU given what we know of its performance?
 

spisho

Neo Member
I don't think they quite did it this time, but they have been known to do this...
Look at the GameCube compared to the PS2 and Xbox... The bolded pretty much describes that generation from a computational stand-point... other hardware mistakes notwithstanding.
There was nothing miraculous about the Gamecube. It came out at pretty much the same time as the Xbox and was clearly worse. The PS2 came out a year and a half before it and was debatably better.
 

JordanN

Banned
1. 2-3x what? 2. Also tiny compared to what? 3. WiiU's GPU clearly has significantly more transistors than 360's GPU, for instance. 4. Also, it's incorrect to suggest that newer features (DX10.1/11 etc.) would necessarily require more performance to use. A lot of the features in these new APIs are not just about making something better looking but also about doing something faster and more efficiently.
1. PS3/360. And before you ask why, it was someone here who suggested it (Megabytecr?). Not bothered to look it up though.
2. I'm actually commenting on its size. It's smaller than a fingernail, making it really tiny for an HD console (and perhaps for non-mobile electronics in general).
3. It does? How many?
4. That's nice.
 
2-3x what? Also, tiny compared to what? WiiU's GPU clearly has significantly more transistors than 360's GPU, for instance. Also, it's incorrect to suggest that newer features (DX10.1/11 etc.) would necessarily require more performance to use. A lot of the features in these new APIs are not just about making something better looking but also about doing something faster and more efficiently.

Saying that it's better than the PS3/360 is an easy copout. Given that those two machines are on the way out, that's basically a must.

I think the real travesty is that the Wii U won't be any easier to develop for than the other two machines, if not harder, while being less powerful.
 
1. PS3/360. And before you ask why, it was someone here who suggested it (Megabytecr?). Not bothered to look it up though.
2. I'm actually commenting on its size. It's smaller than a fingernail, making it really tiny for an HD console (and perhaps for non-mobile electronics in general).
3. It does? How many?
4. That's nice.

Power seems to have more to do with operating frequency and transistor count than anything else.
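The rule of thumb behind that: CMOS dynamic power scales roughly as activity × switched capacitance (which grows with how many transistors are toggling) × voltage squared × clock frequency, and higher clocks usually demand higher voltage on top. A hedged sketch of the relation; every number below is illustrative rather than a measurement of any real chip.

```python
def dynamic_power_w(switched_capacitance_nf, voltage_v, freq_mhz, activity=0.15):
    """Approximate CMOS dynamic power: P ~ a * C * V^2 * f.

    switched_capacitance_nf stands in for "how much silicon is toggling"
    (roughly proportional to active transistor count). All inputs are
    illustrative, not figures for Latte or any other real chip.
    """
    return activity * (switched_capacitance_nf * 1e-9) * voltage_v**2 * (freq_mhz * 1e6)

baseline = dynamic_power_w(switched_capacitance_nf=100, voltage_v=1.0, freq_mhz=550)
# Raising the clock usually also needs more voltage, so power grows
# faster than linearly with frequency:
faster = dynamic_power_w(switched_capacitance_nf=100, voltage_v=1.15, freq_mhz=700)
print(round(baseline, 1), round(faster, 1))  # ~8.2 W vs ~13.9 W
```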
 

Frodo

Member
Have you guys seen the Splinter Cell footage for Wii U? Looks really good. Shadows seem pretty high quality.

http://www.neogaf.com/forum/showthread.php?t=652667

One thing I noticed. Some views, like thermal, are mirrored onto the gamepad.

But when you do something like snake cam, or the tri-copter thingy, you get two different views, but the main screen goes into this "static" view at like...sub 5 fps.



I think it is pretty clear that the main reason they only show this almost-static image on the TV screen is to avoid having to animate Sam using the snake cam or whatever other device he is using at the moment (since you don't have this feature in any other version), and knowing Ubisoft, they probably just put this screen in there to save money and time. If you are trying to use this example to say the Wii U cannot render two different 3D environments at the same time, there are BLOPS2 and even ZombiU and Nintendo Land and whatnot out there to prove otherwise.
 

USC-fan

Banned
Anyone remember Wii U's power envelope? I think that's a bigger limiting factor than anything.

Built on the 45(?)nm process, with a really tiny amount of silicon, drawing only 33 watts(?), I think it's reasonable to assume the console won't likely do that DX10.1/11 stuff plus push a high resolution with geometry, physics etc. (how else would the game look good?), as Can Crusher said. Let alone, to a degree, mirror PS4/XBO as some were/are pushing.

Maybe if the console wasn't designed to be a small box with the power draw of a... CD Player... while having to support a screen controller, you wouldn't have to worry about how it could be hamstrung somewhere.

Really tiny, and I'm supposed to believe it's got this 2-3x magic somewhere (or it's going to do intensive 1080p)? That's not Gamecube.

Edit: Woah, I was way off base on the toaster. Good thing I don't use that thing too often. I guess the closest thing would be: CD Player. Wii U's architecture is only good for music.
Fun fact: the CPU in the PS4/Xbone uses more power than the entire Wii U.
 
Oh, it's not shrinking, it's contracting.
Hm.
Could someone do me a favor and explain how shrinking is not contracting, and/or vice-versa?

Guess I should have been more clear. It's not shrinking in the "danger of disappearing" sense, just losing traction. And it's to mobile, something that's eaten away at everything else as well. Are the console and handheld markets also collapsing?
 

USC-fan

Banned
No it doesn't. 8 Jaguar cores at 1.6 GHz is going to use about 20W.

Yes it does. The 4-core 1.6 GHz part uses 15 watts. 8 cores would be at 30 watts, without RAM or anything else.

http://www.semiconductorstore.com/cart/pc/viewPrd.asp?idproduct=48536

Fun Fact: In this day and age, that means very little when it comes to actual on screen performance for games.
It has everything to do with it... Laws of physics and all.

This thread has gone off the deep end. Please rename to WUST 3.0: Death Throes edition
 

strata8

Member
Yes it does. The 4-core 1.6 GHz part uses 15 watts. 8 cores would be at 30 watts, without RAM or anything else.

http://www.semiconductorstore.com/cart/pc/viewPrd.asp?idproduct=48536

TDP =/= power consumption. Anandtech measured the consumption of the A4-5000 (4 x 1.5 GHz) and found that the entire laptop - which includes the idling GPU, RAM, HDD, etc - used 11.5W when loading the CPU:
I also suspect the 15W TDP is perhaps a bit conservative, total platform power consumption with all CPU cores firing never exceeded 12W (meaning SoC power consumption is far lower, likely sub-10W).

Unless increasing the CPU speed by 100 MHz/6% increases power usage by 50% (unheard of), there's no way 8 cores are going to hit 30W in normal scenarios.
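Working just from the numbers already in the thread: take the ~11.5 W whole-laptop measurement for the 4-core 1.5 GHz A4-5000, assume a few watts of that belong to the idling GPU, RAM, screen and storage rather than the CPU (that split is an assumption), and scale linearly to 8 cores at 1.6 GHz. Even that generous scaling lands nowhere near 30 W:

```python
def scaled_cpu_power_w(measured_platform_w=11.5, non_cpu_overhead_w=3.0,
                       cores_from=4, cores_to=8,
                       freq_from_ghz=1.5, freq_to_ghz=1.6):
    """Naive upper-bound scaling of a measured CPU load figure.

    measured_platform_w is the whole-laptop number quoted above; the
    non_cpu_overhead_w split is an assumption. Scaling linearly with core
    count and clock overstates the increase if voltage stays the same.
    """
    cpu_w = measured_platform_w - non_cpu_overhead_w
    return cpu_w * (cores_to / cores_from) * (freq_to_ghz / freq_from_ghz)

print(round(scaled_cpu_power_w(), 1))  # ~18.1 W for 8 cores at 1.6 GHz
```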
 

USC-fan

Banned
TDP =/= power consumption. Anandtech measured the consumption of the A4-5000 (4 x 1.5 GHz) and found that the entire laptop - which includes the idling GPU, RAM, HDD, etc - used 11.5W when loading the CPU:


Unless increasing the CPU speed by 100 MHz/6% increases power usage by 50% (unheard of), there's no way 8 cores are going to hit 30W in normal scenarios.

People always jump to binned mobile parts to prove these crazy low power numbers. Perfect for a WUST thread.

That part [APU] is 15W TDP, and yet the part that I quoted [CPU] at 1.6 GHz with no GPU is also 15W TDP... hmmmmm. G-Series SoC GX416RA, no GPU, 15W.
 
I think it is pretty clear that the main reason they only show this almost-static image on the TV screen is to avoid having to animate Sam using the snake cam or whatever other device he is using at the moment (since you don't have this feature in any other version), and knowing Ubisoft, they probably just put this screen in there to save money and time. If you are trying to use this example to say the Wii U cannot render two different 3D environments at the same time, there are BLOPS2 and even ZombiU and Nintendo Land and whatnot out there to prove otherwise.

The graphics on the other games you mention look a lot worse. The shaders in Splinter Cell look a lot more advanced than what devs have recently been doing on the Wii U. It's a really clean-looking (high-IQ) game as well.
 

Raist

Banned
I don't think they quite did it this time, but they have been known to do this...
Look at the GameCube compared to the PS2 and Xbox... The bolded pretty much describes that generation from a computational stand-point... other hardware mistakes notwithstanding.

Power consumption was roughly equivalent for these consoles.
 

strata8

Member
People always jump to binned mobile parts to prove these crazy low power numbers. Perfect for a WUST thread.

That part [APU] is 15W TDP, and yet the part that I quoted [CPU] at 1.6 GHz with no GPU is also 15W TDP... hmmmmm. G-Series SoC GX416RA, no GPU, 15W.

Binned mobile part? There's no such thing as a non-mobile Jaguar part, as it's inherently designed for 4-25W TDPs. The A4 series isn't exactly a premium chip either.

Try and find some actual power consumption numbers for that 15W CPU.
 