
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.

The_Lump

Banned
Thanks for the correction.
Here is a source.
http://**************/2012/09/wii-u-specs-2-gb-memory-75-w-power/

Interestingly, the source says that the Wii U draws 40w at the dashboard and can pull up to 75w. Which doesn't make sense, because during gameplay you hit ~33w at the plug, meaning the console itself (minus the PSU) is pulling around 25w or so...

No sarcasm. As I said above... the PSU is capable of 75 watts.. but is never near that.


Ah ok, gotcha.

Well, I've been assuming the 33w Anandtech quoted was what the console was drawing from the PSU, not at the wall. Hmmm. Pretty useless info then, and I don't know why 33w has been used to work out the TDP all this time.

I'm still not convinced that the output number on the PSU is in fact what the PSU takes in from the plug. I've always assumed that was what it actually outputs (max). Obviously it draws more than that from the wall (i.e. at 80% efficiency it would be drawing ~94w to create that 75w). I mean, think about when you buy a generic replacement PSU: if your device needs 50w, you find one which outputs 50w, not one which says it outputs 100w and then you're left guessing how efficient it is.

I may be wrong though. (and I don't really care either way before I get labelled as an apologist ;) )
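The efficiency arithmetic in that post can be sketched out; the 80% figure is only an assumed efficiency for illustration, since the real number is unknown:

```python
# Wall draw needed to deliver a given output, at an assumed PSU efficiency.
def wall_draw(output_w: float, efficiency: float) -> float:
    """Watts pulled from the wall so the PSU can deliver output_w."""
    return output_w / efficiency

# Delivering the rated 75w at an assumed 80% efficiency means
# pulling roughly 94w from the wall.
print(wall_draw(75, 0.80))  # 93.75
```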
 

The_Lump

Banned
USC is right. It measures 33w at the wall, so you have to grant a few watts for the psu itself, unless I'm missing something. Nothing connects to the Wii U itself without the PSU...

Wii U also has 4 USB slots, so in theory you could connect 4 USB HDDs at once, which is why the psu rating leaves so much breathing room.


I had assumed that's what tech sites like Anandtech are able to do (measure what the console is using) but obviously if that's a measurement from the wall then fair enough.
 
Ah ok, gotcha.

Well, I've been assuming the 33w Anandtech quoted was what the console was drawing from the PSU, not at the wall. Hmmm. Pretty useless info then, and I don't know why 33w has been used to work out the TDP all this time.

I'm still not convinced that the output number on the PSU is in fact what the PSU takes in from the plug. I've always assumed that was what it actually outputs (max). Obviously it draws more than that from the wall (i.e. at 80% efficiency it would be drawing ~94w to create that 75w). I mean, think about when you buy a generic replacement PSU: if your device needs 50w, you find one which outputs 50w, not one which says it outputs 100w and then you're left guessing how efficient it is.

I may be wrong though. (and I don't really care either way before I get labelled as an apologist ;) )
IIRC, the 75w figure takes the efficiency rating into account, as required by law (false-advertising rules). I could be wrong. Punch my face if I am.
 

The_Lump

Banned
IIRC, the 75w figure takes the efficiency rating into account, as required by law (false-advertising rules). I could be wrong. Punch my face if I am.


Haha. Well I was thinking along the same lines. Face punches all round maybe?

Ok, but it still raises the question: what was Iwata on about with his "40w / 75w" comment?
 

jeffers

Member
USB 2.0 is 500mA max at 5v (4.75–5.25v), so say 2.5w for each port (though there's some hocus about the USB controller itself having a limit, I think? Also assuming Nintendo kept to spec), but worst case, say 10w for the 4 USB slots.

The PSU output is 75w, so it should be fine doing that, but I think we're gonna have to go with Nintendo's past-console precedent for power use vs rating. (Unless sometime in the future it suddenly jumps up and we know about it :p)

Anyone fancy checking past consoles' ratings on the console vs the output of the PSU?

edit: so yeah, the GC has the same wattage rating from PSU output to console input
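The worst-case USB budget above is just the spec values multiplied out; a minimal sketch:

```python
# USB 2.0 allows up to 500 mA per port at a nominal 5 V.
USB2_VOLTS = 5.0
USB2_MAX_AMPS = 0.5
PORTS = 4  # the Wii U's four USB slots

per_port_w = USB2_VOLTS * USB2_MAX_AMPS  # 2.5w per port
worst_case_w = per_port_w * PORTS        # 10w if all four ports max out
print(per_port_w, worst_case_w)  # 2.5 10.0
```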
 
It's either 2 Y-cable-connected drives or 4 with their own power supplies... so it shouldn't draw 75w, whether with 4 HDDs or with 2 Y-connected HDDs.

This is true. It will account for some increase in power consumption, but not get it up to 75w. What was mentioned about the Wii's PSU being rated at 52w is also relevant here, but I refrained from making a statement because I am not sure if that applies to what it can convert or what it draws at the wall.
 

Schnozberry

Member
Do we know what type of watt meter was used to test the Wii U? Some are more reliable than others. I have a Fluke 1735 that I was given from work for electronics testing and a Kill-a-Watt tester that I got online for $40, and the fluke measures my Wii U during regular usage at 48w (I have a USB Hard Drive plugged in with a Y Cable and a USB Ethernet Adapter), and the Kill-a-Watt tester measures it at 39w. The Fluke tester runs about $2000 from most dealers.

I guess my point is that judging the Wii U power consumption and the TDP of internal components solely based on analysis found online means that you're assuming there isn't any inaccuracy possible in the measurements.
 

jeffers

Member
Do we know what type of watt meter was used to test the Wii U? Some are more reliable than others. I have a Fluke 1735 that I was given from work for electronics testing and a Kill-a-Watt tester that I got online for $40, and the fluke measures my Wii U during regular usage at 48w (I have a USB Hard Drive plugged in with a Y Cable and a USB Ethernet Adapter), and the Kill-a-Watt tester measures it at 39w. The Fluke tester runs about $2000 from most dealers.

I guess my point is that judging the Wii U power consumption and the TDP of internal components solely based on analysis found online means that you're assuming there isn't any inaccuracy possible in the measurements.

where'd you measure the power? wall->psu or psu->console?
 

The_Lump

Banned
Do we know what type of watt meter was used to test the Wii U? Some are more reliable than others. I have a Fluke 1735 that I was given from work for electronics testing and a Kill-a-Watt tester that I got online for $40, and the fluke measures my Wii U during regular usage at 48w (I have a USB Hard Drive plugged in with a Y Cable and a USB Ethernet Adapter), and the Kill-a-Watt tester measures it at 39w. The Fluke tester runs about $2000 from most dealers.

I guess my point is that judging the Wii U power consumption and the TDP of internal components solely based on analysis found online means that you're assuming there isn't any inaccuracy possible in the measurements.


The plot thickens...

Good info, cheers :)
 
Do we know what type of watt meter was used to test the Wii U? Some are more reliable than others. I have a Fluke 1735 that I was given from work for electronics testing and a Kill-a-Watt tester that I got online for $40, and the fluke measures my Wii U during regular usage at 48w (I have a USB Hard Drive plugged in with a Y Cable and a USB Ethernet Adapter), and the Kill-a-Watt tester measures it at 39w. The Fluke tester runs about $2000 from most dealers.

I guess my point is that judging the Wii U power consumption and the TDP of internal components solely based on analysis found online means that you're assuming there isn't any inaccuracy possible in the measurements.

Nice setup! Maybe try testing the Fluke on something with a known power draw? It seems off compared to other independently run tests, but I suppose they could all be wrong if they are using cheap tools.
 

Schnozberry

Member
where'd you measure the power? wall->psu or psu->console?

I measured both at the wall. I'm actually not certain how I could reliably measure the other end. The clips on the fluke meter are too large for a good connection, and I'm afraid I'll damage the Wii U power supply. The Kill-a-Watt simply can't.
 

Schnozberry

Member
Nice setup! Maybe try testing the Fluke on something with a known power draw? It seems off compared to other independently run tests, but I suppose they could all be wrong if they are using cheap tools.

The fluke measures my PC at 390w. Kill-a-watt says 343w. Also, the Kill-a-watt gives weird readings from my LG 47" HDTV. When powering it on, it fluctuates between 58w-122w, and with the fluke meter it shows a constant power draw of around 92w.

I just tested the Wii U without the Ethernet adapter or my hard drive plugged in. Kill-a-watt says 29w at the main menu. Fluke meter says 35w. When I run Splinter Cell Conviction, the Kill-a-Watt says ~33w, and the fluke says 38w. That would mean my hard drive and Ethernet adapter draw an additional 10w.
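Taking the Fluke as the reference (an assumption; neither meter is known to be right), the Kill-a-Watt's under-reporting in those two readings works out as:

```python
# Percentage by which a reading falls below a reference measurement.
def pct_under(reference_w: float, reading_w: float) -> float:
    return 100.0 * (reference_w - reading_w) / reference_w

# Readings quoted above: menu (Fluke 35w vs Kill-a-Watt 29w),
# in-game (Fluke 38w vs Kill-a-Watt 33w).
print(round(pct_under(35, 29), 1))  # 17.1 -> ~17% under at the menu
print(round(pct_under(38, 33), 1))  # 13.2 -> ~13% under in-game
```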
 

wilsoe2

Neo Member
Do we know what type of watt meter was used to test the Wii U? Some are more reliable than others. I have a Fluke 1735 that I was given from work for electronics testing and a Kill-a-Watt tester that I got online for $40, and the fluke measures my Wii U during regular usage at 48w.

I measured wattage for W101 yesterday with a Kill-a-Watt tester and posted the results here. I didn't realize it could be so inaccurate... I might also experiment with my multimeter tonight and see what happens.
 
The fluke measures my PC at 390w. Kill-a-watt says 343w. Also, the Kill-a-watt gives weird readings from my LG 47" HDTV. When powering it on, it fluctuates between 58w-122w, and with the fluke meter it shows a constant power draw of around 92w.

I just tested the Wii U without the Ethernet adapter or my hard drive plugged in. Kill-a-watt says 29w at the main menu. Fluke meter says 35w. When I run Splinter Cell Conviction, the Kill-a-Watt says ~33w, and the fluke says 38w. That would mean my hard drive and Ethernet adapter draw an additional 10w.

So maybe Iwata's statement that 40 watts was realistic wasn't as far off as we initially thought?
 
If you need a bigger heatsink, that means that your hardware produces more heat and uses more energy. That is a bad thing, not a good thing.

The fact that the Wii U only needs a small heatsink means it is extremely efficient; as many posters have attested when talking about the Wii U's power draw, "it blows cool air". That is great.

Pathetic is needing a monster heatsink like the older 360s if you want to keep them from red ringing. http://www.youtube.com/watch?v=gPhiTYKm3PU This is pathetic.

One thing that Nintendo can brag about is that they have the most heat/energy efficient console of the next gen.

By that logic, the bigger the heatsink, the more pathetic. How he came to the conclusion that having a small heatsink was pathetic is beyond me. Ideally, you would want hardware that doesn't need a heatsink at all. We are far from nano computers, though.

Everyone is generalizing way too much here. Not everything needs to be passively cooled, some things it makes sense to have big beefy heatsinks for and others it doesn't. To call any big or small heatsink pathetic is just jumping the gun.
 

jeffers

Member
Might be worth (not for you) trying to put the Wii U in an enclosed space to make that fan work harder. Probably won't make that much power difference, though.
 

Schnozberry

Member
So maybe Iwata's statement that 40 watts was realistic wasn't as far off as we initially thought?

Possibly. The Kill-a-Watt meter seems to under report everything I test with it. I don't know if I have a shitty one or if that's typical. The Fluke is expensive and calibrated so I tend to trust it.

The weirdest thing about the Kill-a-Watt is that it seems to have serious issues gauging power while products are in standby mode. For instance, the fluke says my TV draws about 0.1w in standby mode. According to the Kill-a-watt, it's drawing 10w! So in that case it seems to over report draw. For a ballpark number the cheap watt meters aren't bad. But until we know what meters were used for the assumptions about the Wii U, I wouldn't take it to the bank that they are completely accurate and reliable.
 

Schnozberry

Member
And that means?

What does TDP stand for?

Sorry, I really dont get this type of discussion, but I find it fascinating to read about.

TDP stands for thermal design power. It is the maximum amount of heat a cooling system is designed to dissipate. The assumption about the Wii U was that the maximum power draw for the system was around 33w without peripherals plugged in. If my tester is correct, it could be closer to 40w. That gives a little more leeway in the discussion about what kind of GPU power can be expected from the hardware.

On the other side of the coin, we don't know exactly what kind of power draw to expect from the components other than the GPU, so it just means more guessing.
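That guessing can at least be made explicit. In this sketch every component figure is a placeholder assumption, not a known spec; the point is only how the GPU budget moves with the total:

```python
# Leftover power budget for the GPU once guessed component draws are removed.
def gpu_budget_w(total_w: float, other_draws_w: dict) -> float:
    return total_w - sum(other_draws_w.values())

# All of these per-component numbers are illustrative guesses, not specs.
others = {"cpu": 5.0, "ram": 2.0, "disc_drive": 3.0, "misc": 3.0}

print(gpu_budget_w(33.0, others))  # 20.0 -> GPU budget under the 33w reading
print(gpu_budget_w(40.0, others))  # 27.0 -> GPU budget if ~40w is right
```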
 
TDP stands for thermal design power. It is the maximum amount of heat a cooling system is designed to dissipate. The assumption about the Wii U was that the maximum power draw for the system was around 33w without peripherals plugged in. If my tester is correct, it could be closer to 40w. That gives a little more leeway in the discussion about what kind of GPU power can be expected from the hardware.

On the other side of the coin, we don't know exactly what kind of power draw to expect from the components other than the GPU, so it just means more guessing.

Okay, thanks for explaining it. So basically, we know less about what it's capable of, and it could be more or less than many estimates suggest?
 

USC-fan

Banned
TDP stands for thermal design power. It is the maximum amount of heat a cooling system is designed to dissipate. The assumption about the Wii U was that the maximum power draw for the system was around 33w without peripherals plugged in. If my tester is correct, it could be closer to 40w. That gives a little more leeway in the discussion about what kind of GPU power can be expected from the hardware.

On the other side of the coin, we don't know exactly what kind of power draw to expect from the components other than the GPU, so it just means more guessing.
Unless you are taking readings on the console side of the PSU, it doesn't change anything. We have always gone with 33 watts since that was the highest reading.

Even at 38w, once you take the PSU out you are at 30-34 watts, depending on the efficiency of the PSU, which at the very high end is 90%.

Also, we have multiple sources of measurement against your single source. You should post some photos to back it up. I haven't seen any number above 34 watts.
 

Schnozberry

Member
Okay, thanks for explaining it. So basically, we know less about what it's capable of, and it could be more or less than many estimates suggest?

I hesitate to draw any conclusions because I just don't know. But I think using the 33w number as a hard limit on system performance is assuming too much.
 
I agree with you that it's a nice detail that the Smash Bros team added to Link's model, but I also think you are undervaluing it a bit. This is a chaotic game that can go up to at least 4 players, will likely have several items that can create a lot of effects on-screen, runs at 60fps, and maybe even 1080p. Considering that you will likely not see Link's model up close during actual gameplay, I can see how some would consider cloth physics in a game like this a nice surprise. In contrast, that Uncharted 3 video had a lot of heavily scripted parts and spent some budget on making the scene very cinematic.

I must say that I really don't understand much of the technical side, so I'm probably off, but is this "cloth physics" really different from what you can admire, for example, in the hair and the sleeve of Nariko in PSBR? Like in this video http://www.youtube.com/watch?v=CaqmiuTieDk ... This too is a chaotic game that can go up to at least 4 players, will likely have several items that can create a lot of effects on-screen, runs at 60fps, and I believe it's not really a high-budget game...
 
Yeah, I'm a bit skeptical. You sure that thing is properly calibrated, Schnoz? No disrespect implied, but it does fly in the face of all other readings from multiple independent channels. That's why I suggested you test it out on something you already know the wattage of.

It could be possible that it's correct, though, which would make more sense of Iwata's figures. But then we need to question the accuracy of the meters used by all other power consumption tests we find online, including those used to get the peak consumption of the Radeon HD5550 we used as a comparison. I doubt that the PCIe device used by TechPowerup on that card was a Fluke-level product...
 

The_Lump

Banned
Unless you are taking readings on the console side of the PSU, it doesn't change anything. We have always gone with 33 watts since that was the highest reading.

Even at 38w, once you take the PSU out you are at 30-34 watts, depending on the efficiency of the PSU, which at the very high end is 90%.

Also, we have multiple sources of measurement against your single source. You should post some photos to back it up. I haven't seen any number above 34 watts.


It changes quite a lot. 60% of 48w is more than 60% of 33w.
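Put numerically, using the 90% efficiency assumed elsewhere in the thread (any fixed efficiency gives the same proportional gap):

```python
# Console-side power implied by a wall reading, at an assumed PSU efficiency.
def console_w(wall_w: float, efficiency: float) -> float:
    return wall_w * efficiency

ASSUMED_EFF = 0.90  # high-end assumption used earlier in the thread

print(round(console_w(33, ASSUMED_EFF), 1))  # 29.7w inside the console
print(round(console_w(48, ASSUMED_EFF), 1))  # 43.2w -> a much roomier budget
```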
 

Schnozberry

Member
Yeah, I'm a bit skeptical. You sure that thing is properly calibrated, Schnoz? No disrespect implied, but it does fly in the face of all other readings from multiple independent channels. That's why I suggested you test it out on something you already know the wattage of.

It could be possible that it's correct, though, which would make more sense of Iwata's figures. But then we need to question the accuracy of the meters used by all other power consumption tests we find online, including those used to get the peak consumption of the Radeon HD5550 we used as a comparison. I doubt that the PCIe device used by TechPowerup on that card was a Fluke-level product...

Can I be 100% certain? No, I guess I can't. LG lists the average power consumption of my TV at 94 watts, and the Fluke Meter says it's drawing around 92. The Kill-a-Watt device says 87.

The problem is I don't know what devices are commonly used for reviews. It's possible they are using a high-end meter that's calibrated right on the nuts. If they use the same meter for every test, then comparing hardware is relatively straightforward, and the accuracy isn't really that important if you're within an acceptable range. The problem is we are assuming that the 33w is dead on, and then attempting to extrapolate that into performance numbers.

Edit: Do we know the efficiency numbers of the Wii U PSU, or is that an educated guess as well?
 
Can I be 100% certain? No, I guess I can't. LG lists the average power consumption of my TV at 94 watts, and the Fluke Meter says it's drawing around 92. The Kill-a-Watt device says 87.

The problem is I don't know what devices are commonly used for reviews. It's possible they are using a high-end meter that's calibrated right on the nuts. If they use the same meter for every test, then comparing hardware is relatively straightforward, and the accuracy isn't really that important if you're within an acceptable range. The problem is we are assuming that the 33w is dead on, and then attempting to extrapolate that into performance numbers.

Edit: Do we know the efficiency numbers of the Wii U PSU, or is that an educated guess as well?

Don't remember the outlets, but I don't think they used equipment as capable as yours; plus, your TV measurement seems spot on (±2w).
 

jeffers

Member
Can you get the guy who lent it to you to calibrate it for you? He, or his department, is bound to have an electronic dummy load to go with it.
 

Schnozberry

Member
Can you get the guy who lent it to you to calibrate it for you? He, or his department, is bound to have an electronic dummy load to go with it.

Yeah, I work at an electric utility, so I'll see if I can get it calibrated and try again. Honestly, I think it was calibrated less than a month ago due to regulatory requirements, so I don't see how it could be off by much.
 
Can I be 100% certain? No, I guess I can't. LG lists the average power consumption of my TV at 94 watts, and the Fluke Meter says it's drawing around 92. The Kill-a-Watt device says 87.

The problem is I don't know what devices are commonly used for reviews. It's possible they are using a high-end meter that's calibrated right on the nuts. If they use the same meter for every test, then comparing hardware is relatively straightforward, and the accuracy isn't really that important if you're within an acceptable range. The problem is we are assuming that the 33w is dead on, and then attempting to extrapolate that into performance numbers.

Edit: Do we know the efficiency numbers of the Wii U PSU, or is that an educated guess as well?

Fair enough. Thanks for your efforts! We don't know the efficiency of the PSU, but we've been giving it the benefit of the doubt and figuring 90%.
 

krizzx

Junior Member
Alright, we got some more Sonic fodder to analyze.

A video and a ton of screens.

http://gamingtrend.com/2013/08/22/sonic-dark-world-screens-and-video/

[screens: frozenfactory_zone4_130801_004.jpg, frozenfactory_zone2_130801_001.jpg]

Some nice alpha usage.
[screens: coop_stealthb_130801_02.jpg, mi1-miiverseitem_barrier_130801_003.jpg]

That drawing distance...
[screens: mi2-miiverseitem_bomb_130801_002a.jpg, mi3-miiverseitem_invincible_130801_002b.jpg]

I can definitely say, beyond a shadow of a doubt, that the Wii U supersedes the PS3/360 in texture quality. This is one area where there is a tremendous leap from the last gen.

The polygon count in Sonic is also clearly a step up from Generations again. I believe we have more than enough evidence at this point to conclude that Latte has far superior polygon-drawing capability. Not just a small bump. Perhaps we should take another look at the dual graphics engine possibility.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Fair enough. Thanks for your efforts! We don't know the efficiency of the PSU, but we've been giving it the benefit of the doubt and figuring 90%.
We really need to know the PSU details to gauge that with any certainty, but a PSU tends to get best efficiency around its middle power range, so your 90% might not be far off the mark for a 75W quality PSU while supplying 40W.
 
Alright, we got some more Sonic fodder to analyze.

A video and a ton of screens.

http://gamingtrend.com/2013/08/22/sonic-dark-world-screens-and-video/



I can definitely say, beyond a shadow of a doubt, that the Wii U supersedes the PS3/360 in texture quality. This is one area where there is a tremendous leap from the last gen.

The polygon count in Sonic is also clearly a step up from Generations again. I believe we have more than enough evidence at this point to conclude that Latte has far superior polygon-drawing capability. Not just a small bump. Perhaps we should take another look at the dual graphics engine possibility.
Those shots look way too clean but the video does make the game look much better than the first time that I saw it.

Also 1080P? Are those screens native resolution?
 
Possibly. The Kill-a-Watt meter seems to under report everything I test with it. I don't know if I have a shitty one or if that's typical. The Fluke is expensive and calibrated so I tend to trust it.

The weirdest thing about the Kill-a-Watt is that it seems to have serious issues gauging power while products are in standby mode. For instance, the fluke says my TV draws about 0.1w in standby mode. According to the Kill-a-watt, it's drawing 10w! So in that case it seems to over report draw. For a ballpark number the cheap watt meters aren't bad. But until we know what meters were used for the assumptions about the Wii U, I wouldn't take it to the bank that they are completely accurate and reliable.


Cheap power meters tend to be unreliable for things like PC PSUs (and PSUs from other electronic devices). You won't have issues reading basically the exact value from the meter while measuring a light bulb, but readings from computers and probably also TVs will be all over the place.


Anyways, for the moment I'd tend to trust websites like anandtech that they actually used the necessary equipment.


Fourth Storm said:
Fair enough. Thanks for your efforts! We don't know the efficiency of the PSU, but we've been giving it the benefit of the doubt and figuring 90%.


We really can't know unless someone tests it with a device like these (worth ~€90k)*: http://www.chromaate.com/product/detial/9704?cid=5398
It might be in the 70, 80, or 90% range (which makes quite a difference, tbh). The only thing we've got is that during gameplay it's operating at or around its point of highest efficiency (however high or low that may be), as it's fairly close to being utilized at ~50%.

This is true for every other console, PC, graphics card, etc. For example, an HD7870 isn't actually maxing out at ~160W (afaik); it's actually drawing something like 140W (max), and the rest is PSU inefficiency.

* This is what computerbase.de uses for their reviews of PC PSUs.
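The HD7870 example in that post can be turned around to get the implied PSU efficiency (both wattages are the post's own rough figures, not measurements):

```python
# Implied PSU efficiency: actual component draw divided by wall draw.
def implied_efficiency(component_w: float, wall_w: float) -> float:
    return component_w / wall_w

# ~140w actual card draw vs ~160w measured at the wall.
print(implied_efficiency(140, 160))  # 0.875 -> about 87.5% efficient
```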
 

JordanN

Banned
The robots look more or less the same as they did in Sonic Generations.

The environments are relatively simple. It's mostly cubes and flat planes. The only interesting thing seems to be the screenshot of Sonic's cloak. It's basically some kind of cubemap.

Those shots look way too clean but the video does make the game look much better than the first time that I saw it.

Also 1080P? Are those screens native resolution?
Unless a developer confirms it or someone pixel counts it, you wouldn't know for sure. High res renders from dev kits exist.
 

krizzx

Junior Member
The robots look more or less the same as they did in Sonic Generations.


The environments are relatively simple. It's mostly cubes and flat planes. The only interesting thing seems to be the screenshot of Sonic's cloak.


Unless a developer confirms it or someone pixel counts it, you wouldn't know for sure. High res renders from dev kits exist.

That screenshot you just posted makes it even clearer how huge a leap in graphics the Wii U is capable of as far as texture quality and effects go.

To top it off, while you are attacking the environment, the screenshot you posted has even more barren backgrounds. No plants casting shadows or anything.

The roundness, the texture resolution, the shading quality, and everything else about the robot have been taken up a few levels.

Those shots look way too clean but the video does make the game look much better than the first time that I saw it.

Also 1080P? Are those screens native resolution?

Whatever the resolution is, that looks to be 60 FPS, and Generations was 720p at 30 FPS on the consoles, if I recall. Not only is it pushing substantially higher-quality effects and textures with more polygons, it's doing it at at least twice the rendering rate.
 
Why doesn't someone just email the folks who had the 33-watt power measurement and ask them what equipment they used? From what we're seeing, it doesn't look like they were using something on the level of the Fluke device, since the readings it gives fall in line with LG's rating for the TV and with what Iwata said about the Wii U. Plus, it's coming from a device that sounds like it was recently calibrated.
 

StevieP

Banned
The roundness, the texture resolution, the shading quality and everything about the robot has been taken to a whole other level.

Actually it's relatively low poly and low resolution textures. With bullshot image quality.

Yes it doesn't matter for a relatively fast paced game, but it's not something to really point out as a huge positive.
 

krizzx

Junior Member
Actually it's relatively low poly and low resolution textures. With bullshot image quality.

Yes it doesn't matter for a relatively fast paced game, but it's not something to really point out as a huge positive.

The screenshots look like they were captured from the video in the link. Is the entire video a bullshot too? What evidence do you have that they are simply bullshots?

Shading looks like phong lighting.

Tf2 had that.
[image: WOiPopH.png]

Please stop. You have jumped far out of bounds and are comparing grapefruits to cucumbers now.

We have the same model from two games in the same series by the same maker. The only difference here is the system strength. It's an apples-to-apples comparison. Making roundabout arguments like that is just grasping at straws in its defense at this point.

The Wii U rendition is clearly a huge leap over the previous one.
 
Cheap power meters tend to be unreliable for things like PC PSUs (and PSUs from other electronic devices). You won't have issues reading basically the exact value from the meter while measuring a light bulb, but readings from computers and probably also TVs will be all over the place.


Anyways, for the moment I'd tend to trust websites like anandtech that they actually used the necessary equipment.





We really can't know unless someone tests it with a device like these (worth ~€90k)*: http://www.chromaate.com/product/detial/9704?cid=5398
It might be in the 70, 80, or 90% range (which makes quite a difference, tbh). The only thing we've got is that during gameplay it's operating at or around its point of highest efficiency (however high or low that may be), as it's fairly close to being utilized at ~50%.

This is true for every other console, PC, graphics card, etc. For example, an HD7870 isn't actually maxing out at ~160W (afaik); it's actually drawing something like 140W (max), and the rest is PSU inefficiency.

* This is what computerbase.de uses for their reviews of PC PSUs.

AnandTech is a home tech-repair site. They're probably using consumer-level devices for things like this.
 

StevieP

Banned
The screenshots look like they were captured from the video in the link. Is the entire video a bullshot too? What evidence do you have that they are simply bullshots?

Perfect anti-aliasing at a 1080p resolution isn't going to happen. Aliasing is sometimes difficult to detect in motion, hence the confusion the videos posed to you.
 

krizzx

Junior Member
Perfect anti-aliasing at a 1080p resolution isn't going to happen. Aliasing is sometimes difficult to detect in motion, hence the confusion the videos posed to you.

How do you know 1080p isn't going to happen? There are many 1080p games on the Wii U already, and more are being announced.

Mario Kart 8 is supposed to be 1080p. Smash Bros. U is supposed to be 1080p. It seems that with the updates to system performance, efficiency, and stability, they are able to do far more than they were at launch.

I have no reason to believe that this game cannot be done in 1080p on the Wii U GPU.

Then, on top of all of that, you discount the fact that there is no discernible aliasing in the trailer so that you can dismiss the screenshots based on lack of aliasing. I think you may be hoping these are bullshots more than the evidence suggests.
 