
Wii U Community Thread


10k

Banned
Yeah, can we please stop with the "it's double the flops inside a console because it's a closed box" nonsense?

576 GFLOPS is unacceptable to me in 2012, especially if Iwata is truthful in his statement that the graphical differences between the three consoles won't be huge. I'd be okay with 768 as a minimum and 1000 as the standard next gen for Nintendo.

Basically what we've heard is the Wii U has an impressive GPU (not bleeding edge, but it's great for what it does), fast RAM and lots of it, and a mediocre CPU.
 

alfolla

Neo Member
Yeah, can we please stop with the "it's double the flops inside a console because it's a closed box" nonsense?

576 GFLOPS is unacceptable to me in 2012, especially if Iwata is truthful in his statement that the graphical differences between the three consoles won't be huge. I'd be okay with 768 as a minimum and 1000 as the standard next gen for Nintendo.

Basically what we've heard is the Wii U has an impressive GPU (not bleeding edge, but it's great for what it does), fast RAM and lots of it, and a mediocre CPU.

Whatever it is.
If it can give me (avatar quote) on screen, I'm OK.
 

jerd

Member
Yeah, can we please stop with the "it's double the flops inside a console because it's a closed box" nonsense?

576 GFLOPS is unacceptable to me in 2012, especially if Iwata is truthful in his statement that the graphical differences between the three consoles won't be huge. I'd be okay with 768 as a minimum and 1000 as the standard next gen for Nintendo.

Basically what we've heard is the Wii U has an impressive GPU (not bleeding edge, but it's great for what it does), fast RAM and lots of it, and a mediocre CPU.

I would love to agree with you, but I think ~576 is what we can expect, and maybe that only in a best-case scenario. What we are looking at is a console with a tablet controller that Nintendo is looking to sell at a profit, and that we expect to see for around $300. I expect it to be noticeably more powerful than this gen, but it won't blow the PS360 out of the water.
 

StevieP

Banned
That is definitely not the "minimum"; that is close to the maximum. Again, people should not worry so much about GFLOPS. It's not an equal measurement between GPUs. A GPU with 2x the FLOPS of another could still deliver 4x the performance or more when running games.

So these numbers really don't matter...

Unless it's in an Xbox, then it matters, because it is 2x the PC flops. Or whatever it is that you said yesterday.

Yeah, can we please stop with the "it's double the flops inside a console because it's a closed box" nonsense?

576 GFLOPS is unacceptable to me in 2012, especially if Iwata is truthful in his statement that the graphical differences between the three consoles won't be huge. I'd be okay with 768 as a minimum and 1000 as the standard next gen for Nintendo.

Basically what we've heard is the Wii U has an impressive GPU (not bleeding edge, but it's great for what it does), fast RAM and lots of it, and a mediocre CPU.

600 GFLOPS is basically the highest you can expect out of the Wii U at this point; I wouldn't set yourself up for disappointment. It is a lower-powered/lower-clocked GPU with a modern feature set, however.
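For what it's worth, those paper GFLOPS figures are just ALU count x clock x FLOPs per cycle, which is exactly why they don't compare across architectures. A rough Python sketch using the commonly cited specs (nothing insider here):

Code:
# Peak GFLOPS is pure arithmetic and says nothing about real-world efficiency.
def peak_gflops(alu_lanes, clock_mhz, flops_per_lane_per_cycle=2):
    # counting a fused multiply-add as 2 FLOPs, the usual marketing convention
    return alu_lanes * clock_mhz * 1e6 * flops_per_lane_per_cycle / 1e9

# Xenos (Xbox 360): 48 shader units x 5 ALUs = 240 lanes at 500 MHz
print(peak_gflops(240, 500))   # 240.0
# Radeon HD 4850: 800 stream processors at 625 MHz
print(peak_gflops(800, 625))   # 1000.0

Same formula for both chips, wildly different real-world behavior per FLOP; that's the whole point.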
 

USC-fan

Banned
Unless it's in an Xbox, then it matters, because it is 2x the PC flops. Or whatever it is that you said yesterday.
I was talking about console hardware vs PC hardware, using the PS4 as an example when someone was saying these games could never run on a console because they are running on high-end PC hardware.

Since people don't like what I have to say, why don't you explain to people in this thread why worrying about GFLOPS is pointless? Or maybe BG can do it?
 
Another crazy idea:

Imagine two players each strapping a tablet onto their chest (the game would obviously have to come with some sort of vest to make it safe) and then trying to "hit" each other by firing off light pulses with the Wiimote that get registered by the sensor bar on the tablet.
 

StevieP

Banned
I was talking about console hardware vs PC hardware, using the PS4 as an example when someone was saying these games could never run on a console because they are running on high-end PC hardware.

Since people don't like what I have to say, why don't you explain to people in this thread why worrying about GFLOPS is pointless? Or maybe BG can do it?

Wii U performing like a 1TF PC GPU confirmed. UE4 "getting interesting", here we come.
 

BlackJace

Member
Man, I didn't know such a big part of GAF had their heads so high in the clouds when it comes to next-gen expectations.

It's really like some of them want to be disappointed.
 
Another crazy idea:

Imagine two players each strapping a tablet onto their chest (the game would obviously have to come with some sort of vest to make it safe) and then trying to "hit" each other by firing off light pulses with the Wiimote that get registered by the sensor bar on the tablet.
Haha, that's great. :)
 

StevieP

Banned
Man, I didn't know such a big part of GAF had their heads so high in the clouds when it comes to next-gen expectations.

It's really like some of them want to be disappointed.

It happens every generation, tbh.

Everyone who has pie-in-the-sky expectations about hardware grunt leaves disappointed (especially those who only own Nintendo consoles the last couple rounds lol).

The CG at next year's conferences is going to make it unbearable to slog through their associated threads.
 
So the immediate stupid question that pops up in my head is: what score would the Xbox 360 GPU get?

I actually googled it but did not really find anything concrete.

What does that mean exactly?

We probably won't find the answer, but the discussion on the E6760 a couple of days ago does say a few things about what we might see as the Wii U GPU, and maybe even about the CPU being weak on paper but, thanks to advanced architecture and efficiency, still part of a fairly strong package. Remember, this is all just speculation on my part about what is possible for the Wii U on a 75 watt power supply, based on the information we have around, and is not indicative of anything final.
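To put that 75 watt figure in context, here is a purely illustrative budget. Every component number below is an assumption for the sake of argument, not a leak:

Code:
# Purely illustrative power budget for a hypothetical 75 W console PSU.
psu_rating_w = 75
sustained_fraction = 0.85   # assume the box is designed to draw well under the rating
board_budget_w = psu_rating_w * sustained_fraction   # ~64 W sustained

budget_w = {
    "GPU": 30,   # the ~30 W GPU speculated later in this post
    "CPU": 10,   # the ~10 W CPU speculated later in this post
    "RAM/eDRAM": 6,
    "optical drive, I/O, GamePad radio, fans": 12,
}
print(board_budget_w, sum(budget_w.values()))   # 63.75 vs 58: a little headroom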


Let's look at the current gen console GPUs and their PC equivalents at the time.

http://www.technobuffalo.com/gaming/console-wars-round-5-xbox-360-vs-ps3-gpu/

The G70, which is the 7800GTX, was around 200 GFLOPS, and the R520 (Radeon X1800XT) performed similarly.

The Xenos is 240 GFLOPS but is comparable to the X1800XT and the 7800GTX, and actually performs better at lower resolutions because of its more advanced architecture, which would be used later on in the HD2XXX series. The 7800GTX is what was used as the basis for the RSX.

That more advanced architecture relative to the X1800XT equivalent is what made the Xenos a better GPU than the RSX (though Cell makes up for it). Even though the Xenos is a little slower in raw speed than the X1800XT, the newer architecture actually made it perform better in real-world gaming applications.

What does this say about the Wii U dev kit and the 55nm HD4850, if it was actually in there? A ~1 TFLOP card released in January 2009 might have been used in the devkit as a buddy card to simulate the same raw speed as the Wii U's GPU. However, the difference from the real GPU is that the architecture will be a lot more advanced, probably closer to Southern Islands in features. I wouldn't hesitate to say that it might even be more advanced in features, having had an extra year of development.

The HD4850 is around 4.2x the power of Xenos in raw numbers, and even more efficient due to the more advanced architecture. You could say that it is something closer to 5x the performance of Xenos in real-world applications. In real terms, it could run the same game at 1920x1080 at 60fps with slightly higher detail, compared to the Xbox 360 running it at 720p at 30fps. From far away you might not see the difference, especially in still pictures.
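That 1080p60 vs 720p30 comparison checks out in raw pixel throughput, by the way:

Code:
# Back-of-the-envelope pixel throughput: resolution x framerate.
hd4850_rate = 1920 * 1080 * 60   # 1080p at 60fps
xenos_rate  = 1280 * 720 * 30    # 720p at 30fps
print(hd4850_rate / xenos_rate)  # 4.5x the pixels per second

Which sits neatly between the 4.2x raw figure and the ~5x real-world guess.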

The early Wii U devkits may have had a base card that was 5x Xenos on paper; the final Wii U GPU, with a much better architecture, is said to be replacing it in the devkits sometime in the next few weeks. How does this tie in with the E6760, a card released over a year ago in 2011? The E6760 on a 40nm process may only be 576 GFLOPS, but it handily beats a Radeon 4850, and at 35 watts. It has been said that the Wii U GPU is about ~600 GFLOPS, and if that is anything to go by, an architecture a year more advanced than the E6760's might bring real-world GPU performance well beyond what the HD4850, or even the E6760, can output: around 6x Xenos.

If this is anything to go by, a Wii U GPU with 720 GFLOPS at 30 watts on a 32nm process may actually be possible, and may fit within a 75 watt power supply limit. A Wii U GPU with 720 GFLOPS and 2012 architecture is not just 3x Xenos; it may well be more than double that. This does not mean the Wii U will be 7.5x Xenos, as the Wii U CPU is reportedly the weak part of the equation, so a lot of the processing will be offloaded to the GPU. If the Wii U CPU is only ~10 watts and loosely based on IBM's 476FP processor, I wouldn't think it would be much more powerful than the current Cell processor in the PS3: at best twice the processing power. What this means for real-world gaming would be maybe 3.5x-4.5x the power of the Xbox 360, and with around 4x the RAM compared to the Xbox 360 it would make sense.
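Stringing those numbers together looks like this; the architecture and CPU factors are pure guesswork on my part, only the GFLOPS ratio comes from the figures above:

Code:
xenos_gflops = 240
wiiu_gflops = 720                        # the speculated 32nm GPU
raw_ratio = wiiu_gflops / xenos_gflops   # 3.0x on paper

arch_factor = 2.0            # "more than double that": assumed efficiency gain
gpu_ratio = raw_ratio * arch_factor      # ~6x in GPU-limited scenes

cpu_drag = 0.6               # assumed penalty for the reportedly weak CPU
print(gpu_ratio * cpu_drag)              # ~3.6x, inside the 3.5x-4.5x range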

What does 3.5x-4.5x Xbox 360 power get you? You can probably get 720p at 60fps, with one tablet at 576p at 60fps, all with 4xAA, slightly more effects, and much higher quality textures thanks to the RAM. Does this make it catch up to the PS4 and the next Xbox? HELL NO, but the good thing is that the architecture will allow the Wii U to get multiplatform titles without the complete overhaul of development that the Wii required. Another bonus on the Orbis and Durango side of things is that these advanced architectures will benefit the next-gen Sony and MS systems too: 1.8 TFLOPS with 2013 architecture is going to murder the PS3 and the Xbox 360 as a whole, even if the CPUs in these new machines follow a similar route to the Wii U's, thanks to things like GPGPU capabilities.


An HD6570-class GPU optimised for a console and on 32nm will be good enough for next-gen gaming at 720p/30fps/2xAA, or optimised current-gen gaming (720p/60fps/4xAA plus the tablet at 576p/60fps/4xAA).

I am, however, expecting 720p/60fps/4xAA and more effects on the PS4 and the next Xbox, or 3-4x more performance. Basically an HD7850 card optimised for consoles, which would perform something along the lines of a 7870 on PC. I also expect these consoles to be $399 when launched.

Is the Wii U a half-gen leap in graphics compared to the leap from PS2 to PS3?

Probably a little less than half by PS3 standards, but we have to remember that the PS3 is probably a generation and a half ahead due to its price point and the money it lost at release. If the Wii had actually been 8x the GameCube, it would probably have released at $299 instead of $249. The Wii U may be going back to its GameCube roots in terms of power increase, in that it will be a proper two-generation jump from the GameCube if you count the years since release, or possibly 64 GameCubes duct-taped together. Still, this might only be 1/3 of the eventual power of Orbis and Durango when all is said and done.

Some caveats: what I have said assumes the Wii U is on a 32nm process. At 40nm things change, not dramatically, but expect at least 10% less performance compared to 32nm. We also do not know whether the 4850 was actually used in the devkits, or whether the Wii U dev kit now has a ~600 GFLOP GPU in there. The only thing we know is that development of the GPU started in 2009 and that it is not 2005 tech like some people here keep saying. It is interesting, though, how three years of development let a 35W TDP card at 40nm beat a 110W card at 55nm in performance; at 32nm and with extra development, who knows what we can get. It certainly won't be 1.5x Xenos as some might claim, that's for sure. Also, a ~600 GFLOP GPU in a console will not perform the same as a ~1200 GFLOP GPU in a PC, so those hoping for a miracle might be disappointed, but architecture advancements do help, and it showed with Xenos compared to the X1800XT.

Here is a question for anyone in the know: is final silicon for the Wii U GPU in any devkit yet, or is that yet to happen? If not, when is it going to happen?
 

BlackJace

Member
It happens every generation, tbh.

Everyone who has pie-in-the-sky expectations leaves disappointed (especially those who only own Nintendo consoles the last couple rounds lol).

The CG at next year's conferences is going to make it unbearable to slog through their associated threads.

If I can resist, I'll be taking a vacation away from GAF for a while after that.

You mean like a lot of people's expectations for the Xbox 720's power?

The HD Twins, yes.
 

StevieP

Banned
If I can resist, I'll be taking a GAF-cation for a while.

If anyone wants to dig out the Madden trailer thread or the Killzone 2 trailer thread, they'd probably be pretty entertaining reads.

is final silicon for the Wii U GPU in any devkit yet, or is that yet to happen?

The latest dev kit iteration is said to have final silicon.

I just don't want a Wii-like situation to happen again.

Power-wise, the gulf won't be anywhere close to this generation's. In other words, this system isn't using 1999 parts that make everything completely un-portable anymore. BG has an easy way to quantify it ("running a PC game on low vs running a PC game on very high"). Third-party support wise? Well... it'll be slightly better, I guess. Not by much, by the sounds of it.
 

alfolla

Neo Member
600 GFLOPS is basically the highest you can expect out of the Wii U at this point; I wouldn't set yourself up for disappointment. It is a lower-powered/lower-clocked GPU with a modern feature set, however.

I'm not worried about the general performance of the Wii U.
Truth be told, all these numbers and multipliers are confusing.

I just don't want a Wii-like situation to happen again.
 
I was talking about console hardware vs PC hardware, using the PS4 as an example when someone was saying these games could never run on a console because they are running on high-end PC hardware.

Since people don't like what I have to say, why don't you explain to people in this thread why worrying about GFLOPS is pointless? Or maybe BG can do it?

Seriously mate, no needle intended, but I don't see the point, because there's no give or middle ground with you; either your opinion is the only one that counts, or it's hit the road, Jack. 'Winning the argument' takes precedence over sharing and evolving ideas... can't see the point in it, just another 10-page headfuck waiting to happen.
 
If I'm reading it right, an interesting patent application was published today (filed Nov. 2, 2011) that describes a delay measurement method used to synchronize what's being displayed on the TV and the Upad.

Abstract
An example game apparatus generates and outputs a predetermined test image to a television. A terminal device has its image pickup section acquire a pickup image of a screen of the television, and transmits the pickup image acquired by the image pickup section to the game apparatus. The game apparatus determines whether or not the pickup image includes the test image. When the pickup image is determined to include the test image, an image delay time is calculated on the basis of the time of the determination, the time of the output of the test image by the game apparatus, and a processing time between the acquisition of the pickup image and the determination. The game apparatus uses the image delay time to achieve synchronization between the terminal device and the television and also between image display and sound output of the television.

Not surprisingly, Nintendo had to come up with a way to fight TV-processing-induced delays:

[0004] In the case of the game systems using televisions as display devices, there might be delays in displaying game images. Specifically, for the purpose of, for example, enhancement of the quality of images, recent digital televisions subject input images to various types of video processing, and display video-processed images. The video processing is generally time-consuming, and therefore, there might be a delay between the game apparatus outputting a game image to the television and the television displaying the game image. Accordingly, game systems using televisions as display devices have a problem of poor response of game displays to game operations.
[0005] Therefore, the present specification discloses a delay measurement system and method for measuring a delay in outputting an image or sound to a display device such as a television. The present specification also discloses a game apparatus, an information processing apparatus, a storage medium having a game program or information processing program stored therein, a game system, and an image display method which solve or reduce any problem to be caused by the delay as mentioned above.
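Reading the abstract literally, the calculation boils down to timestamp subtraction. A sketch of my reading, with my own variable names, not Nintendo's actual code:

Code:
def image_delay_time(t_output, t_determination, processing_time):
    # t_output:        when the console sent the test image to the TV
    # t_determination: when the console decided the camera frame shows the test image
    # processing_time: time spent between grabbing the camera frame and deciding
    return (t_determination - t_output) - processing_time

# e.g. test image sent at t=0 ms, match decided at t=120 ms, 35 ms of capture/analysis:
print(image_delay_time(0.0, 120.0, 35.0))   # 85.0 ms of TV-side display lag

The console could then delay the Upad's stream (and the TV's audio) by that amount to keep everything in sync.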

Not sure if old and/or mentioned in past patent documents.
 
If I'm reading it right, an interesting patent application was published today (filed Nov. 2, 2011) that describes a delay measurement method used to synchronize what's being displayed on the TV and the Upad.

Abstract


Not surprisingly, Nintendo had to come up with a way to fight TV-processing-induced delays:



Can't tell if old.

I actually pondered something like this after hearing the delay reports. Do you think it would have a hit on resources? Having to 'store' the image momentarily!?
 

D-e-f-

Banned
And that site got it from us; we discovered and posted this first.

de Blob would be interesting but I heavily doubt it's that. It's impossible to predict because they could be hired by many different publishers for porting.

Go N actually got it from notenoughshaders.com, not from GAF.

I hope people don't think that de Blob is an AAA title.

I don't know where you're going with this, but AAA as a hex color (#AAA000) is a dirt brownish/gold color, which you could sort of make in de Blob, so it has that! :p

Random question:

I don't own a PS3 or 360, so sorry if this is a stupid question, but can you email a PS3/360 owner and have them receive it on their system like you can with the Wii?

Nope, though Xbox Live lets you send messages via their xbox.com website if you have your own gamertag. So the Wii trumps all with this superbly (almost completely) useless feature :D
 
I actually pondered something like this after hearing the delay reports. Do you think it would have a hit on resources? Having to 'store' the image momentarily!?

I'm no expert, but I don't think a function so integral to the Wii U "experience" (TV+Upad) would put a major strain on the system.

Aren't DSPs (or whatever the dedicated compress/decompress chip is called) there for exactly this reason?

(Someone wiser should really answer that though)
 

RedSwirl

Junior Member
How much do we actually expect to be feasible for a $300 2012 console that will be sold at a profit? How much are we expecting Microsoft and Sony to charge?
 

jerd

Member
How much do we actually expect to be feasible for a $300 2012 console that will be sold at a profit? How much are we expecting Microsoft and Sony to charge?

This is what I've been thinking. On a somewhat related note, does anybody know how much it currently costs Sony to produce a PS3? Or Microsoft a 360? I can only find production costs from 2009-2010, and even those aren't from reliable sources.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
If I'm reading it right, an interesting patent application was published today (filed Nov. 2, 2011) that describes a delay measurement method used to synchronize what's being displayed on the TV and the Upad.

Abstract


Not surprisingly, Nintendo had to come up with a way to fight TV-processing-induced delays:



Not sure if old and/or mentioned in past patent documents.
Nice find there. And indeed, the possibility of something like that was already speculated about on these boards.
 

Sadist

Member
So there are rumours going around about a possible Final Fantasy XIII-3.

Might be just me, but if I were S-E, a trilogy box for the Wii U wouldn't be a bad idea.
 
Another crazy idea:

Imagine two players each strapping a tablet onto their chest (the game would obviously have to come with some sort of vest to make it safe) and then trying to "hit" each other by firing off light pulses with the Wiimote that get registered by the sensor bar on the tablet.

Think of all the money they could make from all those vest accessories!
 
The latest dev kit iteration is said to have final silicon.

Cool, and was that in the kits during E3? If the retail units are even 20% faster than the current dev kits (~600 GFLOPS), then we are in for a treat. It might only mean single-digit increases in frames per second, but it might just be enough to run next-gen games at 720p/30fps with very minimal downgrades, i.e. no Wii situation in a technical sense. In a political sense, well, that's a matter for the publishers and developers to hurdle through.
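Assuming frame rate scaled linearly with GFLOPS, which it never quite does, that 20% works out like this:

Code:
devkit_fps = 30
retail_speedup = 1.2                 # "20% faster than current dev kits"
print(devkit_fps * retail_speedup)   # 36.0 fps: a single-digit gain, as said above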



The only way to fix this now is to supply new keys for the rest of the units going forward, which might mean the first million Wii U units have the ability to be region-free if hacked. Probably only the day-one units in Japan; I doubt they have manufactured enough boards to make a million units yet. It will probably be a non-issue since it will be nipped in the bud early.
 
It's fake rubbish, disregard it.


Such an informative post. What is this "it" you are talking about, and why is it "fake"? Who are you talking to? It is as if people are on 15 cents per post and they really want to buy that Happy Meal.

Edit:

Carry on. My bad for not seeing the post made before mine; it must have been made after I hit reply.
 

jerd

Member
So I don't see this posted anywhere around GAF, but I'm also not sure if it comes from a banned source. Apparently the president of Camelot said that he "would love to make a sequel" to DK64. Obviously this is far from anything concrete, and considering there is no link to an interview, grains of salt and whatnot. But seriously, how incredible would HD 3D Donkey Kong be? Personally, I have some great memories of DK64.
 
So I don't see this posted anywhere around GAF, but I'm also not sure if it comes from a banned source. Apparently the president of Camelot said that he "would love to make a sequel" to DK64. Obviously this is far from anything concrete, and considering there is no link to an interview, grains of salt and whatnot. But seriously, how incredible would HD 3D Donkey Kong be? Personally, I have some great memories of DK64.

http://m.neogaf.com/showthread.php?t=481231&highlight=camelot

Already a thread about it. Above.
 

RedSwirl

Junior Member
This is what I've been thinking. On a somewhat related note, does anybody know how much it currently costs Sony to produce a PS3? Or Microsoft a 360? I can only find production costs from 2009-2010, and even those aren't from reliable sources.

More than that: I don't think anybody should have ever expected Nintendo to try to sell a $400 or $500 beast of a console.
 

JordanN

Banned
Curiously, there appears to be more heated debate and discussion going on about the Wii U's guts than there was for the 3DS, Vita, or any console in recent memory.
With the 3DS, DMP announced the GPU when the handheld was first announced, so speculation was kept to a minimum. The Vita had its CPU/GPU and memory revealed early as well, so there was less room for talk.

The Wii U has been nothing but vague. The closest information we got was IBM revealing a custom POWER7 CPU, but apparently not a lot of people care about CPUs. AMD has been completely hush on the GPU, although there was that Marc Diana article confirming the R700 nature of the GPU, but nothing more specific. It wasn't widely publicized either, and it doesn't help that soon a thousand websites started writing whatever they believe the console's power to be.

It's kind of sad, actually, and it even angers me that companies try to hide this information, when the real damage isn't done to those who hate the Wii U; it harms those who actually care about the system.

The Wii U is using DDR3?
Nintendo's recent choices of RAM favor speed/low latency over actual quantity, so I'm 99% sure it's GDDR5 or better, unless, as I stated before, the eDRAM would make up the difference.
 
How can it be fake if the word SPECULATION is stated in the title?

Because everything is FAKE.


I honestly have no idea what you folks are talking about; the last time I posted was the last time I visited.
I can change, just give me a chance!
 

MDX

Member
Because everything is FAKE.

barnorama-35.gif
 

StevieP

Banned
Nintendo's recent choices of RAM favor speed/low latency over actual quantity, so I'm 99% sure it's GDDR5 or better, unless, as I stated before, the eDRAM would make up the difference.

DDR3 + eDRAM makes a lot of sense, doesn't it?

Especially if you're talking about a quantity larger than 1 or 1.5GB and a system meant to be close to profitable around the $300 mark.
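The trade-off in rough numbers; the bus widths and data rates here are illustrative picks, not rumoured specs:

Code:
def peak_bandwidth_gbs(bus_bits, data_rate_mtps):
    # bytes per transfer x transfers per second, expressed in GB/s
    return bus_bits / 8 * data_rate_mtps * 1e6 / 1e9

print(peak_bandwidth_gbs(64, 1600))    # DDR3-1600, 64-bit bus: 12.8 GB/s, cheap per GB
print(peak_bandwidth_gbs(128, 4000))   # 4 Gbps GDDR5, 128-bit bus: 64.0 GB/s, pricey
# On-die eDRAM can soak up the bandwidth-hungry framebuffer traffic,
# letting the slow-but-cheap DDR3 hold the bulk of the memory pool.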
 