
Wii U Community Thread


z0m3le

Banned
I just completely lost track of what you're trying to argue/what your point is


Your presumptions hold less ground than the combination of guesswork and actual knowledge bg is basing his claims on, yet it's perfectly fine for you to state your "factual" claims while any other information presented to you has to be argued over. You're trying to be critical of something, but I can't quite tell what you're critical of.

It's like a cyclical argument without the rehashes... so I suppose that would make it a spiral... a spiral going in the direction of gravity...


I mean, I would love to see these "last leaked dev docs", which I am unable to locate

http://www.neogaf.com/forum/showpost.php?p=38549553&postcount=1 I'm guessing these... they could point to any modern GPU, although it does say R700; but it's obviously custom if it supports DX11.
 
I mean, I would love to see these "last leaked dev docs", which I am unable to locate

The only thing I can think of is the spec leak for the early target specs, which is old and didn't give much of an indication of anything other than saying the base features come from the R700 line.

EDIT: Which z0m3le linked to.

How much wood would a Woodchuck Chuck if a Woodchuck could Chuck Brown.

More than it can chuck wood if it could?

I get back from my banning and the Wii U speculation thread is so much emptier. :(

I have a really random question for you guys too: Does anyone know for sure if I'll be able to use the existing sensor bar that I've been using with my Wii? It's been stuck to my TV with those included double-sided tape pieces for 2 years. I'm not sure I'll be able to take it off cleanly. :lol:

The port is the same for Wii U, I believe, so I'd assume you're good to go. And yeah, the move to Community was WUST's "downfall".
 

Terrell

Member
And people have the nerve to say that I disengage from hardware babble because I'm a Nintendo fanboy... no, it's the example left on this page of the thread that keeps me disengaged from hardware babble. I'd rather be goalie in a game of lawn darts, actually.
 
http://www.neogaf.com/forum/showpost.php?p=38549553&postcount=1 I'm guessing these... they could point to any modern GPU, although it does say R700; but it's obviously custom if it supports DX11.

Heh, I didn't notice before that the leak bluntly stated that the GPU was modeled after the R700 series:

GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.
 

nordique

Member
The only thing I can think of is the spec leak for the early target specs, which is old and didn't give much of an indication of anything other than saying the base features come from the R700 line.

EDIT: Which z0m3le linked to.


Oh, the post that says "credit to bgassasin" in caps at the top?

Thus indicating something you would be fully aware of when you make Wii U power assertions?



In that case, I don't think that works in his favour, bg :p
 

JordanN

Banned
Hey, has this been discussed before?

Panorama view looks like the successor to the Garden Demo?
[screenshot comparison: Panorama View / Garden demo]



Also, on Nintendo's website they don't mention the developer. I wonder why...
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Why is anybody assuming the Wii U's GPU is DX anything? Because it has compute programming and apparently tessellation support? Neither of these guarantees the chip is DX11 anything.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
Even the Kotaku "power problem" article described the Wii U as a DX11 machine.
 

nordique

Member
Heh, I didn't notice before that the leak bluntly stated that the GPU was modeled after the R700 series:

GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.


Being modelled after an r7xx still does not make it equal to an r7xx in feature set. It's like saying a Lexus SUV is modelled after a Toyota Sienna family van... or vice versa. It could be stripped down or souped up in a myriad of different ways, and could have certain specifications. Is it really hard to believe Nintendo would want to design a chip that can take PS360 ports, but is featured enough to take next-gen ports and support the most popular engines in the future, AND is cheap enough for Nintendo to put into what they deem a mass-market console?


Plus, bg did state that it is based on an older kit. It is possible the final Wii U GPU was not in such a kit; the final Wii U GPU is what will be in the console.

I'm not trying to defend it based on "potentially" being a super powerful card all of a sudden; no, that would be folly. But I am of the belief that such a card is highly customized, brand new, not off-the-shelf, and thus simply cannot be compared to traditional "rxxx" cards as-is.

We still do not know anything, in complete technical terms, about the final Wii U GPU.

That is the bottom line. It might be more featured than any of us realize, even if it is not the most powerful card out there.

To me, if people are willing to suggest they understand how weak the Wii U CPU and the Wii U's power are relative to current-gen HD systems, based on a bunch of articles that themselves vary in their claims, then it is perfectly reasonable to expect a highly customized, fully featured card capable of handling modern games even if it is not the most powerful card around. The former is easier to accept than the latter, because the latter involves listening to people who have a formulated and researched idea on it, and these people don't have the luxury of IGN credentials or whatever behind them.

As blu stated (I dunno if it was this thread or another), [paraphrased] not all developers and game designers are created equal.

Some will do the cut & paste job, have connections with a journalist from IGN or some other large gaming site, port Mr PS360 game to the Wii U, let their connection know the CPU is weaksauce and lacking compared to the PS360, and the journalist will simply report on what their source tells them.

But then take, for instance (a hypothetical counter-example), Mr Retro Studios developer, who has (perhaps) the same Wii U kit as cut&paste boy and discovers he has to code uniquely and specifically for that console. He comes away impressed with the hardware, pleased it is capable enough to craft his vision, surprised at how difficult it actually was to develop for, yet rewarded given how he was able to push the system... but unfortunately, he does not have a connection to call his own to specifically break down specs or explain what he discovered and how it is actually a new technique.

Little, simple things like that can lead to some of these discussions we find ourselves faced with on boards like this.



In the end, the cake is a lie.
 

ADANIEL1960

Neo Member
Nah, he's Austrian.
/trollololol

From the Emerald city no doubt

DX9/11
I can never understand why everybody keeps referring to an MS graphics API on a Nintendo platform.

CPU
Where did the talk come from that said the CPU was weak?
On what basis is it supposed to be weak... GHz??? If so, then that's likely irrelevant.
I thought Gearbox had said the CPU was great.

CPU / GPU out of balance - this does not sound like Nintendo. They have always tried to make the most balanced systems.
 
And people have the nerve to say that I disengage from hardware babble because I'm a Nintendo fanboy... no, it's the example left on this page of the thread that keeps me disengaged from hardware babble. I'd rather be goalie in a game of lawn darts, actually.

The debating makes it fun though. But then again I enjoy debating.

Hey, has this been discussed before?

It better cost as much as it did to play the Garden demo.

Plus, bg did state that it is based on an older kit. It is possible the final Wii U GPU was not in such a kit; the final Wii U GPU is what will be in the console.

No, I said they were old. They are from the early target specs, but they are well over a year old now, so some of that may have been subject to change up till now, though it was still so vague that bringing up possible change doesn't mean much. But like you mentioned and I mentioned before, only the base features are what the specs mentioned. That doesn't make it a true R700 GPU, because those GPUs suck at tessellation and are not truly capable of modern DX11-level features. If Wii U's GPU does well with those things then we definitely know it's not a "straight-up" R700. Ports from DX9-level hardware, a port from a DX7-level console, and some first-party launch fillers aren't going to be a legitimate indication of that.
 

z0m3le

Banned
Why is anybody assuming the Wii U's GPU is DX anything? Because it has compute programming and apparently tessellation support? Neither of these guarantees the chip is DX11 anything.

I don't think any of us are saying the GPU is going to use DX11, but AMD builds their GPUs to those specifications. If developers are telling us that the GPU is feature-rich and at least matches DX11 in feature set, then it's obviously not just an R700 chip; it has been customized to exceed the specifications of that DX10.1 series. This also means the GPGPU performance of Wii U's GPU has likely changed. Someone here touched on one reason the R700 series did so poorly at GPGPU work, and that was memory bandwidth, which by all accounts is handled very differently on Wii U (32MB of cache).

All we know about Wii U's GPU is that it is not simply an R700, in large part because devs have told us that it is a feature-rich 2012 GPU (not to be confused with a powerhouse).
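(For anyone curious why bandwidth is the usual GPGPU choke point, here's a rough, roofline-style illustration in Python. The numbers are made up for the example, not Wii U or R700 specs, but the arithmetic is the standard way to estimate whether a streaming kernel is compute-bound or bandwidth-bound.)

```python
# Toy roofline-style estimate: is a streaming GPGPU kernel limited by raw
# compute or by memory bandwidth?  Illustrative numbers only, NOT real specs.

peak_flops     = 500e9   # hypothetical peak: 500 GFLOPS
peak_bandwidth = 20e9    # hypothetical external memory bandwidth: 20 GB/s

# Example kernel: c[i] = a[i] * x + b[i]
# That's 2 flops per element and 3 floats (12 bytes) of traffic per element
# if nothing is kept on-chip.
flops_per_element = 2
bytes_per_element = 12
arithmetic_intensity = flops_per_element / bytes_per_element  # flops per byte

achievable = min(peak_flops, arithmetic_intensity * peak_bandwidth)
print(f"achievable: {achievable / 1e9:.1f} GFLOPS of a "
      f"{peak_flops / 1e9:.0f} GFLOPS peak")
# With these numbers the kernel tops out around 3 GFLOPS: bandwidth-bound.
# A big on-chip pool (like the rumored 32MB) helps precisely because it cuts
# the bytes that have to come from main memory for data that gets reused.
```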
 

OryoN

Member
I think people are overestimating the performance gap Xbox3 and PS4 will have over Wii U.
At the same time, they are underestimating how much that gap closes (visually) when you have an HD-capable GPU with modern features.

Just compare current consoles to what's happening with mobile gaming recently, on platforms with roughly the same FLOPS rating as Wii, but with more modern GPUs. The visual gap, while still massive, seems significantly smaller than the theoretical performance gap would lead one to conclude.

For all the hype and attention this matter has been receiving, we're talking about a mere 3-3.6x performance gap over Wii U (assuming Wii U's GPU @ ~500 GFLOPS, Xbox3 @ 1.5 TFLOPS, and PS4 @ 1.8 TFLOPS). That's not even in the same spectrum as the ~20x gap current consoles have over Wii - which still managed to wow me with certain games (SMG, MP3:C!).

As stated, that 3-3.6x gap (let's call it 3-4x) assumes Wii U GPU performance of 500 GFLOPS, a lowballed figure just to be safe. If it turns out to be a bit more powerful, that gap only gets smaller.

We would also have to consider how having an efficient & balanced system plays a part in closing this gap. Admittedly, it's entirely possible that this could even widen the gap a bit more. I don't know how much effort MS or Sony devoted to this area, but they had better do a much better job than they did this gen. Only time will tell, but it's already known that this is an area Nintendo pays special attention to. If the other two makers drop the ball here, expect that gap to close even more in real-world situations.

One other reason I described this 3-4x gap as something "mere" is the fact that targeting 1080p alone is going to eat up a significant portion of that performance margin. Yet, some people around here expect 1080p & 60fps out of next-gen games on those consoles. They are usually the same suspects who believe the Wii U will be too vastly underpowered to compete at all. Imagine if every multiplatform game on Wii U targets 720p/30fps, but 1080p/60fps on the other consoles. Where's the performance gap then? Our 4x metric just got gobbled up the instant the game loads!
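For what it's worth, a quick back-of-the-envelope check of those figures (Python, using the same assumed numbers as above: ~500 GFLOPS for Wii U, 1.5/1.8 TFLOPS for the others, and 720p/30 vs 1080p/60):

```python
# Sanity check of the gap figures above. All inputs are the post's
# assumptions (rumored/guessed numbers), not confirmed specs.

wiiu_flops  = 500e9    # assumed ~500 GFLOPS
xbox3_flops = 1.5e12   # rumored 1.5 TFLOPS
ps4_flops   = 1.8e12   # rumored 1.8 TFLOPS

print(f"Xbox3 vs Wii U: {xbox3_flops / wiiu_flops:.1f}x")  # 3.0x
print(f"PS4   vs Wii U: {ps4_flops / wiiu_flops:.1f}x")    # 3.6x

# How much of that gap a resolution + framerate bump alone would swallow:
rate_720p30  = 1280 * 720  * 30   # pixels per second at 720p/30fps
rate_1080p60 = 1920 * 1080 * 60   # pixels per second at 1080p/60fps
print(f"1080p/60 vs 720p/30 pixel throughput: {rate_1080p60 / rate_720p30:.2f}x")  # 4.50x
```

So on pixel throughput alone, 1080p/60 vs 720p/30 is a 4.5x difference, which is roughly the entire raw-FLOPS gap under those assumptions.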

A barely noticeable resolution bump and slightly smoother gameplay aren't going to convince the masses that those consoles are so much more powerful than Wii U. Yet, that is exactly what you could technically do with 4x the Wii U's raw GPU performance. If processing power is going to be their sales pitch next gen, then I'm more concerned for Sony and MS than I am for Nintendo, because 3-4x only goes so far these days. Which leads to my next point.

I personally believe Microsoft and Sony will have a harder time than people think demonstrating that their consoles are AS powerful as these target specs suggest. IMO, that job would be more effectively done if they targeted 720p and pushed a ton of environmental detail, onscreen characters and animations all at once.

Things like that are WAY more noticeable than simply bumping the resolution from 720p to 1080p. At that point, if devs want Wii U to approach parity with stuff like that, games may have to be rendered at sub-HD resolution. They won't want to drop too many pixels, however, so they'll have no choice but to scale back onscreen detail here and there as well. In cases like that, the gap will be noticeable, but even then... bearable.

Given what has been discussed thus far, expecting Wii U's situation to be so much worse than this, or so much better than this... to me, doesn't seem like a realistic outlook. But we'll see...

(sh!t...sorry for the length. Since I don't post very often, it seems I always have a lot to say when I finally do)
 

Log4Girlz

Member
Hey, has this been discussed before?

Panorama view looks like the successor to the Garden Demo?
[screenshot comparison: Panorama View / Garden demo]



Also, on Nintendo's website they don't mention the developer. I wonder why...

That screen is beautiful. I wonder what the odds are that the exact same screen is used in a future portable.
 

z0m3le

Banned
Great, grounded post.
Thanks for the post; I wish we had more level-headed people posting like this, it would make explaining the differences so much easier. We really won't see a generational gap between Wii U and the other consoles; it will likely be smaller than PS2 to Xbox (the PS2, btw, is 4 times more powerful than the Dreamcast, so anyone bunching those two together when comparing to the Xbox should really rethink what that box could do).
 
Is it kinda probable that the CPU in the Wii U has been underclocked due to heat issues?
It feels like such a tremendous waste from Nintendo, all because they didn't want to make their console slightly larger.
 

MDX

Member
I personally believe Microsoft and Sony will have a harder time than people think demonstrating that their consoles are AS powerful as these target specs suggest. IMO, that job would be more effectively done if they targeted 720p and pushed a ton of environmental detail, onscreen characters and animations all at once.


I agree, and I think people are overestimating how many current PS3/360 owners are interested in investing in a new console. Every year we are seeing improvements visually in games on these older systems. But Wii owners have a reason to upgrade to HD.
 

D-e-f-

Banned
I get back from my banning and the Wii U speculation thread is so much emptier. :(

I have a really random question for you guys too: Does anyone know for sure if I'll be able to use the existing sensor bar that I've been using with my Wii? It's been stuck to my TV with those included double-sided tape pieces for 2 years. I'm not sure I'll be able to take it off cleanly. :lol:

I see no reason why you wouldn't. Pictures of the Wii U's backside show a regular Wii Sensor Bar port (the red thing). Besides, since you can use ANY light source (like candles) as a sensor bar, there's literally 0 reason for them to change anything about it.

I dug up this pic from earlier in this thread:
[photo of the Wii U's rear ports]
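(For anyone wondering why literally any two IR light sources work: the bar is completely passive, the Wiimote's camera just tracks two bright points, and the cursor comes from where those points sit in the camera's view. A toy model in Python, with simplified made-up coordinates, just to illustrate the idea:)

```python
# Toy model of Wii pointing: the "sensor bar" only provides two IR points;
# the Wiimote's camera reports their positions and the console maps the
# midpoint (flipped, since the camera moves opposite to the cursor) onto
# the screen.  Camera resolution and coordinates here are illustrative.

CAM_W, CAM_H = 1024, 768          # IR camera coordinate space
SCREEN_W, SCREEN_H = 1920, 1080   # TV resolution, just for the example

def cursor_from_ir(p1, p2):
    """p1, p2: (x, y) positions of the two tracked IR blobs in camera space."""
    mid_x = (p1[0] + p2[0]) / 2
    mid_y = (p1[1] + p2[1]) / 2
    # Invert x: when the remote points right, the blobs drift left in the camera.
    x = (1 - mid_x / CAM_W) * SCREEN_W
    y = (mid_y / CAM_H) * SCREEN_H
    return x, y

# Two candles on top of the TV would produce blobs just like the bar's LEDs:
print(cursor_from_ir((480, 350), (544, 350)))
```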


PS: Hi! Longtime lurker (been reading WUSTs since February), first time poster. :)
 

HylianTom

Banned
I get back from my banning and the Wii U speculation thread is so much emptier. :(
No worries... it'll get closer and closer to its old craziness as we approach Launch Day. The allure of price/preorder/launch festivities is just too powerful, too intoxicating for many. There are, no doubt, good times ahead. The E3 hangover may also be contributing to the more relaxed pace. Well, that and the fact that the last summer of a fading Nintendo console's lifespan is usually a fantastic time for folks to wrap up any unfinished gaming/backlog/replay business.

And seeing so many new peeps is really encouraging - welcome aboard, ladies and gents! In some respects, one could say that the industry will be going through some "interesting times" soon, so it's good to see new faces for the approaching events.. :)
 

D-e-f-

Banned
So...is it a real time render? Or CG?
Or just a video?

The way they've shown it (from what I've seen), it looked like straight-up video to me. But what kind of irritates me is that you can apparently switch from day to night just like with the Zelda HD Experience from last year's E3. This either means the Zelda HD thing was secretly a video after all, or that those Panorama View things are real-time CG that lets you manipulate lighting conditions.
 

HylianTom

Banned
I tried putting bratwurst in my gumbo last fall instead of sausage, along with chicken, shrimp, and ham. It worked out surprisingly well!
 

10k

Banned
Hmm. I know the difference will be noticeable, especially for those looking for the differences. But the idea of a well-put-together game on 360 v2.0 hardware being considered ugly or just "not good enough" is something I find comical, reeking of e-peen measuring contests.

In this context, I appreciate Nintendo's sense of self-preservation. They'll get to that "nit-on-an-asshair realtime rendering" level of hardware eventually. :)
After buying a PS3 last summer and finishing games like Uncharted 3, Heavy Rain, Final Fantasy XIII, and God of War III, I came to the realization that those games are beautiful, and if the Wii U is supposed to be more powerful than the PS3, then I will be happy with the Wii U's graphics. It's gameplay first for me, but I think those beautiful PS3 games with a little more AA and a better framerate will be enough for me next gen. Will that be enough power for ports though? BG says yes, as it's a matter of effort from developers, not power. In which case I think the Wii U will be powerful enough.
 

alfolla

Neo Member
The way they've shown it (from what I've seen), it looked like straight-up video to me. But what kind of irritates me is that you can apparently switch from day to night just like with the Zelda HD Experience from last year's E3. This either means the Zelda HD thing was secretly a video after all, or that those Panorama View things are real-time CG that lets you manipulate lighting conditions.

If you can switch from day to night lighting, it means that there is something "calculated" - the lighting itself, at least.

Am I wrong?
 

D-e-f-

Banned
If you can switch from day to night lighting, it means that there is something "calculated" - the lighting itself, at least.

Am I wrong?

No idea, I'm just randomly speculating since I have no idea how this actually works. I gotta look at the old videos again to see if there was some kind of transition or a straight up immediate switch.

I wonder why they never released full-rez video of the showfloor versions for the Bird and Zelda HD demos. (Well I guess they probably didn't want to show any video without the controller being visible...)
 

Hakai

Member
I'm keeping my expectations low. I think if I keep focusing on power being important, I will be missing the point, so I'm not stressing over this.

Yeah I'm expecting beautiful first-party games, but nothing out of this world, though I hope they will be artistically gorgeous.

The same goes for third-party support: if it comes, nice, but it's not something I will expect, though I will keep an eye on the reasons they give for not porting games to the Wii U.

So... huh... yeah, I buy Nintendo machines basically to play Nintendo games. If they can offer more, even better; I always hope for that, but in the last few generations that goal has seemed far, far away. So I'm expecting nothing, and anything that shows up will be a surprise =D
 

MDX

Member
The way they've shown it (from what I've seen), it looked like straight-up video to me. But what kind of irritates me is that you can apparently switch from day to night just like with the Zelda HD Experience from last year's E3. This either means the Zelda HD thing was secretly a video after all, or that those Panorama View things are real-time CG that lets you manipulate lighting conditions.


Some Panoramas are a mix of real video with CG character elements.
For example, the hang-gliding footage is real, but they added fake birds to it.

The night/day thing in the Panoramas is real footage of night and day.
For example, in the London video, they filmed during the day and during the night.
Then they synced the locations as exactly as possible to where the bus was when each was filmed. So when you toggle between day and night, all you are doing is switching videos.
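(If it really works the way described, the toggle is trivial to implement: keep two position-synced streams and just switch which one you read from, carrying the current position over. A Python sketch, with hypothetical file names, to illustrate the idea:)

```python
# Sketch of a day/night toggle over two pre-synced videos, as described
# above: both clips cover the same route, so switching is just a matter of
# continuing playback from the same position in the other stream.
# The class and file names are hypothetical.

class SyncedPanorama:
    def __init__(self, day_frames, night_frames):
        # Both lists are assumed to be aligned frame-for-frame along the route.
        self.streams = {"day": day_frames, "night": night_frames}
        self.mode = "day"
        self.frame_index = 0

    def toggle(self):
        # No seeking or recomputation needed: same index, other stream.
        self.mode = "night" if self.mode == "day" else "day"

    def next_frame(self):
        stream = self.streams[self.mode]
        frame = stream[self.frame_index % len(stream)]
        self.frame_index += 1
        return frame

pano = SyncedPanorama(day_frames=["day_0001.jpg", "day_0002.jpg"],
                      night_frames=["night_0001.jpg", "night_0002.jpg"])
print(pano.next_frame())   # day_0001.jpg
pano.toggle()
print(pano.next_frame())   # night_0002.jpg
```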

The Zelda day/night toggle was just to show off that the WiiU will have some impressive lighting technology available for developers. We are seeing some of this being exploited in ZombiU.
 

Donnie

Member
They consider the Wii U close to that DX9-performance/DX11-capabilities combo

That is the quote you are basing this all on? DX10 is the combo?

what does that mean?

DirectX doesn't define performance, it defines features. The "DX9 performance" comment is probably just a really clumsy way of saying that in their opinion the performance of the GPU isn't enough to be considered a class above Xenos/RSX (which were DX9-level GPUs). But considering there are DX11 PC GPUs much weaker than Xenos/RSX (there's one with only 80 GFLOPS AFAIK), it's really a poor attempt at labelling performance.

The only really interesting piece of info there, IMO, is the DX11 capabilities comment, which is clearly saying that its feature set is beyond that of the DX10.1 R700 series of GPUs.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
On topic: I think the BT CD label on the prototype means "boot from cd" and WL is wired link. Just guessing.
My guess is:

"BT CD" - Blue Tooth and "CD" (whatever that pad streaming protocol is called internally)
"WL" - wired link.

Basically, I think it's a wired/radio switch for the pad.
 
FYI, Nintendo's Annual Meeting of Shareholders is tonight at 9PM EDT/6PM PDT. I wouldn't expect any major news, but we might get a couple interesting tidbits out of the Q&A.

Note that this is not to be confused with Nintendo's Q1 earnings release and subsequent investor briefing, which is still a month away.
 

Rösti

Unconfirmed Member
FYI, Nintendo's Annual Meeting of Shareholders is tonight at 9PM EDT/6PM PDT. I wouldn't expect any major news, but we might get a couple interesting tidbits out of the Q&A.

Note that this is not to be confused with Nintendo's Q1 earnings release and subsequent investor briefing, which is still a month away.
I was just about to post this. The Q&A sessions are actually rather rich in questions, and the answers are quite long. Just look at the Q&A from the most recent meeting: http://www.nintendo.co.jp/ir/en/stock/meeting/110629qa/index.html

Now that they have a new console launching in just a few months, I assume the investors have a heap of questions. So this should be quite interesting.

Convocation notice: http://www.nintendo.co.jp/ir/pdf/2012/convocation_notice1206e.pdf

On another note, NoA Rob appears uninterested in answering my inquiry (even with a simple "No comment"). Of course, it could be that he is currently on vacation and therefore unable to reply. If not, I shall see to pursuing him via other means of communication. Well, at least if we are still alone in the dark after the upcoming investor meeting.
 

Rösti

Unconfirmed Member
When will we be able to read this Q&A in English?
The English version of the Q&A for the 71st Annual General Meeting of Shareholders (which was held on the 29th of June) was uploaded on the 6th of July last year. The Japanese version was uploaded on the 1st of July. So, the delay appears to be a week or so for the translated version.
 

Hakai

Member
Rösti said:
The English version of the Q&A for the 71st Annual General Meeting of Shareholders (which was held on the 29th of June) was uploaded on the 6th of July last year. The Japanese version was uploaded on the 1st of July. So, the delay appears to be a week or so for the translated version.

Nice, thanks Rösti! But if something important is dropped today it will probably be news this week or something?
 