
Wii U Community Thread

For me at least, I don't care what power is behind the Wii U. I just want the damn thing; being able to play games from two perspectives is one of the main reasons I want a Wii U. The whole social aspect is a great feature that Nintendo incorporated into their next home console.
 
I see what you are saying, but I don't agree with this.
How was Nintendo supposed to come out with an HD console for $250 back in 2006?
Even if they sold it at break-even? Unless you are saying that the Wii could have presented games at a higher resolution but Nintendo capped it at SD.

Also, I don't see how leaving the Classic Controller out is feature crippling when the Wiimote could essentially do what the Classic Controller did with added features. I see it more as an evolution. That's like saying Nintendo feature-crippled the NES because they didn't adopt the joystick.
No, no, that wasn't my point at all; if you read it like that, then it came across wrong.

I was trying to convey that back when the Wii was announced, they announced the Classic Controller straight away, so people were wondering if it was included in the price. The Wii was supposed to cost $200, and they were pressured by western retailers to price it higher (that was one of the reasons Wii Sports was included, which turned out to be genius); but in order to reduce the risk of launching a platform that didn't rely on power/straight evolution, they could also have included the Classic Controller in the box, just in case things went sour fast. They didn't, though, and that meant developers couldn't count on you having one.

I wasn't even touching on the power paradigm there; I bet that if Nintendo knew then what they know today, they'd probably have changed it a little even if it was still an SD console in the end. The problem wasn't just breaking even; it was also that they wanted a plan B that allowed them to price-drop the console quickly without losing too much money had it failed. They had at least one other system prototyped in R&D, probably two of them, higher spec'd. Kinda like how the GB Evo never came to be once the DS became successful.

If they'd known what they were getting into / had taken the platform's success for granted, they'd probably have tried to go for an X1600 or an even lower-spec "current" GPU at the time; the X360 GPU was top of the line, so they would never go for something like that (it doesn't even really make sense for consoles), but still, Unreal Engine's minimum asked-for spec was an ATI 9500/9600, and that was cheap enough.


Backtracking: if you launched an Xbox with Kinect, you'd still have to include an Xbox controller, and most devs would probably ignore it or just use it for gimmicky complementary stuff; Nintendo forced their vision for the platform by crippling its standard input method (power being another thing they crippled a bit too much, for other reasons).

Of course, it also helped that the controller went through a balancing act in order to feasibly allow for regular gaming on it, certain game genres aside (fighters, ugh); Kinect couldn't pull that off.
 
On the Nintendo side, there is a lot of intentional feature crippling as well. I guess the more popular examples are the Wii itself and the way it didn't bundle a Classic Controller (doing so would have allowed devs to ignore the new control method) and, well... the 3DS.

Nintendo realized long ago that the core customer and developers wanted a second analog stick in there, just as they wanted one analog stick on the DS before it. They only conceded to one because they clearly feel a dual-analog standard control scheme on a portable would be less than ideal as an experience: for FPS games there would be a steeper step for "regular gamers", "gaming on the go" could complicate some games, it would take away from features such as the gyroscope and the touch screen, and developers could over-rely on it as an excuse for cameras that don't behave correctly by making you use that extra stick all the time. That's perfectly valid on a home console of course (and they know that), but it would overcomplicate and fail to differentiate the experience there.

Hence, they didn't make it part of the initial standard for the platform, on purpose.

No, they didn't include a second stick because one wouldn't fit within the device. Have you seen the 3DS teardown? The thing is very well designed and laid out, and there is no room in the case for ANYTHING. As it is, they had to use a small battery that they knew would only power the device for 4 hours; do you think they chose that on purpose? A second stick's internals would be where the battery currently is. Sure, they could have made the device considerably larger (like the Circle Pad Pro does), but as it was, the 3DS was at the limit of what most people would consider "portable".

You can be sure that if Nintendo could have crammed a quality second analogue pad in the 3DS without harming portability, they would have.
 
No, they didn't include a second stick because one wouldn't fit within the device. Have you seen the 3DS teardown? The thing is very well designed and laid out, and there is no room in the case for ANYTHING. As it is, they had to use a small battery that they knew would only power the device for 4 hours; do you think they chose that on purpose? A second stick's internals would be where the battery currently is. Sure, they could have made the device considerably larger (like the Circle Pad Pro does), but as it was, the 3DS was at the limit of what most people would consider "portable".
Yes, I know that too: the battery is right beneath the right side, and the joystick is probably the thickest part on the left side; it goes right down to the bottom.

That certainly played a part, but I still think in the end it was fortunate they left it out; and they did it again with the 3DS XL. No teardown of that yet, but they probably could have made it fit in there without much hassle. It's part of their vision.

EDIT: I actually did point that out the other day with images:

[3DS teardown images]

to another user in another forum. ;) But thanks for bringing it up; it's also a very good point. Even if they could, it would be very hard and would hamper the current form factor, but I don't believe it was the only thing at play.
You can be sure that if Nintendo could have crammed a quality second analogue pad in the 3DS without harming portability, they would have.
That, I don't think so; Nintendo is a "less is more" kind of company.

Sometimes too minimalist, but most of the time as a means to an end.


But I respect your vision.
 

chris3116

Member
For me at least, I don't care what power is behind the Wii U. I just want the damn thing; being able to play games from two perspectives is one of the main reasons I want a Wii U. The whole social aspect is a great feature that Nintendo incorporated into their next home console.

I totally agree and I think the same way as you.
 
Yes, I know that too: the battery is right beneath the right side, and the joystick is probably the thickest part on the left side; it goes right down to the bottom.

That certainly played a part, but I still think in the end it was fortunate they left it out; and they did it again with the 3DS XL. No teardown of that yet, but they probably could have made it fit in there without much hassle. It's part of their vision.

EDIT: I actually did point that out the other day with images:

There is no need for a second pad on the 3DS XL; hardly any of the games use it, and for the games that do, it's an option. It's not worth it for Nintendo to add it on there. As for the whole Circle Pad Pro for the 3DS XL, I expect it to be a small add-on. Nintendo knows how many hated the look of the original version of it. So we will have to wait and see.
 

Redford

aka Cabbie
What would have been nice is if they'd put the D-Pad under the face buttons. That would work for me personally, since I can't remember the last time I used the D-Pad for much.
 

D-e-f-

Banned
A lot of people tend to feel that the more power the better (and on paper, yes, the easier it is to get something running the better; but that's also a fallacy: if running a 128-bit-era game on an X360 is easy, then you don't make a 128-bit-era game anymore). That's not necessarily true, for there's not really a switch in there that says "max details on". Stuff like grass bending when you crawl, tree branches moving with the wind, or generating more than five faces in an FPS game isn't automatic; every detail takes time to implement, and the more detailed a game is, the more work it takes. That's why after six years Crysis still looks so good: the GPU tech improved, but if another dev chases after the same thing, they practically have to start anew when it comes to details; unless they use pre-built engines for everything, which is certainly Epic's plan. And maybe I'm old-fashioned, but that dependency on external tech is not something I find good for the industry; in the end it might be a monopoly just like any other, a monopoly within a market that already had its monopolies, no less.
[...]
z0m3le was talking about development costs a few posts up; a current-gen game costing 15 million is a really cheap game. They often cost 30 million if not more.

An FF last gen cost 30 million, with a full five-year production and FMVs. When this gen started, the rule of thumb was to multiply by four, so an FF costing more than 100 million was no surprise; I doubt the figure dropped to less than three times last gen's dev costs, seeing the work needed on assets. I believe twenty-something million for an exclusive game is the norm, almost 30 for a multiplatform one (it costs an extra 3-4 million if done alongside); but I haven't looked at where those numbers were stated for a year or two, so I'm talking from memory alone.

I also remember Stranglehold costing 30 million. Sure, costs have dropped a little, but still.


Of course, with all that said you can't go against progression altogether, which is why any platform should be appropriately powerful for what it is.

Sorry for the rant.

Don't be sorry, very interesting rant! :)

I feel that the issue of development costs gets overlooked a lot when I hear people salivating over next-gen after seeing SW:1313 and stuff. All the effort and resources it takes to create these amazingly detailed assets, animations and environments is just madness! People need to look at the credits for AAA games like Assassin's Creed, Dragon Age or Max Payne 3: they're half an hour long, and sometimes more than a dozen studios have worked on these games all over the world! A single game! On current consoles! That is absolutely insane! That Star Wars game looks like it does because it seems to be highly linear and controlled in terms of level progression, and they're LucasArts collaborating with friggin' ILM!

It seems to me like Nintendo tries to be the last bastion for lower budget games with DS and Wii both sticking with visuals in the realm of the hardware's respective previous generations and 3DS and Wii U doing the same/similar thing.

I'm dragging this from the depths: (1 page away, but still)

Nintendo doesn't really seem afraid of it. Wii Motion Plus, Wii Fit, Classic Controllers and GC controllers (these two not being compatible) :p

Rather, they make money out of it.

Kinda, but I believe it's only because they were sure that they had software to really sell these things. With Wii Motion Plus they banked big on Wii Sports Resort and then the new Wii Play Motion thing, since Wii Play sold so ridiculously well simply because it came bundled with a controller. They were clever in the way that they "forced" the hardware onto consumers with bundles for well-established brands.

But the lack of support for WM+ also shows that it just wasn't enough to make many games require the use of it. Were there even more than 10 games? Flingsmash, WS:Resort, Zangeki No Reginleiv, Skyward Sword, Red Steel 2, Wii Play Motion ..those are all I can think of, am I missing something?

Of course the fact that actual interesting game ideas kind of didn't happen very often after the WM+ launched didn't help much either. It's never just one thing, I see that (Wii Fit sold a ton as well but people probably didn't have many good game ideas for a bathroom scale *cough* Rock 'n' Roll Climber? *cough*)

Well, technically we're seeing games now that aren't being played on the console like they were meant to, possibly due to the RAM amount (on Crysis 2 it was certainly a big part of the equation). How much would a 512 MB GDDR3 @ 700 MHz expansion module cost right now?

Probably $10 or $20; they could even have included that cost in their profit margin at launch (it was cheaper, but Nintendo actually did give out the Wii Remote jackets, the "Wii condoms", this gen to launch systems that lacked them, taking it out of their profit margin).

Of course the motivation for a manufacturer to do this is low; but think about this: it's all a matter of marketing and leverage against the competition. If consoles a year from now are launching with 4 GB as a response to the Wii U, then it would be a wise counter-measure to have an expansion slot for RAM.

I meant the "not being played like they were meant to" more like Perfect Dark on N64, where the Expansion Pak was required for players to access the campaign (!) and many multiplayer features while being technically "optional".

Consumers are viewed as seemingly stupid already, and I don't think any company would trust them to handle upgrading their RAM, even with a simple expansion stick that looks like a USB drive these days.

Like I said, for us who are somewhat tech savvy and "know" about these things it would be totally cool to be able to enhance our console a bit to keep up with other machines but I just don't see that working in a big-picture scenario with the way the industry has gone.

--------------------------------

Also, I don't see how leaving the Classic Controller out is feature crippling when the Wiimote could essentially do what the Classic Controller did with added features. I see it more as an evolution. That's like saying Nintendo feature-crippled the NES because they didn't adopt the joystick.

It fits the feature-crippling description because, had it been included, devs could just say "ah, what the hell" and make a regular-ass game, but Nintendo might've envisioned that to "encourage" developers to come up with unique ideas for the Wii Remote, since that's the only controller that 100% of owners have. Of course history went a different route, but I believe they might have thought that we'd see more Elebits/Eledees and fewer 3rd Person Action Game With Crappy Camera Because No Camera Control games.

They're basically the polar opposite of Sony: Nintendo tries to force creativity by imposing hardware limitations and different interfaces, while Sony tries to make everything exactly the same every time but with more bells and whistles (that's why they haven't changed a thing about their controller other than including the pointless Sixaxis at the last minute so they could say "hey, we have Teh Motions too!", which is probably a good thing, 'cause that boomerang controller was hideous). Sony basically sticks to what works and lets developers make games with all the toys they can throw in the ball pool, while Nintendo likes to challenge devs. Sony puts developers in a sandbox and says "you've been here before, right? Do whatever you like," while Nintendo hands them a hammer or a saw and asks "what can you make with these things we've given you?" Both produce interesting results.

Sorry, I don't know how this turned into a Sony vs Nintendo-design philosophy rant.

The original Xbox and PS2 were capable of rendering games in HD; Nintendo could have allowed the Wii to do so as well for less than $5 per console.

Huh, what really? HD? 720p+? 'Cause if you mean 480p, GameCube did that as well. Please enlighten me.

---

oh god this was a looong ass post and quite hard to glue together lol
 
Well then, if what you're saying is true, they never moved from R700. Evergreen is DX11, and while they did not use DX11, it tells us the feature set of the cards. There are hardware differences between DX parts. Again, I believe they started with a high-wattage R700 at 55nm and then dropped it to 40nm. I don't think they ever moved to any different core. Every other newer core is SM5 and DX11. There are hardware differences. Plus, everything we have says they never moved to DX11.

Why do you keep stating this as the most likely conclusion? If anything, we have been getting hints from numerous sources/insiders that GPU7 could have DX11-level capabilities, though there is nothing official for either case yet.
 

D-e-f-

Banned
What would have been nice is if they'd put the D-Pad under the face buttons. That would work for me personally, since I can't remember the last time I used the D-Pad for much.

gaaaah why would you want such a thing?

I play my 2D games (Virtual Console, eShop stuff like Mutant Mudds, Mighty Switch Force) on the 3DS with the D-Pad. Moving it off the left side would throw off sooo many people, and games wouldn't work. The D-Pad is used for lefty mode as well, and moving it to the right renders that impossible, because you can't do that with just the Circle Pad (which would have to stand in for ABXY).
 

USC-fan

Banned
Why do you keep stating this as the most likely conclusion? If anything, we have been getting hints from numerous sources/insiders that GPU7 could have DX11-level capabilities, though there is nothing official for either case yet.

I have not seen anything that hints at this, besides people reading between the lines and reading way too much into things. Kinda like the GPGPU rumors.

They will not be using DX anyway, so I don't see why they would even go after DX11 support.
 

Redford

aka Cabbie
gaaaah why would you want such a thing?

I play my 2D games (Virtual Console, eShop stuff like Mutant Mudds, Mighty Switch Force) on the 3DS with the D-Pad. Moving it off the left side would throw off sooo many people, and games wouldn't work. The D-Pad is used for lefty mode as well, and moving it to the right renders that impossible, because you can't do that with just the Circle Pad (which would have to stand in for ABXY).

I just find having two directional inputs right next to each other redundant, at first glance. :p
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Well then, if what you're saying is true, they never moved from R700. Evergreen is DX11, and while they did not use DX11, it tells us the feature set of the cards. There are hardware differences between DX parts. Again, I believe they started with a high-wattage R700 at 55nm and then dropped it to 40nm. I don't think they ever moved to any different core. Every other newer core is SM5 and DX11. There are hardware differences. Plus, everything we have says they never moved to DX11.
I think you misunderstood me. I'm not disputing the differences between shader models and DX version compliance at all. I'm saying that Nintendo took an R700 (an SM4 part codenamed Wekiva) and turned it into something which is *not* Wekiva (read: AMD did that for Nintendo). I'm not saying that the new part is DX11/SM5 compliant; I'm just saying it's a branch off of a Wekiva. Similarly to how Xenos was a branch off of a mainline design AMD had been working on, which ended up as R600, but Xenos is not R600 at all.

You still have it wrong with the X360. The R520 was out after the dev kits already had production Xenos in them. MS kept switching GPUs up until that point. Comparing it to the X360 really goes against everything you say about the Wii U.
At an old job of mine I had access to functional AMD parts as early as 3 months before their official launch. It would not have been much of a problem for MS to go with an R520 in the beta devkits, if they deemed that important.
 
I have not seen anything that hints at this, besides people reading between the lines and reading way too much into things. Kinda like the GPGPU rumors.

They will not be using DX anyway, so I don't see why they would even go after DX11 support.

Which is why I said "DX11-level," since you're right that the Wii U will not use DirectX.

Actually, I have heard that DX10.1 has the same features as DX11, but DX11 is a lot more efficient at doing them. Since the 360/PS3 have DX9-equivalent features, we may not see much evidence of that upgrade until development is more focused on the other consoles.
 

Roman

Member
I just find having two directional inputs right next to each other redundant, at first glance. :p

So you want to have the D-Pad somewhere else just to not use it?

In what situation would you use the stick and D-Pad at the same time? Have you done this on the 3DS?
 

Redford

aka Cabbie
So you want to have the D-Pad somewhere else just to not use it?

In what situation would you use the stick and D-Pad at the same time? Have you done this on the 3DS?

It would be an alternative to dual-analog, is what I'm saying. That's all.

For me personally, if I have to make that clear.

Your avatar sucks, btw
 

Roman

Member
It would be an alternative to dual-analog, is what I'm saying. That's all.

For me personally, if I have to make that clear.

Your avatar sucks, btw

Doesn't the gyroscope make dual analog controls unnecessary for a lot of games?

You write that just after switching back to your old one? Come on son.
 

Redford

aka Cabbie
Doesn't the gyroscope make dual analog controls unnecessary for a lot of games?

You write that just after switching back to your old one? Come on son.

A couple of posters already sort of discussed Nintendo's intentions on this page, so it depends on personal taste, really. Something new being mandated at the expense of something traditional... I wouldn't say the gyroscope makes DA-like setups unnecessary, as some simply prefer a physical input. I like the gyroscope control in OoT 3D, and it impressed me way more than I thought it would. I'm just spitballing ideas, considering the apparent implausibility of a second stick because of its design footprint on the hardware.

And I always take your criticism to heart bro
 

Roman

Member
A couple of posters already sort of discussed Nintendo's intentions on this page, so it depends on personal taste, really. Something new being mandated at the expense of something traditional... I wouldn't say the gyroscope makes DA-like setups unnecessary, as some simply prefer a physical input. I like the gyroscope control in OoT 3D, and it impressed me way more than I thought it would. I'm just spitballing ideas, considering the apparent implausibility of a second stick because of its design footprint on the hardware.

And I always take your criticism to heart bro

I agree with the sentiment on Ocarina of Time 3D and the GamePad has both dual analog and a gyro so I do not see a problem. It doesn't seem to me that the left hand input is going to differ much from the 3DS.

bro hug?
 

USC-fan

Banned
I think you misunderstood me. I'm not disputing the differences between shader models and DX version compliance at all. I'm saying that Nintendo took an R700 (an SM4 part codenamed Wekiva) and turned it into something which is *not* Wekiva (read: AMD did that for Nintendo). I'm not saying that the new part is DX11/SM5 compliant; I'm just saying it's a branch off of a Wekiva. Similarly to how Xenos was a branch off of a mainline design AMD had been working on, which ended up as R600, but Xenos is not R600 at all.
The branch-off for Xenos fit into the next-gen parts from AMD. While not the same, they are both unified shader models. AMD moved to this just after the X360 launch with cards that used unified shaders. It's not like MS developed unified shaders and then AMD copied them; no, AMD designed the parts.

Now I don't have a clue where they would have gone with an R700 core. Nothing AMD is coming out with would work in an R700 core.

At an old job of mine I had access to functional AMD parts as early as 3 months before their official launch. It would not have been much of a problem for MS to go with an R520 in the beta devkits, if they deemed that important.
Even 3-5 months before the launch they already had production GPUs in the X360 dev kits. Remember, the X1800 launched after the X360...



Which is why I said "DX11-level," since you're right that the Wii U will not use DirectX.

Actually, I have heard that DX10.1 has the same features as DX11, but DX11 is a lot more efficient at doing them. Since the 360/PS3 have DX9-equivalent features, we may not see much evidence of that upgrade until development is more focused on the other consoles.
DirectX 10, 10.1, 11 architecture: does it matter to GPU U?

Using RV7x0 and its DirectX 10.1 architecture as a starting point for building GPU U, DX11 compliance doesn't matter much. The computational building blocks of the GPU, the SIMDs in RV7x0, are more than capable of taking GPGPU workloads, and the RV7x0, just like the R6x0 architecture, has a hardware tessellator. The API refinements that exist in DX11 may seem irrelevant to Wii U, given it doesn't use DirectX as its API. However, we must keep in mind that, for example, the tessellation mechanism implemented in pre-DX11 ATI parts is somewhat clunky, requiring two-passing. Another example would be that DX11 also introduces atomic operations that are useful on quite a few occasions, and pre-DX11 parts do not include such functionality.

The refinements made to the SIMDs (VLIW4 instead of VLIW5) and texture sampling units in Cayman compared to RV7x0 can, in contrast, have an impact on the performance of the GPU U.
http://www.beyond3d.com/content/articles/118/6

This is a good read if you care about the tech angle of the GPU. They went really high on a lot of stuff in the article. Given the TDP of the console, it's a lot closer to the lower end of each spec.
 
Any reason you chose those values BG? And Wii U is looking to be about 2/3 of that right?

I had an early idea of what I believe would be a great, six-year console from both a power and cost viewpoint. From there I came up with what I thought would be an acceptable minimum that would still be in a very acceptable range of that console. PS4 was exactly what that idea was spec-wise, but Sony seems to be indicating they will wait more than six years for a successor, which I don't like.

At the currently speculated best Wii U would either be barely at or short of 2/3 of that GPU.

What would be the comparable specs for the 720? PS4?

Basic, rough specs.

Wii U

3-core CPU (possibly 476FP cores)
2GB (DDR3?) and 32MB eDRAM
480-800 GFLOP GPU

PS4

4-core CPU (possibly Jaguar cores)
2-4GB GDDR5
1.8 TFLOP GPU

Xbox 3


8(?)-core CPU (possibly Jaguar cores)
8GB DDR3/4 and > 32MB eDRAM
≥1TFLOP GPU
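
The back-of-the-envelope math behind totals like these is peak GFLOPS = shader ALUs × 2 ops per clock (multiply-add) × clock speed in GHz. A minimal sketch runs the numbers below; the ALU counts and clocks are purely illustrative guesses chosen to land in the ranges quoted above, not leaked specs:

```python
# Back-of-the-envelope GPU FLOPS arithmetic; configurations are illustrative guesses only.
# Peak GFLOPS = shader ALUs x 2 ops per clock (multiply-add) x clock in GHz.

def peak_gflops(alus: int, clock_ghz: float) -> float:
    """Theoretical single-precision peak for an AMD-style SIMD part."""
    return alus * 2 * clock_ghz

configs = {
    "Wii U (low guess)":  (320, 0.75),    # 320 ALUs @ 750 MHz  -> ~480 GFLOPS
    "Wii U (high guess)": (640, 0.625),   # 640 ALUs @ 625 MHz  -> ~800 GFLOPS
    "PS4 (as listed)":    (1152, 0.8),    # 1152 ALUs @ 800 MHz -> ~1.8 TFLOPS
}

for name, (alus, clock) in configs.items():
    print(f"{name:>20}: {peak_gflops(alus, clock):7.0f} GFLOPS")
```

The point is just that the same GFLOP total can be reached with very different ALU-count and clock combinations, which is why the power-draw arguments further down matter so much.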

Well then, if what you're saying is true, they never moved from R700. Evergreen is DX11, and while they did not use DX11, it tells us the feature set of the cards. There are hardware differences between DX parts. Again, I believe they started with a high-wattage R700 at 55nm and then dropped it to 40nm. I don't think they ever moved to any different core. Every other newer core is SM5 and DX11. There are hardware differences. Plus, everything we have says they never moved to DX11.

You still have it wrong with the X360. The R520 was out after the dev kits already had production Xenos in them. MS kept switching GPUs up until that point. Comparing it to the X360 really goes against everything you say about the Wii U.

I have not seen anything that hints at this, besides people reading between the lines and reading way too much into things. Kinda like the GPGPU rumors.

They will not be using DX anyway, so I don't see why they would even go after DX11 support.


And you keep focusing on what was in the early kit and the base architecture details from the early target specs when that arguably holds less weight than the points we've brought up when it comes to the feature set of the GPU. Like I said before, even ERP said it's flawed to do what you're doing.

http://forum.beyond3d.com/showpost.php?p=1653227&postcount=12899

You guys read way too much into early devkits; they're an indicator of the general architecture and very rough performance (sometimes), and not very much else.
Before devkits with real silicon are available, usually less than 12 months before launch, devkits are there to let developers play with the new feature sets and the core OS APIs, often with very different performance profiles.
Specs usually are "locked down", and I use the term loosely, 2+ years before a product launches. That's not to say they don't "evolve" as process issues arise, or partners' deadlines slip, or information about a competitor's platform leaks, but you rarely see wholesale changes.
The 360 devkits had 2 different video cards, neither of which were very much like the final hardware, but with the exception of the late memory size change, disclosed developers knew what was in the final box before the first devkit even shipped.
 

stupidvillager

Neo Member
Well then, if what you're saying is true, they never moved from R700. Evergreen is DX11, and while they did not use DX11, it tells us the feature set of the cards. There are hardware differences between DX parts. Again, I believe they started with a high-wattage R700 at 55nm and then dropped it to 40nm. I don't think they ever moved to any different core. Every other newer core is SM5 and DX11. There are hardware differences. Plus, everything we have says they never moved to DX11.

You still have it wrong with the X360. The R520 was out after the dev kits already had production Xenos in them. MS kept switching GPUs up until that point. Comparing it to the X360 really goes against everything you say about the Wii U.

So what features is the Wii U missing to bring it up to DX11 levels?
 

USC-fan

Banned
Basic, rough specs.

Wii U

3-core CPU (possibly 476FP cores)
2GB (DDR3?) and 32MB eDRAM
480-800 GFLOP GPU



And you keep focusing on what was in the early kit and the base architecture details from the early target specs when that arguably holds less weight than the points we've brought up when it comes to the feature set of the GPU. Like I said before, even ERP said it's flawed to do what you're doing.

http://forum.beyond3d.com/showpost.php?p=1653227&postcount=12899
800 GFLOPS!!!! Wow, you're still shooting for the sky! It might not even be half of that.....

Where are these Xbox 720 GPU numbers coming from? I've seen the PS4 numbers reported but nothing on the X720.

I'm going off everything we know, not just early kits. The biggest thing I am going after is the power consumption of the console.

The big red flags with the Wii U have not changed: the small console size and NOW the PSU specs. Even putting in the latest and greatest from AMD will get you around 500 GFLOPS, and that leaves no power left for the CPU. Maybe this is the reason for the reports of terrible CPU performance.


So what features is the Wii U missing to bring it up to DX11 levels?
Check my last post before this one. I linked to a good article about this.
 
800 GFLOPS!!!! Wow, you're still shooting for the sky! It might not even be half of that.....


I'm going off everything we know, not just early kits. The biggest thing I am going after is the power consumption of the console.

The big red flags with the Wii U have not changed: the small console size and the PSU specs. Even putting in the latest and greatest from AMD will get you around 500 GFLOPS, and that leaves no power left for the CPU. Maybe this is the reason for the reports of terrible CPU performance.


Check my last post before this one. I linked to a good article about this.

Yeah, focus on the max instead of the range. And I already said what the early kit was at so it won't be half. Don't even know why you said that.

You're all over the place. How does power consumption tell you that the GPU will have DX11 features or that there will be emphasis on compute functions? That's what we've been debating and power consumption has nothing to do with that. You've focused on the early kit placeholder GPU and the early target specs. And even then I've shown you a GPU with low power consumption, current features, and raw power equivalent to what the early kit had, which was greater than 500 GFLOPs.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The branch-off for Xenos fit into the next-gen parts from AMD. While not the same, they are both unified shader models. AMD moved to this just after the X360 launch with cards that used unified shaders. It's not like MS developed unified shaders and then AMD copied them; no, AMD designed the parts.
I'm not sure what you're arguing here. I never implied MS developed anything for AMD. Let's look at the timeline once again. First off, AMD did not move to unified shaders in the consumer space 'just after 360' - it was R500 that launched along with 360; the first R600 launched mid-2007. And second, I'm not sure what you imply with Xenos being a unified shader architecture. Tahiti is a unified shader architecture as well - how close is it to Xenos?

In case you're missing that, Xenos is not R600; there are some serious architectural differences between the two. For instance, R600 is VLIW5, while Xenos is simd4+scalar (you can call it VLIW2 if you like).

Now I don't have a clue where they would have gone with an R700 core. Nothing AMD is coming out with would work in an R700 core.
You lost me here. Please, elaborate.

Even 3-5 months before the launch they already had production GPUs in the X360 dev kits. Remember, the X1800 launched after the X360...
No, both X1800 and X1600 officially launched in early Oct 2005. MS sent out the final kits in early Sep 2005. Subtract from that a month for production and logistics, and that means they had their final kit done in Aug 2005. 3-5 months from Oct for the R520 means MS could've shipped beta "n" kits with R520 if they really wanted to.
 

stupidvillager

Neo Member
800 GFLOPS!!!! Wow, you're still shooting for the sky! It might not even be half of that.....

Where are these Xbox 720 GPU numbers coming from? I've seen the PS4 numbers reported but nothing on the X720.

I'm going off everything we know, not just early kits. The biggest thing I am going after is the power consumption of the console.

The big red flags with the Wii U have not changed: the small console size and NOW the PSU specs. Even putting in the latest and greatest from AMD will get you around 500 GFLOPS, and that leaves no power left for the CPU. Maybe this is the reason for the reports of terrible CPU performance.


Check my last post before this one. I linked to a good article about this.

Wasn't compute shader support and tessellation introduced in Shader Model 5 and DX11?
 

Madn

Member
Yay, I've been accepted! I was hoping for it seeing all the new members, even though I registered only two weeks ago.
As for next gen, I'm definitely replacing my Wii + computer with a Wii U + better computer.
 
Wasn't compute shader support and tessellation introduced in Shader Model 5 and DX11?

They've been around much longer. When I say they, I mean GPGPU functionality. The main issue with tessellation from what I understand is that the tessellation unit has only recently become decent enough to properly handle tessellation. I've also seen it said that nVidia is ahead of AMD in this area, but that may have just been someone's opinion as I can't remember where I saw it.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Wasn't compute shader support and tessellation introduced in Shader Model 5 and DX11?
In the MS API - yes. R700, though, an SM4.1/DX10.1-compliant part, has compute shaders, and is OpenCL-compliant.
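
To make "compute without DX11" concrete, here's a minimal GPGPU sketch going through OpenCL from Python via pyopencl, the vendor-neutral route an R700-class part can expose; the package choice, kernel name, and array size are illustrative, and it assumes a working OpenCL driver is installed:

```python
# Minimal OpenCL example: run a trivial kernel on whatever OpenCL device is available.
# Illustrative only; assumes the pyopencl package and an OpenCL-capable driver.
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

ctx = cl.create_some_context()      # pick any available OpenCL device
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out)
{
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)   # one work-item per element

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```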
 

stupidvillager

Neo Member
They've been around much longer. When I say they, I mean GPGPU functionality. The main issue with tessellation from what I understand is that the tessellation unit has only recently become decent enough to properly handle tessellation. I've also seen it said that nVidia is ahead of AMD in this area, but that may have just been someone's opinion as I can't remember where I saw it.

I understand that they have been around a while, but my point is that DX11 only recently started supporting them, and that it doesn't matter what DX level the Wii U GPU is. If these functions are only now supported in Shader Model 5 and DX11, and the Wii U GPU supports them, then it would make sense that the Wii U is not just an R700 DX10.1 card. Even if the tessellation unit is somewhat "clunky", it is still there and most likely optimized further than what is in a standard R700.
 

D-e-f-

Banned
Yay, I've been accepted! I was hoping for it seeing all the new members, even though I registered only two weeks ago.
As for next gen, I'm definitely replacing my Wii + computer with a Wii U + better computer.

so you're saying you're upgrading from a Wii to a Wii U and from your PC to a PC U

Makes me think everything should be prefixed and suffixed with Nintendo logic.

"Today I went and bought myself a Toaster 64"

"Yo, I needed a new pair of socks that have grip on the bottom, so I got myself some Super Socks"

"I got rid of my old car, I needed something fresh. Wanna test drive my Bike U?"
 

Madn

Member
so you're saying you're upgrading from a Wii to a Wii U and from your PC to a PC U

Makes me think everything should be prefixed and suffixed with Nintendo logic.

"Today I went and bought myself a Toaster 64"

"Yo, I needed a new pair of socks that have grip on the bottom, so I got myself some Super Socks"

"I got rid of my old car, I needed something fresh. Wanna test drive my Bike U?"

How could you forget the Fridge Advance SP? Now foldable!
 

Meelow

Banned
I'm finally accepted! I've been waiting 4 months, lol. Hi everybody!

As for the Wii U, I'm very excited and can't wait to get it. I know a lot of people are worried about specs, but if Nintendo makes the right moves, hopefully we won't have to worry even if the PS4 and 720 are that much stronger.
 
I understand that they have been around a while, but my point is that DX11 only recently started supporting them, and that it doesn't matter what DX level the Wii U GPU is. If these functions are only now supported in Shader Model 5 and DX11, and the Wii U GPU supports them, then it would make sense that the Wii U is not just an R700 DX10.1 card. Even if the tessellation unit is somewhat "clunky", it is still there and most likely optimized further than what is in a standard R700.

It is true that OpenGL/CL have provided them from an API standpoint for longer. But longer doesn't necessarily mean better. They still had to bring OpenGL 4.0 up to similar standards as DX11, from what I remember seeing. And those are the standards being used by modern games, which an R700 wouldn't be capable of, because while an R700 is up to DX10.1, it's also only up to OpenGL 3.3.

But I would be surprised to see the GPU in Wii U having an older-generation tess unit. The one in the R700 is the same as the one in Xbox 360's Xenos.
 
Yeah, focus on the max instead of the range. And I already said what the early kit was at so it won't be half. Don't even know why you said that.

You're all over the place. How does power consumption tell you that the GPU will have DX11 features or that there will be emphasis on compute functions? That's what we've been debating and power consumption has nothing to do with that. You've focused on the early kit placeholder GPU and the early target specs. And even then I've shown you a GPU with low power consumption, current features, and raw power equivalent to what the early kit had, which was greater than 500 GFLOPs.

I don't think you're crazy for seeing ~800 GFLOPS as a possibility. Using the design principles of the e6760, they might have opted to restore functionality to those 2 disabled SIMD units you spoke of in the early dev kits but maintain a slow clock (say, 480 MHz). More shaders and a lower clock speed would probably amount to a lower TDP than the other way around.

If that is the case, we might be looking at a CPU clock around 1.4-1.5 GHz, since Nintendo insists on maintaining "balance." I was doing some reading and it seems the 476FP does 1.6 GHz in single-core configs, but lower once you start adding cores. That would also fit with the comments about Espresso (the CPU) being clocked closer to Broadway than Xenon.
 

10k

Banned
The Waffle Maker Cube.

I'm off to my birthday BBQ. Usually big news happens when I'm gone from the Internet. Don't be surprised if Nintendo drops a babomb or goes third party by the end of the night :p
 

USC-fan

Banned
Yeah, focus on the max instead of the range. And I already said what the early kit was at so it won't be half. Don't even know why you said that.

You're all over the place. How does power consumption tell you that the GPU will have DX11 features or that there will be emphasis on compute functions? That's what we've been debating and power consumption has nothing to do with that. You've focused on the early kit placeholder GPU and the early target specs. And even then I've shown you a GPU with low power consumption, current features, and raw power equivalent to what the early kit had, which was greater than 500 GFLOPs.

Because the range is wayyyyy off. It is impossible to have 800 GFLOPS in the Wii U. There is not enough power in the system.

I'm not all over the place; power consumption tells us the correct range, which is 300-575 GFLOPS, with 575 being the high end. You showed the latest and greatest from AMD. That GPU came out this year; what does that prove? That is the best case at this point, and even then it's using about 70% of the power the whole system uses.

It really doesn't matter what features it has at this point. There is so little power available it won't change performance much at all. Worst case they are still using an R700 core. Going by all the info we have, they are still on an R700.

I've focused on all the info we have on the system.
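
For what it's worth, the power-budget reasoning being argued here can be sketched out explicitly. Every input below is an assumption (the PSU rating is the commonly cited figure; the usable fraction, GPU share, and GFLOPS-per-watt are guesses in the spirit of the e6760 comparison), so the output is only as good as those guesses:

```python
# Rough GFLOPS-from-power-budget estimate; every input is an assumption, not a known spec.

psu_watts = 75.0         # commonly cited rating of the Wii U power brick
usable_fraction = 0.60   # assume the system never draws the full PSU rating
gpu_share = 0.50         # assume roughly half of the draw goes to the GPU
gflops_per_watt = 16.0   # ballpark for a 40nm embedded AMD part (~576 GFLOPS / ~35 W class)

gpu_watts = psu_watts * usable_fraction * gpu_share
estimate = gpu_watts * gflops_per_watt
print(f"GPU budget ~{gpu_watts:.0f} W -> ~{estimate:.0f} GFLOPS")   # ~360 GFLOPS with these guesses
```

Change any one of those guesses and the estimate swings by hundreds of GFLOPS, which is really the whole disagreement in this exchange.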
 
Yeah, errm, just been activated. One thing I've wanted to add to this is something from the user ldhere, is it? When the dev kit specs appeared and people started saying "target specs", ldhere made a post saying 'who said these are target specs.'

I think it's been overlooked a little and I wanted to link to it, but I don't think I'm spelling his user name right in the search tool!
 

10k

Banned
I don't think you're crazy for seeing ~800 GFLOPS as a possibility. Using the design principles of the e6760, they might have opted to restore functionality to those 2 disabled SIMD units you spoke of in the early dev kits but maintain a slow clock (say, 480 MHz). More shaders and a lower clock speed would probably amount to a lower TDP than the other way around.

If that is the case, we might be looking at a CPU clock around 1.4-1.5 GHz, since Nintendo insists on maintaining "balance." I was doing some reading and it seems the 476FP does 1.6 GHz in single-core configs, but lower once you start adding cores. That would also fit with the comments about Espresso (the CPU) being clocked closer to Broadway than Xenon.
What I don't get is, the 360 and PS3 are clocked at 3.2 GHz. Shouldn't an HD console be clocked at a minimum of around 3 GHz to process the good AI and physics that are expected from today's engines? I just find it hard to believe that an HD console would have a CPU clock lower than 2.5 GHz, and here you guys are talking about 1.6 GHz :(
 
What I don't get is, the 360 and PS3 are clocked at 3.2 GHz. Shouldn't an HD console be clocked at a minimum of around 3 GHz to process the good AI and physics that are expected from today's engines? I just find it hard to believe that an HD console would have a CPU clock lower than 2.5 GHz, and here you guys are talking about 1.6 GHz :(

The PS3 and X360 CPUs are really inefficient. The clock speed is high, but that means very little in the scheme of how it performs. Go look up some old benchmarks of a 3 GHz Pentium D compared to a Core 2 Duo E6300 (1.86 GHz) to see what the discussion's about.
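
To put a rough number on that: effective throughput is approximately instructions-per-clock times clock speed. A tiny sketch with made-up IPC figures (illustrative only, not benchmark results):

```python
# Illustrative only: effective throughput ~ IPC (instructions per clock) x clock in GHz.
# The IPC values are invented to show the shape of the argument, not measurements.

cpus = {
    "narrow in-order core @ 3.2 GHz (Xenon/PPE-style)": (0.5, 3.2),
    "wider out-of-order core @ 1.6 GHz":                (1.2, 1.6),
}

for name, (ipc, clock_ghz) in cpus.items():
    print(f"{name}: ~{ipc * clock_ghz:.2f} billion instructions/sec per core")
```

With numbers like these, the lower-clocked core still gets more real work done per second, which is the Pentium D vs. Core 2 comparison in a nutshell.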
 
I don't think you're crazy for seeing ~800 GFLOPS as a possibility. Using the design principles of the e6760, they might have opted to restore functionality to those 2 disabled SIMD units you spoke of in the early dev kits but maintain a slow clock (say, 480 MHz). More shaders and a lower clock speed would probably amount to a lower TDP than the other way around.

If that is the case, we might be looking at a CPU clock around 1.4-1.5 GHz, since Nintendo insists on maintaining "balance." I was doing some reading and it seems the 476FP does 1.6 GHz in single-core configs, but lower once you start adding cores. That would also fit with the comments about Espresso (the CPU) being clocked closer to Broadway than Xenon.

I agree with your assessment.

Because the range is wayyyyy off. It is impossible to have 800 GFLOPS in the Wii U. There is not enough power in the system.

I'm not all over the place; power consumption tells us the correct range, which is 300-575 GFLOPS, with 575 being the high end. You showed the latest and greatest from AMD. That GPU came out this year; what does that prove? That is the best case at this point, and even then it's using about 70% of the power the whole system uses.

It really doesn't matter what features it has at this point. There is so little power available it won't change performance much at all. Worst case they are still using an R700 core. Going by all the info we have, they are still on an R700.

I've focused on all the info we have on the system.

What you are doing is making assumptions and passing them off as fact.

You're assuming the high end is 575 (576) GFLOPs.

You're assuming the e6760 is the best case. And it's on a 40nm process.

The feature set has always been important for future ports, and now you're trying to say it's not. Even if Wii U were just a 360 with more memory and a modern feature set, that would go a long way for the future as devs are still making PS360 games that are starting to be truly blown away by the PC versions.

Going by all the info we have you can't assume, which is what you are doing again, that they are still on an R700. Especially when the final silicon came out about six months ago.

You haven't focused on all the info we have. You picked what you want to focus on. Selective reading has been one of the most consistent things I've seen from those who want to make Wii U weaker than it is.

Yeah, errm, just been activated. One thing I've wanted to add to this is something from the user ldhere, is it? When the dev kit specs appeared and people started saying "target specs", ldhere made a post saying 'who said these are target specs.'

I think it's been overlooked a little and I wanted to link to it, but I don't think I'm spelling his user name right in the search tool!

lherre

What I don't get is, the 360 and PS3 are clocked at 3.2 GHz. Shouldn't an HD console be clocked at a minimum of around 3 GHz to process the good AI and physics that are expected from today's engines? I just find it hard to believe that an HD console would have a CPU clock lower than 2.5 GHz, and here you guys are talking about 1.6 GHz :(

They are in-order processors. They need the higher clock speed to make up for how they execute instructions.
 

D-e-f-

Banned
Yeah, errm, just been activated. One thing I've wanted to add to this is something from the user ldhere, is it? When the dev kit specs appeared and people started saying "target specs", ldhere made a post saying 'who said these are target specs.'

I think it's been overlooked a little and I wanted to link to it, but I don't think I'm spelling his user name right in the search tool!


"lherre" is his username

After watching GameXplain's analysis of Nintendo Land,
I'm even more interested in playing it:

http://www.youtube.com/watch?v=3DC78LiO5yM




Yeah, Nintendo Land gets an undeserved bad rap because it was so poorly presented at the press briefing.

A few pages ago I wrote something about how I think the game might be secretly awesome.

I think this is actually better than Wii Sports. Not in terms of instant mass appeal but in terms of long term value for "Us".
 