
Rumor: Wii U final specs

magash

Member
You sure can.

What is your take on this whole "Enhanced Broadway" stuff? Surely an "Enhanced Broadway" will find it very difficult or impossible to run games like Assassin's Creed 3, unless "Enhanced" means something entirely different.
 

EVIL

Member
I was thinking something similar this morning. If Nintendo save their graphical showcase for too close to the 720/PS4 reveal, they run the risk of looking rather foolish. If they'd had something to show earlier, it could have helped them in the lead-up to launch.

I don't think it would have mattered much. Even if the Wii U were as powerful as the PS4/720, people would still think the PS4 and 720 have much better-looking games, because of the "Nintendo = kiddy" perception. This argument will never end.
 
5 processes per core, where did that come from? So each core is more like a separate CPU with multiple cores? Surely you can't get 5 threads using hyper-threading, or can you?

Don't worry about it. He's confusing an imagined IPC (instructions per clock) figure with SMT (simultaneous multi-threading). In either case there is no evidence that 5 is an accurate measure of the Wii U CPU's capabilities.
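For the curious: SMT just means one physical core exposes multiple hardware threads to the OS, while IPC is about how much work one core retires per cycle. A minimal C++ sketch of how the former shows up from software (the 4-core / 2-way figures in the comments are illustrative assumptions, not Wii U specs):

```cpp
#include <iostream>
#include <thread>

int main() {
    // Logical processors the OS exposes. On an SMT machine this is
    // (physical cores) x (hardware threads per core), e.g. 4 x 2 = 8.
    unsigned logical = std::thread::hardware_concurrency();
    std::cout << "Logical hardware threads: " << logical << '\n';

    // IPC (instructions per clock) is a different axis entirely: how many
    // instructions one core retires per cycle. It can't be queried from
    // the standard library, which is part of why a bare "5" is so dubious.
    return 0;
}
```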
 
Not what I was saying.

You were trying to use examples of how power defined a generation. Great.

But then, for some reason, you decide it's OK to lump 32-bit/64-bit together, even though there is a 100% difference between the two.
I didn't; common convention has. Presumably because, despite that 100% difference, and despite the N64 first releasing roughly the same span after the Saturn as before the Dreamcast, it wasn't seen as a big enough increase in power over the Saturn and PS1 to be considered a new generation.
 

Goodlife

Member
I didn't; common convention has. Presumably because, despite that 100% difference, and despite the N64 first releasing roughly the same span after the Saturn as before the Dreamcast, it wasn't seen as a big enough increase in power over the Saturn and PS1 to be considered a new generation.

So why's the 4-bit to 8-bit jump considered enough??
 

Theonik

Member
What is your take on this whole "Enhanced Broadway" stuff? Surely an "Enhanced Broadway" will find it very difficult or impossible to run games like Assassin's Creed 3, unless "Enhanced" means something entirely different.
Given the WiiU CPU is meant to have out-of-order execution, the idea of an enhanced Broadway is quite vague here. It's definitely not 3 Broadway cores on a single die.
Edit: I just found this, brings back so many memories
 
So why's the 4-bit to 8-bit jump considered enough??
You should presumably ask someone alive during those eras who discerned a generational leap between Pong and Pitfall. It could be, I assume, that the former's hardware wasn't technically capable of the games seen on the latter.

You could also ask them why two Atari consoles are considered part of the 2nd gen. Or why the TurboGrafx16 first releasing a year after the 7800 wasn't part of the third gen.
 

gogogow

Member
Killzone 2 graphics quality was achieved though.

What KZ2 did you play, lol. Character models, hair, particle effects, animation, scale of environment, cloth physics, 16xAA. The limitless amount of polygons will never be achieved, unless it's a CG cutscene, like, uh... the KZ2 unveil trailer. To even come close to that, tessellation and POM need to be used, and back then, yeah, no game used those effects, since they are DX11 effects.

What Guerrilla Games have achieved is a great-looking game, one of the best imo, but it's nowhere close to the CG trailer they showed.


Look at that goddamn tire, probably more polygons than on the entire screen of the real KZ2/3.
[image: killzone-next-gen-2005jsq1.jpg]
 

Kenka

Member
Ahem, in a modern spec sheet, how do you know how many "bits" the console has (I mean 16-bit, 32-bit, 128-bit, etc.)? Is it the bus width of the GPU?
 
Killzone 2 graphics quality was achieved though.

Killzone 2 was a great-looking game for sure, and the team at GG worked hard, no doubt! But to say the game achieved the graphics quality seen in that CG trailer is complete BS. That trailer shouldn't have been shown at all. All the fake CG trailers shown at that press conference came from Sony Europe studios, so I am sure Phil was to blame.
 

2MF

Member
Ahem, in a modern spec sheet, how do you know how many "bits" the console has (I mean 16-bit, 32-bit, 128-bit, etc.)? Is it the bus width of the GPU?

Usually it's the register/pointer size of the CPU. It's not really a good measure of performance anymore (if it ever was).
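A minimal C++ sketch of that convention; it simply reports the pointer width of whatever platform it's compiled for:

```cpp
#include <iostream>

int main() {
    // A platform's "bits" conventionally means the width of its
    // general-purpose registers/pointers, not the GPU's bus width.
    std::cout << "This build targets a " << sizeof(void*) * 8
              << "-bit platform\n";  // e.g. 64 on a modern desktop OS
    return 0;
}
```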
 

Shaheed79

dabbled in the jelly
You have lherre, Arkam, EatChildren, someone who decided to call themselves Espresso on B3D, IdeaMan (in a more optimistic fashion), and whoever Eurogamer's sources are, all corroborating spec/SDK copy-pasta leaks.

That's a lot of insiders you named there. And all of those people didn't know that Wii U had 2gigs of total system memory? Or did they all just feel the extra gig wasn't worth mentioning? I think that 1gig "that applications can use" number comes from a non-final version of the Wii U dev kit.
That's the best anyone's going to get. And really more than most expected to get until someone tears it apart and puts it under a microscope or something.
I don't give rumors more credibility because we do not have more specific official information.

There's been very little public comment on specifics regarding the hardware, until Iwata confirmed a few things in the Nintendo Direct. You have people saying things like "Oh, it's great, blah blah," which is essentially meaningless, as they've got their PR hats on.
That's not exactly true, but it's fine if you believe that. Regardless, it doesn't make these rumors any more or less reliable.

It could be some huge Nintendo conspiracy wherein they've given out different devkits, and there are better devkits, and no one knows who got what (what an awesome way to cultivate developer relations), or it could just be Occam's Razor.
Or it could simply be that Nintendo has a list of preferred developers whom they like to work more closely with. And these lucky developers receive the newest dev kits first, and are told more specific information about the final hardware before other developers. If you want to define that as a conspiracy then ok. But I'm sure that this is quite common, especially when a company is trying to keep certain elements of their product a secret.

There is a certain narrative in rumor topics, such as this one, and going against that narrative seems to be discouraged, not by the mods mind you, but by other gafers.

If people ask certain questions, with the intent of determining the credibility and accuracy of the source in question, then I think it should be encouraged. If those important and relevant questions cannot be answered, for whatever reason, then it would make sense to remain skeptical.

This is not a question of "do you trust the mods?" as their credibility is not in question. To my knowledge the mods who have directly commented on Wii U specs are not Wii U developers, and neither are some of the sources. This is where the possibility for error lies as we receive 2nd and even 3rd hand information.

What we have in this topic looks to me like a circle of self-confirmations, with people corroborating the people who were probably the source of their own information. I was attempting to dig through all of that red tape and find a single credible, non-anonymous source that is confirmed to be working on the final-spec Wii U dev kits.

The new information in the rumor isn't even specific, but rather so vague that it could be interpreted in very different ways. This is especially true of the "enhanced Broadway" spec. Using that kind of terminology to describe the CPU, and omitting half of the RAM, says to me that the source of the rumor is probably someone who does not have access to the final dev kits and their specifications.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
What KZ2 did you play, lol. Character models, hair, particle effects, animation, scale of environment, cloth physics, 16xAA. The limitless amount of polygons will never be achieved, unless it's a CG cutscene, like, uh... the KZ2 unveil trailer. To even come close to that, tessellation and POM need to be used, and back then, yeah, no game used those effects, since they are DX11 effects.

What Guerrilla Games have achieved is a great-looking game, one of the best imo, but it's nowhere close to the CG trailer they showed.


Look at that goddamn tire, probably more polygons than on the entire screen of the real KZ2/3.
[image: killzone-next-gen-2005jsq1.jpg]
I particularly love the ultra-generic human heads used in this mockup - as if they were downloaded off turbosquid. Oh well, PR suits and their 'we have to show something amazing!'
 

2MF

Member
OK, and is this info easily obtainable, or can we derive it from other specs?

I would be curious to see which bit-generation the WiiU is in. Simple nostalgia.

The Broadway was a 32-bit CPU; POWER7 is 64-bit, I believe. I don't know what the Wii U will have, although IBM has said at least twice that it uses a POWER7 derivative.
 
What KZ2 did you play, lol. Character models, hair, particle effects, animation, scale of environment, cloth physics, 16xAA. The limitless amount of polygons will never be achieved, unless it's a CG cutscene, like, uh... the KZ2 unveil trailer. To even come close to that, tessellation and POM need to be used, and back then, yeah, no game used those effects, since they are DX11 effects.

What Guerrilla Games have achieved is a great-looking game, one of the best imo, but it's nowhere close to the CG trailer they showed.


Look at that goddamn tire, probably more polygons than on the entire screen of the real KZ2/3.
[image: killzone-next-gen-2005jsq1.jpg]

Wow, this screenshot brings back memories. For almost a year, GAF was all about arguing over this screenshot and the trailer.
 

Theonik

Member
Why does the Wii U need so much RAM for the OS? Shouldn't 512 be enough?
Future-proofing, as well as allowing many of the features that can be used outside of games to also work in-game (a full-featured web browser, for example), would be my guess.
What else you can do while in-game is anyone's guess.
 
That's a lot of insiders you named there. And all of those people didn't know that Wii U had 2gigs of total system memory? Or did they all just feel the extra gig wasn't worth mentioning? I think that 1gig "that applications can use" number comes from a non-final version of the Wii U dev kit.

I don't give rumors more credibility because we do not have more specific official information.


That's not exactly true, but it's fine if you believe that. Regardless, it doesn't make these rumors any more or less reliable.


Or it could simply be that Nintendo has a list of preferred developers whom they like to work more closely with. And these lucky developers receive the newest dev kits first, and are told more specific information about the final hardware before other developers. If you want to define that as a conspiracy then ok. But I'm sure that this is quite common, especially when a company is trying to keep certain elements of their product a secret.

There is a certain narrative in rumor topics, such as this one, and going against that narrative seems to be discouraged, not by the mods mind you, but by other gafers.

If people ask certain questions, with the intent of determining the credibility and accuracy of the source in question, then I think it should be encouraged. If those important and relevant questions cannot be answered, for whatever reason, then it would make sense to remain skeptical.

This is not a question of "do you trust the mods?" as their credibility is not in question. To my knowledge the mods who have directly commented on Wii U specs are not Wii U developers, and neither are some of the sources. This is where the possibility for error lies as we receive 2nd and even 3rd hand information.

What we have in this topic looks to me like a circle of self-confirmations, with people corroborating the people who were probably the source of their own information. I was attempting to dig through all of that red tape and find a single credible, non-anonymous source that is confirmed to be working on the final-spec Wii U dev kits.

The new information in the rumor isn't even specific, but rather so vague that it could be interpreted in very different ways. This is especially true of the "enhanced Broadway" spec. Using that kind of terminology to describe the CPU, and omitting half of the RAM, says to me that the source of the rumor is probably someone who does not have access to the final dev kits and their specifications.

Bgassasian and Ideaman said the system might have 2 gigs in total instead of 1.5 gigs, weeks before Iwata confirmed the number.
 

Kenka

Member
The Broadway was a 32-bit CPU; POWER7 is 64-bit, I believe. I don't know what the Wii U will have, although IBM has said at least twice that it uses a POWER7 derivative.
So... WiiU is 64-bit like the Nintendo 64. And now that I think of it, the Xbox came with a Pentium III, which calculates numbers up to 32 bits long. That means the "128-bit" generation was a misleading label. Thanks!
 

Recon_NL

Banned
What KZ2 did you play, lol. Character models, hair, particle effects, animation, scale of environment, cloth physics, 16xAA. The limitless amount of polygons will never be achieved, unless it's a CG cutscene, like, uh... the KZ2 unveil trailer. To even come close to that, tessellation and POM need to be used, and back then, yeah, no game used those effects, since they are DX11 effects.

What Guerrilla Games have achieved is a great-looking game, one of the best imo, but it's nowhere close to the CG trailer they showed.


Look at that goddamn tire, probably more polygons than on the entire screen of the real KZ2/3.
[image: killzone-next-gen-2005jsq1.jpg]

Textures look fucking great in Killzone:

[image: 30wuyb5.jpg]
 

Shaheed79

dabbled in the jelly
Bgassasian and Ideaman said the system might have 2 gigs in total instead of 1.5 gigs, a month before Iwata confirmed the number.

2 gigs has been heavily rumored for over a year now; that number isn't anything new. What I find strange is that no one who cosigned the rumor in this topic said "that's what I heard, but it has 1GB more RAM".
 

Panajev2001a

GAF's Pleasant Genius
tessellation and POM need to be used, and back then, yeah, no game used those effects, since they are DX11 effects.

Tessellation = dynamic vertex creation. While you could not do it that neatly on the GPU before the addition of Geometry Shaders (TRUFORM and the like were not exactly what the market wanted), which sit after the Vertex Shaders (which operate on one vertex at a time and cannot create new ones), you could do it on consoles.
You could do it quite neatly on the PS2, where the vertex pipeline was probably much more flexible than some people give it credit for: completely programmable (on the VUs) and fully capable of generating vertices on the fly and sending them to the GS to be rendered.
Many developers are probably doing the same on the Xbox 360, when they are not using the built-in tessellator, and on the PS3 (where the SPUs already tend to pre-process geometry before sending it to the RSX; see the EDGE tools).

Parallax mapping is also present in current-generation games; see Ghostbusters, for example.
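To make the "dynamic vertex creation" point concrete, here is a minimal CPU-side sketch of one tessellation scheme (plain midpoint subdivision), the general kind of geometry amplification a VU or SPU job could perform before handing vertices on to be rendered. The structure and names are illustrative, not any console's actual API:

```cpp
#include <array>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
using Tri = std::array<Vec3, 3>;

static Vec3 midpoint(const Vec3& a, const Vec3& b) {
    return { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f };
}

// One tessellation pass: split every triangle into four by inserting the
// midpoint of each edge. Each pass quadruples the triangle count.
static std::vector<Tri> subdivide(const std::vector<Tri>& tris) {
    std::vector<Tri> out;
    out.reserve(tris.size() * 4);
    for (const Tri& t : tris) {
        Vec3 ab = midpoint(t[0], t[1]);
        Vec3 bc = midpoint(t[1], t[2]);
        Vec3 ca = midpoint(t[2], t[0]);
        out.push_back({t[0], ab, ca});
        out.push_back({ab, t[1], bc});
        out.push_back({ca, bc, t[2]});
        out.push_back({ab, bc, ca});  // the centre triangle
    }
    return out;
}

int main() {
    std::vector<Tri> mesh = { Tri{ Vec3{0,0,0}, Vec3{1,0,0}, Vec3{0,1,0} } };
    for (int pass = 0; pass < 3; ++pass)
        mesh = subdivide(mesh);
    std::printf("Triangles after 3 passes: %zu\n", mesh.size());  // 64
    return 0;
}
```

The rasteriser only ever sees the amplified mesh, which is exactly the "vertices generated on the fly" effect being described.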
 

Zarx

Member
For those of you calling bullshit, please see this link:

Official screenshots of the Wii U version of Darksiders 2, not the PC version scaled down. From this week as well.

This is from The Verge, a trustworthy site.

Please stop; the Wii U is going to be a player next gen.

I'm on iPad, so if someone wants to post the images, feel free.

http://www.theverge.com/gaming/2012/9/14/3330908/darksiders-2-wii-u-screenshots#3786553

You can find most of those screens here: http://www.videogamer.com/xbox360/darksiders_2/screenshots.html
They're from before the PS3/360/PC launch, and they are exactly the same, except that the Verge's are lower resolution.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
So... WiiU is 64-bit like the Nintendo 64. And now that I think of it, the Xbox came with a Pentium III, which calculates numbers up to 32 bits long. That means the "128-bit" generation was a misleading label. Thanks!
The bitness matter is a tricky one. Many N-bit CPUs have M-bit registers (where M > N) for various specialized purposes. For instance, the venerable x86 has 80-bit 'extended precision' (IIRC) FP registers (well, technically a stack of those), which the FPU used either in full or as 64- or 32-bit formats. Most PPCs since the beginning of the lineage have had double-precision (i.e. 64-bit) FPUs while employing a 32-bit general-purpose register file. And this is before even touching on SIMD register files. So 128-bitness was really reached a long time ago, just not in the general-purpose register file. The reason is that 128 bits are hardly justifiable for individual general-purpose scalars. Just to give you a hint: a 128-bit uint used as a coordinate could span the known universe at the nm level.
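That closing hint checks out if you run the numbers (the roughly $8.8 \times 10^{26}$ m diameter of the observable universe is the standard estimate, not a figure from this thread):

$$
2^{128}\,\text{nm} \approx 3.4 \times 10^{38}\,\text{nm} = 3.4 \times 10^{29}\,\text{m} \;\gg\; 8.8 \times 10^{26}\,\text{m}
$$

So a 128-bit unsigned coordinate with 1 nm resolution covers the observable universe a few hundred times over.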
 

wsippel

Banned
5 processes per core, where did that come from? So each core is more like a separate CPU with multiple cores? Surely you can't get 5 threads using hyper-threading, or can you?
Sure. Probably makes the most sense with CIC (Clustered-Integer-Core), where each core has several parallel pipelines with their own execution units. Though as far as I'm aware, IBM never used that approach - it's a DEC Alpha and AMD Bulldozer thing.
 

Thrakier

Member
No one is denying that, but the infamous trailer is CG.

Sure it is, but the point is, if they'd had the gameplay trailer ready at that time, it would've wowed people the exact same way. Hell, it even did when they showed it several years later. They pretty much fulfilled what they promised without going into details.

However, that's not the problem. I expect companies to exaggerate the quality of their products. The problem is that you claim Nintendo doesn't do that. They do, and they do it heavily. Just think of "you will say wow". Or what they promised Wii motion control would bring to gaming versus what it really brought. Stating that Nintendo "isn't lying through its teeth" is pretty naive. If anything, they are among the worst, and they make a shitload of money with that.
 

ozfunghi

Member
Sure it is, but the point is, if they'd had the gameplay trailer ready at that time, it would've wowed people the exact same way. Hell, it even did when they showed it several years later. They pretty much fulfilled what they promised without going into details.

However, that's not the problem. I expect companies to exaggerate the quality of their products. The problem is that you claim Nintendo doesn't do that. They do, and they do it heavily. Just think of "you will say wow". Or what they promised Wii motion control would bring to gaming versus what it really brought. Stating that Nintendo "isn't lying through its teeth" is pretty naive. If anything, they are among the worst, and they make a shitload of money with that.

I said "wow". A different kind of "wow" but "wow" nevertheless :)
 

Goodlife

Member
Sure it is, but the point is, if they'd had the gameplay trailer ready at that time, it would've wowed people the exact same way. Hell, it even did when they showed it several years later. They pretty much fulfilled what they promised without going into details.

However, that's not the problem. I expect companies to exaggerate the quality of their products. The problem is that you claim Nintendo doesn't do that. They do, and they do it heavily. Just think of "you will say wow". Or what they promised Wii motion control would bring to gaming versus what it really brought. Stating that Nintendo "isn't lying through its teeth" is pretty naive. If anything, they are among the worst, and they make a shitload of money with that.

I said "wow" when I saw SMG
And Motion control bought me back into gaming.
 

Thrakier

Member
I said "wow" when I saw SMG
And Motion control bought me back into gaming.

The very nice art style of SMG doesn't compensate for the tons of games which looked like shit from 2001, made even worse by being upscaled on an HDTV.

The concept of motion control was great; the execution was an ugly accident. It could've been so much better, but it wasn't even close to what Nintendo made us believe beforehand. Instead of proving their concept with a new generation, they pussy out and throw an underpowered DS clone on the market.

I'm guessing no one, not even the most hardcore ninty fanboys here on GAF, is as excited for Wii U as they were for Wii back then.
 

Goodlife

Member
The very nice art style of SMG doesn't compensate for the tons of games which looked like shit from 2001, made even worse by being upscaled on an HDTV.

The concept of motion control was great; the execution was an ugly accident. It could've been so much better, but it wasn't even close to what Nintendo made us believe beforehand. Instead of proving their concept with a new generation, they pussy out and throw an underpowered DS clone on the market.

I'm guessing no one, not even the most hardcore ninty fanboys here on GAF, is as excited for Wii U as they were for Wii back then.

Sorry, you missed out on some key words in my post.

"I" and "me"

I don't care what you thought about it, or what you think I should think.
So I'll repeat.

I said "wow" when I saw SMG
And Motion control bought me back into gaming.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I'm guessing no one, not even the most hardcore ninty fanboys here on GAF, is as excited for Wii U as they were for Wii back then.
Not sure if I meet the 'most hardcore ninty fanboys' criterion, but I am way more excited for the WiiU than I was for the Wii, and I love IR aiming. Must be something about the launch (window) lineups of the two consoles.
 

IdeaMan

My source is my ass!
That's a lot of insiders you named there. And all of those people didn't know that Wii U had 2gigs of total system memory? Or did they all just feel the extra gig wasn't worth mentioning? I think that 1gig "that applications can use" number comes from a non-final version of the Wii U dev kit.

Well, I kinda revealed this whole memory situation in February, back when 90% of GAFers doubted this huge size reserved for the OS, as explained in some posts earlier in this thread, with links to those discussions that were central to Wii U speculation thread 2, I think.

But I understand your position; take all those reveals with a huge grain of salt. That's the best you can get though. You've seen what happens when a dev publicly leaks something (well, unwillingly in this case) like the updated version of the GamePad: their jobs are at stake because they signed NDAs. So don't expect more than 54545434 layers of anonymity. The correct course of action is to listen to their reveals, and judge the accuracy of their info once it's confirmed or not.
 

Eteric Rice

Member
The very nice art style of SMG doesn't compensate for the tons of games which looked like shit from 2001, made even worse by being upscaled on an HDTV.

The concept of motion control was great; the execution was an ugly accident. It could've been so much better, but it wasn't even close to what Nintendo made us believe beforehand. Instead of proving their concept with a new generation, they pussy out and throw an underpowered DS clone on the market.

I'm guessing no one, not even the most hardcore ninty fanboys here on GAF, is as excited for Wii U as they were for Wii back then.

As a fellow who works at a retail video game store, I can tell you that you'd be surprised how many people are happy that Nintendo is going back to buttons primarily. One of the biggest dislikes about the Wii (from the core gamers I've talked to) was the moving-around aspect of the system.
 

mrklaw

MrArseFace
You are making the quick assumption of thinking that there wouldn't be a learning curve involved in developing games for the PS4/720.


If they are x86 CPUs with AMD GPUs, some of the third-party games could easily look great just by using higher-quality PC assets. Oddly, in that case it's the first-party games that might suffer :)
 

Donnie

Member
1823 is 2.5x Wii CPU clock though (rounded up).

Yes, and looking further into it, I see that the Wii's audio chip was 121.5MHz. So it's very likely that the WiiU's audio chip is the same 121.5MHz, rather than the 120MHz we've heard. That would make 1823MHz possible for the CPU.
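For reference, the arithmetic behind that guess. Broadway's 729 MHz is the Wii CPU's known clock; the assumption being made is that the Wii U's clocks, like the Wii's, are integer multiples of a common 121.5 MHz base:

$$
729\ \text{MHz} \times 2.5 = 1822.5\ \text{MHz}, \qquad \frac{1822.5}{121.5} = 15 \qquad (\text{cf. Wii: } 729 = 6 \times 121.5)
$$

A 120 MHz base gives no such clean ratio, which is why 121.5 MHz would make the rumored ~1823 MHz CPU clock plausible.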
 
That's ok because they haven't paired it with a modern GPU :p


It's a shame though if the WiiU doesn't stretch its legs until later in 2013, because by then you'll have shots and clips of PS4/720 stuff coming through. Perhaps that's understandable from 3rd-party devs with a limited budget/time, but perhaps Nintendo should have pushed to get something more graphically advanced out for launch. Pikmin 3 is the best thing coming up, and even that isn't exactly pushing anything.

The PS4 and 720 will be in the same position at launch as the U is now: rushed ports and games looking similar to current-gen games in terms of eye candy. We won't see games stretching the legs of the PS4 and 720 until late 2014. The 360 launch got a great deal of flak, with most of its launch games looking no better than Xbox games; the same was true of the PS3 and the PS2.

And by the time late 2014 comes along, developers will have learnt a great deal about squeezing more out of the U. That's partly why starting a generation first is a huge advantage.

The eye candy of Retro's new game, as well as Monolith's new game and the latest 3D Mario, should be close to that of the first-party launch titles from Sony and Microsoft studios.
 