
Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012


guek

Banned
TunaLover said:
What do you think the cost breakdown for the bundle will be?
I say:
20% controller
80% system

The high cost of the controller doesn't allow Nintendo to make a great investment in hardware, since they (historically) don't lose money on each unit sold, and they try to keep prices low.

nah, the controller is going to be cheap cheap cheap I think. I predict a price point of $300, assuming 720/ps4 don't come out until 2013. Then price drop PAZAAANG to $250 and everyone's like aaaaaw jeah bruthas, we gonna smash some bros.

Anywho, I don't think the controller will be that expensive to produce, though I might be wrong. I also think nintendo is going to take a slight loss on this puppy.
 
AlStrong said:
The figure for RSX is misleading because it isn't calculated right, plus it's a peak figure that doesn't distinguish between pixel and vertex shaders, or account for the fact that half of the pixel shader ALUs have to handle texture addressing instructions...

Xenos is unified, so it doesn't matter, and they handle vec4+1 as opposed to vec4 on RSX/G7x. Texture addressing is fully orthogonal. Lots of devs take advantage of this since you can, for instance, issue more texture instructions whilst ALU ops complete, or vice versa: if for some reason the dev is doing something texture heavy, they might as well do some extra math on some other shader simultaneously.

Anyways, the main point is that directly comparing GFLOPs when the architectures are fundamentally different is pointless.

Oh yeah. It was as much joking as anything else (hence the measurement in duct tape). I took the Nvidia/ATI difference into consideration, but was having too much fun with it. :p

But anyway that is the type of info I like to read, but don't think to search out. I knew that 400 was a peak and felt it seemed high, but wasn't confident enough in my knowledge on it to disregard it like you did.
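For anyone wanting to sanity-check those peak numbers, here's a rough sketch of the usual back-of-envelope math in Python. The lane count and MADD accounting are assumptions chosen for illustration, not confirmed specs:

def peak_gflops(alus, lanes_per_alu, clock_ghz, flops_per_lane=2):
    # Peak GFLOPS = ALUs x lanes per ALU x FLOPs per lane per clock (2 for a MADD) x clock in GHz
    return alus * lanes_per_alu * flops_per_lane * clock_ghz

# Xenos: 48 unified ALUs, each treated as vec4+scalar (5 lanes), 500 MHz
print(peak_gflops(48, 5, 0.5))   # 240.0 -- the figure cited above

# Tweaking any one assumption (lane width, whether texture-address ALUs count,
# MADD vs plain MUL) moves the result a lot, which is why raw GFLOPS comparisons
# across different architectures are so misleading.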

AlStrong said:
You're assuming 256-bit buses for both memory types.

Yes and no (although the no essentially works out that way). Yes on the GPU, since there was a Game Watch article that indicated a 4870 was used in the dev kit, so I assume they would keep a 256-bit bus. The DDR3 calc was 64-bit, but using IBM's quad-channel memory interface.

Grampa Simpson said:
I really want you to start comping parts on these and calculating power budgets.

I don't think that DDR3 runs that fast in MHz, maybe in MT/s though....

You are alive! Actually, I have been trying to with those specs. For example, the leaked Thames Pro is 850MHz, 1408 ALUs and 90W. The leaked Lombok XL is 900MHz, 768 ALUs and 60W. As you can see, my GPU is closer to the Lombok XL. That's also why I focused on a split pool with DDR3. As for the CPU, if the PowerPC A2 can be built with 16 cores at 2.3GHz and be 65W, I would assume the one I'm proposing could be made under 40W.

As for the memory clock, when I searched that out a while back there seemed to be DDR3 capable of those clocks and faster. I went with that number due to the multiples.
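A quick sketch of the MT/s vs MHz distinction and the bandwidth math behind these numbers; the quad-channel 64-bit (256-bit total) interface is carried over from the posts above and is an assumption, not a confirmed spec:

def ddr_actual_clock_mhz(transfer_rate_mts):
    # DDR3 moves data twice per clock, so the "MHz" usually quoted is really MT/s
    return transfer_rate_mts / 2

def bandwidth_gbs(transfer_rate_mts, bus_bits):
    # bytes moved per transfer x transfers per second, in GB/s
    return transfer_rate_mts * (bus_bits / 8) / 1000

print(ddr_actual_clock_mhz(1822.5))   # 911.25 -- the actual clock behind "1822.5MHz" DDR3
print(bandwidth_gbs(1822.5, 256))     # 58.32 -- GB/s on a quad-channel 64-bit (256-bit total) interface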

TunaLover said:
What do you think the cost breakdown for the bundle will be?
I say:
20% controller
80% system

The high cost of the controller doesn't allow Nintendo to make a great investment in hardware, since they (historically) don't lose money on each unit sold, and they try to keep prices low.

I don't know how they would split it, but I do know that after looking closer at the patent I'm back to expecting the minimum price to be $349.

Luigiv said:
Sort of. bgassassin is going purely by processor power, which is identical in architecture but clocked 50% higher on the Wii. However, that isn't really the whole story. The Wii's other major enhancement is that its dog-slow 16MB ARAM chip was replaced with a blazing fast 64MB GDDR3 chip. This makes a huge difference and in practice allows the devs to extract more out of the processors than they would have otherwise.

Yep. Like I said, it was just having fun with the question Ace posed a while back, and I just got a little carried away.
 

guek

Banned
bgassassin said:
Yes on the GPU, since there was a Game Watch article that indicated a 4870 was used in the dev kit, so I assume they would keep a 256-bit bus.

oh yeeeeeeeeeeah I forgot about that article. But wasn't it discredited somehow?

That was back in June too. I still think it's insane that the month of E3 was about the last time we heard any real talks about what's in the box.
 

spanks

Member
I haven't been following this thread much, but: do we know if the control sticks have click 'buttons' (e.g. R3,L3)? Because if not, that seems like a major roadblock in porting games like COD/BF3. I guess they could move crouch/melee over to the touchscreen, but that drastically affects the playability. If they really did forget those 2 extra buttons, damn, they pretty much lost the COD audience before they even started.
 

Rolf NB

Member
Grampa Simpson said:
I really want you to start comping parts on these and calculating power budgets.

I don't think that DDR3 runs that fast in MHz, maybe in MT/s though....
A quick check on current graphics cards indicates that GDDR5 tops out at around 1.4GHz (2.8GHz "effective"). Half of his predicted 5.something.
 
spanks said:
I haven't been following this thread much, but: do we know if the control sticks have click 'buttons' (e.g. R3,L3)? Because if not, that seems like a major roadblock in porting games like COD/BF3. I guess they could move crouch/melee over to the touchscreen, but that drastically affects the playability. If they really did forget those 2 extra buttons, damn, they pretty much lost the COD audience before they even started.
There are no sticks.
 
BurntPork said:
Something tells me that this will be proven wrong yet again. I mean, Samaritan...
And who's gonna approve a game with that kind of budget? It's only a wet dream. Even if they can box a supercomputer into the PS4 and 360, it'll probably take until the generation after that for games with this kind of graphics to actually be produced, maybe bar a very few titles.
 

BurntPork

Banned
bgassassin said:
It's total system, BP, and I just realized I transposed my CPU clock, so while not significant it should be 156.4 GCs.

Actually Thunder, RSX is supposedly 400.4 GFLOPS while Xenos is 240. That reiterates how I felt that the 360 is essentially a "modern at the time" GameCube.

Yeah Doom. Going by FLOPS and the numbers given for Wii that's it, lol.

Guek, my (I don't know how many-th) version of my specs look like this based on my recent GPU prediction.


CPU - Tri-core 3.645GHz (16MB L2 cache) (87.48 GFLOPS)

Memory (assuming DDR3/GDDR5 split) - 1-1.5 GB of DDR3 1822.5MHz, 512MB-1GB of GDDR5 5467.5MHz (Don't let the ranges fool you. There would be no more than 2GB total and most likely 1.5GB total.)

GPU - 896 ALUs 911.25MHz (64MB? e1T-SRAM) (1632.96 GFLOPS)

Memory Bandwidth - GDDR5 175GB/s, DDR3 58.3GB/s (not sure if this is completely accurate)
No point in even mentioning 512MB, since that's the one thing we know it'll have more than.
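For reference, the arithmetic behind that spec sheet can be reproduced like this; the FLOPs-per-clock figures are assumptions chosen to match the posted numbers, not known Wii U specs:

cpu_gflops = 3 * 3.645 * 8            # cores x clock in GHz x assumed 8 FLOPs per core per clock
gpu_gflops = 896 * 2 * 0.91125        # ALUs x 2 FLOPs per clock (one MADD) x clock in GHz
gddr5_bw   = 5467.5 * 32 / 1000       # MT/s x 32 bytes per transfer (256-bit bus) -> GB/s

print(cpu_gflops)   # 87.48
print(gpu_gflops)   # 1632.96
print(gddr5_bw)     # ~175 (174.96)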

walking fiend said:
And who's gonna approve a game with that kind of budget? It's only a wet dream. Even if they can box a supercomputer into the PS4 and 360, it'll probably take until the generation after that for games with this kind of graphics to actually be produced, maybe bar a very few titles.
Actually, it will be possible to put the approximate power of a GTX 580 into a console if the GPU is on 28nm, so Samaritan will be possible on the next XBox and PlayStation. Nintendo could do it too if they'd take a bit of a risk to really show the hardcore that they're serious.
 
BurntPork said:
No point in even mentioning 512MB, since that's the one thing we know it'll have more than.

The 512MB is for the GDDR5.
BG is assuming it'll have two different allocations of RAM (one DDR3 and one GDDR5).

Personally, I think that 2GB is the MINIMUM that Nintendo will go with, rather than the maximum.
They want companies like Ubisoft and EA and Activision to gush over it, so they'll increase things to please them. Not drastically, but I could see 2.5GB in this thing. Maybe more.
 

Kenka

Member
DDR3 is RAM and GDDR5 would be VRAM in this case, right? What are the consequences if the GDDR5 is clocked at 2.8GHz and not 5 as bg said? Only the pixel fillrate would be halved, right? Does that mean we get only half the FPS for the same IQ?
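Roughly speaking, bandwidth scales linearly with the effective data rate, so halving the clock halves the bandwidth; whether that halves the frame rate depends on whether bandwidth is the bottleneck at all. A quick sketch, assuming the 256-bit bus from earlier posts:

def gddr5_bandwidth_gbs(effective_ghz, bus_bits=256):
    # effective (data-rate) clock in GHz x bytes per transfer
    return effective_ghz * (bus_bits / 8)

print(gddr5_bandwidth_gbs(5.4675))   # ~175 GB/s at bg's predicted data rate
print(gddr5_bandwidth_gbs(2.8))      # ~90 GB/s if the effective rate is only 2.8GHz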
 

BurntPork

Banned
AceBandage said:
The 512MB is for the GDDR5.
BG is assuming it'll have two different allocations of RAM (one DDR3 and one GDDR5).

Personally, I think that 2GB is the MINIMUM that Nintendo will go with, rather than the maximum.
They want companies like Ubisoft and EA and Activision to gush over it, so they'll increase things to please them. Not drastically, but I could see 2.5GB in this thing. Maybe more.
Oh. I took it that he meant "or," not "and." I don't see them splitting the RAM pool. Isn't that part of what caused the issues with PS3 development? All three consoles will use a shared pool, imo.
 
bgassassin said:
Memory (assuming DDR3/GDDR5 split) - 1-1.5 GB of DDR3 1822.5MHz, 512MB-1GB of GDDR5 5467.5MHz (Don't let the ranges fool you. There would be no more than 2GB total and most likely 1.5GB total.)

Zero chance the memory will work at those frequencies, it runs too hot. It is also very unlikely DDR3 will be used, way too slow.
 
BurntPork said:
Oh. I took it that he meant "or," not "and." I don't see them splitting the RAM pool. Isn't that part of what caused the issues with PS3 development? All three consoles will use a shared pool, imo.


No, split RAM is how all PC games are made.

Video RAM and System Memory. Though, with consoles, they can be shared a lot of times.
 

BurntPork

Banned
ElTopo said:
No way. None. I'm not even sure it'll necessarily come out for Wii U.
Why not? Chinatown Wars sold over 1 million, so there's an audience among Nintendo fans.

AceBandage said:
No, split RAM is how all PC games are made.

Video RAM and System Memory. Though, with consoles, they can be shared a lot of times.
True, but I figured that PC APIs made that easier to deal with than it is in a closed box. If the split pool isn't the issue, why do PS3 multiplat games tend to have problems with textures when compared to 360 versions?
 
Actually, it will be possible to put the approximate power of a GTX 580 into a console if the GPU is on 28nm, so Samaritan will be possible on the next XBox and PlayStation. Nintendo could do it too if they'd take a bit of a risk to really show the hardcore that they're serious.
As I said, it won't matter even if they put a supercomputer inside. Games that utilize the power won't be developed.
 

spanks

Member
Jaded Alyx said:
There are no sticks.
So the Wii U has 2 less buttons than the other major consoles... why hasn't anyone made a big deal out of it? It seems like the Wii U is built upon the idea of parity with the HD consoles on every level, but now any developer that's making a first/third person shooter is going to have to compromise their controller input for the Wii U versions. Like I mentioned, the touch screen would be okay for menus and maps, but stretching your thumbs over to the screen and guessing where the virtual melee button is in the heat of battle - and remember you're taking your thumb off the look/move stick at the same time - is stupid. I can't see any 'hardcore' (Call of Duty, GTA, Madden) players taking the console seriously, if that's the case.
 
BurntPork said:
Why not? Chinatown Wars sold over 1 million, so there's an audience among Nintendo fans.


True, but I figured that PC APIs made that easier to deal with than it is in a closed box. If the split pool isn't the issue, why do PS3 multiplat games tend to have problems with textures when compared to 360 versions?


Cell.
spanks said:
So the Wii U has 2 less buttons than the other major consoles... why hasn't anyone made a big deal out of it? It seems like the Wii U is built upon the idea of parity with the HD consoles on every level, but now any developer that's making a first/third person shooter is going to have to compromise their controller input for the Wii U versions. Like I mentioned, the touch screen would be okay for menus and maps, but stretching your thumbs over to the screen and guessing where the virtual melee button is in the heat of battle - and remember you're taking your thumb off the look/move stick at the same time - is stupid. I can't see any 'hardcore' (Call of Duty, GTA, Madden) players taking the console seriously, if that's the case.

There was a huge deal made about it at E3, actually.
And we have no idea if it will have clickable pads or not.
 

BurntPork

Banned
walking fiend said:
As I said, it won't matter even if they put a supercomputer inside. Games that utilize the power won't be developed.
I'm sure that was said back when Epic started showing off UE3...

Keep in mind that Samaritan ran at a higher resolution than 1080p, so it might be possible to turn some fluff down with a minimal loss in visual quality, at least to the naked eye.

AceBandage said:
Oh right. Cell sucks lol
 

AniHawk

Member
spanks said:
So the Wii U has 2 less buttons than the other major consoles... why hasn't anyone made a big deal out of it? It seems like the Wii U is built upon the idea of parity with the HD consoles on every level, but now any developer that's making a first/third person shooter is going to have to compromise their controller input for the Wii U versions. Like I mentioned, the touch screen would be okay for menus and maps, but stretching your thumbs over to the screen and guessing where the virtual melee button is in the heat of battle - and remember you're taking your thumb off the look/move stick at the same time - is stupid. I can't see any 'hardcore' (Call of Duty, GTA, Madden) players taking the console seriously, if that's the case.

if designers can't design around the lack of two buttons, they're in the wrong profession.
 
ElTopo said:
Quite a few seem to consider that quite a disappointment given the huge number of DS units sold, even though (the original) 2D GTAs were significantly less successful than their 3D counterparts. They might simply ask themselves whether it's worth it to port the game to Wii U and come to the conclusion that they'd be better off developing some DLC.

It's similar to why games didn't end up on the Gamecube - developers thought it wasn't worth the effort. I wouldn't be surprised to hear something like 'We'd love to do a Wii U version, but it wouldn't make sense if it's just the same game.'


I'm not so sure Nintendo will give them a choice.
They have championed the GTA series a few times. They'll get it on the Wii U one way or the other.
 
ElTopo said:
I like your optimism, but I'm a pessimist. After all these years I don't expect 3rd parties to truly support a Nintendo console, whatever the reasons may be.


If the reason is moneyhats, they won't say no.

In fact, I think that may be one of the reasons Nintendo is publishing Lego City Stories. To try and drum up a userbase for that kind of game.
 
AceBandage said:
If the reason is moneyhats, they won't say no.

In fact, I think that may be one of the reasons Nintendo is publishing Lego City Stories. To try and drum up a userbase for that kind of game.
I understand what you mean, but that sounds utterly bizarre in my head.
 
BurntPork said:
I'm sure that was said back when Epic started showing off UE3..
1. Are you? PC games already looked fabulous before UE3. Doom 3 and Far Cry were jaw dropping for their time, and they were real games. Is there currently any game that looks even nearly as detailed and complex as that demo?

2. Back then, game budgets weren't this crazy and the industry could sustain that kind of increase (well, actually many developers and publishers couldn't); but this time budgets will be so high that the industry in its current state won't be able to sustain them. The amount of money being spent just won't add up. Maybe multi-million sellers can sustain development costs of $100m+, but how many publishers are willing to take that kind of risk? A game would need to sell at least 2m copies at $50 just to bring in revenue equal to that, and that's before all the licensing they have to pay, distribution costs and marketing. And I believe we have already passed $100m, haven't we?
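The break-even math in that paragraph works out like this; the $50 price and the publisher's share of it are illustrative assumptions, not figures from the post:

budget = 100_000_000     # hypothetical development budget from the post above
price = 50               # assumed full retail price

print(budget / price)    # 2,000,000 copies just to gross the budget back

# If only part of the retail price actually reaches the publisher (platform
# licensing, distribution, retail cuts -- the split below is an illustrative guess),
# the break-even count climbs further:
publisher_share = 0.6
print(budget / (price * publisher_share))   # ~3,333,333 copies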

Keep in mind that Samaritan ran at a higher resolution than 1080p, so it might be possible to turn some fluff down with a minimal loss in visual quality, at least to the naked eye.
Well, by the same line of reasoning, you could say they may turn off even more fluff and make it look almost the same on yet another, weaker device.
 

BurntPork

Banned
walking fiend said:
1. Are you? PC games already looked fabulous before UE3. Doom 3 and Far Cry were jaw dropping for their time, and they were real games. Is there currently any game that looks even nearly as detailed and complex as that demo?

2. Back then, game budgets weren't this crazy and the industry could sustain that kind of increase (well, actually many developers and publishers couldn't); but this time budgets will be so high that the industry in its current state won't be able to sustain them. The amount of money being spent just won't add up. Maybe multi-million sellers can sustain development costs of $100m+, but how many publishers are willing to take that kind of risk? A game would need to sell at least 2m copies at $50 just to bring in revenue equal to that, and that's before all the licensing they have to pay, distribution costs and marketing. And I believe we have already passed $100m, haven't we?


Well, by the same line of reasoning, you could say they may turn off even more fluff and make it look almost the same on yet another, weaker device.
No point in arguing over the future. You never know what will happen.

*thread loses its point*

Rolf NB said:
Ridiculously excessive amounts of memory make that easier to deal with on PCs.
True.

And Luckyman is here. I can't see what he said, but I bet that I disagree with it.
 
Jaded Alyx said:
I understand what you mean, but that sounds utterly bizarre in my head.


Well, let's look at it.

Nintendo has said over and over again how they would love the GTA series on their systems (going so far as to get an exclusive one on the DS, despite it not being what fans wanted...).

To kind of appeal to R*, they'd need to prove that their kind of game can do well on the system.
So what do they do? They publish a game that is basically GTA, but more family friendly (heaven forbid Nintendo make it a Mature game, but that's another story).
Lego City Stories sells well, R* sees a market open on the Wii U (and 3DS maybe), and Nintendo now has a userbase for games like GTA and RDR.

It's quite brilliant, really.

They get a more "mature" audience on their console, without dirtying up their own image for game making.
 
AceBandage said:
Well, let's look at it.

Nintendo has said over and over again how they would love the GTA series on their systems (going so far as to get an exclusive one on the DS, despite it not being what fans wanted...).

To kind of appeal to R*, they'd need to prove that their kind of game can do well on the system.
So what do they do? They publish a game that is basically GTA, but more family friendly (heaven forbid Nintendo make it a Mature game, but that's another story).
Lego City Stories sells well, R* sees a market open on the Wii U (and 3DS maybe), and Nintendo now has a userbase for games like GTA and RDR.

It's quite brilliant, really.

They get a more "mature" audience on their console, without dirtying up their own image for game making.
Yeah, I get all that. It's just the 'LEGO opening the door to Grand Theft Auto' thing I found amusing.

I wouldn't be surprised though if Rockstar don't make that connection.
 
AceBandage said:
To kind of appeal to R*, they'd need to prove that their kind of game can do well on the system.
So what do they do? The publish a game that is basically GTA, but more family friendly (heaven forbid Nintendo make it a Mature game, but that's another story).


Nintendo have published "mature" titles in the past.
 
BurntPork said:
Wait, GTA V is being announced next week? Damn, then it won't be announced for Wii U, at least not yet. This is really bad...


I think they might say the game is not for the Nintendo audience or something lol. I hope they announce it though.
 

Kenka

Member
What are the actual odds of getting a link to a guide about PC component performance, and does anyone know what happened to that awesome poster who knew all things technical and had a footballer on his avatar (I forget his name right now)?
 
Kenka said:
What are the actual odds of getting a link to a guide about PC component performance, and does anyone know what happened to that awesome poster who knew all things technical and had a footballer on his avatar (I forget his name right now)?
brain_stew? Hmm..good question.
 
Nuclear Muffin said:
Nintendo will be lucky to get it at all. Grudges go a long way for a long time...
Grudges?
nickcv said:
wait, what happened?

why is BP banned?
I was gonna ask the same.
On Grand Theft Auto, GTA console games tend to release in spring and fall, so there's still a possibility of the Wii U getting a version.
 
Those controllers are gonna be EXPENSIVE and there'll be a shortage at launch, like usual.

If I have a job, I might get one, but that would force me to overhaul my TV/Receiver setup and re-program my remote...
 