
Wii U Speculation Thread 2: Can't take anymore of this!!!


Bullza2o

Member
It depends how they go about getting it working. Naturally, they could just force the Wii U's eShop to only connect to one region and limit accounts to work only in the region of their creation, but that would mean you wouldn't be able to use your account (or the games stored on it) if you ever moved to another country.

If they let you connect regardless of the region of the account, it'd cause problems: if you buy games on the American eShop and then choose to log on to the Japanese eShop via a Japanese Wii U, you'd lose access to any content not present on the Japanese eShop, so this is unlikely.

Alternatively, they could have it work like PSN. Accounts are separated by region, and the account's region designates which region's PSN the system connects to; an American account connects to the American eShop, a Japanese account to the Japanese eShop.

In short, the following are possible:

A. Regional accounts, eShop region tied to console, accounts only usable in their own region
B. Regional accounts, eShop region alternates based on accounts region, accounts usable worldwide

We have to hope for B. It really depends on whether they want people using the Wii U on holiday.
Thanks for the analysis. Yes, I am really hoping that B is the case.
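To make the difference between A and B concrete, here's a quick toy sketch of the two lookup rules (Python, my own hypothetical names; obviously not how Nintendo would actually implement it):

Code:
ESHOP_BY_REGION = {"US": "American eShop", "JP": "Japanese eShop"}

def eshop_option_a(console_region, account_region):
    # Option A: the eShop region is tied to the console, and the account
    # only works in the region where it was created.
    if account_region != console_region:
        raise PermissionError("account unusable outside its home region")
    return ESHOP_BY_REGION[console_region]

def eshop_option_b(console_region, account_region):
    # Option B (PSN-style): the account's region decides which eShop you
    # connect to, regardless of where the console itself was bought.
    return ESHOP_BY_REGION[account_region]

print(eshop_option_b("JP", "US"))   # "American eShop": purchases travel with you
# eshop_option_a("JP", "US") would raise instead: the account is region-locked

Under A your library dies at the border; under B it follows the account.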
 

disap.ed

Member
Is it possible that we're looking at graphical power differences between all three consoles in the range below for next generation, or will the gap be broader? If the Wii U were using a 28nm GPU, what would the TDP be for something along the lines of a 7750 equivalent in power, designed for a console?

Wii U: Radeon HD 7750
Xbox: Radeon HD 7770
PS4: Radeon HD 7790

If we get something in the 7750 or 7770 range of power for next gen, I'd be very happy for all three to be in that range, and ecstatic if the Wii U were even 7750-equivalent.

I'm not even sure it's a given that the Wii U GPU will be over 1200 GFLOPS, but if it is, would that be around 5 times the graphical processing power of the Xbox 360's GPU?

What would be the max TDP a GPU could have and still fit in the Wii U case? If it's under 66W for the GPU, then something like a 7750 in power would be out of the question. What if we looked at something in the mobility range, tweaked up to a power level of about 5x the Xbox 360?

5870M: TDP 50W, 40nm, 1120 GFLOPS
6950M: TDP 50W, 40nm, 1113 GFLOPS

What is the consensus on the TDP the Wii U is likely to consume? How much goes to the GPU, the CPU, the drive, and other parts?

I could see something like a 7750 happening in the Wii U, but with lower clock rates (the 7750 will be clocked around 900 MHz, I guess). These cards will be revealed in two weeks, so we'll see how much power they draw, but when I see that a 7950 @ 900 MHz with 1792 shader units and 3GB of GDDR5 RAM draws around 150W, I'm really optimistic it will be well below 100W. GCN also seems to scale really nicely, so I could see a 50W GPU with 768 shader units @ 600-700 MHz as a possibility.
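As a back-of-envelope check on the 1200 GFLOPS / 5x question above: peak single-precision throughput for these parts is usually estimated as shader units x 2 ops per clock (multiply-add) x clock. A quick sketch in Python (the figures are the commonly quoted ones, not confirmed specs):

Code:
def peak_gflops(shader_units, clock_mhz):
    # shaders x 2 ops/clock (multiply-add) x clock in GHz
    return shader_units * 2 * clock_mhz / 1000.0

xenos = peak_gflops(240, 500)   # Xbox 360's Xenos, usually quoted at ~240 GFLOPS
print(1200 / xenos)             # 5.0 -- 1200 GFLOPS is exactly 5x Xenos
print(peak_gflops(768, 650))    # ~998 GFLOPS for a 768 SP part at 650 MHz

So yes, a 1200 GFLOPS part would be about 5x the 360 on raw shader throughput, with all the usual caveats about FLOPS not telling the whole story.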
 

DCKing

Member
Which GPU within the RV7xx / Radeon 4xxx series best fits with twice the performance of Xenos?
The RV770LE we've been discussing as the devkit part does slightly better than twice the Xenos performance in all areas except shaders (it does almost 4x the performance in that).

I kind of figured the 2x rumour was about that, and the 5x rumour factored in CPU and memory as well. It's pretty nonsensical to talk about "X times" more powerful though, especially if you have no idea what was measured at all.
 

z0m3le

Banned
Just for reference, the HD 7750M is 36W TDP, 768 SPs @ 650 MHz, and 1070 GFLOPS.

Wii U will use a custom part, but it's nice to know that they have room for something like this:
The HD 7950M is 50W TDP, 1408 SPs @ 700 MHz, and 1749 GFLOPS.

Both are very powerful compared to the PS360, and even if the next boxes pushed 2000 GFLOPS you wouldn't really see a large difference; at best you'd see games at 720p on Wii U, or this gen's HD games at 640p while the other guys push 720p...
 

darthdago

Member
The RV770LE we've been discussing as the devkit part does slightly better than twice the Xenos performance in all areas except shaders (it does almost 4x the performance in that).

I kind of figured the 2x rumour was about that, and the 5x rumour factored in CPU and memory as well. It's pretty nonsensical to talk about "X times" more powerful though, especially if you have no idea what was measured at all.

OK, but let's assume they measured the same performance areas as they did with the early dev kit.
What would be in range then?

Does anyone think it's possible that Nintendo surprises everyone, digs deep into their pockets, and puts an HD7*** in the Wii U (if they haven't already planned that with AMD; I hear birds whispering)?
 

ExReey

Member
Uh-oh, Nintendo GAF is not going to like this.
Source: In Theory: Can Wii U Offer Next-Gen Power? [Note: registration required]

"It's been a week of measurements, comparisons and percentages. The next-generation Xbox will be six times more powerful than the 360, and 20 per cent more powerful than the Wii U. As for Nintendo's next-gen console, apparently that's packing twice the power of Microsoft's current offering.
...

Can anyone tell me how that works out, mathematically?
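For what it's worth, the article's own numbers don't reconcile; a two-line check (taking the 360 as the unit):

Code:
wiiu = 2.0      # "twice the power of Microsoft's current offering"
nextbox = 6.0   # "six times more powerful than the 360"

print(nextbox / wiiu)   # 3.0 -- that's 3x the Wii U (200% more), not "20 per cent"

To be only 20% ahead of a 2x-360 Wii U, the next Xbox would have to be 2.4x the 360, not 6x.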
 

Pociask

Member
Zelda is a particularly good candidate for Nintendo-style DLC.
There are several examples in Zelda's history, starting with The Wind Waker, where content was cut from a game for various reasons (two dungeons in TWW, for deadline reasons, in December 2002).

So with well-integrated DLC, instead of having lazy developers just cutting dungeons, saying frak to heart piece containers and dropping full heart containers in Phantom Hourglass (in a stupid minigame, when you're not just buying them outright), leaving micro-islands with nothing to do, or caves full of rupees and nothing else (TP, anyone?), just create a DLC entry point. DLC is a good answer sometimes.
And a few weeks after release, push the new content: make the cave openable with a bomb, make an orb react to a particular song, have an NPC ask you for something new, etc.

This doesn't mean the game won't be packed full of content; I'd prefer a more complete main quest with 18 hearts on the retail disc, plus some good DLC later that lets me play more side quests and find up to an extra 24 Pieces of Heart.

This would also allow the Zelda team to go crazy-go-nuts on dungeon design. Instead of designing content for someone who had never played Zelda before, they can design dungeons that involve advanced mechanics and utilize every item obtained in the game proper. They don't have to fit into a story - heck, the end of the DLC dungeon could just be fighting a boss from an old game. Heck, have the loot be a Yoshi doll.
 
The RV770LE we've been discussing as the devkit part does slightly better than twice the Xenos performance in all areas except shaders (it does almost 4x the performance in that).

I kind of figured the 2x rumour was about that, and the 5x rumour factored in CPU and memory as well. It's pretty nonsensical to talk about "X times" more powerful though, especially if you have no idea what was measured at all.

Err, if it's an RV770LE, you think Nintendo is going to go with a 256-bit-wide bus, which is anathema to console makers? (Especially Nintendo, of all people.)

Alternatively, an RV770LE with a 128-bit bus would probably require a lot of re-engineering; I doubt Nintendo paid AMD to go back and fundamentally rip up an old-ass chip. If it's in that class, it'd be an RV740 Pro or 770CE. I'd call the 740 Pro MUCH more likely (though I think even that's too ambitious, and I predict an RV730, as I have from day one).
 

Akai

Member
Does anyone think it's possible that Nintendo surprises everyone, digs deep into their pockets, and puts an HD7*** in the Wii U (if they haven't already planned that with AMD; I hear birds whispering)?

I don't think this is even realistic for the next MS and Sony systems coming a year or two down the road...
 

Caramello

Member
Err, if it's an RV770LE, you think Nintendo is going to go with a 256-bit-wide bus, which is anathema to console makers? (Especially Nintendo, of all people.)

Alternatively, an RV770LE with a 128-bit bus would probably require a lot of re-engineering; I doubt Nintendo paid AMD to go back and fundamentally rip up an old-ass chip. If it's in that class, it'd be an RV740 Pro or 770CE. I'd call the 740 Pro MUCH more likely (though I think even that's too ambitious, and I predict an RV730, as I have from day one).

Err, he said the chip in the dev kit, not the retail hardware.
 

z0m3le

Banned
Historically, dev kit boxes usually contain older hardware than anything in the final box. The PS3, for instance, had a GeForce 6 series GPU in its dev box; this was obviously not what the PS3 ended up with (a custom GeForce 7 series part).

Microsoft's 360 ended up with a custom chip that wasn't even ready in 2005, and it outperforms the PS3's custom GeForce 7 series chip even though the 360 came out a year earlier.

Nintendo using an HD 4000 series chip in their dev kits means nothing about their final hardware. It's likely an HD 7000 series chip, possibly mobile in nature.

The HD 7750M is a great place to start, as I mentioned a little up the page: 36W, 1 TFLOP+, clocked at 650 MHz.
 
Can anyone tell me how that works out, mathematically?
[Image: impossible objects optical illusion]
 
Err, he said the chip in the dev kit, not the retail hardware.

Umm, so they're shipping dev kits with twice the memory bandwidth of retail? Don't think it works that way.

Microsoft's 360 ended up with a custom chip that wasn't even ready in 2005, and it outperforms the PS3's custom GeForce 7 series chip even though the 360 came out a year earlier.

Yeah, but legend has it the 360's early dev kits were vastly underpowered compared to the final hardware, in contrast to the above. That's the only plausible way to do it. And that's why the 360 didn't have anything very "next gen" until Gears of War a year after launch.
 

disap.ed

Member
Just for reference, the HD 7750M is 36W TDP, 768 SPs @ 650 MHz, and 1070 GFLOPS.

Wii U will use a custom part, but it's nice to know that they have room for something like this:
The HD 7950M is 50W TDP, 1408 SPs @ 700 MHz, and 1749 GFLOPS.

Both are very powerful compared to the PS360, and even if the next boxes pushed 2000 GFLOPS you wouldn't really see a large difference; at best you'd see games at 720p on Wii U, or this gen's HD games at 640p while the other guys push 720p...

May I ask where these numbers are from?

Yeah, but legend has it the 360's early dev kits were vastly underpowered compared to the final hardware, in contrast to the above. That's the only plausible way to do it. And that's why the 360 didn't have anything very "next gen" until Gears of War a year after launch.

So will the Wii U *believe*


Head: explodes
 

StevieP

Banned
Just for reference, the HD 7750M is 36W TDP, 768 SPs @ 650 MHz, and 1070 GFLOPS.

Wii U will use a custom part, but it's nice to know that they have room for something like this:
The HD 7950M is 50W TDP, 1408 SPs @ 700 MHz, and 1749 GFLOPS.

Both are very powerful compared to the PS360, and even if the next boxes pushed 2000 GFLOPS you wouldn't really see a large difference; at best you'd see games at 720p on Wii U, or this gen's HD games at 640p while the other guys push 720p...

All of the TDP figures you've seen for the ATI 7000 series cards (aside from the already-released models) are fake. At least they were the last time I looked, a month or so ago.

Yeah, but legend has it the 360's early dev kits were vastly underpowered compared to the final hardware

Ding ding, motherfucking ding. Do me a favour and read this over and over again. And then apply it to ALL consoles, not just the ones that you favour.
 

TunaLover

Member
Now that I remember that Kotaku info about the Wii U having 8GB of internal memory, I wonder how they got access to it... With Nintendo being as secretive as it is, that info could have been leaked by Ubisoft, but that French site (01net?) didn't mention anything about it...
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Ding ding, motherfucking ding. Do me a favour and read this over and over again. And then apply it to ALL consoles, not just the ones that you favour.

I thought I was the only one who saw the irony in that statement.
 
Edit:

The novelty of posting wore off really fast. Just seeing this up here is making me ill so I'm removing it.

Don't worry, it has been saved for posterity by AceBandage :p. It was actually a decent first post too; certainly more interesting than mine was, IIRC.

Do you mean Metroid Prime?
Yes, it does look familiar indeed.

MP3C to be pedantic :p.

Why don't we get two tone consoles? Is it a cost thing, or a design thing?

I'd love to see the Wii U with some sort of color scheme (black and white, white and blue, or whatever).

It's Nintendo not wanting to put colour into their home consoles because of the Fisher-Price toy, "Nintendo is kiddy" meme thing. I'm pretty sure Iwata and/or Miyamoto stated this in an Iwata Asks not too long ago.

I want a time-travel dungeon à la the Spirit Temple, but with "Link timeshift" in the dungeon, so you go from kid to adult via a portal in every room that needs it, or something.

Also, when do I "graduate" from Junior Member to Member? I feel as if Junior Member is like the GAF stone Homer had to drag on The Simpsons.

Also, was I the *only* one hoping ALL of Lanayru would timeshift when you heal ze d-ragon?

3 months of membership + 300 posts, I believe.

Oh and for those vexed about the Wii U Circle Pads not being analogue sticks, Nintendo has stated that they are superior to the 3DS Circle Pads and have more travel to them. So as Vinci wisely said, we are just going to have to suck it and see. Game software is always designed to take into account the strengths and weaknesses of its control system so as long as the devs are competent I am hopeful that their decision will prove to be the correct one.
 

onilink88

Member
Well, true. All I wish for is downloading JP demos though.

It was possible to have a US PSN account as well as a Japanese one on the PS3. I got myself one just for the sake of downloading the Resonance of Fate demo. All I had to do was make a new account on the PS3 and choose Japan as my region when signing up for PSN. It'd be nice if we could do something similar with the Nintendo Network, but I'm guessing we'll most likely be locked to whatever region we're in...

On another note, I just came out of that, uh, "Wii U not a Brute Force whatever" thread... Christ, that thing was a fucking travesty.
 
All of the TDP figures you've seen for the ATI 7000 series cards (aside from the already-released models) are fake. At least they were the last time I looked, a month or so ago.



Ding ding, motherfucking ding. Do me a favour and read this over and over again. And then apply it to ALL consoles, not just the ones that you favour.


LOL... like that's gonna happen.
 
Zelda is a particularly good candidate for Nintendo-style DLC.
There are several examples in Zelda's history, starting with The Wind Waker, where content was cut from a game for various reasons (two dungeons in TWW, for deadline reasons, in December 2002).

This doesn't mean the game won't be packed full of content; I'd prefer a more complete main quest with 18 hearts on the retail disc, plus some good DLC later that lets me play more side quests and find up to an extra 24 Pieces of Heart.

Very much agree. If handled correctly, Zelda would be absolutely perfect for DLC.
 
One thing I've been thinking about again recently: the early devkits supposedly used a triple-core, dual-threaded PowerPC. The final silicon is supposedly different. I wonder if it would make sense to replace that CPU with six single-threaded PPC47x cores. You don't really need 64-bit integer arithmetic in a console, and the PPC476, for example, is small, low-power, highly scalable, out-of-order, has a very short pipeline, and is pretty good at number crunching.

Answering this seriously this time, what if they went with 4-way SMT instead of 2-way?

I could see something like a 7750 happening in the Wii U, but with lower clock rates (the 7750 will be clocked around 900 MHz, I guess). These cards will be revealed in two weeks, so we'll see how much power they draw, but when I see that a 7950 @ 900 MHz with 1792 shader units and 3GB of GDDR5 RAM draws around 150W, I'm really optimistic it will be well below 100W. GCN also seems to scale really nicely, so I could see a 50W GPU with 768 shader units @ 600-700 MHz as a possibility.

I don't really see Nintendo using GCN. They'll most likely beef up the CPU first.

OK, but let's assume they measured the same performance areas as they did with the early dev kit.
What would be in range then?

Does anyone think it's possible that Nintendo surprises everyone, digs deep into their pockets, and puts an HD7*** in the Wii U (if they haven't already planned that with AMD; I hear birds whispering)?

Something no more than 1.5 TFLOPS (I doubt it comes close to that), and I personally doubt the latter.

Err, if it's an RV770LE, you think Nintendo is going to go with a 256-bit-wide bus, which is anathema to console makers? (Especially Nintendo, of all people.)

You didn't answer my question.

Historically, dev kit boxes usually contain older hardware than anything in the final box. The PS3, for instance, had a GeForce 6 series GPU in its dev box; this was obviously not what the PS3 ended up with (a custom GeForce 7 series part).

Microsoft's 360 ended up with a custom chip that wasn't even ready in 2005, and it outperforms the PS3's custom GeForce 7 series chip even though the 360 came out a year earlier.

Nintendo using an HD 4000 series chip in their dev kits means nothing about their final hardware. It's likely an HD 7000 series chip, possibly mobile in nature.

The HD 7750M is a great place to start, as I mentioned a little up the page: 36W, 1 TFLOP+, clocked at 650 MHz.

The PS3 had those GeForce 6s in dual SLI. And as I've already mentioned, I doubt we'll see GCN. As I mentioned in the 2x thread, I can see the possibility of Nintendo making their own unified shader architecture that is neither VLIW- nor GCN-based.
 
Ding ding, motherfucking ding. Do me a favour and read this over and over again. And then apply it to ALL consoles, not just the ones that you favour.


Why would it apply to all consoles? It's decided on a case-by-case basis.

The point in question was why Nintendo would be using dev kits with twice as wide a bus as retail. That would be a decrease in power from dev kit to retail, which isn't really possible.
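For context, the bandwidth arithmetic behind that point: at the same memory clock, halving the bus width halves peak bandwidth. A quick sketch (the GDDR3 numbers are illustrative, RV770LE-class, not confirmed specs):

Code:
def peak_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    # bytes moved per transfer x transfers per second
    return (bus_width_bits / 8) * effective_mt_s / 1000.0

print(peak_bandwidth_gb_s(256, 1800))   # ~57.6 GB/s: 256-bit devkit-class card
print(peak_bandwidth_gb_s(128, 1800))   # ~28.8 GB/s: same memory on a 128-bit bus

Games tuned against the devkit's bandwidth would fall apart on retail hardware with half of it.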
 
The main reason you can't make a logical point is, first, your skewed view of the Wii U, which doesn't match up with anything we know, and second, that I still don't really know what your expectations are for the other consoles. I don't think I've ever seen you state them. What are your expectations for the two consoles?



Other consoles: At least ~10X current.

Wii U: 1.0-1.5X current.
 

StevieP

Banned
Why would it apply to all consoles? It's decided on a case-by-case basis.

The point in question was why Nintendo would be using dev kits with twice as wide a bus as retail. That would be a decrease in power from dev kit to retail, which isn't really possible.

The PS3, as already mentioned, had SLI 6800s in its dev kits. Tell me, pray tell, how did they end up with the RSX? Why not two RSXs SLI'd on a bus? The parts are there as an approximate equivalency in early dev kits, not as an exact analogue. My god, man. It's not a case-by-case basis. It's almost universally the case that dev kits are weaker than the final hardware.

Other consoles: At least ~10X current.

Wii U: 1.0-1.5X current.

Show me the proof of either claim. Without using off-the-cuff developer comments.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
What's all this about eating shit?
 
Developers getting to use the Cell for GPU functions is kind of responsible for those amazing first-party efforts, correct? Any chance of seeing developer savvy like that with our current rumors, or was that situation unique to the Cell because it was kind of designed to be a GPU too?
 
The PS3, as already mentioned, had SLI 6800s in its dev kits. Tell me, pray tell, how did they end up with the RSX? Why not two RSXs SLI'd on a bus? The parts are there as an approximate equivalency in early dev kits, not as an exact analogue. My god, man. It's not a case-by-case basis. It's almost universally the case that dev kits are weaker than the final hardware.

SLI 6800s, though, would provide a good approximation of the 7800 GTX, which was what ended up in the PS3...

I don't know the history of every dev kit like I do the 360's, but I've seen no proof that dev kits are "almost universally less powerful". Anyway, it's still case by case. If you were pushing new exotic tech like the PS2 back in the day, I can see weakling dev kits. If, however, you're shooting for something far less than the current state of the art...

But none of this changes the technical infeasibility of pushing out dev kits with 256-bit buses if you're planning 128-bit in retail, which was the topic... The Wii U could in turn gain power over dev kit revisions and still end up at 1-1.5X as I predict; the two aren't mutually exclusive.

Show me the proof of either claim. Without using off-the-cuff developer comments.

There is no proof, as there are no known specs for any of the machines, especially the PS4/XB3, which likely don't exist in firm form yet, rumors notwithstanding. I believe I know enough from Nintendo's history, the demos shown (all at 720p with no AA, by the way), reliable rumors (01net), and public statements by Nintendo and other devs to guesstimate that the Wii U is much closer to this gen than "next". I believe I know enough from Microsoft's and Sony's history and strategy, as well as the Samaritan demo, to assume they will produce a full leap worthy of being called next gen. Any such leap will be at least 10X based on the current state of technology (remember, 10X is a much more conservative leap when a gen lasts 7-8 years than in previous times, when generations lasted 5 years).

Stick an HD 6870 in a next-gen console and it's probably close enough to be called 10X. And that's a pretty moderate, mid-to-low card by PC standards.
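Sanity-checking that with the usual raw-FLOPS yardstick (crude, but it's the one everyone uses here):

Code:
xenos_gflops = 240                     # Xbox 360's Xenos, commonly quoted peak
hd6870_gflops = 1120 * 2 * 900 / 1000  # Barts XT: 1120 SPs x 2 ops x 900 MHz = 2016

print(hd6870_gflops / xenos_gflops)    # ~8.4x on shader throughput alone

Add the bandwidth and architectural gains on top of that, and "close enough to 10X" is a fair shorthand.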
 

User Tron

Member
Answering this seriously this time, what if they went with 4-way SMT instead of 2-way?
It makes no sense for a gaming console. First, you have to be aware that OoOE and SMT try to achieve the same thing: feeding the processing units so that they don't idle. OoOE does this by executing operations before they are due, while SMT does this by checking whether another thread on the core could use a processing unit that would otherwise idle. Having both only makes sense in an environment that is hard to predict, which doesn't apply to gaming code. The compiler can help you with OoOE, but not so much with SMT, as it has little or no control over which threads are running on which core. So having six non-SMT cores instead of three cores with 2-way SMT is far better suited to a gaming console, IMHO. Not to mention you'd probably get more real processing units.
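To illustrate the trade-off with a toy model (one issue slot per core, each thread independently stalling with some probability; purely illustrative, real cores are far more complicated):

Code:
import random

def ipc(threads_per_core, cores, stall_prob, cycles=100_000):
    # The core's single issue slot is used in a cycle if ANY resident
    # thread has an instruction ready (i.e. didn't stall).
    random.seed(0)
    used = 0
    for _ in range(cycles):
        for _ in range(cores):
            if any(random.random() > stall_prob for _ in range(threads_per_core)):
                used += 1
    return used / cycles

# Predictable gaming code stalls rarely, so extra cores beat SMT:
print(ipc(2, 3, stall_prob=0.1))   # ~2.97 -- three cores with 2-way SMT
print(ipc(1, 6, stall_prob=0.1))   # ~5.40 -- six plain cores

With rare stalls, the second SMT thread barely finds idle slots to fill, while six real cores nearly double throughput.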
 

StevieP

Banned
specialguy said:
SLI 6800s, though, would provide a good approximation of the 7800 GTX, which was what ended up in the PS3...

I don't know the history of every dev kit like I do the 360's, but I've seen no proof that dev kits are "almost universally less powerful". Anyway, it's still case by case. If you were pushing new exotic tech like the PS2 back in the day, I can see weakling dev kits. If, however, you're shooting for something far less than the current state of the art...

There is no proof, as there are no known specs for any of the machines, especially the PS4/XB3, which likely don't exist in firm form yet, rumors notwithstanding. I believe I know enough from Nintendo's history, the demos shown (all at 720p with no AA, by the way), reliable rumors (01net), and public statements by Nintendo and other devs to guesstimate that the Wii U is much closer to this gen than "next". I believe I know enough from Microsoft's and Sony's history and strategy, as well as the Samaritan demo, to assume they will produce a full leap worthy of being called next gen. Any such leap will be at least 10X based on the current state of technology (remember, 10X is a much more conservative leap when a gen lasts 7-8 years than in previous times, when generations lasted 5 years).

Oh my god.
 
Developers getting to use the Cell for GPU functions is kind of responsible for those amazing first-party efforts, correct? Any chance of seeing developer savvy like that with our current rumors, or was that situation unique to the Cell because it was kind of designed to be a GPU too?

I hope we don't see anything like that in the Wii U. And developers didn't "get to use" Cell; they had no choice if they wanted to make something that looked good, because of how poor the RSX was as a GPU.
 

Luigiv

Member
Why do people continue to waste effort replying to specialguy? He seems incredibly intent on justifying his name, so we should just accept him for how he is. He's a "special" guy, after all.
 
Other consoles: At least ~10X current.

Wii U: 1.0-1.5X current.


1.0-1.5? At 1.0 it can barely run 720p at 30fps; add in the extra load for tablet controllers and you basically have a machine that runs slower than a 360. Might as well put a Radeon 6620G in there if you want 1.5x the power. Where did you get this idea?

10x current is like having a 7870, which will be what, a 150W part on 28nm at $299 retail? I won't discount it, but how expensive would the console be in that case? $499 without Kinect, $599 with Kinect? Yeah, that's really going to sell like hotcakes.
 
There is no proof, as there are no known specs for any of the machines, especially the PS4/XB3, which likely don't exist in firm form yet, rumors notwithstanding. I believe I know enough from Nintendo's history, the demos shown (all at 720p with no AA, by the way), reliable rumors (01net), and public statements by Nintendo and other devs to guesstimate that the Wii U is much closer to this gen than "next". I believe I know enough from Microsoft's and Sony's history and strategy, as well as the Samaritan demo, to assume they will produce a full leap worthy of being called next gen. Any such leap will be at least 10X based on the current state of technology (remember, 10X is a much more conservative leap when a gen lasts 7-8 years than in previous times, when generations lasted 5 years).

I'm starting to believe you have no clue and are just spouting subjective numbers.

It makes no sense for a gaming console. First, you have to be aware that OoOE and SMT try to achieve the same thing: feeding the processing units so that they don't idle. OoOE does this by executing operations before they are due, while SMT does this by checking whether another thread on the core could use a processing unit that would otherwise idle. Having both only makes sense in an environment that is hard to predict, which doesn't apply to gaming code. The compiler can help you with OoOE, but not so much with SMT, as it has little or no control over which threads are running on which core. So having six non-SMT cores instead of three cores with 2-way SMT is far better suited to a gaming console, IMHO. Not to mention you'd probably get more real processing units.

I agree, but we're also talking about a gaming console that needs to support more than one uPad. The I/O ARM core will help, but I can see them needing more than that to support multiple controllers. I believe some changes may have been made relating to supporting more than one controller effortlessly.

Also, can anyone explain why a 256-bit-wide bus is frowned upon by developers?

Not devs, the hardware makers. That would increase the motherboard's complexity. But if someone wanted to avoid using eDRAM, they could go that route.
 