
Rumor: Wii U final specs

Earendil

Member
It has compute shader support, so yes, it is a GPGPU, but again, the R700 is terrible at running this code. Also, the best thing to run physics on is a GPGPU, but again, they run this code on the CPU. It's like driving a moped and saying it can run the quarter mile, so then it's a drag car...

So if they designed this low-power GPU to make up for the old, poor-performing CPU, it's worse than I thought. Now we have the power at 45 watts, which I said months ago... The Wii U is just poorly designed. The system is not balanced at all, and why in the world do they need 1 GB of RAM for the OS? Are you kidding me...

You said there was no such thing as a GPGPU...
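
For anyone following the actual technical claim here: "GPGPU" just means the GPU can run general-purpose work (physics, particles, collision) as many parallel shader threads instead of on the CPU. Below is a minimal sketch of the idea, written in CUDA purely for illustration; the Wii U is AMD hardware with its own toolchain, and the Particle type and integrate kernel are hypothetical names, not from any real SDK.

// One GPU thread integrates one particle: the canonical "physics on
// the GPGPU" workload this argument is about.
#include <cuda_runtime.h>
#include <cstdio>

struct Particle { float3 pos; float3 vel; };

__global__ void integrate(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vel.y -= 9.81f * dt;       // apply gravity
    p[i].pos.x += p[i].vel.x * dt;  // semi-implicit Euler step
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

int main() {
    const int n = 1 << 16;          // 65,536 particles
    Particle* d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));
    integrate<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    printf("stepped %d particles on the GPU\n", n);
    cudaFree(d_p);
    return 0;
}

Whether a given GPU runs this sort of code well (the actual point of contention) depends on how its architecture handles compute-heavy, divergent workloads, which is where the R700-versus-newer-designs argument comes from.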
 

dark10x

Digital Foundry pixel pusher
Xbox used "DX8" while PS2 was more in line with DX7. So I'm not sure what you are talking about...

What I meant was that PS2 effects were in line with what we saw from DX7. Compare PS2 games to Xbox games and you will see effects missing on a constant basis. Sometimes it's because the PS2 wasn't powerful enough for them, and other things were simply only possible thanks to the nearly two extra years the Xbox spent in development.
I know exactly what you mean, but that's still a weird thing to say. PS2 was capable of a lot of things that DX7 class PC hardware was not. The hardware is just too different from what was available on the PC at that time to draw a real comparison.
 
Blimey, since the U spec thread got moved to Community I hadn't seen anything posted by USC-fan in ages. I see nothing has changed, lol.

And of course the praise from named developers regarding the GPU is also nonsense...right..?

And just thought I'd also point out that the GPU, along with every single other Radeon HD GPU for years, is R700-based. That doesn't mean it's an R700 GPU.
 

Earendil

Member
Blimey, since the U spec thread got moved to Community I hadn't seen anything posted by USC-fan in ages. I see nothing has changed, lol.

And of course the praise from named developers regarding the GPU is also nonsense...right..?

And just thought I'd also point out that the GPU, along with every single other Radeon HD GPU for years, is R700-based. That doesn't mean it's an R700 GPU.

It's been mentioned a hundred times, but his head is still in the sand.
 

USC-fan

Banned
Blimey, since the U spec thread got moved to Community I hadn't seen anything posted by USC-fan in ages. I see nothing has changed, lol.

And of course the praise from named developers regarding the GPU is also nonsense...right..?

And just thought I'd also point out that the GPU, along with every single other Radeon HD GPU for years, is R700-based. That doesn't mean it's an R700 GPU.

Right, because those are all DX10.1 cards with the exact same feature set as the R700... lol

Everything posted backs up what I've been saying for months, even down to the power consumption. Go back to that thread and look at the GPUs people were posting: 80, 90, 110 watt cards. The whole system uses 45 watts. Like I said months ago, just look at the small case.
 

stupidvillager

Neo Member
It has compute shader support, so yes, it is a GPGPU, but again, the R700 is terrible at running this code. Also, the best thing to run physics on is a GPGPU, but again, they run this code on the CPU. It's like driving a moped and saying it can run the quarter mile, so then it's a drag car...

So if they designed this low-power GPU to make up for the old, poor-performing CPU, it's worse than I thought. Now we have the power at 45 watts, which I said months ago... The Wii U is just poorly designed. The system is not balanced at all, and why in the world do they need 1 GB of RAM for the OS? Are you kidding me...

Or... Nintendo and AMD could have created a custom part not really resembling anything in the R700 range or retail scene and made it a viable GPGPU. Let's just say that maybe this part took a couple of years to design and was just finished last year.
 

USC-fan

Banned
Or... Nintendo and AMD could have created a custom part not really resembling anything in the R700 range or retail scene and made it a viable GPGPU. Let's just say that maybe this part took a couple of years to design and was just finished last year.

The problem is they started with a poor design. AMD already had a better design out: a DX11 design that runs compute shader code a lot better.

I think this goes back to that story about how they got a great deal on this old R700 design. It's all about keeping costs down. Best bang for your buck.
 

IdeaMan

My source is my ass!
Weak compared to what? So you're saying the system is balanced?

The way some people talk about the CPU, it's like it's a hindrance for developers, and I never once heard my sources say that. Now maybe other studios have run into issues with their projects, but I suspect it's more about coding problems, how to exploit the CPU, rather than its performance.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
The way some people talk about the CPU, it's like it's a hindrance for developers, and I never once heard my sources say that. Now maybe other studios have run into issues with their projects, but I suspect it's more about coding problems, how to exploit the CPU, rather than its performance.

Amusing to see this line used to defend Nintendo hardware given the ridicule that was heaped on Sony fans when it was used to defend the PS2. :p
 

stupidvillager

Neo Member
Right, because those are all DX10.1 cards with the exact same feature set as the R700... lol

Everything posted backs up what I've been saying for months, even down to the power consumption. Go back to that thread and look at the GPUs people were posting: 80, 90, 110 watt cards. The whole system uses 45 watts. Like I said months ago, just look at the small case.

It's also been said by devs that it exceeds DX10.1 and SM4 (it doesn't use them anyway) but has capabilities that they don't.
 

USC-fan

Banned
The way some people talk about the CPU, it's like it's a hindrance for developers, and I never once heard my sources say that. Now maybe other studios have run into issues with their projects, but I suspect it's more about coding problems, how to exploit the CPU, rather than its performance.
Doubt you will hear many problems from those porting PS360 games.

The real issue is next year when the next-gen systems launch from MS and Sony.

The insider says the Wii U has the power to run Xbox 360 and PlayStation 3 ports with little difficulty. But they predict trouble when major third-party companies start producing games for the next Xbox and PlayStation, which will be about 6-8 times more powerful than the current 360 and PS3, according to several Kotaku sources who are aware of Sony and Microsoft's plans for those machines.
http://kotaku.com/5920931/the-wii-us-power-problem

People also seem to forget about the second screen it has to render. So not only is it going to be way less powerful than the next-gen systems, it's also going to have to do more.
 
Doubt you will hear many problems from those porting PS360 games.

The real issue is next year when the next-gen systems launch from MS and Sony.


http://kotaku.com/5920931/the-wii-us-power-problem

People also seem to forget about the second screen it has to render. So not only is it going to be way less powerful than the next-gen systems, it's also going to have to do more.

Must be true. Memo to GAFii: cancel all Wii U pre-orders. CANCEL CANCEL CANCEL.
 

stupidvillager

Neo Member
OK. So if the R700 core and R800 core are pretty similar, why not go with the cheaper one if you know you're going to be using a custom part anyway? I don't think Nintendo is going to use GCN, but they're also not going to use just an R700 core. It's a custom part.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
It's also anonymous, so it must be 100% unequivocally true.

"I don't like it, so it must be 100% unequivocally false" is an equally ridiculous position to take too, you realise?
 
Birddemorock.gif
 
Doubt you will hear many problems from those porting PS360 games.

The real issue is next year when the next-gen systems launch from MS and Sony.


http://kotaku.com/5920931/the-wii-us-power-problem

People also seem to forget about the second screen it has to render. So not only is it going to be way less powerful than the next-gen systems, it's also going to have to do more.

Actually, you're more likely to see developers having problems porting from the PS3 and 360 to the U than from the PS4 and 720 to the U, thanks to Nintendo's (surprisingly) forward-thinking architecture.

And the PS4 and 720, going by what we know of all three consoles thanks to IdeaMan, Bgassassin and lherre, should only be around twice as powerful as the U. Everything we know so far - 3MB of CPU cache compared to 1MB for the 360, four times as much RAM, 32MB of eDRAM for the GPU compared to 10MB, the DSP, the IO processor, the OoOE CPU, and what's likely to be a GPU with a DX11-equivalent feature set pushing anywhere between 600-800 GFLOPS - should make it 3 or 4 times more powerful than the 360, making the 720 around twice as powerful as the U... which in the grand scheme of things isn't a huge gap. That's the same ballpark in terms of power as last gen.

As for the second screen, didn't someone work out (sorry, can't remember who it was now!) that it only takes 12MFLOPS to bung something on there..?
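
For what it's worth, the GFLOPS figures being traded in this thread all come from the same back-of-the-envelope formula for AMD's VLIW parts: each stream processor can issue one multiply-add (two FLOPs) per clock, so GFLOPS = number of SPs × 2 × clock in GHz. As a purely illustrative example (the shader count and clock are assumptions, not confirmed specs), a hypothetical 640-SP part at 550 MHz works out to 640 × 2 × 0.55 = 704 GFLOPS, which is how estimates land in that 600-800 range.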
 
Not sure this is the best place to ask so I may end up posting it in another thread. It's also something I very easily could have missed.

What happened to the NFC reader? I haven't heard anything about it. I didn't see it mentioned in the NY reveal presentation. I didn't see it listed on Nintendo's spec sheet. It seems like this could have a lot of potential, but nobody is talking about it. If those "in the know" could comment on whether this is still included, I would appreciate it. I guess I'll find out on Nov 18th since I've already locked mine up :) Still, it would be good to know that this is still included (not that it changes my purchase decision).
 

USC-fan

Banned
Actually, you're more likely to see developers having problems porting from the PS3 and 360 to the U than from the PS4 and 720 to the U, thanks to Nintendo's (surprisingly) forward-thinking architecture.

And the PS4 and 720, going by what we know of all three consoles thanks to IdeaMan, Bgassassin and lherre, should only be around twice as powerful as the U. Everything we know so far - 3MB of CPU cache compared to 1MB for the 360, four times as much RAM, 32MB of eDRAM for the GPU compared to 10MB, the DSP, the IO processor, the OoOE CPU, and what's likely to be a GPU with a DX11-equivalent feature set pushing anywhere between 600-800 GFLOPS - should make it 3 or 4 times more powerful than the 360, making the 720 around twice as powerful as the U... which in the grand scheme of things isn't a huge gap. That's the same ballpark in terms of power as last gen.

As for the second screen, didn't someone work out (sorry, can't remember who it was now!) that it only takes 12MFLOPS to bung something on there..?
It's fun to make up numbers. It's easier to port from the X720 or PS4 than from the PS360? Now I've heard it all!!! What is this forward-thinking design? lol

I feel sorry for Bgassassin. People come up with these crazy posts and say "he told me so"! Lol, good luck Bgassassin!!

The Wii U GPU is half your GFLOPS numbers. It is impossible, given the power consumption, to reach anywhere close to the numbers you posted.
 

IdeaMan

My source is my ass!
It was real time AND had a 2nd scene running on the pad.

The amount of improvement Nintendo managed to reach since this demo (early dev kit, etc.) is astounding, thanks to the new dev kits, SDK, and other parameters. It's VERY promising for the second wave of Wii U titles.
 

IdeaMan

My source is my ass!
Not sure this is the best place to ask so I may end up posting it in another thread. It's also something I very easily could have missed.

What happened to the NFC reader? I haven't heard anything about it. I didn't see it mentioned in the NY reveal presentation. I didn't see it listed on Nintendo's spec sheet. It seems like this could have a lot of potential, but nobody is talking about it. If those "in the know" could comment on whether this is still included, I would appreciate it. I guess I'll find out on Nov 18th since I've already locked mine up :) Still, it would be good to know that this is still included (not that it changes my purchase decision).

It's still included. It's just not at the top of the list of things Nintendo needs to work on and complete right now.
 
The amount of improvement Nintendo managed to reach since this demo (early dev kit, etc.) is astounding, thanks to the new dev kits, SDK, and other parameters. It's VERY promising for the second wave of Wii U titles.

I'm super excited to see some ground-up Wii U games. Wonder when the first one will be shown.
 
Offscreen small gifs of non-realtime footage always look good.

Remember when Brute Force was "Bettar than Halo 2"?

Real-time footage (literally, it was controllable, if only through QTE gameplay), on an older dev version of the Wii U, and taken off-screen (the direct feed is better looking).

http://www.youtube.com/watch?v=GcapRBQoMWk

Mind you, the one on the floor was even better looking than what was shown at the presentation.
 
Not sure this is the best place to ask so I may end up posting it in another thread. It's also something I very easily could have missed.

What happened to the NFC reader? I haven't heard anything about it. I didn't see it mentioned in the NY reveal presentation. I didn't see it listed on Nintendo's spec sheet. It seems like this could have a lot of potential, but nobody is talking about it. If those "in the know" could comment on whether this is still included, I would appreciate it. I guess I'll find out on Nov 18th since I've already locked mine up :) Still, it would be good to know that this is still included (not that it changes my purchase decision).
Looks like it's still there, but none of the launch window games are doing anything with it.
 

onQ123

Member

Could next gen see one of the biggest improvements between console gens due to GPGPUs?


GPGPUs in consoles will be just like the Cell in the PS3: when devs use the Cell like it's a normal CPU, nothing special happens, but when they use it to its full potential, it makes it hard to believe it's the same console the other games are on. The same thing will happen with GPGPUs in consoles. Most of the games we are seeing now for the Wii U are using the GPGPU like a normal GPU, but when they start using it to pull off new graphical tasks, the Wii U will step well above the PS3 & Xbox 360.


But just like with the PS3, devs will take the shortest route to the finish line.
 

KageMaru

Member

I don't understand the point of posting this gif. Nothing in that gif is impossible on current-gen hardware. Gears of War used the same rain and water effects in 2006, for example.

It was a video recorded from an early Wii U dev unit, not a production Wii U unit running the demo.

It's not like the production Wii U unit would be any slower than an earlier dev kit, so I don't get the point.

Edit:


Could next gen see one of the biggest improvements between console gens due to GPGPUs?


GPGPUs in consoles will be just like the Cell in the PS3: when devs use the Cell like it's a normal CPU, nothing special happens, but when they use it to its full potential, it makes it hard to believe it's the same console the other games are on. The same thing will happen with GPGPUs in consoles. Most of the games we are seeing now for the Wii U are using the GPGPU like a normal GPU, but when they start using it to pull off new graphical tasks, the Wii U will step well above the PS3 & Xbox 360.


But just like with the PS3, devs will take the shortest route to the finish line.

GPGPUs will make some difference, but I question how big of an impact, since, as was mentioned at B3D, devs will always be hesitant to pull resources away from rendering for compute.
 

nikatapi

Member
The amount of improvement Nintendo managed to reach since this demo (early dev kit, etc.) is astounding, thanks to the new dev kits, SDK, and other parameters. It's VERY promising for the second wave of Wii U titles.

That sounds good; the bird demo on the show floor was impressive, with the separate rendering for the GamePad.

Too bad Nintendo failed to deliver a graphically impressive game (in comparison to the other HD consoles), and the Wii U is being regarded as an equal (or even lower-spec) machine.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
It has compute shader support, so yes, it is a GPGPU, but again, the R700 is terrible at running this code.
Which code is that?

Also, the best thing to run physics on is a GPGPU, but again, they run this code on the CPU.
They?

It's like driving a moped and saying it can run the quarter mile, so then it's a drag car...
Wait, I have a better one: it's like a gaffer who can type on a keyboard, who suddenly thinks he's an expert in everything that involves kbd usage.

So if they designed this low-power GPU to make up for the old, poor-performing CPU, it's worse than I thought. Now we have the power at 45 watts, which I said months ago... The Wii U is just poorly designed. The system is not balanced at all, and why in the world do they need 1 GB of RAM for the OS? Are you kidding me...
They're clearly kidding you. It's apparent at this stage.
 

fritolay

Member
I have an idea.

Can we make a definitive GAF Wii U hardware thread, where we nominate, say, the top 10 guys whose opinions we think everyone should hear? Have them state them in the OP and go from there. I think right now there is so much BS that this thread doesn't even make sense. Let's hear from the expert hardware Gaffers (we surely have some that people respect) and get that on the front page. Or just edit this one. Please.
 

LCGeek

formerly sane
I have an idea.

Can we make a definitive GAF Wii U hardware thread, where we nominate, say, the top 10 guys whose opinions we think everyone should hear? Have them state them in the OP and go from there. I think right now there is so much BS that this thread doesn't even make sense. Let's hear from the expert hardware Gaffers (we surely have some that people respect) and get that on the front page. Or just edit this one. Please.

Doubtful that would shut up certain types here or kill off the thread.
 

USC-fan

Banned
Could next gen see one of the biggest improvements between console gens due to GPGPUs?


GPGPUs in consoles will be just like the Cell in the PS3: when devs use the Cell like it's a normal CPU, nothing special happens, but when they use it to its full potential, it makes it hard to believe it's the same console the other games are on. The same thing will happen with GPGPUs in consoles. Most of the games we are seeing now for the Wii U are using the GPGPU like a normal GPU, but when they start using it to pull off new graphical tasks, the Wii U will step well above the PS3 & Xbox 360.


But just like with the PS3, devs will take the shortest route to the finish line.
You missed the point of what devs did with the Cell. They offloaded gfx onto the Cell. The holdback for systems today is the GPU; CPUs are just good enough. So taking away GPU power to run CPU compute code is just stupid. Poor design in a gaming console, since most of its work is GPU work.

Then again, if your CPU doesn't perform well, then you have to offload. But again, that's poor design because of an unbalanced system.
 
I have an idea.

Can we make a definitive GAF Wii U hardware thread, where we nominate, say, the top 10 guys whose opinions we think everyone should hear? Have them state them in the OP and go from there. I think right now there is so much BS that this thread doesn't even make sense. Let's hear from the expert hardware Gaffers (we surely have some that people respect) and get that on the front page. Or just edit this one. Please.

How much are tickets? USC gets my vote.
 

theman5141

Neo Member
It's fun to make up numbers. It's easier to port from the X720 or PS4 than from the PS360? Now I've heard it all!!! What is this forward-thinking design? lol

I feel sorry for Bgassassin. People come up with these crazy posts and say "he told me so"! Lol, good luck Bgassassin!!

The Wii U GPU is half your GFLOPS numbers. It is impossible, given the power consumption, to reach anywhere close to the numbers you posted.

Numbers may be ever-so-slightly idealistic, but there definitely is a case to be made for the feasibility of most, though not all, "next-gen" down-ports. Downporting, with a little effort and *gasp* ingenuity, shouldn't be above the expertise of most developers. Also, not sure why you think the floating-point numbers are closer to "half" of 600-800 GFLOPS, as this range has been regarded as quite a sensible estimate for some time now. Consensus is leaning more towards the lower end of that range, but within said range nonetheless. Even if the Wii U is clocked lower than 500 MHz, it will still allow for much higher performance levels vs. Xenos/Reality Synthesizer, simply by virtue of the fact that it's based on a GPU design 2-3 generations beyond the other two.

Not sure where you're deriving 45W from as the absolute ceiling for Wii U power consumption. Iwata clearly stated that the machine can consume up to 75W.
 

USC-fan

Banned
Which code is that?


They?


Wait, I have a better one: it's like a gaffer who can type on a keyboard, who suddenly thinks he's an expert in everything that involves kbd usage.


They're clearly kidding you. It's apparent at this stage.

1. Compute shader code.
2. Havok.
3-4. Blu, nice personal attack.

Please play again when you have something better to add.

Numbers may be ever-so-slightly idealistic, but there definitely is a case to be made for the feasibility of most, though not all, "next-gen" down-ports. Downporting, with a little effort and *gasp* ingenuity, shouldn't be above the expertise of most developers. Also, not sure why you think the floating-point numbers are closer to "half" of 600-800 GFLOPS, as this range has been regarded as quite a sensible estimate for some time now. Consensus is leaning more towards the lower end of that range, but within said range nonetheless. Even if the Wii U is clocked lower than 500 MHz, it will still allow for much higher performance levels vs. Xenos/Reality Synthesizer, simply by virtue of the fact that it's based on a GPU design 2-3 generations beyond the other two.

Not sure where you're deriving 45W from as the absolute ceiling for Wii U power consumption. Iwata clearly stated that the machine can consume up to 75W.
That range is wrong; it's impossible, in fact. The best case was 550 or so, but even that power consumption is too high, and that used the latest and greatest from AMD. 350-500 GFLOPS would be the correct range, and likely around 400-450. As for GPU design improvements, the gap will be even greater with the next-gen systems, seeing as the Wii U uses a 2008 design compared to a 2013 design in the next-gen systems, going by the leaks.

The power brick is 75W; no way it uses this much power. It's just not possible. It would be a terrible design, and no device runs at 100% of its PSU. 45-50W is what I've said since E3. We had the power brick from a gaffer, and he was 100% right.
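
The arithmetic behind that claim, for those following along: power bricks are rated above sustained draw, and consoles typically pull somewhere around 60-70% of the PSU rating under load (that percentage is a rule-of-thumb assumption, not a measured figure). On a 75W brick that gives roughly 75 × 0.6 ≈ 45W to 75 × 0.7 ≈ 52W, which is where the 45-50W estimate comes from.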
 

Cuth

Member
It's also anonymous, so it must be 100% unequivocally true.
I never get this kind of reasoning, given that being anonymous often means being able to speak freely.
Of course this doesn't mean that everything said by anonymous developers is true, but it's also true that, publicly, almost everyone in the game industry won't say bad things about a certain console, even if they are unhappy with some choices made by the hardware maker.
 

Earendil

Member
I never get this kind of reasoning, given that being anonymous often means being able to speak freely.
Of course this doesn't mean that everything said by anonymous developers is true, but it's also true that, publicly, almost everyone in the game industry won't say bad things about a certain console, even if they are unhappy with some choices made by the hardware maker.

Plenty of people went on record saying bad things about the Wii.
 

onQ123

Member
You missed the point of what devs did with the Cell. They offloaded gfx onto the Cell. The holdback for systems today is the GPU; CPUs are just good enough. So taking away GPU power to run CPU compute code is just stupid. Poor design in a gaming console, since most of its work is GPU work.

Then again, if your CPU doesn't perform well, then you have to offload. But again, that's poor design because of an unbalanced system.

Think about what you just said!


So what's the problem?
 

ozfunghi

Member
I have an idea.

Can we make a definitive GAF Wii U hardware thread, where we nominate, say, the top 10 guys whose opinions we think everyone should hear? Have them state them in the OP and go from there. I think right now there is so much BS that this thread doesn't even make sense. Let's hear from the expert hardware Gaffers (we surely have some that people respect) and get that on the front page. Or just edit this one. Please.

We all already know who to listen to:

Wsippel
Blu
BGassassin
 