
Rumor: Wii U final specs

Even if the CPU and GPU are only roughly around the 360's raw power, having 1GB of RAM for gaming will go a long way. The tablet overhead is the real wild card in all of this.
 

D-e-f-

Banned
Remember:

They used a video with scroll lag to show off Nintendo TVii, and I don't believe it will have scroll lag on the day it's released.
And now you've got screenshots with heavy jaggies. I think the final product will look slightly different.

I wonder if that could be related to how they're taking screenshots, with some smoothing filter or AA being applied via post-processing or something? (Note: no idea what I'm talking about here!)

I believe the screenshot tool in Steam sometimes works in a similar way, in that it doesn't capture some post-processing effects, which makes screenshots look different from how the game actually looks while you're playing.

I say this (not only to start my third paragraph in a row with a stupid "I..." sentence) because I haven't noticed any jaggies in actual gameplay footage of those NintendoLand games.
 
Everything but the lighting model, yes, it could. I'm not familiar with Ip5 though, so maybe it can do the lighting as well. Realistically, people won't begin seeing what the Wii U is truly capable of until second-generation games that are built from the ground up for Wii U (or ported PC to Wii U), and only from good developers.

I'm usually keen on these things, but I don't see anything impressive in those shots. A low-resolution buffer for the shadows, no GI anywhere that I can see (I think people said there was; someone please point it out if there is), and the same old bloom we've been seeing in games for years now.

For a very undemanding game, at that. Surprised they couldn't throw in some AA.
 
That conflicts with IBM Oban:
http://www.fudzilla.com/processors/item/25619-oban-initial-product-run-is-real

Honestly, moving to x86 would hurt 360 backwards compatibility, and it's too huge a selling point for them. I doubt all the x86 rumors for XB3 and find them all wishful thinking, but if you have a link to one that is compelling, I'll certainly take a look.

I always kinda thought the Oban was actually the WiiU SoC. The assumption that Oban was for MS was always just an educated guess based on prevailing rumors at the time. Back then everyone thought the 720 was definitely AMD+PPC, making it a reasonable guess.
 

KlotePino

Member
Really hope these are just some shots without the AA they'll implement, because it looks ridiculously aliased. I'm not gonna complain about how the Wii U has current-gen specs in next-gen times, but at least make your games look smooth and nice.
 

z0m3le

Banned
I always kinda thought the Oban was actually the WiiU SoC. The assumption that Oban was for MS was always just an educated guess based on prevailing rumors at the time. Back then everyone thought the 720 was definitely AMD+PPC, making it a reasonable guess.

Except last week Oban had manufacturing issues that could delay an Xbox 720 launch at the end of NEXT year. It's in the other link I listed. (Fact: Oban isn't Wii U's CPU.)
 

onQ123

Member
That conflicts with IBM Oban:
http://www.fudzilla.com/processors/item/25619-oban-initial-product-run-is-real

And this, from just last week, with rumors of delays:
http://www.ign.com/articles/2012/09/06/xbox-720-could-be-delayed-by-manufacturing-trouble

Honestly, moving to x86 would hurt 360 backwards compatibility, and it's too huge a selling point for them. I doubt all the x86 rumors for XB3 and find them all wishful thinking, but if you have a link to one that is compelling, I'll certainly take a look.

Well, there is this:

So, what is the XBox Next? SemiAccurate has been saying for a while that all signs were pointing toward a PowerPC, specifically an IBM Power-EN variant. The dark horse was an x86 CPU, but it was a long shot. It looks like the long shot came through, moles are now openly talking about AMD x86 CPU cores and more surprisingly, a newer than expected GPU. How new? HD7000 series, or at least a variant of the GCN cores, heavily tweaked by Microsoft for their specific needs.

http://semiaccurate.com/2012/09/04/microsoft-xbox-next-delay-rumors-abound/


& this

[image: specs-1.jpg]



Both saying that there was a switch from IBM to AMD.
 


God. Come on, Nintendo. It's not hard to make something better than this on 360-level hardware.

For some reason this shot clearly lacks the DoF present in that smaller shot. DoF helps a great deal with aliasing.

This has happened before with shaders "disappearing" from different Pikmin 3 captures.
 
For some reason this shot clearly lacks the DoF present in that smaller shot. DoF helps a great deal with aliasing.

This has happened before with shaders "disappearing" from different Pikmin 3 captures.

Indeed. Nice find.
There are even more differences: the "English" screenshot is missing some reflections/specular highlights. Must be another build of the game.
 

z0m3le

Banned
Well, there is this:



http://semiaccurate.com/2012/09/04/microsoft-xbox-next-delay-rumors-abound/


& this

[image: specs-1.jpg]



Both saying that there was a switch from IBM to AMD.
The first one sure is weird, since it sounds surprised that it's using HD7000, while the first reports back in January also said HD7000...

As for the second one, I don't know who Seronx is, but if his information is accurate:
1. No backwards compatibility for 360 or PS3 is basically confirmed.
2. Jaguar cores, even custom ones, would still be based on the Jaguar design, which is a low-frequency part... They likely couldn't be clocked to even 2.5GHz. (Jaguar parts currently target sub-2GHz speeds.)
3. All AMD CPUs lack SMT, so it's 1 thread per core, meaning both these consoles would lack the extra 2 threads found in current-gen hardware.
4. 192 GCN SPs is only 3 CUs... This is much, much lower than anyone expects out of these consoles... GFLOPS performance at 800MHz would be ~128% of the Xbox 360's (192 x 0.8GHz x 2 FLOPs = 307.2 GFLOPS, versus Xenos's ~240).

For these reasons, I highly doubt he is a good source; but if he is, Wii U > XB3/PS4, so make sure you have your preorder in.
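
A quick sanity check on the math in point 4, in C. (The 192 SPs, 800MHz clock, and 2 FLOPs per SP per cycle are the rumor's numbers; the ~240 GFLOPS Xenos figure is the commonly cited one for Xbox 360. Just a back-of-the-envelope sketch.)

#include <stdio.h>

int main(void) {
    const double sps             = 192.0; /* 3 CUs x 64 SPs each */
    const double clock_ghz       = 0.8;   /* assumed 800 MHz */
    const double flops_per_cycle = 2.0;   /* 1 fused multiply-add = 2 FLOPs */

    double gflops = sps * clock_ghz * flops_per_cycle;  /* = 307.2 */

    const double xenos_gflops = 240.0;    /* commonly cited Xbox 360 figure */

    printf("Rumored GPU: %.1f GFLOPS\n", gflops);
    printf("Relative to Xenos: %.0f%%\n", 100.0 * gflops / xenos_gflops);
    return 0;
}

Prints 307.2 GFLOPS and 128%, which is why I say that spec barely clears the 360.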
 

z0m3le

Banned
In this case they are going to be paired with a discrete GPU.

Is that the $150 raytracer he lists? Is the guy an insider? Because I have a hard time believing that Microsoft would walk away from their backwards compatibility and into a weaker CPU when all the reports we were getting said the complete opposite.
 
I am saying it shouldn't matter to you that I call that person disingenuous for reasons stated in my first reply. I assumed that you were taking issue with me making that claim.

Ah, OK. Well, for me, I saw what was in the OP before they came out.


If you do not mind: who are the two, what company do they work for, and what did they say?

They've already been mentioned, but considering all they've said publicly (lherre over the last year) I don't think we'll learn that info.
 

v1oz

Member
The change to x86 AMD on PS4/Xbox720 will make straight ports to the Wii U even more tricky. Notwithstanding the fact that software will need to be re-engineered to use far fewer hardware threads on Wii U.

Which makes me think publishers will continue to treat the Wii U as a last-gen console: recycle all their older PS3/360 tech/tools/engines and put their junior teams on Wii U games. Just like they did on the Wii.
 

onQ123

Member
The first one sure is weird, since it sounds surprised that it's using HD7000, while the first reports back in January also said HD7000...

As for the second one, I don't know who Seronx is, but if his information is accurate:
1. No backwards compatibility for 360 or PS3 is basically confirmed.
2. Jaguar cores, even custom ones, would still be based on the Jaguar design, which is a low-frequency part... They likely couldn't be clocked to even 2.5GHz. (Jaguar parts currently target sub-2GHz speeds.)
3. All AMD CPUs lack SMT, so it's 1 thread per core, meaning both these consoles would lack the extra 2 threads found in current-gen hardware.
4. 192 GCN SPs is only 3 CUs... This is much, much lower than anyone expects out of these consoles... GFLOPS performance at 800MHz would be ~128% of the Xbox 360's (192 x 0.8GHz x 2 FLOPs = 307.2 GFLOPS, versus Xenos's ~240).

For these reasons, I highly doubt he is a good source; but if he is, Wii U > XB3/PS4, so make sure you have your preorder in.

That's basically the CPU of the new consoles: like an APU in a laptop or desktop that also has a GPU.
 
The first one sure is weird, since it sounds surprised that it's using HD7000, while the first reports back in January also said HD7000...

As for the second one, I don't know who Seronx is, but if his information is accurate:
1. No backwards compatibility for 360 or PS3 is basically confirmed.
2. Jaguar cores, even custom ones, would still be based on the Jaguar design, which is a low-frequency part... They likely couldn't be clocked to even 2.5GHz. (Jaguar parts currently target sub-2GHz speeds.)
3. All AMD CPUs lack SMT, so it's 1 thread per core, meaning both these consoles would lack the extra 2 threads found in current-gen hardware.
4. 192 GCN SPs is only 3 CUs... This is much, much lower than anyone expects out of these consoles... GFLOPS performance at 800MHz would be ~128% of the Xbox 360's (192 x 0.8GHz x 2 FLOPs = 307.2 GFLOPS, versus Xenos's ~240).

For these reasons, I highly doubt he is a good source; but if he is, Wii U > XB3/PS4, so make sure you have your preorder in.

A Netbook CPU? I said wow!

Dat GPU better be a beast to do tons of GPGPU stuff...
 

Mr Swine

Banned
Is that the $150 raytracer he lists? Is the guy an insider? Because I have a hard time believing that Microsoft would walk away from their backwards compatibility and into a weaker CPU when all the reports we were getting said the complete opposite.

Well, you know that the more powerful the console you go for, the louder and hotter it becomes. And maybe they want to go with a small console this time?
 
Is that the $150 raytracer he lists? Is the guy an insider? Because I have a hard time believing that Microsoft would walk away from their backwards compatibility and into a weaker CPU when all the reports we were getting said the complete opposite.

Good questions. I haven't heard anything that suggests PS4 changed its GPU target, and MS is apparently leaving out certain details like Nintendo did, at least for now, so only those working directly with the hardware (and whoever they may tell) know just how well it performs. I'm assuming they're pushing the GPU like lherre said they did with Wii U a year ago.

But this guy isn't the first person to mention that both changed to Jaguar. The first mention I remember happened on GAF.
 

The_Lump

Banned
I watched the NL footage; that wasn't impressive or anything current gen couldn't do.





These aren't, either.


While I'm inclined to agree your comment may end up true on the whole (although I'll change my mind if we see AA implemented, as I think some of those effects at that resolution and framerate with AA would be quite impressive), you've got no basis for that statement. And neither does anyone arguing the contrary. We just don't know if it's possible on current gen. It might not look spectacular, but then again it might be employing subtle features which the 360's GPU cannot; we just don't know.

I'm playing devil's advocate a bit here. But sweeping matter-of-fact statements only incite arguments and don't encourage conversation.
 

z0m3le

Banned
The change to x86 AMD on PS4/Xbox720 will make straight ports to the Wii U even more tricky. Notwithstanding the fact that software will need to be re-engineered to use far fewer hardware threads on Wii U.

Which makes me think publishers will continue to treat the Wii U as a last-gen console: recycle all their older PS3/360 tech/tools/engines and put their junior teams on Wii U games. Just like they did on the Wii.

Wii U has a 3-core CPU, with the possibility that it runs 6 threads. With its own ARM CPU for the OS and a DSP for sound, Wii U is well equipped even with only 3 threads, and it might have 6.
Jaguar quad cores have a 100% chance of having 1 thread per core, so you are talking about 1 fewer thread at worst, and Wii U actually having 2 extra threads at best.

Jaguar also shouldn't be able to compete with a POWER7 architecture, even one shrunk down. Jaguar cores are built for tablets and netbooks, not laptops; those would be Trinity and its successor.

This rumor is highly unlikely IMO. Who here even wants it to be true? Jaguar cores mean a 0% chance of backwards compatibility.
 
Yeah, the BC thing troubles me about Durango going x86, especially since I think one of the recent rumors I trusted said MS was very concerned with BC.

Maybe they can do emulation. Maybe they don't care and ditch it. Maybe they are actually including a mini 360 chipset like the leaked docs suggested (this would be awful, IMO).
 
Yeah, the BC thing troubles me about Durango going x86, especially since I think one of the recent rumors I trusted said MS was very concerned with BC.

Maybe they can do emulation. Maybe they don't care and ditch it. Maybe they are actually including a mini 360 chipset like the leaked docs suggested (this would be awful, IMO).

How about an SKU ~50 bucks more expensive that has BC? So $399 for the standard SKU, then $450 or $499 for one with BC and some other random crap/accessories.
 

jerd

Member
I've got a question I always wonder about whenever somebody says that images will have to be rendered twice on Wii U. Is there a way for a frame to be rendered only once but shown twice, which would require less processing power?

http://www.youtube.com/watch?v=jDeDUZGhJA4


So here NBA 2K13 is being shown on both the TV and the GamePad at the same time, at all times. Does this mean the game is being rendered twice, or is it some other trickery?
 
Wii U has a 3-core CPU, with the possibility that it runs 6 threads. With its own ARM CPU for the OS and a DSP for sound, Wii U is well equipped even with only 3 threads, and it might have 6.
Jaguar quad cores have a 100% chance of having 1 thread per core, so you are talking about 1 fewer thread at worst, and Wii U actually having 2 extra threads at best.

Jaguar also shouldn't be able to compete with a POWER7 architecture, even one shrunk down. Jaguar cores are built for tablets and netbooks, not laptops; those would be Trinity and its successor.

This rumor is highly unlikely IMO. Who here even wants it to be true? Jaguar cores mean a 0% chance of backwards compatibility.

People (on GAF and elsewhere) only read:

"Jaguar 4 core amd x86 cpu" and start orgasm over it without checking what that would actually mean. Most people on GAF have no clue about specs yet participate in specs threads. Wii U hardware/specs threads should have taught you that :p
 

The_Lump

Banned
How about an SKU ~50 bucks more expensive that has BC? So $399 for the standard SKU, then $450 or $499 for one with BC and some other random crap/accessories.

Suppose they could do a 'PS3' and stump up the cash to include BC in the first gen of the hardware, then phase it out.


Speaking of BC: would code designed for Broadway be usable on a later PowerPC CPU? Or is that down to the compiler?
 

z0m3le

Banned
I've got a question I always wonder about whenever somebody says that images will have to be rendered twice on Wii U. Is there a way for a frame to be rendered only once but shown twice, which would require less processing power?

http://www.youtube.com/watch?v=jDeDUZGhJA4


So here NBA 2K13 is being shown on both the TV and the GamePad at the same time, at all times. Does this mean the game is being rendered twice, or is it some other trickery?

You can have a scene rendered once and sent to both screens, yes.
 
I've got a question I always wonder about whenever somebody says that images will have to be rendered twice on Wii U. Is there a way for a frame to be rendered only once but shown twice, which would require less processing power?

http://www.youtube.com/watch?v=jDeDUZGhJA4


So here NBA 2K13 is being shown on both the TV and the GamePad at the same time, at all times. Does this mean the game is being rendered twice, or is it some other trickery?

It's most likely rendered twice.

Not sure, though.
 
People on GAF only read:

"Jaguar 4 core amd x86 cpu" and start orgasm over it without checking what that would actually mean. Most people on GAF have no clue about specs yet participate in specs threads. Wii U hardware/specs threads should have taught you that :p

I don't think they saw the "Jaguar" part.
 

ozfunghi

Member
I've got a question I always wonder about whenever somebody says that images will have to be rendered twice on Wii U. Is there a way for a frame to be rendered only once but shown twice, which would require less processing power?

http://www.youtube.com/watch?v=jDeDUZGhJA4


So here NBA 2K13 is being shown on both the TV and the GamePad at the same time, at all times. Does this mean the game is being rendered twice, or is it some other trickery?

If it's the same image on both screens, it shouldn't be rendered twice, just downscaled for the controller screen... I believe.
 

The_Lump

Banned
Wii U has a 3-core CPU, with the possibility that it runs 6 threads. With its own ARM CPU for the OS and a DSP for sound, Wii U is well equipped even with only 3 threads, and it might have 6.
Jaguar quad cores have a 100% chance of having 1 thread per core, so you are talking about 1 fewer thread at worst, and Wii U actually having 2 extra threads at best.

Jaguar also shouldn't be able to compete with a POWER7 architecture, even one shrunk down. Jaguar cores are built for tablets and netbooks, not laptops; those would be Trinity and its successor.

This rumor is highly unlikely IMO. Who here even wants it to be true? Jaguar cores mean a 0% chance of backwards compatibility.


Yeah but....Jaguars! They've got claws n shit! Rawr!
 

mrklaw

MrArseFace
I've got a question I always wonder about whenever somebody says that images will have to be rendered twice on Wii U. Is there a way for a frame to be rendered only once but shown twice, which would require less processing power?

http://www.youtube.com/watch?v=jDeDUZGhJA4


So here NBA 2K13 is being shown on both the TV and the GamePad at the same time, at all times. Does this mean the game is being rendered twice, or is it some other trickery?

If it's the same view (looks like it), then you could just downscale the image and beam it to the GamePad without having to draw it twice.
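
Something like this, roughly, in plain C (just a sketch of the idea; the resolutions and buffer handling are my own assumptions, nothing from an actual SDK):

#include <stdint.h>
#include <stdlib.h>

#define TV_W  1280  /* assumed 720p frame for the TV */
#define TV_H   720
#define PAD_W  854  /* assumed GamePad resolution */
#define PAD_H  480

/* Nearest-neighbor downscale: reuse the one rendered TV frame for the
   GamePad instead of drawing the whole scene a second time. */
static void downscale(const uint32_t *tv, uint32_t *pad) {
    for (int y = 0; y < PAD_H; y++) {
        int sy = y * TV_H / PAD_H;        /* nearest source row */
        for (int x = 0; x < PAD_W; x++) {
            int sx = x * TV_W / PAD_W;    /* nearest source column */
            pad[y * PAD_W + x] = tv[sy * TV_W + sx];
        }
    }
}

int main(void) {
    uint32_t *tv  = calloc(TV_W * TV_H, sizeof(uint32_t));   /* rendered once */
    uint32_t *pad = malloc(PAD_W * PAD_H * sizeof(uint32_t));
    downscale(tv, pad);  /* scaling is far cheaper than a second render pass */
    /* ...tv goes out over HDMI, pad goes to the wireless video encoder... */
    free(tv);
    free(pad);
    return 0;
}

Rendering twice only becomes necessary when the two screens show different views.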
 
This rumor is highly unlikely IMO. Who here even wants it to be true? Jaguar cores mean a 0% chance of backwards compatibility.



I'm an armchair system designer at best, but I think 8 Jaguar cores might be pretty sweet.

That's a lot of cores that can do a lot of different things. Dedicate 2-3 to the OS? You've still got 5-6 left over. I'm guessing devs can do some Cell-like GPU-helping with all those cores too, if they want.

Lots of symmetric OoOE general-purpose cores might be pretty cool, IMO.
 

ozfunghi

Member
I don't think they saw the "Jaguar" part.

bg, about the new tweet from IBM: does it bring something new to the table, or how do you interpret it? Is it still possible that it's that "476" you and Wsippel believed to be the most likely candidate? Can we forget about a "3-core Broadway"?
 

jaxpunk

Member
I'm an armchair system designer at best, but I think 8 Jaguar cores might be pretty sweet.

That's a lot of cores that can do a lot of different things. Dedicate 2-3 to the OS? You've still got 5-6 left over. I'm guessing devs can do some Cell-like GPU-helping with all those cores too, if they want.

Lots of symmetric OoOE general-purpose cores might be pretty cool, IMO.

I'm a leg-rest system designer; we should get together.

I heard the Wii U is powered by magic. Hey, I should make a thread!

"The magic of the Wii U: open it up to see!!!!"
 
I'm an armchair system designer at best, but I think 8 Jaguar cores might be pretty sweet.

That's a lot of cores that can do a lot of different things. Dedicate 2-3 to the OS? You've still got 5-6 left over. I'm guessing devs can do some Cell-like GPU-helping with all those cores too, if they want.

Lots of symmetric OoOE general-purpose cores might be pretty cool, IMO.

Not really.

Using a 4-core Trinity would give a massive power boost and is far more suitable for a console than anything Jaguar-based.
 

z0m3le

Banned
I'm an armchair system designer at best, but I think 8 Jaguar cores might be pretty sweet.

That's a lot of cores that can do a lot of different things. Dedicate 2-3 to the OS? You've still got 5-6 left over. I'm guessing devs can do some Cell-like GPU-helping with all those cores too, if they want.

Lots of symmetric OoOE general-purpose cores might be pretty cool, IMO.

This:
Not really.

Using a 4-core Trinity would give a massive power boost and is far more suitable for a console than anything Jaguar-based.

But also, an IBM CPU with 2-way SMT and 4 cores would fit what we have been hearing about the 720 (a strong CPU). Jaguar cores are tablet cores, and having 8 of them won't really mean much if each thread is so slow that you can't get your framerate up to par with the 360. (8 small, slow threads from Jaguar, versus 8 fast and powerful threads from IBM, allowing backwards compatibility and way cooler stuff than you can do with Jaguar.)

It makes some sense for PS4: if they really are going APU, with only 1 GPU inside the APU (like the original target specs had), then moving to a smaller CPU like Jaguar would let you shrink your chip and ship a smaller unit. Obviously, a weak CPU can be offset somewhat by 1.8 TFLOPS of GPU.

Microsoft would have a more powerful box with a 4-core IBM processor and a 1.2 TFLOPS GPU, though. Although PS4 might push more polygons, Microsoft's console would hit higher FPS, run a full-featured OS in the background, and still run games with a gap similar to the one between 360 and PS3 now.
 

beril

Member
The change to x86 AMD on PS4/Xbox720 will make straight ports to the Wii U even more tricky. Notwithstanding the fact that software will need to be re-engineered to use far fewer hardware threads on Wii U.

Which makes me think publishers will continue to treat the Wii U as a last-gen console: recycle all their older PS3/360 tech/tools/engines and put their junior teams on Wii U games. Just like they did on the Wii.

Will that really have any effect for most developers? Do you even need to care about what instruction set you're using unless you're coding straight-up assembly? I'm not sure how much that is used in modern games; maybe for some small, highly optimized routines, but it wouldn't be too much of a hassle to rewrite some small chunks of assembler code. Granted, I haven't really programmed anything that needed a whole lot of optimization, but I've moved my game code over a range of different CPUs and just relied on the compilers to do the job.
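
To show what I mean: about the only code that cares which instruction set you're on is the hand-vectorized hot path, and that tends to live behind #ifdefs already. Rough C sketch (the intrinsics are real AltiVec/SSE calls; the function itself is just a made-up example):

#include <stddef.h>

/* The portable 99%: the compiler retargets this to any ISA. */
static void add_arrays_scalar(const float *a, const float *b,
                              float *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

#if defined(__ALTIVEC__)            /* PowerPC path (360/Wii family) */
#include <altivec.h>
static void add_arrays(const float *a, const float *b,
                       float *out, size_t n) {
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {    /* note: vec_ld needs 16-byte alignment */
        vector float va = vec_ld(0, &a[i]);
        vector float vb = vec_ld(0, &b[i]);
        vec_st(vec_add(va, vb), 0, &out[i]);
    }
    add_arrays_scalar(a + i, b + i, out + i, n - i);  /* leftover tail */
}
#elif defined(__SSE__)              /* x86 path (the rumored AMD CPUs) */
#include <xmmintrin.h>
static void add_arrays(const float *a, const float *b,
                       float *out, size_t n) {
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(&a[i]);
        __m128 vb = _mm_loadu_ps(&b[i]);
        _mm_storeu_ps(&out[i], _mm_add_ps(va, vb));
    }
    add_arrays_scalar(a + i, b + i, out + i, n - i);  /* leftover tail */
}
#else                               /* anything else: plain C fallback */
#define add_arrays add_arrays_scalar
#endif

Everything outside blocks like these just recompiles; the ISA switch mostly matters for thread counts and performance tuning, not source code.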
 

D-e-f-

Banned
I wonder if that could be related to how they're taking screenshots, with some smoothing filter or AA being applied via post-processing or something? (Note: no idea what I'm talking about here!)

I believe the screenshot tool in Steam sometimes works in a similar way, in that it doesn't capture some post-processing effects, which makes screenshots look different from how the game actually looks while you're playing.

I say this (not only to start my third paragraph in a row with a stupid "I..." sentence) because I haven't noticed any jaggies in actual gameplay footage of those NintendoLand games.

Anyone who knows about these things care to comment on whether what I suggested there was basically nonsense, or whether that could actually be what's happening?

PS: Jaguar: meow! (Sony the new Atari confirmed!)
 

milsorgen

Banned
Meh, still too sparse to work out its performance. We need the number of shaders, clock speeds, cache amounts, etc.

Using a 13-year-old CPU design for a modern console is still extremely embarrassing, though. I'm assuming that the in-game OS uses the 32MB of memory.

Yeah, super embarrassing, just like when Intel went back to the Pentium III architecture after NetBurst failed to live up to its hype. Super embarrassing, all that old tech that led to fantastically fast processors putting them back on top.

Yeah totally, a tech's age completely determines its value.

http://en.wikipedia.org/wiki/Intel_P6
 