
WiiU technical discussion (serious discussions welcome)

JordanN

Banned
Unless Star Wars is migrating to UE4, along with a couple other next gen titles in development, there will have to be some version of UE3 running on the Orbango lol
I don't think he means UE3 won't run on Durango, only that it's not going to receive official support for it.

So Durango UE3 = same as all other console UE3 (ex: PS3, 360, Wii U etc).
 

AzaK

Member
Not sure it's of any value but wasn't there recently a reasonably high profile game mentioned that's in development for 360/PS3/Wii U by one team, and 720/PS4/PC by another team? I wonder if that indicates anything of merit?

It could either be that UE4 is really only feasible on Orbis/Durango and PC, with the Wii U having to be relegated to UE3, or just where the publisher sees the system's general audience/power level.
 
About that... what 3D does Assassin's Creed 3 provide, exactly? It has side by side, top/bottom or color 3D. How is that supposed to work? I have the game but don't have a 3D TV... You can have color 3D on the gamepad, and my TV splits up into two screens for the other options... as you can see, I'm not really into 3D display technology, lol. I assume the color thing is for use with some red/green glasses? I can't imagine that works brilliantly.

The side-by-side and top/bottom modes are for 3D TVs; the color (anaglyph) 3D is for getting 3D on non-3D TVs. If your TV were a 3D set, you could send it the SBS or TB version and the TV would know how to handle it, alternating the frames to produce the 3D image.
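If it helps to picture it, here's a rough sketch (plain Python with Pillow, purely illustrative, using a synthetic 1920x1080 frame) of what a 3D set conceptually does with a packed frame:

from PIL import Image

# Synthetic stand-in for one packed 1920x1080 frame coming from the console.
frame = Image.new("RGB", (1920, 1080))

def unpack(packed, mode):
    w, h = packed.size
    if mode == "sbs":    # side by side: left half -> left eye, right half -> right eye
        left  = packed.crop((0, 0, w // 2, h))
        right = packed.crop((w // 2, 0, w, h))
    elif mode == "tb":   # top/bottom: top half -> left eye, bottom half -> right eye
        left  = packed.crop((0, 0, w, h // 2))
        right = packed.crop((0, h // 2, w, h))
    else:
        raise ValueError(mode)
    # The set stretches each half back to panel resolution and then alternates
    # (or line-interleaves) the two images for the glasses, so each eye starts
    # from only half the pixels of a full frame.
    return left.resize((w, h)), right.resize((w, h))

left_eye, right_eye = unpack(frame, "sbs")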
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
No, I don't think that's likely. They run far too well to be doing everything in the main memory pool. And I don't think that, after six years of the 360, an eDRAM pool is such an alien/weird concept to developers (yes, I know the specifics differ).
Outside of straight fb purposes, I'd expect streamout/memexport to be the only thing common between U-GPU's and Xenos' edram pools - IOW, the mechanism for bypassing edram.
 
There is no official UE3 support for these platforms, so ...
Wow, thanks for that info. That is interesting.

OK, thanks. It's a little surprising as I would have expected developers to use it for cross-generation games, but I suppose if developers had been asking for it, Epic would have provided.
I think that could have been Epic's initial plan. However, perhaps Epic figured it would be better for them to phase out UE3 and push everyone to upgrade to UE4. Lherre already said that the support list for UE4 was surprising. Now we know why.
 

Thraktor

Member
Many are stating that the WiiU's CPU is small and slow.

If Sony is going with the following:
http://www.neogaf.com/forum/showthread.php?t=505093

An APU for its console, how large and powerful can its CPU actually be?
Considering that "only one of the GPU 304A, 304B is active at a time"

My confusion as to why Sony would take such an approach aside (there's a thread for that, I suppose), going with an APU-style approach would only restrict them if they wanted to go with a 6 or 8 "core" Bulldozer-derived CPU, which would strike me as a waste of silicon for a console CPU anyway. They're apparently going with Jaguar cores, which you could fit a bunch of on a die alongside a GPU.

I think that could have been Epic's initial plan. However, perhaps Epic figured it would be better for them to phase out UE3 and push everyone to upgrade to UE4. Lherre already said that the support list for UE4 was surprising. Now we know why.

You might well be right. It probably makes more sense for them to just support one highly scalable engine than two separate engines for two separate markets.
 

Durante

Member
Outside of straight fb purposes, I'd expect streamout/memexport to be the only thing common between U-GPU's and Xenos' edram pools - IOW, the mechanism for bypassing edram.
But "straight FB purposes" is exactly what the conversation was about. Using eDRAM for the FB is something you can expect every game on the platform to do, don't you agree?
 
You might well be right. It probably makes more sense for them to just support one highly scalable engine than two separate engines for two separate markets.

This could benefit the Wii U (and 360/PS3/iDevices) when it comes to UE4 ports. On the flip side, this could initially annoy publishers that wanted to make games for PS4/Xbox Next but didn't want to make the transition from UE3 to UE4 yet. In the end, Epic will make more money selling UE4 as a scalable engine that all gaming consoles can use to varying degrees.
 

MDX

Member
But "straight FB purposes" is exactly what the conversation was about. Using eDRAM for the FB is something you can expect every game on the platform to do, don't you agree?


And what if instead of the eDRAM they use the DDR3 as the framebuffer?
 
Okay, so now that we know the Wii U won't be up to par with Microsoft's and Sony's next gen systems based off the rumors here: what can Nintendo do to ensure people run out and buy this console? Also, to make sure gamers have a reason to use it all the time and not end up with a closet console two years into its life cycle. The only possible aspect that I can think of is exclusives from third parties. Could this be something Nintendo is doing that we just don't see yet?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
But "straight FB purposes" is exactly what the conversation was about. Using eDRAM for the FB is something you can expect every game on the platform to do, don't you agree?
My bad, somehow I misread your post diametrically.
 
Okay, so now that we know the Wii U won't be up to par with Microsoft's and Sony's next gen systems based off the rumors here: what can Nintendo do to ensure people run out and buy this console? Also, to make sure gamers have a reason to use it all the time and not end up with a closet console two years into its life cycle. The only possible aspect that I can think of is exclusives from third parties. Could this be something Nintendo is doing that we just don't see yet?
This is more of a "Nintendo business" question than a technical one, but one thing Nintendo is doing is trying to integrate the Wii U gamepad into your living room/bedroom/bathroom. Effectively succeeding in that will reduce the system's chances of ever being put away.
 
The side-by-side and top/bottom modes are for 3D TVs; the color (anaglyph) 3D is for getting 3D on non-3D TVs. If your TV were a 3D set, you could send it the SBS or TB version and the TV would know how to handle it, alternating the frames to produce the 3D image.
On this note, I'm glad to hear it has both top/bottom and side-by-side. I've had at least one game that did only side-by-side (Sonic Generations), but on a set like mine, where in 3D mode the rows are split between the eyes for 1920x540 each, it's a bad mix: since the game provides only half of the horizontal pixels and the set has to halve the resolution vertically, each eye would only get 960x540.
Fatalmephisto said:
Also, to make sure gamers have a reason to use it all the time and not end up with a closet console two years into its life cycle.
I think Nintendo's fine with people sticking it in the closet, as long as the closet allows plenty of wireless range for the gamepad.
 

Stewox

Banned
Many are stating that the WiiU's CPU is small and slow.

Those were all unreliable sources, and the latest one was officially debunked last week. This whole thing is old news to serious Wii U tech followers; I won't explain the details all over again now because I'm on my phone, and the other guys from the speculation threads should be well aware of this. For the latest example, check the "Metro dev CPU horrible" thread, last or second-to-last page.

The same "CPU slow, not enough shaders" crap was coming out just before E3 2012. I wasn't registered here yet, but on other forums I warned and pointed out not only the numerous clues that make it inaccurate, but precisely the possibility of old impressions and very early analysis, with the usual subjective tone, getting carried over the months. AND THIS IS EXACTLY WHAT WAS PROVEN OFFICIALLY IN THE METRO DEV SLOW CPU DEBACLE. IT IS A PILE OF SHIT.

No competent dev would look at hardware for a few days and start blasting BS, but of course those who do have no idea, or are bribed by the media, or are some friend of a friend of a QA tester or janitor, or it's simply media exaggeration and vagueness.

The stupid media looks at the MCM, sees that the CPU chip is smaller than the GPU and says "oh, it's so small it must be slow too, and we've got our super secret squirrel master detective unnamed bribed dev idiot source who's saying the same thing", while not mentioning GPGPU and OoOE, as well as the RAM pipeline latency advancements.

They have no evidence, no comparison for their "slow". It's not slower than the X360 CPU, that's a fact.

Everyone in this thread should already know this; this shouldn't be a discussion.
 

TheD

The Detective
Those were all unreliable sources, and the latest one was officially debunked last week. This whole thing is old news to serious Wii U tech followers; I won't explain the details all over again now because I'm on my phone, and the other guys from the speculation threads should be well aware of this. For the latest example, check the "Metro dev CPU horrible" thread, last or second-to-last page.

The same "CPU slow, not enough shaders" crap was coming out just before E3 2012. I wasn't registered here yet, but on other forums I warned and pointed out not only the numerous clues that make it inaccurate, but precisely the possibility of old impressions and very early analysis, with the usual subjective tone, getting carried over the months. AND THIS IS EXACTLY WHAT WAS PROVEN OFFICIALLY IN THE METRO DEV SLOW CPU DEBACLE. IT IS A PILE OF SHIT.

No competent dev would look at hardware for a few days and start blasting BS, but of course those who do have no idea, or are bribed by the media, or are some friend of a friend of a QA tester or janitor, or it's simply media exaggeration and vagueness.

The stupid media looks at the MCM, sees that the CPU chip is small and says "oh, it's so small it must be slow too, and we've got our super secret squirrel master detective unnamed bribed dev idiot source who's saying the same thing".

What an appalling post.

The Wii U has a small CPU that is built on a process only as advanced as the other consoles' and that uses a lot less power.

None of that paints a good picture of its performance!

The media is not "stupid" for coming to the logical conclusion, the idea that people have been bribed is nothing short of deluded, and devs are not idiots!

GET OVER IT!
 

Durante

Member
Those were all unreliable sources, and the latest one was officially debunked last week.
Which unreliable sources are you talking about here, exactly? Because we know very reliably that the CPU is (a) tiny and (b) runs at a low clock speed. I don't see why you discredit CPU die size as a good partial indicator of performance. Oh wait, I do, it's because it's the one fact where you can't discredit the source instead.

Also media bribes? Are you still serious now?
 
IIRC, that it's "small" derives from teardowns and measurement - the latest of which was Anandtech's?

That it's "slow", at least in terms of pure clock speed, derives from multiple sources, but final confirmation was derived from a hacker named marcan. It was also verified by GAF source lherre, IIRC.

That it's "horrible" comes from the Metro 2033 dev.

The OP could probably use updating with die sizes, and potentially power draw, since they're confirmed measured figures.
 
MHz isn't everything. Actually, it's pretty low on the totem pole.

Can't respond to every detail right now.

Well, we have a rough idea of how powerful the CPU is, since Broadway is based on the 750CL (or the other way around). Compared to the Wii's CPU, it is clocked about 70% faster, is triple-core, and has a bigger asymmetrical cache. It is clocked fast compared to any other CPU with such a short pipeline. If you combine all the cores, you will have about 5x Broadway in raw power. With the additional cache and other improvements, it may be closer to 6x Broadway or 9x Gekko.
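To make the arithmetic behind those multiples explicit, here's the back-of-envelope version (Python, using the commonly reported clocks of 486MHz for Gekko, 729MHz for Broadway and 1.24GHz for Espresso, and deliberately ignoring the per-clock gains from the bigger cache):

# Raw clock/core scaling only; cache and other per-clock improvements ignored,
# so treat these as floor estimates.
GEKKO_MHZ      = 486    # GameCube
BROADWAY_MHZ   = 729    # Wii
ESPRESSO_MHZ   = 1243   # Wii U, per core
ESPRESSO_CORES = 3

per_core_vs_broadway = ESPRESSO_MHZ / BROADWAY_MHZ            # ~1.7, the "70% faster"
total_vs_broadway    = per_core_vs_broadway * ESPRESSO_CORES  # ~5.1x Broadway
total_vs_gekko       = ESPRESSO_MHZ * ESPRESSO_CORES / GEKKO_MHZ  # ~7.7x Gekko

print(f"{per_core_vs_broadway:.2f}x per core, "
      f"{total_vs_broadway:.1f}x Broadway, {total_vs_gekko:.1f}x Gekko")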

Broadway has relatively weak SIMD, but it compared quite favorably clock-for-clock in general purpose tasks against the PPE in Cell and the cores in Xenon. Espresso, with its boost in speed and bigger cache, may surpass those processors in several ways.
 

Thraktor

Member
Since this is the technical discussion thread, let's discuss what they could be planning engineering-wise.

If you look at the PS3/360, the 'slim' redesigns are a lot bigger than the slimline "PStwo" redesign from the previous generation. And also more expensive.

The emphasis on low-power consumption for the Wii U may point to a strategy of being able to come out with a smaller, cheaper version early in the console's lifetime in comparison to previous generations. And then competing on price.

I really don't know anything about this sort of hardware engineering, so would anyone with a little more knowledge like to speculate about how quickly the Wii U could be turned into a Wii Ultralite and what sort of cost savings that would give?

This is something that interests me with the Wii U's design. We know from the Wii mini that they are doing die-shrinks these days, so it's certainly a possibility that they'd release a smaller model at some point. In fact, in a few years time they could release a new model with ~14nm CPU/GPU, no optical drive and say 256GB of flash, and they could really squeeze it into a tiny package. The fact that they've got about 90% of retail games on the eShop would make a download-only version of the console feasible, but they'd have to learn from the mistakes of the PSPGo, and obviously keep a disc-based model on the shelves as well.
 

MDX

Member
This is something that interests me with the Wii U's design. We know from the Wii mini that they are doing die-shrinks these days, so it's certainly a possibility that they'd release a smaller model at some point. In fact, in a few years time they could release a new model with ~14nm CPU/GPU, no optical drive and say 256GB of flash, and they could really squeeze it into a tiny package. The fact that they've got about 90% of retail games on the eShop would make a download-only version of the console feasible, but they'd have to learn from the mistakes of the PSPGo, and obviously keep a disc-based model on the shelves as well.


They solved the retail issue by having users buy credit from stores.
 

MDX

Member
Those were all unreliable sources, and the latest one was officially debunked last week. This whole thing is old news to serious Wii U tech followers; I won't explain the details all over again now because I'm on my phone, and the other guys from the speculation threads should be well aware of this. For the latest example, check the "Metro dev CPU horrible" thread, last or second-to-last page.

I'm not making any assumptions about the CPU.
What I'm saying is: those who are critical of the Wii U CPU because of its size,
how do they reconcile Sony using an APU?
 

NBtoaster

Member
I'm not making any assumptions about the CPU.
What I'm saying is: those who are critical of the Wii U CPU because of its size,
how do they reconcile Sony using an APU?

We don't know the size or power consumption of whatever Sony is using. And it's fairly unlikely it'll be smaller or require less power than what's in the Wii U.
 

wsippel

Banned
Which unreliable sources are you talking about here, exactly? Because we know very reliably that the CPU is (a) tiny and (b) runs at a low clock speed. I don't see why you discredit CPU die size as a good partial indicator of performance. Oh wait, I do, it's because it's the one fact where you can't discredit the source instead.

Also media bribes? Are you still serious now?
It would be a lot bigger if it used SRAM, though. Very different cache-to-logic ratio. That makes comparisons to any other chip pretty hard, as the only other chip with eDRAM is Blue Gene/Q, and that's an 18 core monstrosity with 72 SIMD units and 32MB L2 cache. Still, for comparison's sake, Blue Gene/Q is 360mm^2 and has 1.47 billion transistors. Uses the same process as Espresso (Cu-45HP).
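Just to put rough numbers on that (simple arithmetic from the Blue Gene/Q figures above; the small die area at the end is a made-up placeholder, not a measured Espresso figure):

# Transistor density implied by the quoted Blue Gene/Q figures (Cu-45HP).
BGQ_TRANSISTORS = 1.47e9
BGQ_AREA_MM2    = 360.0
density_per_mm2 = BGQ_TRANSISTORS / BGQ_AREA_MM2   # ~4.1 million transistors/mm^2

# Hypothetical: what a small ~30mm^2 die would hold at the same density.
# (30mm^2 is an assumed placeholder, not a confirmed Espresso die size.)
assumed_die_mm2 = 30.0
print(f"{density_per_mm2 / 1e6:.1f}M transistors/mm^2; "
      f"~{density_per_mm2 * assumed_die_mm2 / 1e6:.0f}M transistors at {assumed_die_mm2:.0f}mm^2")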
 
Not completely related to the topic, but I was looking at a Beyond3D thread about polygon counts, and I was a bit shocked by the polygon counts for Samus' character model in Other M and Metroid Prime 3. They were over 15k and 18k respectively, which is actually comparable to a lot of character models on the HD consoles. Is the Wii's in-game polygon throughput really way less than 10% of what the 360/PS3 should be able to put out? Are games for HD consoles using that many more polygons in the game world, considering that they can save on polygons by using advanced shaders and normal mapping instead of pure geometry?

On how this relates to the Wii U: if devs from Tecmo and Retro were able to pull off such complex models with no normal mapping or modern shaders on the Wii, can this experience be carried over to make some ridiculously detailed character models using the much more powerful but BC-friendly Wii U?
 
IIRC, that it's "small" derives from teardowns and measurement - the latest of which was Anandtech's?

That it's "slow", at least in terms of pure clock speed, derives from multiple sources, but final confirmation was derived from a hacker named marcan. It was also verified by GAF source lherre, IIRC.

That it's "horrible" comes from the Metro 2033 dev.

The OP could probably use updating with die sizes, and potentially power draw, since they're confirmed measured figures.

Do you really think the Wii U CPU is slower just because its clock speed is lower than the Xbox 360's? What do you do in a serious tech thread anyway? Inform yourself about CPU tech, for example cache, eDRAM, out-of-order execution, operations per cycle, etc. Of course the Wii U CPU has more power than the Xbox 360 one.
 

z0m3le

Banned

Because if you compare the 360's 3-core Xenon to the Wii's 1-core Broadway, you get this:
http://www.neogaf.com/forum/showpost.php?p=44966628&postcount=646 Just to bring up this post from a few pages back, I had a question...

If 1 Xenon core only outperformed Broadway by 20%, then could this be why the Wii U's CPU cores were originally clocked at 1GHz (a clock speed increase of 37%)? Wouldn't this then put the Wii U's 3 cores at 1.24GHz with as good or better performance clock-for-clock as Broadway, quite a bit beyond Xenon (Broadway would match or beat a Xenon core at only 875MHz?), so more like 1.5x (I know we love those multiples) or 50% faster than Xenon, according to the emulator developer from the quoted post above?

Obviously certain tasks that are SIMD-heavy would still likely fall short on Wii U, but for general computing of game logic and whatnot, wouldn't the Wii U's CPU cores be reasonably faster? Or was the developer missing some other key component of Xenon's power that was unlocked later on?
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
He is. So are the developers claiming it's weaker. It's both more and less powerful. It depends on the workload.

It's certainly an interesting enigma of a device. So many conflicting stories about its power.

Our family is currently playing Nintendo Land and the graphics look pretty good. And there's enough power to have 2 players split screen on the TV and another player on the gamepad.
 

wsippel

Banned
It's certainly an interesting enigma of a device. So many conflicting stories about its power.

Our family is currently playing Nintendo Land and the graphics look pretty good. And there's enough power to have 2 players split screen on the TV and another player on the gamepad.
Yeah, Nintendo Land surprised me as well. The GPU is certainly not bad at all. The CPU is where most of the confusion comes from, but it isn't really all that confusing once you realize how different the designs are. The 750 family is extremely different from the PPEs.
 
Because if you compare the 360's 3-core Xenon to the Wii's 1-core Broadway, you get this:
My query was about the basis for such certainty that the Wii U CPU outperforms Xenon. And it was rhetorical, in that the answer is there is no basis for certainty.

This seems the most balanced assessment at present.
He is. So are the developers claiming it's weaker. It's both more and less powerful. It depends on the workload.

---
On something of an aside, even if the developer quote is accurate (and it seems a bit of a strange comparison, considering the PS3 and 360 CPUs aren't, AFAIK, equivalent), your calculations appear to be off, I believe. Assuming the developer is comparing core to core, a 1.7 multiplier applied to a 0.83 "performance" amounts to roughly a 40% increase.
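Spelling that out (a quick Python sketch; it takes the emulator developer's "Xenon core ~20% faster than Broadway per clock" figure at face value, uses the commonly reported 729MHz/1.24GHz clocks, and assumes Espresso keeps Broadway's per-clock performance):

# Rough core-vs-core estimate based on the figures discussed above.
BROADWAY_MHZ = 729
ESPRESSO_MHZ = 1243
XENON_CORE_ADVANTAGE = 1.20   # one Xenon core vs Broadway, per the dev quote

broadway_per_clock_vs_xenon = 1 / XENON_CORE_ADVANTAGE       # ~0.83
clock_ratio = ESPRESSO_MHZ / BROADWAY_MHZ                     # ~1.7
espresso_core_vs_xenon_core = clock_ratio * broadway_per_clock_vs_xenon

print(f"~{espresso_core_vs_xenon_core:.2f}x a Xenon core")    # ~1.42x, i.e. roughly 40% faster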
 

z0m3le

Banned
My query was about the basis for such certainty that the Wii U CPU outperforms Xenon. And it was rhetorical, in that the answer is there is no basis for certainty.

This seems the most balanced assessment at present.


---
On something of an aside, even if the developer quote is accurate (and it seems a bit of a strange comparison, considering the PS3 and 360 CPUs aren't, AFAIK, equivalent), your calculations appear to be off, I believe. Assuming the developer is comparing core to core, a 1.7 multiplier applied to a 0.83 "performance" amounts to roughly a 40% increase.

However, the Wii U's CPU is more than just three Wii CPUs; the added cache and advancements in architecture would likely yield a few more percentage points, which is why I said "more like 1.5X or 50%". It was just a simple estimation, and I had also taken the SIMDs into consideration. Also, in post #816 I point out that Crysis 3 is going to run on a dual core 2.8GHz AMD processor for its minimum requirements, and the more advanced AMD dual cores are only ~25GFLOPs, so I'm not sure how much the CPU will even be a barrier for next gen content. It more than seems possible that the CPU won't really matter much at all, especially if the other consoles take up the 3.1mm^2 @28nm per core Jaguars.

I just find it really strange that people would write off a console without any real knowledge or comparisons with the other future consoles. The Wii U is already playing current gen ports "on par" with PS3/360, and the Wii U's CPU might be the part that's most similar, performance-wise, to the other next gen consoles.
 

Durante

Member
It's not an "enigma" at all. It's just a different architecture. The fact that one piece of hardware can be both faster and slower than another depending on the workload is only unusual if you have an overly simplified view of performance.

Generally, there are a few things very wrong with the CPU conversation in this thread:
1) Multipliers applied to the overall performance of a CPU. That just can't work for two architectures which are very different.
2) Using Cell and Xenon interchangeably for comparison. They are very much different. Cell (in PS3) has 7 SPEs, and even a single SPE can offer better performance than a single PPE.
3) People freaking out when you call the CPU "weak" and starting to throw out buzzwords and buzz-phrases which they read somewhere without fully understanding them.

Because if you compare the 360's 3-core Xenon to the Wii's 1-core Broadway, you get this:
That's one workload. The real question is, in how much of an average game workload does the CPU perform worse/better compared to Xenon? My assumption is that, as long as developers don't put the majority of their FP code on the GPU (which requires significant effort and re-engineering, and takes away GPU performance from graphics), there will be a lot of FP code which performs significantly worse.
 

wsippel

Banned
That's one workload. The real question is, in how much of an average game workload does the CPU perform worse/better compared to Xenon? My assumption is that, as long as developers don't put the majority of their FP code on the GPU (which requires significant effort and re-engineering, and takes away GPU performance from graphics), there will be a lot of FP code which performs significantly worse.
I wonder how much of the floating point heavy code currently used can be redesigned to work better on a CPU like Espresso. I mean achieving comparable, maybe even better, results using different solvers/algorithms.
 

z0m3le

Banned
That's one workload. The real question is, in how much of an average game workload does the CPU perform worse/better compared to Xenon? My assumption is that, as long as developers don't put the majority of their FP code on the GPU (which requires significant effort and re-engineering, and takes away GPU performance from graphics), there will be a lot of FP code which performs significantly worse.

Which was also covered in what I said. In all honesty, the Wii U is already handling current gen ports, so the CPU is likely not causing that much of a headache, and next gen consoles will have far inferior FP performance from their CPUs, especially if they go with Jaguar. Again, the PS4's CPU is likely much closer in performance to the Wii U's CPU than any other part of the PS4, and that is without much argument, btw. Jaguar cores are tiny, and while they might pull off more performance than the Wii U's, Jaguar is only ~20% faster than Bobcat, which at 1.6GHz was nothing to write home about.
 

YAWN

Ask me which Shakespeare novel is best
Hey guys, I just got my Wii U today for Christmas, and really looking forward to playing it.
Still, I started the update at around 10:20 this morning, and it's now 12:30 and around three quarters finished. Does it normally take this long?
 

Durante

Member
Even at 1.6 GHz, 8 Jaguar cores (which is the absolute minimum they should go with for such a tiny/low power core) would still deliver over 50 GFlops, from what I can decipher from the specs. And hey, maybe MS does go with 16 after all. That would be pretty awesome.

Also, I don't think 1.6 GHz is a limit set in stone just because that's what they plan to use in tablets -- a console is not a tablet after all.
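For what it's worth, the napkin math behind that 50+ GFlops figure would look something like this (Python; the 4 single-precision FLOPs per cycle per core is my assumption for how the number was derived, not a confirmed Jaguar spec):

# Peak single-precision throughput = cores * clock * FLOPs issued per cycle per core.
# flops_per_cycle below is an assumption chosen to reproduce the figures in the
# thread, not a confirmed specification for any of these CPUs.
def peak_gflops(cores, clock_ghz, flops_per_cycle=4):
    return cores * clock_ghz * flops_per_cycle

print(peak_gflops(8, 1.6))    # 51.2  -> the "over 50 GFlops" for 8 Jaguar cores
print(peak_gflops(16, 1.6))   # 102.4 -> if MS really went with 16 cores
print(peak_gflops(2, 2.8))    # 22.4  -> in the ballpark of the ~25 GFLOPs AMD dual core mentioned a few posts up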
 

z0m3le

Banned
Hey guys, I just got my Wii U today for Christmas, and really looking forward to playing it.
Still, I started the update at around 10:20 this morning, and it's now 12:30 and around three quarters finished. Does it normally take this long?

Congratulations! It can take up to 4 hours, but there is a way to cancel it and allow it to continue downloading in the background while you play. There's a thread for that, though.
 

wsippel

Banned
Hey guys, I just got my Wii U today for Christmas, and really looking forward to playing it.
Still, I started the update at around 10:20 this morning, and it's now 12:30 and around three quarters finished. Does it normally take this long?
Only when the servers are getting hammered. The update is pretty big and it's Christmas, so a ton of people are updating right now. Launch day was terrible as well; a few days later it took only a few minutes. The 2.0 update took me three hours, while the 2.1 update, which was roughly the same size, took half an hour on my extremely slow connection.


Even at 1.6 GHz, 8 Jaguar cores (which is the absolute minimum they should go with for such a tiny/low power core) would still deliver over 50 GFlops, from what I can decipher from the specs. And hey, maybe MS does go with 16 after all. That would be pretty awesome.
Would it? As the number of cores increases, efficiency always goes down. It's not a big issue for typical supercomputing number crunching, but games are very different animals.
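One way to illustrate why efficiency drops as core counts go up is Amdahl's law; a quick sketch (the 90% parallel fraction is a made-up illustrative number, not a measurement of any game engine):

# Amdahl's law: if a fraction p of the work parallelizes perfectly and the rest
# stays serial, n cores give a speedup of 1 / ((1 - p) + p / n).
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assume 90% of a frame's CPU work parallelizes (illustrative only)
for n in (2, 4, 8, 16):
    s = amdahl_speedup(p, n)
    print(f"{n:2d} cores: {s:.2f}x speedup, {100 * s / n:.0f}% per-core efficiency")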
 

Rolf NB

Member
Which was also covered in what I said. In all honesty, the Wii U is already handling current gen ports, so the CPU is likely not causing that much of a headache, and next gen consoles will have far inferior FP performance from their CPUs, especially if they go with Jaguar. Again, the PS4's CPU is likely much closer in performance to the Wii U's CPU than any other part of the PS4, and that is without much argument, btw. Jaguar cores are tiny, and while they might pull off more performance than the Wii U's, Jaguar is only ~20% faster than Bobcat, which at 1.6GHz was nothing to write home about.
Performance on current-gen ports is a mixed bag, trending toward the "slower than the 6+ year old competition" side of things. In no way does that prove the CPU can keep up.

Besides, I don't even buy into the PS4 Jaguar rumour. It would be a hefty performance regression, which is what you're counting on to make the Wii U CPU look competitive. But this won't happen. Jaguar/APU stuff in consoles, if it materializes at all, will be used for low power media modes, active standby with downloads continuing, that sort of thing. It will not be what runs the games. It can't. It would make no sense to launch a successor machine with those sorts of components at all.
 