
WiiU technical discussion (serious discussions welcome)

Meelow

Banned
You're looking at it too broadly. There have been several shifts towards scalable engines and more singular architectures since the PS2 era. The Wii U will be technically capable of anything the PS4 can do; even if it's 1/4th to 1/15th the speed, it's technically capable.

However, the PS4 is in a whole other league compared to the Wii U, if the leaked specs are true, and it probably wouldn't be fiscally sound to attempt a game that's even near a playable, faithful representation of the PS4 version on Wii U.

"Graphics uber alles" has never decided commercial success. Trying to contrive comparisons to prior console generations is also largely a fools errand, as the gaming community and the world economy were much different back then.

I think I might be the only one on GAF who thinks Durango and Orbis are going to be huge albatrosses for the industry if Sony and Microsoft can't keep the price down.

In any case, let me know where to send my money for the purchase of that photo.

It's really hard to use past consoles to describe the differences between current consoles, and generation numbers are too vague in their definition.

I will say, however, that it's not as big a difference as that of Wii and PS3/360 (though I think the difference between Sony and MS's new consoles is going to be bigger than last time out, by the looks of it).

Thanks. I'm happy that Nintendo put effort into the Wii U specs rather than doing what they did with the Wii; the specs across WiiU/PS4/X720 are very interesting, I think.
 
This isn't accurate at all.

How so? It's a pretty sizeable gap in the raw horsepower and functionality of the machines, and to make a game look visually similar would be a pretty big undertaking. More so if there are games that are heavy on CPU usage, as the Wii U is already hitting problems with this gen, magical DSP and GPGPU accounted for. The difference in bandwidth is ~15x, and they're at the same resolutions.
 

Donnie

Member
Oh, I get what he's trying to say. In my random BS made-up estimation, the difference between Wii U and Orbis is wider than Dreamcast to Xbox, but the Wii was functionally handicapped in addition to having gimped horsepower vs the XB360. Not a fair comparison.

The Dreamcast to XBox comparison is a difficult one. In some ways WiiU vs PS4 are even further apart, and in others they're closer together. It's even further complicated by the huge architectural differences between Dreamcast and XBox.
 

LCGeek

formerly sane
It's really hard to use past consoles to describe the differences between current consoles, and generation numbers are too vague in their definition.

I will say, however, that it's not as big a difference as that of Wii and PS3/360 (though I think the difference between Sony and MS's new consoles is going to be bigger than last time out, by the looks of it).

You really shouldn't, considering how GPUs have evolved and how much power has gone unused from the last generation until right now.
 

Donnie

Member
How so? It's a pretty sizeable gap in the raw horsepower and functionality of the machines, and to make a game look visually similar would be a pretty big undertaking. More so if there are games that are heavy on CPU usage, as the Wii U is already hitting problems with this gen, magical DSP and GPGPU accounted for. The difference in bandwidth is ~15x, and they're at the same resolutions.

The difference in bandwidth isn't 15x, as that completely ignores WiiU's embedded memory (it's like saying PS4 will have 3x the bandwidth of XBox3). Also, I personally still think that most problems with WiiU's CPU vs current gen come down to design choices, i.e. how games are optimised for two very different CPUs. Obviously 8 Jaguar cores will be quite a bit more powerful than 3 WiiU cores, though if it's true that XBox3 will have 6 cores for games, then I'd expect multiplatform games to build around that number.
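Quick back-of-envelope sketch of that point, in C++ for concreteness. The DDR3 and GDDR5 figures below are the commonly cited/leaked ones; the eDRAM figure is a pure placeholder, since Nintendo has never published it:

Code:
#include <cstdio>

int main() {
    const double wiiu_ddr3  = 12.8;   // GB/s, commonly cited for WiiU's main DDR3 pool
    const double ps4_gddr5  = 176.0;  // GB/s, per the leaked PS4 specs
    const double wiiu_edram = 70.0;   // GB/s, PURE GUESS -- never published

    // Comparing main pool to main pool gives the headline ratio (~14x here)...
    std::printf("main pool vs main pool: %.1fx\n", ps4_gddr5 / wiiu_ddr3);

    // ...but if framebuffer traffic lives in the 32MB eDRAM, the effective
    // gap for render bandwidth looks very different.
    std::printf("main pool vs eDRAM:     %.1fx\n", ps4_gddr5 / wiiu_edram);
    return 0;
}

With the placeholder eDRAM number that second ratio lands around 2.5x, which is the whole reason the "3x XBox3" analogy works.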
 
How so? It's a pretty sizeable gap in the raw horsepower and functionality of the machines, and to make a game look visually similar would be a pretty big undertaking. More so if there are games that are heavy on CPU usage, as the Wii U is already hitting problems with this gen, magical DSP and GPGPU accounted for. The difference in bandwidth is ~15x, and they're at the same resolutions.

Because lowering asset quality isn't some crazy expensive thing that would make porting a game fiscally unsound.

Having to rebuild your engine from scratch to support a platform that doesn't have programmable shaders is much less "fiscally sound" than lowering the quality of some items to support a platform with a bit less grunt.

Hardware feature-set-wise, it's all there for the Wii U. They're not going to have to reprogram entire engines to support the platform. They're going to drop the polygon count, cut down texture/normal map resolution, maybe have a few fewer enemies on screen, and things like that. Which is fiscally minimal as far as porting effort goes, and is NOWHERE NEAR the expense of having to create an engine from scratch to support a platform.

Most of these 3rd party games are going to be on the PC, where people will have systems (and play them on systems) with a similar or larger gulf in power than the difference between the Wii U, 720, and PS4.

I mean shit, look at AMD's 7XXX line: the 7750 is an 819 GFLOP part with 72GB/s of memory bandwidth, and then you have the 7970, a 4.3 TFLOP part with 288GB/s of bandwidth. Yet both of them will play the same games. The 7970's going to play those games much smoother, and they'll look way better than on the 7750, but it'll still play them. One is just going to run with higher quality assets than the other, which is not some fiscally unsound thing.

As long as the feature sets are there (which we know the Wii U GPU has), then the games can be ported with lower quality assets, which is much cheaper to do than having to rebuild an engine for a system with missing major features.
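To make the scaling argument concrete, here's a minimal sketch of the idea in C++. The preset numbers are invented for illustration, not real platform budgets:

Code:
#include <cstdio>

struct QualityPreset {
    const char* name;
    float       assetScale;   // multiplier on texture/normal map resolution
    float       lodBias;      // pushes lower-poly models in sooner
    int         maxEnemies;   // on-screen cap
};

// Same engine, same code path; only the knobs change per target.
const QualityPreset kPresets[] = {
    { "PS4/720 (hypothetical)", 1.00f, 0.0f, 64 },
    { "Wii U (hypothetical)",   0.50f, 1.0f, 40 },
    { "PC low (think 7750)",    0.50f, 1.0f, 40 },
    { "PC high (think 7970)",   1.00f, 0.0f, 64 },
};

int main() {
    for (const auto& p : kPresets)
        std::printf("%-24s assets x%.2f, LOD bias %.1f, %d enemies\n",
                    p.name, p.assetScale, p.lodBias, p.maxEnemies);
    return 0;
}

That's the entire "port" on the asset side: pick a different preset, not a different engine.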
 
Because lowering asset quality isn't some crazy expensive thing that would make porting a game fiscally unsound.

Having to rebuild your engine from scratch to support a platform that doesn't have programmable shaders is much less "fiscally sound" than lowering the quality of some items to support a platform with a bit less grunt.

Hardware feature-set-wise, it's all there for the Wii U. They're not going to have to reprogram entire engines to support the platform. They're going to drop the polygon count, cut down texture/normal map resolution, maybe have a few fewer enemies on screen, and things like that. Which is fiscally minimal as far as porting effort goes, and is NOWHERE NEAR the expense of having to create an engine from scratch to support a platform.

Most of these 3rd party games are going to be on the PC, where people will have systems (and play them on systems) with a similar or larger gulf in power than the difference between the Wii U, 720, and PS4.

I mean shit, look at AMD's 7XXX line: the 7750 is an 819 GFLOP part with 72GB/s of memory bandwidth, and then you have the 7970, a 4.3 TFLOP part with 288GB/s of bandwidth. Yet both of them will play the same games. The 7970's going to play those games much smoother, and they'll look way better than on the 7750, but it'll still play them. One is just going to run with higher quality assets than the other, which is not some fiscally unsound thing.

As long as the feature sets are there (which we know the Wii U GPU has), then the games can be ported with lower quality assets, which is much cheaper to do than having to rebuild an engine for a system with missing major features.

We might be looking at different things here.

Factoring in the CPU, which limits a lot of possibilities such as the primo stuff we've been seeing in these demo reels, the base featureset of the U is gimped. The other machines being an order of magnitude more advanced, that's expected. In addition, while we don't know the complete spec of the U GPU, there's bound to be a lot of effects that are just not reasonable to have in a lower-spec system (we're already seeing problems with low-res world shadows in this gen's U ports, which again might be a bandwidth issue).

I don't see how the U can handle next gen particle, physics, and lighting engines, unless it takes the iOS way and prebakes those assets. If it does try to undertake those engines, we would see sharp concessions elsewhere. In addition, while this might be alleviated by lowering asset fidelity, we're still talking 1GB of (slower) DDR3 RAM available to devs versus 3.5GB of GDDR5.

So you would have a gimped port that consumers will likely pass on versus original content or the same game on a superior system, like they do already unless the console in question is the market's main egg (as the PS2 was). Thus sales will be impacted, and thus it wouldn't be fiscally responsible. All IMO, of course.
 

FLAguy954

Junior Member
Because lowering asset quality isn't some crazy expensive thing that would make porting a game fiscally unsound.

Having to rebuild your engine from scratch to support a platform that doesn't have programmable shaders is much less "fiscally sound" than lowering the quality of some items to support a platform with a bit less grunt.

Hardware feature-set-wise, it's all there for the Wii U. They're not going to have to reprogram entire engines to support the platform. They're going to drop the polygon count, cut down texture/normal map resolution, maybe have a few fewer enemies on screen, and things like that. Which is fiscally minimal as far as porting effort goes, and is NOWHERE NEAR the expense of having to create an engine from scratch to support a platform.

Most of these 3rd party games are going to be on the PC, where people will have systems (and play them on systems) with a similar or larger gulf in power than the difference between the Wii U, 720, and PS4.

I mean shit, look at AMD's 7XXX line: the 7750 is an 819 GFLOP part with 72GB/s of memory bandwidth, and then you have the 7970, a 4.3 TFLOP part with 288GB/s of bandwidth. Yet both of them will play the same games. The 7970's going to play those games much smoother, and they'll look way better than on the 7750, but it'll still play them. One is just going to run with higher quality assets than the other, which is not some fiscally unsound thing.

As long as the feature sets are there (which we know the Wii U GPU has), then the games can be ported with lower quality assets, which is much cheaper to do than having to rebuild an engine for a system with missing major features.

This. I am continuously baffled as to why many tech-minded posters seem to completely (or selectively) ignore engine scalability. It doesn't make any sense.
 

VanWinkle

Member
I don't see how downscaling a PS4 game to Wii U would be any different from downscaling a PC game to the PS3/360. It's not like engines don't scale at all. Most multiplatform engines are already designed to scale.
 

neo-berserk

Neo Member
We might be looking at different things here.

Factoring in the CPU, which limits a lot of possibilities such as the primo stuff we've been seeing in these demo reels, the base featureset of the U is gimped. The other machines being an order of magnitude more advanced, that's expected. In addition, while we don't know the complete spec of the U GPU, there's bound to be a lot of effects that are just not reasonable to have in a lower-spec system (we're already seeing problems with low-res world shadows in this gen's U ports, which again might be a bandwidth issue).

I don't see how the U can handle next gen particle, physics, and lighting engines, unless it takes the iOS way and prebakes those assets. If it does try to undertake those engines, we would see sharp concessions elsewhere. In addition, while this might be alleviated by lowering asset fidelity, we're still talking 1GB of (slower) DDR3 RAM available to devs versus 3.5GB of GDDR5.

So you would have a gimped port that consumers will likely pass on versus original content or the same game on a superior system, like they do already unless the console in question is the market's main egg (as the PS2 was). Thus sales will be impacted, and thus it wouldn't be fiscally responsible. All IMO, of course.
Exactly, you can downscale graphics but not CPU features. Like on PC: if the minimum spec is four cores, you can't play with three cores.
 

Donnie

Member
We might be looking at different things here.

Factoring in the CPU, which limits a lot of possibilities such as the primo stuff we've been seeing in these demo reels, the base featureset of the U is gimped. The other machines being an order of magnitude more advanced, that's expected. In addition, while we don't know the complete spec of the U GPU, there's bound to be a lot of effects that are just not reasonable to have in a lower-spec system (we're already seeing problems with low-res world shadows in this gen's U ports, which again might be a bandwidth issue).

I don't see how the U can handle next gen particle, physics, and lighting engines, unless it takes the iOS way and prebakes those assets. If it does try to undertake those engines, we would see sharp concessions elsewhere. In addition, while this might be alleviated by lowering asset fidelity, we're still talking 1GB of (slower) DDR3 RAM available to devs versus 3.5GB of GDDR5.

So you would have a gimped port that consumers will likely pass on versus original content or the same game on a superior system, like they do already unless the console in question is the market's main egg (as the PS2 was). Thus sales will be impacted, and thus it wouldn't be fiscally responsible. All IMO, of course.

No, we're looking at 1GB DDR3 + 32MB eDRAM vs 3.5GB GDDR5 (obviously my point here is one of bandwidth rather than the extra 32MB of space).
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
What would be nice is an x-ray or micrograph to give us the real numbers and finally see what the heck is under the hood of the mystery that is the WiiU.
I think my local library has some of those. Will head there later this week with my Wii U in tow to take a look.
 
Interesting, so if we take the rumors as true, etc., does it seem more like WiiU/PS4/X720 are more like the 6th gen than the 7th gen?
I mean a difference like Dreamcast to PS2 to GameCube to Xbox, and not the difference between Wii and PS3/360.
There's no clear cut answer for that, because generations are defined not by a threshold of acceptable performance but by what is standard. No developer liked the PS2 that generation, not even Naughty Dog (no, really), but they had to work with it... so, "bugger", and they did. DC never really got off the ground as a standard; multiplatform for it was PSone games running in 480p with extra bells and whistles. If it had, though, polygon throughput would have been halved in multiplatform games, as well as being limited to 1.2GB of disc storage, which would have actually helped Gamecube.


Anyway, Nintendo hopes that Wii U becomes that (PS2) standard by selling quite a bit, by not being as casual-focused as the Wii, and by having a modern GPU this time around, meaning that even if the performance is reduced, cross-platform using the same base tech is always doable.

If it does get accepted as that minimum denominator, then games will be made first and foremost with that standard in mind, then upres'd with some extras enabled. That way multiplatform games could have 1080p standard on X720/PS4 and be held down to 720p on it, perhaps with some other downgrades, but doable nonetheless. That's their best case scenario. That best case scenario could also extend PS3 and X360 lifespans on the perception of them having similar capabilities though, and that's not good for Nintendo unless ports start to improve.

The worst case scenario is developers focusing on x86 optimization and deprecating PPC support in their tech right after PS3 and X360 die, relegating the Wii U to be forever technologically bundled with them (like what happened with the Wii and all its further "custom made" evolutions). The other problem is that if most PS4/X720 games are not made with Wii U and 1080p in mind (for their spec), then that'll make parity harder and harder, as porting will mean taking things away from the Wii U version rather than simply adding them to the higher spec'ed versions.

Basically, making games works one way: forward. Backward, albeit usually doable, is a butcher job.


From the looks of it though, it definitely looks like Nintendo undershot, and that this time around Sony and Microsoft aren't actually overshooting (though they're perhaps overshooting in the RAM department).

There will be a point where undershooting to a certain extent won't be a problem, but it's not up to Nintendo to guess that, only to conjecture (basically, if you always do the same thing you're bound to be right sometime): be it because developers really need the multiplatform approach, seeing as development costs are always increasing (and after the Wii and DS, Nintendo leads in mass appeal for the time being); or be it because, with those development costs increasing, at some point going all out on production values won't really be a good idea for most games (just like it's not economically viable to invest in a mobile phone game that really pushes the hardware, so no one does) and developers will just have to accept what's good enough and go from there.

Of course, in an artistry-driven industry that'll only happen when the situation is so severe that not being business-minded means certain doom (we've been further from that); when they're made to by higher-ups (the PS2 scenario) due to said machine being the only contender for lead platform (not gonna happen with 2 consoles being x86 like the PC); or when engine and asset scalability works really well (tessellation could work wonders there, but as with any new tech, time will tell how it's gonna be used). In the end though, there's always pressure and undermining by settling for less (and less power is less, for the developer community).
 

Donnie

Member
Exactly, you can downscale graphics but not CPU features. Like on PC: if the minimum spec is four cores, you can't play with three cores.

The developer decides the minimum spec. It is certainly possible to scale a game down to fewer CPU cores (the minimum spec on a PC game already does this to a degree). 8 down to 3 could be tricky, and even 6 (XBox3) down to 3 could be hard, but not impossible.
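To illustrate, here's a minimal sketch (hypothetical, not from any real engine) of a job system that sizes its worker pool to whatever the machine actually offers, which is how a "4-core minimum" becomes a choice rather than a hard wall:

Code:
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    // Size the pool to the hardware; a 3-core machine simply gets 3 workers
    // and a longer frame time, not a refusal to run.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::atomic<int> jobsDone{0};
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back([&jobsDone] {
            for (int j = 0; j < 1000; ++j)
                jobsDone.fetch_add(1);   // stand-in for physics/AI jobs
        });
    for (auto& t : pool)
        t.join();

    std::printf("%u workers finished %d jobs\n", workers, jobsDone.load());
    return 0;
}

The real cost is that the same job list takes longer per frame on fewer cores, which is where the design compromises come from.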
 

Donnie

Member
I still don't see where 32MB is going to be good for anything but some postprocessing effects, but ok. It obviously isn't fast enough to have devs enable AA...

It's going to remove pretty much all the bandwidth taken by rendering away from WiiU's main memory. Which is why you can't compare PS4's main memory bandwidth to WiiU's; they don't have the same job within their respective systems.

As I said, it's the same as saying PS4 has 3x the bandwidth of XBox3.
 
Wii U -> PS4 is about Wii -> PS3.

Might be a smidgen closer, but nothing to write home about. The advantage for the Wii U this time is that 720p still looks pretty good compared to 480p on an HDTV.
 
We might be looking at different things here.

Factoring in the CPU, which limits a lot of possibilities such as the primo stuff we've been seeing in these demo reels, the base featureset of the U is gimped. The other machines being an order of magnitude more advanced, that's expected. In addition, while we don't know the complete spec of the U GPU, there's bound to be a lot of effects that are just not reasonable to have in a lower-spec system (we're already seeing problems with low-res world shadows in this gen's U ports, which again might be a bandwidth issue).

A feature set is a set of features. Having less power than another system doesn't make those features nonexistent. The Wii U GPU, being a modern GPU, supports the same feature set as other DX11 parts (outside of DX11's specific way of doing tessellation, but that's been covered). One part can be more powerful than the other and they can still support the same feature set.

The 720 and PS4 aren't some order of magnitude more advanced than the Wii U GPU. They're DX11 parts; the Wii U part is based on a DX10.1 part. The move to DX11 mainly added 3 features, 2 of which the R700 line supported directly, and 1 (tessellation) which it could do, but not the way DX11 does it.

Having the same feature set but lower power just means you do the same things, but fewer of them, or less efficiently.

I don't see how the U can handle next gen particle, physics, and lighting engines, unless it takes the iOS way and prebakes those assets. If it does try to undertake those engines, we would see sharp concessions elsewhere. In addition, while this might be alleviated by lowering asset fidelity, we're still talking 1GB of (slower) DDR3 RAM available to devs versus 3.5GB of GDDR5.

And you're ignoring the Wii U's eDRAM. Particles, physics, and even lighting are all things that can still be done, just with less of them or at a lower resolution. These aren't things that would have to be reprogrammed from scratch: the 720/PS4 do a spot with XXX particles, the Wii U does it with XX particles. I mean shit dude, the difference of DDR3 vs GDDR3/5 can be seen even from high end to low end graphics cards. There are a couple of DX11 graphics cards out there using DDR3 instead of a GDDR variant. Guess what? They can still run the same games! They just don't look as good.

I'm not saying that these games are going to look as nice or run at the same frame rate or resolution as their 720/PS4 cousins. But these games can run on the system, and would require less of an investment than reprogramming an entire engine for a missing feature set like programmable shaders.

So you would have a gimped port that consumers will likely pass on versus original content or the same game on a superior system, like they do already unless the console in question is the market's main egg (as the PS2 was). Thus sales will be impacted, and thus it wouldn't be fiscally responsible. All IMO, of course.

1. Millions of people own only one system. I would imagine they would rather get a version with lower quality assets than no version at all. 2. If this is your attitude then every gamer on the planet should be playing on a system with an i7 and a 7970, and people should just forget consoles altogether, since the console versions are always going to be weaker, lower-asset, gimped versions that don't even come close to the grunt of a high end PC.
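On the "XXX particles vs XX particles" point above, the mechanism really is that mundane. A sketch with invented budgets:

Code:
#include <cstdio>

struct ParticleEmitter {
    int baseCount;  // count authored for the highest-spec platform
};

int scaledCount(const ParticleEmitter& e, float platformScale) {
    int n = static_cast<int>(e.baseCount * platformScale);
    return n < 1 ? 1 : n;  // always emit something
}

int main() {
    ParticleEmitter explosion{4000};
    std::printf("PS4/720 (1.00x): %d particles\n", scaledCount(explosion, 1.00f));
    std::printf("Wii U   (0.25x): %d particles\n", scaledCount(explosion, 0.25f));
    return 0;
}

One emitter, one code path, a per-platform scale factor. Nothing gets reprogrammed from scratch.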
 
Hi all, just catching up on this thread. I've lurked through all the WUSTs and wanted to note I'm willing to put some money up for the Chipworks photos. I'm interested in finding out more as well. Anyway, back to lurking and learning more about the architecture :)
 

Donnie

Member
Wii U -> PS4 is about Wii -> PS3.

Might be a smidgen closer, but nothing to write home about. The advantage for the Wii U this time is that 720p still looks pretty good compared to 480p on an HDTV.

It's more than a smidgen closer IMO, and the advantage is architecture (that's one of the main reasons why it's closer).
 

neo-berserk

Neo Member
The developer decides the minimum spec. It is certainly possible to scale a game down to fewer CPU cores (the minimum spec on a PC game already does this to a degree).

What I mean is, if a next-gen PC port supports a minimum of 4 cores, then it will be impossible to get a port on inferior hardware, because the engine utilises all sorts of features like lots of physics and tons of NPCs with different AI. Exactly like this gen, when the Wii couldn't get a port of even a game that can be played on a P4 @ 3GHz.

Edit: seen some posts that answer my questions.
 
What I mean is, if a next-gen PC port supports a minimum of 4 cores, then it will be impossible to get a port on inferior hardware, because the engine utilises all sorts of features like lots of physics and tons of NPCs with different AI. Exactly like this gen, when the Wii couldn't get a port of even a game that can be played on a P4 @ 3GHz.

No, you're not understanding why the Wii didn't get a port of that game. Plus, the upcoming engines are way more scalable than the ones of previous gens. Number of cores also isn't an exact measure of the power or ability of a CPU. I would take the 4 cores of an i7 2600K over 12 Jaguar cores any day of the week.
 
I'm not convinced. Between the R7xx, which we're still not sure the Wii U is based on, and the R1xxx, there have been many architectural changes that made effects that much more efficient per clock, and that made things (that don't have a 'resolution' per se) feasible in realtime at a playable clip. Including these effects on older hardware may simply not have been possible at a playable speed.

Then we are just assuming most textures and other assets scale down cleanly, which in a lot of cases they do not. It wouldn't be a faithful representation of what I was alluding to earlier.

In addition, there are many engines that aren't GPU bound that are being ignored.

You can't take games made to run on Intel HD 3000/4000 that run faster on more advanced GPUs and use them as examples of the differences on dedicated hardware, especially while ignoring everything but the unchanged parts of the graphics pipeline.
 

Donnie

Member
What I mean is, if a next-gen PC port supports a minimum of 4 cores, then it will be impossible to get a port on inferior hardware, because the engine utilises all sorts of features like lots of physics and tons of NPCs with different AI. Exactly like this gen, when the Wii couldn't get a port of even a game that can be played on a P4 @ 3GHz.

Everything can be scaled down: number of NPCs, physics/AI complexity, etc. It all just comes down to whether a publisher wants it to happen. Wii's biggest problem wasn't one of CPU core numbers or GPU power, but rather functionality; more specifically, the GPU being a DX7-level fixed-function architecture.
 

wsippel

Banned
Ok folks, just found this after a quick search on Chipworks:

https://chipworks.secure.force.com/...tionStr=CatalogSearchInc&searchText=nintendo

Wii's GPU/SoC is labelled the NEC D811301-K11. This makes it near certain that the Renesas D813301 is the Wii U GPU.
Yeah, it looks like you're right. The copyright date threw me off, but then again, that's Renesas' copyright, not AMD's. The S1C is probably a tiny EEPROM. There's no manufacturer or copyright marking, just a product code - similar to the Wii EEPROM.
 

Meelow

Banned
There's no clear cut answer for that, because generations are defined not by a threshold of acceptable performance but by what is standard. No developer liked the PS2 that generation, not even Naughty Dog (no, really), but they had to work with it... so, "bugger", and they did. DC never really got off the ground as a standard; multiplatform for it was PSone games running in 480p with extra bells and whistles. If it had, though, polygon throughput would have been halved in multiplatform games, as well as being limited to 1.2GB of disc storage, which would have actually helped Gamecube.


Anyway, Nintendo hopes that Wii U becomes that (PS2) standard by selling quite a bit, by not being as casual-focused as the Wii, and by having a modern GPU this time around, meaning that even if the performance is reduced, cross-platform using the same base tech is always doable.

If it does get accepted as that minimum denominator, then games will be made first and foremost with that standard in mind, then upres'd with some extras enabled. That way multiplatform games could have 1080p standard on X720/PS4 and be held down to 720p on it, perhaps with some other downgrades, but doable nonetheless. That's their best case scenario. That best case scenario could also extend PS3 and X360 lifespans on the perception of them having similar capabilities though, and that's not good for Nintendo unless ports start to improve.

The worst case scenario is developers focusing on x86 optimization and deprecating PPC support in their tech right after PS3 and X360 die, relegating the Wii U to be forever technologically bundled with them (like what happened with the Wii and all its further "custom made" evolutions). The other problem is that if most PS4/X720 games are not made with Wii U and 1080p in mind (for their spec), then that'll make parity harder and harder, as porting will mean taking things away from the Wii U version rather than simply adding them to the higher spec'ed versions.

Basically, making games works one way: forward. Backward, albeit usually doable, is a butcher job.


From the looks of it though, it definitely looks like Nintendo undershot, and that this time around Sony and Microsoft aren't actually overshooting (though they're perhaps overshooting in the RAM department).

There will be a point where undershooting to a certain extent won't be a problem, but it's not up to Nintendo to guess that, only to conjecture (basically, if you always do the same thing you're bound to be right sometime): be it because developers really need the multiplatform approach, seeing as development costs are always increasing (and after the Wii and DS, Nintendo leads in mass appeal for the time being); or be it because, with those development costs increasing, at some point going all out on production values won't really be a good idea for most games (just like it's not economically viable to invest in a mobile phone game that really pushes the hardware, so no one does) and developers will just have to accept what's good enough and go from there.

Of course, in an artistry-driven industry that'll only happen when the situation is so severe that not being business-minded means certain doom (we've been further from that); when they're made to by higher-ups (the PS2 scenario) due to said machine being the only contender for lead platform (not gonna happen with 2 consoles being x86 like the PC); or when engine and asset scalability works really well (tessellation could work wonders there, but as with any new tech, time will tell how it's gonna be used). In the end though, there's always pressure and undermining by settling for less (and less power is less, for the developer community).

Interesting, so if I am reading this correctly, it really depends on the third-party companies whether they want to make a Wii U version or not? With my comment I mean the consoles will be closer architecture-wise than what the Wii/PS3/360 showed, and the power difference won't be as big as it was then; more like the 6th generation.

No developer liked the PS2 that generation, not even Naughty Dog

Wait what?
 

Donnie

Member
No, you're not understanding why the Wii didn't get a port of that game. Plus, the upcoming engines are way more scalable than the ones of previous gens. Number of cores also isn't an exact measure of the power or ability of a CPU. I would take the 4 cores of an i7 2600K over 12 Jaguar cores any day of the week.

I would probably take a 4-core i7 2600K over 16 Jaguar cores at 1.6GHz.
 
I still don't see where 32MB is going to be good for anything but some postprocessing effects, but ok. It obviously isn't fast enough to have devs enable AA...
For most developers, and while working multiplatform, it'll be useless because their work pipelines won't be planned out to take advantage of it, and so they won't.

For Nintendo though, having a big framebuffer means they can keep some Gamecube-era tricks up their sleeve. This machine is very Gamecube-y in various architectural senses; in this case, in the sense that it actually keeps the headroom to do things via the same line of thought.

I'll explain: the reason GC and Wii took so little of a hit doing EMBM (think water effects) and fur shading, amongst other things, was that they had 1MB of texture cache eDRAM embedded on the GPU. That meant there was a direct channel to spam the GPU with repetitive small-size assets, reducing the hit of having to go via the main RAM bank and take bandwidth away from it.

That meant spamming fur shading was more an issue of how many textures per pass you could do, and it's why games like Pikmin and Mario Galaxy abused EMBM so much it seemed like they could spam the effect on all surfaces without a hit; water maps were similar.

They had a channel for that, and in this sense they're retaining it. I'm sure they'll use it.

X360 and PS3 games this gen have barely used fur shading by comparison: the X360 framebuffer is so small for HD that nobody would ever dream of using it for something else, and the PS3 lacks eDRAM. "That" is an extra (like post-processing effects), but an extra nonetheless; it always helps in setting the games apart and enriching them.
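The framebuffer arithmetic behind that is easy to check. A sketch assuming simple 32-bit colour + 32-bit depth targets with no MSAA (my simplification, not a statement about how either console actually lays out its eDRAM):

Code:
#include <cstdio>

double targetMB(int w, int h, int colorBytes, int depthBytes) {
    return double(w) * h * (colorBytes + depthBytes) / (1024.0 * 1024.0);
}

int main() {
    // X360 has 10MB of eDRAM; Wii U reportedly has 32MB.
    std::printf(" 720p colour+Z: %5.1f MB\n", targetMB(1280, 720, 4, 4));
    std::printf("1080p colour+Z: %5.1f MB\n", targetMB(1920, 1080, 4, 4));
    // ~7MB at 720p already crowds a 10MB pool (and MSAA blows past it,
    // hence tiling on X360); a 32MB pool fits even ~16MB of 1080p targets
    // with headroom left over for "extras" like the effects described above.
    return 0;
}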
 

Shaanyboi

Banned
Interesting, so if I am reading this correctly, it really depends on the third-party companies whether they want to make a Wii U version or not? With my comment I mean the consoles will be closer architecture-wise than what the Wii/PS3/360 showed, and the power difference won't be as big as it was then; more like the 6th generation.

It certainly is sounding like more of a viable option than it was on the Wii. It's just a matter of whether they're willing to take that plunge, and frankly, I don't know that I'd bet on them doing that.
 

Meelow

Banned
It certainly is sounding like more of a viable option than it was on the Wii. It's just a matter of whether they're willing to take that plunge, and frankly, I don't know that I'd bet on them doing that.

Well, everyone says never bet against Nintendo because they usually prove people wrong, so maybe everyone should bet against Nintendo lol...
 

ozfunghi

Member
Wii U -> PS4 is about Wii -> PS3.

Might be a smidgen closer, but nothing to write home about. The advantage for the Wii U this time is that 720p still looks pretty good compared to 480p on an HDTV.

Well, I guess case closed.


So,

"difficult" TEV units in SD -> fully programmable shaders in HD with 8x the grunt

is the same as

fully programmable shaders in HD -> fully programmable shaders in HD with 4 to 6x the grunt.

Glad I can learn something here.
 

Shaanyboi

Banned
Well, everyone says never bet against Nintendo because they usually prove people wrong, so maybe everyone should bet against Nintendo lol...

Oh, I'd never bet against Nintendo. Their games keep me buying their hardware. And that stuff gets attention. I just wouldn't count on third parties supporting Nintendo. Which sucks, but it's a dumb fucking reality that we live in.
 

AzaK

Member
Interesting, so if I am reading this correctly, it really depends on the third-party companies whether they want to make a Wii U version or not? With my comment I mean the consoles will be closer architecture-wise than what the Wii/PS3/360 showed, and the power difference won't be as big as it was then; more like the 6th generation.

Pretty much, but I think we need to take into consideration the fact that there will be two consoles of much closer performance, just like the 360/PS3 in the Wii generation. What this could mean is that publishers see enough income on those two machines alone (like they did with the PS3/360) that it's a lot easier for them to ignore the Wii U.

Given the Wii U's pretty slow start, that might mean more ammunition for publishers to say "Naa, won't bother".
 

Meelow

Banned
Oh, I'd never bet against Nintendo. Their games keep me buying their hardware. And that stuff gets attention. I just wouldn't count on third parties supporting Nintendo. Which sucks, but it's a dumb fucking reality that we live in.

Well if we bet against them maybe we could see decent Wii U third party support lol.

Pretty much, but I think we need to take into consideration the fact that there will be two consoles of much closer performance, just like the 360/PS3 in the Wii generation. What this could mean is that publishers see enough income on those two machines alone (like they did with the PS3/360) that it's a lot easier for them to ignore the Wii U.

Given the Wii U's pretty slow start, that might mean more ammunition for publishers to say "Naa, won't bother".

See, that's the problem, but what happens if the PS4 and Xbox 720 don't have a strong start? Would devs just keep making PS3/360 games, or would they just ride through with PS4/720? I mean, I know some devs think that people with a PS3/360 could just buy that version instead of them making a Wii U version, but wouldn't that also apply to any console?
 

guek

Banned
Given the Wii U's pretty slow start, that might mean more ammunition for publishers to say "Naa, won't bother".

On one hand, downporting to Wii U isn't going to be free, programmable shaders or no. It'd still require some work and resources that devs might not see as worthwhile.


On the other hand, Wii U hasn't had a "slow start," it's had a pretty average start. It's fairly unreasonable to expect PS4/XB3 to climb in sales much faster than Wii U has. While there will be exceptions, it's unlikely that most 3rd parties will go next gen only for quite some time.

What matters most in the meantime is establishing a market on the Wii U using cross-generation ports alongside PS3/360 versions. People can talk specs all they want, but most devs are going to go where the money is, which is why I think the next few years are going to be ridiculous when it comes to ports.
 
So,

"difficult" TEV units in SD -> fully programmable shaders in HD with 8x the grunt

is the same as

fully programmable shaders in HD -> fully programmable shaders in HD with 4 to 6x the grunt.

Glad I can learn something here.

PS3 had pretty ancient shaders compared to today's GPUs. It still had dedicated pixel and vertex shaders. It also lacked any real programmability with general purpose code.

Wii U will lack all the GPGPU performance of the new consoles, as well as being very CPU-bound. There will be a decent difference.
 

neo-berserk

Neo Member
No, you're not understanding why the Wii didn't get a port of that game. Plus, the upcoming engines are way more scalable than the ones of previous gens. Number of cores also isn't an exact measure of the power or ability of a CPU. I would take the 4 cores of an i7 2600K over 12 Jaguar cores any day of the week.

No, I understand perfectly... I just think (well, I changed my mind) that every port could* be brought to Wii U, because of the following: the Black Ops port on Wii, Dead Rising is a good example, etc. I think it will be up to the dev whether he wants his vision of the game downgraded or not.
 
No developer liked the PS2 that generation, not even Naughty Dog
Wait what?

Naughty Dog's Jason Rubin: "Yes, it's very nice that Nintendo Gamecube can do eight layers in one pass. It's all set up for you. Believe me, I would have loved it"
Source: http://uk.ign.com/articles/2000/11/04/gamecube-versus-playstation-2

Dude was actually defending the ability to get the same results with optimization on the PS2, but in between he admitted that of course he'd have liked a simpler architecture that did all that out of the box without the same hit. This was shortly before Sony announced they were acquiring Naughty Dog, so although they were working exclusively on it, they were technically free to do whatever they wanted.

So he actually "liked it" because he was betting on that horse and believed it would sell the most, thus warranting the effort, but otherwise "hell no".

Capcom (and Shinji Mikami in particular) also hated it, one of the reasons being that not even the debug units were reliable; they had to replace most of them every 6 months or so. And they also really didn't like the architecture. It just felt ancient even next to the DC (the architecture, not the actual power), and the texturing capabilities were really bad.
 

Shaanyboi

Banned
See, that's the problem, but what happens if the PS4 and Xbox 720 don't have a strong start? Would devs just keep making PS3/360 games, or would they just ride through with PS4/720? I mean, I know some devs think that people with a PS3/360 could just buy that version instead of them making a Wii U version, but wouldn't that also apply to any console?

Except... I think in general, the gaming community is going to guarantee the 720's/PS4's success. Or at least for one of them, assuming they both don't fuck up on pricing, etc. But at least enough to ensure devs develop for at least ONE of the platforms. And if they don't do AS well, then yeah, I'd imagine you'd see a lot of up-ports from the 360/PS3 until they do.


Nintendo, with the way they're now focused on more affordable-to-manufacture hardware, and just the general feeling towards them, it's less of a safe bet how well they're going to do in the grand scheme of things.
 

Berg

Member
So,

"difficult" TEV units in SD -> fully programmable shaders in HD with 8x the grunt

is the same as

fully programmable shaders in HD -> fully programmable shaders in HD with 4 to 6x the grunt.

Glad I can learn something here.

Not sure why you quoted me... Thought it was clear enough that I was being sarcastic.
 

Thraktor

Member
Ok folks, just found this after a quick search on Chipworks:

https://chipworks.secure.force.com/...tionStr=CatalogSearchInc&searchText=nintendo

Wii's GPU/SoC is labelled the NEC D811301-K11. This makes it near certain that the Renesas D813301 is the Wii U GPU.

Or the D813301 is a small component that's directly related to the Hollywood GPU (i.e. it's there for backwards compatibility). I'd want to check to be 100% sure either way.

True, though I think there should be enough recognizable "radeon" bits in there to get an idea of the GPU proper. Something like the DSP and ARM etc. will essentially just be bolted on (the ARM proc may be pretty obvious on its own), or at least distinct enough from the meat of the GPU (the shader arrays). It's a similar idea with the other fixed-function stuff like the UVD & display controllers etc. on PC GPUs.

I'd expect the eDRAM to be very obvious. :)

Maybe Chipworks' report/analysis will help things along?

Anyhoo, my schedule isn't going to be good for the next while, so I can't do it, but thanks. :) Just thought I could chime in to show it shouldn't be a daunting task.

Yeah, the eDRAM will be pretty obvious right off the bat, and the Radeon bits should be similar enough to make counting them easy (especially at that resolution). To be honest, it's probably just that I'm really interested in the esoteric bits (particularly the memory interfaces), so while I'd consider that an important part of the exercise, most people are probably only interested in the headline specs (ie ALUs/ROPs/etc.), which will be easy enough to give.


Alright, if no-one else is going to volunteer themselves, I might as well do so myself. I don't have the expertise of someone like AlStrong, but I've been studying the chip pretty extensively over the past year and a half, and have spent enough time poring over die photos of R700 chips and so forth that I would feel fairly comfortable figuring out the vital statistics, at the very least. I can also have a decent stab at the rest of it, so things like the memory interfaces, internal changes to the GPU cores, the type of ARM core and DSP used, etc., but of course couldn't guarantee that I'll get all of that stuff 100% accurate (it's only for the real nerds, though). What I might do is PM Blu and Durante, and if neither are willing to do it themselves, I'll ask them for pointers on specific components, and go to them with any questions I have while doing it. Best to have back-up for something like this, I suppose.

I also think (hope) that I've been here long enough to be trusted with it, although of course if people don't feel so I wouldn't be offended. I have the next couple of days off, as well, so I have enough time to really put the work in on it. I should say in advance that it may be a few days before I could post anything, both because I'd want to have a lot of time to simply go over the photo, and also to make sure what I'm writing is accurate and run it past people like Blu, Durante and AlStrong for errors. As soon as something like this gets posted on GAF it's going to be all over the internet within 10 minutes, so there isn't really much scope for screwing up.

Anyway, we can give folk a little more time to mull it over, then take a decision later tonight.
 

Meelow

Banned
Source: http://uk.ign.com/articles/2000/11/04/gamecube-versus-playstation-2

Dude was actually defending the ability to get the same results with optimization on the PS2, but in between he admitted that of course he'd have liked a simpler architecture that did all that out of the box without the same hit. This was shortly before Sony announced they were acquiring Naughty Dog, so although they were working exclusively on it, they were technically free to do whatever they wanted.

So he actually "liked it" because he was betting on that horse and believed it would sell the most, thus warranting the effort, but otherwise "hell no".

Capcom (and Shinji Mikami in particular) also hated it, one of the reasons being that not even the debug units were reliable; they had to replace most of them every 6 months or so. And they also really didn't like the architecture. It just felt ancient even next to the DC (the architecture, not the actual power), and the texturing capabilities were really bad.

Yeah, I remember reading that the PS2 architecture was "different"; it's interesting that even Naughty Dog said something about it compared to the GameCube.

Except... I think in general, the gaming community is going to guarantee the 720's/PS4's success. Or at least for one of them, assuming they both don't fuck up on pricing, etc. But at least enough to ensure devs develop for at least ONE of the platforms. And if they don't do AS well, then yeah, I'd imagine you'd see a lot of up-ports from the 360/PS3 until they do.


Nintendo, with the way they're now focused on more affordable-to-manufacture hardware, and just the general feeling towards them, it's less of a safe bet how well they're going to do in the grand scheme of things.

I guess we really have to wait and see what happens; if the PS4/720 aren't all that at launch, then I could see PS3/360 versions for the next few years.

Didn't Ubisoft also confirm something about the development costs of Wii U?
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
I'll chip in $20 for the chip-pics... but only with a money back guarantee if the results turn out to be disappointing.
 