
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Easy_D

never left the stone age
It was just announced for the PS4 today. It already runs on PS3, 360, and Wii. Only the PS3 has an Nvidia GPU.



It ran on the Wii. How can it not run well on the Wii U? Doesn't make much sense. Must be a business reason, or Nintendo didn't pay to have it supported. Either way it doesn't make much sense...

Are you sure you're not confusing PhysX with Havok?
 

Blades64

Banned
PhysX is an Nvidia feature. The Wii U is going with AMD.

It can be run on AMD but it's probably not supported well.

It was just announced for the PS4 today. It already runs on PS3, 360, and Wii. Only the PS3 has an Nvidia GPU.

It ran on the Wii. How can it not run well on the Wii U? Doesn't make much sense. Must be a business reason, or Nintendo didn't pay to have it supported. Either way it doesn't make much sense...

Are you sure you guys are talking about the same thing? Normal PhysX is just middleware, right?
While PhysX is Nvidia's thing, right? Just making sure whether they're one and the same or not.
 

prag16

Banned
I thought you implied the problems were with developers. If you want to compare releases of things like Watch Dogs or something (same time launches), that might be better.

And yeah, they can use it as proof, for money hogs like Activision, seeing those poor COD sales is probably enough for them to dislike the Wii U. That won't stop them from making the games though.

I indicated that the developers certainly have a hand in creating a self fulfilling prophecy by shoveling crap out there then pointing and saying "look, Wii (U)'s audience doesn't buy this stuff." Are you saying using the games I listed as proof is really a valid and intellectually honest thing to do?

If the Wii's history is any indication, Wii U will still get CoD. I doubt the port from PS360 costs them enough that 100,000 sales doesn't turn at least some profit. Future installments will sell more as the installed base goes up.

Yes, comparing simultaneous releases with feature parity will be a better indicator (e.g. AC4, Watch_Dogs even though it's cross-gen, etc). Obviously they'll sell more on PS360, but the degree of the flop is what we're paying attention to.
 

wsippel

Banned
You don't have to be super specific. Think of it this way: let's imagine for a moment cross-gen game X, which will also appear on the Xbox 360. In your opinion, what improvements should be theoretically possible? Let us assume for comparison's sake that its counterpart is running at 720p.

Can it run the same title at 1080p?

Can it run the same title with better textures, lighting and 1080p?

Can it run the same title with better textures, lighting, 1080p and 60fps?

Can it run the same title as above, but also running a completely different, independent, fully 3D gameplay session on the gamepad?

Each progression, though not specific, would imply a certain level of performance improvement.

Doubling resolution implies a near doubling of graphical processing load. As does merely doubling frame rate. Doing both implies a rough quadrupling.

These questions are rough and won't answer exactly how much more capable the Wii U is graphically from last gen, but it would give a rough estimate.
Again: Impossible to tell. Depends on too many factors, and there are still too many unanswered questions.
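Those rough multipliers can be sanity-checked with simple pixel-throughput arithmetic (a sketch only, assuming GPU load scales roughly linearly with pixels drawn per second; real workloads also hinge on shader cost, bandwidth and CPU work):

```python
# Rough pixel-throughput multipliers for the hypothetical cross-gen game X.
# Assumes load ~ width * height * fps, which ignores shader complexity,
# memory bandwidth, CPU load, etc.

def pixels_per_second(width, height, fps):
    """Pixels the GPU must shade each second at a given resolution and frame rate."""
    return width * height * fps

baseline = pixels_per_second(1280, 720, 30)   # the 720p/30fps Xbox 360 version

# Each hypothetical step up, relative to the 360 baseline:
scenarios = {
    "1080p/30fps": pixels_per_second(1920, 1080, 30),
    "1080p/60fps": pixels_per_second(1920, 1080, 60),
}

for name, load in scenarios.items():
    print(f"{name}: {load / baseline:.2f}x the baseline pixel throughput")
```

In this toy model, 720p to 1080p is a 2.25x jump in raw pixels, and doubling the frame rate on top of that brings it to 4.5x, which lines up with the "rough quadrupling" rule of thumb above.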



That's just another middleware. It can't simply be swapped into a game that used PhysX; the game would have to be designed with that middleware.

This could be why some games are not coming to the Wii U. I know The Witcher 3 is using PhysX... hmmm.
PhysX runs on Wii U. Trine 2 uses PhysX for example.
 
Are you sure you guys are talking about the same thing? Normal PhysX is just middleware, right?
While PhysX is Nvidia's thing, right? Just making sure whether they're one and the same or not.

It's the same thing; it's middleware offered by Nvidia. They shat on AMD users by forcing it to be CPU-only. They want SOMEONE to use it, so consoles and (just announced) the PS4 are going to use it. I'm assuming it'll make use of the GPU; if not, and they gimp it like its PC AMD counterpart, no one will use it.

I indicated that the developers certainly have a hand in creating a self fulfilling prophecy by shoveling crap out there then pointing and saying "look, Wii (U)'s audience doesn't buy this stuff." Are you saying using the games I listed as proof is really a valid and intellectually honest thing to do?

If the Wii's history is any indication, Wii U will still get CoD. I doubt the port from PS360 costs them enough that 100,000 sales doesn't turn at least some profit. Future installments will sell more as the installed base goes up.

Yes, comparing simultaneous releases with feature parity will be a better indicator (e.g. AC4, Watch_Dogs even though it's cross-gen, etc). Obviously they'll sell more on PS360, but the degree of the flop is what we're paying attention to.
The thing is with PS3 and 360, PS3 got the shaft time and time and time again. Why would Wii U be exempt of that behavior? Is it their fault? Maybe, but it's nothing exclusive to what's happening on Nintendo's consoles.
 

prag16

Banned
The thing is with PS3 and 360, PS3 got the shaft time and time and time again. Why would Wii U be exempt of that behavior? Is it their fault? Maybe, but it's nothing exclusive to what's happening on Nintendo's consoles.

To a degree, but by and large the developers stuck with PS3 (even if the ports were often slightly inferior to the 360 versions) until the console finally started selling.
 
To a degree, but by and large the developers stuck with PS3 (even if the ports were often slightly inferior to the 360 versions) until the console finally started selling.

They stuck with it because it was new and "on par" tech-wise with the 360. Nintendo finally caught up with the rest in power (and exceeded them), but many companies are looking at the prospects of revenue from Durango and PS4, systems that will allow more customized features and selling points for customers.

Hell, Hollywood can't sell movies anymore to the "young crowd" without spending hundreds of millions on CGI, and these big companies like EA, Sony, Activision realize they need to up the graphics. If the Wii U can't provide it, they'll "drop it" and support it minimally.
 

USC-fan

Banned
The Wii U does support it, but why was it not announced? No "official" support?

http://www.neogaf.com/forum/showpost.php?p=45059487&postcount=67

If I remember right we use a newer version of PhysX on Wii U than on the other consoles, so that might make a difference (still within the 2.8.x line though). I would say that whatever difference there is in physics, it's rather minimal. If this thread is still active tomorrow, I can find out more from the programmers.
 
Wii U 1.5x
X720 9x
PS4 11x

Real life...

Blimey, must troll harder.

There's plenty we don't know about the Wii U's components but let's have a look at what we do know:

The 360 has 1MB of CPU cache; the Wii U has 3MB of cache, and eDRAM has improved over the last 8 years.
The 360's CPU is in-order and has to deal with sound, IO and OS functions. Espresso is out-of-order, meaning there are fewer idle cycles, and doesn't have to deal with those tasks thanks to the Wii U having a DSP and an ARM co-processor.
The GPU in the 360 has a DX9-equivalent feature set; Latte has a DX11-equivalent feature set.
The 360 has a tessellation unit, but it's pants. The Wii U has a more modern tessellation unit which, according to Shin'En, is usable.
The 360's GPU has 10MB of eDRAM on a daughter die; Latte has 32MB of eDRAM and 3MB of SRAM on-die.
The 360 has 512MB of RAM; the Wii U has 2GB of RAM with considerably less latency.
The 360 as a whole has bottlenecks coming out of its ears; the Wii U has been designed specifically to avoid bottlenecks.

And we don't have a Scooby Doo what half of the silicon in the GPU does, but we do know that the Broadway in the Wii outperforms Xenon in the 360 at general processing tasks, and that Espresso, whilst being a different chip, must share some similarities... and no, it's not 3 Broadways duct-taped together lol.

In terms of real world performance I'm expecting the Wii U to be 3-4 times more powerful than the 360. The 360 is a badly designed machine tbh.
 

USC-fan

Banned
Blimey, must troll harder.

So where does that put the Xbox 720 and PS4?
 

roddur

Member
Blimey, must troll harder.


oh, you've opened another can of comparison posts. just wait
 
So where does that put the Xbox 720 and PS4?

They are still where they are. The Wii U having fewer bottlenecks does jack shit for the PS4/720. Unless you're saying Nintendo removing bottlenecks on Wii U hardware removes PS4/720 bottlenecks...

I don't get your logic here...
 

AOC83

Banned
Blimey, must troll harder.



It's 1GB of RAM for games at half the speed, and a CPU that is considerably worse.
 
It's 1GB of RAM for games at half the speed, and a CPU that is considerably worse.

So how does the Wii U's memory system work exactly? You seem to have all the details we lack.

How does NFS: U pull off PC textures compared to PS360 when the bandwidth is "gimped"?
 

Azure J

Member
Blimey, must troll harder.


While I agree with the overall point you're trying to make, I'd rephrase this. 360's architecture is only "badly designed" in retrospect. It did an admirable job for 8 years (RROD stuff notwithstanding) and produced a lot of cool things.

Not too well obviously.

Don't we have a quote (from Criterion I believe) stating otherwise? I seem to recall the memory setup getting a bit of praise from someone up there, but my memory's foggy and I haven't been GAFfing it up as much regarding technical aspects of Wii U in a while.
 

AkiraGr

Banned
It's 1GB of RAM for games at half the speed, and a CPU that is considerably worse.

Are you serious with those kinds of claims? Do you even know what you're talking about?

The architecture of the Wii U trumps PS360 in pure performance, even on paper. For the PS4 or Nextbox, we don't have a clue about performance except target renders of PS4 videos and gameplay footage of Killzone Dark Fall.

A previous poster said that the Wii U cannot manage graphics like Killzone. I beg to differ, but this is wishful thinking right now; let's wait until E3.
 
Oh wow.. some of you guys...

I hope you don't mean me, lol

Not saying the Wii U is close to the PS4/720. It's not. Those two are far ahead of the Wii U.

But just putting the Wii U at PS360 level is just wrong. The Wii U IS more powerful than PS360 and it will show soon. NFS: U is just "the beginning", since the final dev kits and documentation/SDK came very late.
 

deviljho

Member
tech thread said:
business talk

 
Even if the Wii U is on par with the 360, which I certainly disagree with (it is better - I have checked all the Wii U tech threads), it clearly has 2 advantages that will show in due time:

- Has more RAM for better textures (NFSMW U is using PC textures)
- Has a GPU, that while it is a mystery, it has all modern GPU features (above DirectX 9)

I understand people who want Nintendo to have a beast machine. It comes down to opinion though. I believe the Wii U is a great architecture that will provide very good looking games and great gameplay experiences, with the added advantage of the gamepad, a feature which I have really loved since I bought it in November. I also love my PS3, and current-gen games IMO look really good, but it is getting harder for me now to play on the PS3 without the gamepad.

Regarding other next-gen systems, I will probably get one of those or a PC, but first I will wait to catch up on other Wii U and PS3 games while I check how things turn out for the PS4/X720.
 
It's 1GB of RAM for games at half the speed, and a CPU that is considerably worse.

Nintendo (and Microsoft) have sacrificed bandwidth for latency. Think of the pipe/water analogy above, but with a bigger tap to fill a kettle for the PS4. That's great: you can turn your tap on and get more water flowing through at once. The only downside is that with the Wii U and 720 taps the water flows as soon as you turn the tap on, but your PS4 tap waits a few seconds before water comes out. RAM latency was a huge problem for developers last gen.

As for the CPU, as I've already mentioned, it's better at general processing than Xenon. Xenon is miles better at floating point work but that's because the PS3 and 360 were poorly designed. The CPUs in the PS4 and 720 will also be 'considerably worse' than Xenon at floating point work. That's what GPUs are for.
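The tap analogy corresponds to a simple transfer-time model: total fetch time is roughly fixed latency plus size divided by bandwidth, so small scattered accesses (game logic chasing pointers) are dominated by latency, while big streaming reads (textures) are dominated by bandwidth. A toy sketch, where the bandwidths are the commonly cited DDR3/GDDR5 peaks and the latency numbers are invented purely for illustration:

```python
def transfer_time_ns(size_bytes, latency_ns, bandwidth_gb_s):
    """Time to fetch a block: fixed access latency plus streaming time (ns)."""
    return latency_ns + size_bytes / (bandwidth_gb_s * 1e9) * 1e9

# Two hypothetical memory pools. Bandwidths are commonly cited peak figures;
# the latency values are made up for illustration only.
low_latency    = dict(latency_ns=50,  bandwidth_gb_s=12.8)   # DDR3-style pool
high_bandwidth = dict(latency_ns=200, bandwidth_gb_s=176.0)  # GDDR5-style pool

for size in (64, 1_000_000):  # a single cache line vs. a 1MB texture
    t_low = transfer_time_ns(size, **low_latency)
    t_high = transfer_time_ns(size, **high_bandwidth)
    print(f"{size}B fetch: low-latency pool {t_low:.0f}ns, "
          f"high-bandwidth pool {t_high:.0f}ns")
```

In this toy model the low-latency pool wins on the small scattered reads and the high-bandwidth pool wins on the large streams, which is exactly the trade-off the tap analogy is describing.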
 
While I agree with the overall point you're trying to make, I'd rephrase this. 360's architecture is only "badly designed" in retrospect. It did an admirable job for 8 years (RROD stuff notwithstanding) and produced a lot of cool things.



Don't we have a quote (from Criterion I believe) stating otherwise? I seem to recall the memory setup getting a bit of praise from someone up there, but my memory's foggy and I haven't been GAFfing it up as much regarding technical aspects of Wii U in a while.

Not sure about Criterion but Ancel with Ubisoft Montpellier and one of the guys from Shin'En have praised the Wii U's RAM, and there hasn't been a single developer - not even an anonymous one - that's complained about it.

And, yes, I may have been too harsh on the design of both the PS3 and 360 in my posts. 'Oddly designed' would probably be more fitting. I'm still boggled by the choice from both platform holders not to have a DSP.
 
It's 1GB of RAM for games at half the speed, and a CPU that is considerably worse.
Well, both your reply and his post are incomplete. A fairer comparison would be something like this:
There's plenty we don't know about the Wii U's components but let's have a look at what we do know:

The 360 has 1MB of CPU cache, the Wii U has 3MB of cache and eDRAM has improved over the last 8 years (although the Wii U uses eDRAM that has more or less the same performance as the original Wii's SRAM; it would be good to have a comparison of cycles wasted between the Wii U and the Xbox 360).

The 360's CPU is in-order and has to deal with sound, IO and OS functions. Espresso is out-of-order, meaning there are fewer idle cycles, and doesn't have to deal with those tasks thanks to the Wii U having a DSP and an ARM co-processor. To have a much more complete picture: the Xbox 360 has a huge pipeline of more than 30 stages, which of course carries huge penalties when it comes to pipeline stalls. It also runs at 3.2GHz compared to Espresso's 1.24GHz, and to mitigate the lack of out-of-order execution it can run two threads of code on every core. Since the 1MB of L2 cache is shared, the more threads you run the less cache is available to each core.
Cache can be programmed on both processors (Espresso and Xenon) to improve its use.


The GPU in the 360 has a DX9-equivalent feature set, Latte has a DX11-equivalent feature set. Latte is also faster (550MHz vs. the Xenos' 500MHz) and has some sort of hardware functions à la TEV which are adapted to the standards required for the Wii U.

The 360 has a tesselation unit, but it's pants. The Wii U has a more modern tesselation unit which according to Shin'En is usable.

The 360's GPU has 10MB of eDRAM on a daughter-die, Latte has 32MB of eDRAM and 3MB of SRAM on-die. The 32MB of on-die eDRAM and the (2+1MB) SRAM are the main working memory of the Wii U. Unlike the Xbox 360's eDRAM, which can only be written by the GPU at a maximum bandwidth of 32GB/s, the eDRAM on the Wii U can be written and read by both the GPU and the CPU, so the buffers and data written to it don't have to travel to the big pool of RAM to be passed back to the CPU/GPU to be read.

The 360 has 512MB of RAM, the Wii U has 2GB of RAM with considerably less latency. The maximum bandwidth of the Xbox 360's big memory pool is 22.4GB/s, while the bandwidth of the Wii U's big pool of RAM is 12.8GB/s (faster RAM, but on a 64-bit bus).
The fact that the Wii U uses an MCM design, and that the northbridge is located on the GPU in both consoles, reduces the latency between the main memory and the CPU even beyond the Wii U's memory having lower latency itself.
Of course, the communication between CPU and GPU is also much faster because of that.


The 360 as a whole has bottlenecks coming out of its ears, the Wii U has been designed specifically to avoid bottlenecks. Yes, but this is not data per se. I mean, this is something that can be seen by looking at all the points listed before; it doesn't add information that wasn't listed already.

And we don't have a Scooby Doo what half of the silicon in the GPU does, but we do know that the Broadway in the Wii outperforms Xenon in the 360 at general processing tasks and that Espresso, whilst being a different chip, must share some similarities... and no, it's not 3 Broadways duct-taped together lol. Well, Nintendo and other developers have given us clues. For example, we know that it has a tessellation unit (we were told by Shin'en), a new texture compression algorithm also compatible with the one used on the Wii (maybe a superset of it, or an entirely new one plus the one used on the Wii), and some other Wii functions adapted to work with Wii U games, confirmed by Iwata himself.

This would be a more complete picture of both consoles, and although I firmly think that the Wii U will provide better results than most people expect, I wouldn't use these sorts of multipliers (1.5x, 2x, 3x, 4x or whatever) to describe it.
It's simply a much more modern and efficient design with a "bit" more theoretical power to play around with (not a bit per se; what I mean is that the difference in power is not as great as the difference in features and efficiency).
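The two main-memory figures in that comparison fall straight out of bus width times transfer rate (a sketch assuming the commonly reported DDR3-1600 on a 64-bit bus for the Wii U and 700MHz double-data-rate GDDR3 on a 128-bit bus for the 360):

```python
def bandwidth_gb_s(transfers_per_sec, bus_bits):
    """Peak bandwidth = transfer rate x bus width in bytes, in GB/s."""
    return transfers_per_sec * (bus_bits / 8) / 1e9

# Wii U main RAM: DDR3-1600 (1600 MT/s) on a 64-bit bus
wii_u = bandwidth_gb_s(1600e6, 64)
# Xbox 360 main RAM: 700MHz GDDR3 (double data rate) on a 128-bit bus
x360 = bandwidth_gb_s(700e6 * 2, 128)

print(f"Wii U: {wii_u:.1f} GB/s, Xbox 360: {x360:.1f} GB/s")
```

This reproduces the 12.8GB/s and 22.4GB/s quoted above: the Wii U's RAM runs more transfers per second, but the 360's wider bus gives it the higher peak figure.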
 
Are you serious with those kinds of claims? Do you even know what you're talking about?

The architecture of the Wii U trumps PS360 in pure performance, even on paper. For the PS4 or Nextbox, we don't have a clue about performance except target renders of PS4 videos and gameplay footage of Killzone Dark Fall.

A previous poster said that the Wii U cannot manage graphics like Killzone. I beg to differ, but this is wishful thinking right now; let's wait until E3.

1) That guy doesn't know what he's talking about.

2) You're saying things that are false as well, but some true... let's get things straight.
Wii U CPU > PS360. I think there are a few things the 360/PS3 processors do better, but overall, the Wii U's is better. We know how the Jaguar CPUs will perform; they're based on architectures that already exist. The difference? It's upgraded x2. The GPU is also known hardware: GCN, a tad more powerful than a 7850 (which has 16 CUs as opposed to the PS4's 18, plus other things).

3) It's Shadow Fall ;P

4) The Wii U matching 1080p with that AA and that AF won't happen unless it's very "basic." Killzone had a lot of volumetric particle effects that I don't think the Wii U would be able to do, and if it did, it would take up more than it's worth.

I hope you don't mean me, lol

Not saying the Wii U is close to the PS4/720. It's not. Those two are far ahead of the Wii U.

But just putting the Wii U at PS360 level is just wrong. The Wii U IS more powerful than PS360 and it will show soon. NFS: U is just "the beginning", since the final dev kits and documentation/SDK came very late.

Lol, not you... well, I think you overhype Wii U, the... other guy... really downplays it (to say the least). I have my biases though.
 

wsippel

Banned
It's 1GB of RAM for games at half the speed, and a CPU that is considerably worse.
You have no idea what you're talking about. My suggestion: either ask questions, or just read and don't post anything; don't post asinine statements about things you don't understand.
 
Well, both your reply and his post are incomplete. A fairer comparison would be something like this:


This would be a more complete picture of both consoles, and although I firmly think that the WiiU will provide better results than most people expect, I wouldn't use this sort of multipliers (1.5x, 2x, 3x, 4x or whatever) to describe it.
It's simply a much more modern and efficient design with a "bit" (not a bit per-se, what I mean is that the difference in power is not as great as it is in features and efficiency) more theoretical power to play around.

Cheers for the clarifications and additions, much appreciated. ;o)
 
In your opinion, how much more powerful can the Wii U be than the previous generation when only drawing 45 watts?

I'd have thought that by now it's blatantly obvious that "power" just isn't as important as it used to be... Obviously, to a great extent it is important. But after a certain point, all you get is gravy. Very expensive gravy.
 

prag16

Banned
You have no idea what you're talking about. My suggestion: either ask questions, or just read and don't post anything; don't post asinine statements about things you don't understand.

Problem is we all feed these trolls. :(

Some of the people who are down on the Wii U actually want to have a conversation. But this guy, USC-fan, Van Owen, etc are just here to rile us up.
 

Schnozberry

Member
4) The Wii U matching 1080p with that AA and that AF won't happen unless it's very "basic." Killzone had a lot of volumetric particle effects that I don't think the Wii U would be able to do, and if it did, it would take up more than it's worth.

I don't think the Wii U will be able to match the scale of graphics we saw from Shadow Fall, and I don't think it has to. If Retro or whoever else within Nintendo's farm team decides to make an exclusive FPS or "First Person Adventure" as they called the Prime Series, it just needs to make interesting use of the tablet and deliver gameplay mechanics not seen in other shooters. If the graphics are good enough and the gameplay is great, people will flock to it.

My problem with the comparisons to Shadow Fall is that once the actual gameplay got started, it was right back to hiding behind walls and shooting people in the face. That's great and all, but it just felt like every shooter from last gen with better explosions. Same thing for the Capcom Dungeon Crawler Engine Demo, once you get past how clean it looked, it seemed like another game where you swing swords at orcs and dragons. Again, that's cool, but I've done that before. Wiping the Vaseline off the lens to give me a sharper picture only goes so far.
 
Well, both your reply and his post are incomplete. A fairer comparison would be something like this: ...
It's simply a much more modern and efficient design with a "bit" (not a bit per-se, what I mean is that the difference in power is not as great as it is in features and efficiency) more theoretical power to play around.

Thank you for detailing that out. Best wishes!
 
I don't think the Wii U will be able to match the scale of graphics we saw from Shadow Fall, and I don't think it has to. If Retro or whoever else within Nintendo's farm team decides to make an exclusive FPS or "First Person Adventure" as they called the Prime Series, it just needs to make interesting use of the tablet and deliver gameplay mechanics not seen in other shooters. If the graphics are good enough and the gameplay is great, people will flock to it.

My problem with the comparisons to Shadow Fall is that once the actual gameplay got started, it was right back to hiding behind walls and shooting people in the face. That's great and all, but it just felt like every shooter from last gen with better explosions. Same thing for the Capcom Dungeon Crawler Engine Demo, once you get past how clean it looked, it seemed like another game where you swing swords at orcs and dragons. Again, that's cool, but I've done that before. Wiping the Vaseline off the lens to give me a sharper picture only goes so far.

Well, we aren't comparing gameplay. Most of anything Nintendo has dumps on Killzone. They were talking about technically, and technically, the Wii U can't compete.
 

Oblivion

Fetishing muscular manly men in skintight hosery
2) You're saying things that are false as well, but some true... let's get things straight.
Wii U CPU > PS360. I think there are a few things the 360/PS3 processors do better, but overall, the Wii U's is better. We know how the Jaguar CPUs will perform; they're based on architectures that already exist. The difference? It's upgraded x2. The GPU is also known hardware: GCN, a tad more powerful than a 7850 (which has 16 CUs as opposed to the PS4's 18, plus other things).

4) The Wii U matching 1080p with that AA and that AF won't happen unless it's very "basic." Killzone had a lot of volumetric particle effects that I don't think the Wii U would be able to do, and if it did, it would take up more than it's worth.

Wait, you say that the CPU is stronger on Wii-U, but it won't be able to replicate KZ? There's more RAM and the GPU is way more powerful than the PS360, so how could it NOT achieve something that's not only similar, but surpasses KZ (if only slightly)?
 

Blades64

Banned
I think the Wii U could potentially compare visually to Killzone, albeit at a lower res.

I'm inclined to agree, but I still reserve final judgment. I think the Wii U was designed with 720p in mind (for intensive 3D games). Sure, it can achieve 1080p on titles like Rayman, but I believe most of its games will be in 720p, and I also believe that we will get some rather nice looking games out of the Wii U. Still not so sure about reaching the fidelity of the Killzone demo though...
 

Schnozberry

Member
Wait, you say that the CPU is stronger on Wii-U, but it won't be able to replicate KZ? There's more RAM and the GPU is way more powerful than the PS360, so how could it NOT achieve something that's not only similar, but surpasses KZ (if only slightly)?

He's referring to Killzone Shadow Fall for the PS4.
 
That is true. If game consoles lived or died on tech specs, the Wii U would already be in a shallow grave.

I'm not implying that ;]

Wait, you say that the CPU is stronger on Wii-U, but it won't be able to replicate KZ? There's more RAM and the GPU is way more powerful than the PS360, so how could it NOT achieve something that's not only similar, but surpasses KZ (if only slightly)?

Shadow Fall, not KZ2 or 3 lol.

Also, CPU doesn't dictate graphics (though the Wii U GPU is more powerful than PS360 as well).
 

Schnozberry

Member
I'm inclined to agree, but I still reserve final judgment. I think the Wii U was designed with 720p in mind (for intensive 3D games). Sure, it can achieve 1080p on titles like Rayman, but I believe most of its games will be in 720p, and I also believe that we will get some rather nice looking games out of the Wii U. Still not so sure about reaching the fidelity of the Killzone demo, though...

720p is fine for most people in most console gaming situations, since few people sit close enough to their TV to really discern the fine detail differences between the resolutions. I think the resolution debate is a bit overblown. The increased fidelity that faster hardware brings, like higher-quality assets and larger game worlds, is an obvious enough improvement that everybody will notice. I'd take Killzone Shadow Fall at 720p/60fps over 1080p/30fps all day. The ability to say your game runs at 1080p offers more to the people marketing games than it does to the players.
 

Noirulus

Member
Are you serious with those kinds of claims? Do you even know what you're talking about?

The architecture of the Wii U trumps PS360 in pure performance, even on paper. As for the PS4 or Nextbox, we don't have a clue about performance except target renders in PS4 videos and gameplay footage of Killzone Dark Fall.

A previous poster said that the Wii U cannot manage graphics like Killzone. I beg to differ, but that's wishful thinking right now; let's wait until E3.

Wait, so you think the Wii U can manage Killzone: SF's level of graphics, or Killzone 2/3's level of graphics?
 
I assume you're familiar with the concept of a self fulfilling prophecy. Even though based on a lot of your comments, you could've fooled me...
If you're referring to the idea that publishers avoid the Wii U, because they don't see a market, and therefore no market is created - I'm perfectly aware.

It isn't their job to do so. There is no onus on them to do so. If a self-fulfilling prophecy is in action, the onus is on Nintendo to break it.
This. Let's look at the examples of multiplat "failures":

AGAIN, SELF FULFILLING PROPHECY. Some of it's on Nintendo, but not all of it; not the majority. With the exception of MAYBE Ubisoft, these third parties are a bunch of ass clowns with the way they handled the Wii (and now the Wii U).
In your examples, you cite lateness; the Wii U was the one LTTP. You cite shoddiness; Nintendo didn't provide the best documentation or support, as we know from these threads.

Did you expect half a dozen publishers to delay half a dozen games just for Nintendo's console?
Did you expect them to price it lower, still sell paltry amounts, and make their investment even less worthwhile?
Did you expect them to dedicate resources to DLC for an unproven SKU, when more prosperous uses of those resources could be applied?

COD BLOPS is the biggest third party game there is. COD BLOPS2 released five days late because of Nintendo's timing. COD BLOPS2 continues to sell well past its launch anyway on other platforms. COD BLOPS2 on Wii U may not have sold 100K globally.

It's business. Publishers will assess their returns against their internal expectations for the SKUs in question. I doubt any of them had lofty expectations. But even then I imagine the Wii U is failing to meet them.

---

Although, as the thread seems to have returned to tech talk, this will be my last post on the matter.
 

Mithos

Member
COD BLOPS is the biggest third party game there is. COD BLOPS2 released five days late because of Nintendo's timing. COD BLOPS2 continues to sell well past its launch anyway on other platforms. COD BLOPS2 on Wii U may not have sold 100K globally.

And still it has probably broken even, or is just about to, and by the time BLOPS3 comes it will have made a hefty profit. I think people need to go back and check how COD games have sold on Nintendo platforms previously, and see there is nothing new about these "low" sales.

Evergreen or GTFO is pretty much how it goes on Nintendo platforms if you want to sell millions and millions.
 

AkiraGr

Banned
1) That guy doesn't know what he's talking about.

Can you please be more specific about what I don't know and enlighten me? Maybe you know something better than I do; I don't recall saying I know everything.

2) You're saying some things that are false and some that are true... let's get things straight.
Wii U CPU > PS360. I think there are a few things the 360/PS3 processors do better, but overall, Wii U's is better. We know how the Jaguar CPUs will perform. It's based on architectures that already exist. The difference? It's upgraded x2.

What things do the PS3 or Xbox 360 do better? Have you developed a game on any of these systems? Did you do any comparison of their architectures against a custom CPU built by IBM, like the Cell in the PS3, the MCM PowerPC CPU in the Wii U, and the Xenon? My questions are sincere. As for the Jaguar, it is not upgraded x2. It's custom, all right, but it's built as a single-die APU with 4MB of memory cache for 8 cores, whereas the normal Jaguar has 2MB of memory cache. Unless there's some magic trick or sorcery by AMD or Sony engineers that I don't know about, the specs reveal that the CPU is JUST TWO Jaguars DUCT TAPED together, clocked at 1.6GHz. The Jaguar is a cheap solution for small portable devices that "mews" out just enough power for its size, just 3.1mm2 per die without L2 cache, with low power consumption and no overheating problems. It is a very good choice for cost, but not for performance.

http://www.xbitlabs.com/news/mainbo...t_Core_AMD_Jaguar_Microprocessors_Report.html

http://www.xbitlabs.com/news/cpu/di...ext_Generation_Jaguar_Micro_Architecture.html

Also, where exactly are the elements you present as facts about the PS4 GPU?

The GPU is also known hardware: GCN, a tad more powerful than a 7850 (which has 16 CUs as opposed to the PS4's 18, plus other things).

http://www.polygon.com/2013/2/20/4009940/playstation-4-tech-specs-hardware-details

The official specs describe it as a next-generation Radeon BASED graphics engine:

Main Processor
Single-chip custom processor
CPU: x86-64 AMD "Jaguar," 8 cores
GPU: 1.84 TFLOPS, AMD next-generation Radeon based graphics engine
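As a sanity check, that 1.84 TFLOPS number lines up with back-of-envelope GCN math if you assume 18 CUs at roughly 800MHz (the clock is my assumption here, not an officially confirmed figure):

```python
# Rough GCN single-precision throughput: each CU has 64 shader ALUs,
# and each ALU can do one fused multiply-add (2 FLOPs) per clock.
def gcn_tflops(compute_units, clock_mhz):
    return compute_units * 64 * 2 * clock_mhz * 1e6 / 1e12

print(gcn_tflops(18, 800))  # PS4, assuming 800MHz: 1.8432 -> the quoted 1.84 TFLOPS
print(gcn_tflops(16, 860))  # Radeon HD 7850 at 860MHz: ~1.76 TFLOPS
```

Note this is peak theoretical throughput, not sustained real-world performance.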

I would love to see that be true, but on a single APU from AMD? The most powerful GPU they have planned for the market, the RADEON HD 8850, consumes 130 watts for the GPU ALONE, has 2.99 TFLOPS of performance, and IT IS NEXT GEN.

http://www.cpu-world.com/news_2012/2012092201__Sea_Islands_Radeon_HD_8800-Series_Specs_Revealed.html

The RADEON HD 7850 is previous generation, and its specs are: 860MHz engine clock, 2GB GDDR5 memory, 1200MHz memory clock (4.8 Gbps GDDR5), 153.6GB/s maximum memory bandwidth, 1.76 TFLOPS single-precision compute power, GCN architecture, and roughly 96 watts peak power.

http://www.techpowerup.com/reviews/AMD/HD_7850_HD_7870/24.html
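That 153.6GB/s bandwidth figure also falls straight out of the GDDR5 math; a rough sketch, assuming the 7850's 256-bit memory bus:

```python
# GDDR5 is quad-pumped: a 1200MHz memory clock gives 4.8 Gbps per pin.
# Total bandwidth = per-pin data rate * bus width in bits / 8 bits per byte.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    effective_gbps = mem_clock_mhz * 4 / 1000  # effective data rate per pin
    return effective_gbps * bus_width_bits / 8

print(gddr5_bandwidth_gbs(1200, 256))  # 153.6 GB/s, matching the quoted spec
```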

If by any means AMD manages to fit ALL that into a single APU running at the imaginary (your reference) 130 watts of power consumption, I would BE VERY impressed. And don't forget the amazing 1.8 TFLOPS of performance in the PS3's specs:

GPU: RSX @550MHz

1.8 TFLOPS floating point performance
Full HD (up to 1080p) x 2 channels
Multi-way programmable parallel floating point shader pipelines

http://playstation.about.com/od/ps3/a/PS3SpecsDetails_3.htm

3) It's Shadow Fall ;P

I know; rushed typing, my mistake.


4) Wii U matching 1080p at that AA with that AF won't happen unless it's very "basic." Killzone had a lot of volumetric particle effects that I don't think the Wii U would be able to do, and if it did, it would take up more than it's worth.

So you're guessing without even knowing the facts of what the Wii U can or can't do? Volumetric particle effects at 1080p with AA? You have a very sharp eye, to notice AA in a direct-feed video of a game.

I have my biases though.

Sure you do.
 