
WiiU technical discussion (serious discussions welcome)

z0m3le

Banned
The main reason GCN is better than VLIW5 on PC is that it's a simpler architecture to optimize drivers for, and PC games are programmed on much more abstract layers than console games (GPGPU too, I presume, but I don't know how a modified VLIW5 would perform in that regard).

VLIW5 in a console makes a lot of sense because EVERYTHING will be programmed towards this architecture, which is more efficient per mm^2 than GCN.

Yes, VLIW5 was rarely taken full advantage of, so its theoretical FLOPs were never really reached: usually only 3 or 4 of the 5 units were in use. That's why AMD moved on to VLIW4, and further improvements came with GCN, which gets much closer to its theoretical FLOPs.
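The utilization point above can be put in rough numbers. A back-of-the-envelope sketch, using the standard peak-FLOPS formula (SPs × 2 ops/cycle for an FMA × clock); the HD 5870's 850 MHz reference clock and the 3-of-5 / 4-of-5 slot fill rates are the figures discussed in this thread, not measured values:

```python
# Back-of-the-envelope: peak vs. effective FLOPS for a VLIW5 GPU.
# Peak assumes every SP retires one fused multiply-add (2 ops) per cycle.
def peak_gflops(shader_units, clock_ghz, ops_per_cycle=2):
    return shader_units * ops_per_cycle * clock_ghz

hd5870_peak = peak_gflops(1600, 0.85)  # 2720 GFLOPS on paper at 850 MHz

# If the compiler only fills 3 or 4 of the 5 VLIW slots on average,
# the effective throughput drops proportionally:
for slots_used in (3, 4, 5):
    effective = hd5870_peak * slots_used / 5
    print(f"{slots_used}/5 slots -> {effective:.0f} GFLOPS")
```

At 3-of-5 utilization the "2.7 TFLOPS" card behaves like a ~1.6 TFLOPS one, which is the gap console-level coding could in principle close.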

And like you said, in a console this would be very different. A well-coded game will have all 5 units in use, rarely letting them go idle, so it hits its theoretical FLOPs much more often than it ever did on PC. You can also see this somewhat with simpler tasks such as cryptography, I believe, where VLIW5 performs as well as GCN per shader.

Case in point: in Bitcoin mining, the HD 5870 regularly hits 440 Mhash/s with 1600 SPs @ 1 GHz, while the HD 7950 hits ~510 Mhash/s with 1792 SPs @ 1 GHz.

This is of course just one "benchmark" that uses all SPs all of the time; VLIW5 coded for at such a low level should, in theory, perform very similarly.

Another quick reason I think I'm standing on solid ground with this theory is VLIW4: the HD 6970 had 1536 SPs and outperformed the HD 5870 in just about every benchmark, yet has ~identical performance in Bitcoin mining. The HD 6000 series "cut the fat" (the 5 units shrank to 4) to gain this performance, but consoles work differently than computers, and coding closer to the metal lets VLIW5 use its full potential, which would put the HD 5870 beyond the HD 6970 at the same clocks.
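The per-shader claim in the mining example is easy to check with quick division, using exactly the hashrates quoted in these posts (a sketch only; real mining throughput also depends on clocks and kernel tuning):

```python
# Mhash/s per stream processor, all cards normalized to ~1 GHz as in the post.
cards = {
    "HD 5870 (VLIW5)": (440, 1600),   # (Mhash/s, stream processors)
    "HD 6970 (VLIW4)": (440, 1536),   # ~identical mining perf to the 5870
    "HD 7950 (GCN)":   (510, 1792),
}
for name, (mhash, sps) in cards.items():
    print(f"{name}: {mhash / sps:.3f} Mhash/s per SP")
```

All three land within a few percent of each other per SP, which is the "fully utilized VLIW5 keeps up with GCN per shader" argument in numeric form.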

That is of course without taking into account other advances in the architecture, like more advanced tessellation units or other small changes made between these cards. Remember, though, that we have no idea how customized, if at all, Wii U's GPU is. Nintendo's history points to some fairly big customization, however, and I wouldn't put VLIW5's FLOPs below GCN's in a console setting.
 
HD 5770 vs. HD 7770 clocked the same: the 5770 has 20+% more raw FLOPS but is outperformed in almost every task, by as much as 37%.
To be fair, that's VLIW5 in a PC environment; remember AMD scaled it down to VLIW4 on the HD 69x0s because the 5-way capacity wasn't used often, if at all, in real applications.

Leaving that fifth slot unused wastes 20% of every stream processor in there; on a console, coding specifically for it, you can use it (though sure, not every game will, but you can).

GCN is more efficient, for sure, and the fact that Wii U is the odd one out here won't do it any favours; but the fact is that such a difference can be reduced at least slightly, and developers can do it if they want to get the most performance out of it.

EDIT: Above poster, thank you.
 
Where was I talking about efficiency?

Yeah, that's one downside for Wii U. But the feature set is basically complete and comparable across all 3 consoles.

This is currently false. You guys are hoping some "secret sauce" was tweaked into the R7xx architecture that puts its feature set equal to DX11.1, when it was originally DX10-based. This simply can't be confirmed; if it can be, please show me a source. I find it hard to believe that Nintendo tweaked much older architecture to support features that newer architecture does. It doesn't make much sense: why not just use the new architecture from the get-go? DX11-based GPUs were definitely available. The R7xx architecture it's using is from 2008.

What we know for sure is that the architecture it's based on has a DX10 feature set. So what we still have is a DX10 vs. DX11.1 feature set, plus GCN's efficiency-per-FLOP differences (edit: how much of a difference this makes is being disputed in the posts above, though the consensus suggests there is still some difference, even if smaller), and the overall much higher FLOP counts.

I personally would like to know what the differences between DX10 and DX11.1 are. I'm assuming the Wii U GPU has that feature set until it can be proven otherwise by at least some sort of reliable source with concrete details.

Well, first off, R700 was DX10.1-based. Also, the only thing holding R700 back from the DX11 requirements, IIRC, was the tessellation unit used. However, the difference between DX10.1 and DX11 was extremely small, mostly Microsoft's own added tools, which wouldn't be used in a console setting. And remember, it was antonz who told us it had 2011/2012 bells and whistles, which does point to DX11, as do articles stating it has DX11 features with DX9 performance (obviously DX9 isn't a performance benchmark, so the person saying this is trying to frame the Wii U as a weak-performing GPU, but doesn't do it by attacking its features).

This could mean anything. Bells and whistles? And who is antonz? Someone posted a list of features added in 11.1 vs. 11 in one of the Orbis threads, and the list was quite long.



edit: I'm doing some reading, and the main difference is that DX10 does not support tessellation but DX11 does. There are some other things, like advanced DOF and slightly better shadows on DX11. Most say that besides tessellation the differences were pretty small. *continues reading*

Other differences with DX11:
* Multithreaded Rendering
* Compute Shaders (GPGPU?). I thought Wii U had GPGPU?
 

z0m3le

Banned
http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/4 here is a bit more on why AMD went to VLIW4 in the first place.

This is currently false. You guys are hoping some "secret sauce" was tweaked into the R7xx architecture that puts its feature set equal to DX11.1, when it was originally DX10-based. This simply can't be confirmed; if it can be, please show me a source. I find it hard to believe that Nintendo tweaked much older architecture to support features that newer architecture does. It doesn't make much sense: why not just use the new architecture from the get-go? DX11-based GPUs were definitely available. The R7xx architecture it's using is from 2008.

What we know for sure is that the architecture it's based on has a DX10 feature set. So what we still have is a DX10 vs. DX11.1 feature set, plus GCN's efficiency-per-FLOP differences, and the overall much higher FLOP counts.

I personally would like to know what the differences between DX10 and DX11.1 are.

Well, first off, R700 was DX10.1-based. Also, the only thing holding R700 back from the DX11 requirements, IIRC, was the tessellation unit used. However, the difference between DX10.1 and DX11 was extremely small, mostly Microsoft's own added tools, which wouldn't be used in a console setting. And remember, it was antonz who told us Wii U's GPU had 2011/2012 bells and whistles, which does point to DX11, as do articles stating it has DX11 features with DX9 performance (obviously DX9 isn't a performance benchmark, so the person saying this is trying to frame the Wii U as a weak-performing GPU, but doesn't do it by attacking its features).

This could mean anything. Bells and whistles? And who is antonz? Someone posted a list of features added in 11.1 vs. 11 in one of the Orbis threads, and the list was quite long.

Antonz is a GAF member who has been confirmed to be a Wii U developer. He said this in the last WUST, I believe; it was actually in response to Burntpork, one of his last acts before being banned. Also, DX11.1 was added to the HD7000 series when they released the HD8000 OEM chips, which are just upclocks of HD7000, so DX11.1's feature set does not seem very hard to add to DX11. The features were likely added through drivers rather than new hardware, since the HD8000 OEM is just a rebrand.
 
This is currently false. You guys are hoping some "secret sauce" was tweaked into the R7xx architecture that puts its feature set equal to DX11.1, when it was originally DX10-based. This simply can't be confirmed; if it can be, please show me a source. I find it hard to believe that Nintendo tweaked much older architecture to support features that newer architecture does. It doesn't make much sense: why not just use the new architecture from the get-go? DX11-based GPUs were definitely available. The R7xx architecture it's using is from 2008.

What we know for sure is that the architecture it's based on has a DX10 feature set. So what we still have is a DX10 vs. DX11.1 feature set, plus GCN's efficiency-per-FLOP differences, and the overall much higher FLOP counts.

I personally would like to know what the differences between DX10 and DX11.1 are.

DX11 added 3 main things to the feature set. Two of them were doable on AMD's RV700 line through DX, and one it could do, but in a way different from how the DX11 API goes about it. The two that could already run on a DX10.1 card were Compute Shaders and multithreaded rendering. The third was Tessellation, which the RV700 chips had; they just weren't as good at it, and did things differently from the official DX11 spec. That doesn't matter for the Wii U, though, since the Wii U won't be using DX11. So the 3 main things DX11 added over DX10 are all doable on an RV700 chip, with one just not doable through the DX API, and again that's not an issue for the Wii U.
 
DX11 added 3 main things to the feature set. Two of them were doable on AMD's RV700 line through DX, and one it could do, but in a way different from how the DX11 API goes about it. The two that could already run on a DX10.1 card were Compute Shaders and multithreaded rendering. The third was Tessellation, which the RV700 chips had; they just weren't as good at it, and did things differently from the official DX11 spec. That doesn't matter for the Wii U, though, since the Wii U won't be using DX11. So the 3 main things DX11 added over DX10 are all doable on an RV700 chip, with one just not doable through the DX API, and again that's not an issue for the Wii U.

Yeah, I just read about those 3 main things; look at my edit up above, lol.

Everything I've read has said DX10.1 flat-out didn't support tessellation. Was the RV700 chip just an exception to this rule? Is it still safe to assume Durango/Orbis' GPUs would be a bit better at tessellation, though?
 

z0m3le

Banned
Yeah, I just read about those 3 main things; look at my edit up above, lol.

Everything I've read has said DX10.1 flat-out didn't support tessellation. Was the RV700 chip just an exception to this rule? Is it still safe to assume Durango/Orbis' GPUs would be a bit better at tessellation, though?

RV700 used AMD's Gen 2 tessellation engine, so it could do it, but it wasn't used. It also went about it in a different way, which was part of the reason it wasn't used, but mostly it was because everything uses DX11, and DX11 had a different way to use it. I would half expect Gen 2 to be replaced in Wii U, since Nintendo must expect Sony and Microsoft to have gone with Gen 3+, and Gen 2 might not be compatible. That's the kind of customization that would make the most sense in Wii U's GPU, and we are fairly certain customizations were made.
 
I think Nintendo really just wanted to make sure NSMBU was a launch title; they didn't really try to push the graphics or go for a higher resolution. I think NSMBU could easily have been native 1080p on the Wii U.

If 1080p was so easy to accomplish, then why wasn't it 1080p? You mention it being a launch title and Nintendo not trying to push graphics, but that doesn't negate the claim that "NSMBU could have easily been native 1080p."
 
RV700 used AMD's Gen 2 tessellation engine, so it could do it, but it wasn't used. It also went about it in a different way, which was part of the reason it wasn't used, but mostly it was because everything uses DX11, and DX11 had a different way to use it. I would half expect Gen 2 to be replaced in Wii U, since Nintendo must expect Sony and Microsoft to have gone with Gen 3+, and Gen 2 might not be compatible. That's the kind of customization that would make the most sense in Wii U's GPU, and we are fairly certain customizations were made.
Another thing from newer hardware generations that's supposed to have been added is Eyefinity support (introduced from R8xx onwards).

It's certainly not just an R7xx.
 

z0m3le

Banned
If 1080p was so easy to accomplish, then why wasn't it 1080p? You mention it being a launch title and Nintendo not trying to push graphics, but that doesn't negate the claim that "NSMBU could have easily been native 1080p."

It honestly might have something to do with final hardware coming in Q3 2012; that might have made it miss launch had they not gotten the game to run perfectly on the unfinished hardware. Obviously this is just speculation, but it seems reasonable to assume.

Another thing from newer hardware generations that's supposed to have been added is Eyefinity support (introduced from R8xx onwards).

It's certainly not just an R7xx.

Yes, this is a big one; to me it was the first evidence that it couldn't be just vanilla R700.
 
RV700 used AMD's Gen 2 tessellation engine, so it could do it, but it wasn't used. It also went about it in a different way, which was part of the reason it wasn't used, but mostly it was because everything uses DX11, and DX11 had a different way to use it. I would half expect Gen 2 to be replaced in Wii U, since Nintendo must expect Sony and Microsoft to have gone with Gen 3+, and Gen 2 might not be compatible. That's the kind of customization that would make the most sense in Wii U's GPU, and we are fairly certain customizations were made.

Yeah, makes sense, but until a Wii U game uses tessellation we won't know for sure. I wonder if certain devs will comment on whether or not they're using it in their upcoming games. It wouldn't be breaking NDA, I don't think. Some interviewer has just got to ask the right question.
 

Darryl

Banned
If 1080p was so easy to accomplish, then why wasn't it 1080p? You mention it being a launch title and Nintendo not trying to push graphics, but that doesn't negate the claim that "NSMBU could have easily been native 1080p."


It probably began as 720p and they didn't think it was worth the work to increase the resolution. NSMB:U isn't exactly trying to wow anybody with its visuals, and given the nature of the design choices I don't think the increase would be very noticeable as it is.
 

NateDrake

Member
It probably began as 720p and they didn't think it was worth the work to increase the resolution. NSMB:U isn't exactly trying to wow anybody with its visuals, and given the nature of the design choices I don't think the increase would be very noticeable as it is.

Agreed.
 
It honestly might have something to do with final hardware coming in Q3 2012; that might have made it miss launch had they not gotten the game to run perfectly on the unfinished hardware. Obviously this is just speculation, but it seems reasonable to assume.

So second-gen games should have at least some 1080p examples?
 

z0m3le

Banned
So second-gen games should have at least some 1080p examples?

There are already 1080p games on Wii U. Off the top of my head:
Toki Tori 2
Monster Hunter 3U
Skylanders Giants
Scribblenauts Unlimited
Not sure if Trine 2 is 1080p, but IIRC there are a few others.

Yeah, makes sense, but until a Wii U game uses tessellation we won't know for sure. I wonder if certain devs will comment on whether or not they're using it in their upcoming games. It wouldn't be breaking NDA, I don't think. Some interviewer has just got to ask the right question.

"X" seems to be using tessellation on the mountain ranges, sadly the thing about tessellation is that good tessellation should be impossible to tell.
 

Daschysta

Member
If the fully powered dev kits were only available as late as Q3 2012, a noticeable jump in the visual fidelity of second-wave software (independent of the expected jump that comes with becoming fully accustomed to development on new hardware) should be expected, no? I'm perfectly satisfied with the system visually already, but some more pep would be welcomed by many people. I think X could be a game that shows the jump over last-gen systems if they polish it a bit; the textures, for a game of that scale, already appear to be above most PS360 open-world games such as RDR or Oblivion, to my untrained eye.

The question will likely be answered when one of three events occurs: final direct-feed footage of "X" is released or leaked, Retro's game is shown off by Nintendo (at E3 or otherwise), or 3D Mario is displayed (at E3 or otherwise).

The Wii U should already be able to display more complex textures at higher res than PS360, but one of these games will probably demonstrate whether the Wii U can handle and use some of the effects that are in question.

Personally, I'm perfectly satisfied already, as I know for certain Nintendo's unique style of game will look beautiful, even if it were in fact only at PS360 level (though it's likely stronger). However, the implications for 3rd-party software and the profitability of supporting it depend on cross-platform features and other things related to visuals, so hopefully the GPU and memory structure are capable of DX11.1-compatible features and other APIs that would make porting as easy as possible, thus ameliorating some of the 3rd-party apathy surrounding the system.
 
http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/4 here is a bit more on why AMD went to VLIW4 in the first place.



Well, first off, R700 was DX10.1-based. Also, the only thing holding R700 back from the DX11 requirements, IIRC, was the tessellation unit used. However, the difference between DX10.1 and DX11 was extremely small, mostly Microsoft's own added tools, which wouldn't be used in a console setting. And remember, it was antonz who told us Wii U's GPU had 2011/2012 bells and whistles, which does point to DX11, as do articles stating it has DX11 features with DX9 performance (obviously DX9 isn't a performance benchmark, so the person saying this is trying to frame the Wii U as a weak-performing GPU, but doesn't do it by attacking its features).



Antonz is a GAF member who has been confirmed to be a Wii U developer. He said this in the last WUST I believe, it was actually to a response from Burntpork, one of his last acts before being banned. Also DX11.1 was added to HD7000 series when they released the HD8000 OEM chips, which are just upclocks of HD7000, so DX11.1's feature set does not seem very hard to add to DX11, as they were likely added through drivers and not directly needing new hardware, since the HD8000 OEM is just a rebrand.

To back up z0m3le and Antonz, there are other insiders who have implied or stated that there are some features beyond DX10.1/SM4. This includes Li Mu Bai from the Beyond3D threads, who threw out some very interesting posts, and Arkam, who was previously confirmed to be legit by the mods.
 
I believe I recall someone at one point stating that good tessellation will be "invisible" to the viewer, because it's supposed to be seamless. That could make it more complicated to spot tessellation in a game.

You mean it gained more FPS per actual FLOP (see Durante's remark). But this is in a PC context; is it proven/known that this holds without the "software layer" known as Windows, i.e. in a dedicated console? Or does the formula to calculate FLOPS differ for GCN architecture (which is somewhat what you are saying)?



Right, but this is all in a Windows environment. The argument was whether the same discrepancy holds up in a console environment.

Yes, thanks for correcting my phrasing.
 
To back up z0m3le and Antonz, there are other insiders who have implied or stated that there are some features beyond DX10.1/SM4. This includes Li Mu Bai from the Beyond3D threads, who threw out some very interesting posts, and Arkam, who was previously confirmed to be legit by the mods.

Yet none of those guys say it supports tessellation. We just know it supports some features beyond DX10/SM4. It appears there's definitely a chance it does; we still just don't know for sure.
 

Roo

Member
I did.

Yes, this gets you excited. He basically said the console has a memory-intensive design.



And the GPU being "fairly mature and going the same direction as competitors" is also good, but not surprising.

So does this mean Wii U will get essentially the same games as Orbis and Durango, with a few compromises like lower FPS, a lower resolution, and/or some effects missing?
A case similar to PC >> PS360?
 

z0m3le

Banned
Yet none of those guys say it supports tessellation. We just know it supports some features beyond DX10/SM4. It appears there's definitely a chance it does; we still just don't know for sure.

Again, tessellation is known; it's even on the spec sheet that was posted months ago. RV700 also had a tessellation unit... Exclusives will certainly take advantage of the tessellation unit if desired; multiplatform games would likely require Gen 3 of AMD's tessellation units rather than the Gen 2 found in RV700. Considering I keep saying "Gen 2," you'd think it would be clear that tessellation is an old feature; in fact, Xenos has a tessellation unit, but it's very early and only supports surface tessellation, IIRC.

PS: Arkam confirmed the spec sheet leak I'm talking about, which mentions tessellation.
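For anyone unclear on what a tessellation unit actually buys you, here is a minimal CPU-side sketch of the idea (geometry amplification); this is purely illustrative and not how the hardware units discussed above are programmed, which work on patches with tessellation factors:

```python
# One step of triangle tessellation: split a triangle into 4 via edge midpoints.
# A GPU tessellation unit does this kind of geometry amplification in hardware,
# so detailed meshes (e.g. mountain ranges) don't have to be stored or sent in full.
def midpoint(a, b):
    return tuple((x + y) / 2 for x, y in zip(a, b))

def subdivide(tri):
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tris = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
for _ in range(3):                      # 3 subdivision levels
    tris = [t for parent in tris for t in subdivide(parent)]
print(len(tris))  # 4**3 = 64 triangles from one input triangle
```

Each level quadruples the triangle count, which is why well-done tessellation adds detail "invisibly": the silhouette smoothly gains geometry rather than visibly popping.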

So does this mean Wii U will get essentially the same games as Orbis and Durango, with a few compromises like lower FPS, a lower resolution, or some effects missing?
A case similar to PC >> PS360?

If developers want to put them there, yes... though in the case of the PS4 having 400+ GFLOPS dedicated to just GPGPU, that would likely be hard to replicate on Wii U, and some of those games might be impossible. But any physics-based gameplay that is required would also be random and hard to program for, since the variables would be far too huge; it's like guessing a lotto ticket number correctly.

"Program for" is a bad term; "plan for" would be more accurate. Basically, what I'm saying is that GPGPU becoming important to a game's design is very limited, because a game cannot plan for any particular thing to happen when it is supposed to simulate a more realistic effect in the way objects move and break. Developers can't plan events around destruction, since the destruction could end up completely different and make the level uncompletable. Of course they know about these sorts of things and will still allow a huge increase in visual fidelity.
 

ahm998

Member
In my opinion, the Wii U will get all the competitors' games at 720p, while 1080p will be the standard on the others.

That's because Durango & PS4 are between 2.25 and 3 times more powerful than Wii U, and that is the difference between 720p & 1080p.
 

z0m3le

Banned
In my opinion, the Wii U will get all the competitors' games at 720p, while 1080p will be the standard on the others.

That's because Durango & PS4 are between 2.25 and 3 times more powerful than Wii U, and that is the difference between 720p & 1080p.

Right, if it were limited to pixel fill rate you'd be completely correct, but there are other things to take into consideration. The games should all be possible given how scalable everything is, but it would be more than just a lower resolution; you'd also see reduced effects. Think of it as PC settings: you can bump the resolution down, and that helps a lot with performance, but often you also have to lower other settings from high to medium or even low to get a good frame rate.

Developers would have to do this in a more manual way: creating lower-resolution assets (if they don't already exist for lower PC specs), maybe having fewer characters on screen, things of that nature.
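The 2.25× figure that keeps coming up in this exchange is just the raw pixel ratio between the two resolutions. A one-line check (which, as noted above, ignores everything that isn't fill-rate-bound):

```python
# Raw pixel counts: the oft-quoted 2.25x is simply 1080p divided by 720p.
px_1080 = 1920 * 1080   # 2,073,600 pixels
px_720  = 1280 * 720    #   921,600 pixels
ratio = px_1080 / px_720
print(f"1080p has {ratio:.2f}x the pixels of 720p")  # 2.25x
```

So a machine ~2.25× stronger could, in the simplest model, spend the entire gap on resolution; any gap beyond that (or any non-fill-rate bottleneck) has to come out of effects, asset quality, or frame rate instead.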
 
Yet none of those guys say it supports tessellation. We just know it supports some features beyond DX10/SM4. It appears there's definitely a chance it does; we still just don't know for sure.

As z0m3le pointed out, we know there is some sort of tessellation unit in the Wii U. I don't have any details on it, though we are logically guessing that it's based on the original one found in the R700. We unfortunately don't know much even about the original one. From what I heard, tessellation on the R700 series was never used commercially and was only shown once in a demo.

AMD apparently had some very ambitious plans for that line of GPUs, but it didn't all work out, partially due to DX10 being exclusive to Microsoft's underperforming Vista OS. Since the Wii U is a closed system, the unit is expected to eventually be put to use.
 
There are already 1080p games on Wii U. Off the top of my head:
Toki Tori 2
Monster Hunter 3U
Skylanders Giants
Scribblenauts Unlimited
Not sure if Trine 2 is 1080p, but IIRC there are a few others.

Trine 2 is 720p.

Rayman, Toki Tori, Scribblenauts and MH3 are not demanding games.

And I'm reading here that the Nintendo Magic is back xD
 

ugoo18

Member
There are already 1080p games on Wii U. Off the top of my head:
Toki Tori 2
Monster Hunter 3U
Skylanders Giants
Scribblenauts Unlimited
Not sure if Trine 2 is 1080p, but IIRC there are a few others.



"X" seems to be using tessellation on the mountain ranges, sadly the thing about tessellation is that good tessellation should be impossible to tell.

Trine 2 is 720p; however, it is locked at 720p plus the 480p pad stream, unlike the PS360 versions which, as has been said already, run at a variable resolution.

The following are all 1080p

Toki Tori
Monster Hunter Tri Ultimate
Skylanders Giants
Scribblenauts Unlimited
Dragon Quest X
Transformers Prime
Rayman Legends
Cloudberry Kingdom
 

z0m3le

Banned
Trine 2 is 720p.

Rayman, Toki Tori, Scribblenauts and MH3 are not demanding games.

And I'm reading here that the Nintendo Magic is back xD

I was pointing out some 1080p games to someone who thought there were none.

Also, this isn't the thread for "magic"; we should save that for other threads.
 
I believe people have already mentioned this before. Why are people surprised?

I honestly do not recall it being mentioned before.

Trine 2 is 720p.

Rayman, Toki Tori, Scribblenauts and MH3 are not demanding games.

And I'm reading here that the Nintendo Magic is back xD

Shinobi's summary of this thread's discussions at the other forum was a bit disingenuous. There are good discussions here, and more tech-savvy people like Durante, wsippel, and Blu enforce fact-checking. We actually analyze the tricks and mirrors behind the "magic" ;)
 
Oh, yes, magic, a 2011 game from a digital service... I believe in magic...
Well, how else could a low-budget indie platformer run better? The Wii U version actually renders both 720p and 480p simultaneously, while the PS3 and 360 versions can't hold just 720p. And the Wii U version has a more consistent framerate on top of that.

Must be that Magic you keep bringing up here and at B3D.
 
I was pointing out some 1080p games to someone who thought there were none.

Also, this isn't the thread for "magic"; we should save that for other threads.

I'm reading some "magic" in some posts.

But yes, there are some 1080p games on Wii U, some non-demanding games. It will depend on the resources used by the game, just like the 1080p Xbox 360/PS3 games.

Well, how else could a low-budget indie platformer run better? The Wii U version actually renders both 720p and 480p simultaneously, while the PS3 and 360 versions can't hold just 720p. And the Wii U version has a more consistent framerate on top of that.

Must be that Magic you keep bringing up here and at B3D.

Come on, the Wii U version is a year later than the Xbox 360 and PS3 versions; even The Witcher 2 on Xbox 360 is better than the original TW2 for PC, and it is not because of the power. And they are really good devs, but sorry, Trine 2 is not a demanding game; it is beautiful because of the art.
 
Come on, the Wii U version is a year later than the Xbox 360 and PS3 versions; even The Witcher 2 on Xbox 360 is better than the original TW2 for PC, and it is not because of the power.
Frozenbyte's been pretty forthcoming about it: they've used Wii U's performance gains not only to lock the resolution and tighten the framerate, but also to improve the physics model and add PC content deemed "too much" for PS360 to handle. Clearly they must be wizards.
 

Lizardus

Member
I'm reading some "magic" in some posts.

But yes, there are some 1080p games on Wii U, some non-demanding games. It will depend on the resources used by the game, just like the 1080p Xbox 360/PS3 games.



Come on, the Wii U version is a year later than the Xbox 360 and PS3 versions; even The Witcher 2 on Xbox 360 is better than the original TW2 for PC, and it is not because of the power. And they are really good devs, but sorry, Trine 2 is not a demanding game; it is beautiful because of the art.

My sides have separated and filed for a divorce!!

Please do some research before spouting bullshit.
 
Frozenbyte's been pretty forthcoming about it: they've used Wii U's performance gains not only to lock the resolution and tighten the framerate, but also to improve the physics model and add PC content deemed "too much" for PS360 to handle. Clearly they must be wizards.

Wizard = optimization and more powerful GPU.

My sides have separated and filed for a divorce!!

Please do some research before spouting bullshit.

What? The Xbox 360 version is the Enhanced Edition; I compared it to the first version for PC.

360_lighting1.png

PC_lighting1.png
 

z0m3le

Banned
Guys, seriously, Trine 2 really doesn't matter. In the context of what I said, I thought I remembered it being 1080p; it was not, and it has been corrected.

If you want to discuss whether Trine 2 is a demanding game or not, make a thread about it; if you want to compare the Wii U to the 360 or PS3, there are dozens of threads for that, and this isn't one of them.
 
Guys, seriously, Trine 2 really doesn't matter. In the context of what I said, I thought I remembered it being 1080p; it was not, and it has been corrected.

If you want to discuss whether Trine 2 is a demanding game or not, make a thread about it; if you want to compare the Wii U to the 360 or PS3, there are dozens of threads for that, and this isn't one of them.

You are right; Trine 2 isn't even a game made from the ground up for Wii U.
 

SmokeMaxX

Member
And? What is your point? The PS3 runs some games worse than the Xbox 360 versions, and the Wii U runs some games worse than PS360.

Why? I have both versions; the Xbox 360 has some things looking better than the original (non-enhanced) PC version.

This is a stupid argument. I can make a game that runs better on the original Wii than on a $1000 PC. What point does that prove?
 