
Wii U Reality Check - Black Ops II No Native 1080p Support

Gemüsepizza;43145076 said:
It is not bad news because 720p is bad, but because it is an indication of the power of the Wii U. To increase the resolution by about 2 times, you need roughly 2 times the power. If the Wii U cannot do this, it could mean the Wii U is less than 2 times as powerful as the current gen.

You would be correct if we knew Treyarch tried to make it 1080p and failed and therefore had to settle for 720p, but we don't. You're jumping to conclusions.
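As an aside, the raw pixel arithmetic behind the "about 2 times" figure can be sketched quickly (my own illustration, not from the thread; pixel count is only a rough proxy for GPU cost, since shading and bandwidth don't scale linearly with resolution):

```python
# Pixel counts for the two resolutions under discussion. This is only
# a rough proxy: real GPU cost also depends on shading, bandwidth, etc.
def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)    # 921,600 pixels
p1080 = pixels(1920, 1080)  # 2,073,600 pixels

print(p1080 / p720)  # 2.25 -- "about 2 times" slightly understates it
```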
 
Exactly. Held back by outdated consoles.

Really? Even PC-only or PC-centric games like BF3? Anyway, I'm not sure I've ever read of people expecting a super jump with the Wii U. The biggest selling point of the Wii U isn't its power; it's Nintendo first-party games in HD and what the second screen can bring to the table.
 
Really? Even PC-only or PC-centric games like BF3? Anyway, I'm not sure I've ever read of people expecting a super jump with the Wii U. The biggest selling point of the Wii U isn't its power; it's Nintendo first-party games in HD and what the second screen can bring to the table.
Battlefield 3 on PC does look substantially better than on consoles.
 
You would be correct if we knew Treyarch tried to make it 1080p and failed and therefore had to settle for 720p, but we don't. You're jumping to conclusions.

Well, I wrote "it could mean". But Blops is not the only multiplat title on Wii U which only supports 720p, so do you think this means almost nobody tried?
 
Gemüsepizza;43146621 said:
Well, I wrote "it could mean". But Blops is not the only multiplat title on Wii U which only supports 720p, so do you think this means almost nobody tried?

I think most didn't care to, and felt 720p was sufficient. 1080p is very taxing, and developers can push more at 720p. I bet most of the launch games on the PS4/720 will be in 720p, and some of them will still look significantly better than what we have on consoles today. Resolution is just one aspect.
 
Were people expecting a miracle from a GPU that's barely more powerful than the ATI X1800- and Nvidia 7800-based chips found in the 360 and PS3 from 7 years ago?

If the rumors of the new consoles supporting DirectX instead of just OpenGL are true, I can't wait to see how crazy things get when every game that comes out is a watered-down port of the PC version. I'm just glad this exotic-technology bullcrap that consoles have been trying to pull over the years is finally coming to an end.
 
Gemüsepizza;43145766 said:
Every time a new Sony/Microsoft console was released, we could see a difference compared to the old generation. With the Wii U, I would be happy if we at least could measure a difference.

Wait wait wait... what?

It was just this generation (Xbox 360, PS3, and Wii) in which quite a difference was noticeable. You know, Xbox to Xbox 360, and PS2 to PS3. It wasn't like that every time.

And with the Wii U, it seems like we'll have to wait a little bit longer to see its real potential in the graphics area. Just like PS3 and Xbox 360 games have been evolving since the first games.

But don't generalize that Sony and Microsoft have always released the strongest consoles in the graphics department. (Well, we could say Microsoft did with the Xbox, though whether the Xbox 360 lost to the PS3 is debatable.)
 
Wait, did people actually think it would run native 1080p? The PS360 versions can't even do 720p, the Wii U isn't that much more powerful than the PS3/Xbox 360.

Just get the superior PC version and everything will be alright.
 
Wii U seems to be about as powerful as an Xbox 360, with more modern shaders. There might be other things I'm missing, but that's the only major difference I've spotted so far (clearly visible in Nano Assault Neo).
 
Gemüsepizza;43146621 said:
Well, I wrote "it could mean". But Blops is not the only multiplat title on Wii U which only supports 720p, so do you think this means almost nobody tried?

Did these developers try to make games with lots of graphical improvements, or did they just try to run the game at the same quality as the PS360 versions, plus the dual-screen functions?

And these companies may have an engine, but they need to adapt it to run on Wii U. At this first moment, it looks like most companies only tried the second option.

There are still other factors, such as how much was invested in the ports, how long they had to do them, and what knowledge of the hardware they had for this work (since the kits always come with differences in power).
 
Gemüsepizza;43145076 said:
It is not bad news because 720p is bad, but because it is an indication of the power of the Wii U. To increase the resolution by about 2 times, you need roughly 2 times the power. If the Wii U cannot do this, it could mean the Wii U is less than 2 times as powerful as the current gen.

OK, then, let's imagine for a second that Wii U was as powerful as a super computer cluster. OK? Got it?

Now, take the assets and code from CoD or almost any game. Rebuild it to run on said super computer.

Now, are you ready to have your mind blown?

It's likely that the game would STILL BE 720P! WOAH! AMAZING! WOW! WHO'D HAVE THUNK IT?!?!?!!!?!?!

Unless the developers do something (And that something depends on how they made the game in the first place and their engine's flexibility) it will essentially run the same on every machine that meets minimum spec and shares that architecture (unified shaders etc as we know it)

Therefore, there is ABSOLUTELY, I'll repeat..... AB SO LUTE LY NOTHING we can take away from CoD being in 720p on Wii U that shows us how powerful it is. Straight ported to the 720 or PS4, it'd also be running at 720p.

Until the developers come out and say it there is nothing in this that shows us how powerful or not the machine is.
 
OK, then, let's imagine for a second that Wii U was as powerful as a super computer cluster. OK? Got it?

Now, take the assets and code from CoD or almost any game. Rebuild it to run on said super computer.

Now, are you ready to have your mind blown?

It's likely that the game would STILL BE 720P! WOAH! AMAZING! WOW! WHO'D HAVE THUNK IT?!?!?!!!?!?!

Unless the developers do something (And that something depends on how they made the game in the first place and their engine's flexibility) it will essentially run the same on every machine that meets minimum spec and shares that architecture (unified shaders etc as we know it)

Therefore, there is ABSOLUTELY, I'll repeat..... AB SO LUTE LY NOTHING we can take away from CoD being in 720p on Wii U that shows us how powerful it is. Straight ported to the 720 or PS4, it'd also be running at 720p.

Until the developers come out and say it there is nothing in this that shows us how powerful or not the machine is.

Well nope. The game can run at any resolution you want on PC, so I don't see why they wouldn't be able to go above 720p on a console given it's powerful enough.
 
Wait wait wait... what?

It was just this generation (Xbox 360, PS3, and Wii) in which quite a difference was noticeable. You know, Xbox to Xbox 360, and PS2 to PS3. It wasn't like that every time.

And with the Wii U, it seems like we'll have to wait a little bit longer to see its real potential in the graphics area. Just like PS3 and Xbox 360 games have been evolving since the first games.

But don't generalize that Sony and Microsoft have always released the strongest consoles in the graphics department. (Well, we could say Microsoft did with the Xbox, though whether the Xbox 360 lost to the PS3 is debatable.)

You didn't read his post properly. All he said was that there was a noticeable jump when Sony/MS launched a new console compared to the previous generation, and it's true. Wipeout, Ridge Racer, TTT2, Halo, PGR3, Kameo, Resistance.
 
Gemüsepizza;43146387 said:
devs have good experience with modern hardware, so Wii U launch games should already use the hardware quite well. Of course there is always room for further optimization, but don't expect too much.
You cannot do magic just because you have new hardware. We've seen this through this whole generation on PS3, and it'll be the same next generation too. We might get higher resolutions in some cases, but you still need better assets and new engines built for the specific hardware, with better physics, lighting, shadows, particles, AA, etc., if you're going to get people to say wow when it comes to graphics. And obviously the Wii U is no different. You might get wow effects from the fact that we now have a second screen, though.
 
The ignorance surrounding Wii U's hardware is really disgusting. Worst part is, even when you try to ignore it, 10 new people show up in their place. It makes the forum unreadable. -_-
 
Well nope. The game can run at any resolution you want on PC, so I don't see why they wouldn't be able to go above 720p on a console given it's powerful enough.

First, we're talking about consoles. Second, like I said, it depends on what's required in the engine and effort on the developer's part. Games don't magically know about the machine they run on. It's likely that this is a straight port to Wii U from 360, possibly locked to a set resolution on consoles. Remember consoles are played on a TV, of which a large number, if not most, are 720p. There's no real need to up it to 1080p.

Until Activision come out and say "We tried 1080 but it just couldn't handle it" going around and saying Wii U can't handle CoD at 1080 is disingenuous.

The ignorance surrounding Wii U's hardware is really disgusting. Worst part is, even when you try to ignore it, 10 new people show up in their place. It makes the forum unreadable. -_-

Yup, and notice that most of them never set foot in a WUST. Suggesting to me they don't really care and they are just here to troll.
 
First, we're talking about consoles. Second, like I said, it depends on what's required in the engine and effort on the developer's part. Games don't magically know about the machine they run on. It's likely that this is a straight port to Wii U from 360, possibly locked to a set resolution on consoles. Remember consoles are played on a TV, of which a large number, if not most, are 720p. There's no real need to up it to 1080p.

Until Activision come out and say "We tried 1080 but it just couldn't handle it" going around and saying Wii U can't handle CoD at 1080 is disingenuous.

It's not locked to a set resolution. BLOPS had a different res on PS3 and 360, and was easily changeable on PC. As a port, there's no reason not to increase the resolution if they can, if they're not going to be putting additional assets or effects in.

It's pretty safe to say that, considering the pad output, the need for 60fps, and how limited other 1080p titles on Wii U are, the Wii U would struggle with CoD at 1080p.
 
First, we're talking about consoles. Second, like I said, it depends on what's required in the engine and effort on the developer's part. Games don't magically know about the machine they run on. It's likely that this is a straight port to Wii U from 360, possibly locked to a set resolution on consoles. Remember consoles are played on a TV, of which a large number, if not most, are 720p. There's no real need to up it to 1080p.
Oh please... he's right, and you know it. Don't make a bunch of useless excuses. For CoD to run at 1080p @ 60FPS on Wii U, it would have to be approx 3x more powerful than the current consoles, and there's just no way that's the case.
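For what it's worth, the "approx 3x" figure is roughly consistent with pure pixel counts if the 360 version really renders at 880x720, as Digital Foundry reported (a back-of-the-envelope sketch only; pixel count ignores everything else that scales with resolution):

```python
# Pixel-count ratio between full 1080p and the 360's reported
# 880x720 framebuffer for Black Ops 2 (Digital Foundry figure).
ratio = (1920 * 1080) / (880 * 720)
print(round(ratio, 2))  # 3.27 -- in the same ballpark as "approx 3x"
```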
 
You cannot do magic just because you have new hardware. We've seen this through this whole generation on PS3, and it'll be the same next generation too. We might get higher resolutions in some cases, but you still need better assets and new engines built for the specific hardware, with better physics, lighting, shadows, particles, AA, etc., if you're going to get people to say wow when it comes to graphics. And obviously the Wii U is no different. You might get wow effects from the fact that we now have a second screen, though.

Right. We can also look at the PS3 this generation. When they made a port from 360 to PS3, the game was generally worse, due to differences in architecture. But a game made from scratch for the PS3's architecture looks equal to or better than the 360 version.

But the port solution is still faster and cheaper, although it often results in an inferior port. In the case of the Wii U, with little knowledge of the architecture and the rush to launch, surely these games are just poorly optimized ports.

And the news is that, even though they are ports, all of them seem to be running somewhat better on Wii U, showing that there is rather more power than in the current devices. At least enough to compensate for the differences in architecture.

And about the power of the Wii U: from what I've been reading, it looks like the difference between the PS2 and the Wii (a little bigger than the PS2 to the Xbox 1).
 
Gemüsepizza;43145910 said:
Why are you reducing graphics to the resolution? I am fine with 720p. But then those games should have way better models, better effects, better lighting, ... I can't see that in any Wii U game (yet). It looks exactly like the last 7 years. So why should someone, who is not terribly interested in Nintendo exclusives, buy the Wii U if he already has a PS3 or Xbox 360?


Not disagreeing or commenting on anything else you said, just wanted to point out that this part isn't really true. Maybe the last 2-3 years (out of the last 7), tops.


On the subject, I think the extra power WiiU does have (however much that may be) is tied up not only by the usual "port"/"early in console life" issues, but utilising the second screen. Must take some beef out of the console powering two screens, even just for simple 2D stuff. That shit don't come free, right?
 
Oh please... he's right, and you know it. Don't make a bunch of useless excuses. For CoD to run at 1080p @ 60FPS on Wii U, it would have to be approx 3x more powerful than the current consoles, and there's just no way that's the case.

Can I see your spec sheet? And for the record I've never said Wii U could handle 1080 CoD, just that making assumptions about it not being able to based on the fact that the port isn't in 1080 is illogical.

Gemüsepizza;43145910 said:
Why are you reducing graphics to the resolution? I am fine with 720p. But then those games should have way better models, better effects, better lighting, ... I can't see that in any Wii U game (yet). It looks exactly like the last 7 years. So why should someone, who is not terribly interested in Nintendo exclusives, buy the Wii U if he already has a PS3 or Xbox 360?

Because it has a touch screen controller. You know, that innovative thing that opens up lots of gameplay possibilities, as opposed to just increasing pixel counts and fancy graphics.

You also know that the Wii U has a good amount of eDRAM, a bit more than twice the RAM for games, and a GPGPU which we can assume is way more efficient at GPGPU code than current-gen consoles. Wait until next year, when we start to see some games take advantage of the hardware. They should be a decent bump above what we're seeing from these straight ports and launch titles.
 
Oh please... he's right, and you know it. Don't make a bunch of useless excuses. For CoD to run at 1080p @ 60FPS on Wii U, it would have to be approx 3x more powerful than the current consoles, and there's just no way that's the case.

You don't know that, and neither do I. Do you have experience with the Wii U hardware, or hell, even the inner workings of the engine running CoD BLOPS 2? There is no way, this late into the cycle, that they were going to spend more money doing anything other than a quick port job to Wii U; it doesn't make any sense for them to do so.
 
Can I see your spec sheet? And for the record I've never said Wii U could handle 1080 CoD, just that making assumptions about it not being able to based on the fact that the port isn't in 1080 is illogical.

Well, didn't you know, every launch 360 game was never surpassed, ever. Kameo, Condemned, and COD2 still hold the crown in both highest resolution and highest polygon count; they still employ the most advanced and complex graphics and physics solutions, as well as the best sound, best use of online features, and what have you.

You are expecting people to use their brain right before a system launch... that's not going to end well.
 
Well, didn't you know, every launch 360 game was never surpassed, ever. Kameo, Condemned, and COD2 still hold the crown in both highest resolution and highest polygon count; they still employ the most advanced and complex graphics and physics solutions, as well as the best sound, best use of online features, and what have you.

You are expecting people to use their brain right before a system launch... that's not going to end well.

You are correct, I am the one being illogical. Silly me.
 
The new or exotic hardware argument does not work with the Wii U.

The workhorse in the Wii U is the GPU. And it is well known that it is a more or less standard-issue AMD GPU. There is nothing to learn here.

The argument would work if the CPU were powerful and exotic. But it is more or less confirmed to be under-powered and weak. Even weaker than the HD twins.

So whatever pizzazz is going to come out of this thing is going to come from the GPU. Which is well-known, standard-issue stuff.

No learning curve.
 
lol my retina macbook pro runs MW3 at 2880x1800 with ultra everything at 60FPS.

http://www.youtube.com/watch?v=C0azk1F_AEw
How are you running MW3 on ultra at 2880x1800 at 60fps when this guy cannot even run it at 1366x768 at a steady 60fps (30-55)? Your video card is a GeForce GT 650M, which is equivalent to an HD 7750M mobile GPU or an HD 6570 desktop GPU, which is only a 588 GFLOP card.

Your GPU is barely able to run Black Ops 2 at native 720p, ultra settings, and a consistent 60fps, let alone push about 5.6x the pixels as you claim. Heck, even a GTX 580 can only barely run Modern Warfare 3 at 2560x1600 at 60fps on ultra, which is a lower resolution than the Retina display.

Can you tell us again that you are running MW3 on ultra at native 2880x1800 on a GeForce GT 650M?

I am not saying that you cannot do this on your $2800 laptop with a $60 GPU, but hey, Apple products are magical, right?

By the way, your GPU is Wii U class on a $2800 laptop.

The new or exotic hardware argument does not work with the Wii U.

The workhorse in the Wii U is the GPU. And it is well known that it is a more or less standard-issue AMD GPU. There is nothing to learn here.

The argument would work if the CPU were powerful and exotic. But it is more or less confirmed to be under-powered and weak. Even weaker than the HD twins.

So whatever pizzazz is going to come out of this thing is going to come from the GPU. Which is well-known, standard-issue stuff.

No learning curve.

Sources of confirmation on these three points, please? I bet there isn't any confirmation at all, but hey, I could be wrong.
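The Retina-vs-720p multiplier in the post above is easy to sanity-check with plain pixel counts (my own sketch; again, pixels are only a crude proxy for GPU load):

```python
# Pixels pushed at the MacBook's native panel resolution vs. 720p.
retina = 2880 * 1800   # 5,184,000 pixels
hd720 = 1280 * 720     #   921,600 pixels
print(retina / hd720)  # 5.625 -- about 5.6x the pixels of 720p
```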
 
OK, then, let's imagine for a second that Wii U was as powerful as a super computer cluster. OK? Got it?

Now, take the assets and code from CoD or almost any game. Rebuild it to run on said super computer.

Now, are you ready to have your mind blown?

It's likely that the game would STILL BE 720P! WOAH! AMAZING! WOW! WHO'D HAVE THUNK IT?!?!?!!!?!?!

Unless the developers do something (And that something depends on how they made the game in the first place and their engine's flexibility) it will essentially run the same on every machine that meets minimum spec and shares that architecture (unified shaders etc as we know it)

Therefore, there is ABSOLUTELY, I'll repeat..... AB SO LUTE LY NOTHING we can take away from CoD being in 720p on Wii U that shows us how powerful it is. Straight ported to the 720 or PS4, it'd also be running at 720p.

Until the developers come out and say it there is nothing in this that shows us how powerful or not the machine is.


3D games can run at any res they want, and you have no idea about porting software!
You cannot just run another console's version of the game!
It has to be reprogrammed to run on a different console, and with that, a different res is no problem at all from a programming POV!
It is not like they are just different PCs, or like the games are programmed in Java, FFS!
Hell, they even programmed in support for the Wii U screen!

You are hugely ignorant and should stop posting in threads like this!
 
http://www.youtube.com/watch?v=C0azk1F_AEw
How are you running MW3 on ultra at 2880x1800 at 60fps when this guy cannot even run it at 1366x768 at a steady 60fps (30-55)? Your video card is a GeForce GT 650M, which is equivalent to an HD 7750M mobile GPU or an HD 6570 desktop GPU, which is only a 588 GFLOP card.

Your GPU is barely able to run Black Ops 2 at native 720p, ultra settings, and a consistent 60fps, let alone push about 5.6x the pixels as you claim. Heck, even a GTX 580 can only barely run Modern Warfare 3 at 2560x1600 at 60fps on ultra, which is a lower resolution than the Retina display.

Can you tell us again that you are running MW3 on ultra at native 2880x1800 on a GeForce GT 650M?

I am not saying that you cannot do this on your $2800 laptop with a $60 GPU, but hey, Apple products are magical, right?

By the way, your GPU is Wii U class on a $2800 laptop.



Sources of confirmation on these three points, please? I bet there isn't any confirmation at all, but hey, I could be wrong.
MW3 was optimized like ass on the PC, for some reason. I can run BLOPS at 2560x1600 w/4xAA and always stay over 60fps. And my PC isn't even very good.

So the comments about Treyarch being lazy and not optimizing don't make any sense.
 
So how does the GamePad factor into this? If you're doing two-player, then that's another 480p stream going on. Is that part of this or something different?
 
As long as the true difference maker is still in the Wii U version (Off-TV play) then I'm happy going for that version of the game.
 
OK, then, let's imagine for a second that Wii U was as powerful as a super computer cluster. OK? Got it?

Now, take the assets and code from CoD or almost any game. Rebuild it to run on said super computer.

Now, are you ready to have your mind blown?

It's likely that the game would STILL BE 720P! WOAH! AMAZING! WOW! WHO'D HAVE THUNK IT?!?!?!!!?!?!

Unless the developers do something (And that something depends on how they made the game in the first place and their engine's flexibility) it will essentially run the same on every machine that meets minimum spec and shares that architecture (unified shaders etc as we know it)

Therefore, there is ABSOLUTELY, I'll repeat..... AB SO LUTE LY NOTHING we can take away from CoD being in 720p on Wii U that shows us how powerful it is. Straight ported to the 720 or PS4, it'd also be running at 720p.

Until the developers come out and say it there is nothing in this that shows us how powerful or not the machine is.

Calm down man. If there's anything making threads like these unreadable it's posts like these.
 
The top PC games are still just console ports (mostly) and don't come close to using the full power of a high-end PC. When the next-gen machines launch, you will see a considerable jump in graphics on the PC too.
Exactly.

None of the Wii U launch window games shows that it has better hardware than the PS360 at all, which has never been the case with PlayStation or Xbox console launch games. There are always a few games which stand head and shoulders above the previous gen's games technically.
 
The new or exotic hardware argument does not work with the Wii U.

The workhorse in the Wii U is the GPU. And it is well known that it is a more or less standard-issue AMD GPU. There is nothing to learn here.

The argument would work if the CPU were powerful and exotic. But it is more or less confirmed to be under-powered and weak. Even weaker than the HD twins.

So whatever pizzazz is going to come out of this thing is going to come from the GPU. Which is well-known, standard-issue stuff.

No learning curve.

That's why X360 launch titles look as good as current X360 titles.

Because the X360 was explicitly designed with off-the-shelf equivalent parts and to utilise standard DX libraries, so there was literally nothing to learn about optimising for it, and titles never changed over time.

wtfisthisshit.jpg
 
No surprise about the resolution, but I'm more curious about what makes the Wii U version "the complete package". I know about the controller/TV stuff, but I don't play local co-op. Anything else?
 
What res were the 360/PS3 versions of Black Ops? Do we have info on BO2 resolutions all across board?

There's a Eurogamer Digital Foundry article on Black Ops 2's technical aspects, "Call of Duty: Black Ops 2 and the 60FPS Challenge" (http://www.eurogamer.net/articles/digitalfoundry-black-ops-2-60fps-challenge).
At least on 360, Black Ops 2 seems to be rendered at "880x720 with 2x multi-sampling anti-aliasing (MSAA)".

More information on CoD: Black Ops (which had huge framerate problems in MP matches on both consoles) is available from Eurogamer as well: "Face-Off: Call of Duty: Black Ops" (http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-black-ops-faceoff).
The Xbox 360 version ran at 1040x608 with 2x MSAA; the PS3 edition at 960x544, again with 2x MSAA.

No surprise about the resolution, but I'm more curious about what makes the Wii U version "the complete package". I know about the controller/TV stuff, but I don't play local co-op. Anything else?

Most likely a meaningless marketing phrase.
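Taking the Digital Foundry figures quoted above at face value, the effective pixel counts as fractions of a full 1280x720 framebuffer work out as follows (my own arithmetic, not from the articles):

```python
# Reported render resolutions (Digital Foundry figures quoted above),
# expressed as a fraction of a full 1280x720 framebuffer.
full_720p = 1280 * 720

resolutions = {
    "Black Ops 2, 360 (880x720)": (880, 720),
    "Black Ops, 360 (1040x608)": (1040, 608),
    "Black Ops, PS3 (960x544)": (960, 544),
}

for name, (w, h) in resolutions.items():
    frac = w * h / full_720p
    print(f"{name}: {frac:.4f}")  # roughly 0.69, 0.69, and 0.57
```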
 
My head hurts.

Always expect the bare minimum a port team of 14 can accomplish with Nintendo hardware.

It's not like any company, aside from a happy few, is really putting massive dollars into Wii U development yet. Most seem to be treating the system as just another thing to port their final current-gen titles to. Its future hasn't been decided.

Just please stop going into these things expecting anything other than "PS3/360 game with Upad functionality." Occasionally you'll be surprised when a title actually tries to achieve something with the dev-confirmed more powerful hardware. Well... we know the GPU is quite a bit more powerful, purportedly with SM4+ functionality. The CPU is still a mystery, but what we have heard doesn't speak too kindly of it. And, at the start at least, double the usable memory of the 360/PS3.

This isn't a massive jump, and a lot of that power could be used up through inefficient Upad usage. But it's not like it will be in any way a step backwards. It just might not be enough of a jump over those prior consoles for devs to want to push it.

Even so, that caliber of hardware can still surprise us to this day (discounting things like image quality).
 
The new or exotic hardware argument does not work with the Wii U.

The workhorse in the Wii U is the GPU. And it is well known that it is a more or less standard-issue AMD GPU. There is nothing to learn here.

The argument would work if the CPU were powerful and exotic. But it is more or less confirmed to be under-powered and weak. Even weaker than the HD twins.

So whatever pizzazz is going to come out of this thing is going to come from the GPU. Which is well-known, standard-issue stuff.

No learning curve.

Standard hardware != standard software. Nintendo doesn't use DirectX, nor does Nintendo use OpenGL (or it does, but highly recommends NOT doing so, as is the case with the 3DS). Nintendo has their own APIs, their own systems, their own SDK. There is no magic button that turns Xbox 360 code into fully optimized Wii U code... and if it's true that devs didn't get "complete" SDKs till 6-8 months ago, it's EASILY possible they didn't try for 1080p because they were unfamiliar with the SDK and working under a tight time constraint.

That said, I don't particularly care either way... as I've said many times, I'm a Nintendo fanboy... I'm getting the Wii U for nintendo games. I have a gaming PC for third party titles like this.
 
Can I see your spec sheet? And for the record I've never said Wii U could handle 1080 CoD, just that making assumptions about it not being able to based on the fact that the port isn't in 1080 is illogical.

You do realize we can make fairly knowledgeable guesses with some of the information we have, right?

- Price of the system
- Performance of other games
- Performance of similar games on existing consoles and PC

We can be reasonably certain the Wii U is not 3x as powerful as existing consoles given this information.
 
Standard hardware != standard software. Nintendo doesn't use DirectX, nor does Nintendo use OpenGL (or it does, but highly recommends NOT doing so, as is the case with the 3DS). Nintendo has their own APIs, their own systems, their own SDK. There is no magic button that turns Xbox 360 code into fully optimized Wii U code... and if it's true that devs didn't get "complete" SDKs till 6-8 months ago, it's EASILY possible they didn't try for 1080p because they were unfamiliar with the SDK and working under a tight time constraint.

That said, I don't particularly care either way... as I've said many times, I'm a Nintendo fanboy... I'm getting the Wii U for nintendo games. I have a gaming PC for third party titles like this.

(yes yes, quoting myself)

To add some real-world examples on top of this: just compare games designed for Windows with their ports that run on Linux and Mac. All three systems use the same hardware (in fact, you can have one system that runs all of those operating systems). A game designed on Windows and DirectX and then ported to OpenGL generally favors Windows, if only because they don't put huge teams on optimizing the code. This is ALL the same hardware, but you can have a HUGE framerate difference between the systems.

Other things to take into account: Mac only supports Shader Model 3.2 or so, meaning even though the HARDWARE is the same, the SOFTWARE limits what you can and can't do. Likewise, if someone were to make Mac a lead software platform (hahahahahahhaahha), it would mean more work porting to Windows, and it may not take advantage of all of the HARDWARE features, because the original platform is SOFTWARE limited.

Now, all of this would mean practically nothing if all game developers developed games on the lowest possible hardware level... but they don't... That would be both stupid and a waste of time. They rely on high level software calls to do their work and it takes time to learn what a system is best at, and what software implementation gets it done... knowing the hardware means you MAY have an idea, but it by no means guarantees it since this is still customized hardware.

(edit) Oh, and if you think it only goes one way... Valve recently noted on their blog that after heavy optimizing, they were able to get their DirectX code for Left 4 Dead 2 running faster on OpenGL on Linux than on Windows... but it took a lot of time... though now they can apply that same knowledge to future ports a lot faster... They've been working on Linux in secret for a couple of years now, and that's just one company. Give game devs a year or two with the hardware and it'll make all the difference in the world.


(edit 2) TL;DR: On modern PCs, between Linux, Windows, and Mac there can be a huge difference in performance/features based SOLELY on software implementations and time spent porting... 6 months with a semi-finalized SDK isn't long enough to know the system, especially when you have a hard deadline to release your game.
 
Not too surprising, but the engine is pretty old. Hopefully there are efforts to fix those FPS issues with that extra power. We'll see, though.
 