
Wii U Community Thread

Status
Not open for further replies.

AzaK

Member
I personally don't think a new software thread is needed. This one already moves slowly enough as it is.

Frankly, the reason there's so much hardware discussion is that there hasn't really been anything new software-wise since E3.

I think as we get closer to launch, things will even out again.

How are they going to buffer the pad image? They are going to have to keep X previous frames and feed them to the pad with a delay.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
How are they going to buffer the pad image? They are going to have to keep X previous frames and feed them to the pad with a delay.
X = 1, if you take into account that:

1. Both outputs go out at the same rate (60Hz),
2. There's vsync on both outputs.

ed: Or not. I was thinking more in terms of playing with the duration of a blocking BufferSwap(), but on second thought, that might not be the best solution. Basically X would need to be greater than 1, though in practice that number would still be low: if we assume 100ms of TV lag, then at 60Hz (i.e. 16.6ms frame duration) that's a 100 / 16.6 = 6-frame output queue for the pad.
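blu's 100 / 16.6 arithmetic can be sketched as a small fixed-depth frame queue. This is purely a hypothetical illustration of the idea being discussed, not how the Wii U actually implements it; the 100 ms lag and 60 Hz figures are the ones from the post.

```python
from collections import deque

def pad_queue_depth(tv_lag_ms: float, refresh_hz: float) -> int:
    """Frames to buffer so the pad's output matches the TV's delayed image."""
    frame_ms = 1000.0 / refresh_hz          # ~16.6 ms per frame at 60 Hz
    return round(tv_lag_ms / frame_ms)      # 100 / 16.6 -> 6

class PadFrameBuffer:
    """Fixed-depth FIFO: push the newest frame, send the oldest to the pad."""
    def __init__(self, depth: int):
        self.frames = deque(maxlen=depth)

    def push(self, frame):
        self.frames.append(frame)
        # Once the queue is full, the frame sent to the pad lags the
        # newest rendered frame by (depth - 1) refresh intervals.
        return self.frames[0]

depth = pad_queue_depth(100, 60)  # -> 6
```

With a depth of 6, the pad ends up showing a frame roughly five refresh intervals (about 83 ms) behind the newest one, which is what lines it up with a ~100 ms laggy TV.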
 
And if it has a GPGPU, then it's going to be even harder for launch games to take advantage of that. We probably won't see anything truly using it for a year or so.

Yep. If Wii U had more raw power it could have made a difference in overcoming the lack of optimization.

So, off the tech topic: what could Nintendo do, or what is it doing, to entice third-party developers? I seem to remember them talking a big game about incentives and whatnot, but did we ever hear what those are? For example, could Nintendo offer bonuses to devs if the best-selling version of a multi-platform game was the Wii U version? It seems like this could keep devs from phoning in Wii U versions and hopefully prevent a Wii scenario. Just a thought.

The only one I know of is Nintendo matching marketing dollars for select 3rd party launch titles.

Fair enough. What type of changes would affect what they could try? More compiler breaks?

Also, and this is pure ignorance on my part, but when you get a new devkit, do you have to 'port' the game to the new kit? Does it depend on how different the kits are? For example, the shift from a 'best performance match' kit to a final silicon chip?

As I see it, it wouldn't be more of a port; that was part of the 5th-kit tweaks, considering the timing. As for what would be too tough, that's hard for me to answer. Vigil once mentioned there were things they couldn't do on the 2nd kit that they could do on the 1st. With the kits having actual retail-level components and the extended adjustment phase, there may have been certain changes, positive or negative, that led them to decide to wait and adjust later.

And while you put 'port' in quotes, I still don't believe that's the proper term to use. Someone else might be able to give you a proper answer on that.
 
As I see it, it wouldn't be more of a port; that was part of the 5th-kit tweaks, considering the timing. As for what would be too tough, that's hard for me to answer. Vigil once mentioned there were things they couldn't do on the 2nd kit that they could do on the 1st. With the kits having actual retail-level components and the extended adjustment phase, there may have been certain changes, positive or negative, that led them to decide to wait and adjust later.

And while you put 'port' in quotes, I still don't believe that's the proper term to use. Someone else might be able to give you a proper answer on that.

Thanks for the answer, it's still a little bit over my head, but I will go do some study!
 

darthdago

Member
The only one I know of is Nintendo matching marketing dollars for select 3rd party launch titles.

What about Project Zero (Fatal Frame)? They bought into it (at least halfway) and now co-own it together with Tecmo Koei...

So wouldn't you also think that Nintendo is, of course, willing to spend a lot of money to get games onto its consoles?
 

Meelow

Banned
Wii U First Devkits (CAT-DEV V1-V2) – Classic Controller Adapter
[Images: CAT-DEV diagram (CAT-DEV-diagram1-700x349.jpg) and dev kit photo (CAT-DEV-V1-3.jpg)]
 

japtor

Member
So? Those can come later.
Because power on paper doesn't prove anything; without results (games) there's no evidence to support the specs. Like, if the CPU is 1.8GHz with a bunch of features the 360 doesn't have, people would just point at the clock speed and use the not-so-impressive games as evidence to justify their opinion that the Wii U is weak. If it's around 500 Gflops you're just asking for this:

"2x GPU? Two GameCubes duct-taped together all over again!"

It'd be the same situation as now... but with specific numbers. Plus, when the other consoles come out, it'll just be used as more evidence that they're that many times better (whether the specs are actually comparable or not). Until people can see for themselves that it can run "next gen" stuff (or at least visuals beyond the current gen), there are going to be doubts about the power regardless of what the spec sheet says.
 

JordanN

Banned
Cause power on paper doesn't prove anything, without results (games) there's no evidence to support the specs.
That's stupid. You don't need games to show a GTX 680 is more powerful than a PS3.


japtor said:
Like if the CPU is 1.8ghz with a bunch of features the 360 doesn't have, people would just point at the clockspeed and use the not so impressive games as evidence to justify their opinion that the Wii U is weak.
CPU speed has never been an accurate measure of power when comparing to other systems.


japtor said:
If it's around 500Gflops you're just asking for this:

"2x GPU? Two GameCubes duct taped together all over again!"
That's still better than "this shit aint even Gamecube level".

japtor said:
It'd be the same situation as now...but with specific numbers.
Except it wouldn't. There would be an actual foundation laid.


japtor said:
Plus when other consoles come out it'll just be used as more evidence that they're that many times better (whether the specs are actually comparable or not).
Better that there's some actual truth mixed in rather than just blanket statements.

japtor said:
Until people can see for themselves that it can run "next gen" stuff (or at least visuals beyond the current one) there's going to be doubts about the power regardless of what the spec sheet says.
This is ridiculous. Say the PS3 only got NES ports. Would people say it's not next gen even if it had 1000x more power?
 

jerd

Member
Sorry to keep drifting away from the hardware stuff, but has the potential for Nintendo to continue to sell games at the $50 mark been discussed? If they could do that and keep an acceptable profit for everyone involved, I think new games for $10 cheaper than the competition could be a great selling point.
 

AniHawk

Member
Sorry to keep drifting away from the hardware stuff, but has the potential for Nintendo to continue to sell games at the $50 mark been discussed? If they could do that and keep an acceptable profit for everyone involved, I think new games for $10 cheaper than the competition could be a great selling point.

if what shockingalberto hinted at is anything to go by, i think $59.99 is here to stay.

turns out it was nintendo, not sony, who was ahead of the times in the 64/32 bit generation.
 

darthdago

Member
Sorry to keep drifting away from the hardware stuff, but has the potential for Nintendo to continue to sell games at the $50 mark been discussed? If they could do that and keep an acceptable profit for everyone involved, I think new games for $10 cheaper than the competition could be a great selling point.

IMO they could still sell at a $50 (or €50) price tag and do well.
But here's the problem!!
If they do, they slap the faces of the third-party publishers who claimed last gen that they had to raise prices because of HD development.
Then fewer third-party games will come to the Wii U (because most third-party games can't compete with a Nintendo game at that price tag if their own is 60 or 70, for example).
So I think in that case Nintendo will follow the stream and raise their own prices to third-party levels (which even means more money for Nintendo).
 

Ryoku

Member
Sorry to keep drifting away from the hardware stuff, but has the potential for Nintendo to continue to sell games at the $50 mark been discussed? If they could do that and keep an acceptable profit for everyone involved, I think new games for $10 cheaper than the competition could be a great selling point.

The thing is, Nintendo itself would have no problem selling games at $49.99. However, since their games will be competing with other games on the platform (and Nintendo wants to be nice with third parties this time around) I think they'll opt to price the games at $59.99.

EDIT: Beaten, lol. Damn, literally the same exact message.
 

AzaK

Member
X = 1, if you take into account that:

1. Both outputs go out at the same rate (60Hz),
2. There's vsync on both outputs.

ed: Or not. I was thinking more in terms of playing with the duration of a blocking BufferSwap(), but on second thought, that might not be the best solution. Basically X would need to be greater than 1, though in practice that number would still be low: if we assume 100ms of TV lag, then at 60Hz (i.e. 16.6ms frame duration) that's a 100 / 16.6 = 6-frame output queue for the pad.

I initially thought this whole delay thing was so wrong but I guess some of us are already gaming with this delay.
 

japtor

Member
That's stupid. You don't need games to show a GTX 680 is more powerful than a PS3.

CPU speed has never been an accurate measure of power when comparing to other systems.

That's still better than "this shit aint even Gamecube level".

Except it wouldn't. There would be an actual foundation laid.

Evidence will come out when games are made for them, so I'm not seeing how this is a problem. Specs aren't supposed to stop these things; they're only supposed to assist reality.

This is ridiculous. So say PS3 only got NES ports. Would people say it's not next gen even if it has a 1000x more power?
Here's the problem I'm getting at, going by one of your previous posts:
Perhaps I shouldn't say harm (as that comes off as emotional), but it will forever annoy me to no end that I won't have the privilege to correct or put the Wii U in a neutral light, whereas those who own other consoles won't have such hurdles.
Assuming you're trying to correct others on NeoGAF, many will believe what they want to believe and twist anything to fit their view like ignoring/omitting details in the CPU example (or they could just be trolling). An "actual foundation" or "reality" would have no bearing on them, thisisneogafdude.gif basically.
I initially thought this whole delay thing was so wrong but I guess some of us are already gaming with this delay.
Probably, modern TVs kind of suck. Depending on the game and your sensitivity it may not be as noticeable though, and of course if the TV has a "game mode" that'll help too. Computer LCDs have varying lag too, took me a while to get used to mine after switching from a CRT. I don't notice it anymore but it drove me nuts at first.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I initially thought this whole delay thing was so wrong but I guess some of us are already gaming with this delay.
Well, hopefully some of us will be gaming without that delay come the U ; )
 
As i have said before GPU FLOP numbers mean very little, my PC has a GTX 550 Ti and it's rated at just under 700 gigaFLOPs, now there are other PC GPU's that are over 2000 teraFLOPS and are not as powerful as my card.

LOL, no there aren't.

Anyway, there used to be some discrepancy between AMD card flops and Nvidia card flops (more efficient), but A) nothing that would make a 700 Gflops card top a 2000 Gflops card, and B) starting with Kepler even that has mostly evaporated (Kepler is much closer to AMD cards in flops).

Besides, I've always been of the opinion that AMD flops weren't allowed to stretch their legs in PC games, since those must be aimed at two vendors, but in a console they would be. So you might see some of that prior AMD flops edge really pay off in a console (but again, that's less relevant now anyway). ANYWAY
 

darthdago

Member
All in all, (I think) no one should worry about the possible quality of games and the features of the GPU/CPU.
Remember the interview from mid-June in which Iwata said that devs are currently only using half the potential of the Wii U hardware.

So for me it will be ferking great and I can't wait for the Wii U to release....
 

D-e-f-

Banned
So, off the tech topic: what could Nintendo do, or what is it doing, to entice third-party developers? I seem to remember them talking a big game about incentives and whatnot, but did we ever hear what those are? For example, could Nintendo offer bonuses to devs if the best-selling version of a multi-platform game was the Wii U version? It seems like this could keep devs from phoning in Wii U versions and hopefully prevent a Wii scenario. Just a thought.

I refer to the old interview I found from the GCN days that nobody reacted to:


oh and here's small blast from the past that shows not much has changed seemingly:

IGN.com said:
IGNcube: When did work begin on the GCN version of the title?

Sam: Soon after Blood Omen 2 for PS2 and Xbox were released games like Resident Evil and Eternal Darkness came out on GameCube and made us re-evaluate the GameCube as a viable platform for games targeted towards older audiences. Within a few weeks we had our first working version.

sound familiar? :D

IGN.com said:
IGNcube: Finally, what do you think of the GameCube hardware? How would you rate its power versus PS2 and Xbox? Has it been easy to develop for?

Sam: The GameCube has its plusses and minuses. Its graphical output is great, but the limited amounts of RAM and disk space were a big challenge for us.

I guess if you switch the pros and cons here that sounds like where Wii U is going? :)

Legacy of Kain: Blood Omen 2 interview about the GCN version from 2002.
http://cube.ign.com/articles/377/377943p1.html

bonus bits:

IGN.com said:
IGNcube: Blood Omen on PS2 had a sometimes-sluggish framerate. In our play so far, the GCN version runs at 60 frames. Can you talk about this?

Sam: Sure, the GameCube has a fantastic graphics chip. We were definitely impressed with the performance increase.

IGNcube: What other changes have you made to the game with its transition to GameCube, if any?

Sam: The major changes we made to the game technically were due to the reduced disk space and smaller amount of RAM. Everything had to be compressed then uncompressed on the fly as it was streamed off the disk. Our conversion team did an outstanding job.

It seems that what happened back then was that third parties needed to be convinced there was an audience for darker titles geared towards the 18-40 demographic (remember how the N64 suffered from the kiddie-console label in a similar way that the Wii suffered from the "family games" mindset?). We basically need Darksiders, Aliens, ZombiU and Assassin's to do really well across the board.
 

darthdago

Member
LOL, no there aren't.

Anyway, there used to be some discrepancy between AMD card flops and Nvidia card flops (more efficient), but A) nothing that would make a 700 Gflops card top a 2000 Gflops card, and B) starting with Kepler even that has mostly evaporated (Kepler is much closer to AMD cards in flops).

Besides, I've always been of the opinion that AMD flops weren't allowed to stretch their legs in PC games, since those must be aimed at two vendors, but in a console they would be. So you might see some of that prior AMD flops edge really pay off in a console (but again, that's less relevant now anyway). ANYWAY


Since you seem to know (or anyone else who really knows), please explain this to me.

I understand it such that the flops have some importance (otherwise neither Nvidia nor AMD would advertise with them, right?).
But, for example, if we had a card from, let's say, 2005 rated at 1.2 Tflops, could a card from 2010 with 700 Gflops perform better due to more modern features?
Is that the right way to understand it?
 

MDX

Member
DDR3 + EDRam makes a lot of sense, doesn't it?

Especially if you're talking about a quantity larger than 1 GB or 1.5 GB and a system meant to be close to profitable around the $300 mark.

If Nintendo is going with something like the AMD Radeon E6760, it comes packaged with 1 GB of GDDR5 (potentially MEM1), so why would Nintendo replace that with DDR3 if they will set up their memory pools similarly to their older designs?:

The Wii
MEM1 24MB 1T-SRAM (internal main memory)
MEM2 64MB GDDR3 (external main memory)
eDRAM 3MB (embedded)

I don't see how 32MB of eDRAM becomes MEM1.


x amount of gddr5 as internal main memory
x amount of ddr3 as external main memory
32MB of embedded memory
 
Since you seem to know (or anyone else who really knows), please explain this to me.

I understand it such that the flops have some importance (otherwise neither Nvidia nor AMD would advertise with them, right?).
But, for example, if we had a card from, let's say, 2005 rated at 1.2 Tflops, could a card from 2010 with 700 Gflops perform better due to more modern features?
Is that the right way to understand it?

The HD 7750, which is around 819 Gflops, is faster than the 4870, which is 1.2 Tflops:

http://www.anandtech.com/bench/Product/513?vs=535

The 4870 is faster in DX9 games, but in DX11 the HD 7750 is faster, both at stock clocks.

So probably not, but close. Maybe having extra development time and being in a console might help slightly, but it's all speculation at this point.
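For reference, the peak-flops figures being traded in this exchange all come from the same textbook formula (shader units × clock × 2 fused-multiply-add ops per cycle). A quick sketch using the publicly listed specs of the cards named in the thread:

```python
def peak_gflops(shader_units: int, clock_mhz: float) -> float:
    """Theoretical peak throughput: 2 ops (one FMA) per shader unit per cycle."""
    return shader_units * clock_mhz * 2 / 1000.0

# Public specs of the cards mentioned in this thread.
cards = {
    "GTX 550 Ti": peak_gflops(192, 1800),  # ~691  ("just under 700")
    "HD 4870":    peak_gflops(800, 750),   # 1200  (1.2 Tflops)
    "HD 7750":    peak_gflops(512, 800),   # ~819
}
```

The HD 7750 beating the HD 4870 in DX11 workloads despite ~30% fewer peak flops is exactly the point being made: the formula measures theoretical throughput, and how much of it a real game can actually use depends on the architecture.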
 

MDX

Member
I just realized, with the cancellation of Metro, and the delay of Ghost Recon, CoD not being officially announced, the WiiU is starting off with a dearth of FPSs, one of the most popular genres of this gen.

If there was one genre Nintendo should have tried to launch with, alongside Mario, should have been an impressive FPS.


 
I just realized, with the cancellation of Metro, and the delay of Ghost Recon, CoD not being officially announced, the WiiU is starting off with a dearth of FPSs, one of the most popular genres of this gen.

If there was one genre Nintendo should have tried to launch with, alongside Mario, should have been an impressive FPS.



I think it's too soon to draw any conclusions.
 

japtor

Member
LOL, no there aren't.

Anyway, there used to be some discrepancy between AMD card flops and Nvidia card flops (more efficient), but A) nothing that would make a 700 Gflops card top a 2000 Gflops card, and B) starting with Kepler even that has mostly evaporated (Kepler is much closer to AMD cards in flops).

Besides, I've always been of the opinion that AMD flops weren't allowed to stretch their legs in PC games, since those must be aimed at two vendors, but in a console they would be. So you might see some of that prior AMD flops edge really pay off in a console (but again, that's less relevant now anyway). ANYWAY
How many PC flops would a Wii U flop if a Wii U could flop PC flops?

Twice as many according to USC-fan!
Wow this completely vindicates ideaman. So far he was right on every single piece of information.
I've just assumed his contact(s) were in Ubisoft (cause France of course...and Ubi loves to leak), then the ZombiU team once that was shown off. Some stuff he hinted at vaguely matched up with elements of the game, not really completely on but close enough that it seemed more than coincidental. Like discrepancies could've just been cause the source was being purposefully vague...or IdeaMan, cause that's just his thing.
I just realized, with the cancellation of Metro, and the delay of Ghost Recon, CoD not being officially announced, the WiiU is starting off with a dearth of FPSs, one of the most popular genres of this gen.

If there was one genre Nintendo should have tried to launch with, alongside Mario, should have been an impressive FPS.
Maybe the lack of FPS is an indication of next gen!

(They should've gone after Free Radical years ago to secure TS4 when they were shopping it around)
 

D-e-f-

Banned
I just realized, with the cancellation of Metro, and the delay of Ghost Recon, CoD not being officially announced, the WiiU is starting off with a dearth of FPSs, one of the most popular genres of this gen.

If there was one genre Nintendo should have tried to launch with, alongside Mario, should have been an impressive FPS.
Ghost Recon Online wasn't delayed. They're naturally making the PC version first as that's the main SKU off of which they're basing the console port. I don't think that means anything as it's important to get a free-to-play PC game right before you can adapt this to a console environment properly.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
If Nintendo is going with something like the AMD-Radeon-E6760, it comes with a package of 1 GB GDDR5- potentially MEM1, why would Nintendo replace that with DDR3 if they will set up their memory pools similar to their older designs?:

The Wii
MEM1 24MB 1T-SRAM (internal main memory)
MEM2 64MB GDDR3 (external main memory)
eDRAM 3MB (embedded)

I don't see how 32MB of eDRAM becomes MEM1.


x amount of gddr5 as internal main memory
x amount of ddr3 as external main memory
32MB of embedded memory
MEM1 and MEM2 are not local to the GPU in the cube architecture. Well, ok, the Wii was a bit more complicated than that, in the sense that MEM2 was more GPU centric in some ways, as in, it served the GPU better (high BW, high latency).

So far MEM1 on the U appears to be GPU-local, *but* with sufficiently swift CPU access so that the pool could match what the 1T-SRAM had to offer.
 
Are you expecting high-profile FPSes to be available for the launch window?

Well, Aliens is confirmed and Blops 2 is on its way as well, but my point was that as of now we only have a partial list. I would wait for Nintendo's fall conference for a more complete picture of launch/launch-window titles.
 
Another crazy idea:

Imagine a helmet with a slot where you place the tablet so that its screen and camera face your eyes. You now have a VR helmet!
The camera would track your eye movements, the gyro your head movement, and in your hand you'd hold a Wii U Pro Controller to play with as the screen displays a VR world.

If the helmet had a second slot, you could put in a second, forward-facing tablet and enable a pretty cool form of augmented reality.
 

IdeaMan

My source is my ass!
(Not a tech guy... at all) If the E3 demos likely weren't optimised for the fifth kits (wouldn't they just have pulled the code they were using right before the 5th kits arrived, since that would have been more stable?), is it likely some games will see reasonable improvements before launch? Or is there not enough time to take advantage of optimization in a meaningful way?

Again, I have no idea on this sort of stuff.

As explained in one of the previous speculation threads, third parties that received the mass-production dev kit (roughly one month before E3) saw an instant increase in the framerate of their games, and seemingly without having to work hard to adapt their builds. I think it happened like that because my sources got the new boxes, were able to run some benchmarks with their games, and quickly saw the overall increase in performance as a percentage (less than 10%).

And they burned their E3 demos mere days before the show, so they had a month, a tad less, to adapt their content to this new dev kit. It seems (grain of salt) that even with the delay in shipping, the kit didn't bring drastic changes (more like tweaks or minor changes), such as a replacement component that might have pushed them to redo a lot of code and adjust their game to a hypothetical new hardware setting in a rush before E3. There's also the scenario where there was a hardware modification, but a slight one, or one done in a way that didn't represent a hassle for developers, because no one reported any difficulty with it to me.

All in all, the games at launch will see a lot of improvements, especially in the image-quality department: there will be anti-aliasing (for the ones that lacked it), texture filtering, fewer problems with texture pop-in, etc. Thanks to studios' familiarization with these latest dev kits, new SDK releases, optimizations on their end (their engines, the middleware used, etc.), growing more accustomed to the asymmetric-screen setup, and a lot of other factors, the Wii U titles on the shelves will be prettier than the E3 demos.

Now, my sources told me that any additional performance they get from all these factors will benefit their games in a particular way. Concretely, if they gain 5 frames per second, their builds won't run at 35fps; they will take advantage of the headroom to add more effects, etc., and the framerate will come back down to 30fps.

With that said, it must be perfectly clear that in this context (third-party projects, most of them games not made for the Wii U from the start, etc.), all these improvements won't mean that a game you saw running at 720p/30fps at E3 will be either 1080p/30fps or 720p/60fps. The jump to reach those next visual steps is too big compared to the additional performance they will manage to grab from all the previously mentioned points.
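The size of that jump is easy to put in numbers. A back-of-the-envelope pixel-throughput comparison (pure arithmetic, not a claim about any specific engine) shows why a sub-10% gain can't bridge it:

```python
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels pushed per second: a crude proxy for rendering cost."""
    return width * height * fps

base  = pixels_per_second(1280, 720, 30)   # the 720p/30fps E3 demo baseline
p1080 = pixels_per_second(1920, 1080, 30)  # jump to 1080p/30fps
p60   = pixels_per_second(1280, 720, 60)   # jump to 720p/60fps

ratio_1080 = p1080 / base  # 2.25x the work
ratio_60   = p60 / base    # 2.0x the work
```

Reaching either target needs roughly 2x to 2.25x the throughput, versus the less-than-1.1x improvement the new kits and optimizations were delivering, which is exactly IdeaMan's point.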
 

IdeaMan

My source is my ass!

You heard it first here :p

Wow this completely vindicates ideaman. So far he was right on every single piece of information.

Thanks :) But the HDD and online info weren't confirmed; we're waiting for more explanations from Nintendo. And it seems the controller screen size didn't change after all. I think on this point we'll need a Nintendo tech guy to confirm or deny that the DRC screen we saw at E3 2011 is the same as the one in the final GamePad, or that there was indeed a very slight increase in size, like from 6.2" to 6.25". Either my sources or I mixed it up with the increase in size of the GamePad itself, or they just ditched this modification (it was "virtual": third parties were notified of it through the Nintendo developer portal).
 

MDX

Member
Well, it has Medal of Honor: Warfighter, which releases on the 23rd of October.

Well Aliens is confirmed and Blops2 is on its way as well, but my point was as of now we only have a partial list, I would wait for the nintendo's fall conference for a more complete representation of launch/ launch window titles.

So we've got potentially three first-person shooters for the Wii U launch window.
Better than the Wii's launch.

And maybe some unannounced games for the first half of 2013.
 

alfolla

Neo Member
As explained in one of the previous speculation threads, third parties that received the mass-production dev kit (roughly one month before E3) saw an instant increase in the framerate of their games, and seemingly without having to work hard to adapt their builds. I think it happened like that because my sources got the new boxes, were able to run some benchmarks with their games, and quickly saw the overall increase in performance as a percentage (less than 10%).

And they burned their E3 demos mere days before the show, so they had a month, a tad less, to adapt their content to this new dev kit. It seems (grain of salt) that even with the delay in shipping, the kit didn't bring drastic changes (more like tweaks or minor changes), such as a replacement component that might have pushed them to redo a lot of code and adjust their game to a hypothetical new hardware setting in a rush before E3. There's also the scenario where there was a hardware modification, but a slight one, or one done in a way that didn't represent a hassle for developers, because no one reported any difficulty with it to me.

All in all, the games at launch will see a lot of improvements, especially in the image-quality department: there will be anti-aliasing (for the ones that lacked it), texture filtering, fewer problems with texture pop-in, etc. Thanks to studios' familiarization with these latest dev kits, new SDK releases, optimizations on their end (their engines, the middleware used, etc.), growing more accustomed to the asymmetric-screen setup, and a lot of other factors, the Wii U titles on the shelves will be prettier than the E3 demos.

Now, my sources told me that any additional performance they get from all these factors will benefit their games in a particular way. Concretely, if they gain 5 frames per second, their builds won't run at 35fps; they will take advantage of the headroom to add more effects, etc., and the framerate will come back down to 30fps.

With that said, it must be perfectly clear that in this context (third-party projects, most of them games not made for the Wii U from the start, etc.), all these improvements won't mean that a game you saw running at 720p/30fps at E3 will be either 1080p/30fps or 720p/60fps. The jump to reach those next visual steps is too big compared to the additional performance they will manage to grab from all the previously mentioned points.

Texture pop-in is massive in Arkham City, to the point that I would not buy the game if WB Montreal doesn't fix this problem.

A question I always wanted to ask: in what way will games built for the Wii U from the ground up benefit from the console's extra horsepower?
I mean, do software houses really have to approach writing the game code in a whole different way?
 

coughlanio

Member
Texture pop-in is massive in Arkham City, to the point that I would not buy the game if WB Montreal doesn't fix this problem.

A question I always wanted to ask: in what way will games built for the Wii U from the ground up benefit from the console's extra horsepower?
I mean, do software houses really have to approach writing the game code in a whole different way?

If Sony and Microsoft decide to go with off-the-shelf PPC parts next generation, development for multiple platforms should be trivial, at least between the next PlayStation and Wii U.
 
Texture pop-in is massive in Arkham City, to the point that I would not buy the game if WB Montreal doesn't fix this problem.

A question I always wanted to ask: in what way will games built for the Wii U from the ground up benefit from the console's extra horsepower?
I mean, do software houses really have to approach writing the game code in a whole different way?

Someone correct me if I'm wrong, but the way I understand Arkham, and what texture pop-in is: UE3 uses a low-quality texture for objects in the distance and streams in a higher-quality texture as objects move to the foreground. So what we're seeing in Arkham at the moment is UE3 not yet tweaked correctly for the Wii U, so the high-quality foreground textures aren't loading in.

I can only think there's no way the game will launch like that.
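The mechanism described above can be sketched as distance-based mip selection. This is a simplified stand-in for what a streamer like UE3's does (the real one budgets memory and bandwidth and is far more involved), just to show where pop-in comes from:

```python
import math

def target_mip(distance: float, mip_count: int, near: float = 10.0) -> int:
    """Pick a mip level by camera distance: 0 = sharpest, higher = blurrier.

    Each doubling of distance beyond `near` drops one mip level, mirroring
    how screen-space texture density halves as an object recedes.
    """
    if distance <= near:
        return 0
    return min(mip_count - 1, int(math.log2(distance / near)))

# "Pop-in" is the visible moment when an object crosses a distance
# threshold and its sharper mip hasn't finished streaming from disk yet;
# until the load completes, the renderer keeps showing the blurry mip.
```

The `near` threshold and the log2 falloff are illustrative choices, not engine constants; the point is only that the resident texture quality is a function of distance, and the lag between crossing a threshold and the higher mip arriving is what players see as popping.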
 
Texture pop-in is massive in Arkham City, to the point that I would not buy the game if WB Montreal doesn't fix this problem.

A question I always wanted to ask: in what way will games built for the Wii U from the ground up benefit from the console's extra horsepower?
I mean, do software houses really have to approach writing the game code in a whole different way?

From how everything sounds, the GPU may be handling things like lighting and physics. I posted this a while back, but here is an example of lighting being handled by the GPU.

http://www.youtube.com/watch?v=C6TUVsmNUKI

So if the CPU isn't quite capable of handling those things on its own, yet the game is still developed that way, then you can see why a game built with that in mind will benefit much more.

If Sony and Microsoft decide to go with off the shelf PPC parts next generation, development for multiple platforms should be trivial, at least between the next Playstation and Wii U.

They're pretty much set for AMD CPUs.
 

HylianTom

Banned
It sounds like it'll miss out more due to publisher perceptions of Wii U customers, and/or devs/pubs going down in flames and not being able to afford ports after being stupid with development costs.
BINGO. This time around, specs are going to be a convenient excuse. Plausible deniability and all that jazz..

Even if AC3, ZombiU, etc. sell well at launch, watch the excuses fly. We could make a sad drinking game out of it (much like we could make a pubs/devs/studios going "buh-bye" drinking game).
 