
Nintendo: Wii U Re-Unveil At E3 2012, Working On Strong Launch, No Pokemon?

Massa

Member
StevieP said:
Except for the hardware required to run it.



That's exactly what the process of optimization is - coders aren't magicians. "Same results" is subjective, in other words.

You really have no idea what you're going on about here, sorry.
 

Shanadeus

Banned
Thunder Monkey said:
So much optimization you may as well run it on a 360. I doubt any of the three use a GPU higher than 800 SPUs, and that's honestly pushing it if they want to keep their power budget low.

Now if both wait until late 2014 to release? They very well could hit that benchmark. But that gives Nintendo two years free on the market. And neither would want to give Nintendo that much of an advantage.

Now if Nintendo launches next fall, MS the spring of 2013, and Sony sometime in 2014, you're going to be looking at an odd generation. As far apart as the WiiU is from the 720, the 720 would be from the PS4. Under this scenario the only one that would likely have the power to run Samaritan is the PS4.

And no way in hell is Sony going to give Nintendo two years and MS one year.
I always assumed you were kinda dumb, but you kinda make sense.

My apologies.
 
iamshadowlark said:
Lol. That's not what optimization is at all. What about things like reorganizing your data structures and rewriting more efficient algorithms? Nah, you know it all.
And most of those efficiencies come from lowering precision on a whole range of effects. Not that they can't get close. But then again I don't think any of the three are going to be using hardware that much more powerful than a 2012 PC. Sony maybe. MS no.
 

BurntPork

Banned
iamshadowlark said:
Honestly there is nothing in samaritan that a proper next gen machine couldn't handle.
Technically, based on known info, the Wii U has the tech necessary to use the vast majority of the effects in Samaritan; it just has to be toned down. (Keep in mind that the Wii U's GPU is customized and will probably be a lot more than just an R700 in its final form, even if that is the base.)

And it really sucks that the Wii U probably won't be out until November next year. That means I'll probably have to wait a year longer than I expected to buy it. :/ And before you talk to me about the benefits of waiting for a larger library, I have four fucking game systems in my house right now, plus a semi-capable laptop, plus I'm getting a new Android phone in two weeks. I could not give any less of a fuck about one of my systems experiencing a drought and I'll never understand why you guys do. Still, I understood and expected this as soon as I found out about the loss.

Also, I've been watching over the past few days, and to everyone who made fun of me behind my back,

peter_griffin_fcuk_you_gif.gif


Okay, I'm done now.
 

Shanadeus

Banned
BurntPork said:
Technically, based on known info, the Wii U has the tech necessary to use the vast majority of the effects in Samaritan; it just has to be toned down. (Keep in mind that the Wii U's GPU is customized and will probably be a lot more than just an R700 in its final form, even if that is the base.)

And it really sucks that the Wii U probably won't be out until November next year. That means I'll probably have to wait a year longer than I expected to buy it. :/ And before you talk to me about the benefits of waiting for a larger library, I have four fucking game systems in my house right now, plus a semi-capable laptop, plus I'm getting a new Android phone in two weeks. I could not give any less of a fuck about one of my systems experiencing a drought and I'll never understand why you guys do. Still, I understood and expected this as soon as I found out about the loss.

Also, I've been watching over the past few days, and to everyone who made fun of me behind my back,

peter_griffin_fcuk_you_gif.gif


Okay, I'm done now.
Welcome back.

D:
 

StevieP

Banned
Thunder Monkey said:
And most of those efficiencies come from lowering precision on a whole range of effects. Not that they can't get close. But then again I don't think any of the three are going to be using hardware that much more powerful than a 2012 PC. Sony maybe. MS no.

Yes, this is what I mean.

Lol. That's not what optimization is at all. What about things like reorganizing your data structures and rewriting more efficient algorithms? Nah, you know it all.

Reorganizing data paths isn't going to net you the means to run what's running on a Sandy Bridge + 3 GTX 580s on the equivalent of a single GTS 450. And "efficient algorithms" is exactly what I was just talking about, worded differently.

So you optimize in order to achieve poorer results? Never knew that.

You and others are using the word "optimize" as if it's some kinda magic. Wanna know why Crysis 2 is so much more "optimized" than Crysis 1? Look at the texture resolutions on things like the floor and the wall to start. Do some people notice things like that? No, they don't. I do. But I also still regularly play NES/SMS games.
 
Shanadeus said:
I always assumed you were kinda dumb, but you kinda make sense.

My apologies.
The monkey is dumb; the human behind the monkey has been interested in pixel shading tech since he read a paper on color combiners more than a decade ago. Honestly, I never expected the tech to advance to the level it has this quickly. I thought Dot3 precision effects wouldn't be common until around the time of the 360; to see it happen almost a generation before that was a surprise.

Gotta give it to realtime renderers. They do push the bar far in little time.

Don't tell the monkey I called him dumb. He might send Nirolak another nude photo and get banned for a month again.

H_Prestige said:
So you optimize in order to achieve poorer results? Never knew that.
Or to achieve them at all on given hardware?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Massa said:
And then you include the optimization and bam, it runs on next-gen consoles.
I'd kill for an optimisation-on-a-switch (outside of "drop all render targets' res 5 times") that can get a tech demo running on 3 (three) GTX 580s to run on whatever will be in the Xbox Loop/PS4 at similar performance. No, seriously, just point me to a target and I'd whack 'em.
 

[Nintex]

Member
The one thing I don't get is that Nintendo isn't quite ready to start the next generation yet but still they went ahead and nuked the Wii. After Zelda they look extremely weak on the console side for the coming 12 months.
 
Gamer @ Heart said:
I doubt they were scrambling. It's probably been on the table for years, but there was no point in trying to make it work on the Wii. Just hold it off until the Wii U plans are more finalized, then roll it out first on the 3DS so that they can be unified later on.


That's all they have been doing since the summer...scrambling.

Hopefully they figure it out for E3 and 2012.
 
[Nintex] said:
The one thing I don't get is that Nintendo isn't quite ready to start the next generation yet but still they went ahead and nuked the Wii. After Zelda they look extremely weak on the console side for the coming 12 months.
Being perfectly honest Nintendo has looked extremely weak for a long time before that.
 

Medalion

Banned
Thunder Monkey said:
Being perfectly honest Nintendo has looked extremely weak for a long time before that.
3DS looks promising by the day but Wii is just a barren wasteland after Zelda for 1 freakin year, that is insane
 
Ahh, BurntPork is back! I was starting to miss his craziness (and I wouldn't have any excuse to post the BurntPork chart otherwise!)

orioto said:
In the "Nintendo loves contradiction" family: the Wii U uses a stylus, yet it's meant to use a touch interface while you're playing with a pad. That means you won't be able to use that stylus most of the time.

See Ocarina of Time on 3DS: do you have a stylus in your hand while you play with the pad, to help you select your weapon? No. You use your finger with a touch screen that isn't made for that at all... Nintendo Magic!

Yes, resistive screens can be operated with a finger as well, whodathunkit? You can't do multitouch, but that's not a big deal when you have buttons (never mind the tradeoff for losing the accuracy of stylus controls)

ShockingAlberto said:
I doubt the logic was

Must Use Stylus -> Resistive

It was likely

Capacitive -> Easy to Break -> Expensive -> Must try to keep costs down -> Resistive

Are capacitive screens really that expensive these days, though? If anything, wouldn't resistive screens be more expensive, since the economies of scale that come from practically every non-Nintendo touchscreen device using a capacitive screen wouldn't apply? (Nintendo is basically the only major manufacturer still using resistive screens.)

I can imagine that durability might come into play though (but probably not enough of a factor to push them away from resistive tech)


Medalion said:
3DS looks promising by the day but Wii is just a barren wasteland after Zelda for 1 freakin year, that is insane

Well they do have Rhythm Heaven Fever, Mario Party 9, Mario & Sonic 3, The Last Story (PAL) and Pandora's Tower (PAL) so they do still have a good few games to release in 2012.
 
Thunder Monkey said:
And most of those efficiencies come from lowering precision on a whole range of effects. Not that they can't get close. But then again I don't think any of the three are going to be using hardware that much more powerful than a 2012 PC. Sony maybe. MS no.
No, it's not. These efficiencies come from debugging and profiling your code line by line, taking that usage data, and coming up with ways to restructure the code in a more efficient manner, if that simplifies it enough. What you and StevieP described is more a result of deadlines and the like.
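The measure-then-restructure loop described in this post can be sketched in a few lines. This is a hypothetical illustration (not from any poster here): time the hot path, swap the data structure, and verify the output is unchanged.

```python
# Hypothetical sketch of profile-guided restructuring: same output,
# different data structure, verified by timing rather than guesswork.
import timeit

def dedupe_list(items):
    """Duplicate removal with a list: each 'in' test is a linear scan."""
    seen, out = [], []
    for x in items:
        if x not in seen:       # O(n) scan on every iteration
            seen.append(x)
            out.append(x)
    return out

def dedupe_set(items):
    """Same result, restructured around a hash set: 'in' is O(1)."""
    seen, out = set(), []
    for x in items:
        if x not in seen:       # constant-time lookup
            seen.add(x)
            out.append(x)
    return out

data = list(range(1500)) * 2
assert dedupe_list(data) == dedupe_set(data)   # identical results
slow = timeit.timeit(lambda: dedupe_list(data), number=3)
fast = timeit.timeit(lambda: dedupe_set(data), number=3)
print(fast < slow)   # the restructured version wins on the same input
```

The point is the one being argued: nothing visible changes, only the cost of producing it.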
 

Azure J

Member
[Nintex] said:
The one thing I don't get is that Nintendo isn't quite ready to start the next generation yet but still they went ahead and nuked the Wii. After Zelda they look extremely weak on the console side for the coming 12 months.

Seriously, this. It also doesn't help that this latest development makes it sound like Wii U news is going to be a desert until mid-2012. What's there to keep interest high? Seems like a lot to ask of the 3DS alone. The Wii is almost a non-factor with how Nintendo (primarily NoA) has cast it out in the cold.
 

birdchili

Member
Thunder Monkey said:
Being perfectly honest Nintendo has looked extremely weak for a long time before that.
it's so painful to watch, since they keep making statements that suggest that they know what they're doing wrong...

next year is going to be a mess for them on the console front.
 
iamshadowlark said:
No, it's not. These efficiencies come from debugging and profiling your code line by line, taking that usage data, and coming up with ways to restructure the code in a more efficient manner, if that simplifies it enough. What you and StevieP described is more a result of deadlines and the like.
So you truly believe they can take the effects seen in a cutscene and use them on hardware nowhere near as powerful as what it was built on, just by reorganizing code? I'll take your word for it, mate, but I don't see how they could pull it off without changing something.
 

KrawlMan

Member
StevieP said:
You and others are using the word "optimize" as if it's some kinda magic. Wanna know why Crysis 2 is so much more "optimized" than Crysis 1? Look at the texture resolutions on things like the floor and the wall to start. Do some people notice things like that? No, they don't. I do. But I also still regularly play NES/SMS games.

Now, I'm not a game developer, but I always assumed "optimize" was used in game development in the same way that it's used in software development. Some aspects of your code base may be less efficient than others. To optimize your code you don't sacrifice functionality; you find areas that were less than ideal before and improve on them. This can take the form of finding a way to reduce how much data is stored in memory, or speeding up some particularly slow function.
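A toy illustration of that software-development sense of "optimize" (hypothetical, and in Python for brevity): the slow and fast versions return exactly the same answers, only the cost changes, which is the distinction being drawn in this post.

```python
# Hypothetical example: speeding up a slow function without touching
# its functionality - both versions compute the same values.
from functools import lru_cache

def fib_slow(n):
    """Exponential time: recomputes the same subproblems over and over."""
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    """Linear time: caches each subproblem, trading a little memory."""
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

assert fib_slow(20) == fib_fast(20) == 6765   # identical output
print(fib_fast(90))   # instant; the naive version would never finish
```

Nothing about what the function produces was sacrificed; memory was traded for speed.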
 
Just so no one takes what I'm saying as fact.

I am more centered on the visual aspects of game design than programming. Like I said, I'll take what he's saying as truth. But... I don't see how something built for 1200 SPUs can run as-is on an 800 SPU part. Without a loss of quality, that is.
 

HeySeuss

Member
BurntPork said:
Did I say anything negative? At all?

Well, okay, I guess that last part technically.
Welcome back, brah. Threads aren't the same without you. Also, acebandage is/was banned.
 

BurntPork

Banned
[Nintex] said:
The one thing I don't get is that Nintendo isn't quite ready to start the next generation yet but still they went ahead and nuked the Wii. After Zelda they look extremely weak on the console side for the coming 12 months.
Well, obviously something didn't go according to plan. I think they thought it was best to deal with the drought in the short term so that they could depend on the Wii U taking off and getting third-party support in the long term, so they don't have to carry it alone. Nintendo really likes to focus on the long term.

Also, looking at Nintendo's stock, whatever Iwata said really worked! I bet it was the whole eShop on smartphones thing.

Thunder Monkey said:
It's okay. I'm glad you're back, if only for your eccentric nature.
Shick Brithouse said:
Welcome back, brah. Threads aren't the same without you. Also, acebandage is/was banned.
I'm going to be a bit more chill now, so you might not be totally satisfied. :p
 

StevieP

Banned
Thunder Monkey said:
Just so no one takes what I'm saying as fact.

I am more centered on the visual aspects of game design than programming. Like I said, I'll take what he's saying as truth. But... I don't see how something built for 1200 SPUs can run as-is on an 800 SPU part. Without a loss of quality, that is.

We're speaking in terms of visuals here. It can't. Something made for the equivalent of 4500 SPUs cannot look the same running on the 800 (or fewer) SPUs we're looking at when we talk about the next generation of consoles and the thermal/power limits associated with them.
 
KrawlMan said:
Now, I'm not a game developer, but I always assumed "optimize" was used in game development in the same way that it's used in software development. Some aspects of your code base may be less efficient than others. To optimize your code you don't sacrifice functionality; you find areas that were less than ideal before and improve on them. This can take the form of finding a way to reduce how much data is stored in memory, or speeding up some particularly slow function.
Exactly. That's one of the first things you learn in school.

Thunder Monkey said:
So you truly believe they can take the effects seen in a cutscene, and use it on hardware no where near as powerful as what it was built on just by reorganizing code? I'll take your word for it mate, but I don't see how they could pull it off without changing something.

Well, you also have to factor in how much less efficient PC development is than console development.
PC games, in simple terms, employ a brute-force method of development. Some mid-level API does most of the work, since the software needs to run on an infinite number of different configurations. You have way less control of the hardware, so you have to write code in a more generalized fashion, without being able to take advantage of designing software for a particular architecture and maximizing performance.

Thunder Monkey said:
Just so no one takes what I'm saying as fact.

I am more centered on the visual aspects of game design than programming. Like I said, I'll take what he's saying as truth. But... I don't see how something built for 1200 SPUs can run as-is on an 800 SPU part. Without a loss of quality, that is.

It's not as-is; see the above.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Thunder Monkey said:
Just so no one takes what I'm saying as fact.

I am more centered on the visual aspects of game design than programming. Like I said, I'll take what he's saying as truth. But... I don't see how something built for 1200 SPUs can run as-is on an 800 SPU part. Without a loss of quality, that is.
Erm, as somebody who's been doing and optimising game engines & graphics pipelines for the past 9 years, across a plethora of architectures, I can assure you, there's no such programming and/or dataflow optimisation which you can just flip a switch on and get something which required the BW and ALU power of 3x GTX 580s to run on something which is 1/3 or 1/4 the power of those. Bar dropping resolutions left and right (thus lowering both BW and ALU reqs). The programming effort for such things can vary from non-trivial to outright-fucking-impossible, and this is taking into account all kinds of profiling tools, GPU analyzers and all that jazz. Even if you were ultra lame with the ALU in your shaders and the analyzer told you so to your face, there still remains the issue of BW, and that usually is 'optimised' in one way and one way only - you cut things.
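The bandwidth half of this argument can be put in rough numbers. A back-of-envelope sketch follows; every figure in it (resolutions, overdraw, frame rate) is an illustrative assumption, not a spec for Samaritan or any console. It shows why dropping render-target resolution is the one reliable bandwidth lever: framebuffer traffic scales directly with pixel count.

```python
# Back-of-envelope sketch: framebuffer write bandwidth scales with
# pixel count. All numbers are illustrative assumptions, not specs.
def framebuffer_gbps(width, height, bytes_per_pixel, overdraw, fps):
    """Rough GB/s just to write the colour buffer every frame."""
    return width * height * bytes_per_pixel * overdraw * fps / 1e9

demo    = framebuffer_gbps(2560, 1600, 4, 4.0, 30)   # high-res PC demo
console = framebuffer_gbps(1280, 720, 4, 4.0, 30)    # console-ish target
print(round(demo / console, 2))   # pixel count alone is ~4.44x the traffic
```

Shader (ALU) cost per frame scales with the same pixel count, which is why a resolution cut lowers both requirements at once, exactly as the post says.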
 

StevieP

Banned
EloquentM said:
why are half the people arguing never in the WiiU speculation thread?

Because they'd be laughed out at the mention of Samaritan. It pops up in every other WiiU thread or any thread that discusses any of the 8th generation of consoles, really.
 

EloquentM

aka Mannny
StevieP said:
Because they'd be laughed out at the mention of Samaritan. It pops up in every other WiiU thread or any thread that discusses any of the 8th generation of consoles, really.
I mean, when you look at it, could the average gaming rig OUT TODAY play the Samaritan tech demo on high? I don't expect that it would be able to, so why would we expect a small enclosed gaming console that comes out even in 2014 to be able to pump out visuals like that?
 
iamshadowlark said:
Exactly. Thats one of the first things you learn in school.



Well, you also have to factor in how much less efficient PC development is than console development.
PC games, in simple terms, employ a brute-force method of development. Some mid-level API does most of the work, since the software needs to run on an infinite number of different configurations. You have way less control of the hardware, so you have to write code in a more generalized fashion, without being able to take advantage of designing software for a particular architecture and maximizing performance.
No I get that.

I just don't think their hardware will be capable of it at all without cutting precision, or texture res, or particles, etc. I don't expect the 720 to be that much more powerful than the Wii U. It'll likely have more RAM, but be more or less on equal footing otherwise. No amount of optimization or cuts will get Samaritan to run on a 640 SPU part.

You'll be able to get close... or close enough for me. But not without a simpler form of the effects shown. If you can get it to run on a 720, you should be able to get it to run on a WiiU, and from there a PS3 or 360. It wouldn't be anything close to the Samaritan engine at that point.
 
blu said:
Erm, as somebody who's been doing and optimising game engines & graphics pipelines for the past 9 years, across a plethora of architectures, I can assure you, there's no such programming and/or dataflow optimisation which you can just flip a switch on and get something which required the BW and ALU power of 3x GTX 580s to run on something which is 1/3 or 1/4 the power of those. Bar dropping resolutions left and right (thus lowering both BW and ALU reqs). The programming effort for such things can vary from non-trivial to outright-fucking-impossible, and this is taking into account all kinds of profiling tools, GPU analyzers and all that jazz. Even if you were ultra lame with the ALU in your shaders and the analyzer told you so to your face, there still remains the issue of BW, and that usually is 'optimised' in one way and one way only - you cut things.
Which was my default position at the outset.

He seems so damn sure though. Easy to see the appeal when you don't have programming knowledge to back you up.
 
Thunder Monkey said:
Which was my default position at the outset.

He seems so damn sure though. Easy to see the appeal when you don't have programming knowledge to back you up.

Psh, you don't know which hood he's from, man. He also led his debate team.
 
[Nintex] said:
The one thing I don't get is that Nintendo isn't quite ready to start the next generation yet but still they went ahead and nuked the Wii. After Zelda they look extremely weak on the console side for the coming 12 months.


They don't have games they could release for any console (whether Wii or WiiU) until 2013 or 2014. They could either release WiiU now and have a huge drought for years and years or just have the Wii end with a year of no games and have a less huge drought for the WiiU later on.
 
blu said:
Erm, as somebody who's been doing and optimising game engines & graphics pipelines for the past 9 years, across a plethora of architectures, I can assure you, there's no such programming and/or dataflow optimisation which you can just flip a switch on and get something which required the BW and ALU power of 3x GTX 580s to run on something which is 1/3 or 1/4 the power of those. Bar dropping resolutions left and right (thus lowering both BW and ALU reqs). The programming effort for such things can vary from non-trivial to outright-fucking-impossible, and this is taking into account all kinds of profiling tools, GPU analyzers and all that jazz. Even if you were ultra lame with the ALU in your shaders and the analyzer told you so to your face, there still remains the issue of BW, and that usually is 'optimised' in one way and one way only - you cut things.

Wait, are you comparing the BW requirements of PC software to those of a console? Specifically, a PC with a motherboard that has 3 GPUs and any number of other non-essential components? Seriously?

Also, at the bolded: that's exactly why companies like Sony and Epic have teams assigned to this sort of task. And who the hell said anything about flipping a switch? lol
 
I just wanted to chime in to let people know that resistive multi-touch screens do exist...
I'm not sure if Nintendo is gonna use one, but they are out there if they want one.

and as for the Optimization talk...
I've always imagined it to be that a dev would push for the best graphics possible that the machine can produce, then realize that, now that they're pushing every advanced effect in the book at the highest possible resolution, the frame rate has dropped to 2 fps.

So they need to optimize. In order to do that, they need to scale back the amount of effects, polygons, memory usage, resolution or whatever it is they can to ease the load on the GPU and get that frame rate back up to acceptable levels. This may mean reorganizing some code to improve data flow, but will most likely involve the downgrading of character models, the removal of lighting sources and shadows, the lowering of texture resolution, the decrease of draw distance and whatever other graphical effects they can trim back without making the game immediately look last gen compared to the maxed out screen caps.

Now that optimization would work any which way they need it to depending on what the desired results were, but am I wrong to think of it that way?
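The scale-things-back loop described above can be sketched mechanically. Everything below (the feature names, the per-feature costs, the halving rule, the 33.3 ms budget for ~30 fps) is invented for illustration, not taken from any real engine:

```python
# Invented-numbers sketch of trimming settings until an estimated
# frame time fits a budget; features and costs are illustrative only.
COSTS_MS = {"particles": 5.0, "draw_distance": 7.0,
            "texture_res": 6.0, "shadows": 8.0, "resolution": 12.0}
TRIM_ORDER = ["particles", "draw_distance", "texture_res", "shadows"]

def optimize(budget_ms):
    """Halve feature costs, in priority order, until the frame fits."""
    costs = dict(COSTS_MS)
    trimmed = []
    for feature in TRIM_ORDER:
        if sum(costs.values()) <= budget_ms:
            break                 # frame already fits the budget
        costs[feature] /= 2       # e.g. drop particle count a notch
        trimmed.append(feature)
    return sum(costs.values()), trimmed

total_ms, cut = optimize(33.3)    # ~30 fps budget
print(total_ms, cut)
```

The trim order encodes the judgment call in the post: cut the things players notice least before touching the things that would make the game "immediately look last gen."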
 
BlackNMild2k1 said:
I just wanted to chime in to let people know that resistive multi-touch screens do exist...
I'm not sure if Nintendo is gonna use one, but they are out there if they want one.

and as for the Optimization talk...
I've always imagined it to be that a dev would push for the best graphics possible that the machine can produce, then realize that, now that they're pushing every advanced effect in the book at the highest possible resolution, the frame rate has dropped to 2 fps.

So they need to optimize. In order to do that, they need to scale back the amount of effects, polygons, memory usage, resolution or whatever it is they can to ease the load on the GPU and get that frame rate back up to acceptable levels. This may mean reorganizing some code to improve data flow, but will most likely involve the downgrading of character models, the removal of lighting sources and shadows, the lowering of texture resolution, the decrease of draw distance and whatever other graphical effects they can trim back without making the game immediately look last gen compared to the maxed out screen caps.

Now that optimization would work any which way they need it to depending on what the desired results were, but am I wrong to think of it that way?
You are right in that what you described happens most of the time, but that's not optimization, nor is it due to hardware constraints; it's more like pressure and deadlines from the higher-ups. The very nature of programming teaches us that something can always be improved on.
 
orioto said:
And maybe they also wanted to keep the stylus to keep the DS style. After all, the whole 3DS design is meant to imitate the DS, even though it's not the same usage at all.

Now for the Wii U, the stylus isn't even really smart when you imagine something easy to use for the whole family, etc... People would want to touch it just like that, and they're not used to having a tool to touch their smartphone.
I have to think the group of people interested in Wii U but haven't used a DS is going to be pretty limited.
 

JGS

Banned
Medalion said:
3DS looks promising by the day but Wii is just a barren wasteland after Zelda for 1 freakin year, that is insane
Honestly, they need to just go $99.99 on the Wii and have the Nintendo Selects line across the board on all stuff over 2 years old after the fiscal year is up, depending on how long Zelda's legs are. Really, the DS needs to drop in price too.

It won't happen, but it should. They could make it a part of the E3 stuff.
 