
Wii U Community Thread


Rösti

Unconfirmed Member
From a GameSpot interview with Scott Moffitt:

Before E3, a few third-party developers came out to say the Wii U hardware was better than the Xbox 360 and PS3. Can you share any insight into the console's hardware specifications?

Unfortunately I have to give you the bad answer. It just comes back to a company philosophy that we believe the experiences and the gameplay are more important than facts and figures. Once the system is available no doubt people will reverse engineer it and take it apart and that info will become known. Our focus is to talk about what it can do for gameplay and how it can revolutionize entertainment, rather than focus on tech specs.
http://www.gamespot.com/news/wii-u-message-confusing-nintendo-6383049

I was hoping to avoid reverse engineering as the only option to know the specs of the console. Of course, there will be plenty of sites that will take the machine apart, but the amount of information coming from such laboratory studies is usually not enough in my opinion; well that goes for reports published online at least.

Time to start saving for two consoles I guess. Unless I can get something from NOA Rob later this week. Remember to post any questions you may have.
 
http://www.neogaf.com/forum/showthread.php?t=478941&page=3

Why was that thread closed? Proven to be false information?

I think just because it was based on rumors w/ no source. People also seem to resent BGAssassin and StevieP outside this thread. No idea why - their info has never been proven false and their posts are at least respectful and generally well thought out. You're welcome in here, guys!

So I've made up my mind and realized I don't care that Nintendo didn't show anything at E3 and that at least 25% of third party games coming to the system would be good enough for me. I'm getting one at launch if there's a black unit available. I am no longer angry or bitter.

Same here. Nintendoland, NSMBU, AC3, ZombiU, and perhaps Lego City Stories (for the gf) will keep me quite busy this holiday. I can't wait to see what else is in store for the system.
 

chris3116

Member
The analysis of their confused and weak presentation is dead on, but the bandwagon jumping about "Nintendo still has no online at all!" is getting tiresome. But that seems half due, again, to Nintendo, and half due to people not paying attention: on one hand, the Nintendo Direct video did suggest Wii U will have a lot of online social functionality, including some honest innovations. On the other hand, Nintendo themselves are not clarifying at all how the full online suite of the Wii U is laid out and how everything interacts.

At the primary E3 conference, they gave the direct impression that Miiverse was the only online functionality the system has, and the only interface there is - Iwata directly contradicted this during the Direct presentation when he noted that a conventional 'home screen' was available on the controller and could be swapped to the other screen, but naturally, they still have nothing to show whatsoever to end the confusion, fear, uncertainty, and doubt.

Nintendo's problem remains that they're incredibly tight fisted about giving information away, and seem to feel they must carefully shape image and anticipation as if pruning a bonsai. Unfortunately, that method is fast becoming patronizing and frustrating. Even if Nintendo does have their shit together behind the scenes, and is working on a solution that addresses most of the issues people have, their refusal to be forthright is causing everyone to write them off.

I agree with what you say. This "non-information" thing started when Nintendo said they wouldn't release the specs when the Wii was about to be released. Hardware specs I could understand, since many people don't care what the CPU or GPU is and it would be useless for consumers to know.

The software and service specs are another thing. Consumers need to know what the OS is and how it works. The online service is also something consumers need to know about. Does it have a Virtual Console service, an eShop? What are the prices of the downloadable games?
 

nordique

Member
I definitely noticed a lot more dynamic lighting and shadows in ZombiU than in games presented for the HD twins this E3. I'm not sure if they can't do it, but it seems to come off effortlessly for the Wii U.

But that's the issue of diminishing returns.
Adding more light sources doesn't mean an image will look better.
It's how you apply it that makes art.

I think what ZombiU suffers from is the lack of a distinctive art style, and a lack of creativity and variety in the zombie design, which makes the game come off generic-looking. I don't know how early this build was, but I hope they have some nice surprises for us in the visual area, because I do like the concept of the game.

Killer Freaks seemed to have that special look, though as a game, it might not have been as interesting.


That's the thing I'm thinking too; it was an early build, so hopefully they can make it look better.

The textures in the demos were weak, and lighting aside the game looked like it would be easy to replicate on current HD systems.

I hope it looks better, but then again the same things were said about Red Steel :p

http://ca.ign.com/videos/2006/05/09/red-steel-nintendo-wii-trailer-2006-05-09

*ubisoft's target render included*


http://ca.ign.com/videos/2006/05/09/red-steel-nintendo-wii-gameplay-e3-conference-demo

*E3 2006 demo/E3 2006 build


http://ca.ign.com/videos/2006/10/13/red-steel-nintendo-wii-trailer-2006-10-13-5

*swordfight from near final build*


http://ca.ign.com/videos/2006/11/30/red-steel-nintendo-wii-video-video-review-480p

*review copy gfx - looks essentially identical to E3 build*


We were told at the time the game would look better at launch; anyone who picked this game up at the Wii's launch might remember the disappointing graphics.


And my other issue with Ubisoft's ZombiU target render (or their CGI clip at Nintendo's E3 2012 conference) is that it might set up false expectations regarding how the final game will actually look. This is the same company that showed this:

*(Red Steel press screenshots, May and June 2006)*


go to this:

*(Red Steel screenshots, late July 2006)*


and this:

*(Red Steel press screenshot, May 2006)*


to this:

*(Red Steel screenshot, late July 2006)*



Hopefully ZombiU fares better.
 

nordique

Member
I think just because it was based on rumors w/ no source. People also seem to resent BGAssassin and StevieP outside this thread. No idea why - their info has never been proven false and their posts are at least respectful and generally well thought out. You're welcome in here, guys!



Same here. Nintendoland, NSMBU, AC3, ZombiU, and perhaps Lego City Stories (for the gf) will keep me quite busy this holiday. I can't wait to see what else is in store for the system.

Yeah, I don't know why. Probably because bg and StevieP are realistic about next-gen consoles? And bg certainly has info regarding the dev kits, so I would believe it. I mean, those systems were in development for quite some time before their alleged 2013 release date. They can't just magically change the target specs (CPU/GPU) if those are in the kits; only fine tuning, as we saw with the Wii U kits. RAM is probably the only thing that is more changeable.

I understand there was no source and that thread could have turned into something nasty but those specs are believable all things considered.

Orbis and Durango will be pretty powerful compared to 360/PS3 but not a quantum leap in any way.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
About Lego City Undercover...

Is there any reason they have to have those silver and gold Lego "coins/chips" littering the game each time you do something good? I dunno, I just find it annoying to have to run around each time those things pop up. I noticed that the same stuff is in Lego LotR, so I guess it's a staple of the series. Will this mean it will play similarly to those other Lego games?
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Even UbiSoft's 'real' Red Steel screenshots were fake. They were rendered at higher resolutions than the Wii could output, with anti-aliasing and anisotropic filtering to clean up the image quality.

They still do it this generation with their touched up bullshots. Ubi is the worst.
 
What I mean by the ZombiU target render was that CGI video they showed at the conference (Nintendo's; E3 2012)

I feel we won't really see what Wii U is capable of graphically until a game shows off its GPGPU capabilities. That won't be for a while.

You mean that opening with the UK song "God Save the Queen"? That was meant to be CG; I doubt that was ever a target for in-game.
I doubt even the highest-end PC could pull that off in-game at that level of detail.
 
Scott Moffitt said:
Unfortunately I have to give you the bad answer...

I like this. It sounds like a genuine "I'm one of you" answer. His hands are tied, but he's not being a reggie about it.



What I mean by the ZombiU target render was that CGI video they showed at the conference (Nintendo's; E3 2012)

I feel we won't really see what Wii U is capable of graphically until a game shows off its GPGPU capabilities. That won't be for a while.

It's mostly up to Nintendo. For most third party developers, GPGPU will be considered with the same gravity as "TEV" was on the Gamecube and Wii until the next systems come from Microsoft and Sony. Then GPGPU will be a super awesome hotness, but not many games will have it implemented for the Wii U, since it will then be seen as a system with not enough compute resources in the GPGPU to make a substantial difference. Bank on it.
 
You mean that opening with the UK song "God Save the Queen"? That was meant to be CG; I doubt that was ever a target for in-game.
I doubt even the highest-end PC could pull that off in-game at that level of detail.

At Nintendo's conference, there was one with CG hands holding a CG Wii U GamePad, looking at a TV showing what was probably CG ZombiU footage. I think this is what they were referring to.
 

MDX

Member
Rösti;39000073 said:
Time to start saving for two consoles I guess. Unless I can get something from NOA Rob later this week. Remember to post any questions you may have.


I'd like to know why not all developers or publishers are putting the Wii U brand on their advertisements for games that we know will come to the Wii U. Is there still an NDA?

I also want to know why Nintendo didn't show more tech demos this E3, like they did last year with Zelda and the Garden, if they knew they couldn't show off some big titles.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
What this means is that the Wii U MIGHT still retain its Wii U features while playing Wii and even Gamecube games, since it would be a simple matter of lowering the clock rates of those parts while keeping the other resources for the Wii U OS. The technology could easily allow running apps off the tablet while you play Wii games on your TV, or vice versa.
U-CPU might actually not be compatible with the cube - there's no guarantee the new CPU will clock down to cube levels.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
I don't get the big desire to have upscaled Wii/GCN games. IMO, they often look worse than at the resolution the game was designed for; the simpler geometry and standard-res textures can stick out like a sore thumb in those Dolphin screens.
 

BlackJace

Member
I don't get the big desire to have upscaled Wii/GCN games. IMO, they often look worse than at the resolution the game was designed for; the simpler geometry and standard-res textures can stick out like a sore thumb in those Dolphin screens.

But games like Galaxy, Skyward Sword, and Prime 3 would look even more stunning.
 

nordique

Member
Even UbiSoft's 'real' Red Steel screenshots were fake. They were rendered at higher resolutions than the Wii could output, with anti-aliasing and anisotropic filtering to clean up the image quality.

They still do it this generation with their touched up bullshots. Ubi is the worst.

Good call. When I found those screenshots, even the "real" ones, I asked myself, "I don't remember the game even looking this decent."

Red Steel 2 was a great looking Wii title. Red Steel 1, quite the opposite, imo.

You mean that opening with the UK song "God Save the Queen"? That was meant to be CG; I doubt that was ever a target for in-game.
I doubt even the highest-end PC could pull that off in-game at that level of detail.

http://www.youtube.com/watch?v=vKXB9dhmkyw

At Nintendo's conference, there was one with CG hands holding a CG Wii U GamePad, looking at a TV showing what was probably CG ZombiU footage. I think this is what they were referring to.

Yep. I linked it above in case anyone else needs to look it over vs the gameplay demos.

I like this. It sounds like a genuine "I'm one of you" answer. His hands are tied, but he's not being a reggie about it.

It's mostly up to Nintendo. For most third party developers, GPGPU will be considered with the same gravity as "TEV" was on the Gamecube and Wii until the next systems come from Microsoft and Sony. Then GPGPU will be a super awesome hotness, but not many games will have it implemented for the Wii U, since it will then be seen as a system with not enough compute resources in the GPGPU to make a substantial difference. Bank on it.

I like Scott Moffitt. He seemed nervous being on stage at E3, but he seems like a genuinely nice guy. I'm more interested in how he handles his title during the Wii U launch.

I know you're probably right regarding the GPGPU stuff, though I still hope it really gets used [regardless of perceptions] if the next Sony/MS consoles use theirs too. That would mean a new, similar development framework for all 3 systems, so it might not be exactly the same as the TEV situation.

At the very least, Nintendo is sure to make some good use of it.

Do you know any Gamecube games that made good use of the TEV?

U-CPU might actually not be compatible with the cube - there's no guarantee the new CPU will clock down to cube levels.

Interesting. So even if it can play Wii games via backward compat, it's not a guarantee it could play Gamecube games? How would that be the case? Software emulation?

Rösti;39000073 said:
From a GameSpot interview with Scott Moffitt:


http://www.gamespot.com/news/wii-u-message-confusing-nintendo-6383049

I was hoping to avoid reverse engineering as the only option to know the specs of the console. Of course, there will be plenty of sites that will take the machine apart, but the amount of information coming from such laboratory studies is usually not enough in my opinion; well that goes for reports published online at least.

Time to start saving for two consoles I guess. Unless I can get something from NOA Rob later this week. Remember to post any questions you may have.

Pretty good interview on his end. It sounded like the two-console thing was in response to and/or within the context of the Wii, not the Wii U.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
I know you're probably right regarding the GPGPU stuff, though I still hope it really gets used [regardless of perceptions] if the next Sony/MS consoles use theirs too. That would mean a new, similar development framework for all 3 systems, so it might not be exactly the same as the TEV situation.

At the very least, Nintendo is sure to make some good use of it.

Do you know any Gamecube games that made good use of the TEV?

The TEV's problem was being a kind of shitty non-standard compared to the way graphics processors and driver libraries were advancing. A GPGPU, on the other hand, will be a standard of the future. It has all the benefits of an old GPU, fewer of the weaknesses. Benefits game development, rendering, optimisation, and everything else. Good for everyone.

But if Nintendo does indeed have a shitty CPU in the Wii U, and hoped to bank on GPGPU usage to make up for it, then they kind of dicked themselves around with modern games. The TEV made things difficult for developers, and the GPGPU coupled with a weak CPU will make things difficult for current generation ports.

Silly, really, that unlike the TEV, a modern GPU feature could be a hurdle for current generation ports. Only Nintendo :p.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Do you know any Gamecube games that made good use of the TEV?
Every good looking cube title had to use the TEV, but the real forte of the tech was the ultra-fast EMBM - Sunshine's water being a good example.

Interesting. So even if it can play Wii games via backward compat, it's not a guarantee it could play Gamecube games? How would that be the case? Software emulation?
Exactly. Frankly, though, that thought occurred to me only recently, when I was wondering why nintendo would not utter a word about compatibility with the cube in any shape or form - not even in a DD context. Then I realized that it could take a complete 'VC overhaul' - compiling them for the U-CPU and introducing explicit timing mechanisms - for the cube games to ever work on the U. Basically, no per-clock compatibility with the cube would mean that some games might run as they are, while others would crash and burn, thus it would take proper QA cycles again for *all* of them.
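(For anyone unfamiliar with the EMBM blu mentions: in its classic form, a bump texture stores per-texel (du, dv) offsets that perturb the coordinates of an environment-map lookup, which is the sort of effect behind Sunshine's rippling water. Below is a minimal NumPy sketch of that idea, with random placeholder textures; the real hardware does this through the TEV/indirect texturing stages, not array code on a CPU like this.)

Code:
import numpy as np

# Illustrative only: environment-mapped bump mapping boils down to
# "offset the environment-map lookup by a per-texel (du, dv) from a bump map".
H, W = 256, 256
env_map = np.random.rand(H, W, 3)             # placeholder environment map (RGB)
bump = (np.random.rand(H, W, 2) - 0.5) * 8.0  # placeholder (du, dv) offsets, in texels

# base lookup coordinates for each pixel (identity mapping here)
v, u = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")

# perturb the lookup by the bump offsets, wrapping at the texture edges
pu = (u + bump[..., 0].round().astype(int)) % W
pv = (v + bump[..., 1].round().astype(int)) % H

rippled_reflection = env_map[pv, pu]          # the perturbed environment lookup

Per-texel offsets into a reflection texture are what give the water that cheap but convincing ripple without any extra geometry.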
 

Skiesofwonder

Walruses, camels, bears, rabbits, tigers and badgers.
Well I've stayed away long enough. My hype for the WiiU went from almost nothing to substantially something over the last few weeks. E3 disappointment aside, the games are all that matter, and for the titles currently slated to release during the first three-four months, I have over ten that I am interested in, eight of which are already on my "must and potential buys" list for the Wii-U. Considering we don't even know the full launch window line-up, that's pretty good.

Something else I should probably address (because I have a long-winded PM from Ideaman asking for the topic to be addressed in a Nintendo-themed thread) is Ideaman. I've basically been asked to give an apology for "doubting" the "prophet", but after looking over most of the information Ideaman provided and what was known before E3 (widely known, hinted at, or found through deep digging), I still haven't found any conclusive evidence that points to Ideaman having real sources. Maybe well informed and a logical guesser, yes. But nothing conclusively pointing to him being redeemed like he promised. I think what did (and continues to) irk me the most is his constant need to claim that everything (small to large) that was revealed was somehow hinted at by him previously, and therefore "proves" that he is who he thinks he is.

I also have problems with the constant teasing and rationing out of information, considering the way Nintendo's E3 really turned out. Considering he said countless times that he knew even more than he was going to reveal, but wasn't going to speak a word because it would "ruin the surprises", a person with access to such information must have had some idea of how bad this E3 could potentially have turned out to be (considering he is also taking credit for hinting at the "ghoulification of Reggie" with ZombiU at the E3 presser lol), and would have kept expectations in check, instead of helping raise them to an all-time high like he did.

All in all, I do apologize for saying your information was false, because quite the contrary, some of it wasn't. But I still don't agree with your posting style leading up to E3, your constant claims that you hinted at certain reveals (Rayman Legends, ZombiU Reggie, Miiverse, etc.) which you hardly did, and your constant need to be "redeemed" and apologized to because of harsh criticisms (that I still don't believe were uncalled for).

Anyways, I'm glad that E3 is all behind us now, and we can move on to actually anticipating owning the WiiU.
 

Skiesofwonder

Walruses, camels, bears, rabbits, tigers and badgers.
Since it seems a few (?) people have actually played the Wii-U now, anybody care to give some impressions on Lego City Undercover? Out of all the E3 coverage/hands-on from the floor, LCU seemed to really be overlooked.
 

nordique

Member
The TEV's problem was being a kind of shitty non-standard compared to the way graphics processors and driver libraries were advancing. A GPGPU, on the other hand, will be a standard of the future. It has all the benefits of an old GPU, fewer of the weaknesses. Benefits game development, rendering, optimisation, and everything else. Good for everyone.

But if Nintendo does indeed have a shitty CPU in the Wii U, and hoped to bank on GPGPU usage to make up for it, then they kind of dicked themselves around with modern games. The TEV made things difficult for developers, and the GPGPU coupled with a weak CPU will make things difficult for current generation ports.

Silly, really, that unlike the TEV, a modern GPU feature could be a hurdle for current generation ports. Only Nintendo :p.

haha...good post EC

and thanks for the explanation. So essentially, in that case, things won't get good on the port level till the next XB/PS consoles are part of the primary development cycle

It will be interesting to see what exactly the CPU in the Wii U is like once the console is "reverse engineered" and whether it is weak as some whispers indicate, or in fact simple middleware issues and the CPU is indeed competent enough.

Every good looking cube title had to use the TEV, but the real forte of the tech was the ultra-fast EMBM - Sunshine's water being a good example.

Exactly. Frankly, though, that thought occurred to me only recently, when I was wondering why nintendo would not utter a word about compatibility with the cube in any shape or form - not even in a DD context. Then I realized that it could take a complete 'VC overhaul' - compiling them for the U-CPU and introducing explicit timing mechanisms - for the cube games to ever work on the U. Basically, no per-clock compatibility with the cube would mean that some games might run as they are, while others would crash and burn, thus it would take proper QA cycles again for *all* of them.

Very interesting note, blu, thanks. Though the remedy then would simply be to have separate software emulation code for cube games and for wii games, correct?
 

AzaK

Member
Rösti;39000073 said:
From a GameSpot interview with Scott Moffitt:


http://www.gamespot.com/news/wii-u-message-confusing-nintendo-6383049

I was hoping to avoid reverse engineering as the only option to know the specs of the console. Of course, there will be plenty of sites that will take the machine apart, but the amount of information coming from such laboratory studies is usually not enough in my opinion; well that goes for reports published online at least.

Time to start saving for two consoles I guess. Unless I can get something from NOA Rob later this week. Remember to post any questions you may have.
Personally, that's just so much bullshit. Why not talk specs to an enthusiast site? Those are the sites for consumers who want to know. Sure, there's no point talking specs if you're at a Wii U Experience kiosk, but elsewhere, why not? Sooner or later people will find out and discuss it. They're probably just scared that if the specs got out early, we could lambast them and that would affect sales.

I also find it kind of ironic that he says they want the experiences to speak for themselves, but as far as the games that would actually take advantage of those specs in a meaningful way go (core games like Batman, ME and ZombiU), the experiences are generally lacking. I actually think that if they released specs at the moment, it would probably help them. We might see the compute shaders, RAM etc. and go "OK, this thing is going to be pretty good when it gets up to speed" as opposed to what we have now, which is "That looks just like a 360 game".

Basically, it's just more Nintendo bullshit.
 
I think just because it was based on rumors w/ no source. People also seem to resent BGAssassin and StevieP outside this thread. No idea why - their info has never been proven false and their posts are at least respectful and generally well thought out. You're welcome in here, guys!

Really? I never noticed that. For me at least.

We were told at the time the game would look better at launch; anyone who picked this game up at the Wii's launch might remember the disappointing graphics.

That's also possibly because they were going off of target specs that Nintendo scrapped.

Something else I should probably address (because I have a long-winded PM from Ideaman asking for the topic to be addressed in a Nintendo-themed thread) is Ideaman. I've basically been asked to give an apology for "doubting" the "prophet", but after looking over most of the information Ideaman provided and what was known before E3 (widely known, hinted at, or found through deep digging), I still haven't found any conclusive evidence that points to Ideaman having real sources. Maybe well informed and a logical guesser, yes. But nothing conclusively pointing to him being redeemed like he promised. I think what did (and continues to) irk me the most is his constant need to claim that everything (small to large) that was revealed was somehow hinted at by him previously, and therefore "proves" that he is who he thinks he is.

Wow IM. It's not that serious. I've only posted stuff for people to discuss and didn't care if it was doubted because a person had the right to do so. I couldn't care less about needing an apology from someone doubting if something I passed along turned out true.

Rösti;39000073 said:
From a GameSpot interview with Scott Moffitt:


http://www.gamespot.com/news/wii-u-message-confusing-nintendo-6383049

I was hoping to avoid reverse engineering as the only option to know the specs of the console. Of course, there will be plenty of sites that will take the machine apart, but the amount of information coming from such laboratory studies is usually not enough in my opinion; well that goes for reports published online at least.

Time to start saving for two consoles I guess. Unless I can get something from NOA Rob later this week. Remember to post any questions you may have.

I probably haven't talked about it enough, but ask him about USB 3.0 ports, since this console is going to be on the market for at least six years and Nintendo plans to rely on external HDDs. Especially since the console has two USB 2.0 controllers and it would seem that one could be converted.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
and thanks for the explanation. So essentially, in that case, things won't get good on the port level till the next XB/PS consoles are part of the primary development cycle

Theoretically, yes.

If the rumours of Nintendo banking on the GPGPU to make up for the weak CPU are true, the problem with current generation titles is that the engines are not built to utilise GPGPU advantages. They can't offload CPU processing to the GPU, not without having portions of the engine rewritten. Much like trying to get stuff working on the TEV, this can be a long, tedious ordeal, one that would probably require a lot of bug testing due to the engine changes.

As more engines that take advantage of compute programming become a standard, then GPGPU architecture will shine.
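(To make the "engines aren't built for it" point concrete, here is a rough, purely illustrative sketch of the kind of rewrite being described, using hypothetical data rather than any real engine's code: the same per-object update written first as a serial loop, then in the bulk, data-parallel style that a compute shader, or a GPU array library such as CuPy, could actually take over.)

Code:
import numpy as np  # swapping in CuPy ("import cupy as np") would run the array version on a GPU

N = 100_000
pos = np.random.rand(N, 3).astype(np.float32)   # made-up object positions
vel = np.random.rand(N, 3).astype(np.float32)   # made-up object velocities
dt = 1.0 / 60.0

# "Engine-style" serial update: one object at a time. Easy on a CPU,
# but there is nothing here a GPU can grab; the parallel work is hidden in a loop.
def update_serial(pos, vel, dt):
    for i in range(len(pos)):
        vel[i, 1] -= 9.81 * dt        # gravity
        pos[i] += vel[i] * dt
    return pos, vel

# Data-parallel rewrite: the same update expressed as whole-array operations.
# The math is unchanged; only the structure is, and that restructuring is the work.
def update_parallel(pos, vel, dt):
    vel[:, 1] -= 9.81 * dt
    pos += vel * dt
    return pos, vel

Which is why current-generation ports can't just flip a switch: the second form has to exist in the engine before any GPGPU can help.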
 

nordique

Member
No they wouldn't. They'd look pretty much the same. In fact, since the image is being resized by a non-integral multiplier, the picture would look worse than it normally would.

I agree with this and with the sentiments regarding the pointlessness of upscaling Wii games to HD.

We saw what that "upscaling" was like with PS2 games on the early PS3s, and it wasn't as "good" as everyone thought, especially for those with larger-screen TVs:

http://ps3.ign.com/articles/793/793775p1.html

Upscaling can also be a detriment, as GW has mentioned a few times




That said however, there were some positive results too, amidst mixed results overall:

http://www.joystiq.com/2007/05/24/ps3-upscales-ps2-games-and-makes-them-look-fantastic-continued/

http://www.neogaf.com/forum/showthread.php?t=158224&page=2


Seems that upscaling on the PS3 with progressive scan mode turned ON on the PS2 game made the title look worse than upscaling with progressive scan mode OFF.




Yet again, in some cases there was little to no difference.



Only if a game is remastered with HD graphics, such as the HD re-releases of PS2 classics on the PS3, would it be a true "HD upgrade".

I have a feeling Nintendo will be doing that with Gamecube and Wii games alike, especially if Gamecube games skip virtual console.
 
Theoretically, yes.

If the rumours of Nintendo banking on the GPGPU to make up for the weak CPU are true, the problem with current generation titles is that the engines are not built to utilise GPGPU advantages. They can't offload CPU processing to the GPU, not without having portions of the engine rewritten. Much like trying to get stuff working on the TEV, this can be a long, tedious ordeal, one that would probably require a lot of bug testing due to the engine changes.

As more engines that take advantage of compute programming become a standard, then GPGPU architecture will shine.

I get the feeling that the delay with the fifth kit, and it going through such long tweaking, probably dealt with the CPU, since there was a small performance boost. There were enough problems arising in the 4th kit pointing to the CPU to not think it was ignored during that roughly six-month tweak period.
 

nordique

Member
That's also possibly because they were going off of target specs that Nintendo scrapped.

I thought about that too, but then again, Ubisoft did still show similar graphics in a CGI clip in their E3 2006 Red Steel trailer, the one shown at Nintendo's conference. It could still fall under the false expectations category in that case

Theoretically, yes.

If the rumours of Nintendo banking on the GPGPU to make up for the weak CPU are true, the problem with current generation titles is that the engines are not built to utilise GPGPU advantages. They can't offload CPU processing to the GPU, not without having portions of the engine rewritten. Much like trying to get stuff working on the TEV, this can be a long, tedious ordeal, one that would probably require a lot of bug testing due to the engine changes.

As more engines that take advantage of compute programming become a standard, then GPGPU architecture will shine.

Interesting. This will be an interesting generational shift, and especially for the Wii U. I do hope the CPU proves to be more powerful in the sense it provides a good balance even for a GPGPU.

Now, would Nintendo's own development tools not implement this somehow? I understand their toolkit will not ever become the de facto set in use, but I do wonder why it was not made more apparent. Too early to judge, perhaps.
 

nordique

Member
I get the feeling that the delay with the fifth kit, and it going through such long tweaking, probably dealt with the CPU, since there was a small performance boost. There were enough problems arising in the 4th kit pointing to the CPU to not think it was ignored during that roughly six-month tweak period.

yeah, this is why I am hoping the CPU tweaks result in some very improved middleware, even to the extent the CPU is more capable than say Xenon, if not in raw strength then in efficiency.

I wonder how much of the issues were related to OoOE
 
I agree with this and with the sentiments regarding the pointlessness of upscaling Wii games to HD.

We saw what that "upscaling" was like with PS2 games on the early PS3's, and it wasn't as "good" as everyone thought especially those with larger screen TVs:

http://ps3.ign.com/articles/793/793775p1.html

Upscaling can also be a detriment, as GW has mentioned a few times

Oh, I should add to this that upscaling is not what Dolphin does, in case anyone's getting the two confused. Dolphin rerenders the game's geometry to a higher resolution. Upscaling is like if you took each frame of a game, opened it up in GIMP, resized it by something like 1.5x or 2.25x, and saved it, with no additional alteration or enhancement.

The other part of the whole "anti-upscaling in console" argument would be that every HD television has upscaling as a feature*, so it's a pointless add-on to have in a console unless you bought a shitty television.



* that's what the Internet tells me. I do not have an HD TV, nor have I done a lot of looking into it, so I could be mistaken to some small degree here.
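(A minimal sketch of the GIMP analogy above, assuming Pillow is installed and using a hypothetical captured frame as input. This is all upscaling is; Dolphin, by contrast, re-renders the geometry at a higher internal resolution rather than resizing a finished image.)

Code:
from PIL import Image

# "Upscaling": take the finished low-resolution frame and stretch it.
# No new detail is created; the interpolation just smears what is already there.
frame = Image.open("frame_480p.png")                    # hypothetical 640x480 capture
upscaled = frame.resize((1920, 1080), Image.BILINEAR)   # now 1080p-sized, still SD detail
upscaled.save("frame_upscaled.png")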
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Upscaling is just stretching the image to fit the native resolution. So yeah, all displays technically do it. But because the quality of upscaling differs between monitors, some prefer the idea of hardware upscaling to a 720/1080 signal.

But if it's not actually rendering the game at a higher resolution, it doesn't really matter.
 
Upscaling is just stretching the image to fit the native resolution. So yeah, all displays technically do it. But because the quality of upscaling differs between monitors, some prefer the idea of hardware upscaling to a 720/1080 signal.

But if it's not actually rendering the game at a higher resolution, it doesn't really matter.

For some reason I get the feeling that if the Wii U had an upscaler, it would not be better than the upscaler on most televisions. Nintendo's not particularly known for astonishingly advanced hardware outputs. Their optical audio port, for instance…
 
I thought about that too, but then again, Ubisoft did still show similar graphics in a CGI clip in their E3 2006 Red Steel trailer, the one shown at Nintendo's conference. It could still fall under the false expectations category in that case

Forgot about that.

Interesting. This will be an interesting generational shift, and especially for the Wii U. I do hope the CPU proves to be more powerful in the sense it provides a good balance even for a GPGPU.

Now, would Nintendo's own development tools not implement this somehow? I understand their toolkit will not ever become the de facto set in use, but I do wonder why it was not made more apparent. Too early to judge, perhaps.

It should, especially since Iwata made mention of "slightly different architecture". But it's definitely too early to see it.

yeah, this is why I am hoping the CPU tweaks result in some very improved middleware, even to the extent the CPU is more capable than say Xenon, if not in raw strength then in efficiency.

I wonder how much of the issues were related to OoOE

IMO I doubt the OoO-ness had anything to do with it; it's more about architecture and raw power. If the CPU is based on 476FP cores, then on paper a stock core is 3 GFLOPs.
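(Back-of-the-envelope on that 3 GFLOPs figure, under the stated assumption of a 476FP-class core. The clock rate and the one-FMA-per-cycle rate below are assumptions for illustration, not confirmed Wii U numbers.)

Code:
clock_hz = 1.6e9        # assumed ~1.6 GHz core clock
flops_per_cycle = 2     # assumed one fused multiply-add retired per cycle = 2 FLOPs
peak_gflops = clock_hz * flops_per_cycle / 1e9
print(f"{peak_gflops:.1f} GFLOPS per core")   # ~3.2, in the ballpark of the quoted figure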
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Very interesting note, blu, thanks. Though the remedy then would simply be to have separate software emulation code for cube games and for wii games, correct?
Well, I don't think wii will need CPU emulation whatsoever, and I think we will see the odd 1st/2nd party cube title VC-ified (read: recompiled for controller compatibility and timing fixes) on the ushop.

yeah, this is why I am hoping the CPU tweaks result in some very improved middleware, even to the extent the CPU is more capable than say Xenon, if not in raw strength then in efficiency.

I wonder how much of the issues were related to OoOE
I bet a good deal of the issues are fp-throughput-related. In this regard my opinion has not changed since WUST.. 3 I think?


* that's what the Internet tells me. I do not have an HD TV, nor have I done a lot of looking into it, so I could be mistaken to some small degree here.
My old 42" EDTV (native 854x480) plasma made 95% of the HDTV base and their upscalers look like a joke for SD gaming. While people were bitching and moaning about the Wii's picture output on their 'cutting edge' $400 sets, I was having a superb-looking SD generation and a smirk on my face. Alas, I had to leave the old box behind when I moved last time. Now I'm 'HDTV-only' and part of the bitch 'n' moan camp. Oh, how I've fallen.
 

jacksrb

Member
I assume that these guys are working on Wii U titles. I wonder what?

-Retro
-EAD Tokyo
-Monolith Soft
-Monster Games

I also assume that these games will come to Wii U. I wonder when we will know?

-BLOPS 2
-007 Legends
-Need for Speed Most Wanted
-Madden
-FIFA
 

Penguin

Member
So I was in The Dark Knight Rises thread, and someone mentioned Excitebike, and it got me thinking how that has recently become a DD franchise for Nintendo (at least the last two were on WiiWare and the eShop, respectively).

And I was curious what other franchises you would like to see Nintendo resurrect as sort-of DD franchises. I think Dr. Mario and a handful of their puzzlers are also good fits.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
....
I probably haven't talked about it enough, but ask him about USB 3.0 ports, since this console is going to be on the market for at least six years and Nintendo plans to rely on external HDDs. Especially since the console has two USB 2.0 controllers and it would seem that one could be converted.

Also, there has got to be an official Nintendo drive add-on, right? It would be very un-Nintendo to fully support digital downloads of Nintendo-published games but then require everyone to use a third-party device to take advantage of it. There's a certain segment of Nintendo's customers that would only be comfortable using an official Nintendo drive.

Also, will a separate power source be needed to run a drive add-on? Or can it be powered through the USB port alone?
 

Lyude77

Member
So I was in The Dark Knight Rises thread, and someone mentioned Excitebike, and it got me thinking how that has recently become a DD franchise for Nintendo (at least the last two were on WiiWare and the eShop, respectively).

And I was curious what other franchises you would like to see Nintendo resurrect as sort-of DD franchises. I think Dr. Mario and a handful of their puzzlers are also good fits.

I might be up for a new Urban Champion on DD, since Nintendo seems to like it. They just have to, you know, make it good. Brawlers are a somewhat popular DD genre, so I think it could work sales-wise too.
 

sfried

Member
I'm getting mixed messages about the GPGPU: on one hand it's a GPU that does CPU functions, but on the other hand developers who do not properly optimize won't see much of a performance enhancement. What, again, is the main advantage of a GPU that does CPU calculations?

And what is this smaller CPU I keep hearing about?
 
I bet a good deal of the issues are fp-throughput-related. In this regard my opinion has not changed since WUST.. 3 I think?

Arkam started showing up around 2 I believe, so it may have been that thread.

Also, there has got to be an official Nintendo drive add-on, right? It would be very un-Nintendo to fully support digital downloads of Nintendo-published games but then require everyone to use a third-party device to take advantage of it. There's a certain segment of Nintendo's customers that would only be comfortable using an official Nintendo drive.

Also, will a separate power source be needed to run a drive add-on? Or can it be powered through the USB port alone?

I believe 2.5" HDD can be powered by the USB port.
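(Rough numbers behind the bus-power point; the drive figure is a typical ballpark, not a measurement of any specific model. A USB 2.0 port is specified for 5 V at up to 500 mA, while 2.5" notebook drives are commonly quoted around 1.5-2.5 W in operation, though spin-up peaks can exceed one port's budget, which is why some enclosures ship with Y-cables.)

Code:
usb2_port_watts = 5.0 * 0.5   # 5 V x 500 mA = 2.5 W available from one USB 2.0 port
drive_watts = 2.0             # assumed typical 2.5" HDD read/write draw
print(usb2_port_watts >= drive_watts)   # True: feasible in steady state, tight at spin-up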


Also while doing some searching about UE4 and GPGPU, I came across an article taking an excerpt from Tim Sweeney in 2009 about GPGPU usage at that time.

http://www.tomshardware.com/news/Sweeney-Epic-GPU-GPGPU,8461.html#xtor=RSS-181

Epic Games' chief executive officer Tim Sweeney recently spoke during the keynote presentation of the High Performance Graphics 2009 conference, saying that it is "dramatically" more expensive for developers to create software that relies on GPGPU (general purpose computing on graphics processing units) than those programs created for CPUs.

He thus provides an example, saying that it costs "X" amount of money to develop an efficient single-threaded algorithm for CPUs. To develop a multithreaded version, it will cost double the amount; three times the amount to develop for the Cell/PlayStation 3, and a whopping ten times the amount for a current GPGPU version. He said that developing anything over 2X is simply "uneconomic" for most software companies. To harness today's technology, companies must lengthen development time and dump more money into the project, two factors that no company can currently afford.

But according to X-bit Labs, Sweeney spent most of his speech preaching about the death of GPUs (graphics processing units) in general, or at least in a sense as we know them today. This isn't the first time he predicted the technology's demise: he offered his predictions of doom last year in this interview. Basically, the days of DirectX and OpenGL are coming to a close.

“In the next generation we’ll write 100-percent of our rendering code in a real programming language--not DirectX, not OpenGL, but a language like C++ or CUDA," he said last year. "A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently."

This definitely adds perspective to possible issues with PS360 ports and the rumored direction of Wii U.
 
Also, there has got to be an official Nintendo drive add-on, right? It would be very un-Nintendo to fully support digital downloads of Nintendo-published games but then require everyone to use a third-party device to take advantage of it. There's a certain segment of Nintendo's customers that would only be comfortable using an official Nintendo drive.

Also, will a separate power source be needed to run a drive add-on? Or can it be powered through the USB port alone?

It would violate USB specification if it could not deliver power; is that not the case?

though I guess they could reuse their DVD/BD strategy and just call it "Nintendo Serial Bus" ;)
 