
Wii U Speculation Thread 2: Can't take anymore of this!!!


Cookychan

Banned
I'd say the final kits would be labeled as such.
With an F or FINAL in the codename.
Don't final kits go out around 3 months prior to E3 with Nintendo consoles?
We may see something soon, I guess.
 

Nibel

Member
3D screen, vitality sensor, haptic feedback... it's like you're asking Nintendo to disappoint you. Of course it just wouldn't be a next-gen Nintendo thread without amplified expectations, so carry on.

And do you think that anyone here truly believes that Nintendo is going to implement that stuff? We just talk about the possibilities, so relax, dude.
 

lednerg

Member
Really? I mean, I know that a lot of people don't have their HDTVs hooked up properly and there's a big running gag about that, but to say it's some imperceptible difference or something is..... odd. I don't have a great eye for these sorts of things, but if I can notice a difference, ANYONE can. And it's gorgeous by comparison. My 42" plasma makes 720p content look less than stellar. Using a Wii with it was a total damned nightmare (praise Zombie Jesus for Dolphin).

I can't tell if you're saying the Wii is 720p, but it only runs at 480i or 480p (with component cables).

As far as the difference between 720p and 1080p goes, it wouldn't be all that noticeable from even 7 feet away from the screen, especially not with good AA. But you would notice the smoother framerate and added effects that 720p would make possible from any distance.

And do you think that anyone here truly believes that Nintendo is going to implement that stuff? We just talk about the possibilities, so relax, dude.

I don't know what people believe. I'm just commenting on what people are writing, saying I hope they aren't setting themselves up for disappointment. I'm plenty relaxed.
 

z0m3le

Banned
While I think he is wrong, it is something to consider when you see so many comments about the Samaritan demo implying that it is not that impressive.

http://www.hardcoregaming101.net/ninjagaiden/xbox-c1a.jpg

That's Ninja Gaiden Black on Xbox; it's one of the best-looking games on the system.

http://www.360sync.com/wp-content/uploads/2010/05/image_alan_wake-12275-759_0009.jpg

That's Alan Wake. I would have picked Ninja Gaiden again, but the art assets might not have changed enough for it to really be considered one of the best-looking games on 360.

What I am saying is that games won't make this jump next gen, for one reason alone: the resolutions won't change on the majority of games. Now, at the end of a system's life cycle, we might see something resembling the Samaritan demo, but personally I don't think the difference is as convincing. That is to say, the jump is inferior.
 

Oddduck

Member
The Wii U really needs a feature that makes people go "oooh" at the idea of a touchscreen on a controller. I think NFC is a huge step in the right direction, but I don't know if it's a system-seller feature.

People say they don't want the controller to be super expensive.

But either Nintendo has to make the controller super awesome or the specs super awesome. Nintendo can't just cut costs on specs and build a cheaply made controller, then expect gamers not to find the Wii U a little cheap-feeling. The cost of the controller's technology is the reason we aren't getting super powerful specs. I think it is reasonable that people want the most features possible out of that controller, since we traded power and specs for it.

People here might buy the Wii U for Nintendo franchises, but those who aren't fans of Nintendo franchises will buy the Wii U for the controller, just like non-Nintendo fans bought the first Wii for the controller. And the Wii U controller needs to impress.
 

spekkeh

Banned
Really? I mean, I know that a lot of people don't have their HDTVs hooked up properly and there's a big running gag about that, but to say it's some imperceptible difference or something is..... odd. I don't have a great eye for these sorts of things, but if I can notice a difference, ANYONE can. And it's gorgeous by comparison. My 42" plasma makes 720p content look less than stellar. Using a Wii with it was a total damned nightmare (praise Zombie Jesus for Dolphin).
But that's because you set it to 720p, then. If both the Wii U and your TV have good upscaling algorithms, you won't notice the difference between native 1080p and upscaled 720p.
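For the curious, "upscaling algorithm" here just means a resampling filter. Below is a minimal single-channel bilinear sketch in C of the kind of thing a TV or console scaler does when stretching 1280x720 to 1920x1080; the function name and layout are illustrative assumptions, not anything from real firmware, and real scalers use fancier multi-tap kernels, which is exactly why a good one is hard to tell from native 1080p at couch distance.

#include <stdint.h>

/* Toy bilinear upscale (one 8-bit channel). Each destination pixel is
 * mapped back into source coordinates and blended from its four
 * nearest source neighbours. */
void upscale_bilinear(const uint8_t *src, int sw, int sh,
                      uint8_t *dst, int dw, int dh)
{
    for (int y = 0; y < dh; y++) {
        float fy = (float)y * (sh - 1) / (dh - 1);    /* source row */
        int   y0 = (int)fy;
        int   y1 = (y0 + 1 < sh) ? y0 + 1 : y0;
        float wy = fy - y0;
        for (int x = 0; x < dw; x++) {
            float fx = (float)x * (sw - 1) / (dw - 1); /* source col */
            int   x0 = (int)fx;
            int   x1 = (x0 + 1 < sw) ? x0 + 1 : x0;
            float wx = fx - x0;
            float top = src[y0 * sw + x0] * (1 - wx) + src[y0 * sw + x1] * wx;
            float bot = src[y1 * sw + x0] * (1 - wx) + src[y1 * sw + x1] * wx;
            dst[y * dw + x] = (uint8_t)(top * (1 - wy) + bot * wy + 0.5f);
        }
    }
}

/* e.g. upscale_bilinear(frame720, 1280, 720, frame1080, 1920, 1080); */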
 

MDX

Member
3D screen, vitality sensor, haptic feedback... it's like you're asking Nintendo to disappoint you. Of course it just wouldn't be a next-gen Nintendo thread without amplified expectations, so carry on.



So the Wii U might not end up being the most powerful console, but the most feature-rich.
 

Terrell

Member
I can't tell if you're saying the Wii is 720p, but it only runs at 480i or 480p (with component cables).

As far as the difference between 720p and 1080p goes, it wouldn't be all that noticeable from even 7 feet away from the screen, especially not with good AA. But you would notice the smoother framerate and added effects that 720p would make possible from any distance.

No, I'm saying it only runs at 480p and looks like fucking garbage on a 1080p TV.

My TV is more than 8 feet away from me right now, in my standard seat from it.
I can still see the difference plain as day.

But that's because you set it to 720p, then. If both the Wii U and your TV have good upscaling algorithms, you won't notice the difference between native 1080p and upscaled 720p.

Yeah, my TV upscales about as well as you can expect for a 2008 model. Can still tell the difference, every single time.
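Whether a pixel is resolvable at a given distance is just trigonometry, so here is a small C aside putting numbers on this exchange; it assumes the textbook 1-arcminute acuity of 20/20 vision and the 42" screen mentioned above, nothing measured from anyone's actual setup.

#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* Angular size of one pixel row on a 42" 16:9 panel viewed from
 * 8 feet, for 720 and 1080 vertical resolutions. 20/20 vision is
 * conventionally taken to resolve about 1 arcminute. */
int main(void)
{
    const double height_in = 42.0 * 9.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0); /* ~20.6" */
    const double dist_in   = 8.0 * 12.0;                                 /* 8 feet  */
    const int    rows[]    = { 720, 1080 };

    for (int i = 0; i < 2; i++) {
        double pixel  = height_in / rows[i];                         /* inches */
        double arcmin = atan(pixel / dist_in) * (180.0 / PI) * 60.0;
        printf("%4dp: one pixel subtends %.2f arcmin at 8 ft\n", rows[i], arcmin);
    }
    return 0;
}

That comes out to roughly 1.02 arcmin for 720p and 0.68 for 1080p, so at this size and distance 720p sits right at the nominal acuity threshold; both sides of this argument can be right depending on eyesight and content.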
 

Cookychan

Banned
I'm hoping RE: Operation Raccoon City never comes to Wii U (*raising flame shield*).
Don't get me wrong, RE is a great series, but really, Capcom? This is more like CoD with viruses.

/rant Hopefully RE6 is better.
 

z0m3le

Banned
Yeah, like something like that needs information.

Diminishing returns, Jesus Christ. In 20 years maybe, on the PS12.

Ah, so you don't believe in diminishing returns. In that case, I doubt there is really a way to argue with you, but I'll say this... the Samaritan demo took three 580s, that is 7.5 Tflops of power; it's also over 30x the GPU power in the 360, and the Xbox-to-360 jump was only about 6x the computational power.

I don't think I can make it much clearer than that... Sure, Epic said they might be able to push that down to 2.5 Tflops, but that will come with SOME concessions, and they have yet to do it. And in the end, it would have to be possible on 1.4 Tflops of GPU power for it to hit the rumored Xbox 3 specs.

So if you believe that is next gen, prepare to squint your eyes and tilt your head; I'm sure you can eventually convince yourself that you are right.
 

D_prOdigy

Member
I'm hoping RE: Operation Raccoon City never comes to Wii U (*raising flame shield*).
Don't get me wrong, RE is a great series, but really, Capcom? This is more like CoD with viruses.

/rant Hopefully RE6 is better.

I'm sure Capcom will say they'd consider bringing it over if enough people beg them for it.

They got in with that schtick pretty early on with Asura's Wrath.
 

Cookychan

Banned
LOL at the Wii U's trailer on IGN's channel on YT. The comments are hilar.
Anyhoo, last year I saw a guy on the video claiming the 720 would render Avatar in real-time.
He got flamed more than Rebecca Black.
 

wsippel

Banned
Useless information: Current Wii U SDK is at version 2.02.

Useful information: The Wii U, like all other current Nintendo systems (and unlike PS3 and 360), has a dedicated audio DSP.
 
Ah, so you don't believe in diminishing returns. In that case, I doubt there is really a way to argue with you, but I'll say this... the Samaritan demo took three 580s, that is 7.5 Tflops of power; it's also over 30x the GPU power in the 360, and the Xbox-to-360 jump was only about 6x the computational power.

I don't think I can make it much clearer than that... Sure, Epic said they might be able to push that down to 2.5 Tflops, but that will come with SOME concessions, and they have yet to do it. And in the end, it would have to be possible on 1.4 Tflops of GPU power for it to hit the rumored Xbox 3 specs.

So if you believe that is next gen, prepare to squint your eyes and tilt your head; I'm sure you can eventually convince yourself that you are right.
While we obviously are not going to see that quality as a launch title, I'm sure it would be achievable at least in 720p (the demo was 1080p) after a couple of years of learning the hardware, if the hardware is in line with the rumours you mention.
 

mclem

Member
Imagine the next big console Mario with physics... I know this post is weird for most people, but I see a huge opportunity here: a good physics engine and Nintendo's creativity could do something amazing.

Little Big Planet, I feel, pretty much conveys both the positives *and the negatives* of a good physics engine in a platforming environment. My gut tells me that the negatives would mean a straight Mario game using physics wouldn't really feel like Mario - but some other IP might work for it. Crazy suggestion: while it won't *play* like it, dredge up the Wrecking Crew IP. That'd give the physics engine a good reason to exist, and players wouldn't expect it to control like modern Mario.
 

z0m3le

Banned
While we obviously are not going to see that quality as a launch title, I'm sure it would be achievable at least in 720p (the demo was 1080p) after a couple of years of learning the hardware, if the hardware is in line with the rumours you mention.

Which is about what I said just a couple of posts back. I really think that by the end of the generation you will see something close to that demo, though I am not sure what they will have to cut... I mean, it's a 7.5 Tflop demo; I'm sure it doesn't actually use ALL of that power, and probably couldn't anyway thanks to SLI's throughput.

So in conclusion, the Wii U will look noticeably better than PS360 in most games, and going by the X720 rumors, there won't be a large difference between them. As a matter of fact, if MS uses a 720p tablet, then games using the tablet could look worse on X720 than on Wii U, but I am not predicting that, just stating the possibilities and ramifications.

Also remember that by the end of this generation, the Wii U will have moved well beyond the Zelda and bird demos too.
 
Useless information: Current Wii U SDK is at version 2.02.

Useful information: The Wii U, like all other current Nintendo systems (and unlike PS3 and 360), has a dedicated audio DSP.

Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.

For example, on an Xbox 360 game you toss the audio guy one of the six CPU threads; on PS3 he might get an SPU.

It's likely just inefficient to dedicate some silicon to audio, as opposed to adding that much more budget to the CPU, GPU, RAM, etc.

Probably not a giant deal either way, mind you.
 

z0m3le

Banned
A single 580 is rated at 1.58 TFLOPS.

Ah, I guess I was thinking of the 590 when I was adding it all up. Flops is probably the worst way you could compare an AMD GPU to an Nvidia one anyway (seeing as how an HD5870 is a 2.7 Tflops card).

I was simply trying to show how impossibly powerful the computer that demo ran on was.


Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.

For example, on an Xbox 360 game you toss the audio guy one of the six CPU threads; on PS3 he might get an SPU.

It's likely just inefficient to dedicate some silicon to audio, as opposed to adding that much more budget to the CPU, GPU, RAM, etc.

Probably not a giant deal either way, mind you.

Well, this might be needed for Wii emulation; one of the hardest things Dolphin had to do was emulate sound for some Nintendo games. It seems a bit more reasonable than, for example, putting an entire Wii in it.
 
Ah, I guess I was thinking of the 590 when I was adding it all up. Flops is probably the worst way you could compare an AMD GPU to an Nvidia one anyway (seeing as how an HD5870 is a 2.7 Tflops card).

I was simply trying to show how impossibly powerful the computer that demo ran on was.

This has been covered a lot of times, but Epic's already said Samaritan can run at 720p on 1.1 TF and at 1080p on 2.5 TF.

I think they also implied that with optimization it could perhaps run on a single 580, which will be an old busted video card before you know it (it's already been surpassed by the 7970 and 7950, will be surpassed and eliminated by Kepler soon, and then we will have 1-2 more generations of video cards before the PS4/720 release. We could well be on the AMD 9000 series and Nvidia 700/800 series by 720/PS4, or beyond; assuming a late 2013 launch, and IMO it could be 2014).
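Since the thread keeps juggling these numbers, here is a quick back-of-the-envelope check in C using the figures quoted above (1.58 TFLOPS per GTX 580, Epic's 1.1 TF / 2.5 TF targets) plus the commonly cited ~0.24 TFLOPS for the 360's Xenos; the Xenos figure is an outside assumption, not something from this thread.

#include <stdio.h>

/* Back-of-the-envelope FLOPS multiples. SLI never scales perfectly,
 * so treat the tri-SLI number as a ceiling, not a sustained figure. */
int main(void)
{
    const double gtx580 = 1.58;       /* TFLOPS, single card (quoted above) */
    const double rig    = 3 * gtx580; /* the tri-SLI Samaritan demo rig     */
    const double xenos  = 0.24;       /* 360 GPU, commonly cited figure     */

    printf("Samaritan rig: %.2f TFLOPS, about %.0fx a 360\n", rig, rig / xenos);
    printf("Epic's 720p target:  1.1 TF, about %.1fx a 360\n", 1.1 / xenos);
    printf("Epic's 1080p target: 2.5 TF, about %.1fx a 360\n", 2.5 / xenos);
    return 0;
}

Run as written, that prints roughly 4.74 TFLOPS and a ~20x multiple for the demo rig, which is why the earlier "over 30x" framing only worked with the mistaken 7.5 figure.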
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.

For example, on an Xbox 360 game you toss the audio guy one of the six CPU threads; on PS3 he might get an SPU.

It's likely just inefficient to dedicate some silicon to audio, as opposed to adding that much more budget to the CPU, GPU, RAM, etc.

Probably not a giant deal either way, mind you.
There's a metric known as transistor efficiency - how many transistors are employed in doing a given task. While CPUs nowadays can do lots of things sufficiently fast, the use of dedicated silicon yields much better transistor efficiency for the given task. A CPU core, SMT or otherwise, can be used much more efficiently at another task than doing FIR and mixing wave samples, so just because the CPU can do something doesn't mean it should be the one to do it.
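For anyone wondering what "doing FIR and mixing wave samples" actually looks like, here is a minimal sketch in C of the inner loops in question; the tap count, names and two-voice mix are illustrative assumptions, not anything from a real console SDK. On a DSP, each loop iteration maps onto single-cycle multiply-accumulate hardware, which is exactly the transistor-efficiency point.

#include <stddef.h>

#define TAPS 32  /* arbitrary illustrative filter length */

/* One output sample of a FIR filter: a dot product of the last TAPS
 * input samples against the filter coefficients. */
static float fir_sample(const float *history, const float *coeff)
{
    float acc = 0.0f;
    for (size_t i = 0; i < TAPS; i++)
        acc += history[i] * coeff[i];  /* one multiply-accumulate per tap */
    return acc;
}

/* Mix two already-filtered voices into an output buffer with
 * per-voice gain - the other half of a typical game audio workload. */
void mix_voices(const float *a, float gain_a,
                const float *b, float gain_b,
                float *out, size_t frames)
{
    for (size_t i = 0; i < frames; i++)
        out[i] = a[i] * gain_a + b[i] * gain_b;
}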

This has been covered a lot of times, but Epic's already said Samaritan can run at 720p on 1.1 TF and at 1080p on 2.5 TF.

I think they also implied that with optimization it could perhaps run on a single 580, which will be an old busted video card before you know it (it's already been surpassed by the 7970 and 7950, will be surpassed and eliminated by Kepler soon, and then we will have 1-2 more generations of video cards before the PS4/720 release. We could well be on the AMD 9000 series and Nvidia 700/800 series by 720/PS4, or beyond; assuming a late 2013 launch, and IMO it could be 2014).
Right. Since you're apparently well informed about what Epic said about Samaritan's FLOP needs, why don't you call Tim Sweeney and tell him what an overblown figure he gave in his DICE presentation. I mean, he misled everybody with that, instead of just being honest and admitting Epic suck at GPU optimisations. Just think about it, this is your golden opportunity in life to make a name for yourself in the industry.
 

wsippel

Banned
Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.

For example, on an Xbox 360 game you toss the audio guy one of the six CPU threads; on PS3 he might get an SPU.

It's likely just inefficient to dedicate some silicon to audio, as opposed to adding that much more budget to the CPU, GPU, RAM, etc.

Probably not a giant deal either way, mind you.
No, it's actually the other way around: All games have sound, so wasting CPU cycles on audio mixing, something DSPs are far better suited for, is actually stupid.
 

mclem

Member
Indeed. I got into a debate God knows how many pages back as to whether all games will have the "tablet only" mode included. I was under the impression that this would be OS level, but I'm honestly not so sure.

It can't be OS-level. Any game which would use an independent control interface on the tablet in tablet+television mode would need an alternative UI design to run correctly in pure-tablet mode. That would need to be handled by the devs; it couldn't come automatically from the OS.

Also, unless Nintendo makes supporting both playstyles a requirement (and I suspect they won't do so; taking away the ability of a game to only run in tablet+TV mode would stifle innovation), I suspect that many devs will just choose not to support it. If nothing else, supporting it would - in one fell swoop - double your testing complexity.
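As a purely hypothetical sketch of why this has to be developer work, consider how a game's render loop would branch on the mode; every name here is invented for illustration, and the compact tablet-only UI in the second case is a whole second layout someone has to design and test.

#include <stdio.h>

enum target       { TARGET_TV, TARGET_TABLET };
enum display_mode { MODE_TV_PLUS_TABLET, MODE_TABLET_ONLY };

/* Stubs standing in for the game's real renderers. */
static void draw_gameplay(enum target t)         { printf("gameplay on %d\n", t); }
static void draw_touch_ui(enum target t)         { printf("full touch UI on %d\n", t); }
static void draw_touch_ui_compact(enum target t) { printf("compact UI on %d\n", t); }

void render_frame(enum display_mode mode)
{
    switch (mode) {
    case MODE_TV_PLUS_TABLET:
        draw_gameplay(TARGET_TV);          /* action on the big screen */
        draw_touch_ui(TARGET_TABLET);      /* map/inventory on the pad */
        break;
    case MODE_TABLET_ONLY:
        /* One screen must now carry both jobs: the OS can route the
         * video, but only the devs can redesign the UI to fit. */
        draw_gameplay(TARGET_TABLET);
        draw_touch_ui_compact(TARGET_TABLET);
        break;
    }
}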
 

z0m3le

Banned
This has been covered a lot of times, but Epic's already said Samaritan can run at 720p on 1.1 TF and at 1080p on 2.5 TF.

I think they also implied that with optimization it could perhaps run on a single 580, which will be an old busted video card before you know it (it's already been surpassed by the 7970 and 7950, will be surpassed and eliminated by Kepler soon, and then we will have 1-2 more generations of video cards before the PS4/720 release. We could well be on the AMD 9000 series and Nvidia 700/800 series by 720/PS4, or beyond; assuming a late 2013 launch, and IMO it could be 2014).

Pretty sure X720 is in devs' hands right now; they are in Europe at a summit about it too, according to a Crytek tweet, so I doubt Microsoft will be using anything beyond the HD7000 series (rumored to be an HD6670).

Also, your quote of what Epic said is wrong; Epic said that with optimization they could probably get it to run on a single 580. That is an extremely powerful card; even if you compare it to an HD7970 it doesn't look outdated. (Like I said before, though, concessions will have to be made if they run it on next-gen consoles. Still, by the end of the consoles' lives, 6 years or so down the road, Wii U and X720 will probably be able to run something that looks very close to the Samaritan demo, mostly because these are closed systems, console vs. PC.)
 

z0m3le

Banned
Audio DSP.
No Digital Audio Out.

Oh Nintendo.
:p

Welcome to the age of HDMI. In this day and age, unless you are an audiophile, I'd expect most people just plug their HDMI into their receiver or directly into their TV. As long as I don't have to match up those red and white cords blindly behind my TV, I'll be fine... Also, a dog ate the red cord, so all my sound is mono; thank god I don't have a real sound system hooked up to my TV.
 
Pretty sure X720 is in devs' hands right now; they are in Europe at a summit about it too, according to a Crytek tweet, so I doubt Microsoft will be using anything beyond the HD7000 series (rumored to be an HD6670).

There's a couple of rumours, much like with everything; one is for a 7 series.
 

Terrell

Member
For which game, exactly? Because almost no console game runs at native 1080p.

I ain't limited to console games; I've got a computer hooked up to the same TV. The last PC game I played on it was Portal 2. I'm sure I could do a direct comparison by manipulating the game's resolution output and still be proven correct, though.
 
@z0m3le:

You can't compare a bloated PC tech demo with console hardware. The PC has about 80% overhead in the OS. Consider the extra computational power, fillrate, feature set, flexibility and memory bandwidth that something like an HD7770 or GTX560Ti has to offer over Xenos or RSX. Such a GPU in the hands of competent developers like Naughty Dog or Capcom will give us much better graphics and gameplay experiences than Epic ever could with Samaritan. Just look at what developers can do with first-generation games on mobile phone/tablet-like hardware.

If you think current PC games are where you should be looking to see what next gen will look like, you've got another thing coming.
 

z0m3le

Banned
There's a couple of rumours, much like with everything; one is for a 7 series.

Oh yeah, trust me, I know. The rumor I refer to is the Fudzilla one confirmed by IGN, but I want to stress that it really doesn't matter too much which chip they started with; it will be a unique custom chip, especially if it's supposed to be over 1 Tflops and still be based on the HD6670, lol; it would need huge customization to make that happen. My point was that it won't be based on the HD8000 or HD9000 series that SG assumed was possible.
 

VAPitts

Member
LOL at the Wii U's trailer on IGN's channel on YT. The comments are hilar.
Anyhoo, last year I saw a guy on the video claiming the 720 would render Avatar in real-time.
He got flamed more than Rebecca Black.

What's even funnier is you have people on GAF saying the same thing.
 

z0m3le

Banned
@z0m3le:

You can't compare a bloated PC tech demo with console hardware. The PC has about 80% overhead in the OS. Consider the extra computational power, fillrate, feature set, flexibility and memory bandwidth that something like an HD7770 or GTX560Ti has to offer over Xenos or RSX. Such a GPU in the hands of competent developers like Naughty Dog or Capcom will give us much better graphics and gameplay experiences than Epic ever could with Samaritan. Just look at what developers can do with first-generation games on mobile phone/tablet-like hardware.

If you think current PC games are where you should be looking to see what next gen will look like, you've got another thing coming.

I think the start of next gen will look like tri-Ace's engine running really well, with more effects, more polygons and higher-res textures.

And I think the end of the generation will look like the Samaritan demo (with concessions made). There might be an exclusive here or there that is just "balls to the wall" pushing hardware, but with developer costs, I think it's reasonable to assume what I am saying. I mentioned that it is because of a closed-box system that Samaritan might even be possible next gen, though I still say concessions will have to be made. Thanks to great art, though, it won't really matter what we get; maybe those prerendered Killzone 2 videos will come back, LOL.
 
There's a metric known as transistor efficiency - how many transistors are employed in doing a given task. While CPUs nowadays can do lots of things sufficiently fast, the use of dedicated silicon yields much better transistor efficiency for the given task. A CPU core, SMT or otherwise, can be used much more efficiently at another task than doing FIR and mixing wave samples, so just because the CPU can do something doesn't mean it should be the one to do it.

Yup, and I'm 100% sure it's more efficient not to do dedicated audio. Even sound cards in PCs are rare nowadays. There's not just the chip cost, but also the increased motherboard complexity. Also, any time you have dedicated silicon, it's not flexible: if you're using the CPU, you can always choose to dedicate more power to graphics or game code and less to audio; if it's dedicated, you have no choice.

If it's for some kind of backward compatibility though, it's understandable.

Right. Since you're apparently well informed about what Epic said about Samaritan's FLOP needs, why don't you call Tim Sweeney and tell him what an overblown figure he gave in his DICE presentation. I mean, he misled everybody with that, instead of just being honest and admitting Epic suck at GPU optimisations. Just think about it, this is your golden opportunity in life to make a name for yourself in the industry.

Why don't you call him and tell him you know better than he does what it takes to run his engine. He should get a good laugh.
 

z0m3le

Banned
Specialguy adds to the discussion; there are plenty of people here who disagree with each other, so what's one more? Besides, he has been far more reasonable this past week.

And I agree that Epic's engine will be optimized, but like I said before, those two things you mentioned about Epic in the post were the same. They WILL try to optimize it to run on one GTX 580; I don't know if they will succeed, but next-gen consoles won't be using graphics cards on par with a GTX 580. They are closed systems, though, so with some performance cuts here and there, we will likely see some stuff in 4 to 6 years that looks like Samaritan.
 
They have the hardware locked down in advance. They are not throwing a 2013/2014 card into a machine launching in 2013/2014.

That's exactly what the 360 did... it can be done.

Actually, the 360's Xenos pushed so close to the bleeding edge it was kind of ridiculous; Xenos used some tech that wasn't in ATI cards for another couple of years. It also wasn't finished until the very last second, which is an incredibly dangerous game to play. Xenos was basically ahead of its time, forget contemporary with its time: the equivalent of a 2014 video card in a 2013 machine.

The problem is that if you try to time it just right like MS did with Xenos and something goes wrong, you literally have no choice but to delay your console for months.

Basically, it can be and has been done, but I wouldn't recommend it and I don't really see it happening this time. I think they'll go with something either even with the time they're aiming for (a late 2013 card for a late 2013 console) or slightly behind (an early 2013 card for a late 2013 console, etc.).

But a 2012 card in a 2013 machine? I don't see it.
 