3D screen, vitality sensor, haptic feedback... it's like you're asking Nintendo to disappoint you. Of course it just wouldn't be a next-gen Nintendo thread without amplified expectations, so carry on.
Really? I mean, I know that a lot of people don't have their HDTVs hooked up properly and there's a big running gag about that, but to say it's some imperceptible difference or something is..... odd. I don't have a great eye for these sorts of things, but if I can notice a difference, ANYONE can. And it's gorgeous by comparison. My 42" plasma makes 720p content look less than stellar. Using a Wii with it was a total damned nightmare (praise Zombie Jesus for Dolphin).
And you think that anyone here truly believes that Nintendo is going to implement that stuff? We just talk about the possibilities, so relax, dude.
While I think he is wrong, it is something to consider when you see so many comments about the Samaritan demo implying that it is not that impressive.
Go here and scroll to S/N 9
> They do? I don't recall seeing any in the Bird demo.

That bird was male.
> Really? I mean, I know that a lot of people don't have their HDTVs hooked up properly and there's a big running gag about that, but to say it's some imperceptible difference or something is... odd. I don't have a great eye for these sorts of things, but if I can notice a difference, ANYONE can. And it's gorgeous by comparison. My 42" plasma makes 720p content look less than stellar. Using a Wii with it was a total damned nightmare (praise Zombie Jesus for Dolphin).

But that's because you set it to 720p then; if both the Wii U and your TV have good upscaling algorithms, you won't notice the difference between native 1080p and upscaled 720p.
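A quick illustration of why "good upscaling algorithms" matter here. Below is a minimal nearest-neighbour scaler, the crudest possible approach and purely illustrative (real TVs use bilinear, bicubic, or fancier filters): 720p to 1080p is a 1.5x scale per axis, so source pixels get duplicated unevenly, which is part of why cheap scalers look soft or blocky.

```python
# Toy nearest-neighbour upscaler. Each output pixel just copies the
# closest source pixel; at a 1.5x scale every other pixel is doubled.
def upscale_nn(src, out_w, out_h):
    """src: 2D list of pixel values; returns an out_h x out_w 2D list."""
    in_h, in_w = len(src), len(src[0])
    return [[src[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

# 2x2 -> 3x3 (the same 1.5x factor as 720p -> 1080p):
print(upscale_nn([[1, 2], [3, 4]], 3, 3))  # [[1, 1, 2], [1, 1, 2], [3, 3, 4]]
```

Better scalers replace the blunt copy with a weighted blend of neighbouring source pixels, which is where the perceived quality difference between TVs comes from.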
That's a very convincing and informative argument.
I can't tell if you're saying the Wii is 720p, but it only runs at 480i or 480p (with component cables).
As far as the difference between 720p and 1080p goes, it wouldn't be all that noticeable from even 7 feet away from the screen, especially not with good AA. But you would notice the smoother framerate and added effects that 720p would make possible from any distance.
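The "7 feet" claim can be roughly sanity-checked against the usual rule of thumb that 20/20 vision resolves about 1 arcminute. This is a simplified model (it ignores AA, scaler quality, and contrast), and the 42" screen size is just borrowed from the plasma mentioned earlier in the thread:

```python
import math

ACUITY_ARCMIN = 1.0  # ~20/20 vision resolves roughly 1 arcminute

def pixel_arcmin(diagonal_in, rows, distance_in, aspect=16 / 9):
    """Angular height of one pixel row, in arcminutes, for a 16:9 panel."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # panel height
    pixel_in = height_in / rows
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

for rows in (720, 1080):
    a = pixel_arcmin(42, rows, 7 * 12)  # 7 feet = 84 inches
    verdict = "resolvable" if a > ACUITY_ARCMIN else "below acuity"
    print(f'{rows}p on a 42" set at 7 ft: {a:.2f} arcmin ({verdict})')
```

On these assumptions, 1080p pixels at 7 feet fall below the acuity threshold (~0.78 arcmin) while 720p pixels sit just above it (~1.17 arcmin), so the difference at that distance is real but marginal, which is roughly what the post argues.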
> Yeah, my TV upscales about as well as you can expect for a 2008 model. Can still tell the difference, every single time.

For which game exactly? Almost no console game runs at native 1080p.
Yeah, like something like that needs information.
Diminishing returns? Jesus Christ. In 20 years maybe, on the PS12.
I'm hoping RE: Operation Raccoon City never comes to Wii U (*raising flame shield*).
Don't get me wrong, RE is a great series, but really Capcom? This is more like CoD with viruses.
/rant Hopefully RE6 is better.
Useless information: Current Wii U SDK is at version 2.02.
Useful information: The Wii U, like all other current Nintendo systems (and unlike PS3 and 360), has a dedicated audio DSP.
> Ah, so you don't believe in diminishing returns. In that case, I doubt there is really a way to argue with you, but I'll say this... the Samaritan demo took 3 GTX 580s; that is 7.5 Tflops of power. It's also over 30x the GPU power in the 360, and the Xbox to 360 jump was only about 6x the computational power.
> I don't think I can make it much clearer than that... Sure, Epic said they might be able to push that down to 2.5 Tflops, but that will come with SOME concessions, and they have yet to do it. And in the end, it would have to be possible on 1.4 Tflops of GPU power for it to hit the rumored Xbox3 specs.
> So if you believe that is next gen, prepare to squint your eyes and tilt your head; I'm sure you can eventually convince yourself that you are right.

While we obviously are not going to see that quality as a launch title, I'm sure it would be achievable at least in 720p (the demo was 1080p) after a couple of years of learning the hardware (if the hardware is in line with the rumours you mention).
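For what it's worth, the arithmetic in this exchange is easy to sanity-check. All figures below are rough peak single-precision ratings quoted in this thread, and note that 3 x 1.58 falls well short of 7.5, which a later post in the thread also points out:

```python
# Rough peak single-precision figures as quoted in this thread:
GTX_580_TFLOPS = 1.58   # per card
XENOS_TFLOPS = 0.24     # Xbox 360 GPU, implied by the "over 30x" claim

samaritan_rig = 3 * GTX_580_TFLOPS
print(f"3x GTX 580 = {samaritan_rig:.2f} TFLOPS")              # 4.74, not 7.5
print(f"that is {samaritan_rig / XENOS_TFLOPS:.1f}x Xenos")    # ~19.8x
print(f"7.5 TFLOPS would be {7.5 / XENOS_TFLOPS:.2f}x Xenos")  # ~31.25x
```

So the "over 30x" ratio only holds if the rig really delivered 7.5 Tflops; with the 580's own rating it is closer to 20x, which is still an enormous gap to the rumored 1.4 Tflops next-gen figure.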
> Hopefully useful improvements are inside.

Not that surprising, considering the Wii and 3DS have one as well. It takes a few percent load off the CPU, so it's definitely welcome.
Woah, dedicated DSP?
Imagine the next big console Mario with physics... I know this post is weird for most people, but I see a huge opportunity here: a good physics engine and Nintendo's creativity could do something amazing.
> I mean it's a 7.5Tflop demo.. <snip>

A single 580 is rated at 1.58 TFLOPS.
> Useless information: Current Wii U SDK is at version 2.02.
> Useful information: The Wii U, like all other current Nintendo systems (and unlike PS3 and 360), has a dedicated audio DSP.
Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.
For example, on an Xbox 360 game, you toss the audio guy one of the 6 CPU threads; on PS3 he might get an SPU.
It's likely just inefficient to dedicate some silicon to audio, as opposed to just adding that much more budget to CPU, GPU, RAM, etc.
Probably not a giant deal either way, mind you.
Ah, I guess I was thinking of the 590 when I was adding it all up. Flops is probably the worst way you could compare an AMD GPU to an Nvidia one anyways (seeing as how an HD 5870 is a 2.7 Tflops card).
I was simply trying to show how impossibly powerful the computer that demo ran was.
> Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.
> For example, on an Xbox 360 game, you toss the audio guy one of the 6 CPU threads; on PS3 he might get an SPU.
> It's likely just inefficient to dedicate some silicon to audio, as opposed to just adding that much more budget to CPU, GPU, RAM, etc.
> Probably not a giant deal either way, mind you.

There's a metric known as transistor efficiency: how many transistors are employed in doing a given task. While CPUs nowadays can do lots of things sufficiently fast, the use of dedicated silicon yields much better transistor efficiency for the given task. A CPU core, SMT or otherwise, can be used much more efficiently at another task than doing FIR filtering and mixing wave samples, so just because the CPU can do something does not mean it should do it.
> This has been covered a lot of times, but Epic's already said Samaritan can run at 720p on 1.1 TF and at 1080p on 2.5 TF.
> I think they also implied that with optimization it could perhaps run on a single 580, which will be an old busted video card before you know it (it's already been surpassed by the 7970 and 7950, will be surpassed and eliminated by Kepler soon, and then we will have 1-2 more generations of video cards before the PS4/720 release; we could well be on the AMD 9000 series and Nvidia 700/800 series by the time the 720/PS4 launch, assuming late 2013; imo it could be 2014).

Right. Since you're apparently well informed about what Epic said about Samaritan's FLOP needs, why don't you call Tim Sweeney and tell him what an overblown figure he gave in his DICE presentation? I mean, he misled everybody with that, instead of just being honest and admitting Epic suck at GPU optimisations. Just think about it: this is your golden opportunity in life to make a name for yourself in the industry.
> Seems like a bad engineering decision; today's CPUs should have plenty of grunt for audio, and that's why dedicated audio chips were eliminated.
> For example, on an Xbox 360 game, you toss the audio guy one of the 6 CPU threads; on PS3 he might get an SPU.
> It's likely just inefficient to dedicate some silicon to audio, as opposed to just adding that much more budget to CPU, GPU, RAM, etc.
> Probably not a giant deal either way, mind you.

No, it's actually the other way around: all games have sound, so wasting CPU cycles on audio mixing, something DSPs are far better suited for, is actually stupid.
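To make the "audio mixing" point concrete, here's a toy software mixer, purely illustrative (real engines do this with SIMD over interleaved PCM buffers): it's a plain multiply-accumulate loop, exactly the kind of repetitive per-sample work a fixed-function DSP handles without costing the CPU anything.

```python
# Toy software mixer: sum N voices into one buffer with per-voice gain,
# then clamp to the valid [-1.0, 1.0] sample range.
def mix(voices, gains):
    """voices: list of equal-length sample lists; returns the mixed buffer."""
    n = len(voices[0])
    out = [0.0] * n
    for samples, gain in zip(voices, gains):
        for i in range(n):
            out[i] += gain * samples[i]  # the multiply-accumulate hot loop
    return [max(-1.0, min(1.0, s)) for s in out]

print(mix([[0.5, -0.5], [0.25, 0.25]], [1.0, 0.5]))  # [0.625, -0.375]
```

At 48 kHz with dozens of voices plus filtering, that inner loop runs millions of times per second, which is the CPU load the thread is arguing about offloading.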
Indeed. I got into a debate God knows how many pages back as to whether all games will have the "tablet only" mode included. I was under the impression that this would be OS level, but I'm honestly not so sure.
This has been covered a lot of times, but Epic's already said Samaritan can run at 720p on 1.1 TF and at 1080p on 2.5 TF.
I think they also implied that with optimization it could perhaps run on a single 580, which will be an old busted video card before you know it (it's already been surpassed by the 7970 and 7950, will be surpassed and eliminated by Kepler soon, and then we will have 1-2 more generations of video cards before the PS4/720 release; we could well be on the AMD 9000 series and Nvidia 700/800 series by the time the 720/PS4 launch, assuming late 2013; imo it could be 2014).
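Worth noting: the two figures quoted there track pixel count almost exactly, which is what you'd expect if the demo is shader/fill-rate bound rather than CPU bound:

```python
# 1080p has 2.25x the pixels of 720p, and Epic's quoted TFLOPS figures
# for the two resolutions scale by almost exactly the same ratio.
pixels_1080 = 1920 * 1080
pixels_720 = 1280 * 720

print(pixels_1080 / pixels_720)  # 2.25
print(round(2.5 / 1.1, 2))       # 2.27
```

So the 1.1 TF and 2.5 TF numbers are at least internally consistent with each other, whatever you think of the absolute values.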
> Useless information: Current Wii U SDK is at version 2.02.
> Useful information: The Wii U, like all other current Nintendo systems (and unlike PS3 and 360), has a dedicated audio DSP.

Audio DSP.
No Digital Audio Out.
Oh Nintendo.
Pretty sure the X720 is in devs' hands right now; they are in Europe at a summit about it too, according to a Crytek tweet. So I doubt Microsoft will be using anything beyond the HD 7000 series (rumored to be an HD 6670).
There's a couple of rumours, much like everything; one is for a 7-series.
LOL at the Wii U's trailer on IGN's channel on YT. The comments are hilar.
Anyhoo, last year I saw a guy on the video claiming 720 will have Avatar in real-time.
He got flamed more than Rebecca Black.
@z0m3le:
You can't compare a bloated PC tech demo with console hardware. A PC has about 80% overhead in the OS. Considering the extra computational power, fillrate, feature set, flexibility, and memory bandwidth something like an HD 7770 or GTX 560 Ti has to offer over Xenos or RSX, such a GPU in the hands of competent developers like Naughty Dog or Capcom will give us much better graphics and gameplay experiences than Epic ever could with Samaritan. Just see what developers can do with first-generation games on mobile phone/tablet-like hardware.
If you think current PC games is where you should be looking to see what next gen will look like you've got another thing coming.
Well I do. That's why I get to be the judge of when I can laugh or not.
You should have seen his post pre-edit.
Judgmental little shit with no reason to be.
> Yup, and I'm 100% sure it's more efficient to not do dedicated audio.

Well, you're 100% wrong.
They have the card locked down in hardware in advance. They are not throwing a 2013/2014 card into a machine launching in 2013/2014.
Yup, and I'm 100% sure it's more efficient to not do dedicated audio.