He's probably shooting for that cinematic framerate of 15fps... when no action is on-screen.
The eye can't see more than 24 frames/s anyway.
huehuehuehue
It's not as simple as that. Another die of, say, 6 billion transistors, minus the compute caches etc., is still a big fucking die.
> I don't see how they aren't.

Which product of theirs competes with Titan so as to force down its price?
But they already have GK110 in professional cards (Tesla and Quadro). The Geforce line is supposed to be consumer gaming cards.
Seems like a lot of the die size would be wasted on consumers. I don't get why the high-end single chip wouldn't just be a GK104 with more cores, clock speed, and bandwidth. The double-precision units of GK110 are unnecessary for games.
> Which product of theirs competes with Titan so as to force down its price?

Well, the 7970GE out right now is the top card. Their next-gen card will likely be out by the time this comes.
Oh no, it has a larger bus?! ... Wait, how is that bad again?
Awesome a 900 dollar piece of hardware that has no real purpose...
Uh.
Anyone here who is a fan of 60fps, 3D, Oculus Rift, supersampling, higher-than-1080p resolutions, tessellation, etc. could benefit greatly from this, and still be needing more horsepower. Especially with next gen fast approaching, and Crysis 3. PC gamers are finally starting to have reasons to upgrade again.
Whoa... DAT meltdown! It's like... premium products are a bad thing! And no one should be caring about visual fidelity.
We should be blaming AMD for the price, because they're seemingly incapable of producing hardware powerful enough to compete with nVidia.
I'll definitely be waiting for a price drop of some kind, but this is good as I probably don't need to upgrade anyway, seeing as my 2yo PC is way more powerful than either of the next-gen consoles as it is.
> well the 7970GE out right now is the top card. Their next gen card will likely be out by the time this comes.

Only assuming that the drivers to fix the frame latency issues are correct.
> die size increase cuts yields exponentially.

You are comparing dies with a 100% size difference, while we're talking about a hypothetical difference of 14%. Try again.
> Fig. 9. (a) and (b) have the same defects, but the yields are 38% and 79%, respectively, with the chip edge of (a) double that of (b) (a four-times increase in chip area).
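For what it's worth, the 38%-vs-79% figure quoted above lines up with the simple Poisson yield model, Y = exp(-D·A), where D is defect density and A is die area. A quick sketch (the normalization below is mine, not from the paper):

```python
import math

def poisson_yield(defect_density, die_area):
    # Poisson yield model: fraction of defect-free dies = exp(-D * A)
    return math.exp(-defect_density * die_area)

# Back out the implied D*A from the small die's 79% yield,
# then quadruple the area as in the quoted figure
small_area = 1.0                      # normalized area units
d = -math.log(0.79) / small_area      # implied defect density
big_yield = poisson_yield(d, 4 * small_area)
print(round(big_yield, 2))            # -> 0.39, close to the 38% quoted
```

So "exponentially" is literally right under this model: quadrupling the area raises the yield fraction to the fourth power, which is also why a hypothetical 14% area bump costs far less than the 100% jump in the quoted figure.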
If AMD drops the ball again this will probably be more powerful than the high end 700 series card.
> Artist, could you give me a rundown of what's going on with the production then? I'm no engineer, I just understand performance and how to test it. You seem to know what's going on here, but you're only really responding with 'no' instead of explanations.

I've already dropped quite a bit of info - the amount of wafers you order for your Tesla products is less than 1/12th of what you do for your Geforce line.
> Not that you're not right, but the process node drop also significantly increases the number of chips they can get off a wafer. Excluding R&D costs, it's now significantly cheaper for them to produce these chips than before we hit 2Xnm. It appears they raised the profit margins at first to make up for the R&D setbacks, and now that they realize they can jack up the prices, they're seeing how far they can push it.

Is there anything other than inductive reasoning we have to believe this is the case? Would love to see some more data on this.
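On chips per wafer: here's a common first-order approximation for gross dies per 300 mm wafer, plugged with the commonly cited die sizes of ~294 mm² for GK104 and ~561 mm² for GK110 (treat both numbers as ballpark):

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # First-order estimate: wafer area / die area, minus an
    # edge-loss correction for partial dies at the wafer rim
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(gross_dies_per_wafer(300, 294))  # GK104: ~201 gross dies
print(gross_dies_per_wafer(300, 561))  # GK110: ~97 gross dies
```

That's gross dies, before yield; combine it with a yield model and the big die gets hit twice, with fewer candidates per wafer and a lower fraction of good ones.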
> I've already dropped quite a bit of info - the amount of wafers you order for your Tesla products are less than 1/12th of what you do for your Geforce line.

You mean in terms of # of products sold? Essentially getting a cut on the price of the manufacturing because of economies of scale?
> You mean in terms of # of products sold? Essentially getting a cut on the price of the manufacturing because of economies of scale?

What I meant was that Tesla K20 being in short supply does not indicate bad yields, but rather inaccurate demand projections for K20 within Nvidia.
85% increase!? Wow.
Uhh...
Rough Comparison:
> What I meant was that Tesla K20 being in short supply does not indicate bad yields but rather inaccurate demand projections for K20 within Nvidia.

Gotcha. Thanks!
> Uhh...

He's kind of right, if you discount the 690 as dual-GPU. The 7990 is also more powerful than the 690, as AMD's cards scale better in multi-GPU setups.
> Shhh

It's not worth it. :<
mobile chips are NOT comparable to desktop parts like this.
You know that nobody believes you.
Jesus, dat price.
I know right. I only have 2x 580s in my machine, but playing BF3 on a 120hz monitor at 120 fps, all Ultra settings, is a sight to behold.
> Can we do realtime raytracing yet?

http://iliyan.de/publications/VisibilityCaching/VisibilityCaching_EG2013.mp4
That's awesome. Hard to believe those screenshots could be accomplished in realtime, very impressive stuff.
> This is of actual interest to me. Would be much preferable to a SLI setup to push my high res monitor.

Supposedly 6GB. I think it comes in either 5GB or 6GB variants, so it'll be one of those two.
I can probably unload my card for 350 or so still.
Will want to see the 1440p or 1600p benchmarks first. This is assuming they are not stupid and put enough VRAM on the thing.
Quoting numbers isn't what I'm going to try to do, since I'm not here to argue who has the bigger e-peen. Only that people should think before they blame something.
Do you even lift?
lol 5GB or 6GB for a single GPU is so ridiculous outside of using it for a realtime GPU render engine.
So if I buy two 6GB VRAM Titans I will have more VRAM than most people have regular RAM in their PCs let alone the next gen consoles.
This appeals to me.
fixed
> Quoting number isn't what Im going to try to do since Im not here to argue who has the bigger epeen. Only that people should think before they blame something.

What did I say that you are responding to me with this?
The same principle applies. Check the numbers again: the number of working dies compared to the area increase.
DO you even design chips?
*edit* ohhhh, TechReport's really accurate and awesome graphs.
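For anyone unfamiliar: TechReport's graphs plot per-frame render times instead of average fps, because averages bury stutter. A toy illustration (the frame-time trace below is made up):

```python
# Hypothetical 10-frame trace in milliseconds: mostly smooth, two big spikes
frame_times_ms = [8.3, 8.4, 8.2, 8.5, 31.0, 8.3, 8.4, 8.6, 8.2, 30.5]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
# crude 99th-percentile frame time: sort and pick the 99% rank
p99_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]

print(round(avg_fps))  # -> 78 fps, looks fine on paper
print(p99_ms)          # -> 31.0 ms, a visible hitch (~32 fps territory)
```

Same trace, two very different stories: the fps average says "smooth", the frame-time percentile exposes the latency spikes the drivers are supposed to fix.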
Listen man, my test bench has a 7970 on it. I'm not some fanboy. I'll point out any hardware flaw if I see it.
> the same principle apply. check the number again. #of working die compared to area increase.

Same principle applies, but the numbers change drastically.
Apples to oranges comparison.
The previous highest (read: crazy insane) price EVER for a single-GPU product was the 8800 Ultra ($829): http://www.legitreviews.com/article/496/1/
Pricing will most likely change.