
Nvidia announces press event for March 3rd

Sorry where did I mention consoles in this post?
You mentioned it earlier. And I find it really hard to complain about prices overall. Progress in GPU tech has been slowing down, not on purpose, and the costs of finding gains have inevitably gone up, yet PC gaming remains affordable while still being quite powerful.
 
It's likely something they have never talked about before. They said the same thing about the Shield portable and that came completely out of the blue.
I think you're on point on it being Tegra related. Maybe a Tegra-powered console? Or perhaps a new tablet that doesn't have anything to do with Shield, like a more business-oriented tablet? That picture looks like a laptop but, other than their own Chromebook, a laptop just seems too implausible.
 
Tegra X1 powered console

4K output (Netflix/Amazon)
Android apps
Media center
Game streaming from Nvidia cloud library
Game streaming from your desktop etc.

$249
 
The HD 4890 launched at 250 dollars.
I paid 130 for a 4870 (only 20 percent slower) in early 2009, and it smashed games at the time performance-wise.
I got a 4890 in April 2009 to complement the Nehalem... it didn't really smash games <.<

FPS would be in the 30s if I cranked up details.
 
It's far from arbitrary, as the category they get put into decides the price they are sold at, which is what's most important as a buyer.
Having to pay 350-500 euros for what is effectively the 960 is not cool...

I'm not sure what you're even arguing. I said that the thing that matters is the performance characteristics of the card and its cost - what its internal chipset label is doesn't mean dick. If it's not good value, it's not good value. Doesn't mean shit whether it's classified as high end or not. Hence, if you purchased the 980 because you thought it benched well and were willing to pay the price, you shouldn't be losing sleep over the fact that hidden in a vault somewhere are the specs for the yet-to-be-produced GM210 chipset.

The X04 -> X10 yearly split seems like it's the new normal for Nvidia cards, and this is a change from how things were done in the pre-Kepler days. While this is interesting, it strikes me as academic - the 680, the successor to the 560Ti in your books, had roughly double its performance. Did we actually lose something with this new chip schedule? I don't know; maybe somebody can do a series of performance comparisons for that transitional generation.
 
Does Nvidia have a precedent of announcing a new line of cards only half a year later? Seems too early to snuff out the 900 series.
 
Does Nvidia have a precedent of announcing a new line of cards only half a year later? Seems too early to snuff out the 900 series.

It generally announces them when they become available to order, or close to it.

This is probably not a new card announcement, and if it does also contain a card announcement it will be something resembling a Titan or 690 or something like that.
 
Did we actually lose something with this new chip schedule? I don't know; maybe somebody can do a series of performance comparisons for that transitional generation.
Hmm maybe a basic GFLOPS per $200 chart will settle the argument.
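A chart like that wouldn't be hard to knock together. Here's a minimal Python sketch; the shader counts, clocks, and launch prices are approximate public launch figures, so treat the output as ballpark rather than authoritative:

```python
# GFLOPS per dollar across a few generations. Specs and launch
# prices (USD MSRP) are approximate; output is illustrative only.
cards = {
    # name: (shader cores, shader clock in MHz, launch price in USD)
    "GTX 580": (512, 1544, 499),   # Fermi shaders run on a hot clock
    "GTX 680": (1536, 1006, 499),
    "GTX 780": (2304, 863, 649),
    "GTX 980": (2048, 1126, 549),
}

value = {}
for name, (cores, clock_mhz, price) in cards.items():
    # Peak FP32 throughput: cores * clock * 2 ops per cycle (FMA)
    gflops = cores * clock_mhz * 2 / 1000
    value[name] = gflops / price
    print(f"{name}: {gflops:.0f} GFLOPS, {value[name]:.2f} GFLOPS/$")
```

Of course peak GFLOPS isn't the same thing as game performance, especially across architectures, so real benchmarks would still be needed to settle anything.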

Does Nvidia have a precedent of announcing a new line of cards only half a year later? Seems too early to snuff out the 900 series.
Any 980Ti, Titan II or 990 won't cannibalize the current cards.
 
Nvidia's very own VR thingy? Is such a thing even possible?
 
Been going through Nvidia's presentations for GDC. Some topical stuff on it which could tie into this presentation:


Monday has the following:

Advanced Visual Effects with DX11/DX12 - Fluid Simulation and Hybrid Ray Traced Shadows for DX 11 +12

Nvidia is conspicuously absent from stuff on Tuesday, until their 7:30 presentation.

Wednesday has the good stuff.
New Maxwell GPU Features
SLI and Stutter Avoidance with Multiple GPUs
Nvidia's VR Direct Software
VXGI - Real-time Global Dynamic Illumination
NSight Visual Studio 4.5 and Beyond for upcoming graphics APIs

Thursday is Shield/Android Day, and Friday is GRID Day.

I'd guess that they'll be talking about VR Direct, with a number of other software techs they'll be introducing to Maxwell and beyond - VXGI, more ray-tracing (wishful thinking here).


edit - just read decoy's link, fuck lol
 
I'll trade my 580 for a 780ti in a heartbeat! That card is still great

Oh yeah it is. I adore my 780ti; it's performed great. I can still play every game at high settings at 1080p over 30 fps, usually at 60 or close to it.

That being said, it's not quite capable of steady 4K, and I really hate that it only has 3GB of VRAM. I really hope Nvidia's next set of cards has 6+.
 
Oh yeah it is. I adore my 780ti; it's performed great. I can still play every game at high settings at 1080p over 30 fps, usually at 60 or close to it.

That being said, it's not quite capable of steady 4K, and I really hate that it only has 3GB of VRAM. I really hope Nvidia's next set of cards has 6+.
What in the world is this? Are you sure your system isn't dying? This is @ 1440p on Ultra

bf4-fps.gif


The 780Ti pushes games at 120Hz/1080p.
 
I have no idea what this could be. Thankfully it is gaming related.

It would be great if it were the Titan II, because that would mean the GPU market isn't being artificially propped up in terms of time scaling any longer. How long has Titan-like performance been the limit for a single GPU? Way too long, that is for certain.
 
It's always folly to try to interpret marketing statements, but "redefine the future of gaming" really doesn't sound like an incremental GPU upgrade to me.
 
It's always folly to try to interpret marketing statements, but "redefine the future of gaming" really doesn't sound like an incremental GPU upgrade to me.

If I'm not mistaken, the last time they used this terminology was for G-Sync. That plus the "5 years in the making", maybe it is also news related to gaming peripherals; say, a new input/control device.
 
I'm not sure what you're even arguing. I said that the thing that matters is the performance characteristics of the card and its cost - what its internal chipset label is doesn't mean dick. If it's not good value, it's not good value. Doesn't mean shit whether it's classified as high end or not. Hence, if you purchased the 980 because you thought it benched well and were willing to pay the price, you shouldn't be losing sleep over the fact that hidden in a vault somewhere are the specs for the yet-to-be-produced GM210 chipset.

The X04 -> X10 yearly split seems like it's the new normal for Nvidia cards, and this is a change from how things were done in the pre-Kepler days. While this is interesting, it strikes me as academic - the 680, the successor to the 560Ti in your books, had roughly double its performance. Did we actually lose something with this new chip schedule? I don't know; maybe somebody can do a series of performance comparisons for that transitional generation.

About the yet-to-be produced thing, we have leaked pictures of GM200 showing it already in production at least a couple months back and shipping data from even further back. Plus it's on the same 28nm process node Nvidia have been using for three years now. Purported initial yield problems explaining away GK100 don't apply to GM200 at all. The chip is certainly in production with very little room for process maturation at this point to do anything.

And about whether we've lost anything, yes, we have. Remember the 980 being ~5-10% faster than the 780Ti (which is pitiful) even though there's plenty of room for the die to be larger and still be easily profitable in a similar price range? GM204 is barely any faster than GK110; it is, by a vast margin, the smallest increase any Nvidia GPU architectural debut has yielded (at least in the past decade), and the chip itself is relatively puny and no doubt quite a bit cheaper to manufacture.

The GTX 680 was barely 30% faster than the 580. According to Nvidia's own representation of the 480 vs the 280 in Heaven, however (which supports that 30%), the 480 (GF100) is around 60-70% faster (with some variance) than the 280 (with tessellation, which the 280 cannot do). The 280 (GT200) had a similar lead over the 8800GTX/9800GTX. The 8800GTX (G80) was about double the 7800GTX before it. The 8800GTX launched for $650 (undercut later by the massively popular 8800GT), the 280 launched at $650 (forced to $500 weeks after by ATi), and the 480 launched at $500. All three of those chips were the big-die chips of their respective generations/architectures, and all consistently and vastly outperformed their predecessors. There is some variation in how much depending on the specific series/architecture, but they're all way beyond the modest improvement the 680 brought, and especially way beyond the barely existent improvement the 980 brought.

The improvement the 680, also at $500, should have brought over its predecessor (the 580) is more in line with the 780 (GK110) and the improvement the 980 should be over the 780/Ti is more in line with what GM200 will bring to the table, if we're to maintain the same level of progress from architectural debut to architectural debut. Instead, we're getting rumors of a delayed Titan II for $1350 with what historically would have been the 980 with a price at roughly half that $1350 to cover the extra costs of the bigger die, meaning around ~$650-700. GM204 is not a high-end chip; it doesn't perform relative to its predecessor like a new architecture's high-end chip would, it doesn't have the die size of a high-end chip, it doesn't have the design characteristics of a high-end chip, it doesn't have the power consumption of a high-end chip, and it doesn't have the internal code name of a high-end chip, but it does have the price of a high-end chip.

Additionally, look how ridiculous pricing has become. The Titan? Nvidia tried charging $3000 for their most recent dual-chip card (Titan Z), which is way beyond any dual-chip GPUs of the past, including the 590, which had GPUs of a similar die size to GK110.
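The chained percentages above compound multiplicatively, which makes the gap easy to quantify. A quick sketch, using midpoints of the quoted ranges; the 480 -> 580 refresh figure is my own assumption, not something from the comparisons above:

```python
# Compound the generational uplifts discussed above. Each factor is a
# multiplier: "60-70% faster" -> ~1.65 at the midpoint. Figures are
# the post's rough estimates, not measured benchmarks; the 480 -> 580
# refresh step (~15%) is an assumption added to complete the chain.
uplifts = [
    ("7800GTX -> 8800GTX", 2.00),  # "about double"
    ("8800GTX -> GTX 280", 1.65),  # "similar lead", midpoint of 60-70%
    ("GTX 280 -> GTX 480", 1.65),  # 60-70% (with tessellation)
    ("GTX 480 -> GTX 580", 1.15),  # assumed refresh gain, not cited
    ("GTX 580 -> GTX 680", 1.30),  # "barely 30%"
]

cumulative = 1.0
for step, factor in uplifts:
    cumulative *= factor
    print(f"{step}: x{factor:.2f} (cumulative x{cumulative:.2f})")
```

The point being: when each big-die debut multiplies performance by 1.6-2x, swapping in a 1.3x (680) or 1.05-1.1x (980) step is a very visible break in the curve.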
 
About the yet-to-be produced thing, we have leaked pictures of GM200 showing it already in production at least a couple months back and shipping data from even further back. Plus it's on the same 28nm process node Nvidia have been using for three years now. Purported initial yield problems explaining away GK100 don't apply to GM200 at all. The chip is certainly in production with very little room for process maturation at this point to do anything.

And about whether we've lost anything, yes, we have. Remember the 980 being ~5-10% faster than the 780Ti (which is pitiful) even though there's plenty of room for the die to be larger and still be easily profitable in a similar price range? GM204 is barely any faster than GK110; it is, by a vast margin, the smallest increase any Nvidia GPU architectural debut has yielded (at least in the past decade), and the chip itself is relatively puny and no doubt quite a bit cheaper to manufacture.

The GTX 680 was barely 30% faster than the 580. According to Nvidia's own representation of the 480 vs the 280 in Heaven, however (which supports that 30%), the 480 (GF100) is in the 60-70% faster range than the 280 (with tessellation, which the 280 cannot do). The 280 (GT200) had a similar lead over the 8800GTX/9800GTX. The 8800GTX (G80) was about double the 7800GTX before it. The 8800GTX launched for $650 (undercut later by the massively popular 8800GT), the 280 launched at $650 (forced to $500 weeks after by ATi), and the 480 launched at $500. All three of those chips were the big-die chips of their respective generations/architectures, and all consistently and vastly outperformed their predecessors. The improvement the 680, also at $500, should have brought over its predecessor (the 580) is more in line with the 780 (GK110), and the improvement the 980 should be over the 780/Ti is more in line with what GM200 will bring to the table, if we're to maintain the same level of progress from architectural debut to architectural debut. Instead, we're getting rumors of a delayed Titan II for $1350, with what historically would have been the 980 priced at roughly half that $1350 to cover the extra costs of the bigger die, meaning around ~$650-700. GM204 is not a high-end chip; it doesn't perform relative to its predecessor like a new architecture's high-end chip would, it doesn't have the die size of a high-end chip, it doesn't have the physical characteristics of a high-end chip, it doesn't have the power consumption of a high-end chip, and it doesn't have the internal code name of a high-end chip, but it does have the price of a high-end chip.

Additionally, look how ridiculous pricing has become. The Titan? Nvidia tried charging $3000 for their most recent dual-chip card (Titan Z), which is way beyond any dual-chip GPUs of the past, including the 590, which had GPUs of a similar die size to GK110.

Great post.

Plus, GM204 only has a 256-Bit bus, unlike high-end chips that have a 384-bit or 512-bit bus.
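And the bus width translates directly into peak memory bandwidth: bytes per transfer times the effective memory clock. A quick sketch, assuming the roughly 7 GHz effective GDDR5 both cards shipped with:

```python
def peak_bandwidth_gbs(bus_bits, eff_clock_mhz):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * effective clock."""
    return bus_bits / 8 * eff_clock_mhz / 1000

# 256-bit GM204 (GTX 980) vs 384-bit GK110 (780 Ti), ~7 GHz effective GDDR5
print(peak_bandwidth_gbs(256, 7000))  # 224.0 GB/s
print(peak_bandwidth_gbs(384, 7000))  # 336.0 GB/s
```

So even at the same memory clock, the narrower bus caps GM204 at two thirds of GK110's raw bandwidth (Maxwell's improved delta color compression claws some of that back in practice).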
 
Why are people thinking it's a new GPU? An ANDROID site gets an invitation, and you think it's the next Titan, which is specifically NOT made for gaming. It's going to be something Tegra gaming related. That doesn't leave many options besides yet another Android console. Perhaps their own VR device?
 
I can see it being Nvidia's VR attempt or some other new product market (maybe game streaming like PlayStation Now). I doubt it's a new GPU, as the "5 years in the making" makes it sound like something that's not a GPU.
 
23.5GB VRAM GTX 1080. The slogan will be "YOU get some vram and YOU get some vram and EVERYBODY GETS SOME VRAM". Everyone will go wild. People will parade in the streets all over the world. They will laugh and cry and sing. All because we can now run that Shadow of Mordor texture pack thing in peace and harmony.

No but seriously, it wouldn't be a new GPU would it?

Will mark my calendar for March 2.5.
mj-laughing.gif
 