
Nvidia CEO laments low Turing sales, says last quarter was a ‘punch in the gut’


hariseldon

Unconfirmed Member
I'll tell ya, stuff like this is why I stopped being a PC gamer and went back to consoles. PCs just got too expensive for the upgrades to be worth my time and energy.

Buy a console and I'm good for 5 to 10 years. Any game I buy for the PS4 will actually work on my PS4.

Buy a PC and it will be insufficient for the latest games in about 3 years. I have to make sure that a game I buy will actually work. And I get pissed when I have to lower settings below my threshold, or it ends up being incompatible with this-or-that because tech has moved beyond whatever I may have going on. Then you run into what I call the "cascade effect", where you have to upgrade this to upgrade that, to upgrade this to upgrade that. Bleh. I got sick of it.

I was a PC gamer between 1996 and 2013 (spent a few years between PS1 and PC, but missed the entire PS2 era when it was current and bought a PS3 in 2013) and I've come to the conclusion that if I don't use a PC to play games, it will last as long or longer than a game console. For example, this laptop is 8 years old. Can't play any newer games, but it still works for all my legacy console emulators and everything else I need it for, so at this point, I still don't NEED to upgrade it. But my PS4 will be relevant for another 1-3 years and STILL be cheaper to upgrade than a PC will be.

Personally I've bought computers and found them to be good for a lot more than 3 years, bearing in mind usually PC games have higher quality and performance than console games. If it's a desktop, a new GPU every 5 years will suffice, the CPU is barely relevant these days. My desktop running a 970 is getting on a bit and still runs everything.
 

Reallink

Member
FYI "certain high end GPU's" is almost definitely literal and not rhetorical. An explicit exclusion of the 2080Ti, which has been more or less perpetually sold out since release. Even today you will generally only find 2 or 3 models in stock and 30 others sold out. It's effectively selling as fast as they can make them, there is no way Nvidia could consider it a disappointment. I would not be at all surprised to see them pushing the pricing even further next generation ($1499 for the shitty models and up to $1999 for the super custom AIB's). A lot of people have a lot of disposable income and are always happy to spend it on "The Best" of anything.

The cards that are failing are the 2080 and especially the 2070, which were always going to be the lion's share of Turing sales and revenue. And even there, I'm 90% confident the issue is not the price; it's that they offer very little performance gain over the 1080 Ti and 1080 respectively. If they actually offered the 50+% gains you'd expect from a two-year generation, I think they would be completely sold out and price-hiked well above MSRP, like Pascal cards were for nearly a year.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Then again Dark, you have an RTX 2080 Ti model, which is a very, very expensive card.

I am excited about the potential of the technology too. Heck, I still would have wanted to see the IA64 chip and the ISA extensions the ex-Alpha EV8 design team would have designed on a modern manufacturing process. I think Elbrus2K had some interesting concepts behind it, Transmeta's and Denver's just-in-time code optimisation is interesting, and so is being able to use the additional Tensor cores and the new Mesh Shaders, the integration of ray tracing/rasterisation hybrids, even the Quake II demo you showed the other day, etc...

... but this will take quite a long while to really become what people expect, and buying these cards is not delivering that. As you said they are not good mass-market products, but how good are they for early adopters either? Or just for a niche of the niche?
I appreciate your honest thinking around this, different from the excitement-driving and hype-generating coverage you could see online in the beginning.
 

Panajev2001a

GAF's Pleasant Genius
Not true and I get so sick of repeating it but here we go again:

You will not find a new 1080 Ti for $650 anywhere, as they are sold out everywhere and have been out of stock since November. Check Newegg, Amazon, Microcenter, etc. Not one of those places has one in stock except for Amazon, which has one for $1000. All the others being sold are $1000+ from 3rd-party vendors. Newegg has one used for $799, but at that price the 2080 is a much better buy.

In short, the argument that a 1080 Ti is a better buy is no longer a true one as all the stock has been bought up and 3rd party vendors are now selling them at insane markups.


To be fair the argument “not true that 1080 Ti is better value for your money because nVIDIA helped make it so you cannot buy it anymore”... well it would make me feel a bit sad :/.
 

twdnewh_k

Member
Well, in this case, the issue is the die size. I feel this should have been 7nm. The chip is enormous on the RTX cards and rather expensive to manufacture as a result.

I think they simply ran out of die space, honestly. I do think standard performance is still quite good overall and my CPU has become the main bottleneck. I can play every game at 4K60 now, basically, but hitting higher frame-rates at 4K remains challenging.
I think that's where a lot of people disagree (and which is evident from the news article): after 2.5 years, a lot were expecting a much larger leap in performance. Especially at these prices.
Die size was their choice, plus we can't deny that Nvidia only arrived at those prices because they felt their market position let them.

To be honest, it makes me happy the market finally decided to react this way.
 
Last edited:

Chastten

Banned
Quite glad too. The last time I bought a high-end card was back in 2006 and that was like €160. That thing could play anything at the time at around max settings and fluid framerates.

This time last year you'd have bought an entry-level card for that price. It's absolutely insane that PC gamers have put up with this nonsense.
 

dark10x

Digital Foundry pixel pusher
I think that's where a lot of people disagree (and which is evident from the news article): after 2.5 years, a lot were expecting a much larger leap in performance. Especially at these prices.
Die size was their choice, plus we can't deny that Nvidia only arrived at those prices because they felt their market position let them.

To be honest, it makes me happy the market finally decided to react this way.
Well, it was 2.5 years after the original 10 series - but most are comparing with the 1080ti which was a 2017 card.

I can certainly understand, though, the price is way too high.
 

Shmunter

Member
Regarding high-performance PC specs etc.: I'm a console gamer, and I'm curious if my mindset is unique.

My appreciation of games is actually largely dependent on seeing results squeezed from fixed hardware. Essentially I see the achievement in the elegance of the software pushing boundaries. Hardware stretching the software seems almost meaningless to me.

I do absolutely appreciate pristine image quality and solid frame rates, but seeing it within the confines of a known spec is key, providing a much-needed baseline for the measure of quality and development effort.

For this very reason I adore new console generations and seeing the evolution of software through the years.

Please excuse my quirk 😇
 
My stock portfolio is a "punch in the gut".

Dear Jensen,

Please go back to not being like Apple, raising prices until people stop buying your products and the stock price tanks.

Sincerely,
U.S.
 

JohnnyFootball

GerAlt-Right. Ciriously.
You don't get it.

The point is that if a gamer is willing to pay a max of $650 for a 1080 Ti-level card, they won't pay $1400 for any level of card.

I quit PC gaming after high-end cards passed the €400 mark, around 2010, when my PC was already too slow.
No. I get it just fine.

When the 2080 and the 2080 Ti were released during that first month, 1080s and 1080 Tis were still available and being sold at their best prices. As a result a lot of people bought them up and got good deals. But they sold out quickly and are no longer available, and it's extremely disingenuous to compare closeout 1080 prices that are no longer available to current 2080/Ti prices.

Right now, if you want performance equal to or slightly better than a 1080 Ti, your only option is the 2080, and there have been some deals to be had; I've seen them as low as $650.
 
Well, in this case, the issue is the die size. I feel this should have been 7nm. The chip is enormous on the RTX cards and rather expensive to manufacture as a result.

If AMD were competitive, nVidia would have had to eat the cost of the expensive die full of tensor cores they designed for AI. As it stands, they are offloading that cost onto the gaming consumer and desperately trying to justify why we need tensor cores in a GPU mainly used for rasterising.
 

AllyITA

Member
Quite glad too. The last time I bought a high-end card was back in 2006 and that was like €160. That thing could play anything at the time at around max settings and fluid framerates.

This time last year you'd have bought an entry-level card for that price. It's absolutely insane that PC gamers have put up with this nonsense.
I will not argue about the performance of that €160 card, but as I remember it, at that price even at the time you could only get an entry-level card.

At the time it was more like <€200 entry level; €250-300 mid-range; €500 high end.
 

ookami

Member
I think it was quite obvious that I was talking about the type of buyer that values and gets a card in the price bracket of a GTX 1080 Ti [ ... ]. A working young adult who has money to spend on a >$900 video card but doesn't yet have the rest of the financial obligations that an older person has is the type of person I had in mind.
While I agree with you that people have a tendency to buy more expensive, longer-lasting hardware, I was in fact arguing about those PC customers who, in my opinion, are not especially younger than their console counterparts. It felt like you were talking about much younger people, hence my questioning of your statement and what gave you this idea.

No mainstream product ever sells en masse at the most expensive tier; I would have been silly to suggest that. I don't know how you even got that from my post; maybe I didn't explain it properly, but whatever.
I never said you suggested that. Bringing up the hardware survey was in response to someone else's generalisation.
 
Last edited:

ZywyPL

Banned
The cards are overkill for 4K; hell, even an OCed 1080 Ti runs games at 70-90 FPS, so you need one of those super expensive 4K 144Hz displays to fully utilise the cards. But when you turn on RTX, the cards can't handle more than QHD, for which they are simply too expensive. Plus, almost no support for the new tech you have to pay so much for makes the cards a very questionable purchase, to say the least.

NV needs to push hard for the DLSS+RTX combo being widely used in upcoming games. Being able to play games at 4K60 with RT would make the cards so much more attractive despite their price, because as of now it's basically just a slightly more powerful 1080 Ti for almost double the price...
 

dark10x

Digital Foundry pixel pusher
If AMD were competitive, nVidia would have had to eat the cost of the expensive die full of tensor cores they designed for AI. As it stands, they are offloading that cost onto the gaming consumer and desperately trying to justify why we need tensor cores in a GPU mainly used for rasterising.
Absolutely which is why competition is so important. I really hope AMD manages to come up with something but the Radeon VII isn't it.

That said, due to lack of competition, it was a good time to introduce the RT core.
 
Last edited:

tkscz

Member
Yeah, price your cards at ludicrous prices and people won't buy them. That they're surprised by this is hilarious.

Well people had gone out of their way to buy Titan cards in the past and they were around the same price. The issue is that not many games use ray-tracing and even when they do, you have to lose on frames and stick with 1080p for stability. Especially if you went with an RTX 2060.
 
Well people had gone out of their way to buy Titan cards in the past and they were around the same price. The issue is that not many games use ray-tracing and even when they do, you have to lose on frames and stick with 1080p for stability. Especially if you went with an RTX 2060.
Titan cards aren't designed for gaming. They can do it, sure, but that's not what they are primarily made for.

The issue is that, bar the 2060, the RTX cards are just too expensive for what they currently offer.
 

tkscz

Member
Titan cards aren't designed for gaming. They can do it, sure, but that's not what they are primarily made for.

The issue is that, bar the 2060, the RTX cards are just too expensive for what they currently offer.

While yes, the Titans were made as low-cost rendering cards compared to their $10,000+ Tesla/Quadro cousins, a lot of people bought them for gaming. Hell, several Gaffers bought MULTIPLE of them for gaming and showed it off.

And at the same time you're right: people bought those Titans knowing they would run games better than whichever current high-end card was out at the time (the 1080 Ti excepted). With the RTX 2080 Ti, 2080 and 2070, you're spending WAY more for a feature that makes games look better, but at a frame-rate and resolution cost.
 
Last edited:

longdi

Banned
I am 40% down on my Nvidia stock. Fucking Jensen, for being a sneaky CEO who always withholds bad news.

I would have upgraded to an RTX 2080 Ti if it had 16GB of VRAM and cost $999 with a water cooler. At 11GB, it just doesn't feel like an upgrade over the 1080 Ti.

I know big Turing is an expensive wafer to cut. That's why Nvidia should have started with mid-range Turing, like an RTX 2060 Ti 8GB for $429 that performs like a 1080, and also released the laptop versions to keep OEMs refreshing new models.
The big cards should have come on 7nm in mid-2019, with 16GB. That would have kept sales going, and the stupid stock price wouldn't keep crashing.

In 2018 my gains were enough to easily buy an RTX 2080 Ti; now my dreams are over!
 
Last edited:

Ivellios

Member
While ray tracing tech is really amazing, it is simply not worth this price and performance cost, so I understand why people didn't buy the RTX cards.

Plus, ray tracing/DLSS are not supported in the vast majority of games anyway.

Personally I've bought computers and found them to be good for a lot more than 3 years, bearing in mind usually PC games have higher quality and performance than console games. If it's a desktop, a new GPU every 5 years will suffice, the CPU is barely relevant these days. My desktop running a 970 is getting on a bit and still runs everything.

I wish that were true; my ancient i5-4590 can only run newer games on minimum, so even if I upgrade the GPU I would still need to upgrade the CPU as well.

Not only that, but ray tracing demands a far stronger CPU as well.
 

hariseldon

Unconfirmed Member
While ray tracing tech is really amazing, it is simply not worth this price and performance cost, so I understand why people didn't buy the RTX cards.

Plus, ray tracing/DLSS are not supported in the vast majority of games anyway.



I wish that were true; my ancient i5-4590 can only run newer games on minimum, so even if I upgrade the GPU I would still need to upgrade the CPU as well.

Not only that, but ray tracing demands a far stronger CPU as well.

I'd be surprised if the CPU was any kind of bottleneck, it's almost certainly the GPU. CPUs are rarely stretched in an era where the focus is on shiny shiny rather than stuff like enemy AI etc which might tax the CPU. As I said before, my NVidia 970 which is hardly cutting edge runs pretty much anything fine, and I'm happy enough to do without raytracing so that's not an issue. I'd classify it as a nice-to-have, not a must-have.
 

Ivellios

Member
I'd be surprised if the CPU was any kind of bottleneck, it's almost certainly the GPU. CPUs are rarely stretched in an era where the focus is on shiny shiny rather than stuff like enemy AI etc which might tax the CPU. As I said before, my NVidia 970 which is hardly cutting edge runs pretty much anything fine, and I'm happy enough to do without raytracing so that's not an issue. I'd classify it as a nice-to-have, not a must-have.

Well, if I bought a new RTX 2060 and Battlefield V today, then according to the game's minimum requirements I would not be able to play it, since even the minimum CPU is better than mine.

Or maybe I could make it work with everything on low and at a low resolution, but my point is the CPU is still significant.
 

hariseldon

Unconfirmed Member
Ivellios, I'd be willing to wager that with that GPU you'll be fine running Battlefield V. The minimum requirements are often... creative.
 

hariseldon

Unconfirmed Member
I do hope you are right, since I will wait a long time before upgrading the CPU as well.

Might I suggest picking up something on Steam or GOG that's cheap but is a known GPU hog and seeing how it performs (if you don't already own such a game).
 

Ivellios

Member
It just makes sense to wait if you aren’t one of the bleeding edge early adopters.

Pretty much, especially considering how few games support ray tracing/DLSS.

Might I suggest picking up something on Steam or GOG that's cheap but is a known GPU hog and seeing how it performs (if you don't already own such a game).


This is a good idea, but I still have not upgraded the GPU; I am waiting to see whether the rumoured GTX 1660 series is better value than the RTX 2060.
 

FireFly

Member
I was a PC gamer between 1996 and 2013 (spent a few years between PS1 and PC, but missed the entire PS2 era when it was current and bought a PS3 in 2013) and I've come to the conclusion that if I don't use a PC to play games, it will last as long or longer than a game console. For example, this laptop is 8 years old. Can't play any newer games, but it still works for all my legacy console emulators and everything else I need it for, so at this point, I still don't NEED to upgrade it. But my PS4 will be relevant for another 1-3 years and STILL be cheaper to upgrade than a PC will be.
But that's exactly why you *don't* need to upgrade your PC every 3 years like you did in previous generations – because games are now made primarily for consoles, which have fixed hardware specifications. If you bought a Radeon R9 290 5 years ago, you are still good to go for playing games at 1080p, and will be until the generation ends. I bought a Geforce 970 3 years ago, and see no reason to upgrade until the next generation hits, because I still only have a 1080p screen.

Basically, once you have bought a graphics card sufficient to play at your preferred resolution at console settings, with a little performance to spare, you can ride it out until the next generation hits.
 
Last edited:
The American-based companies definitely have the most margin in this case though. Not everything is China's fault, not even if Trump says so.

Is this true though? They aren't doing the manufacturing; they aren't contributing to the goods outside of the engineering. The plans, more than likely not even on paper at this point, are handed over to China and we get back a fully packaged product within months. Nvidia and many other tech companies may as well just be fronts at this point.
 

Kamina

Golden Boy
Is this true though? They aren't doing the manufacturing; they aren't contributing to the goods outside of the engineering. The plans, more than likely not even on paper at this point, are handed over to China and we get back a fully packaged product within months. Nvidia and many other tech companies may as well just be fronts at this point.
There is a lot of room between knowing all the details of each individual part and assembling pre-supplied parts according to specifications. As a member of a company that does business like that, I can tell you that the actual information a single supplier receives about a product is much vaguer than the actual sub-component they are producing for that product. That's why it is so important to have several different suppliers overall, who in turn source an assembling supplier.
Looking at Apple and co, the price you pay for the product is largely margin and development costs, rather than actual production and resource costs.
 
Last edited:

Kamina

Golden Boy
Yeah, and the markup between Chinese phones and Apple's shows that.
To be fair: copying many of the functions in one way or another from the competition, as well as having the rest of the development also located in China, keeps total development costs low.
 
Last edited:

hariseldon

Unconfirmed Member
Pretty much, especially considering how few games support ray tracing/DLSS.



This is a good idea, but I still have not upgraded the GPU; I am waiting to see whether the rumoured GTX 1660 series is better value than the RTX 2060.
Ah I had a reading failure, misread it as you'd already upgraded. Apologies.
 
I was talking mostly about hardware; software adds another level of complexity to the equation.

What hardware has Apple come up with on their own? Have they invented any new RAM? New screens? New batteries? As far as I know they just take parts from everyone else and put them in an Apple-designed shell. Not much different from Dell, really.
 

Kamina

Golden Boy
What hardware has Apple come up with on their own? Have they invented any new RAM? New screens? New batteries? As far as I know they just take parts from everyone else and put them in an Apple-designed shell. Not much different from Dell, really.
Hardware functions, and developments/advancements based on pre-existing hardware.
This is all really complex. There is more to developing a product than engineering it from components you find on the market. Often you have patents for new things you share with one supplier; other times you use certain components, with slight changes, in a way other than what they were intended for.

All I'm saying is that the individual Chinese suppliers each know much less about the product they are producing and assembling than you think, and that development is more than stacking parts and designing a nice shape for it.
 
Last edited:
All I'm saying is that the individual Chinese suppliers each know much less about the product they are producing and assembling than you think, and that development is more than stacking parts and designing a nice shape for it.

How can you say that when they produce knock-offs as fast as they do? And it's not just making a copy; it's taking a set of cheaper components and making versions of their own. I think they know far more than we'd think. Japan was the clear leader in technology when we were growing up, and that shifted for some reason; now Japan is stuck in this weird place, almost as if they've been shunned.
 
Last edited:

Kamina

Golden Boy
How can you say that when they produce knock-offs as fast as they do? And it's not just making a copy; it's taking a set of cheaper components and making versions of their own. I think they know far more than we'd think. Japan was the clear leader in technology when we were growing up, and that shifted for some reason; now Japan is stuck in this weird place, almost as if they've been shunned.
Reverse engineering is its own field to specialise in; that's why they are so good with knock-offs.

However, I feel we have drifted too far from the initial topic.
 
Last edited:

Chastten

Banned
I will not argue about the performance of that €160 card, but as I remember it, at that price even at the time you could only get an entry-level card.

At the time it was more like <€200 entry level; €250-300 mid-range; €500 high end.

Technically you're correct, but you're missing something important. The prices you mention are the MSRP launch prices, but cards were never actually sold at those prices and they dropped ridiculously fast. I just checked, and mine was a GeForce 6800 GT that I bought in 2005 for €160. Granted, this was a year after its release, and it was the slower AGP version since I didn't want to get a new mobo, but the value was incredible.

It would be like buying a 1070 for under €200 now
 

Ascend

Member
Well, there's two perspectives to consider.

The RTX cards are not great from a consumer perspective, and reviewing them from the "should I spend money on this" perspective suggests that you should wait. That's one style of review.

The other recognizes the potential of what we're seeing. What they're attempting is fantastic and interesting. It's extremely important for the future of real-time graphics. This is how graphics cards USED to be; the last ten years have been rather dull. I'm thrilled to see something new again but, yeah, it's for early adopters only right now.

So I think these are interesting cards but not necessarily a great value right now.
I have to disagree here...

People talk about nVidia pushing the industry forward with RTX, but what they were really doing, at least up to and including RTX, was limiting the adoption of DX12/Vulkan and raising their card prices for more profit. And there is literally ZERO reason why ray tracing could not be implemented through compute (see the rough sketch after this post), which AMD is also extremely good at. If things were done openly, gamers would benefit, but that's not what happens.
Now that they have RTX they are seen as innovative, even though it's simply the next stage of what used to be TWIMTBP, PhysX, GameWorks in general, G-Sync: closing off certain technologies for themselves and shutting out the competition, because they can.

But you only have to look back a year, when Microsoft announced ray tracing support for DX12, and Vulkan followed later that same year. AMD had its Radeon Rays 2.0 ready in March of last year, which is real-time ray tracing in Vulkan using async compute. That's not even counting the first version of Radeon Rays.
DLSS, same thing. Microsoft released DirectML early in 2018. Nothing DLSS does is special, in the sense that it can all be done using DirectML. And that's ignoring the fact that dropping resolution and upscaling normally often gives superior quality to DLSS at the same performance. But no, nVidia has to play this game of closing everything off in order to pretend that only they have the technology to do what they're doing, when that couldn't be further from the truth.

Vulkan in particular (although DX12 is equally capable) offers unprecedented efficiency and flexibility, and many technologies could be implemented with it without the need for nVidia's closed-off approach. All the while, Vulkan, and probably even DX12, wouldn't exist in their current form if it weren't for AMD, but they don't get props for innovation. nVidia gets praised for their anti-consumer, closed-off approach though, and to me that's disgusting.
Large groups of gamers bashed DX12 because nVidia performed badly in it (or at least worse than in DX11), while AMD had either parity or an improvement in most cases. So what really is the problem then, DX12 or nVidia? Vulkan is bashed to a lesser degree because there the benefits have been more obvious; the only reason it's bashed at all is that it's barely used compared to DirectX. But nVidia's mindshare has dominated gaming for too long, and consumers would be better off if things went differently, as in, more openly.

The issue is, AMD didn't market RT even though they had it, and why should they, when the hardware in reality is not ready for it? nVidia, on the other hand, are masters of marketing and, most importantly, deception. Everyone thinks that only they can do RT now, and that's a big joke. But this time they have gone too far and it didn't work out like they expected.
And I'm extremely glad to see that, at last, this approach is backfiring on them, because honestly, I'm tired of it.
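[Editor's note: for anyone wondering what "ray tracing through compute" means in practice, here is a minimal, hypothetical sketch: a plain CUDA kernel that runs ray-sphere intersection tests with ordinary arithmetic, no RT cores or vendor ray-tracing API involved. This is not Radeon Rays or RTX/DXR code; the `Ray`, `Sphere`, and `intersect` names are invented for the example. Real renderers batch millions of rays against acceleration structures such as BVHs, which is exactly the traversal work dedicated RT hardware speeds up.]

```cuda
// Hypothetical sketch: ray-sphere intersection as an ordinary CUDA compute
// kernel. The test is just the quadratic |o + t*d - c|^2 = r^2 solved per
// thread; no specialised ray-tracing hardware is used.
#include <cstdio>
#include <cmath>

struct Ray    { float ox, oy, oz, dx, dy, dz; };  // origin + unit direction
struct Sphere { float cx, cy, cz, r; };

// One thread per ray: writes the nearest positive hit distance, or -1 on miss.
__global__ void intersect(const Ray* rays, Sphere s, float* t_hit, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Ray r = rays[i];
    float lx = r.ox - s.cx, ly = r.oy - s.cy, lz = r.oz - s.cz;
    float b = lx * r.dx + ly * r.dy + lz * r.dz;      // d . (o - c)
    float c = lx * lx + ly * ly + lz * lz - s.r * s.r;
    float disc = b * b - c;

    if (disc < 0.0f) { t_hit[i] = -1.0f; return; }    // ray misses the sphere
    float t = -b - sqrtf(disc);                        // nearest root
    t_hit[i] = (t > 0.0f) ? t : -1.0f;
}

int main()
{
    const int n = 1;
    Ray    ray = {0, 0, 0, 0, 0, 1};                   // ray along +z
    Sphere s   = {0, 0, 5, 1};                         // unit sphere 5 units away

    Ray* d_rays; float* d_t;
    cudaMalloc(&d_rays, n * sizeof(Ray));
    cudaMalloc(&d_t,    n * sizeof(float));
    cudaMemcpy(d_rays, &ray, n * sizeof(Ray), cudaMemcpyHostToDevice);

    intersect<<<1, 32>>>(d_rays, s, d_t, n);

    float t;
    cudaMemcpy(&t, d_t, sizeof(float), cudaMemcpyDeviceToHost);
    printf("hit distance: %f\n", t);                   // expected: 4.0

    cudaFree(d_rays); cudaFree(d_t);
    return 0;
}
```

For the toy case above (a unit sphere centred five units along +z), the kernel reports a hit distance of 4.0. The same math could equally be written as a DX12/Vulkan compute shader, which is the point being argued: the intersection arithmetic itself does not require dedicated RT cores, even if dedicated hardware traverses large scenes faster.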
 
Last edited:

llien

Member
FYI "certain high end GPU's" is almost definitely literal and not rhetorical. An explicit exclusion of the 2080Ti, which has been more or less perpetually sold out since release. Even today you will generally only find 2 or 3 models in stock and 30 others sold out.

Depends on country, I guess.
In Germany, a large online retailer has 10+ models, all in stock, and the total number sold is laughable.

The issue is, AMD didn't market RT even though they had it, and why should they, when the hardware in reality is not ready for it? nVidia, on the other hand, are masters of marketing and, most importantly, deception. Everyone thinks that only they can do RT now, and that's a big joke.
To be fair, at the moment only they can do "that type of RT" which they have baked into the 20xx series, as it uses specialised hardware that somehow does ray intersection tests faster.
 
Last edited: