
Nvidia Stuns Wall Street with Record Earnings

Still don't understand why MS went with AMD for the XB1. They have an agreement with Nvidia from 2000 that states no company can own more than 30% of Nvidia's shares without Microsoft's first and last rights of refusal.



This could become important if Microsoft really starts focusing on PC gaming. They already chose Nvidia for the Surface Book GPU and their Azure cloud GPU platform.

I would like to see another system use an Nvidia GPU and Intel CPU like the OG Xbox. I think Intel might make things a bit expensive for them, however.

EDIT: Good for them. I have a 970 and it runs really well. I guess TDP really does mean something. I had an AMD card and it was noisy and consumed more power than what Nvidia had. That was the 7850. Great GPU otherwise. My 970 is quiet and really doesn't get too hot. I do wish AMD would compete a bit more. I wouldn't have any issue using a GPU from either vendor though.
 
The problem is that people are buying Nvidia products just because of the name. AMD does have reasonably priced GPUs for the performance. At this point no one is willing to give AMD a chance, though; people blindly buying Nvidia are the problem at this point, because they are letting Nvidia get away with some shady activities.

I don't want to believe this. PC gamers are pretty savvy and tend to buy the best product. Nvidia has the best GPU ecosystem atm, but not by an insurmountable degree; if AMD sticks the landing with Polaris, I would wager it would turn their fortunes around.
 
The problem is that people are buying Nvidia products just because of the name. AMD does have reasonably priced GPUs for the performance. At this point no one is willing to give AMD a chance, though; people blindly buying Nvidia are the problem at this point, because they are letting Nvidia get away with some shady activities.

You're absolutely right. I don't even look at AMD products when I consider a purchase. Like, don't even look.

What they need is a product that wins back mindshare. Even if it has to be a loss-leader.

They need an 8800GT, a GTX 460, or a GTX 970. Something that spreads like wildfire on forums, something that gets reviewers excited.

They can't just be content to release cards that are a little bit better for a little bit cheaper. That won't work.
 
I don't want to believe this. PC gamers are pretty savvy and tend to buy the best product. Nvidia has the best GPU ecosystem atm, but not by an insurmountable degree; if AMD sticks the landing with Polaris, I would wager it would turn their fortunes around.

Well, even back when AMD (or ATI) had clearly better cards, the market share difference never really swung that far in AMD's favor (only slightly). Now Nvidia has slightly better cards and they are stomping AMD. I don't generally like to place too much emphasis on companies selling products on brand power, but it's pretty weird.
 
Well, even back when AMD (or ATI) had clearly better cards, the market share difference never really swung that far in AMD's favor (only slightly). Now Nvidia has slightly better cards and they are stomping AMD. I don't generally like to place too much emphasis on companies selling products on brand power, but it's pretty weird.

Yeah, even at AMD's heights, way back when they were clearly out ahead in value, they never dominated market share that I know of.

I'm just not sure it's in the cards for them. A shitload of people ignore them regardless.

When I look at that space currently, and the shit with the 3.5GB cards and the way they handled it, I don't understand the mindset of someone not worrying about an increasing monopoly in this space.
 
You're absolutely right. I don't even look at AMD products when I consider a purchase. Like, don't even look.

What they need is a product that wins back mindshare. Even if it has to be a loss-leader.

They need an 8800GT, a GTX 460, or a GTX 970. Something that spreads like wildfire on forums, something that gets reviewers excited.

They can't just be content to release cards that are a little bit better for a little bit cheaper. That won't work.

With dud releases like the Fury X/Nano AMD won't be getting anywhere soon.

The number of units available at release was pathetic. A solid Polaris release won't turn things around overnight, but if they can build on it for future releases, and not price it at roughly the same level as Nvidia's best consumer offering while being considered a worse performer, then they might get somewhere.

But the battle is won at the lower end of the spectrum anyway. Says a lot when the 970 is the most popular card on Steam even well after the 3.5GB fiasco, and the R9 300 series isn't even on the list.
 
nVidia burned both Sony and MSFT in the past. Good luck getting that business back.

Both MS and (until relatively recently) Sony have been perfectly happy to include Nvidia tech in their PCs. If the deal was right they would have taken it.

Nvidia having record earnings is bad for consumers? Wha?

Yeah the problem is that AMD isn't earning shit, not that Nvidia is successful. We want them both to do well.
 
I continue to be astounded by the fact that it seems to be possible for Nvidia to make any profit (never mind increasing revenue and profit) after missing out on those incredibly important and lucrative console deals.

Maybe they are selling all the salt people assured me that they were drowning in.

I could post my list of features again, but people generally seem to ignore it.

Given your reputation, I certainly wouldn't ignore it. From what I can tell, the 390X is a better performer at resolutions higher than 1080p--very competitive at 1080p--and has 8GB of GDDR5, making it more "future proof" than the 970. By the way, I realize that future proof in the PC world is tenuous at best.

I saw the PSU mention in an earlier post; no offense to that poster, but that reasoning didn't exactly resonate with me. The ecosystem/driver stuff makes sense, but the new Crimsons are pretty good.
 
I continue to be astounded by the fact that it seems to be possible for Nvidia to make any profit (never mind increasing revenue and profit) after missing out on those incredibly important and lucrative console deals.

Gamers seem to only care about posting huge sales numbers, not about profits. So to them, seeing that the PS4 and Xbox One have AMD GPUs seems like a big deal because a lot of AMD GPUs are being shipped, without considering anything else. Only when it suits their argument do they dig deeper, sometimes.
 
Yeah, NV "burned" MS by not agreeing to take a loss on an MS product in MS's place. And how exactly did they "burn" Sony?

RSX was a POS and inferior to the AMD chip from day one. That, along with Cell, kept manufacturing costs high.
 
RSX was a POS and inferior to the AMD chip from day one. That, along with Cell, kept manufacturing costs high.

Sony has nobody but themselves to blame for this. Their choice of PS3 architecture led to this situation; with RS being canned, NV actually helped them fill the GPU void on the PS3 quickly enough to launch only a year after the 360. If you remember, they were contracted to produce only the software tools at first, but then after the RS failure Sony decided to order the GPU from them as well. Also, the PS3 is built on a royalties model and NV doesn't produce the RSX; they licensed the tech to Sony the same way everyone does in the console space.
 
nVidia burned both Sony and MSFT in the past. Good luck getting that business back.

If I remember correctly, MS and Sony went with AMD because it was the cheapest proposal they got. Intel and Nvidia were rejected because of cost, not because they didn't want to work with them.
 
If I remember correctly, MS and Sony went with AMD because it was the cheapest proposal they got. Intel and Nvidia were rejected because of cost, not because they didn't want to work with them.

Not the sole reason, as NV didn't have any CPU they could've licensed back when the current gen was being designed, but yeah, price has a lot to do with it.
 
Sony has nobody but themselves to blame for this. Their choice of PS3 architecture led to this situation; with RS being canned, NV actually helped them fill the GPU void on the PS3 quickly enough to launch only a year after the 360. If you remember, they were contracted to produce only the software tools at first, but then after the RS failure Sony decided to order the GPU from them as well. Also, the PS3 is built on a royalties model and NV doesn't produce the RSX; they licensed the tech to Sony the same way everyone does in the console space.
I doubt it was a last-minute decision after the cancellation. The custom nature of the RSX, plus Nvidia's own desire to testbed their unified programmable shader tech (IINM HJS mentioned this in one of his campus speeches) on a mass-market device, means it may have been mutually planned long ago.

Sony should have exercised caveat emptor in the very first place, and ultimately it is business between suits, though that doesn't change that Nvidia gave them a very raw deal.
 
Not the sole reason, as NV didn't have any CPU they could've licensed back when the current gen was being designed, but yeah, price has a lot to do with it.

In principle, could there be an Intel-Nvidia collaboration to produce an SOC? I mean I know it's not going to happen, but is it the kind of thing that a console maker could get done if price was no object? Or are APUs like that too technically challenging unless it's under the same roof, like AMD/ATI are?
 
In principle, could there be an Intel-Nvidia collaboration to produce an SOC? I mean I know it's not going to happen, but is it the kind of thing that a console maker could get done if price was no object? Or are APUs like that too technically challenging unless it's under the same roof, like AMD/ATI are?
No point I think since each can make its own non-phone SoC.
 
Great for nVidia.

I'm one of those who think the old ATI gave Nvidia more competition than AMD ever did after buying ATI.

I don't think this is true; things seem to be about the same. ATI was even in a lull when AMD bought them; not long before that, their highest-end card was trading blows with Nvidia's midrange chip. They've grown more competitive since then. Nvidia may still have the overall performance crown, but at least Fury isn't getting put up against the 970 or something.
 
In principle, could there be an Intel-Nvidia collaboration to produce an SOC? I mean I know it's not going to happen, but is it the kind of thing that a console maker could get done if price was no object? Or are APUs like that too technically challenging unless it's under the same roof, like AMD/ATI are?

I actually think getting Nvidia and Intel to work together would be tough. If I remember correctly, Intel cut Nvidia out of integrated graphics on their laptop motherboards, which pissed Nvidia off. And I suspect that if a console maker went with Intel, Intel would want to push their Iris Pro into the system instead of working with a GPU maker.
 
Still don't understand why MS went with AMD for XB1. They have an agreement with Nvidia from 2000 that states no company can own more than 30% of Nvidia shares without Microsofts first and last rights of refusal.

Nvidia had no reason to be at the bargaining table when AMD was (and is) circling the drain and desperately needed any form of income. I'm sure you've noticed that AMD's financials haven't really improved despite being the CPU and GPU provider in all three current-gen consoles, which have sold somewhere north of 60m units combined. I believe the last time AMD posted a profit was Q3 2014 -- the aforementioned console contracts are ostensibly pennies on the dollar.
 
Given your reputation, I certainly wouldn't ignore it. From what I can tell, the 390X is a better performer at resolutions higher than 1080p--very competitive at 1080p--and has 8GB of GDDR5, making it more "future proof" than the 970. By the way, I realize that future proof in the PC world is tenuous at best.
Ok, here are a few reasons to buy an Nvidia GPU over a similar AMD GPU even if the latter is better in a pure FPS/$ metric:
  • Better OpenGL support
  • Better Linux drivers
  • Lower CPU overhead in DX9 and DX11
  • Driver-based HBAO+ injection and wider support for driver-based SGSSAA injection
  • Shadowplay, which has more solid support for e.g. desktop shadow capture than AMD's equivalent
  • G-sync, which has its own list of advantages over freesync, e.g. windowed mode support
  • CUDA support for professional applications
  • PhysX support
  • More consistent frametime results in memory-limited situations
  • DSR with its much broader range of supported downsampling resolutions
  • 3D Vision
Now, each of those will be of different importance to different people, and someone may well and rightfully say they don't care about any of them - or not sufficiently to influence their purchase over raw performance potential - but they are all factual (I can provide links for any which are not self-evident) and someone surely cares about each. The common supposition that people who buy Nvidia GPUs are invariably blinded by fanboyism really grates on me, since I believe that my purchasing decisions in hardware are quite well-considered.
 
Dat feel when you run Linux 99% of the time and would love to support the competitor's products with open-source drivers but NVIDIA's closed binary blobs are just better-coded and smoother-running so you've contributed to this.
 
How much cheaper do you guys think the 980 Ti will get when Pascal is released?

Not much. NV hasn't been doing much discounting lately (hence the profit), and it's unlikely to drop much on resale either.

Edit: beaten... and Aztechnology says it better as well!!!!
 
I doubt it was a last-minute decision after the cancellation. The custom nature of the RSX, plus Nvidia's own desire to testbed their unified programmable shader tech (IINM HJS mentioned this in one of his campus speeches) on a mass-market device, means it may have been mutually planned long ago.

Sony should have exercised caveat emptor in the very first place, and ultimately it is business between suits, though that doesn't change that Nvidia gave them a very raw deal.

For all I know it was a last-minute decision, hence the two RAM pools and the specs being changed late in development, because of which most launch titles did not even approach the quality of the target renders they showed at first.

RSX is hardly a custom design - it's a G71 chip with half the memory bus and some smaller changes in the caching subsystem. If they had contracted NV from the start, they would've gotten some early implementation of G80, probably with an RDRAM memory controller, which would've allowed them to use UMA with Cell. The whole PS3 GPU part kinda screams that it was a quick afterthought.

As for the deal - NV gave them what they agreed to. It's the exact same situation as with any other console contractor. Sony has nobody but themselves to blame if the deal wasn't in their favor.

In principle, could there be an Intel-Nvidia collaboration to produce an SOC? I mean I know it's not going to happen, but is it the kind of thing that a console maker could get done if price was no object? Or are APUs like that too technically challenging unless it's under the same roof, like AMD/ATI are?

Well, anything is possible in theory, especially if money is of no concern. But in practice it would be highly uneconomical to produce such an APU, as both NV and Intel are likely to push their own CPU/GPU solutions and would put high price barriers in front of any alternative. So unless you have an unlimited budget, it will always be a lot cheaper to get the complete tech from just one vendor. In the next cycle, though, NV will have their Denver ARM core available and Intel will have developed their iGPU to such heights that it won't be as simple a decision as it was during the current-gen design phase. Because of the financial situation, both Intel and NV could easily undercut AMD in their offerings if they feel that this is something they want to pursue.
 
Ok, here are a few reasons to buy an Nvidia GPU over a similar AMD GPU even if the latter is better in a pure FPS/$ metric:
  • Better OpenGL support
  • Better Linux drivers
  • Lower CPU overhead in DX9 and DX11
  • Driver-based HBAO+ injection and wider support for driver-based SGSSAA injection
  • Shadowplay, which has more solid support for e.g. desktop shadow capture than AMD's equivalent
  • G-sync, which has its own list of advantages over freesync, e.g. windowed mode support
  • CUDA support for professional applications
  • PhysX support
  • More consistent frametime results in memory-limited situations
  • DSR with its much broader range of supported downsampling resolutions
  • 3D Vision
Now, each of those will be of different importance to different people, and someone may well and rightfully say they don't care about any of them - or not sufficiently to influence their purchase over raw performance potential - but they are all factual (I can provide links for any which are not self-evident) and someone surely cares about each. The common supposition that people who buy Nvidia GPUs are invariably blinded by fanboyism really grates on me, since I believe that my purchasing decisions in hardware are quite well-considered.

Great post. I have run multiple CrossFire and SLI setups, and after going from my 5970 to 670 SLI I decided I am staying in the Nvidia camp, for some of the reasons you posted and the fact that Nvidia's software is always evolving and seems better supported. AMD likes to create new things and make them open source, which I like, but they always seem to fall by the wayside afterwards. I want AMD to succeed, but I had my 5970 for a few years and they blew major game releases on the driver front (Crysis 2 and Deus Ex: HR, to name two within 5 months of each other).
 
In principle, could there be an Intel-Nvidia collaboration to produce an SOC? I mean I know it's not going to happen, but is it the kind of thing that a console maker could get done if price was no object? Or are APUs like that too technically challenging unless it's under the same roof, like AMD/ATI are?
I would say practically impossible.
You would need a lot of money and additional time and, most critically, the blessing of both companies.
Intel's IP is tailored to their fabrication process.
Nvidia produces their IP on TSMC's processes.
So for an SoC you would have to port Intel's CPU IP to TSMC or Nvidia's GPUs to Intel's process.

You don't have to guess much to see that it's very unlikely either company would be willing to do so.
 
Now, each of those will be of different importance to different people, and someone may well and rightfully say they don't care about any of them - or not sufficiently to influence their purchase over raw performance potential - but they are all factual (I can provide links for any which are not self-evident) and someone surely cares about each. The common supposition that people who buy Nvidia GPUs are invariably blinded by fanboism really grates me, since I believe that my purchasing decisions in hardware are quite well-considered.

Cool... now do one for AMD GPUs.

I'm not expecting a list that's as long, but surely, if you're not biased as you claim, and you have done the research and fairly considered all the options, you'll have a list of notes for the AMD side of things as well.
 
I don't know why I should have to prove my impartiality to you, but whatever floats your boat. I don't have "a list of notes" though, and I also didn't have one for Nvidia. This is all off the top of my head.

Obviously, the main reason to consider AMD GPUs at most price points is the often superior average FPS/$ metric. But since the previous post was about reasons outside of that, it is indeed quite small. Here's what I can think of:
  • A more serious commitment to and support of OpenCL and its more recent versions
  • Better performance in some FLOPS-intensive games/applications
  • Often larger memory pools in the same price class as comparable NV cards (outside of the Fury line, which I honestly would never recommend at this point)
  • Higher relative double precision floating point performance in consumer cards

That's really all I can think of. For most of the GPU software ecosystem features that I listed in my other post, the basic fact is that AMD's equivalent -- if there is one at all -- is generally not as fully-featured.
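The FPS/$ metric referred to in these posts is just average framerate divided by price. A minimal sketch of the comparison, with entirely made-up card names and numbers for illustration (not real benchmark data):

```python
# Illustrative only: rank GPUs by raw FPS-per-dollar, the value metric
# discussed above. All prices and framerates here are invented examples.
cards = {
    "Card A": {"price_usd": 329, "avg_fps_1080p": 62},
    "Card B": {"price_usd": 429, "avg_fps_1080p": 71},
}

def fps_per_dollar(card):
    """Average 1080p framerate delivered per dollar of purchase price."""
    return card["avg_fps_1080p"] / card["price_usd"]

# Highest value first.
for name, card in sorted(cards.items(),
                         key=lambda kv: fps_per_dollar(kv[1]),
                         reverse=True):
    print(f"{name}: {fps_per_dollar(card):.3f} FPS/$")
```

As the poster notes, this single number ignores everything on the ecosystem list (drivers, capture tools, sync tech), which is exactly why two cards with similar FPS/$ can still be an easy choice for some buyers.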
 
Two more things off the top of my head:

- I heard multi-monitor setups are easier to handle than on Nvidia. (Never tried it myself)
- The new Radeon Settings. You can like or dislike the design, but that thing is fast.
 
Two more things off the top of my head:

- I heard multi-monitor setups are easier to handle than on Nvidia. (Never tried it myself)

It most definitely is with their professional-level cards. Why that extra control-panel functionality in the Quadro drivers isn't also available, at least as an option, on their consumer cards has always been a pet peeve of mine.
 
I gotta say, at least in my current experience with a 970, the Nvidia Control Panel is atrociously slow; it takes about 30 seconds to even open some of the settings tabs. I haven't had the opportunity to test the new Radeon Settings, but it's good that AMD is pushing for improvements there; they sorely need it.

Nvidia certainly spends more money and effort on the whole, but a lot of it goes to proprietary stuff like 3D Vision, PhysX, G-Sync and CUDA which ultimately end up meaning very little for the average gamer. G-Sync is a great innovation for sure, but it doesn't come for free, and you could put that money into a faster GPU instead and not lock yourself into buying their GPUs in the future. If you're just buying a card for Windows gaming, the differences aren't that big.

The vast majority of gamers unfortunately don't have a clue about the actual differences between GPUs, and can barely utter the model number of the card they are using. So mindshare is the thing AMD needs, and that only comes from bringing an ass-whooping GPU to the market.

I'd say a good part of Nvidia's current success is based largely on their wins back with the 8800 series. Ever since, they've been the premium option and ATI/AMD the value option. I always questioned why ATI went with the "small die strategy"; it was a recipe for failure. I think it was back during the HD 4000 series era that they made that comment about the best way to lose a fight being not to show up. Well, they've not shown up for a lot of fights over these years, and it clearly shows.

I'd put most of the blame on the AMD/ATI merger, which seemed to be very problematic, and you can see the divide to this day with RTG now officially its own thing. AMD is a company with a long history of mismanagement, but one would hope, for the sake of the market, that the new management can get their act together. As a funny anecdote for those who aren't aware: AMD initially wanted to buy Nvidia before ATI, but Jen-Hsun Huang would only have accepted if he had become the CEO of AMD. I wonder what the company would look like these days if that deal had happened.
 
The Nvidia control panel is a piece of shit. Luckily, Nvidia Inspector exists and is an absolutely fantastic, powerful, and fast tool, so you basically never need to open the control panel ;)
 
The Nvidia control panel is a piece of shit. Luckily, Nvidia Inspector exists and is an absolutely fantastic, powerful, and fast tool, so you basically never need to open the control panel ;)

The thing I'm referring to isn't exactly the Control Panel, but something else that gets installed with Quadros. Can't remember the name, but it gives you a lot more options.

That said, Inspector indeed is a more than viable alternative.
 
I have been using Nvidia tech since my GeForce MX 420. Somehow, Nvidia has had me in their grasp since long ago...
 
I've basically become a single-issue GPU buyer ever since upgrading to a G-Sync monitor. As long as G-Sync outperforms Freesync, I'll keep buying Nvidia.
 
1. Manufacture the most powerful GPUs
2. Sell them for twice the production cost
3. ???
4. Profit

Nvidia is the Maybach or Porsche of the gaming world; their business works so well because they have no real competition in the high-end market and can thus benefit from the sheer gluttony that is modern PC gaming.
 