
How AMD is Going to Screw Nvidia

Lol people posting current Benchmarks from DX11 games etc.

I'm still watching, but I can say I hope so. Nvidia's last straw for me was their refusal to support Adaptive Sync in order to create a closed system that essentially forces people to keep buying Nvidia GPUs if they have a G-Sync monitor.
 
I really hate the type of conversation (and I use that term liberally) sparked by such YouTube videos, in pretty much every context. Information is best conveyed in text and illustrations; polemic is best conveyed in video form.

Attributing the DX12 lead on AMD hardware to console optimisation is highly premature - Quantum Break, the example he keeps coming back to, is... really questionable. The game is fucked on PC, big time. Gears of War: UE is an Xbox One (AMD) remake of an X360 (AMD) game running on DX12, with no AMD advantage - in fact on some AMD cards it seems brokenish, and in a benchmark I'm looking at now, the 980 Ti outperforms the Fury X at 1080p, then demolishes it at 4K.
Yeah, if that is the shining example (it's not, it's just a bad port) then AMD didn't "screw" Nvidia, they screwed PC gamers.
 
Am I the only one who thinks this sounds pretty realistic?

Big-die GPUs are increasingly unsustainable, and AMD has a big advantage in stacking chips.

if you watch both videos, it is pretty realistic and well informed... if you watch only the first few minutes of the 2nd video and then skip, you would end up thinking it's fanboyish.
 
Did many commenters here watch the video? This isn't about their current performance. It's about how they'll develop the next generation of consoles (of which they are doing all 3) and how the technologies AMD has been developing over the last decade will make it difficult for Nvidia to stay relevant in this field. He then claims this is a big reason why Nvidia is more rapidly diversifying into other fields than gaming lately.
 
if you watch both videos, it is pretty realistic and well informed... if you watch only the first few minutes of the 2nd video and then skip, you would end up thinking it's fanboyish.

It's not realistic. AMD's advantages from DX12 and console development are not gamechanging, and Nvidia is not going to sit and watch as AMD tries to shift the market toward a different sort of GPU, and then be completely shocked and surprised when games perform worse on their hardware.

Especially if it's a reasonable and fiscally sensible shift, Nvidia would make the move too (though I question whether it would work for AMD at all). They can certainly afford to do anything AMD can at this stage. Then where is the benefit for AMD? They get a few frames of improvement from being the console GPU? That's the same place they are now.
 
It's not realistic. AMD's advantages from DX12 and console development are not gamechanging, and Nvidia is not going to sit and watch as AMD tries to shift the market toward a different sort of GPU, and then be completely shocked and surprised when games perform worse on their hardware.

Especially if it's a reasonable and fiscally sensible shift, Nvidia would make the move too (though I question whether it would work for AMD at all). They can certainly afford to do anything AMD can at this stage. Then where is the benefit for AMD? They get a few frames of improvement from being the console GPU? That's the same place they are now.

He mentions at the end of the video that ultimately it's about mindshare and CPUs.

Paraphrasing here:

"There isn't a lot of money in GPUs. AMD is coming for Intel, much less for NVIDIA. NVIDIA wants out of gaming as fast as they can. They want Intel's billions, and they're going to get it!"
 
He mentions at the end of the video that ultimately it's about mindshare and CPUs.

Paraphrasing here:

"There isn't a lot of money in GPUs. AMD is coming for Intel, much less for NVIDIA. NVIDIA wants out of gaming as fast as they can. They want Intel's billions, and they're going to get it!"

I didn't watch the last few minutes, but that's even more ridiculous because Intel has their back against the wall as it is. See: their focus being entirely on performance-per-watt and low-power x86, and their recent layoffs. The money in CPUs is arguably more questionable than the money in GPUs.

I also don't think Nvidia is desperately trying to get out of gaming; dipping their toes into other potentially lucrative markets is simply good business.
 
I didn't watch the last few minutes, but that's even more ridiculous because Intel has their back against the wall as it is. See: their focus being entirely on performance-per-watt and low-power x86, and their recent layoffs. The money in CPUs is arguably more questionable than the money in GPUs.
Funnily enough, of the 3 companies being discussed (AMD, Nvidia and Intel), only the "screwed" one is showing no signs at all of fiscal issues.
 
I do wish things were more competitive. I'd love to build an all AMD machine if they can show me nice things. Time will tell, but for now I'll stick with the trusty NVIDIA/Intel builds.
 
AMD's long-term strategy paying off would be a great thing to see.

That would imply they had a long-term strategy. Remember "Bulldozer"... "Mantle"... "Hawaii"? They have "Nvidia killers" on a regular basis. I'd love to see them succeed though. I won't be buying another AMD GPU ever again, but my 2500K will die at some point.
 
you've just proven my point. these days a 770 competes with a 7870 when it comes to performance, while a Titan competes with a 7970. Kepler is garbage in today's titles.

You're like an ostrich burying your head in the sand. Alright, here's a new one that compares the exact segment you're talking about. The 380X is an HD 7970 GHz Edition for all intents and purposes. In Battlefield 4 at 1080p we get this result:

[bf4_1920_1080.png: Battlefield 4 benchmark chart, 1920×1080]

The GTX 770 handily outperforms it. We can then extrapolate that the GTX Titan, which is a beefed-up GTX 780, probably performs about where the R9 290 is (EXACTLY as I said it did above). The only instances where Kepler cannot hold its own are when Gameworks takes advantage of certain functionality on a hardware level that Kepler doesn't have--and even then it's usually only the GTX 770 that suffers; the GTX 780 and 780 Ti (as well as the Titan) handle it just fine. The only games where I can imagine Kepler dropping an entire tier to mid-level GCN GPUs are something like a GTX 680 in Ashes of the Singularity (a terrible benchmark for multiple reasons) or something along those lines.

Funnily enough, of the 3 companies being discussed (AMD, Nvidia and Intel), only the "screwed" one is showing no signs at all of fiscal issues.

A lot of people don't realize that discrete GPU sales are a single-digit percentage of what these companies sell. Nvidia has a stranglehold on the laptop GPU market and markets directly to gamers much better than AMD does--during ESL One Manila this weekend I saw exactly 0 AMD ads compared to dozens of ads for Nvidia products, as well as indirect Nvidia advertising from companies like Newegg.
 
so AMD are gonna get their CPUs/GPUs into consoles, which is a huge market, and that way games will be developed for them, so if you get an AMD CPU/GPU it will outperform an Nvidia GPU or Intel CPU

i'm all for it. nvidia and intel need competition. a strong and healthy AMD can only be good for everyone.
 
You're like an ostrich burying your head in the sand. Alright, here's a new one that compares the exact segment you're talking about. The 380X is an HD 7970 GHz Edition for all intents and purposes. In Battlefield 4 at 1080p we get this result:



The GTX 770 handily outperforms it. We can then extrapolate that the GTX Titan, which is a beefed-up GTX 780, probably performs about where the R9 290 is (EXACTLY as I said it did above). The only instances where Kepler cannot hold its own are when Gameworks takes advantage of certain functionality on a hardware level that Kepler doesn't have--and even then it's usually only the GTX 770 that suffers; the GTX 780 and 780 Ti (as well as the Titan) handle it just fine. The only games where I can imagine Kepler dropping an entire tier to mid-level GCN GPUs are something like a GTX 680 in Ashes of the Singularity (a terrible benchmark for multiple reasons) or something along those lines.



A lot of people don't realize that discrete GPU sales are a single-digit percentage of what these companies sell. Nvidia has a stranglehold on the laptop GPU market and markets directly to gamers much better than AMD does--during ESL One Manila this weekend I saw exactly 0 AMD ads compared to dozens of ads for Nvidia products, as well as indirect Nvidia advertising from companies like Newegg.

are you seriously this daft? why are you linking benchmarks for a cross gen launch title from 2013?

black ops 3
far cry 4
far cry primal
division
battlefront
evolve
hitman
NFS 2016
killer instinct
quantum break
gears UE
rainbow 6 siege
advanced warfare
gta v
ryse
mirrors edge catalyst
just cause 3

go check Kepler performance in those games, then come back and tell me it performs just fine in today's games
 
I guess this would be a thing if one assumed that asynchronous compute and the hardware specifics of GCN are the be-all end-all of GPU performance (they are decidedly not), and that such feature sets will never be integrated in NV GPUs.

Being part of the DX12 superset spec, I imagine Async Compute will eventually show its face on NV and Intel GPUs... just like how the other DX12 specs missing on AMD GPUs will eventually show up there as well.

The video is really overreaching with its conclusions.
 
Salty AMD users never cease to amuse me.

It doesn't help when people are coming in and posting benchmarks from games and cards that are largely irrelevant to the topic.

I guess this would be a thing if one assumed that asynchronous compute and the hardware specifics of GCN are the be-all end-all of GPU performance (they are decidedly not), and that such feature sets will never be integrated in NV GPUs.

Being part of the DX12 superset spec, I imagine Async Compute will eventually show its face on NV and Intel GPUs... just like how the other DX12 specs missing on AMD GPUs will eventually show up there as well.

The video is really overreaching with its conclusions.

Agreed, assuming the market leaders won't adapt to compete is silly. But the point is, if AMD gets there first and can show serious performance gains, or at least compare favorably with Nvidia performance-wise at a lower cost, then it could make all the difference for AMD. The video is definitely overstating things though.
 
are you seriously this daft? why are you linking benchmarks for a cross gen launch title from 2013?

black ops 3
far cry 4
far cry primal
division
battlefront
evolve
hitman
NFS 2016
killer instinct
quantum break
gears UE
rainbow 6 siege
advanced warfare
gta v
ryse
mirrors edge catalyst
just cause 3

go check Kepler performance in those games, then come back and tell me it performs just fine in today's games

I haven't checked those benchmarks yet, but considering at least 5 of the games you listed are considered meh-to-terrible port jobs (lol Quantum Break XD), I'm not sure including such games in a list like this helps your point.
 
It's such a shame that I'm stuck with Nvidia now because of a G-sync monitor and because AMD does not have an alternative to GameStream (like PS4 remote play on a PC).
 
I haven't checked those benchmarks yet, but considering at least 5 of the games you listed are considered meh-to-terrible port jobs (lol Quantum Break XD), I'm not sure including such games in a list like this helps your point.

so what? the performance is what it is. those are all graphically high-end games. they are relevant
 
It's such a shame that I'm stuck with Nvidia now because of a G-sync monitor and because AMD does not have an alternative to GameStream (like PS4 remote play on a PC).

That's why I jumped ship and grabbed a 144Hz Asus MG279Q (same panel as the Acer Predator). Got it for $400 on sale, as opposed to $700+ for the Acer XB271HU. I figure either Nvidia will give in and support adaptive sync in addition to G-Sync as a low-cost option, or I'll switch to AMD come Polaris.

As for game streaming, there are options, what device are you using it for primarily?

so what? the performance is what it is. those are all graphically high-end games. they are relevant

Not just that, those are titles from industry leaders, which means they could be construed as a precursor considering how recent those games are.
 
totally missing the point of the video most of you are.



Also, by supporting one brand you are not supporting the industry that drives your hobby.



80% market share will ensure everyone gets less for their money. Look at the state of the CPU industry.
 
I want Nvidia to have some good competition... that said, the last wave of consoles did not really improve AMD's situation at all. How is this next batch going to be any different?
 
We had this thread years ago; it started as a damn joke and it seems people forgot it. Guess Nvidia is salty again because they got no console contracts.
 
totally missing the point of the video most of you are.



Also, by supporting one brand you are not supporting the industry that drives your hobby.



80% market share will ensure everyone gets less for their money. Look at the state of the CPU industry.

Not gonna throw $600 for an inferior product just because I want to support a corporation. That's a bad way of making buying decisions.
 
Not gonna throw $600 for an inferior product just because I want to support a corporation. That's a bad way of making buying decisions.

If it's only minorly inferior (and I mean that; if there's a substantial difference I would still buy an Nvidia card, as I did with the 970 when it came out years ago), then considering Nvidia's direction toward a walled garden and AMD's approach of embracing and pushing technologies that benefit all consumers, I'll take AMD every time.
 
totally missing the point of the video most of you are.



Also, by supporting one brand you are not supporting the industry that drives your hobby.



80% market share will ensure everyone gets less for their money. Look at the state of the CPU industry.

You're proposing buying an actual worse quality GPU so that we may not have to buy an eventually worse quality GPU.

Makes sense.
 
are you seriously this daft? why are you linking benchmarks for a cross gen launch title from 2013?

black ops 3
far cry 4
far cry primal
division
battlefront
evolve
hitman
NFS 2016
killer instinct
quantum break
gears UE
rainbow 6 siege
advanced warfare
gta v
ryse
mirrors edge catalyst
just cause 3

go check Kepler performance in those games, then come back and tell me it performs just fine in today's games

Ryse
Far Cry 4
GTAV (770 does poorly but the 780 does fine and the 780 Ti actually beats all but the 390X)
CoD: Advanced Warfare
Far Cry Primal (Only 780 Ti represented but still a Kepler card, and the 780 doesn't do much worse in any benchmark)
Ashes of Singularity (780 Ti still holds its own against even the 970)
The Witcher 3 (780 Ti is on par for R9 390 despite having limited support for Gameworks)
The Division (Another Gameworks title that intentionally cripples Kepler)
Just Cause 3
Battlefront
Evolve
Dark Souls 3

In all of these the 770, 780, and especially the 780 Ti hold their own--which is impressive for cards that are a few years old at this point. And to answer your question, I chose BF4 because it's a well-rounded game with minimal bias (despite supporting Mantle and being sponsored by AMD) and is built on a great engine that is well optimized (it uses up to 8 threads, assuming developers allow it). The same engine that Battlefront uses, as well as about a dozen other EA titles released since 2013. I didn't bother looking too hard for Killer Instinct or GoW UE because they are both universally known to be bad ports with huge performance issues.
 
My most recent GPU is a GTX 980Ti, bought about 7 months ago. Yeah, I'm like a lot of PC gamers in that I don't even consider AMD when looking at a new GPU... but I think I've seen the light. That is going to change when I upgrade later this year or in 2017. I've already seen the early signs of it in many DX12 games. I think this guy is right.

Not saying I will go AMD, but they are back on the table for sure.
 
Buy two equally priced video cards today--one team red, the other team green. You'll get more value today going red, and you'll also find the performance delta widen over time in red's favour.

Why is this?

Because people will pay a premium to get Team Green
 
If it's only minorly inferior (and I mean that; if there's a substantial difference I would still buy an Nvidia card, as I did with the 970 when it came out years ago), then considering Nvidia's direction toward a walled garden and AMD's approach of embracing and pushing technologies that benefit all consumers, I'll take AMD every time.

AMD has to do that because they are consistently playing catch-up. Nvidia beat them to the punch on G-Sync, CUDA, and Gameworks. All of these are pretty successful for Nvidia, while AMD's offerings are either non-existent or completely overlooked. There's also the fact that Nvidia got ahead of the streaming trend with Shadowplay, and had it on people's computers like a year and a half before AMD had similar software.

My most recent GPU is a GTX 980Ti, bought about 7 months ago. Yeah, I'm like a lot of PC gamers in that I don't even consider AMD when looking at a new GPU... but I think I've seen the light. That is going to change when I upgrade later this year or in 2017. I've already seen the early signs of it in many DX12 games. I think this guy is right.

We'll know come Computex. That is when Nvidia reveals Pascal and AMD reveals Polaris 10. Depending on how both look we'll have a better idea of how this generation will play out.
 
Ryse
Far Cry 4
GTAV (770 does poorly but the 780 does fine and the 780 Ti actually beats all but the 390X)
CoD: Advanced Warfare
Far Cry Primal (Only 780 Ti represented but still a Kepler card, and the 780 doesn't do much worse in any benchmark)
Ashes of Singularity (780 Ti still holds its own against even the 970)
The Witcher 3 (780 Ti is on par for R9 390 despite having limited support for Gameworks)
The Division (Another Gameworks title that intentionally cripples Kepler)
Just Cause 3
Battlefront
Evolve
Dark Souls 3

In all of these the 770, 780, and especially the 780 Ti hold their own--which is impressive for cards that are a few years old at this point. And to answer your question, I chose BF4 because it's a well-rounded game with minimal bias (despite supporting Mantle and being sponsored by AMD) and is built on a great engine that is well optimized (it uses up to 8 threads, assuming developers allow it). The same engine that Battlefront uses, as well as about a dozen other EA titles released since 2013. I didn't bother looking too hard for Killer Instinct or GoW UE because they are both universally known to be bad ports with huge performance issues.

no idea why you linked The Witcher or DS3; never claimed those were bad for Nvidia. the other links you provided show how poorly Kepler cards perform. do some more scouring and you'll find that results can be much worse depending on the test scene
 
AMD has to do that because they are consistently playing catch-up. Nvidia beat them to the punch on G-Sync, CUDA, and Gameworks. All of these are pretty successful for Nvidia, while AMD's offerings are either non-existent or completely overlooked. There's also the fact that Nvidia got ahead of the streaming trend with Shadowplay, and had it on people's computers like a year and a half before AMD had similar software.



We'll know come Computex. That is when Nvidia reveals Pascal and AMD reveals Polaris 10. Depending on how both look we'll have a better idea of how this generation will play out.

OBS for streaming or Dxtory for recording wins that one, though. Shadowplay is okay, but pretty much everyone who wants that kind of functionality goes for the other offerings. It's a neat but relatively unused feature.

Wait, did the mod change the title, or was that always the title? What a stupid title; it was begging for arguments.
 