Nvidia CEO Jensen Huang Trashes AMD's New GPU Radeon VII: 'The Performance Is Lousy'

IbizaPocholo

NeoGAFs Kent Brockman
#1
https://gizmodo.com/nvidia-ceo-trashes-amds-new-gpu-the-performance-is-lou-1831621038

Yesterday, AMD announced a new graphics card, the $700 Radeon VII, based on its second-generation Vega architecture. The GPU is the first one available to consumers based on the 7nm process. Smaller processes tend to be faster and more energy efficient, which means it could theoretically be faster than GPUs with larger processes, like the first generation Vega GPU (14nm) or Nvidia’s RTX 20-series (12nm).

I say “could,” because so far Nvidia’s RTX 20-series has been speedy in our benchmarks. Nvidia cards from the $1,000+ 2080 Ti down to the $350 2060 announced Sunday support ray tracing. This complex technology allows you to trace a point of light from a source to a surface in a digital environment. What it means in practice is video games with hyperrealistic reflections and shadows.
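The "trace a point of light from a source to a surface" step the article describes boils down to simple vector math. As a purely illustrative sketch (not how RTX hardware is actually programmed), this is the classic reflection formula a ray tracer applies at each bounce:

```python
# Minimal sketch of the core ray-tracing step: a ray hits a surface and
# its reflection direction is computed. Purely illustrative -- real ray
# tracers do this for millions of rays per frame against full scene geometry.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(direction, normal):
    """Reflect an incoming ray direction about a surface normal: r = d - 2(d.n)n."""
    d_dot_n = dot(direction, normal)
    return tuple(d - 2 * d_dot_n * n for d, n in zip(direction, normal))

# A ray heading down-right hits a floor facing straight up...
incoming = (1.0, -1.0, 0.0)
floor_normal = (0.0, 1.0, 0.0)
print(reflect(incoming, floor_normal))  # ...and bounces up-right: (1.0, 1.0, 0.0)
```

Chaining this bounce through the scene, and shooting extra rays toward lights, is what produces the hyperrealistic reflections and shadows mentioned above.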

It’s impressive technology, and Nvidia has touted it as the primary reason to upgrade from previous generation GPUs. AMD’s GPUs, notably, do not support it. And at a round table Gizmodo attended with Nvidia CEO Jensen Huang, he jokingly dismissed AMD’s Tuesday announcement, claiming the announcement itself was “underwhelming” and that his company’s 2080 would “crush” the Radeon VII in benchmarks. “The performance is lousy,” he said of the rival product.

When asked to comment about these slights, AMD CEO Lisa Su told a collection of reporters, “I would probably suggest he hasn’t seen it.” When pressed about his comments, especially his touting of ray tracing, she said, “I’m not gonna get into it tit for tat, that’s just not my style.”

But boy was it Huang’s. Over the hour-long conversation, Huang repeatedly joked about his GPU competitor. When someone brought up Intel’s new focus on graphics, he joked about how Intel’s graphics team was just AMD’s, and he wasn’t sure what AMD even had (he did go on to express his sincere respect for Intel). Then, when asked about Nvidia’s decision to adopt support for Adaptive Sync monitors, a kind of variable refresh technology that quickly updates the image on a monitor to allow for smoother gameplay, he said, “We invented the whole area.”

“The truth is most of the FreeSync monitors do not work,” Huang said. “They don’t even work within AMD’s graphics cards because nobody tested it. And we think that is a terrible idea to let a customer buy something believing the promise of that product and have it not work.”

AMD’s CEO, Lisa Su, denied this allegation and noted the more stringent guidelines of FreeSync 2. She also admitted she was totally fine with Nvidia adopting support. “We believe in open standards,” she said. “We believe in an open ecosystem. That’s been a mantra. So we have no issue with our competitors adopting FreeSync.”

With regards to ray tracing, Su said the technology was too young to be recognizable. “[F]or us it’s, you know, what is the consumer going to see? The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready. I think by the time we talk more about ray tracing the consumer’s gonna see the benefit.”
 
#3
Nvidia cards from the $1,000+ 2080 Ti down to the $350 2060 announced Sunday support ray tracing. This complex technology allows you to trace a point of light from a source to a surface in a digital environment. What it means in practice is video games with hyperrealistic reflections and shadows.
<snip>
It’s impressive technology, and Nvidia has touted it as the primary reason to upgrade from previous generation GPUs.
<snip>
With regards to ray tracing, Su said the technology was too young to be recognizable. “[F]or us it’s, you know, what is the consumer going to see? The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready. I think by the time we talk more about ray tracing the consumer’s gonna see the benefit.”
What games support ray tracing currently, or will in the near future? What is required to run it, and will it come to next-gen consoles in large numbers, similar to 4K/HDR?

I know it's impressive, but I thought you need quite a beast to get it done properly.

I'm sure someone with technical know-how can comment.
 
#5
Guess that means the 2080 and everything below it are lousy too? I have an Nvidia card atm, since I upgraded last year when I got lucky and found a 1080 (the Zotac mini, very nice implementation) for a non-inflated price, but come on, lol. I had a Radeon before and see no reason not to have one in the future too, if it seems like the best deal for my price range at the time, whether I go high end or lower than that.
 

Chittagong

Gold Member
#6
It would be fun if CEOs did more of this. Can you imagine?

“Samsung phones are plasticky and ugly” - Tim Cook, Apple

“God of War is a boring checkpoint quest like all the Assassin's Creeds and Red Deads of the world” - Shuntaro Furukawa, Nintendo

“We didn’t even bother with a campaign, and yet still Call of Duty Black Ops IIII crushed Battlefield V.” - Bobby Kotick, Activision
 
#7
What games support ray tracing currently, or will in the near future? What is required to run it, and will it come to next-gen consoles in large numbers, similar to 4K/HDR?

I know it's impressive, but I thought you need quite a beast to get it done properly.

I'm sure someone with technical know-how can comment.
It's a feature that was rushed so Nvidia could justify meager hardware improvements with an increased price. DICE even had to lower the quality just so people could get decent performance.
 
#8
Show us what you've got then, dude.

I really want a new gen just so that prices of current... hell, even older GPUs go down 8(
 
#11
Ahhh, team green, how I want them to fall hard flat on their faces xD

Can't wait for Intel to piss in their cereal
 
#12
Well, the problem Huang has is that Nvidia is losing market share; I think he is just feeling a bit of pressure, that's all. Yes, Nvidia dominates the high-end graphics card SKUs, but his problem is that the majority of PC gamers have not got high-end graphics cards. The AMD 580 and Vega 64 are good cards as far as price vs. performance goes. We all know the 2080 Ti is the best graphics card, but it's £300-500 too expensive.

So Huang is being a dick because he knows he messed up big time with the price of the RTX line.
 

Makariel

Gold Member
#13
Can't take these statements seriously. How do they know it doesn't keep up with the 2080? It hasn't released yet, and they are just bitter that they don't get paid for each FreeSync monitor like they are for G-Sync ones.
 
#14
What games support ray tracing currently, or will in the near future? What is required to run it, and will it come to next-gen consoles in large numbers, similar to 4K/HDR?

I know it's impressive, but I thought you need quite a beast to get it done properly.

I'm sure someone with technical know-how can comment.
We will see more games support it when AMD decides to implement it, because then it will be a standard part of DX12 and Vulkan and not some proprietary Nvidia BS.
 

llien

Gold Member
#19
Meh at "CEO of the company known for doing tricks as dirty as it gets bashes competing product".

Vega7 is Vega64 minus 4 CUs, with twice the ROPs and RAM, shrunk to 7nm.
With 2/3rds of AMD's GPU R&D doing Navi, beating Nvidia's 530mm² chip with a 330-ish mm² chip (yeah, it's 7nm vs 12nm, I remember) ain't bad at all.

AMD recovered from a 50%+ (!!!) IPC deficit against Intel AND beat them at perf/watt; I have no doubt their GPU department can do well, if funded.

The FreeSync part is hilarious. Dude simply doesn't know how to lose without losing face.


They did pick kinda obscure games to do a comparison with the RTX 2080.
They published results for 25 games (from old DX11 to new DX11, DX12, and Vulkan), though.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
#20
Meh at "CEO of the company known for doing tricks as dirty as it gets bashes competing product".

Vega7 is Vega64 minus 4 CUs, with twice the ROPs and RAM, shrunk to 7nm.
With 2/3rds of AMD's GPU R&D doing Navi, beating Nvidia's 530mm² chip with a 330-ish mm² chip (yeah, it's 7nm vs 12nm, I remember) ain't bad at all.

AMD recovered from a 50%+ (!!!) IPC deficit against Intel AND beat them at perf/watt; I have no doubt their GPU department can do well, if funded.

The FreeSync part is hilarious. Dude simply doesn't know how to lose without losing face.



They published results for 25 games (from old DX11 to new DX11, DX12, and Vulkan), though.
Like I said, he should be thanking AMD. They enabled him to keep his card prices sky high.
 
#22
I guess you're allowed a bit of arrogance when you've been at the top for so long. Would love to see AMD put out better stuff, but it feels like they put a lot of effort into products that lose out to Nvidia's top cards.

But I also feel like AMD kills it in the mobile/console realm soooo
 
#24
It would be fun if CEOs did more of this. Can you imagine?

“Samsung phones are plasticky and ugly” - Tim Cook, Apple

“God of War is a boring checkpoint quest like all the Assassin's Creeds and Red Deads of the world” - Shuntaro Furukawa, Nintendo

“We didn’t even bother with a campaign, and yet still Call of Duty Black Ops IIII crushed Battlefield V.” - Bobby Kotick, Activision
So....you want Aaron Greenberg as a CEO then?
 
#26
Like I said, he should be thanking AMD. They enabled him to keep his card prices sky high.
Yeah, everyone was expecting AMD to push a mid-range card with a reasonable price; instead we got the Radeon 7 with an overkill 16GB at $699.

And though I don't agree with the way he said it, he is right: if you can find an RTX 2080 around the same price as the Radeon 7, I think the extra technology (ray tracing, DLSS, etc.) makes the RTX 2080 a better investment.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
#27
Yeah, everyone was expecting AMD to push a mid-range card with a reasonable price; instead we got the Radeon 7 with an overkill 16GB at $699.

And though I don't agree with the way he said it, he is right: if you can find an RTX 2080 around the same price as the Radeon 7, I think the extra technology (ray tracing, DLSS, etc.) makes the RTX 2080 a better investment.
It really depends on how well Vega 2 performs. If it's on par with the 2080, it will do fine.

Perception, though, is what hurts AMD. The Vega 56 and 64 were actually pretty good cards that could trade blows with the 1070 and 1080, but you'd never know it based on people's attitudes toward Vega.
 
#28
It really depends on how well Vega 2 performs. If it's on par with the 2080 it will do fine.

Perception though is what hurts AMD. The Vega 56 and 64 were actually pretty good cards that could trade blows with the 1070 and 1080, but you'd never know it based on people's attitudes towards the Vega.
But if it's on par on performance, then why not get an RTX 2080?

The Radeon 7 also has a higher TDP than the RTX 2080.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
#29
But if it's on par on performance, then why not get an RTX 2080?

The Radeon 7 also has a higher TDP than the RTX 2080.
Because it's $100 cheaper, for starters, although the 2080 has been on sale. $100 cheaper does matter.

TDP only seems to matter when it's being used to knock AMD, and I am puzzled by it. If the 2080 had the higher TDP, nobody would bat an eye.
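For what it's worth, a TDP gap is easy to put in dollar terms. A quick sketch, where the ~85W difference (roughly 300W Radeon VII vs 215W RTX 2080 board power) and the usage and electricity figures are assumptions for illustration:

```python
# Rough sketch of what a TDP gap actually costs in electricity.
# The wattage, hours-per-day, and price-per-kWh are illustrative assumptions.

def yearly_power_cost(extra_watts, hours_per_day, price_per_kwh):
    # convert watts to kWh over a year of daily use, then price it
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cost = yearly_power_cost(extra_watts=85, hours_per_day=4, price_per_kwh=0.12)
print(f"~${cost:.2f} per year")  # roughly $15/year at these assumptions
```

At a few hours of gaming a day the power difference amounts to maybe $15 a year, which puts the TDP objection in perspective next to a $100 price gap.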
 
#31
Because it's $100 cheaper, for starters, although the 2080 has been on sale. That does matter.
My argument is based on both being in the same price range.


TDP only seems to matter when it's being used to knock AMD, and I am puzzled by it. If the 2080 had the higher TDP, nobody would bat an eye.
I don't know about that, but I'm just saying that with both at the same price and similar performance, the RTX is a better deal due to now supporting FreeSync, having ray tracing and DLSS, along with a lower TDP.

I honestly don't know if the Radeon 7's 16GB is worth more than all that.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
#33
My argument is based on both being in the same price range.
For the record, I am not necessarily disagreeing with you, but I don't think the argument about better features really helps the RTX 2080 the way many think it does; the performance penalty for RT is significant. Even the 2080 Ti can barely muster playable framerates at 1440p in Battlefield V, a game that is designed to run smoothly for the ultimate multiplayer experience.

Also, don't discount the 16GB of HBM. I see many gamers arguing that 8GB is overkill right now for gaming, but RT benchmarks show that we may approach that limit sooner rather than later.

https://www.hardocp.com/article/201...nvidia_ray_tracing_rtx_2080_ti_performance/11

VRAM capacity on your video card is extremely important, and it relates to the DX12 API, NVIDIA Ray Tracing, and DXR in Battlefield V. Starting with the default DX11 API and NO DXR, VRAM utilization is not outrageous for BFV. We have to understand it is going to demand its fair share of VRAM because of the nature of the game and its demand on performance. It does exceed 4GB of VRAM at 1080p and 1440p, but nowhere near 8GB. At 4K it reaches 6GB of VRAM utilization. As far as DX11 NO DXR performance goes, an 8GB video card is fine at 4K and 1440p, and a 6GB card will do fine at 1080p in BFV.



The problem is when we enable DX12 on these RTX video cards in BFV, VRAM capacity utilization increases greatly. It makes it so that we are nearly hitting 8GB of VRAM usage at 1080p and 1440p. At 4K it exceeds 8GB of VRAM usage. This doesn’t leave a lot of headroom to enable DXR on 8GB video cards.


When we enable even the lowest "LOW" DXR option 8GB of VRAM is greatly exceeded at 1440p, and almost hits the capacity at 1080p. 4K is out of the question.


When we enable "ULTRA" DXR we have now exceeded 8GB of VRAM even at just 1080p! This renders 8GB video cards as a bottleneck in BFV. 8GB is just not enough VRAM for Ultra DXR in BFV even at 1080p.

At the playable setting we found, it seems 8GB is a minimum for 1080p, while 11GB is a minimum for 1440p, or at least over 8GB, since "LOW" and "MEDIUM" DXR are playable at 1440p. An 8GB video card isn't enough.

At 4K there is no choice, even an 11GB video card is bottlenecked. Now we see the reason for the new TITAN RTX with 24GB of VRAM.

What is the result of bottlenecking VRAM in BFV? Stuttering, choppiness, pausing in gameplay, jerkiness, framerate skipping, noticeable periods of time going by as new art assets are loaded and swapped out of VRAM while playing. Moving from area to area in a map means the loading and re-loading of textures and assets, and there is lag when this happens, and it happens often when hitting that VRAM capacity cap. It can also cause inconsistent framerate and frametime, and wild swings in framerate as we have experienced. It does not make for a good gaming experience.
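To get an intuitive feel for why resolution and extra buffers eat VRAM the way the [H]ardOCP numbers above show, here's a back-of-envelope sketch. The buffer counts and pixel formats are hypothetical illustrations, not BFV's actual render targets:

```python
# Back-of-envelope sketch of why resolution drives VRAM use.
# Buffer count and bytes-per-pixel below are hypothetical, for illustration only.

def buffer_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 ** 2)

def frame_buffers_mb(width, height, num_targets=6, bytes_per_pixel=8):
    # e.g. several 16-bit-per-channel RGBA render targets (G-buffer,
    # history buffers for denoising, etc.) -- an assumed layout
    return num_targets * buffer_mb(width, height, bytes_per_pixel)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{frame_buffers_mb(w, h):.0f} MB in render targets alone")
```

Render targets scale with pixel count, so 4K costs exactly 4x what 1080p does; textures and (with DXR enabled) ray-tracing acceleration structures take the rest of the budget, which is how a game blows past 8GB even at 1080p.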
 
#36
You can get 2080's for 650€ right now in Germany, so the VII will have to compete with that. If all they can muster up as marketing benchmarks is equal performance in two AMD-favoring titles and one win in some random Vulkan title, you can bet your ass they'll be losing more than winning in most games against the 2080 once the reviews are out. And that's without RTX/DLSS and with worse power consumption. The extra VRAM this card has is unlikely to truly matter, as once games use more, this card won't be good enough for those settings anyway.

The truth of the matter is, this card exists for a couple of reasons. AMD needed a test chip for 7nm, which was just Vega 64 with DP support and more HBM for the HPC market. Since Nvidia launched the 2080 at such a ridiculous price, it left AMD the opportunity to bring this chip to the gaming market too. These chips seem to be just Instinct rejects with fewer CUs. There's not even an option to halve the memory to make a cheaper version, since halving the chips would also halve the bandwidth.

In all likelihood the first Navi chip won't even reach these performance levels, but will be more like 2060/70-class for less money, so this is likely all that AMD has to offer for the high end until 2020. It's the only response they can give, and it doesn't impress. Architecture-wise and feature-wise it's already obsolete. It likely won't have HDMI 2.1 support, or stuff like variable rate shading for next-gen VR, a VirtualLink connector, etc. RTX and DLSS might not be worth the price hikes right now, but the VII is offering even less, so gamers are right in questioning its worth. The only people excited seem to be those who are looking for a cheap FP64 compute card.

In the end both companies are offering crud to consumers. RT cores and tensor cores aren't really there for gamers; RTX and DLSS are just an afterthought to justify to gamers why Nvidia is wasting tons of die space on what are features for the enterprise space (machine learning and professional ray-tracing applications). They just don't want to spend the hundreds of millions it costs to design separate chips for these markets. Same goes for AMD, who are just pushing old chips onto newer nodes and taking years to develop even a single actually new chip architecture with Navi. At least in their case you can understand it's because of their limited budget. Nvidia is making bank with its gaming chips and spending it on ventures outside gaming, as that's where their growth opportunities are. They have the PC gaming market by the balls, and it shows. Navi is the only real hope that there's actual competition happening this year in the GPU space, but AMD's track record hasn't exactly been great in the entire lifetime of the GCN architecture.
 
#37
For the record, I am not necessarily disagreeing with you, but I don't think the argument about better features really helps the RTX 2080 the way many think it does; the performance penalty for RT is significant. Even the 2080 Ti can barely muster playable framerates at 1440p in Battlefield V, a game that is designed to run smoothly for the ultimate multiplayer experience.

Also, don't discount the 16GB of HBM. I see many gamers arguing that 8GB is overkill right now for gaming, but RT benchmarks show that we may approach that limit sooner rather than later.

https://www.hardocp.com/article/201...nvidia_ray_tracing_rtx_2080_ti_performance/11
We have to note that all these tests with BF5 (as far as I know) are done without the DLSS patch, which launches on January 15th. According to Huang, with the DLSS patch there are barely any FPS drops.

Whether that is true I don't know, but I think it's worth waiting to see if ray tracing can actually work at reasonable performance when paired with DLSS.

As for the part about 16GB: maybe it would be really useful if the card had ray tracing, but since it does not, I don't think most games will reach 8GB, let alone 16GB, even at 4K.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
#39
We have to note that all these tests with BF5 (as far as I know) are done without the DLSS patch, which launches on January 15th. According to Huang, with the DLSS patch there are barely any FPS drops.

Whether that is true I don't know, but I think it's worth waiting to see if ray tracing can actually work at reasonable performance when paired with DLSS.

As for the part about 16GB: maybe it would be really useful if the card had ray tracing, but since it does not, I don't think most games will reach 8GB, let alone 16GB, even at 4K.
Side note: what disappoints me is that the 20 series will end up with a pretty short shelf life. The 2080 should have shipped with 12GB of memory and the 2080 Ti with at least 16GB.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
#40
You can get 2080's for 650€ right now in Germany, so VII will have to compete with that. If all they can muster up as marketing benchmarks is equal performance in two AMD-favoring titles and one win in some random Vulkan title, you can bet your ass they'll be losing more than winning in most games against the 2080 once the reviews are out.
I'll take that bet. I bet it will have its fair share of victories over the 2080.
 
#41
He’s not wrong. AMD’s newest flagship will be on par with (and launch at the same price as) Nvidia’s flagship from 2 full years ago. Hard to see that as anything but underwhelming.

I mean it’s not a TERRIBLE card. Might be a good buy if you’re interested in the free games or you’re a “prosumer” with non-gaming applications. But I can’t imagine many people are regretting their 1080 Ti or 2080 purchases right now.
 
#44
He’s not wrong. AMD’s newest flagship will be on par with (and launch at the same price as) Nvidia’s flagship from 2 full years ago. Hard to see that as anything but underwhelming.
That is not exactly true.
Going by the FPS figures AMD provided, the Vega 7 has the same performance as the 2080 for the same price.
The 2080 Ti is a $1,000+ GPU; who will buy it at such a price? A small minority.
The disappointing part here is that it is on 7nm and they didn't undercut Nvidia.

What pressure exactly? There is none.
There is a lot, actually.

The dude is angry because he thought AMD would also release something with ray tracing, so the rest of the gaming community would start to use it, but they didn't, and the whole tensor/RTX effort will be in the same position as their PhysX: a small minority of developers using it.

Then the dude is doubly angry because AMD cut them a new one on their G-Sync monitors, and Nvidia couldn't hold out against FreeSync anymore and was forced to adopt the standard.

The triple rage is coming when the new consoles are announced and AMD cooks up a new architecture for them that won't allow Nvidia to use its new fixed-function hardware, essentially cutting Nvidia out on the PC market too, like Nvidia did with GameWorks.
 
#45
There is a lot, actually.

The dude is angry because he thought AMD would also release something with ray tracing, so the rest of the gaming community would start to use it, but they didn't, and the whole tensor/RTX effort will be in the same position as their PhysX: a small minority of developers using it.

Then the dude is doubly angry because AMD cut them a new one on their G-Sync monitors, and Nvidia couldn't hold out against FreeSync anymore and was forced to adopt the standard.

The triple rage is coming when the new consoles are announced and AMD cooks up a new architecture for them that won't allow Nvidia to use its new fixed-function hardware, essentially cutting Nvidia out on the PC market too, like Nvidia did with GameWorks.
None of that is relevant to Nvidia's market at all.

- The RTX 2000 series is selling to spread ray tracing... whether AMD adopts it or not means nothing to them; on the contrary, their RTX will probably be the standard before AMD releases anything with ray tracing.

- They are still selling G-Sync monitors at a premium price, plus they are now making people with FreeSync monitors buy GeForce instead of AMD... win-win.

- The console market won't affect Nvidia's market.
 