
Former AMD Radeon boss says NVIDIA is the GPU cartel: Nvidia's AI customers are scared to be seen courting other AI chipmakers for fear of retaliatory shipment delays

Bernoulli

M2 slut



Context

Nvidia's AI customers are scared to be seen courting other AI chipmakers for fear of retaliatory shipment delays, says rival firm

A report published by the Wall Street Journal on Monday surfaces an accusation that Nvidia might be willing to delay data center GPU orders if it becomes aware of a customer looking for greener pastures. Thus, rival AI chipmaker Groq says that fearful customers are secretive about acquiring or designing AI acceleration technology for fear of retaliatory shipment delays. This stands in contrast to Nvidia's own statements on the matter, with the company saying that it is trying to allocate supply fairly and to offer customers alternative access to compute while they wait for their shipments.

It would be a pity if something were to happen to your pending GPU order...

Fear of being found out is so strong that it isn’t uncommon for people to deny they have had meetings with rival AI accelerator chip firms, Jonathan Ross, CEO of rival chip startup Groq, told the WSJ.

“A lot of people that we meet with say that if Nvidia were to hear that we were meeting, they would disavow it,” Ross told the WSJ. “The problem is you have to pay Nvidia a year in advance, and you may get your hardware in a year, or it may take longer, and it’s, ‘Aw shucks, you’re buying from someone else, and I guess it’s going to take a little longer.’”

If true, this revelation has echoes of Intel using strong-arm tactics in the early noughties to disincentivize PC-making partners from offering AMD-inside systems. Elsewhere in the WSJ report, it's made clear that Nvidia is enjoying its dominance of the booming AI market, with an estimated 80% of companies using AI acceleration reliant upon the green team's hardware.
 
Now imagine the power of Microsoft with the custom ARM chips they will start producing :messenger_tears_of_joy: Vendor lock-in with Azure, AI, Windows, Office, and then home-made ARM 🔥
 
Nvidia right now to their AI customers:

Marlon Brando Movie GIF
 

T4keD0wN

Member
When your hardware is so inferior for every use case besides gaming, people start to be afraid of associating with you, since they can't risk their orders from the only viable provider getting delayed.
Comforting 30 Rock GIF

Sucks for AMD, but it wouldn't ever have become a problem if their offerings were at all viable.
 

Bernoulli

M2 slut
When your hardware is so inferior for every use case besides gaming, people start to be afraid of associating with you, since they can't risk their orders from the only viable provider getting delayed.
Comforting 30 Rock GIF


During the Wednesday AMD investor event, Meta, OpenAI, and Microsoft announced their adoption of AMD’s latest AI chip, the Instinct MI300X.

This week’s announcement means that MI300X and MI300A are shipping and in production. The MI300X, designed for cloud providers and enterprises, is built for generative AI applications and outperforms Nvidia’s H100 GPU in two key metrics: memory capacity and memory bandwidth. That lets AMD deliver comparable AI training performance and significantly higher AI inferencing performance.

In other words, the MI300X GPU packs over 150 billion transistors and offers 2.4 times the memory of Nvidia's H100, the current market leader. It also reaches 5.3 TB/s of peak memory bandwidth, 1.6 times the H100's 3.3 TB/s.
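As a quick sanity check on the ratios quoted above, here's a minimal back-of-the-envelope in Python. The absolute figures (192 GB vs. 80 GB of memory, 5.3 TB/s vs. roughly 3.3 TB/s of bandwidth) are the publicly quoted spec-sheet numbers, not anything measured here, so treat it as an illustration of the arithmetic rather than a benchmark:

```python
# Back-of-the-envelope check of the MI300X vs. H100 ratios quoted in the article.
# Figures are publicly quoted spec-sheet numbers and can vary by SKU/revision.
specs = {
    "MI300X": {"memory_gb": 192, "bandwidth_tb_s": 5.3},
    "H100":   {"memory_gb": 80,  "bandwidth_tb_s": 3.3},
}

mem_ratio = specs["MI300X"]["memory_gb"] / specs["H100"]["memory_gb"]
bw_ratio  = specs["MI300X"]["bandwidth_tb_s"] / specs["H100"]["bandwidth_tb_s"]

print(f"Memory capacity:  {mem_ratio:.1f}x")   # ~2.4x
print(f"Memory bandwidth: {bw_ratio:.1f}x")    # ~1.6x
```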

 

WitchHunter

Banned
How would we get our raytraced butts and the like without Nvidia?

Does this mean game graphics will stagnate?

The future looks grim.
 

Buggy Loop

Member
What a sore fucking loser

This is the ex-Alienware guy who plastered « up to » performance claims all over the RDNA 3 presentation, which got the internet thinking it would be within spitting distance of a 4090 and even surpass it. What a stupid presentation that was.



I have a hard time pitying AMD after giving ATI/AMD 20 years of CPU and GPU dedication, only to watch them always play the victim and let their fans take up the pitchforks for their failures. All the GameWorks controversies were bullshit; they were just stirring up shit for pity buys and underdog points when in fact it was all their own doing. I was on AMD during those times and it was fucking obvious. For a company that invented hardware tessellation in 2001, there are no fucking excuses. People were patching drivers to just limit tessellation within 48 hours, but AMD didn't ship any of those fixes in a timely manner. They cultivated this controversy.

AMD let CUDA run rampant through all the universities and workplaces doing GPU-accelerated research for FIFTEEN YEARS. Cue the surprised Pikachu face when companies don't jump on your dick as soon as you present an alternative. Those customers would have had to endure dire fucking stretches without CUDA, because AMD just wasn't ready for FIFTEEN YEARS. That's a product that's a teenager by now. It's even arguable whether they truly have a competing product to CUDA nowadays, but I had to give them some slack.
 

Wildebeest

Member
The difference is in software, not hardware. If an independent third party had written a programming environment that ran just as fast as CUDA on Nvidia hardware but also worked on AMD/ATI or Intel GPUs, the market would be blown wide open. Nvidia has no reason to cooperate with or financially support such a thing, though.
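To make that concrete, one of the closer things you get in practice today is writing against a framework that hides the vendor backend. Here's a minimal sketch in Python, assuming a PyTorch install built for either CUDA (Nvidia) or ROCm (AMD); the ROCm build reuses the torch.cuda namespace for AMD GPUs, so the same user code can target either vendor:

```python
# Minimal sketch of vendor-portable GPU code at the framework level.
# Assumes a PyTorch build with either the CUDA (Nvidia) or ROCm (AMD) backend installed;
# the ROCm build exposes AMD GPUs through the same torch.cuda API.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b  # dispatched to the vendor's BLAS library under the hood

print(c.shape, c.device)
```

That only covers the framework level, though, which is the point: nothing independent exists all the way down at the kernel-programming level that matches CUDA performance across Nvidia, AMD, and Intel GPUs.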
 

Wildebeest

Member
Now imagine the power of Microsoft with the custom ARM chips they will start producing :messenger_tears_of_joy: Vendor lock-in with Azure, AI, Windows, Office, and then home-made ARM 🔥
Custom AI chips are mostly used on the server side. Google makes its own custom chips, which it calls "tensor processing units", but you don't need to buy one to use Chrome.
 
Custom AI chips are mostly used on the server side. Google makes its own custom chips, which it calls "tensor processing units", but you don't need to buy one to use Chrome.
It really depends on the volume they can produce. For example, it opens up the possibility of selling customers hardware that is purpose-built to run Azure or AI workloads on-premises, or something along those lines. Granted, cloud is the more common route. But for the future, something like this is not outside the realm of possibility.

Especially with Microsoft's "azure credits" deals :messenger_tears_of_joy:
 

WitchHunter

Banned
I have a hard time pitying AMD after giving ATI/AMD 20 years of CPU and GPU dedication, only to watch them always play the victim and let their fans take up the pitchforks for their failures. All the GameWorks controversies were bullshit; they were just stirring up shit for pity buys and underdog points when in fact it was all their own doing. I was on AMD during those times and it was fucking obvious. For a company that invented hardware tessellation in 2001, there are no fucking excuses. People were patching drivers to just limit tessellation within 48 hours, but AMD didn't ship any of those fixes in a timely manner. They cultivated this controversy.

AMD let CUDA run rampant through all the universities and workplaces doing GPU-accelerated research for FIFTEEN YEARS. Cue the surprised Pikachu face when companies don't jump on your dick as soon as you present an alternative. Those customers would have had to endure dire fucking stretches without CUDA, because AMD just wasn't ready for FIFTEEN YEARS. That's a product that's a teenager by now. It's even arguable whether they truly have a competing product to CUDA nowadays, but I had to give them some slack.
Yeah, but they were on the verge of dying out... the stock has seen $3-4 lows. Plus NVDA seems like a protected company. Just look what happened to 3Dfx. Maybe they ruffled AMD's feathers a bit to show them not to fuck around. Too many "accidents" around this company mean it might be a strategic asset of some sort.
 

willothedog

Member
Yeah, but they were on the verge of dying out... the stock has seen $3-4 lows. Plus NVDA seems like a protected company. Just look what happened to 3Dfx. Too many "accidents" around this company mean it's a strategic asset.

Wouldn't be surprised if they supply the likes of Fort Meade, Langley, etc. with compute power.


security GIF by HuffPost
 

Mr.Phoenix

Member
Bullshit.

Utter nonsense.

This is one of those areas where if AMD pushed out actually competitive and (this may sound obvious) better tech, customers wouldn't care what Nvidia thinks.

AMD has no one but themselves to blame for how far behind Nvidia they let themselves fall, tech-wise.
 

Loomy

Thinks Microaggressions are Real
However you feel about AMD vs Nvidia, if this has merit, antitrust agencies will look very closely at it. And no matter how you feel about AMD, if Nvidia is breaking the law here, they will be penalized for it.
 

shamoomoo

Member



Context

Nvidia's AI customers are scared to be seen courting other AI chipmakers for fear of retaliatory shipment delays, says rival firm

A report published by the Wall Street Journal on Monday surfaces an accusation that Nvidia might be willing to delay data center GPU orders if it becomes aware of a customer looking for greener pastures. Thus, rival AI chipmaker Groq says that fearful customers are secretive about acquiring or designing AI acceleration technology for fear of retaliatory shipment delays. This stands in contrast to Nvidia's own statements on the matter, with the company saying that it is trying to allocate supply fairly and to offer customers alternative access to compute while they wait for their shipments.

It would be a pity if something were to happen to your pending GPU order...

Fear of being found out is so strong that it isn’t uncommon for people to deny they have had meetings with rival AI accelerator chip firms, Jonathan Ross, CEO of rival chip startup Groq, told the WSJ.

“A lot of people that we meet with say that if Nvidia were to hear that we were meeting, they would disavow it,” Ross told the WSJ. “The problem is you have to pay Nvidia a year in advance, and you may get your hardware in a year, or it may take longer, and it’s, ‘Aw shucks, you’re buying from someone else, and I guess it’s going to take a little longer.’”

If true, this revelation has echoes of Intel using strong-arm tactics in the early noughties to disincentivize PC-making partners from offering AMD-inside systems. Elsewhere in the WSJ report, it's made clear that Nvidia is enjoying its dominance of the booming AI market, with an estimated 80% of companies using AI acceleration reliant upon the green team's hardware.

What does this have to do with the article presented? If I gathered the information correctly, the article is saying that folks who are looking for alternatives in this A.I. space will be punished with delays for trying to access this market on the cheap.

Basically, Nvidia is trying to keep a monopoly in this arena by using their market lead to coerce people/companies into sticking with Nvidia.
 

shamoomoo

Member
Bullshit.

Utter nonsense.

This is one of those areas where if AMD pushed out actually competitive and (this may sound obvious) better tech, customers wouldn't care what Nvidia thinks.

AMD has no one but themselves to blame for how far behind Nvidia they let themselves fall, tech-wise.
True, but the article is saying Nvidia is using their market dominance to discourage competition by delaying shipments to folks who look toward alternatives.
 

SABRE220

Member
While I don't doubt there's truth in this, the fact remains that AMD has to take the majority of the blame for their pathetic market share in the GPU sector. They have been delivering inferior hardware and software for a decade and let Nvidia remain uncontested with CUDA for 15 years. Nvidia went all in on R&D for compute/ML/AI and took the road bumps that came with the dedicated hardware, while AMD played it safe and kept essentially the same pipeline with iterative upgrades for 15 years.
 

Drew1440

Member
Wasn't this the case with Intel, where they put pressure on OEMs to avoid AMD processors and chipsets? I wouldn't put it past Nvidia to do the same.
 