
Nvidia RTX 6000 48GB Ada Professional GPU Listed at Suggested Retail of $9,999

Draugoth

Gold Member




Nvidia's new RTX 6000 Ada professional GPU has started popping up in online listings, along with the price. The card is currently listed at CompSource and ShopBLT with a suggested retail price of $9,999. Luckily, both stores are offering a discount off the "suggested retail price," dropping the price of the card to just $8,209.65 and $7,377.71, respectively, at the time of this writing.

The RTX 6000 Ada is Nvidia's latest professional GPU, and is one of the first "prosumer" GPUs to adopt Nvidia's most recent Ada Lovelace GPU architecture. According to PNY's spec sheet, the RTX 6000 Ada is a monster, with 76.3 billion transistors, 18,176 CUDA cores, 568 Tensor cores for AI-focused workloads, and 142 Gen-3 RT cores for ray tracing.

Compared to the Nvidia RTX 4090, the RTX 6000 Ada has 1,792 (10.94%) more CUDA cores. Surprisingly, Nvidia is still not using a fully unlocked AD102 die (which has 18,432 CUDA cores and two more SMs) for the RTX 6000 Ada. Note that the RTX 6000 sticks with the same 300W TBP (Total Board Power) limit as the previous-generation card, since workstations are more likely to use multiple cards and don't necessarily want a single 450W card.
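The core-count comparison above is simple arithmetic; a quick sketch using the figures quoted in the article (the RTX 4090's 16,384 CUDA cores and 128 cores per SM are the commonly published numbers):

```python
# CUDA core counts as quoted in the article above.
rtx_4090_cores = 16_384
rtx_6000_ada_cores = 18_176
full_ad102_cores = 18_432

# How many more cores the RTX 6000 Ada has than the 4090, and by what percentage.
extra_cores = rtx_6000_ada_cores - rtx_4090_cores
pct_more = extra_cores / rtx_4090_cores * 100
print(f"{extra_cores} ({pct_more:.1f}%) more CUDA cores")  # 1792 (10.9%) more CUDA cores

# SMs disabled relative to a fully unlocked AD102 die, at 128 CUDA cores per SM.
disabled_sms = (full_ad102_cores - rtx_6000_ada_cores) // 128
print(f"{disabled_sms} SMs disabled")  # 2 SMs disabled
```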


 

Celcius

°Temp. member
If it has double the VRAM of a 4090, 11% more CUDA cores, and is only 300W, then I'm guessing it's heavily underclocked compared to the 4090? That would also explain the blower-style cooler, I guess.
 

LiquidMetal14

hide your water-based mammals
Display Port 1.4 matters to a niche right now.

I'm not on either side of the AMD/NVIDIA conflict, just like AMD/Intel on CPUs, but the DP 1.4 criticism is a cheap jab which doesn't matter to those educated enough to know what it does. And those to whom it does matter will assess its importance vs. other factors.

It's marketing bullet-point fluff until you or I can even afford the HW to drive the rated performance. As of right now, I don't care even 1% about that.

Regarding this GPU, it's expensive and not for me.
 

Gamerguy84

Member
I wish they would go ahead and show us the 5080 as I'm sure it's done and taped out.

They just drip feed the junkies like EA sports yearly franchises.
 
It's a better-binned chip, certainly higher yield/fewer defects. It probably has a lower V/f curve, and obviously it's using 20 Gbps GDDR6 instead of G6X, which is much more power hungry.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If it has double the VRAM of a 4090, 11% more CUDA cores, and is only 300W, then I'm guessing it's heavily underclocked compared to the 4090? That would also explain the blower-style cooler, I guess.
It doesn't use GDDR6X.
It would be marginally underclocked on the core vs. a 4090 (borderline irrelevant due to how boost works on newer GPUs).
The 4090 loses very, very little performance when power-limited.
Blower-style cards are easier to stack together.
Nvidia basically sold factory-overclocked cards from the jump; the TDP of "reference" 4090s is way too high, and should have been around 320-360W.

[Image: NVIDIA RTX A6000 graphics card, 48 GB GDDR6, GA102 GPU]
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
A $10k GPU?

In theory, when you buy this GPU, it pays for itself after the first render set you sell.

A render set should net you at least $2,000 on the low end, and easily $10K if you are being serious... if you bought this GPU, we can assume you are being serious.

P.S. It should be a $7,000 GPU; dunno why it's being priced at $9-10K.
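The "pays for itself" claim is easy to sanity-check; a minimal sketch, assuming the $2,000-$10K per-job figures from the post above (all figures illustrative, not real market data):

```python
import math

# Break-even estimate: how many paid render jobs cover the card's $9,999 list price?
# The per-job dollar figures are the post's own rough estimates.
gpu_price = 9_999.00
low_end_job = 2_000.00    # low-end render set
serious_job = 10_000.00   # "serious" job

jobs_at_low_end = math.ceil(gpu_price / low_end_job)
jobs_at_serious = math.ceil(gpu_price / serious_job)
print(jobs_at_low_end, jobs_at_serious)  # 5 1
```

So even at the low end, the card is notionally paid off after a handful of jobs, which is the poster's point.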
 

nemiroff

Gold Member
Obviously not worth it for "gamers". This is only worth it if your company pays for it to make your hours at work "more efficient", so to speak. We've hired a team of Unreal 5 developers at work now; it'll be fun to ask them how a $10,000 GPU can make their life at work better. I guess they'd say the 48 GB of VRAM is totally awesome.
 
not aimed at gamers or any average consumer.

$10k sounds like a lot and of course it is. It also gives people a free shot at Nvidia, but this line of cards has always been super expensive. it's aimed at companies that do heavy 3D work or even AI. we're talking the likes of movie makers who are rendering 3D movies. not just rendering the movie to watch, but rendering it during production. i don't know what the heaviest CGI movie to date is, but imagine that, except you need to edit it in real time or damn close to it.

if these cards can shave off a few seconds of someone waiting for a render to complete then that company will buy them up in bulk for sure.

imagine you wanted to edit a photo but had to wait 3 minutes every time you made a change to see the result. that's what i'm talking about, but the people using these cards are working with shit waaaay more demanding than photo editing lol. if this card can cut a render down from 3 minutes to 2 minutes then it'll pay for itself real fucking fast.
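The render-time argument above can be put into rough numbers. A hedged sketch: the 3-minute-to-2-minute figure comes from the post; the iteration count and hourly rate are made-up assumptions purely for illustration:

```python
# Rough ROI from shaving 1 minute off each render iteration.
# Only the render times come from the post; rates/volumes are assumptions.
old_render_min = 3.0
new_render_min = 2.0
renders_per_day = 100          # assumed iterations per artist per day
artist_rate_per_hour = 60.0    # assumed fully loaded hourly cost
gpu_price = 9_999.00

minutes_saved_per_day = (old_render_min - new_render_min) * renders_per_day
dollars_saved_per_day = minutes_saved_per_day / 60 * artist_rate_per_hour
days_to_break_even = gpu_price / dollars_saved_per_day
print(f"${dollars_saved_per_day:.0f}/day saved, "
      f"break-even in about {days_to_break_even:.0f} working days")
```

Under these (invented) numbers the card pays for itself in a few months of use per artist, and the savings scale with how render-bound the workflow actually is.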
 
There'll be a lot of buyer's remorse, spending 10k only to have an outdated card in a year.
something tells me that the people spending $10k on a bunch of GPUs (most likely they aren't buying just one) will be more than happy to upgrade as soon as a faster model is out.
 
The businesses that buy these cards make their money back tens to hundreds of times over while using them.
yup. time is money. the faster they can put a product out and bring money in the better. also if the card is more efficient they will save money on energy.

gotta spend money to make money.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
There'll be a lot of buyer's remorse, spending 10k only to have an outdated card in a year.
The next RTX 6000 is at best coming out in 2+ years.
By then this thing will have paid for itself tenfold, easily.

This is already using a near-full AD102 chip.
There is no AD101 or AD100 to use for consumers.
AD100 is pretty much only used in data center and machine learning applications.
The PCIe versions don't even come with fans; your server should have a cooling solution for the GPGPU.


160GB of VRAM.

[Image: NVIDIA A100 PCIe card with dual NVLink bridges]
 

killatopak

Gold Member
I think I'll take a break from PC again until top of the line goes back to sub-$1k, or at least has a big enough jump from last gen.
 

killatopak

Gold Member
Top of the line sub $1k?

In the world we currently live in?

Just the 80, just like old times when the 90 or Titans didn't exist. Anyway, I did say if the jump in performance between generations was big enough, I would make an exception.
 