
Can consoles not go with Nvidia for some reason?

small_law

Member
Old man here to remind everyone that MS went with Nvidia for the original Xbox then switched to ATI (AMD acquired them) for the 360, which caused all the back compat hell between those consoles.

Nvidia was thrilled to get the original Xbox contract. As I understand it, they wanted way too much money per unit sale for what became the 360. Their offer was some ridiculous dealbreaker.
 

Tams

Member
Looks like Apple is ultimately to blame for that. They want complete control over their hardware/software, and there's the push to make Macs run on the same chips as the phones.

Well, ummm, no. While that was probably what spurred them to go fully in with Apple Silicon development (bear in mind that Apple helped found ARM), Apple turned to AMD for GPUs after Nvidia screwed them over.

AMD even went as far as to make custom GPUs with HBM for them.
 

Tams

Member
We know from the Nvidia leaks that they are making a custom SoC for Nintendo (T239).

Umm, thanks for contributing nothing to the thread? I pretty much said that, but we don't know if it will be for a future Nintendo console, hence why I said 'almost certain'.

Unless you're an insider, in which case spill the beans (with proof) and lose your job for us. Show that you truly are devoted to The GAF.
 

SScorpio

Member
Old man here to remind everyone that MS went with Nvidia for the original Xbox then switched to ATI (AMD acquired them) for the 360, which caused all the back compat hell between those consoles.

Nvidia was thrilled to get the original Xbox contract. As I understand it, they wanted way too much money per unit sale for what became the 360. Their offer was some ridiculous dealbreaker.
The way I heard it, MS didn't have a good contract with Nvidia, and Nvidia wouldn't lower the price MS was paying. So Microsoft was still paying Nvidia the same per chip as at launch, which is why MS killed off original Xbox hardware production so quickly rather than keeping it around after the 360 came out.
 

Tsaki

Member
AMD has an x86 license and their CPUs are superb. Radeon GPUs are also great in performance per mm², and while they might not have a 4090 competitor, a console wouldn't have a GPU of that size anyway, so that's moot.
People also forget that DLSS is not free. It runs on Tensor cores, which themselves occupy die real estate.
They now have tech inside their device that can compete with consoles 6-7 times its size, with 5-7 times its teraflops?
You have a big disappointment ahead of you if that's what you expect.
 

Dream-Knife

Banned
Well, ummm, no. While that was probably what spurred them to go fully in with Apple Silicon development (bear in mind that Apple helped found ARM), Apple turned to AMD for GPUs after Nvidia screwed them over.

AMD even went as far as to make custom GPUs with HBM for them.
Where did nvidia screw them over?
 

winjer

Gold Member
Where did nvidia screw them over?

With the original Xbox, MS was trying to do a slim version and a cost reduction.
But for that, they needed smaller and cheaper chips on a newer process node. The usual stuff with a slim console.
Nvidia refused to provide a price cut for their GPU.
I don't know what happened between Nvidia and Sony on the PS3, but it might have been a similar situation.
 

Woopah

Member
Umm, thanks for contributing nothing to the thread? I pretty much said that, but we don't know if it will be for a future Nintendo console, hence why I said 'almost certain'.

Unless you're an insider, in which case spill the beans (with proof) and lose your job for us. Show that you truly are devoted to The GAF.
You said that Nvidia has shown no inclination to do extensive custom work, when the Nvidia leaks show that they are doing extensive custom work for Nintendo.

That was the point I was making.
 

Dream-Knife

Banned
With the original Xbox, MS was trying to do a slim version and a cost reduction.
But for that, they needed smaller and cheaper chips on a newer process node. The usual stuff with a slim console.
Nvidia refused to provide a price cut for their GPU.
I don't know what happened between Nvidia and Sony on the PS3, but it might have been a similar situation.
I'm aware of those, I was talking about Apple.
I did. Shitty that they didn't extend support for their solder issues, but lead-free solder was trash back then, and the government mandated it. The same thing happened to the 360 and PS3.

Other than that, meh, Apple seems the shittier of the two.
 

GymWolf

Member
In an ideal world a console should have an Nvidia GPU and an Intel CPU, but we live in a world where you can identify as a musky mouflon on social media and not be locked up in a loony bin, so, far from ideal...
 

Interfectum

Member
Or these claims of the Switch 2 being a portable PS5/SeriesX beater are massively exaggerated.
There are no serious claims that this beats PS5 or SeriesX... only defensive fanboy exaggerations on both sides.

What IS claimed is that the Switch 2 seems capable of running a UE5 demo that looks on par with current console offerings, most likely with major help from DLSS 3+ and a more optimized version of the demo.
 

Buggy Loop

Member
Nvidia doesn't have their own desktop x86 chips. End of story pretty much.

What's the harm in going ARM on consoles though?

Apple showed everyone how far you can push that architecture. Nvidia's Grace CPU is a monster; scaling it down for non-AI applications, back to normal consumer levels, is an easy task for them.

Funny that people say Nvidia doesn't have a CPU. It's the least problematic thing for them to create. They sell whole-package solutions at every professional level. Not having the old-ass x86 license is almost a liberation.

The real reason console manufacturers wouldn't go Nvidia is cost, obviously. But Intel is in a good position to steal the show with their tile-based APUs; they also have the tech to reach Nvidia-level ML & RT. Then it's a matter of how starved they are to score a contract, and they are probably a lot hungrier than AMD.
 

zeldaring

Banned
Because if they went with NVDA, consoles would cost $800 just for the added DLSS, which is overhyped to the moon. If you wanna pay $300 for a better upscaler, then you would be happy.
 

Robb

Gold Member
My guess would be cost and convenience.

The only reason Nintendo got Nvidia for the Switch was that Nvidia had a chip no one else wanted, and Nintendo needed/wanted something cheap since the Wii U was absolutely tanking.

Assuming Nintendo is keeping Nvidia for the next system I’m sure they managed to leverage their current position to get another good deal. If Nvidia thinks Nintendo can sell another 140M systems I’d be surprised if they didn’t want to get in on that again. Otherwise they would’ve moved to someone else, and I’m sure Nintendo have been doing the rounds and comparing what they could get from AMD.
 

JackMcGunns

Member
I see these threads about the Nintendo Switch 2 and Nvidia, and how with DLSS it could catch up simply on the software and AI side.

If that's the case, why don't consoles go with Nvidia, just like Nintendo? Is there a reason they are stuck with AMD?


MS went with them for the original Xbox. In 2000, Microsoft put up $200 million (roughly $380 million today) to help Nvidia, then a young company with only $375 million in annual sales, design a chip for what would eventually be called NV2A for the Xbox, or NV20 for the PC market. In the six days after the deal was concluded, Nvidia's share price nearly tripled, while trading volume in the shares surged by a factor of six. In fact, it was such a big deal for the company that an Nvidia employee got in trouble with the SEC for insider trading.

Nvidia is the company it is today mostly because of MS and Xbox; they used this money and opportunity to gobble up assets from companies like 3dfx Interactive and S3. Despite all this, Nvidia turned around and stabbed MS in the back by over-charging for the GPU after years of production, even as manufacturing prices had dropped. They eventually went into arbitration and settled. But clearly this was a good reason for MS and Nvidia to part ways and for MS to go with ATI (later acquired by AMD).

Sony also took a stab at using Nvidia and ended up with a pricey and lacklustre GPU codenamed RSX, which performed worse than its competitor from ATI, codenamed Xenos, the GPU in the Xbox 360. If it weren't for the extra muscle from the Cell processor, the 360 would've left the PS3 in the dust despite releasing an entire year earlier.
 
It's the same story. Microsoft worked with them on the OG Xbox and Sony worked with them on the PS3, and both felt they got the shaft on price. Hence both going with AMD the very next generation.

The Xbox Series and PS5 deals with AMD were both made way back before the DLSS reveal.

Nintendo got lucky because Nvidia had failed with the Shield portable, which was similar to what Nintendo needed for the Switch, so they made a deal with Nvidia at that time and, much later, were able to renegotiate for the Switch 2 to get Nvidia's DLSS support.


OP these basically answer the question. Not much more to add in terms of backstory.

Theoretically, yes, Sony & Microsoft could go with Nvidia next gen, but their production costs would go up because, like others have said, they'd have to get the CPU & GPU from two different companies. It'd also mean a non-hUMA architecture, so split RAM between CPU and GPU, and the bottleneck of relying on PCIe as an interconnect between the chips instead of a faster, lower-latency proprietary interconnect.
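To make that split-memory point concrete, here is a minimal sketch of what the difference looks like from a developer's point of view. It uses the public CUDA runtime API purely as an analogy; console SDKs are their own thing, so treat the calls as illustrative rather than as how console code is actually written.

```cpp
// Illustration only: "split RAM over PCIe" vs. a unified pool, via the CUDA runtime API.
#include <cuda_runtime.h>
#include <vector>

int main() {
    const size_t n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    std::vector<float> host_data(n, 1.0f);

    // Discrete GPU, split memory pools: allocate on the GPU and copy across PCIe.
    float* dev_ptr = nullptr;
    cudaMalloc(reinterpret_cast<void**>(&dev_ptr), bytes);
    cudaMemcpy(dev_ptr, host_data.data(), bytes,
               cudaMemcpyHostToDevice);   // explicit transfer, pays PCIe bandwidth/latency
    // ... run GPU work against dev_ptr ...
    cudaMemcpy(host_data.data(), dev_ptr, bytes,
               cudaMemcpyDeviceToHost);   // results come back over the same bus
    cudaFree(dev_ptr);

    // Unified (hUMA-like) pool: one allocation visible to both CPU and GPU,
    // with no hand-written copies; closer to how a console APU's shared RAM behaves.
    float* shared_ptr = nullptr;
    cudaMallocManaged(reinterpret_cast<void**>(&shared_ptr), bytes);
    // ... CPU and GPU can both read and write shared_ptr directly ...
    cudaFree(shared_ptr);

    return 0;
}
```

The point isn't the API itself; it's that with two discrete chips every piece of shared data has to cross an interconnect, whereas a single APU with unified memory skips that step entirely.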

Unless Nvidia starts making performant CPUs, there's no reason for Sony or Microsoft to consider ever going back to them. Meanwhile, at least Sony is developing custom tech for accelerating parts of the graphics pipeline that AMD's stuff might not be doing by default. We'll probably get to see this with the PS5 Pro when it comes to the RT solution, and we've already seen it with the PS5 in terms of the data I/O subsystem; much of that was in-house custom ASIC design paired with AMD's ongoing efforts.
 

Zathalus

Member
For those saying Microsoft got fucked by Nvidia, that is true. It was also 20 years ago and today Microsoft is one of Nvidia's biggest clients.

The lack of x86 stops Microsoft from going with Nvidia. Not hurt feelings from two decades ago.
 
It's all about cost. Nvidia just doesn't compete on price, and console components need to be cheap for the whole thing to work. AMD clearly is willing to come in at a competitive price. Let's say you are out to procure a processor and it has to be below $200: Nvidia offers one for $250 and AMD is willing to sell you one for $199, so guess who gets the contract. Nvidia is well known for not reducing their prices, mainly because they don't have to. AMD makes both CPUs and GPUs and will always be better value. That's all there is to it. You have a budget.
 
I see these threads about the Nintendo Switch 2 and Nvidia, and how with DLSS it could catch up simply on the software and AI side.

If that's the case, why don't consoles go with Nvidia, just like Nintendo? Is there a reason they are stuck with AMD?
Because AMD is the cheapest option and Nvidia is premium. Why did Nintendo go with Nvidia? Because Nvidia made a move for the mobile hardware market with their Tegra line, failed to gain traction, and needed to offload all of those Tegras at the time, so Nintendo signed a 10+ year contract to get them on the cheap.
 

Drew1440

Member
Sony also took a stab at using Nvidia and ended up with a pricey and lacklustre GPU codenamed RSX, which performed worse than its competitor from ATI, codenamed Xenos, the GPU in the Xbox 360. If it weren't for the extra muscle from the Cell processor, the 360 would've left the PS3 in the dust despite releasing an entire year earlier.
Sony originally had plans for the Cell BE to do the graphics rendering. This wasn't working out and feedback from developers wasn't good, so Sony had to turn to Nvidia for a GPU design at the last minute. Hence why the end product underperformed and had to use split memory pools; Nvidia had no experience with the XDR memory the Cell used.

For what it's worth, Sony still used Nvidia GPUs in the VAIO laptops until they were discontinued, and Microsoft used them in a few Surface Pro models, so they are happy to use Nvidia in higher-end products, but for consoles AMD is king.
 

shamoomoo

Member
You said that Nvidia has shown no inclination to do extensive custom work, when the Nvidia leaks show that they are doing extensive custom work for Nintendo.

That was the point I was making.
I'm sure the Switch 2 is just a Jetson APU that could fit in the switch and isn't crazy customized.
 

Mr.Phoenix

Member
For those saying Microsoft got fucked by Nvidia, that is true. It was also 20 years ago and today Microsoft is one of Nvidia's biggest clients.

The lack of x86 stops Microsoft from going with Nvidia. Not hurt feelings from two decades ago.
That's not true, it was never about hurt feelings.

And being a client in the server market is a completely different thing from the console market. Honestly, that you couldn't make such a distinction makes me question how seriously to take what you are saying.
 

jorgejjvr

Member
Because AMD is the cheapest option and Nvidia is premium. Why did Nintendo go with Nvidia? Because Nvidia made a move for the mobile hardware market with their Tegra line, failed to gain traction, and needed to offload all of those Tegras at the time, so Nintendo signed a 10+ year contract to get them on the cheap.
Lightning in a bottle
 

jorgejjvr

Member
AMD has an x86 license and their CPUs are superb. Radeon GPUs are also great in performance per mm², and while they might not have a 4090 competitor, a console wouldn't have a GPU of that size anyway, so that's moot.
People also forget that DLSS is not free. It runs on Tensor cores, which themselves occupy die real estate.

You have a big disappointment ahead of you if that's what you expect.
I don't have those expectations. That's simply what I'm seeing OTHERS claim lol. "Ray tracing better than Xbox and PS5", "able to run the Matrix demo", "on par with PS5", etc. etc.
 

Zathalus

Member
That's not true, it was never about hurt feelings.

And being a client in the server market is a completely different thing from the console market. Honestly, that you couldn't make such a distinction makes me question how seriously to take what you are saying.
Of course it's not hurt feelings, I was being facetious. It was a serious dispute between the two companies over a cost issue. But it was also two decades ago, and companies can easily work together again if the benefit is there. Other factors that prevent the two from working together on a console APU are far more important, such as cost and reliance on x86.
 

buenoblue

Member
By the time the Switch 2 comes out, the PS5 and Series X will be 4, maybe even 5, years old. And maybe let's just see how powerful the Switch 2 really is before we get too excited.
 

64bitmodels

Reverse groomer.
What's the harm in going ARM on consoles though?
Losing backwards compat and having less power. Apple's M chips are cool and all, but they are productivity workhorses more than they are gaming chips. Their performance doesn't scale up the more TDP you give them, kind of like the Steam Deck's APU, and their performance on games as is... it's OK. Impressive by ARM standards but nothing special by console standards; the PS5 and Xbox would crush it every time.
 

Schmendrick

Member
What's the harm in going ARM on consoles though?
Nothing, if it had happened two gens ago. Now it would kill native backwards compatibility and make multiplat development much more complicated.
Consoles won't deviate too much from PCs anymore unless someone makes the console manufacturers an offer they can't refuse.
 

SolidQ

Member
The last time Sony did that Nvidia gave them a crap GPU and charged much more than AMD does.
Switch 1 got Maxwell instead of Pascal, and Switch 2 is going with Ampere instead of Ada. Imagine if the PS5 had ended up on Pascal-era tech instead of Ampere-era tech.
Console makers can use the latest tech from AMD, while with Nvidia they may get old tech, like the PS3, which got the crappy G70 instead of the G80.
 

Mr.Phoenix

Member
Of course it's not hurt feelings, I was being facetious. It was a serious dispute between the two companies over a cost issue. It was also two decades ago and companies can easily work together again if the benefit is there. Other factors are way more important that prevent the two from working together on a console APU, such as cost and reliance on x86.
And my point is that the things that made them stop working together 20 years ago are still there today. And they are more relevant today than they were back then.
 

Zathalus

Member
And my point is that the things that made them stop working together 20 years ago are still there today. And they are more relevant today than they were back then.
What? A pricing dispute over a GPU from 20 years ago? Why would that be relevant today? If Microsoft were to ask Nvidia to design an APU today, I'm sure a new contract would be drawn up that has very little relation to the one back then. Microsoft likely wouldn't ask, though, as Nvidia only has access to ARM and not x86. Cost and backwards compatibility are major concerns as well.
 

Raonak

Banned
AMD is simply a better choice for a full-spec console: a full x86 APU solution.

Plus AMD is very flexible, both with pricing and with hardware design.
Sony (Mark Cerny) works alongside AMD to get a chip with all the customizations that they want
(the variable clock system and Tempest audio for the PS5, dedicated checkerboarding hardware for the PS4 Pro, etc.).

They wouldn't have this level of freedom with someone like Nvidia.
 

Mr.Phoenix

Member
What? A pricing dispute over a GPU from 20 years ago? Why would that be relevant today? If Microsoft were to ask Nvidia to design an APU today, I'm sure a new contract would be drawn up that has very little relation to the one back then. Microsoft likely wouldn't ask, though, as Nvidia only has access to ARM and not x86. Cost and backwards compatibility are major concerns as well.
Sigh... No. I was trying to avoid having to explain this.

The issue 20 years ago is not that there was a pricing dispute; that's just how the issue presented itself. The issue is Nvidia's policies. For example:

The GPU in the PS5 is basically a 6700 XT, a $480 GPU at launch on the PC. AMD will make an APU, which has a CPU and a GPU, and sell that APU to Sony or MS while adding only a small margin on the on-die cost. So say that 320mm² APU costs AMD $100; they would sell it to Sony for $110.

Nvidia, on the other hand, doesn't operate that way. Nvidia would look at that GPU, look at its PC equivalent, and price it as if they were selling to just any other OEM. In this case, that PC equivalent would be something like a 2080 Super, a $700 GPU. While the actual die may cost Nvidia only $120, they would not sell it to Sony/MS for that; they would sell it for $250+, which is around what they charge their other OEMs. No exceptions. And given that they have the GPU market cornered anyway, especially now, they have even less reason to make any exceptions. Do you go with Nvidia? Well, consoles just get more expensive. That is totally fine by them, as they would rather gamers buy their PC GPUs anyway.
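As a rough back-of-the-envelope sketch of the two pricing models being described, using only the hypothetical figures from this post (none of these are real BOM numbers):

```cpp
#include <algorithm>
#include <cstdio>

// Semi-custom model (the AMD-style deal described above): die cost plus a small margin.
double costPlusPrice(double dieCost, double margin = 0.10) {
    return dieCost * (1.0 + margin);
}

// OEM model (the Nvidia-style deal described above): priced off the PC-equivalent
// part, regardless of what the die actually costs to make.
double oemStylePrice(double dieCost, double oemFloor = 250.0) {
    return std::max(dieCost, oemFloor);
}

int main() {
    std::printf("Cost-plus APU price:  $%.0f\n", costPlusPrice(100.0));  // ~$110
    std::printf("OEM-style GPU price:  $%.0f\n", oemStylePrice(120.0));  // $250+
    return 0;
}
```

Similar silicon cost, very different price to the console maker; that gap is the whole argument.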

That's the problem that is still there today. And that is all before you even start looking at the fact that this would take us back to the pre-PS3 era, with discrete CPUs and GPUs in consoles again...
 

Fafalada

Fafracer forever
If that's the case, why don't consoles go with Nvidia, just like Nintendo? Is there a reason they are stuck with AMD?
Nvidia burned both Sony and Microsoft quite badly the times they worked with them, and it wasn't just a simple issue of costs. Cost-wise, the deals they made were indeed quite bad: IIRC the OG Xbox was literally losing more money per console with each passing year, in no small part thanks to the Nvidia part of the deal. And Sony also got the short end of the stick, coming to Nvidia way too late on the PS3 with no real options but to take whatever was offered.

Cost likely still plays a factor but the relationships also need to be rebuilt for these things to happen.
 