
Anyone miss Nvidia doing console GPUs?

Woody337

Member
I can't help but wonder if AMD doing all the console GPUs has hampered console technology. If Nvidia was still doing at least one console GPU, does anyone think we would be further along?
 

MikeM

Member
Meh. RDNA 2 has good raster performance. No DLSS sucks, but FSR/in-house developer upscaling tech will do just fine for consoles. Ray tracing is the only real negative I see for consoles using AMD tech, but again, meh.
 

LordOfChaos

Member
Miss them in terms of the historical output they had? Not really. The RSX was a last-minute solution that wasn't as good as the unified shader competition, and both companies seemed to have issues with Nvidia wanting to control all aspects of their own chip, which IIRC caused issues with shrink schedules. AMD was far happier to do semicustom work for lower margins. Both Sony and Microsoft used them once and then moved on the next gen.

Now in terms of their current chips? Yeah, they're of course ahead, especially on ray tracing performance, but there's a big but: AMD alone holds an x86 license and can make the x86 APUs that the console vendors want. They could hypothetically switch to ARM cores, but the off-the-shelf stuff isn't really up there and Apple's aren't up for license, though they could decide to make their own, I suppose. That's a lot of R&D though, and BC likely locks us into x86 consoles for a while.

Nvidia's own ARM cores are a little weird, as they do internal binary translation: in a "straight line" they can be very fast, but they bog down on spaghetti code.

Might be interesting when Intel is making high-performance GPUs though, as then they could provide the whole-banana solution like AMD does. But both Intel and Nvidia weren't that interested in low-margin semicustom console silicon; maybe that'll change now.

Also, it's worth remembering that Nvidia is still in a console: the Switch. Not really custom at all though, and I have a very strong theory that it only happened because Nvidia wasn't meeting wafer agreements for the TX1, which Nintendo capitalized on.
 
Last edited:

YCoCg

Member
I'm sure Sony and Microsoft don't miss them, considering how much Nvidia screwed them over. Besides, what makes you think they'd put their best in a console? That's one way they screwed Sony: they lied about giving Sony their best, only to later reveal a GPU that beat the system at the same cost. You think Nvidia would let consoles have DLSS, etc.? Of course not, they'd make up some excuse about it not being possible so they could keep selling it with their own GPUs instead.
 

Ezquimacore

Banned
Nvidia is better on PC hardware. These new consoles are better off using AMD because AMD dominates the APU market. If the consoles were using dedicated GPUs, then the story would be different.
 

GymWolf

Member
Of course, but you need to keep the price lower with AMD for all the people who cry when they have to spend more than 500 dollars on something that guarantees thousands of hours of entertainment for almost a decade.
 
Last edited:

Three

Member
Nope, Nvidia charges too much when it comes to bang for your buck. It would have just meant weaker consoles as the manufacturers tried to hit their price targets. They did the PS3 GPU.
 
Last edited:

Azurro

Banned
I can't help but wonder if AMD doing all the console GPUs has hampered console technology. If Nvidia was still doing at least one console GPU, does anyone think we would be further along?

Nvidia fucked up their business relationships with MS and Sony. The question itself doesn't make sense, because Nvidia doesn't want to make their IP available for the console market.
 

LordOfChaos

Member
They've never made a good console GPU (maybe the original Xbox?), so no.

The original Xbox was definitely a good quarter to half a generation ahead. But then the RSX was a last-minute shove-in and not a great effort, though that's more a fault of Sony's planning.
 
Last edited:

00_Zer0

Member
No. They screwed both MS and Sony with pricing. It's going to suck when they screw Nintendo over with the same tactic though. Nintendo will drop them and break BC and their ecosystem, leaving legacy fans screwed once again.
 
Last edited:

Pagusas

Elden Member
It's crazy to me how much dominance AMD has in the APU market, yet the only device that really *required* an SoC/APU, the Switch, uses Nvidia. Really shocked that partnership happened.
 

Derktron

Banned
Yes I think about it every day

Same
 

killatopak

Member
It's crazy to me how much dominance AMD has in the APU market, yet the only device that really *required* an SoC/APU, the Switch, uses Nvidia. Really shocked that partnership happened.
It's a partnership of dire needs. The Nvidia Shield was more or less a product that failed to take off, while Nintendo was desperately in need of mobile tech that was strong enough.

I think they both benefited from this unexpected need of each other.
 

OverHeat

« generous god »
DLSS on consoles would be a godsend, so yeah… btw Nintendo doesn't count 😂
 
Last edited:

Fbh

Member
The only thing that sucks is the PS5/XSX having nothing comparable to DLSS 2.0.

Would have been amazing this gen
 

deriks

4-Time GIF/Meme God
Dude, the Switch

Also, why should this be a concern?! I mean, we basically have two GPU companies, and both of them are on par
 

Mowcno

Member
The PS5/XSX are incredibly powerful for their price, and compared to last gen, this generation's hardware is much more timely and competitive with PC. We went from the PS4 having a crappy Jaguar CPU and GPU power equivalent to a 7850, which launched at $250 a year and a half before the PS4. Now we have a powerful Zen 2 CPU and an RDNA 2 GPU equivalent to a current $350+ GPU, and people are still complaining.

Console hardware hasn't been better than this in a long time.
 

Amiga

Member
Nvidia doesn't do x86 CPUs.

Sony/MS dealing with 2 chip vendors would have added another $100 to the price.
 

winjer

Gold Member
As with most things, it's a matter of trade-offs.
Nvidia has tensor and RT cores, but these occupy die space and consume power.
On a normal graphics card, with its own PCB, power delivery, and a die that is only for the GPU, this doesn't matter much.
But consoles have an SoC that shares CPU, IO and GPU. If you spend die space on RT cores, you lose room for compute units.
AMD's solution enables ray tracing with minimal area used. Its performance is lower than Nvidia's, but it's a trade-off.

Also consider that Nvidia hasn't had a GPU with a hardware scheduler and an advanced front-end since Fermi.
On PC this is not very important, because users can just buy a faster CPU.
But on consoles, it's very important for maximizing performance and optimization.

A few years back, in the days of the original Xbox, I heard a story that MS was not happy with their relationship with Nvidia.
Mostly due to prices for the GPUs for the console, especially after a process node reduction.
But the gist of the story was that Nvidia is a somewhat difficult company to work with.
And the fact of the matter is that Sony and MS each had one console with a GPU from Nvidia. Only one.
AMD/ATI has already been in 3 consoles for MS and 2 for Sony. More if you count the PS4 Pro and Xbox One X.

Another advantage of AMD is the x86 license. Having the same x86-64 architecture on consoles and PC facilitates porting between platforms.
This is why, in the last decade, we have seen a boom of console franchises being ported to PC, and PC franchises being ported to consoles.
 
Last edited:

M1chl

Currently Gif and Meme Champion
If you want to hear it, Nvidia probably would not have made a more powerful APU, because given how small those chips are, you would probably get even less performance once tensor cores were added. Nvidia's mid-to-high-end dies are pretty massive.

Not to mention, nobody was working on integrating them into an x86 chip, which was the obvious choice back when these consoles started development.
 
Won't be happening unless home consoles move to ARM. Having a manufacturer who can handle both the CPU and GPU side of things is just too convenient.
 
Nvidia provides the CPU/GPU for the Switch.

They were pushed out of the industry after doing the work for the OG Xbox and PS3. I assume the console manufacturers contacted them for every design attempt they've made since then, and the products Nvidia offered made no sense for the price targets the console manufacturers had.

So no, if they're not there, it's probably for a good reason: they don't fit the model.

Would I like a console with a 3090 Super-equivalent GPU and a Core i9, whatever the number is now? Sure, I have wet dreams too.
Won't be happening unless home consoles move to ARM. Having a manufacturer who can handle both the CPU and GPU side of things is just too convenient.
I see that as a real possibility in 6 or 7 years. Apple showed the way with their M1 chip; the power characteristics and customization options are just too enticing.
 
Nvidia is like Apple: you don't go to them for the best price-to-performance in the business. That directly contradicts the philosophy of Sony and Microsoft.
 

tusharngf

Member
Nvidia's CEO always goes for higher pricing. That's what makes Nvidia not easy to deal with. They ask a premium price for performance.
 

gatti-man

Member
I wish Nvidia would get cheaper for consoles, to allow all that sweet PC tech they develop to be effortlessly implemented in consoles. It would enrich both sides IMO, but Nvidia wants wild cash to do consoles.
 

Shut0wen

Member
I can't help but wonder if AMD doing all the console GPUs has hampered console technology. If Nvidia was still doing at least one console GPU, does anyone think we would be further along?
Tbh, at the time they did the OG Xbox they got a lot of bad reputation for fucking up the GPU by making it 25% less powerful than what Microsoft had intended. Then they did the GPU for the PS3, and well, we all know how that ended up. So yeah, they're stuck with Nintendo and will probably stick with them for a while, as they seem pretty good at making cheap GPUs.
 

SlimySnake

Flashless at the Golden Globes
lol the PS3 GPU was a fucking disaster. No I do NOT miss Nvidia.

Read up on how Nvidia delivered an underpowered RSX after promising Sony a much more powerful GPU. IIRC, they even said it was like a 7800 GTX, which was top of the line back then, but it ended up being much worse.

The PS5 and XSX consoles are both amazing considering that they managed to sneak an 8-core, 16-thread Zen 2 CPU in there with a double-digit-TFLOPS GPU. The tensor cores that help with DLSS 2.0 and the RT cores both take dedicated space on the die, which would make them far more expensive. The RTX 3060 Ti is on par with the 2080, which is on par with the XSX GPU. That die is massive at around 370 mm², more than the entire XSX APU. Adding the CPU and I/O to it would make it over 450 mm², 1.5x bigger than the PS5 APU. How are you gonna cool that in a console form factor? How much is it going to cost? 50% more than the PS5 APU?

And for what? Slightly better RT performance and exact same standard rasterization performance? DLSS 2.0 in quality mode gives you maybe 20% more performance. Is it really worth it?
 