
12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

llien

Member
Over the past few days, we've heard chatter about a new 12-pin PCIe power connector for graphics cards, particularly from Chinese-language publication FCPowerUp, which published a picture of the connector itself. Igor's Lab also did an in-depth technical breakdown of the connector. TechPowerUp has some new information on this from a well-placed industry source. The connector is real, and will be introduced with NVIDIA's next-generation "Ampere" graphics cards. The connector appears to be NVIDIA's brainchild, and not that of any other IP or trading group such as the PCI-SIG, Molex or Intel. It was designed in response to two market realities: high-end graphics cards inevitably need two power connectors, and it is neater for consumers to have a single cable than to wrestle with two; while lower-end (<225 W) graphics cards can make do with one 8-pin or 6-pin connector.

The new NVIDIA 12-pin connector has six 12 V and six ground pins. Its designers specify higher-quality contacts on both the male and female ends, which can handle higher current than the pins on 8-pin/6-pin PCIe power connectors. Depending on the PSU vendor, the 12-pin connector can even split in the middle into two 6-pin connectors, and could be marketed as "6+6 pin." The points of contact between the two 6-pin halves are kept level so they align seamlessly.

[image: the 12-pin connector]

The connector should be capable of delivering 600 W of power, so it is not simply two 6-pin connectors side by side (2 × 75 W = 150 W), nor a scaled-up 6-pin. Igor's Lab published an investigative report yesterday with numbers on cable gauge that help explain how the connector could deliver far more power than a combination of two common 6-pin PCIe connectors.

TPU
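
To put rough numbers on the 600 W claim: a quick sanity check, assuming roughly 8.5 A per contact, a typical rating for Micro-Fit-style high-current terminals on 16 AWG wire (an assumption in line with Igor's Lab's gauge discussion, not a confirmed NVIDIA spec):

```python
# Back-of-the-envelope ceiling for the rumoured 12-pin connector.
PINS_12V = 6        # six 12 V pins, six grounds (per the report above)
AMPS_PER_PIN = 8.5  # ASSUMED contact rating, not an official figure
VOLTS = 12.0

print(f"12-pin ceiling: {PINS_12V * AMPS_PER_PIN * VOLTS:.0f} W")  # ~612 W

# The PCIe spec caps the familiar connectors much lower, regardless of
# how much current their contacts could physically carry:
print(f"6-pin + 6-pin by spec: {75 + 75} W")
print(f"8-pin + 8-pin by spec: {150 + 150} W")
```

At ~8.5 A per pin, the six 12 V lines land right around the quoted 600 W figure.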

So:
Up to 600 watts. Curious what this tells us about Samsung 8nm.
The "it's just so you don't have to manage two cables" line is obvious BS.
The interesting part is that the connector is rated for 25 insertions. I was shocked at first, but then figured existing ones are rated at 30. (A connector is guaranteed not to break if you plug/unplug it that many times.)
The implications for PSUs, cough... Note that just having adapter connectors doesn't cut it, due to the higher currents.
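
On that last point, a minimal sketch of why the higher current is the problem: resistive loss in a wire grows with the square of the current (P = I²R), so cabling sized for 8-pin duty runs much hotter at 12-pin loads. The copper resistance values are standard figures; the gauges and cable length are illustrative assumptions, not numbers from the TPU report:

```python
# Heat dissipated in the cable run at 600 W total delivery.
R_PER_M = {16: 0.0132, 18: 0.0210, 20: 0.0333}  # ohms/metre of copper by AWG

def cable_heat(total_watts, n_12v_wires=6, awg=18, length_m=0.6):
    """Per-wire current and total heat dissipated across all conductors."""
    amps = total_watts / 12.0 / n_12v_wires        # current in each 12 V wire
    r = R_PER_M[awg] * length_m                    # one conductor's resistance
    watts_lost = amps ** 2 * r * n_12v_wires * 2   # 12 V wires plus returns
    return amps, watts_lost

for awg in (16, 18, 20):
    amps, lost = cable_heat(600, awg=awg)
    print(f"AWG {awg}: {amps:.1f} A per wire, ~{lost:.1f} W of cable heating")
```

Roughly 8.3 A per wire either way, but the thinner the gauge, the more of that 600 W ends up as heat in the cable, which is why a cheap adapter on thin wires is a worry.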
 

CuNi

Member
AFAIK Igor from Igor's Lab said he had already heard about that 12-pin in early March or so, and thinks it's only an engineering prototype for now that will slowly be introduced over the next GPU generations and PSU revisions.
 
Rumours of the next Nvidia cards being power hogs could be true. Quite worrying (or a good thing, depending) that they need a 12-pin on their top model to compete with AMD's top RDNA2 card.

This is evidence that Samsung's node process is not as good as TSMC's. I mean, even Intel struggles to compete.
 

//DEVIL//

Member
I have an EVGA P2 1300 power supply. If I need to replace it just for Nvidia, they can fuck off and shove their GPU up their asses. Not gonna spend a gazillion dollars on a GPU and then get asked for more money for a new power supply, or worse, an Nvidia power supply (or one from a partner company as a "certified" power supply). Yeah, no. Pass.
 

Kuranghi

Member
I just hope AMD brings out something built to compete with Nvidia's top-end cards, so the stupid Nvidia pricing comes down again. If you don't care about RTX, the pricing is ridiculous compared to Maxwell.

I bought my latest card in 2019, but I still went with Pascal, because it felt stupid to spend ~£750 (RTX 2080) on a GPU when my 970 was £279. How much was the 980? ~£450?

Especially when it's a first-gen RT product; they've said since last year that the 3000 series will smash the 2000 series for RT efficiency. Although they would say that, haha.
 

idrago01

Banned
Got a platinum 1000 W EVGA power supply, so I should be fine. I just don't want them to underwhelm again like they did with the 2080 Ti.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
I remember when I was at the top of my game, running a pair of GTX 580s in SLI. I miss the excitement of those days, but eventually the experience burned me out of PC gaming. Dedicating 600 W to the GPU alone is nuts and will require a liquid cooling solution to ignore the sound coming from that oven. I'll pass.

Rumours of the next Nvidia cards being power hogs could be true. Quite worrying (or a good thing, depending) that they need a 12-pin on their top model to compete with AMD's top RDNA2 card.

This is evidence that Samsung's node process is not as good as TSMC's. I mean, even Intel struggles to compete.

Oh shit, I wasn't aware:


Yeah dude, I'd skip this generation. Let Samsung sort out their quirks.
 

Kuranghi

Member
I have an EVGA P2 1300 power supply. If I need to replace it just for Nvidia, they can fuck off and shove their GPU up their asses. Not gonna spend a gazillion dollars on a GPU and then get asked for more money for a new power supply, or worse, an Nvidia power supply (or one from a partner company as a "certified" power supply). Yeah, no. Pass.

What do you need 1300 W for? Triple SLI? Do you have like 50 hard drives? I'm confused.
 

kiphalfton

Member
Unless this is for non-consumer graphics cards, this is fucking stupid. Making it mandatory to buy a new PSU for their graphics card? Yeah, not gonna happen. I mean, when has this actually ever happened before? Even PCI Express generations are backward compatible. And on top of that, anyone who doesn't have a high enough wattage PSU would have to upgrade theirs anyway.

Hopefully the AIB manufacturers swoop in and save the day, or are at least able to do something so that buying a new PSU isn't necessarily warranted.
 

DeaDPo0L84

Member
I sense another Fermi-style generation of cards on the horizon.



Concerned? It's looking like anyone wanting a 3080 Ti might need an entirely new PSU, regardless of how old it is.

Well, if that's the case, then depending on the price of the 3080 Ti I may hold off a bit.
 

Kuranghi

Member
Haven't you noticed that power draw has been increasing lately? Both for CPUs and GPUs.

I have an 8-year-old CPU, the 3770K, which uses 150 W at load according to a quick Google (the 9700K uses "over 200 W at load with overclocking"), and my GPU is a GTX 1080, which seems to use 250 W in a worst-case scenario. That's only 450 W even if I had the newer CPU.

Obviously you have the motherboard, HDDs and other stuff on top of that, but I didn't think you'd need 1300 W unless you have multiple GPUs/CPUs.

This article suggests you made a good choice: you're more future-proofed than my 750 W PSU is. But it just seems like overkill for right now (excluding multiple GPUs/CPUs, as I said above). That's why I said the 50-HDD thing; I thought maybe you were running a server on it.
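
For anyone who wants to run the same estimate, here's the arithmetic above as a quick sketch. The CPU and GPU figures are the rough numbers quoted in this post; the rest-of-system figure is an assumed ballpark, not a measured value:

```python
# Rough PSU sizing from quoted component figures (not measurements).
components_w = {
    "i7-3770K at load": 150,             # "according to a quick Google"
    "GTX 1080 worst case": 250,
    "motherboard/RAM/drives/fans": 75,   # ASSUMED ballpark, not from the post
}
total = sum(components_w.values())       # ~475 W
headroom = 1.25                          # ~25% margin for load spikes
print(f"Estimated draw {total} W -> look for a ~{total * headroom:.0f} W PSU")
# ~475 W draw -> ~594 W: a 650-750 W unit is comfortable; 1300 W is server territory.
```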
 

rofif

Banned
This is just stupid.
We need a general overhaul of motherboards and the whole PC structure.
There is absolutely no need for the 90s layout to still be around.
The CPU power cable should have been folded into the main 24-pin years ago, and the PCIe slots should be totally redefined, or at least made twice as long, to deliver more power and hold today's very heavy cards.
 

GetemMa

Member
This probably won't be a big deal for modular power supply owners.

Not sure if they can offer converters for everyone else or not.

Everything is just pointing to these cards being absurdly expensive propositions. What a shame.
 

kiphalfton

Member
This probably won't be a big deal for modular power supply owners.

Not sure if they can offer converters for everyone else or not.

Everything is just pointing to these cards being absurdly expensive propositions. What a shame.

I mean, maybe they can offer something for modular or semi-modular PSUs, but I doubt they're going to give it away for free. I almost think they wouldn't be willing to make them at all, since it just seems like a big liability. I may be wrong, and feel free to tell me if I am, but I thought I had read that you can't use modular power cables between different brands and/or models of PSU. If that's true, each company would have to make these aftermarket cables for every compatible PSU model, which seems like a lot of work. Rather than give people the opportunity to burn their house down, I think they would just come out with a power supply (maybe through EVGA or another Nvidia partner) and release that.
 

GetemMa

Member
I mean, maybe they can offer something for modular or semi-modular PSUs, but I doubt they're going to give it away for free. I almost think they wouldn't be willing to make them at all, since it just seems like a big liability. I may be wrong, and feel free to tell me if I am, but I thought I had read that you can't use modular power cables between different brands and/or models of PSU. If that's true, each company would have to make these aftermarket cables for every compatible PSU model, which seems like a lot of work. Rather than give people the opportunity to burn their house down, I think they would just come out with a power supply (maybe through EVGA or another Nvidia partner) and release that.

Most graphics cards come with their own power connector. I have never come across a PSU that required you to use its brand of cabling.

I built a rig two weeks ago for my nephew, and his GPU came with its own power connector. It was an Nvidia GTX 1650 and I plugged it into a Thermaltake PSU. It worked just fine.

Nvidia is not going to sell graphics cards that cost $800 to $1500 and require you to purchase an Nvidia-branded PSU. That would be Nvidia bending over, spreading their cheeks, pointing at AMD and saying "just come over here and fuck us up".

I think the panic stations happening around this news are a bit silly.

That said, if you are currently running a PSU below 750 W, the TDP could be a big issue. I bet that 350 W power draw figure is accurate.
 

PhoenixTank

Member
Concerned? It's looking like anyone wanting a 3080 Ti might need an entirely new PSU, regardless of how old it is.
With the potential of an adapter using 2x 8-pins, as well as PSU manufacturers selling new cables for modular power supplies, it may not be as bad as it sounds. But that is a rumour on top of a rumour.

Was Nvidia choosing Samsung's 9nm node their drop-the-ball moment? We can all dream (so competition leads to lower prices for everyone).
The node choice has flipped back and forth several times now, and I've learnt to rarely believe Nvidia rumours... even if it seems like they have some other semi-confirmed leaks this time round.
Anything substantial on Samsung popped up?
 
With the potential of an adapter using 2x 8-pins, as well as PSU manufacturers selling new cables for modular power supplies, it may not be as bad as it sounds. But that is a rumour on top of a rumour.


The node choice has flipped back and forth several times now, and I've learnt to rarely believe Nvidia rumours... even if it seems like they have some other semi-confirmed leaks this time round.
Anything substantial on Samsung popped up?

No, just the rumours everyone has seen. I said 9nm earlier, but the rumour is actually 7nm or 8nm, further clouded by different processes being used for different architectures.

One thing that makes perfect sense to me is that Samsung's 7nm or 8nm is not as 'good' (density? yields?) as TSMC's 7nm+, but how true that is remains to be seen.
 

Celcius

°Temp. member
Oof, this makes me think they'll be hot, power-hungry cards.
I doubt Nvidia won't let the new cards work with existing power supplies as-is, though... otherwise they'd be seriously limiting their market.
 

888

Member
Now is NOT the time to be buying a new PSU; they are stupid expensive right now. I spent 155 on an EVGA SuperNOVA G3 750W. With a 1070, a Ryzen 7 3700X, 3x NVMe and 3x SSDs I need around 450 W. If Nvidia wants to start messing with new pinouts and requirements, they had better really think it through.

Just read that this could be only for Founders Edition cards. Partner boards could be different.
 

Rbk_3

Member
Fun fact: my whole PC runs on a 525 W PSU (GTX 1070 and an overclocked 8700K).
These guys want 600 W just for their GPU.

I was running a 2080 Super and a 9900KS perfectly fine on a 650 W unit. I upgraded to 850 W just to be safe, but I still had headroom.
 

GHG

Gold Member
With the potential of an adapter using 2x 8-pins, as well as PSU manufacturers selling new cables for modular power supplies, it may not be as bad as it sounds. But that is a rumour on top of a rumour.

As long as the adapters work... No custom cables NO BUY.

[photos: the custom-sleeved cables in my current build]

I will delay the purchase until I can get cables that match my current ones. Hopefully it doesn't take too long.
 

llien

Member
Pragmatically, NV would not do this if there were no need.
Most likely, the top-end cards on Samsung 8nm will have peak power consumption way above what the existing connectors are guaranteed to deliver.

So far in the leaks, the biggest card was rated at "only" 320 W.

I just hope AMD brings out something built to compete with Nvidia's top-end cards, so the stupid Nvidia pricing comes down again.
Oh boy.
 