
RTX 4090 12VHPWR cable appears to be very dangerous

RoboFu

One of the green rats
But then he and others tried it and it didn't.
Well, no shit... breaking the solder isn't an equal thing on all cables. Some cables will, some won't. But more and more melted cables are popping up, and the way those cables are built is the fault. Corsair makes a cable that has separate connections, and those have not melted.

Igor's Lab has replicated the issue. It's not a theory anymore.

https://www.igorslab.de/en/adapter-...hot-12vhpwr-adapter-with-built-in-breakpoint/


  • The problem is not the 12VHPWR connection as such, nor the repeated plugging or unplugging.
  • Standard compliant power supply cables from brand manufacturers are NOT affected by this so far.
  • The current trigger is NVIDIA’s own adapter to 4x 8-pin in the accessories, whose inferior quality can lead to failures and has already caused damage in single cases.
  • Splitting each of the four 14AWG leads onto each of the 6 pins in the 12VHPWR connector of the adapter by soldering them onto bridges that are much too thin is dangerous because the ends of the leads can break off at the solder joint (e.g., when kinked or bent several times).
  • Bending or kinking the wires directly at the connector of the adapter puts too much pressure on the solder joints and bridges, so that they can break off.
  • The inner bridge between the pins is too thin (resulting cross section) to compensate the current flow on two or three instead of four connected 12V lines.
  • NVIDIA has already been informed in advance and the data and pictures were also provided by be quiet! directly to the R&D department.
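A rough back-of-the-envelope sketch of why the bullet points about the solder bridge and the lead cross-section matter, using my own assumed numbers rather than anything measured in the article: at the 600 W stress-test load Igor quotes, the 12V side carries roughly 50 A, and if one or two of the four leads snap off at their solder joints, the surviving leads (and the thin bridge redistributing the current) carry proportionally more, with the I²R heating at each joint rising with the square of the current.

```python
# Illustrative only: load shift when some of the adapter's four 12V leads
# break off at the solder joint (values assumed, not measured).
CARD_POWER_W = 600                            # stress-test load quoted by Igor's Lab
RAIL_VOLTAGE = 12.0
TOTAL_CURRENT = CARD_POWER_W / RAIL_VOLTAGE   # ~50 A on the 12V side

for intact_leads in (4, 3, 2):
    amps_per_lead = TOTAL_CURRENT / intact_leads
    # heat per joint relative to the healthy 4-lead case (same resistance assumed)
    relative_heat = (amps_per_lead / (TOTAL_CURRENT / 4)) ** 2
    print(f"{intact_leads} leads intact: {amps_per_lead:4.1f} A per lead, "
          f"~{relative_heat:.1f}x the heat per joint")
```

How hot that actually gets still depends on the real joint resistance, which is exactly what the bench tests further down the thread argue about.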
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I ain't drinking the koolaid till I see someone actually making it.
I've woken up in New Orleans when I started the night in Houston one too many times to just drink this shit up.

My psu maker shipping the 12 pin cables for free if u got it in last year. Singed up for one.
I see what you did there.
XJkqo59.png


Supposedly, the 4080 12GB that was cancelled had similar performance to a 3090 Ti.
Maybe this will become the 4070.

For many GPU generations, when a new gen released, its mid-range card would match the performance of the previous gen's flagship, but at a much lower price.
The old way was to have the xx70 match and/or beat the range topper of the prior generation.
Basically wait two years to own the xxTi of last gen with less memory.
The "4070" will only be a success if it can absolutely walk a 3090Ti.

The 4080 12GB was an embarrassment. Nvidia had the biggest balls thinking of charging 900+ dollars and calling it an xx80... even if they rebadge it as a 4070 it's probably the weakest xx70 of all time.
JMdnqA3.png


Nvidia's greed has totally clouded their judgment, but consumer sheepism will continue to provide them with success.

The 4080 should have been on the same AD102 as the 4090, just cut down, with 20GB of memory.
The 4070 should have been a cut-down AD103.
The 4070 Ti would/should have been what's called the 4080 16GB right now.
 

benno

Member
igorlabs have replicated the issue. Its not a theory anymore.
It actually is still a theory, because he didn't replicate it. The maximum temp he got was 68°C under load. That is in spec.
From the link you posted.
I also measured the 12VHPWR using resistance sensors and was able to determine absolutely unsuspicious temperatures (picture above). At plenty of 530 watts it was under 60 °C and even the maximum temperature of just under 68 °C I could only prove after an hour with 600 watts in the stress test. But definitely nothing melts there yet!

What Igor's Lab has done is splice the cable open and find that the cables share pins. He isn't happy about it and based his theory on that. He hasn't shown any overheating at all.

Like I keep saying, the only people who have successfully managed to overheat the connector are GALAX when they fitted the connector loose.
 

benno

Member
Why does that matter?
I value what is true and factual over what the hive-group thinks.
Clearly there's evidence of a problem occurring with the adapter provided by NVIDIA here. At some point they either need to issue a recall and/or provide a revised adapter.
Which is what is up for discussion. Is it the connector? Is it user error? Is it both?
 

DonkeyPunchJr

World’s Biggest Weeb
These are all the same thing lol. If you bend the cables too much you break the solder, causing the issue.
Just give it up. The guy knew there was a problem around the cable, where you said he was full of shit. There is a problem. You were wrong.
How is breaking the solder the same thing as pulling the pins out of the connector? I’m not saying there’s no problem with the cable. I’m saying his diagnosis of the root cause is complete bullshit.
 

HoofHearted

Member
I value what is true and factual over what the hive-group thinks.

Which is what is up for discussion. Is it the connector? Is it user error? Is it both?
So do I - arguing over semantics doesn't negate that there's an issue to be resolved.

If (when) a fire starts in someone's computer because of this and it hits the news - it really won't matter whether or not this is a connector issue, user error, product/design issue, or all of the above.
 

PhoenixTank

Member
He is not just some random guy on the internet. He studied this stuff at the university.

I posted videos about Buildzoid. A guy with university studies on electronics.
Just for the record, and by his own admission, Buildzoid was a computer science student, not an electrician or electrical engineer, but he has been doing his current OC stuff for a good few years. That said, I don't think he is full of shit.



Melting, fire, loss of magic smoke. Flat out not meant to happen. Plenty of reason to be concerned and treat it all with caution without requiring exhaustive proof of the cause.
 

benno

Member
So do I - arguing over semantics doesn't negate that there's an issue to be resolved.
Discussing what is causing the issue isn't arguing over semantics, it's the title of the thread.

If (when) a fire starts in someone's computer because of this and it hits the news - it really won't matter whether or not this is a connector issue, user error, product/design issue, or all of the above.
If it's preventable by someone actually plugging the cable in correctly then I think it does matter.

Everyone is saying that it's to do with the cables breaking at the rear of the connector. If that were the case, the melting would be happening near where the cable joins the connector, and you'd see it melt the shrinkwrap and the actual cable. It isn't. The images all show the melting occurring right at the tip of the connector, where it meets the GPU socket.
It isn't a break in the connector due to stress or damage; it's the connector not being plugged in correctly and having poor contact with the GPU socket. Am I wasting my time trying to explain this to people?

XoTNTBD.jpg


iRO7Cas.jpg
3x6cDtG.jpg
77tiRjJ.jpg
74nEpA5.jpg
84AZHgQ.jpg
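For what it's worth, both explanations boil down to the same physics: the heat generated at a junction is P = I²R, so the argument is really about where the extra resistance sits, at a cracked solder bridge behind the pins or at a half-seated pin in the socket. A minimal sketch with assumed (not measured) contact resistances:

```python
# Illustrative only: why one bad pin contact can cook while the rest of the
# cable stays at unremarkable temperatures. Resistance values are assumptions.
PINS_12V = 6
TOTAL_CURRENT = 600 / 12.0                  # ~50 A at a 600 W load
amps_per_pin = TOTAL_CURRENT / PINS_12V     # ~8.3 A if shared evenly

for label, contact_resistance_ohm in [("well-seated pin", 0.005),
                                      ("loose / partially mated pin", 0.050)]:
    watts_at_contact = amps_per_pin ** 2 * contact_resistance_ohm
    print(f"{label}: ~{watts_at_contact:.2f} W dissipated right at the pin")
```

A few watts dumped into a couple of cubic millimetres of brass and nylon, hours at a time, can be enough to soften the housing even while the cable as a whole reads an unremarkable temperature, which would square with Igor's 68°C result on an intact, properly seated adapter.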



There’s no way they could possibly tell what cable a user plugs into the card…

If you have a burnt card connector and your power cable is normal I think they'd be able to spot it.
 

HoofHearted

Member
Discussing what is causing the issue isn't arguing over semantics, it's the title of the thread.

The title of the thread is "RTX 4090 12VHPWR cable appears to be very dangerous" - which is correct - the cable provided does appear to be very dangerous.

The "cause" has been identified - it's the cable/adapter - to that end - there's no need for discussion - the question now is - What is NVIDIA going to do about it?

The semantics here is arguing and postulating that it is user error contributing to these cables melting - which is absurd.

If it's preventable by someone actually plugging the cable in correctly then I think it does matter.

This isn't a "user" problem - people have been plugging in power sources to their GPUs for years without problems. This is a new product provided to the market with what essentially amounts to is a poorly engineered and horribly designed patch cable...

Your example is analogous to blaming the drivers for running into Ford Pintos and causing them to explode - when the problem was clearly with the Pinto fuel tanks.
 

benno

Member
The "cause" has been identified - to that end - there's no need for discussion - the question now is - What is NVIDIA going to do about it?
I've just made a post showing they're wrong.

The semantics here is arguing and postulating that it is user error contributing to these cables melting - which is absurd.
What's absurd is you turning up and telling me I can't post in a thread about a connector issue when I'm discussing a connector issue.

This isn't a "user" problem - people have been plugging in power sources to their GPUs for years without problems. This is a new product provided to the market with what essentially amounts to is a poorly engineered and horribly designed patch cable...
And I've just shown you that it is.

Do you have a point or do you just want to play boss for the day?
 

M1chl

Currently Gif and Meme Champion
Discussing what is causing the issue isn't arguing over semantics, it's the title of the thread.


If it's preventable by someone actually plugging the cable in correctly then I think it does matter.

Everyone is saying that it's to do with the cables breaking at the rear of the connector. If that were the case, the melting would be happening near where the cable joins the connector, and you'd see it melt the shrinkwrap and the actual cable. It isn't. The images all show the melting occurring right at the tip of the connector, where it meets the GPU socket.
It isn't a break in the connector due to stress or damage; it's the connector not being plugged in correctly and having poor contact with the GPU socket. Am I wasting my time trying to explain this to people?

XoTNTBD.jpg


iRO7Cas.jpg
3x6cDtG.jpg
77tiRjJ.jpg
74nEpA5.jpg
84AZHgQ.jpg





If you have a burnt card connector and your power cable is normal I think they'd be able to spot it.
So you are saying that, even though it's metal, it cannot transfer heat from the plates? Look again: those side pins have just a single plate (per pin). It definitely happens at the solder joints.
 

benno

Member
So you are saying that, even though it's metal, it cannot transfer heat from the plates? Look again: those side pins have just a single plate (per pin). It definitely happens at the solder joints.
I'm saying that it has heatshrink near the cable end. If the heat were at that end, the thin heatshrink would melt first, along with burns on the cable sleeve. That isn't happening. It's all happening at the pin end, the opposite end to the one the YouTube vids are pointing at. If the connector were breaking at the point the YouTube videos are claiming, that would be the point of resistance and where the heat would first build up. I've yet to see signs of heat build-up at that point on any of the burnt cables.
 

HoofHearted

Member
I've just made a post showing they're wrong.

Interesting - are you a qualified electrical engineer?

What's absurd is you turning up and telling me I can't post in a thread about a connector issue when I'm discussing a connector issue.
And I've just showed you that it is.
Do you have a point or do you just want to play boss for the day?

Don't be obtuse - where did I say you can't post here? I'm simply questioning your "(il)logical conclusion" that the root cause of the issue is user error.

It's obvious that you clearly haven't (or refuse to) read up on the latest details, or watched the latest videos, outlining and clearly identifying the root problem.

All you've "shown" is a bunch of pictures attempting to justify your odd stance that it's somehow tied to people plugging in the adapter.

News Flash: It's not.

The problem is that NVIDIA packaged a shitty and cheap adapter with its $1600+ cards that didn't even meet the same level of quality as their last gen (3xxx) cards.

I'll even make it easy for you - here's a link to the site that walks through and clearly identifies the root cause of the issue.

Here's the summary from the site that clearly outlines the root cause of the issue - I've even bolded and italicized the key points for you that specifically detail this isn't a "user" issue just in case you miss them or fail to comprehend them:

Summary and conclusion

The overall build quality of the included adapter for the GeForce RTX 4090, which is distributed by NVIDIA itself, is extremely poor and the internal construction should never have been approved like this. NVIDIA has to take its own supplier to task here, and replacing the adapters in circulation would actually be the least they could do. I will therefore summarize once again what has struck those involved (myself included) so far:

  • The problem is not the 12VHPWR connection as such, nor the repeated plugging or unplugging.
  • Standard compliant power supply cables from brand manufacturers are NOT affected by this so far.
  • The current trigger is NVIDIA’s own adapter to 4x 8-pin in the accessories, whose inferior quality can lead to failures and has already caused damage in single cases.
  • Splitting each of the four 14AWG leads onto each of the 6 pins in the 12VHPWR connector of the adapter by soldering them onto bridges that are much too thin is dangerous because the ends of the leads can break off at the solder joint (e.g., when kinked or bent several times).
  • Bending or kinking the wires directly at the connector of the adapter puts too much pressure on the solder joints and bridges, so that they can break off.
  • The inner bridge between the pins is too thin (resulting cross section) to compensate the current flow on two or three instead of four connected 12V lines.
  • NVIDIA has already been informed in advance and the data and pictures were also provided by be quiet! directly to the R&D department.

My point? Educate yourself by fully reading and comprehending what's out there - not by just looking at the pretty pictures and making assumptions.
 

benno

Member
What's absurd is that you clearly haven't (or refuse to) read up on the latest details or watched the latest videos outlining and clearly identifying the root problem.

All you've "shown" is a bunch of pictures attempting to justify your odd stance that it's somehow tied to people plugging in the adapter.

News Flash: It's not.

The problem is that NVIDIA packaged a shitty and cheap adapter with its $1600+ cards that didn't even meet the same level of quality as their last gen (3xxx) cards.

I'll even make it easy for you - here's a link to the site that walks through and clearly identifies the root cause of the issue.

Here's the summary from the site that clearly outlines the root cause of the issue - I've even bolded and italicized the key points for you that clearly outline this isn't a "user" issue just in case you miss them or fail to comprehend them:
Just because your favourite youtuber says something doesn't make it so. None of the youtubers have managed to get the cables to overheat. Even the one you linked to.

Here's the quote from him/your link where he tried to get it to overheat and failed...
I also measured the 12VHPWR using resistance sensors and was able to determine absolutely unsuspicious temperatures (picture above). At plenty of 530 watts it was under 60 °C and even the maximum temperature of just under 68 °C I could only prove after an hour with 600 watts in the stress test. But definitely nothing melts there yet!
So after all his fucking around, all he managed to do was throw 600 W down it and get it to rise to 68°C, which wasn't hot enough to break a sweat.

There have been four different attempts by various YouTubers to cut cables, cut connectors, bend connectors, etc. to try and replicate the overheating of the connector, and nobody has managed to do it. Nobody. And here you are blindly telling me that it's a done deal when it clearly isn't.

Refute what I posted.
 

HoofHearted

Member
Just because your favourite youtuber says something doesn't make it so. None of the youtubers have managed to get the cables to overheat. Even the one you linked to.

Here's the quote from him/your link where he tried to get it to overheat and failed...

So after all his fucking around, all he managed to do was throw 600 W down it and get it to rise to 68°C, which wasn't hot enough to break a sweat.

There have been four different attempts by various YouTubers to cut cables, cut connectors, bend connectors, etc. to try and replicate the overheating of the connector, and nobody has managed to do it. Nobody. And here you are blindly telling me that it's a done deal when it clearly isn't.

Refute what I posted.

LMFAO - I ... just... can't any more...

You really should go back and actually read and attempt to fully understand and comprehend the ENTIRE article, including the particular statement/section that you just quoted back to me here.

If you want to die on this hill - go right ahead ...

Ignorance is bliss...

Holy shit fuck. Now I've seen it all.

 

benno

Member
LMFAO - I ... just... can't any more...

You really should go back and actually read and attempt to fully understand and comprehend the ENTIRE article, including the particular statement/section that you just quoted back to me here.

If you want to die on this hill - go right ahead ...

Ignorance is bliss...

Holy shit.


Mate, refute me or fuck off. You've come to the thread and just started having a personal go at me. At no point have you read anything I've posted or tried to refute any of my posts. Nothing you've posted so far has anything to do with the actual thread.
 

HoofHearted

Member
Mate, refute me or fuck off. You've come to the thread and just started having a personal go at me. At no point have you read anything I've posted or tried to refute any of my posts. Nothing you've posted so far has anything to do with the actual thread.
I have provided valid responses to everything you’ve stated. Everything I’ve posted is directly related to the topic at hand.

You’ve posted pictures.

There’s nothing further to “refute” here because what you’re attempting to quantify is illogical and based on a complete lack of comprehension and understanding on your part.

I’m not going to continue arguing with someone that has the mental capacity of a brick wall.

Hopefully some semblance of logic will enlighten you when NVIDIA figures out that a recall might be prudent.

Good luck to you - and please, for the sake of humanity, don’t have any children.
 

benno

Member
I have provided valid responses to everything you’ve stated. Everything I’ve posted is directly related to the topic at hand.

You’ve posted pictures.

There’s nothing further to “refute” here because what you’re attempting to quantify is illogical and based on a complete lack of comprehension and understanding on your part.

I’m not going to continue arguing with someone that has the mental capacity of a brick wall.

Hopefully some semblance of logic will enlighten you when NVIDIA figures out that a recall might be prudent.

Good luck to you - and please, for the sake of humanity, don’t have any children.
That's it? You're not refuting anything; you're here to throw some petty insults, show the world your .gif animation upload skills, and boast about some imaginary superiority you believe you have. Your whole point was that a man on YouTube says something different and I'm thick.

Awesome. You certainly showed me.
 

OZ9000

Banned
It appears this problem is inherent to the Nvidia adapter and the fact the terminations are soldered and not crimped.

 

KungFucius

King Snowflake
I ain't drinking the koolaid till I see someone actually making it.
I've woken up in New Orleans when I started the night in Houston one too many times to just drink this shit up.


I see what you did there.
XJkqo59.png



The old way was to have the xx70 match and/or beat the range topper of the prior generation.
Basically wait two years to own the xxTi of last gen with less memory.
The "4070" will only be a success if it can absolutely walk a 3090Ti.

The 4080 12GB was an embarrassment. Nvidia had the biggest balls thinking of charging 900+ dollars and calling it an xx80... even if they rebadge it as a 4070 it's probably the weakest xx70 of all time.
JMdnqA3.png


Nvidia's greed has totally clouded their judgment, but consumer sheepism will continue to provide them with success.

The 4080 should have been on the same AD102 as the 4090, just cut down, with 20GB of memory.
The 4070 should have been a cut-down AD103.
The 4070 Ti would/should have been what's called the 4080 16GB right now.
If we went by last gen, the 4080 should have been the 4090 die with less RAM, priced at $850 or so, and the 4090 should have had more cores unlocked, leaving a little overhead for the 4090 Ti. What we got instead was "Let's jack up the prices of the 4070/Ti and call them 4080s to make the 3000 series at their launch prices still look like a good move, because we have way too many of them left."

I am not sure how you can call it sheepism when there is no competition. It is market dominance and people just dealing with it like we deal with every other big company with minimal competition and their price gouging bullshit.
 

winjer

Gold Member
BTW, can someone guess what type of connector Nvidia's Ampere cards for the professional market use?
rtx-a6000.jpg
 

HoofHearted

Member
That's the 12th recorded instance so far.

Buildzoid thinks it is happening because of the connector design:
7z7ht493hmw91.jpg
Design of the connector or the plug/adapter? That’s the first time I’ve seen a pic of the actual connector burnt.

This issue is going to spread like wildfire.
 

dave_d

Member
Isn't this guy famous for fear mongering and shitty takes?
Well, that and the time he couldn't open a box. (OK, it was a crate and he destroyed the crate to get the computer out. I was thinking, "There's got to be more screws, you moron, go look for them.")
 

rofif

Can’t Git Gud
I don't get it, why are the FE plugs straight 90-degree on the 40xx cards?
WTF were they thinking? The 4-pin also seems kinda unnecessary for new power supplies.

The solution on the 3080 actually looks OK. I like the chunky connectors, and the angle gives me just enough room so I can manoeuvre the cables nicely. And the P600S case has that proper trapdoor for PCI cables right below.

DQryZqA.jpg

5mEKe45.jpg
 

skneogaf

Member
I wonder how well my 3090ti adapter will work on a 4090?

It has a sleeve but I'm not sure if it has a plate or is pinned into the connector like the 3090 adapter.
 

Fredrik

Member
It appears this problem is inherent to the Nvidia adapter and the fact the terminations are soldered and not crimped.


Soldering is fine if done well, but what's shown in some teardown photos should never pass a quality check. It looks like first-day-at-school soldering: cold solder joints, cut wires, and insulation melted into the solder joint. I mean, what is this?
wOaxszL.jpg

VE0sC6L.jpg
 

rofif

Can’t Git Gud
So did the 3080 FE cables have this issue? I don't remember hearing about it, and I've been using mine with the original Nvidia adapter for 2 years, like I posted above.
I know it's technically a different cable, but it's still a tiny connector.
 

LiquidMetal14

hide your water-based mammals
Is there a way to get a breakdown of what they were doing and the temps? I get that some may have been bent awkwardly, but knowing how far they were pushing would help. I may end up getting the separate adapter or cables in the future, but I'm not pushing the full load on the card. It's actually the best OC I've had on the core and mem OCs. Referring to the Gigabyte Gaming OC model.
 

Fredrik

Member
Is there a way to get a breakdown of what they were doing and the temps? I get that some may have been bent awkwardly, but knowing how far they were pushing would help. I may end up getting the separate adapter or cables in the future, but I'm not pushing the full load on the card. It's actually the best OC I've had on the core and mem OCs. Referring to the Gigabyte Gaming OC model.
I would say it’s about quality control. The soldering is absolutely horrible and some cables are bad. Hard to say how the soldering is on the good cables but if you’re not bending the cable close to the connector and if you’ve been using it for awhile already and you can’t see any results from heating, then I can’t see why you would have to worry.
I’ve checked my cable with a heat camera, looked at the pins and connector, made sure to push it in until the locking clicks, and I don’t worry about it anymore. Well, except that I’m not planning to bend it more than needed though, tight cable management is a no-no this time.
 

benno

Member


Please note: 26:10 into the video.

The video basically states that he's been testing 8 different cables for 72 hours and can't reproduce any of the theories floating around.
That Igor's Lab tested a cable that isn't representative of the build quality of all the cables he has, is wired with different-gauge, lower-rated wire, and has different shared connector plates.
That he can't recreate the break in the cable/solder joint that Jay2cents is claiming.
That what I've been posting regarding poorly seated connectors has "definite merit", but he hasn't managed to get a GPU to boot with the cable not secured correctly in the socket.
That it's still open for more testing, and the YouTubers etc. claiming to know what it is are talking through their arse.
That it's possibly just a bad batch of cables, a bad connection, or something else we don't know yet.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I am not sure how you can call it sheepism when there is no competition. It is market dominance and people just dealing with it like we deal with every other big company with minimal competition and their price gouging bullshit.
That's literally the definition of sheepism.
The RX 6000 cards are very, very competitive versus the RTX 30 cards, but sheep kept buying RTX.

CUDA and OptiX users I can understand, but gamers really should be seeing the RX 6000 cards as a true option.
But look at the Steam hardware charts.

average-fps_2560_1440.png
 

benno

Member
I wonder how well my 3090ti adapter will work on a 4090?

It has a sleeve but I'm not sure if it has a plate or is pinned into the connector like the 3090 adapter.
If it has a different pinout, it'll be an interesting smoke machine for a few seconds.
 