
RTX 4090 12VHPWR cable appears to be very dangerous

zcaa0g

Banned
Thats literally the definition of sheepism.
The RX6000 cards are very very competitive versus the RTX30s but sheep kept buying RTX cards.

CUDA and OptiX users i can understand but gamers really should be seeing the RX6000 cards as a true option.
But look at the steam hardware charts.

average-fps_2560_1440.png


And now show Ray tracing benchmarks.
 

KungFucius

King Snowflake
Thats literally the definition of sheepism.
The RX6000 cards are very very competitive versus the RTX30s but sheep kept buying RTX cards.

CUDA and OptiX users i can understand but gamers really should be seeing the RX6000 cards as a true option.
But look at the steam hardware charts.

average-fps_2560_1440.png

AMD cards being "competitive" does not mean there is competition. The RX cards were not the best, and as the lower-market-share brand there are fewer of them available. Further, they launched at absurd prices compared to the comparable Nvidia cards. AIB 6800 XTs were $800+ at launch and were worse than the 3080 at RT. If you want to call that competition then fine, but real competition is having several or many players that actually compete on quality and price so the customer has some power. I don't see AMD competing seriously on quality or price. Nvidia sets the price, AMD puts out weaker products for 50 bucks less for the reference cards, and then AIBs jack prices up. People are hoping they will show something different this week, but it's probably going to be more of the same: cards that do as well as or better than the Nvidia cards in the targeted price bracket, with worse RT performance, for marginally less, at 1/4 the overall availability compared to Nvidia.

I can't really speak to people buying last gen cards today. From what I have seen, the AMD discounts look good, but I would never go after 2 year old tech when something new is launching.
 

DonkeyPunchJr

World’s Biggest Weeb
In what games, considering the top 50 games on Steam don't have RT?
Uh Cyberpunk is the second best selling game on Steam right now (after MW2). And most of the top games on Steam can run on a potato anyway.

Just stop this shit, you aren’t doing the red team any favors. “RDNA2 is just as good as RTX, the only reason people buy RTX is because they’re stupid Nvidia sheep!!!” “What about ray tracing and DLSS?” “Well… I don’t care about those and neither should you!!”
 

daffyduck

Member
Uh Cyberpunk is the second best selling game on Steam right now (after MW2). And most of the top games on Steam can run on a potato anyway.

Just stop this shit, you aren’t doing the red team any favors. “RDNA2 is just as good as RTX, the only reason people buy RTX is because they’re stupid Nvidia sheep!!!” “What about ray tracing and DLSS?” “Well… I don’t care about those and neither should you!!”
Turning that around: "I care about RTX and DLSS, so everyone else should!"

Raw processing power still rules the day.
 

rofif

Banned


Please note: the relevant section starts at 26:10 in the video.

The video basically states that he's been testing 8 different cables for 72 hours and can't reproduce any of the theories floating around:

That Igor's Lab tested a cable that is not representative of the build quality of all the cables he has, and is wired with different-gauge, lower-rated wire and has different shared connector plates.
That he can't recreate the break in the cable/solder joint which JayzTwoCents is claiming.
That what I've been posting regarding poorly seated connectors has "definite merit", but he hasn't managed to get a GPU to boot with the cable not secured correctly in the socket.
That it's still open for more testing, and the youtubers etc. claiming to know what it is are talking through their arse.
That it's possibly just a bad batch of cables, a bad connection, or something else we don't know yet.

Nothing. Absolutely nothing.
They are looking hard for a controversy and they can't find it.
This is really silly
 

radewagon

Member
Even if there is a recall (which won't happen), I am not sending my card back. It's a miracle I was able to get an FE. Now I am supposed to return it? Yeah, never.
Weird hill to die on. By your own admission it's very unlikely for a recall to happen. Which means, that if it does happen, it'll only be because the risk for the end-user is substantial. In the event of a recall, I hope you reconsider.
 

HoofHearted

Member
Nothing. Absolutely nothing.
They are looking hard for a controversy and they can't find it.
This is really silly
Something's clearly up. Per the megathread on Reddit, this issue is now occurring across the board, on several AIBs and multiple cards/products:

Confirmed cases: Asus, Gigabyte, MSI, Galax

Zotac has just been reported on their cards - but not confirmed (yet).

And that's only the people that are going out of their way to report it.
 

DonkeyPunchJr

World’s Biggest Weeb
Turning that around: "I care about RTX and DLSS, so everyone else should!"
Sure. If you don’t care about either of those and you don’t expect them to matter for the lifespan of your GPU, you should buy AMD.

But for fuck’s sake don’t be like “look at this comparison chart that only includes raster performance, you have to be a stupid sheep to buy Nvidia.” That is a really dishonest fanboyish thing to do and you deserve to be called out on it if you do that.

Raw processing power still rules the day.
What? No. Raster, ray tracing, and DLSS are all different things. You can’t simplify it down to one single dimension.
 

benno

Member
Nothing. Absolutely nothing.
They are looking hard for a controversy and they can't find it.
This is really silly
Possibly. In a month's time it'll have blown over. Nothing will have changed: no recall, no new cables sent out. We'll discover they've sold 50,000 GPUs or whatever, and the RMA figures are the expected typical returns they see on other GPU releases, but youtubers, thanks to this created hysteria, will have had their clicks boosted in the lull before the AMD release.
 

Chiggs

Gold Member
RT isn't a magic bullet. And I say that as a long time Nvidia user.

I think it's becoming increasingly difficult to make that argument as more and more games continue to implement these features.

And if you're using your card for gaming AND for any sort of production work that involves rendering, the RT cores in Nvidia GPUs are must-haves.
 

Fredrik

Member
Nothing. Absolutely nothing.
They are looking hard for a controversy and they can't find it.
This is really silly
Feels like there are different tech groups trying to disprove each other, plus they all want clicks.
I find it interesting that Igor's adapter had cables specced lower than all of GN's adapters; the soldering was much worse as well. Could be two different cable adapter manufacturers, and the bad one hasn't made as many.
 

benno

Member
Feels like there are different tech groups trying to disprove each other, plus they all want clicks.
I find it interesting that Igor's adapter had cables specced lower than all of GN's adapters; the soldering was much worse as well. Could be two different cable adapter manufacturers, and the bad one hasn't made as many.
I don't think it mattered anyway. J2C, Igor's Lab and GN all came out with the same 65-68°C cable temps when they butchered them.
If anyone still believes it's breaks and bends in the cables causing this, the onus of proof is on them, as that's 5 different people who have all done the tests with no melting.
 

//DEVIL//

Member
Weird hill to die on. By your own admission it's very unlikely for a recall to happen. Which means, that if it does happen, it'll only be because the risk for the end-user is substantial. In the event of a recall, I hope you reconsider.
Because in the worst case scenario you just... wait for it... buy a cable from CableMod or an ATX 3.0 PSU. Tadaaaaaaaaaaa

Problem solved. This is just some drama queens online. The 3090 Ti had the same plug. What happened? Why did no one talk about it? Oh wait... we know why. Clickbait and drama queens.

Nothing justifies returning an FE card. They are unicorns.
 

Fredrik

Member
I don't think it mattered anyway. J2C, Igor's Lab and GN all came out with the same 65-68°C cable temps when they butchered them.
If anyone still believes it's breaks and bends in the cables causing this, the onus of proof is on them, as that's 5 different people who have all done the tests with no melting.
Has anyone done any tests when the plug isn’t pushed all the way in? There is some force needed before the lock clicks in place.
 

benno

Member
Has anyone done any tests when the plug isn’t pushed all the way in? There is some force needed before the lock clicks in place.
Only GALAX. They threw 1500W down the cable, disconnected it a little, and it started to overheat. They concluded it was the cause of the overheating issue, but everyone else just ignored it.

The video linked is watchable if you set the subtitles to English.
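For scale, the currents involved in that test can be sanity-checked with a quick calculation. A minimal sketch, assuming a 12V rail and the connector's six +12V pins sharing the current evenly when fully seated (these assumptions are illustrative, not figures from the video):

```python
# Per-pin current at a given load, assuming a 12 V rail and six live
# (+12 V) pins sharing the current evenly. Illustrative only: a loose
# plug breaks the even-sharing assumption and concentrates current.

def per_pin_current(load_watts: float, volts: float = 12.0,
                    live_pins: int = 6) -> float:
    total_amps = load_watts / volts
    return total_amps / live_pins

print(per_pin_current(450))   # 6.25 A per pin at a 4090's 450 W load
print(per_pin_current(1500))  # ~20.8 A per pin in the 1500 W stress test
```

Even at 1500W the wiring itself survived; the point of the partial-disconnect result is that a poorly seated plug forces the same total current through fewer pins and through worse contacts.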

 

//DEVIL//

Member
Only GALAX. They threw 1500W down the cable, disconnected it a little, and it started to overheat. They concluded it was the cause of the overheating issue, but everyone else just ignored it.

The video linked is watchable if you set the subtitles to English.

Yup, I did notice it. When I put it in the first time it didn't click; it sounded weird to me. So I plugged it in again, slowly but firmly, and fair enough, it did click. I kinda knew this was a big reason some people got this issue.

Also, according to Steve, Igor's cable is different from all 5 of the cables he tested (a 150V vs 300V wire rating difference), and the way they are connected is also different. So it could be some old version of the cables shipped by mistake to some unlucky people.
 

benno

Member
So it could be some old version of the cables shipped by mistake to some unlucky people.
If you read the reddit sticky, people are claiming to have both 150V cables and 300V cables, with bad and good solder joints etc., but in all the tests they couldn't get the bad cables to overheat anyway.
 

daffyduck

Member
What? No. Raster, ray tracing, and DLSS are all different things. You can’t simplify it down to one single dimension.
Sure you can. RTX and DLSS are pretty recent things; not all devs have jumped, or even want to jump, on board yet.

I wasn't the one calling Nvidia buyers (I am one, too) sheep, but yes, the new features are often used by fanbois to completely dismiss AMD.
 

Chiggs

Gold Member
Hey, CableMod guys...my order for some standalone 12vhpwr Asus/Seasonic cables literally just shipped yesterday...after about 9 days in processing. :(

Not sure if any of you are experiencing the same thing, but I thought I would check in. I'd imagine they're being hammered right now.
 

Fredrik

Member
Only GALAX. They threw 1500W down the cable, disconnected it a little, and it started to overheat. They concluded it was the cause of the overheating issue, but everyone else just ignored it.

The video linked is watchable if you set the subtitles to English.


Yup, I did notice it. When I put it in the first time it didn't click; it sounded weird to me. So I plugged it in again, slowly but firmly, and fair enough, it did click. I kinda knew this was a big reason some people got this issue.
I did the same thing: first no click, then *click*.
Sounds like this might be one reason for the melting then.
Possibly combined with bad quality soldering in some cases, but GN kinda threw that idea under the bus when they cut a solder joint and it still only increased temps a few degrees.
 

DonkeyPunchJr

World’s Biggest Weeb
Sure you can. RTX and DLSS are pretty recent things. Not all devs have or even want to jump on board yet.
Alright dude. Well I’ll be over here enjoying the superior experience in Cyberpunk, Flight Simulator, and Doom Eternal. Good luck convincing anybody with your idiotic “the only thing that matters is raw power because not all games use DLSS and ray tracing” argument.
 

OZ9000

Banned
Alright dude. Well I’ll be over here enjoying the superior experience in Cyberpunk, Flight Simulator, and Doom Eternal. Good luck convincing anybody with your idiotic “the only thing that matters is raw power because not all games use DLSS and ray tracing” argument.
For AMD to be remotely competitive with their next cards they will have to match DLSS 2.0 at a bare minimum.

FSR looks crap compared to DLSS especially in a game like Cyberpunk.
 

Chiggs

Gold Member


Steve refuting the findings from Igor's Lab.

Edit: Whoops...missed the previous post. Beaten like a red-headed stepchild.

For AMD to be remotely competitive for their next cards they will have to match DLSS 2.0 at the bare minimum.

FSR looks crap compared to DLSS especially in a game like Cyberpunk.

The leaks are indicating a 2x rasterization improvement and RT performance that is a little higher than Ampere. Can't wait for Thursday.
 

OZ9000

Banned


Steve refuting the findings from Igor's Lab.

Edit: Whoops...missed the previous post. Beaten like a red-headed stepchild.



The leaks are indicating a 2x rasterization improvement and RT performance that is a little higher than Ampere. Can't wait for Thursday.

They need a viable answer to DLSS 2.0 which offers awesome performance and IQ.

That feature is going to decide whether I go for a 4080 or 7800/7900 XT.
 

Chiggs

Gold Member
They need a viable answer to DLSS 2.0 which offers awesome performance and IQ.

That feature is going to decide whether I go for a 4080 or 7800/7900 XT.

I agree. DLSS is a damn good feature. I'm rooting for AMD to deliver the goods. Could be some rather fortuitous timing for them...if they can nail all aspects (pricing, performance, and the most scary of all: availability).
 

daffyduck

Member
Alright dude. Well I’ll be over here enjoying the superior experience in Cyberpunk, Flight Simulator, and Doom Eternal. Good luck convincing anybody with your idiotic “the only thing that matters is raw power because not all games use DLSS and ray tracing” argument.

Good luck convincing anyone you're not an Nvidia fanboi.

And, since you asked:

Sheep GIF by Celebrity Apprentice Australia
 

Buggy Loop

Member
What’s the point of wanting >3090 class on either Nvidia or AMD if it’s not for ray tracing? Please tell me.

Pure rasterization games are already running at ridiculous framerates unless you’re aiming for something like 8k. What FPS do you need? 200? 300?

This GPU range is only going to shine and flex its muscles with ray tracing.
 

nikos

Member
Add me to the list of people who haven't experienced an issue. I've checked my cable twice and felt like an idiot doing so.

There are currently 15 confirmed cases out of how many cards sold? I guarantee people mishandled or bent the cable; in the cases I've read, they even admitted to it. We were told before the cards launched not to do that.

/r/nvidia is fucking horrible. Unless you post the popular opinion of "fuck Nvidia" you get downvoted to hell. Those same hypocrites are still buying Nvidia cards.

Jay, Nexus etc. have most of those idiots brainwashed while they profit from making bullshit videos. Same morons who made something out of nothing with the 30 series POSCAPs

CableMod is probably absolutely loving this. Free money as a result of fear mongering. I was going to grab one of those cables to tidy up but I don't think I care to anymore.
 
What’s the point of wanting >3090 class on either Nvidia or AMD if it’s not for ray tracing? Please tell me.

Pure rasterization games are already running at ridiculous framerates unless you’re aiming for something like 8k. What FPS do you need? 200? 300?

This GPU range is only going to shine and flex its muscles with ray tracing.
VR for me
 

Chiggs

Gold Member
CableMod is probably absolutely loving this. Free money as a result of fear mongering. I was going to grab one of those cables to tidy up but I don't think I care to anymore.

The phrase "Cutting off one's nose to spite one's face" comes to mind.

The adapters are absolutely hideous, and if you have a showcase PC, CableMod's solution seems to be the only sane choice, especially if you're unlucky enough to have the 4-plug adapter.
 

OZ9000

Banned
What’s the point of wanting >3090 class on either Nvidia or AMD if it’s not for ray tracing? Please tell me.

Pure rasterization games are already running at ridiculous framerates unless you’re aiming for something like 8k. What FPS do you need? 200? 300?

This GPU range is only going to shine and flex its muscles with ray tracing.
4k60

I don't care about RT in its current iteration. Adds very little to most games at present.
 

I Master l

Banned


I find this comment interesting
I think Buildzoid's theory is the most probable at the moment. It's not about the cable side, it's about the pins (the actual metal blades in the pins) on the connector side. The cables didn't melt (mostly), the pins did, and it seems to be exclusive to the Nvidia connector, which has lower build quality pins. I think it's definitely worth checking out.
 

mitchman

Gold Member


Steve refuting the findings from Igor's Lab.

Edit: Whoops...missed the previous post. Beaten like a red-headed stepchild.

Igor's Lab used a 150V-rated cable; GN tested a 300V-rated cable, as they said several times, so saying they "refuted" Igor's Lab would be wrong here. The question is where Igor got that cable from.
 

Chiggs

Gold Member
Igor's Lab used a 150V-rated cable; GN tested a 300V-rated cable, as they said several times, so saying they "refuted" Igor's Lab would be wrong here. The question is where Igor got that cable from.

I think they just said that to avoid a fracas. Look at the post above mine. It’s starting to seem like we have some edge cases that are being elevated, and it looks like a sampling of people were either sent the wrong adapter or a lower quality one made it through the gates.
 

HoofHearted

Member
Igor's Lab used a 150V-rated cable; GN tested a 300V-rated cable, as they said several times, so saying they "refuted" Igor's Lab would be wrong here. The question is where Igor got that cable from.
Igor wasn't the only person with a 150V cable; there are other documented cases showing the 150V-rated cable...

EDIT: there are also cases where the solder/plate connections match Igor's cable but use 300V wire, so there's clearly a variety of different cable configurations out in the wild.

 

benno

Member


TecLab/GALAX tries again to melt the cable with 1500W, bends, breaks, and a loosely seated connector.

From the video:

Clarifies that the new connector is the new ATX standard, not an Nvidia standard, so it has been thoroughly tested.
The connector is built to withstand high sustained loads and very high spike loads.
To do the test they removed an actual socket from a 4090 card and used the cable supplied with that same card.

With a poorly fitted connector, slightly pulled out to one side, the temps rose to 76°C with just 450W.

Proceeds to swing the cables and PSU around, split the cables, and mangle the connectors for a worst-case extreme scenario, along with 1500W of power, and manages to raise temps to 145°C, but still no melting.

Reduced the load back to 600W and the temps dropped back within spec.

Conclusion:

If you don't fit the cable correctly and it's seated loosely, it will generate a lot of heat.
Bending the cable isn't recommended, as it's against the manufacturer's guidelines, but it didn't increase temps when they did it, so it's at the owner's risk.

The cause is either a currently unknown manufacturing defect or people not connecting the cable correctly.
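The loose-connector conclusion is consistent with simple I²R heating at the pin contacts. A rough sketch, with assumed (purely illustrative) contact resistance values rather than measured figures from any of the tests:

```python
# Heat dissipated at a single pin contact, P = I^2 * R.
# contact_ohms values are assumed for illustration: a few milliohms for
# a well-seated crimp contact, an order of magnitude more for a loose one.

def contact_heat_watts(load_watts: float, contact_ohms: float,
                       volts: float = 12.0, live_pins: int = 6) -> float:
    amps = load_watts / volts / live_pins  # current through one pin
    return amps ** 2 * contact_ohms       # power dissipated at that contact

print(contact_heat_watts(450, 0.005))  # seated contact: ~0.2 W
print(contact_heat_watts(450, 0.050))  # loose contact: ~2 W at a single pin
```

A couple of watts concentrated in a contact the size of a pin tip runs very hot, which would fit the 76°C reading with a partially pulled-out plug.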

GALAX HOF. GALAX must be doing this on purpose to make Reddit lose their minds
 

KyoZz

Tag, you're it.
Is there a way to buy a good cable right now? And if so, any recommendations? Just ordered a Gigabyte 4090 Gaming OC and I'd like to avoid using the Nvidia one.
 

benno

Member
Now folks with native 3.0 power supplies are having burnt cables...
I found this amusing.
The company that is making a killing selling replacement cable adapters, to people too scared to use the supplied Nvidia one, comes out with this response the moment it isn't an Nvidia adapter...
07lsyjB.jpg
 

Celcius

°Temp. member
I found this amusing.
The company that is making a killing selling replacement cable adapters, to people too scared to use the supplied Nvidia one, comes out with this response the moment it isn't an Nvidia adapter...
07lsyjB.jpg
They also posted another reply in the thread as well:


14oQLgO.png
 

benno

Member
They posted another reply in the thread as well:
A bit too late though, isn't it? They were the ones who helped jump-start this shitstorm, and profited from it, by posting the little pamphlet images showing how bending the Nvidia adapter causes these issues. Suddenly, when everyone realises it isn't bending the adapter and it can happen to their cables too, it's a quick U-turn to "user error", cables not pushed in correctly, and talk of stopping sales. How about they refund all the people they sold £100 cables to out of fear that the Nvidia cable would burn their houses down?
 