
Reports of Bricked NVIDIA GeForce RTX 5090 and RTX 5090D Surge

rodrigolfp

Haptic Gamepads 4 Life
Camped for a week outside the shop's doors to buy a $5,000 video card, and it bricked 3 days later. PC gaming is marvelous
RROD and YLOD after the warranty ends is better???
Imagine buying a $5k prebuilt PC just to get a 5090 and you get a brick lmfao. My 4090 will stay with me for a few more years.
Imagine buying a prebuilt, plus without a warranty.
 
Last edited:

Dorago

Member
Has this "Nvidia most successful company ever" thing just been a huge money laundering scam?

They seem to be spending plenty of money on ad campaigns, PowerPoint presentations, and public private partnerships but don't seem to have produced a GPU that was worth a shit since the 1080.

Their new gen connector catching fire even after they recalled it the first time is a huge red flag too.
 

rm082e

Member
Has this "Nvidia most successful company ever" thing just been a huge money laundering scam?

They seem to be spending plenty of money on ad campaigns, PowerPoint presentations, and public private partnerships but don't seem to have produced a GPU that was worth a shit since the 1080.

Their new gen connector catching fire even after they recalled it the first time is a huge red flag too.
  • The 2000 series was a poor value proposition overall. But the 2080 Ti did have really good overclocking potential. I never saw anyone who bought that card complain about it.
  • People who bought 3000 series cards were quite pleased with them. Part of that is the 2000 series being disappointing, but the 3000 series was a solid performer at a time when a lot of people were playing more games due to quarantine.
  • The 4090 is still quite impressive, aside from the crazy high price.
  • I'm not about to defend any SKU in the 5000 series though. It's just a case of Nvidia having zero competition at this point. We saw the same thing with 5-8% improvements from one generation to the next during the quad core days of Intel.
Overall, I think what's changed is that the uplift from generation to generation is shrinking. As that happens, users' expectations also need to adjust. Vote with your wallet.

Their new gen connector catching fire even after they recalled it the first time is a huge red flag too.

I'd like to see how many examples of this happening to 5090s we actually get real-world evidence for. This hasn't been an issue on the 4090 for a while, so the idea that it's suddenly a problem again on the 5090 is suspicious. I'm guessing this is panic over a very small number of user-error instances.
 

Bitstream

Member
The 4090s were starting fires; at least this one doesn't take your house with it.

 

Crayon

Member
Well, it's been an entertaining launch so far, but I'll wait to see how widespread this really is. I wasn't convinced the power connector issue was really that serious until some more investigation came out. A lot of these scares spiral up from what is really just a few instances.
 

baphomet

Member
Has this "Nvidia most successful company ever" thing just been a huge money laundering scam?

They seem to be spending plenty of money on ad campaigns, PowerPoint presentations, and public private partnerships but don't seem to have produced a GPU that was worth a shit since the 1080.

Their new gen connector catching fire even after they recalled it the first time is a huge red flag too.

Do you even know what money laundering is?

The 10 series was pretty ass.

They never recalled their connector?
 

Celcius

°Temp. member


Ruh Roh
The user was using a 3rd-party cable, but still, it's the new 12V-2x6 connector, and it melted on both the video card side and the PSU side. He says it clicked and was secure on both sides too.
 
Last edited:

rofif

Can’t Git Gud
Reddit already has some melting-cable posts.
Anyway - the 5090 is not a real card that real people buy. I don't even consider it.
The bigger problem is that the 5080 is also unavailable, and the real price is double what they ask for.
 
Last edited:

Hohenheim

Member
Scary stuff!
I have a new ASUS TUF 1200W PSU (ATX 3.1), which has a 16-pin 600W cable for direct connection to the 5090.

Would you guys use that, or use the adapter cable that comes with the 5090 cards?
 

Ulysses 31

Member
Scary stuff!
I have a new ASUS TUF 1200W PSU (ATX 3.1), which has a 16-pin 600W cable for direct connection to the 5090.

Would you guys use that, or use the adapter cable that comes with the 5090 cards?
How much do you care about cable management? One vs 4 cables to power the 5090.
 
Reddit already has some melting-cable posts.
Anyway - the 5090 is not a real card that real people buy. I don't even consider it.
The bigger problem is that the 5080 is also unavailable, and the real price is double what they ask for.
Just save yourself the trouble and wait for the 6000 series instead.

I think the 5000 series is only worthwhile if you're upgrading from a lower-tier 3000 series card or any 2000 series GPU.
 

Hohenheim

Member
How much do you care about cable management? One vs 4 cables to power the 5090.
I prefer a single cable, but if it's safer to use 4 cables and the adapter, I'd rather do it that way.
The PSU only has three 8-pin inputs on it though, and two 16-pin inputs. Wouldn't it need four 8-pin inputs to use the adapter that comes with the cards?
I currently use the 16-pin cable that came with the PSU to power my 4070 Super, and don't use the 2x 8-pin adapter that came with that card.
 
Last edited:

RoboFu

One of the green rats
Nvidia is getting a dose of reality these past few months, huh? A massive stock drop, and now they're losing money on bad cards, which will turn into multiple class-action lawsuits.
 

Ulysses 31

Member
I prefer a single cable, but if it's safer to use 4 cables and the adapter, I'd rather do it that way.
The PSU only has three 8-pin inputs on it though, and two 16-pin inputs. Wouldn't it need four 8-pin inputs to use the adapter that comes with the cards?
I currently use the 16-pin cable that came with the PSU to power my 4070 Super, and don't use the 2x 8-pin adapter that came with that card.
Doesn't look like you have a choice with that PSU anyway but to use the 12VHPWR cable, unless you wanna run the 5090 at scaled-back power with 3 of the 4 8-pins used on the adapter. :messenger_winking_tongue:

I've not heard about multiple 8-pins -> 12VHPWR being better or worse for power delivery, only that it's worse for cable management.
 
Last edited:

PaintTinJr

Member
Scary stuff!
I have a new ASUS TUF 1200W PSU (ATX 3.1), which has a 16-pin 600W cable for direct connection to the 5090.

Would you guys use that, or use the adapter cable that comes with the 5090 cards?
If the single cable (with six wire pairs, so assuming 100 watts per pair) is rated correctly, you'll get marginally better power efficiency, and - assuming it's an all-or-nothing cable design - you can be sure that in operation the card will either have access to the necessary 600W or 0W. Whereas the adapter could have just one 150-watt port functional, run the card fine at low power, and then blow something when it steps up to draw more - or any other combination of 150, 300 or 450 watts that wouldn't report as failing but could risk damaging the card.

And assuming the adapter has more wire pairings than six (it does, because you'd have to adapt two 6-pins to an 8-pin), the extra metal means marginally more resistance, and marginally more energy wasted as heat.
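If anyone wants to play with those numbers, here's a rough Python sketch of why an uneven current split matters so much - the per-pair resistance is a made-up placeholder (not a measured value), so treat the wattages as illustrative only:

```python
# Rough sketch: resistive heating in the 12 V cable feeding the card.
# Assumptions (not measurements): 6 current-carrying wire pairs, a 12 V rail,
# and a placeholder round-trip resistance of 0.02 ohm per pair.
RAIL_V = 12.0
PAIRS = 6
R_PER_PAIR = 0.02  # ohm, hypothetical value purely for illustration

def pair_heat(currents):
    """Per-pair I^2 * R dissipation in watts for a list of pair currents (amps)."""
    return [round(i * i * R_PER_PAIR, 2) for i in currents]

# Even split: 600 W / 12 V = 50 A total, ~8.3 A per pair.
even_split = [600 / RAIL_V / PAIRS] * PAIRS
# A hypothetical badly imbalanced split (amps per pair) for comparison.
skewed_split = [23, 2, 5, 11, 8, 3]

print("even split heat per pair (W):  ", pair_heat(even_split))    # ~1.4 W each
print("skewed split heat per pair (W):", pair_heat(skewed_split))  # first pair ~10.6 W
# Dissipation grows with the square of the current, so one overloaded pair
# runs far hotter than six evenly loaded ones.
```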
 
Last edited:

Hohenheim

Member
Doesn't look like you have a choice with that PSU anyway but to use the 12VHPWR cable, unless you wanna run the 5090 at scaled-back power with 3 of the 4 8-pins used on the adapter. :messenger_winking_tongue:

I've not heard about multiple 8-pins -> 12VHPWR being better or worse for power delivery, only that it's worse for cable management.
Yeah, I've tried searching for info about this, and there seem to be no negatives to using the single 16-pin cable. Easier for cable management too, as you said. It's a new ATX 3.1 PSU, so I guess I can't really take any more safety measures, except for making sure the cable is properly plugged in, of course.
The adapter that comes with the 5090 has a yellow "safety thing" that easily lets you see if the connection is properly seated, but I have never had any issues connecting these cables, and since it's the official cable that came with the PSU it should hopefully be fine.

Using three 8-pin connectors for lower power usage could be an option though, since the card only loses about 5% performance while drawing a lot less power.
But it just feels wrong somehow not to connect all four if going that route... I'm probably overthinking this :)
 
Since there are so few 5080s and 5090s out there at the moment, due to demand far exceeding available stock (whether intentional or not), any issues with these cards are going to come across as comparatively worse than if they had been widely available.

Also, the people having issues will be far more vocal about the problems than those who aren't (you aren't going to shout out on an internet forum that your new card works - why would you, when that's exactly what you'd expect?).

All new products launch with some issues or faulty hardware, but it's just far too early to tell whether this is widespread or isolated to a few vocal but unlucky PC users who were stupid enough to buy the cards from scalpers.
 

Hohenheim

Member
If the single cable (with six wire pairs, so assuming 100 watts per pair) is rated correctly, you'll get marginally better power efficiency, and - assuming it's an all-or-nothing cable design - you can be sure that in operation the card will either have access to the necessary 600W or 0W. Whereas the adapter could have just one 150-watt port functional, run the card fine at low power, and then blow something when it steps up to draw more - or any other combination of 150, 300 or 450 watts that wouldn't report as failing but could risk damaging the card.

And assuming the adapter has more wire pairings than six (it does, because you'd have to adapt two 6-pins to an 8-pin), the extra metal means marginally more resistance, and marginally more energy wasted as heat.
Yeah, it's a single 600W 16-pin cable that came with the PSU, so it should be rated correctly.

Thanks, this was very informative and makes me feel safer using that single 600W cable with the 5090 and forgetting about the 4-way adapter that comes with the card.
 
Last edited:
That cable itself looks a bit sketchy, almost DIY quality.

Most 12VHPWR cables I've seen look more like this:

[image of a typical 12VHPWR cable]
Yeah, I would only ever recommend using official cables.

I personally use an official Seasonic cable for my GPU (12VHPWR -> 2x 8-pin).

CableMod was pretty popular, but I don't know how reliable they are, as there were tons of RMA threads on Reddit.
 
Last edited:

SHA

Member
Imagine buying a $5k prebuilt PC just to get a 5090 and you get a brick lmfao. My 4090 will stay with me for a few more years.
Imagine if next-gen consoles have the same issue? I'm going retro to clear my mind and just think about the games.
 
Last edited:

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Nvidia is getting a dose of reality these past few months, huh? A massive stock drop, and now they're losing money on bad cards, which will turn into multiple class-action lawsuits.

Nvidia won't know what's coming when they lose that class action lawsuit and will have to reimburse every single RTX 5090 FE owner. Wow!

 

PaintTinJr

Member
That's an informative video, but unless PCIe power specs have changed, the mobo PCIe slot carries the first 75 watts of the GPU's load. So even shaving that 75 watts off the 600-watt cable rating, the cable under normal circumstances should still be able to handle a 525-watt maximum across the 6 cable pairs. But if he's reading 20+ amps, it means one single cable pair is carrying 240+ watts of the roughly 400 watts actually drawn through the cable (IIRC he said total system draw was 575 watts, the rest of the system around 100 watts, and PCIe slot power 75 watts).

Despite the Molex specs saying gold with nickel plating is good, the problem with gold in that scenario is that it's a softer metal and less heat resistant. So once that first set of pins gets anywhere near 100 deg C - which it should be nowhere near - its conductivity drops rapidly, resistance to the current rises, even more power is dissipated as heat, and you're in a cyclical trap. And every time the first pair gets that hot, its thermal and electrical shielding degrades a little more, increasing electrical interference. How this got through testing and into a product is shocking, and downright dangerous; there should be a 400-watt power limit per PC component, unless we're bringing the Foreman-grill-with-kettle design into the PC world.

Zero common sense on display from the world's richest company designing GPUs - who could have guessed it?
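For what it's worth, that "cyclical trap" can be sketched as a toy feedback loop in a few lines of Python. Copper's temperature coefficient is a real figure; the contact resistance, thermal resistance and current are placeholder assumptions, so the temperatures it prints are illustrative, not predictions:

```python
# Toy model of the feedback loop: hotter contact -> higher resistance ->
# more I^2*R heating -> hotter contact, iterated until it settles.
# Only ALPHA_CU is a real physical constant; the rest are assumed placeholders.
ALPHA_CU = 0.00393   # copper temperature coefficient of resistance, per deg C
R0 = 0.010           # assumed contact resistance at 20 deg C, ohm
THERMAL_R = 20.0     # assumed deg C of temperature rise per watt dissipated
AMBIENT = 25.0       # deg C
CURRENT = 20.0       # amps on the overloaded pair (the "20+ amps" reading above)

temp = AMBIENT
for step in range(15):
    r = R0 * (1 + ALPHA_CU * (temp - 20.0))  # resistance at the current temperature
    power = CURRENT ** 2 * r                 # watts dissipated in the contact
    temp = AMBIENT + power * THERMAL_R       # steady-state temperature at that power
    print(f"step {step:2d}: R = {r*1000:.1f} mOhm, P = {power:.1f} W, T = {temp:.0f} C")

# With these made-up numbers the loop converges to an equilibrium instead of
# running away, but the equilibrium temperature climbs steeply as the current
# or the contact resistance increases.
```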
 
Last edited:

Jinzo Prime

Member
I assumed that video above would just be random third-party commentary, but no - he's got hold of the card people had been posting shots of and commenting on, and he'd already noticed at least similar irregularities with his own system. Informative video.

[screenshots from the video: per-pin current readings and connector temperatures]
A single wire carrying 23 amps while every other one is doing 3-12? And that's using an official Corsair cable?

That is a terrible design; I don't think the blame falls on ModDIY here.

Edit: also 300°F on the PSU side? That's oven-baking temps, on an open bench. Imagine how hot it gets in a case.
 
Last edited:

PaintTinJr

Member
A single wire carrying 23 amps while every other one is doing 3-12? And that's using an official Corsair cable?

That is a terrible design; I don't think the blame falls on ModDIY here.

Edit: also 300°F on the PSU side? That's oven-baking temps, on an open bench. Imagine how hot it gets in a case.
Looking at the part in the video where he measures the amps on each pair, adding them up is interesting:

23 amps
2 amps
5 amps
11 amps
8 amps
3 amps
= 52 amps @ 12 volts = 624 watts in the cable, with the first pair carrying 276 watts.

Which suggests that 191 watts of that 276 watts is just heating the first pair until it finds an equilibrium point - for current drift velocity, IIRC - where the pair is delivering the (presumably) necessary 85 watts it's supposed to carry and the resistance due to heat has stabilised, so the heat lost to the cable and environment matches the excess 191 watts. That leaves 29 amps split across the other pairs actually supplying another 348 watts, which (plus the 75 watts from the mobo slot and the 85 watts on pair one) would mean the RTX 5090 FE is actually using a little over 500 watts (~508 watts) to play SW Outlaws, IIRC.
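Here's that arithmetic as a few lines of Python for anyone who wants to check or tweak it - the amp readings are the ones quoted from the video, and the 12 V rail and 75 W slot budget are the usual spec values, so this only checks the sums, not the measurements:

```python
# Per-pair current readings quoted from the video (amps), on a 12 V rail.
pair_amps = [23, 2, 5, 11, 8, 3]
RAIL_V = 12   # volts
SLOT_W = 75   # PCIe slot power budget per the spec, watts

pair_watts = [a * RAIL_V for a in pair_amps]
cable_watts = sum(pair_watts)

print("per-pair watts:      ", pair_watts)                # [276, 24, 60, 132, 96, 36]
print("total through cable: ", cable_watts, "W")          # 624 W
print("plus slot budget:    ", cable_watts + SLOT_W, "W") # 699 W
print("share on hottest pair:", round(100 * pair_watts[0] / cable_watts), "%")  # ~44 %
```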
 
Last edited:

Danknugz

Member
Has this "Nvidia most successful company ever" thing just been a huge money laundering scam?

They seem to be spending plenty of money on ad campaigns, PowerPoint presentations, and public private partnerships but don't seem to have produced a GPU that was worth a shit since the 1080.

Their new gen connector catching fire even after they recalled it the first time is a huge red flag too.
All their money comes from selling rack-mounted clustered GPUs to companies for AI farms. They don't care about selling at much lower margins to people running silly little video games who then whine and cry about it.
 