
The RTX 50 Disaster (Gamers Nexus)

Bojji

Member
Conflating a lithography failure that didn't get caught in the binning process with final customer-product defect rates is silly. I don't think there is any defence of Nvidia here; this was a weird process failure, and they will learn and improve. (Though I'm not sure why anyone carries water for any corporation, if I am honest.)

I don't think anything like that (missing working parts of a GPU) has ever happened before. Both Nvidia and AIBs should be checking GPUs, so it should be almost impossible for them to miss a defect like this.
 

HeWhoWalks

Gold Member
The cost of being PCMR.
Not really. Not only do most generations face nothing like this, but there are also plenty of other cards on the market that continue to work just fine and allow top-tier gaming on PC. I still have a 3090 Ti/13900K/96GB-6800MHz rig that easily gets the job done. Will upgrade to a 5090 once the kinks are worked out.

PCMR = options.
 

Senua

Member
Not really. Not only do most generations face nothing like this, but there are also plenty of other cards on the market that continue to work just fine and allow top-tier gaming on PC. I still have a 3090 Ti/13900K/96GB-6800MHz rig that easily gets the job done. Will upgrade to a 5090 once the kinks are worked out.

PCMR = options.
DLSS4 has extended the life of those older cards too. Nvidia are dicks but the transformer model is such a win for us.
 

winjer

Member
I've never owned or played on the FX 5900 Ultra, but looking at that small cooler, it's hard to believe that noise was an issue.

[image: GeForce FX 5900 Ultra]



Shitty performance all around? Performance was definitely not good in DX9, but in DX8 and OpenGL the FX series was quite competitive.

[benchmark charts]

That is the FX refresh, which fixed some of the problems with the architecture.
Look at the FX5800. That one was an even bigger disaster.
 

Senua

Member
Facts.

I booted up Silent Hill 2 and Alan Wake 2 after the DLSS 4 update and it was immediately apparent that they both were cleaner/crisper (the latter especially)!
Yep, so now moving down from quality to balanced, or even performance, will still look great, whereas before there was a noticeable drop in image quality, especially in performance mode, even at 4K.
 

Buggy Loop

Member
You're still not getting it. It's not the same thing as having a DOA, not even close.

DOA parts are parts that were once spec compliant and fully functional; they tend to fail after the final QC checks. Here you've got parts that were never even spec compliant somehow making their way all the way to retail.

And yes, they can run checks at the first stage to figure out if the silicon meets the necessary specifications; it's all covered in the OP, along with some of the other videos I've linked throughout this thread.

And it has happened many times before. Again, I feel people just want to be dramatic for the sake of being dramatic. Who the fuck has seen the past 3 decades of chipset launches and doesn't remember these things?


Now, they caught the problem in time and issued a recall to the retailers that had the defective CPUs on shelves, but again, it managed to find its way onto a shelf because of a failure to detect a silicon problem. Cool on the recall, assuming retailers didn't just say "fuck it, I'm not sending it back" and gamble on the RMA %.

Can you guarantee me that 100% of the DOA chipsets didn't have silicon QC slippage like the above? I don't recommend making that bet. :messenger_tears_of_joy:

People think this is as easy as making sausage or something? Very complex parts, very complex debugging software and hardware. A shitload of QC validations throughout the pipeline; it happened, it happens, it will happen. QC slippage happens on all components, in all fields. It could even be the debugging tool that damages some part because of static discharge or voltage control problems; who's to say it came out of TSMC like that? Nobody knows. Anyone claiming they do, outside of the manufacturer and Nvidia, is pulling it out of their ass.

I have no idea why you're hellbent on making all these pathetic excuses for them, but Jensen appreciates you, I'm sure he'll replace your 3080 with one of those botched 5090's for all your efforts.

I'm not making excuses. Where did I say missing ROPs is fine? Nowhere.

It's you guys falling for overly dramatic things.

You're making claims like it's 5x too much. Really... fundamentally, you never answered: based on what? That's a very precise claim. I'm waiting on your FMEDA report, then; you seem to have the stats nailed down, so you just have to publish it.

I don't think I've ever recommended anyone buy the 5000 series, just like multiple times in the past I didn't recommend the 4000 series for its insane pricing. The power connector is a much, much bigger problem here than the ROPs, which can easily be RMA'd and are simply QC slippage, fixable. For the power connector I don't think there's an easy solution at all; it's a hardware design issue. The only solution is to change the spec of the cable to a higher gauge to increase safety margins, and I'm not sure that's coming anytime soon.

I know this is a boring approach, but I've been in QC/engineering for 20 years, from the aerospace industry to high-power energy systems. I just can't stand dramatic thumbnails and overly dramatic claims like "OMG! WUT? Nvidia MISSED THIS? IMPOSSIBRUUUU". People who act like this are terminally online armchair engineers and have no clue how the world of manufacturing works.

I guess I am also sucking up to Lisa Su? No, I have some fucking ounce of integrity in a field I am very familiar with. People claiming that companies knowingly ship defective products, like some kind of evil-corporation teenage fanfic, are cringe as fuck. They have no idea how many failure points there are in a production line, especially for complex products, and electronics are the peak of human engineering complexity. With this kind of news we can just accept that shit happens. Fix it, RMA it, give good customer support. I think the 970 VRAM problem was way more of a dick move than any of the above QC problems, along with plenty of other nonsense deliberately put in the design to fuck us over; there's no RMA for that.
 
Why? Small heatsinks and small fans mean more noise.
My Geforce 3 had an even smaller fan, but it was very quiet. My sister had FX 5500 and that card was quiet as well.

But I do know that the Geforce FX 5800 was very loud. People even made jokes about it:




That is the FX refresh, which fixed some of the problems with the architecture.
Look at the FX5800. That one was an even bigger disaster.
According to Tom's Hardware's review, the FX 5900 wasn't nearly as loud as the FX 5800.


The architecture has also been improved, so I wouldn't call this particular card a disaster, given the excellent DX8 / OpenGL performance. The Radeon 9800 only had better performance in DX9 (especially in Half Life 2), but by the time most developers finally started using DX9 (PS3 / x360 era) even the 9800 was too slow anyway.

Yep, so now moving down from quality to balanced, or even performance, will still look great, whereas before there was a noticeable drop in image quality, especially in performance mode, even at 4K.
DLSS 3.7 already looked excellent in its performance mode (better than native 4K TAA in some games, like RDR2), especially with a ReShade sharpening mask.

DLSS 3.8 Performance

[screenshots: Horizon, Silent Hill 2, Black Myth: Wukong, RDR2]


RDR2 uses Vulkan, so the reshade sharpening filter is not visible on this screenshot. But even without it, the image is reasonably sharp.
 

Senua

Member
My Geforce 3 had an even smaller fan, but it was very quiet. My sister had FX 5500 and that card was quiet as well.

But I do know that the Geforce FX 5800 was very loud. People even made jokes about it:





According to Tom's Hardware's review, the FX 5900 wasn't nearly as loud as the FX 5800.


The architecture has also been improved, so I wouldn't call this particular card a disaster, given the excellent DX8 / OpenGL performance. The Radeon 9800 only had better performance in DX9 (especially in Half Life 2), but by the time most developers finally started using DX9 (PS3 / x360 era) even the 9800 was too slow anyway.


DLSS 3.7 already looked excellent in its performance mode (better than native 4K TAA in some games, like RDR2), especially with a ReShade sharpening mask.

DLSS 3.8 Performance

[screenshots: Horizon, Silent Hill 2, Black Myth: Wukong, RDR2]


RDR2 uses Vulkan, so the reshade sharpening filter is not visible on this screenshot. But even without it, the image is reasonably sharp.

Great in stills, but in motion it fell apart. The transformer model fixes this.
 
Great in stills, but in motion it fell apart. The transformer model fixes this.
I don't agree. DLSS 3.7, even in its performance mode, had less motion blur and shimmer than native TAA in quite a few games I played, let alone the shimmering mess of FSR native.

The difference was particularly noticeable in Black Myth Wukong and Cyberpunk, where FSR's native image was extremely shimmery during movement (especially around vegetation), while DLSS performance looked excellent.

As for DLSS 4 (the transformer model), it offers sharper motion clarity, but there is more shimmering around vegetation compared to DLSS 3.7. Try playing Cyberpunk with DLSS 3.7 and move the camera around the vegetation. Shimmering is minimal with DLSS 3.7, while DLSS 4 shimmers a lot around vegetation (at least in raster and RT modes; with path tracing, DLSS 4 finally looks as clean as DLSS 3.7).
 

Three

Member
My Geforce 3 had an even smaller fan, but it was very quiet. My sister had FX 5500 and that card was quiet as well.
There are different types of fans too. I think up to the GeForce 4, none of the coolers used blower fans. Blower fans are generally louder. Bigger fans are quieter, and larger heatsinks require less airflow, so bigger heatsinks generally mean less noise too.
 
I have an RTX 3070 and I was thinking about upgrading my GPU, either to an RTX 5070 (or even a Ti if the price wasn't absurd) or an RX 9070 (or XT), depending on price. Now it's looking like I'll keep using it for two more years. It's just insane how much the GPU landscape has changed for the worse since 2020... I hope AMD doesn't follow the greedy path...
This. Glad I am on 1080p and not changing anytime soon. My 3060 Ti still maxes out everything I play at 1080p. I got one of the last EVGA cards via a step-up from a 2060 during the pandemic.
I can wait for prices to drop for a proper upgrade. Well, maybe by the time the 60-series arrives, prices will drop for the 50-series cards.
 
There are different types of fans too. I think up to the GeForce 4, none of the coolers used blower fans. Blower fans are generally louder. Bigger fans are quieter, and larger heatsinks require less airflow, so bigger heatsinks generally mean less noise too.
Yeah it was the FX line that started the blower trend.

The first large-scale use of a blow-out style cooler came when the FX 5800 Ultra cards premiered. The solution NVIDIA reference design introduced worked differently than today’s blow-out style coolers.
Here is a cool article about the history of GPU coolers if anyone is interested: https://www.sapphirenation.net/history-graphics-cards-coolers-part
 

winjer

Member
My Geforce 3 had an even smaller fan, but it was very quiet. My sister had FX 5500 and that card was quiet as well.

But I do know that the Geforce FX 5800 was very loud. People even made jokes about it:



According to Tom's Hardware's review, the FX 5900 wasn't nearly as loud as the FX 5800.


The architecture has also been improved, so I wouldn't call this particular card a disaster, given the excellent DX8 / OpenGL performance. The Radeon 9800 only had better performance in DX9 (especially in Half Life 2), but by the time most developers finally started using DX9 (PS3 / x360 era) even the 9800 was too slow anyway.


The FX 5800 was released in 2003, but the next year we already had DX9 games such as Half-Life 2 and Far Cry.
And these games were playable on a 9700 Pro, but never on an FX 5800. So the FX series was already dated out of the gate.
The FX 5800 was good enough in DX8, but it also had the problem of scaling worse than the 9700 Pro as resolution increased.

But the worst part was the FX 5200. Not only was it a very poor part, even considering it was a cheap one, but it also had major problems with a ton of variants with different configurations that usually meant even lower performance.
The FX 5800 was a disaster of a GPU, but the FX 5200 was a complete catastrophe. I remember having a LAN party with friends, and one of them had a PC with a 5200. I had much higher fps playing UT2004, at max settings and a higher resolution, than he did at the lowest settings and a lower resolution.
It was a sick dog of a GPU that didn't even get decent frame rates at low settings.
 
That was pretty bad
[benchmark screenshots]
If I remember correctly, Serious Sam 2 only used DX8, so I wonder what can explain such a huge difference. I played the game on my GeForce 3, which was supposed to be much slower than the FX 5800, and I don't remember having any problems with the framerate in this game.

According to the Hexus review, the FX 5900 Ultra got 178 fps at 1024x768 in Serious Sam 2, and I'm more inclined to believe their results given how well the game ran on my GeForce 3. 20 fps on the 5800 doesn't make any sense.


[Hexus Serious Sam benchmark charts]
 

winjer

Member
If I remember correctly, Serious Sam 2 only used DX8, so I wonder what can explain such a huge difference. I played the game on my GeForce 3, which was supposed to be much slower than the FX 5800, and I don't remember having any problems with the framerate in this game.

According to the Hexus review, the FX 5900 Ultra got 178 fps at 1024x768 in Serious Sam 2, and I'm more inclined to believe their results given how well the game ran on my GeForce 3. 20 fps on the 5800 doesn't make any sense.


[Hexus Serious Sam benchmark charts]

 

Thank you. I forgot that there were two Serious Sam games: Serious Sam: The Second Encounter (often referred to as Serious Sam 2), and also Serious Sam 2, which was a totally different game. The Second Encounter used DX8, while Serious Sam 2 used DX9, so it all makes sense now. The Radeon was much faster simply because it was a DX9 game.

But the worst part was the FX 5200. Not only was it a very poor part, even considering it was a cheap one, but it also had major problems with a ton of variants with different configurations that usually meant even lower performance.
The FX 5800 was a disaster of a GPU, but the FX 5200 was a complete catastrophe. I remember having a LAN party with friends, and one of them had a PC with a 5200. I had much higher fps playing UT2004, at max settings and a higher resolution, than he did at the lowest settings and a lower resolution.
It was a sick dog of a GPU that didn't even get decent frame rates at low settings.
I think even my GeForce 3 was faster than the FX 5200. If I remember correctly, the FX 5200 was bottlenecked by very limited memory bandwidth.

The FX 5800 was released in 2003, but in the next year, we already had DX9 games, such as Half Life 2 and FarCry. And these games were playable on a 9700 Pro, but never on a FX5800. So the FX series was already dated out of the gate.
I have seen GeForce FX gameplay (and even GeForce 4) of Far Cry and Half-Life 2, so those games were definitely playable on FX cards. Around 200 fps at low settings.



I think what you meant to say was that these games were not playable at max settings using Shader Model 2.0 effects. Shader model 2.0 was definitely a mess on the FX series, and games that finally took advantage of it ran like crap on FX cards.

When the GeForce FX 5800 was released, the vast majority of games still used DX8, though, and the FX had amazing performance in those games, especially the 5900 Ultra, because that card had very impressive memory bandwidth (27 GB/s) even compared to the 5800 (16 GB/s) or the Radeon 9800 Pro (21 GB/s).

5900U - 126 fps on average
5800U - 61 fps on average

[benchmark chart]


Now AMD is in a similar position. There are path-traced (PT) games that are playable on RTX 40/50 series cards but not on Radeon GPUs. These PT games still offer Lumen or standard RT support though, so people aren't making a big deal about it.
 

SolidQ

Member
If I remember correctly, Serious Sam 2 only used DX8,
It uses DX9 with Shader Model 2.0:

Serious Sam 2 Recommended Requirements

  • CPU: Pentium 4 or Athlon XP
  • CPU SPEED: 2.6 GHz
  • RAM: 512 MB
  • OS: Windows 2000/XP
  • VIDEO CARD: DirectX 9.0 compliant video card with Pixel Shader support (NVIDIA GeForce FX 5900+ / ATI Radeon 9500+ / Intel 915g+)
  • PIXEL SHADERS: 2.0
 

Celcius

°Temp. member
Next, Nvidia decides to start slacking off on their drivers, apparently.
The Game Ready Driver 572.60 just came out two days ago and includes updates for games such as Monster Hunter Wilds.
However, early this morning Nvidia put out GeForce Hotfix Driver Version 572.65, and the patch notes mention:

"This hotfix addresses the following issue:
  • PC may boot to a black screen when connected via DisplayPort with certain monitors [5131002]"
Sigh, I use DisplayPort to connect my video card to my monitor, so I guess I'm updating to this. I guess they rushed the driver out for the launch of Monster Hunter like they rushed Blackwell out the door.
 