
GeForce RTX 5090 is $1,999, 5080 $999, 5070 Ti $749, 5070 $549 (Availability Starting Jan 30 for RTX 5090 and 5080)

OverHeat

« generous god »
Poor you, my friend. Cross the bridge to Ontario?
Quebec City to Ottawa is a hell of a ride lol
0tpmHsf.png
 

Haint

Member
How can you say that when the 4090 and 4080S have a commanding share of 2% combined in the Steam survey?!?!
The 3% for the 4090/4080s represents something like $6-7 billion in revenue; the 9% for the 4060/4060 Ti represents like $4 billion. It costs around $350 to build a 4080 and $450-$500 for a 4090, meaning profit margins are 200%+. 4060/4060 Ti margins are likely only 40% at best. Similarly, people with the high-end cards likely spend multiple times more on games and microtransactions. The high-end cards absolutely do dominate the industry in dollars, profit, and spending.
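Rough back-of-envelope version of that markup math (the ~$350 / ~$475 build costs are the estimates above, the MSRPs are launch prices, and the 4060 Ti cost is just back-derived from the "40% at best" figure, so treat it all as a sketch):

# Markup sketch using the build-cost estimates from this post and launch MSRPs.
# Ignores AIB/retail cuts, R&D, etc., so it's an upper bound, not real margin.

def markup_pct(msrp: float, build_cost: float) -> float:
    """Markup over build cost, as a percentage."""
    return (msrp - build_cost) / build_cost * 100

cards = {
    "RTX 4090":    (1599, 475),  # (launch MSRP $, estimated build cost $)
    "RTX 4080":    (1199, 350),
    "RTX 4060 Ti": (399, 285),   # cost back-derived from the "40% at best" claim
}

for name, (msrp, cost) in cards.items():
    print(f"{name}: ~{markup_pct(msrp, cost):.0f}% markup on an estimated ${cost} build cost")

That lands around 237% for the 4090 and 243% for the 4080, i.e. in the "200%+" ballpark claimed above.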
 
Not hate or troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare to see a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing the game, or watching the fps number from RTSS in the corner?

After more than 25 years of playing fast-paced FPS games such as Quake and Unreal Tournament, I've become very sensitive to input lag. I can tell the difference in input latency between 170Hz VSYNC and 167Hz VRR (VRR avoids the VSYNC input lag). DLSS FG increases input latency a little, but not by as much as VSYNC does on my monitor. In games like Cyberpunk that increase in input lag is in placebo territory even for me, so comparisons to "GeForce Now" strongly suggest to me that wrong settings are to blame.

Nvidia FG can add very high input lag (100ms on top of your base input latency) if you have VSYNC turned on or try to limit the framerate with RTSS (RivaTuner). Some people may be using the wrong settings and then blaming DLSS FG for it.
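For a sense of scale, this is just frame-time arithmetic (an illustration of why a stalled frame queue hurts, not a model of exactly what FG + VSYNC does internally):

# Extra latency from a stalled frame queue ~= queued_frames * refresh interval.

def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

def queued_latency_ms(hz: float, queued_frames: int) -> float:
    # Assumes the pipeline waits `queued_frames` full refresh intervals.
    return queued_frames * frame_time_ms(hz)

print(f"170Hz frame time: {frame_time_ms(170):.1f} ms")                 # ~5.9 ms
print(f"2 frames queued at 170Hz: {queued_latency_ms(170, 2):.1f} ms")  # ~11.8 ms
print(f"3 frames queued at 60Hz:  {queued_latency_ms(60, 3):.1f} ms")   # 50.0 ms

A frame or two of backpressure at 170Hz is around 6-12ms (noticeable if you're sensitive), while queueing a few frames behind a low cap quickly gets into the tens of milliseconds, which is the kind of penalty people then blame on FG itself.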

I enable DLSS FG in all my games because it always improves image quality (on a sample-and-hold display, more fps means a sharper image in motion) and because it also improves my aim, since my eyes can track moving objects much more easily. I can play Cyberpunk at 40-50fps and have a good time even at that framerate (especially on gamepad), but with FG on top of that my aim improved DRAMATICALLY, especially on M+K. The game felt much closer to 80fps than 40fps.

I enable DLSS FG even when I have a decent average fps (80fps), because some games still dip below 60fps from time to time. Normally this would bother me, but since I started using DLSS FG I no longer care about the sub-60fps dips, because I no longer notice them.

Here's evidence to support my claims.

Cyberpunk at 1440p native TAA + psycho RT: I get 73fps with these settings and 28ms latency (shown in the top right corner of the screen).

20250108-211531.jpg


The same settings, just with DLSS FG (without DLSS Super Resolution): 124fps and 37ms latency. That's only 9ms more, but it made the game look MUCH sharper in motion and I could aim much more easily.

20250108-211622.jpg


Another game: Black Myth: Wukong (BMW).

4K DLSS Performance, very high settings, full RT: 57fps and 63ms input latency

20250107-215732.jpg


With DLSS FG I get 88fps and 54ms. That's 9ms less than with FG off, so in this particular game FG gives the lowest latency possible (because enabling it also activates Nvidia Reflex).

20250107-215926.jpg


However, not all games have such low input latency with DLSS FG. In Alan Wake 2 I measured an additional 20ms of input latency with DLSS FG. The game is still perfectly playable on gamepad, but on M+K I started to feel the increased input delay (though it still wasn't "GeForce Now" bad).

Alan Wake 2, 4K DLSS Performance, max settings, full PT: 75fps and 40ms latency

20250113-205120.jpg


With DLSS FG 108fps and 60ms latency

20250113-205159.jpg



Overall I'm very happy with the results of DLSS FG. I can see problems in FSR3 FG and Lossless Scaling FG (people said that LSFG adds very little input lag, but it was still too much for me), but DLSS FG works so well that I consider it a free performance boost and use it in every single game (even in Alan Wake 2).
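To put the three measurements above in one place (numbers copied from the overlay readings in the screenshots):

# FG off vs on, as reported above: (game, fps_off, ms_off, fps_on, ms_on)
measurements = [
    ("Cyberpunk 2077 (1440p native TAA + psycho RT)", 73, 28, 124, 37),
    ("Black Myth: Wukong (4K DLSS Perf, full RT)",    57, 63,  88, 54),
    ("Alan Wake 2 (4K DLSS Perf, full PT)",           75, 40, 108, 60),
]

for game, fps_off, ms_off, fps_on, ms_on in measurements:
    print(f"{game}: {fps_off}->{fps_on} fps ({fps_on / fps_off:.2f}x), "
          f"latency {ms_off}->{ms_on} ms ({ms_on - ms_off:+d} ms)")

So the latency cost in my testing ranges from -9ms (Black Myth: Wukong, because FG forces Reflex on) to +20ms (Alan Wake 2), for a 1.4-1.7x framerate gain.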
 

rofif

Can’t Git Gud
Good post. Something I would 100% try and use in a few games when I get a 5080 or something.
What if you used Reflex without FG? Does it feel better for you?
 
What app are you using to measure the latency?
 
Good post. Something I would 100% try and use in a few games when I get a 5080 or something.
What if you used Reflex without FG? Does it feel better for you?
I always use "Nvidia Reflex" when the game supports it, so when I said that the input latency in Cyberpunk was in placebo territory for me, I was comparing both modes with Reflex on.

Black Myth: Wukong is the only exception, as that particular game does not let me enable Nvidia Reflex separately. I don't know if the developers did it on purpose, but if you want the lowest latency possible in BMW, you need to turn on DLSS FG, because only then is Nvidia Reflex activated. In some places DLSS FG can have 20ms lower latency compared to FG off. Black Myth: Wukong is somewhat laggy, but DLSS FG removes that lag almost completely. I can't imagine playing this game without DLSS FG.

What app are you using to measure the latency?
The GeForce Experience app. You need to enable hardware monitoring.
 

Kilau

Gold Member
The 3% for the 4090/4080s represents something like $6-7 billion in revenue; the 9% for the 4060/4060 Ti represents like $4 billion. It costs around $350 to build a 4080 and $450-$500 for a 4090, meaning profit margins are 200%+. 4060/4060 Ti margins are likely only 40% at best. Similarly, people with the high-end cards likely spend multiple times more on games and microtransactions. The high-end cards absolutely do dominate the industry in dollars, profit, and spending.
That huge and profitable high-end GPU market that AMD and Intel can't survive in? Even if it's just that they make shit cards that can't compete, Nvidia and the AIB vendors sure do leave a lot of money on the table by producing so few of those cards.

Even if the cards only have $400-$500 of hardware in them, you can't just ignore the R&D and marketing costs, and your point about owners of those cards spending more on games and DLC is pure conjecture.
 

//DEVIL//

Member
I’m not driving 10h for a GPU 😂😂😂
I mean, yeah. 10 hours, like me going from Ottawa to Toronto, is too far for a GPU; by the time I get there it's either ghosted or sold. But if it's a 2-hour drive like Ottawa to Montreal, then I don't mind doing it, especially on the weekend. Take my family for a day trip.
 

Haint

Member
That huge and profitable high-end GPU market that AMD and Intel can't survive in? Even if it's just that they make shit cards that can't compete, Nvidia and the AIB vendors sure do leave a lot of money on the table by producing so few of those cards.

Even if the cards only have $400-$500 of hardware in them, you can't just ignore the R&D and marketing costs, and your point about owners of those cards spending more on games and DLC is pure conjecture.

Correct, AMD and Intel can't build 90-class or even 80-class cards at any price, which is why they willingly cede that segment to Nvidia. They wouldn't sell many units even if they could, because those buyers are the most discerning and 99% would go with Nvidia for the feature set and software support (notably hobbyist AI support). Nvidia builds so few of them because they share the same manufacturing capacity as $20,000+ AI GPUs. Nvidia's low-end offerings are the lowest-end parts it's still semi-economically viable to build; they're 30-class and 50-class cards masquerading as 60-class. Low-end buyers are much less discerning and just want the fastest/most playable rasterized frame rates, which gives AMD/Intel the best odds of capturing some market share.
 
Nah, it's YouTubers like this that suck. Clickbait titles, clickbait faces, zero useful content.
Clickbait or not, it's difficult to argue with the raw figures.

MnMy3Bp.png

KTYo6bD.png


Overall the 5000 series looks like a sideways refresh with a software patch. The 5090 does bring an increase in performance, along with increased power consumption and total price. The rest of the cards? Raster performance is no better than a two-year-old GPU.
 
Imagine trusting this fucking guy to educate you on technology

EAFS2MI.png
Skip the video and look at this post
 

analog_future

Resident Crybaby
Skip the video and look at this post

Looks similar to the 4000 series cards, where the 5090 is a very nice jump and the 5080 is not.

At least the 5080 is priced at $999 instead of the $1199 that they asked for the 4080.
 

rm082e

Member
Not hate or troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say that.


At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare to see a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090/5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing the game, or watching the fps number from RTSS in the corner?


I've been listening to Jeff for 20 years. I'm not going to say he's wrong, but the degree to which he implies the additional latency hurts his in-game performance seems out of whack given his skill level relative to competitive fighting game and FPS players. He's always been a bit above average at both of those genres. Not to say he needs to "deal with it" or anything. I just think the latency introduced by FG is one of those things that some people are going to feel and others aren't. The people who don't feel it just aren't sensitive to it and probably aren't playing a lot of competitive multiplayer games.
 
Looks similar to the 4000 series cards, where the 5090 is a very nice jump and the 5080 is not.

At least the 5080 is priced at $999 instead of the $1199 that they asked for the 4080.
The 4000 series was a considerable jump over the previous gen (3000). The 4080 is 25% faster than the 3090 Ti and 50% faster than the 3080.

The 5080 will likely be 5-10% faster than the 4080 at most. Therefore, $999 is not a good price for the 5080 at all.
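Chaining those relative-performance figures together (the 25%/50% numbers and the 5-10% guess are the claims from this post, not benchmark results):

# Gen-on-gen uplift arithmetic using the figures quoted above.
gain_4080_over_3080 = 0.50           # "the 4080 is 50% faster than the 3080"
gain_5080_over_4080 = (0.05, 0.10)   # "the 5080 will likely be 5-10% faster"

for g in gain_5080_over_4080:
    implied = (1 + gain_4080_over_3080) * (1 + g) - 1
    print(f"5080 at +{g:.0%} over the 4080 -> roughly +{implied:.0%} over the 3080, "
          f"vs the +{gain_4080_over_3080:.0%} jump the 4080 made in a single generation")

In other words, if those estimates hold, the 5080's generational gain is a small fraction of the jump the 4080 delivered.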
 

MikeM

Member
Clickbait or not, it's difficult to argue with the raw figures.

MnMy3Bp.png

KTYo6bD.png


Overall the 5000 series looks like a sideways refresh with a software patch. The 5090 does bring an increase in performance, along with increased power consumption and total price. The rest of the cards? Raster performance is no better than a two-year-old GPU.
Is this graph from Nvidia? I didn’t see anything without RT on it previously.

If so, seems normal raster is shit and the upgrades are in RT workloads and frame gen. Zzzzzzzs
 

Bojji

Member
Is this graph from Nvidia? I didn’t see anything without RT on it previously.

If so, seems normal raster is shit and the upgrades are in RT workloads and frame gen. Zzzzzzzs

Nvidia is hiding the Blackwell "normal" teraflop numbers for some reason.

But you can calculate them. Of course, teraflops between generations may or may not be directly comparable.
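A quick sketch of how to do that calculation: FP32 TFLOPS = 2 (FMA) x shader count x boost clock. The core counts and clocks below are the published spec-sheet figures as I remember them, so double-check before quoting:

# FP32 TFLOPS = 2 ops per FMA * CUDA cores * boost clock (GHz) / 1000.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return 2 * cuda_cores * boost_ghz / 1000

specs = {
    "RTX 4090": (16384, 2.52),   # (CUDA cores, boost clock GHz)
    "RTX 5090": (21760, 2.41),
}

for name, (cores, clock) in specs.items():
    print(f"{name}: ~{fp32_tflops(cores, clock):.1f} FP32 TFLOPS")

That works out to roughly 83 TFLOPS for the 4090 and 105 for the 5090, but as said, raw teraflops between architectures may not translate 1:1 into game performance.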

We have to wait for reviews (as always).
 