
Nvidia DLSS 4 Announcement Soon

moogman

Neo Member
I love that NVIDIA continually push new technologies to the point where I would never consider buying an AMD graphics card because they just copy them with inferior variants that lag years behind NVIDIA. On the other hand, this has meant that NVIDIA have been free to push the prices up on their cards knowing that people like myself will stick with them rather than buy a lesser product from AMD or Intel. This is why I also hate NVIDIA!!!

DLSS4 will almost certainly be exclusive to the RTX 50 cards, but I feel this marketing ploy will eventually backfire on NVIDIA if they continue to drive up the prices of their graphics cards and limit the number of people who can afford to buy them. Fewer cards sold means less developer support, which means fewer games using their technology. The £1,000 price range was once reserved for the top-tier cards but is now becoming the mid-tier range, with the high-end options costing £1,500+.

I own an RTX 4080 Founders Edition card and it cost me £1,100. Although it has been a fantastic GPU, especially for gaming on a 1440p 360 Hz QD-OLED monitor, I do think that card was overpriced by at least £200. I expect the RTX 5080, which I personally think sounds underwhelming based on the leaked specs (16 GB of VRAM and still 256-bit, really?!?), will cost at least £1,500, possibly more, which puts it beyond the price range I am comfortable paying for a product that typically lasts 2-3 years.

I actually think that the high price of their GPUs is what is going to ultimately kill my interest in PC gaming as someone who wants to play games at the best quality and framerates, making mid-tier cards a no-no for me. And I don't think I could ever bring myself to switch back to AMD GPUs, even if they were competitively priced, as they are just so far behind NVIDIA technologically that I actually think it's embarrassing.

Agreed, putting prices up means more profit per unit but forces people into skipping more generations. I'm still on my EVGA 3080, and I skipped the 40 series as the card at the equivalent price became the 4070, so I was gaining very little on £ vs performance. I was all in for a 5080, but if they bump the price again I will have to see if the value is worth it.
 

DirtInUrEye

Member
Trillion-dollar Nvidia has the money to ensure the devs are incentivised to include their latest tech in all prominent releases.

Yeah, same as how pristine porcelain tiles with perfectly mirrored neon purple reflections and asphalt that reflects the street lamps as if they're growing into the floor suddenly exist everywhere in modern game environments. Because Nvidia sponsored the RTX in, and gamers were weirdly quick to believe these reflections actually look "accurate".
 

Buggy Loop

Gold Member
You realize they are not pushing any crap forward? It's just another gimmick to make some fools spend $2k on a new GPU to play GaaS games.
Real games are still on PS5, day 1.
/s but not really?

AMD has been playing catch-up on tech all these years, but Nvidia pushes nothing forward, sure. Peak rofif. rofif trolling PC and promoting PS5 pixel soup is back, I see

[GIF: Michael Scott awkwardly closes the door, The Office]
 
Last edited:

sachos

Member
Really looking forward to DLSS4, hoping it is the start of neural rendering for otherwise classic raster techniques.
 

rofif

Can’t Git Gud
AMD has been playing catch-up on tech all these years, but Nvidia pushes nothing forward, sure. Peak rofif. rofif trolling PC and promoting PS5 pixel soup is back, I see

[GIF: Michael Scott awkwardly closes the door, The Office]
[Image: IMG-2937.jpg]


Every PC thread.
I was an early 2070 adopter for RTX and an early 3080 adopter.
Their best thing is DLSS. Don't care about frame gen and other stuff.
They also pushed RT forward, I guess, for what it's worth.
There is nothing to be positive about in this thread. The price sucks, and we don't know what DLSS4 is. These cards are not needed.
I wanted to upgrade from my 3080 10GB. But at this point, for what? To play games a year too late with a bit better graphics? I don't care.

edit: You don't have grounds to critique me here. I am probably more of an enthusiast than you guys are. At least until it started to feel like paying more to be a sucker
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
I was an early 2070 adopter for RTX and an early 3080 adopter.
Their best thing is DLSS. Don't care about frame gen and other stuff.
They also pushed RT forward, I guess, for what it's worth.
There is nothing to be positive about in this thread. The price sucks, and we don't know what DLSS4 is. These cards are not needed.
I wanted to upgrade from my 3080 10GB. But at this point, for what? To play games a year too late with a bit better graphics? I don't care.

edit: You don't have grounds to critique me here. I am probably more of an enthusiast than you guys are. At least until it started to feel like paying more to be a sucker
I think frame gen is pretty great. Combined with Reflex, you end up with a much smoother image with better motion clarity without incurring a latency penalty compared to native without Reflex. You can also just enable Reflex to greatly lower latency, but in single player games, 40 to 60ms hardly matters.
 
Last edited:

BennyBlanco

aka IMurRIVAL69
I was an early 2070 adopter for RTX and an early 3080 adopter.
Their best thing is DLSS. Don't care about frame gen and other stuff.
They also pushed RT forward, I guess, for what it's worth.
There is nothing to be positive about in this thread. The price sucks, and we don't know what DLSS4 is. These cards are not needed.
I wanted to upgrade from my 3080 10GB. But at this point, for what? To play games a year too late with a bit better graphics? I don't care.

Ok? Nobody cares, dude. Go post in a PS5 thread rather than having a meltdown in every thread about PC gaming. Also, I'd rather play Hades 2 and BG3 a year early than the latest Sony x Sweet Baby Inc collab.
 

hinch7

Member
I was an early 2070 adopter for RTX and an early 3080 adopter.
Their best thing is DLSS. Don't care about frame gen and other stuff.
They also pushed RT forward, I guess, for what it's worth.
There is nothing to be positive about in this thread. The price sucks, and we don't know what DLSS4 is. These cards are not needed.
I wanted to upgrade from my 3080 10GB. But at this point, for what? To play games a year too late with a bit better graphics? I don't care.

edit: You don't have grounds to critique me here. I am probably more of an enthusiast than you guys are. At least until it started to feel like paying more to be a sucker
You bought a PS5 Pro to play games with better visuals and fidelity at near double the cost of a base PS5. You don't see the hypocrisy here?

That $800 could've gone towards a 5070 Ti that will probably near double the performance, and do even more in RT, over a PS5 Pro.
 
You bought a PS5 Pro to play games with better visuals and fidelity at near double the cost of a base PS5. You don't see the hypocrisy here?

That $800 could've gone towards a 5070 Ti that will probably near double the performance, and do even more in RT, over a PS5 Pro.
True... Sony these days is price gouging worse than nVidia and Apple combined.
 

rofif

Can’t Git Gud
Reply banned.
Ok? Nobody cares, dude. Go post in a PS5 thread rather than having a meltdown in every thread about PC gaming. Also, I'd rather play Hades 2 and BG3 a year early than the latest Sony x Sweet Baby Inc collab.
fuck off bitch. I can post where I want.
I think frame gen is pretty great. Combined with Reflex, you end up with a much smoother image with better motion clarity without incurring a latency penalty. You can also just enable Reflex to greatly lower latency, but in single player games, 40 to 60ms hardly matters.
I know. Quite usable with like 70-80fps as a base to get above 100. But I already have 70fps... I don't need 140 with more lag. I've played it and it feels smooth enough but I just can't care for it.
[GIF: disgusted reaction]


wow he went there
It's true. You guys are fake enthusiasts, probably like Asmongold. Loud mouths and probably nothing else. I've had a whole life of PC experience to be able to appreciate what consoles are doing.
You would at least understand if you had the experience.
I am free of this PC bullshit and it feels great. That said, I have both. I just finished Indiana Jones on PC, you know? I have a bigger Steam library than your brain cell count.
You bought a PS5 Pro to play games with better visuals and fidelity at near double the cost of a base PS5. You don't see the hypocrisy here?

That $800 could've gone towards a 5070 Ti that will probably near double the performance, and do even more in RT, over a PS5 Pro.
It was cheap and worth it ¯\_(ツ)_/¯
Money doesn't matter to me.
I play games on PS5 for other reasons... like because the games are there, in perfect working condition, day 1, and I just couldn't be bothered to fix yet another game on PC yet another time... I was tired of feeling like a tool, you know?
Granted, this is less of a problem with top hardware.

True... Sony these days is price gouging worse than nVidia and Apple combined.
Are you fucking for real? LIKE FOR REAL?
The 1080 Ti, best GPU ever, was $699 on release. That's ~$900 today with inflation.
The 3080 was $699 in 2020.
Then suddenly the 4080 was $1,200 for no reason (because Nvidia knows suckers will pay scalper prices).
And now the 5080 is what? €1,700?
And where is the efficiency? These cards have bigger chips, consume more power, and cost twice as much as they should. Those are facts.

Meanwhile the PS5? $450, and the Pro $700.
The PS4 was $400 in 2013... that's ~$550 today with inflation.
The PS5 is cheaper than the PS4 was, dude. Nvidia doubled their prices.
And the PS5 Pro is a premium product that's only $250 more expensive. It's not 3x or double the price of the PS5.
In fact, $700 today is worth about $570 in 2020 dollars. So the thing is only a bit more than the PS5 was on release if you do the math.
Nvidia doubled their prices OUTSIDE of inflation. Sony tracks right along with inflation.

So no, Sony is NOT RAISING prices and outdoing Apple or Nvidia. You'd have to be a fucking idiot to think that. In fact, even the last 3 Apple phones all cost the same.
Meanwhile Nvidia is doing whatever they want.
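If anyone wants to sanity-check that math, here's a quick sketch. The inflation multipliers are rough assumptions on my part, not official CPI figures, and I've used the $499 disc PS5 MSRP:

```python
# Rough inflation check on launch prices. The multipliers below are
# approximate assumptions, not official CPI data.
CPI_TO_TODAY = {2013: 1.35, 2017: 1.28, 2020: 1.22}  # assumed cumulative inflation

launches = [
    ("GTX 1080 Ti", 2017, 699),
    ("RTX 3080",    2020, 699),
    ("PS4",         2013, 399),
    ("PS5 (disc)",  2020, 499),
]

for name, year, usd in launches:
    print(f"{name}: ${usd} in {year} ~= ${usd * CPI_TO_TODAY[year]:.0f} today")

# Deflating the other way: the $700 PS5 Pro in 2020 dollars.
print(f"PS5 Pro: $700 today ~= ${700 / CPI_TO_TODAY[2020]:.0f} in 2020 dollars")
```

Run it and you land roughly where the post does: ~$895 for the 1080 Ti today, and ~$574 for the Pro in 2020 dollars.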
 
Last edited:
I think frame gen is pretty great. Combined with Reflex, you end up with a much smoother image with better motion clarity without incurring a latency penalty. You can also just enable Reflex to greatly lower latency, but in single player games, 40 to 60ms hardly matters.
Why do people always keep spouting marketing without even taking a second to think about what they're saying? There is always a latency penalty to using frame gen. If you use Reflex without frame gen, the latency is lower than using Reflex with frame gen. Nvidia must really love that people voluntarily repeat their marketing falsehoods as truths.
 

Gaiff

SBI’s Resident Gaslighter
fuck off bitch. I can post where I want.

I know. Quite usable with like 70-80fps as a base to get above 100. But I already have 70fps... I don't need 140 with more lag. I've played it and it feels smooth enough but I just can't care for it.
It's actually usable at 40fps and above per NVIDIA's recommendation. For FSR3, it's 60 and above because AMD doesn't necessarily pair it with latency-reduction tech like Reflex. FF XVI, for instance, can drop to 50fps at max settings 4K DLAA on my 4090. Toggle Reflex+Frame Generation on and you get 80-110fps without more latency. In this situation, why wouldn't you use it?

Unless your game is extremely latency-sensitive, there's no reason not to use Frame Generation. The games that do need extremely fast response are usually twitch shooters or fighters, and those run at 300fps on low-tier GPUs. Cyberpunk, FF XVI, Indiana Jones, or other AAA games should use Frame Generation if available.
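To put that rule of thumb in code, a minimal sketch; the thresholds are just this post's recommendations, not an official spec, and the names (`framegen_usable`, `MIN_BASE_FPS`) are made up for illustration:

```python
# Rule-of-thumb base-framerate thresholds for enabling frame generation.
# Values are assumptions taken from the post above, not an official spec:
# DLSS FG ships with Reflex, so ~40 fps base is workable; FSR3 FG may lack
# a Reflex-style latency reducer, so ~60 fps base is safer.
MIN_BASE_FPS = {"dlss_fg": 40, "fsr3_fg": 60}

def framegen_usable(base_fps: float, tech: str) -> bool:
    """True if the base framerate clears the usability threshold."""
    return base_fps >= MIN_BASE_FPS[tech]

print(framegen_usable(50, "dlss_fg"))  # True  (e.g. FF XVI dipping to 50fps)
print(framegen_usable(50, "fsr3_fg"))  # False (below the FSR3 comfort zone)
```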
 
Last edited:

rofif

Can’t Git Gud
It's actually usable at 40fps and above per NVIDIA's recommendation. For FSR3, it's 60 and above because AMD doesn't necessarily pair it with latency-reduction tech like Reflex. FF XVI, for instance, can drop to 50fps at max settings 4K DLAA on my 4090. Toggle Reflex+Frame Generation on and you get 80-110fps without more latency. In this situation, why wouldn't you use it?

Unless your game is extremely latency-sensitive, there's no reason not to use Frame Generation. The games that do need extremely fast response are usually twitch shooters or fighters, and those run at 300fps on low-tier GPUs. Cyberpunk, FF XVI, Indiana Jones, or other AAA games should use Frame Generation if available.
Yeah, FSR3 is shit in comparison, from what I've tried.
Anyway, I don't oppose this tech. It's better to have DLSS3 than not.
 

Allandor

Member
Well, this is now about the 4th time they've promised uncompromised path tracing in real time...
It's still not really available in the low-end tiers, so it's still a feature that won't reduce development costs as they promised. Instead it's more AI than ever, which still won't be usable on lower-cost tiers in a meaningful way until the next gen (just like the previous gen...).
 

R6Rider

Gold Member
Are you fucking for real? LIKE FOR REAL?
The 1080 Ti, best GPU ever, was $699 on release. That's ~$900 today with inflation.
The 3080 was $699 in 2020.
Then suddenly the 4080 was $1,200 for no reason (because Nvidia knows suckers will pay scalper prices).
And now the 5080 is what? €1,700?
And where is the efficiency? These cards have bigger chips, consume more power, and cost twice as much as they should. Those are facts.

Meanwhile the PS5? $450, and the Pro $700.
The PS4 was $400 in 2013... that's ~$550 today with inflation.
The PS5 is cheaper than the PS4 was, dude. Nvidia doubled their prices.
And the PS5 Pro is a premium product that's only $250 more expensive. It's not 3x or double the price of the PS5.
In fact, $700 today is worth about $570 in 2020 dollars. So the thing is only a bit more than the PS5 was on release if you do the math.
Nvidia doubled their prices OUTSIDE of inflation. Sony tracks right along with inflation.
[GIF: Morgan Freeman reaction]
 

Gaiff

SBI’s Resident Gaslighter
Why do people always keep spouting marketing without even taking a second to think about what they're saying? There is always a latency penalty to using frame gen. If you use Reflex without frame gen, the latency is lower than using Reflex with frame gen. Nvidia must really love that people voluntarily repeat their marketing falsehoods as truths.
You’re completely missing the point. Every game with frame generation has Reflex. Not every game without frame generation has Reflex.

Reflex + Frame Generation at 100fps has comparable or better latency than no frame generation and no Reflex at 50fps. Yes, latency is lower with frame generation off, but that's only because of Reflex, which is guaranteed to be there whenever frame generation is. We had been playing games without Reflex for years. Why is it suddenly a problem that Reflex+frame gen gives comparable or better latency than no Reflex at all? If it were frame generation alone, you’d be correct, but it comes with Reflex anyway, so why does it matter?
 
Last edited:

V1LÆM

Gold Member
Can't decide if I wait for a 5080 Super or just grab a 5090 ASAP.

If the 5000 series is anything like the 4000 series, I should just get a 5090 and that's it. Also, I currently have a 4080, so I'm not sure how much of an improvement a 5080 would be. It's also 16GB of VRAM, but I really want 24GB or more.

The thing with a 5090 is that I might need a new PSU and case. I just got this case a few months ago, and my PSU is 1000W, which might not be enough if the 5090 can hit 575W. That leaves ~425W for the rest of my system, which is a 7950X3D (want to upgrade at some point), an AIO, 5x RGB fans, a B650E motherboard, and 3x SSDs.
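Back-of-the-envelope version of that power budget; everything except the rumored 575W GPU figure is a rough estimate on my part:

```python
# Rough PSU headroom check. The CPU and "everything else" draws are
# assumptions/estimates, not measured values.
psu_w  = 1000
gpu_w  = 575   # rumored RTX 5090 peak
cpu_w  = 120   # ballpark 7950X3D gaming draw (assumption)
misc_w = 100   # board, RAM, 3x SSDs, AIO pump, 5 fans (assumption)

headroom = psu_w - (gpu_w + cpu_w + misc_w)
print(f"Headroom: {headroom} W")  # ~205 W, which transient spikes can erode fast
```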
 
You’re completely missing the point. Every game with frame generation has Reflex. Not every game without frame generation has Reflex.

Reflex + Frame Generation at 100fps has comparable or better latency than no frame generation and no Reflex at 50fps. Yes, latency is lower with frame generation off, but that's only because of Reflex, which is guaranteed to be there whenever frame generation is. We had been playing games without Reflex for years. Why is it suddenly a problem that Reflex+frame gen gives comparable or better latency than no Reflex at all? If it were frame generation alone, you’d be correct, but it comes with Reflex anyway, so why does it matter?
Are you even thinking about what you're saying? If every game with frame generation has Reflex, that means you can get much lower latency by just using Reflex and not using frame generation. Attempting to compare a game without Reflex and frame generation to a game that has both is like comparing an apple to a pear. It's just a bad comparison, so I'm not sure why someone would go comparing unlike things... unless they were trying to make a bad-faith argument like Nvidia's marketing.
 

rofif

Can’t Git Gud
You’re completely missing the point. Every game with frame generation has Reflex. Not every game without frame generation has Reflex.

Reflex + Frame Generation at 100fps has comparable or better latency than no frame generation and no Reflex at 50fps. Yes, latency is lower with frame generation off, but that's only because of Reflex, which is guaranteed to be there whenever frame generation is. We had been playing games without Reflex for years. Why is it suddenly a problem that Reflex+frame gen gives comparable or better latency than no Reflex at all? If it were frame generation alone, you’d be correct, but it comes with Reflex anyway, so why does it matter?
Reflex is just vsync off + a 1-frame buffer. It was always some sort of option in NVCP before it was called Reflex.
Reflex is only a bit faster than those old options.
We all have VRR monitors. Just keep the limiter below your Hz and the lag is about the best it can be, for the most part.
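The usual community guidance (a rule of thumb, not an NVIDIA spec) is to cap a few fps below refresh so you never leave the VRR window; a tiny sketch:

```python
# Common VRR rule of thumb: limit fps slightly below the refresh rate so
# frames never exceed the VRR window and fall back to vsync/tearing.
def vrr_fps_cap(refresh_hz: int, margin: int = 3) -> int:
    """Suggested frame limiter value for a VRR display (margin is assumed)."""
    return refresh_hz - margin

for hz in (120, 144, 240):
    print(f"{hz} Hz -> cap at {vrr_fps_cap(hz)} fps")
```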
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Are you even thinking about what you're saying?
Are you? Because you're not making sense.
If every game with frame generation has Reflex, that means you can get much lower latency by just using Reflex and not using frame generation. Attempting to compare a game without Reflex and frame generation to a game that has both is like comparing an apple to a pear. It's just a bad comparison, so I'm not sure why someone would go comparing unlike things... unless they were trying to make a bad-faith argument like Nvidia's marketing.
Yes, but again, how much does that super low input latency matter compared to a smoother frame rate and better motion clarity with higher IQ? Case in point: Cyberpunk 2077.

[Chart: Cyberpunk 2077 fps and latency across DLSS / Reflex / Frame Generation combinations]


Reflex wasn't added until frame generation was. In this instance, without Reflex, you had a massive latency of 101.5ms at 42fps. It's reduced to just 63.5ms with Reflex. DLSS Quality+Frame Generation gives 112fps with 62.6ms of latency vs 90fps with 42.7ms of latency for DLSS Performance+Reflex without frame generation. What would you rather have? 20 fewer milliseconds of latency, or DLSS Quality (1440p) over Performance (1080p) and a nice 22fps boost? I can guarantee you, you will absolutely not notice those 20ms of added latency playing Cyberpunk. You will, however, notice the difference between DLSS Q and DLSS P and the 22 extra frames per second. In this instance, DLSS Q + Reflex + Frame Generation > DLSS P + Reflex and no frame generation. Nobody was bitching about the latency before Reflex was added, so why bitch now with Frame Generation? It's basically like playing without Reflex as we did in the past, but with a much higher IQ and fps.

If it were CS2, I'd give a shit about those 20ms playing on a 240Hz monitor, maybe. Cyberpunk though? Give me the better IQ and the clearer motion brought about by the higher frame rate.
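Laying out the numbers quoted from that chart side by side (the values come from the chart, not my own measurements):

```python
# (mode, fps, latency_ms) as quoted from the Cyberpunk chart above.
# Reflex alone cut the 101.5 ms baseline down to 63.5 ms.
modes = [
    ("No Reflex, no FG",      42, 101.5),
    ("DLSS P + Reflex",       90,  42.7),
    ("DLSS Q + Reflex + FG", 112,  62.6),
]

for name, fps, ms in modes:
    print(f"{name:22s} {fps:4d} fps  {ms:6.1f} ms")
# The trade-off: FG costs ~20 ms over DLSS P + Reflex but buys 22 extra fps
# and a higher internal resolution (1440p input vs 1080p).
```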

Reflex is just vsync off + a 1-frame buffer. It was always some sort of option in NVCP before it was called Reflex.
Reflex is only a bit faster than those old options.
We all have VRR monitors. Just keep the limiter below your Hz and the lag is about the best it can be, for the most part.
You can also enable Ultra Low Latency Mode, but it often doesn't play nice with some games and can cause higher latency or even crashes in some instances. The results from fiddling with the Control Panel are inconsistent. Reflex is guaranteed to work properly and lower your latency.
 
Last edited:

Mister Wolf

Member
Are you? Because you're not making sense.

Yes, but again, how much does that super low input latency matter compared to a smoother frame rate and better motion clarity with higher IQ? Case in point: Cyberpunk 2077.

[Chart: Cyberpunk 2077 fps and latency across DLSS / Reflex / Frame Generation combinations]


Reflex wasn't added until frame generation was. In this instance, without Reflex, you had a massive latency of 101.5ms at 42fps. It's reduced to just 63.5ms with Reflex. DLSS Quality+Frame Generation gives 112fps with 62.6ms of latency vs 90fps with 42.7ms of latency for DLSS Performance+Reflex without frame generation. What would you rather have? 20 fewer milliseconds of latency, or DLSS Quality (1440p) over Performance (1080p) and a nice 22fps boost? I can guarantee you, you will absolutely not notice those 20ms of added latency playing Cyberpunk. You will, however, notice the difference between DLSS Q and DLSS P and the 22 extra frames per second. In this instance, DLSS Q + Reflex + Frame Generation > DLSS P + Reflex and no frame generation. Nobody was bitching about the latency before Reflex was added, so why bitch now with Frame Generation? It's basically like playing without Reflex as we did in the past, but with a much higher IQ and fps.

If it were CS2, I'd give a shit about those 20ms playing on a 240Hz monitor, maybe. Cyberpunk though? Give me the better IQ and the clearer motion brought about by the higher frame rate.

Whenever frame gen is brought up, they always act like every game they play requires Counter-Strike 2 levels of precision. The same guys play on console as well, at 30-60fps, with no complaints.
 
Last edited:

rofif

Can’t Git Gud
Are you? Because you're not making sense.

Yes, but again, how much does that super low input latency matter compared to a smoother frame rate and better motion clarity with higher IQ? Case in point: Cyberpunk 2077.

[Chart: Cyberpunk 2077 fps and latency across DLSS / Reflex / Frame Generation combinations]


Reflex wasn't added until frame generation was. In this instance, without Reflex, you had a massive latency of 101.5ms at 42fps. It's reduced to just 63.5ms with Reflex. DLSS Quality+Frame Generation gives 112fps with 62.6ms of latency vs 90fps with 42.7ms of latency for DLSS Performance+Reflex without frame generation. What would you rather have? 20 fewer milliseconds of latency, or DLSS Quality (1440p) over Performance (1080p) and a nice 22fps boost? I can guarantee you, you will absolutely not notice those 20ms of added latency playing Cyberpunk. You will, however, notice the difference between DLSS Q and DLSS P and the 22 extra frames per second. In this instance, DLSS Q + Reflex + Frame Generation > DLSS P + Reflex and no frame generation. Nobody was bitching about the latency before Reflex was added, so why bitch now with Frame Generation? It's basically like playing without Reflex as we did in the past, but with a much higher IQ and fps.

If it were CS2, I'd give a shit about those 20ms playing on a 240Hz monitor, maybe. Cyberpunk though? Give me the better IQ and the clearer motion brought about by the higher frame rate.


You can also enable Ultra Low Latency Mode, but it often doesn't play nice with some games and can cause higher latency or even crashes in some instances. The results from fiddling with the Control Panel are inconsistent. Reflex is guaranteed to work properly and lower your latency.
Yeah, true. Reflex works at the game level. Some games don't want to listen to system-level settings.
 

Kataploom

Gold Member
Yeah, same as how pristine porcelain tiles with perfectly mirrored neon purple reflections and asphalt that reflects the street lamps as if they're growing into the floor suddenly exist everywhere in modern game environments. Because Nvidia sponsored the RTX in, and gamers were weirdly quick to believe these reflections actually look "accurate".
Damn, this is my opinion on it as well. RT off looks more consistently good because surfaces aren't overly reflecting everything just for the sake of "wow"s.
 

DirtInUrEye

Member
Worst-case scenario for owners of the higher-end 3000- and 4000-series cards who can't resist the urge to upgrade: you can still sell your existing card at Nvidia used prices, which normally means bloody good money, maybe a third or even half the cost of the SKU you decide to buy next. I reckon people with a 4070 Ti Super and upwards will still make daft money on eBay and the like even twelve months from now.
 

Buggy Loop

Gold Member
It's true. You guys are fake enthusiasts, probably like Asmongold. Loud mouths and probably nothing else. I've had a whole life of PC experience to be able to appreciate what consoles are doing.
You would at least understand if you had the experience.
I am free of this PC bullshit and it feels great. That said, I have both. I just finished Indiana Jones on PC, you know? I have a bigger Steam library than your brain cell count.


[GIF: laughing hysterically and pointing]


Oh look, he's off his meds again and now completely unhinged. There's typically a buffer of one week before he's banned.

I've been gaming since the Commodore 64. I was programming games on it in BASIC at 8. I've built my PCs starting with the 386 architecture. I've never even seen you in any nostalgic PC threads. No, I don't believe you're any kind of hardcore, not with the way you navigate game by game through PCGamingWiki to solve anything :messenger_tears_of_joy:

Do share your Steam profile, we'll get a laugh.
 

rofif

Can’t Git Gud
[GIF: laughing hysterically and pointing]


Oh look, he's off his meds again and now completely unhinged. There's typically a buffer of one week before he's banned.

I've been gaming since the Commodore 64. I was programming games on it in BASIC at 8. I've built my PCs starting with the 386 architecture. I've never even seen you in any nostalgic PC threads. No, I don't believe you're any kind of hardcore, not with the way you navigate game by game through PCGamingWiki to solve anything :messenger_tears_of_joy:

Do share your Steam profile, we'll get a laugh.
Then you should be smarter
 

The Cockatrice

I'm retarded?
Can't wait for Nvidia to pay engine/game companies lots of money to unoptimize their games so consumers are forced to buy their overpriced GPUs because RAYTRACING!!! which 80% of consumers disable.
 

winjer

Gold Member
I think frame gen is pretty great. Combined with Reflex, you end up with a much smoother image with better motion clarity without incurring a latency penalty compared to native without Reflex. You can also just enable Reflex to greatly lower latency, but in single player games, 40 to 60ms hardly matters.

I'm playing Darktide currently. And it really benefits from the lower latency.
I prefer to have FG turned off and have the lowest latency. It's impressive how responsive the game becomes like that.
 

justiceiro

Marlboro: Other M
So you are saying the company that released DLSS with the 20 series, DLSS 2.0 with the 30 series, and DLSS 3.0 with the 40 series will dare to release DLSS 4 with the 50 series???? For real????? I can't believe it!!!

One thing consoles do better than PC: console rumours are much wilder and simply made up. PC rumours are simply true most of the time; you don't even have room for speculation.
 
Can't wait for Nvidia to pay engine/game companies lots of money to unoptimize their games so consumers are forced to buy their overpriced GPUs because RAYTRACING!!! which 80% of consumers disable.
Ray tracing is the future, and it's been here since 2018. It's no longer a novelty; even Nintendo plans to support it.

You guys remind me of when nVidia released the GeForce 256 back in 1999 and most people thought T&L was "useless", so they preferred the ATi Rage Fury MAXX instead (which didn't age like FineWine):

[image]


20-25-year-old RGB-bling PCMR gamers weren't even born back then, so it's understandable if you don't remember.

Same thing when nVidia released programmable shaders with the GeForce 3 (by then T&L was the norm). Every new technology is deemed "useless" in the beginning, especially if your favorite company doesn't support it that well (sour grapes and all).

It's a shame AMD is so far behind nVidia these days, but it wasn't always like that (e.g. Radeon 9700 vs GeForce FX, Evergreen/5XXX vs Fermi, and even Polaris 8GB @ $199 was a great GPU in its time).

And last but not least, let's not pretend AMD has never pushed marketing BS (64-bit CPUs may help with draw distance, but everything else such as increased vegetation is being handled by the GPU):

AMD's goals are quite admirable, but the fact of the matter is that none of the visual improvements enabled by the Far Cry patches had anything to do with AMD64 or EM64T. They are artificially limited to run on those platforms alone, but could work just as well on a 32-bit platform.

How many of you had a gaming-capable PC back then?
 
Last edited:

The Cockatrice

I'm retarded?
It's no longer a novelty,
Stats say otherwise

[Chart: VRAM / resolution usage stats]


Most people are still at 8GB of VRAM and most are still at 1080p, which means most people will not enable ray tracing. It's still a novelty. It'll be less so in 5 years or so, assuming the next-gen and next-next-gen GPUs, as well as the new consoles, are affordable, which I doubt.
 
Last edited:

diffusionx

Gold Member
4090cels finna be seething when Nvidia tells them their $1800 card doesn't support this new software lmao

Stats say otherwise

[Chart: VRAM / resolution usage stats]


Most people are still at 8GB of VRAM and most are still at 1080p, which means most people will not enable ray tracing. It's still a novelty. It'll be less so in 5 years or so, assuming the next-gen and next-next-gen GPUs, as well as the new consoles, are affordable, which I doubt.
People on this forum don't understand this, don't even try. The total disconnect between what they think the PC market is (people running $1000 GPUs) and what it actually is (people playing Minecraft on their laptop) is crazy.
 
Last edited:
Are you? Because you're not making sense.
Ironic. My point of contention was with your initial statement: using Reflex alone provides lower latency than using Reflex plus frame gen, so your initial statement was incorrect. You have now turned the discussion into something else that was not even being discussed. Essentially, you've built a beautiful straw man to argue against a point I never made.
Yes, but again, how much does that super low input latency matter compared to a smoother frame rate and better motion clarity with higher IQ? Case in point: Cyberpunk 2077.

[Chart: Cyberpunk 2077 fps and latency across DLSS / Reflex / Frame Generation combinations]
Who the heck was arguing about motion clarity? Like I said, a strawman.
Reflex wasn't added until frame generation was. In this instance, without Reflex, you had a massive latency of 101.5ms at 42fps. It's reduced to just 63.5ms with Reflex. DLSS Quality+Frame Generation gives 112fps with 62.6ms of latency vs 90fps with 42.7ms of latency for DLSS Performance+Reflex without frame generation. What would you rather have? 20 fewer milliseconds of latency, or DLSS Quality (1440p) over Performance (1080p) and a nice 22fps boost? I can guarantee you, you will absolutely not notice those 20ms of added latency playing Cyberpunk. You will, however, notice the difference between DLSS Q and DLSS P and the 22 extra frames per second. In this instance, DLSS Q + Reflex + Frame Generation > DLSS P + Reflex and no frame generation. Nobody was bitching about the latency before Reflex was added, so why bitch now with Frame Generation? It's basically like playing without Reflex as we did in the past, but with a much higher IQ and fps.

If it were CS2, I'd give a shit about those 20ms playing on a 240Hz monitor, maybe. Cyberpunk though? Give me the better IQ and the clearer motion brought about by the higher frame rate.


You can also enable Ultra Low Latency Mode, but it often doesn't play nice with some games and can cause higher latency or even crashes in some instances. The results from fiddling with the Control Panel are inconsistent. Reflex is guaranteed to work properly and lower your latency.
Again, arguing a point I never made. Just admit you're wrong, dude. You're just talking about a bunch of stuff no one was discussing. What I initially asserted is a fact: Reflex by itself provides lower latency than Reflex plus frame generation. That's the beginning and end of the argument I presented. The rest of your ramblings are yours alone and have nothing to do with my argument.
 
Stats say otherwise

[Chart: VRAM / resolution usage stats]


Most people are still at 8GB of VRAM and most are still at 1080p, which means most people will not enable ray tracing. It's still a novelty. It'll be less so in 5 years or so, assuming the next-gen and next-next-gen GPUs, as well as the new consoles, are affordable, which I doubt.
Even 6GB VRAM cards such as the RTX 2060 (not the SUPER) support ray tracing and can run Indiana Jones just fine at 60fps on low settings, let alone 8GB cards such as the 2060S/3060/4060.


I remember people suggesting the RX 5700 instead back then, due to it having more VRAM and slightly better raster performance, but it didn't age that well, eh?

AMD released the RX 5700 six months after the RTX 2060, back in 2019, when the rumors about next-gen consoles (PS5/Xbox Series) were rampant and we knew they would support ray tracing. So why invest in a GPU with no RT features/acceleration?

Besides that, ray tracing reduces the work game devs have to do (baked lighting can look good, see Uncharted 2/3, but it requires tons of manual work), and we already live in an era where the AAA industry is crumbling due to crunch, lay-offs, etc.

Back in the 90s we had prerendered backgrounds in video games because PCs couldn't render those scenes in real time. That's no longer the norm. Technology progresses and you have to keep up with it (and by that I don't mean you have to buy a 4090 or 5090).
 

3liteDragon

Member
If the texture upscaling rumor for DLSS 4 is true, then I don't see it being a 50-series-exclusive feature. It sounds like something that could work on all RTX cards, or on the 30 series at least. The white paper NVIDIA did on this last year shows it being tested on a 4090.
 