
Nvidia DLSS 4 Announcement Soon

Gaiff

SBI’s Resident Gaslighter
Ironic. My point of contention with your initial statement is that using Reflex provides lower latency than using Reflex plus frame gen, so your initial statement was incorrect. You have now turned the discussion into something else that was not even being discussed. Essentially, you've built a beautiful straw man to argue a point I never made.

Who the heck was arguing about motion clarity? Like I said, strawman.

Again, you're arguing a point I never made. Just admit you're wrong, dude. You're just talking about a bunch of stuff no one was discussing. What I initially asserted is a fact: Reflex by itself provides lower latency than Reflex plus frame generation. That's the beginning and end of the argument I initially presented. The rest of your ramblings are yours alone and have nothing to do with my argument.
I was arguing that Reflex+Frame Generation is better than no Reflex or Frame Generation because it provides similar latency to native without Reflex. You got your panties in a bunch and accused me of spouting falsehoods. Looks like the numbers aren’t falsehoods, huh?

Why would I admit to being wrong when you’re the one who jumped in and completely misrepresented the point I was making?
 
Last edited:

Tallahassee

Member
I think frame gen is pretty great. Combined with Reflex, you end up with a much smoother image with better motion clarity without incurring a latency penalty compared to native without Reflex. You can also just enable Reflex to greatly lower latency, but in single-player games, 40 to 60 ms hardly matters.
It absolutely is; FG helps a lot with CPU bottlenecks.
A great recent example of its use is Space Marine 2.
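For anyone following the latency back-and-forth above, here is a rough toy model of input-to-photon latency. The stage counts (about two queued frames without Reflex, roughly one extra real-frame hold for interpolation-based frame gen, a flat display delay) are illustrative assumptions, not measurements:

```python
# Toy model of input-to-photon latency (ms). All stage counts are assumptions
# chosen only to illustrate the ordering argued in the thread.

def latency_ms(base_fps, queued_frames, fg_hold_frames, display_ms=8.0):
    frame_ms = 1000.0 / base_fps                 # time to render one real frame
    render_queue = queued_frames * frame_ms      # frames buffered ahead of the GPU (Reflex ~empties this)
    fg_hold = fg_hold_frames * frame_ms          # interpolation FG holds the newest real frame before display
    return render_queue + frame_ms + fg_hold + display_ms

fps = 60  # base (real) frame rate
print("Native, no Reflex  :", round(latency_ms(fps, queued_frames=2, fg_hold_frames=0)), "ms")
print("Reflex only        :", round(latency_ms(fps, queued_frames=0, fg_hold_frames=0)), "ms")
print("Reflex + frame gen :", round(latency_ms(fps, queued_frames=0, fg_hold_frames=1)), "ms")
```

With these made-up numbers, the ordering matches both sides of the argument: Reflex alone is lowest, while Reflex + frame gen lands back in the same ballpark as native without Reflex.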
 

Zathalus

Member
Stats say otherwise



Most people are still at 8GB VRAM and most are still at 1080p, which means most people will not enable raytracing. It's still a novelty. It'll be less so in five years or so, assuming the next-gen and next-next-gen GPUs, as well as the new consoles, are affordable, which I doubt.
If you look at the actual GPU breakdown, over 25% of GPUs on the Survey are a 3070/4060 Ti or better. That class of GPU and up is more than enough for enabling RT effects. 25% is a minority, true, but combine that with the staggering number of users Steam has and you are looking at somewhere in the region of ~50 million or more users for whom RT is perfectly usable. Hardly a novelty anymore, although not at the level of being standard yet.
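The arithmetic behind that ballpark, assuming an active Steam user base on the order of 200 million (the base figure is an assumption, not from the Survey):

```python
# Back-of-the-envelope headcount: Survey share of 3070/4060 Ti-class-or-better GPUs
# multiplied by an assumed active-user base.
assumed_steam_users = 200_000_000   # assumption for illustration
rt_capable_share = 0.25             # share of surveyed GPUs at 3070/4060 Ti class or better
print(f"~{assumed_steam_users * rt_capable_share / 1e6:.0f} million users with RT-capable cards")
```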
 

The Cockatrice

I'm retarded?
Even 6GB VRAM cards such as the RTX 2060 (not SUPER) support ray tracing and can run Indiana Jones just fine at 60 fps on low settings, let alone 8GB cards such as the 2060S/3060/4060.

Cards can run modern games, but those cards can't run those games with RAYTRACING ON at any reasonable performance, which was my point. You missed the point entirely.

If you look at the actual GPU breakdown, over 25% of GPUs on the Survey are a 3070/4060 Ti or better. That class of GPU and up is more than enough for enabling RT effects. 25% is a minority, true, but combine that with the staggering number of users Steam has and you are looking at somewhere in the region of ~50 million or more users for whom RT is perfectly usable. Hardly a novelty anymore, although not at the level of being standard yet.

They can enable the raytracing settings ofc, no one is arguing that, but at reasonable performance? Nah, no one is doing that, hence why it's a novelty. If you guys wanna live in a bubble, go ahead, but the fact is, no one with an 8GB card is going to play at garbage performance with RT on. Hell, even I with 12GB rarely enable that shit.
 
Last edited:

64bitmodels

Reverse groomer.
I hope that neural rendering tech is put to good use then. Anything else involving AI or frame gen, I couldn't care less about.
You guys remind me of when nVidia released the GeForce 256 back in 1999 and most people thought T&L was "useless," so they preferred the ATi Rage Fury MAXX instead (which didn't age like FineWine):
GeForce 256 price in 1999: $199. Adjusted for inflation, that's under $400 today.
GeForce RTX 4090: $1,599.

Gee, I wonder why T&L became the standard while raytracing is still lagging in mass adoption.
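A quick sanity check on that inflation claim; the cumulative CPI factor from 1999 to the mid-2020s (roughly 1.9x) is an approximation:

```python
# Rough inflation adjustment for the GeForce 256 launch price.
launch_price_1999 = 199
cpi_factor = 1.9          # approximate cumulative US inflation, 1999 -> mid-2020s
print(f"≈ ${launch_price_1999 * cpi_factor:.0f} in today's dollars")   # ≈ $378, i.e. under $400
```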
 
Last edited:
Cards can run modern games, but those cards can't run those games with RAYTRACING ON at any reasonable performance, which was my point. You missed the point entirely.

They can enable the raytracing settings ofc, no one is arguing that, but at reasonable performance? Nah, no one is doing that.
nVidia cards from 2019 can run modern games at reasonable performance with RT ON:



It's not nVidia's fault AMD/Intel have subpar RT implementations (to be fair, Intel is advancing Arc GPUs faster than AMD does with Radeon).

AMD has no excuse IMHO. They have captured the home console market, which is not small at all (PS4, PS4 Pro, XBOX ONE, XBOX ONE X, PS5, PS5 Pro, XBOX Series S/X), so they have a steady stream of income.

Maybe they should bring Raja back, I don't know. In my eyes RTG in 2025 is worse than it was in 2016.

It's ridiculous that Sony did all the R&D for AI upscaling (PSSR) while AMD did nothing to compete against DLSS in the PC GPU market for 6 years (2018-2024). Even Intel brought out XeSS a while ago, despite not having a single console contract and a declining market cap.
 

Zathalus

Member
They can enable the raytracing settings ofc, no one is arguing that, but at reasonable performance? Nah, no one is doing that.
Of course it is at reasonable performance. Cyberpunk 2077 can be set to Ultra RT, 1080p+DLSS Quality at 70-80fps on a 4060ti. Drop to RT medium, and you can even do 60fps at 1440p DLSS Quality on a 4060ti. Dying Light RT + 1440p at over 70 FPS can be done. Indiana Jones 1440p + RT at 60 fps. Metro Exodus Enhanced Edition 1440p, Ultra settings, DLSS Quality around 90FPS. Drop non-RT settings to optimized ones, and performance will be even higher if you wish.

Naturally, you won't push 4k or path tracing with a 4060ti. But 1080p and RT or even 1440p and RT is perfectly possible and at good performance as well.
 
Last edited:
Geforce 256 price in 1999: $199. Adjusted for inflation that's under 400 today
Geforce RTX 4090: $1599.

gee, i wonder why T&L became the standard while Raytracing is still lagging for mass implementation.
GeForce 256 TDP: 18 watts

GeForce RTX 4090 TDP: 450 watts

Wanna tell me what TDP tier you can buy these days at $199?

The 4090 isn't made solely for gamers (like the GF256 was); it's a card for productivity (AI, etc.), and it pays for itself pretty fast that way.

You also ignore the fact that PCBs are more complex (more layers) and cooling (copper heatpipes, etc.) is more elaborate. These things cost money; they're not deflationary (unlike Moore's law).

5090 will be more expensive than 4090 for one simple reason: 512-bit bus. If you know (PCB design), you know.

And no, T&L didn't become the industry standard from day one, as you insinuate. It took many years for mass adoption; 3Dfx was still strong back in 1999.
 
Last edited:

//DEVIL//

Member
If DLSS 4 has the neural rendering, then it's going to be 5000-series exclusive.

Not new, but a typical middle finger to 4000 and 3000 series owners, heh.
 
Wouldn't go as far as Apple, but they sure are just as anti-consumer as any other large tech company lol.
Wanna know what pisses me off?

We used to say consoles tend to follow a razor-and-blades business model (the console is sold cheap, or even at a loss, but games are a bit more expensive to make up for it). PS3/PS4 were like that.

PS5 established even more expensive games ($70, or even €80 in the EU), $10 next-gen upgrades, tons of remasters (despite having PS4 BC), AND, on top of that, an expensive console (PS5 was cheap at $500 back in 2020, but the PS5 Slim at $500 is expensive in 2025, let alone the PS5 Pro at €800 with no disc drive, which costs another €150-200 because Sony refuses to produce more units).

At almost €1,000, why not build a PC instead? (I know it's not for everyone, but that's not my point.)

Let alone PS+ Essential, which is expensive as hell these days (and no sales either; they're trying to upsell you to the Extra/Premium tiers).

Sony has become too arrogant and I have no idea if Microsoft will ever be able to compete against them.
 
Last edited:

64bitmodels

Reverse groomer.
GeForce 256 TDP: 18 watts

GeForce RTX 4090 TDP: 450 watts

Wanna tell me what TDP tier you can buy these days at $199?

The 4090 isn't made solely for gamers (like the GF256 was); it's a card for productivity (AI, etc.), and it pays for itself pretty fast that way.

You also ignore the fact that PCBs are more complex (more layers) and cooling (copper heatpipes, etc.) is more elaborate. These things cost money; they're not deflationary (unlike Moore's law).

5090 will be more expensive than 4090 for one simple reason: 512-bit bus. If you know (PCB design), you know.

And no, T&L didn't become the industry standard from day one, as you insinuate. It took many years for mass adoption; 3Dfx was still strong back in 1999.
Hate to be dismissive, but none of what you said matters.

Nobody's gonna raytrace shit if people don't have the money for it. The 4070 Super is the bare minimum for decent raytracing performance, and it's 600 bucks!!!

This hate for raytracing is simply because the cards best capable of it are also the ones that cost about as much as a rent payment. Making a production-class GPU and marketing it to gamers instead of separating the branding is real dumb, and even if it were a production-class card, it doesn't excuse the insane jump in prices the 80 series had.

Till we get a raytracing card with 4070-level RT performance for under $299, this technology is not becoming standard.
 
Hate to be dismissive, but none of what you said matters.

Nobody's gonna raytrace shit if people don't have the money for it. The 4070 Super is the bare minimum for decent raytracing performance, and it's 600 bucks!!!

This hate for raytracing is simply because the cards best capable of it are also the ones that cost about as much as a rent payment. Making a production-class GPU and marketing it to gamers instead of separating the branding is real dumb, and even if it were a production-class card, it doesn't excuse the insane jump in prices the 80 series had.

Till we get a raytracing card with 4070-level RT performance for under $299, this technology is not becoming standard.
The Fed, TSMC, AI, and lack of competition drive the prices up.

And there are rumors about Trump imposing 40% tariffs:

 

Klik

Member
I remember 10 years ago you could build a high-end gaming PC for around $1,200. Now it's around $3,000.

Yes, inflation etc., but unless everyone's wages went up 3x in the last 10 years, paying that much for a high-end PC is crap.
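Rough numbers on how much of that jump is inflation versus a real price increase (the ~1.33x CPI factor for 2015 to the mid-2020s is an approximation):

```python
# How much of the $1,200 -> $3,000 jump is inflation vs. a real price increase?
build_2015 = 1200
build_today = 3000
cpi_factor = 1.33   # approximate cumulative US inflation, 2015 -> mid-2020s
inflation_adjusted = build_2015 * cpi_factor
print(f"$1,200 in 2015 ≈ ${inflation_adjusted:.0f} today")
print(f"Real-terms increase: ~{build_today / inflation_adjusted:.1f}x")
```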
 

Gaiff

SBI’s Resident Gaslighter
I remember 10 years ago you could build a high-end gaming PC for around $1,200. Now it's around $3,000.

Yes, inflation etc., but unless everyone's wages went up 3x in the last 10 years, paying that much for a high-end PC is crap.
COVID coupled with the mining boom showed NVIDIA how much people were willing to pay. It made them realize these gamer schmucks would be alright paying $1500 for a top-tier GPU instead of $750.
 

Gaiff

SBI’s Resident Gaslighter
I'm playing Darktide currently. And it really benefits from the lower latency.
I prefer to have FG turned off and have the lowest latency. It's impressive how responsive the game becomes like that.
Fair enough. Haven’t tried this game. I assume it has Reflex as well?

Whatever the case, it was a boon every time I enabled it.
 

Thebonehead

Gold Member
You realize they are not pushing any crap forward? It's just another gimmick to make some fools spend $2k on a new GPU to play GaaS games.
Real games are still on PS5, day 1.
/s but not really?

You don't have grounds to critique me here. I am probably more of an enthusiast than you guys are. At least I was, until it started to feel like I was paying more just to be a sucker.


They Pull Me Back In Al Pacino GIF by The Godfather
 

manfestival

Member
In AMD we continue to put our hopes and dreams, knowing that they will inevitably drop the ball on their offering like they do every generation. Everything they provide is perfectly capable yet somehow insufficient at the same time.
 
In AMD we continue to put our hopes and dreams, knowing that they will inevitably drop the ball on their offering like they do every generation. Everything they provide is perfectly capable yet somehow insufficient at the same time.
Maybe UDNA in 2026-2027 (PS6 GPU) will be able to compete against nVidia, but nVidia will have a new GPU microarchitecture in the market by then...

RDNA is kinda like GCN back in 2016, but way more expensive compared to Polaris.
Tom's Hardware [TH], Paul Alcorn: So, with UDNA bringing those architectures back together, will all of that still be backward compatible with the RDNA and the CDNA split?

JH: So, one of the things we want to do is ...we made some mistakes with the RDNA side; each time we change the memory hierarchy, the subsystem, it has to reset the matrix on the optimizations. I don't want to do that.

What precisely will UDNA change compared to the current RDNA and CDNA split? Huynh didn't go into a lot of detail, and obviously there's still plenty of groundwork to be laid. But one clear potential pain point has been the lack of dedicated AI acceleration units in RDNA. Nvidia brought tensor cores to the entire RTX line starting in 2018. AMD only has limited AI acceleration in RDNA 3, basically accessing the FP16 units in a more optimized fashion via WMMA instructions, while RDNA 2 depends purely on the GPU shaders for such work.

Our assumption is that, at some point, AMD will bring full stack support for tensor operations to its GPUs with UDNA. CDNA has had such functional units since 2020, with increased throughput and number format support being added with CDNA 2 (2021) and CDNA 3 (2023). Given the preponderance of AI work being done on both data center and client GPUs these days, adding tensor support to client GPUs seems like a critical need.
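For context on what "tensor support" boils down to: the hardware primitive is a small mixed-precision matrix multiply-accumulate tile (commonly 16x16x16, FP16 inputs with FP32 accumulation). Here is a minimal NumPy sketch of that primitive purely as an illustration; the tile size and precisions follow the common convention, not any specific AMD or NVIDIA instruction:

```python
import numpy as np

# One matrix-multiply-accumulate tile: D = A @ B + C, FP16 inputs, FP32 accumulator.
# Dedicated tensor/matrix units execute a tile like this as one hardware operation;
# a GPU without them (e.g. RDNA 2's approach) has to emulate it with many shader FMAs.
M = N = K = 16
A = np.random.rand(M, K).astype(np.float16)
B = np.random.rand(K, N).astype(np.float16)
C = np.zeros((M, N), dtype=np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C   # accumulate in FP32 to limit rounding error
print(D.shape, D.dtype)   # (16, 16) float32
```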
 
Last edited:

Larxia

Member
I remember when 1,000€ used to be the price for the super-high-end stuff like the GTX 690 and GTX Titan back in the day...
The kind of stuff that everyone thought was way too expensive, something for the rich or for pros, and people instead went for the 70- and 80-class GPUs that were around 400€ and 500€...
Better days.
 

The Cockatrice

I'm retarded?
nVidia cards from 2019 can run modern games at reasonable performance with RT ON:

The video you showed doesn't have RT on.

Of course it is at reasonable performance. Cyberpunk 2077 can be set to Ultra RT, 1080p+DLSS Quality at 70-80fps on a 4060ti.

Cyberpunk is an incredibly well optimized game in its final state and it's probably the only exception to the rule.





Keep clowning all you want.
 

YeulEmeralda

Linux User
I remember 10 years ago you could build a high-end gaming PC for around $1,200. Now it's around $3,000.

Yes, inflation etc., but unless everyone's wages went up 3x in the last 10 years, paying that much for a high-end PC is crap.
A high end PC does last longer these days.
I mean a 5090 will keep you on ultra for 5 years.
 
You haven't played the game, have you? It requires a GPU capable of raytracing, but the game doesn't use raytracing by default, and it can be disabled.
Gotcha, you're clueless.

Ray tracing is mandatory. Path tracing is optional.

Educate yourself before spouting nonsense.

Many people with GTX cards are crying because they cannot play it... even with cards such as 1080 Ti.
 

The Cockatrice

I'm retarded?
Gotcha, you're clueless.

Ray tracing is mandatory. Path tracing is optional.

Educate yourself before spouting nonsense.

Many people with GTX cards are crying because they cannot play it... even with cards such as 1080 Ti.




EDIT: Meh, I was wrong. The game uses RTGI by default, or some very light form of it, similar to how Metro Exodus did it, which runs even on the base PS5 at 4K/60fps.
 
Last edited:

Buggy Loop

Gold Member
You haven't played the game, have you? It requires a GPU capable of raytracing, but the game doesn't use raytracing by default, and it can be disabled.

Come On What GIF by MOODMAN


Full RT = path tracing

When Full RT is off, it falls back to ray tracing, always. Even Series S & Series X are on ray tracing.

Not even mods have removed the requirement because there's no raster GI to fall back to.

edit - OMG your posts above. Never go full confidence when so wrong

 
Last edited:

The Cockatrice

I'm retarded?
You're a fucking clown.

Do you know what PATH tracing means?

Welcome to my ignore list, retard.

Guess I am an idiot.

Come On What GIF by MOODMAN


Full RT = path tracing

When Full RT is off, it falls back to ray tracing, always. Even Series S & Series X are on ray tracing.

edit - OMG your posts above. Never go full confidence when so wrong


Yeah, I just checked. The in-game settings confirm it. I edited my post. My main point from before the Indiana Jones discussion still stands, though.

EDIT: Meh, I was wrong. The game uses RTGI by default, or some very light form of it, similar to how Metro Exodus did it, which runs even on the base PS5 at 4K/60fps.

:(
 
Last edited:

64bitmodels

Reverse groomer.
A high end PC does last longer these days.
I mean a 5090 will keep you on ultra for 5 years.
Not when the 1080 Ti exists. $699 at the time, and it got you performance that lasted a whole console gen. Even now, 1080 Tis are still kicking and have better specs in some respects than modern budget cards with the same power (11GB of VRAM instead of the 8GB that the 3060, 4060, 6600, 6600 XT, 7600, 7600 XT, etc. have).
 

Fabieter

Member
At least Apple supports their old hardware. The latest iOS runs on phones from 2018. Nvidia drops features from hardware that is perfectly capable of running it (proven by hackers) in 2 years.

You mean like it was proven that they slowed down their phones with new updates in the past?

It's not all bad, but it's ridiculous to call Apple better than Sony as far as greed goes.
 

kevboard

Member
Not when the 1080 Ti exists. $699 at the time, and it got you performance that lasted a whole console gen. Even now, 1080 Tis are still kicking and have better specs in some respects than modern budget cards with the same power (11GB of VRAM instead of the 8GB that the 3060, 4060, 6600, 6600 XT, 7600, 7600 XT, etc. have).

Yup, technically the 1080 Ti can outperform current-gen consoles in many titles. It sadly gets left behind slowly but surely due to new tech kneecapping it.
While you can get some raytracing running at playable (console-like) framerates on it, more modern RT implementations run less and less playably on it or don't support it at all anymore.
And now mesh shaders are getting used as well, which the 1080 Ti can't handle either.
 

Jesb

Member
Are we expecting these cards to be a part of GFN, or will the Ultimate tier stick with the 4080 for a long time?
 

Zathalus

Member
The video you showed doesn't have RT on.



Cyberpunk is an incredibly well optimized game in its final state and it's probably the only exception to the rule.





Keep clowning all you want.

I literally said path tracing is out of the question, and you show me videos with path tracing? Alan Wake 2 is doubly funny because even with zero RT the game is still incredibly demanding; it's just the way it is.

Meanwhile, here is a list of games you can play with RT at 60+ FPS on a 4060 Ti when leveraging DLSS Quality at 1080p/1440p:

Indiana Jones
Metro Exodus
Cyberpunk 2077
Control
Witcher 3
Dying Light 2
Ratchet & Clank
Spider-Man
RE: Village
Dragons Dogma 2
Doom Eternal
Avatar
Watch Dogs: Legion

There are still others, I am sure, plus don't forget all the UE5 games that use Lumen, like Black Myth, Robocop, or Silent Hill 2. Lumen is a form of ray tracing, after all.
 

Zathalus

Member
At least Apple supports their old hardware. The latest iOS runs on phones from 2018. Nvidia drops features from hardware that is perfectly capable of running it (proven by hackers) in 2 years.
If you are referring to Frame Generation, it was never successfully made to work on 3000-series GPUs. That was debunked. There are mods that replace DLSS Frame Generation with FSR 3, but that is a totally different technology and is worse as well.
 