
DLSS 4 (the new Super Resolution and Ray Reconstruction models) will be compatible with ALL RTX GPUs

llien

Member
(that DLSS is nearly flawless and even better than native was never claimed by anyone)
 

T-800

Neo Member
It will be backward compatible with all games that use DLSS and can be forced from the new Nvidia APP in all games; it will be available with the release of the new cards.

Fucking APP

Guess we have to thank AI sales for backwards compatibility; hopefully it runs the same or better on my 20 series.
 
yup, it's already better in quality mode than native TAA in many games, but native TAA is also kinda shit... so being better than kinda shit was already very welcome and is why people love DLSS. and the more it improves over native TAA, the better.

like in Doom Eternal or Indiana Jones, not using DLSS quality mode and instead running native with TAA is literally costing you image quality while running worse. and while not looking perfect, the fact that it looks objectively better than native TAA while giving you a performance boost is great.

and now this great thing gets a massive quality boost.
Both DLSS and TAA will look soft without a sharpening mask. With my ReShade settings TAA / DLSS quality improves dramatically, but maybe with this improved model the DLSS image will look sharp even without ReShade.

BTW. DLSS looks better than native TAA in EVERY SINGLE GAME if you combine DLSS performance + DLDSR and you still get a higher framerate compared to native TAA. I see no point in playing at native TAA when games support DLSS.
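For anyone wondering how that combo adds up, here's the rough math in a quick Python sketch. The scale factors used (2.25x for DLDSR, 0.5x per axis for DLSS Performance) are the commonly cited ones; the exact factors are Nvidia's and can vary per game, so treat this as an illustration only.

```python
# Rough arithmetic for the DLDSR + DLSS Performance combo:
# DLDSR presents an output ABOVE panel resolution, and DLSS then
# renders internally BELOW that output resolution.

def internal_resolution(panel_w, panel_h, dldsr_factor, dlss_scale):
    """Output resolution is panel * sqrt(dldsr_factor) per axis;
    the internal render resolution is that times the DLSS axis scale."""
    axis = dldsr_factor ** 0.5
    out_w, out_h = round(panel_w * axis), round(panel_h * axis)
    return out_w, out_h, round(out_w * dlss_scale), round(out_h * dlss_scale)

# 1440p panel, 2.25x DLDSR, DLSS Performance (0.5x per axis):
# output 3840x2160, internal render 1920x1080 -- i.e. you render
# fewer pixels than native 1440p yet present a supersampled image.
print(internal_resolution(2560, 1440, 2.25, 0.5))
```

So the "higher framerate than native TAA" claim checks out on pixel count alone: the GPU shades a 1080p image while the display receives a downsampled 4K one.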
 

Wolzard

Member
Console is really a scam if you think about it... RTX20 is receiving the new features, but to get the PiSSeR you need to burn $700, and the customers who bought a base PS5 get the finger... and it isn't even worth it.

If that weren't the case, why would people buy the PS5 Pro? It's the same thing as the SSD, which made games "only possible on the PS5" until they came out on the PC running on HDDs. Or when they announced that the games would come out on PS4 too.

Console is a machine to take all the money from the player. :messenger_tears_of_joy:
 

llien

Member
i do think DLSS looks better than native in a lot of games
It's magic. :D
Yeah, well, maybe.

I did have this sort of conversation in the early days of glorified TAA upscaling. (Nobody seriously pushed that claim with the actual AI upscaling, DLSS 1.)

I was challenged into "guess which of these is native" in all seriousness.
For some magical reason, the pics with more blur and less detail appeared to be universally the ones with glorified TAA upscaling applied.

So, that's where my skepticism comes from, cough.

On top of anything hyped by PF more likely than not being scheisse.

Because Raytracing is going to become a requirement in almost all big games going forward.
When was it that the "hardwahr RT" gimmick was first hyped, mm, 6+ years ago, right?
I'm pretty sure this opinion was voiced prior to 2025 too.

But maybe THIS TIME it is the time to, well, finally "do it".... :)))
With 4th (!!!) gen of "hardwahr RT" cards around the corner.
 

Stuart360

Member
It's magic. :D
Yeah, well, maybe.

I did have this sort of conversation in the early days of glorified TAA upscaling. (nobody seriously pushed that with AI upscaling, the DLSS 1).

I was challenged into "guess which of these is native" in all seriousness.
For some magical reason, the pics with more blur and less detail appeared to be universally the ones with glorified TAA upscaling applied.

So, that's where my skepticism comes from, cough.

On top of anything hyped by PF more likely than not being scheisse.
Well on my screen DLSS doesn't blur, in fact it often looks sharper than native res. Even FSR can look sharper than TAA.
I mean a lot of people say this and I don't think it's some big conspiracy theory, or everyone is clouded by delusion or something.

For me FSR can be hit or miss, but DLSS looks great to me, outside of some of the early implementations like I said.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Fucking APP

Guess we have to thank AI sales for backwards compatibility; hopefully it runs the same or better on my 20 series.

The Nvidia App is actually pretty good.
If you don't like it, you could probably do all this with Nvidia Inspector, or manually inject using the DLSSTweak tool, or just copy/paste.
But I imagine for most people using the App just makes life easier.


Forcing things from an app/driver level, where have I heard that before and why do I remember some issues with it.... ;P

DLSDSR?

[image: DLDSR feature in Prey, as shown by Nvidia]


It worked well and I used it all the time when I used to have a 1440p panel, and it did indeed give me better image quality vs just DSR, even from higher factors.
 

llien

Member
I mean a lot of people say this and I don't think it's some big conspiracy theory, or everyone is clouded by delusion or something.
Under the hood stuff is rendered at lower resolution (definitely), TAAed (definitely), denoised using NN (most likely).
It is pretty natural to lose detail here, it is rather unnatural to improve pics.

Particularly bad will be small, quickly moving objects, e.g. raindrops. As with any other TAA derivative.

I think the shared pics that I've mentioned earlier were thought to be better by the original poster too. I had to zoom in a bit to see the diff and that's not what gamers normally do, cough.

Which hints at, in my opinion, a huge elephant in the room: people try to game at resolutions that are higher than their eyes can properly perceive.
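For what it's worth, the detail-loss mechanism being described can be sketched in a few lines: every TAA-derived upscaler blends each frame's jittered low-res sample into a persistent history buffer, so static detail converges over time while fast-moving detail has its history invalidated. A toy model, with illustrative names and blend weight, nothing Nvidia-specific:

```python
# Minimal sketch of the temporal accumulation at the heart of any
# TAA-derived upscaler: every frame, a jittered low-resolution sample
# is blended into a persistent history buffer.
import random

random.seed(0)  # deterministic for the example

def accumulate(history, sample, alpha=0.1):
    """Exponential moving average: new frames gradually refine the history."""
    return [(1 - alpha) * h + alpha * s for h, s in zip(history, sample)]

# For a static scene the history converges toward the true signal...
truth = [0.2, 0.5, 0.8]
history = [0.0, 0.0, 0.0]
for _ in range(200):
    noisy = [t + random.uniform(-0.05, 0.05) for t in truth]
    history = accumulate(history, noisy)

print(history)
# ...but fast-moving detail (the raindrops mentioned above) invalidates
# its history every frame, which is exactly where such upscalers blur or ghost.
```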
 

hyperbertha

Member
Guys, forget upscaling and frame gen for a minute, there are huge megabombs here

The bolded features were not even presented in the conference and are huge game changers for path tracing

Neural radiance cache path tracing / RTX Mega Geometry / Neural Texture Compression / RTX Neural faces



RTX Mega Geometry accelerates BVH building, making it possible to ray trace up to 100x more triangles than today’s standard.

RTX Neural Radiance Cache uses AI to learn multi-bounce indirect lighting to infer an infinite amount of bounces after the initial one to two bounces from path traced rays. This offers better path traced indirect lighting and performance versus path traced lighting without a radiance cache. NRC is now available through the RTX Global Illumination SDK, and will be available soon through RTX Remix and Portal with RTX.
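A toy numeric model of that "infer the remaining bounces" idea (purely illustrative, not NRC's actual learned cache): if light keeps a fixed fraction of its energy per bounce, the infinite-bounce answer is a geometric series, and a cache that knows the tail in closed form lets you stop tracing after one or two explicit bounces while still recovering the full answer.

```python
# Toy model: assume light keeps a fixed fraction of its energy
# (albedo) per bounce, so the infinite-bounce total is a geometric
# series that a "cache" can supply in closed form.

def lit_explicit(emitted, albedo, bounces):
    # energy gathered by explicitly tracing a fixed number of bounces
    return sum(emitted * albedo**k for k in range(bounces))

def lit_with_cache(emitted, albedo, explicit_bounces):
    # trace 1-2 bounces, then add the cached tail of all remaining bounces
    tail = emitted * albedo**explicit_bounces / (1 - albedo)
    return lit_explicit(emitted, albedo, explicit_bounces) + tail

exact = 1.0 / (1 - 0.6)                     # infinite-bounce reference
print(lit_explicit(1.0, 0.6, 2))            # two explicit bounces undershoot
print(lit_with_cache(1.0, 0.6, 2))          # bounces + cached tail match exact
```

The real NRC replaces that closed-form tail with a small neural network queried at the path's termination point, which is why it generalizes to arbitrary scenes.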



They've basically rebuilt EVERYTHING

The shader pipeline with neural shaders
The BVH structure with Mega geometry
The path tracing solution with the neural radiance cache
The upscaling technology by switching to transformer model


But is this also for the older cards?
 

Buggy Loop

Gold Member
But is this also for the older cards?

Yes!

Upscaler to transformer model: all RTX cards

NRC is going to RTX Remix even

Alan Wake 2 is getting an update for Mega Geometry and they mention it's for all RTX cards.

It works on all RTX because they are using tensor cores

Of course, the 5000 series is a neural MONSTER, it'll perform like no previous RTX card in that field. But path tracing performance should improve on all RTX cards.
 

Hugare

Member
I don't get folks who with every new DLSS version finally realize the last wasn't as flawless as they've been saying. It's SO awesome. No wait, NOW it's awesome. No wait, NOW it finally fixes the HUGE issue we didn't care about before (also omg @ the lame competition not having it fixed yet)! Etc. 🤷‍♂️
This is what happens when a company is competing only with itself and it's still pretty early tech

DLSS versions from ages ago are better than what the competition is offering today. And there's still tons of room for improvement.
 

Buggy Loop

Gold Member
Bro that's a lot of frames. :goog_relieved:

Might consider building a new machine sooner than expected lol.

Yea

My 3080 Ti was begging to be taken out behind the barn when I tried Full RT in that game; there was no combination of settings that made it feasible unless I was aiming for 20-30 fps (unacceptable).

Went with Lumen for that one.

5080 looks mighty tempting
 

kevboard

Member

I didn't say that



of course people claim it looks better than native, because it often does.
no one claims it's flawless, it's just better than non-AI-driven TAA

TAA is trash, DLSS is better TAA, therefore better than native if properly implemented.

Doom Eternal, Indiana Jones, Death Stranding and many more titles look objectively better with DLSS quality mode than running native, as their TAA exhibits less image detail, more ghosting and less edge stability.

being better than native TAA however doesn't mean it's flawless, it means it's better than a bad AA solution, and because that bad AA solution is now the standard and often unavoidable, DLSS is a welcome alternative that improves upon it
 

llien

Member
TAA is trash, DLSS is better TAA, therefore better than native if properly implemented.
DLSS is the game run at lower resolution, plus TAA (antialiasing), upscaling, and some NN denoising.
(The true AI one, the 1.0, was not using TAA)

"therefore better than native" my bottom.

Death Stranding and many more titles look objectively better with DLSS quality mode than running native
Oh, FFS...
DS was the "guess which side has DLSS enabled" test from back in those days.
Yeah, guess what, it was the less detailed, blurrier one.
 

kevboard

Member
DLSS is the game run at lower resolution, plus TAA (antialiasing) + upscaling and denoising.

"therefore better than native" my bottom.

have you ever used it or are you just talking out of your ass?

[two screenshots: native TAA vs DLSS quality mode]


these shots are taken while moving to the left.
notice the ghosting between the metal bars in the window being lessened with DLSS.
notice how the leaves in the top look cleaner with DLSS.
what you don't see in this still shot sadly is that native TAA had way more highlight shimmering around the metal bars as you move, while DLSS properly antialiased them.

differences like these are common when comparing DLSS quality mode to native TAA.
and this is "only" 1440p, with a sub 1080p native resolution. once you hit a 4K target it's even more pronounced
 

T-800

Neo Member
The Nvidia App is actually pretty good.
If you dont like it you could probably do all this with Nvidia Inspector or manually inject using DLSSTweak Tool or just copy/paste.
But i imagine for most people using the App just makes life easier.

Just sad it's not in the Control Panel too, in the convenient section where you could always change those kinds of settings.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Just sad it's not in the Control Panel too, in the convenient section where you could always change those kinds of settings.

Why wouldn't it be in the Control Panel?

The Control Panel has most if not all of the settings the Nvidia App has (sans filters and overlays). I'm pretty sure the Nvidia App driver settings simply change things in the Control Panel without you needing to actually open it, and if you want even more granularity you can use Nvidia Inspector, which has even more settings.
The App is just quick access.
 

xenosys

Member
Mark Cerny just made a presentation about getting a CNN running on the PS5 Pro; meanwhile, Nvidia has moved on to a transformer.

Nvidia are always 2-3 steps ahead of everyone else. I can see why AMD chickened out last night. They probably got wind of Nvidia's announcement(s) and decided to play reactive as they've always done in the GPU market and waited for prices before coming up with their own strategy.

Cerny's boasting about tech that Nvidia was working on 2 generations ago.
 

Mister Wolf

Member
Nvidia are always 2-3 steps ahead of everyone else. I can see why AMD chickened out last night. They probably got wind of Nvidia's announcement(s) and decided to play reactive as they've always done in the GPU market and waited for prices before coming up with their own strategy.

Cerny's boasting about tech that Nvidia was working on 2 generations ago.

AMD's GPU department is chicken shit for that move.
 

Wolzard

Member
Can someone clarify if RTX Neural Textures are supported in previous RTX GPUs too?

Probably, as this is a technique to be incorporated into Nvidia's SDK. AMD and Intel are also working on this.
Microsoft will incorporate it into DirectX, so it should even appear on the Xbox. I've seen some extensions being worked on for Vulkan as well.

 
Probably, as this is a technique to be incorporated into Nvidia's SDK. AMD and Intel are also working on this.
Microsoft will incorporate it into DirectX, so it should even appear on the Xbox. I've seen some extensions being worked on for Vulkan as well.

Interesting.

I'm curious to see if Nintendo (Switch 2) and Sony (PS6 most likely, not PS5 Pro) plan to follow suit...
 

T-800

Neo Member
Why wouldn't it be in the Control Panel?

The Control Panel has most if not all of the settings the Nvidia App has (sans filters and overlays). I'm pretty sure the Nvidia App driver settings simply change things in the Control Panel without you needing to actually open it, and if you want even more granularity you can use Nvidia Inspector, which has even more settings.
The App is just quick access.

Well hopefully it will be. I might have used Nvidia Inspector back in the day, when my 770 would not clock down properly with 2 monitors, but I'll see how it goes, probably just copy stuff or whatever.
 

Arsic

Loves his juicy stink trail scent
I’m curious how good this will be on a 3080.

A game like stalker 2 could heavily benefit from this.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Well hopefully it will be. I might have used Nvidia Inspector back in the day, when my 770 would not clock down properly with 2 monitors, but I'll see how it goes, probably just copy stuff or whatever.

I take it you haven't looked in the Nvidia App, Control Panel or Inspector in a long, long time?
The Nvidia App drivers section is literally.....literally shortcuts to the Control Panel settings you are most likely to use.

[screenshot: Nvidia App drivers section]

Note: The Nvidia App literally came out like two months ago... were you thinking of GeForce Experience?
 

Thabass

Member
I really wish the 30 series were going to get Frame Gen with DLSS 4, but with the 5080 being cheaper than I thought, I may have to take the plunge. I'll see how availability is when it launches, but I am more than okay with waiting for demand to die down.
 

T-800

Neo Member
I take it you haven't looked in the Nvidia App, Control Panel or Inspector in a long, long time?
The Nvidia App drivers section is literally.....literally shortcuts to the Control Panel settings you are most likely to use.

[screenshot: Nvidia App drivers section]

Note: The Nvidia App literally came out like two months ago... were you thinking of GeForce Experience?

Nah, the App, because I got the impression when it came out that it's meant to replace both of these in the long run.
 

PaintTinJr

Member
Why is it hard for some to get this? It's in the picture from their page: Multi Frame Gen is under DLSS 4. However, the only new thing they added to DLSS is MFG, and it's 5000 series exclusive. Did they enhance the old DLSS? Sure. But the real full DLSS 4 is for the 5000 series only.

[image: Nvidia DLSS 4 feature chart breakdown]
I think the part that isn't coming across to others in your observation is that without MFG or DLSS FG this isn't a free lunch on the older hardware.

AFAIK the TOPs hardware utilisation for earlier DLSS was consistent across all hardware, meaning a DLSS pass took a fixed timeslice + latency to upscale. With DLSS 4 SR and RR, that seems like it will no longer be the case. So in the past, a game running native at 33fps would merely have its image quality improved, plus the 1-2ms latency per frame of the DLSS upscale done on the Tensor cores.

Whereas the Cyberpunk graphic comparing DLSS 3.5 to DLSS 4, with a difference of 103 frames, indicates that the 50 series Tensor cores offer enough parallel TOPS to upscale those extra frames within the same time without adding latency. Conversely, the DLSS 4 algorithm may be too heavy for the 30 series hardware (and possibly the 40 series) without MFG, meaning its use will degrade frame rate. The older 30 series would then need a lower native resolution to give it more frame time for TOPS, which in turn will negatively impact the upscale quality (by starting with an inferior native image) to maintain performance, yielding a zero-sum game IMO.
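To put rough numbers on that fixed-cost argument: a constant upscale pass eats proportionally more of the frame budget the higher the base frame rate. The 1.5 ms figure below sits in the 1-2 ms range quoted above; everything else is illustrative.

```python
# Back-of-envelope: the frame-rate cost of a fixed-latency upscale
# pass grows with the base frame rate, because the pass is a larger
# fraction of an ever-shrinking frame budget.

def fps_with_upscale(native_fps, upscale_ms):
    frame_ms = 1000.0 / native_fps
    return 1000.0 / (frame_ms + upscale_ms)

for fps in (33, 60, 120, 240):
    print(f"{fps:>3} fps base -> {fps_with_upscale(fps, 1.5):.1f} fps "
          f"with a 1.5 ms upscale pass")
```

At 33 fps the pass costs about 5% of throughput; at 240 fps it costs over 25%, which is one way to read the claim that faster Tensor cores matter most at high frame counts.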
 

PeteBull

Member
Yea

My 3080 Ti was begging to be taken out behind the barn when I tried Full RT in that game; there was no combination of settings that made it feasible unless I was aiming for 20-30 fps (unacceptable).

Went with Lumen for that one.

5080 looks mighty tempting
As a fellow 3080 Ti owner: wait for independent reviews. After that, who knows, you might not even want to look at the 5080, since it's definitely below even the 4090 (not to mention its 16 GB of VRAM will be a bottleneck in 2-3 years at 4K). With the 5090 you at least get an over 100% performance jump in raster vs the 3080 Ti, on top of all the new features, and of course 32 GB of VRAM, so it's pretty much guaranteed never to be a bottleneck even at 4K max RT / path tracing in future games, even 5-6 years from now. Hell, the 5090 is stronger than the future PS6 :)
 