
DLSS 4 (the new Super Resolution and Ray Reconstruction models) will be compatible with ALL RTX GPUs

//DEVIL//

Member
I wanted to try DLSS Swapper and Inspector with Black Ops 6 multiplayer, but I'm not sure it's a good idea.
 

TIGERCOOL

Member
I wanted to try DLSS Swapper and Inspector with Black Ops 6 multiplayer, but I'm not sure it's a good idea.
I've done it, no bans or anything. The transformer model causes some very infrequent artifacts (usually on distant background assets), but generally it works great. I've gained 20-50 fps running Performance mode over the old Quality mode, depending on the map, and it's cleaner in motion overall. Some hair transparencies also get a little fuzzy, but it's well worth it.
 

PaintTinJr

Member
DLSS 2 and 3 are the same outside of frame generation, with 3 simply having different presets for games to use.

On a 2060 Super (~ 100 TOPS) DLSS 3 has a cost of 0.61ms to upsample 1080p from 540p and 1.01ms to upsample 1440p from 720p. DLSS 4 raises this to 1.15ms and 2.02ms respectively. The Switch 2 should have about 25 TOPS when docked, but it does have sparsity which the 2060 Super doesn’t. There doesn’t appear to be a 1:1 relationship between TOPS and execution time.

Sourced from Nvidia documentation:



Agreed; based on the hardware they have, FG/MFG seems unlikely, and sticking with CNN DLSS would be best.


No fake frames in the video, frame generation is disabled, just upscaling.
But the document says those figures are for Performance mode, and going from such a low-resolution source offers little over FSR3.1 for Nintendo, IMO.

In fact, I could actually see a situation where Nintendo use the Nvidia NPU/Tensor cores but not DLSS, and instead use FSR4 in docked mode to be more technologically consistent - and independent of Nvidia - running FSR3.1 handheld and FSR4 in TV docked mode. However, if they are using DLSS2 or DLSS3, they'd arguably need to run Balanced or Quality mode to benefit enough over FSR3.1 to justify the tensor cores and the additional power draw. They also need the ML upscaling to take at least 2ms of each 16.6ms or 33ms frame, otherwise they need more TOPS that sit under-utilised for most of the frame, which doesn't suit a handheld. And secondly, by having DLSS or FSR4 take 2ms - or even more - they can scale down the tensor silicon area (as I suggested, 300 TOPS -> 150 TOPS) and the power usage simply by doubling the processing time-slice duration, which sounds very Nintendo IMO.
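To put rough numbers on that budget argument, here is a back-of-the-envelope sketch in Python. It naively assumes execution time scales inversely with TOPS, which the Nvidia figures quoted above already suggest is only an approximation, and it reuses the 2060 Super costs and the ~25 TOPS docked estimate from earlier in the thread; treat the output as illustration only.

```python
# Naive scaling sketch: assumes upscaling cost is inversely proportional to
# tensor throughput, which the quoted Nvidia figures suggest is only approximate.

REF_TOPS = 100.0          # ~2060 Super, per the documentation quoted above
REF_COST_1080P_MS = 1.15  # DLSS 4 (transformer), 540p -> 1080p, Performance mode
REF_COST_1440P_MS = 2.02  # DLSS 4 (transformer), 720p -> 1440p, Performance mode

def scaled_cost(ref_cost_ms: float, ref_tops: float, target_tops: float) -> float:
    """Estimate upscaling cost on a part with different tensor throughput."""
    return ref_cost_ms * ref_tops / target_tops

# 25 TOPS ~ the docked Switch 2 estimate above; 50 is an arbitrary comparison point.
for tops in (25.0, 50.0):
    print(f"{tops:>5.0f} TOPS: "
          f"1080p ~{scaled_cost(REF_COST_1080P_MS, REF_TOPS, tops):.1f} ms, "
          f"1440p ~{scaled_cost(REF_COST_1440P_MS, REF_TOPS, tops):.1f} ms "
          f"(frame budget: 16.6 ms @ 60fps, 33.3 ms @ 30fps)")
```

Under that crude assumption, a ~25 TOPS part would spend roughly 4.6 ms on 540p→1080p and about 8 ms on 720p→1440p with the transformer model: the former fits a 16.6 ms (60fps) frame but eats a sizeable chunk of it, while the latter really only fits a 33 ms (30fps) budget, which is consistent with the trade-off described above.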
 

yamaci17

Member
incredible how bad native TAA is in Spider-Man 2

it is insane how bad the so-called "IGTI" used on PS5 looks compared to DLSS

it is given more pixels at a lower framerate, yet ends up looking worse lol

this is why I don't care if I have to play this game at a lower internal resolution than the PS5 due to VRAM or other performance reasons. it just doesn't matter lol
 

Gaiff

SBI’s Resident Gaslighter
incredible how bad native TAA is in Spider-Man 2

it is insane how bad the so-called "IGTI" used on PS5 looks compared to DLSS

it is given more pixels at a lower framerate, yet ends up looking worse lol

this is why I don't care if I have to play this game at a lower internal resolution than the PS5 due to VRAM or other performance reasons. it just doesn't matter lol
IGTI looks a lot worse than DLSS, but man does this game look bad for a $315M production.
 

yamaci17

Member
IGTI looks a lot worse than DLSS, but man does this game look bad for a $315M production.
yeah, it looks pretty poor. it looks like the game was meant for ray-traced global illumination, but they couldn't make it perform well on the base PS5, so they rushed to bake the lighting and didn't have enough time to do it properly or something

the lighting looks so weird. I prefer the first game's graphics over this one... no idea how this happened. and of course it has useless ray tracing options on PC
 

Aaron07088

Neo Member
incredible how bad native TAA is in Spider-Man 2

it is insane how bad the so-called "IGTI" used on PS5 looks compared to DLSS

it is given more pixels at a lower framerate, yet ends up looking worse lol

this is why I don't care if I have to play this game at a lower internal resolution than the PS5 due to VRAM or other performance reasons. it just doesn't matter lol
is this the override mode or legacy DLSS?
 

PaintTinJr

Member
incredible how bad native TAA is in Spider-Man 2

it is insane how bad the so-called "IGTI" used on PS5 looks compared to DLSS

it is given more pixels at a lower framerate, yet ends up looking worse lol

this is why I don't care if I have to play this game at a lower internal resolution than the PS5 due to VRAM or other performance reasons. it just doesn't matter lol
But how does it look in motion? On console the motion looks great in all the modes, whereas that video of DLSS4 at 640x360 @ 90fps looked PS4-at-30fps janky, and that motion handling impacts perceptible image quality beyond still comparisons.

Stills sadly don't capture the jankiness and drop in vibrancy that poor motion adds, so though your conclusion might be right, that video tells me DLSS in some games might be two steps forward and three steps back, even when still-shot comparisons and frame-rate counters imply otherwise.
 

yamaci17

Member
I didn't like the override mode in this game. Preset K has ghosting and graphical bugs. What's your opinion on that?
I didn't play long enough but I didn't notice any major issues

then again I usually do not notice such issues so my experience is not a good indicator. all I care about is image clarity :(
 

PaintTinJr

Member
I didn't play long enough but I didn't notice any major issues

then again I usually do not notice such issues so my experience is not a good indicator. all I care about is image clarity :(
I can confirm that the IQ (clarity) and HDR in SM2 are spectacular on a (PS4-gen) HDR reference display (65" Sony KD-65ZD9) and have far more impact than the still comparisons suggest, going by the roughly 90 hours I played, IIRC.
 

kiphalfton

Member
I have an RTX 3080 and don't keep up with anything relating to DLSS. Which games would benefit from this? What features are different from before?
 

Aaron07088

Neo Member
I found that depth of field is buggy in this game. With DLSS enabled the background flickers, and without DoF the cutscenes look weird. It's a really bad port. Nixxes has a lot of work to do, but they're also working on TLOU Part II. I hope they don't screw up TLOU Part II.
 

Gaiff

SBI’s Resident Gaslighter
I found that depth of field is buggy in this game. With DLSS enabled the background flickers, and without DoF the cutscenes look weird. It's a really bad port. Nixxes has a lot of work to do, but they're also working on TLOU Part II. I hope they don't screw up TLOU Part II.
TLOU Part II is another incoming flop. People on PC don’t care about those story-driven cinematic-heavy games. That’s for the PlayStation crowd. PC wants Bloodborne, Demon’s Souls, and GT7, yet Sony decides these are the 3 games they’ll never port. Instead, they port stuff no one asked for like Sackboy.
 

Aaron07088

Neo Member
TLOU Part II is another incoming flop. People on PC don’t care about those story-driven cinematic-heavy games. That’s for the PlayStation crowd. PC wants Bloodborne, Demon’s Souls, and GT7, yet Sony decides these are the 3 games they’ll never port. Instead, they port stuff no one asked for like Sackboy.
Bro, I'm talking about the port quality, not the games' style or something. What if Sony brings out Bloodborne with memory leaks, broken graphics, or stutter problems? People are always shilling Nixxes ports, but Nixxes ports always have a problem. Even today the first Spider-Man Remastered has a texture problem. Go and play the first mission of Spider-Man Remastered: the police officer's badge never loads correctly. I revisited it before Spider-Man 2 arrived. It still has problems.
 

Gaiff

SBI’s Resident Gaslighter
Bro, I'm talking about the port quality, not the games' style or something. What if Sony brings out Bloodborne with memory leaks, broken graphics, or stutter problems? People are always shilling Nixxes ports, but Nixxes ports always have a problem. Even today the first Spider-Man Remastered has a texture problem. Go and play the first mission of Spider-Man Remastered: the police officer's badge never loads correctly. I revisited it before Spider-Man 2 arrived. It still has problems.
I've always thought Nixxes were overrated. They haven't done any outstanding ports, just a bunch of okay ones at best, but I see people like Battaglia praising them to high heaven. I'll also never forget Mankind Divided.
 

AFBT88

Member
Nixxes has never been a perfect studio; their ports always have some weird problems. They do some things amazingly well but mess up something else.
 


I gotta admit, some of the examples are extremely nitpicky. I wasn't aware of a few of those before Tim stopped moving and zoomed in. Still, DLSS4 ray reconstruction is a nice improvement. Extremely heavy on RTX 20 and 30 though.

Is that because of architectural differences, or due to the lower amount of TOPS?
 
Both I’m guessing.
[Image: chart from Nvidia's Blackwell GPU architecture whitepaper]


Source: https://images.nvidia.com/aem-dam/S...ell/nvidia-rtx-blackwell-gpu-architecture.pdf

TBH, I think people really, really underestimate nVidia's progress when it comes to AI performance.

Yes, I know most people are stuck in a raster-performance mindset (FP32 shaders), but this feels like 1999, dismissing T&L in favor of RIVA TNT/3Dfx Voodoo (old-school rasterizers), or dismissing programmable shaders back in 2001 in favor of T&L.

AI is the new paradigm shift in 3D graphics, and now I understand why Turing/Ampere cards feel a bit slow with DLSS4 RR. You can find TOPS numbers on nVidia's website.

nVidia still has lots of room for growth in AI (1-bit LLMs will probably appear in 2027 consumer GPUs):
 

Buggy Loop

Member
[Image: chart from Nvidia's Blackwell GPU architecture whitepaper]


Source: https://images.nvidia.com/aem-dam/S...ell/nvidia-rtx-blackwell-gpu-architecture.pdf

TBH, I think people really, really underestimate nVidia's progress when it comes to AI performance.

Yes, I know most people are stuck in a raster-performance mindset (FP32 shaders), but this feels like 1999, dismissing T&L in favor of RIVA TNT/3Dfx Voodoo (old-school rasterizers), or dismissing programmable shaders back in 2001 in favor of T&L.

AI is the new paradigm shift in 3D graphics, and now I understand why Turing/Ampere cards feel a bit slow with DLSS4 RR. You can find TOPS numbers on nVidia's website.

nVidia still has lots of room for growth in AI (1-bit LLMs will probably appear in 2027 consumer GPUs):

Yup, the play is neural for the future. I've said it many times on NeoGAF, and the architecture will eventually flex. Ada was the first heavily neural GPU, even though it wasn't really flexed until these new models.

Blackwell will surely shine later on too.

My post in the other thread also showcases the advantages of these AI monsters from Ada and above: the Neural Texture Compression SDK on GitHub shows MASSIVE disk size → PCI-E bandwidth → VRAM savings compared to older generations. Older RTX cards still see the benefit in disk size and PCI-E bandwidth, but not the texel-by-texel VRAM decompression with TOPS.
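As a toy illustration of where those VRAM savings would come from, here is a rough footprint calculation; the bits-per-channel rates, especially the neural one, are placeholder assumptions for the sake of the example, not figures from Nvidia's SDK.

```python
# Toy VRAM-footprint comparison for a single 4K material stack
# (e.g. albedo + normal + roughness/metal/AO ~= 9 channels of texture data).
# The bit rates below are placeholder assumptions for illustration,
# NOT measured numbers from Nvidia's Neural Texture Compression SDK.

TEXELS = 4096 * 4096
CHANNELS = 9

def footprint_mib(bits_per_channel: float) -> float:
    """Total size in MiB for all channels at a given per-channel bit rate."""
    return TEXELS * CHANNELS * bits_per_channel / 8 / (1024 ** 2)

uncompressed = footprint_mib(8.0)  # 8-bit channels, no compression
block_comp = footprint_mib(2.0)    # ~8 bits/texel for a 4-channel BC7 block -> ~2 bits/channel
neural = footprint_mib(0.5)        # assumed aggressive neural rate (placeholder)

print(f"uncompressed: {uncompressed:.0f} MiB")
print(f"block-compressed (BC7-like): {block_comp:.0f} MiB")
print(f"neural (assumed rate): {neural:.0f} MiB")
```

Even with made-up rates, the shape of the argument is visible: whatever ratio the neural codec actually achieves applies to disk, the PCI-E transfer, and the resident VRAM copy alike, whereas cards that only decompress to block formats in VRAM keep the larger footprint there.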

When you start to stack
  • Neural upscaler
  • Neural ray reconstruction
  • Neural texture compression
  • Neural radiance cache path tracing
  • Neural shaders
  • Neural Mega geometry
  • Neural skins
  • Neural hair
  • Neural....
That's not any time soon, of course, but then the architecture will make sense. Rasterization is a dead end. The whole pipeline will change.
 

3liteDragon

Member
Yup, the play is neural for the future. I've said it many times on NeoGAF, and the architecture will eventually flex. Ada was the first heavily neural GPU, even though it wasn't really flexed until these new models.

Blackwell will surely shine later on too.

My post in the other thread also showcases the advantages of these AI monsters from Ada and above: the Neural Texture Compression SDK on GitHub shows MASSIVE disk size → PCI-E bandwidth → VRAM savings compared to older generations. Older RTX cards still see the benefit in disk size and PCI-E bandwidth, but not the texel-by-texel VRAM decompression with TOPS.

When you start to stack
  • Neural upscaler
  • Neural ray reconstruction
  • Neural texture compression
  • Neural radiance cache path tracing
  • Neural shaders
  • Neural Mega geometry
  • Neural skins
  • Neural hair
  • Neural....
That's not any time soon, of course, but then the architecture will make sense. Rasterization is a dead end. The whole pipeline will change.
Blackwell will age well with future games once these features get used, and the AI uplift will certainly help.
 

Thebonehead

Gold Member
Go read Cerny's patent and listen to his recent technical video, or search the marketing explanations of how the Bravia X1 chips or newer XR processors work.

Nvidia are just using their own buzzword for what PSSR is already doing more intelligently. PSSR starting with a full-sized native image with holes to lower the pixel count is the best algorithm.

Training PSSR with games designed with it in mind will clear up these early issues. The only question is whether it will be on Pro games this gen or PS6 games next-gen.
[Reaction GIF: Steve-O, Hot Ones]
 

nikos

Member
I was using the new drivers and was getting IRQL_NOT_LESS_OR_EQUAL BSODs all over the place, so I'm rolling back to the previous driver.

I don't know if it was because of the edits I made in Profile Inspector; I followed the YT guide that tells you which things to change.



I enabled it globally and used DLSS Swapper to change it in a bunch of games. But I was playing FF7 Remake, which doesn't even use DLSS, when I was getting crashes while alt-tabbing or starting Steam. Really weird stuff.

Maybe I messed something up. I'm reading the comments there and they're saying not to even install the Nvidia App and to just use Profile Inspector, and that when forcing it globally you don't need to use DLSS Swapper.

I wish Nvidia would just make this easy, but it's just not there yet.


Wow, that's my video! GAF -> Internet -> GAF

Inspector was updated today and it's more straightforward. You also don't need to use DLSS Swapper at all, as people in the comments pointed out, unless a specific game is giving you an issue.

For anyone interested, I recorded a new video with the new process.

 

daninthemix

Member
Go read Cerny's patent and listen to his recent technical video, or search the marketing explanations of how the Bravia X1 chips or newer XR processors work.

Nvidia are just using their own buzzword for what PSSR is already doing more intelligently. PSSR starting with a full-sized native image with holes to lower the pixel count is the best algorithm.

Training PSSR with games designed with it in mind will clear up these early issues. The only question is whether it will be on Pro games this gen or PS6 games next-gen.
How do you get so many words out when your tongue is glued to Sony's bumhole?
 

DenchDeckard

Moderated wildly
Is there anything I need to do to benefit from all of this? I've just started Rebirth and turned off DLSS; I'm running TAAU at 4K ultra settings and dropping to around 90fps under extreme load. Other times I'm comfortably at 120fps.

Should I be using DLSS? I have G-Sync, so I can't even see the frame drops. The game looks good. Just wondering if the new DLSS would make it look even better.
 
Wow, that's my video! GAF -> Internet -> GAF

Inspector was updated today and it's more straightforward. You also don't need to use DLSS Swapper at all, as people in the comments pointed out, unless a specific game is giving you an issue.

For anyone interested, I recorded a new video with the new process.


Interesting - so DLSS swapper isn't required at all?

How does inspector force the latest DLSS file if we don't manually replace it?
 

MMaRsu

Member
Wow, that's my video! GAF -> Internet -> GAF

Inspector was updated today and it's more straightforward. You also don't need to use DLSS Swapper at all, as people in the comments pointed out, unless a specific game is giving you an issue.

For anyone interested, I recorded a new video with the new process.


So I don't need to use DLSS Swapper, but what about the games where I already swapped out the DLSS DLL?

So I should just install the latest drivers again (hopefully no BSODs this time) and then enable it globally?
 

Xcell Miguel

Gold Member
Interesting - so DLSS swapper isn't required at all?

How does inspector force the latest DLSS file if we don't manually replace it?
It's not Inspector doing it; it's the Nvidia driver itself. Since the recent versions, it can override the DLSS calls from games and force its own version.

You can already do it in the Nvidia App for each game, but not at a global level. Inspector can do that because, in the end, the driver settings are just values in the registry, and each global setting can be overridden per game. Nvidia simply decided not to expose those settings at a global level (yet?), maybe to avoid some compatibility issues.
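To illustrate the precedence described above (only as a conceptual sketch, not Nvidia's actual schema or API), a per-game profile value simply shadows the global/base value; the setting names and executables below are hypothetical.

```python
# Minimal sketch of profile-based setting resolution: a per-game profile value,
# if present, shadows the global (base) profile value. The setting names and
# data layout here are hypothetical illustrations, not Nvidia's actual schema.

base_profile = {
    "DLSS_OVERRIDE_MODEL_PRESET": "latest",   # hypothetical global override
}

game_profiles = {
    "ff7remake.exe": {},                       # no per-game override -> inherits base
    "cod_bo6.exe": {
        "DLSS_OVERRIDE_MODEL_PRESET": "off",   # hypothetical per-game opt-out
    },
}

def resolve_setting(exe: str, key: str):
    """Per-game value wins; otherwise fall back to the base (global) profile."""
    per_game = game_profiles.get(exe, {})
    return per_game.get(key, base_profile.get(key))

print(resolve_setting("ff7remake.exe", "DLSS_OVERRIDE_MODEL_PRESET"))  # -> "latest"
print(resolve_setting("cod_bo6.exe", "DLSS_OVERRIDE_MODEL_PRESET"))    # -> "off"
```

The practical consequence is the one nikos was told in the comments: once the override is set at the base/global level, per-game DLL swapping becomes redundant unless a specific title needs its own exception.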
 

Mithos

Member
Nvidia simply decided not to expose those settings at a global level (yet?), maybe to avoid some compatibility issues.
I'd guess it's because some anti-cheats could be triggered and report shenanigans behind the scenes if the override isn't "whitelisted" by Nvidia and the game dev/publisher.
Not all anti-cheats get triggered, or trigger instantly; it could be days or weeks before "boom," you're BANNED.
 

Captn

Member
You can still see some ghosting, like when birds are flying across buildings, but there's also a clear uptick in visible detail. And it goes beyond just being sharper; there's genuinely extra information in the frame.

I'm not using DLSS upscaling at all since it creates ghosting no matter the preset, so I turned off all upscaling options in-game and am just playing with in-game DLAA and FG on, but using DLSS Tweaks with preset K, which gives me the stats below. It somehow upscales from 2232x1264 to 4K? I don't know how that works, but all I know is the game looks amazing and runs like a dream.
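For what it's worth, 2232x1264 works out to roughly 58% of 3840x2160 on each axis, which matches the standard DLSS Balanced scale factor, so a ratio override from DLSS Tweaks is a plausible explanation; a quick check (the resolutions are the ones reported in the post above):

```python
# Quick check of the per-axis scale factor implied by the reported resolutions.
internal = (2232, 1264)   # internal render resolution from the overlay above
output = (3840, 2160)     # 4K output

for axis, (i, o) in zip(("width", "height"), zip(internal, output)):
    print(f"{axis}: {i}/{o} = {i / o:.3f}")

# Reference per-axis factors for the standard DLSS modes.
for name, factor in [("Quality", 0.667), ("Balanced", 0.58),
                     ("Performance", 0.50), ("Ultra Performance", 0.333)]:
    print(f"{name}: {factor:.3f}")
```

Both axes land at roughly 0.58, i.e. the Balanced ratio rather than DLAA's 1.0, which suggests something in the chain is overriding the scale factor.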

Using maxed out settings and RT on Ultimate. Locked at 117 for my C1.

4090/7950x/32GB DDR5 6000
Windows 10
Nvidia 566.36

My GPU is also undervolted, hence the low wattage, and it usually runs at 99% load depending on the scene; in this one, for instance, it's at 75%.

[Screenshot: in-game stats overlay]
 

nikos

Member
Interesting - so DLSS swapper isn't required at all?

How does inspector force the latest DLSS file if we don't manually replace it?
So I don't need to use DLSS Swapper, but what about the games where I already swapped out the DLSS DLL?

So I should just install the latest drivers again (hopefully no BSODs this time) and then enable it globally?

You don't need to use DLSS Swapper. By doing this, you're forcing games to use the latest DLSS version at the driver level. I'd only use DLSS Swapper (or manually replace the DLSS files) for games where it doesn't work for whatever reason.
 

keefged4

Member
Is there anything I need to do to benefit from all of this? I've just started Rebirth and turned off DLSS; I'm running TAAU at 4K ultra settings and dropping to around 90fps under extreme load. Other times I'm comfortably at 120fps.

Should I be using DLSS? I have G-Sync, so I can't even see the frame drops. The game looks good. Just wondering if the new DLSS would make it look even better.
Yes. Even at 50% rendering resolution (Performance), DLSS4 with preset K destroys TAAU at 4K. Try it; you won't go back.
 
Well, what does this mean for someone with a 3060 like me? Does DLSS Quality now provide more FPS? Has DLSS Performance improved visually to the point where it's on par with Quality?
 

Gaiff

SBI’s Resident Gaslighter
Well, what does this mean for someone with a 3060 like me? Does DLSS Quality now provide more FPS? Has DLSS Performance improved visually to the point where it's on par with Quality?
Slightly lower performance at the same preset, but significantly better IQ. Apparently, the new Performance mode often rivals or even exceeds the old Quality mode's IQ.
 