> I wanted to try DLSS Swapper and Inspector with Black Ops 6 multiplayer, but I am not sure it's a good idea.

Have done it, no bans or anything. The transformer model causes some very infrequent artifacts (usually on distant background assets), but generally works great. It has added 20-50 fps running in performance mode over the old quality mode, depending on the map, and it's cleaner in motion overall. Some hair transparencies also get a little fuzzy, but it's well worth it.
> DLSS 2 and 3 are the same outside of frame generation, with 3 simply having different presets for games to use.

But the documentation says those costs are for performance mode, and upscaling from such a low-resolution source offers little over FSR 3.1 for Nintendo, IMO.
On a 2060 Super (~ 100 TOPS) DLSS 3 has a cost of 0.61ms to upsample 1080p from 540p and 1.01ms to upsample 1440p from 720p. DLSS 4 raises this to 1.15ms and 2.02ms respectively. The Switch 2 should have about 25 TOPS when docked, but it does have sparsity which the 2060 Super doesn’t. There doesn’t appear to be a 1:1 relationship between TOPS and execution time.
Sourced from Nvidia documentation:
DLSS/doc/DLSS_Programming_Guide_Release.pdf at main · NVIDIA/DLSS
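For a rough sense of what those figures could imply for the Switch 2, here is a back-of-the-envelope sketch. It naively assumes execution time scales inversely with TOPS, which, as noted above, is not actually the case (architecture, sparsity, clocks and bandwidth all matter), so treat the output as a crude guess rather than a prediction.

```python
# Naive scaling of Nvidia's published DLSS costs on a 2060 Super (~100 TOPS)
# to a hypothetical ~25 TOPS docked Switch 2. Purely illustrative: real cost
# does not scale 1:1 with TOPS (sparsity, clocks and bandwidth all differ).

DLSS3_COST_MS = {"540p -> 1080p": 0.61, "720p -> 1440p": 1.01}  # CNN model
DLSS4_COST_MS = {"540p -> 1080p": 1.15, "720p -> 1440p": 2.02}  # transformer model

REFERENCE_TOPS = 100.0  # 2060 Super, roughly
TARGET_TOPS = 25.0      # rumoured docked Switch 2, an assumption

scale = REFERENCE_TOPS / TARGET_TOPS

for name, table in (("DLSS 3 (CNN)", DLSS3_COST_MS),
                    ("DLSS 4 (transformer)", DLSS4_COST_MS)):
    for mode, cost in table.items():
        print(f"{name:20s} {mode}: ~{cost * scale:.1f} ms if cost scaled inversely with TOPS")
```

Even this crude math puts the transformer model at several milliseconds of a 16.6 ms (60 fps) frame budget on that class of hardware, which is why the CNN model looks like the safer fit.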
Agreed; based on the hardware they have, FG/MFG seems unlikely, and sticking to CNN DLSS would be best.
No fake frames in the video, frame generation is disabled, just upscaling.
> incredible how bad native TAA is in Spider-Man 2
> it is insane how bad the so-called "IGTI" used on PS5 looks compared to DLSS
> it is being given more pixels with lower performance yet ends up looking worse lol
> this is why I don't care if I have to play this game at a lower internal resolution than PS5 due to VRAM or other performance reasons. it just doesn't matter lol

IGTI looks a lot worse than DLSS, but man does this game look bad for a $315M production.
> IGTI looks a lot worse than DLSS, but man does this game look bad for a $315M production.

Yeah, it looks pretty poor. It looks like the game was meant for ray-traced global illumination, but they couldn't make it perform well on the base PS5, so they fell back to baked lighting and didn't have enough time to do it properly, or something.
It's hard to believe that people around here were praising the graphics. It looks shockingly bad.
You should open a topic about it.
> incredible how bad native TAA is in Spider-Man 2

Is this override mode or legacy DLSS?
> Is this override mode or legacy DLSS?

Override mode, of course.
> Override mode, of course.

I didn't like override mode in this game. Preset K has ghosting and graphical bugs. What's your opinion on that?
> incredible how bad native TAA is in Spider-Man 2

But how does it look in motion? On console, the motion looks great in all the modes, whereas that video of DLSS 4 at 640x360 @ 90 fps looked as janky as PS4 @ 30 fps, and that motion handling affects perceptible image quality beyond still comparisons.
> I didn't like override mode in this game. Preset K has ghosting and graphical bugs. What's your opinion on that?

I didn't play long enough, but I didn't notice any major issues.
> I didn't play long enough, but I didn't notice any major issues.
> Then again, I usually do not notice such issues, so my experience is not a good indicator. All I care about is image clarity.

I can confirm that the IQ (clarity) and HDR in SM2 are spectacular on a (PS4-gen) HDR reference display (65" Sony KD-65ZD9), and they have far more impact than still comparisons suggest, as was my experience over the ~90 hours I played, IIRC.
> IGTI looks a lot worse than DLSS, but man does this game look bad for a $315M production.

Looks like a PS3 game.
> I didn't play long enough, but I didn't notice any major issues.
> Then again, I usually do not notice such issues, so my experience is not a good indicator. All I care about is image clarity.

I found this video where he shows override vs. CNN.
> I found that depth of field is buggy in this game. With DLSS enabled, the background flickers, and without DOF, cutscenes look weird. It's a really bad port. Nixxes has a lot of work to do, and they are also working on TLOU Part II; I hope they don't screw that one up.

TLOU Part II is another incoming flop. People on PC don't care about those story-driven, cinematic-heavy games. That's for the PlayStation crowd. PC wants Bloodborne, Demon's Souls, and GT7, yet Sony decides these are the three games they'll never port. Instead, they port stuff no one asked for, like Sackboy.
> TLOU Part II is another incoming flop.

Bro, I'm talking about port quality, not game style or anything. What if Sony brings out Bloodborne with memory leaks, broken graphics, or stutter problems? People are always shilling for Nixxes ports, but Nixxes ports always have problems. Even today, the first Spider-Man Remastered has a texture problem: go play the first mission and the police officer's badge never loads correctly. I revisited it before Spider-Man 2 arrived, and it still has problems.
> Bro, I'm talking about port quality, not game style or anything.

I've always thought Nixxes were overrated. They haven't done any outstanding ports, just a bunch of okay ones at best, yet I see people like Battaglia praising them to high heaven. I'll also never forget Mankind Divided.
> I wanted to try DLSS Swapper and Inspector with Black Ops 6 multiplayer, but I am not sure it's a good idea.

In some games, modifying the DLSS file breaks DLSS, e.g. Space Marine 2.
I gotta admit, some of the examples are extremely nitpicky. I wasn't aware of a few of those before Tim stopped moving and zoomed in. Still, DLSS4 ray reconstruction is a nice improvement. Extremely heavy on RTX 20 and 30 though.
> Is that because of architectural differences, or due to a lower amount of TOPS?

Both, I'm guessing.

Source: https://images.nvidia.com/aem-dam/S...ell/nvidia-rtx-blackwell-gpu-architecture.pdf
TBH, I think people really, really underestimate nVidia's progress when it comes to AI performance.
Yes, I know most people are stuck in a raster-performance mindset (FP32 shaders), but this feels like dismissing T&L back in 1999 in favor of the RIVA TNT/3dfx Voodoo (old-school rasterizers), or dismissing programmable shaders back in 2001 in favor of T&L.
AI is the new paradigm shift in 3D graphics, and now I understand why Turing/Ampere cards feel a bit slow in DLSS 4 RR. You can find the TOPS numbers on nVidia's website.
nVidia still has lots of room for growth in AI (1-bit LLMs will probably appear in 2027 consumer GPUs):
The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits
Recent research, such as BitNet, is paving the way for a new era of 1-bit Large Language Models (LLMs). In this work, we introduce a 1-bit LLM variant, namely BitNet b1.58, in which every single parameter (or weight) of the LLM is ternary {-1, 0, 1}. (arxiv.org)
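For anyone wondering what "ternary weights" actually means, here is a minimal sketch of the kind of quantization BitNet b1.58 describes: scale a weight matrix by its mean absolute value, then round every weight to -1, 0 or 1. This is a simplified illustration based on my reading of the abstract, not the paper's reference implementation.

```python
import numpy as np

def ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    """Quantize weights to {-1, 0, 1} with a per-tensor absmean scale,
    loosely following the BitNet b1.58 description. Illustrative only."""
    scale = np.mean(np.abs(w)) + eps           # per-tensor scale
    w_q = np.clip(np.round(w / scale), -1, 1)  # every weight becomes -1, 0 or 1
    return w_q.astype(np.int8), scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
w_q, scale = ternary_quantize(w)

print(w_q)                                               # ternary weight matrix
print("max dequantization error:", np.abs(w - w_q * scale).max())
```

The appeal for GPU hardware is that multiplies by -1, 0 or 1 collapse into adds, subtracts and skips, which is where the projected efficiency gains for future tensor cores come from.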
> Yup, the play is neural for the future. I have said it many times on NeoGAF: the architecture will eventually flex. Ada was the first heavily neural GPU, even though it wasn't really flexed until these new models. Blackwell will surely shine later on too.
> My post in the other thread also showcases the advantages of these AI monsters from Ada and above: the Neural Texture Compression SDK on GitHub has massive disc size → PCIe bandwidth → VRAM savings compared to older generations. Older RTX cards still see the benefit of disc size and PCIe bandwidth, but not the texel-by-texel VRAM decompression done with TOPS.
> When you start to stack:
> - Neural upscaler
> - Neural ray reconstruction
> - Neural texture compression
> - Neural radiance cache path tracing
> - Neural shaders
> - Neural mega geometry
> - Neural skins
> - Neural hair
> - Neural...
> ...which is not any time soon of course, but then the architecture will make sense. Rasterization is a dead end. The whole pipeline will change.

Blackwell will age well with future games when these features get used, and the AI uplift will certainly help.
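To make the Neural Texture Compression point in the quote above concrete, here is a tiny, purely illustrative size calculation. The raw and BC7 numbers follow from the formats themselves (32 and 8 bits per texel respectively); the neural bitrate is a made-up placeholder chosen only to show the shape of the claimed disc size → PCIe bandwidth → VRAM savings, not a figure from Nvidia's SDK.

```python
# Illustrative size math for a single 4096x4096 RGBA8 texture.
# Raw and BC7 sizes are exact for those formats; the "neural" bitrate is a
# hypothetical placeholder, NOT a measured figure from the NTC SDK.

TEXELS = 4096 * 4096

bits_per_texel = {
    "raw RGBA8":             32,    # uncompressed
    "BC7":                    8,    # block compression, fixed 1 byte/texel
    "neural (hypothetical)":  1.0,  # assumption for illustration only
}

def mib(bits: float) -> float:
    return bits / 8 / (1024 ** 2)

for name, bpt in bits_per_texel.items():
    print(f"{name:22s}: {mib(TEXELS * bpt):6.1f} MiB")
```

Whatever the real ratios turn out to be, the chain in the quoted post is the point: a smaller on-disc representation also means less data over PCIe, and, on GPUs that can decode it texel by texel on the fly, less resident in VRAM.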
Go read Cerny's patent and listen to his recent technical video, or search the marketing explanations of how the Bravia X1 chips or newer XR processors work.
Nvidia are just using their own buzzword for what PSSR is already doing more intelligently. PSSR's approach of starting with a full-sized native image with holes to lower the pixel count is the best algorithm.
Training PSSR with games designed with it in mind will clear up these early issues. The only question is whether it will be on Pro games this gen or PS6 games next-gen.
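Purely as an illustration of what this poster is describing (and not Sony's actual implementation, which isn't public): a conventional upscaler renders a uniformly smaller image and stretches it, while the described approach keeps a full-resolution target and leaves a subset of pixels unrendered as holes to be filled in afterwards. The rendered sample counts can be essentially identical either way.

```python
import numpy as np

OUT_H, OUT_W = 2160, 3840  # native output resolution
SCALE = 0.5                # fraction of output pixels actually rendered

# Approach A: render a uniformly smaller image, then upscale to native.
low_h = int(OUT_H * SCALE ** 0.5)
low_w = int(OUT_W * SCALE ** 0.5)
samples_a = low_h * low_w

# Approach B (as described above): a native-sized buffer where only a subset
# of pixel locations is rendered, leaving "holes" for the network to fill.
mask = np.zeros((OUT_H, OUT_W), dtype=bool)
mask[::2, :] = True        # e.g. render every other row; the pattern is illustrative
samples_b = int(mask.sum())

print(samples_a, samples_b)  # roughly the same number of rendered pixels
```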
I was using the new drivers and was getting IRQL NOT LESS OR EQUAL BSODs all over the place, so I'm rolling back to the previous driver.
I don't know if it was because of the edits I made in Profile Inspector; I followed the YT guide where they tell you which things to change.
I enabled it globally and used DLSS Swapper to change it in a bunch of games, but I was playing FF7 Remake, which doesn't even use DLSS, when I got crashes on alt-tabbing or when starting Steam. Really weird stuff.
Maybe I messed something up. I'm reading the comments there, and they say not to even install the Nvidia App and to just use Profile Inspector, and that when forcing it globally you don't need to use DLSS Swapper.
I wish Nvidia would just make this easy, but it's just not there yet.
> Go read Cerny's patent and listen to his recent technical video, or search the marketing explanations of how the Bravia X1 chips or newer XR processors work.

How do you get so many words out when your tongue is glued to Sony's bumhole?
> I found this video where he shows override vs. CNN.

Wow, that's my video! GAF -> Internet -> GAF
Inspector was updated today and it's more straightforward. You also don't need to use DLSS Swapper at all, as people in the comments pointed out, unless a specific game is giving you an issue.
For anyone interested, I recorded a new video with the new process.
> Interesting, so DLSS Swapper isn't required at all?
> How does Inspector force the latest DLSS file if we don't manually replace it?

It's not Inspector doing it; it's the Nvidia driver itself since the recent versions. It can override the DLSS calls from games and force its own version. Nvidia simply decided not to expose those settings at a global level (yet?), maybe to avoid some compatibility issues.
> Nvidia simply decided not to expose those settings at a global level (yet?), maybe to avoid some compatibility issues.

I'd guess it's because some anti-cheats could be triggered and report shenanigans behind the scenes if they're not "whitelisted" by Nvidia and the game dev/publisher.
You can still see some ghosting, like when birds are flying across buildings, but there's also a clear uptick in visible detail. And it goes beyond just being sharper; there's genuinely extra information in the frame.
So I don't need to use DLSS Swapper, but what about the games where I already swapped out the DLSS file?
So should I install the latest drivers (again), hopefully with no BSODs this time, and then enable it globally?
> Is there anything I need to do to benefit from all of this? I've just started Rebirth and turned off DLSS; I'm running TAAU at 4K ultra settings and dropping to around 90 fps under extreme load. Other times I'm comfortably at 120 fps.
> Should I be using DLSS? I have G-Sync, so I can't even see the frame drops. The game looks good. Just wondering if the new DLSS would make it look even better.

Yes. Even at 50% rendering resolution (Performance), DLSS 4 with preset K destroys TAAU at 4K. Try it, you won't go back.
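For context on what "50% rendering resolution" means at a 4K output, here is a quick sketch using the commonly cited per-axis scale factors for the DLSS quality modes (these are the usual defaults; individual games can override them):

```python
# Commonly cited per-axis DLSS scale factors (games can override these).
DLSS_MODES = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

OUT_W, OUT_H = 3840, 2160  # 4K output

for mode, s in DLSS_MODES.items():
    w, h = round(OUT_W * s), round(OUT_H * s)
    print(f"{mode:18s}: internal {w}x{h} -> output {OUT_W}x{OUT_H}")
```

So "Performance" at 4K is still reconstructing from a full 1920x1080 internal image, which is part of why it holds up far better than the raw percentage suggests.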
> Thanks. Is there a guide to activate this anywhere, please?

You can change it globally using the latest version of the Nvidia Profile Inspector. Follow the video nikos posted above.
> Well, what does this mean for someone with a 3060 like me? Does DLSS Quality now provide more FPS? Has DLSS Performance improved visually to the point where it's on par with Quality?

Slightly lower performance at the same preset, but significantly better IQ. Apparently, the new Performance mode often rivals or even exceeds the old Quality mode's IQ.
> Slightly lower performance at the same preset, but significantly better IQ.

A bit of an abstract question, but what kind of DLSS preset would you compare PSSR to, specifically in a game like FF7 Rebirth?