
[VideoCardz] Inno3D teases “Neural Rendering” and “Advanced DLSS” for GeForce RTX 50 GPUs at CES 2025

3liteDragon

Member
It sounds like DLSS 4 might bring neural rendering/upscaling to textures, but based on what I'm reading, it MIGHT not be exclusive to the 50 series? Anyone else looking into this, feel free to correct me. If this works on all RTX cards, that'll be huge. All DLSS features are a form of neural rendering but Jensen also talked about being able to generate textures/NPCs on the fly when he was asked about DLSS's future.
[Image: RTX-50-INNO3D-CES2025.png]
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Neural textures shouldn't be exclusive to RTX 50 if Nvidia really is introducing it.
It shouldn't even be Nvidia exclusive.

Neural Rendering is also a really broad statement, because DLSS Super Resolution and Frame Generation could be seen as Neural Rendering.

It sounds like what they are saying is their new GPUs are more powerful, so they can use those AI features faster (shocking, I know).
If that's the big advancement the RTX 50s are bringing, I'm going to be skipping the generation for sure.


If it's NRC (Neural Radiance Cache) they are introducing, then I'm slightly more interested, even if ReSTIR (Ray Reconstruction) has only been used by, what, 2 or 3 games since its introduction.
So NRC is likely gonna be even rarer.

 

Kenpachii

Member
Isn't the opposite more likely? I read something about this yesterday saying this technology uses lots of VRAM. But I've no clue.
Exciting stuff for sure! Looking forward to Jensen's presentation about this stuff.

Yeah, I read up on it and it does seem to consume more VRAM.
 

GHG

Member
Neural textures shouldn't be exclusive to RTX 50 if Nvidia really is introducing it.
It shouldn't even be Nvidia exclusive.

Neural Rendering is also a really broad statement, because DLSS Super Resolution and Frame Generation could be seen as Neural Rendering.

It shouldn't, but this is Nvidia we are talking about, so if they can find a way to make it proprietary then they will.
 

SABRE220

Member
Just when the others are pulling close, Nvidia keeps putting in the distance. Hopefully this time the others can catch up quicker, because you can be sure Nvidia just added a 20% premium on the new cards.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So devs can be even more lazy and rely on GPUs to enhance/render higher quality textures? Yeah, sure, Jan.

Reference textures are higher quality than what you receive in game.
They are also frikken massive, as in hundreds of MB, if not gigs, in size for some assets.

What this means is devs can compress those super-high-quality textures down and let the neural upscaler decompress and upscale the textures back to a good level, without said textures taking up a bunch of space.
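To put rough numbers on the savings: BC7 is a real GPU block format at 16 bytes per 4x4 block, but the "neural" bytes-per-texel figure below is a made-up assumption just to show the shape of the win, not an official NTC number.

```python
# Rough memory math for one 4096x4096 color texture.
SIZE = 4096

raw_rgba8 = SIZE * SIZE * 4        # uncompressed: 4 bytes per texel
bc7 = SIZE * SIZE * 1              # BC7: 16 bytes per 4x4 block = 1 byte/texel
neural = int(SIZE * SIZE * 0.25)   # hypothetical ~0.25 bytes/texel (assumed)

for name, b in [("raw RGBA8", raw_rgba8), ("BC7", bc7), ("neural (assumed)", neural)]:
    print(f"{name:18s} {b / 2**20:6.1f} MiB")
```

So even against block compression, a ratio like that is the difference between one texture costing 16 MiB and costing 4 MiB of VRAM.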
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Isn't the opposite more likely? I read something about this yesterday saying this technology uses lots of VRAM. But I've no clue.
Exciting stuff for sure! Looking forward to Jensen's presentation about this stuff.

If it really is NTC (Neural Texture Compression) they are introducing, then it will reduce VRAM usage for decent-quality textures.
It would basically be like DLSS Super Resolution, but for textures.
The overhead cost should still be lower than using the reference ground-truth textures.

[Image: small_ntc-example-1.jpg]
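A classical stand-in for the "Super Resolution, but for textures" idea: store a small texture and reconstruct a bigger one on read. The real thing uses a learned decoder rather than plain bilinear filtering; this NumPy sketch (all names mine) just shows the decompress-on-sample shape.

```python
import numpy as np

def bilinear_upscale(tex, factor):
    """Upscale a 2D texture by an integer factor with bilinear filtering."""
    h, w = tex.shape
    H, W = h * factor, w * factor
    # Sample centers of the big texture, mapped into source coordinates.
    ys = (np.arange(H) + 0.5) / factor - 0.5
    xs = (np.arange(W) + 0.5) / factor - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]   # vertical blend weights
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]   # horizontal blend weights
    top = tex[y0][:, x0] * (1 - wx) + tex[y0][:, x1] * wx
    bot = tex[y1][:, x0] * (1 - wx) + tex[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Ship a tiny 2x2 "compressed" texture, reconstruct 4x4 at sample time.
small = np.array([[0.0, 1.0], [2.0, 3.0]])
big = bilinear_upscale(small, 2)
```

A neural decoder replaces the fixed bilinear filter with a trained network, which is why it can recover detail that plain filtering cannot.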
 

Knightime_X

Member
If the Switch 2 uses this tech, and it looks as good if not better than the PS5, I will nut several gallons.
Not getting my hopes up, but I really DO hope Nintendo decides to go back to playing with power.
 

The Cockatrice

I'm retarded?
Reference textures are higher quality than what you receive in game.
They are also frikken massive, as in hundreds of MB, if not gigs, in size for some assets.

What this means is devs can compress those super-high-quality textures down and let the neural upscaler decompress and upscale the textures back to a good level, without said textures taking up a bunch of space.

That's a best-case scenario.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's a best-case scenario.

If you ever look at pretty much any game's art dump, you'll notice the original textures are always ridiculously high quality.
That's because during the authoring stage devs will usually use the highest quality settings they can, since it's easier to create textures when you have a large canvas, e.g. 4096x4096.

One of the "hard parts" of game development is making shit actually fit into memory, which is why we retopo geometry and compress textures down.
When they make the assets, whether it's polygons or textures, the OG is always well beyond what would be reasonable for a shipping game.

Compressing the textures, then having some sort of middleware that can upscale them, makes the dance of size/quality much easier on devs and hopefully much prettier for gamers.
In theory it's a win/win and not a crutch as you assumed, because we gamers want the textures as clear as possible, but we also want them to fit in memory/storage.
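The core trick is that the shipped asset stops being a grid of texels and becomes a small bundle of latents plus decoder weights, evaluated per sample. This is a toy sketch of that idea only — the architecture, sizes, and every name here are invented for illustration and have nothing to do with NVIDIA's actual NTC decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "compressed" asset: a tiny latent grid plus MLP weights.
latent = rng.standard_normal((16, 16, 4)).astype(np.float32)  # 16x16 cells, 4 features
W1 = rng.standard_normal((6, 32)).astype(np.float32) * 0.1    # (latent + uv) -> hidden
W2 = rng.standard_normal((32, 3)).astype(np.float32) * 0.1    # hidden -> RGB

def decode(u, v):
    """Decode one texel at normalized coords (u, v) in [0, 1)."""
    li, lj = int(v * 16), int(u * 16)                 # nearest latent cell
    x = np.concatenate([latent[li, lj], [u, v]]).astype(np.float32)
    h = np.maximum(x @ W1, 0.0)                       # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2)))            # sigmoid -> RGB in (0, 1)

# What actually ships vs. a raw 4096x4096 RGBA8 texture.
stored = latent.nbytes + W1.nbytes + W2.nbytes
raw = 4096 * 4096 * 4
print(f"stored: {stored} bytes, raw: {raw / 2**20:.0f} MiB")
```

In a real pipeline the weights would be trained to reproduce the reference texture, and the decode would run in-shader at sample time; here the weights are random, so the output is meaningless color, but the storage asymmetry is the point.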
 

00_Zer0

Member
No, Jensen. Not falling for the flashy shit. We need affordable GPUs, not buzzwords.
Remember the promise of Ray Tracing for all back when the RTX 20 series was released? From that day to this, which amounts to about 6 years, Ray Traced games are still few and far between. Nvidia has brute-forced this technology to where it is today, and yet the majority of the PC gaming public can barely take advantage of it.
 

GymWolf

Member
I told GHG that some shenanigans to improve textures was one of the possible future steps.

Finally they are doing something for us asset nutjobs, and not just fancy lights and shadows.
 

Kataploom

Gold Member
Do we even have a bottleneck for displaying textures? I mean, that's the smallest of the concerns about game rendering these days; unless they allow us to show real-time procedural textures which scale infinitely, I don't see a big benefit.

I'd rather have them use the power for real-time lighting calculations, or something else that's currently killing performance, instead.
 

00_Zer0

Member
Does this increase the need for more VRAM?
Nvidia has come up with a new feature called Neural VRAM, where they take 8GB and 12GB of VRAM and accelerate the amount through their patented AI process, only available on their 5xxx series cards. It will cost you for this type of exclusive technology though. The base 5060 model with 8GB will now cost $449.99, and the 12GB 5070 base model will cost you an economical $749.00. Everybody wins with these entry prices, but this is just a rumor I came up with from out of my ass, so take it with a grain of AI-upscaled salt.
 