
DLSS 4 (the new Super Resolution and Ray Reconstruction models) will be compatible with ALL RTX GPUs

Dude, I'm not surprised by your disingenuous behaviour. I remember your comments, and you did the same thing in our argument about PS2 vs Xbox, when you tried to present facts in such a way as to make the inferior console look superior. You wanted to tell people that the PS2 hardware could render the same effects in software and outperform the more powerful console in polycounts, even though it was RAM-limited, and you couldn't even grasp that you need memory just to be able to use higher polycounts. Even developers who made games on all platforms said that the Xbox had the upper hand when it came to hardware and polycounts, whereas the PS2 had the upper hand in other areas (fillrate), but you still thought Sony's console was superior. I had to link internal reports done by Sony to show people the real polycounts of PS2 games, and the numbers were nowhere near what you claimed; that finally shut you up.
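
(A quick back-of-the-envelope sketch of why RAM budgets cap polycounts; the vertex layout and the 5M-triangle figure below are illustrative assumptions, not measured PS2/Xbox numbers.)

# Back-of-the-envelope: geometry RAM needed for a given polygon budget.
# The vertex layout is an assumption (position + normal + UV, no
# compression, no vertex sharing) -- purely illustrative.

BYTES_PER_VERTEX = 12 + 12 + 8   # 3 floats position, 3 floats normal, 2 floats UV
VERTS_PER_TRIANGLE = 3           # worst case: no sharing via strips/indices

def geometry_ram_mb(triangles: int) -> float:
    """Approximate mesh memory in MB for a given unique-triangle count."""
    return triangles * VERTS_PER_TRIANGLE * BYTES_PER_VERTEX / (1024 ** 2)

# A hypothetical 5M-triangle scene would need ~458 MB unshared -- far
# beyond the PS2's 32 MB or the Xbox's 64 MB, which is why real budgets
# depended on sharing, streaming, and far smaller resident meshes.
print(f"{geometry_ram_mb(5_000_000):.0f} MB")
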
The PS2's exotic philosophy is the total antithesis of nVidia's.

Brute force (the GS's insane 2560-bit eDRAM bus @ 48 GB/s) vs efficiency (hardwired circuitry, compression, etc.).
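
For what it's worth, that 48 GB/s figure falls straight out of the bus width and clock; a quick sanity check (the ~150 MHz clock is the commonly cited round number; the GS actually runs at 147.456 MHz):

# Sanity-check the quoted GS eDRAM bandwidth: bus width x clock.
BUS_BITS = 2560          # combined read/write/texture eDRAM bus width
CLOCK_HZ = 150_000_000   # ~150 MHz Graphics Synthesizer clock (marketing round number)

bytes_per_cycle = BUS_BITS // 8                     # 320 bytes moved per clock
bandwidth_gb_s = bytes_per_cycle * CLOCK_HZ / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s")                 # -> 48 GB/s, matching the figure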

PS2 was capable of emulating fur shaders (SoTC), but the framerate was abysmal:


I'm not sure if the Xbox's GeForce 3-class GPU was capable of rendering fur shaders, but I vividly remember this GeForce 4 demo:



nVidia continues the efficiency philosophy with DLSS... brute force/rasterization is no more (even AMD will stop chasing that train with FSR4 aka rebranded PSSR). ;)

I'm curious to see if the PS6 will have something like DLSS 4 + MFG, because I don't expect a huge bump in CPU/GPU specs (it could be a sizeable jump if they adopt chiplets instead of a monolithic APU with low yields; we'll see).
 

Developers on the PS2 could render great-looking effects in software (let's say comparable to the Xbox), but with a performance penalty, like your fur rendering example. The Xbox could render these effects more easily thanks to dedicated pixel and vertex shaders, and, most importantly, developers had enough RAM budget to use these shader effects on a much bigger scale.

As for DLSS, it was a very clever idea, and without it I don't think Nvidia would have pushed RT as soon. The RTX 2080 Ti could run RT games well, but only with DLSS. Even RTX 50 series cards will require DLSS if people want a high refresh rate experience in the most demanding RT/PT games. Personally, I see no problem with this, given how well the technology works. I wasn't sure I would like Nvidia's FG, but it also works very well (at least 2x FG; 3x or 4x is a bit too extreme).
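
Rough math on what those FG multipliers mean in practice (a simple sketch; the 60 fps base render rate is an illustrative assumption):

# Illustrative frame-generation math: the displayed rate scales with the
# multiplier, but input responsiveness still tracks the *rendered* rate.

def fg_output_fps(rendered_fps: float, multiplier: int) -> float:
    """Displayed frames per second with an N-x frame generation mode."""
    return rendered_fps * multiplier

rendered = 60.0  # assumed base render rate in a heavy RT/PT game
for mult in (2, 3, 4):
    shown = fg_output_fps(rendered, mult)
    print(f"{mult}x FG: {rendered:.0f} fps rendered -> {shown:.0f} fps shown")
# 2x FG: 60 rendered -> 120 shown, but the game still *feels* like ~60 fps,
# which is why the higher multipliers can come across as too extreme.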

Mark Cerny is going to give developers what they want. I'm sure the PS6 hardware will be built around AI (PSSR, ML-powered FG, ML ray reconstruction).
 

FireFly

Member
Okay, fine. I'm happy to draw a line under you starting our discussion in this thread like below, and under you thumbing up Corporal.Hicks' childish verbal insults, if you actually want to discuss.
Well, apart from using jargon, what I said only applies to you if you were actually arguing in bad faith.

Regardless of the render output resolution and the ML/AI algorithm, my view is that no normal view of the windmill in those screenshots, with that scene camera, should have the contrast or detail shown
That's fine, but the purpose of DLSS is not to make an image more natural or "real". It is to get closer to the 16xSSAA "ground truth" image, which reflects how the developers intended the scene to look. So in this case the relevant question is whether the windmill has that level of detail in the SSAA anti-aliased image, and I suspect the answer is yes, based on the other distant scene elements.
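
To make the "ground truth" point concrete: a DLSS-style upscaler is trained to regress toward a supersampled reference frame. Here's a minimal sketch of that objective, assuming a toy convolutional model and random placeholder data (the real DLSS network, its extra inputs like motion vectors and frame history, and its loss are not public):

import torch
import torch.nn as nn

# Toy stand-in for a super-resolution network; illustrates only the
# "regress toward a 16xSSAA reference" training objective.
upscaler = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3 * 4, 3, padding=1),
    nn.PixelShuffle(2),                     # 2x spatial upscale
)
optimizer = torch.optim.Adam(upscaler.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

low_res = torch.rand(4, 3, 64, 64)          # placeholder low-res frames
reference = torch.rand(4, 3, 128, 128)      # placeholder 16xSSAA renders

pred = upscaler(low_res)
loss = loss_fn(pred, reference)             # distance from the "ground truth"
loss.backward()
optimizer.step()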
 

PaintTinJr

Member
Well, apart from using jargon, what I said only applies to you if you were actually arguing in bad faith.
Just own it and move on; don't pull the bad-faith accusation you yourself wrongly floated over a legitimate point of analysis.
That's fine, but the purpose of DLSS is not to make an image more natural or "real". It is to get closer to the 16xSSAA "ground truth" image, which reflects how the developers intended the scene to look. So in this case the relevant question is whether the windmill has that level of detail in the SSAA anti-aliased image, and I suspect the answer is yes, based on the other distant scene elements.
Okay, that's fine, but DLSS is now copying the PSSR/Sony Bravia XR algorithm/patent with its more intelligent handling of in-scene data via its new buzzword, 'Transformers'. So by its own standards, or XeSS's or FSR 3.1's and earlier, yeah, it's doing great... but that's old hat, much like its efforts with fur in R&C.

On DLSS 4: it should be much closer to PSSR by attempting to "shallow fake" the fur rather than just reproducing standard heavy-duty real-time rendering. So, taking an ML/AI-agnostic view, I'm still uninterested in any of those efforts on the windmill, pixel peeping or not.
 

ArtHands

Thinks buying more servers can fix a bad patch
I wonder how many years Sony will take to also switch to a Transformer model. 6 or 7 years, maybe?
 