
Microsoft is preparing DirectX for Neural Rendering

amigastar

Gold Member
It’s hideous because it would ruin movies that are shot at 23.976. Visually it’ll make games look super smooth and high framerate. But the latency was way too high to play. Frame gen is ultimately the same thing, just with better latency (which I still find unacceptable).
Star Wars: The Force Unleashed II also had some kind of frame interpolation, but I didn't play it so I can't comment on latency.
 
Last edited:

llien

Member
Textures take up a decent chunk of game storage, and if something isn't done we'll soon be approaching storage size bottlenecks.

We've already seen examples of neural texture compression in games like GOWR, basically storing a low resolution texture and then upsampling it via ML in realtime, this is going to be really exciting stuff once more game studios start to embrace it but I guess the hardware will have to come first. I'm really curious to see how the neural rendering will impact geometry and other things like material shaders.

Texture compression has been a thing for years.

Using NNs for that is a different story; e.g. a VAE (2013) can do it (the compression levels shown here are... insane, from 128*128*3 = 49,152 values down to a 1,024-dimensional latent; obviously, it comes at a price):

[Image: VAE architecture and reconstruction examples. (A) VAE architecture: the encoder and the decoder each contain 5 layers; the latent variables have dimension 1024. Operations: *1 convolution (kernel size = 4, stride = 2, padding = 1), *2 rectified nonlinearity, *3 fully connected layer, *4 re-parameterization trick, *5 transposed convolution (kernel size = 4, stride = 2, padding = 1), *6 sigmoid nonlinearity. (B) Examples of VAE reconstruction: each test image was encoded into 1024-dimensional latent variables with the VAE encoder; the reconstruction from the VAE decoder (blurred images on the right) retains basic, condensed information similar to the original input (clear images on the left).]


But doing that dozens of times a second, just to save VRAM... I am skeptical.
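For a rough sense of the compression being described, here's a minimal sketch of a VAE that squeezes a 3x128x128 image (49,152 values) into a 1,024-dimensional latent. The layer widths and channel counts are illustrative assumptions, not the exact architecture from the figure:

```python
# Toy VAE sketch: 3x128x128 image -> 1024-dim latent -> blurry reconstruction.
# Layer sizes are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, latent_dim=1024):
        super().__init__()
        # Encoder: 49,152 input values -> 1,024 latent values
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=4, stride=2, padding=1),    # 64x64
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),   # 32x32
            nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),  # 16x16
            nn.ReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.Linear(128 * 16 * 16, latent_dim)
        self.fc_logvar = nn.Linear(128 * 16 * 16, latent_dim)
        # Decoder: latent -> reconstruction of the original image
        self.fc_dec = nn.Linear(latent_dim, 128 * 16 * 16)
        self.decoder = nn.Sequential(
            nn.Unflatten(1, (128, 16, 16)),
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),  # 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),   # 64x64
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),    # 128x128
            nn.Sigmoid(),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        return self.decoder(self.fc_dec(z)), mu, logvar

x = torch.rand(1, 3, 128, 128)   # 49,152 input values
recon, mu, _ = TinyVAE()(x)
print(mu.shape, recon.shape)     # latent is 1,024 values; output is 3x128x128
```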
 
Last edited:

Lethal01

Member
It’s hideous because it would ruin movies that are shot at 23.976. Visually it’ll make games look super smooth and high framerate. But the latency was way too high to play. Frame gen is ultimately the same thing, just with better latency (which I still find unacceptable).

Frame gen looks far better because it can use the actual game data, and yes, because of the AI.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Star Wars: The Force Unleashed II also had some kind of Frame Interpolation. But i didn't play it so i can't comment on latency.

I might be mistaken, but I think they only demoed it and never actually shipped it in the console version, which still ran at a standard 30fps.
 
Last edited:

DoubleClutch

Gold Member
Frame gen looks far better due to the fact it can use the actual games data and yes due to AI.

Not that much better. Ghost of Tsushima, for example, had lots of artifacts when it was turned on.

And regardless of how it looks, it plays poorly because of the increased latency. That's not an easy fix, and it's why it'll never be used for serious gaming.
 

Knightime_X

Member
How about making games fun first?

So many special marketing terms and technologies yet we still are stuck in the late 2000’s in terms of gameplay logic.

Still waiting for DirectStorage to revolutionize gaming… it came out 3 years ago and there’s like a handful of games that use it and Forspoken is one of them.
Two completely separate groups of developers: one makes the game, the other makes it look good.
 

DoubleClutch

Gold Member
2 completely separate groups of developers.
1 makes the game, the other makes them look good.

My point is that none of these technologies end up being utilized to an extent that significantly improves the gameplay.

It looks better, but that's about it. We have fewer destructible environments now than we did before. But at least everything is shiny.
 

winjer

Gold Member
My point is none of them end up utilized to any great extent that significantly improve the gameplay.

It looks better, but that’s about it. We have less destructible environments now than we did before. But everything is shiny at least.

FFS, dude, this thread is about technology, not gameplay.
If you want to whine about gameplay, create your own thread.
 

adamsapple

Or is it just one of Phil's balls in my throat?
My point is none of them end up utilized to any great extent that significantly improve the gameplay.

It looks better, but that’s about it. We have less destructible environments now than we did before. But everything is shiny at least.

Hi... what the fuck does anything you've said in this topic have to do with the topic itself, lol.

This is just technology; it's up to the developers how to utilize it. Whinging about 'but the gameplay is not fun' is a whole other can of worms.
 

Knightime_X

Member
My point is none of them end up utilized to any great extent that significantly improve the gameplay.

It looks better, but that’s about it. We have less destructible environments now than we did before. But everything is shiny at least.
Marvel Rivals has lots of destruction, more than any game in recent times.
Same with The Finals.

Earlier games like Red Faction: Guerrilla had piddly GPU and CPU requirements.
Games like RFG weren't possible to that degree in the PS2 days.
It'll come, just give it time.
 

Zathalus

Member
And regardless of how it looks, it plays poorly because of increased latency. That’s something that’s not an easy fix and why it’ll never be used for serious gaming.
The increased latency is not that impactful. The average latency increase is 15 ms over a game with Reflex enabled, and the total is usually on par with or lower than a game that doesn't use Reflex at all. If you've never had an issue with a 60fps game on console, or before Reflex was around, you won't have a problem with the latency of frame generation. The newer DLSS 4 version is supposed to have less latency as well.
 

DoubleClutch

Gold Member
The increased latency is not that impactful. Average latency increase is 15ms over a game with Reflex enabled and is usually on par or lower vs a game that doesn’t use it. If you’ve never had an issue with a 60fps game on console or before Reflex was around, you won’t have a problem with the latency of frame generation. The newer DLSS4 version is supposed to have less latency as well.

15 ms of added latency is far more than I will ever accept.

That’s the difference between playing at 60 fps and 30 fps on its own, without taking into account all the other latencies.
 

DoubleClutch

Gold Member
So you never played games released before 2020 then.

What’s your argument here?

If a game runs at 60 fps, that's 1,000 ms / 60 = 16.67 ms per frame. Going from 60 to 120 fps reduces it to 8.33 ms; now add 15 ms back for frame gen and you're above what 60 fps was to begin with.

Then you add latency from your TV, peripherals such as your controller and mouse, the network, etc.

No.
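The frame-time arithmetic above, worked out in a few lines. This covers frame time only, not full end-to-end latency, and the 15 ms frame-gen cost is the figure claimed in this thread:

```python
# Frame-time arithmetic from the post above (frame time only; end-to-end
# latency would also include display, peripherals, network, etc.).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

native_60 = frame_time_ms(60)             # ~16.67 ms per frame
framegen_120 = frame_time_ms(120) + 15.0  # ~8.33 ms + ~15 ms claimed frame-gen cost

print(f"60 fps native:       {native_60:.2f} ms")
print(f"120 fps + frame gen: {framegen_120:.2f} ms")
```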
 

DoubleClutch

Gold Member
He doesn’t know system latencies but he’s in a tech thread…

His "pure" native has more latency than framegen+reflex+upscale


Keep slurping that green kool-aid.

That's why competitive gaming entails turning all that garbage off, but okay, sure.

I bet you believe playing with mouse acceleration and angle snapping at maximum is also better because it’s smoother and higher tech.
 

DoubleClutch

Gold Member
Here you go:

[latency chart from the TechSpot DLSS 3 article linked below]


Source: https://www.techspot.com/article/2546-dlss-3/

“DLSS 3 frame generation also increases input latency, even if it increases the frame rate. What this creates is a weird situation where frame rate increases, but latency increases as well - usually when gaming, higher frame rates lead to lower latency, and therefore a more responsive experience.

But this isn't the case with DLSS 3, which means that at times enabling DLSS 3 can make a game feel more sluggish even if the frame rate has risen, a weird feeling for sure.”
 
Last edited:

Buggy Loop

Gold Member
Keep slurping that green kool-aid.

That’s why every competitive gaming entails turning all that garbage off, but okay, sure.

I bet you believe playing with mouse acceleration and angle snapping at maximum is also better because it’s smoother and higher tech.

Competitive gamers use, at the very least, Reflex/Anti-Lag + upscaling (upscaling if they play at 4K, of course). If it's the kind of gamer who plays at 4:3 720p low settings to see the most raw, square pixels to spot a sniper, no, but how relevant is that even today? Haven't seen that in a CS:GO tournament for a decade.

They don't use frame gen because it adds ~12 ms; they want the minimum latency possible.

[chart: latency with DLSS Quality + frame gen + Reflex vs. native]


DLSS Quality + frame gen + Reflex is 63% lower latency than your raw native.

And that's without even going into how Nvidia typically has lower native latency than AMD.

[chart: Fortnite latency, RTX 4070 Ti]


That's a +22 ms difference between a 4070 and a 7900 XTX. Unacceptable, right?

[chart: console latency]


Consoles: unacceptable without Reflex.

By your very own criterion of 15 ms, nothing is acceptable except Nvidia + Reflex (y)

And I mean... Reflex 2, specifically for competitive shooters, is going to be unbeatable.

Argue this one:

[image]
 
Last edited:

Zathalus

Member
What’s your argument here?

If a game is 60 fps it has 1,000 ms / 60 =16.667 ms of latency. Going from 60 to 120 reduces it to 8.33, now you add 15 ms back to it for frame gen and you’re at more than what 60 fps was to begin with.

Then you add latency from your TV, peripherals such as your controller and mouse, the network, etc.

No.
Before Reflex came around, the end-to-end latency for a 60fps game on PC was anywhere from 60 to 200 ms. Reflex changed that by lowering latency to the 35-60 ms range. Adding frame generation on top of that increases the total latency to roughly 45-70 ms. Reflex was only introduced at the end of 2020, and many games still ship without it.

Hence, if you find that frame generation adds more latency than you would ever accept, then how was any game prior to 2020 acceptable at all? Or the many games that come out today with it missing? Fun fact: enabling frame generation on the PC version of Black Myth gave you lower latency than native, as the game didn't ship with a toggle for Reflex by itself.
 

winjer

Gold Member
Before Reflex came around the end to end latency for a 60fps game on PC was anything from 60 to 200 ms. Reflex changed that by lowering latency to the 35 to 60 ms range. Adding frame generation on that increases the total latency to 45 up to 70 ms or so. Reflex was only introduced at the end of 2020 and many games still ship with it not even being available.

Hence if you find frame generation to be adding more latency then you would ever accept, then how was any game prior to 2020 at all acceptable? Or the many games that come out today with it being missing? Fun fact, enabling frame generation on the PC version of Black Myth gave you lower latency than native as the game didn’t ship with a toggle for Reflex by itself.

That is not accurate.
Those kinds of latencies would only happen in poorly optimized games, at 60 fps or lower, with V-Sync on, without a G-Sync monitor, and with a render queue of 3.



[GIF: 60 Hz vs 240 Hz latency comparison]
 

Zathalus

Member
That is not accurate.
Those kind of latencies you talk about would only happen in poorply optimized games, at 60 fps or lower, and with V-sync on, without a Gsync monitor, and with a render queue of 3.



There is a post before mine showing end-to-end latency in Deathloop being 150 ms locked at 60fps. Tested with Nvidia LDAT, IIRC.
 

DoubleClutch

Gold Member
We're mixing apples, bananas, and oranges.

Frame generation on its own always increases latency.

DLSS increases frame rate because it's rendering at a lower resolution, and the higher frame rate helps partially lower latency.

Reflex will typically lower the frame rate a bit but can help lower total system latency in some cases, though not all, and the fact that it's not turned on by default is evidence of this.

Let's take a popular game like Call of Duty. If you can already run it natively at your display's resolution and max refresh rate, under no circumstances should you turn on DLSS and frame gen, and certainly not frame gen on its own. Reflex would probably help, but not always.
 

DoubleClutch

Gold Member
The catch in those charts is that they're not comparing like-for-like resolutions or frame rates/times.

What you should be comparing is 4K DLSS Performance vs. 1080p native, because then both are being rendered at the same internal resolution.

It's disingenuous to compare 4K native vs. 1080p upscaled.

In competitive gaming, the smearing and artifacting from upscaling may make it better to just run a lower resolution natively, without any upscaling.
 

onQ123

Gold Member

We're not that far away from Nvidia's vision.
The only problem with this is that it can't be used as a standard by developers unless these GPUs dominate the market or the Switch 2 uses it.

Microsoft has a chance to set it as a standard for the next wave of Xboxes and do things like having really small game files and faster game production.
 

Zathalus

Member
And I just showed you CS:Go doing 40ms on default. And 15ms with a few tweaks.
By running the game internally at 2000fps?

It doesn't really invalidate my point that many, many PC games ran at higher end-to-end latency before Reflex became a thing, latency higher than Reflex + frame generation. Even console games did.
 
Last edited:

winjer

Gold Member
Many others didn’t. So unless he refuses to play a game with a higher end to end latency of 60ms, my point still stands.

Like I said, if it's a poorly optimized game, with V-Sync on, without G-Sync, and with a render queue of 3 frames, that might be true for some games.
But that is like running a car with the handbrake on and then complaining it's slow.
 

Zathalus

Member
Like I said, if it's a poorly optimized game, with vsync on, without Gsync and a render queue of 3 fps, that might be true for some games.
But that is like running a car with the handbrake on and then complaining it's slow.
Call of Duty, Fortnite, Destiny 2, and Cyberpunk 2077 all have latency higher than 80ms with Reflex disabled. Those are some of the most popular games on the market. If all of them are poorly optimised then that’s what a lot of people play with. Elden Ring latency is even higher. Once again, extremely popular game.

If most people are fine with that, some Reflex+Frame Generation is hardly going to matter.
 

winjer

Gold Member
Call of Duty, Fortnite, Destiny 2, and Cyberpunk 2077 all have latency higher than 80ms with Reflex disabled. Those are some of the most popular games on the market. If all of them are poorly optimised then that’s what a lot of people play with. Elden Ring latency is even higher. Once again, extremely popular game.

If most people are fine with that, some Reflex+Frame Generation is hardly going to matter.

That speaks to the lack of latency optimization in those games, not to what is possible on PC with well-optimized games.
Besides, what settings were those tests made with?
BTW, did you know that a single cvar in UE4/5 can have a huge impact on latency? It's the render queue, and it's turned on by default, making latency much greater in all games that use it.
Did you know that capping the frame rate slightly below the average frame rate can improve latency significantly in most games?
Or that both the Nvidia and AMD control panels have options to reduce latency?
 

DoubleClutch

Gold Member
Call of Duty, Fortnite, Destiny 2, and Cyberpunk 2077 all have latency higher than 80ms with Reflex disabled. Those are some of the most popular games on the market. If all of them are poorly optimised then that’s what a lot of people play with. Elden Ring latency is even higher. Once again, extremely popular game.

If most people are fine with that, some Reflex+Frame Generation is hardly going to matter.

Wrong.


Warzone 2: At 4K, Reflex produced a 6% average improvement, though, less than a 1ms PC latency difference to put things in perspective.

Plague Tale: This test shows how lean Unreal Engine 5 can be when it comes to latency. It also reveals limited potential for Nvidia Reflex to improve the experience. The Reflex average PC latency improvement was 0.5ms, which is technically a 4% reduction, but not a perceivable change.

Also, sometimes you will have worse frame pacing. It’s a good technology but not a silver bullet. It depends on the situation.
 

DoubleClutch

Gold Member
That speaks to the lack of optimization for latency of those games, not to what is possible on PC with well optimized games.
Besides, what settings were those tests made with?
BTW, did you know that a single cvar in UE4/5 can have a huge impact in latency? It's the render queue, and it's turned on by default, making latency much greater on all games.
Did you know that capping frame rate, slightly bellow average frame rate can improve latency significantly in most games.
Or that both the Nvidia and AMD control panel have options to reduce latency.

In traditional rendering, a queue between the CPU and GPU helps maintain pacing but adds input latency.

Reflex synchronizes the CPU and GPU pipelines, effectively removing the render queue and reducing input latency. However, it requires precise timing to prevent pacing issues.

Furthermore, if you're bottlenecked at the GPU, Reflex might actually slightly increase latency rather than reduce it.
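A very rough way to picture the render-queue effect being described (an illustrative model only; real pipelines have more stages and variable frame times):

```python
# Crude conceptual model of render-queue latency (illustration only).
def approx_input_latency_ms(fps: float, queued_frames: int) -> float:
    frame_time = 1000.0 / fps
    # Input sampled for a frame is displayed after the frames already queued
    # ahead of it finish, plus its own frame time.
    return (queued_frames + 1) * frame_time

print(approx_input_latency_ms(60, queued_frames=3))  # deep queue: ~66.7 ms
print(approx_input_latency_ms(60, queued_frames=0))  # just-in-time submission: ~16.7 ms
```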
 
Last edited:

Zathalus

Member
That speaks to the lack of optimization for latency of those games, not to what is possible on PC with well optimized games.
Besides, what settings were those tests made with?
BTW, did you know that a single cvar in UE4/5 can have a huge impact in latency? It's the render queue, and it's turned on by default, making latency much greater on all games.
Did you know that capping frame rate, slightly bellow average frame rate can improve latency significantly in most games.
Or that both the Nvidia and AMD control panel have options to reduce latency.
Only one of the games I listed even uses UE. And yes, I'm sure there are many tricks for reducing latency; Nvidia LLM (Low Latency Mode) or other options can help. But that's kind of not my point, is it? I was addressing the claim that frame generation adds an unacceptable amount of latency, which, unless you are playing competitive twitch shooters exclusively, is probably completely untrue for the majority of people.

Wrong.


Warzone 2: At 4K, Reflex produced a 6% average improvement, though, less than a 1ms PC latency difference to put things in perspective.

Plague Tale: This test shows how lean Unreal Engine 5 can be when it comes to latency. It also reveals limited potential for Nvidia Reflex to improve the experience. The Reflex average PC latency improvement was 0.5ms, which is technically a 4% reduction, but not a perceivable change.

Also, sometimes you will have worse frame pacing. It’s a good technology but not a silver bullet. It depends on the situation.
From the article: “The test data we gathered strictly measures PC latency as opposed to total system latency that includes peripheral input and monitor display latency.”

I'm talking about end-to-end latency here (captured with a high-speed camera or Nvidia LDAT; the latter is what DF and Hardware Unboxed used), which is what actually matters to you as the end user. I've also been talking about games at 60fps. If you move up to 144fps, as that article does, the gap shrinks; at 360Hz, for example, the advantage of Reflex is basically nothing, but at that frame rate frame generation is useless as well. Competitive shooters and extreme high-refresh gaming aren't really what frame generation should be used for anyway.
 

nemiroff

Gold Member
My point is none of them end up utilized to any great extent that significantly improve the gameplay.

It looks better, but that’s about it. We have less destructible environments now than we did before. But everything is shiny at least.
Compulsive. Please stop.
 
Last edited:

DoubleClutch

Gold Member
Only one of the games I listed even uses UE. And yes, I’m sure there are many tricks to do with reducing latency. Nvidia LLM or other options can help. But that kind of is not my point is it? I was addressing the claim that frame generation adds an unacceptable amount of latency, which unless you are playing competitive twitch shooters exclusively, is probably completely untrue for the majority of people.


From the article - “ The test data we gathered strictly measures PC latency as opposed to total system latency that includes peripheral input and monitor display latency. ”

I’m talking about end to end latency here (captured with a high speed camera or Nvidia LDAT, the latter DF and Hardware Unboxed used), which is what is actually important, as that is what matters to you as the end user. I’ve also been talking about games at 60fps, if you move it up to 144fps as that article does, then the gap would be lower, at 360hz for example, the advantage of Reflex is basically nothing, but at that fps frame generation is useless as well. Competitive shooters or extreme high refresh gaming is also not really what frame generation should be used for.

Why would we take mouse latency into account when measuring the effect of Reflex?
 

Zathalus

Member
Why would we take mouse latency into account when measuring the effect of reflex??
Because I've been talking about end-to-end latency this whole time? You're using an article with a completely different testing method and frame rate, which is obviously going to give different numbers from what I've been talking about. Never mind the fact that total system latency is the only thing that actually matters when talking about adding or lowering latency, for obvious reasons.

Look, if you exclusively play games at 144Hz-360Hz, then sure, I can see how Reflex and frame generation don't appeal to you. A lot of people, especially those playing single-player games with higher visual fidelity, probably don't exclusively play at that kind of fps. Hence the latency savings achieved by Reflex are usually substantial, and the added latency of frame generation is minor.
 

DoubleClutch

Gold Member
Because I’ve been talking about end to end latency this whole time? You’re using an article with a completely different testing method and frame rate is obviously going to give different numbers to what I’ve been talking about. Never mind the fact that total system latency is the only thing that actually matters when talking about adding or lowering latency, for obvious reasons.

Look, if you exclusive play games at 144hz-360hz, then sure I can see how Reflex and Frame Generation doesn’t appeal to you. A lot of people, especially those playing single player games with higher visual fidelity, probably don’t exclusively play at that kind of fps. Hence the latency savings achieved by Reflex is usually substantial, and the added latency of frame generation is minor.

I understand that, but if the latency isn't improved in the situations I showed you, it's not going to suddenly improve your peripheral latency either. That's why it's a moot point overall.
 

DoubleClutch

Gold Member
Look, if you exclusive play games at 144hz-360hz, then sure I can see how Reflex and Frame Generation doesn’t appeal to you. A lot of people, especially those playing single player games with higher visual fidelity, probably don’t exclusively play at that kind of fps. Hence the latency savings achieved by Reflex is usually substantial, and the added latency of frame generation is minor.

Also, I get what you're saying, but the irony of that situation is that if you're GPU-bound (which is most likely the case in the situation you're describing), Reflex could actually have the opposite effect of what's intended.

And the lower your base FPS, the worse the added delay from frame gen is.

A 30 fps base, plus however much frame gen adds, is basically unplayable, for instance.
 

Lethal01

Member
Also, I get what you’re saying but the irony of that situation is if you’re gpu bound (which is most likely the case in the situation you’re describing), reflex could actually have the opposite effect of what’s intended.

And the lower FPS you already have, the far worse the added delay for frame gen is.

30 fps base, plus how ever many frame gen adds is basically unplayable for instance.

You probably wouldn't choose to upscale a shooter from 30fps, that's true, but gaming is more than just shooters, and you could easily beat even something like Elden Ring with it.

Frame generation implemented well adds maybe as much lag as V-Sync, and calling that much lag intolerable is crazy. Hell, most competitive fighting games give you something like 4 frames of lag.

If you're aiming to reach the top 3% in a super popular shooter like Marvel Rivals, then sure, an added 20 ms or so could hurt you, but honestly it's still negligible, even if avoiding it is nice to have.

 
Last edited:

Thebonehead

Gold Member
From reading this, it seems to be highly beneficial in the render pipeline for increasing performance.

Render part of the frame, then hand over to the machine learning cores along with the information the model needs to complete it.

The difference now is the hardware directly supporting it.

Anyone got any deeper dives on this?
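To picture the idea, here's a conceptual toy in which a tiny MLP "completes" per-pixel shading from G-buffer-style features. This is not the DirectX API or any shipping implementation, just an illustration of handing partial frame data to a small network:

```python
# Conceptual sketch only: a tiny MLP "finishing" shading from per-pixel
# features (albedo, normal, depth, ...). Not the DirectX API.
import numpy as np

H, W = 4, 4                       # toy resolution
feat = np.random.rand(H * W, 8)   # per-pixel features produced by the renderer

# Toy 2-layer MLP; random weights stand in for a trained model.
W1, b1 = np.random.randn(8, 16), np.zeros(16)
W2, b2 = np.random.randn(16, 3), np.zeros(3)

hidden = np.maximum(feat @ W1 + b1, 0.0)          # ReLU
rgb = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))   # sigmoid -> [0, 1] colour

frame = rgb.reshape(H, W, 3)      # the "neurally completed" pixels
print(frame.shape)
```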
 

pasterpl

Member
How about making games fun first?

So many special marketing terms and technologies yet we still are stuck in the late 2000’s in terms of gameplay logic.

Still waiting for DirectStorage to revolutionize gaming… it came out 3 years ago and there’s like a handful of games that use it and Forspoken is one of them.
This is a tech solution; like DirectStorage, it depends on the devs whether they use it or not. It has nothing to do with gameplay. If devs don't use it, you won't see the benefits. Not all devs will use RTX or PSSR or FSR, but that has no impact on whether a game is fun.
 

Zathalus

Member
If DirectX is getting neural rendering now, that probably means we will see it in the next-gen PS6 as well. With matrix accelerators, UDNA in 2027/2028 will certainly have enough ML acceleration for it. Obviously Sony will be using their own API.
 

winjer

Gold Member
If DirectX is getting Neural Rendering now, that probably means we will see it in the next gen PS6 as well. With Matrix Accelerators, UDNA in 2027/2028 will certainly have enough ML acceleration to do so. Obviously Sony will be utilising their own API to do so.

CDNA already has great Tensor units. In fact, they are better than Nvidia's.
But Nvidia still has a huge advantage in software, so most companies choose Nvidia.

Only one of the games I listed even uses UE. And yes, I’m sure there are many tricks to do with reducing latency. Nvidia LLM or other options can help. But that kind of is not my point is it? I was addressing the claim that frame generation adds an unacceptable amount of latency, which unless you are playing competitive twitch shooters exclusively, is probably completely untrue for the majority of people.


From the article - “ The test data we gathered strictly measures PC latency as opposed to total system latency that includes peripheral input and monitor display latency. ”

I’m talking about end to end latency here (captured with a high speed camera or Nvidia LDAT, the latter DF and Hardware Unboxed used), which is what is actually important, as that is what matters to you as the end user. I’ve also been talking about games at 60fps, if you move it up to 144fps as that article does, then the gap would be lower, at 360hz for example, the advantage of Reflex is basically nothing, but at that fps frame generation is useless as well. Competitive shooters or extreme high refresh gaming is also not really what frame generation should be used for.

I can notice the latency difference just from enabling FG. And I can also notice the difference from enabling Anti-Lag 2.
And the difference that capping the frame rate makes to frame times and latency.
And I can also notice that with FG it's visually much smoother.
But overall, I still prefer to disable FG and enable Anti-Lag 2, to have really low latency.
 
Last edited:
I'm wondering: Will these features work on previous RTX series cards?

Alan Wake 2 will be updated soon, and Nvidia has written that it will support Mega Geometry on all RTX series cards, which seems to suggest that neural rendering will not be restricted to the RTX 50 series.


[image]
 
This funny thing is most early HDTVs had frame interpolation. It’s the same function today (adds fake frames in between), but it sounds so much cooler with neural networks, AI, machine learning, and the like.

It’s BS.
And they called me insane for using motion smoothing on my LG CX. Now it's all the rage.
 
Last edited:

winjer

Gold Member
And they called me insane using motion smoothing on my lg cx. Now its all the rage.

There are a few very important differences.
FG in these games can access several buffers, like depth, colour, and motion, to produce a more accurate image with fewer artifacts.
That's something an interpolator that only sees the final frame delivered by the console can't do.
The second difference is that all games that use Nvidia FG also have Reflex, to keep input latency in check.
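A toy illustration of that difference (not DLSS frame generation itself): a TV-style interpolator can only blend the two finished frames, while an engine-integrated approach can reproject pixels along motion vectors the game provides:

```python
# Toy comparison: blending finished frames vs. using engine motion vectors.
import numpy as np

prev = np.random.rand(4, 4, 3)              # previous finished frame
curr = np.random.rand(4, 4, 3)              # current finished frame
motion = np.zeros((4, 4, 2), dtype=int)     # per-pixel motion vectors from the engine
motion[..., 1] = 2                          # e.g. everything moved 2 pixels right

# TV-style: blend the two finished frames (ghosting on moving edges).
tv_interp = 0.5 * prev + 0.5 * curr

# Engine-aware: reproject the previous frame halfway along the motion vector.
h, w = np.indices((4, 4))
src_y = np.clip(h - motion[..., 0] // 2, 0, 3)
src_x = np.clip(w - motion[..., 1] // 2, 0, 3)
mv_interp = prev[src_y, src_x]

print(tv_interp.shape, mv_interp.shape)
```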
 