
The lack of repulsion for this "frame gen" thing is truly concerning

a'la mode

Member
oh boy

This is exactly the kind of misinformation that makes these threads pop up

You're calculating display latency. Not a rendering pipeline end to end from input to display.

They even used a low latency mouse for call of duty cold war and fortnite since they work on PS5. Same mouse was used on PC.

It's a fun video! My favourite part is when they gloss over the implications of the game engine and implementation (I mean, of course they would, since it's a Nvidia Powerpoint), and Steve implies ETW is used for polling inputs. :messenger_tears_of_joy: Bro learned an acronym and was just burning to drop it.
 
Of course it doesn't look like that. That shit was the most naive implementation possible. Do you know what bilinear interpolation is (for resolution scaling)? Those early smoothing features on HDTVs are basically the frame-rate equivalent of that, whereas Nvidia's FrameGen is analogous to what DLSS does for scaling.
I had a feeling OP was exaggerating 😎 but nice to see it confirmed.
 

nkarafo

Member
and the casuals won't even care

Leonardo Di Caprio Look GIF by Once Upon A Time In Hollywood


Jurassic Park Ian Malcom GIF
 

K' Dash

Member
I had a feeling OP was exaggerating 😎 but nice to see it confirmed.

I would look for documentation on it instead of believing a random dude here.

But I agree it can't be that simple. I'm just having a hard time wrapping my head around creating a frame with new data that fits with the previous one.

DLSS is easy to comprehend; for this frame generation stuff, I'll have to sit down and read the documentation.
 
You know what I find repulsing?

The amount of people who clearly haven't tried it. It works shockingly well, and I don't notice any latency (because it's not the same as frame interpolation).


just like. Take a deep breath, and try the scary new technology before you develop such an emotional reaction to it.

I mean jeez, I've seen tests that show frame gen on 40 series cards is still hitting lower latency than natively rendered console games.

Chill out man. It’s good. Works great.
 

nkarafo

Member
upscaling has already had a bad effect on game optimization even if the tech itself is excellent. This just seems like it's going to have catastrophic consequences for gaming in the long run. I don't want all my future games to need framegen to run.
This right here is the biggest issue with this kind of tech.

You would think the reason it exists is to provide "extra performance" out of your games, right?

Well, for the first few batches of games, sure. But when it becomes established enough, developers will once again use this as an aid to cover for their incompetency and lack of optimization.

And before you know it, you will end up absolutely needing this, not for the extra performance this time but for just the acceptable performance games used to run without this tech.

The more you give to developers, the less they care. Like how they stopped caring about bugs when online digital stores gave them the freedom to release broken games with the excuse they will fix them later. Like how they stopped optimizing for space after games didn't have to be stored on a physical storage with fixed available space so now everything is more bloated than it should be. And like how all these upscaling technologies ended up being needed for acceptable performance instead of "extra" performance.

And this frame-gen tech will end up the same. It's an "extra" now that helps running games at 120+ fps, but soon enough it will be needed for the standard 60fps.

Basically the rule is this: It's not made for you. All these fancy tech things are made to benefit developers and publishers only.
 
After reading about Frame-Gen, I would assume that the current level of graphics cards can already push most games beyond their capabilities even accounting for DLSS, and the frame-gen tech would be built on top of that. This means the tech becomes less noticeable on higher-end hardware, and any motion-plus effect isn't based on pushing 30 to 60 but on 60 and above. So even in the worst-case scenario, you're still getting excellent performance from the baseline.
 

Cyberpunkd

Member
What's more curious is to see console gamers talking about input lag now, i mean if you worry so much about that just don't get a console.
It's frame pacing drama all over again. For decades nobody cared, then at the beginning of the PS4/XBO gen suddenly every game started being analysed for frame pacing. I blame the grifters from Digital Foundry; they had to come up with a problem so they could come up with a solution.

However the OP is right in that frame gen has terrible visual artifacts. I agree FG should be seen as a "budget setting" scenario, where your setup cannot pull the required number of frames natively.
 

Cyberpunkd

Member
I dunno, seems that Steam Deck owners are loving it.
It’s the difference between 25FPS and 45FPS in Dog Town. It makes a bunch of games more than playable on the Steam Deck at a cost of graphic fidelity. Basically SD owners are discovering what PC gamers were doing for decades- dial down the resolution and graphical settings, you get more frames lol

Oh My God Wow GIF
 
It’s the difference between 25FPS and 45FPS in Dog Town. It makes a bunch of games more than playable on the Steam Deck at a cost of graphic fidelity. Basically SD owners are discovering what PC gamers were doing for decades- dial down the resolution and graphical settings, you get more frames lol

Oh My God Wow GIF
I've found that a lot of demanding games already run at very low resolutions in the 20s-to-30s fps range. I understand that some users don't care about resolution, but if what you're saying is true, damn!
 

kevboard

Member
Seriously guys? You might as well turn on motion interpolation on your chitty low-end LCD.

a frame interpolation method that has access to engine information like the depth buffer and motion vectors can make way more precise and more informed guesses about what the next frame should look like.
and with DLSS specifically there's also a machine learning component that cleans up the image as much as possible.
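A rough way to picture that difference, as a toy NumPy sketch (made-up data and a hypothetical half-frame warp; this is only the general idea of reprojecting along motion vectors versus blindly blending, not NVIDIA's actual algorithm):

```python
import numpy as np

H, W = 4, 4
prev = np.random.rand(H, W, 3).astype(np.float32)  # rendered frame N (placeholder data)
curr = np.random.rand(H, W, 3).astype(np.float32)  # rendered frame N+1 (placeholder data)
mv = np.zeros((H, W, 2), dtype=np.float32)         # per-pixel motion vectors (dx, dy) from the engine

def naive_blend(a, b):
    # roughly what cheap TV interpolation amounts to: average the two frames
    return 0.5 * (a + b)

def mv_interpolate(frame, mv):
    # reproject each pixel halfway back along its motion vector to guess the
    # in-between frame; a real implementation would also use the depth buffer
    # to resolve occlusions and an ML pass to clean up the holes it leaves
    ys, xs = np.mgrid[0:H, 0:W]
    src_x = np.clip(np.round(xs - 0.5 * mv[..., 0]).astype(int), 0, W - 1)
    src_y = np.clip(np.round(ys - 0.5 * mv[..., 1]).astype(int), 0, H - 1)
    return frame[src_y, src_x]

mid_naive = naive_blend(prev, curr)      # ghosty double-image of both frames
mid_informed = mv_interpolate(curr, mv)  # pixels moved to where they should be mid-frame
```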


It's not only fake as hell,

all frames are fake


but introduces input lag

if your game already runs at 60+ native fps on a PC with Nvidia reflex enabled (which is mandatory for DLSS frame gen) you have significantly lower input lag than 99% of console games.
adding a few milliseconds of lag with frame gen will still result in lower latency than almost any console title.

if you play at 120+ native fps with frame gen and Reflex, your latency will be so low that only an eSports player would notice the difference between native and frame gen


and visual artifacts all over the place, as expected.

these visual artifacts will be almost invisible to the naked eye if your base framerate is high enough.
on a flatscreen this can be preferable to the sample and hold persistence blur that all flat screens suffer from.


basically, if you play on PC, and you have a high end GPU, and you have a high end Monitor,
then frame gen has almost zero actual downsides but the upside of more fluid looking motion and less persistence blur from your monitor.




And +reflex without framegen = lower latency than +reflex with framegen.

Because framegen... adds latency?

yes reflex without frame gen is the lowest latency you can get.
however, 60fps native framerate with frame gen on top, and reflex enabled has SIGNIFICANTLY lower latency than the same game at the same framerate on console.
so if you are that sensitive to input lag, you better not play on console, as consoles don't have access to Nvidia Reflex.

God of War 2018 at 60fps:
PS5 = 112ms
PC + Reflex = 73ms

Frame Gen adds like 16ms of lag, which will still be far below console.
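Spelling out the arithmetic behind that claim, using only the figures quoted in this post:

```python
# back-of-the-envelope check of the numbers above (God of War 2018 at 60fps)
ps5_ms = 112            # PS5 end-to-end latency quoted above
pc_reflex_ms = 73       # PC + Reflex quoted above
frame_gen_cost_ms = 16  # rough frame gen penalty quoted above

pc_reflex_fg_ms = pc_reflex_ms + frame_gen_cost_ms
print(pc_reflex_fg_ms)           # 89 ms with frame gen enabled
print(pc_reflex_fg_ms < ps5_ms)  # True: still below the quoted PS5 figure
```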
 
Last edited:
oh boy

This is exactly the kind of misinformation that makes these threads pop up

You're calculating display latency. Not a rendering pipeline end to end from input to display.




Digital foundry took the latencies from input to frame movement on PS5 and LDAT on PC.

[image: latency comparison chart]


They even used a low latency mouse for call of duty cold war and fortnite since they work on PS5. Same mouse was used on PC.

For God of War DF incredibly used the 30fps mode on PS5. Also they used the KB/M on PS5 for the above games to make the input lag the worst possible (as it's probably not optimized properly on console). You notice how low the latency is on Destiny 2 (because there is no 30fps mode and no KB/M). Basically the same number as PC with Reflex ON and VSync ON, as this is exactly how it's run on PS5. So the PS5 has the exact same input lag as the PC when settings are identical.

Df are a bunch of PC warriors and are always trying to make PC look so much better than PS5. In reality on PS5 God of War @ 60fps has about 60ms of latency (when checked by NXGamer once). And last time I checked Call of Duty games have about 40-50ms of latency (at 60fps !) on PS5 using a controller.
 

Filben

Member
I've never experienced playing a game with frame generation. Does it really look like motion interpolation on a TV, with the soap opera effect?

It can't possibly look that horrendously bad, can it?
Not exactly like that, no soap opera effect, but it does introduce artefacts. It also depends on the per-game implementation and base FPS. Frame Generation works better the higher your base FPS is.

In Indiana Jones I was barely getting 30fps with ray tracing on my old GPU. I turned on Frame Generation and had artefacts all over the screen whenever I was moving the camera around, plus a bit of added input latency compared to plain V-Sync. Input latency in this kind of game is okay, but not in a shooter or online games.

Now with my new GPU I get 45–60fps, turning on frame generation works way better in the same game but it still shows some artefacts here and there.

The thing is, all of this tech adds up. DLSS and other in-house solutions introduce some light smearing, then you get heavy post-processing by the game, then you enable frame generation, then ray reconstruction. It adds up. Image clarity and quality in motion is significantly lower than in games before this era. On the other hand, fidelity goes through the roof and you get amazingly realistic-looking games.
 

FeralEcho

Member
The only thing this should be used for is maxing high refresh rate monitors from an already high frame rate.

You want 1000hz monitors? This is how you get there.

Boosting an already low starting frame rate (under ~75fps) with it is total shit though. Sadly that's without a doubt what you'll be getting with next-gen consoles.
This right here is why it's a problem. I don't mind frame generation; it's a great technology for pushing 60fps even higher, and in time it'll get better on input lag. But as we've already seen with the lazy devs this generation, they will just use it as a crutch for bad optimization to get a "stable" 30fps+, as seen with Monster Hunter Wilds: while not looking anywhere near a generation ahead of World, it sports the most dogshit optimization and performance of any MH so far, and its spec requirements demand FG turned on to get good fps.

Overreliance on FG and upscalers will just make devs lazier and lazier. That's the big issue here, not the tech itself.
 

nkarafo

Member
yes reflex without frame gen is the lowest latency you can get.
however, 60fps native framerate with frame gen on top, and reflex enabled has SIGNIFICANTLY lower latency than the same game at the same framerate on console.
so if you are that sensitive to input lag, you better not play on console, as consoles don't have access to Nvidia Reflex.

God of War 2018 at 60fps:
PS5 = 112ms
PC + Reflex = 73ms

Frame Gen adds like 16ms of lag, which will still be far below console.
I don't play on console. I play on PC.
 

mili2110

Member
I heard a developer say that Frame Gen must be built around the engine to get the most out of it. If the frame pipeline is rebuilt around FG they will be able to reduce input lag and improve motion clarity etc.
Can't remember the tweet or interview. Maybe a DF podcast.

I guess all Sony 1st party Studios will do this for their PS6 games. Maybe we can really get 4K 120Hz with FSR + FG on PS6. (1080p internal resolution + true 30 FPS of course)
If it looks and feels like true 4K/120fps then I don't care if it's fake or not.
 
Last edited:

keefged4

Member
I mean it's fun to play around with, especially using stuff like Lossless Scaling for emulating games stuck at 30/60fps, but in general it gets turned off in actual native PC games. I hate the artifacting it causes. 60fps interpolated to 120 still has 60fps-level latency, and you can definitely feel it. It's horrible to use with M&K.
 

BlackTron

Member
Input lag is deceptive because you get used to it. I played through half of The Messenger on my TV and felt fine. I got the idea to output video to both my TV and 1ms monitor, and it was like a joke how much sooner my character landed after tapping jump on the monitor. Was halfway down the jump on TV. This was necessary for me to topple a boss after several failed attempts.

Eliminating as much lag as you can from all sources results in a different gameplay feel closer to what we enjoyed on RF adapters.
 
Only a clueless console gamer who has never tried this technology could compare TV interpolation to DLSS FG and claim that it adds noticeable input lag and artefacts.

If your base fps is around 60fps, the image looks artefact-free 99.9% of the time. Even if I'm looking for the artefacts it's not easy to find them because they're so small. I have to turn the camera extremely fast to notice small artefacts, but they're usually transparent, so even in this scenario it doesn't bother me.

[image: DLSS Frame Generation screenshot]


That's not the case with TV interpolation, even with Lossless Scaling FG, because these inferior framerate interpolation methods actually produce very noticeable artefacts, and you don't even have to look for them to notice them.

[images: artifact screenshots from TV interpolation / Lossless Scaling FG]


73fps at 28ms latency vs 124fps at 37ms latency. A 9ms difference in latency is placebo territory, and you get a way sharper and smoother image during motion. Even on M+K that input lag difference is very small, not to mention on gamepad. I tried playing Cyberpunk at real 120fps (I used DLSSQ instead of DLSS FG) and on the gamepad the experience was the same as playing at generated 120fps. This technology works like magic on PC and would dramatically improve the gaming experience on consoles, assuming it would work as well as DLSS FG.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
For God of War DF incredibly used the 30fps mode on PS5. Also they used the KB/M on PS5 for the above games to make the input lag the worst possible (as it's probably not optimized properly on console). You notice how low the latency is on Destiny 2 (because there is no 30fps mode and no KB/M). Basically the same number as PC with Reflex ON and VSync ON, as this is exactly how it's run on PS5. So the PS5 has the exact same input lag as the PC when settings are identical.
They used the 60fps mode for God of War, not the 30fps mode. Where did you even get that they used the 30fps mode?
 

GymWolf

Member
Not the biggest fan myself, but with a controller the added input latency is not super noticeable unless you play arena shooters or highly competitive, high-octane games.
 

DirtInUrEye

Member
Nvidia marketing have an army of Don Drapers who are shaping the consumer narrative. They did it before with reflections: "you must have RTX to enable these pin sharp and completely ridiculous and unrealistic reflections all over your games".

Nobody batted an eye that time though. Now so-called "enthusiast" gamers all over the world have convinced themselves that tarmac and pavement reflections are never diffuse in real life and you can even use them to touch up your makeup.
 

squidilix

Member
oh boy

This is exactly the kind of misinformation that makes these threads pop up

You're calculating display latency. Not a rendering pipeline end to end from input to display.




Digital foundry took the latencies from input to frame movement on PS5 and LDAT on PC.

[image: latency comparison chart]


They even used a low latency mouse for call of duty cold war and fortnite since they work on PS5. Same mouse was used on PC.


Again and again dishonest comparisons.

PC =/= Nvidia

Above all, you're showing that the PS5 does better than a standard PC (AMD, Intel, or lower-end Nvidia), and that you STILL need Nvidia technology (here, Nvidia Reflex) to get below the PS5.

So the "high latency" console between a PC argument is completely false. You're just using a technology that's proprietary to one GPU manufacturer.
 

Thebonehead

Gold Member
For God of War DF incredibly used the 30fps mode on PS5. Also they used the KB/M on PS5 for the above games to make the input lag the worst possible (as it's probably not optimized properly on console). You notice how low the latency is on Destiny 2 (because there is no 30fps mode and no KB/M). Basically the same number as PC with Reflex ON and VSync ON, as this is exactly how it's run on PS5. So the PS5 has the exact same input lag as the PC when settings are identical.

Df are a bunch of PC warriors and are always trying to make PC look so much better than PS5. In reality on PS5 God of War @ 60fps has about 60ms of latency (when checked by NXGamer once). And last time I checked Call of Duty games have about 40-50ms of latency (at 60fps !) on PS5 using a controller.

That's simply because it is.

Their goal is to present a fair comparison of hardware capabilities.

Being weaker? Well there is no shame in that so there is no need to feel personally attacked and defensive over it. It's just a piece of hardware with its own weaknesses and strengths, much like Nvidia, AMD and Intel cards for instance.

If anything the chart shows that Destiny 2 has input lag comparable to PC under similar settings, which actually supports the idea that the PS5 achieves low latency when the developer makes it a priority.
 

Gaiff

SBI’s Resident Gaslighter
Again and again dishonest comparisons.

PC =/= Nvidia

You show above all that the PS5 does less than on a Standard PC (AMD or Intel or Lower Nvidia) and that you STILL need Nvidia technology to be below the PS5 (here Nvidia Reflex)

So the "high latency" console between a PC argument is completely false. You're just using a technology that's proprietary to one GPU manufacturer.
Rich doesn't even use the PC properly in the first place, so you're incorrect. Maxing out GPU usage can also lead to higher latency. The trick to minimizing input latency is to cap the framerate a few fps below the monitor's refresh rate, enable G-Sync, and force VSync in the NVIDIA Control Panel. This will result in a much lower latency than the base numbers Rich got (but won't quite match Reflex for the most part).

And the reason it's lower on PS5 in DualSense games (such as GOW) is that the controller polls at 250Hz on PS5 but 125Hz on PC. You can, however, overclock it to 1000Hz on PC.
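For reference, a quick sketch of how much those polling rates contribute on their own (a simplification that ignores everything else in the chain; the half-interval average is a rule of thumb, not a measurement):

```python
# average sampling delay contributed by controller polling rate alone
for hz in (125, 250, 1000):
    interval_ms = 1000 / hz
    print(f"{hz} Hz poll -> {interval_ms:.1f} ms interval, ~{interval_ms / 2:.1f} ms average delay")
# 125 Hz poll -> 8.0 ms interval, ~4.0 ms average delay
# 250 Hz poll -> 4.0 ms interval, ~2.0 ms average delay
# 1000 Hz poll -> 1.0 ms interval, ~0.5 ms average delay
```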
 
Last edited:

Elios83

Member
It's just the AI-based, more advanced version of the motion interpolation "soap-opera" effect that has been present on TVs for years and that every so-called gamer has always disabled so far.
As a compromise you'll still get extra input lag and motion-related artifacts around fast-moving objects (e.g. if you rotate the camera quickly), because AI is not magic.
If it's used as a quick job for developers to try to turn their 25-30fps code into a 60fps performance mode it won't be good.
 
Last edited:

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
So lesson here is that it’s TV interpolation and a gimmick

Until

Cerny presents it for PS6 and then it’s a game changer

Bookmark me for PS6 tech unveil

Console fans are like AMD fans. It’s a gimmick until they get it.

I recall the same discussions for Gsync over a decade ago.
Yep, same way as 30fps is more cinematic and anything beyond 720p isn't noticeable by the human eye.

Give it some time and the discourse will do a 180°.
 

Fafalada

Fafracer forever
if you play at 120+ native fps with frame gen and Reflex, your latency will be so low that only an eSports player would notice the difference between native and frame gen
What reason 'other' than eSports would there be to try to run at 240+?
If you're only after motion clarity (and presumably you have a display running 240+), you can achieve it with low-persistence refresh instead, which will not have the smudging and artifacts that frame-gen introduces, has no impact on latency, and doesn't need 'special GPU features' that NVidia is paywalling behind generations.

so if you are that sensitive to input lag, you better not play on console, as consoles don't have access to Nvidia Reflex.
Reflex is just software - there's no secret sauce that consoles can't do - it's down to what developers choose to prioritise.
And yes - console devs 'usually' don't bother, which is a problem in and of itself, and actually aggravates frame-gen issues (because added latency from interpolation is also a function of base latency).
But that's all moving the goalposts; the relevant comparison isn't native fps on console vs frame-gen on PC. Most people moving from consoles will get better fps on PC anyway (ignoring stutterfests), no need for faux frames.
 

deeptech

Member
As some have already said it, it helps your games only when base fps is playable already, making it look smoother and nicer to the eye with more or less no added problems.
But those who think it literally "generates" or creates new actual frames, and thus turns an unplayable 30fps game into a perfectly playable 90fps+ game, are simply wrong. It will never fix low performance.
So it's a win, basically, for those who are already winning. I guess it's cute overall, but it's wrongly thought of as a fix for weak hardware, and it's certainly not as magical as cringelords suggest.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
PS360-PS4 era: 60fps is overrated. You can’t tell the difference.

PS5/SX era: 75% choose 60fps. 30fps has suddenly become unacceptable.

Frame gen lag is proportional to the source framerate (and latency). If you frame gen that 150ms game - it'll be a lot worse than 170ms.
Not to mention that results are always at least Nx worse than the target (N) framerate even if frame-gen was 0 latency in and of itself.
You’re correct, but at those latencies, you’re probably playing at 30fps or below and it’s universally agreed to not toggle frame gen on. Even NVIDIA recommends frame gen at 40fps and above and AMD 60fps and above. That’s just because NVIDIA makes sure to use Reflex too. Otherwise, they’d probably also recommend it only at above 60fps.
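A minimal sketch of that scaling (the one-source-frame hold-back is the standard cost of interpolating between two frames; the fixed overhead value here is just an assumed placeholder):

```python
# simplified model: frame interpolation has to hold back at least one rendered
# frame before it can show the generated in-between frame, so the added delay
# is roughly one source frame time plus the algorithm's own cost
def added_latency_ms(base_fps: float, overhead_ms: float = 3.0) -> float:
    return 1000.0 / base_fps + overhead_ms

for fps in (30, 40, 60, 120):
    print(f"{fps} fps base -> ~{added_latency_ms(fps):.1f} ms extra")
# 30 fps base -> ~36.3 ms extra
# 40 fps base -> ~28.0 ms extra
# 60 fps base -> ~19.7 ms extra
# 120 fps base -> ~11.3 ms extra
```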
 
Last edited:

Durin

Member
The issue is really the usage of it. If you need frame-gen to get your game to 60fps, you'll notice the latency hit, and I don't think it's worth it. Console users have moved to 60fps, so devs of PS6 will target that with just upscaling, no frame-gen.

If upscaling or native resolution alone is already getting you 60+fps in a game, then for most people the latency hit will be minimal compared to the better motion clarity of the higher framerate frame-gen provides. The newer Nvidia Reflex seems to further reduce the latency hit as well, but I want to try it myself.

The only time I noticed visual artifacts for frame gen was for Spider-man 1 when the tech first came out, and on Linux there are still some issues. Now though on Windows it's fine, and DLSS 4 brings upgrades to reducing visual artifacts with the transformer model.

I don't get complaining about visual artifacts when, until the PS5 Pro, you were stuck with the visual artifacts of Sony's checkerboard rendering, and the only other AA option game engine makers give us now is TAA... which is blurry slop that DLSS/DLAA spits on, and by the looks of it FSR 4 will too.
 
Last edited:

G-DannY

Member
Glorified motion blur, just there to appeal to and justify the e-peens of "10000Hz+" monitor owners.
 
Last edited:

DirtInUrEye

Member
Huh. Nvidia owns the market. There is no voting with your wallet when there is only 1 choice at the high end.

Eh, Rockstar own the pretend grand theft auto market and are rumoured to be possibly charging a hundred quid for it.

"Well I'll have to just pay it then won't I." appears to be your response.
 
Last edited:

Calverz

Member
I agree with your point OP. I don't like it either and avoid it where possible. But unfortunately the mass market won't care, and that dictates the direction of travel for computer graphics technology. You are right, it will become commonplace in the future.
 
Well, given that the only way to get more performance these days is to push more power through the card, it's either frame gen or an installable fusion reactor. My 4090 already heats my entire room and raises my electric bill by at least $30 a month. Add that the 5090 uses an additional 125 watts for a marginal 30% performance gain and it's not a great trend. We will definitely need frame gen to make up the difference going forward.
 