
GeForce RTX 5090 is $1,999, 5080 $999, 5070 Ti $749, 5070 $549 (Availability Starting Jan 30 for RTX 5090 and 5080)

And you can have framegen with Lossless Scaling for £5 right now, no matter your GPU, if you want 4x. It might not be perfect, but this stuff exists.

Can you help me understand what you mean by this? Sounds like there is a mod you can buy that does framegen on any card?
 

llien

Member
It's obvious they have nerfed the 5090 to keep the server solutions attractive.
With only 32GB, I very much doubt it.
The 5090 has 33% more shaders than the 4090, quite a beefy bump.

Especially compared to the rest of the lineup.

Sounds like there is a mod you can buy that does framegen on any card?
Don't your TVs have the option to inflate frames? Chips that can do that are very cheap.

And if you're wondering where the catch is: frame inflation increases lag, making games less responsive, not more.

It is fine in point-and-click adventures and that sort of game, where, ironically, fps does not matter much.

PS
Oh, wow, a software solution too... :)))
 
Last edited:

MMaRsu

Member
GPU prices went so far into ridiculous territory that people start to think $549 is not that bad. Where are the actually decent-performing $149 or $299 cards? A successor to the GTX 1650?

The 3060 wasn't announced when the 3000 series cards were announced.

You'll get a 5060 for 399€
 
Can't wait to see reviews and benchmarks, but I'm gonna try to ride my 3080 Ti for as long as I can. I game at ultrawide 1440p, so I'm fine for now.
 
So with pricing like this we could expect a $349 5060 and a $149 5050 in the future? When do you folks think those entry-level cards are gonna be announced? 2026? Q4 2025?
 
Teraflops and rasterisation are old and useless.
Absolute nonsense.

Try playing a really demanding game that only gets ~30fps before applying framegen. It's atrocious, because you're still playing a game with 30fps-level responsiveness, even if the fps counter shows you a three-digit number.
Reflex helps quite a bit, or rather it's a must, but in the end you really want that 60fps baseline before going all in on the fake frames, and that's a tall order with the way game requirements keep increasing, at least when we're talking about AAA titles like CP2077 or Wukong, for example.
The "classic" performance metrics absolutely still matter for that necessary baseline. Maybe that will change too, but at least right now we're not there yet.
 
Last edited:

peish

Member
Absolute nonsense.

Try playing a really demanding game that only gets ~30fps before applying framegen. It's atrocious, because you're still playing a game with 30fps-level responsiveness, even if the fps counter shows you a three-digit number.
Reflex helps quite a bit, or rather it's a must, but in the end you really want that 60fps baseline before going all in on the fake frames, and that's a tall order with the way game requirements keep increasing, at least when we're talking about AAA titles like CP2077 or Wukong, for example.
The "classic" performance metrics absolutely still matter for that necessary baseline. Maybe that will change too, but at least right now we're not there yet.

Game developers will have to adapt. The old rasterisation just takes up too much power and is TDP-limited.

You can see where Nvidia is going: using AI to fake it saves memory and gives huge potential for visual upgrades.

I trust Jensen the visionary.
 
Game developers will have to adapt. The old rasterisation just takes up too much power and is TDP-limited.

You can see where Nvidia is going: using AI to fake it saves memory and gives huge potential for visual upgrades.

I trust Jensen the visionary.
Not meant to be offensive, but that is so broad that it makes no sense.
 
FG existed way before the AI buzzwords came around.
TVs have been able to do it for ages.
Faux frames increase lag, making the whole "higher FPS" thing a sort of fruitless mental masturbation.

So, uh, yes.

And TV manufacturers ended up adding a game mode, which typically disables this smoothing in order to reduce input lag.

I don’t know what people are thinking. Supposedly frame gen is the future of gaming. But apparently so is cloud game streaming. Wireless controllers are already here to stay. When you combine Bluetooth + frame gen, it’s already getting dicey. I really doubt you can add in streaming and have an acceptable level of input lag.
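For a rough sense of how those delays stack, here is a back-of-the-envelope budget (every number below is an assumption picked purely for illustration, not a measurement of any particular controller, GPU or streaming service):

```python
# Rough latency-budget sketch for the "stack everything" scenario.
# Every number below is a ballpark assumption for illustration only.

latency_sources_ms = {
    "wireless_controller":  8,    # assumed Bluetooth/RF controller input latency
    "base_render_30fps":   33,    # one real frame at a 30fps base frame rate
    "frame_gen_holdback":  15,    # assumed extra buffering for interpolation
    "cloud_stream_rtt":    40,    # assumed encode + network + decode round trip
    "display_processing":  10,    # assumed monitor/TV processing time
}

total_ms = sum(latency_sources_ms.values())
print(f"stacked input-to-photon latency: ~{total_ms} ms")   # ~106 ms in this sketch
```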
 

Fess

Member
How are they dealing with the low graphics memory? Some new tech that makes 12 and 16GB enough on the 5070 and 5080? Or is the higher speed going to fix it?

I hit the 16GB ceiling on my 4080 Super in Indy with full path tracing; when that happens it completely tanks the framerate and I have to start lowering settings.
 
Last edited:

peish

Member
Not meant to be offensive, but that is so broad that it makes no sense.

I am no AI expert, but the gist I got from Blackwell is the drastic design changes it made. Nvidia believes, from their stats, that it is computationally cheaper to use the AI tensor cores to fill in the gaps of rasterisation.

Then add a slice of RT from the RT cores and you get the future of graphics.

You don't need to keep doubling rasterisation cores to achieve better graphics. You let Nvidia's monster AI servers do the training and transforming offsite, and send the results to your local GPU to AI your games.
 

Puscifer

Member
[Image: nvidia-geforce-rtx-5090-performance-chart.jpg]
Now that the dust has settled, these aren't impressive numbers without pure raster. That's not even DLSS Quality...
 

Puscifer

Member
How are they dealing with the low graphics memory? Some new tech that makes 12 and 16GB enough on the 5070 and 5080? Or is the higher speed going to fix it?

I hit the 16GB ceiling on my 4080 Super in Indy with full path tracing; when that happens it completely tanks the framerate and I have to start lowering settings.
I said the same thing about STALKER 2 and Cyberpunk recently, and people gaslit me, saying it was a memory leak rather than my card tapping out at 4K. It's why I was hoping AMD was going to make good on their promises.
 

peish

Member
I am no AI expert, but the gist I got from Blackwell is the drastic design changes it made. Nvidia believes, from their stats, that it is computationally cheaper to use the AI tensor cores to fill in the gaps of rasterisation.
That's correct, or rather we simply can't build chips to actually do all the work. And actually not only raster but RT is mostly gap-guessing, too.
You don't need to keep doubling rasterisation cores to achieve better graphics. You let Nvidia's monster AI servers do the training and transforming offsite, and send the results to your local GPU to AI your games.
Uuuh, yes and no.
Even if 80+% of what's on your screen is AI guesswork, you still have to have a solid base the NN can extrapolate from, whatever functionality you are using. And for that you need classic compute power. With the way things are going, that can already be asking a lot: for really demanding games we're oftentimes looking at sub-FHD internal resolutions on high(er)-end hardware.
There's a reason why cards still increase not only in AI TOPS but also in classic compute capability the higher in tier you go. The performance increase will (have to) continue on all fronts, even if the balance is heavily shifting atm.
 
Last edited:

llien

Member
And TV manufacturers ended up adding a game mode, which typically disables this smoothing in order to reduce input lag.
Well, yes, and so what? Should I forget that the $2 chip in my 1080p pre-OLED TV could inflate frames just fine?

I don’t know what people are thinking. Supposedly frame gen is the future of gaming. But apparently so is cloud game streaming. Wireless controllers are already here to stay. When you combine Bluetooth + frame gen, it’s already getting dicey. I really doubt you can add in streaming and have an acceptable level of input lag.
The increased lag is not a side effect of inferior tech; it's an inherent part of frame inflation.
There is no future in which it starts to make sense, bar filthy marketing purposes.
 

rofif

Can’t Git Gud
Can someone TL;DR me, without the DLSS and AI bullshit:
what is the raw performance improvement from the 4080 to the 5080?

Or better yet, someone calculate 3080 vs 5070 Ti vs 5080.
 
Last edited:

Puscifer

Member
Can someone TL;DR me, without the DLSS and AI bullshit:
what is the raw performance improvement from the 4080 to the 5080?

Or better yet, someone calculate 3080 vs 5070 Ti vs 5080.
No one can; marketing hit us with a baseball bat so hard yesterday that many of us still haven't come out of the daze. Gamers Nexus and Daniel Owen seemed to be the only outlets NOT circle-jerking over the announcements, the ones making people go "maybe we should wait for third-party analysis to confirm, because the charts don't paint a pretty picture."


My guess? Real raster without DLSS and FG shenanigans will look like a typical gen-on-gen improvement, and you'll need all the extras to make it worthwhile.
 
Last edited:

Arsic

Loves his juicy stink trail scent
I’m broke as a joke or else I’d get a new 5080. That price is justifiable.
 

rofif

Can’t Git Gud
DLSS and AI are part of raw performance now.

If you exclude those two, I'd say around >35% with RT.
No one can; marketing hit us with a baseball bat so hard yesterday that many of us still haven't come out of the daze. Gamers Nexus and Daniel Owen seemed to be the only outlets NOT circle-jerking over the announcements, the ones making people go "maybe we should wait for third-party analysis to confirm, because the charts don't paint a pretty picture."


My guess? Real raster without DLSS and FG shenanigans will look like a typical gen-on-gen improvement, and you'll need all the extras to make it worthwhile.
Interesting. This channel usually makes fun of consoles, not PC, lol.


So now PC gamers were laughing at me for playing at 30 and 60fps?!
While they will be comfortable playing at 20fps that looks like 240fps?
The input lag is still that of a 20-30fps game, just visually displayed like real 240fps (I tried framegen on a 4080 and it was impressive anyway).

I know they are finding ways around this, but you have 3 frames not reacting to input.
In this niktek example, if the base game is 30fps, the cost of DLSS and 4x FG could be (let's be lenient here) 10%.
So you are dealing with a BASE FRAMERATE of 27fps, upscaled to well above 120fps.
And while it will look like it, it must feel like shit to control with a mouse, right?!
I will remember that when people laugh at me for playing 30 and 60fps games :p Although 30fps is extremely rare nowadays on the PS5 Pro.

EDIT: This is not a bait or attack post. Please do not school me on this. I am not baiting or attacking. Take it easy, guys, and have fun with the new graphics cards. I am considering a 5080 myself since I love 30fps.
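A quick sketch of that arithmetic under the post's own assumptions (the 10% cost figure is the example's guess, not a measured number):

```python
# Sanity check of the arithmetic above: a 30fps base and an assumed ~10%
# frame-time cost for running DLSS + 4x frame generation.

base_fps = 30.0
fg_overhead = 0.10                               # assumed cost of DLSS + FG
effective_base = base_fps * (1 - fg_overhead)    # the frames that still react to input
displayed = effective_base * 4                   # 4x multi-frame generation

print(f"effective base: {effective_base:.0f} fps "
      f"({1000 / effective_base:.0f} ms between input-driven frames)")
print(f"displayed: ~{displayed:.0f} fps on the counter")
# In practice DLSS upscaling also raises the base frame rate before FG
# multiplies it, which is how the counter can land even higher than this.
```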
 
Last edited:

Crayon

Member
No one can; marketing hit us with a baseball bat so hard yesterday that many of us still haven't come out of the daze. Gamers Nexus and Daniel Owen seemed to be the only outlets NOT circle-jerking over the announcements, the ones making people go "maybe we should wait for third-party analysis to confirm, because the charts don't paint a pretty picture."


My guess? Real raster without DLSS and FG shenanigans will look like a typical gen-on-gen improvement, and you'll need all the extras to make it worthwhile.

Yeah, I feel like I'm on crazy pills. 5070 = 4090? C'mon now, people.
 

SHA

Member
GPU prices went so far into ridiculous territory that people start to think $549 is not that bad. Where are the actually decent-performing $149 or $299 cards? A successor to the GTX 1650?
And we get fewer worthy AAA games. AI is the real savior in this situation; it has taste, like us.
 

proandrad

Member
Interesting. This channel usually makes fun of consoles, not PC, lol.


So now PC gamers were laughing at me for playing at 30 and 60fps?!
While they will be comfortable playing at 20fps that looks like 240fps?
The input lag is still that of a 20-30fps game, just visually displayed like real 240fps (I tried framegen on a 4080 and it was impressive anyway).

I know they are finding ways around this, but you have 3 frames not reacting to input.
In this niktek example, if the base game is 30fps, the cost of DLSS and 4x FG could be (let's be lenient here) 10%.
So you are dealing with a BASE FRAMERATE of 27fps, upscaled to well above 120fps.
And while it will look like it, it must feel like shit to control with a mouse, right?!
I will remember that when people laugh at me for playing 30 and 60fps games :p Although 30fps is extremely rare nowadays on the PS5 Pro.

EDIT: This is not a bait or attack post. Please do not school me on this. I am not baiting or attacking. Take it easy, guys, and have fun with the new graphics cards. I am considering a 5080 myself since I love 30fps.

There are other techniques in effect to lower input lag, and they are the reason why Nvidia's frame gen works well. If there weren't, it would be pretty worthless.
 
Last edited:

Hot5pur

Gold Member
The 5090 vs 4090 comparison... damn.
Seems like the 30xx and 40xx uplifts were far better than the 50xx one. They have to flex that MFG marketing when even regular FG was, and remains, half-baked at best. Heck, even base DLSS has artefacts at times.
Probably the most disappointing gen in a while: just gimmicks to sell a slightly (20%?) improved version of the same thing, still keeping the gimped VRAM.
 

Luipadre

Member
Interesting. This channel usually makes fun of consoles, not PC, lol.


So now PC gamers were laughing at me for playing at 30 and 60fps?!
While they will be comfortable playing at 20fps that looks like 240fps?
The input lag is still that of a 20-30fps game, just visually displayed like real 240fps (I tried framegen on a 4080 and it was impressive anyway).

I know they are finding ways around this, but you have 3 frames not reacting to input.
In this niktek example, if the base game is 30fps, the cost of DLSS and 4x FG could be (let's be lenient here) 10%.
So you are dealing with a BASE FRAMERATE of 27fps, upscaled to well above 120fps.
And while it will look like it, it must feel like shit to control with a mouse, right?!
I will remember that when people laugh at me for playing 30 and 60fps games :p Although 30fps is extremely rare nowadays on the PS5 Pro.

EDIT: This is not a bait or attack post. Please do not school me on this. I am not baiting or attacking. Take it easy, guys, and have fun with the new graphics cards. I am considering a 5080 myself since I love 30fps.


Wrong. That's what Reflex is for, and now with Reflex 2 it will be even lower.
 

CuNi

Member
RTX 50 are AI GPUs. It's not just marketing; Nvidia clearly shifted the design toward AI rendering with that huge bump in tensor cores.

Games are going to be trained and AI-rendered in the future.

Teraflops and rasterisation are old and useless.
I also feel like people will get disappointed if they now hold out for the 6000 series, as Nvidia will just double down on AI and RT/PT instead of raster performance.
Sure, we will continue to see slight uplifts there too, but most new silicon and innovation will be dedicated to AI.
With DLSS, they don't need hardware that can natively run 4K raster anymore. They'll target 1440p or 1080p performance and then pump the rest into DLSS to generate the 4K output.
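As a rough illustration of why that works out, here is the raw pixel-count arithmetic (the DLSS-mode mappings in the comments are approximate, and real costs don't scale perfectly with pixel count):

```python
# Pixel-count arithmetic behind "render low, upscale to 4K".
# These are raw resolution ratios only; actual DLSS modes, costs and image
# quality trade-offs vary per game and per preset.

resolutions = {
    "1080p": (1920, 1080),   # roughly DLSS Performance internal res at 4K output
    "1440p": (2560, 1440),   # roughly DLSS Quality internal res at 4K output
    "4K":    (3840, 2160),
}

target_w, target_h = resolutions["4K"]
target_px = target_w * target_h

for name, (w, h) in resolutions.items():
    ratio = target_px / (w * h)
    print(f"{name} internal -> 4K output: {ratio:.2f}x fewer pixels to shade")
```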
 