
RTX 5080 / 5090 Reveal Showcase | OT | $1999 of Monopoly | 6:30 PM PT - 2:30 AM GMT

Will you buy the new cards?

  • Yes!

    Votes: 91 27.7%
  • No.

    Votes: 153 46.5%
  • Yes, but I will wait for a discount!

    Votes: 39 11.9%
  • Console is better

    Votes: 46 14.0%

  • Total voters
    329
  • Poll closed.
Ngl, this is terrible. It's looking like ~30% real gains across the board, and the frame-gen tomfoolery is there to obfuscate the numbers. Considering that the notes suggest the games are rendered at 4K max settings, it's pretty bad. Performance gains look to be worse than Ada's, maybe barring the low end. Wild.
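If you strip the frame-gen out and compare the bars on something like equal footing, the math looks roughly like this (the fps numbers and multipliers below are made up for illustration, not Nvidia's actual figures):

```python
# Rough normalization of frame-gen-inflated fps bars (illustrative numbers
# only; the multipliers and fps values are made up, not Nvidia's).
def rendered_fps(displayed_fps: float, fg_multiplier: int) -> float:
    """Strip generated frames so only rendered frames are compared."""
    return displayed_fps / fg_multiplier

old_card = rendered_fps(displayed_fps=120, fg_multiplier=2)  # 60 rendered fps
new_card = rendered_fps(displayed_fps=320, fg_multiplier=4)  # 80 rendered fps

print(f"Real uplift: ~{100 * (new_card / old_card - 1):.0f}%")  # ~33%, not the ~2.7x on the bar
```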

This may be hyperbolic on my part, but anytime a company's stock appreciates this much and employees can cash out on their RSUs, the company starts a slow decline in innovation. I suspected it might happen with Nvidia, and this lacklustre release only confirms it for me. These guys are about to enter a coasting phase.

Yep. One of the best things about upgrading for me is going back to older games that don't have good anti-aliasing and brute-forcing good image quality and framerates. I was hoping the 5090 would be at least 70% faster than the 4090, but only 25-30% is very disappointing.
I mean, a 4090 is roughly 77% faster than a 3090. They are relying entirely on AI and fake frames, which honestly would be fine if all games were supported, but they aren't.
 

A2una1

Member
Yep. One of the best things about upgrading for me is going back to older games that don't have good anti-aliasing and brute-forcing good image quality and framerates. I was hoping the 5090 would be at least 70% faster than the 4090, but only 25-30% is very disappointing.
I mean, a 4090 is roughly 77% faster than a 3090. They are relying entirely on AI and fake frames, which honestly would be fine if all games were supported, but they aren't.
It seems like every graphics card gen with Nvidia now, there is a technology exclusive only to that gen. It is annoying af.
 

DenchDeckard

Moderated wildly
[Lets Go reaction GIF by The Lonely Island]
 

FireFly

Member
At worst it's gonna be a 4080-class card; comparing CUDA cores and clocks across generations is basically pointless.
The 3080 and 4070 have the same amount of compute and perform very similarly, other than in specific RT applications. I can't see any mention from Nvidia of a general IPC uplift for Blackwell, and the 5090 looks to scale roughly with the increase in CUDA cores (once you take out the FG obfuscation). So I don't see how a 5070 is supposed to compete with a card that has almost 60% more CUDA cores, other than, again, in specific RT applications.
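To put rough numbers on the 3080/4070 comparison (the core counts and boost clocks below are approximate public specs, so treat this as a back-of-envelope sketch rather than gospel):

```python
# Back-of-envelope FP32 throughput = 2 FLOPs (one FMA) per CUDA core per clock.
# Core counts / boost clocks are approximate public specs, not thread data.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return 2 * cuda_cores * boost_ghz / 1000

cards = {
    "RTX 3080": (8704, 1.71),
    "RTX 4070": (5888, 2.48),
}

for name, (cores, clock) in cards.items():
    print(f"{name}: ~{fp32_tflops(cores, clock):.1f} TFLOPS")
# Both land around ~29 TFLOPS, which is why they trade blows outside of heavy RT.
```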
 
Last edited:

Rivdoric

Member
It seems like every graphics card gen with Nvidia now, there is a technology exclusive only to that gen. It is annoying af.

Again, as said by the fellow you quoted, this wouldn't be an issue if there were REAL perf gains behind it. The raster difference from 3090 to 4090 is enormous.
4090 vs 5090 will be 20-30% at best judging by what we saw, all of it obscured by their marketing BS of AI frames, which has completely replaced anything else.

I could be surprised when reviews come out, but this looks to be an incredibly meh generation, and it was easily foreseen: they have no competition, so they don't care about giving us +75% performance when others can't even match their previous gen. They just add 2 new fake images and done, nEw gEnErAtIoN.

I just hope enthusiasts, who are the core of the x90 line, will see through that foggish slop of a presentation and read the little lines at the bottom of each graph.
 
Last edited:

evanft

Member
I bet the actual uplift gen-to-gen will be around 25% in raster, maybe a bit more in RT. Excited to see the updates to DLSS roll down to the previous gen cards.
 

A2una1

Member
Again, as said by the fellow you quoted, this wouldn't be an issue if there were REAL perf gains behind it. The raster difference from 3090 to 4090 is enormous.
4090 vs 5090 will be 20-30% at best judging by what we saw, all of it obscured by their marketing BS of AI frames, which has completely replaced anything else.

I could be surprised when reviews come out, but this looks to be an incredibly meh generation, and it was easily foreseen: they have no competition, so they don't care about giving us +75% performance when others can't even match their previous gen. They just add 2 new fake images and done, nEw gEnErAtIoN.

I just hope enthusiasts, who are the core of the x90 line, will see through that foggish slop of a presentation and read the little lines at the bottom of each graph.
To be totally honest, we don't know yet. Especially with the 5090: on their sheet comparing the 4090 and 5090 it says DLSS ultra performance. That would mean 1080p internally, so there could be a CPU bottleneck in play. Should be interesting when reviews hit. But holy moly, 575W.....
 
Last edited:

Rivdoric

Member
To be totally honest, we don't know yet. Especially with the 5090: on their sheet comparing the 4090 and 5090 it says DLSS ultra performance. That would mean 1080p internally, so there could be a CPU bottleneck in play. Should be interesting when reviews hit.

Indeed.

But again, this is a first for Nvidia, not comparing brute performance at all.
Which is absolutely not a good omen.
They clearly don't want to display the result; otherwise it would have been shown, as this is still one of the most important data points when deciding whether to change GPU.
 
Last edited:

MMaRsu

Member
The Fake Frame Dream is getting better and better.

"Oh my, I can finally enjoy an RT game at 240fps with a base framerate of 20 and call it smooth, the future has never been so real!"

RTX 6090: 1 real frame, 99 fake, 100% real price.

Who cares about fake frames or real frames? If performance is improved, and I don't notice it during gaming (especially SP games), why would I give a fuck about fake or real frames?

Lmao
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Just stop spreading BS

What BS am I spreading?
MSI did stop working with AMD.
Asus and Gigabyte are using their cheaper coolers.
AMD's close partners aren't considered premium, while MSI is.
The slide I posted is from AMD themselves, so unless AMD are spreading BS, I'm not guilty.


 
Last edited:
Who cares about fake frames or real frames? If performance is improved, and I don't notice it during gaming (especially SP games), why would I give a fuck about fake or real frames?

Lmao
Fake frames can't replace real fps, regardless of what PR bullshit is thrown around. The base framerate is what matters the most for responsiveness.
Nothing gained if a game looks smooth but plays like crap.
That one PR sheet with the CP2077 fps increase from ~20ish to >200? That's gonna look nice... and play like utter shite.
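Rough napkin math on why the base framerate still rules (the fps values and multiplier here are illustrative, not Nvidia's latency figures):

```python
# Rough illustration of why the base framerate still governs responsiveness.
# The fps values and multiplier are illustrative, not figures from Nvidia.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

base_fps = 20        # frames the game actually renders (and samples input for)
multiplier = 4       # e.g. 4x multi frame generation
displayed_fps = base_fps * multiplier

print(f"Counter shows: {displayed_fps} fps")
print(f"Input still sampled every ~{frame_time_ms(base_fps):.0f} ms")  # ~50 ms
# The motion looks smoother, but the game still reacts at a ~20 fps cadence.
```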
 
Last edited:
Improved DLSS image reconstruction for RTX 40 / 50 series cards sounds awesome. Even 3.8.1 DLSS was far superior to other image reconstruction methods (FSR, XeSS, PSSR), and now we will get even better image quality with DLSS. But I'm not so sure if 3-4x frame generation will be usable. I tried Lossless Scaling FG and the number of motion artefacts increased dramatically from 2x FG to 3x FG.
 

MMaRsu

Member
Fake frames can't replace real fps, regardless of what PR bullshit is thrown around. The base framerate is what matters the most for responsiveness.
Nothing gained if a game looks smooth but plays like crap.
That one PR sheet with the CP2077 fps increase from ~20ish to >200? That's gonna look nice... and play like utter shite.

DLSS4 has 1ms input delay.

So no, stop talking out of your ass.
 
Here's me thinking I had fun playing Cyberpunk, Diablo, Flight Sim for all those hundreds of hours with "fake" frames. Never had a problem using frame gen 🤷‍♂️🤷‍♂️
[Batman facepalm GIF]

you're sporting a 1k+ 4080. So guess what, the base framerate isn't an issue for you.
DLSS4 has 1ms input delay.

So no, stop talking out of your ass.
we're not talking about additional latency from FG, we're talking about the base. So maybe understand what you're talking about before responding...
 
Last edited:

Makoto-Yuki

Gold Member
The detractors act like every game is Counter Strike 2 or they're some professional esports player.
the funny thing is you don't even need frame gen for CS2/Valorant etc lmao. I can hit 360fps no problem in both of them. There is no need for frame gen in esports games and if a game did support it you shouldn't use it.

for single player games it's great.

[Batman facepalm GIF]

you're sporting a 1k+ 4080. So guess what, the base framerate isn't an issue for you.
Yes, and do you want to know what fps games run at without DLSS/frame gen on a 4080?

Cyberpunk and Flight Sim are very demanding games. Cyberpunk WITH frame gen runs at 80-110fps with fully maxed-out settings and path tracing.
 

MMaRsu

Member
[Batman facepalm GIF]

you're sporting a 1k+ 4080. So guess what, the base framerate isn't an issue for you.

we're not talking about additional latency from FG, we're talking about the base. So maybe understand what you're talking about before responding...

I haven't really used FG much, but I doubt it will be an unresponsive mess with DLSS4.
 
Yes, and do you want to know what fps games run at without DLSS/frame gen on a 4080?
Yes, I do. Also stop tossing DLSS upscaling and framegen into the same bucket...
I haven't really used FG much, but I doubt it will be an unresponsive mess with DLSS4.
ofc not. If the base framerate is good, the new multi-frame framegen will be exponentially better.
I'm not advocating against FG, I'm advocating against people not using their brains and going by inflated framegen numbers when making hardware decisions.
The real fps have to be good to make framegen good, it's really as simple as that.
 
Last edited:
[Batman facepalm GIF]

you're sporting a 1k+ 4080. So guess what, the base framerate isn't an issue for you.

we're not talking about additional latency from FG, we're talking about the base. So maybe understand what you're talking about before responding...
I tested some games at extremely high settings, and even 35fps base with Nvidia FG was absolutely usable, especially on gamepad. You would think that FG on such a low base framerate would destroy latency and make aiming more difficult, but FG actually made my aiming more precise. The input latency difference wasn't noticeable, however my eyes could track moving objects much more easily.

I usually game at well over 60fps base (real fps), but FG helps immensely with sub-60fps dips. This technology works very well and I can't imagine turning FG off, especially in the most demanding PT games.
 

Makoto-Yuki

Gold Member
Do we know the dimensions of the 5090? I want to know if it'll fit in my case....

EDIT: nvm, I learned how to google
 
Last edited:

MMaRsu

Member
Yes, I do. Also stop tossing DLSS upscaling and framegen into the same bucket...

ofc not. If the base framerate is good, the new multi-frame framegen will be exponentially better.
I'm not advocating against FG, I'm advocating against people not using their brains and going by inflated framegen numbers when making hardware decisions.
The real fps have to be good to make framegen good, it's really as simple as that.

Either way, a 5070 will be a huge step up for someone on a 3070.

Not falling for the FG stuff again. Framegen with 40xx has bad artefacts and ghosting in a lot of games; I often have to turn it off. I imagine multi FG will be even worse. Easy gen skip. Seems a 4090 might be good enough to even skip the next gen after this, as the current gen uplift is barely 30% lol.
DLSS 4 has been very much improved.
 
Last edited:

Makoto-Yuki

Gold Member
omg, they recommend 1000W for the 5090, so I shouldn't need to get a new PSU as mine is 1000W. If the card uses up to 575W, that still leaves 425W for everything else: 7950X3D, motherboard, SSDs, AIO.

And the 5090 Founders Edition is the same length/width as the 4080 Founders Edition.

I might actually get a 5090 then!
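Napkin math on the headroom (the non-GPU draws below are my rough guesses, not measured or vendor-published numbers):

```python
# Quick PSU headroom check. The non-GPU figures are rough assumptions,
# not measured values.
psu_watts = 1000
draws = {
    "RTX 5090 (max TGP)": 575,
    "Ryzen 9 7950X3D (PPT)": 162,
    "Motherboard / RAM / SSDs": 60,
    "AIO pump + fans": 25,
}
total = sum(draws.values())
print(f"Estimated peak draw: {total} W, headroom: {psu_watts - total} W")
# ~822 W peak leaves ~178 W of margin, so a decent 1000 W unit should cope.
```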
 
Last edited:

buenoblue

Member
Might be in for a 5080. I bought a 4070 Ti last spring for £599; it's been pretty great, but I've run up against the VRAM limit a few times already. I still have my 2070 Super, and looking on eBay I could sell them for £650.

Will a 5080 be a significant upgrade from the 4070 Ti, worth £400??
 
hmmm, much of this is software; the raw perf isn't that much higher than the 40xx series, but I do think the extra GDDR7 VRAM is going to be very useful in the future.

I suggest getting the sweet spot GPU - the RTX 5080 - or just skip if you have a 4080 or above (and if you want portability, the 5070 Ti Mobile is enough; gaming laptops are limited by design and power anyway).
 
Last edited:

nkarafo

Member
Lol, why does the performance of the 5070 matter when it's going to be bottlenecked by its 12GB VRAM?

It's not going to be able to handle modern games at high settings above 1080p.

Imagine paying up to $600 in 2025 for a card that's only capable of 1080p for a couple of years. The planned obsolescence by Nvidia is insane, and so many people are falling for it.
 

Buggy Loop

Member
Yes, I do. Also stop tossing DLSS upscaling and framegen into the same bucket...

ofc not. If the base framerate is good, the new multi-frame framegen will be exponentially better.
I'm not advocating against FG, I'm advocating against people not using their brains and going by inflated framegen numbers when making hardware decisions.
The real fps have to be good to make framegen good, it's really as simple as that.

[Grampa Simpson GIF]


Mah raster! But what about mah raster?!

This is the future, old man.

NPUs are so efficient that everything is going there.

One day you'll be looking at fully inferred graphics.
 

Buggy Loop

Member
Lol, why does the performance of the 5070 matter when it's going to be bottlenecked by its 12GB VRAM?

It's not going to be able to handle modern games at high settings above 1080p.

Imagine paying up to $600 in 2025 for a card that's only capable of 1080p for a couple of years. The planned obsolescence by Nvidia is insane, and so many people are falling for it.

They have texture compression with AI now.

The whole « old » set of comparisons between previous cards flew out the window last night.

Nvidia redid the whole graphics pipeline.

All RDNA 2 vs Ampere or RDNA 3 vs Ada benchmarks will have to be redone, especially with updated games such as Alan Wake 2, which will support Mega Geometry first.
 

Rudius

Member
Multi-frame generation at 4X will be fantastic for 240Hz OLEDs and future 480Hz at 4K with DSC, but at 3X it should be good too for 144Hz monitors at a native 48fps, which is the standard VRR floor for a good reason. Playing with a controller this should be fine.
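The arithmetic behind that, as a quick sketch (just division, no vendor data):

```python
# Base (rendered) framerate needed to saturate a given refresh rate at each
# frame-generation multiplier. Pure arithmetic, no vendor data involved.
targets_hz = [144, 240, 480]
multipliers = [2, 3, 4]

for hz in targets_hz:
    needed = ", ".join(f"{m}x -> ~{hz / m:.0f} fps base" for m in multipliers)
    print(f"{hz} Hz: {needed}")
# e.g. 144 Hz at 3x needs ~48 fps base (the common VRR floor),
#      240 Hz at 4x needs ~60 fps base.
```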
 

ShaiKhulud1989

Gold Member
Multi-frame generation at 4X will be fantastic for 240Hz OLEDs and future 480Hz at 4K with DSC, but at 3X it should be good too for 144Hz monitors at a native 48fps, which is the standard VRR floor for a good reason. Playing with a controller this should be fine.
Input latency with 4x FG enabled will make your 240Hz panel basically useless. The ideal scenario is DLSS + Reflex without any FG or, maybe, even ray reconstruction.
 

nkarafo

Member
They have texture compression with AI now.

The whole « old » set of comparisons between previous cards flew out the window last night.

Nvidia redid the whole graphics pipeline.

All RDNA 2 vs Ampere or RDNA 3 vs Ada benchmarks will have to be redone, especially with updated games such as Alan Wake 2, which will support Mega Geometry first.
So they are selling you cards gimped at the hardware level and trying to compensate at the software level. But the price is the same as if the hardware weren't gimped at all, so all these software advancements are for their own benefit ONLY. And that's assuming it will work the way you hope it will.

So you rely on their software now (while knowing their cards are gimped and bottlenecked) and you 100% trust them on this.
 

rofif

Can’t Git Gud
It will have better image quality at interpolated 240 than at native 60, less blur.
Maybe.
But in the slides they showed, it's the same 35ms as at the lower fps.
Better to just enable Reflex and not use FG?
 

MMaRsu

Member
Lol, why does the performance of the 5070 matter when it's going to be bottlenecked by its 12GB VRAM?

It's not going to be able to handle modern games at high settings above 1080p.

Imagine paying up to $600 in 2025 for a card that's only capable of 1080p for a couple of years. The planned obsolescence by Nvidia is insane, and so many people are falling for it.

My 3070 from 2020 handles every game bar Wu Kong fine at 1440p or 4K with quality upscaling. What a bunch of nonsense.

Unless you want to put everything on max, but why would you want that with a 5070/3070?
 

nkarafo

Member
My 3070 from 2020 handles every game bar Wu Kong fine at 1440p or 4K with quality upscaling. What a bunch of nonsense.
Callisto Protocol uses up to 11GB @ 1080p on my end.

Unless you want to put everything on max, but why would you want that with a 5070/3070?
Gee, I don't know.

When I bought my $300 1060, I was able to play most games at max settings @ 1080p/60fps. I suppose expecting the same with a $600 card is too much in this day and age?

I'm not even talking about 4K here or even ray tracing. Just 1080p/non-RT. You won't be able to do that soon with only 12GB. Expect to have to reduce textures to medium in less than a year (with a $600 card).

It's amazing how much standards have been lowered and how consumers have accepted it.

Again, let that sink in.... 12GB for a $600 card in 2025. I guess I'm the crazy one here.
 
Last edited:

MMaRsu

Member
Callisto Protocol uses up to 11GB @ 1080p on my end.
Yes and Wu Kong is also unoptimized trash.

Uses 11GB on what? Ultra textures? RT max?

I know I can't max out a card like the 3070, so I always go for a mix of high/medium settings which usually still looks great and performs great.

Why would I go for absolute max settings when it impacts my performance?
 
Last edited:

Zathalus

Member
The Euro prices are significantly higher because the Euro has taken a nose dive in value vs the dollar over the past few years.
 