MikeM
Member
> FE, but since I'm in Quebec it's going to be hard. So it's going to be Asus or Zotac for me. The Astral looks godly: https://rog.asus.com/graphics-cards/graphics-cards/rog-astral/rog-astral-rtx5090-o32g-gaming/
Poor you, my friend. Cross the bridge to Ontario?
I don't think NVidia's most expensive products will dominate the market.
> Poor you, my friend. Cross the bridge to Ontario?
Quebec City to Ottawa is a hell of a ride lol
How can you say that when the 4090 and 4080S have a commanding share of 2% combined in the Steam survey?!?!
> Quebec City to Ottawa is a hell of a ride lol
How do you get around the Quebec restrictions? Buy from the US?
> How do you get around the Quebec restrictions? Buy from the US?
Unfortunately, 3rd-party sellers, aka scalpers.
> Unfortunately, 3rd-party sellers, aka scalpers.
Man, that sucks. Thanks, language police?
> Man, that sucks. Thanks, language police?
Exactly.
> Quebec City to Ottawa is a hell of a ride lol
That ain't shit unless you've got bad weather.

> That ain't shit unless you've got bad weather.
I'm not driving 10h for a GPU.
> You will soldier!
Yes, sir!!!!!
> oops
> NVIDIA's Blackwell AI Servers Faced With Overheating & Glitching Issues; Major Customers, Including Microsoft & Google, Start Cutting Down Orders ("NVIDIA's Blackwell AI servers are facing a supply chain bottleneck as Team Green fails to resolve the overheating and architectural flaws." wccftech.com)
More chips are going to become available, I assume. Prices go down?
> Some more prebuilds at Best Buy
The upgrade on my laptop is gonna be $5k; I can feel my accountant giving me shit already.
> How can you say that when the 4090 and 4080S have a commanding share of 2% combined in the Steam survey?!?!
The 3% for 4090/4080s represents like $6-7 billion of revenue; the 9% for 4060/4060 Tis represents like $4 billion. It costs around $350 to build a 4080 and $450-$500 for a 4090, meaning profit margins are 200%+. 4060/4060 Ti margins are likely only 40% at best. Similarly, people with the high-end cards likely spend multiple times more on games and microtransactions. The high-end cards absolutely do dominate the industry in dollars, profit, and spending.
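For what it's worth, the 200%+ figure does check out if you read it as markup over build cost. A minimal sanity-check sketch, where the build costs are the poster's estimates and the launch MSRPs ($1,199 for the 4080, $1,599 for the 4090) are my assumptions:

```python
# Markup over estimated build cost for the claim above.
# MSRPs are assumed launch prices; build costs are the poster's estimates.
cards = {
    # name:      (assumed MSRP $, estimated build cost $)
    "RTX 4080": (1199, 350),
    "RTX 4090": (1599, 475),  # midpoint of the $450-$500 estimate
}

for name, (msrp, cost) in cards.items():
    markup = (msrp - cost) / cost
    print(f"{name}: ~{markup:.0%} markup")  # 4080 ~243%, 4090 ~237%
```

Note this treats "profit margin" as markup over the bill of materials; it ignores R&D, software, and marketing costs, which is exactly the objection raised later in the thread.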
Not hate troll bait, just a unique observation from a 4090 owner. People who have a 4090 rarely say this.
At 21:30 Jeff says the FG on the 4090 is unacceptable and feels like playing on GeForce Now.
I like how he says "I am here to fucking play video games and not add GeForce Now latency".
It's kinda rare to see a 4090 owner shitting on FG.
I personally don't see any reason to use it. With a 4090-5090 you already have amazing framerates with DLSS 2 (which I always use)... so why turn 70fps into 240 but add more lag and potential artifacts? Are you playing the game, or watching the fps number RTSS puts in the corner?
> After more than 25 years of playing fast-paced FPS games such as Quake and Unreal Tournament, I am very sensitive to input lag. I can tell the difference in input latency between 170Hz VSYNC and 167Hz VRR (VRR cancels VSYNC input lag). DLSS FG increases input latency a little, but it's not as big as the VSYNC input lag on my monitor. In games like Cyberpunk that increase is in placebo territory even for me, so comparisons to "GeForce Now" suggest to me that the settings are to blame. Nvidia FG adds unacceptable lag (100ms on top of your base input latency) if you have VSYNC turned on or try to limit the framerate with RTSS (RivaTuner). Some people may use the wrong settings and then think DLSS FG is to blame.
> I enable DLSS FG in all my games because it always improves image quality (on a sample-and-hold display, more fps equals a sharper image during motion) and because it also improves my aiming, since my eyes can track objects much more easily. I can play Cyberpunk at 40-50fps and have a good time (especially on a gamepad), but with FG on top of that my aim improves DRASTICALLY. Even if I have decent fps (80fps), some games might still dip below 60fps from time to time. Normally that would bother me, but since I started using FG I no longer care about sub-60fps.
> Here's evidence to support my claims.
> Cyberpunk at 1440p TAA native + psycho RT: I get 73fps with these settings and 28ms latency (shown in the top right-hand corner of my monitor screen).
> The same settings with FG: 124fps and 37ms latency. Just 9ms more, but it made the game look MUCH sharper during motion and I could aim much more easily.
> Another game, BMW:
> 4K DLSS Performance, very high settings, full RT: 57fps and 63ms input latency.
> With DLSS FG: 88fps and 54ms. That's 9ms less even compared to no FG, so in this particular game FG offers the best latency possible (because it activates Nvidia Reflex).
> However, not all games have such low input latency with DLSS FG. In Alan Wake 2 I measured an additional 20ms of input latency with DLSS FG. The game is still perfectly playable on a gamepad, but with M+KB I started to feel the increased input delay (though it still wasn't "GeForce Now" bad).
> Alan Wake 2, 4K DLSS Performance, max settings, full PT: 75fps and 40ms latency.
> With DLSS FG: 108fps and 60ms latency.
> Overall I'm very happy with the results of DLSS FG. I can see problems in FSR3 FG and Lossless Scaling FG, but DLSS FG works so well that I consider it a free performance boost and use it in every single game (even Alan Wake 2).
Good post. Something I would 100% try and use in a few games when I get a 5080 or something.
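To put the quoted numbers side by side, here's a minimal sketch that restates them and computes the motion-clarity gain (frame time) next to the latency cost; I'm assuming "BMW" refers to Black Myth: Wukong:

```python
# (fps, latency ms) without and with DLSS FG, as quoted in the post above.
samples = {
    "Cyberpunk 2077, psycho RT": (73, 28, 124, 37),
    "BMW, full RT":              (57, 63, 88, 54),
    "Alan Wake 2, full PT":      (75, 40, 108, 60),
}

for game, (fps_off, ms_off, fps_on, ms_on) in samples.items():
    # On a sample-and-hold display, shorter frame times mean sharper motion.
    ft_off, ft_on = 1000 / fps_off, 1000 / fps_on
    print(f"{game}: {ft_off:.1f} -> {ft_on:.1f} ms/frame, "
          f"input latency {ms_on - ms_off:+d} ms")
```

With these inputs, frame time drops by roughly a third to a half in every game while latency moves by -9 to +20 ms, which is the trade-off the post is describing.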
> After more than 25 years of playing fast-paced FPS games such as Quake and Unreal Tournament, I am very sensitive to input lag. …
What app are you using to measure the latency?
> More chips are going to become available, I assume. Prices go down?
That's datacenter GPUs, very different.
> Good post. Something I would 100% try and use in a few games when I get a 5080 or something.
I always use Nvidia Reflex when the game supports it, so when I said the input latency in Cyberpunk was in placebo territory for me, I was comparing both modes with Reflex on.
What if you used Reflex without FG? Does it feel better for you?
> What app are you using to measure the latency?
The GeForce Experience app. You need to enable hardware monitoring.
> I'm not driving 10h for a GPU.
It's 500 miles to Ottawa, we have a full tank of gas, half a pack of cigarettes, it's dark, and we're wearing sunglasses! Hit it!
> I'm not driving 10h for a GPU.
I mean, yeah, 10 hours, like me going from Ottawa to Toronto, is too far for a GPU; by the time I'm there, I've either been ghosted or it's sold. But if it's a 2-hour drive like Ottawa to Montreal, then I don't mind doing it, especially on the weekend. Take my family for a day trip.
> The 3% for 4090/4080s represents like $6-7 billion of revenue … The high-end cards absolutely do dominate the industry in dollars, profit, and spending.
That huge and profitable high-end GPU market that AMD and Intel can't survive in? Even if it's just that they make shit cards that can't compete, Nvidia and the AIB vendors sure do leave a lot of money on the table by producing so few of those cards.
Even if the cards only have $400-$500 of hardware in them, you can't just ignore the R&D and marketing costs, and your thoughts on owners of those cards spending more on games and DLC are pure conjecture.
> Some more prebuilds at Best Buy
These are cheap compared to what similar prebuilds will cost in Europe. Some versions of the 5090 alone will probably cost close to that 5090 prebuild.
I came across these two videos and my mind has now been opened to Nvidia's smoke and mirrors. Surprise surprise - the 5000 series overall sucks.
> Nah, it's such youtubers that suck. Clickbait titles, clickbait faces, zero useful content.
Clickbait or not, it's difficult to argue with the raw figures.
> So a series that has yet to release sucks and saw no improvements? Sure.
When the actual hardware (based on CUDA/shader counts) shows such limited improvement over its predecessor, yes.
> I will wait for someone like Gamers Nexus to destroy NVidia if indeed the 5000 series sucks overall.
Yep, awaiting benchmarks, but I don't magically expect a 20% improvement for most of the lineup if the hardware is practically the same.
> Imagine trusting this fucking guy to educate you on technology.
Skip the video and look at this post.
> Imagine trusting this fucking guy to educate you on technology.
I thought the Bogdanoff twins had died.
GeForce RTX 5090: $1,999. RTX 5080: $999. RTX 5070 Ti: $749. RTX 5070: $549. (Availability starting Jan 30 for the RTX 5090 and 5080.)
> Looks similar to the 4000 series cards, where the 5090 is a very nice jump and the 5080 is not.
The 4000 series was a considerable jump over the previous gen (3000): the 4080 is 25% faster than the 3090 Ti and 50% faster than the 3080.
At least the 5080 is priced at $999 instead of the $1,199 they asked for the 4080.
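As a rough value comparison, a sketch assuming those two MSRPs; the 5080's performance uplift over the 4080 is a placeholder, not a benchmark result:

```python
# Perf-per-dollar at MSRP. The uplift figure is hypothetical.
msrp_4080, msrp_5080 = 1199, 999
uplift = 1.10  # assume the 5080 is 10% faster than the 4080 (placeholder)

value_4080 = 1.0 / msrp_4080     # normalized performance per dollar
value_5080 = uplift / msrp_5080
print(f"perf/$ gain: {value_5080 / value_4080 - 1:.0%}")  # ~32% with these inputs
```

Even at flat performance (uplift = 1.0), the price cut alone would be a ~20% perf-per-dollar improvement.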
> Clickbait or not, it's difficult to argue with the raw figures.
Is this graph from Nvidia? I didn't see anything without RT on it previously. If so, it seems normal raster is shit and the upgrades are in RT workloads and frame gen. Zzzzzzz.
Overall, the 5000 series has been a sideways refresh with a software patch. The 5090 has an increase in performance, with increased power consumption and a higher total price. The rest of the cards? Raster performance no better than a two-year-old GPU.
> NVIDIA's own charts show the 5080 is a software patch over the 4080.
You mean that chart?