
Let's face it: GPU manufacturers have hit the wall and things are only going to get worse from here.

Shifty1897

Member
I expected this to be a low effort thread but you made some good points and I found your post very interesting and insightful. Great job OP!
 

Three

Gold Member
Has anyone worrying about this stuff considered that the real limitation is the sheer amount of work and complexity involved in creating software that maximizes the potential of the hardware?

The reason middleware engines like UE have become dominant is that rolling your own engine is such a monumental task that very few are willing to entertain the idea; and that's just creating the toolset, not a finished product.

I hate to point this out, but has movie CG improved that much over the past decade or so? No, it hasn't. If anything, it's declined, due to teams being given inadequate time for polish passes.

This is way less complex work than games because there's no interactive aspect to consider, and the creators have absolute control over what is and isn't presented and essentially have unlimited compute capacity to create their vision offline.

Crazy thought; maybe you should spend more time appreciating WHAT IS, and how much work has gone into it, rather than belly-aching over the "failure" to attain some fantasy level that nobody ever seems to quantify outside of metrics that are dubiously relevant.
People want their cake and to eat it too. There's just no money in spending all this time maximising hardware usage and amazing graphics only for it to flop like The Order: 1886. The AAA games market has contracted, and there are people who even gun for its demise by not appreciating any of it. Then they turn around and say that the leaps aren't big enough. The way the industry has tried to deal with this decline and increased risk/cost, while still delivering those leaps, is to make development cheaper by offloading the burden onto the hardware and middleware. UE5, AI, and RT are the main drivers of that now.
 

Gaiff

SBI’s Resident Gaslighter
Dying? Open your eyes and do some research.
Why do that when you can just troll?

PS6 will 100% disappoint in raw numbers and visuals. We are at the limits of $499 and a 230W TDP. Nvidia-powered PCs will leave consoles further in the distance
The odds of a $500 PS6 in 2028 are almost nil. $600 is more likely. Same price as the PS5 7 years later? I doubt it. Maybe if they have a cheaper DE and a disc version…even still, the DE could be $600 and the disc version $700.
 

Gaiff

SBI’s Resident Gaslighter
Bolded is quite an understatement; Sony's HW design team led by Cerny is legendary. The Vita, PS4/Pro, and PS5/Pro are all masterfully designed. I'm sure the PS6 will surprise everyone.
PS4, masterfully designed. A tablet CPU with a 5400 RPM HDD over a USB 2.0 interface. Then you got the PS4 Pro, a massively unbalanced system.

They did what they could with the budget and resources they had, but calling those systems masterfully designed is comical.
 

DoubleClutch

Gold Member
People want their cake and to eat it too. There's just no money in spending all this time maximising hardware usage and amazing graphics only for it to flop like The Order: 1886. The AAA games market has contracted, and there are people who even gun for its demise by not appreciating any of it. Then they turn around and say that the leaps aren't big enough. The way the industry has tried to deal with this decline and increased risk/cost, while still delivering those leaps, is to make development cheaper by offloading the burden onto the hardware and middleware. UE5, AI, and RT are the main drivers of that now.

The Order: 1886 was good, but I never want to see that 21:9 bullshit called 1080p again.
 

DoubleClutch

Gold Member
None of this is a problem for me. The development has to catch up to the tech. Sure, we have better geometry, textures, shinier surfaces, but gameplay hasn’t evolved the way I expected it to.

Half-Life 2 came out in 2004 with the Gravity Gun: using physics to solve puzzles, battle enemies, and navigate the environment. Fast forward two decades, and there hasn't been much evolution on that front. No amount of raster is going to fix that.

Give me Bloodborne 2 at the current level of graphics and I’ll be happy as can be.
 

Hudo

Member
None of this is a problem for me. The development has to catch up to the tech. Sure, we have better geometry, textures, shinier surfaces, but gameplay hasn’t evolved the way I expected it to.

Half-Life 2 came out in 2004 with the Gravity Gun: using physics to solve puzzles, battle enemies, and navigate the environment. Fast forward two decades, and there hasn't been much evolution on that front. No amount of raster is going to fix that.

Give me Bloodborne 2 at the current level of graphics and I’ll be happy as can be.
Exactly. I would love to see the "power" of modern consoles and PC tech be applied to actual advances in AI and gameplay (systems). Good graphics are boring.
 

Thebonehead

Gold Member
No thanks.

Enjoy being PC cucked.
Oh it's the edgy neo account that's jealous of people spending money again

You know it's ironic because...

[image: IaX4RC2.jpeg]
 
You're missing the point: Nvidia has a monopoly on GPUs; 88% of PC users use an Nvidia GPU.

They are raising prices because they can.

As for improvements, game developers have gotten lazy and don't optimize their games, especially after UE5. The true enemy is laziness and lack of creativity, not lack of power.
 
Speed differences are too small to be really relevant. Once a game goes out of VRAM and spills into system RAM, you're fucked:


It's not always the case. I downloaded a texture pack for W40K Space Marine 2 that required around 18GB of VRAM, and that's real VRAM usage, not just allocation. Some people have had problems on 16GB cards because performance tanked with this texture pack, or the game stuttered, but on my PC the game still ran smoothly, because the PCIe transfer was fast enough.

This test shows how drastically a fast PCIe link can improve performance in VRAM-limited situations.

[chart: Gfmg8bw.jpeg]

What's funny is that the 4GB card in a PCIe 4 slot was faster in some cases than the 8GB model in a PCIe 3 slot (both minimum and average fps).
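For context, a back-of-envelope sketch of the theoretical one-way bandwidth of an x16 slot per PCIe generation (these are peak link rates; real-world transfers land below them):

```python
# Theoretical one-way PCIe bandwidth for an x16 slot.
# Gen 3/4/5 use 128b/130b encoding; per-lane rates are in GT/s.
GT_PER_LANE = {3: 8.0, 4: 16.0, 5: 32.0}

def x16_bandwidth_gb_s(gen: int, lanes: int = 16) -> float:
    """Peak one-way bandwidth in GB/s for the given PCIe generation."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

for gen in (3, 4, 5):
    print(f"PCIe {gen}.0 x16: ~{x16_bandwidth_gb_s(gen):.2f} GB/s")
# PCIe 3.0 x16: ~15.75 GB/s
# PCIe 4.0 x16: ~31.51 GB/s
# PCIe 5.0 x16: ~63.02 GB/s
```

So each generation roughly doubles the link, which is why the slot matters so much once data has to move between system RAM and the card.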
 
It's not always the case. I downloaded a texture pack for W40K Space Marine 2 that required around 18GB of VRAM, and that's real VRAM usage, not just allocation. Some people have had problems on 16GB cards because performance tanked with this texture pack, or the game stuttered, but on my PC the game still ran smoothly, because the PCIe transfer was fast enough.

This test shows how drastically a fast PCIe link can improve performance in VRAM-limited situations.

[chart: Gfmg8bw.jpeg]

What's funny is that the 4GB card in a PCIe 4 slot was faster in some cases than the 8GB model in a PCIe 3 slot (both minimum and average fps).
This is precisely what I’m talking about. You can have all the VRAM in the world, but bandwidth will always be the bottleneck.
 

hinch7

Member
It's not always the case. I downloaded a texture pack for W40K Space Marine 2 that required around 18GB of VRAM, and that's real VRAM usage, not just allocation. Some people have had problems on 16GB cards because performance tanked with this texture pack, or the game stuttered, but on my PC the game still ran smoothly, because the PCIe transfer was fast enough.

This test shows how drastically a fast PCIe link can improve performance in VRAM-limited situations.

[chart: Gfmg8bw.jpeg]

What's funny is that the 4GB card in a PCIe 4 slot was faster in some cases than the 8GB model in a PCIe 3 slot (both minimum and average fps).
Getting an 8GB card in 2025 is a terrible idea though. Games are getting heavier and heavier on memory.

One of the reasons there is/was so much hype for the B580. Once you run out of VRAM buffer, your performance tanks and frametimes become a mess.
 

Bojji

Member
It's not always the case. I downloaded a texture pack for W40K Space Marine 2 that required around 18GB of VRAM, and that's real VRAM usage, not just allocation. Some people have had problems on 16GB cards because performance tanked with this texture pack, or the game stuttered, but on my PC the game still ran smoothly, because the PCIe transfer was fast enough.

This test shows how drastically a fast PCIe link can improve performance in VRAM-limited situations.

[chart: Gfmg8bw.jpeg]

What's funny is that the 4GB card in a PCIe 4 slot was faster in some cases than the 8GB model in a PCIe 3 slot (both minimum and average fps).

This is precisely what I’m talking about. You can have all the VRAM in the world, but bandwidth will always be the bottleneck.

It's not true at all. This FC example is an anomaly.

Reality looks like this:

[charts: a1c9eM0.jpeg, RG3MpVp.jpeg, TAntyxa.jpeg, NF8atK4.jpeg, usYERbV.jpeg]
 
It's not true at all. This FC example is an anomaly.

Reality looks like this:

[charts: a1c9eM0.jpeg, RG3MpVp.jpeg, TAntyxa.jpeg, NF8atK4.jpeg, usYERbV.jpeg]
Very high and ultra settings with ray tracing on? The 4060 is a low-end card; why would anyone do this? These cards are getting stellar fps with all those features enabled, so what's the complaint here?

Secondly, the 50 series will be on PCIe 5, whereas the 40 series is on PCIe 4. The 50 series will have double the bandwidth, further verifying what I said.

Lastly, the console versions of these games don't run at very high or ultra settings, have even less VRAM available to them, and rely on AMD's version of DLSS and other types of checkerboard rendering.
 

Bojji

Member
Very high and ultra settings with ray tracing on? The 4060 is a low-end card; why would anyone do this? These cards are getting stellar fps with all those features enabled, so what's the complaint here?

It's a 4060 Ti...

Newer games are chugging on 8GB cards even when not on the highest settings. But you can keep denying reality.

It also shows what happens when a card runs out of VRAM. It's not similar to FC at all, is it? You have exactly the same GPU, but the one with more memory performs better.
 
It's a 4060 Ti...

Newer games are chugging on 8GB cards even when not on the highest settings. But you can keep denying reality.

It also shows what happens when a card runs out of VRAM. It's not similar to FC at all, is it? You have exactly the same GPU, but the one with more memory performs better.
Stop setting the game to very high and ultra at a tier where it's not acceptable. The textures alone will fill the VRAM. You shouldn't be using 4K assets at 1080p; a low-tier GPU isn't tuned for that.
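To put rough numbers on why textures alone can fill an 8GB buffer, here's a back-of-envelope sketch. The formats are just common examples, and the usual approximation that a full mip chain adds about a third on top of the base level is assumed:

```python
# Back-of-envelope VRAM cost of a single 4K (4096x4096) texture.
def texture_mib(width: int, height: int, bytes_per_texel: float,
                mip_chain: bool = True) -> float:
    """Approximate size in MiB; a full mip chain adds ~1/3 over the base level."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mip_chain else 1) / 2**20

# Uncompressed RGBA8 is 4 bytes/texel; BC7 block compression is 1 byte/texel.
print(f"4K RGBA8: ~{texture_mib(4096, 4096, 4):.0f} MiB")  # ~85 MiB
print(f"4K BC7:   ~{texture_mib(4096, 4096, 1):.0f} MiB")  # ~21 MiB
```

Even block-compressed, a few hundred unique 4K materials in view adds up to multiple gigabytes before geometry, render targets, and RT structures get their share.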
 

Bojji

Member
Makes sense. The extra $100 goes a long way in that regard. Pay budget prices, get budget performance.

Neither the 4060 nor the 4060 Ti should have had 8GB in the first place. 10GB minimum for the 4060, and 12GB for the 4060 Ti.

8GB is an entry-level memory amount, for cards like the xx50 series.

But Nvidia is gonna Nvidia...
 
It's not true at all. This FC example is an anomaly.

Reality looks like this:

[charts: a1c9eM0.jpeg, RG3MpVp.jpeg, TAntyxa.jpeg, NF8atK4.jpeg, usYERbV.jpeg]
You said that once a game "goes out of vram and spills into system RAM you are fucked", and my example showed that's not always the case.

"Not always the case" means there will be still games, where even fast PCIe will be not enough to fix performance in VRAM limited situation. RT games are especially sensitive to memory bandwidth, and can close to desktop as soon you run out or VRAM (especially RE4Remake). Techspot tested games without RT, and PCIe4 bandwidth was enough to improve performance across every single raster game they tested, although in most games card with more VRAM was still faster.

[charts: WD_1080p.png, Doom_1440p.png, HZD_1080p.png, ACV_1440p.png]

I'm not saying that a fast PCIe slot can save GPUs with a low amount of VRAM, but it certainly can help in some cases. That's why some people with 16GB cards have problems in W40K Space Marine 2 with the 100GB texture pack, while I have no problems despite having the same 16GB of VRAM. Faster PCIe clearly made the difference in my case.
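A crude way to see why the slot speed matters once a game spills: if the GPU has to pull some amount of overflowed texture data back across the bus every frame, the transfer time alone eats into the frame budget. The 0.5 GB/frame figure below is a made-up illustration, and real transfers run below the peak link rate:

```python
# Time spent per frame just moving spilled data across PCIe, at peak link rate.
PEAK_GB_S = {"PCIe 3.0 x16": 15.75, "PCIe 4.0 x16": 31.51}

def spill_ms(spilled_gb: float, link_gb_s: float) -> float:
    """Milliseconds per frame to transfer `spilled_gb` over the given link."""
    return spilled_gb / link_gb_s * 1000

for link, bw in PEAK_GB_S.items():
    # Hypothetical 0.5 GB of textures touched from system RAM each frame.
    print(f"{link}: {spill_ms(0.5, bw):.1f} ms/frame")
# PCIe 3.0 x16: 31.7 ms/frame  (alone blows a 60fps / 16.7 ms budget)
# PCIe 4.0 x16: 15.9 ms/frame
```

Halving the transfer cost can be the difference between a stutter-fest and something playable, which matches the FC chart above: same VRAM deficit, very different outcomes by slot.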
 

Bojji

Member
The 4060 is entry level though.

For some reason they didn't make a 4050 (yet), but the 4060 shouldn't have less memory than the 3060. It should have AT LEAST 10GB.

You said that once a game "goes out of vram and spills into system RAM you are fucked", and my example showed that's not always the case.

"Not always the case" means there will still be games where even a fast PCIe link won't be enough to fix performance in a VRAM-limited situation. RT games are especially sensitive to memory bandwidth and can crash to desktop as soon as you run out of VRAM (especially RE4 Remake). Techspot tested games without RT, and PCIe 4 bandwidth was enough to improve performance across every single raster game they tested, although in most games the card with more VRAM was still faster.

[charts: WD_1080p.png, Doom_1440p.png, HZD_1080p.png, ACV_1440p.png]

I'm not saying that a fast PCIe slot can save GPUs with a low amount of VRAM, but it certainly can help in some cases. That's why some people with 16GB cards have problems in W40K Space Marine 2 with the 100GB texture pack, while I have no problems despite having the same 16GB of VRAM. Faster PCIe clearly made the difference in my case.

A fast PCIe link might help somewhat with the performance loss, but it's not the solution.

You have to kill texture quality/resolution or change to a GPU with more memory.
 
For some reason they didn't make a 4050 (yet), but the 4060 shouldn't have less memory than the 3060. It should have AT LEAST 10GB.



A fast PCIe link might help somewhat with the performance loss, but it's not the solution.

You have to kill texture quality/resolution or change to a GPU with more memory.
I don’t disagree with you.
 

64bitmodels

Reverse groomer.
This is better for the video game industry (not the GPU makers). Now the devs have to squeeze what we have; time for the talented devs to shine and the mediocre ones to git good.
they don't "have to" do shit. They will either:

- stagnate graphics (which is already happening to some extent),
- go GaaS, since that genre isn't as reliant on good visuals to sell, or
- just not give a fuck; frame gen and DLSS will clean up the unoptimized mess.

The mediocre devs are keeping their jobs.
 

Xdrive05

Member
The Daniel Owen tests above clearly show that 8GB is the limitation EVEN at 1080p. It's dispositive. Even the 4060 Ti runs way better in its 16GB version, EVEN at 1080p in some new games. The 1% lows tell the story too.

Push things up to 1440p with DLSS and frame gen, and you need even more VRAM. 8GB can forget about it.

The 4060 Ti 16GB plays Indiana Jones very well at 1440p Balanced with Full RT if you use frame generation and a controller (yes, the FG input lag is bad with mouse look, but fine on a controller). And it uses 14.5GB of VRAM to do that. The 8GB version of the exact same card won't even give you the Full RT option in the damn menu.
 