
Let's face it: GPU manufacturers have hit the wall and things are only going to get worse from here.

We have seen huge improvements in GPUs almost every two years since the early 2000s, but it seems that many people do not understand what has made this possible. It's not like Nvidia and ATI / AMD have been improving the architecture so much each year to get a 2x increase in performance. The truth is that this progress has only been possible because TSMC has been able to offer more transistors at the same price every 1-2 years. Unfortunately we've hit the node wall, and Nvidia cannot improve performance anymore without increasing price and power consumption. The RTX 5090 will have an absolutely ridiculous 575W TDP and will probably require a 1000W PSU (and a power plant ;). Some people like to think that Nvidia is just greedy, but can we really expect them to cut prices when costs go up? That's not realistic.

So what does this mean for us gamers? Well, we can certainly say goodbye to cheap GPUs and meaningful improvements every 2 years. Do you guys remember the difference between the 980ti and the 1080ti? In some cases it was a whopping 112% relative difference, even without DLSS / FG gimmicks. Now we can forget about such massive gains in pure raster.
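For anyone wondering how these "relative difference" percentages are derived, it's just (new FPS / old FPS − 1) × 100. A minimal sketch, with made-up FPS numbers rather than values taken from any actual benchmark chart:

```python
# Relative performance uplift between two GPUs from average FPS.
# The FPS values used below are illustrative placeholders only.
def relative_uplift(fps_old: float, fps_new: float) -> float:
    """Percent gain of the new card over the old one."""
    return (fps_new / fps_old - 1.0) * 100.0

print(relative_uplift(50.0, 100.0))  # -> 100.0 (the new card is twice as fast)
```

So a "112% relative difference" means the newer card renders a bit more than 2.1x the frames of the older one in that test.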


hitman_3840_2160.png


The good part is, current GPUs will last longer than ever.

The node wall will also have an effect on the consoles. People expect the PS6 to bring desired improvements like PT, but I doubt AMD will make an APU that can match the RTX 4090 in the next few years. The RTX 2080ti was released in 2018, and it took 7 years before a console (PS5 Pro) could match its performance (the 2080ti is still faster, especially in RT, but not by that much). How many years will it take for AMD to build an APU that can compete with the RTX 4090, let alone the RTX 5090? I was hoping to see PT used on the PS6, but given the current reality I need to lower my expectations. Perhaps Sony / MS should delay the PS6 and Xbox5 until they can offer a meaningful performance / technology boost. The PS5 / XSX launched in 2020, but dev cycles have increased a lot in the last decade. Nowadays we get a quality game release after waiting at least 4-5 years, so IMO Sony / MS should not rush the new console generation any more, especially if they can't give people the desired / noticeable improvements.

What can we do if things only get worse with our current hobby? Play older games? Maybe we should get a new hobby and play some chess 🙃😃 🤣?
 

Sanepar

Member
I said this about the pro and will say it again about the PS6 in this same vein

People will be very let down when they see the raw numbers on the PS6

Oh and I don't need any new hobbies :)
Well, in 2028 I expect 50 real TFLOPS at 250W (so basically a 4090).
 

Buggy Loop

Gold Member
Counter argument : Developers have hit the wall

They've barely progressed from 2015 graphics that were running on <2 TFLOPS with no ML or RT, paired with shit CPUs and HDDs, and now they kneecap a 4090 24GB with a 9800X3D, 64GB, and an SSD. Bad optimization anyone? It's terrible. Not impressed. Hardware TAA/DLSS and so on has just made them completely lazy. Just set DLSS to performance if it doesn't run! Or FSR pixel soup on console.

The GPU cycle could stop for literally 5 years and devs could just try to catch up with optimization and know-how, but they won't.
 

TrebleShot

Member
Yeah it's true, and because Nvidia is an American company, bigger = better.

Not much nuance in how they do things. TDP on these things bloated out of consumer electronics years ago.
 
Everyone is acting like there aren't choices when it comes to GPUs. You can choose to buy a new GPU from any manufacturer and any generation. You don't need to spend a shit ton for a PC. Sure it's more than a console, but if you do more than game on it, it's completely worth the added expense. Not only that, it's actually cheaper than buying a console as well as a whole separate desktop, tablet, or laptop for media consumption, work, or various other hobbies.
 

Bojji

Member
I said this about the pro and will say it again about the PS6 in this same vein

People will be very let down when they see the raw numbers on the PS6

Oh and I don't need any new hobbies :)

I expected this after weak Pro.

Normally we should get ~4090 level of GPU (raw raster power) after all those years and based on previous gens.

But with all this tech slowing down, we will for sure not see that with the PS6...

I bet they will expand on RT and ML (Cerny's words, basically), but raw raster power will be below 7900 XTX / 4080 level.
 

rofif

Can’t Git Gud
Everyone is acting like there aren't choices when it comes to GPUs. You can choose to buy a new GPU from any manufacturer and any generation. You don't need to spend a shit ton for a PC. Sure it's more than a console, but if you do more than game on it, it's completely worth the added expense. Not only that, it's actually cheaper than buying a console as well as a whole separate desktop, tablet, or laptop for media consumption, work, or various other hobbies.
It’s just that new GPUs didn’t use to feel so desperate in the past. I always looked forward to new GPUs coming out and seeing how much Gamers Nexus and Jayz would overclock them.
Hell, I was top 10 myself in one of the 3DMark tests at the 3080 launch just because I had one so early. It used to be more exciting because it also felt like bang for buck.
 
Sony would be dumb to focus on or even talk in great detail about PS6 specs. For that to happen, the games need to be ready and should "surprise and delight" (new and fresh art direction / level design / AI) and have a game mechanic / system "hook" that "hasn't been seen before".

Regarding AMD/Nvidia... if I research what GPU to buy, YT will be filled with gaming-centric videos for the most part. But I wonder how misleading or misrepresented the market actually is. I will guess that "gaming" is a secondary aspect of building a PC (in the big scheme of things). People who build PCs are actually using them as "workstations" or productivity devices.
 

rofif

Can’t Git Gud
I expected this after weak Pro.

Normally we should get ~4090 level of GPU (raw raster power) after all those years and based on previous gens.

But with all this tech slowing down, we will for sure not see that with the PS6...

I bet they will expand on RT and ML (Cerny's words, basically), but raw raster power will be below 7900 XTX / 4080 level.
We used to get top previous-gen GPU specs in consoles because the cards used to be around 200W max not so long ago.
Now with huge 350-600W dies and process node shrinks slowing down, it’s not possible to fit that into a 200W console.
Moore’s Law barely applies anymore if you exclude the DLSS stuff.
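To make the Moore's Law framing concrete, here is a toy doubling model; the starting transistor count and two-year cadence are illustrative assumptions, not measured figures:

```python
# Toy Moore's Law model: transistor count doubles every `cadence` years.
# Starting count and cadence are illustrative assumptions only.
def transistors(start: float, years: float, cadence: float = 2.0) -> float:
    """Projected transistor count after `years` of steady doubling."""
    return start * 2 ** (years / cadence)

# With a strict 2-year cadence, 1B transistors would become 16B in 8 years.
print(transistors(1e9, 8))  # -> 16000000000.0
```

The point of the thread is that this cadence no longer holds: when density stalls, the only levers left are bigger dies and higher power, which is exactly the 350-600W trend described above.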
 

hinch7

Member
We're speaking way too early; we haven't seen what AMD can do with UDNA and a proper MCM design. The 9070 XT is rumored to reach within 5% of a 4080 on a small die. No way have we hit a wall yet.

Consoles, maybe, if we're talking cost, unless they're going to charge a pretty penny for one. Much like the PS5 Pro.
 
NVIDIA’s profit margins are insane. Nobody is asking them to be a charity, but those prices are an absolute scam if they are true.
Really can’t blame them when everyone buys them anyway regardless of price. It’s simple supply and demand. Demand is through the roof. They literally don’t have competition.

Let’s face it. The only reason anyone rails on AMD for not being competitive enough is because they want Nvidia to lower their prices, not because they want to buy anything AMD is offering.
 
Counter argument : Developers have hit the wall

They've barely progressed from 2015 graphics that were running on <2 TFLOPS with no ML or RT, paired with shit CPUs and HDDs, and now they kneecap a 4090 24GB with a 9800X3D, 64GB, and an SSD. Bad optimization anyone? It's terrible. Not impressed. Hardware TAA/DLSS and so on has just made them completely lazy. Just set DLSS to performance if it doesn't run! Or FSR pixel soup on console.

The GPU cycle could stop for literally 5 years and devs could just try to catch up with optimization and know-how, but they won't.
It's unfair to say that we have seen no improvement in game graphics since 2015 (the PS4 era). I think RT offers the biggest improvement in game graphics since shaders. Some 2015 games literally look like new with RT lighting (The Witcher 3).

Have you played any PS4 era games lately? I have, and they look very dated when you compare the detail, or the lighting with modern games.

RDR2-2025-01-02-23-14-32-552.jpg


RDR2-2025-01-02-23-19-18-042.jpg


RDR2-2025-01-02-23-33-00-734.jpg


tll-2024-12-05-17-24-58-676.jpg


Uncharted-4-Kres-z-odzieja-20241206014639.jpg


PS4 games may look amazing in small thumbnails or low-resolution GIFs (especially games with beautiful scenery like Uncharted 4 or RDR2), but open the screenshots at 4K and the difference is immediately apparent. Compare the stone in the first screenshot with Black Myth: Wukong (BMW). In BMW, not only is the stone much more detailed, but the PT lighting has also made it possible to bring every bump on its surface to life.


b1-Win64-Shipping-2024-09-01-00-30-46-747.jpg


b1-Win64-Shipping-2024-09-01-00-07-52-582.jpg


On PC I already see a GENERATIONAL difference compared to PS4 graphics. I'm not so sure even the PS6 will be able to run BMW with PT and show this difference.

BMW is not the only stunning-looking game that would be completely impossible to run on PS4 hardware without downgrading the graphics to the extreme.




 

Buggy Loop

Gold Member
It's unfair to say that we have seen no improvement in the game's graphics since 2015 (PS4 era).

I didn't say there are no improvements; I said they've barely improved.

I played Wukong 3 times. Why is a mostly linear map with invisible walls and no simulations to speak of, except hitting enemies, even compared to the world of Red Dead Redemption 2? Do you understand the scale of the simulations and open worlds?

Horizon Forbidden West is still on the top list for graphics, and it runs on PS4.

Some games are well optimized for ray tracing and path tracing, they exist, but by far the majority of them run like shit.
 
You should wait at least 2 more days, I don't think this is gonna age well.
Why, do you really expect to see miracles from the RTX 5090? The 1080ti (2017, $799) was the last truly high-end card with an MSRP below $1000. The 5090 will cost far more and will not even offer a 980ti vs. 1080ti level of performance gap. Nvidia may give us DLSS4 to milk some gamers, but the RTX 50 series is going to be terrible. We will see even higher prices and a smaller performance gap.
 
I didn't say there are no improvements; I said they've barely improved.

I played Wukong 3 times. Why is a mostly linear map with invisible walls and no simulations to speak of, except hitting enemies, even compared to the world of Red Dead Redemption 2? Do you understand the scale of the simulations and open worlds?

Horizon Forbidden West is still on the top list for graphics, and it runs on PS4.

Some games are well optimized for ray tracing and path tracing, they exist, but by far the majority of them run like shit.
I played both the Horizon Remaster and Forbidden West. Beautiful scenery and amazing art direction, but the lighting was flat as hell. PT games like Wukong and Cyberpunk have way better lighting, and Avatar is on a totally different level despite using standard RT.

Maybe I've played too much PT / RT on my PC, so my perspective is a bit skewed. I just cannot look at PS5 games like Horizon 1 Remaster / Horizon 2 Forbidden West and be impressed with the graphics, sorry.
 
The 5090 will be a lot faster. The rest of the lineup will OTOH be very mid.
You seem to have forgotten how much faster the 1080ti was compared to the 980ti. The relative difference was insane (92%), and that was pure raster, with no DLSS gimmicks.

doom_3840_2160.png


I doubt the RTX 5090 will offer similar gains in raster (maybe with FG 2.0, but generated frames aren't the same as real frames). The price could double though :p.
 

Celcius

°Temp. member
You seem to have forgotten how much faster the 1080ti was compared to the 980ti, and that was pure raster, with no DLSS gimmicks.

doom_3840_2160.png


I doubt the RTX5090 will offer similar gains. The price could double though :p.
The RTX 5080 is rumored to be on the same level or slightly faster than the RTX 4090
...and the RTX 5090 literally has double the specs of the RTX 5080, so I'd say it's possible
We'll see in 2 days though
 
NVIDIA’s profit margins are insane. Nobody is asking them to be a charity, but those prices are an absolute scam if they are true.
I do not know how much money Nvidia has spent on developing these new GPUs. It's possible that they increased their margins to cover their research costs. The RTX 5090 chip itself (materials and labour) may cost very little compared to its MSRP, but Nvidia has certainly spent billions researching this chip, and they also need funds for future research.

Do you think they are just greedy and there's nothing else to it?
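The R&D-amortization argument can be sketched with a back-of-the-envelope model; every number below is a hypothetical placeholder, not an actual Nvidia figure:

```python
# Back-of-the-envelope unit economics: all figures are hypothetical.
def min_viable_price(marginal_cost: float, rnd_total: float, units: int,
                     margin: float) -> float:
    """Price covering per-unit build cost, amortized R&D, and a target margin."""
    amortized_rnd = rnd_total / units  # R&D spread across expected unit sales
    return (marginal_cost + amortized_rnd) * (1.0 + margin)

# e.g. $500 to build a card, $2B of R&D over 2M units, 25% margin target:
print(min_viable_price(500.0, 2e9, 2_000_000, 0.25))  # -> 1875.0
```

The takeaway of the sketch is that even if the physical chip is cheap to produce, a small expected sales volume pushes the amortized R&D per unit (and thus the price) up fast.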
 
I do not know how much money Nvidia has spent on developing these new GPUs. It's possible that they increased their margins to cover their costs. Do you think they are just greedy?

The margins on the AI chips are crazy. For the gaming parts, I don't think they have said much about that.
 
Moore's Law is at its end, and while GAAFET transistors will provide a sizeable jump in transistor density, the bigger reason for the stagnation in semiconductor manufacturing advancement (in terms of cost-saving opportunities) is that TSMC now has a monopoly on bleeding-edge logic semiconductor manufacturing.

Given that GlobalFoundries fell off a cliff and Intel got left in the dust, TSMC has been increasing prices across all major nodes multiple times over the course of the past half-decade. They will of course blame wider socio-economic factors for the increases, but the truth is pure and simple corporate greed, because they now have no competition.

The TSMC monopoly is the real root cause of the slowdown and death of Moore's Law. The silver lining is that as we've reached the end of silicon scaling, there is an opportunity for whole new exotic computing technologies to emerge that could kick-start a whole new era of technological advancement and arms race between many different players. The problem is, most of those non-silicon based technologies are still many decades away from prime time.
 

flying_sq

Member
I'm waiting for chiplets in GPUs; monolithic dies are on the way out. I think you're correct to a point, but I also think you're underestimating the most valuable company in the world's ability to throw infinite money at an issue until it works.
 
The TSMC monopoly is the real root cause of the slowdown and death of Moore's Law.

I don’t think that’s true, there’s huge opportunity for competition when prices are high

If these other companies were competent they should be able to do better.

Maybe TSMC is just that much better, but they aren’t causing diminishing returns
 

Aces High

Member
I do not know how much money Nvidia has spent on developing these new GPUs.
Probably not much.

At this point it's just trickle down development for gaming. Both Nvidia and AMD are data center first. Gamers get the leftovers.

 
I think a good argument could be made to bring back Crossfire and SLI with new AI tools in order to make the multi-GPU use case commonplace. It's better than having a second tower that acts as a GPU once the RTX 9090 comes out.
 

lh032

I cry about Xbox and hate PlayStation.
Good, hopefully this will mean that the gap jumping to a new gen will be wider and wider.
And better frame rates as well.
 
I think a good argument could be made to bring back Crossfire and SLI with new AI tools in order to make the multi-GPU use case commonplace. It's better than having a second tower that acts as a GPU once the RTX 9090 comes out.
Yes, let's bring back SLI. If I win the lottery, I'll buy an RTX 6090 Quad SLI and play every single PT game at 4K native 240fps / Hz 😃
 