
PS5 Pro devkits arrive at third-party studios, Sony expects Pro specs to leak

PaintTinJr

Member
20.9 TF at 2.721 GHz on the 5nm process with 60 CUs. This is why I am expecting the PS5 Pro to reach 20 TF if it uses at least the 5nm node. And I don't see it using less than that one year after the 7800 XT. Sony will need 2.6 GHz to reach 20 TF, and I expect they will hit that (or come very close) with their dynamic clocks.
I don't see them increasing the GPU clocks at all in a mid-gen refresh - maybe the CPU though - as increasing the GPU clock makes it almost impossible to ensure easy backwards compatibility in a PS6. If anything they'll want to trade lower clocks against improved IPC to free up thermal headroom for the next gen, when the number of CUs is going to need to increase faster than the lithography shrinkage will allow while driving a high clock rate.

That's why I think they will revert to their old abandoned prototype dual-Cell BE solution from the PS3 for the refresh, and possibly the PS6 too. An APU like the PS5's has everything PlayStation needs to use it as an AI/RT co-processor to another PS5 APU for a mid-gen refresh, and the chip is dirt cheap to them, having already shipped ~50M of them.

Complex high-speed motherboard wiring done profitably has been Sony's expertise for years, the PS2's Graphics Synthesizer probably being the best example of a design only Sony would do, and then they made a killing on it by selling huge numbers while making it cheaply.
 

Mr.Phoenix

Member
6nm-fabricated Zen 2 has perf/watt improvements.


My Gigabyte RTX 4080 Gaming OC's "one-click" overclock can reach 2.9 GHz, which is roughly 56.4 TFLOPS. This year's AD103 is the RTX 4080 Super, which is about 59 TFLOPS at 2.9 GHz.

RX 7900 XT has 51.48 TFLOPS at 2394 MHz. Most RX 7900 XTs can match the RX 7900 XTX's 2499 MHz clock speed.

RX 7900 XTX has 61.42 TFLOPS at 2499 MHz.

RTX 3080 has +29 TFLOPS at 1710 MHz with 68 SMs and 96 ROPs. The RTX 3080's RT cores fully accelerate DXR.

RTX 4070 has +29 TFLOPS at 2475 MHz. RTX 4070 Super in 2024.

RX 7800 XT has 37 TFLOPS at 2430 MHz.

RTX 4070 Ti has 40 TFLOPS at 2610 MHz. RTX 4070 Ti Super in 2024.
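Those per-card TFLOPS figures all fall out of the same formula; a quick sanity check in Python (the FP32 lane counts are the cards' public specs, counting RDNA 3 dual-issue lanes, not something stated in this thread):

```python
def gpu_tflops(shaders, clock_mhz):
    # 2 FLOPs (one fused multiply-add) per shader lane per clock
    return shaders * 2 * clock_mhz / 1e6

print(round(gpu_tflops(10752, 2394), 2))  # RX 7900 XT  -> 51.48
print(round(gpu_tflops(12288, 2499), 2))  # RX 7900 XTX -> 61.42
print(round(gpu_tflops(7680, 2430), 1))   # RX 7800 XT  -> 37.3
print(round(gpu_tflops(7680, 2610), 1))   # RTX 4070 Ti -> 40.1
```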

Both the Ampere/Lovelace SM and the RDNA 3 CU have 128 FP32 shader units without a corresponding TMU increase. Both AMD and NVIDIA are doubling FLOPS per CU/SM without adding TMUs, which can benefit workloads like RT denoising and geometry.

With the Ampere/Lovelace and RDNA 3 generation, AMD and NVIDIA TFLOPS are roughly comparable once AMD's RT issue is removed. AMD needs to improve FLOPS per unit of chip area. Fixing the RT cores in RDNA 3.5 will help its competitiveness in the PC GPU market.

RTX 4090 is a monster GPU with a monster price tag.
I don't know, I can't help but feel that the AMD and Nvidia listed FP32 TF numbers are scams, as we are yet to see any game take advantage of this whole dual-issue compute thing. And the way both companies go about achieving these doubled TF numbers is weird. E.g. Nvidia didn't really double anything; they just made their already existing 64 INT32 cores also capable of "acting" like 64 FP32 cores simultaneously. AMD, on the other hand, just allows its own 64 FP32 cores to work on two instructions simultaneously.

Either way, we are yet to see any game use this feature, evident in the fact that, compared to their GPUs under the older TF measurements, we are not seeing anywhere near the kind of performance boosts these inflated TF numbers would suggest.
 
I don't see them increasing the GPU clocks at all in a mid-gen refresh - maybe the CPU though - as increasing the GPU clock makes it almost impossible to ensure easy backwards compatibility in a PS6. If anything they'll want to trade lower clocks against improved IPC to free up thermal headroom for the next gen, when the number of CUs is going to need to increase faster than the lithography shrinkage will allow while driving a high clock rate.

That's why I think they will revert to their old abandoned prototype dual-Cell BE solution from the PS3 for the refresh, and possibly the PS6 too. An APU like the PS5's has everything PlayStation needs to use it as an AI/RT co-processor to another PS5 APU for a mid-gen refresh, and the chip is dirt cheap to them, having already shipped ~50M of them.

Complex high-speed motherboard wiring done profitably has been Sony's expertise for years, the PS2's Graphics Synthesizer probably being the best example of a design only Sony would do, and then they made a killing on it by selling huge numbers while making it cheaply.
They'll downclock it for BC if needed. This is how BC works on the PS5: in some rare games (like AC Unity) the GPU clocks are set to the PS4 Pro value. Even for the Pro they overclocked the GPU as high as they could, from 800 MHz to 911 MHz. That's about a 14% overclock, and they didn't have dynamic clocks back then, resulting in jet-engine noise in many PS4 Pro games!

For reference, a similar 14% overclock would take a PS5 Pro past 2.5 GHz and to 19.4 TF.
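The clock-to-TF math in this thread can be sketched in a few lines (an assumption-laden sketch: RDNA-style CUs with 64 shader ALUs each, 2 FLOPs per ALU per clock, no dual-issue counting):

```python
def console_tflops(cus, clock_ghz):
    # 64 shader ALUs per CU, 2 FLOPs (one fused multiply-add) per ALU per clock
    return cus * 64 * 2 * clock_ghz / 1000

print(round(console_tflops(36, 0.800), 2))  # PS4 Pro before the overclock, ~3.69
print(round(console_tflops(36, 0.911), 2))  # PS4 Pro as shipped (~14% higher clock)
print(round(console_tflops(36, 2.23), 2))   # PS5, ~10.28
print(round(console_tflops(60, 2.54), 1))   # hypothetical 60-CU Pro at ~2.54 GHz
```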
 

FireFly

Member
20.9 TF at 2.721 GHz on the 5nm process with 60 CUs. This is why I am expecting the PS5 Pro to reach 20 TF if it uses at least the 5nm node. And I don't see it using less than that one year after the 7800 XT. Sony will need 2.6 GHz to reach 20 TF, and I expect they will hit that (or come very close) with their dynamic clocks.
The 60 CU 7800 XT only runs at 2.4 GHz while consuming 250 Watts and is already on the 5nm node.
 

PaintTinJr

Member
They'll downclock it for BC if needed. This is how BC works on the PS5: in some rare games (like AC Unity) the GPU clocks are set to the PS4 Pro value. Even for the Pro they overclocked the GPU as high as they could, from 800 MHz to 911 MHz. That's about a 14% overclock, and they didn't have dynamic clocks back then, resulting in jet-engine noise in many PS4 Pro games!

For reference, a similar 14% overclock would take a PS5 Pro past 2.5 GHz and to 19.4 TF.
But at only 911 MHz that was still below the textbook 1.4 GHz (1.2 GHz in reality) sweet spot for power draw versus thermal efficiency, and there were plenty of lithography changes left in the roadmap to absorb it.

IMO the PS5 GPU is well above that, on a much steeper part of the diminishing-returns curve for higher clocks versus heat production, and the roadmap of lithography changes to absorb that on a PS6 just isn't there to be had. It is more prudent to do a paradigm shift with a co-processor in a refresh, and then maybe a full setup of many Oberon co-processors - and maybe partial stacking - for the PS6, than to continue down a path where thermals and power draw are ridiculous, going by RTX 4090 / RX 7900 XTX requirements.
 

buenoblue

Member
The same thing was said about the PS5 and the 3080, and look what happened: there are fringe games, like The Last of Us and Rift Apart, where they are within 5% of each other.
You're fucking insane if you think the PS5 GPU is in the 3080 performance category. This whole thread is fucking insane.

PS5 Pro will not even match the 3080 lol.
Have these people even gamed on a 3080 or 4080?
 

SmokSmog

Member
I don't know, I can't help but feel that the AMD and Nvidia listed FP32 TF numbers are scams
They are real, at least for Nvidia.
At 2750 MHz a 4090 gives you 45 TF of INT32 or 90 TF of FP32.
Back in the day, with Turing and older GPUs, you had for example 10 TF INT32 or 10 TF FP32 from 1:1 cores. Since Ampere, Nvidia GPUs have half their cores able to do either INT32 or FP32 and the other half FP32-only. That gave them around a 30% boost in games at the same SM count.




68-SM 2080 Ti vs 68-SM 3080 in Doom Eternal, which likes FP32.

[chart: Doom Eternal benchmark, 3840×2160]


PS5 has 10.3 TF of INT32/FP32.

INT32 is used somewhere around 25 to 35% of the time in games.
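The split datapath described above can be modeled in a couple of lines (a rough sketch: the 29.8 TF peak and the 25-35% INT32 share are the figures quoted in this thread; real instruction scheduling is messier):

```python
# Ampere-style SM: one 64-wide half is FP32-only, the other issues
# FP32 *or* INT32 each clock, so INT32 work eats into the FP32 peak.
def effective_fp32(peak_fp32_tf, int32_fraction):
    dedicated = peak_fp32_tf / 2                      # FP32-only half
    shared = (peak_fp32_tf / 2) * (1 - int32_fraction)  # shared half's FP32 cycles
    return dedicated + shared

peak = 29.8  # RTX 3080 headline FP32 TF
for frac in (0.25, 0.35):
    print(frac, round(effective_fp32(peak, frac), 1))  # ~26.1 and ~24.6 TF
```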
 

winjer

Gold Member
You're fucking insane if you think the PS5 GPU is in the 3080 performance category. This whole thread is fucking insane.

PS5 Pro will not even match the 3080 lol.
Have these people even gamed on a 3080 or 4080?

For rasterization, yes, the PS5 Pro might be around the performance of a 3080.
For RT, it's very unlikely.
 

PeteBull

Member
You're fucking insane if you think the PS5 GPU is in the 3080 performance category. This whole thread is fucking insane.

PS5 Pro will not even match the 3080 lol.
Have these people even gamed on a 3080 or 4080?
https://www.techpowerup.com/gpu-specs/geforce-rtx-3080.c3621 It shouldn't be much worse, since currently 3080 raster = 7800 XT raster, and the PS5 Pro specs are directly based on that chip. It's actually a really interesting case study, coz the 3080 launched at a $700 MSRP in Sept 2020, and now, 3 years later, we get literally the same raster performance at a $509 street price https://pcpartpicker.com/product/43...deon-rx-7800-xt-16-gb-video-card-21330-01-20g

Ofc worse RT, and FSR instead of the much better DLSS, same with AMD's frame-gen solution, but 6 more gigs of VRAM.
That's why I think the PS5 Pro, especially if it's on the rumored 4N process node (an improved 5nm), will be relatively close to the 3080 in specs/actual game performance. It might be a tiny bit worse, but it won't be a canyon-sized gap; roughly the same ballpark.
I fully expect DF to compare PS5 Pro perf in games to Nvidia's 3080 and AMD's 7800 XT in its console-vs-PC comparisons and optimal-settings coverage.
Edit: let's not forget CES in a few days, with 3 new Super cards officially announced by Nvidia. The weakest of the bunch, the 4070 Super, is very likely to be direct competition, in both perf and price, for AMD's 7800 XT
https://videocardz.com/newz/gigabyt...80-4070-ti-4070-super-series-rtx-3050-6gb-too and here are the unofficially leaked, but basically already confirmed, specs of the new cards; we know everything but pricing already https://videocardz.com/newz/nvidia-...eview-and-sales-embargo-information-leaks-out

The weakest of the bunch, the 4070 Super, will be literally the same perf as the 3080/7800 XT too (the 4070 non-Super is 6% weaker than the 3080 and 7% weaker than the 7800 XT).
 
https://www.techpowerup.com/gpu-specs/geforce-rtx-3080.c3621 It shouldn't be much worse, since currently 3080 raster = 7800 XT raster, and the PS5 Pro specs are directly based on that chip. It's actually a really interesting case study, coz the 3080 launched at a $700 MSRP in Sept 2020, and now, 3 years later, we get literally the same raster performance at a $509 street price https://pcpartpicker.com/product/43...deon-rx-7800-xt-16-gb-video-card-21330-01-20g

Ofc worse RT, and FSR instead of the much better DLSS, same with AMD's frame-gen solution, but 6 more gigs of VRAM.
That's why I think the PS5 Pro, especially if it's on the rumored 4N process node (an improved 5nm), will be relatively close to the 3080 in specs/actual game performance. It might be a tiny bit worse, but it won't be a canyon-sized gap; roughly the same ballpark.
I fully expect DF to compare PS5 Pro perf in games to Nvidia's 3080 and AMD's 7800 XT in its console-vs-PC comparisons and optimal-settings coverage.
Edit: let's not forget CES in a few days, with 3 new Super cards officially announced by Nvidia. The weakest of the bunch, the 4070 Super, is very likely to be direct competition, in both perf and price, for AMD's 7800 XT
https://videocardz.com/newz/gigabyt...80-4070-ti-4070-super-series-rtx-3050-6gb-too and here are the unofficially leaked, but basically already confirmed, specs of the new cards; we know everything but pricing already https://videocardz.com/newz/nvidia-...eview-and-sales-embargo-information-leaks-out

The weakest of the bunch, the 4070 Super, will be literally the same perf as the 3080/7800 XT too (the 4070 non-Super is 6% weaker than the 3080 and 7% weaker than the 7800 XT).
The 4070 is only weaker at 4K native, though. The Super gets a pretty good shader bump but will probably have the same memory-bandwidth limitation at 4K.
 

buenoblue

Member
Exactly! PS5 Pro will be close to a 3080 until ray tracing, then more than likely a 3070 Ti, which is great 😀 don't get me wrong.

But saying the base PS5 is on 3080 level is insane. And saying the PS5 Pro will be a 4080 really can't be serious, right? The PS6 will probably struggle to beat a 4080.

Remember, flop for flop AMD doesn't compete evenly with Nvidia, let alone a cut-back, power-constrained SOC version like the consoles are using.

Now don't get me wrong, we are living in an age of upscaling, DRS and the like, where if a card is twice as powerful the games don't look twice as good. They can look very similar while being quite far apart tech-wise. But specs are specs. A 10 TF AMD SOC GPU ain't outperforming a 29 TF full-size 3080.
 

Gaiff

SBI’s Resident Gaslighter
For rasterization, yes, the PS5 Pro might be around the performance of a 3080.
For RT, it's very unlikely.
It’d be a bit pathetic if it cannot match Ampere’s second generation RT when it’ll be the third RDNA generation of RT. I really hope it does match it or exceed it. The 3080 doesn’t exactly excel there either.
 

PeteBull

Member
It’d be a bit pathetic if it cannot match Ampere’s second generation RT when it’ll be the third RDNA generation of RT. I really hope it does match it or exceed it. The 3080 doesn’t exactly excel there either.
All I can say is: check the 7800 XT review. There you've got raster, and here, timestamped, RT performance in a few games vs a few similar-grade cards. Don't even look at the Nvidia counterparts; look at the older 6800 XT and how little progress was made in both raster and RT performance.


Let's hope Sony actually bumps that RT performance a lot and takes some features from RDNA 4, coz the 7800 XT isn't that amazing a GPU when it comes to RT performance; being really decently priced is its huge advantage.
 

winjer

Gold Member
It’d be a bit pathetic if it cannot match Ampere’s second generation RT when it’ll be the third RDNA generation of RT. I really hope it does match it or exceed it. The 3080 doesn’t exactly excel there either.

It's weaker than a 3080 in RT, although not by a huge margin.
But maybe with custom RT and BVH implementations on a console, it might do better.

[chart: relative RT performance, 2560×1440]
 

Gaiff

SBI’s Resident Gaslighter
All I can say is: check the 7800 XT review. There you've got raster, and here, timestamped, RT performance in a few games vs a few similar-grade cards. Don't even look at the Nvidia counterparts; look at the older 6800 XT and how little progress was made in both raster and RT performance.


Let's hope Sony actually bumps that RT performance a lot and takes some features from RDNA 4, coz the 7800 XT isn't that amazing a GPU when it comes to RT performance; being really decently priced is its huge advantage.

It's weaker than a 3080 in RT, although not by a huge margin.
But maybe with custom RT and BVH implementations on a console, it might do better.

You do know this is a mid-gen refresh and not a totally new gen, right?
Oh, I know that the 7800 XT is weaker in ray tracing, but the rumor suggests the Pro is RDNA 3.5, so presumably with some new features that will be on RDNA 4, and it will also emphasize ray tracing. So I'm thinking it should be quite a bit better on that front if rumors are to be believed. A lot of the ray tracing is still done on the TMUs, and AMD's "hybrid" approach isn't delivering amazing results thus far.
 

winjer

Gold Member
Oh, I know that the 7800 XT is weaker in ray tracing, but the rumor suggests the Pro is RDNA 3.5, so presumably with some new features that will be on RDNA 4, and it will also emphasize ray tracing. So I'm thinking it should be quite a bit better on that front if rumors are to be believed. A lot of the ray tracing is still done on the TMUs, and AMD's "hybrid" approach isn't delivering amazing results thus far.

That is the problem with this whole thread: all we have is speculation and rumors.

It's fun to try to guess what the PS5 Pro will be, but it's still just guessing.
 

Mr.Phoenix

Member
20.9 TF at 2.721 GHz on the 5nm process with 60 CUs. This is why I am expecting the PS5 Pro to reach 20 TF if it uses at least the 5nm node. And I don't see it using less than that one year after the 7800 XT. Sony will need 2.6 GHz to reach 20 TF, and I expect they will hit that (or come very close) with their dynamic clocks.
Nope. Did you see the size of the OG PS5? I am beginning to think that a lot of people here just talk without really knowing what they are talking about.

The OG 2020 PS5, for all its size, drew ~220W under game load. That dropped to 202W with the 6nm revision, which is the same chip the PS5 Slim uses. That GPU you are talking about on PC doesn't even run at 2.7 GHz; it runs at 2.5 GHz and draws over 260W. And that is the GPU alone. There is no way, not a chance in hell, that the PS5 Pro has a GPU clocked that high when there is also a CPU to contend with and it needs to fit into a console chassis. It just doesn't make sense.

Further, NEVER has a console variant of a PC GPU/CPU matched the PC equivalent in clocks/power consumption. They are always downclocked, and for good reason. If the PS5 Pro is on 5nm, do not expect its GPU to be clocked anywhere above 2.4 GHz. If anything, 2.35 GHz is more likely.
They are real, at least for Nvidia.
At 2750 MHz a 4090 gives you 45 TF of INT32 or 90 TF of FP32.
Back in the day, with Turing and older GPUs, you had for example 10 TF INT32 or 10 TF FP32 from 1:1 cores. Since Ampere, Nvidia GPUs have half their cores able to do either INT32 or FP32 and the other half FP32-only. That gave them around a 30% boost in games at the same SM count.




68-SM 2080 Ti vs 68-SM 3080 in Doom Eternal, which likes FP32.

[chart: Doom Eternal benchmark, 3840×2160]


PS5 has 10.3 TF of INT32/FP32.

INT32 is used somewhere around 25 to 35% of the time in games.
I am sorry, but I don't see how what you just posted proves they are actually being used. I am not saying they aren't real, just that we aren't seeing them anywhere. And you shouldn't even be using a "benchmarking tool" for this argument, as the argument here is whether it's even being used in games.

Here's the problem. Let's take 3 GPUs from Nvidia: the 2080, 3080 and 4080. Take a game like RE4 running on all 3 at 1080p with no RT, so that none of these GPUs is bottlenecked by RT performance or RAM. RE4 at 1080p peaks at 9.4GB of RAM utilization, so only the 2080 may suffer since it has only 8GB, which, mind you, should work in favor of the other GPUs.

2080, 10 TF: 98 fps
3080, 14.8 TF (+48% vs the 2080, or +197% using the claimed 29.7 TF): 130 fps, only +32% vs the 2080
4080, 24.4 TF (+144% vs the 2080, or +387% using the claimed 48.7 TF): 200 fps, only +104% vs the 2080

See what's happening there? The resulting performance is more in line with the actual TF difference and nowhere near the claimed TF differences. For giggles, the 4090 runs at 228 fps. And remember, the 2080 is the only GPU here that is even RAM-bottlenecked. See why I am not buying it?
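The percentage arithmetic above, redone in a few lines (the fps numbers are the poster's own RE4 1080p figures, and "claimed" TF are the dual-issue headline numbers; none of this is independently benchmarked):

```python
def pct_gain(new, old):
    return (new / old - 1) * 100

fps        = {"2080": 98,   "3080": 130,  "4080": 200}
tf_classic = {"2080": 10.0, "3080": 14.8, "4080": 24.4}  # non-dual-issue TF
tf_claimed = {"2080": 10.0, "3080": 29.7, "4080": 48.7}  # headline TF

for gpu in ("3080", "4080"):
    print(gpu,
          round(pct_gain(tf_classic[gpu], tf_classic["2080"])),  # classic-TF gap
          round(pct_gain(tf_claimed[gpu], tf_claimed["2080"])),  # claimed-TF gap
          round(pct_gain(fps[gpu], fps["2080"])))                # measured fps gap
```

The measured fps gaps track the classic TF gaps far more closely than the claimed ones, which is the post's point.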
 

Mr.Phoenix

Member
That is the problem with this whole thread: all we have is speculation and rumors.

It's fun to try to guess what the PS5 Pro will be, but it's still just guessing.
It's a speculation thread; all we can do is make educated guesses. That's the whole point of threads like these. Those guesses become more accurate with more data, so we can use things like precedent (what Sony has done in the past), fabrication-node availability, AMD GPUs on the market, leaks, common sense... etc. E.g. it would be stupid to assume Sony makes a PS5 Pro that draws 300W of power when they have NEVER made a console that draws anything over 220W under load. Or to say that Sony would take a PC GPU from a specific family of GPUs and clock it higher than it's clocked on the PC. Stuff like that just doesn't happen.
 

winjer

Gold Member
It's a speculation thread; all we can do is make educated guesses. That's the whole point of threads like these. Those guesses become more accurate with more data, so we can use things like precedent (what Sony has done in the past), fabrication-node availability, AMD GPUs on the market, leaks, common sense... etc. E.g. it would be stupid to assume Sony makes a PS5 Pro that draws 300W of power when they have NEVER made a console that draws anything over 220W under load. Or to say that Sony would take a PC GPU from a specific family of GPUs and clock it higher than it's clocked on the PC. Stuff like that just doesn't happen.

Exactly, that's what I'm saying. All we are posting is speculation.
Some of it might even be right, but most will be wrong.
Still, it's fun to try and guess what the next consoles will be. No shame in that.
 

Sertyak

Neo Member
There is a screenshot on the Insomniac leaks subreddit that lists the new tech that will be featured in the game.
One of the new features that caught my attention is AI upscaling (machine learning).
Is this confirmation of a PS5 Pro?
Unless they are doing AI upscaling on a regular PS5, which would probably be pretty difficult.
 

SlimySnake

Flashless at the Golden Globes
One year later. No leaks happened.
lol it's only been a few months. The Alan Wake 2 director didn't even know about the PS5 Pro, so it's possible the devkits haven't been sent out to all third parties yet.

Remember, DF leaked the PS4 Pro specs after they got the official developer slides from a studio. This was in April, I believe, and the console launched in October. It's possible that the devkits had been out for a while before DF got the leaks, and it's also possible that Sony took steps to ensure DF doesn't get leaks this time around.

It's also possible that DF was contacted by Sony to hold off on any leaks and given a day-one exclusive in exchange.
 
Remember, DF leaked the PS4 Pro specs after they got the official developer slides from a studio. This was in April, I believe, and the console launched in October. It's possible that the devkits had been out for a while before DF got the leaks, and it's also possible that Sony took steps to ensure DF doesn't get leaks this time around.

It's also possible that DF was contacted by Sony to hold off on any leaks and given a day-one exclusive in exchange.

Interesting

If this is the case, the PS5 Pro will likely be kitted out with a lot of custom tech, maybe AI upsampling, which they're going to want DF to communicate to the wider audience. I suspect the Road to PS5 talk wasn't well received by Sony after it was released, even though it did age well, mind you.
 

SlimySnake

Flashless at the Golden Globes
Exactly, that's what I'm saying. All we are posting is speculation.
Some of it might even be right, but most will be wrong.
Still, it's fun to try and guess what the next consoles will be. No shame in that.
yeah, there is no right or wrong here. I remember the next-gen spec thread got very contentious because people kept calling each other idiots simply for saying the PS5 wouldn't be 8 tflops like the leaks suggested. I was personally hoping for 14 tflops, so I wasn't right by any means, but I took so much shit for saying 8 was just way too low for the perf-per-watt gains AMD had promised. And in the end both the XSX and PS5 were in double digits.
 

SlimySnake

Flashless at the Golden Globes
There is a screenshot on the Insomniac leaks subreddit that lists the new tech that will be featured in the game.
One of the new features that caught my attention is AI upscaling (machine learning).
Is this confirmation of a PS5 Pro?
Unless they are doing AI upscaling on a regular PS5, which would probably be pretty difficult.
Probably something similar to what GOW Ragnarok did with machine learning. There is machine-learning tech in the PS5 and XSX, just not as elaborate as Nvidia's hardware.

 

Sertyak

Neo Member
Probably something similar to what GOW Ragnarok did with machine learning. There is machine-learning tech in the PS5 and XSX, just not as elaborate as Nvidia's hardware.

You are probably right, I didn't think about that. Anyway, that screenshot has a lot of new stuff for their engine; it will be a pretty big upgrade.
 

SABRE220

Member
You're fucking insane if you think the PS5 GPU is in the 3080 performance category. This whole thread is fucking insane.

PS5 Pro will not even match the 3080 lol.
Have these people even gamed on a 3080 or 4080?
From one extreme to another... yes, the PS5 GPU is no 3080, but I have gamed on a 3080, and the way you are presenting it as some unreachable monster for the current timeline is weird, to say the least. The 3080 is not some ludicrous expectation and would be the bare minimum requirement to justify the Pro console.

That being said, yes, the Pro is never matching the 4080, magic optimization or not, no way in hell.
 

SlimySnake

Flashless at the Golden Globes
Nope. Did you see the size of the OG PS5? I am beginning to think that a lot of people here just talk without really knowing what they are talking about.

The OG 2020 PS5, for all its size, drew ~220W under game load. That dropped to 202W with the 6nm revision, which is the same chip the PS5 Slim uses. That GPU you are talking about on PC doesn't even run at 2.7 GHz; it runs at 2.5 GHz and draws over 260W. And that is the GPU alone. There is no way, not a chance in hell, that the PS5 Pro has a GPU clocked that high when there is also a CPU to contend with and it needs to fit into a console chassis. It just doesn't make sense.

Further, NEVER has a console variant of a PC GPU/CPU matched the PC equivalent in clocks/power consumption. They are always downclocked, and for good reason. If the PS5 Pro is on 5nm, do not expect its GPU to be clocked anywhere above 2.4 GHz. If anything, 2.35 GHz is more likely.

I am looking at the TechPowerUp specs for the 7800 XT, and it says it's on both 5 and 6nm. If it's 6nm and already at 345mm², then at 5nm we are looking at roughly 310mm² for the GPU alone. With a bigger CPU and the PS5's IO block, we are looking at 380mm², plus additional RT cores, so 400mm². And that massive chip running at 2.7 GHz?? Yeah, not happening.

I think it's safer to hope for 17 tflops, and hope the RDNA 4 IPC gains are around 25%, which gets us to that hypothetical 20-tflops range.

With RT becoming more and more important, I'd rather they invest the die space in dedicated RT cores than raw tflops anyway. Same goes for tensor cores for DLSS-like upscaling. I am already having to run some games at DLSS Balanced or Performance (1296p or 1080p internal resolution), so even if by some miracle they get close to 3080 performance, they will still need a proper DLSS solution.
 

buenoblue

Member
From one extreme to another... yes, the PS5 GPU is no 3080, but I have gamed on a 3080, and the way you are presenting it as some unreachable monster for the current timeline is weird, to say the least. The 3080 is not some ludicrous expectation and would be the bare minimum requirement to justify the Pro console.

That being said, yes, the Pro is never matching the 4080, magic optimization or not, no way in hell.
Yeah, a 3080 for a Pro console is reasonable, though personally I think it will be just below that. But someone was comparing the base PS5 to a 3080, and others were saying the Pro will be 4080-class.
 

Mr.Phoenix

Member
I am looking at the tech powerup specs for the 7800xt and it says its both on 5 and 6nm. If its 6nm and already at 345mm2 then at 5nm, we are looking at roughly 310mm2 for the GPU alone. With a bigger CPU and PS5's IO block, we are looking at 380 mm2 plus additional RT cores so 400mm2. And that massive chip running at 2.7 Ghz?? yeah, not happening.

I think its safer to hope for 17 tflops, and hope the RDNA 4 IPC gains are around 25% which gets us to that hypothetical 20 tflops range.

With RT becoming more and more important, id rather they invest the die space in dedicated RT cores than raw tflops anyway. same goes for tensor cores for DLSS like upscaling. I am already having to run some games at DLSS balanced or performance (1296p or 1080p internal resolution) so even if by some miracle they get close to 3080 performance, they will still need a proper DLSS solution.
Naaa... you've got the sizing slightly wrong. The 7800 XT is made up of two kinds of dies: the GCD (200mm²) and 4 MCDs (36.6 × 4 = 146mm²). The GCD, which has the CUs, is on 5nm. The MCDs, which carry the cache and the memory PHY buses, are on 6nm. Oh, and each MCD has 16MB of cache.

The PS5 Pro on 5nm will mean its GPU has a similar footprint, so 200mm². An 8-core Zen 2 on 5nm can't be more than 40mm²; throw in another 80mm² for the 8 memory PHY controllers, 12-16MB of CPU cache (up from 8MB), and slightly bigger RT cores in the GPU... and we end up with an APU that's about 320mm².

They don't have to invest any more in dedicated RT cores; we just have to hope that they are using RDNA 4 RT cores and that those RT cores also handle BVH acceleration, which is the single biggest reason AMD RT sucks right now. If RDNA 4's RT cores do that, then yeah, they should be at least 25-50% bigger than the current RT cores.
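The die-area tally above, added up (all mm² figures are the post's own estimates, not measured values):

```python
# 7800 XT as built: graphics die on 5nm plus four memory/cache dies on 6nm.
gcd_5nm = 200            # GCD with the CUs
mcd_6nm = 4 * 36.6       # four MCDs, ~146 mm² total
print(round(gcd_5nm + mcd_6nm))  # ~346 mm² across both nodes

# Hypothetical monolithic PS5 Pro APU on 5nm, per the post:
apu = 200 + 40 + 80      # GPU + 8-core Zen 2 + memory PHYs/controllers
print(apu)               # 320 mm², before extra cache / bigger RT hardware
```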
 

HeisenbergFX4

Gold Member
There is a screenshot on the insomniac leaks subreddit that lists the new tech that will be featured in the game.
One of the new features that caught my attention is ai upscaling (machine learning).
Is this confirmation of a ps5 pro?
Unless they are doing ai upscaling on a regular ps5, which would be pretty difficult probably.
I think that's a very interesting slide as well, but it's well above my pay grade as to what it actually means for current PS5 vs Pro capabilities.

 

twilo99

Member
It's still the same architecture.

The real difference will have to be made on the software side, where UE5 will have to be further optimized to run on an architecture from 2019... which isn't a bad thing, I guess, because any optimization will trickle up to the newer CPUs.
 
I wonder why they'd go with 60 fps as a base for Wolverine. Weird.

Creative choice; the only real reason most developers target 30 FPS is hardware limitations.

A combat-heavy game like Wolverine will need to rely on the fluidity and responsiveness of 60 FPS; anything lower would be immersion-breaking. It also brings a sense of realism, since the motion is so lifelike.
 

shamoomoo

Banned
There is a screenshot on the insomniac leaks subreddit that lists the new tech that will be featured in the game.
One of the new features that caught my attention is ai upscaling (machine learning).
Is this confirmation of a ps5 pro?
Unless they are doing ai upscaling on a regular ps5, which would be pretty difficult probably.
FP16 and lower-precision types are supported on RDNA 2; the muscle deformation was done with ML. Heck, Sony Santa Monica also used ML to upscale GOW Ragnarok in, I believe, performance mode.

Wolverine is a smaller game than Spider-Man, so all of the PS5's TFLOPS and fill rates can be used to make a "smaller" game god-tier.
 
You're fucking insane if you think the PS5 GPU is in the 3080 performance category. This whole thread is fucking insane.

PS5 Pro will not even match the 3080 lol.
Have these people even gamed on a 3080 or 4080?
I did not say the PS5 is in 3080 territory… I said there are fringe examples of games, like The Last of Us and Rift Apart, where it actually does perform like a 3080 (or 2% better, in The Last of Us' case) because of the environment they are both in.
 

SlimySnake

Flashless at the Golden Globes
Naaa... you've got the sizing slightly wrong. The 7800 XT is made up of two kinds of dies: the GCD (200mm²) and 4 MCDs (36.6 × 4 = 146mm²). The GCD, which has the CUs, is on 5nm. The MCDs, which carry the cache and the memory PHY buses, are on 6nm. Oh, and each MCD has 16MB of cache.

The PS5 Pro on 5nm will mean its GPU has a similar footprint, so 200mm². An 8-core Zen 2 on 5nm can't be more than 40mm²; throw in another 80mm² for the 8 memory PHY controllers, 12-16MB of CPU cache (up from 8MB), and slightly bigger RT cores in the GPU... and we end up with an APU that's about 320mm².

They don't have to invest any more in dedicated RT cores; we just have to hope that they are using RDNA 4 RT cores and that those RT cores also handle BVH acceleration, which is the single biggest reason AMD RT sucks right now. If RDNA 4's RT cores do that, then yeah, they should be at least 25-50% bigger than the current RT cores.
So 200mm² = 54 CUs, but the PS5 Pro will need some extra ones for yields, right? At 2.4 GHz, that gets us to 16.5 tflops. 2.3 GHz would be 15.9 tflops. I think that's what we get.

Of course, that means the RT and IPC gains better be substantial, otherwise, this will be a rather meek upgrade.
 

Gaiff

SBI’s Resident Gaslighter
I did not say the PS5 is in 3080 territory… I said there are fringe examples of games, like The Last of Us and Rift Apart, where it actually does perform like a 3080 (or 2% better, in The Last of Us' case) because of the environment they are both in.
I swear you've got to be a troll. The 6800 outperforms the PS5 by 24% in TLOU. Where are you even getting your numbers from?

There isn't a single game where the PS5 comes within 2% of the 3080, let alone beats it. At worst, the 3080 is over 25% faster.
 
It’d be a bit pathetic if it cannot match Ampere’s second generation RT when it’ll be the third RDNA generation of RT. I really hope it does match it or exceed it. The 3080 doesn’t exactly excel there either.
Your expectations are bizarrely high and low at the same time: you think it's ridiculous the Pro will ever match a 4080 in a single pure raster game, but also think the Pro will have better RT than a 3080???
 
Oh, I know that the 7800 XT is weaker in ray tracing but the rumor suggests that the Pro is RDNA 3.5 so presumably with some new features that will be on RDNA 4, and it will also emphasize ray tracing so I'm thinking it should be quite a bit better on that front if rumors are to be believed. A lot of the ray tracing is still done on the TMUs and AMD's "hybrid" approach isn't delivering amazing results thus far.
Your expectations are backwards: you're greatly exaggerating the RT performance and underselling the raster.
 

Gaiff

SBI’s Resident Gaslighter
Your expectations are bizarrely high and low at the same time: you think it's ridiculous the Pro will ever match a 4080 in a single pure raster game, but also think the Pro will have better RT than a 3080???
The 4080 is 50% faster than the 3080, which in turn is over 60% faster than the PS5's GPU. That makes it over 2.2x faster than the PS5's GPU. How the fuck is it unrealistic to expect the PS5 Pro to land within the ballpark of the 3080 in both raster and RT but fall short of the 4080 by a sizable margin?
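Multiplying the chain out (both ratios are the post's rough figures, not benchmarks):

```python
ratio_4080_vs_3080 = 1.5   # "4080 is 50% faster than the 3080"
ratio_3080_vs_ps5  = 1.6   # "3080 is over 60% faster than the PS5's GPU"

# Compound the two gaps: 4080 relative to the PS5's GPU.
print(round(ratio_4080_vs_3080 * ratio_3080_vs_ps5, 2))  # ~2.4x, i.e. "over 2.2x"
```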

At this point, I’ll assume you have a mental handicap.
 

Bojji

Member
About what I'm expecting. Perhaps 4070 Ti level of performance in exclusive titles, but that's assuming Nixxes and the other studios don't get a better handle on the PC platform. I mean, ND can't possibly make a worse port than TLOU. There's nowhere to go but up, since a lot of the ports were first attempts using engines that had never seen the light of day on PC.

Based on leaks it will be ~7800 XT, so a few % stronger than a 4070. It could also end up weaker than the 7800 XT thanks to console power limits (if they don't just go for a 300W TDP). The 4070 Ti is ~20% better than that, and the 4080 is ~30% better than the 4070 Ti. People expecting 4080 performance are going to be disappointed...

PS5 is usually in that 2070S - 2080S performance ballpark, depending on how good or bad the PC port used for comparison is.
 

rofif

Can’t Git Gud
Exactly! PS5 Pro will be close to a 3080 until ray tracing, then more than likely a 3070 Ti, which is great 😀 don't get me wrong.

But saying the base PS5 is on 3080 level is insane. And saying the PS5 Pro will be a 4080 really can't be serious, right? The PS6 will probably struggle to beat a 4080.

Remember, flop for flop AMD doesn't compete evenly with Nvidia, let alone a cut-back, power-constrained SOC version like the consoles are using.

Now don't get me wrong, we are living in an age of upscaling, DRS and the like, where if a card is twice as powerful the games don't look twice as good. They can look very similar while being quite far apart tech-wise. But specs are specs. A 10 TF AMD SOC GPU ain't outperforming a 29 TF full-size 3080.
No way the Pro will be 3080 performance. The 3080 does exactly 2x the fps on a good day.
 