Call it a waste if you want. It's still higher. And anyway, MS have talked about AI learning in the XSX, meaning they may come up with a DLSS alternative.
I still don't know how an 18% difference in TFLOPS and a 25% difference in VRAM bandwidth…
Side by side comparisons and DF videos are clearly going to show the advantages. Ask anyone playing them apart from each other and it becomes less of an issue. I doubt anyone who takes home a PS5 is going to be disappointed. All of this stuff is just for e-peen measuring, which MS seems to be good at lately. What is the saying? It's not the size that matters but how you use it?
I cropped to hide the labels. My point is that there IS a clear difference between Pro and One X.
The comparison here is 1:1, perfectly fair when representing 4K and working with a much bigger image.
![]()
Hey ma, look at this standalone random screenshot of RDR2 for Xbox One S. Looks fine to me! I'll just go ahead and miss the point entirely and post it anyway. After all, if I have nothing to compare it to, it should look fine on Xbox One S. Who the heck needs next-gen consoles when I can just choose to ignore that there's something better out there?
![]()
Refresh my memory: who is Dusk Golem and why do I care?
He's a Capcom leaker from Reee who is not exactly tech literate, but he does have connections indeed.
Honestly mate, I don't think many people care about either game. The Gunk looks like a cutesy platformer, and that's covered by R&C. Scorn has been in development hell and nobody knows anything about the game. 2 games with little hype behind them from no-name developers.
Yeah, don't forget VRS, which doesn't seem to be in the PS5.
Way too much obfuscation from devs and Sony, tbh. Just fucking come out and say it. I don't want to be tricked into buying a console.
Also 40% more CUs so the Raytracing capabilities will be much better.
2TF vs 0.5TF. If something needs 1TF more power than the PS5 has it doesn’t matter what the percentage difference is, just how much power there is.
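To put that argument in plain numbers, here's a quick sketch. It assumes the commonly quoted peak figures (36 CUs at 2.23 GHz vs 52 CUs at 1.825 GHz), and the 11 TF threshold is purely hypothetical:

```python
# Toy arithmetic: the same absolute TFLOPS gap reads differently as a
# percentage, but an absolute requirement only cares about raw numbers.
# Peak FP32: TFLOPS = CUs x 64 lanes x 2 ops x clock (GHz) / 1000.
ps5_tf = 36 * 64 * 2 * 2.23 / 1000    # ~10.28 TF
xsx_tf = 52 * 64 * 2 * 1.825 / 1000   # ~12.15 TF

gap = xsx_tf - ps5_tf
print(f"Absolute gap: {gap:.2f} TF")               # ~1.87 TF
print(f"Relative gap: {gap / ps5_tf * 100:.1f}%")  # ~18.2%

# Hypothetical requirement: if an effect needs 11 TF, the percentage
# difference is irrelevant; only the raw number decides who clears it.
needed = 11.0
print("PS5 clears it:", ps5_tf >= needed)  # False
print("XSX clears it:", xsx_tf >= needed)  # True
```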
What's so hard to understand? RE8 will be optimized to run at 4K dynamic on PS5.
The PS5 GPU will not stay at 2.23 GHz all the time, so an 18% performance gap is the best-case scenario.
As I predicted earlier, Series X will offer the best gaming performance and smaller volume/weight/heat/power consumption, at the same or lower price than PS5.
Add in Game Pass and the 1TB SSD expansion, and Phil is attacking the COVID-hit gaming community with the best-value, best-designed next-gen experience.
I do hope Sony is not so drunk on PS4's success that they think they can avoid price competition with Series X.
Probably an easier, or I guess different, way of thinking about it, at least as I understood it: the Xbox runs like your standard console (and I'll add here that the whole "sustained" thing is a bit overblown), i.e. it runs at the clock you would assume, just without always having a workload for it on certain screens and so on. The PS5, meanwhile, has the ability to run at the clocks or TF (AKA "variable") that the developers need for a given scene and so on. That helps explain how it's "a new paradigm": it's offering something not previously allowed by consoles to date. Then we just add in the PSU either running constant or varying, etc.
I don't think it'll be at 2.23 GHz most of the time, actually (let's see how many people read past that to realize I'm not insinuating what they probably think I'm insinuating x3). Neither will Series X be at 1.825 GHz most of the time, either. It doesn't make any sense: if your game isn't demanding that much power, why have the GPU stress out at that level? I don't think "variable" frequency or "sustained" clocks are referring to this type of thing.
I believe the variable frequency comes into play in scenarios where a game is actually maxing out the clocks. So it has more to do with sustaining that power over a long period of time as best as possible, based on the stress of the power load. In scenarios where the clock is being maxed, if the power budget is exceeded and the GPU needs more power, the system will siphon some of the power budget from the CPU if able. If not, the GPU will downclock. It bases this on power, not heat, and the frequencies are adjusted based on any reduction in the power load.
It works differently on the Series systems; again, they aren't going to be at max clocks all the time, because a lot of game logic doesn't even require that. However, for points where max clocks are required, the systems determine the power budget based on the production of heat, and rather than using that to lower the clocks, they use it to increase the cooling. So unless the cooling somehow fails, a game is guaranteed those max clocks at those frequencies for as long as it needs them.
Keep in mind too that in both systems' cases (PS5, Series X/Series S), you don't necessarily need max clocks to be active to cause the scenarios mentioned above. For PS5, I'd imagine that frequent peaks at high clocks for either CPU or GPU in short periods would run a chance of the power supply not being able to draw/produce enough power quickly enough to distribute throughout the system? Dunno, I don't really know how PSUs work xD.
For Series X/Series S, they have to contend with residual heat dissipation, which could happen if excess heat isn't removed from the system fast enough in between peaks of high power usage. This is where you can run into the typical overheating issue of virtually any system (PS5 can also run into this, hence why it needs very good cooling even though it monitors power draw rather than heat production to determine the level of cooling), which can cause a shutdown. However, I'd assume this won't be an issue for the Series systems, since the One X had an extremely good cooling system, and I'd also say it shouldn't be a factor with PS5, either.
All speculation on my part, but it at least sounds like it makes the most sense.
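In the same speculative spirit, here's a toy model of the two strategies as described above. Every number, function name, and curve here is made up for illustration; it's a sketch of the logic being described, not how either console's firmware actually works:

```python
# Toy model (all numbers made up) of the two clock strategies described above.

GPU_MAX_CLK = 2.23   # GHz, PS5-style variable clock
CPU_BUDGET = 30.0    # watts, hypothetical CPU share of a fixed power budget
GPU_BUDGET = 170.0   # watts, hypothetical GPU share

def ps5_style(gpu_demand_w: float, cpu_used_w: float) -> float:
    """Fixed total power budget: spare CPU watts shift to the GPU;
    if demand still exceeds the budget, the GPU downclocks slightly."""
    budget = GPU_BUDGET + max(0.0, CPU_BUDGET - cpu_used_w)
    if gpu_demand_w <= budget:
        return GPU_MAX_CLK
    # Power scales super-linearly with frequency, so a small clock drop
    # frees a disproportionate amount of power (cube-root relation here).
    return GPU_MAX_CLK * (budget / gpu_demand_w) ** (1 / 3)

def xsx_style(heat_w: float) -> tuple[float, float]:
    """Fixed clock: the frequency never moves; the fan speed does."""
    fan_pct = min(100.0, heat_w / 2.0)  # made-up fan curve
    return 1.825, fan_pct

# Heavy scene: the GPU wants 180 W while the CPU only uses 15 W.
print(f"{ps5_style(180.0, 15.0):.3f} GHz")  # 2.230 (budget shift covers it)
# Same scene, but the CPU is maxed out too.
print(f"{ps5_style(180.0, 30.0):.3f} GHz")  # 2.188 (small downclock)
print(xsx_style(180.0))                     # (1.825, 90.0)
```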
He's not wrong, but both systems have GEs. Unless AMD rebranded the GE for RDNA2 and Sony for whatever reason are going with the RDNA1 terminology there. I don't see that really being the case, and I also don't see AMD just outright removing the GE from RDNA2 and onward, either unless, again, it's been retooled and renamed.
Taking the more favorable option here, let's just say both systems have GEs. But we do know Sony have made some customizations with theirs. The question is if MS has. Only a few more days to find out (hopefully)!
2 TF more is a lot for rendering, dude; there will be SOME differences. That doesn't mean the PS5 isn't well designed, but XSX is just better in numbers.
This is your 2 TF, I mean, 1.8 TF stronger GPU in numbers. Not bad for a weaker GPU, isn't it?
Whilst I'm here, I might as well post the rest. It's important to note that all the below are theoretical maximums, regardless of whether the clocks are fixed or not:
Extrapolated from RDNA1:
Triangle rasterisation is 4 triangles per cycle.
PS5: 4 x 2.23 GHz = 8.92 billion triangles per second
XSX: 4 x 1.825 GHz = 7.3 billion triangles per second
Triangle culling rate is twice the number of triangles rasterised per cycle.
PS5: 8 x 2.23 GHz = 17.84 billion triangles per second
XSX: 8 x 1.825 GHz = 14.6 billion triangles per second
Pixel fillrate assumes 4 shader arrays with 4 RBs (render backends) each, and each RB outputting 4 pixels. So 64 pixels per cycle.
PS5: 64 x 2.23 GHz = 142.72 billion pixels per second
XSX: 64 x 1.825 GHz = 116.8 billion pixels per second
Texture fillrate is based on 4 texture units (TMUs) per CU.
PS5: 4 x 36 x 2.23 GHz = 321.12 billion texels per second
XSX: 4 x 52 x 1.825 GHz = 379.6 billion texels per second
Raytracing in RDNA2 is alleged to come from modified TMUs.
PS5: 4 x 36 x 2.23 GHz = 321.12 billion ray intersections per second
XSX: 4 x 52 x 1.825 GHz = 379.6 billion ray intersections per second
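For anyone who wants to check that arithmetic, here's a small script reproducing the figures above. It assumes the RDNA1-style per-clock rates stated (4 triangles rasterised per cycle, culling at twice that, 64 pixels per cycle, 4 TMUs per CU); again, theoretical maximums only:

```python
# Back-of-envelope peak rates from clock (GHz) and CU count, per the
# RDNA1-extrapolated assumptions above. All results are in billions/second.
def peak_rates(name: str, clk_ghz: float, cus: int) -> None:
    raster = 4 * clk_ghz        # triangles rasterised: 4 per cycle
    cull = 2 * raster           # culling: twice the raster rate
    pixels = 64 * clk_ghz       # pixel fillrate: 64 pixels per cycle
    texels = 4 * cus * clk_ghz  # texture fillrate: 4 TMUs per CU
    print(f"{name}: {raster:.2f} Gtri/s rasterised, {cull:.2f} Gtri/s culled,")
    print(f"      {pixels:.2f} Gpix/s, {texels:.2f} Gtexel/s")

peak_rates("PS5", 2.23, 36)   # 8.92, 17.84, 142.72, 321.12
peak_rates("XSX", 1.825, 52)  # 7.30, 14.60, 116.80, 379.60
```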
Really?? Both? So, VRS isn't mentioned for PS5, and therefore PS5 doesn't have VRS.
The Geometry Engine isn't mentioned for XSX, yet XSX has it.
Can you explain the double standard to me?
And you can't just calculate the numbers like that. One console has variable clocks, and I also really don't think the bigger GPU has the same amount of the other units as the PS5 GPU. That doesn't make sense at all.
But I haven't even been saying PS5 doesn't have "VRS". I just said VRS is an MS-branded term.
But this isn't making too much sense to me. If they're enabling the ability to run at the clock or TF the developer needs for the scene, what does that imply about other systems? Does it mean that other systems were providing an excess of frequency or power for the task at hand, wasting power as a result? Right now I'm typing this post on a PC with my Task Manager open at all times, running an image editor, with a bunch of other tabs open. Integrated graphics (don't kill me xD), but nowhere near 100% of system resources being stressed. Nowhere near 100% power being drawn, as my system cooling is literally silent. And this is a PC.
I just don't think there have really been cases of any system with fixed clocks drawing an excess of power to waste on rendering that doesn't require higher frequency usage, not for any lengthy period of time anyway. You may get a few frames upon a reduction in system frequency utilization where the power load takes a bit to calm down to match the lower utilization levels (this might be what you're referring to?), but that kind of fits more with what I was saying earlier in the post you responded to, and it still comes down to how efficient the cooling in the system is (extremely good cooling staves this off for the most part, because even at higher clocks such a system would never generate enough heat to start needing to draw in more power to sustain those clocks, and the cooling won't have to work so hard in the first place as a result).
I can imagine that with PS5's setup the power and cooling can make those adjustments a few ms faster, but in real-world practice that probably won't make more than a 1%-3% difference in gains compared to the sort of "fixed" frequency setup of a contemporary platform. However, for Sony's specific design it very likely brings a ton of performance gains compared to implementing a comparable "fixed" clock setup on PS5; in fact, they more or less said as much in Road to PS5.
Well, 1st: integrated graphics.... lol, kidding. But as for that, we know that in the past they let the frequency stay sort of set and let the power fluctuate. We also, for the most part, know that you can in fact change a frequency multiple times in a scene. I personally don't think PC is the right comparison, and it's also worth noting that since this is relatively new and different, it's all theory until it's in front of us. However, based on what we know, they have indeed said they keep the power value at a set amount and don't let it vary, and went with a varying frequency based on load (i.e. what the scene requires).
Yes, you can. Because PS5 will be at max clocks most of the time, and clocks won't drop dramatically, only a couple of percent, as Cerny mentioned. And those stressed scenes happen rarely.
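For scale, here's what "a couple of percent" does to the headline figure, using the standard peak-TFLOPS formula (the 2% drop is just an illustrative pick):

```python
# TFLOPS = CUs x 64 lanes x 2 ops per clock x clock (GHz) / 1000
cus, clk = 36, 2.23
peak = cus * 64 * 2 * clk / 1000      # ~10.28 TF at full clock
dropped = peak * 0.98                 # a hypothetical 2% downclock
print(f"{peak:.2f} TF -> {dropped:.2f} TF")  # 10.28 TF -> 10.07 TF
```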
Dark theme, yo
Maybe I am being picky here, but what does "most" mean? 99%? 51%? Or maybe some other percentage?
"Regarding locked profiles, we support those on our dev kits; it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
Wait, so PS5 struggles with 4K, but it's apparently a cakewalk for Series X, despite the gap in raw power being smaller than it is this gen?
Hmm
You are correct; technically speaking, no modern GPU will run at fixed clocks all the time. I was thinking only about demanding gaming scenarios where the GPU will need to run at max clocks for an extended period of time. XSX will offer a sustained level of performance in such a scenario for sure.
Sorry, will change to dark. LOL
Except that it HAS been mentioned.
DF said it was 4K. Unless Capcom have some tricks in play to make it look 4K.
So if the 1080p constraints will be fixed by launch, what about the 4K constraints?
Elog... Elog... you know that's not what I'm saying. What I'm saying is simply that Series systems DO have customizations... and they do. But people tend to ignore that reality and just say they are "PCs in a box". Which, I mean, relative to older systems like PS2, GameCube, Saturn, SNES, Mega Drive, etc., BOTH of them are PCs in a box. They're both x86-based, and that is primarily a PC architecture.
What you're referring to as "information" is either unsubstantiated rumors (the RDNA3 ones from people like MLID were destroyed by a PS5 software engineer on Twitter, btw) or patents that may or may not be reflective of actual hardware in the PS5. There are just as many such patents relating to technologies that may or may not be in the Series systems; what we're trying to gauge is the probability of such patents showing up in a finished retail product.
The post on this very page
Wait, so you think that just because the PS5 has more power than the One X, it will be able to do everything at native 4K, just because it's the same number of pixels?
Why do you think not all games are natively 4K on the X? Different engines and games require different amounts of power to run at 4K.
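To illustrate the pixel-count side of that: resolution fixes how many pixels you shade, but the per-pixel cost is the engine's problem, which is why some engines hit native 4K and others don't. A quick sketch:

```python
# Pixel counts per frame, relative to 1080p, and the per-second load at 60 fps.
resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
for name, px in resolutions.items():
    print(f"{name}: {px:,} pixels ({px / resolutions['1080p']:.2f}x 1080p)")

# Native 4K at 60 fps means shading ~498M pixels every second (before
# overdraw); whether that fits the budget depends on the engine's cost
# per pixel, not on the resolution itself.
print(f"4K @ 60 fps: {resolutions['4K'] * 60:,} pixels/s")
```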
Official statement, thanks. I can also say: PS5 has VRS.
So, PS5 has VRS because it is mentioned on this very page
Okay, the astroturfing, at least for the typical Xbox consumer, worked. The lies have started. The differences are compared in %. Source: basic mathematics. Always the same script. Tell me, do you believe that, or are you lying on the internet to "win the argument"?
Jesus. No, percentage is just one metric, and it doesn't tell the whole story. Xbox's TF advantage is nearly 2 TF; nothing will change that. On top of that, there are around 40% more compute units, which are critical when it comes to raytracing.