Quote: "So what was that Spider-Man SSD demo about?"
Same configuration? Devkit?
This "variable frequency" is a marketing trick, you see through this, right? Otherwise they would've just said it's always 2.23Ghz, but because the gap would be too large between the GPU's they decided to come with this variable frequency thing. Which means, if we up the GPU, we get a lower CPU, and vice versa. You can't have both, otherwise they would've done this out of the box.
So each game on PS5 now has to decide whether it wants to use less CPU or less GPU power to match some aspects of the Xbox Series X.
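For what it's worth, the trade-off being described here is just a shared power budget: give the GPU a bigger slice and the CPU has less headroom, and vice versa. A toy sketch of that idea (all numbers and the linear power-to-clock scaling are invented for illustration; this is not how the console's actual power management works):

```python
# Toy model of a shared CPU/GPU power budget (all figures are made up for illustration).
TOTAL_BUDGET_W = 200                  # hypothetical total SoC power budget
CPU_MAX_GHZ, CPU_MAX_W = 3.5, 60      # hypothetical CPU peak clock and peak draw
GPU_MAX_GHZ, GPU_MAX_W = 2.23, 180    # hypothetical GPU peak clock and peak draw

def clocks_for_split(gpu_share: float) -> tuple[float, float]:
    """Give the GPU a share of the budget, hand the rest to the CPU,
    and scale clocks linearly with power (real silicon is not linear)."""
    gpu_w = min(GPU_MAX_W, TOTAL_BUDGET_W * gpu_share)
    cpu_w = min(CPU_MAX_W, TOTAL_BUDGET_W - gpu_w)
    return CPU_MAX_GHZ * cpu_w / CPU_MAX_W, GPU_MAX_GHZ * gpu_w / GPU_MAX_W

for share in (0.70, 0.80, 0.90):
    cpu_ghz, gpu_ghz = clocks_for_split(share)
    print(f"GPU gets {share:.0%} of the budget -> CPU {cpu_ghz:.2f} GHz, GPU {gpu_ghz:.2f} GHz")
```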
Exactly my point: it has to function within a thermal limit, so it will run at dynamic clocks. You can compare this to native vs. dynamic resolution. If you think dynamic clocks that top out at 10.28 TFLOPS and 3.5 GHz are comparable to the Series X, then I don't know what to say to you.

Here's what it says:
Nowhere does it say "a variety of factors". Console developers will always need to work within the thermal and computational constraints of a system.
The bottom line is that it's a different approach. One is not better than the other and a "fixed" clock speed doesn't result in more "stability". It's down to the developers at the end of the day.
Quote: "Don't put words in my mouth. And pls read my post again."
You didn't say developer friendly?
Quote: "I'm trolling? Go and read the Eurogamer article on this; they clearly explain that those CPU and GPU clocks will not be maintainable and will depend on a variety of factors. Now put that against the Series X, which can set a target performance and maintain it all the time. You're obviously talking without any technical knowledge on this and just making your own assumptions."
Or maybe listen to the lead architect, who said that thanks to the cooling system the clocks will mostly change by around 2-3%, because that's enough to deal with temperatures.
Quote: "Wait, but some of the RAM on Xbox is slower?"
It's still faster: 320-bit vs. 256-bit.
It was a weird few weeks. The PS5 had been rumoured to be around 12 TF for a long time (with most saying a bit better than the SeX), but then recently it morphed into 14, and then some believed that "up to 15 TF" spec sheet someone posted.

But I have little sympathy for those on the 13-15 TF hype train who got taken in. They refused to hear anything or accept any insiders that didn't paint the PS5 as superior. The better the PS5 specs anyone claimed, the more credence they gave them. They refused to hear ANYTHING about GitHub and harassed the hell out of anyone who claimed Oberon was the PS5.
Quote: "Wait, but some of the RAM on Xbox is slower?"
Yes, probably a 5-6% for the XSX.
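On the "slower RAM" point: memory bandwidth is roughly bus width times per-pin data rate. A quick check, assuming 14 Gbps GDDR6 on both machines and the commonly reported bus widths (320-bit for the Series X, 256-bit for the PS5, with the Series X's extra 6 GB sitting behind an effectively 192-bit slice; treat the exact layout as an assumption):

```python
# GB/s ~ (bus width in bits / 8) * per-pin data rate in Gbps (decimal GB, as vendors quote it)
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float = 14.0) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(320))  # Series X, fast 10 GB pool  -> 560.0 GB/s
print(bandwidth_gbs(256))  # PS5, uniform 16 GB         -> 448.0 GB/s
print(bandwidth_gbs(192))  # Series X, slower 6 GB pool -> 336.0 GB/s
```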
Quote: "Exactly my point: it has to function within a thermal limit, so it will run at dynamic clocks. You can compare this to native vs. dynamic resolution. If you think dynamic clocks that top out at 10.28 TFLOPS and 3.5 GHz are comparable to the Series X, then I don't know what to say to you."
I do take your point, but my initial point was that having a fully fixed clock speed at all times doesn't make any sense. A better way to have put it is that the GPU and CPU are capable of running at their maximum clock speeds at all times. I think it's a matter of semantics. The Xbox SX is better if it can sustain its clocks indefinitely and the PS5 can't, but if they are 'fixed', I still think that is absolutely silly.
Quote: "Or maybe listen to the lead architect, who said that thanks to the cooling system the clocks will mostly change by around 2-3%."
I'd rather look at facts. Mark Cerny is just throwing fancy terms around to make his console seem better than it is, just like he did with the PS4 and PS4 Pro. You can name it what you want, but this is just boost mode, and boost mode is not sustainable. Of course, this will all depend on how demanding the game is.
Maybe the Eurogamer guys are right, but hey, it's not like we don't have info from a real professional here; give them some credit instead of acting like they don't exist.
Quote: "What tells me about Kleegamefan, O'dium and OsirisBlack and their stated numbers is that MAYBE the PS5 dev kits were more than 12 teraflops; that's the only explanation for why they were saying numbers that were really off the mark compared to 10.3 TF, to be honest with you guys."
Or, like some have suggested, since RDNA2 is brand new, it was 12 or 13 GCN TFs, which misled some to believe... especially since RDNA2 wasn't a thing yet. And, as Cerny goes to great lengths to point out in his presentation, the CUs in the different architectures are vastly different.
Maybe the PS5's dev kits were more powerful than the XSX, because that is what Klee even said: these are dev kit numbers, and retail might change.
Quote: "Like I said: it CAN be on all the time. If something is under stress, you can't have a lower clock for that stress situation."
The system will crash and fail if both run at full power for a longer period of time. It will just shut off due to heat issues.
Damage control.... From the other forum:
“ XSX vs PS5 specs break down as...
CPU: 3.6GHz vs 3.5GHz 8C/16T Zen 2 (Xbox wins +3%)
GPU: 52CUs@1.825GHz vs 36CUs@2.23GHz RDNA2 GPU (Xbox wins +17% shading and RT but PS5 wins +20% rasterisation)
RAM: 10GB@560GB/sec + 6GB@336GB/sec vs 16GB@448GB/sec (Xbox wins ~20-25% more bandwidth, total amount is a tie)
SSD: 1TB@2.4GB/sec vs 825GB@5.5GB/sec (PS5 wins speed +110%, Xbox wins capacity)
The consoles are within spitting distance of each other. Everyone who said they were both super close, and way more close than any of the consoles this generation, were right.”
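Recomputing the deltas straight from the numbers quoted above, as a sanity check (the raw SSD gap actually works out closer to +129% than the +110% claimed):

```python
# Percentage advantages recomputed from the quoted spec-sheet figures.
def pct(a: float, b: float) -> float:
    """How much larger a is than b, in percent."""
    return (a / b - 1) * 100

print(f"CPU clock:       +{pct(3.6, 3.5):.0f}% Xbox")               # ~ +3%
print(f"GPU compute:     +{pct(52 * 1.825, 36 * 2.23):.0f}% Xbox")  # ~ +18%
print(f"GPU clock:       +{pct(2.23, 1.825):.0f}% PS5")             # ~ +22%
print(f"Peak bandwidth:  +{pct(560, 448):.0f}% Xbox")               # ~ +25%
print(f"Raw SSD speed:   +{pct(5.5, 2.4):.0f}% PS5")                # ~ +129%
```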
Quote: "I do take your point, but my initial point was that having a fully fixed clock speed at all times doesn't make any sense. A better way to have put it is that the GPU and CPU are capable of running at their maximum clock speeds at all times. I think it's a matter of semantics. The Xbox SX is better if it can sustain its clocks indefinitely and the PS5 can't, but if they are 'fixed', I still think that is absolutely silly."
From a technical standpoint, fixed clocks make way more sense; it's why Sony had fixed clocks on their previous consoles. Sony obviously just overclocked the GPU because they couldn't afford to be too far behind, but that overclock is not sustainable. How sustainable it is will depend on many factors and the type of game, so we don't know how it will perform.
Quote: "That's why Sony surely invested in some serious cooling solution. I'm sure they know what they are doing."
Then why even have "variable frequency"? If there isn't a problem, then why talk about it, why put it in your system? It's apparently not needed.
It's not even close. All the CPU and GPU numbers are boost clocks, and therefore you haven't accounted for the fact that they will spend most of the time at lower clocks.
Quote: "So PS5 native 2.0GHz = 9.2TF and with boost clock to 2.23GHz PS5 = 10.3TF?"
Aye aye, Captain.
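Roughly, yes. The teraflop figures fall out of a simple formula: CUs × 64 shader ALUs × 2 FLOPs per clock × clock speed (note that whether 2.0 GHz is really the PS5's "native" clock is the poster's assumption, not something Sony has stated):

```python
# Peak FP32 TFLOPS = CUs * 64 ALUs per CU * 2 FLOPs per ALU per clock * clock (GHz) / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

print(f"PS5 at 2.00 GHz:       {tflops(36, 2.00):.2f} TF")   # ~9.22 TF
print(f"PS5 at 2.23 GHz:       {tflops(36, 2.23):.2f} TF")   # ~10.28 TF
print(f"Series X at 1.825 GHz: {tflops(52, 1.825):.2f} TF")  # ~12.15 TF
```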
No, they really weren't cardboard.
Quote: "I'd rather look at facts. Mark Cerny is just throwing fancy terms around to make his console seem better than it is, just like he did with the PS4 and PS4 Pro. You can name it what you want, but this is just boost mode, and boost mode is not sustainable. Of course, this will all depend on how demanding the game is."
There are no fancy terms regarding this.
Quote: "From a technical standpoint, fixed clocks make way more sense; it's why Sony had fixed clocks on their previous consoles. Sony obviously just overclocked the GPU because they couldn't afford to be too far behind, but that overclock is not sustainable. How sustainable it is will depend on many factors and the type of game, so we don't know how it will perform."
If you look at the TDP of the consoles, though, which is 150W vs 180W, the Xbox consumes 20% more energy than the PS5, which falls exactly in line with the increase in performance. I think it's a big stretch to say that the PS5 is being overclocked based on the TDP.
Quote: "You might want to check the video again."
Check out the guy on the left:
Quote: "You might want to check the video again."
It was the Forza 5 crowd.
Quote: "You might want to check the video again."
I didn't focus on them. I was trying to work from home on one laptop and listen to and skim the PS5 show on my other laptop.
@Mod of War I am here and at your mercy, proceed with my flogging as you desire.
Quote: "There are no fancy terms regarding this."
Let's say it's 5%; that would make it 9.7 TFLOPS (exactly what I think it runs at by default). While 5% might be tiny, it's not a small number in terms of the overall power of the GPU. I'm just saying.
He straight up said that the changes are going to be extremely tiny. He is the lead architect; he knows every single bit of this machine, while you have a spec sheet and PC comparisons.
Quote: "2.23 GHz isn't needed for... PS4 games. That's why it is a variable frequency."
Lol dude, yeah, sure that's why he put it there: to confuse everybody, purely for those 100 PS4 games that are BC.
Quote: "You might want to check the video again."
What about here?
Quote: "Ah hell no, you were the closest out of everyone except GitHub."
Tommy was unbanned for getting SeX right.
Now Tommy Fisher, it's too bad he's already been banned. I wonder if he's actually TimDog; I could see him totally astroturfing everyone. Too bad he's already been banned too.
Quote: "SMH, you must be one of those folks that thinks data streaming at a paltry 7GB/s is gonna change gaming when GDDR6 operates at 448GB/s on PS5. Sure, it will help with minor stuff, but nothing game changing like you think. We all saw how bad the Xbox One was with 68GB/s GDDR3, and you're expecting miracles from 7GB/s Virtual RAM."
Both next-gen consoles' custom SSDs, compared to the extremely slow ~100 MB/s HDDs we've been stuck with since the PS3 era (and now PS4) for over a decade: yes, this is a game changer. You folks probably don't understand this, but devs do. Also, the XB1 has DDR3, not GDDR3. And I think you're also confusing yourself by mixing up SSD speeds with memory bandwidth if you're comparing 7GB/s (SSD throughput) with the 68GB/s total system memory bandwidth of an XB1. These are two completely different things.
Quote: "Let's say it's 5%; that would make it 9.7 TFLOPS (exactly what I think it runs at by default). While 5% might be tiny, it's not a small number in terms of the overall power of the GPU. I'm just saying."
It is, because it is 5% lol.
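To put numbers on the 2-3% versus 5% argument: teraflops scale linearly with clock, so an X% clock drop is an X% compute drop. Starting from 10.28 TF at 2.23 GHz:

```python
# TFLOPS scale linearly with GPU clock, so a small clock drop is an equally small compute drop.
PEAK_TF, PEAK_GHZ = 10.28, 2.23

for drop in (0.02, 0.03, 0.05):
    print(f"-{drop:.0%} clock -> {PEAK_GHZ * (1 - drop):.2f} GHz, {PEAK_TF * (1 - drop):.2f} TF")
# -2% -> 2.19 GHz, 10.07 TF
# -3% -> 2.16 GHz,  9.97 TF
# -5% -> 2.12 GHz,  9.77 TF
```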
Quote: "Tommy was unbanned for getting SeX right."
Would not surprise me if he was from the RDX team.
But ya, his PS5 claim was way off like most of the others.
I think he was an MS plant. That ntkrnl guy from 5 years ago seemed like one too.
Quote: "I find it funny that so many PS fans are imploding and so many Xbox fans are gloating without understanding the specs, or only making a comparison about TF. People got the wrong idea: the Xbox One and PS4 had a disparity between the systems, and it wasn't TF, people, it was the RAM bottlenecking. There are more metrics than just TF, and there are benefits to a lower CU count at high frequency versus a higher CU count at lower frequency. It seems Sony decided to go with the least-bottlenecked, highly optimized approach, but on the other hand we really don't know the performance benchmark between the two until they are put to the test.

As far as price, people seem to think that since Sony has fewer TF it should naturally have a $100 lower price tag, without accounting for the highly specialized SSD and the heavily modified APU with a 3D audio processor. Now, this isn't to say I don't have my own set of questions or concerns. First, as I said, the TF is not a problem and is more than sufficient, but the high frequency is alarming, making me wonder what type of cooling solution they are coming up with (BTW, this isn't a panic reaction from Sony, Xbox fans, it is intentional; I just want to know the how, and what made them go with that specific solution). A caveat to that is the fact that the NVMe SSD size matters, which tells me the PS5 enclosure must be extremely tight, so I don't know how the air circulation is going to work on this thing.

Lastly, the SSD speed is otherworldly and is a game changer in a different sense. If you are going to compare it to the TF, my friend, you are comparing apples and oranges; this SSD is going to allow us to play open-world games with an immersion we have yet to experience. But this is where my concern lies: while the PS5 has this high tech, the controller function of the specialized SSD concerns me when dealing with third-party solutions to increase the storage. Cerny himself expressed that as third-party SSD upgrades arrive as new solutions, each unit has to be tested individually; the impact on gameplay could be drastic if they don't conform to Sony's specialized SSD standards.

In the end, I like both systems and plan on ordering both. Xbox seems to be approaching a more PC-esque system, and Sony is going toward a more console-style, RISC-based approach. I think the pricing will be similar, but I do have some questions about both units. I think both sides of gamers are winners and shouldn't be angry over spec differences, but rather let the developers and artists make their masterpieces."
Let me guess: you legitimately think that you understand the specs better based on the wall of text you wrote defending the PS5. The truth is that Sony did something unprecedented, in a very bad way, in the console space by going with insane, variable clock speeds just so as not to seem completely overpowered by the XSX. Cerny himself admitted that there is no way that BOTH the CPU and the GPU will run at these peak clock speeds at the same time, but that went over your head. Oh, and guess what else you need, other than a fast hard drive, for these open worlds? A fast CPU (for a decent frame rate), a powerful GPU, and a fast RAM configuration to feed that GPU to make these worlds come true, and the XSX is killing the PS5 in everything, even if we assume that the PS5 will run at its peak clock speeds at all times (spoilers... it will not).
Quote: "You didn't say developer friendly?"
Developer friendly machine.
Quote: "Both next-gen consoles' custom SSDs, compared to the extremely slow ~100 MB/s HDDs we've been stuck with since the PS3 era (and now PS4) for over a decade: yes, this is a game changer. You folks probably don't understand this, but devs do. Also, the XB1 has DDR3, not GDDR3. And I think you're also confusing yourself by mixing up SSD speeds with memory bandwidth if you're comparing 7GB/s (SSD throughput) with the 68GB/s total system memory bandwidth of an XB1. These are two completely different things."
That GDDR3 was obviously a typo. You're still forgetting that GDDR memory works together as one pool and is therefore far more powerful as a data pool than any SSD.
The funny thing here is that it is indeed amazingly fast, and theoretically it should be able to load more assets, but then you are stuck with those small SSD sizes. The size of a game that could really take advantage of this would be insane. I saw someone calculate that theoretically the PS5 is able to load about 70GB of assets in 10 seconds, and the XSX about 44GB. What size of game needs to load that amount of assets in only 10 seconds? This is mostly going to affect loading times, and the PS5's loading time will be 2 seconds where the XSX will take 4 seconds.
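For reference, the arithmetic behind those figures: the 70 GB and 44 GB numbers imply effective (compressed) throughput of roughly 7 GB/s and 4.4 GB/s, which is the poster's assumption rather than an official number. Using the raw sequential speeds instead:

```python
# How much data each drive streams in 10 s, and how long it takes to fill 16 GB of RAM,
# using the raw (uncompressed) sequential read speeds.
for name, raw_gbps in (("PS5", 5.5), ("Series X", 2.4)):
    print(f"{name}: {raw_gbps * 10:.0f} GB in 10 s, {16 / raw_gbps:.1f} s to fill 16 GB")
# PS5: 55 GB in 10 s, 2.9 s to fill 16 GB
# Series X: 24 GB in 10 s, 6.7 s to fill 16 GB
```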