What rebuttals? You use a boost clock as a direct measurement against a fixed clock; that shows how silly your argument is.
(In reply to: "Says someone who does not provide any citations and is incapable of directly addressing rebuttals.")
Was this put up?
Btw, Matt from the other site was right on the money. He said XSX will have an edge, but nothing major, and when pushed he said 15% last year. I prefer the 15% figure because it gets closer to the "within 10%" I have been saying all along.
On the hardware side I'm not even upset, but it puts them in a pretty shit position for the entire generation. They will be playing catch-up, and there is no sauce created that can span that gulf. XSeX is definitely more powerful. Also, clearing up a bunch of bullshit I've read: PS5 does constantly run in boost mode, but I don't think that is a good thing (own opinion). Can't imagine sustaining that high a clock speed is going to be quiet after hours of playing anything. Never cared who won; I will still be getting both systems for exclusives, but XSeX should be the multiplat king.
(In reply to: "Any explanation on what happened? Bad sources?")
And the overclocked PS5 GPU is still 15% slower than Series X GPU in the best case scenario, so I'm not sure what your point is.
The GPUs use the same microarchitecture, but as Cerny said, a higher-clocked 36 CU GPU performs better than a lower-clocked 42 CU GPU at similar TFLOPS, because the higher clock speed has a trickle-down effect on other parts of the chip, such as L1/L2 cache speeds and rasterization speed. Having a dedicated SPU-like chip for audio helps offload audio processing from the GPU. When AMD talked about SmartShift, they said it gives a free 10% performance boost to devices over those that do not use it. Shifting power in favor of the GPU or the CPU matters because not all games are created equal: we have seen games like Assassin's Creed Unity be more CPU-intensive than GPU-intensive, which is why the XB1 version actually performed better than the PS4 version.
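To put rough numbers on the clocks-versus-CUs trade-off, here is a back-of-envelope sketch using the standard RDNA shader math (64 shaders per CU, 2 FLOPs per shader per cycle) and the publicly stated console clocks; it only captures raw compute, not the cache and rasterizer effects described above:

```python
# Back-of-envelope FP32 compute: CUs * 64 shaders/CU * 2 FLOPs/shader/cycle * clock (GHz)
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # ~10.28 TF at the 2.23 GHz peak boost clock
xsx = tflops(52, 1.825)   # ~12.15 TF at the fixed 1.825 GHz clock
print(f"PS5 ~{ps5:.2f} TF, XSX ~{xsx:.2f} TF, XSX/PS5 = {xsx / ps5:.2f}")
# Note: raw TFLOPS says nothing about cache or rasterizer throughput; the clock-speed
# benefits to those blocks are exactly what this simple figure fails to capture.
```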
Note how I actually paid attention to the fine details and even cited old games as examples. What do you have to back up your points? Unfounded assertions ad nauseam.
The gap isn't large enough to make a big difference visually. For multiplats, the PS5 version could run at 75% of the XSX version's resolution with Radeon Image Sharpening and it would look identical, or at least very close.
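For concreteness, here is what a 75% resolution scale works out to in raw pixel terms; the 2160p baseline and the per-axis reading of "75%" are assumptions for illustration only:

```python
# What "75% of the resolution" means in pixel terms, assuming a 2160p baseline
# and reading 75% as a per-axis scale factor (both assumptions for illustration).
base_w, base_h = 3840, 2160
scale = 0.75
scaled_w, scaled_h = int(base_w * scale), int(base_h * scale)   # 2880 x 1620
pixel_share = (scaled_w * scaled_h) / (base_w * base_h)         # 0.5625
print(f"{scaled_w}x{scaled_h}, {pixel_share:.0%} of the native pixel count")
```

Read per axis, 75% works out to only about 56% of the pixels, a deeper cut than the compute gap alone would demand; read as a share of total pixels instead, it would land closer to 1870p.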
You claimed that overclocking a 2080 does not make it perform close to a 2080 Ti. Benchmarks showed that you are wrong. Note how you provided no benchmarks to back your claims.
(In reply to: "What rebuttals? You use boost clock as direct measurement against fixed clock, that shows how silly your argument is.")
For example, you can overclock a 4-core CPU, but it will still not be able to handle the same workload as a 6- or 8-core CPU; you won't cut down the difference by much. Plus, having less RT hardware is a different matter because of how much ray tracing impacts performance.
No, but I'm sure DF will try to replicate these conditions soon by under- and overclocking graphics cards.
(In reply to: "Do you have examples of graphics card benchmarks showing this in practice? I remember that the Vega 64 did not perform as well as the TFLOPS would suggest compared to the 56, but it would be nice to see tests done with RDNA cards, or even Nvidia for that matter.")
The 15% will likely end up lower in practice due to other parts of the GPU being clocked significantly higher on PS5. We will have to wait and see how much that matters.
(In reply to: "Damn, that's crazy. Do you think PS5 can keep up with XSX performance-wise? I'm assuming resolutions will be 15% lower due to the GPU being 15% weaker. Any comments on that? Thanks man.")
Tell that to all PC GPUs and CPUs then. Variable clocks have been common for a long time.
(In reply to: "Good Old Gamer also confirmed my suspicion that the base clock of the PS5 GPU puts it around 9.7 TFLOPS. He also said the variable frequency solution is not a good solution in the long term because the CPU and GPU have to constantly underclock and overclock.")
Show me one benchmark where a 2080 performs the same as a 2080 Ti. You're just making stuff up now. And read Osiris's comment above if you're still in denial about the power gulf.
(In reply to: "You're making circular arguments. If I can simply reference my previous posts to rebut your argument from assertion ad nauseam, then you have provided absolutely zero points.")
Gotta love that rounding, let me fix it for you:
12.1 / 10.3 × 100 ≈ 117, i.e. roughly a 17% advantage.
No, they haven't. DF already stated that consoles have always run at fixed clocks; the exception was the Switch, because it also has to run in docked mode.
(In reply to: "No, but I'm sure DF will try to replicate these conditions soon by under- and overclocking graphics cards.")
Strange?
(In reply to: "So, superfast SSD => 14 seconds to load a game on XSeX (vs 30-ish on Xbox Strange)")
I can tell that you're just responding as fast as you can without actually thinking, because I already showed you a while ago:
Show me one benchmark where a 2080 performs the same as a 2080 Ti. You're just making stuff up now. And read Osiris's comment above if you're still in denial about the power gulf.
Show the benchmarks. Here's what I found.
The 2080 Ti has about 15% higher effective game FPS than the 2080. When overclocking the 2080 by 110 MHz, average FPS improved by 11%. And you also have to take into account that the 2080 Ti has 11 GB of VRAM on a 352-bit bus while the 2080 only has 8 GB on a 256-bit bus.
Good Old Gamer said in his video that the variable frequency will likely result in reduced performance after a few years because the clocks are always under stress.
(In reply to: "That also concerns me about new RROD issues over time.")
That gulf? Do you mean the gulf in power? Man, it's just 17%. Right now the X1X has a 45% advantage over the Pro, and the PS4 had a 41% advantage over the X1. This is the closest two consoles have ever been.
(In reply to: "On the hardware side not even upset it puts them in a pretty shit position for the entire generation though. They will be playing catch up and there is no sauce created that can span that gulf...")
Man, I don't mean to be a dick, but you're just wrong.
(In reply to: "Nope, it was referring to their own take on the RDNA 2 cores that are in the PS5, and they compared those to another specialized core that is based (but not with complete feature parity!) on the GCN that was in their PS4. So pears to avocados. PS5's RDNA 2 is just a custom RDNA 2, and PS4's GCN is a custom GCN, both of which you can't find in an off-the-shelf PC part!")
Well, if a game is bottlenecked by I/O, solving that bottleneck frees up time that can be used by the GPU to push FPS. So sticking in an SSD does not equal higher FPS directly, but it does indirectly if I/O is the bottleneck. Star Citizen is a game that is highly bottlenecked by I/O; if you have an HDD, good luck getting it above 14 fps, but put in an M.2 drive and it is night and day. Your top-of-the-line graphics card is choked in the first case; in the second, it is finally delivering the performance it's supposed to.
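As a rough illustration of why the storage tier, not the GPU, can set the frame rate in a streaming-bound game, here is a simple throughput comparison; the chunk size and drive speeds are illustrative assumptions, not measurements of any specific game or drive:

```python
# Rough streaming-time comparison; the chunk size and throughput figures are
# illustrative assumptions, not benchmarks.
asset_mb = 512                                                   # hypothetical chunk of world data
drives_mb_per_s = {"HDD": 100, "SATA SSD": 500, "PCIe 4.0 NVMe": 5000}
for name, mb_per_s in drives_mb_per_s.items():
    print(f"{name:>14}: {asset_mb / mb_per_s:5.2f} s to stream {asset_mb} MB")
# If the engine has to wait on that data before it can draw the next area,
# the drive, not the GPU, is what caps the effective frame rate.
```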
Yeah, me too. It sounds similar to Vega's FP16 packed math and primitive shader features, which were largely crippled by other parts of the GPU and mostly went unused for that reason. Now it seems whatever was crippling it has been solved, and I have a feeling it has something to do with higher bandwidth, or with using the audio SPU as a coprocessor.
Yet! Maybe from now on, having a PCIe 4.0 M.2 SSD will become much more important even on the PC side. What if the previously weakest links, the consoles, end up driving a leap for PC gaming as well, pushing enthusiast-grade rigs in this direction, with developers starting to target this hardware and expecting certain streaming speeds for their upcoming games?
I don't see them going cheaper. The APU is probably smaller, but their cooling solution probably isn't, and everything else seems about equal. Sony's SSD may even cost a little more (probably not, just more parallel). I think both companies will land at very similar prices.
(In reply to: "Yup, it's all down to cost and exclusives now.")
First party I would usually side with Sony, but damn if MS's recent change of tack hasn't got me hopeful. They now have a shit ton of first-party studios, many of which haven't shown a single thing of what they are doing for next gen yet. I remain hopeful that they have taken the last few years to prep for all this. Sony I will give the benefit of the doubt, because while they get a bit of stick for dad-walking simulators... they are bloody well GOOD dad-walking simulators.
But price...? Who knows. Actually who knows... PS5 could be priced at $399 but I don't actually see it. The parts are not THAT much cheaper, and that SSD is not gonna be cheap, that's for sure... I think the consoles will be on par price wise, but I really have this odd feeling MS will undercut them for some reason. I can't put my finger on it, I just have the tingles?
Bittersweet. I'm going to have to purchase a TV where ARC, HDR, and 4K60 are available on more than just one HDMI port, unlike my current set.
(In reply to: "This will definitely force me to finally invest in an AV receiver with 4K HDMI ports, to be used for both picture and sound.")
It's a slow CPU on an old node. They could replace or retire it with something better and compatible at the same price... probably.
(In reply to: "I have a question, I'm sorry if it was asked before... I'm reading about Xbox Lockhart as a cheaper option coming out. What about Xbox One X? Are they killing it after less than 3 years? Makes no sense to me; it is plenty fast for 1080p gaming.")
Those tests still fall comfortably below the 2080 Ti, SMH.
(In reply to: "I can tell that you're just responding as fast as you can without actually thinking because I already showed you a while ago:")
Listen to the commentary, that may help!
(In reply to: "So, superfast SSD => 14 seconds to load a game on XSeX (vs 30-ish on Xbox Strange)")
The clock frequency only has to drop by 2% to cut the power draw from the wall by 10%.
(In reply to: "So what is the PS5's locked TFLOPS? 9.2? Serious question.")
An overclocked 2080 having 96.5% of a 2080 Ti's performance (1.11 / 1.15 × 100, for the math) is not "comfortably below". Heck, that's even within a 5% margin. And this is in spite of the 2080 having less VRAM and a slower bus.
(In reply to: "Those tests still fall comfortably below 2080TI SMH.")
That is the GPU at full throttle, which also means the CPU is being downclocked during that time. So by maxing out the GPU you create a CPU bottleneck, and by maxing out the CPU you create a GPU bottleneck. At least that is how I understand it. The PS5 cannot run the 3.5 GHz CPU and the 2.23 GHz GPU concurrently.
(In reply to: "That gulf? Do you mean the gulf in power? Man, it's just 17%. Right now the X1X has a 45% advantage over the Pro...")
Based on what they've shown, MS seems confident in their Kleenex-box design. And their recent Xbox One X is quiet too.
(In reply to: "As someone whose PlayStation consoles broke around the 2-year mark (i.e., after the warranty had expired, and despite all the care), I would like to ask the following: how's everyone feeling about console durability after both consoles' deep dives? I know we are yet to see the PS5's form factor, but going by what we have right now, which system seems more prepared to deal with heat issues, energy spikes, and the like? I got a strange vibe from Cerny's discussion of PS4's heat issues and how they would be handled in the coming generation. What are your thoughts on this, guys?")
Man, the downclock is 2%; it's nothing when it happens. It's just to keep the console from sounding like a jet engine. The GPU sits between roughly 10.08 and 10.28 TFLOPS, lol (see the quick arithmetic below). You sound like the performance drops by 50%. It's 2%.
(In reply to: "That is gpu at full throttle, which also means the cpu is being downclocked during this time. So by maxing out gpu you create a cpu bottleneck...")
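A quick sanity check on that range, using the same 36 CU shader math as earlier; this is only arithmetic on the stated 2% figure, not a claim about how often the clock actually dips:

```python
# Arithmetic only: what a 2% clock dip does to a 36 CU RDNA GPU's peak compute.
flops_per_cycle = 36 * 64 * 2                     # CUs * shaders/CU * FLOPs/shader/cycle
peak   = flops_per_cycle * 2.23 / 1000            # ~10.28 TF at the 2.23 GHz cap
dipped = flops_per_cycle * 2.23 * 0.98 / 1000     # ~10.07 TF after a 2% downclock
print(f"{dipped:.2f} to {peak:.2f} TF")
```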
I'd like to hear more about that; I'm a bit fuzzy on whether that's actually the case.
(In reply to: "That is gpu at full throttle, which also means the cpu is being downclocked during this time. So by maxing out gpu you create a cpu bottleneck...")
Yeah, this just proves to me that Cerny was told to work within a certain limit, and it was probably cheaper to go for a faster SSD than for a bigger APU.
(In reply to: "I am a Sony fan first, and as a PS gamer I can say I can't wait to play games on the PS5. That said, their approach does concern me, especially over time. To me it felt like a knee-jerk reaction to the XSX being 12 TF, and they had to do whatever they could to reach double digits.")
Usually, partly for marketing reasons, they go for something like $399, $499, or $599. Not sure if they would go for $449 while the XSX is $499 (as an example).
Well, the SSD can be designed in-house, so the R&D costs get absorbed elsewhere rather than in the BOM, whereas the APU has to be bought. So, different methods.
(In reply to: "Yeah, this just proves to me that Cerny was told to work within a certain limit and it probably was cheaper to go for faster SSD than it was for a bigger APU.")
The reason you wouldn't get a PS5 day one is that it won't play all the games you can play right now on your PS4? And I thought the main selling point was next-gen games...
I don't remember MS making RT mandatory on XSX.
I don't know, man. Just doesn't sound right. Kinda feels like a 32 MB eSRAM situation. I remember all of the Xbox fans that were active on the forums defending the Xbox One's 8 GB of DDR3 with this. In the end, it caused complexity, which meant developers had trouble optimizing for it. I believed them all at the time, too. That is what wishful thinking will do.
(In reply to: "Man the down clock is 2% its nothing when it happens. Its just to keep the console from sounding like jet engine. The gpu is between 10.08 to 10.28...")
And you are the king of bullshit riddles.
(In reply to: "On the hardware side not even upset it puts them in a pretty shit position for the entire generation though. They will be playing catch up and there is no sauce created that can span that gulf...")
Sure. We will see the fruits soon enough, man, and if it falls behind the XSX in multiplatform games by more than 17%, then you will be correct. Otherwise it is minor, as Cerny said, and rarely happens.
(In reply to: "I don't know man. Just doesn't sound right. Kinda feels like a 32mb esram situation...")
I don't pretend to know anything more about technology than "turn it on and it makes a noise", but I'd have thought that eSRAM analogy would be more applicable to the SeX's slower 6 GB pool of RAM?
(In reply to: "I don't know man. Just doesn't sound right. Kinda feels like a 32mb esram situation...")
New Switch mod delivers real-time CPU, GPU and thermal monitoring - and the results are remarkable

"However, there is a twist and it's something we've covered before, that we can now see play out in real-time - Nintendo's 'boost mode'. This amounts to optimisations in how certain games selectively overclock the CPU to improve loading times. For example, when you die in Mario Odyssey, the screen fades to black and the game loads you back to the last checkpoint. There is a fairly quick turnaround in Odyssey but this is faster thanks to boost mode. During loading, the CPU gets upclocked temporarily to 1785MHz - a 75 per cent increase on the stock clock. Meanwhile, the GPU actually drops all the way down to 76.8MHz - a tenth of its usual speed. Nintendo is balancing thermals by overclocking one component to the max, while downclocking another to the bare minimum."
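The percentages in that excerpt line up with the commonly cited Switch stock clocks (about 1020 MHz for the CPU and 768 MHz for the GPU when docked); treating those stock figures as assumptions from public reporting, a quick check:

```python
# Checking the excerpt's percentages against commonly cited Switch stock clocks
# (1020 MHz CPU, 768 MHz GPU docked); the stock figures are assumptions, not from this thread.
cpu_stock, cpu_boost = 1020, 1785       # MHz
gpu_stock, gpu_loading = 768, 76.8      # MHz
print(f"CPU during loading: +{cpu_boost / cpu_stock - 1:.0%}")                 # +75%, as quoted
print(f"GPU during loading: {gpu_loading / gpu_stock:.0%} of the docked clock")  # 10%, 'a tenth'
```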
How would you measure that? For starters, do we even have any numeric values, besides lower resolution, for how far the Xbone falls behind the PS4?
(In reply to: "if it falls behind xsx in multiplatform games more than 17%")
I said PC, not consoles. Perhaps read?
(In reply to: "No they haven't, DF already stated that consoles always ran at fixed clock and the exception was Switch because it has to run in docked mode as well.")
By FPS and resolution; those are hard numbers, man. If the XSX runs native 4K (2160p), the PS5 should be around 1800p. If it's lower than that, we can say it's behind by more than the GPU power difference.
(In reply to: "How would you measure that? For starters, do we even have any numeric values, besides lower resolution, that Xbone has vs PS4?")
The most unfortunate thing for me, personally, is Sony pushing those 36 CUs to cross that psychological 10 TF mark.
I'd rather it stayed cool, quiet, and easier on the components and the power bill.
Yeah, the XSX RAM setup is different, for sure. It does seem very deliberate, and not configured this way as a band-aid like the 32 MB of eSRAM. DF mentioned that MS said they actually embedded a processor in the Xbox One X that monitored texture usage in RAM and designed their setup accordingly. But this conclusion is based on watching a bunch of YouTube and reading hardware breakdowns, so I could just be full of shit.
(In reply to: "I don't pretend to know anything more about technology than 'turn it on and it makes a noise', but I'd have thought that esRAM issue analogy would be more applicable to the SeX's slower 6gb pool of RAM?")
Never mind, misread.
(In reply to: "I can tell that you're just responding as fast as you can without actually thinking because I already showed you a while ago:")