thicc_girls_are_teh_best
Member
AdoredTV hittin' us up with a next-gen analysis video. Been waiting for this (would make his the fourth: DF, NX Gamer, Moore's Law Is Dead, and now AdoredTV. Waiting on you, Coreteks)
Summary of his thoughts (some of my own thoughts on them thrown in from time to time, too):
>Never thought a next-gen system would use HBM2; too cost-prohibitive (I was hinting at this by saying the 16GB $120 estimates for HBM2 were just that: estimates. Not actual sell-prices. HBM2E was WAY out of the question because SK Hynix and Samsung have been on record saying their clients don't mind "paying a premium" for that memory)
>Never thought either next-gen system would use a 384-bit memory bus; it would take up significant die area on 7nm and require too much memory to saturate (even with the 320-bit bus in XSX, MS had to go with split 2 GB/1 GB chips. Also feels like a much better assessment of the XSX's 320-bit bus than Moore's Law Is Dead's take which, well, didn't make too much sense IMO xD)
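For context (my own back-of-envelope math here, not from the video): this is how that 320-bit split pool shakes out bandwidth-wise, using MS's published XSX figures (14 Gbps GDDR6, six 2 GB chips plus four 1 GB chips):

```python
# Back-of-envelope GDDR6 bandwidth math, using the publicly
# reported XSX figures (320-bit bus, 14 Gbps GDDR6).

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Full 320-bit bus: all ten 32-bit chips read in parallel.
fast_pool = bandwidth_gbs(320, 14.0)   # the 10 GB "GPU-optimal" pool

# The extra 6 GB lives only on the six 2 GB chips, so reads from it
# can only stripe across 6 * 32 = 192 bits of the bus.
slow_pool = bandwidth_gbs(192, 14.0)

print(fast_pool, slow_pool)  # 560.0 and 336.0 GB/s
```

So the split-chip layout buys them full 560 GB/s on the first 10 GB while saving the cost of ten 2 GB chips, at the price of a slower 336 GB/s region.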
>Feels a 256-bit memory bus is too small for next-gen APUs (maybe he's referring in terms of scaling to the GPU's actual performance levels? So maybe he feels a 256-bit bus for something like PS5 is fine, but would've been too small for something like XSX?)
>Always felt the Github leaks were right (can't argue with that; they were)
>Feels the split bandwidth speed pool in XSX was a cost-saving measure while still ensuring bandwidth requirements were met.
>Feels both systems are on 7nm enhanced process.
>Not anything from the video itself, but in it there's a part of Cerny's speech played, and when he's talking about the GPU he says "With PS5, the GPU was definitely the area where we felt the most tension, between adding new features and keeping a familiar programming model."
We already know Cerny said there were features from the GPU they felt could be cut to fit their PS5 target, so (this is my own speculation) there MIGHT be certain advanced features missing from the PS5's GPU that could be present in the XSX's GPU. What those features are is anyone's guess, though it might become noticeable depending on exactly what features Sony decided to cut from their GPU that MS decided to keep in theirs.
>Seems quite impressed with the potential of Sony's SSD, though there's a bit of caution.
>While he acknowledges the smaller SSD in PS5 (compared to XSX) is a cost-saving choice, feels it was the preferable choice since it means having a drive that's over 2x faster thanks to Sony's proprietary flash memory controller.
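Just to sanity-check that "over 2x faster" bit myself (my own math, using the raw uncompressed read speeds both companies have quoted; the 10 GB chunk is just a made-up example):

```python
# Raw (uncompressed) SSD throughputs from each company's spec reveal.
ps5_raw_gbs = 5.5   # PS5, custom 12-channel controller
xsx_raw_gbs = 2.4   # XSX

ratio = ps5_raw_gbs / xsx_raw_gbs
print(round(ratio, 2))  # ~2.29, so "over 2x" holds for raw reads

# Time to stream a hypothetical 10 GB chunk of game data at raw speed:
chunk_gb = 10
print(round(chunk_gb / ps5_raw_gbs, 2), "s vs", round(chunk_gb / xsx_raw_gbs, 2), "s")
```

And that's before compression, where Sony's quoted typical figures pull even further ahead.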
>Feels that in a way, the PS5 can be seen as being chiplet-based. The main custom chip has three different elements: the CPU, GPU, and Flash memory I/O complex (which actually looks bigger than the CPU and GPU portions if you look at the graphic from the presentation!).
The I/O complex includes two co-processors (Coherency Engines; speculates they might be ARM or Jaguar-based). Overall tho, he doesn't think it meets the requirements to be considered an actual chiplet design.
>Surmises PS5's Geometry Engine is a rebranding of AMD RDNA2 tech
>Caught the specific mention by Cerny that PS5's RT will be based on AMD's PC initiatives. But goes into speculating something some folks may not like, because AMD's RT initiatives are strongly tied to Microsoft's DXR initiatives.
Could mean AMD has some other RT strategy they haven't shown, but then it would be odd for them to show off their first RT built around MS's DXR with an (admittedly garish) RT tech demo.
>Surmises Sony's intersection engine is a rebranding of AMD's fixed-function BVH traversal hardware in RDNA2
>Goes into the data-mining on PS5 chips and Github leak after this; basically same stuff we already know, but it does show that info on Oberon popped up (albeit without CU counts) a good while before the Github leak actually came out.
>Agrees with Cerny's focus on higher clocks allowing for faster rasterization, cache, etc. performance, but feels shader unit count is the more reliable metric for computational performance (as long as there's no bottleneck at the front end; i.e. memory bandwidth I'd assume, or CPU power (like XBO and PS4 this gen)).
Essentially, feels that while Cerny's right to a degree, he might be overemphasizing the bandwidth/saturation feed problem to the GPUs, particularly the RDNA2 ones, because while GCN DID have those bottlenecking problems, AMD's seemingly fixed the majority of it with RDNA1, let alone RDNA2 (which both systems are using). Could mean keeping the larger XSX GPU fed won't be much of an issue (but it would still be harder to do vs. PS5).
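If you want the shader-count argument in raw numbers (my own math, the standard peak-FP32 formula for RDNA-style GPUs, plugged with the announced CU counts and clocks):

```python
# Peak FP32 throughput for an RDNA2-style GPU:
# CUs * 64 shader lanes * 2 ops/cycle (fused multiply-add) * clock.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # narrower chip, faster clock
xsx = tflops(52, 1.825)   # wider chip, slower clock

print(round(ps5, 2), round(xsx, 2))  # ~10.28 vs ~12.15 TFLOPS
```

Which is why shader count wins the headline number here: the XSX's 16 extra CUs outweigh the PS5's ~22% clock advantage, assuming both GPUs can actually be kept fed.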
>Feels like Sony may've gambled on AMD being unable to improve the frontend with RDNA2, therefore building the APU's GPU around very high clocks. But with AMD having actually significantly improved the frontend, it might've meant Sony missed out on building a larger chip instead.
>Feels PS5's GPU's high clocks (which were actually held back due to power reasons) will be very good for AMD's upcoming GPU cards in the PC space.
>Feels both companies pushed their respective GPUs to the limit, but thinks MS made the smarter call this time around (wider, slower, and gambling on frontend improvements to RDNA2 which have actually materialized. Sony seemingly wasn't as confident in AMD's ability to improve the frontend and went with a narrower chip clocked quite a bit faster, but with less shader performance).
>Surmises Sony decided very early on (probably around the time PS4 Pro development was wrapping up IMO) to go with a 36 CU GPU. That would fit with the Ariel and Oberon chips (even if I was thinking a later Oberon revision had a larger CU count).
>Brings up some comparison between PS5 BC (somewhat outdated info on his part; he's going by the Top 100 games thing, when Sony put out a tweet hinting more than those games will be compatible, though they still have a ways to go before launch) and XSX BC (also a bit outdated info on his part; MS is doing a lot of the same compatibility testing as Sony, but has many more games already confirmed compatible. Also unlike Sony, MS's BC brings further enhancements to games, like HDR support).
>Doesn't feel final game benchmarks will show a massive lead for MS, but feels they'll still be ahead in the majority of game titles.
>Thinks MS has performance advantage in RT; not just due to a larger chip (RT on RDNA is CU-bound), but because devs on XSX will use DXR. Sony will need their own RT API stack or else devs would have to code "to-the-metal" for RT on PS5. Which sounds highly impractical, so I'm almost certain Sony has their own RT API in development.
>Agrees with literally everyone else that Sony has the better SSD. In fact, figuring the controller probably wasn't cheap, if Sony did decide on a 36 CU GPU from the start, it might've been due to their plans for the SSD the whole time.
>Overall, is more impressed with what MS have accomplished with XSX, but he's still very interested in the PS5, both in picking one up and to see how games will utilize its strengths, particularly the SSD.
>Sees a high chance both systems will sell at a similar MSRP, and that even with the differences in specs in a lot of areas, Sony should be okay selling at an MSRP similar to XSX's due to brand power. He at least hopes (and thinks) both systems should do very well next-gen.
------
Anyways, thought it was worth sharing to have a discussion. How do you guys and gals feel about their analysis and/or what else you've got to add?
