
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

NeoGiffer

Member
Ok so, here's my desk. Ask me anything. But seriously... This is a test to show you just how easy it is to be anybody on the internet and tell any old porkies. 40tf PS5 confirmed? I mean, this shot is an original, so it MUST be true. This isn't my desk. This isn't my devkit. But do yourself a favour and prove me wrong. Tell me where this devkit shot is from.


[Image: SOJcrDL.jpg — the alleged devkit shot]


Of course this is bollocks bloody hell.

 

Tiago07

Member
NeoGiffer said: "Ok so, here's my desk. Ask me anything. [...] This isn't my desk. This isn't my devkit. But do yourself a favour and prove me wrong. Tell me where this devkit shot is from. [...] Of course this is bollocks bloody hell."
So here comes the hero of the heroes.
 

ANIMAL1975

Member


If he'd said 11 or 12, you wouldn't have quoted him 😂
He knows nothing. The grocery shop owner around the corner knows more about the PS5 than he does.
Yesterday he hinted that the PS5 was probably delayed, and someone posted it here. Later someone else said he'd come around and admitted it was just to troll the "PS fanboys" lol. I think that's enough information to stop people from posting this troll shit!
 

Gavin Stevens

Formerly 'o'dium'
All I’m going to add is that it’s funny how somebody mentions that people are usually given dev machines and a target spec that Sony/MS are aiming for, and then all of a sudden he knows all about target specs, when there was no mention of them before...

Y’all bein’ played!
 

joe_zazen

Member
There's one reason this doesn't really add up: why do all this work for backwards compatibility through dual GPUs and literally have only one of the GPUs doing it, when a larger single GPU die is both potentially cheaper and more logical? The PS5 chips we've seen results for (Gonzalo, Ariel and Oberon, and I believe others have said that Ariel is basically an earlier version of Oberon) would be 40CU chips. If people are claiming a figure like 13TF, then they'd be disabling four CUs on each chip, for 72 CUs @ ~1420MHz across both.

...but if that's the case, why would the PS5 devkit need the cooling it seemingly has? From the looks of it, that type of cooling is overkill for a system with GPUs running at such low clocks. And we've already seen Oberon testing at 2GHz... why do that for a system intended for a dual GPU setup? Unless, what, they have one Oberon chip at 2GHz or some super-high number like that, and the other at sub-1000MHz (well below any modern GPU card's clocks)!? So in reality wouldn't PS5's TF number be closer to, say, 15.6-16TF (assuming they could at least match XSX's possible GPU clock)? But absolutely no one credible is throwing that figure around.
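For what it's worth, the TF figures in this post can be sanity-checked with the usual AMD FP32 formula (CUs × 64 shaders × 2 ops per clock × clock). A quick sketch — the CU counts and clocks are just the rumored numbers above, nothing confirmed:

```python
# Rough FP32 throughput check using the standard GCN/RDNA formula.
# All configs below are rumored, not confirmed specs.
def tflops(cus, clock_ghz):
    """CUs * 64 shaders * 2 FLOPs per clock * clock, in TFLOPS."""
    return cus * 64 * 2 * clock_ghz / 1000

# Dual 40CU chips, 4 CUs disabled on each -> 72 active CUs
print(round(tflops(72, 1.42), 1))  # ~13.1 TF, the ~13TF rumor
print(round(tflops(72, 1.7), 1))   # ~15.7 TF if clocked near XSX rumors
print(round(tflops(36, 2.0), 1))   # ~9.2 TF for one Oberon at the 2GHz test clock
```

So the ~13TF claim only works with both chips well under the 2GHz Oberon has already been tested at, which is the tension the post is pointing out.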

The other idea would be that, in looking for a somewhat sane price point on a dual GPU system, they'd clock them both insanely low to keep the cooling costs down. But really, even a modestly pricey cooling setup is still cheaper than a second GPU (and the cost of implementing it in the design). So even with that in mind, a dual "small" GPU setup with reduced cooling is going to have a higher BOM than a monolithic larger GPU with sufficient cooling. And for what, exactly? A possible single-TF performance edge, if that? Doesn't seem economically sensible imo.

I mean, monolithic GPUs already have CUs disabled for yield purposes; they could disable any number of additional CUs as needed if they wanted hardware-level back-compat for PS4 and PS4 Pro. And it'd be smarter than a dual 40CU GPU design where you literally just turn one off, but even both together don't deliver a substantial performance boost over a 60CU Navi chip (at least with the implementation Sony would theoretically use to reach the 12+/13TF claims of some of these other rumors).

So I mean, could they be using dual Oberons in the system? Maybe. But given the much lower-than-maximum performance they'd be running them at, it (IMHO) only lends credence to the idea that at some point they were going with a single 40CU chip; hence the 2019 launch rumors, the reports mentioning a delay into 2020, the rumors about a system targeting $399 specs, and it even helps put into perspective some of the sudden departures from SIE and other PS divisions last year, like Shawn Layden. Something would have forced them into a redesign; maybe it was the XSX?

Personally I just find it fascinating, because if the dual GPU spec does play out that way, with two Oberon (or whatever its final name will be) chips only giving an extra TF of performance over XSX (if that), then it humorously feels a bit like karma (if you can call it that) biting Sony. It's almost completely analogous to the rapid redesign SEGA did on the Saturn over two decades ago after finding out the PS1 specs. Partly due to being a larger company, and partly due to technology being nowhere near as esoteric in the console space anymore (hell, AMD's doing the heavy lifting for both next-gen systems), I can see Sony succeeding in such a redesign where SEGA ultimately fumbled all those years ago... but it wouldn't change the (potentially) similar factors leading to those respective redesigns.

Which honestly just makes me question whether Sony didn't consider MS or Nintendo as much of a threat for next-gen, hence initially going with a more conservative design. Quite interesting really to see how these parallels can play out; it's what makes learning console history so fun. Now, the other part of this equation that could make everything I'm saying moot is if, instead of dual Oberons, PS5 has a 40CU GPU chip coupled with a smaller (say, 20CU) chip.

Now THAT would make more sense, and allow both to reach their sweetspots while offering performance potentially either around XSX's 12TF figure or slightly more. You wouldn't essentially have two GPU clusters operating well below sweetspot efficiency, and each one could just have two CUs disabled rather than four. You would still need a sufficient cooling solution but (potentially) nothing more expensive than what the XSX is doing.

But again, that presents another problem: why haven't we uncovered any existence of that other potential GPU? It's getting pretty late and the reveal is literally around the corner. We should have at least SOME hard data on such a chip, via GPU benchmark tests, but to the best of my knowledge, nothing's come up. Hell, we even have a benchmark test for Lockhart's GPU, and we know even less about that than the PS5 xD.

EDIT: Also would like to add to my speculation, that if dual GPU with 2x Oberons and 13 or so TF were the performance targets for PS5, we wouldn't be seeing Oberon GPU benchmarks at 2GHz as late as June 2019, and we'd see at least a benchmark or two by now with Oberon around 1.42GHz (we already know the other two clocks were regression tests for PS4 and PS4 Pro).

If a dual GPU approach with Oberon and a second chip half its size is the target for a 12-13TF PS5, then we need a leak for that smaller chip that can be tied to Oberon. Which hasn't happened yet and doesn't seem like it's going to. Also, at some point we'd need an Oberon benchmark to come out with it clocked around 1.7GHz, since you'd assume both it and the other GPU would be clocked the same, and 1.7GHz is how you'd get to 12TF; 1.75GHz on both would push them near 12.5TF, and anything above 1.8GHz would be slightly more than 13TF.
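Those clock-to-TF figures for the asymmetric idea check out with the same FP32 formula, assuming (as the post does, purely speculatively) a 40CU + 20CU pair with two CUs disabled on each, i.e. 38 + 18 = 56 active CUs:

```python
# Hypothetical asymmetric setup from the post: (40-2) + (20-2) = 56 active CUs.
# Standard AMD FP32 formula: CUs * 64 shaders * 2 FLOPs per clock * clock_GHz.
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

for ghz in (1.7, 1.75, 1.8):
    print(ghz, round(tflops(56, ghz), 2))
# 1.7 -> ~12.19 TF, 1.75 -> ~12.54 TF, 1.8 -> ~12.9 TF
```

Which is why 1.7GHz lands you at 12TF, 1.75GHz near 12.5TF, and only clocks above 1.8GHz clear 13TF on this configuration.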

a few random thoughts
  • we know nothing of what's inside the PS dev kit, and devs don't open them to poke around, so not having any leaks or rumours isn't strange. Also, if the intern hadn't fucked up, we wouldn't even have the github stuff and we'd really be in the dark.

  • as for why a 36 CU chip: imo it's because Sony use hardware design to facilitate back compat, unlike Microsoft, who just use software emulation. This imposes constraints on the hardware designers, as with the PS4 Pro needing a 'mirrored butterfly' 36 CU chip. The Pro could have been designed better if Sony had the software chops, and the same will be said of the PS5. But they don't have those chops, so they need to design the hardware this way, which from the outside looks nonsensical.

  • cooling in a dev kit: really don't know what's going on there without seeing inside. But Sony's design prowess is overstated; Sony isn't much of a hardware company anymore beyond image sensors. My guess is they're using high-rpm small-ish fans.
  • I never thought about a non-symmetrical setup. I suppose a 20 CU second chiplet would make financial sense if the tech is there.
  • finally, github + reliable rumours make a 36 CU PS5 APU unlikely unless XSX is also 8-9 TF.
 

StreetsofBeige

Gold Member
I don't know anything about chip design, but with 36 CUs being the prediction due to BC, why does it have to be exactly 36?

Why can't PS5 have 40 or 50? There'd still be 36 available. Just have the system activate 36 when it's a BC game???

Maybe it's more complicated than that?
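The idea in this question can be written down as a toy sketch — purely illustrative, not how real silicon or firmware handles CU gating, and the 36 CU figure is just the rumored BC constraint from the thread:

```python
# Toy model of the poster's suggestion: a wider GPU that masks itself
# down to 36 active CUs when running a backwards-compatible title.
# Hypothetical; actual console BC modes are far more involved
# (clock regression, memory timing, feature parity, etc.).
def active_cus(total_cus, bc_mode):
    return min(total_cus, 36) if bc_mode else total_cus

print(active_cus(50, bc_mode=True))   # 36 for a PS4 Pro title
print(active_cus(50, bc_mode=False))  # 50 for a native PS5 title
```

The thread's counter-argument (per joe_zazen above) is that the constraint may not be just the count but the physical CU layout, like the Pro's 'mirrored butterfly' arrangement.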
 

CatLady

Selfishly plays on Xbox Purr-ies X
Whom to believe here, a bunch of snowflakes or PS5 lead system architect? Difficult one 😭

This.
How do people keep going back to "PS5 has no raytracing"? We've got fake insiders spreading all kinds of bs about both consoles and people buying their garbage, but one of the few things confirmed by Cerny himself, idiots want to doubt.

Who knows maybe Cerny is a fake insider, I mean after all he hasn't been vetted has he? :pie_roffles::pie_roffles::pie_roffles:
 