HeisenbergFX4
Gold Member
I'll believe them before I believe Timdog, Mistermediax or Penello!
I have faith in Kleegamefan as well especially him saying both machines are close.
We got a lot of armchair Electrical Engineers and Computer Systems Architects in this thread.
People were certain Microsoft would never allow a 50% shader deficit back in 2012/13.....until it happened.
People making ridiculous pronouncements that a 9.2TF console isn't good enough and would be DoA.
We even had Digital Foundry making rather short sighted predictions that the PlayStation 4's early sales figures weren't indicative of sales performance long term.
The easiest way to make yourself look a fool.
And then we have the people who make new predictions with enough variety and at such frequency, that at least one of those predictions is almost certain to be partially correct.
The GitHub repo leak was about the only meaningful thing we had to talk about all year.
At this point just wait and see. Won't be long now.
Why is the GitHub repo the most meaningful thing? You've had ppl in the industry make comments whose words were valid before, but now all of a sudden they are not because they say PS5 is more powerful? I don't understand how it's now "dismiss all those comments" by ppl who have industry contacts like Reiner, Klee, Schreier and Matt.

But yes, we shall wait and see. Official confirmation day will be an exciting one.

Because it fucking came from AMD's ASIC engineers testing 2020 chips? Jesus people... who the f is Kleegamefan?
Because it fucking came from AMDs ASIC engineers testing 2020 chips? Jesus people...who the f is Kleegamefan?
Not one of the chips mentioned in the repo came out; they are all scheduled for 2020 (Renoir, MI100, Oberon, Arden, Sparkman). Why do you think people will trust "yeah they are def close" from Kleegamefan over literal data coming from AMD GPU engineers (data that was removed from GitHub and Twitter)?
Check the specs I posted a few posts above. If these specs from GitHub match the final consoles, they will be 50% closer than X vs Pro - therefore certainly close.
Navi10 is in the repo and is 100% out.

Sorry yes, apart from Navi10 all the other chips are not out. I think when the repo was created (June 2018) it was filled with Navi 10 data first.
Xbox One X vs PS4Pro
43% more TF (6TF vs 4.2TF)
50% more memory (12GB vs 8GB)
50% more BW (326GB/s v 218GB/s)
43% more Tex fill (188GT/s v 131GT/s)
35% less Pixel fill (37.5GP/s v 58GP/s)
Arden vs Oberon
30% more TF (12TF v 9.2TF)
Same memory (16/20GB v 16GB + 4GB)
11% more BW (640GB/s v 576GB/s)
30% more tex fill (375GT/s v 288GT/s)
16% less pix fill (107.5GP/s v 128GP/s)
I would say these two are ~20% difference, HALF that of X vs PS4Pro.
Is that close? I think so. If you are Sony, and don't know MS is going with dual SKUs + a PC-tower premium console, you have to think the difference would be even less. Therefore IMO, Sony's console is very close to the theoretical limit you would expect for a ~200W, $400-500 console in today's time.
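Plugging the figures into a quick script reproduces the deltas (nothing here is official; these are the leaked/rumored numbers from this thread):

```python
# Sanity-check the Arden vs Oberon percentage deltas quoted above.
# All figures are leaked/rumored specs from this thread, not confirmed.
specs = {
    "TF":       (12.0, 9.2),    # teraflops
    "BW":       (640, 576),     # GB/s
    "Tex fill": (375, 288),     # GT/s
    "Pix fill": (107.5, 128),   # GP/s (Arden is lower here)
}

for name, (arden, oberon) in specs.items():
    # Delta relative to Oberon's figure.
    delta = (arden - oberon) / oberon * 100
    print(f"{name}: {delta:+.0f}%")
```

This prints +30% TF, +11% BW, +30% texture fill and -16% pixel fill, which is where the "~20% overall difference" eyeball estimate comes from.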
Seems I missed quite a bit. If you think there is that much of a difference in these machines' power then you're going to be disappointed. They are both very close, and neither is under 10TF. I will stand by what I said: they are both very competitive and the difference is minimal. You have a month, a week and a few days until it's official.
2 errors: We can see the ram bandwidth of Arden on the github leak: 560GB/s theoretical and only 16GB of total memory for Arden according to all insiders (that also leaked 4TF / 12tf etc.).
Which is exactly what was guessed by many after this year E3 Scarlett reveal using their motherboard render.
Because while it may not be up to date or whatever, it is pretty clear cut. Stuff "insiders" had said, whether that's Klee or Tom Warren or Matt or Penello or whoever, was vague and to me it didn't mean a whole lot.
People still buying into the red pill blue pill guy?
It seems you first decide on an agenda, then go looking for evidence to support it. That can get you to all kinds of places, the truth not being one of them.
Ok, 5 chips in the repo are not released; one is - Navi 10, the 5700.
The repo was created in June 2018 and updated through June 2019. Easy to assume which chip's data was filled out first.
I said before I take all "insiders" info at face value, but reserve the right to question them all.
This this this
In my view nobody qualifies as an insider unless they provide some real evidence. Anyone can post a couple of vague sentences which cover most bases and have the fallback of ‘things change’. It’s useless.
Could MS overclock?
Ok, let's assume this scenario: Sony showing Horizon 2 with Hellblade 2 levels of graphics in-game, and selling 14TF at $599. Would you buy it? Or would you pass and buy the cheaper 12TF Xbox at $499 with "in engine" Hellblade 2?
I would go for Sony even at $899.
I do agree but then I also get why anyone with real info might not want to share the full details of where and from who they got the info with strangers on a forum. Even if they are mods/admin.
Sorry yes, I used 16Gbps.
Oberon went from 448GB/s to 512GB/s to 530GB/s, and I have to imagine the final console will have 18Gbps chips (perhaps downclocked to 530GB/s).
That is the only question. It can be within a few % or 10%; I doubt MS will go for more than 640GB/s.
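For reference, theoretical GDDR6 bandwidth is just bus width (in bits) divided by 8, times the per-pin data rate in Gbps. A quick sketch using the bus widths rumored in this thread (256-bit Oberon, 320-bit Arden):

```python
# Theoretical GDDR6 bandwidth from bus width and per-pin data rate.
# Bus widths here are the rumored ones from this thread, not confirmed.
def gddr6_bandwidth(bus_bits: int, gbps: float) -> float:
    """Bandwidth in GB/s: (bus width / 8 bits per byte) * data rate."""
    return bus_bits / 8 * gbps

print(gddr6_bandwidth(256, 14))  # 448.0 GB/s (early Oberon figure)
print(gddr6_bandwidth(256, 16))  # 512.0 GB/s
print(gddr6_bandwidth(256, 18))  # 576.0 GB/s (18Gbps chips)
print(gddr6_bandwidth(320, 14))  # 560.0 GB/s (Arden theoretical)
print(gddr6_bandwidth(320, 16))  # 640.0 GB/s
```

The 530GB/s figure corresponds to 18Gbps chips downclocked to about 16.6Gbps on a 256-bit bus.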
I must admit I'm starting to think we are missing something really big with all these leaked chips. 1.8-2.0GHz on the GPU and 3.2-3.5GHz on a much more powerful CPU just doesn't add up in a console-size box (again, the XSX box shown is tiny).

I don't think it's tiny; it has 50% more volume than the Xbox One X. That one emitted ~180W in gaming, so you can do the math what this can mean.
I feel major twists coming but can't tell if good or bad....
So not only a 2GHz GPU but 18Gbps RAM too?

Well, that's the only way you will get good BW on a 256-bit bus. A 320-bit bus with 16Gbps is likely very close in terms of wattage (as on that one you have 10 chips instead of 8, plus a wider bus).
Might be a design to consider if Sony want to make a personal heater, not so much for a home console.
hi everyone, can anybody put me up to speed, what's the status?

We don't know anything, but it is fun to throw numbers around and see who is closest.
can mods start banning troll fanboys like this? there has been a real surge of fanboys these past few days and they are really dragging the discussion down.
384 bit + 12 chips for 24GB would be the way I would go, but maybe 18Gbps RAM is more efficient, and if it is, great.

But a 384-bit bus is 32mm² larger than a 256-bit one. It would also mean 12 chips, 4 more than with a 256-bit bus. So even if they are 14Gbps, there are 50% more of them vs 8 on a 256-bit bus.
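Since GDDR6 chips are 32 bits wide, chip count and total capacity follow directly from the bus width. A quick illustration, assuming 2GB modules:

```python
# GDDR6 devices have a 32-bit interface, so the bus width fixes
# the chip count; capacity assumes 2GB (16Gbit) modules.
for bus in (256, 320, 384):
    chips = bus // 32
    print(f"{bus}-bit bus -> {chips} chips -> {chips * 2}GB")
```

That gives 8 chips/16GB, 10 chips/20GB and 12 chips/24GB respectively, matching the configurations being argued about in this thread.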
You literally trolled him with a meme.
I think a lot of us on here can use a little self reflection, especially the past week or so.
That is a leak from a Taiwanese forum, and while I think it's legit, it's probably an estimate.

The bigger the chip, the harder it is to overclock.
Power consumption and thermals go out of control much faster. Scaling at higher frequencies tends to not work as well - particularly if there are bandwidth limitations.
They could do it. But it would be challenging. The XSX is big, but it's still smaller than an ITX chassis. And before anyone says "Vapour chamber cooling", just remember that vapour chambers are by no means advanced as a cooling technology.
I've also had a thought. The PS5 and XSX are believed to be 300mm² and 350mm², correct? How did Microsoft manage to fit 40% more shaders with only 17% more area?
I suppose they may be using N7+ and Sony may not, but I seriously doubt AMD would commit their 7+ wafers to low margins, when they can make millions of Epyc Milan CCDs and chuck them at data centres at far higher margins. Microsoft (and/or Sony) would have to be seriously overpaying for a newer node.
And before anyone says RDNA2 - features can be backported.
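For what it's worth, the 40%-vs-17% mismatch can be put in numbers. The CU counts below (56 vs 40) are my assumption based on the rumored 12TF/9.2TF figures, and the die sizes are the thread's rumored estimates, nothing confirmed:

```python
# Rough shader-density check using rumored figures from this thread.
arden_cus, arden_area = 56, 350    # assumed CUs, rumored mm^2
oberon_cus, oberon_area = 40, 300  # assumed CUs, rumored mm^2

cu_ratio = arden_cus / oberon_cus        # 1.40 -> 40% more shaders
area_ratio = arden_area / oberon_area    # ~1.17 -> 17% more area
density_ratio = cu_ratio / area_ratio    # ~1.20 -> 20% denser

print(f"{cu_ratio:.2f}x CUs, {area_ratio:.2f}x area, {density_ratio:.2f}x density")
```

So the puzzle is a ~20% density gap, which is the kind of delta a node tweak, backported features, or simply a different CU/uncore split could plausibly explain.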
I only want to ask if we had a consensus here...

Consensus? Hah! There's the People's Front of Judea. And the Judean People's Front. And also the Popular Front, but his opinion doesn't matter. (Splitters!)
Maybe the 13TF PS5 devkits were to simulate retail GPU + ray-tracing hardware.
sony can't sell at a loss MS can.

Despite all the "$599!!!", the PS3 was sold at a tremendous loss at launch, something like $300 per console. It is doubtful they'd be willing to take such a hit next gen (it damn near bankrupted the company), but things are different nowadays; SIE is a cash cow for Sony. So at some loss, why not.
13 Vega tflops, perhaps... but why would they use a 5700 card and duct-tape RT on top of it instead of just using a card with inherent RT? It sounds very budget in a way... but Sony can't sell at a loss, MS can.