If Sony or Microsoft skips this E3, that will be a big mistake. They could, in fact, end up like SEGA.
You could still have your press conference and skip E3 altogether.
Probably... the "Oberon" leak only shows BC mode tests. I did not see Arden or Sparkman; are those BC tests or full tests?
It could be that they're equally powerful, but Microsoft has decided to upclock the GPU by 100mHz. Either way, I don't expect any discernible difference.

My question is, if Series X does have more TFs, why do you think some insiders are saying PS5 has the power edge?
There is no way to make predictions without factoring in the inside info, if you have it, lol.
That goes against what Sony has been saying about the PS5 being a 'premium' console. Also, I am afraid it's going to be loud.
It wasn't my intention to start that topic again, because I think it's now clear that all the insiders, plus Phil retweeting the DF video that points to 12 TF RDNA, make it obvious that Series X is at least 12 TF RDNA.

Please don't start this topic again.
I'm trying to find the "leaks" again but I can't.

Honestly, no way to know at this point. That's why I was asking.
How do you know he doesn't use inside info to make his predictions? He knows the inside info, so he tends to make predictions that don't disagree with it.
Our Video Game Predictions For 2019 (kotaku.com): What's coming for video games in 2019? What sort of wild predictions do we have for the year? Today on Kotaku Splitscreen, we discuss.
He didn't get one 2019 prediction right. They're not allowed to use inside info, so I'd say it puts Jason at a pretty big disadvantage when making these predictions lol.
Think what you want.
So he makes wrong predictions on purpose, even knowing they're wrong?
None of his forecasts for 2019 came true.
They have conditions: no internal information allowed.
I don't think Jason Schreier is lying to his colleagues.
Or he makes predictions on things he has no inside info for. I don't think Jason is omniscient.
What's the point?
That's an easy one.
That could explain a performance edge, but insiders like Klee have been clear that PS5 has more TFs.
When MS turns RTX on, the performance takes a hit; the better/discrete solution on PS5 performs better, with all '9' of its measly teraflops used just for geometry, texturing and shaders, and all RT handled by the integrated RT chip. That '10%' difference soon disappears in this scenario.
That would be my guess anyway.
(Play on words here; I have no insider info, unlike Walt.)
0.0001MHz would not make a difference.

It could be that they're equally powerful, but Microsoft has decided to upclock the GPU by 100Hz. Either way, I don't expect any discernible difference.
Sorry, 100 MHz.
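For context on how much a 100 MHz upclock actually moves the teraflop number, here is a rough sketch using the standard TFLOPS formula for AMD GPUs. The CU count and clocks are purely illustrative guesses, not leaked specs:

```python
# TFLOPS = CUs * 64 shaders/CU * 2 ops/clock * clock in GHz / 1000
# 56 CUs, 1.7 GHz and 1.8 GHz are hypothetical numbers for illustration.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

base = tflops(56, 1.7)       # hypothetical chip at 1.7 GHz
upclocked = tflops(56, 1.8)  # same chip, +100 MHz

print(f"{base:.2f} TF -> {upclocked:.2f} TF (+{upclocked - base:.2f} TF)")
```

On these assumed numbers, 100 MHz is worth roughly 0.7 TF on a wide chip, so whether it is "discernible" depends entirely on the CU count you assume.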
I think that would draw more power than the XSX's big APU.
Maybe it will have more TFLOPS; this is just based on the info available now, as that is what the comments are based on: current or fairly recent devkits.
... Pardon?
Wow, this thread got sprinkled with some heavy feminist ingredient.
HATRED.
I am almost wishing next gen would start in 2021.
What's happening here is too much for my little heart to handle.
Fake leaks, speculations, lies, fights...
Welcome to the wonderful world of console gaming.
Because he's a fucking moron.
It's not even that. He thinks so highly of himself and is completely unable to take criticism or comments on his writing. He acts like his book is God's gift to gamers. He acts like his exposé pieces are Watergate 2.0. The guy is sniffing his own farts to a massive degree.
When you decide to die on the hill being a PR mouthpiece for a "video game" forum, that's a given.
Don't know how you could make that assumption, bearing in mind we know absolutely nothing about Sony's raytracing tech, other than that in December 2018 they stated they'd been working on it for a while and were still working on it.
He didn't say twice the performance, he said twice the power. Two different things.

He said twice the performance. That would be less than 12 TF RDNA. Kleegamefan made his RDNA clarification before the first DF video, which guessed Phil could be talking about 9-10 RDNA teraflops, which are about twice the performance of the One X.
Obviously, in specs given to devs, they would only use actual teraflop numbers.
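The "9-10 RDNA teraflops is about twice the One X" reading can be sketched as back-of-envelope arithmetic. The One X's ~6 TF figure is public; the ~1.25x RDNA-vs-GCN performance-per-flop factor below is an often-cited estimate, not a measured constant:

```python
# Rough conversion: "twice the One X's performance" expressed in RDNA teraflops.
ONE_X_TF_GCN = 6.0          # One X GPU, GCN teraflops
RDNA_UPLIFT = 1.25          # assumed RDNA perf-per-flop advantage over GCN

target_gcn_tf = 2 * ONE_X_TF_GCN               # 2x One X, in GCN-equivalent TF
rdna_equivalent = target_gcn_tf / RDNA_UPLIFT  # same performance in RDNA TF

print(f"~{rdna_equivalent:.1f} TF RDNA is roughly 2x One X")
```

On that assumed uplift you land at ~9.6 TF RDNA, which is exactly the 9-10 range the DF video guessed at.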
In a world where everybody tells the truth, you wouldn't need to double-check anything. Regrettably, we are not living in that kind of world. Also, if several insiders have the same specs, it means they can't be tracked by Microsoft just by looking at the leaked numbers.
And let's be honest: crowing about power when we know Sony has at least two things (a much faster SSD and better ray tracing) that could make up the deficit is probably not the smartest idea at this time.
Interesting that Jason Schreier predicts Sony will skip E3 again this year.
https://kotaku.com/our-video-game-predictions-for-2020-1840903755
I don't think Sony devkits are up to spec yet; MS are probably sending theirs out with retail APUs at this point. (I don't know; it would be cool if someone could let us know the state of the devkits. Historically, Sony only gets the powerful last-revision 'retail' devkits out after the consoles have been released.)
Until the 2000 series cards, that was the norm for NVIDIA. With no competition from AMD, they hiked margins and reduced performance gains for lesser tiers. The 1060 had 980-like performance, yet it retailed at even $230. It is not unheard of for mid-range next-gen cards to match high-end last-gen cards.
In the GitHub docs? Correct me if I'm wrong, but neither Sparkman nor Arden had any measured results in any of the GitHub docs.

So when Arden and Sparkman were stress tested, full CU and RT were enabled?
I want whatever ganja Schreier is smoking because that would be absolutely crazy to skip E3 twice in a row. Hope that wouldn't be a sign of anything like a delay :S
Do you have a link to that doc?
We can already throw out the idea that both sides are not going for smart designs and one is simply going for brute power. The way it sounds like Microsoft has implemented their SSD to assist their GPU, using it like a sort of virtual RAM that may be connected pretty close to the GPU itself, is the very essence of smart design, not simply relying on brute power. And rumors suggest it is Sony that's pushing their GPU to 2 GHz+ (a speed we know to be pretty crazy for a console), whereas Microsoft has gone for a smart, wide approach with many more compute units at a more manageable/reasonable clock speed. Even the emphasis by Microsoft on VRS is yet another focus on being smart rather than just brute forcing it. VRS can save their GPU precious resources that can go back into improving performance.
Even looking at Microsoft's form factor, it's easy to say that's a sign of the kind of brute power they're going for, but then they also managed to make this thing as quiet as Xbox One X. It doesn't appear to me as if brute power and smart design are at odds with each other in the Xbox Series X design.
Even if Microsoft is building itself a mini PC tower, I still see efforts throughout the design, from what Microsoft has said thus far, to keep from having to brute-force their way to a result. There are efforts to take pressure off the hardware. I don't think they're taking the fact that they have a lot of power for granted.
Pretty sure that's not been said (how are insiders supposed to know how many devkit revisions there are going to be, or the exact performance of the current ones?). I've read the tools are more mature and easier to use, and devkits have probably been out longer with equivalent hardware in (the 2 GHz APUs, not the final retail ones; this is because Sony don't have DirectX, which means they can't just stick anything of equivalent power in the box like MS can), but pretty sure Xbox is the only one shipping with final silicon at present. Could be wrong; this information isn't really out there.

I thought Sony was supposed to be ahead in devkit progression according to a lot of the insiders; this would contradict that point.
Moving data around chip to chip requires more energy, which means more power consumption for a console.
... And? Are you trying to sound smart?

Anyway, that's the concern of my electric company and the bank manager.
Or a semi-custom 36 dual-CU APU.
Please note: an unseen GPU with a test CPU pretty much confirms this is an APU.
Now, which console maker's chips get tested on Windows and have VR? Hmm, I wonder...
The docs were removed. I don't have a copy. Some on REE saved them and shared some info after the removal. Whatever data there was on Arden/Sparkman was theoretical data, not test data.
Jason said he used no inside info for his predictions.
The reason I found it interesting was because of this: no insider info that they are for sure attending, which I would think would have been decided one way or the other by now.
I was making a joke. You have absolutely no idea how much energy it uses or how much heat it produces; we know nothing about it. I can tell you that when I turn RTX on, my card uses less power because the frame rates are lower. With vsync enabled and set to 60 fps, it uses about 20% more power (monitoring software only shows the percentage of available power used), though to be fair it is also slightly hotter: 46 vs 51 degrees.
I think we are discussing two different things here. I'm saying that it's going to be very hot and power-hungry for a console; you, on the other hand, are discussing the fact that it won't matter to you if you pay a few dollars more on your electricity bill.
The code points to the 4800H APU, but the clocks are lower, and we already know that chip is beyond the engineering sample stage. (It could be an old test, though.)

Insane Metal kept hinting about the 4800 APU. I really can't think of which company would want VR in their next machine.
Already answered you above. It's a laptop with an APU and a 2060.
I expect, with all the leaks and dataminers, AMD is actively trying to hide what the chips are for now. Other than ruling out this being Scarlett (why would you test VR with that chip?), I wouldn't be surprised if this was the PS5 chip. (I also wouldn't be surprised if it wasn't; I'm not invested in it.)
One thing we do know, though, is that with only 7 Radeon CUs, there is no way on earth a 4800H is destroying a 2080 Ti in any test.
Fair enough; I still cannot see that setup beating a 2080 Ti, and it states the GPU used was AMD, which only uses GCN and 7 CUs.
It's a laptop with an AMD APU and a 2060 inside it. If you google the eng sample (100-000000098-40_39/27_Y), it'll take you to a post from about a month ago mentioning said laptop and everything inside it.
ASUS Zephyrus G GA401IV DxDiag - Technopat Sosyal (www.technopat.net): Windows 10 Pro 64-bit, System Manufacturer ASUSTeK COMPUTER INC.
Like I implied, it doesn't scream 2080 Ti killer to me, even with both GPUs combined, which of course wouldn't happen anyway.
It doesn't add up, but I'm not invested enough in it, to be fair, to argue the case any more than that.