Good thing they sort of explained the eSRAM bandwidth math that people have been questioning ever since Leadbetter's earlier article, where he just repeated the figure uncritically.
I would have liked to know exactly how they arrived at the number that got repeated over and over, because they're talking about real-world scenarios where they see 140-150GB/s, not the (then) 192GB/s or the more recent 204GB/s figure.
Edit: I missed the explanation in the side-bar. The bubble!
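For anyone else who skipped the sidebar, here's roughly how I understand the math behind those numbers. The 853MHz clock and 128-byte-per-cycle path are the figures from the DF coverage, so treat this as back-of-the-envelope rather than anything official:

```python
# Back-of-the-envelope eSRAM bandwidth math, based on the figures reported
# in the Digital Foundry articles (assumed, not official documentation).
ESRAM_CLOCK_HZ = 853e6         # GPU/eSRAM clock after the bump from 800MHz
BYTES_PER_CYCLE_ONE_WAY = 128  # reported width of the eSRAM path, one direction

one_way = ESRAM_CLOCK_HZ * BYTES_PER_CYCLE_ONE_WAY / 1e9
print(f"One direction:    {one_way:.1f} GB/s")   # ~109.2 GB/s

# MS's explanation: a write can overlap with a read on roughly 7 of every
# 8 cycles (the "bubble"), which is where the peak figure comes from.
peak = one_way * (1 + 7 / 8)
print(f"Theoretical peak: {peak:.1f} GB/s")      # ~204.7 GB/s

# The same math at the original 800MHz clock gives the older 192GB/s number.
old_peak = 800e6 * BYTES_PER_CYCLE_ONE_WAY / 1e9 * (1 + 7 / 8)
print(f"Peak at 800MHz:   {old_peak:.1f} GB/s")  # ~192.0 GB/s
```

As far as I can tell, the gap between the ~204GB/s peak and the 140-150GB/s they quote for real workloads just reflects how often those read/write overlaps actually line up in practice.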
Welp, senjutsusage got banned. Was his info right but he interpreted it completely wrong, or was it just complete bullshit?
If I went on, I would get people into trouble. That's not my intention. I guess I'd better leave the thread now.
Uh-huh.
Because it helps close the gap; MS wants that parity. My guess is that it was an extreme game of telephone, where he interpreted an example Sony gave of how one could possibly allocate resources as an actual recommendation stemming from a lack of performance, rather than just an example, as he was saying.
Perhaps that's giving him a little too much credit, but I just find it hard to believe that someone that posts here a lot would completely fabricate information like that.
Either way, I suppose that puts this whole thing behind us -- there's no evidence at the moment that Sony is actually recommending this split (how come this rumor absolutely refuses to die, by the way?)
Oh dang... First Reiko and now Senju, who was following in Reiko's footsteps...
I feel sad. Senju made these threads so entertaining.
Uh-huh.
Not ekim too!
If this means anything: I can confirm that senjutsusage is legit.
A lot of things, but the most likely falsehood was his claim that Sony briefed third-party developers that, beyond 14 CUs for graphics, the extra CUs didn't really do much. Yes, he claimed Sony said that to devs.
Son, while your post isn't exactly considered thread whining, it could be misconstrued as such. That's a ban-worthy offense round these here parts. Just an FYI.
Can this thread just die now?
Why do these people do this to themselves? Why is it so important for them to prove the PS4 isn't more powerful than the Xbone that they would go to such measures as making up inside information, even though the evidence is right in front of them, plain as day? For god's sake, devs have even confirmed it.
Why? I cautioned ekim and said he was playing it safe, pulling the insider card on Twitter and not on GAF, until this very thread. I don't think this is a good idea.
I don't know about the sources involved here, but Reiko tried to pin the blame on godhand... and apparently godhand showed his PMs to a mod and wasn't banned. So take that for what it's worth, based on what we know.
Who are these "sources" that keep getting people banned? They're either very good at convincing people they're legitimate, or the people referring to them are too easily led astray? Maybe confirmation bias or something?
This isn't the first time Sage has been banned for this shit. He hasn't changed at all over the months and is still insisting that his sources are the real deal. I'm also the one who originally called him Reiko 2nd, as far as I know. The guy is a joke.
Look at this post and then at what he posted in this thread; it's the exact same shit.
http://www.neogaf.com/forum/showpost.php?p=61329153&postcount=254
You'll never find out; these sources are secret.
I don't get why people are falling on the sword for MS's sake. MS designed the Xbone the way they did for a reason, and MS will have to live with it for the next 5-7 years, or whenever the next round of consoles comes about.
Just accept it and stop trying to desperately obfuscate and downplay the PS4's specs.
#1 I'm not this Reiko individual.
#2 My source is rock solid.
#3 I'm not being trolled.
#4 And I'm not trolling anyone else.
Take it or leave it, you don't have to believe me.
Well, I hope others wise up and start being more cautious about who they regard as "sources". Because if you can't back it up on here, don't bring it up...
What first spawned the 'secret sauce' anyway? What was it rumored to be? I always thought it was supposed to be cloud computing (lol).
I think Agies was the first to use the term?
Arthur Gies probably.
Digital Foundry: There's a cluster of CPU cores [in PS4]. Their purpose in the PC market - I think there are dual and quad configurations - is for tablets. There are two of those [in PS4]. Then you have what I can only describe as a massive GPU...
Mark Cerny: I think of it as a super-charged PC architecture, and that's because we have gone in and altered it in a number of ways to make it better for gaming. We have unified memory which certainly makes creating a game easier - that was the number one feature requested by the games companies. Because of that you don't have to worry about splitting your programmatical assets from your graphical assets because they are never in the ratios that the hardware designers chose for the memory. And then for the GPU we went in and made sure it would work better for asynchronous fine-grain compute because I believe that with regards to the GPU, we'll get a couple of years into the hardware cycle and it'll be used for a lot more than graphics.
Now when I say that many people say, "but we want the best possible graphics". It turns out that they're not incompatible. If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame - for example during the rendering of opaque shadowmaps - that the bulk of the GPU is unused. And so if you're doing compute for collision detection, physics or ray-casting for audio during those times you're not really affecting the graphics. You're utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute.
http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny

Digital Foundry: Going back to GPU compute for a moment, I wouldn't call it a rumour - it was more than that. There was a recommendation - a suggestion? - for 14 cores [GPU compute units] allocated to visuals and four to GPU compute...
Mark Cerny: That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.
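For what it's worth, here's what that 14+4 split works out to in raw ALU terms. The 18-CU count, 64 ALUs per CU, and 800MHz clock are the commonly cited PS4 figures, not anything stated in the interview, so take this as a rough sketch:

```python
# Rough sketch of what the leaked "14 + 4" CU split means in raw throughput.
# CU count, ALUs per CU, and clock are the commonly cited PS4 figures.
ALUS_PER_CU = 64              # shader processors per GCN compute unit
GPU_CLOCK_HZ = 800e6
FLOPS_PER_ALU_PER_CYCLE = 2   # one fused multiply-add per cycle

def tflops(cus: int) -> float:
    return cus * ALUS_PER_CU * FLOPS_PER_ALU_PER_CYCLE * GPU_CLOCK_HZ / 1e12

print(f"All 18 CUs:           {tflops(18):.2f} TFLOPS")  # ~1.84
print(f"14 CUs for graphics:  {tflops(14):.2f} TFLOPS")  # ~1.43
print(f"4 CUs of 'extra' ALU: {tflops(4):.2f} TFLOPS")   # ~0.41
```

Reading Cerny's answer, the point seems to be that the extra ~0.4 TFLOPS of ALU is headroom you're encouraged to spend on GPGPU work, not that the last four CUs are somehow walled off from rendering.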
Ekim didn't deserve it IMO.