I wonder if some hacker will finally find out the final clock speeds of the PS4, now that it is in the wild.
I still can't believe we don't know this. I'm almost positive it's 1.6GHz but it's so odd how much information we have on everything else but the CPU.
It would be hugely ironic if the PS4 actually had more audio processing capability reserved for games, whereas the XBO would need to dedicate a non-trivial amount of it to Kinect.
I am also confident that it is 1.6 GHz. Would still be nice to have confirmation, and maybe additional information about the operating system's final resource allocation.
Also 10% of the GPU reserved, and less RAM allocated than the PS4?

Except the PS3 launched with an already-old GPU and half the RAM a whole year later. This time the PS4 has a much better GPU, considerably faster RAM, and none of the OS and Kinect baggage of the XBO, which for example takes two CPU cores all the time.
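The "faster RAM" claim is easy to sanity-check from the commonly reported memory configurations (256-bit buses on both machines, 5.5 GT/s GDDR5 on the PS4, 2.133 GT/s DDR3 on the XBO); these figures are assumptions from public spec reporting, not official numbers. The gap works out to roughly 2.6x on peak main-memory bandwidth:

```python
# Peak main-memory bandwidth from bus width and transfer rate.
# Assumed configs (not official): 256-bit buses on both consoles,
# 5.5 GT/s GDDR5 (PS4) vs 2.133 GT/s DDR3 (Xbox One).

def bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers/second."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

ps4_gddr5 = bandwidth_gb_s(256, 5.5)    # 176.0 GB/s
xbo_ddr3  = bandwidth_gb_s(256, 2.133)  # ~68.3 GB/s

print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s")
print(f"XBO DDR3:  {xbo_ddr3:.1f} GB/s")
print(f"Ratio:     {ps4_gddr5 / xbo_ddr3:.2f}x")
```

That matches the 176 GB/s and ~68 GB/s figures quoted elsewhere in this thread.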
Well, that's interesting. Hmmmmm
Hmmmmm
This should be pointed out: according to the FCC filings for both the PS4 dev kit (https://apps.fcc.gov/oetcf/eas/reports/ViewExhibitReport.cfm?mode=Exhibits&RequestTimeout=500&calledFromFrame=N&application_id=435177&fcc_id=AK8DUTD1000) and the retail PS4 (https://apps.fcc.gov/oetcf/eas/reports/ViewExhibitReport.cfm?mode=Exhibits&RequestTimeout=500&calledFromFrame=N&application_id=583848&fcc_id=AK8CUH100C1), the PS4 is supposed to have Bluetooth 4.0 in addition to Bluetooth 2.1 + EDR. In both of those links, the info is in the last .pdf file, called "FCC 15C Report"; it should be on page 5.
Now that it's confirmed that the wireless radio in the PS4 is a "Marvell Wireless Avastar 88W8797", we can 100% confirm that the PS4 has Bluetooth 4.0. Here's a link for the .pdf file info on the Marvell Wireless Avastar 88W8797 SoC:
http://www.marvell.com/wireless/assets/marvell_avastar_88w8797.pdf
I wonder why Sony doesn't list Bluetooth 4.0 as a tech spec feature on the PS4.
It means that x1 doesn't have the advantage of being the only console to offload sound processing from the CPU.
Looks like the PS4 had 20 CUs after all (2 disabled).
Would have been nice if they weren't. The PS4 could have been a nice 2+ TFLOP machine.
I'm still shocked this wasn't well known by most people with any interest/knowledge in this kind of stuff.
It was pretty much assumed by everyone.
"You forget about the eSRAM. It is not just DDR3. eSRAM has theoretical speed of about 204gb/s. DDR3 has about 68gb/s. Now. PS4 has 176gb/s of RAM speed due to GDDR5. Now, here is how the RAM works on the X1. You can add the RAM speeds together. You are be able to theoretically get 204+68 = 272gb/s. That's not realworld though. You get that through perfect coding. Never gonna happen. What you can get is 2 states: Good coding or Bad coding. Good code, you should be able to reach 200gb/s. Bad code, you will get 140-150gb/s. So it is in the developers hands. If they code good games (which we have to assume they will, except for launch), they will hit around 190 to 200gb/s, faster than the PS4. Depends on the coding. You can expect exclusives to hit that all the time. Multiplats is another story because of against-factors. You have a little more work to do with the X1. It is not about difficulty of development. It is about time and laziness. Can you trust developers to not be lazy on the X1 when PS4 doesn't require as much effort? It's up in the air at this point."
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical... but I lack the technical knowledge to debunk it. Would you guys mind?

I'm even a bit confused about what's being said. Faster and better but harder to code for?
The statement completely ignores the fact that this high bandwidth is not only more difficult to achieve but also only applies to the tiny 32 MB space. The Xbox One might have an advantage in some special cases, but for the most part the PS4's solution is without doubt the faster one (and that's regardless of how much optimization you do).
Also, I wouldn't talk about "dev laziness". It's all a cost-benefit equation.
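The "204 + 68 = 272 GB/s" arithmetic in the quote can be sanity-checked with a toy model. A minimal sketch, assuming the quoted peak figures (204 GB/s eSRAM, 68 GB/s DDR3) and a simple traffic-split model: the eSRAM rate only applies to the fraction of memory traffic that fits in its 32 MB, so the achievable aggregate is a blend, not a free sum.

```python
# Toy model: a fraction f of all memory traffic targets eSRAM and (1 - f)
# targets DDR3, with both pools streaming concurrently. The aggregate rate
# R is capped by whichever pool saturates first:
#   R * f <= esram_gb_s   and   R * (1 - f) <= ddr3_gb_s
# Figures are the quote's own theoretical peaks, not measured numbers.

def aggregate_bandwidth(esram_fraction: float,
                        esram_gb_s: float = 204.0,
                        ddr3_gb_s: float = 68.0) -> float:
    f = esram_fraction
    if f <= 0.0:
        return ddr3_gb_s
    if f >= 1.0:
        return esram_gb_s
    return min(esram_gb_s / f, ddr3_gb_s / (1.0 - f))

for f in (0.25, 0.50, 0.75, 0.90):
    print(f"{f:.0%} of traffic in eSRAM -> {aggregate_bandwidth(f):.0f} GB/s")
```

Even this generous model only reaches 272 GB/s when exactly 75% of *all* memory traffic hits the 32 MB pool; at a 50% hit rate the aggregate is 136 GB/s, below the PS4's 176 GB/s peak.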
It's all meaningless drivel.
Also, doesn't it have only 16 ROPs versus 32 on the PS4? That means it has just a little over half the latter's fill rate.
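The "little over half" fill-rate figure checks out from a back-of-envelope calculation, assuming the widely reported configurations (16 ROPs @ 853 MHz for the Xbox One, 32 ROPs @ 800 MHz for the PS4) and the usual one-pixel-per-ROP-per-clock peak assumption:

```python
# Peak pixel fill rate from ROP count and GPU clock, assuming one pixel
# per ROP per clock. Configs are the commonly reported ones, not official.

def fillrate_gpix_s(rops: int, clock_mhz: float) -> float:
    """Peak fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

xbo = fillrate_gpix_s(16, 853)  # ~13.6 Gpix/s
ps4 = fillrate_gpix_s(32, 800)  # 25.6 Gpix/s
print(f"XBO: {xbo:.1f} Gpix/s, PS4: {ps4:.1f} Gpix/s, ratio {xbo / ps4:.2f}")
```

The ratio lands around 0.53, i.e. just over half, as stated above.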
The Bone is worse than the PS4 in computational power by roughly a third. No amount of eSRAM will make up for that kind of gap.
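The "worse by a third" figure follows from the commonly cited GPU configurations, which are assumptions from public reporting rather than official specs: 12 CUs @ 853 MHz (Xbox One) versus 18 CUs @ 800 MHz (PS4), with 64 shader lanes per CU and 2 FLOPs per lane per clock (fused multiply-add):

```python
# Peak single-precision GPU throughput from CU count and clock.
# Assumes 64 lanes per CU and 2 FLOPs/lane/clock (FMA), per the usual
# GCN accounting; CU counts and clocks are the commonly reported figures.

def gpu_gflops(cus: int, clock_mhz: float, lanes_per_cu: int = 64) -> float:
    return cus * lanes_per_cu * 2 * clock_mhz / 1000.0

xbo = gpu_gflops(12, 853)  # ~1310 GFLOPS
ps4 = gpu_gflops(18, 800)  # ~1843 GFLOPS
print(f"XBO {xbo:.0f} GFLOPS vs PS4 {ps4:.0f} GFLOPS "
      f"({1 - xbo / ps4:.0%} deficit)")
```

That works out to a deficit of about 29%, so "a third" is a fair rounding.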
Are there still no proper comparisons for Xbone/PS4 multiplats?
WTF is going on?
^ When I see a Halo CE/Killzone disparity, I'll agree that the gulf is as big as the Xbox/PS2 gap.
Is the PS4 bandwidth-starved for the ROPs it has?
With the games showing, that doesn't look true at all. It seems the Xbox One isn't in any position to close the gap with the PS4.

So far nearly every PS4 version has been far superior to the Xbox One's.
PS4 versions of games being better doesn't imply that bandwidth isn't sometimes a bottleneck for the GPU's I/O units.
(If it is a bottleneck on occasion, I really wouldn't be surprised; that's not exactly unheard of for consoles without embedded memory pools.)
I would say the only thing Microsoft can do is upclock the CPU and GPU again. I think Harrison said they could do it again. If they don't, then yeah, MS is in trouble. Just look at all the multiplats having a notable performance/resolution increase on the PS4.
Yes, but I'm not seeing how this is relevant.

But I don't think the Xbox One got the best version just because of a PS4 bottleneck. Remember, every piece of hardware has bottlenecks; it's called limited capacity.
I was thinking about this difference the other day and I had a very interesting discussion with my friend about this very topic. My argument was that PS4 vs X1 is as big of a gap as OGXbox vs PS2 era. I had some observations to support my argument:
Back in PS2 era, SDTVs were the norm.
Most SDTVs were able to output pictures at 576i/50fps (PAL) or 480i/60fps (NTSC)*; of course, framerates could drop below the 50/60 fps maximum.
During that era, the PS2, GC, and OG Xbox HAD to target only ONE resolution (depending on PAL vs NTSC). So each console first had to hit that resolution, then use what was left of its power to increase graphics and texture quality. That's why Splinter Cell (OG Xbox) almost looked like a different game compared to its PS2 counterpart. The PS2 didn't have the luxury of lowering the resolution and stretching the image, because there was no lower resolution back then.
These days we have the PS4 with a massive power advantage over the Xbone, but we also have HDTVs with several common resolutions: 720p, 900p, and 1080p. So every time the Xbone struggles to keep up with the PS4, developers just lower the resolution to the next stop. This step frees up a considerable amount of power (a 900p frame has about 31% fewer pixels than 1080p, and a 720p frame about 56% fewer), and devs can use the freed power to maintain the same texture quality between the versions. I think this trend will continue throughout this generation, or until the PS4 starts tapping into GPGPU compute; at that point the Xbone will seriously need more than lowering the resolution to keep multiplatform games looking similar (by similar I mean a game with all the same effects, bells, and whistles ASIDE from resolution and fps). Let's assume this hypothetical situation where the PS4 utilizes six of its CUs for compute:
PS4 : 12CU rendering + 6 CUs Compute.
in order to achieve similar compute performance:
Xbone: 6CUs rendering + 6 CUs compute.
Notice that the Xbone's rendering capacity (graphics ability) is only half the PS4's in this case! So I can see huge graphical performance disparities coming later this gen between the two, much greater than those during the PS2/Xbox era.
Also, let's assume another hypothetical situation where we have a single-resolution standard (either 720p or 1080p). In this case the PS4 would easily pull ahead of the Xbone by using the remaining power to enhance textures or IQ, or even step up the framerate.
Here's a small example to explain to the average Joe the power required to run a game at 1080p vs 720p (using COD on PS4 vs X1):
PS4 : 1080p @ 60fps
X1: 720p @ 60fps
Imagine a scenario where you have three 720p TVs hooked up to a single HDMI output. The more power the console has, the more TVs show a picture.

X1: one TV outputs the game at 720p; the other two are black (turned off due to lack of power).

PS4: two TVs show the exact same 720p picture as the X1, plus the last TV is on with a 480i picture!
So basically the PS4 is rendering 2.25 times the resolution while maintaining the same framerate, and might have better IQ!
FOR DOUBTERS: did you see how big a difference that is (power-wise)?
My conclusion is that the idea of an Xbox-vs-PS2-sized power difference this gen is very true and realistic; it just happens to be less visible due to different output technologies these days.
* I'm fully aware that some later SDTVs were able to output 720p, but that resolution was rarely used by games back then.
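The pixel counts behind the resolution steps in the post above can be checked directly; this is just simple arithmetic on the standard 16:9 frame sizes:

```python
# Pixels per frame for the three common HD resolutions, plus the savings
# from each downgrade step discussed in the post.

resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

print(f"1080p/720p ratio:   {pixels['1080p'] / pixels['720p']:.2f}x")
print(f"1080p -> 900p saves {1 - pixels['900p'] / pixels['1080p']:.0%}")
print(f"1080p -> 720p saves {1 - pixels['720p'] / pixels['1080p']:.0%}")
```

A 1080p frame is exactly 2.25x a 720p frame, which is where the "two and a bit 720p TVs" analogy comes from; dropping to 900p saves about 31% of the pixels, and dropping to 720p saves about 56%.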
What does GAF think?
Forgive my weak grammar.
Upclocks won't do much in terms of performance. There are other fundamental constraints that limit the XBO versus the PS4.
Um, consoles did run at lower resolutions in the PS2/GameCube/Xbox era.