
Xbox One GPU Specs reveal date?

Minions

Member
The rumor was that if Microsoft wanted to improve their yields, they would have to downclock. It seems that they're just living with lower yields as they're launching in only a handful of countries.

"Living" with the yields also means they are spending a lot more on production if they plan to meet demand. Another reason for your additional $100 price tag.
 

nib95

Banned
Like I said, I can't wait. The damage control is going to be off the charts.

Not damage control, just common sense. With that post I was predicting, months in advance, what will happen. If devs have been using PS4 dev kits with mainly 2.2 GB, and only more recently 4 GB (GDDR5) of RAM, whilst the same devs have been using XO kits with 12 GB (DDR3) of RAM for a very long time, obviously the expectation is that PS4 games will look better over time, as devs get more time with the final dev kits, which will presumably have 8 GB of GDDR5 or more (dev kits usually have extra as a buffer).

Remember early on a few devs actually said the XO had the edge, but this was because it had far more RAM than the PS4, which made up for, at the time, the slight performance difference. The situation today is very different. The PS4 is notably more powerful and has the same amount of RAM, just much faster. But this is a very, very recent change.
 
5 billion transistors is probably my favourite released spec. Somebody managed to dredge up a number that came out on top and I salute her/him.
 
The rumor was that if Microsoft wanted to improve their yields, they would have to downclock. It seems that they're just living with lower yields as they're launching in only a handful of countries.
Yes. That was my thinking after seeing the low allocation numbers for the Xbone, but a lot of Gaffers have still been questioning whether the downclock rumors were true or not. Even someone else in this very thread asked Matt to confirm or deny the downclock rumors. There were no credible sources that corroborated the original rumor, but now Matt effectively denied it by confirming that the rumored specs are correct. So maybe people can stop asking now. That's my interpretation anyway, and Matt can speak up if I'm wrong.
 

nib95

Banned
Well that's a very huge difference, isn't it?

What do you mean? The fact of the matter is, early on devs were making PS4 games to use a maximum of only 2.2 GB of RAM or less. They were not developing games to use the system RAM that was part of the dev kit; that RAM was purely for running the OS and tools. The difference is that now devs will be making PS4 games to use 8 GB of RAM. I'm not even sure the final dev kits have been sent out yet, mind.
 

TheD

The Detective
Yes. That was my thinking after seeing the low allocation numbers for the Xbone, but a lot of Gaffers have still been questioning whether the downclock rumors were true or not. Even someone else in this very thread asked Matt to confirm or deny the downclock rumors. There were no credible sources that corroborated the original rumor, but now Matt effectively denied it by confirming that the rumored specs are correct. So maybe people can stop asking now. That's my interpretation anyway, and Matt can speak up if I'm wrong.

Who is Matt?
 

nib95

Banned
Means nothing. Either the final kit can run 1080p 60 fps or it can't.

PS1 hardware could theoretically run a game at 1080p at 60fps. In fact, any hardware could really. You'd just have to degrade the graphical quality and features accordingly. 1080p and 60fps is not limited by hardware performance (barring the fact that older consoles didn't support 1080p output). It is purely a design decision.

See Wipeout HD on PS3, which runs at 1080p/60fps.
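A rough back-of-the-envelope sketch (Python) of why resolution/framerate is a budget decision rather than a hard capability: it just multiplies out the raw pixel throughput each target needs, nothing hardware-specific. The 720p30 comparison point is my own illustrative choice.

Code:
# Raw pixel throughput needed for a given resolution/framerate target.
def pixels_per_second(width, height, fps):
    return width * height * fps

p1080_60 = pixels_per_second(1920, 1080, 60)  # ~124.4 million pixels/s
p720_30 = pixels_per_second(1280, 720, 30)    # ~27.6 million pixels/s

print(f"1080p60: {p1080_60 / 1e6:.1f} Mpixels/s")
print(f"720p30:  {p720_30 / 1e6:.1f} Mpixels/s")
print(f"Ratio:   {p1080_60 / p720_30:.1f}x")   # ~4.5x the raw pixel work per second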
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Only the clockspeed of the CPU is left unknown.

Apart from the leaked documents, I read that the rumored clockspeeds hit the TDP sweetspot of Jaguar/GCN. Can't find the source, but it would make sense given that the same clockspeeds had been rumored for both XB180 and PS4.
 
PS1 hardware could theoretically run a game at 1080p at 60fps. In fact, any hardware could really. You'd just have to degrade the graphical quality and features accordingly. 1080p and 60fps is not limited by hardware performance (barring the fact that older consoles didn't support 1080p output). It is purely a design decision.

See Wipeout HD on PS3, which runs at 1080p/60fps.

I'm talking about multiplats. Battlefield 4 is going to be a pretty good indicator of things to come.
 
What do you mean? The fact of the matter is, early on devs were making PS4 games to use a maximum of only 2.2 GB of RAM or less. They were not developing games to use the system RAM that was part of the dev kit; that RAM was purely for running the OS and tools. The difference is that now devs will be making PS4 games to use 8 GB of RAM. I'm not even sure the final dev kits have been sent out yet, mind.

http://www.vgleaks.com/orbis-devkits-roadmaptypes/

From January the Beta Kits are almost final, and I suppose the only variable now is the final clock of the CPU (I think some developers are hoping for a CPU clock increase in the final kit):

SoC Based Devkit

Available January 2013
CPU: 8-core Jaguar
GPU: Liverpool GPU
RAM: unified 8 GB for devkit (4 GB for the retail console)
Subsystem: HDD, Network Controller, BD Drive, Bluetooth Controller, WLAN and HDMI (up to 1920×1080@3D)
Analog Outputs: Audio, Composite Video
Connection to Host: USB 3.0 (targeting over 200 MB/s),
ORBIS Dualshock
Dual Camera
 

BigDug13

Member
I'm talking about multiplats. Battlefield 4 is going to be a pretty good indicator of things to come.

OK, well multiplats have been running at 1080p/60Hz on PC for years now with lower specs than the PS4. Are you seriously suggesting it's actually less capable than the XBO just based on E3 alpha builds?
 

nib95

Banned
I'm talking about multiplats. Battlefield 4 is going to be a pretty good indicator of things to come.

I don't think any of the launch titles are going to be good indicators really, though obviously they'll act as some sort of initial example. Best gauge is going to be second gen titles.
 

nib95

Banned
http://www.vgleaks.com/orbis-devkits-roadmaptypes/

From January the Beta Kits are almost final:

SoC Based Devkit

Available January 2013
CPU: 8-core Jaguar
GPU: Liverpool GPU
RAM: unified 8 GB for devkit (4 GB for the retail console)
Subsystem: HDD, Network Controller, BD Drive, Bluetooth Controller, WLAN and HDMI (up to 1920×1080@3D)
Analog Outputs: Audio, Composite Video
Connection to Host: USB 3.0 (targeting over 200 MB/s),
ORBIS Dualshock
Dual Camera

Not really. Dev kits usually have extra RAM over the retail unit to allow for debugging and other development advantages (you can't max out the console's optimal RAM use without it). The final dev kits afaik have not been sent out yet. They will likely have more than 8 GB of GDDR5 RAM in them.
 

chris0701

Member
Even KZ:SF's RAM usage had already exceeded 4 GB back in February for the PS4 announcement event reveal, so stop claiming the 2.2 GB devkits were used for a long time.
 
Not really. Dev kits usually have extra RAM over the retail unit to allow for debugging and other development advantages (you can't max out the console's optimal RAM use without it). The final dev kits afaik have not been sent out yet. They will likely have more than 8 GB of GDDR5 RAM in them.

Not always. If I remember correctly, the X360 final kits also had 512 MB of RAM (the console itself was only going to have 256 MB until near launch), and it was like that for a long time. If I'm right, a devkit with more than 8 GB of RAM would require redesigning the APU to modify the memory controller.
 

strata8

Member
You can use something like Crysis 3 as a very rough basis for comparison between the two console GPUs. For multiplats, at least.

[Crysis 3 comparison GIF: 7770 vs 7850]


7770
1.28 TFLOPS (XB1 - 1.24 TF)
16 ROPS (XB1 - 16 ROPS)
1GB @ 72 GB/s (XB1 - 5GB @ 67 GB/s)

7850
1.76 TFLOPS (PS4 - 1.84 TF)
32 ROPS (PS4 - 32 ROPS)
2 GB @ 154 GB/s (PS4 - 7GB [?] @ 176 GB/s)
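For what it's worth, those TFLOPS figures fall straight out of the rumored CU counts and clocks using the standard GCN arithmetic (64 ALUs per CU, 2 ops per clock via fused multiply-add). A quick sketch assuming ~800 MHz for both chips, which is why the XB1 number lands a shade under the 1.24 quoted above; the actual clock is still unconfirmed.

Code:
# Theoretical GCN throughput: CUs x 64 ALUs x 2 ops/clock x clock.
def theoretical_gflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1000.0

print(theoretical_gflops(18, 800))  # PS4 (rumored): 18 CUs @ 800 MHz -> ~1843 GFLOPS (~1.84 TF)
print(theoretical_gflops(12, 800))  # XB1 (rumored): 12 CUs @ 800 MHz -> ~1229 GFLOPS (~1.23 TF)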
 
You can use something like Crysis 3 as a rough basis for comparison between the two console GPUs. For multiplats, at least.

[Crysis 3 comparison GIF: 7770 vs 7850]


7770
1.28 TFLOPS (XB1 - 1.24 TF)
16 ROPS (XB1 - 16 ROPS)
1GB @ 72 GB/s (XB1 - 5GB @ 67 GB/s)

7850
1.76 TFLOPS (PS4 - 1.84 TF)
32 ROPS (PS4 - 32 ROPS)
2 GB @ 154 GB/s (PS4 - 7GB [?] @ 176 GB/s)

Those pics look worse than TLOU.
 

chris0701

Member
Maybe it will be like the PS Vita, where the CPU/GPU frequencies are never publicly revealed.
Because they're slower than the new iPad :lol
 
I don't know about the CPU specs, but the Xbox One has a crapload of transistors inside it. Like billions!!!!!

It should be really good


[/sarcasm]
 

BigDug13

Member
So the differences between the XB1 and PS4 are smaller than that?

No, but showing two pictures, neither of them representative of the graphics we will actually see, doesn't really tell us anything. Yes, I can see a difference between those pictures, but are you suggesting that's the difference we will actually see, and that those pics represent what we can actually expect?
 

nib95

Banned
Even KZ:SF's RAM usage had already exceeded 4 GB back in February for the PS4 announcement event reveal, so stop claiming the 2.2 GB devkits were used for a long time.

Split among system, sound and graphics uses, sure. Only 3 GB is being used exclusively for graphics.

http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall

So basically, they can double it and still have that extra 2 GB left for system uses.


And why should I stop claiming the aforementioned? The SoC-based dev kits were only sent out in January 2013. That means up until that point devs were only working with the 2.2 GB dev kit version. Assuming KZ:SF started development after KZ3 (Feb 2011), GG were likely using a dev kit based around 2.2 GB of RAM for at least two years (until the new dev kits were sent out in Jan 2013).

http://www.vgleaks.com/orbis-devkits-roadmaptypes/
 

strata8

Member
1. Both PS4 and XBox One use custom processors. Does a HD7770 have eSRAM? Does it have DMEs? Does a HD7850 have 8 ACEs and 64 compute queues? Does it have two graphics pipelines?

2. Consoles are a completely different development environment. Different APIs (not that handbrake called DirectX for Windows which can't even control the 2 ACEs/2queues of an HD7970), much more optimization, etc...

That's fair enough, and you obviously know more about it than I do. I've edited the post.

Aren't ACEs related to GPU compute?
 

flying dutchman

Neo Member
MS previously put out strong specs with the Xbox and 360, but decided the Wii showed this wasn't needed to sell consoles. While the PS3 had a more powerful CPU than the 360, the 360 was actually a very powerful piece of kit when it was released. This gen, not so much. While I was never a PS guy, I did appreciate Sony trying to push the specs for their hardware. It was good for the industry.
So much of what MS did this time around has disappointed me to the point where I'm going Sony this gen for the first time.
 
1. Both PS4 and XBox One use custom processors. Does a HD7770 have eSRAM? Does it have DMEs? Does a HD7850 have 8 ACEs and 64 compute queues? Does it have two graphics pipelines?

2. Consoles are a completely different development environment. Different APIs (not that handbrake called DirectX for Windows which can't even control the 2 ACEs/2queues of an HD7970), much more optimization, etc...
Out of interest, what are DMEs? And what is the advantage of two pipelines?
 
1. Both PS4 and XBox One use custom processors. Does a HD7770 have eSRAM? Does it have DMEs? Does a HD7850 have 8 ACEs and 64 compute queues? Does it have two graphics pipelines?

2. Consoles are a completely different development environment. Different APIs (not that handbrake called DirectX for Windows which can't even control the 2 ACEs/2queues of an HD7970), much more optimization, etc...

To be fair the 2 graphics pipelines are only for smooth OS pop outs...
 
You can use something like Crysis 3 as a very rough basis for comparison between the two console GPUs. For multiplats, at least.
Hmm... that's not very noticeable at all. Only the system warz crew will fight over these details; the average Joe won't even notice a difference.
 
Exactly. ACE stands for Asynchronous Compute Engine (= compute pipeline). The PS4 has 8 ACEs and 64 compute queues. A HD7970, for example, only has 2 ACEs and 2 compute queues.

In the abstract, this means that each of the 8 Jaguar CPU cores can grab a number of shader cores to help it with some tasks. Combined with the unified address space and the hUMA RAM, the PS4 will have great GPGPU performance.



Data Moving Engines. Microsoft integrated them to relieve the shader cores which would otherwise have to copy all the data into the eSRAM.



Yeah, but I named two things for XBox and I wanted to name two things for PS as well! ;D

IMHO the best characteristic you should have pointed out is the Onion+ bus that allows bypassing the GPU caches. That's where the compute magic will show!
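A very loose, CPU-side analogy (Python) for what all those compute queues buy you: several independent queues feeding a shared pool of workers, so compute jobs don't have to wait in line behind a single submission queue. The queue and worker counts are stand-ins, not a model of how the GPU actually schedules wavefronts.

Code:
import queue
import threading

NUM_QUEUES = 8   # stand-in for the 8 ACEs
NUM_WORKERS = 4  # stand-in for the shared shader cores

work_queues = [queue.Queue() for _ in range(NUM_QUEUES)]

def worker(queues):
    # Each worker drains whichever queue has work, roughly like shader cores
    # picking up compute wavefronts alongside graphics wavefronts.
    for q in queues:
        while True:
            try:
                job = q.get_nowait()
            except queue.Empty:
                break
            job()

# Independent producers (think CPU cores) drop jobs onto their own queues.
for i, q in enumerate(work_queues):
    q.put(lambda i=i: print(f"compute job from queue {i}"))

threads = [threading.Thread(target=worker, args=(work_queues,)) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()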
 

Espada

Member
Exactly. ACE stands for Asynchronous Compute Engine (= compute pipeline). The PS4 has 8 ACEs and 64 compute queues. A HD7970, for example, only has 2 ACEs and 2 compute queues.

In the abstract, this means that each of the 8 Jaguar CPU cores can grab a number of shader cores to help it with some tasks. Combined with the unified address space and the hUMA RAM, the PS4 will have great GPGPU performance.



Data Moving Engines. Microsoft integrated them to relieve the shader cores which would otherwise have to copy all the data into the eSRAM.



Yeah, but I named two things for XBox and I wanted to name two things for PS as well! ;D

People seem to overlook this for some reason. Cerny went on at length about the machine's GPGPU capabilities and the specific modifications they made to maximize that. The benefits of GPU assisted processing in gaming are pretty substantial, and you can expect any developer worth their salt to take advantage of this. It'll be this that'll give us the noticeable elevation of 2nd gen+ games above those that came before. You get pretty cool path finding possibilities with this, for instance.

This also applies to the One and Wii U, but to a lesser extent.
 
I'm sure I'm not the only one wondering: when will MS reveal the official specs of the Xbox One's GPU? Have they been talking about the cloud to delay the inevitable, or is the GPU indeed so much weaker than the PS4's that they won't mention it at all?

The reason I ask is that with a $100 higher price tag than the PS4, and as far as multi-platform games go, the GPU specs are definitely important to reveal.

They probably won't reveal the specs. If they are as rumored, they would look inferior. But if they say 8 GB RAM, 8-core CPU, they don't look inferior.

Anyways, the games looked awesome at E3, so I wouldn't worry.

I've heard rumblings that these architectures don't get much gain from using more than 14 CUs on graphics anyway (this is where the 14+4 split vgleaks rumor started: Sony developer suggestions/slides to use a 14/4 graphics/compute split, since more than 14 CUs on graphics granted little return). So maybe 12-14 CUs is the sweet spot anyway.

It wasn't said what's limiting them; it has to be the CPU or bandwidth, and my guess is the CPU. 1.6 GHz is kinda slow, even with 6 of them.

Also, the eSRAM can be an advantage.

If MS successfully convinces people "this black box's shadowy innards are ~equivalent to that other black box's shadowy innards, pay no attention to the man behind the curtain" they've done their job. So far it's working imo. Joe Blow isn't going to see a difference based on E3. But we'll see in the future.
 

M_A_C

Member
Exactly. ACE stands for Asynchronous Compute Engine (= compute pipeline). The PS4 has 8 ACEs and 64 compute queues. A HD7970, for example, only has 2 ACEs and 2 compute queues.

In the abstract, this means that each of the 8 Jaguar CPU cores can grab a number of shader cores to help it with some tasks. Combined with the unified address space and the hUMA RAM, the PS4 will have great GPGPU performance.



Data Moving Engines. Microsoft integrated them to relieve the shader cores which would otherwise have to copy all the data into the eSRAM.



Yeah, but I named two things for XBox and I wanted to name two things for PS as well! ;D

How big of a deal is that eSRAM anyways? Do we know yet?
 
You can use something like Crysis 3 as a very rough basis for comparison between the two console GPUs. For multiplats, at least.



7770
1.28 TFLOPS (XB1 - 1.24 TF)
16 ROPS (XB1 - 16 ROPS)
1GB @ 72 GB/s (XB1 - 5GB @ 67 GB/s)

7850
1.76 TFLOPS (PS4 - 1.84 TF)
32 ROPS (PS4 - 32 ROPS)
2 GB @ 154 GB/s (PS4 - 7GB [?] @ 176 GB/s)

You're neglecting the eSRAM in the bandwidth comparison, which is patently ridiculous (you better believe MS didn't spend 1.6 billion transistors on it to do nothing).

But other than that, cool comparison and pic, and I agree 7850/7770 on paper (though I will say it's impossible to tell at this stage how actual results will fare).
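Quick sanity check on that 1.6 billion figure: assuming a standard 6-transistor SRAM cell (my assumption, not something MS has confirmed), 32 MB of eSRAM works out to almost exactly that before you even count the surrounding logic.

Code:
# 32 MB of eSRAM at 6 transistors per bit (assuming standard 6T SRAM cells).
esram_bits = 32 * 1024 * 1024 * 8
transistors = esram_bits * 6
print(f"{transistors:,}")  # 1,610,612,736 -> ~1.6 billion transistors for the cell array alone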
 
It wasn't said what's limiting them; it has to be the CPU or bandwidth, and my guess is the CPU. 1.6 GHz is kinda slow, even with 6 of them.

Has the 1.6 GHz number been confirmed for both consoles? I remember talk of the PS4 at least being 1.8+.

There are still a lot of unknowns about the consoles' specs that I feel will not become clear until after release. For example, has the Xbox been downclocked, and how many resources do the three OSes take?

On the PS4 side there has been talk of the OS using between 0.5 and 2 GB of RAM, and the final CPU speed is also unknown.

I think both consoles will have amazing looking games, with most third-party games being exactly the same. It is going to be interesting to see if Sony can make the extra power show in their first-party games over the next couple of years. Naughty Dog, it's over to you.
 