
EuroGamer: More details on the BALANCE of XB1

KKRT00

Member
They essentially asked exactly what Microsoft wanted to talk about.

I posted the list of Penello's claims that led to this earlier in the thread:

This Digital Foundry article is, for all intents and purposes, a platform for them to elaborate on, clarify and/or espouse the above claims in more depth. It essentially follows the same pattern as those claims. It's not some spur-of-the-moment, coincidental interview; it's a reactive PR event.

Does it have information? Sure. Is it ultimately still something of a puff piece? Most assuredly. It really doesn't help that in the prior article Leadbetter essentially parroted the same line about the XB1 being balanced and wrote something to the effect of the PS4 being unbalanced.
And it's good that they've covered those topics, because those topics were in question. I ask again: what other questions should he have asked?

Leadbetter never said that the PS4 is unbalanced; actually, he has said countless times that the PS4 is more balanced than the Xbone.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
What if the bike can exist in two different places at the same time using low latency? Also, PRT turns the basket into a black hole.

Just stop, it's embarrassing.
 
Kaz has perfect balance too



http://www.youtube.com/watch?v=OEQUfIXamkM

AMAZING!
 

AlphaDump

Gold Member
What if the bike can exist in two different places at the same time using low latency? Also, PRT turns the basket into a black hole.

Then we start mentioning packing, loading, unloading and sorting times,

and existence collapses into itself.
 
Dude, Ryse stands right now as, imo, the most amazing-looking game on either system. I don't need to focus on all these multi-platforms that I haven't seen yet. Dude, I've seen FIFA on both PS4 and Xbox One. Do you see a huge difference there, either? That's one game we can talk about that's multi, right? Oh, but there's a catch, I know. It's FIFA :p

Whatever the case, I'm not living and dying on the numbers. We already know Sony's kicking MS's ass on the raw numbers. Nobody denies it. Nobody is even denying that it will have an impact. Sony's console is stronger, period. But just how much that will show up in the real world is the question. That's all I'm saying. Personally, I would love it if the PS4's advantage showed up every bit as much as the raw numbers indicate, because I'm going to own the system. However, realistically, I don't think it's going to happen. Call me one of those stubborn people who have to see it before they believe it. Is that so wrong?

Yea, that 900p game looks more impressive graphically than Killzone and Infamous to me. Listen, this isn't me saying Killzone and Infamous look like shit. They look fucking unbelievable. Unbelievable. I just think Ryse looks more crazy graphically. It doesn't matter to me what resolution it's doing it at. All that matters is what I see and how it registers with me on a visual level, and Ryse's SP looks quite stunning. If people want to believe that Killzone and Infamous seriously look 44% or 50% superior graphically, then that's their business.


The only thing that's "crazy" about this discussion is your opinion that Ryse looks better than KZ:SF or InFamous, not Ryse's graphics themselves. What exactly is so amazing about Ryse's visuals?

Ignoring the massive downgrade in resolution compared to those two titles (900p vs. 1080p), let alone framerate (60 fps for KZ:SF's multiplayer), the lighting, character models, action, background geometry, animation, etc. look noticeably worse than in those two titles.

Is this argument about "art style", which is more subjective? Because that's the only thing I can think of right now that may put Ryse ahead for some. It's the same kind of crazy argument we heard this gen, with Mario Galaxy being claimed to look better than some of the best PS360 titles.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Meh. I'd be hard pressed to tell or care which was 900p or 1080p from couch distance.

Fewer pixels mean more aliasing; you do see aliasing, right?

Texture crawling and shimmering (texture aliasing) or "jaggies" (edge aliasing).
 

Vizzeh

Banned
Some questions I would have liked answered by Microsoft

Does their graphics chip support DX11.2?

According to this, it doesn't:

images.eurogamer.net/2013/articles//a/1/6/1/2/9/1/9/77.png/EG11/resize/1200x-1

DX11.2 supports a lot of fancy stuff that will aid next gen, things like scalable frame buffers and tiled resources. Tiled resources will be particularly useful: they allow textures in excess of 10GB to be streamed through a tile pool as small as 16MB held in RAM. That saves memory, because distant textures are seen at low quality, and as you get closer those textures INCREASE in quality.

All Radeon 7000-series cards with DX11.1 can be updated via drivers to DX11.2, BUT not all the features are present, like Tier 2 tiled resources (Tier 1 is available via software); Tier 2 requires hardware support. You can check which tier a device exposes, as in the sketch below.
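For reference, on the PC side you can query at runtime which tier a device actually exposes. A minimal sketch, assuming an already-created D3D11.2 device (error handling omitted):

```cpp
// Query the tiled-resources tier exposed by a D3D11.2 device.
#include <d3d11_2.h>

D3D11_TILED_RESOURCES_TIER GetTiledResourcesTier(ID3D11Device2* device)
{
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                &opts, sizeof(opts));
    // One of: D3D11_TILED_RESOURCES_NOT_SUPPORTED, _TIER_1, _TIER_2.
    return opts.TiledResourcesTier;
}
```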

The PS4 supports DX11.2+ (from the GDC Sony developer briefing http://www.gdcvault.com/play/1019252/PlayStation-Shading-Language-for):

Modern GPU
- DirectX 11.2+/OpenGL 4.4 feature set
- With custom SCE features
- Asynchronous compute architecture
- 800MHz clock, 1.843 TFLOPS
- Greatly expanded shader pipeline compared to PS3™
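(As a sanity check, that 1.843 TFLOPS figure falls straight out of the standard GCN arithmetic, assuming the PS4's widely reported 18 CUs at 64 lanes per CU and 2 FLOPs per lane per clock:

$$18 \times 64 \times 2 \times 800\,\text{MHz} = 1843.2\ \text{GFLOPS} \approx 1.843\ \text{TFLOPS}$$)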

So can we safely assume the X1 supports DX11.1 with a software upgrade to DX11.2?

Whereas Sony explicitly states 11.2+, meaning they will support the hardware variant, allowing for Tier 2 tiled resources and a few other goodies explained here:

http://msdn.microsoft.com/en-us/library/windows/apps/bg182880.aspx (good read)
 
"Balanced" - is that another way of saying less powerful, once it's been through the PR spin machine?

So much disingenuous rubbish coming from MS, throwing out so many figures and walls of text in the hope of muddying the waters.

No, that would be your skewed interpretation.

Balanced means balanced. The tech guys clearly explained in detail what they meant in the article.
 
And it's good that they've covered those topics, because those topics were in question. I ask again: what other questions should he have asked?

Leadbetter never said that the PS4 is unbalanced; actually, he has said countless times that the PS4 is more balanced than the Xbone.
People in this thread almost immediately picked up on the inadvertent admission that their lack of improvement from enabling 2 extra CUs compared to an upclock would suggest a different bottleneck affecting performance, in the number of ROPs they have for example - which could have been a point to bring up. Although on further inspection, as it was a conference call rather than a direct interview, perhaps there wasn't much room to challenge directly or ask off-script questions. But the write-up frankly doesn't read as if any critical review of the information was done; it simply regurgitates it at face value. And I thought the former was supposed to be part of DF's MO, hence titling their articles DF vs ____.

As for the PS4 and balance I posted the direct quote earlier in the thread, it's from another article:
According to inside sources at Microsoft, the focus with Xbox One was to extract as much performance as possible from the graphics chip's ALUs. It may well be the case that 12 compute units was chosen as the most balanced set-up to match the Jaguar CPU architecture. Our source says that the make-up of the Xbox One's bespoke audio and "data move engine" tech is derived from profiling the most advanced Xbox 360 games, with their designs implemented in order to address the most common bottlenecks. In contrast, despite its undoubted advantages - especially in terms of raw power, PlayStation 4 looks a little unbalanced by comparison.
I'm not sure where all these countless times that Leadbetter, or DF in general, has called the PS4 more balanced are supposed to be...
 

Vizzeh

Banned
As always, better to reduce res for a stable framerate. This hasn't changed.

1080p is supposed to be a NEXT GEN benchmark... if it can't be achieved, it's a failure imo

"Dynamic resolutions" doesnt cut it imo...

Yes, ofc frame rate will be more important, but it shouldn't be an issue to start with :)
 

KKRT00

Member
Ignoring the massive downgrade in resolution compared to those two titles (900p vs. 1080p), let alone framerate (60 fps for KZ:SF's multiplayer), the lighting, character models, action, background geometry, animation, etc. look noticeably worse than in those two titles.
60fps in MP in KZ:SF must be proven, because for now it's 30-40 most of the time. And the latest footage of singleplayer had drops to the 20s in some heavier scenes.

Lighting? Read about the solutions in those games; Ryse is completely dynamic in terms of lighting, and it also has a much better shadowing system than both titles combined.
Character models? Read about that in both games too, because Ryse wins here: not only do the character models have more polys, there are also more of them on screen.
Animation? Technically, the animations in Ryse are very detailed, probably the most detailed of all the games shown; they just don't have transitions.
Background geometry? We haven't seen too much, but the city in KZ:SF's background has less than 500k polys after culling, which is not much, and there is no other game with grass like Ryse's; that grass spans through to the horizon, and all of it is geometry with physics.
Cloth physics and body physics are best in Ryse.
Motion blur and bokeh are best in Ryse.
We can't really talk about particles, because all the games are different, but in all of them the particles are affected by wind, lit by light sources and shadowed by objects.
We don't know anything about the water in Infamous and KZ:SF, but in Ryse it's tessellated, it's FFT-based and it generates real-time caustics.
All the games use POM, and probably all also use tessellation for some geometry.
All of those games use real-time reflections.
Etc. So no, Ryse is not technically inferior in any way.

===
People in this thread almost immediately picked up on the inadvertent admission that their lack of improvement from enabling 2 extra CUs compared to an upclock would suggest a different bottleneck affecting performance, in the number of ROPs they have for example - which could have been a point to bring up.

Judging by sebbbi's posts on Beyond3D, the ROPs are a limiting factor only if shaders are written in a simple way.

--
I'm not sure where all these countless times that Leadbetter, or DF in general, has called the PS4 more balanced are supposed to be...
In every comparison between both platforms?
 

onQ123

Member
If I were AMD, I wouldn't take kindly to the claims of "diminishing returns" at 12 CUs when the HD 7970 has 32 CUs.
 
I think their attempts to prove that the performance is a wash mean they are not happy with their box's performance, actually.

I mean... people have to jump through so many "what if" hoops to get to the conclusion that the X1 is not blown out of the water. The games will show what's up soon enough. I excitedly await the craziness that will occur when BF4 comes out... multiplatform comparisons will be completely glorious.

No, I disagree. There is a huge preconception on the internet (as your second paragraph alone highlights) that the PS4 is leaps and bounds ahead of the X1 from a technical standpoint. I believe they are perfectly satisfied with their box; it's unlikely they would go into such detail about how the console works at the Hot Chips conference if they weren't. They made the console, but they have to sell it, so there's nothing wrong with sharing some technical info.
 

nib95

Banned
And it's good that they've covered those topics, because those topics were in question. I ask again: what other questions should he have asked?

Leadbetter never said that the PS4 is unbalanced; actually, he has said countless times that the PS4 is more balanced than the Xbone.

Can you show me some examples of this? Because this particular quote suggests the complete opposite. Leadbetter has been doing everything he can to downplay the differences between the two consoles and toe the official Microsoft line.

Leadbetter | Digital Foundry said:
In contrast, despite its undoubted advantages - especially in terms of raw power, PlayStation 4 looks a little unbalanced by comparison.


The funniest one, however, is how a 10% CPU upclock is "significant" whilst 40-50% more GPU performance is merely "very evenly matched".

Tbh, he's sort of single-handedly flushed DF's reputation down the toilet.
 

AlphaDump

Gold Member
If I were AMD, I wouldn't take kindly to the claims of "diminishing returns" at 12 CUs when the HD 7970 has 32 CUs.

Eh, it was more the cost-benefit between the upclock and utilizing 2 additional CUs (no idea if that is true or not).

Regardless, AMD will be laughing all the way to the bank.
 
If these are the kinds of differences you guys are talking about seeing this gen, you're in for a long ride. Keep those numbers handy; you might need them to remind yourselves of what you think you should be seeing at that rate.

Honestly, I chuckle when I read these "who cares" type of posts, with those "attacks" mixed in the middle.
It's like we're completely ignoring the popularity of DF comparisons this past generation, particularly on GAF, when, apart from some really awful ports, the vast majority of the games ended up being kinda the same thing.
 

c0de

Member
1080p is supposed to be a NEXT GEN benchmark... if it can't be achieved, it's a failure imo

"Dynamic resolutions" doesnt cut it imo...

Yes, ofc frame rate will be more important, but it shouldn't be an issue to start with :)

Opinions, opinions, how do they work... ;) I don't care for 1080p, I want more details, more open worlds, stable framerate. I can live with 720p and decent AA.
 

Rolf NB

Member
I'm bothered by the understandable instinct to attack the messenger rather than the message.

Leadbetter may be biased, and MS is obviously doing some PR spin, but that doesn't really matter. What matters are the facts as they were presented.


  1. Were they not giving us what we asked for? ("technical fellow" explanation of AP posts)
  2. Were they not accurate?
  3. Were they smart to do it, or should they have stuck with "just wait for the games"?
Please note that I agree that PS4 is obviously more powerful. I just think the "shill" accusations are kind of boring.
It's the third quote. This one:
Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, <...> And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did
This is bona fide BS.

I don't believe for one second that a 16.7% GPU throughput bump is less of a real-world performance boost than a 6.7% CPU clock bump. Remember, these are games consoles. What do you do on a games console when your performance isn't where you want it? Reduce resolution. Hundreds of games did this in the current gen. And GPU performance is the only thing that alleviates this. Again, I don't believe for one second that the benefits of a 16.7% GPU bump would be so negligible that it isn't worth doing, while a 6.7% CPU bump somehow is worth doing.
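For scale, here's a back-of-envelope sketch of the two GPU options (my arithmetic, not Microsoft's; it assumes the standard GCN figures of 64 lanes per CU and 2 FLOPs per lane per clock, plus the widely reported 800/853MHz clocks):

```cpp
// Rough ALU throughput for "enable 2 redundant CUs" vs. "6.6% upclock".
#include <cstdio>

int main()
{
    const double flopsPerCuPerClock = 64.0 * 2.0;                  // GCN lanes x FMA
    const double gflops14 = 14 * flopsPerCuPerClock * 800e6 / 1e9; // ~1434 GFLOPS
    const double gflops12 = 12 * flopsPerCuPerClock * 853e6 / 1e9; // ~1310 GFLOPS
    std::printf("14 CUs @ 800 MHz: %.0f GFLOPS (+16.7%% ALU only)\n", gflops14);
    std::printf("12 CUs @ 853 MHz: %.0f GFLOPS (+6.6%% across the GPU)\n", gflops12);
    return 0;
}
```

On paper the extra CUs are the bigger ALU win; the upclock's selling point is that it also speeds up everything outside the CUs (ROPs, rasterizer, caches).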

Re the "shill" thing, people who say BS face headwind, mockery and disrespect. That's the perk of a societal species. Nothing extraordinary.

Re the quote: the most likely truth behind all of this is that they could not, at this stage, alter the silicon itself. The design was nailed down and is ramping up production; it would set them back months to add/remove functional blocks, let alone all the money already spent on no-longer-useful silicon and the additional money to dump into that process.

Another thing they couldn't do was enable the 2 CUs built in for redundancy. They need them to get yields into a comfortable range. That's their purpose, and if they just opened them all up, they'd lose that, making their yields suck and driving the cost per functional chip through the roof. Just like Sony never enabled the eighth SPE in the PS3. They needed the redundancy. Microsoft needs the redundancy.

They also couldn't up the GPU clocks, most likely because the GPU already draws the majority of the overall system power. Remember, upping the clock on a chip usually also requires increased voltage, and it's not unusual to see power requirements scale quadratically with clock; i.e. a 10% clock bump can easily mean 20% more power draw.
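The usual first-order model bears that out: dynamic power scales with voltage squared times frequency, so if a 10% clock bump needs, say, 5% more voltage (an assumed figure, purely for illustration):

$$P \propto V^2 f \quad\Rightarrow\quad \frac{P'}{P} = 1.05^2 \times 1.10 \approx 1.21$$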

What they could do, however, was up the clock on the CPU block, because that's not a hot area of the chip, and it's a relatively small one. There was headroom there that they couldn't find anywhere else. And so they did it, just to have something to tout in public as a step forward.
 

nillapuddin

Member
I couldn't give a shit about the actual res the game is running at; we've been getting (significantly) upscaled games for years now on consoles. Everyone remembers (Halo 3 -> 1152x640), and then finally just last year (Halo 4 -> 1280x720).

If every game doesn't make the direct leap to 1080p, I'm fine with that; if games wanna run at sub-1080p and bring 60 fps, then go for it.

Nobody is looking at Ryse on the couch from 8 feet away and losing their minds because of the upscaling. No one besides forum sweepers will ever even know.

edit: the HUD being rendered independently, in my mind, shows a lot of foresight in designing for games: no matter what resolution the game runs at, every HUD can be as crisp as possible
 
Can you show me some examples of this? Because this particular quote suggests the complete opposite. Leadbetter has been doing everything he can to downplay the differences between the two consoles and toe the official Microsoft line.




The funniest one, however, is how a 10% CPU upclock is "significant" whilst 40-50% more GPU performance is merely "very evenly matched".

Tbh, he's sort of single-handedly flushed DF's reputation down the toilet.

Wow that Leadbetter quote is just... sigh.
 

Krakn3Dfx

Member
MS must genuinely be freaking out about the negative PR coming out of their '20% but 50% less' game console, considering how much they're trying to get out ahead of this, but the numbers don't lie, and the gap is pretty substantial.

When you have multiple devs talking about the spec differences and the ease of developing on one platform versus the other (the same thing happened with the PS3; I remember a lot of devs openly coming out and saying it was just a more difficult piece of hardware to make games on), at some point people have to admit that Microsoft is racing to play catch-up in a big way with development tools, seeing that the hardware itself is a perceived negative for developers compared to the competition.

Also, take a drink every time the guy refers to 'balance' in the architecture. I can only assume a small team of people at MS spent at least a few hours brainstorming a term they could use to make the situation more vague and questionable.
 

Cidd

Member
As always, better to reduce res for a stable framerate. This hasn't changed.

Well, stick to current gen then; I'm tired of blurry, muddy textures all over my 55" OLED screen. 1080p is supposed to be the standard, for crying out loud; we're already moving into 4K territory in a few years.
 
60fps in MP in KZ:SF must be proven, because for now it's 30-40 most of the time. And the latest footage of singleplayer had drops to the 20s in some heavier scenes.

LIST

I am sorry, but res is more important than half those things on that list.
It's easy to do a whole bunch more effects if your game is rendering 44% fewer pixels.
 
By the posts of Sebbbi on beyond3D ROP is limiting factor only if shaders are written in simple way.
So the 17% compute improvement nets less performance than a 6% upclock... how exactly...?

Isn't what they're saying in the article essentially that the games they were profiling were fillrate-limited? That seemed to be what several posters here have taken from it. Is that an incorrect conclusion?
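For what it's worth, pixel fillrate is ROP count times clock, and the XB1's widely reported 16 ROPs don't increase when you enable extra CUs, so on paper only the upclock moves that particular needle:

$$16 \times 853\,\text{MHz} \approx 13.6\ \text{Gpix/s} \quad \text{vs.} \quad 16 \times 800\,\text{MHz} = 12.8\ \text{Gpix/s}$$

Two extra CUs would add ALU throughput but leave that fillrate ceiling untouched.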
In every comparison between both platforms?
I've yet to see Leadbetter refer to the PS4 as more balanced than the XB1 - he'd be directly contradicting the other quote.
 

Vizzeh

Banned
Opinions, opinions, how do they work... ;) I don't care for 1080p, I want more details, more open worlds, stable framerate. I can live with 720p and decent AA.

Yeah, very true mate, horses for courses as they say... I believe the same thing: I can live with 720p and AA, but would much rather have 1080p ofc. I was more angling towards the hardware existing in the first place to provide it :) - I know they "CAN" scale back on the things we want to see on screen; I'd rather they start with 1080p and then tweak the tricks to make the rest happen. I suspect the amount of RAM should allow for plenty of things on screen at a time; depending on your console of choice, it could have higher or lower quality textures/particle effects/stable frame rate (a bit of the GPU coming into it there) :p
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Albert, why is your engineer talking out of both sides of his mouth?

"Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU," Goosen reveals. "Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console."

Then later...

"We've done things on the GPU side as well with our hardware overlays to ensure more consistent frame-rates," Goosen adds. "We have two independent layers we can give to the titles where one can be 3D content, one can be the HUD. We have a higher quality scaler than we had on Xbox 360. What this does is that we actually allow you to change the scaler parameters on a frame-by-frame basis."

"I talked about CPU glitches causing frame glitches... GPU workloads tend to be more coherent frame to frame. There doesn't tend to be big spikes like you get on the CPU and so you can adapt to that," Goosen explains.

Glitches == frame drops I assume.

Dynamic resolution is there to lower the load on the GPU; the only reason you do it is to maintain the frame rate. You guys are claiming the CPU was offloaded and clocked up to maintain frame rates, then talking about dropping the res to make sure your GPU is not over-burdened, which implies a lack of horsepower (too few CUs?) or a lack of memory bandwidth (DDR3 and eSRAM not utilized well). Ryse at 900p does not mean the CPU is weak; it means the GPU is.
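For what it's worth, the frame-by-frame scaler Goosen describes boils down to something like this hypothetical sketch (all names and thresholds are made up for illustration; the HUD overlay staying at native res is the point of the second layer):

```cpp
// Per-frame dynamic resolution: rescale the 3D layer against measured GPU
// frame time; the HUD overlay stays at native resolution.
#include <algorithm>
#include <cmath>

struct DynamicResolution {
    float scale = 1.0f;                        // 1.0 == native 1920x1080
    static constexpr float kMinScale = 0.75f;  // floor, e.g. 1440x810
    static constexpr float kTargetMs = 33.3f;  // 30 fps GPU budget

    void Update(float gpuFrameMs) {
        // GPU cost is roughly proportional to pixel count (scale^2),
        // so move the scale toward the budget along a square root.
        scale *= std::sqrt(kTargetMs / gpuFrameMs);
        scale = std::clamp(scale, kMinScale, 1.0f);
    }
    int RenderWidth()  const { return static_cast<int>(1920 * scale); }
    int RenderHeight() const { return static_cast<int>(1080 * scale); }
};
```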

Seems you have multiple bottlenecks at work by your own admission. Is this what balanced means?
 

c0de

Member
Well, stick to current gen then; I'm tired of blurry, muddy textures all over my 55" OLED screen. 1080p is supposed to be the standard, for crying out loud; we're already moving into 4K territory in a few years.

Well, let's see who is moving to 4K in a few years. Not everybody builds a new house because of a new TV.
And I can understand that you are tired of muddy textures, but please buy a PC then if you want games running at 1080p *and* decent AA, because 1080p still requires AA.
 