
EuroGamer: More details on the BALANCE of XB1

Elios83

Member
The good thing is that they admitted the eSRAM bandwidth isn't really 204GB/s, but more like 140GB/s, and even that was measured with a specific benchmark programmed to use a dead cycle every 8 clock cycles - something which can't be reproduced in real game situations.
The rest is just defending their choices and claiming they are balanced. But that has nothing to do with power.
 

c0de

Member
Seriously, the next person to use the term 'drivers' in a console thread may force me to gouge my eyes out. These aren't run-of-the-mill PCs. 'Drivers' don't exist; there is no abstraction between the hardware and the OS.

[image: resolution chart]
 
Until we can't distinguish looking at a screen from looking outside through a window, there's still room for improvement. Even people who can't tell 720p from 1080p 10 ft away would see the difference that ultra high resolutions will eventually bring.
Perhaps, but I take a 1080p Panny plasma over a 4K LCD every time, especially for gaming where motion resolution is equally as important. 4K LCDs don't even provide the motion resolution of a 1080p plasma, at least with current tech.
 

Respawn

Banned
theoretical peaks are just that

It is what you would get if there was never a miss of any kind; if there is a miss, there is a penalty. And there are always misses, which is why he states it is more like 140-150GB/s in real gaming terms, because there is a miss roughly every 8th cycle.

I still call BS on that number. Maybe in in-house tech demos using extremely controlled environments there will be a miss only once every 8th cycle, but in a real-world gaming environment I think that number will be higher. The more misses, and the higher the penalty, the lower your effective bandwidth becomes. The GDDR5 number is 178GB/s peak, but that goes down pretty quickly the more misses you have, which is why hUMA was important: it allows the GPU and the CPU to "see" the thread schedulers and queued jobs and know which one to pull next. So the number of misses is lowered significantly, and it also takes some load off the CPU. The fact that the GPU sits side by side with the CPU also means GDDR5's usual high miss penalty is much lower than you would find on a standard PC where everything is separate.

If I had to take a wild guess, I would say that on a normal PC with 178GB/s of memory bandwidth from GPU to GDDR5, your real-world bandwidth would be more like 70-90GB/s, maybe lower, but on the PS4 it is likely more like 110-125GB/s, due to the lower miss penalty and the drop in actual misses. Of course, developers can write their own benchmarks if none are provided, to see what the actual bandwidth performance is.
Good post, and understood. I figured as much - their numbers weren't actual performance.
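To put rough numbers on the peak-versus-real-world point in the quoted post, here's a back-of-the-envelope sketch in Python. The 853MHz eSRAM clock and 128-bytes-per-cycle interface width are the commonly reported figures rather than anything stated in this thread, and the scheduling model is purely illustrative, so treat the output as a sanity check on the quoted numbers, not as how the hardware actually behaves.

```python
# Illustrative eSRAM bandwidth arithmetic (assumptions, not official specs):
# 853 MHz clock, 128 bytes per direction per cycle, and a write "hole" every
# 8th cycle, which is roughly how the ~204 GB/s headline figure is described.

CLOCK_HZ = 853e6          # post-upclock eSRAM clock (assumed)
BYTES_PER_CYCLE = 128     # per direction, per the commonly reported width (assumed)

one_way = CLOCK_HZ * BYTES_PER_CYCLE / 1e9            # ~109 GB/s read-only or write-only
peak_combined = one_way * (1 + 7 / 8)                 # reads every cycle, writes 7 of 8
print(f"one-way peak:   {one_way:.0f} GB/s")          # ~109
print(f"combined peak:  {peak_combined:.0f} GB/s")    # ~205, i.e. the ~204 GB/s headline

# The 140-150 GB/s "real game code" figure quoted in the thread is roughly
# 70% of that combined peak:
for measured in (140, 150):
    print(f"{measured} GB/s is {measured / peak_combined:.0%} of the combined peak")
```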
 

TrueGrime

Member
So why are you in here then? Seems like a lot of effort for someone who thinks he's above such things.

There's no effort in reading about the opinions of others and giving opinions yourself. My point is that, while I take the opinions of others into consideration, it isn't going to influence what console I'll purchase, especially if it is a console or specs argument. It'll ultimately come down to the games I like and what I feel comfortable playing those games on. That's not being above anything.
 

Tycho_b

Member
the 2 extra CUs were less effective at giving them more performance than the clock increase.

It is clear, simple, and proven over the years: the best ways to increase performance are the easiest ones - clock speed, shading units, etc. Things that will always work without the programmer's intervention.

The rest is smoke and mirrors and minor enhancements.
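On paper the two options the article describes aren't equivalent, which is worth keeping in mind when reading the "upclock beat the extra CUs" claim. A quick sketch, assuming the standard GCN figures (64 ALUs per CU, 2 FLOPs per ALU per clock) and the reported 800MHz to 853MHz upclock; it only captures theoretical ALU throughput, not the fixed-function stages that also speed up with the clock, which is presumably where Microsoft's argument lives.

```python
# Theoretical ALU throughput: 12 CUs with the upclock vs 14 CUs at the old clock.
# Assumes standard GCN numbers (64 ALUs per CU, 2 FLOPs per ALU per clock);
# ignores ROPs, bandwidth and everything else a clock bump also helps.

def tflops(cus, clock_mhz, alus_per_cu=64, flops_per_alu=2):
    return cus * alus_per_cu * flops_per_alu * clock_mhz * 1e6 / 1e12

shipped  = tflops(12, 853)   # ~1.31 TFLOPS, the configuration they went with
extra_cu = tflops(14, 800)   # ~1.43 TFLOPS, the 14-CU experiment at the old clock

print(f"12 CUs @ 853 MHz: {shipped:.2f} TFLOPS")
print(f"14 CUs @ 800 MHz: {extra_cu:.2f} TFLOPS ({extra_cu / shipped - 1:.0%} more on paper)")
```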
 

Chobel

Member
Those are definitely some of the most interesting parts to me. The Kinect skeletal system and their insistence on the importance of latency for GPGPU for starters, but the real eye raiser for me was their experimenting with 2 extra CUs and determining that the 2 extra CUs were less effective at giving them more performance than the clock increase.

Again, what's stopping MS from using those 2 CUs? It's not like upclocking the GPU prevents them from using extra CUs.

I mean, the Xbox One has 2 extra CUs just lying around - use them.
 

TrueGrime

Member
Well thanks for that input. I'll be doing the same, but I'll be buying those games on the PS4 because I want to enjoy my favorite games with the best possible performance.

And we won't know the best possible performance until the consoles are out and the multiplats are on the market.
 

IT Slave

Banned
This confirms that every Xbox One actually has 14 CUs because, as we know per Albert Penello, there is no hardware difference between a consumer Xbox One and a developer Xbox One.

In the future we will see Microsoft's own version of the "unlocking the last SPU" debate with this. ;-)

Won't happen. Those CUs are there for redundancy, to guarantee better yields. Not every Xbox One is going to have 14 good CUs.
 

szaromir

Banned
Again, what's stopping MS from using those 2 CUs? It's not like upclocking the GPU prevents them from using extra CUs.
A lower percentage of manufactured chips would be functional, particularly if they both activated all the CUs and increased the clock rate.
 

vpance

Member
Perhaps, but I take a 1080p Panny plasma over a 4K LCD every time, especially for gaming where motion resolution is equally as important. 4K LCDs don't even provide the motion resolution of a 1080p plasma, at least with current tech.

Keeping all else equal, higher resolution is better, which is what that chart is saying. Agreed on plasma vs LCD though, FWIW.
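Since the resolution-versus-viewing-distance argument keeps coming up, here's a small sketch of the usual acuity arithmetic. It assumes roughly 1 arcminute of visual acuity for normal vision, a 16:9 panel and a hypothetical 50" screen at 10 ft; all of those are illustrative simplifications (acuity varies, and as noted above motion resolution and panel type matter too).

```python
# How large one pixel appears (in arcminutes) for a given screen size,
# resolution and viewing distance. The ~1 arcminute acuity threshold and the
# 50"/10 ft scenario are illustrative assumptions, not anything from the article.
import math

def arcmin_per_pixel(diagonal_in, horizontal_px, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width from its diagonal
    pixel_in = width_in / horizontal_px
    return math.degrees(math.atan(pixel_in / distance_in)) * 60

for px, label in ((1280, "720p"), (1920, "1080p"), (3840, "4K")):
    a = arcmin_per_pixel(50, px, distance_in=120)      # 50" screen, 10 ft away
    note = "resolvable" if a > 1 else "below ~1 arcmin acuity"
    print(f'50" {label} at 10 ft: {a:.2f} arcmin per pixel ({note})')
```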
 

Dragon

Banned
There's no effort in reading about the opinions of others and giving opinions yourself. My point is that, while I take the opinions of others into consideration, it isn't going to influence what console I'll purchase, especially if it is a console or specs argument. It'll ultimately come down to the games I like and what I feel comfortable playing those games on. That's not being above anything.

There's effort in going into various threads to muddy the water and then falling back on "I just want to play games and have fun." It's like making an offensive joke and then saying you were just joking.
 

teiresias

Member
Again, what's stopping MS from using those 2 CUs? It's not like upclocking the GPU prevents them from using extra CUs.

I mean, the Xbox One has 2 extra CUs just lying around - use them.

Because, going by the article, they're there for yields, which means not every Xbox One will have a chip with 2 extra working CUs.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Again, what's stopping MS from using those 2 CUs? It's not like upclocking the GPU prevents them from using extra CUs.

I mean, the Xbox One has 2 extra CUs just lying around - use them.

Because, due to yield choices, those 2 CUs are not available or functional on every Xbox One, whereas 12 CUs are.

They have to go with the number they are guaranteed to be able to use. If they are binning for 12 operational CUs, 12 CUs is what they will have from now until the next gen, period.

They could choose to bin for 14 CUs, but that would drop the yield significantly, increasing cost and reducing supply.

They've made the choice for 12 CUs; that's what everyone is getting.
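To illustrate why binning for 12 of 14 rather than all 14 makes such a difference to yields, here's a toy model. It assumes, purely for illustration, that each CU independently survives manufacturing with some fixed probability; real defect behaviour depends on die area and defects cluster, so read it as the shape of the effect rather than real yield data.

```python
# Toy yield model: probability a chip is sellable if we require `needed`
# of its 14 CUs to be functional. Per-CU survival probabilities are made up.
from math import comb

def chip_yield(p_cu_good, total_cus=14, needed=12):
    """P(at least `needed` of `total_cus` CUs work), assuming independence."""
    return sum(comb(total_cus, k) * p_cu_good**k * (1 - p_cu_good)**(total_cus - k)
               for k in range(needed, total_cus + 1))

for p in (0.90, 0.95, 0.99):
    print(f"per-CU survival {p:.0%}: "
          f"need all 14 -> {chip_yield(p, needed=14):.1%}, "
          f"need 12 of 14 -> {chip_yield(p, needed=12):.1%}")
```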
 

stryke

Member
Again, what's stopping MS from using those 2 CUs? It's not like upclocking the GPU prevents them from using extra CUs.

I mean, the Xbox One has 2 extra CUs just lying around - use them.

Built-in redundancy means that when those chips leave the factory, the minimum level of testing is that 12 of those 14 CUs work. They don't test beyond that.

It's the same with the Cell: the deactivated 8th SPE sitting in my PS3 right now could be in perfect working condition. That doesn't mean the one in my next-door neighbour's PS3 works fine.
 

Oemenia

Banned
He avoids the GPU questions well, but can we really expect a linear 40% advantage in GPU performance? The advantage in PC benchmarks is seen to be about 25%, but will it be more drastic in a console environment?
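For context, the 40% figure people keep citing is just the ratio of theoretical ALU throughput between the two GPUs. A quick sketch using the commonly reported configurations (18 CUs at 800MHz versus 12 CUs at 853MHz, 64 ALUs per CU, 2 FLOPs per clock); whether that paper gap shows up linearly in shipped games is exactly the open question.

```python
# Where the oft-quoted "~40%" GPU gap comes from: theoretical ALU throughput only.
# Uses the commonly reported CU counts and clocks; says nothing about real benchmarks.

def tflops(cus, clock_mhz, alus_per_cu=64, flops_per_alu=2):
    return cus * alus_per_cu * flops_per_alu * clock_mhz * 1e6 / 1e12

ps4 = tflops(18, 800)   # ~1.84 TFLOPS
xb1 = tflops(12, 853)   # ~1.31 TFLOPS
print(f"PS4 {ps4:.2f} TFLOPS vs XB1 {xb1:.2f} TFLOPS -> {ps4 / xb1 - 1:.0%} on paper")
```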
 

Finalizer

Member
Sure, the PS4 might be a bit easier to program for, but the XB1 is at least as easy to program for as the 360, if not easier, and the 360 was praised for exactly that.

See, I would have agreed with that sentiment until the ESRAM micromanagement issues came to light. For now at least, it seems the 'Bone is in fact more difficult to deal with than the 360. Presumably, MS will work on ways to either fix or at least alleviate this down the line.

Now the message changes again to be about balance. Just. stop. talking.

I guess MS still thinks it's the early 2000s and they can just say whatever, expecting the public to blindly accept it.
 
The games will speak for themselves; it really is that simple. We will find out soon enough when Watchdogs, Battlefield 4 and the other multiplat games come out. They will be compared with great scrutiny. The exclusives like Drive Club and Forza 5 won't matter as much.

Pretty much. I expect PS4 to win all the face-offs day 1, with performance differences at a minimum and extra lighting and other effects as bonuses, and the gap widening as devs get to grips with hUMA.
 
I am not tech savvy enough at all to understand any of this, but "balance" certainly does come off as pretty blatantly buzzwordy... the new "power of the cloud"? (Also, I assume this is what Penello was alluding to? This info?)
 

szaromir

Banned
Now the message changes again to be about balance. Just. stop. talking.
I'd rather MS openly discuss the hardware, even if they're sugarcoating some aspects, than have them stay completely silent about it like Nintendo were about the WiiU.
 

panda-zebra

Member
They could choose to bin for 14 CUs, but that would drop the yield significantly, increasing cost and reducing supply.

They've made the choice for 12 CUs; that's what everyone is getting.

So is it likely that this is the source of the various rumours regarding poor yields - that they considered possibilities other than the upclocks, but yields were too poor to go that way?
 
So you're also investing in a PC?

Oh god. I just knew someone would have to say it.

If you personal message me I'd be happy to give you the list of reasons I have no interest in PC gaming.


And we won't know the best possible performance until the consoles are out and the multiplats are on the market.

While this is true, we've already had developers admit the PS4 version of their game will look better.

There's no effort in reading about the opinions of others and giving opinions yourself. My point is that, while I take the opinions of others into consideration, it isn't going to influence what console I'll purchase, especially if it is a console or specs argument. It'll ultimately come down to the games I like and what I feel comfortable playing those games on. That's not being above anything.

So then why are you even bothering with a thread like this? You've already made up your mind, and now you come in here trying to convince the rest of us, who care about this stuff, that we're just wasting our time?
 

Chobel

Member
A lower percentage of manufactured chips would be functional, particularly if they both activated all the CUs and increased the clock rate.
Because, going by the article, they're there for yields, which means not every Xbox One will have a chip with 2 extra working CUs.
Because, due to yield choices, those 2 CUs are not available or functional on every Xbox One, whereas 12 CUs are.

They have to go with the number they are guaranteed to be able to use. If they are binning for 12 operational CUs, 12 CUs is what they will have from now until the next gen, period.

They could choose to bin for 14 CUs, but that would drop the yield significantly, increasing cost and reducing supply.

They've made the choice for 12 CUs; that's what everyone is getting.
Built-in redundancy means that when those chips leave the factory, the minimum level of testing is that 12 of those 14 CUs work. They don't test beyond that.

It's the same with the Cell: the deactivated 8th SPE sitting in my PS3 right now could be in perfect working condition. That doesn't mean the one in my next-door neighbour's PS3 works fine.

Thank you.
 
Mixed feelings on the article. On the one hand, it's good that Microsoft is putting forward engineers who explain how it works. On the other hand, they are using PR babble as well. And on the mysterious third hand, "Digital Foundry Vs Xbox One" = "We let Microsoft say whatever they want, unchallenged." DF is a joke :/


I'm glad they are talking though. Cerny does the PR babble shit as well, saying "SUPER CHARGED PC ARCHITECTURE!" every time he describes the machine, as if Sony sat him down in front of a screen and shocked him every time he saw a PS4 and didn't say it. But once you cut through that shit, from Cerny and now from these guys, you get a fairly good idea of what's going on.


The main problem that I have is that I find it hard to find either company (or any tech company, really) credible. Bringing out engineers is better than using Major Nelson though. Look at this complete fucking bullshit from the last cycle...

[image: Major Nelson's Xbox 360 vs PS3 comparison chart]


http://majornelson.com/2005/05/20/xbox-360-vs-ps3-part-1-of-4/


Mmmmhmmm.

So, yeah, each company is going to lie to us and Digital Foundry has lost all credibility. Ugh.

I guess we will be able to judge ourselves soon enough.
 

TrueGrime

Member
There's effort in going into various threads to muddy the water and then falling back on "I just want to play games and have fun." It's like making an offensive joke and then saying you were just joking.

Really? So coming into a tech specs thread, happily treading the middle of the road, and saying, "It will be best to wait and see when both consoles are out in the open, as well as judging the multiplats when they're released, to get the best handle on performance, but in the end it will ultimately be about the games and what you like." ...is a bad thing? Maybe I should have just laughed at the 'shift the balance' joke along with the others and called it a day.
 
You mean the DMA controllers right? I'd wanna hope they have some of those :p.

Yea, but you're kinda oversimplifying it a bit, since no GCN card that I know of even comes with 4 DMA controllers. I think even a 7970 comes with just 2, so regardless of what we call them, that's a clear customization or extension of the GPU's hardware architecture.

Only place you'll find four on GCN is in something like a 7990, which is two GPUs in one package. So, obviously Microsoft extended the DMA capabilities of the GPU beyond the base hardware, and then on top of that they equipped 2 of the four with dedicated fixed function hardware.

Microsoft obviously wanted to improve the system's efficiency with regards to how it handled available bandwidth, and that much was clear even from the old VGleaks info we got.

http://www.amd.com/us/Documents/GCN_Architecture_whitepaper.pdf

GCN has dual bi-directional DMA engines, so that two streams of data can simultaneously use both directions of the PCI Express™ 3.0 link and efficiently use the available bandwidth.

It's not exactly useless simply because it already exists in GCN. That'd be almost like saying the customizations Sony made by adding more ACEs than standard GCN are pointless since ACEs already exist in GCN, just not 8 of them like in the PS4. DMA engines already exist in GCN too, just not 4 of them like in the XB1, unless you're looking at one of those dual-GPU monsters such as the 7990. Wait till misterx's blog gets a hold of this. :p

And, for the record, Microsoft renamed a lot of things in GCN. They don't even call their compute units 'Compute Units.' They're called Shader Cores.
 

EvB

Member
So is it likely that this is the source of the various rumours regarding poor yields - that they considered possibilities other than the upclocks, but yields were too poor to go that way?

It could well be the source of this, somebody misunderstanding what had come from the engineering team.

There may have been a point where they were deciding whether an extra CU was worth pushing for but instead they opted for the upclocking because it worked better in the grand scheme of things.

The same thing happened with the PS3; they had to account for the fact that part of the Cell wouldn't survive manufacturing.
 

Dragon

Banned
Really? So coming into a tech specs thread, happily treading the middle of the road, and saying, "It will be best to wait and see when both consoles are out in the open, as well as judging the multiplats when they're released, to get the best handle on performance, but in the end it will ultimately be about the games and what you like." ...is a bad thing? Maybe I should have just laughed at the 'shift the balance' joke along with the others and called it a day.

Coming into a specs thread and saying I don't want to get into a spec debate is asinine to say the least.
 

JaggedSac

Member
Well thanks for that input. I'll be doing the same, but I'll be buying those games on the PS4 because I want to enjoy my favorite games with the best possible performance.

Hold on now, let's not go there, because the PS4 won't be where that is, and neither will the Bone.
 
Now the message changes again to be about balance. Just. stop. talking.

More gamers are informed than ever before. Back when the Xbox 360 launched, most of my friends had never heard of gaming websites, or had only visited them a handful of times. Now just about all of them read everything on the front pages of at least one or two. Once in a blue moon, they even peek at more technical articles.

It is going to be a harder sell than they think this time around. Information is being passed around much more fluidly from the hardcore to the casual, and the "casual" are a bit more "hardcore" in terms of what they know. It isn't as straight a dividing line as it was back then.

In the future, we are going to be talking about how accountants, PR and sometimes even committees messed things up for the Xbox One in its early days.
 
Pretty much. I expect PS4 to win all the face-offs day 1, with performance differences at a minimum and extra lighting and other effects as bonuses, and the gap widening as devs get to grips with hUMA.

Could be. I like the idea that they are very close in terms of design, so that we won't have the same issues we saw with the PS3.

Oh god. I just knew someone would have to say it.

If you personal message me I'd be happy to give you the list of reasons I have no interest in PC gaming.

I don't need reasons, I'm not trying to sell you anything. You should have just said "the best console experience" and left it at that. If the PS4 shows a clear advantage, that will be a pretty big plus for them, since it's also $100 cheaper. I think we are all just curious to see what real differences there will be.
 

Biker19

Banned
It's kind of funny observing Microsoft's constant re-juggling of their PR message hoping something sticks. If at first you don't succeed, try, try again...

"Cloud is going to give you more computational power!" - No, it's not.
"We never targeted the best specifications on our machine." - Fair enough, but...
"We up-clocked the CPU, this makes us better!" - Well, it is a good start but...
"Everything is going to be even, 50% is not going to happen!" - Yeah, 50% is probably not what we'll see but...
"... this shit is balanced. It's been designed to work together. Basically, it's secret sauce." - Okay but... the whole initial push of the PS4 was how there is no bottlenecks and how it's perfectly balanced.

And yet every single step of the way they pick up some stragglers who have been waiting to be wrapped in the warm bosom of Microsoft once again.

Microsoft shouldn't have designed the Xbox One with multimedia/Kinect in mind first, over gaming, at all. It's their fault for doing so.

If the Xbox One had been designed as a gaming machine first, like the PS4, and a multimedia machine second, then they could've easily competed against the PS4 in specs.
 

IT Slave

Banned
Basically confirms how ROP-limited a lot of games are on X1.

If it had a decent number of ROPs, more CUs would have been better than more clock.

This is basically saying 'the upclock was a better choice because one part of the pipeline in our GPU is rather weak and forming a bottleneck - a part we're conveniently not discussing at all in this article'

I think he was talking about the pipeline inefficiencies that are inherent in GPUs. Listen to the Xbox One architecture panel. I think Boyd Multerer mentioned this in the article.

From the same panel, they mentioned that their focus was on keeping the CUs well fed instead of just having more CUs.
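On the ROP point specifically: one reason an upclock can beat extra CUs is that the clock also raises fill rate, while extra CUs do nothing for a ROP bottleneck. A rough sketch, assuming the widely reported ROP counts (16 on XB1, 32 on PS4); these are theoretical fill rates, not measured performance.

```python
# Theoretical pixel fill rate = ROPs * clock. Extra CUs don't move this number;
# a clock bump does. ROP counts below are widely reported figures, not official specs.

def gpixels_per_s(rops, clock_mhz):
    return rops * clock_mhz * 1e6 / 1e9

print(f"XB1 @ 800 MHz: {gpixels_per_s(16, 800):.1f} Gpix/s")
print(f"XB1 @ 853 MHz: {gpixels_per_s(16, 853):.1f} Gpix/s  (what the upclock buys)")
print(f"PS4 @ 800 MHz: {gpixels_per_s(32, 800):.1f} Gpix/s")
```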
 

TrueGrime

Member
Coming into a specs thread and saying I don't want to get into a spec debate is asinine to say the least.

LoL. It's asinine to come into a specs thread, give my own opinion, stand on middle ground when it comes to specs, and not get into a debate? Come on man, really? :)
 
Now the message changes again to be about balance. Just. stop. talking.

LOL. That's what I don't get, though. What's wrong with saying that you've balanced your hardware's design just right? It only seems like a problem because they're not beating or matching Sony on raw performance numbers, which kinda seems silly, don't you think? Cerny also thinks balance is important, and he isn't mocked for saying it.

I mean, it isn't that unreasonable that they really do believe they've got a nice balanced design, is it? Hell, even I thought the design was a nice balance, and I was saying precisely this on here before E3! :p What else do these guys have if not balance? They sure as heck didn't go for raw performance muscle. Balance, if anything, seems like the right word for what they tried to go for with the system. Some disagree on that balance, others think it isn't as bad as people say it is.
 