The Foxconn leaks had too many little details right. Even the pair of neon Joy-Con. At 16nm, the clock speeds aren't outside the realm of possibility.

Then how does that explain Eurogamer's leaked speeds being "final for launch"?
Yeah, it may be right, but we have no idea if the tested clock speeds are what developers get to use, or even whether they apply to all of the cores. The leaker speculated 16nm based on the speeds, not the other way around. We won't know the retail speeds or the process node for sure just from these pictures.
I've leaned toward the Foxconn clock speeds being true for weeks now.

I asked before, but you seem so sure that something is now confirmed, so I will ask again: what exactly are these pictures confirming without a doubt?
The GPU upclock could have been decided after the DF leak. We're talking about 150MHz more; it's absolutely possible if temperatures aren't an issue. Pascal was just speculation from the Foxconn leaker, so it could be 921MHz but not Pascal or 16nm (although it would make more sense to have a clock that high on 16nm, but then again, it's a custom chip: it could be 16nm and 921MHz but not Pascal, or it could be all three).
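For anyone who wants to sanity-check what those clocks actually mean, here's a quick back-of-the-envelope sketch, assuming a TX1-like GPU with 256 CUDA cores (the core count is an assumption on my part; nothing in the photos confirms it):

```python
# Rough FP32 throughput at the leaked GPU clocks, assuming a TX1-like
# Maxwell GPU with 256 CUDA cores (a fused multiply-add counts as 2 ops).
CORES = 256        # assumption: same core count as the stock Tegra X1
OPS_PER_CYCLE = 2  # FP32 FMA = 2 FLOPs per core per cycle

clocks_mhz = {
    "Eurogamer portable": 307.2,
    "Eurogamer docked": 768.0,
    "Foxconn tested": 921.0,
}

for label, mhz in clocks_mhz.items():
    gflops = CORES * OPS_PER_CYCLE * mhz / 1000
    print(f"{label}: {gflops:.1f} GFLOPS FP32")
# Eurogamer portable: 157.3 GFLOPS FP32
# Eurogamer docked: 393.2 GFLOPS FP32
# Foxconn tested: 471.6 GFLOPS FP32
```

So under those assumptions, the Foxconn clock is about a 20% bump over the Eurogamer docked figure, not a different class of machine.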
Two more photos:
I was 100% joking, if that was unclear.
Wouldn't the fact that these chips were fabbed in July indicate the Eurogamer clocks might be more in line with what we end up with? (At least according to a post in the other thread; I have no idea if the above is accurate.)
At any rate, it looks like they made really efficient use of the space available in their spec. Bodes well for the future of the Nviendo partnership.
How do you know they were fabbed in July?
Don't forget that the clocks leaked by Eurogamer were communicated much later than July. I don't see how the final chips could have been fabricated in July with Nintendo communicating much lower clocks in the fall, only to revise them a month or two later. It doesn't make much sense.
Or Eurogamer's source was wrong/fake. Which seems unlikely because they have a great track record.
Boy, this is one sexy piece of hw.
Yeah, I'll agree with that. The only thing which might make sense is Nintendo opting to increase the clock speeds after launch, though I don't know why.
Or Eurogamer sat on that info for months; the fact that they received the info (for example) in late September/early October doesn't mean that the kits with those clocks were shipped just before that. They could have been around for months and months, and even be largely based on the stock TX1 with some customizations (which is actually what DF said in their summer article about the NX using Tegra, IIRC). Not to say that Eurogamer's clocks are certainly wrong (read my previous post on the matter), but I don't think this tells us anything new, really.
In any case, it's obvious that we can't just dismiss the Foxconn leak either. The guy was spot on about everything; it doesn't make sense that he made up all the shit about the clocks.
The 32-bit RAM units are pretty much confirmed. In the other thread we have read off the letters and numbers on the RAM units, and they exactly match this piece of hardware, which Thraktor said was 32-bit per module: http://www.samsung.com/semiconducto...e-dram/low-power-ddr4/K4F6E304HB-MGCH?ia=3107

So July 18th to the 25th for this chip.
First thing I can see that makes me really puzzled is using 2 RAM chips; that would normally suggest a 128-bit memory bus, but Thraktor in the other thread thinks he might have figured out the model number, and those are 32-bit chips for a 64-bit bus. You'd think they'd just use one 64-bit chip for this.
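If the Thraktor read on the part number is right, the bandwidth math is simple. A minimal sketch, assuming an LPDDR4-3200 data rate for that Samsung part (the speed grade is my assumption, not something the photos show):

```python
# Peak bandwidth for the rumored setup: two 32-bit LPDDR4 chips on a
# combined 64-bit bus. The 3200 MT/s data rate is an assumption based
# on the Samsung K4F6E304HB being an LPDDR4-3200-class part.
chips = 2
bits_per_chip = 32
bus_bits = chips * bits_per_chip  # 64-bit bus, not 128-bit
mega_transfers = 3200             # MT/s, assumed LPDDR4-3200

bandwidth_gb_s = (bus_bits / 8) * mega_transfers / 1000
print(f"{bus_bits}-bit bus at {mega_transfers} MT/s -> {bandwidth_gb_s} GB/s")
# 64-bit bus at 3200 MT/s -> 25.6 GB/s
```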
The SoC is a socket-based chip? Its package area is much bigger than the Tegra X1's, but the die seems to be about the same size. Of course, 20nm and 16nm are pretty much the same density, so this doesn't really help us.
I would say that this isn't Pascal, though it could still be 16nm, meaning it's about the same thing. The X1 isn't a standard Maxwell chip and was the basis for Pascal anyway.
The Foxconn leak was right; that much is definitely confirmed here. I think Western devs could have asked for more performance from the CPU, which would push Nintendo to test higher clocks. Until we understand why they would test the CPU at such a high clock, we should treat neither the Eurogamer nor the Foxconn clocks as definitive; it's possible that they went with a clock between them as well. Though I do think the Foxconn clocks are more likely.
Except it really doesn't. The last line is just the dimensions of the chip, and on the line before it, that second digit doesn't look like a 4, and we can't tell if the last two are 80, 30, or 33.
What do you guys make of this?
[image: alleged spec sheet]

If we assume those are two different power profiles, that's one hell of a CPU upclock.
As pointed out in the other thread, the gap between the GFLOPS figures doesn't match the gap between the GPU clocks.
Isn't it possible that the chip can deactivate/pause shader cores in low-power mode?

No, the problem is that the gap is too small. That would just make the gap larger.
If the clocks and GFLOPS counts are true, we are looking at roughly Xbox One power in full-speed mode, which would be a lot more powerful than we expected.
I wouldn't get my hopes up though.
And like z0m1ie just pointed out, it lists 512 cores, which there is no chance of. Looks fake to me.
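The sanity check behind both objections is simple arithmetic: for a fixed core count, FP32 GFLOPS = 2 × cores × clock, so the GFLOPS ratio between two power modes has to match the clock ratio. A rough sketch (the spec-sheet figures below are placeholders I made up, since the image itself is gone):

```python
# Internal-consistency check for an alleged spec sheet. With a fixed
# shader-core count, FP32 GFLOPS = 2 * cores * clock, so both power
# modes must imply the same core count. The figures below are
# placeholders, not the actual numbers from the (now missing) image.
def implied_cores(gflops, clock_mhz, ops_per_cycle=2):
    """Core count implied by a claimed GFLOPS figure at a given clock."""
    return gflops * 1000 / (ops_per_cycle * clock_mhz)

full_clock, full_gflops = 921.0, 943.0  # hypothetical full-speed mode
low_clock, low_gflops = 307.2, 393.0    # hypothetical low-power mode

print(implied_cores(full_gflops, full_clock))  # ~512 -> double a TX1
print(implied_cores(low_gflops, low_clock))    # ~640 -> contradicts above

# A consistent sheet would give matching ratios; here the GFLOPS gap
# (~2.4x) is smaller than the clock gap (3x), which is the complaint.
print(full_gflops / low_gflops, full_clock / low_clock)
```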
Eurogamer said: Documentation supplied to developers along with the table above ends with this stark message: "The information in this table is the final specification for the combinations of performance configurations and performance modes that applications will be able to use at launch."
It wasn't the opinion of a developer; it was written in official documentation and reported on in December 2016.
That's what makes the Foxconn fanfic performance bump very, very unlikely to me.
Well "at launch" has always struck me as a odd thing to mention in that line. Could suggest a consideration of upping performance later.
No matter what anyone thinks about the likelihood of certain clocks in retail hardware, everyone by now surely has to accept that the Foxconn info is solid. So Nintendo certainly wanted to know that the SoC could run at a combined 921MHz/1785MHz for whatever reason. A bump in performance is just as logical a theory as any other we've heard, or more so, so I think it's silly to call it fan fiction, personally.
Yeah, I understand why it happened; still silly IMO. You shouldn't even post in a thread without reading it first, let alone rename a thread based on nothing more than an assumption.
Higher-res pics just came up; the model tag is (?)DNX02-A2.
"If this Nvidia chip is legitimate, then its printed code is interesting, because it contains a code label of "UDNX02-A2." We know the Switch is set to contain a Tegra chip, and we had been led to believe that this would be much like the X1 found in Nvidia's Shield devices. But this "A2" designation, unlike the "A1" printed on existing X1 chips, could mean we're getting a significant change to the design.
That may very well be a shrink to a 16nm process, or we could actually be getting a surprise bump from the older Maxwell architecture to the newer Pascal. The latter seems rather unlikely, however, based on known Switch specifications that developers are designing to." https://arstechnica.com/gaming/2017/02/is-this-what-the-nintendo-switchs-insides-look-like/
On ArsTechnica they were reading a bit more into the model tag and came to a similar conclusion as z0m3Ie
And their theory falls apart when you see that the Tegra in the new Shield also has an A2 code:
There's a pretty simple explanation for that pic mentioning 512 cores and ridiculously high clocks. It's horseshit.
Luck has it, I don't rely on it. I believe it is a derivative of the X1; it is custom, and I'm wondering if the customization here is 16nm. I mean, if it isn't, the entire design is sort of ridiculous IMO, because you could make it passively cooled with Eurogamer's clocks on 16nm.
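To make the passive-cooling intuition concrete: dynamic power scales roughly with C·V²·f, and a 16nm FinFET node can hit the same frequency at a lower voltage. A toy sketch with entirely made-up numbers, just to show the shape of the argument (none of these are actual Tegra figures):

```python
# Toy model of dynamic power: P ~ C * V^2 * f. The capacitance and
# voltage values below are made up purely to illustrate the scaling
# argument for a 20nm -> 16nm shrink; they are not real Tegra numbers.
def dyn_power(cap, volts, freq_mhz):
    return cap * volts**2 * freq_mhz  # arbitrary units

p_20nm = dyn_power(1.00, 1.00, 768)  # hypothetical 20nm at the docked clock
p_16nm = dyn_power(0.85, 0.85, 768)  # hypothetical 16nm: lower C and V

print(p_16nm / p_20nm)  # ~0.61, i.e. roughly 40% less power at the same clock
```

Cut power by anything like that at the Eurogamer clocks and a fanless design stops looking crazy, which is the point being made above.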
But this "A2" designation, unlike the "A1" printed on existing X1 chips, could mean we're getting a significant change to the design.
I was talking about ArsTechnica and this statement:
As for 16nm, I agree, it does seem like a big missed opportunity, but it could also be the case that the Pascal Tegra doesn't really work outside of Parker for whatever reason. Or it might be too expensive.
Has a significant patch been confirmed through any reliable sources or just rumoured?
So is the consensus 20nm TSMC?
The chip in the new Shield and the Switch are not the same chip. The Switch's SoC is gaming-focused and the Shield's is multimedia-focused.
If the new Shield shared the same chip, Nvidia would have talked up the improved graphical performance of the new Shield, which they did not.