Nope. Gonzalo was 1.8GHz, Oberon 2GHz, and the final version in Prospero is variable 2.2GHz. They were on the trail the whole time.
He also said it would be 8TF
Nope. Gonzalo was 1.8GHz, Oberon 2GHz, and the final version in Prospero is variable 2.2GHz. They were on the trail the whole time.
True. DF has gone to Sony numerous times to ask about VRS, and Sony refused to answer.
Sometimes you learn more from what people don't say, than they say.
If the PS5 has VRS, and they have been asked to clarify it by DF, and they don't, I can only guess that's because it doesn't have it.
You can't find any actual example of bias in that article can you? Claims without supporting evidence are meaningless. The article you posted as proof is anything but, in fact the article is perfectly reasonable and even-handed.
"In less than 5s I showed random proof" No. You showed a link to a Gaf thread where people, who think like you, were thinking like you. That's not proof of anything.
Show us the quotes, show us the examples, be specific. If DF is so biased you should have no problem.
DF was following the trail of the actual PS5 hardware. Fanboys cannot reconcile this in their minds; they believe they are right even when they were way off, so they just move the goalposts.
Digital Foundry predicted that the new generation would be 5700XT-based; it made sense to say that it would not be 36CUs because that would be "8TF", and nobody imagined that with RDNA2 the clock could be increased to 36CUs@2.23GHz for 10.28TF.
Only a fanboy thinks 100-200MHz means anything.
Linus apologized for misrepresenting what Epic said about the SSD. They said "best in class", which Linus read as "fastest SSD".
Digital Foundry predicted that the new generation would be 5700XT-based; it made sense to say that it would not be 36CUs because that would be "8TF", and nobody imagined that with RDNA2 the clock could be increased to 36CUs@2.23GHz for 10.28TF.
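For anyone who wants to check the napkin math (the standard RDNA FP32 formula, nothing insider): TFLOPS = CUs × 64 ALUs × 2 ops per clock × clock in GHz. At 36 CUs that gives 36 × 64 × 2 × 1.8GHz ≈ 8.3TF for Gonzalo, ≈ 9.2TF at Oberon's 2.0GHz, and ≈ 10.28TF at the final 2.23GHz.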
Why should Sony answer anything to DF?
After Cerny made a presentation saying:
- SSD is not just for loading
- any downclock will be minimal.
- increasing the clock improves everything on the GPU.
DF shit out of his mouth:
- SSD is only for loading.
- PS5 never runs with CPU and GPU simultaneously at full clock.
- clock increase does not improve anything, because the 5700XT above 2GHz does not improve anything.
Would you talk to any journalist who calls you a liar?
And guess what, DF isn't interested in listening to Sony, because it didn't hear anything from Cerny's lecture as I just demonstrated.
If they were interested in listening, we would not have the FUD festival encouraged by DF, and by now DF would have already made a video about the Geometry Engine.
GE >> VRS.
===========
Linus apologized after talking nonsense about the PS5's SSD.
Where is the "dictator" apologizing after saying that the SSD is only for loading? That the PS5 is never at full clock?
Will Leadbetter apologize for implying that Cerny lied, that raising the clock above 2GHz doesn't improve the GPU, when AMD releases the "6900XT" at 2.2~2.3GHz?
The whole article was damage control to downplay the importance of resolution. The article postulates that resolution is a mere means to compare relative performance and not a factor that shows a perceptible difference in real life. Do people not know how to read anymore?
Hang on. Is this a satire account? If so, apologies, I've been taking it seriously all this time.
In some ways it's a grim state of affairs - especially for Microsoft - because as 2014 progressed, resolution as a meaningful, differentiating issue of the gameplay experience between multi-platform titles became less important. The amount of visually compromised 720p/792p titles appearing on Xbox One dwindled as the year progressed, while the 900p/1080p differential turned out to be much less pronounced than the raw maths suggested. Indeed, as the major titles rolled out in Q4, we saw resolution parity in key titles such as GTA 5, FIFA 15, Destiny and Assassin's Creed Unity. On other tentpole games like Far Cry 4 and Call of Duty: Advanced Warfare, PS4 maintained a resolution advantage, but raw pixel count wasn't the most crucial element of image quality (though it played much more of a role in the multiplayer portion of COD).
If you are wondering what reality check video they are talking about, well, watch it for yourself.
The question of whether resolution actually matters is explored nicely in GameSpot's recent Reality Check video, in which Cam Robinson oversees a 'blind taste test' of sorts across three platforms - PS4, Xbox One and PC. Far Cry 4 and COD are the main examples here, illustrating that despite a yawning chasm in resolution between PS4 and Xbox One, it's very hard for the majority of those participating to tell any difference in actual gameplay conditions. What's crucial in this case is that not only are both COD and Far Cry 4's res reductions well-handled on Xbox One, they also have performance profiles equivalent to or even better than their PS4 counterparts - and we're firmly of the belief that frame-rate difficulties have much more of an impact on the overall experience than resolution.
Robinson could have chosen titles where the difference is more pronounced - we would have enjoyed seeing the same test undertaken on the 720p vs 1080p Metal Gear Solid 5: Ground Zeroes, for example - but as an indicator for how close multi-platform titles are becoming, the games are well chosen. Call of Duty on Xbox One operates mostly at 1360x1080, yet looks remarkably close to the full HD PS4 version in motion. Similarly, Far Cry 4 runs at 1440x1080 on Xbox One, while again running at an uncompromised 1080p on the Sony console. The PS4 version is cleaner and more pleasing to our eyes, but not to a revelatory degree.
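To put those figures in perspective (simple pixel arithmetic, not from the article): 1920x1080 is about 2.07 million pixels per frame, 1440x1080 about 1.56 million (75 per cent of full HD), and 1360x1080 about 1.47 million (71 per cent), so the Xbox One versions were pushing roughly a quarter to a third fewer pixels.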
In the meantime, perhaps the biggest takeaway from the survey data is that it's the Wii U owners that are having the most fun from their gaming hardware
However, clearly it's still early days, and right now these machines remain very much uncharted territory - even for those who've been working with prototype hardware for a long time. Microsoft tells developers that the ESRAM is designed for high-bandwidth graphics elements like shadowmaps, lightmaps, depth targets and render targets. But in a world where Killzone: Shadow Fall is utilising 800MB for render targets alone, how difficult will it be for developers to work with just 32MB of fast memory for similar functions? On the flipside, Xbox One's powerful custom audio hardware - dubbed SHAPE (Scalable Hardware Audio Processing Engine) - should do a fantastic job for HD surround, a task that sucks up lots of CPU time on current-gen console. How does PS4 compare there? And just how much impact does the GDDR5 memory - great for graphics - have on CPU tasks compared to Xbox One's lower-latency DDR3?
While the next generation of consoles finally arrives in a matter of months, the launch games will have mostly been developed on incomplete hardware - a state of affairs that was blatantly obvious from titles seen so far. On paper, Sony retains a clear specs advantage, but it was difficult to see that reflected in the quality of the games at E3. Based on what we're hearing about the approach to next-gen development, it could be quite some time before any on-paper advantage translates into an appreciably better experience on-screen.
The Xbox One vs. PlayStation 4 graphics spec comparison is stark to say the least. Both systems utilise AMD's GCN (Graphics Core Next) architecture, but Sony's rendering tech has 50 per cent more raw computational power than the Xbox One equivalent - and that's factoring out other differences between the systems. The question is, what is the impact in actual gameplay conditions?
Behind the scenes, developers have suggested to us that we shouldn't jump to conclusions about the extent of the PlayStation 4's superiority, and that the 50 per cent boost in GPU power emphatically won't result in a likewise boost to in-game performance.
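The 50 per cent figure is just the paper TFLOPS ratio (napkin math, assuming the pre-upclock 800MHz Xbox One GPU figure in circulation when this was written): PS4 at 18 CUs × 64 × 2 × 0.8GHz ≈ 1.84TF versus Xbox One at 12 CUs × 64 × 2 × 0.8GHz ≈ 1.23TF, roughly 1.5x. With the final 853MHz clock the Xbox One lands at about 1.31TF, closer to a 40 per cent gap.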
Wanna know what's funny? Remember when Mark Cerny said raising clocks is better than more CUs?
"The point is the hardware is intentionally not 100 per cent round," Cerny revealed. "It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU."
An interpretation of Cerny's comment - and one that has been presented to us by Microsoft insiders - is that based on the way that AMD graphics tech is being utilised right now in gaming, a law of diminishing returns kicks in.
All of a sudden more CUs are better than higher clocks now. I wonder why?
The results pretty much confirm the theory that more compute cores in the GCN architecture doesn't result in a linear scaling of performance. That's why AMD tends to increase core clock and memory speed on its higher-end cards, because it's clear that the available core count on its own won't do the job.
I don't mind when someone has a bias, but we can't deny that it is there.
According to inside sources at Microsoft, the focus with Xbox One was to extract as much performance as possible from the graphics chip's ALUs. It may well be the case that 12 compute units was chosen as the most balanced set-up to match the Jaguar CPU architecture. Our source says that the make-up of the Xbox One's bespoke audio and "data move engine" tech is derived from profiling the most advanced Xbox 360 games, with their designs implemented in order to address the most common bottlenecks. In contrast, despite its undoubted advantages - especially in terms of raw power - PlayStation 4 looks a little unbalanced by comparison.
NVIDIA's paper-spec boost clocks for RTX GPUs are conservative, while the same can't be said for the AMD RX 5700 XT.
Yup, you have boost and OC; the poster I was countering was suggesting the extra GHz of the 2080 does nothing vs the extra TF of the Ti. I don't look too much at PC performance as I game on console, so thanks for that.
Your data clearly shows both cards effectively reach the same GPU boost clocks anyway, which proves my point that the PS5 will get some benefit from its faster GPU clock to make up ground on the TF differential.
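For reference, the differential being argued about, on pure paper math: XSX at 52 CUs × 64 × 2 × 1.825GHz ≈ 12.15TF versus PS5 at 36 CUs × 64 × 2 × 2.23GHz ≈ 10.28TF, i.e. about 18 per cent more compute on XSX, while PS5 runs its GPU clock about 22 per cent higher.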
Frame buffer render targets benefit the most from higher memory bandwidth.
Where do I start? 14Gbps is the speed of the memory chips, and they are the same. The XSX has a wider bus for the 10GB and a narrower one for the 6GB; go work it out for yourself, there are two bandwidth numbers.
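Worked out, since the post invites it (standard GDDR6 arithmetic using the bus split MS published): the 10GB pool sits on a 320-bit bus, so 14Gbps × 320 ÷ 8 = 560GB/s; the remaining 6GB is spread across a 192-bit slice, so 14Gbps × 192 ÷ 8 = 336GB/s. Same 14Gbps chips, two different effective bandwidths.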
Go read up on abstraction and APIs if you don't understand; Google works.
SonyGAF conspiracy? Again?!?
If it's all about who hits the report button more, whoever has the biggest fan base will get banned less, thus gaining an even bigger amount of support on the site and dominating the report button even more. It's a terrible stance for the site to hold.
The whole article was damage control to downplay the importance of resolution. The article postulates that resolution is a mere means to compare relative performance and not a factor that shows a perceptible difference in real life. Do people not know how to read anymore?
If you are wondering what reality check video they are talking about, well, watch it for yourself.
And the conclusion?
Let's try another article from Richard
Another article from Richard
Wanna know what's funny? Remember when Mark Cerny said raising clocks is better than more CUs?
All of a sudden more CUs are better than higher clocks now. I wonder why?
I don't mind when someone has a bias, but we can't deny that it is there.
On a different note. Y'all need to give it a rest already.
First the PS5 was RDNA 1 or 1.9, then it does not have hardware-accelerated ray tracing, then it is really a 9TF console, and now it does not have VRS or Mesh Shaders. Yes it does; this is a standard RDNA 2 feature set and there is no conceivable reason why PS5 will not have it. It is spelled out in the Road to PS5 video as part of the new feature set of the updated geometry engine.
ASUS ROG RX 5700 XT Strix already has about 10.27 TFLOPS with 448GB/s memory bandwidth
Subtle RDNA1.2 dig at PS5 again?
The ASUS ROG RX 5700 XT Strix at a 2007MHz average clock speed yields about 10.276 TFLOPS.
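Checking that figure: 40 CUs × 64 × 2 × 2.007GHz ≈ 10.28TF, so a factory-overclocked Navi 10 does sit at roughly the PS5's paper number, just with 40 CUs at a lower clock instead of 36 CUs at 2.23GHz.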
They are likely the same as PS5's GPU and thus not useful, or wrong for the narrative (lower clock, same number of units, you do the maths... even if there were more, they would still need to be enough to cover the clock-speed gap), in the super long deep dives they did, where they enumerated lots of other details (they went into more detail than Cerny did on many things in the Road to PS5 video).
MS did NOT reveal the XSX GPU's non-CU areas, such as ROPS unit count.
DF stating the truth in an even-handed manner is not proof of bias. I'd be happy to have that discussion with you if I thought you were genuine or reachable, however I'd have more chance of convincing an anti-vaxxer that Bill Gates is okey-dokey.
This behavior of the DF is disgusting.
Even I didn't know that DF was such an xbox fanboy, I was disgusted.
DF is a cancer of the gaming world.
Don't end it there, continue to the next paragraph.
This is just another example of the bias being in your interpretation; even your description of what the article is about is, in my opinion, wrong. There is nothing in the articles, or the quotes that you pulled from them, that offers proof of a bias by the author.
What is untrue about what he said or what you chose to highlight? He just didn't pander to your particular set of beliefs.
Here's some other quotes from that original article...
"...but resolution is a parameter by which the differing power levels of the machines can be addressed, and was certainly the way the specialist press - ourselves included - went about it."
"The PS4 version is cleaner and more pleasing to our eyes..."
"Xbox One not only runs at a slower frame-rate, but its imagery is delivered in an inconsistent manner, resulting in further judder."
"Battlefield Hardline launches later this month with a 720p Xbox One resolution and unimpressive image quality as a result, while we suspect we'll see the same situation with Metal Gear Solid 5: The Phantom Pain later on in the year..."
Doesn't sound like an unrepentant Xbox-loving rescue mission to me. The bias you're seeing is in your reading of the material, not the material itself.
Those titles should be outliers though, and fingers crossed that across 2015 we'll see enough progress that next year's re-run of the Nielsen survey sees the quality of the gaming experience as the motivating factor behind investing in console hardware.
DF stating the truth in an even-handed manner is not proof of bias. I'd be happy to have that discussion with you if I thought you were genuine or reachable, however I'd have more chance of convincing an anti-vaxxer that Bill Gates is okey-dokey.
Nonsense. You're having to torture the article to eke out the bias you want to see and then, in your brackets, invent an implication that simply doesn't exist in the article, just in your head.
Don't end it there, continue to the next paragraph.
It's the quality of the gaming experience that matters (or, like some fanboys love to proclaim, better controller, better online, friends), not the resolution. Lmao
The article was made to downplay resolution.
The article probably doesn't exist... it's just in our reading.
Nonsense. You're having to torture the article to eke out the bias you want to see and then, in your brackets, invent an implication that simply doesn't exist in the article, just in your head.
Subtle RDNA1.2 dig at PS5 again?
They are likely the same as PS5's GPU and thus not useful, or wrong for the narrative (lower clock, same number of units, you do the maths... even if there were more, they would still need to be enough to cover the clock-speed gap), in the super long deep dives they did, where they enumerated lots of other details (they went into more detail than Cerny did on many things in the Road to PS5 video).
I don't expect miracles from "RDNA 2" when the XSX GPU only rivals the RTX 2080 (448GB/s memory bandwidth) while MS/AMD throws higher memory bandwidth against it. I expected more from AMD.
I'm still waiting for RTX 2080-level efficiency from AMD, e.g. NVIDIA's real-time memory compression is still superior to AMD's.
You are forgetting memory bandwidth scaling with any TFLOPS increase.
The XSX has ten 32-bit memory controllers, and the basic RDNA design places L2 cache in front of the memory controllers, e.g. 5MB of L2 cache instead of the RX 5700 XT's 4MB.
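Quick sanity check on that: ten 32-bit controllers make a 320-bit bus versus the 5700 XT's eight controllers and 256-bit bus, and if the L2 slices scale with the controllers as in Navi 10, 4MB × 10/8 = 5MB, which matches the figure quoted above.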
Your "They are likely the same as PS5’s GPU " speculation is not superior to my own speculation.
Both the X1X and PS4 Pro forked from the Polaris 10 baseline design.
The PS4 Pro added RPM to its CUs and a 40 CU design; Sony focused on compute-biased improvements.
The X1X added a 2MB render cache for its ROPs and a 44 CU design; MS focused on raster and some compute improvements.
Both improvement areas later appeared in Vega.
Guys.
Lmao, that DF article was basically what every other games outlet was saying at the time.
I guess everyone was biased towards M$ but just never bought an Xbox hahha
Another VRS-type solution, called by another name, would most likely still give "evidence" (which was not seen by DF) that it's being used, if in fact it's there and being used, should it not?
The FUD continues
I'm going to repeat myself:
The PS5 does not have VRS, as this is an MS tech. Link for those who want to be bothered to read a bit. It's possible the PS5 has a similar tech implemented, but it would still not be VRS.
18 pages of non-discussion because fanboys can't do a simple google search
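For anyone who skipped the Google search: on PC and Xbox, "VRS" refers specifically to the Variable Rate Shading API in D3D12. A minimal sketch of what Tier 1 usage looks like (illustrative only, not anyone's engine code; the helper function and variable names are made up here, the D3D12 calls are the standard ones):

#include <windows.h>
#include <d3d12.h>

// Hypothetical helper: request coarse (2x2) shading for subsequent draws
// if the adapter reports any VRS support tier.
void SetCoarseShadingIfSupported(ID3D12Device* device,
                                 ID3D12GraphicsCommandList5* cmdList)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options6, sizeof(options6))) &&
        options6.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
    {
        // Tier 1: one shading rate applies to everything recorded after this
        // call, e.g. shade once per 2x2 pixel block instead of once per pixel.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    }
    // Tier 2 additionally allows a screen-space shading-rate image via
    // RSSetShadingRateImage() and per-primitive rates supplied by the shader.
}

Whether Sony's own graphics API exposes an equivalent control is the open question in this thread.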
Here’s a pattern of the western media. Xbox comes out with always online DRM, they write it’s how things are going to be for everyone. Xbox offers a lower resolution standard, they write the human eye won’t notice. Xbox sells less than the competition, they write don’t look too much into it, the numbers don’t tell the story. Xbox 1X releases, they write resolution is back on the menu, stunning 4K has to be seen to be believed. Xbox ends the generation dead last, they write you the top reasons why it won the generation. Etc etc
Maybe it’s because Xbox is the “underdog” story. Which is weird because Sony is definitely the “little guy” when compared to MS.
All digital is different from always online.
Everything goes online? It was the always-online vision... but they were a console generation too early... see how many reacted positively to the all-digital PS5. Humans can only change so fast...
Of course it's not the same exact thing. There are different kinds of VRS. Read Microsoft's patent.
I'm sure Nvidia totally wasted transistors to help Microsoft market secret sauce lol.
I'm beginning to think that VRS is the new secret sauce/power of the cloud.
How do you explain all those analyses at the beginning of last gen where they clearly said you could see the difference between PS4 and Xbox One in 3rd party games? CoD, for example... it first ran at 720p and 1080p on Xbox and PS4 respectively, and afterwards it got an update on Xbox to 900p. This really reads like they said there's no difference...
Each next-gen console version has its strengths and weaknesses, but clearly it is the PS4 game that offers a superior experience overall. The advantage here comes in the form of cleaner and sharper visuals that help to better realise the extra next-gen spit and polish on display over the 360, PS3, and Wii U versions of the game - something that Xbox One's upscaled 720p presentation fails to do in quite the same way. [Source]
Linus apologized for misrepresenting what Epic said about the SSD. They said "best in class", which Linus read as "fastest SSD".
Mark Cerny gave DF a one-on-one interview about the PS5. So while you might think DF is biased, Cerny obviously doesn't. They asked a follow-up question for him to clarify whether the PS5 has VRS, and they never got a reply.
It's not a long bow to draw to assume it's because it doesn't have VRS.
And VRS and GE have no equivalency.
Its like saying GPU > RAM.
Yeah, it must be totally worthless... waste of time engineering it.
Another VRS-type solution, called by another name, would most likely still give "evidence" (which was not seen by DF) that it's being used, if in fact it's there and being used, should it not?
Note: the Geometry Engine on PS5 is culling geometry, i.e. like mesh shaders on XSX, not per-pixel shading-rate variation (VRS). Not equivalent, but a different optimization technique with different results.
Yeah, it must be totally worthless... waste of time engineering it.
PR that’s what it all is ...
this is called a nice dodge.
Impossible that this guy doesn't know mesh shaders are a thing.
He is not talking about the CPU at all. You do know that the GE is also a GPU feature, right?
A CPU doesn't hold a candle to the processing power of a GPU.
Sigh.
Umm, yeah. Looks like my post was lost on you.
He is not talking about the CPU at all. You do know that the GE is also a GPU feature, right?
I'd like to point out that the Digital Foundry article did not conclude that its absence was indicative of the PS5 not having it.
Absence of evidence is not evidence of absence.
Just because you can't see something like this being deployed, it doesn't mean it's not occurring (nor does it mean it is).
True, although it is by far the most likely explanation. Occam's Razor.
Each party has its own software implementation of VRS.
Thanks for clearing that up.
Umm, yeah. Looks like my post was lost on you.
Him saying that VRS can't hold a candle to GE is like saying that a CPU can't hold a candle to a GPU.
They are all separate things, and do different jobs, and can't be compared to each other.
It's a moot point.