
One specific person assumes PS5 3rd Party Devs Are Finding Hardware Troublesome™, “Sony Too” May be Common

anothertech

Member
He is going out of his way to spread negativity around PS5... there is a definite agenda. I understand MS brings lots of money into the industry and they want to keep them in so everyone gets paid but spreading lies isn't the way to do it.

They have tried spreading positivity around Xbox by talking up gamepass recently and that obviously didn't do enough so they came out to take down Sony a bit....
"Why is he doing this"

$$$

He's a paid shill, clear and simple.

Also Craig
 
Probably this tweet





Dang Matt ease off the poor guy.
 
Probably this tweet




I have to commend this guy (dusk)
I would just block the dude, not write out an explanation of how and why I know.
I'll say it again: his knowledge of inside Capcom stuff has already been proven, like when he made those bold claims about the SF6 boss getting canned, etc.
 

VFXVeteran

Banned
Spider-Man, Demon's Souls, Deathloop, R&C, Kena and many of the other games shown at the reveal were at full 4K.... we know this. Some of them offer a 60fps mode that may use dynamic 4K due to the use of ray tracing and other demanding techniques......

So I don't know why we are even entertaining these obvious lies that come right off the back of lots of negative press for the competition....

Allow me to put a little bit of thought into some of you guys' assumptions.

Nearly all of the PS5 games shown were cinematics (ALL of XSX games were either cinematics or pre-rendered). 4k/30FPS would be viable in this circumstance. R&C isn't very complex rendering-wise, so I don't see a problem with pushing 4k/60FPS with that game.

There is definitely something fishy about these claims. Instead of writing these guys off, we should be skeptical but cautiously optimistic. I too had questions concerning the frequency adjusting pipeline between the CPU and GPU. I thought it odd and could possibly be trouble. I also didn't like the split memory pool of the XSX either. That could be trouble as well.

But 4K is expensive - even with this generation's games. A lot of you assumed 1.8TF >>> 10-12 TFLOPS means "no sweat", but the bandwidth limitations reared their head with PC games trying to do 4K this generation. It is not in people's best interest to assume these games will run easily. 4K doesn't just mean one final buffer to write pixels to. Most of these graphics engines still use deferred rendering techniques, which require multiple buffers at the same resolution (a depth buffer, for example). Bandwidth gets eaten up pretty quickly.
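
To put very rough numbers on that (a back-of-the-envelope sketch; the G-buffer layout, formats, and pass count below are illustrative assumptions, not any particular engine's):

```python
# Rough sketch of why native 4K deferred rendering eats bandwidth.
# The render-target formats and counts are assumptions for illustration only.

WIDTH, HEIGHT = 3840, 2160            # native 4K
PIXELS = WIDTH * HEIGHT

# Hypothetical G-buffer targets and their bytes per pixel
gbuffer_bytes_per_pixel = {
    "albedo (RGBA8)": 4,
    "normals (RGB10A2)": 4,
    "material params (RGBA8)": 4,
    "motion vectors (RG16F)": 4,
    "depth/stencil (D32S8)": 5,
}

bytes_per_frame = PIXELS * sum(gbuffer_bytes_per_pixel.values())
print(f"G-buffer footprint per frame: {bytes_per_frame / 1e6:.0f} MB")

# Written once in the geometry pass, read back once in the lighting pass,
# 60 times per second (and that's before textures, shadow maps, etc.)
traffic_gb_per_s = bytes_per_frame * 2 * 60 / 1e9
print(f"~{traffic_gb_per_s:.1f} GB/s just for G-buffer traffic at 60 fps")
```

Even under those friendly assumptions you're spending roughly 20 GB/s on the G-buffer alone, which is a real slice of a console memory bus in the ~450-560 GB/s range.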
 

Bryank75

Banned
Allow me to put a little bit of thought into some of you guys' assumptions.

Nearly all of the PS5 games shown were cinematics (ALL of XSX games were either cinematics or pre-rendered). 4k/30FPS would be viable in this circumstance. R&C isn't very complex rendering-wise, so I don't see a problem with pushing 4k/60FPS with that game.

There is definitely something fishy about these claims. Instead of writing these guys off, we should be skeptical but cautiously optimistic. I too had questions concerning the frequency adjusting pipeline between the CPU and GPU. I thought it odd and could possibly be trouble. I also didn't like the split memory pool of the XSX either. That could be trouble as well.

But 4K is expensive - even with this generation's games. A lot of you assumed 1.8TF >>> 10-12 TFLOPS means "no sweat", but the bandwidth limitations reared their head with PC games trying to do 4K this generation. It is not in people's best interest to assume these games will run easily. 4K doesn't just mean one final buffer to write pixels to. Most of these graphics engines still use deferred rendering techniques, which require multiple buffers at the same resolution (a depth buffer, for example). Bandwidth gets eaten up pretty quickly.
I don't expect Spider-Man, for instance, to do true 4K with 60fps and ray tracing..... that would be unreasonable unless they have some serious tricks...

I guess in my particular case it was that Dusk was saying PS5 was at 1080p, while XSX was at full fat 4K, which doesn't seem to add up in my mind. I could understand 1800p or something a little below 4K but 1080 seems ridiculous.

Like you say, there is a lot to it and I am not going to pretend to know all the ins and outs but PS has always delivered on their promises within reason and have kept me very happy last gen, so I give them the benefit of the doubt.
 
Well... if we really think about it: of the two systems, which is more "exotic"? From what we've been hearing, it's the PS5. So, unfortunately, just throwing your code at it won't do. I would imagine that this would be an issue they run into at the beginning of the porting process, right? They throw the code at it, see where it's struggling, and from there they tweak. I'm going to guess that none of the 3rd party games right now are coded specifically for the next gen consoles; this is code that is being ported over. This news is either old, or something that will get easier with time or once the code is built specifically for these next gen consoles. Although if this old code was originally built with the PS4 in mind, don't the PS5 and PS4 have the same number of CUs? Shouldn't that help? It's not less, anyway.

IDK, I'm not a dev and barely grasp any of this stuff LOL, maybe someone like NXGamer could shed some light 🤷‍♂️
 

Kazza

Member
Haha, so true




The thread, of course, is closed on ERA.

Where's the rumour about the PS5 being 13TF? Why's that one missing from the list?

Anyway, this is going to be me when the first DF analysis of a next gen third-party game drops:



The difference will probably only be like the current one between the Pro and X (i.e. noticeable, but not a huge gamechanger), but the threads here will still be :messenger_fire: :messenger_fire: :messenger_fire: :messenger_fire:
 

Bryank75

Banned
Where's the rumour about the PS5 being 13TF? Why's that one missing from the list?

Anyway, this is going to be me when the first DF analysis of a next gen third-party game drops:



The difference will probably only be like the current one between the Pro and X (i.e. noticeable, but not a huge gamechanger), but the threads here will still be :messenger_fire: :messenger_fire: :messenger_fire: :messenger_fire:
If Xbox has any 3rd party games... looks like there will be very few that don't have a deal with PlayStation.....

I mean you could compare Fifa or NBA whatever....
 

Krisprolls

Banned
That says an optional performance mode. Why would they make it optional? Just so 4K/30 would have some extra effects/RT? I think it was badly worded and the game is 4K/30 with an optional 60fps mode at a lower res

It's optional because the 4K/30fps "graphics" mode has additional effects, obviously. 4K/60 means 4K/60; there's no room to word it badly.

Apparently Insomniac didn't have much trouble getting great performance on the PS5. Doesn't look like struggling at 4k / 60 fps to me.

I'd be a bit less confident on XSX since no game was shown running on it and the flagship title looked so ugly they had to delay it until next year.
 

bargeparty

Member
That says an optional performance mode. Why would they make it optional? Just so 4K/30 would have some extra effects/RT? I think it was badly worded and the game is 4K/30 with an optional 60fps mode at a lower res

We've been through this, it's 4k at 60fps.

4k/30 because some people would rather it be extra shiny.
 

Evangelion Unit-01

Master Chief
These console wars are exhausting. I suspect it has something to do with the fact that everyone is locked up at home...

Xbox is more powerful. Full stop. We've all seen the numbers. No amount of SSD, "special sauce", or variable clocks will change that. Third party games will most likely look better on Series X. But does it really matter? Probably not.

Look at who has "won" the last few gens: PS2, Wii, and presumably the Switch. In each of these cases the least powerful console has won and delivered some incredible exclusive games in the process.

Consumers care about two things:
1. Price
2. Games

I think we can assume the prices will be comparable. Nobody wants to launch a PS3 or Xbox One.

So that leaves us with games. Xbox has Game Pass and while that is a fantastic service it is not the end-all, be-all. Sony has an incredible lineup of first party juggernauts and they seemingly have deals on a ton of third party titles. I think it is safe to say that Sony is winning in this area. Xbox has done a ton to close the gap but they still have work to do: Halo and Gears are waning in popularity, their newly acquired first party studios are mostly young with games that are years away, and Sony has the third party deals.

I think it is safe to say that Sony has the advantage with games but Xbox is no slouch with game pass. Both consoles will do just fine. But why is that not enough for people? Why can't people talk about these companies objectively? Why don't people discuss this as enthusiasts instead of fanboys?

Instead we have these tiresome conspiracy theories.

"Sony is moneyhatting third party exclusives so they don't look less powerful in comparison videos" or "Microsoft paid off Digital Foundry look at these shills they didn't call Halo the worst game ever".

It's pathetic. Just because one console has the advantage in one area doesn't mean that you need to attack it or defend your platform of choice. It's sad and you just look foolish. Xbox might be marginally more powerful. Nobody outside of video game forums will care. The average consumer is going to go to Best Buy, get a console, and plug into their old LCD TV where the colors are all calibrated incorrectly. They aren't going to notice native vs checkerboard 4K. It just is not going to matter. For enthusiasts like us, if Xbox and/or PS starts to really fall behind graphically, they'll almost certainly release a "pro" mid-gen refresh to make the power gap irrelevant. It just won't matter. People survived the 900p vs 1080p Xbox One/PS4 debacle; this won't be any worse than that.

All of that to say: we have to stop with the stupid conspiracy theories and endless console warring. 90% of the time the fighting is over stupid bullet points on the side of the console box. It's just dumb forum fodder. It doesn't have any bearing on the market as a whole. It is ok to like things and care about things the market doesn't like or care about. Just separate yourself from the market. Realize that your consumption preferences as a video game forum poster are probably not the same as the market's. This forum and other video game forums out there account for a small fraction of the market. The market is out there building stairs in Fortnite and buying micros in FIFA... they aren't worried about X vs Y teraflops. Both consoles will be just fine.

It is ok to speculate on who will win the sales race or what games will be the ones to play but the console warring is just plain embarrassing. There isn't any thoughtful conversation-just venom and third grade playground antics. These consoles can have strengths and weaknesses without it meaning one or the other is doomed. It's not a zero sum game one console is not going to "win" because the other "lost".

Now I'm sure I've managed to make both the PS and Xbox fanboys angry...good. Get over yourselves. As for me I'll be playing on both Xbox and PS this gen just as I do every other gen.
 

pixelation

Member
Fuck 4K, Digital Foundry themselves have said that DLSS offers more impressive results than native 4K. Couldn't care less about hitting some magical number just for the sake of reaching it when the alternative could potentially look better.
 

rnlval

Member
They don't know what they're talking about. It's always DuskGolem and Jeff Grubb, how many threads are we allowed to open about those two ?

Jeff Grubb doesn't understand what he's saying. No, Sony's variable frequencies are not AMD SmartShift, unlike what he says; it's not the same thing. AMD SmartShift is about giving unused power from the CPU to the GPU, only in that direction. Sony's PS5 model is about making sure the max available power is used on both the CPU and GPU based on workload (not based on temperature, and it's deterministic).

How many FUD threads are allowed for the same DuskGolem tweets ?
AMD SmartShift changes the CPU and GPU clock speed domains in accordance with workload, within the total CPU+GPU power budget and the upper TDP safety limit. Reaching max clock speed doesn't indicate heavy usage.
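
To illustrate the distinction with a toy sketch (the wattage figures, the proportional split rule, and the cube-root clock/power model are all made-up assumptions, not how Sony's or AMD's actual power management works):

```python
# Toy model of a fixed CPU+GPU power budget being divided by workload.
# Every number and the clock/power relationship are invented for illustration.

TOTAL_BUDGET_W = 200.0                 # assumed combined CPU+GPU budget
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23   # PS5 headline max clocks
CPU_FULL_W, GPU_FULL_W = 60.0, 160.0   # assumed power needed to hold max clocks

def split_budget(cpu_demand, gpu_demand):
    """Split the fixed budget in proportion to workload demand (0.0 to 1.0)."""
    total = (cpu_demand + gpu_demand) or 1.0
    return (TOTAL_BUDGET_W * cpu_demand / total,
            TOTAL_BUDGET_W * gpu_demand / total)

def clock(power_w, full_w, max_ghz):
    """Toy rule: clock scales with the cube root of available power, capped."""
    return min(max_ghz, max_ghz * (power_w / full_w) ** (1 / 3))

# Light CPU load, heavy GPU load: the GPU takes most of the budget and holds max clock.
cpu_w, gpu_w = split_budget(cpu_demand=0.2, gpu_demand=0.9)
print(f"CPU: {clock(cpu_w, CPU_FULL_W, CPU_MAX_GHZ):.2f} GHz, "
      f"GPU: {clock(gpu_w, GPU_FULL_W, GPU_MAX_GHZ):.2f} GHz")
```

The takeaway is the same: under a scheme like this, sitting at max clock tells you nothing about how heavy the workload actually is.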
 

rnlval

Member
He did say PS5 can be at max clocks most of the time during Road to PS5 and the Eurogamer interview. Even PS5 devkits have a profile with locked max clocks.
Reaching max clock speed is NOT an indication of workload magnitude, e.g. a scalar integer or 128-bit SSE workload on the CPU can reach max clock speed, while heavy 256-bit AVX2 work can cause clock speed throttling.
 

Chiggs

Gold Member
Fuck 4K, Digital Foundry themselves have said that DLSS offers more impressive results than native 4K. Couldn't care less about hitting some magical number just for the sake of reaching it when the alternative could potentially look better.

Please don’t interpret this as me weighing in on either console, but...

Fuck 4k because the dorklings from Digital Foundry—a bunch of guys with ZERO dev experience—say something?

What?!

You have your own eyes, right? And your own brain? I'd imagine you're just as capable of developing your own opinion about something as, say, Digital Foundry is?

Also, I’ve spent the last year or so with an RTX 2080ti, and DLSS is not more impressive than native 4K, IMO.
 
While I am in no position to truly evaluate this information, so far I have seen many PS5 games--apparently running on the PS5 itself--and no Xbox Series X games running on actual Series X hardware.

Just sayin'.

MS could record videos on dev kits and drop them tomorrow to fix a portion of this mess; ask Ubisoft for footage of their next Assassin's Creed game or something if you have to. Put 4K native, 60+ fps, ray tracing, and whatever else in big bold letters.
 
You care about the price of something only if it interests you in the first place... games make you want it. If it's too expensive it will sell less, but it will still sell (look at the Switch)... if it doesn't have games/software, nobody will bother picking one up for $20, unless they need a paperweight or a conversation starter under their TV.
 
Oh, how the FUD flies when Xbox has a bad showing. If I were you Xbox fans, I'd be more concerned about why no next gen game has ever been shown running on actual XSX HW just a mere 3 months from launch. And why the biggest franchise for Xbox is being so completely mishandled to allow it to look like a launch 360 game, only in 4K. Shown to the world as if that should be acceptable.
 

pawel86ck

Banned
Stop being obtuse.

My GPU never, ever runs at 'max clocks' 100% of the time. Nor does any modern GPU ever made. It doesn't need to. During less computationally demanding scenes or areas in a game, they downclock.

CPUs similarly do not need to run 3.3Ghz 100% of the time. Your argument is idiotic.
GPU doesn't need to run at max clock for long before a downclock will happen, and here you can see what GPU usage looks like at 4K on a 2080 Ti OCed to 2085MHz (18TF).


99% GPU usage for 99% of the time. Both next gen consoles are weaker, so I'm sure their GPUs will be pushed to the max for extended periods of time as well.
 
I'm sure many of us considered the possibility, I was thinking about it earlier today....

I might just check how they write things.

It's a strange coincidence indeed when Xbox news of a game delay or some other negative news is immediately followed by fake FUD and concern trolling from YouTubers, influencers, and these so-called insiders who try to malign the PS5. Didn't exactly this happen during the PS4/Xbox One launch, where there were even claims that PS4 games were struggling to run, but in reality PS4 games right out of the gate slapped the competition with native 1080p and higher, more stable frame rates?
 

Topfuel

Member
It's a strange coincidence indeed when Xbox news of a game delay or some other negative news is immediately followed by fake FUD and concern trolling from YouTubers, influencers, and these so-called insiders who try to malign the PS5. Didn't exactly this happen during the PS4/Xbox One launch, where there were even claims that PS4 games were struggling to run, but in reality PS4 games right out of the gate slapped the competition with native 1080p and higher, more stable frame rates?

Strange how the same happens when some bad PS stuff comes out?
 

just tray

Banned
So expect more dynamic resolution on the PS5 is what I got from that, which will make some developers not even bother with SmartShift. I would imagine it's easier to program for than the eDRAM and ESRAM, or whatever they were called in prior generations, once Sony gets the dev kits upgraded (drivers).

I don't get this solution, because for Sony you always have to account for the lowest speed of either the CPU or GPU, change the game design, or just go low on both and rely on upscaling or dynamic resolution.

A fast SSD helps by streaming in textures, but when the Series X doesn't have to cut the same CPU and GPU corners, an SSD will only help so much.

Sony has the exclusives, which really do matter, though Game Pass competes with Sony exclusives directly. It's like you can't go wrong picking up either system, and the Series X will be fine without Halo.
 

MastaKiiLA

Member
GPU doesn't need to run at max clock for long before a downclock will happen, and here you can see what GPU usage looks like at 4K on a 2080 Ti OCed to 2085MHz (18TF).


99% GPU usage for 99% of the time. Both next gen consoles are weaker, so I'm sure their GPUs will be pushed to the max for extended periods of time as well.

That PC monitor is running at a 1s refresh, and SmartShift is supposed to happen in 2ms, meaning you could theoretically have 500 shifts in frequency during a single refresh and the monitor could still show 99% util. From what I understand, SmartShift isn't necessarily based on utilization, but on necessity. If a part of the rendering pipeline needs to run a bit faster, then you can bump the clock for that process, but you can still maintain full util of the GPU.

In any case, I don't know if those PC monitors are useful at all, as you rarely have all units on either the CPU or GPU fully occupied outside of benchmark/test suites. Game code just isn't perfectly optimized like that, not to mention the rendering scenarios in a game are dynamic, based on what the player is actively doing. If you had full utilization just looking at scenery, there'd never be enough headroom for effects.

The images shared of a real game workflow are more accurate, and I believe they're from ND, one of the best game farms in the industry. With process interdependencies, there are going to be CUs waiting to get data from other CUs. There are going to be some CUs running non-stop. The load will shift around multiple times in a single frame, which means tens of times each second that the PC monitor is trying to capture. I don't think those monitors have millisecond granularity in their calculations, to even capture an accurate average, so I'm not sure that video is worth much of anything.
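
A quick way to see the granularity problem (a toy simulation; the shift interval, dip depth, and dip frequency are assumptions purely for illustration):

```python
# Simulate a clock that can shift every 2 ms, then report it the way a
# once-per-second overlay would: as a single averaged number.
import random

random.seed(0)
MAX_MHZ = 2230              # assumed max GPU clock
SLICES = 1000 // 2          # number of 2 ms slices in one second

# Most slices sit at max clock; an assumed 10% of them dip by ~3%.
trace = [MAX_MHZ if random.random() < 0.9 else MAX_MHZ * 0.97
         for _ in range(SLICES)]

avg = sum(trace) / len(trace)
dips = sum(1 for c in trace if c < MAX_MHZ)
print(f"1-second averaged reading: {avg:.0f} MHz ({avg / MAX_MHZ:.1%} of max)")
print(f"but {dips} of {SLICES} two-millisecond slices were below max clock")
```

An overlay sampling once a second can honestly report "~99% of max" while the clock is quietly moving dozens of times per frame, which is exactly why second-granularity PC tools can't settle this argument.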
 

pawel86ck

Banned
That PC monitor is running at a 1s refresh, and SmartShift is supposed to happen in 2ms, meaning you could theoretically have 500 shifts in frequency during a single refresh and the monitor could still show 99% util. From what I understand, SmartShift isn't necessarily based on utilization, but on necessity. If a part of the rendering pipeline needs to run a bit faster, then you can bump the clock for that process, but you can still maintain full util of the GPU.

In any case, I don't know if those PC monitors are useful at all, as you rarely have all units on either the CPU or GPU fully occupied outside of benchmark/test suites. Game code just isn't perfectly optimized like that, not to mention the rendering scenarios in a game are dynamic, based on what the player is actively doing. If you had full utilization just looking at scenery, there'd never be enough headroom for effects.

The images shared of a real game workflow are more accurate, and I believe they're from ND, one of the best game farms in the industry. With process interdependencies, there are going to be CUs waiting to get data from other CUs. There are going to be some CUs running non-stop. The load will shift around multiple times in a single frame, which means tens of times each second that the PC monitor is trying to capture. I don't think those monitors have millisecond granularity in their calculations, to even capture an accurate average, so I'm not sure that video is worth much of anything.
You are correct, not every GPU transistor is turned on even when the OSD shows 99% GPU usage, but the thing is the GPU was still forced to run at max clocks for the vast majority of the time, and it will be the same on PS5 as long as games run with an unlocked framerate or dynamic resolution.
 
Stop being obtuse.

My GPU never, ever runs at 'max clocks' 100% of the time. Nor does any modern GPU ever made. It doesn't need to. During less computationally demanding scenes or areas in a game, they downclock.

CPUs similarly do not need to run 3.3Ghz 100% of the time. Your argument is idiotic.
So was MS lying when they said their console runs with fixed clocks? Are they running variable frequency and not telling anyone? Isn't it true consoles don't run like PCs with boost clocks?

The point of variable freq. is that you can squeeze out more GPU power than with fixed clocks.
I mentioned devkits have a profile with locked clocks.

What's the point of locked clocks when they're never utilized 100%, then? Regarding the XSX, some power will be left on the table.

Yeah, and it will only drop a couple of percent (in the tech analysis it's 2%) to reduce power consumption by 10%. Mind you, that 10% drop in power consumption is not a 10% drop in performance. Maybe resolution will drop if a worst-case scene happens, like a bunch of today's games do with dynamic resolution.
So the conclusion is that variable clocks are better for game performance than fixed? Is it true that for the PS5 to hit max clocks on the GPU it has to downclock the CPU? What happens when you want max performance from both parts?
 

Nickolaidas

Member
Seems like Cerny was an idiot (he generally seems overrated and not like some god) & put all his marbles into an SSD. Anyways idc much about power but it better not be more expensive than the Xbox IX.

Now it makes sense why they're going to moneyhat a bunch of 3rd parties.

Sony believes content > power will get people to buy the PS5.
Is this a satire post? It reads like a satire post.
 

geordiemp

Member
GPU doesn't need to run at max clock for long before a downclock will happen, and here you can see what GPU usage looks like at 4K on a 2080 Ti OCed to 2085MHz (18TF).


99% GPU usage for 99% of the time. Both next gen consoles are weaker, so I'm sure their GPUs will be pushed to the max for extended periods of time as well.


No, that is not the usage in ms within a frame; it's simple PC-based information that shows max usage per second.

On consoles, when the CPU is using the common memory bus, the GPU cannot access memory..... PC tools do not show fine detail at the nanosecond level at all.

It's like saying your car averaged 80 mph over a day, but what about every second of that day lol.
 

Kokoloko85

Member
Does GAF think MS is paying these YouTubers to spread FUD on the opposition? We've had a lot of it the last few days. Or is it coordinated by the usual gang?

Shame most of them are as thick as a plank and have no idea what they are talking about.

It gives us a laugh..

“Sony too” sounds like something they want to catch on lol
 

Tulipanzo

Member
He is going out of his way to spread negativity around PS5... there is a definite agenda. I understand MS brings lots of money into the industry and they want to keep them in so everyone gets paid but spreading lies isn't the way to do it.

They have tried spreading positivity around Xbox by talking up gamepass recently and that obviously didn't do enough so they came out to take down Sony a bit....
Journos are often more interested in creating a story than they are in solid reporting.

Grubb was the same person to report "more devs have praise for PS5", so how the fuck wouldn't he have heard about any potential problems at 4K?
The difference? Back then the XSX was favoured after the spec reveal, and the narrative of "PS5 is secretly better" was juicier, so he pushed that.
Right now, the reality is that Sony is heading for a very solid launch, with a massively hyped, well-performing machine and a bevy of anticipated games; meanwhile, MS has just delayed the one game anybody cared about, has little momentum, there's still a secret trash model they refuse to reveal, and all signs point to them being more expensive.

There's no story there, and so you see nonsense angles being propped up:
- The PS5 is secretly bad at 4K, even though every dev was saying it's great
- Neither console is ready for next-gen actually
- The Series S will be mega popular (polls both on Gaf and Era showing 80% won't ever buy one)
- People will actually buy XBox for exclusives, which is judo flipped to "people will buy COD/Fifa" when it's pointed out they have none
- Are exclusives anti-consumer? 🤔
 
Xbox is more powerful. Full stop. We've all seen the numbers. No amount of SSD, "special sauce", or variable clocks will change that. Third party games will most likely look better on Series X.

That is not true. You are basing that analysis on the assumption that a mere 18% XSX TFLOP advantage is the ONLY thing that matters. Here's a thought experiment. Take that exact same XSX, remove its SSD, and replace it with a hard drive. Even you should admit that the 18% advantage would mean nothing in that configuration due to the I/O bottleneck. TFLOPs aren't everything.

Now I will give you that the PS5 will likely have a lower average resolution than the XSX. However, at or near 4K you only have to drop the resolution a tiny bit, perception-wise, to make up the entire 18% TFLOP deficit. Instead of the XSX's 2160p (4K), the PS5 would only drop to around 1950p in demanding scenes with dynamic resolution. For comparison, that 1950p has about 17% more pixels than 1800p, which with sharpening is itself a pretty good substitute for 4K. That is the worst case scenario from all this talk about PS5 games looking worse than XSX. It'd be a small resolution drop that the vast majority of gamers would never notice.
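
A quick check of that arithmetic (assuming a 16:9 aspect ratio and that pixel count scales roughly linearly with compute, which is itself a simplification):

```python
# Translate an assumed 18% compute deficit directly into pixel counts.
def pixels(vertical):
    """Total pixel count at 16:9 for a given vertical line count."""
    return int(vertical * 16 / 9) * vertical

native_4k = pixels(2160)                    # 8,294,400 pixels
target = native_4k * (1 - 0.18)             # 18% fewer pixels
vertical = int((target * 9 / 16) ** 0.5)    # solve h * (16h/9) = target for h
print(vertical)                             # ~1955, i.e. "around 1950p"
print(round(pixels(1950) / pixels(1800) - 1, 3))  # ~0.173, i.e. ~17% more than 1800p
```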

But wait, there's more. The PS5 has an upside too: that SSD and I/O system you dismissed. The PS5's I/O can read around 4.8 GB more data per second than the XSX. That means 4.8 GB more per second of textures, geometry, animation, and so on. Did you see that awesome Unreal 5 demo? That was only 1440p, but it looked fantastic due to using higher quality assets and lighting, not 4K resolution. Resolution is running head-on into diminishing-returns territory. The quality of the pixels matters more than the quantity of pixels. The PS5 can use that extra 4.8 GB of data per second to create those better pixels. All the TFLOPs-only analysis ignores that truth.
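
Taking that 4.8 GB/s figure at face value (it's a claimed number, not something verified here), the per-frame difference works out like this:

```python
# Extra streamed data per rendered frame from an assumed 4.8 GB/s I/O advantage.
EXTRA_GB_PER_S = 4.8   # the claimed delta, taken as given

for fps in (30, 60, 120):
    extra_mb = EXTRA_GB_PER_S * 1000 / fps
    print(f"at {fps} fps: ~{extra_mb:.0f} MB more streamable data per frame")
```

Whether engines actually stream anywhere near that much every frame is a separate question, but it gives a sense of the scale being claimed.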
 