> What I really don't understand is: if this is an insufficient hardware issue like some are saying, then how was my 8 GB of VRAM enough for Cyberpunk with RT but not for TLOU?

Almost like different games with different engines doing different work have something to do with it (by the way, ask PS4 and Xbox One players how much they enjoyed CP2077). And you picked a curious example, considering the crazy number of bugs that game had to sort out for over a year after its launch (some of which they are still sorting out). Also, I'm not sure what your experience is beyond the subjective fact that you like it, so it is not an apples-to-apples comparison... which is the same discussion we have with the "Year of Linux on the Desktop" (vs Windows/macOS) people when they say their experience is "just fine".
Seriously? I'm not going to rehash it all in detail, but if you've been paying attention (it may not have affected you personally, which is great):
Spider-Man: poor CPU utilization and CPU-boundedness (especially with RT on), VRAM limitations/perf degradation
Atomic Heart: stutter, crashing, FPS drops (at launch)
Returnal: poor CPU utilization and CPU-boundedness (especially with RT on), streaming stutters when loading (again, at launch)
I'm so confused. So PC can offer a "far superior" experience compared to consoles, but having to wait anywhere from a few minutes to an hour to precompile shaders before startup is somehow a superior experience to just loading the game and playing? (And having to redo that every time you make a GPU driver or HW change, BTW.) Then you say APIs and drivers have never been better... but does that mean they are easier to develop for than console, even in their current state? Going from "pain in the ass" to less of a "pain in the ass" is still a... "pain in the ass". Also, you do realize that much of the pain with shader compilation stutter is due to how DX12 works and what developers now have to do at the SW level, right? No, it's not just a UE4 issue.
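Since people keep asking what "what developers now have to do at the SW level" actually means: under D3D12 the driver can't use shader bytecode directly. It has to be baked into a full pipeline state object (PSO), compiled for your exact GPU, and every combination of shaders and render state the game can hit needs its own PSO. That per-machine compile is exactly the work that happens either up front (the long precompile step) or mid-game (the stutter). Here's a rough sketch of what creating a single PSO looks like; this isn't any particular engine's code, and `device`, `rootSignature`, `vsBlob`, and `psBlob` are placeholders for state a real renderer would already have set up:

```cpp
// Minimal sketch of D3D12 pipeline state creation -- the step that, done
// on demand mid-game instead of up front, causes shader compilation stutter.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12PipelineState> CreateOpaquePso(
    ID3D12Device* device,              // assumed already created
    ID3D12RootSignature* rootSignature,
    ID3DBlob* vsBlob, ID3DBlob* psBlob) // assumed already-compiled bytecode
{
    D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
    desc.pRootSignature = rootSignature;
    desc.VS = { vsBlob->GetBufferPointer(), vsBlob->GetBufferSize() };
    desc.PS = { psBlob->GetBufferPointer(), psBlob->GetBufferSize() };

    // Every permutation of states like these needs its own PSO -- this is
    // why modern games can have tens of thousands of them to precompile.
    // (Blend/rasterizer details abbreviated for brevity.)
    desc.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
    desc.RasterizerState.CullMode = D3D12_CULL_MODE_BACK;
    desc.BlendState.RenderTarget[0].RenderTargetWriteMask =
        D3D12_COLOR_WRITE_ENABLE_ALL;
    desc.DepthStencilState.DepthEnable = TRUE;
    desc.DepthStencilState.DepthFunc = D3D12_COMPARISON_FUNC_LESS;
    desc.DepthStencilState.DepthWriteMask = D3D12_DEPTH_WRITE_MASK_ALL;
    desc.SampleMask = 0xFFFFFFFFu;
    desc.PrimitiveTopologyType = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
    desc.NumRenderTargets = 1;
    desc.RTVFormats[0] = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.DSVFormat = DXGI_FORMAT_D32_FLOAT;
    desc.SampleDesc.Count = 1;
    // InputLayout left empty: assume a VS that fetches its own vertex data.

    // The driver compiles the bytecode to GPU-specific machine code here,
    // which is why this call can take milliseconds -- or much longer.
    ComPtr<ID3D12PipelineState> pso;
    if (FAILED(device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso))))
        return nullptr;
    return pso;
}
```

Multiply that one call by every state permutation in a modern game, and remember that on a fixed console the compiled results can ship with the game, while on PC the compile has to happen on each user's machine.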
I never said console games don't have their issues. S**t, making video games is difficult... PERIOD. But you're missing my point. Using The Last of Us Part I as a recent example:
Up to this point, the legacy of The Last of Us was simply that it was an exceptional game that many held up as one of the best gaming experiences ever! Its legacy was as a shining example of what the video game medium is capable of, and its quality helped define the reputations of both PlayStation Studios and Naughty Dog as a whole, even resulting in one of the best shows in HBO history. Keep in mind that this game has literally released on three console generations to this point, with various amounts of remastering and remaking of the original game... and while there were of course patches to improve things, the discussion and legacy of the game never wavered from it just being an outstanding game.

Now, with the PC port, what is the discussion and legacy? It's about how it doesn't run on this or that SKU. It's about how poor a port it is, how badly it performs, how it's crashing for many users, how it can't hit this resolution and FPS on this system, etc. No matter what they do from this point on to fix it, these impressions will forever stick when thinking back to the PC version of this game. The legacy is now a bunch of silly memes with super wet characters and black artifacts all over them. I haven't seen a single forum thread or media article discussing just how great a game (at its core) The Last of Us really is with this PC release. Why? (Here it is...) Because oftentimes, the nature of the PC and the recent issues that come with it have made it increasingly difficult for gamers to just enjoy the games they are playing.

Admittedly, it's hard to just enjoy the game when you have glitching, stuttering, inconsistent performance, crashing, and other issues you have to debug just to get it to work properly. Sure, not EVERYONE has these issues, but that's the whole point: there isn't just a single PC to consider. Hell, there isn't even a single series or tier of PCs to consider, with HW spanning a decade or more at times.
Everyone loves to act like "the PC" is just this single monolithic beast of a computer... how many of you realize that most dev studios barely have RTX 4000-series cards to test their games on (never mind the RX 7000 series)? There may be 1-3 Ada cards at a studio, typically reserved for a few key devs on the team and not the QA lab (for example). At the same time, there may not be any non-RTX cards, or DX9-era cards, or Windows 10 systems available to test on either. There's so much trust in, and reliance on, so many external entities doing their jobs properly and addressing any new bugs, just to ensure a game will work correctly.
You hope Microsoft does its job with the Windows OS and the OS calls your game relies on. You hope that Nvidia's and AMD's drivers function as expected across the full range of SKUs out in the market (beyond what you can test). You hope that Steam, Epic, Xbox, Origin, Battle.net, Uplay, etc. work properly for every user, despite whatever else they may be running on their machine. You hope that OBS, GeForce Experience, Logitech, Razer, Norton, and whatever other SW is running on the user's machine doesn't mess with the game's functionality and performance. Most importantly, you HOPE that the user will follow instructions, heed warnings, and use "common sense" and not try to do things that may break the game (e.g., attempting to run ultra settings and RT on a 5-year-old non-RT GPU with only 8 GB, or pairing their old Intel 6000-series Skylake quad-core CPU with an RTX 4090 and wondering why the perf is so bad).
Well said. Yes, game development in general is a b*t**, but PC development has its own challenges, making it even worse in many ways. When you really think about it, it's literally a miracle that something as complex as a modern video game could ever ship without any egregious issues that would prevent the player from simply enjoying the game.
> Almost like different games with different engines doing different work have something to do with it...

I still don't fully understand; so it's an engine issue, not a hardware issue?
They will do what makes business sense: they want to keep a decent reputation, and this is a new SKU they get money for. It is not a single purchase that gets you a multi-platform game running on both PC and PS5.
Some of the most egregious bugs you see now are, 99.8% of the time, bugs QA flagged that are sitting in a backlog somewhere (I do not think they will pull a FROM and never bother with the performance ones).
With that said, the points made in the post you are answering are valid: the PC has a lot of great attributes, but making it cheap and easy to optimise games is not one of them (no matter how much you want to pass that work on to gamers by giving them tons of sliders to balance for a smooth frame rate or the desired quality settings).
It is also true that with their current business model, GaaS games aside, they still have an incentive to make the PS5 the best place to play, because that is where they get a cut of third-party sales (games, DLC, and microtransactions). It is not in their best interest to encourage players to move from PS consoles to PC. That does not mean they will intentionally sabotage their PC ports, but it may mean their PC ports are not day-and-date, and that they may not spend billions deeply optimising them for every possible HW, OS, and driver-version combination (a matrix which is very, very complex on PC... it is just the truth).
What I really don't understand is: if this is an insufficient hardware issue like some are saying, then how was my 8 GB of VRAM enough for Cyberpunk with RT but not for TLOU?
CP 2077 has a ton of low-res textures. Just compare the Reworked Project to the original textures.
A lot of them look like something out of the PS2 era.
But the thing is, when using quality textures, CP 2077 can use up all 24 GB of VRAM on a 4090.
Cyberpunk 2077 — "4K actual FPS cyberpunk 2077 in city areas with DLSS off avg 40FPS from my testing but there are people who swear and post screenshots of the CPU and..." (forums.guru3d.com)
OK, thanks. I know very little about these kinds of things and want to filter out all the noise around this issue before I commit to an unwanted hardware upgrade.
> Wait, that specs sheet specifies 1080p high for 8GB GPUs. Aren't many of the complaints from gamers because they can't run those at 1440p?

I'm seeing complaints all across the spectrum. You don't get a mostly negative score on Steam with problems at one resolution setting alone.
Well said. On shaders: I assume that with so few vendors, they could be downloaded already compiled for your GPU of choice. Am I wrong, or is this a future possibility, at least for Nvidia and AMD?
> CP 2077 has a ton of low-res textures... But the thing is, when using quality textures, CP 2077 can use up all 24 GB of VRAM on a 4090.
No, this time they are actually right.
The Oodle library that came packaged with the game is broken or something, and it unnecessarily hammers the CPU.
There's an unofficial fix for that, at least, and it eases the strain on the CPU (proof it's not the CPU's fault)... but I believe patch 2 already kinda fixed that, so you might not need the unofficial fix depending on your system... you'll have to test yourself.
The fact that the community got that fixed early tells you it's something ND missed in the port; it's not that the 3600 is ancient and couldn't possibly handle this game.
Because, if you watch the video, you will see that once the "loading" is done, the game runs "well" on the 3600.
And as for 8 GB of VRAM: locking players to medium is fine... I'm all for that, even (not really, let me suffer)... if you want Ultra textures, build a machine that can handle Ultra settings... but what they call medium is a farce.
(screenshot of the "Medium" texture quality in question)
There is no excuse for this level of texture quality and they should feel bad.
TLOU2 runs on base PS4 with bigger, better environments and better, more complex textures than Part I... and the texture quality absolutely is NOT what Medium Environment Textures should look like.

> TLOU2 runs on base PS4 with bigger, better environments... and the texture quality absolutely is NOT what Medium Environment Textures should look like.

It is easier to have higher-quality textures when there is a lot of repetition. The concrete blocks above the "P" all have the same texture, there is a repeating pattern in the brickwork, and the floor texture looks pretty simple.
Compare that to TLOU, where the three floor slabs are all different, the brickwork does not repeat, and there is all the other stuff in shot plus everything immediately around the camera.
That is the trade-off: if you want to optimise TLOU and keep the variety of assets, then you need to lower the quality of those assets.
Also, an 8GB card running a PS5 game, when the PS5 itself has 10-12GB of usable VRAM, seems a bit of an odd choice. A 3060 12GB or a downclocked 6700 XT would be a far better choice to ensure you are not hitting an issue with the game trying to keep VRAM usage below the available limit.
> But the thing is, when using quality textures, CP 2077 can use up all 24 GB of VRAM on a 4090.

Not sure what your point is. Does the PS5 have 24 gigs of RAM? Nope. So it wouldn't run on either console.
> It is easier to have higher-quality textures when there is a lot of repetition... Also, an 8GB card running a PS5 game, when the PS5 itself has 10-12GB of usable VRAM, seems a bit of an odd choice.

On the PS5, there's only 12.5GB available to developers. Do you really think that developers are allocating 12GB just for the GPU?
If this isn't proof they're leaning heavily on I/O, I don't know what else can be said.
They have been relying heavily on streaming since Crash 1 on the PS1. It checks out.
It's kind of depressing to me that this place is filled to the brim with hot takes that we're expected to craft a discussion around, but whenever DF releases a good analysis, we get posts that act like DF is worthless, etc.
> They have been relying heavily on streaming since Crash 1 on the PS1. It checks out.

Then how did the PS4 run TLOU Part 2 so well? It didn't have Cerny's I/O.
> TLOU2 runs on base PS4 with bigger, better environments... and the texture quality absolutely is NOT what Medium Environment Textures should look like.

These are distant-quality LOD textures, not low or medium. Seriously, there should not be textures like that in there at all.
This is clearly some bullshit; no one in their right mind actually thinks this is what Medium textures should look like.
It's System Settings. Yesterday the search bar was also over 100 MB; today it's under. DWM is also only 460 MB today; yesterday it went up to 700 MB, but it seems to be dynamic.
(screenshot of per-process VRAM usage)
I am currently at a very taxing part of the game. So far it's been smooth sailing as long as I don't quit to the menu, restart, or fail encounters; that's when performance really goes to shit. But for whatever reason, these new areas in Pittsburgh (which look stunning, btw) are just taxing my GPU to its limits: down to 40 fps after being a locked 60 with just 80% GPU utilization all game at 4K DLSS Quality. Could be a taxing area, could be a VRAM issue, could be a memory leak. I'm just going to wait until they release more patches.
It was a memory leak. I loaded that same section last night, and it was a locked 60 fps at 4K DLSS with room to spare; no drops to 40 fps that I could find. I had been playing for a few hours that day, so the game must have freaked out.
> The game is 30% as taxing compared to console power.

4K DLSS Quality looks better than native 4K, so I wouldn't sweat it. (Also, 4K DLSS Quality has a rendering cost of around native 1700p, and even 4K DLSS Performance has a cost around native 1440p rendering. Let's not forget that, and don't say you get that perf at 1440p; that's important. You can check benchmarks: DLSS upscaling is pretty costly, because some buffers stay at native resolution, which hurts performance compared to native 1440p.)
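To be clear on the numbers: the DLSS modes are fixed per-axis scale factors, so you can compute the internal resolutions yourself. Quick sketch below; the scale factors are the commonly published ones, and the ~1700p "effective cost" I mentioned is my rough estimate of the overhead on top, not something this math produces:

```cpp
// Internal render resolutions for the standard DLSS modes at 4K output.
// Per-axis scale factors (commonly published): Quality = 1/1.5,
// Balanced ~ 0.58, Performance = 1/2, Ultra Performance = 1/3.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;
    struct Mode { const char* name; double scale; };
    const Mode modes[] = {
        {"Quality",           1.0 / 1.5},
        {"Balanced",          0.58},
        {"Performance",       0.5},
        {"Ultra Performance", 1.0 / 3.0},
    };
    for (const Mode& m : modes) {
        // Scaling is per axis, so pixel count shrinks by scale squared.
        printf("%-17s -> %.0f x %.0f internal\n",
               m.name, outW * m.scale, outH * m.scale);
    }
    return 0;
}
```

So "4K DLSS Performance" renders the same 1920x1080 main pass as native 1080p; the extra cost comes from the native-resolution buffers and the upscale pass itself.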
Restarting or failing encounters, hmm, that's a bit odd; I didn't get any problems playing through the regular game. The other FPS drops could be related to the CPU, but yeah, the game is pretty horrible on that front as well. I did notice that if I go restart the same map, it takes a long time to reload it for some reason.
Since you're on Windows 10, you can disable "background applications"; that way, programs will never sit in VRAM when they're turned off.
taskkill /f /im systemsettings.exe
taskkill /f /im dwm.exe
You can use cmd commands like these and execute them automatically before playing a game, too. Most of these applications will restart, just with lower VRAM usage.
If you want truly minimal idle VRAM usage... you can kill explorer.exe and the stuff bundled with it (you can use Task Manager to start Windows Explorer again later):
taskkill /f /im explorer.exe
taskkill /f /im textinputhost.exe
taskkill /f /im searchhost.exe
taskkill /f /im startmenuexperiencehost.exe
taskkill /f /im shellexperiencehost.exe
taskkill /f /im lockapp.exe
taskkill /f /im systemsettings.exe
taskkill /f /im dwm.exe
When you do this, most of the stuff that restarts will stay dead. With this I can achieve 300 MB of idle usage... without it, 500 MB (hence I played at 4K DLSS Performance with the in-game bar showing 7.5 GB VRAM).
I may try testing Pittsburgh later; I'm way ahead of there. Sadly, I play at a locked 40, as my Ryzen 2700 is not up to the task of a locked 50/60. What was your CPU again? If it's a Zen 2 counterpart... sadly, yes, attaining a locked 60 fps in terms of CPU-boundness with Zen 2 CPUs is near impossible, which is why I settled on a locked 40 fps at 4K DLSS Performance.
> It was a memory leak. I loaded that same section last night, and it was a locked 60 fps at 4K DLSS with room to spare...

Are those actual stutters that you can see on a frametime graph, or visual ones you see on screen? Are you playing with a mouse or a gamepad? Have you tried capping with the Nvidia frame limiter? This game has some camera-hitch issues with the mouse.
I did change the volumetrics to medium, among some other settings; I didn't realize I had everything set to ultra aside from textures. It dropped my VRAM usage from 9.1 to 8.5 GB. That said, at native 4K, even though I was getting 50+ fps, just turning the camera after walking around causes a massive stutter. That could be a VRAM bottleneck, because otherwise my GPU has enough power to get close to 55 fps at native 4K.
I am guessing they are doing that horizon-culling thing that Jason Schreier pointed out. If I do a quick turn, they must be loading data from system RAM into VRAM. I had figured that data would always be in VRAM and the GPU would simply not render it, but I guess this game is different.
> Are those actual stutters that you can see on a frametime graph, or visual ones you see on screen? Are you playing with a mouse or a gamepad? Have you tried capping with the Nvidia frame limiter?

It wasn't even a quick turn. I just walked to one end of that level and turned around. Massive drops from 55 fps to 5 fps. I don't have frametime display enabled, and I have no idea how to get MSI Afterburner to display the 1% framerate; I only see the average framerate (the D3D one).
I've been having smooth sailing with a gamepad, but maybe the slow analog movement helps. I will do some quick-turn testing on my end and see what happens (I have to admit I never use quick turning).
> It wasn't even a quick turn. I just walked to one end of that level and turned around. Massive drops from 55 fps to 5 fps.

Yeah, interesting. Can you name the exact encounter so I can give it a try at 115% VRAM usage and see if it reproduces the problem on my end?
I play with a gamepad. This only happens at native settings. DLSS Quality pushes my VRAM requirements to 10 GB, which is right at the game's stated requirement. Native pushes it to 11.3 GB, and that's when I randomly get stutters like crazy. It has to be a VRAM thing, because honestly I wouldn't mind playing this at 40 fps maxed out; my GPU actually ran ultra settings at around 35-40 fps. With a few downgrades here and there, I should be able to run it at native 4K and a locked 40 fps if it were not for my VRAM usage.
It's the Financial District.
Okay, so for starters I had to reduce my VRAM usage to 7.3 GB, because Game Bar recording takes around 200-250 MB. So I had to use native 1440p.
It seems it is going to get worse.
> It wasn't even a quick turn. I just walked to one end of that level and turned around. Massive drops from 55 fps to 5 fps... Native pushes it to 11.3 GB, and that's when I randomly get stutters like crazy.

It's 100% a VRAM issue. Running at native 4K on my 11GB card often completely saturates the PCIe bus, which never happens at 1440p. Instead of just uploading data to the GPU, it then starts copying data back into system RAM, which is what causes performance to plummet. You can also see how it raises the memory used by the game below.
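If you want to verify this on your own setup instead of guessing, you can poll the VRAM budget the OS gives an app versus what's actually in use; exceeding the budget is the point where resources start getting demoted to system RAM over PCIe. A rough sketch using DXGI's QueryVideoMemoryInfo (assumes adapter 0 is your discrete GPU; error handling kept minimal):

```cpp
// Sketch: read the local (on-card) video memory budget vs. current usage.
// When CurrentUsage exceeds Budget, the OS starts paging GPU resources out
// to system RAM over PCIe -- the copy-back behavior described above.
#include <dxgi1_4.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;                 // adapter 0 assumed
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    IDXGIAdapter3* adapter3 = nullptr;                // has the memory API
    if (FAILED(adapter->QueryInterface(IID_PPV_ARGS(&adapter3)))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    printf("VRAM budget: %.2f GB, in use: %.2f GB%s\n",
           info.Budget / 1073741824.0,
           info.CurrentUsage / 1073741824.0,
           info.CurrentUsage > info.Budget ? "  <-- oversubscribed" : "");

    adapter3->Release(); adapter->Release(); factory->Release();
    return 0;
}
```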
How the hell did Sony not put Nixxes on this way before? Like... they bought them for this, and they didn't think it would be good to have their premium porting studio taking care of TLOU? It's honestly baffling to me.
I know Naughty Dog wanted to do it... but at what cost?
> Thanks. Listening now. Always appreciate insights from actual developers.

Someone should forward this to the great Alex to show him how clueless he is in the context of the "VRAM explosion". But of course he would choose to stay in denial due to his fanaticism anyway.
NeoGAF users will still tell you to just reduce the resolution, and 8 GB will magically be enough.
> Someone should forward this to the great Alex to show him how clueless he is in the context of the "VRAM explosion".

It doesn't matter if you link him this video. He obviously knows what's going on, as he speaks to devs and they tell him this, but he chooses to ignore it, because he denied the PS5's decompression and I/O advantage ever since UE5 was demoed with billions of polygons and 8K textures on PS5. He also compared the PS5 and Series X to the 2070 Super and convinced his followers that a midrange PC with a 2070S would cover PC gamers for the generation... and now, as next-gen games are coming out, he has nowhere to run or hide. All he can do now is refuse to agree and keep lying as much as he can to cover his fanaticism and obvious idiocy.

> Probably giving ND too big of a leash, unless the porting studios they acquired were busy working on something else; but like you mentioned, you'd figure this franchise would take priority. I do buy the theory it was rushed to take advantage of the wave of excitement generated by the show.
Supposedly (and I have no idea where I read this) Nixxes is working on Ratchet & Clank after both Spider-Man games.
Sony probably trusted Naughty Dog way too much with this.
I'm sure Naughty Dog themselves must feel like crap right now... but still, it's an unfortunate situation, and once the hype dies, the game won't do great numbers on PC anymore. It's literally at #43 right now, while the Resident Evil 4 remake that came out a few days before it is still at #2.
Hopefully they can do a Horizon Zero Dawn and properly turn it around. They seem very active and are working hard on the updates; just a shame it wasn't done before they sold it for fifty fuckin' squid, innit!