
PC Gamers of GAF: Do You Prefer Increased Resolution w/ Lowered Visual Fidelity or Lower Resolution w/ Increased Visual Fidelity?

Increased Resolution w/ Lower Effects or Lower Resolution w/ Increased Effects


  • Total voters
    79

adamsapple

Or is it just one of Phil's balls in my throat?
Visual fidelity as in lowered texture, effects, vegetation quality, etc. settings.

Something I'm just curious about, to see what more people here do.

I know some of y'all will be tempted to say "why not both," but this is a one-or-the-other kind of question, please.
 

cormack12

Gold Member
You need a baseline for this thread. Like 4k with low, ultralow or medium etc


Generally I always tried to game at native res on the PC and push whatever I could beyond medium/high
 

hinch7

Member
Resolution most of the time. No point in extra effects if it's going to look like playing on a display with Vaseline applied to it.

However, if you have a good upscaler like DLSS, then upscaling from 1080p or above to 4K with path tracing is preferred.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Damn, Adam is really getting ready to jump into PC gaming. Xbox saving one peasant at a time.

Nah, the lack of physical media will always be my biggest deterrent, most of my gaming these days is renting or trading stuff.

Just curious, since more than half the new topics made these days seem to be related to PC gaming, and this is something I've always wanted to know.
 
I'd rather developers not push any device to its limits with tricks. Just give me native resolution with a crystal-clear presentation and design around those constraints.
 

squidilix

Member
When native resolution either performs too badly or is full of jaggies, or TAA is very blurry... I choose lower resolution with DLSS / FSR / XeSS / PSSR, so this one.
 

LectureMaster

Gold Member
Visual fidelity as in lowered texture, effects, vegetation quality, etc. settings.

Something I'm just curious about, to see what more people here do.

I know some of y'all will be tempted to say "why not both," but this is a one-or-the-other kind of question, please.
I'd rather have a high-res Adam with fuzzy textures than a low-res Adam where I couldn't even find the apple.

 

RagnarokIV

Battlebus imprisoning me \m/ >.< \m/
Native (4K for me) and high settings, never chase ultra. Pretty much always use DLSS nowadays.
If performance isn’t good enough I’ll drop shadows down a notch usually.
 

od-chan

Member
This is a weird way to phrase the question, I guess. All the visual effects in the world wouldn't make me play something at 240p. Likewise, in many configurations there's little to no benefit to going above 1080p.

And then, in reality, this just doesn't come up too often. Increasing the resolution in a meaningful way is a huge performance hit, so most people will stick with whatever works for their setup.
 

Topher

Identifies as young
Oftentimes I can turn quality settings down to medium before I even notice a difference. A drop in resolution is more noticeable.

Both take a back seat to frame rate.
 

rm082e

Member
I run games at my monitor's native res, 2560x1440. I don't tolerate games running at less than 60fps. If a game can run at 90+ I'll leave it uncapped and let G-Sync do its thing. If I'm pushing a demanding game where I'm dipping below 60, I'll typically use Riva Tuner to cap the frame rate at 60, then adjust in-game settings as needed to make sure it doesn't dip below that.

I've found that shadow quality is often one of the biggest wastes of GPU power. For example, if there are 4 levels of shadow settings (Low, Medium, High, Ultra), I generally find medium looks the most realistic. The higher levels of shadows tend to draw unrealistically detailed shadows at the expense of GPU performance. I think I did turn the texture quality down from whatever the highest is to the second highest in a couple of games where it was not noticeable.

DLSS is one that I only turn on if the "Quality" setting means it renders the game at 2560x1440 and then upscales to 4k. That's a "nice to have" but I don't ever use it to render at a lower res and upscale to 1440.

Once I start finding games where I have to make significant compromises to maintain 60fps, that's when it's time to upgrade. Currently I don't have anything I'm playing that has that problem with my 3080, but I'm sure it's coming later this year or early next. I'll likely get a 5080 at some point.
 

Zacfoldor

Member
Res to 1440p, framerate to 60fps, then IQ.

Honestly, graphics have hit a wall. Anything over 120fps isn't really necessary, and with upscaling tech, neither is anything over 1440p.

At what point is it not worth it to play these games natively at 4K 120Hz when you could play them on a much, much cheaper device at 4K upscaled 60Hz?

At some point, probably next gen, all of this moves down to IQ. So IQ is really the only path for graphics improvements left. Of course there are many ways to also improve that, but the pond is getting smaller on these graphics boosts. Just look at games today vs 2015.

That's not a diss on PC. Just FYI, the biggest PC advantage is actually user mods, not graphics. Disagree?
 
I prefer the visual fidelity over 4K if it's one over the other.

I'd take 1440p with medium/high settings over 4K with decreased fidelity any day of the week to be honest.
 

Magic Carpet

Gold Member
With my RTX 4070 I had to drop Cyberpunk 2077 down to 1080p to get all the graphics turned up with ray/path tracing and still hit 60 FPS.
I guess I'm willing to drop to 1080p if I get all the bells and whistles.
 

Zathalus

Member
Output resolution at 4K, settings maxed unless you can turn it down for no to little visual loss, DLSS as appropriate to hit 60FPS (be it DLAA down to DLSS performance), then sprinkle frame generation on top. This is for single player games. 240hz locked (if possible) for the occasional competitive multiplayer title.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
My priority is resolution, then come the graphical settings, and in last place, RTX.

With my 4070 Super so far I'm able to play most games at max settings and 4K with DLSS going between quality and balanced. Sometimes I'm able to include RTX, and sometimes I'm not, but for me that's the least valuable setting except in very specific games (Cyberpunk and Metro Exodus).
 

kubricks

Member
1080p/60fps on a computer monitor, 720p/45fps on handheld, then crank as many visual effects into the game as possible as long as the frame rate is stable.

I don't play on a TV, if that matters.
Oh... I almost always lower shadows to medium or low for a better frame rate.
 

dcx4610

Member
Just 1440p and hopefully as high a frame rate as possible. 60fps minimum, but preferably 120-144fps; after that, my eyes can't see it. To me, you can see/feel frame rate far more than you see fidelity improvements from going from something like High to Ultra, unless you're pixel peeping.
 

T4keD0wN

Member
I often still fall prey to my childhood habit of putting post-processing and effects on low the first time I start a game, haha.

But it kind of varies, tbh. I switch between a 4K 120Hz monitor and a 1440p 260Hz one depending on the game, but I almost always use the exact same settings. I'm not really picking between the two; I'm picking a framerate. Lately I'm mostly at 4K 90-120ish, but I'll switch to 1440p if I want to play something fast-paced, and even lower some settings (usually turning on DLSS is enough, tbh) to get more frames. Technically I can use black frame insertion as low as 75fps, but anything below 110 sometimes causes ghosting, so I aim for 150-200ish.
 

Wolzard

Member
Minimum 60 fps. I adjust the graphics to get that at native resolution, in such a way that there's no noticeable degradation of the visuals. DLSS/FSR when it's well implemented.
But most commonly, if a certain game demands more than my GPU can handle, I avoid playing it until I upgrade my GPU.
 

winjer

Gold Member
I prefer a middle ground.
1440p is a good point for good image quality and good performance.
Then I tweak some settings for optimized performance, as there are settings that make little to no difference at ultra compared to high or even medium, while causing a significant performance hit.
 

rodrigolfp

Haptic Gamepads 4 Life
IQ and performance are kings. Resolution is always DLSS Quality, native, or supersampled. So bells and whistles <<<<<<<<<<<
 

poppabk

Cheeks Spread for Digital Only Future
Resolution first.
Then I turn stuff down to get a decent framerate, usually without any noticeable effect to my eyes.
The Last of Us had a nice way of doing settings: it showed you a representative image of the impact of each setting.
 
Good thread.

1080p all the way. I have yet to have issues with my 3060ti from 2021. It maxes out everything I throw at it with that resolution.

Considering I won't upgrade my monitors or TVs, and am still on 1080p sets, of course I'm going to say visuals over resolution. (I'm guessing people who can afford $1k+ 4K monitors and RTX 4090 cards to run them aren't going to agree.)
That means better textures and more particles from the GPU, plus more headroom on the CPU for more environmental interactivity, better AI, etc.

After 1080p it's kinda not needed, especially for a computer monitor, imo.
Even on my TV. I have 4 retro modded mini consoles (SNES, Genesis, TG16, and PS Classic), plus a PS2, Wii, Switch, Steam Link, PS5, PS3, and XSX.
If I had 4K, those older systems would look like ass, and I wouldn't have the hookups. So the TV stays, since it works and looks fine for a 55" 1080p set.
Hell, I even watch DVDs on our VCR/DVD combo unit (it upscales to 1080p nicely, either that or the LG set is doing it).

Also, I think frame rates over 60fps are redundant. My PC monitor has a 140Hz refresh rate, but I put a frame cap on whenever I can.
Some early access titles don't have frame caps or v-sync, which can make your GPU run super hot. I don't want my office turning into a sauna.

Give me better visual fidelity over resolution any day. (And no, I'm not talking about 240p; 1080p is the sweet spot, but even native 720p on the Switch is fine too.)

4K was pushed out too early, just something for TV companies to sell you. We're not there yet when game content needs to be upscaled to hit 60fps.
 

Denton

Member
Usually I use 1620p DLSS Quality or native, but if 60fps is too much for max details, I can go down to 1440p DLSS Balanced; if that's not enough, then I start lowering other settings.
 

PeteBull

Member
Voted visual settings, but there should be a 3rd option, a healthy balance, since you'll never be able to appreciate high visual fidelity at a very low resolution, and a super high res won't matter too much if graphical fidelity is crappy.
Here's one example: the super impressive Matrix demo on Series S, with a terrible native res (upscaled to 1080p, but from an internal 1280x533 ;)

Another example: the super impressive "almost a game" aka Hellblade 2, which runs at 1536x643 on average on Series S:


And here's a 3rd example, and I think it's the fairest comparison:
DA:V running on an old trusty GTX 970, a very popular Maxwell card from back in the day (medium settings, upscaled to 1080p, FSR Ultra Performance):

Vs the now over-10-year-old DA:I running on the same GTX 970, but maxed out at 1080p:


In that case you can clearly see that IQ is so badly destroyed by FSR Ultra Performance at 1080p in DA:V that the 10-year-old previous entry in the DA series, running on the same GPU, looks vastly superior at native 1080p.
 

lachesis

Member
I noticed that I'm actually okay with 1080p or 1440p - even on my 65" 4k TV.... so I guess better visual fidelity over resolution past 1080p.
 

Stuart360

Member
I focus more on changing whatever needs to be changed to get 60fps.

I also usually game at 1080p or 1440p, whichever gets me 60fps.

Also, games these days can still look really good on low settings (not that I ever need to do that anyway) as long as you have textures maxed. Textures are by far the biggest factor when it comes to good graphics.
 
4k + highest textures first
then volumetric lighting, AA, and AO if available


everything else comes after
no super-low-res shadows allowed though, so distracting
 
Native for my monitor (1440p), then adjust settings to get an acceptable frame rate. I thought that's what everyone did...
 