Kagey K
> What in the hell are you on about mate? That made no sense…

It's self explanatory if you step back for a second and think about it.
> It's self explanatory if you step back for a second and think about it.

It's incoherent. If it's something about shortages, both PC and console hardware are affected.
> It's incoherent. If it's something about shortages both pc and console hardware are affected.

I'm going to guess, due to the reply time, that you didn't actually think about it.
> I don't really care that much about resolution, in all honesty.

These answers should be accompanied with the poster's screen size and viewing distance.
> These answers should be accompanied with the poster's screen size and viewing distance.

I'm not saying I don't notice the difference between resolutions, I just don't really care.
> So you say that Minecraft at 8K is graphically better than, say, Crysis at 1080p? That's just nonsense. Resolution just adds clarity, but clarity is just one of the aspects of graphic quality, and definitely not the most important.

Pretty sure he was being sarcastic, hence the "amirite?"
> I'm going to guess due to reply time, you didn't actually think about it.

Did I say I was ditching PlayStation for PC because of money…?
It's not about shortages, but costs in general.
A $500 console = a $1,500 PC, so basically the same.
Most PCs are close to PS2 now (obvious exaggeration), yet everyone runs around screaming about them.
> I'm not saying I don't notice between resolutions, I just don't really care.

But can you share your TV size and viewing distance, if it's not a problem for you? I'm sorry if it sounds rude. It's just that, for me, 2K felt totally enough until I got a bigger TV. So even if you think you don't care, maybe it's because you don't need to care about something that isn't really noticeable to you.
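The screen-size/viewing-distance point can be made concrete with a back-of-the-envelope acuity calculation. This is only a sketch: the ~60 pixels-per-degree figure for 20/20 vision and the 16:9 helper function below are my own assumptions, not something from the thread.

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m, aspect=16 / 9):
    """Approximate pixels per degree of visual angle for a 16:9 panel."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width, inches
    distance_in = distance_m / 0.0254                        # metres -> inches
    px_per_inch = horizontal_px / width_in
    # physical width of one degree of visual angle at that distance
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inch_per_degree

# 20/20 vision resolves roughly 60 px/degree; above that, extra pixels
# contribute little. A 55" TV viewed from 2 m, as debated in the thread:
print(round(pixels_per_degree(55, 1920, 2.0)))  # 1080p
print(round(pixels_per_degree(55, 3840, 2.0)))  # 4K
```

On these assumptions, 1080p at 55"/2 m lands just under the ~60 px/degree threshold while 4K is well past it, which is consistent with both sides here: the difference is visible if you look for it, but marginal in casual viewing.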
And would have been happier for it. 4K is already a waste of resources. The resolution race should have stopped once it went beyond 1080p.
Who are the blind cunts that cannot tell the difference between 1080p and 4K at 2m ???
> Who are the blind cunts that cannot tell the difference between 1080p and 4K at 2m ???

Yeah bro lmao. While it's not as obvious, I can clearly tell the difference between 1080p and 4K on the new Uncharted collection, and I'm just over 2 m away.
> Who are the blind cunts that cannot tell the difference between 1080p and 4K at 2m ???

On a hypothetical identical panel/electronics (minus pixel count, obviously) reproducing the same identical content, perfectly encoded/lossless, one 1080p and the other 4K?
> absolute nonsense. 1080p might look ok if your screen is below 40", but with modern 55"+ TVs the pixel density alone would look like ass.
> your argument might work if we never switched from CRT technology to fixed-resolution screens with a visible pixel grid, but we did, and those don't look great at low resolutions.
> 4K is IMO the sweet spot. first of all you don't see the pixel grid easily at normal distances, and secondly games running at or around 2160p look sharp to the eye and, with proper treatment, also look indistinguishable from native res even when they are slightly below 2160p.
> I remember playing 900p games on my old 40" 1080p TV, and even tho 900p is not that far below 1080p, it looked like utter shit.
> meanwhile 1800p on a 2160p screen looks almost perfect, even tho it's about the same percentage difference as the 900p-to-1080p example.
> so 4K TVs finally almost eliminate the issue of the fixed pixel grid being extremely obvious when running sub-native games, while render resolutions at or slightly below 4K look clean even on the big TVs of today.
> 1440p is the lower bound of what looks good on a big screen, but only with proper antialiasing.

I think you're arguing from the perspective of a console gamer. I sit 40 cm away from an 11.6'' 1080p screen and have absolutely no need for higher pixel density for gaming. Especially not when reaching that higher pixel density, or higher resolutions, or the screen size at which that higher resolution matters, or the hardware power level at which it all becomes even marginally worth it, is a self-perpetuating money sink that is impossible to get out of once you're caught in it.
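For what it's worth, the 900p-on-1080p and 1800p-on-2160p comparisons in the quote really are the same scale factor; what differs is how fine the physical pixel grid doing the upscaling is. A quick sketch (screen sizes taken from the posts; the helper function is hypothetical):

```python
import math

def pixel_pitch_mm(diagonal_in, horizontal_px, aspect=16 / 9):
    """Physical width of one pixel (mm) on a 16:9 panel."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return width_in * 25.4 / horizontal_px

# Identical render-to-native scale factors...
print(900 / 1080, 1800 / 2160)  # both 0.8333...

# ...but the 4K grid is much finer, so scaling artifacts are physically smaller.
print(round(pixel_pitch_mm(40, 1920), 2))  # 40" 1080p: ~0.46 mm per pixel
print(round(pixel_pitch_mm(55, 3840), 2))  # 55" 4K:    ~0.32 mm per pixel
```

Same 83% ratio either way; the 4K panel just hides the grid (and the upscale) far better at any given distance.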
> I was already overwhelmed by how good PS3 games looked...

I was blown away by the image quality of Lethal Weapon 4 on DVD in 1999.
> pretty sure he was being sarcastic, hence the "amirite?"

I hope so, otherwise there's some confusion.
> I'd take higher LOD, framerates and draw distances every day of the week.

And better shadows... gimme good shadows in modern games, ffs. Nvidia's PCSS was so damn transformative and sublime in every game it was in, wtf happened?
> I was blown away by the image quality of Lethal Weapon 4 on DVD in 1999.

Of course you were, you had a CRT at the time.
> On a hypothetical identical panel/electronics (minus pixel count, obviously) reproducing the same identical content, perfectly encoded/lossless, one 1080p and the other 4K?

Everyone.
> Yeah bro lmao. While it's not as obvious, I can clearly tell the difference between 1080p and 4K on the new Uncharted collection, and I'm just over 2 m away.

That's because:
1 - The two modes are drastically different in how they are rendered, picture-quality-wise, to begin with.
2 - You are judging 1080p content on a 4K display.
720p signals look soft and blurry on a 1080p display of considerable size, let alone on a 4K one. On a 50'' Pioneer 506 XDE, the same 720p is so sharp you could cut yourself.