He's referring to the low texture resolution of the shrubbery. I'm no AA expert; does AA get applied to the edges of textures?
What do you mean, ignore the shrubbery? Aliasing is aliasing, man, however it's caused.
I answered that question earlier in the thread, though the case of alpha-tested textures is entirely different again from what I explained there.
About 3 million pixels short, actually.
Then you don't know what to look for.
Aliasing is aliasing, man, however it's caused. You can't just say, "Oh, those jaggies don't count."
It depends how tolerant you are to aliasing in my opinion.
At 1440p, I'm alright with only SMAA. It doesn't clear 100% of it, but the only time I notice it is when I'm standing still. By that account, I imagine I'd be fine with no AA at 4K.
I agree; that's why it crossed my mind that 4K on an "average"-sized monitor at an "average" sitting distance may not need AA at all.
I was thinking more about the grass there.
Maybe I didn't express myself clearly. I meant that your Dolphin shot (rendered at 2560x2112 with 2x MSAA) has less edge aliasing than it would have had it been rendered at 4K without any AA. And I think the aliasing is pretty noticeable in your shots, at least in the second one, and that's not even the worst case.
I was thinking more about the grass there.
Besides, there's plenty of aliasing outside that.
How is that possible? Wouldn't smaller pixels create less aliasing than "edge blurring" (I know that's an oversimplification)?
Blurring reduces high-frequency information, hence reducing the need for anti-aliasing.
Shouldn't smaller pixels create less aliasing than downsampling from the same resolution? In the end we're still dealing with a 1080p output, meaning the pixels can very well be visible when viewed up close, whereas a 4K monitor's native resolution should mitigate that to a greater degree.
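To make the "downsampling = blurring away high frequencies" point concrete, here's a minimal NumPy sketch (a toy 45-degree edge; all names are illustrative, not from any real renderer). Supersampling at 2x and then box-filtering each 2x2 block turns hard 0/1 jaggies into fractional coverage values:

```python
import numpy as np

def render_edge(size):
    """Toy render: 1.0 above a diagonal edge, 0.0 below it."""
    y, x = np.mgrid[0:size, 0:size]
    return (x > y).astype(float)

hi = render_edge(8)                             # 8x8 supersampled render
lo = hi.reshape(4, 2, 4, 2).mean(axis=(1, 3))   # 2x2 box-filter downsample to 4x4

native = render_edge(4)                          # native 4x4 render, no AA
print(sorted(set(native.ravel())))               # [0.0, 1.0] -> hard jaggies only
print(sorted(set(lo.ravel())))                   # [0.0, 0.25, 1.0] -> edge smoothed
```

The averaging is exactly the low-pass filter being described: detail finer than one output pixel survives only as an intermediate grey level, not as a stair-step.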
My 1080p 4.8" screen makes jaggies almost invisible. It's a question of PPI.
At 1080p I could already live without AA; I don't like a soft picture.
Good AA really doesn't "soften" the picture in the way you're probably describing...
If he's using the "wrong" type of AA it does.
As I said earlier in the thread, what it really should be is pixels per degree of the field of view.
PPDFOV if you will.
4K is not nearly a high enough resolution to make AA redundant. Also, using still images, especially of old games with extremely simple graphics, is not a smart way to judge the situation. As detail gets finer and poly counts continue to increase, aliasing gets worse.
Really? According to this handy PPD calculator I just found, AA is more than likely not needed on a 22" screen @ 4K from a viewing distance of 2 feet.
http://phrogz.net/tmp/ScreenDens2In.html
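For anyone curious what such a calculator is doing under the hood, here's a sketch of the pixels-per-degree math (my reconstruction, not the linked page's actual code; it assumes a flat 16:9 screen and measures at the screen center). The common rule of thumb is ~60 PPD for 20/20 acuity, i.e., one pixel per arcminute:

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    """PPD at the center of a flat screen viewed head-on (illustrative helper)."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)  # physical width
    ppi = horiz_px / width_in                                  # pixels per inch
    # inches subtended by one degree of view at this distance
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# 22" 4K screen viewed from 2 feet (24"):
ppd = pixels_per_degree(22, 3840, 2160, 24)
print(round(ppd, 1))  # ~83.9 pixels per degree
```

That lands well above the 60 PPD rule of thumb, which is why the calculator says AA is "not needed" there, though as posts below note, specular and subpixel shimmer can remain visible beyond that threshold.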
I could probably see specular and subpixel aliasing with that. Most people probably could.
The shimmer and snap-crackle-pop of whitish pixels is pretty obvious.
Edit: I plugged in the wrong numbers. AA may be needed at 2 feet only in medium- and high-contrast areas, and is unneeded at 4.
My own eyes and what I see tell me that's wrong.
And the point where it mostly isn't is at around 8k.
I plugged in the wrong initial numbers; AA is needed at 4K in medium- and high-contrast areas (which is a lot of places).
And the point where it mostly isn't is at around 8k.
Which is what I wrote (for a screen) in the first reply to this thread!
This is Dark Souls 2 @ 5K downsampled with ultra SMAA. You can still see aliasing on the tree on the right. Even 8K doesn't completely eliminate it.
That's downsampled to 1080p though.
...Yeah?
Well, first, that tree's branches are only one pixel wide there; unless you increase pixel density, that tree will stay aliased.
Second, I'm referring to 4K monitors in the OP, not 4K-and-up downsampled.
You should make that clear then! You only mention resolution in the OP. I'm curious, though: is there even a difference in aliasing between a 4K image downsampled to a 1080p display and a 4K image on a native 4K display?
That's my thought: you'll still need it, but you don't need as much, because it does get harder to notice the jaggies, and so it takes less to make them effectively vanish.
I wouldn't say unneeded, but you won't need very heavy, performance-killing forms, that's for sure.
Absolutely (depending on your visual acuity and how much of your FoV the screen in question fills).
No.
As for when it becomes completely unnecessary, probably around 32k resolution per eye for a full FoV. 8k could be enough for a typical screen FoV, though it probably wouldn't eliminate shimmering completely.
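A back-of-the-envelope check on that "32k per eye" figure, under two loudly flagged assumptions of mine (not the poster's stated numbers): roughly 200 degrees of horizontal view once eye rotation is included, and that killing shimmer entirely takes around 160 PPD rather than the 60 PPD 20/20 rule of thumb:

```python
# Horizontal pixels needed = field of view (degrees) * pixels per degree.
# FOV_DEG and the 160 PPD shimmer target are assumptions for illustration.
FOV_DEG = 200
for ppd in (60, 160):
    print(f"{ppd} PPD -> {FOV_DEG * ppd} horizontal pixels")
# 60 PPD -> 12000 horizontal pixels
# 160 PPD -> 32000 horizontal pixels, i.e. roughly "32k"
```

So the estimate is at least self-consistent: bare 20/20 acuity over a full FoV already demands ~12k, and a shimmer-free margin a few times that lands in 32k territory.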
Close the thread XD
More high-frequency detail, including false detail (aliasing).
Yeah, after thinking about it, of course there's a difference. Higher pixel density would obviously allow for more detail in stuff like foliage.
I just got a 4K TV, and no, jaggies aren't eliminated by the higher res. It's entirely dependent on the game, seating distance, etc., though I can live with no AA most of the time, to be honest.
I feel bad for you