Why does your mother prefer anal over veganal?
Quite a filthy woman if you ask me.
Because that's hot
DLAA, next question.
>dlss is everywhere but only a handful of games have dlaa
Not fair.
God I hate censorship.
deferred rendering.
STALKER uses deferred rendering but still has MSAA
/thread
You can always disable it, what you crying about?
>inb4 muh game can't disable it
then it's a bad game and you have bad taste, simple as.
>Why is anti-aliasing shit now?
>UHHHH YOU CAN JUST HAVE NO ANTI-ALIASING INSTEAD, IDIOT
High-IQ post
I thought you didn't like ANY anti-aliasing. simmer down.
Idk, MSAAx2 is soft enough to keep the edges sharp without the whole picture looking like a crusty jpg
>UHHHH YOU CAN JUST HAVE NO ANTI-ALIASING INSTEAD, IDIOT
I mean, based on what we know optically, yeah.
You can just use a 27inch 4k display at the average distance on a desk.
Or you can use a ~60 inch display at a distance of about 8 feet.
Antialiasing is basically unnecessary at that point.
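The viewing-distance claim above can be sanity-checked with a little trigonometry. A common rule of thumb is that past roughly 60 pixels per degree of visual angle (about 20/20 acuity), individual pixels and stair-step edges stop being resolvable. A rough sketch for a 16:9 panel viewed head-on (the function name and the 60-PPD cutoff are my assumptions, not a standard):

```python
import math

def pixels_per_degree(diagonal_in, res_w, distance_in):
    """Angular pixel density for a 16:9 display viewed head-on."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)       # physical panel width
    ppi = res_w / width_in                                # pixels per inch
    # inches subtended by one degree of visual angle at this distance, times ppi
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

desk = pixels_per_degree(27, 3840, 24)    # 27" 4K at ~24 inches: ~68 PPD
couch = pixels_per_degree(60, 3840, 96)   # 60" 4K at 8 feet: ~123 PPD
```

Both setups land above the ~60 PPD ballpark, which is the optical basis for the "AA is basically unnecessary" claim.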
top-end GPUs struggle to run modern games at 720p upscaled
Sucks for people who play shit games then.
>Why doesn't good anti aliasing exist anymore?
a lot of games have render scale sliders, just bump it up to 150% or so and turn off whatever post-process garbage
does that mean 200% render scale = 2x SSAA? mind blown
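One nitpick on the math: render scale applies per axis, so 200% is 4x the samples per output pixel (closer to ordered-grid 4x SSAA), not 2x. A small sketch, with names of my own invention:

```python
def supersample_resolution(base_w, base_h, render_scale_pct):
    """Render scale is applied per axis, so sample count grows with its square."""
    s = render_scale_pct / 100
    w, h = round(base_w * s), round(base_h * s)
    samples_per_pixel = s * s
    return w, h, samples_per_pixel

# 150% on a 1080p target renders at 2880x1620, ~2.25 samples per output pixel.
# 200% renders at 3840x2160, i.e. 4 samples per output pixel.
```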
Setting the render resolution higher than your physical display means it'll be downscaled to fit, presumably using resampling like bilinear, bicubic, or Lanczos.
A resampled downscale of a render without AA will probably look better than a native render with AA, but your GPU will need adequate VRAM to make it work.
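The downscale step described above can be sketched in a few lines. Below is a toy box-filter resampler over a flat grayscale pixel list (`box_downscale` is a hypothetical name; real drivers and engines use fancier kernels like the bilinear, bicubic, or Lanczos ones mentioned above):

```python
def box_downscale(pixels, w, h, factor):
    """Average factor x factor blocks -- the simplest resampling kernel.
    Even this turns an integer-supersampled render into ordered-grid SSAA."""
    out = []
    for oy in range(h // factor):
        for ox in range(w // factor):
            acc = 0
            for dy in range(factor):
                for dx in range(factor):
                    acc += pixels[(oy * factor + dy) * w + (ox * factor + dx)]
            out.append(acc / (factor * factor))
    return out

# A hard black/white 2x2 edge averages to a single in-between pixel,
# which is exactly the edge-softening that supersampling buys you.
box_downscale([0, 255, 255, 0], 2, 2, 2)
```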
>but your GPU will need adequate VRAM to make it work.
lmao 8gb
i swear championing upscaling as a method of AA was just to hamstring VRAM as long as possible
Now that Nvidia tech is driving the AI revolution and VRAM is the major limiting factor in loading and running machine learning models (or generating higher-resolution images), we're gonna see them price-gouge the higher-VRAM models and give less VRAM to the mid-range gaming models.
I've got an RTX 3070 8GB and dabble in AI stuff, but I like gaming at 4K and I think I'm just under the minimum requirement for smoothest performance.
>we're gonna see them price-gouge the higher-VRAM models and give less VRAM to the mid-range gaming models.
The 4060 is already doing that and it won't change in the future. Of course Ganker won't even entertain looking at radeon or, heaven forbid, arc, so this place will continue to praise getting shafted by their greatest ally.
>greatest ally
That would be Arc
>RTX 3070 8GB
>I like gaming at 4K
>I think I'm just under the minimum requirement for smoothest performance.
World of tanks has old school good antialiasing options.
because 4k exists and at that point the pixel density on a typical monitor is so high you don't need antialiasing anymore
not everyone can or wants to play in 4k. i prefer 480p personally, and there's probably plenty of people with a preference for 720p or 1080p
forgot to mention, but the solution imo is just to have multiple options when it comes to anti aliasing. having more options is never a bad thing
You play at a resolution lower than your screen?
i use a 4:3 computer. i prefer using 4:3 shit way more than 16:9. 16:9 is way too wide imo
4K 60fps is much more "expensive" performance-wise than 1080p 60fps with subpixel morphological AA (SMAA) or 1080p 60fps with msaa.
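The cost comparison above is easy to make concrete: native 4K shades four times the pixels of 1080p, while MSAA at 1080p only adds extra coverage/depth samples at triangle edges rather than full shading work. A back-of-the-envelope sketch (toy numbers, not a benchmark):

```python
def shaded_pixels(w, h):
    """Pixels the fragment/pixel shader must fill each frame."""
    return w * h

ratio = shaded_pixels(3840, 2160) / shaded_pixels(1920, 1080)
# Native 4K rasterizes 4x the pixels of 1080p, so at equal per-pixel cost
# you'd expect roughly a quarter of the frame rate before other bottlenecks.
# MSAA or SMAA on top of 1080p adds far less than that 4x shading work.
```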
Frick anti aliasing.
Into the trash can it goes.
Deferred rendering means that MSAA is functionally impossible, and the only other forms of antialiasing are shitty postprocessing approximations
This is the answer. No one wants to bother trying to implement MSAA with deferred rendering when Nvidia is actively selling everyone on just layering postprocessing effects on top of each other to pretend that their cards can hit 4k 60FPS with RTX enabled
>upscale from 720p
>blur everything to hide aliasing
>use a sharpening filter to compensate
Here's your high fidelity vidya, bro
It's revolting that that's the absolute state of graphical innovation. Thankfully we have indie shit to be more interesting.
It does. It's on PC. It's called downsampling or supersampling. You can do it even on games that don't support it officially with simple custom resolutions in Windows.
but then games will run at 12fps (or less!) on anon's super powerful RTX 2060 and that makes people angry
>back to relying on shitty SSAA
sasuga, modern devs
SSAA is the one true AA method and people get triggered by this fact.
frick anti aliasing
Not a graphicsgay but I've noticed the same shit and it always looks garbage even to someone like me who seldom cares about that kinda shit. Especially when it leaves visible trails.
I hate that the main reason to consider upgrading for me nowadays is supersampling games with terrible AA that don't even look that good but aren't well optimized either, so you need a high-end card to do it anyway.
Meanwhile shit with cutting edge grafix is hardly worth playing.
For me it's AI, new games are either dogshit or easy to run.
I was going to mention SD as well but decided not to.
And that's the thing, even if they seem easy to run, I try upscaling to deal with the aliasing and can't hold 60 frames anymore.
DLSS and DLAA
Even 4k has jaggies. I want a gpu capable of 8k/60 (even 40-50 would be fine) and forget about AA.
true story
I agree with this picture. I have to do this with my cell phone every once in a while. This picture also demonstrates the behavior of what most should do with their life.
TAA is cheap and effective. You're complaining for attention. You have no real opinion.
It might be me but I like good TAA without ghosting. Zero jaggies and looks miles better than SMAA or FXAA.
You are blind.
>Why doesn't good anti aliasing exist anymore?
It does, it's just never the default setting
i played the bf2042 free weekend and you can't disable TAA in that game. it looks like a blurry mess. i get close to my screen when i see an enemy far away but the image doesn't get any sharper, he's just blurred to frick by TAA. can't turn it off either. frick TAA. i hate TAA.
Oh look, another thread where gays pretend they can notice the difference with [insert meme graphical setting] being turned On/Off to justify wasting $5000 on hardware.
Because most people who actually play games and aren't thirdworlders use a 4k display, which makes AA unnecessary.