gnomish marketing trick
a very bleak future of video games
Tool for devs too lazy (i.e. incompetent) to bother with optimization.
Should be openly ridiculed and discouraged.
An amazing tool for both lazy developers and happy hardware merchants.
There's two kinds of people in the world:
>those that like DLSS
>those that can't use it
It looks like shit.
Is that RDR2? That game is the one exception where DLSS actually looks better than native, because of the devs' terrible implementation of TAA and the inability to disable it without enabling DLSS (at which point you can use DLAA instead).
FSR is the worst of both worlds because it drops the resolution while also keeping the god-awful TAA.
Every new game has a "terrible implementation of TAA" lol.
True. What a terrible tech, I want MSAA back. I would rather deal with crisp PS2 graphics than the blurry shit we have today.
MSAA doesn't work with modern rendering. Do some research.
It does, it's just not as performant
Though it would have less of a performance impact than gay tracing
MSAA also only takes care of edge aliasing. It doesn't cover anything else, like shader aliasing, texture aliasing, specular aliasing, unless assets are made specifically to avoid all of them, like Valve did with HL Alyx.
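To make that concrete, here's a toy numpy sketch (purely illustrative, not from any real renderer): coverage is tested at several sub-samples per pixel the way MSAA does, but the shading function only runs once at the pixel centre, so a high-frequency "specular" term stays aliased while the geometry edge gets smoothed. True supersampling (SSAA) shades every sub-sample and filters both.

```python
import numpy as np

# Toy 1D scanline showing why MSAA only fixes geometry edges:
# coverage is evaluated per sub-sample, but shading runs once per pixel.
PIXELS = 16          # pixels along the scanline
SUBSAMPLES = 8       # MSAA-style sub-samples per pixel

def inside_triangle(x):
    """Geometry test: the 'triangle' covers everything left of x = 7.37."""
    return x < 7.37

def shade(x):
    """High-frequency 'specular' term that changes faster than one pixel."""
    return 0.5 + 0.5 * np.sin(20.0 * x)

centers = np.arange(PIXELS) + 0.5
offsets = (np.arange(SUBSAMPLES) + 0.5) / SUBSAMPLES   # sub-sample positions

msaa, ssaa = [], []
for c in centers:
    subs = (c - 0.5) + offsets
    coverage = inside_triangle(subs).mean()             # fraction of covered samples
    msaa.append(shade(c) * coverage)                     # one shading sample per pixel
    ssaa.append(np.mean(shade(subs) * inside_triangle(subs)))  # shade every sample

print("MSAA:", np.round(msaa, 2))   # edge pixel is smoothed, interior shading still aliases
print("SSAA:", np.round(ssaa, 2))   # both the edge and the shading get filtered
```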
You can slap some SMAA on top and it'll still look less blurry than TAA
And it's not like TAA actually completely eliminates jaggies even at 4K, even if it does technically have better coverage.
Just use an edge detection post processing solution like FXAA or even TAA alongside MSAA, then. That’s what old games did.
It does, it’s just harder to implement with clustered forward rendering.
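To put a rough number on the "not as performant" part, here's some back-of-the-envelope framebuffer arithmetic for MSAA at 4K. The byte sizes assume an RGBA16F HDR color target plus 32-bit depth; real engines vary, so treat the figures as illustrative only.

```python
# Rough render-target sizes for MSAA at 4K (illustrative formats).
WIDTH, HEIGHT = 3840, 2160
BYTES_COLOR = 8          # RGBA16F HDR color target
BYTES_DEPTH = 4          # 32-bit depth

def target_mib(samples: int) -> float:
    """Memory for one color + one depth target with `samples` per pixel."""
    per_pixel = samples * (BYTES_COLOR + BYTES_DEPTH)
    return WIDTH * HEIGHT * per_pixel / (1024 ** 2)

for s in (1, 2, 4, 8):
    print(f"{s}x MSAA: {target_mib(s):7.1f} MiB for color + depth")
# 1x ~ 95 MiB, 4x ~ 380 MiB, 8x ~ 759 MiB -- before any additional
# intermediate or post-process targets, which is part of why modern
# engines tend to skip MSAA even when it technically works.
```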
RDR2 looks fine with TAA turned off.
You can only disable it with mods and it breaks certain things in the game. And your pic still looks blurry to me.
>You can only disable it with mods
Wrong, you can just turn it off in the settings
>it breaks certain things in the game
Foliage doesn't get AA if you enable MSAA, but who cares, because at that point you're trying to apply AA to single-pixel details. It's a non-issue.
>This wasn't possible until the PC version btw.
consoles can't use DLSS anyway.
What's the matter, chud? Scared of details?
The biggest complaint here is the raised system requirements, so doing a full 180 and rendering everything at native 4K just to shit-talk DLSS is a braindead move.
A 7800 XT can do RDR2 at native 4K ultra; it's not always a stable 60 fps, but you can just turn some settings down to high and not notice any quality loss.
Native 4K is possible even on $500 midrange cards.
>with TAA turned off.
This wasn't possible until the PC version btw.
>4K screenshot
>That game is the one exception where DLSS actually looks better than native
Native looks better. But I would use DLSS if it worked in windowed mode.
DLSS here is the only one that fixes Bethesda's shitty SSAO, which clearly uses dithering.
SMAA doesn't take care of those other mentioned aliasing types either, unless you mean SMAA T1x/T2x, which are literally SMAA with a temporal component.
meant to
SMAA does help cover transparencies when an MSAA implementation doesn't, and I would take MSAA's lesser coverage with a crisp image any day over making the whole screen blurred with TAA, and if you dare to turn TAA off then you get an unplayable shimmer fest.
>and if you dare to turn TAA off then you get an unplayable shimmer fest.
TAA allows for undesampling of certain effects. What it means is that by using cheaper effects and temporal component they can recreate more complicated effects, which saves on performance.
Like you can recreate alpha transparency by using dithering, or shadow edges are also smoothed out by using dithering.
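A minimal sketch of that dither-plus-accumulation idea (the 4x4 Bayer matrix, the 0.3 alpha, and the phase-cycling stand-in for TAA's jitter are all illustrative assumptions, not any engine's actual code): one frame on its own is a hard on/off mask, and only the accumulated history looks like real transparency, which is exactly why turning TAA off leaves you with shimmering noise.

```python
import numpy as np

# 4x4 Bayer ordered-dither thresholds in [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

ALPHA = 0.3   # the transparency we want to fake

# What a single frame actually renders: a hard covered/not-covered mask.
single_frame = (ALPHA > BAYER4).astype(int)

# Stand-in for TAA: cycle the dither phase each frame (real TAA jitters the
# camera sub-pixel offset and reprojects) and average the history.
history = np.zeros((4, 4))
shifts = [(dx, dy) for dx in range(4) for dy in range(4)]
for n, shift in enumerate(shifts, start=1):
    thresholds = np.roll(BAYER4, shift, axis=(0, 1))
    binary = (ALPHA > thresholds).astype(float)
    history += (binary - history) / n   # incremental mean = accumulated history

print("TAA off (one frame):")
print(single_frame)                     # hard 0/1 noise -> shimmer in motion
print("with temporal accumulation:")
print(np.round(history, 3))             # every pixel settles at 5/16 = 0.3125,
                                        # i.e. smooth-looking ~30% transparency
```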
>undesampling
*undersampling
Which effectively means what I just said: devs build effects and assets around it, so when you turn it off the game looks like complete shit. It's effectively forced on in most games, and you get a blurred screen to go with it.
FSR2 doesn't use AI. It uses more traditional temporal methods.
btw, DLSS can run on non-RTX GPUs, Nvidia just chose not to allow it. DLSS 1.9, which was essentially close to what we're using right now in terms of performance profile, was made specifically to run on shaders alone instead of the RTX-only tensor cores.
Copium.
cancer
fake resolution for fake females in this fake ass society
i remember when graphics were crispy sharp, 1080p was actually 1080p, and the game ran fine
now devs develop "with upscaling in mind" (to quote the Remnant 2 devs), 1080p is memed down to 720p, and everything is a blurry, shimmering mess. oh, no problem bro, just put sharpening on top of it to make the image even less clear and filled with visual noise everywhere
it could be good if the upscaler reacted fast enough to be unnoticeable and didn't leave dogshit "flags" when, for example, snowflakes fall down and end up looking like fucking sperm
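For reference, the "1080p is memed down to 720p" part is just the usual DLSS 2 per-axis scale factors, worked out below. Games can override these, so treat the numbers as common defaults rather than gospel.

```python
# Internal render resolution per DLSS 2 mode (typical per-axis scale factors;
# individual games can and do override them).
MODES = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int) -> None:
    for name, scale in MODES.items():
        print(f"{out_w}x{out_h} {name:>17}: renders at "
              f"{round(out_w * scale)}x{round(out_h * scale)}")

internal_res(1920, 1080)   # Quality mode -> 1280x720, hence "1080p is really 720p"
internal_res(3840, 2160)   # Quality mode -> 2560x1440
```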
Nothing is stopping you from buying a 4090.
I always thought the piss filter era was as bad as it could get
Not anymore, now we're in the upscaling slop era
an amazing technology especially for handhelds
only adhd morons who notice negligible pixel-sized artifacts are complaining
That it's going to become fucking worthless when they start going "oh it has DLSS, we don't need to optimize the game lmao"
"tool for lazy devs" is getting old. Can you make something new? Nobody runs DLSS on their native res. Games are blurry alrdy and doing that makes them even blurrier. Run the game in higher res, turn on upscaling, and now you get better image with same performance as in native. Nothing to hate on about. It's modern day anti-aliasing.
>the truth is getting old
nothing we can do about it
Why shouldn’t you just use native + DLAA instead?
You can get the same results with DLSS. It's like DLAA but with more choice.
I usually enable DLSS upscaling. It looks alright 95% of the time but it definitely has its quirks.
those are just comic book lines to show you that it's moving.
H-he's fast
I noticed this problem too, but I fixed it by swapping in a much newer DLSS DLL than the one the game ships with, and by using DLSSTweaks to set the DLSS preset to F for Quality mode; then this weird ghosting shit doesn't happen anymore.
>AI upscaling
better upscaling but not a replacement for native res. DLSS mogs FSR and XeSS in image quality.
>frame generation
can create artifacts with repetitive motions, otherwise pretty good if you already have decent framerates
>it's good if you don't need it
That's really what it comes down to.
What it was supposed to do
>turn your 1080p/1440p mid tier GPU into 4K capable GPU
What it does now
>allow you to run the game at all at 1080p
We're literally going backwards in resolution for the first time, compared to the previous generation.
>What it was supposed to do
>turn your 1080p/1440p mid tier GPU into 4K capable GPU
I don't think that was ever the original intent. Why would hardware corpos give mid-tier GPUs a high-end-like boost for free?
Original intent was to make ray tracing performance better, but an extra boost to graphics also became a selling point. Remember that they're going to keep making new DLSS features that are exclusive to the new series, like how only the 4000 series can do frame generation.
To make 4K actually viable.
As cheap as it was, the PS4 Pro actually did a decent job with 4K (at 30 fps, obviously) thanks to checkerboard rendering, which ran at just half the pixel count of native 4K.
Until DLSS 2, consoles were ahead with upscaling.
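Quick arithmetic on the "half the pixel count" claim, just to put numbers on it:

```python
# Pixel counts behind the checkerboard-rendering claim.
native_4k = 3840 * 2160            # 8,294,400 pixels
checkerboard = native_4k // 2      # shade half the pixels each frame, reconstruct the rest
full_hd = 1920 * 1080              # 2,073,600 pixels

print(f"native 4K       : {native_4k:,} px")
print(f"checkerboard 4K : {checkerboard:,} px  ({checkerboard / native_4k:.0%} of native)")
print(f"1080p           : {full_hd:,} px   (checkerboard 4K is {checkerboard / full_hd:.1f}x 1080p)")
```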
Upscaling shills should be beheaded with dull knives.
They're the same pictures.
Looks janky right now. Might be decent in the future, but it's deeply flawed too.
bad
it allows shitty devs and greedy companies to not optimize their games at all
NVIDIA DLSS WON
AYYMDEAD FSR GARBAGE BTFO
ARR ROOK SAME!
Upscaling is for consoles. I will not upscale on my computer. I'd rather drop graphics as long as I can maintain native resolution and framerates.
DLSS gets rid of TAA blur, which makes it based in my book
What's the reason devs keep putting FSR1 into new games? Is FSR2 harder to put in or something? I don't get it. RE4 and Baldur's Gate, both FSR. Meanwhile it's piss easy to just mod DLSS into RE4, and drop a newer version of DLSS into BG3. Why is FSR more complicated to implement despite being a less complicated technology?
It looks good because native AA sucks in most games, but then again, if DLAA is implemented, use that; it looks 10 times better
I CAN'T FUCKING TELL THE DIFFERENCE ANYMORE
ASK ME ABOUT HOW I FEEL WHEN IT STARTS SUCKING MY COCK, THATS ACTUALLY GROUNDBREAKING YOU GRAFIXCUX
It can already do that if you put your dick into the air intake fan anon
it’s not like you have a choice kek
I'm still on 1080p. Looks like I have a choice.
jesus anon. might as well go back to CRT
I never really used CRTs.
High still has 75% resolution scaling
Never used XeSS.
FSR2 is terrible at 1080p so if I have DLSS I'll use it.
DLSS is the best one but the real trick is to use it with DLDSR. DLDSR to get 1620p then turn on DLSS.
but that doesn't do anything except add AI-processed false image information on top of your native resolution
DLDSR is basically just checkerboard upscaling like what PS4 pro and such use. It's supersampling in a non-symmetrical way that gets you most of the image quality gains with less of a performance hit.
and I just tried it out in Starfield: native 1440p, DLDSR at 2.25x so it's 4K, then in-game DLSS on so it goes back down to 1440p. The image doesn't change much, except I lost 15 fps
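Putting numbers on that combo (assuming the usual 2.25x DLDSR area factor and the standard ~2/3 DLSS Quality ratio): the internal render resolution lands right back at native 1440p, so most of the lost fps is the reconstruction and downsample overhead plus whatever the game runs at the 4K output resolution, not extra raster work.

```python
# Resolution math for the DLDSR 2.25x + DLSS Quality combo on a 1440p monitor.
native_w, native_h = 2560, 1440

# DLDSR 2.25x is an area factor, i.e. 1.5x per axis
dldsr_w, dldsr_h = int(native_w * 1.5), int(native_h * 1.5)              # 3840 x 2160

# DLSS Quality renders at ~2/3 of the (now 4K) output per axis
internal_w, internal_h = round(dldsr_w * 2 / 3), round(dldsr_h * 2 / 3)  # 2560 x 1440

print(f"monitor       : {native_w}x{native_h}")
print(f"DLDSR target  : {dldsr_w}x{dldsr_h}")
print(f"DLSS internal : {internal_w}x{internal_h}")
# Internal render cost is back at native 1440p; the fps you lose goes to the
# DLSS reconstruction up to 4K plus the DLDSR downsample back to the monitor.
```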
It's just cheaper supersampling.
>and I just tried it out in Starfield
Why? That game is a fucking mess in terms of optimization.
All three look the same.
Great for 4k and higher displays, but if devs are using it to get the game playable at 1080p it's shameful.
It'll be really funny to see the narratives surrounding DLSS change in a few years when even the dirtiest poorfag can afford a second hand RTX card for 50 bucks.
DLSS Quality is literally just free FPS. Fuck FSR and XeSS.
It looks great when neither the camera nor any objects in the game are moving.
FSR looks better than DLSS in motion
>he can’t take the facts
many such low test cases
This. It's why comparison images or heavily compressed videos are worthless.
Goyslop for goyim who were memed with 4k and rtx
>encourages laziness AND forces you to always have the newest thing
Genuinely one of the worst things to ever happen to gaming.
It's trash
It's a cool trick but would only work for more hands off "cinematic" games.
How do I force DLSS on GTX cards?
Don't give me that crap about tensor cores. I know it can be done