>optimization is a lost and forgotten art
>games becoming more and more bloated
>need vastly overpowered video cards to get stable framerates
>video cards now having to use a technique that renders things at a lower resolution and then has an algorithm that upscales it in order to get stable frames
>optimization is a lost and forgotten art
Game optimization was all about fooling the user's eyes to make a game look good rather than do all the necessary calculations.
That's why I will never understand those zoomers who parrot "fake frame!" all the time while praising optimization tricks from the late 90s/early 2000s.
>optimization is a lost and forgotten art
Depends on the game. Doom Eternal came out the same year as Borderlands 3. Doom Eternal runs super smooth while looking like a late 2010s game, Borderlands 3 runs like total shit while looking like an early 2010s title.
This is optimization, dumbass. Optimization isn't magic, it means making compromises wherever you can.
This is my bias but I genuinely think the understanding of graphics technology among PC gamers has degenerated into nothing after years of PC games just being console ports with improved resolution and framerate. We seem to be at a point where those two numbers are the only things people care about.
>Devs use it as an excuse to be lazier
It's the publishers and shareholders. Shortcuts are used out of necessity to cope with poor upper management. It's like how VFX in movies aren't shit on account of the VFX studios.
inconsequential it just replaces TAA and shit like FSR would still exist
besides games are designed to run on 16 cores now so the optimization meme is moron speak since they shill games that run on a single thread no matter what
>it just replaces TAA
and is much better than TAA. how is that inconsequential?
>FSR would still exist
FSR is fricking garbage compared to both native TAA and DLSS
>old games
>perfect working mirrors and reflective surfaces with no performance loss
>new games
>Mirrors only work if you enable raytracing which tanks your framerate
???
>no performance loss
Half-Life 2's planar reflections destroyed GPUs of the day. As optimized as the game was, there was still a running joke that you needed a supercomputer to play it.
And I'm pretty sure that it only worked if you had newer cards that supported the latest DirectX feature set at that point in time.
>working mirrors and reflective surfaces with no performance loss
No? Old games had massively expensive reflections because they'd just essentially render the entire world twice. Or like Duke 3d where there was literally a mirrored version of the level inside the mirror
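For reference, the "render the world twice" trick boils down to drawing the scene a second time from a camera mirrored about the reflection plane, so the cost is roughly one extra full scene pass per reflective plane. A minimal sketch of just the mirroring step (names and numbers here are illustrative, not from any particular engine):

```python
# Classic planar-reflection setup: reflect the camera about the mirror
# plane and draw the whole scene again from there.
def reflect_point(p, n, d):
    # reflect point p about the plane n.x + d = 0 (n assumed unit length)
    dist = sum(a * b for a, b in zip(p, n)) + d
    return tuple(a - 2 * dist * b for a, b in zip(p, n))

# camera 3 units above a floor mirror (the plane y = 0):
print(reflect_point((0.0, 3.0, 5.0), (0.0, 1.0, 0.0), 0.0))
# -> (0.0, -3.0, 5.0): the scene gets drawn again from under the floor
```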
Didn't have a real effect on vidya.
Games are optimized for consoles which were doing shittier upscaling methods but everyone was moving towards upscaling anyway.
All upscaling tech is cancer that is killing video games, because it's used as a crutch to make a game run at 30fps instead of stopping to explain to customers that the development team is too incompetent to render an empty room with nothing but concrete floors, walls, and a single desk and chair with one light fixture at above 22.5fps.
the reason is because games switched to deferred rendering where implementing MSAA is not easy or efficient so they looked at post processing solutions instead and TAA is what they ended up with
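The cost problem is easy to see with rough numbers: under deferred shading every G-buffer render target has to be multisampled, not just the final image. A back-of-envelope sketch (the four-target, 4-bytes-per-pixel layout is an illustrative assumption; real engines differ, but the scaling argument is the same):

```python
# Rough memory cost of multisampling a deferred G-buffer at 1080p.
def gbuffer_bytes(width, height, targets=4, bytes_per_pixel=4, samples=1):
    # with MSAA every target stores one value per sample, so the whole
    # G-buffer grows linearly with the sample count
    return width * height * targets * bytes_per_pixel * samples

base = gbuffer_bytes(1920, 1080)
msaa4 = gbuffer_bytes(1920, 1080, samples=4)
print(f"no MSAA: {base / 2**20:.0f} MiB, 4x MSAA: {msaa4 / 2**20:.0f} MiB")
```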
what's wrong with it? it just renders edges with more samples
>insanely expensive
>only works on geometry, aliasing on things like foliage, particles or fake geometry is untouched
>gets more expensive the higher your base resolution is
it's not expensive you can crank it to full on portal without any issues
>you can crank it to max while using hardware released 15 years after the game you're playing
amazing, what a good tech
I did it day one with portal 2 with a laptop with a GTX 540m
because portal was never a very demanding game, what's your point? try turning up MSAA in something like red dead 2 and you will quickly understand why it's not used, because it won't make the image look too much better while costing a shitload of frames
rdr2 uses deferred shading which is why the cost is high, it must do multiple instances of MSAA at the same time
It's still very expensive in forward rendered games. Here's an old benchmark of HL2, you can see it's a very significant cost. And this only gets worse the higher your render resolution is.
>20% fps hit going to 4x MSAA at the same resolution
is this supposed to be bad?
Yes? Compared to the impact of things like DLAA which actually antialias the entire scene instead of just geometry it definitely is.
I got banned, had to open another browser just to post this
If it were up to me you'd be permabanned for defending MSAA
MSAA is the same technique as SSAA but very optimised because it only needs to be applied to the pixels around the edges of polygons on screen rather than the whole image on screen. it's top quality AA.
your problem with MSAA is because modern devs are moronic and smother their games in awful pixelated effects like Square Enix's swiss cheese hair fx that aren't compatible with traditional AA, don't have their own smoothing built into the effect and require terrible blurring like TAA which ruins everything else and STILL look like shit.
if your game has a good art style and doesn't have shitty effects this is a non-problem. there's nothing wrong with MSAA. the problem is shitty devs.
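The "only the edges" point can be put in rough numbers: MSAA runs the pixel shader once per pixel, and only the cheap coverage resolve happens per sample. A toy model (illustrative only; real GPUs add bandwidth and memory costs on top):

```python
# Why MSAA is cheaper than SSAA: shading work stays per-pixel, samples
# only carry coverage and share the one shaded colour.
def ssaa_shades(pixels, samples):
    return pixels * samples      # SSAA: a full shader run per sample

def msaa_shades(pixels, samples):
    return pixels                # MSAA: samples share one shaded colour

def resolve(covered, total, tri_color, bg_color):
    # MSAA resolve for one edge pixel: blend the single shaded triangle
    # colour with the background by sub-pixel coverage fraction.
    w = covered / total
    return tuple(w * t + (1 - w) * b for t, b in zip(tri_color, bg_color))

print(ssaa_shades(1920 * 1080, 4))   # 4x the shading work
print(msaa_shades(1920 * 1080, 4))   # same shading work as no AA at all
# edge pixel where the triangle covers 3 of 4 samples:
print(resolve(3, 4, (1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
```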
>if your game has a good art style and doesn't have shitty effects this is a non-problem.
that's fricking moronic lol. you can't make convincing foliage or hair or particles that don't have aliasing. Making everything into a model would be insanely expensive.
MSAA worked well in 2006 when everything on screen was a model. That's just not the case in games today. Which is why MSAA is trash.
they've been doing it for years. it's called transparency in textures.
and it looks infinitely better than pixelated swiss cheese hair.
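For the curious, "transparency in textures" here means alpha testing: each fragment is either fully kept or discarded outright based on the texture's alpha channel, so leaves and hair can be cut out of flat quads with no blending and no draw-order sorting. A minimal sketch (the 0.5 threshold is the conventional default, but it's tunable):

```python
# Alpha testing: a binary keep/discard per fragment. Kept fragments are
# fully opaque, so depth writes and MSAA edge handling just work.
def alpha_test(texel_rgba, threshold=0.5):
    r, g, b, a = texel_rgba
    if a < threshold:
        return None          # "discard": the fragment writes nothing
    return (r, g, b)         # kept: opaque, no sorting needed

print(alpha_test((0.2, 0.6, 0.1, 0.9)))   # leaf interior: kept
print(alpha_test((0.2, 0.6, 0.1, 0.1)))   # transparent border: discarded
```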
>flat paper thin tree leaves >prerendered cutscene
frick you
not prerendered moron.
notice the hair looks so good compared to this fricking shit.
(a still image doesn't even capture how bad it is because these pixels stutter around every frame and are inconsistent)
The world would be a better place if Unreal engine went away.
I could make better hair in unreal engine using mostly old techniques from the PS2 days.
they just think they have to do it this way now because it's "newer" but the result looks like shit.
it's worse than last gen games but they think saying "look at this new dynamic hair simulation" will override how fricking shitty it looks.
>I could make better hair in unreal engine using mostly old techniques from the PS2 days.
You should. And I don't mean in a "do it homosexual" way. Developers tend to adopt certain attitudes towards things like it's a religion. Telling them they could do something achieves nothing. DEMONSTRATING that things could be better forces them to acknowledge that there's a better way and they aren't using it.
There's no reason why everything on screen can't be a primitive OR a shader effect that isn't pixelated and therefore doesn't need blanket AA.
the only reason is devs are lazy c**ts and don't even understand the render pipelines they're using.
their current modern techniques for hair, for thin geometry, for shadows are worse than many innovative ones that existed in previous generations.
The fact that it doesn't work on transparency and thin detail is a problem with the technique and not the developers.
it works on transparency. any primitive including vertexes with different levels of transparency.
it doesn't work on shit that isn't polygons and is drawn with a shader.
when you do shit like that you're meant to make sure it's smooth but since modern devs are moronic they render it as a bunch of pixel particles and think the shittiest Vaseline AA ever will clean it up.
>your problem with MSAA is because modern devs are moronic and smother their games in awful pixelated effects
You're on the right track, but still a bit off. The reason MSAA can't be used anymore is because GPUs went massively multi-core. MSAA enforces *some* kind of order dependency. Not total, mind you, you don't need to single-thread this like it's 1996, but you have to be aware of what gets drawn and in what order. This is performance DEATH to a modern GPU.
Those "swiss cheese" effects exist for a similar reason. Real transparency, depth of field, translucency, etc. all cause serialisation of the pipeline, which is death. So pixelated time-based techniques are used to fake it, to maintain parallelisation.
It's not really fair to say devs are "moronic" because the solution to this problem isn't spending a few hours googling to learn how to do it properly; it requires someone with serious intelligence to author an "effective multi-sampling and order independent rendering techniques in a massively parallel shader environment" SIGGRAPH paper.
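One of those "pixelated time based techniques" is screen-door (dithered) transparency, which is exactly the swiss-cheese pattern complained about upthread: instead of blending, a fixed spatial pattern of fragments is discarded, so no sorting or serialisation is needed and TAA is expected to smear the holes into smooth transparency. A toy sketch (the 2x2 Bayer thresholds are a standard pattern, used here purely for illustration):

```python
# Screen-door transparency: keep a fragment only where its alpha
# exceeds a per-pixel threshold from a repeating dither pattern.
BAYER_2X2 = [[0.00, 0.50],
             [0.75, 0.25]]

def survives(x, y, alpha):
    # order-independent: each fragment decides alone, fully in parallel
    return alpha > BAYER_2X2[y % 2][x % 2]

row = "".join("#" if survives(x, 0, 0.5) else "." for x in range(8))
print(row)   # alternating kept/discarded pixels at 50% alpha
```

The raw output is literally the "bunch of pixel particles" look; it only reads as transparency once a temporal filter averages it over frames.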
>All upscaling tech is cancer that is killing video games, because it's used as a crutch
So you meant to say developers who use upscaling tech as a crutch are the cancer that's killing videogames?
It's pretty good with DLDSR. Having a 1440p monitor, using DLDSR to force it into 4k and then using DLSS is nuts.
But I don't see the point of DLSS at 1080p, there's just not enough pixels for the technology to work with. At most I could understand using the quality setting but anything under that will start looking bad.
>1440p + DLSS ultra quality >looks decent >DLDSR 4k + DLSS performance >much better clarity and quality despite fewer pixels
I still don't get why this works the way it does. Why would the target resolution matter at all?
I won't pretend to know how it works but I think the balanced/performance settings will all look good as long as there are enough pixels for DLSS to do its job. That's why performance 1080p looks like garbage but performance 4k looks virtually identical to native.
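The "enough pixels" argument is concrete if you work out the internal render resolutions. The commonly cited DLSS linear scale factors are roughly 2/3 for quality, 0.58 for balanced and 0.5 for performance (exact values can vary by game and DLSS version):

```python
# Internal render resolution behind each DLSS mode, per output size.
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal(width, height, mode):
    # linear scale on each axis, rounded to whole pixels
    s = SCALE[mode]
    return round(width * s), round(height * s)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print((w, h), {m: internal(w, h, m) for m in SCALE})
```

Performance mode at a 1080p output works from roughly 960x540, while performance at 4K works from a full 1080p image, which matches the observation above.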
Yeah I get that, I'm just adding that using DLDSR in general to force higher quality makes DLSS work better even if you're comparing similar internal resolutions. Higher target resolution just makes DLSS work better for whatever weird reason.
>I don't see the point of DLSS at 1080p
You literally just mentioned it. Use DLDSR to force 4k (screenshots show this is comparable to native 4k, even on a 1080p monitor) and then use DLSS for free frames.
>screenshots show this is comparable to native 4k, even on a 1080p monitor
It's really not. You'll get a nice AA effect but a higher resolution screen will make it look crisp
https://imgsli.com/OTEwMzc
You are factually incorrect, and this isn't even the 4K option. DLDSR has gotten so good that a 1440p monitor is a waste of money sitting on your desk while you cope.
lol you realize you can't compare monitor resolutions from a screenshot right? Any improvement you get from DLDSR is amplified when using a 1440p screen. I know that for a fact because I use DLDSR 1440p -> 4k with DLSS performance to make games look better.
>Use DLDSR to force 4k (screenshots show this is comparable to native 4k, even on a 1080p monitor) and then use DLSS for free frames.
From my experience it's not quite there yet. I forced DLDSR 2880x1620 on my 1080p monitor and while it very obviously looks better than the native 1080p, it's still not quite as good as native 1440p.
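The pixel budgets behind that comparison make the result unsurprising (DLDSR's 2.25x factor on a 1080p panel renders at 2880x1620, then downsamples back to 1920x1080 for display):

```python
# Rendered vs physical pixel counts for the setups being compared.
def pixels(w, h):
    return w * h

print("native 1080p:", pixels(1920, 1080))   # 2,073,600 on the panel
print("DLDSR 2.25x: ", pixels(2880, 1620))   # 4,665,600 rendered
print("native 1440p:", pixels(2560, 1440))   # 3,686,400 on a 1440p panel
# More detail is rendered than 1440p has, but it still lands on ~2.07M
# physical pixels, which is why it beats native 1080p yet can't quite
# match a real 1440p screen.
```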
NTA but I read your conversation and I remembered I had DLDSR enabled in my settings but never used it.
I just did a quick comparison on medieval dynasty.
Here's the result. It's really amazing.
My screen is 1440p but the quality improvement with 4K is massive, much bigger than 1080p -> 1440p
Zoom in and observe the fence and foliage, it's impossible to miss.
to be fair that comparison isn't very accurate since you're upscaling 1080p with a shitty bilinear filter which makes it look worse than it would natively
DLDSR is very nice thoughever
Tried it on Nioh 2 after upgrading to a card that supported it and immediately noticed the drop in visual quality. Seems really stupid that so much emphasis is being placed on the tech instead of nearly anything else.
DLSS is fricking amazing. It's literally just TAA but with none of the downsides of TAA. It singlehandedly converted me from a TAA hater.
At 1440p, quality mode is literally free frames since the quality is comparable to native TAA. If you want higher quality, DLAA looks much better than any alternative.
I don't really give a shit about muh fake pixels one way or another but I will always use DLSS if given the opportunity because native AA solutions are just terrible. SMAA is way too demanding, FXAA looks like shit, TAA is a blurry mess that looks terrible during motion. DLSS AA gives you frames while looking ok even during motion.
>kill it in the short term >to save it in the long term
AAA gaming will crash and burn because of this. people will stop buying super expensive cards and people will stop buying games that use this. and then the industry is saved
>they're pretty much out of headroom for gains in raw power
that's clearly moronic given that the 4090 was nearly twice as fast as the 3090. Plenty of architectural improvements to be had.
Seems like we're getting about a dozen of these threads a day about DLSS now, which I can only assume comes from AMDonkeys seething because AMD still hasn't come out with FSR 3, which is now 2 months overdue, and DLSS 3.5 really mindbroke them.
Maybe, but I think there's also a large number of nvidia owners who just don't like DLSS and are salty that even if they sell a kidney to get a 4090 they still are expected to use DLSS to get 4K native. DLSS is marmite and DLSS3 is pure wankery for bullshots and weirdos who don't play games but like to look at benchmarks.
>even if they sell a kidney to get a 4090 they still are expected to use DLSS to get 4K native.
You don't need DLSS to get good framerates at 4K native with a 4090. I don't know who's forcing this meme that the cards are terrible without gimmicks, most of the 3000 and 4000 series are decent in terms of pure rasterization (except 4060/4060ti they're trash).
t. 4090 owner
Yes, I exaggerated. But it's because a 4090 shouldn't be giving you "good framerates", at that cost it should be blowing the doors off of every game out there. The fact it isn't is worrisome, and demonstrates a clear trend towards just faking frames, with reliance on DLSS getting worse and worse.
I've yet to experience DLSS at 1440p which doesn't look BAD in motion. I'm sure 4k DLSS is better but I would never use DLSS at 1440p, I'd rather play at 30fps native than 60fps with the fricking ghosting effects and just bad looking upscaling of dlss.
Literally yes. Even Red Dead 2 which has DOGSHIT TAA looks better in motion than DLSS looks in motion in that game at 1440p.
People LOVE to post screenshots as examples of how amazing DLSS is in Red Dead but the second you move that camera it is obvious its not good.
In the process of killing it. All the moronic shitters falling for it and putting more effort into fighting which type of crutch, i mean upscaling, is better and should be implemented into games, aren't helping either. Don't get me wrong, it's a great idea on paper but normalizing it to become the new "native" is exactly what shouldn't happen, yet it is.
Remove DLSS/FSR/XeSS and you've simply removed sales points for marketing, especially NV since it's actual proprietary tech and needs $$$ to develop it.
Upscaling tech is good but not as a must have feature
>Upscaling tech is good but not as a must have feature
Exactly my point. It shouldn't outweigh actual optimization, but marketers and the gullible tards falling for them give this shit way too much importance, normalizing the entire thing in the process. And then you get devs who "develop with upscaling in mind", aka their games will run like utter shit without it, further amplifying the issue. The current state of the industry is absolutely horrible and I'd argue that it will get even worse.
Unfortunately devs have shown that even without DLSS being available they are perfectly happy to lean on whatever shitty integrated upscaling they have in their framework of choice. Upscaling won't go away until some major developer makes crazy amounts of money on a major game that uses native rendering again.
What do you define as "actual optimization?" Because the modern optimizations that developers use to get the effects they want are exactly what makes modern video games look like trash. The clever rasterization hacks to approximate things like global illumination, reflections, lighting and shadowing, and everything else that makes a cutting edge video game, they all have created horrible problems.
>What do you define as "actual optimization?"
not that guy, but the problem is that without upscaling if your game is 720@20fps then you HAVE to do something about it. But if you can get to 1080@30 with DLSS then devs are using it as the final crutch that lets them ship and collect their bonuses.
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
Stop spamming this thread
Looks like shit too
>great tech comes out
>shitty devs use said tech's strengths as an excuse to be even lazier
>therefore tech is bad
yes
Never played anything which uses it, probably never will
even on quality dlss always looks like a blurry mess compared to native on 1440p to me
but games have to run on AMD hardware too which don't support DLSS, ever thought of that one melvin?
That’s why they add fsr too, nvidia started the trend so the blame lies with them
TAA is what was used before
MSAA is fricking garbage though and we're better off with it gone
I can understand for people wanting to upscale up to 4k but I totally don't get when people upscale on anything below that.
I need to see some comparisons.
>we dont HAVE to optimize our games
>just upscale them from 720p bro!
>its just like real 1440p!
I hope this dies like SLI and NVlink
DLSS is great, TAA is blurry anyway so DLSS is just free frames.
I have never seen anybody on this board defend DLSS that didn't use broken English.
Feel free to articulate your issues with DLSS and I will respond in perfect, college-level English, good sir.
while it's a nice option to have, I still think optimization goes a long way
>Nvidia makes a graphics enhancer system because lazy devs couldn't optimise shit
>makes devs even lazier
beats the hell out of TAA. As long as devs refuse to implement reasonable anti-aliasing methods, DLSS is a necessary evil.
>reasonable anti-aliasing methods
like what?
You can't kill what is already dead.
DLSS is nice because you can override TAA with it. DSR+DLSS or DLAA is great.
I don't really give a shit about muh fake pixels one way or another, but I will always use DLSS if given the opportunity because native AA solutions are just terrible. MSAA is way too demanding, FXAA looks like shit, TAA is a blurry mess that looks terrible during motion. DLSS AA gives you frames while looking ok even during motion.
>kill it in the short term
>to save it in the long term
AAA gaming will crash and burn because of this. People will stop buying super expensive cards and people will stop buying games that use this, and then the industry is saved.
Neither really. It's just a sign of the times, they're pretty much out of headroom for gains in raw power.
4090 is a 60TF card or something like that, meanwhile a PS4 is 1.8TF
5000 series probably hits 100TF, it's insane how much compute power these things have
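Taking the thread's round numbers at face value (the official FP32 spec is ~82.6 TF for the 4090 and 1.84 TF for the PS4, so this is the conservative version of the comparison):

```python
# Ballpark raw-compute comparison using the figures quoted above.
rtx_4090_tf = 60.0  # thread's round number; spec sheet says ~82.6 TF FP32
ps4_tf = 1.8

print(round(rtx_4090_tf / ps4_tf))  # 33 -- roughly 33x the PS4's compute
```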
Insane to think how mismanaged all that power is
>they're pretty much out of headroom for gains in raw power
That's clearly moronic given that the 4090 was nearly twice as fast as the 3090. There are plenty of architectural improvements still to be had.
Not in raw power it wasn't, only in Nvidia specific frick-shittery... like RTX.
let's ask Remnant 2 about this
>sees pic rel
yeah it's killing it
It was supposed to save poorgays, sadly devs didn't see it that way
Improved vidya
Seems like we're getting about a dozen of these DLSS threads a day now, which I can only assume comes from AMDonkeys seething because AMD still hasn't come out with FSR 3, which is now 2 months overdue, and DLSS 3.5 really mindbroke them.
This isn't the right thread for it but DLSS 3.5 looks really good. I hate those splotchy denoiser artifacts so much.
Maybe, but I think there's also a large number of nvidia owners who just don't like DLSS and are salty that even if they sell a kidney to get a 4090 they still are expected to use DLSS to get 4K native. DLSS is marmite and DLSS3 is pure wankery for bullshots and weirdos who don't play games but like to look at benchmarks.
>even if they sell a kidney to get a 4090 they still are expected to use DLSS to get 4K native.
You don't need DLSS to get good framerates at 4K native with a 4090. I don't know who's forcing this meme that the cards are terrible without gimmicks; most of the 3000 and 4000 series are decent in terms of pure rasterization (except the 4060/4060 Ti, they're trash).
t. 4090 owner
Yes, I exaggerated. But that's because a 4090 shouldn't just be giving you "good framerates"; at that cost it should be blowing the doors off of every game out there. The fact it isn't is worrisome, and it demonstrates a clear trend toward just faking frames, with the reliance on DLSS getting worse and worse.
I've yet to experience DLSS at 1440p that doesn't look BAD in motion. I'm sure 4K DLSS is better, but I would never use DLSS at 1440p; I'd rather play at 30fps native than 60fps with the fricking ghosting effects and the just plain bad-looking upscaling of DLSS.
native TAA looks better in motion to you than DLAA? wtf
Literally yes. Even Red Dead 2 which has DOGSHIT TAA looks better in motion than DLSS looks in motion in that game at 1440p.
People LOVE to post screenshots as examples of how amazing DLSS is in Red Dead, but the second you move that camera it's obvious it's not good.
Red Dead 2 doesn't even have DLSS, it's designed around shit AA and a mod can't fix that.
In the process of killing it. All the moronic shitters falling for it and putting more effort into fighting over which type of crutch, I mean upscaling, is better and should be implemented into games aren't helping either. Don't get me wrong, it's a great idea on paper, but normalizing it to become the new "native" is exactly what shouldn't happen, yet it is.
It's just shills being obnoxious on the internet.
Remove DLSS/FSR/XeSS and you've simply removed sales points for marketing, especially for NV since it's actual proprietary tech and needs $$$ to develop.
Upscaling tech is good but not as a must have feature
>Upscaling tech is good but not as a must have feature
Exactly my point. It shouldn't outweigh actual optimization, but marketers and the gullible tards falling for them give this shit way too much importance, normalizing the entire thing in the process. And then you get devs who "develop with upscaling in mind", aka their games will run like utter shit without it, further amplifying the issue. The current state of the industry is absolutely horrible and I'd argue that it will get even worse.
Unfortunately devs have shown that even without DLSS being available they are perfectly happy to lean on whatever shitty integrated upscaling they have in their framework of choice. Upscaling won't go away until some major developer makes crazy amounts of money on a major game that uses native rendering again.
saved vidya after picrel killed it
This is the truth.
What do you define as "actual optimization"? Because the modern optimizations that developers use to get the effects they want are exactly what makes modern video games look like trash. The clever rasterization hacks to approximate things like global illumination, reflections, lighting and shadowing, and everything else that makes a cutting-edge video game: they have all created horrible problems.
>What do you define as "actual optimization?"
not that guy, but the problem is that without upscaling, if your game runs at 720p@20fps then you HAVE to do something about it. But if you can get to 1080p@30 with DLSS, then devs use it as the final crutch that lets them ship and collect their bonuses.
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
>DLSS WILL NOT BE USED FOR PLANNED OBSOLESCENCE
Stop spamming this thread
do you all have amnesia? games were horribly optimised before DLSS