https://i.imgur.com/yPIH0fr.jpg
Meme? Yay or nay?
It's the future. Seems like we're heading towards games relying more and more on upscaling for presentation, which means devs are gonna be targeting 1080p native on high end cards and then upscale to 1440p or 4k. Which coincidentally means your 1080p cards are gonna be made obsolete.
Hardware has no future gaming is trash.
PS2 was the last time games were fun
Games were never fun nostalgiahomosexual. They've always been shit.
Play old James Bond games
I can barely tell the difference.
looks like shit, proprietary shitware
DLSS requires Nvidia or the devs to train AI on billions of hours of gameplay footage and then repeat until it looks perfect
AMD can't catch up that fast with FSR
Neither are training on AI now. It's literally just a switch you can flip on UE4 (and I presume UE5 as well).
moron
if it's still vendor locked it's shit
No. The guy you replied to is right. They have to train these things with big training sets of the game.
It's not just a setting in UE4, you need data from NVIDIA.
>The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games.
They're both half right: DLSS is natively supported on UE4 and Unity and doesn't require any per-game training, but it does still use AI. It genuinely is just a thing you can toggle on or off, however.
>They have to train these things with big training sets of the game.
How does it feel to be still in 2018?
Since DLSS 2.0 they're using generic models that work with everything.
DLSS integration into UE4 is literally a plugin.
How does it feel to be wrong?
Does a fresh game off the compiler have thousands of hours of AI training already?
What game?
Nay. Only because it makes native 4k gays seethe.
4K DLSS quality is the future and I love it, it makes playing 4K actually viable and I hope either nvidia or AMD soon make a PROPER working universal DLSS equivalent you can just enable at driver level
This. I just want to be able to use it to upscale older lower res games
upscaling is taking a lower res and displaying it at a higher resolution, which you don't really need for older games since they already run at pretty much any resolution on modern hardware
Pixel count. I play on a 1440p monitor but I downscale from 4k using DLDSR and it looks a lot better. Try it yourself, the increased sample count brings out a lot of detail over your native resolution
DLSS quality on 4K is 1440p upscaled
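The arithmetic behind that claim can be sketched quickly; these are the commonly cited per-axis scale factors for DLSS 2.x modes, so treat the exact numbers as approximate since individual games can deviate:

```python
# Approximate DLSS 2.x render-scale factors (per axis).
# Commonly cited values; some titles tweak them.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution DLSS upscales from."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

So 4K Quality renders internally at 1440p, and 4K Performance at 1080p.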
i have a 1070, not supported i'm pretty sure
What's the point of upscaling in older games? Even entry level GPUs are capable of running them maxed out at 4k. They don't use TAA or modern rendering techniques relying on image reconstruction either, so they would completely break apart when passed through DLSS or FSR.
>it makes playing 4K actually viable
so it's not actually 4k, what's the point
There he is. Knew he'd show up.
you're not actually running 4K, it's 1080p upscaled, and AMD's version is even worse
I don't like the artifacts it causes.
exactly this, i assume the people that use dlss also use motion smoothing on their tv's so everything looks like 60 fps+
DLSS has a big problem with implementation, it works really badly in some games like Escape from Tarkov, while in Monster Hunter Rise you pretty much cannot see it
They are not, like I said above it really depends on the implementation on a game-by-game basis. In some games DLSS is just great while in others it is really, really bad, but I still use it since I can just supersample and then use DLSS.
yeah it's not sadly, DLDSR is really great tho if you ever get a RTX card because it cleans up the artifacting from super sampling
>ever get a RTX
naah with nvidia's continuation of vendor locked software and hardware features imma go to amd when prices go down
AMD will surely make a similar tech when they figure it out so yeah it doesn't really matter.
the software they've made in the past hasn't been ati/amd card locked i'm pretty certain?
I don't think they have but they just don't exist at the moment is what I meant, if you want to use DLDSR its only on nvidia cards, FSR 2.0 is a good go at DLSS tho even if it's not as good yet but it'll get better
those are fixed in DLSS 2.4+ versions
You need it if you want to use RTX on a 1440p+ monitor, which is always.
DLSS is decent, but nvidia is moronic. FSR 2.0 shows they didn't have to sacrifice a huge amount of die area just to give it to you. They could've used existing hardware. Putting AI and tensor cores on the die is fricking moronic. That shit only belongs in workstations and supercomputers. Turing was a fricking failure and rightfully so. Despite using a smaller process node, it wasn't really more efficient, because they had to cut shaders, ROPs and TMUs to make the AI shit and tensor cores fit.
I don't think I need to tell you just how shit the 3000 series' power consumption is. They upped the shader/SM count back to what it was with pascal, but they still kept all the AI and tensor shit, making the dies frick huge. Despite using an 8nm process, compared to pascal's 16nm process, the 3050's die with 20 SMs is only around 12% smaller than the 1080's die with the same number of SMs.
It also shows that pouring development into this shit has made them stagnate and regress in IPC for games. A 1080 is clocked roughly the same but performs better than a 3050 despite using the same number of clusters, and probably would've had a smaller die size and lower power consumption if it was on the same process node.
Nvidia has fricked us all over.
>I don't think I need to tell you just how shit the 3000 series' power consumption is.
who the frick cares? They're really good cards, if you need to worry about power consumption maybe you should not own a computer in the first place
>Black person didn't get it at all
The power consumption isn't the issue, it's the thing that shows you why they fricked up and why they had to resort to that. The manufacturing process they use is a lot more efficient, yet they didn't manage to increase efficiency. The reason for that is that more of the die area is dedicated to shit you don't fricking need on a gaming GPU. If they used the area spent on AI and tensor cores for more shaders, ROPs and TMUs, you could either have a smaller die, costing less and using less power to get the same performance, or the same size die using the same amount of power with significantly more performance.
the excessive power consumption sucks when your fans go into overdrive every time you boot a game, my 3080 sounds like a goddamn jet engine
just undervolt bro
"FSR 2.0 proves blah blah blah..."
stopped reading there. fsr2.0 looks like actual shit.
looks great in cyberpunk with the mod
>b-but cyberpunk is ba-
it looks good.
It's pretty nice if you're doing ML, however.
can someone explain to me why the frick did nvidia make dlss only supported on 20 series cards and above? people who use new cards don't fricking need res upscaling, its the 10 series cards and below that need it, what a bunch of fricking morons
to sell new cards
it relies on tensor cores, and the 2060 and 3050 aren't exactly spectacular cards on their own.
>adds nothing to gameplay because nobody knows how to use it aside making light and reflections realistic
it's a meme and most likely will stay a meme because ultra realistic graphics isn't something impressive in 2022
That's RTX, not DLSS, learn to read.
>DLSS itself is a meme. It relies on nVidia actually using their resources to have it implemented in games. Sure they do it for all big-name games now, but what about in 10 years or so? Remember PhysX?
PhysX was basically implemented in game engines when it gained adoption and it stopped needing dedicated hardware/die as CPUs and GPUs got better, in a similar manner to how hardware acceleration in sound cards from the 90s became obsolete when CPUs got good enough to not be dragged down by sound processing.
I kinda like it in some games, since it allows me to run my TV at native 4k and acceptable framerates.
What am I supposed to be seeing there?
what the hell with these captchas lately holyfrick
yea, meme
real-time pathtracing was making much better progress in the year immediately prior to "rtx" and now shit's all fricked up just because MUH DENOISER
also ironic given that DLSS is some of the noisiest shit around with its temporal artifacting
DLSS itself is a meme. It relies on nVidia actually using their resources to have it implemented in games. Sure they do it for all big-name games now, but what about in 10 years or so? Remember PhysX?
Upscaling though is probably here to stay for a while. 4K is actually picking up in adoption and it's clear neither AMD nor nVidia have a way to actually drive that resolution without punching up the power requirements to the point where gamers might actually start to think about their power bills which is quite a feat. Scaling is pretty much the only viable solution for now.
>Remember PhysX?
Never took off for graphics stuff, but it has become a fundamental component of physics simulations in many game engines that are still used today. A better comparison would be that hair thing they tried to push with Tomb Raider. That flopped.
On resolutions higher than 4K, yay
DLSS is an extremely adaptable and valuable tech but I fear devs will use it as a crutch in the future to excuse their shitty optimization
don't mind me just doing some cheap post processing
in my very limited experience, dlss + dsr was a godsend for rdr2 on my 1080p display. setting dldsr to 2.25x and dlss to quality, i was finally able to get rid of the shitty smeary blurry TAA look of the game and it runs nearly as well as native.
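The numbers in that combo work out neatly. A quick sketch, assuming DLDSR 2.25x means 2.25x the pixel count (so 1.5x per axis) and DLSS Quality renders at ~2/3 scale:

```python
import math

def dldsr_then_dlss(w, h, dldsr_factor=2.25, dlss_scale=2 / 3):
    """Return (DLSS internal res, DLDSR output res) for a native display res."""
    axis = math.sqrt(dldsr_factor)  # 2.25x pixels -> 1.5x per axis
    out_w, out_h = round(w * axis), round(h * axis)  # resolution the game outputs
    in_w, in_h = round(out_w * dlss_scale), round(out_h * dlss_scale)  # DLSS renders here
    return (in_w, in_h), (out_w, out_h)

internal, output = dldsr_then_dlss(1920, 1080)
print(internal, output)  # (1920, 1080) (2880, 1620)
```

So the GPU renders roughly at native 1080p cost, but the image gets reconstructed to 1620p and downscaled back, which is why it cleans up TAA smear for nearly free.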
D44M already proved you don't need any memes like that. Just a well optimized game and well calibrated AA. DLSS is a crutch for devs who are incompetent.
b-but doom eternal utilizes dlss...
>doom eternal utilizes dlss
Wat?
The only people that dislike it are AMD third worlders. Expect them to defend that ripoff attempt called FSR, which can also be used with Nvidia cards.
I assume you also paid a $200 premium for your g-sync monitor?
my g-sync monitor is awesome, no screen tearing ever and no added lag like vsync
what monitor
PG279Q. I dunno if they still make/sell it but I love it. 1440p gives a nice resolution bump without burning all your GPU's fill rate like 4K
could've paid much less for a freesync alternative
freesync isn't as good as gsync tho
>does the exact same thing
>doesn't cost 200 dollars
>doesn't draw extra power
yes g-sync is so good that freesync wasn't adopted as the industry standard or anything and new g-sync monitors are released all the time
you use an iphone by any chance?
>LFC ensures that variable refresh rate will still work below the adaptive sync refresh window. In other words, a display that has an adaptive sync window of 40Hz to 100Hz will still suffer from screen tearing or stuttering if your framerate drops below 40fps. LFC will prevent this from happening and this is one of the key benefits that G-Sync offers over FreeSync
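The LFC behavior quoted above boils down to frame doubling/tripling, which a short sketch can illustrate (the 40-100Hz window is the example from the quote; real panels vary):

```python
def lfc_refresh(fps, vrr_min=40, vrr_max=100):
    """Pick the panel refresh rate under Low Framerate Compensation.

    Inside the VRR window the refresh tracks the frame rate 1:1. Below it,
    each frame is repeated an integer number of times so the effective
    refresh lands back inside the window, avoiding tearing/stutter.
    """
    if vrr_min <= fps <= vrr_max:
        return fps
    if fps > vrr_max:
        return vrr_max  # above the window: capped at max refresh
    mult = 2
    while fps * mult < vrr_min:
        mult += 1
    rate = fps * mult
    return rate if rate <= vrr_max else None  # None: window too narrow for LFC

print(lfc_refresh(25))  # 50 Hz: every frame shown twice
```

This also shows why a wide VRR range matters: LFC needs max/min of roughly 2x or more to always find a valid multiple.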
So your argument as to why a very expensive proprietary module is better (despite doing the same thing as a free alternative) is that it works at sub 40 fps?
it works at any fps, I literally never see screen tearing since I got this monitor
neither have I because I haven't played below 60fps in over a decade, what a stupid argument, could've spent that money on a better gpu to actually run the game
>implying I don't already have the best gfx card
you have the best graphics card on the market and you're playing below 30 fps?
some games aren't well coded and have hitches while loading or whatever. Obviously average FPS is much higher than 30
My 2016 Nixeus NXVUE24 Freesync monitor has a 30-144hz freesync range and LFC that kicks in below 30fps.
Your claim is invalid.
so? this seems really moronic
my shit does 48 hz
what does it matter? any game dropping below 60 just doesn't happen unless it's dogshit or really broken and couldn't be fixed by modders
I got a freesync monitor and it had visible backlight flicker while the game was running at 60fps. Had to return it after 2 days. Apparently this isn't a problem with proper g-sync.
Not a freesync problem. When I got my first one I was actually surprised how problem-free it was. Just works globally with every game.
Freesync on cheap monitors doesn't always work well with Nvidia cards. You need to buy a gsync compatible or certified monitor. The actual gsync "ultimate" module or whatever is a meme.
IMO it's nice to have, an extra setting to utilize in search of that perfect balance of performance and visuals, but not really a card seller because even now only a few games actually implement it, and even fewer implement it well.
It's blurry shite. Don't use it if you're able to maintain 60+fps without it.
DLSS concept is good but the problem is that the implementation of it varies so wildly. DLSS is absolute dogshit in something like RDR2 or Death Stranding but really good in Control and COD:MW for example.
I stepped away from nvidia last month. I still have my 2080ti but it only does upscaling and htpc stuff now. dlss is trash, it's only useful at higher res like 4k. My 6900xt is so powerful I don't really need it, and if I do, fsr2 is getting modded into a lot of games now.
In order of games I've played with DLSS:
Death Stranding: Leaves trails behind those particles rising from the ground, doesn't seem out of place though. Birds in the sky leave long trails, looks bad. Less blurry than the game's TAA. The game already runs good but DLSS keeps 4K/60.
Cyberpunk 2077: Basically mandatory since the game runs like ass. Refunded after a few hours so can't give a good DLSS review.
Red Dead Redemption 2: It released without DLSS and I had to stomach the worst TAA I've ever seen in a game. When DLSS came out, it was just a straight upgrade from the game's TAA visually, and it's a demanding game where the performance increase is welcome. Always enable it for RDR2.
Ready or Not: It's alright in this game. Red dots and holo sights leave tiny trails but not enough to interfere with aiming.
Escape From Tarkov: Haven't played much since it was updated into the game but checked it out some. Can barely even tell, possibly the best implementation of DLSS yet.
>Death Stranding: Leaves trails behind those particles rising from the ground
You can replace the DLSS dll with a newer version and it's fixed. In Director's Cut they use a more recent version out of the box.
In Hitman 3 DLSS looks like trash no matter what version of DLSS you use, I unironically prefer FSR 1.0 in this case.
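The dll swap mentioned above really is just replacing one file, `nvngx_dlss.dll`, in the game's folder. A hedged sketch; the paths are hypothetical and the DLL usually sits next to the game executable:

```python
# Hypothetical sketch of the DLSS DLL swap: back up the game's bundled
# nvngx_dlss.dll, then copy a newer revision over it.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir, new_dll):
    old = Path(game_dir) / "nvngx_dlss.dll"
    backup = old.with_name(old.name + ".bak")
    shutil.copy2(old, backup)   # keep the original so you can roll back
    shutil.copy2(new_dll, old)  # drop in the newer DLSS runtime
    return backup
```

Rolling back is just copying the `.bak` file over again; verified game files on Steam will also restore the original.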
Whoa increased contrast and lowered brightness. Amazing
That's not what DLSS does at all. It's resolution and performance related and there's no way the AI would make such lighting adjustments. OP's pic is bait.
isn't dlss mainly for performance boosts like rtx
It can add more fine details to 4K and the future iterations will just keep getting better at doing that.
So soon it will be better than native resolutions.
>wait decades for a remake
>wait another year for it to come to PS5
>wait half a year for it to come to EGS
>wait another year for it to come to Steam
>No DLSS
>TAA that's even shittier than RDR2's
>Graphics: On/off
>Stutters
>Time jannies
>Can't play as Red XIII
>$70
Went a little off topic but I really wish it had DLSS. Frame rate isn't even bad, I get 60 FPS at 1800p and DLSS isn't gonna fix stutters but I just hate its TAA and wouldn't mind upscaling to 2160p.
>isn't gonna fix stutters
Run the game in DX11, you'll have to manually enable v-sync in drivers though.
At 1080p it legit looks like a 540p game lmao. You can edit the TAA settings in the config to use fewer samples.
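If the game is UE4 (the config mention suggests so), the usual route is an `Engine.ini` override. A sketch; the cvar values here are illustrative assumptions, not tested settings for this particular game:

```ini
; Hypothetical Engine.ini tweak for a UE4 title. r.TemporalAASamples
; defaults to 8; lowering it (and raising the current-frame weight)
; trades temporal stability for a sharper, less smeary image.
[SystemSettings]
r.TemporalAASamples=4
r.TemporalAACurrentFrameWeight=0.2
```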
Fortunately I have a 4K screen and it looks acceptable with stock TAA settings. Dynamic resolution does a pretty good job.
MEMEMEMEMEE
Dlss causes ghosting.
Frick that shit.
idk things run fine on my 3070ti
FSR 2.0 is a game changer for anyone still on older nvidia GPUs or any dx12 capable AMD GPUs, managed to bring my average FPS in Cyberpunk 2077 to 60fps in Performance mode (just like FSR 1 Quality) with incredible clarity bordering on native res.
This is on a 2015 R9 Fury and apparently it's even faster on GTX 10 series GPUs.
The value add to older hardware is great. Technology at its best.
However, it is a real shame that it seems like we're reaching physical limits before 4K 144hz native. At least with current transistor tech and architectures.
AMD is chasing MCM style designs for their GPUs, that may be the breakthrough we need.
DLSS turns games into a blurry smeary mess of shit, no matter what DF says. Maybe it will improve over time but right now it is just not good enough and neither are the similar ones coming out like FSR.
1080p is 20 years old. Stop being giga poor
For me the latest greatest new technology is frame rate limiting, because my GPU is so fast. It's a bit moronic that every game doesn't have it.
I put a hard cap at something like 600. No use in going 1600fps in menus
As long as you're playing at 1440p and up it's not I guess
Not a meme
It's a more optimal way to get satisfactory image quality at decent framerates. As it stands, the current approach with pure rasterization and AA no longer scales particularly well. So DLSS comes in and does better by upscaling from a lower resolution
definitely not a meme
I refuse to accept this going forward. I don't expect games to run at native 8k or 16k or whatever, but 1440p and 4k should absolutely be playable without this bullshit faking it tech. It's all in the hands of game developers so god help us.
good bait thread
DLSS only increases the resolution, it doesn't change the colors in any way