how is it killing pc vidya?
a new age of bad ports
do you really think that's down to dlss only?
He didn't say that.
DLSS is going to be the biggest excuse for devs to never bother optimizing their games. They won't care if the game runs at 10 FPS and stutters like mad on every shader compile no matter the user's specs; just dynamically downscale the resolution to 640x360, let their "fake frame" interpolation fill shit in with terrible graphical artifacts, let Unreal 5 dynamically downscale their bloated 5,000,000-poly models.
games already run like shit even with these technologies available and active; you think developers would bother optimizing them if these didn't exist?
unhinged discord dweller spotted
At what point does nvidia just start suing devs for damages? They're losing money and having some of their worst sales, and it all traces back to devs doing such a bad job that their cards can't even run the games anymore.
I used to mess around with resolutions and renderers when the first Unreal game came out and did a lot of fine tuning to get a reasonable experience. It sure as hell wasn't "optimized" at release. I don't see DLSS as a crutch, just more of the same resolution fuckery I would do back in the day but with an automatic transmission instead of manual.
There are plenty of bad ports without DLSS
upscaling is fine
frame generation will kill vidya thoughever
>saves pc vidya forever
>add a whole new tier of computing power to your "gaming" cards
>its just for ai shit
no thanks jacketman
>everyone hates interpolation and scaling on tvs
>lets... le bring it to gaming
Who thought this was a good idea?
more like saves gaming
In the past you could get away with using lower resolutions on a CRT. Let's say you had a 15 in 1600x1200 monitor - you could use 1024x768 and still get an acceptable picture.
But with LCDs? Try setting the resolution to 1280x720 on a 15.6 in 1920x1080 laptop monitor. It's going to look like shit even though it should look okay, because it doesn't scale cleanly onto the pixels that form the physical screen.
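To put rough numbers on that (a quick sketch of the scaling arithmetic, nothing game-specific): a CRT just draws whatever resolution it's fed, but an LCD has a fixed pixel grid, so a non-native image gets stretched by the panel's scaler, and a non-integer stretch factor means each rendered pixel smears across a fractional number of physical pixels:

```python
# Why a non-native resolution looks soft on an LCD: the stretch factor
# from rendered to physical pixels is often non-integer, so rendered
# pixels can't map cleanly onto the panel's fixed grid.
def stretch(native_px: int, rendered_px: int) -> float:
    return native_px / rendered_px

for rendered in (1280, 960):
    f = stretch(1920, rendered)
    kind = "integer -> maps cleanly" if f.is_integer() else "non-integer -> blurry"
    print(f"{rendered} wide on a 1920-wide panel: {f}x ({kind})")
```

1280x720 on a 1080p panel is exactly the 1.5x case described above; 960x540 would at least scale 2:1.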
With things like DLSS and FSR you can finally make use of lower resolutions again, and it's great. Is it a substitute for native rendering? No. But if you can't play at higher resolutions or want a higher frame rate, it's great.
Those of you blaming DLSS and FSR for poor PC ports... the ports would still be horrible without them. Only difference being that you would have even less options lmao
1080p has been the standard for more than a fucking decade; unless you're some turbo pajeet who can't afford a card capable of 1080p60, it just makes the game look worse.
>the ports would still be horrible without them
They would still be bad, but now they're outright unplayable even on high spec hardware.
They're already putting DLSS and FSR in the system requirements; there's no coming back from this.
The requirements would just be higher if FSR didn't exist lol
Either that or they'd have the same requirements listed but for lower resolutions
No, you stupid moron, they're getting away with it BECAUSE of upscaling. I hate you stupid mustard rice redditors so much.
they're not getting away with shit retard, there's a shitstorm every week with people bitching and moaning about the newest shit pc port
you make it sound like tlou1 and jedi survivor didn't get eviscerated for being shit ports
What killed PC gaming is black people in games
Who wear pants and don't expose their mayo molesting massive malarkeys
Don't you love stagnation?
yeah bro i got your DLSS "quality" right here
You just figured that out now?
Yes: quality renders at roughly half the native pixel count, balanced at roughly a third, performance a fourth, ultra performance a ninth.
All of these can look fine as long as you're using an appropriate display.
For example, as far as DLSS is concerned, performance mode is adequate for a 14 in 1080p monitor or a 27 in 4K monitor, balanced for a 15.6 in 1080p monitor or a 32 in 4K monitor, and ultra performance could work well on a laptop with a 4K monitor or on a 32 in 8K monitor.
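If you want to sanity-check those fractions, here's a quick sketch. The per-axis scale factors are the commonly cited DLSS 2 defaults (Quality 0.667, Balanced 0.58, Performance 0.5, Ultra Performance 0.333), not something stated in this thread, and the pixel count goes with the square of the axis factor:

```python
# Internal render resolution per DLSS mode, assuming the commonly cited
# per-axis scale factors; actual per-game values can differ.
MODES = {
    "quality": 2 / 3,           # ~44% of native pixels, "roughly half"
    "balanced": 0.58,           # ~34%, "roughly a third"
    "performance": 1 / 2,       # 25%, "a fourth"
    "ultra performance": 1 / 3, # ~11%, "a ninth"
}

def internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
    """Scale is applied per axis; pixel count scales with scale**2."""
    return round(width * scale), round(height * scale)

for mode, s in MODES.items():
    w, h = internal_res(3840, 2160, s)
    print(f"{mode:>17}: {w}x{h} ({s * s:.0%} of native pixels)")
```

At 4K output that works out to 2560x1440 internal for quality and 1280x720 for ultra performance, which is why denser displays give the lower modes more headroom.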
DLSS looked worse than native in every game I've tried so far
???
DLSS Quality 1080p absolutely looks better than raw 720p.
Unless you mean better than the resolution of your display? Then sure it's never going to be better. But the higher your ppi the more you can get away with using it.
1080p DLSS Quality won't be very good on a typical 24 in desktop monitor, but it can be pretty damn fine on a smaller laptop monitor
>All of these can look fine as long as you're using an appropriate display.
Nah, DLSS looks like shit at anything below 4k.
Anyone using it at 1440p or lower is just coping with their weak GPU.
Yeah I got a 3080 with a 4K screen and I use it quite often, looks great. Shit I even use it with Skyrim now that there's a plugin that supports ENB.
It doesn't; you don't understand how it works. Everything is proportional.
You perceive it to be "better" when using a 2160p output resolution because 2160p monitors tend to be high ppi so they have plenty of headroom for shit like this.
You can have high ppi monitors with lower resolutions as well. Try using DLSS Quality on a hypothetical 24 in 1440p monitor, or DLSS Quality 1080p on a laptop for example. Not going to look bad at all.
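The ppi argument is easy to check with the usual diagonal formula; the sizes below are just the displays mentioned in this thread (the 24 in 1440p one is the same hypothetical as above):

```python
import math

# ppi = diagonal resolution in pixels / diagonal size in inches
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

displays = [
    ("24 in 1080p desktop", 1920, 1080, 24),    # ~92 ppi: artifacts easy to spot
    ("15.6 in 1080p laptop", 1920, 1080, 15.6), # ~141 ppi: far more forgiving
    ("24 in 1440p (hypothetical)", 2560, 1440, 24),  # ~122 ppi
    ("27 in 4K", 3840, 2160, 27),               # ~163 ppi
]

for name, w, h, d in displays:
    print(f"{name}: {ppi(w, h, d):.0f} ppi")
```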
>24 in 1440p monitor
>laptop
No one cares about your 50 inch monitor, bro
I'm just trying to explain that DLSS has uses in resolutions other than 2160p, depending on the display in question.
dlss 3 is saving pc gaming because devs have gone full retard with their cpu usage
it's so bad that people go out of their way to pay a modder to get dlss 3 in games like jedi survivor in order to get a smooth frame rate
it's funny because dx12 was supposed to bring us better optimization, but instead it brought worse, because devs don't know or care enough to make their games run better
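That's the gist of it; here's a toy model (numbers made up purely for illustration) of why lowering render resolution does nothing once the CPU is the bottleneck, while frame generation still roughly doubles what gets presented:

```python
# Toy model: delivered fps is capped by whichever side is slower.
# Upscaling only speeds up the GPU; frame generation inserts an
# interpolated frame between rendered ones, so presented fps is
# roughly (slightly under, due to overhead) double the rendered fps.
def delivered_fps(cpu_fps: float, gpu_fps: float, frame_gen: bool = False) -> float:
    rendered = min(cpu_fps, gpu_fps)
    return rendered * 2 if frame_gen else rendered

cpu_fps = 45        # hypothetical CPU-bound game logic / draw-call cost
gpu_native = 60     # GPU throughput at native resolution
gpu_upscaled = 110  # GPU throughput with upscaling

print(delivered_fps(cpu_fps, gpu_native))                    # 45
print(delivered_fps(cpu_fps, gpu_upscaled))                  # still 45: CPU-bound
print(delivered_fps(cpu_fps, gpu_upscaled, frame_gen=True))  # ~90 presented
```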
I remember when Ganker used to make fun of consoles last gen for using checkerboard rendering.
I love how it was supposed to solve optimization problems, but it turns out it added more problems than it solved.
The next generation of consoles will feature mandatory frame interpolation and upscaling and I'll be here to laugh at niggles defending it.
i honestly don't even know what the fuck dlss or tsaa or whatever the fuck these god damn acronyms even mean or what they do
kills console gaming you mean
you can always turn it off on pc
Nothing wrong with upscaling
Nothing wrong with frame interpolation
I have played path-traced Cyberpunk on my RTX 4080 and the experience was mind-blowing.
>pays $1200 for fake frames and a lower resolution
Good goy
I gave up on that shit because it has tons of visual bugs
i tried turning it on in hitman on the "quality" setting and couldn't see the difference
>have an i9-13900k, 32 gigs of ram, and an rx 6900 xt
>dont have to care about fsr/dlss and just run practically everything at least at 1080 if not 4k ultra quality 60 fps
love my 6900xt, first halo card I bought, definitely worth it
Based 6900 XT bros.
>Consoletard is tech illiterate.
Every time.
I use DLSS Quality at 1440p and I cannot tell the difference no matter how hard I look. It's more fps, so I'll take it.
Since this is the designated upscaling thread: does FSR have an AA-only mode like DLAA?