Why? It was made with Sweet Baby Inc. Why would you ever even want to play it?
>1080p
Fake frames, because every upscaler looks like shit at low resolution.
But if you can't get 60FPS at 1080p without FG then you'll have a bad experience anyway.
So the only real answer is to get a better GPU, because both DLSS and FG are only really useful on higher-end cards anyway.
FSR does. DLSS is good even at 1080p (only on quality mode)
DLSS 1080p Q still looks far worse than native.
FSR is just unusable at that resolution.
>DLSS 1080p Q still looks far worse than native.
No it doesn't, the difference is small.
I think most of the people playing 1080p are also playing on the cheapest, most horrible 1080p monitors that were available 10 years ago. The kind of monitor with such slow response times that everything becomes a blur anyway if you do a 180, and the kind that kept people using CRTs into the late 2000s.
I have a decent 144Hz 1ms IPS monitor.
I'll most likely never upgrade my monitor, maybe 1080p OLED I guess.
Sorry but it looks awful in motion, TAA brainwashed people into thinking it's acceptable.
We would need much higher frame-rates to make this less visible.
>DLSS 1080p Q still looks far worse than native.
This used to be true with older versions of DLSS. Now it's not so clear cut. In some games DLSS even at 1080p looks better than native because native in 90% cases comes alongside TAA (aka alongside shit).
Thanks, guys.
Yes
You need at least 1440p monitor to make upscaling look decent
Native 4k at 30 fps is all you need
try 1440p with DLSS through DSR if you have a 1080p monitor
in the future all games will be rendered at 480p and AI upscaled to 4k
>Future
Cucksoles are already there.
tell the ai to upscale the boobs
>fake resolution, fake frames, fake raytracing
>gaming in 2023
you forgot "fake difficulty" and "fake fun"
You forgot
>fake fun
You also forgot fake stability.
You also forgot this was an old memory bug which has been fixed
Wrong, Kek. Still happens
This was actually the post-launch fix for the game’s “poor optimization on Nvidia hardware”.
Before the fix the cards would get half the frame rate of the competition, and we can’t have that now can we? 😉
The Steam forums are, as we speak, a massacre of Nvidia owners saying the game keeps crashing for them.
I don't know. I get over 200fps @ 4k with everything set to maximum. What do you get?
>things that don't show up in the average FPS graphs when reading reviews
At least your Nvidia is 15% faster even though it cost 30% more! The way it's meant to be played! Blurry textures loading in adds character and SOUL!
>hogsharts legaycy
>fake detail
>fake performance
If I can't turn off LODs and occlusion culling I don't play the game. Skyrim runs at 0.18 FPS but at least I'm seeing the game as the artists truly intended.
Neither. No game that requires any of those memes at 1080p is worth playing
Frame generation is not good, it will fuck up the input latency and make the game play like jello. Only upscale if you have to.
>Frame generation is not good, it will fuck up the input latency
Frame generation always works in sync with Reflex and that cancels out the vast majority of the input latency.
Input latency isn't such a big issue anyway. If it was you'd have people complaining about AMD cards, or about Nvidia cards before Reflex became a thing, because in those cases your games had input latency roughly equal to what framegen+Reflex gives you.
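For rough intuition, here's a back-of-envelope sketch in Python. Every number in it is an assumption rather than a measurement: ~20ms of render queue when Reflex is off, and frame generation holding back roughly one real frame so it has a pair of frames to interpolate between.

```python
# Back-of-envelope latency sketch. All numbers are assumptions, not
# measurements: ~20 ms of render queue without Reflex, and framegen
# delaying presentation by roughly one base frame so it has two real
# frames to interpolate between.

def frametime_ms(fps: float) -> float:
    """Frame interval in milliseconds at a given framerate."""
    return 1000.0 / fps

def rough_latency_ms(base_fps: float, framegen: bool, reflex: bool,
                     render_queue_ms: float = 20.0) -> float:
    latency = frametime_ms(base_fps)        # one frame of render time
    if not reflex:
        latency += render_queue_ms          # Reflex roughly empties the queue
    if framegen:
        latency += frametime_ms(base_fps)   # frame held back for interpolation
    return latency

for base in (30, 50, 80):
    print(f"{base} fps base | no FG, no Reflex: "
          f"~{rough_latency_ms(base, False, False):.0f} ms | "
          f"FG + Reflex: ~{rough_latency_ms(base, True, True):.0f} ms")
```

With these made-up numbers a 50fps base lands at roughly the same total either way, which is the argument above; at a 30fps base you're in sluggish territory in both columns, and around 80fps base the FG+Reflex column actually comes out ahead.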
If you have a base framerate greater than 40ish FPS, use framegen. If you have a 1440p display, use DLSS Quality. Only go lower than that if you're truly desperate to get a bit more fps.
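For reference, here's what the presets mean in terms of internal render resolution. The per-axis scale factors below are the commonly documented ones for DLSS 2/3 (roughly 2/3 for Quality, 0.58 for Balanced, 0.5 for Performance, 1/3 for Ultra Performance); exact values can vary slightly per game, so treat this as a sketch.

```python
# Approximate per-axis DLSS scale factors (commonly documented values;
# individual games may deviate slightly).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, preset: str):
    """Internal render resolution for a given output resolution and preset."""
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

for out_w, out_h in ((1920, 1080), (2560, 1440), (3840, 2160)):
    for preset in DLSS_SCALE:
        print(f"{out_w}x{out_h} {preset}: "
              f"{internal_resolution(out_w, out_h, preset)}")
```

Quality at 1440p reconstructs from roughly 1707x960, while Quality at 1080p only has about 1280x720 to work with, which is most of why upscaling at 1080p looks so much rougher.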
There's no reason to choose one over the other. Use both if you can. I played AW2 at 1440p DLAA without Framegen and locked my FPS to 30 for a kinomatic experience. It was worth it
I've got a 4070 Ti and a 1080p monitor, and I'm getting at least 40 FPS with medium ray tracing settings. I wanted to see if any of these memes are usable or I should just play the game rasterized.
Maybe just use it for reflections only? Alan Wake is a weird case because they precalculated the lighting, so the benefits of path traced lighting are less obvious than a game like Cyberpunk because the prebaked lighting is still really good. Path tracing just cleans up the image a lot.
Just watch the DF video and decide if it's worth it.
Ohhh, thanks. That helps a lot.
>locked my FPS to 30
Do you do this often? What games do you typically play? Not trying to shit on you or anything, legitimately curious.
Almost never. Alan Wake 2 was a special case since the DLAA Path Traced image just looked so perfect, and given the cinematic nature of the game it just made it feel like a movie. Had I been able to play it with DLAA consistently above 60fps, I probably would have just done that. I was already fluctuating down to the high 30s so it just made sense to lock it to 30fps.
I know somebody who can't tell the difference between 30 and 60 fps. I play games with him in person a lot. Once I told him to change a setting to improve his frame rate quite a lot, but he seemed unaware of the concept and didn't care at all. At first I thought people who say "you can't see more than 24 fps" were trolling, but at least for this guy it's real. I remember when we played Unreal Tournament 3, new at the time. He was getting like 37 fps and saw no need to improve it. To this day he thinks 144Hz monitors are a scam. He has only had one functioning eye since he was a child. He also always beats me in every game we play.
Explain how it's fake.
no
why does everyone get so hung up on 'fake' resolution and frames?
games are entirely fake as-is. what's 'real' about shaders, etc.?
If the end result is virtually indistinguishable from running something 'natively' but I get 50%+ more fps then I don't give a fuck. Crying about this shit is cringe.
Use DLDSR to downsample from 1440p to 1080p, add DLSS Quality upscaling on top of that. God-tier combo if you're still stuck at 1080p; anyone who runs it knows what I'm talking about.
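If you're wondering what that combo actually renders, the rough arithmetic looks like this (assuming the 1.78x DLDSR factor, which on a 1080p panel shows up as a 2560x1440 target, and the usual ~2/3 per-axis DLSS Quality scale):

```python
# Rough sketch of the DLDSR 1.78x + DLSS Quality pipeline on a 1080p
# panel; the factors are assumptions, not pulled from any one game.
native = (1920, 1080)
dldsr_target = (2560, 1440)                          # DLDSR 1.78x output target
internal = tuple(round(x * 2 / 3) for x in dldsr_target)

print("game renders        ", internal)              # ~ (1707, 960)
print("DLSS reconstructs to", dldsr_target)          # 2560x1440
print("DLDSR downscales to ", native)                # back to the 1080p panel
```

So the raw render cost is only about a 960p frame, slightly below native 1080p, plus the DLSS and downscale overhead, which is why the combo performs close to plain 1080p while looking noticeably cleaner.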
Fake resolution because fake frames cause lag.
The upscaling at 1080p is okay if you're not using anything below quality setting. Balanced and under that are visibly blurrier when you're still at 1080p.
The fake frames are a good idea only if you already have a base framerate of at least 50fps. I don't think you should listen to the anons complaining about input lag, they're dumb. The exception is if you're trying to push your framerate from 30 to 60; then they're right and the input lag will be noticeable. At least 50fps baseline before framegen.
>The fake frames are a good idea only if you already have a base framerate of at least 50fps. I don't think you should listen to the anons complaining about input lag, they're dumb.
Are you telling me you can't feel the difference between, say, 50 fps and 120 fps? Maybe you're the dumb one if your brain is that slow.
I've never seen anyone actually record this supposed "DLSS bad in motion" thing and post it alongside native so we can tell the difference. Pretty sure people just say it looks bad in motion because you can't prove/disprove it either way so it's not something you can argue against.
I'll tell you why: there is no way anyone will upload 200GB uncompressed video files for you to watch. On top of that your display matters a lot; plenty of people wouldn't see the difference because they're on an old VA panel, for example.
Temporal solutions will always have drawbacks, but some of them can be tuned by game developers depending on the type of game.
Because it's bullshit. DLSS is specifically good with motion. That's where it's better than native.
The whole point of TAA is that edges shimmer with native resolution when you're moving, and traditional AA like MSAA doesn't solve for this problem. But TAA sucks because it blurs everything and makes the image look soft. DLSS claws the clarity back but retains the temporal stability. So no shimmer that you'd get running native res, but also without the TAA blur.
We have tons of videos of DLSS in motion, specifically because FSR doesn't hold up in motion and produces horrible artifacts. So you get comparison footage with DLSS specifically in these hard cases to show what it looks like without artifacting.
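For anyone unsure what "temporally accumulated" means in practice: at its core, TAA-style AA (and the accumulation side of DLSS) reprojects last frame's result using motion vectors and blends it with the new sample. Here's a stripped-down, single-channel sketch with no clamping or history-rejection heuristics, which is exactly the part real implementations live or die by:

```python
import numpy as np

def taa_resolve(current, history, motion, alpha=0.1):
    """Blend the current frame with motion-reprojected history.

    current: HxW array of this frame's (jittered) samples
    history: HxW accumulated result from the previous frame
    motion:  HxWx2 per-pixel motion vectors in pixels (x, y)
    alpha:   weight of the new sample; small alpha = stable but prone
             to blur/ghosting, large alpha = crisp but shimmery
    """
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: look up where each pixel was in the previous frame.
    src_y = np.clip(ys - motion[..., 1].round().astype(int), 0, h - 1)
    src_x = np.clip(xs - motion[..., 0].round().astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # Exponential accumulation: mostly history, a little new sample.
    return alpha * current + (1.0 - alpha) * reprojected
```

Ghosting is what you see when stale history survives that blend on things the motion vectors don't describe well (thin geometry, transparencies); the blur is the cost of the blend itself. DLSS swaps the hand-tuned blend and rejection heuristics for a trained network, which is where the extra clarity in motion comes from.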
Go look at an object in the game that is thin, like the wireframe on a fan, or a chainlink fence. Watch carefully as you move the camera around slowly, and you'll see the thin objects slightly blurring and ghosting. As soon as you stop moving, it snaps back into sharpness. I'll admit DLSS has come a long way, but there is still some ghosting.
Okay but what's your alternative? If you use native without AA treatment, you'll get shimmering on thin edges. If you use TAA, you'll get blurring across the entire image at all times and you'll still get ghosting because it's temporally accumulated. DLSS feels like the least worst option here.
Some games do have significant ghosting problems, but that's usually a bug.
what gpu, resolution and dlss setting are you using to see this?
4090, 4k, and I see it on any DLSS setting, but it's most obvious on Performance and lower.
There isn't one and the ghosting isn't that noticeable unless I'm looking for it. I'm just saying it is there.
>Performance
DLSS Performance was made for running games at 8K 60Hz
You also need it if you want to maintain 60 fps with path tracing.
possibly. I just leave it at auto in CP2077
Fake resolution by a mile
1080p frame gen is almost unnoticeable except in Cyberpunk, that’s the only game I’ve ever seen frame ghosting in
It's all fake, you idiot. Never post again.
Shut the fuck up, you're a figment of my imagination.
I wish anime girls were real.
I went from 1080p to a 4k monitor. DLSS is a blurry mess on 1080p but on 4k I barely notice the blur. I crank it to balanced or performance presets because of that.
Native resolution and framerate or bust. If you use any dynamic upscaling, frame insertion, etc. then you lost. Simple as. If you play at 4K and hope to get more than 60fps in newer games, you're mistaken unless you tank gfx options. Hardly anything has been optimized at launch for the past several years, so figure it out or don't buy the unplayable vidjas.
>you can't see or feel the difference!
Yes I can. And it's telling that you can't. You are part of the problem that will cause desktop hardware to nosedive further than it already has.
Alan Wake 2 is garbage.
Fake frames are garbage.
Fake resolution is garbage.
The ghosting/blurring is absolutely horrible unless you're playing at 4K.
>not going to 4k any time soon
if I stick with 1440p I dont have to worry about fake frames... right?
If a game doesn't work without fake frames and fake resolutions with a new mid-range GPU then it's slop that shouldn't be played.
>1440p native 60FPS on 4070 max settings no RT
AW2 is fine.