4k with DLSS literally looks better than native. It softens grainy artifacts while keeping all of the intended details. I seriously don't understand how people can claim that it makes games look blurry and bad. At 1080p sure, but at 4k or 1440p? It's literally black magic. Jensen really cooked on this one.
I am trans btw
4k is reddit
games are less and less optimized because 'lol dlss will bruteforce this shit'. That's why.
>MMUH OPTIMIZATION
Optimization means TRICKS: bake shadows into textures instead of having real ones. The goal is to not rely on tricks anymore and DLSS is helping with that. Instead we now have real time, physically accurate lighting that simply can't be "optimized" because of what is done here. No tricks, pure innovation. morons
A lot of words to say "I have no idea what I'm talking about"
optimization means work*
ftfy
No tricks means less work for devs, which I don't want. Also pure/native graphics often look worse than tricked ones.
???
DLSS is just the ultimate trick, and it's making devs unable to optimize shit by themselves
You shouldn't NEED DLSS to run a game properly
This anon helps the chinese build cities out of cardboard in an attempt to make everyone's deaths easier.
that is exactly the point. we built with cheap cardboard before, now we are using real bricks, which are more expensive.
DLSS takes RAM resources from other programs. This is why games no longer have PhysX or good AI: everything is directed to DLSS. And RTX can be optimized to not drown graphics by rendering every pixel to simulate shadows and light paths
>DLSS takes RAM from other programs
>laughs in 64 gigs
All that ram to DLSS such a shame
Would be a good point if all modern games looked better than RDR2 and performed similarly on 2020 hardware; but they don't, they look like shit and get choppy on 4070 cards at 1440p.
How many dicks do you have in your ass right now?
If I count yours, then 1
>Why do you hate DLSS?
because it's a cost-saving measure for publishers: instead of optimizing games, they force you to turn that smear-inducing garbage on and to waste money on top-end hardware if you want 60 FPS.
>The goal is to not rely on tricks anymore
but DLSS IS a trick. it's shitty upscaling combined with shitty TXAA and results in a muddy, smeary mess.
Show a picture from your PC game of it being a muddy smeary mess.
>still shots
nice bait
Well the only engine that does fine is gonna get dumpstered in a junkyard and swapped for UE shit that can't do those gimmicks right
reminder this goyslop game can't even run at 60fps on 4k on a 4090
reminder this goyslop game also has baked-in raytracing and ZERO dynamic lighting despite the entire game revolving around colored magic
it's still mostly tricks, because the raytracing only works within a certain distance from you
>being able to correctly draw a 2x2 triangle is a trick
this is why you live in Bangladesh and have no toilet.
Mirror's Edge looks fricking fantastic for the era it was in. You know how? Baked lighting.
Raytracing is neat, but should not be the only solution that is used by every dev.
F.E.A.R. shows its age, but the physics and effects are still cool looking: the distortion from a grenade, the particles being blown around in office gunfights, the lights swinging side to side as the battle ends, highlighting the bloodstains on the walls.
Were the lights accurate? No, but they looked good as spotlights.
The tradeoff was that the environment was completely static. So for example on the PC version when you add all of the PhysX objects in, none of those objects can cast shadows so they look completely out of place.
Baked lighting sucks. It's a dead-end. It means less interactive games for players and more downtime for developers as they're waiting for their lightmaps to compile as opposed to doing actually interesting or meaningful things. I think the tragedy of the entire 8th generation of consoles was in part due to developers chasing super realistic global illumination and indirect lighting, which was a white elephant. But I don't care how good something static looks, it's a game and I want to interact with it. If developers are going to waste an inordinate amount of time and resources on depicting light, then they'd better let me the player do something interesting with lighting.
i get what you're saying here but in reality raytracing hasn't caused any games to actually create meaningful interactions with the gameworld aside from maybe that game Control (i wouldn't know, it seemed like mediocre tech demo slop and everyone forgot about it instantly)
and the game industry has been trending away from physics and that kind of interactive gameplay for over a decade
these games are using ray tracing to have pretty night and day cycles and thats about it
it would be nice if they came up with a more practical use for it but that would also take a lot of creativity. it's only lighting after all. the actual physics take a lot of good programming that modern devs seem either incapable of or unwilling to do
i mean look at cyberpunk, the flagship raytracing title: it has some of the worst physics of any game ever made. it was designed from the ground up so superficially that it took them like a year to stop npcs from constantly clipping into shit, and the way the player physically interacts with objects and runs into NPCs still feels jank and shitty as frick and probably always will
I bought a 3090 when it came out to see what the big deal with ray tracing was, and ended up thinking the feature was a meme. Maybe one day it'll be good, but the only good feature the 2000 and 3000 series have brought is dlss. Even now I force myself to turn the feature on just to see how small to nonexistent the change is in most games.
best bait I've seen in a while
>muh tricks
120 fames @ 4k vs 120 frames @ 4k
>one isn’t real and one is
b***h you can’t tell the difference and neither can anyone else.
You only say so because TAA is awful.
bilinear is better
This is why PC sucks: no innovation, just bigger numbers, brute force it harder. This is why console, believe it or not, is kinda better; limitations breed innovation. PC breeds cheap band-aid solutions. In a perfect world games are developed for console-level power and optimized, and then when played on a PC you just get amazingly more performance than that
Here's your innovation bro
>ps4 and xbox one were the first to go full in on the upscaling meme
holy shit anon you are right
>Consoles
>Innovation
What the frick I'm reading
Dude, upscaling and other forms of resolution scaling have always existed. Innovation used to happen when consoles still used unique hardware. Now that they're just okay PCs there's nothing special.
It helps my mediocre GPU run better and produce better looking results. The difference is large. For me, it's a nice bonus.
>This is why console, believe it or not, is kinda better; limitations breed innovation
this has only been true for the switch the past 2 generations, the SeX and snoystation 5 were flops with zero innovation aside from MOAR GRAPHIXZ
yeah, until you turn on path tracing and the entire screen turns into smearing cum. only works for basic ray tracing and rasterization, for now at least
>4k with DLSS literally looks better than native
No.
>Why do you hate DLSS?
Developers use it as a crutch for their shitty poorly optimized garbage.
Whatever gets my framerate to a solid 75 is a good thing in my books. It's easier to work around moronic devs than to teach them not to be moronic.
It's giving devs even more breathing room to bloat further. Get ready for horrors beyond your imagination within the next few years.
Fake frames, 4k is largely a meme unsupported by most of the good games that have already come out natively, and Nvidia gays are the apple gays of gayming. The only good thing these gays made was the Tegra chip so I can play tendie games for free
>Fake frames,
moron actually believes that people are walking about inside his screen
Well, let's not be unfair to Nvidia. They do have great GPU architecture, and their drivers run perfectly when you take time to optimize them. The only stupid thing they did was go balls deep into ray tracing and DLSS for some reason
>their drivers run perfectly when you take time to optimize them
so just like ayymd?
*Plus they could have a side business selling affordable GPUs but don't, because reasons. It wouldn't hurt to have one
Smeared frames. It's particularly obvious in high-motion scenes where there's lighting in a dark area and you end up with a halo around people
I like it because it makes my 2019 2070S actually viable at 4K.
I don't like the fact devs now use temporal upscaling as the only way to optimize games.
DLSS/DLAA is just better at antialiasing than any other method as of right now
I don't get it. What does this even show?
Are you saying that the round window only uses DLSS because the rest of the image is fine?
It looks more like a render bug
nta, but it looks like it's all the windows. I assume he's posting it because DLSS still requires proper setup, otherwise you get moronic bugs like this.
Temporal Ghosting
DLSS/DLAA = Temporal
>Doubt
There's too many artefacts in that first screenshot to be 4k with no AA. Did you switch DLSS and no AA, Todd?
A raw 4K image in older games with no post-processing AA or TAA doesn't look as good as you think. HL2 is a good example: while it does look really good, there is a lot of shimmer and crawling from all the wire fences and shit.
You just can't fix that without temporal AA (DLSS or god forbid..TAA)
>what do you mean picture with no AA has no AA?
>4k no AA
>4k with TAA
>4k with DLSS Balanced instead of Quality
>le hurr le durr DLSS is shit, look how different it is
Why are you morons like this?
The DLSS picture easily btfo's the other examples despite being balanced mode, that's the point, while giving a massive performance boost.
DLSS picture is the only one that fixes dithering from low quality ambient occlusion.
Sharpening artifacts aren't extra detail moron. 4k without AA looks the best because it's the original image. TAA/DLSS destructively alter it.
What do you expect? They're fricking idiots and they're probably being paid off by Jensen himself.
Yes, that's it. People who like to use a tech you can't afford are being paid off by the CEO of the company that makes the GPUs.
You still can't explain why horribly optimised games that run way worse than RDR2 or Death Stranding force me to turn fake frames on.
Devs got worse, it's a fact, all white men who know some coding got into banking, fintech and app development, no sane white male wants to develop vidya for 60k/year 100 hrs a week.
One likely explanation is that the guys are turning switches on and off and calling it game design.
For example, Death Stranding has realistic lighting and it looks more photorealistic than many games that are using ray tracing. But the sky lighting is static and everything is finely tuned by the artists so that the player always sees the most "optimal" image possible. Meanwhile, in Unreal Engine 5, you can turn on "lumen" and let the engine calculate the lighting more or less accurately based on complex calculations, and that's it.
The guys at Croteam have started using Unreal Engine 5 and the justification is that they are "game designers" and not "engine designers". Great, but if I follow this reasoning, then "game designers" don't need to know how to make 3D models, nor textures, nor music, nor sound effects, nor animations, nor fricking gameplay in many cases... The industry will move towards this idea of a few guys putting together a bunch of pieces made by others and considering it a job well done, while you have to use fake framerates and fake resolution to get the overpriced garbage to run at 40fps (and it looks worse than something made by qualified developers that runs at 120fps).
>The guys at Croteam have started using Unreal Engine 5 and the justification is that they are "game designers" and not "engine designers".
That is cope from them, as their engine wizard fricked off to Google a number of years back (which contributed to SS4 being the mess it was).
>sharpening
lol
sharpening was specifically disabled here
What do you think image reconstruction is doing? Your PC is rendering a frame at 1080-1440p then DLSS is extrapolating what it "should" look like if it were a native 4k frame. It works almost the same way as an unsharp mask, but the algorithm is smarter because of the ML training.
>What do you think image reconstruction is doing?
It renders different parts of the image every frame by jittering the viewport and uses a temporal component to combine the frames together based on motion vectors and previous frames.
Sharpening is used as a counter to temporal blurring, except Nvidia's sharpening sucks dick and it's better off without their sharpening. In this specific example the sharpening was disabled.
AMD's CAS on the other hand is the most natural sharpening you can ask for.
That's an elaborate way of not saying that DLSS tries to make a frame rendered at a lower resolution appear to be higher resolution. It introduces similar artifacting to traditional sharpening and motion blur, but less than a pure implementation of either. But in a static frame you're only seeing the sharpen-like effect.
>Sharpening is used as a counter to temporal blurring
No, nvidia uses an additional sharpen pass to counter the blur introduced from temporal scaling. Sharpening in general is used to fake a more detailed image by increasing pixel contrast.
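If you want to see what that means in practice, here's the toy version of an unsharp mask (python/numpy sketch with made-up parameter values, obviously not what's actually inside DLSS):

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, sigma=1.5, amount=0.8):
    # img: float array in [0, 1], shape (H, W) or (H, W, C)
    s = (sigma, sigma, 0) if img.ndim == 3 else sigma  # don't blur across channels
    blurred = gaussian_filter(img, sigma=s)
    detail = img - blurred  # the high-frequency (edge) component
    return np.clip(img + amount * detail, 0.0, 1.0)

All it does is boost pixel contrast around edges, which is why sharpening can make blur less visible but can't recover actual detail.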
>That's an elaborate way of not saying that DLSS tries to make a frame rendered at a lower resolution appear to be higher resolution
Anon, you asked about what image reconstruction is, stop putting words into my mouth.
>No, nvidia uses an additional sharpen pass to counter the blur introduced from temporal scaling.
So does AMD, so does Epic, so do other companies that have their own TAA solution, with Capcom having forced sharpening in all their RE Engine games on PC.
It's a well known fact for years that temporal AA introduces blur and it's not some top secret Nvidia knowledge.
t. my master's thesis was about image scaling and sharpening
>you asked about what image reconstruction is
That was a rhetorical question
>rest
Yeah. Back to the original point, DLSS/FSR/XESS is not unlike traditional image scaling. You're creating an output that might be perceptually better, but the scaling process is destructive unless it's integer scaling. A native render is always going to be better
*A native render is always going to be better because it is the original unscaled image.
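Toy numpy demo of why integer scaling is the exception (illustrative only, nothing to do with any specific upscaler):

import numpy as np

img = np.random.rand(4, 4).astype(np.float32)

# integer 2x nearest-neighbor upscale: each pixel becomes a 2x2 block,
# so sampling every 2nd pixel recovers the original exactly. lossless.
up2x = np.kron(img, np.ones((2, 2), dtype=np.float32))
assert np.array_equal(up2x[::2, ::2], img)

# a non-integer linear resample mixes neighboring pixels, and the
# round trip does NOT recover the original. lossy.
row = img[0]
xs = np.linspace(0, 3, 5)                # sample points between pixels
up = np.interp(xs, np.arange(4), row)    # resample up
back = np.interp(np.arange(4), xs, up)   # resample back down
print(np.abs(back - row).max())          # > 0: information was destroyed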
Years ago TAA used to be called temporal supersampling by some, because it is possible to render a higher resolution image over multiple frames in real time by using viewport jittering.
DLSS/FSR2 upscaling use the same principles, except starting from a subnative resolution.
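Stripped to the bone it's just this (toy sketch for a static scene; real TAA/DLSS also reproject the history buffer along per-pixel motion vectors and reject stale samples, which is where all the hard problems live):

import numpy as np

def temporal_supersample(render, jitters, alpha=0.1):
    # render(jitter) -> (H, W) frame drawn with the camera nudged by a
    # subpixel offset. each frame contributes a new sample position.
    history = render(jitters[0])
    for j in jitters[1:]:
        current = render(j)
        # exponential moving average: for a static scene this converges
        # toward the average over many subpixel positions, i.e. supersampling
        history = alpha * current + (1.0 - alpha) * history
    return history

# e.g. a 4-phase jitter pattern (real engines use a Halton sequence)
jitters = [np.array(o) for o in [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]]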
TAA means Breasts And Ass you insufferable incel.
Then again you are an nShitia drone so you're probably used to Bobs And Vagene (BAV), so I guess you get a pass this time.
>praise AMD's CAS
>get called Nvidia shill
Ok.
Anyways, where do I talk about video games with people who actually know their shit?
After 14 years I'm sick and tired of this place.
It helps when you remember the average Gankertard complaining about all this stuff doesn't game or have the hardware to try it out.
>praise AMD's CAS
also you deserve the moron for that alone.
Yeah but that only changes the predominant kind of artifacting to jitter and ghosting. It doesn't make scaling any less lossy, just lossy in a different way.
Let's see 4K with MSAA
DLSS is nice, but i hate RTX, it fricks up your framerate while it gives a minor enhancement on graphics, DLSS exists only to cover for Nvidia's frick up on RTX's optimization.
also, DLSS brings a huge problem to the table: devs are pieces of shit and won't optimize their games now, because they test their games with DLSS turned on and go "it runs fine, no further work needed". frick those homosexuals.
because i laughed at ps4/xbox one for using upscaling algorithms
my honor requires me that i also make fun of dlss
this tbqhwyf
i didn't finally get a PC to upscale
I don't. I can't tell the difference between native and DLSS Quality on 1080p so I could not care less.
>eyelet
>imagine not playing 10cm away from the screen
You're one of those morons who gaslit themselves into believing their 4k monitor is a good purchase, aren't you?
no, I use a 10 year old 1080p monitor and am fine with as low as 40fps. I can't tolerate AA though. the simple stuff is okay, but any sort of temporal anti-aliasing always looks like complete shit and smears to frick and back. I was playing talos principle 2 recently and holy shit what an offender that game is. I had to disable anything related to upscaling / anti-aliasing to make the experience playable and lost half my fps in the process
>and am fine with as low as 40fps
stopped reading right there.
I will only play at 1080p.
It's not actually 4k, nvidia tries to pretend that it is and scams people into paying 2 grand on a card to play at 1080p which they wouldn't do if there was any honesty involved. It deserves all the hate and then some.
Show a 1080p image which looks the same as a DLSS 4k from 1080p image.
I don't.
It's the one genuinely cool technology to come out of Nvidia's AI pursuits.
Frame generation can go frick itself, however.
DLSS is a better implementation of TAA, I don't hate it. And it's optional, it's not like I'm forced to use it. What I hate is that Nvidia wants me to pay a 30+% premium over the competition for the privilege of having access to this technology.
Same with Raytracing. No one "hates Raytracing", what people actually hate is that the 1080Ti 11GB was $700 and the 2080 8GB (same performance) was also $700... for the privilege of destroying your framerate in a couple of games.
At the end of the day, when you're buying a GPU or CPU, only three things truly matter: price, performance and power consumption. Everything else should be a nice bonus, not THE main selling point of the product.
it is used as a crutch instead of optimizing the game engine/ code
>It's used as a crutch
>because new games won't run on my 6 year old GPU at 4k ultra settings
>he forgets to mention that new games look worse than 6 year old games
>why don't new games look good on my 6 year old GPU?
so true!
>Look, here's one game which doesn't look as good, so that means all games.
i accept your concession
Funny, because as he talks about this technology being crucial,
game studios are using raytracing to map the shadows and then using conventional rasterization to make it perform better and look similar.
I don't have a problem with Raytracing, but in a game like Cyberpunk you're either only turning on Pathtracing or only using rasterization, because Ultra Raytrace settings barely look better while still having inaccurate visuals
The problem with DLSS 4k is that it in fact does not look better than native 4k, and neither does 1440p dlss compared to 1440p native.
You can argue its a good inbetween
>The problem with DLSS 4k is that it infact does not look better than Native 4k and neither does 1440p dlss to 1440 native.
Well that should be easy for you to show us all. Demonstrate it on your computer with the latest DLSS and using old youtube clip screengrabs
>without
Demonstrate it on your computer with the latest DLSS and WITHOUT using old youtube clip screengrabs
OP made the thread and defends it. He should be posting it. But he won't because he knows dlss will look worse
Or you could just post images showing how bad DLSS is on games running on your PC. It would make everything much easier all round.
No I don't feel like it. Plus I'm in bed jerking off while tuning back into this thread every so often
So in other words you're talking shit. Got it.
I'm just telling you the truth. Native will always beat out DLSS and FSR
>Native will always beat out DLSS and FSR
You didn't try DLSS/FSR2 with Skyrim.
Show us. You can't just make a claim and then not back it up.
Looks like blurry AI generated oil painting ass. And devs now rely on it instead of optimizing for fricks sake.
I shant be purchasing an Nvidia GPU. Perchance.
What the FRICK is DLSS?
Deep Licking Shit Show
Guess I love DLSS then
I use it whenever I can
Getting 4k 144fps on a 2070 feels like cheating
4k is for gays and morons
>1080
lmao poorgay, I bet you actually play videogames
When you use DLDSR on other games, even with a 1080P monitor, it will be like putting on glasses for the first time and 1080P will forever look disgusting. Same thing as 60hz into 144hz, you have a 144hz monitor so why would the same logic not apply to higher resolutions?
Seems like you're just a troony Black person though.
I like DLAA
Native 4K
DLSS Performance
now post a webm of him moving. you get
homie, without DLSS/FSR2 you'll still have temporal AA, and UE4's TAA has even more ghosting issues.
You know DLSS gets updates and solves these kinds of issues, right? Your webm is old as frick, from DLSS 1.0. Is this what you spergs are upset about?
I'm pretty sure that's TLOU Part 1
You can't technically solve it. The only way to reduce artifacts (ghosting, jaggies, etc.) is to output more frames at a higher resolution. Nvidia can pile up per-game hacks to minimize ghosting or blur in specific titles, but the general problem is always going to be there, because the game is being rendered at low res and (with framegen) low fps.
https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem
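tl;dr of the link for anyone not clicking: a signal bandlimited to B can be reconstructed exactly from its samples only if the sampling rate satisfies $f_s > 2B$. A pixel grid is a fixed spatial sampling rate, so geometry detail finer than about half a pixel (fence wires, hair) has nowhere to go but aliasing as jaggies and shimmer. The only real fixes are more samples per pixel (MSAA/SSAA) or the same samples spread over time (TAA/DLSS), which is exactly why everything modern went temporal.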
>now post a webm of him moving. you get
no you don't you dishonest AMD shill
Literally looks better than Native 4K. Look at the fence and people in the background.
moron without an RTX card.
>4k
>60fps
moron
???
>getting a high resolution monitor but low refresh rate
60fps to 144fps is pretty noticeable. maybe not as noticeable as 1080p>4k but definitely something to spend a bit more on for a good monitor
You need to keep upgrading your CPU for consistent 144 fps and it's not viable at all in latest demanding games. I prefer consistency.
>upscaling artifacts on the native screenshot but not on 1080p blown out to 4k
You're a funny guy.
Every time there's a post like this. Here's the webm of me taking these screenshots.
>using a Cuckbox controller
that's your problem right there.
Joke on you, I use DS4.
FSR/dlss is a godsend for VR
>how people can claim that it makes games look blurry and bad
it does
i love dlss
im able to 4k 60 on my living room tv on a 4070 because of it
if im not hitting 60 with dlss there is always frame gen.
>normal people:
>enable DLSS to improve framerate for zero noticeable difference
>enjoy game
>morons:
>ignoring gameplay to hunt for erroneous pixels to complain about
>normal people
>pooo poo pee pee
>penis penis
>morons
>pee pee poo poo
>zero noticeable difference
it's painfully easy to notice though, particularly on performance mode
I play at 4K on a 43 inch display in performance mode and the difference is hardly noticeable. It only becomes noticeable once you zoom in on the screen
bullshit
what games are you playing?
Riddle me this anons - if dlss is so good how come it looks like shit at 1080p which is the resolution the overwhelming majority of pc players play at?
>muh 4k
If you are playing at 4k you'll have the hardware where dlss isn't needed.
>but muh alan wake and cyberpunk!
Games which run badly on the most monstrous of gpus even at lower resolutions
Anon is probably sitting 6ft from the screen so they can't see shit anyway.
>If you are playing at 4k you'll have the hardware where dlss isn't needed.
lol
lmao even
The latest game with DLSS I play is Yakuza 8, but that applies to pretty much any game. Another more recent game is Ready or Not.
For any game with DLSS, I just set it to Performance mode and apply CAS through reshade, as I like a sharper image and CAS just works.
>Anon is probably sitting 6ft from the screen so they can't see shit anyway.
I sit like 1.2 meter away from display
The reason DLSS/DLAA is a compromised experience is that it applies to the whole screen, so you're degrading the quality of the entire image just for fewer jaggies, which is a tradeoff. This is why a good looking game that doesn't rely on it will look much better than a good looking game that does. Obviously, a game that was built with the solution in mind means it will probably be the best option in that instance, but it still won't be a good option, just a lesser evil.
Gaming peaked with rasterization at 240p and pixels stretched out onto monitors that blended the sharp edges with phosphor glow.
I still don't get if DLSS slows down or improves performance
latency, ghosting, fake frames, fake performance
>hmmmm lets add shitty graininess
>and a dash of smearing
>now lets increase input latency
>game looks like 120 FPS but feels like 60
wonder why Ganker hates it
Because they're contrarian homosexuals against innovation who've never seen it live.
Lol
>BUT GUISE IT LOOKS BETTER IN. MOTION
(It doesnt)
literally no one defended the original DLSS. FSR1 btfo'd it despite being a traditional spatial upscaler
>innovation
regression actually
I don't understand people complaining about seeing artifacts, do you only play games where there's nothing going on? If you're playing a shooter or fighting game or a racing game there's no way you'll notice shit like that, and why the hell would you care if you did see one fricked up pixel every once in a while?
>Looks better than native
It does not.
DLAA is hands down the best AA method for deferred rendering though.
Lol dlss looks noticeably worse, especially when you're actually playing.
The worst part was that there was some shill youtube reviewer claiming it was better than native, and in every single example it had worse blur. Is it better than running 1080p? I guess, but keep in mind you're getting worse performance, artifacting and weird visual anomalies
Looks much better on those images.
You're an idiot that doesn't even understand the whole argument then.
When you're looking at the whole picture the left will look undeniably better because it's sharper; when you look at the right as part of the whole picture it's much more blurry.
Actual fricking brainlet. Go ahead and stare at pixels more, like all those shilltubers that don't actually use it.
>but when I zoom a 200x200 pixel section of the image to a full screen 4k image I think the extremely aliased version looks better
>oh and you're the moron
fricking lol now do it from your PC poorgay
>wtf you made a point
>you're just poor!
>even though you need a stronger rig to run native resolutions
instead of calling people poor why don't you shove your shit technology up your ass and upgrade to a 4070 super, which you probably can't afford, and run games at 1440p native without having to fake it.
>I use youtube videos to make a point because I can't run it myself
>no I'm not poor, I just get really upset when people call me poor. I choose not to buy things and live like a 3rd worlder but that doesn't mean I'm not poor
lol poorhomosexual
Native will always be better.
It's a fact that cannot be disputed.
Also, the people saying the right looks better are utter morons. In native 4K, and not some cropped + zoomed-in still image, it will be nice and crisp; the right image just proves DLSS is a blurry heap of shit.
>inb4 poor
I have a 4090 and will never use DLSS.
And cars that cost 10x more are better than ones that don't, but you still get to enjoy your life with the more modest car too.
Your screenshot has a million jagged aliasing artifacts. People who aren't blind are going to use some kind of AA, and DLSS is better than TAA.
>clear textures are artifacts
this is your brain on nvidia marketing.
homie you blind.
I can fully understand the hatred for dlss. I thought it was a meme too. It took me using it to see I was being dumb about it. It's not a solution to every issue. Some games are just going to look better native, but for 90% of games it's better to just use dlss.
The game you're playing doesn't even support dlss so you can't use that as a fair comparison. Fsr just isn't that good yet dude.
there is a DLSS and DLAA plugin for it
Yeah but that messes with hud and other ui elements. It's not the same as a native implementation of it, but still a nice option to have though.
the only real flaw i noticed is it makes the main menu blurry, since letters don't have polygonal edges so it thinks those are aliased textures, but in game everything looks much better than native TAA
>Native will always be better.
Funny, i just played RE4 and injected DLSS through plugin and ... it looks 10x better
Removes all flickering and jaggies
>games look the exact same as during the ps4 gen
>they require usd 1000 cards to run right
>somehow this is progress
I hate almost everything about modern games technology. I hate how a beautiful, photorealistic game like Death Stranding runs at 144 frames per second at native resolution on my PC, for example, but much less polished games made in Unreal Engine 5 will make my RTX3090 struggle to maintain 60 fps, even at resolutions as low as 720p, because people are using brute force to try to get beautiful images instead of using the heads and talent of artists.
But the worst thing is TAA (Temporal Anti-Aliasing), which uses several frames to calculate who knows what and try to remove aliasing, and generates one of the most disgusting artifacts possible in a game, where everything is blurred and fast-moving things start to leave traces in the image. It's the worst modern cancer, and DLSS is, as far as I know, strongly based on this same solution, which is why games with it (or DLAA) all have that weird thing in the image where everything seems to be moving in an ether and leaving traces in it.
Hair looks much better with DLAA than with anti-aliasing off when zoomed in on a screenshot, but jaggies aren't really noticeable at 4k or higher.
DLAA is great on modern games because most of them force TAA anyway, so replacing it with a better/heavier algorithm that can take advantage of tensor accelerators is a no-brainer. Most of the problems come when you introduce resolution scaling.
Is there any way to force other types of AA without mods?
What do you mean? For UE4/5 games you can force disable TAA by loading engine settings. If the game doesn't natively support DLSS then you can look for injection DLLs, but DLAA/SS will be applied to the entire frame (i.e. UI) instead of just the 3D parts.
There are games where only TAA is an option, and I was wondering if, in these cases, I could apply other types of AA in some external way, I don't know.
I've managed to do this in some games, but I've installed mods for it.
You can inject post process AA (like FXAA, SMAA) with reshade to any game.
if you have gpu to spare use dldsr and get supersampling without mods
>TAA - general blur that looks like half resolution in its resolved image
>FSR - Ghosting artifacts out the ass in addition to blur
>DLSS - Blur in motion
For me it is TXAA
DLAA good
DLSS bad
Simple as
nVidbros. I am still using a 1050ti and really don't want to upgrade to a new laptop. How fricked am I? How much more time can I get out of this before it's over?
No AAA slop, mostly just shit like Timespinner/Ori/Trails of/Hollow Knight....
ANSWER ME
nvidia keeps winning
4K dlss performance setting (1080p internal) has always worked really well for me. 1080p native without dlss isn't going anywhere, so I'm happy.
ITT
More importantly, why do games look as detailed as ones released a decade ago and yet somehow require vastly more powerful hardware? I'm playing baldur's gate, which genuinely looks like it could have been released in 2014, and I'm getting frame dips under 60 with dlss set to performance
Notice how the shills can't address this point.
>get your vaseline blurry shit out of here
show a Vaseline blurry DLSS image from a game running on your own PC
DLSS (and all other upscalers) are a good crutch if you can't run at native resolution, it beats your monitor's scaler in every scenario.
The hate should be directed towards developers who can't get their games to run well because they use extremely bloated game engines.
It's fricking tragic to see Dragon Engine games like Yakuza 8 run like butter on absolute dogshit hardware while looking better than every UE5 game I've seen or played.
You guys should try using the profile inspector to enable Preset C over the default one. It makes games look that much better and sharper, at least for me personally. I can game at native 4k at 60+fps but I've honestly just accepted that dlss is here and it looks better than the blurry taa crap.
I use it as antialiasing. I would use dlaa but not everything supports that. I game on a 3090 ti and just enable it on all my games. It honestly improves the image. I don't need the fps boost in my case.
You have DALL-E generals, multiple off-topic coomer threads, twitter screencaps e-celeb threads, threads whining about zoomers and nothing else, off topic /soc/ threads and bunch of other things and you complain about discussion about stuff you directly see and use when playing video games.
next time close your eyes before attempting to play video games
Because most gamers still use 1080p. And now devs are using dlss as a crutch so even at 1080p you need to use dlss unless you have a top of the line GPU. Therefore the game is always going to look like shit.
While this is the developers fault, dlss enables them to do this so I hate dlss.
Who cares HOW it is rendered, if it IS rendered? If the game is being displayed in 4k at 60+ frames, I don't give a rat's ass if it's "fake" or "upscaled" or rendered by two wires shoved up your mom's ass.
>bought brand new PC
>4070 super
>dont know what to play
Dragons Dongma 2 is going to be kino tho
>4k with DLSS literally looks better than native
I wish I was dumb enough to fall for marketing like this. I'd be happy with every purchase.
You should try it on any modern game. It really sharpens the picture. I used to hate it too, but then I actually used it.
No, you should explain why it's bad, and why you prefer running a much lower refresh rate over a higher refresh rate.
>NOOOOOO MUH 4K IMAGE AT 100 FAMES PER SECOND ISN'T HECKEN REAL. EVEN THOUGH I CAN'T TELL THE DIFFERENCE NOOOOOOOOOOOOOO
I want you to explain you overly dramatic screeching troony.
Frame generation is not important. It works best on higher end GPUs because there's less delay between frames, letting the algorithm better predict what subsequent frames should look like - but if you're already getting a higher end GPU, why not just get one that can run whatever you want natively anyway?
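For reference, the dumbest possible sketch of the idea (toy blend; the real thing interpolates between two finished frames by warping along hardware optical-flow vectors plus an ML model, not by blending in place):

import numpy as np

def fake_inbetween(frame_a, frame_b, t=0.5):
    # hallucinate a frame between two real ones by linear blending.
    # this ghosts on any motion, which is why real framegen warps
    # pixels along motion vectors instead.
    return (1.0 - t) * frame_a + t * frame_b

# doubling framerate means interleaving: a, fake(a,b), b, fake(b,c), c...
# the fake frame can only be shown AFTER frame b is already rendered,
# so displayed fps doubles while input latency does not improve.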
Thank you for the easy win, homosexual.
If you can't explain your stupid as frick arguments then you're just a moron.
here's the scoop and i'm gonna tell ya, people who hate dlss either have 1080p screens or have cards that can't make good use of it
In ideal world
>DLSS will help older hardware play newer games
In reality
>DLSS will help us sell new hardware, because without it games are slideshows
It's cool technology but it's mandatory at this point because AAA games are garbage.
You can only reply to this post if you aren't poor.
those posts don't explain anything, moron. Just a bunch of 1080p owners validating being poor
i hate that devs are so bad now that they use the game running with DLSS as the performance target
I hate that we have to wait for people with shit GPUs, roll on the 5090 release so the gap widens even more.
i have a 3090ti moronic Black person, keep eating 3rd world contracted games up though goy
>poorgay proud of his 3 year old GPU
lol
>one of the most expensive nvidia cards with 24GB vram
>Still costs over 1K used
>poorgay
You are brown.
I sold mine for 500 ages ago.
>It literally explains you NEED DLSS for AVERAGE performance
I don't NEED it. You might need it because your PC is dogshit. Mine is top end. I like the extra frames DLSS gives and use it in every game which has it.
You're just flapping your gums because your PC is so shit it can't even run DLSS well. Fricking LOL
Jeet incompetence never ceases to amaze me. I was looking up how to's for some shit and ran across a couple of jeet channels. Holy shit. Their "how to" guide was like 30-40 minutes long going through a bunch of bullshit that at the end didn't work.
Find the white guy's channel and after a 5 minute "how to" shit is working fine.
Why are they like this?
So if I have a 1080p monitor I should just not use DLSS at all even when my card can use it?
Correct. DLSS is for increasing frame rates at 4k to enable ray tracing, which is why it has very little artifacting at 4k and lots at 1080p
You can do what ever you want. All you have to do is believe in yourself and never give up and then you can do anything!
You can use DLAA instead. If a game supports DLSS, you can use this wrapper to force DLAA.
It runs at native resolution and replaces the base TAA, resulting in a better image. But it does have a performance cost. Just don't use it in online games.
https://github.com/emoose/DLSSTweaks
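Setup is just dropping the dll next to the game exe and flipping one line in dlsstweaks.ini. Going from memory of the README, so double-check the repo for the exact key name:

; dlsstweaks.ini
[DLSS]
ForceDLAA = true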
> Why do you hate DLSS?
Shit that destroys image quality.
>4k with DLSS literally looks better than native
Sure thing.
show a 4k DLSS "Quality" image from your own PC with its image destroyed.
>Oh shit. I can't do that because it'll show I'm talking shit on my 800x600 monitor
What proof do you want? I use it and I enjoy it. You're saying it "destroys" the image, which I know from using it that it clearly doesn't.
I'm asking to see your destroyed image, and know you can't produce one because you're just a broke whining homosexual getting upset over things you don't have lol
>it's so easy to see this destroyed image that I can't produce one but instead have to go off on a tangent about audio quality
lol stupid homosexual. DLSS isn't like MP3
Frame Generation and DLSS only exist because of Raytracing. Raytracing looks almost identical to Rasterization so there's legitimately no point in enabling it.
Path Tracing on the other hand? Seems really nice, but it brings even the 4080 Super and 4090 to their knees, and it's only really impressive in Cyberpunk 2077, which people have already moved on from to newer titles.
Alan Woke 2 had PathTracing and it barely made any difference compared to rasterization, but it still took a massive toll on performance even though it looks worse than Cyberpunk's Pathtracing.
And the worst part is that this feature is only really useful in open world games but the majority of open world games that dropped either didn't have it or they implemented bullshit like rt based ambient occlusion and shadowing.
Path tracing and ray tracing are two names for the same thing. If there is a difference, it's in technicalities that aren't even really used in games, because they're not really using "ray tracing" to begin with. No game is actually rendering the entire image based on ray tracing (or path tracing, the same fricking thing) as you do in Blender, for example.
>not real frames
>not real ray tracing
>not real rendering
lol cope
i love dlss
i love nvidia
nvidia, small, family owned business
i like
>correct, it's even worse
You know frick all about audio and you know frick all about DLSS.
>I know more than you but I won't prove any of it
yes you're an inferior poorgay. At least you know your place.
Native 4K looks better.
really?
>Much more sharp clearer image
>perfect in motion
uh? lol
>I much prefer games running 40fps without DLSS than running 120fps with DLSS
>i prefer buying a $2000 gpu and playing at upscaled 1440p with visual anomalies.
Post your rig.
4090 13900k
Not him. Try making a screenshot comparison between this
[SystemSettings]
; disable anti-aliasing (UE4 cvars)
r.DefaultFeature.AntiAliasing=0
r.PostProcessAAQuality=0
; force 16x anisotropic filtering
r.MaxAnisotropy=16
; disable chromatic aberration
r.SceneColorFringe.Max=0
r.SceneColorFringeQuality=0
; disable motion blur
r.MotionBlur.Max=0
r.MotionBlurQuality=0
; disable depth of field, bloom, lens flares and tonemapper grain dither
r.DepthOfFieldQuality=0
r.DefaultFeature.Bloom=0
r.BloomQuality=0
r.LensFlareQuality=0
r.Tonemapper.GrainQuantization=0
and your settings.
see
Who cares about skyrim.
I want to see if disabling TAA breaks anything in your game, many UE4 games use dithering for shadows and foliage.
You can already see the heavy use of dithering in native 4K (TAA) screenshot.
This is why I'm asking; UE4 looks awful without AA, but temporal AA smearing is just as bad. People show static screenshots to hide that fact. With motion blur enabled by default on top, you just can't see anything anymore.
>People should show a moving screen shot
lol
No, just move camera and take a screenshot dummy.
go on then, nothing stopping you from doing it and showing us
I sold my 3060Ti and FSR is awful. So all I can do is compare TAA or FSR to SMAA or no AA.
Only UE4 game I have installed right now is SS Remake.
Screenshots won't be lined up perfectly because I'm taking them while moving camera and I have to restart the game to disable AA in config file.
I would need game with higher resolution textures but w/e.
FSR in avatar was actually hugely improved. It still had some ghosting issues but for the most part it was comparable to DLSS.
The problem is that AMD are so moronic that instead of focusing on their brand and a specific generation of cards like Nvidia did, they decided to make it available to all gpus.
It's not terrible that FSR is open source and available to everyone; it's bad that there's no point in using FSR at anything below 4K Quality.
Disocclusion artifacts are just too distracting. Even Intel has better quality of image but at higher performance cost with XeSS.
Ironically my problem with XeSS is that it's too faithful to native image. DLSS offers better stability.
DLSS is the best of all upscaling techniques, no question there. It was more impressive from the start than the shitty RT implementations in many games. DLSS1 is still what many homosexuals remember, and it was bad, but around 2.2 it was good enough to use instead of TAA.
You are genuinely moronic aren't you.
I don't like fsr either. I've seen that it's better in games like avatar, but it really is no substitute for dlss. In helldivers 2, even at 4k ultra quality, there's some shimmering and jaggies all about, whereas in games with dlss quality the picture looks so much better than native taa implementations.
It's making devs lazy and fricking up optimization, like any nvidia gimmick when it messes with games
The funniest part about people hating on DLSS is that they assume the standard for image clarity is MSAA, when barely any game released in the past 10 years supports it, the only alternatives being FXAA/SMAA, which do nothing about temporal aliasing (shimmering), or TAA, which is almost always way more flawed than DLSS.
the GTX 1080 ti was the last good video card nvidia made, and I don't remember needing dlss or any of that garbage before to run games at 4k.
Hell, even Battlefield 1 had better lighting, reflections and textures than Cyberpunk, and it dropped 4 years earlier while using the same old engine techniques from 15-20 years ago, but refined.
But oh boy, here comes anon telling me that I should use DLSS because it's the only way to experience REALISTIC LIGHTING AND SHADOWING at playable framerates.
Frick off jensen
I like DLSS, but I believe we reached peak graphics in 2016 and anything more is unnecessary.
Hitman 2016 visuals should be standard for third person AAA games, while BF1 looks seriously great.
And these games were detailed just enough that they didn't require TAA.
BECAUSE THEY'RE NOT NATURAL WHOLEGRAIN POLYGONS LIKE GOD INTENDED THEM TO BE!
I think the reason why DLSS is hated is that it's going to encourage game developers to cut even more corners on optimization, and it works heavily in Nvidia's favor, as if games didn't already run like shit, have absolutely no physics, and suffer terrible bloat
Look at games like Avatar, one of the first titles to run at 30 fps on console. The frick are they thinking? Are ubisoft incapable of releasing a single AAA game that runs at 60 fps?
Ganker is full of poor third-worlders engaging in sour grapes, how have you not figured this out by now?
I use it whenever it's available. It's keeping my 2070 alive.
However while it does increase performance and provide anti-aliasing, it's not a straight upgrade. It has some quirks. Some things will just always have trails for some reason. Webm is BG3. You rarely see any trails during most of the game, but for some reason, the dog always does. Like it doesn't know what to do with fur.
Webm is mine, not just some random bullshit from the internet. 1440p, cropped and zoomed in, DLSS balanced. Same issue occurs with DLSS quality. So not trying to shitpost here, just zoomed in to show it, and it does catch your eye while playing.
BG3 uses outdated DLSS 2 for some stupid reason. There is a mod to update it to DLSS 3, try that and check again, but I'm not sure if a 2070 can even handle DLSS 3
Or you can stop being a cum guzzling goyim and just use FSR3 with fluid frame.
AMD once again saving nvidia gpus.
Literally available in like two shitty video games. AMD is always lightyears behind.
I think he means this mod
https://github.com/Nukem9/dlssg-to-fsr3
you can literally just throw the DLL file into any game you like and boot it up homosexual.
Not my fault you bought nvidia when the 6800XT exists.
Does the same thing with the dog happen with fsr3?
Bro that's just a texture placed over a human model
as expected from a lying AMD gay, when asked for evidence or sources they leave the thread or start to shitpost. No wonder nobody gives a shit about your piece of plastic
>projecting
i dont play the shit you play tourist. go try it yourself it takes 5 mins to copy/paste
I did update it, that was the latest DLSS at the time of the webm, about three months ago.
2070 can use the latest DLSS, just can't use frame gen. It does improve the visual quality, but it still has those kinds of quirks.
Post the version and the same dog at a different angle then. There are too many schizos here to believe you blindly.
@666570901
Thanks for (You)s.
>not native looks better than native
heh, gottem
>he plays old emulators at native resolution
4k balanced/performance on 1080p screen is the best fix for blurry games so I like it.
Because dlss only works well in 4k and everyone here is a poorgay with a 1440p monitor or 1080p kek
Its a meme
Frame generation sucks dude. Everyone knows that. It's the upscaling part of it that's great. To be fair though, DLSS 1.0-2.5 sucked. It finally got good with 2.5.1. If they don't drop the feature it might be good in a few years.
>20xx cards come out
>whoa thats cool i guess i should save up
>30xx cards come out
>oh an even newer DLSS mode the old ones cant do, i guess maybe now is a good time to buy, surely they wont cuck their customers aga-
>LOL KEKED BY FRAME GENERATION
hope this 3080 lasts me because im never buying videogame hardware again, these fricking israelites man
To be fair DLSS wasn't a gimmick; if anything it's too good. While it took them some months and versions to figure out how to do the temporal upscaling properly, now it's dark magic most of the time. It will make devs lazy; soon every game will rely on some form of temporal upscaling on top of already blurry TAA.
Frame gen has too many problems; it will take many versions to solve the occlusion problems.
>new hardware has new features
#wow #whoa
iirc you can get frame generation working on 30xx cards, they just make it near impossible, basically locking it behind a paywall with hardware as the DRM
>it works,
>but it isn't very playable because it relies on 40 series hardware
The AAA market is so unbelievably narrow that I don't see why anybody would care about this shit unless they were a dev.
Everyone and their dog will use UE5 for their games, that engine hates native resolutions.
I don't get it. If not native, what does it want to run at, and what does it do at lower res?
In UE5 even 1080p is heavy to run if you enable all the main new features: nanite, lumen and virtual shadow maps. Changing graphics settings has an overall very small impact on performance, unlike in the past when the difference between Ultra and Low was huge. There are still UE5 games which scale well with settings, like pic rel.
Many mainstream GPUs can't handle native 1080p 60FPS already, and a 4090 can't do native 4k 60 in some titles either.
So games default to using upscaling as part of their presets.
UE5 comes with TSR as default but I need to check if it's even possible to disable TAA.
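Same Engine.ini trick as the paste earlier in the thread should work, assuming the game doesn't lock its cvars. From memory the UE5-era knobs are these (unverified per game):

[SystemSettings]
; 0=off, 1=FXAA, 2=TAA, 3=MSAA, 4=TSR
r.AntiAliasingMethod=0
; 100 = native internal resolution, no upscaling
r.ScreenPercentage=100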
man i pirated that game and was blown away by how cool the visuals were right off the bat. with a 3080 at 1440p it stuttered below 60fps a little but it wasn't terrible considering it's not some fast paced hand-eye coordination heavy game
unfortunately the sound was clipping in and out after a little while so i guess i'll have to forget about it and maybe buy it one day
Tekken is another new game with mandatory upscaling. Honestly it's nice as there wasn't the usual b***hing about performance. It looks really good at 4k with it set to dlss quality.
Last time I played tekken was in some arcade, but I'll get the free copy just to look at it in action. It's not like I will find someone to play vs anyway.
I know there is a demo but old habits die hard.
Huh, Talos Principle 2 without lumen still looks pretty good.
Darktide is the only game I've played so far where DLSS actually looks better than any other option, including downsampling. Fricking devs obviously designed it around TAA, which looks disgusting.
No thanks, I'll buy a console when I want israeli image quality.
You forget that on pc we have the choice of whether or not we want to use it. The PS5 uses fsr with checkerboarding to fill out the resolution. Unless someone can correct me on that last part of course.
I will play on my Xbox Series X when I want israeli image quality. What the hell do you think I pay a ridiculous premium for on PC?
You’re not far off. It uses an octagons in a technology called octagonal spatial rendering technology. Ridiculous name, yes, but the main functionality is to provide image quality based on predicted patterns. It also uses another scanning technology to determine if the user is a homosexual or not, guess what it found when it scanned you…
Anybody that says DLSS at 1440p is good is literally blind. I have tested it in many games, and no, it does not look good.
DLSS is made for 4k; anything below that and its flaws become far too obvious. It's not so bad that you absolutely cannot use it, it's still a nice thing to have, but it's not magic. I recently played Alan Wake 2 with DLSS quality to get 60fps, and while it was fine, it sure as shit was very obvious that I was upscaling the image.