What do you think about these temporal solutions that are so common in games these days?
Do you think they look good?
Do you wish games were less graphically demanding so that they could run at native resolutions without temporal AA or upscaling of any kind?
They're not solutions, they're shortcuts.
I can't tell a difference
DLSS on Quality is fine.
I'm pretty fucking tired of it. It's hard to care about graphics when the end result is all of them being blurry as shit.
Going back to older titles, it's staggering how much fucking sharper they are; it feels good on your eyes to play them because they look right. I swear to god TAA and DLSS/FSR fuck with your eyes: the visual fidelity is so good that your eyes expect to be able to focus on it, but since it's blurry you can't, so you end up straining your eyes playing these games.
just add sharpening?
Sharpening helps, until there's any kind of motion that is.
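For the curious, "just add sharpening" usually means some variant of an unsharp mask or contrast-adaptive sharpening bolted on after the TAA pass. Here's a minimal, purely illustrative sketch of the unsharp-mask idea (the kernel and strength are assumptions, not any specific game's filter):
```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Minimal unsharp-mask style sharpen on a grayscale image:
// out = center + strength * (center - local average), clamped to [0, 255].
// Real in-game sharpeners (e.g. AMD CAS) adapt the strength per pixel to
// avoid haloing; this is only the basic idea.
std::vector<float> sharpen(const std::vector<float>& img, int w, int h, float strength)
{
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            auto at = [&](int xx, int yy) {
                xx = std::clamp(xx, 0, w - 1);   // clamp-to-edge sampling
                yy = std::clamp(yy, 0, h - 1);
                return img[yy * w + xx];
            };
            float center = at(x, y);
            float blur = (at(x - 1, y) + at(x + 1, y) +
                          at(x, y - 1) + at(x, y + 1) + center) / 5.0f;
            out[y * w + x] = std::clamp(center + strength * (center - blur), 0.0f, 255.0f);
        }
    }
    return out;
}

int main()
{
    // Tiny 4x1 "image" with one soft edge, just to show the effect.
    std::vector<float> img = {0, 64, 192, 255};
    auto s = sharpen(img, 4, 1, 1.0f);
    for (float v : s) std::printf("%.0f ", v);   // edge contrast increases
    std::printf("\n");
}
```
Which also hints at why it falls apart in motion: the TAA blur isn't constant from frame to frame, so a fixed-strength sharpen under- or over-shoots the moment the image starts moving.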
I simply fon'
TAA is a curse to humanity, I will badmouth it until the day i die. why the fuck would you have an option to make the game blurry as shit?
fucking ruined kingdom come: deliverance for me
there wasn't TAA in Kingdom Come
there was SMAA
SMAA or MSAA are the best.
FXAA, TAA, DLSS, FSR, XESS are a scam.
>smaa
does nothing
>msaa
destroys your framerate
>>msaa
>destroys your framerate
>msaa
>looks good and does what AA is supposed to do
every other AA is just vaseline on your screen
>SMAA or MSAA
kek no, they are terrible
>every other AA is just vaseline on your screen
lol
On consoles it should be at least 1080p 60 fps upscaled to 4K using FSR2 at worst, but as it is, the games run at 720p, shit is fucked up.
For the first time in the history of games, resolution goes down.
720p was the absolute worst scenario on Xbone, while PS4 managed to be consistently 900p-1080p.
i know it's insane
but maybe a videogame should run normally at the resolution I set in the options - video settings
I miss oldschool sega games.
I miss oldschool ps1 games.
I miss games that were just fun. If a game is fun enough and going for a vibe you forgive it.
All of this modern optometrist Left or Right, Better or Worse bullshit is the absolutely wrong place to focus.
Can I drive a car through a building? Will people inside react naturally? Are there easter eggs where people will be on a balcony doing things I can interact with in some way? Can I run into a guy's Uber delivery on one map and see him yelling into his phone at the driver later on?
I'm just tired of games that don't go out of their way to be memorable. But yeah, totally, a sharper wall texture, that will do it. How about mud that actually cakes up when it rains, so you have to know the map and know where mud is likely to build up so you know where you can drift safely?
fuck off
I think it's crazy that 4k has been a thing now for almost a decade and fucking nobody is doing 4k native. What's really depressing is that we'll move onto 8k before hardware catches up and starts doing 4k natively.
Native resolutions are a thing of the past. TAA is just going to keep getting worse and worse at lower resolutions.
It's destroying PC gaming. Consoles are a much more effective and economical route if you actually want that sort of thing.
I know nothing about programming, but there's something about some of these games where they don't get the sampling right, or they purposely turn it down for performance because they suck at coding, which makes everything have a bunch of jaggies, so their solution is to make everything blurry with anti-aliasing to cover up their shitty programming
FSR is based. I can play 4k60fps on a fucking AMD GPU. Free performance
I sincerely hope you're being sarcastic...
kys tranny
nagger, please, that comparison is like glasses off on the left and glasses on on the right, though they both look bad anyway.
DLSS/DLAA is the best anti aliasing solution ever devised
only if you define anti aliasing to mean Vaseline filter.
B-but new thing good! Nvidia and AMD told me so!
DLSS quality or DLAA with negative mip lod bias and sharpening is acceptable. Anything else is garbage.
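Roughly what the "negative mip lod bias" part means, as arithmetic: when the game renders internally at a lower resolution and upscales, texture mip selection is based on the internal resolution, so surfaces get blurrier mips than the output deserves. The commonly cited rule of thumb for upscalers is to bias the sampler by log2(renderWidth / displayWidth); the resolutions below are just examples.
```cpp
#include <cmath>
#include <cstdio>

// Recommended mip LOD bias for a few internal/output resolution pairs.
// The log2 ratio is the usual rule of thumb, not a hard requirement.
int main()
{
    struct Mode { const char* name; float renderW, displayW; };
    const Mode modes[] = {
        {"DLSS Quality, 1440p output",  1707.f, 2560.f},
        {"DLSS Quality, 4K output",     2560.f, 3840.f},
        {"DLSS Performance, 4K output", 1920.f, 3840.f},
    };
    for (const Mode& m : modes) {
        // This value would go into the texture sampler's mip LOD bias
        // (MipLODBias in D3D, GL_TEXTURE_LOD_BIAS in OpenGL).
        float bias = std::log2(m.renderW / m.displayW);
        std::printf("%-28s bias ~= %.2f\n", m.name, bias);
    }
}
```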
Hijacking this thread.
Answer me, why do old games have these cool melodic catchy tunes that stick in your head, but modern games have this ambient utter shite elevator music most of the time? Is this what the zoomers want?
the same reason why everything has frame generation and anti aliasing, the people making games are all pajeets
DLSS/xess is good, dlaa is great, fsr is barely OK, taa is hot garbage
>fsr is barely OK, taa is hot garbage
>fsr over taa
Actually retarded
Yes? Taa never looked good, always turned it off in every title I could
FSR is usable at least
What? FSR has more artifacting and ghosting than fucking TAA, it completely breaks all transparencies like hair and foliage
If you care about image quality you shouldn't be using dogshit FSR, ever
FSR is sharper and looks better even with those minor artifacts. TAA actually reduces your effective resolution. Some games like RDR2 are so bad with TAA on that you literally get a better looking image by rendering at 1440p and upscaling to 4k rather than 4K native.
>FSR is sharper and looks better even with those minor artifacts
Keep telling yourself that lol
Which one is which, tranny?
nta
top TAA
bottom FSR
top is sharper
I think top is FSR, the hair looks very dithered there.
https://imgsli.com/MjE5MTIx
you can't see the difference?
I can see the difference. #1 looks better, which is FSR.
Pic related. You can switch between this and
in your browser
>stills
have a nice day you fucking nagger
Dilate tranny
FSR can be better, but only at higher resolutions; at 1080p I would prefer TAA over, say, FSR Quality, which would be 720p upscaled to 1080p.
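For anyone wondering where the "720p upscaled to 1080p" figure comes from: the FSR 2 presets are defined as per-axis scale factors, so the internal resolution is just the output divided by the factor. Quick sketch of the arithmetic (the preset ratios are the published FSR 2 ones, and DLSS Quality uses the same 1.5x figure):
```cpp
#include <cstdio>

// Internal render resolution for the FSR 2 quality presets
// (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x).
int main()
{
    struct Preset { const char* name; float scale; };
    const Preset presets[] = {
        {"Quality",           1.5f},
        {"Balanced",          1.7f},
        {"Performance",       2.0f},
        {"Ultra Performance", 3.0f},
    };
    const int outW = 1920, outH = 1080;   // 1080p output, as in the post above
    for (const Preset& p : presets) {
        int w = static_cast<int>(outW / p.scale);
        int h = static_cast<int>(outH / p.scale);
        // Quality at 1080p -> 1280x720, i.e. literally 720p upscaled to 1080p.
        std::printf("%-17s %4dx%d\n", p.name, w, h);
    }
}
```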
Why would you even use FSR on anything other than 4k and maybe 1440p?
People don't actually do this, do they?
Filename related. Hint: Look at the FPS
>People don't actually do this, do they?
That's currently the video game development meta.
>1080p I would prefer TAA than say, FSR quality, which would be 720p upscaled to 1080p.
Well of course you would; FSR is a lot better than TAA, but that's only a fair comparison when they're working from the same internal resolution.
TAA looks like absolute dog shit compared to anything.
Traditional forms of AA like MSAA only work on 3D models, while stuff like foliage is drawn using shaders
>while stuff like foliage is drawn using shaders
Can someone explain that to a retard? What is "drawing something with shaders"?
How do you draw something using shaders? Could someone explain it to me?
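Rough idea, with a toy example rather than a real engine: in this context it mostly means alpha-tested geometry. The leaf is just a flat quad, and the leaf shape comes from the texture's alpha channel, with the fragment shader discarding pixels outside it. A tiny CPU mock-up of that per-pixel logic (the mask and cutoff are made up for illustration):
```cpp
#include <cstdio>

// What "drawing foliage with shaders" means in practice: the visible edge is
// created per pixel by an alpha test, not by the quad's actual triangles.
int main()
{
    // Hypothetical 8x4 alpha mask for a leaf texture (1 = leaf, 0 = air).
    const float alpha[4][8] = {
        {0, 0, 1, 1, 1, 1, 0, 0},
        {0, 1, 1, 1, 1, 1, 1, 0},
        {0, 1, 1, 1, 1, 1, 1, 0},
        {0, 0, 1, 1, 1, 1, 0, 0},
    };
    const float cutoff = 0.5f;

    for (int y = 0; y < 4; ++y) {
        for (int x = 0; x < 8; ++x) {
            // The shader's "discard": the edge comes from texture data.
            // MSAA only supersamples coverage of the quad's real triangle
            // edges, so it never smooths this edge; that's why alpha-tested
            // foliage shimmers without TAA.
            bool visible = alpha[y][x] >= cutoff;
            std::putchar(visible ? '#' : '.');
        }
        std::putchar('\n');
    }
}
```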
Question to techies: why do most games look like utter shit with shimmering up the ass without TAA nowadays?
Vid related is from the talos principle 2. Other examples that come to mind are FF7R. There's also a strange dithering on shadows or fogs in those games.
To me, it seems to be a new technique everybody started using for some reason, but with some serious side-effects
Deferred rendering causes a lot more aliasing than forward rendering, that's really it.
FF7R is due to low res effects with no filtering
It looks like it has some sort of sharpening filter to help with the TAA's blur? I know other games have it, like RE4R
>Question to techies: why do most games look like utter shit with shimmering up the ass without TAA nowadays?
In your example it's high polygon count foliage models. In the past foliage was fake: a flat sheet with a photo of foliage on it (if any foliage at all, de_dust home). No edges to shimmer, but it fails up close when you see the brush is fake.
The modern approach is making real branches and grass from polygons so they hold up close, but that's a massive amount of edges, and edges create aliasing
So devs cover everything up in TAA vaseline because there are too many edges, SMAA / FXAA don't suffice anymore, and MSAA is too expensive? Welp that makes sense.
That reminds me of when I found out the leaves in Elden Ring were billboards. Once you notice it, it looks cheap, but I still prefer it way more to the blurry mess that is TAA and AI upscaling
more edges = more shimmer
as simple as that
dithering is for performance reasons, you don't want to render a pixel more than once if you don't have to
it will only get worse from here, there is no future in real time rendering without denoising, be it temporal or frame by frame
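On the dithering point, here's a minimal sketch of dithered ("ordered") transparency, one common way engines trade blending cost for a pattern they expect TAA to average out. The 4x4 Bayer matrix is the standard one; the opacity and image size are illustrative:
```cpp
#include <cstdio>

// Dithered transparency: instead of alpha blending, each pixel is either
// fully kept or fully discarded based on a screen-space threshold pattern,
// and a temporal filter (TAA) is expected to average it back into smooth
// transparency. Without TAA, this is the "strange dithering on shadows or
// fog" mentioned upthread.
int main()
{
    // Standard 4x4 Bayer matrix, normalized to thresholds in (0, 1).
    const float bayer[4][4] = {
        { 0/16.f,  8/16.f,  2/16.f, 10/16.f},
        {12/16.f,  4/16.f, 14/16.f,  6/16.f},
        { 3/16.f, 11/16.f,  1/16.f,  9/16.f},
        {15/16.f,  7/16.f, 13/16.f,  5/16.f},
    };
    const float opacity = 0.5f;   // surface we want to look 50% transparent

    for (int y = 0; y < 8; ++y) {
        for (int x = 0; x < 16; ++x) {
            // Keep the pixel only if opacity beats the local threshold.
            bool keep = opacity > bayer[y % 4][x % 4];
            std::putchar(keep ? '#' : '.');
        }
        std::putchar('\n');
    }
}
```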
I hate the way the industry is going. I'm also playing Talos Principle 2 like the other anon posted, and I often think about cleaning my glasses because everything looks so much blurrier than it should...
I think the idea of games with a clean image is going to disappear for at least the next 10 or 15 years, until maybe technology starts doing a good job of hiding its shortcomings.
Games used to filter each effect separately, but now they don't bother and just have TAA filter everything.
See
, you can see there how the hair, the reflection in the water, and the global illumination on the ground are just a mess of unfiltered pixels that the devs rely on TAA to 'fix'.
They now rely on TAA for both geometric (inc. alpha tested vegetation) and shading antialiasing. There's too much detail in everything to go without. IMO, more prefiltering is the solution, but that would require hardcore R&D that game studios have no spare capacity to perform.
>Deferred rendering causes a lot more aliasing than forward rendering, that's really it.
Nah, forward rendering is potentially less aliased as it's compatible with the aforementioned prefiltering (which nobody uses) and MSAA (which nobody uses anymore as it's too expensive). In modern games, forward renderers have the exact same aliasing characteristics as deferred ones.
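One back-of-the-envelope way to see the "too expensive" part: with deferred rendering every G-buffer attachment has to be stored (and shaded) per MSAA sample, so the cost multiplies. The G-buffer layout below is an illustrative guess at a typical setup, not any particular engine's:
```cpp
#include <cstdio>

// Rough G-buffer memory cost at 4K for different MSAA sample counts.
int main()
{
    const long long width = 3840, height = 2160;          // 4K target
    const long long bytesPerTexel =
        8 +   // albedo + misc        (RGBA16F or packed, assumed)
        4 +   // normals              (RGB10A2, assumed)
        4 +   // roughness/metal/AO   (RGBA8, assumed)
        4;    // depth                (D32F)

    const int sampleCounts[] = {1, 2, 4, 8};
    for (int samples : sampleCounts) {
        double mib = double(width * height * bytesPerTexel * samples)
                     / (1024.0 * 1024.0);
        // At 4x MSAA the G-buffer alone is ~630 MiB, before any other
        // render targets, history buffers, or the actual scene assets.
        std::printf("%dx MSAA: %.0f MiB of G-buffer\n", samples, mib);
    }
}
```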
Not a techie but there are multiple reasons
Over-sharpness being one
The other is mismatched resolutions, since digital displays can only do one native resolution
what does Ganker think about frame generation?
I love frame generation, nice webm
it's completely idiotic
good idea, awful implementation
it should be reactive reprojection with temporal correction, like what VR uses
VR samples inputs for the fake frames and doesn't wait for new ones, but where VR's version is very simplistic and buggy, you could push it into a proper frame gen with 0 delay that can run as fast as you want, fucking 3x the framerate, 10x the framerate, etc
also what i mean
tech to make games run a few times faster already exists, nobody uses it
>nobody uses it
Why
tinfoil hat on?
it's too good
imagine your old gpu suddenly doing 3x the framerate with 0 drawbacks, why upgrade?
better to sell some broken shit like TV motion smoothing tier crap that also creates delay for no reason, so you can sell an "improved" version next year, every year
Wrong. This is the exact same kind of reasoning that followed that one "MASSIVE IMPROVEMENT IN NUMBER OF POLYGONS" tweak. Turns out, there's always a dog buried in it.
Yeah, the dog buried here is that real frames are still better, since both corners and occlusions are still buggy with reprojection.
But that can be fixed with... frame gen. Why not combine both? Instead of fucking making fake frames, just take the real frame, sample the new controller input, displace the frame, and only then correct holes and blend motion with motion vectors (rough sketch of the idea after the lists below).
advantages over Nvidia and AMD frame gen
+less vram cost
+much faster to run
+more accurate
+0 delay, doesn't wait for the next frame
+doesn't fall apart until like 20 fake frames in a row, insane temporal coherency
and a few problems
-smooths the character's movement relative to the world, doesn't smooth objects in the world
-will be one frame late on camera cuts, but nobody will notice a 10ms delay once in a while
-leaves holes in corners of the screen that need to be filled with something
-requires a proper depth pass to work, and if the game has a lot of bullshit screen space effects it won't look as smooth as it should
-has to be done before UI, motion blur etc
Still, the temporal stability and zero delay make it 10x better than fake frame gen, and you avoid insane artifacts everywhere.
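Here's the promised rough sketch of the depth-based reprojection being described: take the last rendered frame plus its depth buffer, sample the latest camera input, shift each pixel by its parallax, and fill the holes that open up at disocclusions with neighbouring colour. Everything here (the 1D scanline, focal length, nearest-neighbour hole fill) is illustrative, not how any shipping VR or frame-gen implementation actually does it:
```cpp
#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    const int   W        = 24;
    const float focal    = 24.0f;   // assumed focal length in pixels
    const float cameraDx = 1.5f;    // latest input: camera moved sideways

    // Last real frame: a near object ('N', depth 4) in front of a far
    // background ('.', depth 40).
    std::vector<char>  color(W, '.');
    std::vector<float> depth(W, 40.0f);
    for (int x = 8; x < 14; ++x) { color[x] = 'N'; depth[x] = 4.0f; }

    // Forward-warp: nearer pixels shift more (parallax = dx * focal / depth).
    std::vector<char>  out(W, 0);     // 0 marks a hole
    std::vector<float> outZ(W, 1e9f);
    for (int x = 0; x < W; ++x) {
        int nx = x + int(std::round(cameraDx * focal / depth[x]));
        if (nx >= 0 && nx < W && depth[x] < outZ[nx]) {   // nearest wins
            out[nx] = color[x];
            outZ[nx] = depth[x];
        }
    }

    // Holes appear where the near object uncovered background that was never
    // rendered; fill them from the left neighbour (the "blurred surrounding
    // colour" mentioned above).
    for (int x = 1; x < W; ++x)
        if (out[x] == 0) out[x] = out[x - 1];
    if (out[0] == 0) out[0] = '.';

    for (char c : color) std::putchar(c);
    std::printf("   <- last real frame\n");
    for (char c : out)   std::putchar(c);
    std::printf("   <- reprojected with the newest input\n");
}
```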
What's stopping AA devs and indies is that they are in the game-making field, not the engine-making field.
What's the point for a smaller dev in spending a shitton of time supporting this when, if they're making some 2D slop, their game already runs at 500 fps, and if they're an AA dev, all their work may be outdated within a 5-year production cycle? They start working on it, and in 5 years the industry has moved to full AI generation.
If support for such solutions doesn't come from the big engine makers or Nvidia or AMD, then it will forever stay a VR-only thing.
>but nobody will notice 10ms delay once in a while
Most people don't realize a full second is 1000ms.
But there has to be some actual technical catch, after all what's stopping AA devs and indies. It doesn't make sense this way anon.
That's just motion blur with more artifacts
motion blur doesn't update your input faster, this does
you can make 30 fps "feel" like however much you want, even if it looks like a lower fps
and ironically you could hide the artifacts with motion blur and just color extrusion
same fps, but now you get 5ms input response instead of 33
Cool tech demo, but it doesn't even apply VRR, so the whole image is tearing and I don't know if the artifacts I'm seeing are from screen tearing or from this reprojection. Also it's a great way to hear coil whine, damn they sing so nicely at over 2000 FPS.
artifacts are from edge disocclusion, TAA and frame gen have exactly the same problem, in TAA it's the reason for ghosting if it's not masked out properly
and in frame gen it creates this: holes that are filled with just blurred surrounding color
here you see it more since it's for presentation, a good solution would do one of those things to fill it
no idea what it has to do with VRR, VRR is display based, not engine based
Nah, you can set this demo to use rendering without fake frames, it just doesn't trigger VRR. This has nothing to do with the artifacting you are talking about. I can't use FSR2 because I notice those shimmering edges of fast moving objects all the time.
yeah, no frame caps, and it runs in a window, so it renders like 10 frames per real one
frame limits, vsync and VRR support would work the same as everywhere else
Looks like the terrible shit they have on TV's that I always disable
Interpolation. Sometimes advertised as smooth/fluid motion. It's been around for over a decade and is always barftastic.
>Do you wish games were less graphically demanding so that they could run at native resolutions without temporal AA or upscaling of any kind?
Yes. For the past 5 years or so games have been getting worse-looking as technology advances. 6th and 7th gen games unironically look better, and older graphical solutions looked better than what they're doing now.
I don't play glorified tech demos that hammer my GPU so I don't have to worry about any of this shit 🙂
TAA wildly varies on the game. Ratchet and Clank has good TAA meanwhile Resident Evil 4 Remake has awful TAA. Some devs try, others say fuck it and just slap on DLSS/FSR and hope that slop version is good enough.
Ackshually, the RTX 4090 is more like $2400.
1707X960 looks worse than just running at native 1080p
All 3 look absolutely terrible. I can't believe that's the best they can do.
MSAA does nothing. Anyone arguing for that worthless waste of performance is an idiot.
DLSS quality at 1440p looks identical to native while doubling the framerate
Cool story bro.
TAA is literal fucking shit, DLSS Quality is ok at 1080p but is still noticeably lower quality than native at this res
Just give me a render scale option for every game so I can crank it to 125-150%, then aliasing is nowhere near as bad.
Use DLAA
Only enforceable if the game has DLSS or already has it added right? Just reading into it a bit more right now
>What do you think about these temporal solutions that are so common in games these days?
Shit. TAA is an abomination, and AI upscaling is a terrible hack that shouldn't exist.
seriously what in the FUCK has happened to video game hair
FF7R's hair looks great when you bump it up to 4k and down sample it. If that's what TAA can do, can make CGI quality hair, then I like TAA.
>Do you think they look good?
lol no
upscaling is a minimum 34% loss in render scale, no amount of ai shit will fix that
all post process aa is terrible, so much so you need another post process sharpener to offset it
graphics are a complete meme at this point, we're going backwards to playing games at 1024x768
>having to compare the DLSS and FSR trash to TAA so it doesn't look so bad
LMAO, that's how you know it's garbage, applying Vaseline to native resolution is the only way for your trash tech to compete.
And all this shit is just a "solution" to the absolute disaster that introducing ray tracing into gaming too early caused.
Nvidia is a shitty company that has actively hurt gaming as an industry.
DLSS quality is literally just a good anti aliasing with free performance
>TAA blurry mess
>DLSS oversharpened and artifacted mess
>FSR even more blurry than TAA with more artifacts than DLSS
Fuck this gay ass shit. I don't even give two shits about graphics anymore; stuff looked fine 5-6 years ago. I just want high frame rates, not this janky ass AI upscaling and frame interpolation.
>newer ue 4 versions support taau
>use it in a game that uses new version
>scale to 200%
>it's still blurry as fuck but disabling TAA causes shimmering on wooden plank floor
fuck this shit tech
I hate all of them. I hate temporal filtering. I hate upscaling. I hate deferred rendering. I spent years upgrading to play games at native res with supersampling and just as both frame hits and transparency were saved by our blessed SGSSAA, deferred rendering took over and we were forced into the hell of Vaseline filters forever except they've gotten even worse now by smearing frames that don't exist.
Fuck everything.
>people posting individual screenshots to justify upscaling algorithms
At least be fucking honest in your shilling. You can cherry pick a decent corner of one screenshot; these techniques (and other temporal techniques like TAA and Lumen) immediately look like shit the moment things are put into motion. And if you can't tell the difference, you're fucking lying to yourself. You may think it's worth it, but they are not replacements for native rendering.
lumen looks good, but needs light probes to fix fast moving scenes or rapid camera cuts, relying on ray tracing alone is always retarded if you can't afford more than a handful of samples per frame and need to accumulate over an entire second
Also spoilers: temporal techniques have always sucked. Interlacing is a scar on almost two decades worth of content.
why does nobody talk about XeSS? it's better than FSR2 and can be used on any GPU. it also combines excellently with FSR1.
Hardly any games use it, but I agree, it looks nice.
The performance hit on non-Intel GPUs is not worth it most of the time.
You really need to drop the resolution scaling to see any performance benefit on those GPUs.
I've only tried it in R&C and I wasn't impressed.
It's kind of crazy how hard hardware manufacturers push 4K while consoles can barely render at 900p 30fps + upscaling, and most PC gamers don't even have a 4K monitor in the first place
>TAA
I want to murder whoever forces this shit in games
>upscaling
It's good at 4K but a meme for everything else. Devs are using it as a scapegoat to avoid optimizing their game.
TAA looks like hot fucking dog shit
what's even the point of this shit? in ttp2 all this shit was on by default and with the blurring whenever I was in motion the screen was just a blurry soup. is this really innovation in 2023?
It's an excuse to be lazy with game development and to normalize $2,000 GPUs.
It all started with Nvidia pushing RTX.
>Wow, look at these goytraced effects!
>Oh... Your game now runs at 22 FPS? Don't worry goy, just use DLSS! Remember, it's better than native!
Then the problem got worse with consolefags wanting in the raytraced shit too, they get it a lot worse.
>Here, now your PS5 and XSX games get shitty and grainy raytracing
>I guess we'll have to upscale from 1280x720 now...
>Enjoy your 4k* gaming!
It's truly awful, maybe it'll get better in a few years, but for now we'll have to suffer blurry messes.
We are in a terrible period where we can now do real time ray tracing but we need stronger hardware and software optimizations
They look awful and anyone who isn't blind should be able to tell the difference.
I don't care for them. I jumped from a 1080 to a 6800 over the weekend, and in Starfield, I was able to turn off all the frame-gen/re-render crap. At 1080p ultra, the game looked crisp, visible, and legible in motion. What really gets me is one of the first things you see when you get to New Atlantis. The scrolling sign above the guard shack. With FSR and such on, it's a smeary, illegible mess. It's perfectly readable without all of that on.
Motion is the bane of any kind of temporal AA solution, especially these modern upscaling techniques
not so sure, i have seen denoising algorithms take an incoherent noisy render in Maya made only from dots and turn it into a stable video like magic
i think it's a matter of trying to do it in real time, you are very limited in what you can do, but with more resources you can do some real magic
also i think the bigger problem is not motion but vram, since gpu manufacturers cheap out on ram so much you can't store more than 1 frame, that's why you get ghosting, you drag a mistake along for a million frames
you need 5 frames in memory for good results, plus a depth pass and motion vectors for each, then you don't create ghosting as you always use the raw data for the final output, and you also reduce blurriness and disappearing detail
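Quick arithmetic on what that 5-frame history would actually cost in VRAM at 4K. The formats are assumptions (HDR colour as RGBA16F, D32 depth, RG16F motion vectors), purely to get an order of magnitude:
```cpp
#include <cstdio>

// Rough cost of a 5-frame history buffer at 4K: colour + depth + motion
// vectors per frame, with assumed formats.
int main()
{
    const long long width = 3840, height = 2160;
    const long long colorBytes  = 8;   // RGBA16F (assumed)
    const long long depthBytes  = 4;   // D32F
    const long long motionBytes = 4;   // RG16F
    const int       frames      = 5;

    long long perFrame = width * height * (colorBytes + depthBytes + motionBytes);
    double totalMiB = double(perFrame * frames) / (1024.0 * 1024.0);

    // Around 630 MiB just for history, a real chunk of an 8 GB card, which
    // is the point the post above makes about vendors cheaping out on VRAM.
    std::printf("per frame: %.0f MiB, %d-frame history: %.0f MiB\n",
                double(perFrame) / (1024.0 * 1024.0), frames, totalMiB);
}
```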
I've been a PC gamer since the 90s and I don't know what any of this shit means nor do I care.
DLAA is the best followed by DLSS
the others are all shit
I'm guessing we're at least 10 years away from realtime upscaling actually being good and not a blur filter.
For me? It's SGSSAA.