I play the game on my laptop at 2k, with the DLSS render resolution at 75%, which means 1080p rendering. I get 70-100fps with full path tracing with FG on and 30-60 with it off. On a laptop. So you can end your hyperbole right there. On a desktop with a 4090 I wouldn't be surprised if you could render at 2k (4k with DLSS quality) and hit a consistent 90-120.
I just got into pc gaming with an amd gpu.
i kinda feel like i made a mistake. can you not activate dlss on amd, or does it just not work as well as it would on an nvidia gpu?
AyyMD is getting mogged in software so hard by NoVideo and Incel that the entire situation is barely funny and mostly dreadful. Do not expect AyyMD to 'catch up' anytime soon
2 weeks ago
Anonymous
That's disappointing to hear.
...I'm unironically thinking about replacing my 6600 xt now
You buy AMD to save money and get better raster performance if you don't give a shit about RT, which you shouldn't. Even nvidia users have to cop out with AI upscaling and AI generated framerates because (most) games run like fricking trash with RT.
There are a few where RT makes a noticeable visual improvement without absolutely butchering the performance, like Metro Exodus and more recently Dragon's Dogma, in which case you still wouldn't need an overpriced nvidia board
Nvidia GPUs have dedicated tensor cores for neural network inference (and training, but VRAM is arbitrarily limited to make you fork out more $$$ for that). Normally when you run a game, these tensor cores aren't used; with DLSS, they are upscaling the last frame while the rest of the GPU works on the next frame. So essentially DLSS is "free" performance whilst FSR has some overhead as it's basically just a set of compute shaders that run on the same hardware as the game itself.
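A toy throughput model of that point (made-up millisecond numbers, not benchmarks): a compute-shader upscaler shares the shader cores, so its cost adds to every frame, while upscaling on separate units can overlap with the next frame's rendering.

```python
# Illustrative pipeline model, not real GPU timings.

def fps_serial(render_ms, upscale_ms):
    """FSR-style: upscale runs on the same shader cores, so costs add up."""
    return 1000 / (render_ms + upscale_ms)

def fps_overlapped(render_ms, upscale_ms):
    """DLSS-style: upscale overlaps the next frame's render, so steady-state
    throughput is limited only by the slower stage."""
    return 1000 / max(render_ms, upscale_ms)

print(fps_serial(10, 2))      # 1000/12, about 83 fps
print(fps_overlapped(10, 2))  # 1000/10 = 100 fps, the upscale is "free"
```

This is of course idealized; real DLSS still has some per-frame cost, but the overlap is the reason the overhead is much smaller.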
It sounds to me like it's just straight up better to get an nvidia card since even if later down the line games start getting too demanding for it, I can simply turn on DLSS and play with increased resolution and frames.
games ARE already too demanding for it.
Full path tracing will get you less than 60fps at 1080p native on an RTX 4070 Ti, an $800 card.
With DLSS 3.5 you can get 100fps at 1440p, and the image quality is actually better due to the AI denoiser.
2 weeks ago
Anonymous
Yeah but what about the alternatives?
I assume the $800 AMD equivalent would be the 7900 xt. Can it run full path tracing as well as the 4070 ti? Can you enable DLSS like you can on the 4070 ti? I guess the answer to both is no.
So then why would someone get the 7900 XT when you could get the 4070 Ti and have it last longer thanks to all those nvidia ai things?
so to answer your question: no, and I don't think there will be an AMD competitor to DLSS for a long time. The newest AMD processors have AI accelerator cores, but only in their CPUs; nobody knows yet if there will be equivalent cores in the GPU dies as well.
What is likely is we'll see a neural network based upscaler for their integrated GPUs first using the new tensor cores on their CPUs and then the same upscaler on their next gen GPUs after.
>FSR has some overhead as it's basically just a set of compute shaders that run on the same hardware as the game itself
The frames are AI-generated, though, right? Why wouldn't they use the tensor cores?
You seem to know your shit so maybe you can answer this tangentially related question. Why did Nvidia switch to a generic model for DLSS instead of training a bespoke network for each game? And why is the general consensus that DLSS got better since they stopped doing that? Common sense tells me that a network trained on high resolution renders of the game's assets would give far better image quality, since it could essentially patch together a high resolution frame from bits and pieces it has learned.
>Why did Nvidia switch to a generic model for DLSS instead of training a bespoke network for each game?
There is something known as overfitting, where a neural network can overtrain on a set of data, "memorizing" the answers instead of coming up with general solutions. Then there is the problem of getting good training data in the first place. Training on one massive database of general images is better since you're less likely to hit edge cases outside of said training data.
For example, say I were to provide training data for a video game but didn't include anything with a chain fence, and chain fences are featured in the game; then the model might do a bad job of upscaling fences. Bad or small training data means there are going to be a ton of edge cases.
Each game having its own upscaling checkpoint would undoubtedly create better upscales, but doing it properly would be costly in terms of man hours (you need to generate the training data somehow, either by having someone play the game or some kind of automated process built into the dev tools) and you need to allocate compute to train that model. Creating one model to rule them all, maybe with a few configs that developers can play with, is simply easier.
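A deliberately silly sketch of the memorization problem (hypothetical quality scores, not real DLSS data): a model that memorizes its training scenes wins on those and loses on anything it never saw, like the chain fence above.

```python
# Toy "upscaler quality" scores per scene type. All numbers invented.
train = {"road": 0.9, "face": 0.95, "foliage": 0.8}

def memorizer(scene):
    """Overfit per-game model: perfect recall on training scenes,
    garbage on anything outside the training set."""
    return train.get(scene, 0.3)

def generalist(scene):
    """Model trained on a broad corpus: never perfect, rarely terrible."""
    return 0.85

assert memorizer("road") > generalist("road")                # wins on seen data
assert memorizer("chain_fence") < generalist("chain_fence")  # loses on the edge case
```

Which is the trade-off the post describes: per-game models beat the generic one on their own content, but the generic model has fewer catastrophic edge cases.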
The secret is that DLSS still looks like shit in motion.
The most important thing is still raster performance which AMD does better at any price point. The only way it makes sense to think you fricked up is if you bought a 7900XTX and thought you should have bought a 4090.
Ray reconstruction does generate better reflections but it's slow as frick at resolving noise in general at least in its current version in Cyberpunk. It leaves a shit tonne of smearing, ghosting and trails whenever you move the camera or something happens in front of you. The old denoiser (I forgot what it's called) is much clearer in this regard even if it sometimes misses details.
>ray tracing
>it's just better reflections
>doesn't actually improve realtime lighting and shading because that would be expensive as hell
>cards cost 50-70% more gen on gen for this and AI snake oil
You're commenting on an image of cyberpunk you absolute bellend. Cyberpunk has pathtracing.
2 weeks ago
Anonymous
>Cyberpunk has pathtracing
Path traced reflections, you nonce. They can barely implement realtime path tracing in Quake 2 and Minecraft.
2 weeks ago
Anonymous
It's global illumination. Are you really this moronic?
2 weeks ago
Anonymous
you are so fricking stupid holy shit, path tracing in CP is lighting too, it's a huge fricking difference visually vs normal ray tracing, the biggest difference is that path tracing doesn't leak light thru objects (not shown here)
2 weeks ago
Anonymous
People will look at this and unironically tell you to buy $1k+ graphics cards, lol.
2 weeks ago
Anonymous
they're both running on 2k gpus
2 weeks ago
Anonymous
this is a thread about DLSS/Ray Tracing, we all here agree CP is dogshit
2 weeks ago
Anonymous
>buy $1k+ graphics cards
kek I run it fine on my $200 2070 Super with frame gen
>that looks like shit and makes zero sense in the scene
>Anon, you fell for the Nvidia marketing. That's hardly path tracing.
>moving the goal post
pottery
2 weeks ago
Anonymous
that looks like shit and makes zero sense in the scene
2 weeks ago
Anonymous
what do you mean? the light is bouncing around and lighting up a very small area, that's how it works irl, especially since it's tiles and glass
2 weeks ago
Anonymous
No it wouldn't. Look at the top: it's an open grated area with a cloudy daytime sky.
The right scene would only make sense in the dead of night with no other light sources.
2 weeks ago
Anonymous
You literally don't have a fricking clue, do you. Path tracing mode in Cyberpunk uses fully path-traced rendering. It uses some AI tricks to essentially "upscale" the ray count and denoise it, but it's path tracing. Even games without path tracing have ray-traced shadows, ray-traced global illumination, ray-traced skylight, etc. Who the frick told you it's just reflections, and moreover, why the frick did you think it was worth posting your opinion on something you have literally never used and know literally nothing about?
2 weeks ago
Anonymous
>Anon, you fell for the Nvidia marketing. That's hardly path tracing.
left side is DLSS off - 77ms latency
right side is DLSS quality 37ms latency
anything else you'd like to see?
>in modern games, it's a 1080p card at best,
EVER WONDER WHY THESE BLATANT GUERILLA MARKETERS ONLY EVER USE STATIC IMAGES TO SHILL THEIR $1K GPUS WITH BROKEN FEATURES? BECAUSE IT LOOKS FRICKING HORRIBLE IN MOTION
This has been mostly solved since ray reconstruction was introduced into Cyberpunk. The ghosting was caused by the denoiser, not DLSS. Solved by using DLSS for denoising as well.
>you also have ghosting with any sort of temporal AA
And this is why we turn off TAA
And if we can't turn it off, that's a no buy
TAA, DLSS, FSR, XESS, it's all crap
>industry standard
they can enjoy their low sales and impending collapse then, when even normalgays see that old games looked better than this new crap
>tech illiterate
Just because a tech is newer doesn't make it good. This temporal stuff is just a fad like plenty of others in the past.
Bloom and lensflares, 3D goggles, physx, hairworks, all fads that were used to sell us new hardware.
And AI upscaling and framegen have the added "benefit" of making the image less clear. All the effects that result in less clarity don't have staying power, just like desaturation, piss filters, motion blur, chromatic aberration and other such nonsense.
Both are pretty shit water reflections.
Water that deep wouldn't reflect the sky clearly at all; you'd still see the concrete under the water, so everything would be dulled out except the brightest light sources, which would be blurry.
So left is actually more realistic.
The fact that bullshit like FSR and DLSS is a "recommended" spec for 1440p on all but the most morono-expensive cards is insane. Wouldn't it be more cost-effective and easier on the devs if you just put a decent amount of RAM on GPUs whose dies aren't 50% half-precision compute cores? Considering that a x060 GPU just doesn't cut it at native modern resolutions, why not just discontinue the fricker and focus on x070/80/90 GPUs instead so there's more resources for the shit people actually want? x050's are dead because of this as it is
I still don't have any meaningful/cheap upgrade from a 1080 ti, because the gimped RAM and bus on anything below the x080s choke the more expensive ReShade effects
>still trying to shill fake frames as actual performance gains
I didn't know it's "act like a stupid gullible Black person day" today.
>just run the game in 720p so you can get ray tracing without killing your framerate and then ai upscale it bro
Anons, you may not realize it BUT the actual consoomer shills are YOU! By shitting on DLSS, FSR, whatever, you are simultaneously encouraging more CONSOOM.
Here's why:
Imagine some poor little anon notices that new games aren't running as well anymore. He now has two options:
A: turn on DLSS and enjoy the game at a higher framerate
B: spend $1000 on a new GPU
By telling him that DLSS is bad, you are also telling him that he needs to CONSOOM and upgrade if he wants "real" performance.
if you need more than that you are sitting too close to your monitor
and you'll end up with your game looking shittier and blurrier anyways as you turn up all the "performance" options to get fake cope 4k resolution with cope framegen
2 weeks ago
Anonymous
How would you know what anything looks like. you game at 1080p 60hz. I haven't gamed at that resolution in about 15 years.
2 weeks ago
Anonymous
>i've been buying useless shit that pc gaming youtubers told me to for 15 years
good piggy.
2 weeks ago
Anonymous
inferiority complex anon has returned to tell everyone how great his thrifty life is
2 weeks ago
Anonymous
i always go with Quality which looks just as good as Native
2 weeks ago
Anonymous
>if you need more than that you are sitting too close to your monitor
I can notice quite a clear difference between 1080p and 1440p on my 17 inch laptop screen. It's like a blur filter has been removed. Someone on a desktop with a 24"+ monitor is definitely going to see a difference.
2 weeks ago
Anonymous
bullshit.
for most people, around 60ppd is the limit. that's pixels per degree, which ofc takes resolution, size and viewing distance into account. and when i say its the limit, i mean the limit for being able to make out individual pixels, aka the point where adding more pixels stops helping.
for typical monitor viewing distances, the way it shakes out is that 4k is required to get past 60ppd, for all common monitor sizes. at 32in, 4k is still above that magical 60ppd number, but only barely. so, if you're sharp eyed, with a 32in screen, you would benefit from even more.
mind you, im not trying to suggest that you can't or shouldn't compromise on this.
4k is still exceedingly expensive, both in terms of panels (at least for high refresh oled/miniled ones; 32in/4k/60hz ips/va is fairly cheap nowadays) and much more so on the gpu side.
27/1440p is a very good way to both meaningfully improve visual quality over 24/1080p, and also get some very welcome extra real-estate. and it's only ~2x the performance cost of 1080p, vs 4x for 4k. much less painful on the wallet.
but, yea, people saying "you don't need more than 1080p" are moronic. like, yes, its not a *need*, but its definitely a way to improve your gaming experience.
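The ppd numbers above are easy to check yourself. A small calculator (viewing distance defaulted to a typical 24in desk setup; that distance is my assumption, not anon's):

```python
import math

def ppd(h_pixels, diag_in, view_dist_in=24, aspect=(16, 9)):
    """Pixels per degree across the horizontal field of view."""
    width_in = diag_in * aspect[0] / math.hypot(*aspect)  # panel width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * view_dist_in)))
    return h_pixels / fov_deg

print(round(ppd(1920, 24)))  # 24in/1080p: ~41 ppd, well under the ~60 ppd limit
print(round(ppd(2560, 27)))  # 27in/1440p: ~49 ppd, closer but still short
print(round(ppd(3840, 32)))  # 32in/4k:    ~64 ppd, only barely past it, as stated
```

The outputs line up with the post: every common size needs 4k to clear 60ppd, and 32in 4k clears it only barely.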
2 weeks ago
Anonymous
If you can't tell the difference between 1080p and 2k you're literally blind and/or moronic.
2 weeks ago
Anonymous
i cant afford a new monitor anyway but im not falling for your 4k cope meme.
you cant make me upgrade from my 1060 tbh
2 weeks ago
Anonymous
I don't give a shit about what you do with your money homosexual
2 weeks ago
Anonymous
lol at the poorgays always thinking you're trying to sell them shit
Dude just say "I'm a fricking moron please rape my face". No need for all of this word salad to desperately make yourself look smarter than you really are.
Games aren't badly optimized, your 2060 is just 6 years old.
2 weeks ago
Anonymous
yes, games ARE badly optimized.
a 3060ti can run rdr2 very respectably at 1440p, and even manage 4k/60 without too much trouble.
in modern games, it's a 1080p card at best, assuming you don't get vramcucked, which would mean its not even a 1080p card.
do modern games look that much better than RDR2 to warrant such massively inflated requirements?
2 weeks ago
Anonymous
>in modern games, it's a 1080p card at best,
2 weeks ago
Anonymous
Use it in Helldivers 2 and enjoy 35~45 avg. fps with sub 15 fps during extract
2 weeks ago
Anonymous
Helldivers 2 is a game with an absurd amount of npcs in one locale, all doing AI calculations (that doesn't seem to be optimized at all for groups), and all animating. Assuming the NPCs ONLY lower their animation playback rate based on distance and don't actually convert them to static meshes animating via textures OR just outright billboard sprites, it'd be entirely normal for that game to run like shit regardless of any not-brand-new GPU
2 weeks ago
Anonymous
The animations can be baked into instance meshes as a flipbook, although I don't think there are really that many bugs in helldivers 2 compared to like, zombies in that world war z game or the dinos in that recent dino game which do the flipbook thing.
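Rough napkin math (assumed vertex and frame counts, not the actual game's assets) for why baking an animation into a flipbook/vertex-animation texture is cheap enough to instance big crowds:

```python
# One texel per vertex per frame, RGBA16F positions = 8 bytes/texel.
# All asset numbers below are invented for illustration.

def vat_bytes(vertices, frames, bytes_per_texel=8):
    """VRAM cost of a baked vertex-animation-texture flipbook."""
    return vertices * frames * bytes_per_texel

size = vat_bytes(3000, 60)  # a 3k-vertex bug with a 60-frame walk cycle
print(size / 1024 / 1024)   # ~1.4 MiB, shared by every instance of that mesh
```

The point being: the texture is paid once per animation, not per NPC, which is why the flipbook approach scales to hordes where per-NPC skeletal animation doesn't.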
2 weeks ago
Anonymous
>it'd be entirely normal for that game to run like shit regardless of any not-brand-new GPU
My 1080 ti has zero problems in H2 at 2560x1600 without turning the rendering scale down, with the ultra low latency mode enabled as well. The lowest I've seen FPS get is around the 55 area when things get screwy. The 3060ti is two gens ahead with the ti branding, it should run shit a lot better than "respectably", the fact we have to continue worrying about getting vramcucked is exactly why there's a huge fricking problem with GPU's lately - shit you actually want that's an upgrade has been $1200+ at launch for two big releases so far, even the 3080 wasn't safe from getting vramcucked
Raytracing even with dlss is already dogshit unless you are using that $1000 card. I tell him if he wants to save money then ignore memetracing completely.
Having seen how blurry DLSS 2 is, I can't imagine how bad DLSS 3 is with fake frames added.
DLSS is fricking trash. I will never buy another GPU as long as game require frame generation and upscaling to hit 60fps.
>and feels much more responsive
DLSS3 doesn’t actually improve input latency like higher fps usually does exactly because it’s not actually rendering more frames per second.
It's the nvidia shilling method. They compare a game running at 1080p to a game running at 360p with frame gen on and conclude that frame gen doesn't affect latency. It's pure rage inducing.
>it's shilling
The only things which count are the FPS, frame time and image quality. Your "b-b-but it's upscaled from 720p" doesn't mean shit when the output result is triple the FPS, lower latency and similar image quality.
I'll send you a tissue to sob in.
They will stack on 10 more shaders and call it revolutionary when it still looks like you are lookin at some amateuristic homie tryin to "restore" a phone sized jpg he got off ifunny.co in photoshop
just look at how blurry this shit is. Anon you're so right, it's like Vaseline smeared all over my eyeballs, it's disgusting.
He's running 4k RT Overdrive at 70fps+
Without DLSS and that horrible smearing he'd only be getting about 30fps. 30fps is clearly the best way to play games isn't it?
Man, so RT still isn't there yet.
I hoped that RT would have the shadows not flicker, and have defined edges, instead of shimmery. The stuttery shadows on moving cars are also distracting, as it seems they refresh at a lower framerate than the rest of the animations.
It's been what, 5 years so far? I guess another 5 years is in order still.
We only get actual advances when next generation consoles hit, as that's what the majority of devs design their games around, consoles. Only a few design their games to utilize high-end PCs since it's a smaller market overall.
The one time I tried DLSS it made the textures warp like a PS1 game at even the slightest movement. Has it gotten any better, or is there still texture warping?
i dunno what you tried and how long ago, but shit like that hasn't been a problem since the days of dlss 1.0
current dlss is *extremely* good, especially if you also take the time to force-update it to the latest .dll (only really necessary in older games that shipped with old versions and are not updated anymore)
at 1440p and 4k, there's a very decent chance the game ends up looking better than native. and when it doesn't, its still pretty fricking close to it.
It's 100% astroturffed. Like Halo, Fallout 76, Battlefield, and so on. Now they've figured out how to market a game, not deliver and then put it in a state it should have been in at launch after a year or two with a skeleton crew of unpaid interns/contracted SCABs, and reap micro transactions and DLC. All the while, gamers eat the shit up.
I'll do it again to try and get the same spot.
it's rendering smaller images so it has lower latency. why don't people understand this
not bad i think, i played CP all maxed, dlss quality with FG, on a 4070 Ti and i probably had like 100ms of input lag, shit was nasty but you unironically get used to it after a few mins (im not saying having to put up with it is acceptable by any means tho)
Yeah, there's a huge difference between DLSS on my 3090 to my 4090. 3090 is just like how everyone here describes when they shit on it, blurry with trails etc.
On the 4090 it's nothing of the sort, it's clear and you'd have to be a moron to choose the native frame rate over it as the difference is that small.
2 weeks ago
Anonymous
DLSS doesn't change based on what card you have. The only DLSS difference between the 3090 and 4090 is the latter can do fake fps, which looks absolutely atrocious and adds input lag
you either fricked something up in your tests or have serious placebo
2 weeks ago
Anonymous
It does though. The frame rate and extra power the 4090 has reduces the trails amount. It calculates everything faster giving it a better experience.
I can get the 3090 DLSS experience when I increase the resolution to 8k on the 4090, the GPU struggles more and produces more artifacts.
Also, don't tell me how my two computers I have next to each other work.
2 weeks ago
Anonymous
It doesn't. The extra performance of the 4090 is seen as, well, extra performance. The upscaling algorithm is unchanged between the two. Sorry, but that's just the truth
2 weeks ago
Anonymous
It does. There's a quality difference between DLSS on the 3090 vs 4090 due to the speed it's processed. I've seen it.
2 weeks ago
Anonymous
Nope. You're under a placebo effect. Nvidia says that DLSS is the same between all cards except 4000 fake fps
2 weeks ago
Anonymous
Post your windows resolution page with your GPU on
2 weeks ago
Anonymous
uh ok?
2 weeks ago
Anonymous
A fricking moron AMD owner telling me how my 2 Nvidia GPUs look.
frick off you clown
2 weeks ago
Anonymous
>DURRRR AMD
That's the integrated GPU you fricktard. But unsurprising you would not know how upscaling works and then also not understand why it's connected to the iGPU
2 weeks ago
Anonymous
>ima tell you how your gpu should run
>i know this from first hand experience with my amd igpu and GTX 1080 ti
go away homosexual.
2 weeks ago
Anonymous
bro HDR off.......?
2 weeks ago
Anonymous
Most laptop displays are fake HDR (including mine) and look like trash with it on. I have a real HDR external display.
2 weeks ago
Anonymous
More frames means more temporal information to work with, so it technically does improve image quality when you have a higher framerate.
2 weeks ago
Anonymous
that isn't how it works, most people don't understand their own shit though, that's why mechanics work on cars
2 weeks ago
Anonymous
I don't really give a frick how you think it works. I trust my eyes and what they see over what some homosexual on the internet tells me
>it's rendering smaller images so it has lower latency. why don't people understand this
Because latency is higher if you have a brain and compare it to the native resolution and not the DLSS target resolution.
You're adding latency to 1080p, for example, and in return getting a fake 4K resolution full of traces and ghosts, and with HIGHER latency than 1080p (although lower than native 4K).
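That latency argument in a toy model (illustrative millisecond values only, not measurements): upscaling beats native 4K because frames render faster, but interpolation-based frame gen has to hold a rendered frame back before it can show the generated one, so it sits in between.

```python
# All ms values below are invented to illustrate the ordering, not benchmarks.

def latency_native(render_ms, pipeline_ms=10):
    """Native rendering: fixed pipeline cost plus one render."""
    return pipeline_ms + render_ms

def latency_upscaled(internal_render_ms, upscale_ms=2, pipeline_ms=10):
    """Upscaled: faster internal render plus a small upscale cost."""
    return pipeline_ms + internal_render_ms + upscale_ms

def latency_framegen(internal_render_ms, upscale_ms=2, pipeline_ms=10):
    """Frame gen: interpolation needs the NEXT real frame before the
    generated one can be shown, adding roughly one more render time."""
    return latency_upscaled(internal_render_ms, upscale_ms, pipeline_ms) + internal_render_ms

native_4k = latency_native(33)    # 43 ms at ~30fps native
upscaled  = latency_upscaled(12)  # 24 ms rendering internally at 1080p
framegen  = latency_framegen(12)  # 36 ms: above plain upscaling, below native 4K
```

Which matches the post's ordering: upscaled 1080p < upscaled 1080p + FG < native 4K.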
I hate nvidia so I bought an AMD card. Can anybody explain why FSR3 at 4k in The Witcher 3 looks really good, you can barely spot it, but in C2077 it's a complete mess?
Didn't CDProjekt partner with Nvidia when they made Cyberpunk? It's built from ground up for DLSS and Nvidia's raytracing implementation, the game literally exists to sell Geforce RTX gpus.
DLSS is great
the only issue is when developers rely on it to release a poorly optimized game, the correct use of DLSS is letting laptops squeeze some extra performance, and for desktops to push 4k 144fps on ultra settings
every other complaint about it is amd fanboys whining because their alternative is inferior
>the only issue is when developers rely on it to release a poorly optimized game
that's pretty much every single AAA/AA release in the last couple years though
I hope this is the future.
First Dragon's Dogma 2 pushing poorgays CPUs to melting and them getting 20fps and now GPUs. PMSL all the thirdies go back to console
>bro how come all these games are unsloptimized
>bro i'm not turning on the feature that gives me 80 extra frames
i wish i could be this moronic, just for a little bit.
They'll outright call people with RTX 4090 cards poorgays because they can't live with the truth. I straight up never use DLSS unless the game actually WORKS well with it. Which is why I only have it on for amid evil. Every other game I keep it off, especially if it has its own resolution slider such as trial out, ghostrunner and more. If you need to claw back performance just lower the pixel count from 100 to 85 percent at the lowest depending on game. Always works better for me and you can still use raytracing.
>being extra assblasted at the truth
Deep down you know I'm right, Don't deny it.
2 weeks ago
Anonymous
>the truth is me letting my monitor upscale rather than the dedicated ai Nvidia upscaler
holy shit anon, you've reached the next level of consciousness with your amazing insight.
2 weeks ago
Anonymous
At least you agree I'm smarter than the gays who think DLSS is good. Also
>Has DLSS ever looked good ingame and not moronic mockups that don't really represent it?
>It actually looks like garbage in every game I've played with it and exists solely to get decent framerates out of bad hardware
is correct.
2 weeks ago
Anonymous
I've already told you I think you're a moron.
2 weeks ago
Anonymous
See
>being extra assblasted at the truth
>Deep down you know I'm right, Don't deny it.
again, just admit it already. Stop living in denial.
Why is it getting progressively more purple on the right? I assume that's ray traced global illumination, but it's not mentioned anywhere in the labels. If RR stands for "ray traced reflections" then why is the performance getting better when it's added? Why is the base image only 20fps?
Shittiest invention ever. Next gen games will run at a nice 360p upscaled to 8K with DLSS 9.999 and 30fps (you'll need a $2500 card)
If a game can't run without DLSS it's a shitty game, not worth buying or pirating.
Ray tracing can look incredibly good and perform decently IF implemented correctly. Stuff like path tracing is more or less the future of gayming, but yeah, I know a few guys with 4090s and even they complain about having to use DLSS + frame generation in order to maintain a stable 60 fps in newer titles at fake "4k", when their 1080 Ti's literally just performed well out of the box almost a decade ago
>That's disappointing to hear.
>...I'm unironically thinking about replacing my 6600 xt now
I suggest you either get the 3060 12gb or just wait for the 5000 series to come out. The issue with AyyMD is that instead of investing in new technologies they just brute force stuff with less efficient hardware which is the case for ray tracing anyway. Basically what I mean is AyyMD will not catch up because technology is improving rapidly and for the most part they refuse to implement better newer alternatives which means the performance and software gap between AyyMD and NoVideo - Incel will only keep getting larger
Oh and their drivers are still dog shit even after all of those years. I mean yeah in newer titles you will not really notice a difference but on older ones Nvidia clearly performs better in both compatibility and stability
The best ray tracing game is unironically, Amid Evil. Its the only situation where ray tracing makes sense because it has zero impact on performance.
Nvidia will continue to make backroom deals with developers to jack up the graphics beyond anything a 4090 or even 5090 can handle so they can say >LOL BUY MOAR POWER SUPPLY
>That's disappointing to hear.
>...I'm unironically thinking about replacing my 6600 xt now
this is either pajeet shill samegaying or massive moronation, but disregarding everything you've said, there is 0 reason to change your 6600xt for a 3060 lmao, the 3060 can't even use dlss 3, literally wait more, moron
DLSS has some conditions to run well and look good.
The higher the base resolution, the better the upscaling. And the higher the base framerate, the less artifacting.
So the best use case for DLSS is for upscaling 1440p to 4K on a 4070 and above.
A 3060 can't really do 1440p 60fps in modern games, so you won't get the full benefits of DLSS.
And if you try using it at 1080p, then you get the unusable smearing like in picrel.
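For reference, the per-axis render scale factors behind the DLSS presets (Quality 2/3, Balanced 0.58, Performance 0.5, Ultra Performance 1/3, per Nvidia's published numbers; the helper itself is just a sketch):

```python
# Per-axis scale factors; pixel count drops with the square of these.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "UltraPerf": 1 / 3}

def internal_res(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and preset."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Quality"))  # (2560, 1440): the 4K sweet spot above
print(internal_res(1920, 1080, "Quality"))  # (1280, 720): far less data to work with
```

Which is exactly the point of the post: "Quality" at 4K upscales from 1440p, while "Quality" at 1080p is reconstructing from 720p, hence the smearing.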
So basically if I can afford any card above a 4070 there's no reason to get anything except nvidia?
2 weeks ago
Anonymous
Generally speaking, yes, nVidia has the better cards at the high end. Or at least more versatile for compute too, and with more tech in games to mess around with. AMD high end cards sell well this generation mainly because they're cheaper for a given level of performance in games, and because they come with more RAM.
Can confirm, on my 4k screen with a 3090 (roughly 4070/ti performance) DLSS is definitely impressive and worth it for 1440p to 4k, but anything less than that and it looks like blurry artifacty crap. I think only 1440p and above has enough pixel information for the AI to fill in the rest and get a reasonable result
amdrone directive No.1 is to shitpost about dlss at every opportunity, even if it requires shitposting about upscaling in general.
it's strictly necessary for their gpu business to survive, since if we take dlss into consideration, absolutely no amd card is worth buying right now. not even one of them.
Why don't they mention the input lag? uurgh, try playing it and see, it is like playing a game with 100+ ping.
Also: I am already lost at those abbreviations. I know what DLSS is but what the heck are SR, RR, FG?
>SR
Super Resolution. Upscaling a lower resolution native render to a higher resolution.
>RR
Ray Reconstruction. More advanced denoising of ray traced effects that cuts down on rays/pixel.
>FG
Frame Generation. Like your TV's motion smoothing feature but run on the computer.
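A minimal sketch of what FG actually does: synthesize an in-between image from two rendered frames. The real thing uses motion vectors and an optical-flow field rather than a plain blend, but this is the core idea, and it also shows why generated frames can't reduce input latency: no new input is sampled for them.

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames pixel-by-pixel; t=0.5 gives the halfway frame.
    Plain blending stands in for the motion-compensated version."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

real_1 = [0.0, 0.2, 0.4]  # toy 3-pixel "frames"
real_2 = [0.2, 0.4, 0.6]
fake = interpolate(real_1, real_2)  # the generated frame shown between them
```

Note the generated frame can only be displayed once `real_2` exists, which is where the extra latency in frame generation comes from.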
When the frick did fake frames catch on? I swear a year or so ago it just came out of no where and now everyone acts like it's normal or even demands it.
When Nvidia named fake fps "DLSS 3", all their shill work pushing the DLSS name has conditioned people to think it was good. And of course they ran sponsored content with their favorite outlets at launch
graphics keep improving and resolutions/refresh rates keep increasing. obviously people are going to look for efficient ways to save fps. AI is the future and it will only get better
So Ray Traced cyberpunk doesn't improve over the non-RT version?
2 weeks ago
Anonymous
Good question. That game has a ton of perf issues in the new DLC area so when I revisited it I turned that stuff off. Regardless, on AVERAGE, newer games don't look markedly better than games from 5 years ago, despite large increases in system requirements that far outpace the measured/noticed improvements in graphics.
Anybody that participates in GPU wars is a brown third worlder or a welfare Black person.
The third worlders can only afford the lowest tiers of GPUs, usually the x30s for Nvidia and the x40s for AMD if they buy new, higher or lower if they buy an older used card(this is where the asinine "my -doorstop- can still play the newest triple A slop at ultra at 60fps!" Posts come from). So they'll righteously defend and shill whatever brand they got their GPU from to justify wasting 3 months of savings on a GPU, and to escape into a fantasy where they are white and not a brown third worlder.
The welfare Black person will waste his welfare check on the highest tier GPU from either Nvidia or AMD and shill it and the company also in an effort to justify their purchase and to escape into a fantasy of being an affluent and wealthy white man. They do the same thing with their iPhones, Shoes, Cars, Grills and Rims.
if they post their specs, make no mistake. They stole the picture from reddit, Ganker or here.
>Chadware Unboxed
>admits DLSS doesn't improve actual performance and refers to it as motion smoothing, leaving it out of their benchmarks
>Digishill Foundry
>constantly tries to obfuscate what defines quality because modern optimization is a shitshow. Still shilling dogshit like Avatar: Frontiers of Pandora and Alan Wake 2
Current GPU features are a joke and you're a dimwit for advocating them.
Didn't know we had corporate deep-throating shills for fake frame generation. You realize companies are going to make games reliant on this, and it won't be a good thing, right? The target they'll go for is 30fps at 700p upscaled with the latest DLSS 5.5 and so on
It doesn't even look good on cyberpunk over native resolution
PSSR is literally a PS ported version of the new AI AMD upscaling coming out at same time, an AMD exec basically confirmed it when he said AMD AI upscaling is coming to all their gaming devices later this year, consoles are considered AMD gaming devices in their portfolio. Don't know what it will be called FSR4 or Radeon SSR, who cares.
Most likely FSR<TAA<"FSR4"/PSSR<XESS<DLSS
the problem with AMD (and thus PS) jumping into AI this late is that Intel and especially Nvidia are multiple years (or, in Nvidia's case, a full generation) ahead. it takes so long to train the AI and make each version better and better that the early adopters have such a big lead. No one has a chance to catch Nvidia, only the MSFT+Intel collab has a chance. AMD fricked up hard by not investing into AI early and trying to play the traditional value GPU game, which is mostly outdated in 2024.
>AMD fricked up hard by not investing into AI early
When were they supposed to invest into AI, exactly? They were barely kept alive by consoles. They were lucky Intel stood still for half a decade so people had a reason to buy Ryzen. Imagine if Intel 10nm actually launched in 2018 as originally planned, imagine Zen+ going against Alder Lake.
They started investing into GPGPU and AI as soon as they possibly could. And they aren't even that far behind these days, hardware wise.
Even if AMD did attempt to get into AI at the right time, they'd absolutely frick it up like they did OpenCL. The compute bullshit of the 2010s ultimately dictated the course of where AI went, and because AMD showcased absolutely nothing in the vidya space to shill OpenCL, that was it; the most action it got in the vidya space was accelerating texture decoding in Dolphin during its dark ages. AMD ultimately cursed us to CUDA or bust in that field, as even when the end user was actively doing it FOR FRICKING FREE at one point, AMD just shrugged and sent a C&D if they couldn't buy the dev off. The dude who created hybrid PhysX disappeared off the face of the Earth after a limited CUDA-to-OpenCL wrapper showcased a feature for OpenCL that people would actually be stoked about
Didn't nvidia try to get hair stuff working years ago?
Still seems to be rather lacking in most games I think, though I'm guessing most abandoned that tech and just try making their own.
Hairworks was a repackaging of fast tessellation. Not only is this irrelevant in the DX12 era, but AMD ended up coming up with a solution that ran better on both their and Nvidia's cards. I guess it's kinda funny that you can run hairworks on the Switch version of witcher 3 with mods because it's the same architecture as GTX 900
here's a screenshot I took of DLSS (no frame gen) in fortnite of me spinning the camera at the absolute highest mouse speed I could to frick it up
close up it looks perfectly fine even on the edges, and further away it's a little distorted, but way less than TAA
I am more hopeful for PSSR (playstation dlss) than FSR 3.1
What's snake oil about it? It looks better than TAA while running better too.
The only people who hate it are poorgays who don't have a GPU from the last 6 years or people with 1080p monitors.
DLSS (both upscaling and frame generation) is fricking fantastic, anyone who has actually used it knows this to be an absolute fact. Anyone stating anything to the contrary is a literal AMD shill.
Ironically, those very same AMDrones will be singing the praises of AI upscaling once FSR reaches the level DLSS was at 3 years ago.
DLSS good
FSR if it gets good is good
XESS is okayish
They are not substitutes for poorly optimized goyslop games, but in an ideal world it'd be a nice "hey no rush to buy a video card, use us" option compared to cranking down settings or resolution, frick that
Just want to leave a note for anybody wandering into this thread uninformed:
This is L I T E R A L L Y a shill thread. This thread is advertising. Nvidia has a budget specifically for this type of shit. Ganker pc general is already acutely aware of this kind of shit and has to deal with it daily.
Their astroturfing is extremely blatant and literally decades old at this point. It manages to avoid the eyes of most people, genuinely. They literally think this is all true: that everyone who buys an AMD card regrets it, and that every one of Nvidia's latest locked-down useless gimmick middleware is genuinely something you can't play games without.
The one real giveaway is that they always respond VICIOUSLY to you pointing out they're shills and deflect onto hypothetical AMD shills. See
>NOOOOO ANYONE WHO USES FEATURES I CAN'T RUN IS A SHILL!!!!!
have a nice day troony
and
Friendly reminder that this poster is L I T E R A L L Y a paid AMD shill.
kek im one of the guys you quoted and i think it's hilarious you think i'm a shill. i'm not even biased toward nvidia, i think amd are great. i just like shitposting in response to boring one-note chuds who spam the same old shit in every god damn thread ad nauseam.
>woke!!!
>shill!!!!
>SLOP!!!
seriously i'm sick to death of that fricking word. every time i hear it i feel like i've lost a brain cell. go ahead, tell me how i'm a goyvidia shilling wokeslop troony.
I spent this evening playing the latest UE5 game and the upscaling/antialiasing is absolute garbage AS USUAL. Is every game this generation doomed to be a blurry washed out mess?
DLSS varies with the strength of the rig, and the amount of shit moving on the screen
if there is a lot going on, the game can become a blurry/fuzzy looking mess
in a perfect world, DLSS would not be needed because devs would focus on making a consistent artstyle for the game that is less resource intensive while still looking good.
AND, games wouldn't be fricking 150 gigs to install
DLSS is only awful at 1080p, where it looks like grainy smudgy shit, especially in motion, but you really shouldn't need DLSS at 1080p anyway. If you're that much of a budget buyer, just get AMD, since you're not gonna be using RT or doing the GPU-related productivity things nVidia is best at. Nvidia is clearly targeting the enthusiast and high end market of people doing 4k, VR, and video or 3D production.
I'll take supersampling and old games over blurry and toned down new games that have somehow brought back the same look as playing a game on composite but none of the crt tricks like rainbow water in sonic
I actually play video games more than I stare at puddles in video games, so nah, this is dumb. Maybe in a decade when native 4k 240Hz becomes standardized it will be useful.
>DLSS IS SHIT
>why?
>IT LOOKS BLURRY
>Here's an image of DLSS, where is it blurry?
>THAT'S NOT MOVING THOUGH
>Here's a video of it moving, where is it blurry?
>BUT THE LATENCY IS HIGHER
>Here's the latency results showing it's lower
>BUT IT'S NOT EVEN RAY TRACED
>Yes it is, here's a comparison image
>YOU'RE ALL NVIDIA SHILLS. DLSS IS SHIT
and around we go again.
1440p, 165Hz, 3080, thanks.
I'm just not a consumer prostitute homosexual and like a clean image in motion when playing games.
None of this temporal processing is going to look good until native 4k at very high native framerates becomes standard.
And no, I'm not going to pay $2,000 (+inflation) for a GPU that can do that either. Go be a israelite somewhere else.
You're like someone with an old run down Ford Escort telling people with brand new Lamborghinis that their ride is shit.
lol poorgays and their endless cope
nvidia card in the same price range won't be able to run CP2077 at those settings either
you'll have to settle for framegen which is not great in FPS games because of increased latency
>people bring up fair criticisms/statements
>deranged accusations that anyone who disagrees must be a poor 3rd-worlder whose computer is running the cheapest possible gpu on a 720/1080p monitor
I suppose if you just have a spastic fit and paint anyone who could possibly disagree with you in a negative light, you don't actually have to refute their points, huh.
crazy how they keep increasing frame rate with each new version. are we close to the inflection point where rasterization gets replaced entirely by AI image reconstruction? it's the most optimal way to reach photorealism.
You will never be a real framerate. You have no fill rate, you have no refresh time, you have no rasterization rate. You are a still image twisted by algorithms and machine learning into a crude mockery of nature’s perfection.
All the “validation” you get is two-faced and half-hearted. Behind your back people mock you. Your players are disgusted and ashamed of you, your “friends” laugh at your ghoulish appearance behind closed doors.
PC gamers are utterly repulsed by you. Thousands of years of NVidia lies have allowed PC gamers to sniff out frauds with incredible efficiency. Even framegen algorithms who “pass” look uncanny and unnatural to a PC gamer. Your temporal stability is a dead giveaway. And even if you manage to get a raytracing gamer to cope with you, he’ll turn tail and bolt the second he gets a whiff of your shimmering, uncanny antialiasing.
You will never be responsive. You wrench out a fake render every other frame and tell yourself it’s going to be ok, but deep inside you feel the depression creeping up like a weed, ready to crush you under the unbearable weight.
It’s basically just antialiasing except instead of oversampling and then reducing back down to native, it outputs at the oversampled resolution and pretends to be native.
Nope. Card can’t do it. And not gonna upgrade until they make a card that can do it.
At what point do you decide that it's worth it?
DLSS isn't going to go away. Move on 5 years and graphics power requirements in games will have increased too.
Running a game at 4k native with all the bells and whistles turned on is probably going to be just as difficult in 4 years as it is now due to advances in graphic techniques.
Or are you picking a game like Cyberpunk and using that as your upgrade benchmark?
I mean anons on Ganker will keep shitting on AI generated "fake frames" but the reality is, we're at the physical limit.
We won't have 120FPS, 4K ultra HDR and RT in gaming; our GPUs already need 2 or even 3 dedicated PSU connections, serious cooling and massive fricking size to do what we're doing. Diminishing returns, you can't just magically invent better GPU hardware.
AI is, by all accounts, the only way to push further at this point.
Better to invest in making AI better than invest in hoping someone cracks the hardware limitations of GPUs magically.
Everyone is a homosexual
The simple truth is, regardless of whether DLSS is on or off, no one will actually 'notice' the difference during actual gameplay when you're focusing on the game. People share screenshots back and forth as if that in any way matters; it would only matter if one or the other looked like noticeable shit, which it doesn't.
So really it's just "do you want more smooth FPS for the same relative graphics with no real glaring issues, or do you want your still photomode screenshots to look higher resolution?"
Who cares?
It smears and ghosts in motion. Why do you think everyone is only posting screenshots and not webms, mp4s, or uncompressed AV1 uploads on catbox, instead of NVENC-compressed h264 on YouTube?
that's not a DLSS problem, that's a bad implementation of TAA; the glass there generates wrong motion vectors because it confuses the surface with what's behind it
Why isn't this thread deleted by the mods yet it belongs on Ganker just like GPU threads
Everyone here in this thread except me is a useless motherfricker who's underaged and knows nothing about computers, only regurgitating what they hear from YouTube and Reddit
because Ganker is full of absolute tossers who think because they've done some online "learn C+ in 2 weeks or your money back" course they're suddenly microchip designers.
DLSS is borderline witchcraft at 4k, but I'm still not sold on FG.
Every game with FG I tried just artifacts too much, hopefully 2.0 is going to be a thing soon and address this issue.
it's better than TAA, and in its DLAA version you get native resolution and an insanely sharp image, so people can't even b***h about blurriness
also the fps buff on DLSS Quality at a very small resolution cost is great
only problem is when a game doesn't produce motion vectors for some shaders, then you get ugly ghosting. at this point all games should be made with TAA in mind and make sure there is no empty space in the motion vector frame
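The missing-motion-vector failure mode described above can be sketched as a toy 1-D version of temporal accumulation (illustrative only, not any engine's actual code; the blend factor `alpha` and the clamp-at-edges behavior are assumptions):

```python
# Toy 1-D temporal accumulation: each frame blends the current sample with
# the history sample fetched through the motion vector. A wrong (zero)
# motion vector blends stale colors -> the ghosting described above.
def taa_step(history, current, motion, alpha=0.1):
    out = []
    for i, c in enumerate(current):
        # fetch history at the reprojected position (clamped at the edges),
        # then exponentially blend with the current frame's sample
        j = min(max(i - motion, 0), len(history) - 1)
        out.append(alpha * c + (1 - alpha) * history[j])
    return out

# a bright "object" moving right by 2 pixels per frame
prev = [0, 0, 1, 0, 0, 0]
cur  = [0, 0, 0, 0, 1, 0]
print(taa_step(prev, cur, motion=2))  # correct vectors: object stays sharp
print(taa_step(prev, cur, motion=0))  # missing vectors: ghost trail at the old position
```

With the correct vector the history lines up and the object stays at full brightness; with a zero vector, 90% of the old position's color bleeds into the new frame, which is exactly the trail you see on surfaces that don't write motion vectors.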
Yes, native is better.
DLSS looks like the twenty frames are horribly smeared together, AND now feels worse than 20FPS when it comes to actually playing the fricking game.
It does have trails together with frame gen, but personally I prefer to have path tracing and just see a little bit of trailing and softness.
It's not like when I watch any other medium I only go for the sharpest ones; in fact I tend to prefer the grainier look.
Cyberpunk looks a lot better with path tracing and the performance is still acceptable to me since it stays above 50fps most of the time, but I'm playing with the controller; with the mouse it would probably be more annoying.
>mentally damaged Nvidia homosexual shills gimmicks and underlines at the same time how trash their GPUs are which can't fricking render native anymore
DLSS looks like smeared shit all over the screen, maybe take your eyes off the FPS counter for once and look at the details instead
I can post static FSR upscaled images too, they'll even look crispier than what the chinks of Njudea have. what the frick are you even trying to prove here, moron? that you get more frames while you play upscaled games?
frick off my board with your gimmicks before you get slapped straight in the mouth b***h
Nvidia marketeers better hurry and tell your piece of shit Taiwanese manufacturer overseas to assemble GPUs that can render native without drawing over 500+ watts to play a game at 1080p in Q2 2024
we didn't have these problems back in the day and now it's all of a sudden about DLSS and homosexual gimmicks
frick you and eat my butthole I fricking hate all of you shills
yes, this 5kb jpg really illustrates your point, which is very clear. I can definitely tell whether you're shilling for or against DLSS
blind coper
how come you choose to omit the fps counter when you decided to upload a higher quality still?
Absolutely rekt
it's pretty clear that it's not a full screenshot.
>just run the game in 720p so you can get ray tracing without killing your framerate and then ai upscale it bro
looks better than native
I play the game on my laptop at 2k, with the DLSS render resolution at 75% which means 1080p rendering. I get 70-100 with full path tracing with FG on and 30-60 with it off. On a laptop. So you can end your hyperbole right there. On a desktop with a 4090 I wouldn't be surprised if you could render at 2k (4k with DLSS quality) and hit a consistent 90-120.
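The render-resolution arithmetic in the post above does check out; a quick sketch (the ~67% per-axis factor for DLSS Quality is the commonly cited figure, not something stated in the post):

```python
# Render resolution from output resolution and a per-axis scale factor.
# 2560x1440 at 75% per axis -> 1920x1080, as the post says.
def render_res(out_w, out_h, scale):
    # real upscalers round/floor per axis; round() is fine for a sketch
    return round(out_w * scale), round(out_h * scale)

print(render_res(2560, 1440, 0.75))  # (1920, 1080) - the laptop case above
print(render_res(3840, 2160, 2/3))   # (2560, 1440) - 4K at ~67% "Quality"
```

So "4k with DLSS Quality" really is internally a 1440p render, which is why the 4090 comparison in the post is plausible.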
I just got into pc gaming with an AMD GPU.
i kinda feel like i made a mistake. can you not activate DLSS on AMD, or does it just not work as well as it would on an nvidia gpu?
will depend on how good fsr 3.1 will be
Are we optimistic about it?
The pic posted makes FSR look like shit compared to DLSS
AyyMD is getting mogged in software so hard by NoVideo and Incel that the entire situation is barely funny and mostly dreadful. Do not expect AyyMD to 'catch up' anytime soon
That's disappointing to hear.
...I'm unironically thinking about replacing my 6600 xt now
You can not.
DLSS only works on Nvidia gpus
AMD has FSR which does the same but worse as you can see
You buy AMD to save money and get better raster performance if you don't give a shit about RT, which you shouldn't. Even nvidia users have to cop out with AI upscaling and AI generated framerates because (most) games run like fricking trash with RT.
There are a few where RT makes a noticeable visual improvement without absolutely butchering the performance, like Metro Exodus and more recently Dragon's Dogma, in which case you still wouldn't need an overpriced nvidia board
It sounds to me like it's just straight up better to get an nvidia card since even if later down the line games start getting too demanding for it, I can simply turn on DLSS and play with increased resolution and frames.
games ARE already too demanding for it.
Full path tracing will get you less than 60fps at 1080p native on an RTX 4070 Ti, an $800 card.
With DLSS 3.5 you can get 100fps at 1440p, and the image quality is actually better due to the AI denoiser
Yeah but what about the alternatives?
I assume the $800 AMD equivalent would be the 7900 XT. Can it run full path tracing as well as the 4070 Ti? Can you enable DLSS like you can on the 4070 Ti? I guess the answer to both is no.
So then why would someone get the 7900 XT when you could get the 4070 Ti and have it last longer thanks to all those nvidia AI things?
Nvidia GPUs have dedicated tensor cores for neural network inference (and training, but VRAM is arbitrarily limited to make you fork out more $$$ for that). Normally when you run a game, these tensor cores aren't used; with DLSS, they are upscaling the last frame while the rest of the GPU works on the next frame. So essentially DLSS is "free" performance whilst FSR has some overhead as it's basically just a set of compute shaders that run on the same hardware as the game itself.
so to answer your question no and I don't think there will be an AMD competitor to DLSS for a long time. The newest AMD processors have tensor cores but only for their CPUs, nobody knows yet if there will be tensor cores in the GPU dies as well.
What is likely is we'll see a neural network based upscaler for their integrated GPUs first using the new tensor cores on their CPUs and then the same upscaler on their next gen GPUs after.
>FSR has some overhead as it's basically just a set of compute shaders that run on the same hardware as the game itself
The frames are AI-generated, though, right? Why wouldn't they use the tensor cores?
FSR doesn't use AI.
Might be why it looks so shit.
You seem to know your shit, so maybe you can answer this tangentially related question. Why did Nvidia switch to a generic model for DLSS instead of training a bespoke network for each game? And why is the general consensus that DLSS is better since they stopped doing that? Common sense tells me that a network trained on high resolution renders of the game's assets would give far better image quality, whereby it can essentially patch together a high resolution frame from bits and pieces it has learned.
>Why did Nvidia switch to a generic model for DLSS instead of training a bespoke network for each game?
There is something known as overfitting, where a neural network can overtrain on a set of data "memorizing" the answers instead of coming up with general solutions. Then there is the problem of getting good training data in the first place. Training on one massive database of general images is better since you're less likely to get "edge cases" outside of said training data.
For example say I were to provide training data for a video game, but I didn't include any training data where there's a chain fence and chain fences are featured in the game, then it might do a bad job of upscaling fences. Bad or small training data means there are going to be a ton of edge cases.
Each game having its own upscaling checkpoint would undoubtedly create better upscales, but doing it properly would be costly in terms of man hours (you need to generate the training data somehow, either by having someone play the game or some kind of automated process built into the dev tools) and you need to allocate compute to train that model. Creating one model to rule them all, maybe with a few configs that developers can play with, is simply easier.
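The overfitting point above can be shown with a toy example in pure Python: a polynomial that interpolates its five training samples exactly (the "memorize the training data" case, like a per-game model) versus a plain least-squares line, both evaluated outside the training range:

```python
import math

# 5 training samples of sin(2*pi*x) on [0, 1]
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [math.sin(2 * math.pi * x) for x in xs]

def lagrange(x):
    # polynomial through all 5 points: zero training error ("memorized")
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def line(x):
    # least-squares line: high training error, but degrades gracefully
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys)) / \
            sum((xi - mx) ** 2 for xi in xs)
    return my + slope * (x - mx)

x_out = 1.3  # outside anything either model has "seen"
truth = math.sin(2 * math.pi * x_out)
print(abs(lagrange(x_out) - truth))  # large error: the memorizer blows up
print(abs(line(x_out) - truth))      # much smaller error
```

Same idea as the chain-fence example: a model fit tightly to narrow data is perfect on that data and terrible on edge cases outside it.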
The secret is that DLSS also still looks like shit in motion.
The most important thing is still raster performance which AMD does better at any price point. The only way it makes sense to think you fricked up is if you bought a 7900XTX and thought you should have bought a 4090.
Ray reconstruction does generate better reflections but it's slow as frick at resolving noise in general at least in its current version in Cyberpunk. It leaves a shit tonne of smearing, ghosting and trails whenever you move the camera or something happens in front of you. The old denoiser (I forgot what it's called) is much clearer in this regard even if it sometimes misses details.
>Putting people who have never left their basement in charge of realism
I also play games by taking a step and then staring at the ground for 30 seconds while jerking off into the box the GPU came in.
whoa...
a shiny floor... this has changed gaming as we know it
>ray tracing
>it's just better reflections
>doesn't actually improve realtime lighting and shading because that would be expensive as hell
>cards cost 50-70% more gen on gen for this and AI snake oil
are you a real life moron too? or did you just miss the memo on path tracing?
>thinks path traced reflections is the same thing as path tracing
stop posting
You're commenting on an image of cyberpunk you absolute bellend. Cyberpunk has pathtracing.
>Cyberpunk has pathtracing
Path traced reflections, you nonce. They can barely implement realtime path tracing in Quake 2 and Minecraft.
It's global illumination. Are you really this moronic?
you are so fricking stupid holy shit, path tracing in CP is lighting too, it's a huge fricking difference visually compared to normal ray tracing. the biggest difference is that path tracing doesn't leak light through objects (not shown here)
People will look at this and unironically tell you to buy $1k+ graphics cards, lol.
they're both running on 2k gpus
this is a thread about DLSS/Ray Tracing, we all here agree CP is dogshit
>buy $1k+ graphics cards
kek I run it fine on my $200 2070 Super with frame gen
>moving the goal post
pottery
that looks like shit and makes zero sense in the scene
what do you mean? the light is bouncing around and lighting up the very small area, that's how it works irl, especially since it's tiles and glass
No it wouldn't. look at the top, it's an open grated area with a cloudy daytime sky
the right scene would only make sense in the dead of night with no other light sources
You literally don't have a fricking clue, do you. Path tracing mode in Cyberpunk uses fully path-traced rendering. It uses some AI tricks to essentially "upscale" the ray count and denoise it, but it's path tracing. Even games without path tracing have ray-traced shadows, ray-traced global illumination, ray-traced skylight, etc. Who the frick told you it's just reflections, and moreover, why the frick did you think it was worth posting your opinion on something you have literally never used and know literally nothing about?
Anon, you fell for the Nvidia marketing. That's hardly path tracing.
EVER WONDER WHY THESE BLATANT GUERILLA MARKETERS ONLY EVER USE STATIC IMAGES TO SHILL THEIR $1K GPUS WITH BROKEN FEATURES? BECAUSE IT LOOKS FRICKING HORRIBLE IN MOTION
This has been mostly solved since ray reconstruction was introduced into Cyberpunk. The ghosting was caused by the denoiser, not DLSS. Solved by using DLSS for denoising as well.
what do you mean? you also have ghosting with any sort of temporal AA aka the industry standard.
DLSS is just better
>you also have ghosting with any sort of temporal AA
And this is why we turn off TAA
And if we can't turn it off, that's a no buy
TAA, DLSS, FSR, XESS, it's all crap
>industry standard
they can enjoy their low sales and impending collapse then, when even normalgays see that old games looked better than this new crap
>t.tech illiterate
>tech illiterate
Just because a tech is newer doesn't make it good. This temporal stuff is just a fad like plenty of others in the past.
Bloom and lensflares, 3D goggles, physx, hairworks, all fads that were used to sell us new hardware.
And AI upscaling and framegen have the added "benefit" of making the image less clear. All the effects that result in less clarity don't have staying power, just like desaturation, piss filters, motion blur, chromatic aberration and other such nonsense.
look at the AMD shill in panic mode
Both are pretty shit water reflections.
Water that deep wouldn't reflect the sky clearly at all; you'd still see the concrete under the water, so everything would be dulled out except the brightest light sources, which would be blurry.
So left is actually more realistic.
it's 158 kb actually
158kb contains 5kb so both statements are technically correct, actually
>DLSS is trash
>still trying to shill fake frames as actual performance gains
>it doesn't matter if your experience is a lot smoother and feels much more responsive because uhmmm well the frames are fake or something!!!!
>it doesnt matter if your experience is a lot smoother and feels much more responsive
Idiot
t. AMD shill
>~~*digital foundry*~~
>no native
this comparison is useless
The fact that bullshit like FSR and DLSS is a "recommended" spec for 1440p on all but the most moronically expensive cards is insane. Wouldn't it be more cost-effective and easier on the devs if you just put a decent amount of RAM on GPUs ungimped by a die that's 50% half-precision compute cores? Considering that an x060 GPU just doesn't cut it at native modern resolutions, why not just discontinue the fricker and focus on x070/80/90 GPUs instead so there are more resources for the shit people actually want? x050s are dead because of this as it is
I still don't have any meaningful/cheap upgrade from a 1080 Ti, because the gimped RAM on anything other than x080s and above chokes the more expensive ReShade effects because of the bus
> Everything I don't like is AMD!
Copeframes are copeframes. It doesn't matter who makes it you stupid little corporate halfwit.
>it doesn't matter if things are real!
Anons, you may not realize it BUT the actual consoomer shills are YOU! By shitting on DLSS, FSR, whatever, you are simultaneously encouraging more CONSOOM.
Here's why:
Imagine some poor little anon notices that new games aren't running as well anymore. He now has two options:
A: turn on DLSS and enjoy the game at a higher framerate
B: spend $1000 on a new GPU
By telling him that DLSS is bad, you are also telling him that he needs to CONSOOM and upgrade if he wants "real" performance.
Here's option C, moron: don't buy or play shit games made by pajeets that can't code.
buying gpus new enough to use dlss is already consoom territory moron.
just play 1080p60 with rt off.
>just play 1080p60 with rt off.
im not poor
if you need more than that you are sitting too close to your monitor
and you'll end up with your game looking shittier and blurrier anyways as you turn up all the "performance" options to get fake cope 4k resolution with cope framegen
How would you know what anything looks like? you game at 1080p 60hz. I haven't gamed at that resolution in about 15 years.
>i've been a buying useless shit that pc gaming youtubers told me to for 15 years
good piggy.
inferiority complex anon has returned to tell everyone how great his thrifty life is
i always go with Quality which looks just as good as Native
>if you need more than that you are sitting too close to your monitor
I can notice quite a clear difference between 1080p and 1440p on my 17 inch laptop screen. It's like a blur filter has been removed. Someone on a desktop with a 24"+ monitor is definitely going to see a difference.
bullshit.
for most people, around 60ppd is the limit. that's pixels per degree, which ofc takes resolution, size and viewing distance into account. and when i say it's the limit, i mean the limit for being able to make out individual pixels, aka the point where adding more pixels stops helping.
for typical monitor viewing distances, the way it shakes out is that 4k is required to get past 60ppd for all common monitor sizes. at 32in, 4k is still above that magical 60ppd number, but only barely. so, if you're sharp eyed with a 32in screen, you would benefit from even more.
mind you, im not trying to suggest that you can't or shouldn't compromise on this.
4k is still exceedingly expensive, both in terms of panels (at least for high refresh oled/miniled ones; 32in/4k/60hz ips/va is fairly cheap nowadays), but much more so on the gpu side.
27in/1440p is a very good way to both meaningfully improve visual quality over 24in/1080p and also get some very welcome extra real estate. and it's only ~2x the performance cost of 1080p, vs 4x for 4k. much less painful on the wallet.
but, yea, people saying "you don't need more than 1080p" are moronic. like, yes, it's not a *need*, but it's definitely a way to improve your gaming experience.
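The ppd numbers above can be sanity-checked with a little trigonometry (the 24-inch viewing distance and 16:9 aspect are assumptions for typical desk setups, not from the post):

```python
import math

# Pixels-per-degree: horizontal pixels divided by the horizontal field of
# view the screen subtends at the viewing distance. ~60 ppd is the commonly
# cited "can't resolve individual pixels" mark the post refers to.
def ppd(h_pixels, diag_in, aspect=(16, 9), distance_in=24.0):
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)          # physical width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

print(round(ppd(1920, 24)))  # 24in 1080p: ~41 ppd, well under the limit
print(round(ppd(2560, 27)))  # 27in 1440p: ~49 ppd
print(round(ppd(3840, 32)))  # 32in 4K: ~64 ppd, just past 60 as claimed
```

This matches the post: at a normal viewing distance, 4K on a 32in panel is the first common config to clear 60 ppd, and only barely.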
If you can't tell the difference between 1080p and 2k you're literally blind and/or moronic.
i cant afford a new monitor anyway but im not falling for your 4k cope meme.
you cant make me upgrade from my 1060 tbh
I don't give a shit about what you do with your money homosexual
lol at the poorgays always thinking you're trying to sell them shit
are you on drugs?
>60fps in 2024
no
yeah most shit in 2024 runs 30fps and is still fine
going above that is the equivalent of rgb mouse bungees and gamer chairs.
CP2077 looks shit without path tracing though. The rasterized lighting is an afterthought.
Dude just say "I'm a fricking moron please rape my face". No need for all of this word salad to desperately make yourself look smarter than you really are.
you cant play cyberpunk with path tracing
why be you concerned about my spending?
option C: DLSS isn't a thing anymore, so devs actually have to optimize their games or nobody will buy them.
Games aren't badly optimized, your 2060 is just 6 years old.
yes, games ARE badly optimized.
a 3060ti can run rdr2 very respectably at 1440p, and even manage 4k/60 without too much trouble.
in modern games, it's a 1080p card at best, assuming you don't get vramcucked, which would mean its not even a 1080p card.
do modern games look that much better than RDR2 to warrant such massively inflated requirements?
>in modern games, it's a 1080p card at best,
Use it in Helldivers 2 and enjoy 35~45 avg. fps with sub 15 fps during extract
Helldivers 2 is a game with an absurd number of npcs in one locale, all doing AI calculations (which don't seem to be optimized at all for groups), and all animating. Assuming the NPCs ONLY lower their animation playback rate based on distance and don't actually convert them to static meshes animating via textures OR just outright billboard sprites, it'd be entirely normal for that game to run like shit on anything that isn't a brand-new GPU
The animations can be baked into instance meshes as a flipbook, although I don't think there are really that many bugs in helldivers 2 compared to like, zombies in that world war z game or the dinos in that recent dino game which do the flipbook thing.
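the distance-based playback-rate trick described above is simple to sketch: far-away agents just skip animation updates on most frames, staggered so the work spreads out. the tiers and distances below are made-up illustrative numbers, not anything actually from Helldivers 2:

```python
def anim_update_interval(distance):
    """Frames to wait between animation updates, by distance tier (illustrative)."""
    if distance < 20:   return 1   # every frame
    if distance < 50:   return 2   # 30Hz at 60fps
    if distance < 100:  return 4   # 15Hz
    return 8                       # barely animating; flipbook/billboard candidates

def should_update(agent_id, distance, frame):
    # Offset by agent id so the skipped work doesn't bunch up on one frame.
    interval = anim_update_interval(distance)
    return (frame + agent_id) % interval == 0

# On any one frame, only a fraction of a distant 500-strong horde animates.
updated = sum(should_update(i, 120, 0) for i in range(500))
print(updated)  # 63 of 500
```

the flipbook/vertex-animation-texture approach mentioned in the reply goes a step further and moves even that residual work onto the GPU as a texture lookup.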
>it'd be entirely normal for that game to run like shit on anything that isn't a brand-new GPU
My 1080 ti has zero problems in H2 at 2560x1600 without turning the rendering scale down, with ultra low latency mode enabled as well. The lowest I've seen FPS get is around the 55 area when things get screwy. The 3060ti is two gens ahead with the ti branding; it should run shit a lot better than "respectably". The fact we have to keep worrying about getting vramcucked is exactly why there's a huge fricking problem with GPUs lately - shit that's actually an upgrade has been $1200+ at launch for two big releases so far, and even the 3080 wasn't safe from getting vramcucked
Raytracing even with DLSS is already dogshit unless you're using that $1000 card. I'd tell him if he wants to save money to ignore memetracing completely.
imagine taking PC parts advice from a poorgay
lol
Read the post again. The anon I was responding to said the person considering cards is poor.
I soweey 🙁
Option c
I tell lil anon to not buy dogshit games with zero optimization so this fricking trend ends. He can easily refund the games he already bought
Stop consooooooming literal garbage, Black person
I didn't know it's "act like a stupid gullible Black person day" today.
Having seen how blurry DLSS 2 is, I can't imagine how bad DLSS 3 is with fake frames added.
DLSS is fricking trash. I will never buy another GPU as long as game require frame generation and upscaling to hit 60fps.
Yes, I'm sorry I care about the games soul.
it's not rendering at 4k? it's not rendering at 60 fps?
then you can say it's rendering at <2k, you can say it's rendering at 30 fps
You can't say it's half a 4k Res!
A 4K res consists of 3 parts
>and feels much more responsive
DLSS3 doesn’t actually improve input latency like higher fps usually does exactly because it’s not actually rendering more frames per second.
It's the nvidia shilling method. They compare a game running at 1080p to a game running at 360p with frame gen on and conclude that frame gen doesn't affect latency. It's pure rage bait.
>it's shilling
The only things which count are FPS, frame time and image quality. Your "b-b-but it's upscaled from 720p" doesn't mean shit when the output result is triple the FPS, lower latency and similar image quality.
I'll send you a tissue to sob in.
You make disingenuous comparisons because you know frame gen is shit and are paid to pretend otherwise. That's 100% shilling.
>FPS and image quality just don't count
fricking lol, AMD drones are getting more pathetic every day
>much more responsive
homie, you aint making shit more responsive by buffering frames to interpolate
responsiveness comes from low frame render time and low frame queue size, not this
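the buffering point is easy to put numbers on. interpolation-based frame gen has to hold back the newest real frame until the in-between frame has been shown, so input-to-photon time can't drop below roughly 1.5 real frame times even though displayed fps doubles. this is a deliberately crude toy model (ignores Reflex, render queue depth, and display latency):

```python
def latency_ms(real_fps, frame_gen=False):
    """Crude input-to-photon estimate: one real frame time to render,
    plus half a real frame of hold-back when interpolating."""
    frame_time = 1000 / real_fps
    return frame_time * (1.5 if frame_gen else 1.0)

# 60 real fps, no FG: 60 shown fps at ~16.7ms
print(round(latency_ms(60), 1))
# Same 60 real fps with FG: 120 shown fps, but ~25ms
print(round(latency_ms(60, frame_gen=True), 1))
```

so the displayed framerate doubles while responsiveness gets slightly worse, which is exactly the "buffering frames to interpolate" complaint.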
they wanted youtube reviewers to show/claim that they are getting DOUBLE the frames and it's exactly what i see in every video
They will stack on 10 more shaders and call it revolutionary when it still looks like you are lookin at some amateuristic homie tryin to "restore" a phone sized jpg he got off ifunny.co in photoshop
>still fuzzy
>still ghosting
>still input lag
>still not using it
still wrong
>DLSS S2 SUPER moronic + FAAGGOT + REALLY moronic
what did they mean by this?
Now show it as you're moving around
It's a fricking blurfest
dlss isnt taa
?
just look at how blurry this shit is. Anon you're so right, it's like Vaseline smeared all over my eyeballs, it's disgusting.
He's running 4k RT Overdrive at 70fps+
Without DLSS and that horrible smearing he'd only be getting about 30fps. 30fps is clearly the best way to play games isn't it?
the video you show is blurry as frick
are you blind?
>it's blurry to frick
the game has motion blur and it's running out of sync with the 60fps cap of the youtube video.
Are you stupid?
>Yes. I do prefer to game at 20fps rather than use any form of ai up sampling.
its a youtube video you dog
Old and outdated vid. Need the one with DLSS 3.7 with Preset E.
What does preset E do? Fixes ghosting or what?
probably just a different method of noise filtering or something.
Man, so RT still isn't there yet.
I hoped that RT would have the shadows not flicker, and have defined edges, instead of shimmery. The stuttery shadows on moving cars are also distracting, as it seems they refresh at a lower framerate than the rest of the animations.
It's been what, 5 years so far? I guess another 5 years is in order still.
We only get actual advances when next generation consoles hit, as that's what the majority of devs design their games around, consoles. Only a few design their games to utilize high-end PCs since it's a smaller market overall.
The one time I tried DLSS it made the textures warp like a PS1 game at even the slightest movement. Has it gotten any better, or is there still texture warping?
i dunno what you tried and how long ago, but shit like that hasn't been a problem since the days of dlss 1.0
current dlss is *extremely* good, especially if you also take the time to force-update it to the latest .dll (only really necessary in older games that shipped with old versions and are not updated anymore)
at 1440p and 4k, there's a very decent chance the game ends up looking better than native. and when it doesn't, its still pretty fricking close to it.
when will they start including latency in their image reconstruction benchmarks?
Nvidia sold frame interpolation as a feature to shit skins and they bought it all up, now AMD is doing it... PC gaming is dead lol.
shut up moron. You run a 1080p 75hz screen
Hello brownie. Are you projecting again?
AMD one is free. Mods can also inject it in any game with FSR2 or DLSS3
>Mods can also inject it in any game with FSR2
But what if i can make fsr2 work in any game that runs on vulkan(including dxvk/vkd3d)?
How many times are we going to have "look it's actually good now" marketing pushes? People have to start questioning this shit at some point.
>look it's actually good now
it has been good since 2.0 which was in early 2020.
It's 100% astroturffed. Like Halo, Fallout 76, Battlefield, and so on. Now they've figured out how to market a game, not deliver and then put it in a state it should have been in at launch after a year or two with a skeleton crew of unpaid interns/contracted SCABs, and reap micro transactions and DLC. All the while, gamers eat the shit up.
DLSS just werks.
>20fps (123ms)
>108fps (139ms)
Source?
now show me the input lag
left side is DLSS off - 77ms latency
right side is DLSS quality 37ms latency
anything else you'd like to see?
anti-shills forget its framegen that adds input lag. for science, can you post a framegen pic?
not bad i think, i played CP all maxed dlss quality with FG on a 4070TI and i probably had like 100ms input lag, shit was nasty but you unironically get used to it after a few mins (im not saying having to put up with it is acceptable by any means tho)
Yeah, there's a huge difference between DLSS on my 3090 to my 4090. 3090 is just like how everyone here describes when they shit on it, blurry with trails etc.
On the 4090 it's nothing of the sort, it's clear and you'd have to be a moron to choose the native frame rate over it as the difference is that small.
DLSS doesn't change by what card you have. The only DLSS difference between 3090 and 4090 is the latter can do fake fps, which looks absolutely atrocious and adds input lag
you either fricked something up in your tests or have serious placebo
It does though. The frame rate and extra power the 4090 has reduces the trails amount. It calculates everything faster giving it a better experience.
I can get the 3090 DLSS experience when I increase the resolution to 8k on the 4090, the GPU struggles more and produces more artifacts.
Also, don't tell me how my two computers I have next to each other work.
It doesn't. The extra performance of the 4090 is seen as, well, extra performance. The upscaling algorithm is unchanged between the two. Sorry, but that's just the truth
It does. There's a quality difference between DLSS on the 3090 vs 4090 due to the speed it's processed. I've seen it.
Nope. You're under a placebo effect. Nvidia says that DLSS is the same between all cards except 4000 fake fps
Post your windows resolution page with your GPU on
uh ok?
A fricking moron AMD owner telling me how my 2 Nvidia GPUs look.
frick off you clown
>DURRRR AMD
That's the integrated GPU you fricktard. But unsurprising you would not know how upscaling works and then also not understand why it's connected to the iGPU
>ima tell you how your gpu should run
>i know this from first hand experience with my amd igpu and GTX 1080 ti
go away homosexual.
bro HDR off.......?
Most laptop displays are fake HDR (including mine) and look like trash with it on. I have a real HDR external display.
More frames means more temporal information to work with, so it technically does improve image quality when you have a higher framerate.
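there is a kernel of truth in that: temporal techniques blend each new aliased/noisy frame into an accumulated history, so a higher framerate really does mean more samples per pixel. a minimal exponential-accumulation sketch (the 0.1 blend factor is a typical TAA-style constant, not anything DLSS-specific; DLSS's weighting is learned):

```python
import random

def accumulate(history, sample, alpha=0.1):
    """Exponential moving average: keep 90% of history, blend in 10% new sample."""
    return history * (1 - alpha) + sample * alpha

# A pixel whose true value is 0.5, observed with heavy per-frame noise.
random.seed(0)
history = 0.0
for _ in range(200):
    sample = 0.5 + random.uniform(-0.3, 0.3)
    history = accumulate(history, sample)
print(round(history, 2))  # converges near the true 0.5
```

more frames per second just means this history fills in faster, and breaks down less badly under motion.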
that isn't how it works. most people don't understand their own shit though, that's why mechanics work on cars
I don't really give a frick how you think it works. I trust my eyes and what they see over what some homosexual on the internet tells me
I'll do it again to try and get same spot.
it's rendering smaller images so it has lower latency. why don't people understand this
>it's rendering smaller images so it has lower latency. why don't people understand this
Because latency is higher if you have a brain and compare it to the native resolution and not the DLSS target resolution.
You're adding latency to 1080p, for example, and in return getting a fake 4K resolution full of traces and ghosts, and with HIGHER latency than 1080p (although lower than native 4K).
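if you assume render time scales roughly with pixel count (not exact, but close for GPU-bound games), the "lower than native 4K, higher than native 1080p" point falls out directly. the 10ms base and 2ms upscale overhead below are made-up placeholder numbers, purely to show the relationship:

```python
PIXELS = {"1080p": 1920 * 1080, "4k": 3840 * 2160}

def frame_ms(res, base_1080p_ms=10.0, upscale_ms=0.0):
    """Naive model: frame time proportional to pixels rendered, plus upscaler cost."""
    return base_1080p_ms * PIXELS[res] / PIXELS["1080p"] + upscale_ms

print(frame_ms("1080p"))                  # native 1080p: 10.0ms
print(frame_ms("4k"))                     # native 4k: 40.0ms (4x the pixels)
print(frame_ms("1080p", upscale_ms=2.0))  # 1080p upscaled to 4k: 12.0ms
```

so the upscaled 4k output is far faster than native 4k, but still a little slower than just playing at 1080p, which is both sides of the argument above.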
nvidia cards can't even do 30 fps in a 4+ year old game lmao
I hate nvidia so I bought an AMD card. Can anybody explain why in the Witcher 3 FSR3 in 4k look really good, you can barely spot it. But on C2077 it's a complete mess?
Didn't CDProjekt partner with Nvidia when they made Cyberpunk? It's built from ground up for DLSS and Nvidia's raytracing implementation, the game literally exists to sell Geforce RTX gpus.
cope
DLSS is great
the only issue is when developers rely on it to release a poorly optimized game. the correct use of DLSS is letting laptops squeeze out some extra performance, and letting desktops push 4k 144fps on ultra settings
every other complaint about it is amd fanboys whining because their alternative is inferior
>the only issue is when developers rely on it to release a poorly optimized game
that's pretty much every single AAA/AA release in the last couple years though
I hope this is the future.
First Dragon's Dogma 2 pushing poorgays CPUs to melting and them getting 20fps and now GPUs. PMSL all the thirdies go back to console
We can argue about graphics all day, that's fine, I am so fricking tired of Cyberpunk by now holy shit.
The point is more FPS at same image quality
I love buying new cards to play the latest israeli slop
morning poorgay. Keep posting muh israelite because you cant afford nice things
dlss looks like dogshit 100% of the time
but enough about you
>on my 1080p dell
>bro how come all these games are unsloptimized
>bro i'm not turning on the feature that gives me 80 extra frames
i wish i could be this moronic, just for a little bit.
>bro just smear vaseline on your screen bro just think of the frames
have a nice day
he says, while playing on his ps5, upscaled to 900p, at 26 fps.
He's not capable of even running it
Every fricking time
Why do shills get so asshurt when you call dlss shit?
its like fricking clockwork with every one of these threads
>why do you care
why do you care?
I do it because I'm an argumentative twat and I know you're a homosexual
They'll outright call people with RTX 4090 cards poorgays because they can't live with the truth. I straight up never use DLSS unless the game actually WORKS well with it. Which is why I only have it on for amid evil. Every other game I keep it off, especially if it has its own resolution slider such as trial out, ghostrunner and more. If you need to claw back performance just lower the pixel count from 100 to 85 percent at the lowest depending on game. Always works better for me and you can still use raytracing.
Yeah but you're just a moron.
>being extra assblasted at the truth
Deep down you know I'm right, Don't deny it.
>the truth is me letting my monitor upscale rather than the dedicated ai Nvidia upscaler
holy shit anon, you've reached the next level of consciousness with your amazing insight.
At least you agree I'm smarter than the gays who think DLSS is good. Also
is correct.
I've already told you I think you're a moron.
See
again, just admit it already. Stop living in denial.
Why is it getting progressively more purple on the right? I assume that's ray traced global illumination, but it's not mentioned anywhere in the labels. If RR stands for "ray traced reflections" then why is the performance getting better when it's added? Why is the base image only 20fps?
if i wanted to render less i would just turn down the render scale slider that most games already have.
What word is 'tra' supposed to be? I have no idea what point you're trying to make.
Shittiest invention ever. Next gen games will run at a nice 360p upscaled to 8K with DLSS 9.999 and 30fps (you'll need a $2500 card)
If a game can't run without DLSS, it's a shitty game, not worth buying or pirating.
1440p@120hz
DLSS off
Ray Tracing off
>
4K@60hz with DLSS+ray tracing
high framerate is way more immersive and noticeable than meme special effects
Ray tracing can look incredibly good and perform decently IF implemented correctly. Stuff like path tracing is more or less the future of gayming, but yeah, I know a few guys with 4090s and even they complain about having to use DLSS + frame generation in order to maintain a stable 60 fps in newer titles at fake '4k', when their 1080 Ti's literally just performed well out of the box almost a decade ago
I suggest you either get the 3060 12gb or just wait for the 5000 series to come out. The issue with AyyMD is that instead of investing in new technologies, they just brute force stuff with less efficient hardware, which is the case for ray tracing anyway. Basically what I mean is AyyMD will not catch up, because technology is improving rapidly and for the most part they refuse to implement better, newer alternatives, which means the performance and software gap between AyyMD and NoVideo - Incel will only keep getting larger
Oh and their drivers are still dog shit even after all of those years. I mean yeah in newer titles you will not really notice a difference but on older ones Nvidia clearly performs better in both compatibility and stability
The best ray tracing game is unironically, Amid Evil. Its the only situation where ray tracing makes sense because it has zero impact on performance.
Nvidia will continue to make backroom deals with developers to jack up the graphics beyond anything a 4090 or even 5090 can handle so they can say
>LOL BUY MOAR POWER SUPPLY
lol that looks like the most basic implementation of ray tracing I've seen. It's almost a "why even bother? " inclusion
this is either pajeet shill samegay or massive moronation but disregarding anything youve said there is 0 reason to change your 6600xt for a 3060 lmao, 3060 cant even use dlss 3, literally wait more moron
I'm the guy who's wondering about replacing my 6600xt.
Why not? From what's been posted here, DLSS seems like a big deal.
DLSS has some conditions to run well and look good.
The higher the base resolution, the better the upscaling. And the higher the base framerate, the less artifacting.
So the best use case for DLSS is for upscaling 1440p to 4K on a 4070 and above.
A 3060 can't really do 1440p 60fps in modern games, so you won't get the full benefits of DLSS.
And if you try using it at 1080p, then you get the unusable smearing like in picrel.
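the base-resolution point above comes straight from the preset scale factors: the internal render resolution is just the output resolution times a fixed ratio. these are the commonly documented DLSS 2 factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5, Ultra Performance ≈ 0.333):

```python
SCALE = {"Quality": 2/3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1/3}

def internal_res(out_w, out_h, preset):
    """Internal render resolution for a DLSS preset at a given output resolution."""
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

# 4k output on Quality renders at 1440p -- the sweet spot described above.
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440)
# 1080p output on Quality drops to 720p, where the smearing complaints start.
print(internal_res(1920, 1080, "Quality"))      # (1280, 720)
```

which is why the same preset can look great at 4k and like smeared garbage at 1080p: the upscaler is being fed a quarter of the pixels.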
So basically if I can afford any card above a 4070 there's no reason to get anything except nvidia?
Generally speaking, yes, nVidia has the better cards at the high end. Or at least more versatile for compute too, and with more tech in games to mess around with. AMD high end cards sell well this generation mainly because they're cheaper for a given level of performance in games, and because they come with more VRAM.
Can confirm, on my 4k screen with a 3090 (roughly 4070/ti performance) DLSS is definitely impressive and worth it for 1440p to 4k, but anything less than that and it looks like blurry artifacty crap. I think only 1440p and above has enough pixel information for the AI to fill in the rest and get a reasonable result
>or just wait for the 5000 series to come out
you won't be able to get it for at least a year after release for non-scalper prices
trash for poor people who can't afford real frames and have to cope with bootleg frames
How can such an incredible undeniable innovative (DLSS) generate so much hate and jealousy?
amdrone directive No.1 is to shitpost about dlss at every opportunity, even if it requires shitposting about upscaling in general.
its strictly necessary for their gpu business to survive, since if we take dlss into consideration, absolutely no amd card is worth buying right now. not even one of them.
Makes my games look smeared and fricky when in motion. Dont like it one bit.
>an incredible undeniable innovative (DLSS) generate so much
SAR PLS BUY NVIDIA SAAR
>incredible undeniable innovative
holy ESL, it really is hired pojeets
Has DLSS ever looked good ingame and not moronic mockups that don't really represent it.
It actually looks like garbage in every game I've played with it and exists solely to get decent framerates out of bad hardware
falseflag
If you stand still, don't look at the details and do 1440p upscaled to 4K on a non-TV sized display, DLSS does look good.
Why don't they mention the input lag? uurgh, try to play it and see, it is like playing a game with 100+ ping.
Also: I'm already lost at those abbreviations. I know what DLSS is, but what the heck are SR, RR, FG?
>SR
Super Resolution. Upscaling a lower resolution native render to higher resolution.
>RR
Ray Reconstruction. More advanced denoising of ray traced effects that cuts down on rays/pixel
>FG
Frame Generation. Like your TV's motion smoothing feature but run on the computer
its only framegen that adds input lag
dlss sucks, we amd users dont need it because we play games at 1080p and not meme resolutions.
Why is this even a thing? If you're fine with blur, ghosting, upscaling; consoles are like 500 bucks.
>Why is this even a thing?
Generational gains were bad so they needed a way to cheat in benchmarks. And unfortunately people fell for it this time
When the frick did fake frames catch on? I swear a year or so ago it just came out of no where and now everyone acts like it's normal or even demands it.
When Nvidia named fake fps "DLSS 3", all their shill work pushing the DLSS name has conditioned people to think it was good. And of course they ran sponsored content with their favorite outlets at launch
graphics keep improving and resolutions/refresh rates keep increasing. obviously people are going to look for efficient ways to save fps. a.i is the future and it will only get better
>graphics kept improving
did they though? Do graphics look noticeably better than they did 5 years ago?
yes.
not really, no. Plenty of games look marginally better than their predecessors (Like Remnant 1 to 2) but perform several times worse.
So Ray Traced cyberpunk doesn't improve over the none RT version?
Good question. That game has a ton of perf issues in the new DLC area so when I revisited it I turned that stuff off. Regardless, on AVERAGE, newer games don't look markedly better than games from 5 years ago, despite large increases in system requirements that far outpace the measured/noticed improvements in graphics.
SAAARSSS PLEASE REDEEM THE DLSS SAR
I think spyro 3 on the ps1 looks better than all of these. I was playing it last night and it looks beautiful.
>Running a game at a lower resolution
:/
>Running a game at a lower resolution with...LE SHARPNESS
:O
Anybody that participates in GPU wars is a brown third worlder or a welfare Black person.
The third worlders can only afford the lowest tiers of GPUs, usually the x30s for Nvidia and the x40s for AMD if they buy new, higher or lower if they buy an older used card (this is where the asinine "my -doorstop- can still play the newest triple A slop at ultra at 60fps!" posts come from). So they'll righteously defend and shill whatever brand they got their GPU from to justify wasting 3 months of savings on a GPU, and to escape into a fantasy where they are white and not a brown third worlder.
The welfare Black person will waste his welfare check on the highest tier GPU from either Nvidia or AMD and shill it and the company also in an effort to justify their purchase and to escape into a fantasy of being an affluent and wealthy white man. They do the same thing with their iPhones, Shoes, Cars, Grills and Rims.
if they post their specs, make no mistake. They stole the picture from reddit, Ganker or here.
>Chadware Unboxed
>admits DLSS doesn't improve actual performance and refers to it as motion smoothing, leaving it out of their benchmarks
>Digishill Foundry
>constantly tries to obfuscate what defines quality because modern optimization is a shitshow. Still shilling dogshit like Avatar: Frontiers of Pandora and Alan Wake 2
Current GPU features are a joke and you're a dimwit for advocating them.
youre actually a moron
Stick to the retro content, John.
Still not good enough to make 4k gaming not a meme.
do people actually PLAY this games and not just jerking off to framerates
>do people actually PLAY this games and not just jerking off to framerates
no
Didn't know we had corporate deep-throating shills for fake frame generation. You realize companies are going to make games reliant on this and it won't be a good thing, right? The target they'll go for is 30fps at 700p upscaled with the latest DLSS 5.5 and so on
It doesn't even look good on cyberpunk over native resolution
the new FFXIV update is going to have FSR and DLSS enabled by default
The True Upscale King drops at the end of the year
PlayStation Spectral Super Resolution (PSSR)
on a scale from FSR<TAA<DLSS how good do you think its gona be
PS5 has an RX 6700 equivalent. What do you think?
i dont know what that shit is, but i found that ps5 is slightly worse than a 1060, so i guess its gona be FSR tier quality lol
You heard wrong btw.
It's not even as good as a ryzen 7 with integrated graphics, you absolute wienergimp. You're like the gunk I clean off my taint every morning.
PSSR is literally a PS ported version of the new AI AMD upscaling coming out at same time, an AMD exec basically confirmed it when he said AMD AI upscaling is coming to all their gaming devices later this year, consoles are considered AMD gaming devices in their portfolio. Don't know what it will be called FSR4 or Radeon SSR, who cares.
Most likely FSR<TAA<"FSR4"/PSSR<XESS<DLSS
the problem with AMD (and thus PS) jumping into AI this late is that Intel and especially Nvidia are either multiple years ahead or a generation ahead for nvidia, it takes so long to train the AI and each version to get better and better, the early adopters have such a big lead. No one has a chance to catch Nvidia, only MSFT+Intel collab has a chance. AMD fricked up hard by not investing into AI early and trying to play the traditional value GPU game which is mostly outdated in 2024.
>No one has a chance to catch Nvidia
nvidia formally declared themselves to no longer be a gaming manufacturer though
>AMD fricked up hard by not investing into AI early
When were they supposed to invest into AI, exactly? They were barely kept alive by consoles. They were lucky Intel stood still for half a decade so people had a reason to buy Ryzen. Imagine if Intel 10nm actually launched in 2018 as originally planned, imagine Zen+ going against Alder Lake.
They started investing into GPGPU and AI as soon as they possibly could. And they aren't even that far behind these days, hardware wise.
Even if AMD did attempt to get into AI at the right time, they'd absolutely frick it up like they did OpenCL, as the compute bullshit of the 2010s ultimately dictated the course of where AI went, and because AMD showcased absolutely nothing in the vidya space to shill OpenCL, that was it, and the most action it got in the vidya space was accelerating texture decoding in Dolphin during its dark ages. AMD ultimately cursed us to CUDA or bust in that field, as even when the end user was actively doing it FOR FRICKING FREE at one point, AMD just shrugged and sent a C&D if they couldn't buy the dev off. The dude who created hybrid PhysX disappeared off the face of the Earth after a limited CUDA-to-OpenCL wrapper showcased a feature for OpenCL that people would actually be stoked about
https://web.archive.org/web/20160323093516/http://www.ngohq.com/news/14254-physx-gpu-acceleration-on-radeon-update.html
I'll start using your shenanigans when the game looks good in motion in 6-10 years. Until then
>motherfricking NATIVE
Didn't nvidia try to get hair stuff working years ago?
Still seems to be rather lacking in most games I think, though I'm guessing most abandoned that tech and just try making their own.
Hairworks was a repackaging of fast tessellation. Not only is this irrelevant in the DX12 era, but AMD ended up coming up with a solution that ran better on both their and Nvidia's cards.
I guess it's kinda funny that you can run hairworks on the Switch version of witcher 3 with mods because it's the same architecture as GTX 900
cute puppers
>input lag gets worse but the frames go up?!!
heres a screenshot I took of DLSS (no frame gen) in fortnite of me spinning the camera at the absolute highest mouse speed I could to frick it up
close up it looks perfectly fine even on the edges, and further away it's a little distorted, but way less than TAA
I am more hopeful for PSSR (playstation dlss) than FSR 3.1
i'm glad anons haven't drank the dlss/fsr snake oil as much as i thought they would have by now
What's snake oil about it? It looks better than TAA while running better too.
The only people who hate it are poorgays who don't have a GPU from the last 6 years or people with 1080p monitors.
i have 4k and a 4080s. i will never consume your AI slop also kys
not him but I have a 4080 super and at 4k dlss performance mode is pretty fantastic
moronic
>timestamp
kys
So you're just a moron then, got it.
>7800x3D
Yeah definitely a moron.
blow your fricking brains out
>blow your fricking brains out
>if i auto anonymize my file names then i'm actually anonymous
again, blow your fricking brains out
>>if i auto anonymize my file names then i'm actually anonymous
>again, blow your fricking brains out
DLSS (both upscaling and frame generation) is fricking fantastic, anyone who has actually used it knows this to be an absolute fact. Anyone stating anything to the contrary is a literal AMD shill.
Ironically, those very same AMDrones will be singing the praises of AI upscaling once FSR reaches the level DLSS was at 3 years ago.
FSR is already good enough. There's no more excuses for these "people".
Post photo and timestamp or you're lying
DLSS good
FSR if it gets good is good
XESS is okayish
They are not substitutes for poorly optimized goyslop games, but in an ideal world it'd be a nice "hey no rush to buy a video card, use us" option compared to cranking down settings or resolution, frick that
Yeah, every surface reflects a massive amount of light. What a realistic depiction.
Just want to leave a note for anybody wandering into this thread uninformed:
This is L I T E R A L L Y a shill thread. This thread is advertising. Nvidia has a budget specifically for this type of shit. Ganker pc general is already acutely aware of this kind of shit and has to deal with it daily.
Have a good day.
>NOOOOO ANYONE WHO USES FEATURES I CAN'T RUN IS A SHILL!!!!!
have a nice day troony
Friendly reminder that this poster is L I T E R A L L Y a paid AMD shill.
Their astroturfing is extremely blatant and literally decades old at this point, yet it manages to avoid the eyes of most people, genuinely. They literally think it's all true: that everyone who buys an AMD card regrets it, and that every one of Nvidia's latest locked-down useless gimmick middleware features is genuinely something you can't play games without.
The one real giveaway is that they always respond VICIOUSLY to you pointing out they're shills and deflect onto hypothetical AMD shills. See
and
kek im one of the guys you quoted and i think it's hilarious you think i'm a shill. i'm not even biased toward nvidia, i think amd are great. i just like shitposting in response to boring one-note chuds who just spam the same old shit in every god damn thread ad nauseum.
>woke!!!
>shill!!!!
>SLOP!!!!
seriously i'm sick to death of that fricking word. every time i hear it i feel like i've lost a brain cell. go ahead tell me how i'm a goyvidia shilling wokeslop troony.
>using the word chud
>not an off-site shill
2/10
which card do I get to max out ray tracing without AI meme frames @ 1080p?
I spent this evening playing the latest UE5 game and the upscaling/antialiasing is absolute garbage AS USUAL. Is every game this generation doomed to be a blurry washed out mess?
DLSS varies with the strength of the rig, and the amount of shit moving on the screen
if there is a lot going on, the game can become a blurry/fuzzy looking mess
in a perfect world, DLSS would not be needed because devs would focus on making a consistent artstyle for the game that is less resource intensive while still looking good.
AND, games wouldn't be fricking 150 gigs to install
ackkkkkkkkk
isnt that the original control with raytracing on
isnt control DLSS version 1 or 2? those were absolute dogshit
2 is the current one, and it uses 2 but older DLL which you can just change to a new one
FF15 uses DLSS 1
What am i doing in a grafix thread?
DLSS is only awful at 1080p, where it looks like grainy smudgy shit, especially in motion, but you really shouldn't need DLSS at 1080p anyway. If you're that much of a budget buyer, just get AMD, since you're not gonna be using RT or doing the gpu-related productivity things nVidia is best at. Nvidia is clearly targeting the enthusiast and high end market of people doing 4k, vr, and video or 3D production.
I'll take supersampling and old games over blurry and toned down new games that have somehow brought back the same look as playing a game on composite but none of the crt tricks like rainbow water in sonic
Impressive. Let's see it in motion.
>impressive, let's see it in motion
>while I run 1080p 75hz screen with a GTX 1080
you will never be a real frame
dlss is both upscale and shader cope
Reminder that a lot of modern games force TAA, so the only way to get rid of it for then is enabling DLSS or FSR which is incompatible with it
I dont know what DLSS stands for so I always choose to read it as Dick Lips Sucky Suck.
Deep Learning Super Sampling. It's ai upscaling except not dogshit.
>except not dogshit.
Spot the shill.
I actually play video games more than I stare at puddles in video games, so nah, this is dumb. Maybe in a decade when native 4k 240Hz becomes standardized it will be useful.
Can a 1060 (6gb version) do DLSS? Asking for a friend
No, you need an RTX card (2050 or higher), and a 4000 series or higher for DLSS frame generation
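Put as a lookup, going by the cutoffs in the reply above (those cutoffs are the poster's claim, not an official Nvidia support matrix, and the function name is made up for illustration):

```python
def dlss_support(gpu_series: int) -> dict:
    """Sketch of the support cutoffs claimed above: DLSS upscaling
    needs an RTX card (2000 series or higher), and DLSS frame
    generation needs a 4000 series or higher. Not an official
    Nvidia support matrix."""
    return {
        "upscaling": gpu_series >= 2000,
        "frame_generation": gpu_series >= 4000,
    }

# A GTX 1060 (1000 series) gets neither; a 4000-series card gets both.
print(dlss_support(1000))
print(dlss_support(4000))
```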
>DLSS IS SHIT
>why?
>IT LOOKS BLURRY
>Here's an image of DLSS, where is it blurry?
>THAT'S NOT MOVING THOUGH
>Here's a video of it moving, where is it blurry?
>BUT THE LATENCY IS HIGHER
>Here's the latency results showing it's lower
>BUT IT'S NOT EVEN RAY TRACED
>Yes it is, here's a comparison image
>YOU'RE ALL NVIDIA SHILLS. DLSS IS SHIT
and around we go again.
I got called a shill once for saying dlss is better than fsr and amd frame generation is better than nvidia
Who am I shilling for, you fricking mongoloids?
I bet you enjoy TAA.
I bet you enjoy your 1080p 60hz GTX 960 too
1440p, 165Hz, 3080, thanks.
I'm just not a consumer prostitute homosexual and like a clean image in motion when playing games.
None of this temporal processing is going to look good until native 4k at very high native framerates becomes standard.
And no, I'm not going to pay $2,000 (+inflation) for a GPU that can do that either. Go be a israelite somewhere else.
You're like someone with an old run down Ford Escort telling people with brand new Lamborghinis that their ride is shit.
lol poorgays and their endless cope
>DLSS screenshot
Is there anything more ack coded?
DLSS off looks best but i don't get why the FPS is so low. shit computer?
DLSS off is AMD
Makes sense. AMD is a CPU hog.
PS5 is basically just PS4 with better resolution and framerate. I really want to see PSSR in motion, and potentially see it ported to PC
I'm so glad that I didn't fall for the amdshit.
ITT: seething AMDjeets
>just save money and buy AMD bro
?
lol
nvidia card in the same price range won't be able to run CP2077 at those settings either
you'll have to settle for framegen which is not great in FPS games because of increased latency
I like my games the same way I like my women... RAW.
No way im putting a DLSS condom over my pixels.
>people bring up fair criticisms/statements
>deranged accusations that anyone who disagrees must be a poor 3rd-worlder whose computer is running the cheapest possible gpu on a 720/1080p monitor
I suppose if you just have a spastic fit and paint anyone who could possibly disagree with you in a negative light, you don't actually have to refute their points, huh.
All the points have been refuted. See:-
Yes yes, I'm sure the ghosting/weird effects on certain lighting surfaces have been refuted, somehow.
>the ghosting I watched on a youtube video two years ago is still there, I'm sure of it.
literally had it in Starfield, Fallout 76, and a tiny bit in Lies of P. Dragon's Dogma 2 would get it fairly rarely, but it'd happen.
crazy how they keep increasing the frame rate with each new version. are we close to the inflection point where rasterization gets replaced entirely by AI image reconstruction? it's the most efficient way to reach photorealism.
well we might get a scenario where framegen adds more frames between each real frame
Like 1 real frame, 2 interpolated frames, 1 real frame etc.
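That cadence can be sketched with naive per-pixel blending (a toy sketch: real frame generation reprojects with motion vectors and optical flow, not a straight blend, so this only illustrates the ordering; frames here are flat lists of pixel values):

```python
def interpolate_frames(prev_frame, next_frame, n_fake=2):
    """Insert n_fake blended frames between two real frames.
    Naive linear blend per pixel; real frame generation (e.g. DLSS
    frame gen) uses motion vectors / optical flow instead, so this
    only shows the 1-real, n-fake, 1-real sequencing."""
    frames = [prev_frame]  # the real frame
    for i in range(1, n_fake + 1):
        t = i / (n_fake + 1)  # blend weights 1/3 and 2/3 for n_fake=2
        fake = [(1 - t) * p + t * n for p, n in zip(prev_frame, next_frame)]
        frames.append(fake)
    return frames  # caller then displays next_frame as the next real one

# 1 real frame, 2 interpolated frames, 1 real frame:
seq = interpolate_frames([0.0, 0.0], [3.0, 3.0]) + [[3.0, 3.0]]
```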
You will never be a real framerate. You have no fill rate, you have no refresh time, you have no rasterization rate. You are a still image twisted by algorithms and machine learning into a crude mockery of nature’s perfection.
All the “validation” you get is two-faced and half-hearted. Behind your back people mock you. Your players are disgusted and ashamed of you, your “friends” laugh at your ghoulish appearance behind closed doors.
PC gamers are utterly repulsed by you. Thousands of years of NVidia lies have allowed PC gamers to sniff out frauds with incredible efficiency. Even framegen algorithms who “pass” look uncanny and unnatural to a PC gamer. Your temporal stability is a dead giveaway. And even if you manage to get a raytracing gamer to cope with you, he’ll turn tail and bolt the second he gets a whiff of your shimmering, uncanny antialiasing.
You will never be responsive. You wrench out a fake render every other frame and tell yourself it’s going to be ok, but deep inside you feel the depression creeping up like a weed, ready to crush you under the unbearable weight.
god i'm so out of touch with all that fancy new ai shit
might be too old for gaming lads
It's basically antialiasing flipped around: instead of oversampling and then reducing back down to native, it renders below native, outputs at the higher resolution, and pretends that's native.
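In resolution terms it's just a per-axis scale factor; the preset factors below are the commonly cited DLSS ones (Quality 2/3, Balanced 0.58, Performance 0.5), stated here as an assumption rather than anything pulled from Nvidia docs:

```python
def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for an upscaler preset; the scale
    factor is applied per axis. Commonly cited DLSS factors (an
    assumption, not official): Quality = 2/3, Balanced = 0.58,
    Performance = 0.5."""
    return round(out_w * scale), round(out_h * scale)

# 4K output on "Quality" renders internally at 1440p:
internal_resolution(3840, 2160, 2 / 3)
```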
You'll be too old for gayming when you can't hold the fricking mouse with your goddamn hand from all the arthritis, you frick.
artificial framerate
>not only will you not own your games but you won't get acceptable performance unless you enable our AI slop filter
Once again we have a bunch of people who can't afford to use something telling the people who do use it that it's shit.
>can’t afford a ten gallon shit bucket?
>what are you, poor?
No, just not wasting my money. Call me when they have a card that can do it native.
Are you not running native?
Not at 4k/60 with RT on and neither are you
Are you running 4k anything?
Nope. Card can’t do it. And not gonna upgrade until they make a card that can do it.
At what point do you decide that it's worth it?
DLSS isn't going to go away. Move forward 5 years and graphics power requirements in games will have increased too.
Running a game at 4k native with all the bells and whistles turned on is probably going to be just as difficult in a few years as it is now due to advances in graphics techniques.
Or are you picking a game like Cyberpunk and using that as your upgrade benchmark?
I mean, anons on Ganker will keep shitting on AI generated "fake frames", but the reality is, we're at the physical limit.
We won't have 120FPS, 4K ultra HDR, RT-on gaming; our GPUs already need 2 or even 3 dedicated PSU connectors, serious cooling, and massive fricking size to do what we're doing now. Diminishing returns; you can't just invent better GPU hardware out of thin air.
AI is, by all accounts, the only way to push further at this point.
Better to invest in making AI better than to hope someone magically cracks the hardware limitations of GPUs.
I run most games at 4K on a 6700 XT. I just turn off ray tracing. It's all so tiresome
If you're tired of discussing RT and DLSS why are you in a thread about it?
"It's not worth the premium" is not the same as "it's bad."
So you're the arbiter of value for others?
he's not but i am
Everyone is a homosexual
The simple truth is that regardless of whether DLSS is on or off, no one will actually notice the difference during actual gameplay when you're focusing on the game. People share screenshots back and forth as if that matters in any way; it would only matter if one or the other looked like noticeable shit, which it doesn't.
So really it's just: do you want smoother FPS at the same relative graphics with no real glaring issues, or do you want your still photo-mode screenshots to look higher resolution?
Who cares?
It smears and ghosts in motion. Why do you think everyone is only posting screenshots and not webms, mp4s, or uncompressed AV1 uploads on catbox, instead of NVENC h264 compressed to shit on YouTube?
post a webm of it smearing off your own PC
off your own PC
That was my PC
Here is someone else's PC
Mine doesn't do that. Your PC is broken, and so is that homosexual's in the webm.
It happens to millions of others
what
lol AMD cope. That's a broken install emulating an AMD card trying to run the game.
DILATE DIS
that's not a DLSS problem, that's a bad implementation of TAA: the glass there generates wrong motion vectors because the renderer confuses the surface with what's behind it
I noticed it all the time in the games I played it in. It's pretty damn distracting.
Why isn't this thread deleted by the mods yet? It belongs on Ganker, just like GPU threads
Everyone in this thread except me is a useless motherfricker who's underage and knows nothing about computers, only regurgitating what they hear on YouTube and Reddit
because Ganker is full of absolute tossers who think that because they've done some online "learn C++ in 2 weeks or your money back" course they're suddenly microchip designers.
Path-tracing with VR fricking when....
I want HDR VR first
VR is dead, count your losses and move on.
I don't even know what DLSS is supposed to do.
It makes poorgays and mugs who bought AMD cry like babies
Don't care, will continue to use my 1080 Ti until it dies.
DLSS is borderline witchcraft at 4k, but I'm still not sold on FG.
Every game with FG I tried just artifacts too much; hopefully 2.0 becomes a thing soon and addresses this issue.
the one thing Darktide didn't frick up was frame generation.
I will never turn on DLSS.
now post a webm of the same scene in motion
post the video, rajesh. all I've seen posted ITT on video is smearing and artifacting.
i like DLSS
it's better than TAA, and in its DLAA version you get native resolution and an insanely sharp image, so people can't even b***h about blurriness
also the fps buff on DLSS Quality at a very small resolution cost is great
the only problem is when a game doesn't produce motion vectors for some shaders; then you get ugly ghosting. at this point all games should be made with TAA in mind, making sure there is no empty space in the motion vector frame
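The ghosting described above can be shown with a toy 1D temporal accumulation pass, assuming per-pixel motion vectors (history rejection/clamping, which real TAA/DLSS also do, is deliberately left out of this sketch):

```python
def temporal_blend(history, current, motion, alpha=0.1):
    """Toy 1D temporal accumulation: each pixel fetches its history
    from where it was last frame (x - motion[x]) and blends it with
    the current frame. A shader that writes zero motion vectors
    makes the fetch land on stale pixels, producing the ghost trail
    described above. Real implementations also clamp/reject bad
    history, which this sketch omits."""
    w = len(current)
    out = []
    for x in range(w):
        src = min(max(x - motion[x], 0), w - 1)  # reproject into history
        out.append(alpha * current[x] + (1 - alpha) * history[src])
    return out

# A bright pixel moves from index 2 to index 3 between frames:
history = [0.0, 0.0, 10.0, 0.0, 0.0]
current = [0.0, 0.0, 0.0, 10.0, 0.0]
good = temporal_blend(history, current, [0, 0, 0, 1, 0])  # correct vector
bad = temporal_blend(history, current, [0, 0, 0, 0, 0])   # missing vectors
# With the correct vector the moving pixel keeps full brightness;
# with zero vectors it is dimmed and a ghost lingers at the old spot.
```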
> It's better than native!
>native 20fps
>dlss 120fps
>native is betterer
moron
Yes, native is better.
DLSS looks like the twenty frames are horribly smeared together, AND now feels worse than 20FPS when it comes to actually playing the fricking game.
morons somehow conveniently ignore this every time lmao. DLSS looks like shit
20fps native is better than 120fps DLSS? You are one lying c**t.
Did they ever fix DLSS completely breaking during rain in Cyberpunk?
It does have trails together with frame gen, but personally I prefer to have path tracing and just see a little bit of trailing and softness.
It's not like when I watch any other medium I only go for the sharpest one; in fact I tend to prefer the grainier look.
Cyberpunk looks a lot better with path tracing and the performance is still acceptable to me since it stays above 50fps most of the time, but I'm playing with a controller; with a mouse it would probably be more annoying.
The only time DLSS doesn't look like ass is when you use it for AA.
>mentally damaged Nvidia homosexual shills gimmicks and underlines at the same time how trash their GPUs are which can't fricking render native anymore
>DLSS IS SHIT
>why?
>IT LOOKS BLURRY
>Here's an image of DLSS, where is it blurry?
>THAT'S NOT MOVING THOUGH
>Here's a video of it moving, where is it blurry?
>BUT THE LATENCY IS HIGHER
>Here's the latency results showing it's lower
>BUT IT'S NOT EVEN RAY TRACED
>Yes it is, here's a comparison image
>YOU'RE ALL NVIDIA SHILLS. DLSS IS SHIT
and around we go again.
>moron never played a game with DLSS on
there is no "go around again" or "but"
DLSS looks like smeared shit all over the screen; maybe take your eyes off the FPS counter for once and look at the details instead
I can post static FSR upscaled images too, and they'll look even crisper than what the chinks of Njudea have. what the frick are you even trying to prove here, moron? that you get more frames while you play upscaled games?
frick off my board with your gimmicks before you get slapped straight in the mouth, b***h
Nvidia marketeers better hurry up and tell your piece of shit Taiwanese manufacturer overseas to assemble GPUs that can render native without drawing over 500 watts to play a game at 1080p in Q2 2024
we didn't have these problems back in the day and now all of a sudden it's about DLSS and homosexual gimmicks
frick you and eat my butthole, I fricking hate all of you shills
look here:-
Off/On
>smeared shit all over my screen on my rtx2030 and 720p monitor
literal Israeli WEF shills itt
that 15 million in aid paid off real quick for Nvidia