>introduces new tech that will make cards punch way above their weight
>devs just use it as a crutch for bad optimization
every time
just turn your settings down
Well yeah it makes the game look better at the same performance.
>it makes the game look better
fricking lol
I love having vaseline smeared over my screen at all times just to maintain 60fps because modern devs and engines are apparently fricking awful at their job
Most people don't have that experience. You're probably overreacting.
>Makes Claim
>Doesn't share Proof
>can't even be bothered to fake one either
Can't be an AMD cuck, they're burning their houses down as we speak.
More like nGreedia wienersuckers that are burning down their houses with their great 12vhpwr connectors lmao
>vaseline smeared over my screen
that was true for the early versions of the thing
not anymore (in most cases, implementation quality differs from game to game)
You're thinking of TAA, which DLSS isn't.
Also
>what is dlaa
better? nah, shit never looks as good as native.
>anon refuses to use the solutions offered to him
Don't care, will use AMD so I can game on Linux
>Don't care, will use AMD so I can game on Linux
based
it was the intention from the beginning
>punch way above their weight
why do you apes insist on sounding like a journalist
For me it's 4x TXAA with FidelityFX CAS at 50% sharpness.
>10 years ago people were fine playing 1080p 60fps and 144fps was considered high end
>now gamers want 4k 144fps as the norm
nah what happened to YOU?
This. 1080p 60FPSchads aren't having any trouble.
and in the before times we used to play games in 240p. What's your point?
Well then you should have no problem with low resolutions.
>>now gamers want 4k 144fps as the norm
we don't even have 720p 30fps as the norm right now
fake resolution, fake framerates, fake women, fake men
fake and gay world
10 years before then people were fine playing 320x200 (stretched to 4:3 aspect ratio by the monitor) at 35fps. Technology has, in fact, changed since you were a kid. It's the same as how we've moved on from cassette tapes to flacs.
Guess what else changed? The requirements. Just go play your 20 year old game in 1000 fps and shut up about it.
No shit the requirements changed. There is no reason for developers to create modern games as if they're making a game for 486es back in 1993. Requirements went up because hardware became stronger. You're just some dumb kid who can't stand the fact a 2023 game is not, in fact, meant for a poverty grade system built from 2013 parts.
A high end gaming computer from 1994 was severely outmoded and in need of upgrades by 1996. This is the norm.
>You're just some dumb kid who can't stand the fact a 2023 game is not, in fact, meant for a poverty grade system built from 2013 parts.
Are you talking to yourself? I'm not the one complaining about AAA not running in high fps.
Are you a bot or just moronic?
>tell you guys you're stupid for wanting giga performance when it has always been 30-60 fps
>"HAHA YOU MAD THAT YOU CANT JUST RUN WITH POVERY SYSTEM IUHAIDHASIUHD"
dsfsdjhsd?
>nah what happened to YOU?
10 years
moronic devs are pushing for 4K and every game looks like absolute dog shit at 1080p due to TAA vaseline smear.
Just play in 1080p with DSR + DLSS and treat it as AA. Oh wait, you can't, because you already upgraded your monitor and now you need to upscale to 4k+ to get rid of the blur. :^)
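the math behind the DSR + DLSS trick, for anyone curious. rough sketch, per-axis scales assumed from nvidia's published numbers (DSR 4x = 2x per axis, DLSS Quality ~0.667x), not gospel:

def internal_res(display_w, display_h, dsr_per_axis=2.0, dlss_scale=0.667):
    # DSR raises the output target, DLSS lowers the render resolution
    target_w, target_h = display_w * dsr_per_axis, display_h * dsr_per_axis
    return round(target_w * dlss_scale), round(target_h * dlss_scale)

print(internal_res(1920, 1080))  # (2561, 1441): ~1440p internal on a 1080p panel

so you end up rendering above native on a 1080p panel, which is why it works as AA.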
4k was considered high end 10 years ago you turbo homosexual
still is
But you couldn't really get above 60fps on 4k in 2013. Now I think gpus are pushing above 90fps in some new games.
The thing is that people are getting used to bigger screens, TVs and smartphones have bigger displays and that's what people are getting into.
If you have a big TV and a new PC, it's just natural that people want to connect them so they can play on the TV, more so when people already do this on consoles.
My monitor is HD, but when I upgrade it just makes sense that I'd want to play in QHD, because it makes no sense to pay money for HD gaming in the current year.
It's not anyone's fault that resolution and monitors play a crucial part in building a PC now, but this stupid industry will find a way to make it more expensive for everyone soon enough again, so I really don't know anymore.
>It's not anyone's fault that resolution and monitors play a crucial part in building a PC now
Nobody is forcing you to buy 4k. Like bro, they're not gonna come to your door and shoot you if you don't buy.
I'm still playing on my 1200p 16:10 60fps monitor from 2011. Couldn't ever justify spending more money on a monitor to spend more money on cpu/gpu to play the same games.
Always thought dlss was a crutch for ray tracing.
I barely see a difference when I play 2K with DLSS. Meme Tracing and Frame Generation are shit though
i did a blind test with hogwarts legacy when frame generation was first being utilized and in most cases you literally can not tell the difference
i'll add that i'd imagine it varies from game to game, but typically there's no reason to have it off, especially not when you're looking at the massive performance gains
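if you want to redo the blind test yourself instead of taking my word for it, something like this works. throwaway sketch, the clip filenames are placeholders, record your own captures with framegen on/off:

import random

# minimal blind A/B harness: hide which clip is which, collect guesses, score
clips = {"A.mp4": "framegen_on", "B.mp4": "framegen_off"}
score = 0
for _ in range(10):
    order = random.sample(list(clips), 2)  # random presentation order
    guess = input(f"play {order[0]} then {order[1]}; which has framegen on? ")
    score += clips.get(guess) == "framegen_on"
print(f"{score}/10 correct (around 5/10 means you genuinely can't tell)")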
Forced TAA and DLSS/FSR/XeSS are a cancer to the gaming industry
Everything looks like vaseline smeared across my screen especially in motion
Fricking moron consoombrains need to stop normalizing games not looking sharp anymore
Nintendo's decision to get out of the hardware arms race feels smarter by every year that passes, hardware isn't important for good looking games anymore, it just allows devs to be lazier with optimization. Go play Uncharted 2 on the PS3 and tell me that shit doesn't look better than 99% of games released today.
You say this but anything not developed by nintendo themselves, even their system-selling third party exclusives, runs like absolute fricking shit.
TotK runs like shit as well. Even their flagship Switch game is poorly optimized.
I fell through the floor on the final robotnik fight four times because it fails to render if you run too fast in multi-player. stop.
Except even Nintendo themselves are just as lazy with optimization regardless of the hardware difference; instead of running upscalers they just have the game run at 366p, period.
>games look like this
>still can't hit 60fps on a 4090 at native resolution
homie why are you playing at 16k
>optimization means I can play a new game at high settings with my decade old computer - every unintelligent jobless frickwit who grew up during 7th gen
we already passed the point where games need to look any better. devs should focus on actually making the games fun instead of raising the amount of ass hairs my character has in order to israelite more money out of me.
I'm gonna one up you. We should go back to high end Ps2/Gamecube era graphics.
>we alrea-ACK!
You're a brain damaged mongoloid with bad vision who can't handle change and demand the world and all technology in it be stuck in your childhood forever.
And he will be happy.
>hardware engineers work their ass off for 20% gains so software Black folk can expand to fill all available resources
what else do you expect them to do? moron.
This is literally how every efficiency/productivity increase in every industry has worked since the beginning of time.
consoles have been using this since 2010 which is the only reason they've been able to maintain 30-60 fps
They've been using upscaling far longer than that. 6th gen and prior upscaled games to roughly 640x480, or 640x576 in PAL territories - the general viewing area of a CRT, with full broadcast resolution of 720x480 or 720x576. 7th gen from the very beginning scaled some games up to 720p, ran some games at 720p native, then scaled both up to 1080p if you had a 1080p panel connected.
it's crazy how they've taken this tech and marketed fake 4k with it
>6th gen and prior upscaled games to roughly 640x480
Then what was the native resolution
512x448
ps1 went down to 256×224
It varied depending on the system and game. You'd have to look it up.
Out of necessity. The consoles had to connect to and output to televisions, which worked at a fixed resolution of 720x480 NTSC or 720x576 PAL, interlaced in either case. Signals had to come in at those resolutions. 640x480 is something of an approximation for NTSC's actual visual resolution, as much of the full picture in the broadcast is offscreen. In television broadcasts, this extra offscreen space is used to store extra data, such as color data and closed captioning.
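to put the 512x448 example in numbers (rough sketch, the exact visible area varies by TV):

# the stretch in numbers (NTSC figures from above)
native = (512, 448)        # what the console actually renders
visible = (640, 480)       # rough visible area of the 720x480 raster
sx = visible[0] / native[0]
sy = visible[1] / native[1]
print(f"{sx:.2f}x horizontal, {sy:.2f}x vertical")  # 1.25x, 1.07x
# i.e. non-square pixels: the CRT draws the image wider than it's stored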
Why would 6th gen and earlier bother upscaling anything when they were still using CRTs?
>introduces new tech that will make cards punch way above their weight
What fricking tech? Upscaling and interpolation? Why the frick would you ever use any of that?
>>devs just use it as a crutch for bad optimization
Predictable, games get more and more broken.
It got so bad Epic had to switch default settings in their engine for shader compilation since devs didn't give a shit about stutter or spreading the workload across your cpu.
Ray tracing implementation in games is also a joke: instead of doing what it was made for, real-time global illumination, they instead slap it on as some worthless reflections or AO, or broken shadows in the last 3 weeks before release.
>great AA solution
>better image stability
>better performance
DLSS is great tech, games still frick up motion vectors by not generating them for all objects so you get ghosting up the ass, but when it works, it works wonders
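for anyone wondering why missing motion vectors means ghosting: the accumulator reprojects last frame's pixels along the vectors, so anything that moved without reporting one gets blended with stale history. toy numpy sketch of the general TAA/DLSS accumulation idea, not nvidia's actual internals:

import numpy as np

# toy 1D temporal accumulation: history is reprojected by the motion
# vector, then blended with the current frame
W = 8
history = np.zeros(W); history[2] = 1.0   # bright object at x=2 last frame
current = np.zeros(W); current[3] = 1.0   # it moved to x=3 this frame

def accumulate(cur, hist, mv, alpha=0.1):
    return alpha * cur + (1 - alpha) * np.roll(hist, mv)

print(accumulate(current, history, mv=1))  # vector reported: energy lands at x=3
print(accumulate(current, history, mv=0))  # vector missing: 0.9 stuck at x=2 = ghost trail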
>play indie game
>it looks like some PS2 shit
>performs worse than the latest monster hunter on my computer
i hate it
The reason is that those indie games usually use UE or Unity, which are extremely bloated for the simple graphics those devs are going for. The baseline performance is shit and it's pretty much impossible to improve without heavily rewriting the engine (beyond the devs' capabilities). Making use of the baseline performance is beyond their capabilities as well though, so you're stuck with bad graphics and bad performance. Hopefully Godot will eventually take off and at least 2D games will finally perform as they should.
i can't tell the difference besides the left one having better shadows.
cant wait to switch to an rtx 40 series card after using 1050
dlss is cool
>4k meme
It's going to keep happening, card manufacturers and the gaming industry aren't anywhere close to on the same page. I believe that's why Nvidia are trying to nope out of the gaming meme, they can't facilitate the moronic obsession with better graphics when they're barely keeping up with demand. If the whole VRAM shortage wasn't obvious enough, I think they're just tired of burning silicon
>input lag:on
>ghosting:on
>temporal aliasing:on
>fake frames:on
>not native resolution:on
>garbage crutch for shitty developers:on
>no good game supports it:on
>homosexuals buy nvidia:on
>get btfo'd by amd at every turn because they cannot make a good card for cheap:on