>introduces new tech that will make cards punch way above their weight
>devs just use it as a crutch for bad optimization
every time

  1. 5 months ago
    Anonymous

    just turn your settings down

  2. 5 months ago
    Anonymous

    Well yeah it makes the game look better at the same performance.

    • 5 months ago
      Anonymous

      >it makes the game look better
      fricking lol
      I love having vaseline smeared over my screen at all times just to maintain 60fps because modern devs and engines are apparently fricking awful at their job

      • 5 months ago
        Anonymous

        Most people don't have that experience. You're probably overreacting.

      • 5 months ago
        Anonymous

        >Makes Claim
        >Doesn't share Claim
        >can't even be bothered to fake one either
        Can't be an AMD cuck, they're burning their houses down as we speak.

        • 5 months ago
          Anonymous

          More like nGreedia wienersuckers that are burning down their houses with their great 12vhpwr connectors lmao

      • 5 months ago
        Anonymous

        >vaseline smeared over my screen
        that was true for the early versions of the thing
        not anymore (in most cases, implementation quality differs from game to game)

      • 5 months ago
        Anonymous

        You're thinking of TAA, which DLSS isn't.
        Also
        >what is dlaa

    • 5 months ago
      Anonymous

      better? nah, shit never looks as good as native.

  3. 5 months ago
    Anonymous

    >anon refuses to use the solutions offered to him

  4. 5 months ago
    Anonymous

    Don't care, will use AMD so I can game on Linux

    • 5 months ago
      Anonymous

      >Don't care, will use AMD so I can game on Linux
      based

  5. 5 months ago
    Anonymous

    it was the intention from the beginning

  6. 5 months ago
    Anonymous

    >punch way above their weight
    why do you apes insist on sounding like a journalist

  7. 5 months ago
    Anonymous

    For me it's 4x TXAA with Fidelity CAS at 50% sharpness.

  8. 5 months ago
    Anonymous

    >10 years ago people were fine playing 1080p 60fps and 144fps was considered high end
    >now gamers want 4k 144fps as the norm
    nah what happened to YOU?

    • 5 months ago
      Anonymous

      This. 1080p 60FPSchads aren't having any trouble.

    • 5 months ago
      Anonymous

      and in the before times we used to play games in 240p. What's your point?

      • 5 months ago
        Anonymous

        Well then you should have no problem with low resolutions.

    • 5 months ago
      Anonymous

      >>now gamers want 4k 144fps as the norm
      we don't even have 720p 30fps as the norm right now
      fake resolution, fake framerates, fake women, fake men
      fake and gay world

    • 5 months ago
      Anonymous

      10 years before then people were fine playing 320x200 (stretched to 4:3 aspect ratio by the monitor) at 35fps. Technology has, in fact, changed since you were a kid. It's the same as how we've moved on from cassette tapes to flacs.

      • 5 months ago
        Anonymous

        Guess what else changed? The requirements. Just go play your 20 year old game in 1000 fps and shut up about it.

        • 5 months ago
          Anonymous

          No shit the requirements changed. There is no reason for developers to create modern games as if they're making a game for 486es back in 1993. Requirements went up because hardware became stronger. You're just some dumb kid who can't stand the fact a 2023 game is not, in fact, meant for a poverty grade system built from 2013 parts.
          A high end gaming computer from 1994 was severely outmoded and in need of upgrades by 1996. This is the norm.

          • 5 months ago
            Anonymous

            >You're just some dumb kid who can't stand the fact a 2023 game is not, in fact, meant for a poverty grade system built from 2013 parts.
            Are you talking to yourself? I'm not the one complaining about AAA not running in high fps.

            • 5 months ago
              Anonymous

              Are you a bot or just moronic?

              • 5 months ago
                Anonymous

                >tell you guys that you're stupid for wanting giga performance when it has always been 30-60 fps
                >"HAHA YOU MAD THAT YOU CANT JUST RUN WITH POVERTY SYSTEM IUHAIDHASIUHD"
                dsfsdjhsd?

    • 5 months ago
      Anonymous

      >nah what happened to YOU?
      10 years

    • 5 months ago
      Anonymous

      moronic devs are pushing for 4K and every game looks like absolute dog shit at 1080p due to TAA vaseline smear.

      • 5 months ago
        Anonymous

        Just play in 1080p with DSR + DLSS and treat it as AA. Oh wait you can't because you already upgraded your monitor and now you need to upscale to 4k+ to get rid of the blur. :^^
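
        The rough math on that DSR + DLSS combo, assuming the usual 4x DSR factor and a ~2/3 per-axis render scale for the Quality preset (both are assumptions, exact factors vary by driver and preset):

        ```python
        # Sketch of the DSR + DLSS "use it as AA" arithmetic on a 1080p panel.
        # Assumed numbers: 4x DSR (2x per axis) and ~0.667 per-axis scale for
        # the Quality preset; real factors vary, this is only illustrative.

        def dsr_dlss_resolutions(native=(1920, 1080), dsr_per_axis=2.0, dlss_scale=2/3):
            output = (round(native[0] * dsr_per_axis), round(native[1] * dsr_per_axis))
            internal = (round(output[0] * dlss_scale), round(output[1] * dlss_scale))
            return output, internal

        output, internal = dsr_dlss_resolutions()
        print("DSR output target:", output)       # (3840, 2160)
        print("DLSS internal render:", internal)  # (2560, 1440)
        # The internal render ends up above the panel's native 1080p and then
        # gets downsampled back to it, which is why it acts like supersampling AA.
        ```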

    • 5 months ago
      Anonymous

      4k was considered high end 10 years ago you turbo homosexual

      • 5 months ago
        Anonymous

        still is

        • 5 months ago
          Anonymous

          But you couldn't really get above 60fps on 4k in 2013. Now I think gpus are pushing above 90fps in some new games.

    • 5 months ago
      Anonymous

      The thing is that people are getting used to bigger screens; TVs and smartphones have bigger displays and that's what people are getting into.
      If you have a big TV and a new PC, it's just natural that people want to connect them so they can play on the TV, more so when people already do this on consoles.
      My monitor is HD, but when I upgrade it just makes sense that I'd want to play in QHD, because it makes no sense to pay money for HD gaming in the current year.
      It's not anyone's fault that resolution and monitors play a crucial part in building a PC now, but this stupid industry will find a way to make it more expensive for everyone soon enough again, so I really don't know anymore.

      • 5 months ago
        Anonymous

        >It's not anyone's fault that resolution and monitors play a crucial part in building a PC now
        Nobody is forcing you to buy 4k. Like bro they're not gonna come to your door and shoot you if you don't buy.

    • 5 months ago
      Anonymous

      I'm still playing on my 1200p 16:10 60fps monitor from 2011. Couldn't ever justify spending more money on a monitor to spend more money on cpu/gpu to play the same games.

  9. 5 months ago
    Anonymous

    Always thought dlss was a crutch for ray tracing.

  10. 5 months ago
    Anonymous

    I barely see a difference when I play 2K with DLSS. Meme Tracing and Frame Generation are shit though

    • 5 months ago
      Anonymous

      i did a blind test with hogwarts legacy when frame generation was first being utilized and in most cases you literally can not tell the difference

      • 5 months ago
        Anonymous

        i'll add that i'd imagine it varies from game to game, but typically there's no reason to have it off, especially not when you're looking at the massive performance gains
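
        As a rough model of where those gains come from, assuming the usual one-generated-frame-per-rendered-frame scheme (the 2 ms generation cost below is a made-up figure, real overhead varies per game and GPU):

        ```python
        # Toy model of frame generation pacing: one interpolated frame is
        # presented between each pair of rendered frames, so presented fps is
        # roughly double the rendered fps. The overhead figure is assumed.

        def framegen_presented_fps(rendered_fps, gen_overhead_ms=0.0):
            frametime_ms = 1000.0 / rendered_fps + gen_overhead_ms
            return 2 * 1000.0 / frametime_ms

        print(framegen_presented_fps(60))       # ~120 in the idealised case
        print(framegen_presented_fps(60, 2.0))  # ~107, a bit under 2x once generation cost is paid
        ```
        Input is still sampled at the rendered rate, which is why latency doesn't improve along with the number.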

  11. 5 months ago
    Anonymous

    Forced TAA and DLSS/FSR/XeSS are a cancer to the gaming industry
    Everything looks like vaseline smeared across my screen especially in motion
    Fricking moron consoombrains need to stop normalizing games not looking sharp anymore

  12. 5 months ago
    Anonymous

    Nintendo's decision to get out of the hardware arms race feels smarter with every year that passes, hardware isn't important for good looking games anymore, it just allows devs to be lazier with optimization. Go play Uncharted 2 on the PS3 and tell me that shit doesn't look better than 99% of games released today.

    • 5 months ago
      Anonymous

      You say this but anything not developed by nintendo themselves, even their system-selling third party exclusives, runs like absolute fricking shit.

      • 5 months ago
        Anonymous

        TotK runs like shit as well. Even their flagship Switch game is poorly optimized.

    • 5 months ago
      Anonymous

      I fell through the floor on the final robotnik fight four times because it fails to render if you run too fast in multi-player. stop.

    • 5 months ago
      Anonymous

      Except even Nintendo themselves are just as lazy with optimization regardless of the hardware difference; instead of running upscalers they just have the game run at 366p, period.

  13. 5 months ago
    Anonymous

    >games look like this
    >still can't hit 60fps on a 4090 at native resolution

    • 5 months ago
      Anonymous

      homie why are you playing at 16k

  14. 5 months ago
    Anonymous

    >optimization means I can play a new game at high settings with my decade old computer - every unintelligent jobless frickwit who grew up during 7th gen

    • 5 months ago
      Anonymous

      we already passed the point where games need to look any better. devs should focus on actually making the games fun instead of raising the amount of ass hairs my character has in order to israelite more money out of me.

      • 5 months ago
        Anonymous

        I'm gonna one up you. We should go back to high end Ps2/Gamecube era graphics.

      • 5 months ago
        Anonymous

        >we alrea-ACK!
        You're a brain damaged mongoloid with bad vision who can't handle change and demands the world and all technology in it be stuck in your childhood forever.

        • 5 months ago
          Anonymous

          And he will be happy.

  15. 5 months ago
    Anonymous

    >hardware engineers work their ass off for 20% gains so software Black folk can expand to fill all available resources

    • 5 months ago
      Anonymous

      what else do you expect them to do? moron.

  16. 5 months ago
    Anonymous

    This is literally how every efficiency/productivity increase in every industry has worked since the beginning of time.

  17. 5 months ago
    Anonymous

    consoles have been using this since 2010 which is the only reason they've been able to maintain 30-60 fps

    • 5 months ago
      Anonymous

      They've been using upscaling far longer than that. 6th gen and prior upscaled games to roughly 640x480, or 640x576 in PAL territories - the general viewing area of a CRT, with full broadcast resolution of 720x480 or 720x576. 7th gen from the very beginning scaled some games up to 720p, ran some games at 720p native, then scaled both up to 1080p if you had a 1080p panel connected.
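
      To make the "scaled up" step concrete, a toy nearest-neighbour upscale looks like the sketch below (real console scalers used filtered scaling and internal resolutions varied per game, so this is purely illustrative):

      ```python
      # Toy nearest-neighbour upscaler, just to show what "stretch the
      # framebuffer to the panel resolution" means. Actual hardware scalers
      # used filtered (bilinear or better) scaling, not this.

      def upscale_nearest(src, dst_w, dst_h):
          src_h, src_w = len(src), len(src[0])
          return [
              [src[y * src_h // dst_h][x * src_w // dst_w] for x in range(dst_w)]
              for y in range(dst_h)
          ]

      # e.g. a 720p framebuffer stretched to a 1080p panel (1.5x per axis)
      frame_720p = [[(x + y) % 256 for x in range(1280)] for y in range(720)]
      frame_1080p = upscale_nearest(frame_720p, 1920, 1080)
      print(len(frame_1080p[0]), "x", len(frame_1080p))  # 1920 x 1080
      ```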

      • 5 months ago
        Anonymous

        it's crazy how they've taken this tech and marketed fake 4k with it

      • 5 months ago
        Anonymous

        >6th gen and prior upscaled games to roughly 640x480
        Then what was the native resolution

        • 5 months ago
          Anonymous

          512x448

          • 5 months ago
            Anonymous

            ps1 went down to 256×224

        • 5 months ago
          Anonymous

          It varied depending on the system and game. You'd have to look it up.

          >Why would 6th gen and earlier bother upscaling anything when they were still using CRTs?

          Out of necessity. The consoles had to connect to and output to televisions, which worked at a fixed resolution of 720x480 NTSC or 720x576 PAL, interlaced in either case. Signals had to come in at those resolutions. 640x480 is something of an approximation for NTSC's actual visual resolution, as much of the full picture in the broadcast is offscreen. In television broadcasts, this extra offscreen space is used to store extra data, such as color data and closed captioning.

      • 5 months ago
        Anonymous

        Why would 6th gen and earlier bother upscaling anything when they were still using CRTs?

  18. 5 months ago
    Anonymous

    >introduces new tech that will make cards punch way above their weight
    What fricking tech? Upscaling and interpolation? Why the frick would you ever use any of that?

    • 5 months ago
      Anonymous

      >>devs just use it as a crutch for bad optimization
      Predictable, games get more and more broken.
      It got so bad Epic had to switch default settings in their engine for shader compilation since devs didn't give a shit about stutter or spreading the workload across your CPU.
      Ray tracing implementation in games is also a joke; instead of doing what it was made for, real time global illumination, they instead slap it on as some worthless reflections or AO, or broken shadows in the last 3 weeks before release.

      >great AA solution
      >better image stability
      >better performance
      DLSS is great tech, games still frick up motion vectors by not generating them for all objects so you get ghosting up the ass, but when it works, it works wonders
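
      To spell out why missing motion vectors show up as ghosting, here's a stripped-down 1D sketch of the history reprojection step that temporal techniques share (an illustration of the idea only, not DLSS's actual internals):

      ```python
      # Minimal 1D sketch of a temporal resolve: fetch last frame's colour
      # through the motion vector, clamp it against the current neighbourhood,
      # then blend. Not DLSS's real algorithm, just the idea behind ghosting.

      def temporal_resolve(current, history, motion, alpha=0.1):
          """current/history: per-pixel colours, motion: per-pixel offset in pixels."""
          n = len(current)
          out = []
          for x, cur in enumerate(current):
              src = min(max(int(round(x - motion[x])), 0), n - 1)  # where this pixel was last frame
              prev = history[src]
              neigh = current[max(x - 1, 0):x + 2]                 # crude history rejection
              prev = min(max(prev, min(neigh)), max(neigh))
              out.append(alpha * cur + (1 - alpha) * prev)         # output is mostly history
          return out

      # A bright object moved from pixel 5 to pixel 8 between frames.
      history = [0, 0, 0, 0, 0, 1.0, 0, 0, 0, 0]
      current = [0, 0, 0, 0, 0, 0,   0, 0, 1.0, 0]
      good_mv = [0, 0, 0, 0, 0, 0,   0, 0, 3,   0]  # the object wrote its motion vector
      bad_mv  = [0] * 10                            # the object wrote no motion vectors

      print(temporal_resolve(current, history, good_mv)[8])  # 1.0 -> resolves cleanly
      print(temporal_resolve(current, history, bad_mv)[8])   # 0.1 -> mostly blended away, reads as smearing/ghosting in motion
      ```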

  19. 5 months ago
    Anonymous

    >play indie game
    >it looks like some PS2 shit
    >performs worse than the latest monster hunter on my computer
    i hate it

    • 5 months ago
      Anonymous

      The reason is that those indie games usually use UE or Unity, which are extremely bloated for the simple graphics those devs are going for. The baseline performance is shit and it's pretty much impossible to improve without heavily rewriting the engine (beyond the devs' capabilities). Making use of the baseline performance is beyond their capabilities as well though, so you're stuck with bad graphics and bad performance. Hopefully Godot will eventually take off and at least 2D games will finally perform as they should.

  20. 5 months ago
    Anonymous

    i can't tell the difference besides the left one having better shadows.

  21. 5 months ago
    Anonymous

    cant wait to switch to an rtx 40 series card after using 1050
    dlss is cool

  22. 5 months ago
    Anonymous

    >4k meme

  23. 5 months ago
    Anonymous

    It's going to keep happening; card manufacturers and the gaming industry aren't anywhere close to being on the same page. I believe that's why Nvidia are trying to nope out of the gaming meme, they can't facilitate the moronic obsession with better graphics when they're barely keeping up with demand. If the whole VRAM shortage didn't make it obvious, I think they're just tired of burning silicon

  24. 5 months ago
    Anonymous

    >input lag:on
    >ghosting:on
    >temporal aliasing:on
    >fake frames:on
    >not native resolution:on
    >garbage crutch for shitty developers:on
    >no good game supports it:on
    >homosexuals buy nvidia:on
    >get btfo'd by amd at every turn because they cannot make a good card for cheap:on
