Meme? Yay or nay?

  1. 2 years ago
    Anonymous

    It's the future. Seems like we're heading towards games relying more and more on upscaling for presentation, which means devs are gonna be targeting 1080p native on high end cards and then upscale to 1440p or 4k. Which coincidentally means your 1080p cards are gonna be made obsolete.

  2. 2 years ago
    Anonymous

    Hardware has no future; gaming is trash.
    PS2 was the last time games were fun.

    • 2 years ago
      Anonymous

      Games were never fun nostalgiahomosexual. They've always been shit.

  3. 2 years ago
    01001101 01000100 01000110

    I can barely tell the difference.

  4. 2 years ago
    Anonymous

    looks like shit, proprietary shitware

  5. 2 years ago
    Anonymous

    DLSS requires Nvidia or the devs to train AI on billions of hours of gameplay footage and then repeat until the result looks perfect

    AMD can't catch up that fast with FSR

    • 2 years ago
      Anonymous

      Neither of them is doing per-game AI training now. It's literally just a switch you can flip in UE4 (and I presume UE5 as well).

      • 2 years ago
        Anonymous

        moron

      • 2 years ago
        Anonymous

        if it's still vendor locked it's shit

      • 2 years ago
        Anonymous

        No. The guy you replied to is right. They have to train these things with big training sets of the game.
        It's not just a setting in UE4; you need data from NVIDIA.

        • 2 years ago
          Anonymous

          >The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games.

          They're both half right: DLSS is natively supported in UE4 and Unity and doesn't require any per-game training, but it does still use AI. It genuinely is just a thing you can toggle on or off, however.

        • 2 years ago
          Anonymous

          >They have to train these things with big training sets of the game.
          How does it feel to still be stuck in 2018?
          Since DLSS 2.0 they're using generic models that work with everything.
          DLSS integration into UE4 is literally a plugin.
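
          To illustrate the "literally a plugin" bit: UE4 tracks optional plugins in the project's .uproject file, which is plain JSON, so turning one on is just an entry flip. Rough sketch in Python; the plugin name "DLSS" and the project file name are assumptions for illustration, not taken from any particular SDK.

          import json

          UPROJECT = "MyGame.uproject"  # hypothetical project file

          with open(UPROJECT, "r", encoding="utf-8") as f:
              project = json.load(f)

          # find or create the DLSS plugin entry and mark it enabled
          plugins = project.setdefault("Plugins", [])
          entry = next((p for p in plugins if p.get("Name") == "DLSS"), None)
          if entry is None:
              plugins.append({"Name": "DLSS", "Enabled": True})
          else:
              entry["Enabled"] = True

          with open(UPROJECT, "w", encoding="utf-8") as f:
              json.dump(project, f, indent=2)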

        • 2 years ago
          Anonymous

          How does it feel to be wrong?

          Does a fresh game off the compiler have thousands of hours of AI training already?

  6. 2 years ago
    Anonymous

    What game?

  7. 2 years ago
    Anonymous

    Nay. Only because it makes native 4k gays seethe.

  8. 2 years ago
    Anonymous

    4K DLSS Quality is the future and I love it. It makes playing 4K actually viable, and I hope either Nvidia or AMD soon makes a PROPER working universal DLSS equivalent you can just enable at the driver level.

    • 2 years ago
      Anonymous

      This. I just want to be able to use it to upscale older lower res games

      • 2 years ago
        Anonymous

        Upscaling is taking a lower res and displaying it at a higher resolution, which you don't really need for older games since they already run at pretty much whatever resolution you want on modern hardware.

        >it makes playing 4K actually viable
        so it's not actually 4k, what's the point

        Pixel count. I play on a 1440p monitor but I downscale from 4K using DLDSR and it looks a lot better. Try it yourself; the increased sample count brings out a lot of detail over your native resolution.

        you're not actually running 4K, it's 1080p upscaled. AMD's version is even worse, however

        DLSS Quality at 4K is 1440p upscaled (quick math below)
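
        Quick math, using the commonly cited per-axis input scales (Quality is roughly 2/3, Balanced roughly 0.58, Performance 1/2, Ultra Performance roughly 1/3; treat these as approximations, exact values can vary by game and DLSS version):

        # sketch: internal render resolution for common DLSS modes at a given output res
        SCALES = {
            "Quality": 2 / 3,
            "Balanced": 0.58,
            "Performance": 1 / 2,
            "Ultra Performance": 1 / 3,
        }

        def internal_res(out_w, out_h, mode):
            s = SCALES[mode]
            return round(out_w * s), round(out_h * s)

        print(internal_res(3840, 2160, "Quality"))      # (2560, 1440) -> "4K Quality" renders at 1440p
        print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> "4K Performance" renders at 1080p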

        • 2 years ago
          Anonymous

          i have a 1070, not supported i'm pretty sure

      • 2 years ago
        Anonymous

        What's the point of upscaling in older games? Even entry-level GPUs are capable of running them maxed out at 4K. They don't use TAA or modern rendering techniques that rely on image reconstruction either, so they would completely break apart when passed through DLSS or FSR.

    • 2 years ago
      Anonymous

      >it makes playing 4K actually viable
      so it's not actually 4k, what's the point

      • 2 years ago
        Anonymous

        There he is. Knew he'd show up.

    • 2 years ago
      Anonymous

      you're not actually running 4K, it's 1080p upscaled. AMD's version is even worse, however

  9. 2 years ago
    Anonymous

    I don't like the artifacts it causes.

    • 2 years ago
      Anonymous

      exactly this, i assume the people that use dlss also use motion smoothing on their TVs so everything looks like 60+ fps

      • 2 years ago
        Anonymous

        DLSS has a big problem with implementation: it works really badly in some games like Escape from Tarkov, while in Monster Hunter Rise you pretty much cannot see it.

        those are fixed in DLSS 2.4+ versions

        They are not. Like I said above, it really depends on the implementation on a game-by-game basis. In some games DLSS is just great while in others it is really, really bad, but I still use it since I can just supersample and then use DLSS.

        i have a 1070, not supported i'm pretty sure

        yeah it's not, sadly. DLDSR is really great tho if you ever get an RTX card, because it cleans up the artifacting from supersampling.

        • 2 years ago
          Anonymous

          >ever get a RTX
          naah with nvidia's continuation of vendor locked software and hardware features imma go to amd when prices go down

          • 2 years ago
            Anonymous

            AMD will surely make a similar tech when they figure it out so yeah it doesn't really matter.

            • 2 years ago
              Anonymous

              the software they've made in the past hasn't been ati/amd card locked i'm pretty certain?

              • 2 years ago
                Anonymous

                I don't think they have, but what I meant is they just don't exist at the moment; if you want to use DLDSR it's only on Nvidia cards. FSR 2.0 is a good go at DLSS tho, even if it's not as good yet, but it'll get better.

    • 2 years ago
      Anonymous

      those are fixed in DLSS 2.4+ versions

  10. 2 years ago
    Anonymous

    You need it if you want to use RTX on a 1440p+ monitor, which is always.

  11. 2 years ago
    Anonymous

    DLSS is decent, but nvidia is moronic. FSR 2.0 shows they didn't have to sacrifice a huge amount of die area just to give it to you. They could've used existing hardware. Putting AI and tensor cores on the die is fricking moronic. That shit only belongs in workstations and supercomputers. Turing was a fricking failure and rightfully so. Despite using a smaller process node, it wasn't really more efficient, because they had to cut shaders, ROPs and TMUs to make the AI shit and tensor cores fit.
    I don't think I need to tell you just how shit the 3000 series' power consumption is. They upped the shader/SM number back to what it was with Pascal, but they still kept all the AI and tensor shit, making the dies frick huge. Despite using an 8nm process, compared to Pascal's 16nm process, the 3050's die with 20 SMs is only around 12% smaller than the 1080's die with the same number of SMs.
    It also shows that developing this shit has made them stagnate and regress in IPC for games. A 1080 is clocked roughly the same but performs better than a 3050 despite using the same number of clusters, and it probably would've had a smaller die size and lower power consumption if it was on the same process node.
    Nvidia has fricked us all over.

    • 2 years ago
      Anonymous

      >I don't think I need to tell you just how shit the 3000 series' power consumption is.
      who the frick cares? They're really good cards, if you need to worry about power consumption maybe you should not own a computer in the first place

      • 2 years ago
        Anonymous

        >Black person didn't get it at all
        The power consumption isn't the issue; it's the thing that shows you why they fricked up and why they had to resort to that. The manufacturing process they use is a lot more efficient, yet they didn't manage to increase efficiency. The reason is that more of the die area is dedicated to shit you don't fricking need on a gaming GPU. If they used the area spent on AI and Tensor cores for more shaders, ROPs and TMUs, you could either have a smaller die, costing less and using less power to get the same performance, or the same size die using the same amount of power with significantly more performance.

      • 2 years ago
        Anonymous

        the excessive power consumption sucks when your fans go into overdrive every time you boot a game; my 3080 sounds like a goddamn jet engine

        • 2 years ago
          Anonymous

          just undervolt bro

    • 2 years ago
      Anonymous

      "FSR 2.0 proves blah blah blah..."
      stopped reading there. fsr2.0 looks like actual shit.

      • 2 years ago
        Anonymous

        looks great in cyberpunk with the mod
        >b-but cyberpunk is ba-
        it looks good.

    • 2 years ago
      Anonymous

      It's pretty nice if you're doing ML, however.

  12. 2 years ago
    Anonymous

    can someone explain to me why the frick nvidia made dlss only supported on 20 series cards and above? people who use new cards don't fricking need res upscaling, it's the 10 series cards and below that need it, what a bunch of fricking morons

    • 2 years ago
      Anonymous

      to sell new cards

    • 2 years ago
      Anonymous

      it relies on tensor cores. and the 2060 and 3050 aren't exactly spectacular cards on their own.

  13. 2 years ago
    Anonymous

    >adds nothing to gameplay because nobody knows how to use it aside from making lighting and reflections realistic
    it's a meme and most likely will stay a meme because ultra-realistic graphics aren't something impressive in 2022

    • 2 years ago
      Anonymous

      That's RTX, not DLSS, learn to read.

      DLSS itself is a meme. It relies on nVidia actually using their resources to have it implemented in games. Sure they do it for all big-name games now, but what about in 10 years or so? Remember PhysX?
      Upscaling though is probably here to stay for a while. 4K is actually picking up in adoption and it's clear neither AMD nor nVidia have a way to actually drive that resolution without punching up the power requirements to the point where gamers might actually start to think about their power bills which is quite a feat. Scaling is pretty much the only viable solution for now.

      >DLSS itself is a meme. It relies on nVidia actually using their resources to have it implemented in games. Sure they do it for all big-name games now, but what about in 10 years or so? Remember PhysX?

      PhysX was basically implemented in game engines when it gained adoption and it stopped needing dedicated hardware/die as CPUs and GPUs got better, in a similar manner to how hardware acceleration in sound cards from the 90s became obsolete when CPUs got good enough to not be dragged down by sound processing.

      https://i.imgur.com/yPIH0fr.jpg

      Meme? Yay or nay?

      I kinda like it in some games, since it allows me to run my TV at native 4K with acceptable framerates.

  14. 2 years ago
    Anonymous

    What am I supposed to be seeing there?
    what the hell is with these captchas lately holyfrick

  15. 2 years ago
    Anonymous

    yea, meme
    real-time pathtracing was making much better progress in the year immediately prior to "rtx" and now shit's all fricked up just because MUH DENOISER

    • 2 years ago
      Anonymous

      also ironic given that DLSS is some of the noisiest shit around with its temporal artifacting

  16. 2 years ago
    Anonymous

    DLSS itself is a meme. It relies on nVidia actually using their resources to have it implemented in games. Sure they do it for all big-name games now, but what about in 10 years or so? Remember PhysX?
    Upscaling though is probably here to stay for a while. 4K is actually picking up in adoption and it's clear neither AMD nor nVidia have a way to actually drive that resolution without punching up the power requirements to the point where gamers might actually start to think about their power bills which is quite a feat. Scaling is pretty much the only viable solution for now.

    • 2 years ago
      Anonymous

      >Remember PhysX?
      Never took off for graphics stuff, but it has become a fundamental component of physics simulations in many game engines that are still used today. A better comparison would be that hair thing they tried to push with Tomb Raider. That flopped.

  17. 2 years ago
    Anonymous

    On resolutions higher than 4K, yay

  18. 2 years ago
    Anonymous

    DLSS is an extremely adaptable and valuable tech but I fear devs will use it as a crutch in the future to excuse their shitty optimization

  19. 2 years ago
    Anonymous

    don't mind me just doing some cheap post processing

  20. 2 years ago
    Anonymous

    in my very limited experience, dlss + dldsr was a godsend for rdr2 on my 1080p display. setting dldsr to 2.25x and dlss to quality, i was finally able to get rid of the shitty smeary blurry TAA look of the game and it runs nearly as well as native. (rough math on that combo below)
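
    Rough math on that combo, assuming DLDSR 2.25x means 1.5x per axis and DLSS Quality renders at roughly 2/3 of the output resolution per axis (both are the commonly quoted figures, not exact):

    # why DLDSR 2.25x + DLSS Quality on a 1080p screen roughly round-trips to a ~1080p render,
    # just reconstructed and downsampled instead of rendered straight to the panel
    native_w, native_h = 1920, 1080

    dldsr_w, dldsr_h = int(native_w * 1.5), int(native_h * 1.5)          # 2880 x 1620 output target
    render_w, render_h = round(dldsr_w * 2 / 3), round(dldsr_h * 2 / 3)  # DLSS Quality internal res

    print(dldsr_w, dldsr_h)    # 2880 1620
    print(render_w, render_h)  # 1920 1080 -> same pixel count as native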

  21. 2 years ago
    Anonymous

    D44M already proved you don't need any memes like that. Just a well optimized game and well calibrated AA. DLSS is a crutch for devs who are incompetent.

    • 2 years ago
      Anonymous

      b-but doom eternal utilizes dlss...

      • 2 years ago
        Anonymous

        >doom eternal utilizes dlss
        Wat?

  22. 2 years ago
    Anonymous

    The only people that dislike it are AMD third-worlders. Expect them to defend that ripoff attempt called FSR, which can also be used with Nvidia cards.

    • 2 years ago
      Anonymous

      I assume you also paid a $200 premium for your g-sync monitor?

      • 2 years ago
        Anonymous

        my g-sync monitor is awesome, no screen tearing ever and no added lag like vsync

        • 2 years ago
          Anonymous

          what monitor

          • 2 years ago
            Anonymous

            PG279Q. I dunno if they still make/sell it but I love it. 1440p gives a nice resolution bump without burning all your GPU's fill rate like 4K

            • 2 years ago
              Anonymous

              could've paid much less for a freesync alternative

              • 2 years ago
                Anonymous

                freesync isn't as good as gsync tho

              • 2 years ago
                Anonymous

                >does the exact same thing
                >doesn't cost 200 dollars
                >doesn't draw extra power
                yes g-sync is so good that freesync wasn't adopted as the industry standard or anything and new g-sync monitors are released all the time

                you use an iphone by any chance?

              • 2 years ago
                Anonymous

                >LFC ensures that variable refresh rate will still work below the adaptive sync refresh window. In other words, a display that has an adaptive sync window of 40Hz to 100Hz will still suffer from screen tearing or stuttering if your framerate drops below 40fps. LFC will prevent this from happening and this is one of the key benefits that G-Sync offers over FreeSync
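
                Hedged sketch of what that LFC behavior amounts to (the 40-100 Hz window is just the example from the quote; the real driver logic is more involved):

                # when fps drops below the panel's minimum VRR refresh, present each frame
                # multiple times so the effective refresh stays inside the window
                def lfc_refresh(fps, vrr_min=40, vrr_max=100):
                    if fps >= vrr_min:
                        return fps, 1                    # already inside the window
                    mult = 1
                    while fps * mult < vrr_min and fps * (mult + 1) <= vrr_max:
                        mult += 1
                    return fps * mult, mult

                print(lfc_refresh(30))   # (60, 2): 30 fps shown at 60 Hz, each frame presented twice
                print(lfc_refresh(15))   # (45, 3): 15 fps shown at 45 Hz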

              • 2 years ago
                Anonymous

                So your argument as to why a very expensive proprietary module is better (despite doing the same thing as a free alternative) is that it works at sub-40 fps?

              • 2 years ago
                Anonymous

                it works at any fps, I literally never see screen tearing since I got this monitor

              • 2 years ago
                Anonymous

                neither have I because I haven't played below 60fps in over a decade, what a stupid argument, could've spent that money on a better gpu to actually run the game

              • 2 years ago
                Anonymous

                >implying I don't already have the best gfx card

              • 2 years ago
                Anonymous

                you have the best graphics card on the market and you're playing below 30 fps?

              • 2 years ago
                Anonymous

                some games aren't well coded and have hitches while loading or whatever. Obviously average FPS is much higher than 30

              • 2 years ago
                Anonymous

                My 2016 Nixeus NXVUE24 Freesync monitor has a 30-144hz freesync range and has LFC that kicks in below 30hz.
                Your claim is invalid.

              • 2 years ago
                Anonymous

                so? this seems really moronic
                my shit does 48 Hz
                what does it matter? any game dropping below 60 just doesn't happen unless it's dogshit or really broken and couldn't be fixed by modders

              • 2 years ago
                Anonymous

                I got a freesync monitor and it had visible backlight flicker as the game was running at 60fps. Had to return it after 2 days. Apparently this isn't a problem with proper g-sync.

              • 2 years ago
                Anonymous

                Not a freesync problem. When I got my first one I was actually surprised how problem-free it was. Just works globally with every game.

              • 2 years ago
                Anonymous

                Freesync on cheap monitors doesn't always work well with Nvidia cards. You need to buy a G-Sync Compatible or certified monitor. The actual G-Sync "ultimate" module or whatever is a meme.

  23. 2 years ago
    Anonymous

    IMO it's nice to have, an extra setting to utilize in search of that perfect balance of performance and visuals, but not really a card seller because even now only a few games actually implement it, and even fewer implement it well.

  24. 2 years ago
    Anonymous

    It's blurry shite. Don't use it if you're able to maintain 60+ fps without it.

  25. 2 years ago
    Anonymous

    The DLSS concept is good, but the problem is that the implementation varies so wildly. DLSS is absolute dogshit in something like RDR2 or Death Stranding but really good in Control and COD:MW, for example.

  26. 2 years ago
    Anonymous

    I stepped away from Nvidia last month. I still have my 2080 Ti but it only does upscaling and HTPC stuff now. DLSS is trash; it's only useful at higher res like 4K. My 6900 XT is so powerful I don't really need it, and if I do, FSR 2 is getting modded into a lot of games now.

  27. 2 years ago
    Anonymous

    In order of games I've played with DLSS:

    Death Stranding: Leaves trails behind those particles rising from the ground, doesn't seem out of place though. Birds in the sky leave long trails, which looks bad. Less blurry than the game's TAA. The game already runs well, but DLSS keeps it at 4K/60.

    Cyberpunk 2077: Basically mandatory since the game runs like ass. Refunded after a few hours so can't give a good DLSS review.

    Red Dead Redemption 2: It released without DLSS and I had to stomach the worst TAA I've ever seen in a game. When DLSS came out, it was just a straight upgrade from the game's TAA visually, and it's a demanding game where the performance increase is welcome. Always enable it for RDR2.

    Ready or Not: It's alright in this game. Red dots and holo sights leave tiny trails but not enough to interfere with aiming.

    Escape From Tarkov: Haven't played much since it was updated into the game but checked it out some. Can barely even tell, possibly the best implementation of DLSS yet.

    • 2 years ago
      Anonymous

      >Death Stranding: Leaves trails behind those particles rising from the ground
      You can replace the DLSS DLL with a newer version and it's fixed. In Director's Cut they use a more recent version out of the box.
      In Hitman 3 DLSS looks like trash no matter what version of DLSS you use; I unironically prefer FSR 1.0 in that case.
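
      For anyone who hasn't done the DLL swap before, it's just a file copy. Hypothetical sketch in Python: the game path is made up, and while nvngx_dlss.dll is the file DLSS titles normally ship, some games keep it in a subfolder, so check first.

      import shutil
      from pathlib import Path

      game_dir = Path(r"C:\Games\DeathStranding")      # hypothetical install path
      new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you sourced yourself

      old_dll = game_dir / "nvngx_dlss.dll"
      backup = old_dll.with_name(old_dll.name + ".bak")

      shutil.copy2(old_dll, backup)     # keep the original so you can roll back
      shutil.copy2(new_dll, old_dll)    # drop in the newer build
      print("swapped", old_dll)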

  28. 2 years ago
    Anonymous

    Whoa increased contrast and lowered brightness. Amazing

    • 2 years ago
      Anonymous

      That's not what DLSS does at all. It's resolution and performance related and there's no way the AI would make such lighting adjustments. OP's pic is bait.

  29. 2 years ago
    Anonymous

    isn't dlss mainly for performance boosts like rtx?

    • 2 years ago
      Anonymous

      It can add more fine details to 4K and the future iterations will just keep getting better at doing that.

      So soon it will be better than native resolutions.

  30. 2 years ago
    Anonymous

    >wait decades for a remake
    >wait another year for it to come to PS5
    >wait half a year for it to come to EGS
    >wait another year for it to come to Steam

    >No DLSS
    >TAA that's even shittier than RDR2's
    >Graphics: On/off
    >Stutters
    >Time jannies
    >Can't play as Red XIII
    >$70

    Went a little off topic but I really wish it had DLSS. Frame rate isn't even bad, I get 60 FPS at 1800p and DLSS isn't gonna fix stutters but I just hate its TAA and wouldn't mind upscaling to 2160p.

    • 2 years ago
      Anonymous

      >isn't gonna fix stutters
      Run the game in DX11; you'll have to manually enable v-sync in the drivers though.
      At 1080p it legit looks like a 540p game lmao. You can edit the TAA settings in the config to use fewer samples (example below).
      Fortunately I have a 4K screen and it looks acceptable with stock TAA settings. Dynamic resolution does a pretty good job.
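
      Since it's a UE4 game, the usual way to do the TAA tweak is adding cvars under [SystemSettings] in the user Engine.ini. Sketch below; the config path and the sample count are assumptions, so check where this particular game keeps its writable config before copying this.

      from pathlib import Path

      # hypothetical location - UE4 games usually keep a writable Engine.ini somewhere under Documents\My Games
      engine_ini = Path.home() / "Documents" / "My Games" / "<GameName>" / "Config" / "Engine.ini"

      # r.TemporalAASamples is a stock UE4 cvar; fewer samples = less smear, more shimmer
      tweak = "\n[SystemSettings]\nr.TemporalAASamples=4\n"

      with open(engine_ini, "a", encoding="utf-8") as f:
          f.write(tweak)
      print("appended TAA tweak to", engine_ini)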

  31. 2 years ago
    Anonymous

    MEMEMEMEMEE

  32. 2 years ago
    Anonymous

    Dlss causes ghosting.
    Frick that shit.

  33. 2 years ago
    Anonymous

    idk things run fine on my 3070ti

  34. 2 years ago
    Anonymous

    FSR 2.0 is a game changer for anyone still on older nvidia GPUs or any dx12 capable AMD GPUs, managed to bring my average FPS in Cyberpunk 2077 to 60fps in Performance mode (just like FSR 1 Quality) with incredible clarity bordering on native res.
    This is on a 2015 R9 Fury and apparently it's even faster on GTX 10 series GPUs.

    • 2 years ago
      Anonymous

      The value add to older hardware is great. Technology at its best.
      However, it is a real shame that it seems like we're reaching physical limits before 4K 144hz native. At least with current transistor tech and architectures.

      • 2 years ago
        Anonymous

        AMD is chasing MCM style designs for their GPUs, that may be the breakthrough we need.

  35. 2 years ago
    Anonymous

    DLSS turns games into a blurry smeary mess of shit, no matter what DF says. Maybe it will improve over time but right now it is just not good enough and neither are the similar ones coming out like FSR.

    • 2 years ago
      Anonymous

      1080p is 20 years old. Stop being giga poor

  36. 2 years ago
    Anonymous

    For me, the latest greatest new technology is frame rate limiting, because my GPU is so fast. It's a bit moronic that every game doesn't have it.

    • 2 years ago
      Anonymous

      I put a hard cap to something like 600. No use in going 1600fps during menus

  37. 2 years ago
    Anonymous

    As long as you're playing at 1440p and up, it's not, I guess

  38. 2 years ago
    Anonymous

    Not a meme.
    It's a more optimal way to get satisfactory image quality at decent framerates. As it stands, the current approach with pure rasterization and AA no longer scales particularly well, so DLSS comes in and does better by upscaling from a lower resolution.

  39. 2 years ago
    Anonymous

    definitely not a meme

  40. 2 years ago
    Anonymous

    I refuse to accept this going forward. I don't expect games to run at native 8k or 16k or whatever, but 1440p and 4k should absolutely be playable without this bullshit faking it tech. It's all in the hands of game developers so god help us.

  41. 2 years ago
    Anonymous

    good bait thread
    DLSS only increases the resolution, it doesn't change the colors in any way
