Are AMD GPUs really that bad?

  1. 1 year ago
    Anonymous

    What are the best graphics cards/processors for upgrading a pc?

    • 1 year ago
      Anonymous

      pentium 3 and voodoo 2

      • 1 year ago
        Anonymous

        Riva tnt shits on the voodoo

    • 1 year ago
      Anonymous

      The ones that offer a sizeable performance improvement without needing a new motherboard or power supply.

      Considering a 6800 XT to upgrade from my 1080 Ti; I'd get a 3080 Ti but those are like $1,200. I might go team red after 10 years of green.
      Anyone know if I can do stable diffusion on Windows with a 6800 XT? SD is about the only thing I'd miss from Nvidia tbh.

      You cannot run Stable Diffusion on a 6800 XT on Windows; AMD's GPU compute ecosystem, ROCm, is currently Linux-only. I get about 2.43 iterations a second on my RX 6800, which is similar to a 3060, and the 6800 XT doesn't do much better. Radeon cards aren't especially good for anything beyond gaming, save for mining when that was still profitable.

      • 1 year ago
        Anonymous

        I'll be honest, most I do on my 1080 is game, save for the occasional SD render or video I make. And I'd assume that's the same for a lot of people... so would I have to run stable diffusion in a Linux VM for AMD?

        • 1 year ago
          Anonymous

          >would I have to run stable diffusion in a linux VM for AMD?
          That would be a solution, but Hyper-V and most hypervisors you can run within Windows can't give a VM full control of a GPU that's already in use by Windows. If you have integrated graphics or another graphics card, you could use that for video output to circumvent the issue.

      • 1 year ago
        Anonymous

        >I get about 2.43 iterations a second on my RX 6800
        on linux? damn so i have to run linux to try out AI shit? i was considering dual booting anyway but it's still a bit of a pain in the ass. haven't touched linux in a while.
        NTA btw.

        • 1 year ago
          Anonymous

          >on linux
          Yes.
          >damn so i have to run linux to try out AI shit?
          Only if you want to use your Radeon GPU. PyTorch, TensorFlow, and other AI frameworks work perfectly fine on CPUs; it's just not as fast.
          >haven't touched linux in a while
          Install either Fedora or anything Arch-based. In my experience they're the easiest way to get ROCm up and running; everything is already in their repositories.
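
          In case it helps, a minimal sketch of what that setup looks like; the package name and the ROCm version tag below are assumptions that vary by distro and date, so check your repositories first:

```shell
# Arch: a ROCm build of PyTorch is in the official repositories
# (package name assumed; search with `pacman -Ss rocm` to confirm)
sudo pacman -S python-pytorch-rocm

# Any distro: install the ROCm wheel from PyTorch's official index
# (the rocm5.4.2 tag is an example; substitute the current version)
pip install torch --index-url https://download.pytorch.org/whl/rocm5.4.2
```

          A quick sanity check afterwards is `python -c "import torch; print(torch.cuda.is_available())"`; ROCm is exposed through the "cuda" device name, so this prints True on a working install.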

          I get 6 it/s on a 6700 XT on Linux, I assume your numbers are Windows?

          My number on Windows would be none. To give you more details: I'm running Fedora 37 and Python 3.10, basically using the sample code from Stable Diffusion's Hugging Face page. I haven't tried any of the fancy GUIs available.

          # Adapted from the stable-diffusion-2-1 sample on Hugging Face.
          # Note: ROCm builds of PyTorch expose the GPU through the "cuda"
          # device name, so .to("cuda") works on Radeon cards under Linux.
          from sys import argv

          import torch
          from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

          model_id = "stabilityai/stable-diffusion-2-1"

          # Use the DPMSolverMultistepScheduler (DPM-Solver++) scheduler here instead
          pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
          pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
          pipe = pipe.to("cuda")

          prompt = argv[1]  # the prompt is the first command-line argument
          image = pipe(prompt).images[0]

          image.save("out.png")

          • 1 year ago
            Anonymous

            >Pytorch, Tensorflow and all AI frameworks work perfectly fine on CPUs, it's just not as fast.
            how much slower are we talking? or to be more specific, how do you think a 5700x will perform? more cores is better i assume?
            and RAM will be relevant instead of VRAM?

            • 1 year ago
              Anonymous

              Generating an image in Stable Diffusion, from prompt to final output, takes around 30 seconds on my RX 6800.
              Doing the same on my laptop with an R5 5600U takes about 10 minutes.
              Given the two extra cores and higher clock speed of a 5700X, you should get closer to half that, and of course it will use RAM like anything else running on your CPU.
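
              The numbers in this thread are consistent with each other. A quick back-of-the-envelope sketch, assuming the default ~50 sampler steps (the step count is an assumption; the rates are just the figures anons posted, not benchmarks):

```python
# Rough seconds per image from an iterations-per-second figure.
# Assumes ~50 denoising steps; model load and VAE decode add a few seconds.
STEPS = 50

def seconds_per_image(iters_per_second: float, steps: int = STEPS) -> float:
    return steps / iters_per_second

print(round(seconds_per_image(2.43)))  # RX 6800 figure above: ~21 s
print(round(seconds_per_image(6.0)))   # 6700 XT figure: ~8 s
```

              At 2.43 it/s that's ~20 s of pure sampling, which matches the ~30 s prompt-to-output time above once loading and decoding overhead is added.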

              • 1 year ago
                Anonymous

                thanks anon.

      • 1 year ago
        Anonymous

        I get 6 it/s on a 6700 XT on Linux, I assume your numbers are Windows?

  2. 1 year ago
    Anonymous

    No, it's what consoles have been using for decades

    • 1 year ago
      Anonymous

      Not really strengthening your case there, bucko.

  3. 1 year ago
    Anonymous

    they're good at the poorgay price range. if you're paying 1k+ for a gpu anyway, you're a moron if you pick AMD though.

    • 1 year ago
      Anonymous

      This. Better performance/price than Nvidia on the low end

      • 1 year ago
        Anonymous

        My 3060 12gb was $240 what would have been better?

        • 1 year ago
          Anonymous

          That's a great price for a 3060. They've been consistently around $350 for months, while the RX 6600 is $225 and has been as low as $190. It's not even really a choice when the price gap is that big.
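
          The budget-tier argument is just price-per-frame arithmetic. A toy sketch using the prices quoted here; the relative performance index is a placeholder for illustration (3060 = 100), not a benchmark:

```python
# Dollars per unit of performance. Prices are from the posts above;
# the perf index is an illustrative placeholder, not measured data.
cards = {
    "RTX 3060 at $350 street": (350, 100),
    "RTX 3060 at $240 (anon's deal)": (240, 100),
    "RX 6600 at $225": (225, 85),
}

for name, (price, perf) in cards.items():
    print(f"{name}: ${price / perf:.2f} per perf point")
```

          With placeholder numbers like these, the $240 deal beats the RX 6600 on value, but at the usual $350 street price it doesn't come close.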

    • 1 year ago
      Anonymous

      >if youre paying 1k+ for a gpu you're a moron
      ftfy

      • 1 year ago
        Anonymous

        Feels good having disposable income and not being a frickup

        • 1 year ago
          Anonymous

          Just because you have disposable income to buy a pile of shit doesn't change the fact that you bought a pile of shit

          • 1 year ago
            Anonymous

            yeah but you're a poor homosexual who can't buy a pile of shit and are mad about it

    • 1 year ago
      Anonymous

      I'd rather have the 7900 XT than the 4070TI (way more VRAM), but the 4080 is better than the 7900 XTX imo (you won't need more than 16GB for quite a while). The 4090 is untouchable.
      For anything below the $800 price range, AMD is better and cheaper.

      • 1 year ago
        Anonymous

        >muh VRAM
        I bet you are one of the morons that fell for the RX 480 8GB meme.

        • 1 year ago
          Anonymous

          But 2 games currently at 1440p or higher use more than 12GB and more will come because games are no longer optimized

          two games being hogwarts and warhammer something

    • 1 year ago
      Anonymous

      nvidia shill thinks he's slick lol

      • 1 year ago
        Anonymous

        He's right about what people's perceptions are. If you have to choose between a $1,000 7900 XTX and a $1,200 4080, the $200 becomes less meaningful in that price range. People are spending top dollar, and they want top results from those kinds of cards. AMD has made good improvements since RX 5000 in their GPU division, but they have a deserved reputational hit for a number of reasons. The OpenGL driver issue took them 10 years too long to address; that was performance left on the table for generations. Even this gen, they're supposed to be the efficiency company, and both of the 7900s launched with idle power consumption of ~100W. They need a couple of years of good driver support to turn that perception around. They also need to drop this strategy of pricing their flagship cards $100-$200 below Nvidia's: if they want to gain market share, they're going to need to really undercut Nvidia.

  4. 1 year ago
    Anonymous

    No, as long as you stick to the previous gen (which you should be doing for any graphics cards).

  5. 1 year ago
    Anonymous

    AMD is the best on GNU/Linux.

  6. 1 year ago
    Anonymous

    best value for the buck. If you are not buying a 4090, I suggest buying AMD

    • 1 year ago
      Anonymous

      Nobody on this board should be buying a 4090

      • 1 year ago
        Anonymous

        >should
        and how were you planning on stopping me?

        • 1 year ago
          Anonymous

          by collapsing the market

      • 1 year ago
        Anonymous

        >Nobody on this board should be buying a 4090
        LOL

        • 1 year ago
          Anonymous

          >winshit11
          >most israelite'd GPU on the market
          peak goyslop

          • 1 year ago
            Anonymous

            not my problem

            • 1 year ago
              Anonymous

              good goyim

              • 1 year ago
                Anonymous

                stop kvetching

        • 1 year ago
          Anonymous

          Point still stands

        • 1 year ago
          Anonymous

          good goy

  7. 1 year ago
    Anonymous

    6700XT is all you need.

    • 1 year ago
      Anonymous

      this, get yourself an ASRock or Sapphire 6700 XT for less than 400 bucks and you're good to go for 1440p@120fps

      • 1 year ago
        Anonymous

        I just bought an XFX 6700xt, did I frick up?

        • 1 year ago
          Anonymous

          no, that's good

        • 1 year ago
          Anonymous

          >XFX
          reviews on both Amazon and newegg say they have some coil whine at 75c+ with those cards

          My powercolor red devil runs cool as a cucumber. Haven't seen it hit 60c once in my meshify 2 compact.

          >PowerColor
          usually shit and also reported coil whine, some cheap features like adding in bronze tubes instead of proper heatsinks

          • 1 year ago
            Anonymous

            Well mine stays cool and doesn't whine so I dunno what to tell you.

            • 1 year ago
              Anonymous

              which version is it? there are 3 different variants of each card they have

              • 1 year ago
                Anonymous

                >red devil

          • 1 year ago
            Anonymous

            >bronze tubes instead of proper heatsinks
            Source???
            You can't say these things without proof Black person

            • 1 year ago
              Anonymous

              My source is that I made it the frick up
              Powercolor is for poor chinx and street shitters

          • 1 year ago
            Anonymous

            Over the years I have owned multiple AMD GPUs from different partners. Every single one of them had coil whine at some temperature. It's like the brand itself is cursed.

            • 1 year ago
              Anonymous

              Only GPU I've ever had with noticeable coil whine was a Zotac GTX 1060 3GB.

        • 1 year ago
          Anonymous

          I've had one for a while and haven't had any problems (besides fitting it in my case).

        • 1 year ago
          Anonymous

          Nah it's good.

          >XFX
          reviews on both amazon and newegg say they have some coil whine at 75c+ with those cards
          [...]
          >PowerColor
          usually shit and also reported coil whine, some cheap features like adding in bronze tubes instead of proper heatsinks

          >reviews on both amazon and newegg say they have some coil whine at 75c+ with those cards
          How do you even reach that? With one side case fan mine doesn't even go over 65c in FURMARK let alone actual vidya. But it is the mid tier variant with the better cooler.

      • 1 year ago
        Anonymous

        My powercolor red devil runs cool as a cucumber. Haven't seen it hit 60c once in my meshify 2 compact.

  8. 1 year ago
    Anonymous

    In my experience they have some weird performance issues on the desktop, like stuttering when watching videos in Firefox or resizing windows. Kinda similar to the NVIDIA experience on Linux. Other than that they're fine.

    • 1 year ago
      Anonymous

      Isn't that a common issue with MPO? They actually fixed that a month or so ago, and it's more a Windows issue than directly an AMD issue.

      • 1 year ago
        Anonymous

        Just tried turning it off and it didn't change anything. When I watch a 4K YouTube video in Firefox it stutters really badly even though it claims it's not dropping any frames. I remember not having this problem when I was using NVIDIA.

        • 1 year ago
          Anonymous

          >When I watch a 4K YouTube video in Firefox it stutters really badly
          Hardware acceleration bug strikes again.

    • 1 year ago
      Anonymous

      >in my experience as an incompetent moron that can barely operate a computer,

      • 1 year ago
        Anonymous

        >in my experience as an incompetent moron that can barely operate a computer,
        Are you going to finish the sentence? We'd love to hear your thoughts on this topic.

  9. 1 year ago
    Anonymous

    I've used 4 AMD GPUs and one Nvidia. Only one AMD GPU fricked up on me. My 6700 XT is currently in transit to Asus. Not even six fricking months and it shit the bed. Anyway, you'll find plenty of people b***hing about one company or the other because their products stopped working. Keep in mind that people with bad experiences are more likely to be vocal about them, so for every bad thing you read, there are thousands of people who didn't have that experience.

    • 1 year ago
      Anonymous

      >asus
      Why did you buy asus?

      • 1 year ago
        Anonymous

        It was the cheapest 6700xt at the time. $399.99

      • 1 year ago
        Anonymous

        So Asus is now considered a shit brand? What happened?

        • 1 year ago
          Anonymous

          ASUS has always been shit, just less shit than other brands. Their laptops blow caps and resistors all the time.

          • 1 year ago
            Anonymous

            I've had a budget laptop (~€530) from Asus since 2015 and it still works, although after a recent drop the display has a weird pink line on it (1 pixel wide). My old 970 DirectCU II is also still kicking, and so is my old Asus motherboard from 2013 in my media PC. Weird to hear people consider it bad.

        • 1 year ago
          Anonymous

          nta but their customer service is shit
          if you're on AMD you should always pick sapphire if you can.

          • 1 year ago
            Anonymous

            >had 2 sapphire nitro+ GPUs
            >both amd
            >both coil whine
            Frick Satan

            y̷̧͈͓̖̦̪͖̰͔͈̓͗̍̾̓̑͆̋̑̚̚ͅŏ̴̝̬̤̹̀̈̉̃̋̀̊̊̾̃ư̵̧̝̗͓̦̝͍̑̀̌̔̇̂̍̓̈́͊ͅ ̸̜̜̫͆̈́̀d̶̢͖̹̉͆̕ợ̵̧̞͂͑̾̔̑͒̏̎̾͋͒̄̂ņ̶̃̎͌t̸̨̧̛̟̭̮̠͛̊̌̃̋̔̾̍̕͝͠ͅ ̷̨̼̙̭̝̟̯̱͛̐͋̽͋̐̚͠l̴̡̡͚̙̮͚̺̉͋̓͂͊̽̚̕̚͠͝ȋ̶̛̜̗̼̯̓̈̊̑͆̾̀́̊̓̓̓͜k̶̝̹̒̌̃͐̾̽̇̎͠͝͠ͅĕ̸̞̤͎̣̓̈ ̷̨̩̙̩͉͚̘͌̂̽̋͗̔͋͐͗͛͛̀͝c̵̢̮͉̫̗̖̻̭̖̠̮͐̈́̆̓̈́̊̉͋͆͠ǫ̵̤̦̰͓̻͍̟͇̰̒͑̔́̄̄͑́̒̆͠i̸̛̙̫̯̜͎͇͖͐͐̒̈́̋̍̀̇̾́́̍ͅĺ̶̪̜̻̝͓̐͜ ̵͍̮͎̱̬̩̮́̓̋̓͋̂̌́͒̚̚͝ẁ̸͈̙̹̗̜͎̮̪̤̬̤̺̼̌̔͗̃͊͌̓̓̿̚͜ͅh̸̩͇̹̙̺̘̰̹͌̒̂́́͊̀̾͂̍̾̓͆ͅi̷̟̟͙̘͉̬͕̻̜̙͈̭̗̹̖͑n̸̨͎̤͖̞̣͙̦̼̹̙̂̉̂̑̐̇́͘ͅe̸̠̝̱̫͐̔̐̇̚ ̷̛̥̦͉̤̙̞͚͚͈̪̊̀a̴̯̖͇̦̋̿̀͑̆̀͝n̶͈͇͒̃̃̊͒ȏ̴͍͚͍̱̟͋̆̄̈́͋̾ņ̶̛͔͍̺̫͒̊͆́͗̾̈̽̋͜?̴̼̆̾̒͑͌͝

            Frick you demon

            • 1 year ago
              Anonymous

              everything fricking coil whines these days. my pny 1660 ti coil whined, so did my 2060 super from msi, my 3060 ti from asus, and now my 3070 from evga - they all fricking coil whine. so does the ps5, and nowadays apparently even motherboards coil whine too. picking a different brand isn't going to save you from it.

        • 1 year ago
          Anonymous

          Not good, not terrible. That's Asus. I got their motherboard brands for Intel until I got fed up with lanes being cut off or diverted if I populated the PCI-E or M.2.

        • 1 year ago
          Anonymous

          Their Nvidia cards are usually good; their TUF models were among the best AND the cheapest for Ampere.
          As for AMD, all I remember is their awful RX 480 model. They re-used some old design or something, and half the heatpipes didn't even touch the die. Needless to say, it was shit.

      • 1 year ago
        Anonymous

        Hey moron all the AIBs are meaningless brands with a revolving door of engineers on top of a few chinese PCB mega manufacturers
        you think there's some fricking artisan craftsmanship going on in Shenzhen?

        • 1 year ago
          Anonymous

          It's just a general question, why are you so worked up, homosexual? Again, why buy Asus? State reasons, pros, cons, etc.

          • 1 year ago
            Anonymous

            the brands are completely fricking meaningless so the answer should be "the benchmarks showed it was competently designed (like all but one or two models of card every release wave)" and/or "it was the one that was on sale"

  10. 1 year ago
    Anonymous

    So why are NVIDIA cards so expensive compared to AMD?
    Do they

    • 1 year ago
      Anonymous

      they do

    • 1 year ago
      Anonymous

      Brand

    • 1 year ago
      Anonymous

      don't worry anon, nowadays AMD cards are moronically expensive too.

    • 1 year ago
      Anonymous

      green shills

    • 1 year ago
      Anonymous

      pic related and brand, but also it has more features. Don't worry, AMD is right behind them in terms of price: the 7900 XTX is $1k and the 7900 XT is $850.

      again this bot

    • 1 year ago
      Anonymous

      AI

    • 1 year ago
      Anonymous

      >Do they

  11. 1 year ago
    Anonymous

    how are the new amd cards with encoding? is that av1 any good? how much performance loss when streaming?
    i gotta stream some sf6 lobbies for fgg soon

  12. 1 year ago
    Anonymous

    I don't care about rt but if they fix the vr performance I'll buy one of their cards immediately

  13. 1 year ago
    Anonymous

    nvidia has better software / features despite still being iffy as frick. amd drivers and software have always been a clusterfrick and outright inferior. fsr is absolute ass compared to dlss.

  14. 1 year ago
    Anonymous
    • 1 year ago
      Anonymous

      >Buy rx570 for €120 a couple years ago
      >Can't even get the same performance for that price these days
      Kinda crazy how GPU progress doesn't translate into prices any more.

      • 1 year ago
        Anonymous

        Because it doesn't. They think they can get away with pushing the buttcoin bubble using AI as the new excuse.

        • 1 year ago
          Anonymous

          Thank god all the big studios have completely fallen off a cliff when it comes to quality, or I might have been tempted to upgrade.

    • 1 year ago
      Anonymous

      Those scores for the RX 6800 through to the RX 6950 are way too fricking high, and that's coming from someone who owns an RX 6800 XT

    • 1 year ago
      Anonymous

      It's still funny to me how much better the Vega cards aged compared to the 1070, 1070 Ti, and 1080.

      • 1 year ago
        Anonymous

        Where's the 1080 Ti?

      • 1 year ago
        Anonymous

        AMD cards generally age better because they have to keep writing drivers for their older architectures, since that's what the consoles use. That said, Nvidia has gotten better about it: Kepler went to shit in like 4 years, but Maxwell from 2015 is still completely functional.

        • 1 year ago
          Anonymous

          >because they have to keep writing drivers for their older architectures because that's what the consoles have.
          Uhh...what?

        • 1 year ago
          Anonymous

          AMD cards "age better" because it takes several years for their drivers to not be dogshit and get the performance you should be getting on day 1.

          • 1 year ago
            Anonymous

            yep
            when I buy a gpu I want it to be good NOW, not 6 years after launch
            I've never had a gpu for more than 4 years and I never will
            that fine wine shit is cope

  15. 1 year ago
    Anonymous

    AMD has better software / features despite still being iffy as frick. Nvidia drivers and software lately have been a clusterfrick and outright inferior. DLSS is absolute ass and unsupported versus FSR/RSR which works for literally every fricking game Ganker objectively plays compared to DLSS.

    • 1 year ago
      Anonymous

      show me the framerate gains between dlss and fsr so we can all laugh at you

      • 1 year ago
        Anonymous

        FSR vs DLSS. You tell me.

      • 1 year ago
        Anonymous

        >dlss
        >FSR

    • 1 year ago
      Anonymous

      I don't think I've had a single driver issue since I switched to NVIDIA in 2016.

    • 1 year ago
      Anonymous

      >Nvidia drivers and software lately have been a clusterfrick
      ??????????

  16. 1 year ago
    Anonymous

    FSR vs DLSS. You tell me.

    • 1 year ago
      Anonymous

      show me the framerate gains between dlss and fsr so we can all laugh at you

      DLSS looks worse.

      • 1 year ago
        Anonymous

        You're moronic right? Just look at the car at the very bottom of the screen in FSR vs DLSS

        • 1 year ago
          Anonymous

          Stfu dumb Black person DLSS has obvious ghosting and none of the others do. Ghosting is what kills PC games especially driving and FPS.

      • 1 year ago
        Anonymous

        i think current dlss version is at 3.1.2 or something.

        • 1 year ago
          Anonymous

          >DLSS 3.1.2

          • 1 year ago
            Anonymous

            >using still pictures instead of videos
            nobody's ever going to pay for your filthy cards

            • 1 year ago
              Anonymous

              stfu fricking homosexual. DLSS is crap. I know what I see as I own two PCs one with AMD and Nvidia. Stop gobbling ghosting artifacts. If you can't see it then you're a fricking jit.

            • 1 year ago
              Anonymous

              No one is going to buy a $1000 card to use DLSS or FSR, that's fricking moronic, they both look like shit.

          • 1 year ago
            Anonymous

            i said version 3.1.2, not DLSS 3. it's confusing, i know. 3.1.2 is the current version number of the basic DLSS, without frame generation.

          • 1 year ago
            Anonymous

            it's hilarious how bad this is, but since nvidia is so big, not a single so-called "enthusiast" site will ever call them out on it

      • 1 year ago
        Anonymous

        not only are you wrong, but don't move the goalpost; the point of DLSS is to give me more frames, show me the frame gains from FSR you disingenuous Black person

        • 1 year ago
          Anonymous

          i think current dlss version is at 3.1.2 or something.

          LOL

          • 1 year ago
            Anonymous

            This moron is baiting and you guys are falling for it.

      • 1 year ago
        Anonymous

        How? It's clearly superior. It's much sharper and has far superior fine detail definition, FSR is very blurry. Look at the white text on that light blue banner. The text is super clear and easy to read with DLSS while with FSR it's mush.

      • 1 year ago
        Anonymous

        Holy frick that ghosting is out of this world. Might as well just crank the resolution down

      • 1 year ago
        Anonymous

        Look at the jaggies on FSR. Also, DLSS used to have a lot of ghosting, but from 3.4 and upwards they have been actively trying to fix that. Your screenshot is most likely a cherrypick as usual, since you're showing version 2.4

  17. 1 year ago
    Anonymous
    • 1 year ago
      Anonymous

      is 7900XTX really on par with a 4080? it's 500€ cheaper

      • 1 year ago
        Anonymous

        it's slightly better

        for production Nvidia GPUs are better
        for pure gaming AMD GPUs are better

        Nvidia charges more because of brand name, market share, and raytracing nobody uses

      • 1 year ago
        Anonymous

        Ironically in my country there are 4080's available for cheaper than the 7900XTX which kind of blows my mind. I guess I'm an amd fanboy but I'd take a cheaper 4080 any day over the XTX.

  18. 1 year ago
    Anonymous

    I've been using mostly amd gpus since the HD 5850, not had any problems.

  19. 1 year ago
    Anonymous

    my 7900xtx seems alright
    h264 decoding has issues though

  20. 1 year ago
    Anonymous
    • 1 year ago
      Anonymous

      how can ray tracing be that performance heavy? is this a hardware or an optimization issue?

      • 1 year ago
        Anonymous

        it really is that heavy when you fully path trace a game.

      • 1 year ago
        Anonymous

        RT destroys your VRAM capacity and VRAM bandwidth. Nvidia is forcing DLSS on people to cover up the fact that RT is a bloated mess, and they are also shipping GPUs with less VRAM to save themselves money without passing the savings on to the consumer.

        • 1 year ago
          Anonymous

          i doubt that's real vram usage, nogwarts runs 1440p60 with ultra textures on my 3060ti which only has 8gb.

          • 1 year ago
            Anonymous

            No it doesn't

            • 1 year ago
              Anonymous

              k

          • 1 year ago
            Anonymous

            It's pretty close to actual usage, at 4K with RT enabled the 3080 (10GB) performs worse than a 3060 (12GB) and gets demolished by the 6800 XT (16GB).
            And by "demolished" I mean like 30 FPS vs. 6 FPS, so it's arguably unplayable on both cards.

            I just bought an XFX 6700xt, did I frick up?

            AMD is probably going to launch its midrange RDNA3 cards in June, according to believable rumors. Did you frick up? Maybe, it depends on the price.

            • 1 year ago
              Anonymous

              >Maybe, it depends on the price
              I spent $370 for what it's worth

              • 1 year ago
                Anonymous

                I meant how much their next-gen cards will cost. Like, let's take a hypothetical 7700 XT for example:
                >30% faster than the 6700 XT
                >16GB VRAM
                >AV1 encoding
                >Significantly better raytracing
                If they sell this for $399, I'd say you fricked up. But it'll be $499 or more in my opinion, so I doubt you'll regret your purchase.

                >at 4K with RT enabled the 3080 (10GB) performs worse than a 3060 (12GB)
                where the hell do you get that stuff anon? do you just make it up?

                Looking at charts like this makes me so glad I'm not moronic

                That benchmark is wrong: come on, in what world does the 7900 XTX perform worse than a 3060? The reviewer even admitted he fricked up on the TechPowerUp forums, but they never bothered to retest and update the graphs. Pic related is the correct benchmark. My memory was a bit off: it was 14 FPS for the 6800 XT and 4 for the 3080; still, completely unplayable on both. But as you can see, the 3060 performs better because it has a bit more VRAM.
                Basically, the only card that can run this game at these absurd settings is the 4090.

                >muh VRAM
                I bet you are one of the morons that fell for the RX 480 8GB meme.

                Actually, I bought a 580 4GB for €130 at the end of 2018: it was supposed to be a temporary card after my old 680 died. I was planning on getting a 3080 or 6800 XT plus a 1440p monitor upgrade, but then mining happened, so I'm still using that same 580. In hindsight, I should've bought the 8GB version lmao.
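
                For a sense of scale on those VRAM numbers: plain render targets are small, and it's textures, geometry, and the BVH acceleration structures ray tracing needs that actually push past 10GB. A minimal sketch of raw 4K buffer sizes (the bytes-per-pixel format and buffer counts are illustrative assumptions):

```python
# Raw size of 4K screen buffers, showing they are NOT what fills VRAM.
# 8 bytes/pixel models an RGBA16F HDR target; counts are illustrative.
W, H = 3840, 2160

def buffer_mib(bytes_per_pixel: int, count: int = 1) -> float:
    return W * H * bytes_per_pixel * count / 2**20

print(f"one HDR target: {buffer_mib(8):.0f} MiB")        # 63 MiB
print(f"5-target G-buffer: {buffer_mib(8, 5):.0f} MiB")  # 316 MiB
```

                Even a fat G-buffer is ~0.3GB; the multi-gigabyte gap between a 10GB 3080 and a 16GB 6800 XT gets eaten by textures and RT acceleration structures instead.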

              • 1 year ago
                Anonymous

                >using raytracing
                it was a meme on my 3090 and continues to be a meme
                imagine buying a 4090 and being fine with 48fps lows and a 61fps average

              • 1 year ago
                Anonymous

                show the gpu temps.

            • 1 year ago
              Anonymous

              Those midrange cards better have more than 8GB of VRAM. Then I might consider them. The 4060 will only have 8GB, so if the 7600 has 12 or more I'm gonna consider it very much. 12 gigs should be the minimum.

            • 1 year ago
              Anonymous

              >at 4K with RT enabled the 3080 (10GB) performs worse than a 3060 (12GB)
              where the hell do you get that stuff anon? do you just make it up?

              • 1 year ago
                Anonymous

                >1.1 fps
                Let me guess, you need more?

              • 1 year ago
                Anonymous

                I want less

              • 1 year ago
                Anonymous

                Looking at charts like this makes me so glad I'm not moronic

    • 1 year ago
      Anonymous

      >make a game run like shit with a press of a button, only for $1999.99!

  21. 1 year ago
    Anonymous

    Not really.

  22. 1 year ago
    Anonymous

    Yes goy just buy Nvidia

  23. 1 year ago
    Anonymous

    gtx 1060 6gb / ryzen 5 1600 here, what should i upgrade if i want a good 1080p60fps gaming experience?

    • 1 year ago
      Anonymous

      ryzen 5 5600 + radeon 6700xt
      should be around 500 bux in total

      • 1 year ago
        Anonymous

        danke herr anon

    • 1 year ago
      Anonymous

      RTX3060 ti

      ryzen 5 5600 + radeon 6700xt
      should be around 500 bux in total

      I have that card, it's made for 2k.

      • 1 year ago
        Anonymous

          so? it's not like it stops working if you connect it to a 1080p display.

        • 1 year ago
          Anonymous

          Might as well just buy a 4090

  24. 1 year ago
    Anonymous

    I had to flash the bios on my GPU to get it to work. What do you think.

  25. 1 year ago
    Anonymous

    the drivers are shit and that's a fact.

    • 1 year ago
      Anonymous

      if they were, then why are none of the major tech guys like Gamers Nexus saying it today in their AMD card reviews? They go through like 30 mins of video talking about the GPU inside and out, and not once do they mention drivers.

  26. 1 year ago
    Anonymous

    Considering a 6800 XT to upgrade from my 1080 Ti; I'd get a 3080 Ti but those are like $1,200. I might go team red after 10 years of green.
    Anyone know if I can do stable diffusion on Windows with a 6800 XT? SD is about the only thing I'd miss from Nvidia tbh.

    • 1 year ago
      Anonymous

      6000 series prices are coming down fast. keep waiting a while longer and you'll get a 6950 XT for 450

    • 1 year ago
      Anonymous

      >Can I do stable diffusion on windows with a 6800XT
      You can, but it's a pain in the ass and it's slow. If SD is a must, then just stick with nvidia.

  27. 1 year ago
    Anonymous

    Idk, I just got a 6500 and got an excellent score for Blue Protocol, works fine for me. NVIDIA however, is absolute shit

  28. 1 year ago
    Anonymous

    I'm upgrading to a GTX 970

  29. 1 year ago
    Anonymous

    >New architecture AMD (5000/7000 Series)
    horrible
    don't bother
    you are the beta tester
    >second generation / refresh of new architecture (6000 Series)
    AMD WON

  30. 1 year ago
    Anonymous

    no, only shills and mentally ill people will say otherwise.
    Nvidia if you are actually doing lots of rendering besides gaming, but since the majority of people just play games and watch YouTube, AMD is better for price/performance

  31. 1 year ago
    Anonymous

    im using an AMD FX-9370, but this week i got a new 5600G with some new DDR4; i hope it will help me with my emulation. i can play most games without any problem but sometimes i get some lag.
    i hope this is good

    • 1 year ago
      Anonymous

      FX series were a total joke, even budget pentiums were better for emulation (G2358) when they were around.

      • 1 year ago
        Anonymous

        FX had the same fate as Xeons: bad single-core, good multi-core performance.
        now that games use multi-core by default, both an FX and a Xeon are better gaming CPUs than the mainlines

    • 1 year ago
      Anonymous

      >i got a new 5600G with some new DDR4, i hope it will help me with my emulation
      You'll be fine on that budget build. Just don't expect anything higher than 60fps at 1080p even during emulation.

      • 1 year ago
        Anonymous

        why would someone ever need more than that for anything ever

        • 1 year ago
          Anonymous

          It's 2023. In 2013 Ganker was hoping for 4K 144FPS NATIVE as the standard, but here we are, still stuck with 1080p as the default native resolution, because Ganker missed one thing: how shitty developers became at optimization.

  32. 1 year ago
    Anonymous

    i have a 5700xt bros, and I must c-consoom. Should I upgrade or wait a generation or two?

    • 1 year ago
      Anonymous

      Rx 5700 here, will upgrade maybe at the end of the year if I find a good deal, or next year. Using an LG C2 42 and I play most games in 1440p ultra, or 4k for smaller budget games or older games.

      But to be 100% honest with you, I feel like I just want to upgrade for the sake of it rather than it being something I really need. It's the same in your case: a 5700 XT will still do the job for 90% of games, only a handful will make it kneel.

      Anyway, I bought my gpu at the end of 2019, so 4 years of use seems fair for an upgrade.

  33. 1 year ago
    Anonymous

    No.

  34. 1 year ago
    Anonymous

    >people complaining about games not working right or having graphic problems
    >90% of the time they're using AMD
    Shit on Nvidia all you want, at least their shit actually works.

  35. 1 year ago
    Anonymous

    the driver issue hasn't been a thing since opengl fricking died. nvidia customized their drivers while amd followed the opengl spec to the letter.

    • 1 year ago
      Anonymous

      >since opengl fricking died
      AMD caught up with opengl on windows in last year's summer update.

      • 1 year ago
        Anonymous

        oh so even more of a non-issue for minecraft, the last relevant opengl game

  36. 1 year ago
    Anonymous

    Don't worry AMD bros. AMD is paying Linus millions of dollars and all of a sudden he became pro AMD just like that. And just like that in a few years people will stop shitting on AMD GPU's.

    • 1 year ago
      Anonymous

      Youtube was a mistake. Those goddamn thumbnails are nauseating. It's a shame, some of their videos are fun like them building a pc out of parts all bought on wish and aliexpress

      • 1 year ago
        Anonymous

        I mean Linus himself said that having those stupid face thumbnails increases views by as much as 20%. That's why everyone does it.

        • 1 year ago
          Anonymous

          I know it works, and the clickbait titles do too, but I still fricking hate it. I stopped watching his videos because of it

    • 1 year ago
      Anonymous

      >shilling is never okay
      >except when it's for Nvidia

  37. 1 year ago
    Anonymous

    Damn it feels good to be an AMD chad. No stutter. No driver issues. Pure gaming bliss better than consoles.

    • 1 year ago
      Anonymous

      >Driver casually corrupts your OS

      • 1 year ago
        Anonymous

        Are you running windows by any chance?

        • 1 year ago
          Anonymous

          Linuxgay detected
          >Hurdygurdy I love programming everything I do hurdygurdy i love it when nothing works out of the box
          >hurdygurdy i love the fact that only 10% out of 50000 games on steam work on my os!
          >hurdygurdy no softwares work for me but thats great
          >hurdygurdy amd is poorly supported by anything linux but i love it

          • 1 year ago
            Anonymous

            why are you so mad about someone not using windows? is moonlighting on your shilling career for nvidia allowed?

            • 1 year ago
              Anonymous

              id use linux if it was literally plug and play like windows is, you didnt have to frick with shit to get things running, and the gui wasnt so archaic

              i'd switch if it:
              1. supported all games with no downsides
              2. supported all software with no fuss or downsides
              3. actually worked and wasnt just fork #823749873 of something else

              • 1 year ago
                Anonymous
              • 1 year ago
                Anonymous

                works on mine too, but windows works better

              • 1 year ago
                Anonymous

                even my clean install of Windows 10 that was meant for a singular game likes to ~~*rescan*~~ the drive when I rarely boot into it
                not that I ever use it anyway
                i dislike Windows and will never go back, especially when considering amdgpu/mesa on Linux is ahead of the drivers on Windows for the 7900xtx

              • 1 year ago
                Anonymous

                works on my machine
                sounds like user error

              • 1 year ago
                Anonymous

                clean install so its the product that is sold to users
                enjoy TikTok

              • 1 year ago
                Anonymous

                clean install works perfectly fine on my machines
                try taking your hardware to the geek squad if you can't figure it out on your own

              • 1 year ago
                Anonymous

                idk why you feel a need to defend a company in your free time when their product clearly has issues
                and that's not even getting started on the ones involving privacy 🙂

              • 1 year ago
                Anonymous

                I dunno what to tell ya man, some people just can't handle easy shit like hardware. People like you are why prebuilts are a thing.

              • 1 year ago
                Anonymous

                the hardware works fine on Linux
                strange

              • 1 year ago
                Anonymous

                Why is it that """linux professionals""" always seem to have so many issues running the most user friendly OS on the market? How are they so inept?

              • 1 year ago
                Anonymous

                anon you're at it again trying to needlessly protect Windows/Microsoft for something that isn't my problem
                the hardware is fine and works elsewhere
                I'm getting second-hand embarrassment from this encounter with (you)

              • 1 year ago
                Anonymous

                >it's not my problem
                Is that not why you're seething about the easiest OS on the market in this thread?

              • 1 year ago
                Anonymous

                but I don't use Windows and simply said the drivers for my gpu are better on Linux
                you're the one seething over this fact
                you can't stand the fact that someone isn't using Windows for some reason
                its pretty strange

              • 1 year ago
                Anonymous

                I'm just glad there's an OS out there for those who are too incompetent to use Windows.

              • 1 year ago
                Anonymous

                man you really don't know when to stop do you, anon

              • 1 year ago
                Anonymous

                >too incompetent to use Windows.

                Lol anon I....

              • 1 year ago
                Anonymous

                I didn't believe they existed until I met people from Ganker

              • 1 year ago
                Anonymous

                >too incompetent to use Windows.
                Anon pls, Windows is baby duck zone.

              • 1 year ago
                Anonymous

                Exactly, which is why it's insane that we share a board with people too inept to use it.

              • 1 year ago
                Anonymous

                I've used operating systems across all brands, and when something goes wrong it usually comes down to user moronation, be it Windows, Linux, Solaris, BSD, or OSX. RTFM or read the documentation, you slobs, is it so hard? And don't use hardware that isn't supported, or is badly supported.

              • 1 year ago
                Anonymous

                Try using it before talking shit. Unless you need specific software for work like adobe shit, linux is 100% usable. Popular desktop environments like KDE are made to be similar to windows. Enough that a windows user will be able to navigate the os with little issue. With Steam's proton, most games will work flawlessly sans anything that uses anti cheat

              • 1 year ago
                Anonymous

                do you need two different os for it to work like windows
                no wonder linux will never grow or become the standard

              • 1 year ago
                Anonymous

                here's a question for ya linux bro. how is linux at things like switching monitors/sound devices on the fly?
                for my setup i switch between dual monitors OR a tv. (work vs gaming, basically)
                when using monitors audio is via optical (2.0)
                when using the TV over HDMI audio is 5.1

                this is partly due to how windows handles disconnected devices, there's no way to get windows to send an audio signal over HDMI without having the display connected (having a ghost window or some other hacky kludgy solution is just not worth the hassle. nor is dealing with hacked realtek drivers to get shitty dolby live/dts to work over optical)

                right now i'm using displayfusion to handle this, one keyboard combo sets the audio and display devices as needed. i'd like to switch to linux and just be done with MS's c**t behavior, but it's the little things like this that always seem to be an unbelievably huge pain in the dick w/ linux.

              • 1 year ago
                Anonymous

                Couldn't tell you, I'm still fairly new to linux. Been using it for a year. I use headphones and cheap speakers. I can easily switch between the two similar to windows on the task bar. Monitors are more complicated because you're either using xorg, which doesn't like multiple displays, or wayland, which has better support but doesn't play well with nvidia gpus.

                I'd suggest researching recommended beginner distros like mint and fricking around with a liveboot so you can test the setup without needing to install the os

              • 1 year ago
                Anonymous

                not him but ask in the linux threads or steam deck when they show up
                most users of linux gaming have less than a year of usage thanks to the renewed interest from Valve

              • 1 year ago
                Anonymous

                yeah, good call.
                it's mostly hypothetical, i'm fine with W10 for now. the gaming thing is the only thing keeping me tethered to windows (oh and i guess now with work, visual studio. god i miss apache and mysql.). but i know they're going to push me to linux eventually due to even more anticonsumer bullshit.

                First time hearing about this, that's actually fricked

                yeah, if you use gamestream for LAN streaming, you can avoid it for the time being by blocking the webhelper executable via windows firewall, as well as some hosts file entries. but eventually they'll sneak one past the goalie and frick you (probably a video card driver installation will remove geforce experience and then there you go.)

                i really, really, REALLY dislike software vendors doing this shit, if it works, and costs them nothing to keep it, why frick your users like this? hell, at least open source it so other people can carry the torch. it's evil and anti consumer

                never buying another nvidia product again.

              • 1 year ago
                Anonymous

                Gaming becoming viable for linux over the past year or so is what made me move over. I still dual boot since having two operating systems separating work vs. play has been a godsend for productivity and focus.

              • 1 year ago
                Anonymous

                yeah, last time i played around with linux (steam, proton) back in 2017 or so, it was like 95% of the way there. Linux on the desktop seems to perpetually be at the horizon of 'good enough!'

                maybe one day it'll be good for non standard (surround/multi monitor/hdr/controllers etc) setups. for single monitor, stereo sound, mouse/kb i'm sure it's great as is.

          • 1 year ago
            Anonymous

            >Hurdygurdy
            Swede detected.

      • 1 year ago
        Anonymous

        that wasnt a driver issue, that was a windows issue. windows update would install its own drivers while you were installing yours. obviously this is a bad idea, but microsoft pajeets allowed it.

        • 1 year ago
          Anonymous

          >It was microsofts fault!!

          • 1 year ago
            Anonymous

            >only happens with ayyyymd
            >blames microshit
            Listen buddy, you're not getting paid. It's too obvious.

            the driver literally tells you that windows is about to brick your pc and that you should stop being such a frick up with your own pc. its not the driver or amd's fault, its windows and the user's fault for being dumb enough not to set the group policy.

            • 1 year ago
              Anonymous

              nice bait

              • 1 year ago
                Anonymous

                nice bait

            • 1 year ago
              Anonymous

              never had this problem with nvidia.

            • 1 year ago
              Anonymous

              Never happened with Nvidia. Maybe the pajeets at AMD can have a talk with the windows pajeets and sort it out.

              • 1 year ago
                Anonymous

                never had this problem with nvidia.

                nice bait

                *windows automatically removes your driver*
                sorry chud guess you arent playing games today

              • 1 year ago
                Anonymous

                That doesnt remove anything though. Try again baitbro

              • 1 year ago
                Anonymous

                i did a clean install of w11 when it first came out and never had any driver issues for anything in my pc

              • 1 year ago
                Anonymous

                forget that, shit like webm related is funnier

              • 1 year ago
                Anonymous

                Windows was bad before, but dear god they've shit the bed ever since they went to the always-updating model. Every month something is broken, and it's forced on everyone. Even when you reinstall your PC, the ISO you grab is automatically updated before you install it.

              • 1 year ago
                Anonymous

                Something is wrong with your pc then, because i've had zero issues with windows for years, nothing ever breaks or crashes

              • 1 year ago
                Anonymous

                google MPO Windows
                dont tell me there are zero issues

              • 1 year ago
                Anonymous

                If you were foolish enough to use 3rd-party update-disabling software instead of rummaging in the group policy editor, you're guaranteed to run into problems sometime down the line, while also being susceptible to bad updates like Defender jailing your Chrome browser back in September 2022.
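                for reference, the group policy route boils down to a couple of registry values under the Windows Update policy key (the same ones gpedit sets). this is a sketch from memory for an elevated command prompt; the exact behavior of AUOptions varies a bit between Windows 10 builds, so double-check before relying on it:

                ```shell
                :: Option A: disable automatic updates entirely
                reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoUpdate /t REG_DWORD /d 1 /f

                :: Option B: keep updates but get notified before anything downloads (AUOptions=2)
                :: reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoUpdate /t REG_DWORD /d 0 /f
                :: reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v AUOptions /t REG_DWORD /d 2 /f
                ```

                either way you're changing supported policy values instead of killing services, so a cumulative update won't silently undo it the way it does with the 3rd-party blocker tools.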

        • 1 year ago
          Anonymous

          >only happens with ayyyymd
          >blames microshit
          Listen buddy, you're not getting paid. It's too obvious.

        • 1 year ago
          Anonymous

          >windows update would install its own drivers while you were installing yours.
          Happens with AMD and Nvidia, I call it the update driver loop.

  38. 1 year ago
    Anonymous

    I just got a 6800 a month ago and to this day I still don't know how to check the fans. They just don't spin, and the temp only sits around 55-60C when playing vidya. My friend said it's normal for them not to spin at that temp, so I haven't done anything about it

    • 1 year ago
      Anonymous

      If you're on windows, I believe the driver software lets you set a fan curve. Otherwise, you can use something like MSI Afterburner to set custom fan curves

    • 1 year ago
      Anonymous

      right click desktop, go to the amd panel, click the performance tab, and your temps and fan controls should be there. by default the fans dont spin on amd cards until they hit a certain temp, so its normal to boot up your pc and see the fans not spinning.

    • 1 year ago
      Anonymous

      That's the default setting for AMD on all operating systems. You can edit a fan curve on windows IIRC but I don't think it's necessary.

  39. 1 year ago
    Anonymous

    Remember when PC Gaming was simple?

    • 1 year ago
      Anonymous

      Nice try Satan, but you can't trick me

      • 1 year ago
        Anonymous

        y̷̧͈͓̖̦̪͖̰͔͈̓͗̍̾̓̑͆̋̑̚̚ͅŏ̴̝̬̤̹̀̈̉̃̋̀̊̊̾̃ư̵̧̝̗͓̦̝͍̑̀̌̔̇̂̍̓̈́͊ͅ ̸̜̜̫͆̈́̀d̶̢͖̹̉͆̕ợ̵̧̞͂͑̾̔̑͒̏̎̾͋͒̄̂ņ̶̃̎͌t̸̨̧̛̟̭̮̠͛̊̌̃̋̔̾̍̕͝͠ͅ ̷̨̼̙̭̝̟̯̱͛̐͋̽͋̐̚͠l̴̡̡͚̙̮͚̺̉͋̓͂͊̽̚̕̚͠͝ȋ̶̛̜̗̼̯̓̈̊̑͆̾̀́̊̓̓̓͜k̶̝̹̒̌̃͐̾̽̇̎͠͝͠ͅĕ̸̞̤͎̣̓̈ ̷̨̩̙̩͉͚̘͌̂̽̋͗̔͋͐͗͛͛̀͝c̵̢̮͉̫̗̖̻̭̖̠̮͐̈́̆̓̈́̊̉͋͆͠ǫ̵̤̦̰͓̻͍̟͇̰̒͑̔́̄̄͑́̒̆͠i̸̛̙̫̯̜͎͇͖͐͐̒̈́̋̍̀̇̾́́̍ͅĺ̶̪̜̻̝͓̐͜ ̵͍̮͎̱̬̩̮́̓̋̓͋̂̌́͒̚̚͝ẁ̸͈̙̹̗̜͎̮̪̤̬̤̺̼̌̔͗̃͊͌̓̓̿̚͜ͅh̸̩͇̹̙̺̘̰̹͌̒̂́́͊̀̾͂̍̾̓͆ͅi̷̟̟͙̘͉̬͕̻̜̙͈̭̗̹̖͑n̸̨͎̤͖̞̣͙̦̼̹̙̂̉̂̑̐̇́͘ͅe̸̠̝̱̫͐̔̐̇̚ ̷̛̥̦͉̤̙̞͚͚͈̪̊̀a̴̯̖͇̦̋̿̀͑̆̀͝n̶͈͇͒̃̃̊͒ȏ̴͍͚͍̱̟͋̆̄̈́͋̾ņ̶̛͔͍̺̫͒̊͆́͗̾̈̽̋͜?̴̼̆̾̒͑͌͝

  40. 1 year ago
    Anonymous

    they're fine unless you spend all day looking at graphs
    and I suppose with certain rendering applications and stuff nvidia tends to be better

    I'd say that their software suite is a bit more fleshed out compared to nvidia's too
    like I used to have AMD and I always wondered why people said that you needed msi afterburner
    then I got an nvidia and I wondered why the driver software was so barebones
    "oh right that's why people use msi afterburner"

  41. 1 year ago
    Anonymous

    Nvidia removing gamestream in favor of cloud/monthly payment is infuriating
    >the feature works so well that it's cannibalizing our paid option (local LAN streaming vs streaming over the internet)
    >i know, we'll deprecate it and remove it via an update
    >but won't they complain?
    >no, we'll just tell them it's to "improve their experience"
    >oh right, works every time.

    yeah, frick those c**ts. jenson if you can read this, i hope someone beats you to death with a hammer.
    Sunshine seems to work well enough so far for hosting; I was about to snag a 4080, but yeah frick them. 7900 XTX it is.

    • 1 year ago
      Anonymous

      First time hearing about this, that's actually fricked

    • 1 year ago
      Anonymous

      They removed it from their Shield shit, they didn't remove the feature from the graphics cards. You can still use Moonlight on Android anyway, I assume that works on Shield devices but I don't have one of those.

      • 1 year ago
        Anonymous

        anon, they are removing it from geforce experience, period -- meaning moonlight (the client) wouldn't be able to connect to the host.

        Sunshine so far seems to be a good enough replacement, slightly more cumbersome to set up, but not a deal breaker (and luckily it works with AMD cards)

        • 1 year ago
          Anonymous

          > are removing
          ?
          Wasn't it removed since last month?

          • 1 year ago
            Anonymous

            i have no idea if they went through with it on the host side since i put geforce experience in a box and don't let it reach out to the mothership for an update to be gimped by the c**ts at nvidia. I have gamestream disabled ATM so that i can frick around with Sunshine.

            so there's no confusion, i don't own a shield. i use moonlight on a laptop in my living room to stream from my gaming PC. The host uses geforce experience's gamestream, moonlight utilizes this stream (instead of a nvidia shield, it's a simple laptop). Nvidia is removing gamestream from geforce experience, nerfing in-home/ LAN streaming (ostensibly to prop up their cloud gaming homosexualry)

            • 1 year ago
              Anonymous

              >i have no idea if they went through with it on the host side
              They didn't remove anything

              Anyway, how good is Sunshine? Does it actually do all-hardware capture + encode like GFE does or is it another shitty solution with software capture? Software capture solutions tend to suck at high res. Steam suffers from this for instance.

  42. 1 year ago
    Anonymous

    I went
    >HD 5770
    >GT 520MX
    >GTX 960M
    >GTX 1060 6GB
    >RX 5600XT
    >RX 6700
    And I've had a consistently better experience with the AMD cards. I also like Adrenalin quite a lot. But then again that's only anecdotal evidence that doesn't amount to much, and I'm moderately convinced that my three Nvidia cards had been cursed by an african warlord or some shit

    • 1 year ago
      Anonymous

      For me it was
      >HD 5450
      >HD 5770
      >R7 360
      >GTX 1070
      >6700xt
      Only the 6700xt gave me problems but it's under warranty and currently being rma'd. The 6700xt was a beast when it was still working

      • 1 year ago
        Anonymous

        I went
        >HD 5770
        >GT 520MX
        >GTX 960M
        >GTX 1060 6GB
        >RX 5600XT
        >RX 6700
        And I've had a consistently better experience with the AMD cards. I also like Adrenalin quite a lot. But then again that's only anecdotal evidence that doesn't amount to much, and I'm moderately convinced that my three Nvidia cards had been cursed by an african warlord or some shit

        5770bros
        That was the first GPU I bought with my own money, before then all I had was the 8600 GT my dad got me in high school and an old shitty FX 5000 series GPU.

    • 1 year ago
      Anonymous

      >first GPU was hd5770
      zoomzoom

      • 1 year ago
        Anonymous

        First GPU in my own PC in 2010, yeah. Before that, it was whatever was in my dad or my brother's PCs.

        For me it was
        >HD 5450
        >HD 5770
        >R7 360
        >GTX 1070
        >6700xt
        Only the 6700xt gave me problems but it's under warranty and currently being rma'd. The 6700xt was a beast when it was still working

        My 6700 non XT's been working like an absolute charm, pretty much curbstomping everything at 1920x1200. It doesn't UV as well as my 5600XT did, but -50mv +150mhz core +100mhz vram is still nice to have.

    • 1 year ago
      Anonymous

      For me it was
      >Radeon 9250 (parents old PC)
      >HD 4670
      >GTX 650
      >R7 260X
      >GTX 1060 3GB
      >RX 6700XT

  43. 1 year ago
    Anonymous

    they are worse

  44. 1 year ago
    Anonymous

    They are bad for animation work, and by bad i mean they take a few seconds longer than nvidia gpus... barely any difference

  45. 1 year ago
    Anonymous

    I once bought an AMD Gpu and the next day found out trannies exist.

  46. 1 year ago
    Anonymous

    AMD is for europoors and other third-worlders.
    If you are in USA, are over 18 years old and you don't buy Nvidia you pretty much failed at life.

  47. 1 year ago
    Anonymous

    The problem with AMD is that they have no drivers. Generally a GPU needs good drivers to run a video game.

  48. 1 year ago
    Anonymous

    Rate my AMD PC. I sold my Nvidia card because it was crashing too much and the fans were loud asf.

    • 1 year ago
      Anonymous

      What am I supposed to rate? Looks like any other generic PC build with rgb lights.

      • 1 year ago
        Anonymous

        Thanks, I hate it

        • 1 year ago
          Anonymous

          Get a solid case or solid panel accessory from fractal directly so you never have to look at it. You're supposed to use your PC, not take pictures of it.

    • 1 year ago
      Anonymous

      Why are the current XFX GPUs so sexy?
      Will be replacing the old ass CPU/mobo soon.

      • 1 year ago
        Anonymous

        Nice dick.

  49. 1 year ago
    Anonymous

    Gpus are fine
    The drivers suck ass

  50. 1 year ago
    Anonymous

    If you plan to use neither AI-related applications, video editing, nor raytracing, the 7900 XTX has the best value. If you plan to use any of those applications, even the RTX 4070 Ti, for 300 bucks less, offers similar performance.
    So in short: if you plan to only game without raytracing, the 7900 XTX is your best choice. Otherwise go RTX, as either the performance is better than AMD's, or nvidia is way cheaper.

  51. 1 year ago
    Anonymous

    AMD is trash plain and simple
    if you don't have money problems and are actually serious about your applications/games, nvidia is the only option

  52. 1 year ago
    Anonymous

    i'm using a GTX 1080 from like 2018, if i upgraded to a 7900 XTX, what would I be in for? (i dream of being able to play newer games at 4k without dropping quality to potato.)

    i don't see the point in buying a mid-range card just to be in a position to buy yet another video card in a couple of years.. i'd rather spend more up front but be set for quite a number of years (ie. the GTX 1080)

  53. 1 year ago
    Anonymous

    AMD's features have been way better than Nvidia's since I upgraded my computer. HDR works way better, same with the DLSS equivalent and more. I checked reviews too and they say the same thing. Believe the people who actually benchmarked the stuff with data, and the guys who own the hardware, over random anonymous Nvidia fanboys.

    • 1 year ago
      Anonymous

      >Smart access memory
      >is literally just rebar

  54. 1 year ago
    Anonymous

    How are the open source drivers on Linux? My future build in a few years will be full AMD so i am curious.

    • 1 year ago
      Anonymous

      It just works out of the box on a fresh install. If you've been using nvidia you might need to reinstall mesa.
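      if you want to sanity check that the open source stack is actually in use after installing, a few read-only commands will tell you. a sketch, assuming mesa-utils and vulkan-tools are installed (package names differ slightly per distro, and --summary needs a reasonably recent vulkan-tools):

      ```shell
      # kernel driver bound to the card -- "Kernel driver in use" should say amdgpu
      lspci -k | grep -EA3 'VGA|3D'

      # OpenGL renderer string -- Mesa's radeonsi driver shows up here
      glxinfo | grep "OpenGL renderer"

      # Vulkan side -- RADV is Mesa's open source Vulkan driver for AMD cards
      vulkaninfo --summary | grep -i driverName
      ```

      if any of those report something other than amdgpu/radeonsi/RADV, you've still got leftover proprietary driver bits installed.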

  55. 1 year ago
    Anonymous

    Never bought a video card before
    Is it ok to buy a used one or is it better to get it completely new?
    How long does one last

    • 1 year ago
      Anonymous

      If you had asked before the mining craze it'd be alright. Afterwards however, it's a pretty hard gamble.

    • 1 year ago
      Anonymous

      >Is it ok to buy a used one
      It's usually fine, but you have no idea what that card was used for, and how. This story made headlines a few months ago: https://videocardz.com/newz/radeon-gpu-cracking-not-caused-by-drivers-storing-conditions-and-cryptomining-to-blame
      Imo, unless you're getting a significantly better deal, don't risk it and buy new.

      >How long does one last
      Dunno, forever? Video cards can die, but it's pretty rare. They usually outlive their actual usefulness. All I can say on this topic is, high-end cards tend to break much more frequently than low-end ones.
      High-end cards have a much bigger GPU die = It expands (and then contracts) more with heat = Much more likely to break the solder underneath.
      High-end cards also have more memory dies = If one of them breaks, RIP card.

  56. 1 year ago
    Anonymous

    Much more lacking in drivers, which is a pain in the ass. It is still worth considering in the end though. Nvidia and scalpers get autistically israeli over even GTX shit over 5 years old.

  57. 1 year ago
    Anonymous

    Nah, usually the issue ends up being some proprietary bullshit
    The actual cards perform fine.

  58. 1 year ago
    Anonymous

    No, they're not that bad in absolute terms, but I don't think they're good enough to be very attractive given their pricing. When a 7900XTX is $1000 I don't think saving $200 over a 4080 is that attractive when you miss out on better RT performance and features. Who wants to spend $1000 on a compromised product? If it were cheaper and had amazing bang/buck it would be a different matter, but it's not cheap.

  59. 1 year ago
    Anonymous

    Will never touch another AMD card after the 7900 XT. Constant issues, from software to a wattage bug with multiple monitors
    Decided to just go with a msrp tuf 4070 ti

    • 1 year ago
      Anonymous

      >buying the beta test and not the refresh or the final end-of-line product
      AMD sucks precisely because of this.
      rx580 > rx480
      r9 390 > r9 290
      6700xt > 5700xt
      5800x3d/5950x > first generation Ryzen

  60. 1 year ago
    Anonymous

    The GPUs are decent for the price. Still worse than Nvidia, but you're getting what you pay for. The CPUs have been and still are shit for most games. They're great for work though. Overall, AMD fulfills the role of "competition" in this incredibly inbred industry. Things could be a lot better, but no one has to work very hard when there are only two options.

    • 1 year ago
      Anonymous

      you're completely clueless.

  61. 1 year ago
    Anonymous

    Ganker should I get the 3060ti or the 6700xt?

    • 1 year ago
      Anonymous

      6700 xt

      t. 3060 ti owner

      • 1 year ago
        Anonymous

        What makes the 3060ti bad for you?

        • 1 year ago
          Anonymous

          8GB of VRAM instead of 12. Sure it has marginally higher clock speeds, but the extra VRAM wins if you compare the benefits.

      • 1 year ago
        Anonymous

        I've got a 3060ti myself, my 1st Nvidia card. Gotta say DLSS is nice

        I'd get a 6700 XT because it has more VRAM, but with the 3060Ti you get DLSS, way better performance in productivity and AI tasks, better video encoder for streaming.
        Performance in videogames is more or less the same.

        I went with a 6700xt Sapphire. Love it and no issues but that doesn't mean anything nowadays. I'll prolly swap to Nvidia next build.

        6700xt, sapphire
        You're going to end up using some version of FSR someday because DLSS will be segmented per new generation of GPU, newer versions of DLSS will very likely not work on older generations of cards as the years go by
        See: 1080ti users playing around on the latest FSR

        These mixed results are exactly why I really have no idea anymore with GPUs.

        • 1 year ago
          Anonymous

          Right this moment, it's pretty easy: AyyMD is better for low-end budgeting, Nviisraelite is better for high-end richgayging.
          Outside of meme features like DLSS or AI shit, a 6700XT will beat the similarly priced RTX 3060 any day.
          But if you're gonna be spending buckets of cash on a GPU, then AMD doesn't really have anything to go up against the RTX 40 series. Not right now, anyway.

    • 1 year ago
      Anonymous

      I've got a 3060ti myself, my 1st Nvidia card. Gotta say DLSS is nice

    • 1 year ago
      Anonymous

      I'd get a 6700 XT because it has more VRAM, but with the 3060Ti you get DLSS, way better performance in productivity and AI tasks, better video encoder for streaming.
      Performance in videogames is more or less the same.

      • 1 year ago
        Anonymous

        I really wish I could get the vram of the 6700xt and all the perks of the 3060ti for somewhere in the middle in terms of price, why is this so hard to ask for?

    • 1 year ago
      Anonymous

      I went with a 6700xt Sapphire. Love it and no issues but that doesn't mean anything nowadays. I'll prolly swap to Nvidia next build.

    • 1 year ago
      Anonymous

      6700xt, sapphire
      You're going to end up using some version of FSR someday because DLSS will be segmented per new generation of GPU, newer versions of DLSS will very likely not work on older generations of cards as the years go by
      See: 1080ti users playing around on the latest FSR

      • 1 year ago
        Anonymous

        The DLSS upscalers and all their updates work on all generations of cards which have ever supported DLSS to begin with. They don't work on Pascal because Pascal never had the hardware which DLSS uses to begin with. The only thing that doesn't work is frame generation, but that's something different entirely and they just crammed it under "DLSS" for the naming.

        • 1 year ago
          Anonymous

          DLSS is basically Nvidia trying to find some way to use the enterprise hardware they leave on gamer chips despite being basically useless for games. They could easily code an FSR style upscaler that would run on raster hardware, they just don't because getting something they can sell out of byproduct is the point
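          For what it's worth, an FSR1-style spatial upscaler really is just ordinary shader arithmetic. A toy Python sketch (nothing to do with AMD's actual EASU/RCAS shaders, purely illustrative) of a bilinear upscale plus a simple sharpen pass shows the kind of math involved, none of which needs tensor hardware:

          ```python
          # Toy FSR1-style spatial upscale on a grayscale "image" (list of lists).
          # This resembles nothing in AMD's real shader code; it just shows that
          # spatial upscaling is plain arithmetic any GPU's raster/compute units
          # can run -- no tensor hardware required.

          def bilinear_upscale(img, factor):
              h, w = len(img), len(img[0])
              H, W = h * factor, w * factor
              out = []
              for y in range(H):
                  sy = min(y / factor, h - 1)          # source y coordinate
                  y0, y1 = int(sy), min(int(sy) + 1, h - 1)
                  fy = sy - y0
                  row = []
                  for x in range(W):
                      sx = min(x / factor, w - 1)      # source x coordinate
                      x0, x1 = int(sx), min(int(sx) + 1, w - 1)
                      fx = sx - x0
                      top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
                      bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
                      row.append(top * (1 - fy) + bot * fy)
                  out.append(row)
              return out

          def sharpen(img, amount=0.25):
              # Crude sharpening: push each interior pixel away from the
              # average of its 4 neighbours (borders left untouched).
              h, w = len(img), len(img[0])
              out = [row[:] for row in img]
              for y in range(1, h - 1):
                  for x in range(1, w - 1):
                      avg = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
                      out[y][x] = img[y][x] + amount * (img[y][x] - avg)
              return out

          low = [[0, 64], [128, 255]]                  # a 2x2 "frame"
          high = sharpen(bilinear_upscale(low, 2))     # 4x4 upscaled frame
          print(len(high), len(high[0]))               # 4 4
          ```

          Real FSR1 uses an edge-adaptive filter instead of plain bilinear, but the point stands: it's generic ALU work.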

          • 1 year ago
            Anonymous

            DLSS doesn't actually run on tensor cores. Well, DLSS3.0 might, but nothing before that uses special hardware.

      • 1 year ago
        Anonymous

        These mixed results are exactly why I really have no idea anymore with GPUs.

        FSR is available for Nvidia too. DLSS is proprietary and limited to Nvidia.

    • 1 year ago
      Anonymous

      I'm biased because I have a 3060 Ti. This thing is a fricking monster and runs pretty much everything I throw at it perfectly. Of course that might be because I only play at 1080p.

      • 1 year ago
        Anonymous

        >3060ti at 1080p
        Well if you're happy with it then that's great, that card will last you a long time at that resolution.

        • 1 year ago
          Anonymous

          >that card will last you a long time at that resolution.

          I learned that when I ran FFXIV at max settings at 1080p. During an Alliance Raid with everyone going crazy with their shit, I checked and saw the fans weren't spinning, but I knew they were working fine because I opened a benchmark after and they started spinning. The game did fricking nothing to it lol.

      • 1 year ago
        Anonymous

        I'm planning on getting a 6700xt for 1080p, so similar boat. I plan to literally never upgrade, I already care extremely little for AAA games and I'll be fine 1080p/120hzing the rest of my life.

    • 1 year ago
      Anonymous

      unless you can get the 3060ti at msrp ($399), get the 6700xt (it almost matches the 3070 now); at $369 it's a beast

  62. 1 year ago
    Anonymous

    I have one, it's fine
    But they attract linuxtards who are insufferable and annoying, so when i upgrade next im going back to nvidia just to spite them

    • 1 year ago
      Anonymous

      hate to break it to you but modern nvidia works fine on tuxOS

    • 1 year ago
      Anonymous

      Anon, Nvidia was historically the go-to GPU on gnu/linux; their proprietary drivers were that good in comparison back then.

      • 1 year ago
        Anonymous

        What a big load of bullshit, Nvidia has always been inferior on Linux. The Linux creator openly hates Nvidia because they treat Linux like a second-class citizen.

  63. 1 year ago
    Anonymous

    IDK how people can play with DLSS or FSR. In screenshots or compressed videos it looks okay, but in-person it just looks off compared to native.

  64. 1 year ago
    Anonymous

    I'm undecided on what GPU to get for 1440p. Everything I was looking at at a lowerish price has 8GB, and I've already run into issues with that on my 1070. Was thinking of going msi again since my msi 1070 is still going strong. Any suggestions?

    • 1 year ago
      Anonymous

      3060ti or 3070, then again i haven't been keeping up with tech since 2015

      • 1 year ago
        Anonymous

        I was looking at those but they have 8 gbs and I keep running into issues with that as is.

      • 1 year ago
        Anonymous

        The 3060ti is trash, honestly not worth upgrading to that from a 1070 because you're still just gonna want a better card.

        • 1 year ago
          Anonymous

          Is it? I got it for $350 and it runs games at 1440p fine

    • 1 year ago
      Anonymous

      >Was thinking of going msi again
      The same company can make great cards one generation, and awful cards the very next generation. Always check reviews for the specific model you want to buy, and comparisons with other models of the same card.

    • 1 year ago
      Anonymous

      Find a used 1080ti.

      • 1 year ago
        Anonymous

        a 1080ti doesn't hold up for 1440p with newer games, especially UE5. I doubt it can even hold up 1080p60fps on Silent Hill 2

    • 1 year ago
      Anonymous

      For 1440p I'd just try to find a good deal on one of the higher end AMD cards, like 6800 and up. If you're really concerned about ray-tracing you probably want at least like a 3080 or better from Nvidia. The 7900 cards from AMD will do RT but they take a pretty big hit in performance. If you can get one for a good deal though they're great at 1440.

      Really depends what you want to spend, I'm generally of the opinion that you don't want to cheap out and get something you're just going to want to ditch in 2 years, so even if something is a little more than you need for 1440 now, you have to think about how it's going to be doing in the future. Something that barely does 1440 now is going to be doing 1080 pretty soon. Whatever the case you probably want like 16GB, 8 is way too low now. I wouldn't bother with anything lower than a 3080 or a 6800 right now.

    • 1 year ago
      Anonymous

      >I'm undecided on what GPU to get for 1440p
      keep in mind that 1440p is going to move your build up a price tier. 6700 xt, 6800 xt and 6950 xt are options at $350, $550 and $650-700 depending on whether you can catch a deal. All of these cards can handle 1440p ultra at 60+ FPS, and the higher you go the better they tolerate RT. My 6950 xt gets 90-120 FPS on the RE4 demo with ultra RT at 1440p native, so you should expect most of that on the 6800 xt if you were to get that. On games with heavy RT loads like Cyberpunk I get 50-60 with just RT reflections on, so FSR has to be used with heavy RT loads. Of the 3000 series only the 3080 and 3090 are worth looking at, if you can get a good deal used or open box.

      4000 series is a total fricking ripoff but the performance is good on the 4080 and 4090. 4090 is the only card in existence that can get 60+ FPS in cyberpunk at ultra RT at 1440. 7000 series is also a ripoff, just less so. Prices on the 7000 series have already dropped so I'd wait for more drops if you were considering those at all.

      Every gen's 4k card turns into a 1440p card when the next gen is released, so buying top of the line last gen is always the best bet if you don't want to get gouged for the crown jewel of the current gen.

      • 1 year ago
        Anonymous

        >Every gen's 4k card turns into a 1440p card

        This trend keeps slowing down though. We're at the point where AAA games cost way too much to produce and we don't get the big jumps of past gens anymore. That's why we can afford to keep our current GPUs longer now, since only a few games will really make them outdated in the coming years.

        • 1 year ago
          Anonymous

          I believe the conspiracy theory that Nvidia and AMD pay off AAA devs to make unoptimized shit to justify the existence of their newer cards

          • 1 year ago
            Anonymous

            I don't think they need to pay anyone, you often see morons everywhere (including this very thread) defending shit optimization because "HURR HURR poorgays"

        • 1 year ago
          Anonymous

          sure but cyberpunk still can't be run at max settings native 4k on a 4090. RT loads are fricking insane and GPUs are advancing to cope.
          >that's why we could afford to keep our current gpu longer now, since only a few games will really make them outdated in the coming years
          upgrading beyond the current gen or even the previous one just seems insane to me. DLSS/FSR is extending the life of cards far beyond what we've ever seen; everything capable of upscaling lasts as long as the user can tolerate it.

          i can't imagine that holding after this generation, the 4080/4090 are just too fricking expensive for any kind of mass adoption -- sure they sell a few, but a game dev can't bank on enough users having one when sorting out specs. and it's not like the next gen cards will be any cheaper; or frick, more powerful.

          the 4090 is already pushing PSUs to their limit; basic physics dictates that wringing additional watts out of a 12v rail is begging for trouble.
          until +24v rails become a thing, GPUs are going to be kind of capped on performance, and as such, so are the requirements for games.

          >i can't imagine that holding after this generation, the 4080/4090 are just too fricking expensive for any kind of mass adoption
          halo products are incredibly powerful marketing. The 3000 series is priced worse across the board than the 6000 series despite having comparable or even worse performance. Your typical consumer sees Nvidia made the best card so that means the 3060ti 8gb is better than whatever AMD is offering at the same or slightly lower price point. no one wants to wade through dozens of benchmarks, they just know one brand made the best card so they buy something they can afford from that brand.
          >the 4090 is already pushing PSU's to their limit, basic physics dictates that wringing additional watts out of a 12v rail is begging for trouble.
          we're going to hit the nightmare scenario of a flagship costing $2000+ because it comes with its own GPU enclosure that needs a dedicated PSU with +24v rails.

          • 1 year ago
            Anonymous

            >we're going to hit the nightmare scenario of a flagship costing $2000+ because it comes with it's own GPU enclosure that needs a dedicated PSU +24v rails.
            my man. 2 years tops, screenshot it.

            • 1 year ago
              Anonymous

              i can imagine the cope
              >wow 2000 is a really good deal! It comes with its own mini-case with fans and RGB!
              >you were all b***hing about the psu wattage requirement on old gpus but now that it's lower, since you can just use your own psu for the rest of the system, you're still b***hing
              >The EGPU design, as expected, keeps temps and noise levels cool throughout the entire system by sequestering it in its own case. These are the best noise-adjusted thermals we've ever seen. It's a great product at a horrible price.
              >Nvidia's New GPU Comes With Its Own Case And It's Brilliant

              • 1 year ago
                Anonymous

                >the nvidia side-car. (it's just as gay as it sounds)

                But yeah, that's the only way forward at this point unfortunately, short of cloud based gaming (or dev's focusing more on optimization)
                sadly my elon-net will never allow me to play anything online with sub 20ms latency. i'll give up gaming before moving back to a city/town

            • 1 year ago
              Anonymous

              doubt it, we'd need a new connection standard, the 4090 is already pushing the limits of thunderbolt which isn't exactly common on motherboards

              • 1 year ago
                Anonymous

                Wouldn't your pc just have two wall plugs

              • 1 year ago
                Anonymous

                The issue is the data limits of the connection from the all in one gpu to the pc


  65. 1 year ago
    Anonymous

    I have a 6950xt, I paid like 600 bucks for it and it's great. If you don't care about money Nvidia is better, but most people care about money. There are worse Nvidia cards that cost way more just because the prices are so fricked.

  66. 1 year ago
    Anonymous

    nah, i'm fairly satisfied with my XFX QICK 6800 so far.

    • 1 year ago
      Anonymous

      >hold forward to jason

  67. 1 year ago
    Anonymous

    Got lucky and got a reference 6800XT early last year during the shortage for MSRP. Never had any issues except with Horizon Zero Dawn at first. But once I turned on SAM it's been amazing. 4k 60FPS everything maxed out.

  68. 1 year ago
    Anonymous

    a 6700 xt cost me 380 euros. A 3060 ti would have been 480. That was a month ago.

  69. 1 year ago
    Anonymous

    Got a job in my third world country, gonna save for a pc around december/january of next year to replace my Ryzen 5 1600 and RX 580

    How long do you think a Ryzen 7 7700x and a RTX4080 will last at 1440p if the most recent game i've played is Spiderman Remastered and Hi Fi Rush?

    • 1 year ago
      Anonymous

      dont ask how long it will last, ask will it run the shit you have now. but dont ask that because you can already google benchmarks.

      • 1 year ago
        Anonymous

        About 7 years. 10 or more if you drop your fps from 120 to 60 on modern titles. DLSS to last 15 years

        I see, thank you for your answers anons

    • 1 year ago
      Anonymous

      About 7 years. 10 or more if you drop your fps from 120 to 60 on modern titles. DLSS to last 15 years

    • 1 year ago
      Anonymous

      I'm in similar conditions and have similar-ish plans. With graphics generally stagnating and options such as DLSS/FSR popping up a setup like that could last you an actual decade if not more.
      I've had my current PC with some minor upgrades for about eight years now. It's starting to show its age, but I've not found anything I can't play decently with it yet.

    • 1 year ago
      Anonymous

      No one knows. The i5 2500K lasted forever because Intel barely improved their products gen-to-gen; the Ryzen 3600 is arguably already outdated because single-thread performance improves by 30% year after year.
      The GTX 480 was the fastest card on the market in 2010, by 2014-2015 it was already unusable in newer games due to lack of VRAM. On the other hand, the GTX 970 is still somewhat decent today.

      Don't worry AMD bros. AMD is paying Linus millions of dollars and all of a sudden he became pro AMD just like that. And just like that in a few years people will stop shitting on AMD GPU's.

      It's pretty annoying honestly. This is probably the least competitive AMD has been since the Vega days, but NOW they start shilling for them? Why, what happened.

      • 1 year ago
        Anonymous

        >gtx 970
        I have yet to upgrade from it because it can still run most newer games on at least medium settings. This card has been amazing. That said, it's probably time for me to get something better by now.

      • 1 year ago
        Anonymous

        AMD saw the opportunity when intel stopped extreme tech upgrades. And realized that Linus needs that money for his two fricking labs. So they are paying him millions of dollars.

      • 1 year ago
        Anonymous

        I'm still running a Ryzen 3600 in my 1440p build and it's still performing quite well. Truth of the matter is CPUs stop mattering past a certain point when you go above 1080p.

        • 1 year ago
          Anonymous

          Not for strategy games

          • 1 year ago
            Anonymous

            Good thing I don't care about strategy games

          • 1 year ago
            Anonymous

            >tfw hearing the 5800x3d is an instant +90 fps in MWO
            >after I got a 5900x
            A-At least I have moar coars right?

        • 1 year ago
          Anonymous

          3600 is pretty much on par with what PS5/XSX have so you're probably fine for this gen, at least for 60fps.

          • 1 year ago
            Anonymous

            I've been running 144fps on it fine.

            • 1 year ago
              Anonymous

              On older and less demanding games, sure

    • 1 year ago
      Anonymous

      >How long do you think a Ryzen 7 7700x and a RTX4080 will last at 1440p
      Get the 7800x3d, it will last you longer, think of it as a refresh of the 5800x3d.

  70. 1 year ago
    Anonymous

    They are good budget options as long as you only use your PC for games.

  71. 1 year ago
    Anonymous

    Both are fine.
    The problem is when CEOs like Jensen treat the people and partners who helped them get where they are like shit.
    You don't shit all over the people that made your company what it is.
    AMD is a *little* more humble, but not by much; they've been caught lying about shit too.

  72. 1 year ago
    Anonymous

    Probably not but I've never thought
    >I wish I had an AMD card

  73. 1 year ago
    Anonymous

    Both companies are fine besides objectively shitty cards for the price like the 3050/60 (non Ti) or the 6500XT and 7900XT (not XTX).

  74. 1 year ago
    Anonymous

    Not reading the thread but my 6700 XT works on my machine.

  75. 1 year ago
    Anonymous

    Linux that Windows this
    Blah blah blah fricking pussy shit

  76. 1 year ago
    Anonymous

    No, but their drivers are. GL playing older games on AMD GPUs.

  77. 1 year ago
    Anonymous

    AMD cards are great the same way game consoles are great. They are made to do one thing, do it fairly well, and do it for a reasonable price. AMD cards are rasterization beasts: 90% of the silicon on the chips is dedicated to blasting out polygons and frames, not wasted on anything unnecessary for games. That's why they are cheaper, and that's what makes them better for the dedicated gamer on a budget.

    NVIDIA cards on the other hand aren't just gaming cards. They are built to be used for a lot more than just rasterizing polygons: AI/ML, CV, video/image processing, scientific computing, engineering simulations, crypto, CGI rendering, and much more. You are paying for a lot more, since many of these things can be used for gaming but aren't strictly necessary; NVIDIA-specific features are luxury features for gaming. NVIDIA cards are built for both gamers and professionals without being strictly focused on either, and you are paying for that extra silicon even if you never use it.

    Consumer AMD cards can do many of these things too, just not as well since they aren't built specifically for those tasks, so you aren't strictly missing out anyway outside of specific professional tools that don't support AMD GPU acceleration.

  78. 1 year ago
    Anonymous

    Not inherently bad, no. The issue is more about the featureset of cards now. Raytracing is not popular by any stretch, but little timmy takbir down the way is going to beg his parents for an nvidia card to raytrace, and not an AMD.
    If you don't care about raytracing, or NVENC, sure, go AMD. Me though, I use moonlight daily, and I don't really want to set up a new remote desktop, so I'm fine paying a slight premium for nvidia cards.

    • 1 year ago
      Anonymous

      anon, read up on disabling geforce experience's ability to auto update. they're going to frick you in the very near future.

      >i have no idea if they went through with it on the host side
      They didn't remove anything

      Anyway, how good is Sunshine? Does it actually do all-hardware capture + encode like GFE does or is it another shitty solution with software capture? Software capture solutions tend to suck at high res. Steam suffers from this for instance.

      they've stated their plan is to remove gamestream from geforce experience, they just haven't done it, yet.

      sunshine does seem to work pretty well, my client (laptop) only does 1080p, but over a wired gigabit LAN, it's been virtually flawless with 5.1 audio. it's hardware encoded h264 (host: GTX 1080)

      • 1 year ago
        Anonymous

        >implying I even fricking use geforce experience
        ????

        • 1 year ago
          Anonymous

          sigh. are we talking about the same thing here? do you stream from one computer to another using moonlight on the client computer?
          what are you running on your host machine to facilitate that.

      • 1 year ago
        Anonymous

        No, they've stated their plan is to remove it from the Shield app.
        >Starting in mid-February, a planned update to the NVIDIA Games app will begin rolling out to SHIELD owners.
        >NVIDIA Games app
        >rolling out to SHIELD owners
        Will they remove it from PC as well? Maybe, but they haven't said they will.
        >only does 1080p
        Not surprising it works, 1080p works perfectly well even with fully software everything, even encode. I haven't tried Sunshine yet but at 4k only GFE and Parsec performed decently, with GFE + Moonlight being smoother and having better frame delivery.

        • 1 year ago
          Anonymous

          i've got a new mini PC on order that'll be replacing the laptop, it does 4k, so we'll see about sunshine @ 4k.

          and okay, maybe they have not overtly said they're killing gamestream completely (i.e. removing from GFE) but reading between the lines it sounds very likely (hell, even the Sunshine devs have said as much). Outside of shield, was there any 'official' use case for gamestream? if not, and it's cannibalizing their online streaming option, do you really expect them to keep it in GFE?
          That's not a chance I'm willing to take, and given what they're doing to shield owners, it's not a company I'm willing to support with new hardware purchases regardless of what they do with gamestream/GFE.

          • 1 year ago
            Anonymous

            How can it cannibalize online streaming when there are many other options which provide the same functionality which run on Android? It's not like this is stopping Shield owners from streaming from their PCs.

            • 1 year ago
              Anonymous

              You know, i'm not sure..
              i mean i can stream $game to my living room over my LAN for $0
              OR i can pay a monthly fee to stream $game from nvidia over the internet
              i'm not sure which is the better deal for me.

              • 1 year ago
                Anonymous

                Not sure what point you think you're making here. You know you never needed Gamestream for that, right? Never did and still don't, even on Shield whatever since it just runs Android.

              • 1 year ago
                Anonymous

                dude...
                unless you're using sunshine/parsec
                open up GFE, disable gamestream.
                now try using moonlight, report back with results.

                i'm hoping this is one of those "we're two morons talking about totally different things, LOL." but hope is fading

              • 1 year ago
                Anonymous

                Why are you stuck on Moonlight? NVIDIA's shit isn't the only way to stream games from a PC, even if they remove it the option to stream games from a PC is still there. Get it? They cannot force you to pay for GeForce Now by removing Gamestream, because there are multiple alternatives. Understand?

              • 1 year ago
                Anonymous

                that's the entire point of all this: replacing GFE/gamestream with sunshine before they get around to killing it (which, dollars to doughnuts, they will)

                from nvidia's perspective keeping GFE/gamestream alive for PC users (not shield, don't mention shield, shield is fricking absolutely irrelevant to this conversation) is potentially costing them revenue from people going that route vs their paid cloud option.
                In other words, they have a free option that competes with their own paid option. why would they keep the free option around?

                i mention GFE/moonlight because that is what I was using. and in making an informed decision on an upgrade for an aging 5 year old video card, being able to stream games using the new card -- it's kind of relevant. thankfully with sunshine i'm not tied to nvidia, and they can get fricked over their treatment of their Shield customers.

                i don't know how else to try to explain this to you, you are so far up your own ass that you can't seem to grasp basic english. you arrogant mother fricker.

              • 1 year ago
                Anonymous

                Gamestream has far better reliability, latency and compatibility than other user-land streaming software. This is because Gamestream can directly capture and encode GPU framebuffers at a low level within the card, instead of needing unreliable, high-level userspace D3D/OpenGL/Vulkan/GDI API hooks. The hooks add latency, sometimes crash games, or capture nothing and give you black screens, and sometimes you get tearing or poor compression artifacts.

                Gamestream on the other hand just works and rarely has compatibility issues.
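                The "API hooking" the anon describes can be sketched with a toy Python analogy (this is not real D3D hooking, and none of these class/function names come from any actual streaming software): the streamer wraps the game's present call so every completed frame is grabbed exactly once, in lockstep with the frameloop.

                ```python
                # Toy illustration of present-hook capture: the streamer wraps the
                # game's present() so each completed frame is copied exactly once,
                # in lockstep with the game's frameloop (no tearing, no skipping).
                # Hypothetical names throughout -- a conceptual sketch only.

                class Game:
                    def __init__(self):
                        self.framebuffer = None
                        self.frame_id = 0

                    def render_frame(self):
                        self.frame_id += 1
                        self.framebuffer = f"frame-{self.frame_id}"

                    def present(self):
                        # In a real game this would flip the swapchain to the display.
                        return self.framebuffer

                captured = []

                def install_hook(game):
                    original_present = game.present
                    def hooked_present():
                        frame = original_present()
                        captured.append(frame)   # encode/send would happen here, frame-synced
                        return frame
                    game.present = hooked_present

                game = Game()
                install_hook(game)
                for _ in range(3):
                    game.render_frame()
                    game.present()

                print(captured)  # ['frame-1', 'frame-2', 'frame-3']
                ```

                The fragility the anon mentions follows from the same picture: if a game's present path can't be wrapped, the hook captures nothing and you get a black screen.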

              • 1 year ago
                Anonymous

                I don't think DXGI desktop duplication requires hooking into 3D APIs like that. Looking Glass uses it and achieves very low latency capture as far as I know.

              • 1 year ago
                Anonymous

                DXGI desktop streaming suffers tearing and frame skipping/dropping issues. It's a fallback for games that can't use API hooking and generally a much worse experience.

  79. 1 year ago
    Anonymous

    yeah sadly

  80. 1 year ago
    Anonymous

    >shit drivers
    >lack of love for AI
    >lack of proprietary software and graphical features like nvidia's DLSS (no Black person, fidelity fx doesn't get close), hairworks and the like
    >breakthrough industry features like physx will be unavailable for your card when they're introduced by nvidia - it took years for physx to go open source
    >massive unexplainable problems in a myriad of games like VRChat where it can't even play videos correctly
    >new releases often have massive issues with amd products and you will wait weeks for a patch to fix it
    >just as expensive as nvidia cards that are an improvement to all of the above

  81. 1 year ago
    Anonymous

    my rx 6700 xt is just fine, slight coil whine aside, but who cares , it runna da games

  82. 1 year ago
    Anonymous

    Nah, they're ok for the price
    t. rocking a RX480 since 2016

    • 1 year ago
      Anonymous

      how's it playing games sub 40 fps anon?

      • 1 year ago
        Anonymous

        Feels good, it can even run some newer games at 60fps

  83. 1 year ago
    Anonymous

    bought a 7900xt about a month ago
    feels pretty good, runs everything at pretty much max settings
    only had a game crash GPU drivers ONCE in 1 month of heavy use
    overall I'd say I'm pretty happy with it and I'd recommend it

  84. 1 year ago
    Anonymous

    is it not weird to you that amd can produce great cpus and all but their gpus are shitte

    • 1 year ago
      Anonymous

      It's only recently that their desktop CPU line finally had integrated GPUs; before, you had to buy laptop chips to get that. They tested it out with the 5600g and the 5700g as budget PC options and it seems to have worked out, so they put iGPUs in the 7000s.

  85. 1 year ago
    Anonymous

    Are they good for coom AI art? Streaming? Posting on youtube?

    • 1 year ago
      Anonymous

      cumbrain has brainrot

  86. 1 year ago
    Anonymous

    Nah, been using them since 2010 or so. I'm on a 6900XT now, but I got to give props to my previous RX 580. It had no business doing as well as it did after I bought a 1440p monitor.

  87. 1 year ago
    Anonymous

    I like MSI products

  88. 1 year ago
    Anonymous

    the only true amd AIB gpu tierlist
    >GOD TIER
    Sapphire (Nitro), Powercolor (Red Devil), Asus (ROG Strix)
    >HIGH TIER
    Sapphire (any other model), Gigabyte (AORUS)
    >MID TIER
    MSI (any model), Gigabyte (any model), Asus (any other model)
    >SHIT TIER
    Powercolor (Any fricking model other than a Red Devil), Any literally who chink manufacturer
    >WILDCARD TIER
    Any XFX card is a lottery and ranges between a god tier performance card or a piece of shit you'll want to replace in 3 months

    • 1 year ago
      Anonymous

      >gigabyte and asus anywhere above shit tier
      lol

    • 1 year ago
      Anonymous

      I never understood why Sapphire gets sucked off this much. I haven't used any of their cards recently but I've had 4 of them and out of those 4, 3 cards had broken fans in like 3 years of use. I've literally never experienced anything like that with any other manufacturer. In fact I don't think I've had any card with broken fans that wasn't Sapphire.

      DXGI desktop streaming suffers tearing and frame skipping/dropping issues. It's a fallback for games that can't use API hooking and generally a much worse experience.

      I'm not sure what program you're talking about; DXGI desktop duplication is a Windows API and doesn't stream anything by itself.

      • 1 year ago
        Anonymous

        Sapphire is like the EVGA of AMD GPUs; clearly not every single GPU they produce will be perfect, but 9 out of 10 times they have the most reliable performance of all of them.

      • 1 year ago
        Anonymous

        >3 cards had broken fans in like 3 years of use
        So you're the reason why Sapphire has easy-to-remove fans now.

        • 1 year ago
          Anonymous

          Do they really? That's nice to hear, though all my fans died or started rattling just out of warranty so I never sent 'em back for Sapphire to notice.

          • 1 year ago
            Anonymous

            The fans are all easily unscrewed without removing the heatsink or anything. I think it's been a feature since the RX 480; my old Sapphire RX 580 had easily removable fans. Check your old Sapphire GPU right now and see if it's a Nitro with quick-connect modular fans.

            • 1 year ago
              Anonymous

              Last Sapphire cards I had were R9 290Xs, but I sold them off as I upgraded so I don't have anything to check anymore.

      • 1 year ago
        Anonymous

        Steam in-home streaming when playing a non-3D/UI application, and many popular third-party remote desktop solutions (LogMeIn, Teradici, etc.), use the desktop duplication API, and they all have the same problem with tearing because they don't grab frames in lockstep with the game's flip rate/frameloop. 3D API hooking lets you grab framebuffers in sync with the game's frameloop, allowing the streaming software to encode and deliver frames at the same rate the game is running without tearing. If the streaming client supports VRR/Freesync/Gsync, that can also be used to avoid tearing, and less bandwidth is used for games running at a framerate lower than the display refresh rate.
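        The frame-skipping half of that complaint is easy to see with a toy timing model (pure illustration, no relation to any real capture API's internals): a game renders at 90 fps while an unsynchronized capture loop polls at 60 Hz, so a third of the frames are never captured and the delivered stream's pacing is uneven.

        ```python
        # Toy timing model: a game renders at 90 fps while an unsynchronized
        # capture loop polls at 60 Hz. Because the poll isn't locked to the
        # game's frameloop, some frames are never captured at all, and the
        # captured sequence advances unevenly (alternating 1- and 2-frame
        # jumps), which shows up as judder on the client.

        GAME_FPS = 90
        CAPTURE_HZ = 60
        DURATION_S = 1

        # Timestamp at which each game frame is completed.
        game_frames = [i / GAME_FPS for i in range(GAME_FPS * DURATION_S)]

        captured = []
        for j in range(CAPTURE_HZ * DURATION_S):
            t = j / CAPTURE_HZ
            # The poll grabs whatever frame was most recently completed.
            latest = max(i for i, ft in enumerate(game_frames) if ft <= t)
            captured.append(latest)

        skipped = GAME_FPS * DURATION_S - len(set(captured))
        print(f"{len(captured)} polls, {len(set(captured))} unique frames, {skipped} frames never captured")
        ```

        A frameloop-synced capture (hooking) or a VRR-aware client sidesteps this because capture happens once per completed frame instead of on an independent clock.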

        • 1 year ago
          Anonymous

          I haven't used those other programs, but I've only seen tearing on Steam when using NVFBC (which is considered deprecated IIRC); DXGI desktop capture has always worked perfectly well for me.

          • 1 year ago
            Anonymous

            Then you're likely running games with vsync enabled (which adds extra lag).

            • 1 year ago
              Anonymous

              No, I'm not because Steam streaming specifically recommends disabling VSync when you stream.

              • 1 year ago
                Anonymous

                Steam uses 3D API hooking except for non-3D/UI applications.

              • 1 year ago
                Anonymous

                Not the case, I've seen it use DXGI when streaming games. How about Looking Glass then, are you going to tell me that also doesn't work?

    • 1 year ago
      Anonymous

      I bought a Gigabyte Windforce 5600 XT in early 2020 and the POS would green-screen on me randomly because Gigabyte messed up the vBIOS. I refunded it and got a 2070S instead since I had no GPU; that one is also Gigabyte, and it works aside from needing a custom fan curve or it whirs up like a car, also due to Gigabyte fricking up the vBIOS

    • 1 year ago
      Anonymous

      Gigabyte GPUs are actual fricking dogshit, both AMD and Nvidia.
      Now that Nvidia completely destroyed their relationship with one of the few exclusive AIBs they had who was actually worth a damn, the best AIBs now are Sapphire and maybe PNY if you don't plan on overclocking.

      • 1 year ago
        Anonymous

        I've had 3 different GB GPUs and they were all fine
        Worst problem I had was with a 3060ti, in that its fan curve was way too high, it was an easy fix.

  89. 1 year ago
    Anonymous

    you can't do cool AI shit with them so yeah

    • 1 year ago
      Anonymous

      You can, just not on windows.

      • 1 year ago
        Anonymous

        You can do it on Windows but Shark.ai is a lot to set up and you can't use LoRAs on it.

  90. 1 year ago
    Anonymous

    They can't generate my anime tittles like NVidia can.

  91. 1 year ago
    Anonymous

    So how's the VR compatibility with AMD+Linux?
    I wanna drop a ton of cash on a new setup that can also do VR/Raytracing but I guess I can go without Raytracing and dodge Nvidia as long as VR is good on AMD's side.

    • 1 year ago
      Anonymous

      it works about the same as it did on Nvidia (3090)
      artifacting in steamvr's overlay is slightly different but ever-present until it gets fixed by Valve or whomever does it finally

  92. 1 year ago
    Anonymous

    BR here
    Got this recently. Upgrade from a GTX 1060 6GB
    I'm in fricking heaven.

    • 1 year ago
      Anonymous

      >he bought a refresh
      Good job anon. I got the same GPU model from sapphire.

    • 1 year ago
      Anonymous

      I'm gonna buy that card when I get my tax refund, looking forward to it. On a craptop right now that struggles with Ace Attorney sometimes.

  93. 1 year ago
    Anonymous

    An HD 5770 was my first real GPU that I bought and did pretty well for its time. I think it was a mid-tier card. Can't remember having any problems with it. I most recently upgraded from a 1070ti to a 6950XT and the performance upgrade was obviously noticeable. Haven't had a problem yet but there is this odd hitch when playing full screen video where it will black screen for a second before displaying the fullscreen. It might be because I'm running it through an AVR though. Other than that, I haven't encountered any problems with current day AMD GPUs.

  94. 1 year ago
    Anonymous

    No, my 6800 is an absolute beast and maxes out 1440p games with high framerates. Not sure why people are trying to spend 800 dollars on meme cards. Really starting to believe Nvidia has an entire department dedicated to shilling on this board

  95. 1 year ago
    Anonymous

    Yes they are. I had sold my amd gpu because of driver issues after 2 months of problems. I feel bad for the poor homosexual that bought it.

  96. 1 year ago
    Anonymous

    Sorta off topic but hopefully an anon can shed some light. I added and replaced some fans in my rig today. Before I swapped them out I ran Prime95 and it settled at 83 degrees after 20 minutes. After swapping the new fans in and dusting, I ran Prime95 again: it settles at 60 degrees for about 5-6 minutes, then spikes up to 90 degrees, then slowly goes back down to 60. Across multiple runs the behaviour repeats every time. The CPU is a 5800X.

    • 1 year ago
      Anonymous

      >it settles at 60 degrees for about 5 - 6 minutes then spikes up to 90 degrees then it slowly goes back down to 60.
      Sounds like it's cycling through the various instruction sets available to the CPU, stress-testing each one in turn. That isn't a problem; stress test your CPU by playing the most demanding game in your collection instead. Synthetic loads aren't a great indicator of daily use.

      • 1 year ago
        Anonymous

        I had that thought but that doesn't explain why it was stable before I swapped fans.

        • 1 year ago
          Anonymous

          Check your fan curves in whatever software you're using to control them and see if they're on default settings. It sounds like what AMD does with their GPU fans, which only start up at 60C, except why would that apply to a CPU.

          • 1 year ago
            Anonymous

            >GPU fans only starting up at 60C
            perfectly fine for a gpu and probably preferable since larger variations in temps will cause more damage to components than a steady warm state

            • 1 year ago
              Anonymous

              Yes, yes, we all know that, my own 6750xt is idling at 37C no fans on right now. The question is why is that behavior being applied to a CPU.

          • 1 year ago
            Anonymous

            Thats what they do on nvidia. Thats normal

            • 1 year ago
              Anonymous

              Anon, the original post was referring to a 5800x. A CPU. Semi-passive heatsink behavior is known for GPU cards but not usually seen for modern CPUs on desktop.

  97. 1 year ago
    Anonymous

    I like my 6700XT except there's one game I can't play on it unless I roll back to old drivers because of AMD messing with OpenGL driver calls or something, besides that it's been great.

    • 1 year ago
      Anonymous

      Is it Hatsune Miku Project Diva Arcade Future Tone?

      • 1 year ago
        Anonymous

        No, it's the Combat Mission series of games, not surprised that other games are having issues, though.

  98. 1 year ago
    Anonymous

    Went with AMD for my most recent PC because it came with two games I was gonna buy Day 1 anyway, figured it was like getting it for 120 dollars off.
    It sucks, developers literally just do not fricking care about AMD GPUs. Even when there is Vulkan support it doesn't feel like enough; simple shit like watching videos on the internet can behave fricky sometimes, drivers are a pain in the ass, and emulators are an inconsistent crapshoot (PS3 emulation runs fantastic, but somehow it can barely run a Saturn game at 12fps)

  99. 1 year ago
    Anonymous

    I'm upgrading a 10 year old pc so I got a 6750 xt but I don't know what cpu to get now

    • 1 year ago
      Anonymous

      help don't let me drown in the sea

      • 1 year ago
        Anonymous

        5800x3d, or wait for AMD's next X3D chip, the 7800x3d

  100. 1 year ago
    Anonymous

    the gpus are pretty good, it's the drivers that are bad

    • 1 year ago
      Anonymous

      This, they work fine on Linux because people other than AMD work on them.
      AMD should really start hiring some of the people working on their open source drivers, it's clear their current in-house staff are shit.

      • 1 year ago
        Anonymous

        their driver team are mostly h1b visa pajeets and they will always be trash until that changes

        • 1 year ago
          Anonymous

          Jesus christ
          Reality imitates memes

  101. 1 year ago
    Anonymous

    They are the PC version of playing games on a clone console.

    • 1 year ago
      Anonymous

      moron

    • 1 year ago
      Anonymous

      amd bros BTFO

  102. 1 year ago
    Anonymous

    i have an rx 6600, it's fine, it played CP2077 without issue

  103. 1 year ago
    Anonymous

    they're the best for value.
    i got an rx 6600 for $200 and its been incredible tbh.

  104. 1 year ago
    Anonymous

    never take any opinion on Gankerirgins seriously
    for real, some of you fricking losers need to get a life, stop this brand homosexualry and give factual arguments instead of devolving every single thread into a shitflinging fest

    • 1 year ago
      Anonymous

      Mad coz bad GPU

    • 1 year ago
      Anonymous

      Modern Ganker is 90% brand fanboys.
      Just go with whatever offers the best value for you.
      Personally I've owned multiple AMD and Nvidia GPUs in my life and I've never had issues with either brand because I'm not moronic.

  105. 1 year ago
    Anonymous

    Unironically no, however that's only in regards to gaming. Nvidia still comes out on top for other shit not related to gaming.

  106. 1 year ago
    Anonymous

    >AMD driver
    >driver crashed
    how am I supposed to fix this

    • 1 year ago
      Anonymous

      what game are you playing on windows?

      • 1 year ago
        Anonymous

        vlc

        • 1 year ago
          Anonymous

          >VLC
          You're on Windows and you use that garbage? Not even on Linux is that junk normalgay program worth using. Get MPC-HC from the K-Lite bundle or go full-blown mpv.

          • 1 year ago
            Anonymous

            MPC crashes almost every time you enter full screen

            • 1 year ago
              Anonymous

              That's some weird shit, I have no problems on my end. Try SMPlayer then, it's a GUI frontend for mpv.

  107. 1 year ago
    Anonymous

    I like to dabble every now and then with 3D-modeling, and unlike most of this board I actually like the idea of ray-tracing and DLSS. So when it came time to upgrade from a seven year old 980, I went with a 3060 Ti. If I'm going to upgrade to a 40 series card it will probably be a 4070 Ti because the 4080 and 4090 are not only ridiculously expensive, but they also look like fricking bricks and are absolutely butt-ugly.

    I would look into AMD but I always see reviews on sites as well as comments from anons on here about how they buy a Radeon card and end up regretting it, and remember my first custom PC had a Radeon card and would flicker every now and again for no reason.

    • 1 year ago
      Anonymous

      >and unlike most of this board I actually like the idea of ray-tracing
      And yet you went with a GPU that's dogshit at it.

      • 1 year ago
        Anonymous

        Jokes on you, the only game I have installed that has ray-tracing is Minecraft. It works just fine for me!

        Except I never play Minecraft....

  108. 1 year ago
    Anonymous

    This generation of AMD GPUs is really bad; the 7900 XTX eats up to 600W and performs worse than, or about on par with, a 4080.

    • 1 year ago
      Anonymous

      the 7900xtx caps itself at 291w

  109. 1 year ago
    Anonymous

    I've been running the 7900xtx for the last few months. Not really seeing any problems with it. I can't really compare performance against a modern nVidia card since my previous system was a 1060, but stability is as expected (modern games crash occasionally, but when is that not true?), and performance is good enough that I wouldn't be able to tell if it were better (sidenote: what's a good FPS counter these days that works on every game?)

    • 1 year ago
      Anonymous

      Steam has a fps counter in its options menu

      • 1 year ago
        Anonymous

        This would generally require either buying games, or going way out of my way to run them through steam.

        the amd software included with the drivers has an fps/stats counter as well. AMD Software -> Performance -> Metrics -> Overlay -> Show Metrics Overlay. Games shouldn't crash btw, that is not normal

        I don't want ALL the metrics, that's a huge obtrusive block that gets in the way of playing games. I just want a tiny little FPS counter.

        • 1 year ago
          Anonymous

          Afterburner then, Mr. Pirategay. I think you can even use it on emulators? Not sure.

        • 1 year ago
          Anonymous

          You can configure what metrics you want to see with the amd software

        • 1 year ago
          Anonymous

          >I don't want ALL the metrics
          on the Tracking / Overlay tab, select the Tracking tab and disable metrics except for FPS

    • 1 year ago
      Anonymous

      >modern games crash occasionally, but when is that not true?
      what games crash?

      >what's a good FPS counter these days that works on every game
      On windows you have MSI Afterburner, turn on the onscreen display. For linux you have MangoHud.

      • 1 year ago
        Anonymous

        Recently, Hogwarts Legacy crashed about 3-5 times in the maybe 30 hours I played it. I also tried NMS again and got at least 3 hard locks within 2 hours so that one could be a card issue or an issue with literally anything else on my machine. Hard locks in Returnal. Maybe a couple of crashes in mechwarrior 5 but I could be misremembering from the previous time I played that.

        Except for NMS, it's nothing out of the ordinary.

        • 1 year ago
          Anonymous

          What the frick, none of that is normal.

    • 1 year ago
      Anonymous

      the amd software included with the drivers has an fps/stats counter as well. AMD Software -> Performance -> Metrics -> Overlay -> Show Metrics Overlay. Games shouldn't crash btw, that is not normal

  110. 1 year ago
    Anonymous

    I want to buy a new GPU so fricking bad but I can't think of a single upcoming game that justifies it
    Still using a 1080 ti and I only play at 1080p

    • 1 year ago
      Anonymous

      You don't need more at 1080p. That 1080ti is one of those rare great products, on the same level as the i5 2500k/2600k were for CPUs, succeeded in that role by the 5800x3d.

      • 1 year ago
        Anonymous

        Yeah it still holds up incredibly well
        I do have to mess with settings occasionally, but a lot of games still run great maxed out at 1080p
        Just feel like upgrading to further future proof myself for when it's not enough anymore, but I always struggle to find game(s) coming out to justify it

    • 1 year ago
      Anonymous

      Honestly, I only upgraded in preparation for Dragon's Dogma 2.
      I was still running a 1070.

      • 1 year ago
        Anonymous

        Not that guy but what did you get?
        Dragon's Dogma 2 is the only upcoming game that I'm interested in, if not for that I would continue using my GTX 660 until it is literally stops working at all.

        • 1 year ago
          Anonymous

          >Not that guy but what did you get?
          RX 6950 XT
          Saw it on sale for $650 so I snatched it up.
          I'll probably use this GPU until I quit (modern) gaming altogether.

          • 1 year ago
            Anonymous

            Your GPU will have the same lifespan as the Steam Deck: for as long as the Deck can run games, your GPU should keep getting matching driver support. So, 10 years.

  111. 1 year ago
    Anonymous

    they're good

  112. 1 year ago
    Anonymous

    convince me it's a terrible idea to upgrade my GTX 1080 to the 7900XTX
    work bonus burning a hole in my pocket.

    • 1 year ago
      Anonymous

      Invest in silver.

    • 1 year ago
      Anonymous

      RDNA3 is a beta test, don't buy it until the drivers are better.

      • 1 year ago
        Anonymous

        RDNA3's drivers are better than 2's, which are 1's, which are better than Vega's
        When is an AMD GPU not a beta test?

        • 1 year ago
          Anonymous

          >When is an AMD GPU not a beta test?
          From my experience, after a year or two passes. The drivers really suck.

    • 1 year ago
      Anonymous

      honestly, if you have all the kit and caboodle right now like 1440p/4k monitor and a computer that wont bottleneck, go for it
      if you're doing a whole system refresh id wait till next gen
      i did a full refresh at the start of the year with a 7600x and a 4090 with an lg c2, i don't regret it for a second bar the mental gymnastics i did to justify spending 500 dollars more than my entire system on the gpu, but again, i said frick it, i haven't bought a gpu in 7 years or a cpu in 11 so i got my money's worth

      • 1 year ago
        Anonymous

        thanks anon, yeah i upgraded everything else last year
        >ryzen 5700G
        >32gb ram
        i just held off on the GPU because, damn they were rare and way, way overpriced.
        now they're just overpriced

        most of my gaming is on a 4k TV, and the GTX 1080 is showing its age. it's been a trooper though, got it for like 550 back in 2018. so i don't mind paying a bit of money for a card that'll last a few years (and i'd wager a 7900 XTX will be relevant even longer)

  113. 1 year ago
    Anonymous

    >bought 3080 10gb for 900 dollars last June
    >it can barely run AAA games above 60fps now never reaching 142fps goal
    Guess its 1080p all settings low for as much fps for the next 5 years for me.

    • 1 year ago
      Anonymous

      Turn on DLSS and it should run faster.

    • 1 year ago
      Anonymous

      >bought 3080 10gb for 900 dollars last June
      Why in the frick would you do that?
      You never buy a GPU above MSRP.

    • 1 year ago
      Anonymous

      Disable meme tracing and stop playing at 4K.

    • 1 year ago
      Anonymous

      i have a 10gb 3080 and have no issues running AAA games at 4k at 100+fps. You sure its not your RAM or CPU?

      • 1 year ago
        Anonymous

        probably bottlenecked by 3700X and 16GB ddr4. I plan on getting a 5800X3D soon

  114. 1 year ago
    Anonymous

    Does switching to 1440p really make a difference visually?

    • 1 year ago
      Anonymous

      If you have two 1080p monitors don't bother.

    • 1 year ago
      Anonymous

      yes
      1080p looks blurry by comparison

    • 1 year ago
      Anonymous

      Yes, not only is it clearer but there's also far less aliasing
      It's more significant than going from 1440p to 4K
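      The raw pixel counts behind those jumps are just arithmetic (nothing vendor- or panel-specific; how big each step looks in practice also depends on screen size and viewing distance):

```python
res_1080p = 1920 * 1080  # 2,073,600 pixels
res_1440p = 2560 * 1440  # 3,686,400 pixels
res_4k    = 3840 * 2160  # 8,294,400 pixels

step_to_1440 = res_1440p / res_1080p  # ~1.78x the pixels of 1080p
step_to_4k   = res_4k / res_1440p     # 2.25x the pixels of 1440p
```

      So 1440p -> 4K is actually the bigger jump in raw pixels; the "1080p -> 1440p feels bigger" claim is about perceived sharpness at typical monitor sizes, not pixel count.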

  115. 1 year ago
    Anonymous

    They play games well. If, for some reason, you know for a fact that you will be doing some non-gaming tasks that use CUDA cores, that's when you get Nvidia.

  116. 1 year ago
    Anonymous

    Just built my PC what to play with this?

    • 1 year ago
      Anonymous

      HiFi Rush
      Star Ocean The Divine Force
      Daemon X Machina
      Rift Breaker
      Phantom Brigade

    • 1 year ago
      Anonymous

      why are you black

      • 1 year ago
        Anonymous

        Because now I'm the true master race on Ganker lmao. All I play is PC now, and I emulate consoles; I gave all my consoles and games away

    • 1 year ago
      Anonymous

      >B450M DS3H
      Stupid Black person haha

  117. 1 year ago
    Anonymous

    AMD works. It's just some things like emulators run like shit if they don't have Vulkan support

  118. 1 year ago
    Anonymous

    I had an RX 590 Sapphire Pulse I bought for $140 five years ago. It replaced my decade-old GTX 770. I upgraded to a 3080 for $750 when it came out. I prefer G-Sync over FreeSync, so if I were to upgrade, I'd probably get another Nvidia card; their edge is usually software and GPU optimization. Plus AMD runs a literal furnace. Not sure why their cards always get so hot.

  119. 1 year ago
    Anonymous

    You all fight between Amd & Nvidia. But when it comes to manufacturers do you have a go-to? I remember buying an EVGA Nvidia card back in 2016 solely because it had two DVI slots but now that I have only HDMI/DP monitors I don't need to scour for specific ports from specific manufacturers.

    • 1 year ago
      Anonymous

      Sapphire and XFX on AMD
      used to be EVGA on Nvidia, now maybe MSI takes that spot

  120. 1 year ago
    Anonymous

    >still no 7500 xt

  121. 1 year ago
    Anonymous

    Is $50 for a 1050 ti worth it?

    • 1 year ago
      Anonymous

      A 1050ti is kind of pointless for gaming, it's barely any better than modern integrated GPUs

  122. 1 year ago
    Anonymous

    No, they're not. In the current generation (and to a degree the previous one), AMD GPUs are a better buy overall, and AMD is a less shitty company that supports FOSS in general. If you can afford $1600+ for marginal or very situational gains, then by all means buy an Nvidia 4090, as it's the only NV card worth it so far. For everyone else, the $1K AMD 7900 XTX or the $800 XT is better value for the performance. I expect the same will be true as lower-tier cards are released.

  123. 1 year ago
    Anonymous

    Is a 6750xt good for 1440p gaming?

    • 1 year ago
      Anonymous

      Yes

  124. 1 year ago
    Anonymous

    >This is a trick question.
    Neither is inherently good or better. It is per-title preferential rendering, which is then compounded with opinion, which will typically override statistical or observational analysis.
    Truthfully, NVIDIA has had a ton of tight stuff they let slip as publicity stunts; everyone bought into building their business on functional, or rather fictional, theory, which eventually got forwarded to AMD, such as Xbox / PlayStation, and now NVIDIA's only device in this category is the useless Switch, which is a very niche product for new younger people. As a stylized preference they are extremely elitist, and it is reflected in their work, which I'll be one to accept has grown on me. Most people don't deserve their gift of inheritance and thus shouldn't expect it to ever be different. They've successfully covered up every blunder of their ignorance, therefore it is futile to go against their stance when people blindly follow their trend without full realization of the raw history. As an enthusiast GPU, one will be hard pressed to find a single title where they aren't the best acceleration, even if how they're doing their pipeline is fraudulent and filled with error or never fully realized.
    AMD on the other hand used to be really excellent at providing a different type of GPU back when they were ATI, hence the reason they were touted as worse yet were genuinely beefier in cutting-edge, state-of-the-art engineering. They just were better hardware and destroyed everything NVIDIA tried back then, especially in efficiency and reliability. You won't hear this often, because there are a lot of people who spread opinionated falsehood about the past. Often it is user error contributing to public perception; fragility just requires higher-quality support within the known tolerance of the product. AMD basically picked up a ton of the slack when they should have been cutthroat, and at the end of the day having a GPU in the PC to do 3D-accelerated work is better than not having one

    • 1 year ago
      Anonymous

      Didn't read but GPUs should not cost as much as a used beater

  125. 1 year ago
    Anonymous

    they're affordable for the average user, but lacking every feature known to mankind.

  126. 1 year ago
    Anonymous

    People say they're good. I fell for the 5700 XT youtuber shilling and bought the Sapphire one. Worst GPU I ever had.
    Never again will I buy AMD shit.

  127. 1 year ago
    Anonymous

    >Trouble is AMD has now evolved, in public perception, into the open-source option, the viable price/performance option competitive with NVIDIA and their proliferation of the identical practice, when this couldn't be further from the truth. They basically have everything at their fingertips, and can progress innovation at their leisure. They are specialized, with fine-tuned colorspace control for unique usage-case scenarios in specific titles which were solely designed for the hardware itself. A good example of this is BATTLEFIELD 4, which is now a devoid-of-life, corrupt experience littered with tryhard hacking nonsense and will never see another content update directly from the publisher officially. This is terrible for the industry as a whole when people accept a worse product. Accept the highest-end enthusiast tier, or go home. End of story. Otherwise stagnation will win. ATI is gone and dead, and their involvement in the Wii / GC / Xbox 360 has come to a close. It isn't the thing it once could be; they've exhausted every known trick in the book, and their skill has been completely ignored and forgotten for gimmicky moronic bullshit nuance homosexual shit, no joke. Everything that was cool about this contender has rotted into brain-dead territory. There is no resolution either; both vendors are here to stay forever. Is it excellent in the STEAMDECK, where they have full-throttle Windows 11 driver support for their high-end device? Absolutely, but there is no foreseeable option where the consumer is able to choose their experience. You just end up needing to purchase a pair or a few of both vendors' highest-tier GPUs to get the full story. There is no tradeoff whatsoever. Stuff designed for AMD is just better on their HARDWARE, and stuff for NVIDIA is just suited to their preference of style. The only exception to this rule is super-funded or just skillfully incorporated titles which feature unrivaled and unconditional support of both to their strength / level of fidelity

  128. 1 year ago
    Anonymous

    >INB4 Console homosexual justify Series X / PS5 as high end. These homosexualS just need to be shot in the head

  129. 1 year ago
    Anonymous

    >DO I WANT HOW 7900 XTX render color

    or

    DO I WANT HOW NVIDIA DOES RTX

    HMM I'M moronic

  130. 1 year ago
    Anonymous

    nvidia color profile
    AMD color profile

    AMD 16xAF
    NVIDIA 16xAF

    DLSS vs Fsr
    DLSS x Fsr

    Playing HALO or Playing HALO
    the same fricking thing, but it looks different and shit, and they both do different shit well. it's such an annoying mind frick. POINT is, you can't have both unless you're truly stupid as frick with your rich spending decisions, so choose wisely and don't come crying here when you fricking chose wrong. Doesn't hurt to just accept one or the other; we've accepted NVIDIA as an extension-into-itself trend and AMD as considered worse, even though it was never a competition when trying to utilize their driver-specific functionality in specific titles

  131. 1 year ago
    Anonymous

    RX 6600 is more than enough

  132. 1 year ago
    Anonymous

    Rx 6600 is a piece of shit
    7900 XTX or 4090 or you're the fricking problem
