AMD won

https://videocardz.com/newz/amd-radeon-rx-7600-to-launch-with-269-e299-msrp

  1. 10 months ago
    Anonymous

    >8GB
    Are they moronic?

    • 10 months ago
      Anonymous

      You're the moron for falling for the tricks about 8GB of VRAM "not being enough"

      • 10 months ago
        Anonymous

        Guarantee you barely even knew what vram was 2 years ago

        Nice Dunning-Kruger replies

        • 10 months ago
          Anonymous

          Aren't 8gigs OK if you have PCIe 4 for your RAM?

          • 10 months ago
            Anonymous

            No. The newest GPUs don't even use PCIe 5.0, not even the RTX 4090. They work on a PCIe 5.0 motherboard because those boards are backwards compatible (for GPUs, not CPUs) but the GPU won't perform any differently on the PCIe 5.0 board versus the PCIe 4.0 board.

            Watch this video if you want a super in depth explanation: https://www.youtube.com/watch?v=v2SuyiHs-O4
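
            If you want to sanity-check that yourself, the napkin math is simple (per-lane figures below are the commonly quoted effective rates, so treat this as a rough sketch, not gospel): even a PCIe 5.0 x16 slot moves a fraction of what the VRAM on a cheap card does, which is why spilling over the bus into system RAM tanks performance instead of "extending" your 8GB.

            # Back-of-the-envelope: PCIe link speed vs on-card VRAM bandwidth.
            # Per-lane figures are the commonly quoted effective rates in GB/s.
            PCIE_PER_LANE = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

            def pcie_bandwidth(gen: str, lanes: int = 16) -> float:
                """One-direction PCIe link bandwidth in GB/s."""
                return PCIE_PER_LANE[gen] * lanes

            def gddr6_bandwidth(bus_bits: int, gbps_per_pin: float) -> float:
                """VRAM bandwidth in GB/s: bus width (bits) / 8 * data rate per pin."""
                return bus_bits / 8 * gbps_per_pin

            for gen in ("3.0", "4.0", "5.0"):
                print(f"PCIe {gen} x16: ~{pcie_bandwidth(gen):.0f} GB/s")   # ~16 / ~32 / ~63
            print(f"128-bit GDDR6 @ 18 Gbps: ~{gddr6_bandwidth(128, 18):.0f} GB/s")  # ~288
            # Even PCIe 5.0 x16 is several times slower than a budget card's VRAM,
            # so system RAM over the bus can't stand in for missing VRAM.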

      • 10 months ago
        Anonymous

        I have an 8gb 1080 and I can tell by how textures load in late that it isn’t enough anymore

        • 10 months ago
          Anonymous

          Not that the anon you're replying to is correct (inherently), but the reason textures take so long to load is that every dev has completely removed precaching, or they try to bake it in at menus and just... stop doing it once you load in, streaming as you go instead (leading to HORRENDOUS pop-in).
          More VRAM is beneficial here, but by and large it's beneficial at higher res. 8GB is enough for 1080p, but 1080p is the clueless normie's res of choice now.

    • 10 months ago
      Anonymous

      Guarantee you barely even knew what vram was 2 years ago

      • 10 months ago
        Anonymous

        yeah, cause games weren't crazy VRAM dependent back then. Now 12GB of VRAM is required to run at 30fps.

        • 10 months ago
          Anonymous

          What changed about games that they need VRAM now?

          • 10 months ago
            Anonymous

            Console game ports have ridiculously big textures for some reason.

        • 10 months ago
          Anonymous

          VRAM size has very little to do with framerate. you either have enough or you don't. if you don't, textures stop loading or the game starts stuttering, NOT running consistently at a lower framerate.
          in reality the issue is solved by reducing the texture setting, the render resolution, lowering shadow resolution, things like that. and before devs started relying on the 9th gen consoles as their new target hardware we never had any problems with 6GB VRAM, let alone 8GB. it's just lazy console ports again, multiplat gaming never changes.
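
          To put rough numbers on why dropping the texture setting frees so much VRAM (toy math only: it ignores streaming pools and engine overhead, and assumes block compression at roughly 1 byte per texel):

          # Toy texture-memory math; ignores streaming pools and engine overhead.
          def texture_mib(res: int, bytes_per_texel: float, mips: bool = True) -> float:
              base = res * res * bytes_per_texel
              return base * (4 / 3 if mips else 1) / 2**20  # full mip chain adds ~1/3

          for res in (4096, 2048, 1024):
              raw = texture_mib(res, 4)   # uncompressed RGBA8
              bc7 = texture_mib(res, 1)   # block-compressed (BC7 is ~1 byte/texel)
              print(f"{res}x{res}: ~{raw:.0f} MiB raw, ~{bc7:.0f} MiB compressed")
          # Each step down in resolution quarters the cost, which is why "textures
          # one notch lower" is usually the fix when an 8GB card runs out.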

        • 10 months ago
          Anonymous

          Name one game.

    • 10 months ago
      Anonymous

      8 gb is still ok for 1080p
      which is the point of this card

      • 10 months ago
        Anonymous

        GTX 750ti was "still ok for 1080p"
        Demands have risen. 8gb is not okay.

        • 10 months ago
          Anonymous

          Studios put out unoptimized piles of shit so Nvidia can market overpriced, weak video cards, all thanks to fricking DLSS, which is what they use to measure and market their newer cards' performance. The only two cards worth buying from that lineup are the 4080 or 4090, and even then they're still overpriced to hell and back.

      • 10 months ago
        Anonymous

        My rx480 had 8gb of vram 7 years ago
        There's absolutely zero reason newer cards should release with the same amount of vram in 2023

        • 10 months ago
          Anonymous

          no game came close to utilizing the full 8GB at 1080p 7 years ago.
          While I do agree there should be some scaling up, I think inflation is eating into their big margins and they still want to keep them. That's why they're skimping on VRAM.

          • 10 months ago
            Anonymous

            Deus Ex: Mankind Divided and FFXV did

          • 10 months ago
            Anonymous

            The base requirements move upwards, therefore the overhead should as well.

      • 10 months ago
        Anonymous

        Yeah, but there are games (and not shitty disgusting ports like TLOU) that already require more than 8GB of VRAM for 1080p, like Forza Horizon 5. I don't remember what other game I saw that exceeded 8, but if Forza, a game from a few years ago, requires more than 8GB, why is it unreasonable to think that the death of 8GB is near?

      • 10 months ago
        Anonymous

        >2023
        >1080p
        Lol

        • 10 months ago
          Anonymous

          shills with lowlife manipulation tactics. Calculating people like you might lose their souls; hope it was worth it and you're "happy"

          • 10 months ago
            Anonymous

            I can definitely tell the difference between 1080p and 1440p. Framerates above 60 Hz are a placebo though, so 60 Hz is the same as 144 Hz.
            1440p@60Hz > 1080p@144Hz

            • 10 months ago
              Anonymous

              k, you get a pass. for now.

            • 10 months ago
              Anonymous

              > 60 Hz is the same as 144 Hz.
              Shut your trap you are blind

            • 10 months ago
              Anonymous

              >Framerates after 60 Hz are a placebo though
              you're either playing with a controller or you've got learning disabled levels of hand-eye coordination.

            • 10 months ago
              Anonymous

              To be honest I can't tell the difference between 60Hz and 144Hz when gaming. Going from 60Hz to 200+Hz might be a different story.

            • 10 months ago
              Anonymous

              >60hz and 144hz are the same

              Are you blind or something?

              • 10 months ago
                Anonymous

                probably tested 144fps on a 60Hz display, OR the game didn't reach 144fps on a 144Hz panel,
                or it was a very slow-paced game with the camera panning slowly,
                or they're just very obtuse and lying.
                To be genuinely unable to notice the higher fluidity of a better refresh rate, your brain would literally (not for emphasis) need to be broken, and your kinetic vision so shit you'd barely be able to function irl

            • 10 months ago
              Anonymous

              >60 Hz is the same as 144 Hz
              Bait, but whether it's a meaningful improvement depends on the game. FPS or fightan? Sure. Third person cover poker? Not really.

      • 10 months ago
        Anonymous

        yeah, for low and medium texture settings...

        the consoles have 16gb, and most games are cross platform, 8gb is 2x less than what we should be getting, simple as that

        • 10 months ago
          Anonymous

          >the consoles have 16gb, and most games are cross platform, 8gb is 2x less than what we should be getting, simple as that
          You fail to mention that the 16GB of RAM that consoles have is shared RAM, meaning both the CPU and GPU use it, while on PC we have DRAM for the CPU and VRAM for the GPU.
          You're not entirely incorrect, though, because the shared RAM in consoles is faster than the DRAM in PCs, and the shared RAM also means consoles only need to load assets and instructions into RAM once, whereas the PC has to load a lot of assets/instructions into RAM twice, once into DRAM and once into VRAM. Because of that, the PC needs more combined RAM (DRAM + VRAM) than the consoles to perform the same, but you wouldn't need 16GB of VRAM to get the same performance as consoles, as certain instructions are only needed by the CPU and others only by the GPU.
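
          If it helps, here's the same argument as toy accounting. The split fractions below are made up purely for illustration; no real game works out this cleanly:

          # Toy model of the shared-pool vs split DRAM/VRAM point above.
          # The fractions are illustrative assumptions, not measurements.
          def pc_memory_needed(console_game_budget_gb: float,
                               gpu_share: float = 0.65,      # assumed GPU share of the pool
                               duplicated_gb: float = 2.0):  # assets staged in both pools
              vram = console_game_budget_gb * gpu_share
              dram = console_game_budget_gb * (1 - gpu_share) + duplicated_gb
              return vram, dram

          vram, dram = pc_memory_needed(13.5)  # roughly what games get out of the 16GB
          print(f"~{vram:.1f} GB VRAM + ~{dram:.1f} GB DRAM = ~{vram + dram:.1f} GB total")
          # More total memory than the console's game budget, but nowhere near
          # 16 GB of VRAM alone to match console settings.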

          • 10 months ago
            Anonymous

            The "actual" vram of the consoles is 10gb. Hence why 8gb owners are completely fricked.

            • 10 months ago
              Anonymous

              That's not completely accurate either. The shared 16GB of RAM on consoles can be divided between the CPU and GPU however the game engine sees fit. In practice the GPU often needs more RAM than the CPU, though, so it wouldn't surprise me if indeed over half of that 16GB of RAM on consoles is used by the GPU.

              • 10 months ago
                Anonymous

                the problem is that both the CPU and GPU access the same pool of memory
                on PC you have a bottleneck from having to put things into RAM and then into VRAM; that's why RAM overclocking makes such a massive difference when it comes to 1% lows and stutters

              • 10 months ago
                Anonymous

                >The unit ships with 16 GB of GDDR6 SDRAM, with 10 GB running at 560 GB/s primarily to be used with the graphics system and the other 6 GB at 336 GB/s to be used for the other computing functions. After accounting for the system software, about 13.5 GB of memory will be available for games and other applications, with the system software only drawing from the slower pool

              • 10 months ago
                Anonymous

                Sauce? Because none of this info adds up to what I learned/found about the PS5. Every source I found claims it has 16GB of GDDR6 SDRAM running at 448 GB/s with a 256-bit memory bus width, and then it has another DDR4 RAM chip of only 512 MB for background tasks. the 16GB of GDDR6 SDRAM is used by games, and how it divides up that RAM between GPU and CPU tasks is completely up to the game engine.

              • 10 months ago
                Anonymous

                That's the wikipedia page of the xbox series x, so maybe the PS5 is indeed different

              • 10 months ago
                Anonymous

                >so maybe the PS5 is indeed different
                It is, only the XSX has split memory bandwidth, and some suspect that's the reason it's not performing as well as the PS5 despite being faster on paper. The bandwidth is honestly kinda shit even without the split.
                512GB/s can't fully feed a 60 CU RX 6800 and it needs the big Infinity Cache on top, yet the Xbox Series X has 560/336GB/s without that, feeding 52 CUs. And consoles are targeting resolution over framerate, so the bandwidth matters more. The 4070 can sorta get away with 500GB/s because nvidia has MUCH better memory compression than AMD, and you're also not expected to run games at 4K with a 4070.
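
                For anyone wondering where those figures come from, it's just bus width times per-pin data rate (chip speeds taken from public spec sheets, so rough but close enough for the argument):

                # bandwidth (GB/s) = bus width in bits / 8 * data rate per pin (Gbps)
                def mem_bandwidth(bus_bits: int, gbps_per_pin: float) -> float:
                    return bus_bits / 8 * gbps_per_pin

                print(mem_bandwidth(320, 14))  # 560.0 -> Series X 10 GB pool, full 320-bit bus
                print(mem_bandwidth(192, 14))  # 336.0 -> Series X 6 GB pool, 192-bit slice
                print(mem_bandwidth(256, 16))  # 512.0 -> RX 6800
                print(mem_bandwidth(192, 21))  # 504.0 -> roughly the ~500 GB/s quoted for the 4070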

              • 10 months ago
                Anonymous

                https://www.eurogamer.net/digitalfoundry-2020-inside-xbox-series-x-full-specs

              • 10 months ago
                Anonymous

                He's talking about the Series X.
                Afaik the PS5 has 2GB of the 16 reserved for the OS.
                Series X is split into a faster pool of 10GB and a slower pool of 6GB. Out of those 6GB I think 4 are presently available for games (I remember hearing a while back that MS had freed up several hundred megabytes for use in games on Series S and X).
                Series S is split into 8GB of faster memory and 2GB of slower memory. The 2GB are used by the OS and the 8 of faster memory are used for games, so games don't use anything from the slower memory pool.

          • 10 months ago
            Anonymous

            >You fail to mention that the 16GB of RAM that consoles have is shared RAM, meaning both the CPU and GPU use it, while on PC we have DRAM for the CPU and VRAM for the GPU.
            Doesn't matter. As you said, the bandwidth to system RAM is too low on PC for the GPU to pull anything out of it for the current frame, so it has to store the same info twice, once in VRAM, once in system RAM.
            And sure, the console's system eats up 3GB or whatever, and you need to preload stuff you don't need right now into another 3GB; it's still not 8GB. Modern GPUs need 12GB to match console ports, and since most of those run with upscaling and other gimmicks, native rendering on PC needs 16GB due to more framebuffer usage.

            16GB is the minimum mid-range GPUs should have right now

            • 10 months ago
              Anonymous

              You more or less repeated what I said and yet you still fail to see the point.

              I can guarantee you that unless you're dealing with an absolutely garbage port, you do not need 16GB of VRAM in order to run a game at exactly the same graphical settings as what the consoles provide.

              Hell, even with shitty ports, you still don't need 16GB of VRAM if all you want to do is MATCH the looks and performance of a console. Look at Jedi Survivor, for example. Shit port, needs 16GB of VRAM to run it on max graphics 4k 60FPS, but that's not how the consoles run that game. The consoles run that game at ~720p 60FPS with upscaling to 4k. I can guarantee you, if you run Jedi Survivor on PC at 720p 60FPS, you don't need 16GB of VRAM.
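
              Rough math on why render resolution moves VRAM usage so much, for anyone who wants it. The buffer list below is illustrative, not taken from any real engine:

              # Every render target scales with pixel count; bytes-per-pixel list is illustrative.
              def gbuffer_mib(w: int, h: int, targets=(4, 4, 4, 8, 8, 4)) -> float:
                  # e.g. albedo, normals, material, two HDR color buffers, depth
                  return sum(w * h * bpp for bpp in targets) / 2**20

              for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080),
                                   "4K": (3840, 2160)}.items():
                  print(f"{name}: ~{gbuffer_mib(w, h):.0f} MiB just in main render targets")
              # 4K needs ~9x the render-target memory of 720p before you count a single
              # texture, which is why console-style upscaled 720p fits in far less VRAM.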

              • 10 months ago
                Anonymous

                >you do not need 16GB of VRAM in order to run a game at exactly the same graphical settings as what the consoles provide.
                you mean silky smooth cinematic 20fps and upscaled 720p with DRS and checkerboarding?
                most PCs run at a higher res, which is a lot more VRAM usage, and then you have the basic architecture overhead you want on top. Truth is that no matter the engine or dev, next-gen titles ported to PC run like shit on 8GB,
                be it the new RE4 remake on Capcom's in-house engine, Unreal 5 titles like that Dead Space clone, Flopspoken on Square's engine, or the Last of Us remake. They all struggle on 8GB and you need to lower the settings (there are also CPU problems, but that's another shitshow)

              • 10 months ago
                Anonymous

                Okay but that's all besides the point. This anon here

                yeah, for low and medium texture settings...

                the consoles have 16gb, and most games are cross platform, 8gb is 2x less than what we should be getting, simple as that

                said 16GB of VRAM is the minimum we should get because that's what consoles have, which makes no sense and is based on absolutely nothing.

              • 10 months ago
                Anonymous

                >based on absolutely nothing.
                It's based on a 3-year-old console built on two-generation-old GPU architecture.
                If they have 16GB, why would the "next next gen" GPUs only have 8GB?

              • 10 months ago
                Anonymous

                I'll refer you back to my original reply as opposed to typing it all over again:

                >the consoles have 16gb, and most games are cross platform, 8gb is 2x less than what we should be getting, simple as that
                You fail to mention that the 16GB of RAM that consoles have is shared RAM, meaning both the CPU and GPU use it, while on PC we have DRAM for the CPU and VRAM for the GPU.
                You're not entirely incorrect, though, because the shared RAM in consoles is faster than the DRAM in PCs, and the shared RAM also means consoles only need to load assets and instructions into RAM once, whereas the PC has to load a lot of assets/instructions into RAM twice, once into DRAM and once into VRAM. Because of that, the PC needs more combined RAM (DRAM + VRAM) than the consoles to perform the same, but you wouldn't need 16GB of VRAM to get the same performance as consoles, as certain instructions are only needed by the CPU and others only by the GPU.

              • 10 months ago
                Anonymous

                and yet you are wrong, as proven by every single big AAA cross-platform release: not only is 8GB not enough, 12GB struggles at 4K

              • 10 months ago
                Anonymous

                I am not wrong, and I already debunked this too, here:

                You more or less repeated what I said and yet you still fail to see the point.

                I can guarantee you that unless you're dealing with an absolutely garbage port, you do not need 16GB of VRAM in order to run a game at exactly the same graphical settings as what the consoles provide.

                Hell, even with shitty ports, you still don't need 16GB of VRAM if all you want to do is MATCH the looks and performance of a console. Look at Jedi Survivor, for example. Shit port, needs 16GB of VRAM to run it on max graphics 4k 60FPS, but that's not how the consoles run that game. The consoles run that game at ~720p 60FPS with upscaling to 4k. I can guarantee you, if you run Jedi Survivor on PC at 720p 60FPS, you don't need 16GB of VRAM.

              • 10 months ago
                Anonymous

                16gb minimum is nonsensical and not based on any real number, it's completely arbitrary based on the fact that consoles have that much shared memory
                you can very easily have a game on PC that with a lot of RT effects will run out of vram on a 16gb card
                you can also have a console port that runs perfectly fine on a 10gb card with memory to spare
                have people forgotten that gen 8 had 8gb ps4 and 5gb/8gb xbox one s/x? yet games had no problem running on 6gb and even 4gb cards for the longest time
                it all depends on a game and the port, one cannot make a blanket statement like your gpu needs 16gb because that's what consoles have
                memory bandwidth and latency, both ram and vram, will be the bottleneck far more often on PC than the vram pool

      • 10 months ago
        Anonymous

        >8 gb is still ok for 1080p
        No it's not. Stop lying you stupid poorgay.

    • 10 months ago
      Anonymous

      >under 300
      Finally, an actual replacement candidate for my RX570.

      You're the moron for falling for the tricks about 8GB of VRAM "not being enough"

      You need more if you are doing some extreme cgi stuff in professional capacity, or you are an AI enthusiast, but for games, going above that is pretty pointless.

      • 10 months ago
        Anonymous

        >going above that is pretty pointless.
        >He says this while games are wanting 12+GB because of the PS5 and being unoptimized as shite.
        I mean, I love the price, but y'all are crazy if you think 8GB is enough nowadays. My 1060 is 6GB and even THAT is starting to feel long in the tooth.

        • 10 months ago
          Anonymous

          What games need 12gb+ vram?

          • 10 months ago
            Anonymous

            Poorly optimized games. Which is the entire point of upgrading your 8gb gpus

          • 10 months ago
            Anonymous

            the last of us. and every other new game coming from derpsoles. they have +24gb shared graphics mem, meaning pc cards will all suck at the new games.

            • 10 months ago
              Anonymous

              >they have +24gb shared graphics mem
              Anon, consoles only have 16 GB total, and only ~14 of that is available for devs, meaning maybe 10-12 GB available as VRAM depending on the game.

              • 10 months ago
                Anonymous

                >he doesn't know

            • 10 months ago
              Anonymous

              also meaning, pc gamers will shit on new games because not enough ram meaning a HUGE downfall in sales on pc. this is gon be gud bois. it will be pc-geddon for derpsole games.

              now is our chance to wreck this shit! no more shitty derpsole ports possibru! yesssss

          • 10 months ago
            Anonymous

            hogwarts legacy performs badly with 8 GB.

      • 10 months ago
        Anonymous

        >"professional" cgi or ai stuff
        >using amd for that at all
        lol

    • 10 months ago
      Anonymous

      needs at least 16gb. this is a joke.

      • 10 months ago
        Anonymous

        Someone in the comments on the article mentioned how other gpu manufacturers could increase the vram size on their model. Are there precedents that would give this claim credibility?

      • 10 months ago
        Anonymous

        >need 16GB for movie games
        Look if you want to play movie games just buy a console. Seriously.

      • 10 months ago
        Anonymous

        >In 2023
        >With BOTH companies conspiring together
        >With resellers and flippers around every corner
        You honestly think the internet wouldn't be cleared of a 16GB model in a week?

    • 10 months ago
      Anonymous

      homie it's the weakest card in the lineup

    • 10 months ago
      Anonymous

      I'm not moronic so 8gb is fine, your 4K gayshit might struggle though

      • 10 months ago
        Anonymous

        coping poorgay

        • 10 months ago
          Anonymous

          Camel, eye of the needle, etc.

  2. 10 months ago
    Anonymous

    >haha take that Nvidiots!!
    >*driver crashes*

    • 10 months ago
      Anonymous

      Works in my machine

    • 10 months ago
      Anonymous

      I know this pain from having the misfortune of owning a 5700 XT. It's better to pay more than to have your PC feel like crap to use.

    • 10 months ago
      Anonymous

      Sucks to be using windows.

      • 10 months ago
        Anonymous

        >fall for the meme AMD card
        >drivers fricking suck on windows
        >fall for the linux meme, that makes running regular games a battle
        >"old games are better anyway" cope meme
        >put actual effort to run modern titles, and when they run, they run well because dxvk is a god
        >but they actually fricking suck if compared to the old shit you was playing previously, with the difficulty of getting it to run being directly proportional to how the game sucks ass and disappoints you
        >basically only play games that could easily run on the igpu, now being sure the AMD card was a waste of money

        • 10 months ago
          Anonymous

          I can't parse this post through all the ESLness.

          • 10 months ago
            Anonymous

            I can.

        • 10 months ago
          Anonymous

          >linux meme, that makes running regular games a battle
          The only difference between linux and windows compatibility these days is the amount of driver hacks for old apis on windows. Virtually all dx11+ games work out of the box now.

        • 10 months ago
          Anonymous

          AMD is fine if you're a normie, but if you go deeper into gaming, like emulation or AI, you'll find that everything works but with small annoyances or glitches, and those annoyances slowly build up and wear on your patience and your system.
          For me it's just not worth it. I used to own AMD cards myself, but not anymore.

          • 10 months ago
            Anonymous

            AMD Emulates fine friend.

            • 10 months ago
              Anonymous

              Let's ignore all of those errors with Tears of the Kingdom that only happen on AMD.

              • 10 months ago
                Anonymous

                Works on my machine.

  3. 10 months ago
    Anonymous

    >For European gamers this means €299

    • 10 months ago
      Anonymous

      To me it looks like both were planned to be 299, $ and €, but they added a $30 discount to the US price at the last minute for some unspecified reason

      • 10 months ago
        Anonymous

        >unspecified reason
        the 7600 seems to perform only slightly better than the 6650 XT, which already costs less than $300, and with the 4060 releasing nobody would buy AMD if they cost the same

  4. 10 months ago
    Anonymous

    my 3060 Ti is better than that shit, who cares

  5. 10 months ago
    Anonymous

    Will one of these companies just release another sub-$300 GPU with more than 8 GB of RAM?
    To this day the only GPU that meets that criterion is the RX 6700, and it only just recently hit sub-$300.

    • 10 months ago
      Anonymous

      Never, it's their goal to actively avoid releasing a quality budget product. Customers want a repeat of the 1080ti, but manufacturers don't.

      • 10 months ago
        Anonymous

        The 1080 Ti was such a great product, Jesus Christ. Served me well all the way up until recently, when I switched to a 4K monitor and an RTX 4080, and only because I had some extra money from not going on any vacation trips in 2020 and 2021 because of the stupid scamdemic.

        I want the old Nvidia back, the Nvidia who made the GTX 1000 series.

        • 10 months ago
          Anonymous

          they need a generation of humbling to lower the prices to where they belong
          4000 series cards aren't selling out instantly in an era of increased pc gaming and AI generated shit so hopefully they tone it down next generation but frick nvidia

        • 10 months ago
          Anonymous

          even the GTX 1080 was great. just swapped mine out for a 7900 xtx, because frick nvidia. hell, even the GTX 1080 could play most of the games i played @ 4k/medium. 1080p it would still be completely adequate. (Hell, it's sitting in my eGPU enclosure for use on the laptop come vacation time later this summer)

          >drivers
          i've had absolutely no issues with game crashing or other software issues in the 2 months i've had it. (with one notable exception, a lack of airflow was routinely making the card hit the 110 junction max; and RDR2 barfed a few times as a result of the throttling. After getting a $20 case fan, it's been flawless.)

          • 10 months ago
            Anonymous

            >a lack of airflow was routinely making the card hit the 110 junction max
            What 7900 XTX do you have? Reference? The reference 7900 XTX had an issue with faulty vapour chambers causing the Tj to spike into oblivion.
            At what wattage did you hit 110 Tj? Even with complete dogshit airflow I can't see the card getting that hot below 350W unless you have an ultra silent fan curve.

  6. 10 months ago
    Anonymous

    my sapphire 6600 is keeping me comfy bros

  7. 10 months ago
    Anonymous

    How does it compare to their other cards? I was planning on upgrading from a 6600xt

    • 10 months ago
      Anonymous

      This is not a suitable upgrade from a 6600 XT.
      We won't see that in the sub-$400 price-class until like 2026.

      • 10 months ago
        Anonymous

        Gotcha. Thanks anon

      • 10 months ago
        Anonymous

        Gotcha. Thanks anon

        Anon from the future it seems

  8. 10 months ago
    Anonymous
  9. 10 months ago
    Anonymous

    considering it's barely better than an RX 6600 that costs 240 bucks, it makes sense. In fact I'd rather spend 60 bucks less; the performance difference seems so minimal.

  10. 10 months ago
    Anonymous

    I don't know about PCs anymore. I built some shitty low-range computer a few years ago and don't know where to go now. May just start over because I'm spooked.
    B450 + R5 1600 + 1060 6GB is holding up well, but now SF6 is showing me the cracks.

    • 10 months ago
      Anonymous

      >SF6
      So a PS4 game did you in, huh?

    • 10 months ago
      Anonymous

      you can upgrade the cpu and gpu and have a pretty strong gaymen pc

    • 10 months ago
      Anonymous

      the 1060 may hold, you'll need new CPU and that means new mobo and new ram unfortunately. If you don't get something used, it'll easily cost you $300

      you can upgrade the cpu and gpu and have a pretty strong gaymen pc

      >replace the whole pc
      no shit

      • 10 months ago
        Anonymous

        >replace the whole pc
        no, just the cpu and gpu. You can keep the cpu cooler, mobo, ram, psu and case

    • 10 months ago
      Anonymous

      5800X3D for sure. GPU it depends, the 7600 might be good.

  11. 10 months ago
    Anonymous

    >vidya graphics haven't improved in 5 years
    >barely improved over the 5 years before that
    >but OMG NEW GRAPHICS CARDZZZZ!!!!1!
    are you niglets just burning out your cards with overclock dick flexing, or what?

    • 10 months ago
      Anonymous

      Bruh I need a 4090 to max out the latest shitty console port I wasn't going to play anyways.

    • 10 months ago
      Anonymous

      Resolution and refresh rates on the other hand...

      But GPUs have generally been a terrible deal for nearly a decade now and it's only getting worse.

  12. 10 months ago
    Anonymous

    if you want a 1080p card just buy the arc 750 for $199

    • 10 months ago
      Anonymous

      Thank you greatest ally Intelaviv.

      • 10 months ago
        Anonymous

        Witches. Witches in video games.

      • 10 months ago
        Anonymous

        the ai is too fricking strong

  13. 10 months ago
    Anonymous

    what's the VRAM size for the 7700 and 7800?

    • 10 months ago
      Anonymous

      12, 16

      • 10 months ago
        Anonymous

        >7800 16GB
        yeah so unless it performs better than the 6800 and costs less I think I may just commit to the 7900XT

        • 10 months ago
          Anonymous

          >7800 16GB
          I just got goosebumps thinking about my old 7800 GT (256 MB VRAM) from 2005 rising from the ashes as a 16 GB beast. it's a shame nvidia is doing the x0y0 naming scheme now instead of xy00, or we could have a "RTX 7800 GT" for nostalgia's sake soon. I wonder if that will ever be a thing, nostalgia-themed GPUs, like those sports cars that look like a classic but have a sick modern engine under the hood.

  14. 10 months ago
    Anonymous

    could be the first gpu with a non-shit fps/$ ratio since before quarantine.
    either way, i got a 6600 in late 2021 and i won't be upgrading until next gpu generation.

  15. 10 months ago
    Anonymous

    oh boy more ESL "gamers" on the horizon, can't wait

  16. 10 months ago
    Anonymous

    it's slower than 6600 xt

    • 10 months ago
      Anonymous

      It's the same as the 6650 XT but $100 less.

  17. 10 months ago
    Anonymous

    AMD keeps getting better and better every generation, the only thing left is to add proper CUDA and NVENC alternatives.

  18. 10 months ago
    Anonymous

    >AMD, so it's shit support from the get-go and it's not even their fault
    >8GB for shit slop games that ask for more and more
    at least the price is nice, but then you remember point uno, which makes any AMD GPU an instant loss. Fricking Nvidia.

  19. 10 months ago
    Anonymous

    people say VRAM isn't needed at 1080p, but so many games have shit textures everywhere because of optimization constraints...

  20. 10 months ago
    Anonymous

    I am planning on getting a 6800XT next month bros, anyone had experience with it?

    This will be my first AMD card since I am tired of the greedy shitters Nvidia continues to be. I always hear memes about no drivers/shit drivers for AMD, is that true for 6800XT as well?

    • 10 months ago
      Anonymous

      It seems like amd's previous gen cards had some driver issues, but the 6000 gen as a whole is pretty solid. I got the asus 6800xt a few months ago (upgraded from an rtx 2080) and it's pretty good, I've zero issues whatsoever

      • 10 months ago
        Anonymous

        >asus 6800xt

        I've heard Sapphire is the way to go when getting AMD; allegedly they only do AMD cards and are noted as the go-to brand. Wonder if that is just marketing though.

        • 10 months ago
          Anonymous

          Sapphire is just turboshit when it comes to fans, and their dedicated software manages to be even more shit. They're basically masters at cutting corners.

        • 10 months ago
          Anonymous

          In my experience, Sapphire, Gigabyte, and MSI all make decent AMD GPUs.
          XFX can be hit or miss (though personally all the XFX GPUs I've had were pretty solid), and Powercolor I've only ever heard bad things about, but I'm not sure.

          • 10 months ago
            Anonymous

            Red Devil is hit or miss because I've been hearing the 7900 XTX models are starting to heat up more. Hellhound is their best product but is limited in its RGB colors; performance-wise it's great, I never hit temps over 45 C. I heard XFX non-reference is the best value, not sure about their other products

          • 10 months ago
            Anonymous

            I see, will check my options then. Thanks anon.

            Sapphire is just turboshit when it comes to fans, and their dedicated software manages to be even more shit. They're basically masters at cutting corners.

            So what manufacturer do you recommend anon? I've googled far and wide and everyone praises Sapphire; not seeing many turboshit claims from people who talk about it outside 4chins.

            >only problem is windows update might randomly decide to replace your manually installed AMD drivers with old ones

            Any way to circumvent this? I don't really plan on streaming anything, I just wanna buy a good card to future proof myself for awhile. Going from 1060 to this will be a big leap.

            • 10 months ago
              Anonymous

              >Any way to circumvent this?
              O&O ShutUp10, or "Ten", bing it for the irony.
              Free, and it lets you turn off any Windows "feature" you want. The only hassle is that if you allow updates to install, some settings might get changed, so after a Windows update you have to check your settings. It even notices such changes, though, and lets you quickly revert or accept them in bulk.
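
              If you'd rather not run a third-party tool just for the driver thing, the usual fix is the "don't include drivers with Windows Updates" policy. A minimal sketch of setting it from Python (assumes an elevated prompt and a Windows edition that actually honors policy keys):

              # Sketch: flip the "exclude drivers from Windows Update" policy value.
              # Needs admin rights; your Windows edition must honor policy keys.
              import winreg

              KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"
              with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY, 0,
                                      winreg.KEY_SET_VALUE) as key:
                  winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0,
                                    winreg.REG_DWORD, 1)  # 1 = don't ship drivers via WU
              print("Policy set; Windows Update should stop swapping your GPU driver.")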

            • 10 months ago
              Anonymous

              >So what manufacturer do you recommend anon?
              Asus, MSI, Gigabyte

    • 10 months ago
      Anonymous

      I've had a 6800 non-XT for 3 or 4 months now. No problems here, it just works.

    • 10 months ago
      Anonymous

      amd has and always will have issues with drivers. it's something you have to accept if you decide to go with amd

    • 10 months ago
      Anonymous

      at this point nvidia has slightly worse drivers than amd, at least for last gen cards. only problem is windows update might randomly decide to replace your manually installed AMD drivers with old ones; this has happened to me once in two years with my RX 6600.
      if you're streaming, just use software x264 instead of the AMD encoder.
      otherwise it's been all peachy.

      • 10 months ago
        Anonymous

        >at this point nvidia has slightly worse drivers than amd, at least for last gen cards
        AMDgay delusion knows no bounds, no gravity, nothing to keep it down to Earth.

    • 10 months ago
      Anonymous

      AMD drivers are fine nowadays, especially on Linux if you care about that shit.

    • 10 months ago
      Anonymous

      Using a 6600 XT since November. Works fine. The only thing I've noticed is that if you undervolt and see a black screen, it means you need to add a few more millivolts back. It won't crash the driver outright like Nvidia would, so getting a stable undervolt requires actual usage for a while to see if it's perfectly stable

  21. 10 months ago
    Anonymous

    >AMD Won
    >only difference is $30

    lmao are americans this poor

  22. 10 months ago
    Anonymous

    Reminder that the Vega 56 was $200 in 2018, had 8GB of HBM2, and probably performs about the same as the 7600

    Lol
    Lmao

    • 10 months ago
      Anonymous

      ...if you can get the game to start and not crash or produce artifacts of some kind. pretty sure there are games that literally don't support the Vega series anymore despite supporting GPUs older than Vega. Vega was a weird phase for AMD.

      • 10 months ago
        Anonymous

        Vega is still supported and probably will till the end of time since a lot of workstation GPUs use the architecture

    • 10 months ago
      Anonymous

      >Pooga

  23. 10 months ago
    Anonymous

    is going from a 1080 Ti to this a good upgrade?

    • 10 months ago
      Anonymous

      probably not. Most 1080 Tis aren't new or unused at this point; the 7600 is newer hardware and software so it's probably gonna perform better. Mind you, this is for low-end budget gaming and essentials; anyone with a brain knows 8GB isn't enough for modern games today

  24. 10 months ago
    Anonymous

    Is it better than my 1660 Ti?

  25. 10 months ago
    Anonymous

    >nooo 8gb is bad you NEED 128GB OR IT'S BAD
    I run 1440p 144fps on a 3060ti without any issues whatsoever
    have a nice day shills

    • 10 months ago
      Anonymous

      256 bit bus width on the 3060ti compared to all the new cards being at just 128. It's actually better performance for less money compared to the 4060 series.

    • 10 months ago
      Anonymous

      a lot of it is people expecting to play unfinished and broken goyslop day 1 with every bell and whistle turned on in the graphics menu
      but i don't think you *have* to do that, needless to say

      nowadays pc gaymers seem to call everything "obsolete"

  26. 10 months ago
    Anonymous

    Is the 7900 xt worth buying?

  27. 10 months ago
    Anonymous

    >6-year-old card
    >$399 on release
    >2048-bit bus width

  28. 10 months ago
    Anonymous

    I'm still running a 980ti

    • 10 months ago
      Anonymous

      Same. Except I don't have a job and that if it breaks I am going to kill myself

    • 10 months ago
      Anonymous

      I'm running a 970 3.5 GB

  29. 10 months ago
    Anonymous

    I've got a 2070 Super at 1080p; what's the point of moving to a higher resolution?

    • 10 months ago
      Anonymous

      If you enjoy higher pixel density / image quality and size. Modern games on highest settings may not always hit 144hz though.

  30. 10 months ago
    Anonymous

    >trusting a card that cheap inside your system
    also curry tech

    • 10 months ago
      Anonymous

      I know you are brown.

  31. 10 months ago
    Anonymous

    if its at least near the 6700XT in performance they might have a winner
    unless you are still on a 1060 or 580 or older dont bother upgrading

    • 10 months ago
      Anonymous

      >its ok to buy and support absolute horseshit if its an upgrade
      thank you for sharing your eternal wisdom, almighty GPU oracle

      • 10 months ago
        Anonymous

        no problem

  32. 10 months ago
    Anonymous

    >8gb
    ill just wait for the 8600

  33. 10 months ago
    Anonymous

    Can anyone tell me how much FPS it takes to run a 144 or 240 Hz monitor to see the difference?

    • 10 months ago
      Anonymous

      1:1. You won't notice a difference if your game runs at 90fps on a 60hz monitor. The only advantage of high framerate in that case would be stability. If you want to make full use of a 144hz monitor your system needs to provide a consistent >= 144fps rate
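
      Put crudely (ignoring VRR, tearing and frame-pacing edge cases), the rate you actually see is capped by whichever is lower:

      # Without VRR, the panel caps what you see; extra frames are dropped or torn.
      def perceived_rate(fps: float, refresh_hz: float) -> float:
          return min(fps, refresh_hz)

      print(perceived_rate(90, 60))    # 60  -> 90fps on a 60Hz panel still looks like 60
      print(perceived_rate(100, 144))  # 100 -> a 144Hz panel only helps up to your fps
      print(perceived_rate(144, 144))  # 144 -> you need a consistent 144fps to use it fully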

      • 10 months ago
        Anonymous

        At 120 Hz on the PC and 120 Hz on the console, are there any differences?

        • 10 months ago
          Anonymous

          Yes. The PC image will look crisper but the console will have better frametimes unless your PC is very overtuned.

        • 10 months ago
          Anonymous

          In fluidity? No. But again PC and consoles have different building parts which affect the stability and average performance of the system. Depending on how much money you throw at a PC, you won't have frame drops compared to consoles which have to make the most of the tools they have and intelligently allocate resources. To answer your question, on paper they are the same but in practice on a console you will have troubles running at 120hz all the time

        • 10 months ago
          Anonymous

          exactly the same, assuming they both maintain 120fps and have consistent frame times.
          On console, on one hand you don't have stuff running in the background, but on the other hand the CPU is weak, so weird little things can cause big spikes.
          On PC the CPU is much beefier so spikes are lower, BUT they can be introduced more often because of background apps and the OS shitting itself, be it Windows or Loonix.
          Also, games that don't generate their shader cache on PC at launch WILL stutter, while almost every console game in existence ships with a shader cache.
          So basically choose your poison, but in 99% of cases it'll be identical and I am kinda splitting hairs here.

  34. 10 months ago
    Anonymous

    VRAM is adapted to the power of the GPU. What's the point of more VRAM if the GPU can't display a game that needs 10 GB?

    The VRAM is perfectly adapted to the capacity of this GPU.

    The only reason it has 8GB is because it's not very powerful, no more than older cards. The point of this useless card is for nvidia to make some free money and test the waters with the
    >it's a shit gpu but look at the framerate with DLSS on goy? it's not that powerful but if you just make an approximation of the pixels then it runs alright
    thing and see how consumers bite.

    • 10 months ago
      Anonymous

      >THe point of this useless card is for nvidia to make some free money
      >nvidia

      • 10 months ago
        Anonymous

        >he doesn’t know

    • 10 months ago
      Anonymous

      So in 2023, a $300 GPU is still way weaker than the GPU of a fricking 2019 console???
      >its a shit gpubut look at the framerate with DLSS on goy? its not that powerful but if you just make an approximation of the pixels then it runs alright
      moron, current gen games are unplayable on 8GB gpus. DLSS or not.

  35. 10 months ago
    Anonymous

    I wonder if it's good for rendering

  36. 10 months ago
    Anonymous

    Yeah but what is the performance?

    • 10 months ago
      Anonymous

      >8gb
      what a dumb question

      • 10 months ago
        Anonymous

        Ah yes, a 1080p medium-high card in 2023.

        • 10 months ago
          Anonymous

          Even at 1080p you will have serious trouble with an 8GB GPU.
          Again, video games are designed for consoles, which means they target 10GB of VRAM.

          • 10 months ago
            Anonymous

            That GPU will have no trouble running any non-DS game at 1080p on max/120. Not very future-proofed, though.

          • 10 months ago
            Anonymous

            >video games
            Movie games.

          • 10 months ago
            Anonymous

            Stop using Ultra settings you idiot

            • 10 months ago
              Anonymous

              You stupid frick, you think devs give a frick about your poorgay gpu???
              8gb is unplayable for current gen games.

              • 10 months ago
                Anonymous

                Wrong. I watched benchmarks on YouTube and it still looks fine to me.

  37. 10 months ago
    Anonymous

    nvidia will just shift the stack up a tier so lower cards get the better SKU, like the 5070 getting the full xx104; they'll add more VRAM and keep prices exactly the same
    and people will love it and say it's good value and buy the shit out of it like they did with Ampere
    nvidia has been doing this "flip flop" for over half a decade now

  38. 10 months ago
    Anonymous

    >AMD

    Sorry I don't buy pajeet shit

    • 10 months ago
      Anonymous

      AMD is an american company run by a taiwanese woman.
      Nvidia is an american company run by a taiwanese man.
      Intel is an american-israeli company run by israelites

      • 10 months ago
        Anonymous

        Intel still has the best architecture. AMD is clever by using stacked L3 cache, but Zen 4 is still inferior to Raptor Cove, and honestly even Golden Cove, since Raptor Cove is just GC with extra L2 and no ring latency penalty when going between E-cores and P-cores. And don't forget that Intel is a full node behind, and in Intel's defense, TSMC is sponsored by the Taiwanese government and gets big bucks from Apple, while Intel has none of those benefits. To their credit, they used to be two nodes behind until a while ago, so at least there's some progress lol

        • 10 months ago
          Anonymous

          Intel's architecture is old and outdated.
          The only thing they've mastered is making it survive intense heat and power draw.
          AMD chips can do more with less, but you can't put them through hell without killing them.
          Intel only managed to keep up with AMD in recent years by letting their chips brute-force the benchmarks with sheer energy.

          AMD easily wins any value and efficiency battle, but Intel knows that enough people will just buy the winning chip. And if that's Intel with 1-2% more performance at 30% more energy cost and twice as hot, they'll still buy it.

        • 10 months ago
          Anonymous

          Intel must be scared shitless. CPU's have reached a point where outside of very, very specialized use cases there's no real reason to upgrade even an 8+ year old cpu.
          video cards are quickly approaching that point as well.
          i'm guessing that the future will lean way more towards SaaS offerings to maintain the paypiggie revenue streams.

          >a lack of airflow was routinely making the card hit the 110 junction max
          What 7900 XTX do you have? Reference? The reference 7900 XTX had an issue with faulty vapour chambers causing the Tj to spike into oblivion.
          At what wattage did you hit 110 Tj? Even with complete dogshit airflow I can't see the card getting that hot below 350W unless you have an ultra silent fan curve.

          XFX Merc 310. Yeah, I read about that heat issue; in my case it seemed like a legitimate airflow issue. After installing the fan it hasn't gone above 95C and all is well in the world.

          • 10 months ago
            Anonymous

            >Intel must be scared shitless.
            They still make like 3x the amount of revenue AMD does

            • 10 months ago
              Anonymous

              yeah, for now.
              they're losing the server market to AMD, consoles are all AMD, consumer purchases are slowing down, and they can't make a decent GPU to save their lives.

              dunno, long term you don't see them being knocked down a peg or two?

              • 10 months ago
                Anonymous

                >yeah, for now.
                >they're losing the server market to AMD
                2 more years 🙂
                >consoles are all AMD
                >they can't make a decent GPU to save their lives.
                Neither of these has ever been a market for Intel in the past.
                >dunno, long term you don't see them being knocked down a peg or two?
                Because they aren't lmao. Only a delusional moron thinks AMD is going to make Intel go bankrupt or out of the game. Intel has always made all of its money via OEM and still continues to do so. AMD will never be anywhere close in comparison.

              • 10 months ago
                Anonymous

                >and they can't make a decent GPU to save their lives.
                I dunno wtf you're talking about but the Intel Arc cards are pretty good value cards right now. Probably the best bang-for-buck if you're on a tight budget.
                I'm cautiously optimistic about Intel Battlemage.

              • 10 months ago
                Anonymous

                i'd like to see intel do well with their GPU efforts too. Having some more competition and competition on price is nothing but an upside for consumers.

                i could be wrong, but aren't they on par with like a 2060 performance wise?

          • 10 months ago
            Anonymous

            >Intel must be scared shitless
            Intel is FRICKED. Krzanich fricked up big time and set Intel's foundries back 4 years. Swan did basically fricking nothing, and Gelsinger is beyond incompetent (as a CEO; he's a brilliant engineer) by focusing on things that don't matter. Their entire GPU division is a massive money sink that will never pay dividends. AMD is destroying them in the server market because of MCM despite Intel having a better uarch. Their shares took a massive nosedive in the past few years.
            Unless they change strategy they will never match AMD in the server market, and the desktop market is pennies in comparison. I have no idea why they entered the GPU business; it should be beyond obvious that no one will ever catch up to Nvidia. Not even Apple, with their unlimited money hack, the best talent in the industry, and priority access to TSMC's latest node, managed to.

  39. 10 months ago
    Anonymous

    But doesn't the 4060 have 16GB?

    • 10 months ago
      Anonymous

      No. 4060($300) is 8GB. 4060 Ti($400) is 8GB. There is a 4060 Ti with 16GB but that's $500.

  40. 10 months ago
    Anonymous

    Okay but is it better than my old ass 1070 that still is running games perfectly fine at 144 fps and 1080p?

    • 10 months ago
      Anonymous

      not really no

  41. 10 months ago
    Anonymous

    That is 100 credits too expensive still

  42. 10 months ago
    Anonymous

    This is the RX 7600,
    not an RX 7600 XT.

    Wouldn't surprise me if they give us an XT model that's 30-50 bucks more expensive with 12GB of VRAM a little later.

    • 10 months ago
      Anonymous

      Do you people even know what you're talking about?
      Unless they change the memory bus and thus make it a fundamentally different GPU they can't give us 12GB on a hypothetical XT model. They could double the 8GB and give us 16GB on a hypothetical RX 7600 XT, but not 12GB.
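
      Rough sketch of the constraint (chip densities simplified to the common 1 GB and 2 GB GDDR6 parts):

      # GDDR6 chips are 32 bits wide, so bus width fixes the chip count and
      # capacity = chips x per-chip density (clamshell doubles the chip count).
      def vram_options(bus_bits: int, densities_gb=(1, 2)):
          chips = bus_bits // 32
          sizes = {chips * d for d in densities_gb} | {2 * chips * d for d in densities_gb}
          return chips, sorted(sizes)

      for bus in (128, 192, 256):
          chips, sizes = vram_options(bus)
          print(f"{bus}-bit: {chips} chips -> {sizes} GB")
      # 128-bit -> 4, 8 or 16 GB; 12 GB needs a 192-bit bus or mixed densities,
      # i.e. uneven pools, like the 970's 3.5 GB + 0.5 GB situation.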

      • 10 months ago
        Anonymous

        They can just bin one of the ram dies. 8+4=12 bud

        • 10 months ago
          Anonymous

          >lets just have two ram segments, what could possibly go wrong

  43. 10 months ago
    Anonymous

    The only games that need more than 8GB of VRAM minimum are shitty games that aren't worth playing anyways. Graphics homosexualry is inversely proportional to how good/fun a game actually is.

    • 10 months ago
      Anonymous

      This

      >but what about muh Hogwartz Legacy, Cyberplunder and TLOU Part 2
      If you care about those trashfires you belong on /r/gaming. Not Ganker.
      All the games that shat the bed at 8 GB VRAM or lower were western AAA trash of the worst caliber.

  44. 10 months ago
    Anonymous

    The only games that need more than 4GB of VRAM minimum are shitty games that aren't worth playing anyways. Graphics homosexualry is inversely proportional to how good/fun a game actually is.

  45. 10 months ago
    Anonymous

    >Muh shared memory
    All the games shitting the bed were on PS4 as well. That argument doesn't even matter. The games were just incredibly shitty ports and the reason why we keep getting those isn't the VRAM no longer being enough - It's publishers not giving their devs enough time to optimize anything - they just throw everything onto the market to hit the numbers for their next financial report.

    • 10 months ago
      Anonymous

      While some ports are indeed inexcusably bad (looking at you, Jedi Survivor), I do think devs get more flack than they deserve. The fact of the matter is that optimizing for PC is much harder than for consoles, especially when Ngreedia and AMD keep being stingy with their VRAM and gamers refuse to upgrade their hardware because the newest hardware is too expensive.

      Another party to blame here is the Unreal Engine 4. We shouldn't forget about that clusterfrick of an engine. The one thing that engine had going for it is that it made game development easier than ever, yet made optimizing games more difficult than ever. The Unreal Engine 4 is really starting to show its age, and simply cannot handle what devs are trying to do now in 2023, neither on PC nor on console.

      Seriously, we often b***h about PC ports being garbage, but most of these games do not run any better on consoles.

      Now of course the simple solution would be to just make games less graphically demanding, but then you'd have idiot homosexual zoomers complain that the game looks like crap, despite their shitty PC not being able to run the game on 'max' graphics anyway.

      • 10 months ago
        Anonymous

        Modern games look like crap anyway, because of bad art direction and smearing TAA everywhere to cover up dithering hacks.

  46. 10 months ago
    Anonymous

    >AMD broke CPU stagnation and forced core counts and IPC to go up while maintaining affordable pricing, everyone won
    >AMD does frick all in the GPU space and it's still stagnant

  47. 10 months ago
    Anonymous

    >$269
    >8GB GDDR6
    >128-bit bus
    >x8 lanes only still
    AMD looked at the 4060 and found a way to make a product just as bad.

  48. 10 months ago
    Anonymous

    >6700 is $269 on amazon right now and performs better and has 10gb vram.

    • 10 months ago
      Anonymous

      shut the frick up you're supposed to remember RDNA2

  49. 10 months ago
    Anonymous

    Looks like if you wanted an RX580 that can do DX12, this is it. I'm OK with it.

  50. 10 months ago
    Anonymous

    why is AMD's numbering so fricking moronic? holy shit, i buy nvidia just because AMD's branding fricking sucks ass
