For gaming, it really can't get any better than this. Literally perfect.

  1. 2 weeks ago
    Anonymous

    Yep. Shame their gpus are trash.

    • 2 weeks ago
      Anonymous

      4090s are still burning and cracking left and right.

      • 2 weeks ago
        Anonymous
        • 2 weeks ago
          Anonymous

          >42% slower
          >costs over 50% more

        • 2 weeks ago
          Anonymous

          you can get a 7900XTX for as little as £750 compared to £1500 for 4090. paying twice the price for ~30fps more...
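
          For what it's worth, the cost-per-frame math on those numbers as a rough python sketch (the ~30fps delta is the poster's ballpark, not a benchmark):

          ```python
          # illustrative only: marginal cost per extra frame, using the quoted prices
          price_7900xtx = 750   # GBP, "as little as"
          price_4090 = 1500     # GBP
          extra_fps = 30        # rough claim from the post above
          extra_cost = price_4090 - price_7900xtx
          print(f"£{extra_cost} extra, about £{extra_cost / extra_fps:.0f} per additional fps")
          # -> £750 extra, about £25 per additional fps
          ```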

          • 2 weeks ago
            Anonymous

            Stop kvetching. The more you buy the more you save.

          • 2 weeks ago
            Anonymous

            7900XTX was a $1000 card. It only fell in price because nobody wanted it. 4090 at $1600 is selling well, sometimes isn’t even available at MSRP, and 4080S at $1000 MSRP is considered rare and a good deal, or the vanilla 4080 which is virtually the same.
            AMD can no longer price their cards the same as nvidia just because they match them in non-RT games; RT and DLSS have become key selling points, and since they can't match nvidia there, they have to discount their stuff a lot.

            • 2 weeks ago
              Anonymous

              >RT and DLSS have become key selling points

              • 2 weeks ago
                Anonymous

                RT hasn't but DLSS definitely has. It's delusional to think otherwise. Most people still game at 1080p, and at this point every tech outlet in existence has made thousands of video comparisons of FSR vs DLSS at 1080p, showing DLSS looking drastically better. For most people it's a free FPS button that gives extra longevity to a card. If you ran a poll asking Nvidiots why they buy their cards, I guarantee you 80% would say DLSS.

            • 2 weeks ago
              Anonymous

              Both the 4090 and 4080 are selling because enthusiasts and small companies are buying them for AI. They were both going down in price before the AI boom; the demand from gamers willing to spend $1000+ on a graphics card died relatively fast.
              No one wants AMD cards for AI, all they have is gaming, so they had to keep dropping the price.

              >AMD can no longer price their cards the same as nvidia
              They never could. Not now, not ten years ago.

        • 2 weeks ago
          Anonymous

          why hello saar!

          • 2 weeks ago
            Anonymous

            >no FSR/DLSS and RT

            If the cable is bent like thick futa wiener it will burn down. If straight like ruler it is fine.

            I've had it bent for a year now and no problems

      • 2 weeks ago
        Anonymous

        Seriously, what the frick is happening with GPUs, and what are the objectively correct options according to Ganker?

        • 2 weeks ago
          Anonymous

          Ganker is the absolute worst place to ask for anything gpu related. I would legit take advice from quora and PCMR over Ganker.

          • 2 weeks ago
            Anonymous

            No shit, because the majority here are coping brokegays and brown people

        • 2 weeks ago
          Anonymous

          AMD is run by a woman.
          Nvidia is printing money with "AI" chips and doesn't give a frick about gayming, at least not as much as they did 5 years ago.
          GPU cryptocurrency mining is dead, ever since Ethereum moved from proof of work (the thing that requires computing power) to proof of stake.
          Neither AMD nor Nvidia ever designed their chips to be competitive for crypto mining, but it just happened that AMD's GPUs sat in a sweet spot in the perf/dollar ratio, so as long as the crypto gold rush was going on, AMD was selling GPUs.
          Now that the cryptoBlack folk are gone, the new gold rush is AI, and since Nvidia owns the AI market with both their chips and their API (CUDA), AND they outperform AMD GPUs on every single metric (both gaming and GPGPU; AMD's API for GPGPU sucks btw), AND they bring tons of little things that make a huge difference (DLSS) or are still in the tech showcase stage (ray tracing), AND the gaming console market is tanking (AMD provides the GPUs for Xbox and PS4/PS5)...
          So I don't have a short answer on the state of the GPU market, but it's clear that AMD is in a very bad place right now, and it's all their fault. Which is not a good thing. So get ready for the Nvidia monopoly.

          • 2 weeks ago
            Anonymous

            >tech improvement has shifted from improving tech to chasing casinos like cryptoBS and algorithmically generated garbage
            it's so fricking over.

          • 2 weeks ago
            Anonymous

            >doomposter moron doesnt know that Ai gold rush is a fricking bubble
            just have a nice day

            • 2 weeks ago
              Anonymous

              If you're too moronic to understand that gold rush === bubble, you might consider removing yourself from the gene pool instead. I think I was pretty clear about that in my post.

            • 1 week ago
              Anonymous

              moron

        • 2 weeks ago
          Anonymous

          The only reason to buy AMD is the very low end (RX 6600) or if something else in the stack is on a really good temporary deal. At MSRP, Nvidia is just too much better in features for the small price benefit AMD is willing to give.
          This is coming from someone who is not a brand zombie or meme victim

          As far as Nvidia vs Nvidia goes, they all have a similar p/p ratio; just buy what you want and can afford if you care about new PC games

          • 2 weeks ago
            Anonymous

            Unless you use Linux, in which case Nvidia is literally not an option: AMD is the monopoly there and everything just works on AMD, while everything is broken on Nvidia unless you want to fix it yourself like a dork, because Nvidia and the community won't do it for you.

            Btw Linux is 15-30% faster than Windows in gaming REGARDLESS of GPU brand.

            • 2 weeks ago
              Anonymous

              >Unless you use Linux, in which case Nvidia is literally not an option
              nvidia is actually pretty good with linux nowadays.
              any turing card and upwards gets the new nvidia drivers; both the proprietary and open source drivers are really good and catching up to amd, especially when it comes to wayland. nvidia is really good with wayland nowadays.
              for older cards, however, the drivers are trash and you are better off with amd, but even a $180 rtx2060 will give you a fantastic linux experience.

        • 2 weeks ago
          Anonymous

          >Seriously, what the frick is happening with GPUs, and what are the objectively correct options according to Ganker?
          i dont know about Ganker but my go to is always the midrange nvidia. for example the 1660super or the 4060super/ti when a good deal is available.
          typically thats the price/performance sweet spot for me.
          i do buy amd gpus too but only for my lunix machine where i also go with midrange.
          i am however considering intel when battlemage arrives.

          • 2 weeks ago
            Anonymous

            >when battlemage arrives
            >battlemage arrives
            >arrives

        • 2 weeks ago
          Anonymous

          Go with AMD if you plan to game on linux
          Go with Nvidia if you're a spoon fed moron

          • 2 weeks ago
            Anonymous

            >this is me by the way
            [giggles in happy woman noises]

        • 1 week ago
          Anonymous

          I just got a 4070 TI Super under MSRP and have zero complaints so far, it's perfect for what I play and for making AISlop on /trash/. I got the TUF specifically

      • 2 weeks ago
        Anonymous

        GN came to the conclusion that this is mainly user error

        • 2 weeks ago
          Anonymous

          They are right but forgot how many moronic people buy Nvidia products.

        • 2 weeks ago
          Anonymous

          It's user error, but it's a shit design as well. How come no other connector had issues like that?
          It's not nV's fault the PCI-SIG is moronic.

          • 2 weeks ago
            Anonymous

            Personally i like molex to pcie adapters.

        • 2 weeks ago
          Anonymous

          A faulty connection makes it more likely to melt, but it can melt even with a proper connection. The connector itself also has only a few connection cycles before it becomes prone to implosion.

        • 2 weeks ago
          Anonymous

          It's mainly user error combined with a shitty design that was "tweaked" to alleviate some of the issues. Doesn't really help that the 4090 draws a good amount of power, which causes a lot of these problems to recur over and over again.

          When Der8auer is calling your shit hot trash, you fricked up. https://www.youtube.com/watch?v=p0fW5SLFphU

          • 2 weeks ago
            Anonymous

            I'm starting to think our computers need to be completely reconfigured.
            The video cards become the motherboards and get the giant 24-pin connectors.

            While the motherboards only need 12 pins or some shit like that.

          • 2 weeks ago
            Anonymous

            frick off. nobody cares about your edaddy

      • 2 weeks ago
        Anonymous

        Does it only happen with the 4090 or with the whole 40 Lineup?

        • 2 weeks ago
          Anonymous

          Any card that uses the 12VHPWR connector (it absolutely must be plugged in horizontally and cannot be bent) but mainly 4090 due to extra power draw

          • 2 weeks ago
            Anonymous

            Show a pic of your 4090 to explain what you mean

            • 2 weeks ago
              Anonymous

              If the cable is bent like thick futa wiener it will burn down. If straight like ruler it is fine.

              • 2 weeks ago
                Anonymous

                Show me on yours

              • 2 weeks ago
                Anonymous

                The cable or the thick futa wiener?

  2. 2 weeks ago
    Anonymous

    r5 3600 plays everything already for 50$

    • 2 weeks ago
      Anonymous

      Technically it does, but still...

    • 2 weeks ago
      Anonymous

      7600 is closer to 7800X3D than 3600 is to 7600.
      3600 is like half as fast as 7600 while 7600 is like 10% slower than 7800x3d in games

      • 2 weeks ago
        Anonymous

        now let's see 5600's numbers
        the actual budget gold standard

  3. 2 weeks ago
    Anonymous

    recently upgraded to a 7800X3D + 4080 super from an i5 8400 + 1060 6gb, it really does feel great to be part of the upper caste. the steam hardware survey is depressing though.

    • 2 weeks ago
      Anonymous

      Based anon I'm the hwinfo above. I was on a 5800X/2070S before and it was already such a jump I can imagine how nice that upgrade was for you tbh. Cheers fren, may we make many thirdies and neet failures seethe

      • 2 weeks ago
        Anonymous

        I've just moved from a GTX 670 / i5 2500k to an RTX 4080 / 7800x3D

        It's a jump of like 7 or 8 generations haha

        • 2 weeks ago
          Anonymous

          Congratulations, that's a massive jump. I too just moved from i5-4440 / GTX970 to 7800X3D / RTX4070S and while I do enjoy the 165 FPS (limit on my monitor) at 1440p, the thing that impressed me most was HDR. 🙂

    • 2 weeks ago
      Anonymous

      >gaming laptop only beats budget PCs and laptops
      kek

    • 2 weeks ago
      Anonymous

      yeah baby

      • 2 weeks ago
        Anonymous

        I like how, to get into the 90%+ territory, you only need to not be running a laptop setup.

    • 2 weeks ago
      Anonymous

      >3080ti is better than 95%
      >OC'ed 3080ti is better than 96%
      >4070ti is better than 96%
      >OC'ed 4070ti is better than 97%
      anon, are you using 2666MT/s ram with stock timings?

  4. 2 weeks ago
    Anonymous

    Pluton chip.
    Never buy Ryzen past the 5xxx series

    • 2 weeks ago
      Anonymous

      Unplug your Ethernet, easy fix.

  5. 2 weeks ago
    Anonymous

    Hell yeah brother. I think I'm going to sell my gpu for a small loss when the 50XX's come out if the jump is big enough.

  6. 2 weeks ago
    Anonymous

    yep, that's my cpu

  7. 2 weeks ago
    Anonymous

    if I was still rocking my trusty 3600 I'd probably be looking at a 5700X3D or 5800X3D right now but I'm fine with the base 5700X
    I might grab an X3D chip when I move to AM5 (or most likely wait™ until AM6 and whatever Intel has to offer by then)

    • 2 weeks ago
      Anonymous

      Very reasonable setup. Mine is similar. I have a 1070ti left over from years ago. Would like to replace it, but there's no new game worth buying a new GPU for.

  8. 2 weeks ago
    Anonymous

    i just curve optimizer'ed my 7800x3d with all core -35 and it's passed all tests ive thrown at it
    did i get a golden sample?

    • 2 weeks ago
      Anonymous

      Redpill me on this

      Based anon I'm the hwinfo above. I was on a 5800X/2070S before and it was already such a jump I can imagine how nice that upgrade was for you tbh. Cheers fren, may we make many thirdies and neet failures seethe

      I said hwinfo above but another person posted one a few seconds before me, it's over.

      • 2 weeks ago
        Anonymous

        >Redpill me on this
        1. go in bios
        2. find PBO settings
        3. set PBO limits to motherboard
        4. find and set curve optimizer to all core negative 35
        5. run stress tests like prime95/cinebench/occt. recommended: occt with core cycling mode

        Did you check the scores for various benchmarks?
        Clock stretching is a thing.

        in cinebench r23 it was around 17000 for multicore, now it's 18000
        so essentially gained around 1000 points
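
        For reference, the arithmetic behind that uplift as a minimal python sketch, using only the two scores quoted above (nothing else is measured here):

        ```python
        # rough sketch: quantify the cinebench r23 gain quoted above
        before = 17000  # multicore score at stock
        after = 18000   # with PBO limits set to motherboard + all core -35
        gain = after - before
        print(f"+{gain} points ({gain / before * 100:.1f}% uplift)")
        # -> +1000 points (5.9% uplift)
        ```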

    • 2 weeks ago
      Anonymous

      Did you check the scores for various benchmarks?
      Clock stretching is a thing.

    • 2 weeks ago
      Anonymous

      Maybe, mine's garbage apparently and won't even boot with -10.

    • 2 weeks ago
      Anonymous

      run FPU Julia in AIDA64 a couple of times, if its not stable itll crash within the first pass

      • 2 weeks ago
        Anonymous

        i ran several occt tests (all cores/core cycling) then a few cinebench multi/single tests and it was fine
        my old 5900x would error within seconds on core cycling mode if it werent stable

        • 2 weeks ago
          Anonymous

          >core cycling
          so did i, then i ran FPU julia and got a BSOD

          • 2 weeks ago
            Anonymous

            i would but its now a linux machine without gui

            • 2 weeks ago
              Anonymous

              shame, do give it a try when you can, that particular stress test revealed instability when no other test did

              • 2 weeks ago
                Anonymous

                thanks for the recommendation anyways, might try this with the 5900x later just to see if it catches anything

        • 2 weeks ago
          Anonymous

          did you test different instruction sets? my -30mV uv was stable with avx512 but crashed with sse. Also it wasn't immediate, it would crash about 25-30 minutes in

          • 2 weeks ago
            Anonymous

            ive tested like 4 times all with different settings for 1 hrs each
            not a single error

  9. 2 weeks ago
    Anonymous

    >amd
    Enjoy your no drivers lmao

    • 2 weeks ago
      Anonymous

      Better than having house fires. Have you seen Intel's wattage pulls? Absolutely ridiculous.

      • 2 weeks ago
        Anonymous

        Yes in fricking cinebench where it triples the 7800X3D score. In gaming it tops out at 150W.

    • 2 weeks ago
      Anonymous

      Cope.

      >Redpill me on this
      1. go in bios
      2. find PBO settings
      3. set PBO limits to motherboard
      4. find and set curve optimizer to all core negative 35
      5. run stress tests like prime95/cinebench/occt. recommended: occt with core cycling mode

      [...]
      in cinebench r23 it was around 17000 for multicore, now it's 18000
      so essentially gained around 1000 points

      Thanks tbh I'll look into it, how much benefit is there really? I didn't oc anything the past 2 hardware upgrades I've gotten so I'm out of the loop by over 10 years.

      • 2 weeks ago
        Anonymous

        >how much benefit is there really
        it boosted my cinebench r23 score from 17000 to 18000
        it was worth it for me because i was expecting to spend the whole night testing
        had to spend an entire day on my old 5900x because one of the cores could only do -5 while some could do -30

        • 2 weeks ago
          Anonymous

          cinebench as a stress test is worthless, just run FPU julia in AIDA64 like i said, that test lasts like 10 seconds and causes a crash immediately if any of your cores are unstable

    • 2 weeks ago
      Anonymous

      >2012 memes in current year
      lmao stay in the past old man future is...... RED

    • 2 weeks ago
      Anonymous

      Cope

    • 2 weeks ago
      Anonymous
  10. 2 weeks ago
    Anonymous

    Yeah, especially with the next generation forcing you to transition to windows 11. Like they did back in 2018.

  11. 2 weeks ago
    Anonymous

    I… I got a Ryzen 7 7700x
    is it over?

    • 2 weeks ago
      Anonymous

      Yes, if you couldn't afford the best you should have stuck with the 5800x3d

      • 2 weeks ago
        Anonymous

        nah I’m happy and that’s enough for me

    • 2 weeks ago
      Anonymous

      I bought a 7600x

    • 2 weeks ago
      Anonymous

      I have the 7700, and what i can tell you is: you will be able to play all current triple A games at max settings at 120 fps, but rarely much above that, and of course you need an rtx 4070 ti or above to get those frames.

    • 2 weeks ago
      Anonymous

      They are very similar in performance. 7800x3D is just a slightly better buy.

  12. 2 weeks ago
    Anonymous

    I've got 6000MHz ram with 30-38-38-28-135 timings. I've been planning to update uefi and see if it can go better for like a year

    • 2 weeks ago
      Anonymous

      RAM barely matters unless you fricked something up.
      Worth it only if you have autism and simply enjoy tweaking the hardware. So if you fit that description, check ActuallyHardcoreOverclocking on israelitetube.

  13. 2 weeks ago
    Anonymous

    Is it true that AM5 has terrible boot times across the board?

    • 2 weeks ago
      Anonymous

      on first boot its really slow because it has to train ram or something
      i had to wait like 3 mins on the first ever boot

    • 2 weeks ago
      Anonymous

      I'll be honest with you, I'm this guy

      Hell yeah brother. I think I'm going to sell my gpu for a small loss when the 50XX's come out if the jump is big enough.

      My boot times are completely fricked since going to am5. Booting takes damn near an entire minute. Nothing has been able to help it either. Startup says my bios time was 38 seconds
      I dont shut my pc off though so it's really a nothingburger of a "downside"

      • 2 weeks ago
        Anonymous

        Maybe, mine's garbage apparently and won't even boot with -10.

        you Black folk need to disable memory training, 30-40 seconds is normal after the initial xmp/expo setup.

        • 2 weeks ago
          Anonymous

          maybe they turn the power strip off
          this causes a retrain

        • 2 weeks ago
          Anonymous

          Would that really have anything to do with core offset?

          • 2 weeks ago
            Anonymous

            sorry brosev, i misquoted

        • 2 weeks ago
          Anonymous

          I did. I looked into boot time fixes and tried everything including this and fast boot stuff. This is the best I could get it to. But again that's my literal only complaint since upgrading and I've booted my pc less than a handful of times since then

        • 2 weeks ago
          Anonymous

          >30-40 seconds is normal
          lol what? My ddr4+gen3 ssd boots in 15 seconds max

          • 2 weeks ago
            Anonymous

            >DDR4
            No shit moron, great reading comprehension

            My 7700x boots in 15 seconds or less.
            GTFO out here with this shit. This website is full of fricking morons.

            You seem to be the outlier homosexual, btw you bought a dogshit cpu and are calling others morons? Kek

            • 2 weeks ago
              Anonymous

              >dogshit
              What was I supposed to buy?
              I wanted 8 cores and AVX512.

        • 2 weeks ago
          Anonymous

          My 7700x boots in 15 seconds or less.
          GTFO out here with this shit. This website is full of fricking morons.

          • 2 weeks ago
            Anonymous

            your 16gb ram will boot faster than 128gb ram

            • 2 weeks ago
              Anonymous

              32.

      • 2 weeks ago
        Anonymous

        This is due to DDR5 memory training. You can disable it in the EFI settings

    • 2 weeks ago
      Anonymous

      I'll be honest with you, I'm this guy [...]
      My boot times are completely fricked since going to am5. Booting takes damn near an entire minute. Nothing has been able to help it either. Startup says my bios time was 38 seconds
      I dont shut my pc off though so it's really a nothingburger of a "downside"

      Yeah I got an 7800X3D last June as an upgrade over an i7-6700k and I thought I might have had a faulty motherboard with how long it took to boot compared to my old PC, like I press the power button and it can sit there with the lights on but no initialization for 20-30 seconds before it finally kicks in and shows the BIOS splash page/POST page
      it gives me massive anxiety because I once had a PC that died like that, it would power on but never do anything one day and then never recovered

      like you said if I just cycle it in and out of sleep then it's no problem because it's fine when it's running, but if I ever need to cold boot for whatever reason then I know to press the button and then go do something else for a minute

    • 2 weeks ago
      Anonymous

      I'll be honest with you, I'm this guy [...]
      My boot times are completely fricked since going to am5. Booting takes damn near an entire minute. Nothing has been able to help it either. Startup says my bios time was 38 seconds
      I dont shut my pc off though so it's really a nothingburger of a "downside"

      [...]
      Yeah I got an 7800X3D last June as an upgrade over an i7-6700k and I thought I might have had a faulty motherboard with how long it took to boot compared to my old PC, like I press the power button and it can sit there with the lights on but no initialization for 20-30 seconds before it finally kicks in and shows the BIOS splash page/POST page
      it gives me massive anxiety because I once had a PC that died like that, it would power on but never do anything one day and then never recovered

      like you said if I just cycle it in and out of sleep then it's no problem because it's fine when it's running, but if I ever need to cold boot for whatever reason then I know to press the button and then go do something else for a minute

      Enable "Memory Context Restore" in BIOS; for whatever reason it trains your RAM on every boot otherwise, which causes those ridiculously long boot times. Mine boots in just a few seconds after I turned it on.

      • 2 weeks ago
        Anonymous

        What's RAM training?

        • 2 weeks ago
          Anonymous

          The BIOS optimizes the timings and voltages of your RAM upon first booting a system with fresh hardware installed. On DDR5 it can take a while - multiple minutes in some cases. For whatever reason some motherboards will raise the flag to redo this process way too often, even every time you reboot the system, so the check needs to be disabled via a setting in the BIOS (if you run into that issue).

    • 2 weeks ago
      Anonymous

      i had no idea this was a problem. my pc boots crazy fast with this cpu

  14. 2 weeks ago
    Anonymous

    recently upgraded from an intel cpu (i dont remember which but i got it in 2015) to this. yeah i guess its fine. cyberpunk runs better now.

  15. 2 weeks ago
    Anonymous

    It'd be better if the pcie controller wasn't prone to killing itself tho

  16. 2 weeks ago
    Anonymous

    How did Intel fall behind? They were dominant for decades

    • 2 weeks ago
      Anonymous

      Intel really fricked up on their fabrication IIRC. AMD were able to hit 7nm with TSMC and Intel were stuck on 10 and fell behind bad.

      • 2 weeks ago
        Anonymous

        Intel 10 = TSMC 7. When AMD was on TSMC 7, Intel was stuck on 14. Now it's Intel 10 vs. TSMC 5, so AMD is still a node ahead.
        The next node from Intel (20A) is supposedly going to leapfrog TSMC's best (3nm), but I wouldn't trust anything Intel says.

    • 2 weeks ago
      Anonymous

      giant mommy got it together

      >relatively good value
      hahaha

      >mfw went from 2600x to 5800x3d all on the same mobo/ram
      >mfw the performance upgrades over the years

      waiting for amazon to deliver my 5800x3d ($269), upgrading from a 3700x

      • 2 weeks ago
        Anonymous

        You'll really enjoy it. The 0.1% min fps increase is shocking. Games feel so much smoother that it's almost unreal. Furthermore my avg framerate in some games is as high as max fps i used to get beforehand (rtx3070).

    • 2 weeks ago
      Anonymous

      Intel is on 7nm. AMD is on 5nm and has V-Cache. FWIW stacking cache is a TSMC technology, AMD did not invent it. They just used it. When AMD was on GF 12nm and Intel was on 14nm, they couldn’t compete and went for being the value proposition (2700X vs 9900K).
      It’s the same reason why AMD was able to catch up to nvidia that one time when nvidia used Samsung 8nm but fell behind again when nvidia returned to TSMC.
      Intel hoarded all the new and shiny ASML stuff however so this situation might reverse.

  17. 2 weeks ago
    Anonymous

    >8 cores

    LMAO

    • 2 weeks ago
      Anonymous

      It's sad that intel can't do more than 8 cores indeed.

    • 2 weeks ago
      Anonymous

      more than enough for games and basic PC usage

      besides, I'd rather have 8 performance cores than 4 performance p-cores and 16 trash e-cores like intel does, all whilst running WORSE in games and eating so much power for this meager performance that it's thermal throttling itself anyway
      I used to love intel CPUs but they are disappointing now and have been for a while

  18. 2 weeks ago
    Anonymous

    Thanks, but my 11900KF gets the job done

  19. 2 weeks ago
    Anonymous

    Got a 5800x3d+3080 that Ill ride till the 5000 series and wait 3+ more years after that. Honestly with framegen it seems fine to not need new GPUs that much. At this point its only if you want path tracing in any game.

    Ill need pretty much a new PC with the power draw its going to require, new motherboard platform.

    • 2 weeks ago
      Anonymous

      Frame gen? On a 3080? Wat?
      Isn't it just some ghetto version of it in some games using mods? I thought it only works on 40xx series
      Btw frame gen is a god tier feature and the quickest way to spot somebody has not used a 4070 or better is to talk shit about frame gen. Unless you're an esportsgay, but you're not human if you are anyway

      • 2 weeks ago
        Anonymous

        Lossless Scaling is The One Thing Nvidia DOESN'T WANT YOU TO KNOW!

  20. 2 weeks ago
    Anonymous

    How is the 7600?

    • 2 weeks ago
      Anonymous

      Assuming you mean the cpu, it is fine enough, if a little power limited at stock.

      >7800x3d
      should i just get a 4070 super and be done with it?

      7900GRE

  21. 2 weeks ago
    Anonymous

    >7800x3d
    should i just get a 4070 super and be done with it?

    • 2 weeks ago
      Anonymous

      7900 gre is the better choice; for rt the 4070 super is gimped in terms of future performance, for rasterization the 7900 gre is superior

    • 2 weeks ago
      Anonymous

      that's the setup I have and I like it, though I do need to use quality DLSS and frame gen to do 1440p path tracing on cyberpunk, but it runs at a solid 60fps and I can't feel the dreaded "input latency" that benchmarkers love to talk about
      If you have the money, go for the 4080 super instead just for future proofing with VRAM and the extra tensor cores, but I still stand by the 4070 Super being good

  22. 2 weeks ago
    Anonymous

    >cores cores cores
    i don't know what multi-cores even do. from what i see, number of cores go up means good cpu mm mm

  23. 2 weeks ago
    Anonymous

    got a 7800x3d and 4080s. i just like how neither gets hot, ever

    • 2 weeks ago
      Anonymous

      they do get hot but you're looking at edge temp not junction.

    • 2 weeks ago
      Anonymous
    • 2 weeks ago
      Anonymous

      What do you play on yours anon?
      Since upgrading I've played DD2, P3R, P5R, and now working on P4G
      Really putting that 4080s to work.

  24. 2 weeks ago
    Anonymous

    Not gonna lie AFMF is pretty sweet for those DX11 games that run well anyway and using it to get 240 fps is pretty nice, even if the input feels like 120fps (which is in line with how AMD suggests it is used).

    • 2 weeks ago
      Anonymous

      >framegen
      So, it's less guesswork as your FPS gets higher, but still guesswork, and input lag doesn't get better, so no real point in using it. It's also more guesswork when your FPS is low and therefore a lot more ugly and gives you less "benefit". So what's the fricking point other than for random videos and articles to state a fake higher FPS without any effort?

      • 2 weeks ago
        Anonymous

        *so no real point in using
        pos phone

      • 2 weeks ago
        Anonymous

        You get motion clarity of the higher fps which visually is noticeable.

        • 2 weeks ago
          Anonymous

          >visual clarity
          >needed more as your fps gets worse
          >more guesswork and more ugly as your fps goes down
          It just sounds moronic and the worst gimmick introduced in decades for vidya

          • 2 weeks ago
            Anonymous

            It is never intended to be used to turn 30 fps into 60, despite what Ganker will tell you. It is meant to be a way for gpus to keep up with the relatively sudden surge in monitor refresh rates we've had.

            • 2 weeks ago
              Anonymous

              >high fps already
              >less guesswork and but also less needed
              this is just moronic

              • 2 weeks ago
                Anonymous

                When cpus can't keep up, leveraging gpu cycles is a sensible option.

          • 2 weeks ago
            Anonymous

            >framegen
            So, it's less guesswork as your FPS gets higher, but still guesswork, and input lag doesn't get better, so no real point in using it. It's also more guesswork when your FPS is low and therefore a lot more ugly and gives you less "benefit". So what's the fricking point other than for random videos and articles to state a fake higher FPS without any effort?

            It's literally frame smoothing technology. It's not a performance enhancement feature. It's a visual enhancer. If the existing framepacing is bad it will feel even worse. If it feels good it will feel even better. Garbage in garbage out.
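
            To put that in numbers, a minimal python sketch assuming a hypothetical 60 fps base interpolated 2x (the figures are illustrative, not from the thread):

            ```python
            # illustrative only: frame gen doubles displayed frames, but the game still
            # samples input at the base rate, so responsiveness tracks base_fps
            base_fps = 60
            generated_fps = base_fps * 2
            display_frametime_ms = 1000 / generated_fps  # ~8.3 ms between shown frames
            input_frametime_ms = 1000 / base_fps         # ~16.7 ms between input-sampled frames
            print(f"display {display_frametime_ms:.1f} ms, input {input_frametime_ms:.1f} ms")
            ```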

            • 2 weeks ago
              Anonymous

              >visual enhancing
              >looks uglier
              lmao

              • 2 weeks ago
                Anonymous

                Unless you are a fighter pilot, when used at the framerates AMD and nvidia suggest it is quite hard to spot, especially when running at high resolutions, which inherently benefit any of these 'guesswork' technologies.

                >no FSR/DLSS and RT
                [...]
                I've had it bent for a year now and no problems

                >I've had it bent for a year now and no problems
                If the bend happens after 35mm (iirc) then that is fine, it is bending right at the connector that has caused issues.

              • 2 weeks ago
                Anonymous

                I sure hope my 4090 melts just before the 5090 release

              • 2 weeks ago
                Anonymous

                Based warranty enjoyer. Unless it is an asus card then you are fricked.

              • 2 weeks ago
                Anonymous

                I only buy FE cards. Nvidia ship them from a store about 10 miles away from me so it's good.

              • 2 weeks ago
                Anonymous

                >I only buy FE cards
                They still burn

              • 2 weeks ago
                Anonymous

                let's hope it does in a few months time.

              • 2 weeks ago
                Anonymous

                So nvidia can say "user error" and deny you an RMA

              • 2 weeks ago
                Anonymous

                You appear to care more than I do. If my GPU breaks and it gets a warranty refused, which it won't anyway, I'll buy another.

              • 2 weeks ago
                Anonymous

                They hate processing RMAs and giving out refunds. It's easier to pin the blame for a shit design on customer negligence.

              • 2 weeks ago
                Anonymous

                Yeah, I've done it before without issue with a launch 2080ti. I know how it goes.

              • 2 weeks ago
                Anonymous

                >AMD and nvidia suggest (…) buy product
                LMAOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO

              • 2 weeks ago
                Anonymous

                See

                [...]
                It's literally frame smoothing technology. It's not a performance enhancement feature. It's a visual enhancer. If the existing framepacing is bad it will feel even worse. If it feels good it will feel even better. Garbage in garbage out.

              • 2 weeks ago
                Anonymous

                It's a godsend for emulators and videos, take a guess why.

      • 2 weeks ago
        Anonymous

        frame generation was made by Nvidia to make pathtracing modes playable. These pathtracing modes are usually tech demos, mods or, in the case of Cyberpunk, literally bought advertisement graphics modes. AMD just followed suit with their own mode because they're focussed on playing catch-up instead of generating something new. DLSS3 is actually useful though, as it makes morons buy Nvidia cards even here.

  25. 2 weeks ago
    Anonymous

    Every time I'm considering an upgrade I remember that the only games I've played that weren't smooth enough were Elden Ring, Wo Long, and an Asshomosexual that's close to death (Predecessor), and while I enjoyed each, it's not enough to make me want an upgrade.

  26. 2 weeks ago
    Anonymous

    the ultimate benchmark is cpu heavy games like factorio or cities skylines, if it holds up for a heated 24 hour autism sesh then your OC is stable

  27. 2 weeks ago
    Anonymous

    Upgrading to a 5080 when it comes out

  28. 2 weeks ago
    Anonymous

    >mfw we will never get anything close to the price/performance of the 1080TI ever again.

  29. 2 weeks ago
    Anonymous

    7800X3D+Arc 770 here. Haven't had a single major issue at all.

  30. 2 weeks ago
    Anonymous

    Cool, but I'll wait for Zen 5.

  31. 2 weeks ago
    Anonymous

    no mr sheklestien i dont want to buy your ddr5, can you please leave me and my family alone?

    • 2 weeks ago
      Anonymous

      Just slap moar L3 on the cpu and dram means little.

    • 2 weeks ago
      Anonymous

      DDR5 isn’t even remotely expensive anymore. Your average 32 GB 6000 megatroony CL 30 memory is less than 140€.

  32. 2 weeks ago
    Anonymous

    is my 4090 supposed to be at 20-30% 3d utilization when playing a youtube video

    • 2 weeks ago
      Anonymous

      If decoding the 4k video, sure.

    • 2 weeks ago
      Anonymous

      Don't trust windows with anything.
      Check power draw in hwinfo.

    • 2 weeks ago
      Anonymous

      not likely your issue but ALWAYS turn off ambient mode in the YouTube cogwheel settings. can absolutely eat resources. no clue why they’ve left that shit in

    • 2 weeks ago
      Anonymous

      You have the ai upscaling thing enabled, also youtube has put some new measure in to tax people's PCs when it sees them running ad blocking software

  33. 2 weeks ago
    Anonymous

    >mfw went from 2600x to 5800x3d all on the same mobo/ram
    >mfw the performance upgrades over the years

    • 2 weeks ago
      Anonymous

      The 5800x3d (and i suppose the 5700x3d, even if it did release a lot later) has cemented that 1) AM4 is GOAT, 2) MOAR CACHE is still viable as long as you can do something about the latency (hence 3d stacking) and 3) unless something fundamentally changes in games, high cache will disproportionately pull its weight as more strain is put on the cpu.

      • 2 weeks ago
        Anonymous

        >am4 is goat
        But I'm on am5, have a better gaming cpu, and will actually be able to upgrade my cpu without upgrading mobo/ram. How is am4 goat when taking cope out of the equation?

        • 2 weeks ago
          Anonymous

          >How is am4 goat when taking cope out of the equation?
          Well, ignoring that a 5800x3d hangs with a 7700x (and in select titles outpaces it), come back in 5 years and we can compare the support AM5 gets.

          • 2 weeks ago
            Anonymous

            >7700X
            Kek.

            • 2 weeks ago
              Anonymous

              ...is a cpu that wipes the floor with every other 5xxx cpu in gaming yes. V cache is just some advanced packaging to tackle a problem cpu manufacturers have been fighting for over 30 years.

              • 2 weeks ago
                Anonymous

                You're right, I'm not saying it's a bad cpu, but it's moronic to buy when the 7800x3d exists. I'm arguing against the statement "am4 is goat"; that a 5800x3d is comparable to a 7700x is irrelevant, because the 7800x3d is better and am5 is upgradeable

              • 2 weeks ago
                Anonymous

                > but it's moronic to buy when the 7800x3d exists
                Yes and no. yes in so much as if you can afford it sure, the 7800x3d is the way to go. No in that you can't sell a 400 dorra or whatever cpu to someone who only has 300 dorra to spend.

              • 2 weeks ago
                Anonymous

                I'd ask if you could consider a world where the 7700x was available and the 7800x3d was not, but what am I saying? Of course you had breakfast this morning.

              • 2 weeks ago
                Anonymous

                > but it's moronic to buy when the 7800x3d exists
                Yes and no. yes in so much as if you can afford it sure, the 7800x3d is the way to go. No in that you can't sell a 400 dorra or whatever cpu to someone who only has 300 dorra to spend.

                Just like everything else pertaining to pc parts, have you tried not being poor? Anyway I'm not hating on anybody who can't get the best parts, simply saying am4 is not "goat". It's a dead socket, 5800x3d is great and a good upgrade for many people, but it would be ignorant to go am4 right now on a new build

              • 2 weeks ago
                Anonymous

                What the frick are you even talking about? LOL
                What does "being poor" have to do with it?

              • 2 weeks ago
                Anonymous

                >What does "being poor" have to do with it?
                it is internet gaslighting for not consooming. Not consooming means not increasing the revenue of your lord and master and thus you are untermensch and thus not worthy of being called human. Buy now or face the gulag goyim.

        • 2 weeks ago
          Anonymous

          >How is am4 goat when taking cope out of the equation?
          by skipping am5 and am6 (assuming still the same slot) entirely?
          you don't need to keep consooming and updooting every product, you know?

    • 2 weeks ago
      Anonymous

      same i upgraded from a 1600 to a 5700x3d. and went from 16 to 32gb of ram and its like i jumped several generations on the same hardware.
      feels gud.

      • 2 weeks ago
        Anonymous

        giant mommy got it together
        [...]
        >relatively good value
        hahaha
        [...]
        waiting for amazon to deliver my 5800x3d ($269), upgrading from a 3700x

        The 5800x3d (and i suppose the 5700x3d, even if it did release a lot later) has cemented that 1) AM4 is GOAT, 2) MOAR CACHE is still viable as long as you can do something about the latency (hence 3d stacking) and 3) unless something fundamentally changes in games, high cache will disproportionately pull its weight as more strain is put on the cpu.

        Based 5's

        Socket AM4 is indeed the king socket right now.
        CPUs are cheap.
        RAM is even cheaper.
        Motherboards are cheap.

        We are dining like kings for the foreseeable future.

  34. 2 weeks ago
    Anonymous

    5800X3D will still be enough for 99% of games for the next 10 years though.
    >BUT MUH HECKIN' 8K 300FPS FSR RSR DLSS DLAA RT LGBTQAI++++
    If a game needs any of these to look good, it means the company entirely outsourced their graphics department and put 0 effort into them, effectively making their game not worth my time.
    And 60FPS is still perfect for 99% of games.

    • 2 weeks ago
      Anonymous

      Nailed it.

  35. 2 weeks ago
    Anonymous

    AMD won.
    Intel lost.
    Cope chuds

    • 2 weeks ago
      Anonymous

      the equivalent intel cpu is a better investment because you can still do productive stuff with it
      x3d meme becomes useless the moment you log out of the game
      sorry amdrones, but blue/green is still the goat pairing

  36. 2 weeks ago
    Anonymous

    this is a niche product that you shouldn't be buying unless you already have a 4090, exclusively play ancient games or run at meme low resolutions
    you're almost never cpu bottlenecked so a 7700x / 13700k will match if not outperform it for half the price whilst being far better cpu's at actually being cpu's

    • 2 weeks ago
      Anonymous

      I don't disagree with you, it's overkill for most games. But...

      ...the 13700k and 7800X3D are basically the same price these days.

    • 2 weeks ago
      Anonymous

      That CPU is more expensive, uses over 3x more power and gets dumpstered by the 7800x3d in vidya benchmarks.
      Eat shit, shill.

      • 2 weeks ago
        Anonymous

        Nobody cares about your troonyfield and Cybertroony stress testing
        13700k is actually useful outside of games and is still powerful enough to max out anything that isn't modern AAA slop

        • 2 weeks ago
          Anonymous

          At least you can use your PC to dry your tears.

          • 2 weeks ago
            Anonymous

            >happy with PC
            >no you should have bought this one , it's better, you're crying shill , admit it
            what makes a person behave so bitterly?

            • 2 weeks ago
              Anonymous

              Because you're lying, CPU performance is a lot more substantial for gaming than you make it out to be. Even lower end GPUs can benefit from having a CPU that doesn't suck ass.
              And you're recommending a CPU that's objectively inferior in every applicable metric.

        • 2 weeks ago
          Anonymous

          >goes to the video game board
          >gets mad at people talking about video games
          Congrats, reddit! You finally fit in!

  37. 2 weeks ago
    Anonymous

    not so fast

    • 2 weeks ago
      Anonymous

      I still have my old system in the box.
      2500K was the GOAT.

    • 2 weeks ago
      Anonymous

      I'm still using my i7-2600k. It's a workhorse that's served me well. But it is showing its age.

  38. 2 weeks ago
    Anonymous

    I have a 7600. Do better CPUs even do anything or just muh 2 more frames for double the price?

  39. 2 weeks ago
    Anonymous

    Having to wait 30-60 seconds more when booting your PC once a day is a problem for people?

    • 2 weeks ago
      Anonymous

      thats just your pc shorting out, normal for ayymd

    • 2 weeks ago
      Anonymous

      most people here are phone users. if their device doesnt wake up in 5 seconds they assume it froze or its buggy/broken.

    • 2 weeks ago
      Anonymous

      >Shutting down your PC
      ???

      • 2 weeks ago
        Anonymous

        >Keeping your PC on 24/7
        ??????????????

  40. 2 weeks ago
    Anonymous

    good perf, good tdp
    reliable unless you use the big 3 shitter motherboards too.

  41. 2 weeks ago
    Anonymous

    A 3090 is better than a gaming laptop but not a real gaming PC
    How times have moved on.

    • 2 weeks ago
      Anonymous

      >crippled by 4770k cpu

      • 2 weeks ago
        Anonymous

        no, actually. it was HEDT I used for music production, which I threw a 3090 in.

        • 2 weeks ago
          Anonymous

          A cpu btfo so apocalyptically by the 3950x intel withdrew from the market.

          >music production
          Yeah okay then intel makes sense.

          • 2 weeks ago
            Anonymous

            >has 48 PCIE lanes

            • 2 weeks ago
              Anonymous

              >[laughs in 3960x]

              • 2 weeks ago
                Anonymous

                laugh all you want. there's not a chance in hell I'm putting an AMD cpu in to my DAW PC.

              • 2 weeks ago
                Anonymous

                Reminds me of the podcast MLID (yeah >MLID) did with an aussie DAW engineer who mentioned some latency regressions in windows that were ignored until the gaymen crowd complained and SUDDENLY MS managed to fix it. https://www.youtube.com/watch?v=_HJu5xt43iQ

              • 2 weeks ago
                Anonymous

                The latency was the reason not to go with AMD.
                Something about the way AMD CPUs dealt with memory caused the minimum latency to be about 15ms - I could get down to 1ms with the Intel. There was a thing about it on the Cubase forums.
                I don't even know if it's fixed or not yet. I spent thousands getting ultra low latency so I'm never going to take a chance on AMD to find out.

              • 2 weeks ago
                Anonymous

                i mean fair enough - you fit into a niche scenario where intel is better and there is nothing wrong with that.

              • 2 weeks ago
                Anonymous

                I don't use that Pc any more. It was ok for the time, but

                yeah baby

                that's my current main.
                I'm trying to decide if I should buy a duplicate and have it as a separate entity rather than clog up all my USB ports in my music rig with gaming wheels and joypad shit.
                I'm on the fence as it's getting close to new GPU/CPUs in 5 months

              • 2 weeks ago
                Anonymous

                Not him but RAM latency is measured in ns. If you had ms worth of RAM latency your system would be unusable.
                What were you using to test system latency anyway?

              • 2 weeks ago
                Anonymous

                I didn't say ram latency. I was talking about ASIO latency via a PCIe soundcard, it's measured in ms and scales with the size of the buffer and audio sample rate.
                At the time I bought it AMD CPUs had a limit on the minimum latency I could achieve (latency counts when you're playing VST synths like a piano as it messes with your timing)
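
                The buffer part of that latency is simple arithmetic; a minimal python sketch, assuming a 48 kHz sample rate and a few common buffer sizes (one-way buffer latency only, real round-trip figures add converter/driver overhead):

                ```python
                # one buffer's worth of audio, in milliseconds
                def asio_buffer_latency_ms(buffer_samples, sample_rate_hz):
                    return buffer_samples / sample_rate_hz * 1000

                for buf in (64, 128, 256, 512):
                    print(buf, "samples @ 48 kHz ->", round(asio_buffer_latency_ms(buf, 48000), 2), "ms")
                # 64 -> 1.33, 128 -> 2.67, 256 -> 5.33, 512 -> 10.67
                ```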

              • 2 weeks ago
                Anonymous

                Not him but that is also determined by the controller too? I think I can get lower latency on my Audient Evo than the scarlet solo. Although both are light-years better than the shitty Behringer one someone gifted me, that thing would crap out and overflow even at 512 samples.
                (Guitar)
                Also I can't solve some noise issues; does this come down to the motherboard ultimately, and the environment? Because these interfaces I have are all USB powered.

              • 2 weeks ago
                Anonymous

                So it might be anything.
                >CPU
                >Motherboard
                >Chipset
                >OS
                >Driver
                >Soundcard
                >UEFI
                Without a way to separate those variables you can't find the solution.
                It would be nice if we could have more direct PCI-E lanes.
                Because I've seen multiple people complain about devices connected via chipset on AM4.

              • 2 weeks ago
                Anonymous

                I'm not sure of your location, but in the UK there's a computer store named Scan. It has a subdivision which makes custom PCs for AI/Audio/graphic workstations etc called 3XS.
                One of the guys who puts together and tweaks these Pro audio PCs used to provide all the audio performance benchmark results for UK mags like Sound On Sound and post his findings/opinions on the gearbawdz forum. He went into lots of detail in some reports years ago and explained it all. All the Dawbench results used to be published on the Scan website, but fricked if I can find them now; he's probably left and taken them down.
                Anyway the 10900x PC was one of his, it was made to fit in a 4u server case. At the time the only upgrades I could go for were more ram and swapping the CPU to a 10980xe. At the time I thought it was the dog's bollocks. Now it can't even run Dogma 2 at 1440p

              • 2 weeks ago
                Anonymous

                No such thing as futureproof, eh?
                HEDT died altogether; getting workstation hardware is asking for trouble.
                There is a reason why people stick with their own hardware for as long as possible for any production. Changing platform brings surprises.
                But don't worry DD2 runs poorly no matter what.
                >inb4 DD2 cultists screaming it works fine on their machine

              • 2 weeks ago
                Anonymous
              • 2 weeks ago
                Anonymous

                ha ha, it does though on my main PC
                Here it is running 8k

              • 2 weeks ago
                Anonymous

                there's a trick to minimize ASIO latency, you disable all the shit in your bios except what you use
                i managed to lower my dpc latency to like sub 200us on every metric (pic kinda related)

                disable all these things
                >all power management like c states
                >all virtualization stuff like SVM
                >all unused shit like sata ports/usb ports/second ethernet port/wifi/bluetooth
                etc. wifi is especially harmful btw

              • 2 weeks ago
                Anonymous

                DPC latency at idle is a meme anyway.
                15 years later and people still get memed by LatencyMon

              • 2 weeks ago
                Anonymous

                it managed to eliminate the lag when playing DTXMania so it works for me ;^)

              • 1 week ago
                Anonymous

                it managed to eliminate the lag when playing DTXMania so it works for me ;^)

                is it ok?

              • 2 weeks ago
                Anonymous

                Then you're an idiot. The higher temps of Intels will mean your DAW will be noisy. So if you're recording anything other than pre-arranged loops, like say a vocal take, you're fricked.

  42. 2 weeks ago
    Anonymous

    >tfw my Ryzen 9 5900X still destroys

    I won't need a new CPU probably ever

  43. 2 weeks ago
    Anonymous

    intel ARCs look so fricking sexy
    Shame their performance is still subpar

  44. 2 weeks ago
    Anonymous

    I hope ms flight sim 2024 is better at utilizing cpu than ms flight sim 2020

  45. 2 weeks ago
    Anonymous

    Actually, 2 x 16GB DDR4 sticks are much, much, much cheaper than their DDR5 counterparts.

    So the Ryzen 7 5800X3D is a much cheaper option, since it's compatible with DDR4.
    And with the 3D cache on your CPU, you don't actually need fast DDR5 ram.

    The AM4 platform with DDR4 remains the best platform for gaming.

    • 2 weeks ago
      Anonymous

      Maybe on a performance per dollar basis if you want to save money.
      But if you want the absolute best you go AM5.

      • 2 weeks ago
        Anonymous

        I bet you most people are tight on money and can't afford the graphics cards.

        AM4 with DDR4 really is the best for most people.

        • 2 weeks ago
          Anonymous

          Then going 5800x3d with a lower-end card like RDNA2 or RTX3xxx and sticking to 1080p is their best bet. At least with the 5800x3d you can really utilize the lower resolution much better.

          • 2 weeks ago
            Anonymous

            I think you have misunderstood something. The 5800x3D will work just as good as the AM5 cpus at high resolution graphics.

            • 2 weeks ago
              Anonymous

              I know it will. Try again. Really think about the discussion at hand this time. You can do it.

              • 2 weeks ago
                Anonymous

                I'm not sure I understand you. The 5800x3D is so good it will work well on low, mid and high end.

                There is no need to deliberately pick low end graphics.

              • 2 weeks ago
                Anonymous

                Follow the reply chain, and try once more. I believe in you.

              • 2 weeks ago
                Anonymous

                The logic is to save money on an AM4 system:
                RAM, CPU, motherboard, and not lose that much performance. So you can get better graphics than you would normally get.

                What's your logic? Just to go low end across the system?

              • 2 weeks ago
                Anonymous

                If you're that poor, then sure?

              • 2 weeks ago
                Anonymous

                That's kind of irrational of you. At that point the 5700x3D probably makes the most sense?

                The 5800x3D doesn't lose that much to your am5 system. And you still save a bunch of money.

              • 2 weeks ago
                Anonymous

                Sure. There's that. Same with the 5600x3d.

              • 2 weeks ago
                Anonymous

                I don't think that exists.

                Not all CPUs have 3D cache.

              • 2 weeks ago
                Anonymous

                It exists but it was a limited edition only sold by Microcenter. Good deal if you could get it but otherwise irrelevant.

    • 2 weeks ago
      Anonymous

      >the am4 platform remains the best platform for gaming

      I bet you most people are tight on money and can't afford the graphics cards.

      AM4 with DDR4 really is the best for most people.

      >am4 really is the best for most people
      Nice job moving the goalposts homosexual. This is pure cope btw. Am4 is objectively trash to build with now, but will be fine for a few years if you already have it. You'd have to be braindead to build an am4 pc right now

      • 2 weeks ago
        Anonymous

        >You'd have to be braindead to build an am4 pc right now
        Nope. Those CPUs with 3D cache are good for another 10 years; by the time upgrades are needed, AM5 will be ancient news.

        • 2 weeks ago
          Anonymous

          >using any pc part for 10 years
          Disregard, didn't realize what kind of poor moron i was talking to

          • 2 weeks ago
            Anonymous

            That's what people did on the first generation of 4 core i7 cpus from intel, there was nothing substantially better the next 15 years.

          • 2 weeks ago
            Anonymous

            please keep this mentality up
            the big zogs from amd/nvidia/intel loves gullible idiots like you
            please stay tuned for the next product

            • 2 weeks ago
              Anonymous

              >please stay tuned for the next product
              I can't wait. I need a more powerful PC. This 13900 and 4090 are starting to show their age

    • 2 weeks ago
      Anonymous

      The only problem with AM4 right now is that it's at the end of its lifespan, while AM5 will get new CPUs in the future.
      If you're already on AM4, great. If you're on some older or Intel socket, you may as well go for AM5 unless you can get all the AM4 stuff for cheap.

      • 2 weeks ago
        Anonymous

        >The only problem with AM4 right now is that it's at the end of its lifespan
        This is the cheapest point to buy the RAM and motherboard and CPU actually.

        You only need CPU upgrades every 10 years or so. It usually takes 5 years to develop a new generation of CPUs.
        Only morons with big pockets upgrade from the 1st generation of AM5 CPU to another generation of AM5 CPU.

        • 2 weeks ago
          Anonymous

          why is it that the most vocal people are always the most clueless?

          • 2 weeks ago
            Anonymous

            Most Sandy Bridge owners never upgraded their CPUs past the one they had.
            You don't know what you're talking about.

            • 2 weeks ago
              Anonymous

              >normies don't know how to swap parts
              Shocking.

              • 2 weeks ago
                Anonymous

                Look at that Ryzen 7 5800X3D on your chart.

                Technically it does, but still...

                It's only 4% below that AM5 CPU at the top.
                The AM4 platform is good for another 10 years.

              • 2 weeks ago
                Anonymous

                That's a noticeably lower-tier product with fewer cores, though. It's not that the 5800X3D is anywhere near obsolete or that you can't use it for another ten years (though I wouldn't count on your motherboard lasting that long under heavy use), but it (and DDR4 etc.) will be a pretty big bottleneck for the GPU by then. Unless you're aiming for an ultimate budget build (at which point it's better to go second-hand anyway), a brand-new AM4 system makes little sense.

              • 2 weeks ago
                Anonymous

                The king of kings.
                Ignore the newbie CPUs.

                >but it (and DDR4 etc.) will be a pretty big bottleneck for the GPU by then
                I doubt it. Just increase the resolution and the GPU becomes the bottleneck.

                128GB DDR4 is very affordable by the way in case you didn't know. Try that on AM5 and see what happens.

              • 2 weeks ago
                Anonymous

                >5800X3D
                >128GB DDR4
                with how many sticks exactly? this cpu is picky when it comes to this stuff

              • 2 weeks ago
                Anonymous

                4 sticks. The CPU handles standard settings just fine.

              • 2 weeks ago
                Anonymous

                What is the point of 128GB if you run it at like 2666-3000MHz with loose timings? A dual-rank 32GB 3800C14 kit or so is far superior.
                If you want fast RAM at 64GB or higher you need DDR5, and that's not up for debate.
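
                If you want to put a number on the timings argument: first-word latency is roughly 2000 * CAS / data rate (MT/s). A rough sketch in Python, where the kits are just illustrative examples rather than measurements:

                # Back-of-the-envelope first-word latency: 2000 * CAS / data rate (MT/s).
                # The kit specs below are illustrative placeholders, not benchmarks of real sticks.
                kits = {
                    "DDR4-2666 CL18 (big 128GB kit)": (2666, 18),
                    "DDR4-3800 CL14 (tuned 32GB kit)": (3800, 14),
                    "DDR5-6000 CL30 (typical AM5 kit)": (6000, 30),
                }

                for name, (mt_s, cas) in kits.items():
                    latency_ns = 2000 * cas / mt_s
                    print(f"{name}: {latency_ns:.1f} ns first-word latency")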

              • 2 weeks ago
                Anonymous

                3D cache means RAM speed is irrelevant.

              • 2 weeks ago
                Anonymous

                >if you run it at like 2666-3000MHz
                Default AM4 is 3200 MHz. There is no need to downclock the RAM.

              • 2 weeks ago
                Anonymous

                >le king of kings
                It's a compromise, moron. DDR5 isn't even that expensive. By choosing AM4 you save maybe a hundred bucks, get worse performance, and will never have an upgrade path. If that's a tradeoff you're willing to take, go for it, but giving braindead advice is just poor form.

              • 2 weeks ago
                Anonymous

                Why the frick would you ever want to upgrade from a nice and cheap motherboard with nice and cheap 64 GB ram to begin with?

              • 2 weeks ago
                Anonymous

                >Why the frick would you ever want to upgrade from a nice and cheap motherboard with nice and cheap 64 GB ram to begin with?
                I don't know. Because your processor lacks a key feature? Because your RAM speed is insufficient? Because your setup has one broken part and there is no remaining stock for reasonable prices?

                There's a reason nobody uses a Pentium 3 as their daily driver anymore. You're penny-pinching in places that don't make much sense, and it sounds like you want to stick to old shit out of pseudoboomer nostalgia rather than for any rational reason.

              • 2 weeks ago
                Anonymous

                >I don't know. Because your processor lacks a key feature?
                Such as?
                >Because your RAM speed is insufficient?
                It's not, the 3D cache literally makes your overpriced DDR5 RAM superfluous.

                >Because your setup has one broken part and there is no remaining stock for reasonable prices?
                Okay, let me check
                DDR3 RAM: still available to this day.
                DDR2 RAM: still available to this day.
                DDR1 RAM: still available to this day.
                Hmm, doesn't seem to be a problem.

                Okay, let me check some more.
                AM3+ motherboards: still available to this day.
                AM2 motherboards: still available to this day.
                Socket 754 and 939 motherboards... hold on! Whoops! Looks like you have a point there.

                The old Socket 939 motherboards are really difficult to find these days.
                Let's see: Socket 754 was 2003, and Socket 939 was introduced in 2004, 20 years ago.
                Its successor, AM2, was introduced in 2006.
                So let's say you purchased a Socket 939 board in 2006. 18 years later, that motherboard is impossible to find.
                So yeah, you definitely shouldn't stretch your system to last 18 years.

                But 10 years actually appears to be a fair and reasonable number, both from a performance point of view and from an availability point of view.
                Even 15 years might be a good stretch for exceptionally good architectures such as the 5800x3D.
                But 18 years is probably stretching it.

      • 2 weeks ago
        Anonymous

        AM4 owners can expand to 64 GB DDR4 for cheap.
        Even 128 GB DDR4 across 4 sticks is very cheap.

        AM5 can't do that without raping your wallet.

  46. 2 weeks ago
    Anonymous

    I have an i5-8400

  47. 2 weeks ago
    Anonymous

    It won't run Dragon's Dogma 2 properly. Nothing easily available can. We are still at least a generation away from that.

  48. 2 weeks ago
    Anonymous

    The CPU is the most important part of the system and it pisses me off that young homosexuals here kept crying in DD2 threads before launch "is my GPU gonna make it?", homie lol

    • 2 weeks ago
      Anonymous

      No homosexual, every component is important.
      You need a well-balanced system; your system is only as fast as its slowest part.

      • 2 weeks ago
        Anonymous

        Every component is important but the CPU is the most important. It's what dictates the entire platform.

        • 2 weeks ago
          Anonymous

          You can cut corners on the CPU without losing much frame rate.

          • 2 weeks ago
            Anonymous

            To a degree, yes, on things like cores and cache, but you can't skimp on architecture.

  49. 2 weeks ago
    Anonymous

    >5600X
    >3080 10GB (bought used for cheap in great condition, still can't believe it wasn't a scam when the package arrived)
    I have an absolute total of 0 (zero) reason to upgrade
    I don't know what you Black folk are getting so hard bottlenecked on that you need to invest on new $2000 pieces every other year

  50. 2 weeks ago
    Anonymous

    idk shit about cpus all i know is i have moronic computer habits and leave 600000 billion tabs open and apparently you need more cpu for that so i bought a big cpu

  51. 2 weeks ago
    Anonymous

    >he doesn't use a Mac for music production

  52. 2 weeks ago
    Anonymous

    I plan on getting a PC with it and 4070 ti super.

  53. 2 weeks ago
    Anonymous

    AMD realized that most modern games are shit and the only ones worth playing are 10+ years old and made a CPU that has enough L3 cache to run those games on modern systems.
    I have a 5800X3D I recently upgraded to from a 3900X, and the FPS increase across all games made from 2000-2014 has been insane.

  54. 2 weeks ago
    Anonymous

    I'm waiting for the 9800X3D so I can finally see 60 fps in the main town of DD2.

  55. 2 weeks ago
    Anonymous

    I have a ryzen 5 4600g

  56. 2 weeks ago
    Anonymous

    Impressive circlejerk thread. And Ganker doesn't have fanboys, right?

    • 2 weeks ago
      Anonymous

      Don't leave home without your 3D cache.

      Nearest Intel CPU is literally 22% behind.
      Nearest AMD CPU without 3D cache is 30% behind.

      • 2 weeks ago
        Anonymous

        >marketing alerts lisa to the powa intel is going to leash
        >next day at crack of dawn
        >lisa still hungover drunkenly staggers into R&D heads office
        >"wasssssaup, wass da prablem"!?
        >Mes Su, Intel got a good system
        >*hic* Fuuuuug, what epyc chips, can they be used for desktop!? *hic*
        >y-yes at only 15% loss to yields
        >*hic*
        >DOT IT FGT, make Pat cry within the week
        >*hic*
        >GENIUS Mrs Su, it shall be done

      • 2 weeks ago
        Anonymous

        >you should buy this car it has really comfortable back seats vs other cars
        >but I drive, I don't sit in the back. I only care about the driver seat.
        >No, this car has the best comfy back seats though, you should buy it based on this
        this is what you 1080p 4090 benchmark morons sound like

        • 2 weeks ago
          Anonymous

          You may not like the methodology, but these are actually good measurements of how the CPUs will perform for you 8-10 years down the line, when current-year graphics cards have become so fast that they no longer bottleneck you at high resolution.

        • 2 weeks ago
          Anonymous

          Technically the CPUs should be benchmarked at 640x480 resolution if possible.
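
          The point of dropping the resolution is just to take the GPU ceiling out of the equation: delivered fps is roughly min(CPU-limited fps, GPU-limited fps). A toy model in Python, where every number is made up purely for illustration:

          # Toy bottleneck model: delivered fps is capped by whichever side is slower.
          # All fps figures below are invented for illustration only.
          cpu_fps = {"CPU A": 240, "CPU B": 180}            # what each CPU could feed
          gpu_fps = {"480p": 600, "1080p": 260, "4K": 90}   # what the GPU can draw

          for cpu, c in cpu_fps.items():
              for res, g in gpu_fps.items():
                  print(f"{cpu} @ {res}: {min(c, g)} fps "
                        f"({'CPU-bound' if c < g else 'GPU-bound'})")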

        • 2 weeks ago
          Anonymous

          >i only care about the driver seat
          Then why are you in a thread about the back seat?

    • 2 weeks ago
      Anonymous

      The market has basically backstabbed Intel as revenge for holding everybody hostage at 4 cores for 10 years straight without any real development or improvements.
      It's a just and deserved punishment.

  57. 2 weeks ago
    Anonymous

    >gpu broke
    >my i3 still going strong after 8 years
    im scared bros i don't want to buy a new gpu and a cpu

    • 2 weeks ago
      Anonymous

      You must.

      Let the i3 rest in peace at last, and join the 3D cache kings.

      • 2 weeks ago
        Anonymous

        im about to start a welding course so i hope with the extra cash i will be able to upgrade without going in the red

  58. 2 weeks ago
    Anonymous

    I am moronic but will i notice any difference emulation wise with 5800x3d/7800x3d when i have a 5600x?

    • 2 weeks ago
      Anonymous

      Nobody ever benchmarks emulation software in the CPU tests, even though they really should.

      So I actually have no idea.

      • 1 week ago
        Anonymous

        They do, actually.

        • 1 week ago
          Anonymous

          So, for emulating Switch, the 3D cache improves Zen 4 by 10%.
          And it improves Zen 3 by 8%.

          For PS3 emulation, the 3D cache improves Zen 4 by 39%.
          And it improves Zen 3 by only 2%.

          Interesting.
          There is something weird in these emulators that really cripples the Zen 3 based CPUs.
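
          For what it's worth, those uplift figures are just (fps with cache / fps without cache - 1). A quick sketch of the arithmetic, using placeholder frame rates rather than the actual benchmark numbers:

          # Percentage uplift from adding 3D cache: (with_cache / without_cache - 1) * 100.
          # The fps pairs below are placeholders chosen to mirror the Zen 4 percentages above.
          results = {
              "Switch emulation": (100, 110),   # (without 3D cache, with 3D cache)
              "PS3 emulation":    (100, 139),
          }

          for workload, (base, x3d) in results.items():
              uplift = (x3d / base - 1) * 100
              print(f"{workload}: {uplift:.0f}% faster with the extra cache")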

    • 2 weeks ago
      Anonymous

      Only for stuff like RDR1 on RPCS3. I think the 7800X3D can hit 60FPS. If all you care about is Switch and shit, you can do 1-2x native easily depending on your GPU. Just make sure you got the VRAM for it. RPCS3 likes having more cores and threads, but for everything else, 6c/12t is more than enough for 1080p.

    • 2 weeks ago
      Anonymous

      isn't the 5600x already a beast?

    • 2 weeks ago
      Anonymous

      not as substantially as shills say
      5600x is already a great cpu
      5800x3d might be worth giving a shot if you find a huge sale, but unless you are trying a bunch of pointless upscale settings (literally 1% difference for 50% more stress on the cpu), you won't really need it
      absolutely not worth getting an entire new system for 7800x3d though, am5 is safely skippable unless you are a richgay

      • 1 week ago
        Anonymous

        Is there a way to quantify the difference between 5600x and 5800x3d?
        even at absolute cheapest the latter is like twice the price usually around here
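
        Best way I can think of to frame it myself is fps per unit of money, using whatever review numbers and local prices apply. A rough sketch where every figure is a placeholder, not real data:

        # Crude value metric: average benchmark fps per unit of currency.
        # Prices and fps below are placeholders; swap in a real review's numbers and local prices.
        cpus = {
            "5600X":   {"avg_fps": 100, "price": 130},
            "5800X3D": {"avg_fps": 125, "price": 260},
        }

        for name, d in cpus.items():
            print(f"{name}: {d['avg_fps'] / d['price']:.2f} fps per unit of currency")

        uplift = cpus["5800X3D"]["avg_fps"] / cpus["5600X"]["avg_fps"] - 1
        extra_cost = cpus["5800X3D"]["price"] / cpus["5600X"]["price"] - 1
        print(f"~{uplift:.0%} more fps for ~{extra_cost:.0%} more money")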

    • 2 weeks ago
      Anonymous

      not as substantially as shills say
      5600x is already a great cpu
      5800x3d might be worth giving a shot if you find a huge sale, but unless you are trying a bunch of pointless upscale settings (literally 1% difference for 50% more stress on the cpu), you won't really need it
      absolutely not worth getting an entire new system for 7800x3d though, am5 is safely skippable unless you are a richgay

      AVX-512 is pretty important for 7th gen emulation, isn't it? Other than that, yeah Zen 4 doesn't justify a new system if you're already on 3.

      • 2 weeks ago
        Anonymous

        Not really. My 13900k doesn't have it and I can play the most demanding ps3 games at full speed.

    • 1 week ago
      Anonymous

      not worth upgrading. wait a few years and buy all the computer parts instead

  59. 2 weeks ago
    Anonymous

    DAILY REMINDER
    This board is infested with paid nShitia and Blizzard shills

  60. 2 weeks ago
    Anonymous

    >poorgays cant afford nvidia

    • 2 weeks ago
      Anonymous

      >t.xx60owner

  61. 2 weeks ago
    Anonymous

    Nvidia makes better GPUs but their drivers are absolute shit on Linux

    • 2 weeks ago
      Anonymous

      My laptop's fricking struggling to do anything without a kernel panic or some other moronic bullshit with a 13th gen i7 and a 4070 mobile. Using KVM is also hell due to linuxgays gatekeeping everything about the simplest fricking things such as enabling Hyper-V. Not to mention the 3x loss in performance due to moronic e-cores slowing down everything. Once this thing becomes e-waste I'll definitely go for a full AMD laptop and never look back.

  62. 2 weeks ago
    Anonymous

    >ryzen 5 4600g
    >rx 6600
    1080p is all i need

  63. 2 weeks ago
    Anonymous

    BUY NOW OR WAIT FOR 5000 SERIES

  64. 2 weeks ago
    Anonymous

    At the high end, depends on the game.
    I bought a 13700k because it was $300 cheaper than it at the time, and I get 8 extra cores and lower power consumption.

    • 2 weeks ago
      Anonymous

      >At the high end, depends on the game.
      Definitely.
      In Tomb Raider the 7800X3D is 27% faster than the 24-core 14900K.

      • 2 weeks ago
        Anonymous

        Source?

        • 2 weeks ago
          Anonymous

          Don't leave home without your 3D cache.

          Nearest Intel CPU is literally 22% behind.
          Nearest AMD CPU without 3D cache is 30% behind.

          • 2 weeks ago
            Anonymous

            >100 fps less
            Further proves how important RAM speeds are for Intel, a literal 25% increase in FPS.

            • 2 weeks ago
              Anonymous

              I don't trust Intel with maximum overclocks.

              They fudge their power profiles to look good in performance at launch.
              But then they release new power profiles that scale back performance post-launch so their chips don't get fried en masse and hit with class-action lawsuits.

              • 2 weeks ago
                Anonymous

                How often is it under max load when gaming? Maybe when building shaders, I guess, it could spike up a bit.
                Never had an issue, and it uses 25-30W less at idle.

              • 2 weeks ago
                Anonymous

                I used to think idle power was a concern, until I realised I have multiple computers and I'm better off just shutting down the PC when I'm not playing.
                Idle power consumption matters more on my laptop, where I do all of the browsing and idling stuff.

              • 2 weeks ago
                Anonymous

                It's not just idle per se but less demanding tasks, such as posting on Ganker or watching a YouTube video.
                People have done tests with HWInfo open in the background all day and Intel was far more efficient overall.

              • 2 weeks ago
                Anonymous

                >It's not just idle per se but less demanding tasks, such as posting on Ganker or watching a YouTube video.
                Yeah, that's what my laptop is for. It's better at that task than any desktop PC.

                That's why idle is irrelevant.

              • 2 weeks ago
                Anonymous

                So you turn your PC off every time you have a break and watch YouTube on a tiny screen? You're not the norm here.

              • 2 weeks ago
                Anonymous

                Yup.

                Actually it's the other way around: the laptop is my main machine, and when I've had enough of your shit, I take a break and boot up the desktop.

              • 2 weeks ago
                Anonymous

                >People have done tests with HWInfo open in the background all day and Intel was far more efficient overall.
                because intel lies and makes the number smaller in software
                if you measure the power draw of the whole system (which is the only thing that matters), they're equal

              • 2 weeks ago
                Anonymous

                Holy schizo
                You realize people have tested the power draw from the wall, right?

              • 2 weeks ago
                Anonymous

                >You realize people have tested the power draw from the wall, right?
                yeah, that's what the chart shows, dumbass

                the mobo's components uses power as well. the reason why you cannot trust whole system power draw comparisons between 2 different chipsets is because you can't isolate the variable that you want to compare (which is the cpu itself). you could use a power-hungry mobo with tons of power phases, power-hungry chipsets, onboard wifi/soundcard etc. vs a barebones mobo with a mediocre amount of power phases, no extra onboard modules and get significantly different whole system power draw results

                >the mobo's components uses power as well
                as does the memory controller, PCIe lanes, etc
                this is what AMD shows in HWiNFO, intel doesn't
                that's why their numbers look lower
                >because you can't isolate the variable that you want to compare (which is the cpu itself)
                you still need a motherboard to use the CPU

              • 2 weeks ago
                Anonymous

                >you still need a motherboard to use the CPU
                that's not the point, the point is that you can maliciously tweak the results to favor whatever conclusion you want by using mobos with lopsided power consumption profiles. even within the same socket, lower-end, barebones mobos will consume less power than high-end ones with lots of power phases and onboard modules

              • 2 weeks ago
                Anonymous

                multiple reviewers show the same results
                you shouldn't trust software sensors for anything other than sanity checking when overclocking or installing a new cooler

              • 2 weeks ago
                Anonymous

                the mobo's components uses power as well. the reason why you cannot trust whole system power draw comparisons between 2 different chipsets is because you can't isolate the variable that you want to compare (which is the cpu itself). you could use a power-hungry mobo with tons of power phases, power-hungry chipsets, onboard wifi/soundcard etc. vs a barebones mobo with a mediocre amount of power phases, no extra onboard modules and get significantly different whole system power draw results

              • 2 weeks ago
                Anonymous

                >you could use a power-hungry mobo with tons of power phases, power-hungry chipsets, onboard wifi/soundcard etc.
                Or in some instances those power hungry mobos could be the only ones that are available to that particular CPU.
                So the user is better off knowing about this.

                PCI-E 4.0 lanes are very power hungry for example.

              • 2 weeks ago
                Anonymous

                >PCI-E 4.0 lanes are very power hungry for example.
                for the purposes of comparing am5 with raptor lake, both have mobos with pcie5 and pcie4 for both the 16x gpu slot and the 4x ssd slot so the opportunity to muddy the water is still present

                multiple reviewers show the same results
                you shouldn't trust software sensors for anything other than sanity checking when overclocking or installing a new cooler

                i'm not disputing that the software sensors can be inaccurate, i'm saying that whole system comparisons are flawed because they can be rigged by the tester, perhaps even unintentionally. these are 2 different arguments

              • 2 weeks ago
                Anonymous

                there's no other way to accurately test idle power consumption
                software sensors are obviously inaccurate because manufacturers are measuring different things (and you fell for it)
                the CPU isn't connected directly to your power socket, the whole system is, so this is the only number that matters
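
                if you do go the wall-meter route, turning a log of readings into average watts and energy is simple enough. minimal sketch, and the sample readings are invented:

                # Integrate wall-meter samples (seconds, watts) into average power and energy.
                # The readings below are invented; replace them with your own meter log.
                samples = [(0, 62), (600, 65), (1200, 180), (1800, 175), (2400, 60)]

                energy_wh = 0.0
                for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
                    energy_wh += (w0 + w1) / 2 * (t1 - t0) / 3600  # trapezoid, W*s -> Wh

                duration_h = (samples[-1][0] - samples[0][0]) / 3600
                print(f"average draw: {energy_wh / duration_h:.1f} W over {duration_h:.2f} h "
                      f"({energy_wh:.1f} Wh)")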

              • 2 weeks ago
                Anonymous

                >goes up to 200W stress testing
                Yeah that sensor is "incorrect". Schizo.

              • 2 weeks ago
                Anonymous

                He is right though. You can only trust the power outlet.
                That is the complete picture.

              • 2 weeks ago
                Anonymous

                you can trust the power outlet but literally the same chip can give you different results when using an x670 extreme mobo vs a b650m barebones mobo, so the comparison is really meaningless

              • 2 weeks ago
                Anonymous

                Counter example to your argument is in

                >both have mobos with pcie5 and pcie4 for both the 16x gpu slot and the 4x ssd slot so the opportunity to muddy the water is still present
                I think that's probably not as bad as you make it out to be.
                If the motherboard used is PCI-E 5.0 then it will use more power, but you also get more performance.
                So that's actually not unfair.

                What's unfair is if, for example, the high-end Intel 6-core processors back in the day only had one high-end chipset, that chipset used way, way more power than the competition's chipset, and you hid that information. Then it's your methodology that is muddying the waters.

                There are flaws to your approach as well, it's not the perfect one you think it is.

                They are both not perfect.

                But I prefer the outlet method because I don't trust israelitetel when they are behind in performance and might be seeking desperate measures.

              • 2 weeks ago
                Anonymous

                the sensor isn't necessarily "incorrect" but amd and intel are measuring different things
                when amd says "package power" they mean the cores, memory controller, IF, etc
                when intel says "package power" they mean the cores and nothing else
                the number isn't incorrect but a comparison between them is

              • 2 weeks ago
                Anonymous

                I guess the way Tech Jesus does his power testing is probably the best we can do

              • 2 weeks ago
                Anonymous

                >both have mobos with pcie5 and pcie4 for both the 16x gpu slot and the 4x ssd slot so the opportunity to muddy the water is still present
                I think that's probably not as bad as you make it out to be.
                If the motherboard used is PCI-E 5.0 then it will use more power, but you also get more performance.
                So that's actually not unfair.

                What's unfair is if, for example, the high-end Intel 6-core processors back in the day only had one high-end chipset, that chipset used way, way more power than the competition's chipset, and you hid that information. Then it's your methodology that is muddying the waters.

                There are flaws to your approach as well, it's not the perfect one you think it is.

              • 2 weeks ago
                Anonymous

                there's no other way to accurately test idle power consumption
                software sensors are obviously inaccurate because manufacturers are measuring different things (and you fell for it)
                the CPU isn't connected directly to your power socket, the whole system is, so this is the only number that matters

                btw, they used an MSI X670E Tomahawk to test the 7800X3D
                this board has PCIe 5.0 and dual chipsets, so this is a worst-case scenario for AMD
                no one is muddying the waters here other than intel

        • 2 weeks ago
          Anonymous

          >400w intel vs 80w amd
          LMAO

          • 1 week ago
            Anonymous

            >lying
            Copium

            >have to turn on ac so intel system doesnt overheat
            so 500-2000w more than amd

            I don't have the air con on often, but the GPU pulling 350W has something to do with that...

  65. 2 weeks ago
    Anonymous

    Will one even benefit from an X3D if they play in 4k?

    • 2 weeks ago
      Anonymous

      I doubt it.

      But in ten years, if you upgrade to the latest graphics card available then, the X3D will show a difference compared to other CPUs like the 5800X without the cache.

      • 2 weeks ago
        Anonymous

        will probably keep my cpu as is then
        planning on making the leap to 4k if next gen cards are good

  66. 2 weeks ago
    Anonymous

    Please do the needful Saar

    My 13900k is better

  67. 2 weeks ago
    Anonymous

    i bought one of these and a rtx4080. but i have literally nothing to play. all the big studio titles with great graphics fricking suck.

    • 2 weeks ago
      Anonymous

      This is why I'm getting the 5700X3D instead.
      There will be less buyer's remorse when I realise I hate all the modern slop.

  68. 2 weeks ago
    Anonymous

    worth upgrading to from a 5800X or just wait?

    • 2 weeks ago
      Anonymous

      What GPU and res?

      • 2 weeks ago
        Anonymous

        4080 1440p

    • 2 weeks ago
      Anonymous

      wait

  69. 2 weeks ago
    Anonymous

    What kind of chip you got in that thing, a dorito?

  70. 2 weeks ago
    Anonymous

    >AMD finally decides to make CPUs with iGPUs
    >Can play basically anything if you get a X500G

  71. 2 weeks ago
    Anonymous

    >blocks your path

    • 2 weeks ago
      Anonymous

      I bet the 7800X3D beats it in multicore performance

      • 2 weeks ago
        Anonymous

        The day the Ryzen 8 core chips with V-cache become obsolete, I bet that will be the ideal time to move over to Threadripper.

      • 2 weeks ago
        Anonymous

        it does, Cinebench R15 - 2459 points for the 1920X and 2966 for the 7800X3D

      • 2 weeks ago
        Anonymous

        Easily. I was just shitposting. The newer Ryzen chips are amazing and absolutely blow this thing out of the water.
        I bought one at half price 6 years ago; the only benefit is the amount of I/O thanks to the PCI-E lane count. I have around 10 USB 3.0 ports, 4 x16 PCI-E slots and 3 NVMe slots, it's crazy.

  72. 2 weeks ago
    Anonymous

    I don't understand why the 7950X3D is not doing better in games, since the comparatively poor raw performance should technically bottleneck it. I mean it does worse than an i5 13600K in benchmarks. Truly a magical chip.

    • 2 weeks ago
      Anonymous

      Shitty scheduling. You have to have game bar and chipset drivers installed, which normal people do not do.

      Review websites really don't like the Ryzen 5700x3D or something.

      I'm trying to find more than 2 power consumption charts of 5700x3D vs 5800x3D, but there are no more than this.

      If you care about power consumption then Intel is the way to go, unless you literally turn your PC off for 20 minutes when you have a break like an autist in this thread suggests he does.

      • 2 weeks ago
        Anonymous

        >Intel is the way to go
        I don't think so.

        Intel is gay and israeli anyway, so I'm not giving them anything.

        • 2 weeks ago
          Anonymous

          Once you factor in idle power consumption, it actually uses less power overall.
          And to each their own. I buy for price/performance.

          • 2 weeks ago
            Anonymous

            Nah you're not going to save anything by using 50% more power than AMD while the load is on.

            Unless you are a casual who only plays games 10 minutes per day.

            • 2 weeks ago
              Anonymous

              >play 2 hours and use 25-50W more on Intel
              >leave PC or post on Ganker for an hour
              >use 25-30W less on Intel
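
              Which nets out to pocket change either way. Rough arithmetic using those same deltas; the hours come from the lines above, and the electricity price is a placeholder:

              # Net daily energy difference from the numbers above, Intel relative to AMD.
              # Wattage deltas are from this post; hours and price per kWh are assumptions.
              gaming_hours, gaming_extra_w = 2, 50    # up to ~50 W more under load
              idle_hours, idle_saving_w = 1, 30       # ~30 W less while idling/browsing
              price_per_kwh = 0.15                    # placeholder electricity price

              net_wh_per_day = gaming_hours * gaming_extra_w - idle_hours * idle_saving_w
              yearly_kwh = net_wh_per_day * 365 / 1000
              print(f"net difference: {net_wh_per_day} Wh/day, {yearly_kwh:.1f} kWh/year, "
                    f"~{yearly_kwh * price_per_kwh:.2f} per year at the assumed rate")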

              • 2 weeks ago
                Anonymous

                >use 25-50W more on Intel
                The israelitetel is +600 watts

                >Intel is the way to go
                I don't think so.

                Intel is gay and israeli anyway, so I'm not giving them anything.

                It uses over 200 watts more during cyberpunk.

              • 2 weeks ago
                Anonymous

                >buying an i9 for gaming
                Ah yes, the classic AMD mantra.

              • 2 weeks ago
                Anonymous

                Yeah I'm not going to buy high end israelitetel, nor low end israelitetel.

              • 2 weeks ago
                Anonymous

                >have to turn on ac so intel system doesnt overheat
                so 500-2000w more than amd

        • 2 weeks ago
          Anonymous

          >cares about power consumption
          >has a GPU that pulls 350W
          You homosexuals always cease to amaze me.

          • 2 weeks ago
            Anonymous

            >has a GPU that pulls 350W
            I don't have that though. They are too expensive.

            • 2 weeks ago
              Anonymous

              >buys high end CPU with his mid-range GPU
              lol

              • 2 weeks ago
                Anonymous

                The 5700X3D is actually cheap and can last 10+ years easily.
                You think it's high end, but it's actually a cheap CPU. Isn't that a good thing?

                Or do you want me to buy a crappier CPU?

              • 2 weeks ago
                Anonymous

                >5700X3D will last 10 years

              • 2 weeks ago
                Anonymous

                The i5 2500K lasted 10+ years, didn't it?

              • 2 weeks ago
                Anonymous

                It lasted 5 years tops

              • 2 weeks ago
                Anonymous

                Doubt it. israelitetel kept cucking you at 4 cores, and demanded +1000 dollars for 6 cores.

              • 2 weeks ago
                Anonymous

                >sub 60 fps in BF1 multiplayer
                It was over by 2016 my guy.

              • 2 weeks ago
                Anonymous

                I don't play homosexual games.

              • 2 weeks ago
                Anonymous

                >nooo not that game
                You sound like those Linux homosexuals

              • 2 weeks ago
                Anonymous

                Modern gaming is goyslop, drip fed to you by moronic women and trannies.

              • 2 weeks ago
                Anonymous

                You're just being convinced of this so you don't hold the ethically bankrupt to account

              • 2 weeks ago
                Anonymous

                Those trannies and women ARE the ethically bankrupt.

              • 2 weeks ago
                Anonymous

                I have a long backlog of sovful RPCS3 games that I will take 10 years to finish
                Whereas all you have are troonyfied slops like starfield to look forward to
                Perhaps you shouldn't act so smug, you haven't realized that you became the slop consoomer you made so much fun of before

      • 2 weeks ago
        Anonymous

        >game bar
        qrd?

        • 2 weeks ago
          Anonymous

          AMD didn't ship a scheduler of its own and relies on the chipset drivers to do it. The chipset driver relies on Game Bar to determine whether a game is open and schedule it correctly. Kinda smart, but they also don't know their audience: gamers, who don't install bloat and who disable Game Bar the moment they install Windows.

          • 2 weeks ago
            Anonymous

            ty, maybe time to install the chipset driver :>

    • 1 week ago
      Anonymous

      Shitty scheduling. You have to have game bar and chipset drivers installed, which normal people do not do.
      [...]
      If you care about power consumption then Intel is the way to go, unless you literally turn your PC off for 20 minutes when you have a break like an autist in this thread suggests he does.

      Process Lasso exists
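
      Or you can pin the game to the cache cores by hand with the psutil library. A rough sketch below; the core list and the process name are assumptions for a dual-CCD X3D chip, not anything AMD documents, so check your own topology first:

      # Rough equivalent of what Process Lasso / the chipset driver does:
      # pin a game's threads to the V-cache CCD so they stay near the big L3.
      # Assumes cores 0-7 are the cache CCD and the game is "game.exe" (both hypothetical).
      import psutil

      CACHE_CCD_CORES = list(range(0, 8))   # assumption: first CCD holds the 3D cache
      GAME_EXE = "game.exe"                 # hypothetical process name

      for proc in psutil.process_iter(["name"]):
          if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
              proc.cpu_affinity(CACHE_CCD_CORES)
              print(f"pinned PID {proc.pid} to cores {CACHE_CCD_CORES}")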

  73. 2 weeks ago
    Anonymous

    3770K and a 1080 Ti in a mini-ITX.
    All I'll ever need, all I'll ever want.

  74. 2 weeks ago
    Anonymous

    Review websites really don't like the Ryzen 5700x3D or something.

    I'm trying to find more than 2 power consumption charts of 5700x3D vs 5800x3D, but there are no more than this.

  75. 2 weeks ago
    Anonymous

    Any external GPUs I can use at my desk with my mini PC?

  76. 2 weeks ago
    Anonymous

    >AMD CPU
    >Intel GPU
    >Linux OS
    Why is this working better for me than Windows did, at a quarter of the price? I could buy 2 systems like this and still have enough for hookers and blow before I could afford a system with an Intel CPU and Nvidia GPU on Windows 11 that was more powerful than what I'm rocking, and even then its power usage would be almost double.
    Shit's moronic if I can do this. And this is at 4K and a stable 120fps without the cheap crutch of upscaling. Yes, this does depend on how optimized the game is, but still.

    • 1 week ago
      Anonymous

      >intel gpu
      Wtf why? Aren't they overpriced and run like shit? I thought they were memes.

  77. 2 weeks ago
    Anonymous

    bananas

  78. 2 weeks ago
    Anonymous

    AND SHIT

  79. 2 weeks ago
    Anonymous

    yes

  80. 2 weeks ago
    Anonymous

    >all this over $8 a year on your power bill
    I know times are tough but should you be buying a new PC to begin with?

  81. 2 weeks ago
    Anonymous

    >ayymd
    No I'm not brown.

  82. 2 weeks ago
    Anonymous

    But how does this new god of gaming, the 7800X3D, run Minecraft? It's one of the few games that is heavily CPU-bound most of the time, yet none of the hardware benchmarking sites or channels actually test it.

    • 2 weeks ago
      Anonymous

      You have to be over 18 to post here

    • 2 weeks ago
      Anonymous

      The 3D cache helps in all games, just some more than others.

  83. 1 week ago
    Anonymous

    I don't think I'm ever going to build another gaming PC. The prices are exorbitant and I don't remember the last AAA slop I enjoyed.
    I don't need a small nuclear reactor to play Slay the Spire and shitpost here.

  84. 1 week ago
    Anonymous

    3d cache really is all powerful

  85. 1 week ago
    Anonymous

    >50% of your games will run better than on Intel
    >50% fewer cores for multitasking and productivity
    >costs more than the XX700 variant
    >AMD jank
    Yeah, nah.

    • 1 week ago
      Anonymous

      Jewtel is behind in power comsumption. +200 watts.

      MOAR COARs won't help you.

      • 1 week ago
        Anonymous

        280W during gaming?
        Idle is 5-8W vs AMD 30W
        AMD kool-aid lmao

        • 1 week ago
          Anonymous

          Jewtel s +200 watts more power consumping

          >Intel is the way to go
          I don't think so.

          Intel is gay and israeli anyway, so I'm not giving them anything.

  86. 1 week ago
    Anonymous

    As a current 3600 owner what CPU do I upgrade to? And no I'm not getting an AM5 Motherboard

    • 1 week ago
      Anonymous

      5800x3d. literally the goat choice if you're sticking with AM4.

  87. 1 week ago
    Anonymous

    Is MPO still broken on Windows that you need to disable it for Nvidia GPUs???????????

  88. 1 week ago
    Anonymous

    Why doesn't Intel do 3D cache

    • 1 week ago
      Anonymous

      The cache takes up real estate on the processor die.
      Jewtel chose to fill up that area with Moar Cores rather than Moar Cache.

      • 1 week ago
        Anonymous

        I'm talking about the vertically stacked cache layer.

  89. 1 week ago
    Anonymous

    Intel's strategy, it turns out, depends on moving to DDR6 as fast as possible and hoping the faster memory makes up for the lack of cache.

    AMD's strategy proved to be beneficial no matter which RAM type you use; DDR4, DDR5, or DDR6 doesn't matter.
    This means AMD has massive advantages from the high end of their AM5 platform all the way down to the affordable stuff on the AM4 platform.

    • 1 week ago
      Anonymous

      It doesn't matter if you're gaming at 4K

      • 1 week ago
        Anonymous

        yeah, I keep forgetting to turn on XMP each time I update the BIOS and it reverts to 3600 MT/s.
        Don't notice any difference in FPS or performance when I turn it on.

  90. 1 week ago
    Anonymous

    True. Very satisfied with mine.

  91. 1 week ago
    Anonymous

    >have 3700x i bought five years ago
    >sometimes get the temptation to upgrade
    >remember i don't play new games

  92. 1 week ago
    Anonymous

    I have this.
    It runs my games just fine but I'm not a real techy person.

    • 1 week ago
      Anonymous

      It's fine, about the same as the PS5 CPU.
