AMD or Intel for gaming and why?

  1. 3 months ago
    Anonymous

    AMD for home
    Intel for work

    • 3 months ago
      Anonymous

      >You can game just fine with Intel
      >You can't work on something professional with AMD
      >AMD is king for gaming.
      >Intel is king for workstations.

      ROFL shitel is shit at everything.
      Threadripper is way better for workstations and hedt.

    • 3 months ago
      Anonymous

      >intel with its massive security issues for work

  2. 3 months ago
    Anonymous

    Depends on the game. If a game doesn't utilize all cores, then core speed is crucial, which means Intel.

  3. 3 months ago
    Anonymous

    You can game just fine with Intel
    You can't work on something professional with AMD

    • 3 months ago
      Anonymous

      >You can game just fine with Intel
      >at $200 higher price than an equivalent AMD
      >at double the power consumption of the equivalent AMD
      >with higher temperatures that cause more throttling compared to the equivalent AMD

      I added addendums to your post to correctly explain the situation
      pic 1/2

      • 3 months ago
        Anonymous

        pic 2/2
        sorry, I need to correct myself, I should have said OVER THREE TIMES THE POWER CONSUMPTION of the equivalent AMD, my mistake

        • 3 months ago
          Anonymous

          >295
          Waffle makers by Intel when?

        • 3 months ago
          Anonymous

          Imagine living in a shithole without almost free power.

          • 3 months ago
            Anonymous

            Is your PC supplied directly by pic related? It better be if you're going high end Intel+Nvidia

            • 3 months ago
              Anonymous

              I'm on a grid fed by one. My Intel CPU idles at like 10w and barely gets to 90w while gaming (because only the poors use CPU heavily for gaming). My GPU consumes only a modest 450w, though.

              • 3 months ago
                Anonymous

                >My GPU consumes only a modest 450w
                Intel or AMD?

              • 3 months ago
                Anonymous

                GPU. Neither of those companies make GPUs worth mentioning.

          • 3 months ago
            Anonymous

            power consumption means heat: a cpu turns essentially all of the electrical energy it draws into heat, it's not doing anything else with those huge amounts of energy. more power consumed -> more heat generated, and that heat comes from switching and leakage losses in the transistors plus resistance in the power delivery, not from anything useful you get to keep. a more energy efficient cpu means the manufacturer can squeeze more performance out of the same thermal budget. even if your energy were 100% free, you'd still prefer the more efficient cpu, because it would simply be better as long as it's in the same performance bracket
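
            A quick sanity check on the watts-to-heat point (a rough sketch only; the wattage figures below are placeholders, not measurements of any specific chip):

              # every watt a CPU package draws ends up as heat in the room
              # 1 watt = ~3.412 BTU/hr (standard conversion)
              def heat_btu_per_hr(watts):
                  return watts * 3.412

              for label, watts in [("efficient CPU (example)", 90), ("hot CPU (example)", 300)]:
                  print(f"{label}: {watts} W -> ~{heat_btu_per_hr(watts):.0f} BTU/hr of heat")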

            • 3 months ago
              Anonymous

              Yet there's no application where I actually use my CPU heavily in 2024.

      • 3 months ago
        Anonymous

        >more expensive
        >3x the power draw
        >less performance
        Damn it Intel. it's over. Pack it up. Go home.

  4. 3 months ago
    Anonymous

    AMD is king for gaming.
    Intel is king for workstations.
    Intel can also be used for gaming but you have to put way more money into it to get performance equal to that of a Ryzen. Just look at the 14900KS.
    >400W or even more
    >over 600 bucks
    >for gaming performance equal to that of a 7800x3D
    >which consumes less than 90W
    >and only costs around 350 bucks

    That, and AM4/AM5 are long-lived, cheap platforms, while keeping up with Intel is generally more expensive since you're constantly upgrading parts.
    Well, that and Intel also runs way hotter, so I guess you might have to spend less on your heaters next winter.
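
    Back-of-the-envelope math on that price/power gap (a sketch; the wattages and prices are the ones quoted above, while the gaming hours and electricity rate are assumptions you should swap for your own):

      # rough one-year cost sketch using the figures quoted above
      HOURS_PER_DAY = 3        # assumed gaming hours per day
      PRICE_PER_KWH = 0.30     # assumed electricity price in $/kWh

      def yearly_power_cost(watts):
          kwh = watts / 1000 * HOURS_PER_DAY * 365
          return kwh * PRICE_PER_KWH

      for name, price, watts in [("14900KS (per the post)", 600, 400),
                                 ("7800X3D (per the post)", 350, 90)]:
          total = price + yearly_power_cost(watts)
          print(f"{name}: ~${total:.0f} after one year of gaming")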

    • 3 months ago
      Anonymous

      >AMD is king for gaming.
      >Intel is king for workstations.
      isn't that the other way around ?

      • 3 months ago
        Anonymous

        Used to be the case when AMD offered more cores and higher clock speeds than Intel.
        But when Intel started putting in estrogen cores (E-cores), it fricked up their gaming performance, but at the same time it improved their workstation performance

        • 3 months ago
          Anonymous

          power efficiency is king in workstation shit
          why would you settle for intel if you're actually maxing that shit

          • 3 months ago
            Anonymous

            If you have a really strong cooling solution it probably doesn't matter all that much (unless you care about your bills). If you don't take power efficiency into account, Intel is better in that regard.
            I have a 5600, just to be clear.

            • 3 months ago
              Anonymous

              yeah but why wouldn't you care about that
              you're paying more for air conditioning and power just to have a chip from a blue brand?

              • 3 months ago
                Anonymous

                I know, I'd care - some don't, they want or even need the BEST of the BEST.
                It's up to you, really.

                >how did they do it?

                Their CPU architecture seems to be giving them headaches again, and they'll plateau until they get to a smaller lithography.

              • 3 months ago
                Anonymous

                yeah but this isn't something like 20% more power efficiency difference that you can just dismiss
                the gap is unbelievably fricking massive for a TWO PERCENT increase

              • 3 months ago
                Anonymous

                These threads just expose how Ganker is filled with paid pajeet marketing shills because nobody in the real world has this irrational hate for AMD like you see here outside of UserBenchmarks.

                This image right here destroys the Intel vs AMD argument. It's settled. AMD won. Intel has nothing on the horizon to answer back with, and AMD is chasing Intel out of the server space with Threadripper/EPYC. Even Grace Hopper can't compete. It's both slower and doesn't have x86 compatibility.

              • 3 months ago
                Anonymous

                >Power consumption
                >meaning frick all
                lmao; each CPU has their specialties and use case. you call out 'shills' but totally fricking write like one

              • 3 months ago
                Anonymous

                it means a lot if you want to use it to regularly make money on anything higher than an occasional hobbyist level
                you want to tell the boss that you'll be doubling to tripling the power draw in a specific category because you're sucking off intel?

              • 3 months ago
                Anonymous

                There's more info in that image than just 'power consumption.'
                When you need 350w on top of your competitor for a 1.1% performance uplift then your architecture is bad. You're doing it wrong.

    • 3 months ago
      Anonymous

      >consumes less than 90W
      moron

      • 3 months ago
        Anonymous

        Is google too hard for you?
        Stock, in games it doesn't really go over 60W, at least according to TechPowerUp.
        It can use more power, but only under specific loads like AVX or if you overclock/disable power limits.

    • 3 months ago
      Anonymous

      AMD for high end gaming machines, Intel for low end PCs and productivity machines. 12400f destroys AM4 on performance and power efficiency, and it especially destroys them at memory latency. Also Meteor Lake will be re-released in the future as budget CPUs as Intel moves to a smaller process node for their next gen lineup, so sticking to the ultra cheap 12100f is still worth it now as there'll be a good budget upgrade path in the future. Plus Intel APO is coming to 12th gen CPUs too.

      For high end gaming machines, AMD is pretty much the only next gen option. Not worth it for budget builds because DDR5 is still shit. Furthermore, 7500f is pretty cheap if you ever manage to find one, but it still suffers from the typical AMD memory latency problem and performs the same as 12400f when memory latency becomes an issue. 7800X3D is a monster, but it's expensive and the money could go to your GPU instead. Not worth buying unless you have a 4070 Super at bare minimum.

      AM4 is trash man. Not worth buying anymore. Maybe if you got a *really* good deal on 5600X3D or 5700X3D it's worth it, but otherwise it's outdated trash with PCIE 4.0 at best.

      • 3 months ago
        Anonymous

        I am not saying that people should still buy AM4, but AM4 shows that AMD is willing to support a platform for a very long time. Someone could have started with a Ryzen 3600 only to later upgrade to a 5800X3D without switching the motherboard.
        Intel meanwhile drops support way faster.

        • 3 months ago
          Anonymous

          When I needed to upgrade my CPU, I went for a 10400F because that was the best value at the time. Now all my friends with AM4 motherboards upgraded to 5600s and 5800X3Ds, while I'm stuck with this.
          One of them went from a 1600 to a 5800X3D, the dream upgrade: he tripled his CPU performance (or more in some games) for like €350.

          I regret my choice to be honest, I underestimated how useful their long-term support would be.

          • 3 months ago
            Anonymous

            This. Intel fanboys really downplay how good long term socket support is. Using arguments like "well you don't need to upgrade your cpu until like 5 years after"... except if you bought a R5 1600 in 2017 you could upgrade to the R7 5800X3D in 2022. While if you bought an i5 7400 in 2017 you are SOL.

            • 3 months ago
              Anonymous

              >upgrading to a $300 high end CPU while reusing the same crappy weak VRM motherboard that needs BIOS update
              I wouldn't want to be that dumb.

              • 3 months ago
                Anonymous

                wtf are you talking about, B350s and 370s support X3Ds without an issue.

                >inb4 muh PCI 4.0
                There have been so many tests to see if it matters, it doesn't. The 4090 barely loses performance between the two of them.

            • 3 months ago
              Anonymous

              I upgrade my entire PC every 2.5 years because I'm not a poorgay.

      • 3 months ago
        Anonymous

        >pcie 4.0 at best
        Black person that shit is irrelevant. pci3 would be more than fine for damn near everything, let alone 4.

        • 3 months ago
          Anonymous

          Yeah, pci gen bottlenecking on modern motherboards is a total scam.

          • 3 months ago
            Anonymous

            I'm assuming that all the non 4090's are capped at their (relative) 100% from a 4.0/5.0 slot?

            • 3 months ago
              Anonymous

              Not sure.

              • 3 months ago
                Anonymous

                ultimately, still negligible performance loss from 4 > 3, marginal at 4/3 > 2, and notable at fricking 1.1, but realistically you're not getting a fricking 1.1 slot nowadays.
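
                For reference, the raw bandwidth behind that 4 > 3 > 2 ranking; a quick sketch using the standard per-lane effective rates (after encoding overhead), multiplied out for a x16 GPU slot:

                  # approximate effective bandwidth per PCIe lane, in GB/s
                  # (gen 1.1/2.0 use 8b/10b encoding, gen 3.0+ use 128b/130b)
                  PER_LANE_GBPS = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

                  for gen, per_lane in PER_LANE_GBPS.items():
                      print(f"PCIe {gen} x16: ~{per_lane * 16:.1f} GB/s")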

  5. 3 months ago
    Anonymous

    intel destroys amd on emulation so I picked that

    • 3 months ago
      Anonymous

      Maybe if we're literally only talking about PS2 emulation, and even that's probably not down to actual performance and more down to developer incompetence.

      • 3 months ago
        Anonymous

        cope. rpcs3 runs way better on intel than on amd on similar price cpus

        • 3 months ago
          Anonymous

          Considering you can run rpcs3 on an Athlon, I strongly doubt that is the case.

          • 3 months ago
            Anonymous

            >I strongly doubt that is the case
            just look up test videos on youtube. i've bought an i3 12100f and it only has 4 cores. yet it runs gt6 at around 30-40fps, meanwhile the ryzen 3600, a same-budget CPU, can't even reach 30 despite having 6 cores. heck it even loses in pc games because amd's IPC is trash.

            • 3 months ago
              Anonymous

              >just look up test videos on youtube
              Or I could just run it on my athlon.
              You doughnut
              What next, you're gonna tell me to check out userbenchmark?

              • 3 months ago
                Anonymous

                >athlon
                i had fx6300 and it was pretty shit for rpcs3, only a few smaller games ran at full speed. you're coping.

              • 3 months ago
                Anonymous

                >you're coping.
                Black person I have my tiny athlon PC on my desk, and you're trying to convince me that it's using some kind of shamanic ritual to trick me into believing it's working.

            • 3 months ago
              Anonymous

              I guess my 5700x shits on both of those so I'll be fine
              Thanks buddy. Enjoy your budget build and have fun

  6. 3 months ago
    Anonymous

    Why does the 7800X3D get so much hype?

    • 3 months ago
      Anonymous

      Because it's overkill yet affordable and absolutely stomps the shit out of both Intel and other more expensive AMD CPUs in most cases.
      The 7800X3D is so good it negatively affects AMD's topline.

    • 3 months ago
      Anonymous

      Because it's the best overall for games, less than $400, and has a sub 90 watt draw at peak

    • 3 months ago
      Anonymous

      It's the 1080ti of CPUs

    • 3 months ago
      Anonymous

      Best bang for buck gaming chip in a long time.

    • 3 months ago
      Anonymous

      the new i5 2500k

    • 3 months ago
      Anonymous

      why did no one warn me this chip runs so hot. I understand now that its normal according to amd but it freaked me out. I'm like a caveman cowering at fire with this thing

      • 3 months ago
        Anonymous

        Modern CPUs are designed to run as hot as possible to win technically-not-overclocking-but-very-expensive-cooling setups. An AM5 going up to 95c or whatever at the junction point gets them the "performance crown", even though they should be run at the top eco mode to not ramp your fans constantly. You don't even lose performance doing that on a 7800x outside of max-load all-core benchmarks because it's moronically power efficient.

        • 3 months ago
          Anonymous

          >to win
          *journo benchmarks with

      • 3 months ago
        Anonymous

        Are you sure you aren't just using some version of speccy from 2002 like every giant moron on the internet

        • 3 months ago
          Anonymous

          no I'm not sure about that, I'm a massive moron.

          • 3 months ago
            Anonymous

            If your temperature monitor says the CPU is running hotter than the universe one yoctosecond after the big bang, it might not have the right settings for that chipset

        • 3 months ago
          Anonymous

          >Are you sure you aren't just using some version of speccy from 2002 like every giant moron on the internet
          That's just this stupid shithole board.

  7. 3 months ago
    Anonymous

    >depends on the model
    >depends on the series
    >depends on the software
    >depends on the other hardware used

    Every generalization is made by morons or fanboys(morons)

  8. 3 months ago
    Anonymous

    How is every single fricking reply so wrong holy shit
    There is no fricking reason that you'd want a space heater if you're doing professional work for prolonged periods of time
    Maybe on a small scale sure but if it's the main reason you're buying then frick no, that extra heat isn't worth it
    Intel sucks at emulation ever since they killed avx512 while AMD has come up with a good solution that doesn't ignite the CPU
    And games don't care about pure raw clocks since 7800X3D beats the 14900k on average
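
    If you want to see what your own chip exposes, here's a minimal sketch (Linux only, it just reads /proc/cpuinfo; on recent Intel consumer parts you'd expect the avx512 flags to be missing, on Zen 4 you'd expect avx512f and friends to show up):

      # minimal check of CPU feature flags via /proc/cpuinfo (Linux)
      def cpu_flags():
          with open("/proc/cpuinfo") as f:
              for line in f:
                  if line.startswith("flags"):
                      return set(line.split(":", 1)[1].split())
          return set()

      flags = cpu_flags()
      for feature in ("avx2", "avx512f", "avx512vl"):
          print(f"{feature}: {'yes' if feature in flags else 'no'}")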

  9. 3 months ago
    Anonymous

    AMD for gaming
    Intel for gaming+rendering+machine learning+CAD etc

  10. 3 months ago
    Anonymous

    the one with more L2 cache

  11. 3 months ago
    Anonymous

    >for gaming
    >yea but professional ???
    can't people read or something?

    • 3 months ago
      Anonymous

      AMD is better for work because if you need HEDT, Intel has none.

  12. 3 months ago
    Anonymous

    OP asked which CPU is better for gaming. Not which CPU is best for your blender and adobe shit.
    In that case the only answer is AMD, unless you like spending more money on chips and electricity bills than needed just to achieve performance equal to a Ryzen from a generation ago. Nobody who is building their own PC is buying Intel right now - the top 10 bestselling CPUs for PC builders are all Ryzen CPUs. And for a good reason.

    • 3 months ago
      Anonymous

      >And that for a good reason.
      nobody who is building their own pc has a job right now

      • 3 months ago
        Anonymous

        Pretty sure NEETs aren't spending two months' pay on a 7800x3D anon

  13. 3 months ago
    Anonymous

    What happened to Intel? Why did they get lapped so hard?

    • 3 months ago
      Anonymous

      They've been coasting off the success of core2duo all this time while AMD had to invent Ryzen to not die

      • 3 months ago
        Anonymous

        >Ryzen

        Why was Ryzen so special?

        • 3 months ago
          Anonymous

          AMD brought back Jim Keller who actually knew what he was doing, compared to the mongoloids that designed bulldozer. It also helped that Intel were using their own fabs and got stuck on 14nm for years (their 10nm was just a refinement of 14, not an actual node shrink).

          • 3 months ago
            Anonymous

            14nm++++++++++++++++

            • 3 months ago
              Anonymous

              Now on 10nm+++++
              >20A will fix it

              • 3 months ago
                Anonymous

                and yet still keeps up with TSMC 5nm. It's incredible how AMD still lags behind Intel's own shitty fabs, same goes for Nvidia where they also get btfo despite Nvidia using TSMC as well.

              • 3 months ago
                Anonymous

                >and yet still keeps up with TSMC 5nm.
                No? Intel 7/10nm is less dense than TSMC 7+ and later nodes. Intel 4 is behind TSMC N5. Intel 3 and 20A are imaginary.
                >It's incredible how AMD still lags behind Intel's own shitty fabs
                Are you not aware that intel is using TSMC to fab most of their chip tiles?

    • 3 months ago
      Anonymous

      AMD invested in a new platform with Ryzen and the bet paid off. Intel meanwhile are still rocking the same old shit from over 10 years ago, just constantly increasing the temperature limits, power draw and clock speeds. Intel are also working on a new platform, but just like Ryzen it will probably take at least a generation before everything is optimized for it.
      In short: they sat on their ass for too long.

      • 3 months ago
        Anonymous

        Wrong. AMD has the advantage of having their CPUs manufactured by TSMC, which is the biggest fab with the smallest process nodes in the world. AMD is a fabless company. Meanwhile Intel uses their own fabs, which didn't benefit from government bucks unlike TSMC, until the last few years. They couldn't match TSMC's speed at upgrading their fabrication process so they did architecture and software upgrades instead. AMD will be utterly fricked when pooh invades taiwan though. They'll end up borrowing intel fabs one day.

        • 3 months ago
          Anonymous

          Lmao Intel is still behind tsmc's nodes, and even uses them for their I/O. Even after getting their CHIPS gibs they're still demanding several more $billion.

        • 3 months ago
          Anonymous

          >AMD will be utterly fricked when pooh invades taiwan though.
          Everyone would be fricked. Chip shortages would drive the prices insanely high.

          • 3 months ago
            Anonymous

            isn't that why various countries started (re)building their own fabs after the last shortage?

            • 3 months ago
              Anonymous

              And how many of those are standing and how much of the total production do they have? Don't forget that Taiwan's manufacture is cheaper too. I would expect at least 30% price increase. Data source for my estimate you ask? It once appeared to me in a dream.

              • 3 months ago
                Anonymous

                from what vague information I read years ago in passing, I recall it takes a long fricking time to make a fab anyway (something in the range of like 5-7 years or some shit? I dunno), so probably 0.

    • 3 months ago
      Anonymous

      It's all bs to control the market and make it look like there's no monopoly, the creator of the Ryzen architecture is an Intel engineer, he worked for Intel, went to AMD, created Ryzen, and then went back to Intel. Look it up. The AMD team did good on maturing the tech of course can't deny that. Same shit for AMD and Nvidia, they've been colluding with prices a few times. These tech companies want you think they are in competition but in reality, they all work together behind the scenes.

      • 3 months ago
        Anonymous

        >It's all bs to control the market and make it look like there's no monopoly, the creator of the Ryzen architecture is an Intel engineer, he worked for Intel, went to AMD, created Ryzen, and then went back to Intel. Look it up.
        What are you talking about?

        • 3 months ago
          Anonymous

          He's a moron that doesn't know what a corporate work force looks like. Companies in the same industry have near constant cross contamination of labor. That's why non-compete clauses are huge. 90% of corporate espionage comes from some dude taking files from his old company to the new one.

    • 3 months ago
      Anonymous

      They got lazy when AMD couldn't compete and made the same chips for 6 generations.

      • 3 months ago
        Anonymous

        kek look at this and the other homosexual thinks I'm living in the past for holding onto the 4790k for so long. Guy knows fukol and wants to comment about how he wastes money to coooonsssuummmee new hardware like the fricking moron he is.

        • 3 months ago
          Anonymous

          >fukol
          wat

          • 3 months ago
            Anonymous

            Ah sorry friend, I mean fokol. 2nd language.

            • 3 months ago
              Anonymous

              it's frickall

              • 3 months ago
                Anonymous

                >“Fokol” in Afrikaans essentially implies a sense of nothing, of having nothing or being nothing. While the word “Niks” is the direct translation of “nothing” in Afrikaans, “Fokol” is a commonly used Afrikaans slang word.

              • 3 months ago
                Anonymous

                That's obviously a loanword you dumb idiot poopface

              • 3 months ago
                Anonymous
      • 3 months ago
        Anonymous

        Meanwhile with AMD, the AMD video drivers are CPU heavy causing a limit to occur with the 7800X3D.

        The absolute state of AMDrones.

        • 3 months ago
          Anonymous

          >nvidiots care about fortnite
          your opinion has been discarded

          • 3 months ago
            Anonymous

            >NO NOT LIKE THAT
            everytime, kek

            • 3 months ago
              Anonymous

              yes
              >You will immediately cease and not continue to access the site if you are under the age of 18.
              no one who cares about fortnite in any capacity meets this rule
              or you're a manchild or pedophile

        • 3 months ago
          Anonymous

          >CPU thread
          >posts GPU benchmarks
          >compares a $1000 GPU against a $2000 GPU
          >fortnite
          Ganker was a mistake

          • 3 months ago
            Anonymous

            Absolute moron, those benchmarks showcase the CPU limit you stupid mongoloid Black person.

        • 3 months ago
          Anonymous

          OH N-

    • 3 months ago
      Anonymous

      Their foundry side is now totally inferior to Taiwan, generations behind at this point, and is dragging down the chip design side

  14. 3 months ago
    Anonymous

    >buying Intel
    >ever
    Buyers remorse much?

    • 3 months ago
      Anonymous

      >every time someone has problems running a game, they have an AMD
      >never once saw an intel user have the same problem

      • 3 months ago
        Anonymous

        New Intel chips have plenty of issues with older games because of the E-cores. Sims 3 for example outright won't work unless you disable E-cores or download a mod.
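
        If you don't want to flip E-cores off in the BIOS, pinning the game to the P-cores usually works too. A rough sketch with psutil (the process name is just a hypothetical example, and which logical CPUs are P-cores depends on your chip, so check your own topology first):

          # pin a running game to the first 16 logical CPUs (on an 8 P-core
          # Alder/Raptor Lake part those are typically the P-core threads)
          # requires: pip install psutil
          import psutil

          GAME_EXE = "TS3.exe"           # hypothetical process name
          P_CORE_CPUS = list(range(16))  # assumption: logical CPUs 0-15 are P-cores

          for proc in psutil.process_iter(["name"]):
              if proc.info["name"] == GAME_EXE:
                  proc.cpu_affinity(P_CORE_CPUS)
                  print(f"pinned PID {proc.pid} to CPUs {P_CORE_CPUS}")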

  15. 3 months ago
    Anonymous

    Don't know about AMD, but Intel now thermal throttles unless you have good CPU cooling.

  16. 3 months ago
    Anonymous

    How much would I save by going full AMD? I still have a 980 and 4790k.

    • 3 months ago
      Anonymous

      AMD requires (yes, requires) higher speed ram to do the same job that an Intel cpu does internally. Basically, it's taking the cost to do something and putting it elsewhere and people see that as some kind of discount. It doesn't help that the majority of buildgays have no idea what they're doing and see no issue with buying ram with higher speed than what they need anyway.

      • 3 months ago
        Anonymous

        This anon knows what he's talking about. Intel shit works out of the box. AMD needs the external shit to compensate, not to mention the software suite needed.

        i'm on win 10 and read it doesn't know how to use the p and e cores correctly because of the scheduler or something, but i haven't noticed any issues so far. it's fast though, 13700k

        https://www.intel.com/content/www/us/en/gaming/resources/how-hybrid-design-works.html

        Windows scheduler is so moronic Intel just said frick it and put that shit on the silicon.

        • 3 months ago
          Anonymous

          are there any tests that show the difference between the scheduler acting correctly vs win 10? i don't play many 'aaa' games but nothing besides AI or benchmark programs like cinebench seems to come close to using it all. in my usage i haven't noticed any issues. while it might technically be an issue, i'm not seeing it so far

      • 3 months ago
        Anonymous

        The last time RAM speed mattered so much was Zen 2. Now they are using a single CCX per CCD and the cache is big enough, so the inter-CCD latency (based on infinity fabric clock speed, which is tied to RAM clock speed) is not that big of a deal. With X3D CPUs, the cache is so massive RAM speed doesn't matter at all. See pic related.
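
        For reference, a rough sketch of how the clocks hang together on Ryzen (the 1:1 ratios and the ~2000 MHz Zen 4 fabric ceiling are the commonly cited sweet spots, not guarantees for every chip):

          # memory controller clock from the DDR transfer rate (DDR = double data rate)
          def mclk_mhz(ddr_rate_mts):
              return ddr_rate_mts / 2

          # commonly cited sweet spots (assumptions, silicon lottery applies):
          #   AM4 / Zen 3: FCLK runs 1:1 with MCLK, so DDR4-3600/3800 -> FCLK 1800/1900 MHz
          #   AM5 / Zen 4: UCLK runs 1:1 with MCLK, FCLK sits decoupled around ~2000 MHz
          for kit in (3600, 6000):
              print(f"DDR-{kit}: MCLK ~{mclk_mhz(kit):.0f} MHz")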

        RDNA3 is great, but the product naming fricks up the entire stack and it leaves a very bad impression. AMD shouldn't have gone with chiplets without letting them mature a little first, as well.

        >AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
        It's not something they could predict, they started working on it half a decade ago. RDNA2's "gimmick" ended up better than they imagined (ie, big cache but low memory bandwidth), RDNA3's (ie, chiplets) ended up worse.
        I agree, RDNA3 isn't that bad: it's around 30-40% more efficient than RDNA2, which isn't a bad gen-to-gen improvement. It's about average I'd say, if not slightly above average. But when the competition improves efficiency by 70-80% in a single generation... yeah, your "mere" 30% looks bad in comparison.

        >AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
        pretty sure chiplets cost them less than what a monolithic die would
        this amd generation was focused on increasing profit margins by lowering production costs, not pure power

        That was their plan, but complex packaging costs skyrocketed due to the AI boom, while silicon prices dropped. That's why RDNA4 is (according to rumors) going to be monolithic only, they scrapped their chiplet designs.
        The 4080 is probably a lot cheaper to produce than the 7900XTX by now.

        • 3 months ago
          Anonymous

          If they went with a naming convention that had the XTX as the 7900xt and the 7900xt as the 7800xt, and so on, and so forth down the stack, then the gen-on-gen uplift would have been mindbogglingly massive. The 7900xt sitting in between the 4080 and 4090 would have made nvidia's connector fire disasters look even more embarrassing. Instead the "7900xt" looked like a barely there uplift from the much, much cheaper 6950xt at launch. Imagine if they called the 7600 a 7400 instead? That would have been a massive improvement over the previous gen's bottom card.

          • 3 months ago
            Anonymous

            Well, in that case the 4080 is "actually" a XX70 die sold for Titan prices. Names are completely arbitrary, while prices mostly depend on what they believe customers are going to pay for their products.
            When I say "gen-to-gen improvement", I'm comparing die size and power consumption. In terms of performance-per-dollar, Lovelace is only 20-30% better than Ampere... at best, if you compare the 3080 with the 4070 (ie, the best Lovelace card). With the 4060, price-performance didn't improve at all compared to the 3060.

            But the 4060 is a 160mm2 die, 115W. The 3060 is a 276mm2 die, 180W. That's a massive gen-to-gen improvement.
            The reason they cost the same is simple: people are buying it anyway, and Nvidia makes more money selling 4060s than 3060s for $300. Not that long ago, a graphics card with a die that small would be called "XX50" and sold for $150.
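
            Worked out, those die size and power figures look like this (a quick sketch using only the numbers quoted above):

              # percentage change sketch, 3060 -> 4060, per the figures in this post
              pairs = {
                  "die size (mm^2)": (276, 160),
                  "board power (W)": (180, 115),
              }
              for metric, (old, new) in pairs.items():
                  change = (new - old) / old * 100
                  print(f"{metric}: {old} -> {new} ({change:+.0f}%)")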

            • 3 months ago
              Anonymous

              >But the 4060 is a 160mm2 die, 115W. The 3060 is a 276mm2 die, 180W
              Smaller die. Less power draw. That's rather obvious. What's egregious is that the 4060 is actually 5% slower than the 3060. Lovelace's naming conventions hurt it, as well.

              Truth be told both brands have stellar hardware this gen. Across the board. it's them trying to sell weaker cards as a product two tiers up the stack that has everyone scratching their heads and calling shenanigans. Especially when they try to use that to justify the price.

              • 3 months ago
                Anonymous

                >4060 is actually 5% slower than the 3060
                cope, none of the benchmarks back it up

              • 3 months ago
                Anonymous

                fr fr on god that homie be sus with that ohio ahhh energy

              • 3 months ago
                Anonymous
              • 3 months ago
                Anonymous

                nice cherrypicking, homosexual
                now post averages across several games

              • 3 months ago
                Anonymous

                >i-it doesn't count
                a newer generation being worse than the older generation is not acceptable in any scenario, especially in games that are from companies like id that are known for having well-built games

              • 3 months ago
                Anonymous

                >posts the only two games where 3060 can leverage its 12 gig VRAM
                >i-it's 5% slower

              • 3 months ago
                Anonymous

                Got a counter argument with some facts to back up your claims, or are you just going to be a homosexual?

              • 3 months ago
                Anonymous

                sure, I'll just post regular benchmark instead of 2 games

              • 3 months ago
                Anonymous

                >le 5% face

              • 3 months ago
                Anonymous

                Yes, it's 5% slower in a few games. Has something like this ever happened before? The newer generation being slower in certain games?

              • 3 months ago
                Anonymous

                Overclocked 3060 12GB at 2GHz for the gpu and 9000 for vram is better than overclocked 4060 in all games and situations that require more than 8GB of vram.
                Which is most UE5 games released recently.
                4060 should be 4050 and cost under $200

              • 3 months ago
                Anonymous

                oh and the ti version is even worse

              • 3 months ago
                Anonymous

                jesus

              • 3 months ago
                Anonymous

                128 bit bus and only 8GB of vram is a nasty combo.
                Imagine losers that paid $400 for 4060ti..

              • 3 months ago
                Anonymous

                >Imagine losers that paid $400 for 4060ti..
                The 3070 wasn't that much better. Once the mining boom ended, both the 3070 and the 6800XT dropped to around $600. Guess which card sold better.

              • 3 months ago
                Anonymous

                >What's egregious is that the 4060 is actually 5% slower than the 3060.
                I don't know what you're talking about. 4060 gets at least 5-10 fps boost in most games. Maybe you're thinking of 3060 ti vs 4060 ti or something.

              • 3 months ago
                Anonymous

                >both brands have stellar hardware this gen
                >it's them trying to sell weaker cards as a product two tiers up
                Yep. And prices would've normalized by now... if it wasn't for the AI boom. Nvidia doesn't even need to sell their gaming cards to make money anymore, there are 4080s rotting on shelves since launch. Eventually someone is going to buy them, why would they care.

                From 2017-2018 to 2023-2024, we went through two mining booms and an AI boom.

    • 3 months ago
      Anonymous

      depends on a price bracket and use case scenarios. certain games benefit MASSIVELY from -x3d cache thingy, certain games run marginally better on intels, all depends on what tier you're aiming for and what kind of hardware around would you want. Another bonus is that AMD tends to stick to their sockets, so if you'll buy 7xxx cpu on a decent mobo, you're probably good for next couple generations. I have same mobo starting from first gen ryzen up to this day (5xxx), and I can definitely say that it was quite convenient this way.

      >full amd
      oh boy, you'll save a lot on GPU (though again, depends on a price bracket - for example, cheap 4070 costs same as slightly-more-powerful 7800xt), but at the same time you'll miss out on nvidia gimmicks and possibly lose some compatibility. Again, it's a random thing, but I remember more than one case through recent years where some game would flat out refuse to work on AMD for a while, it's all kinds of bullshit. But if you can cope with that, AMD GPU price/performance is to die for.

  17. 3 months ago
    Anonymous

    I don't give a frick my 5600H is good enough

  18. 3 months ago
    Anonymous

    how is one good for gaming while the other is good for work?
    can someone explain it for the tech illiterate?

    • 3 months ago
      Anonymous

      I guess an easy example would be like how Nvidia has a bunch of proprietary bullshit for compute that's kind of got work locked up. Personally I haven't really looked into whether Intel has similar arrangements.

    • 3 months ago
      Anonymous

      Cause the kind of software people use has been around for quite a while and it's all optimized for Intel and their current architecture. Even if AMD is technically more powerful and efficient - your blender software isn't making any use of it.

      Look at emulators for example. AMD was a shit pick for anyone emulating games for quite some time cause AMD OpenGL performance used to be shit due to lack of optimization. But that got better over time, and ever since most emulators started adding Vulkan - which heavily favors AMD - AMD has been kicking Intel's ass in that field as well.

      • 3 months ago
        Anonymous

        >OpenGL
        >vulkan
        Anon we are talking about CPUs not GPUs. The AYYYMDrones don't even know basic computing.

      • 3 months ago
        Anonymous

        Vulkan on Nvidia is perfectly fine, the frick are you talking about

        • 3 months ago
          Anonymous

          Fine doesn't mean equal. AMD is generally better there within the same price range of GPUs.

          >OpenGL
          >vulkan
          Anon we are talking about CPUs not GPUs. The AYYYMDrones don't even know basic computing.

          The point was about optimization.

  19. 3 months ago
    Anonymous

    I've never, ever owned a single AMD processor. I've been buying Intel since the Pentium 3 and it has never let me down. I just go with the original, not the cheap copy.

    • 3 months ago
      Anonymous

      yea old people are moronic when it comes to CPUs thanks for the confirmation

      • 3 months ago
        Anonymous

        ITT: just buy AMD it's just cheaper and... it's just cheaper, m-okay?

        We can afford to get the best processor since we have jobs, you zoomers depend on daddy's money to upgrade your PC.

        • 3 months ago
          Anonymous

          kek I've had both intel
          >2700k, 4790k
          and AMD
          >athlon x4 and 5600x
          it's just most boomers are still moronic from when Intel paid to win and you've been stuck in your ways since. Nothing to do with money, just brains.

          • 3 months ago
            Anonymous

            >2700k, 4790k
            Seems you are the one that's been stuck in the past, zoomer. There's a correlation between disposable income and IQ, that's why stupid people pick AMD.

            • 3 months ago
              Anonymous

              Na I looked up benchmarks and didn't need to upgrade for a long time. Honestly could have still stuck with the 4790k but I wanted better end game performance in paradox shit. I mean you are still brainwashed from back when intel was paying so I'd refrain from talking about IQ man.

            • 3 months ago
              Anonymous

              It would appear there is a correlation between brand loyalty and IQ as well

              • 3 months ago
                Anonymous

                Maybe, that explains why dumb people is loyal to AMD.

              • 3 months ago
                Anonymous

                how do you manage to call someone stupid while proving why you're stupid

              • 3 months ago
                Anonymous

                >is
                ESL. detected.

    • 3 months ago
      Anonymous

      >not the cheap copy
      Maybe you should check who came up with x86_64.

      • 3 months ago
        Anonymous

        Maybe you should check who came up with x86.

    • 3 months ago
      Anonymous

      >I really love buying Israeli computer hardware
      Risky.

      • 3 months ago
        Anonymous

        but AMD also has an office in israel

        • 3 months ago
          Anonymous

          Office in country =/= country of origin

          • 3 months ago
            Anonymous

            amd and intel are from the same country

          • 3 months ago
            Anonymous

            What? They're both originated from the US.

  20. 3 months ago
    Anonymous

    AMD is best for everything now, makes zero sense to buy intel at all

  21. 3 months ago
    Anonymous

    ITT anon's how 2 computah knowledge stops in 2014. Always buy Intel do not question thanks goy.

  22. 3 months ago
    Anonymous

    https://www.neowin.net/news/eu-fines-intel-400-million-for-blocking-amds-market-access-through-payments-to-pc-makers/

    Intel should have put those 400 million into the development of better CPUs instead of bribing hardware makers to block AMD

    • 3 months ago
      Anonymous

      Never buying Intel products ever again.

    • 3 months ago
      Anonymous

      >2002 - 2007
      22 years of seething and counting.

      • 3 months ago
        Anonymous

        Unfortunately, boomers are still influenced by it. Ever speak to them about PC hardware? You always get vague shit like
        >Intel is just more stable so I always run Intel
        >I just go with the original, not the cheap copy

        • 3 months ago
          Anonymous

          I'm talking about AMD fanboys seething for 30 something years. The CPU market landscape, production, and management is wholly different now. There's no reason to stick to a single brand for the sake of loyalty. Back in 2002 I'd fricking buy a duron over any crappy celeron or expensive pentium trash, but now intel is the superior budget option with lower memory latency.

          • 3 months ago
            Anonymous

            what cope is this
            the turbo poorgay market is super fricking tiny compared to midrange and amd is easily winning there
            imagine memory latency being a fricking marketing point for you when that hardware is melting itself to death trying to compete

          • 3 months ago
            Anonymous

            No one is really seething though? it is good to bring up that bribery because, again, as mentioned, boomers are still influenced by it to this day and are loyal to Intel because of it. Me personally, I just buy whatever has the best performance for the cost. I've always done this, it's why I've had hardware from Intel, AMD and Nvidia over the years.

            • 3 months ago
              Anonymous

              Boomers are only loyal to intel simply because it's the most visible brand. Intel had the superior production capacity and higher market penetration. You could find intel in every prebuilt PC. AMD was cheaper but rarer simply because they couldn't produce CPUs as fast as intel. For those boomers it seemed like popularity = quality.

          • 3 months ago
            Anonymous

            >budget
            Wrong unless you mean super budget. You can get a 5600x - which is more than enough for pretty much everything - for less than 100 bucks. Intel only has something for the super poorgays who can't afford to spend more than 50 bucks on a CPU but those people are not looking for upgrades - they are looking for something that just barely works - and definitely can't run modern more demanding titles. The bare minimum if you will.

            • 3 months ago
              Anonymous

              >5600x
              I can confirm, I got a 5600 OCd to x level, and it's a little beast of a CPU, runs all the modern slop with less than 60% usage

            • 3 months ago
              Anonymous

              >5600x
              >for less than 100 bucks
              Where? In the used market? Did you mean those $120 warranty-less ryzen 5600 shipped from china for OEM PCs? I'd rather spend a bit more for a one year official warranty.

              • 3 months ago
                Anonymous

                Checked. Apparently prices went up recently. Last time i checked it was new on Amazon for like 95 EUR - Now it's 135 EUR.

            • 3 months ago
              Anonymous

              >5600X
              12400f is better and cheaper.
              >OCing 5600 to 5600X levels
              I wouldn't do that. 5600 is a lower binned 5600X. Enjoy reliability issues in the future I guess.

              • 3 months ago
                Anonymous

                >is better and cheaper.
                Nah. The 12400f is 140 bucks here at best. More expensive. The board is also more expensive than a cheap generic AM4 for the 5600x

              • 3 months ago
                Anonymous

                H610M is cheaper than AM4 and has PCIE 5.0 ports for GPUs. 12100f and 12400f are pretty much fine on those. 5600X requires a moderately strong VRM like all X CPUs so a cheap AM4 won't cut it in the long run.

              • 3 months ago
                Anonymous

                >buying H610 board
                Oof.
                >PCI-E 5.0
                They don't have that on H610 and there is not a single GPU that supports it anyway.
                >5600X requires moderately strong VRM like all X CPUs so a cheap AM4 won't cut it in the long run.
                Oh no ~65W is just too much.
                You are clueless.

              • 3 months ago
                Anonymous

                >there is not a single GPU that supports it anyway
                PCIe 5 SSDs exist, though.

                >It depends on the game. These benchmarks only test like a dozen games, that's a pretty small sample size.

                It depends on the games yes, but both are within 3-5% of each other in certain titles. https://www.youtube.com/watch?v=v5N8SzBSzsk&t=440
                I'd choose AM4 over intel shit any day of the week though. If I wanted an upgrade nowadays I could buy a 5800x3D, which doesn't pull 900w or have cuck cores.

              • 3 months ago
                Anonymous

                >If i wanted an upgrade nowadays I could buy a 5800x3D which doesn't pull 900w and has cuck cores
                It's shit for video editing and other non gaming tasks. 13600K is the best CPU in that price range. It does pull more watts, but the difference is minuscule next to how much your GPU is pulling, unless you somehow keep it running at peak power. Also lower idle wattage.

                Me? I'd rather stick to budget CPUs until the next generation intel and AM6 come out. 12400f and ryzen 5600 are still getting amazing fps on AAA games, even 12100f is still great. They'll be good for the next 5 years or so.

              • 3 months ago
                Anonymous

                >is better
                Not really, they are about the same.
                >and cheaper
                It depends on your region, I guess. These are the prices where I live, both use DDR4:
                12400F + Cheapest H610 MOBO = €228.86
                5600 + Cheapest A520 MOBO = €190.96

              • 3 months ago
                Anonymous

                >they are about the same
                It depends on the game. These benchmarks only test like a dozen games, that's a pretty small sample size. I believe 12400f would perform better generally due to the lower memory latency. Starfield runs straight up like trash on AM4 systems.
                >5600 + Cheapest A520 MOBO = €190.96
                That mobo doesn't support PCIE 4.0+ GPUs and needs BIOS update to run ryzen 5000 CPUs. I'd rather pay slightly more for convenience. B550Ms are cheap nowadays, just use those.

              • 3 months ago
                Anonymous

                >It depends on the game.
                Yes it does. That's an average, which means in some games the 5600 wins and in others the 12400F wins. On average, the performance is about the same. I can also cherrypick games, you know.

                >That mobo doesn't support PCIE 4.0+ GPUs
                ???
                PCIe is backwards compatible. A520 only supports PCIe 3.0, that's true, but it doesn't matter unless you're using a 6500XT which is limited to four lanes. But no one should be using this GPU for gaming anyway.
                If you really need PCIe 4.0 for some reason (and this reason is definitely not gaming, even PCIe 2.0 is enough most of the time), with the cheapest B550 MOBO the total price goes up to €211.41. Still cheaper than Intel.

                >needs BIOS update to run ryzen 5000 CPUs
                No it doesn't. X570, B550 and A520 were released with Zen 3. They support Zen 3 out of the box, even if you end up with a motherboard produced four years ago and never updated since then... somehow.

              • 3 months ago
                Anonymous

                >That's an average
                Yeah out of only a dozen or so games.
                >I can also cherrypick games, you know
                I picked one where the performance is straight up bad, not just less good.
                >PCIe is backwards compatible
                Budget cards with tiny bus like 6500XT run really bad on anything less than PCIE 4.0.
                >with the cheapest B550 MOBO the total price goes up to €211.41
                That's my point. Intel CPU + H610M are still cheaper in my region though.
                >No it doesn't. X570, B550 and A520 were released with Zen 3. They support Zen 3 out of the box
                The early BIOS versions don't support Zen 3. A520M is an old board so you're likely getting an outdated BIOS.

              • 3 months ago
                Anonymous

                >Enjoy reliability issues
                lmao what "issues"? once you stress test everything and make sure it works, it works. haven't had any issues with my undervolt+small overclock. silicon degradation is a meme, sandy bridge cpus are still chugging away at their 4.5ghz even today. and if it happens in a decade? just feed it an extra 0.05v and the issue's solved
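
                For the "stress test everything" part, a crude all-core load loop as a sketch; it's no substitute for Prime95/OCCT/y-cruncher, just a quick way to load every thread while you watch clocks and temps:

                  # crude all-core load generator: one busy worker per logical CPU
                  import math
                  import multiprocessing as mp
                  import time

                  DURATION_S = 60  # how long to load the cores (assumption, tune as needed)

                  def burn(seconds):
                      end = time.time() + seconds
                      x = 0.0001
                      while time.time() < end:
                          x = math.sqrt(x * x + 1.2345)  # pointless FP work to keep the core busy
                      return x

                  if __name__ == "__main__":
                      with mp.Pool(mp.cpu_count()) as pool:
                          pool.map(burn, [DURATION_S] * mp.cpu_count())
                      print("done - check that clocks held and nothing crashed or errored")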

        • 3 months ago
          Anonymous

          Those are valid points though.

          • 3 months ago
            Anonymous

            They're not. Stable how? they can never expand and neither can you. Copy how? who was pasting cores together first to get multi-core chips because they couldn't make it on a single die like someone else? who made x64?

            >the most visible brand
            >You could find intel in every prebuilt PC
            yes because Intel paid, keep up.

  23. 3 months ago
    Anonymous

    > I just buy whatever has the best performance for the cost
    based thinking consumer

  24. 3 months ago
    Anonymous

    R7 5800x3D if you're on ddr4 am4 mobo and gpu under 4080/7900xt.
    R7 7800x3D if you're on a 4080, 7900xtx or 4090, ddr5, am5, etc.
    R5 3600 if you are severely constrained for cash, and on some ancient quad core and ddr3... a 3600 with a b450 mobo and 16GB ddr4 3200 will max out a 6650xt and 3060 12GB,
    while the combo is under $200 new.
    Of course 5600 and 5600x are great vermeer same as 5800x3D but only 32mb of cache and only 6 cores.
    7500f and 7600x are good ddr5 choices.

    That's it.
    Do not get memed into buying some quad core, or buying anything with E cores for vidya. 12400f is the only intel cpu somewhat ok if you get it cheap and with a cheap mobo, but that's a rare bundle option on sale.

  25. 3 months ago
    Anonymous

    what kind of work needs a high end cpu?
    can't you just use a gpu instead?

  26. 3 months ago
    Anonymous

    AMD for CPU, Nvidia GPU. Intel is garbage now.

  27. 3 months ago
    Anonymous

    >AMD CPU
    >Uh muh PC just random powers off for some reason idk what it is

    >Intel CPU
    >just works

    Anyone else noticed this pattern?

    • 3 months ago
      Anonymous

      Nope.

    • 3 months ago
      Anonymous

      Did you put an AMD CPU on an Intel mainboard?
      Check your PSU, sounds like a power delivery issue

  28. 3 months ago
    Anonymous

    Performance is about the same. You'll never be CPU-bottlenecked in an AAA videogame anyway. In lighter eSport games where CPU performance actually matters, any modern processor can pump out hundreds of frames per second.
    AMD is cheaper, more efficient, AM5 offers better upgradability. If you're still using a Zen 1 or Zen 2 CPU, you can upgrade to a much better CPU (5600 or 5800X3D) for relatively cheap.

    Intel isn't even THAT far behind. But they don't offer anything unique (unlike let's say Nvidia: their hardware is worse per dollar, but they have DLSS), so there's no reason to buy their products. Why would you ever buy a worse CPU when you can get a better one, even if only slightly better.
    But as I said, any modern CPU is more than enough for current videogames. You can get whatever you want, both are fine, it doesn't matter.

  29. 3 months ago
    Anonymous

    intel currently offers superior price / performance in real world scenarios (1440p+, playing modern games)
    if in the rare instance you're no longer GPU limited (4090, play less demanding games, low resolutions) then an AMD 3d cache CPU can squeeze out some extra frames, but please don't buy one of those otherwise

    • 3 months ago
      Anonymous

      >(1440p+, playing modern games)
      literally fricking any recent cpu made in the past few years can manage when you're gpu bound
      why the frick would you pick intel when it's just drawing more power for no gain whatsoever

  30. 3 months ago
    Anonymous

    It's not like the Pentium/Celeron vs Athlon/Duron days anymore. The Duron used to perform almost as well as a Pentium 3 at less than half the price, but now the difference between the 2 brands is within a hair's breadth. AMD aren't making ultra budget chips anymore like they used to, and even the cheapest 12100f runs any AAA title at 100+ fps. I don't understand why CPU brand wars are still a thing.

    • 3 months ago
      Anonymous

      Because
      >muh tribalism
      morons need to validate their purchase somehow.

    • 3 months ago
      Anonymous

      The performance and use cases have shifted severely, now it's either you have a high-cache CPU or you don't. You have something eating 200-250W peaks (x7000k) or you don't. Z and X series mobos now start at $200-300 and you get very little in the way of features outside of expansion and maybe some additional I/O.

      And no, the 12100f is not 'good enough' unless you are making a very budget PC that will in fact play a lot of titles at 1080p with FSR/DLSS with very bouncy FPS and some really annoying stutters. A LOT of modern games eat CPUs for breakfast.

      t. Have built way too many machines and sell 'boutique' gamer PCs with a bumped up price just because I put in an LED strip and 2 RGB fans in there with a case that costs more than the CPU.

  31. 3 months ago
    Anonymous

    >AMD GPUs run laps around Nvidia when it comes to price/performance ratio
    >LMAO poorgay, where's your 4090???

    >AMD CPUs obliterate Intel's for gaming purposes, being stronger, less power hungry, colder AND cheaper
    >bro, buy middle-of-the-road Intel, you never know when you'll want to start rendering stuff

    At least people stopped posting Userbenchmark reviews

  32. 3 months ago
    Anonymous

    For gaming? Definitely AMD.
    >X3D exists
    >No useless E cores
    >AVX512 for emulation
    >APUs are neat at the low end
    >Uses half the power

    • 3 months ago
      Anonymous

      Intel is better for emulation though.

      • 3 months ago
        Anonymous

        Not if you ever want to play PS3 games.
        Nobody cares how much faster it is at something that already runs at full speed like Dolphin.

  33. 3 months ago
    Anonymous

    >you HAVE to pick a side goyim
    Frick off moron. I buy whatever is good at the time I need it. Right now that means the 7800x3d for gaming until something better comes along.

  34. 3 months ago
    Anonymous

    Intel, unless you want Pluton Chinese spyware and for the chip sandwich to cook itself.

  35. 3 months ago
    Anonymous

    it just works

  36. 3 months ago
    Anonymous

    how did they do it?

    • 3 months ago
      Anonymous

      AMD could also push power consumption to 400W and improve performance by 5%. But they didn't because that would be pretty stupid. Raptor Lake is actually pretty efficient, just not as efficient as Zen 4. Intel can't compete at lower TDPs, so all they can do is pump more power into the CPU. No one is ever going to buy a slower CPU, but maybe there are some customers out there who don't care about power consumption.

      Radeon did the same when they had to compete with Nvidia's Pascal. If the RX 480 and GTX 1060 were priced the same, used the same amount of power, but the 1060 was 10% faster... no one was ever going to buy the 480. That's why they pushed the 580 (rebranded 480) to 200W, so that it ended up beating the 1060 (120W) by 5%.
      And Nvidia did it when they had to compete with the much more efficient RDNA2, the 3080 was infamous for tripping power supplies.

      Basically, whoever ends up with a worse architecture has to drive power consumption to the moon if they want to sell. Right now it's Intel, in a few years who knows.

      • 3 months ago
        Anonymous

        Last gen for GPUs was crazy all around. AMD's own product documentation recommended a 1000w PSU for the 6900/50XT because of transient spikes.

        • 3 months ago
          Anonymous

          Case in point, the 6900XT and 6950XT weren't even supposed to exist. They saw they could compete with the 3090 and they went for it, they clocked Navi21 as high as they possibly could. Completely destroying the great efficiency of RDNA2 for single digit % performance improvements.
          Then they came up with the 6950XT, which was even dumber... and Nvidia responded with the 450+W 3090Ti lmao. Right now it's the other way around. Nvidia can be more conservative with clock speeds because AMD's RDNA3 is worse than Lovelace.

          My point was, no architecture is inherently a power hog. You could limit the 14900K to a much more reasonable 100W TDP and it would still perform great... just not as good as Zen 4. And no one would buy it.

          • 3 months ago
            Anonymous

            RDNA3 is great, but the product naming fricks up the entire stack and it leaves a very bad impression. AMD shouldn't have gone with chiplets without letting them mature a little first, as well.

            • 3 months ago
              Anonymous

              >AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
              pretty sure chiplets cost them less than what a monolithic die would
              this amd generation was focused on increasing profit margins by lowering production costs, not pure power

              • 3 months ago
                Anonymous

                Yet RDNA4 will be a monolithic die and the halo card will be a $500 4080.

    • 3 months ago
      Anonymous

      how the frick is the 7800X3D so low on power consumption

      • 3 months ago
        Anonymous

        stock ryzen chips are already heavily overclocked to win in benchmarks. If you limit a 7700x you get the same frequencies at the same wattage.
        They can't do that with the 7800x3D. The 3D cache traps heat and the TSVs can't increase voltage as high. So by necessity they HAVE to use lower power limits.

      • 3 months ago
        Anonymous

        Low clock speed and huge cache. Most games benefit from cache more than clock speed.

        like this anon said

        stock ryzen chips are already heavily overclocked to win in benchmarks. If you limit a 7700x you get the same frequencies at the same wattage.
        They can't do that with the 7800x3D. The 3D cache traps heat and the TSVs can't increase voltage as high. So by necessity they HAVE to use lower power limits.

        you can't overclock an X3D CPU without melting it

  37. 3 months ago
    Anonymous

    Only broke morons get AMD. Intel has shit like QuickSync and a heterogeneous architecture, which lets work run on the right cores before being passed on to another.

    If any of you gays want to learn feel free to
    https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/ia-introduction-basics-paper.pdf

    As a disclaimer my first AMD cpu was the shitty Tri-core bulldozer APU; like the 2nd one they ever made. It was absolute dogshit. So it was a shitty first impression.

    • 3 months ago
      Anonymous

      >Only broke morons get AMD
      Is this nekker stuck in 2012?
      You do know that AMD is more expensive than Intel now, right?
      Intcels are so fricking funny.

  38. 3 months ago
    Anonymous

    Anyone who gives two fricks about 'Wattage' is a moron looking to nitpick.
    Like just lower your PL1 and PL2 values dumbass.
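
    If you actually want to do that on Linux, the long-term (PL1) and short-term (PL2) limits are exposed through the intel_rapl powercap driver. Rough sketch only; the RAPL zone index and whether your board honours the limits are assumptions, so check the *_name files first:

    # Minimal sketch (assumes Linux + the intel_rapl powercap driver, run as root).
    # constraint_0 is usually the long-term limit (PL1) and constraint_1 the
    # short-term limit (PL2), but verify via the *_name files; zone layout varies.
    from pathlib import Path

    RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")

    def read_watts(constraint: int) -> float:
        # Limits are stored in microwatts.
        return int((RAPL / f"constraint_{constraint}_power_limit_uw").read_text()) / 1_000_000

    def set_watts(constraint: int, watts: float) -> None:
        (RAPL / f"constraint_{constraint}_power_limit_uw").write_text(str(int(watts * 1_000_000)))

    for c in (0, 1):
        name = (RAPL / f"constraint_{c}_name").read_text().strip()
        print(f"{name}: {read_watts(c):.0f} W")

    set_watts(0, 125)   # example: cap PL1 at 125 W
    set_watts(1, 180)   # example: cap PL2 at 180 W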

    • 3 months ago
      Anonymous

      That cripples Intel CPUs hard while Ryzens fly at modest power limits.

      • 3 months ago
        Anonymous

        I know, I was making a joke; I like you anon.
        I work at Intel actually. There's a lot of work done to make sure Windows doesn't frick over all the work at the silicon level. Shit is assssss

  39. 3 months ago
    Anonymous
  40. 3 months ago
    Anonymous

    I have a Ryzen 7 7800X3D and it absolutely dumpsters anything I throw at it.

    • 3 months ago
      Anonymous

      I'm about to build my first computer and the build is centered around using that chip. I am excited but scared to frick it up.

      • 3 months ago
        Anonymous

        You'll be fine. Just follow basic instructions, don't get ahead of yourself, and be sure to take your time. Stop if you feel overwhelmed or unsure and take a break.

  41. 3 months ago
    Anonymous
  42. 3 months ago
    Anonymous

    >IntelME

  43. 3 months ago
    Anonymous

    both are fine. doesn't matter now.

  44. 3 months ago
    Anonymous

    Intel Software built into the chip is godly if you do literally anything else than gaming.

    • 3 months ago
      Anonymous

      Too bad you can't test that because you need a different motherboard.

  45. 3 months ago
    Anonymous

    intel and nvidia. single core performance is still king for a lot of things. intel/nvidia just werks with everything and gets support faster for everything when its new. you pay a tax for it sure but not being one of those goons whining on message boards about drivers is worth it

    • 3 months ago
      Anonymous

      AMD's winning at single-core now. It's over for Intel. Pack it up, Ramdeep. This is an /amd/ board, now.

    • 3 months ago
      Anonymous

      The performance is amazing because of the E and P cores and the scheduler knowing what threshold is needed to pass work to a P core.
      It's wizardry to make sure Windows Search doesn't have a fricking autistic rampage which leads to it shitting the bed on a P core while you game.
      Also everyone reeeing about power usage.
      https://www.msi.com/Handheld/Claw-A1MX
      Intel is finally trying to get into the handheld market. Ticktok AMDgays

      • 3 months ago
        Anonymous

        i'm on win 10 and read it doesn't know how to use the p and e cores correctly because of the scheduler or something, but i haven't noticed any issues so far. its fast though, 13700k

      • 3 months ago
        Anonymous

        >XeSS looks like desaturated shit
        >Intel iGPU can't compete with AMD's APU
        >need to run Win11 or the P+E core configuration goes full moron and your performance suffers
        Pajeeeeeeeet.

        • 3 months ago
          Anonymous

          >XeSS looks like desaturated shit
          Tell me how FSR looks like.
          >installing windows 10 in 2020+4
          Might as well go with trannux.

          • 3 months ago
            Anonymous

            >what's FSR look like
            Don't know. AMD's rasterization is powerful enough to not need upscaling tech 🙂
            >using Windows on a handheld
            Need to dedicate more of that nvme to bloat my homie?

            • 3 months ago
              Anonymous

              >AMD's rasterization is powerful enough
              This is iGPU we're talking about. Even the top end ryzen iGPU is a lot weaker than gtx 1650.
              >steam OS
              Steam deck can't even run 2020 games well at 720p. I'd rather game on a netbook than ick on eck. Both are only useful for indie games, but at least you're getting full keyboard and a bigger screen at a lower price.

              • 3 months ago
                Anonymous

                >This is iGPU we're talking about.
                Intel iGPUs have been dogshit for years, is meteor lake actually any good?

              • 3 months ago
                Anonymous

                nta but i managed to get ai to run with the igpu and it wasn't as dogshit slow as i figured it'd be. slower than a dedicated card for sure, but not bad. for a normie it'd be enough for faceberg and youtube. i hope intel does something with that arc platform and gives us another option

              • 3 months ago
                Anonymous

                They might just say frick it and place the ARC GPU shit into the CPU. So instead of iGPU it's dedicated graphics ala APU.
                In its current state intel's igpu peaked like their 14nm+++++++++

              • 3 months ago
                Anonymous

                i think the igpu stuff is impressive for what it is on a chip, but keeping arc on the chip would seal its fate as a nothingburger. it's too little space to have enough power and architecture and would never compete with ati/nvidia

              • 3 months ago
                Anonymous

                They're gonna try some moronic AI shit first to squeeze out performance which may help but yeah they need to figure out a new hardware solution.

              • 3 months ago
                Anonymous

                i keep up with ai shit and making cards specifically for it at this point is dumb. transformers for example has to go, it'll eventually be replaced. so anything they make hardware around now will be gone in a few years. there is a LOT of optimization to be done yet with everything ai, trying to generalize cards for it at this point is a mistake imo
                they should be focusing on getting cards out and entering the market

              • 3 months ago
                Anonymous

                NOPE gonna have to wait for at least arrow or panther lake.
                best case they can shrink the die from the Arc GPUs

  46. 3 months ago
    Anonymous

    AMD: inexpensive cpu for when i'm building for someone else
    Intel: what i buy for myself so i don't run into weird bugs

    similar if it were nvidia vs amd. i kinda want the product to be good when i buy it. i dont want to wait a year for "fine wine" and unexplainable bugs

    • 3 months ago
      Anonymous

      How's the weather in Mumbai?

      • 3 months ago
        Anonymous

        are indians buying higher quality products now or something?
        whats the meme?
        they arent buying heavily discounted ryzen cpus and making excuses for why they dont need ray tracing?

        • 3 months ago
          Anonymous

          Paid. Pajeet. Marketing. Shill.

        • 3 months ago
          Anonymous

          >meme tracing
          Oh look here comes the nshidia shill

          • 3 months ago
            Anonymous

            you dont need ray tracing
            you couldnt afford it if you did.

          • 3 months ago
            Anonymous

            kek it's like a Scooby Doo scene where they pull off the Intel shill's mask and it's an Nvidia fanboy underneath

  47. 3 months ago
    Anonymous

    >the pajeets at work defend intel like their lives depend on it
    >meanwhile everyone with an iq above room temperature uses AMD
    Really makes you think...

  48. 3 months ago
    Anonymous

    The 5800X3D and the 7800X3D are the most efficient CPUs ever.

    You would care about power "consumption" if you are from the EU.

    • 3 months ago
      Anonymous

      >3D cache
      Good for you. Enjoy cache latency though.

      • 3 months ago
        Anonymous

        The cache is on the chip. Huh?

      • 3 months ago
        Anonymous

        Sour grapes ass homie.

    • 3 months ago
      Anonymous

      you would care about power consumption if you weren't an irresponsible child, or if you were in charge of anything where profit is concerned

  49. 3 months ago
    Anonymous

    Man the shitel employees are really trying and failing to damage control in here

  50. 3 months ago
    Anonymous

    Man I really don't want to but I guess I should move over to AM5. I was planning on just upgrading to the 5800X3D but I might as well just do a rebuild. I already don't like my 3800X (which I got a nice mobo deal with when the 3900X was out of stock forever). At least all I need to do besides buy the necessary upgrades is contact Noctua to get an AM5 mounting bracket.

    • 3 months ago
      Anonymous

      Not worth it, DDR5+7800X3D vs DDR4+5800X3D is a fairly minor difference. Save the money from the mobo+RAM for the next gen or the one after.

      t. Have both machines in the office

      • 3 months ago
        Anonymous

        >DDR5+7800X3D vs DDR4+5800X3D is a fairly minor difference
        I have to disagree. You can easily upgrade AM5 CPU several years later if you have a decent motherboard. AM4 is good if you want to save some money right now or even skip AM5 era.

        • 3 months ago
          Anonymous

          Why would he spend $500 more instead of just getting a 5800X3D and some more RAM, homie? The CPU is still a top performer, you could get almost half a current GPU for that money.

    • 3 months ago
      Anonymous

      in the future, don't plan to ever upgrade. spend the money up front and get extra ram, processing power that you think you won't need. i got 9 years out of my last build and it cost $1200 for the tower at the time, before prices went nuts

  51. 3 months ago
    Anonymous

    If you dont immediately know what this file is associated with, you have no knowledge of what CPUs are or how they function.

  52. 3 months ago
    Anonymous

    >bought 7900X because it was on a deep discount with a mobo
    >turns out 7800X3D is better for games
    oh well, still a big step up from my i5 2500k, I'll miss the little guy

    • 3 months ago
      Anonymous

      If you loved the 2500k that much why not get its 10-year glow-up, the 12600k?
      You some sort of brokie?
      https://ark.intel.com/content/www/us/en/ark/products/134589/intel-core-i5-12600k-processor-20m-cache-up-to-4-90-ghz.html

      • 3 months ago
        Anonymous

        Why do the Intel shills in this thread have painfully awkward English?

        • 3 months ago
          Anonymous

          Do you even know what this program is?

          • 3 months ago
            Anonymous

            No, Ramdeep, I do not. How does that change the fact that everyone in this thread shilling for Intel has awkward English as if it's a second language for them?
            Hell, even this unrelated counter argument seems like something a pajeet would do.

            • 3 months ago
              Anonymous

              Rangebanning every third world country would solve 90% of the problems on Ganker

            • 3 months ago
              Anonymous

              you're a fricking moron homeboy

              • 3 months ago
                Anonymous

                Riveting comeback, Ranpoo. It doesn't make my observation any less poignant or correct.

                Rangebanning every third world country would solve 90% of the problems on Ganker

                All of Ganker would immediately improve if non-US IPs were immediately banned. I'd say the UK, France, Canada, and Australia can come back, but only the rural IP addresses. Urban areas might as well be considered third world at this point.

    • 3 months ago
      Anonymous

      3d cache thingy is to die for, 12 cores is only good for i dunno, ffmpeg or something, it's more of a workstation CPU. then again, i doubt you're getting any framerate issues in vidya on 7900x

  53. 3 months ago
    Anonymous

    i just got i7-12700F. it was the only thing they had in the store atm. is it good? do i need a special cooler for it?

    • 3 months ago
      Anonymous

      Just buy an AIO like a white man and be done with it.

      • 3 months ago
        Anonymous

        i already had the whole pc, old cpu got fried. otherwise i'd probably just get a macbook. i'm sick of computers and technology in general, just want good drivers for audio and aggregate devices

        what cooler do you have now?

        i just put the cooler that came with it, but also have old ass nepton 240m somewhere

        You can get away with something from Thermalright or Noctua if it didn't come with an OEM cooler. The 12700k is a good CPU, even with the cuckcore nonsense. You did well.

        It's good. It's not a K CPU, you don't need a water cooler for it, just get a decent air cooler like deepcool AK500 or AK620.

        perfect, thanks guys!

      • 3 months ago
        Anonymous

        Never buy AiO. Either splurge all the way for a custom waterloop or go air

        • 3 months ago
          Anonymous

          Why not?

          • 3 months ago
            Anonymous

            You're spending a lot more for something that cools just as much as an air cooler

            • 3 months ago
              Anonymous

              Thats not true though

        • 3 months ago
          Anonymous

          moron

      • 3 months ago
        Anonymous

        >quiet
        >good temps all year long
        There is no reason for custom loop unless you are going way out of standard hardware specs (OCing) or because you want to have a "cool" aquarium PC. AIO is simple, easy to install, and just works.

    • 3 months ago
      Anonymous

      what cooler do you have now?

    • 3 months ago
      Anonymous

      You can get away with something from Thermalright or Noctua if it didn't come with an OEM cooler. The 12700k is a good CPU, even with the cuckcore nonsense. You did well.

    • 3 months ago
      Anonymous

      It's good. It's not a K CPU, you don't need a water cooler for it, just get a decent air cooler like deepcool AK500 or AK620.

      • 3 months ago
        Anonymous

        K just means it's unlocked. They didn't give it extra power output or anything.

        • 3 months ago
          Anonymous

          >They didn't give it extra power output or anything.
          Actually they did. K CPUs draw more power but could be limited.

    • 3 months ago
      Anonymous

      I have a 12700k and I undervolted it with -0.075v
      and just use a 120mm air cooler (ID Cooling 226 XT)

      The temps and power consumption while gaming are really good

  54. 3 months ago
    Anonymous

    Intel still prices their CPUs like they haven't fallen off hard in the last ten years.

  55. 3 months ago
    Anonymous

    As somebody who always went with Intel, this gen is making me regret my 12700k and my next upgrade will most likely be a 7800x3d or its equivalent

    • 3 months ago
      Anonymous

      12700k was fine when it came out, certainly a lot better than previous intel chips. Out of all the times to buy intel, that was probably the best since the original ryzen.

    • 3 months ago
      Anonymous

      I was full on frustrated with AMD back in the Phenom and Bulldozer days. They had gone from being a respectable brand to irredeemable garbage. Ryzen being this good was like being blindsided.

    • 3 months ago
      Anonymous

      what are you trying to do that a 12700k seems bad for

      • 3 months ago
        Anonymous

        I definitely wouldn't call it bad, of course, especially considering I went from a 3930k to that, not to mention it also pairs decently enough with my 4090 when playing at 4k
        Yet I still can't help but wish I had more CPU power for those games where I'm CPU bound, or for the heaviest rpcs3 games like inFamous

        Why would you upgrade your CPU for extra 10 fps?

        Why not?

        • 3 months ago
          Anonymous

          i built my last build around emulation and it was right before rpcs3 started to become usable and actually made use of multiple cores over something like pcsx2. it was an i5 4060k i think? i managed to play through ac4/4a on xenia using vulkan but so much of the graphics were fricked, i had to make cheats because i couldn't see that i was being railed by a giant laser. i have a 13700k now but haven't tried rpcs3 yet, but i'm hoping it's good

          • 3 months ago
            Anonymous

            You can run RPCS3 with mobile Ryzens now. Of course a 13700k will be enough.

            • 3 months ago
              Anonymous

              i remember thinking 'oh frick' when reading about how rpcs3's approach was to emulate each core to a core on your comp, which i didn't have enough for. i knew my new build was already kinda fricked. but it gave me so much great emulation, i have no complaints

    • 3 months ago
      Anonymous

      Why would you upgrade your CPU for extra 10 fps?

  56. 3 months ago
    Anonymous

    The 7800x3d price to performance is unbeatable.

  57. 3 months ago
    Anonymous
  58. 3 months ago
    Anonymous

    Whichever one you find the cheapest, but if both are around the same price, then it's the X3D series.
    What Intel mostly has going for it is that you get to keep DDR4 memory if you pick a DDR4-based mobo, great for those who bought like 64GB on their last system, so you don't shell out $400 on DDR5

  59. 3 months ago
    Anonymous

    Imagine playing at such a low resolution that this is a decision.

  60. 3 months ago
    Anonymous

    7800X3D is the best gaming CPU you can get, but I'm waiting for Zen 5

  61. 3 months ago
    Anonymous

    new paradigm when?

  62. 3 months ago
    Anonymous

    Support is no longer a thing and there's no need to fanboy over a chip maker. Just get the best one in your price range at the time you want to buy a CPU.

  63. 3 months ago
    Anonymous

    Any mobo recommendations for a 7800X3D?

    • 3 months ago
      Anonymous

      Pretty much any b650 is enough. It all depends on what features you need.

    • 3 months ago
      Anonymous

      B650 Aorus Elite AX

    • 3 months ago
      Anonymous

      It draws like 70 watts, literally anything is fine

    • 3 months ago
      Anonymous

      i'd go for some flagship x670 and then pray it'll get updates until ryzen goes into 10xxx models

    • 3 months ago
      Anonymous

      I have a 7600X and I got an Asrock B650 PG Lightning, from my research it ended up being the best mobo for its price range. Also has a nice M.2 5.0 slot.

  64. 3 months ago
    Anonymous

    Did I frick up buying a 7600 instead of 7800X3D? I figured I'll be GPU bound at 4K in most games anyway. GPU is 6950 XT.

    • 3 months ago
      Anonymous

      should be more than enough for now, but yeah, you did lose some performance. i don't think i'd care all that much, unless you're playing this one or two very specific games that improve massively with 3d cache and are demanding enough to cause issues on modern high-end hardware, and the only one I can think of is VRChat

      • 3 months ago
        Anonymous

        It didn't seem worth twice the price for slightly better performance. The plan was to just get on AM5 as cheap as possible and then upgrade to a better one at end of life like you could with 5800X3d on AM4. Also 7600 seemed to have the same performance as 7800X3D in some older games

    • 3 months ago
      Anonymous

      Worst case scenario you're probably losing 20 or so FPS with a 7600 over a 7800x3d. I doubt 20 fps is worth 250 bucks of price difference, especially when 250 more bucks on a GPU would give you a meatier FPS boost.
      Plus, you can always just buy a 7800x3d in some years when its price is low, since you have an AM5 setup now (provided you weren't moronic and didn't buy a trash mobo).

  65. 3 months ago
    Anonymous

    >Want to upgrade my PC
    >Microcenter is opening up in my city soon
    >However the date is between March-June

    Decisions, Decisions

  66. 3 months ago
    Anonymous

    7800x3d. The additional cache makes a difference in 1% lows. I have a 5800x3d, I will not buy anything without a large cache. Anything else is junk, garbage, a waste of silicon. Intel, junk. AMD non-3D CPUs, junk.

  67. 3 months ago
    Anonymous

    Intel is far superior.

  68. 3 months ago
    Anonymous

    >5800X
    >4080
    Am I bottlenecking CS2 right now bros?

    • 3 months ago
      Anonymous

      >tranimeposter
      >buys 4080 to play CS
      Checks out

    • 3 months ago
      Anonymous

      Black person it's quite simple.
      Play the game with some kind of overlay (RTSS, the nV overlay, whatever), then check if your GPU is near 100% utilization.
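
      If you'd rather not mess with an overlay, polling nvidia-smi while the game runs tells you the same thing. Rough sketch, assuming an Nvidia card with nvidia-smi on the PATH; sustained readings well below ~95% while your framerate is low usually mean the CPU (or the engine) is the limit:

      # Poll GPU utilization once a second via nvidia-smi
      # (assumes an Nvidia GPU with the driver tools installed).
      import subprocess
      import time

      def gpu_utilization() -> int:
          out = subprocess.run(
              ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
              capture_output=True, text=True, check=True,
          )
          # One line per GPU; take the first one.
          return int(out.stdout.strip().splitlines()[0])

      while True:
          print(f"GPU utilization: {gpu_utilization()}%")
          time.sleep(1)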

  69. 3 months ago
    Anonymous

    Put it this way, Ganker shits on everything Intel & Nvidia and shills AMD but whenever a new game comes out the people b***hing about crashes are always AMD users.

    • 3 months ago
      Anonymous

      More lies in one post have never been told.

    • 3 months ago
      Anonymous

      >the people b***hing about crashes are always AMD users.
      thats the funny part. ati always sucked. nvidia was good before intel bought them and is still good. amd just can't keep up in processors, which are hardly relevant now anyways because everything is so fast that its the gpu that matters
      intel-nvidia, can't go wrong even if you pay the tax

      • 3 months ago
        Anonymous

        What the frick is this stupid word salad?

        • 3 months ago
          Anonymous

          when your name is rasheed, probably year 2

        • 3 months ago
          Anonymous

          brandmonkey cognition in action

      • 3 months ago
        Anonymous

        >. nvidia was good before intel bought them and is still good. amd just can't keep up in processors, which are hardly relevant now anyways because everything is so fast that its the gpu that matters
        what

  70. 3 months ago
    Anonymous

    Intel is embarrassingly shitty nowadays so AMD by default.
    https://www.radgametools.com/oodleintel.htm

  71. 3 months ago
    Anonymous

    You should never buy AMD just for the mere fact that Ganker is shilling it. These trolls always want to bait people into bad purchases.

    • 3 months ago
      Anonymous

      You should never trust Ganker with anything, let alone your money.

  72. 3 months ago
    Anonymous

    Is this price fricked up?
    AMD Ryzen 5 5600G 4.4GHz
    16GB RAM 3200MHz DDR4 (2x8)
    480GB SSD
    600W ATX 20+4
    $600 in 1 payment or $800 in 3... I swear I've seen this PC for $300 on Amazon or some shit

    • 3 months ago
      Anonymous

      >5600G
      Don't. Get a proper GPU, it's a shit deal.

      • 3 months ago
        Anonymous

        ...no GPU?

        I've seen it get 60 fps in CS2 and silly games, which I think is enough until I can get a decent GPU to pair it with

        if not bait, check the Ganker build a pc thread and post your country and budget, you'll get a much better machine

        Will do sir, thank you sir

        • 3 months ago
          Anonymous

          i was a decade out of date from my last build and i just monitored the pc threads for recommendations, eventually asked a few questions, and i'm very happy with my build. unlike Ganker, which is always mean, you can ask dumb questions there and get a non-meme response

        • 3 months ago
          Anonymous

          For $800 (assuming it's USD) you can get a much, much, much better computer. Like, a 5600 + 6700XT.
          A 5600G with no graphics card for $600 is a scam.

    • 3 months ago
      Anonymous

      if not bait, check the Ganker build a pc thread and post your country and budget, you'll get a much better machine

    • 3 months ago
      Anonymous

      ...no GPU?

    • 3 months ago
      Anonymous

      That is not a good deal. You are not far off with your $300 estimation.

  73. 3 months ago
    Anonymous

    >want to upgrade GPU
    >realize that I'd also have to upgrade CPU and RAM and MB and PSU
    >give up and keep on playing TF2 and minecraft without shaders

  74. 3 months ago
    Anonymous

    you guys ever had an OC instability so hard that it does an impossible game mechanic instead of crashing?

    • 3 months ago
      Anonymous

      Depends on what you mean. In Dead Space if you run it at 200 fps you can break the engine and do wild stuff. Speedrunners use it to their advantage. REmake 2 also has weird things like a knife attack doing multiple hits instead of one on high fps.

      • 3 months ago
        Anonymous

        A few more:

        In Monster Hunter World, some weapons (bowguns with piercing bullets, for example) hit more often than originally intended at higher framerates. A normal user is never going to notice, but this breaks speedruns. There's a mod to fix this.

        In MH Rise, monsters used to track their moves better at higher framerates. On Switch (or on PC at 30 FPS), you could dodge some moves by just walking (this was intended behaviour). On PC they would snipe the shit out of you. This was fixed by Capcom a few months after the PC release.

        In RE4 HD (the original, not the remake) enemies throw items (like grenades) twice as often at 60 FPS compared to 30. Also, you have half the time to beat QTEs.

        In RE1 HD Remake, zombies track the player better at higher framerates. It's easier to run around them and not get grabbed at 30 FPS compared to 60.
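
        The usual cause behind all of these is logic written per frame instead of per second, so running the game faster literally runs the rules faster. Generic illustration only (hypothetical code, not from any of those games):

        # An enemy "attacks every 90 frames" -- tuned for 30 FPS, i.e. once every 3 s.
        # Counted in frames, the rate doubles at 60 FPS; accumulated in real elapsed
        # time (delta time), it stays the same at any framerate.
        def attacks_frame_counted(fps: int, seconds: float = 30.0, cooldown_frames: int = 90) -> int:
            attacks, timer = 0, 0
            for _ in range(int(fps * seconds)):
                timer += 1
                if timer >= cooldown_frames:
                    attacks, timer = attacks + 1, 0
            return attacks

        def attacks_time_based(fps: int, seconds: float = 30.0, cooldown_s: float = 3.0) -> int:
            attacks, timer, dt = 0, 0.0, 1.0 / fps
            for _ in range(int(fps * seconds)):
                timer += dt
                if timer >= cooldown_s:
                    attacks, timer = attacks + 1, 0.0
            return attacks

        for fps in (30, 60, 120):
            print(fps, attacks_frame_counted(fps), attacks_time_based(fps))
        # 30 -> 10 vs 10, 60 -> 20 vs 10, 120 -> 40 vs 10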

        • 3 months ago
          Anonymous

          So the PAL versions of the og RE trilogy are insignificantly easier?

          • 3 months ago
            Anonymous

            No idea. RE1 HD Remake is a port of the GC release (you could choose between 50hz and 60hz mode on the PAL version. Everyone played at 60hz, unless your TV was ancient and didn't support it). It was probably a tiny bit easier if played at 50hz, I assume.
            Personally, I tried both 30 and 60 FPS on the PC version, and I couldn't tell the difference in zombie behaviour. But it can mess with speedruns apparently.

            It has nothing to do with framerate, but the first NA version of RE1 on PS1 is way harder than the JP version. Because it doesn't support autoaim for whatever reason. You press the aim button and the character doesn't automatically turn towards the enemy.

            • 3 months ago
              Anonymous

              >It has nothing to do with framerate, but the first NA version of RE1 on PS1 is way harder than the JP version.
              They all have weird quirks and small changes. I watched some dude beating RE2 without pressing forward once and he picked some version of the game because of something but I don't even remember what his reasoning was.

  75. 3 months ago
    Anonymous

    been using an AMD Ryzen 7 3700X since 2020 and have had 0 issues playing all games on high or ultra,
    except Total War 2 on ultra, that shit put my pc on fire.

  76. 3 months ago
    Anonymous

    i don't have anything to add about anon's new build but i just upgraded to a 13700k, 4070 suti and an NVMe drive; i had a normal HDD before and was still enjoying Sims 3. amazing difference

    • 3 months ago
      Anonymous

      >CPU significantly more powerful than GPU
      Why?

  77. 3 months ago
    Anonymous

    For gaming, absolutely AMD. A 7600x has performance on par with Intel models that cost twice as much.

    • 3 months ago
      Anonymous

      >for gaming
      AMD smashes Intel at production as well.

      • 3 months ago
        Anonymous

        Does it? I've never looked at comparisons in that setting since I just play games and use photoshop/Office. I doubt I'd get an Intel anyway since I'm in Europe and intel CPUs are ridiculously expensive, like 50% higher than their price in USA, while AMD CPUs are very affordable.

        • 3 months ago
          Anonymous

          see

          yeah but this isn't something like 20% more power efficiency difference that you can just dismiss
          the gap is unbelievably fricking massive for a TWO PERCENT increase

  78. 3 months ago
    Anonymous

    I don't like Intelaviv but I have DDR4 sticks aplenty so I'm probably going to get one when my 9th gen starts struggling; there are literally no games worth upgrading for.

  79. 3 months ago
    Anonymous
  80. 3 months ago
    Anonymous

    I say this as an owner of an AMD CPU and GPU: Intel. I've had nothing but problems since switching from Intel + Nvidia and my next PC will be avoiding AMD at all costs.
