You can game just fine with Intel
You can't work on something professional with AMD
AMD is king for gaming.
Intel is king for workstations.
Intel can also be used for gaming, but you have to put way more money into it to get performance equal to that of a Ryzen. Just look at the 14900KS:
>400W or even more
>over 600 bucks
>for gaming performance equal to that of a 7800X3D
>which consumes less than 90W
>and only costs around 350 bucks
That, and AM4/AM5 are long-lived, cheap platforms, while keeping up with Intel is generally more expensive as you constantly upgrade parts.
Well, that and Intel also runs way hotter, so I guess you might have to spend less on your heaters next winter.
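The price/power gap described above can be sanity-checked with quick arithmetic. A minimal sketch, using the figures as quoted in the thread (roughly equal gaming performance, ~400 W / ~$600 for the 14900KS vs ~90 W / ~$350 for the 7800X3D) — treat these numbers as the poster's claims, not measurements:

```python
# Rough value comparison using the figures quoted in the thread
# (assumed, not measured): equal average gaming performance, normalized to 1.0.
cpus = {
    "14900KS": {"perf": 1.0, "watts": 400, "price": 600},
    "7800X3D": {"perf": 1.0, "watts": 90, "price": 350},
}

def perf_per_watt(c):
    return c["perf"] / c["watts"]

def perf_per_dollar(c):
    return c["perf"] / c["price"]

for name, c in cpus.items():
    print(f"{name}: {perf_per_watt(c):.4f} perf/W, {perf_per_dollar(c):.4f} perf/$")

# At equal performance, the efficiency ratio is just the wattage ratio:
print(f"efficiency advantage: {cpus['14900KS']['watts'] / cpus['7800X3D']['watts']:.1f}x")
```

With these inputs the 7800X3D comes out roughly 4.4x more efficient per watt and ~1.7x better per dollar, which is the whole argument in two divisions.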
ROFL shitel is shit at everything.
Threadripper is way better for workstations and hedt.
>You can game just fine with Intel
>at $200 higher price than an equivalent AMD
>at double the power consumption of the equivalent AMD
>with higher temperatures that cause more throttling compared to the equivalent AMD
I added addendums to your post to correctly explain the situation
pic 1/2
I'm on a grid fed by one. My Intel CPU idles at like 10w and barely gets to 90w while gaming (because only the poors use CPU heavily for gaming). My GPU consumes only a modest 450w, though.
3 months ago
Anonymous
>My GPU consumes only a modest 450w
Intel or AMD?
GPU. Neither of those companies make GPUs worth mentioning.
power consumption means heat; cpus are not really doing anything else with the huge amounts of energy that go into them. more power consumed means more heat generated, and generating more heat means the CPU is more imperfect inside, since heat is primarily generated by poorly aligned cpu layers having a hard time connecting to each other due to smaller contact area and higher resistance. a more energy-efficient cpu means that manufacturers can squeeze more out of it. even if your energy were 100% free, you'd still prefer the more efficient cpu, because it would simply be better as long as it's in the same performance bracket
Used to be the case when AMD offered more cores and higher clock speeds than intel.
But when they started putting in estrogen cores, they fricked up gaming performance, though at the same time it improved workstation performance.
If you have a really strong cooling solution it probably doesn't matter all that much (unless you care about your bills). If you don't take power efficiency into account, Intel is better in raw performance terms.
I have a 5600, just to be clear.
yeah but why wouldn't you care about that
you're paying more for air conditioning and power just to have a chip from a blue brand?
I know, I'd care - some don't, they want or even need the BEST of the BEST.
It's up to you, really.
how did they do it?
Their CPU architecture seems to be giving them headaches again, and they're starting to plateau 'til they get to a smaller lithography.
yeah but this isn't something like 20% more power efficiency difference that you can just dismiss
the gap is unbelievably fricking massive for a TWO PERCENT increase
These threads just expose how Ganker is filled with paid pajeet marketing shills because nobody in the real world has this irrational hate for AMD like you see here outside of UserBenchmarks.
This image right here destroys the Intel vs AMD argument. It's settled. AMD won. Intel has nothing on the horizon to answer back with, and AMD is chasing Intel out of the server space with Threadripper/EPYC. Even Grace Hopper can't compete. It's both slower and doesn't have x86 compatibility.
>Power consumption
>meaning frick all
lmao; each CPU has its specialties and use cases. you call out 'shills' but totally fricking write like one
it means a lot if you want to use it to regularly make money on anything higher than an occasional hobbyist level
you want to tell the boss that you'll be doubling to tripling the power draw in a specific category because you're sucking off intel?
There's more info in that image than just 'power consumption.'
When you need 350w on top of your competitor for a 1.1% performance uplift, then your architecture is bad. You're doing it wrong.
Is google too hard for you?
Stock, in games it doesn't really go over 60w, at least per techpowerup.
It can use more power, but only under specific loads like AVX or if you overclock/disable power limits.
AMD for high end gaming machines, Intel for low end PCs and productivity machines. The 12400f destroys AM4 on performance and power efficiency, and above all it destroys them at memory latency. Also, Meteor Lake will be re-released in the future as budget CPUs as Intel moves to a smaller process node for their next gen lineup, so sticking to the ultra cheap 12100f is still worth it now as there'll be a good budget upgrade path in the future. Plus Intel APO is coming to 12th gen CPUs too.
For high end gaming machines, AMD is pretty much the only next gen option. Not worth it for budget builds because DDR5 is still shit. Furthermore, 7500f is pretty cheap if you ever manage to find one, but it still suffers from the typical AMD memory latency problem and performs the same as 12400f when memory latency becomes an issue. 7800X3D is a monster, but it's expensive and the money could go to your GPU instead. Not worth buying unless you have a 4070 Super at bare minimum.
AM4 is trash man. Not worth buying anymore. Maybe if you got a *really* good deal on 5600X3D or 5700X3D it's worth it, but otherwise it's outdated trash with PCIE 4.0 at best.
I am not saying that people should still buy AM4, but AM4 shows that AMD is willing to support a platform for a very long time. Someone could have started with a Ryzen 3600 only to later upgrade to a 5800X3D without switching the motherboard.
Intel meanwhile drops support way faster.
When I needed to upgrade my CPU, I went for a 10400F because that was the best value at the time. Now all my friends with AM4 motherboards upgraded to 5600s and 5800X3Ds, while I'm stuck with this.
One of them went from a 1600 to a 5800X3D, the dream upgrade: he tripled his CPU performance (or more in some games) for like €350.
I regret my choice to be honest, I underestimated how useful their long-term support would be.
This. Intel fanboys really downplay how good long-term socket support is, using arguments like "well you don't need to upgrade your cpu until like 5 years after"... except if you bought an R5 1600 in 2017 you could upgrade to the R7 5800X3D in 2022, while if you bought an i5 7400 in 2017 you are SOL.
ultimately, still negligible performance loss from 4 > 3, marginal at 4/3 > 2, and notable at fricking 1.1, but realistically you're not getting a fricking 1.1 slot nowadays.
Maybe if we're literally only talking about PS2 emulation, and even that's probably not down to actual performance and more down to developer incompetence.
>I strongly doubt that is the case
just look up test videos on youtube. i've bought an i3 12100f and it only has 4 cores, yet it runs gt6 at around 30-40fps, meanwhile the ryzen 3600, a same-budget CPU, can't even reach 30 despite having 6 cores. heck, it even loses in pc games because amd's IPC is trash.
>just look up test videos on youtube
Or I could just run it on my athlon.
You doughnut
What next, you're gonna tell me to check out userbenchmark?
>athlon
i had fx6300 and it was pretty shit for rpcs3, only a few smaller games ran at full speed. you're coping.
>you're coping.
Black person I have my tiny athlon PC on my desk, and you're trying to convince me that it's using some kind of shamanic ritual to trick me into believing it's working.
Because it's overkill yet affordable and absolutely stomps the shit out of both Intel and other more expensive AMD CPUs in most cases.
The 7800X3D is so good it negatively affects AMD's top line.
why did no one warn me this chip runs so hot. I understand now that it's normal according to amd, but it freaked me out. I'm like a caveman cowering at fire with this thing
Modern CPUs are designed to run as hot as possible to win technically-not-overclocking contests with very expensive cooling setups. An AM5 chip going up to 95C or whatever at the junction gets them the "performance crown", even though they should be run at the top eco mode so your fans aren't constantly ramping. You don't even lose performance doing that on a 7800X3D outside of max-load all-core benchmarks, because it's moronically power efficient.
If your temperature monitor says the CPU is running hotter than the universe one yoctosecond after the big bang, it might not have the right settings for that chipset
How is every single fricking reply so wrong holy shit
There is no fricking reason that you'd want a space heater if you're doing professional work for prolonged periods of time
Maybe on a small scale sure but if it's the main reason you're buying then frick no, that extra heat isn't worth it
Intel sucks at emulation ever since they killed AVX-512, while AMD came up with a good solution that doesn't ignite the CPU.
And games don't care about pure raw clocks, since the 7800X3D beats the 14900K on average.
OP asked which CPU is better for gaming. Not which CPU is best for your blender and adobe shit.
In that case the only answer is AMD, unless you like spending more money on chips and electricity bills than needed just to achieve performance equal to a Ryzen from a generation ago. Nobody who is building their own PC is buying Intel right now - the top 10 bestselling CPUs for PC builders are all Ryzen CPUs. And for good reason.
AMD brought back Jim Keller, who actually knew what he was doing, compared to the mongoloids that designed Bulldozer. It also helped that Intel were using their own fabs and got stuck on 14nm for years (their 10nm was just a refinement of 14nm, not an actual node shrink).
and yet still keeps up with TSMC 5nm. It's incredible how AMD still lags behind Intel's own shitty fabs, same goes for Nvidia where they also get btfo despite Nvidia using TSMC as well.
>and yet still keeps up with TSMC 5nm.
No? Intel 7/10nm is less dense than TSMC 7+ and later nodes. Intel 4 is behind TSMC N5. Intel 3 and 20A are imaginary.
>It's incredible how AMD still lags behind Intel's own shitty fabs
Are you not aware that intel is using TSMC to fab most of their chip tiles?
AMD invested in a new platform with Ryzen and the bet paid off. Intel meanwhile is still rocking the same old shit from over 10 years ago, just constantly increasing the heat tolerance, power drain and clock speeds. Intel is also working on a new platform, but just like Ryzen it will probably take at least a generation before everything is optimized for it.
In short: They sat on their ass for too long.
Wrong. AMD has the advantage of having their CPUs manufactured by TSMC, which is the biggest fab with the smallest process nodes in the world; AMD is a fabless company. Meanwhile Intel uses their own fabs, which didn't benefit from government bucks unlike TSMC until the last few years. They couldn't match TSMC's speed at upgrading their fabrication process, so they did architecture and software upgrades instead. AMD will be utterly fricked when pooh invades taiwan though. They'll end up borrowing intel's fabs one day.
Lmao, Intel is still behind tsmc's nodes, and even uses them for their I/O. Even after getting their CHIPS gibs they're still demanding several billion more.
And how many of those are standing, and how much of total production do they have? Don't forget that Taiwan's manufacturing is cheaper too; I would expect at least a 30% price increase. Data source for my estimate, you ask? It once appeared to me in a dream.
from what vague information I read years ago in passing, I recall it takes a long fricking time to build a fab anyway (something in the range of like 5-7 years or some shit? I dunno), so probably 0.
It's all bs to control the market and make it look like there's no monopoly. The creator of the Ryzen architecture is an Intel engineer: he worked for Intel, went to AMD, created Ryzen, and then went back to Intel. Look it up. The AMD team did good maturing the tech, of course, can't deny that. Same shit for AMD and Nvidia; they've been colluding on prices a few times. These tech companies want you to think they are in competition, but in reality they all work together behind the scenes.
>It's all bs to control the market and make it look like there's no monopoly, the creator of the Ryzen architecture is an Intel engineer, he worked for Intel, went to AMD, created Ryzen, and then went back to Intel. Look it up.
What are you talking about?
He's a moron that doesn't know what a corporate work force looks like. Companies in the same industry have near constant cross contamination of labor. That's why non-compete clauses are huge. 90% of corporate espionage comes from some dude taking files from his old company to the new one.
kek, look at this; and the other homosexual thinks I'm living in the past for holding onto the 4790k for so long. Guy knows fokol and wants to comment about how he wastes money to coooonsssuummmee new hardware like the fricking moron he is.
>"Fokol" in Afrikaans essentially implies a sense of nothing, of having nothing or being nothing. While the word "Niks" is the direct translation of "nothing" in Afrikaans, "Fokol" is a commonly used Afrikaans slang word.
That's obviously a loanword you dumb idiot poopface
yes
>You will immediately cease and not continue to access the site if you are under the age of 18.
no one who cares about fortnite in any capacity meets this rule
or you're a manchild or pedophile
New Intel chips have plenty of issues with older games because of the E-cores. Sims 3 for example outright won't work unless you disable E-cores or download a mod.
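Besides disabling E-cores in the BIOS, a common workaround for old games is pinning the process to P-cores only via a CPU-affinity mask (e.g. with `start /affinity <hexmask>` in a Windows command prompt). A minimal sketch of the mask arithmetic, assuming the typical 12th/13th-gen layout where the hyperthreaded P-cores expose the first logical CPUs (e.g. logical CPUs 0-15 on a 13700K) and E-cores come after — check your own chip's layout in Task Manager before trusting these numbers:

```python
# Hypothetical helper: build an affinity bitmask covering only the
# P-cores, assuming they occupy the first N logical CPUs.
# The hex form can then be used as:  start /affinity FFFF game.exe
def p_core_mask(n_logical_p_cores: int) -> int:
    # one bit per logical CPU that belongs to a P-core
    return (1 << n_logical_p_cores) - 1

mask = p_core_mask(16)     # 8 P-cores with SMT -> 16 logical CPUs
print(f"{mask:X}")         # FFFF
```

The same mask also works with Task Manager's "Set affinity" dialog or third-party tools; the bit math is identical either way.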
AMD requires (yes, requires) higher speed ram to do the same job that an Intel cpu does internally. Basically, it's taking the cost to do something and putting it elsewhere and people see that as some kind of discount. It doesn't help that the majority of buildgays have no idea what they're doing and see no issue with buying ram with higher speed than what they need anyway.
This anon knows what he's talking about. Intel shit works out of the box. AMD needs the external shit to compensate, not to mention the software suite needed.
i'm on win 10 and read it doesn't know how to use the p and e cores correctly because of the scheduler or something, but i haven't noticed any issues so far. it's fast though, 13700k
are there any tests that show the difference between the scheduler acting correctly vs win 10? i don't play many 'aaa' games, but nothing besides AI or benchmark programs like cinebench seems to come close to using it all. in my usage i haven't noticed any issues. while it might technically be an issue, i'm not seeing it so far
The last time RAM speed mattered this much was Zen 2. Now that they are using a single CCX per CCD and the cache is big enough, the inter-CCD latency (based on the infinity fabric clock, which is tied to RAM clock speed) is not that big of a deal. With X3D CPUs, the cache is so massive that RAM speed doesn't matter at all. See pic related.
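The clock coupling mentioned above can be made concrete. A sketch of the relationship as commonly described for Zen 2/3 (the 1:2 fallback threshold varies per chip, so treat the second example as illustrative):

```python
# Zen 2/3 clock relationship: at the 1:1 ratio, the Infinity Fabric
# clock (FCLK) matches the memory clock (MCLK), which is half the DDR
# transfer rate. Very fast RAM forces a 1:2 ratio, hurting latency.
def fclk_mhz(ddr_transfer_rate: int, ratio: float = 1.0) -> float:
    mclk = ddr_transfer_rate / 2   # DDR: two transfers per clock
    return mclk * ratio

print(fclk_mhz(3600))        # DDR4-3600 -> 1800.0 MHz FCLK at 1:1
print(fclk_mhz(4400, 0.5))   # too fast for 1:1 -> 1100.0 MHz at 1:2
```

This is why DDR4-3600 was the classic sweet spot: it's about the fastest RAM that still keeps the fabric at 1:1.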
RDNA3 is great, but the product naming fricks up the entire stack and it leaves a very bad impression. AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
>AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
It's not something they could have predicted; they started working on it half a decade ago. RDNA2's "gimmick" (i.e. big cache but low memory bandwidth) ended up better than they imagined; RDNA3's (i.e. chiplets) ended up worse.
I agree, RDNA3 isn't that bad: it's around 30-40% more efficient than RDNA2, which isn't a bad gen-to-gen improvement. It's about average I'd say, if not slightly above average. But when the competition improves efficiency by 70-80% in a single generation... yeah, your "mere" 30% looks bad in comparison.
>AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
pretty sure chiplets cost them less than what a monolithic die would
this amd generation was focused on increasing profit margins by lowering production costs, not pure power
That was their plan, but complex packaging costs skyrocketed due to the AI boom while silicon prices dropped. That's why RDNA4 is (according to rumors) going to be monolithic only; they scrapped their chiplet designs.
The 4080 is probably a lot cheaper to produce than the 7900XTX by now.
If they had gone with a naming convention that made the XTX the 7900 XT, the 7900 XT the 7800 XT, and so on down the stack, then the gen-on-gen uplift would have been mind-bogglingly massive. The 7900 XT sitting in between the 4080 and 4090 would have made nvidia's connector fire disasters look even more embarrassing. Instead, the "7900xt" looked like a barely-there uplift over the much, much cheaper 6950xt at launch. Imagine if they had called the 7600 a 7400 instead? That would have been a massive improvement over the previous gen's bottom card.
Well, in that case the 4080 is "actually" a XX70 die sold for Titan prices. Names are completely arbitrary, while prices mostly depend on what they believe customers are going to pay for their products.
When I say "gen-to-gen improvement", I'm comparing die size and power consumption. In terms of performance-per-dollar, Lovelace is only 20-30% better than Ampere... at best, if you compare the 3080 with the 4070 (i.e. the best Lovelace card). With the 4060, price-performance didn't improve at all compared to the 3060.
But the 4060 is a 160mm2 die, 115W. The 3060 is a 276mm2 die, 180W. That's a massive gen-to-gen improvement.
The reason they cost the same is simple: people are buying it anyway, and Nvidia makes more money selling 4060s than 3060s for $300. Not that long ago, a graphics card with a die that small would be called "XX50" and sold for $150.
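The gen-to-gen claim above can be checked with the post's own figures (4060: 160 mm² die, 115 W; 3060: 276 mm² die, 180 W — numbers as quoted, not independently verified):

```python
# Compare die area and board power across the two generations,
# using the figures quoted in the post.
gen_3060 = {"area_mm2": 276, "watts": 180}
gen_4060 = {"area_mm2": 160, "watts": 115}

area_cut = 1 - gen_4060["area_mm2"] / gen_3060["area_mm2"]
power_cut = 1 - gen_4060["watts"] / gen_3060["watts"]
print(f"{area_cut:.0%} smaller die, {power_cut:.0%} less power")
# -> 42% smaller die, 36% less power
```

Which is the post's point in numbers: the silicon got dramatically cheaper to make, but the retail price stayed flat.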
>But the 4060 is a 160mm2 die, 115W. The 3060 is a 276mm2 die, 180W
Smaller die. Less power draw. That's rather obvious. What's egregious is that the 4060 is actually 5% slower than the 3060. Lovelace's naming conventions hurt it, as well.
Truth be told, both brands have stellar hardware this gen, across the board. It's them trying to sell weaker cards as products two tiers up the stack that has everyone scratching their heads and calling shenanigans, especially when they try to use that to justify the price.
>4060 is actually 5% slower than the 3060
cope, none of the benchmarks back it up
fr fr on god that homie be sus with that ohio ahhh energy
nice cherrypicking, homosexual
now post averages across several games
>i-it doesn't count
a newer generation being worse than the older generation is not acceptable in any scenario, especially in games that are from companies like id that are known for having well-built games
>posts the only two games where 3060 can leverage its 12 gig VRAM
>i-it's 5% slower
Got a counter argument with some facts to back up your claims, or are you just going to be a homosexual?
sure, I'll just post regular benchmark instead of 2 games
>le 5% face
Yes, it's 5% slower in a few games. Has something like this ever happened before? The newer generation being slower in certain games?
An overclocked 3060 12GB at 2GHz on the core and 9000 on the vram is better than an overclocked 4060 in all games and situations that require more than 8GB of vram.
Which is most UE5 games released recently.
The 4060 should be a 4050 and cost under $200
oh and the ti version is even worse
jesus
128 bit bus and only 8GB of vram is a nasty combo.
Imagine losers that paid $400 for 4060ti..
>Imagine losers that paid $400 for 4060ti..
The 3070 wasn't that much better. Once the mining boom ended, both the 3070 and the 6800XT dropped to around $600. Guess which card sold better.
>What's egregious is that the 4060 is actually 5% slower than the 3060.
I don't know what you're talking about. 4060 gets at least 5-10 fps boost in most games. Maybe you're thinking of 3060 ti vs 4060 ti or something.
>both brands have stellar hardware this gen
>it's them trying to sell weaker cards as a product two tiers up
Yep. And prices would've normalized by now... if it wasn't for the AI boom. Nvidia doesn't even need to sell their gaming cards to make money anymore, there are 4080s rotting on shelves since launch. Eventually someone is going to buy them, why would they care.
From 2017-2018 to 2023-2024, we went through two mining booms and an AI boom.
depends on the price bracket and use case. certain games benefit MASSIVELY from the x3d cache thing, certain games run marginally better on intel; it all depends on what tier you're aiming for and what kind of hardware you want around it. Another bonus is that AMD tends to stick with their sockets, so if you buy a 7xxx cpu on a decent mobo, you're probably good for the next couple of generations. I've had the same mobo from first-gen ryzen up to this day (5xxx), and I can definitely say that it was quite convenient this way.
>full amd
oh boy, you'll save a lot on the GPU (though again, it depends on the price bracket - for example, a cheap 4070 costs the same as the slightly-more-powerful 7800xt), but at the same time you'll miss out on nvidia gimmicks and possibly lose some compatibility. Again, it's a random thing, but I remember more than one case in recent years where some game would flat out refuse to work on AMD for a while; it's all kinds of bullshit. But if you can cope with that, AMD GPU price/performance is to die for.
I guess an easy example would be how Nvidia has a bunch of proprietary compute bullshit that a lot of work stuff is kind of fricked without. Personally I haven't really looked into whether Intel has similar arrangements.
Cause the kind of software people use has been around for quite a while and it's all optimized for Intel and their current architecture. Even if AMD is technically more powerful and efficient, your blender software isn't making any use of it.
Look at emulators for example. AMD was a shit pick for anyone emulating games for quite some time, cause AMD's openGL performance used to be shit due to lack of optimization. But that got better over time, and ever since most emulators started adding Vulkan - which heavily favors AMD - AMD has been kicking Intel's ass in that field as well.
I've never, ever owned a single AMD processor. I've been buying Intel since the Pentium 3 and it has never let me down. I just go with the original, not the cheap copy.
kek I've had both intel
>2700k, 4790k
and AMD
>athlon x4 and 5600x
it's just that most boomers are still moronic from when Intel paid to win, and you've been stuck in your ways since. Nothing to do with money, just brains.
>2700k, 4790k
Seems you are the one that's been stuck in the past, zoomer. There's a correlation between disposable income and IQ; that's why stupid people pick AMD.
Nah, I looked up benchmarks and didn't need to upgrade for a long time. Honestly I could have stuck with the 4790k, but I wanted better late-game performance in paradox shit. I mean, you are still brainwashed from back when intel was paying, so I'd refrain from talking about IQ, man.
Unfortunately, boomers are still influenced by it. Ever speak to them about PC hardware? You always get vague shit like
>Intel is just more stable so I always run Intel
>I just go with the original, not the cheap copy
I'm talking about AMD fanboys seething for 30 something years. The CPU market landscape, production, and management is wholly different now. There's no reason to stick to a single brand for the sake of loyalty. Back in 2002 I'd fricking buy a duron over any crappy celeron or expensive pentium trash, but now intel is the superior budget option with lower memory latency.
what cope is this
the turbo poorgay market is super fricking tiny compared to midrange and amd is easily winning there
imagine memory latency being a fricking marketing point for you when that hardware is melting itself to death trying to compete
No one is really seething though? It is good to bring up that Intel paid, though, because, as mentioned, boomers are still influenced by it to this day and are loyal to Intel because of it. Me personally, I just buy whatever has the best performance for the cost. I've always done this; it's why I've had hardware from Intel, AMD and Nvidia over the years.
Boomers are only loyal to intel simply because it's the most visible brand. Intel had the superior production capacity and higher market penetration. You could find intel in every prebuilt PC. AMD was cheaper but rarer simply because they couldn't produce CPUs as fast as intel. For those boomers it's seemed like popularity = quality.
>budget
Wrong unless you mean super budget. You can get a 5600x - which is more than enough for pretty much everything - for less than 100 bucks. Intel only has something for the super poorgays who can't afford to spend more than 50 bucks on a CPU but those people are not looking for upgrades - they are looking for something that just barely works - and definitely can't run modern more demanding titles. The bare minimum if you will.
>5600x
>for less than 100 bucks
Where? In the used market? Did you mean those $120 warranty-less ryzen 5600 shipped from china for OEM PCs? I'd rather spend a bit more for a one year official warranty.
Checked. Apparently prices went up recently. Last time I checked it was new on Amazon for like 95 EUR - now it's 135 EUR.
>5600x
I can confirm; I got a 5600 OC'd to X levels, and it's a little beast of a CPU, runs all the modern slop at less than 60% usage
>5600X
12400f is better and cheaper.
>OCing 5600 to 5600X levels
I wouldn't do that. 5600 is a lower binned 5600X. Enjoy reliability issues in the future I guess.
>is better and cheaper
Nah. The 12400f is 140 bucks here at best - more expensive. The board is also more expensive than a cheap generic AM4 board for the 5600x.
H610M is cheaper than AM4 boards and has PCIE 5.0 ports for GPUs. The 12100f and 12400f are pretty much fine on those. The 5600X requires a moderately strong VRM like all X CPUs, so a cheap AM4 board won't cut it in the long run.
>buying H610 board
Oof.
>PCI-E 5.0
They don't have that on H610, and there is not a single GPU that supports it anyway.
>5600X requires moderately strong VRM like all X CPUs so a cheap AM4 won't cut it in the long run
Oh no, ~65W is just too much.
You are clueless.
>there is not a single GPU that supports it anyway
PCIe 5 SSDs exist, though.
It depends on the game, yes, but both are within 3-5% of each other in certain titles. https://www.youtube.com/watch?v=v5N8SzBSzsk&t=440
I'd choose AM4 over intel shit any day of the week though. If I wanted an upgrade nowadays I could buy a 5800X3D, which doesn't pull 900W or have cuck cores.
>If i wanted an upgrade nowadays I could buy a 5800x3D which doesn't pull 900w and has cuck cores
It's shit for video editing and other non-gaming tasks. The 13600K is the best CPU in that price range. It does pull more watts, but the difference is minuscule next to how much your GPU is pulling, unless you somehow keep it running at peak power. Also lower idle wattage.
Me? I'd rather stick to budget CPUs until the next generation intel and AM6 come out. 12400f and ryzen 5600 are still getting amazing fps on AAA games, even 12100f is still great. They'll be good for the next 5 years or so.
>is better
Not really, they are about the same.
>and cheaper
It depends on your region, I guess. These are the prices where I live, both use DDR4:
12400F + Cheapest H610 MOBO = €228.86
5600 + Cheapest A520 MOBO = €190.96
>they are about the same
It depends on the game. These benchmarks only test like a dozen games; that's a pretty small sample size. I believe the 12400f would perform better generally due to the lower memory latency. Starfield runs straight up like trash on AM4 systems.
>5600 + Cheapest A520 MOBO = €190.96
That mobo doesn't support PCIE 4.0+ GPUs and needs a BIOS update to run ryzen 5000 CPUs. I'd rather pay slightly more for convenience. B550Ms are cheap nowadays, just use those.
>It depends on the game.
Yes it does. That's an average, which means in some games the 5600 wins and in others the 12400F wins. On average, the performance is about the same. I can also cherrypick games, you know.
>That mobo doesn't support PCIE 4.0+ GPUs
???
PCIe is backwards compatible. A520 only supports PCIe 3.0, that's true, but it doesn't matter unless you're using a 6500XT which is limited to four lanes. But no one should be using this GPU for gaming anyway.
If you really need PCIe 4.0 for some reason (and this reason is definitely not gaming, even PCIe 2.0 is enough most of the time), with the cheapest B550 MOBO the total price goes up to €211.41. Still cheaper than Intel.
>needs BIOS update to run ryzen 5000 CPUs
No it doesn't. X570, B550 and A520 were released with Zen 3. They support Zen 3 out of the box, even if you end up with a motherboard produced four years ago and never updated since then... somehow.
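The "even PCIe 2.0 is enough" claim above rests on raw link bandwidth. A quick sketch of the approximate one-direction bandwidth of an x16 slot per generation (standard figures: GT/s per lane times line-code efficiency, 8b/10b for gen 2 and 128b/130b from gen 3 on):

```python
# Approximate one-direction bandwidth of a PCIe x16 slot per generation.
gt_per_lane = {2: 5.0, 3: 8.0, 4: 16.0}          # GT/s per lane
efficiency = {2: 8 / 10, 3: 128 / 130, 4: 128 / 130}  # line-code overhead

def x16_gb_s(gen: int) -> float:
    # 16 lanes, 8 bits per byte
    return gt_per_lane[gen] * efficiency[gen] * 16 / 8

for gen in sorted(gt_per_lane):
    print(f"PCIe {gen}.0 x16 ~ {x16_gb_s(gen):.1f} GB/s")
# gen 2 ~ 8.0, gen 3 ~ 15.8, gen 4 ~ 31.5
```

Since GPUs rarely saturate even an 8 GB/s link in games, running a modern card in a gen 3 slot usually costs a few percent at most; the exception is narrow x4/x8 cards like the 6500XT, where the lane count multiplies the penalty.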
>That's an average
Yeah, out of only a dozen or so games.
>I can also cherrypick games, you know
I picked one where the performance is straight up bad, not just less good.
>PCIe is backwards compatible
Budget cards with a tiny bus like the 6500XT run really badly on anything less than PCIE 4.0.
>with the cheapest B550 MOBO the total price goes up to €211.41
That's my point. Intel CPU + H610M is still cheaper in my region though.
>No it doesn't. X570, B550 and A520 were released with Zen 3. They support Zen 3 out of the box
The early BIOS versions don't support Zen 3. A520M is an old board so you're likely getting an outdated BIOS.
>Enjoy reliability issues
lmao, what "issues"? once you stress test everything and make sure it works, it works. haven't had any issues with my undervolt + small overclock. silicon degradation is a meme; sandy bridge cpus are still chugging away at their 4.5ghz even today. and if it happens in a decade? just feed it an extra 0.05v and the issue's solved
It's not. Stable how? They can never expand and neither can you. Copy how? Who was pasting cores together first to get multi-core chips because they couldn't make it on a single die like someone else? Who made x64?
>the most visible brand
>You could find intel in every prebuilt PC
yes because Intel paid, keep up.
R7 5800x3D if you're on ddr4 am4 mobo and gpu under 4080/7900xt.
R7 7800x3D if you're on a 4080, 7900xtx or 4090, ddr5, am5, etc.
R5 3600 if you are severely constrained for cash and on some ancient quad core and DDR3. A 3600 with a B450 mobo and 16GB of DDR4 3200 will max out a 6650 XT or a 3060 12GB.
While the combo is under $200 new.
Of course the 5600 and 5600X are great Vermeer chips, same as the 5800X3D, but with only 32MB of cache and only 6 cores.
7500f and 7600x are good ddr5 choices.
That's it.
Do not get memed into buying some quad core, or buying anything with E cores for vidya. The 12400F is the only Intel CPU that's somewhat OK if you get it cheap and with a cheap mobo, but that's a rare bundle option on sale.
Performance is about the same. You'll never be CPU-bottlenecked in a AAA videogame anyway. In lighter eSports games where CPU performance actually matters, any modern processor can pump out hundreds of frames per second.
AMD is cheaper, more efficient, AM5 offers better upgradability. If you're still using a Zen 1 or Zen 2 CPU, you can upgrade to a much better CPU (5600 or 5800X3D) for relatively cheap.
Intel isn't even THAT far behind. But they don't offer anything unique (unlike let's say Nvidia: their hardware is worse per dollar, but they have DLSS), so there's no reason to buy their products. Why would you ever buy a worse CPU when you can get a better one, even if only slightly better.
But as I said, any modern CPU is more than enough for current videogames. You can get whatever you want, both are fine, it doesn't matter.
intel currently offers superior price / performance in real world scenarios (1440p+, playing modern games)
if in the rare instance you're no longer GPU limited (4090, play less demanding games, low resolutions) then an AMD 3d cache CPU can squeeze out some extra frames, but please don't buy one of those otherwise
>(1440p+, playing modern games)
literally fricking any recent cpu made in the past few years can manage when you're gpu bound
why the frick would you pick intel when it's just drawing more power for no gain whatsoever
It's not like the Pentium/Celeron vs Athlon/Duron days anymore. Duron used to perform almost as good as pentium 3 at less than half the price, but now the difference between 2 brands is within a hair breadth. AMD aren't making ultra budget chips anymore like they used to, and even the cheapest 12100f runs any AAA title at 100+ fps. I don't understand why CPU brand wars are still a thing.
The performance and use cases have shifted severely: now it's either you have a high-cache CPU or you don't, and either you have something eating 200-250W peaks (x7000k) or you don't. Z and X series mobos now start at $200-300 and you get very few features outside of expansion and maybe some additional I/O.
And no, the 12100f is not 'good enough' unless you are making a very budget PC that will in fact play a lot of titles at 1080p with FSR/DLSS with very bouncy FPS and some really annoying stutters. A LOT of modern games eat CPUs for breakfast.
t. Have built way too many machines and sell 'boutique' gamer PCs with a bumped up price just because I put in an LED strip and 2 RGB fans in there with a case that costs more than the CPU.
>AMD GPUs run laps around Nvidia when it comes to price/performance ratio
>LMAO poorgay, where's your 4090???
>AMD CPUs obliterate Intel's for gaming purposes, being stronger, less power hungry, cooler AND cheaper
>bro, buy middle-of-the-road Intel, you never know when you'll want to start rendering stuff
At least people stopped posting Userbenchmark reviews
>you HAVE to pick a side goyim
Frick off moron. I buy whatever is good at the time i need it. Right now that means the 7800x3d for gaming until something better comes along.
AMD could also push power consumption to 400W and improve performance by 5%. But they didn't because that would be pretty stupid. Raptor Lake is actually pretty efficient, just not as efficient as Zen 4. Intel can't compete at lower TDPs, so all they can do is pump more power into the CPU. No one is ever going to buy a slower CPU, but maybe there are some customers out there who don't care about power consumption.
Radeon did the same when they had to compete with Nvidia's Pascal. If the RX 480 and GTX 1060 were priced the same, used the same amount of power, but the 1060 was 10% faster... no one was ever going to buy the 480. That's why they pushed the 580 (rebranded 480) to 200W, so that it ended up beating the 1060 (120W) by 5%.
And Nvidia did it when they had to compete with the much more efficient RDNA2, the 3080 was infamous for tripping power supplies.
Basically, whoever ends up with a worse architecture has to drive power consumption to the moon if they want to sell. Right now it's Intel, in a few years who knows.
Case in point, the 6900XT and 6950XT weren't even supposed to exist. They saw they could compete with the 3090 and they went for it, they clocked Navi21 as high as they possibly could. Completely destroying the great efficiency of RDNA2 for single digit % performance improvements.
Then they came up with the 6950XT, which was even dumber... and Nvidia responded with the 450+W 3090Ti lmao. Right now it's the other way around. Nvidia can be more conservative with clock speeds because AMD's RDNA3 is worse than Lovelace.
My point was, no architecture is inherently a power hog. You could limit the 14900K to a much more reasonable 100W TDP and it would still perform great... just not as good as Zen 4. And no one would buy it.
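The "limit it to 100W and it still performs great" point can be sketched numerically. Dynamic CPU power scales roughly with C·V²·f, and near the top of the voltage/frequency curve the voltage has to rise along with the clock, so the last GHz costs far more power than it delivers. This is an illustrative toy model with made-up numbers, not measurements of any real chip:

```python
# Rough sketch of why the last few hundred MHz are so expensive.
# Dynamic power scales roughly as P ~ C * V^2 * f, and near the top of
# the frequency curve voltage must rise with clock speed, so power grows
# close to cubically with frequency. All numbers are illustrative.

def relative_power(f_ghz, f_base=4.0, v_base=1.0, v_per_ghz=0.1):
    """Relative dynamic power vs. a baseline clock, assuming voltage
    has to rise linearly with frequency above the baseline."""
    v = v_base + v_per_ghz * max(0.0, f_ghz - f_base)
    return (v ** 2) * f_ghz / ((v_base ** 2) * f_base)

for f in (4.0, 5.0, 6.0):
    print(f"{f:.1f} GHz -> {relative_power(f):.2f}x power for {f / 4.0:.2f}x clock")
```

Under these toy assumptions, 1.5x the clock costs more than 2x the power, which is the shape of the curve that makes "pump more power into it" a losing efficiency game for whoever has the weaker architecture.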
RDNA3 is great, but the product naming fricks up the entire stack and it leaves a very bad impression. AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
>AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
pretty sure chiplets cost them less than what a monolithic die would
this amd generation was focused on increasing profit margins by lowering production costs, not pure power
Yet RDNA4 will be a monolithic die and the halo card will be a $500 4080.
stock ryzen chips are already heavily overclocked to win in benchmarks. If you limit a 7700x you get the same frequencies at the same wattage.
They can't do that with the 7800x3D. The 3D cache traps heat and the TSVs can't increase voltage as high. So by necessity they HAVE to use lower power limits.
Low clock speed and huge cache. Most games benefit from cache more than clock speed.
like this anon said
stock ryzen chips are already heavily overclocked to win in benchmarks. If you limit a 7700x you get the same frequencies at the same wattage.
They can't do that with the 7800x3D. The 3D cache traps heat and the TSVs can't increase voltage as high. So by necessity they HAVE to use lower power limits.
Only broke morons get AMD. Intel has shit like QuickSync and a heterogeneous architecture, which allows core performance to be used correctly before passing work onto another core.
If any of you gays want to learn feel free to
https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/ia-introduction-basics-paper.pdf
As a disclaimer my first AMD cpu was the shitty Tri-core bulldozer APU; like the 2nd one they ever made. It was absolute dogshit. So it was a shitty first impression.
I know I was making a joke; I like you anon.
I work at Intel actually. There's a lot of work done to make sure Windows doesn't frick over all the work done on the silicon level. Shit is assssss
You'll be fine. Just follow basic instructions, don't get ahead of yourself, and be sure to take your time. Stop if you feel overwhelmed or unsure and take a break.
intel and nvidia. single core performance is still king for a lot of things. intel/nvidia just werks with everything and gets support faster for everything when its new. you pay a tax for it sure but not being one of those goons whining on message boards about drivers is worth it
The performance is amazing because of the E and P cores and knowing what threshold is needed to pass it to a P core.
It's wizardry to make sure Windows Search doesn't have a fricking autistic rampage which leads to it shitting the bed on a P core while you game.
Also everyone reeeing about power usage.
https://www.msi.com/Handheld/Claw-A1MX
Intel is finally trying to get into the handheld market. Ticktok AMDgays
i'm on win 10 and read it doesn't know how to use the p and e cores correctly because of the scheduler or something, but i haven't noticed any issues so far. it's fast though, 13700k
>XeSS looks like desaturated shit
>Intel iGPU can't compete with AMD's APU
>need to run Win11 or the P+E core configuration goes full moron and your performance suffers
Pajeeeeeeeet.
>what's FSR look like
Don't know. AMD's rasterization is powerful enough to not need upscaling tech 🙂
>using Windows on a handheld
Need to dedicate more of that nvme to bloat my homie?
>AMD's rasterization is powerful enough
This is iGPU we're talking about. Even the top end ryzen iGPU is a lot weaker than a GTX 1650.
>steam OS
Steam deck can't even run 2020 games well at 720p. I'd rather game on a netbook than ick on eck. Both are only useful for indie games, but at least you're getting full keyboard and a bigger screen at a lower price.
>This is iGPU we're talking about.
Intel iGPUs have been dogshit for years, is meteor lake actually any good?
nta but i managed to get ai to run with the igpu and it wasn't as dogshit slow as i figured it'd be. slower than a dedicated card for sure, but not bad. for a normie it'd be enough for faceberg and youtube. i hope intel does something with that arc platform and gives us another option
They might just say frick it and place the ARC GPU shit into the CPU. So instead of iGPU it's dedicated graphics ala APU.
In its current state intel's igpu peaked like their 14nm+++++++++
i think the igpu stuff is impressive for what it is on a chip, but keeping arc on the chip would seal its fate as a nothingburger. it's too little space to have enough power and architecture and would never compete with ati/nvidia
They're gonna try some moronic AI shit first to squeeze out performance which may help but yeah they need to figure out a new hardware solution.
i keep up with ai shit and making cards specifically for it at this point is dumb. transformers for example has to go, it'll eventually be replaced. so anything they make hardware around now will be gone in a few years. there is a LOT of optimization to be done yet with everything ai, trying to generalize cards for it at this point is a mistake imo
they should be focusing on getting cards out and entering the market
NOPE gonna have to wait for at least arrow or panther lake.
best case they can shrink the die from the Arc GPUs
are indians buying higher quality products now or something?
what's the meme?
they aren't buying heavily discounted ryzen cpus and making excuses for why they don't need ray tracing?
>the pajeets at work defend intel like their lives depend on it
>meanwhile everyone with an iq above room temperature uses AMD
Really makes you think...
Man I really don't want to but I guess I should move over to AM5. I was planning on just upgrading to the 5800X3D but I might as well just do a rebuild. I already don't like my 3800X (which I got a nice mobo deal with when the 3900X was out of stock forever). At least all I need to do besides buy the necessary upgrades is contact Noctua to get an AM5 mounting bracket.
>DDR5+7800X3D vs DDR4+5800X3D is a fairly minor difference
I have to disagree. You can easily upgrade AM5 CPU several years later if you have a decent motherboard. AM4 is good if you want to save some money right now or even skip AM5 era.
Why would he spend 500$ more instead of just getting a 5800X3D and some more RAM, homie? The CPU is still a top performer, you could get almost half a current GPU for that money.
in the future, don't plan to ever upgrade. spend the money up front and get extra ram, processing power that you think you won't need. i got 9 years out of my last build and it cost $1200 for the tower at the time, before prices went nuts
>bought 7900X because it was on a deep discount with a mobo
>turns out 7800X3D is better for games
oh well, still a big step up from my i5 2500k, I'll miss the little guy
If you loved the 2500k that much why not get its 10-year glow-up, the 12600K?
You some sort of brokie?
https://ark.intel.com/content/www/us/en/ark/products/134589/intel-core-i5-12600k-processor-20m-cache-up-to-4-90-ghz.html
No, Ramdeep, I do not. How does that change the fact that everyone in this thread shilling for Intel has awkward English as if it's a second language for them?
Hell, even this unrelated counter argument seems like something a pajeet would do.
Riveting comeback, Ranpoo. It doesn't make my observation any less poignant or correct.
Rangebanning every third world country would solve 90% of the problems on Ganker
All of Ganker would immediately improve if non-US IPs were immediately banned. I'd say the UK, France, Canada, and Australia can come back, but only the rural IP addresses. Urban areas might as well be considered third world at this point.
3d cache thingy is to die for, 12 cores is only good for i dunno, ffmpeg or something, it's more of a workstation CPU. then again, i doubt you're getting any framerate issues in vidya on 7900x
i already had the whole pc, old cpu got fried. otherwise i'd probably just get a macbook. i'm sick of computers and technology in general, just want good drivers for audio and aggregate devices
what cooler do you have now?
i just put the cooler that came with it, but also have old ass nepton 240m somewhere
You can get away with something from Thermalright or Noctua if it didn't come with an OEM cooler. The 12700k is a good CPU, even with the cuckcore nonsense. You did well.
It's good. It's not a K CPU, you don't need a water cooler for it, just get a decent air cooler like deepcool AK500 or AK620.
>quiet
>good temps all year long
There is no reason for custom loop unless you are going way out of standard hardware specs (OCing) or because you want to have a "cool" aquarium PC. AIO is simple, easy to install, and just works.
You can get away with something from Thermalright or Noctua if it didn't come with an OEM cooler. The 12700k is a good CPU, even with the cuckcore nonsense. You did well.
12700k was fine when it came out, certainly a lot better than previous intel chips. Out of all the times to buy intel, that was probably the best since the original ryzen.
I was full on frustrated with AMD back in the Phenom and Bulldozer days. They had gone from being a respectable brand to irredeemable garbage. Ryzen being this good was like being blindsided.
I definitely wouldn't call it bad of course, especially considering I went from a 3930k to that, not to mention it also pairs decently enough with my 4090 when playing at 4k.
Yet I still can't help but wish I had more CPU power for those games where I'm CPU bound, or for the heaviest rpcs3 games like inFamous.
i built my last build around emulation and it was right before rpcs3 started to become usable and actually made use of multiple cores over something like pcsx2. it was an i5 4060k i think? i managed to play through ac4/4a on xenia using vulkan but so much of the graphics were fricked, i had to make cheats because i couldn't see that i was being railed by a giant laser. i have a 13700k now but haven't tried rpcs3 yet, i'm hoping it's good
i remember thinking 'oh frick' when reading about how rpcs3's approach was to emulate each core to a core on your comp, which i didn't have enough for. i knew my new build was already kinda fricked. but it gave me so much great emulation, i have no complaints
Whichever one you find the cheapest, but if both are around the same price, then it's the X3D series.
What Intel mostly has going for it is that you get to keep DDR4 memory if you pick a DDR4-based mobo. Great for those who bought like 64GB on their last system, so you don't shell out $400 on DDR5.
Supporting a brand isn't a thing anymore and there's no need to fanboy over a chip maker. Just get the best one in your price range at the time you want to buy a CPU.
I have a 7600X and I got an Asrock B650 PG Lightning, from my research it ended up being the best mobo for its price range. Also has a nice M.2 5.0 slot.
should be more than enough for now, but yeah, you did lose some performance. i don't think i'd care all that much, unless you're playing this one or two very specific games that improve massively with 3d cache and are demanding enough to cause issues on modern high-end hardware, and the only one I can think of is VRChat
It didn't seem worth twice the price for slightly better performance. The plan was to just get on AM5 as cheap as possible and then upgrade to a better one at end of life like you could with 5800X3d on AM4. Also 7600 seemed to have the same performance as 7800X3D in some older games
Worst case scenario you're probably getting 20 or so fewer FPS with a 7600 over a 7800X3D. I doubt 20 FPS is worth 250 bucks of price difference, especially when 250 more bucks on a GPU would give you a meatier FPS boost.
Plus, you can always just buy a 7800X3D in some years when its price is low, since you have an AM5 setup now (provided you weren't moronic and didn't buy a trash mobo).
7800x3d. The additional cache makes a difference in 1% lows. I have a 5800x3d; I will not buy anything without a large cache. Everything else is junk, garbage, a waste of silicon. Intel, junk. AMD non-3D CPUs, junk.
Put it this way, Ganker shits on everything Intel & Nvidia and shills AMD but whenever a new game comes out the people b***hing about crashes are always AMD users.
>the people b***hing about crashes are always AMD users.
thats the funny part. ati always sucked. nvidia was good before intel bought them and is still good. amd just can't keep up in processors, which are hardly relevant now anyways because everything is so fast that its the gpu that matters
intel-nvidia, can't go wrong even if you pay the tax
>. nvidia was good before intel bought them and is still good. amd just can't keep up in processors, which are hardly relevant now anyways because everything is so fast that its the gpu that matters
what
Is this price fricked up?
Amd Ryzen 5 5600G 4.4ghz
16GB Ram 3200mhz DDR4(2x8)
SSD 480GB
600w atx 20+4
600 usd in 1 payment or 800 in 3... I swear I've seen this pc for 300 on amazon or some shit
i was a decade out of date from my last build and i just monitored the pc threads for recommendations, eventually asked a few questions and i'm very happy with my build. unlike the rest of Ganker, which is always mean, you can ask dumb questions there and get a non-meme response
Depends on what you mean. In Dead Space if you run it at 200 fps you can break the engine and do wild stuff. Speedrunners use it to their advantage. REmake 2 also has weird things like a knife attack doing multiple hits instead of one on high fps.
In Monster Hunter World, some weapons (bowguns with piercing bullets, for example) hit more often than originally intended at higher framerates. A normal user is never going to notice, but this breaks speedruns. There's a mod to fix this.
In MH Rise, monsters used to track their moves better at higher framerates. On Switch (or on PC at 30 FPS), you could dodge some moves by just walking (this was intended behaviour). On PC they would snipe the shit out of you. This was fixed by Capcom a few months after the PC release.
In RE4 HD (the original, not the remake) enemies throw items (like grenades) twice as often at 60 FPS compared to 30. Also, you have half the time to beat QTEs.
In RE1 HD Remake, zombies track the player better at higher framerates. It's easier to run around them and not get grabbed at 30 FPS compared to 60.
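A hedged sketch of the common mechanism behind examples like these: when game logic rolls a per-frame chance (or counts frames for a timer), doubling the framerate doubles how often the check fires. This is a toy illustration of the pattern, not code from any of the games mentioned, and the function and numbers are made up:

```python
# Toy model of framerate-dependent game logic: an enemy 'decides' to
# throw something by rolling a fixed chance once per rendered frame.
# At 60 FPS the roll happens twice as often as at 30, so on average the
# enemy throws about twice as often per minute of play.

import random

def throws_per_minute(fps, per_frame_chance=0.002, seconds=60, seed=42):
    """Count throw decisions when the check runs once per frame
    instead of once per unit of simulated time."""
    rng = random.Random(seed)
    frames = fps * seconds
    return sum(1 for _ in range(frames) if rng.random() < per_frame_chance)

print(throws_per_minute(30))  # fewer decision checks
print(throws_per_minute(60))  # roughly twice as many on average
```

The usual fix is to scale the roll by delta time, or to run game logic on a fixed tick decoupled from rendering, which is presumably what the patches and mods mentioned above do.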
No idea. RE1 HD Remake is a port of the GC release (you could choose between 50hz and 60hz mode on the PAL version. Everyone played at 60hz, unless your TV was ancient and didn't support it). It was probably a tiny bit easier if played at 50hz, I assume.
Personally, I tried both 30 and 60 FPS on the PC version, and I couldn't tell the difference in zombie behaviour. But it can mess with speedruns apparently.
It has nothing to do with framerate, but the first NA version of RE1 on PS1 is way harder than the JP version. Because it doesn't support autoaim for whatever reason. You press the aim button and the character doesn't automatically turn towards the enemy.
>It has nothing to do with framerate, but the first NA version of RE1 on PS1 is way harder than the JP version.
They all have weird quirks and small changes. I watched some dude beating RE2 without pressing forward once and he picked some version of the game because of something but I don't even remember what his reasoning was.
i don't have anything to add about anon's new build but i just upgraded to a 13700k, a 4070 suti and an nvme drive. i had a normal hdd before and was still enjoying sims 3. amazing difference
Does it? I've never looked at comparisons in that setting since I just play games and use photoshop/Office. I doubt I'd get an Intel anyway since I'm in Europe and intel CPUs are ridiculously expensive, like 50% higher than their price in USA, while AMD CPUs are very affordable.
yeah but this isn't something like 20% more power efficiency difference that you can just dismiss
the gap is unbelievably fricking massive for a TWO PERCENT increase
I don't like Intelaviv but I have DDR4 sticks a plenty so I'm probably going to get one when my 9th gen starts struggling, there's literally no games worth upgrading for.
I say this as an owner of an AMD CPU and GPU: Intel. I've had nothing but problems since switching from Intel + Nvidia and my next PC will be avoiding AMD at all costs.
AMD for home
Intel for work
ROFL shitel is shit at everything.
Threadripper is way better for workstations and hedt.
>intel with its massive security issues for work
Depends on the game. If a game doesn't utilize all cores then core speed is crucial which means intel.
You can game just fine with Intel
You can't work on something professional with AMD
>You can game just fine with Intel
>at $200 higher price than an equivalent AMD
>at double the power consumption of the equivalent AMD
>with higher temperatures that cause more throttling compared to the equivalent AMD
I added addendums to your post to correctly explain the situation
pic 1/2
pic 2/2
sorry, I need to correct myself, I should have said OVER THREE TIMES THE POWER CONSUMPTION of the equivalent AMD, my mistake
>295
Waffle makers by Intel when?
Imagine living in a shithole without almost free power.
Is your PC directly supplied by pic related? It better be if you're going high end Intel+Nvidia
I'm on a grid fed by one. My Intel CPU idles at like 10w and barely gets to 90w while gaming (because only the poors use CPU heavily for gaming). My GPU consumes only a modest 450w, though.
>My GPU consumes only a modest 450w
Intel or AMD?
GPU. Neither of those companies make GPUs worth mentioning.
power consumption means heat; cpus are not really doing anything else with those huge amounts of energy that go into them. virtually every watt a cpu draws ends up as heat, through switching losses and leakage current in the transistors, so more power consumed -> more heat generated. a more energy efficient cpu means the manufacturer can squeeze more performance out of it. even if your energy were 100% free, you'd still prefer the more efficient cpu, because at the same performance bracket it's simply the better chip
Yet there's no application where I actually use my CPU heavily in 2024.
>more expensive
>3x the power draw
>less performance
Damn it Intel. it's over. Pack it up. Go home.
AMD is king for gaming.
Intel is king for workstations.
Intel can also be used for gaming but you have to put way more money into it to get performance equal to that of a Ryzen. Just look at the 14900KS.
>400W or even more
>over 600 bucks
>for gaming performance equal to that of a 7800x3D
>which consumes less than 90W
>and only costs around 350 bucks
That and AM4/AM5 are long-lived, cheap platforms, while keeping up with Intel is generally more expensive as you constantly upgrade parts.
Well, that and Intel also runs way hotter so i guess you might have to spend less on your heaters next winter.
>AMD is king for gaming.
>Intel is king for workstations.
isn't that the other way around ?
Used to be the case when AMD offered more cores and clock speeds than intel.
But when they started putting in estrogen cores, they fricked up gaming performance, while at the same time improving workstation performance
power efficiency is king in workstation shit
why would you settle for intel if you're actually maxing that shit
If you have a really strong cooling solution it probably doesn't matter all that much (unless you care about your bills). If you don't take power efficiency into account, intel is better on raw performance.
I have a 5600, just to be clear.
yeah but why wouldn't you care about that
you're paying more for air conditioning and power just to have a chip from a blue brand?
I know, I'd care - some don't, they want or even need the BEST of the BEST.
It's up to you, really.
Their CPU architecture seems to be giving them headaches again and they start to plateau 'till they get to a smaller lithography.
yeah but this isn't something like 20% more power efficiency difference that you can just dismiss
the gap is unbelievably fricking massive for a TWO PERCENT increase
These threads just expose how Ganker is filled with paid pajeet marketing shills because nobody in the real world has this irrational hate for AMD like you see here outside of UserBenchmarks.
This image right here destroys the Intel vs AMD argument. It's settled. AMD won. Intel has nothing on the horizon to answer back with, and AMD is chasing Intel out of the server space with Threadripper/EPYC. Even Grace Hopper can't compete. It's both slower and doesn't have x86 compatibility.
>Power consumption
>meaning frick all
lmao; each CPU has their specialties and use case. you call out 'shills' but totally fricking write like one
it means a lot if you want to use it to regularly make money on anything higher than an occassional hobbyist level
you want to tell the boss that you'll be doubling to tripling the power draw in a specific category because you're sucking off intel?
There's more info in that image than just 'power consumption.'
When you need 350w on top of your competitor for a 1.1% performance uplift then your architecture is bad. You're doing it wrong.
>consumes less than 90W
moron
Is google too hard for you?
Stock, in games it doesn't really go over 60W, at least in TechPowerUp's testing.
It can use more power, but only under specific loads like AVX, or if you overclock/disable power limits.
AMD for high end gaming machines, Intel for low end PCs and productivity machines. The 12400F destroys AM4 on performance and power efficiency, and it utterly destroys them at memory latency. Also, Meteor Lake will be re-released in the future as budget CPUs as Intel moves to a smaller process node for their next gen lineup, so sticking to the ultra cheap 12100F is still worth it now, as there'll be a good budget upgrade path in the future. Plus Intel APO is coming to 12th gen CPUs too.
For high end gaming machines, AMD is pretty much the only next gen option. Not worth it for budget builds because DDR5 is still shit. Furthermore, 7500f is pretty cheap if you ever manage to find one, but it still suffers from the typical AMD memory latency problem and performs the same as 12400f when memory latency becomes an issue. 7800X3D is a monster, but it's expensive and the money could go to your GPU instead. Not worth buying unless you have a 4070 Super at bare minimum.
AM4 is trash man. Not worth buying anymore. Maybe if you got a *really* good deal on 5600X3D or 5700X3D it's worth it, but otherwise it's outdated trash with PCIE 4.0 at best.
I am not saying that people should still buy AM4, but AM4 shows that AMD is willing to support a platform for a very long time. Someone could have started with a Ryzen 3600 only to later upgrade to a 5800X3D without switching the motherboard.
Intel meanwhile drops support way faster.
When I needed to upgrade my CPU, I went for a 10400F because that was the best value at the time. Now all my friends with AM4 motherboards upgraded to 5600s and 5800X3Ds, while I'm stuck with this.
One of them went from a 1600 to a 5800X3D, the dream upgrade: he tripled his CPU performance (or more in some games) for like €350.
I regret my choice to be honest, I underestimated how useful their long-term support would be.
This. Intel fanboys really downplay how good long term socket support is, using arguments like "well you don't need to upgrade your cpu until like 5 years later"... except if you bought an R5 1600 in 2017 you could upgrade to the R7 5800X3D in 2022, while if you bought an i5 7400 in 2017 you are SOL.
>upgrading to a $300 high end CPU while reusing the same crappy weak VRM motherboard that needs BIOS update
I wouldn't want to be that dumb.
wtf are you talking about, B350s and 370s support X3Ds without an issue.
>inb4 muh PCI 4.0
There have been so many tests to see if it matters, it doesn't. The 4090 barely loses performance between the two of them.
I upgrade my entire PC every 2.5 years because I'm not a poorgay.
>pcie 4.0 at best
Black person that shit is irrelevant. pci3 would be more than fine for damn near everything, let alone 4.
Yeah, pci gen bottlenecking on modern motherboards is a total scam.
I'm assuming that all the non-4090s are capped at their (relative) 100% from a 4.0/5.0 slot?
Not sure.
ultimately, still negligible performance loss from 4 > 3, marginal at 4/3 > 2, and notable at fricking 1.1, but realistically you're not getting a fricking 1.1 slot nowadays.
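The per-generation numbers behind that can be worked out from the published per-lane transfer rates. A back-of-the-envelope sketch (theoretical usable bandwidth after line-encoding overhead, not measured throughput):

```python
# Rough PCIe bandwidth per generation, to show why a x16 card barely
# cares about gen 3 vs 4 while a x4 card like the 6500 XT does.
# Derived from the published per-lane transfer rates and encoding
# overheads; real-world throughput is a bit lower.

GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}  # GT/s per lane

def bandwidth_gbps(gen, lanes):
    """Usable GB/s for a link: raw transfer rate minus line-encoding
    overhead (8b/10b for gen 1-2, 128b/130b for gen 3+)."""
    encoding = 8 / 10 if gen <= 2 else 128 / 130
    return GT_PER_LANE[gen] * encoding * lanes / 8

for gen in (1, 2, 3, 4):
    print(f"gen {gen}: x16 = {bandwidth_gbps(gen, 16):5.1f} GB/s, "
          f"x4 = {bandwidth_gbps(gen, 4):4.1f} GB/s")
```

A gen 3 x16 slot still moves about 15.7 GB/s, which is why a 4090 barely loses anything, while a x4 budget card drops to about 3.9 GB/s on gen 3, matching the 6500 XT complaints earlier in the thread.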
intel destroys amd on emulation so I picked that
Maybe if we're literally only talking about PS2 emulation, and even that's probably not down to actual performance and more down to developer incompetence.
cope. rpcs3 runs way better on intel than on amd on similar price cpus
Considering you can run rpcs3 on an Athlon, I strongly doubt that is the case.
>I strongly doubt that is the case
just look up test videos on youtube. i bought an i3 12100f and it only has 4 cores, yet it runs gt6 at around 30-40fps, meanwhile a ryzen 3600, a similar budget CPU, can't even reach 30 despite having 6 cores. heck, it even loses in pc games because amd's IPC is trash.
>just look up test videos on youtube
Or I could just run it on my athlon.
You doughnut
What next, you're gonna tell me to check out userbenchmark?
>athlon
i had an fx6300 and it was pretty shit for rpcs3, only a few smaller games ran at full speed. you're coping.
>you're coping.
Black person I have my tiny athlon PC on my desk, and you're trying to convince me that it's using some kind of shamanic ritual to trick me into believing it's working.
I guess my 5700x shits on both of those so I'll be fine
Thanks buddy. Enjoy your budget build and have fun
Why does the 7800X3D get so much hype?
Because it's overkill yet affordable and absolutely stomps the shit out of both Intel and other, more expensive AMD CPUs in most cases.
The 7800X3D is so good it negatively affects AMD's top line.
Because it's the best overall for games, less than $400, and has a sub 90 watt draw at peak
It's the 1080ti of CPUs
Best bang for buck gaming chip in a long time.
the new i5 2500k
why did no one warn me this chip runs so hot. I understand now that its normal according to amd but it freaked me out. I'm like a caveman cowering at fire with this thing
Modern CPUs are designed to run as hot as possible to win technically-not-overclocking-but-very-expensive-cooling setups. An AM5 going up to 95c or whatever at the junction point gets them the "performance crown", even though they should be run at the top eco mode to not ramp your fans constantly. You don't even lose performance doing that on a 7800x outside of max-load all-core benchmarks because it's moronicly power efficient.
>to win
*journo benchmarks with
Are you sure you aren't just using some version of speccy from 2002 like every giant moron on the internet
no I'm not sure about that, I'm a massive moron.
If your temperature monitor says the CPU is running hotter than the universe one yoctosecond after the Big Bang, it might not have the right settings for that chipset
>Are you sure you aren't just using some version of speccy from 2002 like every giant moron on the internet
That's just this stupid shithole board.
>depends on the model
>depends on the series
>depends on the software
>depends on the other hardware used
Every generalization is made by morons or fanboys(morons)
How is every single fricking reply so wrong holy shit
There is no fricking reason that you'd want a space heater if you're doing professional work for prolonged periods of time
Maybe on a small scale sure but if it's the main reason you're buying then frick no, that extra heat isn't worth it
Intel sucks at emulation ever since they killed avx512 while AMD has come up with a good solution that doesn't ignite the CPU
And games don't care about pure raw clocks since 7800X3D beats the 14900k on average
AMD for gaming
Intel for gaming+rendering+machine learning+CAD etc
the one with more L2 cache
>for gaming
>yea but professional ???
can't people read or something?
Amd is better for work because in case you need a HEDT, Intel has none.
OP asked which CPU is better for gaming. Not which CPU is best for your blender and adobe shit.
In that case the only answer is AMD unless you like spending more money on chips and electricity bills than needed just to achieve equal performance to a Ryzen from a generation ago. Nobody who is building their own PC is buying Intel right now - the top 10 bestselling CPUs for PC builders are all Ryzen CPUs. And that for a good reason.
>And that for a good reason.
nobody who is building their own pc has a job right now
Pretty sure NEETs aren't spending two months' pay on a 7800X3D anon
What happened to Intel? Why did they get lapped so hard?
They've been coasting off the success of core2duo all this time while AMD had to invent Ryzen to not die
>Ryzen
Why was Ryzen so special?
AMD brought back Jim Keller who actually knew what he was doing, compared to the mongoloids that designed Bulldozer. It also helped that Intel were using their own fabs and got stuck on 14nm for years (their 10nm was just a refinement of 14nm, not an actual node shrink).
14nm++++++++++++++++
Now on 10nm+++++
>20A will fix it
and yet still keeps up with TSMC 5nm. It's incredible how AMD still lags behind Intel's own shitty fabs, same goes for Nvidia where they also get btfo despite Nvidia using TSMC as well.
>and yet still keeps up with TSMC 5nm.
No? Intel 7/10nm is less dense than TSMC 7+ and later nodes. Intel 4 is behind TSMC N5. Intel 3 and 20A are imaginary.
>It's incredible how AMD still lags behind Intel's own shitty fabs
Are you not aware that intel is using TSMC to fab most of their chip tiles?
AMD invested into a new platform with Ryzen and the bet paid off. Intel meanwhile are still rocking the same old shit from over 10 years ago, just constantly increasing the heat output, power drain and clock speed. Intel are also working on a new platform but just like Ryzen it will probably take at least a generation before everything is optimized for it.
In short: They sat on their ass for too long.
Wrong. AMD has the advantage of having their CPUs manufactured by TSMC, which is the biggest fab with the most advanced process nodes in the world. AMD is a fabless company. Meanwhile Intel uses their own fabs, which didn't benefit from government bucks unlike TSMC, until the last few years. They couldn't match TSMC's speed at upgrading their fabrication process so they did architecture and software upgrades instead. AMD will be utterly fricked when pooh invades taiwan though. They'll end up borrowing intel fabs one day.
Lmao Intel is still behind tsmc's nodes, and even uses them for their I/O. Even after getting their CHIPS gibs they're still demanding several more $billion.
>AMD will be utterly fricked when pooh invades taiwan though.
Everyone would be fricked. Chip shortages would drive the prices insanely high.
isn't that why various countries started (re)building their own fabs after the last shortage?
And how many of those are standing and how much of the total production do they have? Don't forget that Taiwan's manufacturing is cheaper too. I would expect at least a 30% price increase. Data source for my estimate you ask? It once appeared to me in a dream.
from what vague information I read years ago in passing, I recall it takes a long fricking time to make a fab anyway (something in the range of like 5-7 years or some shit? I dunno), so probably 0.
It's all bs to control the market and make it look like there's no monopoly, the creator of the Ryzen architecture is an Intel engineer, he worked for Intel, went to AMD, created Ryzen, and then went back to Intel. Look it up. The AMD team did well maturing the tech of course, can't deny that. Same shit for AMD and Nvidia, they've been colluding on prices a few times. These tech companies want you to think they are in competition but in reality they all work together behind the scenes.
>It's all bs to control the market and make it look like there's no monopoly, the creator of the Ryzen architecture is an Intel engineer, he worked for Intel, went to AMD, created Ryzen, and then went back to Intel. Look it up.
What are you talking about?
He's a moron that doesn't know what a corporate work force looks like. Companies in the same industry have near constant cross contamination of labor. That's why non-compete clauses are huge. 90% of corporate espionage comes from some dude taking files from his old company to the new one.
They got lazy when AMD couldn't compete and made the same chips for 6 generations.
kek look at this and the other homosexual thinks I'm living in the past for holding onto the 4790k for so long. Guy knows fukol and wants to comment about how he wastes money to coooonsssuummmee new hardware like the fricking moron he is.
>fukol
wat
Ah sorry friend, I mean fokol. 2nd language.
it's frickall
>“Fokol” in Afrikaans essentially implies a sense of nothing, of having nothing or being nothing. While the word “Niks” is the direct translation of “nothing” in Afrikaans, “Fokol” is a commonly used Afrikaans slang word.
That's obviously a loanword you dumb idiot poopface
Meanwhile with AMD, the AMD video drivers are CPU heavy causing a limit to occur with the 7800X3D.
The absolute state of AMDrones.
>nvidiots care about fortnite
your opinion has been discarded
>NO NOT LIKE THAT
everytime, kek
yes
>You will immediately cease and not continue to access the site if you are under the age of 18.
no one who cares about fortnite in any capacity meets this rule
or you're a manchild or pedophile
>CPU thread
>posts GPU benchmarks
>compares a $1000 GPU against a $2000 GPU
>fortnite
Ganker was a mistake
Absolute moron, those benchmarks showcase the CPU limit you stupid mongoloid Black person.
OH N-
Their foundry side is now totally inferior to Taiwan, generations behind at this point, and is dragging down the chip design side
>buying Intel
>ever
Buyers remorse much?
>every time someone has problems running a game, they have an AMD
>never once saw an intel user have the same problem
New Intel chips have plenty of issues with older games because of the E-cores. Sims 3 for example outright won't work unless you disable E-cores or download a mod.
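You don't even need a mod for most old games, just pin the process to the P-cores. Rough sketch of the affinity math, assuming (like on most 12th/13th gen parts) the 16 P-core threads are enumerated as logical CPUs 0..15 before the E-cores; check your own topology before trusting the mask:

```python
# Affinity mask sketch: assumes the P-core hyperthreads come first as
# logical CPUs 0..15 and the E-cores after them. That layout is typical
# for 12th/13th gen Intel but not guaranteed, so verify with your own
# core topology first.

def pcore_mask(p_cores: int = 8, threads_per_core: int = 2) -> int:
    """Bitmask selecting the first p_cores * threads_per_core logical CPUs."""
    return (1 << (p_cores * threads_per_core)) - 1

print(hex(pcore_mask()))      # -> 0xffff for 8 P-cores with HT
print(hex(pcore_mask(6, 2)))  # -> 0xfff for a 6 P-core chip

# Apply it on Linux with os.sched_setaffinity(0, range(16)),
# or on Windows from cmd:  start /affinity FFFF game.exe
```

The game then only ever gets scheduled on P-cores, without touching BIOS or rebooting.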
Don't know about AMD but Intel now thermal throttles unless you have good CPU cooling.
How much would I save by going full AMD? I still have a 980 and 4790k.
AMD requires (yes, requires) higher speed ram to do the same job that an Intel cpu does internally. Basically, it's taking the cost to do something and putting it elsewhere and people see that as some kind of discount. It doesn't help that the majority of buildgays have no idea what they're doing and see no issue with buying ram with higher speed than what they need anyway.
This anon knows what he's talking about. Intel shit works out of box. AMD needs the external shit to compensate not to mention the software suite needed.
https://www.intel.com/content/www/us/en/gaming/resources/how-hybrid-design-works.html
Windows scheduler is so moronic Intel just said frick it and put that shit on the silicon.
are there any tests that show the difference between the scheduler acting correctly vs win 10? i dont play many 'aaa' games but nothing besides AI or benchmark programs like cinebench seems to come close to using it all. in my usage i haven't noticed any issues. while it might technically be an issue, i'm not seeing it so far
The last time RAM speed mattered so much was Zen 2. Now the cache is big enough and these chips use a single CCD, so cross-CCD latency (based on Infinity Fabric clock speed, which is tied to RAM clock speed) is not that big of a deal. With X3D CPUs, the cache is so massive RAM speed doesn't matter at all. See pic related.
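The fabric math, for anyone who wants to sanity-check their own kit. Sketch with assumed numbers: the ~1900 MHz FCLK cap is what typical AM4 silicon holds in 1:1 mode, not a spec, so treat it as illustrative:

```python
# Sketch of the AM4 clock relationship: a DDR4 "speed" rating is a
# transfer rate (MT/s); the actual memory clock is half that, and
# Infinity Fabric (FCLK) is usually run 1:1 with the memory clock up to
# roughly 1800-1900 MHz. The cap below is an assumption, not a spec.

def memclk_mhz(ddr_rate: int) -> int:
    """DDR is double data rate, so the real clock is half the MT/s rating."""
    return ddr_rate // 2

def fclk_1to1_mhz(ddr_rate: int, fclk_cap: int = 1900) -> int:
    """FCLK in 1:1 mode, limited by what typical AM4 silicon can hold."""
    return min(memclk_mhz(ddr_rate), fclk_cap)

print(memclk_mhz(3200))     # DDR4-3200 -> 1600 MHz memory clock
print(fclk_1to1_mhz(3600))  # DDR4-3600 -> FCLK 1800, the common sweet spot
print(fclk_1to1_mhz(4400))  # past the cap the fabric falls out of 1:1
```

Which is why DDR4-3600 became the default AM4 recommendation: it's about the fastest kit that still keeps the fabric synced 1:1.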
>AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
It's not something they could predict, they started working on it half a decade ago. RDNA2's "gimmick" ended up better than they imagined (ie, big cache but low memory bandwidth), RDNA3's (ie, chiplets) ended up worse.
I agree, RDNA3 isn't that bad: it's around 30-40% more efficient than RDNA2, which isn't a bad gen-to-gen improvement. It's about average I'd say, if not slightly above average. But when the competition improves efficiency by 70-80% in a single generation... yeah, your "mere" 30% looks bad in comparison.
That was their plan, but complex packaging costs skyrocketed due to the AI boom. While silicon price dropped. That's why RDNA4 is (according to rumors) going to be monolithic only, they scrapped their chiplet designs.
The 4080 is probably a lot cheaper to produce than the 7900XTX by now.
If they went with a naming convention that had the XTX as the 7900xt and the 7900xt as the 7800xt, and so on, and so forth down the stack, then the gen-on-gen uplift would have been mindbogglingly massive. The 7900xt sitting in between the 4080 and 4090 would have made nvidia's connector fire disasters look even more embarrassing. Instead the "7900xt" looked like a barely there uplift from the much, much cheaper 6950xt at launch. Imagine if they called the 7600 a 7400 instead? That would have been a massive improvement over the previous gen's bottom card.
Well, in that case the 4080 is "actually" a XX70 die sold for Titan prices. Names are completely arbitrary, while prices mostly depend on what they believe customers are going to pay for their products.
When I say "gen-to-gen improvement", I'm comparing die size and power consumption. In terms of performance-per-dollar, Lovelace is only 20-30% better than Ampere... at best, if you compare the 3080 with the 4070 (ie, the best Lovelace card). With the 4060, price-performance didn't improve at all compared to the 3060.
But the 4060 is a 160mm2 die, 115W. The 3060 is a 276mm2 die, 180W. That's a massive gen-to-gen improvement.
The reason they cost the same is simple: people are buying it anyway, and Nvidia makes more money selling 4060s than 3060s for $300. Not that long ago, a graphics card with a die that small would be called "XX50" and sold for $150.
>But the 4060 is a 160mm2 die, 115W. The 3060 is a 276mm2 die, 180W
Smaller die. Less power draw. That's rather obvious. What's egregious is that the 4060 is actually 5% slower than the 3060. Lovelace's naming conventions hurt it, as well.
Truth be told both brands have stellar hardware this gen. Across the board. it's them trying to sell weaker cards as a product two tiers up the stack that has everyone scratching their heads and calling shenanigans. Especially when they try to use that to justify the price.
>4060 is actually 5% slower than the 3060
cope, none of the benchmarks back it up
fr fr on god that homie be sus with that ohio ahhh energy
nice cherrypicking, homosexual
now post averages across several games
>i-it doesn't count
a newer generation being worse than the older generation is not acceptable in any scenario, especially in games that are from companies like id that are known for having well-built games
>posts the only two games where 3060 can leverage its 12 gig VRAM
>i-it's 5% slower
Got a counter argument with some facts to back up your claims, or are you just going to be a homosexual?
sure, I'll just post regular benchmark instead of 2 games
>le 5% face
Yes, it's 5% slower in a few games. Has something like this ever happened before? The newer generation being slower in certain games?
Overclocked 3060 12GB at 2GHz for the gpu and 9000 for vram is better than overclocked 4060 in all games and situations that require more than 8GB of vram.
Which is most UE5 games released recently.
4060 should be 4050 and cost under $200
oh and the ti version is even worse
jesus
128 bit bus and only 8GB of vram is a nasty combo.
Imagine losers that paid $400 for 4060ti..
>Imagine losers that paid $400 for 4060ti..
The 3070 wasn't that much better. Once the mining boom ended, both the 3070 and the 6800XT dropped to around $600. Guess which card sold better.
>What's egregious is that the 4060 is actually 5% slower than the 3060.
I don't know what you're talking about. 4060 gets at least 5-10 fps boost in most games. Maybe you're thinking of 3060 ti vs 4060 ti or something.
>both brands have stellar hardware this gen
>it's them trying to sell weaker cards as a product two tiers up
Yep. And prices would've normalized by now... if it wasn't for the AI boom. Nvidia doesn't even need to sell their gaming cards to make money anymore, there are 4080s rotting on shelves since launch. Eventually someone is going to buy them, why would they care.
From 2017-2018 to 2023-2024, we went through two mining booms and an AI boom.
depends on a price bracket and use case scenarios. certain games benefit MASSIVELY from -x3d cache thingy, certain games run marginally better on intels, all depends on what tier you're aiming for and what kind of hardware around would you want. Another bonus is that AMD tends to stick to their sockets, so if you'll buy 7xxx cpu on a decent mobo, you're probably good for next couple generations. I have same mobo starting from first gen ryzen up to this day (5xxx), and I can definitely say that it was quite convenient this way.
>full amd
oh boy, you'll save a lot on GPU (though again, depends on a price bracket - for example, cheap 4070 costs same as slightly-more-powerful 7800xt), but at the same time you'll miss out on nvidia gimmicks and possibly lose some compatibility. Again, it's a random thing, but I remember more than one case through recent years where some game would flat out refuse to work on AMD for a while, it's all kinds of bullshit. But if you can cope with that, AMD GPU price/performance is to die for.
I don't give a frick my 5600H is good enough
how is one good for gaming while the other is good for work?
can someone explain it for the tech illiterate?
I guess an easy example would be like how Nvidia has a bunch of proprietary bullshit for compute that's kind of fricked work up. Personally I haven't really looked into if Intel has similar arrangements.
Cause the kind of software people use has been used for a quite a while and its all optimized for Intel and their current architecture. Even if AMD is technically more powerful and efficient - your blender software isn't making any use of it.
Look at emulators for example. AMD was a shit pick for anyone emulating games for quite some time cause AMD openGL performance used to be shit due to lack of optimization. But that got better over time and ever since most emulators started adding Vulkan - which heavily favors AMD - AMD has been kicking Intels ass in that field as well.
>OpenGL
>vulkan
Anon we are talking about CPUs not GPUs. The AYYYMDrones don't even know basic computing.
Vulkan on Nvidia is perfectly fine, the frick are you talking about
Fine doesn't mean equal. AMD is generally better there within the same price range of GPUs..
The point was about optimization.
I've never, ever owned a single AMD processor. I've been buying Intel since the Pentium 3 and it has never let me down. I just go with the original, not the cheap copy.
yea old people are moronic when it comes to CPUs thanks for the confirmation
ITT: just buy AMD it's just cheaper and... it's just cheaper, m-okay?
We can afford to get the best processor since we have jobs, you zoomers depend on daddy's money to upgrade your PC.
kek I've had both intel
>2700k, 4790k
and AMD
>athlon x4 and 5600x
it's just most boomers are still moronic from when Intel paid to win and you've been stuck in your ways since. Nothing to do with money, just brains.
>2700k, 4790k
Seems you are the one that's been stuck in the past, zoomer. There's a correlation between disposable income and IQ, that's why stupid people pick AMD.
Na I looked up benchmarks and didn't need to upgrade for a long time. Honestly could have still stuck with the 4790k but I wanted better end game performance in paradox shit. I mean you are still brainwashed from back when intel was paying so I'd refrain from talking about IQ man.
It would appear there is a correlation between brand loyalty and IQ as well
Maybe, that explains why dumb people is loyal to AMD.
how do you manage to call someone stupid while proving why you're stupid
>is
ESL. detected.
>not the cheap copy
Maybe you should check who came up with x86_64.
Maybe you should check who came up with x86.
>I really love buying Israeli computer hardware
Risky.
but AMD also has an office in israel
Office in country =/= country of origin
amd and intel are from the same country
What? They're both originated from the US.
AMD is best for everything now, makes zero sense to buy intel at all
ITT anon's hiow 2 computah knowledge stops in 2014. Always buy Intel do not question thanks goy.
https://www.neowin.net/news/eu-fines-intel-400-million-for-blocking-amds-market-access-through-payments-to-pc-makers/
Intel should have put those 400 million into the development of better CPUs instead of bribing hardware makers to block AMD
Never buying Intel products ever again.
>2002 - 2007
22 years of seething and counting.
Unfortunately, boomers are still influenced by it. Ever speak to them about PC hardware? always get vague shit like
>Intel is just more stable so I always run Intel
>I just go with the original, not the cheap copy
I'm talking about AMD fanboys seething for 30 something years. The CPU market landscape, production, and management is wholly different now. There's no reason to stick to a single brand for the sake of loyalty. Back in 2002 I'd fricking buy a duron over any crappy celeron or expensive pentium trash, but now intel is the superior budget option with lower memory latency.
what cope is this
the turbo poorgay market is super fricking tiny compared to midrange and amd is easily winning there
imagine memory latency being a fricking marketing point for you when that hardware is melting itself to death trying to compete
No one is really seething though? it is good to bring up that paying because again, as mentioned, boomers are still influenced by it to this day and are loyal to Intel because of it. Me personally, I just buy whatever has the best performance for the cost. I've always done this, it's why I've had hardware from Intel, AMD and Nvidia over the years.
Boomers are only loyal to intel simply because it's the most visible brand. Intel had the superior production capacity and higher market penetration. You could find intel in every prebuilt PC. AMD was cheaper but rarer simply because they couldn't produce CPUs as fast as intel. For those boomers it seemed like popularity = quality.
>budget
Wrong unless you mean super budget. You can get a 5600x - which is more than enough for pretty much everything - for less than 100 bucks. Intel only has something for the super poorgays who can't afford to spend more than 50 bucks on a CPU but those people are not looking for upgrades - they are looking for something that just barely works - and definitely can't run modern more demanding titles. The bare minimum if you will.
>5600x
I can confirm, I got a 5600 OCd to x level, and it's a little beast of a CPU, runs all the modern slop with less than 60% usage
>5600x
>for less than 100 bucks
Where? In the used market? Did you mean those $120 warranty-less ryzen 5600 shipped from china for OEM PCs? I'd rather spend a bit more for a one year official warranty.
Checked. Apparently prices went up recently. Last time i checked it was new on Amazon for like 95 EUR - Now it's 135 EUR.
>5600X
12400f is better and cheaper.
>OCing 5600 to 5600X levels
I wouldn't do that. 5600 is a lower binned 5600X. Enjoy reliability issues in the future I guess.
is better and cheaper.
Nah. The 12400f is 140 bucks here at best. More expensive. The board is also more expensive than a cheap generic AM4 for the 5600x
H610M is cheaper than AM4 and has PCIE 5.0 ports for GPUs. 12100f and 12400f are pretty much fine on those. 5600X requires moderately strong VRM like all X CPUs so a cheap AM4 won't cut it in the long run.
>buying H610 board
Oof.
>PCI-E 5.0
They don't have that on H610 and there is not a single GPU that supports it anyway.
>5600X requires moderately strong VRM like all X CPUs so a cheap AM4 won't cut it in the long run.
Oh no ~65W is just too much.
You are clueless.
>there is not a single GPU that supports it anyway
PCIe 5 SSDs exist, though.
It depends on the games yes, but both are 3-5% from each other in certain titles. https://www.youtube.com/watch?v=v5N8SzBSzsk&t=440
I'd choose AM4 over intel shit any day of the week though. If i wanted an upgrade nowadays I could buy a 5800x3D which doesn't pull 900w and has cuck cores.
>If i wanted an upgrade nowadays I could buy a 5800x3D which doesn't pull 900w and has cuck cores
It's shit for video editing and other non-gaming tasks. 13600K is the best CPU in that price range. It does pull more watts but the difference is minuscule next to how much your GPU is pulling, unless you somehow keep the CPU running at peak power. Also lower idle wattage.
Me? I'd rather stick to budget CPUs until the next generation intel and AM6 come out. 12400f and ryzen 5600 are still getting amazing fps on AAA games, even 12100f is still great. They'll be good for the next 5 years or so.
is better
Not really, they are about the same.
>and cheaper
It depends on your region, I guess. These are the prices where I live, both use DDR4:
12400F + Cheapest H610 MOBO = €228.86
5600 + Cheapest A520 MOBO = €190.96
>they are about the same
It depends on the game. These benchmarks only test like a dozen games, that's a pretty small sample size. I believe 12400f would perform better generally due to the lower memory latency. Starfield runs straight up like trash on AM4 systems.
>5600 + Cheapest A520 MOBO = €190.96
That mobo doesn't support PCIE 4.0+ GPUs and needs BIOS update to run ryzen 5000 CPUs. I'd rather pay slightly more for convenience. B550Ms are cheap nowadays, just use those.
>It depends on the game.
Yes it does. That's an average, which means in some games the 5600 wins and in others the 12400F wins. On average, the performance is about the same. I can also cherrypick games, you know.
>That mobo doesn't support PCIE 4.0+ GPUs
???
PCIe is backwards compatible. A520 only supports PCIe 3.0, that's true, but it doesn't matter unless you're using a 6500XT which is limited to four lanes. But no one should be using this GPU for gaming anyway.
If you really need PCIe 4.0 for some reason (and this reason is definitely not gaming, even PCIe 2.0 is enough most of the time), with the cheapest B550 MOBO the total price goes up to €211.41. Still cheaper than Intel.
>needs BIOS update to run ryzen 5000 CPUs
No it doesn't. X570, B550 and A520 were released with Zen 3. They support Zen 3 out of the box, even if you end up with a motherboard produced four years ago and never updated since then... somehow.
>That's an average
Yeah out of only a dozen or so games.
>I can also cherrypick games, you know
I picked one where the performance is straight up bad, not just less good.
>PCIe is backwards compatible
Budget cards with tiny bus like 6500XT run really bad on anything less than PCIE 4.0.
>with the cheapest B550 MOBO the total price goes up to €211.41
That's my point. Intel CPU + H610M are still cheaper in my region though.
>No it doesn't. X570, B550 and A520 were released with Zen 3. They support Zen 3 out of the box
The early BIOS versions don't support Ryzen 5000. A520M is an old board so you're likely getting an outdated BIOS.
>Enjoy reliability issues
lmao what "issues"? once you stress test everything and make sure it works, it works. haven't had any issues with my undervolt+small overclock. silicon degradation is a meme, sandy bridge CPUs are still chugging away at their 4.5ghz even today. and if it happens in a decade? just feed it 0.05v and the issue's solved
Those are valid points though.
It's not. Stable how? they can never expand and neither can you. Copy how? who was pasting cores together first to get multi cores because they couldn't make it on a single die like someone else? who made x64?
>the most visible brand
>You could find intel in every prebuilt PC
yes because Intel paid, keep up.
> I just buy whatever has the best performance for the cost
based thinking consumer
R7 5800x3D if you're on ddr4 am4 mobo and gpu under 4080/7900xt.
R7 7800x3D if you're on a 4080, 7900xtx or 4090, ddr5, am5, etc.
R5 3600 if you are severely constrained for cash and on some ancient quad core and DDR3... a 3600 with a B450 mobo and 16GB DDR4-3200 will max out a 6650 XT or a 3060 12GB.
While the combo is under $200 new.
Of course the 5600 and 5600X are great Vermeer same as the 5800X3D, but with only 32MB of cache and only 6 cores.
7500f and 7600x are good ddr5 choices.
That's it.
Do not get memed buying some quad core, or buying anything with E cores for vidya. 12400f is the only intel cpu that's somewhat ok if you get it cheap with a cheap mobo, but that's a rare bundle option on sale.
what kind of work needs a high end cpu?
can't you just use a gpu instead?
AMD for CPU, Nvidia GPU. Intel is garbage now.
>AMD CPU
>Uh muh PC just random powers off for some reason idk what it is
>Intel CPU
>just works
Anyone else noticed this pattern?
Nope.
Did you put an AMD CPU on an Intel mainboard?
Check your PSU, like a power delivery issue
Performance is about the same. You'll never be CPU-bottlenecked in a AAA videogame anyway. In lighter eSport games where CPU performance actually matters, any modern processor can pump out hundreds of frames per second.
AMD is cheaper, more efficient, AM5 offers better upgradability. If you're still using a Zen 1 or Zen 2 CPU, you can upgrade to a much better CPU (5600 or 5800X3D) for relatively cheap.
Intel isn't even THAT far behind. But they don't offer anything unique (unlike let's say Nvidia: their hardware is worse per dollar, but they have DLSS), so there's no reason to buy their products. Why would you ever buy a worse CPU when you can get a better one, even if only slightly better.
But as I said, any modern CPU is more than enough for current videogames. You can get whatever you want, both are fine, it doesn't matter.
intel currently offers superior price / performance in real world scenarios (1440p+, playing modern games)
if in the rare instance you're no longer GPU limited (4090, play less demanding games, low resolutions) then an AMD 3d cache CPU can squeeze out some extra frames, but please don't buy one of those otherwise
>(1440p+, playing modern games)
literally fricking any recent cpu made in the past few years can manage when you're gpu bound
why the frick would you pick intel when it's just drawing more power for no gain whatsoever
It's not like the Pentium/Celeron vs Athlon/Duron days anymore. Duron used to perform almost as good as pentium 3 at less than half the price, but now the difference between 2 brands is within a hair breadth. AMD aren't making ultra budget chips anymore like they used to, and even the cheapest 12100f runs any AAA title at 100+ fps. I don't understand why CPU brand wars are still a thing.
Because
>muh tribalism
morons need to validate their purchase somehow.
The performance and use cases have shifted severely, now it's either you have a high-cache CPU or you don't. You have something eating 200-250W peaks (x7000k) or you don't. Z and X series mobos now start at $200-300 and you get very few features outside of expansion and maybe some additional I/O.
And no, the 12100f is not 'good enough' unless you are making a very budget PC that will in fact play a lot of titles at 1080p with FSR/DLSS with very bouncy FPS and some really annoying stutters. A LOT of modern games eat CPUs for breakfast.
t. Have built way too many machines and sell 'boutique' gamer PCs with a bumped up price just because I put in an LED strip and 2 RGB fans in there with a case that costs more than the CPU.
>AMD GPUs run laps over Nvidia when it comes to price/performance ratio
>LMAO poorgay, where's your 4090???
>AMD CPUs obliterate Intel's for gaming purposes, being stronger, less power hungry, colder AND cheaper
>bro, buy middle-of-the-road Intel, you never know when you'll want to start rendering stuff
At least people stopped posting Userbenchmark reviews
For gaming? Definitely AMD.
>X3D exists
>No useless E cores
>AVX512 for emulation
>APUs are neat at the low end
>Uses half the power
Intel is better for emulation though.
Not if you ever want to play PS3 games.
Nobody cares how much faster it is at something that already runs at full speed like Dolphin.
>you HAVE to pick a side goyim
Frick off moron. I buy whatever is good at the time i need it. Right now that means the 7800x3d for gaming until something better comes along.
Intel, unless you want Proton chinese spyware and for the chip sandwich to cook itself.
it just works
how did they do it?
AMD could also push power consumption to 400W and improve performance by 5%. But they didn't because that would be pretty stupid. Raptor Lake is actually pretty efficient, just not as efficient as Zen 4. Intel can't compete at lower TDPs, so all they can do is pump more power into the CPU. No one is ever going to buy a slower CPU, but maybe there are some customers out there who don't care about power consumption.
Radeon did the same when they had to compete with Nvidia's Pascal. If the RX 480 and GTX 1060 were priced the same, used the same amount of power, but the 1060 was 10% faster... no one was ever going to buy the 480. That's why they pushed the 580 (rebranded 480) to 200W, so that it ended up beating the 1060 (120W) by 5%.
And Nvidia did it when they had to compete with the much more efficient RDNA2, the 3080 was infamous for tripping power supplies.
Basically, whoever ends up with a worse architecture has to drive power consumption to the moon if they want to sell. Right now it's Intel, in a few years who knows.
Last gen for GPUs was crazy all around. AMD's own product documentation recommended a 1000w PSU for the 6900/50XT because of transient spikes.
Case in point, the 6900XT and 6950XT weren't even supposed to exist. They saw they could compete with the 3090 and they went for it, they clocked Navi21 as high as they possibly could. Completely destroying the great efficiency of RDNA2 for single digit % performance improvements.
Then they came up with the 6950XT, which was even dumber... and Nvidia responded with the 450+W 3090Ti lmao. Right now it's the other way around. Nvidia can be more conservative with clock speeds because AMD's RDNA3 is worse than Lovelace.
My point was, no architecture is inherently a power hog. You could limit the 14900K to a much more reasonable 100W TDP and it would still perform great... just not as good as Zen 4. And no one would buy it.
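You can see why with napkin math. Toy model only, every number here is illustrative and not measured from any real chip: dynamic power scales roughly with f·V², and near the top of the V/f curve voltage has to climb about linearly with frequency, so power grows like f³ while performance only grows like f:

```python
# Toy model of the frequency/power tradeoff described above. Dynamic
# power ~ C * f * V^2; assuming V rises proportionally with f near the
# top of the curve, power goes like f^3 while performance goes like f.
# All numbers are illustrative, not measurements of any real CPU.

def rel_power(f_ghz: float, base_f_ghz: float = 4.0) -> float:
    """Power relative to base_f_ghz, assuming V proportional to f."""
    return (f_ghz / base_f_ghz) ** 3

for f in (4.0, 5.0, 6.0):
    print(f"{f:.1f} GHz: {f / 4.0:.2f}x perf for {rel_power(f):.2f}x power")
```

So that last 25% of clock speed costs roughly double the power, which is exactly the "400W for 5%" situation the posts above describe.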
RDNA3 is great, but the product naming fricks up the entire stack and it leaves a very bad impression. AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
>AMD shouldn't have gone with chiplets without letting them mature a little first, as well.
pretty sure chiplets cost them less than what a monolithic die would
this amd generation was focused on increasing profit margins by lowering production costs, not pure power
Yet RDNA4 will be a monolithic die and the halo card will be a $500 4080-class part.
how the frick does the 7800X3D consume so little
stock ryzen chips are already heavily overclocked to win in benchmarks. If you power-limit a 7700x you keep almost the same frequencies at a fraction of the wattage.
They can't do that with the 7800x3D. The 3D cache traps heat and the TSVs can't take voltage as high. So by necessity they HAVE to use lower power limits.
Low clock speed and huge cache. Most games benefit from cache more than clock speed.
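A back-of-envelope sketch of that tradeoff. All the latency and hit-rate numbers below are made up for illustration, not measured figures:

```python
# Toy model: the average memory latency a game sees depends far more on
# L3 hit rate than on core clocks. Numbers are illustrative guesses.
def avg_mem_latency_ns(l3_hit_rate, l3_ns=10.0, dram_ns=80.0):
    """Expected latency of one memory access, in nanoseconds."""
    return l3_hit_rate * l3_ns + (1.0 - l3_hit_rate) * dram_ns

# Suppose tripling the L3 (roughly what 3D V-Cache does) lifts the hit
# rate from 60% to 80% in a cache-hungry game:
small_cache = avg_mem_latency_ns(0.60)  # 38.0 ns
big_cache = avg_mem_latency_ns(0.80)    # ~24 ns
```

A roughly one-third drop in average memory latency is a win no realistic clock bump can match, which is why the lower-clocked X3D parts still lead in games.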
like this anon said
you cant overclock an x3D CPU without melting it
Only broke morons get AMD. Intel has shit like QuickSync and a heterogeneous architecture, which lets work run on the right core before getting passed off to another.
If any of you gays want to learn feel free to
https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/ia-introduction-basics-paper.pdf
As a disclaimer my first AMD cpu was the shitty Tri-core bulldozer APU; like the 2nd one they ever made. It was absolute dogshit. So it was a shitty first impression.
>Only broke morons get AMD
Is this nekker stuck in 2012?
You do know AMD is more expensive than Intel now, right?
Intcels are so fricking funny.
Anyone who gives two fricks about 'Wattage' is a moron looking to nitpick.
Like just lower your PL1 and PL2 values dumbass.
That cripples Intel CPUs hard while Ryzens fly at modest power limits.
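For reference, on Linux those PL1/PL2 values are exposed through the intel-rapl powercap sysfs interface. A minimal sketch, assuming the standard package-0 path (constraint_0 is conventionally PL1, in microwatts); writing it needs root, and this is not a tuning guide:

```python
# Sketch: capping PL1 (the sustained power limit) via Linux powercap.
PL1_PATH = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

def watts_to_uw(watts: float) -> int:
    """RAPL sysfs files take integer microwatts."""
    return int(watts * 1_000_000)

def set_pl1(watts: float, path: str = PL1_PATH) -> None:
    """Write a new PL1 value. Requires root on real hardware."""
    with open(path, "w") as f:
        f.write(str(watts_to_uw(watts)))

# Hardware-dependent, so left commented out:
# set_pl1(125)   # cap the package at 125 W sustained
```

On Windows the same knobs live in XTU or the BIOS rather than a file interface.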
I know I was making a joke; I like you anon.
I work at Intel actually. There's a lot of work done to make sure Windows doesn't frick over all the work done at the silicon level. Shit is assssss
I have a Ryzen 7 7800X3D and it absolutely dumpsters anything I throw at it.
I'm about to build my first computer and the build is centered around using that chip. I am excited but scared to frick it up.
You'll be fine. Just follow basic instructions, don't get ahead of yourself, and be sure to take your time. Stop if you feel overwhelmed or unsure and take a break.
>IntelME
both are fine. doesn't matter now.
Intel Software built into the chip is godly if you do literally anything else than gaming.
Too bad you can't test that because you need a different mother board
intel and nvidia. single core performance is still king for a lot of things. intel/nvidia just werks with everything and gets support faster for everything when its new. you pay a tax for it sure but not being one of those goons whining on message boards about drivers is worth it
AMD's winning at single-core now. It's over for Intel. Pack it up, Ramdeep. This is an /amd/ board, now.
The performance is amazing because of the E and P cores and the scheduler knowing what load threshold should bump a thread onto a P core.
It's wizardry to make sure Windows Search doesn't have a fricking autistic rampage which leads to it shitting the bed on a P core while you game.
Also everyone reeeing about power usage.
https://www.msi.com/Handheld/Claw-A1MX
Intel is finally trying to get into the handheld market. Ticktok AMDgays
i'm on win 10 and read it doesn't know how to use the p and e cores correctly because of the scheduler or something, but i haven't noticed any issues so far. its fast though, 13700k
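If the scheduler worries you, you can pin a game to the P-cores yourself. A sketch of building the affinity mask, assuming (as on a 13700k) that the hyperthreaded P-cores map to logical CPUs 0-15 — check your own core layout before copying this:

```python
# Sketch: build a CPU affinity bitmask covering the assumed P-cores.
def affinity_mask(cpus) -> int:
    """One bit per logical CPU index."""
    mask = 0
    for c in cpus:
        mask |= 1 << c
    return mask

p_cores = range(16)                              # logical CPUs 0-15 (assumption)
mask_hex = format(affinity_mask(p_cores), "x")   # "ffff"

# Usable with the standard OS tools:
#   Windows:  start /affinity ffff game.exe
#   Linux:    taskset -c 0-15 ./game
```

This forces every thread of the game onto the P-cores regardless of what the Win10 scheduler thinks.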
>XeSS looks like desaturated shit
>Intel iGPU can't compete with AMD's APU
>need to run Win11 or the P+E core configuration goes full moron and your performance suffers
Pajeeeeeeeet.
>XeSS looks like desaturated shit
Tell me what FSR looks like.
>installing windows 10 in 2020+4
Might as well go with trannux.
>what's FSR look like
Don't know. AMD's rasterization is powerful enough to not need upscaling tech 🙂
>using Windows on a handheld
Need to dedicate more of that nvme to bloat my homie?
>AMD's rasterization is powerful enough
This is iGPU we're talking about. Even the top end ryzen iGPU is a lot weaker than gtx 1650.
>steam OS
Steam deck can't even run 2020 games well at 720p. I'd rather game on a netbook than ick on eck. Both are only useful for indie games, but at least you're getting full keyboard and a bigger screen at a lower price.
>This is iGPU we're talking about.
Intel iGPUs have been dogshit for years, is meteor lake actually any good?
nta but i managed to get ai to run with the igpu and it wasn't as dogshit slow as i figured it'd be. slower than a dedicated card for sure, but not bad. for a normie it'd be enough for faceberg and youtube. i hope intel does something with that arc platform and gives us another option
They might just say frick it and place the ARC GPU shit into the CPU. So instead of iGPU it's dedicated graphics ala APU.
In its current state intel's igpu peaked like their 14nm+++++++++
i think the igpu stuff is impressive for what it is on a chip, but keeping arc on the chip would seal its fate as a nothingburger. it's too little space to have enough power and architecture and would never compete with ati/nvidia
They're gonna try some moronic AI shit first to squeeze out performance which may help but yeah they need to figure out a new hardware solution.
i keep up with ai shit and making cards specifically for it at this point is dumb. transformers for example has to go, it'll eventually be replaced. so anything they make hardware around now will be gone in a few years. there is a LOT of optimization to be done yet with everything ai, trying to generalize cards for it at this point is a mistake imo
they should be focusing on getting cards out and entering the market
NOPE gonna have to wait for at least arrow or panther lake.
best case they can shrink the die from the Arc GPUs
AMD: the inexpensive cpu for when i'm building for someone else
Intel: the cpu i buy for myself so i dont run into weird bugs
similar if it were nvidia vs amd. i kinda want the product to be good when i buy it. i dont want to wait a year for "fine wine" and unexplainable bugs
How's the weather in Mumbai?
are indians buying higher quality products now or something?
whats the meme?
they arent buying heavily discounted ryzen cpus and making excuses for why they dont need ray tracing?
Paid. Pajeet. Marketing. Shill.
>meme tracing
Oh look here comes the nshidia shill
you dont need ray tracing
you couldnt afford it if you did.
kek it's like a Scooby Doo scene where they pull off the Intel shill's mask and it's an Nvidia fanboy underneath
>the pajeets at work defend intel like their lives depend on it
>meanwhile everyone with an iq above room temperature uses AMD
Really makes you think...
5800X3D and the 7800X3D are the most efficient CPUs ever.
You would care about power "consumption" if you were in the EU.
>3D cache
Good for you. Enjoy cache latency though.
The cache is on the chip. Huh?
Sour grapes ass homie.
you would care about power consumption if you're not an irresponsible child or in charge of anything where profit is concerned
Man the shitel employees are really trying and failing to damage control in here
Man I really don't want to but I guess I should move over to AM5. I was planning on just upgrading to the 5800X3D but I might as well just do a rebuild. I already don't like my 3800X (which I got a nice mobo deal with when the 3900X was out of stock forever). At least all I need to do besides buy the necessary upgrades is contact Noctua to get an AM5 mounting bracket.
Not worth it, DDR5+7800X3D vs DDR4+5800X3D is a fairly minor difference. Save the money from the mobo+RAM for the next gen or the one after.
t. Have both machines in the office
>DDR5+7800X3D vs DDR4+5800X3D is a fairly minor difference
I have to disagree. You can easily upgrade AM5 CPU several years later if you have a decent motherboard. AM4 is good if you want to save some money right now or even skip AM5 era.
Why would he spend $500 more instead of just getting a 5800X3D and some more RAM, homie? The CPU is still a top performer, you could get almost half a current GPU for that money.
in the future, don't plan to ever upgrade. spend the money up front and get extra ram, processing power that you think you won't need. i got 9 years out of my last build and it cost $1200 for the tower at the time, before prices went nuts
If you dont immediately know what this file is associated with, you have no knowledge of what CPUs are or how they function.
>bought 7900X because it was on a deep discount with a mobo
>turns out 7800X3D is better for games
oh well, still a big step up from my i5 2500k, I'll miss the little guy
If you loved the 2500k that much why not get its 10-year glow-up, the 12600k?
You some sort of brokie?
https://ark.intel.com/content/www/us/en/ark/products/134589/intel-core-i5-12600k-processor-20m-cache-up-to-4-90-ghz.html
Why do the Intel shills in this thread have painfully awkward English?
Do you even know what this program is?
No, Ramdeep, I do not. How does that change the fact that everyone in this thread shilling for Intel has awkward English as if it's a second language for them?
Hell, even this unrelated counter argument seems like something a pajeet would do.
Rangebanning every third world country would solve 90% of the problems on Ganker
you're a fricking moron homeboy
Riveting comeback, Ranpoo. It doesn't make my observation any less poignant or correct.
All of Ganker would immediately improve if non-US IPs were immediately banned. I'd say the UK, France, Canada, and Australia can come back, but only the rural IP addresses. Urban areas might as well be considered third world at this point.
3d cache thingy is to die for, 12 cores is only good for i dunno, ffmpeg or something, it's more of a workstation CPU. then again, i doubt you're getting any framerate issues in vidya on 7900x
i just got i7-12700F. it was the only thing they had in the store atm. is it good? do i need a special cooler for it?
Just buy an AIO like a white man and be done with it.
i already had the whole pc, old cpu got fried. otherwise i'd probably just get a macbook. i'm sick of computers and technology in general, just want good drivers for audio and aggregate devices
i just put the cooler that came with it, but also have old ass nepton 240m somewhere
perfect, thanks guys!
Never buy AiO. Either splurge all the way for a custom waterloop or go air
Why not?
You're spending a lot more for something that cools just as much as an air cooler
Thats not true though
moron
>quiet
>good temps all year long
There is no reason for custom loop unless you are going way out of standard hardware specs (OCing) or because you want to have a "cool" aquarium PC. AIO is simple, easy to install, and just works.
what cooler do you have now?
You can get away with something from Thermalright or Noctua if it didn't come with an OEM cooler. The 12700k is a good CPU, even with the cuckcore nonsense. You did well.
It's good. It's not a K CPU, you don't need a water cooler for it, just get a decent air cooler like deepcool AK500 or AK620.
K just means it's unlocked. They didn't give it extra power output or anything.
>They didn't give it extra power output or anything.
Actually they did. K CPUs draw more power but could be limited.
I have a 12700k and I undervolted it by -0.075v
and just use a 120mm air cooler (ID Cooling 226 XT)
The temps and power consumption while gaming are really good
Intel still prices their CPUs like they haven't fallen off hard in the last ten years.
As somebody who always went with Intel, this gen is making me regret my 12700k and my next upgrade will most likely be a 7800x3d or its equivalent
12700k was fine when it came out, certainly a lot better than previous intel chips. Out of all the times to buy intel, that was probably the best since the original ryzen.
I was full on frustrated with AMD back in the Phenom and Bulldozer days. They had gone from being a respectable brand to irredeemable garbage. Ryzen being this good was like being blindsided.
what are you trying to do that a 12700k seems bad for
I definitely wouldn't call it bad of course, especially considering I went from a 3930k to that, not to mention it also pairs decently enough with my 4090 when playing at 4k
Yet, I still can't help but wish I had more CPU power for the games where I'm CPU bound, or for the heaviest rpcs3 games like inFamous
Why not?
i built my last build around emulation and it was right before rpcs3 started to become usable and actually made use of multiple cores over something like pcsx2. it was an i5 4060k i think? i managed to play through ac4/4a on xenia using vulkan but so much of the graphics were fricked, i had to make cheats because i couldn't see that i was being railed by a giant laser. i have a 13700k now but haven't tried rpcs3 yet, but i'm hoping its good
You can run RPCS3 with mobile Ryzens now. Of course a 13700k will be enough.
i remember thinking 'oh frick' when reading about how rpcs3's approach was to emulate each core to a core on your comp, which i didn't have enough for. i knew my new build was already kinda fricked. but it gave me so much great emulation, i have no complaints
Why would you upgrade your CPU for extra 10 fps?
The 7800x3d price to performance is unbeatable.
Whichever one you find the cheapest, but if both are around the same price, then it's the X3D series.
What Intel mostly has going for it is that you get to keep ddr4 memory if you pick a ddr4 based mobo, great for those who bought like 64gb on their last system, so you don't shell out $400 on ddr5
Imagine playing at such a low resolution that this is a decision.
7800X3D is the best gaming CPU you can get, but I'm waiting for Zen 5
new paradigm when?
Brand loyalty is no longer a thing and there's no need to fanboy over a chip maker. Just get the best one in your price range at the time you want to buy a cpu.
Any mobo recommendations for a 7800X3D?
Pretty much any b650 is enough. It all depends on what features you need.
B650 Aorus Elite AX
It draws like 70 watts, literally anything is fine
i'd go for some flagship x670 and then pray it'll get updates until ryzen goes into 10xxx models
I have a 7600X and I got an Asrock B650 PG Lightning, from my research it ended up being the best mobo for its price range. Also has a nice M.2 5.0 slot.
Did I frick up buying a 7600 instead of 7800X3D? I figured I'll be GPU bound at 4K in most games anyway. GPU is 6950 XT.
should be more than enough for now, but yeah, you did lose some performance. i don't think i'd care all that much, unless you're playing this one or two very specific games that improve massively with 3d cache and are demanding enough to cause issues on modern high-end hardware, and the only one I can think of is VRChat
It didn't seem worth twice the price for slightly better performance. The plan was to just get on AM5 as cheap as possible and then upgrade to a better one at end of life like you could with 5800X3d on AM4. Also 7600 seemed to have the same performance as 7800X3D in some older games
Worst case scenario you're losing 20 or so FPS with a 7600 over a 7800x3d. I doubt 20 fps is worth 250 bucks of price difference, especially when 250 more bucks on a GPU would give you a meatier FPS boost.
Plus, you can always just buy a 7800x3d in some years when its price is low, since you have an Am5 setup now (provided you werent moronic and didnt buy a trash mobo).
>Want to upgrade my PC
>Microcenter is opening up in my city soon
>However the date is between March-June
Decisions, Decisions
7800x3d. The additional cache makes a difference in 1% lows. I have a 5800x3d, I will not buy anything without a large cache. It's junk, garbage, waste of silicon. Intel, junk. AMD non 3d CPUs, junk.
Intel is far superior.
>5800X
>4080
Am I bottlenecking CS2 right now bros?
>tranimeposter
>buys 4080 to play CS
Checks out
Black person it's quite simple.
Play the game with some kind of overlay RTSS, nV overlay whatever then check if your GPU is near 100% utilization.
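That check can also be scripted. A sketch applying the same heuristic to `nvidia-smi` query output — the CSV query flags are real, but the sample string below stands in for a live query and the 95% threshold is an arbitrary choice:

```python
# Sketch: decide GPU-bound vs CPU-bound from GPU utilization while gaming.
# Live query would be:  nvidia-smi --query-gpu=utilization.gpu --format=csv
def parse_gpu_util(csv_text: str) -> int:
    """Pull the utilization percentage out of nvidia-smi CSV output."""
    last_line = csv_text.strip().splitlines()[-1]   # e.g. "98 %"
    return int(last_line.split()[0])

def is_gpu_bound(util_pct: int, threshold: int = 95) -> bool:
    """Near-100% GPU utilization means the GPU, not the CPU, is the limit."""
    return util_pct >= threshold

sample = "utilization.gpu [%]\n98 %"   # stand-in for live output
```

If utilization sits well below the threshold while framerate is low, the bottleneck is upstream of the GPU — usually the CPU.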
Put it this way, Ganker shits on everything Intel & Nvidia and shills AMD but whenever a new game comes out the people b***hing about crashes are always AMD users.
More lies in one post have never been told.
>the people b***hing about crashes are always AMD users.
thats the funny part. ati always sucked. nvidia was good before intel bought them and is still good. amd just can't keep up in processors, which are hardly relevant now anyways because everything is so fast that its the gpu that matters
intel-nvidia, can't go wrong even if you pay the tax
What the frick is this stupid word salad?
when your name is rasheed, probably year 2
brandmonkey cognition in action
>. nvidia was good before intel bought them and is still good. amd just can't keep up in processors, which are hardly relevant now anyways because everything is so fast that its the gpu that matters
what
Intel is embarrassingly shitty nowadays so AMD by default.
https://www.radgametools.com/oodleintel.htm
You should never buy AMD just for the mere fact that Ganker is shilling it. These trolls always want to bait people into bad purchases.
You should never trust Ganker with anything, let alone your money.
Is this price fricked up?
Amd Ryzen 5 5600G 4.4ghz
16GB Ram 3200mhz DDR4(2x8)
SSD 480GB
600w atx 20+4
600 usd in 1 payment or 800 in 3... I swear ive seen this pc for 300 in amazon or some shit
>5600G
Don't. Get a proper GPU, it's a shit deal.
Ive seen it get 60 fps on cs2 and silly games which i think its enough until i can get a decent gpu to pair it with
Will do sir, thank you sir
i was a decade out of date from my last build and i just monitored the pc threads for recommendations, eventually asked a few questions and i'm very happy with my build. unlike Ganker which is always mean you can ask dumb questions and get a non-meme response
For $800 (assuming it's USD) you can get a much, much, much better computer. Like, a 5600 + 6700XT.
A 5600G with no graphics card for $600 is a scam.
if not bait, check the Ganker build a pc thread and post your country and budget, you'll get a much better machine
...no GPU?
That is not a good deal. You are not far off with your $300 estimation.
>want to upgrade GPU
>realize that I'd also have to upgrade CPU and RAM and MB and PSU
>give up and keep on playing TF2 and minecraft without shaders
you guys ever had an OC instability so hard that it does an impossible game mechanic instead of crashing?
Depends on what you mean. In Dead Space if you run it at 200 fps you can break the engine and do wild stuff. Speedrunners use it to their advantage. REmake 2 also has weird things like a knife attack doing multiple hits instead of one on high fps.
A few more:
In Monster Hunter World, some weapons (bowguns with piercing bullets, for example) hit more often than originally intended at higher framerates. A normal user is never going to notice, but this breaks speedruns. There's a mod to fix this.
In MH Rise, monsters used to track their moves better at higher framerates. On Switch (or on PC at 30 FPS), you could dodge some moves by just walking (this was intended behaviour). On PC they would snipe the shit out of you. This was fixed by Capcom a few months after the PC release.
In RE4 HD (the original, not the remake) enemies throw items (like grenades) twice as often at 60 FPS compared to 30. Also, you have half the time to beat QTEs.
In RE1 HD Remake, zombies track the player better at higher framerates. It's easier to run around them and not get grabbed at 30 FPS compared to 60.
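All of these come down to game logic that runs once per rendered frame instead of once per unit of real time. A toy sketch (the numbers are made up) of the grab-check pattern:

```python
# Toy model: a check rolled every frame gives the enemy twice as many
# chances at 60 FPS as at 30 FPS over the same wall-clock time.
def grab_rolls(fps: int, seconds: float) -> int:
    """Number of per-frame grab checks in a window of real time."""
    return int(fps * seconds)

assert grab_rolls(60, 5) == 2 * grab_rolls(30, 5)

# The usual fix: scale any per-frame rate by delta-time so behaviour
# becomes framerate-independent.
def per_frame_chance(rate_per_second: float, fps: int) -> float:
    return rate_per_second / fps
```

Engines that forget the delta-time scaling are exactly the ones that produce the quirks listed above.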
So the PAL versions of the og RE trilogy are marginally easier?
No idea. RE1 HD Remake is a port of the GC release (you could choose between 50hz and 60hz mode on the PAL version. Everyone played at 60hz, unless your TV was ancient and didn't support it). It was probably a tiny bit easier if played at 50hz, I assume.
Personally, I tried both 30 and 60 FPS on the PC version, and I couldn't tell the difference in zombie behaviour. But it can mess with speedruns apparently.
It has nothing to do with framerate, but the first NA version of RE1 on PS1 is way harder than the JP version. Because it doesn't support autoaim for whatever reason. You press the aim button and the character doesn't automatically turn towards the enemy.
>It has nothing to do with framerate, but the first NA version of RE1 on PS1 is way harder than the JP version.
They all have weird quirks and small changes. I watched some dude beating RE2 without pressing forward once and he picked some version of the game because of something but I don't even remember what his reasoning was.
been using an amd ryzen 7 3700x since 2020 and have had 0 issues playing all games on high or ultra,
except total war 2 on ultra, that shit put my pc on fire.
i dont have anything to add about anon's new build but i just upgraded to a 13700k, 4070 ti super and an nvme drive, i had a normal hdd before and was still enjoying sims 3. amazing difference
>CPU significantly more powerful than GPU
Why?
For gaming, absolutely AMD. A 7600x has performance on par with intel models twice as expensive.
>for gaming
AMD smashes Intel at production as well.
Does it? I've never looked at comparisons in that setting since I just play games and use photoshop/Office. I doubt I'd get an Intel anyway since I'm in Europe and intel CPUs are ridiculously expensive, like 50% higher than their price in USA, while AMD CPUs are very affordable.
see
I don't like Intelaviv but I have DDR4 sticks a plenty so I'm probably going to get one when my 9th gen starts struggling, there's literally no games worth upgrading for.
I say this as an owner of an AMD CPU and GPU: Intel. I've had nothing but problems since switching from Intel + Nvidia and my next PC will be avoiding AMD at all costs.