I care about idle power consumption and AMD is moronic in this regard.
Also straight up worse single-core price/performance.
>Also straight up worse single-core price/performance.
What a terrible metric. The vast majority of applications you use, especially games, are multithreaded.
the cache makes up for it chud
>single-core performance
Zen4 has no issue with that
14 W vs 21 W idle is a pretty big difference. But hey, what's it like under loa- oh no.
Why is intelslop such an electricity hog?
Clock scaling is on a curve. There are usually a couple of inflection points, but at some point every extra 100 MHz starts requiring ever more power. Intel is pushing far outside the peak-efficiency range of their chips so they can eke out more frequency and win some benchmarks.
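To put rough numbers on that curve: dynamic power scales roughly with C·V²·f, and past the efficiency knee the voltage has to climb along with the frequency. A toy model in Python (the constants are illustrative, not measured from any real chip):

```python
# Toy model of CPU dynamic power: P ~ C * V^2 * f.
# Assumes voltage must rise with frequency past a 4.0 GHz "knee".
# Constants are illustrative, not measured silicon data.

def dynamic_power(freq_ghz, base_freq=4.0, base_power=65.0):
    """Estimate power relative to a 65 W baseline at 4.0 GHz."""
    volt_scale = 1.0 + 0.15 * max(0.0, freq_ghz - base_freq)  # V climbs with f
    return base_power * (freq_ghz / base_freq) * volt_scale ** 2

for f in (4.0, 5.0, 5.5, 6.0):
    print(f"{f:.1f} GHz -> ~{dynamic_power(f):.0f} W")
# 4.0 GHz -> ~65 W, 5.0 -> ~107 W, 5.5 -> ~134 W, 6.0 -> ~165 W:
# a 50% clock gain costs roughly 2.5x the power.
```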
Is it? My 13700K draws 70-120 W depending on the game.
Who cares about 30 W when the GPU draws 400 W?
>385 W
That better also cook my breakfast.
You also missed the part where the Intel chip finishes that Blender workload 5x faster than the X3D part.
Sorry, I'm not into competitive Blender speedrunning; I like to spend my money on hardware for applications I actually use.
And? If you're buying a 7800X3D for production you're a moron. Let's talk about vidya then, since we sometimes play games in between complaining about everything.
Almost triple the power so that's almost triple the performance right? Right?
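To be fair, power and speed multiply into energy-to-completion, so here's the arithmetic with the numbers thrown around above (385 W vs ~90 W, and the disputed "5x faster" claim taken at face value; none of these figures are verified):

```python
# Energy for a fixed workload = average power * time to finish.
# Inputs are this thread's claims, taken at face value: 385 W Intel
# finishing in 1 time unit vs ~90 W X3D taking 5x as long.

intel_energy = 385.0 * 1.0   # 385 watt-units
x3d_energy   = 90.0  * 5.0   # 450 watt-units

print(intel_energy, x3d_energy)
# If the 5x speedup were real, the hotter chip would use *less* total
# energy per render. If the real gap is closer to 2x, the X3D wins on
# both power draw and total energy used.
```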
>Gobbles far more watts than the mid-range GPUs most people actually buy, which eat at most 200 W at full load.
>Still worse than the X3D.
Why is intel so shit?
you are moronic if you buy a $400+ CPU for gaming.
a 5500 or 12100F is more than enough for 60 fps.
>60fps too low
only in AAA slop that needs a $500 GPU to hit 100 fps at 1080p on high settings; in everything else you'll be fine.
but spending $2000 on hardware isn't as stupid as not pirating $60 video games.
Depends on the games you play. FPS games, sure, go cheap. Play something like Factorio or larger strategy games and it pays off.
on a 5500 or 12100F you get 200 fps in Factorio.
You've never built a large factory.
Actually, the benchmarks used "updates per second", so I don't know how that translates to fps.
It's about as good as a 12600K, apparently.
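For anyone confused by UPS vs fps: Factorio locks its simulation to 60 updates per second in normal play, and benchmark mode uncaps it, so a benchmark UPS figure is really a headroom measurement (this is the commonly cited behavior; check the Factorio wiki before quoting it):

```python
# Factorio caps the simulation at 60 UPS in normal play; benchmarks
# uncap it. Benchmark UPS therefore measures how much bigger a factory
# can grow before the game itself starts running in slow motion.
# The linear-scaling assumption below is a rough simplification.

TARGET_UPS = 60.0

def factory_headroom(benchmark_ups: float) -> float:
    """Rough multiple of the benchmark map's load before dropping below 60 UPS."""
    return benchmark_ups / TARGET_UPS

print(factory_headroom(200.0))  # ~3.3x headroom on the test map
print(factory_headroom(55.0))   # <1.0 -> already below real-time
```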
>I care about idle power consumption and AMD is moronic in this regard
?
>the i9 11900K wastes as many watts at idle as an entry-level GPU eats at full load
K E K
E
K
That's whole system power. I think they have a 3090 in there.
holy frick you are moronic
Hello 2017, I speak to you from the future.
>idle power consumption
Be real homie
>AM5 will be supported for many more years
You mean like they supported AM4 for many more years?
AM4 existed from 2017 until 2022 with new chips launching at regular intervals.
Rot in poverty forever then, whiner.
>ackshually
You forgot to mention how they tried to break their word and abandon it almost immediately. The only reason they didn't was because people got mad at them. And even then they only technically supported it with end users having to jump through a few hoops to enable compatibility on later boards.
Yes?
I mean, even in 2023 they released the 5600X3D (although those are just salvaged 5800X3D dies); six years is a pretty good run on a single socket. Can you imagine Intel doing something like that? It would be housefire central, since they'd just brute-force their existing nodes by pumping more power into the chip and hoping it runs faster.
And even then, the fact that AM4 Ryzen 5s and 7s are still perfectly usable for gaming today, and that you can still buy motherboards for them, just goes to show how good that platform was.
If anything they should be the default choice for budget builds, because an AM4 CPU + mobo + DDR4 RAM combo is far cheaper than a comparable AM5 + DDR5 setup.
God I hate that smug little twerp
t. linus
All he does is whine and moan. Reminds me of movie critics.
Why is this CPU's heat spreader shaped like this?
Why wouldn't the heat spreader have that shape?
If Intel was smart their heat spreaders would also be shaped like that.
Jokes aside, believe me, it makes the chip so much easier to hold and put into the socket. That may not matter for an experienced builder, but it matters a lot for first-timers terrified of bending a pin and having the CPU/mobo self-destruct.
It's a sensible shape anyway: on a normal CPU there's no die under the corners of the heat spreader, so only the center heats up; here there's simply less metal on corners that weren't doing anything.
It's because of the SMDs sitting on the front now (you can see them in OP's thumbnail); the whole back is occupied by contact pads for the motherboard.
>moar cores!
Stupid take.
>use sleep mode
>ddr5 performance go brrrrrrrrrr
I went for the bottom-of-the-barrel 7600, so I'm kinda forced to upgrade on this platform. Maybe a 9800X3D or something?
>7800X3D
Only good in games that utilize the 3D Cache.
If you're doing anything else then it loses pretty hard.
Lots of people also do video rendering, Blender, etc., but to each their own.
>5 years of upgrades
I mean sure, if you're that poor. You're still limited to 6000 MHz RAM, while Intel can XMP at 7800 MHz and overclock to 8200-8400 MHz depending on the IMC and temperatures.
>bragging about RAM speeds, which are marginal at best even in "work" environments, let alone gaming
Intel shills are really scraping the bottom of the barrel.
>RAM speed doesn't matter
Spoken like a true AMD shill.
>Intel can overclock to 8200-8400 MHz
are you moronic?
How uninformed are you?
>Only good in games that utilize the 3D Cache.
So all of them? The 7800X3D is not impressive for daily usage, but for gaming it's simply the #1 processor, and with the 7950X3D you get the best of both worlds: one CCD for productivity and the cache CCD for gaming.
>Only good in games that utilize the 3D Cache.
>Lots of people also do video rendering, Blender etc., but to each their own
So basically good in all games, and only bad in irrelevant stuff for people trying to be pseudo-creators.
>rebranding the 12900K for the 2nd time
>RAM speed doesn't ma-ACK
so where does it say the ram speed?
Intel 8200 MHz
AMD 6400 MHz
>ram speed doesn't matter
>cpus with slower ram are faster
How dense are you?
The answer is literally in the graph you dummy.
>you need to overclock your RAM by 2000 motherfricking MHz
>to get literally just 18 frames more on average
Like people keep saying, RAM speed doesn't matter.
Save yourself the money you'd waste on intelslop and the electricity to feed it, and buy a 7xxx AMD card one tier up from what you were planning.
6000 MHz vs 8000 MHz RAM can be up to a 25% difference depending on the game. It's free performance.
>caring about 30 W more power consumption when the GPU draws 400 W
Lmao
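For scale, the raw bandwidth arithmetic (theoretical peak for dual-channel DDR5; real gains depend on latency and on whether the game is bandwidth-bound at all):

```python
# Theoretical peak bandwidth of dual-channel DDR5:
# transfers/s * 8 bytes per transfer per channel * 2 channels.

def ddr5_peak_gbs(mt_s: int, channels: int = 2) -> float:
    return mt_s * 8 * channels / 1000  # GB/s

for kit in (6000, 8000):
    print(f"DDR5-{kit}: {ddr5_peak_gbs(kit):.0f} GB/s")
# DDR5-6000: 96 GB/s, DDR5-8000: 128 GB/s -- a ~33% bandwidth bump
# that only turns into frames in games that are actually bandwidth-bound.
```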
>25% more performance in cherrypicked benchmarks
>for nearly triple the cost
just buy a better gpu
>for triple the cost
Where's stock Intel? And why is the overclocked 7950X slower? Something is wrong with this test.
This is a 13900K; it's $200 more by itself, 50% more. Not to mention whatever the ultra-fast RAM costs.
>Let me post cherry picked graphs that prove whatever point I want to make this time - the thread
All these high-power-draw electronics just mean fewer poors in my vidya lobbies.
lol, one of the few remaining non-corporate Intel shills really is working overtime
>8400 MHz unicorn RAM that passes Cinebench and nothing else
>this is fine
If people actually knew what unstable RAM does to a system, they'd also know that Intel chips capable of sustaining those speeds are shockingly rare.
But muh benchmark numbers.
That's nice but I'm going to stick with my 5900X.
I hate that guy's fricking face so much. His voice and mannerisms are even worse.
You can hate him, but at least he isn't shilling for corporations; he shits on them when they deserve it.
Switched to AM5, now I have boot times of over 2 minutes.
>dude muh memory training turn on memory context restore
Turned it on and got constant bluescreens; couldn't even get to the desktop.
>boot times
lol, I've only been using sleep mode for years
ddr5 is a scam in general until they fix their memory controllers
Honestly this, I can't even run my memory with the full EXPO profile without it becoming unstable.
>many years
so that's just a fancy name for 5 years?
like why even bother
just force people to buy another motherboard
seems fricking moronic
>just pay more, stop kvetching
no one but morons will buy another cpu within 3 years of building a computer
Hello again 2017
>he can't even come up with an excuse to buy another cpu
What are you doing, emulating PS3 games? A shit-tier CPU can run Linux and play old games just fine.
It's not ten years ago, we're getting 25% leaps every 2 years.
The CPU matters; games aren't dumping all the work on the GPU.
When you buy a system, you can get a 1700X and upgrade to a 5800X3D five years later without upgrading memory or changing mobos, and get better performance than current CPUs, for cheaper.
Why does this make you seethe, shill?
You're also forgetting we're entering the era of framegen and upscaling gimmicks, like it or not,
and that makes the CPU put in more work instead of idling, so a strong CPU is more beneficial today than ever before.
>25%
meaningless numbers that do nothing
Enjoy your 40 FPS, and consider buying a console. Maybe that's easier to understand.
not my problem that you consume literal marvel slop
>and get better performance than current CPUs
In a handful of games that benefit from the X3D parts.
All games use L3 cache. What are you talking about? X3D is king for a reason.
Even on Intel, 7000 MT/s is still unstable, and even on Intel, CAS latency (CL) matters more.
The 7800X3D is faster than the 7950X3D because it's a single CCD; it's 10+% faster despite lower clocks.
>X3D is the king for a reas-
ACK
ACK
And this.
Even Ganker's Intel shills don't dare link Framechasers' video.
And lastly, power.
It's not only about electricity cost: more power means more heat, which means beefier cooling, which means more noise, plus more heat pumping into your room.
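Rough numbers on what an extra 100 W actually means, assuming 4 hours of gaming a day at $0.30/kWh (made-up but plausible figures; plug in your own):

```python
# Yearly cost and room-heat load of an extra 100 W of CPU draw.
# Assumptions (not from the thread): 4 h/day of load, $0.30/kWh.

extra_watts   = 100
hours_per_day = 4
price_per_kwh = 0.30

kwh_per_year  = extra_watts / 1000 * hours_per_day * 365  # ~146 kWh
cost_per_year = kwh_per_year * price_per_kwh              # ~$44
btu_per_hour  = extra_watts * 3.412                       # ~341 BTU/h into the room

print(f"{kwh_per_year:.0f} kWh/yr, ${cost_per_year:.0f}/yr, {btu_per_hour:.0f} BTU/h")
```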
My 13700K uses 70-120 W during gaming.
I've never trusted Hardware Unboxed after he promoted the 2500K for Battlefield 1 multiplayer while only testing singleplayer, didn't know how Gear 1 and Gear 2 work for RAM, and shilled overpriced DDR5-4800 kits through his links over a cheaper CL16 3600 MHz DDR4 kit that ran rings around them.
I don't trust anybody who doesn't livestream their benchmarks.
That's great dear, but literally everyone else has the same numbers.
So what's the takeaway from this? Why are the benchmarks all over the place, with one person saying Cyberpunk is faster and another saying it's slower?
It's why standardized benchmarks need to exist, and why gamers should be doing the benchmarking, not washed-up old reviewers who think unlocking the FPS in CS2 would get them VAC banned.
Like I said, 7800X3D is better than the 7950X3D.
Nope. The 7950X3D's X3D chiplet is a higher bin. Framechasers bought both and tested them.
Do you understand that that doesn't matter?
It's split across two CCDs, making the 7950X3D mostly pointless, especially for gaming.
There's a reason it got shit on, and still does. There's a reason the 7800X3D is at the top of the lists. There's a reason people want a single-CCD 8800X3D or whatever it ends up being called.
You keep doing that moron, buy a console instead too.
It was shit on because it didn't ship with a proper scheduler and used Game Bar to decide which CCD to park.
The actual CCD itself is binned higher; the guy disabled the 2nd CCD and got the same performance.
It literally doesn't matter that there is a 2nd CCD.
And an extra 100-200 MHz on an X3D part doesn't matter that much either.
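If you don't trust the Game Bar parking, you can pin a game to the cache CCD yourself. A minimal sketch using psutil, assuming the V-Cache CCD is CCD0 and maps to logical CPUs 0-15 (8 cores with SMT; verify your own topology in HWiNFO or Ryzen Master first, and note `game.exe` is a hypothetical process name):

```python
# Manually restrict a game to the first CCD instead of relying on
# Game Bar / driver core parking.
# Assumptions: V-Cache CCD = CCD0 = logical CPUs 0-15 (verify this!).
# Requires: pip install psutil

import psutil

CACHE_CCD_CPUS = list(range(16))  # logical CPUs 0-15 -> CCD0 (assumed)

def pin_to_cache_ccd(process_name: str) -> None:
    """Set CPU affinity of every matching process to the cache CCD."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(CACHE_CCD_CPUS)
            print(f"pinned PID {proc.pid} to CPUs 0-15")

pin_to_cache_ccd("game.exe")  # hypothetical executable name
```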
Am I talking to ChatGPT?
AMD will never be good. They can't even fix their fTPM issues, and their GPUs are somehow even worse. Absolute bottom-tier shit.
5800X3D bros
Depends on the game.
The gap is even larger if the RAM is XMP 7800 MHz or overclocked to 8200-8400 MHz.
Now let's see 7800X3D.
Why? The 7950X3D supports your point, since it has the better-binned chiplet.
Always trust an AMD fanboy to be moronic.
Congratulations, your disingenuous shilling has ensured I'll go AMD next
not my problem. your loss.
>RAM is XMP 7800 MHz or overclocked to 8200-8400 MHz.
Ten years ago I thought 1600 MHz was plenty. Crazy how certain aspects of computing have scaled so far while others haven't (ultimately we haven't gotten that far past "with good water cooling I can push 5 GHz on my CPU" from the same era).
Crazy what a little competition does.
5 years ago it was normal to be using a 4C/4T CPU.
>5 years ago it was normal to be using a 4C/4T CPU.
I was baffled when it became normal for PHONES to have more cores than desktops.
It's not really a huge deal. I mean, do people really upgrade CPUs? By the time it makes sense to do that, it also makes sense to upgrade everything else! So I feel it's better to build a whole new rig and keep the old one as a spare.
I don't care about fps, I need more cores and threads; only a fool would buy an 8C/16T chip just for gaming.
AMD has been winning since 2019, and anyone arguing otherwise was an entrenched brand loyalist. LGA1151 and LGA1200 are dead, while AM4 gigachads are sitting on 5800X3Ds until the end of the decade.
>since 2019
The 3900X was the only good CPU back then, and only in applications that could use 12 cores.
>gigachads are sitting on 5800X3D until the end of the decade
Must suck to have half your games run like shit lmao.
>no argument
I accept your concession.
>half your games run like shit
If believing that makes your days slightly more bearable.
The extra cache doesn't help in all games, sir.
I see the shills still have spare time to post here with the war going on.
Not a flex nor a brag, but I went from an 8-core to a 12-core (Ryzen). Am I cool yet?
Not really?
What's the gigachad combo for a CPU + GPU + RAM setup these days?