Meanwhile Apple is running a 3nm process and is ahead of both of them in terms of thermals and power profiles, and also abandoned the X86 architecture for ARM because of how inefficient it is, with a huge reduction in power usage. Server farms are increasingly experimenting with ARM-based servers to reduce their power and cooling bills
X86 has a lot of legacy instructions and is bloated from 45 years of development (it launched in 1978), whereas ARM has a much smaller, simpler instruction set that is cheaper to decode and more power efficient, which is a big part of why ARM laptops like Macbooks have 20hr battery life, while X86 laptops have 4-8hr battery life
Intel and AMD have recently been exploring RISC-V as a possible next step after X86, which is similar to ARM but not owned by SoftBank. For now X86 is still pretty good, but the power and thermal efficiency gains of ARM are hard to ignore, so it probably will be considered legacy at some point.
ARM is for phones, its architecture is never gonna replace amd64.
Every macbook has been ARM for a few years now and it's the same performance with less power usage, it looks like it's probably the future
ROFL
Apple's M2 dumpsters the 7800x3d for anything that isn't GPU bound
Can't wait to play Cinebench 2024 with the boys
Ergh AMD cucks??? Explain yourself.
The standard answer is 7800x3d has shit multicore performance and is very strictly optimized for gaming with the 3d cache whereas M2 Ultra is a productivity processor and should be compared with 7950x. Still shows ARM can be a beast under the right circumstances
24 cores vs 8 and how many times more expensive? 3-4 times?
Retard
Sure showed me, shiteating homosexual.
>Comparing a 24 core CPU
>To an 8 core CPU and RTX 4090
>In a CPU benchmark
Retard
Please list all the games the M2 beats the 7800x3d.
Should beat it in cpu bound games right? lol
Cinebench does not matter for gaming.
GarageBand
Uh... bros? Are we getting beaten by a phone CPU?!
Why the FUCK are you comparing a gaming CPU versus a productivity CPU?
>Why the FUCK are you comparing a gaming CPU versus a productivity CPU?
But AMD cucks keep comparing their CPU with Inlel's.
>But AMD cucks keep comparing their CPU with Inlel's.
What are you even saying?
Now compare it to a 64 core threadripper in running some backend database..
I mean, yes obviously it's going to lose in that circumstance, the point of posting the graphic was to demonstrate ARM is capable in a desktop setting. AMD fans get so offended by this comparison.
>AMD fans get so offended by this comparison
It's a retarded comparison, of course people will call you out for making that comparison. It's like comparing a pair of running shoes versus a pair of hiking boots, and then saying that hiking boots are bad because you can't run in them.
Someone above said ARM is only for phones when it is clearly not, that was the only point I was trying to make with the image. Settle down retard
Comparing a 16 (cuck cores don't count) core CPU to an 8 core in a multi-threading benchmark was what was truly offensive
It has better single core performance too https://www.youtube.com/watch?v=5dhuxRF2c_w&ab_channel=optimum
>cpu score
>4090
>wat
Read the image again dumbass, it's a single core test between 7800x3d vs an M2 Ultra
Rarted. The 7800X3D isn't even the best performing single core CPU in its class. It's the cache that makes it perform so well
*In games, I should also say
>M2 Ultra and 7800X3D don’t share pricepoints, configurations or even basic architectures
What’s the point of this retard graph? What’s the point of this retard post?
If you bothered to read the thread before posting this inane drivel, you would realize it was showing ARM is competitive with X86-64 after an earlier anon said ARM was just for phones
But then you have to factor in virtualization overhead, but hey it works for the Steam Deck.
Then why aren’t you comparing comparable configurations like a 24 core threadripper to it? This tells nothing.
Because it's a screenshot from a specific video and that comparison isn't available, not responding any further because this has been asked and answered several times already
You’re a stupid fucking monkey as you thought it was something worth reposting in the first place. I hope you’re 12 years old for your sake.
>too illiterate to read the thread before posting
>start hurling insults when that is pointed out
>accuses others of being underage despite behaving like an underage
wew
You are incapable of doing basic PC part comparisons, I would lurk for at least two more years if I were you.
>anything that isn't GPU bound
Damn, if only this wasn't a video games board.
>That's in gaming and I am not sure what games they tested in. It's built to run hot. Try it out in CPU intensive workloads. Firefox alone can make temps shoot up out of nowhere
Just give up. There are reasons to buy intel, but not for a pure gaming build.
It's hotter, less energy efficient and costs more.
If you have a 5800X, 3D or not, you don't really need to upgrade. If you have a board that you can slap a 5800X3D into, you're fine for now. Otherwise, yeah just get a 7800X3D, because if you have to buy a new board you might as well do it right the first time. Shitflinging aside, nothing comes close to competing with it.
Userbenchmark shill
Sometimes I still go to Userbenchmark to have a laugh at their increasingly desperate reviews. Their front page reviews on RTX 4000 series and Intel 13000 series namedrop AMD more than the actual product.
>That's in gaming and I am not sure what games they tested in. It's built to run hot. Try it out in CPU intensive workloads. Firefox alone can make temps shoot up out of nowhere
this guy got btfo
You're on a video game board you fucking retard
Any modern CPU can handle gayming just fine outside of some outlier cases. Just stop fretting so much.
Thermal density is the actual reason. Can't get around the physics. Pumping ~200W of electricity into something with the aggregate size of a US quarter is going to be a challenge to cool.
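To put rough numbers on that (back-of-envelope: the quarter dimensions are real, the stove burner is just a ballpark comparison, not from any spec sheet):

```python
# Rough power density math for the quarter comparison above.
# A US quarter is 24.26 mm across; stove burner figures are ballpark.
import math

def w_per_mm2(watts: float, diameter_mm: float) -> float:
    """Watts spread over a disc of the given diameter."""
    area_mm2 = math.pi * (diameter_mm / 2) ** 2
    return watts / area_mm2

print(f"~200W over a quarter: {w_per_mm2(200, 24.26):.2f} W/mm^2")  # ~0.43
print(f"~2kW stove burner:    {w_per_mm2(2000, 180):.2f} W/mm^2")   # ~0.08
```

And the actual dies are a fraction of a quarter's area, so the real density is even higher than that.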
It actually consumes almost 200W if permitted and taxed enough. The greater Intel SKUs drink almost 300W.
On normal mainstream and gayming loads (1-2 heavy threads and a few light ones in background), they will never drink that much power.
It's still something you need to account for when considering a PSU. As games develop you will likely see the CPU being utilized more, drawing more power. Either way, the AMD equivalents draw even less under gaming loads.
I has "high" idle power consumption of 30W due to chiplets. It runs hot under small loads (max power draw is 90W) because of the 3d vcache adding extra active silicon, acting as a warning blanket for the compute chip.
It also runs at whatever temp (max 90C) you let it turbo to in BIOS. Want it capped at 70C? That's just a setting. 80C? 90C? Yeah that's allowed too. So it doesn't "run hot" in any sense of the word, really. It boosts until it's at 90C (or your temp limit) or 5.05GHz. That's the algorithm.
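Roughly this, if you wrote that behavior out as code (a toy sketch of what was just described, not AMD's actual firmware; numbers are the ones from this thread):

```python
# Toy model of the boost behavior described above: raise clocks until you
# hit either the temperature limit or the max boost clock. Not real firmware.
TEMP_LIMIT_C = 90.0    # Tmax, or whatever cap you set in BIOS (70/80/90)
MAX_BOOST_GHZ = 5.05   # boost ceiling

def next_clock(current_ghz: float, temp_c: float, step: float = 0.025) -> float:
    if temp_c >= TEMP_LIMIT_C:
        return max(current_ghz - step, 3.0)          # at the limit: back off
    return min(current_ghz + step, MAX_BOOST_GHZ)    # headroom: boost higher

print(next_clock(4.9, 72.0))  # good cooling -> keeps climbing
print(next_clock(5.0, 90.0))  # at Tmax -> backs off a notch
```

Better cooler, lower temps, the loop settles at a higher clock. That's all "it runs at 90C" means.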
Nah, both platforms provide excellent performance and features for the vast majority of non-professional loads. You can't go wrong by going into either camp.
The real problem is that both AMD and Intel are pushing their silicon to the bleeding edge from the factory via boosting, trying to best each other in some stupid epenis benchmark. That extra 5-10% performance for 100W more is really not worth it. Both chips are insanely efficient if you actually dial them down a notch or two; the "performance loss" is minimal at best and they still easily outclass their predecessors.
In current games a 7600 with an rx 7800xt will perform better than a 7800x3D and a rx 7600.
But if you want to buy a long term cpu that will be great for years and you can afford a 7900xtx or a 4090 then get the 7800x3d.
Avoid at all costs. I'm on my second set of dead motherboard and CPU with a 7950x3D. Even when it worked, it had problems with stuttering and stability. I hate it and wish I could get a refund.
>I'd overclock like a n00b on something that is already factory-overclocked near breaking via boosting
>I don't know how to thread schedule on my ancient OS that has a modern coat on top.
Well one you are going to fill with the fastest nvme ssd you can afford and the other can be used as fast storage. Generally only one of those slots is the fastest speed.
>slots
is this what im looking for on the mobo spec sheet?
okay i couldnt find that on the microcenter website so i found this on the manufacturer's website
Yeah I wouldn't. You want at least one PCI 5 slot so you can take advantage of NVME M.2. top speeds. The drivers aren't even that costly.
>drivers
I meant drives. For example I have an old motherboard that only supports pcie 3.0. I have a 4.0 Samsung M.2. drive, so I lost some speed. Still works, but not as fast. So you would be doing the same with a 5.0 drive. But in my case 3.0 was top of the line at the time. So you would be buying old stock.
probably why its in the bundle i guess
¯\_(ツ)_/¯
justifiable for the price of cpu+mobo+ram? can just replace later?
It is just old tech. PCIe 5.0 is the latest. PCIe 4.0 is 4 years old. I wouldn't do it, but hey there are still not as many pcie 5.0 m.2 drives out there as 4.0. Your choice.
I wouldn't stress too much about 4.0 vs 5.0.
Yes 5.0 is newer and faster, but currently 4x as expensive (the cards themselves, not counting the motherboard you'd need to buy to support it). The theoretical read/write maximums are nearly twice as fast but you won't be seeing any significant improvements outside of copying files or rendering.
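For reference, the theoretical x4 link numbers behind that "nearly twice as fast" (quick math, one direction, ignoring protocol overhead beyond line encoding):

```python
# Theoretical one-direction bandwidth of a x4 NVMe link per PCIe generation.
# Each gen doubles the per-lane transfer rate; 3.0+ uses 128b/130b encoding.
ENCODING = 128 / 130

def x4_gbs(transfer_gts: float) -> float:
    return transfer_gts * ENCODING / 8 * 4  # GT/s -> GB/s per lane, x4 lanes

for gen, gts in (("3.0", 8), ("4.0", 16), ("5.0", 32)):
    print(f"PCIe {gen} x4: ~{x4_gbs(gts):.1f} GB/s")
# ~3.9, ~7.9, ~15.8 GB/s -- and real-world gains in games are far smaller.
```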
Seems like just about all AM5 boards support pcie 5.0 nvme drives, given this thread is about the 7800x3d, an AM5 chip.
pic is loading times btw, took 5000 hours of Google searching to find this
If the game supports Direct Storage, those load times accelerate dramatically on PCI-E 5.0, too bad not many games do, only Ratchet and Clank, Hunt: Showdown and Forspoken do currently. Diablo 4 support is coming soon. It's really niche on PC unfortunately, while every PS5 and XBX game supports it
>Ratchet and Clank
Caps out at PCI-e 3.0 speeds.
oh really, wow what a bust of a feature then
3.0 is unironically more than enough and basically indistinguishable from a PS5 unless you do side-by-side comparisons as long as you have a good CPU.
Left is better, not only because of the better CPU, but with Intel you'd want much faster memory.
hmm. well the reason i was considering right was because i heard 7800x3d was decidedly more for gayming and worse in productivity, and i wanted something more balanced for the different applications i used.
Keep in mind people paying for all these expensive systems are likely putting 1000 hours into Factorio.
or just using them as basketweaving board shitposting stations
In that case yes, the X3D are not productivity chips, and Intel is generally more balanced. The "balanced" AMD version would be the 7700/X.
But check benchmarks and forum posts of the programs you use.
7700x and 7800x3d are like the same for productivity... The only AMD CPUs worth buying are 7800x3d for gaming or 7900X for productivity, they are around the same price.
You're right, I remembered 7700 being faster
>7700x and 7800x3d are the same for productivity
Not necessarily.
>and i wanted something more balanced for the different applications i used.
What are you using on a regular basis where you'll actually notice the difference between the 16 threads on a 7800X3D, vs the 24 or 32 threads of its more expensive options/competitors?
You can get a 7900X for about the same price of $400 now. Though you missed when Newegg was even offering a deal where it also came with 32gb of DDR5 and slopfield a little bit ago.
The drivers just got thousands of people banned from Counterstrike 2 because they shipped a literal hack as part of the official drivers to make FSR3 work. Gee I wonder why people dislike AMD graphics cards
Reflex has to be manually implemented by the devs in their game binary. AMD Anti-Lag is implemented at the driver level by intercepting draw calls which is the same way hacks work. They also didn't tell anyone they were adding it to the driver before releasing it, so Valve started banning people that looked like they were hacking, but just made the unfortunate decision of buying an AMD card
I sadly agree. I bought a 6650xt and for some reason it kept crashing until I had to tune it and undervolt, and I have no idea why because I have more than enough power.
Everything he says is bullshit and even if they are true, these are problems tied to motherboard issues, not CPU issues.
Probably bought an ASRock board, if he actually encountered those issues himself.
The RAM thing can be true, mostly because RAM kits are now specifically XMP or EXPO "certified". If you have AMD, you obviously need EXPO, if you have Intel, you need XMP.
Price / performance and not needing to tune CPU / RAM etc. is a huge selling point. If you enjoy tweaking, Intel can come out ahead in many scenarios + don't have to worry if a game supports the extra cache or not.
>has to cherrypick and lie
The state of AMD, lmao.
Maybe you should post the correct CPU next time lol
Never @ me again
>y-you see, average FPS doesn't matter because... um... it just doesn't, OKAY?!
lmao
Compare game by game and not an aggregate. Certain games play better on either Intel or AMD so it's disingenuous to combine all the numbers.
Ah, is that why you had trouble posting the right CPU?
you're proving him right?
Nope. It's just a 7950X3D and the other CCD is parked. It has a higher clockspeed over the 7800X3D chiplet.
>Nope
>>has to cherrypick and lie
>has to cherrypick and lie
The state of AMD, lmao.
Dang, anon. You're pretty bad at this
Actual retard. This is supporting AMD. It would be unfair if I compared the 13900KS to a stock 7800X3D, yet it's not okay with the better binned chiplet?
Lol
Lmao even
The mental gymnastics of AMD homosexuals.
Yeah ok, retardoid. Maybe post the right CPU next time, yeah? Anon will claim the 7800X3D to be slower yet not even post the correct CPU lmao. You've got two brain cells per hemisphere
>nagger unboxed >no overclocks >no RAM speeds
Sorry big boy, opinions from mongoloids incapable of posting the right CPU are disregarded
>a higher binned chiplet is bad
ok chud. just for you i'll post the lesser version.
Took you long enough lmao
Refer to
[...]
Took you long enough lmao
homosexual AMD shills. i hope they pay you well for being this scummy
Hey settle down, okay?
actually as bad as linux homosexuals. constantly misleading, gaslighting and word gymnastics to avoid any situation
You don't actually get riled up over anonymous internet posts, do you, anon? That's kinda sad
why you pulling us linaggers into this the fuck we do to you
INTEL SHILL MAD
INTEL SHILL MAD
Not even an Intel shill you homosexual. This is my opinion and I recommended the 7800X3D...
Price / performance and not needing to tune CPU / RAM etc. is a huge selling point. If you enjoy tweaking, Intel can come out ahead in many scenarios + don't have to worry if a game supports the extra cache or not.
>average FPS
Refer to
Sorry big boy, opinions from mongoloids incapable of posting the right CPU are disregarded
>tfw fell for intel estrogen cores
>2 refreshes already
>don't come even close to ryzen
I made a big fucking mistake. If only amd gpus were as good as their cpus
AMD's GPUs are completely fine, and a better buy than Nvidia right now. But unless AMD humiliates Nvidia like they destroyed Intel, people aren't going to switch.
People are used to buying Nvidia, they will keep buying Nvidia, 30% better price/performance isn't enough to persuade them to switch.
If Ryzen was just "well, it's the same performance but a bit cheaper," everyone would've just bought Intel anyway. But when Zen 1 came out, the 8C/16T 1700 was priced the same as the 4C/8T 7700K from Intel.
The $300 Ryzen 1700 + random ass $60 B350 motherboard performed the same as the $1000 6900K + $300 motherboard from Intel. It disrupted the market completely.
ayymd does not care, outside a few Linux supercomputers their compute offerings are a joke. And they never invest real money in it despite being rich enough to buy Xilinx.
they put a lazy minimal effort to satisfy a few corporate buddies and that's it. same for gaming.
they are happy to price their shit 10% cheaper than israelitevidia and call it a day.
How shitty are you at putting PCs together?
Takes me like an hour or two hours max to put machines together
Literally the only parts that take some time to install are the fan on the CPU, the PSU, and connecting all the bullshit from the case to your mainboard
I'm not sure about you anon, but masquerading a 7950X3D as a 7800X3D sounds pretty fake to me lol. I'm not sure why you're struggling so hard to post the right CPU lmao
the guy who "masqueraded" a 7950X3D as a 7800X3D only did so because AMD deliberately delayed the 7800X3D to get people to waste their money. of course AMD fanboys think it was to make them look worse.
It's still fake, then? Why's it so hard for you to post a genuine 7800X3D benchmark? Wtf bro, how long has it been?
I haven't posted any benchmarks because posting a png of a bar graph doesn't prove anything. But you're making accusations of malice simply because you aren't happy with the numbers he put out.
Or maybe because he faked his numbers.
whatever helps you sleep at night
Numbers of wha? A 7950X3D? Lel
People "called him out" on it so he bought a 7800X3D and it actually was 100-200mhz less than the 7950X3D, hence why it's binned better.
>he was lying to you for your own good
Absolutely israeli, post nose
he wasn't lying though. he said what he was doing.
He was lying through his fucking teeth. Anyone with a modicum of gray matter understands that the 7950x3d and a 7800x3d are completely different and will behave differently even if you disable a chiplet.
he got a 7950X3D because the 7800X3D was intentionally delayed, and disabled the non-3D ccd to create a proxy for the 7800X3D, because it wasn't available at the time. then when the 7800X3D came out, he tested that too. you're inventing a narrative because he said your favorite product isn't the fastest.
There's no arguing with him. He's like the Linux guys on this board. They lie and mislead their way out of everything you bring up. He's probably a Linux guy.
What the fuck are you on about you fucking schizo. What's wrong with linux you fucking pajeet.
>want to play XYZ game
>nooooo not that game why would you want to play that
That is why people laugh at Linaggers.
Same vibe in this thread regarding AMD. Dismissive and misleading.
Its too bad the strawmen in your head talk that way. I completely understand that it's not ideal that some games don't work, and I won't pretend that 'lmao just play something else' is a valid response, but that's generally not the OS's fault; the few games I want to play (R6S and EFT primarily) are unplayable due to anticheat. For all other gaming I do personally it's been much smoother than Windows.
>What's wrong with linux
I think he meant Linus, but both the OS and the youtuber are bad.
>Linux bad
nuh uh Linux GOOD
Nu uh Linux absolutely bad
how so, speak nagger
Linux bad, simple as
Install and you'll see
>install
nagger I have been using LINOOX for the past 5 months. Jumped ship from windows with no experience. LINUX G O O D
Why Linux good? Linux bad.
LINUX FEEL GOOD
The main reason I switched from windows is because everytime I ran a program that wasn't an Official Microsoft (TM) program it would do this thing it would have to load for nearly a minute doing god knows what, probably scanning and reporting my eroge to microsoft. Bill Gates and his pajeet cadre don't need to know my fetishes. Now I run linux and I'm F R E E like my software, my fetishes are between me and my ISP.
LINUX G O O D
RESPONSIVE
FAST
TASTES GOOD
FEELS GOOD
WINDOWS RUNS LIKE SHIT
>god knows what
>antivirus scan
That's funny because I ran Windows Enterprise with defender disabled.
Can I play Battlefield 2042 on Linux?
>Battlefield 2042
You used to be able to when they used EAC for their anticheat, but now they use a proprietary kernel level anticheat that's incompatible.
Then maybe he should have waited for the real thing lmao, stupid cunt
He tested both you braindead nagger.
The 7950X3D was the faster part because it has a better binned chiplet.
It actually supports your shilling, retard.
Don't care, loser. Maybe you should spend more time posting genuine benchmarks over performing mental gymnastics, yeah?
That is a genuine benchmark.
That mysteriously can't be recreated, and comes from a dubious source.
all of your sources are from people who literally do ad spots for pc hardware. I guess that makes them seem trustworthy to you.
My usual source for hw stuff is GN, because they actually publish their testing methodology.
Your source is a mystery man doing whatever he wants, however he wants, including lying and simulating hardware he doesn't have. The original source is also conveniently behind a pay wall so it cannot be verified.
>My usual source for hw stuff is GN
I know it is. That's why I don't care what you think.
>GN bad
Explain
>Your source is a mystery man doing whatever he wants, however he wants, including lying and simulating hardware he doesn't have. The original source is also conveniently behind a pay wall so it cannot be verified.
He live streams everything.
It's behind a pay wall because buying lots of hardware can get expensive. He's not bought and paid by companies.
People steal his overclock settings (particularly RAM), so at least he got paid for it.
But no, you would rather trust nagger Unboxed who shilled his affiliate links for dogshit 4800mhz DDR5 RAM rather than high-end DDR4 RAM for the same price.
Please quote where in my post I mentioned hardware unboxed.
You just can't stop lying, can you?
Somebody was posting Hardware Unboxed but if it wasn't you then fine. Still proves the point that mainstream tech guys are paid off.
Stop trying to talk your way out of this one, nagger.
Post proof of tech Jesus being a shill.
We're waiting schizo.
>guy who faked 7800x3d results good 🙂
>GN bad >:(
cope, tranny
yo chill linus you wouldn't want to offend your employees
>Explain
doesn't play games, uses an amd fx chip for his personal system because that's how little he cares about PC performance.
spends half of every review on shit like blender because he doesn't care about games
tests retarded irrelevant games nobody cares about like strange brigade because "it's well programmed" which is useless for people who actually play games
shills pc hardware while claiming to be independent somehow
threw buildzoid under the bus for calling the gn audience "normies", which you are
threw linus under the bus because he's a jealous homosexual
annoying whiny lispy voice
ugly
stupid hair
the list could go on
Hello Linus, how's the wife and her boyfriend?
Also I know you're shitposting but Steve does like games, he doesn't like AAA slop. There are videos where he briefly talks about games he's looking forward to, usually from indie devs.
nta, but
>steal his overclock settings
If you're gonna publish benchmarks, shouldn't you make all your system settings public? And as far as paywalling your videos, that seems odd in the sense that I can't imagine someone who would actually pay for benchmark videos like that. Wouldn't anyone who would pay for his videos just support him without needing the incentive of paywalled videos? Just seems odd to me
>all your settings public
In the final video the basic timings and speeds for RAM are visible, but the secondary timings are not; those are behind the paywall. You can still access it.
Why is shit like this acceptable then?
LOL
The facts don't lie retard.
[...]
Here's the 4k one too
Huge double standard because you're an AMD fanboy.
>I can't imagine someone who would actually pay for benchmark videos like that
Because you're not his target audience. You're somebody that watches Gamers Nexus or Hardware Unboxed and only cares about plug and play. You aren't overclocking, undervolting, whatever.
>Wouldn't anyone who would pay for his videos just support him without needing the incentive of paywalled videos
The end results and the testing methodology are free on YouTube, but if you want to see the raw live stream then it's behind a paywall.
>Huge double standard because you're an AMD fanboy.
that was my first time responding to you
>You aren't overclocking, undervolting, whatever.
I literally am, I usually watch buildzoid for that content.
Why are you such a bitch? Did AMD fuck your mom or something?
>Why are you such a bitch? Did AMD fuck your mom or something?
>my first post
That is fine, welcome.
Earlier in the thread I actually recommended AMD unless you're fine with tweaking, or your favourite game doesn't benefit from the extra cache.
I see fake, simulated results on there, though?
>That mysteriously can't be recreated, and comes from a dubious source.
happy?
Is that a real 7800X3D this time?
The lower FPS, yeah.
Okay good, now post video source
It's behind a paywall
Yeah and my girlfriend just goes to another school lmao
What is that meant to mean?
My gf in highschool went to another school.
Nice christian girl, blonde hair and 6ft tall. Huge legs from dancing.
I'm not going to do that because A, you will just say it's all lies anyway, and B, I really don't care what random Ganker plebs think about PC hardware.
>source: 10 minutes in excel and 1 in paint
Also what the fuck testing methodology is that?
why should he have waited? because you homosexuals would twist the story to make it seem like deliberate deception? yeah, he probably should have anticipated that AMD fans are psychotic weirdos.
>Why should he have waited for the real thing?
Gee, anon
I went with the NZXT N7 B650e.
Good VRMs, nice design, thermal pads for NVMEs, etc.
Nice upgrade path down the line when RAM gets better or better CPU, also has a gen 5 nvme and GPU slot so the mobo will last a while I think.
Worth the price.
>Pros
Cost effective and can be better in certain games
No need to tune it outside of potential XMP issues.
Lower power consumption**
>Cons
AMD jank
Not all games benefit from the extra cache
Less cores. If you're strictly gaming then it doesn't really matter but it can be handy to put things on the eCores.
12700KF regularly goes on sale + high end DDR4 RAM is much cheaper so cost / performance can be outdone.
Limited to 6000Mhz DDR5
Pretty neck and neck I'd say. I prefer things to just work, and not all my games benefit from the extra cache.
**You can park eCores, or the 13700k/14700k exist which have fewer eCores. Vendors also push Intel CPUs to their limits to get an extra 100-200mhz to look better than their competitors. They stopped doing this on AMD due to CPUs blowing up. Personally I have a 13700k at 5.4Ghz and it runs between 50-110W depending on the game. 150W power limit while rendering and it drops to 5.1Ghz. Not a huge deal since GPUs use 350-600W now.
I haven't played at 1080p in like 5 years so a 3D chip wouldn't help me at all.
Happy for AMD to have a competitive product but unfortunately I don't think it matters that much in real use cases.
>7800X3D dabs on everything at 1080p
True
>1440p and 4k
Not true, it shits itself at higher resolutions and at that point the tradeoff for general applications isn't worth the minuscule fps difference.
>shits itself at higher resolutions
That's not how CPU scaling works. At higher resolutions the gap shrinks due to lower average framerates and GPU bottlenecks. Once GPU limited, it doesn't matter if you run a 9600K or a 7800X3D.
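If you want that mental model as code (numbers completely made up, just to show the min() effect):

```python
# Delivered fps is capped by whichever side is slower. Made-up numbers.
def fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

cpu_limits = {"9600K": 110, "7800X3D": 200}          # hypothetical CPU-bound fps
gpu_limits = {"1080p": 240, "1440p": 150, "4K": 75}  # hypothetical GPU-bound fps

for res, g in gpu_limits.items():
    print(res, {name: fps(c, g) for name, c in cpu_limits.items()})
# 1080p: big CPU lead. 4K: both read 75 fps because the GPU is the wall.
```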
There's so few games where it has a clear lead like MSFS 2020 where it has 25-30% higher fps. For everything else you're comparing sub 10% variance.
Not worth changing my entire mobo/ram/cpu setup for.
Motherfuck, been trying to renovate my old desktop PC but now the goddamn thing won't even power on. This thing's ancient too so any replacement motherboard is long out of stock.
People overspend on the CPU so fucking much. This thread is probably just marketers though. You just keep spamming names of products to raise interest and thus be able to sell it for more.
5600x, save the $200 and get a barbell or something.
there's no "overspending on cpu" homosexual retard
until it starts bottlenecking shit or you have to replace the entire pc because tech naggers pushed out new israelitery it will just work, and the better is the cpu the longer it will take
Most people are still under the idea that we're in Intel's decade of nothing happening with CPUs. Which of course hasn't been a thing since AM4.
They're the people spending 1600 on a 4090 and then combining it with a 5600G.
>b450
Just get the 5500 instead, your board doesn't have PCIe gen 4, so why pay extra for something you won't use; but if it's just a $10 difference, just get the 5600x. Performance wise, 5500 and 5600x are not much different; you would see better 1% lows with the 5600x if you had a B550 thanks to PCIe 4.0, but that's pretty much it.
Imo, I would just suggest getting an A620 + 7500f, then finding a decent DDR5 kit; you would get pretty much the same performance as a 5800X3D, but also the new upgrade path when new AM5 CPUs come out. ASRock A620 boards are, curiously, the best ones made, while MSI's are fucking shit.
7000 ryzen, enjoy extreme thermal degradation of both CPU and MB in a couple of months. A lot of guys showed the results - bulging and delamination of MBs, the CPU crystal peeling off the board, some people even reported solder melting under the CPU heat spreader.
Clicked on a couple of articles - TomsHardware, TechPowerUp etc - they either don't mention MB brands or state that this shit is present on MBs of all big brands - asus, gb, msi.
Friendly reminder: Don't fall for the goonix meme.
>OOM so shit it WILL kill ANY heavy process that isn't the one actually eating all your virtual memory and shitting itself
>5-10% (maybe even more in some cases) performance lost on AAA due to meme syscall translation crap
>It won't JUST WORK out of the box (Fuck off with your gay little caveats, i want the full game)
>No one actually develops NATIVE LINUX GAMES (The ones who did shat the bed because no one bought them)
>Zero multiplayer/online coop games of substantial worth (FOTM indietrash need not apply)
>The few profitable corporations whose products rely solely on poonix don't respect their user's freedom (nor will they ever).
You're better off debloating win11 with ntlite. Don't bother with Microsoft's and Gabe's little penguin slut.
Provide proof it happened to hundreds of boards, shitposter.
The only ones we have are when people used ASUS boards.
I'll wait, but I know you'll call me a shill either way like the fag you are.
"We are aware of a limited number of reports online..." - corporate speak for "fuckton"
"The AMD statement follows several of its motherboard partners..." - that means "we panicked and pushed ALL big MB makers to quickly patch this shit up to avert a class action lawsuit".
I never called you a shill, I dont care what brand of CPU you have, I have both. I just have very little tolerance to lies and deceit from both AMD and Intel. You on the other hand called me names, which shows your quality as a debater
Yes, multiple manufacturers disregarded the voltage limit in order to get bigger numbers.
This has all been covered already by multiple independent news sources.
I have a b650e Taichi, it's bretty gud. Has bclk overclocking if you're interested in that. And decent connectivity. I used to have the ASUS X670E TUF; that thing was a massive pile of shit. Terrible coil whine and long-ass startup times.
"X" models don't come with coolers, yeah. They're also slightly faster on paper... but in practice the difference is less than 5%.
X or non-X, it's always better to get whatever is cheaper.
This is wrong. I got the 3700X like 3-4 years ago and it came with an RGB cooler.
Oh yeah, you're right. So let me rephrase.
Usually, AMD sells two different versions of the same CPU: a standard cheaper version, and a better binned but more expensive version. Usually, the standard version is called Ryzen [Number] and the better binned version is called Ryzen [Number]X.
And it's almost never worth paying more for the better binned version because it performs almost the same as the standard version. If the better binned version isn't THAT much more expensive, it may be worth considering. Otherwise, the standard version is completely fine.
But as you pointed out, their naming scheme isn't always consistent. For example, in the very first generation they sold a 1700, a 1700X and a 1800X. These were all the same CPU and performed roughly the same.
With Zen 3, they launched the overpriced 5800X. Later they launched the much cheaper 5700X. But these were also the same CPU.
My 7700x didnt come with a cooler and my experience is more recent.
My 7600 came with a cooler, if I remember right the X ones didn't have them. The high end CPUs of course don't come with coolers since they need much better cooling than stock.
Then the anon saying that x versions do not come with coolers is correct.
You're not going to sustain the 5ghz boost on a basic cooler (and that one comes with no stock cooler at all). I threw my NH D15 from my previous build at it just to be safe, given how quickly that CPU spikes temps under sudden load.
>it's okay when the motherboard overvolts Intel to draw 300W to look better than the other motherboard competitors but when AMD does it, it's bad
>no of course we won't re-test power draw when using Intel's parameters or undervolting
This is why I hate AMD naggers so much. Always double standards and dismissive of everything.
>fixed
By ASUS not overvolting the CPU anymore, dropping power consumption.
Yet it's still perfectly okay to compare that to Intel while vendors are overvolting Intel.
nagger.
>b-b-b-but productivity!
7800X3D is going to be stomping vastly more expensive CPUs in gaming for two generations to come, like the 5800X3D before it. Unless it's work related, I can wait another 40s on a Blender render......or build a second PC specific for work related tasks.
My 13700k draws 50-100W during gaming at 5.4Ghz. Why does everyone keep saying it draws 300W? Did they confuse it with the GPU? I could probably undervolt it if I wanted to.
What does a larger cache mean for gaming on a CPU? Just more instructions it can chew away at before having to make a call to memory to send/receive more instructions?
CPU wants X. X travels there from memory. But whoops, L3 (outermost cache) is full. X needs to travel back to memory to wait, then get called back to L3. That's three trips, lots of physical distance to travel. It can only now go to L2, L1, goal. But it's late.
Your game will now stutter and/or the fps might take a dip.
More L3 is more buffer before that happens.
But also yes, as you said, a faster more constant chew.
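A dumb little simulation of the same idea, if it helps (made-up sizes and latencies; LRU eviction standing in for whatever the real replacement policy is):

```python
# Bigger L3 -> working set fits -> fewer slow trips to memory. Toy numbers.
from collections import OrderedDict

L3_HIT, MEM_MISS = 10, 100  # made-up "cycles"

def total_cycles(accesses, l3_lines):
    l3 = OrderedDict()  # LRU: least-recently-used line gets evicted
    cycles = 0
    for addr in accesses:
        if addr in l3:
            l3.move_to_end(addr)      # hit: cheap
            cycles += L3_HIT
        else:
            cycles += MEM_MISS        # miss: full round-trip to memory
            l3[addr] = True
            if len(l3) > l3_lines:
                l3.popitem(last=False)  # cache full: evict oldest line
    return cycles

workload = [i % 48 for i in range(1000)]   # game loop touching 48 "lines"
print(total_cycles(workload, 32))  # L3 too small: misses forever (100000)
print(total_cycles(workload, 96))  # "V-cache" sized: mostly hits (14320)
```

Same access pattern, ~7x fewer cycles just because the working set fits. That's the whole 3D V-cache pitch in miniature.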
Got a 1070 and i dont play the latest stuff, is it a good idea to get a 1440p monitor?
also is it really that horrible to drop to 1080p on such a monitor if needed, over just lowering the graphics to maintain the native resolution?
Sure
I have a GTX 1080 (overclocked) and a 1440p monitor, runs most games at that resolution no problem. Modern games with lots of things going on will run at like 50 fps so I just lower it to 1080p and set up Nvidia Image Scaling or just sharpen through Reshade, it looks just a bit worse than native 1440p while gaining 20-30 fps.
Armored Core VI runs at 60 fps ultra at 1440p, but Monster Hunter World or Elden Ring will need 1080p to run at 60 ultra for example.
Also, if you read a lot on your PC, 1440p is a must.
That’s every board, even Ganker, though I suspect said trillion dollar companies can afford to pay 5 people to shill their products here without issue.
it beats or ties intel flagships at less than half the power draw
amd was so fucking retarded to launch the series with the 95c thermal cap instead of a proper power draw cap
>x86
>so inefficient that the likes of intel had to make e-cores to handle verbose ancient instruction noise from the x86 instruction set
>p-cores do the heavy lifting
>causes issues with older programs that are no longer maintained, requiring 3rd party methods to isolate it to p-cores
>arm
>more efficient, uses less power/more battery life, churns out less heat
>PCMR still has no tangible benefits from it in 2023
i need to wake the fuck up from this nightmare
Only if you own a 4090, otherwise you are just burning money.
imo best gaming CPU atm is the 7600 for the price, but if you already own AM4 platform, 5600 is pretty much all you need, specially for vidya, besides now CPUs like 5700x are pretty damn cheap on aliexpress too, great for 3rd world poorfags.
I just put together a 7800x3d last week with 6000mhz ram
so far, it's been great for gaming.. but all I've played is my time at sandrock and risk of rain returns.
Man I'm sitting on an RTX 3070 and I remember popping it in and slowly coming to terms with the fact that I just wasted $500, as I play mostly ancient 10-20 year old games. I still enjoy tinkering with my PC.
I did the same thing going from a 3600 to a 5800X3D and I barely noticed the difference since most of the games I play are 5+ years old.
I'll probably do a whole new build when I do notice the difference, but the price to pay to upgrade from an RTX 2070 will hurt.
In the context of whole system ownership though, that's paying <10% more for 12% better averages and 15% better 1% minimums. Which is a no-brainer if you just bought a 1600-2000 dollar GPU. Why would you gimp it by 10% to save another 200 bucks? You clearly have the money.
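Spelling that math out (build price is a made-up example, the 12%/15% figures are the claim above):

```python
# Hypothetical flagship build: is the X3D premium worth it?
build_cost = 2300   # e.g. 4090-class GPU + the rest, before the CPU upcharge
x3d_premium = 200   # assumed extra cost of stepping up to the 7800X3D

print(f"Extra cost: {x3d_premium / build_cost * 100:.1f}% of the build")  # ~8.7%
print("Buys ~12% higher average fps and ~15% better 1% lows (per the claim above)")
```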
Be future proof you dumbfuck. Every year there are games that use more resources. Why would you buy a good one now that will be bad tomorrow? Get the best now and be good for the generation.
7600 is all you need
>$300 motherboard
Good luck
my mobo was 220 leafbucks
still $100 more expensive. remember when budget boards were like $50-$70?
Remember when gas was 32 cents and you could buy a new house for $1500
>remember how it was 2 years ago
>YEAH BUT REMEMBER THE STONE AGE
DURING THE STONE AGE WHAT???
Perchance.
Budget AM5 is $110 now.
A620 MOBOs are $70-80 these days, you don't need more for gaming.
2500k is all you need.
5600x is all you need. I believe even the 3600 still holds up great.
yes
Yes, it's the best gaming CPU you can get right now. Even the 5800X3D is holding its own in benchmarks these days while being on a much older platform and limited to DDR4 memory.
Is DDR5 worth it?
Yes but only if you're on Intel
>end of life
cuck cope
what a retard you are for BETA TESTING a shit platform that keeps crashing itself
>Yes but only if you're on Intel
Why only intel? Amd doesn't make use of it?
It's limited to 6000mhz
That’s a huge increase over the average DDR4 speeds either way?
No it is not.
You can run 8200Mhz on Ryzen?
You can go higher than 6000m/t but you will lose performance in games.
So Ryzen is unstable past 6000mhz? Why even mention it then?
Did I say unstable? It doesnt perform as well in games past that. Not the same for non-gaming applications.
So is it because of latency or just a limitaion of the chip?
What's the maximum speed you can go?
It's because of the design of the chip. Infinity fabric runs 1:1 at 6000m/t. That's why am5 competes with intel even when intel runs higher speed ram.
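The ratio math, as far as I understand it (the 3000 MHz sweet spot is approximate and varies chip to chip, so treat the cutoff as illustrative):

```python
# DDR5-6000 = 6000 MT/s = 3000 MHz actual clock (double data rate).
# On Zen 4 the memory controller (UCLK) holds 1:1 with that clock up to
# roughly this point; push further and it drops to 1:2, hurting game latency.
UCLK_SWEET_SPOT_MHZ = 3000  # approximate, silicon lottery applies

def ratio(mts: int) -> str:
    mclk = mts // 2  # MT/s -> MHz: two transfers per clock
    mode = "1:1" if mclk <= UCLK_SWEET_SPOT_MHZ else "1:2 (latency penalty)"
    return f"DDR5-{mts}: MCLK {mclk} MHz, UCLK {mode}"

for speed in (5600, 6000, 6400, 8000):
    print(ratio(speed))
```

So faster RAM still "works", it just stops being faster for games once the controller falls out of 1:1.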
>That's why am5 competes with intel even when intel runs higher speed ram.
But when games utilize faster RAM and don't utilize the 3DCache, it can be quite the lead for Intel.
>But when games utilize faster RAM and don't utilize the 3DCache
in every case where a game utilizes faster ram, cache will beat it. cyberpunk and tomb raider are the biggest examples.
True, though those games are increasingly rare and usually ancient in terms of hardware use. See: FFXIV, Starfield.
The memory controllers are so bad that if you get unlucky you can't even run them at 6000 without becoming super unstable.
Proof?
Curiously I’ve seen overclockers have actual breakdowns over latest Intel RAM overclocking, not AMD.
Waste of money when the 14700K exists
are you retarded? how is 14xxx not a waste of money when its on an end of life socket that's getting replaced probably within months from now. what a dumbass you are. did you actually buy that shitty intel refresh? LMAO
nothing has surpassed the i5 2500k yet
Try playing a CPU heavy game like Shartfield on that CPU and watch it cry
I played starfield on a ryzen 5 2600 ultimate max settings with no stutters or issues
lmfao really?
I used to love this chip, i only stopped using it because i couldn't get a good LGA1155 motherboard for a decent price (mine failed after I had it for like 10 years). So I said fuck it and built a ryzen. Still have the 2500k somewhere, first ground-up pc I built when i was 19
i cant even run the last of us pc on it.
Lol. This couldn't even play WoW: Legion in 2016. I remember using that pos in the first raid with the scorpion boss that had a million adds and I would get like 8 FPS. Literally unplayable. Now even my 9600k I upgraded to is barely functioning
5800x3d is the minimum for decent raid performance in WoW nowadays. Somehow the specs keep increasing but the fundamental gameplay hasn't changed since Legion
Mostly addons, tbh. Especially ElvUI. ElvUI is so unoptimized it's fucking nuts. Even if you turn off all the ElvUI modules it uses over double the CPU of other big addons like Plater, Details, and WeakAuras.
this is honestly the dumbest meme you homosexuals keep posting on this board
>nothing has surpassed the i5 2500k
>he doesn't know about HEDT intel platforms
Fucking lmao, that cpu is obsolete as fuck. X99 is indeed the best thing Intel has ever made; you can now get a Xeon 6 core CPU for $5 that is just the same as the 5820K. Intel just didn't clock these CPUs high from the factory because GPUs at the time were already fucking power hogs, but nowadays a 4Ghz OC 5820K uses pretty much the same power as a 13th gen i7, and that is on 22nm vs 10nm.
i upgraded to 8600k from 2500k because at the time it looked like it's the new sandy bridge, but i think the 7800X3D is the new 2500k actually. 6c/6t is a massive bottleneck in 2023, and i fucking hate having to upgrade when it's barely been 6 years; meanwhile i used the 2500k from 2011 to 2018, mainly just to play QC at 144fps
plenty of applications task the CPU more than the GPU, and it's always easy to scale back GPU settings. starfield chugs like hell on an 8600k, cities 2, AC:mirage... basically any open world engine requires at least a 5600x these days to give 60 fps 1% lows, which is the bare minimum for me. and we're not even talking about high framerate gaming. a 5600x is probably not going to cut it if you'd rather play DLSS+medium and target 120-144fps, especially if you have stuff on a 2nd screen.
I currently run a 5900X on a 420mm AIO with a 3090, and all those games still run like fucking shit. It is not the CPUs to blame, its the fucking devs not doing their fucking job right at all.
I left an i5 8400 + RX 6600 PC at my boomers' home, and I played RE4 remake more than fine on it; I had around 78fps average at 1080p high with FSR off. Then I tried Starfield, and good lord, even with FSR it runs like shit.
Pic related just shows why modern vidya is completely fucked, pretty much forcing your ass to spend a fuckton of money to barely be able to play these things, and the worst fucking part is that a lot of them, just like the optimization, are fucking shit.
A FUCKING APPLE using almost 3K triangles, when if it was well done, it would use no more than 100.
Imo, if the game runs like shit, its probably not worth playing in the first place.
>cpu doesn't care about your triangles
>2.1k is not 3k
>that pic was confirmed fake
While Starfield is slop, you are retarded.
Yeah. It's kind of really getting tiresome of developers not putting the time into optimization and then saying "Just upgrade bro" or "Games have to be run with DLSS now, even though performance is still really questionable with DLSS on"
I'm on a 7950X3D and a 3090. The only upgrade for me at this point is yanking out my 3090 for a 4090 and even the 4090 is running into issues that should not exist with some games.
I just played RoboCop, a UE5 game (and UE5 has been all over the place), and I had no issues: it ran at a solid 60 even with the fancy Lumen setting, and that was from a small Polish developer who bothered to do some optimization.
>playing unoptimized AAA cancer that isn't even worth a pirate in the 1st place
There's your problem.
those are high-tri menu assets i believe, and with LODs it wouldn't be an issue regardless. what really sets the creation engine back though is the lack of properly set landscape LODs and culling. iirc all culling in CE is distance based, not frustum or Z-buffer based
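To spell out the culling difference, here's a toy Python sketch (nothing to do with actual Creation Engine internals; it assumes the camera direction vector is normalized and simplifies the frustum to a view cone):

```python
import math

def distance_cull(objects, cam_pos, max_dist):
    """Distance culling: keep anything within a radius, even stuff behind you."""
    visible = []
    for obj in objects:
        d = [obj["pos"][i] - cam_pos[i] for i in range(3)]
        if math.sqrt(sum(c * c for c in d)) <= max_dist:
            visible.append(obj)
    return visible

def frustum_cull(objects, cam_pos, cam_dir, max_dist, fov_deg=90.0):
    """Frustum culling, simplified to a cone: reject anything outside the
    camera's field of view, so objects behind the camera are never drawn."""
    cos_half_fov = math.cos(math.radians(fov_deg) / 2)
    visible = []
    for obj in objects:
        to_obj = [obj["pos"][i] - cam_pos[i] for i in range(3)]
        dist = math.sqrt(sum(c * c for c in to_obj))
        if dist == 0 or dist > max_dist:
            continue
        # cosine of the angle between the view direction and the object
        cos_angle = sum(to_obj[i] * cam_dir[i] for i in range(3)) / dist
        if cos_angle >= cos_half_fov:
            visible.append(obj)
    return visible

cam, facing = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
objs = [{"pos": (0, 0, 50)}, {"pos": (0, 0, -50)}]   # one ahead, one behind
print(len(distance_cull(objs, cam, 100)))            # 2: behind-camera object kept
print(len(frustum_cull(objs, cam, facing, 100)))     # 1: behind-camera object culled
```

With pure distance culling, the object directly behind the camera still eats draw-call and CPU time every frame; the cone test throws it out.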
Genuine case of poorfag cope
12400f replaced it.
It runs hot due to its chiplet design and it's a power hog, relatively speaking. It's overpriced for what it is. There are much better options like the 13600K or above.
>Runs hot
>Relatively a power hog
Huh?
That's in gaming and I am not sure what games they tested in. It's built to run hot. Try it out in CPU intensive workloads. Firefox alone can make temps shoot up out of nowhere
>Ganker - Video Games
>Is the 7800x3d worth it for gaming?
K
K
Mine runs about 42 degrees in idle with a 360 AIO.
>That's in gaming
>the thing OP explicitly asked about on the video game board
It's built to run video games fast. Why use it for some arbitrary workload when there are other CPUs that are better at it.
CPUs in gaming are never 100% utilized
wattage matters, temperature doesn't
>i9 13900k 87.3*C
>ryzen 9 7950x 95.9*C
what in the absolute fuck?
Nobody is buying a Ryzen 9 for gaming
The 7950X has a thicc heat spreader, also big boy heat density
Modern CPUs boost as high as they can until they reach thermal or power limits.
This has been cut back on AMD due to their CPUs blowing up, lol.
Actually they didn't cut back on the thermal side of things, the motherboard manufacturers had to cut back on voltage. It turns out basically nobody was using the limits that AMD imposed.
The 7xxx CPUs can still go up to Tmax and this is expected behavior.
The ryzen 7xxx series is designed to hit Tmax before reaching Vmax.
The cooler you can keep it, the higher it will boost. This is expected behavior.
You can limit the power draw in bios if you want, but depending on which setting you use and what kind of games you're playing you could see a significant decrease in performance.
the "hot intel cpus" thing is bullshit. anybody who's actually used both recent intel and amd cpus will know this. amd chips will use 125 watts while intel chips use 200 watts and they run at the same temps.
So the AMD will use less power but be the same.
Not really convincing me to go intel.
I don't care what you do. It makes absolutely no difference to me what computer parts you buy.
Hello, have you heard of thermodynamics?
7950x owner here. I have never seen it go up that high wtf
Weird. My 13700k pulls 50-100W at 5.4Ghz during gaming and runs at 60-65C.
Under full load I set a power limit of 150W and it downclocks to 5.2Ghz.
You have that backwards, Intel are the power hogs due to ancient 10nm process. 7800 is ridiculously power efficient due to much more recent 5nm process. Intel has been flogging the same 10nm chips for several generations now with higher power limits because their silicon fabs are out of date
Compared to Intel it seems much more efficient. They both run hot and both need to be undervolted. Intel is egregious in this regard
Meanwhile Apple is running a 3nm process and is ahead of both of them in terms of thermals and power profiles, and also abandoned x86 for ARM because of how inefficient it is, with a huge reduction in power usage. Server farms are increasingly experimenting with ARM-based servers to reduce their power and cooling bills
>X86
>ARM
QRD? Which is better?
Just different.
x86 has a lot of legacy instructions and is bloated from almost 50 years of development (it started in 1978), whereas ARM has a much smaller, fixed-length instruction set that's far cheaper to decode, which is a big part of why ARM laptops like Macbooks get 20hr battery life while x86 laptops get 4-8hr
Intel and AMD have recently been exploring RISC-V as a next step after x86; it's similar in spirit to ARM but an open standard, not controlled by SoftBank. For now x86 is still pretty good, but the power and thermal efficiency gains of ARM are hard to ignore, so it probably will be considered legacy at some point.
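To make the decode point concrete, a toy Python sketch (the byte formats are made up and look nothing like real x86 or ARM encodings; it only illustrates why fixed-width instructions are easier to decode in parallel):

```python
def decode_fixed(stream, width=4):
    """ARM-style: every instruction is `width` bytes, so every boundary is
    known up front and a wide decoder can split the work trivially."""
    return [stream[i:i + width] for i in range(0, len(stream), width)]

def decode_variable(stream):
    """x86-style: an instruction's length is only known after inspecting it,
    so finding boundaries is inherently serial.
    Toy rule (NOT real x86): the first byte is the instruction's length."""
    insns, i = [], 0
    while i < len(stream):
        length = stream[i]
        insns.append(stream[i:i + length])
        i += length
    return insns

fixed = bytes(range(12))                                 # three 4-byte "instructions"
variable = bytes([2, 0xAA, 4, 0xBB, 0xCC, 0xDD, 3, 0xEE, 0xFF])
print(decode_fixed(fixed))        # boundaries at 0, 4, 8 - computable independently
print(decode_variable(variable))  # must walk front to back: 2-, 4-, then 3-byte insns
```

Modern x86 cores spend real transistors and power just finding those boundaries, which is part of the efficiency gap people point at.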
ARM is for phones, its architecture is never gonna replace amd64.
Every macbook has been ARM for a few years now and it's the same performance with less power usage, it looks like it's probably the future
ROFL
Apple's M2 dumpsters the 7800x3d for anything that isn't GPU bound
Can't wait to play Cinebench 2024 with the boys
Ergh AMD cucks??? Explain yourself.
The standard answer is 7800x3d has shit multicore performance and is very strictly optimized for gaming with the 3d cache whereas M2 Ultra is a productivity processor and should be compared with 7950x. Still shows ARM can be a beast under the right circumstances
24 cores vs 8 and how many times more expensive? 3-4 times?
Retard
Sure showed me, shiteating homosexual.
>Comparing a 24 core CPU
>To an 8 core CPU and RTX 4090
>In a CPU benchmark
Retard
Please list all the games the M2 beats the 7800x3d.
Should beat it in cpu bound games right? lol
Cinebench does not matter for gaming.
GarageBand
Uh... bros? Are we getting beaten by a phone CPU?!
Why the FUCK are you comparing a gaming CPU versus a productivity CPU?
>Why the FUCK are you comparing a gaming CPU versus a productivity CPU?
But AMD cucks keep comparing their CPU with Inlel's.
>But AMD cucks keep comparing their CPU with Inlel's.
What are you even saying?
Now compare it to a 64 core threadripper in running some backend database..
I mean, yes, obviously it's going to lose in that circumstance, the point of posting the graphic was to demonstrate ARM is capable in a desktop setting. AMD fans get so offended by this comparison.
>AMD fans get so offended by this comparison
It's a retarded comparison, of course people will call you out for making that comparison. It's like comparing a pair of running shoes versus a pair of hiking boots, and then saying that hiking boots are bad because you can't run in them.
Someone above said ARM is only for phones when it is clearly not, that was the only point I was trying to make with the image. Settle down retard
Comparing a 16 (cuck cores don't count) core CPU to an 8 core in a multi-threading benchmark was what was truly offensive
It has better single core performance too https://www.youtube.com/watch?v=5dhuxRF2c_w&ab_channel=optimum
>cpu score
>4090
>wat
Read the image again dumbass, it's a single core test between 7800x3d vs an M2 Ultra
Rarted. The 7800X3D isn't even the best performing single core CPU in its class. It's the cache that makes it perform so well
*In games, I should also say
>M2 Ultra and 7800X3D don’t share pricepoints, configurations or even basic architectures
What’s the point of this retard graph? What’s the point of this retard post?
If you bothered to read the thread before posting this inane drivel, you would realize it was showing ARM is competitive with x86-64 after an earlier anon said ARM was just for phones
But then you have to factor in translation overhead, though hey, it works for the Steam Deck.
Then why aren’t you comparing comparable configurations like a 24 core threadripper to it? This tells nothing.
Because it's a screenshot from a specific video and that comparison isn't available, not responding any further because this has been asked and answered several times already
You’re a stupid fucking monkey as you thought it was something worth reposting in the first place. I hope you’re 12 years old for your sake.
>too illiterate to read the thread before posting
>start hurling insults when that is pointed out
>accuses others of being underage despite behaving like an underage
wew
You are incapable of doing basic PC part comparisons, I would lurk for at least two more years if I were you.
>anything that isn't GPU bound
Damn, if only this wasn't a video games board.
ARM is more efficient but has compatibility issues. x86 is not going to get replaced anytime soon unless you're a normalfag
Just give up. There are reasons to buy intel, but not for a pure gaming build.
It's hotter, less energy efficient and costs more.
Stop reading Userbenchmark. 7800X3D temperatures and power consumption are amazingly low.
If you have a 5800X, 3D or not, you don't really need to upgrade. If you have a board that you can slap a 5800X3D into, you're fine for now. Otherwise, yeah just get a 7800X3D, because if you have to buy a new board you might as well do it right the first time. Shitflinging aside, nothing comes close to competing with it.
Sometimes I still go to Userbenchmark to have a laugh at their increasingly desperate reviews. Their front page reviews on RTX 4000 series and Intel 13000 series namedrop AMD more than the actual product.
I find it crazy my 7800X3D only consumes 2-3 watts in idle task usage (eg. browser surfing). Highly efficient little thing.
this guy got btfo
You're on a video game board you fucking retard
Userbenchmark shill
7800x3d literally can not run hot. The 3D cache requires it to be relatively cool to even work.
Any modern CPU can handle gayming just fine except in some outlier cases. Just stop fretting so much.
Thermal density is the actual reason. Can't get around the physics. Pumping ~200W of electricity into something with the aggregate size of a US quarter is going to be a challenge to cool.
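Rough numbers on that, as a sketch (the die areas are ballpark public figures, treat them as assumptions):

```python
import math

quarter_mm2 = math.pi * (24.26 / 2) ** 2   # US quarter: 24.26 mm diameter
dies = [
    ("Zen 4 CCD @ 90 W", 90, 71),          # one compute chiplet, roughly 71 mm^2
    ("13900K @ 250 W", 250, 257),          # monolithic Raptor Lake, roughly 257 mm^2
]
for name, watts, area in dies:
    print(f"{name}: {watts / area:.2f} W/mm^2, "
          f"{area / quarter_mm2:.0%} the area of a quarter")
```

The chiplet ends up with the higher W/mm² even at a much lower total wattage, which is why the X3D parts are thermally fussy despite modest power draw.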
>200W
7800x3d is rated for 120W, but I've yet to see it go above 60W here.
It actually consumes almost 200W if permitted and taxed enough. The greater Intel SKUs drink almost 300W.
On normal mainstream and gayming loads (1-2 heavy threads and a few light ones in background), they will never drink that much power.
I threw a Blender render at it while playing Pavlov, maxed out CPU, and it wasn't pushing beyond 70 watt here with PBO on its 90C mode.
>300W
13700k is rated for 125W
I've never seen it hit 125W during gaming; it sits between 50-100W.
My 7800X3D's MAX wattage in the entire time my PC was on was 55W.
It idles at way lower than that too.
It's still something you need to account for when sizing a PSU. As games develop, you'll likely see the CPU utilized more, drawing more power. Either way, the AMD equivalents draw even less under gaming loads.
It doesn't run hot due to chiplets.
It has "high" idle power consumption of 30W due to chiplets. It runs hot under small loads (max power draw is 90W) because the 3D V-Cache adds extra active silicon, acting as a warming blanket for the compute chip.
It also runs at whatever temp (max 90C) you let it turbo to in BIOS. Want it capped at 70C? That's just a setting. 80C? 90C? Yeah that's allowed too. So it doesn't "run hot" in any sense of the word, really. It boosts until it's at 90C (or your temp limit) or 5.05GHz. That's the algorithm.
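A minimal sketch of that loop (the thermal model and all the numbers are invented; real boost algorithms also track current, socket power, and per-core limits):

```python
def boost_clock(temp_limit_c=90, fmax_mhz=5050, ambient_c=35, step_mhz=50,
                degrees_per_ghz=11.0):
    """Toy boost: raise the clock until either the frequency ceiling or the
    temperature limit is hit, whichever comes first."""
    f = 3000  # base clock, MHz
    while f + step_mhz <= fmax_mhz:
        predicted_temp = ambient_c + (f + step_mhz) / 1000 * degrees_per_ghz
        if predicted_temp > temp_limit_c:
            break  # thermally limited: better cooling would unlock more boost
        f += step_mhz
    return f

print(boost_clock())                 # stock 90C limit -> ~5000 MHz, temp limited
print(boost_clock(temp_limit_c=70))  # 70C cap set in BIOS -> lower sustained clock
print(boost_clock(ambient_c=25))     # better cooling -> hits the 5050 MHz ceiling
```

Same algorithm, three different outcomes, which is why "runs at 90C" is a setting, not a defect.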
No. I have a Ryzen 7 5700 and nothing comes close to even stressing it. It's less than 200
For MMOs or anything that's CPU bound it's a godsend
Why not 14th gen?
I mean if you want to pay $200 more for a worse cpu that will swallow 300 watts just to still underperform then go ahead
>is the most kino cpu in recent years worth it for gaming
gee, I wonder
THANKS but I'm sticking with the 5800X3D until AM6
Pretty much my plan until AM5 becomes worth it.
This is such a shit period for hardware.
Nah, both platforms provide excellent performance and features for the vast majority of non-professional loads. You can't go wrong by going into either camp.
The real problem is that both AMD and Intel are pushing their silicon to the bleeding edge from the factory via boosting, trying to best each other in some stupid epenis benchmark. That extra 5-10% performance for 100W more is really not worth it. Both chips are insanely efficient if you actually dial them down a notch or two; the "performance loss" is minimal at best and they still easily outclass their predecessors.
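To put numbers on "dial them down a notch": dynamic power scales roughly with frequency times voltage squared, and near the top of the V/F curve voltage rises roughly with frequency, so power goes roughly with f³ there. Crude model, real silicon varies:

```python
def relative_power(freq_ratio):
    """P ~ f * V^2 with V ~ f near the top of the curve => P ~ f^3 (rough)."""
    return freq_ratio ** 3

for cut in (0.05, 0.10, 0.15):
    f = 1.0 - cut
    print(f"-{cut:.0%} clocks -> {f:.0%} performance at ~{relative_power(f):.0%} power")
```

Under that assumption, giving up 5% of the clocks saves roughly 14% of the power, and 10% of the clocks saves roughly 27%, which is the whole "insanely efficient one notch down" argument.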
Completely reasonable. The 7800X3D is a huge step up for people who didn't get a 5800X3D though.
In current games a 7600 with an rx 7800xt will perform better than a 7800x3D and a rx 7600.
But if you want to buy a long term cpu that will be great for years and you can afford a 7900xtx or a 4090 then get the 7800x3d.
It's literally only for gaming. For daily use it's average in comparison to other CPUs in its class.
For 60fps all you need is the 2600
For 144fps probably 5600
Imagine being too poor to buy ARM Macbook Pro.
Should I bother with AM5 if I'm not playing the latest unoptimized slop? I hear bad things about stability and boot times.
>I hear bad things about stability and boot times.
That's just Intel shill FUD.
Avoid at all costs. I'm on my second set of dead motherboard and CPU with a 7950x3D. Even when it worked, it had problems with stuttering and stability. I hate it and wish I could get a refund.
>I'd overclock like a n00b on something that is already factory-overclock near breaking via boosting
>I don't know how to thread schedule on my ancient OS that has a modern coat on top.
Nah, too expensive.
so you're saying stick with intel?
microcenter bros
first pc build
im gonna go for it
Just make sure that motherboard has at least 2 nvme slots.
why is that
Well, one you're going to fill with the fastest NVMe SSD you can afford, and the other can be used as fast storage. Generally only one of those slots runs at the full speed.
For storage with the fastest read/write speeds on the market
>slots
is this what im looking for on the mobo spec sheet?
Like this so it says actual speeds.
okay i couldnt find that on the microcenter website so i found this on the manufacturer's website
Yeah I wouldn't. You want at least one PCIe 5.0 slot so you can take advantage of NVMe M.2 top speeds. The drivers aren't even that costly.
>drivers
I meant drives. For example I have an old motherboard that only supports PCIe 3.0. I have a 4.0 Samsung M.2 drive, so I lost some speed. Still works, but not as fast. So you would be doing the same with a 5.0 drive. But in my case 3.0 was top of the line at the time. So you would be buying old stock.
probably why it's in the bundle i guess
¯\_(ツ)_/¯
justifiable for the price of cpu+mobo+ram? can just replace later?
It is just old tech. PCIe 5.0 is the latest. PCIe 4.0 is 4 years old. I wouldn't do it, but hey there are still not as many pcie 5.0 m.2 drives out there as 4.0. Your choice.
I wouldn't stress too much about 4.0 vs 5.0.
Yes 5.0 is newer and faster, but currently 4x as expensive (the drives themselves, not counting the motherboard you'd need to buy to support it). The theoretical read/write maximums are nearly twice as fast, but you won't see any significant improvement outside of copying files or rendering.
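For reference, the theoretical numbers behind the gen talk, for a standard x4 NVMe link (per-lane figures are the spec rates after encoding overhead):

```python
# Usable bandwidth per lane in GB/s: PCIe 3.0 is 8 GT/s with 128b/130b
# encoding (~0.985 GB/s); 4.0 and 5.0 each double the rate.
per_lane_gbps = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

for gen, lane in per_lane_gbps.items():
    print(f"PCIe {gen} x4: ~{lane * 4:.1f} GB/s theoretical ceiling")
```

So roughly 3.9, 7.9, and 15.8 GB/s ceilings; real drives sit below those, and game loading rarely comes near saturating even 3.0 x4.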
Seems like just about all AM5 boards support PCIe 5.0 NVMe drives, which is what matters given this thread is about the 7800X3D, an AM5 chip.
pic is loading times btw, took 5000 hours of Google searching to find this
If the game supports Direct Storage, those load times accelerate dramatically on PCI-E 5.0, too bad not many games do, only Ratchet and Clank, Hunt: Showdown and Forspoken do currently. Diablo 4 support is coming soon. It's really niche on PC unfortunately, while every PS5 and XBX game supports it
>Ratchet and Clank
Caps out at PCI-e 3.0 speeds.
oh really, wow what a bust of a feature then
3.0 is unironically more than enough and basically indistinguishable from a PS5 unless you do side-by-side comparisons as long as you have a good CPU.
hmm. well the reason i was considering it was because i heard the 7800x3d was decidedly more for gayming and worse in productivity, and i wanted something more balanced for the different applications i use.
Keep in mind people paying for all these expensive systems are likely putting 1000 hours into Factorio.
or just them as basketweaving board shitposting stations
In that case yes, the X3D are not productivity chips, and Intel is generally more balanced. The "balanced" AMD version would be the 7700/X.
But check benchmarks and forum posts of the programs you use.
7700x and 7800x3d are like the same for productivity... The only AMD CPUs worth buying are 7800x3d for gaming or 7900X for productivity, they are around the same price.
You're right, I remembered 7700 being faster
>7700x and 7800x3d are the same for productivity
Not necessarily.
>and i wanted something more balanced for the different applications i used.
What are you using on a regular basis where you'll actually notice the difference between the 16 threads on a 7800X3D, vs the 24 or 32 threads of its more expensive options/competitors?
You can get a 7900X for about the same price of $400 now. Though you missed when Newegg was offering a deal where it also came with 32gb of DDR5 and slopfield a little bit ago.
Left is better, not only because of the better CPU, but with Intel you'd want much faster memory.
5600x till am6
simple as
What games require more than a cheap 5600
If it's only for gaming then yes. For everything else these get outperformed by even cheaper CPUs.
>9900K still delivers 100+ fps in most modern games
I’ll get the 9900X3D eventually. It’s the only logical upgrade.
I don't need more than the 5600x.
It's the best CPU you can buy for gaming if you are building a new high end computer, but you don't really need anything above a 5600.
It just works.
Why does AMD get so much hate here?
The drivers just got thousands of people banned from Counterstrike 2 because they shipped a literal hack as part of the official drivers to make FSR3 work. Gee I wonder why people dislike AMD graphics cards
The anti-AMD homosexualry was going on long before that.
But keep bringing up something that's been dealt with.
AMDcucks keep getting btfo'd
>literal hack
Didn’t know my Nvidia Reflex is a literal hack but ok lmao
Reflex has to be manually implemented by the devs in their game binary. AMD Anti-Lag+ was implemented at the driver level by intercepting draw calls, which is the same way hacks work. They also didn't tell anyone they were adding it to the driver before releasing it, so Valve started banning people who looked like they were hacking but had just made the unfortunate decision of buying an AMD card
https://www.pcworld.com/article/2106874/amd-anti-lag-is-causing-player-bans-in-counter-strike-2-apex-legends.html
they were correct though, anyone who bought amd card needs to get banned
Why are you posting GPUs on a CPU thread?
AMD's entire platform is shit
constant restarts for no apparent reason and just crashes randomly
Intel though, it just fucking works
I sadly agree. I bought a 6650xt and for some reason it kept crashing until I had to tune it and undervolt, and I have no idea why because I have more than enough power.
as long as it's not Intel you're good
Nah, not all games benefit from the 3DCache and you've got the typical AMD jank.
>you've got the typical AMD jank.
>amd jank
>on a cpu
the fuck are you on about?
>audio crackling
>Windows 11 bugs
>USB disconnect bugs
>CPU's blowing up at stock settings
>slower RAM and potential XMP issues
>have to constantly update UEFI and chipset drivers
Retarded intel shill.
Ahh yes, don't inform the user of potential issues. This is the AMD way.
I'll give you the CPU's blowing up one but I haven't seen fuck all regarding the rest, will have to read up
Everything he says is bullshit, and even if some of it is true, those are problems tied to the motherboard, not the CPU.
Probably bought an ASRock board if he actually encountered those issues himself.
The RAM thing can be true, mostly because RAM kits are now specifically XMP or EXPO "certified". If you have AMD, you obviously want EXPO; if you have Intel, you want XMP.
Price / performance and not needing to tune CPU / RAM etc. is a huge selling point. If you enjoy tweaking, Intel can come out ahead in many scenarios + don't have to worry if a game supports the extra cache or not.
Spotted the intel shill.
>your brain on AMD
I literally just recommended the 7800X3D you dumb nagger.
>7950
retard
Why? It's a better binned chiplet.
Because in the end it's still not the 7800X3D you complete retardoid
>average FPS
Still more relevant than
>7950X3D
Lmao, loser
>has to cherrypick and lie
The state of AMD, lmao.
Maybe you should post the correct CPU next time lol
Never @ me again
>y-you see, average FPS doesn't matter because... um... it just doesn't, OKAY?!
lmao
Compare game by game and not an aggregate. Certain games play better on either Intel or AMD so it's disingenuous to combine all the numbers.
Ah, is that why you had trouble posting the right CPU?
you're proving him right?
Nope. It's just a 7950X3D with the other CCD parked. It has a higher clockspeed than the 7800X3D's chiplet.
>Nope
>>has to cherrypick and lie
Dang, anon. You're pretty bad at this
Actual retard. This is supporting AMD. It would be unfair if I compared the 13900KS to a stock 7800X3D, yet it's not okay with the better binned chiplet?
Lol
Lmao even
The mental gymnastics of AMD homosexuals.
Yeah ok, retardoid. Maybe post the right CPU next time, yeah? Anon will claim the 7800X3D to be slower yet not even post the correct CPU lmao. You've got two brain cells per hemisphere
>hardware unboxed
>no overclocks
>no RAM speeds
Sorry big boy, opinions from mongoloids incapable of posting the right CPU are disregarded
>a higher binned chiplet is bad
ok chud. just for you i'll post the lesser version.
Took you long enough lmao
AMD shills. i hope they pay you well for being this scummy
Hey settle down, okay?
actually as bad as the linux evangelists. constant misleading, gaslighting and word gymnastics to wriggle out of any situation
You don't actually get riled up over anonymous internet posts, do you, anon? That's kinda sad
why you pulling us linux users into this, the fuck did we do to you
INTEL SHILL MAD
INTEL SHILL MAD
Not even an Intel shill you homosexual. This is my opinion and I recommended the 7800X3D...
>average FPS
Refer to
>720p
What are you poor?
Anon do you seriously not realize why gaming CPU benchmarks are done at the lowest settings possible?
I have a b350-f motherboard, and a r7 3800x CPU, what is the best upgrade under $400 for me right now bros?
>productivity
like what?
20 tabs of tutorials, a game you're AFK in, and some bloatware program + windows, to produce amateur only-mom-would-like art
>tfw fell for intel estrogen cores
>2 refreshes already
>don't come even close to ryzen
I made a big fucking mistake. If only amd gpus were as good as their cpus
AMD really needs to pick up the pace with their GPUs, just so Nvidia can't gouge people anymore and we can have a normal ass market.
AMD's GPUs are completely fine, and a better buy than Nvidia right now. But unless AMD humiliates Nvidia like they destroyed Intel, people aren't going to switch.
People are used to buying Nvidia, they will keep buying Nvidia, 30% better price/performance isn't enough to persuade them to switch.
If Ryzen was just "well, it's the same performance but a bit cheaper," everyone would've just bought Intel anyway. But when Zen 1 came out, the 8C/16T 1700 was priced the same as the 4C/8T 7700K from Intel.
The $300 Ryzen 1700 + random ass $60 B350 motherboard performed the same as the $1000 6900K + $300 motherboard from Intel. It disrupted the market completely.
>$1089
>$999
>$612
R*dditors better appreciate AMD, otherwise instead of buying funkopops you'd be paying $1k for a mediocre CPU.
That was Cinebench. For gaming, Zen started picking up steam with Zen 2 and completely eclipsed Intel with Zen 3.
ayymd does not care, outside a few linux supercomputers their compute offerings are a joke. And they never invest real money in it despite being rich enough to buy Xilinx.
they put in lazy minimal effort to satisfy a few corporate buddies and that's it. same for gaming.
they are happy to price their shit 10% cheaper than Nvidia and call it a day.
Pure rasterized performance wise amd gpus are fine
t. $575 6950xt user
to upgrade from my 5600x to this i'd need to upgrade my motherboard and i am way too lazy to reinstall everything
How shitty are you at putting PCs together?
Takes me like an hour or two hours max to put machines together
Literally the only parts that take some time to install are the fan on the CPU, the PSU, and connecting all the bullshit from the case to your mainboard
>ACK
BROOOO INTEL WON AT SHIT THAT DOESN'T MATTER AYMD LOST LOST LOST
>i9-13900k max OC
Imagine the heat
120-150W at 6Ghz
The 7800x3d costs twice as much as the 13600k where I live, so I went with Intel for my new rig.
>is the best gaming CPU alright to use?
What the actual fuck is wrong with this board?
Listen to kpop!
Fuck no
God I asian women didn't have such unappealing flat asses and sticc bodies. They do absolutely nothing for me.
>ACK
Is this a real 7800X3D this time, or is this another one of your jests from the same source as last time?
>my bar graphs are real, your bar graphs are fake
you people are so tedious
I'm not sure about you anon, but masquerading a 7950X3D as a 7800X3D sounds pretty fake to me lol. I'm not sure why you're struggling so hard to post the right CPU lmao
the guy who "masqueraded" a 7950X3D as a 7800X3D only did so because AMD deliberately delayed the 7800X3D to get people to waste their money. of course AMD fanboys think it was to make them look worse.
Why would anyone do this?
Why go on the internet and lie?
It's still fake, then? Why's it so hard for you to post a genuine 7800X3D benchmark? Wtf bro, how long has it been?
I haven't posted any benchmarks because posting a png of a bar graph doesn't prove anything. But you're making accusations of malice simply because you aren't happy with the numbers he put out.
Or maybe because he faked his numbers.
whatever helps you sleep at night
Numbers of wha? A 7950X3D? Lel
People "called him out" on it, so he bought a 7800X3D, and it actually boosted 100-200MHz lower than the 7950X3D's cache CCD, hence why the latter is binned better.
>he was lying to you for your own good
Absolutely israeli, post nose
he wasn't lying though. he said what he was doing.
He was lying through his fucking teeth. Anyone with a modicum of gray matter understands that the 7950x3d and a 7800x3d are completely different and will behave differently even if you disable a chiplet.
he got a 7950X3D because the 7800X3D was intentionally delayed, and disabled the non-3D ccd to create a proxy for the 7800X3D, because it wasn't available at the time. then when the 7800X3D came out, he tested that too. you're inventing a narrative because he said your favorite product isn't the fastest.
There's no arguing with him. He's like the Linux evangelists on this board. They lie and mislead their way out of everything you bring up. He's probably a Linux user.
What the fuck are you on about you fucking schizo. What's wrong with linux you fucking pajeet.
>want to play XYZ game
>nooooo not that game why would you want to play that imagine being a homosexual
That is why people laugh at Linux users.
Same vibe in this thread regarding AMD. Dismissive and misleading.
It's too bad the strawmen in your head talk that way. I completely understand that it's not ideal that some games don't work, and I won't pretend that 'lmao just play something else' is a valid response, but that's generally not the OS's fault; the few games I want to play (R6S and EFT primarily) are unplayable due to anticheat. And for all the other gaming I do personally it's been much smoother than Windows.
>Linux bad
nuh uh Linux GOOD
Nu uh Linux absolutely bad
how so? speak
Linux bad, simple as
Install and you'll see
>install
I have been using LINOOX for the past 5 months. Jumped ship from windows with no experience. LINUX G O O D
Why Linux good? Linux bad.
LINUX FEEL GOOD
The main reason I switched from windows is because every time I ran a program that wasn't an Official Microsoft (TM) program, it would do this thing where it had to load for nearly a minute doing god knows what, probably scanning and reporting my eroge to microsoft. Bill Gates and his cadre don't need to know my fetishes. Now I run linux and I'm F R E E like my software, my fetishes are between me and my ISP.
LINUX G O O D
RESPONSIVE
FAST
TASTES GOOD
FEELS GOOD
WINDOWS RUNS LIKE SHIT
>god knows what
>antivirus scan
That's funny because I ran Windows Enterprise with defender disabled.
Can I play Battlefield 2042 on Linux?
>Battlefield 2042
You used to be able to when they used EAC for their anticheat, but now they use a proprietary kernel level anticheat that's incompatible.
>What's wrong with linux
i think he meant Linus, but both the OS and the youtuber are bad.
Then maybe he should have waited for the real thing lmao, stupid cunt
He tested both.
The 7950X3D was the faster part because it has a better binned chiplet.
It actually supports your shilling, retard.
Don't care, loser. Maybe you should spend more time posting genuine benchmarks over performing mental gymnastics, yeah?
That is a genuine benchmark.
That mysteriously can't be recreated, and with a dubious source.
all of your sources are from people who literally do ad spots for pc hardware. I guess that makes them seem trustworthy to you.
My usual source for hw stuff is GN, because they actually publish their testing methodology.
Your source is a mystery man doing whatever he wants, however he wants, including lying and simulating hardware he doesn't have. The original source is also conveniently behind a pay wall so it cannot be verified.
>My usual source for hw stuff is GN
I know it is. That's why I don't care what you think.
>GN bad
Explain
Please quote where in my post I mentioned hardware unboxed.
You just can't stop lying, can you?
Somebody was posting Hardware Unboxed but if it wasn't you then fine. Still proves the point that mainstream tech guys are paid off.
Stop trying to talk your way out of this one.
Post proof of tech Jesus being a shill.
We're waiting schizo.
>guy who faked 7800x3d results good 🙂
>GN bad >:(
cope
yo chill linus you wouldn't want to offend your employees
>Explain
doesn't play games, uses an amd fx chip for his personal system because that's how little he cares about PC performance.
spends half of every review on shit like blender because he doesn't care about games
tests retarded irrelevant games nobody cares about like strange brigade because "it's well programmed" which is useless for people who actually play games
shills pc hardware while claiming to be independent somehow
threw buildzoid under the bus for calling the gn audience "normies", which you are
threw linus under the bus because he's a jealous homosexual
annoying whiny lispy voice
ugly
stupid hair
the list could go on
Hello Linus, how's the wife and her boyfriend?
Also I know you're shitposting but Steve does like games, he doesn't like AAA slop. There are videos where he briefly talks about games he's looking forward to, usually from indie devs.
>Your source is a mystery man doing whatever he wants, however he wants, including lying and simulating hardware he doesn't have. The original source is also conveniently behind a pay wall so it cannot be verified.
He livestreams everything.
It's behind a paywall because buying lots of hardware gets expensive. He's not bought and paid for by companies.
People steal his overclock settings (particularly RAM), so at least he gets paid for it.
But no, you would rather trust Hardware Unboxed, who shilled his affiliate links for dogshit 4800MHz DDR5 RAM rather than high-end DDR4 RAM at the same price.
nta, but
>steal his overclock settings
If you're gonna publish benchmarks, shouldn't you make all your system settings public? And as far as paywalling your videos, that seems odd in the sense that I can't imagine someone actually paying for benchmark videos like that. Wouldn't anyone who'd pay for his videos just support him without needing the incentive of paywalled videos? Just seems odd to me
>all your settings public
In the final video the basic timings and speeds for the RAM are visible, but the secondary timings are not; those are behind the paywall. You can still access it.
Why is shit like this acceptable then?
Huge double standard because you're an AMD fanhomosexual.
>I can't imagine someone who would actually pay for benchmark videos like that
Because you're not his target audience. You're somebody that watches Gamers Nexus or Hardware Unboxed and only cares about plug and play. You aren't overclocking, undervolting, whatever.
> Like would anyone who would pay for his videos probably just support him without needing the incentive of paywalled videos
The end results are free on YouTube and the testing methodology but if you want to see the raw live stream then it's behind a paywall.
>Huge double standard because you're an AMD fanhomosexual.
that was my first time responding to you
>You aren't overclocking, undervolting, whatever.
I literally am, I usually watch buildzoid for that content.
Why are you such bitch? Did AMD fuck your mom or something?
>Why are you such bitch? Did AMD fuck your mom or something?
>my first post
That is fine, welcome.
Earlier in the thread I actually recommended AMD unless you're fine with tweaking, or your favourite game doesn't benefit from the extra cache.
I see fake, simulated results on there, though?
happy?
Is that a real 7800X3D this time?
The lower FPS, yeah.
Okay good, now post video source
It's behind a paywall
Yeah and my girlfriend just goes to another school lmao
What is that meant to mean?
My gf in highschool went to another school.
Nice christian girl, blonde hair and 6ft tall. Huge legs from dancing.
I'm not going to do that because A, you will just say it's all lies anyway, and B, I really don't care what random Ganker plebs think about PC hardware.
>source: 10 minutes in excel and 1 in paint
Also what the fuck testing methodology is that?
why should he have waited? because you people would twist the story to make it seem like deliberate deception? yeah, he probably should have anticipated that AMD fans are psychotic weirdos.
>Why should he have waited for the real thing?
Gee, anon
So he lied. Got it.
>source: 10 minutes in excel
Yes, it's literally the best you can get for gaming.
Just be careful to not blow it up.
>7800X3D good
Ok but what is a good am5 mobo
I went with the NZXT N7 B650e.
Good VRMs, nice design, thermal pads for the NVMe slots, etc.
Nice upgrade path down the line when RAM or CPUs get better, and it has a gen 5 NVMe slot and GPU slot, so the mobo should last a while I think.
Worth the price.
>Pros
Cost effective and can be better in certain games
No need to tune it outside of potential EXPO/XMP issues.
Lower power consumption**
>Cons
AMD jank
Not all games benefit from the extra cache
Fewer cores. If you're strictly gaming then it doesn't really matter, but it can be handy to put things on the eCores.
12700KF regularly goes on sale + high end DDR4 RAM is much cheaper so cost / performance can be outdone.
Limited to 6000MT/s DDR5
Pretty neck and neck I'd say. I prefer things to just work, and not all my games benefit from the extra cache.
**You can park eCores, or there's the 13700k/14700k, which have fewer eCores. Vendors also push Intel CPUs to their limits to get an extra 100-200MHz to look better than their competitors. They stopped doing this on AMD due to CPUs blowing up. Personally I have a 13700k at 5.4GHz and it runs between 50-110W depending on the game. 150W power limit while rendering and it drops to 5.1GHz. Not a huge deal since GPUs use 350-600W now.
I haven't played at 1080p in like 5 years so a 3D chip wouldn't help me at all.
Happy for AMD to have a competitive product but unfortunately I don't think it matters that much in real use cases.
>mfw I share a board with those morons
7800X3D dabs on everything at 1080p 1440p and 4k.
>7800X3D dabs on everything at 1080p
True
>1440p and 4k
Not true, it shits itself at higher resolutions, and at that point the tradeoff for general applications isn't worth the minuscule fps difference.
LOL
The facts don't lie retard.
>average FPS
>NOOOOOOOOOO that doesn't count!
>techpowerup relative performance
the most useless information on the internet
>relative performance
Worthless
Explain why you think changing the resolution would have an impact on CPU performance?
Heres the 4k one too
>shits itself at higher resolutions
That's not how CPU scaling works. At higher resolutions the gap shrinks due to lower average framerates and GPU bottlenecks. Once GPU limited, it doesn't matter if you run a 9600K or a 7800X3D.
That makes sense. Shits itself wasn't the right way to phrase it.
There are so few games where it has a clear lead, like MSFS 2020 where it gets 25-30% higher fps. For everything else you're comparing sub-10% variance.
Not worth changing my entire mobo/ram/cpu setup for.
Motherfuck, been trying to renovate my old desktop PC but now the goddamn thing won't even power on. This thing's ancient too so any replacement motherboard is long out of stock.
b450 fag here do i get 5600x or 5800x3d and sleep for the next 6 years?
People overspend on the CPU so fucking much. This thread is probably just marketers though. You just keep spamming names of products to raise interest and thus be able to sell it for more.
5600x, save the $200 and get a barbell or something.
there's no "overspending on cpu" homosexual retard
until it starts bottlenecking shit or you have to replace the entire pc because the industry pushed out some new standard, it will just work, and the better the cpu the longer that takes
Most people are still under the impression that we're in Intel's decade of nothing happening with CPUs. Which of course hasn't been a thing since AM4.
They're the people spending 1600 on a 4090 and then combining it with a 5600G.
tech illiterate homosexuals are the ones buying intel or pretending cpu performance doesn't matter yes
You would want the 5800X3D if you were planning a 6 year build. 8 Cores + the extra cache.
5800X3D is big sexo. Remember you need to update BIOS.
>b450
Just get the 5500 instead, your board doesn't have PCIe gen 4, so why pay extra for something you won't use; but if it's just a $10 difference, just get the 5600x. Performance wise, the 5500 and 5600x are not much different; you would see better 1% lows with the 5600x if you had a B550, thanks to PCIe 4.0, but that's pretty much it.
Imo, I would just suggest getting an A620 + 7500F, then find a decent DDR5 kit; you get pretty much the same performance as a 5800X3D, but also the new upgrade path when new AM5 CPUs come out. ASRock A620 boards are, curiously, the best ones made, while the MSI ones are fucking shit.
7000 Ryzen? Enjoy extreme thermal degradation of both CPU and MB in a couple of months. A lot of guys showed the results - bulging and delaminating MBs, the CPU die peeling off the substrate, some people even reported solder melting under the CPU heat spreader.
That's just FUD from intel shills.
Just google "ryzen 7000 cpu failure" and look at the images. I don't know the root cause of this - QC issue, MB problems, whatever - it's just sad.
That was entirely on aSUS.
Clicked on couple of articles - TomsHardware, TechPowerUp etc - they either dont mention MB brands or state that this shit is present on MBs of all big brands - asus, gb, msi.
Friendly reminder: Don't fall for the goonix meme.
>OOM so shit it WILL kill ANY heavy process that isn't the one actually eating all your virtual memory and shitting itself
>5-10% (maybe even more in some cases) performance lost on AAA due to meme syscall translation crap
>It won't JUST WORK out of the box (Fuck off with your gay little caveats, i want the full game)
>No one actually develops NATIVE LINUX GAMES (The ones who did shat the bed because no one bought them)
>Zero multiplayer/online coop games of substantial worth (FOTM indietrash need not to apply)
>The few profitable corporations whose products rely solely on poonix don't respect their user's freedom (nor they ever will).
You're better off debloating win11 with NTLite. Don't bother with Microsoft's and Gabe's little penguin slut.
>ACK
>Mobo fucked up
>I-It's Aymd's fault!!
Ohh so now it's okay when the motherboard does it but when Intel is overvolted and pulls 300W it's not their fault. Double standard once again.
Bulged up CPU fucking up MBs pins - mobo's fault. Holy shit, what do you have in your AIO, copium?
Yes thats what happens when a bad bios update provides bad voltages.
Maybe go back to high school?
Which happened to hundreds of boards from different manufacturers, and their owners ALL installed the new faulty BIOSes? Yeah, sure, pal, totally plausible.
Provide proof it happened to hundreds of boards, shitposter.
The only ones we have are when people used ASUS boards.
I'll wait, but I know you'll call me a shill either way like the fag you are.
https://www.techpowerup.com/307808/amd-releases-first-statement-on-ryzen-7000x3d-series-burn-out-issues
"We are aware of a limited number of reports online..." - corporate speak for "fuckton"
"The AMD statement follows several of its motherboard partners..." - that means "we panicked and pushed ALL big MB makers to quickly patch this shit up to avert a class action lawsuit".
I never called you a shill, I dont care what brand of CPU you have, I have both. I just have very little tolerance to lies and deceit from both AMD and Intel. You on the other hand called me names, which shows your quality as a debater
Ok, shill
Okay, brainless brand slave
Yes, multiple manufacturers disregarded the voltage limit in order to get bigger numbers.
This has all been covered already by multiple independent news sources.
>MBs overvolting cpus to the point of failure
>not MBs fault
It literally was ASUS's fault for providing shitty voltages in a BIOS update.
This never happened with any other mobo maker.
Decent B650E or cheapest X670E motherboard? (both $300ish)
How about buying the one that has the features and connections you need, fuckface?
How about you learn to ignore a post if you know absolutely nothing about the matter? Newfag
I have a B650E Taichi, it's bretty gud. Has BCLK overclocking if you're interested in that. And decent connectivity. I used to have the ASUS X670E TUF; that thing was a massive pile of shit. Terrible coil whine and long ass startup times.
If you need to ask get the cheapest one.
Do i need a cooler for 5700X if i dont plan on overclocking it?
No, AMD's stock coolers are decent enough. Just make sure you get a case with airflow and not a glass box with no holes.
You don't need to, but they can be noisy under load. I'd spend $20 on a somewhat decent cooler if I were you.
If you mean without ANY cooler, then no. That's not how it works. It overheats in about two seconds.
Doesn't come with a cooler.
For some reason I thought AMD CPUs with an X also came with a cooler. Might as well default to a 5600X, which does have one, and save an extra 30
Weird, the 5600 I got for my gf came with a cooler. Is this something specific with the X series?
Yes
"X" models don't come with coolers, yeah. They're also slightly faster on paper... but in practice the difference is less than 5%.
X or non-X, it's always better to get whatever is cheaper.
maybe now, i remember buying some 3-series "X" some years ago and it did come with a cooler.
Oh yeah, you're right. So let me rephrase.
Usually, AMD sells two different versions of the same CPU: a standard cheaper version, and a better binned but more expensive version. Usually, the standard version is called Ryzen [Number] and the better binned version is called Ryzen [Number]X.
And it's almost never worth paying more for the better binned version because it performs almost the same as the standard version. If the better binned version isn't THAT much more expensive, it may be worth considering. Otherwise, the standard version is completely fine.
But as you pointed out, their naming scheme isn't always consistent. For example, in the very first generation they sold a 1700, a 1700X and an 1800X. These were all the same CPU and performed roughly the same.
With Zen 3, they launched the overpriced 5800X. Later they launched the much cheaper 5700X. But these were also the same CPU.
My 7700x didnt come with a cooler and my experience is more recent.
My 7600 came with a cooler, if I remember right the X ones didn't have them. The high end CPUs of course don't come with coolers since they need much better cooling than stock.
Then the anon saying that x versions do not come with coolers is correct.
This is wrong. I got the 3700X like 3-4 years ago and it came with an RGB cooler.
The 5000 and 7000 don't come with a cooler if they're X, except I think the 5600X.
5600G has a cooler I think
You can run even a 7800X3D with the stock cooler without throttling.
You're not going to sustain the 5GHz boost on a basic cooler (that one comes with no stock cooler). I threw my NH-D15 from my previous build at it just to be safe, given how quickly that CPU spikes temps under sudden load.
>it's okay when the motherboard overvolts Intel to draw 300W to look better than the other motherboard competitors, but when AMD does it, it's bad
>no of course we won't re-test power draw when using Intel's parameters or undervolting
This is why I hate AMD fanboys so much. Always double standards and dismissive of everything.
>nooo it's okay when it happens to Intel but bad when it happens to AMD
Why are AMD fanboys like this?
One was fixed with a bios update
The other was not because it happened cpu side
Why are intel drones tech illiterate?
>fixed
By ASUS not overvolting the CPU anymore, dropping power consumption.
Yet it's still perfectly okay to compare that to Intel while vendors are overvolting Intel.
ok what do you guys think about this? looking to finally build a new computer after like 10 years https://pcpartpicker.com/list/MqL4MV
>b-b-b-but productivity!
The 7800X3D is going to be stomping vastly more expensive CPUs in gaming for two generations to come, like the 5800X3D before it. Unless it's work related, I can wait another 40s on a Blender render... or build a second PC specifically for work tasks.
>using stock coolers anywhere outside office tier pcs
My 13700k draws 50-100W during gaming at 5.4Ghz. Why does everyone keep saying it draws 300W? Did they confuse it with the GPU? I could probably undervolt it if I wanted to.
What games, and is the framerate locked.
If it's not CPU intensive... it's not CPU intensive.
CS2 was 50W at 400 fps default cap. Uncapped was 65W.
Battlefield 2042 is the pinnacle of CPU bound and it was using ~100W.
wow you're stupid
>no explanation
Opinion discarded.
Some people are so far gone they don't get an explanation. Enjoy the rest of your life.
The store page will also say WOF (without fan) if there's no fan.
What does a larger cache mean for gaming on a CPU? Just more instructions it can chew away at before having to make a call to memory to send/receive more instructions?
CPU wants X. It checks L1, then L2, then L3 (the outermost cache). If X isn't in any of them - a cache miss - the request has to go all the way out to main memory, which costs hundreds of cycles instead of a handful.
Your game will now stutter and/or the fps might take a dip.
More L3 means more of the game's working set stays on-chip before that happens.
But also yes, as you said, a faster, more constant chew.
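The usual way to quantify this is average memory access time: AMAT = hit_time + miss_rate × miss_penalty. A sketch with invented but plausible cycle counts:

```python
def amat_cycles(l3_hit=45, dram=300, l3_miss_rate=0.10):
    """Average cost, in core cycles, of a request that reached L3.
    All numbers are illustrative, not measured."""
    return l3_hit + l3_miss_rate * dram

print(amat_cycles(l3_miss_rate=0.10))  # smaller L3: 45 + 0.10*300 = 75 cycles
print(amat_cycles(l3_miss_rate=0.03))  # 3D V-Cache-sized L3: 45 + 9 = 54 cycles
```

Games tend to love big L3 because their hot data (entity lists, physics, draw-call state) is often just small enough that tripling the cache catches most of it.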
kino cpu
Got a 1070 and i dont play the latest stuff, is it a good idea to get a 1440p monitor?
also, is it really that horrible to drop to 1080p on such a monitor if needed, over just lowering the graphics to maintain the native resolution?
>1070
>1440
what a naive tranimeposter lmao
Sure
I have a GTX 1080 (overclocked) and a 1440p monitor, it runs most games at that resolution no problem. Modern games with lots of things going on will run at like 50 fps, so I just lower it to 1080p and set up Nvidia Image Scaling or just sharpen through ReShade; it looks just a bit worse than native 1440p while gaining 20-30 fps.
Armored Core VI runs at 60 fps ultra at 1440p, but Monster Hunter World or Elden Ring will need 1080p to run at 60 ultra for example.
Also, if you read a lot on your PC, 1440p is a must.
Ganker is a known Intel and nVidia shill board.
That’s every board, even Ganker, though I suspect said trillion dollar companies can afford to pay 5 people to shill their products here without issue.
Bros is the 4090 a decent GPU if I want to game at 1080p 60fps?
Throw some DLSS upscaling on and you're golden.
nah get a GT 210
at the rate games are going, you'll have to settle for 720p or a solid 30fps
idk but i sure as hell wont buy intel with their new limp dick ~~*e-cores*~~ technology which works only on newest windows and breaks old games
it beats or ties intel flagships at less than half the power draw
amd was so fucking retarded to launch the series with the 95c thermal cap instead of a proper power draw cap
And it didn't even improve performance lmao.
The vertical stacking of cache necessitates the thermal cap. Pump any more power into it and the solution starts to degrade very quickly.
It's overpriced. Why not just get a 7600x?
>x86
>so inefficient that the likes of intel had to make e-cores to handle verbose ancient instruction noise from the x86 instruction set
>p-cores do the heavy lifting
>causes issues with older programs that are no longer maintained, requiring 3rd party methods to isolate it to p-cores
>arm
>more efficient, uses less power/more battery life, churns out less heat
>PCMR still has no tangible benefits from it in 2023
i need to wake the fuck up from this nightmare
Intel didn't put research into bypassing x86's limitations again. So AMD humbled them. Again.
I can just see you clicking around on your MacBook m1 and rubbing your toes together
>macbook
the fuck do you think is being used in your phone?
your VR headset?
x86
dying since 1985
>dying since 1985
and yet it still hasn't died and windows 11 is still on x86.
it needs to die faster
x86 isn't dying until Windows moves to ARM and has an easy way to run old games and programs on it
>only game i play is heavily modded Kerbal Space Program
>no one ever benchmarks it so no idea whether x3d is good for it or not
Only if you own a 4090, otherwise you are just burning money.
imo the best gaming CPU atm is the 7600 for the price, but if you already own the AM4 platform, a 5600 is pretty much all you need, especially for vidya. besides, CPUs like the 5700x are pretty damn cheap on aliexpress now too, great for 3rd world poorfags.
Upgraded from my 1600 to 7500f, planning to get next gen am5 upgrade, or 5800x3d on sale.
For me, it's the Ryzen 5 3600. The best budget CPU.
I just put together a 7800x3d last week with 6000MT/s ram
so far it's been great for gaming... but all I've played is My Time at Sandrock and Risk of Rain Returns.
Yeah you way overspent but at least you won’t have to worry about cpu speed for a while.
Man, I'm sitting on an RTX 3070 and I remember popping it in and slowly coming to terms with the fact that I'd just wasted $500, as I play mostly ancient 10-20 year old games. I still enjoy tinkering with my PC.
I did the same thing going from a 3600 to a 5800X3D and I barely noticed the difference since most of the games I play are 5+ years old.
I'll probably do a whole new build when I do notice the difference, but the price to pay to upgrade from an RTX 2070 will hurt.
Whenever I see tech threads on Ganker I want to blow my brains out.
Most of you fuckers should stick to consoles.
Dunning–Kruger effect - the thread
Pluton. Nothing past Ryzen 5xxx is ok for anything.
2160p FPS from TechPowerUp data using a 4090, so these are the biggest differences you'd ever see at 4K:
$150 5600
>AVG: 151
>1% LOW: 107.8
$350 7800X3D
>AVG: 169.9
>1% LOW: 124.9
You're paying 2.33x more for 1.12x AVG and 1.15x 1% LOW.
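Same math, spelled out (numbers as quoted above):

```python
cheap = {"price": 150, "avg": 151.0, "low1": 107.8}   # $150 5600
fancy = {"price": 350, "avg": 169.9, "low1": 124.9}   # $350 7800X3D

for k in ("price", "avg", "low1"):
    print(f"{k}: {fancy[k] / cheap[k]:.3f}x")   # 2.333x, 1.125x, 1.159x
```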
>cpu
>4k
are you fucking retarded
Honestly I would pay just for that 1% lows increase. Nothing pulls me out of a game like stuttering does.
2000 dollar GPU but can't afford 200 dollars for a better CPU?
What kind of retard bullshit is this? lol
In the context of the whole system ownership though, that's paying <10% more for 12% average and 15% 1% minimums. Which is a no brainer, if you just bought a 1600-2000 dollar GPU. Why would you gimp it by 10% to save another 200 bucks? You clearly have the money.
Right. It's important to consider the entire setup. I don't like performance/$ calculations for this reason.
Be future proof, you dumbfuck. Every year there are new games that use more resources. Why would you buy a good-enough one now that will be bad tomorrow? Get the best now and be set for the generation.
Got a 7600. I'm having fun with the iGpu. I'm waiting for a good game to come out to buy a GPU.