Your CPU is weaker than the PS5's.
not the OP but I have a pretty similar CPU (8700K), is it a significant enough bottleneck to warrant an upgrade?
The PS5-equivalent CPU is the Ryzen 3600, according to Dragon's Dogma 2's system requirements from Capcom.
But you should be buying X3D CPUs; they're the real next gen of PC gaming.
Yeah, the 3600 is the PS5-equivalent CPU for games that don't take advantage of 8 or more cores (which is most games released so far). Alan Wake 2 used the 3600 as its PS5-equivalent CPU as well. The R7 3700X is closer to a PS5, but unless a game makes use of those two extra cores it performs the same as a 3600 in games.
The PS5 dedicates at least one core to OS-specific processes.
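If you want to check what your own chip actually has (the core/thread counts everyone here is arguing over), a quick Python sketch; assumes the psutil package is installed:

# Physical cores vs logical threads: the difference between a 6c/12t
# chip like the 8700K and a 4c/8t quad core like the 7700K.
import psutil

print("physical cores: ", psutil.cpu_count(logical=False))
print("logical threads:", psutil.cpu_count(logical=True))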
Is there any point in the 5800X3D anymore when the 7600X exists? It has the same performance, costs less, and it's AM5, so your mobo has an upgrade path.
You could potentially skip AM5 and ride that 5800X3D for a long time, though for the price of a new 5800X3D plus an extra $100 you can get a B650, a 7600(X), and 32 GB of 6000 MHz CL30 RAM, so I'm not sure which is the better move. Probably the latter.
I think the 5800X3D outperforms the 7600X.
The 8700K is a 6-core chip and aged much better than the dark-age Intel quad cores (2nd-7th gen).
There is a bottleneck but it ain't that significant yet.
It really depends on the game; it matters for some and not for others. If you're content with 60 FPS you'll probably be fine for the most part. If you want 120+ you're probably fricked.
>2024
>speccy
>any year
>speccy
The fact that Ganker still uses that goddamn thing after it's been exposed as a security risk is lmao-worthy.
qrd?
https://www.google.com/search?client=firefox-b-1-d&q=speccy+security+breach
https://www.bleepingcomputer.com/forums/t/390593/speccy-users-beware-hacked-service-could-cause-you-to-become-infected/
>this compromise won't affect them as Speccy does not render the HTML that would load the malicious script.
so some random hack from over a decade ago, and it was just the website; speccy wasn't even vulnerable to it? that's what you're crying about?
not him but last i checked that shit hasn't been updated in years and didn't even properly support/show modern hardware.
might have changed since then, but there's no reason to use that shit anyway because there are better options that do everything better
Switch to hwinfo64 and stop arguing noob. This is big brother Ganker speaking
frick off shill, you got debunked now leave
>Ganker
Not the seal of approval you think it is.
Fair enough. Do you.
>8 year old cpu
lol
lmao
>op using an ancient cpu while complaining about low framerates
>thread devolves into GPU brand war
every single time.
>instigator was a tranime poster, no less
really makes you think
>reasonable AMD owning soijak-using virgin
vs
>nvidiot fanboy anime reaction image chad
I don't know who to support.
come home white man
I wanted to get an Arc just to have one. Not use it, mind you. Hahaha. Hell no. I'd buy an overpriced nvidia GPU again, or use an AMD APU, before I went into Intel graphics Hell. But I appreciate the fact that they went with a reasonably priced, specced, and sized GPU. Especially in this era of $700 3-slot cards with 12 GB of RAM.
Also I really liked the very toned-down but aesthetically pleasing design.
Take a good look at who starts it every time.
Picrel hehe
>amd
Hahahaahahaha what was op thinking
Because the nvidia GPU for the same price can't even compete with the 7800xt
The RTX 4070 is not a bad choice compared to the 7800 XT, in fact, I'd say it's the only RTX 40 series card that's worth getting.
>12 GB card
>worth getting
It's a bit too expensive for what it is, but you do get DLSS, and when you're forced to use upscaling due to shitty unoptimized games, you'll be glad you can use DLSS instead of FSR. I do hope AMD continues to improve FSR though... it can certainly be improved A LOT.
This post reads like viral marketing.
>DLSS
>something to be happy about
Yet mine doesn't have these issues?
>>DLSS
>>something to be happy about
It's kinda true... If you're forced to use upscaling, which one would you rather use? Ideally GPUs would be strong enough to not need upscaling at all, but that's not the case; GPUs are slow unless you spend $1000 or more, it seems.
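For a sense of what upscaling actually renders internally, here's a rough Python sketch; the per-axis scale factors are the commonly cited ones for the DLSS/FSR quality modes, not taken from any specific game:

# Internal render resolution per upscaler quality mode.
# Scale factors below are the commonly cited per-axis ratios (assumption).
MODES = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height):
    for mode, scale in MODES.items():
        w, h = round(width * scale), round(height * scale)
        print(f"{mode:>17}: {w}x{h} ({scale * scale:.0%} of the pixels)")

internal_res(3840, 2160)  # 4K output: Quality mode renders ~2560x1440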
Mid-range GPUs are more than capable of playing at 1080p/1440p and getting exceptional performance. DLSS is absolutely not necessary unless you're trying to path-trace 2077.
6800XT is the best gpu ever made. 3 years later and people are still clawing for its performance tier and vram.
they later revealed they had a dogshit PSU that didn't supply the required amps; wattage alone doesn't matter.
>1000W seasonic
>even tested a spare
idk bro
user error
it's funny because dx12 games are already a gigantic piece of shit and somehow AMD enhances just how bad of a time you'll have with them.
you buy cheap you get cheap
>this idiot actually thinks that if you dump more money into a GPU it's going to magically be better
>NOOOOOOOOO YOU GOTTA SPEND 1500$ FOR THE SAME RESULT
>the copium when it turns out red were better off dead
Haha
Whatever you gotta tell yourself, kid.
The poors are malding lol.
Unless you're running a Quadro-tier GPU for your CUDA workloads, you're just a flexing poorgay.
xx60/ti gays are the biggest coping poorgays on the board and should be bullied relentlessly.
>AMD
>doing the same FPS
try playing the same game at an increased resolution and see if your FPS drops or stays the same
Both AMD and Nvidia cards can, for example, render games at resolutions higher than your monitor's native res
The 7800xt is more than capable of playing at 4k, which I believe was not feasible on your previous 1060
try it
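The logic of that test as a toy sketch (the FPS numbers are hypothetical, just to illustrate the rule of thumb):

# If FPS barely moves when you raise the resolution, the CPU was the
# limit all along; if it drops hard, you were GPU-bound.
def bound_by(fps_low_res, fps_high_res, tolerance=0.10):
    return "CPU" if (fps_low_res - fps_high_res) / fps_low_res < tolerance else "GPU"

print(bound_by(fps_low_res=62, fps_high_res=60))    # ~same FPS -> "CPU"
print(bound_by(fps_low_res=144, fps_high_res=75))   # big drop  -> "GPU"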
I downsample from 4k with my 7900xt, but sometimes it doesn't work. Not all games support it, or sometimes they don't scale the UI properly. But when they do, holy shit. Goodbye AA, I never needed to know you.
well, i'm sure it's pc gaming as a whole that's to blame and not the fact that you expected to upgrade one component in an entire system and have it all magically not be bottlenecked by the shitty old-ass intel CPU
>What a fricking scam PC gaming is.
Bullshit, what game and what's the fps?
Baldur's Gate 3, around 30 FPS in the lower city
CPU too slow
The general rule is that if a game is running worse in cities or densely populated environments, it's usually the CPU's fault. And your CPU is ancient.
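If you'd rather measure than guess, log CPU and GPU utilization while standing in a busy area. A rough Python sketch; assumes psutil is installed and nvidia-smi is on PATH (NVIDIA only, AMD needs a different query):

# If the busiest core sits near 100% while the GPU stays well under
# ~95%, you're CPU-bound. Run alongside the game for ~30 seconds.
import subprocess

import psutil

def gpu_util():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

for _ in range(30):  # ~30 one-second samples
    cores = psutil.cpu_percent(interval=1, percpu=True)
    print(f"busiest core: {max(cores):5.1f}%   gpu: {gpu_util():3d}%")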
minecraft you stupid fricking Black person
That processor was released before the Nintendo Switch.
Upgrade now
It's your CPU, moron
>speccy
>7700k
>speccy
>2133mts ram
>speccy
I upgraded my CPU from a 6700 to a 7800 without touching the GPU (it was still in the post) and my FPS improved significantly; most importantly, the spikes were completely gone.
>spikes
you mean dips/drops/stuttering?
Yes. People used to call sudden, drastic changes in performance "spikes" because of how they appear on a graph.
Yeah, I used to have horrible random stutter in Forza. Upgraded the CPU and it was completely gone; the frametime chart is completely flat.
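Those spikes are easy to quantify if you log frametimes with something like PresentMon or CapFrameX. A rough Python sketch; assumes a PresentMon-style CSV with a msBetweenPresents column (column names vary by tool and version, and the filename here is hypothetical):

# Summarize stutter from a frametime log: average FPS, approximate
# 1% lows, and a count of frames that took over twice the average.
import csv
import statistics

def stutter_report(path):
    with open(path, newline="") as f:
        ft = [float(row["msBetweenPresents"]) for row in csv.DictReader(f)]
    ft.sort()
    avg = statistics.mean(ft)
    p99 = ft[int(len(ft) * 0.99)]   # 99th percentile frametime
    spikes = sum(1 for t in ft if t > 2 * avg)
    print(f"avg {avg:.1f} ms ({1000 / avg:.0f} fps), "
          f"1% low ~{1000 / p99:.0f} fps, spikes (>2x avg): {spikes}")

stutter_report("presentmon_log.csv")  # hypothetical filename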
>4c/8t cpu
homie your shit is worse than i3s from several years ago
>Speccy
>7700K + 7800XT
You are not white
Everyone has told you this already, but it's your ancient fricking CPU. The 7700K has been obsolete for a while; anything 7th gen and before is obsolete. You should have built a first-gen Ryzen system, or bought an i7-7740X on an X299 platform so you could have gotten an i9-7900X years later.
Is my 5600g obsolete yet
No, that's rather recent.
5600G is far from obsolete. Probably will last this whole console gen.
>doesn't do PCIe 4.0
Yep, it's trash.
bro that CPU is like 7 years old. You need at least 6 cores. The RAM is slow as shit too; it should be at least 3200 MHz to not bottleneck the CPU.
This is why morons shouldn't build PCs.
Your RAM is slow as shit and the CPU is ancient. You have a double bottleneck. Blame your moronation, not PC gaming.
>Forgot pic related
jesus your memory is slow
Impossible.
>buying 2000 dolla card
>still doing 60 fps like before on a 1080p screen
wtf
morons should just stick to consoles and apple
You are bottlenecked, you stupid Black person, christ.
So is this going to be another thread an autist starts spamming until he eventually dies of a brain aneurysm?
https://arch.b4k.co/v/search/image/pqHO_dr4GtRAkt5o8zOfiA/
Could it be because you have a 7 y/o CPU?
You need to upgrade your entire system, my man; your shitty 7-year-old 4-core CPU is bottlenecking your GPU performance.
>that terribly slow memory
>7th gen intel
your whole system needed an upgrade, not just a gpu my homie
>1066mhz ram
that's your bottleneck right there
you could get RAM at double the speed and capacity for like 40 bucks
It's 2133 MHz effective. Speccy just reports the real clock, which is half that, because it's double data rate RAM.
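For anyone confused by the half-speed thing: DDR means two transfers per memory clock, so tools that read the actual clock show half the advertised number. Quick arithmetic sketch in Python:

# Double data rate: two transfers per clock, so the effective rate is
# twice the real clock. Bandwidth = MT/s x bus width x channels / 8.
def ddr_effective(real_clock_mhz, bus_bits=64, channels=2):
    mt_s = real_clock_mhz * 2                       # effective MT/s
    gb_s = mt_s * bus_bits * channels / 8 / 1000    # peak GB/s
    return mt_s, gb_s

mt, bw = ddr_effective(1066)  # what Speccy shows for "2133 MHz" DDR
print(f"{mt} MT/s effective, ~{bw:.1f} GB/s peak dual-channel bandwidth")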