When I was looking into building a budget PC, AMD pretty much outperformed Nvidia in every metric.
I wanted a budget GPU that could run 1080p high at 60+ fps. That turned out to be the RX 6600 for $210; the Nvidia counterpart is the 3050 for $230, but it was down like 10-40 fps in every gaming benchmark.
You would have to buy a 3060 to beat an RX 6600, but that costs $100 more and gives you like 10-15 more fps.
It's not just low-end cards: the last good generation was Pascal. Turing was trash (post-mining boom), Ampere was impossible to buy (due to mining boom), Lovelace is trash again (post-mining boom).
...followed by xx50 series cards, which sell on mindshare alone despite being turborubbish. Anyway, the real money is at the halo tier where margins are ridiculous.
they are, but the nvidia shills will do their bestest to try and convince you they ain't. you get what you pay for, the rig will be five percent less powerful than the nvidia equivalent but will also cost five percent less.
AMD are better for the price point, and they produce both GPUs and CPUs, which means they can scale them to be even cheaper. Basically, why pay extra for less graphics and less processor power?
Because Nvidia are such an awful partner they've stabbed MS and Sony in the back (and soon Nintendo too), so MS/Sony told them to frick off and took their business to AMD.
Funny how Nvidia was all "consoles will be dead in 2 more weeks!" when AMD had all the console contracts and the shield was a gigantic flop but as soon as the Switch was a thing Nvidia instantly reversed course and said they loved consoles.
1) It's cheaper
2) nVidia has been really gay about their exclusive features, to the point where they would shut developers out from their blocks of code and force them to employ an nVidia toady to work on it instead.
3) AMD does both CPUs and GPUs, so by going with them console makers can have one chip serve as both components and cut down on how much space is needed on the motherboard. Intel recently entered the discrete GPU market, but those cards suck, and Intel has never had good iGPUs for gaming either.
Nvidia's fricked every single partner they had. The PS3 got a shitty old GPU, MacBooks had faulty GPUs and Nvidia wouldn't recall them, the Switch can be hacked with a paperclip. Intel is just expensive and does not make custom chips.
i am pretty sure their plan is to ultimately get rid of their partners altogether and just make their founders edition gpu's the norm in the future. they didn't have the supply lines and manufacturing lines before, but they've been building those up hardcore the last ten years or so. probably also why evga ultimately decided to tell nvidia to go and suck it. probably wouldn't have done that unless they knew that nvidia was planning on becoming a direct competitor of theirs in the nearby future.
Because Nvidia makes higher quality products, which naturally cost more; Sony and Microsoft prefer to pay less, but then they have hardware that's obsolete just 2 years later.
Because Nvidia is absolutely demolishing the PC market and that's where they're spending all their time.
AMD is just making use of the fact Nvidia is busy doing other shit.
Nvidia is infamous for how awful they are as business partners. Remember when EVGA quit the whole GPU market (80% of their business) because working with Nvidia is ubearable?
is ryan gosling hot shit now? i feel like he's being positioned to be the new jake gyllenhaal, or something. remember when jake gyllenhaal was everywhere and seemed to be doing five movies a year, or something? what do you guys think? is ryan doing a gyllenhaal?
The Drive meme was already done to death 12 years ago, but normalgays have only discovered it recently. So we have no chpice but to suffer their endless regurgitation of a dead meme.
cost efficiency
consoles need to be mass manufactured, so unified hardware between CPU and GPU is ideal in a custom SoC. the only SoC Nvidia has is the one for the Shield, which is why the Switch uses it
Nearly every company that partnered with Nvidia has come out seething about how shit they are as a business partner. Maybe they've gotten better in the past decade, but why take the risk?
That and AMD is pretty good at making cheap mass market SoCs now. nvidia not so much.
The real answer is that AMD has perfected the APU solution for years now, whereas Nvidia can't get the same efficiency in a single package and Intel is a whole decade behind in graphics technology, only starting to catch up recently with their Arc lineup.
>Consoles need decent performance (not good nor top tier)
>Consoles need cheaply made and poorly designed components to keep the cost low
>Company goes with the 3rd world option and picks AMD
Consoles are built to a price and need to cut corners. They can't do that if they pick something actually god-tier in performance and quality
Partner with one company instead of two. Likely can get bundled discounts on CPU and GPU technology.
probably because NVIDIA is extremely israeli and asked for like 3x what AMD did
AMD can provide the CPU and the graphics chips, and their components are cheaper. Important for a console to not cost $2000, which is what Nvidia would charge for 14 gigs of RAM!
Intel might start competing with AMD now though.
>why does hardware that needs to be consistently manufactured for a specific market price use cheaper electronics
gee
Because AMD were the only company who could make an SoC with an x86 processor and high performance graphics. Nvidia doesn't have a license to manufacture x86 CPUs, and at the time these consoles were being developed Intel weren't making high performance GPUs. SoCs are cheaper to manufacture, easier to cool, and use less power than a multi-chip solution from two different companies, like an Intel CPU plus an Nvidia GPU.
In the future we may get Intel-based consoles, since they make high performance GPUs now, so an Intel SoC for 8K gaming is viable. I doubt they'd go with Nvidia, because they won't want to break x86 compatibility by using Nvidia's licensed ARM cores.
Are AMD builds worth it?
Depends on your price point: you get more processing/graphical power for cheaper. It won't be as optimised as Intel/Nvidia, but it will be cheaper, and the extra power generally makes up for whatever loss. Though if you are looking for premium you will go Intel/Nvidia and spend a shitload. There's this weird wrap-around in premium though, where you could also buy an older server CPU for the same price as a current CPU. I have a rendering computer that has 128 GB of DDR4 RAM and a Threadripper. It uses a lot more power, but I got it fairly cheap because I bought half the parts from a place my mate works for, which was upgrading their stuff to DDR5.
intel hasn't been better in the CPU market for many years now
Really? I'd been hearing Intel was getting stagnant but I didn't know how bad it had gotten.
Intel had their biggest innovation since Core replaced Pentium 4 not too long ago
13th and 14th gen are just refreshes of 12th gen, but 15th gen is going to be another big leap
yeah, AMD made a major comeback in the last 3-5 years. Ryzen keeps getting better, whereas the 11th-13th gen from Intel has been lackluster; they aren't horrible CPUs, but they have a really high TDP and produce a lot of heat to get similar gaming performance to the Ryzen counterpart
Intel still has the professional production market cornered though, their CPUs are still preferred for things like rendering and audio production
>Intel still has the professional production market cornered though
Really? I thought Xeon was worse than Epyc, with Intel just having better software support, like Nvidia does, so on AMD you don't get as much software optimisation.
Also related: I thought pretty much everything now uses the GPU for most of the heavy lifting, so you can't really bottleneck on the CPU. At that price point you are better off just upgrading to server CPUs.
It depends on the application. If you're just gaming and doing regular computer stuff, then the current Ryzen offerings are fine, even when used with an Nvidia card. If you use the computer for actual work-type stuff like rendering, A/V editing, big math, vtubing, etc., you actually benefit more from getting an Intel equivalent, simply because hyperthreading works as advertised and it can handle more simultaneous programs that are grinding it out. That's why everybody recommends Ryzen nowadays: if you buy an Intel CPU and you're not using it to its full potential, then you're just burning extra cash for zero advantage.
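On the "more simultaneous programs" point, how much extra threads actually help is capped by how parallel the workload is (Amdahl's law). A quick illustration with made-up numbers, not benchmarks of any real chip:

```python
# Amdahl's law: speedup from n threads when only a fraction p of the
# work can run in parallel. The 95% figure below is illustrative.

def amdahl_speedup(p, n_threads):
    return 1.0 / ((1.0 - p) + p / n_threads)

# A render job that is 95% parallel:
for n in (4, 8, 16, 32):
    print(f"{n:2d} threads -> {amdahl_speedup(0.95, n):.1f}x")

# Even with infinite threads the ceiling is 1 / (1 - 0.95) = 20x,
# which is why single-thread speed still matters for the serial parts.
```

So both camps have a point: heavily parallel work rewards more threads, but the serial fraction is why clock speed never stops mattering.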
Explain why hyperthreading doesn't work on ryzen because I've never seen this kind of behavior
I think you are confusing it with Intel's Quick Sync video encoder, which exists because their chips came with iGPUs, and until Ryzen 7000 AMD desktop chips did not (excluding APUs, which were rare and gimmicky compared to their purebred desktop cousins)
Intel's sole advantage is single thread performance due to higher clock speeds.
That's literally it. It falls behind in just about every workstation workload compared to an equivalent Threadripper.
>clock speed is... le bad!
meanwhile AMD nails on some extra cache as a performance hack and the entire tech industry claps like seals
Yeah, because real-world workloads utilize concurrency more than raw speed, and more cache means more can be done faster without more power usage.
Intel's design strategy is DnD orc barbarian tier moronic for work. It does (old / shitty) games well, and that's it.
>concurrency
which cache does nothing for. It's a cop-out for AMD's inability to design a faster memory controller
>more cache means more can be done faster without more power usage.
Cache is a thermal and power nightmare; there's a reason the 3D cache chiplets have lower power limits and lower clock speeds than the regular ones.
More clock speed almost always means more performance. It's been that way for decades, except in rare cases like the Pentium 4, where they threw away everything in pursuit of clock speed and never reached speeds that could have compensated for doing so.
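To make the clock-vs-cache argument concrete, here's a toy back-of-the-envelope model. Every number in it is made up for illustration (no real CPU is being modeled): time per instruction depends on both cycle time and how often you stall on a cache miss, so extra cache can beat extra MHz on miss-heavy workloads, while the faster clock wins when everything fits in cache.

```python
# Toy model: ns per instruction = cycle time * (base CPI + miss_rate * miss_penalty).
# All parameters are illustrative assumptions, not measurements.

def ns_per_instruction(ghz, base_cpi, miss_rate, miss_penalty_cycles):
    cycle_ns = 1.0 / ghz
    effective_cpi = base_cpi + miss_rate * miss_penalty_cycles
    return cycle_ns * effective_cpi

# "More clock" chip: 5.5 GHz, smaller cache -> 2% of instructions miss to DRAM.
fast_clock = ns_per_instruction(5.5, base_cpi=0.5, miss_rate=0.02, miss_penalty_cycles=300)

# "More cache" chip: 4.5 GHz, big cache halves the miss rate.
big_cache = ns_per_instruction(4.5, base_cpi=0.5, miss_rate=0.01, miss_penalty_cycles=300)

print(f"fast clock: {fast_clock:.3f} ns/instr")
print(f"big cache:  {big_cache:.3f} ns/instr")

# With miss_rate near zero (workload fits in cache), the 5.5 GHz chip wins instead.
```

Under these assumed numbers the big-cache chip comes out ahead despite the lower clock, which is roughly the X3D story; flip the miss rate toward zero and the relationship inverts.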
I've had an AMD build for years now. On release they were unstable and shit, but nowadays everything just fricking works. I have zero problems playing games, and the performance is still great.
I preferred when ATI was independent. Also, Nvidia ShadowPlay is too invaluable, israelites or not
These days, yes. They've improved significantly over the past couple of years.
For games, yes.
Intel makes decent CPUs as well, but if they're worth buying or not depends on the region you live in. For example, here in Europe from 2020 to 2022 Intel offered better price/performance; but now AMD is ahead again. This may have been different in other regions. But they're pretty close nowadays, either brand is fine.
As for GPUs, if you buy Nvidia strictly for videogames you're wasting money. Simple as.
For productivity, it depends on your usecase: some programs don't play nice with AMD GPUs, others work fine. You should look it up.
>As for GPUs, if you buy Nvidia strictly for videogames you're wasting money. Simple as.
What are the most recommended modern GPUs?
It depends on your budget.
>Poorgay-tier
Below $200 = RX 6600
$230 = RX 6650 XT
$280 = RX 6700 (These can be pretty hard to find, depending on your region)
>Most gamers should buy one of these-tier
$330 = RX 6700 XT, or 6750 XT if not that much more expensive (The 6750 XT is around 10-15% faster, so it may be worth it if you can find one for $360 or so)
$450 = RX 6800 (These can be pretty hard to find, depending on your region)
$520 = RX 6800 XT
>Enthusiast-tier
$600 = RTX 4070 is decent, but lack of VRAM may become an issue in the not-so-distant future
$630 = RX 6950 XT, but keep in mind this requires 350-400W of power out of the box, which is A LOT.
$650(?) = RX 7900 GRE: same performance as the 6950 XT, but 260W. For now, this is only sold in China and to system integrators. In other words, consumers can't buy this product, but maybe in the future we'll be able to...?
$750 = RX 7900 XT
$950 = RX 7900 XTX
>4090-tier
$1600 = RTX 4090 stands alone at the top. If you want the best and don't care about value, this is the best.
Keep in mind this is FOR VIDEOGAMES. As I said, for productivity it depends on your use case, because some applications run slower or straight-up don't work with AMD cards.
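One way to sanity-check a tier list like the one above is dollars-per-frame: divide price by average fps in your target games. A sketch using the list's prices; the fps figures are hypothetical placeholders, not benchmark results, so plug in numbers from recent reviews before trusting the output:

```python
# Rough value comparison: dollars per fps at your target resolution.
# Prices are from the tier list above; fps values are made-up placeholders,
# NOT real benchmark data -- substitute averages from actual reviews.

cards = {
    "RX 6600":     {"price": 200,  "fps": 60},
    "RX 6700 XT":  {"price": 330,  "fps": 95},
    "RX 6800 XT":  {"price": 520,  "fps": 130},
    "RX 7900 XTX": {"price": 950,  "fps": 185},
    "RTX 4090":    {"price": 1600, "fps": 240},
}

for name, c in cards.items():
    print(f"{name:12s} ${c['price'] / c['fps']:.2f} per fps")
```

With any realistic numbers the halo card costs the most per frame, which is exactly the "don't care about value" tier the list describes.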
Intel GPUs are trash
Nvidia has extra stuff for people who want to make money with the GPU; that adds price and heat.
The console is for gaming.
Yeah, I had one but now I want to earn some money with porn
>Intel GPUs are trash
They are the only ones offering good performance and VRAM for $300-$400. Nvidia is in lala land, and AMD has taken the place where Nvidia was before.
CPUs yes
GPUs no
depends on the price point. it's rarely the case that one vendor isn't competitive across the board. just don't base your assessment solely on shill sites.
Yes, for a gaming rig and if you don't care about stupid shit like DLSS and ray tracing.
Yes.
Just got this build and everything runs like a dream.
good luck using ray tracing
>inb4 lel RT is le shit
>"Everything runs like a dream"
Literally do not give two shits about raytracing.
works fine in Control and Doom Eternal on my 7900 XT. 100+ fps in 4K on ultra. You're probably stuck in the 3rd world with a 980 or a 2080 or something.
>good luck using ray tracing
the 6000 series can use RT better than 3000 series cards in some instances. the only time when nvidia's RT advantage is distinct is for games that are practically RT demos.
The 7900XTX has the same Raytracing capabilities of a 3090. Is the 3090 considered shit at RT now?
Actually it's on par with a 3090 Ti in RT. The issue is that the XTX is 20-50% faster than a 3090 Ti in rasterization, so you lose a lot of performance when enabling RT. XTX without RT is an overclocked 4080. XTX with RT is a 4070 Ti.
> RT.
yeah, about that...
The XTX performs, generally speaking, on par with a 4070 Ti in RT, so your pic is cherrypicking. Then again, why play at 90-95 fps when you can turn that shit off, not notice a difference, and get double the fps?
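Worth noting what "double the fps" means in frame-time terms: converting fps to milliseconds per frame shows why doubling an already-high frame rate is less noticeable than doubling a low one.

```python
# Frame time in milliseconds: time_ms = 1000 / fps.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 95, 190):
    print(f"{fps:3d} fps = {frame_time_ms(fps):.1f} ms/frame")

# 95 -> 190 fps saves ~5.3 ms per frame; 30 -> 60 fps saves ~16.7 ms.
# Same "2x fps", very different perceptual gain.
```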
Why do you bullshit like this? This is what it looks like against a 4070Ti:
https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4070-Ti-vs-AMD-RX-7900-XTX/4146vs4142
And then, when you turn RT off, it flat-out devastates both cards as well. Like I said, get a 4090. But if you cannot afford it, go for the next best thing, which is neither a 4070 Ti nor a 4080.
>cuckbenchmark
How to invalidate your opinion with one single trick.
>userbenchmark
You’re a moronic animal and not even worth the seconds it took me to fetch the minimal amount of data.
https://www.techspot.com/review/2601-nvidia-geforce-rtx-4070-ti/
https://www.techspot.com/review/2588-amd-radeon-7900-xtx/
yeah... I mean 17 is kinda ok? I guess..?
You going to ask for seconds after gobbling up all that marketing?
I mean why ask for AMD to help with RT overdrive mode when their GPUs can't even do RT kek. You really are special
Almost wasted my time formulating a response.
Nvidia tech demo, specifically designed to run like shit on everything but their latest cards. What are you going to post next, Portal RTX?
works on my 3080
those two pictures remind me of something: they remind me that pc gaming has completely and utterly stagnated. like, look at the ground, look at that duzzy frickin bullshit. look at the cars. look at everything, in fact. just terrible. truly terrible. this shit would probably be quite impressive in 2011 or something, but now? the major problem is really that a lot of games just look like low-res fricking dog-shit, but you need a 1500-2000 dollar pc to be able to run them at decent speed, and it becomes a worse and worse deal every single year that passes. to be honest, if i were to buy a super rig now, it would just be so that i can play everything that came out in the last twenty years fully maxed out and not have to worry about whatever frames i'd be getting. i certainly wouldn't buy it for the shit games that come out these days. goddamn low-res fricking shit software, man...
That's a PS4 game, designed to run on a 400€ console from 2013. Then again, making PC exclusive high graphical fidelity games simply isn't financially feasible anymore. If you make a high budget game, you MUST make it also available on console to make back the budget.
Something like 50% of Nvidia card owners have a ray tracing enabled card. It just doesn't matter to a lot of people.
Raytracing literally doesn't work on AMD cards in Ratchet & Clank right now. It's been disabled by the developers; the option is grayed out. And absolutely no one gives a shit: did you see ANYONE on the Internet complain about this? That's how worthless RT is.
...and the XTX may end up performing a lot better in Raytracing in the future: the 4070TI is almost running out of VRAM in current games.
It already happened with the RX 6800 vs. the 3070 it competed with in price: the 3070 was faster at Raytracing a few years ago, now it runs out of VRAM.
>userbenchmark
>tfw gpu has 16gb vram for the last 4 years
How you holding up Ti bros?
I just picked it at random because it is the first one that popped up, but ok, I'll bite. Is it different to your preferred benchmarking? Show me.
>Is it different to your preferred benchmarking?
Userbenchmark isn't even "benchmarking", it's completely made-up bullshit. Look at actual benchmarks: the more recent, the better (reviewers tend to re-benchmark their cards every couple of months, to include the latest games and driver updates).
4070Ti Review: https://www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/32.html
4070 Review: https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/32.html (It's a review of a different card, but it's a bit more recent. And both the 4070Ti and 7900XTX are in there)
Ratchet & Clank: https://www.techpowerup.com/review/ratchet-clank-rift-apart-benchmark-test-performance-analysis/5.html
Remnant 2: https://www.techpowerup.com/review/remnant-2-benchmark-test-performance-analysis/5.html
Resident Evil 4: https://www.techpowerup.com/review/resident-evil-4-benchmark-test-performance-analysis/4.html
They did TloU Part 1 as well, but the port's been improved a lot since then, so it's not that relevant anymore. Still, here's the link: https://www.techpowerup.com/review/the-last-of-us-benchmark-test-performance-analysis/4.html
>inb4 lel RT is le shit
It is. I tried using RT in Cyberpunk and there is no fricking difference aside from nice-looking puddles. And since it's almost always sunny in that game, it's fricking useless.
And I still have more than 60 fps with RT on my 7900xt without any upscaling
The overdrive mode does look nice, but that even tanks the 4090. Solid 60s in that mode is another gen away. Basically it just isn't worth it at this point. In another few years, sure, at which point most cards will do it, but right now it's a fool's errand for sure.
Put it this way. What you want is a 4090, obviously. However. If you can't afford one, a 7900 XTX is a very good choice. It legitimately leaves the 4080 struggling for air. The lower end AMD cards are worthwhile as well if you are looking for actually lower prices. Sadly the rest of the Nvidia lineup is heavily gimped and stupidly overpriced. Personally I'm on a 2070 and I'm saving up for an XTX.
4080 is ok if you use stable diffusion a lot but don't want to get a new PSU etc. to feed a 4090, but apart from that very specific situation it's worse than the 7900
That's right.
If you just want to play video games and can’t afford a 4090 you should be building an AMD PC. I guess if you go low enough (sub-$200) you can start looking at Intel CPUs.
When i was looking into building a budget pc AMD pretty much outperformed Nvidia in every metric.
I wanted a budget gpu that could run 1080p high at 60+ fps. That turned out to be the rx6600 for $210. The Nvidia counterpart is the 3050 for $230, but it was down like 10-40fps in every gaming benchmark.
You would have to buy a 3060 to beat an rx6600, but that costs $100 more and gives you like 10-15 more fps.
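The comparison above is really just fps-per-dollar arithmetic. Here's a minimal sketch of that math; the prices are the ones quoted in the post, and the fps figures are illustrative ballparks (not real benchmark numbers):

```python
# Rough fps-per-dollar comparison. Prices (USD) come from the post above;
# the fps values are illustrative placeholders, not measured benchmarks.

def fps_per_dollar(fps: float, price: float) -> float:
    """Frames per second bought per dollar spent."""
    return fps / price

cards = {
    "RX 6600":  (70, 210),  # ~1080p high, $210
    "RTX 3050": (50, 230),  # "down like 10-40fps", $230
    "RTX 3060": (80, 310),  # "$100 more" for "10-15 more fps"
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps/$")
```

With numbers anywhere in that ballpark, the RX 6600 comes out ahead of both Nvidia cards on value, which is the point the post is making.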
Nvidia hasn't made a worthwhile XX50 card in a decade, they just don't care.
It's not just low-end cards: the last good generation was Pascal. Turing was trash (post-mining boom), Ampere was impossible to buy (due to mining boom), Lovelace is trash again (post-mining boom).
no, because the xx60 gpu's are where all the money's at. the sixty and seventy gpu's are what most normies end up buying.
...followed by xx50 series cards which sell despite the fact that they are turborubbish due to mindshare alone. Anyway, the real money is at the halo tier where margins are ridiculous.
fair point.
AMD PCs are the best if the heaviest workload for your PC is vidya.
For some more demanding professional programs, Intel/nVidia is better.
they are, but the nvidia shills will do their bestest to try and convince you they ain't. you get what you pay for, the rig will be five percent less powerful than the nvidia equivalent but will also cost five percent less.
AMD are better for the price point and they produce GPU + CPUs which means they can scale them to be even cheaper. Basically why pay extra for less graphics and less processor power.
Because Nvidia are such an awful partner they've stabbed MS and Sony in the back (and soon Nintendo too), so MS/Sony told them to frick off and took their business to AMD.
Funny how Nvidia was all "consoles will be dead in 2 more weeks!" when AMD had all the console contracts and the shield was a gigantic flop but as soon as the Switch was a thing Nvidia instantly reversed course and said they loved consoles.
1) It's cheaper
2) nVidia has been really gay about their exclusive features to the point where they would shut the developers out from their blocks of code and force them to employ an nVidia toady to work on it instead.
3) AMD does both CPU and GPU, so by going with them console makers can have one chip be both components and therefore cut down on how much space would be needed on the motherboard. Intel recently entered the discrete GPU market but those cards suck and Intel has never had good iGPUs for gaming either.
Nvidia's fricked every single partner they had. The PS3 got a shitty old gpu, macbooks had faulty GPUs and nvidia wouldn't recall them, the switch can be hacked with a paperclip. Intel is just expensive and does not make custom chips.
i am pretty sure their plan is to ultimately get rid of their partners altogether and just make their founders edition gpu's the norm in the future. they didn't have the supply lines and manufacturing lines before, but they've been building those up hardcore the last ten years or so. probably also why evga ultimately decided to tell nvidia to go and suck it. they probably wouldn't have done that unless they knew that nvidia was planning on becoming a direct competitor of theirs in the near future.
because AMD is better than israeliteVidia
everyone, and i mean everyone, hates NVIDIA
Because nvidia makes higher quality products which naturally cost more. sony and microsoft prefer to pay less, but then they have hardware that is obsolete just 2 years later
AMD simply has the best HSA.
Because Nvidia is absolutely demolishing the PC market and that's where they're spending all their time.
AMD is just making use of the fact Nvidia is busy doing other shit.
because they use APUs/SOCs instead of a discrete gpu
ITT
AMD is effectively the only option for consoles for the foreseeable future, though Intel might try to get in on it again but I find that unlikely.
Because APU is CPU+GPU
Nvidia is infamous for how awful they are as business partners. Remember when EVGA quit the whole GPU market (80% of their business) because working with Nvidia is unbearable?
evga as a whole is dying
It seems the plan was always to shut down entirely.
amd is open source
goyvidia is israeli
because absolutely fricking no one wants to deal with nvidia
i'm actually surprised that nintendo went with them
They got the deal of the century on the warehouse full of tegra SoCs nvidia couldn't sell to anyone.
Cheaper, less power hungry and more open architecture than Nvidia.
AMD makes better APUs and AMD gives good deals to partners.
nVidia is notorious for ruining every single partnership they've done in the past.
what cpu should i pair with a 6800 xt if i want to play 1440p 144hz?
The best gayman cpu on the market is the 7800x3d. Use whatever gpu you want with it i dont care.
5800x3d
is ryan gosling hot shit now? i feel like he's being positioned to be the new jake gyllenhaal, or something. remember when jake gyllenhaal was everywhere and seemed to be doing five movies a year, or something? what do you guys think? is ryan doing a gyllenhaal?
The Drive meme was already done to death 12 years ago, but normalgays have only discovered it recently. So we have no choice but to suffer their endless regurgitation of a dead meme.
cost efficiency
consoles need to be mass manufactured, so unified hardware between CPU and GPU is ideal in a custom SoC. the only SoC Nvidia has is for the shield, which is why the switch uses it
Nearly every company that partnered with nvidia has come out seething about how shit they are as a business partner. Maybe they've gotten better in the past decade but why take the risk?
That and AMD is pretty good at making cheap mass market SoCs now. nvidia not so much.
The real answer is that AMD has perfected the APU solution for years now, whereas Nvidia can't get the same efficiency in a single package and Intel is a whole decade behind in graphics technology, only starting to catch up recently with their Arc lineup.
Years change, nV stays the same.
>Consoles need decent performance (not good nor top tier)
>Consoles need cheaply made and poorly designed components to keep the cost low
>Company goes with the 3rd world option and picks AMD
Consoles are built to a price and need to cut corners. They can't do that if they pick something actually god-tier in performance and quality
>ps5
>apu
Yes the PS5 uses an APU. The PS4 also used an APU.
No it doesnt
?