can't play games in outer space
it does not play the latest western trash
less mature AI support. that's it.
That's actually not true anymore. Check out the benches from this week. A1111 supports Olive. My 3080 gens images slower and at lower res than my friend's 6900XT, but he still sucks ass at prompting.
It runs the latest western trash better than the competition, so you'll be tempted to play that instead of actual good videogames.
Jokes aside, the only thing AMD lacks these days is a proper alternative to DLSS. The 7800XT is 10% faster, $100 cheaper, and has 4GB more VRAM than the 4070... but that's how much extra value AMD has to offer before people will even consider buying their products, and the reason is DLSS.
Can anyone explain to me why retards use DLSS, FSR or whatever upscaler? You'd think a graphicsfag wouldn't be content with a fucking upscaler with fake frames of all things.
Because most of these new games are optimized so poorly you have to use DLSS/FSR to hit your actual resolution and desired FPS.
Because game performance is atrocious dogshit these days for diminishing returns.
Because this is the current state of gaming: if you aren't using AI upscalers, you're effectively giving up half of your card's potential. Everyone is using it, and some are even mandated to factor it in as a major component of development, whether by the console manufacturers, publishers, or graphics card manufacturers, or else they won't be able to release the game.
Obviously it's for frame rate, and you are assuming the DLSS will look awful. The reality is that sometimes it will look noticeably worse, and sometimes it won't look any worse at all.
this video shows outdated DLSS btw, DLSS 3 is much better
>What's the catch?
Radeon cards need a little more tinkering and don't work hitch-free out of the box. The issues are minor and if you know your way around a computer they shouldn't be a problem for you.
They're worse in terms of software support. Most Radeon software is like a budget version of Nvidia software, except for sharpening which is AMD's strength.
Radeon cards also seem to have issues with power management in multi monitor setups, but Nvidia cards do too, to a lesser extent.
Radeon cards are a decent buy if you're price-conscious, especially now that the RTX4xxx series is terrible in terms of pricing.
t. owned an RX580 a few months ago, recently converted to RTX3080
Because it's a trade-off most people consider good.
Let's say you're playing at 1080p native with 60fps.
1) You can play like this.
2) You can lower the graphics to 720p native and have 100fps, but your image will degrade by 33%.
3) Or you can activate DLSS at 1080p and have 100fps, and your image will only degrade by 1-10%, depending on how well implemented DLSS is and how good the native antialiasing is. That varies per game.
I think most people would choose 3). You're probably thinking this is bullshit and there's no way DLSS can do this, but look at imgsli comparisons or videos: it's the truth. The first few iterations of DLSS were trash, but now we're at 3.5 and it's hard to find flaws in it.
FSR2 is ok too, not as good as DLSS but it's not actually trailing that far behind. (At 4k and 1440p. At 1080p, it's very far behind DLSS.)
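To put numbers on options 2) and 3): the "33%" is the per-axis drop from 1080 to 720 lines; counted in pixels, plain 720p and DLSS Quality at a 1080p output both render roughly 44% of native, the difference being that DLSS reconstructs a full 1080p image from it. A back-of-the-envelope sketch (the ~0.667 Quality-mode render scale is the commonly cited figure, not something from the post):
```cpp
#include <cstdio>

int main() {
    const double native_w = 1920, native_h = 1080;
    const double p720_w   = 1280, p720_h   = 720;
    const double dlss_q   = 2.0 / 3.0;   // commonly cited "Quality" render scale (assumption)

    const double native_px = native_w * native_h;                       // 2,073,600
    const double p720_px   = p720_w * p720_h;                           //   921,600
    const double dlss_px   = (native_w * dlss_q) * (native_h * dlss_q); // also ~921,600

    printf("native 1080p : %.0f px\n", native_px);
    printf("native 720p  : %.0f px (%.0f%% of 1080p)\n", p720_px, 100 * p720_px / native_px);
    printf("DLSS Quality : %.0f px rendered (%.0f%%), then reconstructed to 1080p\n",
           dlss_px, 100 * dlss_px / native_px);
    return 0;
}
```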
DLSS at 1080p is disgusting. Also, it fucks up responsiveness, at least for me.
DLSS/FSR isn't fake frames.
It's literally running a lower internal render resolution, then using "AI" magic to upscale it back, and in many cases (at least past 1440p) you can get better-than-native results, especially if the game doesn't force TAA along with it. DLSS also comes with DLAA, which is far superior to TAA because it doesn't blur.
>DLSS/FSR isn't fake frames
It literally generates frames from nothing, you baboon. Shut the fuck up.
You're talking about frame generation. DLSS/FSR is upscaling that just makes the GPU's job easier through some very accurate algorithmic guesswork; that's why the GPU can make more frames for you. They don't come from nothing.
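Roughly, the distinction being argued over looks like this in a frame loop (the names are illustrative stand-ins, not any real engine or vendor API):
```cpp
#include <cstdio>

// Illustrative stand-ins only -- not a real engine or vendor API.
struct Frame { int w, h; bool synthesized; };

Frame render_at(int w, int h)                   { return {w, h, false}; }      // real game update + render
Frame upscale(Frame f, int w, int h)            { f.w = w; f.h = h; return f; } // DLSS/FSR2-style reconstruction
Frame interpolate(const Frame&, const Frame& b) { return {b.w, b.h, true}; }    // frame-generation style synthesis
void present(const Frame& f) { printf("%dx%d %s\n", f.w, f.h, f.synthesized ? "(generated)" : "(rendered)"); }

// Upscaling: every presented frame comes from a real game update; it was just
// rendered internally at a lower resolution before being reconstructed.
void loop_upscaled() {
    for (int i = 0; i < 2; ++i)
        present(upscale(render_at(1280, 720), 1920, 1080));
}

// Frame generation (DLSS 3 FG / FSR3 FG): extra frames are synthesized between
// two real frames -- these are the ones people mean by "fake frames".
void loop_framegen() {
    Frame prev = render_at(1920, 1080);
    for (int i = 0; i < 2; ++i) {
        Frame next = render_at(1920, 1080);
        present(interpolate(prev, next));
        present(next);
        prev = next;
    }
}

int main() { loop_upscaled(); loop_framegen(); }
```
Upscaling changes how each real frame is produced; frame generation adds frames that were never simulated, which is why only the latter really earns the "fake frames" label.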
That is frame generation you're talking about. It says so in your image. DLSS and DLSS frame generation are not the same thing. Can you not read? Lmao
>DLSS is different from DLSS
I just took a shit straight on your face and some of it went down your throat because you're a mouth breathing gay.
>DLSS FRAME GENERATION
you were saying?
Because node shrinks have become harder and more expensive, Nvidia and AMD have to use software trickery to match the gen-to-gen gains of previous generations.
My personal recommendation is to only use upscaling when targeting 4K and above.
Boy, they really didn't give a shit about optimization with that pos
Nothing.
it doesn't have a proprietary second chip for upscaling, ironically making Nvidia the only company that doesn't support its own 1000 series cards, while AMD has FSR and Intel has XeSS
Just release FSR 3 already.
Even once FSR 3 is out, it won't make a difference unless you bought an RDNA 3 GPU, as only those have dedicated AI cores. Only there will FSR 3 work like DLSS at the hardware level and provide similar results. Everywhere else, FSR 3 will run in software again.
Oh fuck off already nvdiashill you aren't welcome here.
FSR3 should be the same on every card, it doesn't use AI cores.
What's exclusive to RDNA3 (initially at least; historically, AMD releases new features on its latest cards first and then expands support to older products over the following months/years) is Anti-Lag+ and the ability to enable frame interpolation in every game through the drivers (which will be worse than FSR3 implemented in-game by the developers, but better than nothing).
Well, AMD doesn't have X3D GPUs either. But the massive L3 cache in RDNA2 worked way better than anyone expected, so I hope X3D GPUs are around the corner.
People will make fun of you on the internet because they have so little in their lives that they only care about brands
The same problem with all new hardware: You have to play modern games to really justify your purchase.
NVIDIA HAS NO L3 CACHE
AMD X3D CPUS AND X3D GPUS ARE THE FUTURE
AMD has the advantage that both PS5 and Series X are running on their hardware.
Next generation will probably go all in on FSR 3+ in order to make decent framerates on consoles a possibility. But Nvidia won't give a shit even if AMD gains the performance lead, because they make tens of billions with AI. AMD taking their market share in PC gaming away also won't happen anytime soon, considering how loyal PC builders are. Many still think that AMD is the same trash-driver company it was 10+ years ago, so they keep buying Nvidia even though AMD runs just as stable without ripping them off in every price class. Once players have had a bad experience, it's hard for them to forget, so Nvidia keeps winning while providing no actual value whatsoever.
Overheats.
Dies within 1 year.
Never buying amd anything ever again
This, I hate the green garden gnome but holy fuck team pajeet needs to step up in the temperature department, 90c vs 60c, same game same settings!?, No I don't need jet turbine in my case, fuck off.
FSR2 is shit and RT performance in some games is subpar, that's it. If you want to do AI shit and you don't want to install Linux, stick with nV.
I enjoy my 7900XT.
It uses fake DirectX hacks to gain performance in modern AAA goyslop. Running it on an actual game-agnostic API like OpenGL shows how bad it really is.
>enter amd thread
>HOLY SHIT GUYS AMD IS SO BAD IT LITERALLY KILLED MY DOG AND BURNED MY HOUSE DOWN AND GAVE ME COOTIES
>enter nvidia thread
>HOLY SHIT GUYS NVIDIA IS SO BAD IT LITERALLY KILLED MY CAT AND BURNED MY WIFE DOWN AND GAVE ME AIDS
starting to think y'all are just haters
PC Games only
Can't use for Emulation
Can't use for 3d rendering
Can't use for Animating
Can't use for AI generation
Nothing.
As a gaming graphics card? There are literally no downsides. AMD provides FAR better value for the price. Nvidia's drivers are fine on Windows but absolutely abhorrent on Linux. AMD cards also usually ship with more VRAM; I have no fucking idea how Nvidia is still shilling their super expensive "gaming graphics cards" with 16, 12 or even 8 GB of VRAM. AMD also supports Meme-Tracing. Sure, it doesn't run as well as on Nvidia cards, but if you buy a graphics card from AMD's newest series, the performance difference is only noticeable when you play at 4K. I sold my RTX 3090 Ti to a guy who needs it for encoding and AI shit. Getting my RX 7900 XTX this week. Fuck Nvidia, never again.
Headaches and subpar performance in most games.
The best anti-aliasing tho.
ATI's MSAA is still the best to this day.
tf2 looks jaggy as fuck with 8x msaa on nvidia cards while it looks clean on amd.
I've had few headaches, just about as many as my card's competitor (the 1080)
Performance has been perfectly fine. For the newest stuff, if you want raytracing you'll be better off with Nvidia; otherwise it's about the same, if not better, for AMD in most titles.
What resolution are you using? 8xmsaa is insane.
looks like shit in 1080p on nvidia cards.
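For a sense of why 8x MSAA gets called insane: a rough upper-bound estimate of the raw buffer cost at 1080p (the RGBA8/D24S8 formats and the no-compression assumption are mine, not from the posts):
```cpp
#include <cstdio>

int main() {
    const double w = 1920, h = 1080, samples = 8;
    const double color_bytes = 4;   // assuming RGBA8
    const double depth_bytes = 4;   // assuming D24S8 or D32

    const double mb = (w * h * samples * (color_bytes + depth_bytes)) / (1024.0 * 1024.0);
    printf("8x MSAA color+depth at 1080p: ~%.0f MB before resolve\n", mb);
    // ~127 MB of buffers -- trivial for an old forward-rendered game like TF2,
    // but the bandwidth hit is why modern deferred engines avoid 8x MSAA.
    return 0;
}
```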
it looked good when I owned AMD cards (up to 2018, when I switched to Nvidia).
>1080p
This bro time traveled from 2013?
1440p is a stopgap meme res with no discernible improvement over 1080p. Smart chads get a strong GPU and play at 1080p.
Ah, so you're blind.
Have you heard of this thing called glasses?
Not really. Most web content was designed for 720p, so 2K lets you splitscreen without loss of quality.
Worse memetracing and less AI support. That's all.
What about cuda
What about it?
How does it work on amd
Who cares? We ain't doing geometry. We trying to play some games.
ROCm, with HIP as the CUDA-lookalike layer you port to.
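To put it concretely: CUDA itself doesn't run on Radeon; the usual route is ROCm's HIP, which mirrors the CUDA runtime API closely enough that the hipify tools can translate most CUDA source. A minimal sketch of what that looks like (assumes a working ROCm/hipcc install; treat it as an illustration, not a guaranteed setup on any particular card):
```cpp
// Build with: hipcc saxpy.cpp
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Same __global__ kernel syntax as CUDA; hipify maps cuda* calls to hip*.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> x(n, 1.0f), y(n, 2.0f);

    float *dx, *dy;
    hipMalloc((void**)&dx, n * sizeof(float));          // cudaMalloc -> hipMalloc
    hipMalloc((void**)&dy, n * sizeof(float));
    hipMemcpy(dx, x.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(dy, y.data(), n * sizeof(float), hipMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);   // same launch syntax as CUDA
    hipDeviceSynchronize();

    hipMemcpy(y.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("y[0] = %f (expect 5.0)\n", y[0]);
    hipFree(dx);
    hipFree(dy);
    return 0;
}
```
The same HIP source can also be compiled for Nvidia targets, which is part of the pitch for porting rather than staying on raw CUDA.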
Path tracing is non existent and raytracing is subpar. Still the best bang for buck.
I had 4 GPUs in my life, always using them for at least 4 years, I went with ATi/AMD for the first 3, but switched to Nvidia in the fourth one solely because it ran one specific game that was my main focus at the time (Overwatch). Next one will most likely be back to AMD unless I decide to do something stupid and try Intel just for shits and giggles. I don't see myself buying Nvidia again.
For how much Nvidia fans talk about driver support and whatnot, Nvidia is very quick to pull the plug and not support older GPUs with new features.
>path tracing
Wow, I sure do love 20fps on my 4090!
it doesn't work on windows
spotty performance
Lacks a few meme acronyms that Nvidia cards have on their boxes. It doesn't really matter in terms of playing games but some people like their acronyms.
need to invent a computer soon, 4080 or a 7900 xtx Ganker?
for 1440p
Sorry anon, but you can't invent a PC. He patented it. Gotta get that pre-build.
7900 xtx. get that vram.
that settles it then, 7900 XTX it is. From what I've seen the performance is better in some areas and worse in others anyway.
I'll save the $300 for a rainy day
try and stop me
You've made the right choice, save that money for some pizzas while you game kek.
>AMDrone
>goyslop fan
yeah, it checks out
I'm on team wait till next gen. Going to fully upgrade from 1080p 144hz to 4k 144hz and this gen isn't there yet even at the toppest of top end cards. My old vega 64 will have to heat my room for one more winter
The XTX and save yourself the $300.
>should i get an overpriced card or the much cheaper card with much more memory?
Very tough choice, anon.
One comes with somewhat working drivers while the other comes with barely working drivers.
nvidia drivers aren't that bad
Really depends on use cases but I've had a few issues with them. Not nearly as many as with AMD, but far from the plug&play experience shills advertise.
Wait, which is which?
>1440p
you only need a 6800 XT or 6950 XT anon.
I have a 6700 XT for 1440p, and it's only in the latest games that I might need to go from Ultra to High settings; it's fine.
You don't need better than this, it's a waste of money.
It doesn't support CUDA. That's the only thing that should have been mentioned and it's disappointing to see that no one else so far did.
DLSS is just a way for developers to make even worse games, barely even optimizing them because "lol just render at 320p and upscale 5head"
>no one else so far did
sorry I'm a fucking retard
The same argument could be made about 3D cache, since it benefits unoptimized code more.
Weird how that never happens.
it doesn't happen because cache is literally just cache, there's a million ways you can use it. It does benefit sloppy code, the same way that upgrading to 128GB of RAM benefits an Electron app.
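A toy version of that point, if anyone wants to see it: the same cache-hostile access pattern gets vastly cheaper the moment the working set happens to fit in the last-level cache, which is what a big L3 (or 3D V-Cache) buys sloppy code. The sizes here are illustrative and the timings depend entirely on your CPU:
```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

// Chase indices through a randomly shuffled array: about the most
// cache-hostile ("sloppy") access pattern there is.
static double ns_per_access(size_t n) {
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), size_t{0});
    std::shuffle(next.begin(), next.end(), std::mt19937{42});

    const size_t steps = 20'000'000;
    auto t0 = std::chrono::steady_clock::now();
    size_t i = 0;
    for (size_t s = 0; s < steps; ++s) i = next[i];   // dependent loads, no prefetching
    auto t1 = std::chrono::steady_clock::now();

    volatile size_t sink = i; (void)sink;             // keep the loop from being optimized away
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / steps;
}

int main() {
    // ~8 MB working set fits in a large L3; ~256 MB does not.
    printf("  8 MB working set: %.1f ns/access\n", ns_per_access(1u << 20));
    printf("256 MB working set: %.1f ns/access\n", ns_per_access(1u << 25));
    return 0;
}
```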
DLSS has one purpose: reassuring the buyer that they didn't just drop big money on a piece of shit card that can't even do 4k at a decent framerate natively and a monitor that is basically wasted.
>DLSS has one purpose: reassuring the buyer that they didn't just drop big money on a piece of shit card that can't even do 4k at a decent framerate
Who's gonna tell him?
>hey should I optimize this bit?
Nah, fuck it, we have DLSS
>hey, you should rewrite that in a more efficient way
Nah, fuck it, we got cache out the ass
>hey, that takes a lot of memory, let's get it optimized
Nah, fuck it, RAM is cheap.
It's literally the same thing. Good devs will optimize and DLSS will be used to up the quality while shitty ones will use it as a crutch to average 60 fps.
the literal definition of "you get what you pay for"
Drivers take a few months to make their GPUs look good, and ray tracing sucks compared to Nvidia.
let me guess, you need more?
Yes, yes I do.
the mindbroken pajeets are here, pack it up
Thoughts on Powercolor Red Devil & Hellhound 7900XT? Why is the Red Devil more expensive even though they seem to be the same cards?
why do people still say AMD drivers suck? they've been fixed and are good now. it's not an argument anymore.
They used to be really bad years ago to the point where it became a meme. The meme lives.
They were still bad with the 5000 series, with the notorious monitor black-screen issues. I even returned my 5700 XT for a refund and bought a used GTX 1080 as a replacement back then.
But since the 6000 series, it's been fixed and works fine.
the 5000 series was an entirely new architecture, and some of it was definitely growing pains from moving on from GCN. I had a Vega and had zero driver issues during that period.
By years you mean literally last year?
Baldur's Gate 3 has an incompatibility with recent AMD drivers and may corrupt them to the point of needing to reinstall them.
???
I played Baldurs Gate 3 with latest AMD driver for over 30 hours, no issues.
6700 XT, High 1440p, constant 100+ fps.
I maybe had the game crash/lock up once in 30+ hours of play.
>AMDfags vs Nvidiafags
You need to run linux
Radeon is like driving a Toyota Corolla. It's your best bang for your buck. The value of the RX 580 will never be beat for the rest of PC gaming history. It was ridiculous.
It will get you there, but it's not the best of the best.
Except the Toyota is reliable
power efficiency not quite as good as nvidia's.
that's about it though. performance is generally just as good, prices are better, and the company is a lot less gnomish
Stutter and lag.
No Nvidia Inspector
gamers goys, as a 4090 Chad, it is concerning to see the latest massive release has shit drivers exclusively for Nvidia. Word on the street is that Nvidia has made their driver optimization team 3rd fiddle with limited resources. A 7900XTX is looking more enticing.
It burns up.