Are AMD GPU's really that bad?
What are the best graphics cards/processors for upgrading a pc?
pentium 3 and voodoo 2
Riva tnt shits on the voodoo
The ones that offer a sizeable performance improvement without needing a new motherboard or power supply.
You cannot run Stable Diffusion on a 6800 XT on Windows: AMD's GPU compute ecosystem, ROCm, is currently Linux-only. I get about 2.43 iterations a second on my RX 6800, which is similar to a 3060, and the 6800 XT doesn't do much better. Radeon cards aren't especially good for anything beyond gaming, save for mining when that was still profitable.
I'll be honest, most I do on my 1080 is game, save for the occasional SD render or video I make. And i'd assume that's the same for a lot of people.. so would I have to run stable diffusion in a linux VM for AMD?
>would I have to run stable diffusion in a linux VM for AMD?
That would be a solution, but Hyper-V and most hypervisors you can run within Windows can't give a VM full control of a GPU that Windows is already using. If you have integrated graphics or another graphics card, you could use that for video output to work around the issue.
>I get about 2.43 iterations a second on my RX 6800
on linux? damn so i have to run linux to try out AI shit? i was considering dual booting anyway but it's still a bit of a pain in the ass. haven't touched linux in a while.
>damn so i have to run linux to try out AI shit?
Only if you want to use your radeon GPU. Pytorch, Tensorflow and all AI frameworks work perfectly fine on CPUs, it's just not as fast.
>haven't touched linux in a while
Install either Fedora or anything Arch-based. In my experience they're the easiest way to get ROCm up and running, since everything is already in their repositories.
My number on Windows would be nothing, since it doesn't run there at all. To give you more details, I'm running Fedora 37 and Python 3.10, and basically using the sample code from Stable Diffusion's huggingface page. I haven't tried any of the fancy GUIs available.
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler
from sys import argv

model_id = "stabilityai/stable-diffusion-2-1"
# Use the DPMSolverMultistepScheduler (DPM-Solver++) scheduler here instead
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")  # ROCm builds of PyTorch still expose the GPU as "cuda"
prompt = argv[1]
image = pipe(prompt).images[0]
image.save("output.png")
>Pytorch, Tensorflow and all AI frameworks work perfectly fine on CPUs, it's just not as fast.
how much slower are we talking? or to be more specific, how do you think a 5700x will perform? more cores is better i assume?
and RAM will be relevant instead of VRAM?
Generating an image in stable diffusion, from prompt to final output takes around 30 seconds on my rx 6800.
Doing the same on my laptop, with an R5 5600U takes about 10 minutes.
Given the two extra cores and higher clockspeed, you should get closer to half that, and of course it will use RAM like anything else running on your CPU.
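The "closer to half" guess above is just linear scaling. A quick sketch of that back-of-envelope math; the boost clocks and the 600-second laptop baseline are rough assumptions pulled from the posts above, not measurements:

```python
# Back-of-envelope: assume SD throughput on a CPU scales linearly with
# cores * clock speed. Crude, but fine for a ballpark estimate.
def estimate_seconds(base_seconds, base_cores, base_ghz, cores, ghz):
    return base_seconds * (base_cores * base_ghz) / (cores * ghz)

# Laptop R5 5600U: 6 cores, ~4.0 GHz boost, ~600 s per image (anon's number).
# Desktop 5700X: 8 cores, ~4.6 GHz boost (both clocks are rough guesses).
est = estimate_seconds(600, 6, 4.0, 8, 4.6)
print(f"~{est:.0f} s per image")  # ~391 s, so roughly 6-7 minutes
```

Still an order of magnitude slower than the 30 seconds the RX 6800 manages.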
I get 6 it/s on a 6700 XT on Linux, I assume your numbers are Windows?
No it's what consoles have been using for decades
Not really strengthening your case there, bucko.
they're good at the poorfag pricerange. if youre paying 1k+ for a gpu anyway you're a retard if you pick AMD though.
This. Better performance/price than Nvidia on the low end
My 3060 12gb was $240 what would have been better?
That's a great price for a 3060. They've been consistently around $350 for months, while the RX 6600 is $225 and has been as low as $190. It's not even really a choice when the price gap is that big.
>if youre paying 1k+ for a gpu you're a retard
Feels good having disposable income and not being a fuckup
Just because you have disposable income to buy a pile of shit doesn't change the fact that you bought a pile of shit
yeah but you’re a poor gay who cant buy a pile of shit and are mad about it
I'd rather have the 7900 XT than the 4070Ti (way more VRAM), but the 4080 is better than the 7900 XTX imo (you won't need more than 16GB for quite a while). The 4090 is untouchable.
For anything below the $800 price range, AMD is better and cheaper.
I bet you are one of the retards that fell for the RX 480 8GB meme.
But 2 games currently at 1440p or higher use more than 12GB and more will come because games are no longer optimized
two games being hogwarts and warhammer something
nvidia shill thinks he's slick lol
He's right about what people's perceptions are. If you have to choose between a $1,000 7900XTX and a $1,200 4080 the $200 becomes less meaningful in that price range. People are spending top dollar and they want top results for those kinds of cards. AMD has made good improvements since RX 5000 in their GPU division, but they have a deserved reputational hit for a number of reasons. The OpenGL driver issue took them 10 years too long to address. That was performance left on the table for generations. Even this gen, they're supposed to be the efficiency company and both of the 7900s launched with idle power consumption of ~100W. They need a couple years of good performance out of their driver support to turn that perception around. They also need to forget this strategy of pricing their flagship cards $100-$200 below Nvidia's. If they want to gain market share then they're going to need to try and really undercut Nvidia.
No, as long as you stick to the previous gen (which you should be doing for any graphics cards).
AMD is the best on GNU/Linux.
best value for buck. If you are not buying 4090 i suggest buying amd
Nobody on this board should be buying a 4090
and how were you planning on stopping me?
by collapsing the market
>Nobody on this board should be buying a 4090
>most garden gnome'd GPU on the market
not my problem
Point still stands
6700XT is all you need.
this, get yourself an ASrock or Sapphire 6700xt for less than 400 bucks and you're good to go for [email protected] fps
I just bought an XFX 6700xt, did I fuck up?
no, that's good
reviews on both amazon and newegg say they have some coil whine at 75c+ with those cards
usually shit and also reported coil whine, some cheap features like adding in bronze tubes instead of proper heatsinks
Well mine stays cool and doesn't whine so I dunno what to tell you.
which version is it? there is 3 different variants of each card they have
>bronze tubes instead of proper heatsinks
You can't say these things without proof moron
My source is that I made it the fuck up
Powercolor is for poor chinx and street shitters
Over the years I have owned multiple AMD GPUs from different partners. Every single one of them had coil whine at some temperature. It's like the brand itself is cursed.
Only GPU I've ever had with noticeable coil whine was a Zotac GTX 1060 3GB.
I've had one for a while and haven't had any problems (besides fitting it in my case).
Nah it's good.
>reviews on both amazon and newegg say they have some coil whine at 75c+ with those cards
How do you even reach that? With one side case fan mine doesn't even go over 65c in FURMARK let alone actual vidya. But it is the mid tier variant with the better cooler.
My powercolor red devil runs cool as a cucumber. Haven't seen it hit 60c once in my meshify 2 compact.
In my experience they have some weird performance issues on the desktop, like stuttering when watching videos in Firefox or resizing windows. Kinda similar to the NVIDIA experience on Linux. Other than that they're fine.
Isn't that a common issue with MPO? They actually fixed that a month or so ago, and it's more a Windows issue than directly an AMD issue.
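For reference, the MPO workaround that was circulating is a registry toggle. This is the commonly posted snippet; I'm going from memory, so double-check the key before applying it:

```reg
Windows Registry Editor Version 5.00

; Disables multiplane overlay (MPO) via DWM's overlay test mode.
; Delete the value again to restore the default behaviour.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
"OverlayTestMode"=dword:00000005
```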
Just tried turning it off and it didn't change anything. When I watch a 4K YouTube video in Firefox it stutters really badly even though it claims it's not dropping any frames. I remember not having this problem when I was using NVIDIA.
>When I watch a 4K YouTube video in Firefox it stutters really badly
Hardware acceleration bug strikes again.
>in my experience as an incompetent retard that can barely operate a computer,
Are you going to finish the sentence? We'd love to hear your thoughts on this topic.
I've used 4 amd gpus and one nvidia. Only one amd gpu fucked up on me. My 6700xt is currently in transit to asus. Not even six fucking months and it shit the bed. Anyway, you'll find plenty of people bitching about one company or the other because their products stopped working. Keep in mind that people with bad experiences are more likely to be vocal about it. So for every bad thing you read, there are thousands of people who didn't have that experience
Why did you buy asus?
It was the cheapest 6700xt at the time. $399.99
So Asus is now considered a shit brand? What happened?
ASUS has always been shit, just less shit than other brands. Their laptops blow caps and resistors all the time.
I've had a budget laptop (~530€) from Asus since 2015 and it still works, although after a recent drop the display has a weird pink line on it (1 pixel wide). My old 970 DirectCU II is also still kicking, and so is my old Asus motherboard from 2013 in my media PC. Weird to hear people are considering it bad.
nta but their customer service is shit
if you're on AMD you should always pick sapphire if you can.
>had 2 sapphire nitro+ GPUs
>both coil whine
Fuck you demon
everything fucking coil whines these days. my pny 1660ti coil whined, so did my 2060 super from msi, my 3060 ti from asus and now my 3070 from evga - they all fucking coil whine. so does the ps5, and nowadays apparently even motherboards coil whine too. picking a different brand isnt going to save you from it.
Not good, not terrible. That's Asus. I bought their motherboards for Intel until I got fed up with lanes being cut off or diverted if I populated the PCI-E or M.2 slots.
Their Nvidia cards are usually good; their TUF models were among the best AND the cheapest for Ampere.
As for AMD, all I remember is their awful RX 480 model. They re-used some old design or something, and half the heatpipes didn't even touch the die. Needless to say, it was shit.
Hey retard all the AIBs are meaningless brands with a revolving door of engineers on top of a few chinese PCB mega manufacturers
you think there's some fucking artisan craftsmanship going on in shenzen?
It's just a general question, why are you so worked up, gay? Again, why buy Asus? State reasons, pros, cons, etc.
the brands are completely fucking meaningless so the answer should be "the benchmarks showed it was competently designed (like all but one or two models of card every release wave)" and/or "it was the one that was on sale"
So why are NVIDIA cards so expensive compared to AMD?
don't worry anon, nowadays AMD cards are retardedly expensive too.
pic related and brand, but also it has more features. Don't worry, AMD is right behind them in terms of price: the 7900 XTX is $1k and the 7900 XT $850.
again this bot
how are the new amd cards with encoding? is that av1 any good? how much performance loss when streaming?
i gotta stream some sf6 lobbies for fgg soon
I don't care about rt but if they fix the vr performance I'll buy one of their cards immediately
nvidia has better software / features despite still being iffy as fuck. amd drivers and software have always been a clusterfuck and outright inferior. fsr is absolute ass compared to dlss.
>Buy rx570 for €120 a couple years ago
>Can't even get the same performance for that price these days
Kinda crazy how GPU progress doesn't translate into prices any more.
Because it doesn't. They think they can get away with pushing the buttcoin bubble using AI as the new excuse.
Thank god all the big studios have completely fallen off a cliff when it comes to quality or I might have been tempted to upgrade.
Those scores for the RX 6800 through to RX 6950 are way too fucking high, and that's coming from someone who owns a RX 6800 XT
It's still funny to me how much better the Vega cards aged compared to the 1070, 1070 Ti, and 1080.
Where's the 1080 Ti?
AMD cards generally age better because they have to keep writing drivers for their older architectures because that's what the consoles have. that said Nvidia has gotten better about it, Kepler went to shit in like 4 years but Maxwell from 2015 is still completely functional.
>because they have to keep writing drivers for their older architectures because that's what the consoles have.
AMD cards "age better" because it takes several years for their drivers to not be dogshit and get the performance you should be getting on day 1.
when I buy a gpu I want it to be good NOW, not 6 years after launch
I've never had a gpu for more than 4 years and I never will
that fine wine shit is cope
AMD has better software / features despite still being iffy as fuck. Nvidia drivers and software lately have been a clusterfuck and outright inferior. DLSS is absolute ass and unsupported versus FSR/RSR which works for literally every fucking game Ganker objectively plays compared to DLSS.
show me the framerate gains between dlss and fsr so we can all laugh at you
I don't think I've had a single driver issue since I switched to NVIDIA in 2016.
>Nvidia drivers and software lately have been a clusterfuck
FSR vs DLSS. You tell me.
DLSS looks worse.
You're retarded right? Just look at the car at the very bottom of the screen in FSR vs DLSS
Stfu dumb moron DLSS has obvious ghosting and none of the others do. Ghosting is what kills PC games especially driving and FPS.
i think current dlss version is at 3.1.2 or something.
>using still pictures instead of videos
nobody's ever going to pay for your filthy cards
stfu fucking gay. DLSS is crap. I know what I see as I own two PCs one with AMD and Nvidia. Stop gobbling ghosting artifacts. If you can't see it then you're a fucking jit.
No one is going to buy a $1000 card to use DLSS or FSR, that's fucking retarded, they both look like shit.
i said version 3.1.2, not dlss3. its confusing i know. the 3.1.2 is the current version number of the basic dlss, without frame generation.
its hilarious how bad this is but since nvidia is so big not a single so called "enthusiast" site will ever call them out on it
not only are you wrong, but don't move the goalpost; the point of DLSS is to give me more frames, show me the frame gains from FSR you disingenuous moron
This retard is baiting and you guys are falling for it.
How? It's clearly superior. It's much sharper and has far superior fine detail definition, FSR is very blurry. Look at the white text on that light blue banner. The text is super clear and easy to read with DLSS while with FSR it's mush.
Holy fuck that ghosting is out of this world. Might as well just crank the resolution down
Look at the jaggies on FSR. Also DLSS used to have a lot of ghosting but from 3.4 and upwards they have been actively trying to fix that. Your screenshot is most likely a cherrypick as usual since you're showing version 2.4
is 7900XTX really on par with a 4080? it's 500€ cheaper
its slightly better
for production Nvidia gpus are better
for pure gaming AMD gpus are better
Nvidia charge more because of brand name and marketshare and raytracing nobody uses
Ironically in my country there are 4080's available for cheaper than the 7900XTX which kind of blows my mind. I guess I'm an amd fanboy but I'd take a cheaper 4080 any day over the XTX.
I've been using mostly amd gpus since the HD 5850, not had any problems.
my 7900xtx seems alright
h264 decoding has issues though
how can ray tracing be that performance heavy? is it this a a hardware or optimization issue?
it really is that heavy when you fully path trace a game.
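To put rough numbers on why, here's an illustrative ray-count estimate. Every figure in it is an assumption for the sake of the arithmetic, not from any actual engine:

```python
# Illustrative only: rough count of ray-scene intersection tests per second
# for naive path tracing at 4K. Real games trace far fewer rays and lean on
# denoisers and upscalers, which is exactly why DLSS gets pushed so hard.
width, height = 3840, 2160   # 4K frame
rays_per_pixel = 2           # samples per pixel (very low for path tracing)
bounces = 3                  # max ray depth
fps = 60

rays_per_second = width * height * rays_per_pixel * bounces * fps
print(f"~{rays_per_second / 1e9:.0f} billion rays/s")  # ~3 billion
```

And each of those rays is an incoherent memory access into the scene's acceleration structure, which is why it hammers hardware so much harder than rasterization.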
RT destroys your VRAM Capacity and VRAM Bandwidth. Nvidia is forcing DLSS on people to cover up the fact that RT is a bloated mess and they are also shipping GPUs with less VRAM to save themselves money without passing the savings to the consumer.
i doubt thats real vram usage, nogwarts runs 1440p60 with ultra textures on my 3060ti which only has 8gb.
No it doesn't
It's pretty close to actual usage, at 4K with RT enabled the 3080 (10GB) performs worse than a 3060 (12GB) and gets demolished by the 6800 XT (16GB).
And by "demolished" I mean like 30 FPS vs. 6 FPS, so it's arguably unplayable on both cards.
AMD is probably going to launch its midrange RDNA3 cards in June, according to believable rumors. Did you fuck up? Maybe, it depends on the price.
>Maybe, it depends on the price
I spent $370 for what it's worth
I meant how much their next-gen cards will cost. Like, let's take a hypothetical 7700 XT for example:
>30% faster than the 6700 XT
>Significantly better raytracing
If they sell this for $399, I'd say you fucked up. But it'll be $499 or more in my opinion, so I doubt you'll regret your purchase.
That benchmark is wrong: come on, in what world does the 7900 XTX perform worse than a 3060? The reviewer even admitted he fucked up on the TechPowerUp forums, but they never bothered to retest and update the graphs. Pic related is the correct benchmark. My memory was a bit off, it was 14 FPS for the 6800 XT and 4 for the 3080: still, completely unplayable on both. But as you can see, the 3060 performs better because it has a bit more VRAM.
Basically, the only card that can run this game at these absurd settings is the 4090.
Actually, I bought a 580 4GB for €130 at the end of 2018: it was supposed to be a temporary card, after my old 680 died. I was planning on getting a 3080 or 6800 XT + 1440p monitor upgrade, but then mining happened so I'm still using that same 580. In hindsight, I should've bought the 8GB version lmao.
it was a meme on my 3090 and continues to be a meme
imagine buying a 4090 and being fine with 48fps lows and a 61fps average
show the gpu temps.
Those midrange cards better have more than 8gb of vram. Then I might consider them. The 4060 will only have 8gb, so if the 7600 has 12 or more I'm gonna consider it very much. 12 gigs should be the minimum.
>at 4K with RT enabled the 3080 (10GB) performs worse than a 3060 (12GB)
where the hell do you get that stuff anon? do you just make it up?
Let me guess, you need more?
I want less
Looking at charts like this makes me so glad I'm not retarded
>make a game run like shit with a press of a button, only for $1999.99!
Yes goy just buy Nvidia
gtx 1060 6gb/ amd ryzen 5 1600 here what to upgrade if i want a good 1080p60fps gaming experience?
ryzen 5 5600 + radeon 6700xt
should be around 500 bux in total
danke herr anon
I have that card, it's made for 2k.
so? not like it stops working if you connect it to a 1080p display.
Might as well just buy a 4090
I had to flash the bios on my GPU to get it to work. What do you think.
the drivers are shit and that's a fact.
if they were, then why are none of the major tech guys like Gamers Nexus saying it today in their AMD card reviews? They go through like 30 minutes of video talking about the GPU inside and out and not once do they mention drivers.
Considering a 6800XT to upgrade from my 1080ti, i'd get a 3080ti but those are like $1,200. I might go team red after 10 years of green.
Anyone know if I can do stable diffusion on windows with a 6800XT? SD is about the only thing id miss from nvidia tbh.
6000 series prices are coming down fast. keep waiting a while longer and you'll get a 6950 xt for 450
>I can do stable diffusion on windows with a 6800XT
You can, but it's a pain in the ass and it's slow. If SD is a must then just stick with nvidia.
Idk, I just got a 6500 and got an excellent score for Blue Protocol, works fine for me. NVIDIA however, is absolute shit
I'm upgrading to a GTX 970
>New architecture AMD (5000/7000 Series)
you are the beta tester
>second generation / refresh of new architecture (6000 Series)
no, only shills and mentally ill people will say otherwise
Nvidia if you are actually doing lots of rendering besides gaming but since majority of people just play games and watch youtube AMD is better for price/performance
im using an amd FX 9370 but this week i got a new 5600G with some new DDR4. i hope it will help me with my emulation, i can play most games without any problem but sometimes i get some lag.
i hope this is good
FX series were a total joke, even budget pentiums were better for emulation (G2358) when they were around.
FX had the same fate as xeons, bad single core good multicore performance
now that games use multi-core by default, an fx or a xeon is a better gaming cpu than the mainline parts of that era
>i got a new 5600G with some new DDR4, i hope it will help me with my emulation
You'll be fine on that budget build. Just don't expect anything higher than 60fps at 1080p even during emulation.
why would someone ever need more than that for anything ever
It's 2023. In 2013 Ganker was hoping for 4K 144FPS NATIVE as standard, but here we are still stuck with 1080p as the default standard on native resolutions because Ganker missed one thing: how shitty developers became at optimization.
i have a 5700xt bros, and I must c-consoom. Should I upgrade or wait a generation or two?
Rx 5700 here, will upgrade maybe at the end of the year If I find a good deal or next year. Using a C2 42 LG and play most games in 1440p ultra or 4k for smaller budget games or older games.
But to be 100% honest with you, I feel like I just want to upgrade for the sake of it rather than because of something I really need. It's also your case: a 5700 XT will still do the job for 90% of games, only a handful will make it kneel.
Anyway I bought my gpu at the end of 2019, 4 years of use seems fair for an upgrade.
>people complaining about games not working right or having graphic problems
>90% of the time they're using AMD
Shit on Nvidia all you want, at least their shit actually works.
the driver issue hasn't been a thing since opengl fucking died. nvidia customized their drivers while amd followed the opengl recommendations.
>since opengl fucking died
AMD caught up with opengl on windows in last year's summer update.
oh so even more of a non-issue for minecraft, the last relevant opengl game
Don't worry AMD bros. AMD is paying Linus millions of dollars and all of a sudden he became pro AMD just like that. And just like that in a few years people will stop shitting on AMD GPU's.
Youtube was a mistake. Those goddamn thumbnails are nauseating. It's a shame, some of their videos are fun like them building a pc out of parts all bought on wish and aliexpress
I mean Linus himself said that having those stupid face thumbnail increases the views as much as 20%. That's why everyone does it.
I know it works and the clickbait titles too but I still fucking hate it. I stopped watching his videos because of it
>shilling is never okay
>except when it's for Nvidia
Damn it feels good to be an AMD chad. No stutter. No driver issues. Pure gaming bliss better than consoles.
>Driver casually corrupts your OS
Are you running windows by any chance?
>Hurdygurdy I love programming everything I do hurdygurdy i love it when nothing works out of the box
>hurdygurdy i love the fact that only 10% out of 50000 games on steam work on my os!
>hurdygurdy no softwares work for me but thats great
>hurdygurdy amd is poorly supported by anything linux but i love it
why are you so mad about someone not using windows? is moonlighting on your shilling career for nvidia allowed?
id use linux if it was literally plug and play like windows is and you dont have to fuck with shit to get things running and the gui wasnt so archaic
1. support all games with no downsides
2. support all softwares with no fuss or downsides
3. actually works and isnt just fork #823749873 of something else
works on mine too, but windows works better
even my clean install of Windows 10 that was meant for a singular game likes to ~~*rescan*~~ the drive when I rarely boot into it
not that I ever use it anyway
i dislike Windows and will never go back, especially when considering amdgpu/mesa on Linux is ahead of the drivers on Windows for the 7900xtx
works on my machine
sounds like user error
clean install so its the product that is sold to users
clean install works perfectly fine on my machines
try taking your hardware to the geek squad if you can't figure it out on your own
idk why you feel a need to defend a company in your free time when their product clearly has issues
and that's not even getting started on the ones involving privacy 🙂
I dunno what to tell ya man, some people just can't handle easy shit like hardware. People like you are why prebuilts are a thing.
the hardware works fine on Linux
Why is it that """linux professionals""" always seem to have so many issues running the most user friendly OS on the market? How are they so inept?
anon you're at it again trying to needlessly protect Windows/Microsoft for something that isn't my problem
the hardware is fine and works elsewhere
I'm getting second-hand embarrassment from this encounter with (you)
>it's not my problem
Is that not why you're seething about the easiest OS on the market in this thread?
but I don't use Windows and simply said the drivers for my gpu are better on Linux
you're the one seething over this fact
you can't stand the fact that someone isn't using Windows for some reason
its pretty strange
I'm just glad there's an OS out there for those who are too incompetent to use Windows.
man you really don't know when to stop do you, anon
>too incompetent to use Windows.
Lol anon I....
I didn't believe they existed until I met people from Ganker
>too incompetent to use Windows.
Anon pls, Windows is baby duck zone.
Exactly, which is why it's insane that we share a board with people too inept to use it.
I've used operating systems across all brands, and it usually comes down to user retardation when something goes wrong, be it Windows, Linux, Solaris, BSD, OSX. RTFM or the documentation, you slobs, is it so hard? And don't use hardware that isn't supported or is badly supported.
Try using it before talking shit. Unless you need specific software for work like adobe shit, linux is 100% usable. Popular desktop environments like KDE are made to be similar to windows. Enough that a windows user will be able to navigate the os with little issue. With Steam's proton, most games will work flawlessly sans anything that uses anti cheat
do you need two different os for it to work like windows
no wonder linux will never grow or become the standard
here's a question for ya linux bro. how is linux at things like switching monitors/sound devices on the fly?
for my setup i switch between dual monitors OR a tv. (work vs gaming, basically)
when using monitors audio is via optical (2.0)
when using the TV over HDMI audio is 5.1
this is partly due to how windows handles disconnected devices, there's no way to get windows to send an audio signal over HDMI without having the display connected (having a ghost window or some other hacky kludgy solution is just not worth the hassle. nor is dealing with hacked realtek drivers to get shitty dolby live/dts to work over optical)
right now i'm using displayfusion to handle this, one keyboard combo sets the audio and display devices as needed. i'd like to switch to linux and just be done with MS's cunt behavior, but it's the little things like this that always seem to be an unbelievably huge pain in the dick w/ linux.
Couldn't tell you, I'm still fairly new to linux. Been using it for a year. I use headphones and cheap speakers. I can easily switch between the two similar to windows on the task bar. Monitors are more complicated because you're either using xorg, which doesn't like multiple displays, or wayland, which has better support but doesn't play well with nvidia gpus.
I'd suggest researching recommended beginner distros like mint and fucking around with a liveboot so you can test the setup without needing to install the os
not him but ask in the linux threads or steam deck when they show up
most users of linux gaming have less than a year of usage thanks to the renewed interest from Valve
yeah, good call.
it's mostly hypothetical, i'm fine with W10 for now. the gaming thing is the only thing keeping me tethered to windows (oh and i guess now with work, visual studio. god i miss apache and mysql.). but i know they're going to push me to linux eventually due to even more anticonsumer bullshit.
yeah, if you use gamestream for LAN streaming, you can avoid it for the time being by blocking the webhelper executable via windows firewall, as well as some hosts file entries. but eventually they'll sneak one past the goalie and fuck you (probably a video card driver installation will remove geforce experience and then there you go.)
i really, really, REALLY dislike software vendors doing this shit, if it works, and costs them nothing to keep it, why fuck your users like this? hell, at least open source it so other people can carry the torch. it's evil and anti consumer
never buying another nvidia product again.
Gaming becoming viable for linux over the past year or so is what made me move over. I still dual boot since having two operating systems separating work vs. play has been a godsend for productivity and focus.
yeah, last time i played around with linux (steam, proton) back in 2017 or so, it was like 95% of the way there. Linux on the desktop seems to perpetually be at the horizon of 'good enough!'
maybe one day it'll be good for non standard (surround/multi monitor/hdr/controllers etc) setups. for single monitor, stereo sound, mouse/kb i'm sure it's great as is.
that wasnt a driver issue that was a windows issue. windows update would install its own drivers while you were installing yours. obviously this is a bad idea but microsoft pajeets allowed it.
>It was microsofts fault!!
the driver literally tells you that windows is bricking your pc and that you should stop being such a fuck up with your own pc. its not the driver's or amd's fault, its microsoft's and the user's fault for being dumb enough to not set the group policy.
never had this problem with nvidia.
Never happened with Nvidia. Maybe the pajeets at AMD can have a talk with the windows pajeets and sort it out.
*windows automatically removes your driver*
sorry chud guess you arent playing games today
That doesnt remove anything though. Try again baitbro
i did a clean install of w11 when it first came out and never had any driver issues for anything in my pc
forget that, shit like webm related is funnier
Windows was bad before but dear god they shit the god damn bed since they went with the always updating model. Every month something is broken and its forced on everyone. Even when you reinstall your PC, the ISO you grab is automatically updated before you install it.
Something is wrong with your pc then because i've have zero issues with windows for years, nothing ever breaks or crashes
google MPO Windows
dont tell me there are zero issues
If you were foolish enough to use 3rd party disable updoot software instead of rummaging with the group policy editor, you will run into problems sometime down the line guaranteed while also being susceptible to bad updates like defender jailing your chrome browser back in september 2022.
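For anyone who wants the group-policy route: the usual method is the "Do not include drivers with Windows Updates" policy under Computer Configuration > Administrative Templates > Windows Components > Windows Update, which corresponds to this registry value (from memory, verify before applying):

```reg
Windows Registry Editor Version 5.00

; Equivalent of gpedit's "Do not include drivers with Windows Updates".
; Stops Windows Update from replacing your GPU driver mid-install.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
"ExcludeWUDriversInQualityUpdate"=dword:00000001
```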
>only happens with ayyyymd
Listen buddy, you're not getting paid. It's too obvious.
>windows update would install its own drivers while you were installing yours.
Happens with AMD and Nvidia, I call it the update driver loop.
I just got a 6800 a month ago and to this day I still don't know how to check the fans. They just don't spin, and the temp is only around 55-60C when playing vidya. My friend said it's normal for them not to spin at that temp, so I haven't done anything about it.
If you're on windows, I believe the driver software lets you set a fan curve. Otherwise, you can use something like MSI Afterburner to set custom fan curves
right click desktop, go to amd panel, click on performance tab, your temps and fan controls should be there. by default the fans dont spin on amd cards until they hit a certain temp so its normal to boot up your pc and see the fans are not working.
That's the default setting for AMD on all operating systems. You can edit a fan curve on windows IIRC but I don't think it's necessary.
Remember when PC Gaming was simple?
Nice try Satan, but you can't trick me
you dont like coil whine anon?
they're fine unless you spend all day looking at graphs
and I suppose with certain rendering applications and stuff nvidia tends to be better
I'd say that their software suite is a bit more fleshed out compared to nvidia's too
like I used to have AMD and I always wondered why people said that you needed msi afterburner
then I got an nvidia and I wondered why the driver software was so barebones
"oh right that's why people use msi afterburner"
Nvidia removing gamestream in favor of cloud/monthly payment is infuriating
>the feature works so well that it's cannibalizing our paid option (local LAN streaming vs streaming over the internet)
>i know, we'll deprecate it and remove it via an update
>but won't they complain?
>no, we'll just tell them it's to "improve their experience"
>oh right, works every time.
yeah, fuck those cunts. jenson if you can read this, i hope someone beats you to death with a hammer.
Sunshine seems to work well enough so far for hosting; I was about to snag a 4080, but yeah fuck them. 7900 XTX it is.
First time hearing about this, that's actually fucked
They removed it from their Shield shit, they didn't remove the feature from the graphics cards. You can still use Moonlight on Android anyway, I assume that works on Shield devices but I don't have one of those.
anon, they are removing it from geforce experience, period -- meaning moonlight (the client) wouldn't be able to connect to the host.
Sunshine so far seems to be a good enough replacement, slightly more cumbersome to set up, but not a deal breaker (and luckily it works with AMD cards)
> are removing
Wasn't it removed since last month?
i have no idea if they went through with it on the host side since i put geforce experience in a box and don't let it reach out to the mothership for an update to be gimped by the cunts at nvidia. I have gamestream disabled ATM so that i can fuck around with Sunshine.
so there's no confusion, i don't own a shield. i use moonlight on a laptop in my living room to stream from my gaming PC. The host uses geforce experience's gamestream, moonlight utilizes this stream (instead of a nvidia shield, it's a simple laptop). Nvidia is removing gamestream from geforce experience, nerfing in-home/ LAN streaming (ostensibly to prop up their cloud gaming gayry)
>i have no idea if they went through with it on the host side
They didn't remove anything
Anyway, how good is Sunshine? Does it actually do all-hardware capture + encode like GFE does or is it another shitty solution with software capture? Software capture solutions tend to suck at high res. Steam suffers from this for instance.
>GTX 1060 6GB
And I've had a consistently better experience with the AMD cards. I also like Adrenalin quite a lot. But then again that's only anecdotal evidence that doesn't amount to much, and I'm moderately convinced that my three Nvidia cards had been cursed by an african warlord or some shit
For me it was
Only the 6700xt gave me problems but it's under warranty and currently being rma'd. The 6700xt was a beast when it was still working
That was the first GPU I bought with my own money, before then all I had was the 8600 GT my dad got me in high school and an old shitty FX 5000 series GPU.
>first GPU was hd5770
First GPU in my own PC in 2010, yeah. Before that, it was whatever was in my dad or my brother's PCs.
My 6700 non XT's been working like an absolute charm, pretty much curbstomping everything at 1920x1200. It doesn't UV as well as my 5600XT did, but -50mv +150mhz core +100mhz vram is still nice to have.
For me it was
>Radeon 9250 (parents old PC)
>GTX 1060 3GB
they are worse
They are bad for working on animation, and by bad I mean that they take a few seconds longer than nvidia gpus... barely any difference
I once bought an AMD Gpu and the next day found out trannies exist.
AMD is for europoors and other third-worlders.
If you are in USA, are over 18 years old and you don't buy Nvidia you pretty much failed at life.
The problem with AMD is that they have no drivers. Generally a GPU needs good drivers to run a video game.
Rate my AMD PC. I sold my Nvidia card because it was crashing too much and the fans were loud asf.
What am I supposed to rate? Looks like any other generic PC build with rgb lights.
Thanks, I hate it
Get a solid case or solid panel accessory from fractal directly so you never have to look at it. You're supposed to use your PC, not take pictures of it.
Why are the current XFX GPUs so sexy?
Will be replacing the old ass CPU/mobo soon.
Gpus are fine
The drivers suck ass
If you are planning to use neither AI applications, video editing nor raytracing, the 7900 XTX has the best value. If you are planning to use any of those, even the RTX 4070 Ti for 300 bucks less offers similar performance.
So in short: if you plan to only game without raytracing, the 7900 XTX is your best choice. Else go RTX, as either the performance is better than AMD's or nvidia is way cheaper.
AMD is trash plain and simple
if you don't have money problems and is actually serious about their applications/games, nvidia is the only option
i'm using a GTX 1080 from like 2018, if i upgraded to a 7900 XTX, what would I be in for? (i dream of being able to play newer games at 4k without dropping quality to potato.)
i don't see the point in buying a mid-range card just to be in a position to buy yet another video card in a couple of years. i'd rather spend more up front but be set for quite a number of years (i.e. the GTX 1080)
AMD's features have been way better than Nvidia's since I upgraded my computer. HDR works way better, same with the DLSS equivalent and more. I checked reviews too and they say the same thing. Believe the people who actually benchmarked the stuff with data, and the guys who own the hardware, over random anonymous Nvidia fanboys.
>Smart access memory
>is literally just rebar
How are the open source drivers on Linux? My future build in a few years will be full AMD so i am curious.
It just works out of the box on a fresh install. If you've been using nvidia you might need to reinstall mesa.
Never bought a video card before
Is it ok to buy a used one or is it better to get it completely new?
How long does one last
If you had asked before the mining craze it'd be alright. Afterwards however, it's a pretty hard gamble.
>Is it ok to buy a used one
It's usually fine, but you have no idea what that card was used for, and how. This story made headlines a few months ago: https://videocardz.com/newz/radeon-gpu-cracking-not-caused-by-drivers-storing-conditions-and-cryptomining-to-blame
Imo, unless you're getting a significantly better deal, don't risk it and buy new.
>How long does one last
Dunno, forever? Video cards can die, but it's pretty rare. They usually outlive their actual usefulness. All I can say on this topic is, high-end cards tend to break much more frequently than low-end ones.
High-end cards have a much bigger GPU die = It expands (and then contracts) more with heat = Much more likely to break the solder underneath.
High-end cards also have more memory dies = If one of them breaks, RIP card.
Much more lacking in drivers, which is a pain in the ass. It is still worth considering in the end though. Nvidia and scalpers get autisticly gnomish over even GTX shit over 5 years old.
Nah, usually what the issue ends up being some proprietary bullshit
The actual cards perform fine.
No, they're not that bad in absolute terms, but I don't think they're good enough to be very attractive given their pricing. When a 7900XTX is $1000 I don't think saving $200 over a 4080 is that attractive when you miss out on better RT performance and features. Who wants to spend $1000 on a compromised product? If it were cheaper and had amazing bang/buck it would be a different matter, but it's not cheap.
Will never touch another AMD card after the 7900 XT. Constant issues from software to a wattage bug with multiple monitors
Decided to just go with a msrp tuf 4070 ti
>buying the beta test and not the refresh or the final end-of-line product
AMD sucks precisely because of this.
rx580 > rx480
r9 390 > r9 290
6700xt > 5700xt
5800x3d/5950x > first generation Ryzen
The GPUs are decent for the price. Still worse than Nvidia, but you're getting what you pay for. The CPUs have been and still are shit for most games. They're great for work though. Overall, AMD fulfills the role of "competition" in this incredibly inbred industry. Things could be a lot better, but no one has to work very hard when there are only two options.
you're completely clueless.
Ganker should I get the 3060ti or the 6700xt?
t. 3060 ti owner
What makes the 3060ti bad for you?
8gb of vram instead of 12. Sure it has marginally higher clock speeds, but the extra vram is worth more if you compare the benefits.
These mixed results are exactly why I really have no idea anymore with GPUs.
Right this moment, it's pretty easy: AyyMD is better for low-end budgeting, Nvidia is better for high end richfagging.
Outside of meme features like DLSS or AI shit a 6700XT will beat the similarly priced RTX 3060 any day.
But if you're gonna be spending buckets of cash on a GPU than AMD doesn't really have anything to go up against the RTX 40 series. Not right now, anyway.
I've got a 3060ti myself, my 1st Nvidia card. Gotta say DLSS is nice
I'd get a 6700 XT because it has more VRAM, but with the 3060Ti you get DLSS, way better performance in productivity and AI tasks, better video encoder for streaming.
Performance in videogames is more or less the same.
I really wish I could get the vram of the 6700xt and all the perks of the 3060ti for somewhere in the middle in terms of price, why is this so hard to ask for?
I went with a 6700xt Sapphire. Love it and no issues but that doesn't mean anything nowadays. I'll prolly swap to Nvidia next build.
You're going to end up using some version of FSR someday, because DLSS will be segmented per new generation of GPU; newer versions of DLSS will very likely not work on older generations of cards as the years go by
See: 1080ti users playing around on the latest FSR
The DLSS upscalers and all their updates work on all generations of cards which have ever supported DLSS to begin with. They don't work on Pascal because Pascal never had the hardware which DLSS uses to begin with. The only thing that doesn't work is frame generation, but that's something different entirely and they just crammed it under "DLSS" for the naming.
DLSS is basically Nvidia trying to find some way to use the enterprise hardware they leave on gamer chips despite being basically useless for games. They could easily code an FSR style upscaler that would run on raster hardware, they just don't because getting something they can sell out of byproduct is the point
DLSS doesn't actually run on tensor cores. Well, DLSS3.0 might, but nothing before that uses special hardware.
FSR is available for Nvidia too. DLSS is proprietary and limited to Nvidia.
I'm biased because I have a 3060 Ti. This thing is a fucking monster and runs pretty much everything I throw at it perfectly. Of course that might be because I only play at 1080p.
>3060ti at 1080p
Well if you're happy with it then that's great, that card will last you a long time at that resolution.
>that card will last you a long time at that resolution.
I learned that when I had FFXIV at max settings for 1080p. During an Alliance Raid with everyone going crazy with their shit, I checked and saw the fans weren't spinning, but I knew they were working fine because I opened a benchmark after and they started spinning. The game did fucking nothing to it lol.
I'm planning on getting a 6700xt for 1080p, so similar boat. I plan to literally never upgrade, I already care extremely little for AAA games and I'll be fine 1080p/120hzing the rest of my life.
unless you can get the 3060ti at msrp ($399), get the 6700xt (it almost matches the 3070 now), since at $369 it's a beast
I have one, its fine
But they attract linuxtards who are insufferable and annoying so when i upgrade next im going back to nvidia just to spite them
hate to break it to you but modern nvidia works fine on tuxOS
Anon, Nvidia was historically the go-to GPU on gnu/linux, their proprietary drivers were that good in comparison back then.
What a big load of bullshit, Nvidia has always been inferior on Linux. The Linux creator openly hates Nvidia because they treat them like second class citizens.
IDK how people can play with DLSS or FSR. In screenshots or compressed videos it looks okay, but in-person it just looks off compared to native.
I'm undecided on what GPU to get for 1440p. Everything I was looking at in the lower price range has 8GB. I've already run into issues with that on my 1070. Was thinking of going msi again since my msi 1070 is still going strong. Any suggestions?
3060ti or 3070, then again i haven't been keeping up with tech since 2015
I was looking at those but they have 8 gbs and I keep running into issues with that as is.
The 3060ti is trash, honestly not worth upgrading to that from a 1070 because you're still just gonna want a better card.
Is it? I got it for $350 and it runs games at 1440p fine
>Was thinking of going msi again
The same company can make great cards one generation, and awful cards the very next generation. Always check reviews for the specific model you want to buy, and comparisons with other models of the same card.
Find a used 1080ti.
a 1080ti doesn't hold up for 1440p with newer games, especially UE5. I doubt it can even hold up 1080p60fps on Silent Hill 2
For 1440p I'd just try to find a good deal on one of the higher end AMD cards, like 6800 and up. If you're really concerned about ray-tracing you probably want at least like a 3080 or better from Nvidia. The 7900 cards from AMD will do RT but they take a pretty big hit in performance. If you can get one for a good deal though they're great at 1440.
Really depends what you want to spend, I'm generally of the opinion that you don't want to cheap out and get something you're just going to want to ditch in 2 years, so even if something is a little more than you need for 1440 now, you have to think about how it's going to be doing in the future. Something that barely does 1440 now is going to be doing 1080 pretty soon. Whatever the case you probably want like 16GB, 8 is way too low now. I wouldn't bother with anything lower than a 3080 or a 6800 right now.
>I'm undecided on what GPU to get for 1440p
keep in mind that 1440p is going to move your build up a price tier. 6700 xt, 6800 xt and 6950 xt are options at $350, $550 and $650-700 depending on whether you can catch a deal. All of these cards will be able to handle 1440p ultra at 60+ FPS, and the higher you go the more they tolerate RT. My 6950 xt gets 90-120 FPS on the RE4 demo with ultra RT at 1440 native, so you should expect most of that on the 6800 xt if you were to get that. On games with heavy RT loads like Cyberpunk I get 50-60 with just RT reflections on, so FSR has to be used with heavy RT loads. Of the 3000 series only the 3080 and 3090 are worth looking at if you can get a good deal via used or open box.
4000 series is a total fucking ripoff but the performance is good on the 4080 and 4090. 4090 is the only card in existence that can get 60+ FPS in cyberpunk at ultra RT at 1440. 7000 series is also a ripoff, just less so. Prices on the 7000 series have already dropped so I'd wait for more drops if you were considering those at all.
Every gen's 4k card turns into a 1440p card when the next gen is released, so buying top of the line last gen is always the best bet if you don't want to get gouged for the crown jewel of the current gen.
>Every gen's 4k card turns into a 1440p card
This trend keeps slowing down though, we are to the point AAA games cost way too much to produce and we don't get the big jump like in past gen anymore, that's why we could afford to keep our current gpu longer now, since only a few games will really make them outdated in the coming years.
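The arithmetic behind that rule of thumb: 1440p pushes well under half the pixels of 4K, so a card that barely manages 4K today has roughly 2x the headroom at 1440p (ignoring CPU limits and fixed per-frame costs):

```python
def pixels(w: int, h: int) -> int:
    return w * h

uhd = pixels(3840, 2160)  # "4K" UHD
qhd = pixels(2560, 1440)  # 1440p
fhd = pixels(1920, 1080)  # 1080p

print(uhd / qhd)  # 2.25 -> dropping from 4K to 1440p cuts the pixel load by more than half
print(qhd / fhd)  # ~1.78 -> 1440p to 1080p is a smaller step down
```

This is also why "barely does 1440p now" translates to "1080p card soon": each resolution tier down buys roughly a 1.8-2.3x reduction in shading work.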
I believe the conspiracy theory that Nvidia and AMD pay off AAA devs to make unoptimized shit to justify the existence of their newer cards
I don't think they need to pay anyone, you often see retards everywhere (including this very thread) defending shit optimization because "HURR HURR poorfags"
sure but cyberpunk still can't be run at max settings native 4k on a 4090. RT loads are fucking insane and GPUs are advancing to cope.
>that's why we could afford to keep our current gpu longer now, since only a few games will really make them outdated in the coming years
upgrading beyond the current gen or even the previous one just seems insane to me. DLSS/FSR is extending the life of cards far beyond what we've ever seen, everything capable of upscaling lasts as long as the user can tolerate it.
>i can't imagine that holding after this generation, the 4080/4090 are just too fucking expensive for any kind of mass adoption
halo products are incredibly powerful marketing. The 3000 series is priced worse across the board than the 6000 series despite having comparable or even worse performance. Your typical consumer sees Nvidia made the best card so that means the 3060ti 8gb is better than whatever AMD is offering at the same or slightly lower price point. no one wants to wade through dozens of benchmarks, they just know one brand made the best card so they buy something they can afford from that brand.
>the 4090 is already pushing PSU's to their limit, basic physics dictates that wringing additional watts out of a 12v rail is begging for trouble.
we're going to hit the nightmare scenario of a flagship costing $2000+ because it comes with it's own GPU enclosure that needs a dedicated PSU +24v rails.
>we're going to hit the nightmare scenario of a flagship costing $2000+ because it comes with it's own GPU enclosure that needs a dedicated PSU +24v rails.
my man. 2 years tops, screenshot it.
i can imagine the cope
>wow 2000 is a really good deal! It comes with it's own mini-case with fans and RGB!
>you were all bitching about the psu wattage requirement on old gpus but now that it's lower since you can just use your own psu for the rest of the system you're bitching
>The EGPU design, as expected, keeps temps and noise levels cool throughout the entire system by sequestering it in it's own case. These are the best noise adjusted thermals we've ever seen. It's a great product at a horrible price.
>Nvidia's New GPU Comes With It's Own Case And It's Brilliant
>the nvidia side-car. (it's just as gay as it sounds)
But yeah, that's the only way forward at this point unfortunately, short of cloud based gaming (or dev's focusing more on optimization)
sadly my elon-net will never allow me to play anything online with sub 20ms latency. i'll give up gaming before moving back to a city/town
doubt it, we'd need a new connection standard, the 4090 is already pushing the limits of thunderbolt which isn't exactly common on motherboards
Wouldn't your pc just have two wall plugs
The issue is the data limits of the connection from the all in one gpu to the pc
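To put rough numbers on that connection bottleneck: a desktop PCIe slot carries several times what Thunderbolt does (nominal, one-direction figures; real throughput is lower after protocol overhead):

```python
# Nominal link bandwidths in GB/s (approximate)
pcie4_x16 = 16 * 1.969  # PCIe 4.0: ~1.969 GB/s per lane after encoding, 16 lanes
thunderbolt4 = 40 / 8   # Thunderbolt 4: 40 Gb/s total -> 5 GB/s, shared with other traffic

print(round(pcie4_x16, 1))                 # ~31.5 GB/s for the slot
print(round(pcie4_x16 / thunderbolt4, 1))  # the slot has roughly 6x the bandwidth
```

So an external flagship GPU enclosure would either need a new, much fatter link standard or eat a real performance penalty on anything bandwidth-sensitive.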
i can't imagine that holding after this generation, the 4080/4090 are just too fucking expensive for any kind of mass adoption -- sure they sell a few, but a game dev can't bank on enough users having one when sorting out specs. and it's not like the next gen cards will be any cheaper; or fuck, more powerful.
the 4090 is already pushing PSU's to their limit, basic physics dictates that wringing additional watts out of a 12v rail is begging for trouble.
until +24v rails become a thing, gpu's are going to be kind of capped on performance. and as such, requirements for games.
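The 12 V concern above is just power-law arithmetic: at fixed wattage, halving the voltage doubles the current, and current is what heats connectors. A quick sketch (the 450/600 W figures are illustrative flagship draws, not specs for any particular card):

```python
def amps(watts: float, volts: float) -> float:
    """Current drawn at a given power and rail voltage (I = P / V)."""
    return watts / volts

print(amps(450, 12))  # 37.5 A over the 12 V connector(s)
print(amps(450, 24))  # 18.75 A -- why a hypothetical +24 V rail keeps coming up
print(amps(600, 12))  # 50.0 A at the 600 W ceiling of a 12VHPWR connector
```

Doubling the rail voltage halves the current for the same power, which means less resistive heating in the same wiring.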
I have a 6950xt, I paid like 600 bucks for it and it's great. If you don't care about money Nvidia is better, but most people care about money. There were worse Nvidia cards here that cost way more just because the prices are so fucked.
nah, i'm fairly satisfied with my XFX QICK 6800 so far.
>hold forward to jason
Got lucky and got a founders 6800XT early last year during the shortage for MSRP. Never had any issues except with Horizon Zero Dawn at first. But once I turned on SAM it's been amazing. 4k 60FPS, everything maxed out.
a 6700 xt cost me 380 euros. A 3060 ti would have been 480. That was a month ago.
Got a job in my third world country, gonna save for a pc around December/January of next year to replace my Ryzen 5 1600 and RX 580 8GB
How long do you think a Ryzen 7 7700x and a RTX4080 will last at 1440p if the most recent game i've played is Spiderman Remastered and Hi Fi Rush?
dont ask how long it will last, ask will it run the shit you have now. but dont ask that because you can already google benchmarks.
I see, thank you for your answers anons
About 7 years. 10 or more if you drop your fps from 120 to 60 on modern titles. With DLSS it could last 15 years
I'm in similar conditions and have similar-ish plans. With graphics generally stagnating and options such as DLSS/FSR popping up a setup like that could last you an actual decade if not more.
I've had my current PC with some minor upgrades for about eight years now. It's starting to show its age, but I've not found anything I can't play decently with it yet.
No one knows. The i5 2500K lasted forever because Intel barely improved their products gen-to-gen; the Ryzen 3600 is arguably already outdated because single-thread performance improves by 30% year after year.
The GTX 480 was the fastest card on the market in 2010, by 2014-2015 it was already unusable in newer games due to lack of VRAM. On the other hand, the GTX 970 is still somewhat decent today.
It's pretty annoying honestly. This is probably the least competitive AMD has been since the Vega days, but NOW they start shilling for them? Why, what happened.
I have yet to upgrade from it because it can still run most newer games on at least medium settings. This card has been amazing. That said, it's probably time for me to get something better by now.
AMD saw the opportunity when intel stopped extreme tech upgrades. And realized that Linus needs that money for his two fucking labs. So they are paying him millions of dollars.
I'm still running a Ryzen 3600 in my 1440p build and it's still performing quite well. Truth of the matter is CPUs stop mattering to a certain point when you go above 1080p.
Not for strategy games
Good thing I don't care about strategy games
>tfw hearing the 5800x3d is an instant +90 fps in MWO
>after I got a 5900x
A-At least I have moar coars right?
3600 is pretty much on par with what PS5/XSX have so you're probably fine for this gen, at least for 60fps.
I've been running 144fps on it fine.
On older and less demanding games, sure
>How long do you think a Ryzen 7 7700x and a RTX4080 will last at 1440p
Get the 7800x3d, it will last you longer, think of it as a refresh of the 5800x3d.
They are good budget options as long as you only use your PC for games.
Both are fine.
The problem is when CEO's like Jensen treat the people and partners who helped them get where they are like shit.
You don't shit all over the people that made your company what it is.
AMD is a *little* more humble, but not by much, they've been caught lying about shit.
Probably not but I've never thought
>I wish I had an AMD card
Both companies are fine besides objectively shitty cards for the price like the 3050/60 (non Ti) or the 6500XT and 7900XT (not XTX).
Not reading the thread but my 6700 XT works on my machine.
Linux that Windows this
Blah blah blah fucking pussy shit
No but their drivers are. GL playing older games on AMD GPU's.
AMD cards are great the same way game consoles are great. They are made to do one thing and do it fairly well for a reasonable price. AMD cards are rasterization beasts. 90% of the silicon on the chips is dedicated to blasting out polygons and frames, not wasted on anything unnecessary for games. That's why they are cheaper. That's what makes them better for the dedicated gamer on a budget.
NVIDIA cards on the other hand aren't just gaming cards. They are built to be used for a lot more than just rasterizing polygons: AI/ML, CV, video/image processing, scientific computing, engineering simulations, crypto, CGI rendering, and so much more. You are paying for a lot more, and while some of these things can be used for gaming, they aren't strictly necessary; NVIDIA-specific features are luxury features for gaming. NVIDIA cards are built for both gamers and professionals but not strictly focused on either, and you are paying for that extra silicon even if you don't use it.
Consumer AMD cards can do many of these things too, maybe not as well since they aren't built specifically for those tasks, so you aren't strictly missing out anyway outside of specific professional tools that don't support AMD GPU acceleration.
Not inherently bad, no. The issue is more about the featureset of cards now. Raytracing is not popular by any stretch, but little timmy takbir down the way is going to beg his parents for an nvidia card to raytrace, and not an AMD.
If you don't care about raytracing, or NVENC, sure, go AMD. Me though, I use moonlight daily, and I don't really want to set up a new remote desktop, so I'm fine paying a slight premium for nvidia cards.
anon, read up on disabling geforce experience's ability to auto update. they're going to fuck you in the very near future.
they've stated their plan is to remove gamestream from geforce experience, they just haven't done it, yet.
sunshine does seem to work pretty well, my client (laptop) only does 1080p, but over a wired gigabit LAN, it's been virtually flawless with 5.1 audio. it's hardware encoded h264 (host: GTX 1080)
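The reason hardware encoding matters even on a wired LAN: raw 1080p60 video wouldn't come close to fitting through gigabit, while an h264 stream at a typical game-streaming bitrate is a rounding error. Back-of-envelope numbers (the 20 Mb/s stream bitrate is an illustrative figure, not any app's default):

```python
def raw_mbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Bandwidth of uncompressed video in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_mbps(1920, 1080, 60)  # uncompressed 24-bit RGB frames
print(round(raw))               # ~2986 Mb/s, about 3x a gigabit link
print(round(raw / 20))          # vs a ~20 Mb/s h264 stream: two orders of magnitude smaller
```

At 4K the raw figure quadruples, which is why high-resolution streaming lives or dies on the encode path.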
>implying I even fucking use geforce experience
sigh. are we talking about the same thing here? do you stream from one computer to another using moonlight on the client computer?
what are you running on your host machine to facilitate that.
No, they've stated their plan is to remove it from the Shield app.
>Starting in mid-February, a planned update to the NVIDIA Games app will begin rolling out to SHIELD owners.
>NVIDIA Games app
>rolling out to SHIELD owners
Will they remove it from PC as well? Maybe, but they haven't said they will.
>only does 1080p
Not surprising it works, 1080p works perfectly well even with fully software everything, even encode. I haven't tried Sunshine yet but at 4k only GFE and Parsec performed decently, with GFE + Moonlight being smoother and having better frame delivery.
i've got a new mini PC on order that'll be replacing the laptop, it does 4k, so we'll see about sunshine @ 4k.
and okay, maybe they have not overtly said they're killing gamestream completely (i.e. removing from GFE) but reading between the lines it sounds very likely (hell, even the Sunshine devs have said as much). Outside of shield, was there any 'official' use case for gamestream? if not, and it's cannibalizing their online streaming option, do you really expect them to keep it in GFE?
That's not a chance I'm willing to take, and given what they're doing to shield owners, it's not a company I'm willing to support with new hardware purchases regardless of what they do with gamestream/GFE.
How can it cannibalize online streaming when there are many other options which provide the same functionality which run on Android? It's not like this is stopping Shield owners from streaming from their PCs.
You know, i'm not sure..
i mean i can stream $game to my living room over my LAN for $0
OR i can pay a monthly fee to stream $game from nvidia over the internet
i'm not sure which is the better deal for me.
Not sure what point you think you're making here. You know you never needed Gamestream for that, right? Never did and still don't, even on Shield whatever since it just runs Android.
unless you're using Sunshine/Parsec
open up GFE, disable gamestream.
now try using moonlight, report back with results.
i'm hoping this is one of those "we're two retards talking about totally different things, LOL." but hope is fading
Why are you stuck on Moonlight? NVIDIA's shit isn't the only way to stream games from a PC, even if they remove it the option to stream games from a PC is still there. Get it? They cannot force you to pay for GeForce Now by removing Gamestream, because there are multiple alternatives. Understand?
that's the entire point of all this: replacing GFE/gamestream with Sunshine before they get around to killing it (which, dollars to doughnuts, they will)
from nvidia's perspective keeping GFE/gamestream alive for PC users (not shield, don't mention shield, shield is fucking absolutely irrelevant to this conversation) is potentially costing them revenue from people going that route vs their paid cloud option.
In other words, they have a free option that competes with their own paid option. why would they keep the free option around?
i mention GFE/moonlight because that is what I was using. and in making an informed decision on an upgrade for an aging 5 year old video card, being able to stream games using the new card -- it's kind of relevant. thankfully with sunshine i'm not tied to nvidia, and they can get fucked over their treatment of their Shield customers.
i don't know how else to try to explain this to you, you are so far up your own ass that you can't seem to grasp basic english. you arrogant mother fucker.
Gamestream has far better reliability, latency and compatibility than other user-land streaming software. This is because gamestream can directly capture and encode GPU framebuffers at a low level within the card, instead of needing unreliable high-level userspace D3D/OpenGL/Vulkan/GDI API hooks. The hooks add latency, crash some games, or capture nothing at all (resulting in black screens etc.), and sometimes you get tearing or poor compression artifacts.
Gamestream on the other hand just works and rarely has compatibility issues.
I don't think DXGI desktop duplication requires hooking into 3D APIs like that. Looking Glass uses it and achieves very low latency capture as far as I know.
DXGI desktop duplication suffers from tearing and frame skipping/dropping issues. It's a fallback for games that can't use API hooking and generally a much worse experience.
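For context on why capture-path overhead matters: at 60 fps the whole pipeline has a ~16.7 ms budget if you want no more than one frame of added delay, so every millisecond a hooking layer adds is a meaningful slice. The per-stage costs below are illustrative, not measurements:

```python
frame_budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

# Hypothetical per-stage costs for a LAN streaming pipeline, in milliseconds
stages = {"capture": 2.0, "encode": 4.0, "network": 1.0, "decode": 3.0}
total = sum(stages.values())

print(round(frame_budget_ms, 1))  # 16.7
print(total < frame_budget_ms)    # True -> fits within one frame of added delay
```

A capture method that adds even a few extra milliseconds, or occasionally drops a frame, eats most of that slack, which is the practical difference between the low-level and hooked paths being argued here.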
>lack of love for AI
>lack of proprietary software and graphical features like nvidia's DLSS (no moron, fidelity fx doesn't get close), hairworks and the like
>breakthrough industry features like physx will be unavailable for your card when they're introduced by nvidia - it took years for physx to go open source
>massive unexplainable problems in a myriad of games like VRChat where it can't even play videos correctly
>new releases often have massive issues with amd products and you will wait weeks for a patch to fix it
>just as expensive as nvidia cards that are an improvement to all of the above
my rx 6700 xt is just fine, slight coil whine aside, but who cares , it runna da games
Nah, they're ok for the price
t. rocking a RX480 since 2016
how's it playing games sub 40 fps anon?
Feels good, it can even run some newer games at 60fps
bought a 7900xt about a month ago
feels pretty good, runs everything at pretty much max settings
only had a game crash GPU drivers ONCE in 1 month of heavy use
overall I'd say I'm pretty happy with it and I'd recommend it
is it not weird to you that amd can produce great cpus and all but their gpus are shitte
It's only recently that their desktop PC line finally had integrated gpus, before you had to buy laptop chips to get that. They tested it out with the 5600g and the 5700g as budget PC options and it seems to have worked out, so they put igpus in the 7000s.
Are they good for coom AI art? Streaming? Posting on youtube?
cumbrain has brainrot
Nah, been using them since 2010 or so. I'm on a 6900XT now, but I got to give props to my previous RX 580. It had no business doing as well as it did after I bought a 1440p monitor.
I like MSI products
the only true amd AIB gpu tierlist
Sapphire (Nitro), Powercolor (Red Devil), Asus (ROG Strix)
Sapphire (any other model), Gigabyte (AORUS)
MSI (any model), Gigabyte (any model), Asus (any other model)
Powercolor (Any fucking model other than a Red Devil), Any literally who chink manufacturer
Any XFX card is a lottery and ranges between a god tier performance card or a piece of shit you'll want to replace in 3 months
>gigabyte and asus anywhere above shit tier
I never understood why Sapphire gets sucked off this much. I haven't used any of their cards recently but I've had 4 of them and out of those 4, 3 cards had broken fans in like 3 years of use. I've literally never experienced anything like that with any other manufacturer. In fact I don't think I've had any card with broken fans that wasn't Sapphire.
I'm not sure what program you're talking about; DXGI desktop duplication is a Windows API and doesn't stream anything by itself.
Sapphire is like the EVGA of AMD GPUs. Clearly not every single GPU they produce will be perfect, but 9 out of 10 times they have the most reliable performance out of all of them.
>3 cards had broken fans in like 3 years of use
So you're the reason why Sapphire has easy-to-remove fans now.
Do they really? That's nice to hear, though all my fans died or started rattling just out of warranty so I never sent 'em back for Sapphire to notice.
The fans are all easily unscrewed without removing the heatsink or anything. I think it's been a feature since the RX 480; my old Sapphire RX 580 had easily removable fans. Check your old Sapphire GPU right now and see if it's a Nitro with quick connect modular fans.
Last Sapphire cards I had were R9 290Xs, but I sold them off as I upgraded so I don't have anything to check anymore.
Steam in-home streaming when playing a non-3D/UI application, and many popular third-party remote desktop solutions (LogMeIn, Teradici, etc), use the desktop duplication API, and they all have the same problem with tearing because they don't grab frames in lockstep with the game's flip rate/frame loop. 3D API hooking lets you grab framebuffers in sync with the game's frame loop, allowing the streaming software to encode and deliver frames at the same rate the game is running without tearing. If the streaming client supports VRR/FreeSync/G-Sync, that can also be used to avoid tearing, and less bandwidth is used for games running at a frame rate lower than the display refresh rate.
I haven't used those other programs but I've only seen tearing on Steam when using NVFBC (which is considered deprecated IIRC), DXGI desktop capture has always worked perfectly well for me.
Then you likely are running games with vsync enabled (adds extra lag).
No, I'm not because Steam streaming specifically recommends disabling VSync when you stream.
Steam uses 3D API hooking except for non-3D/UI applications.
Not the case, I've seen it use DXGI when streaming games. How about Looking Glass then, are you going to tell me that also doesn't work?
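The lockstep argument above can be sketched with a toy simulation. This is purely illustrative with made-up numbers (frame period, rewrite window, capture clock), not any real capture API: an unsynced capturer sampling the framebuffer on its own clock regularly lands mid-rewrite (a torn frame), while a hook-based capturer that waits for the game's present event never does.

```python
# Toy model of why unsynced desktop capture can tear while hook-based
# capture (grabbing frames in lockstep with the game) does not.
# Illustrative only; no real capture API is involved.

FRAME_PERIOD = 1000 / 60   # ms: game presents every ~16.67 ms
REWRITE_TIME = 4.0         # ms after each present during which the buffer is mid-update
CAPTURE_PERIOD = 16.9      # ms: unsynced capturer runs on its own, slightly-off clock

def torn_samples(n_frames: int) -> int:
    """Count captures that land while the framebuffer is being rewritten."""
    torn = 0
    t_capture = 0.0
    end = n_frames * FRAME_PERIOD
    while t_capture < end:
        # where inside the current frame interval does this capture fall?
        phase = t_capture % FRAME_PERIOD
        if phase < REWRITE_TIME:   # sampled mid-rewrite -> torn frame
            torn += 1
        t_capture += CAPTURE_PERIOD
    return torn

def synced_samples(n_frames: int) -> int:
    """A hooked capturer samples right after each present, so by construction
    it never reads a half-written buffer."""
    return 0

print("unsynced torn frames:", torn_samples(1000))
print("synced torn frames:", synced_samples(1000))
```

With these numbers roughly a quarter of the unsynced captures land in the rewrite window, which is the tearing the posts above are arguing about.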
I bought a Gigabyte Windforce 5600 XT in early 2020 and the POS would green screen on me randomly due to Gigabyte messing up the vBIOS. I refunded it and got a 2070S instead since I had no GPU. That one is also Gigabyte, and it works aside from needing a custom fan curve or it whirs up like a car, also due to Gigabyte fucking up the vBIOS.
Gigabyte GPUs are actual fucking dogshit, both AMD and Nvidia.
Now that Nvidia completely destroyed their relationship with one of the few exclusive AIBs they had who was actually worth a damn, the best AIBs now are Sapphire, and maybe PNY if you don't plan on overclocking.
I've had 3 different GB GPUs and they were all fine
Worst problem I had was with a 3060ti, in that its fan curve was way too high, it was an easy fix.
you cant do cool ai shit with them so yea
You can, just not on windows.
You can do it on Windows but Shark.ai is a lot to set up and you can't use LoRAs on it.
They can't generate my anime titties like Nvidia can.
So how's the VR compatibility with AMD+Linux?
I wanna drop a ton of cash on a new setup that can also do VR/Raytracing but I guess I can go without Raytracing and dodge Nvidia as long as VR is good on AMD's side.
it works about the same as it did on Nvidia (3090)
artifacting in steamvr's overlay is slightly different but ever-present until it gets fixed by Valve or whoever does it finally
Got this recently. Upgrade from a GTX 1060 6GB
I'm in fucking heaven.
>he bought a refresh
Good job anon. I got the same GPU model from sapphire.
I'm gonna buy that card when I get my tax refund, looking forward to it. On a craptop right now that struggles with Ace Attorney sometimes.
An HD 5770 was my first real GPU that I bought and did pretty well for its time. I think it was a mid-tier card. Can't remember having any problems with it. I most recently upgraded from a 1070ti to a 6950XT and the performance upgrade was obviously noticeable. Haven't had a problem yet but there is this odd hitch when playing full screen video where it will black screen for a second before displaying the fullscreen. It might be because I'm running it through an AVR though. Other than that, I haven't encountered any problems with current day AMD GPUs.
No, my 6800 is an absolute beast and maxes out 1440p games with high framerates. Not sure why people are trying to spend 800 dollars on meme cards. Really starting to believe Nvidia has an entire department dedicated to shilling on this board
Yes they are. I sold my AMD GPU because of driver issues after 2 months of problems. I feel bad for the poor guy that bought it.
Sorta off topic but hopefully an anon can shed some light. I added and replaced some fans in my rig today. Before I swapped them out I ran Prime95 and it settled at 83 degrees after 20 minutes. After swapping in my new fans and dusting, I ran Prime95 again: it settles at 60 degrees for about 5-6 minutes, then spikes up to 90 degrees, then slowly goes back down to 60. Across multiple runs the behaviour repeats itself every time. The CPU is a 5800X.
>it settles at 60 degrees for about 5-6 minutes then spikes up to 90 degrees then it slowly goes back down to 60.
Sounds like it's going through all the various instruction sets available to the CPU testing stress loads on each one. This isn't a problem, stress test your CPU by playing the most demanding game in your collection. Synthetic loads aren't that good as an indicator of daily use.
I had that thought but that doesn't explain why it was stable before I swapped fans.
Check your fan curves in whatever software you're using to control them and see if they're default settings. It sounds like what AMD does with their GPU fans only starting up at 60C except why would it apply to a CPU.
>GPU fans only starting up at 60C
perfectly fine for a gpu and probably preferable since larger variations in temps will cause more damage to components than a steady warm state
Yes, yes, we all know that, my own 6750xt is idling at 37C no fans on right now. The question is why is that behavior being applied to a CPU.
Thats what they do on nvidia. Thats normal
Anon, the original post was referring to a 5800x. A CPU. Semi-passive heatsink behavior is known for GPU cards but not usually seen for modern CPUs on desktop.
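The zero-RPM behaviour being debated above can be written down as a simple curve. This is an illustrative sketch with made-up thresholds, not any vendor's actual fan table: fans stay fully off below a cutoff, then ramp linearly to 100%. It also shows why such a curve on a CPU would let temps spike toward 90C before the fans ever spin up.

```python
# Illustrative semi-passive fan curve, like the GPU zero-RPM mode
# discussed above. Threshold numbers are made up for the example.

ZERO_RPM_BELOW = 60   # deg C: fans fully off under this temperature
FULL_SPEED_AT = 90    # deg C: fans pinned at 100% from here up

def fan_duty(temp_c: float) -> float:
    """Return fan duty cycle in [0.0, 1.0] for a given temperature."""
    if temp_c < ZERO_RPM_BELOW:
        return 0.0            # zero-RPM zone: card runs passively
    if temp_c >= FULL_SPEED_AT:
        return 1.0
    # linear ramp between the two thresholds
    return (temp_c - ZERO_RPM_BELOW) / (FULL_SPEED_AT - ZERO_RPM_BELOW)

for t in (37, 60, 75, 95):
    print(f"{t}C -> {fan_duty(t):.0%}")
```

A 37C idle reads 0% (fans off), which matches the 6750xt anon's observation; the open question in the thread is why a CPU cooler would be handed a curve like this.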
I like my 6700XT except there's one game I can't play on it unless I roll back to old drivers because of AMD messing with OpenGL driver calls or something, besides that it's been great.
Is it Hatsune Miku Project Diva Arcade Future Tone?
No, it's the Combat Mission series of games, not surprised that other games are having issues, though.
Went with AMD for my most recent PC because it came with two games I was gonna buy Day 1 anyway, figured it was like getting it for 120 dollars off.
It sucks, developers literally just do not fucking care about AMD GPUs. Even when there is Vulkan support it doesn't feel like enough, simple shit like watching videos on the internet can behave fucky sometimes, drivers are a pain in the ass, emulators are an inconsistent crapshoot (PS3 emulation runs fantastic, but somehow can barely run a Saturn game at 12fps)
I'm upgrading a 10 year old pc so I got a 6750 xt but I don't know what cpu to get now
help don't let me drown in the sea
5800x3d, or wait for AMD's next X3D chip, the 7800x3d
the gpus are pretty good, it's the drivers that are bad
This, they work fine on Linux because people other than AMD work on them.
AMD should really start hiring some of the people working on their open source drivers, it's clear their current in-house staff are shit.
their driver team are mostly h1b visa pajeets and they will always be trash until that changes
Reality imitates memes
They are the PC version of playing games on a clone console.
amd bros BTFO
i have an rx 6600, it's fine, it played CP2077 without issue
they're the best for value.
i got an rx 6600 for $200 and its been incredible tbh.
never take any opinion on Gankerirgins seriously
for real, some of you fucking losers need to get a life, stop this brand gayry and give factual arguments instead of devolving every single thread into a shitflinging fest
Mad coz bad GPU
Modern Ganker is 90% brand fanboys.
Just go with whatever offers the best value for you.
Personally I've owned multiple AMD and Nvidia GPUs in my life and I've never had issues with either brand because I'm not retarded.
Unironically no, however that's only in regards to gaming. Nvidia still comes out on top for other shit not related to gaming.
how am I supposed to fix this
what game are you playing on windows?
You're on windows and you use that garbage? Not even on linux is it worth using that junk normalfag program. Get MPC-HC from the K-Lite bundle or go full-blown mpv.
mpc crashes almost every time you enter fullscreen
That's some weird shit, I have no problems on my end. Try SMPlayer then, it's a GUI frontend for mpv.
I like to dabble every now and then with 3D-modeling, and unlike most of this board I actually like the idea of ray-tracing and DLSS. So when it came time to upgrade from a seven year old 980, I went with a 3060 Ti. If I'm going to upgrade to a 40 series card it will probably be a 4070 Ti because the 4080 and 4090 are not only ridiculously expensive, but they also look like fucking bricks and are absolutely butt-ugly.
I would look into AMD but I always see reviews on sites as well as comments from anons on here about how they buy a Radeon card and end up regretting it, and remember my first custom PC had a Radeon card and would flicker every now and again for no reason.
>and unlike most of this board I actually like the idea of ray-tracing
And yet you went with a GPU that's dogshit at it.
Jokes on you, the only game I have installed that has ray-tracing is Minecraft. It works just fine for me!
Except I never play Minecraft....
This generation of AMD GPUs is really bad; the 7900xtx eats up to 600W and performs worse than, or at best like, a 4080.
the 7900xtx caps itself at 291w
I've been running the 7900xtx for the last few months. Not really seeing any problems with it. I can't really compare performance against a modern nVidia card since my previous system was a 1060, but stability is as expected (modern games crash occasionally, but when is that not true?), and performance is good enough that I wouldn't be able to tell if it were better (sidenote: what's a good FPS counter these days that works on every game?)
Steam has a fps counter in its options menu
This would generally require either buying games, or going way out of my way to run them through steam.
I don't want ALL the metrics, that's a huge obtrusive block that gets in the way of playing games. I just want a tiny little FPS counter.
Afterburner then, Mr. Piratefag. I think you can even use it on emulators? Not sure.
You can configure what metrics you want to see with the amd software
>I don't want ALL the metrics
on the Tracking / Overlay tab, select the Tracking tab and disable metrics except for FPS
>modern games crash occasionally, but when is that not true?
what games crash?
>what's a good FPS counter these days that works on every game
On windows you have MSI Afterburner, turn on the onscreen display. For linux you have MangoHud.
Recently, Hogwarts Legacy crashed about 3-5 times in the maybe 30 hours I played it. I also tried NMS again and got at least 3 hard locks within 2 hours so that one could be a card issue or an issue with literally anything else on my machine. Hard locks in Returnal. Maybe a couple of crashes in mechwarrior 5 but I could be misremembering from the previous time I played that.
Except for NMS, it's nothing out of the ordinary.
What the fuck, none of that is normal.
the amd software included with the drivers has an fps/stats counter as well. AMD Software -> Performance -> Metrics -> Overlay -> Show Metrics Overlay. Games shouldn't crash btw, that is not normal
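The overlays named above (Afterburner, AMD's metrics overlay, MangoHud) all boil down to the same idea: timestamp each presented frame and average over a sliding window. Here's a minimal standalone sketch of that logic; it isn't hooked into any game or real overlay, it's just the arithmetic.

```python
# Minimal FPS-counter sketch: the core calculation any stats overlay does.
# Illustrative only; frame timestamps are fed in manually here.
from collections import deque

class FpsCounter:
    def __init__(self, window: int = 60):
        # keep timestamps of the most recent `window` frames
        self.times = deque(maxlen=window)

    def tick(self, now: float) -> float:
        """Record a frame presented at time `now` (seconds); return current FPS."""
        self.times.append(now)
        if len(self.times) < 2:
            return 0.0
        span = self.times[-1] - self.times[0]
        # N timestamps cover N-1 frame intervals
        return (len(self.times) - 1) / span if span > 0 else 0.0

# feed it a steady 16.67 ms frame time -> should read ~60 FPS
fps = FpsCounter()
for i in range(120):
    reading = fps.tick(i / 60)
print(round(reading))
```

The sliding window is why overlay readouts smooth over single-frame hitches instead of flickering wildly.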
I want to buy a new GPU so fucking bad but I can't think of a single upcoming game that justifies it
Still using a 1080 ti and I only play at 1080p
You don't need more at 1080p. That 1080ti is a rare good product, on the same level as the i5 2500k/2600k was for CPUs, whose spiritual successor is the 5800x3d.
Yeah it still holds up incredibly well
I do have to mess with settings occasionally, but a lot of games still run great maxed out at 1080p
Just feel like upgrading to further future proof myself for when it's not enough anymore, but I always struggle to find game(s) coming out to justify it
Honestly, I only upgraded in preparation for Dragon's Dogma 2.
I was still running a 1070.
Not that guy but what did you get?
Dragon's Dogma 2 is the only upcoming game that I'm interested in, if not for that I would continue using my GTX 660 until it is literally stops working at all.
>Not that guy but what did you get?
RX 6950 XT
Saw it on sale for $650 so I snatched it up.
I'll probably use this GPU until I quit (modern) gaming altogether.
Your GPU will have the same lifespan as the Steam Deck. For as long as the Deck can run games, your GPU should be able to match its driver. So, 10 years.
convince me it's a terrible idea to upgrade my GTX 1080 to the 7900XTX
work bonus burning a hole in my pocket.
Invest in silver.
RDNA3 is a beta test, don't buy it until the drivers are better.
RDNA3's drivers are better than 2's, which are better than 1's, which are better than Vega's
When is an AMD GPU not a beta test?
>When is an AMD GPU not a beta test?
From my experience, after a year or two passes. The drivers really suck.
honestly, if you have the whole kit and caboodle right now, like a 1440p/4k monitor and a computer that won't bottleneck, go for it
if you're doing a whole system refresh id wait till next gen
i did a full refresh at the start of the year with a 7600x and a 4090 with an LG C2. i don't regret it for a second, bar the mental gymnastics i did to justify spending 500 dollars more than my entire system for the gpu. but again, i said fuck it, i haven't bought a gpu in 7 years and a cpu in 11, so i got my money's worth
thanks anon, yeah i upgraded everything else last year
i just held off on the GPU because, damn they were rare and way, way overpriced.
now they're just overpriced
most of my gaming is on a 4k TV, and the GTX 1080 is showing its age. it's been a trooper though, got it for like 550 back in 2018. so i don't mind paying a bit of money for a card that'll last a few years (and i'd wager a 7900 XTX will be relevant even longer)
>bought 3080 10gb for 900 dollars last June
>it can barely run AAA games above 60fps now never reaching 142fps goal
Guess it's 1080p, all settings low, for as much fps as possible for the next 5 years for me.
Turn on DLSS and it should run faster.
3080 10gb for 900 dollars last June
Why in the fuck would you do that?
You never buy a GPU above MSRP.
Disable meme tracing and stop playing at 4K.
i have a 10gb 3080 and have no issues running AAA games at 4k at 100+fps. You sure its not your RAM or CPU?
probably bottlenecked by 3700X and 16GB ddr4. I plan on getting a 5800X3D soon
Does switching to 1440p really make a difference visually?
If you have two 1080p monitors don't bother.
1080p looks blurry by comparison
Yes, not only is it clearer but there's also far less aliasing
It's more significant than going from 1440p to 4K
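One back-of-the-envelope way to see why the first jump matters more: pixel density on a same-size panel. The numbers below assume a 27" monitor purely for illustration; whether the 1440p-to-4K jump is visible also depends on viewing distance, which this arithmetic ignores.

```python
# Pixel density (PPI) of common resolutions on a hypothetical 27" panel.
# Illustrative arithmetic only.
import math

def ppi(w: int, h: int, diag_inches: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(w, h) / diag_inches

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {ppi(w, h, 27):.0f} PPI")
```

At 27" that's roughly 82, 109, and 163 PPI; going from ~82 to ~109 PPI at typical desk distance is the jump most people notice as "no longer blurry".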
They play games well. If, for some reason, you know for a fact that you will be doing some non-gaming tasks that use CUDA cores, that's when you get Nvidia.
Just built my PC what to play with this?
Star Ocean The Divine Force
Daemon X Machina
why are you black
Because now I'm the true master race on Ganker lmao. All I play is PC now and I emulate consoles; I gave all my consoles and games away
Stupid moron haha
AMD works. It's just some things like emulators run like shit if they don't have Vulkan support
I had an rx590 Sapphire Pulse I bought for $140 5 years ago. It replaced my decade-old gtx770. I upgraded to a 3080 when it came out for $750. I prefer g-sync over free-sync. If I were to upgrade, I'll probably get another nvidia card. It's usually software and gpu optimization. Plus AMD runs like a literal furnace. Not sure why they always get so hot.
You all fight between Amd & Nvidia. But when it comes to manufacturers do you have a go-to? I remember buying an EVGA Nvidia card back in 2016 solely because it had two DVI slots but now that I have only HDMI/DP monitors I don't need to scour for specific ports from specific manufacturers.
Sapphire and XFX on AMD
used to be EVGA on Nvidia, now maybe MSI takes that spot
>still no 7500 xt
Is $50 for a 1050 ti worth it?
A 1050ti is kind of pointless for gaming, it's barely any better than modern integrated GPUs
No, they're not. In the current generation (or to a degree the one past), the AMD GPUs are a way better buy overall, and AMD is a less shitty company that supports FOSS in general. If you can afford $1600+ for marginal or very situational gains, then by all means buy the Nvidia 4090, as it's the only NV card worth it so far. However, for everyone else, the $1K AMD 7900XTX or the $800 XT is a better value for the performance. I expect the same will be true as lower tier cards are released as well.
Is a 6750xt good for 1440p gaming?
>This is a trick question.
Neither is inherently good or better. It is per-title preferential rendering, which is then compounded with opinion, which will typically override statistical or observational analysis. Truthfully, NVIDIA has had a ton of tight stuff they let slip for publicity stunts; everyone bought into building their business on functional, or rather fictional, theory which eventually got forwarded to AMD, such as XBOX / PlayStation, and now their only device in this category is the useless Switch, which is a very niche product for new younger people. As a stylized preference they are extremely elitist and it is reflected in their work, which I'll admit has grown on me. Most people don't deserve their gift of inheritance and thus don't expect it to ever be different. They've successfully covered up every blunder of their ignorance, therefore it is futile to go against their stance when people blindly follow their trend without full realization of the raw history. As an enthusiast GPU, one will be hard pressed to find a single title where they aren't the best acceleration, even if how they're doing their pipeline is fraudulent and filled with error or never fully realized. AMD on the other hand used to be really excellent at providing a different type of GPU back when they were ATI, hence the reason they were touted as worse yet were genuinely beefier in cutting edge, state of the art engineering. They just were better hardware and destroyed everything NVIDIA ever tried back then, especially in efficiency and reliability. You won't hear this often, because there are a lot of people who spread opinionated falsehood about the past. Often it is user error contributing to public perception. Fragility just would require higher quality support within the known tolerance of the product. AMD basically picked up a ton of the slack when they should have been cutthroat, and at the end of the day having a GPU in the PC to do 3D accelerated work is better than not having one
Didn't read but GPUs should not cost as much as a used beater
they're affordable for the average user, however they lack every feature known to mankind.
People say that they're good. I fell for the 5700xt youtuber shilling. Bought a Sapphire one. Worst gpu I ever had.
Never again will i buy amd shit.
>Trouble is, AMD has now evolved in public perception into being the open sourced, viable price/performance option competitive with NVIDIA and their proliferation of the identical practice, when this couldn't be further from the truth. They basically have everything at their fingertips, and can progress innovation at their leisure. They are specialized, with fine tuned colorspace control for unique use case scenarios in specific titles which were solely designed for the hardware itself. A good example of this is BATTLEFIELD 4, which is now a devoid-of-life, corrupt experience littered with tryhard hacking nonsense and will never see another content update directly from the publisher officially. This is terrible for the industry as a whole when people accept a worse product. Accept the highest end enthusiast option, or go home. End of story. Otherwise stagnation will win. ATI is gone and dead, and their involvement in the Wii / GC / Xbox 360 has come to a close. It isn't the thing it once could be, and they've exhausted every known trick in the book, and their skill has been completely ignored and forgotten for gimmicky retarded bullshit nuance gay shit, no joke. Everything that was cool about this contender has rotted into brain dead territory. There is no resolution either; both vendors are here to stay forever. Is it excellent in the STEAMDECK where they have full throttle Windows 11 driver support for their high end device? Absolutely, but there is no foreseeable option where the consumer is able to choose their experience. You just end up needing to purchase a pair or few of both the highest tier GPUs to get the full story. There is no tradeoff whatsoever. Stuff designed for AMD is just better on their HARDWARE, and stuff for NVIDIA is just suited to their preference of style. The only exception to this rule is a super funded or just skillfully incorporated title which features unrivaled and unconditional support of both to their strength / level of fidelity
>INB4 Console gay justify Series X / PS5 as high end. These gayS just need to be shot in the head
>DO I WANT HOW 7900 XTX render color
DO I WANT HOW NVIDIA DOES RTX
HMM I'M RETARDED
nvidia color profile
AMD color profile
DLSS vs Fsr
DLSS x Fsr
Playing HALO or Playing HALO
the same fucking thing, but it looks different and shit, and they both do different shit well; it's such an annoying mindfuck. POINT is, you can't have both unless you're truly stupid as fuck with your rich spending decisions, so choose wisely and don't come crying here when you fucking chose wrong. Doesn't hurt to just accept one or the other, as we've accepted NVIDIA as an extension-unto-itself trend, and AMD is considered worse even though it was never a competition when trying to utilize driver-specific functionality in specific titles
RX 6600 is more than enough
Rx 6600 is a piece of shit
7900 XTX or 4090 or you're the fucking problem