Are ayyymd GPUs still a meme?
I've been thinking of upgrading from my 1070ti and this review (https://www.youtube.com/watch?v=jLVGL7aAYgY) brought my attention to the RX 6800, which is cheaper and seems to hold its own against the 3080 for 1080p gaymen.
I remember people used to complain about AMD's drivers and GPUs being complete shit back in the day so that is still kind of holding me back from just straight up buying one of these.
I've never had an issue with them even though I'm on a 3070ti now, I even prefer Radeon's replay over nvidia's bullshit
werks on my machine
t. 6700xt
People saying AMD drivers are dogshit are either all somehow shills on a Vietnamese RAM overclocking forum or they're just shit.
I guess I'm just afraid because I remember some threads ago I saw some posts about AMD cards having some sort of hitching problem where people would consistently get frame issues, it was probably bullshit but I just wanted some extra reassurance.
There's a common problem, especially with emulation, where AMD GPUs will recognize they don't need to run at full power but end up undershooting the value they do need, causing dips in framerate
There's a min frequency slider in the software you can turn up from the default 500 to something higher, which has fixed it in every case I've seen
They ARE dogshit. On windows, anyways, Linux has very good drivers in comparison.
For gaming, worse raytracing performance is about the only drawback of note
Do some extra research if you have non-vidya use cases for a GPU, such as Blender
my 6800xt's been good to me
RTX is a meme to begin with
Cutting your frames by half in exchange for darker areas and realistic water puddles ain't fricking worth it.
Which isn't even a real advantage for Nvidia most of the time, when you take price into account. AMD and Nvidia have similar RT perf/$, but AMD is ahead in raster perf/$.
The biggest advantage is probably DLSS. But again, with AMD you get much faster cards for the same price: what looks better, a 6700XT with FSR@80% resolution or a 3060 with DLSS@60% resolution? The 6700XT is faster, so it doesn't need to lower the resolution as much as a 3060 to achieve the same framerate.
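Some back-of-envelope numbers on that, assuming a 1440p output and that those percentages are per-axis render scales (both my assumptions, not from the post):

```python
# Rough pixel-count comparison, assuming 1440p output and per-axis render scales.
out_w, out_h = 2560, 1440

for name, scale in [("FSR @ 80%", 0.80), ("DLSS @ 60%", 0.60)]:
    w, h = int(out_w * scale), int(out_h * scale)
    share = (w * h) / (out_w * out_h)
    print(f"{name}: renders {w}x{h} ({share:.0%} of native pixels)")

# FSR @ 80%:  renders 2048x1152 (64% of native pixels)
# DLSS @ 60%: renders 1536x864  (36% of native pixels)
```

The faster card upscales from nearly twice as many pixels per frame, so the upscaler has a lot more to work with before the algorithm quality even enters the picture.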
>AMDunboxed
>why? well they disrespect Nvidia!
>AMDunboxed
>shits on 7900XT(X)
?
>They compare DLSS vs. FSR, they prove DLSS is better.
BASED BASED BASED
>They test newer games with 8GB, 12GB and 16GB cards, they prove 8GB is dead and 12GB is barely enough.
REEEE AMD UNBOXED!!!
For gaming likely fine. For anything else, no, unless you are on linux.
they are pretty much fine these days. 6750xts are a steal right now compared to what is coming out
>I remember people used to complain about AMD's drivers and GPUs being complete shit back in the day
Their drivers are still a bit shit. But their graphics cards are okay, especially the lower mid range. The 6700XT is probably the best value card you can get right now for high framerate 1080p gameplay. Their hardware is alright. It's on the software side that they frick things up, the drivers and tech are a bit behind. But at least compared to Nvidia their offerings aren't as overpriced.
t. 4070
I think the vendors in my country have become aware the 6700XT has gotten more popular. It got more expensive again.
they lack nvidia features and their linux drivers are crap, only relevant if you're going to be doing more than gaming or mining though
>linux drivers are crap
the frick are you on about?
You should be fine if you use DDU to clean up before installing the drivers and don't update those drivers without a reason.
you're browsing Ganker
everything is meme
I bought a 6800XT reference on launch. god it was a blackscreen nightmare. sold and bought a 3080 a year later. couldn’t be happier now
What are these weird ass brands like 51Risc or MLsse?
Chink brands for the chinkies.
Been using them for a while now, and the only issue I ever encountered was BepInEx failing to load some textures for a Japanese porn VN translation mod.
I haven't had an Nvidia in a while, so I am not sure what I might be missing out on, but I also don't feel like there's anything wrong with AMD.
AMD GPUs just fricking work. Let's see your shitty 4070 do this.
> 464W
That card uses more power than a 4090.
you can always undervolt
You can undervolt a 4090 too. The 7900XTX would still use more power.
RDNA2 is way more power efficient, cheaper, and basically on par with RDNA3. 7900XT = 6950XT
>on par with RDNA3
You must have hit your head.
that's a fake benchmark channel.
>modding a pubestache into Leon's beautiful face
it gets burned off in the house fire afterwards anyways
Undervolting the 7900XT(X) doesn't actually reduce power consumption - you just get more performance. At stock ALL of the big fat RDNA3 cards are power limited (even the mighty nitro) as the silicon really can do 3ghz (or more in some scenarios) but it sucks juice like americans eat burgers to get there.
If anyone ITT has an RDNA3 card and cyberpunk, turn on the RT overdrive mode and watch your clocks go into orbit. Performance is shit because RT overdrive is tuned for nvidia, but man does it make RDNA3 run full throttle.
Pic sorta related - down scaled from 8k.
I unironically prefer textures from old games
That's the fan HD project.
you're not the only one, I like my visuals to be sharp, I'm getting tired of new games being filled with vaseline because of taa and 5000 different post processing effects
That isn't vanilla RE4.
For RDNA3 you shouldn't be touching max frequency clocks at all - let the card do whatever it wants. Even a reference model will default to 3ghz(ish) max clocks in adrenalin but will never get anywhere close to that outside of niche scenarios.
>For RDNA3 you shouldn't be touching max frequency clocks at all
the default clocks are way too aggressive on the max clock. it's just like oc cards on rdna2 with underspec coolers (msi). the card will never hit those boost clocks in reality because they're unsustainable long term, and this leads to clocks dropping and the voltage getting ignored (again, MUCH WORSE THAN MPT).
that's why RDNA2 is better: you can use MPT to force the voltage low while keeping your clocks high. on RDNA3 they killed that, so now you gotta lower your clocks to keep the voltage low, which is not ideal.
> the card will never hit those boost clocks in reality because they are unsustainable for long term
I disagree, with the caveat that most rasterisation workloads won't hit that. In compute scenarios the card(s) can absolutely sustain those clocks depending on the power target. Now I will also concede that the definition of reality could be interpreted to mean the aforementioned rasterisation, in which case saying the clocks are unsustainable is true.
Pic sorta related - reference card with +15% power limit. Any of the good AIB models would smash 3ghz if allowed to drink a little over 400w. AMD made the smart move to limit the XTX to 355w as performance scaling with power is atrocious.
RDNA3 has the same clock behavior as RDNA2 where your undervolt doesn't apply when you go over a certain max freq in Adrenalin. You have to use MPT to force the voltage (but RDNA3 doesn't have MPT only RDNA2 does) or set a lower clock limit in Adrenalin but this gives you worse results than MPT and you also miss stuff like VDDC, SOC, TDC, and other voltage tweaking.
I found this out the hard way, the only way to reduce power draw is to lower core frequency on a per-game basis. So I just run 2300MHz @1070mV and cap framerates.
Using a higher resolution will cause a massive increase in power draw even if clocks stay the same.
I hate how AyyMD blocked MorePowerTool from working on RDNA3.
I wish I could just set lower power limit, but -10% PL is all I can get.
As for cyber troony on my 7900XT clocks stay under stock 2800MHz even with overdrive enabled. I uninstalled the game after checking out how it performs
>As for cyber troony on my 7900XT clocks stay under stock 2800MHz even with overdrive enabled
That is because the XT has only a 325W (stock) power limit. I get why that is the case (because if they gave it the 355w of the XTX the performance delta would be virtually nil) but 2800mhz is still a notable increase.
Except power draw goes down with PT enabled and clock stays the same.
I could add 15% PL but it's fricking summer and it's too hot in my room as is :^)
Going from 1100mV 2800MHz to 1070mV 2300MHz is only around 10% performance decrease but power draw is around 25% lower, but this depends on the game.
It's bizarre how the GPU clocks very high in some games from the early 2000s but in others it uses less than 60W.
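For what it's worth, that ~25% figure is in the same ballpark as the usual rule of thumb that dynamic power scales with frequency times voltage squared. A rough first-order sketch (ignores static power, memory, and fan draw):

```python
# Rule-of-thumb estimate: dynamic power ~ f * V^2 (ignores static power, memory, fans).
def relative_power(f_new, v_new, f_old, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

r = relative_power(2300, 1.070, 2800, 1.100)
print(f"clocks: {1 - 2300 / 2800:.0%} lower")   # ~18% lower
print(f"power:  {1 - r:.0%} lower (estimate)")  # ~22% lower, close to the ~25% observed
```

Performance dropping less than the clocks (the ~10% that anon sees) also tracks, since games are rarely purely core-bound.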
1070mv is a good UV - the average is 1100mv (which is what my XTX holds at).
1100mV is stock on my card, so it's only -30mV.
I prefer how it works on nV cards with voltage curve or old AMD cards.
If I had access to whole voltage table it would be great but RDNA3 is so locked down.
I'm reminded of some obscene overclocking I pulled on GCN. Man was that an architecture that would take as much power (and voltage) as you could cool. I pushed a 290x to 1200mhz core on air because I ended up with a golden sample.
They shit on the XT, the XTX was merely fine (because the XT was priced waaaay too high, the XTX was sorta kinda on the money). However they have a rep as AMD shills because they don't toe the line for nvidia. The counter/balance to them is digital foundry, which declares Nvidia the best no matter what.
Old GPUs were so much more fun to play with, there was so much performance headroom. Now everything runs at the redline out of the box.
While I love OC there is nearly no point in overclocking anymore, just for lulz.
i suppose the positive outcome is that when you buy a part you know it's going to be the best it can be out of the box, which is good overall.
Now it's the other way around. Products are clocked to the moon for +5% performance, I'd rather have a more efficient chip.
At the very least this is why AMD offers eco mode for their cpus.
>Nearly 400W
Damn son.
Now we get that out of the box and we have to UV to get our hardware back to peak efficiency.
I tried OC for as long as I can remember, mostly because I had shit hardware so every bit of performance mattered. Then the 2500k came around and I could just push that as far as temps allowed; 4.6GHz was perfectly stable but it got very hot so I just kept 4.4GHz for almost a decade. I haven't tried modding GPUs to allow higher power limits or raised voltages, but I got a cheap hot air station recently so maybe I'll buy some old GPU and go to town with it.
>Damn son.
That is the max of 2x8pin and the PCIE slot. It was some real shit for 2014 (and beyond).
>step 1, open nvidia control panel
>not wanting the sovlful lightweight xp ui
>thinking the gay goyslop ui of the amd radeon shit is good
Bloated slop botnet program, the nvidia control panel is the one good thing about nvidia.
6800xt is amazing
Btw, AMD cards aren't as good as nvidia for things like Maya or cinema4d
There's really no general difference between amd and nvidia GPUs. It's still luck of the draw, you either get a stable gpu or not. But right now the 4xxx series seems like one step forward and one step back compared to 3xxx.
You are literally burning your money right into greedia's pants if you upgrade from 3xxx to 4xxx.
Do NOT buy any new AMD products until a year or at minimum 6 months since their release.
That's the simplest rule you should follow to avoid the "bad drivers" meme.
>t. dealt with the 5700 XT disaster launch personally
I mean they fricked up the 7800X3D/AM5 launch too, and that was just a miscommunication problem with mobo manufacturers, wasn't it? I'd say it's a bad omen at this point, not just bad drivers
buying AMD turns your computer into MUSTARD GAS
Yeah, drivers are still shit but they're good if you're on an extreme budget. The nvidia options are always better, fsr is shit and dlss is significantly better in every way.
nvidia features are overrated it seems. dlss seems to be cope for people who can't run native 1440p or 4k like amd gpu owners can. also amd can undervolt and do things better in the drivers, nvidia sucks at that.
Well you're totally wrong. DLSS is objectively better, performs better, and looks way better in motion.
Cool reddit narrative, dlss on quality is still better than 4k native.
>and looks way better in motion.
Now you're actually trolling
It does. DLSS suffers less from ghosting than FSR2.
I'm not talking about FSR I'm talking about compared to native
>dlss on quality is still better than 4k native
post rig
>DLSS is objectively better, performs better, and looks way better in motion.
How is rendering fewer pixels better than having more pixels, especially when AI upscaling does worse with fewer pixels? What you're basically saying is that it's okay for Nvidia to sell you less memory bandwidth for more money because (pic related) DLSS can save it.
Works on my machine
This is probably bait because it's almost unbelievably stupid.
Think of DLSS as an extra layer of insurance for future proofing. A card that can handle native 1440p at 60fps now might not be able to do the same in games that come out 3 years from now. DLSS could then act as a crutch that buys you more time. But you shouldn't use DLSS performance as a benchmark.
i'd just buy a new gpu instead of coping with dlss. you can't save up money in years? get a job homosexual.
I most likely have a better paying job than you. Only poor people have the chronic impulse to consume.
We are comparing features of two competing products, but your brain automatically defaulted to "just consume more product, you must be poor". Why?
Both FSR and DLSS are shit compared to native
Don't defend game developers for relying on them
6700xt Sapphire. No issues. Only thing is AMD's Adrenalin replay feature is shit and causes stuttering. Long as you don't update you're good
6800xt is one of the best bangs for your buck right now, especially if you can get one for around $500 or less.
>I remember people used to complain about AMD's drivers and GPUs being complete shit back in the day so that is still kind of holding me back from just straight up buying one of these.
No tech reviewer mentions driver issues when benchmarking all their games because they're not a problem now. Keep in mind nvidia either expends marketing budget on Ganker or there's actual unpaid shills.
>580 8gb for 3 fricking hundred
I have a 7600 mobile and it’s played everything on 1080p 60fps, good enough for me
Threadly reminder that AMD cards work flawlessly on Linux, using open source drivers.
Only their proprietary Windows drivers are shit.
Now try undervolting.
use corectl
https://gitlab.com/corectrl/corectrl/-/issues/344
>7900xtx
works fine on my 6800xt
I have the model from your pic. Works on my machine.
Corectrl does the job. It also allows you to set the power limit to whatever you want, unlike Windows where you're limited to minus 6% for whatever reason.
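If you'd rather skip the GUI, the amdgpu driver exposes that same limit through hwmon in sysfs, which is roughly what tools like corectrl end up poking. A minimal sketch, assuming the GPU is card0 and you run it as root (the hwmon index varies per machine):

```python
# Minimal sketch: set the amdgpu power cap through hwmon sysfs (values are in microwatts).
# Assumes the GPU is card0 and this runs as root; adjust the path for your system.
from pathlib import Path

hwmon = next(Path("/sys/class/drm/card0/device/hwmon").glob("hwmon*"))
max_uw = int((hwmon / "power1_cap_max").read_text())  # driver-enforced ceiling
target_uw = min(250 * 1_000_000, max_uw)              # e.g. cap at 250 W

(hwmon / "power1_cap").write_text(str(target_uw))
print(f"power cap set to {target_uw // 1_000_000} W (ceiling {max_uw // 1_000_000} W)")
```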
>haslel cpu
I'll give you props as it is a xeon, but still lol @ u
Good enough for the most part but yeah it's due for an upgrade.
My live service anti-cheat enabled goyslop says my addiction isn't allowed on linux/proton so I don't care
This plays everything I want at 1080p.
Same, the only thing I regret now is not getting a 6700 (non-XT) for that vram bump as a bit of a future-proof, they were priced about the same as 6650XT around here.
The industry is already pretty much demanding you run new titles with upscaling and in that department there just isn't even a comparison, DLSS is miles ahead of FSR. So if that bothers you then it's a simple question.
should stick with 1070ti unless you're doing a substantial upgrade to 4070 or higher
If consoles like Playstation, Steam Deck, and Xbox use AMD hardware they can't be all that bad. That means they work on Linux (Vulkan) and Windows (DX11/12).
They're gr8
Pretty much the only thing you're losing is like enterprise stuff.
So if you don't give a shit about Blender rendering, or raytracing or "AI" bullshit and only care about Vidya, it's the smarter choice.
I am gonna grab a 6750XT soon to go with my Ryzen 7 5800X3D and 32GB 3200 Ram. I could get it right now but I need to do a bit more research before I pull the trigger. Actually I have 2 questions if anyone here can answer.
1. Is the 6750XT bigger in size than a 1080Ti?
2. Does the 6750XT use a LOT more power than a 1080Ti?
no, no
Thank you.
both cards have the same power consumption, but the actual size depends on which model you buy, there's definitely some 6750XTs out there larger than some 1080Tis
God damn. The 1080Ti is a size beast, I didn't think they could get too much bigger.
GPU sizes scale with power consumption. While the 4090 is a little overspecced (legend has it it was going to be a 600w card and AIBs built to that, only for Nvidia to drop the power when they got word of AMD's offerings), the fact of the matter is for something as parallel as a gpu moar of everything gets moar performance, and power is an easy way to do it. The 1080ti is a 250w card, which was high when it was new; now top end cards are 350w+.
To put things in a little bit of perspective the "golden era" of pcgaming (which realistically means windows 98-xp) had cpus that wouldn't even need a fan and fit into a single slot.
Is it even worth upgrading from a 1080Ti to a 6750XT? I mean it's faster, but not THAT much faster. I'd get at least a 6800/6800XT.
I've got this guy, so far it's pretty great running most things at 4k or 1440p.
Bought a 7900xtx but couldn't use it yet because my new motherboard died.
lmao Jay
I will never buy an anus board again.
I blame elmor.
I have no idea who jay and elmor are. Did it happen to some eceleb?
Jay as in Jayztwocents, professional moron. Elmor is the online handle of the dude who was the pro overclocker and all around good guy at asus (who pushed for a lot of features on their mobos) who left to set up his own company that makes gear for very, very srs bsns overclocking and hardware monitoring. The rep asus built on their old school maximus, formula and crosshair mobos is due to elmor.
Ah. Frick elmor then.
amd is still a meme. Just get Intel.
amd has no drivers. Just buy the latest nvidia 4060 and you'll be set for a generation.
Great advice jensen.
One last question. I am seeing people say the 6750XT is about 5% stronger than the 6700XT for an extra 80+ dollars, is that indeed the case? Should I save money and get the 6700XT instead? I mean I am down for saving money.
it's a 6700xt with an overclock on the memory and core. yours won't oc quite that high on its own but you can get close to it.
Yes, it's about 5-10% faster. Not worth paying that much more.
Just borrowed this bros what to play?
>stolen photo
Frick off.
you giving it back right?
I just built a PC a few weeks ago so I spent a lot of time analyzing the current market. Right now AMD cards tend to be far better value for the money. For example, in Canada a 6750XT costs almost $200 less than a 3070 which has comparable rasterization performance. The only tradeoff is no DLSS and worse raytracing performance. Unless you're buying a top of the line card, AMD is probably better value.
Yeah last year I got the 6800XT for the price of 3070 or 3070ti can't remember which.
1070 waitgay here, im gaming @1080p and waiting for rtx 5060
>Waiting until 2026 for midrange GPU
Why?
>buy a 4060(ti) goy
>how about no, 6700XT for me
>REEEEEEEEEEE
A rough summary of current tech.
Navi 31 is shit and broken. Navi 32 might fix a few things.
I love my 6900XT
AMD's GPUs are mostly fine nowadays and probably even better price to performance in gaming, but as far as I know, if you ever want to dabble with AI shit / 3d rendering / most compute-heavy workflows and run it locally instead of on a server somewhere, you're mostly shit out of luck since they're all optimized for NVIDIA cards. CUDA cores and whatnot