I'm replacing my Nvidia GPU with an AMD GPU and no one can stop me.
I'm doing the same, will probably switch it over tomorrow
although now that I look at it, I'm not entirely certain this thing is going to fit in my case, so there might be a good bit more delay
Good. NVIDIA is broken garbage.
nvidia has dlss
No one cares.
>fake frames
>1080p
Do poors really?
NVIDIA's shitty drivers provide an inferior experience, no matter the resolution.
NOT LIKE THIS SHITVIDIA TRANSXISTERS! NOT LIKE THIS!
>renderer tailor-made for AMD hardware performs optimally on AMD hardware
I wonder why in addition to consoles, AMD ALSO locked-in Starfield's PC release as well
Good, it's their turn. The amount of fricking games I've launched from 2000 to 2015 that had Nvidia logos at startup was ridiculous.
I also got Starfield free with my 6600, which means even though it's a mid-range card they're still going to optimize for it. AMDchads won.
>The amount of fricking games I've launched from 2000 to 2015 that had Nvidia logos at startup was ridiculous
AMD ruined Rage's launch in a way that disillusioned Carmack from vidya entirely and killed the company until Doom, devs didn't work with AMD then because they break more than they fix, and up until 2015, they were cruising off of the CAP system, which 0% of devs thought was an acceptable solution to fix driver problems
They only do so now because it's easier to get a powerful AMD card close to MSRP than it is an Nvidia one, along with them gambling on zoomers wanting AMD cards for the latest games only, and not something like KOTOR or OGL-based, or anything that'd look good with MSAA or SGSSAA. If Nvidia brought cards back to MSRP starting with the mountain of unsold 12GB 3080's, Bethesda would've laughed AMD out the door while reminding them that an official ATI solution for nonexistent MSAA at one point was renaming the game executable Oblivion.exe
My ATI/AMD history has been 9800 Pro > X1950 > 4890 > 5770 > 6970 > 7970 > Fury > 580 up until I grabbed a cheap 1080 ti, and around the 6970 is when drivers were truly being skullfricked by the terrible move to .Net, the 7970 is the last card I'd consider "good", but the drivers were truly bad, ATI Tray Tools getting killed in the process was legit the end of an era. People can scream about how I'm only focusing on the bad/old, but of the two anons I've asked to try KOTOR on their brand-new 7900's with the latest drivers, the game is STILL FRICKING BROKEN ON ATI/AMD HARDWARE SINCE 2010. I don't care that it's an ancient game in comparison, how are you gonna frick-up handling OpenGL 1.3? Nvidia cards can run it right outta the box if they want to, why can't AMD? If they're better than Nvidia, how are they unable to fix fricking KOTOR?
Nvidia just fricking werks
KOTOR just works on my AMD PC so you're wrong
With framebuffer effects and soft shadowing on?
yes i've been playing it on Steam Deck which is a ryzen+rdna2 handheld.
This reads like a genuine shill post.
Refute it then
nice wall of text, but nah. NAH.
Get fricked AMD shills no one's buying your inferior products
Sorry, numbers don't lie.
>1080p
a poor performance indeed
Higher res is going to favor higher vram
Let me guess, you need more?
>6600
>25FPS
>at fricking 1080p
how is this possible? My 6600 does Cyberpunk 2077 at 1080p 60fps with pretty much all settings at max. R&C doesn't even look that good.
see
I mean, first post has already blown out the OP's butthole. Why'd you link me?
It's at max settings, which means the game requires around 10GB of VRAM at 1080p. So the game has to constantly swap textures in and out of RAM... but the 6600 has an 8x – not 16x – PCIe 4.0 interface. And this destroys 1% lows.
The 6700XT is supposed to be only 40-50% faster than the 6600, but it's three times faster in this case because it's not bottlenecked by VRAM and bus interface.
TechPowerUp didn't test this, but I'm sure at high settings (or whatever settings don't go over the 8GB VRAM buffer) the 6600 and the 6700XT are much closer.
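The bottleneck described above comes down to simple bandwidth arithmetic. Here's a back-of-the-envelope sketch; the per-lane rate is the standard PCIe 4.0 figure (16 GT/s with 128b/130b encoding), while the 2 GB overflow size is just an illustration of the ~10 GB requirement spilling past an 8 GB buffer:

```python
# Rough PCIe bandwidth math behind the 6600's x8 bottleneck.
# PCIe 4.0 moves roughly 1.97 GB/s per lane after encoding overhead.
PCIE4_GBPS_PER_LANE = 1.97

def link_bandwidth(lanes: int) -> float:
    """Approximate one-direction PCIe 4.0 bandwidth in GB/s."""
    return PCIE4_GBPS_PER_LANE * lanes

x8 = link_bandwidth(8)    # RX 6600: ~15.8 GB/s
x16 = link_bandwidth(16)  # RX 6700 XT / RX 5700 XT: ~31.5 GB/s

# Hypothetical: the game wants ~10 GB of textures but the card has 8 GB,
# so ~2 GB has to keep streaming over the bus while you play.
overflow_gb = 2.0
print(f"x8 : {x8:.1f} GB/s, {overflow_gb / x8 * 1000:.0f} ms to move {overflow_gb} GB")
print(f"x16: {x16:.1f} GB/s, {overflow_gb / x16 * 1000:.0f} ms to move {overflow_gb} GB")
```

At 60fps a frame budget is ~16.7 ms, so even a fraction of that traffic landing mid-frame wrecks 1% lows, and it hits twice as hard on an x8 link.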
Before I upgraded from a 580 8GB to the 6600, I was running RE4 remake at 1080p 60fps on max settings besides shadows - in the settings menu it would say it was using almost 9GB of VRAM yet I still had no issues or framerate drops.
Seems like another shoddy playstation port, just like that TLOU dumpsterfire. When's Bloodborne?
Different engines behave differently when you run out of VRAM. RE Engine loads lower-quality textures if it needs to, but for the most part performance stays consistent (unless you go significantly above your VRAM buffer).
Other engines swap to RAM once VRAM is full, but this kills performance (TloU does this, for example). Some engines straight-up crash.
It doesn't necessarily mean it's a bad port. Running out of VRAM is something that shouldn't happen in the first place.
Also, the 5700XT (usually a 6600XT-equivalent) is much faster for this same exact reason: it has a 16x bus interface.
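The three out-of-VRAM behaviors described above (degrade textures, spill to system RAM, crash) can be sketched as a toy allocator. This is purely illustrative; no real engine exposes anything like this function:

```python
# Toy model of the three out-of-VRAM strategies described above.
# Illustrative only; strategy names are made up for this sketch.

def load_texture(requested_gb: float, free_vram_gb: float, strategy: str) -> str:
    if requested_gb <= free_vram_gb:
        return "vram"                 # enough room: the happy path
    if strategy == "degrade":         # RE Engine-style: drop to lower-quality mips
        return "vram-low-quality"     # smaller texture fits, performance stays consistent
    if strategy == "spill":           # TLoU-style: overflow into system RAM
        return "system-ram"           # works, but every access crosses the PCIe bus
    # the third kind of engine has no fallback at all
    raise MemoryError("out of VRAM")

print(load_texture(1.5, 0.5, "degrade"))  # vram-low-quality
print(load_texture(1.5, 0.5, "spill"))    # system-ram
```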
R&C launch was dicey. RT isn't an option on AMD cards and there was strange behavior with some AMD cards not hitting max utilization in scenes where they should have.
The game needed more work, considering the graphical downgrades compared to PS5 and how DirectStorage seemingly reduces the frame rate on Nvidia
The game is kinda broken in general.
As you said RT doesn't work on AMD cards for whatever reason, or textures don't load properly on Nvidia cards unless you restart the game (this was fixed yesterday I think?). Also, shadows were completely broken on release, on both brands, RT enabled or disabled didn't matter (fixed with a day-2 patch).
And the game runs so much better on AMD cards right now, people would be losing their minds... if it weren't for the fact this is an Nvidia-sponsored title lmao. I have no idea if it's an issue with DirectStorage, but 1% lows are awful on Nvidia right now, even at 4K. See:
It's perfectly playable, don't get me wrong. It's nowhere near as broken as TLoU Part 1, or Horizon a few years ago. But it requires a bit more polish. And they're definitely polishing, they released two patches in the span of a week.
RTX 3080 beating 3090 wtf is going on here.
>1% low
The variance between the benchmark runs kicks in.
why are you showing the 1% lows of each card?
https://www.pcgameshardware.de/Ratchet-und-Clank-Rift-Apart-Spiel-73185/Specials/Benchmark-Test-Review-PC-Version-1425523/
At least post something more up to date and not made by a 60 year old boomer.
Should've done that ages ago
>nvidia
The 380x has served me well for a long time, I'm in the process of importing a 6600 now
Is this the ti?
Isn't that the gpu which has fans that look like swastikas at high rpms? I don't have the gif but if you buy this specific model be ready for a pleasant Easter egg!
guess i am buying amd huh
it only looks like that with high shutter speed in a camera
to your eye it would just be a blur
So it's like those videos where helicopters appear to be moving without the blades turning?
I'm considering the switch, too. For less than what nvidia is trying to sell a 4060 Ti for, you can buy an AMD card which competes shoulder-to-shoulder with a 4070 Ti. My only concerns are the usual: AMD drivers (or lack thereof) and the extra heat AMD puts out compared to nvidia... i don't want no spaceheater.
>AMD drivers (or lack thereof)
It's a meme. As long as you stick to videogames, AMD drivers are fine if not better. See:
That pic is an extreme example (it's not THAT bad usually), but higher driver overhead is a common Nvidia issue these days. It has been for years, and it's getting worse as videogames become more and more demanding on the CPU. Look at benchmarks for recent videogames: Nvidia has consistently worse 1% lows. Pic related, another example.
>extra heat AMD puts out compared to nvidia
It depends on the card.
7900XTX (360W) vs. 4080 (315W), or 7900XT (320W) vs. 4070TI (300W): AMD is hotter, but not THAT much hotter. Especially when you consider AMD has to power more VRAM chips (+2-3W per chip, each chip is 2GB).
6800XT (300W) vs. 4070 (200W): in this case, previous-gen vs. current-gen, Nvidia is quite a bit more efficient... obviously. It's like comparing the 3080 with the 4070.
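Working out the VRAM power argument with the figures given above (2 GB per GDDR6 chip, roughly 2-3 W each, board powers as quoted):

```python
# Back-of-the-envelope VRAM power math from the figures in the post above.
GB_PER_CHIP = 2          # GDDR6 density assumed in the post
WATTS_PER_CHIP = 2.5     # midpoint of the quoted 2-3 W per chip

def vram_chips(capacity_gb: int) -> int:
    """Number of memory chips needed for a given VRAM capacity."""
    return capacity_gb // GB_PER_CHIP

# 7900 XTX: 24 GB -> 12 chips; 4080: 16 GB -> 8 chips
extra_chips = vram_chips(24) - vram_chips(16)
extra_watts = extra_chips * WATTS_PER_CHIP
print(f"{extra_chips} extra chips ~= {extra_watts:.0f} W of the 45 W TBP gap")
```

So only ~10 W of the 360 W vs. 315 W board-power gap is attributable to memory; the point is just that same-capacity comparisons are the fairer ones.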
> 7900XTX (360W) vs. 4080 (315W), or 7900XT (320W) vs. 4070Ti (300W): AMD is hotter, but not THAT much hotter. Especially when you consider
Why do AMDgays always lie? The 7900XTX consumes over 100W more on average than the 4080 in gaming, and often a lot more than that. That's a massive difference, and you're trying to scam an anon with your nonsense rather than owning up to the truth. AMDgays are an actual cult, i swear.
If you let your GPU run at 100% you will get near max TBP on both. But if you cap framerates and UV, Ada GPUs are much more efficient for sure.
yeah, only decent human beings will buy amd, cause nvidia users are degenerate coomers
>4 years ago
>building brand new pc
>rx 580 looks like some great value
>set up whole thing
>first 2 weeks all great
>start getting black screens literally while just browsing
>use ddu but it keeps happening after a while
>tired of being spooked so just get israelitevidia
i am not sure if it was a combination of wangablows and amd drivers but i just did not want to put up with it anymore
in before
>works on my pc
>k nvidia shill
i'm literally not advocating for israelitevidya either, but do buy something that you can handle
This is just the typical AMD experience. Imagine unironically shilling for israeliteAMD
Literally no one would buy AMD gpus if shit like this happened in reality.
Only AMD driver issues I've seen were due to Windows 10 trying to override AMD drivers with its own. Happened to my nephew, and I just couldn't find a way to stop it; even messing with regedit didn't help. Ended up uninstalling the AMD drivers and letting Windows install its comparatively older build, to stop the conflict every time it tried to override them
>2017. (6 years ago)
>friend wants a mining rig
>build a 3 gpu rig for him with RX 580 8GB gpus
>it worked 24/7 for 5 years
>sell the RX 580 8GB gpus as used when ETH switched to pos
>no complaints they are still running
It's a skill issue.
You tech illiterate Black folk don't know how to use computers, let alone administrate or build them.
>works on my pc
good for you anons, like i said i don't advise against buying an amd gpu, but spending hours of tinkering with ddu/windows Black persony/undervolting only for the problem to never really go away left me with a sour feeling
similar to how my decade old gt 230 would crash and revert itself during its last 2 years of life
To be honest with you, everything should work flawlessly if you use ddu then plug in a new card. Seems like you were one of those unfortunate souls that got a dodgy card.
Their software is a bit finicky though, I've been getting this message for the last 2 days when trying to open Adrenalin. I've had it in the past; it's usually Windows updating drivers on its own causing clashes.
i've seen countless people shitting on AMD and Adrenalin due to this error, when as a matter of fact it's Windows fricking up the driver updates.
Once you disable Windows updating GPU drivers, everything works as it should, cause it's Windows' fault, not AMD's
>windows updating GPU drivers
Why the FRICK does windows do that for you? I’ve never had Windows do that for nvidia, is it a win 11 thing? I literally don’t touch my settings, I’ve just never had it happen before
It's a Windows Home Edition thing I assume? I'm currently using Windows 10 LTSC 2021 (Enterprise branch), where driver updates are considered optional and don't install on their own. Here's what I'm currently seeing if I check for updates.
>Windows 10 LTSC 2021 (Enterprise branch)
you see that cause that version is tailored for user control. Windows 10 Home and Pro aren't, and by default they let Windows do anything on its own unless you specifically go and change it to "don't update" or "ask me about optional updates"
>change it to "dont update"
This changes nothing, because Windows will uncheck (or outright ignore) that option and then just pretend the change never happened.
It's a W10 issue, no idea if it happens on W11 as well.
it doesn't do it on nvidia cause with each update Windows checks what the latest nvidia drivers are, and it either installs them, or doesn't touch them if they're already installed.
But with amd, windows is often a month behind on driver updates and doesn't bother checking. If it finds a different driver than what it thinks is the latest (which is actually from a month prior), it doesn't compare the version number, but simply installs the old driver, which causes Adrenalin incompatibilities, which Windows decides to resolve by removing Adrenalin entirely.
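The failure mode described above boils down to replacing a driver without comparing version numbers first. The correct check is trivial; this is a sketch of the logic only, not how Windows Update is actually implemented, and the version strings are hypothetical examples:

```python
# Sketch of the version check the anon says Windows skips for AMD drivers.
# Driver versions like "23.7.1" compare correctly as integer tuples.

def parse_version(v: str) -> tuple:
    """Turn '23.7.1' into (23, 7, 1) so comparisons are numeric, not lexical."""
    return tuple(int(part) for part in v.split("."))

def should_install(offered: str, installed: str) -> bool:
    """Only install the offered driver if it's actually newer."""
    return parse_version(offered) > parse_version(installed)

# Catalog is a month behind: it offers 23.5.2 over an installed 23.7.1.
print(should_install("23.5.2", "23.7.1"))  # False: the check avoids the downgrade
print(should_install("23.7.2", "23.7.1"))  # True: a genuine update
```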
>it doesn't do it on nvidia cause with each update Windows checks what the latest nvidia drivers are, and it either installs them, or doesn't touch them if they're already installed.
I haven't updated my Nvidia drivers in years (there's no need to because Nvidia doesn't do anything to old hardware except cripple it) and Windows has never even offered to update them, much less done it automatically.
I was under the impression you have to submit your drivers to Microsoft for them to be certified to take part in the automatic updating and Nvidia doesn't do that.
what's the point of your post then?
what's the point of sharing your anecdotal case in which an AMD product didn't work as expected?
when someone does this, they're either suggesting others not buy a clearly flawed product, or shilling for the other side, but you claim to not be doing either, so what's the point?
>my RX580 doesnt work on my pc
sad for you anon, works for me and tens of thousands of other people, so now what?
Same
AMDevoted will gaslight you that it was your fault, but I literally went through 200 "always working fix" steps and did my own research, yet the card shat itself randomly.
Got so upset one day with a black screen when I almost won some MP game that I ordered a 1080, and after popping it in I've had zero problems since.
I will replace it in like 2 years, but after 2 AMD cards I will try another israeliteVidya card.
As a 580 owner for the last six years, I know for a fact you made that up.
You must be moronic.
Works on my machine, rx580 is the best card I've ever bought, it can still run most new games on medium settings 1440p.
I just replaced my 580 8gb with a 6600, but I agree. that thing was a beast and I never had any issues with it. Resident Evil 4 Remake ran 60fps with nearly all settings on max besides shadows. I'm gonna list it for sale tomorrow.
we hear this all the time from both AMD and NVIDIA users, and 99% of the time it's either an actual widespread driver issue that is solved in less than a couple of days (but no one mentions that, cause it would invalidate their complaint), or an isolated issue most users of that same GPU with those same drivers don't have. These posts are utterly useless cause they provide no useful information for anyone, and if we were to give credit to what's said in them, then we should never buy any GPU from either AMD or NVIDIA, cause "it always happens" with all GPUs from both.
Nobody can stop you from buying it, but AMD will definitely stop you from actually using it.
AMD/ATI Radeons win in pure raster, while being about 20% cheaper.
They lose in everything else, such as AI, fake frames, fake upscaling, compute, streaming and video decode.
I'm pretty salty that my overpriced 6900XT can't even do 8K60 playback.
that said, i would like to see nvidia abandon home consumers and go pure B2B, just because the moronic shills and paid astroturfers would lose their minds
I love me my 6950XT I snagged for $600. Way better than the Vega 64 I had for the longest. No issues playing stuff I want at 1440p with 144fps.
I recently "upgraded" from my ancient GTX 950 to an AMD RX 6600 and it is FRICKED UP. I can run some newer games but everything fricking looks weird and blurry at a distance (when it SHOULDN'T), all games have weird performance issues, games that I could previously run perfectly well now have stuttering, and lowering the settings doesn't change anything, sometimes it even makes it worse! EVEN WATCHING VIDEOS IS WORSE! Everything on YouTube looks blurrier than it was on my 10 year old Nvidia card. I looked up shit online and plenty of other people say they have the same problem, but no actual solutions are found. Apparently it's a driver issue of some sort, but the redditors are bullshitting about how "YOU DIDN'T UNINSTALL ALL THE NGREEDIA DRIVERS, THEY ARE TANKING YOUR PERFORMANCE!" and console-war tier babby bullshit like that. Frick AMD. I am never buying another product from them.
rdna 2 has a shitty decoder and encoder.
works on my machine
i bought a 6600xt and it couldn't go an hour in-game without a hard reboot of the computer. so that's the end of that meme.
Should I get a second hand HP OEM 3090 for $700 and Wait™ for 5000 in 2025 or just bite the bullet for a 4090 ($1800)?
I am extremely fricking scared of the 4000 melting connector, btw.
What would you even play with a 4090?
I want to do AI stuff too
>I can't wait 20 seconds more for my [insert anime] AI dicky pics
If this is all you want then get a 3090; get a 4090 if you REALLY need to do 200 prompted pics to sift through for your fap session
There are no good games that need this power
I also want to clone voices, not just Stable Diffusion, but the worst part is I don't even know how much power I need for that; everyone's focused on chatbots and SD.
You can do this stuff even with just CPU now.
In the end, the amount of spare cash should decide what you want to buy.
>You can do this stuff even with just CPU now
I've already tried it but it's slow as shit; I would have to upgrade my CPU anyway (i5 9600K), so might as well build something new entirely.
new 4090s come with a proper 16-pin connector. besides that it really depends what you can get. a factory-new 3090 for 700 at 1440p will last you another 3 years easily. but if it's a miner's 3090 and you're at 4K, or looking to upgrade to 4K and in the market for a $1000+ gpu, it would be much smarter to buy the new tech.
frick nyidia
I had to do 5 RMAs in 5 months with a 7900XTX. 3 brands, 4 models.
Got a 4090, and it just works. Amd is great if it works, but that's a big *if*
>5 RMAs
Yup, it's shill time.
I bought a 6700XT and I'm underwhelmed by it. Honestly, I keep fricking buying AMD because they're so much cheaper, and it keeps being junk. Next time I'm just buying the 4090 tier moron card.
The AMD shills got another one..
>I can afford $300 GPU but next time I will just buy $1500 one
You won't.
Why is Zotac a bad brand?
Shit fans, mostly; nearly every complaint is about them.
Buying a 2070 for 200, upgrading from a 1650 laptop, talk me out of it
Get a 6650xt or 6700xt
That 2070 is worth like $150
I've never owned or played with an AMD gpu, and in all that time I've seen maybe thousands of posts crying about how Witcher 3 wasn't made to support it, or why multi-core causes so many issues in older games, and so on. it just seems bleak to me
Why should I switch now? Honest question
i had a 2070 super and it was acceptable at 1440p, but I never got the 120+ FPS I wanted without turning a bunch of settings down. assuming you're using a 1080p monitor the 2070 should be fine, but I don't think buying an 8GB GPU in 2023 makes sense even at 1080p. the 6700 XT is probably the cheapest 12GB+ card out there
I have that card (XFX 6800xt) it's pretty good OP
Buying amd shit instead of enjoying AI on Nvidia cards.. pathetic.
fake frames
>he has a GPU that generates real frames
woah!
Tried it, regretted it, went back to Team Green.
Having a GPU that just werks is worth the premium.
I tried NVIDIA, regretted it, went back to Team Red.
Having a GPU that just werks and gives the same performance at a lower cost is just great
AMD software is garbage, nothing works. No changes are saved, and in 3 years they'll just launch another software suite which is supposed to work but is just as garbage as the previous shit.
At least you don't have to bother with that shit when using Linux.
Got a 5700XT, worked like a charm
Then an update was released and the drivers would crash just by starting a YouTube video. Rolled back and never installed an update again
When I was buying a new PC in May, I could buy either an RX 6700 XT or a 3060 for the same price. In the end I picked AMD because I previously owned an RX 480 for over 7 years. But people around the net keep telling me I did wrong and Nvidia is way better. I'm tired of this whole comparison stuff, with one site being biased toward Nvidia and the other toward AMD; I just wanted to buy a card that will serve for another 7 years.
I still use a 1680x1050 monitor, by the way.
>People around the net
Aka you only browse reddit and what the Google algorithm recommends you from your telemetry history.
Except I don't even touch Reddit and use DuckDuck instead. By "around the net" I mean Steam and Ganker.
Both sides have cultists and shills.
Try to remember both will try to misinform you, so just go with your gut. Chances are the decision will turn out fine, or you'll learn a good lesson
>got a 6800XT
>0 problems since i bought it
>performs well enough even at AI shit for me (still worse than nvidia though)
>paid way less than for a comparable nvidia product
if nvidia doesn't get their shit together for their gaming line-up, my next gpu will be amd again
also frick brand loyalty, you should never base your purchase on how much you like a certain brand. always go with the superior product for your use case
Can't buy the good AMD GPUs anymore
Have you fellas considered not being poor?
I use nvidia because cuda justwerks™. i use applications like video editing and stablediffusion, and whenever I Google it everyone always says nvidia is simply better for stuff like that. I'm on windows, idk if Linux is different; I hear amd Linux drivers are better than amd Windows drivers.
>blocks in your'e path
nice. hope you snatched that at that price it's a good deal
The most awful installs of Windows 10 Home ignore your settings and update whatever they want
I think Nvidia unsubscribed from the auto-update racket after they once had problems with Windows not recognizing a game-ready driver as the newest available and downgrading some Win10 Home users to older graphics drivers, because game-ready drivers aren't signed? Something like that.