Here. Are you happy now Ganker?
Here is the 250 USD GPU with lots of VRAM that you asked for so much.
Now you can finally upgrade and play Todd's shitty new space simulator and experience the same Oblivion animations and bugs all over again.
Will it work with Unreal Engine 5?
Why wouldn't it?
Even Intel ARC works with UE5
does my dick impress the ladies?
there’s your answer
UE5 doesn't need a specific GPU, sweetie.
I really wish AMD GPUs were better at Blender. Such a good price-to-VRAM ratio, what a shame. 🙁
With ROCm and HIP being ported to Windows, that should improve soon.
All RDNA3 GPUs are supported, and RDNA2 cards from the 6800XT up are supported.
OptiX still absolutely shits on HIP though, so for anyone doing real Blender workloads, you have only the one option really.
Sure. You get slower performance on Blender.
So a 7900XTX runs like a 4080 for games, but runs like a 3080Ti for shit like Blender and RT.
It will still be garbage. ROCm exists on Linux and it's still far worse than Nvidia in, for example, Stable Diffusion.
ROCm is on Windows now, retard.
www.google.com
Don't see auto1111 working on my amd card yet so what do i care. And no, the directml garbage version doesn't count.
Do you have a compatible GPU that supports ROCm's Windows version?
ROCm is on Windows now, retard.
https://rocm.docs.amd.com/en/latest/release/windows_support.html
6700xt. Pretty sure it was listed as supported.
Could be the DirectML version hasn't been properly updated. It's supposed to use DirectML only if CUDA or ROCm is not available.
Were you doing it under Windows or Linux?
Linux ROCm doesn't have as many GPUs supported natively.
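The fallback behavior described above (use DirectML only when CUDA or ROCm is unavailable) boils down to a preference list. This is a toy sketch under that assumption; `pick_backend` and the backend names are hypothetical illustrations, not the webui's actual code:

```python
# Toy sketch of backend-selection fallback, assuming the claimed order:
# CUDA -> ROCm -> DirectML -> CPU. Illustrative only, not taken from the
# actual stable-diffusion-webui source.
PREFERENCE = ("cuda", "rocm", "directml", "cpu")

def pick_backend(available):
    """Return the first backend from PREFERENCE that is actually usable."""
    for backend in PREFERENCE:
        if backend in available:
            return backend
    return "cpu"  # always-safe last resort
```

So a box exposing only DirectML would get `directml`, while a ROCm-capable card should never fall through to it.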
Didn't try the directml fork since rocm came out. Figured it was directml only and wouldn't fall back onto anything else. Or that there would be some news about it.
I'll give it another go. Last time I tried directml I wanted to cut myself with paper as it was slower than a 2060 in my laptop.
I know. I used it myself and running it in Linux got me 4 it/s at best. Shark is fast with Vulkan, relatively speaking, but featureless.
Were you using Linux or Windows?
windows
Ah. Maybe the fork wasn't completed yet. I know it's dependent on Python updating first, so maybe full support isn't ready. Sorry, I've been waiting for it to be idiot-proofed before I check it out myself.
Found an old DirectML folder I still had. Tried it and I'm getting 2 seconds per iteration. Same old stuff.
Wait is that the problem with you people? You can't wait 2 seconds for each step? Man I've been running SD on a 960m for a year and it takes me almost a fucking minute for each step. Damn
fuck you
I am not an amdlet, and you always lie, the same with "the drivers are le fixed lol"
Rocm is fine for stable diffusion
a 7900xtx gets ~20it/s which is about what a 4080 gets, and the amd is cheaper
So if I have a 6700xt I'm fucked? Stable diffusion will continue to suck?
The 6700XT doesn't get the HIP SDK, but it does get the ROCm runtime.
I dont know what that means. I googled it but I'll assume you're saying it's going to work ok
ROCm is AMD's CUDA competitor, and HIP translates CUDA code to ROCm code, I think.
My dude that is all chinese to me. I'm just a chemist
cuda is the language with which you program nvidia cards. since amd shit the bed keeping up with nvidia's cards in terms of processing power for AI, everyone just bought nvidia, so most AI code is written in cuda. if amd wants to break the monopoly they needed an easy way to get people to buy amd cards without making them learn a new language, so they built something that translates cuda code to their own language (take this with a grain of salt, i never read into it that much)
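A rough picture of the translation described above: AMD's hipify tools mechanically rewrite CUDA API calls into their HIP equivalents. This toy version only renames a few known function pairs and is purely illustrative; the real tooling handles kernels, headers, and far more:

```python
# Toy illustration of CUDA -> HIP source translation. The rename table
# mirrors real API pairs (cudaMalloc -> hipMalloc, etc.), but this is a
# drastically simplified sketch, not AMD's actual hipify tooling.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def toy_hipify(source: str) -> str:
    """Rename known CUDA API calls to their HIP counterparts."""
    for cuda_name, hip_name in CUDA_TO_HIP.items():
        source = source.replace(cuda_name, hip_name)
    return source
```

The point being: most CUDA code maps over almost one-to-one, which is the whole sales pitch.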
Why do people care? Most of the gays out there praising the capabilities of Nvidia GPUs in software and productivity tasks don't even use them. In fact, most of those gays are on 60/70-class cards anyway. If you really need to use that shit you don't go for the low/mid tier. It's nothing more than a coping mechanism to justify their purchase.
>is nothing more than a coping mechanism to justify themselves over their purchase.
yup.
a lot of vram is good for running AI models too
It's a shame that the cards with the most VRAM at a cheaper price (on the new market anyway) are slow as shit (3060 12GB, 4060 Ti 16GB) or AMD cards, which are basically crippled.
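As a ballpark for why VRAM matters for AI: just holding a model's weights takes parameter count times bytes per parameter, before activations or latents. A hypothetical back-of-envelope helper:

```python
def weight_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM (in GiB) needed just to hold model weights.
    fp16/bf16 = 2 bytes per parameter; real usage adds activations,
    latents, etc. Back-of-envelope only, not a benchmark."""
    return n_params * bytes_per_param / 1024**3
```

For example, a 7B-parameter model in fp16 needs roughly 13 GiB for weights alone, which is why 12GB cards are already tight.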
>Now you can finally upgrade and play Todd's shitty new space simulator and experience the same Oblivion animations and bugs all over again.
It's an unoptimized mess but it doesn't really use that much vram
refurbished 3060 12GB costs $205
Link me
>refurbished
Just get a 7800? Same price as 4070 but way better performance.
>same price as 4070
lol no
It's $500.
So a 4070 MSRP.
No. 4070 is more expensive. $599 MSRP.
4060ti MSRP is $499.
It's the same price as a 4060Ti and 27% faster.
>Pretending to be retarded.
>way better performance
it's the same performance without garden gnomevidia tech which arguably makes it worse
Because the 6800 is better
As long as it's cheaper and new. Do not try to get anything used.
I already upgraded to a 6750XT OC
I don't need to buy a heater anymore either
same. it's overkill for anything i play
I even got a 1440p-144Hz screen to go with it after playing mostly at 30 FPS my whole life. Can't see that much of a difference to 60 FPS but the extra resolution is nice.
30-60 is really iffy. 60-120 however, is a massive difference.
I don't really see it. I'm just happy I don't have to play at 800x600 anymore
Every time I see a post like this I assume it's bait there is no way people can be this blind.
I didnt mean between 30 and 60, but 60 and 144
I uphold my statement. You must only play slow paced games. Even going from 60 to 90 is very noticeable.
I dont play any shooters or anything like that except TF2 once a year. Usually strategy games
a stable 30 is 100% playable and fine
stable 60 is definitely better but as long as it's stable your brain doesn't notice as much.
Stable 30 > variable 45-60
On my Steam Deck I can even get away with 20fps on certain titles
Runescape on my phone is a great title that runs on low fps. I have it locked to 15fps and I think it even looks better than when I run it at 60. Though this is a special case where my nostalgia for my shitty pc supersedes the actual better looking 60. My phone's battery likes the 15 too so I keep it there.
No RTX.
No DLSS.
No deal.
To hell with that witchcraft.
You're not supposed to use those once your computer is good enough retard.
Nah mate no need. I still use a GTX 980 and I'm yet to meet anything that I want to play which I can't already.
Last time I got an amd card, a 290x, it died after 3 years, couldn't play a massive amount of old games, emulating was a pain in the ass if not downright impossible and video editing was like getting kicked in the nuts.
It's not just the price; that was almost 10 years ago and AMD is somehow still suffering through this bullshit.
No it's not.
Literally like 70% of this thread is bitching about amd issues.
And there's a bunch of new, unlisted issues like blender and AI.
Now I'm not an nvidia shill, I fucking hate that gay slit eyed garden gnome jensen and garden gnomevidia, but AMD is insanely incompetent when it comes to GPUs and there's a reason why people STILL refuse to buy them despite garden gnomevidia treating everyone like cattle.
All of those posts are gays from /pcbg/ astroturfing and zoomers ironically shitposting.
I own an AMD GPU.
I can use it for AI.
I can use it for emulation.
I can play old games with it.
But go ahead and believe whatever you want. I can't force you to believe me.
pretty much this, I've used AMD since 2013 and apart from the dicey drivers in 2014 things have been rock solid, it's all nothing but FUD
I have a 390x (rebranded 290) and I've never had any of those issues.
I have been an AMD user since the 290, and I agree the 290 was the worst card AMD ever made; all AMD cards I have had since then worked 100% fine. That thing was cursed.
I'm still using this 2070 super until the rtx 6xxx generation at least. I give 0 fucks about AAA garbage
I already have a 6700 XT, so I don't feel the need to upgrade for at least another 2-3 years.
I'm not interested in Todd's game either.
Too bad they don't work for stable diffusion, cope amdlets!
They do now!
proof?
https://github.com/lshqqytiger/stable-diffusion-webui-directml
Everything except training works right now, you just have to install a fork of automatic
Have you messed with the new windows port of ROCm?
I'm not a programmer guy I just play video games and generate images of tiny women with giant tits
Bought an RX 6800 XT recently that died on me within 5 days that I had to send back. Pretty annoying but I got a full refund at least, never buying AMD again.
>Gigabyte user.
Yup, yup.
Buy Sapphire, not ASRock, retard
I remember that one time I bought a Sapphire card. It was back in 2010(?), and it was the ATI Radeon 4870. I had to RMA it 4 times. Yes, four times. The first time, the fans stopped spinning, a vbios issue, and I couldn't service it myself. The replacement card popped a capacitor and died. The replacement of that one had VRAM issues; it began artifacting and died shortly after. The 4th card worked for a couple of years, until one of the fans stopped spinning. I had to homebrew a replacement fan, and chisel it to stop it from hitting the heatsink.
FUCK amd video cards and FUCK Sapphire.
This one is unique. You put some thought into this one. I applaud your restraint in not ending it with, "But then I bought nvidia and never had a problem since."
Yes, in fact, I have exclusively purchased nvidia video cards ever since. I am unironically happy with the cards themselves, but I dislike the fact they are openly and brazenly kiking me out of my money, because they have an effective monopoly and can afford to push out slop that's marginally better than the previous generation, charge you 50% more for it, and act like they are doing you a favor.
I wish it wasn't so.
Well you're a retarded fanboy now because of a single bad card. Enjoy that green dick up your ass.
>fanboy
Hardly. I hate the company. But I like products that are reliable. Moreover, DLSS is an extremely useful technology, and FSR is decades behind.
I will skip a generation, maybe two, like I often do. My 3090ti will allow me to comfortably coast until 2025.
Imagine using a dx10 card for that long
>dx11 in 2011
How many games even utilized dx11 back then? 99% of games were running fucking dx9.
https://www.pcgamingwiki.com/wiki/List_of_Direct3D_11_games
Are you genuinely cognitively impaired? Take a look at the webpage you linked. There are hardly any games using dx11 in that time period, you chucklefuck.
Well, first of all you're a gay moron. Second, if you ctrl+f 2011 you'll find most of the highly relevant 2011 games.
>18 games out of the 100+ dx11 games
>only 8 are actual video-games, the rest being simulators
braindead
I guess I should've said terascale. Terascale quickly aged like garbage.
I also had a 4000 card and the fan also didnt work lol
But I didnt give a fuck so it ran at 120 Celsius for 8 years
I have never had tech fail in my entire life and I'm starting to think it's 99% user error.
The worst brands have an average of a 6-11% failure rate. AT WORST. Not always.
Thats definitely something that can happen to people.
Because it is.
But some components are more prone to dying than the rest.
And around 1% of hardware is DOA, mostly due to shipping.
everyone will rush to the internet and bitch about their hardware being a lemon, so it just seems like it's a frequent occurrence
The people whose hardware works don't talk about it, they just use it. They're the vast majority. The people whose hardware does fail spend all the time talking about it online and never forgive it. Loud minority.
I bought a 5600XT in 2020; the POS would randomly green screen due to Gigabyte fucking up as usual. I refunded it and got an Nvidia card for now, but I will probably get a Sapphire or PowerColor next time.
My 6600 was defective, got a 6700 with the refund money and it just werks. I'd sooner quit PC gaming than buy Nvidia again.
my 6700xt ASRock works flawlessly
cope nvidroid
Thank you Mr Jensen
i'll keep waiting and see what happens when the 50 series comes out
AMD cards are fin-ACK!
>Works on my mach-ACK!
But seriously, morons who say they have no problems with AMD cards are just liars. All these gays either don't play videogames at all or just play 1 hr a day, and it's some dogshit gacha or CS/Dota-tier shit that runs trouble-free on any hardware.
Pic related is the real AMD experience.
>being this buttmad you gotta shitpost this hard
I hope you're being paid to do this.
my amd card is pushing 5 years since I bought it and has been slammed constantly
rendering video
playing games
never turn off my pc, even had it OCd for a while when I was big into COD and wanted more fps
6800XT owner here. Have had the card for about a year and a half now. Not a single AMD driver crash yet, and I am the type of NEET who literally plays videogames 24/7 while watching videos on my second monitor.
What brand is your card?
ASUS TUF
I gave amd a chance with Vega. It was utter trash because of the driver issues. I don't even care about the price/performance issue; the fucking drivers were a nightmare, and the stutters in multiple games.
just honestly fuck AMD. Nvidia garden gnomes you to hell and back with its prices but at least their shit works.
Vega was garbage. RDNA2 was really good. RDNA3 is almost as good.
works fine on my linux machine
I think the problem is Windows Update; it can delete your drivers and Adrenalin installs
I already got a 3060 12gb for $240
my 5700xt runs Todd's game fine at 1080p
no reason to upgrade
if dragon's dogma 2 runs well I can hold out even longer
just keep waiting
2 more cycles
5700xt was in retrospect the best time to buy
great card.
well that or 1080ti for richfags
IMO here's the GOAT timeline for cards. In retrospect this is what we all should have done: the road traveled by the GOAT cards in terms of performance and longevity for the price
generic shitty card
nvidia 8800
amd (ati?) 4870
amd 7870
nvidia1080ti or amd 290x
amd 5700x
and now wait for the next leap
Not bad.
ATI's last branded line was the HD 5xxx series, iirc, but AMD had already bought them out by then.
>amd 5700x
lol
maybe in terms of performance per price in a vacuum, but in reality this was the most bug-ridden, unstable AMD card of the last decade
it chased away half the AMD userbase to nvidia lol
i mean xt. i never had it, but it's still doing well now and costs almost nothing used. mogs my 2060 (that i got as a gift so im not mad about it)
Can confirm 🙁
they fixed the driver issues as time went on, the hate on the GPU makes them amazing price-wise now
t. recently upgraded a pc to a 5700xt and never had an issue.
yeah, amd are always dogshit early due to their shit driver guys, but in the long term they become good value
I always buy their gpus at the 1 year mark at the very least and it's been a pretty good experience, so yeah I can concur.
bro needs to snag a used 5700xt, the 480 is going to get choked at 1080p
I've had my 5700XT for 3 years and I paid $400 for it. I've had no issues at all; it's a great card. I'll probably get another 2 years out of it.
For me, it's:
>2007: GeForce 8800GT
>2009: Radeon HD 5870 (or 5850)
>2012: Radeon HD 7870 (or 7850)
>2014: GeForce GTX 970
>2016: GeForce GTX 1060 6GB / Radeon RX 480
>2019: Radeon RX 5700XT
The $200-400 goats. Bonus mention:
>2022: Radeon RX 6700XT (when it dropped to $350)
Not a massive upgrade compared to the 5700XT (+30%), but it's worth upgrading to for those still stuck with a 1060 or 480 (2-3x performance).
I forgot:
>2013: Radeon R9 280X
Similar to the 6700XT. Not that much of an upgrade compared to the 7870, but great value for those with older cards.
Nvidia shills are lying! They fixed the driv-ACK!
https://community.amd.com/t5/gaming/how-to-running-optimized-automatic1111-stable-diffusion-webui-on/ba-p/625585
>Now you can finally upgrade and play Todd's shitty new space simulator
Starfield already runs pretty pathetically on the 7600. I don't get what an extra-VRAM version will do, considering Starfield only uses about 4-5GB of VRAM.
Literally had an amdlet yesterday in trash SD thread bitching about his card not working. Cope.
>trash
Consider where you were.
Who else generates porn? weebs can easily find anything in gelbooru. They rarely need SD.
>thinks trash is the only board doing SD
Frig off.
weebs have much more porn than furries
>intel gpu >>amd gpu
post gpu
I already got a 5700xt 8gb for $70
why bother?
I thought they were done with the lineup?
anyway it makes sense, there's a massive gap between the 7700 XT and the 7600
Still using RX 480, but I'm not upgrading until 4k becomes properly priced.
I'm fine running most games at 40 fps and 1080p
>Still using RX 480, but I'm not upgrading until 4k becomes properly priced.
You're gonna wait a long time. Mid-range GPUs are always neutered so they don't scale properly with resolution. Lower-end cards are still targeting 1080p. You're waiting for a future when low-end cards target 1440p.
That future was actually in the past: the 1060 was a big enough jump that it could deliver decent 1440p performance at the time.
its going to run worse than their last gen cards
don't gpus come out with ridiculously inflated prices nowadays? I don't have any hope that it'll be anywhere near 250 tbh
Prolly $350. The base 7600 is like $260 iirc.
fsr 3.0 will save us amdfags...
AMDkings don't care for fake frames or fake resolution.
>Here is the 250 USD GPU with lots of VRAM that you asked for so much.
bro how is it gonna cost $250 when that's the price of the absolute cheapest 7600 non-XT? It's probably going to be $350, maybe $300 at most.
Even before Rocm was officially released on Windows, I was using it through WSL to run Stable Diffusion without any issue.
Waiting for the 7900XTX Taichi to fall under 1000€
I have the white version of the 7900xtx Taichi.
What a beautiful graphics card.
DLSS and FSR look fucking hideous why would anyone ever use that shit
Because if you're not using those then you're using TAA.
>inb4 "i play without antialiasing"
Lol no you don't. Only 4k fags don't need antialiasing and they're a tiny percentage.
TAA doesn't have motion smearing. TAA isn't the only AA.
I feel like you're the same dude that brings this up in every thread. I will dub you TAA Autist.
>TAA doesn't have motion smearing.
Oh, no no no.
HAHAHAHA.
>TAA doesn't have motion smearing.
come the fuck on dude
go open up rdr2 or halo infinite
stop trolling
I think it's time you killed yourself.
It might get me to upgrade from the RX5700 I've been using since I bought it just before coof quarantine.
i think that one's still good for at least 1 more gen
Yeah probably. This thing's been Ol' Reliable for me since I got it, few crashes and it chews through anything I throw at it where 1440p is concerned.
Why would anyone buy something? weird.
why are they taking so long to release anything?
Small gpu company please understand
Because they want to release gimped GPUs that can get hundreds of frames with FSR3. The problem is FSR3 is taking forever, so they can't release any new GPUs. They're basically copying Nvidia.
Lol
I don't have a single game with TAA so I don't know what this autistic queer is spazzing out over.
>linux gaming
grim. im gonna have to use this shit too once windows goes full cloud and they stop supporting w10
Test it out on a live OS, don't go jumping straight into linux.
>7900xtx
It's a 7900xt
>7900xt
It's a 7800xt
>7800xt
It's a 7700xt
>7700xt
It's a 7600xt
They pulled the same stunt as Nvidia, but because they're cheaper people are hailing them as GPU saviors instead of the crooks they are, just wait for next gen because this one is cursed
Honestly if you have a GPU from last gen there's no reason to upgrade yet.
Exactly. I do regret buying an MSI 6700XT instead of a Sapphire because it's loud as fuck, but it's definitely more than enough to play my games at 1440p, and I won't buy another one unless I get a significant upgrade for a decent price/performance ratio.
Doesn't help that the 4000 series in general is bad value due to low VRAM, unless you decide to go for a 4090. Been twiddling my thumbs on an old 1070 Ti hoping there's either a 4000 Super series with more sane pricing, or 5000 makes shit viable again.
>4000 Super series with more sane pricing
Lmao, not going to happen bucko. Wait for next gen if you want to get Nvidia.
>hoping there's either a 4000 Super series with more sane pricing, or 5000 makes shit viable again.
There are many rumours around that AMD won't compete with Nvidia next generation. The RX8000 will probably be an RDNA3 refresh (or RDNA3.5). Nvidia has free rein. They can price the 5000 series however the hell they want.
>if you have a GPU from last gen there's no reason to upgrade yet
This. ALWAYS skip a gen unless you get like 100-150% perf for half the price you paid for your current card.
>buy 1080ti
>literally never buy another gpu again
Wait for the 7090.
>next gen bro, this time for sure
Just like last gen and the one before that and one before that.
3000/6000 series were great performance for a good price. The problem was cryptobros ruining the market for everyone.
So they weren't available for a good price for most of their life...
At launch they were, but retailers were stupid so scalpers and miners bought everything with bots
You needed to wait outside Micro center to get one.
1080Ti Chads keep winning.
1080ti can't play shitfield at 1080p60 low, it's so over...
3000 was a good series of cards, so were rx6000. their MSRPs going nuts being because of the mining craze doesn't change that.
They were the goyslop of cards. They offered lowered prices, yes, but arguably that's a bad thing since you're introducing third worlders and console plebs into PC gaming. The RTX 2000 was much more revolutionary and so is the 4000 series.
>The RTX 2000 was much more revolutionary
lol
what
everyone was disappointed with the 2000 series compared to 1000
there's a reason the SUPER refresh of 2000 cards came out, that wouldn't have happened if they were revolutionary
Ok bros, so who makes the best AMD cards and who makes the worst ones?
Sapphire and Asus...I think.
What about XFX, any experience with them?
Not great, not terrible; perfectly middle of the road. Should suit you fine for the years to come, but I heard their customer service is absolute garbage (can't confirm since I never bought from them).
>best
Powercolor and sapphire
>Worst
MSI(jet engine) and ASUS(don't even need an explanation)
i got a used asus TUF card a while back and it was ok for me, what's meant to be bad about asus? not arguing etc, just interested to know why people say they suck so i don't get another later on
I just don't trust them after the MOBO fiasco some time ago
So far yes MSI hasn't given me any trouble, it works fine but it's definitely way too loud
>the MOBO fiasco
The shit about the drivers burning the CPU? What about MSI lying about their shit being hacked?
Same thing as ASUS and GIGABYTE when it comes to my opinion of them, only reason I bought MSI is because my card died and I needed a new one ASAP, sapphire wasn't available and MSI was on sale.
Is MSI's only problem loud cooling? And what's wrong with Asus? I've had an Asus laptop for 12 years, never gave me a problem.
>Best
Sapphire
>Great
Powercolor, XFX, AMD (MBA)
>Alright
ASUS, MSI
>Bad
Asrock
>Worst
Gigabyte
Also
This list is just looking at the quality and performance of the GPUs. Not the overall performance and value of the company.
ASUS are overall fairly mediocre, for example, but their customer service is absolute dogshit, F tier. If something breaks, whether it's the GPU, the mainboard or anything else, they will not give a shit. They put you into day-long conversations only to refuse a refund or replacement anyway, because the pajeet on the other end of the line never had the power to offer any kind of replacement or refund in the first place.
Why are you placing AssRock so low? I haven't seen any problems with their cards. But I've seen many gays complaining about high temps on the PowerColor Red Devil due to bad thermal paste application on the 7900XTX.
Also reference design cooling on AMD is just bad.
Asrock are a gamble.
They are either great or fucking dogshit depending on how great your luck is.
Assrock stuff is generally cheaper here than most other brands and i personally come from a poorfag household so i got multiple parts from them in the past. Some that lasted way longer than they should have and others that had to get replaced or refunded.
If you asked me which name is the most inconsistent when it comes to quality, it's definitely Assrock.
I know their low-end motherboards are trash, but so far I've seen no complaints about their GPUs. They are also relatively new to that market.
The low end board from Asrock was my best experience with them funny enough. That board was cheap as shit and it just worked. No issues whatsoever.
The VRMs on many of those couldn't even handle an i5 or R7 without overheating, or they came with a hard-coded power limit; either way you were fucked.
I had an AssRock board for many years, pic related.
Had a VRM death on my old-ass ASRock board from the AMD64 days, would have figured they'd be better about that a decade and a bit later though given how much more of a draw modern hardware requirements are.
It's quite funny because the boards which would benefit from VRM cooling the most usually don't come with any heatsinks.
Asus is even worse than AsRock at cutting corners on entry-level hardware despite pretending to be a "premium" brand.
Are you talking about the actual board and its components, or the thermals?
GPUs overall.
Of course each brand has its history of ups and downs in different areas. For AMD: Sapphire are generally the best in regards to reliability and performance. Powercolor meanwhile is all about cooling and overclocking abilities.
I've never bought Sapphire because that's blue.
AMD is red.
I recognize they have the best cards but it does not match.
I'm a Powercolor main with autism.
But I'm still using my rx580?
all that so you can play poorly ported console games or gacha phone games on their pc clients.
How do Nvidiafags live now that EVGA are dead? They were the best with basically no competition. Top tier quality with zero issues and a top tier customer service in the rare case something actually went wrong somehow.
What you wanna buy now? MSI? Fucking Zotac?
EVGA had almost no presence outside of the US, so for the vast majority it didn't change anything.
But it's still sad because their cards looked nice and customer support was great.
Get Nvidia so you can generate literal anime. Pic related, my latest masterwork.
why is the lady in the back getting absorbed by a kid
japan innit
It happens all the time in Japan.
AMD's 7000 series can do that too, as they also have AI cores now.
Nvidia still has the advantage in most (not all) raytracing games and workstation loads though. But AMD is kicking their ass in non-upscaled framerate value.
Is the performance good?
Is this relevant to your question?
this is big if true
https://community.amd.com/t5/gaming/how-to-running-optimized-automatic1111-stable-diffusion-webui-on/ba-p/625585
Sure. Here's the official guide on how to set up Olive.
You need a beast PC to convert models for this. And there isn't a civitai equivalent for pre-converted models if you don't have one.
A few months old so not counting the recent GPU releases.
this is fake news, I've seen benches where the 4090 is getting 30its
Again: It's not the most up to date and both are probably performing way higher now.
Not him, but it's not hassle-free on Windows with all the dependencies you need to install. Updating the GPU driver can just break the whole setup, and Windows isn't kind enough to tell you what's wrong.
Shark is much more limited but it Just Works™ so for beginners it's better.
why does this bitch have toasted bread if everything else looks pre-restoration
it's an isekai about an elf who travels through time
>those nips
wew, what site do you generate these on
I generate them locally.
Nice
whats your setup for generating and creating the animation sequence? Or are you just generating a shitload of frames based on specific curations and then compiling them into a gif/webm?
I found an online thing for you to play around with.
https://huggingface.co/spaces/guoyww/AnimateDiff
It's not using the latest 15v2 version though which was a massive boost in quality.
wtf
no milly?
I've started playing around with stable diffusion. Someday I'll make something cool and extraordinary and creative, but I've just been busy making porn of anime characters and k-pop girls, and cumming to them makes me tired
Jesus christ, sometimes I wonder if AIfags have working eyes
they don't, they think they do though
rude
see pic
>250
No, that's insanely gnomish. It should be $100.
itll be 375
400 in europe for the shittest model since the 7700 XT is 500
All the more reason to not buy one. Nobody should be buying GPUs right now. They're a complete ripoff and AMD and Nvidia are both colluding to gaslight the general population into thinking GPUs NEED to cost over twice what they should actually cost.
You are LITERALLY a bad person if you cave to them.
This. If you buy a new GPU now, you're telling them that you're fine with increased prices. If the current batch of cards is successful, they WILL increase prices again next generation.
I bought a used 3060ti for 160 bucks and I'm happy with it, it's great for 1080p. My next buy will probably be in few years when a 50xx card is around 150 bucks too.
How about they make a cheap card instead, huh? There hasn't been a viable upgrade from 480 in $200-250 budget for >7 years at this point.
The 6600 series were a great pick for a while. If AMD can release the 7600XT with 12GB for $250, that sounds really good too. The 4060 at $300 isn't bad either.
>7600XT with 12GB for $250
7600 is already $270
blame OP
Then 7600XT 12GB for $300 will compete directly with the $300 4060. The extra VRAM will be a very welcome addition.
Different regions. Different prices.
A 7600 XT with 12 GB will be 300 bucks at worst in most countries. Maybe 320 with some higher markup. Still a value king for anyone who just wants a mid-low GPU that can play even the worst optimized AAA slop and last 5+ years
>5 years
not even the 4070 and the 7800 XT will last that much
>This delusional
The more you buy the more you save.
games fully transitioned to 9th gen at this point, minimal system requirements won't noticeably rise until next one
Not him, but we've only started seeing UE5 games and it will only get worse from now on.
All UE5 games have bombed hard so far. I don't see specs getting any higher, because the people that can play those games are a tiny percentage. Go look up steam hardware stats.
I will not upgrade until I can get 16gb for $300. Seethe garden gnomes.
You set too low a bar. 32GB of VRAM for $300 or no deal. Nothing that's out is worth a $1000+ GPU.
I just got a 4070 TI and 13th gen processor I'm good for 4 years senpai. Miss me with that new shit.
> 12GB
But for real, Nvidia and AMD GPU divisions deserve to be bombed into the stone age (like early 2000's) in a global war. Less computers would ironically improve the world.
So what's better, AMD or Nvidia?
Right now it's Intel #1 or a stupid cheap (meaning not sold on the usual new sale websites) last gen GPU that isn't a 3090.
Bro. The Pajeet responsible for AMD's worst era for drivers is now the head of the driver division for graphics at Intel.
Have fun with that.
Raja Koduri isn't responsible for the drivers. He's the architect and was the main guy when ATI were making sick graphics cards before AMD bought them. The Intel cards he's designed recently are getting high praise. Meanwhile AMD are still shit.
Raja left Intel in April.
Nvidia is for normalfags
AMD is for contrarians
intel is contrarians
amd is poorfags
>amd is poorfags
But the price is almost the same. At most you'll save 100 dollars compared to the Nvidia equivalent. That statement doesn't make sense to me.
nvidia is for braindead zoomers who need youtube to think for them
AMD is for people who think troubleshooting random shit is fun.
No such thing.
AMD is cheaper... in the US and nowhere else, see you in 2 years. And 7800 XT is just 6800 XT with shit drivers and no better power consumption.
Nvidia has DLSS 2, doesn't crap the bed in DX9, but has dropped driver support for new games in favor of quick AI bucks, and has only the 4070 as even a somewhat reasonable option... once it drops in price, and it really should have had 16GB of VRAM.
Also CPUs are now at least as important as the GPU, it's not 2012 anymore.
AMD is better, except not, because it's one issue after another, and the previous great socket isn't recommendable anymore.
Intel has yet to catch up to Ryzen.
>And 7800 XT is just 6800 XT with shit drivers and no better power consumption.
But it has sick OC capabilities. In some games it manages to compete against a 4070Ti.
No one's gonna do that
Their loss.
True but no one's still gonna do that
OCing is outside of the scope of the average broccoli-headed Gankeritter poster. Half these retards will burn their houses down in the attempt, the rest will just kill their GPU then we'll have another batch of dipshit retards screaming about how AMD was soooooo bad because their card died on them trying to do something they saw on Youtube.
>Half these retards will burn their houses down in the attempt,
To OC AMD cards you need to drop the voltage and raise the power limit. You never touch the frequencies, and the power limit is capped at +15%. In the end you get a better-performing card while consuming the same power and dissipating the same heat.
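The logic of that tweak can be sketched with back-of-envelope math, assuming the usual dynamic-power rule of thumb (P roughly proportional to V² × f). The voltages and clocks below are made-up illustrative numbers, not anything from an AMD spec sheet, and the model deliberately ignores that real silicon needs more voltage at higher clocks:

```python
# Back-of-envelope sketch of why "undervolt + raise power limit" works.
# Assumed rule of thumb: dynamic GPU power P = k * V^2 * f (capacitance folded into k).

def dynamic_power(voltage: float, clock_mhz: float, k: float = 1.0) -> float:
    """Toy model: P = k * V^2 * f."""
    return k * voltage ** 2 * clock_mhz

stock = dynamic_power(1.10, 2400)        # hypothetical stock voltage/clock
undervolted = dynamic_power(1.00, 2400)  # same clock, -100 mV

# Same clock at lower voltage draws ~17% less power...
savings = 1 - undervolted / stock

# ...so within a +15% power limit the card has headroom to boost higher.
budget = stock * 1.15
max_clock = budget / (1.0 * 1.00 ** 2)   # solve f from P = k*V^2*f at 1.00 V

print(round(savings, 3))  # 0.174 of the power saved at iso-clock
print(round(max_clock))   # 3340 -- toy-model ceiling; real V/F curves cap this far lower
```

The takeaway matches the post: you spend the voltage savings on clocks, so power draw and heat stay roughly where they were.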
Congrats.
But you're not the average Gankeriktoker.
These retards don't even DDU when they switch GPU brands, then throw tantrums because their AMD card wasn't working properly.
>and no better power consumption.
Yeah going from 400+ to 300 W drain is basically nothing right?
>same price
>around the same performance
>but slightly better in most games
>on RDNA 3 so it can use FSR 3, HYPR-RX and Anti-Lag+, which previous GPUs can't
>overall 20% more efficient than the 6800 XT and that despite barely any driver support at this point
The 6800 is also more expensive in most regions by like 20-50 bucks. Will probably change fast though.
AMD, not only because it's better value (unless you need top-end or AI which AMD doesn't have quite right), but also
>giving NJudea money after the 970
>giving NJudea money after 2000 series
>giving NJudea money after cooperating with 3000 series scalpers
>giving NJudea money after 4000 series
There's also Intel now but I'd wait a few more years for them to get drivers right.
>giving NJudea money after cooperating with 3000 series scalpers
This right here. People will keep giving Nvidia money after they fucked all of their consumers in the ass with the LHR cards.
Don't forget all the times they gimped older gen GPUs through driver updates, and intentionally hardware-locked features just to force people to upgrade.
>and intentionally hardware-locked features just to force people to upgrade.
AMD does the same. See Anti-Lag+.
???
It's exclusive to RDNA3.
>giving AMD money at all
Don't make me laugh. AMD enabled nvidia to do what they've done because AMD have been such utter garbage for a good 10+ years now. Their graphics hardware is literally amongst the worst in terms of features, support, compatibility and developer priority.
Pffft.
AMD could fix every issue and you gay doomers would still bring up shit from 4 years ago that's not even an issue anymore.
More like if amd fixed all their issues people would buy them. Take a good look at their CPU division. Everyone clowned on them when their CPUs were failed abortions compared to intel CPUs but when they finally fixed up with ryzen everyone dunked on intel instead for being greedy garden gnomes.
I've been using AMD CPUs off and on since 2001. They only beat Intel when Intel shits the bed. They've always been a value brand in that regard.
AMD has better features and support than fucking Intel does.
No they fucking don't. Intel is already up there with nvidia with the latest cutting edge ray tracing and AI hardware in their GPUs. AMD doesn't have either of these, they rely on shader cores for everything.
Ray tracing and upscaling are not the only features that GPUs have.
Good luck getting that Intel GPU to run Blender, or Cinebench.
>blender
AMD sucks at this too, hell, good luck trying to run autodesk on an AMD GPU.
>cinebench
A fucking benchmark?
AMD runs blender about 1 gen behind nvidia. Intel not at all.
AMD have had decades to catch up to nvidia but always fumble. Intel are relatively new to performance GPUs and they're rapidly catching up to nvidia, forget amd. Their main shortcoming is their driver library, which they're building from scratch, but even then it's looking extremely promising. Amd have completely stagnated to the point they aren't even on the same trajectory as intel and nvidia. You may as well classify nvidia and intel as A class and amd as B class.
$0.05 has been credited to your account.
The only poster that comes off as an irate, kvetching shill is (you), anon.
I'm just bowing out before you gays start bringing up outdated shit from the Vega era to say about current day AMD.
We're like two, or three posts away from having "AMD can't run emulators"
>AMD can't run emulators
How good is Radeon at Xenia, the only real GPU-bound emulator?
>GPU bound
the point is about correctness not performance
You mean accuracy?
Current day amd is no different from 2013 amd. They're still providing mixed bag hardware and ignoring what the fans want. An entire generation of gaming is being held back by amd again. Two gens in a fucking row. Meanwhile we just heard news of switch 2 being an absolute monster because of the partnership with nvidia. A switch 2 will likely provide better RT and resolution scaling than a PS5 solely because of the dedicated cores nvidia uses.
You're trying too hard.
Speaking facts? Yeah I know I am. Nobody has any counter arguments to any of this. Literally everyone except amd fanboys agrees amd hardware is in the fucking dump while nvidia drives off into the sunset with their massively more advanced GPUs. The community has been begging amd for RT and AI cores since 2018. The very least amd can do is have feature parity with nvidia and Intel but they can't even manage that. And it's not because they don't want to, we already know they can't because they're too far behind at the moment.
>he just can't stop
>nvidea cocksucker was a bingtendie all along
kek
>being an absolute monster
Yeah I can't wait for all the Switch 2 games to look like vaseline poured on the screen since they're all running 540p internally upscaled via DLSS3.
You know damn well they will. They think Switch 2 is gonna be 1440p144hz and look like PS4 Pro. It's gonna be 480p upscaled to 1080p with 30 fake frames and 12 real frames just so they can manage PS4 on low.
ps4 pro is shit so it's basically guaranteed for the switch 2 to look and perform better.
nothing wrong with 540p upscaled with DLSS. it's been proven to be very effective even on older versions of DLSS from 3+ years ago.
Holy fuck.
Welp. There goes video games.
It was fun while it lasted.
more like video games are saved. hardware is getting more and more expensive because it's harder to shrink the transistors so you end up with bigger, more power hungry and hot GPUs and all the associated costs which come with those. being able to effectively reproduce high end graphics on lower end hardware is a lifesaver.
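The pixel-budget arithmetic behind that "high end graphics on lower end hardware" pitch is easy to sanity-check — a toy calculation that ignores the upscaler's own runtime cost, with resolutions picked as common examples rather than anything quoted in this thread:

```python
# Toy arithmetic behind "render low, upscale high": the shading work an upscaler
# skips is roughly proportional to the pixels it doesn't have to render.

def shaded_fraction(internal: tuple, output: tuple) -> float:
    """Fraction of output pixels actually rendered at the internal resolution."""
    return (internal[0] * internal[1]) / (output[0] * output[1])

# 540p internal -> 1080p output (the scenario mocked above)
print(shaded_fraction((960, 540), (1920, 1080)))  # 0.25: only a quarter of the pixels rendered

# a 2/3-per-axis "quality" style scale at 4K
print(round(shaded_fraction((2560, 1440), (3840, 2160)), 3))  # 0.444
```

Which is exactly why both sides of this argument exist: the raw savings are real, and so is the temptation for devs to bake them into the minimum spec.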
You're very stupid. You want to know how I know you're not even 20 yet? The fact that you don't know that this will just lead to devs relying on it as a crutch, and soon your mid-range GPU will need it to hit 60 fps in 1080p.
You do realize that the AI needs rasterization to interpolate from, right? That space for those AI cores means sacrificing space for rasterization cores, right?
Fuck. Get off Ganker, kid. You're not even old enough to be here.
you're typing a lot without actually saying much. nothing i said agrees or denies with anything you just said. you're fearmongering with the crutch stuff and then being captain obvious with the raster performance.
He's not a kid he's just a shitskin. Literally, and I mean literally, ONLY shitskins think this AI upscale shit is good for games. They also worship Nvidia and AMD's frame interpolation tech thinking it will save PC gaming when it actually will just exacerbate the issue of optimization on ALL platforms, not just PC. Why bother optimizing at all when your GPU can generate frames for us!
In my experience shitskins worship nvidia and loathe AMD.
which is the opposite in reality. amd is huge in india because it's cheap. one of the lead architects was also this popular indian guy. nvidia is huge in places like china and east asia.
He's talking about South America.
Nvidia is the most popular and used GPU company in the world. There are more shitskins than white people therefore the more Nvidia users there's the likelihood of them being brown is higher. It's simple mathematics.
>likelihood of them being brown is higher
brown people are generally the poorest on the planet so the likelihood of them being able to afford hardware which is expensive even to 1st worlders is slim. amd is the product of choice for these people and always has been. it's the only reason why amd are still alive (consoles included).
>Most popular cards according to steam are: 1060, 1650, 3060.
>No those people are not brown
Anon ....
I'm betting we've got a pajeet. If he were a Hispanic, he'd be spitting fire and slinging insults right now. Instead he just quadruples down and keeps running his mouth like he's not making a fool of himself. That's classic street-shitter.
i don't even follow what you're trying to say anymore. amd is LITERALLY the poor man's brand. always has been and always will be. they've almost always been cheaper than the competition throughout their history even during the short period they were on top. cheaper brands are popular in poor places.
Yet the most popular cards are from green team. It's almost like poor people over pay for shit cuz they believe it's better. Same shit you see with gays getting iPhones.
i still don't know what you're trying to say. the 1060 was hugely popular because it was the best bang/buck card when it released. the rx480 launched with problem after problem, ranging from bad drivers to pulling too much power from the pci slot to having as high a power draw as a 1080 under load. i don't know about the 1650 because it came out in a time where i lost all interest in the gpu market (covid). and the 3060 is the best bang/buck entry into RTX right now, which is unrivalled by amd.
Are you the guy crying about the skin color of the person using Nvidia graphics or not? If not, then there's no need to respond.
Are there more poor people or rich people?
depends on the context such as current economy
You really can't answer that eh?
Believe it or not, AMD is actually more popular in richer regions (not more popular than Nvidia, more popular compared to poorer regions I mean). That's where their products are actually cheaper than the competition, so savvy customers who care about value buy their cards.
Anywhere else, Nvidia and AMD cards are much closer in price. Also, in poor regions people don't even buy new cards; they buy used low-end cards.
Believe it or not number two, it seems (and I say "it seems" because we only have data from a few retailers) AMD is selling more discrete GPUs than Nvidia these days. People who know what they're buying, buy AMD.
Nvidia sells most of its GPUs through laptops and pre-builts. They dominate that market.
Believe it or not number three, people who build their own computer are a small percentage of the overall gaming PC market.
This. I weighed my options and realized that my GPU is basically an expensive toy, and I don't need to pay more for it when I'm only going to use it for videogames. Nvidia cheaping out on VRAM and memory bandwidth was a dick move this gen, and looking at the performance of AMD vs nvidia for these console game ports, it looks like AMD might be coming out ahead, if only slightly. Losing out on RT perf doesn't bother me much, because the only times when AMD can't compete in it are in games I'm not particularly worried about playing anyway. Cyberpunk 2077 and Witcher 3 don't interest me much, and even then you need DLSS to make them not run like shit even on the hardware they were designed around.
It's an interesting tech, but until you can enable it on a high-end GPU and still get 120fps at 4k, it's nothing more than a pretty gimmick to me.
Men of Brahma are generally leading interested in technology, so it does not make sense that they would spend so little, it is not what you say. Ching chong bastard bitch mad that he stuck with a xx70 and can't get a good dame!
SAAR, SAAR
DO NOT RMA THE GPU
SAAR SAAR
BLODY BICH BASTAD SAAR
If AMD is that popular why are they so low in the steam survey?
>implying poor people use steam
>implying they can buy games at all
>There are no free games on steam.
>The most popular games are not the free ones trust me.
>No don't look at the charts.
You do realize Steam prices games according to the region? Why do you think Russians use it at all.
russians make money. the average indian can't even afford shoes.
Are you sure about that anon?
yes. intel has a habit of advertising their hardware in these bulk prebuilts.
You responded to the wrong post anon.
amd makes all of their money by selling cheap hardware in bulk to businesses. Any PC you find in a factory, grocery store, convenience store, library, public school, what the fuck ever, is almost certainly AMD-driven.
>is almost certainly AMD-driven.
it's always intel driven. amd never gets these contracts. intelhd graphics was a winner for intel. amd still doesn't bundle hd graphics as standard.
t. worked in a hospital for a while
This. I've never seen AMD with loads of OEM shit. Even AMD laptops are rare as shit despite them having APUs which shit all over integrated.
Assuming you are in Burgerland, hospitals are decidedly different than the types of businesses I listed.
Nvidia has the reputation of being "elite" so they think if they buy an Nvidia card (they usually can only barely afford x50 or x60 tier SKUs, some buy x30's lol) that they're finally part of the elite. It lets them escape into a fantasy where they are not brown, not poor as shit, and don't live in a ghetto.
as opposed to what? amd? amd is an embarrassment compared to nvidia
Now post the 4060.
lol yeah. That's why they have such a hard-on for DLSS. They think it's going to give their entry-level trash GPU upper mid-range performance for free. It would be hilarious if it wasn't so tragic.
That's not what's happening. What's happening is pajeetware like Immortals of Aveum only barely managing acceptable performance on top-end hardware with upscaling and framegen on
Imagine how bad it will be in 5 years
immortals of aveum is a new game on a new engine and the first port from those devs to the new hardware (i think only the 2nd ever UE5 game after fortnite). so obviously it's going to be a worst case because it's so new. that's the worst example you could have used.
In five years I'll have stopped buying video games and will just be emulating at 16k downsampled to 8k.
Yeah. His typing reminds me of kids on the internet 15+ years ago.
Amazing that. An adult shitskin has the intellectual capacity of a white child.
>An adult shitskin has the intellectual capacity of a white child.
Many such cases.
Fuck off 4chan.
Stop pushing for expensive photorealistic lighting and suddenly you have vastly more GPU power than you know what to do with.
even if games don't use RT there is still a lot of other demanding tech that will inevitably be used as devs push for bigger and more detailed games.
Other than mandatory motion blur.
I'm convinced the people shilling this shit are blind. Left looks like Mass Effect 1.
it looks way better than mass effect 1. it's also a compressed youtube screenshot.
Amd are so non existent outside of consoles and more recently the CPU market that I should be accusing you of being the paid marketer. Nobody uses AMD. Nobody cares about them. Nobody has for over a decade.
>Intel are relatively new to performance GPUs and they're rapidly catching up to nvidia
They apparently do since they keep adding more support for stuff via updates
>giving AMD money for being a generation behind every release
AMD are literally holding the entire graphics industry back because their hardware is so far behind. Even fucking iPhone is coming with dedicated ray tracing cores and dedicated AI cores for upscaling and stuff. Intel has it as well and obviously nvidia has had it for 5 years now. Meanwhile amd are still unable to do any worthwhile ray tracing or upscaling. FSR2 is like a poor mans DLSS or even XeSS.
Upscaling is a crutch that only gay half-moron zoomers think is a good thing.
RDNA3 has AI cores.
Upscaling is the future of video games. High end graphics on mid range hardware. It only makes sense. And AMD are by far the worst at it.
>mid-range cope feature
Fuck that. I buy high-end. Why would I want to sacrifice visuals for more FPS?
Your generation is so fucking retarded. Thanks for ruining PC gaming.
>or even XeSS.
You say that as if XeSS is shit when it's probably even better than DLSS. Intel really delivered on that front. They might make garbage drivers, but XeSS is great.
I wasn't implying it's bad. FSR is massively worse than both, its only saving grace being that it's vendor agnostic and backwards compatible with ancient hardware. If AMD weren't holding the entire industry back because of their hardware in consoles, DLSS and XeSS would be standard across the board. XeSS even has a vendor-agnostic version, assuming your GPU has dedicated hardware for it like nvidia hardware does.
is the 7800xt based or cringe
relying on anonymous posters to give you advice on a gpu by defining it as based or cringe is cringe
Sometimes performs like a 6800XT
Sometimes performs like a 6950XT
OCs like a 4770.
CUs count like a 6700XT
nvidia's price equivalent is 27% slower, and will be just as shit at RT as the AMD equivalent because doing RT at the low end is stupid.
>250 USD
Don't you dare lie to me.
Remember kids:
Powercolor (Taiwan) > Sapphire (China)
That's the AMD reference you retarded fag.
Mongoloid
Yes, exactly what the market needs
EVGA is out. Update the picture.
The fact that nvidia lost EVGA might actually lead me to reconsider my decisions, going forward. EVGA was a company that I've grown to trust, which is extremely rare in current year. It's a damn shame that they closed down shop.
I'm in the same boat. MSI/Asus/Gigabyte all had a spotty history with the quality of their cards. No idea who the fuck Galax is, and PNY I haven't touched since the GF6000 series.
>bought a 1660ti right at the start of the pandemic because my PC was starting to have some issues and I foresaw the issues sourcing new parts coming
>was torn on whether to go budget like I did or go for something a bit better to play upcoming big titles
>all the big titles like cyberpunk, BF2042, starfield etc. all ended up being complete dogshit not worth playing
And now years later there's still no reason to upgrade
vchuds are as brand loyal as women are to handbags
>mfw just got a 3060 on my first ever build
Behead upscaling shills with dull knives.
Glad I have a card already and can worry more about what game I want to play.
Good for you, gay
post the article bro fuck
Apple won.
not relevant unless it's 200 USD and pulls 120W max under load.
When the FUCK are AMD going to release the high end GAMER GPUS bros?! I'm sitting here waiting on that shit!
The 7900XTX is their high-end card. That's it, we're not getting anything faster from them this generation. Even if they -could- make something faster, they won't because it would be too expensive to produce and they'd rather use that silicon on CPUs or AI cards.
Maybe they'll launch a 7900XTX refresh at some point, with higher clocks and faster memory. But I doubt it. RDNA3 is done, and RDNA4 won't even have high-end GPUs for the reasons explained above: they make way more money with CPUs and AI.
they've given up on trying to compete on the high end, which is understandable. they just don't have the resources and staff that Nvidia have. might as well undercut them at mid-range, like they always did. I hope FSR3 doesn't suck, since I'm probably sticking with AMD for my next build. their Adrenalin software is too good for me to go back to Nvidia. And my build will be like 500+ bucks cheaper as well.
>Adrenalin software
>software you use once to undervolt and never touch again
>mattering
For me, it's DLSS, DLSS, and DLDSR in combination with RT that makes Radeon worthless
>500+ bucks cheaper
You really are retarded
That's not the reason. Navi41 and Navi42 were supposed to be these MCM beasts: https://www.notebookcheck.net/Initial-RDNA-4-details-reveal-Navi-41-to-be-2x-faster-than-Navi-31-as-RDNA-4-could-have-50-60-better-performance-watt-vs-RDNA-3.675584.0.html . Don't read too much into the "estimate" performance numbers, but these designs were clearly aiming at the high end. They are not giving up because they "can't compete" with Nvidia. They didn't give up during the Vega days, they sure as fuck aren't giving up now that they're like 20% behind.
These designs were supposed to use CoWoS (Chip-on-Wafer-on-Substrate) from TSMC. Guess what else uses this packaging? Nvidia's AI cards. Guess who can pay more for it: Nvidia ($10000 AI cards) or AMD ($500-1000 gaming cards). They wouldn't be able to compete for the packaging capacity they need, not on performance.
So they're scrapping Navi41 and Navi42, and use that silicon (and CoWoS) to make $10000 AI cards themselves instead. Navi 43 and 44 on the other hand are going to be monolithic.
It also doesn't help that Apple dropped its colossal throbbing cock on the table and said that they'll buy 100% of TSMC's 3nm output for 2023 and a big part of 2024.
Isn't Zen 5 also fucked then?
>These designs were supposed to use CoWoS (Chip-on-Wafer-on-Substrate) from TSMC.
I feel like every RDNA generation since 2 AMD is pushing closer and closer to returning to HBM
Well, they can't use HBM either because it's needed for AI. The price of HBM skyrocketed not that long ago (due to demand for AI chips), Samsung announced they're going to increase HBM production next year (to meet said demand).
AI is stealing all the cool technology from us gamers, REEEE
>FSR 3
Well, it's not gonna suck on the 7000 series, cause those GPUs have dedicated AI cores that FSR 3 utilizes there. Which is why i believe the reports that it's basically looking and working like DLSS does, which also utilizes Nvidia's AI cores, unlike current FSR which does everything in software.
But FSR 3 on anything but RDNA 3 is just gonna be an improved version of FSR 2 and still run in software. Cause FSR 3 can't just make AI cores magically appear in your 5700 XT or 1080 Ti.
So technically we are getting FSR 3 (True) and FSR 3 (Fake)
what about frame gen? I'm on RDNA2, btw. also: is FSR3 also driver level, or only frame gen?
Framegen is coming with FSR 3 everywhere, but AMD haven't really gone into detail on how that's gonna work
But I already have a 12gb 3080.
Still not buying anything this gen. Next gen is the true time to upgrade.
yeah, never buy on a transitional gen. you're basically buying a prototype.
>amd marketing thread
VRAM won't help you
I just want another 9800PRO and 9800AIW tier card again. I miss that so much.
My 7900xtx Hellhound works perfectly fine for 6 months already. Guess I'm lucky or something, never had AMD card before but nvidia shitty pricing made me switch.
most AMD memes about being turbo-glitchy or overheating are ancient (overheating) or a toss-up between user error and getting a lemon (turbo-glitchy)
>ATI Tray Tools - killed on purpose via driver lockouts
>Radeonpro - killed on purpose via driver lockouts
>MPT - killed on purpose via driver lockouts
Meanwhile, Inspector and RTSS/Rivatuner still somehow work. Why would you buy a GPU made by a company that very blatantly doesn't want you to use it to 100% of its capability?
Profile Inspector is the only way to fix retarded dev's LoD bias. Can't do that with Radeon cards
Like all things AMD, you USED to be able to, but they killed it along with 3rd-party tool support. AMD completely giving the finger to modders has been their biggest historical fuckup, and when they aren't skullfucking their drivers to remove features, they're buying out the modders themselves so the tools die in contract legalese Hell. The RadeonPro dev was hired to work on Raptr with a clause that he couldn't work on RadeonPro anymore, and when Raptr died as predicted, he ragequit AMD development entirely because he'd expected to get a cushy job out of it
And then there was the time that Regeneration of hybrid PhysX fame got PhysX running on AMD cards no sweat, only to disappear off the internet entirely. Hilbert of Guru3D says the most prevailing rumor is that AMD paid him off for life so he'd fuck off forever and keep the info buried, which I buy, because Regen was extremely competent and isn't the type to just disappear from modding entirely
https://web.archive.org/web/20160323093516/http://www.ngohq.com/news/14254-physx-gpu-acceleration-on-radeon-update.html
>AI is stealing all the cool technology from us gamers, REEEE
It honestly didn't help that HBM was demonstrated on the Fury, in a 4 FUCKING GB CONFIGURATION IN 2015. I got one secondhand and still felt like I paid too much. While HBM helped it punch a lot higher than it should've in its class, it absolutely got destroyed by the likes of Arkham Knight and Mankind Divided, and while those weren't exactly optimized, 1080s and the prior-gen 290/390s destroyed it because they were 8GB, showing that as time went on, newer games were leaving less room for the framebuffer itself
AMD just barely got GPU scheduling working again (for supported cards) after fucking it up the first time and leaving it dead for years, and HAGS is required for the newest DLSS implementations - which Nvidia supported without issue on day one. They themselves probably don't know how it works yet
Need 20 GB vram for 300
i already have the 7900 XT with 20 gigs of vram
I already got a 2060 12GB for $280 a year ago
please for the love of christ rangeban India, if not permanently at least for a few hours everyday so the site is useable
India is working for everyone though. Except Intel, who are god's chosen people