no, 12GB was never enough
It's fine if you turn everything down, but why buy a new GPU if you're going to do that?
don't be a slave to the "ultra" preset
it's still great for 1080p but barely enough for 1440p
Depends on the memory bus
if it's less than 256-bit then it's shit and should never have existed
This. A gimped bus will always destroy a card's longevity.
Behold, peak longevity.
Want one of these cards with the LC but they're hard to find
Probably the sexiest card AMD has released
Yeah but that was a bad card from AMD's dark ages.
Now you have current gen nvidia cards with lower memory bandwidth than previous gens from the same tier.
Seems to do fairly well in modern-day games at 1080p. Loses to an RX 6600 a lot, though.
1080 destroyed it when it was new, even a 2060 or 3050 destroys it now.
The fact it loses to the 6600 on a PCIe x8 interface is downright embarrassing.
Just about all last-gen or cross-gen.
Which is fine when we talk about how long 1060 lasted. Or 1080 for that matter. No biggie.
AMD? Definitely a disqualifier.
memory bus isn't everything; effective bandwidth matters too, which is why the 4060 outperforms the 3060 and is practically on par with the 3060 Ti despite having only a 128-bit bus to the 3060 Ti's 256-bit bus
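The arithmetic behind that claim can be sketched out. Bus widths and data rates below are from public spec sheets; the L2 hit rates are illustrative guesses to show the mechanism (the 4060 carries 24MB of L2 versus 3-4MB on the Ampere cards), not measured numbers.

```python
def raw_bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Raw VRAM bandwidth in GB/s: (bus width / 8 bits per byte) * data rate."""
    return bus_bits / 8 * data_rate_gbps

def effective_bandwidth_gbs(raw: float, l2_hit_rate: float) -> float:
    """Requests that hit the on-die L2 never touch the memory bus, so DRAM
    traffic shrinks by the hit rate and the bus looks proportionally wider."""
    return raw / (1 - l2_hit_rate)

cards = {
    #               bus,  Gbps, assumed L2 hit rate (illustrative only)
    "RTX 3060":    (192, 15.0, 0.10),
    "RTX 3060 Ti": (256, 14.0, 0.10),
    "RTX 4060":    (128, 17.0, 0.45),
}

for name, (bus, rate, hit) in cards.items():
    raw = raw_bandwidth_gbs(bus, rate)
    eff = effective_bandwidth_gbs(raw, hit)
    print(f"{name}: {raw:.0f} GB/s raw -> ~{eff:.0f} GB/s effective")
```

With those assumed hit rates, the 4060's 272 GB/s raw figure lands in roughly the same effective range as the 3060 Ti's 448 GB/s, which is the "big cache compensates for a narrow bus" trade Ada made.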
It will continue to be until next gen consoles.
CLOUD WATCH OUT
Depends on the games you play and resolution
I know a bunch of the morons here will scream about needing more VRAM but then play games from 10+ years ago
It won't be enough until VRAM => System RAM
6 GB is enough
New games suck dick
anything above 1080p is for nerds
this can barely run witcher3
i owned this gpu, i know what i'm talking about
Seems fine on my 3060
PS5 and Xbox have 16GB of memory. With how unoptimised PC games are, 16GB is gonna be the next 8GB for the next 5 years.
They have 16GB but weak graphics cards; they don't max anything and run at 30 fps.
Meanwhile an 8GB card on PC can max the majority of games at 60 fps.
VRAM doesn't mean shit without context. 8GB can still last another decade if developers optimize for it instead of being lazy or chasing photorealism over art direction.
It's shared memory and the CPU struggles because GDDR6 has insanely high latency.
i have a 10gb 3080 and even that still works fine for me
12GB is not enough. It crashes in most new AAA titles if you use the same settings as equivalent AMD cards with more VRAM, meaning you have to turn down settings and accept worse graphics on Nvidia.
That's only an issue with AAA goyslop.
Didn't have that issue with Ratchet in 4K with ray tracing off
Not him, but many people run so much bloatware it fills up VRAM to the brim.
those 12GBs are stuck riding the moron bus so no
ignore this, i meant to reply to op
That's a Vega 64: 8GB of HBM2 on a 2048-bit bus.
It aged like shit.
it's actually insane that Nvidia can make a new GPU and people will actually consider whether or not they need to replace last year's "most powerful GPU ever"
Probably because the latest generation is complete garbage.
https://rocm.docs.amd.com/en/latest/release/windows_support.html
>no cards from the era when people still bought AMD cards supported
>HIP SDK only on 6800 and up
>supported by frick all software anyway
It's actually fricking nothing.
desperation
Everything productivity-related uses CUDA, and anyone that wants to use their GPU for anything other than playing vidya knows this. No one wants to ditch CUDA (which just werks) for some kind of unproven software hackjob from a company that's historically had most of its issues on the software side.
Cool. You don't know shit. Thanks for playing.
Like the cards everyone's mad at for sucking? The 5xxx series?
So it's on the GPUs that are actually useful?
Because it only worked on a few workstation cards under linux until now?
God damn, dude. Fanboy harder. You're starting to seem like a reasonable person.
Honestly I went from a rx5500xt 8gb to a 4090. All it did was force me to upgrade other shit that didn't matter. Games are pretty I guess
Shouldn't be a problem if you're not getting it for current AAA games and just want to play past games, indie games, or any non-AAA games. Every time people say certain GPUs suck, it's because they can't run the highest settings on CURRENT games. Keep in mind devs don't do optimization well, or rely on features to make their game run okay.