aren't the Chinese versions of GPUs supposed to be gimped? Why is everyone so hyped for the 7900?
It is, it's a gimped version of the 7900XT. A gimped version of one of the fastest cards on the market is still one of the fastest cards on the market.
>Why is everyone so hyped for the 7900?
It's not that good, no one is "hyped". Same performance-per-dollar as the 7800XT. It's decent given the rest of the market, that's all.
which game are you talking about? Fire Strike?
>Why is everyone so hyped for the 7900
Nobody is hyped for AMD GPUs
It's all paid marketeers
I'm hyped for AMD GPUS. iGPUs in their APUs I mean.
So, buy a marked-up Nvidia GPU for $1000-1500
Or, buy an AMD GPU of the same strength for $400-1000
I am currently on an RTX 3080 Ti; previously I had a GTX 1080 Ti. Every time I look over at AMD I see GPUs equivalent to mine but at NORMAL prices, while Nvidia GPUs are still at scalper prices.
Serious fricking question, why the hell does everyone ignore AMD GPUs? Are they faulty? Buggy? Cheap plastic? Freesync is garbage? What's the RISK if you go full AMD???
>faulty? Buggy?
Last one I had was a fricking nightmare (RX 480)
they're great on Linux but I'm not a troony
Seems to be a marketing failure; Nvidia is also still chasing the coronavirus-era prices. Back then, high-end prices were double the RRP.
memes about the drivers and lately stupid people not realizing raytracing, AI, and GPGPU in general work fine on AMD cards (no, ROCm isn't a hack, HIP works fine, and everything important has official HIP ports)
i've got a sneaking suspicion some of the OEMs are bad too and the people with problems keep buying cards from them
sapphire is the one you want to get if you do get an AMD card
RT does not run nicely on AMD, but who the hell uses that? It's just an Nvidia gimping technique.
I've always had desktop AMD GPUs and not a single issue.
I've never overclocked, so that could be part of why nothing ever broke for me.
>RT does not run nicely on AMD
AMD is not that far behind on average. The 7900 GRE is $550-ish and only slightly behind the $500-ish 4070 in Raytracing at 1440p. It wins at 4K even: https://tpucdn.com/review/sapphire-rx-7900-gre-nitro/images/relative-performance-rt-3840-2160.png
AMD only gets destroyed in Cyberpunk and Portal with Path Tracing, but these are literally tech demos designed to run on Nvidia hardware... but it's not like a 4070 can run these well either. Sure, they run at 3 FPS on the AMD card and at 9 FPS on the equivalent Nvidia cards (3x more!!), but in practice they are unplayable on anything slower than a 4090. Even with a 4090, "playable" is arguable.
>What's the RISK if you go full AMD???
fsr looks a lot shittier than dlss (you're going to have to use upscalers for some games, pc ports are bad)
amd drivers cause issues with emulation (nintendo's going to kill emulation so this will be less relevant soon)
>amd drivers cause issues with emulation
source?
everything is vulkan now, which amd created through mantle. nvidia actually sucks at vulkan emulation, it's documented all over Ganker...
>amd drivers cause issues with emulation (nintendo's going to kill emulation so this will be less relevant soon)
no, they don't
this meme only exists because of performance issues with their opengl implementation (which were fixed a while ago, up to 90% performance increases) although i think there might still be some errata and conformance issues
that being said any emulator not using vulkan is by morons and isn't worth using
but even if you insist on that, zink (opengl-on-vulkan), the opengl equivalent to dxvk and vkd3d-proton, is basically feature complete now and it apparently works on windows
microsoft's using the mesa project for a lot of shit so some of zink's internals even have semi-official support
>Serious fricking question, why the hell does everyone ignore AMD GPUs?
The real answer is manufacturing. They don't have Nvidia's manufacturing capacity and thus can't meet even the comparatively low demand for their GPUs at launch. During the GPU shortage a few years ago, people bought every GPU they could get their hands on, which made the problem obvious: Nvidia sold 10 times as many GPUs as AMD despite both selling out within 20 seconds of restocking, for over a year straight.
That's the answer that matters across the board. Everything else is just internet shitflinging or marketing. AMD focused their manufacturing efforts on CPUs. If everyone in the world decided to buy AMD at the same time for 3 years straight, AMD would still have less than half of the market share, because they can't make GPUs any faster.
they had capacity during the gpu shortage because they were on TSMC; nvidia used samsung for their node that year
AMD, Sapphire, PowerColor and XFX were selling them direct to miners, kek.
it was fixed in 2022 with dxnavi in the 22.5 driver. took them a decade to fix, but only on rx 6000 / rdna 2 gpus.
I didn't.
One more thing: where I live I could return my XTX and get a 4080 Super for the same price, but I won't, for two reasons:
>getting 16GB of VRAM for that price is a SCAM. I'm paying out of the nose for a GPU, it better last me 5-7 years
>Nvidia's connector is so awful I'd be genuinely scared to put it in my case since bending it in any shape leads to disconnects or the meme housefire
XTX might be more power hungry/hotter but that's why I went with a 7800X3D
>XTX might be more power hungry/hotter
Not by much... at least under heavy load. It's 350W vs. 300W; the difference is nothing.
The issue with RDNA3 is that it has a "fixed" power usage due to chiplets. If you're playing an older game, or anything else that doesn't stress the card much, the 4080 can downclock very efficiently and consume, say, 60W; the 7900XTX can't, so it ends up consuming, say, 120W, twice as much.
AMD has improved this behaviour over time with driver updates, but I doubt they can fix it completely due to the nature of chiplets themselves. AMD's CPUs behave the same way: at idle, believe it or not, Intel is often more efficient. This is also why Ryzen on laptops is monolithic: a CPU that consumes 10W while you're doing nothing would drain the battery in no time.
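For perspective, here's a back-of-the-envelope cost comparison for that light-load gap. The 60 W / 120 W figures are the example numbers from the post; the daily hours and electricity price are made-up assumptions for illustration:

```python
# Rough yearly cost of the light-load power gap described above.
# 60 W (4080) vs 120 W (7900 XTX) are the post's example figures;
# usage hours and price per kWh are illustrative assumptions.
HOURS_PER_DAY = 4      # assumed light-load hours per day
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.30   # assumed electricity price, USD

def yearly_cost(watts: float) -> float:
    """Yearly electricity cost of a constant draw of `watts`."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * PRICE_PER_KWH

extra = yearly_cost(120) - yearly_cost(60)
print(f"extra cost per year: ${extra:.2f}")  # roughly $26 under these assumptions
```

so yeah, "twice as much" at light load is real, but at typical rates it's a couple of dollars a month, not a dealbreaker.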
No DLSS (FSR is noticeably worse)
No CUDA
No support in any productivity software
Shit RT performance
Shit VR performance
Shit OpenGL performance
Shit driver stability
Shit AI performance (if supported at all)
If all you do with your GPU is play semi-new vidya gaems with no RT then AMD is a fine choice, but if you want to do anything more than that even Intel is a better option
>No DLSS (FSR is noticeably worse)
oh no my shitty upscaling which shouldn't even be a thing in the first place
>No CUDA
HIP (literally CUDA with renamed APIs, ports are so easy scripts can do them)
>No support in any productivity software
blender's cycles renderer supports HIP and HIP-RT and there's radeon pro render which supports blender, maya, and one other thing
>Shit RT performance
it's good enough for games but admittedly not as good for things like content creation
>Shit VR performance
how did "some wireless headsets have an encoding issue that was fixed" turn into "AMD sucks at VR"
wireless headsets are shit anyways
>Shit OpenGL performance
fixed
>Shit driver stability
meme
>Shit AI performance (if supported at all)
see
nope, stop lying using benchmarks from before the RDNA3 AI accelerator routines were actually exposed to user software
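On the "ports are so easy scripts can do them" point: that's roughly what AMD's hipify tools do, since most of the CUDA runtime API maps to HIP by mechanical renaming. A toy sketch (the dictionary below is a tiny hand-picked subset for illustration, not the real mapping):

```python
# Toy "hipify": CUDA -> HIP porting is largely mechanical renaming.
# Tiny hand-picked subset of the mapping; the real hipify-perl/hipify-clang
# tools cover the full runtime API, math libraries, kernel launch syntax, etc.
CUDA_TO_HIP = {
    "cuda_runtime.h": "hip/hip_runtime.h",
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def toy_hipify(source: str) -> str:
    """Rename CUDA identifiers to their HIP equivalents."""
    # Longest names first so cudaMemcpyHostToDevice isn't partially
    # rewritten by the shorter cudaMemcpy rule.
    for cuda, hip in sorted(CUDA_TO_HIP.items(),
                            key=lambda kv: len(kv[0]), reverse=True):
        source = source.replace(cuda, hip)
    return source

print(toy_hipify("cudaMemcpy(dst, src, n, cudaMemcpyHostToDevice);"))
# -> hipMemcpy(dst, src, n, hipMemcpyHostToDevice);
```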
>Shit OpenGL performance
>Shit driver stability
OS issue.
>why the hell does everyone ignore AMD GPUs?
Because they come out a year after the card they're supposed to compete with and then launch with bullshit issues in like 90% of games.
Take the AMD pill
2070 was the last card I bought from shitvidia.
productivity. for the longest time nvidia had shadowplay, streaming and recording with no fps loss, so streamers went nvidia. 3d programs too: some, like Blender rendering with Cycles, are vastly superior on rtx cards, and some even use renderers that only work on nvidia gpus at all. and now AI, the newest productivity toy, works better on Nvidia cards. remember, pcs aren't consoles, they're not just for gaming
>remember, pcs aren't consoles, they're not just for gaming
Literally 99% of tards buy nvidia exclusively for gaming with no interest in blender or even beginning level stuff like video editing
>the latest Japanese kino on PC
?
>Benchmarks
>rabbi edition
What's the 4090 at
Kinda
What's the difference between the 7900XTX and gre?
https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941
https://www.techpowerup.com/gpu-specs/radeon-rx-7900-gre.c4166
>3070 8GB
No. Sadly, GPUs without at least 16GB VRAM barely run RPG Maker games these days.
buy an ad
Can your GPU handle the latest fuzzy donut?
Why buy Nvidia for AI again?
I found a used 3080 10GB for 550 bucks, should I pull the trigger
frick no, you can get a 4070 Super for that
Laptop 3050. I should be fine until the switch 2 comes out.
>laptop
Sell it before it starts malfunctioning and you're stuck with e-waste nobody wants
>t.used to have a 3080 laptop
I still want to have a laptop to move around with, so it's a risk I'm willing to take.
1440p is overrated
Just game on 1080p, you won't need to shell hundreds of bucks for the latest goymachine
It's actually hilarious that nvidia and microsoft are doing everything in their power to prevent the masses from using vulkan even though it's inevitable.
it's mostly just devs being extremely moronic, vulkan requires a lot more effort and knowledge to get started using
if anything microsoft has made it easier to use vulkan: since they open-sourced the directx shader compiler, you can write shaders for vulkan in the directx shader language
despite that, the compiler is actually fricking up dx12 performance too. one of the devs of vkd3d-proton has a series of blogposts outlining just how fricked dx12 shader binaries can get, and all the fixes vkd3d-proton needs to make to translate them to vulkan's format, thanks to the compiler being bad and devs not knowing what they're doing when they write shaders
t. hobbyist graphics programmer
I'm surprised people are still buying Nvidia cards after getting burned twice on their 20 and 30 series GPUs
Even the rasterization has barely improved. They're just shilling Raytracing performance even though most games have not incorporated RT.
750ti
Will upgrade to 4090 in 20y
i upgraded from a 750ti 2 years ago to a 3060ti
no regrets. it's not like i paid a lot for the 750ti
>you need a 4080 to play jap slop
lmao, a 3060 will run anything.
Not on 1440p