Can graphics cards get too big?
>mfw can't upgrade my card without getting an entire new case
Big corpo won by a landslide. Not only do you need to change your CPU and mobo, now you also need to change your case for the ever-increasing size of the GPU. Next thing you know, they'll be asking for a new set of RAM sticks every two years.
>not just buying a new computer every 4 years
stop.
being.
poor.
i have over half a million put away and i upgrade every 7 years - 4 years is fricking overkill
a case is fricking $50 moron
for a mid tower that can't fit current year's XBOX HEUG gpus
actual moron
>he doesn't just have an EATX case on standby
>want to upgrade gpu
>realize i need a new power supply for it
>and a new case
>if I'm upgrading my gpu, i may as well also upgrade my cpu since it's been years at this point
>same with mobo
>realize I'm pretty much building a new PC
>don't upgrade gpu
mf puts a cover on his handhelds like a grandma with her couch
>CONSOOOOM!!!
Black person.
and.
jew.
yes consume, this is america not the soviet union
>shitskinned mutt
>paypig
good goy!
You need a big cooler to dissipate 400 watts of power.
The FE 4090 is also one of the smaller ones, my 4090 is even thicker and longer.
4090 TDP is 450 watts, so its size is reasonable. I'd argue that if the 4090 were even bigger with larger fans, it would be quieter still.
4090 being as big as it is, is the best decision nvidia made recently. Despite the 450W power draw the thing is very quiet and if you set the power limit to 350W, which loses you maybe 2% of performance, the thing is dead silent. The die is also a massive 600mm^2 so the temps stay very low. It's the quietest and coolest air cooled card I've ever owned. The 7900 XTX is pushing 350W through just 300mm^2 (MC uses little power) and the hotspot can get crazy high. 6900 XT had a similar issue with hotspot regularly oscillating around 100C if you wanted the card to remain reasonably quiet.
I hope they don't gimp the 5090 in that regard and again give us a massive die with oversized cooler.
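The 350W power-limit tradeoff claimed above can be sketched with quick arithmetic. This is just the poster's numbers (450W stock, 350W limited, ~2% performance loss); the efficiency figure follows from them:

```python
# Perf-per-watt effect of power-limiting a 4090, using the numbers
# claimed above: 450 W stock vs. 350 W limited at ~2% perf loss.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance delivered per watt of board power."""
    return relative_perf / watts

stock   = perf_per_watt(1.00, 450)   # stock: 100% perf at 450 W
limited = perf_per_watt(0.98, 350)   # limited: 98% perf at 350 W

gain = limited / stock - 1           # fractional efficiency gain
print(f"efficiency gain from the 350 W limit: {gain:.0%}")  # ~26%
```

In other words, giving up ~2% of performance buys roughly a quarter more performance per watt, which is why the card ends up dead silent at 350W.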
next year:
4090 is superseded by the 5070 at half the power draw.
>970 barely faster than 780 Ti
>1070 barely faster than 980 Ti, equal if you OC both
>2070 slower than 1080 Ti
>3070 slower than 2080 Ti
>4070 slower than 3080
There is a downward trend lol. xx70 is getting worse and worse each gen.
Not happening chief. Don't forget RTX 40xx was a massive leap from Samshit's 10nm+ to TSMC's 4nm. TSMC 4nm -> 3nm isn't as big. You'll get higher density and maybe 30% higher efficiency at iso performance. Even if they finally figure out double issue FP32/INT I doubt we'll see a jump as big as 3090 -> 4090 and the rest of the lineup scales to that. We will likely see double the ray intersections again but this is irrelevant outside of ray tracing.
Also nvidia has no need to rush. AMD won't be competing on the high end (again) so nvidia can release a 5080 costing $1000 while being 5% faster than the 4080 super at most. They are free to do whatever the frick they want and charge whatever they want in the high end. Maybe a 5090 will be much faster but cost even more than current retail prices for the 4090.
AMD might as well not exist to nvidia. They released the $700 1080 Ti when AMD couldn't even produce a 1070 competitor. And yet they released a $1500 3090 when AMD finally had a card that competed with it. They haven't brought up AMD in marketing or presentations in a decade. They're using the Apple strategy where they don't acknowledge competition's existence.
Pointless without GDDR7 or GDDR7X. The 4090 is mostly bandwidth bound and it's not even the full AD102, so there's room for a 4090 Ti. An AD101 would be stupidly expensive since it'd be close to the ~850mm^2 reticle limit and would need HBM. It would be $5k at the very minimum, and not due to nvidia's margins. Nvidia's margins on the 4090 are smaller than people think, like 60-70%. For context, Samsung and Apple have 100%+ markups on their flagship phones.
I wouldn't care what they're doing at the absolute top end if 60s weren't also insultingly awful.
>7000D
im good for at least another decade
The actual PCB is relatively small, it's literally 90% heatsink
Somewhere inside nvidia's labs, there exists an AD101 that consumes upwards of 500W.
TDP or peak power? A single die or a whole package? For chiplets, 500 watts isn't a big problem across the whole package.
AD101 doesn't exist and power limit is set arbitrarily anyway. You can even put 4090 FE at 600W by dragging a slider in Afterburner because nvidia officially allows it to go to 130% power limit which doesn't even void your warranty or anything. Galax has a 1kW+ BIOS for their top end 4090 IIRC. The performance gains are pathetic, though. Single digit percents.
>The performance gains are pathetic, though. Single digit percents.
That's why it needs more cores. Hence a hypothetical AD101.
Let's say I have an RTX card and I like using RTSS to keep tabs on utilization and such.
With GTX, you could trust that the little percentage number meant actual total GPU utilization. But RTX cards have tensor cores and RT cores, which supposedly go completely unused if no application takes advantage of them.
With that in mind, can you still trust the utilization percentage in RTSS? Or is my GPU showing 29% under load actually at 40% or 50% load?
GPU usage is a mostly pointless metric. You want Intel PresentMon, where you compare GPU busy times against frametimes. If you want to dive deep you need nvidia Nsight. RTSS can show you 100% usage in two games while in one of them the GPU is being stalled by something: low occupancy, low cache hit rates, whatever. The RT cores are not separate from the SMs; you still need to cast the ray, but nvidia was smart enough to move things like BVH traversal to the RT cores to allow concurrent shading and RT (unlike AMD, where it stalls the shading).
If you want a simple heuristic for how much of your GPU is actually at work outside of PresentMon's GPU busy times, compare the max power limit to current power consumption, but subtract what the VRAM consumes. On a 4090 that's around 60W last time I checked (GPU-Z shows it). Whether that work is useful is impossible to know without advanced metrics, however. For example, if your 450W card is pulling 250W at 99% GPU usage, you know something's fricked. It's just a heuristic, though, and doesn't cover every case.
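That heuristic can be written down as a tiny function. This is only a sketch of the idea described above: the 60W VRAM figure and the sample wattages are the poster's 4090 numbers, and in practice you'd read the real values from GPU-Z or NVML rather than hardcode them:

```python
def effective_load(board_power_w: float, power_limit_w: float,
                   vram_power_w: float = 60.0) -> float:
    """Rough fraction of the GPU's power budget actually going to
    compute, after subtracting the VRAM's share from both sides.
    vram_power_w defaults to the ~60 W observed on a 4090."""
    gpu_power = board_power_w - vram_power_w
    gpu_budget = power_limit_w - vram_power_w
    return max(0.0, min(1.0, gpu_power / gpu_budget))

# The post's example: a 450 W card pulling only 250 W at "99% usage".
print(f"{effective_load(250, 450):.0%}")  # well under half the budget
```

A reading far below 100% while the driver reports 99% usage is exactly the "something's fricked" case: the GPU is occupied but stalled, not doing useful work.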
PC gaming is kind of a meme when you compare it to mobile APUs which can achieve respectable performance at 15W tdp. Your GPU alone should not pull 450W.
Don’t care still not upgrading from 1080ti
does it sound like a jet engine when the fans get going?
is it worth it to get a 4070 Ti Super this late in the gen?
Absolutely not.
do you think the 50 series is going to bring major advances?
YES
like what kind of advances?
You know ray tracing? That but it's gay tracing
Probably not but the 40 series is just not good enough value to be worth it and buying now says to nvidia that that's okay. Even if 50 series is equally bad on value, at least we'll be closer to some games being released that might need newer hardware. There's nothing right now.
you could beat a man to death with that graphics card
nah senpai your just 2 poor