>That'll be $2499.99 + tax + 1600W PSU + new breakers for your house, enjoy your Nvidia™ RTX™ experience in upscaled 1080p (soon to be obsoleted by RTX 4080TI in 5 months)
How the FRICK does Nvidia get away with this?
>That'll be $2499.99 + tax + 1600W PSU + new breakers for your house, enjoy your Nvidia™ RTX™ experience in upscaled 1080p (soon to be obsoleted by RTX 4080TI in 5 months)
How the FRICK does Nvidia get away with this?
Maybe don't spend your money on gacha shit and you could afford it
silver lining, shitty american wiring means tdp can't increase beyond what they'll pull next gen
You don't actually know a single thing about wiring, do you?
Not him but at 110V on 1.5mm² wire you cannot safely run a 2kW PSU, the 16A fuse would trip.
Putting in a 20A fuse instead would just be asking for a fire hazard.
And having something more on the same socket like a monitor, chargers, and other shit people usually use with a computer could also be too much.
It's already at a price point where if you can buy it you can afford an electrician to make it work safely. Some highend projectors require shit like that
stop saying "price point" its just simply price
i know math. most circuits are 120v15a. that's 1800w peak or 1440w sustained, for a whole room.
And only 80% of that is actually converted to usable power for your PC, leaving you with barely above 1000W, plus GPUs can spike up to 1.5x their power limits under load, we are looking at 700-900W for the GPU alone. You are absolutely risking blowing a fuse if you buy a top tier next gen GPU
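The arithmetic above can be sketched out. This assumes a 120V/15A circuit, the 80% continuous-load derating, ~80% PSU wall-to-DC efficiency, and the thread's claim of transient spikes up to 1.5x a GPU's rated power limit:

```python
# Rough power-budget sketch for a single 120V/15A US circuit.
# Assumptions (from the thread, not a wiring code reference): 80% continuous
# load derating, ~80% PSU efficiency, 1.5x transient spikes over GPU power limit.

def circuit_budget(volts=120, amps=15, continuous=0.8, psu_eff=0.8):
    peak = volts * amps            # 1800 W peak for the circuit
    sustained = peak * continuous  # 1440 W sustained
    usable_dc = sustained * psu_eff  # ~1152 W of DC power left for components
    return peak, sustained, usable_dc

def gpu_transient(power_limit, spike=1.5):
    return power_limit * spike     # worst-case transient draw

peak, sustained, usable = circuit_budget()
print(peak, sustained, usable)     # 1800 1440.0 1152.0
print(gpu_transient(450))          # 675.0 W transient for a 450 W card
```

With a 450W card spiking to ~675W plus the rest of the system, the "blowing a fuse" worry is at least plausible on a shared circuit.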
>And only 80% of that is actually converted to usable power for your PC
Not if you buy a platinum psu
>just buy a $500 PSU so your PC doesn't shut down randomly while gaming!
nah I'm thinking frick off nvidia
Nah he's right. This product clearly isn't for the poors of v.
Go keep playing on your 1080 bro.
>You're just poor!
The classic Nvidiatard defense. Just because I have plenty of money doesn't mean I want to waste it if I can avoid it
NTA, but AMD just doesn't give a frick unfortunately. It's the sad truth. They cannot even add mesh shading support in vulkan for god knows how long.
I hate nvidia, but regardless if you are a dev or a normal user who wants things to just work, they are the only choice. The battle was lost a long time ago.
>They cannot even add mesh shading support in vulkan for god knows how long
As far as I knew that was in already. More importantly there is still only one game that has even turned on mesh shaders in its use of DX12 so far. Mesh shaders would be great if they would get adopted soon, but they also alienate most of the gpu users on the planet right now. It will probably be years before it's a standard
Only in DX12. That's the funny part: they added it to DirectX, but cannot even publish a Vulkan extension.
> Mesh shaders would be great if they would get adopted soon
Exactly. And we don't see it because AMD is being fricking moronic. They only added the necessary hardware in rdna2 too.
But I guess we at least get the moronic raytracing meme.
Because Microsoft would tear them a new one if AMD didn't get shit working in DX12
>More importantly there is still only one game that has even turned on mesh shaders in its use of DX12 so far.
Which game?
>Which game?
The chinese MMO Justice
Oh that was more than just some chink tech demo? hmm I'd give it a try but I'm not putting CCP spyware on my PC
Hard to say how much it exists since it's still China only.
>plus GPUs can spike up to 1.5x their power limits under load, we are looking at 700-900W for the GPU alone.
Even 3090ti doesn't have transients above 670W.
Pretty sure that was a spike range for the next gen nvidia high end.
The top end of lovelace (4090 or whatever nvidia calls it) would use the same power delivery system as the 3090ti.
3090ti pcbs were literally a test run for the next gen. I doubt we will even hit 750W.
Power supply with good caps can easily tank such transients. Average power consumption in furmark is under 500W.
The 3090Ti also doesn't have a 600W power limit. It's only 450W. And this is reference we are talking about, I can guarantee you EVGA, ASUS and MSI will push cards 50-100W above their reference power limits like they always do
lol, nvidia will just tell people to get a new electrical system, and their good little cattle will eat it up too
I can just plug into the clothes dryer plug.
euros really are insanely moronic
articulate your stance
American power is still 240v homosexual, just use a dryer outlet
is btc mining still a thing or is that meme done with?
3060ti is all you need, unless you do VR.
And if you do, even 3090ti is not enough.
Nobody mines bitcoin with cpu/gpu anymore. GPU mining is ETH mostly, and it tanked hard too.
I have a 3060ti and VR is fine with it.
It's "fine", but you cannot really push rendering res that high.
It struggles rendering Quest 2 at native for example, but even 3090ti is not enough for native + 120hz, because quest 2 "native" is almost twice the pixel count of 4k (8.3M vs 14.7M for quest 2).
And Quest2 is almost 2 years old at this point, so it's only going to get worse.
>With electric prices going up our collective asses
Who gives a frick about electricity, basedboy?
I would unironically take 2kW GPU as long as performance is good.
Good thing the 4090 is right around the corner, with compute performance twice as fast.
Let's wait till benchmarks.
Ti is no longer worth it because of their shitty thermal thresholds. It's 5-8% more performance for 100-250W more power consumption
This. Frick morons who enable trash power design because le heckin 5 more fps.
With electric prices going up our collective asses an extra 100-200W of TDP will make a difference.
Yes I fricking know TDP is a "rating" and not a constant power draw, I have a fricking degree in that shit. Higher TDP means a "normal" task that doesn't max out your resources will still draw more power because it's designed by diversity hire Black folk.
Regular 3090 actually has shitty thermal pads. Both the 3080 Ti and 3090 Ti have better cooling.
so my $700 brand new 3080 is still going to be the best value GPU for the next 4 years?
I'm going for 6 years with mine
>have 2070S, bought right before prices went fricktarded, @1080p144
>wait until the end of the 40xx gen
>buy the upper mid / lower high end equivalent once it gets cheaper, and a 4k144 monitor once they get cheaper at the same time
My plan is well laid out. Come at me.
>4k144
Not even the top end of the 4000 series will be able to do this. We're at least another generation away.
Even 3090 can do it easily, if devs are not complete morons.
If devs are morons, even 20 generations would not be enough.
Zoom Eternal is an outlier
now do that on RDR2
Eternal had competent developers.
Pozzed AAA trash made by indians will never run well.
not either of the people on this but i dont care about running at 4k
>devs are not complete morons
This will never not be the case. GPUs getting faster is our only hope.
>before price increase
>rtx gen
oh no no no no
>How the FRICK does Nvidia get away with this?
Because Nvidia can do whatever they want, and PCMasterRace is largely a bunch of Nvidia cultists.
Still using a 1080 FTW with no issues. Stay mad.
This. My 1070 SC is still doing fine with my 1080p monitor.
>my old gpu is still doing fine with my 15 year old obsolete resolution monitor
no shit
>1080p
>obsolete
Try hard, consoomer.
>playing PC games at a lower res than my phone screen
ha ha
Your phone screen is small enough you wouldn't be able to see the difference if it was actually 720
AMD should be more power efficient so if you dont like what Nvidia is doing then dont fricking buy them.
I don't see the problem op
gonna wait until the 6000 series comes out two years from now, or so. i want to max out 4k and also have some future-proofing.
>CPU technology is advancing at breakneck speeds and they're more affordable than ever before
>GPUs struggle for 8% gains while consuming more and more power
What's holding GPUs back?
The amount of pixels people want to shade is increasing faster than GPU tech can advance.
1080p is just 2M pixels. 4k is 8.3M (a 4x increase). 8k is 33M (another 4x increase).
CPU load on the other hand doesn't change much.
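Those pixel counts check out, for what it's worth. A quick sketch:

```python
# Pixel counts for common render resolutions, in millions of pixels.
resolutions = {
    "1080p": (1920, 1080),
    "4k":    (3840, 2160),
    "8k":    (7680, 4320),
}

megapixels = {name: w * h / 1e6 for name, (w, h) in resolutions.items()}
print(megapixels["1080p"])  # ~2.07 MP
print(megapixels["4k"])     # ~8.29 MP
print(megapixels["8k"])     # ~33.18 MP

# Each step up is exactly 4x the shading work (both axes double).
print(megapixels["4k"] / megapixels["1080p"])  # 4.0
print(megapixels["8k"] / megapixels["4k"])     # 4.0
```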
CPU load didn't change because every engine kept CPU overhead low to stay compatible across consoles, and because intel, the cpu leader, didn't increase performance by more than ~10% for years until AMD caught up with them.
It's fundamentally hard to increase CPU workload that much. Everything computationally expensive that you can do on a CPU, you can do on a GPU better.
Game enemy AI is still on the cpu but it hasn't really progressed in a decade. Same goes for the number of AIs running at once.
AI workload is minuscule compared to physics for example and tolerable latency is much higher, you would even notice is AI reacts in 200ms for example vs reaction in the next frame (it would even be more realistic actually). But for physics you need the next state of the simulation every frame, so basically every 8ms if you are not on console framerates.
You just cannot compare the workloads.
>you would even notice is AI reacts in 200ms
you wouldn't even notice if*
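The latency comparison above, in frame terms. A sketch assuming the ~8ms frame the post mentions (i.e. a 120 fps target):

```python
# How much slack AI gets versus physics, assuming a 120 fps target.
FPS = 120
frame_ms = 1000 / FPS              # ~8.33 ms: physics needs a new state every frame

ai_reaction_ms = 200               # the thread's "unnoticeable" AI reaction time
frames_of_slack = ai_reaction_ms / frame_ms

print(round(frame_ms, 2))          # 8.33
print(round(frames_of_slack))      # AI can lag ~24 frames behind and nobody notices
```

So physics has roughly 1/24th the latency budget, which is the post's point about the workloads not being comparable.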
you've got the pace backwards. gpu performance is doubling next gen while cpus have to dig deep for 12% ST improvements
RTX 3080 here, I only play at 1080p. Every game I throw at it runs good and since I will never play games over 1080p I think I made a good choice.
You really should try super sampling down from 1440p to 1080p. You can disable a lot of AA and see sharper, clearer images for less GPU load. I've been doing it for years. 2080ti at 1080p sampling down from 1440p.
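For reference, the raw pixel cost of that supersample relative to native 1080p (a quick sketch; whether it nets out cheaper than heavy AA depends on the game):

```python
# Relative pixel load when rendering at 1440p and downsampling to a 1080p display.
native = 1920 * 1080         # pixels actually shown
supersampled = 2560 * 1440   # pixels actually shaded

print(round(supersampled / native, 2))  # ~1.78x the shading work
```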
1080p144 bros. You happy with your card?
I think 5 layers of hyperbole kind of takes away any point you are trying to make.
>NOOOOO I HAVE TO HAVE THE LATEST AND GREATEST THING FOR 3 FPS MORE I WILL NEVER BE SATISFIED WITH MY LIFE I MUST CONSOOM
ok
>old card dies
>have to replace it with one of the new housefires
unless you are already on a 3000 series you can just get one of those
sorry for your loss, anon
it's in a better pcie slot now
>Latest and greatest
lmao
lol if you can't see the difference between the mod and RT version
All of them are mods anon
just the bottom one is a path tracing mod which also runs like shit no thanks to ray tracing being expensive
Middle looks like shit.
Middle is traditional rendering. Bottom is pathtracing.
There's going to be a future of nVidia cards that require its own dedicated PSU + outlet.
>i have all of this information but apparently im still going to buy it anyway
you either buy a 4080 or a 4070 depending on budget and be done with it. anything higher or lower is for other people. this has literally been the rule since the beginning
>gpus use more wattage than a microwave
enjoy your $2000 electric bill
Maybe if you have a dogshit 600W microwave
Jokes on you, electricity is free here.
Next time don't frick with russia.
How often do pcu.cks have to replace this shit?
Depends on the standards. If you go with console standards, aka fine with 18fps, once every 8 years.
bro my 1070 still holds it own
1070 is "only" 6 years old.
and im sure it would be ok for another two pretty easily. i don't plan to use it that long. in fact i planned to replace it with the 3000 series but then, well, crypto buttholes happened
18 is heavily exaggerated
*cries in bloodborne*
Bloodborne isn't the standard
>cries about not being able to play Bloodborne
Fixed your post.
True. The saddest part is that I have a ps4 and even bought bloodborne.
cuck?
Okay whatever word filter you were avoiding is gone now.
combine pc and cuck
I was thinking of going to amds rx 7000 series though
Also debating this because I may start taking Linux gaming more seriously.
AMD's TDP will be much lower than Nvidia's. I'm kind of liking the idea of the 7600 XT and 7700 XT. Hopefully the 7500 XT also moves past the GTX 1060 territory.
Why don't you homies just go for an AMD gpu?
muh drivers
im heavily considering it getting sick of nvidia's shit
I value my time more than I hate nvidia.
the frick does that have to do with graphics card choice?
Nvidia is objectively better, if you ignore that they are buttholes.
I browse the vr gaming thread on /vg/ and every couple of days people with amd graphics cards are complaining about massive issues. It's like clockwork. So either average amd user is a drooling moron or radeon actually has issues.
ahh maybe 4000 series it will be then
>radeon actually has issues.
No
>So either average amd user is a drooling moron
yes
I probably will, all of my last few GPUs have been AMD and my current Nvidia GPU was just one I got for free from a friend who upgraded.
device = 'cuda'
Is it really using more power? I remember when the 900 series came out it actually used less power than the 700 series or something. Whatever happened to that trend?
You're thinking of either Pascal (GTX 1000) which was more power efficient than Maxwell (GTX 900) or Kepler (GTX 600/700) was way more efficient than Fermi (aka Thermi, GTX 400/500)
Maxwell was massively more efficient than Kepler. That's because it was a gaming architecture and not compute. We have gone back to compute which is inefficient for gaming.
Imagine thinking these are for graphics anymore
nvidia wasn't banking on crypto taking a shit and we're all paying for it.
>How the FRICK does Nvidia get away with this?
Clout. They have a rabid fanbase who shill for them non-stop and they outright pay streamers and israelitetubers to "promote" their product by talking about how amazing it is.
The big majority is still yet to upgrade their GPUs. NVIDIA is going way too much in the premium direction. They barely have any affordable cards. It's going to be hard to compete with the 7600/7500 XT. They've now said that they are going to reduce their TSMC 5nm orders, which means a more focused batch of high end cards.
>The big majority is still yet to upgrade their GPUs. NVIDIA is going way too much in the premium direction.
Wrong. More than 25% of steam users are on RTX already. 40%+ are on RTX + non-rtx Turing (1600 series).
Only 20% are still on Pascal (1000 series).
And everything older is basically non-existent.
Basically, majority upgraded (more than two thirds).
>It's going to be hard to compete with the 7600/7500 XT.
Lol. Rdna2 is a joke. Check steam hardware survey, rdna2 is less than 1%.
Rdna1 is a joke too. Less than 2%. Even old shit like RX480-580 and co only has 3.6% steam marketshare.
>he just pulled fake numbers out of his ass
try and cope more chud. Just because the US are consoomtards doesn't mean the rest of the world is the same. Never post fake numbers on my board again or you will feel our wrath
Why do you lie.
You are free to check it yourself.
https://store.steampowered.com/hwsurvey
Users are still overwhelmingly on pascal cards. Your stats are bullshit.
Let's not forget that there are also a dozen other cards in the same 1060-1070 performance class.
I've led you to the water, but I cannot make you drink.
Just because 10% of users have a 1060 or 1050ti doesn't mean everyone is on pascal. They are actually the minority.
Here's a simple question for you morons that always gets ignored and never answered.
What game worth playing needs this?
dirt rally 2.0 in VR at locked framerate with high resolution, MSAA and dialed up graphics effects
VR is a meme technology, but ok. Anything else?
Also if you enjoy driving games I automatically assume you are mentally ill.
>MUH "VR is a gimmick!"
NTA, but opinion discarded.
I got a 3060ti a month ago and considering the current state of vidya I won't change it until 2032.
I dare anyone to name 10 games worth upgrading your GPU for
Jackie Chan
1.secondlife
2.secondlife
3.secondlife
4.secondlife
5.secondlife
6.secondlife
7.secondlife
8.secondlife
9.secondlife
10.secondlife
Anything made after 2011 in 4K
>inb4 4K is le meme while pretending 1440p isn't
Starfield, for modding.
TES6, for modding.
Jackie Chan 2
>having the best-of-the-best costs a lot of money
no way OP
no way
stop