The RTX 4080
>1200
Tf are they smoking?
899 for the 12 gb version
thats a crazy big price difference for 4 gb less vram, im assuming that card will be slower?
The 12GB version is a con, it's basically a 4070 rebranded to a 4080.
How? It's also $300 cheaper than the 16GB while the 70 series is usually $200 cheaper... If the 12GB is a scam then they're both overpriced scams.
It's a scam. Basically a 4070 with cut down die and memory bandwidth. Yet they call it a "4080". Now shut up.
But it's cheaper than a real 4070 would have been. Not a very good scam.
No lol? It's de facto what a 4070 would have been, but now they can ask for $899. So a genius scam.
What are you talking about? 3070 MSRP is $499.
And a 3080 was $700 so I don't know what your point is
>mfw the 3070ti is 700$
>mfw the 3080 is 1000$
>MSRP is
Is what the announcement was about.
3070 was 500
"4080" is 900
Is this official from nvidia or is this one of speculative leaks?
official
>Once again calling the completely different GPUs the same name
Frick Nvidia. Glad EVGA ditched their shitty ass.
>Basically a 4070 with cut down die and memory bandwidth
It IS a 4070. It's using the AD104 die, while the real 4080 and 4090 use the AD102 die
It's the same trickery with how the 3080 mobile uses the 3070's die, except this time Nvidia is bringing that horseshit to the desktop
it's more likely that they're using AD103 and a cut down variant, but if they're impressively israeli then it's possible they'd put AD104 in the baseline 4080. they only finished GA103 in time for laptop segmentation last gen, and it's likely they wouldn't let their work establishing a new die size go to waste
AD103, AD104, point is that it's not AD102 and so it's a 4070 rebadged as "4080 12GB" to trick morons into paying more
>if they're impressively israeli then it's possible they'd put AD104 in the baseline 4080
That's what the 2080 and 2080Super were, don't count it out yet. Especially since AD103 reportedly has 84 SMs and the 4080 16GB is confirmed to have 76 active SMs
They still think it's the height of the mining craze.
Yeah it's DOA, nobody with sense is gonna pay these astronomical prices when a fire sale on the 3000 is upon us at any moment
You realize they priced the 4000 series so high just so that wouldn't happen, right?
So they can lower the price to the real msrp 6 months after they released it and the cattle will be buying them in bulk.
prices are "sticky" once it's gone up, it's hard to come back down
OP pic is a 4070 rebranded 80.
It's scammy as shit because the 16GB version has better specs on everything yet its' still named 4080
Bro, cant you read? It says 2 to 4 times the speed. Now do the math. That's a steal!
>2 to 4 times the speed with raytracing and dlss*
You better double lock your door tonight buddy.
Hey this is a special offer goy. Youre an anti semite if you dont buy this at full price.
They'll buy it anyway, unfortunately.
2-4x faster in what?
stealing your money
i think he said (not sure) 2x in cyberpunk, 3x in portal rtx and 4x in that little car demo
AI dicky
making excessively wealthy PC people waste their money on shit they don’t need
3080. Note that those numbers are only true in DLSS comparisons.
Marketing trick. It's gonna be solely ray tracing performance, not raw power
The kWh on your meter.
>mfw spilled my anus on the floor and got the 3070ti for 700$
>It's 800$ now
Said I got it one year ago for 700$, now it's going for 800-900$
How and why?
And just like that, Nvidia wins again. AMD already lost, how did they do it?
What games do you need a 4000 series card for?
i need it for cyberpunk 2077, rdr2, mfs, racing sims in vr
what do u need it for anon ?
>cyberpunk 2077, rdr2
Bro you can use a 3060 for cyberpunk and get 60fps on Ultra, the frick do you need a 4000 series for? Do you need 4k 144fps while streaming or something?
i cant tho it doesnt even run 4k 60 on 3090
Huh? What's the rest of your setup? Did you skimp on cooling or ram or SSD? My PC is not particularly amazing (card is worse than yours) and I can get a consistent 60 on Ultra even when going on a car exploding rampage and fighting off infinite max tac spawns. Granted, my PC starts getting toasty as frick but I don't get any frame drops.
>3060 for cyberpunk and get 60fps on Ultra
Try turning on ray tracing and tell me how that framerate is again.
>inb4 rtx is a meme
Cyberpunk looks like ass without it.
Just want to double-check...
You need it for PlayStation 4 games?
Based PCbrainiac
>Rdr2
>mfw i played all of rdr2 with a gtx750 ti
>Lists games
>Doesn't list actual functions
You can run Cyberpunk on a 760 by turning off MemeTracing, anon. You're wasting cash for a meme.
you can pick up a 3060 for like $350. what's the point of spending 1200 on a card when that could buy a whole respectable rig? at some point you're getting diminishing returns with the way game graphics are going, fewer and fewer games are actually improving because they have to support consoles.
cyberpunk is still a meme gameplay wise. and GTA 6 isn't coming out until the PTX 8090 comes out with full path tracing for just $2999
Realistically? Absolutely nothing. People have ballooned their own requirements because of nonsense like 4K and Ray Tracing.
Machine learning. These new cards don't need to be faster, they just need significantly more memory.
I need it for AI, but frick these prices.
anon 30 series and below work for stable diffusion
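For anyone wondering how that actually fits on an older card, here's a rough sketch (assumes the Hugging Face diffusers and torch packages are installed; the model id is just an example checkpoint): loading in fp16 keeps peak VRAM around 4-6 GB, which is why an 8 GB 30 series card is plenty.

import torch
from diffusers import StableDiffusionPipeline

# Half precision roughly halves VRAM vs fp32; attention slicing trades a bit of
# speed for a lower peak, so this comfortably fits an 8 GB card.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint, swap for whatever you use
    torch_dtype=torch.float16,
).to("cuda")
pipe.enable_attention_slicing()

image = pipe("a graphics card melting, oil painting").images[0]
image.save("out.png")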
You fricking Black folk are seriously getting fricking ripped for make shit jpg? You are no better than Monkey JPG crypto Black folk, atleast these Black folk were trying to make money with their scam, what the frick you even do with your PPROOOOOOOOOOOMPPPPPPPPPTTTTTTTTTTIIIIIIIIIINNNNNNNNNNNNNNNGGGGGGGGGGG?
moron
Dude i get it, it's a funny tech but it's just a game.
A game not worth these moronic prices.
Stable Diffusion "art" is valueless
You can try to make money with erotic art and pay for the GPU later. I bought an RTX 3070 for an unbelievable $1500 a few years ago and paid for it in two months rendering 3D erotica.
No you didn’t
Post an example of this erotica you sold for 1.5k
I made that money in two months on Patreon rendering art for an eroge.
My buddies were laughing at me two years ago, when I managed to snatch a 3070 for MSRP+80$ from a mom and pop's store, just before the prices skyrocketed to the loony bin.
>You moron, you overpaid, the issues are gonna clear up any day now.
With each month I feel like it was the best gaming purchase I made this decade lmao.
VR Ray Tracing
It's honestly about 8k 60fps or 4k 240hz now or if you're into VR there's that as well. Or hosting your own AI frickfests including dungeon ai
Modded TES6 and Starfield.
Starfield demo stuttered on a 3700X @ 3.6GHz + RTX 2080 Super (SeX). Now blow that up to 1440p/4K and add mods. You are going to need your RTX 4000.
Gee can't wait to spend at least 3000 dollars on a system that'll run Todd's GayBlack folk in Outer Space game adaptation.
To be fair, nothing indicates the game isn't just poorly optimized, and it may well be less intensive once modded for optimization (or I guess patched by Beth (LMAO))
You're probably fine with a 20xx if you're just targeting 60fps at 1440p.
Bethesda games don't gain performance lol. They get faster by the hardware outpacing them. Skyrim got heavier after becoming 64-bit. You used to be able to run it on a HD 4870 1GB or even older.
No, I meant in the sense that there are stability and optimization mods that do pretty decent at not affecting the graphics but making it run smoother.
>3700X @ 3.6GHz + RTX 2080 Super (SeX)
Nice bait anon
we all know new consoles are equivalent to 1660 Ti tops
Doing AI shit on your own hardware
You just posted the answer to your own question
haven't found a game i can't play on my 1060 so probably nothing.
For me, VR. I need to run 8k 60fps. I'd also like to dabble in this AI stuff, but I have an AMD card and I learned it's shit for that
I really want to be able to play 2025+ games in 4K @ 60 fps with high/ultra settings
I think I'm gonna fully enjoy 1440p 144Hz ray tracing, that's realistically approachable now.
For hit game Call of Duty™ Modern Warfare™ 2™ of course!
this but unironically
star citizen official release
VR. Still can't hit 120fps with maxed out resolution on a Quest 2 in most games
I wanted to upgrade my 2060 but not for 1200 bucks LMFAO. tbh my machine is 7 years old and needs a new build anyways, would have been nice.
you have morons who bought 4k monitors and other shit, dont bother asking, they're brain damaged
4K TVs are fantastic and cheap and make 60fps look good unlike pc monitors.
4080$
>that little VRAM for $1200
holy blatant planned obsolescence. you would think that at that price they'd fricking give you like 16gb at the very least
>can't read
thanks jens
You'll get a RTX3080 for $399 by Black Friday. HOLD HOLD HOLD!
I picked up a 3070 for $500 a week ago... I regret nothing!
waiting on 4060 TI or 4070
I already own a 3080ti, im set for life basically. unless they discontinue DLSS
dude you wont have DLSS 3 though
Maybe, but i still have all the other shit they claim to have, there is literally no difference other than raw power.
NTA but games aren't improving, last GPU I used were atleast 8 years old and it could run every single game quite well, just not on max on those newer games which said had lot graphics.. It was only one single game that fricked me over and that was because of the lack of VRAM. One are set for years, 5-10.
>just not on max on those newer games which said had lot graphics
stop being ESL
I would rather not..
>buying a new card for better upscaling technology
At this point they've given up on native res RT haven't they.
the future is high resolution + dlss
almost indistinguishable from msaa if done right
Yeah that's true. If a game has dlss I use that over native + AA
oy vey those inferior cuda cores could never run DLSS 3, what a shame, we just couldnt do it, its impossible goy trust us
kek
man PC gaming is the ultimate scam
>man PC gaming is the ultimate scam
yep
AMD + Linux
enjoy your GPU for 10+ years
every game runs on linux? doubt that. can I record gameplay on linux (without a performance hit)?
Only game that doesn't work for me is Red Dead 2. Recording, I'm not a youtuber homosexual so I have no idea.
what if you use AMD + Linux AND have some kind of reasonable minimum standard? 10+ years? wtf? Do you only play 2D games on a decades old monitor?
I mean define reasonable standards, by all accounts SF6 and T8 are gonna remain 60FPS locked for competitive reasons so for at least some portion of people 60fps is fine.
that's a long time to play Tux Racer
So how will this work for games? Will a DLSS 3 enabled game automatically downgrade to 2 for older cards or are 30 series cards stuck with the miniscule number of games currently using DLSS with future games being incompatible?
I think it'll still work, you'll just see less perf improvement.
Which option will harass gamers into buying their overpriced, wheel-less teslas?
That's your answer.
it looks like DLSS 3 just takes the existing DLSS settings but performs them both more efficiently and with less visual degradation.
Source: me
20X0 series is still serving me well enough.
>4080 12gb
STOP
THE 4080 12GB IS A REBRANDED 4070
IT DOESN'T EVEN RUN ON THE SAME ARCHITECTURE AS THE 4080 16GB WHICH IS WHY THE PERF IS SO MUCH WORSE
JUST LOOK AT THIS SHIT
>4080 is just a 4070 rebranded
frick novideo
>next generation games
>portal
>CP2077
Ahhhh I sure can't wait to play 10yo games again with a coat of paint for $1500.
HASHAHAHAHAHAHAH THEY USED 40XX VERSION OF DLSS THAT CAN'T BE USED ON 3090 TI JUST TO BEAT IT IN PERFORMANCE MODE.
LMAO AND IDIOTS ACTUALLY SHILLING FOR NVIDIA.
doesnt that just prove that DLSS 3 is good?
>perfomance mode
>good
DLSS 3 doesn't work with 3090 Ti.
So they're comparing raw performance with shitty jpeg compression performance.
it's DLSS2 vs DLSS3, not native resolution
The optical flow SDK works with all RTX cards, there is no reason to lock it to 40 series. They were designed to work on the current GPU hardware.
Where on the bar do you see how good it is? For all we know they upscaled it from 720 and its blurry ghosting shit.
Didnt they do the same shit with 2080ti and 3060 or some shit saying it was same level when it really isn't
bro, what the frick does this even mean. Can someone explain the rationale other than the obvious bs marketing.
Left is an apples to apples comparison.
Right is DLSS2 vs DLSS3, DLSS3 cleans house because it has motion interpolation and "fakes" frames.
How the fk do you make a fake frame. What lmao
Show the frame again.
Motion interpolation. How technologically illiterate are you?
Go turn on your TV's motion smoothing/similar named thing and watch a movie
Motion interpolation.
That's how all the shitty 60fps anime videos on youtube are made.
Now Njudea wrote software so it's done on their tensor cores in the 40xx series.
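If anyone wants to see what a "fake" frame literally is, here's a toy sketch (numpy and opencv-python assumed, filenames made up): the dumbest possible interpolator just blends two real frames. DLSS 3 instead warps pixels along motion vectors from the optical flow accelerator, but either way the in-between frame is synthesized, not rendered by the game.

import cv2
import numpy as np

# Two consecutive rendered frames (placeholder filenames).
frame_a = cv2.imread("frame_0001.png").astype(np.float32)
frame_b = cv2.imread("frame_0002.png").astype(np.float32)

# Naive "fake" middle frame: a 50/50 blend. Real interpolators warp along motion
# vectors instead of blending, which is why fast motion is where artifacts show up.
fake_mid = (0.5 * frame_a + 0.5 * frame_b).clip(0, 255).astype(np.uint8)
cv2.imwrite("frame_0001_5.png", fake_mid)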
the "4080" 12GB is not even the same chip as the 4080 16GB.
they're baiting you.
It's a high end 1060 3GB scam.
But... if you are fine with sticking to 1080p, go for it.
>2 4080 cards
>one is noticeably worse than the other one because it's not actually a 4080, it's actually a 4070
>Measuring performance in x0.0
Awfully scammy for a so called premium product.
They did this same trick before on previous generations and it turned out to be bullshit. Lol.
Everything about this makes me angry. Fricking hell AMD stop shitting the bed on GPUs, look at what it allows.
>nvidia meme charts
BUT WHAT DOES IT MEAN?!?!?!
WHERE'S THE FRAMES YOU-YOU-YOU!
TRIPLE
Black folk!?!?!
kek that's another 970 "4gb" move
njudea deserve bankruptcy tbh
good thing 99% of AAA is trash I couldnt give two shits about
nobody will even be able to afford these new cards so devs will still be optimising for 1080/2070 level. who cares if you don't game at 4k and ray tracing? makes no difference.
>IT DOESN'T EVEN RUN ON THE SAME ARCHITECTURE AS THE 4080 16GB
What did sperg-kun mean by this?
probably the fact that it's a different chip
What are you poor?!?!? Oohh ha oooh hahaha aaaghh nnnnhgg f
Sorry, gonna stick with my 1050ti
same not gonna buy a card until there's something under 500$ that can play 1440p 144 fps
You will never have a gpu ever again.
In 5 years, they'll be trying to rip 1500$ off your back for an xx70 or whatever they rebrand it to class of gpu.
Leather Jacketman has left the constraints of normal reality.
Based 1050ti haver
1050 Ti cucks are eternal goyim. There was a time you could buy an RX 570 4GB for less lmao.
>1,200 dollars
That's before scalpers pick up 90% of all of these cards on day one and jack up prices to 1,500 at least.
Gonna stick with my 1060 6gb strix.
Looks like im sticking with my 9700k + 1080ti a little while longer
>mfw got a 3060ti on launch day for MSRP
this purchase is only aging better and better
>aging better
>no dlss 3.0 because you bought a joke generation
lmao get fricked moron
>upscaling
HAHAHAHAHA
>paying $900 minimum for a new mid range gpu
who's the one getting fricked again?
>6600 chads
founders editions look so lame
I was planning on building a gaming PC soon
Does this mean it's a good time to go for a 3000 series card?
Steam Deck is selling despite horrible specs for a reason.
No it's time to get an AMD 6600 for under $300 possibly from Sapphire and FRICK YOU NVDIA
Yeah go for it I guess
I would wait for 7500/7600 XT and 13400. Those will make for a great combo.
When's the price for the 3000 series to go down? I want to get a new gaming laptop.
Reminder that you can buy the 3060Ti FE at BESTBUY for 399$.
>FE
Explain.
they're ugly
FE cards suck, they're loud and run hot compared to non-FE cards
>FE
no thanks
yikes
>Got a new 3080ti this month for £800 as they were panic selling
Wew lad, are the prices rising now? Glad I wasn't a moron who listened to the techgays saying to keep waiting
Did the homosexual really not announce the 4070/4060? I took the day off to watch this shit. FRICK YOU!!
its always been like that. 70 usually gets announced with the 80 and 90 but 60 and below always get announced and released later
Anons, there are two versions of the xx80 in the list and one of them is not a Ti.
Use your brain.
>use your brain.
you first
the 4080 12gb is the 4070
Wait. So crypto was a problem before cause you couldn't get a card. And now crypto dying is a problem because you can't get a card?
Yet people still will buy into Jensen's dogshit scams
>$1200
He skipped the 3000 series for this LMFAO
Waitgays never win.
Do people who say "what are you, poor," really mean it?
Maybe in the sense that $1,600 usd isn't all that much money at all. Still stupid to give into frivolities on a whim though.
judging by steam charts, those people are on 1060s with a cheap 1080p monitor, because there's more "what are you, poor" posters than xx90 owners. factor in "influencers" and streamers that are given the cards for free too.
? You only need to save for about 2 months for it..? 1 month if you eat less.
>1 month if you eat less
Nah, frick this. If I have to choose between my gains and video games, I know what I'm going to choose.
Use the gains to seduce an old lady.
Daily reminder all RTX GPUs have the Optical Flow Accelerator, Nvidia is locking DLSS 3.0 to the 40 series for no reason other than sales.
>no more EVGA
Dropped
>Picked up my evga 3070 for a reasonable price a few weeks ago
Feels good, I reckon this will do for a good few generations like my 970 did.
These look trash and not worth it, are 30s gonna crash in price or not.
>are 30s gonna crash in price or not.
They're specifically not selling mid-low end 4xxx cards so they can keep 3xxx card prices up and sell off their overstock of them.
You guys are way too simple.
40 series way overpriced, no one will buy it
>"But 30 series look good!"
But actually, even the 30 series is overpriced
Nshita needs to get rid of the massive amount of 30 series stock that they've artificially price-inflated, but they don't want to sell it too low
40 series will have a massive price drop to its ACTUAL intended price once the 30 series stock is down
People will then praise Nshita saying how cheap 40 series cards are, even though it's still overpriced, or at best where it should have been all along.
Not to mention Nshita wants to preserve the ridiculous pricing from the mining era and have that MSRP be the "norm".
tl;dr: It's a trick from Nshita to get rid of their 30 series card from the mining era.
Everyone will buy it
Graphics cards were still being bought by consumers (non-crypto miners) for over-inflated prices of $1100 before the prices of the cards tanked
Not really, in my country GPU prices already got cut in half after the mining shit stopped.
The market is flooded with used GPUs, so any new GPU gets ignored even when it's sold at a discount.
I can see their stock numbers during sales and they barely go down.
Nvidia raising the MSRP is such a fricking dumb move, no wonder EVGA stopped making GPUs altogether.
Fomo is powerful and gamers are moronic
>40 series will have a massive price drop to its ACTUAL intended price once the 30 series stock is down
AHAHAHAHA
No they won't
The 4090 is priced $100 above its actual market value but the perf is good.
The "4080" variants are completely overpriced to kill the low end; they're at 2x the price per frame compared to the 4090.
Personally I will wait for Navi32, which has 3x the amount of cores and cache
I've only ever used AMD cards my entire life, and I want to see if the grass is greener on the other side. Would like to get off of the 5700xt, too bad Njudea never offers a good price point on their shit.
I'm using a 5700XT as well. Was thinking of upgrading to 4090 Ti, but obviously not anymore due to the pricing.
Didn't really need the extra power anyways. I can survive with my current graphics card.
Looks like I'll be sitting this one out until RTX 5090 Ti
>5090 Ti
Imagine the house fire
I'm on 120hz 4k with my 5700xt and it's really getting long in the tooth. Going to need a 3070 equivalent from AMD for my next upgrade at the least. Once you buy a decent monitor it is hard to keep up. Most games from the last 5 years I run at 2560x1440 or use the CAS FidelityFX crap (the AMD DLSS). 4k is definitely the current standard, even consoles like the PS5 are doing 4k 30/60fps. 8k is so close. Can't wait to play Quake 1 from 1996 in 8k at 120hz+ eventually.
QRD on the bad AMD drivers meme?
Its mostly a meme at this point but AMD drivers take time to catch up in terms of emulation, openGL games, and the latest shit that comes out.
The stability of the drivers when doing even basic shit like just browsing the web can also be questionable; sometimes it's extremely stable and there are no driver crashes no matter what you do, some months it crashes because you tried to open a game while you had a youtube tab open. I would say these events are pretty uncommon now, but it happens enough that those who really mind it will get tired of the schizophrenic stability after a few cycles.
Thanks anon. I don't play the latest games, I always wait a couple of years for a good sale, like if I buy a new PC I 'might' be playing rdr2 so slow updates aren't that much of an issue.
>tfw I can't even afford a used card
>mfw still using a 2014 laptop
>tfw I can't even afford a used card
yes you can. buy something low end not an 80 or 90 homosexual
>pay $500+ for a graphics card
No, I can't.
Not my problem.
How does it compare to a 3080? I'll probably ride this card to a 5080 in a few years.
>RX 6800 XT literally costs almost $200 more in my country compared to the RX 6700 XT
The frick is going on?
6700 XT is super cheap in my country, almost at 3050 price with 3060 performance, but why the frick has the 6800 XT price jumped this high? It even costs more than an RTX 3070.
I don't see any massive performance upgrade when comparing them.
the closer to flagship you get, the worse the price/performance ratio gets. The final 10% of the top-end performance envelope ends up being 90% of the final cost. That is why the majority of people always settle for second or third-best in a generation of cards. I've always bought the 1070/2070/3070 equivalent from AMD. My next card is going to be a flagship because I'm not connecting my display to a GPU that cost less than it anymore
Yeah, but the pricing between 3060, 3060 ti and 3070 looks more reasonable.
There is no card in between the 6700 XT and 6800 XT, and even if there were, the 6800 XT is literally overpriced compared to the 3070.
Might as well have bought a 3080.
Anon, the 6800 is in between the 6700xt and 6800xt.
Costs $50 less than the 6800xt and $150 more than the 6700 xt, so still moronic.
3060, 3060ti and 3070 only have $50-$65 price differences here.
They kind of abandoned this now though with the 4080 being so expensive. There is very little room for pricing the 4080 Ti. The 4080 16GB and 4080 Ti 20GB will be priced very close to each other for some bizarre reason.
Since I'm sure someone's asking somewhere in the thread: yeah, 8GB VRAM is fine for now, and in fact might be fine for the generation if you're willing to turn down some things and you're playing at 1080p (or even more turned-down 1440p). If you want "high" but not best quality for the generation you'll probably want 10GB to preserve frames.
AMD vs Invidia? If you really have to ask in this thread it doesn't matter get the cheaper equivalent.
bros...
my entire fricking build was like $2800AUD in 2020. I've upgraded my CPU to a 5800x since then but even then, holy shit. $3k for 4090. No 8k 140hz for me I guess.
Why would you even want 8K, I cannot imagine sitting as close to my 8K TV as I do my monitor and a 32in monitor at 8K is basically unnoticeable versus 4k
>basically unnoticeable
maybe for you, but I actually play video games. Some games have really bad Anti-Aliasing implementations that could benefit from higher resolution than 4k. STALKER comes to mind. Old games always look cool at higher and higher resolutions. You could do cool stuff like render the entire island in morrowind at 8k with all LOD disabled for an epic screenshot that you could zoom into and maintain detail. Get an imagination you golem.
It is genuinely basically unnoticeable for most games I mean good on you for having a use case, but I don't really care
stfu
Minimum wage in my country can go down all the way to 2.5 euros/hour
how much is your medical insurance
We have public healthcare
You also have ridiculous income tax and a high wait period.
it's almost like directly comparing wages doesn't account for much
>what you dont pay in insurance you pay in taxes
"""public"""" healthcare is just siphoning money from the working middle class to the parasitic public sector """"worker""" and the lumpenproletariat.
You've never had to interact with an insurance company or american medical billing.
true, but i live in a country with public healthcare and find myself paying private insurance if i want actual medical care and not wait 6 months for a cardiologist appointment.
the only good thing about public healthcare is the affordable drug prices (again, you pay for them in taxes), everything else, how it works, public hospitals, public healthcare personnel, it's pure garbage.
better than having to pay surprise bills out of pocket
The US spends way more taxpayer money on health care than European countries yet none of that actually goes back to helping the taxpayers.
>euro
>medical insurance
Not a thing when it's paid wholly by taxes
Now imagine having to pay rent, bills, gas, food and all sort of expenses that come from not living in your parents basement.
You can get a good 2010's set of wheels for the money they are trying to scam you out of. Graphics cards having automobile prices is so outrageous, it crosses over towards hilarity.
The fact morons excuse this is why we're here now, with a 900 MSRP xx70 card with a peeling xx80 sticker hastily slapped on it.
>now imagine blablabla
yes it's expensive but you're still 1st world, making more than most countries in the world. not even a month's wage for something you don't need (4090), and you can get a 4070 for like a week's pay which is fricking nothing
And if asking these prices for a freaking gpu is OUTRAGEOUS for ME, imagine how it is everywhere else, where they either earn less cash, are gonna get fricked up the ass with taxes, import tariffs and retailer cuts, OR BOTH.
...And that's before they all get scalped into 2x MSRP territory.
This generation is a wash. If gamers won't vote with their wallets(spoilers, they won't) PC gaming is heading for the same asscrack niche of the market it was in the 00's.
yes it's half a year for me
obviously i'll never buy it
dont even care at this point
>waitgays right once again
>buy 1 generation
>skip 1 generation
the 40 series is the new 20 series GPU scam
There was no time to buy Ampere. Did you sleep over the lockdowns?
how are we feeling 970 bros
when do we stop waiting
If you didn't get a 1060 6GB that's your fault.
970 still going strong my dude.
It was only slightly behind the 980, though the gap is increasing with time.
i went from a 980 to a 3080 (very early, before gpu prices went absolutely mental) and have been very happy. Definitely won't be considering anything before 5x series since newly releasing games aren't really pushing graphics anyway, unless you really need 4k max settings with ray tracing.
my 970 and WD black hard drive that were both bought 7 years ago are surprisingly chugging along
It depends on what is needed to play the next big releases. If I can manage to play the games I'm interested in at lower res with acceptable frame rates then I'll wait. The most probable outcome is that it won't manage the real transition to next gen and I'll try to get a 3080 discounted, or wait for the scam 4080 to drop in value after they sell the current 30xx stock. In any of these scenarios the 970 will be framed and given all its due medals for 8 years of service, or 9, since only god knows when next gen exclusive games will reach the PC market.
Reminder that, factually speaking, the downside of used mining cards is not a performance hit (unless you were dumb enough to buy one that literally burned), it's the fact that due to usage they will fail sooner than a new card. Basically assume at worst a 5% average performance loss, but a 50% life expectancy loss.
>The RTX 4080
More like the RTX 4070
4060 Ti = 4060
4070 = 4060 Ti
4080 12 = 4070
4080 16 = 4080.
STOP KVETCHING AND BUY THE RTX 40 SERIES GOY!
10k pln for 4090 in poland
got some toilets to clean my german brothers ?
It is like a nudge for people to start treating hardware like software. Right now you should never buy a game at launch. Wait until early adopters debug it for you and devs release the complete game they had in mind but had to strip into parts to have DLC. By the time the full game releases you should also have affordable hardware to play it.
Hardware isn’t software dummy.
They can just stop making older cards. (They already have)
It’s just a greedy company / monopoly gouging you because they call it “cutting edge”. (No pun intended 🙂
3k aud hahahahaha
It's not even 30% faster than a 3080 in cherry picked games
>every benchmark used DLSS in performance mode
>No raw performance graph.
>basically useless for games that don't use it, since it's an exclusive tech that's also locked to certain Nvidia cards.
?
didnt mean to quote ya
Well, most of them are going to be using 4K and thus DLSS. I know I will when I get my 5080 and 4K.
Just like when people a decade ago said all future games would use hardware/GPU-dedicated PHYSX, making all non-PHYSX GPUs obsolete, right?
Well, it didn't go anywhere. It went to single GPU, then back to CPU. You still need to crunch the data to this day.
So basically it makes GPU tech useless and any old game that used this tech unplayable unless you bought their GPU.
In other words, marketing scam.
There wasn't any tech there. NVIDIA artificially locked it to GeForce, but as physics progressed they couldn't israelite us over anymore.
DLSS is an actual technology that doesn't simply happen by number crunching.
>DLSS is an actual technology
Exclusive technology, just like PhysX, which was soon abandoned because HAVOC exists and works on any GPU.
>exists and works on any GPU.
Does this sound familiar? I wonder why.
Yeah, well, upscaling isn't going anywhere, so I don't see why you keep this conversation up.
The discussion started as if DLSS were the only upscaling that exists, just like PHYSX vs HAVOC.
High-end AMD owners will use FSR too, so raw performance is kind of redundant.
Still a scam benchmark because not all games will get FSR or DLSS.
"AI" is literally number crunching. Tensor cores are specialized number crunchers for the type of number crunching that "AI" needs. The only part of DLSS that isn't number crunching is the pre-crunched numbers that constitute the weights of the neural network.
when 1080p became standard did you wait 5+ years to adopt it? 4k was standard in 2019. Consoles do 4k/60fps routinely now. Think about it.
brutal. It's over for PC gaming
>Consoles do 4k/60fps
jesus fricking christ Black person, you arent even trying.
As if Njudea and AMD aren't doing upscaling voodoo.
NTA but yes. Sorry man, I'm only here in the thread to assuage the fears of the more easily panicked anons.
>4k was standard in 2019. Consoles do 4k/60fps routinely now.
>Consoles do 4k/60fps routinely now.
>wo long does 4k at 40fps if you're lucky
>god of war (a last gen game) can do 4k 60... if you're in a hallway with 3 enemies
>red dead 2
>not on your life
yeah dude, 4k60, totally common except for every major game.
>4k was standard in 2019
im genuinely impressed that you were able to convince yourself of this.
>Consoles do 4k/60fps routinely now
Oh poor anon. You probably believe in Santa Claus too?
Are 3080 prices gonna drop when this releases bros?
>new gpus are now the price of cars
wtf why buy this over a console. pc gaming is dying.
They also consume the same amount of electricity as electric cars
Graphics cards are just small Teslas without wheels, even using the same chips.
theyre also the size of cars now too
ALL TECH SHILLS ARE CALLING THE 40 SERIES THE 20 SERIES
see i was right, its another israeli scam for nvidiots.
Can't wait to return my 3090 Ti and buy the same 3090 Ti back cheap, perfectly timed purchase on my end
I don't even know what I'd play with a 40xx. I have a 1080 and it works just fine, plus I play everything on a TV at 60fps so I don't really need more.
is there a way to check to see the highest card you can buy before bottlenecking kicks in on your system? i know i cant frick with the 90 or 80 and i dont want to buy a card and have it be a waste of time
Yeah, google CPU GPU benchmark.
>1100€ in germoney
>for a RTX 4070
lol
lmao
also kek@366W peak consumption
More like RTX fourty-give me money
that has to be the most blatant scam in recent history. even worse than star citizen.
it’s a 4070, not a lower spec 4080 lol.
but people won’t pay $900 msrp for a 4070 so they simply changed the name to 4080
Yup. 1060 = 980. 2060 = 1080. 3060 = 2070. Derp....
That was a downgrade too.
IS AMD RDNA3 going to have GDDR6X or still GDDR6?
>2-4x faster than the 3080ti
i doubt it
It's that on DLSS performance mode. You know, the thing you're not going to fricking use if you have a $2k video card.
but all the games that have dlss support are shit
>You know, the thing you're not going to fricking use if you have a $2k video card.
Are you moronic? The 3090 requires DLSS to do 4k above 60fps in most games. The 4090 probably will too, especially if you want to get anywhere near 100+
The main benefit of 100+ fps is mostly reduced input lag. Which motion interpolation will inherently increase anyway.
>The main benefit of 100+ fps is mostly reduced input lag.
And far superior motion quality (and just more frames you can definitely see)
I don't think it's gonna hit latency all that much.
Not that it matters at this price point..
On Performance Mode?
lmao, let's not be moronic. Most games aren't unoptimized AAA garbage
just wait until RDNA3 drops for the ultimate destruction of nvidya
DLSS 3 is basically taking the entire frame and remaking it, instead of simply changing tiny bits like DLSS 2 does
This is why it's faster but requires the "new" optical flow accelerator hardware, because it basically ditches DLSS 2
they really should have called it something else then.
>got a 3060 Ti FE for MSRP
kek not my problem, good luck with your overpriced housefires you dumb rubes
somebody stop him!
>got a 3090ti for $1000 due to best buy rewards
Still overpriced but I am set for a good while.
3090ti is $1000 right now for everyone, you turbo-mongoloid
I expect yet another embarrassing display by shAMD and everyone to suddenly stop complaining about Nvidia's horribly predatory pricing.
They're gonna have matching prices for sure. The competition meme is so fricking wrong
Can't wait to get a 4090 to play Persona 3 port and some PS4 games.
If you wanna battle then I'll take you to the street
Where there's no rules
Take off the gloves ref,
please step down
Gotta prove my skills so get down
>Cards all sell out within seconds
>People paying 2-3 times the price for them on ebay
>"Why is the price increasing? I just don't get it!"
>Cards all sell out within seconds
it isnt 2020, granddad
doesn't matter, there are no jobs anymore so there is a rise in onlyfans people and peddlers at the malls. So you damn well know there's a rise in scalpers
People buying 30 series at inflated prices was due to the perfect storm of covid lockdowns causing shortages while also increasing demand from people stuck at home. And because Ethereum mining would make the card pay for itself even at 3x the MSRP.
None of those factors really exist anymore. Inflation may be up but even that doesn't justify these prices. I guess Nvidia is just hoping people are moronic
>I guess Nvidia is just hoping people are moronic
Smart move.
I can't believe GTX1060chads and rx580chads still winning after all these years holy shit, just what the frick is wrong with GPU market
Using it until it dies on me or until it doesn't run a game I want to (won't happen)
So should I go ahead and get a 3080 now, or should I continue to wait and hope the 4080s drop in price next year
bumping
Get a 3080, or 3090.
The new gen isn't getting any cheaper, since they still have a gigabillion 30xx's to sell and they don't wanna step on their own toes.
Now post the TDP
Waiting for the 4080ti, the best card
So when does the industry switch to ARM
ARM is garbage and the only people who talk about ARM replacing X86 are boomers who think pissing money into Candy Crush is "gaming".
The warehouse I work for managed to get sold on the ARM meme and now tasks that used to take 1 second take 1 minute because of how garbage ARM becomes when you confront it with any kind of workload.
Is the 4060 still only gonna be 8GB?
yes, but it will be called 4070 now, and cost 700 bucks
>dlss 3 ONLY works on 4000 series
FRICKING WHYYYYYYYYY
Money.
What the FRICK are these prices in €
>1500€ for a fricking 3080
lmao
I already found artifact issues on DLSS3's motion smoothing within seconds of looking at it.
his red colors bleed into the sky for several frames and the background behind him also has artifacts.
BETTER PAY 2k$ FOR THAT THEN ANON
oh no no no no no no no no
>freeze frame and zoom 4x in
>SEE THERE ARE A FEW PIXELS THAT ARE WRONG ITS SHIT
kys
>tfw 3080Ti
FUUUUCK I BOUGHT A 3080TI FOR 1200 AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
lol
never buy a graphics card while drunk
Recently? Why?
lmao
KEK even
I bought it for 1500 like 2 years ago, was totally worth it considering how many games I played on my beautiful 1440p@144hz monitor, considering how the prices are still fricked up and considering how 40xx series still looks like shit.
>I cope cope cope cope cope 2 copes ago, was totally worth it cooooope cope cope gaymes I coped on my beautiful cope, cope cope cope cope cope cope cope cope cope cope cope cope cope cope cope cope cope I like to drink semen.
lamo just take the L
Uhuh.. Happens to the best of us, just try and not make the same mistake, or some similar...
until I find a legitimate performance comparison between a 3090 and a 40 series card I won't switch
I already ditched my old 2060 for a 6900xt last year. Not going back to Nvidia.
>teamgroup
oh no no no no no no no no no
God damn that card looks cool as hell.
How much is 3090 expected to drop if at all?
It probably wont go any lower. The 4x power claim is deliberate BS that factors in very specific resolutions and this new meme technology nobody is going to use for at least a few years.
>have rtx 3070
>barely play games
I could have just gotten a cheap 4chin mashine laptop.
same here but with a 3080
similar but I end up playing everything on Steam Deck
Sell one of those to me please.
kay for 5k
Put in an order, the queue is damn short by now. You'll probably have it by christmas.
I will be purchasing a 4090 exclusively for Virtamate
I hope the best for you.
get a real b***h. they're cheaper
How do you turn a real b***h off?
I use xanax for mine
that shall nag and try to change who you are? too much responsibility, prostitute always nags til I have to beat them, It's not really my fault that she never learns.. FRICKING prostituteS
that was my plan as well but virtamate is starting to look outdated
Same virt a mate bro... my 1080ti can't keep up with all that hair in VR
Nvidia doing a good job chasing people away into AMD.
AMD will be barely better price wise, but you will make tradeoffs on ray tracing performance for power efficiency and VRAM
If you really want GPUs to be unfricked, best pray Intel is able to succeed and establish a 3rd player
game graphics havent evolved for years now, why would anyone need 2-4x faster cards every year?
Programmer skill has gone down 2-4x
no need to worry
some autist will make dlss 3 work on 30 series cards
Brainlet question:
Why are graphics cards consuming more and more energy? Have there been no improvements in regards to efficiency?
performance per watt has increased every single generation, 40 series is no different. think of it this way: if you limited a 4090 to the same power draw as a 3090 ti, your fps would still be significantly higher.
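You can check that claim yourself once a card is in hand; here's a rough sketch (nvidia-smi's -pl flag is real, but it needs admin rights, the limit has to sit inside the card's allowed range, and the 450 W figure is just the 3090 Ti's board power, not anything Nvidia published for this comparison):

import subprocess

# Cap the card to a 3090 Ti-class power limit, then run your usual benchmark
# and compare fps against the same run at the stock limit.
subprocess.run(["nvidia-smi", "-pl", "450"], check=True)

# Frames per watt from whatever benchmark you ran (placeholder numbers).
fps, watts = 95.0, 450.0
print(f"{fps / watts:.3f} frames per watt at the capped limit")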
Why would you even need a 3090ti?
video cards are not only used for gaming
You don't need a GPU at all unless the job you do requires one. For the most part asking someone why they would even need a high end GPU is like asking someone why they would need expensive rims for their car. The answer is they don't need them but they want them and are willing to pay the price they are being offered at.
Why not release a budget option then, since it would still be an improvement?
because
1. It’s less profitable because “budget” options are often just flagship chips with features soldered off or disabled on the hardware side
2. They want to discourage you from buying these cards
3. Cloud computing is the future that NVIDIA is gearing towards
more transistors need more power
Look at the size of the cards and the gimmicks they are pushing
>AI to make shitty memes
>blurry upscaling algorithms
It's obvious we've hit a brick wall when it comes to nanotechnology but they can't stop.
Why would you need a 4000 series card?
Did your space heater break?
>450W
>i5-6600k
>1070
I just want to upgrade. Why does it have to be so expensive these days...
In absolute awe at the size of this lad
is this the generation where cards start breaking off from the pcie slot because they're too chongy?
Yes
https://videocardz.com/newz/lenovo-geforce-rtx-4090-gpu-pictured-some-more-it-is-36-cm-long
good on lenovo though, from a glance, this actually looks quite alright. hope that quality heatsink persists down the stack.
probably wont even fit in smaller cases
So do I need to buy a 1000000W power box for my PC so that my RTX 3080 can actually put its gamer LED colors on?
Is 10GB VRAM really too low for 4k....I have a 3080 and I never use ray tracing and pretty much always use DLSS when it's available
in most cases, no. if your game is coded like ff7 remake, yes. performance will literally nosedive the second you max out vram
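If you want to see how close you actually get to that 10 GB ceiling, a quick sketch (assumes torch with CUDA installed; nvidia-smi reports the same numbers): the query is device-wide, so it counts whatever the game has allocated too if you run it alongside.

import torch

free_b, total_b = torch.cuda.mem_get_info()   # device-wide free/total, same as nvidia-smi
used_gb = (total_b - free_b) / 1024**3
print(f"{used_gb:.1f} GB of {total_b / 1024**3:.1f} GB VRAM in use")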
It really is. I don't mind using DLSS though and 1440p is still fine on my monitor maxed out. Waiting for 50 series to upgrade.
Get ready for a 2299$ 5090, 1599$ 4080 and a 1199$ 5070, or whatever they will rebrand it to.
I'll start saving now then
upscaling sucks wiener at anything but 4K quality mode (1440p internal). Either
>there's not enough internal pixels for it to look good
>you are upscaling too far for it to look good
Yeah I meant dlss for 4K, native on my monitor.
The future is 8k + dlss on your 4k monitor
It looks crisp as frick, and with a 4090 you might be able to play real games at reasonable frame rates
1080ti bros... I think i'm gonna upgrade for a 4080 12GB... seems like the only fair deal out right now
Same, 12gb is the bare fricking minimum so it looks like the only card that's ""acceptable"".
Once it gets a 200 bucks price cut, anyway. Everyone knows it's actually a 4070. Frick nvidya
The cards are SUPPOSED to replace those that preceded them with maaaybe a small mark-up, NOT be placed above them in the price stack for twice the price, sweatie.
At least, that's how it worked until last gen.
Absolutely called it, and consistent with the leaks. Don't expect more than 70% better rasterization perf than the 3090.
Which is arguably pretty good, but not at that fricking price, obviously
If you don't care about DLSS or RT you might actually be better off buying a 3090 or 6900XT right now
Unfortunately, I want good RT so I'm fricked.
70% raster improvement is fricking huge and one of the biggest leaps of all time moron.
I know but Nvidia keeps promising 100% or 200% improvement all the time even right now in OP's pic
And at those fricking prices you would expect that much.
That's not a reason for them to mark them up 70% over their predecessors, Njudea shill.
Have fun paying 4k$ for a 6080 in four years, moron. you've earned it.
nothing personal kiddo
>1080ti was 700 bucks MSRP
>4080ti will be 1400 MSRP
>the benchmarks aren't even out yet and you're praising the performance
>midwits
It's not that impressive when they're just sending a frick ton of wattage to the cards to achieve it. The 4090 is probably gonna spike at 600 watts or something crazy. Which makes the "dollar per frame" value way worse than it initially seems because you're also going to need to buy an expensive power supply too
So TLDR its a skip gen unless you play heavy ray tracing stuff?
Unironically yes, unless the bullshit 4070 (real 4070 is what's now known as 4080 12g) turns out to be slightly weaker (same gap as 2070 vs 2070s) but much cheaper.
Sure, but start saving yesterday, because the next gen is gonna have even more unreal prices, now that last gen proved that morons are going to pay out the ass to get their hands on these.
I'm fine with my 3090, runs VR and all my high fidelity games really well.
Devs just need to fix their games so they aren't bloated choppy messes.
I got a 30 series mainly for 1440p 120fps and VR (half life alyx, pavlov, blade and sorcery) so I agree, it depends on the game and it was worth it to me.
Maxed out Alyx at 90fps is so great
>https://pcpartpicker.com/list/yWW2Bj
Thoughts?
It would go for 1033$
Maybe less since I'll buy em on sale.
Might just change the MOBO into ''ASRock B660M-HDV'' But I don't know enough about ASRock just yet.
>bought a 3080 from a scalper in 2020
>mined enough ethereum to pay it back AND pocket a few hundred bucks AND play all the games i want
lol
lmao
It just goes to show how absolutely shit AAA gaming is that Jensen had to show Cyberpunk and Portal, for god's sake, to try to get any hype going for new raytracing features. If those are the best games to offer RT then it's going to be a worthless feature for a while longer.
>THEY
>SHOWCASED
>MORROWIND RTX MOD
>20 YEAR OLD GAME
LOL
LMAO
Yeah and the AI upscaling that they showed in that morrowind demo was absolute garbage which is the most hilarious part of it
If this keeps happening and it actually sells, god fricking knows what happens to 50 series prices.... 60 prices... 80..... Oh frick i dont want to think about it
america is headed for a total economic collapse under their dementia patient, so don't worry about that.
The 4090 will actually beat it in performance per dollar. What the frick are they thinking.
No competition.
Remember Intel before Ryzen came ?
AMD's only problem has been Windows and OEMs going with Intel+Nvidia.
Now that devices with AMD hardware are shipping in volume with OSes other than Windows, and Intel is looking to kick Nvidia out of the OEM market Nvidia is going to be in serious trouble.
Especially with their hardware partners starting to break off relations with them.
These $1600 GPUs are purely to grab headlines for Nvidia, but they won't really be relevant outside of AI research. Hardly anyone is really going to buy them for gaming.
Just no point when a $400 all-in-one PC can run everything. Discrete GPUs are heading towards the end, just like discrete ALUs went away.
>Nvidia is going to be in serious trouble
CUDA is the only game in town for AI, OpenCL is hot garbage
DirectML would like a word
Might be true, but that's not gaming, and not a mainstream thing normal people are getting their gaming PCs for.
Nvidia could easily exit the home GPU market, or see their marketshare greatly reduced by OEMs no longer shipping Intel+Nvidia systems.
Wouldn't be the first time a company looked on top of the world in the gaming market, only to exit.
>AMD's only problem has been Windows and OEMs going with Intel+Nvidia.
Because they had no iGPU, or at best a really shitty one. Notice that now all of AMDs CPUs come with an iGPU, and a very low power (read: good) iGPU. Now all those Dell optiplex/Acer chromebooks that sell by the millions can have very cheap AMD chips that shred Intel's Celeron/Pentium trash that has been used in those devices since forever.
>and Intel is looking to kick Nvidia out of the OEM market Nvidia is going to be in serious trouble.
Turns out the platform level benefits of Intel+Intel or AMD+AMD are huge for laptops. Intel DeepLink and AMD Smart Technologies are like pulling performance out of a hat when you're power and thermal limited.
Which is why Nvidia wanted to grab ARM, so they could establish their own CPU brand and sell NV+NV laptops. But they were wienerblocked by the UK, so now Nvidia is scrambling before they lose a large source of income from the laptop market. And it's already happening. Compare how many RX 5000 laptops there were vs RX 6000. AMD and Intel are taking the frick over in the mobile graphics space. And Apple is not an option either because of the M1.
>Discrete GPUs are heading towards the end
You get it. Look at how good the Mac Studio is. Performance right on the heels of a 12900K + 3090 while consuming less than half the power. It's tiny and quiet too. That is what the PC of the future will look like
>Which is why Nvidia wanted to grab ARM, so they could establish their own CPU brand and sell NV+NV laptops
To be clear, there's nothing really stopping them from still doing this, not getting control of ARM doesn't stop them from making ARM SOCs with Nvidia GPUs, it just stops them from establishing their designs as the reference implementations of the standard architecture.
The main barriers to Nvidia selling all Nvidia solutions for PCs are really:
If they're interested in gaming they're going to need support for AMD64 and win32 software. Hardly any developers are going to port their PC software to ARM, just like hardly any developers have been interested in either native GNU/Linux support, or implementing native support for the latest versions of Windows.
Depending on how they achieve AMD64 compatibility, it may create issues for compatibility with Windows since its basically only on AMD64 compatible hardware, so if they're depending on a userspace compatibility layer like Apple, they'd need to run a UNIX-like BSD or GNU/Linux based OS. This might limit their appeal outside of some kind of Steam Deck style non-traditional form factor, and Nvidia has kind of fricked themselves over when it comes to drivers for those other OSes.
Finally if they do their own hardware like the Shield they would probably have to do on their own because OEMs all have relationships with Intel, and to them Nvidia CPUs will be more of an unproven commodity than Intel GPUs. Nvidia would need some way to break into that market.
I would be very surprised if Nvidia did not have a team working on AMD64 support on a future SOC, the relevant patents should all be expired now. Its just going to be a moonshot level project for them, and may not be worth it to them.
>If they're interested in gaming they're going to need support for AMD64 and win32 software
Easy, the same trick Apple uses for the M series, put in HW acceleration/assists for x86 emulation. Then work with MS to integrate that into the Windows on ARM x86 emulator just like how Apple's Rosetta 2 does with M series chips.
>Hardly any developers are going to port their PC software to ARM
They will if Nvidia's shit sells well enough, which given their brand power is very possible
>the relevant patents should all be expired now
The old stuff yes, but new things such as SIMD extensions are still protected. And not having those will lock you out of most games, as DirectX math libraries are designed to compile using the latest SIMD extensions. It would be an unwinnable game of catchup for Nvidia, ARM and emulation would be better
>Easy... work with MS to integrate that into the Windows on ARM
Its not that easy, because they'd need to get Microsoft on board, and Microsoft has much closer business relations with Intel and AMD to a lesser extent.
Windows on ARM hasn't exactly been a success, and Microsoft encouraging desktop ARM hardware might be seen as a negative thing by their board as a result.
>They will if Nvidia's shit sells well enough, which given their brand power is very possible
So far at least Nvidia hasn't had that much luck getting PC developers to port their games for their Tegra devices.
>ARM and emulation would be better
When it comes to patents ARM and emulation might have the same issues. Who knows what deals Apple made to make MacOS's emulation support on M1 happen, Apple has way deeper pockets, and needs the legacy support less which gives them a stronger negotiating position.
>Windows on ARM hasn't exactly been a success
Because of an exclusivity contract with qualcomm which is set to expire very soon
>So far at least Nvidia hasn't had that much luck getting PC developers to port their games for their Tegra devices.
Because those tegras fricking suck. If you made a real gaming laptop with Nvidia ARM CPU + GeForce mobile GPU, things might be different
>Because of an exclusivity contract with qualcomm which is set to expire very soon
Seriously doubt that's the only issue. Microsoft has some significant problems in the ARM market. Not the least of which is their close relationship with Intel and their dependence on legacy support.
Microsoft's access to consumers is actually pretty limited, and goes through OEMs, and all of those OEMs have business relationships with Intel.
Microsoft trying to follow Apple's lead with their own equivalent to the M1 could kick off a real shit storm, and Microsoft upper management may not want to risk their still dominant position within the business desktop niche market.
Nvidia would honestly be better off going it on their own if they were really serious about trying to compete with a complete solution in the PC gaming hardware market.
>baww I need to upgroood my 970/1060
GeForce 2080s are $300 now, have at it
>2080
Dont exist anywhere in yuropooristan.
is a 2080 really worth it if the 3k's will most likely go down even further when the 4ks come?
3060ti bests the 2080s and it's 450€ in Europe
They are pushing the masses towards cloud gaming.
It's over.
Hopefully the mainstream cards are priced decently. I want an AV1 encoder.
Why would you wait for a 40 series card for an AV1 encoder unless it's a nice to have? It will take forever before Nvidia will decide to release anything low end if at all because of the overstock of 30 series cards at retail and in the used market.
Huh? There is no other consumer hardware AV1 encoder.
incel; arc
Intel Arc and it's already on the market making their marketing of being first true. Granted, that's probably not the answer you want to hear in which case, you can then wait for equivalent AMD and Nvidia offerings to come down in price.
>I want an AV1 encoder.
Intel CPU with an iGPU. QuickSync wipes the floor with NVENC and VCN. And now you are free to pick any GPU in case AMD doesn't give an AV-1 encoder in RDNA3
They have no fricking shame
>1300€ 12GB
>1500€ 16GB
>2000€ 4090
Oy vey, someone isn't getting scammed if you ask me.
This wouldn't be happening if amd didn't suck so much ass
No moron.
This wouldn't happen if everyone wasn't a fricking moron and didn't buy last gen for 3x MSRP. They simply normalized the COVID and ETH wombo combo prices into the new MSRP. That's all this is.
AMD is just as happy, because thanks to Njewia they can price their cards 10% below theirs and be the "good guys" while still raking in mad cash.
I wonder what the actual - no bullshit - performance increase will be.
20%!?
Did you see the raster performance of 4k native Cyberpunk with RT? It was running at fricking 23 FPS, then used DLSS to get to 90-103 (not even 120)
>MUH RASTER PERFORMANCE
>23 fps 4k cyberpunk
Still not upgrading it (the gpu)
jesus
The 50 series won't even be marketed to gamers at all. It'll start at 2k and be shilled to researchers and hospitals.
Just wait troonys on suicide watch
it’s easier, cheaper, and more bountiful to be a console gamer now
Console is easier to get pirated too. This might be the death of PC gaming.
With the recent ps4/5 exploit stuff I might unironically try to get one.
>recommended 850W PSU for the 4090
>doesn't take into account the 2.5x transient power spikes
if you don't get at least a 1200W PSU for that fricking ticking time bomb you'll burn your house down guaranteed
>falling for the tech israelites tricks
you will never run into a transient spike issue unless your psu is absolute shit
well, considering how many morons cheap out on their PSUs, it's gonna happen to a lot of people
>still on my trusty 750Ti playing SimCity 4 all day
Have fun with your Xbox hueg card, I guess
>fun
Man I sure do wish that I could have some fun.. when playing and such.. 🙁
There are a lot of decent cards at low prices. Check out the Nvidia 3060, or the AMD 6000 series
So the trend is gonna be more power and higher temperature?
Yes. Just like your fridge, AC, oven, washing machine, and dryer. Get used to it :).
I think it's more likely the trend will be iGPUs and Steam Decks.
up to a certain point when electricity price in the United States reaches europoor levels
energy price is not the US' problem. Rather, most of our houses are wired for 15A 120V or 20A 120V (if you're lucky). That's 1800-2400W on a single circuit at most. Then you have to factor in PSU efficiency losses (PSUs are only around 80-90% efficient), other devices on the circuit (lights and AC use a lot more power than you'd expect), and all of a sudden a flagship gaming PC is dangerously close to tripping the circuit breaker for many. If Nvidia thinks people are going to upgrade their entire electrical infrastructure to support their shit, they are out to lunch
this, i cant plug any thing else to the wall socket where i have my PC
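Back-of-envelope version of that circuit math, if anyone wants to plug in their own numbers (all values below are illustrative, not measurements):

breaker_amps   = 15
line_volts     = 120
circuit_watts  = breaker_amps * line_volts        # 1800 W for the whole circuit

gpu_spike_w    = 600     # the transient spikes people worry about
rest_of_pc_w   = 300     # CPU, board, drives, monitor on the same outlet
psu_efficiency = 0.85    # cheap units are worse, which pushes wall draw higher

wall_draw_w = (gpu_spike_w + rest_of_pc_w) / psu_efficiency
print(f"{wall_draw_w:.0f} W at the wall out of {circuit_watts} W on the breaker")
# Whatever's left over is shared with the lights, the AC, and anything else on that circuit.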
>spend a month's wages to buy a 4090
>buy the cutting edge Goyslop game to play on it
>game spikes the card to 100% constantly because devs haven't optimized anything in over a decade
>card blows up because manufacturers tried to save a penny on capacitors
>house breakers trip but not before the old wiring in the walls catches fire
>house burns down
You just know it will happen to somebody
How does it feel knowing that 100s of twitch streamers will get one for basically free from donators to play fortnite while Gankerirgins will still be on 970s
I just checked out Cyberpunk's performance on my 3090. Everything maxed 4K, and it has the exact same performance as this video with DLSS off/RT on. DLSS ultra performance with RT on is 75-85 fps vs DLSS3's 98-105. What a fricking scam. Not even to mention the fact that in most games I play with DLSS, I'd have it on quality mode.
Did you test with "RT Overdrive" on your 3090? Because if so that's even worse for nvidia, they are saying 4000 is much much faster for RT, but if your numbers are also with "RT Overdrive" then Nvidia is really in the mud
Set to psycho, no overdrive option
They are counting DLSS 3 results vs DLSS 2 as "2x the improvement". What a bunch of snakes.
Don’t worry fren
The autisms will find a way to enable dlss3 on older cards
Yea. At this rate what is to stop someone from interpolating 1 frame into 60 FPS and saying it boosts performance by 60 times? This is so fricking stupid, how is it even legal?
kek that price
PC gaming is dead
It's gonna hobble around with perfectly functional 20xx and 30xx cards (assuming one managed to snatch one for a normal-ish price) already in gamer rigs. No normalgay has the expendable income to excuse paying 1200+ for a GPU. This gen will be stuck with the same morons who bought for 2-3x MSRP last gen.
If they stick the course for next gen, then yeah, we might have a crash.
>no graphs/claims purely on rasterization performance
it's gonna be their biggest release blunder yet. for sure day one reviews of these cards will all be disappointments. I hope none of you literal morons buy into this
>Spending over $1000 for a graphics card
The destroyer of wallets
3060tichads, yeah, I think I will skip this one
same brother
same
>evga
Nice antique device you’ve got there
Evga.... it honestly hurts they were the least moronic of the nvidia manufacturers
The crypto stuff turned NVIDIA into crackheads. They will do anything for that extra buck now. They might not feel it yet, but the seeds of destruction have been sown. Their demise —and PC gaming by proxy— is inevitable.
AMD will take their place
They would literally take their place and pull the same israelitey tactics. Corporations are not your friend. They all exist to make money, and AMD is no different.
I know, but for now they're making better hardware for better price and with smart leadership they should become new market leader.
So what about the Switch 2?
There’s no way in frick Nintendo will work with NVIDIA again. Expect their future consoles to use rdna chips.
The chip for the Switch 2 was confirmed today, it's again nvidia.
>Tegra239 SoC
Weird
So whats that mean in terms of power? Will Switch 2 be PS4 pro tier?
>8-core CPU - likely to be ARM Cortex A78C/A78
>Ampere-based GPU that may incorporate some Lovelace features
In between PS4 and PS4 pro, they can't have it as big of a housefire as the Steam Deck for obvious reasons.
>In between PS4 and PS4 pro
I think thats perfectly acceptable for Nintendo games tbh. A Zelda game with even a base PS4 level hardware would allow them to do a lot
Depends. If Nintendo underclocks it again in order to cope with nvidia being a housefire, then it will probably fall short of the PS4 Pro. If they don't then performance should be similar. Expect 599 since unlike Switch 1 they aren't getting Tegras by the millions on clearance sale
>600 dollars for Base PS4 tier hardware in 2023
Fricking Nintendo man
Well you never know, if Nintendo is smart they will have explored other options (i.e. with AMD) as a backup. And Nintendo does not like pricey consoles, so if push comes to shove with Nvidia (which could very well happen when even EVGA is sick of their shit), Nintendo might go AMD.
It seems unlikely, ARM still offers better performance per watt, and Nintendo isn't concerned with legacy compatibility like the Steam Deck is.
Nintendo would be more likely to go with an alternate ARM vendor if somebody else can deliver a better product, and Nintendo post WiiU is going to be reluctant to buy custom hardware they're the only consumer of. They're going to want something cheap, off the shelf, and available in bulk to ensure there's a low price and they won't get screwed by lack of supply.
>ARM still offers better performance per watt
Implementation says otherwise, see HW Unboxed's comparison of 6900HS vs 12th gen mobile vs M1 Pro
The frick am I supposed to do with all that power? Play Skyrim but with an even more ridiculous enb?
30 series is still good for 1440p @ even 240hz
40 series is like, beyond enthusiast tier.
No point in getting one especially at those prices. No this is not a cope. There’s literally not a game out there that would benefit from those cards at 1440p.
Finally, they finally revealed the 4080...I can finally order a 3080 in peace knowing that its a piece of shit...at last......................
4080 should cost 500$
but people are stupid and they make a profit off that. same goes with smartphones, they are overpriced to the limit.
keking at how this pic looks like its being thrown at a very fast speed at someones face lol