Novidiot gaymers, prepare to get scammed again
https://videocardz.com/newz/nvidia-geforce-rtx-4090-rtx-4080-16gb-12gb-max-tgp-and-gpu-clocks-specs-have-been-leaked
didn't they do this with the 1080 as well? it used a GP104, which is normally used for the 60/70 class cards.
And the 2080. And 2080Super. And the mobile 3080/Ti
If you don't buy the flagship, do not expect Nvidia to give you the top chip
And don't forget that there were TWO Pascal Titans, released 6 months apart and only one of them had the full die like a Titan should
God bless AMD for giving nvidia the kick in the ass they needed. And I hope Intel finds success too because duopoly is only 2nd to monopoly in terms of being cancer
>God bless AMD for giving nvidia the kick in the ass they needed
Evidently it didn't fricking do anything.
People forget how HUGE nvidia is as a company. Even if AMD does everything right it would take them years to bring nisraelite to their knees.
Nvidia has too much momentum, after being dominant for so long. Most people don't autistically follow hardware news and benchmarks.
It takes normies a few generations to get with the program that nvidia isn't the "only choice" anymore.
We are half way there. Navi has competed well and if AMD continues to pump out strong products + nvidia keeps being israelites with pricing, the sticker shock will get to normies. Nvidia caught a lucky break with the chip shortage
they've been doing that since kepler 600 series, where have you been?
>They literally just bumped the video cards up a tier to charge more money without announcing a price hike
Holy shit.
>They don't have a 4070 anymore
>The card most people would care about as the best "Value" option as an upgrade
>For now you need to go for an overpriced 4080 that nobody realistically needs
>or cope with the 4060 a tier lower whenever they decide to launch it
Ah yes, the same thing intel did with i3/5/7/9
I forget which gen it was, but they moved the feature set of the i7 into the i9 (hyperthreading), and started calling chips with an i5 feature set i7s and pricing them as i7s.
Jews the whole lot of em.
Society is collapsing.
I wonder if this is going to cause hilarious obscure driver issues with the rename.
only idiots buy new series when they come out. just get a cheap 3060 it's more than enough for any game worth something.
Wanna know how I can tell you're a poorgay with no 4k display?
>see 4k display
>I am blind or it is a meme, I see no difference
>zooms insist that it is "like liquid gold in your eyes"
I am not 15, homosexual.
Ignore all zoomers that waste money on top shit because their favorite streamer endorses it.
Exactly. They are the only ones excited for 4k, when the only real advancement was from 480p to 720p or even 1080p.
To be honest, if you really don't see the difference between 1080p and 4k, then all that money is a waste for sure. There may not be as significant a jump as from 720p to 1080p, but let's not pretend 4k is not noticeable.
>will just buy 3070 and sit on it for 5 more years as I did with 1070 anyway
>upgrades from a 6 year old 8gb vram card to a 8gb vram card
>expects it lasts 5 years
lol
im happy with 1070's performance 🙂
yeah, what an idiot, I love playing the newest feminist PoC simulator at 4k 120 fps
you homosexuals are so dumb, unless you're going to do 4k 120hz 8gb vram is fine
Only in games designed for PS4/Xbone
do you seriously think "next gen" consoles are more powerful than a gtx 1080? lmao
you're entirely right in the point you're making, but also, Control is a terribly optimised piece of shit performance-wise and there are far better examples that you could use
11GB of VRAM is insufficient at 1440p without downgrading textures to early 8th gen levels. Have you even played a recent game? Is this the part where you cope by stammering out a "w-well, new games suck anyway"?
>"w-well, new games suck anyway"?
But they do suck
>you can play new games with an old GPU!
>no you can't, do you even play new games?
>no they suck
>you can play new games with an old GPU!
Yes you can.
Maybe, but not The Way It's Meant To Be Played™
>It's the end of the world if any new pozz game doesn't run at 144fps, at 4k!
You can run shit even at 720p, homosexual.
>BUT INFERIOR SETTINGS
Nobody cares, if the game is pozz.
hardware discussion in a nutshell
I play modern games at 1080p 50FPS on my gtx 970 3.5GB. Changing the monitor to 50hz gives you a free 20% performance boost, and it looks just as smooth as 60hz
Not true. A game will use as much VRAM as it can get whether it needs to or not. Does it stutter with 8GB VRAM? No? Then shut the frick up.
>he doesn't know that israelitevidia slowly crippling old cards' performance with each driver update
>driver update
why do you do this?
because hes trying to pretend hes not an NPC but in reality hes the biggest one of all
>That wattage increase
What the frick? Am I going to need a 1000w psu now!?
why cant they just release the bing bing wahoo video game hardware without israeli frickery. god damn
capitalism ruins everything it touches
i heard the 4070 was barely more powerful than the 3070, so does this mean the "4080" is going to be less powerful than the 3080?
The "4070" should still be more powerful than the 3080
There will be a main 4080 with 16 GB vRAM and a meme 4080 with 12 GB for scamming purposes
>and a meme 4080 with 12 GB for scamming purposes
Primarily for prebuilts so they can show an 8 instead of a 7 or 6, because 8 is higher.
The 4070 TimeSpy Extreme performance looks like its about equal to the 3090 FE
Fricking mentally ill moron
>still using my 1060 6gb
I know i'll be forced to upgrade eventually but I haven't seen a single video game worth playing yet that would require it.
i'm upgrading solely because i want to try Stable Diffusion, AAA video games fricking suck
Shit that's actually a tempting reason now that you reminded me of it.
Isn't stable diffusion working perfectly on any 4 gb vram gpu? From my understanding more power just means faster generation and more vram means larger/better-quality output (and therefore a 3060 is best)
>Isn't stable diffusion perfectly working on any 4 gb vram gpu?
only Nvidia cards, but yeah you can get it running even on 2GB VRAM
>512x512
Working, yes, but the gains in rendering speed are immense.
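To put rough numbers on the "runs on 2GB" claim: in fp16 the v1 model weights alone come to about 2 GB, and tricks like attention slicing keep the 512x512 activation peak small, which is why low-VRAM cards scrape by. A back-of-the-envelope sketch, using commonly cited approximate parameter counts rather than exact figures:

```python
# Back-of-the-envelope VRAM math for Stable Diffusion v1.x.
# Parameter counts below are approximate, commonly cited figures.

def weight_vram_gb(params_millions: float, bytes_per_param: int = 2) -> float:
    """VRAM needed just to hold the weights (fp16 = 2 bytes/param)."""
    return params_millions * 1e6 * bytes_per_param / 1024**3

# Approximate component sizes, in millions of parameters
unet, vae, text_encoder = 860, 83, 123

total = sum(weight_vram_gb(p) for p in (unet, vae, text_encoder))
print(f"fp16 weights alone: ~{total:.1f} GB")  # ~2.0 GB before activations
```

Activations, the sampler state, and CUDA overhead come on top of that, which is where the extra VRAM goes at higher resolutions.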
You won't need to upgrade until the PS4 is out of the market, and currently it isn't
yeah i guess i will run my 1050ti till the 5050ti drops
the more I see what games are coming the less I want to upgrade.
Remember me?
The chosen one.
criminal
Why haven't you taken the AMDpill?
no nvenc or nvidia broadcast
AMD is just Nvidia with all the same issues, but to a less extreme degree
Apple Silicon is enlightenment
>fast
>optimized
>insanely efficient, laptops don't even lose performance when pulling the plug
>runs both x86 and ARM
>has the best OS, linuxgays and winjeets can cope and seethe
can it run vidya doe?
>t. mac pro m1 haver
ok you got me there
It can sort of game, its primary issue is the lack of native mac games + microsoft won't sell windows on ARM to consumers since they signed an exclusivity deal with qualcomm (but that deal expires soon)
The chips are amazing, there's just no software to back it up right now. But hopefully that changes in the future
>homosexuals shilling apple of all things
Holy shit
is right
>optimized
Unfortunately not for gaming.
That's one of the very few games that actually runs fine on the M2, and even then a 6800U laptop is cheaper while performing almost the same there, and it runs laps around the M2 (or just wins by default, since at least it runs them) in 99% of other games.
I look at this graph, and all I see is that with the same power consumption there is not much of a difference in performance, probably only due to the better node they bought exclusivity for.
>nodez are unfair!!
And the mac is emulating x86 while the PC runs native code. Results are what matters
And both are unacceptable levels of performance, showing that gaming on laptops without a dedicated gpu is a waste of time. Sure, the cherrypicked performance-per-watt graphs make them look nice, but when you look at what you're actually getting it's laughable.
>nodez are unfair
I didnt say that you moronic appleBlack person.
>Apple Silicon
They just order the best chips from taiwan and don't push them into the zone of diminishing returns.
Took it last year
Mostly the poor ray tracing performance. AMD cards are powerful and have lots of vram which is great for future proofing, but the ray tracing performance is a disappointment.
I need VRAM for A.I. images.
>still using my 2060 super because none of my games demand more for a 1440/120 experience
>will probably upgrade to a 3080 when prices collapse and sit on that for 5 years
if not for advancements in rtx features, i'm sure even the 2080 ti could last most gamers until 2025
I got a 2080S and I'm still content to wait. I can run Cyberpunk and Dying Light 2 with rtx and Dlss maxed at good framerates
Greedvidia as usual
What's the point of the name change except to increase MSRP?
I think I'm going AMD this generation, israelitevidia needs to be knocked down a few pegs before they become intel.
>jewvidia needs to be knocked down a few pegs before they become intel
it's too late, they already are Intel, except without the hubris
I'm still sitting on my 1660 super and see zero reason to upgrade.
newbie to PC gaming, live close to a microcenter and was considering a prebuilt from them. Are they a meme like other prebuilts? I've had mixed opinions given to me so far.
depends on how fair the pricing is. you'd have to post an example
You can at least trust them to give you a pc with a default windows install without added bloatware. As for case and part choices, you will have to police them, because they will prioritize moving old inventory onto you if you let them. My microcenter tried to give me a prebuilt with a 3700x charged at full new price even though amd 5xxx cpus were out already and they had them in stock.
I'm late but best prebuilt I've ever owned was from microcenter. No issues at all after 6 years. Their prices might be fricked nowadays though.
>Live just an hour and a half from a Microcenter
>Have family and friends that live out near it
It's a wonderful place
put the parts into pcpartpicker.com
if there is about a $100 surcharge per $400 of parts it's about normal for a prebuilt
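That rule of thumb works out to a flat ~25% markup; a quick helper to sanity-check a quote (the $100-per-$400 figure is just the heuristic above, not an industry standard):

```python
def fair_prebuilt_price(parts_total: float, surcharge_per_400: float = 100.0) -> float:
    """Thread heuristic: ~$100 surcharge per $400 of parts, i.e. a ~25% markup."""
    return parts_total * (1 + surcharge_per_400 / 400.0)

# A $1200 pcpartpicker total priced at ~$1500 as a prebuilt is "about normal"
print(fair_prebuilt_price(1200.0))  # 1500.0
```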
>this bullshit again
I will not, I repeat, NOT pay more than $250 for a GPU. It has to be AT LEAST a 60-class card too.
>hurr durr you won't ever upgrade then
Then I will not upgrade. I don't need gay tracing and shit for bing bang wahoo.
cool might buy it, but more likely my gtx970 will get me through another gen because games don't actually need anything more than a 970 anymore
lmfao at any moron that has bought a card in the last 8 years
>my gtx970 will get me through another gen
We ran out of luck with that card: that shit is never gonna happen again, one of its kind. Praise the 970 and its analogue output.
>3.5gb
I still use 970 to this day. Still alive and kicking. Its fine because I only play Dota 2 anyway lol.
It's by no means perfect or even good by today's standards, but it still plays recent games at high settings at 1080p. The only problem I've ever had was with Deathloop, the only current-gen game I've deemed worth trying, and with just a few tweaks I managed to play it just fine. That card is 8 years old and I don't think the stars will ever align again for a scenario like this one (see: pandemic, recession, lack of new-gen games, and a good-enough gpu).
This. I only upgraded to a 2070 from a 970 because i thought 1080p 60hz would be outdated far quicker.
i still see no point in a 40 series gpu
>it can go up to 366W
Holy shit who would want this? Not only is that just absurd, but there's already an increase in electricity costs across the globe, especially in western Europe which makes up at least 20% of their sales. Are they just so shit that they can no longer manage performance and power usage?
You haven't seen shit if you think that's bad
>660W
This thing must cost at least a few hundred bucks to run every month even at American rates. Euros are gonna go to jail for using this. And how the frick are you going to cool that system? Don't they realize that electricity produces heat? I'm not sure if liquid cooling will fix that issue
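For scale on the running-cost claim, the actual electricity math (the $/kWh rate and hours per day below are assumptions, and the card won't sit at max draw the whole time):

```python
def monthly_cost_usd(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Energy cost of running a constant load for a 30-day month."""
    kwh = watts / 1000.0 * hours_per_day * 30
    return kwh * usd_per_kwh

# 660 W at full tilt, 4 h/day, ~$0.15/kWh (rough US average, assumed)
print(round(monthly_cost_usd(660, 4, 0.15), 2))  # 11.88
# Triple the rate for expensive European electricity and it's still ~$36/month
```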
>And how the frick are you going to cool that system?
I wish I was joking
https://videocardz.com/newz/leaked-lenovo-geforce-rtx-4090-legion-gpu-is-a-true-quad-slot-design
And just you wait for the 4090Ti, rumored to be [spoiler]800W[/spoiler]
what an unholy abomination
I cannot wait to see people snap their pcie slots right off with these things
if nvidia sells these things without support brackets they have no conscience
That's an AIB model. Nvidia founders editions will stick to the 450W spec and come with an appropriately sized cooler for that wattage. So no bracket
Including a support bracket for the card in the pic is Lenovo's problem here
>13 heat pipe design
WOOOOWIIEEEE THAT'S ONE HOT BOYYY
It's really funny how zoomers forget how much power people were consuming when doing SLI setups which is what any enthusiast did. You guys are poorgays that think you are worthy of enthusiast tier hardware kek
SLI was actual fricking cancer and only for benchmorons. Almost no game was just plug and play with an SLI setup and even when it worked, there would be shit like scan line issues or stuttering or just crashes.
and multi-GPU was fricking stupid, it meant work had to be taken away from normal driver improvements in favor of manually hacking use of multiple GPUs into each and every game. Oh your game wasn't popular enough to be deemed worth it for driver mGPU? Tough luck one of your cards is now a $400 paper weight for that game.
And despite all of this, it still suffers the same issues that make mGPU shit
>shit framepacing and stutters
>huge power hog
>compatibility problems
>expensive proprietary bridges which are just fancy wires under the hood and have no right to cost $120
But now Nvidia wants to bring mGPU power consumption to single card builds
>660w
no fricking way, how do you even cool that
>completely different die for two 4080 models with no indication to distinguish them except vram size
okay this one seems like typical nvidia behavior but
>516w on an 80-class card
what the actual frick?
>660W
What is this, a literal furnace? Great I finally have a card that can burn down my house if things ever go wrong.
so the 4090 is basically a 3090ti that has had its silicon absolutely pushed to the limit
getting 12900k vibes
Not really. The 3090 ti was already pushed to the limit. 4090 is the next generation of Nvidia gpu core design.
>rumored
Useless.
>12gb 4080
lol bullshit half of these rumors are moronic and make no logical sense
why the frick would they make two models of the 80 card before they even consider making a 70 or 60 card
But they did make the 4070
They're just calling it 4080 12 GB so that they can charge more $$$ for it
>what is RTX 3080 10GB and 12GB
>what is GTX 1060 3GB and 6GB
>what is like 5 different versions of the GT 1030
>what is the RTX 3080 mobile 8GB and 16GB (not to be confused with the 3080Ti mobile which is only 16GB)
enjoy the SKU ladder goy!
>the 50 ti is now just a 60 card
>the 60 card is a ti now
>the 70 card is a 80 card
>the 60 ti is a 70 card
good lord
think of it this way: if they called the 4080 12gb a 4070 instead, people would expect xx70-tier pricing, i.e. $500 or less. now that it's called a 4080 they can charge $700 or more with ease. not to mention scalpers will love that they can get even better margins on what's supposed to be a high-end card, so all those 4080 12gb models will go flying off the shelves, all because it's called a 4080 and not a 4070
it's all marketing frickery
Jesus christ I just want a decent deal on a GPU. I'm still using a GTX 970... There's a 3060ti on Newegg for $420 right now. Maybe I should just fricking buy it
That's only $20 over MSRP, do it right away
>but RTX 4000
Nvidia is leading with the 4090 only, according to rumors, and it won't be until well into 2023 that the 4070 launches. The 4060/Ti will take even longer than that, especially since Nvidia will do everything they can to sucker people into those top 3 SKUs
who gives a frick
you're not gonna get one anyway
scamvidia
Not surprising, they've slowly been doing this for years.
>GTX 590
>TWO fermi chips on the same board
why was nvidia allowed to sell such a massive fire hazard?
You joke, but the supposed 4070-masquerading-as-a-4080 in the OP apparently has the same power draw as that dual chip fermi GPU.
Imagine if they made a modern dual chip GPU, it'd put space heaters to shame.
why are poorgays complaining about power consumption?
that's just the price you have to pay for better performance, it's normal
>Use a UPS
>oops now your ups can't be used because there is no ups that can stand a pc using 900 watts for even 15 minutes
based!
You will still have the option of undervolting back to reasonable levels. A stock 3060 ti is a 200W card and my undervolt makes it a 100-120W card at max, but I hardly ever see max anyway. The performance loss from this aggressive undervolt is only 10% too.
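Those undervolt numbers are plausible if you assume dynamic power scales roughly with V²·f. The voltages and clocks below are hypothetical illustration values, not measured ones:

```python
def scaled_power(p_stock_w: float, v_stock: float, v_uv: float,
                 f_stock: float, f_uv: float) -> float:
    """Dynamic power scales roughly with V^2 * f (capacitance held constant)."""
    return p_stock_w * (v_uv / v_stock) ** 2 * (f_uv / f_stock)

# Hypothetical 3060 Ti: 200 W stock at 1.05 V / 1900 MHz,
# undervolted to 0.85 V / 1750 MHz
print(round(scaled_power(200, 1.05, 0.85, 1900, 1750)))  # ~121 W
```

The square on voltage is why a modest voltage drop buys far more watts than the small clock (and performance) loss it costs.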
UPS aren't made for you to use your PC moron, they're made to buy time when you shutdown in a safe manner.
My pc is like 300w under use, so yes i can play for an hour with my ups
moron
Just don't live in a third world shithole where you need a ups.
Simple as
>Just move to a first world place where they call you names and everybody hates you for not being full white
no
You have never stepped foot for even a second in a first world country. Hint, it's nothing like the moronic shit you see on here or the rest of the internet.
Ganker is the true final reality show, anon. I have even seen americans disgusted to touch the hand of brown people, irl, you can't convince me otherwise.
I even have a place where people from other countries can stay for a few nights and few of them are not racist.
And I've lived in the USA my whole life in the "worst" state for racism, Georgia while being black. The only place that treated me in a half racist way was a hotel shuttle driver in Wisconsin that refused to drive my family to a wedding until his manager talked to him.
>>Just move to a first world place where they call you names and everybody hates you for being white
FTFY
That's fiction.
Everybody is white, you white kid.
>BUT LE MEDIA
The media lies and both the right and left hate blacks and spanish homosexuals.
>everybody is white
what is California
what is Rust Belt
what are all big aglomerations
why are you in those places if you are white
I accept your concession
If you say so, I am never going to muttmerica anyway
Depends on where you are. I occasionally surf at Huntington Beach and it's like 70% white people, 20% asians, and the rest are hispanic/latino or whatever.
Even London isn't majority "white" lol, big western cities are diverse multi culti tutti frutti
there are more whites than any other race, that's why it's called "minority". gays
"whites" don't have a future demographically in "white" countries (or anywhere really)
as I've said, in many big western cities "whites" are just another minority
>there are more whites than any other race, that's why it's called "minority"
Let's suppose whites are the majority with 40% of population, that leaves 60% to non-whites. So on the streets you see more non-whites than whites.
>it's normal to need a 1200w PSU just so you don't get random shutdowns from power spikes
>from a single GPU
no
use your brain anon, what other reasons besides having to pay more overall are there? you can do it, you're a big boy
Do you think coolers just magically destroy the heat they absorb from chips? No, it has to go somewhere, and that somewhere is into your room
>I have AC
If you use whole house AC, every room but the PC room is too cold if you blast it, but the AC can't keep up with the heat output if you don't blast it.
And not all houses/rooms are able to use a standalone AC, for a number of reasons that are not always related to "poorgay"
just get a fan or open your window. next are you going to tell me that your room has no windows and you dont have enough outlets for a fan?
>having 800W heater in a room
>with no AC
>in summer
>just open a window bro
I wish you lived like that for the dumbassery you speak.
>If you use whole house AC
why would anyone do this
transient spikes and heat buddy
>it's normal
actually the opposite is true, the rise in power usage is abnormal, that's why so many people are complaining about it
yea idk what all this shit-talk is about, the most powerful card will be 450w, that's not even that much
They're gonna announce a xx95 series soon right?
just move the old xx90 series up one tier and charge more money like how intel did with their i9
That means they know these things will be extremely limited and the ones that will be out in public will be scalped to oblivion.
They are withholding the rest of the lineup until AyyyMD shows their hand.
Expect a 4080 "Super" and so forth.
With the amount of corporate espionage that's going on, I'm certain nvidia already knows what AMD is going to offer and just makes these moronic super high power cards to keep the "gaming crown". Hopefully their 70 and 60 class cards are less moronic. Since AMD is rumored to still use 6nm for their lower class cards, nvidia might actually be more efficient in the low end if they also use 4nm.
Just buy a 30xx now then when it's cheap. 40xx is just a refresh.
This is cope. RTX 40 series is THE generation to get this decade
>implying
The greatest AI stuff ever done is just around the corner, and you will be stuck crying on 12GB of VRAM unable to use any of it.
30 series, if it had actually been sold at the real prices outside day 1, would have been that. The 40 series has no official prices yet, but they're likely to be even higher than current 30 series prices, since both will be on the shelf together and Nvidia won't want a 3080 and a 4080 costing the same, or the 30 series will stop selling.
this is what killed my interest in the 4000 series
they can't get rid of the 3000 series fast enough and they won't drop the prices to make way for 4000 series cards so what we'll have is 2 year old cards at MSRP and 4000 series cards sold at a premium
I can only imagine how the 6080 is going to look.
Let's just hope they have multi-die working by then, otherwise they'll make ever bigger dies that they'll have to run at higher and higher frequencies and voltages just to beat AMD.
There's no games that necessitate an upgrade
What drives companies to pull shit like this?
>What drives companies to pull shit like this?
dumb consoomers who kept buying cards relentlessly while they were still 2-3x the price
shareholders
>you only made 582 gorillion last quarter! we expect 900 gorillion at least!
>Buying the card immediately after the generation's big leap forward.
Why? They always suck.
I don't care about 4k, so I will simply buy a 3000 series card and be happy.
Daily reminder that these new cards require a new PSU with the new ATX 3.0 standard, otherwise expect your PC to shut down every time you play a game, plus expect a housefire, because current PSUs cannot handle even 3090 loads and spikes.
All I want is a decent 4060 (Ti)
>12GB
>192-bit bus
>Under 220W
>Under $400
Likelihood: 0%
will probably buy a 3080ti in november if prices drop down enough
>9080 gaming room
>change name
>now have to pay more money for the same shit
kek
What do I do if I just want to play some f2p shit and maybe a AAA now and again? My 1070 still runs okay but I want to be prepared. Not in terms of exact parts but like, what do I need overall? How much should I budget? 2k usd? 3? It's solely vidya I'll be playing. Not looking for 4k or anything, just 1080p60. It's all I really want or need.
I'm not a poor gay. I make plenty of money. I could buy a $4000 PC today if I wanted to upgrade. But there are no new games that are worth a new system. Seriously, where the frick are the games? I guess Dragon's Dogma 2 will be nice. I'll just stick with my 3080 and keep playing these 5 year old unoptimized, spaghetti coded pieces of shit.
You can play DD2 with a 1070 most likely Nips
can't make graphically demanding games to save their lives
I just fricking played RDR 2 on ultra settings at 1440p with my 3080 12 gb and it never went over 5 gb vram usage. quit lying Black personboy.
How bad is the VRAM going to be on the 4070? I want to get into VR.
10gb to cuck you
If AMD's stuff is good they'll bump it to 12gb and the 4090TI will have 32gb and the 4080TI will have 20gb
My 1660 ran Elden Ring and will probably run Starfield. Guess I won't upgrade for now.
So uhh could the 4080 16gb do 4k120hz?
t. OLED TV owner
Sure, but since modern games all have shit optimization you will need to run everything at low settings.
I run anything i want 4k120 with a 3060ti
I don't give a shit about cyberpunk or whatever modern shit though
my laptop with a basically 6700XT does 4K120 LG OLED
Do research into how demanding your games are and what you need to run them at 4K120
bastards will kill my wallet
The rumoured cards were way too powerful. They probably couldn't get the watt spikes under control either. Given energy costs more than gold now they figured it best to drop the original 4080. Good thing for me too 'cause my 3070 will last longer now tbh. It does suck however that they will charge more for what they were previously gonna charge as a 4070 but that's njudea for you.
sorry I am not black, so you are missing the shot by a long distance here.
that's cool but I'm still sticking to my 1080ti
Maybe if amd weren't such a fricking garbage dump of a company we wouldn't be forced to buy nvidia
I thought you people also hated browns? now they don't exist, only whites and blacks? you make no sense.
I don't care about the 4000 series but can I at least expect the price of the 3060 to drop in the coming months?
no
You will pay gorillions for shit hardware and you will be happy
Why should I care what they call it? I'll buy one (or not) based on its price and its performance relative to alternatives (previous gen, AMD).
>upgraded to a 3080Ti last month
guess I'm sticking with this bad boy for the next 5-6 years
>Laughs in 3070
I'll just buy a cheap 3060 or a 3070 when 4xxx drops.
they won't be so cheap if the performance increase is a measly 10%
>tfw got a 6900xt at retail price when they came out
>still get 90+ fps on max settings on everything
No point in upgrading until I can't maintain 60 fps
30xx is a 40xx beta test
now we will see true 4k / ray tracing
What's the AMD equivalent of a 3060 ti/3070?
Are they doing better in power consumption/heat?
AMD RX 6700 XT 12 GB
Tbh get the 6800 non-XT. Both the 3070 and 6700XT are at 230W while the 6800 is at 250W, but it performs 25-35% better than the 3070 and also beats a 3070ti, which uses 290W
RX 6700XT/RX 6800
they are priced (well at MSRP, real world price is a shitshow) higher but also do notably better in rasterization + more VRAM
sirs ive got a rtx 3090 that i bought to test, i can send it back within 5 days and get the full price back, should i return it? im thinking yes
Spending $1000 on a card when in 2 months you'll get a better card for the same or slightly cheaper seems bad. I'd say yes
Maximum greed
i ordered a 3080 ti
deep down I wanted a cooler 3070 with less temps less watts
This isn't the 4070. The 4070 had a tdp of 285 watts and lower specs. It will probably release next year. You won't get one of these cards before then anyway as it will be a paper launch until they get rid of the 3k series.
>3080Ti
>32GB 3600
>Ryzen 7 5800X3D
>1440p 165Hz
>4K TV for couch single player
It's almost like i care for those fricking new scammer cards.
and all that power to play the shitposting game
I'm very good at it
Let me know when I can get a card that can do 4k at 60 fps in AAA games, 4k 144 fps in competitive games, doesn't draw too much power, and is $300.
Otherwise, not interested.
>970
>4670k
>still no game i need to upgrade for
Maybe that harry potter game will be good.
I'm on a 4790k and gtx 970 at 1080p/60fps and I've yet to run into a game that I can't run on max settings
I can't max resident evil 2 remake, anon
Don't care about any of this, I will not buy a GPU with more than 200W TDP
israelitery aside, are we hitting the limits of thermodynamics or is Nvidia just lazy?
Both
>16K shaders (ALUs)
>2.5GHz-3GHz
That was always going to draw insane amounts of power, no way around it
However, nvidia builds massive monolithic chips. Those chips are highly likely to suffer from silicon defects, which means the amount of voltage (thus power) needed to push them to those high clock speeds is increased. AMD is moving to chiplets, where they basically split the single chip into many, smaller sub-chips with each part of the GPU on them. smaller chiplets = much easier to get a strong bin/low defect chip = less voltage required = less power consumed.
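The yield argument can be sketched with the simplest Poisson defect model (the die areas and defect density below are illustrative assumptions, not real foundry figures):

```python
import math

def die_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: probability a die has zero defects."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

D = 0.1  # assumed defect density, defects per cm^2
mono = die_yield(600, D)     # one big 600 mm^2 monolithic die
chiplet = die_yield(150, D)  # one 150 mm^2 chiplet out of four
print(f"monolithic: {mono:.2f}, per-chiplet: {chiplet:.2f}")
```

The point isn't that four chiplets are all perfect more often than one big die is; it's that a defective chiplet can be binned or discarded individually, instead of voltage-bumping or scrapping a whole 600 mm² die.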
>4070 is now the 4080 12GB
>4080 16 GB the same
>4090 is untouched
could be worse
Can't imagine anything more cringe than spending $1000+ on a PC part to play your little video games with
Get a life you fricking loser
you're on the videogames board of Ganker, you're either on the wrong website or you failed as a normalgay and are still coping about it
sounds like poorgay cope
kek, they have done this so many times in prior gens that you are probably paying XX90ti prices for what actually would have been a low-midrange piece of shit